You get an open model that is at 95% of Opus 4.6 quality, 80% cheaper at most inference providers, and can also run on your own hardware.
They also did the hard parts:
* crawling the content
* running the fine-tuning (or training)
That's better than one or two companies taking control of the whole AI economy.
Name training is always shallow; Claude itself would claim to be GPT-3, GPT-4, or Reddit (heh) when confused. It's just dataset contamination, because the web is full of slop. Never trust self-reported names.
Gemini 2.0 Exp 1206 was reported to be indirectly trained on Claude's outputs, with humans in between [1], which was pretty consistent with its outputs at the time. No Gemini versions other than two experimental ones resembled Claude.
[1] https://techcrunch.com/2024/12/24/google-is-using-anthropics...