Cohere launches a family of open multilingual models
What Happened
Cohere's Tiny Aya models support over 70 languages.
Our Take
Cohere's right to go open and multilingual, but "over 70 languages" is a press release, not a product. The question is whether Aya actually competes on quality or just on breadth.
Multilingual models are getting commoditized fast. Open weights plus fine-tuning means you can support almost any language if the base model is good enough. Cohere is betting theirs is (probably it's fine: not amazing, not trash). That's useful for specific jobs: customer support in Hindi, moderation in Turkish, and so on.
But "open model for many languages" won't differentiate if base capability is merely average. We'll know in three months, once people ship with it.
What To Do
If you need cheap multilingual inference, test Aya against Claude Opus on your specific languages: it probably saves money, and it might cost you quality.
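A minimal sketch of that side-by-side test, assuming you supply your own model-calling functions (the `model_a`/`model_b` callables here are placeholders, not real API clients) and reference answers per language. Exact match is a crude metric; swap in whatever scoring fits your task.

```python
def exact_match_rate(outputs, references):
    """Fraction of outputs that match the reference, case/whitespace-insensitive."""
    hits = sum(o.strip().lower() == r.strip().lower()
               for o, r in zip(outputs, references))
    return hits / len(references)

def compare_models(prompts_by_lang, references_by_lang, model_a, model_b):
    """Run two model callables over each language's prompts and score both.

    model_a / model_b: any str -> str callables (e.g. thin wrappers around
    your Aya and Claude clients -- hypothetical here, wire in your own).
    Returns {lang: {"model_a": score, "model_b": score}}.
    """
    results = {}
    for lang, prompts in prompts_by_lang.items():
        refs = references_by_lang[lang]
        results[lang] = {
            "model_a": exact_match_rate([model_a(p) for p in prompts], refs),
            "model_b": exact_match_rate([model_b(p) for p in prompts], refs),
        }
    return results
```

Run it on a few dozen real prompts per language you care about; a per-language breakdown matters more than one aggregate number, since that's exactly where a 70-language model can hide weak spots.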
Builder's Brief
What Skeptics Say
70-language support degrades sharply in low-resource languages where training data is thin; Cohere's open model ecosystem lags Meta and Alibaba in tooling, adoption, and community momentum, limiting real-world deployment.