Cohere, the $5.5 billion Toronto-based AI company, just launched Tiny Aya: an open-source family of models that supports 70+ languages and runs entirely offline on devices. No cloud required. No API fees. While OpenAI, Google, and Anthropic fight over who can build the biggest brain, Cohere went small and wide.
Roughly 1.5 billion people speak English as a first or second language. The other five-ish billion do not, and most lack reliable broadband. Current AI models, including GPT-4, perform noticeably worse outside English and a handful of European languages. Cohere is targeting the markets that got skipped: remote clinics in sub-Saharan Africa, agricultural systems in Southeast Asia, local commerce across Latin America. Offline-first means these models can actually reach them.
Whether “70+ languages” is a genuine capability or a marketing number remains to be seen. GPT-4 claims multilingual support too, but its performance in Yoruba or Bengali tells a different story. Google has Gemini Nano, Microsoft has Phi, and Meta keeps iterating on Llama, but none has packaged this many languages in a lightweight, offline-capable format. The enterprise AI market is projected to hit $150 billion by 2027, and most of that revenue will come from outside the United States. Cohere just built an interesting on-ramp.
Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models.