Cohere Launches Tiny Aya: 3.35B Open-Weight Models Supporting 70+ Languages for Offline Use

LLM · Feb 22, 2026 · By Insights AI · 1 min read

Overview

Cohere unveiled Tiny Aya at the India AI Summit on February 17, 2026 — a family of compact, open-weight multilingual models designed to run offline on standard laptops. The release targets language accessibility in regions underserved by English-centric AI tools.

Model Specifications

  • Parameters: 3.35 billion
  • License: Open-weight (MIT)
  • Training infrastructure: Single cluster of 64 H100 GPUs
  • Languages supported: 70+

Regional Variants

Cohere released region-specific fine-tuned variants:

  • TinyAya-Fire: South Asian languages — Hindi, Urdu, Bengali, Punjabi, Gujarati, Tamil, Telugu, Marathi
  • TinyAya-Earth: African languages
  • TinyAya-Water: Asia-Pacific, Western Asian, and European languages

Availability

Models are available on HuggingFace, Kaggle, and Ollama for local deployment, as well as through the Cohere platform API. The efficient training footprint — just 64 H100 GPUs — also positions Tiny Aya as a reference point for cost-effective multilingual model development.

Source: TechCrunch

© 2026 Insights. All rights reserved.