Sarvam 105B: First Competitive Indian Open-Source LLM with Indic Language Support

Sarvam AI has released Sarvam 105B, a 105-billion-parameter open-source large language model built specifically for Indian languages and multilingual tasks, positioning it as the first competitive open-source LLM from India's AI ecosystem. The model supports all 22 scheduled Indian languages alongside English and, on Indic benchmarks, competes with international open-weight models such as Llama and Qwen at comparable parameter scales. It is available via the Sarvam API and for research use, and ships alongside a smaller 30B variant. The launch drew 117 points and 30 comments on Hacker News on March 7, 2026.

Key Takeaways

  • Sarvam 105B supports all 22 scheduled Indian languages plus English, trained on Indic multilingual data at scale; a companion 30B model targets lower-resource deployment scenarios
  • First Indian open-source LLM described as "competitive" with international models on Indic benchmarks; aimed at developers building Indian-language apps, voice assistants, and enterprise AI tools for South Asian markets
  • Announced on the Sarvam AI blog (sarvam.ai/blogs/sarvam-30b-105b) and available via the Sarvam API; the post drew 117 points and 30 comments on Hacker News on March 7, 2026
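For developers planning to call the model through the Sarvam API, a chat-style request body might look like the sketch below. This is a hypothetical illustration only: the endpoint URL, model identifier, and field names are assumptions modeled on common OpenAI-compatible chat APIs, not confirmed Sarvam specifics; consult Sarvam's API documentation for the real values.

```python
import json

# Assumed endpoint and model name -- placeholders, not verified against
# Sarvam's actual API documentation.
API_URL = "https://api.sarvam.ai/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "sarvam-105b") -> dict:
    """Assemble a JSON-serializable body for a chat-completion request."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }

# Example: a Hindi prompt, since the model targets all 22 scheduled
# Indian languages plus English.
payload = build_chat_request("नमस्ते! आप कौन हैं?")
print(json.dumps(payload, ensure_ascii=False))
```

In practice this payload would be POSTed to the endpoint with an authorization header carrying your API key; the payload-construction step is shown without the network call so the shape of the request is clear.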

Original source: Sarvam AI