Sarvam AI launches LLMs for all 22 Indian languages
Sarvam AI just launched Sarvam-30B and Sarvam-105B, two large language models trained on trillions of tokens and supporting all 22 official Indian languages.
Announced at the India AI Impact Summit, these models aim to make AI more local and accessible.
Sarvam-30B and Sarvam-105B: What's the difference?
Sarvam-30B is tuned for fast chats and voice assistants, packing 30 billion parameters and a 32k-token context, which makes it a good fit for real-time tasks.
Sarvam-105B takes things up a notch with 105 billion parameters and a huge 128k-token context for deeper reasoning. It even beat GPT-120B on some tough benchmarks.
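To make the split concrete, here is a minimal routing sketch in Python. The endpoint URL, model identifiers, and request/response shape are hypothetical placeholders for illustration, not Sarvam AI's documented API; only the 32k and 128k context windows come from the announcement.

```python
# Hedged sketch: route short, latency-sensitive prompts to the 30B model
# and long-context work to the 105B model. All names below (API_BASE,
# "sarvam-30b", "sarvam-105b", the /chat/completions path) are assumptions.
import os
import requests

API_BASE = "https://api.example.com/v1"   # hypothetical endpoint
API_KEY = os.environ.get("SARVAM_API_KEY", "")

def pick_model(prompt: str) -> str:
    # Rough estimate: ~4 characters per token.
    approx_tokens = len(prompt) // 4
    # Stay on the 30B model within its ~32k window; switch to the
    # 105B model's 128k window for long documents or deep reasoning.
    return "sarvam-30b" if approx_tokens <= 32_000 else "sarvam-105b"

def chat(prompt: str) -> str:
    resp = requests.post(
        f"{API_BASE}/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": pick_model(prompt),
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("भारत की राजधानी क्या है?"))  # "What is the capital of India?"
```

Routing this way keeps everyday conversational traffic on the smaller, faster model and reserves the 128k-token window for long documents or multi-step reasoning.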
Other tools include text-to-speech and document vision
Alongside the LLMs, Sarvam AI rolled out text-to-speech, speech-to-text, and document vision tools; the document vision system reads Indian scripts with 84% accuracy.
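For a sense of how such a document-vision tool might be used, here is a hedged sketch of sending a scanned page for text extraction. The URL, field names, and response shape are illustrative assumptions, not Sarvam AI's published API; only the capability (reading Indian scripts) comes from the announcement.

```python
# Hedged sketch: upload a scanned page to a document-vision/OCR endpoint.
# API_BASE, the /document/ocr path, and the response fields are assumptions.
import os
import requests

API_BASE = "https://api.example.com/v1"   # hypothetical endpoint
API_KEY = os.environ.get("SARVAM_API_KEY", "")

def extract_text(image_path: str, language: str = "hi") -> str:
    with open(image_path, "rb") as f:
        resp = requests.post(
            f"{API_BASE}/document/ocr",
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"file": f},
            data={"language": language},   # e.g. "hi" for Devanagari script
            timeout=120,
        )
    resp.raise_for_status()
    return resp.json().get("text", "")

if __name__ == "__main__":
    print(extract_text("land_record_scan.png", language="hi"))
```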
Sarvam plans to open-source the 30B and 105B models (no release timeline has been announced) and pair them with low pricing, making advanced AI far more accessible than global options.