Sarvam's new AI models support all 22 Indian languages
Sarvam AI just launched three new large language models (LLMs) at the India AI Impact Summit, built specifically for Indian languages.
These models—one with 3 billion, one with 30 billion and another with 105 billion parameters—cover all 22 official Indian languages and are optimized for voice-first use, making chatting in your own language a lot smoother.
The 30B and 105B models are optimized for different use cases
The 30B model is tuned for fast, real-time chats with a big context window (32,000 tokens), while the larger 105B model can tackle more complex tasks thanks to an even bigger context window (128,000 tokens).
Plus, Sarvam unveiled a vision model for parsing documents involving Indian scripts.
The models run efficiently thanks to a 'mixture-of-experts' setup
Both text LLMs use a "mixture-of-experts" setup that only activates the expert subnetworks needed for each query. This means they run efficiently without wasting compute.
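To make the idea concrete, here is a minimal toy sketch of mixture-of-experts routing in NumPy. The dimensions, expert count, and top-k value are illustrative assumptions, not Sarvam's actual architecture: a small router scores the experts for each token, and only the top-scoring few run.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes for illustration only (not Sarvam's real config).
D_MODEL = 8      # hidden size of a token vector
N_EXPERTS = 4    # total expert feed-forward blocks
TOP_K = 2        # experts actually activated per token

# Each "expert" is reduced to a single weight matrix here.
experts = [rng.standard_normal((D_MODEL, D_MODEL)) * 0.1 for _ in range(N_EXPERTS)]
router = rng.standard_normal((D_MODEL, N_EXPERTS)) * 0.1  # gating weights

def moe_layer(x):
    """Route token vector x to its top-k experts; the rest stay idle."""
    logits = x @ router
    top = np.argsort(logits)[-TOP_K:]        # indices of the chosen experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                 # softmax over chosen experts only
    # Only the selected experts run, so per-token compute scales
    # with TOP_K rather than N_EXPERTS.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(D_MODEL)
out = moe_layer(token)
print(out.shape)  # (8,)
```

The payoff is that a model can hold many experts' worth of parameters while each token only pays for a small, fixed number of them.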
Training was backed by India's government and powered by Yotta and NVIDIA gear.
Sarvam plans to open-source these models soon.
Competing with global giants in India's AI market
Having raised more than $50 million from investors including Lightspeed Venture Partners and Khosla Ventures, Sarvam is positioning its homegrown models to compete with global players such as Google and OpenAI in India's booming AI scene.