d-Matrix raises $275 million to accelerate AI inference
Sid Sheth and Sudeep Bhoja, the Indian-born engineers behind d-Matrix, are making big moves in AI infrastructure.
Analysts estimate that inference accounts for more than 60% of total AI compute spend at major hyperscalers.
In November 2025, they secured $275 million in new funding to expand their platform, targeting higher throughput and significantly better energy efficiency than standard GPUs.
DIMC and JetStream NICs
d-Matrix's Corsair platform uses digital in-memory compute (DIMC) to deliver up to 30,000 tokens per second on Llama 70B and to support models with up to 100 billion parameters in a single rack.
The company is also rolling out JetStream NICs, with full production expected by the end of 2025, to scale deployments further and stay ahead in the race for faster, more energy-efficient AI.