NVIDIA says its new AI servers are up to 10x faster: here's why it matters
NVIDIA just announced its latest AI servers, claiming up to 10 times better performance for serving mixture-of-experts AI models, such as those from China's DeepSeek and Moonshot AI.
With 72 of its latest chips packed inside each server, NVIDIA is stepping up its game as rivals like AMD and, to a lesser extent, Cerebras compete in the AI hardware space.
What makes these servers special?
These new servers are built for "mixture-of-experts" AI models: systems that route each request to a few specialized sub-networks, or "experts," inside the model instead of running it through the entire network.
DeepSeek helped kick off this trend earlier in 2025, and now OpenAI, Mistral (France), and Moonshot AI are all on board too.
Thanks to high-speed links between the chips (NVIDIA's NVLink interconnect), the servers can serve these complex models faster than most competing hardware.
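For a sense of how that routing works, here is a minimal, illustrative sketch in Python: a toy "router" scores each token and only the top-scoring experts actually run. The expert count, sizes, and names here are made up for illustration and are not NVIDIA's, DeepSeek's, or anyone else's actual implementation.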
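```python
import numpy as np

# Toy sketch of mixture-of-experts routing (illustrative only, not vendor code).
# A small "router" scores each token, and only the top-k scoring experts run,
# so most of the model's parameters stay idle for any single token.

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # hypothetical number of expert sub-networks
TOP_K = 2         # how many experts each token is sent to
HIDDEN = 16       # hypothetical hidden size

# Each "expert" here is just a random linear layer standing in for a sub-network.
experts = [rng.normal(size=(HIDDEN, HIDDEN)) for _ in range(NUM_EXPERTS)]
router = rng.normal(size=(HIDDEN, NUM_EXPERTS))

def moe_layer(token: np.ndarray) -> np.ndarray:
    """Route one token through its top-k experts and mix their outputs."""
    scores = token @ router                 # one routing score per expert
    top = np.argsort(scores)[-TOP_K:]       # pick the k best-scoring experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                # softmax over the chosen experts only
    # Only the selected experts do any work; the rest are skipped entirely.
    return sum(w * (token @ experts[i]) for w, i in zip(weights, top))

token = rng.normal(size=HIDDEN)
print(moe_layer(token).shape)  # (16,): same output shape, but only 2 of 8 experts ran
```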
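The catch is that the experts a token needs may live on different chips, which is why fast chip-to-chip connections matter so much for serving these models.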
The competition heats up
Not wanting to be left behind, AMD is planning its own multi-chip AI servers for next year.
With both companies racing ahead, expect even faster and more efficient AI tech soon.