OpenAI adopts Google AI chips to reduce costs
OpenAI has begun using Google's custom TPUs (tensor processing units) to run products like ChatGPT, aiming to cut costs and reduce its reliance on Microsoft.
The move helps OpenAI manage the growing expense of running massive AI models while exploring hardware options beyond NVIDIA's chips.
OpenAI wants to lower what it pays Microsoft
Google rarely lets outsiders use its TPUs, so this is a big deal—even if OpenAI can't access Google's latest hardware just yet.
Meanwhile, the relationship with Microsoft is getting complicated: OpenAI is pushing to reduce its payments to Microsoft and to renegotiate some exclusivity arrangements.
All in all, OpenAI seems keen to keep its options open as the AI race heats up.