How Google is helping OpenAI reduce its dependence on NVIDIA
OpenAI is renting Google AI chips

Jun 29, 2025, 06:36 pm

What's the story

OpenAI has started renting Google's artificial intelligence (AI) chips to power its ChatGPT and other products. The move comes as part of a collaboration between two major players in the AI space. OpenAI is one of the top customers of NVIDIA's graphics processing units (GPUs), using them for model training and inference computing. Google is now helping OpenAI lower this dependence.

Strategic partnership

Google expanding external availability of TPUs

Earlier this month, Reuters reported that OpenAI was looking to add Google Cloud services to meet its growing computing capacity needs. The collaboration comes as Google expands the external availability of its tensor processing units (TPUs), which were historically reserved for internal use. The move has already attracted customers such as Apple, as well as start-ups like Anthropic and Safe Superintelligence, both founded by former OpenAI leaders.

Chip transition

A major shift for OpenAI

The decision to rent Google's TPUs marks a major shift for OpenAI: it is the first time the company has made significant use of non-NVIDIA chips. It also signals a departure from its reliance on Microsoft's data centers.

Cost reduction

Google isn't renting its most powerful TPUs to OpenAI

OpenAI hopes that renting TPUs through Google Cloud will bring down the cost of inference. However, it's worth noting that Google is not renting its most powerful TPUs to OpenAI, according to a report by The Information citing a Google Cloud employee. The tech giant has not commented on the matter, while OpenAI did not immediately respond when contacted by Reuters.