OpenAI has recently begun renting Google's artificial intelligence chips to power ChatGPT and its other products, a source close to the matter told Reuters on Friday.
The ChatGPT maker is one of the largest buyers of Nvidia's graphics processing units (GPUs), using the AI chips to train models as well as for inference computing, a process in which an AI model uses its trained knowledge to make predictions or decisions based on new information.
OpenAI planned to add Google Cloud services to meet its growing need for computing capacity, Reuters exclusively reported earlier this month, marking a surprising collaboration between two prominent competitors in the AI sector.
For Google, the deal comes as it is expanding the external availability of its in-house tensor processing units (TPUs), which were historically reserved for internal use. That move helped Google win customers including big tech player Apple as well as startups such as Anthropic and Safe Superintelligence, two ChatGPT-maker competitors launched by former OpenAI leaders.
OpenAI's move to rent Google's TPUs marks the first time it has meaningfully used non-Nvidia chips, and it signals the Sam Altman-led company's shift away from relying on backer Microsoft's data centers. It could potentially boost TPUs as a cheaper alternative to Nvidia's GPUs, according to The Information, which first reported the development.
OpenAI hopes the TPUs, which it rents through Google Cloud, will help lower the cost of inference.
However, Google, an OpenAI competitor in the AI race, is not renting its most powerful TPUs to its rival, The Information said, citing a Google Cloud employee.
Google declined to comment, while OpenAI did not immediately respond when contacted by Reuters.
By adding OpenAI to its customer list, Google showed how the tech giant has capitalized on its in-house AI technology, from hardware to software, to accelerate the growth of its cloud business.
© Thomson Reuters 2025