OpenAI turns to Google’s AI chips to power its products, report says

MKJEDI Ranger

06-29 0:39

Hey Questers,

Hope you all are doing well.


OpenAI has started renting artificial intelligence chips from Google to power ChatGPT and its other products, according to a report from Reuters.


Until now, OpenAI has relied mostly on Nvidia’s powerful AI chips (GPUs), both for training its AI models and for running them (a process called inference). But with growing demand, OpenAI needed more computing power and decided to also use Google Cloud services.


This is surprising because Google and OpenAI are major rivals in the AI space. Still, Google agreed to rent out its own AI chips, called TPUs (Tensor Processing Units), which were once used only by Google itself. Google is now making these chips available to outside customers, including big names like Apple and OpenAI rivals such as Anthropic.


This is the first time OpenAI is using non-Nvidia chips at a significant scale. It also shows that OpenAI is becoming less dependent on Microsoft, its main financial backer, whose data centers it usually uses.


The main reason for using Google’s TPUs is that they may be cheaper than Nvidia’s GPUs, which could help OpenAI cut costs, especially for inference workloads.


However, reports say Google is not giving OpenAI access to its most powerful TPUs, possibly to keep a competitive edge. Neither Google nor OpenAI has commented on the deal yet.


Overall, this move shows how Google is using its own AI hardware and cloud services to grow its cloud business, even if that means working with competitors like OpenAI.


Signing Off, 

@MKJEDI

Ranger@iQOO Connect

