OpenAI Says It Has No Plan to Use Google’s In-House Chip
OpenAI said it has no active plans to use Google’s in-house chip to power its products, two days after Reuters and other news outlets reported that the AI lab was turning to its competitor’s artificial intelligence chips to meet growing demand.
A spokesperson for OpenAI said on Sunday that while the AI lab is in early testing with some of Google’s tensor processing units (TPUs), it has no plans to deploy them at scale right now.
Google declined to comment.
While it is common for AI labs to test out different chips, using new hardware at scale could take much longer and would require different architecture and software support. OpenAI is actively using Nvidia’s graphics processing units (GPUs) and AMD’s AI chips to meet its growing demand. OpenAI is also developing its own chip, an effort that is on track to meet the “tape-out” milestone this year, when the chip’s design is finalized and sent for manufacturing.
OpenAI has signed up for Google Cloud service to meet its growing needs for computing capacity, Reuters had exclusively reported earlier this month, marking a surprising collaboration between two prominent competitors in the AI sector. Most of the computing power used by OpenAI would come from GPU servers supplied by the so-called neocloud company CoreWeave.
Google has been expanding the external availability of its in-house AI chips, or TPUs, which were historically reserved for internal use. That has helped Google win customers, including Big Tech player Apple, as well as startups like Anthropic and Safe Superintelligence, two OpenAI competitors launched by former OpenAI leaders.
© Thomson Reuters 2025