Google parent Alphabet (GOOG, GOOGL) on Wednesday said that it plans to sell its custom Tensor Processing Units (TPUs) to select customers who will install the chips in their own data centers.
The move is a change from Google’s prior strategy, which saw it rent out TPU capacity to customers from its own data centers — and is yet another strike at AI chip king Nvidia (NVDA).
The announcement, during the company’s Q1 earnings call, comes a week after Alphabet announced two new TPUs: its TPU 8t for AI training and TPU 8i for inferencing.
“As TPU demand grows from AI labs, capital markets firms, and high-performance computing applications, we’ll begin to deliver TPUs to a select group of customers in their own data centers in a hardware configuration to expand our addressable market opportunity,” Alphabet CEO Sundar Pichai said on the call.
Alphabet didn’t disclose potential customers, but it signed a multiple-gigawatt agreement for next-generation TPUs with Anthropic (ANTH.PVT) earlier this month, with chips expected to begin coming online in 2027.
And according to The Information, Alphabet has also entered into a multibillion-dollar chip deal with Meta (META).
Alphabet’s TPU maneuvers put it into ever greater competition with Nvidia, which has largely dismissed any fears that Alphabet’s offerings will erode its lead in the space, saying that its chips offer greater flexibility for AI developers.
Google isn’t the only company moving in on Nvidia’s turf. Amazon (AMZN) is also offering up its own chips to customers.
In his annual shareholder letter, Amazon CEO Andy Jassy said that the company’s chip business, which includes its Graviton, Trainium, and Nitro processors, has an annual revenue run rate of greater than $20 billion.
But because Amazon monetizes its chips only through its AWS EC2 (Elastic Compute Cloud) service, Jassy said the $20 billion figure likely understates the business’s value, which he put closer to $50 billion.
Like Google, Amazon signed a new agreement for 5 gigawatts of AI chip capacity with Anthropic, and it also inked a deal for 2 gigawatts of chips with OpenAI.
On the CPU side, Amazon said it will deploy its AWS Graviton chips for Meta to use across its agentic AI workloads.
Email Daniel Howley at dhowley@yahoofinance.com. Follow him on X/Twitter at @DanielHowley.
