Google announced a long-rumored datacenter CPU last month, but the Chocolate Factory may already be the third-largest designer of datacenter processors, according to research.
The search giant, like the other big cloud operators, makes its own custom silicon, in this case its Tensor Processing Units (TPUs). Although the TPUs are not offered for sale, shipments of Google's custom silicon for its own bit barns reached the 2 million mark last year, which semiconductor research outfit TechInsights believes puts it behind only Nvidia and Intel in terms of market share.
TechInsights says it has gleaned these figures from its "unique expertise in the semiconductor supply chain," hints from memory players, and its own coverage of the ASIC market.
In a newly published report, TechInsights claims that Google's TPU shipments accelerated with each new generation, following the company's growth. With TPU v4 (introduced in 2021) and the emergence of large language models, the size of Google's chip business significantly increased (see chart below), benefiting its main ASIC partner, Broadcom.
The cloud giant's strategy is to run TPUs for internal workloads and Nvidia GPUs for cloud computing, and TechInsights says it believes Google actually has the largest installed base of AI accelerators across the industry and the largest AI computing infrastructure.
How did this happen? According to the report, the server market went through a "massive inventory correction" last year, as the hyperscalers bumped up the lifetime of their installed base (extending amortization by delaying replacing servers) and allocated more capital expenditure toward deploying accelerated servers and Nvidia GPUs.
This has already been noted by other industry watchers such as Omdia, which reported last year and earlier this year that the hyperscalers were pumping investment into beefier systems to meet the demands of AI, while extending the lifecycle of existing servers.
When the market shares of AMD and Intel were diving in the first half of 2023, Google was "overtaking the former and chasing the latter," TechInsights claims, adding that it believes the Chocolate Factory's shipment value for Q1 2024 will move Google even closer to Intel, possibly poised to overtake it.
Google isn't the only cloudy company making its own custom processors and accelerators; Microsoft disclosed last year that it was working on Azure Cobalt, an Arm-based server CPU, and the Maia 100 AI accelerator chip, and AWS has for years been using its own Graviton Arm-based server CPUs.
In fact, a report issued by Bernstein Research last year estimated that nearly 10 percent of servers worldwide are Arm-based, and more than 50 percent of those are deployed by AWS.
However, as the second chart (above) shows, AWS is still nowhere close to Google's TPU market share, if TechInsights' figures are correct. The analyst firm says it expects the fifth generation (TPU v5e and TPU v5p) to be more widely deployed than previous ones because of the "explosive growth" of large language models such as Google's own Gemini.
Meanwhile, the cloudy giant's in-house Axion Arm-based processor is slated to be available later this year, and TechInsights reckons it will ramp faster than Graviton, as Google has the software infrastructure readily available for the chip.
This move is "necessary to keep pace with AWS, Microsoft, and, to a lesser extent, Alibaba," according to the report, but it also shows how the datacenter semiconductor market is rapidly changing from one long dominated by Intel to one driven by the emergence of AI and accelerated computing. ®
https://www.theregister.com//2024/05/21/google_now_thirdlargest_in_datacenter/