Future Tech

AI to boost datacenter capex by 28.5% and become the top server workload

Tan KW
Publish date: Fri, 28 Jun 2024, 06:16 PM

AI is currently the big driver of datacenter investment: it is expected to push capital expenditure on the facilities up by nearly 30 percent this year, and is on track to become the top server workload by deployment within a few years.

The AI boom continues to shape datacenter spending, according to the latest Cloud and Data Center Market Snapshot report from analyst firm Omdia. It found that AI applications represent the fastest growing category when measured in number of servers deployed per year.

Omdia has forecast that AI will overtake most other server workloads - such as databases and analytics - this year in terms of server deployments, and is set to surpass telecoms by 2027. AI is consistently called out as the top investment priority during capex allocation.

Last year, Omdia's data pointed to AI accounting for all server spending growth. Now it claims demand for AI has accelerated datacenter investment, with 2024 projected to see a 28.5 percent increase in capex spending "backed by the corporate cash reserves of major hyperscalers."

Server sales are set to grow 74 percent to $210 billion this year - up from 2023's figure of $121 billion. By comparison, datacenter thermal management spend is forecast to grow by a more modest 22 percent, reaching $9.4 billion. Power distribution infrastructure revenue will exceed $4 billion for the first time, and uninterruptible power supply revenue will grow 10 percent to $13 billion.
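As a rough sanity check on those figures, here is a minimal Python sketch; the 2023 baselines for thermal management and UPS are back-calculated from the stated growth rates, since only the 2024 numbers are quoted for them:

# Rough consistency check on Omdia's 2024 projections (figures in $ billions).
server_sales_2023 = 121.0
server_sales_2024 = server_sales_2023 * 1.74           # 74 percent growth
print(f"Projected 2024 server sales: ~${server_sales_2024:.0f}B")  # ~$211B, in line with the cited $210B
# Only the 2024 figures and growth rates are quoted for the next two items,
# so these 2023 baselines are implied, not reported.
thermal_2023_implied = 9.4 / 1.22                       # ~$7.7B
ups_2023_implied = 13.0 / 1.10                          # ~$11.8B
print(f"Implied 2023 thermal management spend: ~${thermal_2023_implied:.1f}B")
print(f"Implied 2023 UPS revenue: ~${ups_2023_implied:.1f}B")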

But when it comes to servers being procured for AI purposes, Omdia forecasts that the units destined for training AI models will increase at just five percent per year going forwards - compared with a rate of 17 percent for servers intended for inferencing.
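To see how quickly that gap compounds, here is a minimal Python sketch using the five percent and 17 percent annual rates cited above; the starting counts are hypothetical index values rather than Omdia data:

# Compounding of the cited growth rates for AI server deployments.
# Both fleets are indexed to 100 in 2024; the absolute values are hypothetical.
training_units = 100.0    # servers procured for AI training (index)
inference_units = 100.0   # servers procured for AI inferencing (index)
for year in range(2024, 2029):
    print(f"{year}: training={training_units:6.1f}  inference={inference_units:6.1f}")
    training_units *= 1.05     # ~5 percent annual growth
    inference_units *= 1.17    # ~17 percent annual growth

At those rates the inference fleet roughly doubles within five years, while the training fleet grows by barely a quarter.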

The reason is that server demand for AI training is largely driven by a small number of hyperscalers. Those buyers focus on squeezing maximum efficiency out of their AI-optimized hardware, which reduces the number of servers they need to buy.

AI training can be classified as an R&D activity, and will therefore become subject to plan-based budget allocation, according to Omdia - meaning a set share of revenue gets reinvested in it.

Conversely, the number of servers needed for inferencing will grow as the number of people using AI applications increases.

Meanwhile, a side effect of the growth in demand for more powerful server hardware has been a parallel boom in the deployment of liquid cooling systems.

Omdia's data shows that single-phase direct-to-chip technology is by far the most popular variety of liquid cooling tech, thanks to its simplicity and maturity, and this is likely to remain the case.

In contrast, two-phase direct-to-chip cooling utilizes phase-change to manage higher chip loads and is currently a niche technology - but Omdia hinted at "significant growth prospects."

The growth of immersion cooling systems "fell short" last year because of regulatory and cost barriers, Omdia observes, and the technology is still largely seen as the preserve of high performance computing.

Total revenue for liquid cooling systems looks set to top $5 billion by 2028, according to Omdia's projections, crossing the $2 billion mark by the end of this year. ®
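For context, going from roughly $2 billion at the end of 2024 to $5 billion by 2028 implies a compound annual growth rate in the mid-20-percent range. A minimal Python sketch of that calculation, assuming a four-year gap between the two figures:

# Implied compound annual growth rate (CAGR) for liquid cooling revenue,
# assuming the $2B (end of 2024) and $5B (2028) projections are four years apart.
start_revenue = 2.0   # $ billions, end of 2024
end_revenue = 5.0     # $ billions, 2028
years = 4
cagr = (end_revenue / start_revenue) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # roughly 25.7 percent per year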

https://www.theregister.com//2024/06/28/datacenter_capex_tai/
