Future Tech

Delays? We're still shipping 'small quantities' of Nvidia's GB200 in Q4, Foxconn insists

Tan KW
Publish date: Thu, 15 Aug 2024, 06:09 AM

Nvidia's alleged Blackwell supply problem may not be as bad as first thought, according to Foxconn executives who claimed they would begin shipping a small volume of GB200 systems in the fourth quarter.

"We are on track to develop and prepare the manufacturing of the new AI server to start shipping in small volumes in the last quarter of 2024, and increase the production volume in the first quarter of next year," Foxconn spokesperson James Wu said in a report.

However, Wu hinted that the product's timeline may have changed, noting that it's normal for shipment schedules to shift when specs and technologies are upgraded. Whether or not that was actually the case for Nvidia's Blackwell parts, Wu insisted that Foxconn would be the first supplier to ship GB200 accelerators.

The GB200, announced this spring, is the second generation of Nvidia's Grace-superchip family, featuring a pair of 1,200W Blackwell GPUs alongside a 72-core Grace CPU. In its full form, 36 of these GB200 superchips - 72 GPUs in total - are designed to be packed into 18 1U servers, all interconnected by high-speed NVLink switch fabrics. Dubbed the DGX NVL72, the system boasts 13.5TB of HBM3e and 1.44 exaFLOPS of FP4 performance.
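Those rack-scale totals follow from the per-chip figures. A quick sanity check, assuming roughly 192GB of HBM3e per Blackwell GPU (a figure not stated in this article but consistent with the 13.5TB total):

```python
# Sanity-check the NVL72 totals cited above. The 192 GB-per-GPU figure
# is an assumption; the rest follows from the article's configuration.

superchips = 36
gpus_per_superchip = 2
gpus = superchips * gpus_per_superchip        # 72 GPUs across the rack

hbm_per_gpu_gb = 192                          # assumed HBM3e per Blackwell GPU
total_hbm_tb = gpus * hbm_per_gpu_gb / 1024   # in binary TB

print(f"{gpus} GPUs, {total_hbm_tb:.1f} TB HBM3e")  # 72 GPUs, 13.5 TB HBM3e
```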

The Foxconn executive's remarks come a little over a week after reports surfaced claiming that Nvidia had warned Microsoft that shipments of its Blackwell GPUs had been delayed until the first quarter of 2025.

The alleged issue is that Nvidia and its manufacturing partner TSMC may have run into challenges with the advanced packaging tech used to stitch the compute dies and HBM3e memory modules together. Making matters worse, CoWoS capacity remains extremely limited, with TSMC CEO C.C. Wei warning that the AI chip shortage could last through 2025.

As a result, Nvidia is said to be prioritizing its flagship GB200 parts over its lower-spec HGX B100 and B200 configurations, and will bring a new, trimmed-down version of Blackwell called the B200A to market. That chip will allegedly be monolithic and feature four HBM stacks, making it roughly half the size of the chip we looked at back in the spring.

In a statement to The Register in response to the reports, an Nvidia spokesperson reiterated that broad Blackwell sampling had begun and that production was on track to ramp in the second half.

Nvidia had previously promised Blackwell would start making its way into customers' hands in the latter half of 2024. At the time, this led us to believe that a small number of Blackwell chips would reach the market in Q4, with the vast majority reaching customers in 2025.

There's also the matter of Nvidia's H200, which only began shipping in volume in the third quarter. The parts are essentially a bandwidth-boosted version of the venerable H100, boasting 141GB of HBM3e good for 4.8TB/s of memory bandwidth. Those factors should make the H200 a popular option for large language model (LLM) inference, as performance there is largely limited by memory bandwidth and capacity.

However, the H200 also poses a potential problem for Nvidia's forthcoming B200A. Assuming Nvidia simply cut the original B200 in half to make it, the part would have 96GB of HBM capacity and 4TB/s of memory bandwidth.

The B200A may not offer much, if any, performance uplift either, considering the top-spec part only boasted about 2.5x the 8-bit floating point performance of its Hopper counterparts. Cut that in half and you're potentially looking at as little as a 25 percent uplift. Of course, if Nvidia maintains the B200's 1,000W power target, the gain could be higher depending on how far it can push the clocks.
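The halving arithmetic behind those figures can be sketched out directly. This is a back-of-envelope estimate using only the numbers cited above - the B200A's actual specs are unconfirmed, and the full-fat B200's 192GB/8TB/s figures are implied by the halved values in this article rather than stated in it:

```python
# Back-of-envelope comparison of a hypothetical halved-B200 ("B200A")
# against the H200, using the article's claims. Not confirmed specs.

B200 = {"hbm_gb": 192, "bw_tbps": 8.0, "fp8_vs_hopper": 2.5}

# Halve the dual-die B200 to approximate a monolithic B200A.
B200A = {
    "hbm_gb": B200["hbm_gb"] / 2,                 # 96 GB
    "bw_tbps": B200["bw_tbps"] / 2,               # 4.0 TB/s
    "fp8_vs_hopper": B200["fp8_vs_hopper"] / 2,   # 1.25x Hopper, i.e. ~25% uplift
}

H200 = {"hbm_gb": 141, "bw_tbps": 4.8}

uplift_pct = (B200A["fp8_vs_hopper"] - 1) * 100
print(f"B200A est: {B200A['hbm_gb']:.0f} GB, {B200A['bw_tbps']:.1f} TB/s, "
      f"~{uplift_pct:.0f}% FP8 uplift over Hopper")
print(f"H200:      {H200['hbm_gb']} GB, {H200['bw_tbps']} TB/s")
```

On these assumptions the H200 actually beats the estimated B200A on both memory capacity and bandwidth, which is the awkwardness the paragraph above is pointing at.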

Having said that, if Nvidia has actually run into production challenges and now has a bunch of Blackwell dies that can't be stitched together, a cut down version would be a pretty easy way to salvage existing inventory, especially if they can be sold at a lower cost. ®

 

https://www.theregister.com//2024/08/14/nvidia_foxconn_blackwell/
