NVIDIA CEO Jensen Huang has said that the Blackwell B200 AI chip, unveiled at GTC 2024, could cost between $30,000 and $40,000 per unit, while surpassing its predecessors in performance and efficiency.
Speaking to CNBC shortly after his GTC 2024 keynote, NVIDIA CEO Jensen Huang said the company's cutting-edge AI chip, the Blackwell B200, could cost anywhere between $30,000 and $40,000 per unit. According to the CNBC report, NVIDIA has spent over $10 billion on research and development of the Blackwell chip, which the company says outperforms every other AI chip on the market by a wide margin.
The new Blackwell AI chip, already in high demand, is priced slightly above NVIDIA's H100 Hopper processor, which was introduced back in 2022 and is said to cost between $25,000 and $40,000. The quoted price covers not just the chip itself but also the cost of integrating it into a data centre.
Fueled by the new NVIDIA Blackwell architecture, the groundbreaking NVIDIA GB200 NVL72 arrives to power a new era of computing and generative AI with unparalleled performance, efficiency, and scale. #GTC24
— NVIDIA Data Center (@NVIDIADC) March 18, 2024
Compared with the H100, the B200 is not only more powerful but also more power-efficient. NVIDIA announced three variants of its latest AI accelerators on Monday: the B100, the B200, and the GB200, which combines two Blackwell GPUs with an Arm-based Grace CPU. NVIDIA says Blackwell is up to four times faster than Hopper, and the B200 carries 192GB of HBM3e memory, aimed at AI workloads, cloud, and data centres.
In terms of benchmark numbers, NVIDIA says the new chip offers seven times the performance of the H100 on a GPT-3 benchmark, and that 2,000 Blackwell GPUs could train a model with over 1.8 trillion parameters, the scale widely attributed to GPT-4, in just 90 days.
A single Blackwell GPU packs 208 billion transistors and can deliver up to 20 petaflops of FP4 compute. The GB200, which pairs two Blackwell GPUs with a Grace CPU, is claimed to deliver up to 30 times the LLM inference performance of the H100 while cutting energy consumption by as much as 25 times.
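As a rough, back-of-the-envelope sanity check on these figures, the short Python sketch below multiplies the numbers quoted above (2,000 GPUs running for 90 days at up to 20 petaflops of FP4 each) into a total compute budget and converts it into trainable tokens using the common "~6 × parameters × tokens" rule of thumb for transformer training. The utilisation figure is an illustrative assumption, not an NVIDIA number, and real throughput depends heavily on precision, networking, and software.

```python
# Back-of-envelope check on the "2,000 Blackwell GPUs, 90 days" training claim.
# Only the GPU count, duration, peak FP4 throughput and the ~1.8T parameter
# count come from the article; utilisation and the 6*N*D rule are assumptions.

PETAFLOP = 1e15

num_gpus = 2_000                    # GPUs quoted in the article
days = 90                           # training duration quoted in the article
peak_fp4_per_gpu = 20 * PETAFLOP    # up to 20 petaflops of FP4 per Blackwell GPU
utilisation = 0.35                  # assumed sustained fraction of peak (illustrative)

seconds = days * 24 * 3600
total_flops = num_gpus * peak_fp4_per_gpu * utilisation * seconds

# Rule of thumb: training compute ~ 6 * parameters * tokens.
params = 1.8e12                     # ~1.8 trillion parameters, as cited for GPT-4-class models
tokens = total_flops / (6 * params)

print(f"Total compute budget : {total_flops:.2e} FLOPs")
print(f"Tokens trainable     : {tokens:.2e}")
```

Under these assumptions the cluster works out to a budget on the order of 10^26 FLOPs, or roughly ten trillion training tokens, which at least shows the quoted cluster size and timeline are in a plausible range for a model of that scale.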
Source: indianexpress.com