Nvidia's Blackwell chip could push the company into a new stratosphere as the AI revolution continues
Nvidia stock (NVDA) is on pace for a comfortable triple-digit percentage gain again in 2024, after a nearly 240% surge in 2023.
This year’s rise is due in no small part to a product that wasn’t even shipped until the last quarter of the year — Blackwell.
It’s the largest GPU (graphics processing unit) ever built, created by connecting two dies via a high-bandwidth interface (HBI). In layperson’s terms, that translates to a lot of computing power at high efficiency, which is why it’s been in such demand from the so-called hyperscalers: companies like Alphabet (GOOG) and Microsoft (MSFT) that are building out huge data centers to power large language models (LLMs).
That hot demand for Blackwell and the 170% surge in Nvidia shares are what led Yahoo Finance to name the chip its 2024 Product of the Year.
“Technically, Blackwell is a beast,” Matt Kimball, an analyst at Moor Insights & Strategy, told Yahoo Finance in an email. “Blackwell is an exponential leap forward as it is a dual GPU on a single chip with faster connectivity, a larger pool of high bandwidth memory (HBM3E), and the introduction of NVIDIA’s decompression engine to make data processing much faster (up to 6x relative to Hopper).”
The breakthrough AI chip
First announced in March, Blackwell is in the right place at the right time. Data centers that power generative AI and LLMs are currently occupied with ingesting data and training those models, work that Blackwell and its predecessor, Hopper, are well-suited for.
Blackwell represents an enormous advance in computing power. It contains 208 billion transistors, more than two and a half times the 80 billion in Hopper.
The GB200 NVL72 server, which combines 72 Blackwell GPUs with 36 Grace CPUs, delivers up to a 30x performance increase over the same number of Hopper GPUs on LLM inference workloads, while using up to 25x less energy.
“For us to leapfrog ourselves by an order of magnitude is pretty unheard of,” Dion Harris, director of accelerated data center, HPC (high-performance computing), and AI at Nvidia, said in a phone interview. “We were limited by physics, but we recognized that innovation with the HBI would allow us to extend the die-level communication and compute.”
US business spending on generative AI has sextupled in a year, going from $2.3 billion in 2023 to $13.8 billion in 2024, according to Menlo Ventures. And the trend is only going up, as major companies from banking and retail to tech and hospitality race to introduce advanced chatbots and assistants to their customers.
Nvidia’s clients are paying dearly for its chips. The company doesn’t break out Blackwell sales specifically, but KeyBanc analyst John Vinh estimates that Blackwell will account for $4 billion to $5 billion in sales in the current quarter.
“After Q4, we should start to see a pretty quick ramp,” he said in a phone interview. “For next year, we’re modeling $187 billion in compute data center revenues, of which 80% of that should be Blackwell.”
“Having such dominant market share, they have pricing power,” he added. “As a hyperscaler, you pretty much have to pay Nvidia whatever they want.”
Nvidia’s rivals, such as AMD (AMD) and even Amazon (AMZN), are working hard to develop their own alternatives. How long the GPU giant’s AI chip dominance can last has been up for debate ever since its shocking upside revenue beat in May 2023.
Can Nvidia’s dominance continue?
Those questions have gotten louder as of late. Nvidia's shares have pulled back 12% from their record close on Nov. 7. And since the beginning of December, competing AI announcements have been getting buzz, including Amazon’s Trainium2 chips and its Nova family of AI models.
Broadcom (AVGO) has been gaining steam since its latest earnings report, which helped catapult its market cap to over $1 trillion. The crux of the excitement is the company’s custom AI chip business, which CEO Hock Tan said would reach a “total serviceable market” of $60 billion to $90 billion by 2027.
That said, “No one has been able to come out with a successful custom solution” that matches Nvidia’s, Vinh said.
Because Nvidia’s earnings have been growing at a rapid clip alongside its share price, its valuation has remained competitive with other chipmakers. Its forward price-to-earnings ratio, for example, stands at nearly 31, compared to 25 for AMD and 46 for Broadcom, according to Yahoo Finance data.
Companies like Amazon remain Nvidia customers even as they develop competing products. Nvidia dominates the market for training generative AI models — accounting for roughly 85% share — “and they’re not losing,” said Ben Bajarin, CEO of Creative Strategies.
But once that training is complete, the models turn to inference: putting all the information they’ve digested and learned to work. The computing power needed for inference is generally lower than for training, though Nvidia CEO Jensen Huang has consistently maintained that his company’s chips are well-suited to that stage as well.
One factor in whether customers will continue to buy is cost. In addition to being the most advanced chips, Nvidia’s are also the most expensive. The company touts its TCO, or total cost of ownership, emphasizing computing power relative to energy usage over the life of the chips.
“I doubt they’ll have the same market share they have today in inference,” Bajarin said. “There will be much more competition for inference.”
Thus far, however, Nvidia’s biggest point of friction has been its own supply chain. A chip of Blackwell’s complexity and power has hit manufacturing and design snags, as Huang has acknowledged. After delays, the chip is expected to ship in the current quarter, with production ramping up next year.
“Blackwell production is in full steam,” Huang said in the company’s latest earnings call. “I think we’re in great shape with respect to the Blackwell ramp at this point.” Still, the company said demand will exceed supply for several quarters.
Nvidia’s pace of overall sales growth is predicted to slow over the next year, a function of the “law of large numbers,” Jordan Klein, an analyst at Mizuho Group, said in an email. Nvidia’s sales rose 126% in calendar 2023 over the prior year. They’re expected to rise 111% this year, according to analysts surveyed by Bloomberg, and 52% next year.
“Nothing to worry about really, demand still exceeds supply,” Klein said.
Julie Hyman is the co-host of Market Domination on Yahoo Finance. You can find her on social media @juleshyman.