Nvidia Stock Investors Just Got Great News From CEO Jensen Huang. It's Time to Buy.
Nvidia (NVDA 1.56%) has been a cornerstone of the artificial intelligence (AI) trade since AI became a popular investment theme following OpenAI’s release of ChatGPT in late 2022. The stock has advanced 1,100% since January 2023, but shares have added just 1% in the past six months.
Fortunately, CEO Jensen Huang just gave investors great news that should ease concerns about whether AI spending is sustainable and whether Nvidia can hold its leadership position in AI infrastructure. Here are the important details.
Jensen Huang says data center spending will reach $3 trillion to $4 trillion annually by 2030
Many investors are worried about the sustainability of AI spending, but Nvidia CEO Jensen Huang attempted to quell those concerns during the company’s fourth-quarter earnings call in February. “Compute demand is growing exponentially — the agentic AI inflection point has arrived,” he told analysts.
Companies are developing increasingly complex models to keep pace in the AI arms race. For instance, OpenAI's GPT-3 models generated text by predicting the most probable next word, meaning they were essentially sophisticated autocomplete systems. The newer GPT-5 models, by contrast, solve multistep problems through reasoning.
Reasoning models are more compute-intensive, which means more Nvidia GPUs are needed for training and inference. Reasoning models also produce better outcomes and broaden the number of AI use cases, per JPMorgan strategist Stephanie Aliaga. “Beneath the near trillion-dollar headlines is a real computing platform shift decades in the making that is reshaping industries and business models,” she said.
Huang also told analysts how reasoning models will evolve: “The wave that we’re seeing now is the agentic AI inflection and the next inflection beyond that is physical AI, where we take AI and these agentic systems into physical applications.” Physical AI is an emerging discipline focused on autonomous machines like robots and vehicles. “That’s a giant opportunity,” he told analysts.
Huang expects data center spending to reach $3 trillion to $4 trillion annually by 2030. For context, the top five hyperscalers are forecast to spend $700 billion on capital expenditures (capex) in 2026, suggesting total industry capex of somewhere around $1 trillion that year. So Huang thinks the market can triple or quadruple by the end of the decade, which implies annual growth of roughly 32% to 41% between 2026 and 2030.
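That growth math can be checked with a quick back-of-the-envelope calculation. The sketch below assumes the article's figures: a $1 trillion base in 2026 and Huang's $3 trillion to $4 trillion range for 2030, four years later.

```python
# Implied compound annual growth rate (CAGR) from the article's figures.
# Assumptions: ~$1T total data center capex in 2026, $3T-$4T by 2030.
base = 1.0             # estimated 2026 spending, in trillions of dollars
targets = (3.0, 4.0)   # Huang's 2030 range, in trillions
years = 4              # 2026 -> 2030

for target in targets:
    cagr = (target / base) ** (1 / years) - 1
    print(f"${target:.0f}T by 2030 implies ~{cagr:.0%} annual growth")
# -> tripling implies ~32% per year; quadrupling implies ~41% per year
```

The result matches the 32% to 41% range cited above.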
So what? McKinsey & Company estimates data center GPUs and networking equipment account for over 50% of data center spending. Bernstein and TD Cowen have made similar estimates. Nvidia is the dominant supplier in both markets, meaning the company could soon have a multitrillion-dollar opportunity in the data center segment alone.
Jensen Huang says Nvidia systems generate the most revenue at the lowest cost
Tokens are the fundamental unit of data processed by AI models during training and inference to enable predictions, content generation, and reasoning. One token might be as short as a single character or as long as a word, depending on the language. In English, one token corresponds to roughly 0.75 words, or about four characters, according to OpenAI.
Jensen Huang says inference tokens per watt, a measure of performance per unit of power consumed, is the most important revenue metric for cloud service providers. The profitability of a cloud platform is directly tied to the number of tokens it can process or generate per watt. In that respect, Nvidia has a competitive moat.
Nvidia essentially builds entire data centers, supplying clients with rack-scale solutions for AI computing that span GPUs, CPUs, and networking. That lets the company optimize for performance and power efficiency at the system level rather than the component level, which gives it an important edge. “Nvidia produces the lowest cost per token and data centers running on Nvidia generate the highest revenues,” according to Huang.
Beyond that, Nvidia also has a more robust ecosystem of software development tools than any competitor. Much ado has been made about custom chips posing a threat to Nvidia’s dominance in AI infrastructure, but custom chips need custom software that developers must build from scratch. Nvidia systems eliminate that expense, making its AI compute platform even more attractive.
Here’s the big picture: Nvidia will likely remain the industry standard in AI infrastructure for the foreseeable future, and its addressable market is expanding quickly. Indeed, Wall Street estimates the company’s earnings will grow 38% annually over the next three years. That makes the current valuation of 37 times earnings look very attractive. Patient investors should feel comfortable buying a position in this stock today.
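One conventional way to frame that valuation claim is the PEG ratio, the price-to-earnings multiple divided by the expected growth rate. The sketch below uses only the two figures cited above; a PEG near 1 is often read as fair value, and lower as attractive.

```python
# PEG ratio from the article's figures (an illustration, not investment advice).
pe_ratio = 37      # current price-to-earnings multiple cited in the article
growth_rate = 38   # Wall Street's projected annual earnings growth, in percent

peg = pe_ratio / growth_rate
print(f"PEG ratio: {peg:.2f}")  # -> PEG ratio: 0.97
```

A PEG just under 1 is the arithmetic behind calling a 37x multiple attractive against 38% projected growth.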