The Evidence Is Piling Up: Nvidia's AI Chip Dominance May Be About to Come to an End
Nvidia (NVDA) has been one of the biggest beneficiaries of the artificial intelligence (AI) chip boom. Its graphics processing units (GPUs) are parallel processors, designed to break certain types of massively complex calculations into a host of smaller parts and then perform all of those small calculations simultaneously, rather than taking each task in sequence. As it turns out, training large language models (LLMs) depends heavily on just the sort of tasks where GPUs excel.
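To make the idea concrete, here is a minimal Python sketch of that divide-and-compute-simultaneously pattern. It uses a small thread pool to stand in for a GPU's thousands of hardware lanes; the function names and chunking scheme are illustrative, not anything from Nvidia's software.

```python
# Illustrative sketch (not Nvidia code): the GPU-style approach splits one
# big calculation into many independent pieces and works on them
# concurrently, instead of stepping through the whole job in sequence.
from concurrent.futures import ThreadPoolExecutor


def partial_sum_of_squares(chunk):
    # One small, independent piece of the larger calculation.
    return sum(x * x for x in chunk)


def sequential_sum_of_squares(data):
    # Sequential style: one worker walks the entire dataset, task by task.
    return sum(x * x for x in data)


def parallel_sum_of_squares(data, workers=4):
    # Parallel style: split the problem into independent chunks...
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # ...compute every chunk at the same time, then combine the results.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum_of_squares, chunks))
```

Both functions produce the same answer; the difference is that the parallel version's chunks are independent of one another, which is exactly the property that lets a GPU throw thousands of cores at an LLM training step at once.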
As a result, over the past few years, demand for Nvidia’s industry-leading GPUs has skyrocketed, driving stunning growth in the company’s revenue and earnings.
Major hyperscalers and AI companies, such as Amazon (AMZN), Microsoft, Meta Platforms, and Alphabet's (GOOG) (GOOGL) Google, have long relied on Nvidia's hardware to train powerful AI models.
Notably, Nvidia's rivals haven't been able to make much of a dent in its AI chip dominance: the company controls an estimated 81% of the AI data center chip market, according to IDC. The good news for Nvidia stock investors is that this red-hot growth could continue. The company is forecasting $1 trillion in total sales for its Blackwell and Vera Rubin architectures across 2026 and 2027.
However, there is ample evidence that Nvidia’s position in AI chips is gradually weakening.
Nvidia’s customers are turning into competitors
Training LLMs requires enormous computing power, which is why Amazon, Meta, Microsoft, Alphabet, and others have been purchasing millions of Nvidia GPUs. However, these customers have also long been designing their own chips to run AI workloads more cost-effectively in their data centers, an effort driven by the high cost and constrained supply of Nvidia's popular graphics cards.
Google, for instance, deployed the first generation of its Tensor Processing Unit (TPU) in 2015, while Amazon unveiled its in-house Trainium chip in December 2020. Both companies have improved their chips over the years, and both are now selling them to third parties.
Amazon recently revealed that its chip business recorded 40% sequential growth in the first quarter of 2026, and that the segment's annual revenue run rate now exceeds $20 billion. What's more, the "Magnificent Seven" company notes that this run rate is improving by triple-digit percentages year over year.
Notably, the segment's annual run rate would be closer to $50 billion if it included the "sales" of chips that Amazon makes to itself for use in AWS data centers. Demand for Trainium is so strong that access to the chips is fully booked. Amazon's custom AI processors are being deployed by Anthropic, OpenAI, Uber, and even Meta Platforms, which uses Amazon's in-house Graviton central processing unit (CPU) to support agentic AI applications.
As it turns out, Amazon has a whopping $225 billion in purchase commitments for its Trainium AI chips, clearly suggesting that its semiconductor business is poised for terrific growth.
Meanwhile, Google has also been making waves in the AI chip market. The tech giant has sizable deals in place with Meta Platforms and Anthropic for the deployment of its TPUs. CEO Sundar Pichai sees the TPU business as one of its key growth drivers, and the company is now selling its chips to more customers.
On Alphabet’s latest earnings call, Pichai remarked:
As TPU demand grows from AI labs, capital markets firms, and high-performance computing applications, we will begin to deliver TPUs to a select group of customers in their own data centers in the hardware configuration to expand our addressable market opportunity.
This addressable opportunity could be massive. Though Google hasn't publicly revealed the size of its TPU business, investment firm D.A. Davidson estimates that it could eventually be worth a whopping $900 billion, assuming the company decides to sell its chips to third parties in earnest.
It now appears that Google is indeed becoming serious about its TPU business, and that’s likely to create more problems for Nvidia’s AI chip empire.
Can Nvidia fight back?
Nvidia isn't going to sit and watch while its customers turn into competitors. The reason Amazon's and Google's custom processors have been gaining tremendous traction is that they are application-specific integrated circuits (ASICs): chips optimized to handle a relatively narrow range of workloads, in contrast to Nvidia's more flexible GPUs, which suit a broad range of tasks. Custom chips can thus perform AI inference more efficiently, reducing data centers' total operating costs.
Nvidia is countering the threat from the likes of Amazon and Google by improving its own hardware to significantly reduce the cost of AI inference on its GPUs. It has also decided to offer its Vera server CPU as a stand-alone product for the first time, rather than only as part of the Vera Rubin platform, a move prompted by strong customer interest. In fact, Nvidia believes its server CPU business could become a multibillion-dollar play.
Nvidia's efforts to push the envelope in product development should help it ward off the rising competition. Also, investors shouldn't forget that the AI chip market continues to expand rapidly. Bank of America estimates that the global semiconductor market could clock $2 trillion in revenue in 2030, and Gartner estimates that AI chips will account for half of the global semiconductor market by the end of the decade.
So, there is ample room for more than one major player to thrive in this space. Nvidia reported $194 billion in data center revenue last year, and the size of the addressable market suggests it still has significant room for growth in this segment. Additionally, the company is taking steps to defend its dominance. As such, it is easy to see why analysts remain bullish about Nvidia’s prospects; it can continue to record healthy data center sales growth even if it loses some market share in AI chips.
NVDA Revenue Estimates for Current Fiscal Year data by YCharts.
Moreover, its forward earnings multiple of 24 is well below the tech-focused Nasdaq Composite index’s average earnings multiple of 40.6. The company’s earnings growth potential suggests it is undervalued right now. That’s why it makes sense to hold on to this AI stock, despite rising competition in the data center chip market.
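The valuation comparison above can be checked with simple arithmetic. The sketch below uses only figures quoted in this article (the $211.56 share price, Nvidia's forward multiple of 24, and the Nasdaq's 40.6 average); it is an illustration of the math, not investment analysis.

```python
# Back-of-the-envelope check of the valuation math, using figures from the
# article. Forward P/E = current share price / expected next-year EPS.
def forward_pe(price, expected_eps):
    return price / expected_eps


nvda_price = 211.56        # share price quoted in the article
nvda_forward_pe = 24.0     # forward multiple quoted in the article
nasdaq_forward_pe = 40.6   # index average quoted in the article

# Expected earnings per share implied by the quoted multiple (~$8.82):
implied_eps = nvda_price / nvda_forward_pe

# How far below the index average Nvidia's multiple sits (~41%):
discount = 1 - nvda_forward_pe / nasdaq_forward_pe
```

In other words, at the quoted numbers the stock trades at roughly a 41% discount to the Nasdaq Composite's average multiple, which is what underpins the "undervalued" argument.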