Will NVIDIA Be Knocked Off Its Perch By Quantum?
Earlier this month, NVIDIA CEO Jensen Huang seemingly dashed the hopes of speculative tech investors banking on quantum computing to become the next big thing in the ever-advancing race for faster and faster computer processing. Huang, speaking at the Consumer Electronics Show, stated his view that practical quantum computers were still about two decades away, causing a selloff among the stocks of quantum startups.
While Huang is certainly an authority on the subject of cutting-edge computing, he has also received his share of criticism for presenting such a pessimistic view of quantum computing. Alan Baratz, CEO of D-Wave Quantum, called Huang’s remarks “dead wrong”.
Some critics have gone even further, suggesting that his negative comments were meant to cover up quantum computing’s potentially fundamental threat to NVIDIA’s business. We take a look at the cases for and against quantum computing to determine whether this emerging technology has the potential to disrupt NVIDIA.
Key Points
- Quantum computing offers exponential power growth but faces major hurdles, including extreme operating conditions and error sensitivity.
- NVIDIA is investing in hybrid quantum-classical solutions and leveraging its resources to adapt to quantum advancements.
- Practical quantum computing is unlikely to disrupt NVIDIA soon, as high costs and technical barriers will sustain demand for classical GPUs in the foreseeable future.
The Argument for Quantum Computing Shaking up NVIDIA
On paper, there’s a fairly good argument to be made that quantum computing presents risks for NVIDIA.
Quantum computers have massive theoretical advantages over traditional computing systems, even the very advanced kind that NVIDIA specializes in. For this reason, some quantum bulls believe that the technology’s ongoing development will pose a serious threat to NVIDIA’s GPU business.
Quantum computers are built from units known as qubits, which take the place of individual transistors in a classical computer. To get somewhat technical for a moment: while a transistor in a classical computer stores a single value, either a 0 or a 1, at any given time, a qubit can hold both values simultaneously thanks to the property of quantum superposition. Put simply, a classical computer’s power increases roughly linearly as more transistors are added, while a quantum computer’s power rises exponentially as more qubits are added.
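To make that scaling difference concrete, here is a minimal Python sketch. It is purely illustrative, not tied to any real hardware, and the function names are our own: it compares the linear growth of classical bit capacity with the exponential growth in the number of amplitudes needed to describe a quantum register.

```python
# Purely illustrative comparison of classical vs. quantum scaling.
# Classical capacity grows linearly with transistor count, while the number
# of complex amplitudes describing an n-qubit register grows as 2**n.

def classical_bits(num_transistors: int) -> int:
    """Roughly one bit per transistor: linear growth."""
    return num_transistors

def quantum_amplitudes(num_qubits: int) -> int:
    """An n-qubit state is described by 2**n complex amplitudes."""
    return 2 ** num_qubits

for n in (10, 20, 30, 40):
    print(f"{n} transistors -> {classical_bits(n)} bits; "
          f"{n} qubits -> {quantum_amplitudes(n):,} amplitudes")
```

At just 40 qubits, describing the state already takes more than a trillion amplitudes, which is the exponential advantage quantum bulls point to.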
This ability to increase computing power exponentially allows quantum computers to tackle problems that even today’s most powerful classical computers would take unfathomably large amounts of time to solve. For example, the Willow quantum chip recently unveiled by Alphabet can perform in five minutes a calculation that would take a classical supercomputer 10 septillion years. For reference, current estimates put the age of the universe at only about 13.8 billion years.
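As a quick sanity check on that comparison, the back-of-the-envelope arithmetic, using only the figures quoted above, looks like this:

```python
# Back-of-the-envelope arithmetic using the figures quoted in the text.
classical_runtime_years = 1e25     # "10 septillion" years (short scale: 10 * 10**24)
age_of_universe_years = 1.38e10    # ~13.8 billion years

ratio = classical_runtime_years / age_of_universe_years
print(f"~{ratio:.1e} times the current age of the universe")  # ~7.2e14
```

In other words, the quoted classical runtime is on the order of hundreds of trillions of universe lifetimes.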
The Realities of Quantum Computing
Although proponents of quantum computing might be bullish on it overtaking classical computing, there are some very high hurdles to overcome. For example, quantum computers must operate under extremely specific conditions, including cooling to nearly absolute zero with liquid helium. That alone creates challenges in maintaining the environment a quantum computer needs in order to run.
Because they require such precise conditions to function, quantum computers are also notoriously error-prone. A qubit’s quantum state can be thrown off by something as simple as an external vibration or a slight change in temperature, causing the information it stores to be lost.
The more qubits a quantum computer includes, the less time it has to perform a calculation before one or more of its qubits is disturbed by some form of external noise. It’s even possible that this shrinking window for computation, as qubit counts increase, places a hard limit on how large a functioning quantum computer can be.
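A rough sketch of that effect, assuming each qubit independently decoheres at a fixed rate (the rate below is a made-up illustrative figure, not a measured property of any real device), shows how the usable time window shrinks as qubits are added:

```python
import math

# Toy model: if each qubit independently decoheres at a fixed rate, the
# probability that ALL n qubits stay coherent for time t is exp(-n * rate * t).
# The rate below is an assumed, illustrative number, not a real-device spec.
ERROR_RATE_PER_QUBIT = 1e-3  # assumed decoherence events per qubit per microsecond

def time_budget_us(num_qubits: int, target_success: float = 0.99) -> float:
    """Longest run time (in microseconds) keeping P(no decoherence) >= target."""
    return -math.log(target_success) / (num_qubits * ERROR_RATE_PER_QUBIT)

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} qubits -> ~{time_budget_us(n):.3f} µs of usable computation time")
```

Under this toy model, ten times as many qubits means roughly one tenth the time available before an error becomes likely, which is the scaling concern described above.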
There’s also a serious headwind stemming from research that has called into question whether classical computers are really doomed to underperform quantum computers.
One research paper published last year showed how classical computers can use sophisticated algorithms to mimic the workings of quantum computers without the information-loss problems described above. If that approach can be brought to fruition, classical computing may remain relevant for longer and narrow the gap between the two technologies.
Together, these problems put very real obstacles between current quantum computing technology and widespread commercial viability. Although quantum computers could one day help solve complex problems like accurately predicting large weather systems or designing novel pharmaceuticals, the technology is still young and must overcome several major complications before real-world businesses can deploy it at scale.
How Likely Is It That Quantum Computing Will Disrupt NVIDIA?
Bringing all of this back down to earth, it starts to look fairly likely that Jensen Huang’s timeline for practical quantum computing is correct. Solving the engineering challenges that stand between today’s quantum computers and day-to-day usefulness will probably require years of additional research and vast amounts of resources.
While quantum computing has real potential to disrupt NVIDIA’s present business of selling cutting-edge classical GPUs over the long term, there’s little reason to believe that shift will happen any time in the immediate future.
So, will quantum computing disrupt NVIDIA? In the end, it seems unlikely to do so in any fundamental way. Even if practical quantum computers become a reality faster than Jensen Huang expects, their costs and complications will likely still leave plenty of room for high-speed classical computing in the market. While quantum may well be the next frontier in technological advancement, it doesn’t pose an immediate risk to a business as large and successful as NVIDIA has become.