Nvidia (NASDAQ: NVDA) is the world's leading supplier of data center graphics processing units (GPUs), the key pieces of hardware for developing artificial intelligence (AI). Since the beginning of 2023, Nvidia's market value has grown by an eye-popping $3 trillion thanks to soaring demand for those chips, but the company is just getting warmed up.
Every new generation of AI model so far has required significantly more computing capacity than the last, which is a major tailwind for Nvidia's hardware sales. During a recent conference call where management discussed the company's results for the fiscal 2026 first quarter (ended April 27), CEO Jensen Huang made a series of comments about future demand that should be music to investors' ears.
Nvidia stock looks attractive right now relative to its history, so here's why it could be a great buy on the back of Huang's latest remarks.
Demand for computing capacity could soar a thousandfold
Nvidia's H100 GPU, which was built on the company's Hopper architecture, was the top-selling data center chip for AI development during 2023 and for most of 2024. It was designed for both training and inference workloads; training is when developers feed mountains of data into AI models to make them "smarter," and inference is when a trained model uses what it has learned to generate responses for the end user.
In 2023 and 2024, AI chatbot applications were great at generating one-shot responses, meaning they prioritized speed when compiling information and feeding it to the end user. Those applications were revolutionary at the time, but the underlying large language models (LLMs) occasionally made mistakes or provided incomplete answers.
In 2025, next-generation "reasoning" models are solving that problem by autonomously cleaning up errors in the background before rendering responses. To put it another way, they spend time thinking to ensure the information they provide is as accurate as possible. This comes with a downside -- reasoning models take longer to generate answers, and they require significantly more computing capacity than their predecessors.
Nvidia designed a new architecture called Blackwell to power those inference workloads, and it delivers up to 40 times the performance of Hopper. But even that might not be enough, because Huang says some reasoning models consume a staggering 1,000 times more tokens (the chunks of words, punctuation, and symbols that models process) than the old one-shot LLMs.
Huang says the Blackwell-based GB200 NVL72 is the best system on the market for reasoning inference workloads right now, but Nvidia is also gearing up to ship its new Blackwell Ultra GB300 GPUs this year, which will offer even more performance. Hardware needs to keep getting better; otherwise, reasoning models will take too long to generate responses, and people simply won't use them anymore.
An annual opportunity worth $1 trillion
Nvidia's data center business generated $39.1 billion in revenue during the fiscal 2026 first quarter, a 73% increase from the year-ago period. It now accounts for 89% of the company's total revenue, so it's the main point of focus for investors.
At Nvidia's GTC conference in March, Huang told the audience that AI infrastructure spending will keep growing and could surpass $1 trillion annually by 2028, thanks to the incredible demand for inference computing capacity from reasoning models. Then, in his conference call with investors for the fiscal 2026 first quarter, he said Nvidia is on track to fill "most" of that demand, so the company's data center revenue probably still has room to soar.
Nvidia's hardware remains leaps and bounds ahead of the competition. Plus, it isn't just about chips -- the company covers the entire stack, selling networking equipment and offering a software platform called CUDA, which developers use to program and optimize its GPUs for specific tasks. Once data center operators are locked into the Nvidia ecosystem, it becomes very inconvenient (and expensive) to switch.
Nvidia stock looks like a bargain relative to its history
Based on Nvidia's $3.19 in trailing-12-month earnings per share (EPS), its stock is trading at a price-to-earnings (P/E) ratio of 44.3. That's a 26% discount to its 10-year average of 59.8, suggesting it's undervalued right now.
Plus, Wall Street's consensus estimate (provided by Yahoo! Finance) suggests Nvidia will generate $4.28 in EPS for the whole of fiscal 2026, placing its stock at a forward P/E ratio of just 32.1.
[Chart: NVDA P/E ratio, data by YCharts]
In other words, Nvidia stock would have to soar by 38% over the next year or so just to maintain its current P/E ratio, or by 86% for its P/E ratio to trade in line with its 10-year average, assuming Wall Street's EPS forecast proves to be accurate.
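For readers who want to check that math, here's a quick back-of-the-envelope sketch in Python. It uses only the figures quoted above; the implied share price is derived from the quoted forward multiple rather than taken from a live market quote.

```python
# Back-of-the-envelope check of the valuation math above.
# All inputs are the figures quoted in the article; the implied share
# price is derived from those ratios, not from a live market quote.

forward_eps = 4.28       # Wall Street consensus EPS for fiscal 2026
trailing_pe = 44.3       # current trailing P/E
forward_pe = 32.1        # forward P/E based on the fiscal 2026 estimate
ten_year_avg_pe = 59.8   # 10-year average trailing P/E

# Discount of the current trailing P/E to its 10-year average.
discount = 1 - trailing_pe / ten_year_avg_pe
print(f"Discount to 10-year average P/E: {discount:.0%}")              # ~26%

# Implied share price today, from the forward multiple.
price_today = forward_pe * forward_eps                                  # ~$137

# Price needed to keep the current multiple once the fiscal 2026 EPS
# estimate rolls into the trailing numbers, and the price needed to get
# back to the 10-year average multiple.
upside_to_current_pe = trailing_pe * forward_eps / price_today - 1
upside_to_avg_pe = ten_year_avg_pe * forward_eps / price_today - 1
print(f"Upside to maintain current P/E: {upside_to_current_pe:.0%}")    # ~38%
print(f"Upside to reach 10-year average P/E: {upside_to_avg_pe:.0%}")   # ~86%
```

In short, the 38% and 86% figures are simply the ratio of the target multiple to today's forward multiple, so they stand or fall with that $4.28 EPS estimate.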
However, investors should stay focused on the longer term, because if Jensen Huang is right about where inference demand and data center spending are headed, Nvidia's stock could be trading significantly higher than where it is today.
Anthony Di Pizio has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Nvidia. The Motley Fool has a disclosure policy.