Will Data Center AI Chip Demand Keep Aiding Micron's Sales Growth?

By Anirudha Bhagat | November 11, 2025, 8:19 AM

Micron Technology, Inc. MU generated record revenues of $37.38 billion in fiscal 2025, mainly driven by strong momentum in its data center business as demand for artificial intelligence (AI) infrastructure continues to rise. In fiscal 2025, the company’s data center products generated $20.75 billion in revenues and accounted for 56% of total sales.

Micron’s data center end-market comprises two units: the Cloud Memory Business Unit (“CMBU”) and the Core Data Center Business Unit (“CDBU”). In fiscal 2025, CMBU revenues surged 257% year over year to $13.52 billion, while CDBU sales climbed 45% to $7.23 billion. Growth across both units was driven mainly by robust sales of high-bandwidth memory (HBM), high-capacity DRAM and solid-state drives used in AI workloads.

Micron’s latest generation of HBM3E and LPDDR5 server memory is gaining traction, with NVIDIA, one of its major customers, using these products for its H200 Tensor Core graphics processing units. The memory chip maker is also ramping up production of its 1-gamma DRAM and G9 NAND technologies, which enhance speed and efficiency while improving the cost structure. These products should help Micron strengthen its position in the AI-driven data center market.

Micron expects AI servers and traditional data centers to remain major growth drivers in fiscal 2026, supported by tight DRAM supply and expanding AI adoption. With its sustained focus on bringing advanced memory solutions to market, the company is well-positioned to capitalize on this trend.

The Zacks Consensus Estimate for fiscal 2026 revenues is pegged at $53.27 billion, indicating year-over-year growth of 42.5%.
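As a quick arithmetic check, the 42.5% growth figure follows directly from the fiscal 2025 revenue reported above and the fiscal 2026 consensus estimate. A minimal sketch in Python, using only the two figures quoted in this article:

```python
# Implied year-over-year revenue growth, using figures quoted in the article
# (both in billions of dollars).
fy2025_revenue = 37.38    # record fiscal 2025 revenues
fy2026_estimate = 53.27   # Zacks Consensus Estimate for fiscal 2026

implied_growth = (fy2026_estimate / fy2025_revenue - 1) * 100
print(f"Implied YoY revenue growth: {implied_growth:.1f}%")  # ~42.5%
```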

How Do Micron’s Rivals Stack Up in the Memory Chip Race?

Although Micron has no direct competitors listed on U.S. stock exchanges in the memory chip space, Intel Corporation INTC and Broadcom Inc. AVGO play key roles in the HBM supply chain and the broader AI hardware ecosystem.

Intel is expanding its AI chip portfolio by integrating HBM into its high-performance accelerators. Intel's flagship AI accelerator, the Gaudi 3, features 128GB of HBM2e memory to provide high memory bandwidth for large-scale AI training and inference workloads.

Broadcom is expanding its AI chip business by developing high-performance custom AI accelerators and advanced networking solutions that enable hyperscalers to utilize vast amounts of HBM effectively. The company is co-designing and producing custom AI chips for customers such as OpenAI, Google, Meta and ByteDance.

Micron’s Price Performance, Valuation and Estimates

Shares of Micron have surged around 201% year to date compared with the Zacks Computer – Integrated Systems industry’s gain of 83.9%.

Micron YTD Price Return Performance

Image Source: Zacks Investment Research

From a valuation standpoint, MU trades at a forward price-to-earnings ratio of 15.19, significantly lower than the industry’s average of 25.34.
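For context, a forward price-to-earnings ratio is simply the current share price divided by the consensus earnings-per-share estimate for the next 12 months. A minimal sketch of the calculation follows; the share price and EPS inputs are hypothetical placeholders, and only the 15.19 multiple comes from the article:

```python
# Forward P/E = current share price / estimated forward 12-month EPS.
# The inputs below are hypothetical placeholders; only the ~15.19 multiple
# is taken from the article.
def forward_pe(share_price: float, forward_eps: float) -> float:
    return share_price / forward_eps

print(round(forward_pe(share_price=230.00, forward_eps=15.14), 2))  # 15.19
```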

Micron Forward 12-Month P/E Ratio

Image Source: Zacks Investment Research

The Zacks Consensus Estimates for Micron Technology’s fiscal 2026 and 2027 earnings imply year-over-year increases of 95.7% and 14.5%, respectively. Bottom-line estimates for both fiscal years have been revised upward in the past 60 days.
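To see how those two estimates compound, apply them sequentially to a fiscal 2025 earnings base. A minimal sketch: the $1.00 base EPS is a placeholder, and only the 95.7% and 14.5% growth rates come from the article:

```python
# Compound the consensus growth rates off an arbitrary fiscal 2025 base.
# Only the 95.7% and 14.5% figures come from the article; the base is a placeholder.
base_eps_fy2025 = 1.00
eps_fy2026 = base_eps_fy2025 * (1 + 0.957)   # +95.7% year over year
eps_fy2027 = eps_fy2026 * (1 + 0.145)        # +14.5% year over year
cumulative = eps_fy2027 / base_eps_fy2025 - 1
print(f"Cumulative two-year EPS growth: {cumulative:.1%}")  # ~124.1%
```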


Micron Technology currently sports a Zacks Rank #1 (Strong Buy). You can see the complete list of today’s Zacks #1 Rank stocks here.


This article originally published on Zacks Investment Research (zacks.com).
