Micron Q3 FY2025 Earnings – The Memory Inflection Point
Key Takeaways
- Micron’s AI-Driven Growth Is Accelerating: HBM revenue grew over 50% quarter-over-quarter, reaching a $6B+ run rate, with expectations to approach $8B in the coming quarters. AI server demand is driving strength across HBM, LPDDR, and high-capacity DIMMs.
- Strong Financial Momentum Across the Board: Q3 results beat expectations on revenue ($9.3B), gross margin (39%), and EPS ($1.91). Q4 guidance also topped consensus, with revenue guided to $10.7B and EPS to $2.50, supported by improving pricing and tight DRAM inventory.
- Like-for-Like DRAM Pricing Is Improving: Despite a consumer mix headwind dragging down blended ASPs, non-HBM DRAM pricing improved sequentially, aided by low channel inventories and stronger end-market demand.
- HBM Market Share and Roadmap Outpacing Expectations: Micron is ramping HBM3E 12-Hi faster than expected and now sees its HBM market share aligning with its overall DRAM share by 2H 2025. The company is actively working with customers on HBM4 adoption in 2026.
- Valuation Reflects Much of the Upside: While the business fundamentals remain strong, especially into FY26, current valuation already prices in much of the near-term AI-driven growth. Future upside will likely depend on sustained pricing strength, DRAM content growth, and continued share gains in HBM.
Micron’s management made it clear on their Q3 earnings call that they see memory reaching an important inflection point, shaped by the growing demands of AI. The company described memory as a central part of AI infrastructure, with a direct impact on performance, power efficiency, and system scalability. Sanjay Mehrotra highlighted that high-performance memory, including HBM and LPDRAM, plays a foundational role in enabling AI workloads. As models become more complex and inference scales across data center and edge environments, the pressure on memory bandwidth, density, and power consumption continues to increase.
Management pointed to HBM as a key driver of this shift. Micron expects the HBM market to reach $35 billion in 2025, nearly double the previous year's level. Bit demand growth for HBM is projected to exceed overall DRAM growth, reflecting rising memory content per accelerator. The company emphasized the importance of upcoming architectural transitions, such as 12-high HBM3E, HBM4, and HBM4E. Each generation increases memory bandwidth and expands the trade ratio, the number of wafers consumed per HBM bit relative to standard DRAM. These transitions increase memory intensity in AI systems and require scaling across both manufacturing and advanced packaging.
Micron also described the role of LPDRAM and SSDs in supporting AI beyond the data center. The company remains the only volume supplier of LPDRAM in servers and is seeing strong DRAM content growth in smartphones. These trends are being supported by AI features that demand higher memory capacity in client devices. Micron views its SSD and NAND portfolio as well aligned with these developments, particularly in AI server storage where performance and power efficiency are critical.
To support these shifts, Micron has reorganized its business units around end markets. The company believes this structure will improve its ability to align with customer needs in data center, PC, mobile, automotive, and industrial markets. Management outlined a clear strategy to expand its HBM footprint, scale manufacturing capacity, and invest in leading-edge nodes like 1-gamma and G9. The commentary throughout the call reflected a long-term view of memory as a performance driver in AI systems. Micron plans to expand its position through continued execution on product development, manufacturing readiness, and customer alignment. The company is focused on delivering the memory infrastructure required for the next generation of AI platforms.
View From the Street
The Q&A session with analysts reflected a mix of enthusiasm and healthy caution—an acknowledgment that Micron is executing well in the near term while entering a much more strategically important position in the AI ecosystem. Several analysts leaned into the HBM story, probing both the sustainability of Micron’s share gains and the broader scaling of the HBM TAM relative to the accelerator market. There was visible positive sentiment around Micron’s ability to reach its DRAM-like share of the HBM market sooner than expected, with analysts recognizing that the company’s early execution on 12-high HBM3E and its HBM4 roadmap positions it well in a space that’s rapidly becoming mission-critical. The fact that Micron is now shipping to four customers across GPU and ASIC platforms resonated as a sign of real traction, not just narrative.
That said, analysts were also clearly trying to gauge how much of the current strength is structural versus cyclical, especially as it relates to gross margin trends and capacity planning. This distinction matters because a segment of the memory market is likely to become less cyclical as HBM and low-power DRAM are tightly integrated into new AI infrastructure. The pricing environment was a key focus: analysts were encouraged by the Q3 margin beat and the Q4 guide, but wanted clarity on whether 42% gross margins are sustainable or more a function of the current mix and tight supply. Management's answer, pointing to a favorable mix (more DRAM, more data center) and disciplined bit allocation, was well received, though investors will likely want to see a few more quarters of consistency before anchoring a new baseline.
There were also nuanced questions on HBM pricing and supply-demand dynamics heading into 2026. Analysts asked whether customer demand was starting to exceed Micron's planned supply and whether recent execution issues at competitors could create further upside. While management didn't quantify the upside, they emphasized strong engagement with customers and confidence in both the roadmap and the capacity buildout. The tone here suggested investors are bullish on Micron's HBM trajectory but remain cautious about how long that window of competitive advantage will last.
Key investor sentiments included:
- Positive:
- Clear recognition of HBM execution and early leadership in 12-high and HBM4.
- Enthusiasm around structural growth drivers in AI across data center and edge.
- Confidence in management’s operational discipline and strategic CapEx allocation.
- Cautious:
- Questions around how sticky gross margins will be, especially if NAND pricing softens.
- Concerns about potential tariff impacts and the modest level of pull-ins flagged by management.
- Curiosity about long-term share dynamics in HBM, especially if competitors regain footing.
Overall, the analyst community appears increasingly aligned with Micron's long-term AI thesis, but is watching closely to see whether near-term strength evolves into durable competitive positioning. The tone was constructive throughout, with investors acknowledging that Micron's role in the AI infrastructure stack is growing; what remains to be seen is how effectively that translates into sustained financial leverage across cycles.
Key Financial and Operational Highlights (Q3 FY25)
- Revenue: $9.30B (+15.5% Q/Q, +36.6% Y/Y) — above consensus (MS: $8.83B, JPM: $8.81B).
- Gross Margin: 39.0% — beats consensus (MS: 37.0%, JPM: 36.5%).
- EPS: $1.91 — beats consensus (MS: $1.60, JPM: $1.58).
- HBM revenue: ~$1.5B, up 50% Q/Q, reaching a $6B+ run-rate.
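The HBM figures above can be sanity-checked with a short sketch. The annualization convention (quarterly revenue × 4) and the back-calculation of the prior quarter from the ~50% Q/Q growth rate are assumptions used here for illustration, not figures stated on the call.

```python
# Sanity check of the HBM highlights: ~$1.5B quarterly revenue,
# ~50% Q/Q growth, and a $6B+ annualized run rate.
# Annualized run rate is assumed to be quarterly revenue x 4.

hbm_q3_revenue_b = 1.5   # ~$1.5B HBM revenue in Q3 FY25
qoq_growth = 0.50        # ~50% quarter-over-quarter growth

run_rate_b = hbm_q3_revenue_b * 4               # annualized run rate
prior_quarter_b = hbm_q3_revenue_b / (1 + qoq_growth)  # implied prior quarter

print(f"Annualized run rate: ${run_rate_b:.1f}B")        # $6.0B
print(f"Implied prior-quarter HBM revenue: ${prior_quarter_b:.1f}B")  # $1.0B
```

The implied prior-quarter figure of roughly $1.0B is consistent with the reported 50% sequential growth and the $6B+ run rate cited in the takeaways.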