The artificial intelligence (AI) revolution would not be possible without the semiconductor industry. Most AI development takes place inside data centers filled with graphics processing units (GPUs) from leading suppliers like Nvidia (NASDAQ: NVDA) and Advanced Micro Devices (NASDAQ: AMD).
However, AI workloads also demand more memory and storage capacity, not only in data centers but also in personal computers (PCs) and smartphones, and Micron Technology (NASDAQ: MU) is one of the leading suppliers of that hardware. The company just reported its financial results for the second quarter of fiscal 2025 (ended February 27), and they revealed a surge in revenue driven by AI-related demand.
That's why Micron stock could be an excellent buy for investors seeking exposure to the AI boom.


Image source: Getty Images.
AI requires an increasing amount of memory
Memory is a critical component in data-intensive AI training and AI inference workloads. It stores information in a ready state so it can be called upon instantly by GPUs, which accelerates processing times. Micron's HBM3E (high-bandwidth memory) for the data center is the best in the industry, offering 50% more capacity than competing solutions while consuming 30% less energy.
Nvidia uses Micron's HBM3E in its industry-leading Blackwell GB200 GPU, which is currently the gold standard for AI development. Nvidia will also use it in the upcoming Blackwell Ultra GB300 GPU, which will deliver even more processing power for AI applications. Given that Nvidia's four largest customers have already ordered 3.6 million Blackwell chips, it's no surprise that Micron is completely sold out of its HBM3E supply for the 2025 calendar year and is already seeing strong demand for its 2026 output.
Micron says the HBM market was worth $16 billion in 2024 and is expected to top $35 billion this year. It could then be worth $100 billion by 2030, so there will be enormous financial rewards for staying ahead of the competition. To that end, Micron plans to launch its new HBM4 solution for the data center in 2026, which will deliver a 60% increase in bandwidth over the previous generation.
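For context, here is a rough back-of-the-envelope sketch of the growth those market-size figures imply. The compound annual growth rates below are derived from the numbers above, not guidance from Micron:

```python
# Implied growth rates for the HBM market, derived from the figures cited above.
# These are rough estimates for context, not company projections.

hbm_2024 = 16e9    # roughly $16 billion in 2024
hbm_2025 = 35e9    # expected to exceed roughly $35 billion in 2025
hbm_2030 = 100e9   # potentially worth $100 billion by 2030

# Compound annual growth rate: (end / start) ** (1 / years) - 1
cagr_2024_2030 = (hbm_2030 / hbm_2024) ** (1 / 6) - 1   # about 36% per year
cagr_2025_2030 = (hbm_2030 / hbm_2025) ** (1 / 5) - 1   # about 23% per year

print(f"Implied CAGR, 2024 to 2030: {cagr_2024_2030:.0%}")
print(f"Implied CAGR, 2025 to 2030: {cagr_2025_2030:.0%}")
```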
But Micron's AI opportunity extends beyond the data center. As chips become more powerful, AI workloads will shift to PCs and smartphones, allowing chatbots and other applications to run offline. That will create a faster user experience and make those applications accessible from anywhere. The company says AI-enabled PCs already require a minimum DRAM (memory) capacity of 16 gigabytes, compared with an average of 12 gigabytes in non-AI PCs last year.
Likewise, most AI smartphones now require 12 gigabytes of capacity or more, up from 8 gigabytes last year. Micron's smartphone memory solutions are used in a number of Android-powered devices from top manufacturers, most notably Samsung.
More memory capacity translates into more revenue for Micron, so the company is positioned to win whether AI workloads are processed in data centers or on devices.
Micron's revenue is soaring, led by the data center
Micron generated $8 billion in total revenue during its fiscal 2025 second quarter, an increase of 38% from the year-ago period. However, there was a much bigger growth story beneath the surface of that headline number.
Micron's compute and networking segment, which accounts for its data center sales, delivered a record $4.6 billion in revenue, up 109%. Within that figure, revenue attributable to HBM came in at $1 billion, which was also a record.
By contrast, revenue from Micron's mobile segment declined 33% to $1 billion, because customers had accumulated inventory, which softened demand. However, the company expects modest growth in its mobile business this calendar year, especially as adoption of AI smartphones picks up speed.
Micron's strong top-line result drove a significant increase in profitability, with earnings per share (EPS) doubling to $1.41. That trend is expected to continue in the current fiscal 2025 third quarter: the company is guiding for $8.8 billion in revenue and $1.37 in EPS, which would represent year-over-year growth of 29% and 356%, respectively.
Micron stock looks like a bargain compared to its peers
Given that some Micron hardware, such as its HBM3E memory, is already sold out for this year, there is a degree of predictability to the company's financial results. The Wall Street consensus forecast (provided by Yahoo!) suggests its fiscal 2025 EPS will come in at $6.93, placing its stock at a forward price-to-earnings (P/E) ratio of just 13.6.
That is a 40% discount to AMD's forward P/E ratio of 22.7, and an even steeper 47% discount to Nvidia's forward P/E ratio of 25.9:
NVDA PE Ratio (Forward) data by YCharts.
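For readers who want to check the math, here is a minimal sketch of that valuation comparison using only the figures cited above. The roughly $94 share price is implied by the 13.6 forward P/E and the $6.93 EPS estimate rather than quoted directly:

```python
# Relative valuation check using the forward P/E figures cited above.

forward_pe_micron = 13.6
forward_pe_amd = 22.7
forward_pe_nvidia = 25.9
projected_eps_micron = 6.93   # Wall Street consensus EPS for fiscal 2025

# Forward P/E = share price / projected EPS, so the ratio implies a price of:
implied_price = forward_pe_micron * projected_eps_micron   # roughly $94 per share

discount_vs_amd = 1 - forward_pe_micron / forward_pe_amd         # about 40%
discount_vs_nvidia = 1 - forward_pe_micron / forward_pe_nvidia   # about 47%

print(f"Implied Micron share price: ${implied_price:.0f}")
print(f"Discount to AMD's forward P/E: {discount_vs_amd:.0%}")
print(f"Discount to Nvidia's forward P/E: {discount_vs_nvidia:.0%}")
```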
As I highlighted earlier, Nvidia uses Micron's HBM3E in its flagship GPUs like the GB200 and GB300. Since Nvidia already has orders for millions of those chips, Micron is likely to experience monumental sales growth in tandem over the next couple of years. As a result, it's hard to justify the steep discount in Micron's stock.
Plus, Micron stands to benefit significantly as AI workloads shift from data centers to PCs and smartphones, so the company is well positioned to capitalize on this technological revolution. That's why I think Micron stock could be a great addition to any balanced portfolio.