1 Super Semiconductor Stock (That Isn't Nvidia or AMD) to Buy Hand Over Fist

  • Nvidia and Advanced Micro Devices (AMD) supply some of the world's most powerful data center chips for artificial intelligence (AI).

  • Micron Technology supplies memory and storage chips that are also critical for AI workloads in data centers, computers, and smartphones.

  • Micron's shares currently trade at a very attractive valuation, offering investors a compelling buying opportunity.


When it comes to artificial intelligence (AI) chips, most investors think of the graphics processing units (GPUs) from Nvidia (NASDAQ: NVDA) and Advanced Micro Devices (NASDAQ: AMD). GPU sales have surged for both companies over the last couple of years, and in Nvidia's case, they have added trillions of dollars to its market capitalization.

But Micron Technology (NASDAQ: MU) also deserves recognition for its memory and storage chips, which are increasingly important for AI workloads in data centers, personal computers, and even smartphones. The company just reported its results for the third quarter of fiscal 2025 (which ended May 29), revealing sustained demand for AI-related memory capacity.

Micron stock is up 42% in 2025, but here's why it can still be a screaming buy.

Image Source: Getty Images.

Data center GPUs are designed for parallel processing, which means they can perform many calculations simultaneously and handle the enormous data sets needed to train AI models. However, these workloads also require high-bandwidth memory (HBM), which keeps data in a ready state so the GPU can access it instantly.

Micron's HBM3E data center solution offers industry-leading capacity and efficiency. In fact, Nvidia selected it for its latest Blackwell and Blackwell Ultra GPUs, and Advanced Micro Devices will also use it in its upcoming MI355X GPU. Micron is now preparing to ship commercial quantities of its new HBM4 data center solution next year, which will deliver a performance boost over HBM3E while consuming 20% less power, making it ideal for next-generation "reasoning" AI models.

Micron estimates its addressable HBM market will be worth $35 billion this calendar year, and it expects that figure to grow to $100 billion by 2030, giving the company an enormous opportunity.
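As a rough back-of-the-envelope check (assuming the growth runs from calendar 2025 through 2030, about five years), that forecast implies a compound annual growth rate of roughly (100 / 35)^(1/5) - 1, or about 23% per year; Micron has not spelled out the year-by-year path.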

Meanwhile, some AI workloads are now processed directly on personal computers (PCs) and smartphones, without relying on external data center computing. This trend will accelerate as chips become more powerful, and it is already boosting demand for DRAM. Micron says AI-enabled PCs typically require a minimum DRAM capacity of 16 gigabytes, compared with 12 gigabytes for their non-AI counterparts, and smartphones are showing a similar increase.
