U.S.'s largest memory chip maker Micron raises Q4 forecast to $11B on AI demand


by Hannah Collymore
for CryptoPolitan

Micron has raised its fiscal fourth-quarter forecast on a surge in demand for artificial intelligence hardware, which it expects to persist throughout the year.

With increasing AI adoption, Micron, a producer of hardware essential to AI development, is experiencing a surge in demand.

Micron raises Q4 predictions

Micron Technology Inc., the largest U.S.-based manufacturer of computer memory chips, has raised its fiscal fourth-quarter revenue forecast to as much as $11B. The change was driven by surging demand for artificial intelligence (AI) hardware.

The new figure significantly exceeded Wall Street’s initial expectations and sent Micron shares climbing in after-hours trading.

The Idaho-based company announced on Wednesday that it expects fourth-quarter revenue to range between $10.4B and $11B. These figures are well above the average analyst estimate of $9.89B, according to data compiled by Bloomberg.

Investors responded positively to the prediction, and Micron shares gained over 6% in extended trading. This year, the company’s stock has surged by 51% so far, outperforming many peers in the semiconductor sector.

Earnings per share, excluding certain items, are projected to come in at around $2.50, well above the $2.03 analysts had estimated.

This increase is due to Micron experiencing growing demand for its high-bandwidth memory (HBM) products, which are integral to training and operating large-scale AI models.

AI demand boosts Micron’s optimism

Micron’s latest earnings guidance shows just how important memory technology has become to AI infrastructure and innovation.

Companies that build AI systems, such as cloud providers and chipmakers, rely on increasingly advanced memory components to support the heavy processing requirements of modern AI workloads.

Micron’s high-bandwidth memory is used in data centers and servers that run machine learning and generative AI applications. These applications, such as those powering AI chatbots or image generators, require fast data access and vast storage capacity.

Micron previously faced tighter profit margins due to an oversupply in some memory segments and weakened demand for consumer electronics, but the shift toward AI has helped the company bounce back from recent struggles.

“The growing complexity of AI models requires more advanced memory solutions, and we’re seeing strong interest in our HBM products,” Micron said in its earnings statement Wednesday.

Wall Street analysts say Micron is strategically positioned within the AI industry to monetize the growing demand for specialized memory.

Although overall PC and smartphone markets remain relatively stagnant, Micron’s diversification into AI, cloud, and data center customers has helped offset these weaknesses. Analysts expect that trajectory to continue as AI development accelerates globally.

Micron’s leadership suggested that AI-driven demand will likely persist through 2025. However, the company remains cautious about broader macroeconomic risks, including fluctuations in consumer tech spending and potential supply chain disruptions.

