#hbm

from 24/7 Wall St.
9 hours ago

Up 145% in 2025, This AI Infrastructure Stock Is Still Deeply Discounted

When it comes to artificial intelligence, a few names dominate the conversation: Nvidia ( NASDAQ:NVDA ), Taiwan Semiconductor Manufacturing, or, in recent months, even Intel ( NASDAQ:INTC ). These companies rightfully claim the spotlight, driving the AI narrative with tangible results: record revenues, market share gains, and innovations that fuel everything from chatbots to autonomous systems. Investors flock to them, bidding up shares on every earnings beat or product launch. Yet beneath the hype, AI's foundation relies on more than processing power and fabrication prowess. Data storage and high-speed memory are the unsung necessities that enable seamless data flow, preventing bottlenecks in the AI pipeline.
from GSMArena.com
1 week ago

Samsung Q3 earnings guidance reveals very solid performance

Samsung expects its largest quarterly profit since 2022 as AI-driven memory chip demand boosts revenue to KRW 86 trillion and profit to KRW 12.1 trillion.
from The Register
1 month ago

PC DRAM costs to climb as fabs favor servers and HBM

PC memory prices are set to rise as the major suppliers allocate manufacturing capacity to more lucrative server DRAM and HBM amid reports of tightening supply. Prices are expected to climb in Q4 2025, according to market watcher TrendForce, which points the finger at the three top DRAM makers: Samsung, SK Hynix, and Micron Technology.
from The Register
1 month ago

Micron close to selling all the HBM it will make next year

Micron has six HBM customers and has nearly sold out its 2026 HBM3E supply; it expects rising memory margins and plans $18B in FY2026 capital expenditures.
from The Register
1 month ago

Nvidia's context-optimized Rubin CPX GPUs were inevitable

Nvidia on Tuesday unveiled the Rubin CPX, a GPU designed specifically to accelerate extremely long-context AI workflows like those seen in code assistants such as Microsoft's GitHub Copilot, while simultaneously cutting back on pricey and power-hungry high-bandwidth memory (HBM). The first indication that Nvidia might be moving in this direction came when CEO Jensen Huang unveiled Dynamo during his GTC keynote in spring. The framework brought mainstream attention to the idea of disaggregated inference.
from Medium
2 months ago

SK Hynix Forecasts 30% Annual Growth in AI Memory Market Through 2030

SK Hynix forecasts the high-bandwidth memory market to grow about 30% annually through 2030, driven by robust AI demand and custom HBM development.