    AI Gets Memory With Chips From Micron And Others

By AI Logic News | April 20, 2025 | 4 Mins Read

[Image: Integrated Circuit, Film-Layout of a Printed Circuit Board. Photo by Mediacolors/Construction Photography/Avalon/Getty Images]

News broke yesterday that Micron Technology is shaking things up with a new “cloud memory business unit” focused on HBM, or high bandwidth memory, chips.

HBM chips are 3-D stacked synchronous DRAM memory devices traditionally used in high-performance hardware setups. Over in the world of model design, LLMs are gaining more memory capacity and getting more utility out of the context data they keep in memory.

    So it makes sense that this hardware revolution would be occurring. The interesting thing is who the players are.

    The HBM Market

    Insiders note that Micron is a top global provider of HBM chips, along with Samsung and a company called SK Hynix.

    So who’s actually making these chips?

Take Samsung, for example. Industry news reveals that Samsung is working with its rival, the foundry giant TSMC, to develop HBM chips.

We’ve seen many times how TSMC holds a dominant position in the market as a foundry. Other companies rely on TSMC for raw fabrication capacity and build their own product plans on top of its production capability. That concentration has led to everything from a shortage of vehicle chips to, more recently, troublesome geopolitical problems around production and export controls. The world would arguably be in better shape if there were, say, a dozen major foundries around the globe.

Anyway, in creating these high-end chips, do Samsung and TSMC compete with Nvidia?

    Not exactly.

Other industry reporting shows that Nvidia was planning to buy the chips from Samsung, but Samsung’s parts couldn’t yet meet Nvidia’s qualification bar.

    A March 20 press release shows Nvidia CEO Jensen Huang saying Samsung “has an important part to play,” but noting that the company hasn’t formally ordered Samsung HBM3E chips.

    The HBM Chip: What’s Inside

First of all, HBM is a 3-D stacked DRAM design.

The memory unit sits close to a CPU or GPU, to reduce latency and provide high bandwidth at low power consumption.

    I asked ChatGPT more about the specs for these chips, and it came out with this:

Bandwidth: up to 819 GB per second, per stack

Pin speed: 6.4 Gb per second, per pin

Capacity: up to 64 GB per stack

Thermals: better power efficiency

Use cases: AI, HPC, GPUs (in this context, we’re talking mainly about AI applications)
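The bandwidth and pin-speed figures above are consistent with each other if you assume the standard 1024-bit HBM stack interface (the interface width is my assumption from the JEDEC HBM3 spec, not something stated in the article). A quick sanity check:

```python
# Sanity check: per-stack bandwidth = pins * per-pin rate / 8 (bits -> bytes).
# The 1024-bit interface width is an assumption (standard for HBM stacks).
pins = 1024               # interface width in bits per stack
pin_rate_gbps = 6.4       # Gb/s per pin (HBM3-class figure from the list above)

stack_bw_gbs = pins * pin_rate_gbps / 8  # GB/s per stack
print(f"{stack_bw_gbs:.1f} GB/s per stack")  # 819.2 GB/s per stack
```

That 819.2 GB/s result matches the ~819 GB/s per-stack bandwidth quoted above, which also suggests the per-pin figure should be read in gigabits, not gigabytes.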

ChatGPT also compared the HBM’s build to GDDR6, a cheaper and more widely available graphics memory standard used in gaming hardware.

    You can get more from public resources like this one on how the HBM has been engineered to fit very specific needs.

    The Market Fallout

    Let’s look briefly at this corner of the tech market, for enterprise context that CEOs (or anyone else) might want to know about.

First, we have Nvidia down around 40% from its all-time high within the past year, sliding back toward $100 per share in recent trading, ostensibly on U.S. export controls. The assertion from Huang and company that Nvidia is poised to lose $5.5 billion under the new rules has been big news lately. Then there’s Micron, at around $70 per share, roughly half its all-time high and down significantly since winter. Samsung looks to be down about 8% over a short time frame, and companies like AMD are down as well.

    “A warning from AI chips champion Nvidia that it will face a $5.5 billion hit from tightened U.S. controls on exports to China marks a new chapter in the escalating tit-for-tat between Washington and Beijing,” AJ Bell investment director Russ Mould said, as quoted by Elsa Ohlen writing for Barron’s.

That’s a quick look at some of the great new hardware developments happening now. The context, in terms of LLM news, is the advancement of models with persistent memory. I’ve talked about using an AI chat companion from Sesame, for example, and how “Maya” seemed to remember my name, as a return user, on a good day. Along with chain of thought, memory is a big capability builder for all of those vibrant use cases that we have come to expect from our neural net friends and neighbors.
