
Samsung’s arch-rival presents HBM3E memory chip that could power Nvidia’s Blackwell B100 AI GPU — with 16 layers, 48GB and 10.24Gbps transfer rate, this may well be the key to make ChatGPT 6 live

Samsung is set to showcase numerous new products at the forthcoming International Solid-State Circuits Conference (ISSCC), including a superfast DDR5 memory chip, 280-layer QLC NAND flash memory, and the world's fastest GDDR7. But while Samsung will certainly draw a lot of attention, it's not the only game in town: its South Korean rival SK hynix is also set to reveal its new HBM3E DRAM straight after Samsung finishes talking about its 3D NAND flash memory at the High-Density Memory and Interfaces session.

HBM3E (High Bandwidth Memory gen 3 Extended) is a groundbreaking memory technology that offers a significant leap in performance and power efficiency, designed to meet the escalating demands of high-performance computing, AI, and graphics applications. The fifth generation of HBM, it stacks multiple DRAM dies vertically, significantly increasing data processing speed, capacity, and heat dissipation.

According to SK hynix, its new memory chip can process data at up to 1.15 terabytes per second, equivalent to processing over 230 Full-HD movies of 5GB each in a single second. It also boasts a 10% improvement in heat dissipation, thanks to the implementation of the cutting-edge Advanced Mass Reflow Molded Underfill (MR-MUF) technology.

SK hynix sees its HBM3E as the driving force behind AI tech innovation, but it could also power Nvidia's most powerful GPU ever: the B100 Blackwell AI GPU. Micron has stated it won't release the next generation of its high-bandwidth memory, HBM4, until 2025.
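As a sanity check on those headline numbers, a quick back-of-the-envelope calculation confirms they hang together. The 1024-bit interface width below is an assumption based on the standard HBM stack design, not a figure from the article; the 1.15 TB/s figure SK hynix quotes is its effective rate, which sits slightly below the theoretical per-stack peak implied by the 10.24 Gbps pin speed.

```python
# Back-of-the-envelope check on SK hynix's HBM3E bandwidth claims.

# Claim 1: 1.15 TB/s equals "over 230 Full-HD movies of 5GB each in a second".
bandwidth_gb_per_s = 1.15 * 1000   # 1.15 TB/s in decimal GB/s
movie_size_gb = 5                  # one Full-HD movie, per the article

movies_per_second = bandwidth_gb_per_s / movie_size_gb
print(movies_per_second)           # 230.0 -- matches the claim

# Claim 2: the 10.24 Gbps per-pin rate from the headline, across an
# assumed standard 1024-bit HBM interface, gives the theoretical peak.
interface_width_bits = 1024        # assumption: standard HBM stack width
pin_rate_gbps = 10.24              # per-pin transfer rate from the headline

peak_gb_per_s = interface_width_bits * pin_rate_gbps / 8
print(peak_gb_per_s)               # 1310.72 GB/s theoretical peak per stack
```

The gap between the ~1.31 TB/s theoretical peak and the quoted 1.15 TB/s effective rate is typical of the difference between raw signaling speed and sustained throughput.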
This has led to speculation that Nvidia may seek an alternative supplier for the B100 Blackwell. Although Samsung, with its new 'Shinebolt' HBM3E memory, was considered the most likely contender, SK hynix is well positioned to step in with its new product. There's no official word on this yet, but it likely won't be long until we find out which of the Korean companies Nvidia chooses.



This post first appeared on VedVyas Articles.
