On Tuesday at ISC 2018, Samsung discussed its Aquabolt HBM2 technology and made a rather unusual claim about demand for its high-end memory standard. According to the company, even if it doubled its manufacturing capacity for HBM2 today, it still wouldn’t be able to meet existing demand for the standard.
This would seem to convey two things about HBM2. On the one hand, it implies robust demand for HBM2 from products all across the market. On the other, it implies that HBM2 remains so difficult to manufacture, or represents such a tiny fraction of Samsung's overall manufacturing capacity, that even doubling the amount of HBM2 the company builds wouldn't meaningfully close the gap between supply and demand. Neither of those statements says much good about the chances of seeing HBM2 on consumer graphics cards, and indeed, the focus for the memory technology really doesn't seem to be the consumer GPU market.
Samsung could manufacture 2x the HBM2 and it would still not be enough to satisfy market demand. No wonder it’s so expensive! #ISC18 pic.twitter.com/QoF4EtMasW
— Glenn K. Lockwood (@glennklockwood) June 25, 2018
Samsung is advertising Aquabolt as capable of delivering up to 307GB/s per stack in 8GB capacities, which would put a four-stack configuration like the one AMD used on the first Radeon Fury X at well over 1TB/s of memory bandwidth. To put that in additional perspective, a single Aquabolt HBM2 stack provides more memory bandwidth than a GTX 1070 or any AMD GPU in the RX 500 family. It's also far more bandwidth per stack than AMD specifies for its Vega 64, which offers 484GB/s across two stacks, or 242GB/s per stack.
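The bandwidth figures above fall out of simple arithmetic: each HBM2 stack exposes a 1024-bit interface, and Aquabolt runs it at a rated 2.4Gbps per pin. A quick back-of-the-envelope sketch (the pin speed and bus width are standard HBM2 figures, not taken from Samsung's announcement directly):

```python
# Back-of-the-envelope HBM2 bandwidth math.
BUS_WIDTH_BITS = 1024  # each HBM2 stack exposes a 1024-bit interface


def stack_bandwidth_gbs(pin_speed_gbps: float) -> float:
    """Bandwidth of one HBM2 stack in GB/s for a given per-pin data rate."""
    return BUS_WIDTH_BITS * pin_speed_gbps / 8  # 8 bits per byte


aquabolt = stack_bandwidth_gbs(2.4)       # Aquabolt's rated 2.4Gbps per pin
print(round(aquabolt, 1))                 # 307.2 GB/s per stack
print(round(aquabolt * 4, 1))             # 1228.8 GB/s for a four-stack card
print(484 / 2)                            # Vega 64: 242.0 GB/s per stack
```

Working backward, Vega 64's 242GB/s per stack implies its HBM2 runs at roughly 1.9Gbps per pin, which is where Aquabolt's roughly 25 percent speed advantage comes from.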
The impact of increasing HBM2 adoption in GPUs is one of the bigger puzzles in the larger GPU market. Years ago, it seemed as though HBM would begin a straightforward process of replacing GDDR in high-end GPUs before eventually trickling down into at least midrange cards. Back in 2014-2015, we expected HBM to debut at the high end with Fury X, HBM2 to push into the midrange with Vega, and HBM3 or its successor to replace GDDR on all but the lowest-end cards. That migration path would have roughly paralleled the earlier adoption of GDDR3 and GDDR5, with each memory type debuting at the high end of the market before rolling out across entire families.
This has not occurred. Instead, HBM2 remains confined to AMD's top-end cards and absent from Nvidia's consumer lineup. None of the rumors we've heard about Nvidia's next-generation GPUs suggest they'll adopt HBM2, either. One could argue that AMD's need for HBM2 was partially driven by higher power consumption in its Polaris and Vega GPUs than the company might have preferred, which opens the door to a return to more standard memory types, even at the highest end, for both companies. But as of right now, HBM2 seems like a genuine success story, albeit one with only limited potential in the consumer market.