Intel Working on New Discrete GPUs, Codenamed Arctic Sound

When Intel hired Raja Koduri last year, it was clear the company was serious about graphics. Koduri spent much of his professional career working on graphics at ATI and AMD; bringing him on board was a signal that Intel intended to invest in GPUs more aggressively than it had before.

Intel’s Arctic Sound reportedly began life as a video streaming application processor intended for data centers before being repurposed by Koduri as a full-on discrete GPU.

Bonus: Apparently @Rajaontheedge is redefining Arctic Sound (first Intel dGPU), was originally targeted for video streaming apps in data center, but now being split into two: the video streaming stuff and gaming. Apparently wants to “enter the market with a bang.”

— Ashraf Eassa (@TMFChipFool) April 6, 2018

We’ve heard rumors of at least two solutions — a discrete GPU and a new integrated part. The latter suggests that Intel’s partnership with AMD could be short-lived and that it may have served a near-term need for both companies: Intel wanted to demonstrate it’s serious about building graphics solutions; AMD wanted more GPU sales.

Why Intel Wants to Build a GPU

Intel has good reason to be interested in the GPU market. When Nvidia introduced CUDA just over a decade ago, it wasn’t clear whether programmable GPUs would make a meaningful dent in a very CPU-centric industry. Today, GPUs dominate many fields, from traditional applications like 3D and video rendering to machine learning, artificial intelligence, and self-driving cars. While some companies, like Google, build their own custom Tensor Processing Units, most firms can’t afford that kind of investment.

Intel has redefined its own focus to emphasize the cloud, exascale computing, and data centers. Having a GPU in-house strengthens all of those pillars, while simultaneously giving the company an opportunity to compete for a larger share of the profits from desktops and notebooks that ship with a discrete GPU. With CPU performance increasing only marginally year over year, investing in GPUs would also give Intel a more exciting story to tell.

Can Intel Build a GPU?

Yes. That’s going to be a somewhat controversial statement, given Intel’s less-than-illustrious history in this market. The original Intel i740 was supposed to showcase the performance of AGP, but the card offered abysmal performance and was withdrawn from the market in August 1999. Intel’s next discrete GPU, codenamed Larrabee, was an attempt to create a hybrid CPU-GPU that would’ve been programmed in x86 and would’ve used very little specialized graphics hardware. Larrabee never even made it to market; the chip was instead repurposed as Knights Ferry, a prototype MIC (Many Integrated Core) architecture.

This is what Larrabee was supposed to deliver. It didn’t quite play out that way.

Then, of course, there’s the legion of terrible Intel integrated GPUs. Intel’s first decent integrated GPU arrived with Sandy Bridge in 2011. From 2011 to 2015, each generation of Intel GPU improved markedly on its predecessor. Post-Skylake, however, Intel has marched in place. The company has never fielded a competitive desktop GPU, and its integrated solutions have never lit the world on fire, either. It’s easy to look at the company’s past and conclude Intel simply can’t pull off a GPU.

That, however, would be a mistake. Intel’s unwillingness to commit to winning a market has unquestionably harmed it before. But if Chipzilla is actually throwing its hat into the GPU ring, it’s got the cash reserves to hire engineers and the expertise to build its own parts. If Intel is serious about entering this market and is willing to play a long game, it could emerge as a potent competitor to Nvidia and AMD. Intel can afford to spin multiple parts for multiple market segments, it can pay to jump-start a GPU design from scratch, and it can tailor its own GPUs to match its manufacturing nodes.

There’s no guarantee of an Intel victory in this business. But it’s definitely got the chops to contend for the title.
