As AMD and Intel Struggle, a Competitive CPU Market Re-emerges

Last week, we covered two key developments regarding AMD and Intel. First, AMD chose to move its GPUs back to TSMC at 7nm, raising the question of what this might say about GlobalFoundries’ own 7nm ramp, particularly given that GF recently replaced its CEO for the fourth time in a decade and has had well-known, repeated problems ramping cutting-edge process nodes. Then Intel declared it would push back its own 10nm ramp from 2018 to 2019 — and not even the first half of 2019, just “2019.” Neither development is good news, though the fine folks at AMD would undoubtedly note that one of them is speculation and the other is now established fact. And of course, they’d be right about that.

But there’s a larger point here — one that’s easy to forget, given how long it’s been since AMD was slugging it out with Intel for the top of the CPU market. From mid-2006 to 2011 (the launch of Bulldozer), AMD fought a rear-guard action against Intel’s Core 2 Duo and Core 2 Quad, and it did so reasonably well. But you have to go back more than 12 years to find a period when AMD was solidly in the driver’s seat, or at least competing for it, and it’s therefore easy to forget how things used to look. Cast your eyes back further, and some of the problems we’re seeing in the CPU industry today will look a little more familiar.

The Way We Were

From 1996-2006, the CPU market could practically have doubled as a soap opera. CPUs like the Pentium MMX and Pentium II were supposed to crush every non-Intel x86 manufacturer on the planet. This largely worked — over several years, Intel ate into what little market share companies like Cyrix and IDT (maker of the WinChip) were able to build for themselves. AMD, however, was the bug that refused to be squished. In 1995, AMD bought the struggling CPU designer NexGen and brought that company’s design to market as the K6, later extended as the K6-2 and K6-3.

Prior to the launch of the K7 Athlon, these cores were little more than an annoyance to Intel — AMD’s K6 and K6-2 offered gaming and FPU performance that was just good enough to skate by as the poor man’s gaming option, particularly since they leaned on the older Socket 7 (later extended into Super Socket 7) infrastructure. Up until August 1999, the only real harm Intel had taken from its x86 upstarts was being forced to release a Celeron with an L2 cache (Mendocino) after the terrible, crippled cache-less variant it tried launching first (Covington). But the debut of the original K7 Athlon changed everything. Before K7, AMD was the “good enough” alternative you could reasonably game on; after it, AMD was a genuine contender for the outright performance crown.

A full rehash of the product launches and misadventures that hit both companies would fill several articles, but here’s a short list.

AMD

AMD found itself with a fabulous CPU and terrible chipsets. The chipset problem went on for years, with VIA cheerfully willing to rely on the infamous VT82C686B southbridge despite knowing it contained a bug that would permanently corrupt data on hard drives used in a software RAID array if a SoundBlaster was in use at the same time — and back then, almost everyone used a SoundBlaster. Nvidia’s entrance into the chipset market was hailed by everyone who had ever been bitten by VIA’s habit of selling a new chipset, then selling you the version you should have bought in the first place 6-12 months later, labeled with an “A” to distinguish it. Everyone was pretty fine with this at first (KT133, KT133A), but by the time we got the KT266A (aka “the one with DDR performance that was actually better than SDR”), folks were thoroughly sick of that particular habit.

Incidentally, VIA’s PC business mostly died when it decided it could take Intel on over whether it needed a bus license for the Pentium 4. VIA, which held roughly a third of the Pentium 3 chipset market, declared it didn’t need a license. Intel said it did. VIA said it didn’t — and the various motherboard OEMs dropped VIA chipsets like a bad habit when Intel told them to. VIA’s Intel business never recovered from the Pentium 4 debacle, and its AMD business died in the K8 era, when AMD users flocked to Nvidia chipsets instead.

AMD enjoyed a generally strong run from 1999-2001, but the 130nm shrink of the Pentium 4 transformed that CPU from turd to titan. Intel raced up the clock charts from early 2002 to mid-2003, leaping from 2GHz to 3.06GHz, increasing its FSB clock, and hammering AMD’s Thunderbird and later Palomino cores. AMD’s first stab at 130nm (Thoroughbred “A”) fell completely flat, and there seemed to be little hope of mounting a defense against the P4 — until AMD turned things around just two months later with a revised 130nm core, this time with an extra metal layer and a few hundred MHz of additional clock headroom. The Thoroughbred “B” core didn’t put AMD back on top, but it helped the company fight a rear-guard action against the P4 (which was, incidentally, eating its lunch) long enough to launch the Athlon 64 in 2003.

AMD K8 architecture

Even then, Intel was far from finished. It’s easy to remember Prescott as a terrible chip from Day 1, but on launch day there were plenty of people who thought Intel would turn the core around when it shifted from Socket 478 to LGA 775. It didn’t — and that represented a tremendous opportunity for AMD, which took advantage of these stumbles to win new markets in servers and to aggressively bring dual-core parts to market. AMD’s Athlon 64 and Athlon 64 X2 had their golden age from 2004 to mid-2006, but it took AMD years to lay the groundwork for those successes and to dodge challenges that ranged from the shakiness of its motherboard partners to its own fab and foundry difficulties. With the exception of the 180nm shift, which it made with Motorola’s assistance, AMD was often half a node to a full node behind Intel (a gap that lengthened as time went on), and for a variety of reasons it often seemed to get fewer benefits from each node shift as well.

Intel

Intel is genuinely difficult to summarize over this same period. From 1996-2006, it attempted to take over the RAM market with RDRAM (and failed), had to recall the 1.13GHz Pentium 3, had to recall the entire family of i820 motherboards equipped with the Memory Translator Hub, weathered a storm of criticism and withering reviews of its new NetBurst architecture, launched a 130nm variant of that architecture (Northwood) that laughed at all the haters as it skyrocketed from 2GHz to 3.06GHz in less than 18 months, then watched Northwood’s successor explode like the Siberian taiga post-Tunguska when it shrank down to 90nm.

Prescott ran so hot, it melted plastic stands beneath test motherboards and caused a Prescott-compatible small form factor system sent to my former employer to *ignite.* No exaggeration.

Seriously, though. By January 1, 2006, Intel looked pretty damn cooked. AMD’s Athlon 64 X2 family had knocked the Pentium 4 off enthusiast radars. The advent of dual cores had made AMD the superior solution for workstations, and Opteron would hit roughly 20 percent of the server market that same year. But even allowing for the impact Intel’s market manipulations had on AMD’s overall success, there were some substantial land mines in AMD’s own path: the company would pay twice what it should have for ATI, costing it billions of dollars. Its Phenom architecture would fail to match Intel’s Core 2 Duo, and while Phenom II was a pretty good CPU, it wasn’t good enough to catch Nehalem. Whatever else one might think of Intel, Santa Clara didn’t put a TLB bug in Phenom, delay Bulldozer, repeatedly push back the launch of AMD’s “Fusion” processors, or build a CPU that’s often been called “AMD’s Pentium 4.”

And what even AMD wasn’t paying enough attention to back in January 2006 was that Intel had spent several years quietly evolving its mobile Pentium M architecture, assembling the pieces of a puzzle that would lead to its domination of the CPU market for nearly a decade. I still have a Tualatin Pentium 3 — a 130nm die shrink of the 180nm P3 that was beaten down by the Athlon Thunderbird before being replaced by the generally inferior Pentium 4 Willamette. In 2001, the P3 Tualatin was the best overall CPU core Intel had — and one it didn’t want to sell, because NetBurst was supposed to be the future. In 2008, Nehalem — Tualatin’s descendant by way of the Pentium M and Core 2 — launched a nine-year period of Intel dominance.

The Moral of the Story

In the late 1990s, prior to the launch of the Athlon, few gave AMD much chance of survival. On January 1, 2006, there were many in the enthusiast community who thought Intel’s market share would continue to collapse. Today, it’s common to see people suggest that ARM or Samsung or TSMC (or some combination of the three) will be the end of Intel, while others fall back on the old AMD-versus-Intel argument. Old fandoms die hard, and with Intel and AMD’s mutual focus on the PC space, it’s easy to frame the current fight as just another iteration of the battles the two have waged before.

The advantage of stepping back and taking the longer view is that it sets these episodes in context. The computing world is not the same as it was in 2006 or 1996, but Intel’s 10nm delay will not, on balance, cripple the company — though it may well create conditions that favor Intel’s competitors more than Intel itself. And AMD, having weathered disasters much larger than a hypothetical 7nm delay, is unlikely to be crippled here either, even if I’m right that there are reasons to be concerned.

After more than a decade of stasis, the topsy-turvy, take-no-prisoners resumption of meaningful competition can look more climactic than it is. This type of struggle, where both companies push to bring new products to market, fight with uncooperative technologies, and fend off shots from one another, used to be a lot more common than it has been of late. It’s not a sign of the end for either company — it’s a bit of competitive limbering-up after a very long freeze.

Whether either company can continue delivering meaningful year-on-year improvements in the face of diminishing returns from node shrinks and the difficulty of improving single-threaded performance is a very good question — but again, it’s beyond the scope of this novella of an article. The point is that when two companies are actually pushing each other, the end result isn’t a perfect set of improvements delivered on a precise cadence year after year. Both times AMD managed to slug it out with Intel across the entire CPU stack, the result was a glorious train wreck of launches, moves, counter-moves, face-plants, abject failures, and occasional soaring successes. It was, in a word, interesting. And it looks like things might be getting interesting again over the next few years.
