Yesterday, former Intel engineer Francois Piednoël tweeted that the long era of the Intel “Extreme Edition” processor family was about to come to an end. Intel is now pushing back on that messaging, claiming that the branding is very much alive.
As a brand, the Extreme Edition dates back to September 2003 — and I’ve got a fun story to tell about it. Back then, Intel had ruled the entire PC performance stack, from consumer desktops to workstations, almost since the introduction of the Pentium 4 Northwood in January 2002. But while AMD had kept a tight lid on the Athlon 64 ahead of launch, Intel knew it had a problem on its hands. The Athlon 64’s integrated memory controller made it an absolutely killer gaming CPU (games, at the time, were entirely single-threaded and did not benefit from features like Hyper-Threading).
Intel had one card it could play. Its Gallatin Xeon chips were bog-standard Pentium 4s at the time, but they included an additional 2MB of L3 cache. These chips also ran at a lower bus clock than your standard consumer Pentium 4 — desktop chips at the time were using an 800MHz bus (200MHz “quad pumped”), while the Gallatin Xeons were still stuck on a 533MHz (133MHz) bus. Intel rushed these chips out to reviewers so quickly that the chip I got from the company was blank. No model number. No “Intel Confidential.” Nothing whatsoever.
But what it lacked in labeling it made up for in multipliers. At the time, Intel CPUs were multiplier-locked: the multiplier could not be raised above the maximum the factory had specified. The final clock speed of a CPU was determined by its CPU multiplier * FSB clock. Running on an 800MHz bus, a 3.2GHz P4 would have a 16x multiplier (16 * 200MHz). But the same Gallatin CPU, running on a 533MHz bus, hit the same clock with a 24x multiplier (24 * 133MHz). And remember — these CPUs were Gallatin Xeons that had been yanked off the line. They weren’t even badged.
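The arithmetic above can be sketched in a few lines; the figures are the ones from the article, and the helper name is just illustrative:

```python
def core_clock_mhz(multiplier: int, fsb_mhz: int) -> int:
    """Effective core clock: multiplier x real FSB clock.

    The marketed bus speed of the era was "quad pumped",
    i.e. 4x the real FSB clock used here.
    """
    return multiplier * fsb_mhz

# Retail Pentium 4 3.2GHz: 16x on a 200MHz FSB (marketed as 800MHz).
desktop = core_clock_mhz(16, 200)   # 3200 MHz

# Gallatin-derived sample: 24x on a 133MHz FSB (marketed as 533MHz).
gallatin = core_clock_mhz(24, 133)  # 3192 MHz, i.e. ~3.2GHz

print(desktop, gallatin)
```

The point of the comparison: because the Gallatin part reached roughly the same 3.2GHz on a slower bus, it shipped with a much higher multiplier — and an unlocked multiplier on a slow bus is exactly what an overclocker wants.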
As a result, the Pentium 4 3.2GHz EE that Intel sampled me (and, as far as I know, everybody else who got chips that looked like mine) had a functionally unlocked multiplier. It didn’t really give that first EE the juice it needed to take on the Athlon 64 — Northwood was already near the top of its frequency range, and I think that specific CPU topped out around 3.6GHz, up from its 3.2GHz base, without insane cooling — but it’s the only Intel chip I ever tested with a completely blank heatspreader, and the only Intel CPU from this era I ever personally played with that had a functionally unlocked multiplier. Intel had come down pretty hard on multiplier locking, both to prevent people from easily overclocking cheap CPUs into high-end competitors, and to crack down on unethical OEMs selling customers low-end parts overclocked to pass as chips they weren’t.
And while it was AMD, not Intel, that took the first leap with a thousand-dollar CPU launch, it’s an idea that has stood the test of time. Since 2003, both AMD and Intel have generally offered these high-end chips, though AMD was driven from that market for many years. Just as in 2003, we often see server silicon tapped for the high-end parts, bringing some of those systems’ features into marginally lower-cost markets (depending on the feature, obviously). I can’t say I’m opposed to the idea of a branding change — I was tired of “Extreme” branding 10+ years ago — but halo products like the EE are valuable enough to Intel that the company is unlikely to ever dump the idea. The language used to describe these chips may change, but the concept of the halo brand remains eternal. As an Intel spokesperson told Tech Report: “There is no change to the branding of the Intel Core Extreme Edition processor and Intel Core X-series processor family.”