Happy 40th Anniversary to the Original 8086 and the x86 Architecture

Forty years ago today, Intel launched the original 8086 microprocessor — the grandfather of every x86 CPU ever built, including the ones we use now. This, it must be noted, is more or less the opposite outcome of what everyone expected at the time, including Intel.

According to Stephen P. Morse, who led the 8086 development effort, the new CPU “was intended to be short-lived and not have any successors.” Intel’s original goal with the 8086 was to improve overall performance relative to previous products while retaining source compatibility with earlier products (meaning assembly language written for the 8008, 8080, or 8085 could be translated to run on the 8086). It offered faster overall performance than the 8080 or 8085 and could address up to 1MB of RAM (the 8085 topped out at 64KB). It contained eight 16-bit registers and was originally offered at a clock speed of 5MHz (later versions were clocked as high as 10MHz). The “x86” shorthand itself comes from the part numbering: the 8086 and its successors (the 80186, 80286, 80386, and so on) all share the same “86” suffix.
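To make those numbers concrete, here is a minimal sketch, written in C purely for illustration rather than in period assembly, of the segment:offset scheme the 8086 used: a 16-bit segment value is shifted left four bits and added to a 16-bit offset, yielding a 20-bit physical address and therefore a 1MB address space.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative sketch only: how the 8086 forms a 20-bit physical address
 * from two 16-bit values. The segment register is shifted left by 4 bits
 * (multiplied by 16) and added to the offset, which is how a CPU with
 * 16-bit registers reaches a 1MB (2^20 byte) address space. */
static uint32_t physical_address(uint16_t segment, uint16_t offset)
{
    return (((uint32_t)segment << 4) + offset) & 0xFFFFF; /* wraps at 1MB, as the 8086 did */
}

int main(void)
{
    /* F000:FFF0 -> 0xFFFF0, the address where the 8086 starts executing after reset */
    printf("F000:FFF0 -> %05X\n", (unsigned)physical_address(0xF000, 0xFFF0));
    /* FFFF:000F -> 0xFFFFF, the very top of the 1MB address space */
    printf("FFFF:000F -> %05X\n", (unsigned)physical_address(0xFFFF, 0x000F));
    return 0;
}
```

This is the segmented addressing Morse says, below, that he would have avoided in hindsight.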

8086 CPU die. Image by Wikipedia

Morse had experience in software as well as hardware and, as this historical retrospective makes clear, made decisions intended to make it easy to maintain backwards compatibility with earlier Intel products. He even notes that had he known he was inventing an architecture that would power computing for the next 40 years, he would’ve done some things differently, including using a symmetric register structure and avoiding segmented addressing. Initially, the 8086 was intended to be a stopgap product while Intel worked feverishly to finish its real next-generation microprocessor — the iAPX 432, Intel’s first 32-bit microprocessor. When sales of the 8086 began to slip in 1979, Intel made the decision to launch a massive marketing operation around the chip, dubbed Operation Crush. The goal? Drive adoption of the 8086 over and above competing products made by Motorola and Zilog (the latter founded by former Intel employees, including Federico Faggin, lead architect on the first microprocessor, Intel’s 4004). Operation Crush was quite successful and is credited with spurring IBM to adopt the 8088 (a cut-down 8086 with an 8-bit bus) for the first IBM PC.

One might expect, given the x86 architecture’s historic domination of the computing industry, that the chip that launched the revolution would have been a towering achievement or quantum leap above the competition. The truth is more prosaic. The 8086 was a solid CPU core built by intelligent architects backed up by a strong marketing campaign. The computer revolution it helped to launch, on the other hand, transformed the world.

All that said, there’s one other point I want to touch on.

It’s Been 40 Years. Why Are We Still Using x86 CPUs?

This is a simple question with a rather complex answer. First, in a number of very real senses, we aren’t really using x86 CPUs anymore. The original 8086 was a chip with 29,000 transistors. Modern chips have transistor counts in the billions. The modern CPU manufacturing process bears little resemblance to the nMOS manufacturing process used to implement the original design in 1978. The materials used to construct the CPU are themselves very different and the advent of EUV (Extreme Ultraviolet Lithography) will transform this process even more.

Modern x86 chips translate x86 instructions into internal micro-ops for more efficient execution. They implement features like out-of-order execution and speculative execution to improve performance, and they limit the impact of slow memory buses (relative to CPU clocks) with multiple layers of cache and capabilities like branch prediction. People often ask “Why are we still using x86 CPUs?” as if it were analogous to “Why are we still using the 8086?” The honest answer is: We aren’t. An 8086 from 1978 and a Core i7-8700K are both CPUs, just as a Model T and a 2018 Toyota are both cars — but they don’t share much beyond that most basic classification.
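To give one of those techniques some shape, here is a minimal, purely conceptual sketch in C of a two-bit saturating counter, the textbook building block commonly used to explain branch prediction; the predictors inside modern x86 cores are vastly more sophisticated, so treat this as an illustration of the idea rather than a description of any shipping design.

```c
#include <stdbool.h>
#include <stdio.h>

/* Conceptual sketch of a two-bit saturating counter, a classic textbook
 * model for branch prediction. Counter values 0-1 predict "not taken",
 * values 2-3 predict "taken". Starting from a strong state (0 or 3), the
 * prediction only flips after two consecutive contrary outcomes. */
typedef struct { unsigned counter; /* 0..3 */ } predictor_t;

static bool predict(const predictor_t *p) { return p->counter >= 2; }

static void update(predictor_t *p, bool taken)
{
    if (taken && p->counter < 3) p->counter++;
    else if (!taken && p->counter > 0) p->counter--;
}

int main(void)
{
    predictor_t p = { 0 };
    /* A mostly-taken loop branch: taken on each iteration, not taken once at loop exit. */
    bool outcomes[] = { true, true, true, false, true, true, true, false };
    int correct = 0, total = (int)(sizeof outcomes / sizeof outcomes[0]);

    for (int i = 0; i < total; i++) {
        if (predict(&p) == outcomes[i]) correct++;
        update(&p, outcomes[i]);
    }
    printf("Predicted %d of %d branch outcomes correctly.\n", correct, total);
    return 0;
}
```

The payoff of the two-bit scheme is that a loop branch taken on nearly every iteration is mispredicted only once at the loop exit rather than twice, and on a deeply pipelined, speculative core every avoided misprediction is work that doesn’t have to be thrown away.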

Furthermore, Intel has tried to replace or supplant the x86 architecture multiple times. The iAPX 432, Intel i960, Intel i860, and Intel Itanium were all intended to supplant x86. Far from refusing to consider alternatives, Intel literally spent billions of dollars over multiple decades to bring those alternative visions to life. The x86 architecture won these fights — but it didn’t win them simply because it offered backwards compatibility. We spoke to Intel Fellow Ronak Singhal for this article, and he pointed out a facet of the issue I honestly hadn’t considered before. In each case, x86 won out against the architectures Intel intended to replace it with because the engineers working on those x86 processors found ways to extend and improve the performance of Intel’s existing microarchitectures, often beyond what even Intel’s own engineers had thought possible years earlier.

Is there a penalty for continuing to support the original x86 ISA? There is — but today, it’s a tiny one. The original Pentium may have devoted up to 30 percent of its transistors to backwards compatibility, and the Pentium Pro’s bet on out-of-order execution and internal micro-ops chewed up a huge amount of die space and power, but those bets paid off. Today, the capabilities that consumed huge resources on older chips account for a single-digit percentage or less of a modern microprocessor’s power and die area budget. Comparisons across a variety of ISAs have shown that microarchitectural design decisions have a far larger impact on performance and power consumption than the ISA itself does, at least above the microcontroller level.

Will we still be using x86 chips 40 years from now? I have no idea. I doubt any of the Intel CPU designers who built the 8086 back in 1978 thought their core would go on to power most of the personal computing revolution of the 1980s and 1990s. But Intel’s recent moves into fields like AI, machine learning, and cloud data centers are proof that the x86 family of CPUs isn’t done evolving. No matter what happens in the future, 40 years of success are a tremendous legacy for one small chip — especially one which, as Stephen Morse says, “was intended to be short-lived and not have any successors.”

Feature image by Thomas Nguyen via Wikipedia
