Apple’s M1 unveiling this week wasn’t just a triumph for the company. It signaled a profound power shift within the personal computer industry — one with significant consequences for companies like Dell, HP, and Lenovo should the present trend continue unchallenged.
x86 CPUs have dominated PCs across laptops, servers, and desktops for at least the past 25 years. The business model is straightforward: Companies such as AMD, Micron, Nvidia, and Intel design (and build, in Intel’s case) components, which are then purchased by other businesses and used as building blocks to assemble complete PCs. With few exceptions, the companies that design the underlying architectures are different from those that build the components, which are often (but not always) different from the company that ships the final system to the consumer.
Once upon a time (read: the mid-1990s), companies like HP, Compaq, and IBM all had their own CPU divisions and shipped their own custom server and workstation hardware to compete with x86. When x86 ate their markets, it did so with an implicit promise: Stick with Intel (or occasionally, AMD) and these various OEMs would get more regular performance improvements with fewer problems and upsets than they could achieve by designing and fabbing their own products.
Evaluated over the scope of a quarter-century, this partnership has been quite successful. But it rests, implicitly, upon a series of assumptions. Among them: that Intel (or x86 more generally) would deliver the best performance/dollar ratio a customer could expect, and that the net overall rate of improvement — the “reward” for buying into x86, if you will — would be higher for these systems than for systems built by any other vendor.
The advantage of this system, from an OEM perspective, is that it guarantees a level playing field. While the licensing models are very different, you can draw an analogy between ARM providing a hard IP license for a standardized CPU like the Cortex-A73 to any customer who wants one and Intel selling the same Core i9-10900K to any OEM. HP doesn’t have to worry that Dell is developing its own custom microarchitecture to build a better CPU, and Dell doesn’t have to worry about a skunkworks project running at Lenovo. CPU introductions, chipset changes, and launches are all standardized and occur at regular cadences, which keeps the cost of new hardware development low and makes manufacturing volumes easier to estimate. We’re talking mostly about CPUs here, but AMD and Nvidia use the same basic model for GPUs.
So long as Apple used x86 hardware, it was tied to the same computing paradigm as everyone else. But Apple isn’t using x86 anymore.
Apple Upsets the Cart
Apple’s M1 isn’t going to revolutionize the market overnight, but if the chip delivers the improvements Apple promises and Intel can’t find a way to close the gap in the next 18-36 months, things in the x86 universe could get ugly. The higher Apple can scale its performance, the more aggressively it can challenge the x86 hierarchy with chips that draw less power while hitting higher absolute performance targets. In the short term, nothing is going to change. In the long term, manufacturers are going to start asking whether it’s worth paying such high premiums to Intel when ARM cores can obviously deliver better performance. Consumers will ask the same thing. It might take a generation or two for Apple to get things moving, but if the performance is there, software stacks and customer interest will follow. Even high pricing is only a limited deterrent — if customers weren’t willing to pay top-drawer prices for top-drawer performance, Intel wouldn’t have the margins it does.
If the last eight years have demonstrated anything, it’s that these trends move much more slowly than consumers and CEOs often expect. ARM’s server market share remains fractional, even though a decade ago pundits confidently predicted it would hit double digits by 2020 — but ARM chips are moving into servers. Even before Apple, ARM chips were beginning to move into laptops via Windows on ARM.
If Apple’s custom silicon proves a winning strategy, people in the PC world will notice, especially if customers start moving their purchases towards Apple hardware. This directly threatens Intel’s margins on its CPUs and could push the company to cut its prices, but it’s not going to be great for AMD’s bottom line, either. Samsung is moving out of the custom ARM core business, but it’s not crazy to imagine a new player emerging in the PC space, especially now that Nvidia owns ARM. I don’t think there’s any reason to expect a near-term Nvidia entry into this market, but Windows on ARM already exists. If the company wants to make a foray into designing its own custom CPUs with attached Nvidia graphics as part of a long-term push to license its designs to PC OEMs, it owns all of the pieces of the puzzle it needs to try.
If you think about it, the Intel/AMD duopoly works out great for the OEMs in certain respects. They can threaten Intel with shifting their business to AMD when they want to negotiate some aspect of a purchase agreement and vice-versa. Since Intel has a significant competitor across the market, there’s room to position one company as the top-end solution and one company as the price/performance winner. This framework also supports the high prices Intel charges for top-end chips because we all accept that buying at the far end of the curve naturally carries a price premium.
But what happens if Intel and/or AMD aren’t at the high end of the curve any more? All of a sudden, it starts to look as if the OEM is the idiot paying for branding as opposed to fielding the solution their customers actually want.
All of this will play out relatively slowly in-market. It takes years to design and ramp new chips. No matter how good the M1 is, Apple isn’t going to just take over the PC industry in 18-24 months, because people buy what they buy for a lot of reasons, including brand familiarity. There are a thousand reasons, ranging from game drivers to emulation, why there will be no fast x86-ARM transition in the mainstream PC market. Any such transition will be gradual — but if AMD and Intel can’t catch ARM, it will happen in the long run.
If Apple gains a sufficient advantage from designing its own custom ARM core, players in the PC space will start looking to take advantage of the trend. It might take 3-5 years before the mainstream market begins to shift, but at least some business will move over. PC OEMs typically operate on terrible margins, and ARM chips are cheaper than their x86 counterparts. If Dell can eke out an extra 0.25 percent of margin by shipping an ARM core instead of an x86 core, it’ll do so — and all the marketing funds in the world won’t win back consumers who have been lured away by promises of better battery life and higher performance.
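To put that margin math in perspective, here’s a toy calculation. Every number in it — the selling price, the bill-of-materials costs, the unit volume — is a hypothetical chosen for illustration, not an actual Dell, Intel, or ARM figure; the point is only that a quarter-point margin shift becomes real money at OEM volume.

```python
# Toy illustration of how a small gross-margin shift compounds at OEM volume.
# All figures below are hypothetical, not actual industry pricing.

def gross_margin(price, cost):
    """Gross margin as a fraction of the selling price."""
    return (price - cost) / price

price = 800.00       # hypothetical laptop selling price, USD
x86_bom = 760.00     # hypothetical bill of materials with an x86 CPU
arm_bom = 758.00     # hypothetical BOM with a slightly cheaper ARM SoC

x86_margin = gross_margin(price, x86_bom)   # 5.00%
arm_margin = gross_margin(price, arm_bom)   # 5.25%
delta = arm_margin - x86_margin             # 0.25 percentage points

units = 10_000_000   # hypothetical annual unit volume
extra_profit = delta * price * units
print(f"Margin gain: {delta:.2%} -> ${extra_profit:,.0f} extra gross profit")
# -> Margin gain: 0.25% -> $20,000,000 extra gross profit
```

On these made-up numbers, a $2 BOM saving per machine translates into $20 million of additional gross profit a year — small per unit, decisive in aggregate for a low-margin business.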
To be clear, I’m not convinced that AMD and Intel can’t beat Apple. AMD’s performance has improved rapidly every single year since Zen debuted. If Zen 4 boosts performance and efficiency as Zen 3 did, the company will have its own potent narrative in 2021. Intel has hit major manufacturing problems in the past few years, but if the company can recover its own mojo or borrow some from Samsung, it’ll likely remain the dominant market player.
Intel’s exposure and risk are larger than AMD’s because its market share and price/performance position make it a bigger, easier target, but both manufacturers are facing a serious threat. Tiger Lake and Zen 3 are both improvements on what came before, but Alder Lake and Zen 4 will apparently need to deliver gains at least as large to keep parity with what Apple has cooking. The competitive showdown may have begun with a quad-core chip, but it isn’t going to end there.
Why Apple’s M1 Chip Threatens Intel and AMD
Intel's own history suggests it and AMD should take Apple's new M1 SoC very seriously.