How Much Does It Matter That PCs Are Faster Than Consoles?


Earlier this week, I argued that the backward-compatibility features Microsoft is launching with the Xbox Series X will put consoles on an equal footing with PCs for the first time ever. My point was not to claim that PC and console gaming would now be literally identical, but that adding accessory and software backward compatibility closed the last major feature gap between the two platforms, and that the Xbox Series X represented a convergence between PCs and consoles.

Plenty of readers disagreed. This article is the first of several I intend to write addressing counter-arguments and views raised by readers. According to some of the more vehement responses, the entire point of PC gaming is to maintain higher frame rates than anything a console is capable of. In this view, speed (and to a lesser extent, higher detail levels) isn't just a nice perk; it's the defining characteristic that separates PC and console gaming. 60fps is one common target, but some respondents cited higher goals, including 144fps and even 240fps. Many of the same people who emphasized this argument also made comments indicating they believed I was a console gamer.
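
Those targets translate directly into per-frame time budgets, which is why each step up is so much harder to hit. As a quick back-of-the-envelope sketch (a few lines of plain Python added here for illustration; none of it comes from the original discussion):

    # Convert frame-rate targets into per-frame time budgets.
    # At 240fps, a game has barely 4ms to simulate and render each frame.
    for fps in (30, 60, 144, 240):
        budget_ms = 1000 / fps  # milliseconds available per frame
        print(f"{fps:>3} fps -> {budget_ms:5.2f} ms per frame")

Run it and the squeeze is obvious: 60fps leaves 16.67ms per frame, 144fps leaves 6.94ms, and 240fps leaves just 4.17ms.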

PCs Used to Be the Slowest Gaming Solution Around

When I was young, PC games were slow. While there were 8-bit machines from Atari and Commodore that offered richer graphics and audio options, the generic hardware in the IBM PC and its immediate successors wasn’t well-suited to fast-paced gaming. Graphics on the PC began as a crude series of static images — the first graphical adventure game, Mystery House, was published in 1980 by On-Line Systems, the forerunner of Sierra On-Line. Lists of the best games of the 1980s are dominated by RPGs, simulations, and adventure games.

Games built around typed input mostly avoided fast gameplay, lest they turn into inadvertent typing speed tests. Early Sierra games like King's Quest I – III, Space Quest I and II, Police Quest I, and the original Leisure Suit Larry didn't even pause the game when you started to type.

I went to my friends' houses to play games like The Legend of Zelda, Super Mario Bros., and Double Dragon because they were faster and smoother than anything I had access to at home. Platforming on the PC in the late 1980s looked like Captain Comic, shown below. I beat this game. It wasn't nearly as much fun as Super Mario Bros., and it didn't play as well. This blog post dives into some of the technical reasons why a $2,000 PC in 1991 couldn't match a $200 console, because at the time, it simply couldn't.

If you fell in love with PC gaming in this era, it wasn’t because PCs outperformed consoles. It took far longer to load data off a floppy disk or even hard drive than an NES cartridge, and the NES, Genesis, and Super Nintendo had better graphics than the PC at launch. You played Ultima IV: Quest of the Avatar instead of Final Fantasy because Ultima IV actually let you talk to each and every NPC, asking them detailed questions about their lives, as opposed to limiting the NPCs to single-response answers. Ultima IV had standard RPG elements like levels and spellcasting, but part of the feeling of progression came from learning which NPCs knew what information and puzzling out various webs of information. Squint a little and it’s an organic quest system without the in-game tracking.

Consoles had nothing to match this kind of capability. In the era before dialog menus, keyboard support made game engines feel as if they supported far richer levels of text interactivity than they actually did. It’s an advantage the PC lost after games moved away from emphasizing text input and towards the use of dialog menus.

In the late 1980s, if you wanted to make a game run meaningfully faster, you copied it to your hard drive. A faster CPU would play animations faster, but games of the day loaded data one screen at a time. Above a certain point, running a game off multiple 5.25-inch floppies and swapping between them hit performance much harder than a slower CPU. You also ran the risk of discovering one of your floppies had gone bad mid-game. If you wanted fast, animated, on-screen gaming, you bought a console. If you wanted deep, thoughtful, slower-paced titles, you played on PC.

Games like Wolfenstein 3D and Doom made tremendous waves in the PC market precisely because they were some of the first PC games to offer fast-paced, arcade-like play. But Wolf 3D didn’t show up until 1992, with Doom following in 1993. At the time, I could play Wolfenstein 3D reasonably well on my 386SX-16 with 8MB of RAM, but I had to shrink Doom to postage-stamp size to play it, and the only version of Doom 2 I had access to required a CD-ROM, which my machine didn’t have. I played older games like Civilization on my home PC and titles like Civilization II and Doom 2 on my best friend’s computer, since his parents had bought a 486-DX2. CD-ROM capabilities were huge in the mid-90s and game developers quickly found ways to offer additional features on the CD-ROM versions of their games.

Performance Evolves Slowly

The arrival of 3D cards in the late 1990s revolutionized gaming, but they didn't immediately unlock super-high frame rates for your average PC gamer. Higher spending has always bought higher frame rates, but the speed at which the market moved and the larger number of variables to track made getting decent performance harder and more expensive than it is today. In 1997, if you had money, you bought a Pentium or Pentium II. If you didn't have money, you bought a K6 or K6-2.

The arrival of Athlon in 1999 kicked off the first high-end war between AMD and Intel, but I’d bought my first system in 1997. As a college student, I was firmly in the “don’t have money” crowd, and these are the kind of frame rates you got from Quake III Arena in 2000 if you had, say, a K6-2 300. All of these graphs are from nearly the same period of time — two articles from Anandtech, written in June and November of 2000. All images below by Anandtech.

Image by Anandtech

Here’s the same test from a K6-3 450:

Image by Anandtech

Finally, the same test at the same resolution, running on an array of higher-end CPUs:

Graph by Anandtech

Only one of those graphs, the last one, looks like the kind of CPU results we see in games today. The K6-3 was so much faster than the K6-2 thanks to the combined impact of an on-die L2 cache and a higher clock speed. What's unusual is that the large performance gains aren't confined to the highest-end GPU; they show up even when tested with older cards like the TNT2 Ultra. Even a low-end card like the Voodoo 3 2000 showed much stronger CPU scaling than what we see today. And all of these chips were recent by modern standards: the K6-2 300 was less than 2.5 years old in November 2000, and the “old” Voodoo 3 2000 was barely 15 months old.

My initial forays into overclocking were driven by a desire to squeeze more performance out of budget hardware so I could game at something better than potato quality and terrible frame rates. That was hardly uncommon. In an era when CPU and GPU performance were both leaping ahead, relatively few people could afford to stay on the absolute cutting edge of technology. While Quake 3 showed what looks like modern CPU performance clustering, this wasn't universal. Compare it with Unreal Tournament, from the same review:

Image by Anandtech

Games like Unreal Tournament still showed a strong impact from the CPU and illustrated how quickly the market was moving. The Athlon 500 (KX133) that couldn't even hit 45fps? It was a few days short of its first birthday. If you had wanted to stay on top of that mythical 60fps target, you might have found yourself replacing $500 – $1,000 worth of components every single year. I'm sure some people did. Most didn't.

The Limits of FPS as a Metric

There are three specific reasons I've spent so much time on the history above. First, I wanted to illustrate how the evolution of gaming itself has changed which system components need to be upgraded to improve performance, and how much the position of PCs relative to consoles has shifted.

Second, I wanted to emphasize that defining PC gaming strictly in terms of high frame rates is ahistorical relative to how many of us experienced that history. When Quake came out in June 1996, the fastest mainstream Intel CPU you could buy was a Pentium 200; the Pentium 200 MMX didn't arrive until early 1997. The minimum CPU requirement for Quake 3 Arena in 1999 was a Pentium II 233. There was no upgrade path between those two chips, which used entirely different sockets. Most PC gamers didn't spend thousands of dollars every single year to stay current on hardware. You bought when you could, overclocked where you could, and played what you could with the result.

Third, I wanted to discuss how attempting to define gaming superiority in terms of frame rate punishes PC players who simply can't afford to spend as much money as others. By extension, it implies that the only valuable games are those that can push PC hardware. There's nothing wrong with having enough disposable income to choose a higher-performing platform, or with loving games that push the envelope, but you aren't less of a PC gamer just because you game on a low-end PC. One of the best aspects of the indie boom has been a proliferation of games that don't require much in the way of hardware. A low-end 2019 laptop may not play much in the way of modern titles, but it'll play the biggest hits of 1997 – 2007 just fine.

Like a lot of you, I enjoy playing PC games at higher frame rates than consoles typically offer. Like a lot of you, I consider this a major strength. I don’t, however, consider higher frame rates to be an intrinsic advantage of the platform. They’re only an advantage if you have access to them in the first place, and not all PC gamers do. There are a lot of flaws in the Steam Hardware Survey, but according to the best data we have, the GTX 1060, 1050 Ti, and 1050 are the top three GPUs on the market today, with 28.92 percent of the market between them. The people who buy those GPUs are still PC gamers, but they aren’t necessarily enjoying better frame rates than they’d see on an Xbox One X.
