4K vs. UHD: What’s the Difference?

Now that 4K displays are mainstream, let’s look at two terms that have become conflated: 4K and UHD, or Ultra HD. TV makers, broadcasters, and tech blogs use them interchangeably, but they didn’t start as the same thing, and technically they still aren’t. From a viewer’s standpoint there isn’t a huge difference, and the short answer is that 4K has stuck while UHD hasn’t, even though Blu-ray players and discs are marketed as 4K Ultra HD. But there’s a little more to the story.

4K vs. UHD

The simplest way of defining the difference between 4K and UHD is this: 4K is a professional production and cinema standard, while UHD is a consumer display and broadcast standard. To discover how they became so confused, let’s look at the history of the two terms.

The term “4K” originally derives from the Digital Cinema Initiatives (DCI), a consortium of motion picture studios that standardized a spec for the production and digital projection of 4K content. In this case, 4K is 4,096 by 2,160, exactly four times the pixel count of the previous standard for digital editing and projection (2K, or 2,048 by 1,080). The name refers to the fact that the horizontal pixel count (4,096) is roughly four thousand. The 4K standard is not just a resolution, either: It also defines how 4K content is encoded. A DCI 4K stream is compressed using JPEG 2000, can have a bitrate of up to 250Mbps, and uses 12-bit 4:4:4 color. (See: How digital technology is reinventing cinema.)
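If you want to check that “exactly four times” claim yourself, the arithmetic is simple; here’s a minimal Python sketch (the resolutions come from the DCI spec above, the variable names are just illustrative):

```python
# Pixel counts for DCI 2K and DCI 4K (resolutions from the DCI spec).
dci_2k_pixels = 2048 * 1080   # 2,211,840 pixels
dci_4k_pixels = 4096 * 2160   # 8,847,360 pixels

# Doubling both dimensions quadruples the pixel count.
print(dci_4k_pixels / dci_2k_pixels)  # 4.0
```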

Ultra High Definition, or UHD for short, is the next step up from what’s called full HD, the official name for the 1,920 by 1,080 display resolution. UHD quadruples that pixel count, to 3,840 by 2,160. It’s not the same as the 4K resolution described above, and yet almost every TV or monitor you see advertised as 4K is actually UHD. Sure, there are some panels out there that are 4,096 by 2,160, which works out to an aspect ratio of 1.9:1. But the vast majority are 3,840 by 2,160, for a 1.78:1 aspect ratio.
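Those aspect ratios fall straight out of the widths and heights; a quick sketch of the division (purely illustrative):

```python
# Aspect ratios of DCI 4K versus consumer UHD panels.
dci_4k_ratio = 4096 / 2160   # ~1.90, the cinema-style 1.9:1
uhd_ratio    = 3840 / 2160   # ~1.78, the familiar 16:9 (1.78:1)

print(round(dci_4k_ratio, 2), round(uhd_ratio, 2))  # 1.9 1.78
```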

Why Not 2160p?

Now, it’s not as if TV manufacturers are unaware of the difference between 4K and UHD, but presumably for marketing reasons they’re sticking with 4K. To avoid conflicting with the DCI’s actual 4K standard, some TV makers use the phrase “4K UHD,” while others simply use “4K.”

To make matters more confusing, UHD is actually split in two. There’s 3,840 by 2,160, and then there’s a big step up to 7,680 by 4,320, which is also called UHD. It’s reasonable to refer to these two UHD variants as 4K UHD and 8K UHD, though to be more precise the 8K spec should probably be renamed QUHD (Quad Ultra HD), since it packs four times the pixels of 4K UHD. (Read: 8K UHDTV: How do you send a 48Gbps TV signal over terrestrial airwaves?)
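That “quad” relationship is the same doubling-of-both-dimensions math as before; a short illustrative check:

```python
# Each UHD step doubles both dimensions, quadrupling the pixel count.
uhd_4k_pixels = 3840 * 2160   # 8,294,400 pixels
uhd_8k_pixels = 7680 * 4320   # 33,177,600 pixels

print(uhd_8k_pixels / uhd_4k_pixels)  # 4.0 -- one 8K frame holds four 4K UHD frames
```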

The real solution would have been to abandon the 4K moniker entirely and instead use the designation 2160p. Display and broadcast resolutions have always been named for their count of horizontal lines, with the letters “i” and “p” indicating interlaced scanning, which draws every other line on each pass, and progressive scan, which draws them all: 480i (NTSC), 576i (PAL), 480p/576p (DVD), 720p, 1080i, 1080p, and so on.
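For reference, each of those labels shorthands a full width-by-height resolution (a small illustrative lookup; the SD widths shown are the common digital values, since analog broadcast had no fixed horizontal pixel count):

```python
# Common line-count labels and the resolutions they usually stand for.
# "i" = interlaced, "p" = progressive; the number is the count of horizontal lines.
RESOLUTIONS = {
    "480i (NTSC)":   (720, 480),
    "576i (PAL)":    (720, 576),
    "720p":          (1280, 720),
    "1080i / 1080p": (1920, 1080),
    "2160p (UHD)":   (3840, 2160),
}

for label, (width, height) in RESOLUTIONS.items():
    print(f"{label:15} {width} x {height}")
```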

The reason this didn’t happen is that the number undersells the size of the jump. “2160p” implies a resolution merely double that of 1080p HD, but because both dimensions double, the actual increase in pixel count is a factor of four. The gap between 720p and 1080p is significantly smaller than the gap between 1080p and 4K, though how much you notice the upgrade will depend on the quality of your TV and where you sit. Further complicating matters, just because a display offers 2,160 lines of vertical resolution doesn’t mean it pairs them with a 3,840- or 4,096-pixel horizontal width. You’re only likely to see 2160p listed as a monitor resolution, if at all. Newegg lists three displays that explicitly support DCI 4K (4,096 by 2,160) as opposed to UHD, but they’ll cost you. Clearly, these sorts of displays are aimed at the professional set.
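The numbers bear that out: a quick comparison of total pixel counts up the ladder (illustrative only):

```python
# Relative pixel counts across the HD-to-UHD ladder.
p720  = 1280 * 720    #   921,600 pixels
p1080 = 1920 * 1080   # 2,073,600 pixels
p2160 = 3840 * 2160   # 8,294,400 pixels

print(p1080 / p720)   # 2.25 -- the 720p-to-1080p jump
print(p2160 / p1080)  # 4.0  -- the 1080p-to-UHD jump is considerably bigger
```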

Now that there are 4K TVs everywhere, it would take a concerted effort from at least one big TV manufacturer to right the ship and abandon the use of 4K in favor of UHD. In all honesty, though, it’s too late. The branding ship has sailed.

Sebastian Anthony wrote the original version of this article. It has since been updated several times with new information.
