Why 4K TVs are stupid (still)

Editors’ Note: An updated article entitled Why Ultra HD 4K TVs are still stupid was published on January 28, 2013.

A few months ago, hot on the heels of the multitude of 4K TV announcements at CES, I wrote an article called “Why 4K TVs are stupid.”

I was shocked, shocked to find so many angry, contrary opinions on the subject. I mean, this is the Internet. Surely everyone is cordial and like-minded.

The comment section was the usual bog of ad hominem, straw man, and plain nonsense arguments. But buried deep within the chaff were a few good questions worthy of rebuttal. So if you’ll indulge…

There were an amazing number of comments from people who clearly didn’t read the article at all. Let me make this pithy and clear: The 4K resolution is awesome, but 4K televisions are stupid. Your eye has a finite resolution, and at the distance that most people sit from their TVs, it’s unlikely you’d be able to tell the difference between 720p and 1080p, let alone 4K (roughly 4,096×2,304 pixels).

Countless comments were some variation of “well, I sit closer” or “I have a huge projection screen.” Yes, if you sit closer than the average (9 feet) or have a huge screen (as I do), then 4K may be beneficial. I explicitly say this in the original article. I also mention 4K would be great for passive 3D.
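If you want to check the math yourself, here’s a rough back-of-the-envelope sketch. The 50-inch screen size, the 16:9 shape, and the simple “one pixel per arc minute” criterion are my simplifying assumptions, so treat the results as ballpark figures rather than vision science:

```python
import math

# Back-of-the-envelope check: how close do you have to sit before individual
# pixels are resolvable? Assumes a 16:9 panel and treats a pixel as visible
# only when it subtends at least the viewer's acuity angle (1 arc minute
# for 20/20 vision).

def pixel_pitch_inches(diagonal_in, horizontal_pixels):
    """Width of one pixel on a 16:9 screen of the given diagonal size."""
    screen_width_in = diagonal_in * 16 / math.hypot(16, 9)
    return screen_width_in / horizontal_pixels

def max_resolvable_distance_ft(diagonal_in, horizontal_pixels, acuity_arcmin=1.0):
    """Farthest viewing distance (feet) at which pixels can still be resolved."""
    pitch = pixel_pitch_inches(diagonal_in, horizontal_pixels)
    acuity_rad = math.radians(acuity_arcmin / 60.0)
    return pitch / math.tan(acuity_rad) / 12.0

# A 50-inch screen, roughly the size most people actually buy.
for label, width_px in [("720p", 1280), ("1080p", 1920), ("4K (UHD)", 3840)]:
    d = max_resolvable_distance_ft(50, width_px)
    print(f"{label}: pixels only resolvable closer than ~{d:.1f} feet")
```

Run that and you get roughly 10 feet for 720p, 6.5 feet for 1080p, and about 3 feet for 4K on a 50-inch screen. At the 9-foot average, even 1080p pixels are already too small to pick out, let alone 4K’s; you can also feed in a smaller acuity value if your eyesight is better than 20/20.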

So I’m going to skip those comments and move on to some of the thought-provokers. Please note, I’ve trimmed some of these down to be clearer, and edited them so they look less “Internet comment section-y.” I don’t believe I changed any of their meanings, though I’m a little foggy on what was meant by “you’re a moron.”

“Something to think about. It’s not ‘seeing’ the pixels, it’s how smooth the edges of objects appear.” Posted by “lostviking”

If you’re seeing jagged edges on objects, you are seeing pixels. More likely, though, you’re seeing deinterlacing artifacts. For more info on them, check out 1080i and 1080p are the same resolution.

“There is no technical reason why you couldn’t sit 5 feet from the television.” Posted by “TwentyFifteenAcuity”

Technically true. Even 1080p is high enough resolution that you could probably sit a lot closer to your current TV. The image would seem much bigger, and there are other potential benefits. I’m going to guess, though, that most people don’t want to sit closer to their TVs. So it’s not a technological limitation (which arguably it was with 480i standard definition), but personal preference.

This same guy actually had another interesting comment:

“Your assumption of normal vision being ‘1 arc minute acuity’ is based on 20/20 being perfect.”

I wouldn’t say “perfect” but I would say “average.” If someone wants to make different mathematical assumptions based on a less established standard, that’s fine. There were several comments that did this, saying “well I have such-n-such vision, therefore…”

Look, if you have 20/1 vision, I’m envious, but most people don’t. If I wrote an article based on what people with abnormally good vision could see, the rest of us on the hump of the bell curve would find it useless. Or, to put it a different way, it would be just as illogical for me to write an article based on what people with worse eyesight could see.

If you have better eyesight than 20/20, then yes, you might be able to benefit from higher resolutions in screen sizes smaller than what someone with 20/20 needs.

“Personally, I’m gonna wait for the 4D TVs to come out. They’ll come in a tesseract shape.” Posted by “Krantzstone”

I love this comment.

There were a few like this:

“Have you ever seen the difference between the ABC/Disney family of networks that only upgraded to 720p versus the other major networks? I’ll agree most sitcoms look the same on both, but check out sporting events. Night and day. I live in CHI; sometimes Bulls games are broadcast on WGN (Full HD) and ESPN (720), flipping between the two is very clear.” Posted by “pckline”

First of all, ABC (and therefore ESPN) and FOX stated that progressive 720p was chosen specifically because it was better with sports than interlaced 1080i. That’s not, however, what’s wrong with this comment and the others like it. “Pckline” is describing source resolution, not display resolution. These are very different, but easily confused. All TVs (or more likely, cable/sat boxes) upconvert 720p signals. Check out What is upconverting and When HD isn’t HD for more info.
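To make the source-versus-display distinction concrete, here’s a trivial sketch with illustrative numbers (real TVs and cable/sat boxes use far more sophisticated scaling than this, but the principle is the same):

```python
# Toy illustration: upconverting a 720p signal to fill a 1080p panel doesn't
# create new detail; the same source pixels just get spread over more of them.

SOURCE = (1280, 720)    # what the broadcaster sends (720p)
PANEL = (1920, 1080)    # what the TV physically has (1080p)

scale_x = PANEL[0] / SOURCE[0]
scale_y = PANEL[1] / SOURCE[1]

print(f"Each source pixel gets stretched across {scale_x:.1f} x {scale_y:.1f} panel pixels.")
print(f"Actual detail in the signal: {SOURCE[0] * SOURCE[1]:,} pixels,")
print(f"no matter how many pixels the panel has ({PANEL[0] * PANEL[1]:,}).")
```

The panel’s pixel count tells you nothing about how much detail was in the signal to begin with.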

He also goes on to say, “I am surprised at how many CNET regulars and writers don’t view OLED or 4K2K as an upgrade to look forward to.” Which is just not true.

“I see many parallels between this article and the comment ‘no one will ever need more than 640K of memory.’ One lesson history has taught us is that if you invent it, someone will find a use for it. You hint at that w/ the future of 3D.” Posted by “swimpunk”

This is what’s called a false equivalency. I remember buying my first PC, when the salesman told my father and me we’d “never need more than a 340 MB hard drive.” I knew that was BS when I heard it. It was a salesman trying to sell what he had in the store, not an expert making a prediction. (Also, it’s worth noting the “640K” quote didn’t come from Bill Gates, as some have attributed. That’s an urban legend.)

The false equivalency here is that there’s always going to be a use for more hard-drive space (and processing power, etc.), but current TVs already exceed the finite resolution of the human eye. We’re talking about a physiological limitation, not a technological one. TVs, at the sizes people buy and the distances people sit, are better than what the average person can see. Higher resolutions aren’t beneficial because we can’t actually use them. It would be like putting a mechanical spoiler on your car that only activates above 400 mph. Not only would you never use it, it’s not possible for you to use it.

But wait! You say, what about larger screen sizes? Funny you should ask:

“The trend is going towards bigger TVs and bigger TVs need better resolution.” Posted by “Carlnolip”

In the abstract, Carl is right. TVs have and will continue to get bigger. However, the reality is much different. The average screen size (based on units sold) is still well under 50 inches. If you want a massive TV, well, you’re off the bell curve just as much as the ultravision guys. The fact is, the majority of the buying public doesn’t want massive TVs (bizarrely, I agree), and in the sizes they’re buying now and in the future, 4K doesn’t make any sense.

I think that even if 80-inch TVs were $1,000, they still wouldn’t sell in the numbers of 42- or 50-inch TVs. Before you disagree with me, ask the spouses of a few people you know. And this is from someone who’s had a 100-inch TV for 10 years.

“It doesn’t matter if we need that kind of TV or not…when they come, actual 1080p TVs will be cheaper (and so more people will be able to afford them).” Posted by “Ganxx”

A fair point.

“I think the problem is that we’re not even using 1080p to its full potential right now.” Posted by “NocturnalCT”

Fantastic comment. The vast majority of available content isn’t using the full potential of what we already have. Not all HD is the same, and streaming, cable, and satellite broadcasts are horribly inferior to Blu-ray and over-the-air. Once again, check out When HD isn’t HD.

“Now in five years or so when you guys start doing reviews with these TVs and compare them to a 1080p model just remember what you said. I have a feeling the review will say something like: Great TV, but no 4K. Tell me I’m wrong and I will call you a liar.” Posted by “bweber85”

And Betty, when you call me, you can call me Al.

“This dude is just NUTS!” Posted by “Gradius3”

Dooooon’t rush to judgment until all the facts are in.

“I hope the writer of this article didn’t get paid for his opinions about 4k. How can 4k be stupid? All this talk about sitting closer to the TV and not seeing the benefits unless it’s a certain size, etc., is pure crap. How can your eyes & mind not be blown away from watching an OLED set with 4 times the resolution of a Kuro?” Posted by “SlimTV”

Depends on your definition of “paid.” Your eyes and mind won’t be blown away by something you can’t physically see. As for OLED, this.

“This post is at best, unbearable to read. At worst, it is an atrocious abomination of journalism, punishable by taking your smartphone away from you for a few days. Please go back to grade school and take refresher courses in spelling, punctuation, and proofreading. Then go back to college and retake English Composition 1 to beef up on your content, structure, and delivery.” Posted by “EducatedConsumer10000”

I guess you won’t be buying my book.

“Just because you can’t see individual pixels doesn’t mean you won’t be able to tell difference in picture quality. CD’s don’t play anything above 20 kilohertz, but I can definitely tell a difference between a CD’s and DVD’s audio.” Posted by “bigdog9271”

This is an interesting analogy, one that on the surface seems to prove a point. However, the reality is somewhat different. A lot of what you’re hearing in DVD-Audio (other than the greater bit depth, which is arguably more important) is fewer artifacts in the audible frequencies caused by the filters used to “cap off” CDs below 22.05 kHz. I talk about this and how Dolby tries to fix it with Blu-ray here.
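For reference, that filter cutoff falls straight out of the sampling rate. These are the standard format numbers, not anything from the comment:

```python
# The Nyquist limit: a digital audio format can't encode frequencies above
# half its sampling rate, so everything higher has to be filtered out.
# On CD, those filters sit right above the audible range, which is where
# the artifacts mentioned above come from.

formats_hz = {
    "CD": 44_100,
    "DVD-Audio": 96_000,   # stereo DVD-Audio can also go to 192 kHz
}

for name, sample_rate in formats_hz.items():
    nyquist_khz = sample_rate / 2 / 1000
    print(f"{name}: {sample_rate / 1000:g} kHz sampling -> filter cutoff at {nyquist_khz:g} kHz")
```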

This ties in well with the next comment:

“Geoff. As an expert you should know that contrast and color fidelity are also directly related to the number of pixels. If you have an area of 400 pixels to display 2,000 grades of color you will be able to display only 400 out of those 2,000. Now if you have an area of 1,600 pixels, you will be able to display 4 times more grades of color.” Posted by “Muzztard”

This is an interesting comment. Using the current TV system (which I’ll discuss in a moment), each pixel could be used to potentially show a different shade of color. More pixels, more visible shades per gradation. However, the eye’s color resolution is significantly worse than its black-and-white vision. I’d argue it would be unlikely most people would see a difference for all the same reasons mentioned above. However, unlike the strict stance I took against the increase in resolution, I see some validity in “Muzztard’s” argument. When 4K TVs do inevitably ship, I look forward to testing this aspect. Seeing as most people don’t notice, care, or adjust their TVs away from inaccurate color, I don’t think this is going to be a major selling point. Not sure how it could increase contrast, though. That part I don’t get.

As for the many other comments about how 4K will increase color accuracy, or any of the other regurgitated Apple marketing hype from its Retina displays: these are flat-out wrong. The pixels themselves don’t have anything to do with color accuracy. Our current TV system is 8-bit, which means 256 steps (0-255, though generally only 16-235 are used, so 220 steps, but let’s not nitpick). This is for each of the three colors, so there are a possible 16,777,216 colors (256×256×256). In reality, there are more variables than this simple math, but I’m getting off track. If you increase the bit depth of the video system to, say, 10-bit, now you have over a billion colors (1,024×1,024×1,024). More bits mean more gradations, more subtle shades of color, and a smoother picture. While we’re at it, how about expanding the color palette so there are even deeper, more realistic colors to choose from?
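If you’d like the arithmetic spelled out, here it is. This is just the simple math above, nothing more:

```python
# The bit-depth arithmetic: more bits per channel means more gradations per
# color, regardless of how many pixels the panel has.

for bits in (8, 10):
    steps_per_channel = 2 ** bits
    total_colors = steps_per_channel ** 3   # three channels
    print(f"{bits}-bit video: {steps_per_channel} steps per channel, "
          f"{total_colors:,} possible colors")

# Consumer 8-bit video typically uses only 16-235, not the full 0-255 range.
print(f"Usable 8-bit video range (16-235): {235 - 16 + 1} steps per channel")
```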

Except, we’re not talking about any of that. We’re talking about the resolution. If you want to talk about increasing bit depth or expanding the color palette, I’m game for the conversation. Just increasing the number of pixels won’t do any of these.

And if you really want to pick nits, both are already possible with xvYCC and Deep Color, neither of which is currently implemented (nor are there any serious plans to do so).

Bottom line
To sum up: 4K is awesome, but you don’t need it in a TV. To say that because I don’t think 4K is necessary I’m somehow anti-technology shows an amusing lack of understanding of how Google works.

Lastly, I’d like to call out all the commenters who posted rational questions, reasonable comments, and polite interactions.

Both of you.


Got a question for Geoff? First, check out all the other articles he’s written on topics like HDMI cables, LED LCD vs. plasma, Active vs Passive 3D, and more. Still have a question? Send him an e-mail! He won’t tell you what TV to buy, but he might use your letter in a future article. You can also send him a message on Twitter: @TechWriterGeoff.
