CNET Reader R. Savoy asks:
I recently purchased a 60-inch plasma and used your recommended picture settings. Question: I have my picture set to Screen Fit instead of 16:9, and on some stations, when they go from hi-def to 4:3, I get a white flickering line right above the top of the picture. This problem doesn't present itself on the 16:9 setting. Please advise. Thank you.
Good question. Annoying answer, sadly.
It’s not you, and there’s nothing wrong with your TV. It’s the station.
There are two types of programming broadcast by TV stations: HD and upconverted SD. Your TV interprets both as "HD," but in fact they're radically different. True HD, like almost all prime-time programming, is the clear, detailed, amazing image you bought your TV for.
Upconverted SD is older, standard-definition content "blown up" to fit your screen. It's blurry, looks like crap, and makes you wonder how we lived with such terrible picture quality for so long (hint: it's not as noticeable on smaller screens).
Two articles that further explain this are "What is upconverting?" and "When HD isn't HD."
In addition to looking downright terrible, upconverted SD has another problem. 1080p plasmas and LCDs have 1,920 pixels across and 1,080 pixels vertically. The 1080i signals of your local CBS and NBC stations also carry 1,920 by 1,080 pixels. With native HD content and a "Screen Fit," "Native," "Just," or similar setting on your TV, you get a 1:1 mapping of the original signal. This is by far the best way to watch such content.
Non-native HD content, like standard-definition content upconverted by a station, doesn’t have this pixel-perfect accuracy to the original source material. Standard-definition content originally had some “wiggle room” as to where the edges of the picture were. Wikipedia actually has a pretty good article about this.
In the old days, analog TVs had overscan that cut off the edges of the broadcast signal. So the noise and other oddities at the extreme edges of the picture weren’t visible on screen. With modern HDTVs, this overscan is a software option that merely zooms in on the image slightly. This can create processing artifacts with “real” 1080i/1080p content.
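Conceptually, that software overscan is just a crop followed by a rescale, which is exactly why it can introduce processing artifacts on real 1080i/1080p content. Here's a minimal sketch of the idea; the function name and the 2.5 percent figure are illustrative assumptions, not any TV's actual firmware:

```python
def apply_overscan(width, height, percent=2.5):
    """Return the size of the region a software overscan keeps.

    The TV crops this many percent off each edge, then scales the
    remaining picture back up to fill the full panel, and that
    rescale is where softness and artifacts creep in.
    """
    crop_x = int(width * percent / 100)
    crop_y = int(height * percent / 100)
    return width - 2 * crop_x, height - 2 * crop_y

# A 1080p signal with 2.5% overscan on each edge:
print(apply_overscan(1920, 1080))  # (1824, 1026)
```

Those 1,824 by 1,026 surviving pixels then get interpolated back up to 1,920 by 1,080, so no pixel ends up exactly where the source put it.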
The only problem with these pixel-perfect modes (which we still highly recommend) arises when a station blindly upconverts the entire analog signal. All the extraneous "stuff" at the edges of the image gets upconverted along with the actual content. That's the white line you're seeing on your screen.
My advice is to ignore it, or become friendly with the format setting that is (ideally) on your remote. If you're watching an old episode of "Airwolf" and there's a line that's bothering you, there should be an overscan option that cuts out all or most of it. Just make sure you turn overscan back off when you return to real HD content.
It’s not doing anything to your TV, by the way. If you start seeing weird lines on all content, OK, then something’s wrong with your TV.
By the way, ABC, FOX, and some other channels broadcast 720p (1,280 by 720 pixels), which is scaled by either your cable box or your TV to fit your screen. The picture noise mentioned above is still possible; you're just not getting a pixel-perfect fit, as the content has fewer than half as many pixels as your TV. Not really important to our conversation, just wanted to mention it.
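For the curious, the arithmetic behind that comparison is simple (these are the standard broadcast resolutions, nothing TV-specific):

```python
# Total pixel counts for the two common broadcast HD formats.
pixels_1080 = 1920 * 1080   # 2,073,600 pixels on a 1080p panel
pixels_720 = 1280 * 720     # 921,600 pixels in a 720p broadcast

# 720p carries well under half the pixels of a 1080p panel,
# since each dimension is scaled by 2/3 (and 2/3 squared is 4/9).
print(pixels_720 / pixels_1080)  # 0.444...
```

So every 720p frame has to be stretched by a factor of 1.5 in each direction before it fills a 1080p screen.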
Got a question for Geoff? Click “Geoffrey Morrison” below then click the E-mail link in the upper right to e-mail, wait for it…Geoffrey Morrison! If it’s witty, amusing, and/or a good question, you may just see it in a post just like this one. No, I won’t tell you what TV to buy. Yes, I’ll probably truncate and/or clean up your e-mail. You can also send me a message on Twitter: @TechWriterGeoff.