Cutting through the 4K UHD Confusion

gene

Audioholics Master Chief
Administrator
Ultra High Definition is finally starting to take hold in the market, with equipment prices beginning to drop into range for buyers of normal means and more UHD content becoming available for everyone’s viewing enjoyment. But what does the new technology entail, and how does one navigate all the new specifications, acronyms, and jargon, along with all the recycled, updated, and bastardized terminology from the previous generation of HDTV? That is where Audioholics comes in to do some critical review and help you, the reader, sort through it all.

Some Key Points of Understanding:
  • The idea of standards in the consumer electronics industry is just short of being a complete oxymoron:
    • When there are multiple competing standards, effectively there is not a standard.
    • When standards contain optional components, this also is effectively not a standard.
  • The CTA (CEA) and the Ultra HD Forum/UHD Alliance are trying to one-up each other over who dictates UHD TV standards for design and performance parameters.
  • ITU-R Rec. 2020 is used as the basis of most UHD TV, but other standards crop up, such as DCI-P3.
  • 4K TVs are not actually 4K, but 4K cinema is; hence, part of the reason for the shift from calling TVs 4K to UHD.
  • UHD TVs are also not necessarily required to have 4K resolution under the predominant recommendations; hence the rest of the reason for the shift.
  • As for why the consumer electronics industry decided to go with 4K in the first place rather than the more widely familiar vertical resolution designation, 2160p, which would have reduced consumer confusion… ¯\_(ツ)_/¯.
  • Remember that movies are filmed at 24 fps, and they represent a significant chunk of the currently available 4K content. In other words, the need for 60 Hz HFR is mostly BS at present, unless you are a fan of soaps.
  • Most, if not all, available 4K content uses color subsampling, likely to be at least 4:2:0, to further help with data size and bandwidth limitations beyond just straight up data compression.
  • So, videophiles, don’t get all uptight just yet about 60 Hz, 4:4:4 4K content, as there really is not much, if any, actually available; any you are watching involves your UHD TV and other AV gear faking it with interpolation and upscaling. I personally prefer to watch what was actually filmed/recorded/produced rather than what my TV concocts on the fly while I am watching it.
  • Ultra HD Premium TVs are supposed to be compatible with the Rec. 2020 color space, but they are currently only required (and able) to produce 90% of the 30% smaller DCI-P3 color space used in the film industry. The best UHD TVs are just getting to where they can display all of the DCI-P3 color space, so at this point Rec. 2020 color is a goal, not really a feature, and it is likely not just around the corner.
  • HDR includes a lot of other things that have nothing to do with dynamic range, and it is in the early stages of a potential format war between HDR10 and Dolby Vision, with HDR10+ and HLG10 waiting in the wings. Once again, none of them is really a standard at this point, by definition.
  • Most current TVs that support HDR support only one format; some support two, but I do not think any are available yet that support all three.
  • The HDR format that your TV most likely does not support, HLG10, is the best way to provide backwards compatibility with the oodles of legacy SDR content and to minimize bandwidth necessary to supply both.
  • Don’t expect over-the-air UHD TV broadcasts any time soon, due to legacy bandwidth limits and the fact that the ATSC 3.0 specification for broadcasting UHD TV was only recently released, which explains the distinct lack of ATSC 3.0 tuners in currently available UHD TVs.
  • Barring a jump in the efficiency of encoding technology, 1080p HDR broadcasting is far more likely than 4K broadcasting in the near future. Good thing it is all called UHD now.
  • Satellite and cable TV have more 4K content than broadcast TV, but not by much.
  • Streaming is a serviceable way to get a fair amount of 4K TV, but that may change if the current FCC lets the existing uncompetitive ISP environment flourish; having revoked net neutrality, it has left ISPs free to choke customers on the price and bandwidth of the necessary Internet connection speeds.
  • Ultra HD Blu-ray, while less convenient than streaming (assuming you even have the Internet connection speeds streaming requires), produces the best available 4K picture, and the number of movies and TV shows available is only increasing.
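A quick back-of-envelope calculation makes the subsampling and bandwidth points above concrete. This is a rough sketch using assumptions of my own (10-bit samples, no blanking or container overhead), not a spec citation:

```python
# Rough uncompressed video payload rates, to show why 4K delivery leans on
# chroma subsampling on top of ordinary compression.
# Assumptions (mine, for illustration): 10-bit samples, no blanking overhead.

def raw_bitrate_gbps(width, height, fps, bits_per_sample, subsampling):
    """Approximate uncompressed bitrate in Gbit/s.

    `subsampling` is samples per 4-pixel group for (Y, Cb, Cr):
    4:4:4 -> (4, 4, 4), 4:2:2 -> (4, 2, 2), 4:2:0 -> (4, 1, 1).
    """
    y, cb, cr = subsampling
    samples_per_pixel = (y + cb + cr) / 4
    bits_per_second = width * height * fps * samples_per_pixel * bits_per_sample
    return bits_per_second / 1e9

full = raw_bitrate_gbps(3840, 2160, 60, 10, (4, 4, 4))  # ~14.9 Gbit/s
sub = raw_bitrate_gbps(3840, 2160, 60, 10, (4, 1, 1))   # ~7.5 Gbit/s
print(f"4K60 10-bit 4:4:4: {full:.1f} Gbit/s")
print(f"4K60 10-bit 4:2:0: {sub:.1f} Gbit/s ({100 * (1 - sub / full):.0f}% saved)")
```

4:2:0 halves the raw payload before the codec even starts, which is part of why essentially all consumer 4K delivery begins there.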


Read: Demystifying 4K UHD or Whatever They Call It
 
markw

Audioholic Overlord
At this time, I see the "4K" designation as akin to the term "Hi-Fi" in the '50s and early '60s: not any specific standard, but more of a marketing tool.
 
VonMagnum

Audioholic Chief
When my 92" screen still looks plenty sharp at 10 feet with 1080p (now called "2K" after the horizontal resolution, despite 1920 falling shy of 2K), I have to wonder what the point of 4K is, other than another excuse to sell you a TV to replace one that has nothing wrong with it, considering most people aren't buying 4K sets anywhere near that size.

HDR + WCG is the real reason, people have told me. Yeah, see above. It's a mess. I see people complaining all the time on the Blu-ray forums about black crush or weird glitches with Dolby Vision on certain movies, and why they should have to go back to inferior HDR10, etc., but I keep thinking I just enjoy the movie in 1080p.

Do I even want to have to worry about whether I have enough nits to play a particular movie right (they seem to be optimized for whatever the guy mastering the disc wants)? Is it exposed and displayed correctly on my display? Supposedly projectors can't do HDR worth a damn anyway. It seems like a convoluted hassle, and then I'm back to just the 4K resolution increase, when the bottom left corner of my existing lens already isn't as sharp as the rest. How much do I have to pay for a lens good enough to deliver an even 4K picture, for a barely perceptible resolution increase?

Even a $20k Sony doesn't have enough nits to do HDR correctly. I might as well wait for a 92" OLED or QLED set, or one of the retractable versions that are starting to appear. But then I'll lose 3D, which I really enjoy.... Horrible situation.
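For what it's worth, the common one-arcminute visual acuity rule of thumb backs up that 10-foot observation. A simplified sketch (the geometry and assumptions are my own: 20/20 vision, 16:9 screen):

```python
import math

def max_benefit_distance_ft(diagonal_in, horiz_pixels, aspect=16 / 9):
    """Distance (feet) beyond which a pixel subtends less than one
    arcminute (a common rule of thumb for 20/20 acuity), i.e. where
    extra resolution stops being visible. A simplification, not gospel."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pixel_in = width_in / horiz_pixels                       # pixel pitch
    one_arcmin = math.radians(1 / 60)
    return pixel_in / math.tan(one_arcmin) / 12              # inches -> feet

print(f'1080p on 92": useful inside ~{max_benefit_distance_ft(92, 1920):.0f} ft')
print(f'4K on 92":    useful inside ~{max_benefit_distance_ft(92, 3840):.0f} ft')
```

By this estimate, 1080p pixels on a 92" screen only become resolvable inside roughly 12 feet, so at a 10-foot seat the 4K upgrade is right at the edge of perceptibility.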
 
davidscott

Audioholic Ninja
I think I'll stick with my 1080p sets for a while, but thanks for the details. Very informative.
 
TankTop5

Audioholic General
Would I be wrong in assuming an average or almost cheap 4K set is as good as or better than a couple-year-old very high-end 2K set?

It looks like most people here are mostly interested in home theater, but frame rate when gaming is extremely important. Dropping much below 60 FPS in a multiplayer first-person shooter can be a major handicap.


Sent from my iPhone using Tapatalk
 
BoredSysAdmin

Audioholic Slumlord
Thanks to the previous FCC setting the broadband definition at 25 Mbit/s download, UHD streaming should generally benefit, but this is again not a requirement, merely a definition, which ISPs are free to completely ignore. In fact, Pai tried to redefine it back to 10 Mbit/s to appease his owners (the large telcos).
Federal net neutrality is currently dead, and despite the ongoing state lawsuits, I doubt it will be reenacted during this administration's term.
As for ATSC 3.0 - yes, it's a new format, but I am genuinely curious why the author is so skeptical regarding its technical merits (yes, major adoption is likely years away, agreed). It's not like MIT nuclear scientists are developing it or anything. Oh, hold on, yes they are: https://www.atsc.org/newsletter/atsc-3-0-where-we-stand/
One more piece of info: apparently Phoenix, AZ was chosen as the testbed site for ATSC 3.0 trials: https://www.phoenixnextgentv.com/
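As a rough sanity check of that 25 Mbit/s definition against 4K streaming, here's a sketch; the per-stream bitrates below are my own ballpark assumptions, not any service's published numbers:

```python
# Back-of-envelope check of the 25 Mbit/s broadband definition against
# typical 4K streaming bitrates. The service bitrates below are rough
# assumptions of mine, not published specs.
STREAM_MBPS = {"1080p stream": 5, "4K SDR stream": 15, "4K HDR stream": 18}
BROADBAND_MBPS = 25
HOUSEHOLD_OVERHEAD = 0.8  # assume ~20% lost to other devices/protocol overhead

usable = BROADBAND_MBPS * HOUSEHOLD_OVERHEAD
for name, rate in STREAM_MBPS.items():
    verdict = "fits" if rate <= usable else "does NOT fit"
    print(f"{name}: {rate} Mbit/s vs ~{usable:.0f} Mbit/s usable -> {verdict}")
```

Under these assumptions a single 4K HDR stream just fits inside a 25 Mbit/s connection, with little headroom for a second stream in the household.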
 
VonMagnum

Audioholic Chief
What bugs me is that they're planning to allow broadcasters to charge for service, ending huge chunks of "free" airwaves for the public. I got rid of cable precisely because I had to endure so much advertising (ever-increasing amounts, to the point where it averaged 22 minutes of ads per hour, compared to 10 minutes in the 1960s and 1970s on free TV, no less), and I STILL paid $123 a month for 2 HD boxes that were constantly plagued by hard drive failures (the cable boxes got so hot, and yet the GUI was so slow, it was absurd). On top of that I paid a $10-a-month "rental" fee for each, year after year, plus another 99 cents per remote? Awful. Just awful.

I had enough. $12 a month on Netflix gets me up to 4K shows with zero advertising, and yet Netflix can still afford to make their own shows and movies on top of tons of older ones. Prime does the same for a similar price per month, and I got it for the free shipping, not the TV, music, and movies, but I get those too. Yet they think I'm going to pay for over-the-air TV with advertising and spotty reception??? No thanks.
 
Auditor55

Audioholic General
I'm glad they are using UHD. 4K is and always was marketing hype; UHD is the more appropriate terminology. From my experience, the difference between 4K and 1080p is hardly noticeable, if at all. TV manufacturers know that most people think more resolution means better, without considering the point of diminishing returns. Today we see the big three TV manufacturers, Sony, Samsung, and LG, already pushing 8K TVs to the public when 4K has barely taken hold. They know that resolution sells, and they know there will be those who feel their current 4K TV is already obsolete, without anyone really explaining to them the benefit of purchasing an 8K TV.
 
Stanton

Audioholics Contributing Writer
Federal net neutrality is currently dead, and I doubt that despite currently ongoing states lawsuits it would be reenacted during this administration term.
Which is a good thing, because (trying not to get "political") net neutrality (as previously defined) was another example of unnecessary regulation over an adequately performing free market...but that is a discussion for another time/article.
 
Stanton

Audioholics Contributing Writer
Would I be wrong in assuming an average or almost cheap 4K is as good or better than a couple year old very high 2k?
If that "cheap" 4K TV also supports WCG and (some form of) HDR, then most likely yes. The flat-panel TV revolution has finally given us "affordable" large-screen (>40") TVs, which are more likely to show the benefits of 4K resolution. Now 8K is a different story...and that doesn't even address (the lack of) 8K content.
 
BoredSysAdmin

Audioholic Slumlord
Which is a good thing, because (trying not to get "political") Net Neutrality (as previously defined) was another example of un-necessary regulation over an adequately performing free-market...but that is a discussion for another time/article.
Adding a "dumb" rating to your post would be a childish thing to do, but consider that I did exactly that.
There are so many things factually wrong in your short comment.
 
mdinno

Junior Audioholic
So, for older AVRs like mine (Pioneer SC-71) that have HDMI 1.4a with 4K scaling/passthrough, is it really worth it to upgrade?
 
BMXTRIX

Audioholic Warlord
Would I be wrong in assuming an average or almost cheap 4K is as good or better than a couple year old very high 2k?

It looks like most people here are mostly interested in home theater but frame rate when gaming is extremely important. Too much below 60 FPS on a multi player first person shooter can be a major handicap.


Sent from my iPhone using Tapatalk
Yes, you would be very wrong.

Almost all TV manufacturers are going to 4K, but the technology used to create an image hasn't changed that much. That means a cheap LCD display isn't going to touch what the last-model Pioneer plasma TVs looked like over a decade ago. About the only technology competing with that level of image quality is OLED. Guess what? They aren't cheap.

Full-array local dimming (FALD) LED displays also cost more than the cheapest LCDs; they look very good, but they aren't the cheapest.

As it turns out, time and time again, it costs more to get better image quality, and most people would happily put their 10-year-old Pioneer Kuro up against almost anything on the market today.

It goes back to the source. A high-quality source will yield a high-quality image, and that's where cheaper 4K displays get a bit of an advantage, simply because they support that higher-quality source from the start. But it won't fix their black levels, motion detail, or shadow detail. Those three things matter a great deal more than resolution.
 
BMXTRIX

Audioholic Warlord
So older AVR's like mine(Pioneer SC-71) that have HDMI 1.4a with 4K scaling/passthrough. Is it really worth it to upgrade?
Upgrade to what? A good receiver that supports all your existing content and your current display doesn't require upgrading. If you go to a 4K TV and don't have any content that requires HDCP 2.2, then you are fine as well.

I think one of the things that I didn't see discussed much (or at all?) is GAMING!

The absolute top request I hear about 4K, and 18 Gbit/s 4K, and 60 Hz 4K, is about GAMING! Gamers really want to push things as much as they can. Some are yabbering on about 120 Hz 4K compatibility, which isn't here yet. But they want it. And I would say that at least 50% of the projector market lists 'gaming' as a pretty important part of why they are purchasing 4K.

I found the article to be VERY technical, but it does ignore that gamers want 4K/60 compatibility for the current crop of XBox and PS4 systems. With a new PS5/XboxII on the horizon, people want to be ready for it.

But, there isn't any reason for anyone to have to upgrade unless they are buying gear which will require it.

If someone is buying new, then there is no reason NOT to buy gear that supports HDMI 2.0 and HDCP 2.2.
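To put numbers behind that 18 Gbit/s figure: HDMI carries blanking intervals and, prior to HDMI 2.1, 8b/10b TMDS encoding, so the wire rate sits well above the visible-pixel rate. A rough sketch using the standard 4400 x 2250 total timing for 4K/60:

```python
# Why gamers care about "18 Gbit/s" HDMI 2.0: a rough check of whether
# 4K/60 fits. HDMI carries blanking intervals, and (pre-2.1) TMDS uses
# 8b/10b encoding, so the wire rate exceeds the visible-pixel rate.
# CTA-861 total timing for 4K/60 is 4400 x 2250 including blanking.

def hdmi_wire_rate_gbps(h_total, v_total, fps, bits_per_pixel):
    tmds_overhead = 10 / 8  # 8b/10b encoding on HDMI 2.0 and earlier
    return h_total * v_total * fps * bits_per_pixel * tmds_overhead / 1e9

rate_8bit = hdmi_wire_rate_gbps(4400, 2250, 60, 24)  # 4K/60 8-bit 4:4:4
print(f"4K/60 8-bit 4:4:4: {rate_8bit:.1f} Gbit/s (limit 18.0)")
# ~17.8 Gbit/s: 8-bit RGB just fits; 10-bit HDR at 60 Hz needs 4:2:2 or 4:2:0.
```

So 4K/60 with 8-bit RGB squeaks in under the 18 Gbit/s ceiling, which is exactly why gamers insist on full-bandwidth HDMI 2.0 ports rather than the 10.2 Gbit/s ports found on some early "4K" gear.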
 
mdinno

Junior Audioholic
Upgrade to what? A good receiver that supports all your existing content and current display doesn't require upgrading. If you go to a 4K TV and don't have any content for it which requires HDCP 2.2, then you are fine as well.

I think one of the things that I didn't see discussed much (or at all?) is GAMING!

The absolute top request I hear about 4K, and 18Gb/s 4K, and 60hz frame rate 4K is about GAMING! Gamers really want to push things as much as they can. Some are yabbering on about 120hz 4K compatibility, which isn't here yet. But, they want it. And I would say that at least 50% of the projector market lists 'gaming' as a pretty important aspect to why they are purchasing 4K and a part of their usage.

I found the article to be VERY technical, but it does ignore that gamers want 4K/60 compatibility for the current crop of XBox and PS4 systems. With a new PS5/XboxII on the horizon, people want to be ready for it.

But, there isn't any reason for anyone to have to upgrade unless they are buying gear which will require it.

If someone is buying new, then there is no reason NOT to buy gear that supports HDMI 2.0 and HDCP 2.2.
Thanks...BTW my TV is a Pioneer Kuro 111FD that is still going strong!!
 
TankTop5

Audioholic General
Yes, you would be very wrong.

Almost all TV manufacturers are going to 4K, but the technology used to create an image hasn't changed that much. That means, that a cheap LCD display isn't going to touch what the last model Pioneer plasma TVs looked like from over a decade ago. About the only technology competing with that level of image quality is the OLED displays. Guess what? They aren't cheap.

Full array LED lighting displays (FALD) also cost more than the cheapest LCDs and they are very good looking, but they aren't the cheapest.

As it turns out, time and time again, it costs more to get the better image quality, and most people would happily put their 10 year old Pioneer Kuro against almost anything on the market today.

It goes back to the source. A high quality source will yield a high quality image, and that's where cheaper 4K displays get a bit of an advantage. Just because they support that higher quality source from the start. But, it won't fix their black levels, motion detail, or shadow detail. Those three things matter a great deal more than resolution.
Thanks for that answer. I know Fujitsu used to make the highest-quality screens, and I think that's the case with the Pioneer Kuro. You don't hear the name Fujitsu anymore, but are they still making high-quality screens?


Sent from my iPhone using Tapatalk
 
BMXTRIX

Audioholic Warlord
Thanks for that answer. I know Fujitsu used to make the highest quality screens and I think that’s the case with the Pioneer Kuro, you don’t here the name Fujitsu but are they still making high quality screens?
I'm not sure about Fujitsu. They certainly haven't come up as a 'highest quality screen' maker in my memory in the last 20 years. I could have missed them, or they could have been after a different market segment. Before flat panels, Sony Trinitron had a very strong reputation, and Fujitsu may have leveraged that technology in their displays. But since flat panels came out and plasma became 'the' tech to own, Pioneer and Panasonic did a very solid job leading the pack.

Now, the Pioneer Kuro is still considered one of the best ever made.

But LG with OLED, and even Samsung with their top-tier LCD models, are doing incredibly well on image performance, and in a time when HDR is spoken of more and more, the really bright LCDs are pretty amazing to see. They certainly do an excellent job in a well-lit family room setup.

At the end of the day, the vast majority of the world just doesn't care about 'best' quality, and that's why Pioneer couldn't sustain the Kuro. It's also why pricing has to keep falling for LG's OLEDs to remain competitive. But they have a unique product that is very well regarded for image quality, they have maintained it for several years now, and hopefully they will keep pushing quality up in the years to come.
 
BoredSysAdmin

Audioholic Slumlord
I feel like the Vizio P-Series Quantum has stolen Samsung's crown for top-quality LCD TV. LG still has a leg up on contrast with its OLED panels, but with the 2019 models' newer Quantum X and 480 backlight zones, the difference should be minimal.
 
panteragstk

Audioholic Warlord
I feel like Vizio P-Quantum has stolen the Samsung crown of top quality LCD tv. LG still has a leg up with contrast on OLED panels, but with 2019 models with newer Quantum X and 480 backlight zones, the difference should be minimal.
I think the only thing I'd upgrade my plasma for would be a display capable of OLED black levels with QLED brightness: 1000 nits (after calibration) or more, and dead black. That, and no motion issues, burn-in, or other artifacts.

As of now, that display doesn't seem to exist. I haven't really looked at a calibrated QLED set (Vizio or otherwise) to see what black levels they're capable of, but the other issues with LCD would make them a no-go for me.
 
