About HDR TV


Tom demystifies High Dynamic Range as it pertains to televisions and helps you decide which version might be best for your budget.

Featuring Tom Merritt.

MP3

Please SUBSCRIBE HERE.

A special thanks to all our supporters–without you, none of this would be possible.

Thanks to Kevin MacLeod of Incompetech.com for the theme music.

Thanks to Garrett Weinzierl for the logo!

Thanks to our mods, Kylde, Jack_Shid, KAPT_Kipper, and scottierowland on the subreddit

Send us email at [email protected]

Episode Transcript:

I went to buy a new TV and they tried to sell me HDR
Is that ANOTHER HD? Aren’t there enough?
And how does this fit with 4K? Is it just another name for the same thing?
Are you confused?
Don’t be.
Let’s help you Know a Little More about HDR TV.

HDR stands for high dynamic range, and it’s a selling point for TVs. There can be some confusion about this because the same term is used for things besides video. On this episode, we’re talking about how HDR pertains to video, specifically your TV or movie watching.
There is also HDR for still photography and other imaging. And it works differently. You’ll most likely run across that in regard to your phone’s camera. It’s even applied to audio sometimes. We may talk about them on another day but today is not that day.
But let’s start with the root of the term for all these meanings. Dynamic Range. You probably think of it as contrast.
This is one of those phrases you hear and you think you know what it means but when pressed you may or may not be able to come up with a good definition.
Put simply, dynamic range is the ratio between the highest and lowest values. Hence you hear people say contrast ratio. Let’s say hockey teams usually score between 0 and 10 goals. Their dynamic range is 0-10. Or let’s say the cooldown on your WoW character’s spells is 3 seconds at the fastest and 8 at the slowest. Your character’s dynamic range is 3-8.
Of course that’s not how dynamic range is used and it’s not how it gets expressed. It’s often expressed as a ratio or logarithm and it applies to signals. But for our purposes you just need to understand that it’s a range in which values can fluctuate. A higher contrast ratio means you can have brighter brights and darker darks.
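If you want to see that arithmetic, here’s a quick sketch in Python with made-up panel numbers (the 1,000-nit peak and 0.05-nit black level are just illustrative):

    import math

    # Hypothetical panel numbers, just to make the idea concrete.
    peak_white = 1000.0   # brightest the panel can show, in nits
    black_level = 0.05    # darkest the panel can show, in nits

    contrast_ratio = peak_white / black_level   # the ratio form: 20,000:1
    stops = math.log2(contrast_ratio)           # the log form: ~14.3 "stops"

    print(f"Contrast ratio: {contrast_ratio:,.0f}:1")
    print(f"Dynamic range: {stops:.1f} stops")
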
High dynamic range means the range gets, well, higher than usual. And we could kind of stop there. HDR means brighter brights and darker darks. But you probably know that HDR promises better color too. So let’s dig a little deeper.
Whether your display (or TV) can show brighter brights and darker darks depends on two main things. Does the signal tell it to? And is the device capable of doing it if told to? So to have HDR you have to have a signal sent to your display that has the HDR data in it. That signal, however, doesn’t affect the display’s capabilities, which is why you also have to have a display that is capable of HDR, so it can actually show those brighter brights and darker darks.
And there are different ways of encoding HDR in the signal, so your display has to be capable of interpreting the format of the HDR signal too. An HDR10 display may or may not also be capable of displaying Dolby Vision HDR. We’ll get to that a bit later.
But let’s start with how it works no matter what format.
HOW IT WORKS
HDR is a way of encoding the video signal to include more information. The standard dynamic range signal, as it is now called– before it was just called “how you do it”– had limitations imposed by the capabilities of the technology that existed when it was created. Why encode brighter brights, darker darks, or more colors if there was no device even on the horizon that could use them? But as devices improved, a new way of defining the signal to take advantage of their capabilities was needed.
HDR for video came about in 2014. Before that you generally tried to improve video by increasing pixels, so you had higher resolution, or by improving frame rate, so it was smoother. Those do improve perceived quality, but HDR doesn’t add pixels or speed them up. HDR improves each pixel’s perceived fidelity through brightness and, because of that brightness, also through perceived color.
Since it all depends on the brightness, née contrast ratio, let’s start there.
People measure brightness in nits. You’ve probably seen this when you shop. It comes up a lot in displays. One nit is one candela, about the light of a single candle, per square meter.
Standard video has a brightness of 100 nits. HDR can encode brightness up to 1,000 nits, sometimes up to 10,000.
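Here’s a quick back-of-the-envelope comparison of those peaks in Python (black levels vary panel to panel, so this only compares peak brightness):

    # Peak brightness targets, in nits, from the figures above.
    sdr_peak = 100       # standard dynamic range reference
    hdr_typical = 1000   # a common HDR mastering target
    hdr_ceiling = 10000  # the most the HDR signal can describe

    print(f"Typical HDR master: {hdr_typical // sdr_peak}x the peak SDR can describe")
    print(f"HDR signal ceiling: {hdr_ceiling // sdr_peak}x")
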
And it’s not just the brightest bright; it applies to the dark levels too. Remember we said the old standard defined things based on old capabilities? In a standard display everything below a certain brightness was the same shade of black no matter how you encoded the data. Because when the standard was created no display was capable of showing anything blacker. Displays always had a little glow. Well now displays can show very dark areas thanks to things like local dimming. And since HDR has a wider range of brightness, the darks can look darker. Not just darker: because of that control, a video maker can show more details in the darkness. Flames, for instance, can show the gaps in the flame instead of just looking like a big reddish-white splotch. More details even without more resolution.
Now we’ve been talking all about brightness, aka luminance, and contrast ratio, but one of the big selling points of HDR is color. Particularly color accuracy: how closely a color matches how you think it should look, or how it looks in real life.
HDR doesn’t actually address color directly. Technically the dynamic range only addresses brightness. But the way our brains work, a greater range of light makes us think we see a greater range of color.
So an HDR video can use a wide color gamut. The word gamut just means the range of distinguishable colors that can be represented. That won’t necessarily mean a wider range of colors, but can mean more variations within it. Think of it this way. A really small gamut might be blue, red, purple, and green. A wider one would be sky blue, sea blue, brick red, apple red, purple, violet, green, and yellow.
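For a rough sense of how much wider a wide gamut is, here’s a Python sketch using the published red, green, and blue primaries of the HD color standard (Rec.709) and the wider one usually paired with HDR (Rec.2020). Triangle area on the CIE chromaticity chart is a crude proxy for gamut size, not a perceptual measure:

    def triangle_area(p1, p2, p3):
        """Shoelace formula for a triangle given (x, y) corner points."""
        (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
        return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

    # Published chromaticity coordinates of the R, G, and B primaries.
    rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # HD standard
    rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # UHD/HDR standard

    ratio = triangle_area(*rec2020) / triangle_area(*rec709)
    print(f"Rec.2020 spans about {ratio:.1f}x the chart area of Rec.709")  # ~1.9x
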
Now there is some confusion here. You can have a wide color gamut on a standard dynamic range TV, but because the brightness isn’t there you won’t be able to tell. You won’t find many examples of that though. Usually if a display bothers to have a wide color gamut it doesn’t want to waste it by not supporting at least one type of HDR.
Also though, a TV can have HDR capability, because it handles the brightness, but not have the wide color gamut, meaning the HDR won’t seem to do as much as it would on another TV. You might get brighter brights and darker darks but not eye-popping color.
So a good HDR TV has HDR’s higher contrast ratio and can support a wide color gamut.
When you can make the pixels brighter and therefore viewers can distinguish more colors, you can make the same video at the same resolution and frame rate look better and more detailed. You can increase brightness in a smaller area, say, so you can show reflection on a shiny object. Shadows can be darker. And all the control over light and dark and the extra color variation you can employ means you can have better depth, even in 2D images.
Time for an important point. HDR is often linked to 4K TVs and higher. But technically HDR can be used on good old 1080p and lower. It’s just that when you have fewer pixels to control, pixel control doesn’t have the same effect. 4K is the first resolution with enough pixels to really help HDR pop.
Now as we’ve been saying, this is what goes into the HDR signal. The pudding in which the proof is, is the display. HDR was made a little future-proof, which means no display can yet show the maximum nits or colors that HDR can encode. Good news: room to grow with the current signals! Other news: people have different ideas on how to implement HDR for current displays. So we have a format war!
Yay?
Let’s run over a few of the formats and their highlights and weak points before we wrap this up.
HDR10 is an open standard supported by the Consumer Technology Association, the CTA. You know, the folks who put on CES every year. It’s capable of up to 10,000 nits but is usually mastered somewhere between 1,000 and 4,000. HDR10 uses static metadata to tell the display how to adjust if it has lower peak brightness than what the master was made for. But that metadata is locked for the whole video. It doesn’t adapt from scene to scene or display to display. The metadata is not dynamic. The display’s software looks at the signal and the metadata and chooses how to adapt, which may not end up looking how the author intended.
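To picture what static metadata means in practice, here’s a simplified Python sketch. Real HDR10 carries SMPTE ST 2086 mastering info plus values called MaxCLL and MaxFALL; the fields and the tone-mapping rule below are illustrative stand-ins, not the actual spec:

    from dataclasses import dataclass

    @dataclass
    class StaticHdrMetadata:
        # Simplified stand-ins for HDR10's real static fields.
        max_content_light_level: int  # brightest pixel anywhere in the video, nits
        mastering_peak: int           # peak the master was graded for, nits

    def choose_tone_map_peak(meta: StaticHdrMetadata, display_peak: int) -> int:
        # One number for the whole movie: the display must plan for the
        # brightest moment even in scenes that never get near it.
        return min(meta.max_content_light_level, display_peak)

    movie = StaticHdrMetadata(max_content_light_level=4000, mastering_peak=4000)
    print(choose_tone_map_peak(movie, display_peak=800))  # an 800-nit TV squeezes 4,000 down
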
Because it’s an open standard this is available on pretty much every HDR-capable TV and video format. Alongside one or more of the following.
HDR10+ is HDR10 with dynamic metadata, developed by Samsung. Here dynamic metadata means it can be changed scene to scene, so the TV isn’t bothering to stay ready for brightness that isn’t needed until later in the video. This reflects what the video maker intended and doesn’t leave all the decisions to the display software. Samsung makes this free to content creators and charges a license to display manufacturers.
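And here’s the same idea with scene-by-scene metadata, again with made-up numbers:

    # Hypothetical per-scene metadata: each scene declares its own peak,
    # so the display only compresses highlights when a scene needs it.
    scenes = [
        ("night exterior", 120),   # scene peak in nits
        ("candlelit room", 350),
        ("sunrise", 2800),
    ]

    display_peak = 800  # same 800-nit TV as before
    for name, scene_peak in scenes:
        target = min(scene_peak, display_peak)
        print(f"{name}: tone-map toward {target} nits")
    # With static metadata, all three scenes would be mapped against the
    # single movie-wide figure instead.
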
Dolby Vision is a proprietary system from Dolby Labs and also has dynamic metadata, which doesn’t just adapt scene to scene but can also adapt to the capabilities of the display it runs on. It has luminance capability up to 10,000 nits and is usually mastered up to 4,000 nits. Dolby Vision also allows up to 12 bits of color instead of the usual 10, so it has more color depth. Everybody who wants to use it has to pay Dolby.
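That bit-depth difference is easy to put numbers on:

    # Pure arithmetic: what 12 bits per channel buys over 10.
    for bits in (10, 12):
        shades = 2 ** bits    # gradations per color channel
        colors = shades ** 3  # R x G x B combinations
        print(f"{bits}-bit: {shades:,} shades per channel, {colors:,} colors")
    # 10-bit: 1,024 shades per channel, about 1.07 billion colors
    # 12-bit: 4,096 shades per channel, about 68.7 billion colors
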
HLG10 (that’s hybrid log-gamma) is one that can be used for video or still images, giving implementers some flexibility. It doesn’t use metadata though. However it’s backwards-compatible with some 4K and 8K standard dynamic range displays (which the others aren’t) and hey, it is royalty free. It was developed by the BBC and NHK for broadcasters and is used by several streaming services too. There’s a similar format called PQ10 out there too.
Is one of these better than the others? Generally speaking the ones with dynamic metadata, Dolby Vision and HDR10+, are considered to be better because they are truer to the intent of the video creators, with the dynamic metadata and all. And there’s a nice healthy “discussion” about which one of those is better.
So there you have it. HDR! Now when you head out to buy that new TV, you’ll have a little better idea what HDR is, which versions the TV supports, and what that means for your viewing pleasure.
In other words, I hope you know a little more about HDR TV.