There’s a new high dynamic range format coming. It’s called HDR10+. Here’s what you need to know.
Geoffrey Morrison
Nov. 6, 2017 12:19 p.m. PT
Just when you thought you’d gotten the whole high dynamic range thing figured out, here comes a whole new format.
HDR is the latest enhancement to make it to televisions, streaming devices and your favorite TV shows and movies. The good news is that it can deliver the best picture quality available in home video today. The bad news? It comes in multiple formats, adding one more potential point of confusion to the already overwhelming TV buying process.
We already have HDR10, Dolby Vision and HLG, but apparently that’s not enough. Now there’s a new one, HDR10+, the “+” being the key. Created by Samsung, HDR10+ has recently gotten some other big-name backers, like Panasonic, Philips, Amazon and 20th Century Fox.
So with the manufacturing and content sides on board (some companies on each side, anyway), it’s starting to look like HDR10+ could be the real deal.
So what makes it different from the others? I’m glad you asked.
Data, meta and otherwise
To explain the difference between HDR10 and HDR10+, we need to talk about metadata. Metadata is additional info, beyond the video signal itself, that gets transmitted along with an HDR movie or TV show. It basically tells the TV how to show the high dynamic range content. It’s like a set of secret Ikea instructions that turns your Billy bookcase into a library.
HDR10 has static metadata; HDR10+ and Dolby Vision have dynamic metadata. Since this is one of the biggest differences between HDR10 and DV, adding it to the license-free HDR10 is potentially a big deal.
With HDR10, the TV gets one set of instructions at the beginning of the show or movie. This single, static set says, “OK, when this show says jump, this is how high.” That’s fine, but it’s a one-size-fits-all approach. If a movie has a wide variety of scenes, say, this single piece of metadata might not allow for the best image.
It’s a little hard to tell from Samsung’s comparison graphic, but notice that the frames on the right show different levels of brightness in the sky. This is just an example, using a standard dynamic range image on your standard dynamic range screen. The idea with HDR10+’s dynamic metadata is that a filmmaker can determine how to best show each shot or scene. With HDR10’s static metadata, a single setting is used, a compromise that has to cover the darkest and brightest scenes.
Dolby Vision has dynamic metadata — and soon HDR10+ will, too. This allows for fine-tuning how the HDR looks not just across the entire movie, but on a per-scene or even per-frame basis. Most content probably won’t go that far, but this extra level of control lets filmmakers decide exactly how every shot in a movie should look on your TV. Potentially, this could mean better picture quality than vanilla HDR10. Now a movie can give a TV instructions on how high to jump essentially on a continuous basis. (Very bossy.)
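If you’re curious what that difference actually looks like in practice, here’s a rough sketch in Python. To be clear, this is a made-up illustration of the concept, not anything from the actual HDR10 or HDR10+ specs: the brightness numbers, scene names and tone-mapping math are all invented, and real TVs use far more sophisticated curves.

```python
# Toy illustration of static vs. dynamic HDR metadata.
# All numbers and names here are invented for illustration; they are
# not taken from the HDR10 or HDR10+ specifications.

TV_PEAK_NITS = 1000  # what this hypothetical TV can actually display

def tone_map(pixel_nits, scene_peak_nits):
    """Crudely squeeze content brightness into the TV's range.
    Real TVs use far more sophisticated tone curves than this."""
    if scene_peak_nits <= TV_PEAK_NITS:
        return pixel_nits  # content already fits; show it as-is
    scale = TV_PEAK_NITS / scene_peak_nits
    return pixel_nits * scale

# Three scenes from an imaginary movie: a dim interior, a normal
# daytime shot and a blinding explosion.
scenes = [
    {"name": "candlelit room", "peak": 200,  "sample_pixel": 150},
    {"name": "sunny street",   "peak": 900,  "sample_pixel": 700},
    {"name": "explosion",      "peak": 4000, "sample_pixel": 3500},
]

# Static metadata (HDR10-style): one number for the whole movie, so
# every scene gets squeezed as if it were as bright as the explosion.
movie_peak = max(s["peak"] for s in scenes)

for s in scenes:
    static = tone_map(s["sample_pixel"], movie_peak)
    # Dynamic metadata (HDR10+/Dolby Vision-style): each scene brings
    # its own peak, so dim scenes aren't dimmed unnecessarily.
    dynamic = tone_map(s["sample_pixel"], s["peak"])
    print(f"{s['name']:>15}: static -> {static:6.0f} nits, "
          f"dynamic -> {dynamic:6.0f} nits")
```

In this toy example, the dim candlelit scene gets crushed way down when one movie-wide setting has to account for the 4,000-nit explosion, but it comes through untouched when each scene carries its own peak. That, in a nutshell, is the pitch for dynamic metadata.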
Here’s how Samsung describes it:
HDR10+ provides for scene-by-scene adjustments for the optimum representation of contrast from the HDR source content. Being an open format, it’s license/royalty free and therefore easily adoptable by manufacturers and content producers with quality maintained through a HDR10+ certification and logo program.
HDR+
Oh, and just so there’s no confusion, HDR10+ has absolutely nothing to do with Google’s HDR+, an enhancement to camera phones. Similar names, totally unrelated. Well, they both have to do with HDR, but otherwise, not the same.
Formatted futures
If you read all this and decided that HDR10+ exists because Samsung doesn’t want to pay Dolby licensing fees for HDR, well, you’d be right. That’s definitely the reason, though I’m sure Samsung wants HDR to succeed, too.
With the HDR10 ecosystem being a bit like the Wild West, adding another layer of complexity could create additional problems. Will HDR10+ look the same as, worse than or better than Dolby Vision? Impossible to say. Most likely it will come down to the specific transfers, content and so on.
Or to put it another way, it’s probable that HDR10+ and Dolby Vision will look about the same. Dolby’s ace in the hole is, and will be, its hands-on involvement with the TVs themselves. A manufacturer pays Dolby not just for the ability to decode Dolby Vision content. Dolby will also show them how to make their TV look as good as possible with said DV content. There’s nothing like that on the HDR10 side — but there might be with HDR10+.
You may have noticed the mention of a “certification” in the quote above. There are no details about this yet, but according to Samsung, they hope to have something to announce by CES 2018 in January. They told CNET it will be a “quality-based certification and logo program for devices.” As for what level of performance TVs will have to meet to be certified, we’ll have to wait and see. This could be similar to what Dolby does, or it could be as simple as “yep, that’s an image.”
Lastly, the question you’ve probably wanted to ask this whole time: Will your TV work with it? Maybe yes, maybe no. Once again, Samsung:
Our entire 2017 lineup has an HDR10+-capable engine, and we will consider them for certification when the program is announced. We are evaluating options for 2016 displays.
Whether other companies will join the HDR10+ bandwagon remains to be seen. LG has Dolby Vision, and it’s not like they jump at things created by Samsung. As for other companies, we shall see. If they sign on, will they be able to firmware-update existing HDR TVs to work with HDR10+? I wouldn’t count on it, but I suppose it’s possible.