There have been many advances in video quality in recent years, but High Dynamic Range (HDR) might be the most exciting. Where technologies like Ultra HD sharpen the picture by driving up the number of pixels in an image, HDR uses that existing hardware to make images more vivid and lifelike.
By using metadata in the video file to specify shades of brightness and color more precisely than ever before, HDR produces better, more lifelike images on television screens. And most TVs sold over the last few years, as well as those in production today, have the hardware to produce HDR images.
The world of high dynamic range is new and exciting, and the technology is only just starting to saturate the market. Amazon Prime Video and Netflix are already getting in on the game, as are a number of blockbuster films. Pretty soon, HDR is expected to be baked into most content that’s produced, and into pretty much any device with an HD screen designed to view it.
What Is HDR and How Does It Work?
The best way to understand HDR is to take it from the top: a director or producer of a film or television show has a vision for how that content should look. They shoot the video, edit it in post-production, and eventually it gets to you on a screen. It’s during post-production that the precise colors and brightness levels are set.
After that, the video signal can take one of two routes to your screen. Before HDR, the signal would simply go to your television, and based on the television’s hardware and color capabilities, it would look, well, however it ended up looking.
Some brands, like Samsung, might even apply post-processing (picture modes like “Dynamic,” “Vivid,” etc.) that shifts the colors and brightness even further. So there’s a real chance that what you’re looking at is not what the producer of the content had in mind.
With HDR, however, something different happens: instead of letting the television’s limitations or post-processing drive how the image looks, HDR allows the content producer to specify exactly how bright each pixel should be and exactly what color it should be. This information is then embedded in the video source.
These enhancements are stored as metadata, or information about the image data, and using that information is where hardware comes back in. To make use of the metadata, you need a receiver or media device that can pass it on to the screen, and a television capable of interpreting it and producing the specified levels of brightness and color.
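To make the idea of metadata concrete, here’s a rough sketch in Python of the kind of static metadata an HDR10 stream carries. The field names follow the SMPTE ST 2086 mastering-display metadata plus the MaxCLL/MaxFALL content light levels; the specific values shown are purely illustrative.

```python
# Illustrative sketch of HDR10 static metadata (values are hypothetical).
# Real streams carry this information per SMPTE ST 2086, plus the
# MaxCLL/MaxFALL content light levels.

hdr10_metadata = {
    # Color volume of the mastering display (BT.2020 primaries shown here)
    "display_primaries": {
        "red":   (0.708, 0.292),
        "green": (0.170, 0.797),
        "blue":  (0.131, 0.046),
    },
    "white_point": (0.3127, 0.3290),          # D65 white point
    "max_mastering_luminance_nits": 1000.0,   # peak of the mastering monitor
    "min_mastering_luminance_nits": 0.0001,
    "max_cll_nits": 800.0,    # brightest single pixel in the whole title
    "max_fall_nits": 180.0,   # brightest average frame in the whole title
}

def describe(meta):
    """Summarize the brightness range the content was graded for."""
    return (f"Graded for {meta['min_mastering_luminance_nits']}-"
            f"{meta['max_mastering_luminance_nits']} nits, "
            f"peak pixel {meta['max_cll_nits']} nits")

print(describe(hdr10_metadata))
```

The key point is that these numbers travel with the video itself, so a compatible TV knows what brightness range the content was graded for instead of guessing.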
An Important Note about HDR
It’s important to note that HDR is not the same as increasing the brightness and contrast on your display device. Although raising the contrast widens the gap between light and dark areas, it still leaves your display with the same range of colors. HDR metadata, though, can specify shades of brightness and color beyond anything you can reach in the screen settings.
Further, when you adjust the contrast, your display device decides how to apply it. HDR, by contrast, is applied by the film’s director and editorial team, and some directors might choose different ranges of brightness for different scenes. Others might apply HDR only to certain scenes.
Unlike contrast, HDR is a tool content producers use to enhance your visual experience in a specific way. There are no settings to adjust beyond turning on your display device’s HDR capability. This is actually a huge benefit, because once HDR is on, everything is pre-adjusted and there’s nothing else for you to worry about.
Another bonus is that the standard is simple to implement, with plenty of modestly priced panels like the Samsung UN43RU7100FXZA 43-Inch 4K (on Amazon) and the TCL 65″ Class 5-Series 4K (also on Amazon) each sporting HDR at very approachable prices.
The Different Versions of HDR
When shopping for a new HDR-compatible device, there is a lot to choose from. Because HDR only broadly calls for “darker darks” and “brighter brights,” several competing versions implement it with different parameters, and each has its own advantages.
The most common versions of HDR that you may encounter are:
- Dolby Vision
- HDR10 and HDR10+
- Hybrid Log-Gamma (HLG)
And unfortunately, the different versions of HDR aren’t compatible with each other. This means you have to make sure your display is compatible with your media source (streaming Netflix, linking to an Xbox, etc.).
For example, if your Blu-ray uses one form of HDR and your TV supports another, the TV won’t be able to read the Blu-ray’s HDR and will only show standard dynamic range (SDR).
However, as time goes by, it’s expected that most display devices will be compatible with multiple forms of HDR, and most media will eventually use multiple forms as well. For now, you have to choose, and each format has distinct advantages and disadvantages.
Dolby Vision

Right now, Dolby Vision has the best image quality. That’s because, unlike other standards, Dolby Vision uses ‘dynamic’ metadata: whereas other formats use static metadata that tells your display how to show the media over a whole movie or show, Dolby Vision allows that metadata to be adjusted between shots or even frame by frame.
Dolby Vision is a proprietary format, though, which means content producers and device manufacturers have to pay Dolby a licensing fee to use Dolby’s technology. As a result, fewer media and display devices are compatible with Dolby Vision.
HDR10 and HDR10 Plus
HDR10 is the preferred format for now if you want to watch as much HDR content as possible. HDR10 is an open format, meaning there are no licensing fees required in order to use it. While this means more content and more display devices available with HDR10, it also means that there is less control over its implementation.
With Dolby Vision, content producers and device manufacturers are held to a certain standard, so the media will have the same image quality from display device to display device. HDR10 has no such oversight, and as a result, different HDR10-compatible devices may display the same HDR content differently.
Importantly, HDR10 uses static metadata, which means scenes with especially dark darks or bright brights can end up muted, because the metadata can’t change these settings on the fly. HDR10+ addresses this by adding dynamic metadata to the standard, but it isn’t as widely adopted yet.
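The difference between static and dynamic metadata can be sketched with a toy tone-mapping example. This is a deliberate simplification for illustration, not any standard’s actual algorithm: imagine a 600-nit TV showing content graded to a 4,000-nit peak. With one static peak value for the whole film, a dark scene gets compressed by the same ratio as the brightest scene; per-scene metadata lets the dark scene keep its intended range.

```python
# Toy tone mapping: scale scene luminance into a display's peak brightness.
# A deliberate simplification -- real HDR tone mapping uses curves, not
# straight division -- but it shows why per-scene metadata helps.

DISPLAY_PEAK = 600.0   # nits the TV can actually produce
STATIC_PEAK = 4000.0   # single film-wide peak value (hypothetical)

def tone_map(pixel_nits, reference_peak):
    """Compress luminance so reference_peak lands at the display's peak."""
    scale = min(1.0, DISPLAY_PEAK / reference_peak)
    return pixel_nits * scale

# A candlelit scene whose brightest pixel is only 500 nits:
dark_scene_pixel = 400.0

# Static metadata: the whole film is squeezed by 4000 -> 600 (0.15x),
# so the 400-nit candle flame is crushed down to 60 nits.
static_result = tone_map(dark_scene_pixel, STATIC_PEAK)

# Dynamic metadata: this scene's own 500-nit peak fits the display,
# so the flame stays at its intended 400 nits.
dynamic_result = tone_map(dark_scene_pixel, 500.0)

print(round(static_result), round(dynamic_result))  # 60 400
```

The numbers are invented, but the effect is the real motivation for HDR10+ and Dolby Vision: one film-wide value forces every scene into the same compromise.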
Hybrid Log-Gamma (HLG)

Hybrid Log-Gamma, or HLG, is a newer HDR format with one distinct advantage over the other two: it’s backward-compatible. If you have HLG content but only an SDR display device, the content will still display with image quality as good as if you had bought it in standard dynamic range.
HLG was co-developed by the BBC and Japan’s NHK. For now, HLG is primarily intended for broadcast TV to avoid having to make multiple channels for the same content the way some do with high and standard definition broadcasts.
This backward compatibility is an obvious advantage, especially if you have multiple display devices but don’t intend to upgrade all of them. At the same time, though, what you gain in backward compatibility, you lose in range: HLG simply doesn’t cover the same range of darkness and brightness that Dolby Vision and HDR10 do.
The Verdict on the Different Versions of HDR
For now, HDR10 appears to be the most widely available standard. For this reason, you will be able to get the most out of your device if you go with an HDR10-compatible display.
Still, Dolby Vision is regarded as the superior standard because of the control Dolby has over its implementation. HLG is a good choice if you have multiple display devices in your home but only intend to update some of them.
It’s expected that in a few years, most display devices and media will be compatible with multiple formats, so it might make sense to hold off on jumping into HDR until the technology becomes more available and affordable. Given that much media and many devices have migrated over to HDR already, it might become universal sooner than you would think.
How Do I Use HDR?
Using HDR is fairly simple. All that’s required is HDR content, a compatible HDR display device (TV or projector), and HDR compatibility for all devices in between.
On TVs and consoles, HDR is delivered over HDMI, and you will need a Premium High Speed HDMI cable to get the full benefit. You will also need to make sure your devices have HDR turned on. After that, you should be good to go. Here are instructions for enabling HDR on some of our favorite devices.
Samsung TVs

Enabling HDR on a Samsung TV is fairly straightforward. Open your menu and go to “Picture.” From the Picture menu, select “Expert Settings.” From here, scroll down and enable the “HDR+ Mode” option, and restart your TV.
LG TVs

Click the “Home” button and scroll to “HDMI.” From here select “Settings,” and choose “Advanced” in the following menu. From here select “Picture,” then “HDMI ULTRA DEEP COLOR.” Turn this feature on, and restart your TV.
Sony TVs

Go to “Home,” then “Settings.” Select “External Inputs” and from there, “HDMI Signal Format.” Select the HDMI input your source device (such as an NVIDIA SHIELD) is connected to, and select “Enhanced Mode.” Your TV should reboot automatically.
Windows 10

Windows 10 comes with support for HDR10 media. To enable HDR, click the “Start” button (the one that looks like a window) at the bottom left-hand corner of the screen. From here, select “Settings,” then “System,” and “Display.”
If you have multiple display devices (such as a multi-monitor setup or a laptop hooked up to another display device), choose the HDR-compatible display under “Select and Rearrange Displays.”
From here, select “Windows HD Color Settings.” Under “Display Capabilities,” make sure it says “Yes” next to “Play HDR Games and Apps.” Below that, turn on “Play HDR Games and Apps” with the toggle switch.
Xbox One X
Like Windows 10, the Xbox One X comes compatible with HDR10 media. To enable this feature, press the Xbox button to open the Xbox guide. Scroll down to the “Settings” tab and select “All Settings.” From here, select “Display & Sound,” and then “Video Input.” Next, choose “Video Modes.” Make sure both the “Allow 4K” and “Allow HDR” checkboxes are checked.
Using a Premium High Speed HDMI cable, connect the console to an HDR-compatible HDMI port on your display device. If you route the signal through a receiver or switch box, that device must also support HDR pass-through; otherwise, connect the console directly to the display.
PlayStation 4

On the PS4, go to “Settings,” then “Sound and Screen,” then “Video Output Settings.” In the Video Output Settings menu, set the “HDR” and “Deep Colour” output settings to “Automatic.”
The PS4 also allows you to calibrate your HDR settings for a customized viewing experience. From the Video Output Settings menu, select “Adjust HDR” and follow the instructions given on-screen.
Are You Ready for HDR?
HDR is an exciting breakthrough technology. It provides a vivid viewing experience that makes the image look true to life. Deep blacks and bright whites make shadows, sun glare, and textures pop, and there are already plenty of devices with some form of HDR technology.
However, not all HDR-capable devices are compatible with all HDR media. Because of the different versions of HDR, choosing the right device for the content you want to watch is paramount.
In all likelihood, as HDR becomes more widely used, display devices and gaming consoles will come equipped to handle multiple HDR formats. Because of this, it may make sense to hold off on buying an HDR-capable device until devices supporting multiple formats become more ubiquitous and affordable.
Setting your device up to display HDR content is usually fairly simple and requires only a little work in your device’s settings. Be sure to consult your device’s manual before hooking it up, as some devices, such as the PlayStation 4, aren’t compatible with every HDR-capable display.
HDR is still a relatively new technology and will likely improve as time goes on. That said, there’s no reason you can’t get in on the action now and enjoy your favorite games, movies, and shows with stunning color ranges.