High Dynamic Range (HDR) is an exciting new technology that makes your viewing experience come to life. HDR captures a wider range between the darkest blacks and the brightest whites, allowing for lifelike reflections, sun glare, and shadows. HDR comes in two main varieties: HDR10, which is open source, and Dolby Vision, which is proprietary. Which kind of HDR you are using will determine what equipment you need.
Your receiver must be compatible with the version of HDR you are using. For HDR10, you will need an HDMI 2.0a connector. If you are using Dolby Vision or Hybrid Log-Gamma, your receiver must either be compatible out of the box, or the manufacturer may have a firmware update to make it compatible.
Eventually, most receivers should support all of these formats, but for now, the details here matter. Some receivers, like the Sony STRDH590 (on Amazon), support HDR10 but not Dolby Vision, while others, like the Pioneer VSX-LX303 (also on Amazon), support both.
Whether you’re building your first home theater from scratch (in which case our guide on the main components here might help), or looking to upgrade your existing setup, you’ll definitely want to be aware of what HDR is and how this standard should guide your decisions.
What Is HDR?
High Dynamic Range is meant to correct the limits of camera technology relative to the human eye. We have a more detailed guide on what HDR is and why it matters, but here are the basics of what HDR means in both the photo and video arenas.
Think of light as being on a ten-point spectrum, where one is the darkest black and ten is the lightest white. When we see with our eyes, we are able to see from two to nine. In other words, our eyes come pretty close to seeing all shades of light.
A camera, on the other hand, can only capture a smaller range of light. The length of the exposure, the time period over which the camera captures light, determines what range will be captured. And taking one shot isn't enough.
Still-picture cameras solve this with a technology also called HDR that is different from HDR used for home theater systems. Camera HDR solves the exposure problem by taking two pictures in rapid succession: one short-exposure, one long-exposure.
These two images, barring sudden motion, should be identical apart from the amount of light captured. The camera then combines the two exposures digitally, preserving both the full range of brightness and the detail that either exposure alone would lose.
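If you're curious how a camera might merge those two shots, the core idea can be sketched in a few lines of Python. This is a deliberately simplified illustration, not any camera maker's actual algorithm: real pipelines also align the frames and radiometrically scale the exposures, which is omitted here. Each pixel is simply weighted toward whichever shot captured it without clipping:

```python
import numpy as np

def fuse_exposures(short_exp, long_exp):
    """Merge a short and a long exposure of the same scene.

    Weights favor the long exposure in shadows (where the short
    exposure is dark and noisy) and the short exposure in highlights
    (where the long exposure clips). Inputs are floats in [0, 1].
    """
    # Weight each pixel by how far the long exposure is from clipping.
    w_long = 1.0 - long_exp    # near 1.0 in shadows, 0.0 where blown out
    w_short = 1.0 - w_long     # complementary weight
    return w_long * long_exp + w_short * short_exp

# Toy example: one row of pixels running from shadow to bright sky.
short = np.array([0.01, 0.10, 0.40, 0.80])   # underexposed, nothing clipped
long_ = np.array([0.08, 0.60, 0.95, 1.00])   # brightest pixel clipped at 1.0
fused = fuse_exposures(short, long_)
```

Note what happens at the clipped pixel: the long exposure's weight drops to zero there, so the fused image falls back entirely on the short exposure's highlight detail.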
Video HDR aims for a similar effect but is created in a different way. Motion picture cameras do not have the ability to add HDR. Instead, motion picture HDR is applied after filming as a post-production effect.
Because they were designed for standard dynamic range (SDR) media, most display devices are only designed to handle an SDR signal which may give a range of perhaps three to seven.
HDR expands the range of brightness and thus requires a display device that can handle a wider range. This lighting information is stored in the content as metadata, which is why your receiver needs to be able to decode and read it.
The Different Versions of HDR
The control the director has over the HDR applied to their film will depend on what version of HDR they are using. Although there are several different versions of HDR, two have come to dominate the field: HDR10 and Dolby Vision. Another standard, Hybrid Log-Gamma (HLG), is growing in popularity but is not as widely used as either of the other two right now.
At present, HDR10 is the most widely available HDR format. This is due in no small part to the fact that it is open source. This means that anyone may implement HDR10 into their equipment and media without having to pay a licensing fee. HDR10 uses static metadata: the director's HDR grade is set once and applied unchanged for the duration of the program you are viewing.
This is in contrast to Dolby Vision’s dynamic metadata. HDR works by transmitting metadata, information about how the image should be displayed, which tells the display how to expand the signal into HDR’s wider range of brightness and color.
Whereas HDR10’s metadata is static, unchanging for the duration of the content, Dolby Vision allows the director to apply different HDR modifications scene-by-scene or frame-by-frame.
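To picture the static-versus-dynamic difference, here's an illustrative Python sketch. The structures and field names below are hypothetical, chosen only to make the contrast concrete; they are not the actual HDR10 or Dolby Vision bitstream syntax:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StaticMetadata:
    """HDR10-style: a single record covers the entire program."""
    max_luminance_nits: int
    min_luminance_nits: float

@dataclass
class SceneMetadata:
    """Dolby Vision-style: a separate record can apply per scene."""
    scene_start_frame: int
    max_luminance_nits: int
    min_luminance_nits: float

# One grade for the whole movie (illustrative values).
hdr10_program = StaticMetadata(max_luminance_nits=1000,
                               min_luminance_nits=0.05)

# A different grade for each scene: a dim interior, then a bright exterior.
dolby_vision_program: List[SceneMetadata] = [
    SceneMetadata(scene_start_frame=0,
                  max_luminance_nits=400, min_luminance_nits=0.01),
    SceneMetadata(scene_start_frame=1200,
                  max_luminance_nits=1000, min_luminance_nits=0.05),
]
```

The practical upshot is that a Dolby Vision display can re-map its output as each scene's record arrives, while an HDR10 display must pick one mapping that compromises across the whole program.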
Unlike HDR10, Dolby Vision is proprietary, owned by Dolby Laboratories. This means that any content producer or device manufacturer that wants to employ Dolby Vision must pay Dolby Laboratories a royalty fee. While this means content producers and device manufacturers use Dolby Vision less often, it also means that there is a greater degree of quality control than there is with HDR10.
Hybrid Log-Gamma (HLG) is a third HDR technology. HLG was developed by the British Broadcasting Corporation (BBC) and the Japan Broadcasting Corporation (NHK) specifically for live TV broadcasts. Thus far, it has not been widely adopted by broadcasters despite the increasing availability of HLG-compatible devices.
Despite being developed by these two broadcasters rather than an open-source project, HLG technology is royalty-free like HDR10, though it is not open source.
All three formats have certain requirements in order to make them work correctly. When it comes to display devices, compatibility obviously means the capability to display brighter whites and darker blacks within certain specified standards. In transmission devices like receivers, however, compatibility requirements are less intuitive.
For HDR10, all that is required of a receiver is that it supports HDMI 2.0a or higher. The receiver does not have to have any HDR10-specific hardware or firmware apart from the HDMI 2.0a port. Any receiver with HDMI 2.0a or higher is compatible with HDR10, whether it advertises so or not.
Dolby Vision, on the other hand, does not hinge on a particular HDMI port standard. Instead, it has strict firmware requirements. This means that receivers with a range of HDMI port standards can work with Dolby Vision, either out of the box or via a firmware update issued by the manufacturer.
The third type of HDR, Hybrid Log-Gamma (HLG), is designed for TV broadcasts and is not currently in wide use. One benefit of HLG is that it is backward compatible: unlike HDR10 and Dolby Vision, HLG signals can be displayed on SDR devices. This lets broadcasters transmit a single HDR-compatible signal on one channel without having to maintain a separate HDR-specific channel.
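That backward compatibility comes from the shape of HLG's transfer curve itself. The lower portion of the curve is a simple square root, close to the conventional gamma curve SDR displays already expect, while the upper portion switches to a logarithm that squeezes in the extra highlight range. Here is the opto-electrical transfer function as published in the ITU-R BT.2100 specification, sketched in Python (display-side steps such as system gamma are omitted):

```python
import math

# Constants from ITU-R BT.2100 for the HLG transfer function.
A = 0.17883277
B = 1 - 4 * A                    # = 0.28466892
C = 0.5 - A * math.log(4 * A)    # = 0.55991073

def hlg_oetf(e):
    """Map normalized scene light e in [0, 1] to an HLG signal value.

    Below 1/12 the curve is a square root (gamma-like, so SDR displays
    render it acceptably); above 1/12 it becomes logarithmic,
    compressing the highlights that make up the HDR range.
    """
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C
```

The two segments meet smoothly at e = 1/12, where both evaluate to a signal level of 0.5, which is exactly why an SDR set can show the bulk of an HLG picture while an HDR set recovers the highlights above that point.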
Like HDR10, HLG is royalty-free; however, the relative lack of HLG content has made adoption of HLG technology slow. Like Dolby Vision, HLG compatibility is integrated via firmware, meaning receivers that did not ship with HLG support may have manufacturer firmware updates to add it.
Which Should You Get?
In all likelihood, as HDR technology becomes more widespread, more and more receivers will come compatible with both HDR10 and Dolby Vision (and HLG). While there’s no telling whether HLG will catch on, it’s clear that both HDR10 and Dolby Vision are being adopted by content producers to enhance their media.
If you want to upgrade to an HDR viewing experience, you’ll probably want to make sure your equipment is compatible with both HDR10 and Dolby Vision. But the good news is, the longer you wait, the harder it should be to find equipment that’s not compatible with both.