If you’re in the market for a new monitor or TV, you may have come across the acronym ‘HDR’, followed by various numbers. If you’re not sure what that means for the quality of your new monitor, then this article is for you!
What is HDR?
First off, let’s define HDR, which stands for High Dynamic Range. When a monitor has HDR, it changes the way luminance and color are represented in the video signal, allowing for improved contrast ratios and increased brightness overall. It also increases the darkness of shadows and the vibrancy of colors.
Basically, an HDR monitor will look markedly better than an SDR (standard dynamic range) monitor.
But, to reap all the benefits of an HDR monitor, you need HDR content that tells the monitor exactly which range and nuance of colors to display. Most games nowadays have HDR built in, and there are HDR options available on streaming services like Netflix and Prime Video.
What Do the Numbers Mean?
So now we know what HDR is, the next step is to understand the numbers that often follow HDR, such as HDR10 and HDR400.
HDR10 is an open, royalty-free HDR media standard, supported by virtually all HDR devices. You can think of it as a sort of 'bare minimum': any TV or PC monitor must support HDR10 to be considered HDR compatible at all.
The 10 refers to the 10-bit color depth that HDR monitors operate at, as opposed to the regular 8-bit standard used for SDR.

A 10-bit HDR monitor uses a full range of 1,024 shades per primary color channel (red, green, and blue), for over a billion possible colors, bringing a dynamic brightness and vibrancy that you just don't get with 8-bit monitors and their roughly 16.7 million colors. HDR10 content is also typically mastered for a peak brightness of 1,000 nits, which is useful to know in order to understand other 'HDR' classifications.
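The difference between 8-bit and 10-bit color comes down to simple powers of two. A back-of-the-envelope sketch (illustrative only, not part of any standard):

```python
# Shades per channel for a given bit depth: 2 ** bits
def shades_per_channel(bits: int) -> int:
    return 2 ** bits

# Total displayable colors: one value each for red, green, and blue
def total_colors(bits: int) -> int:
    return shades_per_channel(bits) ** 3

print(shades_per_channel(8), total_colors(8))    # 256 16777216  (~16.7 million colors)
print(shades_per_channel(10), total_colors(10))  # 1024 1073741824  (~1.07 billion colors)
```

Those extra shades are what let a 10-bit panel render smooth gradients in skies and shadows where an 8-bit panel shows visible banding.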
This, and the improved contrast that comes from dynamic dimming and brightness adjustment, combine to create a super-realistic image on the screen, much closer to what the human eye naturally sees.
HDR400 is a bit of a misnomer. Technically, HDR400 is a certified HDR classification, but in actual practical usage terms, you may be left somewhat disappointed when you plug in your new HDR400 monitor.
Unlike HDR10's 10, the 400 doesn't refer to color bit depth but to nits, i.e. peak brightness. HDR400 offers a peak brightness of 400 nits, but that peak may not be sustained, and it may not cover the whole screen. Compared with the 1,000-nit peak brightness that HDR10 content is typically mastered for, it starts to become clear which option is preferable.
For a true HDR experience, a minimum of around 600 nits of brightness is generally recommended. As HDR400 cannot offer that, you will only get an approximation of HDR.
While HDR10 is a regulated standard, HDR400 specs can vary from monitor to monitor. The plus side of HDR400 is that it is much more affordable than an HDR10 monitor. And, depending on what you're used to, and the brightness of the room you'll be viewing your monitor in, it may be good enough for what you need.
DisplayHDR400, 500, 600 and More
VESA (Video Electronics Standards Association) is an international non-profit group run by a board of directors, with members from companies such as Apple, Samsung, LG, HP, Microsoft, and many more.
VESA established a list of criteria that must be met in order for a monitor to be accredited as true HDR. They use the term ‘DisplayHDR___’ as a kind of certification that the specs of the monitor meet certain requirements.
For DisplayHDR400, for example, the following specs must be tested and proven to be working in the device:
- 8-bit image processing
- Global dimming to improve dynamic contrast
- Peak luminance of 400 cd/m² (nits)
- Minimum color gamut requirements exceed that of SDR
The same goes for DisplayHDR500:
- 10-bit image processing
- Local dimming
- Peak luminance of 500 cd/m² (nits)
- Notable increase in color gamut from DisplayHDR400
The requirements increase with each classification, all the way up to DisplayHDR1400, which is currently the highest level of DisplayHDR-regulated accreditation.
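The tier criteria above can be pictured as a small lookup table. A minimal sketch, filled in only with the two tiers detailed in this article (consult VESA's site for the full, authoritative requirements):

```python
# Minimum requirements per tier, taken from the lists above.
DISPLAYHDR_TIERS = {
    "DisplayHDR 400": {"bits": 8,  "dimming": "global", "peak_nits": 400},
    "DisplayHDR 500": {"bits": 10, "dimming": "local",  "peak_nits": 500},
}

def meets_tier(tier: str, bits: int, dimming: str, peak_nits: int) -> bool:
    """Check a monitor's specs against a tier's minimums.

    Assumes local dimming also satisfies a global-dimming requirement,
    since it is the stricter of the two.
    """
    req = DISPLAYHDR_TIERS[tier]
    dimming_ok = dimming == req["dimming"] or (
        req["dimming"] == "global" and dimming == "local"
    )
    return bits >= req["bits"] and dimming_ok and peak_nits >= req["peak_nits"]

print(meets_tier("DisplayHDR 400", bits=8, dimming="global", peak_nits=450))  # True
print(meets_tier("DisplayHDR 500", bits=8, dimming="local", peak_nits=500))   # False: needs 10-bit
```

The point of the sketch is that certification is pass/fail on every criterion at once: a bright 500-nit panel still fails DisplayHDR 500 if it only processes 8-bit color.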
You can visit VESA’s website for a full breakdown of their spec requirements to help you figure out if what you’re buying is actually certified HDR.
HDR10 vs HDR400
So now you can probably see that HDR10 and HDR400 aren't really comparable. HDR10 is the baseline signal and content standard; HDR400 describes (theoretically) an HDR display with a peak brightness of 400 nits.
Having said that, if all labeling were accurate and honest, a display that fully delivers on HDR10 content, with around 1,000 nits of peak brightness, would produce a better, brighter picture than an HDR400 display.
The world of HDR specifications and certifications can be very, very confusing. It’s totally understandable if you feel overwhelmed or out of your depth when trying to get your head around this.
And the industry knows that – that's partly why VESA introduced its certification system. It's far too easy for manufacturers to slap an 'HDR' sticker onto a monitor's or TV's spec sheet, knowing full well that the product might not actually meet any of the minimum requirements.
My advice is to not only look for 'HDR' or any variation of it, but also to check the other specifications to make sure they meet the standards you expect from your monitor.
The most important things to look out for are:

- A high nit count, of at least 600
- Some sort of dimming, either local or global, to provide that high level of contrast
- A 10-bit color depth
- Perhaps most importantly, a DisplayHDR logo, so you know the specifications meet VESA's requirements and you will actually get an HDR monitor or TV