Last Updated: February 4, 2021
Imagine you are in a dark office somewhere, working on a computer. Finally, after several exhausting hours, it’s time for you to go home. You step out of the office, and the bright sunlight blinds your tired eyes. Fortunately, in about five minutes, they adjust to the light.
Still, what about our devices?
Can a camera or a TV adjust to the light and dark levels in a scene?
HDR makes that possible.
But what is HDR, exactly?
HDR is an acronym for high dynamic range, a method that has been in use for well over a century.
That’s right – almost 170 years ago (around the year 1850), Gustave Le Gray invented the HDR technique.
In simple words, the dynamic range represents the ratio of light to dark. Therefore, a high dynamic range increases the contrast ratio to create an image that’s closer to real life.
That, essentially, is the meaning of HDR.
Let’s delve deeper, shall we?
What Is HDR and How It Works
Back in 1850, Gustave Le Gray wanted to take photographs in which both the sky and the sea were visible in detail. Unfortunately, when he exposed for the sea, the clouds lost some of their features, and vice versa.
Ain’t that a bummer.
See, he didn’t have a powerful HDR camera to take the superb photograph his artistic soul longed for.
What he had, though, was an idea.
So he took two photos of the same scene – one focusing on the sky and one on the sea. Then he overlaid the two negatives and printed this:
Gustave Le Gray – Brig upon the Water. Credit: Public domain image.
This photograph is one of his many works, which marked the birth of HDR as a technique and later on as a technology.
Back when Gustave Le Gray made this photo, the focus was on contrast alone.
Later on, as cameras evolved, color was added to the mix. And for several years now, it has been incredibly easy to take an HDR photo with a smartphone.
The technique our cameras use today is the same one Gustave Le Gray used 170 years ago. Our phones capture at least three pictures of the same scene and overlay them to create a sharper image. It’s just much simpler now, thanks to the built-in HDR software in our smartphones.
So what does HDR mean in modern times?
In short – a better contrast ratio and more authentic colors.
The whole idea of HDR is to make images more like what your eyes see in the real world.
Before we continue, you have to know there’s a crucial distinction between HDR photography and HDR for displays.
To clarify each one, let’s review them in more detail.
What Is HDR in Photography and Why Does It Matter?
As mentioned before, using HDR allows your camera to take several pictures and combine them to create HDR images.
So how does HDR work step-by-step?
First, take a picture with your smartphone. When HDR is enabled, you’ll notice it takes more time to take the picture, compared to the same photo with HDR disabled.
That’s because your device is taking several pictures at different exposures in one burst. These exposures are spaced one “stop” apart – starting with a very dark image, each subsequent shot captures double the amount of light of the previous one. This technology is also known as HDRI, or high-dynamic-range imaging.
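The merging step can be sketched in a few lines of Python. This is only a toy illustration of the exposure-fusion idea – real phone pipelines also align frames, remove ghosting, and tone-map – and the weighting function here is a made-up example, not any vendor’s actual algorithm:

```python
import math

# Toy sketch of exposure fusion: merge bracketed shots by favoring
# well-exposed pixels. The Gaussian "well-exposedness" weight is an
# illustrative choice, not a real camera pipeline.

def well_exposed_weight(value, mid=0.5, sigma=0.2):
    """Weight a pixel (0.0-1.0) by how close it is to mid-gray."""
    return math.exp(-((value - mid) ** 2) / (2 * sigma ** 2))

def fuse_exposures(exposures):
    """exposures: list of images (lists of pixel values in 0.0-1.0),
    e.g. [dark_shot, normal_shot, bright_shot]."""
    fused = []
    for samples in zip(*exposures):  # same pixel across all shots
        weights = [well_exposed_weight(s) for s in samples]
        total = sum(weights) or 1e-9
        fused.append(sum(w * s for w, s in zip(weights, samples)) / total)
    return fused

# Three one-row "photos": underexposed, normal, overexposed
dark   = [0.02, 0.10, 0.45]
normal = [0.10, 0.50, 0.95]
bright = [0.40, 0.90, 1.00]
result = fuse_exposures([dark, normal, bright])
```

Each output pixel leans toward whichever shot exposed it best, which is why the fused image keeps detail in both shadows and highlights.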
Thanks to HDR, the details in the darker places are better visible, and the contrast between light and dark makes the photo more lifelike and vivid. Like this one:
Although you can make breathtaking photos using HDR, bear in mind that it’s not a universal recipe for mind-blowing images.
Do’s and Don’ts in HDR Photography
HDR can make your photo a masterpiece or a piece of you-know-what.
So using HDR might be a good idea if you’re shooting one of these:
- Landscape scenes
Usually, there is a big difference in contrast between the sky and the land or sea (the problem Gustave Le Gray had). In such cases, shooting several pictures and combining them can fix the issue and reveal more details in both the brighter and the darker parts of a scene.
- Low-light scenes
If there isn’t enough light available, you can enhance the image by using HDR. This will reveal more details in the dark and the light spectrum alike. This technique doesn’t just increase the brightness levels of the entire photo. Instead, it combines the darker parts with the lighter ones, thus creating a beautiful lifelike effect.
- Overly bright scenes
You know how sometimes you take a picture of someone and the sun shines directly in their face, creating unwanted shadows?
Or how the glare from a car’s windshield can cripple an otherwise beautiful photo? HDR can fix this by reducing the white levels to balance the overall image.
These images are astonishing, but without HDR, they would hardly take your breath away.
Nonetheless, there are several situations when you shouldn’t enable HDR.
- Scenes with moving objects
Since you already know what HDR is and how it works, it’s easy to imagine what you’ll get from overlaying several captures of moving objects: ghosting, where anything that moved between shots appears as a blurry, semi-transparent duplicate.
- Scenes with intense colors
If you are shooting a scene that already has vivid colors, using HDR may mute them.
- Scenes with intended high contrast
Imagine you want to take a photo of a silhouette. Enabling HDR wouldn’t be a good idea since it will try to reveal more of the object’s features.
You can’t create such a photo with HDR enabled:
Well, that concludes our brief HDR photography tutorial – you now know how high dynamic range can make or break a stunning image.
Let’s turn our attention to a technology that, unlike the photographic technique, hasn’t been with us for 170 years. Specifically – HDR TV.
What Is HDR TV and Is It Worth It?
We already know the basics of HDR, thanks to the previous paragraphs. Be that as it may, the focus so far has been on capturing HDR images. Now let’s find out about HDR video.
HDR for TV emerged in the last few years, and the big manufacturers quickly adopted the technology for their newer models.
Simply put – it looks magnificent.
In a video, the high dynamic range expands the contrast between light and dark, thus adding depth to the scene.
And since the contrast is half of what makes HDR TV look great, let’s delve a little deeper.
The contrast in a video refers to the difference between the brightest and darkest parts of a scene. HDR, in essence, improves the whitest whites and the darkest blacks. It also reveals more details in between.
Light levels are measured in a unit called a nit. One nit equals the light of one candle spread over one square meter (a candela per square meter). For a sense of scale, a sunlit outdoor scene can reach around 100,000 nits.
For reference, the average FullHD TV can achieve up to 500 nits of brightness.
We don’t particularly care about FullHD TVs in this article, so let’s see how the big boys handle nits.
The darkest black is zero nits. So far, only OLED TVs are capable of achieving perfect black (more on that in a bit). In terms of brightness, though, they can’t produce as many nits as an LED TV.
So, to be graded HDR-compatible, a TV must be able to display bright scenes at a peak of at least 1,000 nits for LED and 540 nits for OLED models. In fact, some top-notch TVs can display up to 2,000 nits.
So, there are two factors, in terms of contrast, that earn a TV the HDR label in a store.
The king of pop revealed both of them almost three decades ago – it’s Black or White. Except in this case – it’s both.
Since we’ve already covered the whites – how about we enter the darkness just for a while?
You’ve probably noticed OLED TVs have a lower barrier to entry for the HDR sticker (540 vs. 1,000 nits for LED).
That’s because they can display absolute black, which makes the required ratio between white and black much easier to achieve.
See, these TVs can turn their pixels off entirely, producing perfect black. LED TVs can’t, so they have to compensate for the lack of pitch-black pixels with higher brightness.
Bottom line – LED and OLED TVs achieve the required contrast ratio in different ways.
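A quick back-of-the-envelope illustration of why perfect black matters. The black-level figures below are illustrative, not measured specs of any particular panel:

```python
# Contrast ratio = peak brightness / black level.
# The 0.05-nit LED black level is a made-up illustrative figure.

def contrast_ratio(peak_nits, black_nits):
    if black_nits == 0:
        return float("inf")  # pixels fully off: effectively infinite contrast
    return peak_nits / black_nits

led  = contrast_ratio(1000, 0.05)  # bright LED panel, imperfect black -> 20,000:1
oled = contrast_ratio(540, 0.0)    # dimmer OLED, but true black -> unbeatable
```

Even at roughly half the peak brightness, the OLED wins on contrast, which is exactly why the certification bar for its peak brightness is lower.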
That said, the great thing about HDR technology is that it doesn’t expand just contrast.
The good news is HDR technology often comes hand in hand with WCG (Wide Color Gamut), which increases the number of color shades a TV can produce.
Here’s what the color gamut looks like – every HDR TV can display the UHDTV colors marked on this image:
CIExy1931.svg: Sakurambo derivative work: GrandDrake [CC BY-SA 3.0]
The average FullHD TV can produce 8-bit color, which equals around 16.7 million colors. Although that sounds impressive, an HDR TV can process 10-bit color, also known as deep color. In theory, such a TV can produce over a billion colors.
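Those color counts come from simple arithmetic: each pixel has three channels (red, green, blue), and each channel can take 2^bits values:

```python
# Colors a panel can encode = (2 ** bits per channel) for each of R, G, B.
def color_count(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

eight_bit  = color_count(8)   # 16,777,216 (~16.7 million) - FullHD/SDR
ten_bit    = color_count(10)  # 1,073,741,824 (over a billion) - HDR10
twelve_bit = color_count(12)  # ~68.7 billion - Dolby Vision's ceiling
```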
The result you’ll get when you combine the increased contrast ratio with the expanded color range is breathtaking.
So, if the time has come to change your TV, you should consider a model that supports HDR. It’s not just the new fad like 3D used to be. The difference between a regular 4K and a 4K HDR TV is undeniable, even to the naked eye.
Many people assume 4K means HDR. That’s not always the case. To make things absolutely clear, we must answer a simple question.
What Is 4K?
The 4K standard refers to the TV’s resolution – or to simplify it even more – the number of pixels on the screen.
4K TVs have 3,840 pixels horizontally and 2,160 vertically, and that number stays the same regardless of screen size – commonly written as a 3840×2160 resolution.
For comparison – 4K TVs have four times higher pixel count than FullHD TVs (1080p).
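The “four times” claim is easy to verify with basic arithmetic:

```python
# Total pixel counts behind the "four times" comparison.
full_hd = 1920 * 1080   # 2,073,600 pixels (1080p)
four_k  = 3840 * 2160   # 8,294,400 pixels (2160p)
ratio = four_k / full_hd  # doubling both dimensions quadruples the pixels
```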
In short, 4K stands for the number of pixels, whereas HDR refers to quality.
Today most HDR TVs are at least 4K (or 8K and above), but not all 4K TVs are HDR compatible.
So if you’re deciding between a regular 4K TV and an HDR one, you may be asking yourself: “Is HDR worth it?”
The answer is a resounding yes!
Many brands label their 4K TVs HDR compatible. That doesn’t mean they can reach the contrast and color levels we mentioned.
To be sure a TV can produce HDR content, look for the ULTRA HD Premium logo. It looks like this:
Although a big screen helps appreciate the HDR experience, some phones can also play HDR.
Smartphone manufacturers also adopted the HDR technology to improve their clients’ viewing experience. The list of HDR phones grows day by day, and there are more than 50 models on the market today. LG phones also support the Dolby Vision format (more on that in a bit.)
So where can you find HDR content for your phone?
The most popular streaming services, like YouTube, Amazon Video, and Netflix, offer HDR content for mobile.
Although HDR phones produce an incredible picture, they can’t compare to the images a TV can provide.
To take full advantage of this technology, TV manufacturers use different HDR formats to enhance their models.
HDR Formats Explained
As of today, HDR TVs use one or more of five HDR formats:
- HDR10
- HDR10+
- Dolby Vision
- HLG
- Technicolor HDR
The main difference between them is the way each one uses metadata (or doesn’t use it at all).
So What Is Metadata?
In short, metadata is the extra information that travels alongside an HDR video and tells the TV how to display it. There are two types – dynamic and static metadata.
- Dynamic metadata can adjust the high dynamic range on the fly – scene-by-scene.
- Static metadata doesn’t change, which can result in a loss of details in different scenes.
Think of it as a lamp. With dynamic metadata, you turn it on at night and off during the day, adapting to conditions. With static metadata, you pick one setting at the start and leave it there.
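Here’s a rough sketch of that difference in Python. The tone-mapping function and the brightness numbers are invented for illustration – real HDR metadata (such as HDR10’s static mastering-display values) is considerably more involved:

```python
# Illustrative only: how static vs. dynamic metadata guide tone mapping.

def tone_map(scene_peak_nits, target_peak_nits):
    """Scale factor the TV applies to fit content into its own range."""
    return min(1.0, target_peak_nits / scene_peak_nits)

scenes = [4000, 800, 1500]  # mastered peak brightness per scene (made up)
tv_peak = 1000              # what this hypothetical TV can display

# Static metadata (HDR10): one value for the whole movie, chosen for the
# brightest scene - dimmer scenes get squeezed more than necessary.
static_factor = tone_map(max(scenes), tv_peak)
static = [static_factor] * len(scenes)

# Dynamic metadata (HDR10+/Dolby Vision): re-tuned for every scene.
dynamic = [tone_map(peak, tv_peak) for peak in scenes]
```

With static metadata, the 800-nit scene is dimmed by the same factor as the 4,000-nit one even though the TV could have shown it untouched; dynamic metadata fixes exactly that.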
Now we get to see the stars of the show and how they differ from each other.
HDR10
This is the most widely adopted standard. It’s free to use, and almost every brand supports HDR10. It uses static metadata and offers 10-bit color depth. This format sends metadata once, at the beginning of the video, and the TV uses the same settings throughout.
LG uses its own Active HDR, which adds dynamic metadata on top of the typical HDR10 format. Active HDR isn’t precisely an HDR format – just a display-side implementation.
HDR10+
Created by Samsung, 20th Century Fox, and Panasonic, this format adds dynamic metadata to the HDR10 standard. HDR10+ can adjust the brightness levels for each scene or frame in real time, making the video look the way its director intended. Like HDR10, it offers 10-bit color depth and is an open standard – manufacturers and content creators don’t have to pay for a license to use it.
Dolby Vision
Dolby Vision is by far the most technologically advanced standard. It uses dynamic metadata and provides 12-bit color depth. It adjusts to every screen and optimizes videos frame by frame.
In theory, Dolby Vision can produce content with up to 10,000 nits of peak brightness.
The downside of this standard is that manufacturers have to pay Dolby to use their format.
What’s more, there still aren’t many models that can take advantage of Dolby Vision’s possibilities.
Anyway, if we compare Dolby Vision vs. HDR10, Dolby wins in every category except price – HDR10 is free.
HLG – Hybrid Log-Gamma
If you didn’t have enough acronyms in this article – here’s another one – HLG.
The BBC and Japan’s NHK created a format that broadcasters can use to transmit HDR content.
It doesn’t use metadata at all, which makes it compatible with any 4K TV, whether HDR or SDR (standard dynamic range). In spite of this, it can still produce broader brightness and color levels.
The standard is backward-compatible, meaning an SDR TV can receive an HLG signal and simply display it as SDR.
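HLG earns its name from its transfer curve: the dark half behaves like a conventional gamma curve (which is why SDR sets display something sensible), while the bright half is logarithmic to squeeze in highlights. Here is a sketch of the HLG opto-electrical transfer function (OETF) as specified in ITU-R BT.2100:

```python
import math

# HLG OETF per ITU-R BT.2100: scene light E in [0,1] -> signal E' in [0,1].
# Below 1/12, a square-root (gamma-like) curve; above it, a log curve.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e):
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

mid = hlg_oetf(1 / 12)  # branch point: both halves meet at 0.5
top = hlg_oetf(1.0)     # peak scene light maps to 1.0
```

The constants are chosen so the two branches join smoothly at the switch point and the full signal range is used – that continuity is what lets one signal serve both SDR and HDR screens.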
It’s free to use by manufacturers, but the downside is there isn’t a lot of content yet. Most TV stations aren’t broadcasting 4K content either.
Fortunately for BBC, most newer TV models have adopted the HLG format, so we may expect broadcasting companies to embrace it as well.
Technicolor HDR
What sets Technicolor HDR apart from the other formats is that it can upscale SDR content to HDR.
In simple words – if you own a TV that supports Technicolor HDR, you’ll get a better picture when watching SDR content as well, because the signal is converted to suit your TV’s specifications. In other words, you’ll get the most out of your TV’s capabilities no matter the content’s original format.
The Technicolor HDR technology is royalty-free for TV manufacturers and broadcasting/streaming companies.
Still, all these new TV models and formats wouldn’t mean a thing, if there wasn’t enough content to back their existence.
HDR Content – Where Can You Find HDR Videos?
Owning a 4K HDR TV doesn’t make much sense if you can’t watch HDR videos, right?
There are two ways to watch them – via streaming platforms or using hardware players.
Here’s where and how you can watch HDR content:
Some of the mainstream streaming platforms provide HDR content – the differences between them are the number of movies/shows and the supported formats.
Netflix
The largest streaming service supports the mainstream HDR10 format and Dolby Vision. If your internet connection is 25 Mbps or faster, you can enjoy Netflix’s HDR titles.
However, if you have a Samsung TV, you won’t be able to watch movies with dynamic metadata. The reason: Netflix doesn’t support HDR10+, and Samsung doesn’t support Dolby Vision. So on a Samsung TV, you can only watch standard HDR10 shows and movies.
Amazon Prime Video
While Amazon counts on HDR for its “Prime Original” productions, they limit their HDR content to only three manufacturers. You can watch HDR titles only if you have a Samsung, Sony, or LG TV.
Titles in the Prime Video collection are available in HDR10, HDR10+, and Dolby Vision.
iTunes
It’s no surprise that Apple’s streaming service doesn’t support Samsung’s HDR10+, given their well-known rivalry.
Apple users can enjoy just the standard HDR10 and Dolby Vision movies.
A nice touch by Apple is that if you’ve already purchased a FullHD title, once you get a new 4K HDR TV, you can watch it in HDR without buying the same movie twice.
You can even find a detailed list of 4K HDR titles available on iTunes.
These were the most popular streaming platforms which offer titles in HDR. FYI Hulu, HBO Now, and Sling TV don’t stream in 4K HDR at all.
Anyway, if you want a hard copy of a movie and watch it with HDR, here are the players that support 4K HDR.
Devices Supporting HDR
If you are a fan of discs and want to have a collection of the coolest movies, you can identify if they are HDR easily. Each case has an HDR logo on the front.
Ultra HD Blu-Ray
Ultra HD Blu-ray players have been becoming the new standard over the last couple of years, and more and more movie titles come with high dynamic range. Even some older movies are being enhanced with HDR technology – one example is The Matrix (1999), which returned at the end of 2018 with HDR and Dolby Vision.
And we all know who else comes back. The Terminator was released once again in 2018, this time – in HDR.
Just like with streaming services, different Ultra HD Blu-ray players support different formats, depending on the manufacturer. Nearly all new models support HDR10. Dolby Vision is the second most popular format and is gaining on the current leader.
Samsung’s players, naturally, come with HDR10+. Recently the Korean giant announced they teamed up with Universal to release new HDR10+ content in 2019.
Now, let’s have a look at the gamers and the different abilities of the game consoles.
PlayStation 4 Pro and PlayStation 5
The PS4 Pro has had built-in HDR support since launch. However, it doesn’t have a 4K Ultra HD Blu-ray drive, so it can’t play HDR movies from disc on your TV.
The games, though, are HDR-ready. There are several titles with the high dynamic range available so far (here’s a list). The list will grow much larger as HDR eventually becomes the standard in the gaming industry. So, rest assured, we’ll see more HDR games in 2022 and the years to come.
The new PS5 also supports HDR, and the new console comes with improved support, considering PS4 had some issues.
Xbox One X, Xbox One S, Xbox Series X
The PlayStation’s rival offers a built-in Ultra HD Blu-ray drive, so you can watch your favorite 4K HDR Blu-ray discs on your TV. The funny thing is that it was Sony that created the Blu-ray technology in the first place (and, as mentioned, didn’t put a drive in its own console).
Anyway, the Xbox One S and Xbox One X both offer HDR10 for video. The latter even supports native 4K HDR gaming, while the Xbox One S upscales its graphics to 4K.
So, that’s all in terms of devices.
The time has come for the high dynamic recap (HDR).
Phew, that was quite a long journey.
We started at the dock of HDR photography. Then we boarded Gustave Le Gray’s ship, which took us into the future. It looked so bright and colorful that we decided to stay there for a while.
We saw vivid colors and discovered the vital importance of contrast ratio and metadata, which introduced us to the different HDR formats – and to how each one is implemented on different devices.
In general – we covered the basics of HDR in all its forms. To top it off – there is a FAQ section below.
If you liked this piece – share it, or express your opinion in the comments below.
See you next time, when there’s yet another technological wonder to explore.
Many people wonder about this – HDR vs. 4K, which is better?
HDR is undoubtedly better than a simple 4K TV.
The majority of HDR TVs are 4K anyway, but not all 4K TVs are HDR-compatible.
4K stands for quantity (the number of pixels), while HDR refers to quality.
So it’s not a matter of choosing between HDR and 4K – you can (and ideally should) have both.
A 4K HDR TV is simply the optimal choice.
That is, of course, if your current TV doesn’t live up to your standards. If you casually watch the news and some show from time to time, 1080p might be enough.
However, if you want the best experience a TV can give you in 2021, go with a model that supports 4K HDR. (or better yet, 8K HDR, but this will be a talk for another time.)
To put it simply, go with a TV that has “Ultra HD Premium” printed on the box.
Many brands label their TVs as HDR compatible, but the label above is the only thing that can guarantee it.
HDR TV was the buzzword of 2019 in TV technologies. There are two main features that distinguish it from the average 4K TV.
While 4K provides a better viewing experience than 1080p, HDR uses those pixels to reveal more colors – a TV must be able to produce at least one billion colors to make full use of HDR content.
The increased contrast between the brightest and darkest parts of a scene is what makes the images “pop.”
SDR (standard dynamic range) TVs produce up to 500 nits, while the HDR standard requires at least 1,000 nits for LCD/LED TVs and at least 540 for OLED. OLED’s requirement is lower because it compensates with deeper blacks.
Does HDR Make a Difference?
Yes, it does.
If you compare HDR vs. non-HDR TVs using the same source, you won’t have to wonder which one has HDR.
Thanks to HDR, more details pop in the picture, especially when the scene has high contrast.
This is especially convenient in horror movies, where scenes are more visible thanks to HDR.
So now that you know what HDR is, and why it makes a difference, there’s only one question left.
Is UHD the Same as HDR?
No, it isn’t.
UltraHD and 4K are the same thing, while HDR is something fundamentally different.
4K TV has four times more pixels than FullHD TVs (1080p).
HDR doesn’t care how many pixels it has to work with – it just takes advantage of their capabilities.
So, no, they’re not the same. UHD refers to resolution, while HDR refers to contrast and colors.
Can You Get HDR on 1080p?
Now, this is a tricky question.
1080p refers to the resolution (a.k.a. the number of pixels).
In simpler words, HDR deals with what these pixels can do.
So basically, HDR content can run on a 1080p TV, but it’s arguably not worth it – FullHD TVs predate HDR and can’t display its extra range.
To receive the full viewing experience from HDR, you need at least a 4K UHD Premium TV.
And it looks fantastic. You can’t get that kind of picture in 1080p (FullHD).
4K HDR looks much better than 1080p.
Scratch that – it looks breathtakingly beautiful.
HDR content is far from abundant, and broadcasting groups still use HD or FullHD. So if you are happy with 1080p, don’t rush to the store to buy a 4K HDR TV just yet. Their price is still in the higher echelon, even though it’s slowly dropping.
Well, dear reader, that’s entirely up to you. We only supply the facts, based on which you can make an informed decision.