Every few years, a new TV technology arrives, bringing fresh buzzwords for TV buyers to get their heads around.
In the past year, we’ve seen the likes of VRR and HLG enter the lexicon around today’s TVs – both of which have their own guides you can delve into. In this article, though, we’re taking a step back to address HDR, or high dynamic range – a TV technology that’s been widespread for several years now, with almost every television (bar the very cheapest) claiming to support it.
Let’s look at why HDR isn’t quite that simple, and why your current cheap TV might not be delivering the great HDR picture you bought it for.
The basics of backlight control
The main aim of HDR content is to increase the dynamic range of the image on-screen. This happens in two key areas: shadow detail and specular highlight detail.
That means more image information in the darkest and lightest areas of an image: grimmer stonework in dungeons and more realistic, dazzling sunsets – possibly even in the same frame.
In an ideal scenario a film is shot with HDR in mind, although plenty of HDR movies are tweaked from a non-HDR source.
To achieve a great HDR image, a TV needs to be able to genuinely widen the gap between the brightest and darkest points on the screen. That gap is HDR’s playground. Without it, enabling HDR can simply flatten the picture, making it look worse than it did before.
This is the reason local dimming is an essential part of any truly great HDR TV – and it’s absent from almost all cheap TVs.
Affordable LCD televisions are lit by a single, global backlight: a strip of LEDs along the edge of the display shines light into a diffuser layer that spreads it across the screen. Any dimming that occurs in the dark scenes of a movie therefore affects the entire screen.
Picture a scene where half of the picture is bright – say, a cloudy sky with the sun peeking in – with the other half in the dark, where the protagonist is hiding behind a rock, for example. The brightness required by the bright areas means that the dim areas can only get so dark.
LCDs use liquid crystals sandwiched between polarizers to block unwanted light from the backlight, but some will still seep through – and it’s this leakage that bursts your dynamic range bubble.
This is why you want local dimming, which comes in several forms – as Samsung’s QLED range demonstrates.
TVs from the Samsung Q70T and below don’t feature local dimming, which may be surprising given the Q70T isn’t exactly a cheap TV, costing £999 / $799 for a 55-inch size.
You need to opt for the Samsung Q80T and above to get a full-array LED backlight. This splits the backlight into zones of LEDs that can be dimmed or switched off independently, dramatically improving image contrast in scenes with both bright and dark areas.
The ‘compare’ tool on Samsung’s website suggests that all TVs, from the mid-spec Q80T up to the high-end Q950TS, are evenly matched in this area, coming with ‘Direct Full Array’ backlight control. But there’s more to it than that.
Larger, more expensive TVs tend to include more of these backlight zones. Samsung’s 55-inch Q80T has 50 local dimming zones, while the 75-inch Q950TS has 488.
The more dimming zones, and the more compact each one is, the better – as large backlight zones mean you get distracting (and contrast-sapping) light ‘halo’ effects around bright objects.
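As a rough back-of-the-envelope illustration of why zone count matters – assuming a standard 4K panel of 3,840 x 2,160 pixels, and treating the figures as ballpark estimates, since manufacturers don’t publish per-zone dimensions – here’s how many pixels end up sharing each dimming zone:

PIXELS_4K = 3840 * 2160  # 8,294,400 pixels on a standard 4K panel

for model, zones in [("55-inch Q80T", 50), ("75-inch Q950TS", 488)]:
    # Roughly how many pixels have to share a single backlight zone
    print(f"{model}: ~{PIXELS_4K // zones:,} pixels per dimming zone")

# 55-inch Q80T: ~165,888 pixels per dimming zone
# 75-inch Q950TS: ~16,996 pixels per dimming zone

With around 166,000 pixels sharing each zone, a single bright object forces a large patch of surrounding picture to brighten with it – which is exactly where those halos come from.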
OLED TVs are the masters of local dimming, because each pixel is its own light source. The 55-inch LG CX 4K OLED effectively has 8,294,400 zones – one per pixel in its 3,840 x 2,160 panel – although OLEDs don’t use this ‘zone’ terminology.
Fundamental limitations of cheap TVs
So is every model below the Q80T in Samsung’s range poor for HDR because it lacks local dimming? Not quite.
While there are technical limitations to a TV such as the Samsung Q70T, its actual HDR experience is buoyed by excellent native ANSI contrast of around 7,000:1 and a decent peak brightness of around 600 nits. The set’s contrast without local dimming is better than that of some TVs with local dimming.
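To see why that matters, here’s the black level those two figures imply – both are approximate review measurements, so treat the result as a ballpark rather than a spec:

peak_nits = 600.0        # quoted peak brightness
contrast_ratio = 7000.0  # quoted native ANSI contrast (7,000:1)

# Black level is simply peak brightness divided by contrast ratio
print(f"Implied black level: {peak_nits / contrast_ratio:.3f} nits")  # ~0.086 nits

A black level under a tenth of a nit is dark enough to give HDR highlights something to stand out against, even without zones switching off behind them.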
However, buy a truly cheap TV and you won’t get both deep blacks and high peak brightness.
Let’s take a closer look at one of the best budget TVs as an example: the Samsung TU7100. We’d recommend this set to just about anyone looking for a low-cost TV. It offers great contrast and rich images of the type you don’t often see at this price. However, with a low peak brightness of around 300 nits, it just doesn’t have the power to get close to the punch of a real HDR experience.
The Samsung TU7100 achieves this high contrast because of its VA panel. Many alternative TVs around this price use IPS LCD panels instead, and there you tend to see the opposite problem.
While such panels can often scrape together what you might call the base level of brightness required for HDR, when the backlight is maxed out for scenes with super-bright sections, the darkest areas will appear somewhat washed out. You end up with an ‘HDR’ picture that has relatively low dynamic range, which should be a contradiction in terms.
This is demonstrative of how HDR as a concept has been diluted to bring it to the mass market.
We see the same effect in computer monitors. Low-end models might be stamped with a VESA DisplayHDR 400 badge, but all this means is that they can reach 400 nits of brightness. That alone isn’t enough for a legit HDR experience.
HDR10, HDR10+ and Dolby Vision
If you’re in the market for a budget TV, you may as well ignore the HDR standards published on a manufacturer’s website. If your TV can’t meaningfully display the basic HDR10 format, there won’t be a notable visual difference between that and its dynamic counterparts, HDR10+ and Dolby Vision. It’s only on higher-end TVs that you’ll really get the benefit of them.
HDR10 has become the baseline HDR standard. This is your meat and potatoes HDR.
HDR10+ and Dolby Vision are rivals, both more advanced than HDR10 because they use dynamic metadata. This provides the TV’s processor with brightness information for each scene, so it can optimize its picture more precisely. HDR10, by contrast, uses static metadata: one set for an entire movie or episode of television.
Dynamic metadata often leads to slightly better specular highlights and shadow detail in Dolby Vision and HDR10+.
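To make that difference concrete, here’s a deliberately simplified sketch of why per-scene metadata helps. The numbers and the tone-mapping function are invented for illustration – they aren’t taken from either format’s actual specification:

TV_PEAK_NITS = 600  # hypothetical mid-range TV

def tone_map_gain(scene_peak_nits, tv_peak=TV_PEAK_NITS):
    # Compress the scene just enough for its brightest highlight to fit the TV
    return min(1.0, tv_peak / scene_peak_nits)

title_peak = 4000               # static metadata: one figure for the whole film
scene_peaks = [120, 850, 4000]  # dynamic metadata: a figure per scene

print(round(tone_map_gain(title_peak), 2))                # 0.15, applied everywhere
print([round(tone_map_gain(p), 2) for p in scene_peaks])  # [1.0, 0.71, 0.15]

With static metadata, even a dim 120-nit scene is squashed as if it might hit 4,000 nits; with per-scene metadata it’s left alone, and only the genuinely bright scenes are compressed.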
There are other differences, too, but most direct comparisons highlight that Dolby Vision encodes tend to be slightly darker than those of HDR10+.
What about color?
HDR fans out there may have noticed that we haven’t yet mentioned color. Yes, this is an important part of HDR – and it’s yet another area where the results depend heavily on your TV’s capabilities and calibration.
Good standard dynamic range footage is encoded with 8-bit color. That includes Netflix HD streams, standard (non-4K) Blu-rays, and DVDs. HDR10 and HDR10+ use 10-bit color, while Dolby Vision uses 12-bit color.
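The arithmetic behind those bit depths is simple enough – each extra bit doubles the number of shades available per color channel:

for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits:,} shades per color channel")

# 8-bit: 256 shades per color channel
# 10-bit: 1,024 shades per color channel
# 12-bit: 4,096 shades per color channel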
This bit depth determines the number of possible gradations between color tones, but not necessarily what people mean when they casually talk about “color depth” – that term usually refers to the richness of the boldest tones.
That element of color is determined by the color space, or “gamut”. Standard dynamic range streams use Rec.709, a color space extremely similar to sRGB – the traditional standard for computer monitors and printers.
HDR formats use Rec.2020. This incorporates a huge array of color tones deeper than those of Rec.709, particularly in the green and red wavelengths.
Even our favourite TVs don’t get close to filling this color gamut. The flagship LG CX OLED manages around 70-76% of Rec.2020 – and a cheap TV can hardly compete.
The Samsung TU7100, the TV we mentioned earlier that offers decent image quality but limited brightness for HDR, only aims for full coverage of sRGB/Rec.709. Rec.2020 doesn’t even come into the picture.
While it may be HDR-compatible, the Samsung TU7100 is unable to display the additional color tones that HDR unlocks.
This can lead to poor results if a TV isn’t particularly adept at handling HDR color that falls outside what it can actually display. Some sets exhibit color clipping in HDR, where the color tone flat-lines at the most saturated point the panel can reach. This may be visible in an image of a red flower, for example: subtle natural color textures become flat patches rendered in a single tone of red. It’s similar to overexposure white-out in photography, but with color.
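Here’s a deliberately simplified, single-channel sketch of the difference between hard clipping and a gentler roll-off. Real TVs tone-map across a full three-dimensional color gamut, so treat this as an illustration of the principle only – the functions and values are invented:

def hard_clip(value, panel_max=1.0):
    # Anything beyond the panel's reach flat-lines at the same tone
    return min(value, panel_max)

def soft_compress(value, panel_max=1.0, knee=0.8):
    # Gently squeeze out-of-range values instead, preserving some gradation
    if value <= knee:
        return value
    return knee + (panel_max - knee) * (1 - 1 / (1 + (value - knee)))

petals = [0.9, 1.0, 1.1, 1.2]  # red gradations, partly beyond the panel's range
print([hard_clip(v) for v in petals])                # [0.9, 1.0, 1.0, 1.0]
print([round(soft_compress(v), 3) for v in petals])  # [0.818, 0.833, 0.846, 0.857]

The clipped version turns three distinct shades of red into one flat patch – exactly the effect described above.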
Headroom for new heights
The top HDR formats even show up the very best consumer TVs in the world, such as the Panasonic HZ2000, LG CX OLED and Samsung Q95T. But this is largely because such formats have been made not only for the TVs of today, but for those of the future too.
If we break down HDR10+ and Dolby Vision to their basic bullet-point differences, we get the following:
HDR10+
- 10-bit color
- Mastering at up to 4,000 nits of brightness

Dolby Vision
- 12-bit color
- Mastering at up to 10,000 nits of brightness
The LG CX OLED manages around 700-750 nits of peak brightness, and that’s only when a small fraction of the screen is lit. This drops to below 150 nits when displaying a full screen of white. It supports 10-bit color, not 12-bit.
Samsung’s Q950TS offers much higher brightness, with 2,000 nits in Dynamic mode and 1,300 using the Movie preset. But its display panel isn’t even “true” 10-bit. It’s an 8-bit panel that uses a trick called frame rate control, or FRC, to bump this closer to 10-bit performance. This rapidly flicks each pixel between two adjacent shades over successive frames, so your eye averages them into the missing tonal gradations.
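Here’s a sketch of the general FRC idea – temporal dithering – rather than Samsung’s specific implementation, which isn’t publicly documented:

def frc_frames(target_10bit, num_frames=4):
    # An 8-bit panel only has levels 0-255, so a 10-bit target is approximated
    # by alternating between the two nearest 8-bit levels across several frames
    exact = target_10bit / 4              # e.g. 513 -> 128.25 on the 8-bit scale
    low, high = int(exact), min(int(exact) + 1, 255)
    frac = exact - low                    # share of frames showing the higher level
    return [high if i / num_frames < frac else low for i in range(num_frames)]

print(frc_frames(513))  # [129, 128, 128, 128] - averages out to roughly 128.25

Because the flicker happens far faster than the eye can track, the average comes across as an in-between shade the panel can’t natively show.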
What does this tell us? While home cinema fans often become annoyed by ‘fake’ HDR in a variety of flavours, the reality isn’t so clear-cut. And the HDR we venerate today may be considered fake HDR just a few years in the future.
For a great HDR experience, look for TVs that feature support for either HDR10+ or Dolby Vision, in addition to an OLED panel or an LCD one with good local dimming. Anything else will just let you down.
- What is VRR? Variable refresh rate explained