This is why you don't need a 4K HDR monitor (yet)


HDR is seemingly everywhere, and not just in those HDR photo modes on our cameras.

Sony recently launched the XPERIA XZ Premium – the world’s first 4K HDR smartphone. You can also buy 4K HDR TVs, Netflix will stream Marvel’s Iron Fist in 4K HDR during 2017, and PC games are starting to feature HDR modes.

You might know that HDR stands for High Dynamic Range imaging. But you might not know why you'd need it, or whether it's worth the upgrade.

HDR has its origins in photography. It's a technique designed to show off a greater range of luminance levels in an image, which it achieves by combining several different exposures of the same scene. The result boosts detail that's often lost in the dark and bright areas of a photo, jazzing up colours so that they appear more true to life.
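To make that concrete, here's a minimal sketch of exposure fusion using OpenCV's Mertens merger. The filenames are hypothetical placeholders for three bracketed shots of the same scene, and this isn't the exact pipeline any particular camera uses; it simply illustrates the idea of blending exposures to recover shadow and highlight detail.

```python
import cv2
import numpy as np

# Three bracketed exposures of the same scene (hypothetical filenames).
paths = ["under_exposed.jpg", "normal_exposed.jpg", "over_exposed.jpg"]
exposures = [cv2.imread(p) for p in paths]

# Mertens exposure fusion weights each pixel by contrast, saturation and
# well-exposedness, then blends the brackets into a single image that
# keeps detail in both shadows and highlights.
merge = cv2.createMergeMertens()
fused = merge.process(exposures)  # float result, roughly in the 0..1 range

# Scale back to 8-bit for saving or display.
cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype(np.uint8))
```

An HDR display takes the other half of the job: instead of squashing all that extra detail back into a standard-range picture, it can actually show the brighter highlights and deeper shadows directly.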

The idea of combining multiple images is older than you might think. Back in the 1850s, French photography pioneer Gustave Le Gray combined two negatives with different exposures to create atmospheric seascapes.

Deeper blacks, whiter whites

More than 160 years on, HDR has made its way to 4K TVs like the Samsung UE49KS8000 and the LG OLED55C6V. These high-tech screens are characterised by more vibrant colours, deeper blacks and brighter whites.

It's not just high-def tellies. 4K HDR monitors are starting to appear too. At CES 2017, LG showed off the 32-inch UHD 4K 32UD99, while Dell teased an ultrathin 27-inch S2718D display with XPS 13-style InfinityEdge bezels.

That said, HDR monitors are currently thin on the ground. But future displays will come equipped with the next generation of AMD’s FreeSync technology.

Radeon cards are already HDR-capable

“Radeon FreeSync 2 technology is the first of its kind that combines HDR support with dynamic refresh rate technology, and does it in a seamless, plug-and-play manner that improves gaming quality automatically when the right content is present,” said Scott Herkelman, vice president and general manager of gaming, Radeon Technologies Group, AMD.

The good news is that all FreeSync-compatible Radeon GPUs will support the new FreeSync 2 technology. This means that RX-based cards like the RX 560 or RX 580 are already HDR-capable; it's just a matter of time before the displays and games catch up.

The bad news is that there are still some significant HDR challenges to overcome. One is that few PC games currently support it. Shadow Warrior 2 by developer Flying Wild Hog was the first title with built-in, real-time HDR. Mass Effect: Andromeda will also cater for it at launch, while Unreal Engine 4.15 now bakes in experimental HDR support.

Another potential problem is that there isn’t just one HDR standard. There are four – HDR10, HLG, Dolby Vision and Advanced HDR by Technicolor. HDR10 and Dolby Vision, however, are the standards worth remembering. AMD’s Crimson ReLive software unlocked them in Radeon hardware last year.

PC gaming in 4K HDR is certainly coming, and technologies like AMD's FreeSync 2 and HDMI 2.1 will help speed it along. Right now, though, it's arguably too early (and too expensive) to invest in a 4K HDR monitor or an HDR-friendly TV. Take another look in 2018.

Dean Evans
Dean Evans is a long-time gamer and reviewer who built his first PC at the age of eight. He is powered by That Media Thing, a collective of journalists who believe in the power of passionate content.
