by Roger N. Clark
There is a revolution coming to photography, and it has already arrived for video. Photographers often talk about dynamic range and seek cameras with "high" dynamic range. They then post-process that high dynamic range into a lower dynamic range for display or print. But the real world IS high dynamic range, even in a typical indoor scene. New display technology can present images and video with a more natural-looking dynamic range, and new standards have been developed to take advantage of that dynamic range now and into the future. These new forward-looking standards and technologies are, to put it mildly, jaw dropping, knock your socks off, and game changing. In this article, I'll explain the standards and technology, show why they are game changing, and describe where the technology stands for photography and video. Once you actually see this technology, images and video on our current low dynamic range displays and prints will look flat, lifeless, and limited in color.
High Dynamic Range (HDR)
Color range (gamut)
Sound: Dolby Atmos and DTS:X
Data rates and streaming video vs blu-ray disks
More Technical details
Discussion and Conclusions
References and Further Reading
A new stunning revolution in color imaging and video is underway. There have been a number of revolutions in photography and videography over the years, for example, the change from film to digital cameras, or the change from CRT screens to LCD flat screens. The next revolution will also be profound, and it involves better color with more dynamic range. It is amazing technology, as big as the change from black and white to color in my opinion. And it is here. If you are in the market for a new TV, read this article and the next one before purchasing.
When I first began to investigate this new technology, circa 2017, I did not predict it would have such an impact. After all, it seemed current (at the time) computer displays and TVs had plenty of dynamic range. I was stunned when I saw the first high dynamic range 4K movie on a monitor that actually delivered true high dynamic range. The color and realism were far beyond anything I had ever experienced in computer or TV displays. Note: many so-called 4K HDR TVs are not actually high dynamic range. There is currently only one technology that really shines in this regard, and that is Organic Light Emitting Diode, OLED.
In writing this, I can only describe the effects, because unless you view the new technology yourself, it is somewhat like trying to describe the advent of color TVs when you are only viewing with a black and white monitor. Sure, we see in color, so can understand color versus black and white. We also see in high dynamic range, so perhaps I can try and describe the impact of the new technology.
There are 5 factors, each making a profound change in image and sound quality, in order (in my opinion) of impact:
HDR: High Dynamic Range. New technology, currently OLED is by far above other technologies.
SDR: Standard Dynamic Range. This includes most current LCD/LED flat panel computer monitors and TVs, as well as all print.
The human eye can see detail over about 14 stops, but we perceive brightness over a much larger range. The extreme example is the sun: we can perceive detail in bright sunlit snow as well as in shade cast by a tree in the scene, and while the snow can appear uncomfortably bright, we also see the sun as much, much brighter than the snow. The same goes for sun glint off water or metal--we see it as very bright, but photos showing the scene on an SDR monitor or print can't show that brightness. Realism is lost. But HDR displays can get us closer to that reality. The difference is profound. In fact, the increased dynamic range can give the impression of 3 dimensions (3D). I have observed this effect myself, and others have commented on it too.
There are multiple types of TVs and computer monitors, including those claiming to be HDR because they can accept an HDR formatted signal. But many aren't really HDR, having limited dynamic range (often less than 1000:1, or 10 stops). The current top end HDR is Organic Light Emitting Diode, OLED, which has a dynamic range over one million! That adds reality and 3-dimensionality to scenes close to the real view, and it has to be seen to really understand the impact. When this was first described to me, I thought, Oh, OK, that's nice. I had to experience it first hand to understand the truly jaw dropping effect HDR high color gamut OLED displays can show. I have also reviewed and purchased other technologies, and OLED is far above other technologies as of this writing. I'll describe these differences below.
The high dynamic range on an OLED TV makes scenes much more realistic, from glint off water, to lights in a room, to backlit hair, to a fire. On a standard LED TV, 4K or not, a fire looks like a flat dull orange blob. But on a high dynamic range display, the fire looks bright orange, just like looking at a real fire with detail in the bright dancing flames. You can see detail in the flames and they appear very bright. Viewed in a dark room, the light from the fire flickers off the walls, just like a real fire. The blacker blacks on OLED add so much contrast at the low end, it makes many scenes look 3-dimensional. I did not realize the difference until I saw it. A good 4K HDR movie on OLED is simply stunning. Photos can be this way too.
Other HDR display technologies use what is called local dimming, where the back light is dimmed for darker areas of a scene. But because local dimming involves regional dimming over many pixels, a bright object surrounded by dark areas will show halos. On cheaper TVs/monitors, there are fewer dimming zones and the halo around a bright object may look square! OLED avoids this because each pixel can be the full brightness range from pure black to as bright as the pixel can get. That enables edge contrast, pixel to pixel, unmatched by any other technology, and it is that edge contrast that makes the difference.
The blacker blacks of OLED have another effect. The color range (gamut) of a standard monitor decreases as intensity drops, so at the low end there is little to no color at all. For example, a typical modern LED computer display that advertises close to 100% Adobe RGB color gamut may fall below sRGB at 3 stops below maximum brightness! This loss of color gamut is not reported by any review site I have seen. It also does not match how we perceive a real scene: we do not see a scene turning gray in its darker parts. Another problem is that dark areas on a standard monitor/TV are gray, not black. OLED, with its blacker blacks due to its higher dynamic range, shows colors in dark scenes like we see in reality. It is a stunning difference that I had no idea would be so profound, because we have been conditioned by the current low dynamic range technology for decades.
Put all of the above together and the change in computer/TV display impact is jaw dropping. (I can't say jaw-dropping enough in this article!) This revolution has been happening over the last several years and is now quite mature in new TVs and all but low end computers (laptops are still behind). The internet, as I write this, seems to focus on resolution (4K, 8K), and the video processing tutorials mainly focus on 10-bit color as reducing noise. In my opinion, these ideas completely miss the point. HDR output requires 10 or 12 bits per color to be effective, along with the Rec 2100 standard. That is the game changer, not simply resolution and noise. Given images with excellent light, composition and subject, I would prefer 2K content in HDR to 4K content in SDR. For movies, I look for 1) great story, acting, and cinematography, 2) HDR, and 3) sound in Dolby Atmos or DTS:X. Resolution is 4th, and of course having the first 3 plus high resolution (4K, 8K) just knocks your socks off so far you may never find them!
All this new technology is still working to meet the forward looking standards. The problem for electronics is data volume and data rates, and that is taxing current technology. In order to deliver the data, compression is used. Data rate is a problem even on your local computer when trying to deliver high resolution 4K and high dynamic range content to your computer monitor. I will review the current status and try to keep this article up to date as the technology changes. Fortunately, the technology has come a long way in the last couple of years, and prices have started to become more reasonable for high end imaging.
The new technology requires new hardware and software. HDR images are 10 or 12 bits per color with specific coding for intensity. Current image and video (SDR) is 8 bits per color with no specific coding for intensity. The change from 8 to 10 or 12 bits per color requires new hardware (e.g. new computer graphics boards, and new TVs), as well as new software. Most TVs on the market these days have this new capability to process high dynamic range video, but only OLED has the true blacks, pixel to pixel, that show the astonishing impact.
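The jump from 8 to 10 or 12 bits per color is easy to quantify; a quick sketch of the arithmetic:

```python
# Tonal levels per color channel and total displayable colors for
# SDR (8-bit) vs HDR (10- and 12-bit) encodings.
for bits in (8, 10, 12):
    levels = 2 ** bits       # steps per color channel
    colors = levels ** 3     # all R, G, B combinations
    print(f"{bits}-bit: {levels} levels/channel, {colors:,} total colors")
```

This prints 256 levels (about 16.8 million colors) for 8-bit, 1024 levels (about 1.07 billion colors) for 10-bit, and 4096 levels (about 68.7 billion colors) for 12-bit.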
Now to more technical details.
Photographers have learned that High Dynamic Range (HDR) meant capturing high dynamic range images with a camera, sometimes using multiple exposures of different exposure times, and compressing that dynamic range into the range that can be printed or displayed on a computer monitor or TV. But the new High Dynamic Range (HDR), the HDR discussed in this article, changes the term to mean the capability of the output device.
HDR is not simply brighter; it offers more detail in the tonal range, and the tonal range is larger than SDR. For example, low cost LED HDR displays may be able to present a brighter image than OLED, but OLED presents much darker levels, so it has a greater dynamic range. The dynamic ranges of some different technologies are listed in Table 1.
DYNAMIC RANGE IS KEY TO REALISM
Table 1: Approximate dynamic range of display technologies and the human eye

  Display technology           Contrast ratio        Stops
  CRT Monitor                  ~1:100                ~8
  LCD Monitor                  ~1:few hundred        ~9
  LED Monitor                  ~1:500 - 1000         ~9 to 10
  LG Nano LED                  ~1:1200 - 2400        ~10 - 11
  Samsung QLED (2019)          ~1:3700 - 5700        ~12 - 12.5
  OLED (2016+)                 > ~1:1 million        > 20

  Human eye
  Within view                  ~1:15,000             ~14
  Adaptable within a scene     ~1:1,000,000          ~20
  Total range of adaptation    ~1:1 to 35 billion    ~30 to 35
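The stops and contrast-ratio columns in Table 1 are two views of the same number, related by powers of two; a small sketch of the conversion (function names are mine):

```python
import math

def ratio_to_stops(ratio: float) -> float:
    """Convert a contrast ratio (e.g. 1024 for 1024:1) to stops."""
    return math.log2(ratio)

def stops_to_ratio(stops: float) -> float:
    """Convert stops (factors of two) back to a contrast ratio."""
    return 2.0 ** stops

print(round(ratio_to_stops(1_000_000), 1))  # OLED-class 1,000,000:1 -> 19.9 stops
print(round(stops_to_ratio(10)))            # 10 stops -> 1024:1
```

This is why a "less than 1000:1" display is roughly a 10 stop device, while a million-to-one OLED is roughly 20 stops.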
HDR is independent of resolution and color gamut. For example, many movies and streaming shows are at 2K resolution and in HDR. The Queen's Gambit on Netflix (2020) is one superb example.
There is no HDR "format war." There are several software standards, both open and proprietary (licensed), and it is a matter of software to use those standards. But in every case, even a proprietary standard like Dolby Vision will fall back to the open standard HDR10 if the hardware does not have Dolby Vision software.
HDR can be compatible with SDR displays.
There are standards for HDR video and HDR still images.
Currently, there are 5 different common video HDR formats.
16-bit tiffs and raw files can be developed into HDR. Still image HDR formats:
Currently, no TV that I am aware of will read still image HDR formats, only SDR JPEGs. To display a still image in HDR, one must convert it to a video stream with HDR. LG's OLED TVs do a pseudo-HDR conversion from 8-bit JPEGs that is pretty good. More on this in the next article. We need pressure from the still image photo community to get TV manufacturers to include these formats. We also need still image raw converters and photo editing software to include these standards, along with the Rec 2020 color gamut and Rec 2100 standards.
There are multiple standards for color. For an introduction to modern color standards and CIE Chromaticity, along with why these standards are not standards for human color vision, see: Color Part 1: CIE Chromaticity and Perception and Color Part 2: Color Spaces and Color Perception. These articles show that the standard CIE chromaticity is a significant compromise originating from people not wanting to perform numerical integrations on negative numbers by hand (circa 1930s), so they torqued the data to be all positive, then invented "approximation matrices" to try and correct that perversion back toward reality. It works reasonably well for many colors in the natural world, like trees, rocks and soils, people's skin, and animal fur, but not as well for materials with unusual spectral shapes (many synthetic materials), and it departs further from reality for very unusual things, like the true color of the Rayleigh blue sky, or unusual light sources, like neon lights or emission nebulae in the night sky.
The main color spaces in use are:
Note that the Adobe RGB and sRGB blue primaries are the same, and the reds are the same, but the greens are different with the Adobe RGB green much greener. DCI-P3 has about the same blue primary, but redder reds and a green primary between the sRGB and Adobe RGB primary. On an Adobe RGB or sRGB calibrated monitor reds look red-orange, especially compared to a DCI-P3 monitor. This has led some people to complain that these new generations of TVs look too saturated in the red. Ironically, the new TVs are closer to red accuracy than the older generation sRGB and Adobe RGB monitors!
A typical backlit LED/LCD monitor is shown in Figure 3. Such backlit monitors have low dynamic range, typically under 1000:1, and blacks appear as a dark gray. You can test this yourself: fill your monitor with a black image (or a dark image with some actual black areas) and, in a dark room with no lights, see whether the black actually looks black. On backlit monitors, it does not. More typically we see a dark gray, or a slightly off-color gray. This low contrast means that as scene intensity decreases, so does the color range. On the monitor shown in Figure 3, the color gamut drops below sRGB at only 3 to 4 stops below maximum brightness! That isn't even mid-tones! Very dark areas will look blue-gray.
A QLED HDR TV color gamut is shown in Figure 4. Note that the red primary is near the DCI-P3 red vertex, and this is specified as a DCI-P3 monitor. At about -7 stops the color gamut drops significantly below sRGB, and the low end is about 3 stops better than the Dell monitor in Figure 3.
The LG OLED color gamut versus intensity is shown in Figure 5. It shows excellent color, dropping significantly below sRGB at -16 to -18 stops. If the monitor was in a room painted black and all people were wearing black clothes it might be a little better. The measured low end was limited by reflections of the test pattern off the walls of a dark (no light) typical living room at night. Even at -18 stops, the color is still strong, and that shows with great impact on perceived image content. It has to be seen to understand what a profound effect this is.
Another game changer for video is 3-dimensional immersive sound. Imagine walking through a forest and hearing a bird in a tree overhead, and from the direction of the sound, be able to pinpoint where that bird is located. New 3-D immersive sound enables that in your home. There are two technologies in common use: Dolby Atmos and DTS:X.
Surround sound evolved from simple 2-channel stereo. Because we cannot perceive the direction of low frequencies, a sub-woofer was added to the 2-channel sound system, called 2.1, where the .1 means the sub-woofer. Next came 3.1, which added a center channel for voice. Next came 5.1: center, left front, right front, left side/rear, right side/rear. Then came 7.1: center, left front, right front, left side, right side, left rear, right rear. Now another number is added, like 5.1.2, 5.1.4, 7.1.2, 7.1.4, etc. The added number is the number of overhead speakers. The overhead speakers add the 3rd dimension.
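The naming scheme is just three counts (ear-level speakers, sub-woofers, overhead speakers); a toy sketch of the convention, with `parse_layout` a hypothetical helper name:

```python
# Split a speaker-layout label like "5.1" or "7.1.4" into its
# ear-level / sub-woofer / overhead speaker counts.
def parse_layout(label: str) -> dict:
    parts = [int(p) for p in label.split(".")]
    while len(parts) < 3:    # "5.1" has no overhead count
        parts.append(0)
    main, sub, overhead = parts
    return {"main": main, "subwoofer": sub, "overhead": overhead,
            "total": main + sub + overhead}

print(parse_layout("7.1.4"))
# {'main': 7, 'subwoofer': 1, 'overhead': 4, 'total': 12}
```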
But the new channels are not simply multi-channel recorded tracks. While some audio tracks can be recorded for specific channels for backward compatibility, additional sounds are not coded for a specific channel, but to come from a specific 3-D direction! The receiver must then interpret the 3-D location and send sound to your speakers and which ones depend on the direction the sound should come from and your speaker setup. I run a 7.1.2 system and the 3-D effect is impressive. I plan to move to a 7.1.4 setup in a couple of years when receiver prices come down. Both Dolby Atmos and DTS:X offer similar capability, and for a home system one probably can not tell the difference. Dolby Atmos is proprietary and is licensed, while DTS:X is an open standard.
The coming revolution is in how we view images
  Name         Pixels         Megapixels   Bits/color   SDR/HDR   When common

  Increased resolution (nice but not the revolution!!!!):
  VGA          640 x 480      0.3          8            SDR       1990s
  XGA          1024 x 768     0.79         8            SDR       ~2000
  1080p HD     1920 x 1080    2.07         8            SDR       mid-2000s
  4K UHD       3840 x 2160    8.3          8            SDR       ~2016+

  The new game changer is HIGH DYNAMIC RANGE, HDR:
  4K UHD HDR   3840 x 2160    8.3          10 to 12     HDR       2018+
  8K UHD HDR   7680 x 4320    33           10 to 12     HDR       2020+
Higher resolution and more bits per color mean more data per image and a higher bit rate for video. The uncompressed data rate for 4K 10-bits/color video at 30 frames per second is (3840 * 2160 * 30 fps * 3 colors * 10 bits per color =) 7.465 gigabits/second (933 megabytes/second). Few have internet speeds of that capacity, and that is challenging even for hard disk drive speeds on a local high end computer. So compression is used. The best quality for video (highest bit rate) is 4K Blu-Ray disks, which have a bit rate of about 100 megabits/second, meaning a compression ratio of 7465/100 ~ 75. Streaming services in the US, like Netflix, compress more, streaming around 15 megabits/second, thus compression ratios of around 500 for 4K 30 FPS. The higher the compression ratio, the more quality is lost, in both dynamic range and fine detail.
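The same arithmetic generalizes to the other resolutions and frame rates; a sketch reproducing the figures (the function name is mine):

```python
# Uncompressed video data rate in megabits/second, and the compression
# ratio a delivery medium must achieve to carry it.
def uncompressed_mbps(width, height, fps, bits_per_color, colors=3):
    return width * height * fps * colors * bits_per_color / 1e6

rate = uncompressed_mbps(3840, 2160, 30, 10)  # 4K, 30 fps, 10 bits/color
print(round(rate))        # ~7465 Mbit/s, the 7.465 Gbit/s figure
print(round(rate / 100))  # ~75:1 compression for a 100 Mbit/s Blu-Ray
print(round(rate / 15))   # ~498:1 for a 15 Mbit/s streaming service
```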
Video Data Rates, Uncompressed

         Pixels        Frames/sec   # Colors   Bits/color   Data rate (Megabits/sec)
  2K     1920 x 1080   30           3          8            1,490
  4K     3840 x 2160   30           3          8            6,000
  4K     3840 x 2160   30           3          10           7,500
  4K     3840 x 2160   60           3          10           14,900
  4K     3840 x 2160   60           3          12           17,900
  4K     3840 x 2160   120          3          10           29,800
  4K     3840 x 2160   120          3          12           35,800
  8K     7680 x 4320   30           3          10           29,900
  8K     7680 x 4320   60           3          10           59,700
  8K     7680 x 4320   120          3          10           119,400
  8K     7680 x 4320   120          3          12           143,300
  Media capabilities                          Data rate                      Compression needed
                                                                             for 4K 60 fps
  Display Port 2.0 (2019)                     77.37 Gigabits/sec             1 (uncompressed)
  HDMI 2.1 (released 2017)                    42.6 Gigabits/sec              1
  Display Port 1.4a (2018)                    25.92 Gigabits/sec             1
  HDMI 2.0 (released 2013)                    14.4 Gigabits/sec              1
  USB 3 hard drives: ~100+ Megabytes/sec
    (130+ on faster computers)                ~600+ to 1100+ Megabits/sec    ~18
  4K UHD HDR Blu-Ray disks (released 2016)    92, 123 or 144 Megabits/sec    ~103+
  Netflix, Disney+ streaming                  ~15 to 20 Megabits/sec         ~1000

  For reference: JPEG still image compression: ~6 (high quality) to 16 (low quality)
In my experience, Netflix 4K HDR streaming is a wonderful experience. Disney+ seems to be better. But for a real jaw-dropping, knock-your-socks-off experience, 4K Blu-rays are the way to go. Even most movie theaters these days do not match the dynamic range of a 4K HDR Blu-ray on an OLED TV (if one could go to a movie theater during this pandemic).
While streaming 4K movies online can be a wonderful experience, compression limits image quality, losing fine detail and dynamic range. Streaming at 15 to 20 megabits per second is a distant second to the best experience with the highest data quality available from 4K Blu-ray disks. Blu-Ray disk players are under $200. Blu-ray disks vary a lot in price, from under $15 for disks that have been out for a while to typically $24 to $35 for new titles.
If you decide to go 4K Blu-ray, the first disks to get are BBC's Seven Worlds, One Planet and Planet Earth II. Great movies for 4K HDR viewing are listed in the next article in this series. Movies are being scanned from film and remastered in 4K HDR. Movies shot on 65 mm film and higher show amazing clarity, detail and dynamic range. Included in this are Lawrence of Arabia, 2001: A Space Odyssey, the original Alien, Interstellar, and others. Newer films shot on modern 35 mm film also transfer well, for example, Apollo 13. Older 35 mm film was grainier and 4K shows the grain. Lawrence of Arabia was scanned and remastered at 8K and the 4K Blu-ray transfer is stunning. And there are many more to choose from. Modern movies are being shot on 6K and higher digital movie cameras. Seeing these movies again, but in 4K HDR on OLED, is a transformative experience. Part 2 of this series will list both hardware and media for examples to demonstrate this amazing new technology.
Photographers should start thinking how they will change their presentations to take advantage of this new technology.
Chroma sub-sampling is a method of compression. The idea is to reduce color detail in blocks of 2x2 pixels to lower the data volume and transmission time. This is similar to the way the human eye works: we have less color (chroma) resolution than intensity (luminance) resolution.
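A minimal sketch of that 2x2 idea, in the spirit of 4:2:0 subsampling (averaging one chroma plane while luma would stay at full resolution; a pure-Python illustration, not a real codec):

```python
# Average each 2x2 block of a chroma plane down to one sample,
# leaving one quarter of the original chroma data.
def subsample_420(chroma):
    out = []
    for y in range(0, len(chroma), 2):
        row = []
        for x in range(0, len(chroma[0]), 2):
            block_sum = (chroma[y][x] + chroma[y][x + 1] +
                         chroma[y + 1][x] + chroma[y + 1][x + 1])
            row.append(block_sum / 4.0)
        out.append(row)
    return out

plane = [[10, 10, 40, 40],
         [10, 10, 40, 40]]
print(subsample_420(plane))  # [[10.0, 40.0]]
```

Four chroma samples per 2x2 block become one, which is where much of the data saving comes from.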
The main chroma subsampling methods are:
For more details, see:
New technology, in the form of High Dynamic Range (HDR) TVs and monitors, is now out at the consumer level with 20 and more stops dynamic range! This revolution is being driven by the video film industry, and they have standardized new color spaces and encoding methods for HDR displays. Specifically, the standards are Rec 2020 and Rec 2100.
Prints have about a 5 stop dynamic range (a stop is a factor of 2) (Table 1). CRT monitors raised that to 6 or 7 stops, then LCD and LED monitors raised it to about 10 stops. All of these are called Standard Dynamic Range, SDR. The human eye can see about 14 stops of dynamic range in a single scene, but can quickly adapt to more than 20 stops as scene brightness changes. With dark adaptation, we can see things around 25 stops fainter than average daylight, and during the daytime, we perceive detail in things a hundred or more times brighter. The total range of human dynamic range is on the order of 30 to 35 stops. What we see in images we produce and display with today's technology, like an LED computer monitor, looks flat compared to the real world.
Photographers commonly use Adobe RGB and sRGB color spaces with a few using ProPhoto color space. I predict Adobe RGB and ProPhoto will likely fall from use, as the film industry has the new Rec. 2020 color space and Rec. 2100 dynamic range standards. New TVs and computer monitors are aiming toward these standards for the film and gaming industries. Rec. 2100 is a difficult standard to meet with today's technology with a brightness range of 30 stops (one billion). DCI-P3 seems to be the new emerging interim standard, which is close to but different from Adobe RGB.
For an introduction to color standards and the approximations that started in the 1930s because people did not want to integrate functions with negative numbers, see:
Color Part 1: CIE Chromaticity and Perception
Color Part 2: Color Spaces and Color Perception
The above Color Spaces and Color Perception article shows the changing color gamuts of several computer monitors and TVs (see Figure 12 in that article). Also shown are spectral responses. These are not discussed by most review sites.
The motion picture industry has been moving to new color standards and higher dynamic range for output devices. The standards are impressive and will work well into the future. Devices, from cameras to TVs and computer monitors, aiming toward these standards are already showing results closer to real world views, and in higher resolution and dynamic range than ever before possible. Home computers are also catching up in both hardware and software capability. And the results are an impressive, huge leap forward. New technology is being developed for better displays, so existing content will show even better on future hardware.
The still photo industry needs to catch up. Raw converter software needs to include the Rec. 2100 standards, and photo editing software also needs to recognize the new standards. One can add the Rec. 2020 and DCI-P3 color spaces to photo editors if they are not already there. But photo editing software needs to write the HDR still image formats, and TVs need to display images in those formats.
At present, the only way for a still photographer to display still images in HDR on an HDR capable display (e.g. OLED) is to make a video clip and display the video. Of course this takes up much more storage space, but that can be minimized by specifying a slow frame rate, e.g. 5 or 6 frames per second. Note that some TVs have a lower limit on video frame rate. Software exists to do this, for example, ffmpeg (free open source, discussed in the next article).
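As one illustration, a command for looping a single still into a short 5 fps, 10-bit HEVC clip tagged for Rec. 2020 primaries and the PQ (SMPTE 2084) transfer might look like the following. The filenames are placeholders and the exact settings depend on your source material, so treat this as a sketch built from ffmpeg's documented options, not a tested recipe:

```python
# Assemble (but do not run) an ffmpeg command line that loops one
# still image into a short HDR10-tagged HEVC video clip.
cmd = [
    "ffmpeg",
    "-loop", "1", "-i", "photo.png",   # repeat the single input frame
    "-t", "10", "-r", "5",             # 10 seconds at 5 frames/second
    "-c:v", "libx265",                 # HEVC encoder
    "-pix_fmt", "yuv420p10le",         # 10 bits per color
    "-color_primaries", "bt2020",      # Rec. 2020 primaries
    "-color_trc", "smpte2084",         # PQ (HDR10) transfer function
    "-colorspace", "bt2020nc",         # Rec. 2020 non-constant luminance
    "photo_hdr.mp4",
]
print(" ".join(cmd))
# To actually run it: subprocess.run(cmd, check=True)
```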
Web browsers need to include the capability to display HDR content, including still image formats and video.
In my opinion and experience, this new technology is such a profound leap in visual experience that I want to move to producing all my still images and videos in full HDR output. Because the motion picture industry is moving so fast in this regard, display manufacturers are racing to bring out new technologies for computer displays and TVs, and soon everyone will have real HDR display capability. Just as 4K TVs are common today and it is hard to find TVs that are not 4K capable, it will not be long before HDR is common too. And I do not mean just accepting an HDR signal (which most TVs already do), but actually showing high dynamic range output.
In the next article, I will review hardware and 4K HDR movies that illustrate this new technology.
References and Further Reading
High-dynamic-range video (wikipedia)
Hybrid Log-Gamma (wikipedia)
Humphrey, J., 2018, High Dynamic Range: The Best TV Picture You've Ever Seen. Broadcast and Professional Products, Hitachi Kokusai Electric America.
Software for HDR images.
Rec. 2100 (wikipedia)
DTS:X vs. Dolby Atmos, The latest surround sound formats, crutchfield.com.
First Published December 19, 2020
Last updated February 26, 2022