When 4K/Ultra High Definition exploded onto the consumer-video scene just a few years ago, everyone focused on the increased number of pixels on the screen, which grew from 2,073,600 for HD (1920x1080) to 8,294,400 for UHD (3840x2160). Four times as many pixels is four times better, right?

Not really. Unless you have a huge screen—or you sit extremely close to the screen—you probably can't see much of a difference in resolution between HD and UHD.

However, UHD is not only about increased spatial resolution. There are other improvements that are much more obvious. These include higher frame rates (HFR), a greater range of colors (commonly known as wide color gamut or WCG), higher peak brightness, and greater bit depth representing each color, which provides smoother gradations from one color to another and from one brightness level to another.

Those last two improvements combine to allow high dynamic range, or HDR. This is much more important than an increase in the number of pixels. Playing HDR content from a UHD Blu-ray or streaming provider on an HDR display is a revelation to just about anyone who sees it.

Reproducing HDR is easy on modern UHD TVs, which can pump out far more light than what SDR calls for. In fact, HDR content is prepared on—and primarily intended for—flat-panel TVs. If you watch that content on a flat panel, you will see something relatively close to what the creators saw during the mastering process.

But projectors are a different story. Today's models can produce more light than their progenitors, but they still can't come anywhere close to the brightness of flat panels. Plus, simply increasing the brightness of a projector can actually be counterproductive. As a result, projector manufacturers must adapt HDR content to the capabilities of their products. So, how do modern projectors deal with HDR content? Read on to find out...

Dynamic Range Basics

Before I discuss HDR and projectors, I want to make sure everyone has a basic understanding of dynamic range. In simple terms, video dynamic range is the difference—or, more technically, the ratio—between the darkest black and brightest white that a video system can represent and reproduce.

It's directly analogous to audio dynamic range, which is the ratio between the softest and loudest sound that an audio system can represent and reproduce. In audio reproduction, the low end of the dynamic range is the noise floor of a system, which corresponds to the black level of a display, and the high end is the maximum reproducible amplitude, which corresponds to the peak luminance of a display.

How does dynamic range relate to contrast ratio, a common specification for video displays? They're basically synonymous. According to Christopher Mullins, Product Manager of Digital and Home Cinema, Simulation, and Entertainment Projection at Sony Professional Solutions Europe, "Contrast is typically used to describe the capability of a display device...and dynamic range is used more regularly during the capture and post production phases and more recently also for display devices. The higher the achievable contrast by your projection system, the greater the amount of dynamic range that can be displayed on screen. HDR capability is directly related to the contrast ratio of the projection system."

The brightness—or, more technically, the luminance—of anything we see is expressed in units called candelas per square meter (cd/m²). But that's quite a mouthful, so the more common and equivalent term is "nits." This term comes up quite often when talking about dynamic range.

The real world has an extremely large dynamic range (see Fig. 1). At the low end, starlight can be as low as 0.000001 nit, while the sun emits over 1 billion nits. That's 10^15:1! In photographic terms, it's about 50 stops.

Figure 1: The real world has an extremely wide dynamic range. So does the human visual system, but not all at once; the instantaneous dynamic range of human vision is labeled "Entertainment Dynamic Range" in this graphic. The portion labeled "Future TV" represents the results of Dolby's research into consumer preferences, while "Current TV" is SDR. (Source: Dolby)

Likewise, the perceptual dynamic range of the human visual system is very large, but not quite that large (see Fig. 1). People with normal vision can perceive a total dynamic range of about 10^10:1 or 33 stops—but not all at once. If you've ever gone from the bright outdoors directly into a dark theater, you probably noticed that you couldn't see much for a little while. Or perhaps you've gone from a dark theater directly into the bright outdoors, and the glare was quite uncomfortable for a while.

These effects occur because the instantaneous dynamic range of the human visual system encompasses about 10,000:1 or 13 stops at any given moment, depending on the amount of ambient light in the environment. If you're in the bright outdoors, your instantaneous dynamic range shifts upward; if you're in a dark theater, it shifts downward. These shifts take a bit of time, so when you move from one environment to the other, it takes a few moments for your eyes to adjust.
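
For the numerically inclined, a "stop" is just a doubling of light, so converting between contrast ratios and stops is a simple base-2 logarithm. Here's a quick Python sketch that reproduces the figures cited above:

```python
import math

def stops(ratio):
    """Convert a contrast ratio to photographic stops (doublings of light)."""
    return math.log2(ratio)

# Real world (Fig. 1): starlight (~0.000001 nit) to direct sunlight (~1 billion nits)
print(stops(1e9 / 1e-6))  # ~49.8, i.e., about 50 stops (10^15:1)

# Total range of human vision: about 10^10:1
print(stops(1e10))        # ~33.2 stops

# Instantaneous range of human vision: about 10,000:1
print(stops(10_000))      # ~13.3 stops
```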

What is HDR?

During and before the era of high-definition video, the dynamic range of most consumer video systems extended from about 0.1 nit to 100 nits, which is 1,000:1 or about 10 stops (see Fig. 1). This is referred to as "standard dynamic range" (SDR). In conventional commercial cinemas with professional projectors, the standardized peak brightness of the image is 48 nits. (Dolby Cinema has a peak brightness of about 100 nits and a much lower black level.) So, a conventional commercial cinema presents only about half the peak brightness of an SDR image at home.

Interestingly, while the peak brightness of SDR is defined as 100 nits, the black level is not defined at all; it can be whatever the display is capable of. A value of 0.1 nit is common, but displays such as Pioneer's Kuro plasma TVs reached much lower levels, which is why they were so well regarded. ("Kuro" means "black" in Japanese.) The lower the black level, the greater the displayed dynamic range, and the more the image "pops."

But even on a display with very low black levels, SDR does not take full advantage of the human visual system's instantaneous dynamic range, due primarily to the limitations of cameras and displays used at the time. Since then, technology has improved significantly, allowing a larger dynamic range to be captured and displayed. So, when UHD was under development, it was decided to include what became known as "high dynamic range" (HDR) within the UHD ecosystem.

To prepare for this upgrade, Dolby did extensive research into human vision, the nature of visual images, and what viewers prefer in terms of peak brightness. Several conclusions emerged from that research, which was aimed at future-proofing HDR as much as possible.

First, the human visual system's response to changes in brightness is not linear; it's roughly logarithmic. This is not new information; it has been known for a long time. In simple terms, we are more sensitive to changes in brightness at lower levels than we are at higher levels. This is critical for designing how a video display changes its light output in response to different brightness values.

Second, it's important to understand that most visual images have only tiny areas of super-high brightness—for example, the sun reflecting from a car's chrome bumper, which is called a specular highlight. The vast majority of most images has much lower brightness; in fact, the APL (average picture level) in most scenes of most movies is under 100 nits—that is, within the range of SDR.

Third, after exhaustive testing, Dolby found that most viewers preferred specular highlights as high as 10,000 nits (see Fig. 1). No commercially available display can reach anywhere near that peak brightness, so establishing 10,000 nits as an upper limit effectively future-proofs the HDR system. And even if display technology reaches that lofty peak brightness someday, there's no reason to exceed it, since viewers would likely complain about discomfort. In fact, they would probably complain if more than a small fraction of the image were even 1,000 nits.

As with SDR, the black level of HDR is not defined. Of course, it should be as low as possible, and ideally, lower than what SDR displays can produce. As part of its certification process, the UHD Alliance has two minimum specifications. In one, a display must have a peak brightness of at least 1,000 nits and a black level no higher than 0.05 nit (20,000:1 or about 14.3 stops). This is clearly aimed at LCD TVs, which produce higher brightness than OLED displays. In the other specification, a display must have a peak brightness of at least 540 nits and a black level no higher than 0.0005 nit (1,080,000:1 or about 20 stops). This is clearly aimed at OLED TVs, which can shut down pixels to pure black but can't get as bright as LCD flat panels.
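
Running those two certification tiers through the same stops arithmetic shows just how differently they trade peak brightness against black level:

```python
import math

# UHD Alliance certification tiers: (peak brightness in nits, max black level in nits)
tiers = {"LCD-oriented": (1000, 0.05), "OLED-oriented": (540, 0.0005)}

for name, (peak, black) in tiers.items():
    ratio = peak / black
    print(f"{name}: {ratio:,.0f}:1, or {math.log2(ratio):.1f} stops")
# LCD-oriented: 20,000:1, or 14.3 stops
# OLED-oriented: 1,080,000:1, or 20.0 stops
```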

Unfortunately, virtually no projector can meet these requirements, at least in terms of peak brightness, and there are no such specs for them.

In addition to deeper blacks and brighter highlights, a companion benefit of HDR is a greater range of colors—commonly called wide color gamut or WCG—than SDR offers (see Fig. 2). This could easily be the subject of a separate article; for now, suffice to say that the color gamut of SDR is known as BT.709, while the gamut of HDR is technically BT.2020. However, very few displays can actually display the full range of BT.2020, so an intermediate gamut called P3 is typically used in HDR content.

Figure 2: The color gamut for SDR is often called Rec.709 as in this diagram, but it is more correctly called BT.709. The ultimate color gamut for UHD is BT.2020 (called Rec.2020 in this diagram), but very few displays can achieve that range of colors, so most HDR content uses the digital-cinema (DCI) color gamut called P3.

One more point about color in HDR. The color gamut is actually the range of colors available at one brightness level. But since HDR encompasses a wider range of brightness, it's important to consider what happens to the color at different brightness levels. So, the concept of color volume has become more widely discussed (see Fig. 3). The color volume of HDR is much larger than it is for SDR, and colors remain more saturated at higher brightness levels.

Figure 3: Three-dimensional color volume takes into account the brightness level as well as hue and saturation. As the brightness nears its maximum or minimum, the saturation of colors diminishes.

The visual improvement of HDR over SDR is dramatic. One of the best examples is an indoor scene with a window in which the daylit exterior is visible (see Fig. 4). In SDR, if the camera properly exposes the shot for the indoor portions of the scene, the view out the window is highly overexposed—what is often called "blown out." Conversely, if the camera properly exposes the shot for what's visible in the window, the indoor portion of the image is severely underexposed, and you can't see much of anything in the room. In HDR, the indoors and outdoors can be properly exposed at the same time, allowing you to see the details in both portions of the image.

Figure 4: In a shot that includes an interior with a window to the exterior, SDR lets you properly expose for the interior, in which case the exterior is overexposed, or you can expose for the exterior, in which case the interior is underexposed. HDR allows both portions of the image to be properly exposed. (Source: Business Insider)

Bit Depth

In all digital video, brightness is represented by a digital number that consists of several bits. The more bits, the smaller the steps between consecutive brightness values. With only a few bits, humans can see those steps as the brightness increases. If the number of bits is increased beyond a certain point, the steps fall below our threshold of perception, and we see a smooth gradation of brightness (see Fig. 5).

Figure 5: With fewer bits, we can see banding in gradations of brightness. With more bits, the banding disappears into a smooth gradient. (Source: ProjectorCentral)

In SDR video, the brightness of each color (red, green, and blue) is represented with an 8-bit value. Theoretically, those values range from 0 to 255, as they do with computer video, but in practice, black is represented in broadcasts and recorded video by the value 16 and white is represented by 235. This was established in 1982 to emulate the behavior of analog tube technology, which gradually clips above white and drops into the noise floor below black. Setting the values at 16 and 235 allows digital signals to behave in a similar manner without hard clipping.

When the dynamic range is increased much beyond SDR, 8 bits are no longer sufficient; the steps between consecutive values grow larger, so we start to see banding. As a result, HDR must use more bits to maintain smooth gradations. On the other hand, more bits means larger files and bandwidth requirements, so a balance had to be reached.

The final decision was to use 10 bits per color (see Fig. 6). Twelve bits would have been better, but with the careful distribution of brightness values from black to white, 10 bits turn out to be sufficient. As with SDR, the values for black and white are not 0 and 1,023; they are 64 and 940.
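
The practical effect of those two extra bits is easy to quantify: the black-to-white range is carved into roughly four times as many steps, so each step is far less likely to show as a visible band. A quick sketch using the narrow-range code values just described:

```python
# Narrow-range ("video level") encoding: 8-bit SDR uses 16-235, 10-bit HDR uses 64-940.
for bits, black, white in ((8, 16, 235), (10, 64, 940)):
    steps = white - black
    print(f"{bits}-bit: {steps} steps from black to white "
          f"({100 / steps:.2f}% of the range per step)")
# 8-bit: 219 steps (0.46% of the range per step)
# 10-bit: 876 steps (0.11% of the range per step)
```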

Figure 6: In both SDR (8-bit) and HDR (10-bit), black and white are not defined as 0 and the maximum possible value in order to emulate the soft-clipping behavior of analog tubes.

EOTF

The distribution of brightness values as they relate to the light output from the display is determined by something called the electro-optical transfer function (EOTF). This function defines the relationship between brightness values in the video signal and the amount of light emitted by the display.

In SDR video, the EOTF is called gamma (see Fig. 7). It originated in CRT displays because of how the phosphors coating the inner screen surface respond to different intensities of the electron beam that stimulates them to glow. Interestingly, it also happens to be roughly the inverse of the logarithmic response of the human visual system; as brightness values increase along the gamma curve, the brightness of the display appears to increase more or less linearly to our eyes.

Figure 7: Different values of gamma determine how quickly an SDR display "comes out of black." Low gamma values cause the image to get brighter more quickly as the signal intensity increases and look more washed out, while high values can obscure shadow detail, depending on the amount of ambient light in the room. (Source: BenQ)

Virtually all displays allow the user to specify a gamma value, which determines how a display "comes out of black." At low gamma values, the display brightens quickly as the brightness values increase from 16. This lets you easily see details in the shadows, but the picture looks generally washed out. At high gamma values, the display brightens more slowly as the brightness values increase, which makes the picture look darker with obscured shadow detail. Setting the best gamma value depends in part on the amount of ambient light in the room—lower gamma for bright rooms, higher gamma for dark rooms. SDR content today is normally mastered with a gamma of 2.4 in a dark room.
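
In code, gamma is nothing more than a power function applied to the normalized signal. This minimal sketch (assuming the SDR reference peak of 100 nits) shows why lower gamma values come out of black faster:

```python
def gamma_eotf(signal, gamma=2.4, peak_nits=100.0):
    """SDR gamma EOTF: normalized signal (0.0-1.0) -> light output in nits.
    Gamma is relative; the actual peak depends on how the display is set up."""
    return peak_nits * (signal ** gamma)

# Light output at a 20% signal level:
print(gamma_eotf(0.2, gamma=2.0))  # ~4.0 nits -- brighter shadows, more washed out
print(gamma_eotf(0.2, gamma=2.4))  # ~2.1 nits -- darker, more shadow contrast
```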

Another characteristic of gamma is that it's relative. It is not related to specific brightness levels in nits; it simply goes from minimum to maximum brightness with no information about how much light the display is putting out. We know that SDR is created with a peak brightness of 100 nits, so the display should be set to output that much light when it receives the highest brightness value.

One result of Dolby's research into HDR was a new EOTF called the perceptual quantizer (PQ), which has since been formalized as a SMPTE standard called ST 2084 (see Fig. 8). Like gamma, PQ looks somewhat like an inverse-logarithm curve, rising slowly out of black and then becoming steeper as the brightness value increases. But whereas gamma is based on the behavior of phosphors and just happens to relate to human visual response, PQ is actually based on human visual response, making it better suited to its task.

Figure 8: The PQ curve defines how much light a display should output in response to a given brightness value in the signal. (Source: Insight Media)

Unlike gamma, there is only one PQ curve; you can't select different curves as you can with gamma. Also, the PQ curve represents absolute brightness levels in nits. For example, at the 50% signal level (code value 502 in 10-bit narrow-range encoding), the corresponding light output is about 92 nits; at the 75% signal level (code value 721), it's about 1,000 nits. As you can see, most of the brightness values represent light levels below 1,000 nits, even though PQ extends up to 10,000 nits at the maximum signal (code value 940).
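
The PQ curve itself is defined by a handful of constants in SMPTE ST 2084, and a direct implementation reproduces the numbers above:

```python
def pq_eotf(signal):
    """ST 2084 PQ EOTF: normalized signal (0.0-1.0) -> absolute light output in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(0.50))  # ~92 nits
print(pq_eotf(0.75))  # ~1,000 nits
print(pq_eotf(1.00))  # 10,000 nits
```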

Creating HDR Content

Okay, now that we have all that background out of the way, it's time to learn how HDR content is prepared for commercial distribution, a process called mastering (see Fig. 9).

Figure 9: In SDR, the mastering process requires a great reduction in the dynamic range of the captured content; in HDR, much more dynamic range is preserved all the way to the consumer display.

If you've been paying attention up to now, you can probably see a problem here. Recall that SDR assumes a peak brightness of 100 nits, so that's the level at which it is mastered and, hopefully, displayed in the home—easy peasy. By contrast (pun intended!), HDR extends all the way up to 10,000 nits, but there are no professional or consumer displays that can come anywhere close to a peak brightness that high. So, HDR content must be mastered with a lower peak brightness, keeping in mind the capabilities of consumer displays.

So, what should the mastering peak be? Many modern LCD TVs offer a peak brightness of 1,000 nits or even more, and there are several commonly used mastering monitors with the same capability. As a result, most current HDR content is mastered with a peak luminance of 1,000 nits. Some titles are mastered at 2,000 or even 4,000 nits using a special Dolby monitor called the Pulsar (which is liquid-cooled to prevent overheating!). Why? I suspect it's to future-proof the content for the day when consumer displays can reach a peak luminance of 4,000 nits.

Then there's the question of which format to use. The baseline format is HDR10, an open standard that incorporates PQ with 10-bit brightness values. In addition, it includes two small pieces of information called metadata: MaxCLL and MaxFALL. MaxCLL (Maximum Content Light Level) is the maximum light level of any single pixel in the entire program, and MaxFALL (Maximum Frame-Average Light Level) is the maximum average light level of any frame in the entire program. Because these pieces of metadata apply only to the program as a whole, they are collectively known as static metadata.

The purpose of MaxCLL and MaxFALL is to inform the display about the maximum light levels used to master the content. That allows the display to adjust its operation to accommodate light levels that exceed its native capabilities, a process called tone mapping (which I'll discuss in more detail shortly).
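
Conceptually, those two values are simple to derive during mastering. A sketch (assuming each frame is just a list of per-pixel luminance values in nits):

```python
def static_metadata(frames):
    """Compute HDR10 static metadata from a program's frames, where each
    frame is a list of per-pixel luminance values in nits."""
    max_cll = max(max(frame) for frame in frames)                 # brightest pixel anywhere
    max_fall = max(sum(frame) / len(frame) for frame in frames)   # brightest frame average
    return max_cll, max_fall

# A mostly dim program with one tiny 4,000-nit highlight:
print(static_metadata([[50, 80, 120], [60, 90, 4000], [40, 70, 110]]))
# (4000, 1383.33...) -- that single bright pixel dominates MaxCLL
```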

Unfortunately, specifying only two light levels isn't all that useful. For example, there might be a single pixel in one frame of an entire movie with a luminance of 4,000 nits, while most of the content is less than 1,000 nits. (In fact, the vast majority of images in most movies and TV shows are in the range of 100-250 nits; see Fig. 10.) As a result, the display reduces the overall brightness to compensate for that single pixel, and the whole movie looks darker than it should. This is a big problem with static metadata. Even so, HDR10 is probably the most common HDR format used in content today.

Figure 10: This chart shows the distribution of code values for different levels of peak brightness. In all cases, the majority of values are used for normal objects up to 250 nits. (Source: Dolby)

HDR10 is free of any licensing fees, which is one reason it's so widespread. But Dolby came up with an upgraded format called Dolby Vision, which includes metadata about the luminance levels in each scene or even each frame. This is called dynamic metadata, because it allows the display to adjust its settings dynamically as the content plays, resulting in a much more faithful rendering of what the creators intended. It is now widely used along with HDR10, though Dolby Vision is not license-free.

Samsung recently added dynamic metadata to HDR10, calling it HDR10+. This format is not as widespread as HDR10 or Dolby Vision, but it is supported by some content from Amazon, 20th Century Fox, Universal, and Warner Bros. as well as by displays made by Samsung, Panasonic, and TCL.

Yet another HDR format is HLG (Hybrid Log Gamma), which was developed by British and Japanese broadcasters BBC and NHK, respectively. As its name indicates, HLG is a hybrid format that uses gamma for low brightness values and a logarithmic curve for higher values, and it does not use metadata at all (see Fig. 11). As a result, it's fully backward compatible with SDR displays. HLG is used mostly for live broadcasts rather than pre-packaged content.

Figure 11: Like gamma, HLG does not use absolute brightness values; instead, it's relative to the display's capabilities. That means it's fully backward-compatible with SDR displays. (Source: Eizo)
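
The "hybrid" in the name is visible right in the math: below a breakpoint, the curve is a square root (i.e., a gamma of 0.5), and above it, a logarithm takes over. A sketch of the HLG OETF (scene light in, signal out, both normalized to 0-1):

```python
import math

def hlg_oetf(scene_light):
    """HLG OETF: normalized scene light (0.0-1.0) -> normalized signal.
    Square-root (gamma) segment below the breakpoint, logarithmic above it."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073  # constants from BT.2100
    if scene_light <= 1 / 12:
        return math.sqrt(3 * scene_light)
    return a * math.log(12 * scene_light - b) + c

print(hlg_oetf(1 / 12))  # 0.5 -- the two segments meet here
print(hlg_oetf(1.0))     # 1.0
```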

You might think that multiple HDR formats could be a problem—yet another format war. But I look at these formats like the various audio formats, such as Dolby Digital and DTS. Most audio devices can decode just about any audio format, so it doesn't matter which one is used by any particular piece of content. Similarly, most video source devices and flat-panel displays support most HDR formats, so having multiple formats in the marketplace is no big deal.

Unfortunately, projectors are more limited in their support for HDR formats. All HDR-capable consumer projectors support HDR10, and many also support HLG. In addition, Samsung's Premiere ultra-short-throw projectors support HDR10+. However, there are no consumer projectors that support Dolby Vision. In fact, the only projectors that support it are the Christie commercial projectors used in Dolby Cinemas.

I've asked Dolby why consumer projectors don't offer Dolby Vision, but the company has declined to answer this question. It may be that Dolby wants to assure the quality of Dolby Vision images. This is relatively easy with flat panels, which are entirely self-contained, but not with projectors, because the size and type of screen have a big impact on the brightness and black level.

Projectors vs Flat Panels

In addition to more limited support for HDR formats, projectors are at a distinct disadvantage in terms of brightness. Most OLED flat panels can reach a peak luminance of around 700 nits (a few announced at CES 2021 claim to reach 1,000 nits), while LCDs can easily hit 1,000 nits or more. On the other hand, most modern consumer projectors max out around 100 to 150 nits, depending on the size and type of screen they are paired with. In addition, the black level of most projectors isn't nearly as deep as many flat panels, especially OLEDs.

On the plus side, projectors can maintain their full peak brightness even if the entire screen is 100% white. Flat panels must reduce their peak brightness if more than a tiny fraction of the screen is 100% white in order to prevent overheating and burnout.

You might think that increasing the light output of projectors would help bring them further into the world of HDR. But other than a few very expensive exceptions, it really doesn't. If you greatly increase the light output of a projector, that tends to disproportionately increase the black level as well. In all projectors, light from the light source is directed toward the imager, be it DLP, LCD, or LCoS. Despite the manufacturer's best efforts, light scatters at several points along the light path, and some of that light leaks out of the main lens, raising the black level.

Another problem is room reflections. With all projectors, some of the light reflected from the screen bounces around the room and ends up hitting the screen again, washing out the image. As you increase the light output of the projector, you also increase the amount of light that ends up back on the screen from room reflections. Of course, you can mitigate this problem with dark walls and furnishings as well as an ambient light-rejecting (ALR) screen, but in many real-world situations, a projector is used in less-than-ideal conditions.

Reliance on a separate screen is a fundamental difference between projectors and flat panels when it comes to HDR. A flat panel is a self-contained system with well-defined brightness levels, whereas the ultimate brightness of a projected image depends greatly on the size and type of screen. The projector doesn't know the characteristics of the screen or the final brightness of the image, which makes it more difficult to adapt an HDR signal to its capabilities. Also, most consumer flat panels have anti-reflective screens, which mitigate room reflections, while projection screens are designed to be reflective, leading to the problem described previously.

Next, consider the imaging technologies. To generate black, DLP pivots the micromirrors on the DMD chip to direct light into a "light sink" and away from the projection lens, but this is far from 100% efficient at keeping stray light out of the projected image. LCD and LCoS cells darken to allow less light through them, but again, this is not 100% efficient; some light always gets through. (LCoS generally has the best native contrast ratio, probably because light must pass through its LCD layer twice rather than once as in LCD imaging.) So, increasing the amount of light hitting the imager means that more light will leak into the dark portions of the image, raising the black level and reducing the dynamic range.

LCD flat panels have the same problem—some light always leaks through the cells that have been shut down to produce black, which is why LCD TVs traditionally have a relatively high black level. But many LCD TVs now have full-array backlights with local dimming (FALD), which allows sections, or zones, of the backlight behind dark portions of the image to be dimmed, while zones behind bright portions of the image are brightened. This leads to a much lower black level and higher dynamic range.

Unfortunately, virtually no projectors have any form of local dimming. Instead, most projectors only have global dimming using a dynamic iris or dynamic modulation of the laser light source. This improves the contrast from one scene to another, but it does nothing for the dynamic range within a single frame or static scene.

It is widely rumored that Christie's Dolby Vision projectors use dual modulation—one set of DMDs forms the image while another set divides its micromirrors into zones that provide a form of local dimming—but neither Dolby nor Christie will confirm this. Christie has confirmed that its new Eclipse projector employs dual modulation to achieve an astounding black level and contrast ratio. (The Hayden Planetarium in New York City recently installed six Eclipse projectors, which we covered here.)

The only displays that can produce perfect blacks are OLED and microLED flat panels. These are self-emissive technologies in which each red, green, and blue subpixel emits its own light, which can be dimmed to 0 or brightened to maximum completely independently. As I mentioned earlier, OLED TVs can reach around 700 nits of peak brightness, while microLED displays can reach as high as 1,000 nits.

As an aside, OLED and microLED displays give rise to a contrast-ratio fallacy. Since they can achieve true black at 0 nits, many manufacturers claim that means they have an infinite contrast ratio. After all, dividing any number by 0 equals infinity, right? Wrong. Technically, dividing by 0 is undefined. Also, if a display with a black level of 0 nits has infinite contrast, it could have a peak brightness of 1 nit, and the contrast would still be infinite!

Speaking of contrast ratios, it's important to understand that most display manufacturers specify the contrast ratio of their products based on measuring a full-black screen and a full-white screen separately. But the real value of HDR is in widening the dynamic range within a single shot. An ANSI contrast-ratio spec is based on measuring the black and white levels in the squares of a black-and-white checkerboard pattern, which provides a much better indication of how much dynamic range is available within a single shot.
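
Computing an ANSI contrast figure from checkerboard measurements is straightforward. A sketch with hypothetical projector readings (the numbers here are made up for illustration):

```python
def ansi_contrast(white_readings, black_readings):
    """ANSI contrast: average luminance of the white squares divided by the
    average luminance of the black squares of a 4x4 checkerboard pattern."""
    avg_white = sum(white_readings) / len(white_readings)
    avg_black = sum(black_readings) / len(black_readings)
    return avg_white / avg_black

# Hypothetical meter readings in nits: 8 white squares, 8 black squares
whites = [110, 112, 108, 111, 109, 113, 110, 107]
blacks = [0.22, 0.25, 0.24, 0.23, 0.26, 0.24, 0.22, 0.25]
print(ansi_contrast(whites, blacks))  # ~461:1 -- far below a full-on/full-off spec
```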

Tone Mapping

Whereas SDR uses a well-defined and standardized set of parameters for content creation and display (100-nit peak luminance, BT.709 color gamut), HDR does not. The content could be mastered with a peak luminance of 1,000 nits, 2,000 nits, 4,000 nits, or something else. (The color gamut is likely to be P3 during content creation and display, but it doesn't have to be.) And consumers will view that content on a display with a peak luminance of anything from 100 nits to over 1,000 nits.

So, consumer displays must have some way to deal with content mastered at different peak-brightness levels while taking into consideration the display's own peak luminance. This process, called tone mapping, remaps the total brightness range in the content to the brightness capabilities of the display as necessary.

In the case of flat panels, which have a peak brightness of at least 500 nits and often up to 1,000 nits or more, brightness values well below the panel's peak are normally displayed exactly as they are encoded in the content. Remember that most scenes in most movies have an APL in the range of 100-250 nits, so virtually any flat panel can render that brightness level with no modification. As the brightness values approach and exceed the panel's capabilities, they are scaled down to remain within those capabilities. This is accomplished by "rolling off" the EOTF from the PQ curve as the brightness values increase beyond a certain point.

For example, if a flat panel has a peak luminance of 1,000 nits and it receives an HDR signal encoded with a MaxCLL of 1,000 nits, no tone mapping is performed. But if the content has a MaxCLL of 4,000 nits, any values above 1,000—as well as some values a bit below that—are rolled off so the entire brightness range of the content fits within the brightness capabilities of the display.
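
There's no single standard roll-off, but a minimal sketch illustrates the idea: pass values through untouched below a "knee" point, then compress everything above it into the remaining headroom. (The knee placement and compression function here are purely illustrative, not any manufacturer's actual curve.)

```python
def roll_off(nits_in, display_peak=1000.0, knee=0.75, content_max=10000.0):
    """Tone-map a luminance value (in nits) to a display's capabilities.
    Values below the knee pass through unchanged; values above it are
    compressed so even the brightest content stays within the display peak."""
    knee_nits = knee * display_peak
    if nits_in <= knee_nits:
        return nits_in
    headroom = display_peak - knee_nits   # output range left above the knee
    excess = nits_in - knee_nits          # input range squeezed into that headroom
    return knee_nits + headroom * (excess / (content_max - knee_nits)) ** 0.5

print(roll_off(500))    # 500.0 -- below the knee, displayed as encoded
print(roll_off(4000))   # ~898 -- compressed to fit under the 1,000-nit peak
```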

Exactly how brightness values are rolled off is entirely up to each manufacturer; unfortunately, there is no standard for this process. With static metadata, some choose to roll off only the values close to and above the panel's peak, which produces an overall brighter picture, though the highlights are not dramatically higher than some other parts of the image, and they can look clipped. Others start rolling off well below the panel's peak, which results in a less-bright image overall with more dramatic highlights and a less-clipped look. See Fig. 12 for examples of both approaches.

Figure 12: In these examples with HDR10 static metadata, MaxCLL is 1,000 nits, while the peak luminance of the display is 500 nits. In the upper example, the tone-mapping curve is designed to maintain the brightness of the original signal as much as possible, which preserves shadow detail in low-light scenes but causes bright scenes to look clipped. In the lower example, the tone-mapping curve rolls off much sooner, preserving detail in the peak highlights at the expense of lower levels, which causes the entire picture to look dimmer.

The dynamic metadata in Dolby Vision and HDR10+ avoids this compromise by telling the display how to tone map each scene or even each frame (see Fig. 13).

Figure 13: With dynamic metadata, the display adjusts the tone-mapping curve to accommodate different types of scenes with different APLs. As a result, all scenes look their best, rather than having to sacrifice low-APL for high-APL scenes or vice versa.

One important aspect of tone mapping is preserving the hue of colors that exceed the display's color volume. For example, if you desaturate blue, it could become a bit purple. So any good tone-mapping algorithm must take this into account.
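
A common way to handle this (simplified here; not any specific product's algorithm) is to tone map a single intensity measure, such as the maximum of the R, G, and B channels, and then scale all three channels by that same factor so their ratios, and therefore the hue, are preserved:

```python
def tone_map_rgb(r, g, b, curve):
    """Hue-preserving tone mapping: apply the tone curve to max(R, G, B),
    then scale all three channels by the same ratio, keeping their
    proportions -- and therefore the hue -- intact."""
    peak = max(r, g, b)
    if peak == 0:
        return r, g, b
    scale = curve(peak) / peak
    return r * scale, g * scale, b * scale

# Using the roll_off() sketch above, a bright blue highlight keeps its hue:
print(tone_map_rgb(200.0, 300.0, 4000.0, roll_off))  # ~(45, 67, 898)
```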

Tone Mapping in Projectors

The problem is much worse with typical home theater projectors, which might max out at only 100 to 150 nits at an acceptably large image size. In this case, the entire brightness range in HDR content must be drastically tone mapped. According to Kris Deering of Deep Dive AV, a respected video-industry consultant and calibrator, there are two basic approaches to tone mapping in projectors.

In one approach, an engineer compares HDR content displayed on an HDR mastering monitor with the same content produced by a projector, and they adjust the projector's tone-mapping curve so that its image looks as close to the mastering monitor as possible. In most cases, the highlights and deep shadow detail are prioritized over the middle of the brightness range, which Deering says can look a bit artificial.

The other approach is to emulate what would have been an actual mastering grade for the projector's peak-luminance capabilities. This is commonly called a "trim pass," in which a mastering engineer regrades the HDR content for a peak brightness of, say, 100 nits using a relative, gamma-like EOTF curve. Of course, it becomes SDR in terms of peak brightness, but with the added benefits of 4K/UHD resolution, wide color gamut, and 10-bit brightness gradations. Deering says that this approach tends to look more natural, though he stresses that it's a personal choice.

In either case, the default tone-mapping curve was most likely established in a dark, light-controlled room much like a mastering studio. However, if the projector is then installed in a brighter environment—say, a family room—the image will look very dark. So, many HDR-compatible projectors offer a control that adjusts the tone-mapping curve for different amounts of ambient light. These controls also provide some adjustment to account for the varying brightness of different HDR titles. Epson calls this control HDR10 Setting or HLG Setting (depending on the format of the signal), JVC calls it HDR Level, and Sony calls it Contrast (HDR). Also, JVC's Theater Optimizer feature automatically adjusts the picture levels for optimal screen brightness based on screen size, material, throw distance, and lamp hours.

As I mentioned earlier, no consumer projectors support Dolby Vision's dynamic metadata, and as of this writing, only Samsung's Premiere UST projectors support HDR10+. So, several companies have developed their own dynamic tone-mapping technology. For example, JVC's Frame Adapt HDR ignores metadata altogether; instead, it measures each frame in real time for average picture level and peak brightness, adjusting the tone-mapping curve accordingly. LG also offers frame-by-frame dynamic tone mapping in some of its projectors, calling it Dynamic Tone Mapping, and Sony provides a form of dynamic tone mapping called Dynamic HDR Enhancer in its newest projectors as well.
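
None of these manufacturers publish their algorithms, but the general idea is easy to sketch: measure each frame's statistics and pick tone-mapping parameters from them rather than from metadata. (This is a conceptual illustration only.)

```python
def frame_adaptive_params(frame_nits, projector_peak=150.0):
    """Pick per-frame tone-mapping parameters from measured statistics
    instead of metadata (conceptual sketch, not any shipping algorithm)."""
    apl = sum(frame_nits) / len(frame_nits)  # average picture level
    frame_peak = max(frame_nits)
    if frame_peak <= projector_peak:
        knee = 1.0          # frame fits entirely; no compression needed
    else:
        # Brighter frames get a lower knee, reserving headroom for highlights
        knee = max(0.5, 1.0 - apl / 1000.0)
    return knee, frame_peak
```

Each frame would then be tone mapped with a curve like the roll_off() sketch shown earlier, using these per-frame parameters.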

As a side note, HDR content for Dolby Cinemas is graded specifically for the Dolby Vision projectors they use. The same is true for IMAX Laser theaters, which also exhibit higher brightness and lower black levels than conventional cinemas. As a result, those commercial projectors do not use tone mapping at all; the content remains entirely within the projectors' capabilities. This is an example of a professional trim pass.

Closing Thoughts

As I mentioned at the top, HDR combines higher brightness, greater bit depth, and wider color gamut. The result is a stunning picture that blows SDR images out of the water, especially on a modern flat panel.

Unfortunately, the difference between SDR and HDR is not as pronounced on projectors, primarily because the image from most projectors is nowhere near as bright as any flat panel. Many modern projectors do have higher brightness and lower blacks than their predecessors, but they aren't true HDR. Even Dolby Vision in a Dolby Cinema is not considered true HDR by many in the industry; at best, they call it EDR (extended dynamic range).

That doesn't mean HDR content can't look better than SDR from a projector, but we must have realistic expectations. Also, it strongly depends on the room. If there's much ambient light, mid- and low-level details in the image will be invisible. In that case, you would adjust the EOTF control to compress the dynamic range toward higher levels, which actually defeats the purpose of HDR.

This is analogous to the "loudness wars" in audio mastering, in which the dynamic range is squashed into a narrow band near the maximum level so everything can be heard in a noisy environment—say, over the radio in a car. By contrast, audio mastered with a wide dynamic range for SACD or DVD-Audio captures much more subtlety, but it can only be fully appreciated in a quiet environment.

Similarly, a projector can squeeze most of the dynamic range into the brightest region it can reproduce, which is fine for brighter environments such as a family room. But if the projector sacrifices some overall brightness to display details at lower light levels, a lot of those details will be invisible in a brighter room; they can only be seen in a dark, light-controlled room, which is the domain of enthusiasts, not the average consumer.

HDR content poses a unique challenge for projectors, but it is a challenge worth facing, especially if you have a dark, dedicated theater room. Look for a follow-up article on how to optimize and calibrate a projector for HDR, which will explain how to get the best possible HDR image.

Meanwhile, I have great hope for the future of HDR projection. Perhaps some type of dual modulation, such as the kind used in Christie's Eclipse, will provide a form of local dimming in consumer projectors. And a technology called light steering, which directs the light of a projector's laser away from dark areas and toward brighter areas of an image, can also greatly increase the dynamic range within each frame.

Admittedly, these innovations are very expensive, and they are currently limited to projection research labs and specialty commercial applications. But in the future, they could bring consumer projectors all the way into the world of HDR, which would be wonderful for those of us who value the giant-screen cinematic experience that only a projector can provide.

My deepest thanks to Rob Budde of JVC, Kris Deering of Deep Dive AV, Larry Paul of Christie, Carlos Regonesi of Epson, Neil Robinson of LG, and Joel Silver of Imaging Science Foundation for their help with this article.

 
Comments (8)
Stephane Gauthier Posted Mar 13, 2021 7:07 AM PST
Great HDR summary. Thanks for sharing.
Robert Silva Posted Mar 14, 2021 3:15 PM PST
This is a great article!
Rob Sabin, Editor Posted Mar 14, 2021 3:20 PM PST
Thanks, Robert! Coming from you this is high compliment, and I'm sure Scott appreciates it.
Dennis Mak Posted Mar 15, 2021 2:37 AM PST
It just a lesson....letting us to know more about the specific terms on TV and projector....though I am not fully understood...anyway ...well done.
Julian S Posted Mar 15, 2021 2:59 AM PST
Wow, what a read. This may be one of the most valuable articles on the current scientific situation with Projector and Display technology I've ever read. Precisely covering almost everything there is to know in one text, yet written to be clearly understandable. Cheers and thank you very much!
Mike Posted Mar 15, 2021 3:23 AM PST
Great article. Explains the gist of HDR in terms that everyone can understand.
Mark Dargan Posted Mar 24, 2021 8:01 AM PST
Great article. The most comprehensive, technically accurate article I have seen on this subject. Moreover the state-of-the art industry context was very valuable. Thank you.
Eport Posted Mar 26, 2021 9:18 AM PST
The human eye can definitely see more than 13 stops of dynamic range at any given moment. Proof: take a medium exposure of an interior photo with a window view with a standard camera that typically has about 15 stops of information. Compare that shot with your eyes in the same room. You'll notice your eyes have no problem whatsoever seeing all the details outside the window that are blown out and the interiors look much brighter. As a pro architectural photographer, I must use a DSLR to compose with my eyes and not the camera sensor. I usually have to gather 4 stops under and 4 stops over to recreate what the eye sees. This puts human eye dynamic range at least into the 20+ stop dynamic range world.
