When shopping for a new display or projector, how do you determine which model has the best image quality, or even just the best quality for your limited budget? Unless you can compare two models in a side-by-side shootout, you'll have to base your purchase on trusted product reviews from sites such as this one, or on the manufacturer's ads and brochures. Either way, you're going to encounter a variety of technical terms used to describe potential image quality, including ANSI lumens, white light output, pixel resolution, contrast, color accuracy, and color bit-depth.

OK, so you might have a hard time finding that last one. Color bit-depth is often hidden on the specs page or described in some obscure way. However, bit depth is becoming an increasingly important metric for comparing projectors that claim the ability to reproduce wide color gamut (WCG) and high dynamic range (HDR) content. In fact, it may actually tell you more about a projector's potential image quality than its contrast, pixel resolution, or even color accuracy ratings—all of which can vary with display modes or focusing accuracy.

What Is Bit Depth and Why Does It Matter?

Theoretically, a projector's bit-depth rating describes the highest number of tonal values and colors that it can reproduce in any given frame of content. As the bit-depth rating increases (to a point, anyway), the number of colors and tonal values a projector can reproduce on screen increases exponentially, resulting in fewer jagged transitions and posterization effects (i.e., smoother blue skies), along with wider color gamuts and improved shadow and highlight details. The improvements are relatively easy to see as you increase bit depth from 1 bit to 8 bits per color, less pronounced between 8 and 10 bits, and difficult or impossible to notice between 10, 11, and 12 bits due to the limitations of the human eye.

How do you translate a projector's bit-depth rating into the number of colors it can reproduce? Let's first take the example of a monochrome projector that forms a single grayscale image on the screen. Its numeric bit-depth rating ("x" bits per color) can be used to quickly calculate the projector's entire range of unique grayscale values, from its deepest black to its brightest white. All you need to do is apply a simple power-of-two formula. For the grayscale calculation it's: 2^x = number of gray values. The chart below (Figure 1) shows the results of the math for both grayscale-only and RGB color displays. For now, just have a look at the grayscale values; we'll discuss color later.

Bit-Depth Values
Figure 1: Bit-Depth Gray Tone and Color Values
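If you'd like to verify the chart's grayscale column yourself, the arithmetic is simple enough to script. Here is a minimal Python sketch (purely illustrative, with made-up names, not tied to any particular display) that prints the gray values for a few common bit depths:

    # Illustrative only: gray tones available at a given bit depth per color.
    def gray_values(bits_per_color):
        return 2 ** bits_per_color

    for bits in (1, 2, 8, 10, 12):
        print(f"{bits}-bit grayscale: {gray_values(bits):,} tonal values")
    # 1-bit: 2, 2-bit: 4, 8-bit: 256, 10-bit: 1,024, 12-bit: 4,096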

As seen in the illustration below (Figure 2), once an 8-bit grayscale or full color scale is achieved, you won't see the incremental benefits of 10-bits per color on your computer monitor or tablet, and probably not even on a true 10-bit display or projector driven by your computer or other internet-connected device. For starters, that would require a true 10-bit illustration (our illustrations are capped at 8-bits per color thanks to web color limitations).

BitDepthCompared
Figure 2: Comparison of bit-depth gradations (note: illustration limited to 8-bits due to web limitations)

Furthermore, you shouldn't be misled by the test patterns and even some movies available for download from the internet that claim to be 10-bit targets or 10-bit per color movies. Most are not what they claim! Unless you can download the test patterns as intact 16-bit TIFF format photos (all JPEGs are limited to 8-bits per color), you should quit while you're ahead. It's even harder to find animated 10-bit test targets and videos, as nearly all popular video formats available for download, including AVCHD and .MP4, are limited to 8-bits per color content. Even if you can actually find true 10-bit files available for download, you'll still need a computer with a 10-bit graphics card and 10-bit capable software. Otherwise, you'll wind up viewing a smooth 8-bit target or movie that shows no difference when viewed on an 8- or 10-bit display.

Using the ProjectorCentral 10-Bit HDR Grayscale Animation

Fortunately, there is a simple way for any serious video enthusiast to download and view 10-bit test patterns to help assess their display. All 4K UHD Blu-ray players have built-in 10-bit per color graphics capability for playing back 4K UHD Blu-ray movies—all of which are stored in 10-bits per color HEVC format video. Most of these 4K UHD Blu-ray players and a few 4K media players, including the Roku 4K HDR, have a USB input that enables them to play back animated 10-bit per color test targets that have been saved in 10-bit HEVC format.

If you'd like to see how your own projector handles 10-bit signals, you can download the 10-bit per color animated test target you see below (Figure 3), created by In-Depth Focus Labs, from ProjectorCentral.com. The spinning wheels display a 10-bit grayscale between video levels 0 and 20 on the left, and levels 20 through 100 on the right. Although it should appear as a grayscale image, it is actually a full color pattern containing metadata tags that should automatically turn on the HDR and WCG modes in any HDR10 compatible display.

ProjectorCentral 10-bit HDR Grayscale Pattern
Figure 3: 10-bit HDR grayscale animated test target (© In Depth Focus Labs 2018)

To download the target to a Windows PC, you must RIGHT-CLICK on the link below, select "Save Link As," and save it to your preferred location. The 10-bit HEVC file will download to that folder. On a Mac, right-click and then select "Download Linked File" or "Download Linked File As."


Right Click to Download the Test Pattern Video File

To view the test pattern on your display, copy it to a USB flash drive and insert the drive into the USB media input on your UHD Blu-ray player. When you play the file from the disc player's built-in media player, it should be recognized by your display as a UHD resolution video with 10-bit bit depth, HDR, and BT.2020 color space.

As illustrated below (Figure 5), obvious banding in the spinning wheels indicates that your display is playing back with less than full 10-bit bit depth.

Bring On The Color

Unlike a monochrome display, a color display must form at least three grayscale images representing the red, green, and blue data channels found in a standard SMPTE color signal. Most 3-chip projectors, whether using LCD, LCoS, or DLP imaging chips, start by using the data from each of the incoming R, G, and B channels to form an associated grayscale image. These are then illuminated by red, green, and blue light (created by filtering a white light source or by using colored LEDs or lasers) to form an overlapping full color image on screen (Figure 4).

ProjectorCentral 10-bit HDR Grayscale Pattern
Figure 4: Grayscale signal data for each primary color, when illuminated by light of that color, combine to form a full color image.

Single-chip DLP projectors parcel out fractions of the R, G, and B data to form as many as seven grayscale images in rapid succession on the DLP micromirror device. Although none of these individual grayscale images contains the full number of tonalities found within the three individual RGB-based grayscale images, the totals should add up to the same in the end. White light from a bulb, colored LEDs, or a laser diode is then reflected off the DMD and passed through up to seven corresponding color segments on a spinning wheel to form a full color image on screen.

In all of these color projector models, the total number of achievable colors winds up being the product of the grayscale values created; for an 8-bit display, that total appears in the 8-bit row of the column labeled "Potential R, G, B Color Values" in Figure 1.

For example, here's the math in an 8-bit per color display that forms three grayscale images:
                      8 bits per color channel: 2^8 = 256 gray values.
                      Total colors: (256 R) x (256 G) x (256 B) = 16.7 million colors
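The same arithmetic can be scripted. The short sketch below (illustrative naming only, not a reviewer's tool) multiplies the three per-channel grayscales to get the totals shown in Figure 1:

    # Illustrative only: total colors for a three-channel (R, G, B) display.
    def total_colors(bits_per_color):
        per_channel = 2 ** bits_per_color   # gray values per channel
        return per_channel ** 3             # R x G x B combinations

    print(f"{total_colors(8):,}")   # 16,777,216  (~16.7 million colors)
    print(f"{total_colors(10):,}")  # 1,073,741,824  (~1.07 billion colors)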

What Does Bit-Depth Look Like?

So, now you have some idea of how bit depth specifications relate to the number of grayscale gradations or colors that can be created by a display. But how do bit-depth variations actually look on the screen?

The type of artifact most closely associated with bit depth, or rather, the lack of it, is the banding artifact. It looks like what it sounds like: areas of the image that should ideally look smooth and show even transitions of light and color instead exhibit noticeable bands or outlines where the brightness or color visibly jumps from one level to another. The display simply lacks the ability to reproduce all the fine gradations called for by the signal.
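If you want a numerical feel for where banding comes from, the sketch below (a simplified illustration, not how any particular display actually processes video) quantizes a smooth 10-bit ramp down to 8 bits and counts how many distinct levels survive:

    import numpy as np

    # Simplified illustration: truncating a smooth 10-bit ramp to 8 bits
    # collapses neighboring code values into the same output level,
    # which is what shows up on screen as visible bands.
    ramp_10bit = np.linspace(0, 1023, 3840).astype(np.uint16)  # one 4K-wide gradient
    ramp_8bit = (ramp_10bit >> 2) << 2                         # drop the two low bits

    print(len(np.unique(ramp_10bit)))  # 1024 distinct levels
    print(len(np.unique(ramp_8bit)))   # 256 levels -- four-times-coarser steps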

Below (Figure 5) are examples of the 10-bit circular HDR grayscale target cited above as it should appear when properly processed at 10-bit depth (top), and with obvious banding as a result of being processed with only 8-bit or 9-bit depth (bottom). You can clearly see the banding steps in the darkest part of the test pattern, and more subtly, in the brighter part of the pattern.

Gray Test Pattern-NoBanding

Gray Test Pattern-Banding
Figure 5: 10-bit HDR grayscale pattern with no banding (top) and with visible banding caused by bit-depth deficiency (bottom).

As with the grayscale bit-depth chart shown in Figure 1 and the grayscale pattern above, differences in color bit-depth can also manifest visibly as banding—although the eye is more forgiving with certain colors than others. The illustration below (Figure 6), for example, readily shows banding patterns on most displays when comparing 12-bit and 24-bit color, but the differences would be harder to see when comparing 20-bit and 24-bit color.

12-bit vs. 24-bit color
Figure 6: 12-bit vs. 24-bit color

In real-world content on most 8-bit per color displays, you might perceive bit-depth banding issues in the transitions of light levels and colors in a sunset, or in the different hues of blue in a sky. Other bit-depth artifacts can be seen around the edges of objects, such as the transition between a planet in outer space and the halo of light surrounding it, or where one saturated color ends and another begins. Instead of a smooth tonal transition, you see a line or edging effect. For example, in the illustration below, shot in 4K HDR with 10-bit color depth, compare the out-of-focus, violet-tinged flowers behind the butterfly. The top frame in Figure 7 shows the out-of-focus flowers as they should appear with proper 10-bit processing. Below that is the same frame processed at 8 bits per color.

Butterfly-NoBanding

Butterfly-Banding
Figure 7: 4K, 10-bit HDR video frame displayed with 10-bit (top) and 8-bit (bottom) processing.

If you look at the 10- and 8-bit close-ups shown below (Figure 8), you can clearly see halos around the edges of the 8-bit flowers as well as a visible dark band that outlines them. Also notice the missing details within the 8-bit flowers.

Butterfly Close-Up_10-bit vs 8-bit processing
Figure 8: Image detail, 10-bit vs. 8-bit processing

For another example, consider the photos below in Figure 9 of a real color spectrum—a double rainbow. The photo on top is processed with 10-bit color, while the bottom image is processed with 8-bit color. The 8-bit version suffers not only from posterization and banding artifacts in the sky, but the fainter of the two rainbows (the one on the left) practically disappears due to the combination of posterization and truncated color gamut. This is the kind of image degradation you might expect if you saw banding in the 10-bit grayscale test pattern discussed above, and illustrates what happens when three overlapping grayscale images (each with distinct banding issues) are used to form a color image.

Double rainbow-10 bit

Double rainbow, 8-bit
Figure 9: Double rainbow, 10-bit processing (top) vs. 8-bit processing (bottom).

Understanding Bit-Depth Specs

Having a basic understanding of how bit-depth relates to image quality isn't really enough when it's time to start shopping for a projector. Here are a few other important things to know before you start sifting through product marketing sheets and reviews.

X-bits per color, X-bits per pixel, X-bit color. Unfortunately, all of these terms are widely used to describe bit-depth capabilities in displays, which creates confusion. They don't always mean the same thing.

If we're talking about a monochrome display—something you won't likely be shopping for anytime soon—all three terms do refer to the same value. So a display deemed to have "8-bits per color" can also be described as an "8-bits per pixel" or simply an "8-bit color" display.

But in a full color display—more relevant to today's projectors—the term "X-bits per color" describes the number of tonal values found in each of the three grayscales formed from the video signal's R, G, and B data channels (as described in the previous section). So, if X=8, then "8-bits per color" generates 256 tonal values per color.

On the other hand, with color displays the terms "X-bits per pixel" or "X-bit color" describe the combined bit depth across all of the grayscale images. So in this case, you can expect the value of X to be three times the value you'd see associated with the term "X-bits per color." Therefore...if you see "24-bits per pixel" or just "24-bit color," it means the display has the same potential colors as one labeled "8-bits per color." Easy to understand...if you're a math major.
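In code form, the conversion between the two conventions is just a factor of three (the function name here is illustrative, not an industry standard):

    # Illustrative conversion between the two spec-sheet conventions.
    def bits_per_pixel(bits_per_color, channels=3):
        return bits_per_color * channels

    print(bits_per_pixel(8))   # 24 -> "24-bit color" equals "8-bits per color"
    print(bits_per_pixel(10))  # 30 -> "30-bit color" equals "10-bits per color"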

Millions vs. billions of colors. Why do some manufacturers list a display's color capability in terms of millions or billions of colors? The answer is simple: marketing! After years of describing display color capabilities in 8-bit terms, manufacturers came up with a description that made more sense to buyers: "millions of colors," or 16.7 million colors. Then 10-bits per color displays came along and the next logical marketing step was to claim "billions of colors," or "over 1 billion colors!" (1024R x 1024G x 1024B = 1.07 billion colors). To really confuse you, one manufacturer even lists its projector's color capability as 1,074 million colors! (the same as 1.07 billion colors, but a bigger number up front).

The problem with claiming billions of colors is that there is actually no such thing! According to vision experts, the average human can only discern around 12 million individual colors, while only a handful of humans found worldwide (all female) can discern about 100 million colors. Their expanded ability is due to a genetic mutation that produced a fourth color sensitive cone in their eyes. Most with this ability don't even know it until tested, but I'll bet they enjoy a good sunset or rainbow when they see it.

Samsung is about the only manufacturer who has tried to set the record straight by claiming that its 10-bit displays are capable of displaying "Billions of color data combinations"—a technically accurate statement. Few have followed.

Let The Buyer Beware

For more than a decade, advanced photographers, videographers, and film directors have been aware of the advantages of capturing and processing color images and video with a minimum of 10-bits per color (30-bits per pixel). The RAW modes on all DSLR cameras store still photos in 10- or even 12-bits per color, and affordable 4K camcorders now have similar capabilities. On the computer side, every Mac currently sold has at least 10-bits per color graphics capability, as do the majority of PCs, image and video editing programs, and 4K or higher-resolution monitors used for image editing and advanced gaming.

However, it wasn't until 4K UHD Blu-ray movies and players became available, enabling the distribution of high dynamic range (HDR) and wide color gamut (WCG) content to a home audience, that 10-bits per color became an important feature for both flat-panel TVs and projectors. Before that, the marketing of displays and projectors had concentrated on increased resolution and, in some cases, improved color accuracy and extended color gamut reproduction. In 2015, 10-bits per color became the minimum acceptable color standard when the CEA released its minimum guidelines for HDR10-compatible displays and projectors, which included a 10-bit requirement under the HDR10 Media Profile. Here are the parameters (also summarized in the short code sketch after the list):

  • EOTF: SMPTE ST2084
  • Color Sub-Sampling: 4:2:0 (for compressed video sources)
  • Bit Depth: 10 bit
  • Color Primaries: ITU-R BT.2020
  • Metadata: SMPTE ST 2086, MaxFALL, MaxCLL
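For readers who prefer to see specs as data, here is one way to jot down those HDR10 Media Profile parameters—purely a reference sketch, not an official schema or any real library's data structure:

    # Reference sketch only: the HDR10 Media Profile parameters as a dict.
    HDR10_MEDIA_PROFILE = {
        "eotf": "SMPTE ST 2084",
        "color_subsampling": "4:2:0",   # for compressed video sources
        "bit_depth": 10,
        "color_primaries": "ITU-R BT.2020",
        "metadata": ["SMPTE ST 2086", "MaxFALL", "MaxCLL"],
    }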

The simplicity of the CEA definition may have created more confusion among consumers than it eliminated. A deeper read shows that all a projector or display has to do in order to claim "HDR10 compatibility" is accept an HDR content signal containing 10-bit per color data that's stored using BT.2020 color space coordinates and includes the appropriate HDR metadata tags. But HDR10-compatible displays and projectors are not required to maintain 10-bits per color from input to output, or even to reproduce any wide gamut colors outside the standard dynamic range (SDR) Rec. 709 color space. That loophole was intentional, and left the door open for more affordable and "older-technology" 8-bit displays that are limited to Rec. 709 color gamuts (or slightly more) to be re-engineered to accept HDR and wide gamut color content from 4K UHD Blu-ray players without choking.

The TV industry has always prioritized backwards compatibility, and in this case it can be done with some internal processing tricks on the display or projector side, or within a computer or stand-alone media player. The result is that some displays with limited bit-depth capabilities are labeled as HDR-capable, but don't really meet the criteria or deliver the full image quality benefits of 10-bit HDR displays.

Here's how it typically works for an 8-bit per color display claiming to be HDR-compatible: When an incoming 10-bit HDR movie signal is detected, a front-end processor in the display downsamples the signal to 8-bits per color, or uses dithering to approximate the 10-bit colors. Next, the display applies a reverse HDR or HLG curve adjustment to counter the ST 2084 EOTF contrast curve applied during the HDR mastering process. A color look-up table (LUT) is then applied to map all the wide gamut colors the display can't reproduce to the closest in-gamut colors that it can. Additional image tweaks may include selective saturation, contrast, and blurring adjustments to minimize posterization and banding artifacts.
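To make that first step concrete, here is a loose sketch (assumed function and parameter names, not any manufacturer's actual pipeline) of reducing 10-bit HDR code values to 8 bits, with optional dithering to disguise the resulting banding:

    import numpy as np

    # Loose sketch, not a real display pipeline: reduce 10-bit code values
    # (0-1023) to 8-bit values (0-255), optionally adding a little noise
    # first so the quantization error looks like grain rather than bands.
    def downsample_10_to_8(frame_10bit, dither=True):
        frame = frame_10bit.astype(np.float32)
        if dither:
            # +/-2 ten-bit codes is half of one 8-bit step.
            frame += np.random.uniform(-2.0, 2.0, size=frame.shape)
        return np.clip(np.round(frame / 4.0), 0, 255).astype(np.uint8)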

The result on the screen lands somewhere between an SDR 8-bit image and a 10-bit HDR image. You may still see some wide gamut colors in the 8-bit display's output, as 10-bits per color is not required to create many of the DCI-P3 gamut colors that fall outside the smaller Rec. 709 standard color gamut. However, no reasonably affordable 8-bit display or projector can achieve 100% coverage of the DCI-P3 wide gamut color space used to master and color grade 4K UHD Blu-ray movies, and 10-bit or higher color is required to achieve the additional colors found in the full BT.2020 color gamut.

Unfortunately, in addition to the color gamut limitations just described, downsampling and dithering also cause image-quality artifacts, including posterization effects, lost shadow and highlight details, banding in fine color gradations, and outlines appearing along the edges of fine tonal transitions (as shown above in Figures 6 through 8). All of these problems are overcome by a display or projector with true 10-bit per color processing and the ability to reproduce a color gamut approaching or exceeding 100% of the DCI-P3 color space.

Projectors and flat-panel TVs with true 10-bit processing and the improved image quality it enables are out there, and they're more affordable than you might think. But they're competing with some "HDR10-compatible" models that claim all sorts of HDR advantages yet don't reveal their 8-bit limitations until you see their output on screen, or learn about it in a product review. The lesson? If you're in the market for a new projector, make sure you do your homework.

Michael J. McNamara is the former Executive Technology Editor of Popular Photography magazine and a renowned expert on digital capture, storage, and display technologies. He is also an award-winning photographer and videographer, and the owner of In-Depth Focus Labs in Hopewell Junction, NY.

 
Comments (15)
David Rivera Posted Dec 20, 2018 2:13 PM PST
Bring on the knowledge Michael. Your fantastic article, chuck full on great info, leads me to quote the late great Teddy Pendergrass: "The more I get the more I want"(Disco circa 1977). Quality information will lead to a wiser and more selective consumer base, which in turn promotes competition between manufacturers. Competition in projector development breeds better quality at better prices. Great work PC for keeping your readers informed and connected. You continue to build our trust and reliance on your site. To all your advertisers I say, invest your marketing dollars in Projector Central and you will reap the benefits.
Rob Sabin, Editor Posted Dec 20, 2018 3:03 PM PST
Thanks for the comments, David. I agree that Mike did an awesome job covering the bases for our readers on an important new topic. We'll look forward to bringing you his future contributions.
Michael J. McNamara Posted Dec 20, 2018 5:02 PM PST
David: Thanks for the compliments and appreciation. Hope I can continue to raise the bar a "bit" higher in future articles.
Harold Veatch Posted Dec 20, 2018 5:05 PM PST
Just curious. So practically speaking, how does this relate to an old projector like the Optoma HD80 which claims 10 bit color processing when it obviously doesn't support HDR.

Quotes from HD80 brochure. (Is this a bunch of bull?)

"Currently filmmakers record and process movies at greater colour depths than most consumer Home Cinema equipment can reproduce. Movie studios have had to reduce the colour depth of their films for home distribution so they are compatible with Home Cinema equipment. The pure 10 bit digital signal path of the HD80 paves the way for movie and gaming content to be displayed in a virtually lossless form producing a level of visual acuity and realism never seen before in the home"

"Commanding over two million individual pixels, luminance and vibrant colours blend fluidly with the ThemeScene ® HD80. At the heart of the projector is the latest 1080p DLP ® technology. A pure 10-bit signal path and processing architecture combine with an advanced colour wheel featuring NDG (Neutral Density Green) technology. NDG increases the visual colour resolution, creating a higher quality image that dramatically reduces low-level dithering artefacts."

"10 bit DNX Rich Colour Processing technology increases the number of colours that can be displayed from 16 million to over 1 billion by offering 4 times the colour information for each pixel."
Rob Sabin, Editor Posted Dec 20, 2018 7:19 PM PST
Harold, end-to-end 10-bit processing most certainly would have been forward thinking in the HD80, which was released in July 2007. But unless I'm mistaken there would have been no true 10-bit content to make use of it. 1080p Blu-rays are 8-bit, and 1080i broadcast did not support 10-bit content then or now. Perhaps photographic content might have been an option at that time.
Jason Posted Dec 20, 2018 11:46 PM PST
Thanks for the piece. Is it possible to add color depth into database as a feature search option? Thanks.
Rob Sabin, Editor Posted Dec 20, 2018 11:50 PM PST
Jason, the manufacturers don't always make this information immediately available in the spec sheets we use to create the database. But we are discussing how we might be able to incorporate this information. At this point, pretty much all new 4K displays ought to be able to do full 10 bit processing,though I suppose some budget models might not.
Mike Collins Posted Dec 21, 2018 3:31 AM PST
Loved this article. Always learning. Can you explain the difference between 4:4:4 vs. 4:4:2 vs 4:2:0 in general and how it relates to visual differences? That would be a great next article. Something on frame rate and interpolation (soap opera effect) would be wonderful as well, ie 60 FPS vs. 24 FPS, vs. 30 FPS.
Rob Sabin, Editor Posted Dec 21, 2018 8:35 AM PST
Mike, we actually do have a article explaining chroma subsampling in the works right now, and I agree that frame rates, frame interpolation techniques, and their affect on image quality would make another good one for consideration. Thanks.
krectus Posted Jan 1, 2019 6:44 PM PST
Great thanks for this, I wish there was more calibration/test patterns available on movies/devices/projectors. Trying to play a 4K HDR disc through a xbox one x on my projector is just hoping for the best, none of those things have any great test patterns to help you out. Things will get better but right now its...not good.
Dadix Posted Jul 27, 2019 3:12 AM PST
So an Optoma HD20 has it or not a 10 bit color deph (because I want to buy one second hand)? How about Optoma 142x ? ( Because in description I see 10 bit but also Rec709 which is standard colors )
Nitin Posted Jan 7, 2021 4:26 PM PST
Thank you for the excellent article.Are there many projectors that have implemented Dolby vision? I understand the point about looking for true 10 bit and native DCI-P3 projectors but are there any things that make or would make Dolby Vision implementation stand out on a projector?

I would think people benefit from an article on color spaces too. For e.g., I understand movies are mastered for home viewing using P3 color space monitors. Why are they then put in Rec.2020 containers? Given that most content out there is mastered on P3, is there any point in waiting for a Rec.2020 projector?

Btw, I have been a reader since about 2000.Still on my second main projector ( experimented with two toys along the way), and now in the market for my third main projector. Have neen on projectors through all these years (no TV). Wouldn't have been this way without your site.
Rob Sabin, Editor Posted Jan 9, 2021 10:16 AM PST
Nitin, the only Dolby Vision projectors we know of on the planet are the Christie digital cinema projectors built expressly for Dolby Vision movie theaters. There could be a couple of possible reasons for this, all conjecture on my part and not from anything I've heard: 1) The cost of licensing it for a projector is prohibitive for projector manufacturers, or 2) what I think is the more likely reason, Dolby simply doesn't have a version of this technology to license to projector makers. Dolby has been very interested in making sure the technology is well executed so that viewers have a positive view of it, and unlike in Dolby Vision theaters, where they have direct knowledge of the screen size and material and lighting conditions being used and get involved in the installation and tuning of these systems (which I think are dual-projector), they can't know how it's going to look in home theaters. Given that HDR10 is essentially an open-source technology, they can't sell licenses for something they can't be sure looks demonstrably better or at least is executed to a level they consider satisfactory. They can do that with TVs -- but not projectors.
AKB Posted Mar 5, 2021 7:25 AM PST
Hi Michael,

Do you believe we will ever get 36 bit colour projectors? If so when could you gestimate when we would have such things? Which system is best capable of doing in future, laser, LCD etc?

Kind regards
John Posted Dec 31, 2021 1:43 AM PST
I'm curious about the use of temporal dithering for the many panels that are true 8 bit panels but want to "show" 10 bit colors. With all manufacturers that I investigate, I cannot find what their native panels are. What I am looking for is a modern day true 10 bit projector, but I don't know where to look. Many say they perform 10 bit color processing, but that does not necessarily mean the panels are 10 bit. I'd really love to hear your thoughts on this. Thank you for the great article.
