2017 has proven to be the year of the Ultra HD 4K TV as high dynamic range (HDR) and wide color gamut (WCG) models have become more prevalent, as well as less expensive.
The two advances deliver improvements in image realism that are more easily discernible than the higher resolution of 4K (3,840 by 2,160 pixels) alone.
The Consumer Technology Association (CTA) estimates that TV models able to accept and read HDR metadata will represent about one in five 4K Ultra HD TV sales in 2017, as the overall 4K market enjoys double-digit unit volume growth, according to Steve Koenig, CTA senior director of market research.
Meanwhile, in WCG and HDR TVs the improvement in color is easier to discern than added resolution alone, and the expanded color range in many high-performance 4K displays surpasses 90 percent of the Digital Cinema Initiatives (DCI) P3 color space recommended for professional theaters.
Paul Gagnon, IHS Technology Group senior manager of analysis, noted that as sets with WCG and HDR get better and more pervasive, average 4K Ultra HD TV prices are falling more than 20 percent year over year, driving new demand.
In addition, new Ultra HD Blu-ray players arrived last year supporting up to the full ITU-R Rec. 2020 color gamut, a standard still out of the reach of most models on the market. But in 2017, new-generation Ultra HD models are getting closer to that standard, and new quantum dot technologies used in some top-level 4K Ultra HD LED LCD TVs include advances that approach that goal.
Similarly, the brighter highlights and wider contrast ratio presented in HDR content produce stunning realism, which is pushing manufacturers to elevate peak luminance levels above the 540-nit (OLED) and 1,000-nit (LCD) thresholds set for Ultra HD Premium performance by the UHD Alliance.
New technologies often add confusion, and last year manufacturers introduced a wide range of 4K Ultra HD TVs advertised as “HDR-ready” or “HDR-compatible” that offered widely varying levels of performance. Those labels mean only that a TV can receive and recognize HDR metadata in a signal; how well HDR is actually displayed depends on the capabilities of the individual set, and some cheaper models manage brightness levels only slightly better than standard dynamic range. Consumers are left confused about what HDR/WCG should look like.
Gagnon said IHS uses 500 nits as the division line between HDR and HDR-compatible TVs, the latter only needing to accept and read metadata.
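For readers who think in code, that distinction can be sketched as a simple classification. This is an illustrative sketch only, not an industry tool: it assumes a set's peak luminance and metadata support are known, and it uses the 500-nit IHS dividing line and the 540-nit (OLED) / 1,000-nit (LCD) Ultra HD Premium thresholds cited above.

```python
def hdr_tier(peak_nits: float, panel: str, reads_metadata: bool) -> str:
    """Rough tier label for a TV based on peak luminance (illustrative only)."""
    if not reads_metadata:
        return "SDR"  # cannot interpret HDR metadata at all
    # UHD Alliance Ultra HD Premium luminance floors, per the figures above
    premium_floor = 540 if panel == "OLED" else 1000
    if peak_nits >= premium_floor:
        return "Ultra HD Premium"
    if peak_nits >= 500:  # IHS's dividing line between HDR and HDR-compatible
        return "HDR"
    return "HDR-compatible"

print(hdr_tier(1100, "LCD", True))   # Ultra HD Premium
print(hdr_tier(600, "LCD", True))    # HDR
print(hdr_tier(350, "LCD", True))    # HDR-compatible
```

By this rule of thumb, a 600-nit LCD counts as true HDR in the IHS framing but still falls short of the Ultra HD Premium badge, which is exactly the gap that confuses shoppers.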
In addition, more TVs in 2017 include support for the Dolby Vision HDR format in addition to HDR10, which has become the baseline industry standard. Dolby Vision brings dynamic metadata with parameters that shift from scene to scene instead of remaining constant throughout a production, potentially heightening the sense of realism.
Meanwhile, some new TVs in 2017 support the Hybrid Log-Gamma (HLG) HDR format, which is included in the new ATSC 3.0 television broadcast system.
The three primary technologies driving WCG along with HDR in step-up and premium TVs are organic light-emitting diode (OLED) displays, quantum dot-based LED LCDs and phosphor-coated LED LCDs. Each has seen performance advancements in new sets this year.
Meanwhile, OLED technology, championed by LG Electronics, excels at producing deep black levels, yielding a wide contrast ratio that starts at near-total blackness, and has widened that range further by achieving peak brightness levels nearing the 1,000-nit level of premium LCD TVs.
CTA estimates volume shipments of OLED TVs will reach a minimum of 500,000 in 2017.
Because OLED technology stands near the pinnacle of display tech, it continues to carry a high price tag relative to LCD. Competitors such as Samsung and Hisense are adapting LCD with new approaches to support WCG and HDR, including quantum dot technologies, local dimming and phosphor-coated LEDs, to drive both performance and value.
Samsung has aggressively defended its rights and patents on quantum dot applications. It previously acquired an equity stake in Nanosys, a leader in quantum dot technologies for the consumer display market, and late last year acquired QD Vision, which has developed a different quantum dot approach optimized for edge-lit LED applications.
Nanosys and Samsung developed a hybrid technology called QLED, which combines many of the benefits of quantum dot-based LCD TVs with the self-emissive characteristics of OLED technology, promising brighter displays with deeper black levels and a wider color gamut. But market introduction isn’t expected before 2019.
QLED uses a blue OLED light source instead of the white backlight used today, along with a layer of selectively patterned red and green quantum dots. This replaces brightness-restricting color filters with quantum dots producing red, green and blue light. Photons from the blue light source excite the quantum dots, producing a 2.8x improvement in system efficiency over color filters.
Aside from QLED, Nanosys is producing a quantum dot technology delivering a color gamut approaching the Rec. 2020 standard.
For Samsung, Nanosys has developed quantum dots free of toxic cadmium to comply with the European Union’s Restriction of Hazardous Substances (RoHS) directive. But this approach makes it difficult to meet Rec. 2020, because the blue and green primaries for Rec. 2020 are very close together. Nanosys has therefore developed Hyperion quantum dots, which mix cadmium-free red and cadmium-based green quantum dots in a single film that produces a narrower green emission. The result keeps cadmium levels below 100 ppm, meeting the RoHS directive.
Looking beyond the granular details of technology specs, what all this adds up to is a more dynamic, more visceral experience for consumers when viewing a demo of the latest TVs, creating an “I didn’t know I wanted this but now that I saw it I need this” dynamic.
If 2017 is the year Ultra HD truly arrived, 2018 should be the year Ultra HD becomes ubiquitous.