The latest and greatest thing in TVs is the new crop of models with 8K resolution. That’s 7680×4320 pixels, or four times as many pixels as a 4K TV and 16 times as many as a 1080p TV. Do these new 8K TVs offer a differentiated experience? Can you see the difference? Is there 8K content, and does it matter? Is it worth the higher price? These are all valid questions to ask if you are in the market for a new TV. In this article, I’ll address some of the common misconceptions around 8K, discuss what is included in these new 8K TVs and conclude with the items to look for if you are considering buying one.
You Can’t See the Difference – or Can You?
Much of the criticism of 8K TV has been that you can’t see the extra pixels when viewing the TV from typical viewing distances of 8-10 feet. This assessment is based upon a standard measure of visual acuity – i.e. how well we can see based on the Snellen eye chart. The argument is that adjacent pixels in an 8K TV are so close together that we simply can’t resolve them. While the science behind this conclusion is solid, human vision is far more complex than the simple acuity metric suggests. The reality is that you can see the difference between a 4K and an 8K TV image.
NHK has done research comparing displayed images at various CPDs (cycles per degree, or line pairs per degree) to real objects. The goal was to determine at what CPD the viewer judges the displayed image to look like the real object. Their research suggests that at around 150 CPD, displayed images become indistinguishable from the real object. This clearly suggests that there is more to vision than simple acuity or the Snellen eye chart test, where 20/20 vision corresponds to a CPD of 30.
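To put these thresholds in context, here is a quick back-of-the-envelope calculation (a sketch assuming a 16:9 panel and small-angle geometry, not NHK’s methodology) of the CPD that 65” 4K and 8K TVs deliver at a 9-foot viewing distance:

```python
import math

def cycles_per_degree(diagonal_in, horiz_pixels, distance_in):
    """CPD (line pairs per degree) a 16:9 display delivers at a viewing distance."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)      # panel width in inches
    pixel_pitch = width_in / horiz_pixels                # inches per pixel
    deg_per_pixel = math.degrees(math.atan(pixel_pitch / distance_in))
    pixels_per_degree = 1.0 / deg_per_pixel
    return pixels_per_degree / 2.0                       # one cycle = a line pair = 2 pixels

d = 9 * 12  # 9 feet in inches
print(f"65-inch 4K at 9 ft: {cycles_per_degree(65, 3840, d):.0f} CPD")  # ~64 CPD
print(f"65-inch 8K at 9 ft: {cycles_per_degree(65, 7680, d):.0f} CPD")  # ~128 CPD
```

Both displays already exceed the 30 CPD of 20/20 acuity at this distance, but only the 8K set approaches the ~150 CPD “realness” threshold NHK’s research points to.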
If simple visual acuity does not fully describe our ability to “see” resolution, what are the mechanisms? There appear to be two other factors at play: Vernier acuity and the brain. Vernier acuity, or hyperacuity, refers to the ability to discern slight misalignments between lines – an ability that should not be possible according to simple acuity descriptions of human vision. Hyperacuity means we can perceive fine details even at fairly long viewing distances.
A classic way to demonstrate this is to show two line pairs. One pair has two black lines on a light background, perfectly parallel. In the second pair, one line is misaligned by just a single pixel, and many people can see this – even at some distance. Because modern displays are pixelated, a line that is not perfectly horizontal or vertical will show stair-stepping. While our normal visual acuity may not detect this, our Vernier acuity can. As a result, if 4K and 8K images are displayed on 4K and 8K displays of the same size and viewed at the same distance – all other factors being equal – the 8K image will look sharper or crisper than the 4K image, even at 8-10 feet. That’s because the pixel pitch of the 8K TV is half that of the 4K TV, so there will be reduced stair-stepping – i.e. the 8K display will render a smoother line than the 4K display.
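A rough calculation makes the point. The visual angle of a one-pixel stair-step can be compared against the commonly cited thresholds of roughly 60 arcseconds (1 arcminute) for normal acuity and a few arcseconds for Vernier acuity (both threshold figures are approximate, and the 16:9 geometry is an assumption of this sketch):

```python
import math

def pixel_arcsec(diagonal_in, horiz_pixels, distance_in):
    """Visual angle subtended by one pixel of a 16:9 display, in arcseconds."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    pitch = width_in / horiz_pixels
    return math.degrees(math.atan(pitch / distance_in)) * 3600

d = 9 * 12  # 9 feet in inches
for name, px in (("4K", 3840), ("8K", 7680)):
    print(f"65-inch {name}: one-pixel offset = {pixel_arcsec(65, px, d):.0f} arcsec")
# Both offsets fall below the ~60 arcsec simple-acuity limit (pixels unresolvable),
# yet both exceed the few-arcsec Vernier threshold -- and the 8K step is half the size.
```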
It would also appear that the brain processes the data sent from the eyes. In the above example, the reduced stair-stepping is reinforced in the brain to create a more analog-like image and hence an increased sense of “realness.” Given a high-resolution input, the brain does a good job of filling in any missing details (and has to work less hard than with lower-resolution inputs). This processing also creates an increased sense of depth.
In other words, simple visual acuity does not tell the whole story on why 8K images look better than 4K even at longer distances. Higher order processes come into play that increase the sense of depth and realness of such images.
Two recent studies confirm this. In one study by Dr. YungKyong Park of Ewha Womans University in Seoul, side-by-side 4K and 8K 65” TVs were set up and calibrated to 500 nits of peak luminance. Observers were pre-tested to confirm that their simple acuity was 20/20 and that they had normal color vision. All 120 observers sat 9 feet from the displays in a dark room – typical nighttime TV viewing conditions. All participants were shown the same 16 images and 3 videos representing a diverse range of visuals.
In the study, overall 8K display performance was rated 35% higher than 4K, with perceived image quality increasing by 30% and depth perception by 60%.
What’s most fascinating is that rather than pointing out the increased sharpness or contrast associated with higher resolution, participants highlighted differences related to sensory perception – i.e. objects looked cooler, warmer, more delicious, heavier.
The researchers concluded that this hyperrealism effect connects the perceptual aspects of the image (contrast, color expression and resolution) with the cognitive aspects perceived by the brain (weight, temperature, reality, space, depth and high image quality). It is striking that the increased resolution has such a strong emotional impact.
A separate study by Dr. Kyoung-Min Lee of Seoul National University looked at the effects of super-resolution (8K) displays from the point of view of the brain. His main conclusions were:
• Super-resolution reduces information loss thus creating a more realistic image
• Super-resolution displays increase the dynamic signal-to-noise ratio reducing cognitive loading and increasing the immersive effect
Having more pixels reduces jaggies in lines and moiré effects, leading to naturally sharper edges. These sharper edges make it easier to distinguish separate objects, allowing for an increased sense of depth. This effect is evident on native 8K content and on upscaled/restored content as well.
But more pixels are also very beneficial for creating more realistic colors. Subtle hue changes can result in banding even when there is sufficient bit depth. Having four pixels over which to change the hue instead of one leads to a smoother and more lifelike image. This is illustrated in the graphic below. The left side shows slight changes in hue while the right side shows the 8-bit RGB values for each color.
The top image represents a 6×6 matrix of pixels, the 8K case. A slight change in green hue runs left to right, with a slight change in red hue from top to bottom. The bottom case is a 4K display, where each pixel in this 3×3 matrix is the average of 4 pixels from the 8K version. This shows there can be more banding on a 4K display than on an 8K display. A smart image restoration algorithm can try to reproduce the more subtle hue changes of the 8K example from a 4K input source.
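The averaging effect can be sketched in a few lines of NumPy (a toy illustration with made-up code values, not the article’s actual graphic):

```python
import numpy as np

# A 6x6 patch ("8K" case) whose green channel ramps by one 8-bit code per
# column -- the subtlest hue change an 8-bit pipeline can express.
patch_8k = np.tile(np.arange(100, 106, dtype=np.float64), (6, 1))

# The "4K" version: each 2x2 block of pixels collapses to one averaged pixel.
patch_4k = patch_8k.reshape(3, 2, 3, 2).mean(axis=(1, 3)).round().astype(int)

print(patch_8k[0].astype(int))  # [100 101 102 103 104 105] -- six distinct hue steps
print(patch_4k[0])              # [100 102 104] -- only three steps survive
```

Half of the distinct hue steps merge after averaging, which is exactly the coarser gradation that shows up as banding on the lower-resolution display.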
The two images below are screen shots from a Samsung 8K TV with and without the image restoration algorithms applied to show the reduction in banding artifacts for subtle hue changes.
Glints or specular reflections also add realism to an image, but these are often tiny parts of the picture. Being able to define a high-luminance glint in finer detail on an 8K vs. a 4K display allows such subtle components to be more accurately reproduced, increasing realism.
All of these benefits are resolution dependent and apply even when comparing 4K and 8K high dynamic range (HDR) images. Yes, some of these benefits are subtle, but the brain is remarkable and can process such subtle improvements to create a more realistic and immersive image with more emotional impact.
What About 8K Content?
It is true that there is limited native 8K content today, but the same was true of native 4K content 5 years ago. Today, there is a decent amount of native 4K HDR content available, and I believe 8K content will come along at a similar pace in the coming years.
Japan is already broadcasting 8K content on an 8K satellite channel and is gearing up to broadcast the full 2020 Summer Olympics from Tokyo in 8K. With this precedent set, it will be hard for major sporting events not to be captured in 8K going forward. China is expected to be a major market for 8K TVs, so content creation is expected to heat up there soon, and in Korea as well. In Europe, Rakuten in Spain has announced the first 8K streaming service, and SES Astra may be offering an 8K satellite service soon as well.
Streaming service providers led the adoption of 4K and I expect them to lead with the adoption of 8K. None have made public announcements yet, but with the introduction of improved compression technologies in the next 2 years allowing 8K streaming at acceptable data rates, don’t be surprised to see these companies vying to be the leader in 8K streaming. And, Sony has announced that the next PlayStation platform, PS-5, will be 8K capable. This is also expected to arrive in 2020.
As you can see, all the pieces are coming together to drive creation of 8K content in 2020 and beyond. But the reality is that almost all content for the next 2 years is going to be in 4K and 2K resolution, so isn’t that a problem? In short, no.
The new 8K TVs have very powerful upscaling engines that create an image with more pixels than the incoming image contains. Upscaling has been done for decades in all kinds of devices where the incoming image resolution does not match the display resolution. But the term is becoming obsolete because the algorithms that do this are now much more sophisticated. A better term may be image restoration.
The new techniques use machine learning (ML) and artificial intelligence (AI) algorithms that go well beyond simple algorithms and nearest neighbor scaling concepts. ML techniques use computers to classify types of images and compare low- and high-resolution versions of them to develop tools to allow the computer to then reconstruct a high-fidelity image from a low-fidelity one. AI or “Deep Learning” takes this a step further by adding a feedback loop allowing the system to learn which reconstructions were better than others. The big benefit: the algorithms can get better over time.
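As a reference point, classic nearest-neighbor scaling – the baseline these ML and AI approaches improve upon – is trivially simple, which is exactly why it preserves jaggies rather than removing them (a minimal sketch):

```python
import numpy as np

def upscale_nearest(img: np.ndarray, factor: int = 2) -> np.ndarray:
    """Nearest-neighbor upscaling: every source pixel is duplicated into a
    factor x factor block, so stair-step edges are copied, not smoothed."""
    return np.kron(img, np.ones((factor, factor), dtype=img.dtype))

# A 2x2 checkerboard becomes a 4x4 checkerboard with the same hard edges.
src = np.array([[0, 255],
                [255, 0]], dtype=np.uint8)
print(upscale_nearest(src))
```

ML-based restoration replaces this pixel duplication with learned reconstruction, which is where the sharper edges and reduced noise come from.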
Upscaling and image restoration is a very important feature for an 8K TV, as it produces images with sharper edges, more texture, reduced jaggies and reduced noise. Many of the providers of 8K TVs will implement some form of this image restoration technology, but not all implementations will be equal. Samsung has been showing its image restoration capabilities for some time, highlighting the ability to improve 2K and 4K images to 8K fidelity while also reducing image noise and eliminating compression artifacts.
Market Dynamics will Drive Price Declines of 8K TVs
At CES 2019, most of the major TV brands showed or announced 8K TV products – not just as prototypes as many have done in the past, but as real mass-production TVs for introduction in 2019. The table below shows the anticipated 8K TVs from the major brands for the U.S. market.
All of these TVs, with the exception of the LG 88”, are LCD models. The 88” LG 8K TV is going to be an OLED model.
CES also revealed that TV brands are already selling or plan to sell 8K TVs in China, Japan, Europe and other markets as shown in the table below.
Sharp is selling its 8K TVs in Japan and maybe China while Hisense, TCL, Changhong and Konka all claimed to be selling 8K TVs in China now or later in 2019. All of these are LCD-based. Skyworth showed an 88” OLED 8K TV (panel from LG Display, of course) but did not announce plans for commercialization.
This broad array of a new class of products from the major providers was similar to CES 2014, when 4K was the key buzz of the show. There will be many parallels between the roll out of 8K over the next 5 years and the way 4K rolled out over the last 5 years. One parallel is the impact panel fab capacity has on TV pricing and marketing push. For example, over the last 5 years 4K TV sales have skyrocketed. In a few short years, almost every TV over 50” sold in developed countries is 4K resolution. What helped drive this was an oversupply of G8 fab capacity that led to big price reductions in 4K panels, especially at the 55” size, which is the most optimized size for G8 fabs.
The same thing is about to happen with the latest round of panel fab investment, which has focused on 10.5G-class fabs optimized for 65” and 75” panels. This round is led by the Chinese and has reached record levels of investment and capacity. What is coming is a tsunami of large-sized panels, which will incentivize TV makers to offer them at increasingly attractive price points.
According to market researcher Display Supply Chain Consultants, there are now three major 10.5G-class fabs coming on-line in 2019, with at least two more planned for 2020 and others possible as well (see table below). All are in China except the one OLED fab planned for Korea (which they see as on-again/off-again).
Less expensive 8K panels and the desire of TV brands to offer something new and exciting means we will see a big marketing push from the TV brands to sell 8K TVs in these larger screen sizes to start. If history repeats, this will lead to major price reductions for 65” and 75” TVs in the coming years – and many of these sets will be 8K resolution.
Will consumers buy these bigger sets? Most market researchers expect 8K TVs to continue the trend toward consumer adoption of larger screen sizes and actually create a surge of sales of TVs above 65”.
What does all this tell you? All the major brands will be selling 8K TVs and the panel supply of these larger screen sizes is increasing rapidly. The result? Prices will come down nicely – and maybe very quickly.
Looking at the 8K TV Specs
So far we have learned that there is cognitive and emotional value in a good 8K image. Native 8K content will come, but in the meantime AI-based image restoration algorithms are doing an excellent job of creating 8K images from 4K and even 2K sources. Prices on the first 8K sets are high, but market dynamics will drive these prices down quite fast. So, is it time to consider buying an 8K TV? If you have the budget and you need or want the latest TV now, an 8K TV purchase can be a good future-proof option. While the specifications for this wide range of 8K TVs are not yet fully available, some information has been revealed. Rather than try to compare and contrast various models, it may be better to make a list of items to look for in an 8K TV. This can be your guide to picking that 8K TV now or in a year or two.
Native 8K resolution
It is best to look for an 8K TV that does not use sub-pixel rendering, in which sub-pixels are shared to create full-color pixels. A traditional RGB pixel architecture with 7680×4320 pixels is best. Each pixel should be able to show the full range of colors and luminance of the display.
LCD TVs that use sub-pixel rendering will have a hard time displaying a single-pixel-wide grid of black and white (or colored) lines. Since it is the fine details and sharp edges of an 8K display that make it so impressive, sub-pixel rendering should be avoided.
Color gamut
This describes the range of colors the TV can produce. The standard for Standard Dynamic Range (SDR) content is Rec. 709, while the 4K and 8K color gamut is specified in the BT.2020 document from the ITU. While the RGB primaries of BT.2020 are very wide, allowing all natural colors to be displayed, all practical implementations of 4K and 8K TVs target the DCI-P3 color gamut, which sits between Rec. 709 and BT.2020 in color coverage. DCI-P3 is the color gamut used for cinema movies, so there was already a lot of content that could be easily displayed in this gamut – a sensible choice. All 8K TVs should be able to meet the Rec. 709 requirement, but some may fall a bit short of the full DCI-P3 gamut. The closer to 100% coverage, the better.
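For a rough sense of how these gamuts nest, one can compare the areas of their primary triangles in CIE 1931 xy chromaticity space (a simple sketch; perceptually uniform comparisons are usually done in u′v′ space instead and give somewhat different percentages):

```python
# Chromaticity (x, y) of the R, G, B primaries for each standard.
PRIMARIES = {
    "Rec.709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def tri_area(pts):
    """Shoelace formula for the area of a triangle in xy space."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

bt2020 = tri_area(PRIMARIES["BT.2020"])
for name in ("Rec.709", "DCI-P3"):
    print(f"{name}: {tri_area(PRIMARIES[name]) / bt2020:.0%} of BT.2020 area")
```

The output (roughly 53% for Rec. 709 and 72% for DCI-P3) confirms the article’s point that P3 sits between the two standards.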
Peak luminance
There is no agreed-upon way to measure peak luminance, especially in HDR mode. Some sets claim a very high peak luminance, so the questions to ask are: how big a patch or window (percentage of the screen area) was used for the measurement, and for how long can the set hold this peak level?
In my opinion, this peak level should be sustainable indefinitely, and the percentage area should also be stated. The area does not need to be very large – 1% may be just fine, as highlights are often specular reflections (like glints of sunlight off a car fender) that are very small in area but add a high degree of realism to the image.
Average luminance
Again, there is no standard way to measure this. My suggestion is to specify the luminance of a full-screen white image that can be held indefinitely. Caveats to watch out for: what was the white point setting (D65?) and what mode was the TV in (standard, dynamic or movie)? Higher color temperatures will increase the average luminance, as will the standard and dynamic modes. In content mastering, the term diffuse white is often used as a reference point for setting the average luminance level of the content (diffuse means the light intensity is the same in every direction). The TV’s average luminance level does not have to match what the content creator chose for diffuse white, but it does tell you how bright the TV will be for, say, a snow scene.
Black level
The lower the black level the better, in general, but this is not often specified by the TV maker. A maker may publish a contrast number, but unless you know exactly how the contrast was measured, you may not be able to deduce the black level. There are a number of ways to characterize the black level of a TV, but it matters most when you view content in a dark room. A poor black level will be evident as an inability to discern subtle dark tones. In practice, looking at dark content in a dark room to see if these tones are clearly visible may be the best way to evaluate black level performance at this stage.
Color volume
This describes the ability of each color to be displayed over the range of luminance the display can produce. Think of it as evaluating the deepest saturated colors at every combination of hue and luminance level. Methods to measure this for HDR-capable displays are still evolving, so I would not put too much weight on this specification.
Nevertheless, it is quite important, as it is a big differentiator between LCD TVs and OLED TVs. All TVs have a power budget, which means they can only produce so much peak luminance. A TV may be able to hold a peak luminance for a few seconds or minutes before the heating becomes too much and the luminance must be dimmed. All OLED TVs have a peak luminance of under 1,000 nits and will have to dim or tone-map pixels at or above that value. That doesn’t mean the image is bad; it is just different.
LCD TVs have the advantage of being able to produce much brighter images by adding more LEDs to the backlight and carefully managing their thermal properties. That means if a pixel or pixel area calls for yellow at 2,000 nits, the LCD TV may be able to display it as the content creator intended. An OLED TV would need to show this at a much lower luminance level, so it may not be as impactful (and does not reproduce the signal exactly as the creator intended).
Image restoration
As mentioned previously, new image restoration techniques allow these algorithms to improve over time, making them vastly superior to older-generation upscaling engines. But there are still many ways these algorithms can be trained and can improve, and not all TV brands will have superior image restoration methods. As a result, take a good look at how the 8K TV handles not only 4K content, but 2K content as well. Does the image look sharper and crisper, with more detail and more smoothness in lines and hues? And, try to test it with real-world sources – like the content coming from your cable box or streaming service, where compression artifacts are likely to be present too. Is the image restoration attempting to fix some of this as well?
HDR support
HDR, or High Dynamic Range, is a technology introduced with some 4K TVs that will also be part of 8K TVs. HDR requires specially mastered content that expands the range of luminance and colors, allowing the display of highlights with higher luminance and detail, of darker tones in the shadows, and of brighter, more vivid colors. Each “flavor” of HDR requires a different transfer curve – i.e. a way to interpret the digital code values of each pixel as a color and luminance level.
For live broadcast, the Hybrid Log Gamma (HLG) transfer curve is mostly preferred, while file-based workflows use a Perceptual Quantizer (PQ) transfer curve. NHK, which is already broadcasting in 8K in Japan, uses the HLG HDR format.
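The PQ curve is fully specified in SMPTE ST 2084, so it can be written down directly. This sketch maps a normalized 0-1 code value to absolute luminance in nits (HLG, by contrast, is a relative, scene-referred curve and is not shown here):

```python
# Constants from SMPTE ST 2084 (the PQ EOTF).
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(e):
    """Map a normalized PQ code value e in [0, 1] to luminance in nits."""
    ep = e ** (1 / M2)
    return 10000 * (max(ep - C1, 0) / (C2 - C3 * ep)) ** (1 / M1)

print(f"{pq_eotf(0.5):.1f} nits")  # mid-scale code -> ~92 nits
print(f"{pq_eotf(1.0):.0f} nits")  # full code -> 10000 nits
```

Note how strongly nonlinear the curve is: half the code range covers less than 1% of the 10,000-nit peak, concentrating code values where the eye is most sensitive.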
A second distinction is the use of metadata, which can be static or dynamic. HLG-based HDR content does not inherently carry any metadata, but it must have a flag to signal the TV that it is HLG content. HDR10 PQ-based content has static metadata, which is basic information about the colors and luminance levels for the entire piece of content.
Any 8K TV must support HLG and HDR10. Support for dynamic metadata, which allows picture optimization on a scene-by-scene basis, is offered in the form of HDR10+ or Dolby Vision. These are competing formats that offer a clearly improved image, especially on TVs with mid-tier performance levels. Both also require additional mastering by the content creator. They are not mandatory for a good 8K image but should be considered aspirational for the best possible image.
HDMI 2.1
HDMI cables provide the interface between a playback device and the TV. The newest version, HDMI 2.1, is a suite of new capabilities that includes a faster data rate (from 18 Gbps to 48 Gbps), eARC, gaming-centric features, Display Stream Compression (DSC) and more. HDMI 2.0 has sufficient bandwidth for 8K/30fps at 4:2:0 color sampling, but HDMI 2.1 is needed for 8K/60p at 4:2:0, the accepted distribution format, over a single cable.
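The bandwidth arithmetic behind this is straightforward (a sketch counting raw pixel payload only; blanking intervals and link encoding add overhead on top, so real link requirements are somewhat higher):

```python
def video_gbps(h, v, fps, bits_per_sample=10, chroma="4:2:0"):
    """Approximate raw pixel payload rate in Gbit/s for a video format."""
    # 4:2:0 subsampling averages 1.5 samples per pixel (1 luma + 0.5 chroma).
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return h * v * fps * bits_per_sample * samples_per_pixel / 1e9

print(f"8K/30 4:2:0 10-bit: {video_gbps(7680, 4320, 30):.1f} Gbps")  # ~14.9 Gbps
print(f"8K/60 4:2:0 10-bit: {video_gbps(7680, 4320, 60):.1f} Gbps")  # ~29.9 Gbps
```

At roughly 30 Gbps of payload, 8K/60p 4:2:0 clearly exceeds what an 18 Gbps HDMI 2.0 link can carry but fits comfortably within HDMI 2.1’s 48 Gbps.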
The HDMI suite of specifications has been approved and silicon is ready, but a compliance test plan for all the features is not complete at this time. That means some 8K TVs may have HDMI 2.1 transceivers inside them (or can be upgraded to this), but the full functionality can’t be turned on until the test plan is complete and the TV tested for compliance. This is expected to happen sometime in 2019.
If you are buying an 8K TV soon, ask if the internal processing can support 8K/60p and what the plan is for upgrading to full HDMI 2.1, 48 Gbps transceivers. You might also ask about support for other 2.1 features if these are important as well.
Content arriving at the 8K TV over an HDMI 2.0 interface is uncompressed. HDMI 2.1 allows for light compression via the Display Stream Compression codec. No content that I am aware of is being delivered with Display Stream Compression, but it may be in the future. Having a decoder inside the 8K TV that can decode Display Stream Compression is a nice-to-have and a good future-proofing option, but it should not be considered mandatory.
Bit depth
Bit depth refers to the number of discrete digital steps allocated to the red, green and blue elements of the image over the full range of luminance values. SDR displays were designed for a limited luminance range and the Rec. 709 color gamut. For this content, 8 bits per color is sufficient to reproduce colors over the full gamut and luminance range without visible banding or contouring.
With HDR content (4K or 8K), the range of luminance values and colors increases substantially. This expanded range requires a minimum of 10 bits per color to avoid banding or contouring issues.
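A toy linear calculation shows why the extra bits matter (real HDR systems use the nonlinear PQ or HLG curves, which distribute code values far more cleverly than this linear sketch):

```python
def step_nits(bits, peak_nits):
    """Luminance jump between adjacent code values in a linear encoding."""
    return peak_nits / (2 ** bits - 1)

print(f"8-bit over 100 nits (SDR):   {step_nits(8, 100):.3f} nits/step")
print(f"8-bit over 1000 nits (HDR):  {step_nits(8, 1000):.3f} nits/step")   # ~10x coarser
print(f"10-bit over 1000 nits (HDR): {step_nits(10, 1000):.3f} nits/step")
```

Stretching 8-bit code values over a ten-times-larger luminance range makes each step roughly ten times coarser, which is what shows up as banding; moving to 10 bits restores a step size close to the SDR case.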
Most 4K HDR sets now have 10-bit-per-color inputs and processing pipelines, but they may default to 8-bit, or 8-bit plus dithering, at the panel. The TV maker won’t tell you this, but it may be evident as increased banding on darker content. For an 8K TV, ask whether the 10-bit pipeline is maintained from input to display, if that information is available. This will help ensure the best image quality.
Codec support
All content is distributed in a compressed format. For 4K distribution (and increasingly for lower-resolution content), the HEVC compression codec is used most often (Google also offers its VP9 codec). An HEVC decoder is therefore a mandatory requirement for an 8K TV. It should be available to decode content delivered over an Ethernet port or WiFi adaptor from a broadband source (i.e. Netflix, Amazon, Hulu, YouTube, etc.) or from a USB flash drive. These may be the first ways that end users will be able to see native 8K content, and it will probably be encoded in the HEVC format.
However, an alternative codec, AV1, is now available, offering similar performance to HEVC. Having support for this codec would be a good idea, but I put it in the nice-to-have, not mandatory, category.
In development are next-generation versions of these codecs: Versatile Video Codec (VVC) as the successor to HEVC, and AV2 as the successor to AV1. It is unclear whether new hardware decoders will be needed or whether existing 8K TV platforms can be upgraded. These new codecs promise to halve the data rates for delivering 8K content to the home, so they will be important enablers for the whole ecosystem.
The pieces for the 8K transition are in place and progressing in ways that mirror the way 4K was adopted. There remain headwinds for the wide adoption of 8K, but similar headwinds were faced by 4K 5 years ago – and look where we are today. I expect the 8K roll out will likely be a little slower and less comprehensive than the 4K roll out. Nevertheless, 8K images and even images restored from lower resolution content are already compelling today and they will only get better and more abundant in the future.
About the Author:
Chris Chinnock is the founder and president of Insight Media. His areas of focus include the 4K ecosystem, laser displays, 3D displays, advanced imaging technology (HDR, HFR, WCG) and emerging technologies and products. His clients are in the broadcast, cinema, ProAV and display industries. Chris has helped to guide Insight Media, one of the most respected analyst firms in the industry, and contributes to many of the reports, consulting projects and events that the company provides. He holds a BSEE from the University of Colorado and, prior to 1993, worked for companies such as General Electric, Honeywell, MIT Lincoln Labs and Barnes Engineering.
To join Insight Media email list visit: https://visitor.r20.constantcontact.com/manage/optin?v=001UsQCyUJjA_uIWGtLYOz9glGaXrgoXeEknGBk5znzUVMWo9qK0TMAjLwShwmSCazwf0fatcTI37M807-_JixGDWNRP-BukCGjtAzvT2w7gBVZWIf_TUvo2jdfi8O0XB7Bvj5tjrYuZj2s6dd2hKQ662nRjlV4hwPEwhRbeYCjP_k7W2W4OaJENg%3D%3D