A super easy-to-understand explanation of why 4K broadcasts look dark: it's not a TV problem (Masaichi Honda) – Engadget Japan

A Twitter follower sent me the comment, "Mr. Honda, you sound pretty angry" — and they may well be right.

"4K broadcasts look dark and bad, but that's because the TV isn't good enough."

This theory has taken on a life of its own, and from various quarters I hear people who don't know the technical background saying, "Stay away from 4K — the picture is dark." Depending on how the set is adjusted, it is true that commercial key-station programming, which often airs the same content on both BS and BS4K, can look dark on BS4K.

However, this is not because the TV's performance is insufficient. The picture looks dark simply because the broadcaster is transmitting a dark picture.

I have covered the technical details on the audio & visual news site AV Watch and on my YouTube channel, but since those explanations leaned heavily on technical background, this time I'd like to explain the issue from a different angle.


What is HDR in the first place?

4K broadcasting has the advantages of high resolution and low compression noise, but in fact another factor does the most to improve the impression of picture quality: HDR (High Dynamic Range). It is a technology that can express a far wider range between light and dark than before.

Going from full HD to 4K quadruples the pixel count and raises the definition; HDR is easier to understand if you think of it as giving each pixel more "paints" to work with. Because more gradations can be expressed, realism improves greatly.

If there is a "high" dynamic range, you might wonder whether there is a "low" one. In fact the counterpart is not "LDR" but "SDR (Standard Dynamic Range)". So what is this "standard" in the first place?

Actually, SDR was defined based on the standard characteristics of the CRT (cathode-ray tube). Some people these days may not even know CRTs: analog devices that make phosphors glow by firing an electron beam at them. Being analog, they have a saturation characteristic (the response worsens as it approaches maximum brightness), so brightness naturally rolls off — rising ever more gradually — as the input signal approaches its upper limit.

▲ Brightness ranges of SDR and HDR (figure)

In the figure above, the blue area is SDR. SDR images are created on a professional monitor adjusted to a maximum of 100 nits (a unit of brightness) in a completely dark environment (these days the adjustment is done with equipment that imitates CRT characteristics).

When CRT TVs were mainstream, or while LCD and plasma performance was still low, this caused no problem. But once the backlights of LCD TVs evolved to produce far higher brightness, a dynamic range based on CRT characteristics could no longer bring out the performance of modern display devices (LCD, and now OLED as well).

So it was decided to break with the past, which had been constrained by CRT performance, and expand the dynamic range so that the capabilities of the latest displays could be used. This is HDR.
"Let's put everything in for now!" is the basic idea of HDR
In the era of analog television, the display device was assumed to be a CRT and nothing else. Since then the LCD era has arrived, plasma had its day, and OLED is now in practical use. It makes no sense to define a video standard around a particular display device.
Therefore, the basic concept of the HDR standards is "record all of the dynamic-range information, and let each device adjust the picture to match its own display performance."
▲ Brightness distribution graph (figure)

It is said that the human eye can capture a dynamic range from 0 to 10,000 nits. Strictly speaking, the iris mechanism widens this by up to a factor of ten, giving a range up to 100,000 nits, but what the retina captures at any one moment is about 10,000 nits — so the HDR video standard records up to 10,000 nits. If information is lost at recording time, it can never be reproduced later. That is the idea.
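This 10,000-nit ceiling is baked into the PQ (SMPTE ST 2084) transfer function used by UHD Blu-ray and other absolute-brightness HDR formats. A minimal Python sketch (the function name is mine; the constants are the exact fractions from the standard):

```python
# PQ (SMPTE ST 2084) EOTF: non-linear code value (0..1) -> absolute luminance in nits.
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # ~78.84
C1 = 3424 / 4096           # ~0.8359
C2 = 2413 / 4096 * 32      # ~18.85
C3 = 2392 / 4096 * 32      # ~18.69

def pq_eotf(signal: float) -> float:
    """Decode a PQ code value into display luminance (cd/m^2, i.e. nits)."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# Full signal decodes to exactly 10,000 nits -- the ceiling mentioned above.
print(round(pq_eotf(1.0)))   # 10000
print(round(pq_eotf(0.0)))   # 0
```

Note that the curve maps code values to absolute luminance: a given signal always means the same number of nits, regardless of the display.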

However, without some guideline, creators cannot settle on how to paint the picture — especially for movies and other finished works. For this reason, a loose convention has gradually emerged of grading video on the assumption of reproduction on a display with a maximum of about 1000 nits.

I suspect this is the number behind the theory that "a TV that can't reach 1000 nits is disqualified as a 4K TV." This loose rule is something the UHD Alliance indicated while developing the UHD Blu-ray standard — in other words, it is an industry convention. It also reflects the fact that Sony's BVM-X300, the professional master monitor used as the de facto industry standard, could display up to 1000 nits.

…but the content actually contains information above 1000 nits, and the figure means little more than "a brightness engineers use as a reference when checking."

So, as the graph above shows, all the information — including the green, yellow, and red areas — is recorded, but the range of brightness that can actually be displayed varies with the device's technology and grade.

A display capable of around 300–400 nits can express up to the green area; one capable of around 1000 nits can express up to the yellow area. Future display technologies we haven't seen yet (whether it would even be meaningful is another question) might display up to the red area.
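How a less capable panel shows such content comes down to tone mapping: brightness the panel cannot reach is rolled off rather than hard-clipped. The sketch below is a toy knee curve of my own, not any manufacturer's actual algorithm:

```python
def tone_map(luminance: float, panel_peak: float, knee: float = 0.75) -> float:
    """Compress scene luminance (nits) into a panel's displayable range.

    Below `knee * panel_peak` the signal passes through unchanged;
    above it, highlights are rolled off smoothly toward panel_peak
    instead of being hard-clipped.
    """
    threshold = knee * panel_peak
    if luminance <= threshold:
        return luminance
    headroom = panel_peak - threshold
    excess = luminance - threshold
    # Asymptotically approach panel_peak as the input grows.
    return threshold + headroom * excess / (excess + headroom)

# A 400-nit panel (green area) vs. a 1000-nit panel (yellow area):
print(tone_map(2000.0, 400.0))   # ~394.4 -- just under 400
print(tone_map(2000.0, 1000.0))  # ~958.3 -- just under 1000
```

Real TVs use proprietary curves, but the principle is the same: the brighter the panel, the more of the recorded highlight information survives.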

How is the actual video recorded?

Now, some readers may think the whole screen could simply be set to 1000 nits, but that would be dazzlingly bright — and recording the entire screen at 1000 nits is not realistic anyway. 1000 nits is only a mastering guideline; how the material is displayed should be decided according to the viewing environment.

For example, you can enjoy HDR movies at a Dolby Cinema theater, but the maximum there is 108 nits. An ordinary digital-cinema screening is 48 nits, so Dolby Cinema is more than twice as bright — yet still considerably darker than the values used for TV.

This is because, in addition to the large screen and the large amount of light, the movie theater is completely dark. In darkness the human iris — the eye's aperture — opens, so we perceive brightness even from little light. And of course, some video is mastered specifically for theaters. In fact, 1000 nits is by no means always necessary to enjoy HDR video.

Next, look at UHD Blu-ray. Although the video is mastered with 1000 nits as a guideline, most subjects fall within the blue area of the graph above, and even the bright highlights are mostly drawn up to about 200 nits (the green area). Because titles are produced while checking playback on home TVs, this causes no problem.

There is a movie often used as a benchmark HDR title: "Sully" (released in Japan as "Miracle on the Hudson"). In one scene, Tom Hanks runs through New York's Times Square, and much of the electronic signage there exceeds 1000 nits — yet on many high-end TVs, including OLEDs, the signage comes through solidly, and the picture doesn't look dark at all.

Even in movie mode, which is typically viewed in a dark room, the picture feels bright — and because the screen is larger than the master monitor, it can feel brighter still.

Even a low-priced TV that cannot express the detail in the signage has no trouble with Tom Hanks himself, who is rendered at a much lower brightness. Because the main subject is depicted correctly, nothing looks wrong.

Why the commercial broadcasters' HLG situation differs from ordinary HDR

All of the HDR content discussed so far expresses brightness as absolute values: during grading, the creators decide exactly how bright each element should appear before the work ships.

▲ HLG brightness distribution
However, broadcast content expresses HDR using a standard called HLG (Hybrid Log-Gamma), which also supports TVs that do not support HDR. HLG places SDR's reference white at the 75% signal level and packs the brighter information above it as HDR. The standard was established jointly by NHK and the BBC so that even if an HLG broadcast is displayed as-is on an SDR TV, it can be watched without problems after a slight brightness adjustment.
Think of HLG as having been developed precisely so that HDR broadcasts can be watched without problems on SDR TVs. Even live footage such as sports, which cannot be finely graded, can be enjoyed on both SDR and HDR TVs, so it is well suited to broadcasting.
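The HLG curve defined in ITU-R BT.2100 is literally a hybrid: a conventional gamma-like square root in the lower range (the part SDR TVs can show passably) and a logarithm in the highlights. A sketch in Python (the function name is mine; the constants are from the standard):

```python
import math

# HLG OETF constants from ITU-R BT.2100.
A = 0.17883277
B = 0.28466892   # = 1 - 4*A
C = 0.55991073   # = 0.5 - A*ln(4*A)

def hlg_oetf(e: float) -> float:
    """Scene-linear light (0..1) -> HLG signal (0..1)."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # gamma-like region, SDR-compatible
    return A * math.log(12 * e - B) + C  # logarithmic highlight region

print(round(hlg_oetf(1 / 12), 3))  # 0.5  (the two branches join here)
print(round(hlg_oetf(1.0), 3))     # 1.0  (full scene light -> full signal)
```

The log region is what squeezes the HDR highlights into the top of the signal range without breaking compatibility below.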
The problem is how to broadcast an SDR program.
For NHK's 4K and 8K broadcasts there is no problem, because SDR programs and HDR programs are signalled with broadcast flags. Commercial 4K broadcasting, however, is fixed to HDR (HLG) at all times — even though most commercials and most program content are SDR. Commercials are produced for terrestrial and BS broadcasting, so they are SDR, and most programs are simulcast on BS and BS4K, so their dynamic range is SDR as well.
If this content were sent as SDR, the problem would not occur, just as with NHK. However, the commercial BS4K channels broadcast everything in HLG as a matter of operational specification. Since the content is SDR, that creates a mismatch — so the SDR signal is converted to HLG and broadcast.
Let's go back to the starting point.
▲ SDR brightness distribution

Since SDR white corresponds to 75% of the HLG signal range, a straight conversion leaves the top 25% of brightness unused. This is clear from the graph as well: the brightest parts sit between 70% and 75%.

The TV receiver, for its part, assumes it is receiving genuine HLG content and allocates its panel's brightness across the full 100% signal range — so it renders the picture without ever using that top 25%, and everything comes out dark.
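The effect can be sketched numerically. Assume a naive converter that parks SDR white at the 75% signal level, and a crude model of a TV that spreads its peak brightness over the full signal range (both functions and the power-law exponent are my simplifications, not the broadcasters' or any TV's actual processing):

```python
def sdr_to_hlg_signal(sdr_signal: float) -> float:
    """Naively map an SDR signal (0..1) into HLG: SDR white lands at 75%."""
    return sdr_signal * 0.75

def panel_output(hlg_signal: float, panel_peak_nits: float) -> float:
    """Crude model: the TV spreads its peak brightness over the full 0..100%
    HLG signal range (a simple power law stands in for the real HLG EOTF)."""
    return panel_peak_nits * hlg_signal ** 1.2

peak = 1000.0
# SDR white (signal 1.0) ends up at 75% of the HLG range...
white = sdr_to_hlg_signal(1.0)     # 0.75
# ...so this content never drives the panel's top 25% of signal:
print(panel_output(white, peak))   # ~708 nits, well short of 1000
```

The exact numbers depend on the model, but the structural point holds: the brightest thing in the converted program can never reach the panel's full output, which is exactly the "dark" impression viewers report.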

Returning to the original purpose of the HLG standard would solve the problem

The purpose of HLG is to make broadcast operation easier through a cleverly devised light-dark curve; it was never meant for broadcasting SDR content. For SDR content, unless the TV receiver is told "this is SDR," it cannot produce a picture that takes advantage of the panel's performance.

Considering the commercial broadcasters' business model, which runs on sponsors' advertising money, commercials being displayed darkly should itself be a problem. The only ways for commercial BS4K to be displayed at its original brightness are to broadcast SDR as SDR, or to devise the mapping so that SDR content displays correctly within HLG.

The Tokyo Olympics will be held next year, and I hope these problems are solved before then. If broadcasters take measures, TVs will be able to show their full capability, just as with UHD Blu-ray and HDR-compatible internet video services. Indeed, you can confirm this for yourself by watching NHK's broadcasts.
As a stopgap, it would not be impossible for a TV to determine — by matching against program information, or an internet program-information service — whether a broadcast flagged as HLG is really HLG or an SDR program in disguise, and display it correctly. But rather than such an unproductive workaround, the problem would be solved outright if the commercial broadcasters simply matched the dynamic range and broadcast SDR as SDR.
That's all!
Of course, commercials may well come to be produced in HDR in the future, and HDR may be used in program production as much as NHK uses it. In that case, switching between SDR and HDR might not always be seamless — but things are different now from the analog era (back then, the opening moments of a commercial could be disturbed by the screen-mode switch). At least I have never experienced problems such as the picture going dark when switching between SDR and HDR.
After that, there should be no problem as long as the commercial broadcasters can tolerate switching for each piece of content (each commercial). I hope the situation improves.
