03-16-2016, 04:03 PM   #3
James Freeman
Senior Member
Join Date: Oct 2013

It has been discussed vigorously at the AVSForum.

Color accuracy is tied to luminance: if the luminance value changes, the color changes with it.
If the content was mastered on a 1,000 cd/m2 (nit) monitor and the HDR display can only reach 600 nits, then any color tied to a luminance of 600 nits or above will be distorted, because the TV has to compress the luminance range from 1,000 down to 600.
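A quick numeric sketch of why that compression distorts color (the function names and pixel values here are my own, purely for illustration): if each RGB channel is clipped to the display peak independently, the channel ratios change and the hue shifts; scaling all channels together only dims the pixel.

```python
def clip_channels(rgb_nits, peak):
    """Naive per-channel clip: each channel capped at the display peak."""
    return tuple(min(c, peak) for c in rgb_nits)

def scale_luminance(rgb_nits, peak):
    """Hue-preserving alternative: scale all channels by one common factor."""
    m = max(rgb_nits)
    if m <= peak:
        return rgb_nits
    s = peak / m
    return tuple(c * s for c in rgb_nits)

pixel = (1000.0, 200.0, 100.0)  # a bright saturated red, in nits

print(clip_channels(pixel, 600.0))    # (600.0, 200.0, 100.0) -> ratios change, hue shifts
print(scale_luminance(pixel, 600.0))  # (600.0, 120.0, 60.0)  -> ratios kept, only dimmer
```

Real tone-mapping algorithms are far more elaborate than either of these, but the trade-off they juggle is the same.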

The absolute worst thing you can do to color is play HDR content on a non-HDR display.
The player has to convert the ST.2084 EOTF curve on the disc to a 2.2 gamma curve.
Yes, it is the player that does the conversion, because a non-HDR (non-HDMI 2.0a) TV/projector has absolutely no clue what ST.2084 is.
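In code terms, the conversion looks roughly like this. The PQ constants below are the published SMPTE ST 2084 values; the 2.2-gamma re-encode step (and its 100-nit SDR peak with a hard clip) is my own simplified illustration, not how any particular player actually does it.

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """ST 2084 EOTF: non-linear PQ signal in [0..1] -> absolute luminance in cd/m2."""
    p = signal ** (1 / M2)
    y = max(p - C1, 0.0) / (C2 - C3 * p)
    return 10000.0 * y ** (1 / M1)

def to_gamma22(nits, display_peak=100.0):
    """Re-encode absolute luminance as a relative 2.2-gamma signal.
    Hard-clips anything above the assumed SDR peak (illustrative only)."""
    relative = min(nits / display_peak, 1.0)
    return relative ** (1 / 2.2)

# Player-side pipeline sketch: decode the PQ code value to light,
# then re-encode that light for a 2.2-gamma display.
sdr_signal = to_gamma22(pq_eotf(0.58))
```

Note that PQ is an absolute encoding (code value maps to a fixed nit level) while 2.2 gamma is relative to whatever the display's peak happens to be, which is exactly why this conversion is lossy.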

That said, a smart algorithm could adjust the color values in proportion to the luminance change, so the picture would look OK over any conversion range.
"OK" meaning it would resemble the HDR color to your eye after the range conversion, but nowhere near calibration accuracy.
The chances that such an algorithm is actually implemented are 0%.
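One plausible shape for such an algorithm (my own sketch, not any vendor's implementation) is a soft-knee tone curve: luminance passes through unchanged up to a knee point, then rolls off smoothly so the 1,000-nit source range lands within a 600-nit display instead of clipping.

```python
def soft_knee(nits, knee=400.0, src_peak=1000.0, dst_peak=600.0):
    """Identity below the knee; smooth compression of [knee..src_peak]
    into [knee..dst_peak] above it. Value-continuous at the knee
    (a production curve would also match the slope there)."""
    if nits <= knee:
        return nits
    # normalized position of the input within the compressed region
    t = (nits - knee) / (src_peak - knee)
    # ease-out curve: fast at first, flattening toward the new peak
    eased = 1.0 - (1.0 - t) ** 2
    return knee + eased * (dst_peak - knee)

# 300 nits passes untouched; 1,000 nits lands exactly on the 600-nit peak.
print(soft_knee(300.0), soft_knee(1000.0))
```

The "adjust the color depending on the amount of change" part would then mean driving all three channels from the same curve, as in the hue-preserving scaling shown earlier, rather than tone-mapping each channel on its own.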

Last edited by James Freeman; 03-16-2016 at 04:14 PM.