05-22-2017, 04:14 PM   #826
mzupeman

Quote:
Originally Posted by Geoff D
This all boils down to oversampling making the source look better. You physically cannot display 60-odd billion shades on a panel that natively shows a billion or so, but Sony's Super Bit Mapping (which has been around for years in Blu-ray mastering, as well as in the internal processing on their TVs, which also pre-dates 4K) is designed to overdrive the bit depth to reduce the banding in the final 8/10-bit product.

A ZD9 is NOT actually displaying a 14-bit image, but it is displaying the benefits of that oversampling, e.g. increased dithering of colour gradation vs a native 8/10-bit capture. You do need a lot of processing grunt to do this, and although Dolby Vision's 12-bit source should work on the same principle, it's still at the mercy of the TV's own silicon, as evinced by various sets choking on a 12-bit 4:4:4 4K input as they struggled to dither it back down to 10-bit. I guess in that case it depends on where the internally downsampled 10-bit image enters the chain: whether the Dolby processing does the conversion first or hands the native 12-bit signal off to the TV's own processing to deal with.


So it would ultimately come down to what TV you're using.
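
For anyone curious what "dither it back down to 10-bit" actually buys you, here's a rough sketch of the idea. This is just my own toy example in Python/NumPy, not Sony's SBM or Dolby's actual processing: quantise a shallow 12-bit gradient to 10 bits with and without dither, then see how well each one tracks the original once you average over an area the way the eye does.

Code:
import numpy as np

rng = np.random.default_rng(0)

# A very shallow 12-bit gradient (the kind of sky/shadow ramp that bands easily):
# it spans only 16 of the 4096 possible code values across a 1920-pixel row.
ramp_12 = np.linspace(2048.0, 2064.0, 1920)

# 1) Straight truncation to 10 bits: the gentle ramp collapses into a few flat bands.
trunc_10 = np.floor(ramp_12 / 4.0)

# 2) Dithered quantisation: add sub-LSB noise before rounding, so the two discarded
#    bits show up as fine pixel-to-pixel variation instead of hard band edges.
dither_10 = np.round(ramp_12 / 4.0 + rng.uniform(-0.5, 0.5, ramp_12.shape))

# The eye averages over an area, so compare a local (boxcar) average of each
# 10-bit result against the original ramp expressed on the 10-bit scale.
def local_avg(x, width=32):
    return np.convolve(x, np.ones(width) / width, mode="valid")

target = local_avg(ramp_12 / 4.0)
err_trunc = np.abs(local_avg(trunc_10) - target).mean()
err_dither = np.abs(local_avg(dither_10) - target).mean()

print("distinct 10-bit levels used: truncated =", len(np.unique(trunc_10)),
      "dithered =", len(np.unique(dither_10)))
print(f"mean error vs the 12-bit ramp (in 10-bit steps): "
      f"truncated = {err_trunc:.3f}, dithered = {err_dither:.3f}")

Both versions end up using the same handful of 10-bit code values; the dithered one just arranges them so that, averaged over an area, the picture still follows the 12-bit ramp instead of breaking into flat bands. That's the sense in which a 10-bit panel can show the benefit of 12/14-bit processing without actually displaying those bit depths.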


Sent from my iPhone using Tapatalk