#1
Blu-ray Knight
A place for discussing HDR10, Dolby Vision, how they compare, info articles, why you love it or hate it, and everything HDR.
A few info articles on HDR: What HDR Means for Color and Luminance https://www.avforums.com/article/ult...ces-2016.12295
Last edited by bruceames; 04-22-2016 at 02:05 AM. Reason: added article link
#2
Special Member

Quote:

Thanks given by: bruceames (04-21-2016), Trekkie313 (10-08-2016)
#3
New Member
Jun 2011
Sony X930D SDR vs HDR (comparison photos)
#4
Active Member
Home comparisons on uncalibrated displays give the wrong impression of HDR (they make HDR look too dark and SDR look overblown).
Here's a great article about HDR that won't confuse people: http://www.lightillusion.com/uhdtv.html
And here's a picture comparison of SDR vs HDR on calibrated HDTVs. As anyone can see, on calibrated displays the average brightness levels look similar (not like in these "home" comparisons): http://www.hdtvtest.co.uk/news/4k-vs-201604104279.htm
Thanks given by: bruceames (04-22-2016)
#5
Expert Member

Quote:
Those pictures posted above show me just as much as the pictures in your link, calibrated or not.
#6
Retired Hollywood Insider
Apr 2007
Thanks given by: James Freeman (04-22-2016), legends of beyond (07-19-2016)
#7
Retired Hollywood Insider
Apr 2007

Quote:
#8
New Member
Apr 2016
May I ask somebody to watch my 4K UHD HDR BT.2020 ST.2084 encoded footage on a 4K HDR TV (via USB) and report back on how it looks?
#9
Blu-ray Samurai

Quote:
Downloaded James's new videos too, will also watch them later.
Thanks given by: surami (04-23-2016)
#10
Active Member

Quote:
Here's how it should look:
http://www.hdtvtest.co.uk/image/arti...ight-large.jpg
http://www.hdtvtest.co.uk/image/arti...ax-3-large.jpg
And below is a BD vs UHD comparison on uncalibrated displays, where you can clearly see the difference in brightness levels:
http://www.hdtvtest.co.uk/image/arti...ax-2-large.jpg
http://s16.postimg.org/gsaciz9rn/Clipboard01ass.jpg
#11
Senior Member
Oct 2013
What we must understand is that most material shot on camera is still exposed with the 18% middle grey convention in mind.
Middle grey is 50% input (128 in 8-bit), which gives you an 18 nit (cd/m2) output. Who determined that 50% input is 18 nit output? The 100 nit peak, gamma 2.4 CRT TV. So the whole system is still adapted to SDR CRT technology, with a gamma of around 2.4. The camera (digital or film) may capture a huge dynamic range, but the captured middle grey card should still produce around 18 nits on your SDR or HDR display!

A calibrated SDR TV (100 nit peak) and an HDR TV (1000 nit peak) have the same picture luminance and information from 0 to around 50 nits and SHOULD look the same. 75% input (190 in 8-bit), for that matter, is 50 nits on a CRT or any SDR TV. It's from 50 nits and up that the difference between an SDR and an HDR TV appears.

On an SDR TV the huge-dynamic-range highlights are compressed from 50 nits to 100 nits with something called a log curve in the camera (film does that naturally). On an HDR TV the highlights have much more room, from 50 nits to 1000 nits, so they are not as compressed; in fact they have to be expanded from the camera log curve.

The Light Illusion site has a terrible example. His error is that he thinks the camera capture is totally clipped above 100 nits (like the HDR-to-SDR images blu-ray.com posted). In reality, all the F-stops the camera captured are compressed into 50-100 nits (75% to 100% input, or 190 to 255 in 8-bit).

For HDR, 18 nits and 50 nits are STILL the same as SDR, but above 50 nits the luminance is simply expanded and mapped. In other words, for the 50 to 1000 nit range, the captured data above 50 nits has to be expanded linearly and mapped to input levels 44% to 75% in ST.2084, whereas on SDR, 50 to 100 nits of output corresponds to 75% to 100% input.

As a side note, modern cameras can capture around 14 F-stops, or a 16,384:1 contrast ratio (that is A LOT). An HDR TV can easily reproduce that: 1,000/0.01 nits = 100,000:1. In fact, the last generation of plasma TVs could do 20,000:1 no problem, so they too can show all the dynamic range the camera captured.

A little technical, but that should answer a few questions.

Last edited by James Freeman; 04-22-2016 at 03:07 PM.
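To put rough numbers on the mapping described above, here is a minimal Python sketch (an illustration, not a reference implementation) using a simplified pure gamma 2.4 SDR response and the published SMPTE ST 2084 (PQ) constants. It reproduces the approximate 18-nit middle grey, the 50-nit / 75% SDR point, and the roughly 44% and 75% PQ signal levels for 50 and 1000 nits.

```python
# Illustration of the SDR (gamma 2.4, 100-nit) and HDR (ST.2084 / PQ) mappings
# discussed above. The PQ constants are the published SMPTE ST 2084 values;
# the SDR model is a simplified pure-power gamma with a zero black level.

M1 = 2610 / 16384          # PQ m1
M2 = 2523 / 4096 * 128     # PQ m2
C1 = 3424 / 4096           # PQ c1
C2 = 2413 / 4096 * 32      # PQ c2
C3 = 2392 / 4096 * 32      # PQ c3

def sdr_nits(signal, peak=100.0, gamma=2.4):
    """Simplified SDR response: normalized signal (0..1) -> nits on a 100-nit display."""
    return peak * signal ** gamma

def pq_signal(nits):
    """Inverse PQ EOTF: absolute luminance in nits -> normalized ST.2084 signal (0..1)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

print(sdr_nits(0.50))    # ~19 nits  -> middle grey on a 100-nit gamma 2.4 display
print(sdr_nits(0.75))    # ~50 nits  -> the 75% input point mentioned above
print(pq_signal(50))     # ~0.44     -> 50 nits sits at roughly 44% PQ signal
print(pq_signal(1000))   # ~0.75     -> 1000 nits sits at roughly 75% PQ signal
```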
#12
Site Manager
Note: The screenies were from 300 nits up, as that's the middle ground between "old" 100-nit TVs and "new" 1000-nit TVs.
#13
Blu-ray Knight
Interesting comment regarding the difference between DV and HDR10.
Quote:
Another point to be made: HDR10 is the standard, not Dolby Vision. In fact, HDR10 comes from Dolby Vision and is just the base layer, so technically there is no format war. The DV enhancements mainly pertain to improving the picture on lower-end displays. The probable reason Sony and Samsung didn't include DV in their 2016 models is that it wasn't necessary: on high-end models, HDR10 = DV. It's no coincidence that DV is being implemented in the lower-end models (LG and Vizio) and HDR10 in the higher-end models (Sony and Samsung). The only exception is LG OLED, but there DV is desirable because even though OLED has an infinite contrast ratio, the peak brightness is only around 500 nits. So if the content is mastered at 1000 nits, the dynamic metadata will scale the content accordingly.
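To make that last point concrete, here is a toy Python sketch of what mapping 1000-nit mastered content onto a roughly 500-nit panel can look like. This is only an illustration of the idea of rolling off highlights instead of hard clipping; it is not Dolby Vision's actual (proprietary) tone mapping, and the 75% knee point is an arbitrary choice for the example.

```python
# Toy tone-mapping illustration: fit 1000-nit mastered content onto a 500-nit
# display by passing most of the image through unchanged and rolling off only
# the highlights. NOT Dolby Vision's real algorithm, just the general idea.

def tone_map(nits, mastering_peak=1000.0, display_peak=500.0, knee=0.75):
    """Map scene luminance (nits) from the mastering range to the display range."""
    knee_nits = display_peak * knee          # below this point, pass through unchanged
    if nits <= knee_nits:
        return nits
    # Compress everything between the knee and the mastering peak into the
    # remaining headroom of the display with a simple linear roll-off.
    headroom_in = mastering_peak - knee_nits
    headroom_out = display_peak - knee_nits
    return knee_nits + (nits - knee_nits) * headroom_out / headroom_in

for x in (100, 375, 600, 1000):
    print(x, "->", round(tone_map(x), 1))   # 100 and 375 pass through, 1000 lands at 500
```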
#15
Blu-ray Knight
Yes, when 12-bit consumer TVs are available, DV will be superior. But today there is no real difference between them as long as you have a high-end display, since DV content has to be scaled down to HDR10 levels anyway.
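For anyone wondering why 12 bit would matter, here is a minimal Python sketch (an illustration only, assuming full-range code values for simplicity; real video uses limited range) that compares the relative luminance jump between adjacent code values on the ST.2084 (PQ) curve at 10 bit and at 12 bit. The smaller the jump, the lower the risk of visible banding.

```python
# Relative luminance step between adjacent code values on the PQ curve,
# compared at 10-bit and 12-bit quantization (full-range codes assumed).

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal):
    """Normalized PQ signal (0..1) -> absolute luminance in nits."""
    ep = signal ** (1.0 / M2)
    return 10000.0 * (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1.0 / M1)

def relative_step(code, bits):
    """Fractional luminance jump between code value `code` and `code + 1`."""
    n = 2 ** bits - 1
    lo, hi = pq_eotf(code / n), pq_eotf((code + 1) / n)
    return (hi - lo) / lo

# Compare the step size at roughly the same (mid) signal level for both depths:
print("10-bit:", relative_step(512, 10))
print("12-bit:", relative_step(2048, 12))   # about a quarter of the 10-bit step
```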
#16
Member
Jan 2016
Can any of you answer which is the better option, and why?
A. 4K OLED without HDR, or
B. 4K LED with HDR
#17
Retired Hollywood Insider
Apr 2007
I like

Quote:

luminance is increased. This is known as the Hunt Effect, which I've made reference to in at least one old posting (use search word 'Hunt'). As far as your link to the article from Sound and Vision with the lead-in pic, it's good. How the Ultra HD Alliance is involved
Thanks given by: bruceames (04-23-2016)
#18
Retired Hollywood Insider
Apr 2007

Quote:
Thick negative = a well exposed image containing a full tonal range; in other words, a solid picture to work with, in that the whites aren't clipped and the blacks aren't crushed.

Colorists and post production supervisors with experience in HDR mastering are finding out that not all movies are great HDR candidates. DPs with an interest in HDR versions of their future work... take note ^. Some already have, like at NAB 2016 this past week.
Thanks given by: bruceames (04-23-2016)
#19
Blu-ray Knight
Good point, Penton, about needing 10-bit HDR to really appreciate the WCG more. HDR by itself is much more impressive than WCG by itself, but perhaps here the sum is greater than the parts? (Just guessing with that last part, you're the expert here.)