#361
Blu-ray Emperor
Didn't someone say that Dolby is well behind on the certification process? Folks gotta bear in mind that it's not a case of just squirting the firmware down the pipe: Dolby have got to take in a sample set and test the bejesus out of it to actually create the custom display profile, which is a crucial part of the entire Dolby Vision set-up.

Dynamic metadata is useless without the bespoke display measurement it's combined with to map the image down, which is also why (as someone said very recently) DV projectors for the home aren't coming any time soon. There are too many variables with regard to the actual screen being used: it's not just the projector they'd have to profile but also the screen, and there are so many combinations there that it'd be a never-ending task.
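To make that point concrete, here's a toy sketch of why the measured display profile matters: the dynamic metadata only tells the TV how bright each scene gets, while the profile tells it what the panel can actually do and where it has to start compressing. This is purely an illustration of the principle, not Dolby's actual (proprietary) algorithm; the profile fields, the numbers and the linear roll-off are all invented for the example:

```python
# Toy illustration only -- NOT Dolby's algorithm. The profile fields and
# numbers below are invented to show why per-display measurement matters.

display_profile = {          # what the certification process would actually measure
    "peak_nits": 1500.0,     # true measured panel peak, not the spec-sheet number
    "rolloff_start": 0.6,    # fraction of peak where this panel needs to compress
}

def map_pixel(nits, scene_max, profile):
    """Map one pixel using the scene's max luminance (dynamic metadata)
    plus the measured panel behaviour (display profile)."""
    peak = profile["peak_nits"]
    knee = peak * profile["rolloff_start"]
    if scene_max <= peak or nits <= knee:
        return min(nits, peak)            # scene fits the panel: pass through
    # compress only the range the panel can't reach
    t = (nits - knee) / (scene_max - knee)
    return knee + (peak - knee) * t

# A dim scene (max 800 nits) passes through untouched; a 4000-nit scene is
# compressed -- but only because the profile says where this panel gives out.
print(map_pixel(700, scene_max=800, profile=display_profile))    # 700.0
print(map_pixel(3000, scene_max=4000, profile=display_profile))  # ~1306.5
```

Without the measured `peak_nits` and `rolloff_start` there's nothing meaningful to map against, which is the whole point: the metadata describes the content, the profile describes the panel.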
Thanks given by: gkolb (06-06-2017), Staying Salty (06-06-2017)
#362
Blu-ray Samurai
Geoff,
Thanks for explaining what I thought was probably happening. That bespoke display profile for each and every model (which may even vary by screen size within a model line), applying the golden-reference tone mapping for 12-bit DV, must be a huge pile of custom code. This part of the firmware hasn't been getting the discussion it warrants from some of the newer folks here. I hadn't seen that they're behind, but I'm not surprised.

The next big question for me: will they push it out for all manufacturers at once, or go one by one? There's Sony, LG, Vizio. Who else?

Edit: Ha, I mistakenly listed LG and Vizio; how stupid, they already have DV out of the box. So if it's only Sony displays, I'm not too sure why it's taking from, let's say, March (when Sony announced future Dolby Vision compatibility) until now for the displays with X1 Extreme SoCs. Or are we expecting too much from Dolby/Sony?

Last edited by gkolb; 06-07-2017 at 02:49 AM. Reason: fixed my dumb question
#363
Blu-ray Emperor
Just to get back to the calibration stuff: I watched some movies yesterday and in HDR it seemed a bit too bright in the lower registers. Having checked my readings, it is indeed a touch brighter than it should be with gamma at +2, but at +1 the low end is now tracking very nicely. So gamma at +2 is a bit too much.
Thanks given by: DJR662 (06-07-2017)
#364
Special Member

Quote:
Thanks for the settings though. Since I don't have measuring equipment I'll run with Contrast 90 on Cinema Home (down from 95) and Gamma at 0 for now. I'm digging ACE at Medium for now; not sure how it affects measurements, but it looks good for non-critical viewing.

The other thing I'm digging is Reality Creation on Auto. If you pause on facial close-ups, it seems to give better definition in, like, pores and hairs (sounds creepy). I guess it's the scaling feature of the TV that defines edges better without artificial sharpness. I need to try it on more movies with grain to make sure it doesn't do anything funny.
Thanks given by: DJR662 (06-07-2017)
#365
Blu-ray Emperor
Yeah, I've noticed very little difference between Medium and High in the dimming, though I haven't measured it or anything. 90 with gamma at 0 sounds good to me.

I don't wanna start messing with contrast/black enhancers; I find that things like that work great with one film but then make something else look like poop, so I'd rather leave them off. Next time I do a calibration run I might check their effects though. I love Reality Creation for cleaning up jittery old Blu-rays (and for boosting the sharpness on Oblivion), but for anything that's come from a good solid master I keep it off.
#366
Blu-ray Knight
From what I've read, I think the general consensus was to leave LD at Medium.

I experimented with ACE on my X940D and I didn't like its effect. Both the Medium and High settings clipped details at the higher end, while the Low setting darkened the picture too much for my taste, so I just left it off. IMO the Z9D is such a great TV anyway, it really doesn't need this feature.

I find RC at Auto way too aggressive, especially with high-quality material. I only use it for watching TV broadcasts anyway (set to 20). You should try experimenting with the manual setting and start somewhere around 10; that should already make a noticeable difference without looking too artificial.

Last edited by DJR662; 06-07-2017 at 04:43 AM.
#369
Senior Member
Fingers crossed it arrives before the rollout of their first Dolby Vision disc (Resident Evil). That would make sense at least.
Thanks given by: Robert Zohn (06-14-2017)
#370
Blu-ray Emperor
Apropos of nothing, here's how Vince Teoh's choice of Affleck in BvS as a mapping torture test for the OLEDs looks on my ZD9. The colour's off in the photo, but the bolts of electricity and his shirt are both mapping quite beautifully, and the bolts themselves are still mucho, mucho bright.

It's also been ascertained that Cinema Home (which is what I've been using from minute one) is the 'dynamic' mapping mode whereas Cinema Pro is not: if I switch to Pro using the exact same settings as Home, brightness increases but it blows out 3,000/4,000-nit highlights quite considerably.

With everything else being equal in my settings (brightness max, contrast 88, gamma +1), Custom hard clips at 1,000 nits (said to be for mastering usage), Cinema Pro clips at about 1,500 nits (not far off the brightness limit of the TV) and Cinema Home clips at nearer 4,000 nits (Sony's own dynamic mapping at work, letting the mapping take over only when content brightness exceeds that of the TV).

Last edited by Geoff D; 06-14-2017 at 01:46 AM.
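In curve terms, what's being described is roughly a hard clip (Custom) versus a soft knee that rolls off toward the panel's limit (Home). A minimal sketch of the two shapes, assuming illustrative numbers rather than Sony's actual curves:

```python
def hard_clip(nits, panel_peak=1500.0):
    """Track content 1:1 and throw away everything above the panel peak."""
    return min(nits, panel_peak)

def knee_rolloff(nits, knee=750.0, panel_peak=1500.0, content_peak=4000.0):
    """Track 1:1 up to the knee, then squeeze the rest of the content range
    (knee..content_peak) into the remaining panel headroom (knee..panel_peak),
    so highlight detail survives all the way up to content_peak."""
    if nits <= knee:
        return nits
    t = (nits - knee) / (content_peak - knee)        # 0..1 across the roll-off
    return knee + (panel_peak - knee) * (t ** 0.5)   # ease toward panel peak

for n in (500, 1000, 2000, 4000):
    print(f"{n:5d} nits in -> clip {hard_clip(n):6.0f} / knee {knee_rolloff(n):6.0f}")
```

The trade-off falls straight out of this: the knee keeps 4,000-nit highlights from slamming into a wall, but everything above the knee point runs dimmer than a straight 1:1 track would, which is exactly the brightness difference between Pro and Home.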
Thanks given by: gkolb (06-14-2017), Robert Zohn (06-14-2017)
#372
Blu-ray Knight
I'm using Geoff's HDR settings for my ZD9 and I'm happy with those.
But I'm still trying to find the best settings for HDR mode on my XD9405. The default setting for contrast is max, but I don't like losing detail in the brightest parts, so I turned it down a bit while also setting gamma a level higher. I'm still playing around with these settings.

The XD9405 works differently, btw: when playing 4K/HDR content it locks into a separate HDR mode not connected to Pro, Home or any other preset.
#374
Blu-ray Knight

Quote:
Peak brightness on the XD9405 is a lot lower than on the ZD9. I'm now wondering what would be the best thing to do: just leave contrast at the default max setting (as AVForums' Steve Withers did on their calibrated test model: https://www.avforums.com/review/sony...v-review.12268) and enjoy the TV's full brightness, or find some middle-way solution by lowering the contrast (and raising gamma) to get as much detail as possible in bright highlights. I can get up to around 2000-2500 on the Sony nit ramp this way without compromising brightness too much, though I do lose some of that brightness pop.

But since Steve mentions in his review that "However, using a 10,000nits test pattern we could see that the TV wasn't correctly mapping the content to the panel's native peak brightness capability and was clipping content", I wonder if it's worth lowering the contrast like that? Decisions, decisions.
#376
Blu-ray Emperor
One other thing to muddy the waters re: HDR playback on the ZD9 is that it might be worth keeping Custom, with its hard clip at 1,000 nits, as a playback mode for 1,000-nit content.

When watching Terminator Genyiysyisys yesterday I was playing around with the modes, and when switching between Custom, Home and Pro there was no difference with regard to clipping of highlights because of the 1,000-nit source (Par, Uni and Fox all master to 1,000 nits for HDR10), but there was a noticeable difference in brightness. Home still looks superb in a pitch-dark room, but the other two modes are that much brighter again because they're not trading off some of the peak brightness for the >1,000-nit mapping, so I might keep three separate modes: one each for SDR, 1,000-nit HDR and 4,000-nit HDR.
Thanks given by: DJR662 (07-10-2017)
#377
Special Member

Quote:
Cinema Home for me needs Contrast at 82, while Pro is OK at the default 90. I used the test patterns found on Magnificent Seven (pushing 7669 nits at the main menu; white clipping shows in the last 2 or 3 patterns out of 21).
Thanks given by: DJR662 (07-10-2017)
#378
Blu-ray Emperor
The problem with contrast that low in Home is that while you're getting good results from the pattern (not that you actually need anything past 4,000 nits right now anyway, and likely not for several years yet), the PQ EOTF won't be tracking properly as it shifts through the brightness scale, i.e. content will be darker than it should be. That may not make a hell of a difference in real-world terms given how much brightness this set has to play with at the upper end, but as it's usually the darkest/low-APL scenes that HDR10 mapping has so much trouble with, I made sure that the darkest points were tracking correctly up to 50% before letting the brighter content fall away slightly.

And when I actually tried properly watching 1,000-nit stuff in Pro after I posted that ^ it looked too bright, not just in terms of my eyes being uncomfortable but stuff like grain looking much too 'crawly' and overcooked, as increasing contrast is bound to do. So now I just watch all HDR in Home set to Contrast 88 and Gamma +1.
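For anyone newer to this, "PQ EOTF tracking" just means the panel's measured output should follow the SMPTE ST 2084 reference curve, which converts a 0-1 signal level into absolute nits, until the set runs out of headroom. A quick sketch of the reference curve (the constants come straight from the ST 2084 spec; the stimulus percentages are just example measurement points):

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal n in [0,1] -> luminance in nits.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(n: float) -> float:
    """Reference luminance (nits) that a display should output for signal n."""
    p = n ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# What a meter should read back at typical calibration stimulus levels:
for pct in (10, 25, 50, 75, 100):
    print(f"{pct:3d}% stimulus -> {pq_eotf(pct / 100):8.1f} nits")
```

So when the low end "tracks" with gamma at +1, it means the measured readings sit on these targets through the dark-to-mid range (50% stimulus is only about 92 nits), with the set deviating from the curve only as content approaches the panel's real peak.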
#379
Blu-ray Knight

Quote:
Btw, you say Universal, Paramount and Fox master at 1000. So only Sony masters at 4000? What about Warner and Lionsgate, and will they all continue to master their movies at the same level?

Quote:
#380
Special Member

Quote:
I tested it with BvS, and there are a couple of scenes during the Doomsday battle where his eyes light up and shoot lasers; there was obvious clipping, and it also looked bad in general when I had Home and contrast 90. Also, when Superman is floating in space unconscious and "healing", the camera pans and the sun comes into view; that's also a good scene to test the clipping. Above 82 it started looking bad and clipped. I sort of want to do what you do and just have one HDR setting so I can stop tweaking!