#743
Senior Member

Quote:
It says exactly what I said it says. It talks about why they chose 12-bit encoding instead of 10-bit encoding. I'm not wrong about anything I said. Sorry, you literally don't understand the difference between a Panasonic UHD player that forced 10-bit discs to do 12-bit output and actual Dolby Vision. Sorry if the technical jargon is too complicated for some people to understand, but it is clearly there, so stop spreading your keyboard-vomit libel and misinfo...

"Some generic HDR approaches (HDR10) use the PQ EOTF with 10 bits instead of 12. Why is it important to use 12 bits instead of 10? The human visual system has different sensitivities at different levels of brightness, and it is particularly sensitive to small changes over large areas of nearly uniform brightness. The following graph shows how noticeable 10- and 12-bit quantization is, depending on luminance. 10-bit quantization is always above the visual threshold, meaning that the average user can see noticeable differences with each change of luminance value. In natural scenes, noise in the scene can often mask this difference, but areas such as blue sky will often show banding or contouring if the quantization steps are too coarse."

One of the things being said there is that Dolby Vision uses 12-bit encoding because 10-bit encoding can cause color banding, for instance in blue skies. Even if you can't understand that for some reason, it's not hard to understand that some UHD Blu-rays, including The Revenant, do suffer from minor color banding, while Dolby Vision content does not. It's hilarious that someone would so harshly reject this without any idea what they're talking about, and without any facts to back them up.
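For anyone who wants to see the math behind that quantization argument, here's a rough Python sketch using the published SMPTE ST 2084 (PQ) constants. The specific code values (520 for 10-bit, 2080 for 12-bit) are just illustrative picks that land near 100 nits; the point is that an adjacent-code luminance step at 10 bits is roughly four times the size of the step at 12 bits at the same brightness:

```python
# Sketch: relative luminance jump per code value for 10-bit vs 12-bit PQ,
# using the SMPTE ST 2084 EOTF constants.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(e):
    """Map a normalized PQ code value e in [0, 1] to luminance in nits."""
    ep = e ** (1 / m2)
    return 10000 * (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)

def step_percent(bits, code):
    """Relative luminance step between adjacent code values, as a percent."""
    n = 2 ** bits - 1
    lo, hi = pq_eotf(code / n), pq_eotf((code + 1) / n)
    return 100 * (hi - lo) / lo

# Code 520 at 10 bits and code 2080 at 12 bits both sit near 100 nits;
# the 10-bit step is about 4x coarser than the 12-bit step there.
print(step_percent(10, 520), step_percent(12, 2080))
```

Whether a given step is actually visible depends on the contrast-sensitivity threshold curve the Dolby graph plots, which this sketch doesn't model; it only shows the relative coarseness.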
#744
Special Member
Mar 2010
Portishead ♫
Very informative thread on a film that I adore.
Why didn't they do an Atmos track? Also, why, with all the 4K Blu-rays, don't they include an improved regular 1080p Blu-ray for the high prices they charge (between $30 and $35)? But here with 'Unforgiven', thankfully, they did... for $33 (I must have five or six formats/versions of this flick already).
#745
Senior Member

Quote:
I'm thankful you are willing to acknowledge that Dolby Vision material will not have issues with color banding. Someone has to know what they're talking about around here...
#746
Senior Member

Quote:
You've already made the point that BD66 isn't ideal for a UHD disc. I've agreed with you about that. Dolby Vision is a premium HDR format that charges licensing fees. If including it on a BD66 wasn't going to produce better results than HDR10 alone, they wouldn't bother adding it to BD66 discs in the first place. I'm sure there will be a way to toggle between Dolby Vision and HDR10 if your TV is compatible.

I'll wait the two and a half weeks for the discs to be released, and for the UHD BD DV vs. HDR10 shoot-outs, before I throw the most premium HDR format under the bus. Hopefully the UHD Blu-ray players will get their firmware updates by June 6th. I'll be glad when this issue is put to bed.

Just a hunch, but I think most of the negative blowback about Dolby Vision is coming from people who already have 4K setups that won't be able to take advantage of it, so they're just hoping it's going to suck. I don't expect anyone to take my word for it, but these people will not be vindicated.
#747
Blu-ray Grand Duke

Quote:

Thanks given by: bruceames (05-20-2017)
#748
Senior Member

Quote:
I just recently read that a DTS track is currently cheaper for studios to produce than Dolby Atmos, and that it's also a more streamlined workflow. You can do DTS-HD Master Audio or DTS:X using a plugin within the video authoring software. With Dolby Atmos, you have to use the Dolby Media Producer Suite v2.0; there are currently no Atmos encoding plugins. At least it's lossless 5.1 audio, which was probably deemed sufficient for a film that only had a stereo track to begin with. Lossless Atmos or DTS:X would've been cool, though.
#750
Blu-ray Emperor
I *think* I know what I'll be watching on the ZD9 tonight:
Thanks given by: bruceames (05-20-2017), flyry (06-04-2017), legends of beyond (05-20-2017), OI8T12 (05-20-2017), philochs (05-20-2017), PS3_Kiwi (05-20-2017), reanimator (05-20-2017)
#751
Banned

Quote:
The truth? DTS is actually *paying* for their own remixes and encodes. That's the reason it's "cheaper". Other than Harry Potter, Warner hasn't remixed any of their titles in immersive codecs; they just port the Blu-ray encodes.
Thanks given by: philochs (05-20-2017)
#754
Blu-ray Ninja

Quote:
The page explains why 12 is better than 10. Nowhere on that page does it say 12 nits on 10 nit displays will not produce banding. Did you forget what you were responding to in the first place? Good grief.
Thanks given by: Adrian Wright (05-20-2017)
#755
Banned
May 2013

Quote:
I'm sure they will find a way to make things okay; I'm just saying I don't always trust "them" either. We have already seen too many mediocre or plain bad transfers on UHD, IMO.
Thanks given by: philochs (05-20-2017)
#757
Senior Member

Quote:
Yeah, it didn't say those exact words, in that exact order. It just says Dolby settled on 12 bits because 10-bit encoding causes banding in blue skies and such. It never actually says that 12-bit doesn't cause banding, but that is implied, because they are talking about why they chose 12-bit over 10-bit, and one reason specifically was that 10-bit quantization is too coarse and causes color banding; that much is said. Do semantics generally bog you down this much? Seems a fruitless endeavor. It's not as if this info isn't found in other places as well. Just found it on Rtings too...

"Dolby Vision content allows for up to 12 bit color; HDR10 is only 10 bit. It might not sound like a lot of difference, but you have to remember that the difference of 2 bits here is the difference between 1.07 billion colors and 68.7 billion. This means much smoother graduations between colors and no color banding in skies. 12-bit is simply better than 10... The higher the bit depth, the smoother it will be."

You and skycaptain both got confused because of the Panasonic UHD Blu-ray players that recently got a firmware update for 10-bit output. Fair enough, honest mistake, no big deal; I politely pointed out your error, and yet you respond with sass. You haven't openly recognized the errors of your arguments, something I typically do as soon as I am shown to be mistaken about something. You've yet to actually point out anything I was genuinely wrong about, but you cannot say the same about me. Oh well, I've lost interest in your opinions and your childish arguments. I thought you were ignoring me now; since you cannot be cordial, please do.

PS: please learn the difference between color bit depth and a TV's rated nits output; apparently you're getting bits and nits confused. If your TV is only 10 or 12 nits, odds are you have more problems than just some simple color banding.

Last edited by philochs; 05-20-2017 at 01:32 PM.
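The two color counts in that Rtings quote are easy to sanity-check: codes per channel is 2 to the bit depth, cubed across the R, G, and B channels.

```python
# Quick arithmetic check of the figures quoted from Rtings:
# total representable colors = (codes per channel) ** 3.
def total_colors(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

print(f"{total_colors(10):,}")  # 1,073,741,824  -> the "1.07 billion" figure
print(f"{total_colors(12):,}")  # 68,719,476,736 -> the "68.7 billion" figure
```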
#758
Senior Member

Quote:
Dude, do you have a scoop on 12-bit panel displays? I assume they're in the works, naturally, but I've never found anything concrete that really says much about them. You mean, like, in the works for 3-5 years from now?
#759
Senior Member

Quote:
As far as my concerns about the initial reviews of this disc go, it was just some OCD paranoia. I don't like initial reviews questioning how dark a transfer is; things like that worry me. Now that I've read more feedback, I feel a lot better about how this disc seems to have turned out. I'm actually watching the old Blu-ray now, as it's the best copy I have. This movie is dark as hell. I don't watch this film twice or more a year, as I do with some, so that point wasn't so fresh in my mind earlier.

As for the limitations of static metadata: the idea that dynamic metadata is genuinely what they want to show consumers but has been limited by technology, that static metadata is only optimized for the bright scenes, and that dynamic metadata improves shadow detail, and darker scenes specifically, on any TV, that info has all been put out by SMPTE at one time or another. I haven't seen Dolby Vision theatrical yet, but I've heard first-hand accounts of how much better the shadow details are, in 'Kong: Skull Island' for instance.

Much of what you're asking about can be found here... https://www.smpte.org/sites/default/...V2-Handout.pdf and this video, for instance; interesting watch...
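To make the static-vs-dynamic metadata point concrete, here's a deliberately naive Python sketch. This is not any real Dolby Vision or HDR10 tone-mapping algorithm (real mappers use roll-off curves, and the function and all values here are made up for illustration); it just shows why one film-wide peak value can punish a dark scene that a per-scene peak would leave alone:

```python
# Hypothetical sketch: static vs dynamic HDR metadata.
# With static metadata the display tone-maps everything for the film's
# worst-case peak; with dynamic metadata it can use each scene's own peak.
def tone_map(nits, assumed_peak, display_peak):
    """Naive peak-relative scaling (illustrative only, not a real mapper)."""
    if assumed_peak <= display_peak:
        return min(nits, display_peak)      # scene fits the display as-is
    return nits * display_peak / assumed_peak  # scale down for the peak

film_peak = 4000       # a static, MaxCLL-style value for the whole title
display_peak = 600     # a mid-range HDR TV (made-up number)

pixel = 50             # a shadow detail, in a scene that never exceeds 200 nits
static = tone_map(pixel, film_peak, display_peak)   # scaled for 4000-nit worst case
dynamic = tone_map(pixel, 200, display_peak)        # this scene's own peak fits
print(static, dynamic)  # 7.5 50 -- the static path crushes the dark scene
```

The asymmetry is the point: bright scenes drive the static value, so it's the dark scenes that pay for it, which matches the shadow-detail claims above.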
#760
Blu-ray Ninja

Quote:
I know the difference between bits and nits. I wrote the wrong word. Give me a break. And no, nowhere on page 9 is it even implied that 12 bits on a 10-bit display will not produce banding.

Look, I'm not arguing that DV isn't superior. All I was responding to was the claim about whether 12 bits on 10-bit displays can produce banding. I asked for a link. You provided it. It doesn't support what you said. Period. That's the problem here. Nobody wants to read your assumptions, and that's what you're providing. Yes, you use factual knowledge as well, but you're also filling in the blanks however you please and passing that off as fact too.

You are moving the goalposts here. You claimed to have a fact, and now you're saying, "well, it's implied". Then nobody takes you seriously, and you decide to be elitist and condescending, acting like NOBODY can know everything YOU know. You don't even see any of this, either; that's the sad part.
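For what it's worth, the 12-bits-on-a-10-bit-panel concern is easy to illustrate. If a display pipeline simply truncates the two least-significant bits (a worst case; real pipelines usually dither or otherwise smooth the conversion), a smooth 12-bit ramp collapses into fewer, coarser steps:

```python
# Illustrative worst case: showing a smooth 12-bit ramp on a 10-bit panel by
# dropping the 2 least-significant bits merges every 4 adjacent codes into one.
ramp_12bit = list(range(2000, 2032))          # 32 consecutive 12-bit code values
ramp_on_10bit = [c >> 2 for c in ramp_12bit]  # naive truncation to 10 bits
print(len(set(ramp_12bit)), len(set(ramp_on_10bit)))  # 32 8
```

Whether those merged steps are visible as banding depends on the content and on how the conversion is done, which is exactly the open question in this exchange.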
Thanks given by: bruceames (05-20-2017)