05-20-2017, 01:34 AM   #741
PeterTHX (Banned, joined Sep 2006)

Quote:
Originally Posted by Khronikos View Post
In particular, I think you need to push on the brakes a bit bud. You are talking about pushing an extra amount of data that is NOT insignificant into the same space as the more compressed versions. This is not something that just magically looks better. Many people may not even be able to tell the difference outside of some minor banding improvements.
Sounds like the difference between high bitrate lossy vs. lossless codecs. I can't see many people here advocating for lossy audio encodes to save space on discs.
05-20-2017, 05:06 AM   #742
Khronikos (Banned, joined May 2013)

Quote:
Originally Posted by PeterTHX View Post
Sounds like the difference between high bitrate lossy vs. lossless codecs. I can't see many people here advocating for lossy audio encodes to save space on discs.
Yeah, except for the fact that audio and video are two completely different things. And one happens to tie up a lot more data than the other. Other than that I think you are on the right track.
05-20-2017, 05:19 AM   #743
philochs (Senior Member, joined Feb 2012)

Quote:
Originally Posted by mzupeman View Post
Yeah. Nowhere on that page does it say what he claims it says. And with that, I'll just ignore what he had to say from now on.

It says exactly what I said it says. It talks about why they chose 12-bit encoding instead of 10-bit encoding. I'm not wrong about anything I said. Sorry, you literally don't understand the difference between a Panasonic UHD player that forced 10-bit disks to do 12-bit output, and actual Dolby Vision. Sorry if the technical jargon is too complicated for some people to understand, but it is clearly there, so stop spreading your keyboard vomit libel and misinfo...


Some generic HDR approaches (HDR10) use the PQ EOTF with 10 bits instead of 12. Why is it important to use 12 bits instead of 10? The human visual system has different sensitivities at different levels of brightness, and it is particularly sensitive to small changes over large areas of nearly uniform brightness. The following graph shows how noticeable 10- and 12-bit quantization is, depending on luminance. 10-bit quantization is always above the visual threshold, meaning that the average user can see noticeable differences with each change of luminance value. In natural scenes, noise in the scene can often mask this difference, but areas such as blue sky will often show banding or contouring if the quantization steps are too coarse.


One of the things that is being said is that Dolby Vision uses 12-bit encoding because 10-bit encoding can cause color banding, for instance, in blue skies. Even if you can't understand that for some reason, it's not hard to understand that some UHD Blu-rays, including The Revenant do suffer from minor color banding, while Dolby Vision content does not. It's hilarious that someone would so harshly reject this, without any idea what they are talking about, and without any facts to back them up.
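For anyone who wants to see what that graph is describing in numbers, here is a small sketch (my own illustration using the published SMPTE ST 2084 constants, not anything taken from Dolby's document) comparing the luminance step between adjacent 10-bit and 12-bit PQ code values at a few brightness levels:

[CODE]
# Illustrative sketch: PQ (SMPTE ST 2084) quantization steps at 10 vs 12 bits.
# Constants are the published ST 2084 values; full-range code values assumed.

m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(code, bits):
    """Map an integer code value to absolute luminance in nits."""
    e = code / (2 ** bits - 1)               # normalized non-linear signal, 0..1
    ep = e ** (1 / m2)
    return 10000.0 * (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)

def step_near(target_nits, bits):
    """Luminance at the code value closest to target_nits, plus the step to the next code."""
    code = min(range(2 ** bits - 1), key=lambda c: abs(pq_eotf(c, bits) - target_nits))
    level = pq_eotf(code, bits)
    return level, pq_eotf(code + 1, bits) - level

for nits in (1, 10, 100, 1000):
    for bits in (10, 12):
        level, step = step_near(nits, bits)
        print(f"{bits}-bit near {nits:>4} nits: step = {step:.4f} nits ({100 * step / level:.2f}% of level)")
[/CODE]

The 12-bit steps come out roughly four times finer than the 10-bit ones at the same brightness, which is the headroom the whitepaper is pointing at; whether a given step is actually visible on a given panel is a separate question.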
05-20-2017, 05:37 AM   #744
LordoftheRings (Special Member, joined Mar 2010, Portishead ♫)

Very informative thread on a film that I adore.

Why didn't they do an Atmos track?
Also, why don't they include an improved regular 1080p Blu-ray with all these 4K Blu-rays, for the high prices they charge (between $30 and $35)? But here with 'Unforgiven', thankfully, they did...for $33 (I must have five or six formats/versions of this flick already).
05-20-2017, 05:38 AM   #745
philochs (Senior Member, joined Feb 2012)

Quote:
Originally Posted by Geoff D View Post
That's a negatory on "all source material" being 12 or 16-bit. The latter is a rarity and although 12-bit is more common nowadays a LOT of DIs are comprised of 10-bit DPX files, bearing in mind that the process is a good 15 years old already. And 10-bit is still very much in use today, I read the AC article on a very recent movie (can't remember which, if I find it I'll edit this post) and it specifically mentioned 10-bit mastering.

BUT at least the 12-bit DV encode of that content will basically be a 'supersampled' version and should keep banding at bay, as the same is true of the 12-bit XYZ gamut that such 10-bit P3 content is mapped to when being prepared for a theatrical DCDM.
Thanks for the clarification about the 10-bit sources. I'm pretty sure that the majority of 35mm scans are at least 12-bit though. You can easily scan 35mm at 16-bit, and a 10-bit film scanner is pretty cheapo in 2017. As far as Age of Ultron goes, even though the DCP is 10-bit, certainly the raw 3.4K digital files are saved somewhere with their natural 12-bit color depth. If they spend the cash to do a new 4K DI, certainly it would be true 12-bit. There is a 12-bit source for that film, as it was filmed in 'ARRIRAW'.

I'm thankful you are willing to acknowledge that Dolby Vision material will not have issues with color banding. Someone has to know what they're talking about around here...
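To put the 'supersampled' idea in concrete terms, a minimal sketch (my own illustration, not a description of any particular encoder): a 10-bit master carried in a 12-bit container simply lands on every fourth 12-bit code, so nothing is lost and no new banding is introduced, whereas pushing 12-bit content back through a 10-bit pipeline forces a re-quantization that can bring steps back unless it is dithered:

[CODE]
# Re-expressing 10-bit code values in a 12-bit container ("supersampling").
def ten_to_twelve(code10: int) -> int:
    return code10 << 2                      # 0..1023 -> 0, 4, 8, ... 4092

def twelve_to_ten(code12: int) -> int:
    return code12 >> 2                      # lossy: four 12-bit codes collapse to one

samples_10bit = [0, 1, 512, 940, 1023]
print([ten_to_twelve(c) for c in samples_10bit])                  # [0, 4, 2048, 3760, 4092]
print([twelve_to_ten(ten_to_twelve(c)) for c in samples_10bit])   # round-trips exactly

# A true 12-bit grade uses the in-between codes (e.g. 3761, 3762, 3763) that a
# 10-bit source never produces; truncating those back to 10 bits is where
# banding can reappear if no dithering is applied.
[/CODE]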
05-20-2017, 06:26 AM   #746
philochs (Senior Member, joined Feb 2012)

Quote:
Originally Posted by Khronikos View Post
You are talking about pushing an extra amount of data that is NOT insignificant into the same space as the more compressed versions. This is not something that just magically looks better. Many people may not even be able to tell the difference outside of some minor banding improvements.

I'm not saying you are pushing an agenda or anything, I'm saying it is definitely best to wait and see. 66GB is too small for a lot of 4K films IMO. And god knows what kind of filtering they are using behind the scenes anyway. We don't know a whole lot about how 4K discs are authored to be talking with authority. Some discs look better than others. Some discs look terrible. No one is sure quite why at this point.

And now we are going to suddenly be pushing 4:2:2 and 12-bit into the encodes in the same space? I can't see this occurring without major problems for some films with 66GB discs for sure. We'll see what they do.

You've already made the point that BD66 isn't ideal for a UHD disk. I've agreed with you about that. Dolby Vision HDR is a premium HDR format that charges fees. If including it on a BD66 wasn't going to produce better results than HDR10 only, then they wouldn't bother adding it to the BD66 disks in the first place.

I'm sure there will be a way to toggle between Dolby Vision and HDR10 if your tv is compatible. I'll wait 2&1/2 weeks for the disks to be released, and for the UHD BD DV vs HDR10 shoot-outs before I throw the most premium HDR format under the bus. Hopefully the UHD Blu-ray players will get their fw updates by June 6th. I'll be glad when this issue is put to bed.

Just a hunch, but I think most of the negative blowback about Dolby Vision is coming from people that already have 4K setups that won't be able to take advantage of Dolby Vision, so they're just hoping that it's going to suck. I don't expect anyone to take my word for it, but these people will not be vindicated.
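Purely as a back-of-the-envelope for the 'same space' worry in the post quoted above (uncompressed numbers only; HEVC compression efficiency is a separate question, and the 12-bit 4:2:2 case is the hypothetical being raised, not what the spec actually carries):

[CODE]
# Rough uncompressed bits-per-pixel comparison, before any HEVC compression.
# 4:2:0 = 1 luma + 2 quarter-resolution chroma samples per pixel (1.5 samples/pixel)
# 4:2:2 = 1 luma + 2 half-resolution chroma samples per pixel    (2.0 samples/pixel)

def bits_per_pixel(bit_depth, samples_per_pixel):
    return bit_depth * samples_per_pixel

current  = bits_per_pixel(10, 1.5)    # 10-bit 4:2:0 -> 15 bits/pixel
proposed = bits_per_pixel(12, 2.0)    # 12-bit 4:2:2 -> 24 bits/pixel
print(proposed / current)             # 1.6x more raw data for the same disc
[/CODE]

Whether that 1.6x survives into the compressed bitrate depends entirely on the encoder and the content, which is really the open question in this exchange.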
05-20-2017, 06:40 AM   #747
StingingVelvet (Blu-ray Grand Duke, joined Jan 2014, Philadelphia, PA)

Quote:
Originally Posted by philochs View Post
Just a hunch, but I think most of the negative blowback about Dolby Vision is coming from people that already have 4K setups that won't be able to take advantage of Dolby Vision, so they're just hoping that it's going to suck. I don't expect anyone to take my word for it, but these people will not be vindicated.
I'm not hopeful it's going to suck at all, and in fact I doubt it will. I predict people without HDR Premium sets will likely sing its praises. What I am skeptical of is how big a deal it will be for people with HDR Premium sets, especially after seeing that presentation slide that basically says the dynamic metadata won't even turn on if your set can handle the static signal well.
05-20-2017, 06:44 AM   #748
philochs (Senior Member, joined Feb 2012)

Quote:
Originally Posted by LordoftheRings View Post
Very informative thread on a film that I adore.

Why didn't they do an Atmos track?

I just recently read that a DTS track is currently cheaper for studios to produce than Dolby Atmos, and that it's also a more simplified workflow. You can do the DTS-HD Master Audio or DTS:X encode using a plugin within the video software. With Dolby Atmos, you have to use the Dolby Media Producer Suite v2.0; there are currently no Atmos encoding plugins. At least it's lossless 5.1 audio, which was probably deemed sufficient for a film that only had a stereo track to begin with. Lossless Atmos or DTS:X would've been cool though.
05-20-2017, 06:50 AM   #749
philochs (Senior Member, joined Feb 2012)

Quote:
Originally Posted by StingingVelvet View Post
I'm not hopeful it's going to suck at all, and in fact I doubt it will. I predict people without HDR Premium sets will likely sing its praises. What I am skeptical of is how big a deal it will be for people with HDR Premium sets, especially after seeing that presentation slide that basically says the dynamic metadata won't even turn on if your set can handle the static signal well.
I said most, you're one of the good ones, Velvet. I'm personally more hopeful about Dolby Vision on high-nit sets than I am skeptical, but I understand where you're coming from. To me, that's important. The people that are balking about even the most basic concepts of Dolby Vision seem to me a bit befuddled and bewildered. I imagine at least a few of them feel they have an ax to grind. No worries.
05-20-2017, 09:10 AM   #750
Geoff D (Blu-ray Emperor, joined Feb 2009, Swanage, Engerland)

I *think* I know what I'll be watching on the ZD9 tonight:

05-20-2017, 09:15 AM   #751
PeterTHX (Banned, joined Sep 2006)

Quote:
Originally Posted by philochs View Post
I just recently read that a DTS track is currently cheaper for studios to produce than Dolby Atmos, and that it's also a more simplified workflow. You can do the DTS Master Audio or DTS:X using a plugin within the video software. With Dolby Atmos, you have to use the Dolby Media Producer Suite v.2.0, there are currently no Atmos encoding plugins. At least it's lossless 5.1 audio, that was probably just deemed sufficient for a film that only had stereo track to begin with. Lossless Atmos or DTS:X would've been cool though.
The Atmos software is just as easy to use as DTS:X.


The truth? DTS is actually *paying* for their own remixes & encodes. That's the reason it's "cheaper". Other than Harry Potter, Warner hasn't remixed any of their titles in immersive codecs and just ports the Blu-ray encodes.
05-20-2017, 09:16 AM   #752
PeterTHX (Banned, joined Sep 2006)

Quote:
Originally Posted by Khronikos View Post
Yeah, except for the fact that audio and video are two completely different things. And one happens to tie up a lot more data than the other. Other than that I think you are on the right track.
Except for the other fact that people can see a difference between 10 & 12-bit and actual 12-bit displays are in the works.
05-20-2017, 11:12 AM   #753
emgesp (Senior Member, joined Jun 2008)

From the caps comparison I actually prefer the original release. Better color timing and detail.
05-20-2017, 12:06 PM   #754
mzupeman (Blu-ray Ninja, joined Oct 2009, Upstate New York)
Unforgiven (1992) 4K UHD

Quote:
Originally Posted by philochs View Post
It says exactly what I said it says. It talks about why they chose 12-bit encoding instead of 10-bit encoding. I'm not wrong about anything I said. Sorry, you literally don't understand the difference between a Panasonic UHD player that forced 10-bit disks to do 12-bit output, and actual Dolby Vision. Sorry if the technical jargon is too complicated for some people to understand, but it is clearly there, so stop spreading your keyboard vomit libel and misinfo...

Some generic HDR approaches (HDR10) use the PQ EOTF with 10 bits instead of 12. Why is it important to use 12 bits instead of 10? The human visual system has different sensitivities at different levels of brightness, and it is particularly sensitive to small changes over large areas of nearly uniform brightness. The following graph shows how noticeable 10- and 12-bit quantization is, depending on luminance. 10-bit quantization is always above the visual threshold, meaning that the average user can see noticeable differences with each change of luminance value. In natural scenes, noise in the scene can often mask this difference, but areas such as blue sky will often show banding or contouring if the quantization steps are too coarse.

One of the things that is being said is that Dolby Vision uses 12-bit encoding because 10-bit encoding can cause color banding, for instance, in blue skies. Even if you can't understand that for some reason, it's not hard to understand that some UHD Blu-rays, including The Revenant do suffer from minor color banding, while Dolby Vision content does not. It's hilarious that someone would so harshly reject this, without any idea what they are talking about, and without any facts to back them up.


The page explains why 12 is better than 10. Nowhere on that page does it say 12 nits on 10 nit displays will not produce banding. Did you forget what you were responding to in the first place? Good grief.
05-20-2017, 01:02 PM   #755
Khronikos (Banned, joined May 2013)

Quote:
Originally Posted by philochs View Post
You've already made the point that BD66 isn't ideal for a UHD disk. I've agreed with you about that. Dolby Vision HDR is a premium HDR format that charges fees. If including it on a BD66 wasn't going to produce better results than HDR10 only, then they wouldn't bother adding it to the BD66 disks in the first place.

I'm sure there will be a way to toggle between Dolby Vision and HDR10 if your tv is compatible. I'll wait 2&1/2 weeks for the disks to be released, and for the UHD BD DV vs HDR10 shoot-outs before I throw the most premium HDR format under the bus. Hopefully the UHD Blu-ray players will get their fw updates by June 6th. I'll be glad when this issue is put to bed.

Just a hunch, but I think most of the negative blowback about Dolby Vision is coming from people that already have 4K setups that won't be able to take advantage of Dolby Vision, so they're just hoping that it's going to suck. I don't expect anyone to take my word for it, but these people will not be vindicated.
For the record, I don't even have a 4K TV yet as I will be waiting for HDMI 2.1. I have a couple discs, but even then only John Wick, Sicario, and The Revenant so far.

I'm sure they will find a way to make things okay, I'm just saying I don't always trust "them" either. We have already seen too many mediocre or plain bad transfers on UHD IMO.
05-20-2017, 01:04 PM   #756
Khronikos (Banned, joined May 2013)

Quote:
Originally Posted by emgesp View Post
From the caps comparison I actually prefer the original release. Better color timing and detail.
LOL. The original release is mediocre at best with terrible audio.
05-20-2017, 01:09 PM   #757
philochs (Senior Member, joined Feb 2012)

Quote:
Originally Posted by mzupeman View Post
The page explains why 12 is better than 10. Nowhere on that page does it say 12 nits on 10 nit displays will not produce banding. Did you forget what you were responding to in the first place? Good grief.

Yeah, it didn't say those exact words, in that exact order. It just says Dolby settled on 12 bits because 10-bit causes banding in blue skies and such. It never actually says that 12-bit doesn't cause banding, but that is implied, because they are talking about why they chose 12-bit over 10-bit, and one reason specifically was that 10-bit quantization is too coarse and causes color banding; that much is said. Do semantics generally bog you down this much? Seems a fruitless endeavor.

It's not as if this info isn't found in other places as well. Just found it on Rtings too...

"Dolby Vision content allows for up to 12 bit color; HDR10 is only 10 bit. It might not sound like a lot of difference, but you have to remember that the difference of 2 bits here is the difference between 1.07 billion colors and 68.7 billion. This means much smoother graduations between colors and no color banding in skies. 12-bit is simply better than 10... The higher the bit depth, the smoother it will be."

You and skycaptain both got confused because of the Panasonic UHD Blu-ray players that recently got a firmware update for 10-bit output. Fair enough, honest mistake, no big deal; I politely pointed out your error, and yet you respond with sass. You haven't openly recognized the errors of your arguments, something I typically do as soon as I am shown to be mistaken about something. You've yet to actually point out anything I was genuinely wrong about, but you cannot say the same about me. Oh well, I've lost interest in your opinions and your childish arguments. I thought you were ignoring me now; since you cannot be cordial, please do.

PS please learn the difference between color-bit depth and a tv's rated nits output, apparently you're getting bits and nits confused. If your tv is only 10 or 12 nits, odds are you have more problems than just some simple color banding.

Last edited by philochs; 05-20-2017 at 01:32 PM.
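For what it's worth, the Rtings figures quoted above are just the per-channel code counts cubed; a two-line check:

[CODE]
print((2 ** 10) ** 3)   # 1_073_741_824  -> "1.07 billion" colors at 10 bits per channel
print((2 ** 12) ** 3)   # 68_719_476_736 -> "68.7 billion" colors at 12 bits per channel
[/CODE]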
05-20-2017, 01:46 PM   #758
philochs (Senior Member, joined Feb 2012)

Quote:
Originally Posted by PeterTHX View Post
Except for the other fact that people can see a difference between 10 & 12-bit and actual 12-bit displays are in the works.

Dude, do you have a scoop on 12-bit panel displays? I assume they're in the works, naturally, but I've never found anything concrete that says much about them. You mean, like, in the works for 3-5 years from now?
05-20-2017, 02:36 PM   #759
philochs (Senior Member, joined Feb 2012)

Quote:
Originally Posted by jaaguir View Post
I'm sorry I couldn't get to replying to you sooner; I've barely had time to read the forum, much less write, for the last week or so. I'm only quoting the paragraphs that I have some issue with:


We're going to have to agree to disagree on that one. You trust reviews blindly, so you feel you can pass judgement on this release without having seen it. I don't agree, but let's move on.


I'm sorry (and I give you that I'm not as technically savvy as you are) but it's the first time I've read anything like that. Are you actually stating that HDR10 is such a broken technology that it just cannot correctly reflect the HDR grading of a movie with both nighttime and daytime scenes? (How could we ever make anything out on the screen with SDR709, and the even more limited standard on DVD, then?)
I believe Geoff answered that in this post, and in your answer to his post, I don't think you refuted anything.

Sorry for generalizing but summing it up, as I understand it, in the end there are two points you're trying to make here, and also in almost every post of yours:
One is that movies will look better with DV. I have some small differences with your opinion about that, but overall I have no problem with that point, and I'm not going to argue with you over it. It's going to be true for many TV sets, and even the high-end LCDs might benefit, even if only slightly (I know you believe they too will benefit greatly). I appreciate that you've read all those white papers and technical stuff, and maybe you'll be right and it will be great for all.
The other point is that HDR10 is a flawed technology. From what I know, I cannot agree with that one, sorry.
Those two points are not correlated; one being true doesn't mean the other one is too. But I believe they've been mixed up often in the discussion. Someone might make a point defending HDR10, and the other might answer listing the benefits of DV, for example. It can be tricky to separate, but maybe it's difficult to reach agreements that way.

As far as I know, digital grading has always been done shot by shot, so to speak, by the colorist, within the range of the technology that he's working in. The HDR10 metadata having some "single brightness level for the entire movie", according to you, doesn't mean the colorist didn't do his work; it doesn't mean that the final encode cannot have moments of 1200-nit brightness on screen at some point, and then have 5-nit moments of darkness in another scene. This has already been measured and proved true for existing UHD releases (I'm thinking of zmarty's video of "ST: Beyond"). So I'm sorry, but with my limited knowledge I cannot make sense of your statement.
It's one thing that the static metadata cannot adapt itself scene by scene to best fit your TV, and it's another thing that the movie wasn't graded properly scene by scene or shot by shot. And why does the static metadata need adapting? Because many TV sets out there don't come close to the capabilities of the monitor the grade was made on, and thus it might look bad. They're really not fit for the format, period. But that's another war.
So in my humble opinion the real problem here is that every TV manufacturer is implementing HDR10 in a different way, and then adding to that by producing sets with different capabilities (and studios producing discs with different max-brightness values just makes it more confusing). That's not HDR10's fault.
Of course there is the possibility, like with any other release, that this release has been botched because something went wrong, but not because HDR10 cannot produce a top-notch look.

Will everything be better (to a bigger or lesser degree) with DV? If that's the main point you're trying to make after all, you won't get an argument with me there, I'm fine with that.
But does that mean HDR10 is faulty, as you keep repeating? No, I don't think so. And I'll be delighted if you can prove otherwise to me, in some way that I can understand; I'm open to it (I mean it sincerely, I think you can tell I have no personal animosity towards you), but so far I haven't found that in your posts.

As for my concerns about the initial reviews of this disk, it was just some OCD paranoia. I don't like initial reviews questioning how dark a transfer is; things like that worry me. Now that I've read more feedback, I feel a lot better about how this disk seems to have turned out. I'm actually watching the old Blu-ray now, as it's the best copy I have. This movie is dark as hell. I don't watch this film twice or more a year, as I do with some. That point wasn't so fresh in my mind earlier.

As for the limitations of static metadata: the ideas that dynamic metadata is genuinely what they want to show consumers but has been held back by technology, that static metadata is only optimized for the bright scenes, and that dynamic metadata improves shadow detail and darker scenes specifically and can improve picture quality on any TV, have all been put out by the SMPTE at one time or another. I haven't seen Dolby Vision theatrically yet, but I've heard first-hand accounts of how much better the shadow details are, in 'Kong: Skull Island' for instance.
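As a toy illustration of that static-versus-dynamic point (this is not Dolby's actual mapping, just a deliberately naive scale-to-fit with made-up peak values, to show why per-scene information can matter on a limited display):

[CODE]
# Toy sketch: why per-scene (dynamic) metadata can help a display that can't
# reach the grade's peak. NOT Dolby Vision's real algorithm; all numbers invented.

def naive_tone_map(nits_in, content_peak, display_peak):
    """Scale the signal so content_peak lands at display_peak (no-op if it already fits)."""
    if content_peak <= display_peak:
        return nits_in
    return nits_in * display_peak / content_peak

display_peak = 600    # a mid-range HDR TV
movie_peak   = 1200   # whole-movie peak, roughly what static metadata describes
scene_peak   = 120    # peak of one particular dark scene

pixel = 80            # an 80-nit highlight inside that dark scene
print(naive_tone_map(pixel, movie_peak, display_peak))   # 40.0 -> dimmed using the movie-wide peak
print(naive_tone_map(pixel, scene_peak, display_peak))   # 80   -> preserved using the per-scene peak
[/CODE]

A real implementation does something far smarter than a linear scale, but the asymmetry is the point: with only a whole-movie number to work from, the dark scenes pay for the bright ones.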

Much of what you're asking about can be found here...

https://www.smpte.org/sites/default/...V2-Handout.pdf


and this video for instance, interesting watch...

05-20-2017, 02:41 PM   #760
mzupeman (Blu-ray Ninja, joined Oct 2009, Upstate New York)

Quote:
Originally Posted by philochs View Post
Yeah, it didn't say those exact words, in that exact order. It just says Dolby settled on 12 bits because 10-bit causes banding in blue skies and such. It never actually says that 12-bit doesn't cause banding, but that is implied, because they are talking about why they chose 12-bit over 10-bit, and one reason specifically was that 10-bit quantization is too coarse and causes color banding; that much is said. Do semantics generally bog you down this much? Seems a fruitless endeavor.

It's not as if this info isn't found in other places as well. Just found it on Rtings too...

"Dolby Vision content allows for up to 12 bit color; HDR10 is only 10 bit. It might not sound like a lot of difference, but you have to remember that the difference of 2 bits here is the difference between 1.07 billion colors and 68.7 billion. This means much smoother graduations between colors and no color banding in skies. 12-bit is simply better than 10... The higher the bit depth, the smoother it will be."

You and skycaptain both got confused because of the Panasonic UHD Blu-ray players that recently got a firmware update for 10-bit output. Fair enough, honest mistake, no big deal; I politely pointed out your error, and yet you respond with sass. You haven't openly recognized the errors of your arguments, something I typically do as soon as I am shown to be mistaken about something. You've yet to actually point out anything I was genuinely wrong about, but you cannot say the same about me. Oh well, I've lost interest in your opinions and your childish arguments. I thought you were ignoring me now; since you cannot be cordial, please do.

PS please learn the difference between color-bit depth and a tv's rated nits output, apparently you're getting bits and nits confused. If your tv is only 10 or 12 nits, odds are you have more problems than just some simple color banding.


I know the difference between bits and nits. I wrote the wrong word. Give me a break.

And no. Nowhere on page 9 is it even implied that 12 bits on a 10 bit display will not produce banding.

Look, I'm not arguing that DV isn't superior. All I was responding to was that 12 bits on 10-bit displays can produce banding. I asked for a link. You provided it. It doesn't support what you said. Period. That's the problem here. Nobody wants to read your assumptions, and that's what you're providing. Yes, you use factual knowledge as well, but you're also filling in the blanks however you please and passing that off as fact too. You are moving the goalposts here. You claimed to have a fact and now you're saying, "well, it's implied". Then nobody takes you seriously, and then you decide to be elitist and condescending, acting like NOBODY can know everything YOU know. You don't even see any of this either; that's the sad part.