#521
Blu-ray Samurai

Quote:
because that's what most of the experts and industry people are saying... how do you know that it won't be?... sounds more like you don't want it to be true or you own stock in Samsung... are you saying that there are going to be major differences between native HDR10+ implementation versus LG's Active HDR method?

#522
Blu-ray Samurai
Jul 2008

Quote:
But to say Active HDR is the same thing as HDR10+ is just false information. And I don't think spreading false information is a good thing.

#523 |
Special Member

actually, i do

#524 |
Member

I've been following this thread for the past few pages, and I gotta say that I haven't read anyone who has claimed that they are literally the same thing. The distinction has been made quite clearly a few times: LG's Active HDR is a workaround that is achieving the same thing that HDR10+ is achieving.
Perhaps I am missing something?

#525 |
Power Member

Active HDR and HDR10+ are entirely different things.

Active HDR = the display is buffering frames and analyzing them to determine the best tone map on a scene-by-scene basis; no metadata is generated, and the display does not need to use any of the static metadata it has incoming.

HDR10+ = dynamic METADATA added to the content to let the display know how to adjust on a scene-by-scene basis.

There is also a third solution being used: displays that use the metadata in HDR10 to try to intelligently tone map. This means they actually use the metadata to make an intelligent decision on what the peak value and such should be. This solution will only be as good as the metadata, which is frequently wrong or generic (a copy of the mastering display properties).

I don't know why I keep seeing posts from people saying that the display is creating its own metadata. It is not. It is processing the content based on what is probably an internal histogram of buffered frames. Metadata is just packets of information provided so the display doesn't have to do any analysis.

HDR10+ vs "Active HDR": in most cases, if the display does a good job, the difference could be slight, but HDR10+ (if the metadata is reliable) will take out any guesswork or limitations of the histogram (big changes in content from scene to scene). But so far the quality of the metadata in HDR10 has been really poor, so the idea that the frame-by-frame metadata of HDR10+ will be more reliable has yet to be proven.

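A rough Python sketch of that distinction (hypothetical nit values and a made-up roll-off curve, not anyone's actual TV firmware): all three approaches end up feeding the same kind of tone-mapping curve, and what differs is only where the assumed source peak comes from, i.e. one static value for the whole title, a histogram of buffered frames, or per-scene metadata carried with the content.

```python
import numpy as np

def tone_map(nits, source_peak, display_peak, knee=0.75):
    """Toy highlight roll-off: pass luminance through unchanged below a knee,
    then compress everything between the knee and the assumed source peak
    into the display's remaining headroom. Real TVs use far fancier curves."""
    knee_nits = knee * display_peak
    compressed = knee_nits + (nits - knee_nits) * (display_peak - knee_nits) / max(source_peak - knee_nits, 1e-6)
    return np.clip(np.where(nits <= knee_nits, nits, compressed), 0.0, display_peak)

# Static HDR10: one value for the whole title, often generic (e.g. just a copy
# of a 1000 or 4000 nit mastering display peak). Hypothetical number.
static_peak = 4000.0

# "Active HDR"-style analysis: no trusted metadata, so estimate the peak from
# a histogram of recently buffered frames instead.
def analyzed_peak(frame_buffer, percentile=99.9):
    return float(np.percentile(np.concatenate(frame_buffer), percentile))

# HDR10+-style dynamic metadata: the content itself says how bright this scene gets.
scene_peak_from_metadata = 650.0  # hypothetical dim-interior scene

display_peak = 750.0                                  # hypothetical OLED peak
frame = np.array([0.05, 2.0, 120.0, 480.0, 620.0])    # pixel luminances in nits

print(tone_map(frame, static_peak, display_peak))               # squeezed needlessly
print(tone_map(frame, analyzed_peak([frame]), display_peak))    # close, but guesswork
print(tone_map(frame, scene_peak_from_metadata, display_peak))  # tracks the scene
```

Feed the curve a peak far above anything the scene actually contains (the generic static value) and the highlights get compressed for no reason; the frame analysis or the per-scene metadata is what removes that guesswork.
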
#526 |
Special Member

"false information"?
Last edited by jibucha; 12-02-2018 at 08:17 PM.

#527
Blu-ray Emperor

Quote:
Funnily enough though, the in-house variants may yet prove to be more suitable for this purpose, because + doesn't apparently do frame-by-frame metadata, and during the + demonstrations earlier this year people noted how the metadata seemed to lag behind the image. That could just be typical technological teething troubles, but even so, it's something to keep an eye out for as official + software is released.

[edit] And, having just read Kris' well-informed post (a rarity for this thread), it could well work better the other way around if the TV's 'active' processor is being taxed too much by the content, i.e. the + metadata takes away the real-time processing strain.

Last edited by Geoff D; 12-02-2018 at 04:03 PM.
Thanks given by: MisterXDTV (12-02-2018)

#528 |
Special Member

is HDR10+ useless?
Last edited by jibucha; 12-02-2018 at 08:16 PM.

#529
Blu-ray Samurai
Jul 2008

Quote:
Again, as I said: Warner and Sony use Dolby Vision exactly like they would use HDR10+. With a few kb/s for metadata... nothing more.

#530 |
Special Member

Active HDR (LG/Sony) & HDR10+ (what's the difference?)
Last edited by jibucha; 12-06-2018 at 06:55 AM.

#531 |
Special Member

Thanks given by: BladeRunner86 (12-02-2018)

#532 |
Blu-ray Samurai
Jul 2008

#536
Special Member
May 2017
Earth v1.1, awaiting v2.0

Quote:
Hopefully my questions are not as unsolvable as the post title suggests.

#537
Blu-ray Samurai

Quote:
#538 |
Special Member

my understanding (if memory serves) is that 'it can and has that capability', but whether the director/studio/production facility implements it is questionable.

the problem (as i see it) is that HDR10+ is generally the preferred option, if implemented, to save on costs/time and not to 'optimize the picture quality' the way Dolby Vision does (for example). please note that i am not raising the Dolby Vision banner here, but contrasting, at times, is essential to communication (forgive me?).

in fairness, HDR10+ is/would be a significant improvement over HDR10, but we already have HDR10 as a baseline (Blu-ray standards) and Dolby Vision does far more already, so 'what's the point?' we already have 'more' currently available, and HDR10+ 'complicates the marketplace', at least as i see it.

what do you think?

#539
Power Member

Quote:

#540 |
Power Member

DV vs HDR10: a couple of things here. The difference between these two will rely heavily on a couple of factors.

The display. Let's say you are watching the two on an OLED. The dynamic metadata isn't doing anything to help out a dynamic contrast system, because the OLED's contrast is already infinite, so no issue there. For tone mapping, it really depends on the content. Most of the titles on the market right now are 1000 nits or less, so the actual amount of "tone mapping" is very small for displays that are as bright as most of the OLEDs/LCDs on the market. You would see a difference in difficult material (high peaks), but those may be few and far between.

The other part is 12 bit vs 10 bit. 10 bit is already REALLY good, so unless you run into a test case where there just wasn't enough bit depth at 10 bit, there is a chance the 12-bit version may avoid a bit of banding and have slightly better color resolution.

HDR10+ is still new. It is supposed to encode per-frame data as well. The reason I think Samsung is pushing this so hard is that a lot of their panels are edge-lit displays that really rely on good dynamic contrast to make the most of their contrast. Per-frame data helps tremendously here, because the display isn't guessing or adapting over time; it knows exactly what the overall APL should or shouldn't be.

Some displays ignore metadata altogether and do their own tone mapping based on frame buffering and analysis; others do a mixture of both. The latter depends a lot on how good the metadata is. If you look at disc releases, the majority of studios are reporting 1000 nits. Sony and Warner are probably the biggest outliers, and even they can be very hit or miss in accuracy (lots of MaxCLL values that exceed the mastering display). This is why you see people talking about titles like BR2049 either looking amazing or WAY too dark (using the display max for the tone map, which would be the WRONG answer).

I haven't paid too much attention to HDR10+, but I've had a lot of talks about it with the folks at SpectraCal and with Stacey Spears. Until it is adopted by more than some Amazon titles, or niche titles that I couldn't care less about, I have more to do with my time. I wouldn't mind seeing it adopted more widely, though: because of its implementation, and Dolby's reluctance to support projectors or standalone video processors, it may be a great solution for those of us looking for more reliable HDR performance from home theater projectors. Not only do they require aggressive tone mapping, almost all of them use a dynamic contrast system of some sort.

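On the 10-bit vs 12-bit point, a quick back-of-the-envelope with the standard SMPTE ST 2084 (PQ) curve shows why 10 bit already holds up well for typical 1000-nit content. This is a simplified full-range calculation (no limited/legal video range, no dithering), so treat the exact counts as rough.

```python
import numpy as np

# SMPTE ST 2084 (PQ) inverse EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Absolute luminance (0..10000 nits) -> normalized PQ signal (0..1)."""
    y = np.asarray(nits, dtype=float) / 10000.0
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

def codes_up_to(nits, bits):
    """How many integer code values a full-range encode spends from 0 up to `nits`
    (ignores the limited/legal range that actual video signals use)."""
    return int(round(pq_encode(nits) * (2**bits - 1)))

for bits in (10, 12):
    print(f"{bits}-bit PQ: ~{codes_up_to(100, bits)} codes below 100 nits, "
          f"~{codes_up_to(1000, bits)} codes below 1000 nits")
```

Roughly three quarters of the 10-bit code space already sits at or below 1000 nits, which is why the extra two bits mostly matter in edge cases like subtle gradients (banding) rather than as a wholesale picture-quality jump.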
Thanks given by: bruceames (12-11-2018), cjake (01-03-2019), gkolb (12-03-2018), Leterface (12-03-2018), Staying Salty (12-03-2018)