Go Back   Blu-ray Forum > 4K Ultra HD > 4K Ultra HD Players, Hardware and News
Old 02-04-2017, 03:38 AM   #181
philochs (Senior Member, Feb 2012)

Quote:
Originally Posted by Richard Paul View Post
PQ is a royalty free standard and is the basis for HDR10. While PQ is used by Dolby Vision that is only part of it and Dolby Vision includes Dolby dynamic metadata and ICtCp. Dolby Vision requires a license from Dolby and a royalty payment to Dolby. Samsung supports PQ but they have strongly rejected Dolby Vision. That is one of the reasons they created SMPTE ST 2094-40 so that there would be a royalty free standard for dynamic metadata.

PQ is a royalty free standard and Dolby doesn't make a penny from it. That is why Dolby is promoting Dolby Vision.

Again, I understand everything that you just wrote: PQ was invented by Dolby and is used in HDR-10. But I don't think you understand that I'm specifically talking about ATSC 3.0 and broadcast TV "Dolby Vision". Samsung didn't just support HDR-10's "PQ" at a panel; they actually threw their support behind broadcast Dolby PQ, which in reality is advertised and viewed as "Dolby Vision". When given the choice of the two HDR broadcast formats, Samsung effectively gave their nod to Dolby Vision as their preference for television broadcasting, and not Hybrid Log-Gamma.

The average person clicking through the TV channels will just know they are watching a broadcast of live HLG HDR TV or live on-air Dolby Vision HDR TV. They probably won't even realize the difference between Dolby PQ from a cable/satellite/TV tuner and real Dolby Vision on a UHD Blu-ray or on streaming sites, because any of this Dolby content, even the broadcast version ("Dolby PQ"), has to be read and recognized by the consumer's TV as a Dolby Vision signal. At CES 2017 they did advertise broadcast "Dolby Vision" over ATSC 3.0; they didn't call it "Dolby PQ" like the insiders do, as seen in this picture if you look close.


Broadcasters and insiders call it "Dolby PQ", but the public perception will be "Live Dolby Vision" over an HDMI 2.0a-2.1 UHD device like a cable box. Samsung isn't just going to keep neglecting to add Dolby Vision, or HLG for that matter, by the 2018 models. They'll have to pay for Dolby Vision, because otherwise many angry people would start calling and petitions would go online when people couldn't get half of their programs in broadcast HDR to work. Streaming sites are one thing, but if you anger all of your cable/satellite/TV-tuner users at once and cause panic, grief, and rage for a couple million people, that's a whole other issue altogether. What if the NFL goes with Dolby PQ exclusively for live broadcast HDR? It's all up to the networks and the NFL. What if CBS and Fox decide to use Dolby Vision HDR (PQ) exclusively for broadcast HDR, while ESPN, ABC, and NBC use only HLG HDR? What if all those channels use a combination of both for different programs?

Samsung is going to have to pay, and they know this, and they are okay with this. ATSC 3.0 standards are going to require their TVs to have Dolby Vision support. Sure, they avoided fees as long as they felt Dolby Vision was not a necessity for them in the US TV market. Now that it's an intimate part of ATSC 3.0, which will likely more than double their TV sales, and considering Dolby dropped the price, I think we can rest assured that every major US TV manufacturer will be including at least HDMI 2.1, native 120 Hz for high-frame-rate TV and gaming, a method for ST 2094 playback (HDR10+), an ATSC 3.0 tuner, Dolby Vision, and HLG on all of their 4K HDR models, at least the top-end models at first. Simply because it's all part of the chain of ATSC 3.0 being finalized this spring, going live in Korea in 2017 and in the USA next year. The FCC is already making announcements. Dolby Vision has such strong content support at this point, and it's growing by many multitudes each year; it would be inconceivable for Samsung not to add DV in 2018.

There are two HDR standards for ATSC 3.0: HLG and Dolby PQ. That means you must have a TV that can do both HLG and Dolby Vision if you want to enjoy 1080p and Ultra HD HDR content from ATSC 3.0 broadcasts, and you'll need an ATSC 3.0 tuner, either built into your TV or external. ATSC 3.0 is first going to benefit 4K HDR TV, but in 3-5 years it will benefit phones, laptops, and tablets, because ATSC 3.0 is meant to enable streaming live broadcast TV channels to mobile devices.
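Since HLG keeps coming up alongside PQ, here's a minimal sketch of the Hybrid Log-Gamma OETF as published in ITU-R BT.2100 (the constants are from the spec; the code itself is just an illustration, not broadcast software):

```python
import math

# HLG OETF constants from ITU-R BT.2100
A = 0.17883277
B = 0.28466892          # = 1 - 4*A
C = 0.55991073          # = 0.5 - A*ln(4*A)

def hlg_oetf(e: float) -> float:
    """Map scene-linear light e (0..1) to an HLG signal value (0..1).

    The lower part of the range is a square-root (gamma-like) segment;
    the upper part is logarithmic -- hence "hybrid log-gamma".
    """
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C
```

The square-root segment is what makes an HLG signal look roughly correct on a legacy SDR display, which is exactly the backward-compatibility argument broadcasters make for it.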

Samsung also claims they don't want to make OLED and that consumers don't need OLED, just QLED; they claimed that at CES 2017 last month... yet they are planning on getting back into making OLED panels now that inkjet OLED technology has matured. Samsung just wants people to think they don't need Dolby Vision and they don't need OLED right now, but in 2018 they'll change their mind and start to hype Dolby Vision forcefully, and they'll do the same with OLED TV by 2019-2020. Samsung has already purchased an OLED inkjet factory printer and is in the test phases, and Samsung has privately continued work on OLED R&D, yet you're not going to hear them announce any of that at CES 2017.

Last edited by philochs; 02-04-2017 at 04:42 AM. Reason: correcting info
Old 02-04-2017, 06:44 AM   #182
Richard Paul (Senior Member, Oct 2007)

Quote:
Originally Posted by philochs View Post
Again, I understand everything that you just wrote: PQ was invented by Dolby and is used in HDR-10. But I don't think you understand that I'm specifically talking about ATSC 3.0 and broadcast TV "Dolby Vision". Samsung didn't just support HDR-10's "PQ" at a panel; they actually threw their support behind broadcast Dolby PQ, which in reality is advertised and viewed as "Dolby Vision".
Samsung never did that, and I am guessing you got this information from the techradar article, which was written by someone who didn't understand the difference between PQ and Dolby Vision. PQ is not Dolby Vision. Also, just to be clear, the Dolby proposal for ATSC 3.0 was Dolby Vision.

http://www.tvtechnology.com/expertis...elivery/278733

Also if you want to verify what I am saying you can look at the ATSC 3.0 video candidate and see that it is HLG and PQ that was added to it.

http://atsc.org/standards/candidate-standards/

Quote:
Originally Posted by philochs View Post
When given the choice of the two HDR broadcast formats, Samsung effectively gave their nod to Dolby Vision as their preference for television broadcasting, and not Hybrid Log-Gamma.
Samsung doesn't support Dolby Vision and that is why they are working to add dynamic metadata to HDR10.

http://www.flatpanelshd.com/focus.ph...&id=1485251457

Quote:
Originally Posted by philochs View Post
Broadcasters and insiders call it "Dolby PQ", but the public perception will be "Live Dolby Vision" over an HDMI 2.0a-2.1 UHD device like a cable box.
Dolby PQ is just another way to say PQ. It was Dolby that proposed PQ to SMPTE, which then created SMPTE ST 2084.
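For anyone following along, PQ / SMPTE ST 2084 is nothing more than a transfer function: a curve that maps a 0-1 signal value to an absolute luminance up to 10,000 nits. A minimal sketch using the constants from the standard (illustrative code, not a reference implementation):

```python
# PQ (SMPTE ST 2084) constants, written as the ratios the standard defines
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Map a PQ signal value (0..1) to absolute luminance in nits (0..10000)."""
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)
```

Everything in this thread -- HDR10, "Dolby PQ", the base layer of Dolby Vision -- uses this same curve; the formats differ in the metadata and extras layered on top of it.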

Quote:
Originally Posted by philochs View Post
Samsung is going to have to pay, and they know this, and they are okay with this.
Technically it isn't really Samsung that would pay for Dolby Vision; if it happens, it would be consumers that pay for it.

Quote:
Originally Posted by philochs View Post
Dolby Vision has such strong content support at this point, and it's growing by many multitudes each year; it would be inconceivable for Samsung not to add DV in 2018.
Wait and see what happens in 2018 after dynamic metadata gets added to HDR10. That is when Dolby Vision will go through its trial by fire.
Old 02-04-2017, 09:42 AM   #183
philochs (Senior Member, Feb 2012)

Quote:
Originally Posted by Richard Paul View Post
Samsung never did that, and I am guessing you got this information from the techradar article, which was written by someone who didn't understand the difference between PQ and Dolby Vision. PQ is not Dolby Vision. Also, just to be clear, the Dolby proposal for ATSC 3.0 was Dolby Vision.

Also if you want to verify what I am saying you can look at the ATSC 3.0 video candidate and see that it is HLG and PQ that was added to it.

Samsung doesn't support Dolby Vision and that is why they are working to add dynamic metadata to HDR10.

Dolby PQ is just another way to say PQ. It was Dolby that proposed PQ to SMPTE, which then created SMPTE ST 2084.

Technically it isn't really Samsung that would pay for Dolby Vision; if it happens, it would be consumers that pay for it.

Wait and see what happens in 2018 after dynamic metadata gets added to HDR10. That is when Dolby Vision will go through its trial by fire.

So I've checked up on each of your claims.

1.a: Yes, I did read the techradar article, and it was incorrect when it stated that Samsung embraced Dolby's broadcast standard, "Dolby PQ"; they actually have their own separate broadcast HDR proposal. 1b: Actually, Dolby has created two distinct types of "Dolby Vision" for the home: one designed specifically for broadcast that is actually the most premium type of any broadcast HDR, and the other the most premium type of HDR for UHD Blu-ray and streaming services. Even with dynamic metadata added, HDR-10 will never be as premium as Dolby Vision's solutions.

2. Actually, ATSC 3.0 was looking at 4 different HDR technologies for broadcasting, and this April they are approving a spec that will include all 4 tested technologies, plus Technicolor's distribution method. They are leaving the door open to possibly adding other HDR technology in future specs. They approved Technicolor/Philips' HDR technology, which acts like a router for the other 4 HDR technologies, so they don't ruin the SDR signal and both the HDR and SDR signals can be sent live at the same time without any major issues. They also included both Hybrid Log-Gamma and the broadcast PQ HDR standard, which means they also approved 3 different types of PQ HDR (Samsung/Dolby/Ericsson). So you will need a TV that can do Dolby Vision, HLG, and HDR-10 if you want access to all broadcast HDR TV in full quality...

"Mark Richer, ATSC president, told HD Guru this week that “ATSC Technology Group 3 has approved a ballot to elevate A/341 Video-HEVC as a Proposed Standard. The document includes support for HLG, PQ transfer functions, and optional static metadata for PQ (HDR10 metadata). Additional HDR technologies are under consideration for inclusion in future revisions of A/341.”

The approved HDR broadcast technologies:

Technicolor/Philips (“Technicolor/Philips HDR Distribution solution”): a single-layer approach that converts the HDR signal to an SDR signal with dynamic metadata to provide both tone re-mapping and color-gamut correction. Supports PQ, HLG, or SDR input video signals. At the decoding side, it takes the decoded SDR and, using the tone-mapping and CRI color-correction metadata, can provide HDR output signals (PQ or HLG) as well as native SDR (without any further processing or metadata).

Dolby Labs: (PQ transfer function)
Qualcomm+Sharp+Samsung: (PQ transfer function)
Ericsson: (PQ transfer function)
NHK/BBC: (Hybrid Log-Gamma transfer function)

"Content supporting these formats will use dynamic range elements, which are captured by new digital cameras capable of recording 14 stops or more of light, and added to the final program in on-the-fly color grading. "

3.a: Mark my words, Samsung will add DV in 2018. If they don't announce it by CES 2018, their TVs will at least be able to get it with a firmware update by then, because Samsung will begin to sense an overwhelming demand for Dolby Vision, especially if Dolby strikes a deal with the NFL or something. 3.b: Lots of companies have worked on ST 2094, including Dolby, who actually invented the HDR technology. Samsung should continue to work on improving HDR and getting more content released; I fully support that.

4. Um, nope. "Dolby PQ" refers specifically to Dolby's HDR technologies for broadcast television. It goes beyond other methods of broadcast HDR in terms of quality, with metadata and special enhanced color grading, and it feeds TVs a signal that they accept as a Dolby Vision source, while Samsung's and Ericsson's PQ HDR methods both feed your TV an HDR-10 signal. DV is always going to have less color banding, finer color grading, and other such subtle enhancements.

5. Dude, Samsung is a company worth a few hundred billion dollars. With Chinese LCD manufacturing output now surpassing Japan's and Korea's total output of LCD TV panels, and with inkjet OLED on sale to customers by 2019 at the latest, the prices of these televisions are going to start to drop quite significantly. You sound like you don't plan to buy a new TV any time soon, and maybe you're happy with your old Samsung 4K HDR TV. You don't have Dolby Vision and you don't plan to get it, so you want to act like it's dumb and pointless and it will soon randomly start to fail and meet its demise... well, that's not happening, so you should get used to hearing a lot more about Dolby Vision for many happy years to come.

Dolby Vision content is mastered on a 4000 nit reference screen, and it's a scalable technology. In 5 years when consumer LCD tvs are 4000 nit, old Dolby Vision content will still look amazing. So the gap between DV and HDR-10 will actually grow, as far as perceivable quality goes. DV will always be the premium method of HDR, even after HDR-10 gets dynamic metadata.
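That scalability point can be sanity-checked with a little arithmetic: because PQ encodes absolute luminance, you can compute how much of the signal range a given peak brightness occupies. A sketch of the inverse PQ curve (constants from SMPTE ST 2084; illustrative code only):

```python
# Inverse PQ (SMPTE ST 2084): absolute luminance -> signal value
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map luminance in nits (0..10000) to a PQ signal value (0..1)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

# A 1,000-nit display only reaches about 75% of the PQ range, while a
# 4,000-nit master reaches about 90% -- the extra highlight information
# is already sitting in the code values, waiting for brighter displays.
print(pq_encode(1000.0))  # ≈ 0.75
print(pq_encode(4000.0))  # ≈ 0.90
```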

HDR-10 content is mastered on a 1000 nit OLED reference screen. Samsung QLED TVs are already hitting 2000 nits this year. Wait till you see Sony's follow-up to the ZD9 in 2018: 2500 peak nits, I'm guessing, with a backlight Master Drive+, none of that edge-lit "slim Master Drive" stuff Sony is releasing this year. I think Sony's follow-up to the ZD9 will end up blowing away Samsung and LG TVs next year; I'm looking forward to it.

Anyway, go ahead and wait till you can watch some DV content (broadcast and UHD Blu-ray sources) on a wide-color HDR TV that hits at least 1800 nits or more; I bet you'd start to appreciate the technology then.

Last edited by philochs; 02-04-2017 at 10:43 AM. Reason: correction
Old 02-04-2017, 02:49 PM   #184
ray0414 (Blu-ray Samurai, Oct 2015, Michigan, USA)

I see a few misaccurate statements above ^^^
Old 02-04-2017, 05:18 PM   #185
Penton-Man (Retired Hollywood Insider, Apr 2007)

Quote:
Originally Posted by ray0414 View Post
I see a few misaccurate statements above ^^^
ray,

Old 02-04-2017, 05:29 PM   #186
Penton-Man (Retired Hollywood Insider, Apr 2007)

Quote:
Originally Posted by philochs View Post
So I've checked up on each of your claims.
Lots of companies have worked on ST 2094...
Actually, still are (working on it).

As we speak, one piece is in audit. Given your profound expertise (and energy), care to help with checking up on any of the essence, typos, semantics, e.g. from any of these pages…..or others in the same doc?




Old 02-04-2017, 09:41 PM   #187
philochs (Senior Member, Feb 2012)

Quote:
Originally Posted by ray0414 View Post
I see a few misaccurate statements above ^^^
Misaccurate is not even a word; do you mean inaccurate? Funnily enough, you couldn't actually point out what was supposedly wrong, and surprise, you're a Samsung fanboy. I'm shocked.
Old 02-05-2017, 12:41 AM   #188
Penton-Man (Retired Hollywood Insider, Apr 2007)

Quote:
Originally Posted by philochs View Post
Dolby Vision content is mastered on a 4000 nit reference screen
No, not all the time. That is not a prerequisite, i.e. you don’t need a 4,000 nit monitor, nor a 4,000 nit setting to produce Dolby Vision mastered content.

You can do it with an ~1,000 nit Sony BVM X-300 (or even a less bright Canon DP-V2410)... a Dolby CMU (you can get here... http://www.nmh.com/ ), a Dolby Vision supported color corrector (Baselight, Lustre, Resolve, etc.), and of course, paying for a Dolby Vision mastering and playback license to allow the latter to engage. In general, depending on the color corrector and version, you're given several SMPTE ST 2084 peak luminance setting options, e.g. 300 nits, 500 nits, 800 nits, 1000 nits, 2000 nits and 4000 nits. When using the Sony, you select the 1000 nit option and then continue merrily along Dolby Vision grading.

Basically, the metadata is saved into the mastered media and it’s used to more “intelligently” (that’s according to Dolby, personally I have a problem with personification) scale the HDR highlights to fit within any given HDR display’s peak highlights, also to handle how to down convert the image for SDR displays, and also, something not mentioned much, to determine how to respond when a TV’s automatic brightness circuit switches in.
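To make that concrete, the scaling described above is a tone-mapping step: squeezing the master's highlight range into the display's peak. Here's a deliberately naive sketch (a pass-through region plus linear highlight compression; Dolby's real mapping is proprietary and driven by the per-scene metadata, so treat this purely as an illustration of the idea):

```python
def tone_map(nits: float, content_peak: float = 4000.0,
             display_peak: float = 1000.0, knee: float = 0.75) -> float:
    """Map mastered luminance (nits) to what a dimmer display shows.

    Below knee * display_peak the image passes through untouched; above
    that point, highlights are compressed linearly so that content_peak
    lands exactly at display_peak. Per-scene metadata is what lets a real
    decoder pick these parameters adaptively -- this is NOT Dolby's
    actual algorithm, just the shape of the problem.
    """
    k = knee * display_peak
    if nits <= k:
        return nits
    t = (nits - k) / (content_peak - k)
    return k + (display_peak - k) * t

print(tone_map(500.0))    # below the knee: passes through unchanged, 500.0
print(tone_map(4000.0))   # the master's peak lands at the display's: 1000.0
```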
Old 02-05-2017, 01:44 AM   #189
Richard Paul (Senior Member, Oct 2007)

Quote:
Originally Posted by philochs View Post
So I've checked up on each of your claims.

1.a: Yes, I did read the techradar article, and it was incorrect when it stated that Samsung embraced Dolby's broadcast standard, "Dolby PQ"; they actually have their own separate broadcast HDR proposal.
The techradar article contains several mistakes and was written by someone who doesn't understand the difference between PQ and Dolby Vision. Just to say this again, but PQ, Dolby PQ, and SMPTE ST 2084 are the exact same thing. PQ is an HDR replacement for the gamma curve that was proposed to SMPTE by Dolby back in 2014:

https://www.smpte.org/sites/default/...-2-handout.pdf

Quote:
Originally Posted by philochs View Post
1b: Actually, Dolby has created two distinct types of "Dolby Vision" for the home: one designed specifically for broadcast that is actually the most premium type of any broadcast HDR, and the other the most premium type of HDR for UHD Blu-ray and streaming services.
Dolby's proposal for ATSC 3.0 is Dolby Vision. They tried really hard to get the broadcasters to use it but so far the broadcasters have rejected it.

Quote:
Originally Posted by philochs View Post
2. Actually, ATSC 3.0 was looking at 4 different HDR technologies for broadcasting, and this April they are approving a spec that will include all 4 tested technologies, plus Technicolor's distribution method.
No, what they are doing is having future ballots for ATSC 3.0 on all the other HDR formats. Obviously Dolby wants to get Dolby Vision added to ATSC 3.0 and it might happen. It also might not happen.

Quote:
Originally Posted by philochs View Post
3.a: Mark my words, Samsung will add DV in 2018. If they don't announce it by CES 2018, their TVs will at least be able to get it with a firmware update by then, because Samsung will begin to sense an overwhelming demand for Dolby Vision, especially if Dolby strikes a deal with the NFL or something.
We will see.

Quote:
Originally Posted by philochs View Post
You don't have Dolby Vision and you don't plan to get it, so you want to act like it's dumb and pointless and it will soon randomly start to fail and meet its demise... well, that's not happening, so you should get used to hearing a lot more about Dolby Vision for many happy years to come.
I don't think that it is dumb I am just skeptical of how well Dolby Vision will do in the long term.
Old 02-05-2017, 02:47 AM   #190
philochs (Senior Member, Feb 2012)

Quote:
Originally Posted by Richard Paul View Post
The techradar article contains several mistakes and was written by someone who doesn't understand the difference between PQ and Dolby Vision. Just to say this again, but PQ, Dolby PQ, and SMPTE ST 2084 are the exact same thing. PQ is an HDR replacement for the gamma curve that was proposed to SMPTE by Dolby back in 2014

Dolby's proposal for ATSC 3.0 is Dolby Vision. They tried really hard to get the broadcasters to use it but so far the broadcasters have rejected it.

No, what they are doing is having future ballots for ATSC 3.0 on all the other HDR formats. Obviously Dolby wants to get Dolby Vision added to ATSC 3.0 and it might happen. It also might not happen.
Why is it you don't understand that ATSC 3.0 standards do actually include multiple methods of PQ HDR from separate companies including Ericsson, Samsung, and Dolby? You understand that Dolby Vision is a type of PQ HDR, yet you want to claim it isn't going to be added to ATSC 3.0 right away? It's already being added in the first go around. If you don't understand that by now, I'm done explaining it to you, just wait a year and you'll understand it by then, nature will take its course.

I don't understand people who act disdainful about Dolby Vision who've never experienced it on a great HDR screen. Mass Effect: Andromeda, the PC game that launches next month, is the first game to use Dolby Vision, and it may end up the best-looking game of the year; HDR-10 support won't hurt it, but the fact that it has Dolby Vision too is actually really exciting. Many more PC games will be utilizing it soon, and since Sony uses it in TVs now, there's a decent chance it will be used in PS5 games. Warner, Lionsgate, and Universal are dedicated to Dolby Vision UHD Blu-ray, with the first discs expected in a couple of months. Sony and several Chinese manufacturers have added Dolby Vision to their sets just in 2017. It's not going anywhere.
Old 02-05-2017, 04:40 AM   #191
ray0414 (Blu-ray Samurai, Oct 2015, Michigan, USA)

Quote:
Originally Posted by philochs View Post
Misaccurate is not even a word; do you mean inaccurate? Funnily enough, you couldn't actually point out what was supposedly wrong, and surprise, you're a Samsung fanboy. I'm shocked.
not a fanboy, but thanks.

and yea, some spelling errors happen when your posting while youre driving for your job, it happens i didnt post the errors because 1. i was busy 2. Its not worth it, but i wanted to ruffle your feathers more so as entertainment.

Last edited by ray0414; 02-05-2017 at 06:38 AM.
Old 02-05-2017, 04:46 AM   #192
Richard Paul (Senior Member, Oct 2007)

Quote:
Originally Posted by philochs View Post
Why is it you don't understand that ATSC 3.0 standards do actually include multiple methods of PQ HDR from separate companies including Ericsson, Samsung, and Dolby? You understand that Dolby Vision is a type of PQ HDR, yet you want to claim it isn't going to be added to ATSC 3.0 right away? It's already being added in the first go around.
The only two HDR formats that got added to ATSC 3.0 were PQ and HLG. The other HDR formats included proprietary features (dynamic metadata, ICtCp, adaptive shapers, etc.) that would require royalty payments. All of the other HDR formats will be voted on in the future; maybe some of them will be added, or maybe none of them will be.

Quote:
Originally Posted by philochs View Post
If you don't understand that by now, I'm done explaining it to you, just wait a year and you'll understand it by then, nature will take its course.
I am just telling you what is currently in the ATSC 3.0 video candidate standard.

http://atsc.org/standards/candidate-standards/
Old 02-07-2017, 07:13 AM   #193
Penton-Man (Retired Hollywood Insider, Apr 2007)

Quote:
Originally Posted by ray0414 View Post
some spelling errors happen when your posting while youre driving
Ray, please don’t post while driving.
Old 03-28-2017, 06:29 PM   #194
singhcr (Blu-ray Samurai, Sep 2008, Apple Valley, MN)

Is the dynamic metadata feature of HDMI 2.1 meant for improvements to HDR10, etc.? From what I have read, the dynamic metadata feature in Dolby Vision can be done with HDMI 1.4b or newer. The reason I ask is that I want to buy a new AVR (Marantz SR6011) and, while things will never be future-proof, I want to at least be able to pass DV-encoded UHD BDs through it.

http://yoeri.geutskens.com/faqs/dolby-vision-faq.html

Quote:
What version of HDMI does Dolby Vision require?
Also, if Dolby Vision uses dynamic metadata, how can an Ultra HD Blu-ray player pass the proper signal on to an HDR TV using HDMI 2.0a (which supports only static metadata)?

During the color grading process in post-production, the Dolby Vision workflow enables the colorist to perform a scene-by-scene analysis of the particular look that they are going for. Once determined, a set of metadata is generated for this scene, then married to the picture and transmitted on a frame-by-frame basis. This is why the type of metadata discussed here is referred to as dynamic metadata. This concept is specific to Dolby Vision and adds a layer of performance and fidelity on top of other concepts that only use static metadata.

In addition to the scene-based dynamic metadata mentioned above, there are also some static metadata parameters, such as specifications of the mastering display used during the color grading process. This data does not change for the duration of the program and is hence called static metadata. This data can be carried via the existing HDMI standard.

Dolby Vision does not require HDMI 2.0a or 2.1. It embeds the metadata into the video signal. Knowing that previous versions of HDMI would not pass the Dolby Vision dynamic metadata, Dolby developed a way to carry this dynamic metadata across HDMI interfaces as far back as v1.4b. The HDMI specification is now catching up, with v2.0a supporting static metadata and future versions expected to support dynamic metadata as well. Dolby's intent is not to compete with HDMI but merely to enable deployment of a full Dolby Vision HDR ecosystem without having to wait for HDMI standardization to catch up. Dolby was and is directly involved in standardization of the current and future versions of HDMI.

In practice, most Dolby Vision content is in Ultra HD and requires HDCP 2.2 copy protection, which is only available on HDMI 2.0 and up.
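For reference, the static metadata the FAQ contrasts with Dolby's dynamic metadata is a small, fixed record: the SMPTE ST 2086 mastering-display description plus the MaxCLL/MaxFALL content light levels that HDR10 uses. A rough sketch of what it contains (field names paraphrased, values invented purely for illustration):

```python
from dataclasses import dataclass

@dataclass
class StaticHdrMetadata:
    """Roughly the HDR10 / SMPTE ST 2086 static metadata set.

    One instance describes the entire program -- contrast this with
    Dolby Vision's dynamic metadata, which changes per scene or frame.
    """
    primaries: dict          # mastering display color primaries (x, y)
    white_point: tuple       # mastering display white point (x, y)
    max_luminance: float     # mastering display peak, nits
    min_luminance: float     # mastering display black level, nits
    max_cll: int             # maximum content light level, nits
    max_fall: int            # maximum frame-average light level, nits

# Made-up example: a P3-D65 mastering display graded to 1,000 nits
meta = StaticHdrMetadata(
    primaries={"R": (0.680, 0.320), "G": (0.265, 0.690), "B": (0.150, 0.060)},
    white_point=(0.3127, 0.3290),
    max_luminance=1000.0,
    min_luminance=0.0001,
    max_cll=900,
    max_fall=300,
)
```

Because this handful of numbers is constant for the whole title, it fits comfortably in the HDMI 2.0a static-metadata InfoFrame; per-scene data is what needed the workaround (and, later, HDMI 2.1).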
Old 03-28-2017, 07:11 PM   #195
puddy77 (Blu-ray Guru, Jan 2008)

Quote:
Originally Posted by singhcr View Post
Is the dynamic metadata feature of HDMI 2.1 meant for improvements to HDR10, etc.? From what I have read, the dynamic metadata feature in Dolby Vision can be done with HDMI 1.4b or newer. The reason I ask is that I want to buy a new AVR (Marantz SR6011) and, while things will never be future-proof, I want to at least be able to pass DV-encoded UHD BDs through it.

http://yoeri.geutskens.com/faqs/dolby-vision-faq.html
You should be fine with the Marantz SR6011. But only after an update sometime later this year. See here: http://marantz-uk.custhelp.com/app/a...-compatibility

It's an odd situation. Dolby Vision seems to be designed with compatibility back to HDMI 1.4b. But when the Chromecast Ultra came out (the only external device available right now that supports DV), people noticed that it only passed DV when directly connected to their displays. Whenever it was hooked up to a receiver, it would not pass it. Sound and Vision had a Q&A article addressing this: http://www.soundandvision.com/conten...ion-compatible So it doesn't seem as simple as it should. But again, Denon/Marantz said they'll update the SR6011 among others.

No one knows when HDMI 2.1 will make its way to devices. But it is only needed for other, more rare forms of HDR: Technicolor, Philips, and Samsung's Dynamic HDR10.
Old 03-28-2017, 08:32 PM   #196
singhcr (Blu-ray Samurai, Sep 2008, Apple Valley, MN)

Thanks for the information! Looks like I can finally put my 2000s-era 5.1 receiver to rest.
Old 03-28-2017, 09:12 PM   #197
Geoff D (Blu-ray Emperor, Feb 2009, Swanage, Engerland)

Quote:
Originally Posted by puddy77 View Post
You should be fine with the Marantz SR6011. But only after an update sometime later this year. See here: http://marantz-uk.custhelp.com/app/a...-compatibility

It's an odd situation. Dolby Vision seems to be designed with compatibility back to HDMI 1.4b. But when the Chromecast Ultra came out (the only external device available right now that supports DV), people noticed that it only passed DV when directly connected to their displays. Whenever it was hooked up to a receiver, it would not pass it. Sound and Vision had a Q&A article addressing this: http://www.soundandvision.com/conten...ion-compatible So it doesn't seem as simple as it should. But again, Denon/Marantz said they'll update the SR6011 among others.

No one knows when HDMI 2.1 will make its way to devices. But it is only needed for other, more rare forms of HDR: Technicolor, Philips, and Samsung's Dynamic HDR10.
Yeah, the simplest way of explaining it is that although the signal can be physically interpreted by the HDMI hardware going back to 1.4, you still need some form of DV chippery in line to be able to recognise it for what it is, if that makes sense.
Old 05-07-2017, 11:03 PM   #198
PeterTHX (Banned, Sep 2006)

Quote:
Originally Posted by Geoff D View Post
Yeah, the simplest way of explaining it is that although the signal can be physically interpreted by the HDMI hardware going back to 1.4, you still need some form of DV chippery in line to be able to recognise it for what it is, if that makes sense.
I don't think that's it. I think it's because receivers typically add some kind of processing; they still add an OSD even when their video processors are set to "off". Firmware updates should be able to take care of it without needing DV processing.
Old 05-09-2017, 12:23 PM   #199
Funky54 (Active Member, Nov 2010, Florida)

I am so lost it's silly. I really like the look of HDR content... but there is so little. Now it seems there is all this other stuff that will likely change. I get (I guess) that physical media is at death's door and my attention should now be on streaming. But what about the infrastructure for streaming 8K or all these formats? It's overwhelming. As a consumer, it makes me just say no to everything. If I can't buy, with confidence, a new format with matching TV, receiver, and player and expect to enjoy the best quality for at least a few years, why buy anything? I just updated to Atmos (only to realize after the fact that there is hardly any content) and that was expensive. I sold my Oppo and am just using an old Sony BDP and an LG 1080p plasma for now. I've been looking at Sony XBR 850D TVs... now I'm not buying anything. I won't buy anything more until there is some stability.
Old 05-09-2017, 02:37 PM   #200
rroeder (Special Member, Mar 2011)

I hear ya. I wanted a new display this year, but the lack of good content has me second-guessing. I do think the newer TVs and players are fine now, though, and do HDR properly, it seems. I would definitely look for Dolby Vision support at this point, and I imagine most newer TVs will have support for HDMI 2.1 HDR.

The audio is a nice upgrade, and your receiver will work with these formats, but you might have to use a second HDMI hookup for audio; that's what I do with my setup.

To me the gear seems A LOT more stable than a year or two ago, but there's still not enough content; that's the problem at this point. We need more quality catalog releases.