#1061 | Blu-ray Guru
Of course there would be losses; my point was just that the mapping of data to higher luminance output values on the display would be roughly the same, which is what HDR is to me, but I guess we're talking different definitions here. You could have a limited-range, gamma-encoded image in 10 bit, but I wouldn't call that HDR just because it's 10 bit.
#1062 | Senior Member
Quote:

Thanks given by: Geoff D (07-20-2018)
#1063 | Blu-ray Knight
Quote:
EDIT: Regarding the 10 bits ... The way I understand the higher-nits SDR conversions, this is what they do: if you use 300 nits, the conversion basically maps luminance levels up to 300 nits (instead of 100) onto the SDR maximum of 100 nits. You then (from my understanding) boost this SDR image to 300 nits on the display, so that the 300-nit luminance is indeed displayed roughly correctly (everything else being a best effort as well, with 0 being correct at any rate - I always cringe when people talk about better blacks with HDR). But since you're mapping luminance levels of 0-300 nits into a "container" that is supposed to display only a range of 0-100 nits, lower values will have to be mapped to wrong (for example too low) luminance levels. Something that is only "almost black" in the HDR signal may (and will, imo) therefore end up as just black, i.e. 0. I do agree that 10-bit SDR will only give you more precision in the 0-100 nit range, not higher luminance levels.

Last edited by andreasy969; 07-20-2018 at 03:17 PM.
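A minimal numeric sketch of that mapping, assuming a straight gamma 2.2 encode with a hard clip at the chosen peak (madVR's actual conversion uses a roll-off curve, and the helper names here are just for illustration):

```python
# Hypothetical sketch of the conversion described above: straight gamma 2.2,
# hard clip at the chosen peak. madVR's real conversion uses a roll-off curve,
# so treat these numbers as illustrative only.
GAMMA = 2.2

def encode_sdr(nits, peak=300.0):
    """Map HDR luminance (nits) to a 0..1 SDR code, treating `peak` as SDR white."""
    return min(nits / peak, 1.0) ** (1.0 / GAMMA)

def displayed_nits(code, display_peak):
    """What a gamma-2.2 display with the given peak shows for that code."""
    return display_peak * code ** GAMMA

for hdr_nits in (0.5, 5, 50, 100, 300):
    code = encode_sdr(hdr_nits, peak=300.0)
    print(f"{hdr_nits:6.1f} nit HDR -> code {code:.3f} -> "
          f"{displayed_nits(code, 100):6.2f} nit at 100-nit calibration, "
          f"{displayed_nits(code, 300):6.2f} nit with the display pushed to 300")
```

Under these assumptions a 5-nit HDR pixel displays at only about 1.7 nits on a monitor left at its 100-nit calibration, but comes back to roughly 5 nits once the display is pushed to 300, which is the "boost" step described above.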
#1065 | Blu-ray Guru
Quote:
In such a screenshot scenario the entire chain would be something like:

linear light in (light as it exists in the real world, what goes into the camera)
-> camera/post stuff, eventually leading to the light being encoded in st2084 on disc
-> st2084 decoded by the player to linear, clipped and mapped so 300 nits becomes the max data value
-> linear light encoded using gamma, saved as the screenshot
-> screenshot decoded using gamma by the display
-> linear light out (with the max data value displaying as 300 nits)

It's worth remembering that the 8/10/whatever-bit files/streams are just sets of data values that could mean anything, and as long as the sender and receiver are in agreement you could map any range you wanted, using any transfer function, in any bit-depth container (limited only by precision, with some combinations making more sense than others, which is what st2084 is all about).
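Here's a sketch of that chain using the actual ST.2084 constants from the standard, but with a toy "clip at 300 nits" step standing in for a real tone-mapping curve and a plain gamma 2.2 screenshot encode, so the numbers are only illustrative:

```python
# Sketch of the chain above: real ST.2084 constants, but a toy "clip at 300 nits"
# in place of an actual tone-mapping curve, and a plain gamma 2.2 screenshot encode.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code):
    """ST.2084 EOTF: normalized 0..1 code value -> linear luminance in nits."""
    p = max(code, 0.0) ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def chain(pq_code, screenshot_peak=300.0, gamma=2.2):
    nits = pq_to_nits(pq_code)                            # player decodes st2084 to linear
    clipped = min(nits, screenshot_peak)                  # clip/map so 300 nits is the max value
    sdr_code = (clipped / screenshot_peak) ** (1 / gamma) # gamma-encode the screenshot
    shown = screenshot_peak * sdr_code ** gamma           # display decodes gamma at 300 nits
    return nits, shown

for code in (0.25, 0.42, 0.51, 0.58, 0.75):
    nits, shown = chain(code)
    print(f"PQ code {code:.2f}: {nits:8.1f} nits on disc -> shown at {shown:5.1f} nits")
```

Everything below 300 nits round-trips to the same luminance it had on disc, and everything above lands at the 300-nit ceiling, which is the sender/receiver agreement the post describes.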
#1067 | Blu-ray Knight
Quote:
But I can tell that it does look very good on the highest-nit display I own at any rate, which happens to be my phone (Note 4), whose AMOLED display goes up to 750 nits in its "super bright" mode - the corresponding SDR caps really do look very good on it (even the HDR caps look decent), but unfortunately I don't watch my movies on my phone. But I still also see this problem.
#1068 | Blu-ray Guru
I've also tried it with my OLED, playing the HDR stream converted to SDR at 300 nits in madVR while boosting the SDR display settings to max, and then compared it to the HDR. It's roughly close and you definitely get a proper HDR effect. Of course, the higher the nit setting, the less efficient gamma becomes for encoding the image (highlights stealing too many bits), and you will probably want an OLED or a display with good local dimming, because on a regular old monitor pushing 300 nits the black level will be so poor it's likely to ruin the HDR contrast from the other end.
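To put rough numbers on "highlights stealing too many bits", here is a count of how a plain 8-bit gamma 2.2 encode splits its codes around 100 nits as the conversion peak rises; this is a simplification (madVR applies a highlight roll-off rather than a straight gamma), so it only shows the trend:

```python
# Rough count of how a straight 8-bit gamma 2.2 encode splits its codes around
# 100 nits as the conversion peak rises (madVR actually applies a highlight
# roll-off, so this is only meant to show the trend).
GAMMA = 2.2

for peak in (100, 200, 300, 600, 1000):
    # highest 8-bit code that still decodes to <= 100 nits at this peak
    below_100 = round(255 * (min(100, peak) / peak) ** (1 / GAMMA))
    print(f"peak {peak:4d} nits: {below_100:3d}/255 codes for 0-100 nits, "
          f"{255 - below_100:3d} for the highlights above")
```

At a 300-nit peak roughly 100 of the 255 codes end up spent on everything above 100 nits, and at 1000 nits it's about 165, which is why the encode gets less efficient the higher the conversion peak is pushed.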
#1069 | Blu-ray Emperor
I was basically doing this two years ago when watching converted SDR UHDs on my non-HDR Sony 4K, which could hit about 450 nits peak brightness with everything cranked up; this was using my Panny UB900 to output SDR 2020. Black levels weren't a problem (dat dimming) but I did see a shit-ton of artefacts and banding on certain titles because, as you rightly say, it simply runs out of bit depth in converted gamma, and so the artefacts buried in those highlights were brutally exposed, along with heavy banding in stuff like the Peanuts Movie.

If the SDR conversion were first overdriving the signal to at least 11 bits then gamma artefacts would be less of a problem: you'd be giving the highlights a roughly similar amount of bit depth in gamma as they had in the 10-bit HDR signal, so when it gets mapped across it's a better fit (this is also why the processing is so shit on certain HDR TVs - they still operate in gamma, and when the HDR is converted across it's not given enough headroom in the bit depth to map properly).
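A back-of-the-envelope version of that headroom argument, comparing how many code values land below 100 nits and between 100 and 1000 nits for full-range 10-bit PQ versus a straight gamma 2.2 encode with a 1000-nit peak at a few bit depths (illustrative assumptions, not any particular TV's actual processing):

```python
# Back-of-the-envelope headroom check: codes below 100 nits vs codes for
# 100-1000 nits, comparing full-range 10-bit PQ against a straight gamma 2.2
# encode with a 1000-nit peak at a few bit depths. Illustrative assumptions only.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq(nits):
    """ST.2084 inverse EOTF: nits -> normalized 0..1 code value."""
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

pq100, pq1000 = round(1023 * pq(100)), round(1023 * pq(1000))
print(f"10-bit PQ:               {pq100} codes for 0-100 nits, {pq1000 - pq100} for 100-1000 nits")

for bits in (8, 10, 11):
    top = 2 ** bits - 1
    g100 = round(top * (100 / 1000) ** (1 / 2.2))
    print(f"{bits:2d}-bit gamma, 1000-nit peak: {g100} codes for 0-100 nits, {top - g100} for 100-1000 nits")
```

With these simplified numbers, 10-bit gamma at a 1000-nit peak still shortchanges the 0-100 nit range compared with 10-bit PQ, and it's only at 11 bits that the gamma encode matches or beats PQ in both bands, which is roughly the "at least 11 bits" point above.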
#1070 | Blu-ray Knight
I tend to revisit my own 1000-nit caps on my phone and yes, those conversions look great and indeed like HDR with the corresponding brightness. But I always thought they would still "crush" things. It's not that I perceive it that way, it's just what I think "should" happen. When doing the screenshots, the main purpose is to show the increased highlight detail though. That it actually also looks great on a suitable display is something I noticed only later. (And my projector has been annoying me even more since then.)
#1071 | Site Manager
Quote:
So I wouldn't call it pushing. (btw, 500 is +0.66 f/stops relative to 320, and 100 is -1.66 f/stops). I'm waiting for 1000-nit ones*.

Quote:
No, the colors are correct here. The colors within the gamut of your monitor are displayed correctly. If you have a P3 computer monitor (like the previously mentioned iMac - even iPhones and tablets are getting on the P3 bandwagon), the colors within that triangle will be correct, so you will see all the P3 colors. If you have an Adobe 1998 computer monitor (a wide-gamut monitor with a less saturated red than P3, but a more saturated green, between P3 and 2020 - the green the SONY OLED Pro has), the colors within the A1998 triangle will be correct, and if the disc has any A1998 greens you'll see those. If you have a rec709/sRGB monitor, the colors within that triangle will be correct, and anything outside it will sit at the triangle boundaries.

As for brightness: if you display them at 320 nits (and probably within an f/stop of that), the brightness and contrast of the image you see will be correct (unless the gamma of your monitor is out of whack; indeed, if you had two same-model monitors at the same peak brightness for 100% RGB, one calibrated correctly for 2.2 and another correctly for 2.4, these 320s should look the same on both).

Quote:
Now the question actually is whether changing that level of quantization (10-bit video-levels HDR to 8-bit PC-levels HDR) might make the image and tones more visibly crude compared to the original HDR quantization. That is one thing, but it doesn't change an HDR image into SDR per se. On a properly done screenshot the image still has the rec2020 colors and the HDR tone encoding, so it is HDR - just 8-bit HDR instead of 10-bit or 12-bit HDR. Bits don't make HDR, as there is 10-bit SDR, 12-bit SDR, even 16-bit SDR. In Photoshop you can change your settings to 16 bits; a 16-bit setting doesn't make an image HDR. What makes it HDR is the tone curve and the display of higher range. Highlight display (on the 320s) is +1.67 f/stops more range.

The only issue with going from 10-bit HDR to 8-bit HDR screenshots is that the tone discrimination (levels) is reduced. As I said, this could make the images visibly cruder, but that of itself doesn't affect the range from highlights to shadows directly. Let's address it anyway (there's a small sketch of these level counts at the end of this post). On HDR10, for 50-100 nits (video levels 449-509), one f/stop, we actually have 60 video levels, so gradations of 1/60 of an f/stop. On 8-bit PC-levels HDR, from 50-100 nits (about PC levels 112-129), there are about 18 levels, so gradations of 1/18 of an f/stop. 1/18 of an f/stop is the difference between 100 nits and 104 nits, if you want to check that. Your camera normally changes exposure in thirds of an f/stop (100 nits <-> 126 nits). On a photographic print, diffuse whites were usually printed about 1/6 of an f/stop (0.16 f/stop) darker than specular highlights so they could be told apart - that was considered the minimum variation needed to distinguish specular highlights from non-specular whites (1/6 is the difference between 100 nits and 113 nits; so you see, prints have extremely low SDR highlights). 1/18 is three times finer gradation than 1/6. I'm not saying it won't be noticeable, but presently that's the limit we have to abide by for screenshots and jpgs. (I'm waiting for 10-bit jpgs*.) Some posterization and banding might show up, but that by itself doesn't change HDR to SDR; what it would change is smoothness of gradation.

Now, what about crushed blacks? That is always a concern. I pixel-peeped. This is a screenshot of the lowest blacks, brightened to see down to what level a step was distinguishable from the bottom end of a familiar pattern. screenshothot2018-07-21at3.52.20amtoshowshadowdetailofhdrjpg.jpg As you can see, on the 8-bit HDR shot 0.005 nits is distinct from black - not 0.001 though, but science has its limits. Be aware this is a brightened screenshot of a screenshot of the browser's actual display of the 4K screenshot. So it's not the jpg being brightened and screenshotted; it's what you see on the screen, screenshotted by the PC, then that digital screenshot of the computer display was brightened, and then that was screenshotted again by the PC. (This is done to check whether the actual values on the 320 screenshot, as seen in the web browser, are getting through - i.e. the digital values/brightness your computer is sending to your screen, and what you would see if you had Superman's telescopic brightener vision (Kryptonian Carrot Vitamin A) and the display/system were capable of displaying it.) If you have a 320-nit display, dividing 320/0.005 = 64,000:1 of range above black shadow detail. I think that's ok.
Now, if you have strict gamma 2.2 but the monitor has a 1000:1 CR, probably everything below ~0.5 nit is black (if you have 4000:1, everything below ~0.1, etc.). Most people have a curve elevating the blacks (like sRGB "2.2", which is not really gamma 2.2), or they actually increase the brightness (black level) somewhere to get around this low-level limit and see shadow tones - but in the strict sense of the word that would not really show the exact contrast the image has.

Quote:
(I asked Geoff what he was seeing in the hope it was something I see too. The ones here are not perfect by any means, but no one has yet pointed out two things on them that I've caught. The poor engineer is always having to retest things to check why.) In any case, there's always been the disclaimer below [Show spoiler], which is true (hint: I've found 4Ks are much, much sharper than screenshots when they are moving, because of eye integration, aggregation and circles of confusion, motion blur and all that jazz), so apart from that, I think these are very close within their limitations.

Quote:
I'm not talking about others I've seen, whose aim seems to be to make them more or less 100-nit/709 SDR-ed, like you say. These have 4K, HDR contrast, +1.67 f/stops of increased highlights (for now*) and rec2020, in PC-levels 8 bit (for now*).
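The promised sketch of the level counts quoted above, reading the caps as keeping the PQ (ST.2084) encoding in 8-bit full range; the published screenshots also roll off highlights above 320 nits, which this ignores, so expect the results to land within a level or so of the 449-509 and 112-129 figures:

```python
# Quick sketch of the level counts quoted above, reading the caps as keeping the
# PQ (ST.2084) encoding in 8-bit full range. The published screenshots also roll
# off highlights above 320 nits, which this ignores, so expect small differences.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq(nits):
    """ST.2084 inverse EOTF: nits -> normalized 0..1 code value."""
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def video_level_10bit(nits):   # HDR10 narrow/video range: code 0.0 -> 64, 1.0 -> 940
    return 64 + 876 * pq(nits)

def pc_level_8bit(nits):       # 8-bit full/PC range, same PQ curve
    return 255 * pq(nits)

lo10, hi10 = video_level_10bit(50), video_level_10bit(100)
lo8, hi8 = pc_level_8bit(50), pc_level_8bit(100)
print(f"10-bit video levels: 50 nits ~ {lo10:.0f}, 100 nits ~ {hi10:.0f} "
      f"({hi10 - lo10:.0f} steps over that f/stop)")
print(f"8-bit PC levels:     50 nits ~ {lo8:.0f}, 100 nits ~ {hi8:.0f} "
      f"({hi8 - lo8:.0f} steps over the same f/stop)")
print(f"320-nit peak / 0.005-nit shadow detail = {320 / 0.005:,.0f}:1")
```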
Thanks given by: andreasy969 (07-21-2018)
#1072 | Blu-ray Knight
@Deciazulado

Thanks for your detailed response. We are basically saying the same thing though. I have my PC monitor calibrated (not professionally, mind you) for 100-nit content (i.e. for BDs), which results in a brightness setting of only 32/100, so I for one have to "push" its brightness even for my 200-nit caps - that's why I used the term "push". (With its brightness "pushed" to 100 percent, my LED monitor should reach around 350 nits according to tests, but I barely use that, so it's kind of wasted potential.)

EDIT: One more thing ... Another problem with calling, for example, 320-nit SDR caps "accurate" is that many people will be ignorant of the fact that they will (most likely) have to "push" the brightness of their monitor (I know that and am still nitpicky regarding "accurate"; others don't even know the former and will just see a "dark HDR image"). The fact that I was/still am getting the "Why is the image so dark?" question even with my 200-nit caps only confirms this.

PS: Zero Dark Thirty coming today.

Last edited by andreasy969; 07-21-2018 at 02:07 PM. Reason: see above and typo
#1073 | Site Manager
Quote:
4K x PQ x high specular-highlight nits x rec2020 x 10 bits = 4K UHD HDR. We aim to reach as much of that as technically feasible.

Quote:
As you haven't said that the site's review 4K images look grey, flat and muddy (as in most colors being unsaturated tans or brownish), I take it they're getting through. When BDs came out, monitor and web browser technology were up to date and capable, but since UHD, with all the improved components making it up, it has been more a matter of spurts in growth. As I said, we're trying to reach as much of that as technically feasible, and when the majority of computer monitors get brighter and jpgs or bandwidth make 10 bit possible, that will be updated.

I just checked the specs of the last few iPhones as an example:
4 - 390 nits, 709
5 - 560 nits, 709
6 - 550 nits, 709
7 - 600 nits, P3
8 - 530 nits, P3
X - 570 nits, P3
(with system-wide colour management incorporated into iOS since version 9.3)

High-end UHDTVs are P3 and about 700-1400 nits (which is 1000 nits +/- 0.5 f/stop), but desktop computer monitors are not there yet. 500 is too small a change to go from 320 to 500 (+0.66 f/stops brighter). So when the overall average gets to ~1000, which is the same jump from 320 as 320 is from 100, there will be another jump in the HDR 8-bit pics. Anyway, the function of shots in reviews and such is to give you a good idea of what you're getting instead of flying blind.
#1074 | Site Manager
Quote:
Most people that complain must have their monitors optimized for 100 nits (or they compare the 4K screenshot directly to the 100-nit BD one, and the BD one will look to them twice as bright as a "200" one, or 3 times as bright as a "300" one, etc. - 32 times as bright compared to a 10000 one I posted in the MI5 thread!). This even happens to people with UHDTVs: they put the BDs, with 100-nit 100% whites, at the maximum nits of the TV, let's say 1000 nits, then switch to the UHD, which will have the UHD image darker, with the 100-nit 51% whites at 100 nits, and go: "UHD is a scam, it looks 10 times darker than the BD!"

(Which is of course a lack of consumer education or orientation about all these extra highlights and rendering intent, so they don't know that on their 1000-nit room warmer they had to adjust the contrast slider down for the BD input till it's about 100 nits. It's like when we got BDs on our 1080 HDTVs and people had the sharpness maxed out because they were watching 480 DVDs and NTSC broadcasts: "But, but, the Blu-ray looks horrible, look at all the grain and noise! The DVD looks best".)

Last edited by Deciazulado; 07-21-2018 at 03:04 PM.
#1075 | Blu-ray Knight
Well, my disclaimer actually even contains the words "too dim" (in bold) by now. But (some) people still won't get it.

EDIT: I also appreciate your screenshots. Forgot to mention that as well.

Last edited by andreasy969; 07-21-2018 at 02:56 PM.
Thanks given by: Deciazulado (11-05-2018)
#1076 | Site Manager
Geoff et al.

Yes, that's posterization/clipping/extreme compression/out-of-bounds tones, etc. Mostly it's in bright skies, where the sky values (which are composed of RGB triplets) and cloud values reach the ceiling (of the screenshot parameters). Just as the above-X-nits tones (let's visualize them first as grey tones) are extremely compressed at the "ceiling", so as to let the tones below them be seen in correct fashion, the colors of the triplets and the individual triplet components bunch up at the ceiling, and they can do so at different points and rates. So sky, which is bright clouds and blues and cyans (and did give trouble on BDs too, in the pre-UHD era) - cyan being the addition of green + blue - well, cyan happens to be the second-brightest color (yellow is first) and also the one whose full saturation is more troublesome to reproduce in RGB color spaces (rec709's cyan is kind of grayish). (Dunkirk's sky is yellow-greenish, so if the high-brightness R or G components hit the ceiling it would veer too, while the blue part sits at a lower value.) Etc. etc.

So you go from 10-bit levels to 8-bit levels, which posterizes (decimates) the tone/color gradations 3-4x, and btw the way the eye works it's more discriminating of tones (small density variations) in brighter values. (That's one reason people are always complaining about sky reproduction: even on film, be it b/w or color, people tend to find sky areas "too grainy" - they can perceive the grain there more.) Add to that the 8-bit tone decimation, the expansion to make the image look correct in brightness again, and then the highest tone values hitting the ceiling. Let's say a part of the sky area has one of the RGB values go much higher than the others, R 50 G 125 B 95; it may end up on the ceiling as R 50 G 100 B 90, so it veers off-color as it falls on the highlight compression curve, while an adjacent part of the area has different, lower color values, so it's not squeezed as much, or its RGB components don't get the severe modification the other part does, and it doesn't veer as much, or at all. (I don't know if you're following; it's difficult to describe color with words.) So you get splotchiness. Banding. As an example, one sky tone might veer to blue or green or grayer, greens to cyan or yellow, etc.; an area ending up least saturated might be seen as the complementary color next to the saturated color, etc.

The vertical alternating lines in the bright sky in Dunkirk are interesting - they might be an artifact of the signal interacting with the tech, or they might just be there as part of the compression (some kind of compression dithering?) but imperceptible when seen at 10 bits, without tone compression/decimation, with the full high-nit highlights displayed and breathing. The first thing is to look at the raw HDRs, but as they are not 10 bit one can't be 100% certain, if you see it there (however faint it might be), that it's not from the 10->8 bit reduction alone rather than in the actual signal. I mentioned I've caught a couple of things when checking test patterns. Sometimes there are small variations (1 level) in some greyscale patterns, but as there are not many pattern variations on discs you can't double-check in the colors (whether they affect the colors, affecting one level of the triplet). I made the engineer get and check another unit, but the same small level variations were found, so further confirmation would mean getting different brands of the stuff.

You might think 1 level out of place out of 256 x 3 might not do much, but as I said the HDR expansion is actually 100-fold, times the 3x color expansion, so any variation, however minute, will be exacerbated in any HDR system/viewing. At some levels the HDR curve basically expands contrast ("gamma") a lot, like viewing things at gamma 4 instead of 2, and rec2020 expands the colors like saturating things double or more. (There are some shots where, when you compare the expanded signal to the grayish HDR, it looks like it goes from full beautiful blues/cyans to a black-and-white image that you'd think has no blue in it at all. I'll look for it.) Tolerances have to be tightened and reduced. That's why I wish we had 10-bit jpgs and a 10-bit chain and all that jazz. And andreasy969 demands it! 12-bit internal monitors might also be good (and DV is 12 bit). And there's also the precision of the computer system you're viewing this on. I see them on a 10-bit OS with a 10-bit browser and image editor and a 10-bit monitor. But on another system I see banding increase in the same image. So to me, somewhere in that one app the code is choking on the color management/precision - purple and yellow splotches.

An example I can even see on my system and point out: look at the Rock's right cheek in BitRate's cropped UHD screenshot where I applied the formula. Look closely - I was wondering if someone would notice: it has splotches in the skin tones. But the Br.com ones don't have these obvious splotches, or they are very minimized (in my system's view). Now, I'm not saying BitRate's image is bad or wrong; I was pleasantly surprised it looked so similar to the others, so it was correctly recorded. But why do I see these splotches there then, if it's the same 8 bits? It makes me think that somewhere along the chain from BitRate's digitizing to here, the precision of that particular jpg decreased by a certain amount or level. Maybe it was the jpg encoder or the host site. And if I see some banding/splotches on images on my 10/10/10 chain, what about people that have 8-bit systems, 8-bit browsers or image editors, 8-bit monitors, or any combination of the three? Maybe they see horrific oil-diffracted skies that I don't see on my system, instead of the amount of "posterization" I do see? That's why levels of precision are important, and 10 bits are better than 8, and 12 > 10, etc. - better precision, less divergence, tighter tolerance, even when things veer off tolerance.

Last edited by Deciazulado; 07-21-2018 at 05:29 PM. Reason: Ahh the days one inserted a VHS and hit play...
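A rough illustration of that 10-bit to 8-bit decimation: how much actual light one code step represents at different brightness levels when the PQ curve is kept but the container drops to 8 bits (purely illustrative values; it doesn't model the color-triplet and highlight-compression interactions described above):

```python
# A rough illustration of the 10-bit -> 8-bit decimation above: how much actual
# light one code step represents at different brightness levels when the PQ
# curve is kept but the container drops to 8 bits. Purely illustrative values.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code):
    """ST.2084 EOTF: normalized 0..1 code value -> luminance in nits."""
    p = max(code, 0.0) ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def step_nits(code, bits):
    """Luminance jump caused by a single code step at this level and bit depth."""
    q = 1 / (2 ** bits - 1)
    return pq_to_nits(code + q) - pq_to_nits(code)

for label, code in (("deep shadow (~1 nit)", 0.15),
                    ("mid tone (~26 nits)", 0.38),
                    ("bright sky (~245 nits)", 0.60)):
    print(f"{label}: one 10-bit step = {step_nits(code, 10):.3f} nit, "
          f"one 8-bit step = {step_nits(code, 8):.3f} nit")
```

In a bright sky, adjacent 8-bit codes end up several nits of luminance apart - roughly 4x the 10-bit spacing - which is the kind of step that reads as banding or splotchiness in smooth cloud gradations once the image is expanded back to full brightness.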
#1077 | Blu-ray Knight
Zero Dark Thirty

When I first looked into the disc back then, I thought that it looked great already (even in 1080p/SDR) and didn't remember the BD looking THAT good (and also said so). But I never compared. Doing the caps confirmed what I thought: another 2K DI UHD that is (relatively) "destroying" the BD throughout imho (source noise intact, real HDR, colors). I also have some comments below. (screenshotcomparison.com just gave me an error when I uploaded the URLs and I won't be doing this twice - sorry)

BD (upscaled) left, UHD-BD (madVR/SDR/200 nits) right
Disclaimer as to why the UHD-BD images may appear to be too dim: [Show spoiler]

[45 screenshot comparison pairs; a third 1000-nit cap is included for several of them. Notes on selected shots:]
13. different framing - my best guess is they corrected it on the UHD
15. just look at the wall, especially with the 1000-nit cap
29. included because of the red digital display (did the same with 2 other caps)
33. the UHD looks somewhat filtered here, but the BD is even worse, so I guess it's source related
39. just green on the UHD, i.e. the yellow is "gone"
45. I found the title to be sharper (incl. aliasing) on the BD and I also think the BD should better be compared with the 700-nit version
Thanks given by: aetherhole (07-23-2018), chip75 (07-29-2018), HD Goofnut (07-25-2018), juanbauty@yahoo.es (07-23-2018), NDcowboy (07-21-2018), Pieter V (07-21-2018)
#1078 | Site Manager
Ok, here's what I see on my screen. This is a capture off my screen of the jpgs.

I see some posterization/banding in the clouds, but minimal (at 2PH, equivalent to 130" at 10 feet), and up close (4PH, equivalent to 260" at 10 feet) it looks more like noise/grain (I'd call it posterization noise, but it's splotchy and irregular, unlike grain, which is small and regular). Knowing this is from an 8-bit jpg conversion of a 30MB HDR screenshot that's being transformed to look normal on a normal monitor, I don't think it's perfectly smooth, but considering the subject matter (blurry, feathery clouds with smooth gradations of tones) it's not very bothersome. But maybe this is all in vain, and you might be seeing posterization bands and rainbows galore in the clouds above the planes in the png screenshot I took of the jpg screenshot.

On this one I see more splotchiness and posterization in the kind of smooth, overcast cloudy sky - kind of what I would see in certain CGI animated Blu-rays' smooth gradations, because of the compression of tones. But again it's bright skies getting a "grey ramp" from bright to dark. What's more troublesome is that if I blow up the image to 4PH (again equivalent to 260" at 10 feet, 2x on my computer screen to 4320 x 7680) I see these vertical lines, like a faint vertical dither pattern; at 2PH (equivalent to 130" at 10 feet, 1x on my computer screen) they are fainter but sharper. Without knowing what they are, they look like dirty digital noise. Oh, and I see the pink splotches. Well, you can't make an HDRette without breaking any levels. But again I wonder if the mild artifacting I see is what you see. I'm already going blind from looking at bright skies for so long.

Ok, here's another one done from the same HDR (the grey one) on the site, but with the 10K-nit transform applied straight, with no highlight curves, which makes it look almost black, then brightened up in Photoshop: much less irregular color splotchiness and the vertical line pattern almost gone, but then it's brighter and contrastier than the others. Guess I can't wait till we move to 1000-nit ones with a less compressed highlight curve, 10 bits, and lots of 1000-nit computer monitors.

Last edited by Deciazulado; 07-22-2018 at 04:33 AM.
#1079 | Blu-ray Knight
Quote:
HDR10 doesn't use 10 bit for no reason. And Dolby Vision even uses 12, because even 10 bit is not perfect, which becomes more than obvious if you consider that HDR10 already uses basically half of the available range (0-519, i.e. 520 values) for just the SDR range of 0-100 nits (and 520-1023, i.e. only 504 values, for everything above 100 nits up to 10,000 nits). I think the reasoning for this is that precision isn't as important in the higher-nit regions (?), but I still think that this is the one (and only) advantage DV actually has over HDR10 - not the dynamic metadata, which I consider rather redundant - just give me proper image processing instead, please.
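For reference, a quick tally of how full-range 10-bit PQ spends its code values per decade of luminance (this assumes full-range 0-1023 mapping; HDR10 discs actually use video range 64-940, which shifts the counts slightly):

```python
# Quick tally of how full-range 10-bit PQ spends its code values per decade of
# luminance (assumes full-range 0-1023; HDR10 discs actually use video range
# 64-940, which shifts these counts slightly).
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq(nits):
    """ST.2084 inverse EOTF: nits -> normalized 0..1 code value."""
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for lo, hi in ((0.01, 0.1), (0.1, 1), (1, 10), (10, 100), (100, 1000), (1000, 10000)):
    codes = round(1023 * (pq(hi) - pq(lo)))
    print(f"{lo:8g} - {hi:6g} nits: {codes:3d} code values")
print(f"0 - 100 nits altogether: {round(1023 * pq(100))} of 1023 code values")
```

Per nit the steps do get coarser toward 10,000 nits, while per f/stop the code density actually rises slightly with luminance; either way, the whole 0-100 nit SDR range has to share roughly those 520 codes, which is the squeeze described above.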
Thanks given by: Geoff D (07-24-2018)
#1080 | Blu-ray Knight
A Quiet Place

Watched this yesterday and thought it looked wonderfully filmic. http://screenshotcomparison.com/comparison/117287

BD (upscaled) left, UHD-BD (madVR/SDR/200 nits) right
Disclaimer as to why the UHD-BD images may appear to be too dim: [Show spoiler]

[25 screenshot comparison pairs; a third 1000-nit cap is included for several of them.]
Thanks given by: aetherhole (07-23-2018), chip75 (07-29-2018), HD Goofnut (07-25-2018), juanbauty@yahoo.es (07-23-2018), Lionet (08-06-2018), NightKing (07-24-2018), Pieter V (07-22-2018)