#1 | Member
Here's a technical question for those wiser than I.
If higher video bitrates do not necessarily mean good PQ (as in Patriot Games [1992], which, according to Blu-ray Disc Technical Statistics, has a very high bitrate of 32.95 Mbps but only gets a 3.5-star video rating in this website's review, and there are several other titles that rank very high when you sort Blu-ray Disc Technical Statistics by bitrate but get about 4 stars at best for video here, such as Cast Away [2000], Click [2006], The Hunt for Red October [1990], etc.), then why does a movie like Batman Begins (2005) get severely penalized in its review (3.5 stars for the video transfer, explicitly for the inexcusably low bitrate)?

I'm not challenging the reviewers here; I'm convinced they know exactly what they're talking about and more. I'm just looking for a better understanding: what exactly is the relationship between video bitrates and PQ? In contrast to Batman Begins (2005), which is 13.7 Mbps and got 3.5 stars for its video transfer, Troy (2004) has a bitrate of 11.77 Mbps, and yet its review got 5 stars for the video transfer. I haven't seen that Blu-ray (and don't plan to because of the horrible cast...), but surely it's not the exception to the rule. I expect there are other BDs out there with a relatively low bitrate that nevertheless have very good PQ. Any info will be appreciated.
#2 | Blu-ray Samurai
Higher bitrates don't necessarily guarantee a high score. There are other factors, such as digital noise, coloring, black levels, transitions between film stocks (is it noticeable when one occurs?), and the dreaded DNR and EE. Knowledge of the elements used to create a movie aids in forming these kinds of opinions.

As for the bitrate itself, I'll leave that to the more knowledgeable, but I believe the bitrate also has to do with the amount of work the encoder is doing.
#3 | Blu-ray Duke
This is a quote from the review of Batman Begins on why it doesn't deserve 5 stars for PQ:

Quote:
#4 | Member
That's what I already said.

Quote:
#5 | Active Member | Oct 2006
You already know a number of the factors.
1. If the source isn't up to snuff, a high bitrate won't fix it.
2. If the human watching the encode was asleep at the wheel, then poorly encoded scenes may not get re-encoded with different settings, or even higher bitrates, as they ought to be.

Batman Begins was encoded with VC-1, I think, and I know very little about it other than that early on it had a lot in common with MPEG-4. I don't know when they diverged or by how much, but I do know MPEG-4 Part 10 (H.264) has a number of settings besides bitrate that can affect quality. If you are interested, have a look here. This is specifically x264, but I expect a great deal of it applies to the other codecs as well.

But your real question is "why do the reviewers refer to it if it isn't the real culprit?" I don't know. An easy scapegoat? It gets talked about a great deal, and there are those who think it is the single largest contributor to PQ. It is definitely one of the advantages BD had over HD DVD, and I think it gets trotted out more than it deserves, not just to explain PQ but also to imply that BD is better than HD DVD (which it is).

In short, while the increased bandwidth can help BD, that assumes the processes earlier in the production chain didn't render it meaningless. I think the reviewers would be more accurate if they mentioned it as a POSSIBLE source of the problems rather than stating it as if it were the only possible source.

Last edited by scott1256ca; 01-08-2009 at 03:21 PM.
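To make the "settings other than bitrate" point concrete, here is a minimal sketch (not how Warner actually encoded Batman Begins, which was VC-1): two x264 encodes through ffmpeg at the same average bitrate, differing only in how much work the encoder is allowed to do. The input and output filenames and the 14M figure are placeholders, and it assumes an ffmpeg build with libx264.

```python
import subprocess

# Same bitrate budget, different amounts of encoder effort.
# "input.mkv" is a placeholder source file.
for preset in ("veryfast", "veryslow"):
    subprocess.run([
        "ffmpeg", "-y", "-i", "input.mkv",
        "-c:v", "libx264",
        "-b:v", "14M",        # identical average bitrate for both runs
        "-preset", preset,    # how hard x264 works within that budget
        "-an",                # drop audio so only the video encode differs
        f"out_{preset}.mkv",
    ], check=True)
```

At the same nominal bitrate, the veryslow output will generally look noticeably better than the veryfast one, which is the point: the bitrate number alone doesn't tell you how well those bits were spent.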
#6 | Blu-ray Samurai
You said 'IF', which implies a question. I was confirming it for you.

Quote:
||
#9 | Blu-ray Count | Jul 2007 | Montreal, Canada
I guess first we need to understand compression.

If I have white text on a black screen, I don't need to say each pixel is black (i.e. pixel 1.1=black, 1.2=black, 1.3=black... 2.1=black...). What you do instead is compress by block (block 1.1=black, 1.2=black...), where a block represents many pixels (for example, an 8x8 block represents 64 pixels). And if you have an O'Brien type of example, where the lips on a still picture are replaced by live video to mimic talking, you don't need to redefine everything; you just encode the changes in the lip area and the decoder knows that the rest is the same. These are the two biggest tools in a codec's arsenal (even though very simply put above).

The next thing you need to realize is that life is not that simple. For example, even simple text on black, if it comes from a film source, will have film grain, so even the all-black part (in the example above) won't be all black (i.e. the same value of black). The same thing happens in the lips situation (let's face it, film grain is not only on black but all over the place), so the encoder could say (in the first case) this block has different blacks (some darker than others due to grain), but they are close enough and we will pretend they are all the same. And this is where compression destroys information.

So the first thing to realize is that the simpler the video you want to encode, the less BW you need and the less the picture will be affected. If you have a high-action sequence, if there are many editing jumps, if the image is complex... all of these will require higher BW to start with. And if you have all one colour for a few seconds, that takes very little.

The next thing you need to realize is that BD and DVD (unlike D-VHS, for example) use dynamic allocation: one sequence can use 6 Mbps while somewhere else it is at 35 Mbps, so it is also important to know whether you are talking average or max.

So in essence it gets real complicated real fast, and you really must know what you are comparing.

Now that we've looked at the compression part, let's look at the full question. Compression artifacts will obviously degrade PQ; let's face it, if the whole picture is divided into 8x8 blocks, then you are not getting 1080p detail but 135p.

In the end there is only one fact: you can't get into trouble with more BW, but you can with less.
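A toy illustration of the two tricks described above, nothing like a real codec such as AVC or VC-1, just enough to show why a flat black frame costs almost nothing, why only the changed blocks of the next frame need to be stored, and why film grain spoils both tricks. The 8x8 block size and the frame contents are made up for the example.

```python
import zlib
import numpy as np

BLOCK = 8  # toy 8x8 blocks, as in the example above

def whole_frame_size(frame):
    """Bytes to store an entire frame with a generic lossless compressor."""
    return len(zlib.compress(frame.tobytes()))

def changed_blocks_size(prev, cur):
    """Bytes to store only the 8x8 blocks that differ from the previous frame."""
    changed = bytearray()
    h, w = cur.shape
    for y in range(0, h, BLOCK):
        for x in range(0, w, BLOCK):
            if not np.array_equal(prev[y:y+BLOCK, x:x+BLOCK],
                                  cur[y:y+BLOCK, x:x+BLOCK]):
                changed += cur[y:y+BLOCK, x:x+BLOCK].tobytes()
    return len(zlib.compress(bytes(changed)))

rng = np.random.default_rng(0)
black = np.zeros((1080, 1920), dtype=np.uint8)                    # flat black frame
talking = black.copy()
talking[500:564, 900:996] = 200                                   # only the "lip area" changes
grainy = black + rng.integers(0, 6, black.shape, dtype=np.uint8)  # film grain everywhere

print("flat black frame:      ", whole_frame_size(black), "bytes")
print("next frame, lips only: ", changed_blocks_size(black, talking), "bytes")
print("black frame with grain:", whole_frame_size(grainy), "bytes")  # far larger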
#10 | Blu-ray Prince
In the vast majority of cases, a significantly higher-bitrate video encode will look better than a lower-bitrate encode, assuming everything else is held constant. The problem is that even a high-bitrate encode of a poor-looking master is never going to look good. That is why you can end up in situations where a Blu-ray sourced from an excellent master can average 14 Mbps and look better than a 35 Mbps encode of another film. A high video bitrate does minimize compression artifacting, though, which is important. Bitrate is just one part of the overall equation that determines how good each Blu-ray looks.
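Just to put those two averages in perspective, here is the back-of-the-envelope arithmetic for how much space the video stream alone occupies; the 140-minute runtime is an assumed example.

```python
def video_stream_size_gb(avg_mbps, minutes):
    """Approximate size of the video stream alone at a given average bitrate."""
    bits = avg_mbps * 1_000_000 * minutes * 60
    return bits / 8 / 1_000_000_000  # decimal GB, as disc capacities are quoted

for mbps in (14, 35):
    print(f"{mbps} Mbps average over 140 min ~= {video_stream_size_gb(mbps, 140):.1f} GB")
```

Roughly 15 GB versus 37 GB before audio and extras, so a 35 Mbps average effectively demands a BD-50, while a well-mastered 14 Mbps encode fits comfortably and, per the post above, can still be the better-looking disc.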
#11 | Active Member | Oct 2007
I'm a software engineer, and I actually work in the field of digital video, specifically in video surveillance, so I am quite intimately familiar with the effects of bit rates. In fact, the question is rather central to video surveillance systems, even more so than for theatrical movies. When you're talking about having to store a year or more of video from many simultaneous cameras, the question of video quality versus bit rate is very important to understand.

The crux of it is this: some scenes compress more easily than others. Consider the extreme cases. A screen of nothing but one flat color (like black) is extremely easy to compress; there is simply no information there. At the other extreme, a screen covered in rapidly and randomly moving sharp detail, like a tree swaying in the wind, is very difficult to compress; there is a large amount of information and very little duplication to identify and throw out.

Video compression especially relies on duplication between frames. When much of the scene is not moving, the pixels in one frame are near duplicates of the pixels in the previous frame and thus can be thrown out without loss. This is true even if there is some motion, such as a panning camera, since the codec will identify the corresponding pixels in the previous frame even though they are offset a bit.

These are just a couple of the elements that determine how hard a scene is to compress. There are many more factors, some of which aren't obvious or easy to explain. Sometimes a scene which seems like it ought to be easy to compress is surprisingly difficult, and sometimes the reverse is true as well. It's also affected by the skill of the compressionist: a good compressionist can help the codec along and squeeze a bit more performance out of it (and conversely, a bad compressionist can make the codec's job much harder with bad settings). The codecs themselves are not all created equal either.

So what this means in terms of the OP's question is that the video quality of a scene is not a function of the bit rate alone but also of the difficulty of compressing the scene. Specifically, the difficulty of the scene demands a particular bit rate below which quality deteriorates rapidly. A higher bit rate is always better, but there are diminishing returns the further it rises above this threshold. And this is why some films can have excellent PQ at a tiny bit rate while others cannot.

The old HD DVD argument that they had 'enough' was actually a half truth: for some films, their bit rate and storage limits were plenty, and they liked to tout those films as proof that any more was not needed. But most films can benefit from more than HD DVD was capable of. Even Blu-rays are stretched at times.

Additionally, reviewers are not just reviewing the encoding quality of a Blu-ray disc; they are judging the quality of the video as a whole. Factors such as the condition of the masters, the quality of the film stock, and even the quality of the original camerawork also play a role. It doesn't matter if the encoder reproduces the original master perfectly if the camera was out of focus; it's still not going to be sharp.

Anyway, I hope that answers your question.
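To illustrate the "near duplicate frames" point, here is a crude sketch of what a motion search buys the encoder; it is not how a real codec works, and the frame size and the 3-pixel pan are invented for the example.

```python
import zlib
import numpy as np

rng = np.random.default_rng(1)

# A detailed "scene", and the same scene one frame later, panned 3 pixels to the right.
frame1 = rng.integers(0, 256, (720, 1280), dtype=np.uint8)
frame2 = np.roll(frame1, 3, axis=1)

def best_residual(prev, cur, max_shift=8):
    """Try a range of horizontal offsets and keep the one whose difference
    from the previous frame compresses smallest (a crude motion search)."""
    best_size, best_dx = None, 0
    for dx in range(-max_shift, max_shift + 1):
        diff = (cur.astype(np.int16) - np.roll(prev, dx, axis=1)).tobytes()
        size = len(zlib.compress(diff))
        if best_size is None or size < best_size:
            best_size, best_dx = size, dx
    return best_size, best_dx

print("frame stored on its own:    ", len(zlib.compress(frame2.tobytes())), "bytes")
size, dx = best_residual(frame1, frame2)
print(f"residual after {dx}px offset:", size, "bytes")  # tiny: the frames are near duplicates
```

A frame full of unpredictable detail barely compresses on its own, but once the offset is found the second frame costs almost nothing, which is why camera pans are cheap while wind-blown foliage (new, random detail in every frame) is expensive.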
#12 | Member | Jul 2007
Think of JPEGs from your camera.

Unless you're a top-notch expert or a total novice, you know you have some good pictures and some terrible pictures. Looking at the size of each file doesn't tell you much about whether the picture was well taken or not. Blu-ray discs are like that. If the original master used is too old, or the film is very noisy, or scratched, or someone made a mistake in the transfer, you can end up with a bad master from which the Blu-ray disc is encoded. Like the camera, there are a lot of places for errors to creep in before the final Blu-ray disc is made, and the studios are slowly learning how to avoid these mistakes or to delay discs that don't look good enough to release (usually older movies that have some unique problems).

Now suppose you take your best-quality pictures and load them into Photoshop on your PC. You can manipulate the pictures in many ways; for the sake of simplicity, let's say you took the pictures at the UltraFine setting and they look amazing. In Photoshop you could be at a 90% quality setting; now suppose you save the picture several times at 80%, 60%, 40%, and 20% quality settings. With a 20% quality setting, the differences from the original picture should be fairly obvious (provided the picture has sufficient detail and is not just a white tablecloth against a white wall). At 40%, it might take a little time to find problems with it, at 60% it would be more difficult, and at 80% it might be near impossible.

So the video bitrate is like the size of the JPEG file: the lower the bitrate, the more potential there is for loss of detail. That is why, if you like pictures with higher quality, you use the best settings on your camera, and why, if you want the best possible video quality, all else being equal, you'd rather pick the highest bitrate the disc can allow.

Like pictures, though, you can shoot simple scenes and complex scenes, and you see that pictures of simple scenes compress to very small JPEG files and still look good, while complex scenes compress to big files and can still look bad. Video bitrate is like that: you want a higher peak bitrate for complex scenes and a lower bitrate for simple ones. That is why the average video bitrate doesn't matter as much as the peak (max) bitrate; you need a peak bitrate as high as 2x the average to get consistent quality for most movies.

As for the math behind it, it's too difficult to get into here, but just realize that the video compression codecs (formats) are all lossy, and the bitrate is really about how much detail you are willing to lose in the compression. The lower the bitrate, the more is lost, so it's really a personal question of what you can tolerate. Another analogy is MP3 songs: some think 64 kbps is OK, some insist 192 kbps is their minimum acceptable quality, and some think 320 kbps is still not good enough. It depends.
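If you want to see the JPEG analogy in numbers, a few lines with Pillow reproduce it; "holiday.jpg" is just a placeholder for any reasonably detailed photo you have on hand.

```python
import io
from PIL import Image

# Any detailed photo will do; "holiday.jpg" is a placeholder name.
img = Image.open("holiday.jpg").convert("RGB")

for quality in (90, 80, 60, 40, 20):
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    print(f"quality {quality:3d}: {len(buf.getvalue()) / 1024:8.1f} KiB")
```

The file shrinks steadily as the quality setting drops, and, just as with video bitrate, a simple scene (the white tablecloth against a white wall) stays small and presentable even at low settings, while a detailed one falls apart first.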