Old 01-08-2009, 02:22 PM   #1
mthopper
Question about bitrates and PQ

Here's a technical question for those wiser than I.

If higher video bitrates do not necessarily mean good PQ, then why does a movie like Batman Begins (2005) get severely penalized in its review (3.5 stars for video transfer, explicitly for its "inexcusably low" bitrate)? For example, Patriot Games (1992), which according to Blu-ray Disc Technical Statistics has a very high bitrate of 32.95 Mbps, only gets a 3.5-star video rating in this website's review. And there are several other titles that rank very high when you sort Blu-ray Disc Technical Statistics by bitrate but get about 4 stars at best for video here, such as Cast Away (2000), Click (2006), The Hunt for Red October (1990), etc.

I'm not challenging the reviewers here; I'm convinced they know exactly what they're talking about and more. I'm just looking for a better understanding: what exactly is the relationship between video bitrate and PQ? In contrast to Batman Begins (2005), which runs at 13.7 Mbps and got 3.5 stars for video transfer, Troy (2004) has a bitrate of 11.77 Mbps, yet its review got 5 stars for video transfer. I haven't seen that Blu-ray (and don't plan to, because of the horrible cast...), but surely it's not the exception to the rule. I expect there are other BDs out there with relatively low bitrates that nevertheless have very good PQ.

Any info will be appreciated.
Old 01-08-2009, 02:34 PM   #2
Slec

Higher bitrates don't necessarily guarantee a high score. There are other factors, such as digital noise, coloring, black levels, transitions between film stocks (is it noticeable when they occur?), and the dreaded DNR and EE. Knowledge of the elements used to create a movie aids in these kinds of judgments.

As for the bitrate itself, I'll leave that to the more knowledgeable, but I believe the bitrate also reflects how much work the encoder is doing.
Old 01-08-2009, 02:36 PM   #3
MOONPHASE

This is a quote from the review of Batman Begins on why it doesn't deserve 5 stars for PQ.

Quote:
Watch the scene on the frozen lake, where Bruce spars with Ducard. The color and detail is a touch muted, although the resolution is actually quite good. The tonal balance, from light to dark, appears lifelike, but definition is subdued as if a thin layer of plastic is placed over the screen. It is this "veiling" from the low bitrate transfer that hinders the picture and separates it from reference quality BDs like No Country for Old Men. Night scenes show good black level and adequate 1080p detail, far surpassing the DVD version. Since darkness and gradients of black are featured so prominently in the film and in Batman's accouterments, it becomes a necessity to resolve objects in night scenes. Here again, the BD does an adequate job, and many night scenes show good depth. Unfortunately, some life appears sucked out of the picture.
Old 01-08-2009, 02:56 PM   #4
mthopper

Quote:
Originally Posted by Slec View Post
higher bitrates don't necessarily guarantee a high score.
That's what I already said.

Quote:
Originally Posted by Slec View Post
There are other factors such as digital noise, coloring, black levels, transition between film stocks (is it noticeable when it occurs) and the dreaded DNR and EE. Knowledge of the elements used to create a movie aid in these types of opinions.
Of course these are all factors in the PQ. That goes without saying.

Quote:
Originally Posted by Slec View Post
As for the bitrate itself, I'll leave that to the more knowledgeable, but I believe the bitrate also has to do with the amount of work the encoder is doing.
Sigh...

Quote:
Originally Posted by MOONPHASE View Post
This is a quote from the review of Batman Begins on why it doesn't deserve 5 stars for PQ.
Uh huh. Here's the rest of the quote:

Quote:
One of the flagship Warner releases on HD DVD, Batman Begins was delayed on BD to better capitalize on interactive features. Unfortunately, Warner appeared to make no effort to capitalize on Blu-ray's overarching feature: superior capacity. The bitrates of the BD are no better than that of the HD DVD. That means the picture and sound are also held back by the HD DVD, and appear to have been sourced from the same transfer. The video and audio are not bad at all, but Batman Begins is not up to reference quality by any accurate assessment. The most frustrating part of this observation is not just waiting more than a year for no significant bitrate improvement over the HD DVD, but in comparing Batman Begins to a six-minute prologue of The Dark Knight, which is included on the BD. Every frame of the prologue is truly reference quality, generating a stark contrast to the comparably veiled and constricted dynamics of the main feature.
The point is that the poor bitrate is clearly being cited as a detriment to the PQ of Batman Begins. But why? My original question remains: What exactly is the relationship between video bitrate and PQ?
Old 01-08-2009, 03:19 PM   #5
scott1256ca

You already know a number of the factors.
1. If the source isn't up to snuff, a high bitrate won't fix it.
2. If the human watching the encode was asleep at the wheel, then poorly encoded scenes may not get re-encoded with different settings or higher bitrates as they ought to be.

Batman Begins was encoded with VC-1, I think, and I know very little about it other than that it had a lot in common with MPEG-4 early on. I don't know when they diverged or by how much, but I know MPEG-4 Part 10 (H.264) has a number of settings beyond bitrate that can affect quality.
If you are interested, have a look here.
This is specifically about x264, but I expect a great deal of it applies to the other codecs as well.

But your real question is: "why do the reviewers refer to it if it isn't the real culprit?" I don't know. An easy scapegoat? It gets talked about a great deal, and there are those who think it is the single largest contributor to PQ. It is definitely one of the advantages BD had over HD DVD, and I think it gets trotted out more than it deserves, not just to explain PQ but also to imply that BD is better than HD DVD (which it is) and that, had they utilized the bandwidth available to BD but not to HD DVD, they would have had a great product on BD that would not have been possible on HD DVD. I don't necessarily buy that argument, at least not in all cases.
In short, while the increased bandwidth can help BD, that assumes processes earlier in the production chain did not render it meaningless. I think the reviewers would be more accurate if they mentioned it as a POSSIBLE source of the problems rather than stating it as if it were the only one.

Last edited by scott1256ca; 01-08-2009 at 03:21 PM.
Old 01-08-2009, 03:49 PM   #6
Slec

Quote:
Originally Posted by mthopper View Post
That's what I already said.
You said "IF", which implies a question. I was confirming it for you.
Quote:
Sigh...
Source:
Quote:
Those who know me quite well are aware of my position regarding allocating plenty of bandwidth instead of barely enough... I guess nobody disputes that "more bandwidth equals less quantization equals objectively better PQ in the context of any given source".

But the point is:

Is the PQ soft because of the low bitrate, or is the bitrate low because the source is soft?

What needs to be understood is the following: variable-bitrate encoding (as used for any HDM release) is by definition a concept of CONSTANT QUALITY. To put it very simply, the compressionist sets a certain quantization parameter for any given release; this defines the relative quality in relation to the source, and the quantization level is directly linked to the available bandwidth budget and the inherent characteristics of the source (to put it very simply again: a highly detailed, dynamic source requires more bandwidth at any given quantization level than a more static one).

So: a low bitrate as an isolated parameter DOESN'T tell you anything about the "quality" of a given encode. An inherently soft, static source (a movie shot that way on purpose, a bad or dated transfer, or whatever other reason) ends up encoded at a relatively low bitrate, BUT it can still be encoded with very low quantization, and is therefore very close to the quality of the source DESPITE the low bitrate. Such a transfer can objectively be even "better" (truer to the source) than something like DH4 or Becoming Jane, "objectively" meaning encoded with lower quantization.


The argument often used is that "there are softer scenes/shots in some transfers, and the bitrate is relatively low during those". The claim is that, due to bandwidth limitations, those scenes are encoded at "less than optimal bitrates" and therefore the "PQ suffers" (something like: low bitrate = heavy AVC/VC-1 deblocking loop filtering = lack of definition / a softer, smoother picture)...

Well, we are now back where my post started: the basic principle of VBR encoding is constant quality, so any given scene of an encode is (give or take) encoded at a similar level of transparency to the source. Therefore, if a given transfer shows "great shots with outstanding definition and a very high applied bitrate" (which comes quite naturally with such exceptional scenes), those are encoded at a very similar quantization level to all the softer, "lacking" shots of that transfer.

This leads to the logical conclusion that:

- those "softer/smoother/less detailed" scenes are much easier to encode, and therefore less bitrate is necessary than for more "difficult" scenes, although both extremes show the same relative quality to the source;

- therefore, the next logical conclusion is that bitrate limitation/starvation shows itself during the scenes that are the toughest to encode, and certainly NOT during easy shots (e.g. out of focus, static, ...). Those don't look "nice" because they were shot that way and/or the transfer itself isn't very good; in both cases the source is to blame, NOT the encode, and therefore certainly NOT the low bitrate, which, following the VBR concept, is just a consequence of the input.


In short: looking at the isolated parameter of "applied bitrate per shot/scene/movie" DOES NOT conclusively tell you ANYTHING about the quality of a given encode! It can only serve as a hint about the level of quantization, among other factors.

To put it another way: Across the Universe looks pretty decent IMHO. "Tack sharp" it is not, nor is it intended to be (keyword: diffuse lighting), BUT encoding at much higher bitrates (like 35 Mbps ABR) would certainly not change this attribute in any observable manner.
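The constant-quality idea in the quote above can be sketched with a toy example. This is not a real video codec: Python's zlib stands in for the entropy coder, a 1-D list of samples stands in for a scene, and all the numbers are made up. Holding one quantization step fixed for both "scenes" gives the same relative fidelity but very different bitrates:

```python
import random
import zlib

random.seed(0)

def quantize(samples, step):
    """Uniform quantization: one fixed step means one fixed level of fidelity."""
    return bytes((s // step) * step for s in samples)

# Two hypothetical "scenes", 10,000 8-bit samples each.
soft_scene = [128 + random.randint(-4, 4) for _ in range(10_000)]  # flat, low detail
busy_scene = [random.randint(0, 255) for _ in range(10_000)]       # detailed, dynamic

step = 8  # the same quantization parameter for both scenes
soft_bits = len(zlib.compress(quantize(soft_scene, step))) * 8
busy_bits = len(zlib.compress(quantize(busy_scene, step))) * 8

# Same quantizer (same "quality" relative to the source), very different bitrates:
print(f"soft scene: {soft_bits} bits, busy scene: {busy_bits} bits")
assert soft_bits * 2 < busy_bits
```

The soft scene's low bit count says nothing bad about its quality; it simply needed fewer bits to get equally close to its source, which is exactly the point being made about low-bitrate discs that still review well.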
Old 01-08-2009, 04:15 PM   #7
mthopper

Ah yes, now we're getting somewhere. Thanks guys. Keep 'em coming.
Old 01-08-2009, 05:41 PM   #8
ryoohki

The only thing a high bitrate ensures is that the encode stays as close as possible to the master used for making the Blu-ray, without introducing compression artifacts like most DVDs run into. If the master has DNR aplenty on it, it won't make for 5/5 PQ.
Old 01-10-2009, 05:27 PM   #9
Anthony P

I guess first we need to understand compression.

If I have white text on a black screen, I don't need to store each pixel individually (i.e. pixel 1.1 = black, 1.2 = black, 1.3 = black... 2.1 = black...). Instead you compress by blocks (block 1.1 = black, block 1.2 = black...), where a block represents many pixels (for example, an 8x8 block represents 64 pixels).

If you have an O'Brien type of example, where the lips on a still picture are replaced by live video to mimic talking, you don't need to re-encode everything: you just encode the changes in the lip area, and the decoder knows the rest is the same.


These are the two biggest tools in a codec's arsenal (even though very simply put above).

The next thing you need to realize is that life is not that simple. For example, even simple text on black, if it comes from a film source, will have film grain, so even the all-black part (in the example above) won't be all black (i.e., the same value of black). The same thing happens with the lips example (let's face it, film grain is not only on black but all over the place). So the encoder could say (in the first case): this block has different blacks, some darker than others due to grain, but they are close enough, and we will pretend they are all the same. And this is where compression destroys information.

So the first thing to realize is that the simpler the video you want to encode, the less bandwidth you need and the less the picture will be affected. A high-action sequence, many editing cuts, a complex image... all of these require more bandwidth to start with, while a few seconds of a single flat color takes very little.
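The two tools above (coding a block once instead of every pixel, and coding only what changed between frames) can be mimicked with a toy sketch. zlib stands in for the real block/motion machinery and the frame contents are invented, but it shows both effects: grain makes a frame coded from scratch expensive, while a mostly static frame costs almost nothing to encode as a difference:

```python
import random
import zlib

random.seed(1)
W, H = 64, 64

# A hypothetical frame: black background plus mild film grain.
frame1 = bytes(random.randint(0, 6) for _ in range(W * H))

# The next frame is identical except for a small 8x8 "lips" region.
frame2 = bytearray(frame1)
for y in range(28, 36):
    for x in range(28, 36):
        frame2[y * W + x] = 200

clean = len(zlib.compress(bytes(W * H)))   # grain-free black frame: almost free
intra = len(zlib.compress(bytes(frame2)))  # grainy frame coded from scratch
delta = bytes((b - a) % 256 for a, b in zip(frame1, frame2))
inter = len(zlib.compress(delta))          # coding only the changed region

print(f"clean: {clean} B, intra: {intra} B, inter: {inter} B")
assert clean < intra       # grain alone makes the frame far more expensive
assert inter < intra // 3  # encoding only the change is far cheaper
```

This is also why pretending the grainy blacks are "close enough" to identical saves so many bits, and why that pretending is precisely where information gets destroyed.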

----------------

The next thing you need to realize is that BD and DVD (unlike D-VHS, for example) use dynamic allocation: one sequence can use 6 Mbps while somewhere else the stream is at 35 Mbps, so it is also important to know whether you are talking about the average or the max.
-----
So in essence it gets real complicated real fast, and you really must know what you are comparing.

--------

Now that we've looked at the compression part, let's look at the full question.

Compression artifacts will obviously degrade PQ; let's face it, if the whole picture is divided into 8x8 blocks then you are not getting 1080p detail but more like 135p. But other things can affect PQ too, even subjective taste. For example, some would rather have an image that was blurred and then sharpened to get rid of film grain; some will look at a shot that was (intentionally or not) filmed a bit hazy and decide it was not as good... Let's face it: a reviewer cannot look at the movie and say it looks this way because of encoding (or pre-encoding) filters, because of over-compression during the encode, because the cameraman missed focus, or because it was filmed that way. In the end, what he is saying is how good he thinks it looks.

------
In the end there is only one sure thing: you can't get into trouble with more BW, but you can with less.
Old 01-11-2009, 01:38 AM   #10
Clark Kent

In the vast majority of cases, a significantly higher-bitrate video encode will look better than a lower-bitrate encode, assuming everything else is held constant. The problem is that a high-bitrate encode of a poor-looking master is never going to look good. That is why you can end up with Blu-rays sourced from an excellent master that average 14 Mbps and look better than an encode averaging 35 Mbps on another film. A high video bitrate does minimize compression artifacting, though, which is important. Bitrate is just one part of the overall equation that determines how good each Blu-ray looks.
Old 01-11-2009, 11:46 PM   #11
SkantDragon

I'm a software engineer, and I actually work in the field of digital video, specifically in video surveillance. So I am quite intimately familiar with the effects of bitrates. In fact, the question is rather central to video surveillance systems, even more so than for theatrical movies. When you're talking about having to store a year or more of video from many simultaneous cameras, the question of video quality versus bitrate is very important to understand.

The crux of it is this: some scenes compress more easily than others. Consider the extreme cases. A screen of nothing but one flat color (like black) is extremely easy to compress; there is simply no information there. At the other extreme, a scene covered in rapidly and randomly moving sharp detail, like a tree swaying in the wind, is very difficult to compress: there is a large amount of information and very little duplication to identify and throw out.

Video compression relies especially on duplication between frames. When much of the scene is not moving, the pixels in one frame are near duplicates of the pixels in the previous frame and can thus be thrown out with little to no loss. This is true even if there is some motion, such as a panning camera, since the codec will identify the corresponding pixels in the previous frame even though they are offset a bit.

These are just a couple of the elements that determine how hard a scene is to compress. There are many more factors, some of which aren't obvious or easy to explain. Sometimes a scene that seems like it ought to be easy to compress is surprisingly difficult, and sometimes the reverse is true. It's also affected by the skill of the compressionist: a good compressionist can help the codec along and squeeze a bit more performance out of it (and conversely, a bad compressionist can make the codec's job much harder with bad settings). The codecs themselves are not all created equal either.

So what this means for the OP's question is that the video quality of a scene is not a function of the bitrate alone but also of the difficulty of compressing the scene. Specifically, the difficulty of the scene demands a particular bitrate below which quality deteriorates rapidly. A higher bitrate is always better, but there are diminishing returns the further it rises above this threshold.
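That threshold effect can be sketched with a toy experiment (zlib as a stand-in entropy coder, made-up sample data, not a real codec): give a flat scene and a busy scene the same bit budget and find the gentlest quantization step that still fits. The busy scene has to be quantized much harder:

```python
import random
import zlib

random.seed(2)

def bits_after(samples, step):
    """Bits used once the scene is quantized with the given step and entropy-coded."""
    q = bytes((s // step) * step for s in samples)
    return len(zlib.compress(q)) * 8

def finest_step_that_fits(samples, budget_bits):
    """Smallest quantization step (least damage) that stays within the budget."""
    for step in (1, 2, 4, 8, 16, 32, 64, 128):
        if bits_after(samples, step) <= budget_bits:
            return step
    return 256  # nothing fits; collapse the scene to a single flat value

flat_scene = [100 + random.randint(-1, 1) for _ in range(5_000)]
busy_scene = [random.randint(0, 255) for _ in range(5_000)]

budget = 12_000  # an arbitrary fixed per-scene bit budget
flat_step = finest_step_that_fits(flat_scene, budget)
busy_step = finest_step_that_fits(busy_scene, budget)

print(f"flat scene step: {flat_step}, busy scene step: {busy_step}")
assert flat_step < busy_step  # the hard-to-compress scene degrades first
```

Under a tight budget the flat scene is still reproduced nearly intact while the busy scene takes all the visible damage, which is the "bitrate starvation shows up in the toughest scenes" point from earlier in the thread.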

And so this is why some films can have excellent PQ at a tiny bitrate while others cannot. The old HD DVD argument that they had "enough" was actually a half-truth: for some films, its bitrate and storage limits were plenty, and they liked to tout those films as proof that any more was not needed. But most films can benefit from more than HD DVD was capable of. Even Blu-rays are stretched at times.

Additionally, reviewers are not just reviewing the encoding quality of a Blu-ray disc; they are judging the quality of the video as a whole. Factors such as the condition of the masters, the quality of the film stock, and even the quality of the original camerawork also play a role. It doesn't matter if the encoder reproduces the master perfectly if the camera was out of focus; it's still not going to be sharp.

Anyway, I hope that answers your question.
Old 01-12-2009, 08:53 PM   #12
mapledragon

Think of jpegs from your camera.

Unless you're a top-notch expert or a total novice, you know you have some good pictures and some terrible pictures. Looking at the size of each file doesn't tell you much about whether the picture was well taken or not. Blu-ray discs are like that.

If the original master is too old, or the film is very noisy or scratched, or someone made a mistake in the transfer, you can end up with a bad master from which the Blu-ray disc is encoded.

Like with the camera, there are a lot of places for errors to creep in before the final Blu-ray disc is made, and the studios are slowly learning how to avoid these mistakes, or to delay discs that don't look good enough to release (usually older movies with some unique problems).

Now suppose you take your best-quality pictures and load them into Photoshop on your PC. You can manipulate the pictures in many ways; for the sake of simplicity, let's say you took the pictures at the UltraFine setting and they look amazing. In Photoshop you might start at a 90% quality setting; now suppose you save the picture at 80%, 60%, 40%, and 20% quality settings.

At a 20% quality setting, the differences from the original picture should be fairly obvious (provided the picture has sufficient detail and is not just a white tablecloth against a white wall). At 40%, it might take a little time to find problems; at 60%, it would be more difficult; and at 80%, it might be near impossible.

So the video bitrate is like the size of the JPEG file: the lower the bitrate, the more potential there is for loss of detail. That is why, if you like higher-quality pictures, you use the best settings on your camera, and why, if you want the best possible video quality, all else being equal, you'd rather pick the highest bitrate the disc allows.

Like with pictures, though, you can shoot simple scenes and complex scenes, and you'll see that pictures of simple scenes compress to very small JPEG files and still look good, while complex scenes compress to big files and can still look bad.

Video bitrate is like that: you want a higher peak bitrate for complex scenes and lower bitrates for simple ones. That is why the average video bitrate doesn't matter as much as the peak (max) bitrate. You need a peak bitrate as high as 2x the average to get consistent quality for most movies.
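That 2x rule of thumb is easy to see with hypothetical numbers. Suppose a constant-quality encode produced these per-scene bitrates (entirely made up for illustration):

```python
# Hypothetical per-scene bitrates (Mbps) from a constant-quality encode:
scene_bitrates = [6, 8, 30, 12, 35, 9, 7, 25]

avg = sum(scene_bitrates) / len(scene_bitrates)  # 16.5 Mbps
peak = max(scene_bitrates)                       # 35 Mbps

print(f"average {avg:.1f} Mbps, peak {peak} Mbps")
# The encoder must be allowed a peak well above the average; capping the
# peak near the average would starve the two hardest scenes.
assert peak > 2 * avg
```

A disc whose quoted "average bitrate" looks modest can still be serving its hardest scenes well, as long as the peak was allowed to go where those scenes needed it.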

---

As for the math behind it, it's too difficult to get into here, but just realize that the video compression codecs (formats) are all lossy, and the bitrate is really about how much detail you are willing to lose in the compression. The lower the bitrate, the more is lost, so it's really a personal question of what you can tolerate.

Another analogy is MP3s. Some think 64 kbps is OK, some insist 192 kbps is their minimum acceptable quality, and some think 320 kbps is still not good enough.

It depends.