06-25-2007, 06:15 PM   #10
Shadowself
Senior Member
Join Date: Sep 2005
Ah...

Quote:
Originally Posted by movies3
Why would it take 30 years to get 1080p on TV broadcast, since it's already out on Blu-ray? Just wondering. If it did take that long, we would already have another format to talk about and buy.
Short answer: Receivers are not good enough yet and we can't squeeze in enough bits yet.

Long answer:
1. 1080p requires more bits to be sent over the air than either 720p or 1080i. Getting those extra bits through requires better received signal quality -- a higher Eb/No (energy per bit to noise density) at the receiver -- than is currently the standard (or expected to be in the near future).
2. The occupied bandwidth is also more than is currently authorized. (See the back-of-the-envelope numbers below.)
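
To see why the bit budget is the sticking point, here is a rough back-of-the-envelope sketch (Python, purely illustrative; it assumes 8-bit 4:2:0 sampling and the roughly 19.39 Mb/s net payload of an ATSC 8VSB channel):

Code:
# Back-of-the-envelope comparison of raw video rates vs. the ATSC payload.
# Assumes 8-bit 4:2:0 sampling (12 bits per pixel) purely for illustration.

ATSC_PAYLOAD_MBPS = 19.39  # net 8VSB payload in a 6 MHz channel

formats = {
    "720p60":  (1280, 720, 60),
    "1080i60": (1920, 1080, 30),   # 60 interlaced fields = 30 full frames/s
    "1080p60": (1920, 1080, 60),
}

BITS_PER_PIXEL = 12  # 8-bit luma plus subsampled chroma (4:2:0)

for name, (w, h, fps) in formats.items():
    raw_mbps = w * h * fps * BITS_PER_PIXEL / 1e6
    ratio = raw_mbps / ATSC_PAYLOAD_MBPS
    print(f"{name}: raw {raw_mbps:8.1f} Mb/s -> needs ~{ratio:3.0f}:1 compression")

# 1080p60 carries twice the pixel rate of 1080i60, so roughly twice as many
# compressed bits are needed at comparable quality -- bits that have to come
# from either denser modulation (more capacity) or a better Eb/No.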

We can mitigate the first one with better forward error correction coding, but that requires a fair amount of horsepower at both the encoding and decoding ends. That horsepower is not currently available at consumer-electronics pricing.
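
For a feel of what error correction already costs, here is a sketch of the ATSC 8VSB payload budget; the figures for the deployed system come from the A/53 standard, while the "stronger FEC" line is a hypothetical lower-rate code added only for comparison:

Code:
# Rough sketch of how forward error correction overhead eats into payload,
# using the ATSC 8VSB chain as the example.

SYMBOL_RATE = 10.762e6   # 8VSB symbols per second
BITS_PER_SYMBOL = 3      # 8 signal levels

def payload_mbps(trellis_rate, rs_rate, sync_overhead=312/313):
    raw = SYMBOL_RATE * BITS_PER_SYMBOL
    return raw * trellis_rate * rs_rate * sync_overhead / 1e6

# Actual ATSC: rate-2/3 trellis code plus RS(207,187) block code -> ~19.39 Mb/s
print(f"ATSC as deployed : {payload_mbps(2/3, 187/207):.2f} Mb/s")

# Hypothetical stronger (rate-1/2) inner code: more Eb/No margin, but
# noticeably fewer bits left over for the video itself.
print(f"Stronger FEC     : {payload_mbps(1/2, 187/207):.2f} Mb/s")

More efficient code families can buy back some of that trade, but they are exactly the ones that need the extra encoding and decoding horsepower.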

We can mitigate the second one with higher-order modulation (packing more bits into the same occupied bandwidth), but that exacerbates the first one.
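
A quick way to see why more bits in the same 6 MHz demand a better receiver is the Shannon limit, C = B * log2(1 + SNR). The sketch below computes the theoretical minimum SNR for a few payload rates; real modulation and coding need several dB more than these floors:

Code:
# Minimum SNR any scheme needs to carry a given payload in a 6 MHz channel,
# from the Shannon capacity formula C = B * log2(1 + SNR).
import math

B_HZ = 6e6  # one ATSC channel

def min_snr_db(rate_bps, bandwidth_hz=B_HZ):
    snr_linear = 2 ** (rate_bps / bandwidth_hz) - 1
    return 10 * math.log10(snr_linear)

for mbps in (19.39, 30.0, 40.0):
    print(f"{mbps:5.2f} Mb/s in 6 MHz needs at least {min_snr_db(mbps * 1e6):4.1f} dB SNR")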

Both of these will eventually happen, but it will take years for them to trickle down to the consumer.

It won't take thirty years, because the long-duration issue is a technical one (getting enough bits through reliably) rather than a political one (getting more or different bandwidth allocated by the FCC). However, I don't expect 1080p broadcasts for *at least* five years after the digital switchover date, and maybe not for ten years after that.

Quote:
Originally Posted by donricouga
The problem is that HD broadcast is highly compressed. This results in artifacting and macroblocking. It's more evident in broadcasts with a lot of motion in them -- e.g., a sporting event, moving water, or a chase scene. On top of that, we can't even enjoy lossless audio from broadcast.
Over-the-air broadcast is not significantly more compressed than some Blu-ray discs have been. The issue is real-time or near-real-time compression versus having enough time to tweak the compression.

The algorithms implemented for real-time or near-real-time compression are not as efficient as those used for Blu-ray discs. Additionally, there is a fair amount of hands-on tweaking of the encoder settings for Blu-ray discs, sometimes taking several days per hour of movie. That time is not available for TV shows... especially "live" shows like the evening news and such.

Plus, the computational horsepower to implement truly real-time MPEG-4 Part 10 (H.264) compression of 1080p is more than can be inexpensively attained right now. The only implementations of which I know are custom designs in the latest and greatest field-programmable gate arrays, and they are quite costly. Not something the local news stations are going to implement. Will we get there? Yes, but not in the next year or two.
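
To put some rough numbers on "computational horsepower", here is a small sketch of the macroblock rate a real-time 1080p60 encoder has to sustain; the per-macroblock time budget is just the arithmetic consequence, not a measurement of any particular encoder:

Code:
# Rough feel for the real-time encoding load: how many 16x16 macroblocks per
# second an H.264 (MPEG-4 Part 10) encoder must handle at 1080p60.

WIDTH, HEIGHT, FPS = 1920, 1088, 60   # 1080 rounds up to 1088 (68 macroblock rows)
MB_SIZE = 16

mbs_per_frame = (WIDTH // MB_SIZE) * (HEIGHT // MB_SIZE)   # 120 x 68 = 8160
mbs_per_second = mbs_per_frame * FPS

print(f"Macroblocks per frame : {mbs_per_frame}")
print(f"Macroblocks per second: {mbs_per_second}")          # ~490,000

# Real time leaves about 2 microseconds per macroblock for motion search,
# mode decision, transform, and entropy coding; an offline Blu-ray encode can
# spend orders of magnitude longer and revisit decisions in a second pass.
print(f"Time budget per MB    : {1e6 / mbs_per_second:.1f} microseconds")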

Additionally, many cameras used for HD transmissions are not 3-CCD/CMOS focal-plane-array cameras but instead use a "Bayer array" on a single sensor. Look up "Bayer array" and you'll see that you don't get as much information with that system (and thus start with a softer image to encode) as when you capture each and every pixel in three (or more) colors.
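
Here is a minimal illustration of the difference; the 4x4 sensor size is arbitrary, just to keep the printout small:

Code:
# Why a single-sensor Bayer camera starts with less information than a 3-chip
# camera: each photosite records only one of the three colors, and the other
# two are interpolated (demosaiced) from neighbors.
import numpy as np

h, w = 4, 4
full_rgb_samples = h * w * 3          # what a 3-CCD/3-CMOS camera measures

# RGGB Bayer pattern: which single color each photosite actually captures
pattern = np.array([["R", "G"],
                    ["G", "B"]])
mosaic = np.tile(pattern, (h // 2, w // 2))
bayer_samples = h * w                  # one measured value per photosite

print(mosaic)
print(f"3-chip measurements : {full_rgb_samples}")
print(f"Bayer measurements  : {bayer_samples}  "
      f"({bayer_samples / full_rgb_samples:.0%} of the color data; "
      f"the rest is interpolated, which softens fine detail)")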

Also, the issue with lossless audio is, as above, the available bandwidth. Lossless audio -- especially 5.1, 7.1, and the like -- simply requires too much bandwidth to fit within the authorization.
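
Some ballpark numbers; the 2:1 lossless compression ratio is an assumption for illustration, since actual ratios vary with content:

Code:
# Why lossless multichannel audio is a poor fit for a ~19.4 Mb/s broadcast
# channel that also has to carry the video. Assumes 48 kHz / 24-bit PCM and a
# ~2:1 lossless compression ratio purely for illustration.

SAMPLE_RATE = 48_000
BIT_DEPTH = 24
LOSSLESS_RATIO = 2.0   # assumed typical lossless compression

for name, channels in (("stereo", 2), ("5.1", 6), ("7.1", 8)):
    pcm_mbps = channels * SAMPLE_RATE * BIT_DEPTH / 1e6
    lossless_mbps = pcm_mbps / LOSSLESS_RATIO
    print(f"{name:6s}: PCM {pcm_mbps:5.2f} Mb/s, lossless ~{lossless_mbps:4.2f} Mb/s")

# For comparison, the Dolby Digital (AC-3) audio actually used in ATSC
# broadcasts runs at roughly 0.384-0.448 Mb/s for 5.1.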