#1
Active Member
Jan 2007
The 2014 Holiday shopping season may be a good time to purchase a 4K TV, for two reasons. First, thanks to companies like Vizio, prices of this year's units have started to decline. Second, manufacturers may be trying to make room for the release of new models. CES 2015 is only a little more than two months away as of this post, and the advances in next-gen 4K TVs may be significant enough that many consumers decide not to buy this year's inventory after seeing the new prototypes in January.
So far as I'm aware, after speaking with technicians at Sony, Samsung, and Toshiba, most TVs this year employ an 8-bit color processor. (A notable exception appears to be Vizio, but that company no longer supports 3D.) We all know that when 4K Blu-ray players are released, hopefully by the 2015 Holiday shopping season, they will employ 10-bit color processing. My understanding, though, is that TVs must have 10-bit processing to take advantage of these players. If I'm right, that leads me to ask just how much of an improvement 10-bit will be over 8-bit. Will the difference be night and day, or can we live with the limitation for a few years?

Here's why I'm asking: my guess is that a TV such as Sony's 65" 850 might sell for as low as $2,500 this Holiday season. It doesn't have local dimming or 10-bit, but the overall picture is very good, and at that price it would be an attractive buy. I could keep the TV for three years, by which time I believe OLED will be affordable and refined, and I could accept the sell-off price of the Sony. This sounds like a plan, but I have to wonder if the quality of next year's 4K TVs, with 10-bit processing and full-backlight improvements, will make me regret buying this year.

I would certainly enjoy hearing opinions from the informed members of this forum. Thanks in advance for any advice and opinions you may care to give.
#2
Blu-ray Champion
I really think UHD will not be worth it for a few years (unless you are planning to game at that resolution and have the hardware for it). This year's UHD sets will not meet the eventual standards, and next year's may not either (the standards are still not finalized). Once sets meeting the standards do release, you will be looking at first-gen attempts, and large improvements are nearly always seen in the first few cycles of such things.
How important is 10-bit color? Hard to say: cinemas apparently get both 8-bit and 10-bit DCPs, yet I never notice a difference. My guess is that without a pro calibration the difference will be rather minor (and with one it will still be minor, but maybe slightly less so). But this is a guess.
#4
Retailer Insider
All true, but no TV can take advantage of a 10-bit panel today, as no consumer content is encoded with 10-bit information.
10-bit and even higher bit-depth panels will be needed as Ultra HD evolves into larger color palettes, like DCI or Rec. 2020. Additionally, the reason 10-bit panels will reduce banding is that we'll finally get more steps of gradation: 8-bit panels can produce 256 shades per channel, while 10-bit displays can deliver 1,024. 10-bit panels are also capable of displaying a larger color space. -Robert
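To make those gradation numbers concrete, here is a minimal Python sketch (my own illustration, not from Robert's post; the ramp values are arbitrary). It quantizes the same subtle gray gradient at 8 and 10 bits and counts how many visible bands each depth produces:

```python
import numpy as np

def quantize(signal, bits):
    """Round a [0, 1] signal to the nearest representable level at this depth."""
    levels = 2 ** bits - 1          # 255 intervals for 8-bit, 1023 for 10-bit
    return np.round(signal * levels) / levels

# A subtle dark-gray ramp across one 1920-pixel row: the kind of smooth
# gradient (skies, shadows) where banding is most visible.
ramp = np.linspace(0.10, 0.12, 1920)

for bits in (8, 10):
    bands = len(np.unique(quantize(ramp, bits)))
    print(f"{bits:2d}-bit: {2 ** bits:4d} levels/channel, "
          f"{bands} distinct bands on this ramp")
```

On this near-black ramp the 8-bit version collapses to about 6 bands while the 10-bit version keeps about 22, which is exactly where banding tends to show up on real displays.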
#5
Member
Jan 2009
Quote:

Do you have any insider info on this? Any chance we might get 12 bits?
Thanks given by: Robert Zohn (12-18-2014)
#6
Retailer Insider
Quote:
I, and many experts with far greater influence than me, are lobbying for P3, aka the DCI color space; although it is a slightly smaller color palette, it would fit with less compression into a 10-bit channel. DCI also makes sense because we have tons of content ready to go, as it has been the color-grading standard for Hollywood movies, whereas Rec. 2020 color would need to be created from scratch. Let's all get behind P3, the enhanced larger color palette we can far more easily sustain vs. the proposed Rec. 2020. -Robert

Last edited by Robert Zohn; 12-18-2014 at 09:57 PM.
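For a rough sense of how the two gamuts compare, here is a sketch (my own, using the published CIE 1931 xy chromaticities of each standard's primaries) that computes the area of each RGB triangle; xy area is only a crude proxy for perceptual gamut size, but it shows the ordering:

```python
def gamut_area(primaries):
    """Area of the RGB triangle in CIE 1931 xy space (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Red, green, blue primaries of each standard as (x, y) chromaticities.
gamuts = {
    "Rec. 709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":    [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec. 2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

ref = gamut_area(gamuts["Rec. 709"])
for name, primaries in gamuts.items():
    area = gamut_area(primaries)
    print(f"{name:9s} area = {area:.4f}  ({area / ref:.2f}x Rec. 709)")
```

This puts DCI-P3 at roughly 1.4x the Rec. 709 triangle and Rec. 2020 at roughly 1.9x, consistent with Robert's point that the smaller P3 palette can be carried in a 10-bit channel with coarser quantization penalties than full Rec. 2020 would incur.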
#8
Retired Hollywood Insider
Apr 2007
Greater-than-8-bit encoded content is the single most important thing you could wish for in regard to 4K Blu-ray: https://forum.blu-ray.com/showthread...ed#post9967092
It is especially crucial for HDR. HEVC already has native support for HDR; the challenge (which is doable) is building gear that can decode the HEVC HDR extensions.

Last edited by Penton-Man; 11-16-2014 at 11:01 PM. Reason: For clarity, changed wording a little.
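For anyone who wants to experiment with greater-than-8-bit HEVC today, here is a minimal sketch of producing a 10-bit encode with BT.2020/PQ signaling. It assumes an ffmpeg build with libx265 and 10-bit support; the filenames are hypothetical, and the exact x265 parameters available vary by build.

```python
import subprocess

# Hypothetical input/output names; any 8- or 10-bit source will do.
src, dst = "master.mov", "uhd_10bit.mkv"

cmd = [
    "ffmpeg", "-i", src,
    "-c:v", "libx265",           # HEVC encoder
    "-pix_fmt", "yuv420p10le",   # 10-bit 4:2:0, the extra depth discussed above
    "-crf", "18",
    # Signal BT.2020 primaries and the SMPTE ST 2084 (PQ) transfer used by HDR;
    # these are x265 parameter names and values, which may vary by build.
    "-x265-params",
    "colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc",
    dst,
]
subprocess.run(cmd, check=True)
```

Of course, as Penton-Man notes, encoding is the easy half: the bottleneck is consumer gear that can actually decode and display the result.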