#1
Special Member
Feb 2008
I have been hoping for this kind of technology for a while now. Traditionally our monitors always scan at the same rate no matter what the source is, and that can look jittery if your TV is scanning at 60 Hz while you are watching a 24, 25, or 50 Hz source. With G-Sync, the monitor can alter its scan rate to match the source. So if you are watching a 24 fps movie, the screen will scan in sync with the film; for 30 fps content it will scan at 30, or at 25 for a PAL video. With 48, 50, and 60 it will scan natively too. As we move away from traditional standards, this technology is necessary to maintain the best quality, so hopefully when we upgrade to massive curved 4K OLED screens, they will implement it. This video explains it well.
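To make the mismatch concrete, here is a minimal Python sketch (my own illustration, not anything from the video; the function name `repeat_pattern` is hypothetical) of how many refreshes each source frame occupies on a fixed-rate display. When the refresh rate is not an even multiple of the source rate, the repeat counts come out uneven, which is the judder being described:

```python
def repeat_pattern(display_hz, source_fps, frames=6):
    """Number of display refreshes each source frame occupies
    on a fixed-rate display (a toy model of pulldown cadence)."""
    counts = []
    shown = 0
    for i in range(1, frames + 1):
        # Total refreshes that have elapsed by the end of source frame i.
        total = round(i * display_hz / source_fps)
        counts.append(total - shown)
        shown = total
    return counts

print(repeat_pattern(60, 24))  # uneven mix of 2s and 3s -> judder
print(repeat_pattern(60, 30))  # even 2:2 cadence -> smooth
print(repeat_pattern(48, 24))  # display matched to 2x source -> even
```

A variable-refresh display sidesteps this entirely by scanning at (or at an exact multiple of) the source rate.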
#2
Expert Member
OK, I had to edit this and delete everything I wrote after watching the video. Now THAT is impressive: real-time refresh rate changes to match in-game frame rate. The way you explained it, I felt like it was exactly what already happens with my GPU - I start a movie and the display changes refresh rate to match it. But this is completely different. A movie or video file has a constant frame rate, so it's easy for the display to just choose a single refresh rate and stay there. In games, however, the frame rate can vary quite a bit depending on the load, the scene being rendered, etc., and for the monitor to be able to adjust the refresh rate on the fly to match those variances is amazing. Kudos nVidia~
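A rough way to see why the varying frame rate is the hard case: under v-sync on a fixed 60 Hz display, a frame that misses a ~16.7 ms refresh deadline has to wait for the next one, so a render time just over budget doubles the on-screen interval; a variable-refresh display can instead scan as soon as the frame is ready. This is a simplified model of my own (not NVIDIA's actual behavior), with hypothetical function names:

```python
import math

REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh at 60 Hz

def vsync_display_time(render_ms):
    """Fixed-rate model: the frame is held until the next refresh boundary."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def vrr_display_time(render_ms, min_interval=REFRESH_MS * 0.5):
    """Variable-refresh model: the display scans when the frame is ready,
    floored at the panel's maximum refresh rate."""
    return max(render_ms, min_interval)

for render in (10.0, 17.0, 25.0):
    print(f"render {render:5.1f} ms -> "
          f"vsync {vsync_display_time(render):5.1f} ms, "
          f"vrr {vrr_display_time(render):5.1f} ms")
```

Note how a 17 ms render (just missing 60 fps) costs a full 33 ms frame interval under v-sync but only 17 ms with variable refresh.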
Last edited by Nick The Slick; 02-13-2014 at 05:26 AM.
#3
Blu-ray Guru
I'm glad that this is being made, although I don't see the value of it for video content. I see the value for games, since they have varying framerates (even though you'll need both a compatible GPU and a compatible display to make it work). I'm curious why the demo didn't show the framerate marker below 40 fps. I'm going to assume it'll produce a multiple of 2 or 4 (if it's a 120 Hz display) if the game drops below 30 fps, which I imagine requires a fairly complex algorithm for the display's video processor to handle.

There is a reason that film projectors, for example, use a shutter system. Although a lot of films are natively shot at 24 fps, that rate is notably slow even for human vision, and it causes the appearance of flicker. Film shutter systems often duplicate the image by a multiple of 2 (appearing as a 48 Hz refresh rate) or even 3 (appearing as 72 Hz).

One common misconception about 120 Hz displays is that they create the "soap opera" look. That look is caused by the display altering frames via image processing (interpolation), not by duplicating them. With 24 or 60 fps content, all the display does is duplicate each frame, which simply reduces flicker, much like a film shutter repeats a frame with light. The only exception is content that requires a pulldown, i.e. content created at a frame rate that doesn't divide evenly into 120 Hz, like PAL content (although a 100 Hz/200 Hz display can work around that, and some displays can switch output refresh rates). If it weren't for the existence of 120 Hz displays, I would see a lot more value in G-Sync for video content. I'm glad they are focusing on video games with this, because games run at varying framerates, and if something like this lets us avoid needing v-sync to prevent tearing, so much the better. As for tearing in video, that is usually the fault of poor mastering rather than a deficiency of modern video decoders.
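The pulldown point can be checked with a quick sketch (illustrative only, hypothetical function name): a display can show content by pure frame duplication, with no pulldown, exactly when the content frame rate divides the refresh rate evenly.

```python
def clean_multiple(display_hz, fps):
    """True if the display can show this frame rate by duplicating
    each frame a whole number of times (no pulldown needed)."""
    return display_hz % fps == 0

for hz in (60, 120, 100):
    ok = [fps for fps in (24, 25, 30, 50, 60) if clean_multiple(hz, fps)]
    print(f"{hz} Hz handles by duplication: {ok}")
```

This reproduces the post's claims: 120 Hz covers 24, 30, and 60 fps exactly but not PAL's 25/50 fps, while 100 Hz covers the PAL rates but not 24 fps.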
#4
Expert Member

Quote:
#5
Special Member
Feb 2008
The value would be for those in America wanting to watch PAL content, or for those in other countries wanting to watch 24 fps and NTSC sources. With HFR it gets even more complex: you would need a constant refresh rate divisible by 48, 50, and 60. I bet it won't be long before we are demanding even higher frame rates. It wouldn't even matter if people filmed content in obscure frame rates anymore. You want 71 frames per second now? Sure, why not!
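The "one fixed rate for everything" arithmetic works out to the least common multiple of the frame rates involved, which quickly becomes impractical; that is one argument for variable refresh instead. A short sketch using only the Python standard library:

```python
from math import gcd
from functools import reduce

def lcm(a, b):
    """Least common multiple of two positive integers."""
    return a * b // gcd(a, b)

# Lowest fixed refresh rate that is an exact multiple of 48, 50, and 60 fps.
rates = [48, 50, 60]
print(reduce(lcm, rates))  # 1200 -> a fixed display would need 1200 Hz
```

So a fixed-rate display covering 48, 50, and 60 fps by pure duplication would need to scan at 1200 Hz; a variable-refresh display just matches each rate directly.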
Last edited by Chevypower; 02-14-2014 at 03:52 AM.
#6
Blu-ray Samurai
Yep, the duplicate frames are really important. My plasma TV can display 1080p24 images at 24 Hz, but as you'd imagine it's very flickery and hard on the eyes, so I have it display at 96 Hz instead. There is no soap opera effect, because there are no interpolated frames that aren't on the disc; each frame of content is simply repeated four times to reduce flicker.
Tags: g-sync, nvidia, pulldown, variable refresh rate