02-12-2014, 11:29 AM   #1
Chevypower (Special Member, joined Feb 2008)

NVIDIA G-Sync (variable refresh rate screen)

I have been hoping for this kind of technology for a while now. Traditionally our displays always scan at the same rate no matter what the source is, and that can make things look jittery if your TV is scanning at 60 Hz while you are watching a 24 Hz, 25 Hz, or 50 Hz source. With G-Sync, the monitor or screen can alter its scan rate to match the source. So if you are watching a 24 fps movie, the screen will scan in sync with the film; when you are watching 30 fps content, it will scan at 30 Hz, or change to 25 Hz for a PAL video. With 48, 50, and 60 fps, it will scan natively too. As we move away from the traditional standards, this technology is necessary to maintain the best quality, so hopefully when we upgrade to massive curved 4K OLED screens, they will implement it. This video explains it well.
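To make that concrete, here's a toy Python sketch (my own illustration, not NVIDIA's actual logic; the rate limits are made up) of picking a refresh rate that's an exact multiple of the source frame rate:

Code:
def matched_refresh(source_fps, min_hz=48, max_hz=120):
    """Lowest comfortable refresh rate that is an exact multiple of source_fps."""
    multiple = 1
    while source_fps * multiple <= max_hz:
        hz = source_fps * multiple
        if hz >= min_hz:    # below ~48 Hz the flicker is hard on the eyes
            return hz
        multiple += 1
    return None             # no clean multiple in range -> pulldown needed

for fps in (24, 25, 30, 48, 50, 60):
    print(fps, "fps ->", matched_refresh(fps), "Hz")
# 24 -> 48, 25 -> 50, 30 -> 60, 48 -> 48, 50 -> 50, 60 -> 60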

02-12-2014, 10:29 PM   #2
Nick The Slick (Expert Member, joined Dec 2011, Kentucky)

OK, I had to edit this and delete everything I wrote after watching the video. Now THAT is impressive: real-time refresh rate changes to match the in-game frame rate. The way you explained it, I felt like it was exactly what already happens with my GPU - I start a movie and the display changes its refresh rate to match. But this is completely different. A movie or video file has a constant frame rate, so it's easy for the display to pick a single refresh rate and stay there. In games, however, the frame rate can vary quite a bit depending on the load, the scene being rendered, etc., and for the monitor to be able to adjust its refresh rate on the fly to match those variations is amazing. Kudos, nVidia~
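To put some made-up numbers on that difference, a quick Python sketch (hypothetical frame times, deliberately simplified v-sync model): with a fixed 60 Hz display and v-sync, any frame that misses the ~16.7 ms deadline is held until the next scan, while a variable-refresh display just scans when the frame is ready.

Code:
import math

scan = 1000.0 / 60            # one refresh every ~16.7 ms at a fixed 60 Hz

render_times_ms = [14.0, 15.0, 19.0, 25.0, 16.0]   # made-up per-frame GPU times

for t in render_times_ms:
    vsynced = math.ceil(t / scan) * scan   # frame is held until the next scan
    print(f"render {t:4.1f} ms -> v-sync: {vsynced:4.1f} ms on screen, "
          f"variable refresh: {t:4.1f} ms")
# the 19 ms and 25 ms frames both jump to 33.3 ms under v-sync - that's the stutter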

Last edited by Nick The Slick; 02-13-2014 at 05:26 AM.
02-13-2014, 08:23 PM   #3
Trogdor2010 (Blu-ray Guru, joined Mar 2009)

I'm glad this is being made, although I don't see much value in it for video content. I see the value for games, since they have varying framerates (even though you'll need both a compatible GPU and a compatible display to make it work). I'm curious why the demo didn't show the framerate counter dropping below 40 fps. I'm going to assume the display will repeat each frame two or four times (if it's a 120 Hz panel) when the game drops below 30 fps, which I imagine requires a fairly complex algorithm for the display's video processor to handle.

There is a reason film projectors, for example, use a shutter system. Although a lot of films are shot natively at 24 fps, that rate is noticeably slow even for human vision and causes the appearance of flicker. Film shutters therefore flash each frame twice (appearing as a 48 Hz refresh rate) or even three times (appearing as 72 Hz).

Quote:
http://en.wikipedia.org/wiki/Movie_projector

Shutter

A commonly held misconception is that film projection is simply a series of individual frames dragged very quickly past the projector's intense light source; this is not the case. If a roll of film were merely passed between the light source and the lens of the projector, all that would be visible on screen would be a continuous blurred series of images sliding from one edge to the other. It is the shutter that gives the illusion of one full frame being replaced exactly on top of another full frame. A rotating petal or gated cylindrical shutter interrupts the emitted light during the time the film is advanced to the next frame. The viewer does not see the transition, thus tricking the brain into believing a moving image is on screen. Modern shutters are designed with a flicker-rate of two times (48 Hz) or even sometimes three times (72 Hz) the frame rate of the film, so as to reduce the perception of screen flickering. (See Frame rate and Flicker fusion threshold.) Higher rate shutters are less light efficient, requiring more powerful light sources for the same light on screen.
You need the system to display at a multiple of the frame rate if you want a pleasant viewing experience with less flicker. I will argue that a display with a higher refresh rate is better for films, as long as that rate is a simple multiple of the frame rate: the extra refreshes reduce the appearance of judder, provided the display simply repeats each frame and avoids the need for 3:2 pulldown.

One common misconception about 120 Hz displays is that they create the "soap opera" look. That look is caused by the display inventing in-between frames via motion interpolation, not by frame duplication. With 24 or 60 fps content, all the display has to do is repeat each frame, which simply reduces flicker, much like a film shutter flashes the same frame more than once.
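To illustrate the distinction with toy numbers (pretend each frame is a single pixel value): duplication only repeats what's on the disc, while interpolation synthesizes frames that were never there.

Code:
frames = [10, 20, 30]    # three film frames, going 24 fps -> 120 Hz (5x)

# Duplication: each disc frame shown 5 times - nothing new is created.
duplicated = [f for f in frames for _ in range(5)]

# Interpolation: the display invents in-between frames - the soap opera look.
interpolated = []
for a, b in zip(frames, frames[1:]):
    interpolated += [a + (b - a) * k / 5 for k in range(5)]

print(duplicated)     # only original frames, each repeated
print(interpolated)   # 12.0, 14.0, ... frames that were never on the disc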

The only exception is content that requires a pulldown system, i.e. content created at a rate that doesn't divide evenly into 120 Hz, like PAL content (although a 100 Hz/200 Hz display can work around this, and some displays can switch to different refresh rates). If it weren't for the existence of 120 Hz displays, I would see a lot more value in G-Sync for video content.
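Here's a rough Python sketch of the cadence math (a floor-based frame assignment, just an approximation of what real pulldown hardware does) showing which combinations repeat frames evenly:

Code:
import math

def cadence(source_fps, display_hz, n_frames=8):
    """How many refreshes each of the first n source frames gets."""
    shown, repeats = 0, []
    for i in range(1, n_frames + 1):
        total = math.floor(i * display_hz / source_fps)  # refreshes used so far
        repeats.append(total - shown)
        shown = total
    return repeats

print("24 fps @  60 Hz:", cadence(24, 60))    # [2, 3, 2, 3, ...] pulldown judder
print("24 fps @ 120 Hz:", cadence(24, 120))   # [5, 5, 5, 5, ...] clean multiple
print("25 fps @ 120 Hz:", cadence(25, 120))   # a 4 among the 5s every 5th frame - uneven
print("25 fps @ 100 Hz:", cadence(25, 100))   # [4, 4, 4, 4, ...] clean multiple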

I'm glad they are focusing on video games with this, because games run at varying framerates. If something like this lets us avoid needing v-sync to prevent tearing, so much the better. As for tearing in video, that is more the fault of poor mastering than any deficiency in modern video decoders.
02-13-2014, 09:20 PM   #4
Nick The Slick (Expert Member, joined Dec 2011, Kentucky)

Quote:
Originally Posted by Trogdor2010
I'm glad this is being made, although I don't see much value in it for video content. I see the value for games, since they have varying framerates (even though you'll need both a compatible GPU and a compatible display to make it work). [...]
Completely agreed. I see no value in this for normal video use, but it's amazing news for gamers. This would even be enough to break my ATI/AMD streak and go nVidia, if it proves cost-effective and delivers real-world in-game results.
02-14-2014, 03:39 AM   #5
Chevypower (Special Member, joined Feb 2008)

The value would be for those in America wanting to watch PAL content, or for those in other countries wanting to watch 24 fps and NTSC sources. With HFR it will get even more complex: you would need a constant refresh rate divisible by 48, 50, and 60. I bet it won't be long before we are demanding even higher frame rates, and it wouldn't even matter if people filmed content at obscure frame rates anymore. You want 71 frames per second now? Sure, why not!
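Back-of-envelope (Python 3.9+ for math.lcm): a fixed-rate display that showed 48, 50, and 60 fps with even frame repeats would need their least common multiple, which is exactly why a variable scan rate is attractive here.

Code:
import math

print(math.lcm(48, 50, 60))   # 1200 -> a fixed display would need 1200 Hz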

Last edited by Chevypower; 02-14-2014 at 03:52 AM.
02-14-2014, 10:01 AM   #6
singhcr (Blu-ray Samurai, joined Sep 2008, Apple Valley, MN)

Yep, the duplicate frames are really important. My plasma TV can display 1080p24 images at 24 Hz, but as you'd imagine that's very flickery and hard on the eyes, so I have it display at 96 Hz instead. There is no soap opera effect, because there are no interpolated frames that aren't on the disc; each frame of content is simply repeated four times to reduce flicker.