nVidia has just unveiled a new proprietary system for synchronizing displays to the GPU's output, intended to allow variable frame rates with no tearing and no lag, all without the terrible V-Sync method - nVidia calls it G-SYNC.

A couple articles (courtesy of Fudoh):
http://www.engadget.com/2013/10/18/nvidia-g-sync/
and
http://www.geforce.com/whats-new/art...er-free-gaming

I hope this catches on (and, hopefully, becomes an open standard).

Getting one at launch will cost you a recent high-end nVidia GPU plus a monitor (or potentially a television) with the G-Sync module installed. Buying a bare G-Sync module to install yourself in a monitor that supports it will cost $175; hopefully that price will come down quickly.

This aims to solve one of the long-standing problems with LCD screens as compared with CRTs: a CRT monitor or television draws the scan lines as they arrive over the wire. An LCD monitor updates the whole frame at once - yet that frame is built up from the signal coming in during a fixed time window, and the signal itself is structured much as it is for CRTs: individual scanlines, each with its own horizontal blanking interval. That leaves a device with two ways of synchronizing. The first is to send pixels as soon as they're ready, without aligning the start and end of each frame to the monitor's refresh - any pixels from a new frame sent while the display is still scanning out the old one will show up below the old pixels, and when the two don't match up you get tearing. The other is to make the device wait for the display's refresh before sending the next frame - that's V-Sync, and the waiting adds lag.
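To make those two options concrete, here's a minimal C sketch of a typical render loop using GLFW (nothing from the G-Sync articles, just my own illustration): with the swap interval set to 0 the GPU pushes frames out as soon as they're rendered and you risk tearing; set to 1, the swap waits for the display's vertical blank, which prevents tearing but makes the program stall - that stall is the V-Sync lag.

```c
/* Illustrative only: a bare-bones GLFW render loop showing the two
 * classic sync choices described above.  render_scene() is a stand-in
 * for whatever the game actually draws. */
#include <GLFW/glfw3.h>

static void render_scene(void) { /* draw calls would go here */ }

int main(void)
{
    if (!glfwInit())
        return 1;

    GLFWwindow *window = glfwCreateWindow(640, 480, "sync demo", NULL, NULL);
    if (!window) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(window);

    /* 0 = swap immediately (tearing possible, minimal lag)
     * 1 = wait for the next vertical blank (no tearing, added lag)   */
    glfwSwapInterval(1);

    while (!glfwWindowShouldClose(window)) {
        render_scene();
        glfwSwapBuffers(window);   /* with interval 1, this call blocks
                                      until the display's next refresh */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
```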

G-Sync overcomes these problems by exploiting two properties of digital displays. The first is the digital link between the device and the display, which lets the GPU tell the monitor exactly when a new frame is ready instead of pushing frames out on a fixed schedule. The second is that an LCD panel can vary its refresh rate on the fly, so a device with inconsistent performance (e.g., a computer game that usually runs at 60Hz but sometimes speeds up or slows down) can speed up and slow down the display along with it.
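As a rough illustration of why that matters, the little C sketch below (my own made-up numbers, not nVidia's) takes a handful of frame render times and works out when each frame would become visible on a fixed 60Hz V-Synced display versus an idealized variable-refresh display: the fixed display holds the new frame until the next 16.7ms boundary, while the variable-refresh display shows each frame the moment it's done.

```c
/* Illustrative only: compare when frames become visible on a fixed
 * 60Hz V-Synced display versus an idealized variable-refresh display.
 * Render times are made up for the example. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double refresh = 1000.0 / 60.0;                   /* fixed refresh period, ms */
    const double render_ms[] = { 14.0, 18.0, 25.0, 15.0, 16.0 };
    const int n = sizeof render_ms / sizeof render_ms[0];

    double done = 0.0;                                      /* when each frame finishes rendering */
    for (int i = 0; i < n; i++) {
        done += render_ms[i];

        /* Fixed 60Hz + V-Sync: the frame waits for the next refresh. */
        double fixed_visible = ceil(done / refresh) * refresh;

        /* Variable refresh (G-Sync style): shown as soon as it's finished. */
        double vrr_visible = done;

        printf("frame %d: done %6.1f ms  fixed-sync %6.1f ms (+%4.1f lag)  variable %6.1f ms\n",
               i + 1, done, fixed_visible, fixed_visible - done, vrr_visible);
    }
    return 0;
}
```

This leaves out the extra stalls a real V-Synced pipeline suffers when the GPU itself has to wait, so if anything it understates the benefit.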

It will be interesting to see what effect, if any, this has on console gaming on the new systems. AMD might be able to implement something similar for its hardware, but if nVidia keeps this proprietary and doesn't license it, Xbox and PlayStation owners might be left out for the coming generation - and by all reports this is huge for gaming.

It's also hoped that this will allow older consoles, MAME, and other systems (e.g. upscalers for playing arcade games) to display images at their native frame rates, and faster than was previously possible. Both ends of the chain (the console and the television, for example) will need to support the technology, though, so it won't be plug-and-play with your NES. This looks like a good way to win back some of the performance lost in the move to newer display technology.