Submitted by Kayant 717d ago | review

NVIDIA G-Sync Review - AnandTech

Anand Lal Shimpi:

It started at CES, nearly 12 months ago. NVIDIA announced GeForce Experience, a software solution to the problem of choosing optimal graphics settings for your PC in the games you play. With console games, the developer has already selected what it believes is the right balance of visual quality and frame rate.

On the PC, these decisions are left up to the end user. We’ve seen some games try to solve the problem by limiting the number of available graphical options, but other than that it’s a problem that didn’t see much widespread attention.

After all, PC gamers are used to fiddling around with settings - it’s just an expected part of the experience. In an attempt to broaden the PC gaming user base (likely somewhat motivated by a lack of next-gen console wins), NVIDIA came up with GeForce Experience. NVIDIA already tests a huge number of games across a broad range of NVIDIA hardware, so it has a good idea of what the best settings may be for each game.

kB0  +   717d ago
This is pretty awesome. I always leave v-sync off because of latency and timing issues, but if I had the choice to leave it on and still have good response time, I'd definitely do it!

Both Nvidia and AMD are showcasing awesome potential! I hope this is made available to AMD users as well, much like Mantle being available to Nvidia users.
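The v-sync latency kB0 mentions comes from the display's fixed refresh schedule: on a 60 Hz panel, a frame that finishes rendering just after a refresh tick must wait up to a full interval (~16.7 ms) before it is shown. A rough back-of-the-envelope sketch (my own illustration, not from the article):

```python
# Why v-sync adds latency: at 60 Hz the display refreshes every
# 1000/60 ~= 16.67 ms, and with v-sync on a finished frame is held
# until the next refresh tick.

REFRESH_HZ = 60
refresh_interval_ms = 1000 / REFRESH_HZ  # ~16.67 ms

def vsync_wait_ms(render_done_ms):
    """Time a finished frame waits for the next refresh tick."""
    return (-render_done_ms) % refresh_interval_ms

# A frame completed 1 ms after a refresh waits almost a full interval:
print(round(vsync_wait_ms(1.0), 2))   # 15.67
# One completed just before the next refresh barely waits at all:
print(round(vsync_wait_ms(16.0), 2))  # 0.67
```

That worst-case extra ~16 ms of hold time is exactly the input lag competitive players try to avoid by turning v-sync off.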
deecee33  +   717d ago
G-Sync is a brilliant idea. Let's hope we see larger displays like 55"+ HDTVs with it in a few years' time. If Nvidia licensed this tech to future consoles (unlikely), it could make a huge difference to the tradeoffs developers could make, as most console games are developed to run at 30-60fps.
wtopez  +   717d ago
That sounds pretty impressive. I mostly play on a 50" HDTV, so I wish Nvidia or some other company would offer an external adapter that added G-Sync to whatever your display is. The tearing and stuttering in AC4 is crazy, so making it look as smooth as a Blu-ray movie has to be quite impressive. Oh well, guess next year I might treat myself to a 1440p G-Sync capable monitor.
dsswoosh  +   717d ago
Just want to be the first to add here that the Xbox One's hardware scaler already addresses stutter and v-sync lag problems.

It possesses a dynamic resolution scaler, which means that when there are potential frame-drop scenarios, it will drop the resolution, hardware-upscale it back to native, and maintain the fps, eliminating stutter and input lag.

That is why the XB1's frame rate latency is a lot better than the PS4's, and there is no screen tearing and no input lag.

I tried explaining this to some PS4 fans, and they had zero clue what I was talking about.

Long story short - XB1 has superior frame rate processing with zero microstutter, zero screen tear and zero input lag.
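Setting aside the console-war claims (which Pandamobile and Christopher dispute below), dynamic resolution scaling itself is a real technique: when a frame overruns its time budget, the engine renders the next one at a lower resolution and upscales to native. A minimal, hypothetical sketch — the function names and thresholds are my own invention, not any console's actual firmware:

```python
# Hypothetical dynamic resolution scaling: adjust the render height
# based on how the previous frame performed against a 60 fps budget.

FRAME_BUDGET_MS = 1000 / 60  # ~16.67 ms per frame at 60 fps
NATIVE_HEIGHT = 1080
MIN_HEIGHT = 720

def next_render_height(current_height, last_frame_ms):
    """Pick the next frame's render height from the last frame's cost."""
    if last_frame_ms > FRAME_BUDGET_MS:
        # Over budget: scale resolution down proportionally.
        scale = FRAME_BUDGET_MS / last_frame_ms
        return max(MIN_HEIGHT, int(current_height * scale))
    elif last_frame_ms < 0.9 * FRAME_BUDGET_MS:
        # Plenty of headroom: creep back up toward native.
        return min(NATIVE_HEIGHT, int(current_height * 1.05))
    return current_height

# A 20 ms frame at 1080p drops the next frame's render height to 900:
print(next_render_height(1080, 20.0))  # 900
```

Note this trades image quality for frame rate; it reduces frame drops but does not, by itself, remove input lag.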
Pandamobile  +   717d ago
Pretty off topic, considering this has nothing to do with either the PS4 or Xbox One.

You might want to at least provide some source material for those claims though.
dsswoosh  +   717d ago
I also want to add that I thought about this years ago.

It's pretty obvious that in order to match GPU and monitor refreshes without the GPU being locked (which creates input lag), the obvious answer is a monitor that is capable of a dynamic refresh rate.

I'm pretty sure Nvidia thought of this years ago too, and it's only just being implemented, but still......

I thought of this all by myself many years ago.

Aren't I clever!! :)
Pandamobile  +   717d ago
Yeah, that really has nothing to do with what G-Sync does. All that does is alter the render times for each frame by literally removing resolution from the frame buffer in order to speed it up to match the display. G-Sync does the opposite.

What G-Sync does is allow the GPU to control the refresh intervals of the display, so that as soon as a frame is fully rendered, it can tell the display "Okay, I'm done with this frame. Display it." After the monitor receives that command, it refreshes without tearing, and with as little latency as is currently physically possible.
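Pandamobile's distinction can be sketched in a few lines: with a fixed-refresh panel a finished frame is held until the next refresh tick, while a variable-refresh (G-Sync-style) panel is told to refresh the moment the frame is ready. This is my own simplified illustration; it ignores the panel's real minimum and maximum refresh limits:

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # fixed 60 Hz panel

def display_time_vsync(frame_done_ms):
    """Fixed refresh: the frame waits for the next refresh tick."""
    return math.ceil(frame_done_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

def display_time_gsync(frame_done_ms):
    """Variable refresh: the GPU triggers a refresh immediately."""
    return frame_done_ms  # shown as soon as rendering finishes

frame_done = 20.0  # frame finishes at t = 20 ms (i.e. running below 60 fps)
print(display_time_vsync(frame_done))  # ~33.33 -> ~13 ms of added wait
print(display_time_gsync(frame_done))  # 20.0 -> no added wait, no tear
```

The same mechanism also removes tearing, since the panel only ever scans out complete frames.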
Christopher  +   717d ago
***Long story short - XB1 has superior frame rate processing with zero microstutter, zero screen tear and zero input lag.***

1. It needs frame rate processing because it can't maintain 900p/1080p consistently like the PS4 due to DDR3 and processing issues.

2. There is microstuttering, it's just not perceptible to the human eye. Again, an issue primarily due to the architecture of the XBO that requires a tool to make sure that when things get too heavy, it scales it back rather than being able to handle it as it was designed at all times.

3. There is screen tearing.

4. There is no such thing as zero input lag. It can be diminished or limited, but never zero.
