Nvidia G-sync review - Anandtech

Anand Lal Shimpi:

It started at CES, nearly 12 months ago. NVIDIA announced GeForce Experience, a software solution to the problem of choosing optimal graphics settings for your PC in the games you play. With console games, the developer has already selected what it believes is the right balance of visual quality and frame rate.

On the PC, these decisions are left up to the end user. We’ve seen some games try to solve the problem by limiting the number of available graphical options, but beyond that it’s a problem that hasn’t seen much widespread attention.

After all, PC gamers are used to fiddling around with settings - it’s just an expected part of the experience. In an attempt to broaden the PC gaming user base (likely somewhat motivated by a lack of next-gen console wins), NVIDIA came up with GeForce Experience. NVIDIA already tests a huge number of games across a broad range of NVIDIA hardware, so it has a good idea of what the best settings may be for each game.

Read Full Story >>
kB01591d ago

This is pretty awesome. I always leave v-sync off because of latency and timing issues, but if I had the choice to leave it on and still have good response time, I'd definitely do it!

Both Nvidia and AMD are showcasing awesome potential! I hope this is made available to AMD users as well, much like Mantle being available to Nvidia users.

deecee331591d ago (Edited 1591d ago)

G-Sync is a brilliant idea. Let's hope we see larger displays like 55"+ HDTVs with it in a few years' time. If Nvidia licensed this tech to future consoles (unlikely), it could make a huge difference to the tradeoffs developers could make, as most console games are developed to run at 30-60fps.

wtopez1591d ago

That sounds pretty impressive. I mostly play on a 50" HDTV, so I wish Nvidia or some other company would offer an external adapter that added G-Sync to whatever display you have. The tearing and stuttering in AC4 is crazy, so making it look as smooth as a Blu-ray movie would be quite impressive. Oh well, guess next year I might treat myself to a 1440p G-Sync capable monitor.

dsswoosh1591d ago

Just want to be the first to add here that the XBox One's hardware scaler already addresses stutter and vsync lag problems.

It possesses a dynamic resolution scaler, which means that when there are potential frame-drop scenarios, it will drop the resolution, hardware-upscale it back again, and maintain the fps, eliminating stutter and input lag.
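The kind of feedback loop described above can be sketched like this (all names, thresholds, and numbers are hypothetical illustrations of the general technique, not any console's actual implementation):

```python
TARGET_FRAME_MS = 33.3   # 30 fps frame budget
NATIVE_W, NATIVE_H = 1920, 1080
MIN_SCALE = 0.75         # hypothetical floor: never render below 75% of native

def choose_render_scale(last_frame_ms, scale):
    """Shrink the render resolution when a frame runs over budget,
    and creep back toward native resolution when there is headroom."""
    if last_frame_ms > TARGET_FRAME_MS:
        scale = max(MIN_SCALE, scale - 0.05)   # render fewer pixels next frame
    elif last_frame_ms < TARGET_FRAME_MS * 0.9:
        scale = min(1.0, scale + 0.05)         # recover quality when frames are cheap
    return scale

# Example: a 40 ms spike pulls the render target down one notch;
# the hardware scaler would then upscale the result back to native.
scale = choose_render_scale(40.0, 1.0)
render_w, render_h = int(NATIVE_W * scale), int(NATIVE_H * scale)
```

The point is that the engine keeps a steady frame rate by trading pixels for time, while the final upscale to native resolution happens in fixed-cost hardware.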

That is why the XB1 frame rate latency is a lot better than the PS4's, and there is no screen tearing and no input lag.

I tried explaining this to some PS4 fans, and they had zero clue what I was talking about.

Long story short - XB1 has superior frame rate processing with zero microstutter, zero screen tear and zero input lag.

Pandamobile1591d ago

Pretty off topic, considering this has nothing to do with either the PS4 or Xbox One.

You might want to at least provide some source material for those claims though.

dsswoosh1591d ago

I also want to add that I thought about this years ago.

It's pretty obvious that in order to match GPU and monitor refreshes without the GPU being locked (which creates input lag), the answer is a monitor capable of a dynamic refresh rate.

I'm pretty sure Nvidia thought of this years ago too, and it's only just being implemented, but still......

I thought of this all by myself many years ago.

Aren't I clever!! :)

Pandamobile1591d ago

Yeah, that really has nothing to do with what G-Sync does. All that does is alter the render time of each frame by reducing the frame buffer's resolution in order to speed it up to match the display. G-Sync does the opposite.

What G-Sync does is allow the GPU to control the refresh intervals of the display, so that as soon as a frame is fully rendered, it can tell the display "Okay, I'm done with this frame. Display it." Once the monitor receives that command, it refreshes without tearing, and with as little latency as is currently physically possible.

Christopher1591d ago

***Long story short - XB1 has superior frame rate processing with zero microstutter, zero screen tear and zero input lag.***

1. It needs frame rate processing because it can't maintain 900p/1080p consistently like the PS4 due to DDR3 and processing issues.

2. There is microstuttering; it's just not perceptible to the human eye. Again, this is primarily an architectural issue with the XBO: it requires a tool that scales things back when the load gets too heavy, rather than being able to handle it as designed at all times.

3. There is screen tearing.

4. There is no such thing as zero input lag. It can be diminished or limited, but never zero.