110°
Submitted by Kayant 284d ago | review

NVIDIA G-Sync Review - AnandTech

Anand Lal Shimpi:

It started at CES, nearly 12 months ago. NVIDIA announced GeForce Experience, a software solution to the problem of choosing optimal graphics settings for your PC in the games you play. With console games, the developer has already selected what it believes is the right balance of visual quality and frame rate.

On the PC, these decisions are left up to the end user. We’ve seen some games try to solve the problem by limiting the number of available graphical options, but other than that it’s a problem that hasn’t seen much widespread attention.

After all, PC gamers are used to fiddling around with settings - it’s just an expected part of the experience. In an attempt to broaden the PC gaming user base (likely somewhat motivated by a lack of next-gen console wins), NVIDIA came up with GeForce Experience. NVIDIA already tests a huge number of games across a broad range of NVIDIA hardware, so it has a good idea of what the best settings may be for each game. (NVIDIA, PC, Tech)

Credit url: neogaf.com
kB0  +   284d ago
This is pretty awesome. I always leave v-sync off because of latency and timing issues, but if I had a choice to leave it on and still have good response time, I'd definitely do it!

Both Nvidia and AMD are showcasing awesome potential! I hope this is made available to AMD users as well, much like Mantle being available to Nvidia users.
deecee33  +   284d ago
G-Sync is a brilliant idea. Let's hope we see larger displays like 55"+ HDTVs with it in a few years' time. If Nvidia licensed this tech to future consoles (unlikely), it could make a huge difference to the tradeoffs developers could make, as most console games are developed to run at 30-60 fps.
wtopez  +   284d ago
That sounds pretty impressive. I mostly play on a 50" HDTV, so I wish Nvidia or some other company would offer an external adapter that adds G-Sync to whatever display you have. The tearing and stuttering in AC4 is crazy, so making it look as smooth as a Blu-ray movie would be quite impressive. Oh well, I guess next year I might treat myself to a 1440p G-Sync capable monitor.
dsswoosh  +   284d ago
Just want to be the first to add here that the Xbox One's hardware scaler already addresses stutter and v-sync lag problems.

It possesses a dynamic resolution scaler, which means that when there are potential scenarios for frame drops, it will lower the resolution, upscale it again in hardware, and maintain the fps, eliminating stutter and input lag.

That is why the XB1's frame rate latency is a lot better than the PS4's, and there is no screen tearing and no input lag.

I tried explaining this to some PS4 fans, and they had zero clue what I was talking about.

Long story short - XB1 has superior frame rate processing with zero microstutter, zero screen tear and zero input lag.
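
To make the technique being described concrete: the Xbox One's actual scaler logic isn't public, so the sketch below is only a general illustration of dynamic resolution scaling. The function name next_render_height and the frame budget and resolution bounds are illustrative assumptions, not the console's real implementation. The idea is that when a frame runs over budget, the next one is rendered at a lower resolution and upscaled in hardware so the frame rate holds.

# Hypothetical sketch of dynamic resolution scaling in general -- not the
# Xbox One's actual scaler logic (which is not public). The budget, bounds,
# and function name below are illustrative assumptions.

TARGET_FRAME_MS = 33.3   # frame budget for a 30 fps target
NATIVE_HEIGHT = 1080     # full render height
MIN_HEIGHT = 720         # floor below which we stop scaling down

def next_render_height(last_frame_ms: float, height: int) -> int:
    """Choose the next frame's render height from the last frame's cost."""
    if last_frame_ms > TARGET_FRAME_MS:
        # Over budget: cut resolution in proportion to the overrun, then
        # let the hardware scaler upsample the result to native output.
        scale = TARGET_FRAME_MS / last_frame_ms
        return max(MIN_HEIGHT, int(height * scale))
    if last_frame_ms < TARGET_FRAME_MS * 0.85:
        # Comfortably under budget: creep back toward native resolution.
        return min(NATIVE_HEIGHT, int(height * 1.05))
    return height

In practice a renderer would call next_render_height once per frame, feeding in the previous frame's measured GPU time.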
Pandamobile  +   284d ago
Pretty off topic, considering this has nothing to do with either the PS4 or Xbox One.

You might want to at least provide some source material for those claims though.
dsswoosh  +   284d ago
I also want to add that I thought about this years ago.

It's pretty obvious that in order to match GPU and monitor refreshes without the GPU being locked (which creates input lag), the answer is a monitor capable of a dynamic Hz refresh.

I'm pretty sure Nvidia thought of this years ago too and it's only just being implemented, but still...

I thought of this all by myself many years ago.

Aren't I clever!! :)
Pandamobile  +   284d ago
Yeah, that really has nothing to do with what G-Sync does. All that does is alter the render time of each frame by literally removing resolution from the frame buffer in order to speed it up to match the display. G-Sync does the opposite.

What G-Sync does is let the GPU control the refresh intervals of the display, so that as soon as a frame is fully rendered, it can tell the display "Okay, I'm done with this frame. Display it." Once the monitor receives that command, it displays the frame without tearing, and with as little latency as is currently physically possible.
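
To make the contrast concrete, here is a minimal Python sketch under stated assumptions: scan_out is a made-up stand-in for the display, and real G-Sync lives in the driver and the monitor's controller rather than in application code. It shows why a fixed-refresh display rounds every missed frame up to the next refresh tick, while a variable-refresh display simply shows each frame the moment it finishes.

# Minimal simulation contrasting fixed-refresh v-sync with G-Sync-style
# variable refresh. "scan_out" is a made-up stand-in for the display; real
# G-Sync lives in the driver and the monitor's controller, not in app code.
import math
import random
import time

REFRESH_MS = 16.7  # fixed 60 Hz refresh interval

def render_frame() -> float:
    """Pretend to render a frame; return its cost in milliseconds."""
    cost = random.uniform(12.0, 24.0)
    time.sleep(cost / 1000.0)
    return cost

def vsync_loop(frames: int) -> None:
    # Fixed refresh: a frame that misses the 16.7 ms window must wait for
    # the next refresh, so its display time is rounded up to a multiple of
    # the refresh interval -- that quantized wait is stutter and added lag.
    for _ in range(frames):
        cost = render_frame()
        wait = math.ceil(cost / REFRESH_MS) * REFRESH_MS - cost
        time.sleep(wait / 1000.0)

def variable_refresh_loop(frames: int) -> None:
    # Variable refresh: the display waits on the GPU instead, so every
    # frame is scanned out the moment it finishes -- no tearing, and no
    # rounding up to the next fixed refresh tick.
    for _ in range(frames):
        render_frame()
        # scan_out()  # hypothetical: GPU tells the panel "display it now"

The math.ceil term in vsync_loop is the whole story: on a 60 Hz panel, a 17 ms frame is held until 33.4 ms, which is exactly the stutter and latency that variable refresh removes.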
cgoodno  +   284d ago
***Long story short - XB1 has superior frame rate processing with zero microstutter, zero screen tear and zero input lag.***

1. It needs frame rate processing because it can't maintain 900p/1080p consistently like the PS4 due to DDR3 and processing issues.

2. There is microstuttering; it's just not always perceptible to the human eye. Again, this is an issue that stems primarily from the XBO's architecture, which requires a tool to scale things back when the load gets too heavy rather than handling it as designed at all times.

3. There is screen tearing.

4. There is no such thing as zero input lag. Input lag can be diminished or limited, but never zero.
