Submitted by Kayant 373d ago | review

Nvidia G-Sync review - AnandTech

Anand Lal Shimpi:

It started at CES, nearly 12 months ago. NVIDIA announced GeForce Experience, a software solution to the problem of choosing optimal graphics settings for your PC in the games you play. With console games, the developer has already selected what it believes is the right balance of visual quality and frame rate.

On the PC, these decisions are left up to the end user. We’ve seen some games try to solve the problem by limiting the number of available graphical options, but other than that it’s a problem that hasn’t seen much widespread attention.

After all, PC gamers are used to fiddling around with settings - it’s just an expected part of the experience. In an attempt to broaden the PC gaming user base (likely somewhat motivated by a lack of next-gen console wins), NVIDIA came up with GeForce Experience. NVIDIA already tests a huge number of games across a broad range of NVIDIA hardware, so it has a good idea of what the best settings may be for each game. (NVIDIA, PC, Tech)

Credit url: neogaf.com
kB0  +   373d ago
This is pretty awesome. I always leave v-sync off because of latency and timing issues, but if I had the choice to leave it on and still have good response times, I'd definitely do it!

Both Nvidia and AMD are showcasing awesome potential! I hope this is made available to AMD users as well, much like Mantle being available to Nvidia users.
deecee33  +   373d ago
G-Sync is a brilliant idea. Let's hope we see larger displays like 55"+ HDTVs with it in a few years' time. If Nvidia licensed this tech to future consoles (unlikely), it could make a huge difference to the tradeoffs developers could make, as most console games are developed to run at 30-60 fps.
wtopez  +   373d ago
That sounds pretty impressive. I mostly play on a 50" HDTV, so I wish Nvidia or some other company would offer an external adapter that added G-Sync to whatever display you have. The tearing and stuttering in AC4 is crazy, so making it look as smooth as a Blu-ray movie would be quite impressive. Oh well, guess next year I might treat myself to a 1440p G-Sync capable monitor.
dsswoosh  +   373d ago
Just want to be the first to add here that the Xbox One's hardware scaler already addresses stutter and v-sync lag problems.

It possesses a dynamic resolution scaler, which means that when there are potential frame drops, it will lower the resolution, hardware-upscale the image back up, and maintain the fps, eliminating stutter and input lag.

That is why the XB1's frame rate latency is a lot better than the PS4's, and there is no screen tearing and no input lag.

I tried explaining this to some PS4 fans, and they had zero clue what I was talking about.

Long story short - XB1 has superior frame rate processing with zero microstutter, zero screen tear and zero input lag.
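
For reference, here's a minimal Python sketch of the dynamic-resolution technique the comment describes. All the numbers and names are made up for illustration, not an actual console API; the point is just the feedback loop of trading pixels for frame time:

    TARGET_MS = 33.3          # frame budget for a 30 fps target (illustrative)
    MIN_SCALE, MAX_SCALE = 0.5, 1.0
    STEP = 0.05

    def next_scale(scale, last_frame_ms):
        """Pick the next internal render scale from the last frame's cost."""
        if last_frame_ms > TARGET_MS:              # missed the budget:
            return max(MIN_SCALE, scale - STEP)    #   render fewer pixels
        if last_frame_ms < 0.85 * TARGET_MS:       # comfortable headroom:
            return min(MAX_SCALE, scale + STEP)    #   creep back toward native
        return scale

    # Toy loop: pretend frame cost scales with pixel count as scene load varies.
    scale, (nw, nh) = 1.0, (1920, 1080)
    for load in (1.0, 1.4, 1.6, 1.2, 0.8, 0.6):
        w, h = int(nw * scale), int(nh * scale)
        frame_ms = TARGET_MS * load * scale * scale    # simulated render cost
        print(f"render {w}x{h}, upscale to {nw}x{nh}, {frame_ms:.1f} ms")
        scale = next_scale(scale, frame_ms)
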
Pandamobile  +   373d ago
Pretty off topic, considering this has nothing to do with either the PS4 or Xbox One.

You might want to at least provide some source material for those claims though.
dsswoosh  +   373d ago
I also want to add that I thought of this years ago.

It's pretty obvious that in order to match GPU and monitor refreshes without the GPU being locked (which creates input lag), the answer is a monitor capable of a dynamic refresh rate.

I'm pretty sure Nvidia thought of this years ago too, and it's only just being implemented, but still......

I thought of this all by myself many years ago.

Aren't I clever!! :)
Pandamobile  +   373d ago
Yeah, that really has nothing to do with what G-Sync does. All that does is alter the render time of each frame by literally reducing the resolution of the frame buffer in order to speed it up to match the display. G-Sync does the opposite.

What G-Sync does is allow the GPU to control the refresh intervals of the display, so that as soon as a frame is fully rendered, it can tell the display "Okay, I'm done with this frame. Display it." Once the monitor receives that command, it refreshes without tearing, and with as little latency as is currently physically possible.
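
To put rough numbers on that, here's a small Python simulation of how long a finished frame waits before it's displayed, fixed 60 Hz refresh versus variable refresh. It's deliberately simplified (no buffering details, and real G-Sync has minimum/maximum refresh bounds), so treat it as an illustration of the principle rather than the actual mechanism:

    import random

    REFRESH_MS = 16.7                       # fixed 60 Hz refresh interval

    def avg_wait(frame_times_ms, variable_refresh):
        """Average delay between a frame finishing and being displayed."""
        now, next_vblank, waits = 0.0, REFRESH_MS, []
        for render_ms in frame_times_ms:
            now += render_ms                    # GPU finishes the frame here
            if variable_refresh:
                waits.append(0.0)               # display refreshes on the GPU's cue
            else:
                while next_vblank < now:        # find the next fixed vblank
                    next_vblank += REFRESH_MS
                waits.append(next_vblank - now) # frame sits in the buffer until then
        return sum(waits) / len(waits)

    frames = [random.uniform(14, 30) for _ in range(1000)]   # uneven frame times
    print(f"fixed refresh (v-sync): {avg_wait(frames, False):.1f} ms average wait")
    print(f"variable refresh:       {avg_wait(frames, True):.1f} ms average wait")

The key difference the simulation captures is who controls the timing: with v-sync the display's fixed schedule dictates when a frame appears, while with variable refresh the GPU's completion of a frame triggers the refresh.
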
cgoodno  +   373d ago
***Long story short - XB1 has superior frame rate processing with zero microstutter, zero screen tear and zero input lag.***

1. It needs frame rate processing because it can't maintain 900p/1080p consistently like the PS4 due to DDR3 and processing issues.

2. There is microstuttering; it's just not perceptible to the human eye. Again, that's an issue primarily due to the XBO's architecture, which requires a tool to scale things back when the load gets too heavy rather than handling it as designed at all times.

3. There is screen tearing.

4. There is no such thing as zero input lag. It can be diminished or limited, but never zero.
