
Nvidia G-sync review - Anandtech

Anand Lal Shimpi:

It started at CES, nearly 12 months ago. NVIDIA announced GeForce Experience, a software solution to the problem of choosing optimal graphics settings for your PC in the games you play. With console games, the developer has already selected what it believes is the right balance of visual quality and frame rate.

On the PC, these decisions are left up to the end user. We’ve seen some games try and solve the problem by limiting the number of available graphical options, but other than that it’s a problem that didn’t see much widespread attention.

After all, PC gamers are used to fiddling around with settings - it’s just an expected part of the experience. In an attempt to broaden the PC gaming user base (likely somewhat motivated by a lack of next-gen console wins), NVIDIA came up with GeForce Experience. NVIDIA already tests a huge number of games across a broad range of NVIDIA hardware, so it has a good idea of what the best settings may be for each game.

Read Full Story >>
anandtech.com
kB0 4202d ago

This is pretty awesome. I always leave v-sync off because of latency and timing issues, but if I had the choice to leave it on and still have good response time I'd definitely do it!

Both Nvidia and AMD are showcasing awesome potential! I hope this is made available to AMD users as well, much like Mantle being available to Nvidia users.

deecee33 4202d ago (Edited 4202d ago)

G-Sync is a brilliant idea. Let's hope we see larger displays like 55"+ HDTVs with it in a few years time. If Nvidia licensed this tech to future consoles (unlikely), it could make a huge difference to the tradeoffs developers could make as most console games are developed to run at 30-60fps.

wtopez 4202d ago

That sounds pretty impressive. I mostly play on a 50" HDTV so I wish Nvidia or some other company would offer an external adapter that added G-Sync to whatever your display is. The tearing and stuttering in AC4 is crazy, so making it look as smooth as a Blu Ray movie has to look quite impressive. Oh well, guess next year I might treat myself to a 1440p G-Sync capable monitor.

dsswoosh 4202d ago

Just want to be the first to add here that the Xbox One's hardware scaler already addresses stutter and v-sync lag problems.

It possesses a dynamic resolution scaler, which means that when there are potential frame-drop scenarios, it drops the resolution, hardware-upscales it again and maintains the fps, eliminating stutter and input lag.
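The dynamic-resolution idea described above can be sketched as a simple feedback loop. This is a hypothetical, illustrative controller — the frame budget, thresholds and step size are made up, and it is in no way the Xbox One's actual scaler:

```python
# Sketch of a dynamic-resolution controller: if the last frame took longer
# than the 33.3 ms budget for 30 fps, lower the render resolution; if it
# came in comfortably under budget, raise it again. All numbers are illustrative.

FRAME_BUDGET_MS = 1000.0 / 30.0   # target: 30 fps

def next_resolution_scale(scale, last_frame_ms,
                          min_scale=0.5, max_scale=1.0, step=0.05):
    """Return the render-resolution scale to use for the next frame."""
    if last_frame_ms > FRAME_BUDGET_MS:          # over budget: drop resolution
        scale = max(min_scale, scale - step)
    elif last_frame_ms < 0.8 * FRAME_BUDGET_MS:  # well under budget: restore it
        scale = min(max_scale, scale + step)
    return scale

# Example: two heavy frames force the scale down, two light frames restore it.
scale = 1.0
for frame_ms in [40.0, 41.0, 25.0, 24.0]:
    scale = next_resolution_scale(scale, frame_ms)
print(round(scale, 2))
```

The point of the feedback loop is that frame time stays roughly constant while resolution absorbs the load spikes — the opposite trade-off from v-sync, which holds resolution constant and lets latency absorb the spikes.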

That is why the XB1 frame rate latency is a lot better than the PS4's, and there is no screen tearing and no input lag.

I tried explaining this to some PS4 fans, and they had zero clue what I was talking about.

Long story short - XB1 has superior frame rate processing with zero microstutter, zero screen tear and zero input lag.

Pandamobile 4202d ago

Pretty off topic, considering this has nothing to do with either the PS4 or Xbox One.

You might want to at least provide some source material for those claims though.

dsswoosh 4202d ago

I also want to add that I thought about this years ago.

It's pretty obvious that in order to match GPU and monitor refreshes without the GPU being locked (which creates input lag), the obvious answer is a monitor capable of a dynamic refresh rate.

I'm pretty sure Nvidia thought of this years ago too and it's only just being implemented, but still...

I thought of this all by myself many years ago.

Aren't I clever!! :)

Pandamobile 4202d ago

Yeah, that really has nothing to do with what G-Sync does. All that does is alter the render times for each frame by literally removing resolution from the frame buffer in order to speed it up to match the display. G-Sync does the opposite.

What G-Sync does is allow the GPU to control the refresh intervals of the display, so that as soon as a frame is fully rendered it can tell the display, "Okay, I'm done with this frame. Display it." Once the monitor receives that command, it refreshes without tearing, and with as little latency as is currently physically possible.
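The difference can be shown with a toy latency model (illustrative only — the function and numbers here are hypothetical, not how any real driver measures this): under fixed-refresh v-sync a finished frame waits for the next scheduled refresh tick, while a variable-refresh display would show it with essentially zero added wait.

```python
# Toy model of the latency v-sync adds on a fixed 60 Hz display: a finished
# frame must wait for the next refresh tick. With variable refresh (the
# G-Sync idea), the display refreshes the moment the frame is ready, so the
# corresponding wait would be ~0 for every frame.

REFRESH_MS = 1000.0 / 60.0  # fixed 60 Hz refresh interval (~16.7 ms)

def vsync_wait(finish_ms):
    """Milliseconds a frame that finishes at finish_ms waits for the next tick."""
    ticks_passed = finish_ms // REFRESH_MS
    next_tick = (ticks_passed + 1) * REFRESH_MS
    return next_tick - finish_ms

# Frames that finish just after a tick wait almost a full refresh interval.
finish_times = [17.0, 40.0, 100.5]           # when each frame is done (ms)
waits = [vsync_wait(t) for t in finish_times]
print([round(w, 2) for w in waits])
```

The worst case (a frame finishing just after a tick) is what shows up as judder or, with v-sync off, as tearing — which is exactly the gap variable refresh closes.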

Christopher 4201d ago

***Long story short - XB1 has superior frame rate processing with zero microstutter, zero screen tear and zero input lag.***

1. It needs frame rate processing because it can't maintain 900p/1080p consistently like the PS4 due to DDR3 and processing issues.

2. There is microstuttering, it's just not perceptible to the human eye. Again, an issue primarily due to the architecture of the XBO that requires a tool to make sure that when things get too heavy, it scales it back rather than being able to handle it as it was designed at all times.

3. There is screen tearing.

4. There is no such thing as zero input lag. It's diminished input lag or limited, but no such thing as zero.


NVIDIA Smooth Motion: Up to 70% More FPS Using Driver Level Frame Gen on RTX 50 GPUs

NVIDIA’s RTX 50 “Blackwell” architecture has been a bit of a bore for us gamers. Apart from Multi Frame Generation, which has limited use-case scenarios, there isn’t much to be excited about. Smooth Motion is achieved using GPU-side Flip Metering, with the optical flow data generated by AI models on the Tensor cores.
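Driver-level frame generation inserts synthesized frames between rendered ones. Real implementations warp pixels along AI-estimated motion vectors; as a deliberately crude stand-in (this is illustrative only, not NVIDIA's algorithm), a minimal sketch can blend two rendered frames at the midpoint:

```python
# Crude stand-in for frame generation: synthesize an in-between frame by
# linearly blending two rendered frames. Real driver-level frame gen warps
# pixels along optical-flow motion vectors instead of blending, but this
# shows where the extra frame comes from — and why it adds roughly half a
# frame of latency, since frame B must exist before the midpoint is shown.

def interpolate_frame(frame_a, frame_b, t=0.5):
    """Linear blend of two equally sized grayscale frames at time t in [0, 1]."""
    return [[(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

frame_a = [[0, 0], [100, 100]]      # rendered frame N
frame_b = [[50, 50], [200, 200]]    # rendered frame N+1
mid = interpolate_frame(frame_a, frame_b)
print(mid)  # [[25.0, 25.0], [150.0, 150.0]]
```

A plain blend ghosts badly on fast motion, which is exactly why real frame generation needs per-pixel motion vectors rather than averaging.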

Read Full Story >>
pcoptimizedsettings.com

PNY NVIDIA GeForce RTX 5060 Ti GPU Review

Between the price, performance and power draw, with the GeForce RTX 5060 Ti, NVIDIA nailed the mainstream formula.

Read Full Story >>
cgmagonline.com

Nintendo Switch 2 Leveled Up With NVIDIA AI-Powered DLSS and 4K Gaming

Nvidia writes:

The Nintendo Switch 2 takes performance to the next level, powered by a custom NVIDIA processor featuring an NVIDIA GPU with dedicated RT Cores and Tensor Cores for stunning visuals and AI-driven enhancements.

Read Full Story >>
blogs.nvidia.com
ZycoFox 71d ago

The ray tracing probably doesn't even equal a low-end PC GPU, and even if it did it would probably be mostly useless. They'll probably force it into some game now that will run like shit, maybe 30fps at best, just because "it can do it".

B5R 71d ago

Raytracing is so unnecessary for a handheld. I just hope you can turn it off.

Vits 71d ago

A lot of gamers don’t realize that ray tracing isn’t really about making games look better. It’s mainly there to make development easier and cheaper, since it lets devs skip a bunch of old-school tricks to fake reflections and lighting. The visual upgrade is just a nice bonus, but that’s not the main reason the tech exists.

So you can be 100% sure that developers will try to implement it every chance they get.

RaidenBlack 71d ago (Edited 71d ago)

Agree with Vits... but to add: if devs and designers just drop RT into a game world, it won't always work as expected. RT is not just reflections but lighting and illumination as well. For example, if you create a room with minimal windows, it will look dark af with RTGI enabled. Devs and designers need to design the game world accordingly.
DF's Metro Exodus RT upgrade is an amazing reference video to go through, if anybody's interested.

darthv72 71d ago

So is HDR... but they have it anyway.

thesoftware730 71d ago

Some PS5 and Series X games run at 30fps with RT... just like on those systems, if you don't like it, turn it off.

I only say this because you make it seem like a problem exclusive to the Switch 2.

Neonridr 71d ago (Edited 71d ago)

sour grapes much?

"It probably doesn't do it well because it's Nintendo and they suck". That's how your comment reads. Why don't you just wait and see before making these ridiculous statements?

Goodguy01 71d ago

Please. I'd like to play my Switch games on my 4K TV without them looking all doodoo.

PRIMORDUS 71d ago

Nvidia could have said this months ago and cut the bullshit. Anyway, the rumors were true.

Profchaos 71d ago

Would have been nice, but an NDA likely prevented them from saying anything.

PRIMORDUS 70d ago

TBH I don't think Nvidia would have cared if they broke the NDA. They'd pay a little fine and go back to their AI shit. They don't even care about GPUs anymore. I myself would like them to leave the PC and console market.

Tacoboto 70d ago

This story was written half a decade ago when the world knew Nvidia would provide the chip for Switch 2 and DLSS was taking off.

Profchaos 70d ago

Yeah, but a similar thing happened a long time ago: 3dfx announced they were working with Sega when they took the company public, and in response Sega terminated the contract for the Dreamcast GPU and went with an ultimately weaker chipset.

So there's a precedent, but Nintendo wouldn't have had much of an option anyway; it's AMD, NVIDIA or Intel.

Profchaos 71d ago

I'm not expecting anything from ray tracing, but DLSS will be the thing that sees the unit get some impossible ports.

andy85 71d ago

Correct. All I'm seeing online is that it'll never run FF7 Rebirth. If it can run Cyberpunk, it'll run it. The DLSS will help. Obviously only 30fps, but a lot don't care.

Profchaos 71d ago (Edited 71d ago)

Exactly right. When I buy a game on Switch I know what I'm getting into: I'm buying it for its portability, and I'm willing to sacrifice fidelity and performance to play on a train or comfortably from a hotel room when I travel for work.
