With the announcement of its G-Sync technology, many believe Nvidia may have rendered the next generation of consoles obsolete before they even launch. Sure enough, the GPU manufacturer makes a rather fascinating proposition, promising gaming free of annoyances like input lag, stutter, and screen tearing.
The NVIDIA RTX Remix tool is redefining the gaming industry and the modding community in the best way possible.
"NVIDIA's Generative AI-Powered Modding Tool Is A Gamechanger For The Industry"
True, especially for everyone in the industry who gets put out of work by it.
A new GeForce Experience update is finally here, bringing 'optimal settings support' to 122 new games, even as the Nvidia App continues development.
Interesting. Diablo IV, Helldivers 2, and Forza Motorsport have been included in the app for a long time.
Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows that native performance doesn't matter.
I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS, and AI in general, will lower the power draw. It seems like the days of just adding more VRAM and horsepower are over; it's the law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more and more.
PS4 Pro had dedicated hardware for checkerboard rendering that was used extensively in first-party PS4 titles, so you don't even need to look to PC, let alone modern PC gaming. The first RTX cards released nearly six years ago, so how many nails does this coffin need?
Almost deaf person:
- lightweight, portable $5 speakers 0.5 cm in diameter are the final nail in the coffin of hi-fi audio!
Some people in 2010:
- smartphones are the final nail in console gaming's coffin!
This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone isn't aware of it (see the deaf-guy example) doesn't mean the flaws aren't there. They are. And all it takes to see them is a display that handles motion well: either true 500 fps on a 500 Hz LCD TN or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.
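For the unfamiliar, here's a minimal sketch of the sample-and-hold persistence math the low-persistence point rests on; the 20% strobe duty cycle is an assumed illustrative value, not a spec from any particular display:

```python
# Sample-and-hold displays keep each frame lit for the whole refresh
# interval, so perceived motion blur scales with that visible time.
def persistence_ms(refresh_hz: float, duty_cycle: float = 1.0) -> float:
    """Time each frame stays visible; lower means clearer motion."""
    return 1000.0 / refresh_hz * duty_cycle

print(persistence_ms(60))                  # ~16.7 ms: noticeably smeary motion
print(persistence_ms(500))                 # 2.0 ms: far clearer
print(persistence_ms(60, duty_cycle=0.2))  # ~3.3 ms: 60 Hz with strobing/BFI
```

Either route, a very high refresh rate or strobing/BFI, reduces how long each frame is visible, which is why both make upscaling artifacts easier to spot in motion.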
Also, an image ruined by any type of TAA is just as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you'll publish crap like this article just to go with the flow.
There's no coffin for native-res quality, and there never will be. Eventually, we'll have enough rasterization performance to drive 500 fps, which will be a game-changer for motion quality while also adding another benefit: lower latency.
And at 500 fps, the amount of time required for upscaling makes it completely useless.
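A rough back-of-envelope on that claim; the 1 ms upscale cost below is an assumed illustrative figure, since real DLSS/FSR pass timings vary by GPU, resolution, and quality mode:

```python
# Illustrative frame-budget math, not measured numbers.
def frame_budget_ms(fps: float) -> float:
    """Total time available to render one frame at a given frame rate."""
    return 1000.0 / fps

UPSCALE_COST_MS = 1.0  # hypothetical fixed cost of an upscaling pass

for fps in (60, 120, 500):
    budget = frame_budget_ms(fps)
    share = UPSCALE_COST_MS / budget * 100
    print(f"{fps:>3} fps: {budget:5.2f} ms/frame, "
          f"a {UPSCALE_COST_MS} ms upscale pass eats {share:4.1f}% of it")
```

At 60 fps, a 1 ms pass is a rounding error; at 500 fps, it would consume half the frame budget, which is the commenter's point.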
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorant people on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA is not "native" either, as it's ruined by the developers' design choices: you can cheat by rendering every fourth pixel when you plan to put a smeary TAA pass over it later. When you disable it, you see a ruined image, horrible pixelation, and other visual "glitches", but that is NOT what native would have looked like if you honestly compared the two.
Stay informed.
How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.
Did misterXmedia write this?
Somebody with some tech know-how, explain this crap.
I don't think the writer of this really understands what G-Sync is all about.
All this sounds like is a pretty standard V-sync implementation. The whole point of G-Sync is to only refresh the display when the GPU sends a new frame, instead of just updating at 60 Hz no matter what.
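To make the distinction concrete, here is a toy timing model (my own sketch of the general idea, not Nvidia's actual implementation): a fixed-refresh display holds a finished frame until the next scheduled tick, while an adaptive-refresh display like G-Sync presents it as soon as it arrives.

```python
import random

FIXED_HZ = 60
FIXED_INTERVAL_MS = 1000.0 / FIXED_HZ

def present_fixed(frame_ready_ms: float) -> float:
    """Fixed refresh: the frame waits for the next 60 Hz tick."""
    ticks = int(frame_ready_ms // FIXED_INTERVAL_MS) + 1
    return ticks * FIXED_INTERVAL_MS

def present_adaptive(frame_ready_ms: float) -> float:
    """Adaptive refresh: the display updates the moment the frame is ready."""
    return frame_ready_ms

t = 0.0
for _ in range(5):
    t += random.uniform(12, 25)  # GPU delivers frames at an uneven pace
    print(f"ready {t:6.1f} ms -> fixed {present_fixed(t):6.1f} ms, "
          f"adaptive {present_adaptive(t):6.1f} ms")
```

The extra wait in the fixed case shows up as judder and added latency; eliminating that wait is exactly what G-Sync is for.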
"this would lead to the elimination of input lag, stutter, and SCREEN TEARING"
Scrolls down to related articles
"The new Ryse build still shows some flaws, mainly SCREEN-TEARING."
-___-
This article is based on assumptions the author made from a month-old Digital Foundry interview.
I broke my promise to myself not to comment again on N4G, damn.
More wishful thinking and wild theories.
One day it's "yeah, but specs don't matter." Then rumors pop up from the stupidest sources, and now they get their game face on and head back into the spec-war trenches, until the rumors turn out to be bust again.
After the double-GPU rumor went bust, this is what's next?
These guys are too much. Borderline insanity at this point.
First it was
"PS4 is just PS3.5! Next gen gaming only possible on Xbox One."
Then it was
"It's about gameplay! It's not the graphics!"
I'm having trouble keeping up...