GameSpot - Two of tech's biggest competitors are taking very different approaches to the next generation of gaming with Sony and Android, but which is making the right moves?
Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.
I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious if using DLSS & AI in general will lower the power draw. It seems like the days of just adding more VRAM & horsepower are over. Law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it. Reliance on AI is only going to pick up more & more.
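For a sense of scale on the power question (my own back-of-the-envelope numbers, not from the comment above): upscalers cut how many pixels the GPU actually shades, which is where any power savings would come from. The per-axis scale factors below are Nvidia's published DLSS mode ratios; the power conclusion itself is still speculation.

# Shaded-pixel count scales with the square of the internal resolution scale.
def pixel_ratio(scale_per_axis):
    return scale_per_axis * scale_per_axis  # both axes shrink

print(f"{pixel_ratio(0.67):.0%} of native pixels shaded")  # DLSS Quality, ~45%
print(f"{pixel_ratio(0.50):.0%} of native pixels shaded")  # DLSS Performance, 25%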
The PS4 Pro had dedicated hardware for checkerboard rendering, which was used heavily in PS4 first-party titles, so you don't need to look to PC, or even to modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?
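For anyone unfamiliar with the technique: checkerboard rendering shades only half the pixels each frame in an alternating pattern and fills the gaps from the previous frame. A toy sketch of the idea follows; Sony's real implementation also uses motion vectors and ID buffers, so everything here is illustrative only.

# Toy checkerboard reconstruction: shade ~50% of pixels per frame,
# reuse last frame's values for the rest (no motion compensation here).
import numpy as np

def checkerboard_frame(render_fn, prev_full, frame_index, h, w):
    ys, xs = np.mgrid[0:h, 0:w]
    mask = (ys + xs) % 2 == frame_index % 2   # alternate the pattern each frame
    full = prev_full.copy()
    full[mask] = render_fn(ys[mask], xs[mask])  # only half the pixels are shaded
    return full

# Usage: a gradient "scene" rendered over two frames.
h, w = 4, 8
scene = lambda y, x: (x + y).astype(float)
frame0 = checkerboard_frame(scene, np.zeros((h, w)), 0, h, w)
frame1 = checkerboard_frame(scene, frame0, 1, h, w)
print(frame1)  # after two frames, every pixel has been shaded once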
Almost deaf person:
- lightweight, portable $5 speakers with 0.5 cm drivers are the final nail in the coffin of Hi-Fi audio!
Some people in 2010:
- smartphones are the final nail in console gaming's coffin!
This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone isn't aware of it (see the deaf-person example above) doesn't mean the flaws aren't there. They are. And all it takes to see them is a display that handles motion well: either true 500 fps on a 500 Hz TN LCD or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.
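To put numbers on the motion-clarity point (my arithmetic, using the rule of thumb Blur Busters documents): on a sample-and-hold display, perceived blur width is roughly persistence multiplied by panning speed.

# Back-of-the-envelope motion blur from display persistence:
# blur width (px) ~= persistence (ms) * panning speed (px/ms).
def blur_px(refresh_hz, speed_px_per_s, strobe_ms=None):
    persistence_ms = strobe_ms if strobe_ms else 1000.0 / refresh_hz  # sample-and-hold
    return persistence_ms * speed_px_per_s / 1000.0

speed = 1000  # a fast pan, in pixels per second
print(round(blur_px(60, speed), 1))               # ~16.7 px smear at 60 Hz sample-and-hold
print(round(blur_px(500, speed), 1))              # ~2 px at 500 Hz, the commenter's point
print(round(blur_px(60, speed, strobe_ms=2), 1))  # ~2 px at 60 Hz with 2 ms strobing/BFI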
Also, an image ruined by any type of TAA is "native" in the same way a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you'll publish crap like this article just to go with the current.
There's no coffin for native-res quality, and there never will be. Eventually we'll have enough rasterization performance to drive 500 fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500 fps, the amount of time required for upscaling makes it completely useless.
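Rough arithmetic behind that claim (the ~1 ms upscaling cost below is an illustrative assumption, not a measured figure): a fixed-cost pass eats a growing share of the frame as the budget shrinks.

# Share of the frame budget consumed by a fixed-cost upscaling pass.
def budget_share(target_fps, pass_ms):
    frame_ms = 1000.0 / target_fps
    return pass_ms / frame_ms

for fps in (60, 120, 500):
    print(fps, f"{budget_share(fps, pass_ms=1.0):.0%}")
# 60  -> ~6% of the frame spent upscaling
# 120 -> ~12%
# 500 -> ~50%: the fixed cost dominates the 2 ms budget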
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorants on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA isn't "native" either, because it's ruined by the developers' design choices: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass over it later. When you disable it, you see a ruined image, horrible pixelation and other visual "glitches", but that is NOT what native would have looked like if you honestly compared the two.
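The "render every 4th pixel and smear it later" description does match how temporal accumulation works in broad strokes. A minimal sketch of the exponential history blend at the core of most TAA implementations (illustrative, not any specific engine):

# Each frame, only ~1 in 4 pixels is freshly shaded; the history buffer
# accumulates them over time. Disable the history pass and you see the
# sparse, "pixellated" raw frames, which is the commenter's point.
import numpy as np

def taa_resolve(history, current, alpha=0.1):
    """Keep 90% of accumulated history, blend in 10% of the new frame."""
    return (1.0 - alpha) * history + alpha * current

rng = np.random.default_rng(0)
truth = np.full((4, 4), 1.0)   # the "real" full-rate image
history = np.zeros((4, 4))
for _ in range(60):
    shaded = rng.random(truth.shape) < 0.25  # only ~1 pixel in 4 shaded per frame
    history = np.where(shaded, taa_resolve(history, truth), history)
print(history.round(2))  # values climb toward 1.0 as history accumulates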
Stay informed.
How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe, which reports 128MB of dedicated VRAM (it otherwise shares system memory). I currently do all my gaming on it, but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer, more demanding titles.
Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."
I echo the sentiment here - I think the way GPUs are going, gaming could become secondary to deep learning. I wonder if the 40 series was the last true generation of gaming GPUs?
You also need to consider that NVIDIA is heavily invested in cloud gaming, so they are likely going to make moves to push you into yet another lifelong subscription service.
NVIDIA will never change their price point until AMD or Intel makes a GPU that is comparable to theirs and cheaper.
It happened before, in the days of the GTX 280: they dropped the price from $650 to $450 in a matter of two weeks because of the Radeon HD 4870, which was selling at $380.
Last September, we unleashed AMD FidelityFX™ Super Resolution 3 (FSR 3) on the gaming world, delivering massive FPS improvements in supported games.
So to put two and two together... FSR 3.1 is releasing later this year, and the launch game to support it is Ratchet & Clank: Rift Apart. In Sony's DevNet documentation, Ratchet & Clank: Rift Apart is shown as the example for PSSR. The PS5 Pro also launches later this year... but there is something else coming too: AMD RDNA 4 cards (the very same technology that's in the Pro). So PSSR is either FSR 3.1, or it's a direct collaboration with AMD that builds on FSR 3.1. Somehow they are related. I think PSSR is FSR 3.1 with the bonus of AI... now let's see if RDNA 4 cards also include an AI block.
More details:
- FSR 3.1 fixes Frame Generation.
- If you have a 30-series RTX card, you can now combine DLSS upscaling with FSR Frame Generation (no 40-series required!) - see the sketch after this list.
- It's available on all cards (we assume it will come to consoles).
- It improves temporal stability.
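The headline change in 3.1 is that frame generation is decoupled from the FSR upscaler, which is exactly why the DLSS-plus-FSR combination becomes possible. A sketch of what a decoupled pipeline looks like; every name here is a hypothetical placeholder, not AMD's or Nvidia's actual API:

# Decoupling makes the upscaler and frame generator independent stages,
# so any combination composes. All names are illustrative stand-ins.
from typing import Callable

Frame = list  # stand-in type for an image

def render_pipeline(upscale: Callable[[Frame], Frame],
                    frame_gen: Callable[[Frame, Frame], Frame],
                    prev: Frame, low_res: Frame) -> tuple:
    cur = upscale(low_res)       # stage 1: any upscaler (DLSS, FSR, XeSS...)
    mid = frame_gen(prev, cur)   # stage 2: any frame generator
    return mid, cur              # present the interpolated frame, then the real one

# Before 3.1, stage 2 required stage 1 to be FSR; now the stages compose freely:
dlss_upscale = lambda f: f                                       # pretend-upscale
fsr_framegen = lambda a, b: [(x + y) / 2 for x, y in zip(a, b)]  # naive midpoint
mid, cur = render_pipeline(dlss_upscale, fsr_framegen, [0.0, 0.0], [1.0, 1.0])
print(mid, cur)  # [0.5, 0.5] [1.0, 1.0]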
I wonder how much they've fixed the ghosting in dark areas, as Nvidia is leaving them in the dust on image quality. Still, it's good that they're improving in big leaps. I'll have to see who I go with when the RTX 5000 series is released... at the moment the RTX 5000s are sounding like monsters.
Well, AMD is going to sell to consoles, and then there won't be any innovation later; it will be the same tech rolled on for the next 6 years.
Nvidia, meanwhile, will constantly need to pump out a new GPU for Android devices every year.
From a chip maker's point of view it's healthy to be releasing new chips, as they can make huge margins on new tech. Rolling the same tech for 6 years means no innovation and piddly profits.
Hence, from a profitability point of view, Android is definitely the better market at the moment.
Especially with console volumes being quite low compared to Android's, not to mention console makers pay pennies for hardware.
"From a chip makers point of view its healthy to be releasing new chips as they can make huge margins on new tech. Rolling the same tech for 6 years means no innovation and piddly profits. "
yup!
AMD was Sony's second choice.
The worst CPU maker making the best console! Nice...
"AMD Reports 2012 Loss of $1.18 Billion, While IBM Profits"
http://www.tomshardware.com...
AMD and Sony can feel each other's pain: Sony trying to be #1 again, and AMD trying to be #2 again.
Nvidia and Intel are just ballers right now.
When you turn down a chance to be in a console, you know you're doing big things.
"PS3 and its RSX graphics. Two years ago, in January 2011, Nvidia CEO Jen-Hsun Huang told reporters that the Sony-Nvidia deal had earned Nvidia $500M in royalties since 2004. The total number of shipped PS3 consoles by March, 2011 stood at 50 million according to data from the NPD group. Half a billion is nothing to sneeze at"
http://www.extremetech.com/...
And why do Sony gamers say Nvidia is butthurt, when it was Nvidia that turned Sony down? O_o
The 360 uses an ATI GPU, and it's better than the PS3's Nvidia one. I own an A6 APU laptop that's not even close to the PS4's power, yet it still runs all games much better than the PS3/360.
I just dumped my life's savings into AMD and in one day made $36. What now? It's freaking $2 per share! Let's not forget the Wii U has also got AMD inside, and after some big hitters it will pick up sales.
Let's think about it: Nintendo, Sony and MS are ALL going AMD this time. Also, AMD is always making kick-ass APUs, and they will only get better. I've never been so happy with a laptop as with my Asus APU one.
Nvidia is going where Valve goes: to open systems.
With AMD you get more for your money, and I prefer their business-partner attitude; they're a much friendlier supplier to work with as a whole.
Having a stable chip design which they can cost-effectively manufacture and cost-reduce over 6-7 years must be attractive; there are none of the continuous yield issues that new chip designs and manufacturing incur.
The console sector looks like a good business model for AMD; I am sure Nvidia would like to have it.