Now this is great. Almost two weeks ago, Nvidia claimed that it was not worth investing in consoles, and that was precisely why it did not bother with the PS4 and X720. As Tony Tamasi, Senior VP of content and technology at Nvidia, stated back then, Nvidia came to the conclusion that it didn't want to do the business at the price Sony was willing to pay. And as you'd expect, AMD decided to stand up and face Nvidia, claiming that it was able to offer the PS4 the hardware Nvidia couldn't.
Neal Robison, director of ISV relations at AMD, told TechRadar that AMD was able to provide an integrated solution through its APU that Nvidia couldn't, by 'optimizing information flows, generating greater performance, better power and heat efficiency, and by providing tools and dev relationships' to give the PS4 a strong launch.
AMD has long been the best value option if you're looking for a new GPU. Now even their latest Radeon RX 7000 series is getting cheaper.
Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.
I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious if using DLSS and AI in general will lower the power draw. It seems like the days of just adding more VRAM and horsepower are over; law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more and more.
The PS4 Pro had dedicated hardware for checkerboard rendering that was used heavily in PS4 first-party titles, so you don't even need to look to PC or modern PC gaming. The first RTX cards released nearly six years ago, so how many nails does this coffin need?
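As a rough sketch of the idea mentioned above (not Sony's actual implementation, which also uses motion vectors and an ID buffer to resolve the reused pixels), checkerboard rendering shades only half the pixels each frame in a checker pattern and fills the holes from the previous frame's complementary half:

```python
# Conceptual checkerboard-rendering composite. Illustrative only:
# real implementations reproject the reused pixels with motion
# vectors rather than copying them straight across.

def checkerboard_composite(prev_frame, new_half, frame_index, width, height):
    """Combine a freshly shaded half-resolution checker with the prior frame.

    prev_frame: 2D list (height x width) of pixel values from the last composite
    new_half:   dict mapping (x, y) -> newly shaded pixel value, covering only
                the cells where (x + y + frame_index) is even
    """
    out = [[None] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if (x + y + frame_index) % 2 == 0:
                out[y][x] = new_half[(x, y)]   # shaded this frame
            else:
                out[y][x] = prev_frame[y][x]   # reused from the last frame
    return out
```

Alternating `frame_index` each frame flips which half of the checker is freshly shaded, so every pixel is re-rendered at least every other frame.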
An almost deaf person:
- lightweight, portable $5 speakers with 0.5 cm drivers are the final nail in the coffin of Hi-Fi audio!
Some people in 2010:
- smartphones are the final nail in console gaming's coffin!
This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either one that delivers a true 500fps on a 500Hz panel (TN LCD, OLED, or faster tech) or one that uses a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.
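The persistence point comes down to simple arithmetic. A common rule of thumb (popularized by blurbusters.com) is that perceived sample-and-hold blur in pixels is roughly pixel persistence times on-screen motion speed; the 960 px/s test speed below is an assumed illustrative figure, not from the comment:

```python
# Back-of-the-envelope sample-and-hold motion blur:
# blur (px) ~= pixel persistence (ms) * motion speed (px/ms).
# Assumes full-persistence display (no strobing/BFI).

def motion_blur_px(refresh_hz: float, speed_px_per_sec: float) -> float:
    persistence_ms = 1000.0 / refresh_hz        # each frame is held this long
    return persistence_ms * speed_px_per_sec / 1000.0

for hz in (60, 120, 500):
    print(f"{hz:>3} Hz: ~{motion_blur_px(hz, 960):.1f} px of blur at 960 px/s")
```

At 60Hz that works out to roughly 16 px of smear versus about 2 px at 500Hz, which is why upscaling artifacts that hide in 60Hz blur become visible on fast displays.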
Also, an image ruined by any type of TAA is just as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article, just to flow with the current.
There's no coffin for native-res quality and there never will be. Eventually, we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
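The claim above is frame-budget arithmetic. The ~1.0 ms upscaler cost below is an assumed illustrative figure, not a measured number for any specific GPU or upscaler version:

```python
# How much of each frame's time budget a fixed-cost upscaling pass
# would consume at various frame rates.

def frame_budget_ms(fps: float) -> float:
    """Total time available to produce one frame, in milliseconds."""
    return 1000.0 / fps

UPSCALER_COST_MS = 1.0  # assumed per-frame cost of the upscaling pass

for fps in (60, 120, 500):
    budget = frame_budget_ms(fps)
    share = UPSCALER_COST_MS / budget * 100
    print(f"{fps:>3} fps: {budget:.2f} ms/frame, upscaler eats {share:.0f}% of it")
```

At 60fps a ~1 ms pass is about 6% of the 16.67 ms budget; at 500fps the budget is only 2 ms, so the same pass would consume half of it, which is the commenter's point.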
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorant people on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA is not "native" either, since it's ruined by the developers' design choice: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass over it later. When you disable it, you will see a ruined image, horrible pixelation and other visual "glitches", but that is NOT what native would have looked like if you wanted to honestly compare the two.
Stay informed.
How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.
Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."
Echo sentiment here - I think the way GPUs are going, gaming could be secondary to deep learning. Wonder if the 40 series was the last true generation of GPUs?
You also need to consider that NVIDIA is heavily invested in cloud gaming, so they are likely going to make moves to push you into yet another lifelong subscription service.
NVIDIA will never change their price point until AMD or Intel makes a GPU that is comparable to and cheaper than theirs.
It happened before in the days of the GTX 280, when they dropped the price from $650 to $450 in a matter of two weeks because the Radeon HD 4870 was being sold at $380.
So they're comparing a $1000 card to a console that will cost around $400... that's nice. True, their cards are powerful, but even they couldn't supply a technology that is so far ahead of the current gen at a reasonable price. And while they could supply what they used for the Titan, who would buy a, let's say, $1200 console? All I know is that Sony and MS made the right choice with AMD.
Anyway, to sum up this article in one picture....
Oh Lord....
Me: PS4 & X720 get my money. So, who loses?
it's all about optimisation
the Titan is maybe 3x faster but only using 50% of its capabilities.
on the other hand, the PS4 will be 3x slower, but the system will use 60% of its capabilities at the beginning and 100% later.
if you didn't know, the PS4 uses a new technology where the CPU and GPU work together; the result: MORE POWER, better-looking visuals.
PS4 games will look like today's PC games on all-ultra graphics, or even better. thx, bye
ePeen warz, gotta love em when the big boys go at it haha
Ever since Nvidia were confirmed to not have a hand in any console this gen, they seem mad.
How many people, in all honesty, are going to put down the sort of money they're asking for a Titan? It's going to be the big enthusiasts and, even then, only the ones who really need to be on the bleeding edge.
It's never going to be a mass-market item because PC gaming with that type of card is not really a mainstream market. Whatever number they sell, it will never match the easy licence money of console GPU development, where the product doesn't change for 6+ years.