
AMD on the PS4: We gave it the hardware Nvidia couldn't

AMD came roaring into GDC 2013 with a vengeance. Not only did the chipmaker introduce its first branded line of dedicated cloud gaming graphics cards - the Radeon Sky Series - but we also got a taste of what it claims is the world's fastest GPU.
Attendees of a Tuesday night press conference saw the GPU, the Radeon HD 7990, make its first public appearance. Later, a set of 7990s powered EA's 17-minute introductory Battlefield 4 demo on the big screen.

Read Full Story >>
techradar.com
iGAM3R-VIII4035d ago (Edited 4035d ago )

Well Nvidia got their heads stuck up their a** now. They probably aren't happy with Sony and AMD. Well, they should have seen that coming; you can't just bash 2 companies and expect them not to get you back. *sigh* Nvidia.....

gaffyh4035d ago

This is a really great interview, lots of good details regarding the PS4's architecture.

Army_of_Darkness4035d ago

I think everyone realized that except for the hardcore Nvidia PC fanboys..
They can trash talk and hate on the PS4 for now, but once it releases and we start seeing what developers can do with it, I expect a little silence from them as they return to their caves...

cayleee4035d ago
gaffyh4035d ago

@cayleee - Everyone knows you are a PC fanboy, but I'll reply to one part of your comment anyway. Component manufacturers almost always make a profit on what they are selling, and if anyone thinks that nVidia didn't make a profit from being in the PS3, they are just stupid. It might not have been an obscene profit, but it was a profit nonetheless.

Maybe get your head out of nVidia's ass and look at the situation as a whole; it's pretty clear that the nVidia comments are damage control from their side. There is NO WAY that they didn't try to get in on the console business this gen. First and foremost they are a BUSINESS, and any money they can make is good for them and their shareholders.

R6ex4035d ago

"We gave it the hardware Nvidia couldn't"

Well, obviously! Nvidia's Project Denver won't see the light of day until late 2014 or early 2015.

Tr10wn4034d ago

@Army_of_Darkness

Everyone knows Nvidia has more quality than ATI. It's really no secret that Sony went with AMD because it was cheaper for them and for us. While I don't like ATI, I do like AMD's CPUs, and I hope this deal helps them get out of their debt.

Gaming1014034d ago

The problem is, everyone is comparing the traditional x86 architecture of PCs to the PS4, and it's not an apples-to-apples comparison. The article outlines this rather well in the interview:

"For us, really by looking at that APU that we designed, you can't pull out individual components off it and hold it up and say, 'Yeah, this compares to X or Y.'

"It's that integration of the two, and especially with the amount of shared memory [8GB of GDDR5, 176GB/s raw memory bandwidth] that Sony has chosen to put on that machine, then you're going to be able to do so much more moving and sharing that data that you can address by both sides.

"It's more than just a CPU doing all these amazing calculations and a GPU doing calculations. We are now going to be able to move certain tasks between the two."

Devs, he said, will be able to push the console's capabilities beyond a traditional x86 PC architecture, and multithreading - being able to take advantage of all eight cores - is going "to become a huge deal for a lot of the big blockbuster games."
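As an aside, the quoted 176GB/s figure is easy to reconstruct from the PS4's published memory specs (a 256-bit GDDR5 bus at 5.5Gbps effective per pin); a quick sketch, with illustrative variable names:

```python
# Reconstructing the PS4's quoted 176 GB/s raw memory bandwidth.
# Published specs: 256-bit GDDR5 bus, 5.5 Gbps effective per pin.
bus_width_bits = 256
data_rate_gbps = 5.5                      # effective Gb/s per pin

bandwidth_gb_per_s = (bus_width_bits / 8) * data_rate_gbps
print(bandwidth_gb_per_s)                 # 176.0
```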

rainslacker4034d ago (Edited 4034d ago )

Here's some food for thought, particularly for those saying it's just not worth it for NVidia.

If Nvidia only made a $5 net profit on each chip manufactured (probably a lowball figure), then over the life of 80 million consoles (arguably an attainable sales number) they would have made a profit equal to about 5% of their current worth.

Yeah, so not worth their time. I'm sure investors don't care about a 5% increase in profits. /s
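As a quick check of that math (the ~$8 billion figure below is an assumption, roughly matching Nvidia's market cap around 2013):

```python
# Back-of-the-envelope check of the 5% claim.
# Assumption (not from the thread): Nvidia's 2013 market cap ~= $8B.
profit_per_chip = 5                  # USD, the comment's lowball figure
consoles = 80_000_000                # lifetime units
market_cap = 8_000_000_000           # USD, assumed

total_profit = profit_per_chip * consoles      # $400,000,000
print(f"{total_profit / market_cap:.0%}")      # 5%
```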

Nitrox4034d ago

rainslacker-

Nvidia didn't technically say it wasn't worth it; they said they looked at the opportunity cost of competing to get into the console market, and the justification wasn't there. That's why they wanted more $$$ than Sony was willing to shell out: to shift the justification to a level they were happier with.

They're trying to make some major headway in the mobile market right now and didn't want to put various projects on hold to focus on console development. So maybe they looked at it like "Well, we can go for a 5% profit increase going the console route. Or... We can take a gamble on some of our own ideas and possibly see a significantly larger increase."

rainslacker4034d ago

I'm aware. Nvidia isn't one to slouch when it comes to trying to be a major player in any market they have an interest in.

Some people on here were just saying it isn't worth it for Nvidia based on their presumption that consoles don't return enough profit to make it worthwhile. I was just trying to put things in perspective.

Nvidia does make some really good products, and their reputation is well deserved. That reputation comes at a cost though, and to me, if a comparable experience or tech can be offered for less, there's no real reason to hold on to brand loyalty.

IAmLee4035d ago

Tomorrow's articles:
'Nvidia says PS4 is shit.'

DeadlyFire4035d ago

Well, there is a reason why Microsoft turned away from NVIDIA after the original Xbox, and now Sony is doing the same after the PS3.

Cueil4034d ago

NVidia is the reason Microsoft had to stop making the original Xbox in the first place, and probably a reason why you don't have an HDD in every console, because the BC license fee for emulating the Nvidia chipset was built into the HDD price... they thought too highly of themselves. AMD has been angling in this direction for a while; it was the Xbox that was their first APU... albeit with a different CPU... the original A8 model was s/n AD3870WNGXBOX. Microsoft was an important partner in the development and distribution of the first-ever APU. Just another example of competition being good for gamers.

fermcr4035d ago (Edited 4035d ago )

summarizing...

Nvidia had better graphics cards, but they were more expensive, since it would have had to be a GPU from Nvidia plus a 3rd-party CPU (probably IBM). Nvidia didn't want to budge on their prices, so Sony went with AMD, which is cheaper with the CPU and GPU all in one. Not as good as the Nvidia (+ 3rd-party CPU) solution, but still a good solution and much cheaper.

Sony made a big mistake going with the Cell processor for the PS3, with losses in the billions. Now they are going with a much cheaper solution and will most likely make a profit on every PS4 sold from the start.

Microsoft will probably take the same route.

dcbronco4035d ago

It's not about a cheaper solution. Nvidia has Tegra. That's an APU to a certain extent. What the article is mainly talking about is HSA. Nvidia doesn't have that. They are said to be working on it now for Tegra 5.

http://www.theinquirer.net/...

The AMD APU was just a lot better and cheaper than anything Nvidia could offer. They are bitter and scared because AMD is challenging them in every market and that will soon include phones and tablets. And with a better product.

ijust2good4035d ago (Edited 4035d ago )

We all know deep down Nvidia is hurting big time. It must be a bitch for Nvidia not to get any kind of attention next gen.

Nvidia CEO Jen-Hsun Huang was on stage during the 2005 E3 PS3 reveal bragging about console power; he seemed like the most excited person on stage at the time. He always did like the attention. Too bad he won't get any this time around. I guess he tried another way to seek the attention he craved with Project SHIELD. Yep, that will threaten the console market lol.

start from 07.00
http://www.youtube.com/watc...

Jensen will certainly be gutted he won't be attending any E3 this year when the next-gen consoles get their full reveal.

ZombieNinjaPanda4035d ago

Because any company that is offering its products and services to another would talk trash about them at the time they're offering.

Logic fails the majority of this website.

S2Killinit4034d ago

Well, to those who think that Nvidia didn't want, or didn't really need, the console market, I gotta say, the way they are responding seems to say otherwise. I'm actually more of an Nvidia guy than an AMD guy. I don't mind paying a little bit more for that little bit of extra security in knowing I've got a first-rate item (not that AMD is bad). But I still gotta say, Nvidia didn't handle this very maturely.

hellvaguy4034d ago

And by "maturely" you do mean independent of their right to their own opinion? You mean they throw in a slight dig at sony for using cheaper parts then they would have used, (which may or may not be fact, again just their opinion), but as a devote sony zealot, the dig cut you deep.

Fanboys of all sides need to stop pretending that they know the "true motivation" of every company out there and that if they speak negatively about their cult company (wii.sony.ms), then obviously they are immature, hurtful, evil or w/e.

BABY-JEDI4034d ago

They should just let the games do the talking; anything else is just a waste of time IMO.

TheLyonKing4035d ago

I love a good tit for tat against companies.

knifefight4035d ago

Bruce Less vs. Chuck Norris...

King Kong vs. Godzilla...

Mr. T. vs. Rocky Balboa...

And now, the grandest of all superfights...

NVIDIA VERSUS AMD!!!! :o

popcorn.gif

rezzah4035d ago (Edited 4035d ago )

Lee.

Edit:

If "Less" was meant to imply Bruce Lee being lesser to Chuck Norris, know that Bruce kicked his ***.

AJ Hartley4035d ago

There's even a vid of Chuck saying there's no comparison at all, that Bruce would annihilate him.

Diver4034d ago

@rezzah

Bruce Lee: action star.

Chuck Norris: full-contact world champion. Bruce asked him to appear in the movie. So basically a rigged, fake outcome that would have gone differently in the real world. Something Nvidia is learning the hard way.

specialguest4035d ago (Edited 4035d ago )

Despite the fact that there was a lame movie where King Kong beats Godzilla, Godzilla would've actually crushed King Kong like an ant. Godzilla is the size of a tall building. King Kong climbs buildings and fell to his death.

http://www.geekstir.com/wp-...

e-p-ayeaH4035d ago

no love for Rocky vs Drago?

Auron4035d ago (Edited 4035d ago )

Bruce Lee killed Chuck Norris. Watch Way of the Dragon. Lee is a legend.

http://www.youtube.com/watc...

strickers4034d ago

I love Bruce but he never competed. Chuck said he would have done well. He praised him. However, Chuck was a 7-time CONSECUTIVE world champ. Don't dismiss him because Bruce made better movies and was more charming.

Ritsujun4034d ago

knifefight vs. gunfight
You lose.

knifefight4034d ago

Ritsujun vs. Ritsurin.

Now we're both 0 and 1.

mcstorm4035d ago

I'm glad AMD are making the chip, and I'd rather AMD make it for all 3 than NVIDIA, just because AMD are finding it hard in the market with Intel and NVIDIA making a lot with their processors.


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan3d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious if using DLSS & AI in general will lower the power draw. It would seem like the days of just adding more VRAM & horsepower are over. Law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more & more.

Tapani3d ago (Edited 3d ago )

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating don't tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPU of 10 years ago. Plus, today you can undervolt + OC GPUs by a good margin to keep stock performance while utilizing 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.
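A minimal sketch of setting such a cap programmatically, assuming the NVML Python bindings (nvidia-ml-py), a single GPU at index 0, and admin rights; the 320W figure is from the comment above:

```python
# Minimal sketch: cap board power via NVML (pip install nvidia-ml-py).
# Assumes GPU index 0 and admin/root rights; 320 W is the comment's figure.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

# NVML reports power in milliwatts.
print("draw  (W):", pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000)
print("limit (W):", pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000)

# Equivalent to `nvidia-smi -pl 320`.
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, 320_000)

pynvml.nvmlShutdown()
```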

However, in today's world chip manufacturing is limited by physics, and we will see power increases for the next 5-10 years at the very least to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech coming to market that we've yet to invent, or perhaps we'll solve existing technologies' problems with manufacturing or cost of production.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion and solar etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that is a pipedream.

MrBaskerville3d ago

I don't think fusion is the way forward. It will most likely be too late by the time it's finally ready, meaning it will probably never be ready. Something else might arrive first, though, and then it becomes viable.

Firebird3603d ago

We need to stop the smear campaign on nuclear energy.
We could power everything forever if we wanted to.

Tacoboto3d ago

PS4 Pro had dedicated hardware in it for supporting checkerboard rendering that was used significantly in PS4 first party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?
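For anyone unfamiliar with the technique, here's a toy sketch of checkerboard rendering; the pattern helper and trivial "shader" are made up for illustration, and real implementations also reproject the reused pixels with motion vectors:

```python
# Toy checkerboard rendering: shade half the pixels each frame in an
# alternating checker pattern and reuse the other half from the last frame.
import numpy as np

H, W = 4, 8

def render_checkerboard(shade, frame_index, prev):
    out = prev.copy()
    ys, xs = np.mgrid[0:H, 0:W]
    mask = (ys + xs) % 2 == frame_index % 2   # which half to shade this frame
    out[mask] = shade(ys[mask], xs[mask])     # only ~50% of pixels are shaded
    return out

# A trivial stand-in "shader" that just sums coordinates.
frame0 = render_checkerboard(lambda y, x: y + x, 0, np.zeros((H, W)))
frame1 = render_checkerboard(lambda y, x: y + x, 1, frame0)
print(frame1)   # fully populated after two frames
```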

InUrFoxHole3d ago

Well... it's a coffin, man. So at least 4?

Tacoboto3d ago

PSSR in the fall can assume that role.

anast3d ago

and those nails need to be replaced annually

Einhander19723d ago

I'm not sure what point you're trying to make, but PS4 Pro was before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR2.

Tacoboto3d ago

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack3d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, but in terms of image quality it's way behind what DLSS 3 or FSR 3 is currently offering.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" argument (that's been going around in the PC community since 2019) right out of the window.
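For context on what DLSS is reconstructing from, Nvidia's documented per-axis scale factors give the internal render resolutions; a quick sketch (the helper loop is just illustrative):

```python
# Internal render resolution per DLSS mode at 4K output.
# Per-axis scale factors are Nvidia's documented values.
MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

OUT_W, OUT_H = 3840, 2160
for mode, s in MODES.items():
    w, h = int(OUT_W * s), int(OUT_H * s)
    pct = 100 * (w * h) / (OUT_W * OUT_H)
    print(f"{mode:>17}: {w}x{h} (~{pct:.0f}% of native pixels)")
```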

Einhander19722d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboarding.

But you're talking about FSR 3, which probably is better than checkerboarding; FSR 3 has only started to get games this year, though, so checkerboarding, the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for; heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic3d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself...
And of course, then the idea of checkerboard rendering not being native...
For sure, maybe this tech makes the difference minimal while pixel counting, but alas, it seems performance and "close enough", not native, is what matters now...
I want to see it run native without DLSS. Why not?

RonsonPL3d ago

Almost deaf person:
- lightweight portable $5 speakers 0.5cm in diameter are the final nail in the coffin of Hi-Fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either get true 500fps on a 500Hz TN LCD or OLED (or faster tech), or use a low persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article, just to flow with the current.

There's no coffin for native-res quality and there never will be. Eventually we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorants on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA is not "native" either, as it's ruined by the developers' design choices - you can cheat by rendering every 4th pixel when you plan to smear a TAA pass over it later. When you disable it, you see a ruined image, horrible pixelation and other visual "glitches", but that is NOT what native would have looked like in an honest comparison of the two.

Stay informed.
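The frame-budget arithmetic behind that 500fps point is easy to check; the 0.5-1.0ms upscaling cost below is an assumed ballpark, not a measured number:

```python
# Frame budget vs. a fixed upscaling cost (0.5-1.0 ms is an assumed ballpark).
for fps in (60, 120, 240, 500):
    budget_ms = 1000 / fps
    for upscale_ms in (0.5, 1.0):
        share = 100 * upscale_ms / budget_ms
        print(f"{fps:>3} fps: {budget_ms:5.1f} ms budget, "
              f"{upscale_ms} ms upscale = {share:4.0f}% of the frame")
```

At 500fps the whole frame is 2ms, so even a 1ms upscaling pass would eat half the budget, which is the commenter's point.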

RaidenBlack3d ago

Main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of the day, but it'll slowly narrow to the point of being indistinguishable.
Something similar happened with generative AI like Sora... AI-generated videos were turds back when they were introduced (the infamous Will Smith eating video)... but now look at Sora, generating videos that just look like real life.

Yui_Suzumiya3d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple1013d ago

Maybe better to get a budget gaming laptop and link a DualSense to it

= a portable console with far better graphics than a Steam Deck! Plus a bigger screen, and you can use it for work etc.


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal1696d ago

I echo the sentiment here - I think the way GPUs are going, gaming could become secondary to deep learning. I wonder if the 40 series was the last true generation of GPUs?

Number1TailzFan6d ago

No.. Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if you can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance then there's always AMD, who are doing OK at the mid range.

Christopher6d ago

***or play at just 1080p and 60fps or something***

My over-2-year-old laptop GPU still runs fine. I think this is more a reason why GPUs are shifting priority to other things: the market reach for new users is shrinking as more PC gamers put off replacing older, still-working parts that run RT/AI fine enough as it is. Not to say there aren't people who still upgrade, but I think the market for having the latest and greatest is shrinking compared to the past two decades. Problem is we aren't advancing at the rate we were; we're reaching the flattening of that exponential curve of advancement. We need another major technological breakthrough to restart that curve.

D0nkeyBoi6d ago

The unremovable ad makes it impossible to read the article.

Tzuno6d ago (Edited 6d ago )

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing6d ago

You also need to consider that NVIDIA is heavily invested in cloud gaming, so they are likely going to make moves to push you into yet another lifetime subscription service.

Kayser816d ago

NVIDIA will never change their price point until AMD or Intel makes a GPU that is comparable and cheaper.
It happened before in the days of the GTX 280, when they dropped the price from $650 to $450 in a matter of 2 weeks because of the HD 4870, which was selling at $380.


AMD FSR 3.1 Announced at GDC 2024, FSR 3 Available and Upcoming in 40 Games

Last September, we unleashed AMD FidelityFX™ Super Resolution 3 (FSR 3) on the gaming world, delivering massive FPS improvements in supported games.

Read Full Story >>
community.amd.com
Eonjay24d ago (Edited 24d ago )

So to put 2 and 2 together... FSR 3.1 is releasing later this year, and the launch game to support it is Ratchet & Clank: Rift Apart. In Sony's DevNet documentation, Ratchet & Clank: Rift Apart is shown as the example for PSSR. PS5 Pro also launches later this year... but there is something else coming too: AMD RDNA 4 cards (the very same technology that's in the Pro). So PSSR is either FSR 3.1, or it's a direct collaboration with AMD that builds on FSR 3.1. Somehow they are related. I think PSSR is FSR 3.1 with the bonus of AI... now let's see if RDNA 4 cards also include an AI block.

More details:
FSR 3.1 decouples Frame Generation from FSR upscaling
If you have a 30-series RTX card you can now use DLSS upscaling with FSR Frame Generation (no 40 series required!)
It's available on all cards (we assume it will come to console)
Improves temporal stability

MrDead23d ago

I've been using a mod that allows DLSS frame gen on my 3080; it works on all RTX series. It'll be good not to have to rely on mods in the future.

darksky23d ago

The mods available are actually using FSR3 frame gen but with DLSS or FSR2 upscaling.

Babadook723d ago (Edited 23d ago )

I think that the leaks about the 5 Pro would debunk the notion that the two (FSR 3.1 and PSSR) are the same technology. PSSR is a Sony technology.

MrDead24d ago (Edited 24d ago )

I wonder how much they've fixed the ghosting in dark areas, as Nvidia is leaving them in the dust on image quality. Still, it's good that they're improving in big leaps. I'll have to see who I go with when the RTX 5000 series is released... at the moment the RTX 5000s are sounding like monsters.

just_looken24d ago

Did you see the Dell leaks where they are trying to cool cards using over 1k watts of power?

We are going to need 220V lines for next-gen PCs lol
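A rough sketch of why that joke isn't far off, under assumed North American household-circuit numbers (120V/15A, 80% continuous derating, ~90% PSU efficiency):

```python
# Can a 120 V / 15 A circuit feed a ~1 kW GPU plus the rest of the system?
# All figures are assumptions: 80% continuous derating, ~90% PSU efficiency.
volts, amps, derate, psu_eff = 120, 15, 0.8, 0.9

usable_w = volts * amps * derate          # 1440 W continuous at the wall
gpu_w, rest_w = 1000, 300                 # leaked GPU figure + assumed system
wall_draw_w = (gpu_w + rest_w) / psu_eff  # ~1444 W drawn from the outlet

print(f"need ~{wall_draw_w:.0f} W, have {usable_w:.0f} W")  # just over budget
```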

MrDead23d ago

That's crazy! Sounds like heating my house won't be a problem next winter.

porkChop23d ago

As much as I hate supporting Nvidia, AMD just doesn't even try to compete. Their whole business model is to beat Nvidia purely on price. But I'd rather pay for better performance and better features. AMD also doesn't even try to innovate. They just follow Nvidia's lead and make their own version of whatever Nvidia is doing. But they're always 1 or 2 generations behind when it comes to those software/driver innovations, so Nvidia is always miles ahead in quality and performance.

MrDead23d ago

I do a lot of work in Photoshop, so an Intel/Nvidia setup has been the go-to because of the performance edge; more expensive, but far more stable too. Intel also has the edge over AMD processors, with better load distribution across the cores and fewer spikes and jitters. When you're working large format you don't want lag or spikes while editing or drawing.

I do think AMD has improved massively though, and whilst I don't think they threaten Nvidia on the tech side, they do make very well-priced cards and processors for the power. I'm probably going with a 5080 or 5090, but AMD will get a little side look from me, which is a first in a long time... but like you said, they're a generation or two behind at the moment.

Goosejuice23d ago

While I can't argue for AMD GPUs (they aren't bad, but they aren't great either), AMD's CPUs have been great. I would argue the 7800X3D is one of the best CPUs for gaming right now. Idk about editing, so I'll take your word for that, but for gaming an AMD CPU is a great option these days.

porkChop22d ago

@Goosejuice

I have a 7800X3D. It certainly is great for gaming. Though for video editing, rendering, etc, I think Intel have the advantage from what I remember. I just mean from a GPU standpoint I can't support them.