
Analyzing the PS4 / Xbox 720 GPU

The PS4, Sony's successor to the PlayStation 3, will feature parts from AMD: Advanced Micro Devices is supplying both the GPU (graphics processing unit) and the CPU for the new system.

This article gives some insight into why certain things are the way they are, what to expect, and how it all works.

Read Full Story >>
redgamingtech.com
yewles1 4053d ago

The PS4's Liverpool GPU is confirmed to have 512 KB of L2 cache and 1.84 TFLOPS of performance. Also, the PS4's CUs are unified for both graphics and GP compute.
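For anyone curious where a ~1.84 TFLOPS figure comes from, here is a quick back-of-the-envelope check. The 18 CU count and 800 MHz clock are the commonly reported figures for Liverpool, not something confirmed in this thread, so treat them as assumptions.

```python
# Rough sanity check of the ~1.84 TFLOPS figure (assumed: 18 CUs at 800 MHz).
# Each GCN compute unit has 64 shader ALUs, and each ALU can issue one
# fused multiply-add per cycle, which counts as 2 floating-point ops.
compute_units = 18           # assumed PS4 CU count
alus_per_cu = 64             # GCN: 4 SIMDs x 16 lanes per CU
flops_per_alu_per_clock = 2  # one FMA = 2 FLOPs
clock_hz = 800e6             # assumed 800 MHz GPU clock

tflops = compute_units * alus_per_cu * flops_per_alu_per_clock * clock_hz / 1e12
print(f"Peak compute: {tflops:.3f} TFLOPS")  # -> 1.843 TFLOPS
```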

GameNameFame 4053d ago

And obviously the 720 has far fewer FLOPS, at 1.2 TFLOPS.

DatNJDom81 4053d ago (Edited 4053d ago)

Confirmed that Xbox will hold the PS4 back next gen once again when it comes to multiplats.

M$: Ok guys, here's a "BASKET". Make sure that the games don't look better on PS4.

Developer 1: But we can push this just a little more. Our fans deserve the best possible outcome for the game we're making, right?

Developer 2: Shut up and take the "BASKET"!

pc fanboyz: "This game will look better on my nvidiazzzz 7.1Tflop masterbation gfx card! Zieg Heil!"

Gaming101 4053d ago

A lot of technical speculation that doesn't have all the facts right, like the teraflops on the PS4, etc. The NextBox specs are still just rumours based on what we know so far, so this would be more useful once we know the confirmed final specs.

Really we're just splitting hairs here. Devs will be able to come up with whatever works; the PS4 won't be held back, the NextBox will just get pared-down versions of PS4 games. The only thing holding things back would be space limitations, and evidently you install all games to the hard drive of the NextBox, which has people going nuts since it disables used games. It just means you'll be standing around installing 6 DVD discs, since Microsoft won't support Blu-ray even if it kills them.

NewMonday 4053d ago

From the recent Mark Cerny interview, the APIs are still not optimized for HSA-like programming; when that happens, expect a much higher level of performance from the APU.

ProjectVulcan 4053d ago (Edited 4053d ago)

Various little inaccuracies in the report. All you need to know, I guess, is that the core is maybe 15 percent slower than a 7970M because of cut-down CUs and clocks, and less than half the performance of a full-blown standard desktop 7970, i.e. not the GHz Edition, which is even faster.

Memory bandwidth is about the same as the 7970M's, because I have seen 20 GB/s or so claimed to be used by the CPU, which is roughly what you would expect.

But we've known this for some time now...

hesido 4053d ago

[Because the RAM is shared] "the amount of [bandwidth] that the GPU will have ‘available’ will likely be considerably lower than this."

This is not accurate. It won't be "considerably" lower, as the CPU's bandwidth requirements will be MUCH lower than the GPU's, so the GPU will have the lion's share of the bandwidth.

Plus, it makes me think they didn't exactly grasp what "unified" RAM means. The unified RAM will allow zero-copy operations between the CPU and GPU: either one's output will be accessible by the other without having to move data around, which is a MASSIVE gain and will open up a lot of possibilities further down the PS4 life cycle. As OpenCL and HSA libraries mature, the PS4 will continue to offer performance gains that currently few can imagine.
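To make the zero-copy point concrete, here is a small Python sketch. It is only an analogy (NumPy arrays standing in for CPU-visible and GPU-visible memory), not actual PS4 or HSA code.

```python
import numpy as np

# Discrete-GPU style: CPU and GPU each have their own memory pool,
# so handing data to the GPU means a full copy across the bus.
cpu_buffer = np.zeros(1_000_000, dtype=np.float32)
gpu_buffer = cpu_buffer.copy()   # explicit transfer; costs time and bandwidth

# Unified-memory style: both processors address the same physical pool,
# so "handing over" data is just sharing a reference. Nothing moves.
shared_buffer = np.zeros(1_000_000, dtype=np.float32)
gpu_view = shared_buffer         # zero-copy: same bytes, no transfer

gpu_view[0] = 42.0
print(shared_buffer[0])          # 42.0 -- the "CPU side" sees the write immediately
```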

nunley33 4053d ago

@gaming101 If rumors are to be believed, the 720 will have Blu-ray. The 360 struggled a lot with many games on DVD-9, especially in the second half of the generation, so how bad would it be with much bigger 720 games? It'll have Blu-ray for sure, even if it's like the Wii U with no Blu-ray movie playback supported.

TheXonySbox 4053d ago

Confirmed: next-gen consoles holding back PC gaming yet again.

MysticStrummer 4053d ago (Edited 4053d ago)

Has the 720 GPU been confirmed? If not, can we hold off on these articles until it has been?

Having said that, lol @ DatNJDom81's script.

@TheXonySbox - Get your PC brethren to pay full price for more games and that might change. Consoles don't hold back PC gaming. PC gamers hold back PC gaming with piracy and buying new games at incredible discounts. Besides that, people who pimp out a gaming PC overestimate how many other people actually do that. Most PC gaming is done on stock PCs or PCs with minimal upgrades. Devs aren't going to put in too much extra effort so that relatively few people get marginally better graphics.

Muerte2494 4053d ago (Edited 4053d ago)

Gamers say that consoles are holding PC games back. It's kinda hard to hold the PC back when games don't get released on PC until 6 months after consoles have them (mainstream games). Anyone remember Super Street Fighter? Did Gears 2 or 3 even make it to PC? I'm mentioning Xbox 360 games because the architectures were similar.

PLASTICA-MAN 4053d ago

The article is just biased, just downplaying the PS4. If it were objective, it would mention this:

The PS4 GPU is based on the GCN 2.0 architecture. The PS4's GCN has 8 ACEs, each capable of running 8 CLs (compute queues). Tahiti is 2 per ACE, with 2 ACEs.

http://www.neogaf.com/forum...

And it was recently confirmed by Mark Cerny:

"This idea has 8 pipes and each pipe(?) has 8 computation queues. Each queue can execute things such as physics computation middleware and other proprietarily designed workflows, all while simultaneously handling graphics processing."

http://www.neogaf.com/forum...

That's a serious amount of processing power and units that the article forgot to mention. This alone will make the PS4 outshine the competition.

darthv72 4053d ago

All I see on here is:

blah blah blah, blah blah blah blah.

No, it's blah blah blah, NOT blah blah blah!

You all just need to be patient for the final release. Then it will get torn down and analyzed piece by piece by every tech site on the known internet.

The current PS3 and 360 offer good games. The 720 and PS4 will no doubt offer good games too, but for that to happen, we need to let these guys work out the details of what this does and where that goes.

Is that too much to ask?

ATi_Elite 4048d ago (Edited 4048d ago)

Can someone please inform me when SONY confirmed an HD 7970M for the PS4?

Oh wait, they didn't, so this article is FAIL!

The ONLY GPU spec we got was 1.84 TFLOPS of compute performance, which puts it nicely at HD 7790 level (1.79 TFLOPS); when you add the additional 4 compute units that the PS4 version will have, you may get 1.84 TFLOPS.

(The HD 7970 is 3.79 TFLOPS)
(The HD 7970M is 2176 GFLOPS, yes GFLOPS)

The PS4 is NOT getting an HD 7970 or HD 7970M. Way to make stuff up, stupid website!

The HD 7790 is a low-wattage card with power-tuning software that manages power consumption, and it is most likely what is going into the PS4.

Now, granted, Sony has NOT confirmed anything, but going by factual SPECS this seems more realistic.

I'm so amazed how these websites just make crap up to get hits from consolers who just don't check facts as they are too busy glorifying any article that promotes their console as a hero machine.

This website just made up this HD7970m CRAP!

Story = WTF
Like website = NO

iGAM3R-VIII 4053d ago

Well, even though the desktop version seems only slightly better, it means that the console will be easier to push to its limit, which means better-quality games.

zmack 4053d ago (Edited 4053d ago)

Well, the desktop version isn't exactly "slightly" better. Both AMD and Nvidia tend to tone down their mobile GPUs a bit compared to their desktop counterparts and name them accordingly for marketing purposes. Basically, they want the consumer to see a 7970 or 680 desktop card, and then when said consumer checks out a laptop they will see a 7970 and a 680, but the catch is the added M at the end. However, Nvidia did release a 680MX variant, and it has the same CUDA core count as the desktop version. So, yeah, both companies love to play on names like this, but their mobile cards are still pretty decent.

Stream processors can make a big difference in performance. The 7970M has 1280, which is a bit less than the 2048 found on the 7970. So, I'm sure you have seen a lot of people compare the PS4 GPU to a 7870 desktop version; well, that's because it has 1280 stream processors just like the 7970M.

http://www.newegg.com/Produ...

http://www.notebookcheck.ne...

So, let's compare the 7870 to a 7970 in benchmarks:

http://www.guru3d.com/artic...

http://www.anandtech.com/sh...

There can be a 10-20 fps difference when comparing the two cards. So, the 7970 can be a bit more powerful than the 7870, depending on the game. There's also a GHz Edition of the 7970 desktop GPU with even higher clocks as well. Furthermore, the GPU in the PS4 is running at lower clock speeds than the 7870, so there is a bit more of an fps gap than what was previously stated (7870 = 1 GHz and 7970M = 850 MHz).

However, the great news is that a 7970M is still an awesome GPU and it can run a lot of games at some nice settings. So, Sony still really picked out some nice hardware for their system.
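A quick way to see how the shader counts and clocks above interact is to work out each card's theoretical peak. The figures below are the commonly quoted specs (desktop 7970 at 925 MHz, PS4 at 1152 shaders and 800 MHz), not anything confirmed in this thread.

```python
# Peak single-precision throughput = shaders x 2 FLOPs (one FMA) x clock.
# Spec figures are the commonly quoted ones and may differ by SKU.
cards = {
    "HD 7970 (desktop)":  (2048, 925e6),   # 2048 SPs @ 925 MHz
    "HD 7870 (desktop)":  (1280, 1000e6),  # 1280 SPs @ 1 GHz
    "HD 7970M (mobile)":  (1280, 850e6),   # 1280 SPs @ 850 MHz
    "PS4 GPU (reported)": (1152, 800e6),   # 1152 SPs @ 800 MHz
}

for name, (shaders, clock_hz) in cards.items():
    tflops = shaders * 2 * clock_hz / 1e12
    print(f"{name:<20} {tflops:.2f} TFLOPS")
# -> roughly 3.79, 2.56, 2.18 and 1.84 TFLOPS respectively
```

Those numbers line up with the 3.79 TFLOPS and 2176 GFLOPS figures quoted earlier in the thread.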

JsonHenry 4053d ago

Can someone explain to me how the next Xbox's "move engines" work and how they will change the performance of the console?

Cueil 4053d ago (Edited 4053d ago)

The purpose of the move engines is basically to increase the productivity of the available bandwidth. One of the things that you have to understand is that the 1.2 TFLOPS of the next Xbox will actually be realized. Most GPUs hit 40-60 percent of their theoretical peaks; Microsoft uses a variety of methods to get near 100 percent of the theoretical limit. And remember, the PS2 could do more flops than the original Xbox... I think we all understand how much more powerful the Xbox was than the PS2... it's about more than flops; that's just the new "bits" of the past two generations.

(Was that too dumbed down? I'm really tired, but in reality we won't know exactly how much effect they'll have till the specs are out and people get their hands on it outside of the NDAs.)

NewMonday 4053d ago

From what I understand, they are for getting around bottlenecks. The PS4 GPU has a cache bypass that does something like it.

JsonHenry 4053d ago

No, that explained it well enough for me. I have a novice's understanding of PC hardware and the like (been building my own for 15 years); I just was not sure how the move engines were being used. I guess just making the system more efficient would help more than adding raw throughput, since the flops-of-performance thing is generally way overrated in terms of what is "capable" vs. what is "actual". Thank you for your reply. +1 bubble

THEDON82z1 4052d ago (Edited 4052d ago)

@Cueil: True what you said, but Microsoft's problem will be the same one the PS3 HAD. It will take heavy optimization (time/resources = MONEY), and UNLESS Microsoft builds up some REALLY good first-party studios to take advantage OF IT LIKE SONY did with the PS3, then it will really just become another roadblock in developers' way when trying to get their game out in a timely manner. As I said in another post, the slight advantage they had by designing this system was killed when Sony doubled its GDDR5 RAM. You also have to remember that even with Sony pumping all that money into first-party R&D, it still took a lot of time for developers to really harness it. The result was that multiplats on the PS3 struggled (in the beginning) until they were built on PS3 first and then ported to Xbox. That's what I see coming for the next Xbox when everyone starts asking about 1080p/60fps as a standard. The only difference will be that I believe developers will, for the most part, stick with PS4/PC as the lead development platform.

Cueil 4052d ago

What people don't seem to get is that programming for the X720 is not going to be any different than what they were doing with the 360, only now porting games to and from the PS4 will be easier, since they share the same hardware and programming language and both use Microsoft Visual Studio, if I'm not mistaken... a great move by Sony if they were not already doing so. Microsoft isn't building some kind of complex piece of hardware; it's simply creating efficiency, and that seems to be the direction Microsoft and AMD have been moving towards. I'm not going to pretend to know the real specs of the Xbox, but my guess is that the system is being built so that programmers can access its power at or near the theoretical limits.

THEDON82z1 4052d ago

@Cueil: I hope you are right, because if the rumors are true and 3 out of 8 gigs of RAM are for the OS, they are going to need every trick they can get once next-gen games start to mature around 2014/15. I don't want to see my PS4 (MULTIPLATFORM) GAMES held back in any way.

Skynetone 4053d ago (Edited 4053d ago)

While the Xbox is busy doing calculations, the move engines are free to add even more calculations.

It's like a turbo boost.

What does it mean in the real world? I guess we'll have to wait for Halo 5.

DeadlyFire 4053d ago (Edited 4053d ago)

Move engines do provide a boost, but still not to the same level. It will be slower. 32 MB of ESRAM, with move engines waiting in line with data at 100 GB/s, can only do so much, while the PS4's full 8 GB can run at 176 GB/s.
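To put those two figures side by side, here is a rough per-frame budget at a 60 fps target. It simply divides the quoted peak bandwidths; real-world utilisation will be lower on both machines, so treat it as an upper bound only.

```python
# Rough per-frame bandwidth budget at 60 fps, using the quoted peak figures.
FPS = 60
frame_time = 1.0 / FPS                 # ~16.7 ms per frame

esram_bw   = 100e9                     # rumoured ESRAM bandwidth, bytes/s
gddr5_bw   = 176e9                     # PS4 GDDR5 bandwidth, bytes/s
esram_size = 32 * 1024**2              # 32 MB

print(f"ESRAM bytes moved per frame : {esram_bw * frame_time / 1e6:.0f} MB")
print(f"GDDR5 bytes moved per frame : {gddr5_bw * frame_time / 1e6:.0f} MB")
print(f"ESRAM can be fully rewritten ~{esram_bw * frame_time / esram_size:.0f}x per frame")
```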

Cueil 4053d ago

I understand that the numbers look pretty, but bandwidth is a lot more complex than some of you seem to understand. I'm by no means an expert, but there are all kinds of issues with just throwing up arbitrary numbers without understanding all the parts and technology behind those numbers. Do we know how much memory and how many cycles the OS is going to take up, or how much is going to be reserved for the social features of each console? That stuff takes up bandwidth.

Mkai28 4052d ago

The Xbox 720 has four move engines of its own, which allow fast direct memory access to take place.

Their true purpose is to take workloads off the rest of the system while yielding positive results at low cost. AKA the "secret sauce".

The four move engines can copy: from main RAM or from ESRAM, to main RAM or to ESRAM, from linear or tiled memory format, to linear or tiled memory format, from a sub-rectangle of a texture, to a sub-rectangle of a texture, from a sub-box of a 3D texture and to a sub-box of a 3D texture.

Each of the four move engines can read and write 256 bits of data per GPU clock cycle, which works out to a peak throughput of 25.6 GB/s each way.

All of the engines share a single memory path, so the combined peak throughput of all four engines is the same as that of a single engine.

They share their bandwidth with other components of the GPU, like video encode and decode, the command processor and the display output. Those other sources typically only consume a small amount of the bandwidth.

The great thing about the move engines is that they can operate at the same time as computation is taking place. When the GPU is busy with computation, move engine operations are still available; and while the GPU is using bandwidth, move engine operations can still run so long as they use different pathways.
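The 25.6 GB/s figure above follows directly from 256 bits per GPU clock, assuming the widely rumoured ~800 MHz GPU clock (the clock is an assumption, not something stated in this thread):

```python
# Where the 25.6 GB/s move engine figure comes from (assumed 800 MHz GPU clock).
bits_per_clock = 256
bytes_per_clock = bits_per_clock // 8     # 32 bytes per cycle
gpu_clock_hz = 800e6                      # rumoured GPU clock

peak_gbps = bytes_per_clock * gpu_clock_hz / 1e9
print(f"Peak throughput, each direction: {peak_gbps:.1f} GB/s")  # 25.6 GB/s
# Note: this peak is shared across all four engines, since they use one memory path.
```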

DragonKnight 4053d ago

I think it's great that redgamingtech.com knows what's in the Xbox 720 when no one else does.


AMD gaming revenue declined massively year-over-year, CFO says the demand is 'weak'

Poor Xbox sales have affected AMD's bottom line

Read Full Story >>
tweaktown.com
RonsonPL 5d ago

Oh wow. How surprising! Nvidia overpriced their RTX cards by +100%, and AMD, instead of offering real competition, decided to join Nvidia in their greedy approach while not having the same mindshare that Nvidia (sadly) does. The 7900 launch was a marketing disaster. All the reviews were made while the card was not worth the money at all; they lowered the price a bit later on, but not only was it not enough, it also came too late and outside the "free marketing" window that comes with a new card generation's release. Then the geniuses at AMD axed the high-end SKUs with increased cache etc., because "nobody will buy expensive cards to play games", while Nvidia laughed at them selling their 2000€ 4090s.
Intel had all the mindshare among PC enthusiasts with their CPUs. All it took was a competitive product and a good price (the Ryzen 7000 series and especially the 7800X3D) and guess what? AMD regained market share in DIY PCs in no time! The same could've happened with Radeon 5000, Radeon 6000 and Radeon 7000.
But meh. Why bother. Let's cancel high-end RDNA 4 and use the TSMC wafers for AI, and then let the clueless "analysts" write their articles about "gaming demand dwindling".

I'm sure a low-end, very overpriced and barely faster, if not slower, RDNA 4 will turn things around. It will have AI and RT! Two things nobody asked for, especially not gamers who'd like to use the PC for what's most exciting about PC gaming (VR, high-framerate gaming, hi-res gaming).
The 8000 series will be slow, overpriced and marketed on its much-improved RT/AI... and it will flop badly.
And there will be no sane conclusions drawn at AMD about that. There will be just one, insane one: "Gaming is not worth catering to. Let's go into AI/RT instead, what could go wrong..."

Crows90 4d ago

What would you say would be the correct pricing for new cards?

Very insightful post!

RonsonPL 4d ago

That's a complicated question. It depends on what you mean: the pricing at the release date, or the pricing planned ahead. They couldn't just suddenly end up in a situation where their existing stock of 6000 cards is unsellable, but if it was properly rolled out, prices should be where they were while the PC gaming industry was healthy. I recognize the arguments about inflation, higher power draw, PCB/BOM costs, more expensive TSMC wafers etc., but still, PC gaming needs some sanity to exist and stay healthy. The past few years were very unhealthy and dangerous for the whole of PC gaming. AMD should recognize that this market is very good for them, as they have an advantage in gaming software, while other markets, however attractive in the short term, may just be too difficult to compete in. AI is the modern-day gold rush, and Nvidia and Intel can easily out-spend AMD on R&D. Meanwhile, gaming is tricky for newcomers, and Nvidia doesn't seem to care that much about gaming anymore. So I would argue that it should be in AMD's interest to even sell some Radeon SKUs at zero profit, just to prevent PC gaming from collapsing.

Cards like the 6400 and 6500 should never have existed at their prices. This tier was traditionally "office only" and priced at $50 in the early 2000s. Then we have the Radeon 7600, which is not really a 6-tier card; those were traditionally quite performant cards based on a wider-than-128-bit memory bus, and 8 GB screams "low end". So I'd say the 7600 should have been available at below $200 (+taxes etc.) as soon as possible, at least for some cheaper SKUs.

For faster cards, the situation is bad for AMD, because people spending $400+ are usually fairly knowledgeable and demanding. While personally I don't see any value in upscalers and RT for $400-700 cards, the fact is that DLSS in particular is a valuable feature for potential buyers. Therefore, even the 7800 and 7900 cards should be significantly cheaper than they currently are. People knew what they were paying for when buying a Radeon 9700, 9800, X800, 4870 etc.: they were getting a gaming experience truly unlike console or low-end PC gaming. By all means, let's have expensive AMD cards, even above $1000, but first AMD needs to show value and make the product attractive. PS5 consoles can be bought at $400. If AMD offers just a slightly better upscaled image on a $400 GPU, or their $900 GPU cannot even push 3x as many fps as a cheap console, the pricing acts like a cancer on PC gaming. And poor old PC gaming can endure only so much.

MrCrimson 4d ago

I appreciate your rant, sir, but it has very little to do with GPUs. It is the fact that the PS5 and Xbox are at the end of their cycle, before a refresh.

RonsonPL 4d ago

Yes, but also no. AMD let their PC GPU market share shrink by a lot (and accidentally helped the whole market shrink in general due to the bad value of PC GPUs over the years), and while their console business may be important here, I'd still argue that the profits from their GPU division could've been much better if not for mismanagement.

bababooiy 4d ago

This is something many have argued over the last few years when it comes to AMD. The days of them selling their cards at a slight discount while having a similar offering are over. It's not just a matter of poor drivers anymore; they are behind on everything.

Tody_ZA 4d ago (Edited 4d ago)

Great post. I went for an Nvidia RTX 3060 Ti, which was insane value for money when I look at the fidelity and frame rates I can push in most games, including new releases. I can't justify spending three times what my card cost at the time to get marginally better returns or the big sell of "ray tracing", which is a nice-to-have feature but hardly essential given what it costs to maintain.

KwietStorm_BLM 4d ago

Well, that's gonna happen when you don't really try. I want to support AMD so badly and give Nvidia some actual competition, but they don't seem very interested in challenging, of their own accord. I've been waiting for them to attack the GPU segment the same way they took over CPUs, but they just seem so content with handing Nvidia the market year after year, and it's happening again this year with their cancelled high-end card.

MrCrimson 4d ago

I think you're going to see almost zero interest from AMD or Nvidia in the gaming GPU market. They are all in on AI.

RhinoGamer88 4d ago

No Executive bonuses then...right?

enkiduxiv 4d ago

What are you smoking? You've got to lay off your way to those bonuses. Fire 500 employees right before Christmas. That should get you there.

Tapani 4d ago (Edited 4d ago)

Well, if you are 48% down in Q4 in your Gaming sector, as they are, which in absolute money terms is north of 500M USD, then you are not likely to get your quarterly STI at least, but you may still be eligible for the annual STI. The LTI may be something you are still eligible for, such as RSUs or other equity and benefits, especially if they are based on the company's total result rather than your unit's. It all depends on your contract and AMD's reward system.

MrCrimson 4d ago

Lisa Su took AMD from bankruptcy to one of the best semiconductor companies on the planet. AMD went from 2 dollars a share to 147. She can take whatever she wants.

Tapani 4d ago

You are not wrong about what she did for AMD, and that is remarkable. However, MNCs' reward schemes do not work like "take whatever you want because you performed well in the past".

darksky 4d ago

AMD priced their cards thinking that they would sell out just like in the mining craze. I suspect reality hit home when they realized most gamers cannot afford to spend over $500 on a GPU.


Make your next GPU upgrade AMD as these latest-gen Radeon cards receive a special promotion

AMD has long been the best value option if you're looking for a new GPU. Now even their latest Radeon RX 7000 series is getting cheaper.

Father__Merrin 14d ago

The best for the money are the Arc cards.

just_looken 14d ago

In the past, yes, but last-gen AMD has gotten cheaper, and their new cards are on the horizon, making the 6000 series even cheaper.

The Arc cards are no longer made by Intel, but ASUS/ASRock have some, and the next line, Battlemage, is coming out, prices TBD.

Due to the longer software development, it's always best to go AMD over Intel if it's not too much more money, even though Intel makes a strong GPU (I own 2/4 card versions).


AMD FSR 3.1 Announced at GDC 2024, FSR 3 Available and Upcoming in 40 Games

Last September, we unleashed AMD FidelityFX™ Super Resolution 3 (FSR 3) on the gaming world, delivering massive FPS improvements in supported games.

Read Full Story >>
community.amd.com
Eonjay 45d ago (Edited 45d ago)

So, to put 2 and 2 together... FSR 3.1 is releasing later this year, and the launch game to support it is Ratchet and Clank: Rift Apart. In Sony's DevNet documentation, it shows Ratchet and Clank: Rift Apart as the example for PSSR. The PS5 Pro also launches later this year... but there is something else coming too: AMD RDNA 4 cards (the very same technology that's in the Pro). So, PSSR is either FSR 3.1 or a direct collaboration with AMD that builds on FSR 3.1. Somehow they are related. I think PSSR is FSR 3.1 with the bonus of AI... now let's see if RDNA 4 cards also include an AI block.

More details:
FSR 3.1 fixes Frame Generation.
If you have a 30-series RTX card you can now use DLSS 3 with FSR Frame Generation (no 40 series required!).
It's available on all cards (we assume it will come to console).
Fixes temporal stability.

MrDead 45d ago

I've been using a mod that allows DLSS frame gen on my 3080; it works on all RTX series. It'll be good not to have to rely on mods in the future.

darksky 44d ago

The mods available are actually using FSR 3 frame gen, but with DLSS or FSR 2 upscaling.

Babadook7 44d ago (Edited 44d ago)

I think that the leaks about the 5 Pro would debunk the notion that the two (FSR 3.1 and PSSR) are the same technology. PSSR is a Sony technology.

MrDead 45d ago (Edited 45d ago)

I wonder how much they've fixed the ghosting in dark areas, as Nvidia is leaving them in the dust on image quality. Still, it's good that they are improving in big leaps. I'll have to see who I go with when the RTX 5000 series is released... at the moment the RTX 5000s are sounding like monsters.

just_looken 45d ago

Did you see the Dell leaks where they are trying to cool cards using over 1k watts of power?

We are going to need 220V lines for next-gen PCs lol

MrDead 45d ago

That's crazy! Sounds like heating my house won't be a problem next winter.

porkChop 44d ago

As much as I hate supporting Nvidia, AMD just doesn't even try to compete. Their whole business model is to beat Nvidia purely on price. But I'd rather pay for better performance and better features. AMD also doesn't even try to innovate. They just follow Nvidia's lead and make their own version of whatever Nvidia is doing. But they're always 1 or 2 generations behind when it comes to those software/driver innovations, so Nvidia is always miles ahead in quality and performance.

MrDead 44d ago

I do a lot of work in Photoshop, so an Intel/Nvidia setup has been the go-to because of the performance edge; more expensive, but far more stable too. Intel also has the edge over AMD processors with better load distribution across the cores, fewer spikes and less jitter. When you're working in large format, you don't want lag or spikes while you're editing or drawing.

I do think AMD has improved massively though, and whilst I don't think they threaten Nvidia on the tech side, they do make very well-priced cards and processors for the power. I'm probably going with a 5080 or 5090, but AMD will get a little side look from me, which is a first in a long time... but like you said, they are a generation or two behind at the moment.

Goosejuice 44d ago

While I can't argue for AMD GPUs, they aren't bad, but they aren't great either. AMD's CPUs have been great, though. I would argue the 7800X3D is one of the best CPUs for gaming right now. Idk about editing, so I'll take your word for that, but for gaming an AMD CPU is a great option these days.

porkChop 43d ago

@Goosejuice

I have a 7800X3D. It certainly is great for gaming. Though for video editing, rendering, etc., I think Intel has the advantage, from what I remember. I just mean that from a GPU standpoint I can't support them.