PS4 Separate Memories Would've Been Preferred, GPU Is Here To Stay, SDK Will Make Console Faster

"When the PlayStation 4 was first announced, several games from the free to play and other genres were ported to the new console. Some of those games reportedly took only a few months to port on the PlayStation 4."

incendy35 (2506d ago, edited)

Interesting comment. Is he saying GDDR5 isn't ideal for certain memory access patterns? Is it a latency issue?

@Rashid: From looking at the diagrams the accessibility paths are unique, is that not the case? It is the same memory but it looks like GPU/CPU can access simultaneously, the bridges do not cross. GPU enters from one direction "at full speed", CPU from the other bridge "which is much slower". I would think for Unified Memory to truly work well this would always be the case. Otherwise it seems GPU would saturate the smaller "CPU Bus" bandwidth big time.

MasterCornholio (2506d ago)

I heard about that before but with the way the PS4 is playing multiplats I wouldn't worry about it.

GameDev1 (2506d ago)

I don't think it is a cause for worry; as he said, it is very easy to develop for the PS4. I think it's more a matter of developer preference.

NewMonday (2506d ago)

It's about know-how. ND gave a presentation on the best use of PS4 memory; they should check it out.


This has to do with RAM, not graphic effects. RAM-intensive tasks such as textures, draw distance, and character models are equal between X1 and PS4. The PS4 is superior on GPU resources such as resolution and a steadier framerate, as well as better graphic effects in general like lighting and shadows. If the PS4's RAM really is more than twice as fast, it doesn't show in texture loading; Battlefield 4, Infamous, and Watch Dogs all suffer the same texture and object pop-in as on X1. :/

dantesparda (2506d ago, edited)

@ Gutz

The PS4's RAM is faster than the X1's, period! The GDDR5 has a bandwidth of 176GB/s vs 68GB/s for the X1's DDR3, and the eSRAM, while it helps, is still only 32MB, which causes its own problems/bottlenecks. And your assumptions about texture loading are wrong, because the reason for the texture pop-in is the HDD, not the RAM speed.
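The 176GB/s and 68GB/s figures being traded here fall straight out of bus width times effective transfer rate. A quick sketch (the clock rates are the commonly reported retail figures, so treat them as assumptions):

```python
# Peak theoretical bandwidth = bus width in bytes * effective transfer rate.
def peak_bandwidth_gb_s(bus_bits, transfers_per_sec):
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return (bus_bits / 8) * transfers_per_sec / 1e9

# PS4: 256-bit GDDR5 at an effective 5.5 GT/s
ps4_gddr5 = peak_bandwidth_gb_s(256, 5.5e9)    # 176.0 GB/s

# Xbox One: 256-bit DDR3-2133 at an effective 2.133 GT/s
x1_ddr3 = peak_bandwidth_gb_s(256, 2.133e9)    # ~68.3 GB/s
```

The eSRAM adds a separate, much smaller on-die pool on top of the DDR3 figure, which is exactly what the rest of this argument is about.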

alexkoepp (2506d ago)

It's old news; that is why Xbox chose their memory solution: DDR3 works better for their OS.


"The PS4's RAM is faster than X1's period!"

That is just not true. You conveniently leave out the eSRAM, because it shows your statement is false, and then justify it by saying it's a bottleneck. That isn't what developers are saying. We already know for a fact that the 32MB of eSRAM can deal with 6GB of textures; it may be a small quantity, but how you use it is everything. Time will tell on this one; we'll see what happens. Thus far, though, we are watching a closing gap (Ghosts at 1080p vs 720p at the beginning of the cycle, to now Wolfenstein at 1080p vs 1080p and Watch_Dogs at 792p vs 900p), not a widening one. The fact is the most visually stunning games on both platforms, Ryse and Killzone, both look spectacular. We are going to end up with phenomenally great-looking games on both platforms, and it comes down to one's budget and taste which they want. I personally own both systems, but I can say at this point (and likely going forward, because of the known franchises that will see sequels) that I wouldn't be missing anything by selling my PS4. But I do own it, so I'll get a few games for it here and there.

Go look again at the bandwidth of the PS4's GDDR5 vs the X1's 8GB DDR3 + 32MB eSRAM. You'll find out you are indeed wrong.

Ju (2506d ago)

Wolfenstein isn't 1080p vs. 1080p. It uses a dynamic framebuffer up to 1920x1080, where the X1 version can drop to 700 columns while the worst on the PS4 is 1720 or so. That is in fact about half the PS4 resolution under heavy load, and pretty much what other games indicate. Of course it's hardly visible, because lower resolution is less pronounced in motion, and it reaches the full 1920x1080 in idle frames.
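Taking the column counts in this comment at face value (they are the commenter's figures, not verified numbers), the pixel math works out like this:

```python
# Fraction of a native 1080p image rendered at each dynamic-resolution
# floor. The 700- and 1720-column floors are the figures quoted above.
FULL_1080P = 1920 * 1080              # 2,073,600 pixels

def pixel_fraction(columns, rows=1080):
    return columns * rows / FULL_1080P

x1_floor  = pixel_fraction(700)       # ~0.36 of the native pixel count
ps4_floor = pixel_fraction(1720)      # ~0.90 of the native pixel count
```

So under load the X1 floor would be pushing roughly 40% of the pixels the PS4 floor does, which is what "half the resolution" means in loose terms.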

eSRAM can't keep up with GDDR5. It can match it for small amounts of data, and with a tricky tiled setup that bandwidth is often not required for the full texture, which is where all this comes into play. But in no way will it ever support, e.g., 200MB render targets at full bandwidth and resolution.
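A rough sizing exercise shows why 32MB is tight for full-resolution render targets. The G-buffer layout below is illustrative, not any particular engine's:

```python
# Approximate size of a 1080p deferred-rendering G-buffer.
W, H = 1920, 1080

def target_mb(bytes_per_pixel):
    """Size of one W x H render target, in MiB."""
    return W * H * bytes_per_pixel / (1024 * 1024)

gbuffer = [
    ("albedo, RGBA8",    4),   # 4 bytes per pixel
    ("normals, RGBA16F", 8),   # 8 bytes per pixel
    ("material, RGBA8",  4),
    ("depth, D32",       4),
]
total_mb = sum(target_mb(bpp) for _, bpp in gbuffer)
# total_mb comes out around 39.6 MB -- already over 32 MB before any
# shadow maps or intermediate buffers are counted, hence the tiling tricks.
```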

Also, for these guys, if you are porting games, this might well be true. But an engine designed for UMA would simply not move any memory around, and thus actually take advantage of the shared memory; by the sound of it, that is not the case with their engine.

falviousuk (2506d ago, edited)

GDDR5 is good for graphics, which is why it is used on graphics cards, while DDR3 is better suited to more general CPU-intensive code.

As far as I believe, anyway.

gameseveryday (2506d ago)

@incendy35 Yes, this is indeed an interesting comment, since in practice developers will prefer different accessibility paths to CPU and GPU for better performance.

This is something Cerny discussed during their PS4 development meetings: having separate eSRAM for the PS4. But they went unified instead.

nypifisel (2506d ago)

This has been debunked so many times; the PS4's memory isn't latency-burdened. The main reason DDR3 is used as primary memory in computers is price.

fr0sty (2506d ago)

There are benefits and drawbacks to unified memory. The drawbacks are what he mentioned, the benefits are when CPU and GPU are working together on the same data. That data does not have to be copied into the other device's memory pool first, which saves a heck of a lot of time in the grand scheme of things. In gaming, the CPU and GPU team up like that quite often, so it isn't a "split memory is better, period" situation by any means. Perhaps for their particular game, it works better, which would make sense since the engine was built for that kind of setup. However, as the console cycle continues on, GPGPU functions and CPU/GPU interaction will increase, so the benefits of unified memory will become more apparent.
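The copy cost fr0sty describes is easy to put numbers on. A back-of-envelope sketch, assuming an illustrative PCIe-3.0-class link of 16GB/s between separate pools (the link speed is an assumption, not a measured figure):

```python
# Time to shuttle a buffer between separate CPU and GPU memory pools,
# versus a unified pool where the copy simply never happens.
def copy_ms(megabytes, link_gb_s=16.0):
    """Milliseconds to move `megabytes` across a `link_gb_s` GB/s link."""
    return megabytes / 1024 / link_gb_s * 1000

frame_budget_ms = 1000 / 60     # ~16.7 ms per frame at 60 fps
cost_ms = copy_ms(200)          # ~12.2 ms for a 200 MB buffer
# Copying one large buffer per frame would eat most of the frame budget;
# with unified memory the CPU and GPU just pass pointers instead.
```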

IcicleTrepan (2505d ago)

Honestly, people aren't thinking with their brains on this one. If GDDR5 were ideal for all situations, PCs would already use GDDR5 only. But they don't.

elazz (2506d ago)

And in the end this decision turned out great for first-, second-, and third-party developers alike.

XiSasukeUchiha (2506d ago)

Best decision from Sony ever for better-looking multiplats.

lifeisgamesok (2506d ago)

With how small the performance gain on multiplats is, I'd say it doesn't matter.

Most people, even gamers, can't tell the difference in motion.

Everyone thought the UFC demo was running at 1080p, only to find out it was 900p once Digital Foundry did their analysis.

DigitalRaptor (2505d ago)

@ Lifeisgamesok

So you are saying that being ignorant negates the power and performance differences between the consoles?

Nice to know.

gameseveryday (2506d ago)

@incendy35 I don't have the diagram in front of me, but what I can say is that being unified means there is probably going to be a one-dimensional way of accessing data [which is not a bad thing]. All the calls from the CPU and GPU go through a single memory.

incendy35 (2506d ago)

So the entry point is the issue: the two bridges meet and enter at one point but have to wait on each other for access? That explains why it would be both good and bad. Good in that they can access the same pointers, but bad in that they have to share the entry point even when they are not updating the same bits.

sinspirit (2506d ago)


CPUs transfer data through threads, but when the CPU and GPU access RAM they go through a bus. Most buses nowadays are 128-bit, but the PS4's is 256-bit, so twice the amount of data can be transferred at the same time. There is plenty of access to RAM, and both can access it simultaneously.

Psygnosis333 (2505d ago)


quad channel memory?

Tempest317 (2506d ago)

I honestly read it as caused by the architectural changes required to unify the whole system. I.e., if they had 8GB of GDDR5 but split it into a dedicated 3GB for the OS and 5GB for games, it would still be the same type of RAM, but it wouldn't be using multiple kinds of access patterns on the same pool. That's just what I read into it.

fr0sty (2506d ago, edited)

"On the other hand even making a PC exclusive game you can’t target high end configurations, so it doesn’t matter in practice.”

Which shoots down every "My PC will smoke your PS4" argument... the developers aren't even targeting your PC.

That isn't to say I don't like PC gaming; the mod community is something you'll never find on consoles. I just find some of the PC master race comments silly.

LamerTamer (2506d ago, edited)

"the developers aren't even targeting your PC"

And the big reason is that they have no idea what "your PC" is. It can be infinite combinations of CPU, GPU, RAM etc. Developers CAN target all of our PS4s though as they all have the same hardware. That is why when someone says "the PS4 is just a mid range PC", it is false. They can't go to the metal on a PC, they can on a console. They just brute force it on the PC by throwing lots of hardware at it. In the end the PC comes out ahead because they are so powerful, but the consoles aren't as weak as the specs on paper would indicate.

Father__Merrin (2506d ago)

Many of them are downright stupid, IMO.

ChickeyCantor (2505d ago, edited)

" They can't go to the metal on a PC"

They actually can. That's why both OpenGL and DirectX are getting an overhaul. Both AMD and Intel have common instruction sets to improve speed. Granted, some CPUs don't have certain instruction sets and some do, so it comes down to the speed at which data can be sent between hardware.

The only advantage consoles have is that they can be tweaked more, since you're working with one specific hardware set. That does not mean PC games do everything through layers upon layers of high-level APIs.

Visiblemarc (2506d ago)

Nearly every top dev has been asking for unified memory. You can't please everyone.

BadlyPackedKeebab (2506d ago)

The ultimate (realistic) system that could have been put out would most likely be something like 5GB DDR3, 3GB GDDR5, and 512MB of eSRAM. You can swing the balance between the GDDR and DDR however you like; you will never please everyone, hence the pressure to go unified.

We could then use the main bulk of the GDDR5 for graphical work, a la PS4, but also have the added advantage of a usefully sized chunk of eSRAM for back-buffer work. Sucker Punch were quoted as having a 200MB back buffer in Second Son. Hopefully that makes people appreciate why the Xbox One's 32MB is such a problem.

MasterCornholio (2506d ago)

Does eSRAM have to be included in the APU? Because then they would have to reduce the size of the GPU, which results in a less capable system.

BadlyPackedKeebab (2506d ago, edited)

Realistically, yes it does. And yes, size is an issue. There are ways around it, such as layering the fabrication, but I suspect the tech was just too new and costly at the time the specs were being fixed. If MS knew then what they know now, they probably would have gone for 4GB DDR3 + 4GB GDDR5 and kept the 18 GPU cores. That would have made the Bone equal to, and potentially better than, the PS4.

They cut the graphical power to make way for the esram. They needed the esram because they needed to use cheap, cool and quiet ddr3 memory. They needed cool and quiet memory because of TV TV TV. The picture is quite clear when you follow the breadcrumbs.

Volkama (2506d ago)

But the 32MB of eSRAM on the One's APU is huge. That's why they got their 5 million transistors claim, or whatever they said when they introduced the console. It has been said by various tech analysis outlets that they could not include more even if they wanted to.

I'm not sure whether having eSRAM off-die would be worthwhile, if it's even possible. But I don't know; I'm parroting info.

Belasco (2506d ago)

It's 5 billion transistors.

BadlyPackedKeebab (2506d ago)

This is true. With the way the XBO APU is made, they could not fit any more in there. They could, however, look at alternatives such as an additional layer. As I say, though, the tech just isn't mature enough yet for consoles.

Volkama (2506d ago)

Ok. I knew it was more than 200, but I was guessing from there.

LamerTamer (2506d ago)

I wonder why they couldn't have just made the die a bit bigger in the Bone? I guess cost, as you would then get fewer dies per wafer, plus more defective chips maybe. It makes you wonder: if they hadn't insisted on that Kinect, maybe that extra $100 could have gone into a bigger/better SoC.

Psygnosis333 (2505d ago)

512MB of eSRAM? Lol, then the whole die would be used for eSRAM.

uth11 (2506d ago)

There's always a trade-off. But it doesn't seem like this decision causes too much of a bottleneck.

akurtz (2506d ago)

A well-designed console that'll bring great experiences for the time to come.

KazHirai (2506d ago)

I made sure that the PS4 will have the power to change the world.
