800°
Submitted by Abriael 433d ago | news

Mark Cerny on PS4: GDDR5 Latency Not "Much of a Factor"

There have been voices of concern around the internet about the PS4's 8 gigabytes of GDDR5 suffering from higher latency compared to DDR3, but the console's lead architect Mark Cerny isn't particularly worried about that. (Mark Cerny, PS4)

Need4Game  +   433d ago
GTA5 looks great on PS3, thus PS4 memory latency is not much of a factor.
Abriael  +   433d ago | Well said
I think the connection you made there kind of escapes me...
Crazyglues  +   433d ago | Helpful
@ Abriael

I think he was just saying that if the PS3, which is known to have memory problems, could still get a game like GTA5 made on it, then the PS4 will be fine, as some clever developers will find a way around any small latency issues that may or may not be a problem, and still make amazing games...

Now to go more in depth: I think we have to understand what latency is first, and how it is being talked about here with the PS4.

Latency is the time (in either clocks or nanoseconds) taken to transfer a block of data from main memory or the GPU caches. We want the data as quickly as possible, so the lower the time the better. The size of the data block we request is usually the size of a native pointer (4 bytes on 32-bit, 8 on 64-bit).

As a GPU (or APU) executes instructions, both the instructions themselves and the data they operate on must be brought into registers; until the instruction/data is available, the GPU cannot proceed and must wait; even advanced designs that can execute out-of-order eventually need data.

Latency is generally measured in core "clocks" (1/frequency) for caches (as they usually run at GPU speed) and in nanoseconds (10^-9 s) for main memory.

- TO MAKE IT SIMPLE - The latency of the main memory directly influences the efficiency of the GPU, and from what I've seen of the PS4, this will not be an issue.
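Crazyglues' point can be put into a toy formula. A minimal sketch in Python, with made-up numbers (the 176 GB/s figure is the PS4's advertised GDDR5 bandwidth; the 10 ns latency is purely illustrative): total transfer time is the wait before the first byte arrives plus the time to stream the rest.

```python
# Toy model of a single memory read (illustrative numbers, not real
# PS4 timings): total time = access latency + size / bandwidth.

def transfer_time_ns(block_bytes, latency_ns, bandwidth_gb_s):
    """Time to complete one read: wait for the first byte, then stream."""
    stream_ns = block_bytes / bandwidth_gb_s  # 1 GB/s moves ~1 byte/ns
    return latency_ns + stream_ns

# An 8-byte pointer-sized read is dominated by the latency term...
small = transfer_time_ns(8, latency_ns=10, bandwidth_gb_s=176)
# ...while a 64 KiB texture tile is dominated by the bandwidth term.
large = transfer_time_ns(64 * 1024, latency_ns=10, bandwidth_gb_s=176)

print(round(small, 3))  # 10.045 -- streaming adds only ~0.045 ns
print(round(large, 1))  # 382.4  -- latency is barely 3% of the total
```

For pointer-sized reads the latency term dominates completely, which is why latency matters to a CPU chasing pointers; for large blocks the bandwidth term dominates instead.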

#1.1.1 (Edited 433d ago ) | Agree(83) | Disagree(11) | Report
nukeitall  +   433d ago
@Crazyglues:

Memory latency was never an issue for the GPU, as GDDR5 is pretty much what graphics cards use as standard, so it doesn't get any worse there.

The key here is that it affects the CPU, and you might get frequent CPU pipeline stalls. Cerny cleverly deflects that by only talking about the GPU, and most journalists don't have the technical knowledge (for good reason) to ask the proper question.

In fact, on PC the graphics memory has to fetch its data from main memory first, so that indicates latency isn't an issue for the GPU. However, on PC the CPU deals with DDR3, which has significantly lower latency than GDDR5, whereas on PS4 the GDDR5 latency carries over to the CPU.

Cerny himself said:

"Latency in GDDR5 isn’t particularly higher than the latency in DDR3. On the GPU side… Of course, GPUs are designed to be extraordinarily latency tolerant so I can’t imagine that being much of a factor."

So it is a small factor on GPU, and likely much bigger on CPU!
#1.1.2 (Edited 433d ago ) | Agree(24) | Disagree(66) | Report
Gimmemorebubblez  +   433d ago
The reason latency is not an issue for the CPU is that the CPU is clocked at half the frequency of a normal CPU: 1.6GHz instead of 3.2GHz. If the CPU were faster, then I could foresee latency issues. There is a reason the CPU is clocked at 1.6GHz and not 2.4 or 3.2GHz.
WhittO  +   433d ago
Maybe that is why Cerny was saying the other day how their GPU can actually be used as a CPU for a number of tasks.

Probably one of the enhancements they made to the chip.
#1.1.4 (Edited 433d ago ) | Agree(36) | Disagree(4) | Report
blackmagic  +   433d ago | Helpful
Latency isn't the time taken to transfer a block of data, it's the time it takes to START transferring the data.

For the GPU this tends to not be much of an issue, it makes far fewer requests for data but those requests tend to be for large chunks of data so the time it takes to start transferring data isn't as important as how fast that data can actually be moved. That's why you use GDDR5 memory for graphics, it's slow to start moving the data but it can move a lot of data quickly. Yes the GPU has to wait for the data to start coming but after that it isn't starved for data.

For the CPU it is the opposite, it makes far more requests for small pieces of data. It's more important that the data starts transferring sooner than how much data can be moved. That's why you use DDR3 for the CPU, it matters how long it has to wait for many small chunks of data.

Here's an analogy. Two painters are each contracted to paint a picket fence.

Painter 1, GPU, is asked to paint the picket fence white. He needs a 5 gallon tub of paint to do it. His assistant, GDDR5, goes to the back of the garage to get the tub of paint then he proceeds with painting the fence. It wouldn't have made much difference if the 5 gallon tub was handy at the front of the garage. It would have only made the job go a little quicker. It was more important that GDDR5 was strong enough to carry the 5 gallon tub.

Painter 2, CPU, is asked to paint each picket a different colour. He needs many quarts of different colour paints to do it. His assistant, DDR3, goes to the garage and picks a quart of paint off the shelf at the front of the garage. Then CPU sends his assistant for the next quart of paint, and this continues until the job is done. It wouldn't have made much difference if DDR3 was strong; it was more important that he was quick.
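blackmagic's painter analogy can be sketched as arithmetic. The memory profiles below are hypothetical, not real console timings: a "DDR3-like" part with lower latency but less bandwidth, and a "GDDR5-like" part with the reverse. Fetching the same total data as many cache-line-sized requests favors the low-latency memory; fetching it as a few large chunks favors the high-bandwidth one.

```python
# Toy comparison (all numbers illustrative): time to fetch 1 MiB as
# many small requests vs. a few large ones, under two memory profiles.

def fetch_time_ns(total_bytes, request_bytes, latency_ns, bandwidth_gb_s):
    """Each request pays the full latency, then streams its payload."""
    requests = total_bytes // request_bytes
    return requests * (latency_ns + request_bytes / bandwidth_gb_s)

TOTAL = 1 << 20  # 1 MiB overall

ddr3 = dict(latency_ns=8, bandwidth_gb_s=68)     # hypothetical "CPU-friendly" RAM
gddr5 = dict(latency_ns=12, bandwidth_gb_s=176)  # hypothetical "GPU-friendly" RAM

# CPU-style access: 16384 requests of 64 bytes (one cache line) each.
print(fetch_time_ns(TOTAL, 64, **ddr3) < fetch_time_ns(TOTAL, 64, **gddr5))  # True

# GPU-style access: 16 requests of 64 KiB each.
print(fetch_time_ns(TOTAL, 64 * 1024, **gddr5)
      < fetch_time_ns(TOTAL, 64 * 1024, **ddr3))  # True
```

Same total data, opposite winners: which memory is "faster" depends entirely on the request pattern, which is the painters' point.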
fr0sty  +   433d ago
Video games are far more GPU dependent than they are CPU dependent. This is why better bandwidth>better latency (and also why both consoles have a relatively weak CPU when compared against their GPUs). GPUs need lots and lots of bandwidth, as they have to constantly access VRAM to perform different operations, many of which take place at the same time and therefore require a lot of bandwidth so none of those operations starve each other of data when waiting to draw the next frame.

For those not technically knowledgeable, imagine it like this. GDDR5 is like having a 4 lane highway, with each car on that highway being a big van holding a bunch of passengers. DDR3 is more like a two lane highway with each car being a 2 seater sports car.

For tasks that require a lot of people to complete (like gaming), the 4 lane highway packed with vans full of people is going to give you better results. For smaller jobs that only require a couple people to complete, but need to be completed very quickly, the 2 lane road with sports cars on it will be the better approach. The faster sports car will get those 2 people where they are going faster.

Gaming happens to be one of those tasks that requires a lot of "people" working at the same time, so the 4 lane highway with vans on it is the better option. Xbox One will be shipping 2 people at a time to the job site, but they'll get there really fast due to their sports car. Thing is, once they realize they are doing a task that requires 4, 6, or 8 people, those 2 people will be stuck waiting on the other guys needed to complete the job to arrive as well. The PS4 will be unloading vans full of people each stop to complete the tasks, even though those vans will not be arriving as often as the sports cars will.

OS functions may run a bit slower on PS4 due to latency, or other general purpose computing tasks, but gaming most definitely reaps the benefits of the higher bandwidth approach. This machine was built from the ground up to be a gaming console, so that was the logical choice to make.

Edit: If you're going to disagree, at least leave a comment showing why. If you can't even detail why you don't agree, it shows you don't know enough about the topic to give your opinion in the first place.
#1.1.6 (Edited 433d ago ) | Agree(8) | Disagree(13) | Report
extermin8or  +   433d ago
Can I point out that AMD are going to start selling APUs based on the PS4's structure (with 4 instead of 8 CPU cores, running at a slightly higher clock rate, apparently) at the end of this year, and they have unified memory, apparently. I'm assuming that, as they are based on the PS4, this unified memory will be 4GB of GDDR5. So clearly they've somehow offset the issues, as they plan to use some of this architecture for conventional PCs/budget gaming PCs.
fr0sty  +   433d ago
@blackmagic, I actually like your analogy better, as it clarifies RAM latency better than mine, which makes the latency sound more like internet latency. Though in the end, it still comes down to how long the chip has to wait to get its data.

The GPU is definitely the work horse in these new consoles. This is why we're seeing games rely on GPGPU functions more and more these days, offloading tasks that the CPU normally would do, like physics, onto the compute units of the GPU.

There are actually 2 things going for PS4 that make up for the added CPU latency of GDDR5. First, PS4's GPU has 6 more compute units, with 4 of them specialized for GPGPU tasks and all of them having a direct bus to the CPU to aid it with calculations. Then you have the fact that PS4's CPU can be devoted entirely to gaming. It has its own sub-processor to run its OS, whereas the Xbox One has to use one of its 8 CPU cores to run its OS (I hear 2 of them are reserved for the system; I imagine Kinect takes the other one?). PS4 has all 8 CPU cores to use for gaming, plus the additional 4 compute units on the GPU that can aid the CPU. As such, PS4's CPU will not be as much of a performance bottleneck as Xbox One's will, despite the higher RAM latency. The additional 2GB of RAM available for games to use factors in there as well.
#1.1.8 (Edited 433d ago ) | Agree(8) | Disagree(2) | Report
ABizzel1  +   433d ago
The latency complaints are getting out of hand. The fact of the matter is that GDDR5 is significantly better for powering games than normal DDR3. Latency has little to do with that.

The biggest complaint about latency you'll have would be OS features, where simple things take a couple of seconds to load up due to the high latency. But when it comes to games, the bandwidth of the memory significantly outweighs any negatives latency could have on short calculation tasks.
Metfanant  +   432d ago
@exterminator...no they are DDR3
Dunpeal  +   433d ago
up until the system is released we're only left with the option to take his word for it
BlackKnight  +   433d ago
Well, no. He isn't the only person in the world who understands computer technology...

It is simple, GDDR helps the GPU and hurts the CPU in certain areas of performance, DDR helps the CPU and hurts the GPU in certain areas of performance.
Metfanant  +   432d ago
BlackKnight... the thing is, all this "latency hurts the CPU" talk is just talk... how many PCs have you seen running a unified pool of GDDR5?... yup, none... we can speculate on how latency might be an issue... but Cerny is one of the few who has actually experienced the performance... so his word has to be given more credibility than any armchair engineers on N4G who think they are all-knowing...
pixelsword  +   432d ago
@ Metfanant:

Not exactly; there's an inherent bias there because of his affiliation with the subject matter.

He could either be correct or putting out propaganda, we'll have to basically wait and see.
GameCents  +   433d ago
I lack the mental dexterity to figure out how those two statements correlate.
GunsAndTheBeast  +   433d ago
Hodor? HODOR!
ab5olut10n  +   433d ago
Hodor...
MysticStrummer  +   433d ago
Hodor.
mikeyphi  +   433d ago
Hodor!?
Bluepowerzz  +   433d ago
shake dat ass hodor
#1.4.4 (Edited 433d ago ) | Agree(8) | Disagree(2) | Report
PurpHerbison  +   433d ago
Way to go and break the combo Bluepowerzz.
Harmonizer  +   433d ago
H-Hodor....Hoder!
Gamers_United  +   433d ago
So says your 43 disagrees lol
wishingW3L  +   433d ago
PS3 has split memory so there aren't any latency issues there. PS3 has very fast XDR for CPU and GDDR3 for the RSX.
Sayai jin  +   433d ago
IMHO the memory won't have latency issues. Should run super smooth. But I'll play devil's advocate: even if there were latency issues, Cerny would not admit it.
RedHawkX  +   433d ago
cerny dont lie
Sayai jin  +   433d ago
Hmmm, really? He either lied or was misquoted recently about 700 devs working on Destiny... Bungie debunked this.

http://n4g.com/news/1309326...

Anyways, my point remains: Cerny, or any other person, will most likely not admit to a flaw in their product if there were one.
Gimmemorebubblez  +   433d ago
@sayai What does Cerny gain by "lying" about the number of staff at Bungie?
WhittO  +   433d ago
^^ Actually I don't think he lied or was misquoted.

I just think Bungie don't want people to know that many people are working on their game, raising standards/expectations etc. for the game itself.

It's like when developers don't really want people to know the REAL budgets for their games; people use it as ammo for why a game is good or bad, and it may affect critics' and gamers' perception, saying it's only good because of that budget, or should have been much better given it, etc.
#2.1.3 (Edited 433d ago ) | Agree(24) | Disagree(3) | Report
Sayai jin  +   433d ago
@gimmemorebubblez - I also said he could have been misquoted. We do know his info was incorrect, no?
wishingW3L  +   433d ago
Cerny did his research; he didn't build the PS4 in a month, he has been working on it since 2008. If the latency were such a huge issue, he would know better than anybody on this website, because he is both a game designer and a hardware engineer.
#2.2 (Edited 433d ago ) | Agree(10) | Disagree(0) | Report | Reply
jlo  +   433d ago
These articles are pointless. He designed it, so of course he's going to say it's not much of a factor.
The_Infected  +   433d ago
Or maybe he's the architect of the PS4, so he knew what he was doing when creating it?
#3.1 (Edited 433d ago ) | Agree(45) | Disagree(13) | Report | Reply
jlo  +   433d ago
Or maybe, he's the architect so he won't say a bad word about it?
papashango  +   433d ago
That's called marketing.

Sony wouldn't stoop so low as to market their new product.

If Cerny says it. It must be true.
BallsEye  +   432d ago
Oh yeah, when there were similar articles about the Xbox One and its move engines, ESRAM etc., you were all saying "he's from Microsoft, he's paid to say good things!!" But now when someone from Sony talks up a Sony console, he's 100% credible!
Thegamer41  +   433d ago
How about all the developers that have said nothing but good things about the hardware? Don't pick the articles that appeal to your 'of course he's going to say that' remark.
gamertk421  +   433d ago
Of course you're gonna say that, lol
SpideySpeakz  +   433d ago
Yep, just take a look at the Xbone. So much bad, not enough good.
DarkHeroZX  +   433d ago
It's decent. Not worth the price tag, but no slacker either. I've come to just respect the X1, and while they aren't doing everything right, they are making an attempt to please their fans.
medman  +   433d ago
jlo probably believes the engineers behind the power bricks and red rings. Case closed.
jlo  +   433d ago
'Probably believes'

Nope. I don't own a 360. In fact I've already pre-ordered a PS4.

I'm just not a blind fanboy like most of the people on here.
Cuzzo63  +   433d ago
Kinda makes ya think M$ would do the same for Xb1 *coughcloud*
extermin8or  +   433d ago
Latency over the internet to the cloud is much greater than the latency difference between GDDR5 and DDR3, and therefore a hell of a lot harder to get around; not to mention that bandwidth can be used to offset some of the latency issues, I believe?
The Meerkat  +   433d ago
Mark Cerny is the best thing that has happened to Sony.
Dunpeal  +   433d ago
idk bout that. what about Yosp? you know the guy that basically masterminded Cerny's involvement? :)
Jaqen_Hghar  +   433d ago
As much as people don't like Kutaragi for the PS3's design, he did pretty much CREATE the PlayStation, so a man has to give him the edge. Kutaragi had to convince the Sony execs to even consider a gaming system, then designed the first-ever 100m-seller on his first try (people thought the NES's 60m was the ceiling before Sony changed the game). Then he followed up with the 160m-selling PS2. The PS3 got off to a rocky start, and Ken didn't help with his complex, expensive architecture, but it will still most likely be the highest-selling console of its gen by the time they stop selling it.
PSN_ZeroOnyx  +   433d ago
Actually, the original PlayStation was designed by Sony for Nintendo as a CD add-on for the Super NES. But Nintendo backed out the day they were supposed to go public about the Nintendo PlayStation. Sony was going to scrap everything, and Kutaragi, who was designing it, convinced Sony to make it a standalone CD-based console. Then came the Sony PlayStation era, which still dominates.
Jaqen_Hghar  +   433d ago
A man knows this story. His point was that Kutaragi is more responsible than anyone for the creation of the Sony Playstation. Well other than Nintendo.
TurboGamer  +   433d ago
The CPU is too slow for the latency to be an issue.
Kleptic  +   433d ago
No joke... the CPU section of the Jaguar is the biggest bottleneck... but that will never be a factor, because the PS4 isn't running a heavy OS, and console development is never heavy on generic instruction sets... for calc-intensive processing, the stronger GPU and high-bandwidth RAM (and plenty of it) will be very happy together...

People keep saying how the crappy PS4 CPU was originally made for budget laptops... remember though, the PS4 doesn't have to run Windows AND a game... comparisons with PC hardware aren't exactly relevant...
Gimmemorebubblez  +   433d ago
You are right: the CPU in the PS4 isn't crap, it's very efficient, and it's clocked at 1.6GHz to avoid latency issues with the unified GDDR5 RAM.
GameCents  +   433d ago
As expected. I mean what else can he say?
fsfsxii  +   433d ago
Its "PR" crap, yeah. Just because MS fucked up in the PR doesn't mean Sony is fucking up their PR too.
Foxgod  +   433d ago
If he would say latency would be a problem he would have his employer breathing flames down his neck.
Nicaragua  +   433d ago
Wise words from Captain Obvious.
Foxgod  +   433d ago
And yet people wouldn't dare to believe it.
Cuzzo63  +   433d ago
But you would believe everything from Micro's side.....
Foxgod  +   433d ago
Nope, I think everything through; I don't take spoon-fed information.
andibandit  +   433d ago
You must be new here
XabiDaChosenOne  +   433d ago
Aww, did the news ruin your theory?
sAVAge_bEaST  +   433d ago
This will be no match , for the power of teh clowd & a billion transistors...
--right fox?!?!???!!!
Stryfeno2  +   433d ago
Well, if Sony were feeding me and keeping a roof over my head... what else would I have said?
EffectO  +   433d ago
...on the GPU side

"Latency in GDDR5 isn’t particularly higher than the latency in DDR3. On the GPU side… Of course, GPUs are designed to be extraordinarily latency tolerant so I can’t imagine that being much of a factor."

It will be a problem for GPU compute (putting CPU work on the GPU), though.
#9 (Edited 433d ago ) | Agree(4) | Disagree(12) | Report | Reply
Agent_hitman  +   433d ago
lol, I will laugh my arse off if fanboys arrive and say the DDR3 inside the Xbone is much faster than the GDDR5 found in the PS4...

Some people are stupid, trying to create false assumptions and stories just to put the PS4 and Sony in a bad light. Drill this into your mind: GDDR5's bandwidth is a lot faster than DDR3+ESRAM.
Corpser  +   433d ago
Bandwidth has nothing to do with latency, and latency is what's being discussed here.
andibandit  +   433d ago
@agent hitman

DDR5 has higher bandwidth but slower lantency than DDR3. You are just making a fool of yourself
wishingW3L  +   433d ago
DDR5 doesn't even exist; DDR4 is only coming to mainstream PCs starting next year. And latency is not about being faster or slower, but higher or lower.

GDDR5 has higher latency than DDR3, not slower latency. And even then, an APU's CPU is weak and irrelevant to next-gen games anyway. It will be all about the GPU and GPGPU implementation, so the bandwidth will be a necessity. With the PS3 it was all about the Cell helping the RSX, and with the PS4 the CPU will be there just to help the GPU, not to do anything else of importance.
#10.2.1 (Edited 433d ago ) | Agree(1) | Disagree(3) | Report
andibandit  +   433d ago
Yes, I actually meant GDDR5. As for the GPGPU stuff, I've no idea who you're talking to, but it can't be me, since I didn't mention any of that.
#10.2.2 (Edited 433d ago ) | Agree(0) | Disagree(0) | Report
Whitey2k  +   433d ago
There was a site stating that the CPU in the PS4 runs at 2GHz, and you know, since it's two sets of four cores, 2+2 = 4GHz.
andibandit  +   433d ago
You must live a very linear life
Tk731  +   432d ago
That does not make any sense. An 8-core 2GHz CPU is still 2GHz; the core frequency is not added or multiplied across cores.
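Tk731's correction can be made concrete with a crude Amdahl-style sketch (all numbers made up): extra cores speed up only the work that can run in parallel, while a genuinely higher clock speeds up everything, so eight 2 GHz cores never behave like "4 GHz".

```python
# Toy model: cores multiply parallel throughput, not clock frequency.
# A single serial task on an 8-core 2 GHz CPU still runs at 2 GHz.

def time_seconds(ops, clock_ghz, cores=1, parallel_fraction=1.0):
    """Crude Amdahl-style estimate: the serial part runs on one core."""
    serial = ops * (1 - parallel_fraction) / (clock_ghz * 1e9)
    parallel = ops * parallel_fraction / (clock_ghz * 1e9 * cores)
    return serial + parallel

ops = 8e9  # 8 billion single-cycle operations, purely illustrative

print(time_seconds(ops, 2.0, cores=1))  # 4.0 s on one 2 GHz core
print(time_seconds(ops, 2.0, cores=8))  # 0.5 s if perfectly parallel
print(time_seconds(ops, 4.0, cores=1))  # 2.0 s on a real 4 GHz core
```

Note that the 8-core result beats the 4 GHz chip only when the work parallelizes perfectly; any serial fraction drags it back toward single-core speed.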
Tatsuya  +   433d ago
When it's coming from the lead architect himself, you better listen! Cerny is giving me all the positive vibes on the PS4. I trust my purchase won't be a mistake. PS4 is the real next-gen console. I'm glad that they kept the decision to go with GDDR5 RAM a secret from the mass public and Microsoft.
#12 (Edited 433d ago ) | Agree(8) | Disagree(3) | Report | Reply
madpuppy  +   433d ago
Half of the people here just "know" that the latency of GDDR5 RAM is going to be an issue. Wouldn't you think that someone like Cerny and the hardware developers of the PS4 would know that as well, before building the console? I mean, the posters here who think there is going to be a problem are "sure", and most likely are NOT hardware developers in their day jobs.

I would think that, being HW devs whose JOB it is to know such things, they took all that into consideration before they started to assemble the prototype boards for the PS4. Heck, even if they found out after building the first prototype that it wasn't going to work, they could have swapped it out for DDR3 and been done with it.

Not everything is a damn conspiracy! Use a little deductive reasoning.
#13 (Edited 433d ago ) | Agree(7) | Disagree(5) | Report | Reply
Chris12  +   433d ago
I think you need to check your own deductive reasoning.

At this price point there will always be compromises. You have taken latency to the extreme, when you suggest that if it didn't work they wouldn't build it. That's not the point. Sony may have decided they could achieve greater fidelity with the sacrifice of occasional frame rate drops. MS may have decided to accept lower fidelity for more consistent frame rates. I'm not stating either of these possibilities as fact, more that both systems will have compromises that developers have to work with and these compromises will eventually show on both systems when developers start to push them in the coming years.
madpuppy  +   433d ago
So you're saying they could have just gone with DDR3 and the PS4 would have been "perfect"? Did they go with GDDR5 as part of some insidious plan to be a bunch of hardware trolls and sabotage the quality of their own console and future titles?

Sure, everything is a compromise when you are dealing with a device that has to perform well at a certain price point... but do you actually think that the hardware developers of the PS4 sat there and said "this will prove to be disastrous later in this console's life but, screw it"? If you think you "know" that it will, do you think it's possible that the people who are actual hardware designers know less than you about what they are doing?

Frankly, at the end of the day, this is all nitpicking and fault-finding for no real reason.
#13.1.1 (Edited 433d ago ) | Agree(8) | Disagree(5) | Report
wishingW3L  +   433d ago
ESRAM was not chosen for any advantage, because it has none; it was chosen because Microsoft wanted 8GB and felt that GDDR5 wouldn't be available in enough capacity, so they went with the slower DDR3 and boosted the bandwidth with ESRAM, which adds complications to game development. As a result of choosing DDR3 and needing ESRAM on the GPU die, they compounded the problem by limiting themselves to a smaller, less powerful GPU than the PS4's.

Sony, on the other hand, chose to gamble on the superior GDDR5, which doesn't need any ESRAM. They aimed for 4GB at first but, due to positive market changes, bumped it up to 8GB; not needing ESRAM also meant they could spec a bigger, superior GPU.

------------------------------------
"The answer to that comes down to a specific gamble Sony made that Microsoft could not - the utilisation of a unified pool of GDDR5 memory. In the early days of PS4 development, only 2GB of this type of memory looked viable for a consumer-level device. As higher density modules became available, this was duly upgraded to 4GB. By the time of the reveal back in February, Sony had confidence that it could secure volume of 512MB modules and surprised everyone (even developers) by announcing that PS4 would ship with 8GB of unified GDDR5 RAM. The design of its surrounding architecture would not need to change throughout this process - one set of 16 GDDR5 chips would simply be swapped out for another.

Microsoft never had the luxury of this moving target. With multimedia such a core focus for its hardware, it set out to support 8GB of RAM from day one (at the time giving it a huge advantage over the early PS4 target RAM spec) and with serious volume of next-gen DDR4 unattainable in the time window, it zeroed in on supporting DDR3 and doing whatever was necessary to make that work on a console. The result is a complex architecture - 32MB of ESRAM is added to the processor die, along with "data move engines" to courier information around the system as quickly as possible with bespoke encode/decode hardware to alleviate common bottlenecks. Bottom line: if you're wondering why Xbox One has a weaker GPU than PlayStation 4, it's because both platform holders have similar silicon budgets for the main processor - Sony has used the die-space for additional compute units and ROPs (32 vs. 16 in One), while Microsoft has budgeted for ESRAM and data move engines instead. From the Xbox perspective, it's just unfortunate for Microsoft that Sony's gamble paid off - right up until the wire, it was confident of shipping with twice the amount of RAM as PlayStation 4."

http://www.eurogamer.net/ar...
#13.1.2 (Edited 433d ago ) | Agree(2) | Disagree(2) | Report
Foxgod  +   433d ago
Problem is, Cerny doesn't have the final say in its design. His job is to propose a number of designs; the business side, the people in suits, then selects one of the designs and proposes a number of revisions based on marketing strategies.

Designs are a mix of technical, marketing, business and financial points of view.

In other words, he has to work within the restrictions the rest of the company puts on him. This is a very common thing in businesses.
#13.2 (Edited 433d ago ) | Agree(3) | Disagree(11) | Report | Reply
Chris12  +   433d ago
@madpupply

No, I'm stating that both systems have compromises. Neither systems architectures are perfect and both will have to be carefully developed for to extract the best from them.

Not sure why you think I said Sony should put DRR3 in the PS4, a very odd conclusion.
ALLWRONG  +   433d ago
So just like the PS3 the PS4 may have a bottleneck.
Foxgod  +   433d ago
In the end, all systems have a bottleneck. The perfect system doesn't exist. Sooner or later one resource is going to be bottlenecked and block the rest of the system from reaching maximum performance.

Sony, however, has a tendency to claim their products are flawless, which is not very realistic.

Just look at how they marketed the PS3, which was full of poor business decisions.
#14.1 (Edited 433d ago ) | Agree(6) | Disagree(12) | Report | Reply
sAVAge_bEaST  +   433d ago
DDR3 is the bottleneck... that's why they chose GDDR5: to prevent bottlenecks a few years down the road.
PSN_ZeroOnyx  +   433d ago
No, all that GDDR5 RAM has extremely high bandwidth, so no bottleneck.
andibandit  +   433d ago
Back to school
PSN_ZeroOnyx  +   433d ago
Most of you are clueless
marchinggamer  +   433d ago
Yeah, it is a big factor. If the CAS latency is 11 (which is really low for GDDR5), that's not good; the X1 has 2133-speed (high-speed) DDR3 with a CAS latency of 9.
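One caveat to marchinggamer's comparison: CAS latency is counted in memory-clock cycles, so raw CL numbers at different clocks can't be compared directly. A rough conversion to nanoseconds (the command clocks below are typical published figures for these speed grades, not confirmed console timings) suggests the absolute latencies land in the same ballpark, which is consistent with Cerny's quote:

```python
# CAS latency in nanoseconds = cycles / command clock. Clocks below are
# illustrative for DDR3-2133 and 5500 MT/s GDDR5, not console specs.

def cas_ns(cas_cycles, command_clock_mhz):
    return cas_cycles / command_clock_mhz * 1000  # MHz cycles -> ns

ddr3_2133 = cas_ns(9, 1066)    # DDR3-2133: ~1066 MHz command clock, CL9
gddr5_5500 = cas_ns(11, 1375)  # GDDR5-5500: ~1375 MHz command clock, CL11

print(round(ddr3_2133, 2))   # 8.44 ns
print(round(gddr5_5500, 2))  # 8.0 ns
```

The higher CL is paid at a higher clock, so in wall-clock time the two come out within a nanosecond of each other under these assumed timings.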
neoandrew  +   433d ago
PCs have no problems with DDR3, so I really doubt the PS4 will have any with GDDR5; and you need to remember that a PC is still more powerful than the PS4, yet it has worse system RAM.

Even if the latency of GDDR5 is higher than DDR3's, GDDR5 has higher bandwidth, so more or less you can send more data at once, whereas with DDR3 you send less data more often. We will see which is the better solution, but I think they're very comparable, with a slight advantage to GDDR5.

Not to mention a WHOLE 2GB more than the Xbone for games; I think M$ is secretly doing something to change that.

I also think that in time Sony can lower system RAM usage to 0.5GB, so we can have 7.5GB for games; that would be cool.
Corpser  +   433d ago
No, only very low-end PC or laptop GPUs use DDR3; GDDR5 has been the standard since at least 2009. GDDR6 comes next year, by the way.
neoandrew  +   433d ago
I'm talking system RAM, not GPU RAM. The PS4 has unified RAM, so for everything that is not GPU-related it uses the same RAM as the best GPUs on PC, GDDR5, and not the normal DDR3 like every new PC...

You CAN'T deny it: the PS4 has better RAM than any PC, and better than the Xbone.
Corpser  +   433d ago
^^^

If there were any advantage in using GDDR5 with CPUs, PCs would use it already; like I said, GDDR5 has been around since 2008, it's nothing new.

On PCs, DDR4 will replace DDR3, and GDDR6 will replace GDDR5 in the coming year.
sAVAge_bEaST  +   433d ago
Latency... I wonder if they can overclock it with a driver.

Windows is horrible, and you can fix MIDI latency with this: http://www.asio4all.com/ (I know it's diff., but sim.)
sync90  +   433d ago
More tech babble; I don't care about it. I just want to see the games. That's all anyone should be arsed about, really.
BlaqMagiq24  +   433d ago
While DDR3 does have lower latency than GDDR5 (maybe even by a significant amount), GDDR5's bandwidth is ridiculously higher than DDR3's. That's the ONLY advantage of the Xbone's RAM. As for the PS4, I think it's a good tradeoff.
cunnilumpkin  +   433d ago
The problems of the PS4 have NOTHING to do with the RAM.

It's the 1.6GHz mobile CPU and the laptop-class/outdated 1.8 tflops GPU.

A two-year-old GTX 680 = 3.5 tflops.

The PS4's CPU is outclassed by even a low-end i3.

"Nvidia previously made mention that the PS4 contains the CPU of a low spec PC, and honestly speaking – he's right. The Playstation 4 (and likely Xbox 720 if it is indeed using the AMD Jaguar, like the PS4 is) have to rely on 'going wide'. In other words, heavy multi-threading, and passing a lot of the calculations off to the GPU side of things. The APU design is efficient, but honestly speaking, the CPU is not that fast in the console. It's a step in the right direction – especially since it supports X86-X64 code, making programming for it far easier, but it's slow compared to a PC."

-source redgamingtech.com
#20 (Edited 433d ago ) | Agree(4) | Disagree(6) | Report | Reply
a_squirrel  +   433d ago
I like the mini-article here; it brings more truth than a normal so-called article, with less fluff.
Metfanant  +   432d ago
This guy is so threatened by the PS4 he comes running and posts the same stuff on every article...I know you have trouble reading...so I'm going to make this real clear...

NOBODY...did you get that?...NOBODY is saying that the PS4 is more powerful then high end rigs...

Lets try that again...NOBODY is saying the PS4 is more powerful than high end rigs...

HOWEVER...it's much more powerful than comparably priced hardware in the PC market and does have a few tricks up its sleeve that will allow it to "punch above its weight class" so to speak...

A GTX 680, which you like to praise for being so old yet so much more powerful than the PS4, costs between $400-$500 all by itself...
PurpHerbison  +   433d ago
Not buying any PS4 game that doesn't achieve a solid 60 FPS.
fenixm  +   433d ago
Ouch.
PurpHerbison  +   433d ago
Not like every PS4 game is going to be 30 FPS though.
JackVagina  +   432d ago
have you ever thought that a developer may choose 30fps over 60??
PurpHerbison  +   432d ago
Yes, but I do not choose 30FPS.
SilentGuard  +   433d ago
OMG, a possible weakness of the PS4? I won't hear of it. Not for a console built by the gods that descended from heaven to the Messiah Sony.
lgn15  +   433d ago
This means nothing. It's not like he's gunna say latency is a problem even if it was.
Tundra  +   433d ago
PCs use DDR3 for system RAM because of its lower latency. A PC isn't only for gaming; it's running multiple processes in the background asking for resources, and those resources need to be distributed in a timely manner. The CPU rarely, if ever, hits DDR3's bandwidth cap.

GDDR5 has higher latency, but gaming workloads are fairly straightforward, parallel processes. In a console, you have processes mostly, if not entirely, dedicated to gaming. There aren't several background processes that need to be worked on, so latency is less of a factor in a console.
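The latency-vs-bandwidth tradeoff described above can be put into rough numbers with a toy model: total transfer time = fixed latency + size / bandwidth. The latency and bandwidth figures below are illustrative assumptions for the comparison, not measured PS4 or Xbone specs.

```python
def transfer_time_ns(size_bytes, latency_ns, bandwidth_gb_s):
    """Toy model: total time = fixed access latency + size / bandwidth.

    1 GB/s conveniently equals 1 byte per nanosecond.
    """
    return latency_ns + size_bytes / bandwidth_gb_s

# Illustrative numbers only -- not official console specs.
DDR3 = dict(latency_ns=50, bandwidth_gb_s=68)    # lower latency, lower bandwidth
GDDR5 = dict(latency_ns=80, bandwidth_gb_s=176)  # higher latency, higher bandwidth

# Tiny CPU-style access (one 64-byte cache line): latency dominates,
# so DDR3 wins.
print(transfer_time_ns(64, **DDR3))   # ~50.9 ns
print(transfer_time_ns(64, **GDDR5))  # ~80.4 ns

# Large GPU-style streaming access (a 1 MB texture block): bandwidth
# dominates, so GDDR5 wins by a wide margin.
print(transfer_time_ns(2**20, **DDR3) / 1000)   # ~15.5 us
print(transfer_time_ns(2**20, **GDDR5) / 1000)  # ~6.0 us
```

The crossover is the point Tundra's comment is making: per-access latency matters for a CPU juggling many small random reads, while a GPU streaming big blocks cares almost entirely about bandwidth.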
6DEAD6END6  +   433d ago
Why you so smart?

http://m.memegenerator.co/i...
jairusmonillas  +   433d ago
PS4 is not a PC with Windows 7; it doesn't need crappy DDR3 memory. #dealwithit
Indo  +   433d ago
Mark Cerny does a great job at presenting the PS4 and certainly takes pride in his creation. And the only way for me to thank him is to buy the PS4.
brich233  +   433d ago
Anything said on here about memory and latency is useless to read until we see real world performance vs a PC.
Lbong  +   432d ago
I love the ps4........but fuck this story, just go away.
fsydow1  +   432d ago
GDDR5 is better; why else does every high end graphics card have it for games? The Xbone GPU is weaker than the PS4's. Why do you think the Xbone is as big as the Titanic? Because their engineers don't come close to Cerny.
annus  +   432d ago
Every graphics card has it, but system memory is DDR3... Not everything is done by the GPU. Lower-latency RAM will always be better for CPU cycles, whereas it isn't needed as much for the GPU: GDDR5 allows textures to be transferred in larger quantities to the GPU, and they usually aren't swapped in and out of VRAM as often compared to the tasks the CPU performs.
