Microsoft outlines performance difference between Xbox One and PS4

Microsoft has criticised parts of the gaming community for spreading "misinformation" about the performance of Xbox One, and claimed that the alleged power difference between the PlayStation 4 and Xbox One has been "greatly overstated".

In a post on NeoGAF, Xbox exec Albert Penello commented that, though he was "not disparaging Sony... the way people are calculating the differences between the two machines isn't completely accurate. I think I've been upfront I have nothing but respect for those guys, but I'm not a fan of the mis-information about our performance."

Penello then goes on to clarify key elements of the two consoles that he believes are often misunderstood, including the belief that the lower number of Compute Units in the Xbox One leads to a 50 per cent power disadvantage, and the belief that the Xbox One's memory is slower.

Septic1438d ago (Edited 1438d ago )

Say what you want about the specs, but you have to give respect to Albert Penello for his down-to-earth and frank nature when discussing this.

Obviously, he works for MS so he has to defend the product, but he does so with far more objectivity than the likes of Harrison and others.

Also, I think we can all agree that taking these hardware figures at face value and equating them to a 50% advantage is silly, and doesn't take into account the way things actually work.

"Given this continued belief of a significant gap, we're working with our most senior graphics and silicon engineers to get into more depth on this topic. They will be more credible then I am, and can talk in detail about some of the benchmarking we've done and how we balanced our system."

More disclosure and transparency is the way forward.

Baka-akaB1438d ago (Edited 1438d ago )

I wouldn't praise the frank nature of anyone working in PR at Microsoft, nor Sony, nor Nintendo, nor any other...

It's in their job description to lie when it's needed. And obviously, they don't even always know that much when it comes to hardcore tech and geek stuff.

Thehyph1438d ago

They're skilled in evasion. They're supposed to be.

What I keep getting stuck on is that these console power debates are really just stuck with the people on places like n4g and NeoGAF. To me, it just feels like fanboys of either camp just trying to convert each other.
Remember when Sony said that they have over a million preorders? One million is a small fraction of the people who will own the console over its lifespan. The arguments on here are pointless if the mass of buyers don't see them. I was at a friend's house the other night, and I was one of three there with a next-gen console preordered. I'm getting a PS4 and the other two are getting Xbox One. These guys have no idea about any differences in console power, they wouldn't have a sweet clue who Penello is, they are only vaguely familiar with TV features, new Kinect features, Gaikai, remote play, etc., and they don't even mind paying the extra $100. The only thing that seemed to matter was that they have to wait a week longer than me.

All that I am trying to say is too many people are getting caught up arguing this foolishness.

I'll be happy with my PS4, and I'm sure others will be happy with their own console purchase(s).
Too many people are trying to convince others that their future purchase is wrong. So what if it is? Let the buyer find out on their own.

HammadTheBeast1438d ago

"• We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted."

Here is an exceptional piece of BS.

Our peak ON PAPER is 272gb/sec. YOU DON'T CALCULATE THE PEAK LIKE THAT BY ADDING IN THE ESRAM WITH THE DDR3, THAT'S NOT HOW IT WORKS, IN THE LINE ABOVE HE SAYS PEOPLE ARE TAKING THE GPU COMMENT OUT OF CONTEXT, THEN WHAT BS IS THIS?!
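
For reference, here is where each of those numbers comes from, sketched in Python. This assumes the publicly reported clocks and bus widths, and note the 204GB/s ESRAM figure is Microsoft's own real-world estimate, not a raw theoretical peak:

```python
# Where the quoted figures come from, assuming the publicly reported
# clocks and bus widths (simplified; real sustained rates are lower).

def bw_gb_s(mt_per_s, bus_bits):
    """Peak bandwidth in GB/s from transfer rate (MT/s) and bus width (bits)."""
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

ddr3 = bw_gb_s(2133, 256)    # Xbox One DDR3-2133, 256-bit bus -> ~68 GB/s
gddr5 = bw_gb_s(5500, 256)   # PS4 GDDR5 at 5.5 GT/s, 256-bit -> ~176 GB/s

esram_one_way = 853e6 * (1024 / 8) / 1e9  # 853 MHz, 1024-bit -> ~109 GB/s each way
esram_both = 2 * esram_one_way            # ~218 GB/s if every cycle read AND wrote
# Microsoft's quoted 204 GB/s sits below that 218 because, reportedly,
# a simultaneous read/write is not achievable on every single cycle.

print(round(ddr3, 1), round(gddr5, 1), round(esram_both, 1))  # 68.3 176.0 218.4
print(round(ddr3 + 204, 1))  # 272.3 -> the "on paper" 272 being disputed
```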

mewhy321438d ago

Well Albert has a tough job. You have to admire the way that he's spinning this. I realize that just because the GPU has 50% more compute units than the bone's doesn't equate into 50% better performance. However, I also realize that the eSRAM is only 32MB, and its high bandwidth doesn't equate 1:1 with the DDR3 for the big boost that he's trying to spin in there LOL. Overall, I'd estimate that if you pushed both systems to the max you'd probably see about a 30% performance difference in the PS4's favor, not 50%. However, this is all moot if you don't get both systems running at their max, and the only time we're going to see that is with exclusive software.

n4rc1438d ago

Of course he's lying!! Why wouldn't he be!

Pathetic.. Saw it coming a mile away... Spin fanboys spin!

thechosenone1438d ago (Edited 1438d ago )

"I realize that just because the GPU has 50% more compute units than the bone's doesn't equate into 50% better performance."

lol! This is unfreaking real. The level of denial by xbox users is off the freaking charts. xD

And read the rest of the thread to understand why Penello is being a deceitful little *****.

http://www.neogaf.com/forum...

PS4 vs XBOX One: Developer Comments Compiled
http://www.neogaf.com/forum...

scott1821438d ago

Sony is not the one that has been saying it!! This guy is blaming Sony for what random devs have been saying.

user55757081438d ago

Well, I'm sure this MS employee is completely unbiased in his writing... right?

gaffyh1438d ago

Is Albert Penello the system architect? Because he really needs to STFU.

Sony isn't saying the PS4 is 50% more powerful, the DEVELOPERS are. So no matter how you try to explain your way out of it, you can't, because the game makers are noticing the difference.

Also, that bandwidth comment is complete BS. He's presenting the combined peak bandwidth of the ESRAM and DDR3 as the peak possible performance. Bandwidth doesn't work like that. The ESRAM is only 32MB, so data will have to be cut up into 32MB chunks and passed through that bottleneck, which will slow it down LOADS, and then through the 8GB of DDR3 RAM, which runs at 68GB/s. The PS4's entire RAM pool can run at 176GB/s peak. PS4 wins, easily.
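
To put the staging argument in concrete terms, here's a toy model (illustrative only; this is a simplification, not a measurement of either console):

```python
# Toy model: when data must be staged from DDR3 into the 32MB ESRAM
# before the GPU can touch it, the slowest link bounds the throughput.

DDR3_BW = 68.0     # GB/s, Xbox One main memory (peak)
ESRAM_BW = 204.0   # GB/s, Microsoft's quoted ESRAM figure
GDDR5_BW = 176.0   # GB/s, PS4 unified pool (peak)

def staged_throughput(src_bw, dst_bw):
    # A copy pipeline cannot run faster than its slowest stage.
    return min(src_bw, dst_bw)

print(staged_throughput(DDR3_BW, ESRAM_BW))  # 68.0 -> DDR3 is the bottleneck
print(GDDR5_BW)                              # 176.0 -> one flat pool, no staging
```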

nukeitall1438d ago (Edited 1438d ago )

@gaffy & others:

I didn't see Penello refer to what Sony said and use that as a basis. If anything, he is just talking about misinformation that is thrown around by some (not necessarily Sony) and giving examples of how numbers can be misconstrued.

@HammadTheBeast:

"
Our peak ON PAPER is 272gb/sec. YOU DON'T CALCULATE THE PEAK LIKE THAT BY ADDING IN THE ESRAM WITH THE DDR3, THAT'S NOT HOW IT WORKS,"

That is exactly how you compute bandwidth, as long as you have simultaneous access. Think of water pipes: if you have multiple smaller pipes that combined give you more water than one big pipe, you indeed have a higher bandwidth of water coming through. It doesn't matter that it comes through multiple pipes. This is exactly how your memory works.

What is often not discussed though is the equivalent of the length of your water pipe. You might be able to get a lot of water at once, but if your pipe is very long it will take you a long time to get that initial water (or data), causing a delay.

This is equivalent to latency, an area where DDR3 has a big advantage over GDDR5.

Think of an online match: even if you have a gazillion Mbps, a small-Mbps connection can deliver a superior experience with minimal lag, whereas a large-Mbps connection can have massive lag. The simple answer is that online gaming depends on latency, i.e. how long it takes to get the data, and needs almost zero bandwidth. The amount of data being sent and received is minimal; the time is the killer factor.
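
The pipe analogy in numbers. This is a toy model with made-up figures, just to show which term dominates in each case:

```python
# Toy model: time to receive data = latency (pipe length)
#                                 + size / bandwidth (pipe width).
# All figures below are made up for illustration.

def transfer_ms(size_kb, latency_ms, mbit_s):
    return latency_ms + (size_kb * 8) / (mbit_s * 1000) * 1000

# Tiny multiplayer packet: latency dominates, bandwidth barely matters.
print(transfer_ms(1, latency_ms=20, mbit_s=5))     # ~21.6 ms on a slow, short pipe
print(transfer_ms(1, latency_ms=80, mbit_s=1000))  # ~80.0 ms on a fast, long pipe

# Big texture stream: bandwidth dominates, latency barely matters.
print(transfer_ms(100_000, latency_ms=20, mbit_s=5))     # ~160,000 ms
print(transfer_ms(100_000, latency_ms=80, mbit_s=1000))  # ~880 ms
```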

That is why a lot of these numbers, on their own, are at best a very, very rough measurement, and oftentimes skewed towards marketing as opposed to real-world benefits.

strifeblade1438d ago

The Sony fanboys commenting on the system power here are complete embeciles, and know absolutely nothing. But here are exact figures.

40.67% GPU difference in favour of Sony (since the X1 upclock; prior to that it was 50%): (853x12) vs (800x18)

9.375% CPU advantage in favour of X1 (thanks to the upclock; prior to it they were even): (1.75 vs 1.6)
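
For anyone checking the arithmetic (clock x CU count as a naive throughput proxy; the clocks themselves were still partly rumoured at the time):

```python
# Naive throughput proxy: GPU clock (MHz) x compute units.
ps4_gpu = 800 * 18    # = 14400
xb1_gpu = 853 * 12    # = 10236
print(round((ps4_gpu / xb1_gpu - 1) * 100, 2))   # 40.68 -> the quoted GPU gap
print((14400 / (800 * 12) - 1) * 100)            # 50.0 -> the gap before the upclock

print(round((1.75 / 1.6 - 1) * 100, 3))          # 9.375 -> the quoted CPU gap
```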

RAM: in terms of bandwidth Sony has the advantage, but in terms of latency X1 has the advantage. Sony takes steps to minimise the latency disadvantages, and X1 introduces eSRAM to achieve 140-150GB/s (realistically) vs Sony's 176GB/s. I know Penello says differently, but numerous tech sites describe it this way.

I also understand that X1 has larger buses: 30GB/s vs Sony's 20GB/s. I believe Penello mentions this when he talks about the CPU communicating with the GPU at 30GB/s. Sony is 33% slower in this regard.

X1 has 8GB of flash to help the OS run and load apps very fast, compared to Sony's standard HDD, from which apps would load.

Microsoft confirms that X1 has numerous subsystems and customisations that are unknown to us and are supposed to reduce the power gap. They announced (at Hot Chips) that there are 15 processors in the system, and it is rumoured that some of these processors can be used for graphics, which would further reduce Sony's power advantage.

Honestly, I believe Microsoft. A lot of people on forums are implying the PS4 will make the Xbox One look like a Wii U, or that the PS4 is a generation ahead of Xbox, when that is not the case.

Take the Wii U for example: it's 2-3x more powerful than the PS3/360, but do games look 2-3x better than their multiplat counterparts on PS3 or 360? No, but there are small differences. Look at the Wii: it's also a few times more powerful than the PS2/Xbox and GameCube, and you get the same results lol.

darthv721438d ago

is it more powerful...is it not more powerful....?

I don't give a damn. Just give me the games and I will decide for myself.

gaffyh1438d ago (Edited 1438d ago )

@nuke - by that logic the Xbox 360 has a higher bandwidth than the Xbone, because the eDRAM runs at 256Gb/s. So the Xbone is actually a downgrade. We all know that isn't true, so please cut the BS.

@strife - CPU speed isn't confirmed. Original rumours stated 1.6GHz; since February it has been rumoured to be 2GHz.

P0werVR1438d ago (Edited 1438d ago )

I believe the eSRAM bandwidth will be used primarily for textures, obviously, since textures use up most of the memory and take up most of the space.

It's crazy how the likes of HammadTheBeast talk so much about the architecture of these consoles yet still don't have a clue what they're talking about, even just the basics. But it makes more sense when emotions easily take over.

My question to all of you naysayers:

Has Sony been confident enough to post anything of relevance compared to what Microsoft has been doing these past months with Xbox One?

That goes to show they have nothing to show but specs.

The only reason Sony's games look great is that they have better first-party developers. NOT SPECS!

If you can't consider that fact, it goes to show how much you know. Microsoft this gen is going to focus on first-party studios fairly early, with Black Tusk Studios.

@gaffy

Yes, but in the Xbox One architecture panel video they stated clearly, time and time again, that the eSRAM will take full advantage of that very purpose, and that's why it's an "upgrade" from the eDRAM.

Again, you make yourself look foolish.

nukeitall1438d ago (Edited 1438d ago )

@gaffyh:

"@nuke - by that logic the Xbox 360 has a higher bandwidth than the Xbone because the eDram runs at 256Gb/s. So the Xbone is actually a downgrade. We all know that isn't true, so please cut the BS."

Which also means the Xbox 360 is more powerful than the next generation PS4! /s

Totally proving my point that these carefully crafted specs completely disregard other factors, including that the Xbox One has more than 3x the amount of eSRAM that the Xbox 360 had eDRAM.

As Penello said, there is more to it than pure numbers that measure very specific things while ignoring other factors.

MS heavily engineered their system to handle bottlenecks, and they have been far more open about what their system is than the competition (Sony), who hide behind a few carefully crafted spec numbers.

strifeblade1437d ago (Edited 1437d ago )

@gaffyh

Your info is outdated. The only reason the 2GHz CPU rumour exists is that the CPU ships standard as a 2GHz part; both Sony and Microsoft use the same CPU. Since then, newer rumours pointed to a downclock to 1.6GHz to produce less heat, run quieter, and as a result draw less power.

Sony's system does not allow for an upclock to 2GHz, or for their GPU to be clocked above 800MHz. Why?

1st, the system is small compared to the X1; as a result, parts are closer together, thus retaining more heat.

2nd, the cooling system is smaller, and as a result would run louder if the chips were upclocked.

3rd, the power supply is inside the system as opposed to outside, and as a result adds more heat.

These 3 key characteristics will not allow the PS4 to be clocked higher without seriously jeopardizing its components. The X1's efficient component layout is what allowed Microsoft to clock their chips higher.

starchild1437d ago

@strifeblade

You are correct. The ~50% more CU power in the PS4 is FAR from the whole story.

But the mindless fanboys are going to keep screaming "50% more powerful" because that's what they want to believe. They're just as delusional as the Xbox fanboys who believed that misterX guy about the second GPU thing.

I believe that the PS4 is more powerful; I simply reject the simplistic thinking of these fanboys that says "50% more compute units equals a 50% more powerful console". Herp derp.

If that were true, it would mean my PC with an HD 7950 was over 55% more powerful than the PS4, simply because my HD 7950 has 28 compute units vs 18 in the PS4. This clearly is not the case, as differences in APIs, OS overhead, architecture and other hardware and software differences can all dramatically affect the final performance of the machine in question.
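
Here's that same naive arithmetic, with the CU counts per AMD's published specs:

```python
# CU count alone, the same way the "50%" figure gets thrown around:
hd7950_cus = 28
ps4_cus = 18
print(round((hd7950_cus / ps4_cus - 1) * 100, 1))  # 55.6 -> "over 55% more powerful"?
# By the fanboy math my desktop GPU should beat the PS4 by that margin.
# APIs, OS overhead and the rest of the system make the naive number meaningless.
```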

We don't have all the details and even if we did I wouldn't be able to accurately say exactly how the performance of each console relates to each other. What I do know is that the fanboys saying "the PS4 is 50% more powerful" don't have a clue what they are talking about.

For now, I'll listen to people like John Carmack, a man who I respect and who has always shown himself to be knowledgeable, rational and fair.

In his words:

"I haven't done really rigorous benchmarking [but] they're very close and they're both very good," he said.

"It's almost amazing how close they are in capabilities, how common they are."

"And that the capabilities that they give are essentially the same."
http://www.computerandvideo...
http://www.forbes.com/sites...

JokesOnYou1437d ago

I don't see what he said wrong; the info is all right here http://www.eurogamer.net/ar... I mean, reading it, what he said seems legit for the most part. The PS4 on paper is more powerful, but a lot of the 50%, 40% stuff is BS according to all sources, and most say the X1 has some strengths that were previously unknown; nobody's completely sure how it all works. Which is again why putting a hard % on the performance difference is foolish at this point, but of course fanboy engineers/n4g experts will tell you they know everything.

cell9891437d ago

@Starchild Why don't you also listen to the GOW: Judgment dev who claimed the PS4 is indeed that much more powerful?

dantesparda1437d ago

Wow, the level of stupidity and ignorance shown by half the people above me is astounding.

First of all, to all the MS fanboys claiming better latency on the X1: it's not gonna matter, because when it comes to graphics, bandwidth is king, PERIOD! Not latency, FACT! Why do you think all high-end cards have GDDR5 and not lower-latency modules? When it comes to the highly parallelised architecture of a GPU, bandwidth is what matters most (keeping all the pipes fed), not latency. Get that through your thick dumb heads. Latency is not going to matter in graphics processing, and GPUs are designed to handle higher latency better than a serial CPU.

And to strifeblade, don't call people "embeciles" if you can't even spell the word "imbecile" right, cuz it makes you look like an imbecile. Also, the Wii U is 2-3 times more powerful than the PS3/360!? What the f*ck have you been smoking? Tell Nintendo to send me some of that sh*t. There goes all your credibility. It's obvious you know less of what you speak than even NukeitAll.

And PowerVR, you obviously know even less than Nuke or Strife, geesh! its getting dumber in here

And p.s. to Nuke, the 360's eDRAM had 256, wait for it, gigaBITS. You catch the word "BITS", not bytes, as in 1/8 the size of BYTES. Translated into bytes, that equates to 32GB/s, not 256GB/s. You get it? Or do I have to further spell it out for you?
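
The conversion in question, for the avoidance of doubt (this takes the post's premise that the 360 figure was quoted in bits; other sources quote the eDRAM figure in bytes):

```python
# Bits vs bytes: divide by 8 to go from gigabits to gigabytes.
figure_gbit_s = 256          # the figure, read as gigaBITS per second
print(figure_gbit_s / 8)     # 32.0 -> i.e. 32 GB/s, not 256 GB/s
```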

And these are games machines first, and multimedia devices second. In a games machine graphics are very important/king, and for graphics, bandwidth matters way more than latency. So the higher bandwidth of the PS4 will benefit developers more than the lower latency of the DDR3 in the X1. Also, answer me this: if latency is so important to a games system, then why didn't MS go with 1866MHz DDR3 RAM? It has even lower latency than the 2133MHz RAM they went with, since latency is obviously so much more important according to MS fanboys. Also, people, we are talking nanosecond differences here. What's that going to translate into in an app on screen? Half a sec? Less? We are talking billionths of a second. Now stop spewing sh*t y'all know nothing about just because you wanna defend your beloved MS.
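
On the "nanoseconds" point, a rough sketch. The CAS timings here are typical retail-module values assumed for illustration; the consoles' actual memory timings were never published:

```python
# First-word CAS latency in nanoseconds: cycles / clock. For DDR,
# clock (MHz) = transfer rate (MT/s) / 2. Timings below are assumptions.

def cas_ns(cas_cycles, mt_per_s):
    return cas_cycles / (mt_per_s / 2) * 1000

print(round(cas_ns(13, 1866), 1))  # 13.9 ns for a typical DDR3-1866 CL13 module
print(round(cas_ns(14, 2133), 1))  # 13.1 ns for a typical DDR3-2133 CL14 module
# About a nanosecond apart either way -- billionths of a second.
```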

And btw, anything that is that latency-dependent can just be put into the CPU's cache (which btw is way faster still) anyway, as long as it's not too big. And what are they going to put in the 32MB eSRAM? AA? How are you gonna fit gigabytes worth of textures into a 32MB buffer?

But with all that said, I do believe that games will be developed for the lowest common denominator and therefore we will barely notice a difference on multiplats. Unless I'm wrong and devs give the PS4 version higher res, higher framerates, or more effects. But I doubt it, for the sake of parity, cuz you know the people with the inferior version will whine about it.

AngryTypingGuy1437d ago (Edited 1437d ago )

These debates are fun sometimes, but they do get old too. All this arguing when most games will look very close probably isn't worth it. The truth is, if the Xbox One is a little more powerful than the PS4 (doubt it), it's not going to make me want it more. I am choosing PS4 over the XB1 due to the games that appeal to me, the Instant Game Collection and all of the other little things that make up the whole experience. Will we see differences in some games, sure, but most will be close.

nypifisel1437d ago

Yep, that post is frankly quite embarrassing, seeing how most of it is bullshit; anyone who has any clue about how computers function can spot it immediately. I would recommend everyone actually read the replies in the thread after his post; they explain why Albert is spewing poop!

Death1437d ago

@dante

Way too much anger from someone talking about video games. Latency is not an issue with GPUs, that is true; GPUs are not linear in the way they compute. This is why all the top cards use a gig or two of GDDR5. In a traditional PC setup GDDR5 is not good for the CPU, since latency is an issue; CPUs run much more efficiently using DDR3. Neither console is set up like a traditional PC: they have a combined CPU/GPU that shares memory pools. Sony chose GDDR5 and Microsoft went with DDR3. If they both accessed that memory the same way, the GDDR5 in the PS4 would be much faster as far as graphics go, due to the increased bandwidth. We really have no idea how latency on the CPU side is going to be addressed. Microsoft chose to go with DDR3 for their pool. Great for CPUs, but slower than GDDR5 for GPUs. They addressed this with the inclusion of eSRAM. They also included 8 gigs of flash memory, and they access the memory pool a little differently than the PS4. It's a much more complex solution, but it addresses both sides of the CPU/GPU to maximise efficiency. We have yet to see how this translates to games. That is the job of the developers; they will exploit the strengths of each system.

As for your "games are all about graphics" rant, that couldn't be further from the truth. The Last of Us is the latest example of a triple-A game we have on consoles. Graphically it is very nice, but the story and execution are what make it great. A game that is pretty but has no substance is a very hollow experience, and not one remembered well.

strifeblade1437d ago

@Dante

Sorry dante, I guess if I do not spell imbecile correctly then I am one, never mind the hundreds of baseless comments I see from the fanboys. Next time I will spellcheck. Thank you for contributing, truly enlightening.

The Wii U is more powerful than current gen: 2GB of RAM, a more powerful GPU, and the CPU is held back by the tablet processing but is solid when compared to the PS3/360. Some sources are conflicting, but most point to it being on par or more powerful; it just has not translated to games (some multiplats look slightly better on Wii U) since game engines are difficult to port to the Wii U. Again, you like to throw around baseless comments without support in an attempt to discredit me. It's pathetic that you cannot contest my points (they are facts), yet you choose to get me on a spelling error and point out a Wii U comment? Sorry if my info is dated on that subject; I am not interested in the Wii U, as the PS4/X1 are my true interests, and I have researched heavily into the matter.

You're not a hardware engineer, so you are not fit to comment on the efficiency of latency or the lack thereof. I stated the FACTS, you put the SPIN. If latency did not matter, then why did Sony take measures to reduce it? lol. At the end of the day DDR3 is still faster where latency is concerned, and we have yet to see how that will translate into multiplats. Get it through your thick skull. We don't know anything for certain until they are out.

warczar1437d ago

@starchild

Yeah, I'll believe Carmack, since he's made, oh, about 5 games his whole career that ended up on a foreign system. The guy's been a Microsoft tool for years. Jesus, he even looks kinda like Bill Gates.

P0werVR1437d ago (Edited 1437d ago )

@dantesparda

"And PowerVR, you obviously know even less than Nuke or Strife, geesh! its getting dumber in here."

Yeah buddy, nice one. So can you at least elaborate on that?!

Sorry for being too concise, don't want to make a mess with lil' fuzzy heads blowing up.

You calling the eSRAM a buffer, or saying that lower latency is not a HUGE contributor to keeping the pipelines fed, is a huge indicator of how much you know (vewy little).

The eSRAM is more like a "scratchpad memory" built on-chip, making it very fast. The big advantage is that you don't have to deal with the latency issues of GPU/CPU RAM and bottlenecked software implementations... BIG DEAL, since lower latency provides MORE "SUSTAINABLE" BANDWIDTH, therefore keeping the "pipelines fed".

So eSRAM is SUPERIOR to GDDR5 and DDR3 for vital applications (textures), which is why it points DIRECTLY at the GPU. Can you tell me otherwise?! I doubt it!

"And what are they going to put in the 32MB eSRAM? AA? how you gonna fit gigabytes worth of textures into a 32MB buffer."

See what I'm talking about?

That is why they have DMEs (Data Move Engines) to compress and move those "render to texture" targets, and leave the main memory for whatever the hell developers can think of to add. For those of you who don't know what that means... GENIUS DESIGN over BRUTE FORCE!
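
The general pattern being described — a small fast scratchpad with bulk data staged through it in tiles — looks roughly like this (illustrative Python, not the actual XDK API; the tile size and work function are made up):

```python
# Staging a large resource through a small, fast scratchpad in tiles.
# In the X1's case the "scratchpad" would be the 32MB eSRAM and the
# copies would be done by the Data Move Engines, overlapped with GPU work.

SCRATCHPAD = 32 * 1024 * 1024           # 32MB budget

def process(resource, tile=SCRATCHPAD // 2):
    # Using half the scratchpad per tile leaves room to double-buffer:
    # the copy of tile N+1 can overlap with work on tile N.
    out = []
    for off in range(0, len(resource), tile):
        staged = resource[off:off + tile]   # "DME" copy: RAM -> scratchpad
        out.append(work_on(staged))         # fast, low-latency access here
    return out

def work_on(tile_data):
    return len(tile_data)                   # stand-in for the real GPU work

print(sum(process(bytearray(100 * 1024 * 1024))))  # 100MB worked through a 32MB pool
```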

Don't you think they would have known that GBs of data wouldn't fit into 32MB of space? My goodness, some of you are just a bunch of mooks on this site.

EDIT:

So yes, it is indeed about bandwidth. And since bandwidth has a lot to do with textures, it is textures that are important to solve for, which is why Microsoft makes a huge deal about the eSRAM.

But again, it's also all about first-party developers taking advantage of the hardware, which only a few of the many third-party developers do.

Features, features, features... features. Microsoft dominates features because they target casuals, not the consumers who are easily swayed by BS specs.

Also, everyone misses the point of the silicon dies either because they don't understand it or because they don't even want to consider it. The former, more likely.

Ginesis1437d ago

@stevehyphen FINALLY!!! Someone with some sense on this site. Everything you said...absolutely true. There's hope for N4G yet!

UltimateMaster1437d ago

Oh come on, what a useless debate.

Who really cares what numbers and what engine tool a system is going to have.

Both consoles' games look great and are very realistic.
I don't really see much of a difference between the 2 of them.

The real deciding factor is the games they will have and how good they are.

jimbobwahey1438d ago

If you look on NeoGAF, you'll see people tearing Penello apart for trying to straight-up lie to and deceive people, because the numbers and figures he's been quoting do not add up at all, and anybody with even the most simple and basic understanding of hardware knows that he's on a campaign of deceit.

Really, the guy is trying to spread as much misinformation about the Xbox One as possible to make it look good, and more and more people are calling him out on his nonsense now. Penello is in no way deserving of any respect whatsoever, it's actually rather disgusting that he's going to such great lengths to try and trick people.

Thankfully, the crowd he's trying it on with are ripping his lies apart.

devwan1438d ago (Edited 1438d ago )

Exactly. The eSRAM might well be read/write at the same time, but the best-case throughput figures they offer are not realistic in use; they are a theoretical maximum, utterly unrealistic under real-world conditions.

vulcanproject1438d ago (Edited 1438d ago )

At a very basic level he says their CPU is faster. Does he know the PS4's clock speeds? He said he didn't, only a couple of weeks back.

This leads on to the claim that the Xbox One has a better sound chip that takes more load off the CPU. He knows this because he knows so much more about Sony's custom chipset, with its secondary processor, just like he apparently knows their finalised clocks? HMMM!!!

He said that having 50 percent more CUs doesn't equate to 50 percent more performance. Actually, it generally does, because graphics tasks are incredibly parallel. For example, a Radeon 7870 is twice the card the 7770 is on paper, and in gaming performance, unsurprisingly, it is almost exactly twice as fast... http://www.guru3d.com/artic...
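
The on-paper math behind that example (shader counts and clocks per AMD's published specs for the GHz Edition cards):

```python
# Theoretical single-precision throughput: shaders x clock x 2 FLOPs (FMA).
def tflops(shaders, mhz):
    return shaders * mhz * 2 / 1e6

print(tflops(1280, 1000))  # HD 7870 GHz Edition -> 2.56 TFLOPS
print(tflops(640, 1000))   # HD 7770 GHz Edition -> 1.28 TFLOPS, exactly half
```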

He quoted the wrong peak bandwidth figure, based on the old clocks of the Xbox One (the 800MHz figure, not the 853MHz figure).

He can't say if the eSRAM can ALWAYS read/write simultaneously. It probably can't.

But....yea. It goes on like this. We don't have good answers from them yet.

kneon1438d ago

@vulcanproject

Even if their CPU is 10% faster, he is ignoring the fact that their OSes plus hypervisor are going to eat up any CPU performance advantage the XB1 might have.