A few game developers share their thoughts on PS3's CELL processor.
Kind of reminds me of the eSRAM in the Xbox One a bit. Some devs just don't have the time, knowledge or resources to code for it, so games don't reach their full potential. But as you can see from the PS3's first party games (whose devs had better knowledge of the CELL, I'm assuming), the PS3 was a very capable machine.
Exactly what I was thinking in terms of comparison. People also compare the third party situation on the Xbox One a lot with how it was for the PS3 at the time; I clearly remember so many pixel counting articles. First party games on the PS3 actually looked and played much better than any multiplatform game released on the PS3 last gen. Games like TLOU are a perfect example of how there actually was a HUGE difference between consoles. The same really can't be said this gen, since everyone states how similar the PS4 and Xbox One are.
God, the amount of pixel counting articles last gen was insane. Seriously, so many pixel counting websites popped up last gen and were profitable, and then this gen comes along and pixel counts just aren't big news anymore. I have no idea why /s
"Lazy Developers are too Lazy to unleash the CELL CPU's true POWER" is an argument as dead as the CELL architecture itself. No need to defend the STUPID CELL ARCHITECTURE with Uncharted, The Last of Us and God of War anymore, because those games are better on the PS4, at 1080p. SONY, thanks. Square Enix / Bandai Namco / Capcom / Koei Tecmo PS3 games/JRPGs are terrible because the developers are lazy, not because the PS3 CELL CPU is clearly BRAIN-DEAD.
Except the difference is that eSRAM doesn't have hidden power waiting to be unlocked. A) The Cell actually had quantifiably large power that could be used. B) eSRAM doesn't. It is 32MB, and that stays the same. Total read is about 100GB/s and total write is about 100GB/s, so together they make up about 200GB/s. GDDR5 is about 185GB/s on read alone... or write. There isn't any power to be unlocked; best case scenario, maybe you can catch up to GDDR5 with your 32MB... C) Devs used eDRAM all the time last gen. This isn't hard to develop for. It is just SMALL, and that slows things down. It doesn't matter what first party dev you have, you won't unlock power from eSRAM because there is none to be unlocked.
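For what it's worth, the bandwidth arithmetic in this comment is easy to sanity-check. The numbers below are the peak figures quoted in the comment itself (real sustained bandwidth depends on the workload and is lower), so treat this as an illustration of the comparison, not a benchmark:

```python
# Peak bandwidth figures as quoted in the comment above.
# Sustained real-world bandwidth is workload dependent and lower,
# so this is only an illustration of the comparison being made.

ESRAM_READ_GBPS = 100    # eSRAM read path
ESRAM_WRITE_GBPS = 100   # eSRAM write path
ESRAM_SIZE_MB = 32       # total eSRAM capacity

GDDR5_GBPS = 185         # quoted GDDR5 figure, read OR write, full pool

# Best case for eSRAM: simultaneous read + write
esram_combined = ESRAM_READ_GBPS + ESRAM_WRITE_GBPS

print(f"eSRAM combined peak: {esram_combined} GB/s, but only over {ESRAM_SIZE_MB} MB")
print(f"GDDR5 peak: {GDDR5_GBPS} GB/s over the entire memory pool")
```

The point being made: even in the ideal simultaneous read/write case, the eSRAM only narrowly beats the quoted GDDR5 figure, and only for data that fits in 32MB.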
Square Enix worked hard on the PS3 version of FINAL FANTASY XIII. FF13 is 720p on PS3, 576p on X360. Yet people blame Square Enix for not doing their best instead of the real culprit, the BRAIN-DEAD PS3 CELL architecture, which constricted and hindered developers from achieving their vision to the fullest. PS4, thanks for liberating The Last Guardian from its PS3 prison CELL.
No, dork, there wasn't a HUGE difference between the 360 and ps3. The cell was powerful, but the xbox had a better gpu and unified ram, which mostly negated that.
There are still a few delusional defenders of the BRAIN-DEAD PS3 CELL architecture (which SONY dumped for an AMD Jaguar CPU) on NeoGAF spewing, "focus on SIMD, manual memory management, high degree of parallelism, streaming"... ...The Xbox 360 has 3 CORES, 6 THREADS and 10MB of eDRAM; it can do those things too, easier. SONY could simply have outmatched it with a smarter, cheaper 4 CORE, 8 THREAD CPU, saved billions and given us THE LAST GUARDIAN!!!! Instead SONY gave us the 1 CORE, 7 SPE CELL "architecture", a fool's errand to bring out the true POWER of the SPE (Stupid Purposeless Endeavour). FINAL FANTASY XIII: 720p on PS3, 576p on X360. See, the CELL CPU got more POWER!!!
@never4get "... Xbox 360 have 3 CORES, 6 THREADS ... ... Instead SONY gave us 1 CORE, 7 SPEs... " So you're saying that THREADS are the same as SPEs??? >Thread != Parallel<
DirectX retarded the industry far more than the Cell. We had 2 growing APIs, and instead we spent years letting money and influence kill those off so a crippled DirectX could be rammed down our throats. Sure, it has begun to make some decent advancements, but we spent years backtracking and staggering under a very crippled system, all for the monetary gain of a single company.
When I see the games and how every 1st party franchise evolved through its entries, I am more than happy they chose the CELL processor. While the 360 had been maxed out within its first 3 years, the PS3 still had room to progress. Games like Beyond: Two Souls, TLoU, GT6, Uncharted 3, God of War: Ascension and Killzone 3 are there to prove this. Frostbite got bitch slapped by every engine behind the games I listed above.
The PS3 scenario was different, because on paper the PS3 was the most powerful console. This gen the PS4 is both performing better AND is the one with the most power on paper, so there isn't much to discuss.
Huge difference? I don't think so. Halo 4 on the Xbox 360 was the best looking game of last gen; check out all the awards. Generally the Xbox 360 was a more powerful console than the PS3. That doesn't mean the PS3 didn't have amazing exclusives that looked mindblowing. It all comes down to budget.
A fast fact: FF XIII was lead on PS3; the 360 version was a last minute port running a different engine. The other two games are 720p on both consoles.
ESRAM is just an evolution of the EDRAM found in the X360. The problem with the XONE is mainly that of immature drivers and poor launch SDKs. It's been heavily documented.
and again it is not large enough.
Exactly. eSRAM's problem isn't just "immature" drivers and a "poor launch" SDK, it's that it's too small. That's its real problem.
The issue isn't just that the eSRAM is too small, it's also that DDR3 is way too slow for graphics purposes. The GPU sits right between a 7770 and 7790, meaning it should have had a dedicated 1-2GB of GDDR5 clocked at 84GB/s; DDR3 only offers a maximum of around 55GB/s real world. If Microsoft wanted 8GB of memory in their 8th gen Xbox and didn't think they could get GDDR5 for all of it, they should have opted for 1-2GB of GDDR5 and the rest DDR3.

The 360 basically had the amount of memory its GPU needed: 512MB of GDDR3, with an additional 10MB of eDRAM to help with graphics, because the whole system shared that 512MB pool. That eDRAM cache is 25.6x smaller than the 256MB of VRAM its ATI PC counterpart required. The 32MB of cache in the XB1 is 32x smaller than 1024MB of GDDR5 and 64x smaller than 2048MB, a way bigger deficit compared to the 360's setup. The Xbox 360's processors really were less bottlenecked for the amount of tech they had.
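The "too small" argument is easy to put numbers on. A back-of-envelope sketch, assuming 4 bytes per pixel per render surface (a common RGBA8 color or 24/8 depth-stencil format) and using the capacity ratios quoted in the comment above:

```python
# How much of a 1080p frame fits in 32 MB of eSRAM?
# Assumption: 4 bytes per pixel per surface (e.g. RGBA8 or D24S8).

BYTES_PER_PIXEL = 4
MB = 1024 * 1024

def surface_mb(width, height, bpp=BYTES_PER_PIXEL):
    """Size of one render surface in MB."""
    return width * height * bpp / MB

color_1080p = surface_mb(1920, 1080)   # one 1080p surface, ~7.9 MB
# A deferred renderer typically keeps several surfaces live at once:
# color + depth + a few G-buffer targets.
deferred_1080p = 5 * color_1080p       # ~39.6 MB, over the 32 MB budget

# The capacity ratios quoted in the comment:
edram_ratio = 256 / 10        # 360's 10MB eDRAM vs 256MB PC VRAM: 25.6x
esram_ratio_1gb = 1024 / 32   # XB1's 32MB eSRAM vs 1GB of GDDR5: 32x
esram_ratio_2gb = 2048 / 32   # vs 2GB of GDDR5: 64x

print(f"one 1080p surface: {color_1080p:.1f} MB")
print(f"five surfaces (deferred): {deferred_1080p:.1f} MB vs 32 MB of eSRAM")
```

So a single 1080p surface fits comfortably, but a multi-target 1080p pipeline overflows the 32MB, which is why devs had to tile, compress, or drop resolution.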
I think it's fair to say eSRAM is nowhere near the same level of issue the Cell was. We haven't heard anyone complain for a while now. The Cell is what stopped many Japanese devs from initially supporting the PS3; they either stuck with the PS2 or even tested the waters with the 360. Ultimately the Wii and PS3 ended up splitting the Japanese devs, but it took way too long to get there compared to Sony's track record. I remember reading that the PS3 and PS4 are actually not far off in terms of average game development costs. That is insane.
The context of the PS3 vs. the PS4 in terms of development cost is off, though... Going from the PS2 to the PS3, in terms of capabilities, is about as big a generational leap as we're going to see for a very long time... it's nowhere close to the PS3 to PS4 jump... those two are FAR more similar in capabilities than the PS2 and PS3 were... why exactly have development costs stabilized?... because rendering targets aren't that different... some better textures, some better lighting, a slightly higher resolution... and in a huge portion of cases... the exact same framerates... yes, PS4 games look 'a lot' better... but look at Killzone 2 vs. Shadow Fall... then look at Killzone 1 vs. Killzone 2... you'll see exactly what I'm talking about... one of the most expensive games ever made was GTAV... releasing first on PS3 and 360... what current gen game, other than the high res version of that same game, comes anywhere near that in scale and scope?... nothing yet... so it's no wonder current development costs aren't increasing at the staggering rate they were through last gen...
@Kleptic: For one thing, the PS4's design is much more efficient at making use of its technology compared to how things worked on the PS3. Killzone 2 didn't release until 2009, 2 years after the PS3 launched; arguably we've already seen as big a leap from Killzone 3 to Shadow Fall as there was between Killzone on PS2 and KZ2 on PS3. And Horizon is right around the corner: not only is it visually more impressive than Shadow Fall, it's also a massive open world game with completely real time weather, lighting and fully dynamic physics. There's also an advancement in AI when you pay attention to how the Machines' AI reacts and adjusts to the player's actions; all of those systems working together have never been done in a game like that. The sheer scope of it is well beyond even GTAV. Now I'm sure the next Rockstar game will push things further in its own way, but for now Horizon is a technical marvel that proves where the leap is.

If you want graphics that show a sheer leap on that level, check out the real time cut scenes of Uncharted 4, the intro with Drake & Sully and the ending with Elena; all of that detail is on a completely new level compared to even The Order, and that game was as close to CG as video games have ever reached.

Cost wise, the tools, engines and reiterated tech swallowed up much of the development budget, but things have gotten easier, and developers can now basically use the assets they make for marketing materials in game, running in-engine and in real time. Visuals are definitely on another level compared to last gen; the gap is definitely there when you compare current gen exclusives to the 7th gen, and the features are adding more depth to the experience. I'd argue that the higher efficiency of current gen hardware has basically meant developers didn't need as much of a power leap to make that leap in output.
"I remember reading that the ps3 and ps4 are actually not far off in terms of average game development costs. That is insane." Huh? This is a good example of reading and believing some bullshit on the internet, I think.
I wouldn't say that. eSRAM is something that will be mostly ignored or used superficially by third party devs. CELL, however, was a mandatory adjustment to the dev process; devs were forced to learn to use it properly. @ below: I don't think so. A PC with 8GB of DDR3 is fully capable of 1080p gaming. I think you underestimate the complexity of development. It's not even remotely as simple as "DDR3 + eSRAM = 1080p".
eSRAM is necessary to achieve 1080p. Not using it, or not using it properly, will cause lower resolution. This is why DX12 will be very beneficial to the XB1: the tools will make the eSRAM usage automatic.
Wait... a PC with only 8GB of DDR3 and an APU of some sort is hardly 'fully capable of 1080p gaming'... PC resolution capabilities usually come from dedicated GPUs that use more wattage than an entire console... and have their own dedicated bank of memory, which is almost never DDR3 anymore...
They were also "forced" to learn how to utilize the Emotion Engine in the PS2. They were forced to learn the R3000 instruction set for the PS1. In addition, they were "forced" to learn to develop for the 9 distinct processors found within the Saturn, as well as "forced" to learn how to program every bit of proprietary hardware that has existed in gaming consoles since they moved away from the 8080/8086 after the NES generation. The PS3 had a new way of doing processing which was a departure from the standard RISC/CISC instruction sets, but it was quite powerful for what games were starting to do last gen... which is why Sony went with the Cell in the first place. Learning new hardware has always been a thing in console development; developers just never complained about it publicly until last gen. It's pretty funny really, because many devs complained about it, yet the things the Cell architecture introduced are pretty much the exact things being done today with GPU compute and eSRAM, just implemented differently. Of course, looking at it logically, and from a price standpoint, Sony probably would have been better off going with a standard setup, allowing for more memory. The raw computational speed and data throughput of the Cell were pretty amazing for their time, though.
Sony went with the cell because it was good at streaming which is what BluRay movies, not most games, required. Since Sony was involved in a format war they built the system to play movies first and games second. It worked! Sony lost market share compared to PS2, but gained revenue from every Blu Ray disc and player sold to the entire world. It's sad that Sony's greed caused them to take advantage of Playstation loyalists by selling them short on games technology in order to win a format war. It's also sad that people still defend Sony like they're suffering from Stockholm syndrome. Hypocritically, many of the people who lauded PS3's multimedia capabilities are now bashing MS for making the XB1 a console that does more than just games, while supporting the PS4's relative lack of features.
No, not really. From what I have understood, eSRAM is a bottleneck, while CELL was optional: if you didn't want to deal with it, you didn't touch it. CELL also involved a whole different set of coding skills (though I think Sony released better development tools later on?) and learning how to implement that functionality alongside the other machine functions. eSRAM is mainly about resource management, AFAIK. I don't think that even the eDRAM in the X360 was a bottleneck, because I remember it was optional too. Though it's been said that Microsoft has managed to sort their stuff out when it comes to the eSRAM (on the software side), which is why we have seen less drastic differences between games lately.
Cell wasn't optional.. lol. The Cell WAS the PS3!!!
That's incorrect. ESRAM isn't a bottleneck. It is not mandatory to use. It's actually the fastest memory. DDR3 is more of a bottleneck than the cell..
Neither the Cell nor the ESRAM were optional, you are smoking some bad weed dude.
A dev didn't have to utilize all the Cell's abilities to improve their game; that part was optional, but the Cell was pretty gimped if you just went with standard coding practices. If you used only the base core, it was a single core processor with a high clock and some nice memory bandwidth.

eSRAM itself is not a bottleneck; it's used to circumvent a bottleneck. The actual bottleneck is the bandwidth of the system memory. It is also optional, but not using it hinders the system's ability to render a screen.

Nowadays, GPU compute is optional, but many devs use it because it works and they understand it. GPU compute is basically what the Cell did extremely fast... even faster than today's video cards. It was also flexible enough to use for things other than GPU processes, whereas GPU compute can be limiting at times. After the first couple years of the generation, there was no reason devs should have been griping about the Cell architecture, because it did natively what they do anyway with GPU compute... although they had to code in those native commands, and the tools to handle such things did get better over time. If it "retarded the industry significantly", it's because devs didn't take the time to optimize for the Cell and just wanted an easy port from base PC code to a console... which is much more likely, given that's what devs were asking Sony to create in the PS4.
@UkrainianWarrior Obviously the XB1 was designed so that the eSRAM is pretty much essential. It's also a very small memory, maybe a little undersized (I know it relies massively on speed, but even then the size must be sufficient for it to be very effective). That's why I said it's a bottleneck: it's the only way to get desirable speeds, but it might be slightly undersized. Let's just say the XB1 wouldn't have had nearly as many inferior ports from the beginning if the eSRAM were slightly bigger than it is now.

@rainslacker Good reply. I was always of the belief that the hard part of the Cell was the cluster of 7 SPEs? Like, I figured the PPE and the SPEs are probably on the same die, but I didn't think the main PPE was that hard to work with. Optional might have been the wrong word to use there. I have also thought that, unlike the eDRAM in the X360 that could be used for an additional speed boost, all the essential (framebuffer?) data has to pass through the eSRAM, which makes it much less optional than the eDRAM? Then again, I'm just parroting what I have heard... I'm not a developer or anything. It would be better to get my facts straight than to be told "no! dummy".
From my experience, it wasn't that the Cell used SPEs, it's that the logical order of how you had to program for it was completely different from standard practice on a more conventional RISC/CISC chip. For the general purpose parts of games, the basic loop, AI, etc., there wasn't really any change to the way things were done. It still used standard RISC-based instructions, and the APIs were adept enough to access it without much trouble. However, when implementing things like physics or certain aspects of graphics rendering, it required some new trains of thought and proper ordering of instructions. These instructions were an extension of the RISC set, and while not overly complex, they added an extra layer of processes for the developer to take care of, particularly when paired with the RSX as opposed to a standard graphics processor. When used properly, though, it was extremely efficient, and even 1st party devs came nowhere close to what it was capable of.

The PS3's design itself was extremely flexible across many different implementations, and was designed in such a way as to process the things that were on the uptake in game design... mainly GPU compute (of which the Cell used a special implementation) and hyper-threading. The real hard part was that it wasn't easy to port from the more standard DirectX/OpenGL methods to the SPEs' special instruction set, because there was no direct translation applicable to all situations. As it is now, there are direct translations for most DX->OpenGL instructions, so porting requires less work. This is why I believe many devs complained: it added more work onto their already tight schedules. Game developers are quite talented... they have to be, and they are capable of learning things like any good engineer would be, but that doesn't remove the need to actually do the work.
I don't consider them lazy for not wanting to do it, because I know what kinds of schedules these guys work, and it doesn't help that a new thing made it even worse. eDRAM was more of a cache than eSRAM is. eSRAM is meant for a very specific purpose, although it can be used for other things; from what I've read, it's not as efficient when used in ways it's not meant to be used. eSRAM isn't really meant to be used as a memory cache, but more as a way for the GPU to access the frame buffer without using the slower system bus inherent to the slower DDR3 memory. However, I will qualify all of this by saying that nothing is as black and white as my or others' comments, or countless articles, or numerous developer statements may make it seem. There are always "tricks" a dev can use to extract more out of a system... or make it seem that way.
Those are what I call inept developers. They blame other things rather than their own aptitude to make games. It's not all about having the most power; it's also about efficiency. Sometimes people are so stuck in their ways that they refuse to try something new. How long did it take to convince people the world was round? A case could be made for DVD9 on the Xbox 360 retarding the industry. But then it takes an experienced team like Rockstar to crank out GTA5, or Bethesda with Skyrim. When I first had GTA4 and Oblivion I thought those were the best graphics I would see that generation, that the draw distance was at its max, that every couple thousand steps there would be a loading screen, but then we got completely blown away when the sequels came out. Now I cringe at GTA4 and Oblivion, the games I once thought were the most beautiful (exaggerating a bit). That is talent. Those are great developers: the ones who can learn new ways, adapt, and be efficient.
Remind me who the hell are you again ?
...except Frostbite is one of the most impressive engines from last and current gen, so...
Frostbite engine might be impressive, but it's the developer who learns to harness the power of the engine. Give the Uncharted/Naughty Dog engine to any other developer or a monkey, doesn't mean that they are going to churn out Uncharted quality games.
Wow, moving goalposts. eSRAM still works like RAM; it was used in the 360 and it is used in the Xbox One. Cell changed the whole architecture of the way programming was done, which was stupid.
The Cell used RISC architecture. The only thing that changed was offloading tasks to sub-processors; today, that's called hyper-threading and GPU compute. It's the same principle the Cell used, and no one is complaining about those things.

eSRAM works as a buffer to render the frame. It is most definitely not like RAM, and falls more into the category of the post processing that would be done on the Cell should a dev decide to offload rendering tasks onto it, which was available to them if they learned how and stopped complaining about it. It wasn't even hard; I learned to do it on my own with hardly any documentation in my 2nd year of game programming classes, with only a cursory knowledge of graphics rendering APIs.

Was it stupid to go with the Cell? The Cell was a beast of a machine for what games were doing at the time. It was poised to take full advantage of current trends in game development. Every system from almost every company released since the NES had proprietary hardware that developers had to learn to use, and never once was there complaint, except maybe for the Saturn with its 9 processors. People who make the criticisms you do have no idea what the architecture, or the code that surrounds it, means. The PS3 used a RISC-based instruction set, which is what the 360 used last gen too, plus an extended instruction set for Cell processes. Other than that, the ability to transfer tasks between GPU and CPU, and some serious direct memory bandwidth which was impeccable for its time, there was nothing foreign about what programmers were doing. The complaining, I believe, comes more from the fact they couldn't just port games from PC to Xbox to PS3 without doing extra optimization work... something usually mitigated by the Unreal engine the vast majority of them used anyhow.
It was funny to discover that the PS3's CELL is actually more powerful than the CPUs used in the Xbone and PS4... I really don't think the problem was the CELL itself, but the architecture of the PS3 as a whole. So what is this about the CELL retarding the industry now?
I read that in an article too. They did some comparisons with the PS3, 360, PS4 and X1, and it turned out that the PS3's Cell processor was indeed stronger than the CPUs in both the X1 and PS4. #you know they gonna down vote us before looking up the facts
In terms of raw throughput, yes, that is true. However, that power is limited to the kinds of data it can process faster than the CPUs in the new systems. The Cell in the PS3 is actually more powerful than the CPU in many high end gaming rigs, to be honest. But that power is meaningless if it can't be utilized for the tasks you throw at it. For general computing tasks, the Cell isn't a beast; it's a single core processor with a high clock. Where the Cell really shines is in manipulating specific kinds of data, and it can compute huge amounts of data when that data is in certain formats.

Overall, the Cell could have been more than what we saw implemented in the PS3. But the premise of the Cell has simply moved into more general computing principles, implemented through GPU compute and hyper-threading. The Cell lives on in this way more than in hardware, and I don't see developers complaining about it; most were asking for it. Even the architecture of the PS3 wasn't all that foreign. Outside the Cell, the rest of the setup was pretty common. The split memory was a bottleneck, but it was made up for by the ability to offload graphics tasks onto the Cell. This is the principle I think tripped up a lot of developers, because while offloading tasks was nothing new, it was usually CPU->GPU and not GPU->SPE->GPU.

Overall though, the raw power of a CPU is only a small factor in how powerful a system is. Everything has to be looked at as a whole; a system is only as powerful as its weakest link, and only if that weakest link isn't circumvented by another means.
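The "split a data-parallel job into chunks and hand them to side processors" model being described here can be sketched in miniature. This is only an illustration of the dispatch pattern, with hypothetical function names: a thread pool stands in for the SPEs (real SPEs were separate cores with their own local store, fed by DMA), and squaring numbers stands in for a tight numeric kernel:

```python
# Toy illustration of the Cell-style dispatch pattern: a general-purpose
# core (the PPE) splits a data-parallel job into chunks and scatters them
# to side processors (the SPEs). A thread pool stands in for the SPEs;
# the real hardware used separate cores with local memory fed by DMA.
from concurrent.futures import ThreadPoolExecutor

def spe_kernel(chunk):
    """Stand-in for an SPE job: a tight numeric loop over local data."""
    return [x * x for x in chunk]

def ppe_dispatch(data, n_workers=4):
    """Split, scatter, gather - the restructuring step that straight-line
    CPU code had to go through when ported to the Cell."""
    size = (len(data) + n_workers - 1) // n_workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(spe_kernel, chunks)
    # Gather the partial results back into one list
    return [x for part in results for x in part]

print(ppe_dispatch(list(range(8))))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The point of the sketch: the kernel itself is trivial, but the program had to be reorganized around the split/scatter/gather shape, which is exactly the restructuring the comments above describe, and also the shape GPU compute jobs take today.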
I might say these Frostbite and other "tweeting" devs could have a point, if it weren't for the fact that Uncharted, God of War and THE LAST OF US were the technical marvels that they were on the PS3. I mean, these Frostbite devs can't even release their own game on current 8th gen systems without bugs, glitches and problems.
Agree with you, the PS3 had great titles and was well optimized with first party games. Imagine what it would have been like if 3rd parties had known the console to its potential in the early years of last gen. Just the thought of it; maybe the outcome between the 360 and PS3 would have been different, and interesting.
You know what was really broken, other than this supposedly backwards PS3 architecture? The Xbox 360's 54.2% RROD failure rate. Anyone remember that? Ah, that felt good. Keep the war raging, fellow losers hahaha
Then they should go flip burgers and quit calling themselves developers/coders/programmers if they can't figure things out when others clearly made it work just fine.
Yes, the PS3 was a very capable machine... However, Sony fully admitted to the problem by ensuring that the PS4 was much, MUCH easier to program for.
Totally different situation. One could argue the PS3 innovated in both the gaming and entertainment industries by bringing Blu-ray to the forefront. The Cell allowed us to play games we had no business playing last gen; Uncharted 3, God of War: Ascension and TLoU were incredible technical feats. Honestly, it didn't take everyone all that long to start yielding incredible results: Motorstorm was gorgeous and released in 2006, Uncharted in 2007, Uncharted 2 & inFAMOUS in 2009, Killzone 2 in early 2009, and God of War 3 in early 2010. Even the first iteration of GT5 came out in 2007, with the next coming in 2010. In its first 4 years the PS3 produced some incredibly impressive games. Even non Sony-owned studios had early success with the Cell, on games like Resistance, ToD, and of course MGS4: Guns of the Patriots.
I'm not sure you can compare them, because the Cell had huge power but it was bottlenecked by bandwidth, wasn't it, if I remember right. It was very complicated, and devs didn't want to spend the resources learning it, unlike Sony's 1st party devs. I'm not sure how well devs are doing with eSRAM, but damn, the Xbox One has some fine looking games coming soon (Tomb Raider and Quantum).
I don't care that it was hard to develop for; all I know is I had some of the best gaming experiences on the platform, from the devs who worked hard to understand it. You know what's funny? This generation the systems are easier to develop for, and yet these devs are still releasing broken games. I think it's more about the devs than the equipment.
I sort of agree that some developers don't have the resources, or possibly the expertise, to develop on a different architecture. Still, many development tools actually do hide the "metal" from the programmer, and if designed properly there would not be much of a difference designing for Intel, PowerPC or the Cell chip, which is based on PowerPC anyway. Technology does change, and those who slam technology because it is new or different are usually set in their ways. Any Professional Engineer (note the capital "P" and "E"), be it Civil, Electrical or even IT, should be able to adapt to change; those that can't, well? Just because someone has a tertiary qualification such as a degree or doctorate does not mean they can now forget about learning. In fact, the only time people with technical qualifications should stop learning is when they die.
Okay, i lol'd at that header.
Insert Forrest Gump comment here:
Devs were still able to make many exceptional titles on the PS3. Not to mention there are still more upcoming PS3 games than any other 7th gen platform
Exactly. Third-party devs couldn't be bothered to take advantage of the Cell, but first-party devs proved a point time and again of just what it was capable of. PS3 has the most technically advanced games of last-gen, period.
THANK GOD someone actually said it. The CELL was such a massive blunder. You get very similar performance from the 360 but it's far more accessible to developers and cheaper to manufacture.
Funny how the PS4 is the easiest console to develop for now, especially with its unified memory. Sounds like Sony really did listen to the criticism from developers. Anyway, it's better for the industry if consoles have a PC-like architecture.
You can thank MS for the PS4 being easy to develop for. Beauty of competition 👍
If by Sony learning you mean Sony paid AMD to do it, you are absolutely correct. AMD makes the APU for both the Xbox One and PS4. The difference lies in the memory used and how it is accessed. Sony went with GDDR5 graphics memory and Microsoft went with CPU friendly DDR3. Neither was the ideal choice and both rely on different techniques for balance. Sony's solution is easier for graphics computations and Microsoft's is easier for CPU tasks.
@ septic Or you can thank the developers whose feedback was given. Or thank Mark Cerny for his vision of what the PS4 should be.
@Septic Well, if it was easy to develop for in the first place, developers wouldn't have complained about it so much. Thankfully Sony listened to developers and changed the system's design, which is why I give more thanks to the developers who complained about it in the first place than to Microsoft. That's the beauty of listening to developers.
@Death Actually, by "Sony listened" I meant that Sony listened to the complaints of developers and then decided how to make the system a lot easier to develop for. They didn't just appear on AMD's doorstep and ask them for an easy system; they appeared on AMD's doorstep with the blueprints of the system and paid them to build it with the tech they had. AMD didn't listen to developers, Sony did. And if you don't believe me, you can look at the specs of each APU and come to the conclusion that each console manufacturer told AMD what features they wanted in it.
You should be thanking PC gamers .
How does Microsoft factor into it at all, Septic?
@Spotie, "How does Microsoft factor into it at all, Septic?" Unlike your constant plugging of the PlayStation in Xbox articles, mentioning the Xbox 360 is actually relevant here. The popularity of the Xbox 360, especially among third party western developers, woke up Sony, who had previously always used proprietary hardware that made it difficult to work on; something Sony always wanted, as it forced developers to learn to make games especially for their systems. The PS4 is a machine finally made for developers, with the right architecture to make games far easier to develop.
Microsoft factor into it because the 360 featured a pool of unified memory. 512MB of GDDR3 to be specific, serving both the CPU and GPU. Sony listened to devs. And those devs said "make it more like the 360". This is not a bad thing.
Apart from the different talent Sony and MS had at their disposal, why was it that the PS3, despite being a w*nk to develop for, yielded the best games from a graphical standpoint? GT, GoW, TLOU etc. Again, the PS3 was a pain, but shouldn't the console that was easier to dev on have surpassed the harder one more frequently? Turn10's output vs PD's output, for example?
@Spot- It's Septic... MS factors into anything he posts. When did MS actually even work on the PS4? Oh Septic, lolz. @Volkama- So MS should thank Sony for backing and helping create Blu-ray now that the XONE has it? Developers must have been asking for more space.. http://www.1up.com/features... and http://www.tomsguide.com/us... So maybe they said "make it more like PS3" when they talked to MS about their format issues. lol