Sony Considering Larrabee for Future Playstation

Sony Computer Entertainment (SCE) is reportedly weighing its options for the CPU and GPU of its next-generation game console.

One option under consideration is to carry over the PlayStation 3's Cell Broadband Engine (Cell BE) and further improve on it.

However, another processor is also being considered: Intel's Larrabee, which is attractive because it can serve as both CPU and GPU.

SRT4Chris4014009d ago

They should just completely ditch the Cell processor and take the PS3 as a lesson: stop trying to get too exotic with the architecture. The current gen of devs are nothing but whiners.

muddygamesite4009d ago

I agree with you in the sense that the current gen of developers are big whiners.

However, I disagree with your point about ditching the CELL architecture. They invested a lot of money into it, and it is indeed a processor that is ahead of its time. By the time the PS4 comes out, the CELL would cost STI peanuts to manufacture, while being able to really step up its capabilities for next to nothing. So ditching it for a newer, more expensive part might not be too bright.

We also have to look at the issue of backward compatibility. By ditching the CELL, Sony would have to do more work to make the PS4 backward compatible with the PS3. But with the CELL in the PS4, backward compatibility will be a breeze and cost them next to nothing.

You also have to consider that developers will have spent 5-6 years learning the CELL. Ditching it means they will have to learn something new all over again. If Sony maintains the CELL, then all the multi-core programming knowledge developers have built up on the PS3 will not go to waste. This in turn will save them time and money.

However, nothing stops the Larrabee and the CELL from sitting in the same machine. We all know the SPUs are good for graphical calculations, and the Larrabee is supposed to act as both a CPU and a GPU. Maybe the CELL could act as the CPU and Larrabee as the GPU, with the CELL assisting the Larrabee chip in some cases?

Sounds like something very nice . . .

Kushan4009d ago (Edited 4009d ago )

Developers are the most important people for a console's success. In the past, consoles have completely failed due to lack of developer support. I'm sorry, but it takes more than 1st party titles to make a console survive - just look at the Dreamcast, it had a great lineup of first-party titles, but EA practically single-handedly killed it by not supporting it.

Take a leaf from the 360 - sure, it might not be as powerful as the PS3, but it's got oodles of developer support. Treat your developers well and they'll do wonders with your machine. If the PS3 was any other brand, it probably wouldn't have sold half as many consoles as it did - and developers would have been less likely to support it (Basically what happened to the Saturn). Luckily, Sony is a HUGE brand with a great following, so developers more or less had to, but the side-effect of that is that they're more keen to play to the 360's strengths and just "make it work" on the PS3. In the end, no 3rd party titles really show off the power of the PS3 because it's just not worth developers putting the effort in.

Call them lazy, call them whiners, but they can't ALL be wrong, can they? Sony should make the PS4 as easy to develop for as possible.

Please note that I am NOT bashing the PS3 at all, it's a great machine, I'm just saying that sometimes being a bit easier to use isn't such a bad thing.

evrfighter4009d ago (Edited 4009d ago )

larrabee huh...

Intel's done some great work promoting Larrabee, but the best GPU developers in the world already work for ATI and Nvidia. Sony's best bet is to stick with the Cell technology and take advantage of AMD's Stream technology or Nvidia's CUDA technology.

Intel being able to compete with the other two is only hearsay and major Intel PR. Larrabee is being labelled "laughabee" in the PC world because it's unproven. You could say there's a chance of a Cinderella story. Sure, Intel owns a big chunk of the graphics chip market, but the question is: do you game seriously on Intel graphics chips? Why not?

Chris3994009d ago (Edited 4009d ago )

Juice it up a little or add another processor or two, toss in Larrabee for the GPU, and they're golden.

As the above posts touch upon, keeping the Cell makes it easier to program for, keeps costs down, and allows for backwards compatibility. By introducing Larrabee, they get the backing and clout of Intel and a forward-thinking strategy that the engineers at Sony love (one that won't be as universally bemoaned by the developer community for the first 2 years, especially as that same community will be using Larrabee for mainstream computing).

That sounds the most reasonable to me.

Tempist4009d ago

@1.0 - Larrabee is based on x86, aka the architecture that has been mainstream for over 20 years, so it's not exotic.

Now, as for this article full of speculation: no, Sony will not use the Larrabee. They just finished testing whether more cores = faster, and it doesn't work out like that.

There is too much invested in the Cell for it to be shelved after its one mainstream use. What's more likely to happen is that the Cell gets a redesign and is used again, since by then costs will be very minor and familiarity will be better.

It's near nonsense to make long-term, large-cost decisions based on untested, not-yet-built hardware.

marinelife94009d ago (Edited 4009d ago )

Throw in one of those new Cell BEs with 32 cores and up the clock speed from 3.2 GHz to something higher. Use Larrabee as your graphics chip.

Developers will already be familiar with the Cell architecture; in the PS4 they'll have more and faster SPUs to play with.

DeadlyFire4009d ago

Cell 2 will be in the PS4, and so will a Larrabee-like GPU design. The interesting part of the equation is one that no one talks about: Intel's new API, which comes along with Larrabee. Nothing is certain on that just yet. We will likely get a preview in late March at the GDC showing that Intel is preparing.

Kushan4009d ago

Uhh guys, CELL isn't "exotic" because it's different from normal processors (it's based on PPC for a start), it's "exotic" because there are 7 SPUs. It doesn't matter if those SPUs are CELL or x86 or whatever; the complexity stems from the fact that there are 7 of them, and that's hard to take advantage of IN A GAME.
The "in a game" is the key part there. Look at Folding@home, which takes full advantage of the CELL because it's easy to make those calculations very parallel; games are a completely different kettle of fish, though.
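The distinction drawn here can be sketched in a few lines of Python. This is purely illustrative: the SPU count mirrors the PS3's 7 usable SPUs, and `energy_term`/`frame` are made-up stand-ins, not real Folding@home or game code.

```python
from concurrent.futures import ThreadPoolExecutor

SPU_COUNT = 7  # the PS3 exposes 7 usable SPUs to software

def energy_term(i):
    # Stand-in for one independent Folding@home-style work unit.
    return i * i

# Embarrassingly parallel: every term is independent, so all 7
# "SPUs" can crunch simultaneously and the results are just summed.
with ThreadPoolExecutor(max_workers=SPU_COUNT) as pool:
    total = sum(pool.map(energy_term, range(1000)))

# A game frame, by contrast, is a chain of dependent stages
# (input -> simulation -> animation -> render); each stage needs the
# previous one's output, so naive parallelism buys little here.
def frame(state):
    state = state + 1   # simulate (depends on input state)
    state = state * 2   # animate (depends on simulation result)
    return state        # hand off to render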

ThanatosDMC4009d ago

Didn't people know that a PS4 is what's inside Optimus Prime?

RussDeBuss4009d ago

Ditching the Cell would be moronic; devs are already coming to terms with it, and Killzone 2 shows this. By the time the PS4 is out they will be even better at it.

I see the PS4 as likely to have either a newer Cell with more SPUs, or multiple Cells, something like that. An evolution of the current system.

riskibusiness4009d ago (Edited 4009d ago )

There won't be a Cell/Larrabee combo. It will be one of the two doing both CPU functions and GPU functions. Remember, the Cell was initially intended to handle both CPU and GPU functions, and at the last minute the GPU was super-glued on as an afterthought to make up for deficiencies in the Cell's graphics capability, hence the memory bottleneck. So Sony will either use a super-duper Cell, or scrap it and go with the Larrabee.

Personally, I think both MS and Sony ought to stick with the same architectures, just with more muscle. They have had time to evaluate the bottlenecks in each system, and they can work those out and offer rigs 4-5 times faster at a relatively cheap price. Nintendo would be best suited for Larrabee; the Wii is so graphically outdated it could use a major architecture change.

deeznuts4009d ago

Ditch it right after developers get used to it? You didn't do well in school, did you?

Kushan4009d ago

Developers aren't getting used to the CELL, they're getting used to parallel programming.

The Great Melon4009d ago (Edited 4009d ago )

While they may consider other options, I doubt they will go with something else. Just listening to my electrical engineering professors talk about the future of computer architectures, it is obvious that computers will evolve in the direction of having one or two powerful cores surrounded by many small ones, much like the Cell.

dexus4009d ago

The CPU is not the issue when we talk about PS3.

But the GPU was already last-gen when the PS3 was released, totally outdated.
The same CPU and new graphics is what the PS4 needs... (BTW, I think the next PS will be the PS5, because Japanese and other Asian cultures avoid the number 4; the word for the number also means "death".)

TOO PAWNED4009d ago

Then call it "quattro", Italian (from the Latin) for 4.

AdolfBinBush4009d ago (Edited 4009d ago )

Or let them call it "Quad" or 3++.

DeadlyFire4009d ago

No matter the culture. Sony has chosen the number scale. 4 will be next.

PotNoodle4009d ago

If the xbox 360 wasn't around then developers wouldn't be complaining about how hard the PS3 is to develop for.

Kushan4009d ago

Yes they would. They complained when the PS2 was around and what competition did that have before the GC and Xbox came out? The Dreamcast? Pfft!

GUNS N SWORDS4009d ago (Edited 4009d ago )

No, you'd still have the Wii and PC to compare against.

What, do you think those other options never entered the equation, or do you just see it as ONLY PS3 vs 360?

AdolfBinBush4009d ago

Developers complain because they want to do less work for more money.

PotNoodle4009d ago

"or do you see it as ONLY ps3 vs 360? "

In this case, yes. Why? Because multiplatform games are usually 360 and PS3.

Sitdown4009d ago

Seriously? Do you really think complaining would cease to exist? As long as you have lazy people..and as long as you have really brilliant individuals...there will always be complaining. Why? Cause lazy people will not want to do the extra work..and therefore complain. And those brilliant individuals will recognize how much easier things could be...and therefore complain. Seriously....Sony is not the Holy Grail...the world does not start and end with what they offer.

GUNS N SWORDS4009d ago

"In this case, yes. Why? Because multiplatform games are usually most likely 360 and PS3."

Most multiplat games start on PC, so I don't see a reason not to use it as a reference.

PotNoodle4009d ago

^^ Not anymore they don't.

Kushan4009d ago

This is all rumour and speculation. Obviously Intel would love to get Larrabee into a console; it pretty much means they can ramp up production and make the chips MUCH cheaper to produce, which gives them a nice way into the discrete GPU market for PCs.

However, although it's an interesting piece of kit, by the time the next console generation begins, both AMD and nVidia will have similar technology inside all their GPUs. In fact, you can already run general-purpose programs on all of their cards right now, and they're no slouch. Nvidia has shown that using a GPU to handle physics is a great move, and my 9800GTX runs Folding@home faster than my PS3 does. Imagine that power in the next consoles, only about 3x better. You wouldn't need "exotic" things like the CELL; you could have a really fast regular processor and a badass graphics chip and still be very "next gen".
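GPU physics works because the same arithmetic is applied independently to thousands of bodies. A hypothetical particle update, sketched in plain Python for illustration (the constants and `step` function are invented; on CUDA or Stream each body would map to its own GPU thread):

```python
GRAVITY_Y = -9.81   # m/s^2, made-up world constant
DT = 1.0 / 60.0     # one 60 fps frame

def step(body):
    # body = (y_position, y_velocity). Each body is updated
    # independently with no cross-talk between elements, which is
    # why this maps cleanly onto thousands of GPU threads.
    y, vy = body
    vy = vy + GRAVITY_Y * DT
    return (y + vy * DT, vy)

# The data-parallel "kernel launch": same function, every element.
bodies = [(0.0, 0.0)] * 10_000
bodies = [step(b) for b in bodies]
```

The list comprehension here is the CPU stand-in for what CUDA or Stream would dispatch across the GPU's many cores.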

evrfighter4009d ago

Well said. Although I game on ATI cards, I personally think that Nvidia's PhysX will lead the way in future games over AMD/ATI's Havok physics system, and I hope either M$ or Sony takes advantage of it.

Even with the powerful GPUs they've released already, like your 9800GTX, things are only going to get brighter for both companies, as each has developed a technology that lets the CPU and GPU work together: AMD's Stream and Nvidia's CUDA.

Add to that the fact that the HD 4870, which is on par with Nvidia's GTX 260, is about to drop to $150. That bodes well for the price of future consoles, should Sony go the route of a console with the guts of a PC.

tudors4009d ago

I doubt that; I cannot see them promoting and then abandoning their own CPU.

likedamaster4009d ago

It's also too much money in the long run. It would be $500-600 prices all over again.
