660°

Could IBM introduce a new Cell chip for Sony's next-gen console?

Sony and IBM still together?

IBM still supports the Cell Broadband Engine, the Sony-branded chip. IBM's official website continues to feature the chip created for the PlayStation 3, displaying it proudly in the company's history alongside the symbols of the PlayStation pad buttons. IBM also signed agreements with Toshiba (the Cell is present in its REGZA TVs), and the chip has powered supercomputer servers.

Read Full Story >>
translate.google.com
user39158004111d ago

No, it won't happen. Sony already stated they are going with multi-core processors, so no Cell, and to be honest I'm glad there is no Cell in the PS4.

decrypt4111d ago (Edited 4111d ago )

Cell is overrated, IMO; it's only Sony fanboys cheerleading for it. Hell, even Sony themselves have stopped.

It's a fact: had Sony released the PS3 with a normal CPU and a GPU like the 8800 GTX, the PS3 would easily have been capable of running most of its games at 1080p @ 60fps.

While Cell is better than a standard CPU at graphics work, that's because Cell was designed for graphics and a CPU isn't designed for that job. Against a real GPU, however, Cell stands no chance. It would fail even against the RSX, let alone anything that was relevant at the time the PS3 was released, i.e. the 8800 GTX.

Cell tried to do everything at once, CPU tasks and GPU tasks, and ended up bad at both. It can't compete with a dual-core CPU on general-purpose tasks, while it gets devastated by 4-5-year-old GPUs. The proof is in the pudding: Sony ran to Nvidia for help at the last moment. Had there been no RSX in the PS3, there was no way it was going to compete with the Xbox 360, even with two Cells in it.

Coming to the present: it takes billions of dollars to develop chips, and IBM is nowhere near a contender in the graphics business. Even if they did update the Cell, it wouldn't stand a chance against any of the current offerings from Nvidia or AMD. Hence it's in Sony's best interest to go with a standard CPU and a strong GPU from AMD.

AMD and Nvidia have put years of work and billions of dollars into these chips; there is no way IBM can compete with either of them in this department. Sony would be better off putting the extra money toward a better GPU than investing in upgrading the Cell's ancient design.

Edit:

Here come the disagrees, lol. For the disbelievers: go ahead and put together a four-year-old PC with a dual-core CPU and an 8800 GTX, and watch it play most console ports at 1080p, something the PS3 will barely do at 720p, despite any sort of optimization on the Cell.

Hence, for the PS4, it's in Sony's best interest to drop the Cell. Watch and see; that's exactly what's going to happen.

blackbeld4111d ago

Maybe. If so, then it will easily be backward compatible.

Still a rumour; we should wait and see.

yewles14111d ago (Edited 4111d ago )

*sigh* decrypt, WHY are you comparing the PS3 to a graphics card that cost $50 MORE than the whole console did at launch?

zebramocha4111d ago

@yewles do you have a time frame for your next video? I like them, and your intros going up and down my inner tubes.

steve30x4111d ago

@decrypt: If you put an 8800 GTX into a console, the console would die within an hour, because the 8800 GTX got extremely hot; with the stock cooler it would reach up to 95° Celsius.

DeadlyFire4110d ago

The only reason I see is backward compatibility with PS3 games, which is very likely to be in at least the first wave of consoles.

Cell is not necessary anymore. The POWER series is on par with it or better now, really. I personally believe POWER7 is the successor to the Cell tech.

An 8-core POWER7 has 32 threads; Cell 2 was aimed at 32 SPUs. What is the real difference between 32 SMT threads and 32 SPUs? It's possible its performance was equal to or lower than that of the POWER7-series CPUs coming out at the time, which could be the main factor in its getting canned.

Cell: up to 256 GFLOPS.
POWER7: 264 GFLOPS, and more efficient at sustaining that pace.
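The peak figures being compared here are just clock × execution units × FLOPs per cycle. A quick sketch (the Cell numbers below are the commonly cited SPE specs, used for illustration, not official benchmarks):

```python
# Back-of-envelope peak single-precision GFLOPS estimate; figures are the
# commonly cited Cell specs, used here for illustration only.
def peak_gflops(clock_ghz, units, flops_per_cycle):
    """Peak GFLOPS = clock (GHz) x execution units x FLOPs per cycle."""
    return clock_ghz * units * flops_per_cycle

# Cell @ 3.2 GHz: 8 SPEs, each doing a 4-wide fused multiply-add
# (4 multiplies + 4 adds = 8 FLOPs per cycle)
cell_spes = peak_gflops(3.2, 8, 8)   # 204.8 GFLOPS from the SPEs alone
print(cell_spes)
```

By this formula the eight SPEs alone peak at 204.8 GFLOPS at 3.2 GHz; the 256 GFLOPS figure quoted above appears to correspond to an 8-SPE Cell at the 4 GHz clock IBM originally targeted, and other published figures also count the PPE.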

POWER8 could possibly stomp both of them, but it's doubtful POWER8 is coming alongside an AMD APU; technically it could, but there just doesn't seem to be enough room in the system.

sikbeta4110d ago

2 things:

. Cell is dead as a viable chip

. Unless you want to buy a $599 console again and Sony wants to be the laughing stock for another generation... it's just better to cut expensive PS3 BC and go for an affordable console that's easy to sell

Now, when Sony unveils the PS4 and says there is no PS3 BC, people will RAGE, and the XBX8 will have another advantage over the PS4.

Enemy4111d ago (Edited 4111d ago )

Nope, the Cell Saga is over, folks. They're moving on to better, simpler things.

Unless you were expecting:

Sony: And now for our worst kept secret...the PS4!
Crowd: YEAHHHH!! HOO HOO HOO HOO!
Sony: ...to be powered by the most advanced and most complex architecture ever put into a single chip. PlayStation fans, I give you the Cell 2!
Microsoft: (laughs)

nukeitall4111d ago

I hope Cell isn't returning for Sony's own sake. I want to see the PS4 flourish and be a serious competitor in the industry.

The industry needs it.

Ulf4110d ago (Edited 4110d ago )

Enemy, you just stuck a red sign on your head with the words "CPU clueless" on it.

The next-gen platforms will probably have barely the performance of the Cell when it comes to some tasks, like movie-quality animation and good physics. It's a real shame, from a CPU and gamer standpoint, that we won't see a quad-core, 16-32 SPU Cell this next gen.

DragonKnight4110d ago

@nukeitall: BAHAHAHAHAHA! That was a good one. Oh wait, you're serious? Gee, what can one say to that. Hmm, well I guess the PS3 didn't have the best looking console games of the gen. Oh, it did? Well damn. I guess the PS3 didn't sell at a faster rate globally than the 360 (meaning it was more desired throughout the world) and has likely surpassed it now. Oh, they did that too? Gee, I wonder how anyone could come to the conclusion that the Playstation brand isn't flourishing. Oh, an MS fanboy? That explains a lot.

nukeitall4110d ago

@DragonKnight:

The PlayStation brand is the weakest it has ever been, as reflected by each company's market share.

Seeing how I hurt your feelings with a pretty innocent comment, it's clear who the real fanboy is!

hesido4110d ago

It wasn't the Cell that held back the PS3; it was the RSX. The Cell's cycles were used to make up for it (e.g. vertex skinning on the Xbox 360 is usually done straight on Xenos, while on the PS3 it had to be done on the Cell to get better performance).
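For readers unfamiliar with the vertex-skinning work being described, here is a toy linear-blend-skinning sketch (pure Python, illustrative only; the `skin` helper is made up for this example, and real engines run this SIMD-vectorized on the SPUs or the GPU):

```python
# Toy linear blend skinning: each vertex is transformed by every bone that
# influences it, and the results are blended by the vertex's skin weights.
def transform(m, v):
    """Apply a 3x4 affine matrix (rotation | translation) to a 3D point."""
    return [sum(m[r][c] * v[c] for c in range(3)) + m[r][3] for r in range(3)]

def skin(vertex, bones, weights):
    """Blend the bone-transformed copies of a vertex by its skin weights."""
    out = [0.0, 0.0, 0.0]
    for m, w in zip(bones, weights):
        p = transform(m, vertex)
        out = [o + w * c for o, c in zip(out, p)]
    return out

identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]
shifted  = [[1, 0, 0, 2], [0, 1, 0, 0], [0, 0, 1, 0]]  # same rotation, +2 on x
# A vertex weighted half-and-half between the two bones moves +1 on x:
print(skin([1.0, 0.0, 0.0], [identity, shifted], [0.5, 0.5]))
```

Doing this per vertex, per frame, for a whole character is exactly the kind of wide, regular floating-point workload that fit the SPUs well.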

I hope the PS4 has a butt-kicking graphics card you can throw anything at, including aiding the CPU with some tasks, not like the PS3 where it was the other way around. Then it can have whatever CPU they like.

ATi_Elite4111d ago (Edited 4111d ago )

although "SOME" people believe the Cell is the second coming, can cure cancer, and can get them a date with Megan Fox or Meagan Good...

hate to bust your bubbles but the Cell is Dead!

. IBM has taken its Cell servers offline
. Toshiba has no plans for the Cell in its next-gen TVs
. Sony has already said x86-64 for the PS4
. Toshiba sold the Cell plant back to Sony, and Sony will use it to make camera sensors and other parts, but NO Cell manufacturing!

The Cell was a nice chip, but it just doesn't fit nicely into an x86 world, no matter how much power you give its PPE!

True, no PS4 news has been officially given, but Sony's "no" to the Cell for the PS4 did slip out, and you only have to read a Sony, IBM, or Toshiba financial report to know that Sony is not making any Cell processors, and neither is IBM or Toshiba!

trenso14110d ago (Edited 4110d ago )

Yell Dead Cell!

(hope they get the reference)

DarkHeroZX4110d ago (Edited 4110d ago )

Sony has not confirmed anything about the PS4. They were testing how much power it would take to run GT5 at 4K resolution, and it took four PS3s to do so. So at minimum they need four Cells, though one new one blows the current Cell out of the water. They'd need a minimum of 1 GB of VRAM and 1 GB of system RAM to play a PS3 game at 4K, and at least a GTX 660 or an equivalent graphics card. We'll see what Sony does, though. This could help keep costs low for the next console.

Irishguy954111d ago

I hope they stick with the Cell, so the system will be backwards compatible with the PS3.

IG-884111d ago

Well, if they don't go with the Cell next gen, then Gaikai could be used for backwards compatibility.

guitarded774111d ago

Yeah, but I have a collection of about 200 disc games. If they go cloud, they damn well better have a system in place where I can play the games I already paid for, for free or I'll be pi$$ed.

Cueil4111d ago

How about this... keep your PS3. When you're done with it, sell it and all your games on eBay; till then, keep it hooked up. It's not going to poison your PS4 because you play with it more. They are consoles, not women.

Saigon4110d ago

A while back I asked what would happen if Sony decided to use the Cell chip alongside the AMD graphics chip they've hinted at. I received various answers.

What if the APU were a custom chip using Cell technology (not necessarily the Cell chip itself) with a weak GPU for low-end processing, plus an off-the-shelf high-end GPU + APU (as previously mentioned) for high-end processing? Do you think that would work?

Or what if they used the Cell chip as the main processor, and the APU + GPU handled all game processing, in combination with the Cell? Could that work?

Grandmaster-B4111d ago

I hope so; the re-bought Cell fabs could make this possible.

Cell2 FTW

TheBreezyBB4111d ago

Wouldn't it be better if Sony did indeed use a new Cell processor?
Think about it: BC, and an easier workflow for most developers, as they wouldn't need to start all over again with new hardware.

MasterCornholio4111d ago

True, some people are freaking out because they believe the Cell is difficult to develop for, but the truth is that developers have had six years to program for the processor, so naturally they've gotten used to it. I really don't see why Sony can't use a better version of the Cell processor.

Pandamobile4111d ago

Maybe they got tired of seeing their platform get the ass-end of pretty much every multiplatform game?

Use a more standard architecture: happier devs, and happier players, because multiplatform PS4 games wouldn't look inferior to the same game on a next-gen Xbox.

Consoldtobots4110d ago

I don't know what's been hated on more this gen: the Cell or the PS3.

80°

This RTX 4090 just got an eye-catching deal that's definitely worth a second look

If you're in the market for an RTX 4090 graphics card, this one is currently the cheapest one you'll find on Amazon after a hefty discount.

DustMan44m ago

$1,878 (was $1,969.45). LOL. What a steal.

330°

Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan4d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS and AI in general will lower the power draw. It seems the days of just adding more VRAM and horsepower are over; law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it. Reliance on AI is only going to pick up more and more.

Tapani3d ago (Edited 3d ago )

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating don't tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPUs of 10 years ago. Plus, today you can undervolt and overclock GPUs by a good margin to keep stock performance while using 80% of the power limit.

You can make the 4090 extremely power-efficient and keep 90% of its performance at 320W.
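Taking the commenter's own figures (90% performance at 320W versus stock performance at 450W, not measured values), the efficiency claim checks out arithmetically:

```python
# Perf-per-watt arithmetic for the undervolting claim above; the figures are
# the commenter's, used here for illustration only.
stock_perf, stock_watts = 1.00, 450   # stock 4090: full performance at 450W
uv_perf, uv_watts = 0.90, 320         # undervolted: 90% performance at 320W

gain = (uv_perf / uv_watts) / (stock_perf / stock_watts)
print(f"{gain:.2f}x perf/W")  # a ~27% efficiency gain for a 10% perf loss
```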

However, chip manufacturing today is limited by physics, and we will see power increases for at least the next 5-10 years to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech on the market that we have yet to invent, or perhaps we'll solve existing technologies' manufacturing and production-cost problems.

On the other hand, if we were to solve the energy problem on Earth with fusion, solar, etc., it wouldn't matter how much power these chips required. That said, for the next 30-40 years that's a pipe dream.

MrBaskerville3d ago

I don't think fusion is the way forward. It will most likely be too late by the time it's finally ready, meaning it will probably never be ready. Something else might arrive first, though, and then it becomes viable.

Firebird3603d ago

We need to stop the smear campaign against nuclear energy.
We could power everything forever if we wanted to.

Tacoboto4d ago

The PS4 Pro had dedicated hardware for checkerboard rendering that was used significantly in PS4 first-party titles, so you don't even need to look to PC or modern PC gaming. The first RTX cards released nearly six years ago, so how many nails does this coffin need?

InUrFoxHole4d ago

Well... it's a coffin, man. So at least 4?

Tacoboto4d ago

PSSR in the fall can assume that role.

anast3d ago

and those nails need to be replaced annually

Einhander19724d ago

I'm not sure what point you're trying to make, but the PS4 Pro came before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on the PS5 is still a rival to the likes of FSR 2.

Tacoboto3d ago

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack3d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, but in terms of image quality it's way behind what DLSS 3 or FSR 3 currently offers.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" debate (which has been going around the PC community since 2019) right out of the window.
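For context on what checkerboard rendering actually does, here is a toy reconstruction sketch (illustrative only; real implementations such as the PS4 Pro's also reuse the previous frame and motion vectors rather than just averaging neighbours):

```python
# Toy checkerboard reconstruction: only half the pixels are rendered each
# frame in a checker pattern; the skipped pixels are filled by averaging the
# rendered horizontal neighbours.
def checkerboard_fill(frame):
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == 1:  # this pixel was skipped this frame
                left = frame[y][x - 1] if x > 0 else frame[y][x + 1]
                right = frame[y][x + 1] if x < w - 1 else frame[y][x - 1]
                out[y][x] = (left + right) / 2
    return out

# A 2x4 "frame" where the skipped checker pixels were left as 0:
rendered = [[10, 0, 20, 0],
            [0, 30, 0, 40]]
print(checkerboard_fill(rendered))
```

The appeal is obvious: you shade half the pixels per frame, which is why the technique gave such a large uplift on fixed console hardware.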

Einhander19722d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboarding.

But you're talking about FSR 3, which probably is better than checkerboarding; FSR 3 only started appearing in games this year, though, so checkerboarding, the first hardware upscaling solution, was and still is one of the best.

Give credit where credit is due: PlayStation was first and got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for. Heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic3d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower-res than the frame itself...
And of course, then the idea of checkerboard rendering not being native...
Sure, maybe this tech makes the difference minimal when pixel counting, but alas, it seems performance and "close enough", not native, is what matters now...
I want to see it run native without DLSS. Why not?

RonsonPL4d ago

An almost-deaf person:
- lightweight portable $5 speakers of 0.5 cm diameter are the final nail in the coffin of hi-fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone isn't aware of it (see the deaf-guy example above) doesn't mean the flaws aren't there. They are. And all it takes to see them is a display that handles motion well: either true 500fps on a 500Hz LCD (TN) or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as black frame insertion or backlight strobing.

Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you'll publish crap like this article just to go with the flow.

There's no coffin for native-res quality and there never will be. Eventually we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This stuff is only usable for cinematic material, like cutscenes. Not for gaming. Beware of ignorants on the internet. TAA is not "native", and the awful look of modern games when you disable any TAA is not "native" either, since it's ruined by the developers' design choices: you can cheat by rendering every fourth pixel when you plan to put a smeary TAA pass on top later. When you disable it, you see a ruined image, horrible pixelation, and other visual "glitches", but that is NOT what native would have looked like in an honest comparison of the two.

Stay informed.
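The frame-budget arithmetic behind the 500fps point above can be sketched; the 1 ms upscaler cost below is an assumed figure, for illustration only:

```python
# Frame-budget arithmetic: at 500fps the whole frame budget is 2 ms, so a
# fixed-cost upscaling pass eats a large fraction of it. The 1 ms upscaler
# cost is an assumption for illustration, not a measured number.
def frame_budget_ms(fps):
    return 1000.0 / fps

upscale_ms = 1.0  # assumed upscaler cost
for fps in (60, 120, 500):
    budget = frame_budget_ms(fps)
    share = 100 * upscale_ms / budget
    print(f"{fps} fps: {budget:.1f} ms budget, upscaling takes {share:.0f}% of it")
```

At 60fps the same fixed cost is a small slice of a 16.7 ms budget, which is why the trade-off looks very different at low versus very high frame rates.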

RaidenBlack3d ago

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, imagine what we'll get within the next ~3 years. Yes, there will obviously be a gap compared to the native rendering tech of that time, but it will slowly narrow to the point of being indistinguishable.
Something similar happened with generative AI like Sora: AI-generated videos were a turd when they were introduced (the infamous Will Smith eating video), but now look at Sora, generating videos that just look like real life.

Yui_Suzumiya3d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128 MB of VRAM. I currently do all my gaming on it, but certain titles suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer, more demanding titles.

purple1013d ago

Maybe better to get a budget gaming laptop and link a DualSense to it.

= A portable console with far better graphics than a Steam Deck, plus a bigger screen, and you can use it for work, etc.

90°

One of our favorite OLED gaming monitors just got over $200 axed from its MSRP

This LG gaming monitor has a stunning 240Hz OLED display, and now it's a fraction of the price thanks to this deal on Amazon.