darthv722100d ago

Sony's next iteration of the Cell would have a GPU on the same die, making it a system-on-a-chip type of design instead of going with a separate GPU from a 3rd party.

Not to mention that shying away from Nvidia (if they were to) could potentially cause compatibility issues with existing software, unless the idea is not to have BC in the PS4 at all.

Supposedly the new Cell SoC would be more capable of emulating the PS3's GPU than an AMD design would be... supposedly.

JsonHenry2100d ago

I highly doubt Sony will come up with a better GPU or SoC than AMD, who has years of experience making this sort of thing.

Anyone who thinks the Cell could somehow be a better performer than an Nvidia or AMD GPU can look at Intel's latest debacle attempting to do the same thing. Does Larrabee ring a bell, anyone?

Sony is finally doing what they MUST to stay competitive this time around. And a 3rd party GPU was a must.

Arksine2100d ago

@JsonHenry

The PS3 uses a 3rd party GPU from Nvidia. The Cell is great at assisting a GPU (particularly for physics, geometry, and post-processing), but it isn't a GPU itself. If the PS3 has a weakness, it's the RSX. Nvidia still makes fast GPUs, but they haven't been able to match AMD in performance per watt, which is important for a console (lower power = less heat).

With Sony's investment in the Cell, I believe they have to continue to use it for the PS4. It will be interesting to see how AMD would design a GPU to complement it. One wouldn't assume they would be making the same GPU for both MS and Sony.

gypsygib2100d ago

People who want BC so bad should just keep their existing PS3s to be safe.

Seriously, by the time the PS4 comes out you'll get 50 bucks for it and still have to transfer all of your game saves. Besides, I'm sure most people will spend 90%-plus of their time playing beautiful new PS4 games.

hiredhelp2100d ago

@arksine
That's exactly bang on, mate, the way I see it too: keep the Cell processor, or take the upgraded model offered by IBM last year, and simply put in an upgraded AMD GPU.

You're right, they have many years of experience; in fact they started before Nvidia.
Always more processors, too.

ProjectVulcan2100d ago (Edited 2100d ago )

If the console GPU designs are based on the upcoming/current GPU generation then Sony would be better off sticking with Nvidia.

When the PS3 was built, ATI had a clear edge in the GPU architecture stakes. The X1k series Radeons were faster and more advanced than the Geforce 7 that RSX is based on. It came as no surprise ATI delivered a quality custom chip for the 360, while the PS3 got stuck with a crippled Geforce 7. The architecture was a loser before it was ever offered to Sony.

This time out, it seems like Nvidia will have the edge. Whatever comes out in April should have the beating of AMD's Radeon 7000 series according to inside reports. This is quite likely to be the generation any next gen console is built around.

I do find it highly unlikely any sort of CELL chip would be used unless the machine is going to be a Wii/Wii U style improvement over PS3, rather than a big step.

CELL has run its course. IBM have taken ideas and lessons learnt from the architecture and implemented them in other chips as its successors, but essentially the basic design is dead and outmoded.

It will probably end up another PowerPC-based design, however. A custom POWER7 chip, maybe with less cache. 4-8 cores are possible. RISC is still the go-to design for consoles, I think.

I already heard about a POWER7 chip for PS4 on the silicon grapevine anyway, so it wouldn't be a shock.

The GPU at the end of the day is still THE key chip. It is probably more important than ever to get this right. Since the PS3 launched, GPUs have become vastly more flexible; GPGPU is so big now. As long as Sony get the GPU right, it won't really matter too much what CPU they pick as long as it is balanced.

Persistantthug2100d ago

As is, those brands of cards and chips are wholly interchangeable now.

For there to be PS3 backwards compatibility, the real 'X factor' resides in the CPU.

And with that, I'm sure the PS4 will carry B/C.
Why?
Because of the network (PSN/SEN) that Sony has to carry over. Not only are consumers and developers invested in the network....but so is Sony.

Autodidactdystopia2100d ago (Edited 2100d ago )

To people who think 1080p is too much or the end of the need for more resolution.

I just want to inform you that you cannot see what you cannot see.

This is not a troll or a razz, but I'm gonna be pushing this idea for a while so I might as well get started here; where better? :)

Some of the texturing you see in real life, for example the actual surface texturing of paper, plastic, metal and other fine-grit surfaces, cannot be fully realized without extremely high resolutions, something on the order of 8,000,000x6,000,000 pixels.

I'll explain further in other posts over time, it just amuses me that people think that 1080 is it.

I know 8-million-by-6-million-pixel screens won't exist for years to come, but when they do and you take your old PS3 and play a game that was cutting edge for its time, like Uncharted 3, you will see that the surfaces your brain used to improvise for, and which made you feel they could be real, will look either blocky due to the use of bitmaps or blurry due to the use of stuff like superSal 2x texture filtering.

The fine surface details aren't there, but because you are so used to watching things on a TV with a severely crippled pixel resolution in comparison to real life, your brain does that improvisation and accepts the image as finely detailed when it actually isn't.

Just wait until you can look into a game world and, 50 yards away, still see surfaces and make out their texturing, not just colored silhouettes.

Grass on a football field is one example that suits this argument perfectly.

You might argue that when watching a live 1080p football game everything looks real (even the grass). But try to see the grass 5 yards back and it becomes a blurry green field from then on. And yes, you would be right: the approximation of reality at the resolution you are watching it at is spot-on perfect. However, it is just that... an approximation of reality at 1080p. Take a football game recorded in the 70s and watch it on a 1080p screen. You will see that all of the surface detail is missing, albeit with a spot-on approximation of reality at the recording resolution.

Now go stand on a football field in real life and notice that, as you look further away, the individual blades of grass are still visible no matter how far you look (depending on your eyesight). This is due to the extremely high-resolution image your eyes provide you.

Also notice that the blades don't alias in your vision, because the way you see is more of an adaptive resolution, allowing for the clearest image possible.

The same concept applies to video games. You can approximate reality at any resolution, but the lower the resolution, the more detail you lose. Which is why Xbox 1 games looked so great until you got an HD TV and they didn't look so great anymore, and why PS3/360 games look so great. For now.

I can't wait. :P

best regards
-Auto

Autodidactdystopia2100d ago

I know, I know... no one likes to hear complicated stuff. But there will come a day when you will know that little old Auto was right.

I don't know what there was to disagree with.

I only stated facts. Hmmm... I'm puzzled.

Anywhoo, I like Nvidia and ATI... hmmm, I mean AMD.

Good things come to those who wait.

hellvaguy2100d ago (Edited 2100d ago )

"To people who think 1080p is too much or the end of the need for more resolution. I just want to inform you that you cannot see what you cannot see."

High-end PC gamers using 2560x1600 resolution monitors say hi.

Autodidactdystopia2100d ago

High-end PC gamers using 2560x1600 resolution monitors say hi?

I am one. Hi back

Except I'm more of a Max guy now, as in 3ds Max.

http://www.youtube.com/watc...

Blaine2100d ago

@Auto

Probably because you're way off topic, not replying to anyone, and writing a wall of text to explain something we're all aware of... "Which is why Xbox 1 games looked so great until you got an HD TV and they didn't look so great anymore, and why PS3/360 games look so great. For now." Wow, really? O_O

inveni02100d ago (Edited 2100d ago )

The reason you can't see "5 yards back" in a football game is that the camera isn't focused five yards back. You'll never be able to see 5 yards back unless the camera focuses to that point. When watching TV, the lens of your eye is focused on the TV... not through a window. So you're at the mercy of the lens in the camera, and not the lens within your own eye.

Can the human eye detect finer than 1080p resolution? Yes. But as for focusing on whatever you want to focus on, you'll always be at the mercy of the camera. And that's the way it will remain until we have displays that can project light field imagery into 3-dimensional space, allowing you to focus on whichever point you wish.

http://www.lytro.com/

RememberThe3572100d ago

People, don't forget Sony has had issues with Nvidia in the past, not to mention MS's issues with them on the first Xbox. Remember way back when we were raging about the price of the PS3? Well, back then it was rumored that Nvidia was one of the reasons Sony couldn't cut the price any lower at that time. In that same rumor it was said that Sony had made the decision not to work with Nvidia on their next console. It was much more detailed and I can't seem to work the Google on the internet machine, but I remember it relatively clearly.

darthv722099d ago

And reading this... the idea of all 3 using chips provided by the same manufacturer isn't that bad of a decision. If true, it could lend credibility to the next generation finally offering parity between platforms.

If that were to be achieved, we could potentially see a developer create a single game that is coded to play on any one of the platforms available. Similar to how a pc game is able to support a range of specs from one company to another.

Thus the individual platform essentially becomes just a player for the game, as there are players for movies: different brands, but a set of standards each must conform to. The deciding factor to buy would rest not on the game but on the level of support and features each platform would offer.

Exclusives would still have their role to play but seeing as there are more neutral developers than exclusive ones.....you get the idea.

Autodidactdystopia2099d ago (Edited 2099d ago )

nahh man nahh.

Rasterizers don't focus.

Yes, cameras have focus.

No, focus doesn't apply to rasterizer engines.

I'm talking about pixel density and you're talking about photometry.

If there aren't enough pixels in the given area to describe the blade of grass, then no matter how in focus you are, the blade of grass will have what is referred to as sub-pixel detail and will be averaged into the pixels in that space.
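
A minimal sketch of that averaging effect, in plain Python with made-up numbers (not tied to any real engine or game): a single bright "blade" one sample wide is rendered into a high-resolution strip, then box-filtered down to a coarse pixel row, and its contrast against the background mostly vanishes.

# Illustration of sub-pixel detail being averaged away (hypothetical values).
HIGH_RES = 64      # fine samples across a small patch of "grass"
LOW_RES = 8        # coarse output pixels covering the same patch
BLADE_POS = 13     # index of a blade of grass one sample wide
BACKGROUND = 0.2   # dark ground
BLADE = 1.0        # bright blade

# "Render" the patch at high resolution: one bright blade on a dark background.
high = [BACKGROUND] * HIGH_RES
high[BLADE_POS] = BLADE

# Downsample with a simple box filter: each output pixel is the average
# of the high-res samples it covers (8 samples per pixel here).
step = HIGH_RES // LOW_RES
low = [sum(high[i:i + step]) / step for i in range(0, HIGH_RES, step)]

print(max(high) - BACKGROUND)  # contrast at high resolution: 0.8
print(max(low) - BACKGROUND)   # contrast after averaging: 0.1

At the coarse resolution the blade hasn't disappeared from the scene; it has just been diluted into one slightly lighter pixel, which is the sub-pixel averaging described above.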

Also, depth of field can be adjusted to include the whole scene; for example, in movies you would not be able to look at objects in the background if the focus range was not adjusted to fit the scene. Useful for close-ups, but not for scenery.

Also, if you are looking at a game-engine-generated image, you are looking at what the rasterizer sees, not light. No focus: if there is blur, then that blur is being applied in post, giving the illusion of DOF, or it's image smoothing/texture filtering, and it's being done intentionally. There is no inherent blur in rasterizer engines, only solutions to emulate it, i.e. anti-aliasing, depth of field and FOV: projections of 3D space adjusted to emulate a perspective view, which assumes all parallel lines receding from the camera will inevitably terminate at a single point.

So what you said kinda wasn't related to my point, but I'm sure you said it in good conscience.

BubloZX2099d ago

Actually, AMD cards have been very similar to Nvidia's as of late. They have also been good at giving you the same performance, and in some instances better, for a much cheaper price than Nvidia. I love Nvidia the best, but if Sony wants to keep costs down dramatically, a partnership with AMD would be better than Nvidia.

sikbeta2099d ago

I need next gen NOW! So this is good news to me, but with the other article saying Sony is patenting a Kinect-like camera a la MS, my expectations are pretty low :(

DeadlyFire2099d ago

IBM stated that future Cell designs would be implemented in the Power series, in 2010 I believe. So if anything, I believe the PS4 potentially has a Power7+ (2012) or Power8 (2013). Yes, the + is more likely than just a Power7 chip.

inveni02099d ago

You're backpedaling. I didn't bring up "5 yards back". You did. If you want to switch to strictly talking about simulations, then you're still providing an inherently flawed argument. We are 100 years away from having the processing capability to give a blade of grass true-to-life detail. When I say true to life, I don't mean through simulation. I mean by actually rendering worlds constructed with atoms that behave precisely like their real-world doppelgängers. Only when the detail exists within the game world can the detail be presented to the screen. For now, 1080p is good enough to see on screen what we're capable of putting on screen. When we're capable of building Drake's world atom by atom, we'll surely also have the ability to display it.

malol2099d ago

did they say AMD ??

PHHHHHHAAAAAAAAAAAAAAAAIIIIIIIIIIILLLLLLLLLL

Sarcasm2099d ago

On the topic of Nvidia or AMD, right now it goes to AMD. They've managed to produce cards that surpass the best offerings of Nvidia while cutting power consumption by half or more, which is no small feat. (Specifically Radeon 7970 vs GTX 580)

Of course we're still waiting to see how the GTX 680 (or whatever it will be called) will answer.

Even then, AMD has been successful with their APU platform in making the cpu and gpu harmonious. If they can translate that type of integration onto something like the Cell processor with Sony, then they could have something amazing.

Exciting times thinking about the possibilities of it all.

ProjectVulcan2099d ago (Edited 2099d ago )

@ Sarcasm- it is a small feat. 7970 v GTX580 is not a brilliant comparison at all for AMD. 7970 is pretty unimpressive TBH when you look at the underlying stats.

The 7970 is a brand new generation, whereas GTX580's base architecture is nearly 2 years old and the GTX580 itself is about 15 months old and on the 40nm process.

The brand new Radeon is on the latest gen TSMC 28nm process. This will instantly give it good power characteristics versus a previous generation chip. Can't really be compared fairly in the power stakes versus a GTX580.

The Radeon is about 25 percent faster, and it has 4.3bn transistors. The GTX580 has 3bn, so the 7970 has over 40 percent more transistors than the GTX580 for only 25 percent more speed. Not a fantastic return for the size of the chip.
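
For what it's worth, the ratio being described works out like this (a back-of-the-envelope sketch in Python, using only the figures quoted in the comment):

# Back-of-the-envelope comparison using the figures quoted above.
gtx580_transistors = 3.0e9
hd7970_transistors = 4.3e9
speedup = 1.25  # 7970 quoted as ~25% faster

extra_transistors = hd7970_transistors / gtx580_transistors - 1
perf_per_transistor = speedup / (hd7970_transistors / gtx580_transistors)

print(f"{extra_transistors:.0%} more transistors")        # ~43% more transistors
print(f"{perf_per_transistor:.2f}x perf per transistor")  # ~0.87x relative to GTX580

So per transistor the 7970 actually delivers less performance than the GTX580, which is the "not a fantastic return" point.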

For the past few architectures, AMD have built smaller chips than Nvidia with competing performance; this means that up till now, they have probably had the edge in efficiency and cost. Smaller chips cost less to make and use less power. Speed versus size is crucial.

However, that looks set to change come April. The fairly strong word on the street is that Nvidia have themselves a top-notch design (GK104 at 3.4bn transistors, say the leaks) which is quite a bit smaller than the 7970 but gets very close to it. Nvidia also have much bigger designs in the pipeline that should batter the 7970, although they won't be around before the summer.

There is likely to be a big swing to Nvidia; they look to have taken the lead in efficiency, right around the time one might expect console manufacturers to come knocking for console architecture.

Mikhail2100d ago

If true, then all the console manufacturers are going with AMD. Well, AMD gives more performance for your buck.

JsonHenry2100d ago

typically using less power in the process as well.

dirthurts2100d ago

Going with AMD is a fantastic move.
Not only would using AMD make coding easier, it would allow them to save a lot of money.
If they're smart, they'll probably go with an integrated CPU/GPU combo. This would allow for super easy coding, compatibility, and price cuts.

WitWolfy2099d ago

WTF are you talking about?

CarlitoBrigante2099d ago

Integrated CPU/GPU combo? Kid, do you even know what the hell you just said? Even if you did, do you know how weak the integrated GPU is in some CPUs?

mechlord2099d ago

Except that going that route, aka GPGPU, would mean throwing the Cell out, and if you guys have any sense for business you'll know Sony ain't gonna do that. They can't afford to lose money over something they already got right.

If everything on the PS4 and the 720 is the same except for the CPU, I'm guessing the PS4 will triumph over the box on that one, as the PS3 does now.

And I wonder why you think a GPGPU makes for super easy coding...

And Sony & MS don't want compatibility. Do you honestly think they want to reach a point where there is nothing to differentiate the consoles, hardware-wise, except for the color?

dirthurts2099d ago

You guys have never coded a day in your lives, right?
Using an AMD processor and video chipset means no more coding for a complicated processor. Easy ports of PC games aren't for Sony's benefit, they're for the developers. 3rd party games would be able to excel on that hardware and produce better games.
Just because current CPU/GPU combos are designed to be entry level, or for low power consumption, doesn't mean they all have to be. Do keep in mind that the latest Xbox revision now has a GPU/CPU combo, both on one chip. If you check out the latest tech that AMD has been developing, it's looking pretty great.
And by the way, I'm a computer engineer. I kinda know what I'm doing here.

Megaman_nerd2099d ago (Edited 2099d ago )

@dirthurts

What makes coding easier is not the hardware but the tools and APIs. The PS3 had the Cell, which was a completely new architecture, so Sony had to create everything from scratch, and that's what caused all these problems.

The Cell had a PPE (PowerPC-based), so tools like Unreal Engine 3 could run on it without any type of optimization, but very poorly, because it was only 1 core. But then Epic added additional instructions to make the SPEs work with the PPE, and that's when we finally started seeing better performance. But since the engine was originally made for general-purpose CPUs and to mostly render stuff on the GPU, the Xbox and PCs have a clear advantage over the PS3's weak GPU.
------------------------------

And the conclusion is that having an Nvidia or AMD card wouldn't make anything easier one way or the other, because both run on either DirectX or OpenGL APIs. Now, the problem would be the CPU: if they go with a new architecture then they'll need to develop new tools for it, but if they go with an architecture similar to the Cell then they already have the tools for that, and if they go with something more conventional then we have had those for years. Making the GPU/CPU integrated wouldn't change a thing when it comes to development; that's just to reduce heat and cost, but it comes at a price. You'd need to make the transistors small enough to be able to pack something powerful in there, and that's why "dedicated" is a given unless they are thinking about making something weak. The Xbox has them integrated now, but that was after years of transistor shrinks.

And BTW, an engineer is an engineer and a programmer is a programmer. You create the hardware based on theories, but we make the software that makes it work. Kutaragi thought the Cell would make the PS3 the most powerful system ever, in theory, but the results were quite different.

Liquid_Ocelot2099d ago (Edited 2099d ago )

Are you trying to make the next-gen consoles more like a tablet or something similar? CPU/GPU combo? Wtf

Edit:
Ohh that's right, the newer 360s have the CPU/GPU combo. I had forgotten.
Um, no thanks :)

andibandit2099d ago

Why are they even making a PS4? Couldn't they just unlock the last 6 SPEs of the PS3's Cell CPU?

BubloZX2099d ago

Sony will probably use the Power8 version of the Cell, which is comparable to the i7 Sandy Bridge processors. Those things not only have 2x the performance of current Cells, they have more SPEs too. And I can see Sony using 2 or 3 of these.

Ck1x2099d ago

Like I stated before, just because they put it in the box doesn't mean that developers have access to the processing power. The Power7 architecture is IBM's, and the Cell chip is based off of PowerPC technology, so where are people getting Power8 from?

suntzu4202099d ago

@Ck1x:

https://en.wikipedia.org/wi...

This is more than likely what is being referenced when people mention Power8. This is very misleading, since currently Power8 is the successor chip to IBM's Power7 processor, not the next version of the Cell processor. Hope this clears things up.

DarkSniper2100d ago

The only thing that needs to be said about PlayStation®4 is that it will perform beyond any of our imaginations.

Given the unparalleled and unrivaled power of PlayStation®3, Dark Sniper cannot even fathom what PS4 will bring to the forefront. Sony Computer Entertainment has always been the leader in consumer electronics. PlayStation®4 will clearly set the bar once again for a quality package that includes gaming and online social interaction.

SCE's PlayStation®3 Home Entertainment Terminal provides all of the modern features, combined with the ability to evolve with technology and the common gamer simultaneously. While Dark Sniper is ecstatic to get his hands on a PS4, PlayStation®3 still has 5 healthy years in its lifecycle that he will enjoy to the fullest.

In all honesty, PlayStation®3 is a juggernaut machine. Ken Kutaragi should give himself a pat on the back for assembling the finest piece of machinery mankind has ever created. There's no need for Sony to even manufacture the PS4; it can be delivered to your household via firmware update to your PS3.

$niper

zeal0us2100d ago (Edited 2100d ago )

Why are you referring to yourself that way? Just use I.
----
Sony wouldn't be waiting till next year just to start.

darthv722100d ago

Then he wouldn't be Dark Sniper.

thereapersson2100d ago

Sniping fools left and right.

Trolololololooool

dredgewalker2100d ago

I have to admit, that was a funny trolling from Dark Sniper.

JBaby3432100d ago

I've missed Dark Sniper from back in the day. Glad to have you back.

Pandamobile2100d ago

What possesses people to write comments like this?

StayStatic2100d ago

I know what you mean and I think my head is going to explode =/

Kenshin_BATT0USAI2100d ago

It's called trolling. Dark Sniper is a normal dude; this is just his mainpage persona.

SilentNegotiator2100d ago

Whatever possesses people to pretend to be a fanatic of something they hate.

BattleTorn2100d ago (Edited 2100d ago )

In the words of Tony the Tiger....

Your post was "GG®®®eat! !"

hahaha, Battle Torn was amused.

deadfrag2100d ago (Edited 2100d ago )

Ken Kutaragi, the guy that told the world the PS3 would do 120fps in games! What a joke! It does 60fps in two or three games and barely 30fps in the others!

cannon88002100d ago (Edited 2100d ago )

And it can; it all depends on the game and how demanding it is on the hardware. You also need a 120 hertz TV or else you'll only be receiving 60 fps at max. The PS3 can totally run some games at those frame rates, but they would probably have to be cartoony games like Shank and Castle Crashers, where they don't really use all of the graphical and processing power. Nothing crazy like Killzone 3 or Uncharted 3.

kaveti66162100d ago

"You also need a 120 hertz tv or else you'll only be receiving 60 fps at max."

If that's the case then the Super Nintendo can also run games at 120fps.

NiKK_4192100d ago

Wasn't that supposed to be with multiple PS3s? TVs can't even handle 120Hz anyway.

cannon88002099d ago

@kaveti I was clearly talking about games that could be running at 120 frames per second but wouldn't, because the TV didn't support 120 hertz. But I failed on my part, because I should have written the second half of the paragraph first and the first part last, so that you guys could have understood better. But I guess you guys like to cherry-pick everything.

cannon88002099d ago

@NiKK and kaveti, are you sure about that??? 120 hertz means the screen refreshes 120 times per second, meaning that if a game is able to be rendered at 120 individual frames per second, then they would all be shown on the TV.
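
A simplified way to state the point being argued here (a rough Python sketch; it ignores details like what signal the console can actually output over HDMI): the frames you see per second are capped by whichever is lower, the render rate or the display's refresh rate.

# Simplified model: frames actually shown per second are capped by the
# display's refresh rate, regardless of how fast the game renders.
def displayed_fps(render_fps: float, refresh_hz: float) -> float:
    return min(render_fps, refresh_hz)

print(displayed_fps(120, 60))   # 60  -> a 60Hz TV shows at most 60 of the 120 rendered frames
print(displayed_fps(120, 120))  # 120 -> a 120Hz TV can show all of them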

jmac532100d ago

Because Dark Sniper sounds like a Sony employee that just posted a press release on the N4G forums. Who actually uses the registered trademark symbol when typing?

Tru_Ray2100d ago

Dark Sniper is definitely a PS3 Evangelist. How can anyone take his hyperbole seriously?

Tru_

Rage_S902100d ago

DarkSniper is one of the main reasons N4G is now Sony central; before that, believe it or not, it was MS land. He is an N4G legend.

STONEY42100d ago (Edited 2100d ago )

About 2 or 2 and a half years ago and before, everything was pro-MS here. There was even a guy who acted exactly like DarkSniper named Zhuk, except he was pro-MS instead of pro-Sony. As in, he would put Xbox 360 with that trademark logo, and reference himself in third person while sounding like a Microsoft advertisement. Just like DarkSniper. They both existed at the same time, but Zhuk kinda disappeared when the sides started shifting...

Wait... coincidence?

JBaby3432100d ago (Edited 2100d ago )

It was bad circa 2007, but by 2008 it was really starting to change. I miss the comments and fanboy wars from back then. They were very entertaining. Those who weren't around missed all the verbal spats from zhuk, jason360, sak500, firstknight, the mart, dark sniper, black zhuk, meus renaissance, icewake, silogon and others. I wonder if the Sony and MS fanboys will come out in full force next gen like they did at the beginning of this gen.

Sevir2099d ago

It was a serious war on N4G back then, when it was split into 2 sites, PS3today and Xbox360today. All the Xbox fanboys came pouring in during 2006-2008; after MGS4 released, LBP and Resistance 2 dropped that year, and Uncharted 2 was announced in December, they all cleared out, and as you can see now, this became a haven for PS3 loyalists!

PimpDaddy2099d ago

I joined this site in 2007. By the time I joined Sony loyalists had taken over this site. Sure you had a fair amount of 360 loyalists including myself trolling and flaming. But we were outnumbered even back then. It was at least 60/40 in favor of Sony. As I write this comment today it has to be around 90/10 in favor of Sony.

Nintendo lovers are either afraid to speak up or all migrated to VGChartz. PC lovers are considered the worst trolls so they are damn near extinct too.

That is the problem with this website. If your opinion does not agree with the mass pro Sony opinion on here then you get bombarded with disagrees and have bubbles taken. I hardly ever comment anymore. Just come here to read some news and laugh at the pro Sony bias on this website.

sak5002099d ago (Edited 2099d ago )

@jbaby

Did somebody mention my name ;)

Don't forget zeeshan (aka naseem and his 100s of fake accounts), philharrison, Ken, maddens raiders.

Man, if you read back the comments from 2k7 you'd see these guys writing about their top games coming to PS3 in every article. They'd already predicted the death of the 360 when games like KZ2 or GT5 would come out. For GT5 we surely ripped them apart, as the game was delayed 4-5 years after its initial announcement.

Those were the good days of my life.....in the summer of 2007..

@sevir Right on, POG is still on my XBL friends list.

JBaby3432099d ago (Edited 2099d ago )

Sorry for all the people I missed in my list. I was just going off the top of my head. Those were definitely the days.

@ sak500: That's funny about POG. Maddens Raiders is on my PS3 friends list. He's still on this site pretty frequently too, if I'm not mistaken, as is the Mart, who is very moderate now, loving both the PS3 and 360. By the way, you forgot Jack Tretton and Hydrolex.

sak5002098d ago

@jbaby

That's cool, I used to have arguments with maddens quite a lot.

BTW what was your handle back then?