Sony proudly showed off its PlayStation 4 hardware for the first time at E3, and now we're getting a peek at what developers are working with this generation thanks to the FCC.
Is that clock speed taking into account all the extra stuff a dev kit needs to do, or does it indicate the rumoured increase to 2GHz is possible? Shoot me down on this please; I'm totally at sea when it comes to tech. Edit, from GAF: 'Maximum theoretical radio emission frequency of all embedded radio devices (WLAN, Bluetooth, etc.). It's a document for the FCC after all. Has nothing to do with silicon clock speeds.'
It still means nothing to me but I'm sure some of my tech savvy friends on here can clue me in. I will accept baby talk as a means of communicating the information, so feel free to patronise me.
There could be a lot of factors here:

- It is likely due to the processing overhead the development tools add, such as debugging.
- The dev kit looks much larger than the final PS4 design, so it may be able to run at the higher clock and cope with the greater amount of heat that generates. Sony may be playing it safe, as they will be launching in territories where it is seasonally very hot - and that PS4 casing does look small.
- It could be that AMD could not guarantee the yields required at that clock speed, so Sony went for the higher-yield, lower-speed option.
- It might just be that Sony decided on a particular clock speed and that's what we're getting.

The fact the dev kits have a higher spec is not unusual (look at the spec of the PCs running many of the XBOne demos at E3 - they were significantly higher, and some even ran on Nvidia GPUs instead of AMD). It is highly unlikely we'll see an increase in CPU speed before launch, as Sony will either have started production or be close to ramping up assembly. However, if the clock speed is limited in the BIOS they could in theory unlock it later with a firmware update - but I'd be 99.9% certain that won't happen, for a whole bunch of reasons, not least security.
Jdoki nailed it. It's the same thing as Microsoft using a PC for a demo at E3. At this point devs could be dealing with unfinished system software, and games and demos most likely run inside or alongside dev tools like debuggers and crash recovery. This really means nothing new for the final version of the PS4 that the end user receives. Also, any software is still going to be designed for the final specs, not specifically for the higher clock of the dev kit.
Could be AMD Turbo Core tech in the development kits, or AMD could have based the PS4 APU's CPU core design on upcoming Beema features, which could give it a higher clock rate than 1.6 GHz - though I suspect 2 GHz is likely a wall it will hit. Not really sure. Sony has never officially stated the clock speed of the CPU/APU in the system, have they? Rumour is AMD Beema is to be talked about around August. Gamescom is also in August. Sony could have another surprise. It also fits the RAM, though.
Yes, very true, and that only proves... wait, I have no idea what the hell you guys are talking about *goes to cry in the corner*
Thanks guys/gals. I feel enlightened but stupid. :)
Well, that's a pretty impressive kit. It needs all the extra horsepower to handle not only running the games, but also all the debugging and monitoring tools that devs use for optimisation and testing.
I love your humility and humour. Attractive and lovable qualities. x
Here is the comment gribblegrunger posted on the site. He's officially a bubbled-up fanboy Xbox troll liar LMAO. GribbleGrunger 11 hours ago @iRanch @Kingsford "The PS4 architecture to be more straightforward with emphasis on raw power. The XB1 is more advanced because of eSRAM, a more reliable ram due to much lower latency compared to the GDDR5 ram on the PS4. Sony was almost going the same route with eDRAM, but Cerny was told by developers to change the plan because Sony would need strong in-house tools ready to support such a complex system. Easy to develop is eventually the PS4's new goal. MS is a software company first and foremost, it knows how to optimize hi-perf games even on a ***tiny static ram*** hardware like the X360. Thanks to the proven advanced tools/compilers that are known among game devs for years, devs don't need to deal with "tricks" anymore. Strong hardware spec is nothing without strong-yet-easy dev tools. All in all the XBOX ONe is day ONe purchase for me. Oh btw i am a system engineer at SCEE but XB1 is a winning formula no question. "
@stevehyphen No it's nothing like what Microsoft did -- using $3000 gaming rigs at the E3 demo that are over 10x the power of the Xbox One.
@cee773: "Here is the comment gribblegrunger posted on the site he's officially A bubbled up fanboy Xbox troll liar LMAO" What the hell are you referring to, son? Is that quote in your post supposed to be from me or something? I'm confused. You're either lying or someone thinks I'm so damned cool that they're copying my sig. I'd like to think it was the latter but it's probably the former.
@cee773: A friend has just sent me this, and it seems you are telling the truth - but trust me, that is NOT me LOL. Not only have they used my name, but the avatar they used is my own doctored avatar. http://gamentrain.com/the-p... I'm guessing it's someone from this site.
I don't care if he is an xbot or a clever troll, I still love him. It's been ages since I've had a half-decent trade anyway.
From a post on Beyond3D, http://beyond3d.com/showpos... "The 2.75GHz "maximum clock frequency in the system" is clearly the WCLK of the GDDR5 running at the announced 5.5 Gbps. GDDR5 needs two clock signals to work, one at 1.375GHz and the other one at the mentioned 2.75GHz (for 5.5Gbps operation), which is likely the highest clock in the system, not the CPU clock."
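For anyone who wants to sanity-check that quote, here's a quick back-of-the-envelope sketch in Python. It assumes the standard GDDR5 clocking scheme (data transferred on both edges of the write clock, command clock at half the write clock rate) and the PS4's announced 256-bit memory bus; the variable names are just mine for illustration:

```python
# Sanity check of the Beyond3D explanation above.
# GDDR5 moves data on both edges of the write clock (WCLK),
# and its command clock (CK) runs at half the WCLK rate.

data_rate_gbps = 5.5             # per-pin data rate Sony announced (5.5 Gbps)
wclk_ghz = data_rate_gbps / 2    # double data rate -> WCLK = 2.75 GHz
ck_ghz = wclk_ghz / 2            # command clock = 1.375 GHz

bus_width_bits = 256             # PS4's announced memory bus width
bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8

print(wclk_ghz, ck_ghz, bandwidth_gb_s)  # 2.75 1.375 176.0
```

The 2.75 GHz matches the "maximum clock frequency" in the FCC filing, the 1.375 GHz matches the second clock mentioned in the quote, and 176 GB/s is the memory bandwidth figure Sony has quoted for the PS4, so the numbers all hang together.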
Interesting. Sounds like a sensible explanation. Thanks for the info.
en.m.wikipedia.org/wiki/GDDR5 - the wiki overview has a decent description of the two-clock setup of GDDR5 as well.
Does that mean 'Rocket Science' is involved? /jk I'm kind of clueless as to what that means, but theoretically, could it mean that the PS4 is built for extensive multitasking, like you would see in a desktop computer with multiple programs running?
It just means they are reporting the maximum frequency found in the system, which matches the clock frequency of GDDR5. It doesn't provide information on multi-tasking.
It looks like an overclock, which means the clock speed will most likely be lowered for the retail version.
What about CPU + GPU = APU? CPU = 1.9GHz, GPU = 800MHz, so APU = 2.7GHz???????
That's what I thought. 2.7 GHz is easily the sum of the two processors inside the PS4's APU: CPU + GPU. The CPU runs at 1.94 GHz and the GPU at 800 MHz, so we just add 1.94 + 0.8 = 2.74 GHz.
You hit it on the money..
Adding those two numbers together makes no sense. The 2.7GHz frequency is probably the maximum clock speed using AMD's Turbo Core functionality, in which case half of the cores basically need to be disabled to run at that speed.
No, doesn't work like that.
Sexy black and white pictures... wonder where I insert my USB dongle.
I really would like the system to have a couple more USB ports. But I guess it's a bit better since the camera will no longer require its own port.
My reaction to this post due to my vast (null) knowledge of this kind of tech.. http://assets0.ordienetwork...
that is my reaction- >Whenever new games are announced >Getting New Graphic Card. >Getting laid. >A game gets downloaded @512kbps. and you get a bubble
jesus that's hilarious!
That is definitely NOT William Dafriend! Sorry had to.
Made me smile, and yeah, I'm feeling that inside for weeks now. :)