"How to Survive developer Eko told us about their issues with PS4&XB1 CPU, though in the end the game runs at [email protected] on both. They also said that they didn't find any important peformance gap between the two consoles."
"Let’s just hope this won’t limit games in the coming years": I fear so.
I am confident the tablet CPUs running at 1.6 - 2.0 GHz will be more than enough. If the devs can't find a workaround then they are plain lazy :P
^this... The same CPU can do the same operation in nanoseconds when it's optimized, and in milliseconds or even seconds when it's not! Lazy programmers just don't care about optimization and hope that the "compiler" does the job for them... One of the biggest hogs is the locality principle not being taken into account... It generates hundreds of cache misses and the CPU takes ages to get data from RAM that was not in the L caches... It's something like (on an i7): copy from L1: 0.5 ns, copy from L2: 7 ns, copy from RAM: 100 ns (yeah... slow...) (source: https://gist.github.com/jbo... ).

Like @decrypt said, even a tablet's CPU (well... it's RISC after all, but still...) should be enough to handle what that game needs to do on the CPU. Heck, Intel and AMD don't even know what to do anymore because more GHz won't add any performance and more cores won't do it either... it's all about knowing how to use what we have now...

EDIT: Obviously any programmer knows that this is only to be used when useful, i.e. in highly performance-dependent programs (like a complex game)... Otherwise, prefer clean code to horribly optimized code :)
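To make the locality point concrete, here is a minimal C++ sketch (array size and names are made up for illustration, not taken from any game): both loops touch the same grid, but the row-major walk reads memory sequentially and mostly hits L1/L2, while the column-major walk strides across rows and generates far more cache misses, so it typically runs several times slower for the same work.

```cpp
// Minimal locality sketch: same work, different memory access patterns.
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    const int N = 4096;
    std::vector<int> grid(N * N, 1);

    auto time_sum = [&](bool row_major) {
        auto start = std::chrono::steady_clock::now();
        long long sum = 0;
        for (int i = 0; i < N; ++i)
            for (int j = 0; j < N; ++j)
                // Row-major: consecutive addresses. Column-major: stride of N ints.
                sum += row_major ? grid[i * N + j] : grid[j * N + i];
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      std::chrono::steady_clock::now() - start).count();
        std::printf("%s: sum=%lld in %lld ms\n",
                    row_major ? "row-major" : "column-major", sum, (long long)ms);
    };

    time_sum(true);   // cache-friendly, mostly L1/L2 hits
    time_sum(false);  // cache-hostile, many misses to RAM
}
```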
Compute can indeed help a lot, though they'd need to switch many things over for that. The big issue I can see is that NVIDIA has a completely different architecture, so it would be a problem for the PC version of a game if it were heavily optimized for compute.
@M1ST4K3 Learn to read sarcasm. The tablet CPUs are a disaster and anyone thinking they will last the next 8-10 years is clearly deluded. It will be hilarious how limited this gen of consoles will be compared to the last. At least last gen's consoles started off with mid- to high-end hardware. This gen they have low- to mid-range GPUs while their CPUs are a disaster. Gaming will finally progress once the cancerous tumors Sony and MS exit the industry. Here's hoping Sony keeps making those losses and exits soon, while MS gets a Windows sales slowdown forcing an exit on them.
"Gaming will finally progress once the cancerous tumors Sony and MS exit the industry. Heres hoping Sony keeps making those losses and exits soon while MS gets a Windows sales slow down forcing an exist on them." Now THAT was sarcasm... https://www.youtube.com/wat...
So you are a developer? Have you worked 60-90 hours a week for 2-4 years to create the next cutting-edge AAA game? That's what it's like on those projects... It is not about being lazy, it is about making the next deadline, and you need to cut as many corners as possible to get there... Do you want them spending time optimizing code or creating the next great game engine? The people that develop the games we love are probably some of the hardest-working people you will ever meet. Calling developers lazy is just plain stupid and shows how little you appreciate the people that sacrifice 2-4 years of their life to bring us the best possible game they can. Being on an AAA game project is nothing less than a personal life sacrifice for the entire time you are on the project.
M1ST4K3: Don't know why you are getting disagrees, but this is well-known knowledge. A cache miss will cost you dearly as you keep going up the chain. What people don't realize is that going from L1 to L2 is roughly a 10x slowdown and going to RAM is another 100x slowdown, and if you have to go to the hard drive or optical, you are f'ed.

As explained numerous times before by none other than MS, the CPU is the limiting factor, which is why they want to introduce the cloud for AI tasks and upped the CPU speed where possible. Again, this was mentioned by Ubisoft, and again by this developer. A lot of the games are CPU bound, not GPU bound.

"Heck, Intel and AMD don't even know what to do anymore because more GHz won't add any performance and more cores won't do it either... it's all about knowing how to use what we have now..." That is why you have to optimize your code to take advantage of the L1 cache, but there is something that can be done: you can make a larger cache. Each core also has its own cache, so you do as much as possible in parallel. Certain things just aren't possible on consoles due to the vast amount of information needed, e.g. data mining, so again the cloud can offset this.

Again, processors will get faster, and you will get larger caches and more registers, but Intel and AMD are both hitting physical limits on how small they can make a transistor as well. So there are a lot of challenges. Glad I'm a software dev as opposed to a chip designer!
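As a rough illustration of "do as much as possible in parallel" while letting each core work out of its own cache, here is a small sketch (names and sizes invented, not from any engine): the array is split into contiguous chunks, one per hardware thread, so each worker streams through its own slice instead of all threads fighting over the same data.

```cpp
// Parallel chunked reduction: one contiguous slice per hardware thread.
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<float> samples(1 << 24, 0.5f);
    unsigned threads = std::max(1u, std::thread::hardware_concurrency());

    std::vector<double> partial(threads, 0.0);
    std::vector<std::thread> pool;
    size_t chunk = samples.size() / threads;

    for (unsigned t = 0; t < threads; ++t) {
        size_t begin = t * chunk;
        size_t end = (t + 1 == threads) ? samples.size() : begin + chunk;
        pool.emplace_back([&, t, begin, end] {
            // Each worker reads only its own contiguous slice.
            partial[t] = std::accumulate(samples.begin() + begin,
                                         samples.begin() + end, 0.0);
        });
    }
    for (auto& th : pool) th.join();

    double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::printf("total = %f using %u threads\n", total, threads);
}
```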
Breaking things down to cache lines and keeping data (and code) in the L1 cache requires quite some skill, which usually only a handful of people, typically on the engine team, have. It also requires a programmer to know what multithreading and thread affinity actually mean. If you don't, don't bother talking about cache control or configuring cache lines. Gameplay programmers usually don't know that level of detail. It's quite a complex topic and requires a lot of work; it's not trivial, it's expert knowledge. Also, what might work in an isolated use case might not work once all subsystems actually run together.
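For a concrete taste of the cache-line concern mentioned here, below is a tiny sketch (struct and variable names are invented): two threads bump counters that would otherwise sit on the same 64-byte cache line, making the line ping-pong between cores ("false sharing"). Aligning each counter to its own line avoids that; dropping the alignas typically makes the loop noticeably slower.

```cpp
// False-sharing sketch: give each hot counter its own 64-byte cache line.
#include <atomic>
#include <cstdio>
#include <thread>

struct PaddedCounter {
    alignas(64) std::atomic<long long> value{0};  // one cache line per counter
};

int main() {
    PaddedCounter counters[2];

    auto bump = [&](int idx) {
        for (int i = 0; i < 10000000; ++i)
            counters[idx].value.fetch_add(1, std::memory_order_relaxed);
    };

    std::thread a(bump, 0), b(bump, 1);
    a.join();
    b.join();

    std::printf("%lld %lld\n",
                counters[0].value.load(), counters[1].value.load());
}
```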
It certainly isn't good news that they are already complaining about the CPU. Those consoles aren't even one year old and yet they may already be dated... I wish they had made proper $499 hardware without the Kinect bullshit.
It's the clock speed, it's too low! The 360 and PS3 had 6-core processors but ran at 3.2 GHz. If the CPUs in the X1 and PS4 were at that clock speed with 8 cores, they would be fine. Hell, I'm pretty sure the PS3 had a stronger CPU than the X1 and PS4!
Both the PS3 and Xbox had desktop-level CPUs. Tablet CPUs are no replacement. Moving tasks over to GPUs will just take away GPU power, and there already isn't enough GPU power to run games in 1080p. This gen of consoles is clearly designed to cut costs: milk the market based on brand names, that's it.
the PS3 absolutely did have a more powerful CPU than both the XB1 and the PS4
The article says the problem was that they didn't make use of the multiple cores very well. The CPU was fine, they just weren't using it to its full potential. This is something developers will need to learn to do this gen, as both have 8-core processors.
To be honest, devs complain throughout the lifecycle of any console. But they still manage to get the job done. ---- Well, the CPUs are final, so they'll have to make do.
Of course, it's their job. The point is that the better hardware you give them, the better games they can produce. Mainly because of AI/physics etc., which require powerful hardware if you want to make them sophisticated.
No. They will always find something to complain about. Designers especially tend to overload the HW with features. Give them more, and they'll overload it some more.
Oh they do, enjoy 30 fps with dips and make do with it lol....
Not all games will run @ 30 frames with dips ^_^
PS4 could release PS4 at $499 (same as the Xbone launch price) and make the PS4's CPU way more powerful than the shitty AMD Jaguar laptop CPUs!
"PS4 could release PS4" No it cant. Unless if you have two PS4's on the same bed....
They could have, and some of us would have been very happy to pay more for better hardware. But the low price point has been critical to the success of the console. I doubt Sony/MS/Nintendo will ever release a "powerful" console again, unless they can do so in parallel with a cheap version that runs the same software.
yeah of course not :D http://i.imgur.com/OLLp7RU....
How to Survive. lol Is it this? http://www.youtube.com/watc... Not really surprising, it looks crap.
It's a fun little game, especially co-op, but I doubt it's anything to make hardware break a sweat.
If you look at the link, the Xbox One CPU is actually more powerful than the 360, PS3, and PS4 CPUs. http://1-ps.googleuserconte... Now, in the interest of fairness, a test was also done on the GPUs, and the PS4 won by a good margin. The question has always been: how many CUs are devs using for the PS4's graphics? http://www.worldsfactory.ne... Sony had one of their internal documents leaked, and it suggested 14 CUs, not 18, was better for balance, and a PlayStation dev even revealed they are using a 14+4 setup. If this remains true to date, the situation is that the PS4 GPU is not weaker, but the performance gap between the two consoles is smaller than people think.
You just don't give up! Another 14+4 CUs bull****. All 18 CUs are fully independent for compute or graphics rendering. http://www.neogaf.com/forum... Smh... Sorry, dude!
If you look at the link below, you'll notice Ubisoft's PS3 code not using all of its resources (page 14)... http://twvideo01.ubm-us.net... And Sony's current internal documents don't suggest a 14+4 setup, but rather a 15+3 setup (pages 53 through 56)... http://develop.scee.net/fil... The diagram shows 960 clusters (streaming processors, equivalent to 15 CUs) used for graphics, and the remaining 192 (equivalent to 3 CUs) for compute tasks.
This doesn't mean 15+3; all it means is that if your profiler detects idle CUs, you can interleave compute jobs in that frame. In that case, the render pipeline leaves enough holes to actually schedule 3 CUs' worth of compute jobs during one render frame (16 ms). This is one example of how this could work. What's a bit surprising in Ubi's demo is that the PS4's GPU compute is almost 100% faster than the XBO's (1600 vs 830), not just 50%. There is quite a bit more to it than just the number of CUs; those alone would suggest an increase of only ~50% (~1200 vs 830).
The CPUs are comparable in the X1 and PS4. There is a decent-sized gap on the GPU side. But as Mark Cerny stated, soon GPU compute in the PS4 will be used for physics simulation alongside graphics. It's also cost/power effective for a console.
And Azure will handle physics and AI for MSFT. You can't watch that Crackdown video w/o being impressed by a 10x increase in destructible objects and particles while maintaining 30 fps, while the one with 1/10th the content still falls to 3 fps.
I've seen only a few posts in this thread that actually make any sense. The majority of comments here are based on voodoo and mumbo jumbo... Example: the PS4/Xbox One architecture is not based on tablet CPUs; they're based on laptop CPUs - "Jaguar/Kabini".