The Wii U is out, and some details about its hardware specs have surfaced. The 2GB of RAM it possesses appears to offer 17GB/s of bandwidth, which is lower than the 360/PS3.
Damn, 17GB/s is nothing really. The PS3 and Xbox 360 have about 25GB/s. Any modern PC GPU costing about $150 has about 120-150GB/s, and high-end GPUs around $400 have bandwidth of around 200GB/s. In comparison the Wii U looks pale. @marcus_fenix Sorry, my bad: console gamers... don't care about graphics.
It's about the experience, not just the power. When will people realize this? Have fun, enjoy the gaming experience, and quit worrying about graphics and power non-stop. It's important, but it's not the main thing.
How does this affect the games exactly? Is it loading times, or is it something else?
It matters tremendously, because we're talking about bandwidth, which basically determines how much stuff the system can pump out onscreen at once. Everything eats up bandwidth: you want more polys onscreen, it takes bandwidth; higher color depth, bandwidth; antialiasing, bandwidth; higher levels of antialiasing like 2x, 4x, 6x, more bandwidth; higher texture resolution, bandwidth; higher output resolution, bandwidth; and so on, and so on. Bandwidth is usually one of the biggest determiners of performance on a graphics card. So basically this means the PS3/360 can literally and more easily put more pixels onscreen at a higher framerate. Oh yeah, I forgot to mention it also affects framerate: as you approach or pass your bandwidth limits, your framerate starts to drop. And P.S., no, it doesn't have anything to do with loading. Bottom line, the Wii U is going to have a harder time than the PS3 or 360 displaying higher res, higher AA, higher framerates, etc.
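To put rough numbers on the framebuffer side of this, here's a back-of-envelope sketch. The resolutions, byte sizes, and framerates are illustrative assumptions, and it ignores texture reads, geometry, and overdraw entirely:

```python
# Rough estimate of bandwidth spent just writing the framebuffer each
# second (illustrative only -- ignores texture reads, geometry, overdraw).
def framebuffer_gbps(width, height, bytes_per_pixel, fps, aa_samples=1):
    bytes_per_frame = width * height * bytes_per_pixel * aa_samples
    return bytes_per_frame * fps / 1e9

print(framebuffer_gbps(1280, 720, 4, 60))                  # 720p, no AA: ~0.22 GB/s
print(framebuffer_gbps(1920, 1080, 4, 60, aa_samples=4))   # 1080p, 4x AA: ~1.99 GB/s
```

Real workloads read and write each pixel many times over, so actual demand is far higher than these raw write figures, but the way the cost scales with resolution and AA is the point.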
@decrypt I do care, up to a point, but not more than I care about the overall experience.
I think gameplay does come first, but I do believe Nintendo, or any other hardware maker for that matter, should provide fair value in their hardware. If the specs are true and it's only 2GB of DDR3 running at around 1066, that sort of chip only costs about $10 on the open PC market. Surely Nintendo could be less stingy than that. This system is meant to last at least 5-6 years. I would call this terrible planning on Nintendo's part, unless all they want to do is cater to casuals. As Decrypt pointed out, $150 GPUs today come equipped with GDDR5 memory that provides bandwidths of up to 150GB/s on entry-level hardware. 17GB/s is pathetic in comparison.
Little correction: the PS3 has 25GB/s + 22.4GB/s, and the 360 has 22.4GB/s + 32GB/s to the eDRAM (256GB/s inside the eDRAM). The Wii U will fail if it has low bandwidth (and it seems it does). It has to render a second screen too!
News comes out that the brand-new "next gen" Wii U has weaker graphics than a PREVIOUS-generation console, and of course the excuses come flooding out: "It's not about graphics! Gameplay is what matters." Okay, so you're saying you're more than happy to drop $300+ on a brand-new system with (potentially) more limited graphical capabilities instead of buying a... how much are PS3s and 360s now? $200? $250? Anyway, Nintendo clearly dropped the ball on this system when it comes to the hardware. There are no excuses.
Console developers will squeeze more juice out of the hardware than you think. They work solely on that hardware to push its limits, and not to mention the power behind consoles is used far more efficiently than you think. A tiny amount can be huge if used properly.
Well, again, you need to gain some knowledge about how GPUs work before you make such comments. Memory bandwidth is a very important component. Even if you have a great GPU, if you starve it of memory bandwidth it's not going to perform. As an example, take a GTX 680, which has a memory bandwidth of around 192GB/s: manually clock its memory speed down and you will see performance collapse. Take it down to about 50GB/s and you will see the mighty GTX 680 struggle to play games in 1080p.

The PS3 and Xbox 360 can serve as a benchmark for this. Devs have been working on these systems for the last six years and have essentially taken them to their limits, and with around 25GB/s of bandwidth on both, we are seeing the limitations: most of their games don't run at 1080p, regardless of coding. No matter how good the coding gets, you can't get around these limits. If the Wii U is going to be even more bandwidth-starved than a PS3 or Xbox 360, I don't think there will be much devs can do.

Also, it's a myth that console development is way more efficient. Yes, it is efficient, but there are limits. Realize that when games run on a PC, it's mostly the GPU doing the work; other system resources aren't strained much, so an OS running in the background doesn't have much impact. If console development were so efficient, we would have seen the PS3 or Xbox 360 outperform five-year-old PC GPUs like the 8800GTX, but that doesn't happen. The 8800GTX came with a memory bandwidth of around 90GB/s, and even today it outperforms the PS3 or Xbox 360 in pretty much any game. It's also one of the reasons the 8800GTX can still play most console ports at 1080p, something the consoles themselves fail at.
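One way to see why the ceiling matters: divide total bandwidth by the target framerate to get a per-frame byte budget. A small sketch, using the bandwidth figures quoted in this thread as rough assumptions:

```python
# Per-frame data budget: bytes the GPU can move per frame at a target
# framerate, given total memory bandwidth (figures are rough assumptions).
def mb_per_frame(bandwidth_gbps, fps):
    return bandwidth_gbps * 1e9 / fps / 1e6  # result in MB per frame

for name, bw in [("Wii U (reported)", 17), ("PS3/360 (approx)", 25), ("GTX 680", 192)]:
    print(f"{name}: {mb_per_frame(bw, 60):.0f} MB per frame at 60fps")
```

Every texture fetch, framebuffer write, and AA resolve has to fit inside that per-frame budget, which is why a lower ceiling shows up directly as lower resolution or framerate.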
I don't think it works that way. My desktop runs only 1066 RAM and has a 660 Ti GPU inside, and 1080p with 8x AA is no problem; I max out almost all games and stay locked at 60FPS.
@Neogeo Read my comments again; I mentioned that the major load is on the GPU. Things work a bit differently on PCs. PCs have system RAM, and they also have dedicated RAM on the GPU, which is much faster than system RAM. Consoles have unified RAM, meaning the GPU has to use the bandwidth available from system RAM. Since a PC GPU has its own dedicated ultra-fast RAM, it doesn't really matter that your system RAM is slow (i.e. running at 1066); it's the GPU's RAM that gets used. On consoles the system RAM is what gets used, since the GPU doesn't get dedicated RAM. Since a 660 Ti has about 150GB/s of dedicated GDDR5 RAM, it can easily handle all that AA. I hope the concept is clear now. Edit: check this link, which lists the 660 Ti's memory bandwidth figures: http://www.tomshardware.com... Since it is this memory that is used to render complex graphics, dependence on system RAM is low; on consoles the GPU must rely on system RAM, and if that is slow, performance will be affected.
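The unified-versus-dedicated distinction can be sketched as a toy model. This is a simplification (real memory contention is more complicated than simple subtraction), and the traffic figures are made-up placeholders:

```python
# Toy model of unified memory: CPU traffic and GPU traffic share one pool,
# so whatever the CPU consumes comes straight out of the GPU's budget.
def unified_gpu_bandwidth(total_gbps, cpu_traffic_gbps):
    return max(total_gbps - cpu_traffic_gbps, 0.0)

# Console-style unified pool: 17 GB/s total, CPU eating a hypothetical 5 GB/s.
print(unified_gpu_bandwidth(17.0, 5.0))  # GPU is left with 12 GB/s

# A discrete PC GPU keeps its dedicated VRAM bandwidth (e.g. ~150 GB/s on a
# 660 Ti class card) no matter how slow the system RAM is.
discrete_vram_gbps = 150.0
print(discrete_vram_gbps)
```

This is why a PC with slow 1066 system RAM can still run 8x AA comfortably: the heavy rendering traffic never touches system RAM at all.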
Going off what I've read, I get the impression Nintendo pumped all of their money into the GamePad rather than the actual console. It will be interesting to see what they do with the GamePad, though: will it always be used by devs, or will it just lose its appeal after a while?
"This is a little disappointing on paper, but I’m not well versed in exactly what these numbers mean." this says is all. We still need to wait because this could be 100% wrong
The Wii U's RAM is 55GB/s; I just did a teardown. I don't have any pics to prove my findings, and this is all I can tell you for now. Sorry I couldn't give any more evidence than the article in question, but it's 55GB/s, so don't worry, people.
It's 55TB/s. Easy mistake to make.
That is how his comment comes off, man... ^,^ lol. I mean, not to be a jerk, Schawk, but we can't go off just your word... can you prove it in any way?
I hope so, because if you're not, do not quit your day job. "its 55TB/s. easy mistake to make" I hope that was you just being funny, because if you're trying to be a straight shooter with your info, do you know what you just wrote? If you did, you have no idea about hardware, because for consumer electronics that would be 55 terabytes per second of throughput. LOL, I mean, you'd need something like this: "The thing scales to 8EB, eight thousand petabytes, and supports from a few dozen servers up to one million. Fujitsu says it enables high-speed parallel distributed processing of very large amounts of read/write transactions from the compute nodes in an x86 HPC cluster. FEFS can be configured with 10,000 storage systems delivering what Fujitsu claims is the world's highest throughput of 1TB/sec. The file-system can create several tens of thousands of files per second. Its metadata management is up to three times faster than basic Lustre, according to Fujitsu." http://www.theregister.co.u... LOL. I mean, even if you were talking about network connections, here's the highest speed right now: "The beam is then transmitted over open space (just one meter in this case), and untwisted and processed by the receiving end. 2.5 terabits per second is equivalent to 320 gigabytes per second, or around seven full Blu-ray movies per second." http://www.extremetech.com/... LOL
No, I'm dead serious, I just don't have any pics to prove it. The 55TB/s allows the Wii U to render the universe and its entire contents in photorealism and real time. The movie Avatar by James Cameron was actually rendered in real time on an early-build Wii U dev kit. Prometheus, the movie, was rendered in real time on finished Wii U hardware. So that gives you an idea of what the Wii U is capable of. Hope that helps :)
Though I'm "not happy at all" with this bit of information, hopefully the extra 32MB of eDRAM is enough for developers to work with. Holy ****, Nintendo (really upset)!
That's disappointing; even mid-range gaming laptops have four times that much. And this isn't a useless stat: it's a hard physical constraint on what resolutions you can use, with how much AA, how many textures, how much AF, and so on.
Gearbox says the Wii U is very, very powerful, and that it has more power than the PS3. Gearbox also said the Wii U version of Aliens: Colonial Marines is going to look and run the best on Wii U because it has more power. So I think there must be something up the Wii U's sleeve. Here are the links to prove I'm not just saying this because I'm a Nintendo fan: http://www.nintendolife.com... http://www.eurogamer.net/ar... These people are working on the Wii U, so I think I'm going to listen to them rather than to what was seen when it was taken apart :)
I don't think anyone is saying that. Being powerful overall doesn't mean every part of the system is; a console can be strong in some areas and have weaknesses in others that cause development problems. If people are saying the Wii U is just plain weak, they are sadly mistaken. What I and a few people in this thread are saying, based on the teardown of Nintendo's own hardware (http://arstechnica.com/gami... http://arstechnica.com/gami...), is this: a developer can boast about their game all they want when trying to sell it, and that doesn't take away from how entertaining the game is, but that's not the point. The point is that, with the design Nintendo has used, the hardware has limitations, just as every hardware platform does. With Sony's PS3 it was the split RAM, which hurt open-world development with the game engines third parties use. With Microsoft's 360 it was the overall placement of its components, which overheated the system and, to top it off, even caused solder joints to break. In the Wii U's case, it will be the lower bandwidth of the connections on the MCM between its chips and the I/O.
Can't wait to get mine.
Schawk, perhaps you could upload a photo to a blog or a YouTube video to show us that the RAM is faster than what this article says. I'm a bit worried that the RAM is slower than in the current consoles. I hope it's faster.
"I'm a bit worried that the ram is slower than current consoles. I hope its faster" its not the speed of the ram, that is the issue, its the bandwidth connection between the components. for instance the MCM of the WiiU has the GPU, the CPU and the Memory's bandwidth all on the same packaged PCB. off chip connection between each component is what the communication speed between each system component is the point that is being made, like for instance if say for instance you have off board Edram of certain memory speed but the connection bandwidth to the i/o to and from that chip is slower than its native speed its going to only be able to communicate with another system component only as fast as the communication bandwidth memory speed of the connection memory will allow. its the connected wire if you will thats the problem, not the memory's base cycle speed.
It's obviously enough for the games they have planned for it, Mario's brain training or whatever! Kids who play games with their parents and high-five their mothers don't care about graphics etc.!!
The speed of RAM matters a whole hell of a lot less than people realize. That won't stop people from bitching, though, unfortunately. Honestly, the gap between DDR2 and DDR3 is almost negligible. Nerds will argue tooth and nail that 1333MHz is so much slower than 1600MHz, but truly, it isn't.
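For what it's worth, the raw peak numbers are easy to work out: peak DDR bandwidth is the transfer rate times the bus width (8 bytes for a single 64-bit channel). A quick sketch; note these are theoretical peaks, and real-world gains are smaller, which is the point being made above:

```python
# Peak DDR bandwidth = transfer rate (MT/s) x bus width in bytes
# (8 bytes for a single 64-bit channel). Theoretical peaks only.
def ddr_peak_gbps(mt_per_s, bus_bytes=8):
    return mt_per_s * 1e6 * bus_bytes / 1e9

print(ddr_peak_gbps(1333))  # ~10.7 GB/s
print(ddr_peak_gbps(1600))  # ~12.8 GB/s, about 20% more on paper
```

The on-paper gap is around 20%, but latency and actual access patterns usually shrink the visible difference to a few percent in games.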