
Wii U RAM has 17GB/s bandwidth

The Wii U is out and some details regarding its hardware specs are now surfacing. The 2GB of RAM it possesses appears to offer 17GB/s of bandwidth, which is low in comparison to the 360/PS3.

Read Full Story >>
gimmegimmegames.com
decrypt 1854d ago (Edited 1854d ago)

Damn, 17GB/s is nothing really.

The PS3 and Xbox 360 have about 25GB/s.

Any modern GPU on the PC side costing about 150 USD has around 120-150GB/s; high-end GPUs which cost around 400 USD have bandwidth around 200GB/s.

In comparison the Wii U pales.

@marcus_fenix

Sorry, my bad:

Console gamers... don't care about graphics.

Ghost_of_Tsushima 1854d ago (Edited 1854d ago)

It's about the experience, not just the power. When will people realize this? Have fun, enjoy the gaming experience, and quit worrying about graphics and power non-stop. They're important, but not the main thing.

NewMonday 1854d ago

How does this affect the games exactly?

Is it loading times, or is it something else?

HateFanboys 1854d ago (Edited 1854d ago)

It matters tremendously, because they are talking about bandwidth, which is basically how much stuff the system can pump out onscreen at once. Everything eats up bandwidth: you want more polys onscreen, that takes bandwidth; you want higher color depth? Bandwidth. Anti-aliasing? Bandwidth. Higher levels of anti-aliasing like 2x, 4x, 6x, etc.? Bandwidth. Texture resolution? Bandwidth. Higher output resolution? Bandwidth. And so on. Bandwidth is usually one of the biggest determinants of performance on a graphics card. So basically this means the PS3/360 can literally, and more easily, put more pixels onscreen at a higher framerate, because (I forgot to mention) it also affects framerate: as you approach or pass your bandwidth limits, your framerate starts to drop.

And P.S.: no, it doesn't have anything to do with loading. Basically this means the Wii U is going to have a harder time displaying higher res, higher AA, higher framerates onscreen, etc., than the PS3 or 360.
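To put rough numbers on that, here is a quick back-of-envelope sketch in Python. The per-pixel costs and the overdraw factor are invented for illustration (real GPUs compress and cache heavily), so treat it as a picture of how resolution, AA and framerate multiply into memory traffic, not a measurement:

```python
# Very rough per-frame GPU memory traffic: color writes + depth read/write per
# sample, plus a guessed number of texture bytes fetched per shaded pixel.
# All constants here are illustrative assumptions, not real console figures.

def frame_traffic_gbps(width, height, fps=60, msaa=1, overdraw=2.0,
                       color_bytes=4, depth_bytes=4, tex_bytes_per_pixel=32):
    samples = width * height * msaa * overdraw
    color = samples * color_bytes                    # color-buffer writes
    depth = samples * depth_bytes * 2                # depth read + write
    textures = width * height * overdraw * tex_bytes_per_pixel  # texture fetches (guess)
    return (color + depth + textures) * fps / 1e9

for w, h, aa in [(1280, 720, 1), (1280, 720, 4), (1920, 1080, 1), (1920, 1080, 4)]:
    print(f"{w}x{h} at {aa}xAA, 60fps: ~{frame_traffic_gbps(w, h, msaa=aa):.1f} GB/s")
```

With these made-up but plausible per-pixel costs, 720p with 4xAA lands around 9GB/s, while 1080p with 4xAA already overshoots a 17GB/s budget, which is roughly the trade-off being described above.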

Ghost_of_Tsushima 1854d ago (Edited 1854d ago)

@decrypt

I do care to a certain point, but not above the overall experience.

T900 1854d ago (Edited 1854d ago)

I think gameplay does come first; however, I do believe Nintendo, or any other hardware maker for that matter, should provide fair value in their hardware.

If the specs are true and it's only 2GB of DDR3 at around 1066 speeds, that sort of chip costs only about 10 USD on the open PC market. Surely Nintendo could be less stingy than that. This system is meant to last 5-6 years at least. I would call this terrible planning on Nintendo's part, unless all they want to do is cater to casuals.

As decrypt pointed out, today's 150 USD GPUs come equipped with GDDR5 memory, which provides bandwidths of up to 150GB/s on entry-level hardware. 17GB/s in comparison is pathetic.
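For reference, peak bandwidth follows directly from transfer rate times bus width. A short sketch (the bus widths here are assumptions chosen for illustration, not confirmed Wii U specs) shows how DDR3 in the 1066 range can land at roughly 17GB/s while a GDDR5 card reaches the kind of figures quoted above:

```python
# Peak bandwidth = transfer rate (MT/s) x bus width (bytes).
# Bus widths are assumptions for illustration, not confirmed specs.

def peak_gbps(mega_transfers_per_s, bus_width_bits):
    return mega_transfers_per_s * (bus_width_bits / 8) / 1000

print(f"DDR3-1066, 128-bit bus:  ~{peak_gbps(1066, 128):.1f} GB/s")   # ~17 GB/s
print(f"DDR3-1066, 64-bit bus:   ~{peak_gbps(1066, 64):.1f} GB/s")    # ~8.5 GB/s
print(f"GDDR5-6008, 192-bit bus: ~{peak_gbps(6008, 192):.1f} GB/s")   # ~144 GB/s (660 Ti class)
```

So a 17GB/s figure would be consistent with DDR3 around 1066 on a 128-bit bus, which is the kind of commodity part being priced above.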

Shaman 1854d ago

Little correction: the PS3 has 25.6GB/s + 22.4GB/s, and the 360 has 22.4GB/s + 32GB/s to eDRAM (256GB/s inside the eDRAM). The Wii U will fail if it has low bandwidth (and it seems it does). It has to render a second screen too!
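Taking those figures at face value, a quick comparison (keeping in mind that split pools can't simply be added in practice, so the sums are optimistic):

```python
# Peak bandwidth pools as quoted above (GB/s). Summing split pools is
# optimistic, since they can't be freely combined, but it frames the comparison.
pools = {
    "PS3 (XDR + GDDR3)": [25.6, 22.4],
    "Xbox 360 (GDDR3 + GPU-to-eDRAM)": [22.4, 32.0],
    "Wii U (rumored main RAM)": [17.0],
}
for name, parts in pools.items():
    print(f"{name}: ~{sum(parts):.1f} GB/s combined peak")
```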

dedicatedtogamers 1854d ago

News comes out that the brand-new "next gen" Wii U has weaker graphics hardware than the PREVIOUS generation, and then of course the excuses come flooding out.

"It's not about graphics! Gameplay is what matters"

Okay, so you're saying that you are more than happy to drop $300+ on a brand-new system with (potentially) more limited graphical capabilities instead of buying a...how much are PS3s and 360s now? $200? $250?

Anyway, Nintendo clearly dropped the ball on this system when it comes to the hardware. There are no excuses.

Ghost_of_Tsushima 1854d ago (Edited 1854d ago)

Console developers will squeeze more juice out of the hardware than you think. They work solely on that hardware to push its limits, and, not to mention, consoles use their power far more efficiently than you think. A tiny amount can go a long way if used properly.

T900 1854d ago (Edited 1854d ago)

Well, again, you need to gain some knowledge about how things work with GPUs before you make such comments.

Memory bandwidth is a very important component. Even if you have a great GPU, if you starve it of memory bandwidth it's not going to perform.

As an example, take a GTX 680, which has a memory bandwidth of around 192GB/s. You can manually clock its memory down and you will see performance collapse. Take it down to about 50GB/s and you will see the mighty GTX 680 struggle to play games at 1080p.

The PS3 and Xbox 360 can be used as a benchmark for this: devs have been working on these systems for the last six years and have essentially taken them to their limits. With around 25GB/s of bandwidth on both, we are seeing the limitations. You can see most of their games don't run at 1080p, regardless of coding.

No matter how good the coding gets, you can't get around these limitations. If the Wii U is going to be even more bandwidth-starved than a PS3 or Xbox 360, I don't think there will be much devs will be able to do.

Also, it's a myth that development on consoles is way more efficient. Yes, it is efficient, but there are limitations. Realize that when games run on a PC, it's mostly the GPU that does all the work; other system resources are not strained much, so an OS running in the background doesn't have much impact. If console development were so efficient, we would have seen the PS3 or Xbox 360 outperform a five-year-old PC GPU like the 8800 GTX. However, this doesn't happen. The 8800 GTX came with a memory bandwidth of around 90GB/s, and even today it outperforms the PS3 or 360 in pretty much any game. It's also one of the reasons the 8800 GTX can still play most console ports at 1080p, something the consoles themselves fail at.
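A crude way to picture the underclocking thought experiment is to treat framerate as capped by compute or by memory bandwidth, whichever runs out first. The per-frame traffic and the compute cap below are made-up numbers, purely to illustrate the shape of the collapse:

```python
# Toy model: fps is limited by compute or by memory bandwidth, whichever is lower.
# 1.0 GB of memory traffic per 1080p frame and a 120fps compute cap are
# invented illustrative values, not benchmarks of any real GPU.

def max_fps(bandwidth_gbps, compute_fps_cap, traffic_gb_per_frame):
    bandwidth_fps_cap = bandwidth_gbps / traffic_gb_per_frame
    return min(compute_fps_cap, bandwidth_fps_cap)

for bw in (192, 100, 50, 25, 17):
    print(f"{bw:3d} GB/s -> ~{max_fps(bw, 120, 1.0):.0f} fps")
```

Once the bandwidth cap drops below what the shader core could otherwise sustain, framerate scales roughly with bandwidth, which is the "performance collapse" described above.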

neogeo 1854d ago

I don't think it works that way. My desktop runs only 1066 RAM and has a 660 Ti GPU inside, and 1080p with 8x AA is no problem. I max out almost all games and stay locked at 60FPS.

T900 1854d ago (Edited 1854d ago)

@Neogeo

Read my comments again; I mentioned that the major loads are on the GPU.

Things work a bit differently with PCs.

PCs have system RAM, and they also have dedicated RAM on the GPU. Dedicated GPU RAM is much faster than system RAM.

Consoles, on the other hand, have unified RAM, meaning the GPU has to use whatever RAM bandwidth is available on the system.

Since GPUs on the PC have their own dedicated, ultra-fast RAM, it doesn't really matter much if the system RAM is slow (i.e. your RAM running at 1066), since it's the GPU RAM that gets used. On consoles it's the system RAM being used, since the GPU doesn't get dedicated RAM.

Since a 660 Ti has around 144GB/s of dedicated GDDR5 bandwidth, it can easily handle all that AA.

I hope the concept is clear now.

Edit:

Check this link, which mentions the 660 Ti memory bandwidth figures:

http://www.tomshardware.com...

Since it is this memory that is being used to render the complex graphics, dependence on system RAM is low. In the case of consoles, the GPU must rely on system RAM; if that is slow, then performance will be affected.
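To illustrate the dedicated-vs-unified point with a toy example (all numbers invented for the comparison): a discrete PC GPU draws on its own GDDR5 pool, while on a unified-memory console the CPU and GPU contend for the same bus, so the GPU can only count on part of the headline figure:

```python
# Toy comparison of dedicated vs unified memory pools (invented numbers).
# On a discrete PC GPU, graphics traffic uses the card's own GDDR5;
# on a unified-memory console, the CPU and GPU share one pool's bandwidth.

def gpu_bandwidth_available(total_gbps, cpu_share=0.0):
    """Bandwidth left for the GPU after the CPU takes its share of a shared pool."""
    return total_gbps * (1.0 - cpu_share)

pc_dedicated = gpu_bandwidth_available(144.0)                   # GPU gets the whole pool
console_unified = gpu_bandwidth_available(17.0, cpu_share=0.3)  # CPU takes ~30% (guess)

print(f"PC GPU, dedicated pool: ~{pc_dedicated:.0f} GB/s for graphics")
print(f"Unified console pool:   ~{console_unified:.0f} GB/s left for graphics")
```

The 30% CPU share is just a guess; the point is only that a shared pool's headline number overstates what the GPU alone can rely on.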

Angrymorgan 1853d ago

Going off what I've read, I get the impression Nintendo pumped all of their money into the gamepad rather than the actual console.

However, it will be interesting to see what they do with the gamepad. Will it always be used by devs?
Or will it just lose its appeal after a while?

neogeo 1854d ago

"This is a little disappointing on paper, but I’m not well versed in exactly what these numbers mean."

This says it all. We still need to wait, because this could be 100% wrong.

Schawk 1854d ago (Edited 1854d ago)

Wii U RAM is 55GB/s. I just did a teardown there. I don't have any pics to prove my findings; this is all I can tell you for now. Sorry I couldn't give any more evidence than the article in question, apart from that it's 55GB/s, so don't worry, people.

Thepcz 1854d ago

It's 55TB/s. Easy mistake to make.

ronin4life 1854d ago (Edited 1854d ago)

That is how his comment comes off, man... ^,^ lol.

I mean, not to be a jerk, Schawk, but we can't go off just your word... can you prove it in any way?

joeorc 1854d ago (Edited 1854d ago)

I hope so, because if you're not, do not quit your day job.

"its 55TB/s. easy mistake to make"

I hope that was you just being funny, because if you are trying to be a straight shooter with your info, do you know what you just wrote?

If you did, you have no idea about hardware, because for consumer electronics that would be terabytes per second of throughput.

LOL, I mean, you would need something like this:

"The thing scales to 8EB, eight thousand petabytes, and supports from a few dozen servers up to one million. Fujitsu says it enables high-speed parallel distributed processing of very large amounts of read/write transactions from the compute nodes in an x86 HPC cluster."

"FEFS can be configured with 10,000 storage systems delivering what Fujitsu claims is the world's highest throughput of 1TB/sec. The file system can create several tens of thousands of files per second. Its metadata management is up to three times faster than basic Lustre, according to Fujitsu."

http://www.theregister.co.u...

LOL

I mean, even if you were talking about network connections, here is currently the highest speed right now:

"The beam is then transmitted over open space (just one meter in this case), and untwisted and processed by the receiving end. 2.5 terabits per second is equivalent to 320 gigabytes per second, or around seven full Blu-ray movies per second."

http://www.extremetech.com/...

LOL

Thepcz 1854d ago (Edited 1854d ago)

No, I'm dead serious, I just don't have any pics to prove it.

The 55TB/s allows the Wii U to render the universe and its entire contents in photorealism and in real time.

The movie Avatar by James Cameron was actually rendered in real time on an early-build Wii U dev kit.

Prometheus, the movie, was rendered in real time on finished Wii U hardware, so that gives you an idea of what the Wii U is capable of.

Hope that helps :)
