Some truths about game development that should help you separate fact from fiction in the "next gen might be this!" stories here on N4G:
Game development requires debugging, which entails the use of a debugger -- a program that "watches" another program as it executes and allows a developer (engineer) to observe issues. Debuggers often require very large amounts of memory to do their work, and thus:
(1) Game console development kits almost always have twice the memory of retail consoles. An 8 GB development kit likely means the retail console will have 4 GB of memory.
Pay attention to whether reported information comes from someone casually regurgitating facts about a dev kit itself, or from someone who knows the above and would halve the dev kit's memory before reporting likely retail memory.
Game engines, on modern processors, spend most of their time waiting on the cache. This is because while processors have increased dramatically in the number of instructions they can execute per second, memory has not become all that much faster. Modern bus architectures can move memory in much larger quantities than ever before, thanks to wide buses and double/quad pumping, but they have a lot of trouble moving a *small* amount of data any faster than they did back in the late 90s. Hence performance numbers are reported as large volumes per second, not as small-read cache miss latencies -- because the latter have hardly changed.
The end result of this is that single processor cores largely get speed boosts from smarter cache designs, better instruction scheduling, out-of-order execution, and things like hyperthreading/SMT, which utilize processor resources while another thread waits on uncached data to arrive. Hence the "clock is meaningless" statement that a knowledgeable person will often spout on N4G. It's not truly meaningless -- it is still relevant for tight loops executed over large amounts of well-pipelined data -- but here's a startling fact to digest:
(2) Modern console game engines are typically waiting on cache misses for between 85% and 90% of their cycles. During the PS2/GC/XBox era, game engines spent ("only") between 60% and 80% of their CPU time idling.
(2)(a) The XBox360 and PS3 PPU cores are notorious for exacerbating this issue, thanks to their in-order designs, small caches, and poor branch prediction. PC CPUs are more efficient, but not by as much as you might think.
(2)(b) Due to this fact, a 1.6 GHz core on a modern CPU may actually be somewhat faster than an individual 3.2 GHz core on the PS3 (PPU) or 360.
Consoles have "lightweight" operating systems -- meaning they require much less background processing than their PC counterparts, and the driver abstraction found on PC OSes is almost a non-issue on consoles. nVidia and AMD routinely release new drivers for their GPUs that increase software performance dramatically -- this is because driver CPU usage has a significant impact on PC platforms, whereas on consoles it does not.
(3) PC operating systems consume a lot of memory and a lot of CPU time. This factors into relative console performance, and apparent available memory, in a much larger way than you might imagine.
Memory is useful for more than storage of "more". As an example, some compression algorithms for animations provide a 10-to-1 compression ratio on animation data. There is no CPU or GPU hardware to do this decompression, because there is no compression standard to design hardware around. Games often compress dozens, or even hundreds, of megabytes of raw animation data this way, and subsequently spend an enormous amount of CPU time each frame decompressing the data needed to blend animations and render animated characters. By using extra memory to store uncompressed animations instead, one of the largest per-frame costs in a modern game engine is removed from the CPU's workload -- thus memory can be transformed into CPU speed gains.
(4) Extra memory, even a few hundred MB of it, can result in very large performance improvements in some games -- augmenting the CPU "power" by reducing the amount of work it actually needs to do. Developers who realize this fact may be able to utilize it for their game engine, and subsequently report that a particular console is "just fine" whereas other developers may complain. Likewise, in situations where CPUs are near identical, performance may see a boost on the console with more memory.