Hardware each console should/could have used
Now that the fat lady has sung her final tune, we can lay to rest the rumors about what each console manufacturer will use to power their next-gen systems. While a bit of ambiguity remains around some of them (the Wii U, and to a lesser extent the Xbox One), we pretty much know exactly what these new machines will be capable of, so let's get into some hard facts.
As far as multiplatform games go, here is roughly where each console ranks, along with the PC GPU range its peak performance should fall into:
PS4 = mid-low gaming PC (expected performance: 7850 (low) - 7870 GHz (high))
Xbox One = low-high gaming PC (expected performance: 7770 GHz (low) - 7850 (high))
Wii U = bottom-end gaming PC (expected performance: PS360 (low) - AMD Llano APU 6550D (high))
Exclusive developers will be willing to push a console further than multiplatform developers will, but that's about it. Now to the question of the day: is this good enough for a 5+ year cycle?
By the time the PS4 and Xbox One launch, the Wii U will be a year old, and AMD and NVIDIA will be on their 8000 and 700 series GPUs, with the 9000 and 800 series likely scheduled for early-to-mid 2014.
The PS4 will be the most powerful console on paper, with a 50% lead over the Xbox One, but there are already multiple GPUs ahead of the PS4 in AMD's current line-up alone, excluding prior generations (7970M, 7870 GHz, 7950, 7970 GHz, and 7990). Add in the 8000 series and there are more. Add in NVIDIA and there are even more. Compared to the Xbox One the picture is worse, and we're not even going to bother listing the GPUs above the Wii U, since that's the entire line-up.
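That "50% lead" falls straight out of the raw shader-throughput figures in the spec lists further down (1.84 vs 1.23 TFLOPS); a quick sanity check:

```python
# Sanity check of the "50% lead" claim, using the raw shader
# throughput figures from the spec lists (1.84 vs 1.23 TFLOPS).
ps4_tflops = 1.84  # PS4 GPU (HD 7850-class)
x1_tflops = 1.23   # Xbox One GPU (HD 7770 GHz-class)

lead_pct = (ps4_tflops / x1_tflops - 1) * 100
print(f"PS4 raw-throughput lead over Xbox One: {lead_pct:.0f}%")  # ~50%
```

Raw FLOPS isn't the whole story (memory bandwidth and eDRAM matter too), but it's the number everyone quotes.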
So did the console manufacturers do enough to power their systems? In my opinion, yes and no. Obviously they could have built significantly more powerful hardware with what's available to them. However, the price would have risen significantly with each step up in GPU, and at the end of the day price carries more weight in the price:performance ratio than raw power does.
The next step up for the PS4 is the 7870 GHz, which is a $200+ GPU. Next is the 7950 at $300+, then the 7970 GHz at $400+, and finally the 7990, which is a $1,000 GPU. Obviously the 7970 GHz and 7990 are completely out of the question for any console hoping to succeed, and even the 7950's price point would push a console into the $600+ range; we've seen that even Sony, coming off its greatest success (the PS2), couldn't dominate market share at that price (among a few other factors). Even subsidized, those consoles would still be expensive unless the contract ran longer than 2 years or cost more than $15 a month. Sony's GPU sits comfortably above AMD's 7850, which is frequently the go-to card for mid-range gaming PCs on the AMD side. Contrary to what most PC gamers would have you believe, these cards last us a good 3 - 4 years before we're ready to upgrade, i.e. when our systems can't handle much more.
So at best we could only have hoped for the next step up from the PS4: a GPU comparable to AMD's 7870 GHz, which offers roughly a 40% performance increase over the PS4's. We can expect the PS4 to be priced at $399 - $449 at launch, but would it be worth paying an extra $100 for a 40% boost in performance? To PC gamers, yes; to console gamers, not so much, which is why the PS4 is the best we could realistically expect performance-wise from these consoles. Sony could have done more, and had they gone with a GPU on par with the 7870 GHz, they would have been twice as powerful as their nearest rival.
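The "40%" and "2x" figures check out against the HD 7870 GHz Edition's commonly quoted 2.56 TFLOPS rating. That rating is my number, not one from this article's spec lists, so treat this as a back-of-the-envelope sketch:

```python
# Back-of-the-envelope check of a hypothetical 7870 GHz-class console.
# The 2.56 TFLOPS rating for the HD 7870 GHz Edition is an assumption
# (the commonly quoted spec for that card), not a figure from this post.
hd7870ghz_tflops = 2.56
ps4_tflops = 1.84
x1_tflops = 1.23

print(f"Boost over the PS4: {(hd7870ghz_tflops / ps4_tflops - 1) * 100:.0f}%")  # ~39%
print(f"Multiple of the Xbox One: {hd7870ghz_tflops / x1_tflops:.1f}x")         # ~2.1x
```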
Moving on to the Xbox One. Microsoft chose a weaker GPU than Sony, which isn't surprising considering how well they did with the 360 vs. the PS3. Microsoft realized they don't need to be the best to get the audience they need (which is the only win they're looking for). As long as their console is good enough, has a great online experience, and plays Halo, Call of Duty, and Gears, they'll be successful, and who can blame them? The Xbox One's GPU is on par with AMD's 7770 GHz, currently a low-high GPU, meaning it can play everything, but it's definitely not the best.

Looking at the PS4's specs, it's easy to declare it the winner and be done with it, so I'm going to dig a bit deeper into the Xbox One. While the console reveal was an awful experience for gamers, the actual innards of the Xbox One are solid. Yes, the PS4 is more powerful, but the Xbox One will still get, and be able to play, every multiplatform game that comes to PC and PS4. The difference is that sometimes the PS4 version will run at full 1080p while the Xbox One version runs at 720p, making the PS4 version technically better, but not a deal breaker for the Xbox One.

As I said earlier, Microsoft isn't concerned with being the best; they're concerned with getting the most market share and making money, which is why they chose a weaker console that lets them launch at $349 - $449 (possibly $100 cheaper than Sony). Microsoft saw last generation that as long as they were $100 less than Sony, they sold similar numbers, but as soon as that gap closed, Sony pulled away. Microsoft could have matched Sony, and possibly been the one to use a near-7870 GHz GPU, but that ran the risk of losing outright (which they may still manage to do unless E3 is killer for them).
Finally, the Wii U. I think the Wii U's hardware is okay, but that "okay" comes with two big "WTF were they thinking" decisions. My biggest gripes with the Wii U are the CPU and RAM. The CPU is utter garbage; Nintendo needs to abandon that old 1999 tech (which reminds me of the Xbox One conference) and get with the times. I don't care how many revisions you do, nearly 15-year-old tech is nearly 15-year-old tech; you can't turn that into a current-gen competitor. Tech's life cycle runs about 2 - 3 years before it's outperformed, so the Wii U's CPU is roughly five hardware generations behind. To put that into perspective, picture a 150-year-old racing their grandkids. The grandkids always win, even the newborn, because grandma and grandpa are dead in their grave; that's the Wii U CPU (maybe a bit extreme, but you get the point).

2GB of RAM is a modest improvement over the PS360, but when that RAM is 50% slower than the PS3's, it doesn't do the console much service beyond holding more resources at a slower pace. The saving grace of the Wii U is the GPU; based on the die size, it's safe to say it's akin to a 4650M or 4670M, which would put it at 320 or 480 GFLOPS (roughly 35% more powerful than the Xbox 360, or 2x, respectively). Rumors point towards the 4650, with most breakdowns pegging the console at 352 GFLOPS, which would be a near-50% boost over the Xbox 360 (no confirmation from Nintendo).
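For reference, here's where those GPU guesses land against the Xbox 360, assuming the commonly cited ~240 GFLOPS for the 360's Xenos GPU (that baseline is my assumption, not a figure from this article):

```python
# Wii U GPU guesses relative to the Xbox 360. The 240 GFLOPS Xenos
# baseline is an assumption (commonly cited, not stated in the article).
xenos_gflops = 240.0
candidates = [
    ("4650-class (320 GFLOPS)", 320.0),
    ("4670-class (480 GFLOPS)", 480.0),
    ("die-size estimate (352 GFLOPS)", 352.0),
]
for name, gflops in candidates:
    print(f"{name}: {gflops / xenos_gflops:.2f}x the Xbox 360")
# 1.33x, 2.00x, and 1.47x respectively -- roughly the "35%", "2x",
# and "near 50%" figures cited above.
```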
As you can see, I'm going to be a bit harder on Nintendo, and for good reason. I really don't know what Nintendo were thinking. Maybe they truly believed the rumors that the PS4 and Xbox One (Nextbox at the time) were really going to use AMD Trinity APUs. Even if they believed that, Nintendo should have at least gone with AMD's Llano A6 APU (the 3500 (tri-core) or 3650 (quad-core)) and launched in 2011 with a system equal to the Wii U in performance, but with a significantly more powerful CPU and with RAM speeds and bandwidth at least on par with the PS360. That way they could say: yes, we're more powerful than the PS360 hands down, and the PS4 and Xbox One won't be significantly ahead of our console this time. Instead they waited a year to release a console weaker than what they could have released back in 2011 at a similar price. If anything, after waiting the extra year, Nintendo could have gone with an AMD Trinity APU (614 GFLOPS) or the rumored 6670 (768 GFLOPS), a significantly better CPU, and 4GB of RAM at least on par with PS360 speeds, and been out a year ahead with a console that would still be viable next to PCs and would justify the $350 price tag.
But enough about Nintendo. We have the specs of today's consoles, but what do you think: should they have done more, or are you satisfied with what they've got? In a perfect world we would be gaming on 7990s all around, or at least 7970s, but with price being the strongest draw of console gaming, do you think these systems are justified?
PS4: (Possibly $399 - $449, they don’t want to hit $500)
AMD 8-Core CPU
1.84 TFLOPS GPU (HD 7850+)
8GB GDDR5 RAM (176 GB/s)
X1: (Possibly $349 - $449)
AMD 8-Core CPU
1.23 TFLOPS GPU (HD 7770 GHz)
8GB DDR3 RAM (68.3 GB/s) + 32MB eDRAM
Wii U: ($299 & $349 w. Game)
Tri-Core PowerPC (3 Wii CPU cores with a 75% clock boost)
352 GFLOPS (4650)
2GB DDR3 RAM (12.8 GB/s) (1GB for system, 1GB for games) + 32MB eDRAM
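The gap between the three consoles is easiest to see as ratios; a quick sketch using only the numbers from the spec lists above (the eDRAM pools are ignored for simplicity, which understates the Xbox One and Wii U somewhat):

```python
# Relative GPU throughput and main-RAM bandwidth, using only the
# figures from the spec lists above (eDRAM pools ignored).
specs = {
    "PS4":      (1840, 176.0),  # (GFLOPS, GB/s)
    "Xbox One": (1230, 68.3),
    "Wii U":    (352, 12.8),
}
base_gflops, base_bw = specs["Wii U"]
for name, (gflops, bw) in specs.items():
    print(f"{name}: {gflops / base_gflops:.1f}x GPU throughput, "
          f"{bw / base_bw:.1f}x RAM bandwidth (vs Wii U)")
```

On main-RAM bandwidth alone the PS4 comes out well over 10x the Wii U, which is the "2GB of slow RAM" complaint above in numbers.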
Top-end gaming PCs: $2,000+
Intel Core i7-3930K
8+ TFLOPS (HD 7990, GTX 690, or SLI/CrossFire: GTX Titan, 780, HD 7970 GHz, GTX 680)
16GB - 32GB DDR3 RAM (bandwidth varies)
Single-card high-end gaming PCs: $1,000 - $2,000
Intel Core i7-3770K / AMD FX-8350
3+ TFLOPS (GTX Titan, GTX 780, HD 7970 GHz, GTX 680, HD 7970, GTX 670, HD 7950+)
16GB - 32GB DDR3 RAM (bandwidth varies)
Potential top-end console: $500 - $700
AMD 8-Core CPU
2.56 TFLOPS GPU (HD 7870 GHz)
8GB GDDR5 RAM (176 GB/s)
What the Wii U should have been (2011 console): $300 - $400
AMD 3- or 4-Core CPU (AMD A6-3500 Llano / AMD A8-3850 Llano)
480 GFLOPS (HD 6550D)
2GB DDR3 RAM (25.6 GB/s) + 32MB eDRAM
What the Wii U should have been (2012 console): $300 - $400
AMD 4-Core CPU (AMD A10-5700 Trinity)
614 GFLOPS (HD 7660D) / 768 GFLOPS (HD 6670)
4GB DDR3 RAM (25.6 GB/s) + 32MB eDRAM