New Xbox With Six-Core CPU and “Dual-GPU” From AMD (Rumor Analysis)

Pinoytutorial: Here's an analysis of the rumor we heard today that the next-gen Xbox ("Xbox 720") will be powered by a six-core CPU with a prototype dual-GPU from AMD, with a partial announcement expected from Microsoft at CES 2012. More details in this report.

So do you think a "hexa-core" CPU with a dual GPU will be enough for the Xbox 720? Share your comments below.

Shackdaddy8362136d ago (Edited 2136d ago )

For the CPU, it depends more on the clock speed. Cores are great and all and they do matter to an extent, but clock speed is more important when you take gaming into account. I would like to see a rumor on the clock speed before I make any decisions.

Dual-GPU sounds very doubtful to me.
1. That would be a lot more expensive
2. It would create a ton of heat requiring more efficient cooling for the case
3. It would be bulkier, likely forcing Microsoft to use a bigger case than they want.

Also RAM is a big factor for games. I think there was a rumor about RAM somewhere as well.

MaxMurdoch2136d ago

1. That would be a lot more expensive

Wrong. A dual GPU is cheaper than a single GPU, and offers more power.

2. It would create a ton of heat requiring more efficient cooling for the case

Possibly, but not probably. AMD GPUs are pretty efficient and don't make that much heat.

3. It would be bulkier probably forcing Micro to have a bigger case than they probably want.

Unless you know which dual GPU they intend to use, you can't make that assumption.

Shackdaddy8362136d ago (Edited 2136d ago )

A dual GPU is not cheaper than a single GPU. It's cheaper than two single GPUs, but not one. IDK where you got that from...

Anyway, if the rumors about the next Xbox's GPU being DX11-capable are true, then a dual GPU would be outrageously expensive for a console.

DeadlyFire2136d ago

It depends on the power level. A dual low-end GPU setup is very cheap compared to a single high-end GPU, so it could potentially be either. If it's a dual high-end GPU it will be expensive, but either way it won't have much effect on the final console price.

DirectX 11 will be supported regardless.

I'm personally a little surprised at a 6-core CPU. Is it AMD- or Power-based? I'm wondering about the thread count. If it's Power-based, it's up to 4 threads per core; if it's AMD-based, it's likely just 2 threads per core. If the Wii U has 12 threads and the Xbox 720 has 12 threads, a similar-scale GPU is likely as well. Even with a dual GPU, graphics will likely start out on a similar level next generation. Really exciting for Nintendo fans.

inveni02135d ago

First, I disagree that clock speed is more important than the number of cores. We can only go so fast with processor speeds, and the current consoles are already very fast in that regard (the 3-core in the 360 and the SPU setup in the PS3). They don't have much headroom when it comes to clock speed, so more cores sounds likely. 6 cores might be a little high, though; for cost reasons, a good 4-core would give a big improvement over the current processor.

As for a dual-core GPU... nope. It ain't happen'n. They need to keep costs down to stay competitive with Nintendo. So, at best, we'll probably see something equivalent to an Nvidia GTX 560...

Most likely, however, they'll be using the APUs AMD recently released.

Sarcasm2135d ago

Whoever said more cores or more clock speed matters most is wrong. A better architecture is far more important. Don't believe me? Compare an i7-2600K against an i7-920 clock for clock: the 2600K is much faster.

So if anything, let's hope they put an extremely efficient CPU in that thing.
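The architecture point above can be put into a toy model: delivered performance is roughly IPC (instructions per clock) times clock times cores, so a higher-IPC design wins even at identical clocks and core counts. The IPC values below are made-up numbers for illustration, not measured figures for any real chip:

```python
# Toy performance model: throughput ~ IPC x clock x cores.
# The IPC values below are illustrative assumptions, not benchmarks.
def throughput(ipc, clock_ghz, cores):
    """Rough estimate of billions of instructions per second."""
    return ipc * clock_ghz * cores

older_arch = throughput(ipc=1.0, clock_ghz=3.4, cores=4)  # hypothetical older design
newer_arch = throughput(ipc=1.4, clock_ghz=3.4, cores=4)  # hypothetical better architecture

# Same clock, same core count: the better architecture still wins.
assert newer_arch > older_arch
```

A clock-for-clock comparison like the 2600K vs the 920 is effectively measuring the IPC term of this model in isolation.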

inveni02135d ago

What you're referring to is simply an improvement in the design of processors over time. Of course no one wants them to use 4-core 3.2GHz processors that were created 4 years ago. The same number of cores and clock speeds from today's models are what we're looking for.

steve30x2135d ago

@MaxMurdoch: So are you telling me dual GTX 580s would be cheaper than one GTX 580? LOL, I've heard it all now.

@Shackdaddy836: You talk about dual GPUs creating too much heat, but you also want an overclocked CPU, which will cause a lot of heat too. You need a 4-core CPU that runs at about 3.6GHz without putting out a lot of heat like AMD chips do. The Intel 2600K or 2700K would be perfect: at stock speeds they run cooler than AMD's CPUs and are much faster than even a highly overclocked AMD CPU, including the newest AMD Bulldozer 8-core chips, which run very hot and aren't as fast as the Intel 2600K.

frostypants2135d ago

No way will they go dual-GPU. That's an expansion solution, not an out-of-the-box solution. It's more efficient to go single powerful GPU, assuming they aren't going for bleeding-edge (which they won't).

ProjectVulcan2135d ago (Edited 2135d ago )

I've said for a while that a dual GPU would be possible, actually fairly smart to start with, and would work well in consoles because the developers would be able to code for it.

Why? It's well known that two midrange GPUs can edge out a very expensive high-end GPU while not actually costing any more. One recent example: GTX 560 SLI comfortably beats a single GTX 580, yet costs less. This has been going on for some time, with examples in most generations; 5750 CrossFire usually beat, or at least matched, one GTX 285 for considerably less money.

How is this possible? Simple: midrange or lower-mid parts are much easier and, most importantly, cheaper to make. They will always have better yields and wider tolerances than big, hot, brand-new cutting-edge designs.

In short, it is far more difficult, often more expensive, and ultimately slower to build one big fast chip than to design two smaller, slower ones and run them together.

And while PC games rely heavily on driver-level optimisations to work properly with dual GPUs, console games would have no such hurdles. A developer would design the game to use both GPUs to the maximum every time. This is why console hardware is more efficient with its assets anyway; the PC uses brute force to beat it.

Designing a good midrange GPU, pairing two of them on a hugely fast, high-bandwidth bus, and fitting them to a console actually makes more financial and developmental sense than you might imagine.
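The cost argument above can be sketched numerically. All prices and scaling figures here are made-up assumptions for illustration; real numbers vary by generation:

```python
# Hypothetical price/performance comparison: one high-end GPU vs two
# midrange GPUs. All figures are illustrative assumptions, not real specs.
high_end = {"price_usd": 500, "perf": 100}   # one big, expensive chip
midrange = {"price_usd": 230, "perf": 60}    # one smaller, cheaper chip

SCALING = 0.9  # assume the second GPU adds ~90% of its own performance
dual_midrange = {
    "price_usd": 2 * midrange["price_usd"],
    "perf": midrange["perf"] * (1 + SCALING),
}

# Under these assumptions, the pair is both cheaper and faster.
assert dual_midrange["price_usd"] < high_end["price_usd"]
assert dual_midrange["perf"] > high_end["perf"]
```

The whole argument hinges on the scaling factor: with good multi-GPU scaling the pair wins on both axes, while poor scaling erodes the performance side of the trade.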

Arksine2135d ago

SLI/Crossfire in a console is not happening. The power requirements are too high, the heat output is too much, and eliminating microstutter is a developers nightmare.

darthv722135d ago

Is the notion of a "dual GPU" any different from a dual-core GPU? I can see the current PC trend of having two physical boards in SLI/CrossFire, but if this were a refined GPU (like a console generally uses), it could be a dual-core GPU.

A six-core CPU isn't that hard to believe. We live and play in a world where quad-core is the norm, with 8-core CPUs available and 32- and 64-core designs on the way.

Also, clock speed isn't reflective of performance. There are chips that run slower but do more per clock cycle than a really fast chip that can't do as much.

An efficient chip could be clocked down to, let's say, 2GHz or even 1GHz, but because of the way it is designed, it handles data at the same rate as a chip running at 3.2+GHz.

Things are getting smaller and slower in the "speed" sense but they are getting faster in the efficiency sense.

Shackdaddy8362135d ago

@Steve, where did I say I wanted an overclocked CPU? You don't overclock a CPU in a console. That's just dumb.

frjoethesecond2135d ago

@ Arksine


As long as microstuttering exists in a dual GPU setup it'll be a severe limitation for a console. I'd be very surprised to see a dual gpu in a console.

Kudos for mentioning microstutter.

Kurylo3d2135d ago (Edited 2135d ago )


When he says dual GPUs are faster and cheaper, he doesn't mean two 580s are less expensive than one 580... he means two GTX 560s are both faster and cheaper than a single 580.

gta28002135d ago

I love how everyone at N4G becomes a computer engineer whenever articles like this make it on here, lol.

ProjectVulcan2135d ago (Edited 2135d ago )

Hmmm, some people are still thinking inside the box. Microstuttering is heavily, heavily driver- and bandwidth-dependent on a Windows PC using DirectX and a standard PCIe bus. As I said, if the GPUs are connected by a very high-speed bus and developers are aware of its maximum load, microstuttering would not exist; they probably couldn't exceed that load anyway. This is why microstutter usually doesn't happen for people with powerful PC setups: they have so much bandwidth they can't see any. A high-speed bus for two GPUs to communicate over would not be hard to create in fixed hardware like a console. Five years ago Microsoft employed one to shuttle data between the mother and daughter dies of their separated GPU design, and it works well to my eye.

A console simply would not suffer this problem, as long as the developer knew what they were doing and the console was engineered correctly. It would be eliminated at the design stage, thanks in no small part to the fact that consoles don't have driver software: developers have direct-to-metal, very low-level access to the hardware. A massive part of the stutter problem on PC is driver interaction, and that doesn't exist on a console...

As for power and heat: again, why do people cite this as if it would mean cramming in two very high-end parts rather than two midrange ones?

A pair of Radeon 6850s pulls LESS peak power, and thus produces less heat to dissipate, than a single GTX 470, and certainly no more than a single GTX 580, while being considerably faster! See how they stomp the GTX 480, despite being hindered by drivers that sap some efficiency.

AMD knows what it's doing when it comes to power efficiency. For several years its philosophy has been to build smaller parts than Nvidia and control power consumption, knowing two such cards would beat one big, expensive Nvidia one.

Blackdeath_6632135d ago

All you need is logic to figure out that two of something cost more than one of the same thing.

Computersaysno2135d ago (Edited 2135d ago )

Syncing identical graphics processors inside a console would be pretty easy and would stop microstutter. As ProjectVulcan said, microstutter exists on PC because of circumstances unique to the PC: vastly different motherboards, memory, and hardware and software components make it hard to eliminate entirely. It has improved a lot since the early days of SLI/CrossFire, though!

On a closed, set-design machine where everyone has the same hardware, software, and configuration, like a dedicated console, it wouldn't be an issue.
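For anyone unfamiliar with the term: microstutter is uneven frame pacing. Average FPS can look identical between a smooth setup and a stuttering one while frame-to-frame times swing wildly. A small sketch with illustrative frame times:

```python
import statistics

# Microstutter sketch: two frame-time sequences (in milliseconds) with
# the same average FPS but very different pacing. Numbers are illustrative.
smooth = [16.7] * 8           # evenly paced, ~60 FPS
stutter = [8.0, 25.4] * 4     # alternating fast/slow frames, also ~60 FPS avg

# Identical mean frame time, so identical reported FPS...
assert abs(statistics.mean(smooth) - statistics.mean(stutter)) < 1e-9

# ...but the stuttering sequence has far higher frame-to-frame deviation,
# which is what the eye perceives as judder.
assert statistics.pstdev(stutter) > statistics.pstdev(smooth)
```

This is why an FPS counter alone can't reveal microstutter; you have to look at the variance of individual frame times.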

Besides, the Saturn had a bunch of processors, including a pair each of identical graphics and central processors. At the time it was too complex and expensive to build because it wasn't mainstream, but it has been done in consoles before. Now that the tech has been sorted out and advanced a lot, it would work!

Masta_fro2135d ago


OK, so let me explain this in logical terms so that maybe you and Shackdaddy can understand.

If I have one really big stick that costs 20 cents, and two slightly smaller sticks that cost 8 cents each, then if I tape the two smaller sticks together, I'll have a longer stick than the 20-cent one, but for only 16 cents.

Read that a couple of times, then stop pretending you know anything about computer hardware.

ProjectVulcan2135d ago (Edited 2135d ago )

Yep, the Saturn used dual graphics, splitting the tasks fairly strictly between the two chips, which made it hard to develop for. Dynamic load balancing wasn't a concept that had been explored much on multi-GPU systems back then.

AFR (alternate frame rendering) is the typical technique for PC multi-GPU systems, and it is more prone to stutter. It's mainly used because it's the easiest and most practical way to make as many games as possible function and see a performance boost with more than one GPU across a wide variety of hardware. It isn't well optimised, but it works for the most part, so it's used all the time.

Since this is probably not the best way to do multi-GPU rendering performance-wise, there are better approaches. One would be to break the scene into separate parts, have each GPU render a different part, and then combine them in the final framebuffer. This is not problem-free, but it would be more efficient: you can get near-perfect performance scaling without the classic AFR downsides like sync issues. You do need a smart controller, but this idea and tech are not a pipe dream...
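The two approaches can be sketched as work-division policies. This is a simplified illustration of the scheduling idea only, not how any real driver implements it:

```python
# Simplified sketch of two multi-GPU work-division strategies.

def afr_assignment(frame_ids, num_gpus=2):
    """Alternate Frame Rendering: each GPU renders whole frames in turn."""
    return {frame: frame % num_gpus for frame in frame_ids}

def split_frame_assignment(scanlines, num_gpus=2):
    """Split-frame rendering: each frame's scanlines are divided between
    the GPUs, and the partial results merge in the final framebuffer."""
    half = len(scanlines) // num_gpus
    return {"gpu0": scanlines[:half], "gpu1": scanlines[half:]}

# AFR: even frames go to GPU 0, odd frames to GPU 1.
assert afr_assignment(range(6)) == {0: 0, 1: 1, 2: 0, 3: 1, 4: 0, 5: 1}

# Split-frame: a 1080-line frame is halved, 540 lines per GPU.
split = split_frame_assignment(list(range(1080)))
assert len(split["gpu0"]) == len(split["gpu1"]) == 540
```

AFR is simple but makes frame pacing depend on both GPUs staying in lockstep; split-frame keeps every displayed frame a joint effort, at the cost of needing the load balanced between the two halves.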

This has been tried on PC with Lucid Hydra, but again the difficulty is the sheer variability of the software and hardware involved, not to mention resistance from the graphics companies and their marketing departments. It needs more investment and backing to perfect, but with the right people and money it would work.

On console all these problems go away with development of a new machine. With a fully flexible dual GPU system, then the developer can choose how to set up their engine to take full advantage of them.

Console is the ideal platform for multiple graphics processors; it makes even more sense there than it does on PC!

dcbronco2135d ago

There is no reason it can't be dual GPU. Many forget the current 360 basically uses an APU, and it uses a lot less power than the original 360. Add to that that a CPU next year will be either 32nm or 22nm, with far fewer heat issues. Heat won't be a problem. I still believe they will stick with IBM. The Power architecture IBM uses has already been used to make fused chips. Plus, the Power8 should be available next year, or early for MS anyway, and that architecture has four threads per core like the current Power7. So 6 cores is really 24 threads, four times the current Xbox, and with a far more powerful CPU to begin with. And one thing that does ring true about the rumor is that an MS exec mentioned they had started working on the next Xbox right after the 360 launched, to make sure there would be no RRoD-type issues.

ATi_Elite2136d ago (Edited 2136d ago )

CPU = The 6-core would be very power-efficient while running cool at around 3.0GHz, and would be equal to a midrange 4-core desktop CPU. Making it a 6-core provides midrange 4-core performance while keeping heat and power requirements in check.

*Desktop PC CPUs have hyper-threading, higher clocks, and a ton of special features that require a lot of power, whereas the Xbox 720 would need to work within a certain wattage range. Adding the extra 2 cores gives you the performance boost without the heat and watt usage.*

Dual-core GPU = AMD has done very well with dual-GPU scaling, along with having cool and energy-efficient GPUs. AMD's CrossFire setup is a great way to boost performance while keeping prices lower.

1. First point is price = a dual GPU would be cheap because AMD would be using something already made, like HD 5000 or HD 6000 series GPUs. Dual GPUs give you more power for a cheaper price.

Perfect example: GTX 580 = $500; GTX 560 Ti SLI = $475, but it gives you 25% more power.

2. Second point is heat = a non-issue here as well, because the 720 isn't going to get the standalone PCIe-slot GPUs that PCs get, you silly person. The 720 would get two GPU chips integrated onto one board along with the CPU, sharing the 2GB of XDR RAM.

*2GB of shared XDR RAM is more than enough for a console (512MB system, 1.5GB gaming). You're not going to be running a dual firewall, antivirus, video editing, web browsing, and DVD burning all at the same time on your console, so it doesn't need 4, 6, 8, or 10GB of RAM.*

The clocks would be tuned to fit the power and heat requirements of the Xbox 720, JUST like how CPUs and GPUs are tuned to fit the specs of LAPTOPS.

3. Third point is the size of the Xbox 720. Nothing will ever be made heavier and bulkier than the original Xbox, so chill out. Chip sizes have been shrinking for a while now as we've entered the 28nm and 22nm process range for CPUs and GPUs. Remember, the 360's Xenon was 90nm, then the slim was 65nm! Even 40nm is a big difference from 65nm.

I expect the 720 to be slightly larger than the 360 but run way cooler because of smaller, more efficient chips. I expect the size increase to be for the most important feature... the HDD. 3.5" HDDs are cheap as hell nowadays, and dual HDDs in the console would allow for flawless SP gaming while downloading new games.

Raider692136d ago (Edited 2136d ago )

Where did you see that GTX 560 Ti SLI is 25% more powerful than a single GTX 580? You are wrong: a single GTX 580 is more powerful than a 560 Ti in SLI, and in some benchmarks it even tops a GTX 570 in SLI.

frostypants2135d ago

Check out the benchmarks at Tom's Hardware. I think you're wrong.

vortis2135d ago

You make valid points up until the RAM.

Destructibility has become a huge factor in a lot of games, and the consoles will need about 4GB of RAM if games like GTA, Mafia, or Saints Row plan to go the route of full destructibility. That's not to mention that I'm pretty sure games will be 1080p as standard, so yeah... RAM will still need to be plentiful, especially for open-world games.

I would hate for the newer consoles to repeat what we've seen in this gen's games, where you blow something up and it fades away in 0.3 seconds as if it never existed, just to keep the cache clear.

Scenarist2135d ago (Edited 2135d ago )

@vortis... I agree.

Ram is a much bigger factor if you want high polygon models and high resolution texture maps.

The only limitation I can think of, while developing the 720, is money, lol. Primarily trying to find the best price/performance ratio:
for example, making the system as capable as possible while still being able to sell it at a respectable (video game market) price. A lot of people need to be able to afford it, so to speak.

I wish they would put Solid state drives and 24GB of ram in it ....shit

and then only your imagination can be the limitation of whats possible

TheXgamerLive2136d ago

This is a failed rumor.
Fact is, they've been testing many variations in alpha form for well over a year now, including 12 and 16 cores. There are actually some articles about this; search for them.

kevnb2135d ago

More cores are a pain to program for, but in a console, devs will always know the cores are there.

JsonHenry2135d ago


1. That would be a lot more expensive
- Not if they use two cheaper parts to make it faster than one higher-end part (think SLI with two 460s instead of a 480).

2. It would create a ton of heat requiring more efficient cooling for the case.
- Two cheaper parts do not make more heat than one higher-end, higher-clocked part, especially since they will more than likely be on the same chip as the CPUs.

3. It would be bulkier probably forcing Micro to have a bigger case than they probably want.
- Look at my response to number 2.

More than likely, what we will be getting is basically AMD's Llano CPU/GPU chipset, possibly with higher performance. Their current line, when coupled with the right RAM, can easily run Crysis maxed out at 768p on laptops (to give you some sort of perspective).

Or it could be none of this and something completely different. Kinda upset to see only 2 gigs of RAM; I was hoping for 3-4 gigs.

Persistantthug2135d ago

Arm Processors are mainly for Mobile devices.
A tablet does NOT equal a console.

Ulf2135d ago

I find it hilarious that this small comment is the only one with any relevance, and will largely go ignored by the N4G community.

Most of them didn't even notice that the rumored CPU was an ARM, which would never cut it in a console, since even a hexa-core mobile CPU wouldn't touch a tri-core desktop CPU in performance.

zero_cool2135d ago

Also the flexibility & efficiency of the cores matter just as much as the speed of the cores!

Cheers Gamers & Happy Gaming!

sjaakiejj2135d ago

"For the CPU, it depends more on the clock speed."

I'm just going to stop reading here, as this is completely incorrect. Clock speed has some relevance, but it is by far the worst stat to look at when you're thinking in terms of performance. Instruction rate is far more important, and is a far more accurate indicator of how well a processor will perform.

Look for DMIPS and WMIPS to get some stats.

SegataShanshiro2135d ago

It seems that nobody is asking about the single most important
processing unit: THE BLAST PROCESSOR.

vortis2135d ago

You're wrong about dual GPUs being more expensive and causing more heat. My dual 5770s create less heat than my 4850 did and outperformed it in DX10 games by about 20%.

AMD is known for making efficient CrossFire cards with low maintenance and low energy consumption that produce very little heat for good results. A dual GPU sounds about right for AMD, and it would be the cheaper way to go.

jerethdagryphon2135d ago

So an AMD 6-core Bulldozer then... that's interesting. Completely off the shelf, most likely.

DA_SHREDDER2135d ago

The clock speed of the PS3 and Xbox cores is already 3.2GHz. How much more speed do you need?

Hanif-8762136d ago

The rumored RAM is 2GB of DDR3, which I think is just pathetic; they should throw in at least 4GB.

Focus2136d ago

Until the PS4 is revealed to also have 2GB; then it will be totally awesome, because the new supercell transforms it into 8GB with its awesome phoenix and unicorn magic. Right?

Micro_Sony2136d ago

Hahahahahahaahha - Bubbles

blumatt2136d ago (Edited 2136d ago )

I hope both the PS4 and the next Xbox have at least 4GB of RAM, so there are no RAM bottlenecks.

Sounds like some pretty good hardware though. Will be curious to see the price. I hope both of them (PS & Xbox) end up only costing $399 launch price.

tehnoob32136d ago

The PS4 will likely use XDR RAM running at 4GHz+, which is great, and it makes PS3 emulation easier.

Killzone3Helghast2135d ago

@tehnoob, you don't even know what you're talking about. They won't need to emulate PS3 games with the PS4; it uses the same technology.

RedDead2135d ago (Edited 2135d ago )

Haha, I agree, but eh, DDR3 is fu**ing cheap these days. They may as well throw a good bit in the consoles; DDR3 can still run a good gaming rig, so it's fine.

Dread2135d ago

that was too much dude...made me laugh

a_bro2136d ago

2GB is more than fine.

HaHa_Ostrich2135d ago

No, it is not. The PS2 had 32MB of RAM; the PS3 has a split 512MB. That's a 16x boost in memory, yet it's still considered the main bottleneck of the consoles. So an upgrade of only 4x the current RAM is simply not enough; in 2013-14 it'll be laughable.

T3MPL3TON 2135d ago


Are you slow? I dare you to run a PC with only 2GB of RAM and try to play a newer game. When your computer tells you no for the 12th time, maybe you'll get it.

a_bro2135d ago (Edited 2135d ago )

It's a console, not a PC. On PC it's necessary to go beyond 2GB, but a game console is simply that, a game console: a machine dedicated to just playing games.

You guys must be high if you think game consoles will have as much RAM as a gaming PC. I mean, think how much that would cost. And again, it's a dedicated gaming console, not a PC.

vortis2135d ago

2GB won't even be enough to run GTA VI on consoles without bottlenecks. I hope they don't use your logic because we'll end up with fail consoles.

I agree with all the sensible people: 4GB or GTFO.

We all know they're going for 1080p, AA, etc., which eat RAM like crazy, not to mention destructibility, voice chat, in-game audio and assets, and so on. You'd be flat-out mental to think 2GB would get you much on a next-gen machine. 2GB even in a PS3 or Xbox 360 would only make nominal differences to the games we get today (such as slightly more pedestrians and car variety in open-world games, and more destructibility and lasting debris in shooters).

iagainsti1202136d ago

Not to mention DDR3 is too slow for graphics in modern video cards. The Xbox 360 uses GDDR3, so I wouldn't be surprised if they end up using 2-4GB of GDDR5 or something faster.

STONEY42135d ago (Edited 2135d ago )

Huh. System DDR RAM does not equal GPU GDDR.

And 4GB VRAM graphics cards? We're barely adopting 2GB, and you only need more than 1GB at 2560x1600, which most gamers don't run at anyway.

Unless you mean system RAM, but the GDDR5 remark makes no sense in that case.

iagainsti1202135d ago (Edited 2135d ago )

I'm not saying graphics cards need 4GB of RAM; I'm saying that using GDDR3 or DDR3 is a step backwards. Look at the ATI and Nvidia cards from when GDDR5 was being introduced: a card running GDDR5 with the same components and clock speeds has a huge advantage over a card using GDDR3. It's all about the bandwidth, and GDDR5 is 2x faster than GDDR3. The faster all the components are, the better the frame rates will be. I'm sure Pandamobile would agree.
And consoles use GDDR for system memory too, btw; as an example, the 360 uses shared memory for graphics and system.
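The bandwidth claim is easy to sanity-check: peak bandwidth is transfer rate times bus width, and GDDR5 moves twice as much data per memory clock as GDDR3, so at the same clock and bus width it roughly doubles bandwidth. The clock and bus figures below are illustrative assumptions, not any specific card's spec:

```python
# Peak memory bandwidth in GB/s from transfer rate and bus width.
# Figures below are illustrative, not a real card's specification.
def bandwidth_gbs(gigatransfers_per_s, bus_width_bits):
    """Bandwidth = transfers/s x bits per transfer, converted to bytes."""
    return gigatransfers_per_s * bus_width_bits / 8

gddr3 = bandwidth_gbs(2.0, 256)  # e.g. 1 GHz clock, 2 transfers per clock
gddr5 = bandwidth_gbs(4.0, 256)  # same clock and bus, 4 transfers per clock

assert gddr3 == 64.0
assert gddr5 == 2 * gddr3  # double the data rate, double the bandwidth
```

The same arithmetic also shows why bus width matters as much as memory type: halving the bus width cancels out the GDDR5 data-rate advantage.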

ExCest2136d ago

Actually true. It should have more than 2GB.
I mean, I'm pretty sure 2GB is enough, but 4GB is freakin' cheap nowadays. Just look at prices: RAM often goes on sale for like $20-40, and since these companies buy in bulk they'd pay even less, so it shouldn't be too hard or expensive. With 4GB there would probably never be a memory problem (I'm talking about the PS3 XMB in-game; the XMB would run smooth and could potentially load the icons crazy fast). (If I'm wrong, it's because I'm not too tech-savvy.)

RegorL2135d ago

"With 4Gb, there would be no memory problems probably ever"

Pretty sure you are correct, since it is so much more than 640kB :-)

But what is actually more important than the RAM is fast secondary storage; this time it might be an SSD.

ExCest2135d ago


And if I'm correct, remembering my PC-playing days: yes, RAM is far faster than secondary storage. That means maps in an FPS would actually load faster, since they'd be cached into RAM, and with RAM being so fast you'd get really fast map loading, PC-instant fast.

Wow, this run-on sentence.

JsonHenry2135d ago

I agree with the RAM statement, Spartan. However, when I play current high-end games on my PC, I keep my second monitor running a system monitor. Games like Crysis 2, BF3, and Stalker DX11 use only about 3.5 gigs of my system RAM, and my OS runs with high overhead (about 1 gig just sitting still), so those games are only using about 2.5 gigs, and they look really damn good maxed out at high settings. The only thing I'm not taking into account is that my video card has 1 gig of RAM. Since MS seems to like a unified memory architecture, 2 gigs may be a bit of a limit even to meet the current standard of PC gaming. If the video card does not have dedicated RAM, I fear 2 gigs will not be enough to do what current high-end PCs can do, though certainly enough to look head and shoulders better than what we are getting from consoles now. Certainly enough to look "next gen", but not enough to push increases in graphical fidelity later in the console's life.

Hanif-8762135d ago

@JsonHenry Yup, thats exactly my point.

DarkBlood2136d ago

Well, I just want the release date for the Xbox Loop, or my preference, the Xbox Fruit Loop, lol.

Pandamobile2136d ago

Dual GPU in a console?

Rumor credibility goes out the window.

shenglongg2136d ago

If you consider the innovations provided by AMD (CrossFireX in 2005, AMD Fusion APUs in 2010), you might say the concept of dual GPUs may not be impossible.

irepbtown2135d ago (Edited 2135d ago )

Let's not forget AMD saying their chips for mobile phones can equal console graphics.

I think it is VERY possible.

Edit: My cousin's Samsung Galaxy S II can put out some pretty impressive graphics. They were top notch, definitely something that could compete with consoles. So if they don't do something crazy, even mobile phones will be ahead.

MaxMurdoch2136d ago (Edited 2136d ago )

Why is that such an outrageous idea? A dual GPU would offer more power than a single GPU and cost less for the gaming muscle it provides. I think it's possible, and actually a good idea.

Honestly, with nearly 100% scaling on current-gen AMD and Nvidia GPUs, this doesn't seem impossible.

kaveti66162136d ago

It would cost more than the console itself.

Even if Microsoft purchased them in bulk, they would end up paying 200 USD apiece for them, and the CPU would end up being a bottleneck.

They wouldn't be able to fit it into a small enough form factor to call it a console.

It's not going to happen, man. Give it a rest.

irepbtown2135d ago


Business, man, BUSINESS.

Obviously there will be deals if you order thousands or millions.
Try going to wholesale shops and sites; you can buy laptops for a quarter of the normal price.

Machioto2136d ago (Edited 2136d ago )

Aren't PowerVR GPUs technically multi-GPU? I thought the shrink from 40nm to 28nm would help with any heat problems. Also, AMD makes small GPUs to begin with; their biggest one is the same size as a GTX 560 Ti.

steve30x2135d ago

Their 6990 is bigger than a GTX 590.

Raider692136d ago (Edited 2136d ago )

Alienware has systems like the M18x with SLI GTX 580Ms or ATI 6990 CrossFire, and the M17x R1 with a dual GPU too, all stuffed under the hood, so I don't see a reason a console can't have a dual GPU. The only valid reason I'd call this bogus is that M$ and Sony will most likely want to release their new consoles at a mass-market-affordable price, and a dual-GPU setup isn't economically viable for either; people forget that Sony and M$ have already stated they don't want to lose money on new-generation consoles at release like they did with the 360 and PS3.