
Intel: an expensive many-core future is ahead of us

Ars Technica: Intel has bad news for software developers. It's been hinted at already, but now the company has stated explicitly: it's not enough for software developers to be targeting dual, quad, or eight cores. No, the future holds tens, hundreds, or thousands of cores, and developers are going to have to bite the bullet and write programs that will scale to such systems.
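
What "scaling" means in practice is that the work has to be split into many independent chunks instead of one long sequential loop. As a rough illustration (not code from the article), here is a minimal C++ sketch that divides a summation across however many hardware threads the machine reports, so the same program keeps speeding up as core counts grow:

#include <algorithm>
#include <cstddef>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

// Sum a large array by splitting it across all available hardware threads.
// The same code runs unchanged on 2, 8, or 80 cores; more cores, more chunks in flight.
int main() {
    const std::size_t n = 1u << 24;              // ~16 million elements
    std::vector<double> data(n, 1.0);

    // hardware_concurrency() may report 0, so fall back to a single worker.
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<double> partial(workers, 0.0);   // one partial sum per worker
    std::vector<std::thread> pool;

    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            std::size_t begin = n * w / workers;
            std::size_t end   = n * (w + 1) / workers;
            partial[w] = std::accumulate(data.begin() + begin, data.begin() + end, 0.0);
        });
    }
    for (auto& t : pool) t.join();

    double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::cout << "sum = " << total << " using " << workers << " threads\n";
}

Build it with any C++11 compiler (add -pthread on gcc/clang); the point is only the shape of the code, not the specific workload.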

Read Full Story >>
arstechnica.com
Kakkoii 3666d ago

You mean....

Sorta like what CUDA does on the GPU?

Since a GPU is already made up of hundreds of little cores. LOL.

Looks like CPUs will become more like GPUs over time... so what's the point? Just let the GPU take over :).

VIVA LA CUDA!

JoelR 3666d ago

No, quite a bit different.

Most multicore processors (and massively parallel systems) have Turing-complete processors as subprocessors; most GPU subcores are not Turing complete.

DJ 3666d ago

But Intel needs to bite the bullet (as well) and create a multi-core platform that ISN'T x86-based, has dedicated bandwidth lines connecting all the cores, and is inherently scalable. The reason they need to do so is that STI (Sony, Toshiba, IBM) have already done so with the Cell platform, and it won't be much longer before Cell-based PCs are made available to consumers.

Isaac 3666d ago

I would love that, but wasn't the Cell supposed to be very different from the old PC architecture?

JoelR 3666d ago

Cell is different from regular PC architecture, that is true. But as with any good processor, it can run programs that replicate everything a traditional processor (multithreaded or not) does.

We have been saying all along that processors would lean toward the Cell design and that schools need to be teaching students how to handle this (of course, we have found many PhDs don't understand it). So we have been aggressively trying to get teachers to learn it and to put a significant amount of research on parallel algorithms into the public's hands.

killax3563 3666d ago (Edited 3666d ago)

Toshiba already offers a laptop with a Cell processor in it... but the Cell processor in the laptop just acts as an assist to the main CPU, which is a Core 2 Duo chip.

http://crave.cnet.co.uk/lap...

Want to know why Toshiba didn't make the Cell processor the main CPU? It's because the laptop runs Windows, and Windows is written for x86-based chips... NOT the Cell processor.

So my point is this: Cell-based computers (the PS3 aside) will NEVER become mainstream if no OSes are written for the Cell's unique architecture.

JoelR 3666d ago (Edited 3666d ago)

"So my point is this- Cell based computers (sans the ps3) will NEVER become mainstream if no OS's are written for the Cell's unique architecture."

Yep, that is why it is not in current computers. But there is a fairly excellent Linux distro that supports it (partially), and AUX does as well (fully), so do not be surprised to see an Eee-type Unix-based palmtop or some such that uses a Cell processor in the not-too-distant future.

zapass 3666d ago

Gabe Newell: "Can't you take care of my stupid old Quake 2 code and make it 'next-gen' by magic??"

Intel: "No, you're gonna have to adapt your engine to multicore, otherwise you'll be slow as a dog. Sometimes hardware cannot save poor software. This is a one-time redesign, and then your performance will scale with the ever-increasing number of cores in the future. There is no alternative: you'd better bite the bullet now..."

Gabe Newell: "Oh noooooooooo, TEH CELL all over again! SONYYY, IBMMMMM, I APOLOGIZE, YOU WERE RIGHT AFTER ALLLLLLLLL"

LMFAO!

JoelR 3666d ago (Edited 3666d ago)

Yep... the change from single core and naive multiprocessing to massively parallel is going to kill off a lot of the old guard who were something back in the day but haven't adapted.

Adapt or die - there is no stopping it.

I myself am afraid of quantum computing... I don't understand qubits well enough, so if that replaces traditional gates... /shudder

Apocalypse Shadow 3666d ago (Edited 3666d ago)

When I mentioned the Terminator chip, and how the Star Trek Enterprise uses multiple chips or cores to run the ship, I pointed out that nerds used sci-fi movies and TV shows a long time ago to come up with a lot of the products we use today. They strived to recreate the tech they saw being used.

But Sony knew it, Ken K. knew it, Toshiba knew it, and so did other companies that can SEE where we are going.

It's just lazy PC programmers who aren't ready to jump into next-gen multicore, multichip design and programming. They're the ones that still like programming to one chip.

But they'll soon find that they'll get left behind.

apocalypse......

Baka-akaB 3666d ago

With Intel pushing ray tracing so much, is it any wonder they "can't find any other way around"? It might be the truth, but it's convenient for them anyway.

JoelR 3666d ago

The issue is not really ray tracing, but that gains are no longer easily accessible to basic single-processor development.
Chip design is now at the point where things like electron tunneling and other quantum effects make it harder to create denser circuits. Higher speeds = higher temperatures in small spaces = thermal breakdown, etc.
Since the traditional Moore's-law-style approach to processors does not work anymore, modern computing theory leaves three real options for performance improvements. They are:
- qubit quantum computers
- multiprocessors
- protein computers

Of the three, only one is readily usable without another 10-20 years of development: multiprocessors. Thus symmetric and asymmetric processing algorithms are the order of the day for programmers wanting to use the hottest hardware.
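
To make those terms concrete: in symmetric processing every worker runs the same code on a different slice of the data, while in asymmetric processing workers take on different roles, the way the Cell splits work between its PPE and SPEs. Below is a minimal C++ sketch of the asymmetric style, with a made-up producer/consumer workload purely for illustration:

#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Asymmetric split: one "control" thread produces work items, several
// "worker" threads consume them -- different roles, different code paths.
int main() {
    std::queue<int> tasks;
    std::mutex m;
    std::condition_variable cv;
    bool done = false;
    long long total = 0;
    std::mutex total_m;

    auto worker = [&] {
        long long local = 0;
        for (;;) {
            std::unique_lock<std::mutex> lock(m);
            cv.wait(lock, [&] { return done || !tasks.empty(); });
            if (tasks.empty()) break;               // producer finished and queue drained
            int job = tasks.front();
            tasks.pop();
            lock.unlock();
            local += static_cast<long long>(job) * job;  // "process" the item
        }
        std::lock_guard<std::mutex> lg(total_m);
        total += local;
    };

    std::vector<std::thread> workers;
    for (int i = 0; i < 4; ++i) workers.emplace_back(worker);

    // Producer role: hand out 1000 work items, then signal completion.
    for (int job = 1; job <= 1000; ++job) {
        { std::lock_guard<std::mutex> lg(m); tasks.push(job); }
        cv.notify_one();
    }
    { std::lock_guard<std::mutex> lg(m); done = true; }
    cv.notify_all();

    for (auto& t : workers) t.join();
    std::cout << "processed total = " << total << "\n";
}

The symmetric version of the same job would simply give every thread the identical loop over its own index range, as in the summation sketch near the top of the page.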
