GPUs Will Have 20 Teraflops of Performance by 2015 – Nvidia Chief Scientist

Chief scientist of Nvidia Corp., William Dally, expects graphics processors to become more general-purpose and reach rather extreme performance in the coming years. In addition, Mr. Dally emphasized the importance of further parallelism in graphics processors and implied that in the future graphics processing units should transition to a MIMD architecture.
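For what it's worth, the growth rate implied by the headline can be sketched with some quick arithmetic. The 2009 baseline here is an assumption (roughly 1 TFLOPS single precision for a high-end GPU of the time), not a figure from the article:

```python
import math

# Assumed baseline: a 2009 high-end GPU at roughly 1 TFLOPS single precision.
baseline_tflops = 1.0
target_tflops = 20.0   # Dally's 2015 figure from the headline
years = 2015 - 2009

# Compound annual growth rate needed to hit 20 TFLOPS by 2015.
annual_growth = (target_tflops / baseline_tflops) ** (1 / years) - 1
doubling_time = math.log(2) / math.log(1 + annual_growth)

print(f"implied growth: {annual_growth:.0%} per year")   # ~65% per year
print(f"doubling time: {doubling_time:.1f} years")       # ~1.4 years
```

Under those assumptions, performance would have to roughly double every year and a half, which is in the same ballpark as the historical GPU trend the prediction leans on.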

shocky163344d ago

Gonna cost a fortune though.

Kakkoii3344d ago

Why would it cost a fortune?

The chips don't increase in size, because the fabrication processes get smaller. Thus the prices don't increase, because the chips stay around the same size. The main cost driver is how many chips you can fit on each wafer.

Once we can't make chip fabrication any smaller due to the physical limits of matter lol, then yeah, they will start getting bigger and more expensive. At least until cheaper production methods come along.
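The wafer argument above can be made concrete with a standard back-of-envelope dies-per-wafer estimate. The die areas below are illustrative assumptions, not real part numbers:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Rough dies-per-wafer estimate: usable wafer area divided by die area,
    minus a correction for partial dies lost around the wafer edge."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# A die shrink cuts die area, so the same 300 mm wafer yields far more
# chips, which is why cost per chip stays roughly flat across process nodes.
big_die = dies_per_wafer(300, 470)    # large high-end GPU die (assumed size)
shrunk = dies_per_wafer(300, 240)     # same design after a full-node shrink
print(big_die, shrunk)                # the shrink roughly doubles the yield
```

This ignores defect yield, but it shows the basic point: halving die area roughly doubles the number of chips per wafer, and wafer cost is what you actually pay for.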

Graphene gates and carbon nanotube interconnects. Doesn't get any smaller than that. From there on we have to look into photonics, and then quantum computing.

shocky163344d ago

Don't mind me, but thanks for telling me that. Maybe I'll purchase a gaming PC in the near future if it's cheaper.

Shane Kim3343d ago

I want a PC with all that you just said. Quantum computing and all :P.

JsonHenry3343d ago

Just look at how cheap it is to build a system capable of playing Crysis maxed out now compared to just a year ago.

Near-photorealism is just around the corner and the price of admission is only going to get cheaper as time goes by. Hell, chips are getting so small and energy efficient that I imagine it's only about 5-10 years before HDTVs come with a built-in PC capable of playing HD x264 net-streamed videos at 1080p, with external hard drive connectivity. Hell, the price of power might be so cheap by then that your HDTV might be able to play recent (maybe even near-future) games with no problems.

I am truly excited about the future of tech now more than ever.

evrfighter3343d ago

ATI just dropped the price on their HD 4890s.

HD 4850s can be found for less than $100
HD 4870s can be found for less than $150
HD 4890s can be found for less than $200

It truly is a great time to build a PC.

thePatriot3344d ago

I want my next gen consoles to come out.

Kakkoii3344d ago (Edited 3344d ago )

It seems William Dally is hinting at a MIMD architecture, which helps solidify the rumor that Nvidia's GT300 series is going to use MIMD instead of the old SIMD. Hopefully it's true that they will be doing it this coming generation. And if they do, I feel sorry for ATI, because it's confirmed their 5000 series architecture is still SIMD.

Conviction_GoTy_20093343d ago

Glad to see Nvidia is supporting Windows 7 :D

OmarJA3343d ago

Looks like the PS4 will be king of graphics again.

commodore643343d ago

bu bu... teh 10 year ps3 lifespan?

Simon_Brezhnev3343d ago


You dumbass, what does the 10 year lifespan have to do with this?

Pandamobile3343d ago

What the hell does this have to do with the PS4?

STONEY43343d ago

I think he's saying that because the PS3 uses an Nvidia graphics card. But by that time I'm sure ATI will have the same tech too.

Pandamobile3343d ago

Come to think of it, I thought I remembered Sony saying they're using an Intel-manufactured graphics processor. Something along the lines of Larrabee.

Kakkoii3343d ago

Nah Pandamobile, the PS3 uses an Nvidia GPU that's based on the GeForce 7800.

Which is incredibly piss poor compared to today's GPUs lol.

Pandamobile3343d ago

I meant I heard that Sony is using an Intel GPU for the PS4, not 3.

commodore643343d ago

Yeah, I made that comment because the whole idea of the PS3 having a ten year lifespan is really a bit moot.

In 2015 the PS3 will be just over 8 years old.
And competitive GPUs (not even counting CPUs) will have ten times the power of the mighty PS3...

Kinda puts the ten year lifespan into perspective, is all.

Kakkoii3343d ago

@Pandamobile: Ah, the PS4. Well, those are 100% just rumors that the PS4 would use Larrabee or some iteration after it from Intel. Firstly, Sony is dedicated to the Cell processor; they have a lot of stock invested in it. And since Larrabee is a CPU/GPU package, that sort of makes the rumor unlikely.

Then there's the fact that Larrabee isn't meant to compete with the high-end market, just the middle and low end. It's been speculated to only be about as powerful as a GTX 280, and it's coming out in Q1 2010. By then both Nvidia and ATI will have their new, much more powerful generations out and will be steadily improving their new architectures. So it would be a poor choice for Sony to go with Intel's GPU instead of the high end from ATI or Nvidia.

And finally, there is no way Sony would already have plans to use an Intel GPU in their next console when Intel's Larrabee hasn't even been released to companies for testing yet. It would be a pretty blind leap for Sony to make a commitment like that without even seeing how Intel's first high-performance GPU turns out.

Pandamobile3342d ago

Yeah, I read around and found that it was just a rumour and got debunked many times.
