nVidia's GT300 specifications revealed - it's a cGPU

Over the past six months, we heard different bits and pieces of information about GT300, nVidia's next-gen part. We decided to stay silent until we had the information confirmed from multiple sources, and now we feel confident enough to disclose what is cooking in Santa Clara, India, China and other nV sites around the world.

GT300 isn't the architecture that was envisioned by nVidia's Chief Architect, former Stanford professor Bill Dally, but this architecture will give you a pretty good idea why Bill told Intel to take a hike when the larger chip giant from Santa Clara offered him a job on the Larrabee project.

Mikerra173487d ago

You're good at that. CPUs are dominated by Intel and AMD.

Fishy Fingers3487d ago (Edited 3487d ago )

They are, but it's still 'just' a graphics card at the end of the day, with a new architecture to allow things such as direct programming on the GPU.

Sounds amazing if you ask me, but probably very pricey.

CaseyRyback_CPO3487d ago (Edited 3487d ago )

When games designed for the PC must run on consoles, you get lots of top-end power with nowhere to run.

Crytek is the rare exception, but even they are scaling engines down to run at solid framerates on hardware without a hard drive.

Someone needs to make games that demand maxed-out hardware, and so far only Crytek has.

Fishy Fingers3487d ago (Edited 3487d ago )

Huh? I don't understand what you mean. Games may be on both consoles and PC, but you can still max out the settings far beyond those of the consoles: 2560x1600 resolution, 16x AA, etc.
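For a rough sense of scale, the settings Fishy mentions push several times the pixels of a typical console target (720p was the common console output at the time). A quick back-of-the-envelope check in Python:

```python
# Compare pixel counts: a maxed-out PC resolution vs. a common
# console output resolution of the era (720p).

def pixels(width, height):
    return width * height

pc = pixels(2560, 1600)      # 4,096,000 pixels
console = pixels(1280, 720)  # 921,600 pixels

print(round(pc / console, 1))  # ~4.4x the pixels, before any AA
```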

INehalemEXI3487d ago

Get it out there ASAP; by the time its price drops to something reasonable, I will be in need of it.

CaseyRyback_CPO3487d ago (Edited 3487d ago )

You can run HL2 at 4000x4000 with 90x AA on 10 SLI'd video cards, and the same could be said about Dig Dug. That doesn't render them in some Crysis-like quality all of a sudden; it's just better refining of the same 500x500 texture map.

PC games AT THIS POINT should all look like Crytek's engine. PC gaming used to follow that trend: developers were one-upping one another with the game you knew you'd have to upgrade to play in its full glory. HL2, BF2, Doom 3, S.T.A.L.K.E.R., Black & White, Kingpin, etc.

Remove that factor and you have PC gaming in the state it's currently in: rendering low-quality console games at grand resolutions, but they are still the lower-quality assets. You don't need super power to run the Unreal/Source engines. Sure, you can render them amazingly at 900 fps, but they are still nowhere near the demanding requirements of Crysis.

I'm an avid PC gamer, and PC gaming sucks right now. I'm not going to pretend to be excited that L4D can run MAXED beyond WUXGA; so could HL2, five years ago.

My most-played PC games these days are sims like Black Shark or Lomac. But don't confuse what I'm saying with some lame consolite argument. It's the truth.

Kakkoii3486d ago


Lol, then why is Intel scared of Nvidia, and why is Intel trying to create a GPU?

Because Intel knows GPUs work on a much better principle for computing. GPUs have proven to be much, MUCH better at many tasks that used to be handled by CPUs. Intel doesn't want to miss the bandwagon and be left in the dust if the industry moves to mainly GPU computing, which all signs show it is moving towards.

GPUs were already very good at tons of different computing tasks. Nvidia is just tweaking them to do so much more now, really putting a stranglehold on CPUs.
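The "many tasks" point boils down to data parallelism: the same small operation applied independently to millions of elements, which is exactly what GPUs are built for. A toy sketch (plain serial Python, not actual GPU code; the function is made up purely for illustration):

```python
# Toy illustration of an "embarrassingly parallel" workload: the same
# small function applied independently to every element, e.g. shading
# every pixel of a frame. A GPU runs thousands of these at once; a CPU
# works through them a few at a time.

def shade_pixel(value):
    # Stand-in per-element computation (invented for illustration).
    return (value * 3 + 7) % 256

def shade_frame(pixels):
    # On a GPU this map would be executed by thousands of threads
    # concurrently; here it is an ordinary serial loop.
    return [shade_pixel(p) for p in pixels]

print(shade_frame([0, 1, 2, 3]))  # [7, 10, 13, 16]
```

Because no element depends on any other, the work can be split across as many cores as you have, which is why these workloads map so well to GPUs.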


Lol, I bet you this will cost $1k per card.

Kakkoii3486d ago


Nah, it's mainly the die size that constitutes most of the cost. This chip will be around the same price range as the GTX 280 was when it first came out. Perhaps a bit more, but not too much. Nvidia knows it can't price its gaming cards too high, or else it won't make enough sales.


I was exaggerating a little.

So, since this is basically the same jump in performance as the 8800 Ultra was, you think it will be around the $600-800 range depending on the bin?

Mikerra173486d ago

I'll admit defeat, you're right.
I just really like nVidia's video cards.

zagibu3486d ago

@CaseyRyback: It's true that PC games are negatively influenced by console gaming (not just on a technical level, btw). It's not true, however, that more GPU power is useless for all games except Crytek's offerings. There is more to computer graphics than FPS, polycount, display resolution, texture resolution, texture filtering and anti-aliasing. Nowadays, complex shader programs hit performance much harder than any of those basic things, and it's pretty easy to switch them off for console versions or less powerful GPUs. There is also draw distance in scene graphs, geometry detail, density of dynamically added dummy objects (like grass, etc.) and many more.
I know your answer to this: why should they implement those advanced features if only 1% of the user base can even see them? The answer is prestige. The currently best-looking game will benefit from additional media hype. Another thing is the separation of producer and developer: even if it doesn't make financial sense, a dev might implement a feature simply because he thinks it's cool.
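The switching-off zagibu describes often amounts to per-platform quality presets. A hypothetical sketch in Python (every name and number here is invented for illustration, not taken from any real engine):

```python
# Hypothetical per-platform quality presets: expensive features like
# long draw distance, dense dummy objects (grass) and advanced shaders
# get dialed down or disabled for weaker hardware. All values invented.

QUALITY_PRESETS = {
    "console": {"draw_distance_m": 400, "grass_density": 0.3, "advanced_shaders": False},
    "pc_high": {"draw_distance_m": 1200, "grass_density": 1.0, "advanced_shaders": True},
}

def settings_for(platform):
    # Look up the preset for the given target platform.
    return QUALITY_PRESETS[platform]

print(settings_for("console")["advanced_shaders"])  # False
```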

SkyGamer3486d ago

I agree with the first poster. Do you really think Intel/AMD will sit quietly? Trying to do too many things at once will more than likely cause the company to go belly up, or at least let ATI easily surpass them.

nVidia, do what you do best, and that is graphics. They can't even make decent motherboard chipsets (especially for laptops), so they need to just focus on graphics cards.

Kakkoii3485d ago

@SkyGamer: You're obviously not very educated on this subject, because you don't seem to know that AMD and ATI are the same company. ATI is a subdivision of AMD targeted at graphics.

And Nvidia is doing what it does best: massively parallel computational chips. It's enabling its GPUs to do even more.

Farsendor13487d ago

As long as it's no more than $500, I'll buy it.

richierich3487d ago

So, anyone got any idea when it's out?

NRG3487d ago

Rumors are suggesting October or sometime near that, the last I heard. Just in time for Modern Warfare 2 :)

SpartanGR3487d ago (Edited 3487d ago )

From the info I've gathered from the press so far: near the end of 2009 AT BEST, early 2010. My estimate is near the public release of Win7 (November).

Pandamobile3487d ago

If a single GT300 GTX card can outperform my two 295's, I'll kill myself :(

nnotdead3487d ago

To be fair, most of the time a single more powerful GPU is better than two lower-end GPUs.

INehalemEXI3487d ago (Edited 3487d ago )

We are gathered here today in remembrance of Pandamobile... he was an avid gamer who found peace in SLI... or did he?

CptBach3487d ago

Yeah, but a single GPU that can beat 2x GTX 295 would be something amazing, so I doubt it will be able to do that.

NRG3487d ago (Edited 3487d ago )

The GT300 might not, but imagine the generation after the GT300. And there's always GT300 SLI...

Kakkoii3486d ago

No, it shouldn't, because what you're talking about is 4 GPUs in 2x dual SLI. The GT300 will be damn powerful, but not 4-GPUs powerful, lol. As the article said, it should be around double. It will probably still get close to the performance of your setup, though, because SLI/Crossfire has bottlenecks that mean two chips don't actually deliver 2x the performance, unlike a single chip that is 2x more powerful.
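The SLI-scaling point can be put in Amdahl's-law terms: only the fraction of frame work that splits across GPUs speeds up, while driver overhead, synchronization and transfers stay serial, so two GPUs land well short of 2x. A quick sketch (the 0.8 parallel fraction is an illustrative assumption, not a measured figure):

```python
# Amdahl's law applied to multi-GPU rendering: the serial portion of
# the work (driver overhead, inter-GPU sync, transfers) caps the
# overall speedup no matter how many GPUs you add.

def sli_speedup(parallel_fraction, n_gpus):
    # Overall speedup when only `parallel_fraction` of the work
    # divides evenly across `n_gpus`.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_gpus)

print(round(sli_speedup(0.8, 2), 2))  # 1.67 -- well short of 2x
```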

N4g_null3486d ago

WOW man you are so dead!

xg-ei8ht3487d ago

16x AA isn't needed, Fishy.

Only the most hardcore PC players want to outdo each other.

I used to be like that. I never had the highest-end hardware, somewhere in between, but benchmarking gets boring.

I'd rather be playing games.

You see the Wii: hardly any games on the system have AA, and people are running them on big HD TVs.

The same can be said of some PS3 games I've played: no AA, or Quincunx, which sits between 2x and 4x but adds slight blur.

As long as the game looks good and plays well, I'm not that bothered.

I prefer plenty of AF, which shouldn't cost anything today.

Don't get me wrong, I'd love to see what's possible on a banged-out PC with all the latest stuff and the new nVidia card.

Trouble is, most of the time we don't see it. Gaming has mainly gone the console route and is likely to stay that way for a while.

Kakkoii3486d ago

Preferences, preferences, preferences.

I love technology, so I like to see advancements like this. And I like to play with the most advanced technology. This isn't just about gaming, but the future of computing.

And of course, no extra detail settings are "needed". They are just "wanted" if available. It's a drive to bring gaming closer to lifelike realism. You can still make cute, unrealistic games like on a Wii; this isn't meant to eliminate those kinds of games. It's merely meant to open doors to more advanced games, games that can blow our minds.
