
BrightSideofNews: nVidia GT300's Fermi architecture unveiled: 512 cores, up to 6GB GDDR5

"Beside the regular NV70 and GT300 codenames [codename for the GPU], nVidia's insiders called the GPU architecture - Fermi. Enrico Fermi was an Italian physicist who is credited with the invention of nuclear reactor. That brings us to one of codenames we heard for one of the GT300 board itself - "reactor".
When it comes to boards themselves, you can expect to see configurations with 1.5, 3.0 GB and 6GB of GDDR5 memory, but more on that a little bit later."

Read Full Story >>
brightsideofnews.com
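
(For context on the specs quoted above: a minimal sketch, assuming a machine with the CUDA runtime installed, of how a board's memory size and multiprocessor count can be read back in software. These are standard CUDA runtime API calls; nothing here is specific to GT300, and a 6GB board would simply report a larger totalGlobalMem.)

// Minimal sketch: query a CUDA device's memory and SM count via the
// standard runtime API. Prints whatever the installed card reports.
#include <cuda_runtime.h>
#include <stdio.h>

int main(void) {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        fprintf(stderr, "no CUDA device found\n");
        return 1;
    }

    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);  // device 0: the first card

    printf("device 0: %s\n", prop.name);
    printf("global memory: %.1f GB\n",
           prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
    printf("multiprocessors: %d\n", prop.multiProcessorCount);
    return 0;
}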
Major_Tom 3303d ago (Edited 3303d ago)

Yeah, I figured we were gonna have to pay out the ass if we wanted a GT300. I really doubt 3GB and 6GB are going to be consumer grade, though.

EpsilonTeam 3303d ago

Especially you, Major_Tom, a known ATI fanboy. BTW, Nvidia always ends up with something better. Told you so.

Kakkoii 3303d ago (Edited 3303d ago)

1.5GB and 3GB will be consumer. 6GB will be workstation. Of course, you never know what the AIBs might produce; one company might make a consumer 6GB card lol.

But anyway, this card is awesome. Don't know where your "pay out the ass" comment came from. No prices have been stated. On the contrary, it was stated that it should be at a good price. Nvidia knows they can't play the super-high-price game this round.

Major_Tom 3303d ago (Edited 3303d ago)

Lol, how am I an ATI fanboy? This site really is so defensive. Nvidia has been coughing up blood lately; it doesn't take a room full of scientists to figure that out.

Well, let's hope it ignites a price war, because that's what we all really want: Nvidia shoots low, then AMD/ATI shoots low, and it'll be awesome for everyone.

EpsilonTeam 3303d ago (Edited 3303d ago)

Yes, Nvidia is no saint, that's for sure, but show some respect to the company that brought us CUDA, PhysX, and 3D gaming, and now with their new architecture we're getting even more. Something you can't say for ATI.
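
(Since CUDA keeps coming up: a minimal sketch of what CUDA code actually looks like, a vector-add kernel launched through the standard runtime API. The kernel name, array size, and launch configuration are illustrative choices, not taken from any NVIDIA sample.)

// Minimal CUDA sketch: add two float arrays on the GPU, one thread per element.
#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main(void) {
    const int n = 1 << 20;                      // 1M elements (illustrative)
    size_t bytes = n * sizeof(float);

    // Host buffers with known inputs.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers; copy inputs over.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);               // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}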

aGameDeveloper 3302d ago

NVIDIA may have brought us CUDA, but NovodeX brought us PhysX, and several graphics hardware vendors (not to mention game developers) brought us 3D gaming before NVIDIA was a force with which to be reckoned. From Wikipedia:

"In the early and mid-1990s, CPU-assisted real-time 3D graphics were becoming increasingly common in computer and console games, which led to an increasing public demand for hardware-accelerated 3D graphics. Early examples of mass-marketed 3D graphics hardware can be found in fifth generation video game consoles such as PlayStation and Nintendo 64. In the PC world, notable failed first-tries for low-cost 3D graphics chips were the S3 ViRGE, ATI Rage, and Matrox Mystique. These chips were essentially previous-generation 2D accelerators with 3D features bolted on. Many were even pin-compatible with the earlier-generation chips for ease of implementation and minimal cost. Initially, performance 3D graphics were possible only with discrete boards dedicated to accelerating 3D functions (and lacking 2D GUI acceleration entirely) such as the 3dfx Voodoo. However, as manufacturing technology again progressed, video, 2D GUI acceleration, and 3D functionality were all integrated into one chip. Rendition's Verite chipsets were the first to do this well enough to be worthy of note."

chak_ 3303d ago

Nah, those might be for scientists. Those are impressive numbers, though.

As might be the price.

darkequitus 3303d ago

Typical Nvidia. After all they said about Larrabee running C++, they go and do the same thing. Just like when ATI mentioned unified shaders: they were no use until Nvidia brought out their own interpretation. Let's see what the consumer part costs [a lot].

Lemon Jelly 3303d ago

I bet this one card will cost more than my whole rig.

FordGTGuy 3303d ago

For a brand-new, over-the-top graphics card, though.

Kakkoii 3303d ago

A card that's pretty much its own computer now (RAM, CPU, GPU, motherboard). lol.

snaileri 3303d ago

700+ euros, that's for sure.
