
NVIDIA "G80" Retail Details Unveiled

DailyTech's hands-on with the GeForce 8800 series continues with more information about the GPU and the retail boards. The new NVIDIA graphics architecture will be fully compatible with Microsoft's upcoming DirectX 10 API with support for shader model 4.0, and represents the company's 8th generation GPU in the GeForce family.

NVIDIA has branded G80-based products as the GeForce 8800 series. While the 7900 and 7800 series launched with GT and GTX suffixes, G80 will do away with the GT suffix. Instead, NVIDIA has revived the GTS suffix for its second-fastest graphics product, a suffix that hasn't been used since the GeForce 2 days...

Silverwolf 3860d ago (Edited 3860d ago)

Sounds like a monster of a card! But damn, the wattage is high as hell; just imagine running it in SLI mode. I wonder how much heat this card will give off!?

kmis87 3860d ago

Does anyone know a price for the GTX series cards? 128 unified shaders sounds unbelievably powerful.

Deceased 3859d ago

The card in the 360 is unified, so I guess ATI was right when they said NVIDIA was going to a unified pipe architecture, because it is more efficient than dedicated pipes. I guess the 360 GPU is pretty powerful.
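(Editor's aside, not from the thread.) The efficiency argument for unified shaders comes down to utilization: with dedicated pipes, a vertex-heavy frame leaves pixel units idle and vice versa, while a unified pool puts every unit to work on whatever workload exists. A toy Python sketch, with invented unit counts and workloads purely for illustration:

```python
# Toy model: utilization of dedicated vs. unified shader units.
# All numbers are invented for illustration; real GPUs are far more complex.

def dedicated_time(vertex_work, pixel_work, vertex_units=8, pixel_units=16):
    # Each pool can only process its own kind of work, so the frame
    # takes as long as the slower (more overloaded) pool.
    return max(vertex_work / vertex_units, pixel_work / pixel_units)

def unified_time(vertex_work, pixel_work, units=24):
    # A unified pool of the same total size processes the combined workload.
    return (vertex_work + pixel_work) / units

# A vertex-heavy frame: the dedicated pixel units mostly sit idle.
v, p = 240, 80
print(dedicated_time(v, p))  # 30.0 -- bound by the small vertex pool
print(unified_time(v, p))    # ~13.33 -- all 24 units stay busy
```

The same total hardware finishes the unbalanced frame more than twice as fast, which is the efficiency claim the comment is making.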

Silverwolf 3859d ago

Can't wait to see ATI's new offering.

Marriot VP 3859d ago

Yeah, I forgot about that unified feature; it pleases the devs for sure.

Still happy the 360 GPU isn't even matched by any single video card yet. It's an R520, I believe; could be wrong. And I laugh in the faces of people buying dual or quad SLI, who bury themselves over 1000 bucks for graphics an inch better.

Antan 3859d ago (Edited 3859d ago)

R520 yes, which is what the X1800 architecture is, but the R580 architecture has been around since December 2005/January 2006, which is the X1900 (1950) range. And the soon-to-be-released R600, which will be ATI's flagship card, looks to be quite a monster, just like the G80. For those of us with bottomless pits of hard-earned cash, these cards will be something to look forward to greatly!!!!

Marriot VP 3859d ago

Oh okay, thanks for clearing that up, Antan. I lost interest in PC cards when I bought my computer. But it kind of sucked for me, because I bought it a year ago with a 64-bit 3200 AMD processor for 160, but sure enough dual cores came out and cut the price of single cores in half. Dang PC upgrades screw me every time.

Antan 3859d ago

LOL, it's always been the same, I'm afraid, and it ain't about to change!!!! A spot of overclocking could delay any upgrades for a spell at the very least, and it's free of charge, of course.........assuming you don't push too far and fry your machine!!!! I usually build a new machine every 18 months or so, so come April/May I'll see what's out there. I also expect these new cards will have refreshed models by that time, which of course will mean even more impressive (and expensive) cards on the market.

DJ 3859d ago

Here's a link to a developers' forum where they discussed a bunch of stuff related to Cell's GPU-like capabilities. http://www.beyond3d.com/for...

"Sega had a booth to note which had Virtua Fighter 5 (in video form only) and more interestingly Sonic playable on both PS3 and XBOX360 side by side. One of the managers asked why the PS3 version looked better as far as lighting and contrast and accused Sega of connecting the 360 version up with composite cables and the PS3 with HDMI to make the difference hit home. At which point the rep showed us all that both units were connected with component cables and running on the same make and model television. The skeptics accused him of lying even after he turned the monitors to reveal both rear connections. He later stated that again the difference was the Cell and not the video card at the time."

"The reason that the PS3 already looks as good if not better than the 360 ((according to Valve (Half-Life 2) and Vivendi (F.E.A.R.))) from the booths was that the Cell processor was originally going to be a graphics processor as well for the PS3 (yes, there were originally going to be two in the system), but the developers at the show stated that once NVIDIA joined the Sony ranks they backed off of that plan and decided to let NVIDIA build the RSX instead. Now this is where the rub is though (from the developers' mouths, not mine): although the RSX (PS3 graphics card) and the XENOS (XBOX360 graphics card) are comparable, the ability to use the Cell to share the rendering of textures and special effects gives the PS3 a decided advantage, and that difference will only grow as time passes."

With this information at hand, I'm starting to wonder whether the reason nVidia hasn't released the full architectural details of the RSX is that it has features that aren't supposed to be revealed to the public until later PC GPU cards are released. It was 3 years in the making, so it obviously uses a lot of custom tech that needs to stay under the radar for competitive reasons (i.e. ATI). It's great that devs are also utilizing the Cell for rendering and shader processing, especially since it's built to handle those tasks with ease.

Daytona 3858d ago (Edited 3858d ago)

The information you copied and posted is one person's hearsay speculation, nothing more.
However...
http://news.teamxbox.com/xb...

nambo 3853d ago

The reason why Sony isn’t releasing the GPU specs is because they know it can’t compare to the 360’s. The Nvidia chip was a rushed decision after they realized the Cell processor couldn’t handle the job of both CPU and GPU. I assume that was over two years ago, considering the PS3 was supposed to launch last year and was delayed due to problems with Blu-Ray drives. As far as the GPU is concerned, the PS3 is aging and it’s not even out yet. The 360, on the other hand, is still cutting edge after being on the market for almost a year. And I hear it may be possible for the 360 GPU to run DX10. If this is true, the 360 will pull far ahead of the PS3.

Say's you 3859d ago

Now that would definitely overheat your PC if it goes that high or even higher than that. In fact, why didn't NVIDIA make that graphics card for the PS3? Why are they holding back on the power of the graphics card that they are making?
