
NVIDIA Fermi GF100 Video Cards To Support 32x Anti-Aliasing Mode

NVIDIA has been releasing all sorts of GeForce GF100 Fermi video card facts on Facebook and Twitter for the past month. Tonight they posted that the GF100 supports a brand new 32x anti-aliasing mode for ultra-high-quality gaming.

Read Full Story >>
legitreviews.com
Nihilism2909d ago (Edited 2909d ago )

People who intend to buy these cards, like myself, do care; butthurt ATI fanboys of course do not:

"3d gaming is useless, 32xaa sucks"

"but eyefinity is TOTALLY relevant and necessary" (according to them)

"therefore ati wins"

Jesus Christ, I didn't even comment because the B.S. was inevitable.

But it will look amazing in all older games that can run 32x without a hitch, like BioShock and The Witcher. I force 16xQ AA in all games that are two or more years old, like those above, and I can definitely notice the difference between 16xQ and 8x AA. People who can't need to throw in the towel and get a console, if they're paying all that money and don't care about IQ.

Pandamobile2909d ago

Anti-aliasing is a big deal. I'd sacrifice texture resolution and shader quality for anti-aliasing. 32x looks great. I tried it out in TF2 (where aliasing can be really noticeable) and it looked flawless. I was able to force 32x AA because I have a dual GPU card: one GPU for rendering, and the other devoted strictly to anti-aliasing.

Now they can do that sh1t on one card? :0 is all I've got to say.

kaveti66162909d ago

I would take a graphics card that could support higher resolutions over a graphics card that can do better anti-aliasing. At the end of the day, anti-aliasing is designed to be a solution for the jaggies created by crappy graphics. Games like Crysis don't really even need anti-aliasing. It's nice to see that NVIDIA is pushing further, but there's a limit to how effective anti-aliasing can be. Too much AA makes games look weird.
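
The "smoothing out jaggies" idea behind all of these AA modes comes down to averaging multiple coverage samples per screen pixel. Here's a minimal brute-force supersampling (SSAA) sketch in Python with a toy one-edge "scene"; every name in it is illustrative, and real hardware modes like NVIDIA's CSAA sample far more cleverly than this, but the principle is the same:

```python
# Brute-force supersampling (SSAA) sketch: each pixel's final value is the
# average of an n x n grid of coverage samples. Hardware MSAA/CSAA modes are
# much smarter about where and what they sample, but the smoothing idea is
# the same. All names here are illustrative, not any real API.

def coverage(x, y):
    """Toy 'scene': a diagonal half-plane, fully lit (1.0) above the line y = x."""
    return 1.0 if y > x else 0.0

def render(width, height, samples_per_axis):
    """Render the scene; samples_per_axis=1 means no AA (one centered sample)."""
    n = samples_per_axis
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(n):
                for sx in range(n):
                    # Sub-sample positions spread evenly inside the pixel.
                    total += coverage(px + (sx + 0.5) / n, py + (sy + 0.5) / n)
            row.append(total / (n * n))
        image.append(row)
    return image

no_aa = render(4, 4, 1)   # 1 sample/pixel: edge pixels are a hard 0.0 or 1.0
aa_16 = render(4, 4, 4)   # 16 samples/pixel: edge pixels take in-between values
```

With one sample per pixel the diagonal jumps straight between 0.0 and 1.0, which is the staircase you see in-game; with 16 samples the pixels straddling the edge land at intermediate values like 0.375, which is exactly the blended edge that makes high AA modes look smooth.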

Nihilism2909d ago

I think Crysis looks better without AA, strange but true. On the Very High setting Crysis uses parallax occlusion mapping, which is not compatible with AA (full-scene AA, that is; edge AA still works, and it's the best type for Crysis anyway, since it affects the foliage, which is the only really jagged part). So I believe full-scene AA actually has no effect when enabled on Very High, unless there is a config workaround.

Perjoss2909d ago

I mostly play my games @ 2x or 4x AA, anything over 4 is overkill really.

MegaPowa2909d ago

DO NOT TURN THIS INTO SOME KIND OF GRAPHICS CARD WAR like the console fanboys do with their systems.

likedamaster2909d ago

How about some pics of this supposed 32x AA? All this crap about this and that, but no proof. I want to see the difference and the result, damn it.

Blasphemy2909d ago

Nothing but smoke and mirrors until the actual cards get here. Come on NVIDIA, you're late to the game. By the time your cards come out, ATI will be putting their next-gen cards on the market.

Nihilism2909d ago (Edited 2909d ago )

Actually, ATI has the 6000 series scheduled for late 2011; by then I suspect NVIDIA will have their next gen out as well.

@Pandamobile

Two years from the 5000 to the 6000 series isn't long considering it's a complete redesign.

Kakkoii2909d ago

What, you expect Nvidia to just sit around and say nothing until they have enough chips ready to ship? That's corporate suicide.

MajestieBeast2909d ago

Any idea when these are coming? 'Cause I think I might need a new card soon.

Nihilism2909d ago

Jan-March; March in the worst-case scenario.

TheIneffableBob2909d ago

NVIDIA should be announcing something at CES in January.

PotNoodle2909d ago

If NVIDIA's cards are that much of a leap above my current two 5850s, that will be more of an incentive to go ahead with another gaming build for the other room by the end of 2010.

Well, I'd keep the new NVIDIA build for the main room and move the one I'm currently using to the other. Still, I can't wait to see what these cards can do, because NVIDIA usually does have the power advantage, and having already been impressed by what ATI's 5800 and 5900 series can do, it just makes me all the more excited.
