NVIDIA has been releasing all sorts of GeForce GF100 Fermi video card facts on Facebook and Twitter for the past month. Tonight they posted that the GF100 supports a brand new 32x anti-aliasing mode for ultra-high-quality gaming.
I mean, really...
People who intend to buy these cards, like myself, do. Butthurt ATI fanboys, of course, do not: "3D gaming is useless, 32xAA sucks," "but Eyefinity is TOTALLY relevant and necessary" (according to them), "therefore ATI wins." Jesus Christ, I didn't even comment because the B.S. was inevitable. But it will look amazing in all the older games that can run 32x without a hitch, like BioShock and The Witcher. I force 16xQ AA in all games that are two or more years old, like those above, and I can definitely notice the difference between 16xQ and 8xAA. People who can't should throw in the towel and get a console, if you're paying all that money and don't care about IQ.
Anti-aliasing is a big deal. I'd sacrifice texture resolution and shader quality for anti-aliasing. 32x looks great. I tried it out in TF2 (where aliasing can be really noticeable) and it looked flawless. I was able to force 32x AA because I have a dual-GPU card: one GPU for rendering, and the other devoted strictly to anti-aliasing. Now they can do that sh1t on one card? :0 is all I've got to say.
I would take a graphics card that could support higher resolutions over a graphics card that can do better anti-aliasing. At the end of the day, anti-aliasing is designed to be a solution for the jaggies created by crappy graphics. Games like Crysis don't really even need anti-aliasing. It's nice to see that NVIDIA is pushing further, but there's a limit to how effective anti-aliasing can be. Too much AA makes games look weird.
I think Crysis looks better without AA. Strange but true: on the Very High setting, Crysis uses parallax occlusion mapping, which is not compatible with full-scene AA (edge AA still works, and that's the best type for Crysis anyway, as it affects the foliage, which is the only really jagged part). So I believe it actually has no effect when enabled on Very High, unless there's a config workaround.
I mostly play my games @ 2x or 4x AA; anything over 4x is overkill, really.
DO NOT TURN THIS INTO SOME KIND OF GRAPHICS CARD WAR like the console fanboys do with their systems.
How about some pics of this "supposed" 32x AA? All this talk about this and that but no proof. I want to see the difference and the result, damn it.
Nothing but smoke and mirrors until the actual cards get here. Come on NVIDIA, you're late to the game. By the time your cards come out, ATI will be putting their next-gen cards on the market.
Actually, ATI has the 6000 series scheduled for late 2011; by then I suspect NVIDIA will have their next gen out as well. @Pandamobile: two years from the 5000 to the 6000 series isn't long, considering it's a complete redesign.
Late 2011 :\
What, you expect Nvidia to just sit around and say nothing until they have enough chips ready to ship? That's corporate suicide.
Any idea when these are coming? 'Cause I think I might need a new card soon.
Jan-March; March in the worst-case scenario.
NVIDIA should be announcing something at CES in January.
ok thx for the info.
If NVIDIA's cards are that much of a leap above my current 2x 5850s, that will be more of an incentive to go ahead with another gaming build for the other room by the end of 2010. Well, keep the new NVIDIA build for the main room and move the one I'm currently using to the other room, but still: I can't wait to see what these cards can do. NVIDIA usually does have the power advantage, and having already been impressed by what ATI's 5800 and 5900 series can do, that just makes me all the more excited.
Sounds interesting, but I care more about the high-end desktop chips/cards. I suppose they have people working on them behind the scenes, but those make less profit, so they won't be out at the best time for us. :( Oh well, if they want to let ATI take over the market, it's their loss.
ATI deserves the lead they have at the moment, and God knows they need the $'s, but I also hope NVIDIA hurries up. CES is on Jan 7; if we don't hear some majorly huge info on the consumer-model cards, it will be safe to assume NVIDIA is in a bad way.
I stepped away from PC gaming for a while, but after buying L4D2, I'm starting to come back. And this news is awesome to hear. By the time cards like this come out, I'll hopefully have the money to build a new rig.
Glad you bought it on the PC. Nobody should ever buy Valve games on the Xbox; they're just nowhere near as good. That said, I thought L4D2 was bad, or relatively so.
32xAA, ROFL. What will they think of next to get you to buy it, suckers!!! 4xAA is usually enough, and 8xAA for a game with really crap geometry. You'd notice AA vs. geometry with BF2. And PC gamers, do yourselves a favour and open those PC eyes. :) When you play a PC game, you're sitting 2 ft from your monitor. When you're playing a console game, you're playing on a massive screen from much further away, so it's not directly in your face, so to speak. I know I'll get disagrees, but at least I tell the truth. I've been a PC gamer all my life, but I have some love for consoles as well. As it stands I have a PC, PS3, 360, and Wii, and I'd rate them like this. PC: sweet-looking graphics and mouse/keyboard, depending on the type of game, and you need a rig to handle it. 360: some nice-looking games with 2x-4xAA; shadows can sometimes be dithered. PS3: a lot of nice-looking games; some have AA, some don't. Wii: SD, low-quality textures, and hardly ever any AA. And yet the best-selling console is the Wii. Is it the price? Is it the controller? Or is it that most people don't care? The best graphical game on PC is Crysis, and I'm guessing that'll stay the case until Crysis 2 arrives.
I think Crysis, Crysis Warhead, HL2: Ep2, Shattered Horizon, and FC2 all look better than anything on a console with the right rig and settings. And you don't need that good a rig either (look at mine). L4D 1 & 2 look miles better on a PC than on the 360. Dirt 2 runs at over 60 fps @ 1080p. "When you're playing a console game, you're playing on a massive screen from much further away, so it's not directly in your face, so to speak." What is this supposed to mean? Before I changed my PC from an HTPC to a normal mid tower, it used to sit under my TV. I have one of my 360s connected to a monitor. So whether using a PC or 360 with a TV or monitor, the experience is good. Though I will give the nod to a monitor.
If you can't tell the difference between console and PC graphics, then you either have a crap computer or you're freaking blind. Don't get me wrong, games like Killzone 2 and Uncharted 2 look great on the consoles, but compared to the PC the textures are crushed and lack detail. Plus, cards like these will give birth to the next gen of consoles, so quit acting all butthurt.
"You'd notice aa vs geometry with BF2. And pc gamers, do yourself a favour and open those pc eyes:) When you play a pc game, your sitting 2ft from your monitor. WHen your playing a console game, your playing on a massive screen from much further away, hence it's not directly in your face so to speak. I know i'll get disagrees, but at least i tell the truth." Does any of this even make sense? First, you actually have to make sense to speak the truth. Second, like dchalfont said, there's a screen size to screen distance ratio which you clearly have no knowledge of, and third, PC gaming has sold more than all 3 consoles combined, so no, you're wrong on that too. A PC GPU doesn't just do Anti-Aliasing. This is just another feature which none of the consoles you're trying to defend here even have. No, I'm sorry. They have it, it's just such a minuscule amount, that they might as well not mention it. Sorry, but when a PC can support resolutions up to 2560x1600, with 16xAA, framerates that are unlimited, higher resolution textures, etc., compared to a game like Uncharted 2 which runs at 1280x720 with 2xAA and a framerate that is locked at 30 FPS, and still think that consoles have the same muscle as a PC, you can safely classify yourself as an idiot. Here's a better thing to ask, really. Do you even know what anti-aliasing is?
I'd take a 24" monitor at a 2.5-foot distance over a 60" TV at 8 feet any day. You're so much closer to the action when the screen takes up most of your field of vision. With a 60" TV at 8 feet, your horizontal field of view is only about 30 degrees; sit 2 feet from a 24" monitor and it's closer to 47 degrees. And you really don't understand what anti-aliasing is if you think you can cover it up by adding more geometry.
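If anyone wants to sanity-check those numbers, it's just trigonometry: horizontal FOV = 2·atan(half the screen width / viewing distance). A quick C++ sketch, assuming 16:9 panels (aspect ratio and seating distance are the only inputs, so your exact figures will vary):

```cpp
#include <cmath>
#include <cstdio>

// Horizontal FOV (degrees) subtended by a screen of the given width
// viewed from the given distance (same units for both).
double fovDegrees(double screenWidth, double viewDist) {
    const double PI = 3.14159265358979;
    return 2.0 * std::atan((screenWidth / 2.0) / viewDist) * 180.0 / PI;
}

int main() {
    // For a 16:9 panel, width = diagonal * 16 / sqrt(16^2 + 9^2) ~= diagonal * 0.872
    const double widthFactor = 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
    double tvWidth  = 60.0 * widthFactor;  // 60" diagonal -> ~52.3" wide
    double monWidth = 24.0 * widthFactor;  // 24" diagonal -> ~20.9" wide
    std::printf("60\" TV at 8 ft:      %.0f degrees\n", fovDegrees(tvWidth, 96.0));  // ~30
    std::printf("24\" monitor at 2 ft: %.0f degrees\n", fovDegrees(monWidth, 24.0)); // ~47
    return 0;
}
```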
Post an image comparison between 16xAA and 32x... 'cause I don't even see much difference from 8x to 16x...
Can NVIDIA please mention the pricing? I just bought an ATI 5850 and I'm impressed; I don't see myself changing it anytime soon. My card loads at 55°C, which is amazing compared to my GTX 280 loading at 90°C (it lasted one year).
Please release these cards... ATI has no competition at the moment.
I will pawn my 5850 for one, but they're still not out yet, and they might come at a completely inopportune time, right before I head off to Europe for 4 months, so I guess I can wait. BTW dchalfont, ATI's new architecture is slated for 2010, not 2011, unless something has changed. http://www.fudzilla.com/con... In the end, I might be dropping some cold hard cash on a true 120Hz monitor (going back from the HDTV, lol) and getting myself a new Fermi + NVIDIA 3D Vision. BTW, 4-8x supersampling >>> 32x multisampling, IMO IMO IMO.
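For those wondering why supersampling gets that kind of praise: MSAA only resolves extra samples along polygon edges, while SSAA renders the entire frame at a higher resolution and averages it down, so it also cleans up shader and texture aliasing. Here's a minimal C++ sketch of just the SSAA resolve step (the Pixel struct and buffer layout are made up for illustration, not any driver's actual code):

```cpp
#include <cstdint>
#include <vector>

struct Pixel { uint8_t r, g, b; };

// 2x supersampling resolve: the input buffer was rendered at twice the
// output resolution in each axis; average each 2x2 block down to one
// output pixel (a simple box filter).
std::vector<Pixel> resolve2xSSAA(const std::vector<Pixel>& hi,
                                 int outWidth, int outHeight) {
    const int hiWidth = outWidth * 2;
    std::vector<Pixel> out(static_cast<size_t>(outWidth) * outHeight);
    for (int y = 0; y < outHeight; ++y) {
        for (int x = 0; x < outWidth; ++x) {
            int sums[3] = {0, 0, 0};
            for (int dy = 0; dy < 2; ++dy) {
                for (int dx = 0; dx < 2; ++dx) {
                    const Pixel& p = hi[(y * 2 + dy) * hiWidth + (x * 2 + dx)];
                    sums[0] += p.r; sums[1] += p.g; sums[2] += p.b;
                }
            }
            out[y * outWidth + x] = { uint8_t(sums[0] / 4),
                                      uint8_t(sums[1] / 4),
                                      uint8_t(sums[2] / 4) };
        }
    }
    return out;
}
```

The cost is obvious from the sketch: 2x SSAA shades four times as many pixels per frame, which is why the hardware vendors lean on MSAA instead.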
I play PC games on my big screen sometimes (LAN parties), so 32x AA sounds nice, but unless it's connected to an Eyefinity setup I don't see the need for it. I think NVIDIA has dropped the ball. ATI has sold 300,000 HD 5000 series cards and counting in two months, and all NVIDIA has to counter with is 32x AA? The HD 5970 is a flat-out final-stage end-boss monster of a card that's only gonna drop in price whenever Fermi is finally released. I wish Fermi would come out so I can get me an HD 5970.
Uhh... how exactly is this their only counter? NVIDIA is merely stating one of the new features of their upcoming Fermi architecture. They already released most of the details of what's new in Fermi in the whitepaper they published a couple of months back: http://www.nvidia.com/conte... Although a lot of what's in it you probably don't have the tech knowledge to make sense of, it's a good read nonetheless.
S.T.A.L.K.E.R.: Clear Sky and S.T.A.L.K.E.R.: Call of Pripyat in my opinion have the best graphics on the PC (yeah, a lot of bugs during install, but patch it and it runs smooth). The games are far better optimized to run on multiple configurations than Crysis. Pop the game in, max out the graphics, and just watch the sunrise through the trees, and you'll see what first-rate real-time rendering is all about. Or wait till after midnight when it's pitch black, turn off your flashlight, and notice the moonlight's real-time reflections, then set off a fire anomaly. Plus your gun doesn't look like a piece of black rubber in your hand but a weapon with movable parts, unlike Crysis. Consoles and onboard VGAs need not apply; the sun rays alone will turn your frame rate into a slideshow.
Stalker has the advantage of the new multi-threading built into DirectX 11. As long as you're on Vista/7 and have DirectX 11 installed, you can take advantage of it; it doesn't need a DX11 GPU to work. Games like Stalker that are built around DX11 are seeing the huge performance increase this brings. So it's not so much that STALKER is better optimized, but that it's taking advantage of some new DirectX technology to increase game performance. You'll see the same happen with Crysis 2; it's going to run well and look amazing. I've recently been playing Dirt 2, and I'm amazed at how well it runs. Its engine seems to be very GPU-driven.
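For the curious, the DX11 multi-threading in question is deferred contexts: worker threads record draw calls into command lists while the immediate context plays them back. A rough C++ sketch of the pattern (error handling omitted; the `device` parameter and `immediateCtx` are assumed to have been created already):

```cpp
#include <d3d11.h>

// Worker thread: record rendering commands on a deferred context.
ID3D11CommandList* RecordSceneChunk(ID3D11Device* device) {
    ID3D11DeviceContext* deferredCtx = nullptr;
    device->CreateDeferredContext(0, &deferredCtx);

    // ... issue state changes and draw calls on deferredCtx here ...

    ID3D11CommandList* commandList = nullptr;
    deferredCtx->FinishCommandList(FALSE, &commandList); // bake into a command list
    deferredCtx->Release();
    return commandList;
}

// Main/render thread: play recorded command lists back on the immediate context.
void SubmitSceneChunk(ID3D11DeviceContext* immediateCtx, ID3D11CommandList* commandList) {
    immediateCtx->ExecuteCommandList(commandList, FALSE);
    commandList->Release();
}
```

The recording API runs on any hardware the DX11 runtime supports, which is why a DX10.1 card still benefits from the CPU-side parallelism.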
KaKKoii, the 384-bit memory interface coupled with the "possibility" of 6 GB of GDDR5 DRAM is great. (I believe ATI should've widened their memory interface with the 5000 series.) Plus the 32-bit integers with 64-bit extensions are a plus. I have always liked the GPGPU movement, and NVIDIA is at the forefront of it? Lmao, I could write something just as impressive and put it on some nice paper or even a PowerPoint presentation, just like AMD with their HD 6000 series cards. Until Fermi is screwed together on a 40nm die mounted on some silicon, smashed between a heatsink, plastic, and a fan, and firmly entrenched in an x16 PCIe slot being benched against CYPRESS or KING HEMLOCK, I couldn't care less. Words on paper mean nothing when there's no hardware to back them up. Eyefinity is impressive; 7680x1600 or higher across six screens is awesome. Crushing NVIDIA while using less electricity and producing less heat is "PRICELESS." Not to mention HD 6000 is on the horizon.
If NVIDIA had released this whitepaper two years ago, then yes, it wouldn't be worth a damn thing. But this was released only a few months before NVIDIA's expected release date for the cards. They can't make any real changes to the chips at this point, nor could they when the whitepaper was released. New GPUs take years to develop and months to crank out the first version of the chips, then more months for another revision or two to get yields up. Thus the whitepaper details the features that will be in the new architecture. The consumer cards won't be coupled with 6GB; that would be a ridiculously expensive card, lol. That's for the workstation Tesla cards that need large amounts of RAM for 3D modeling, scientific calculations, and all that jazz. But they will have somewhat more RAM than ATI's 5xxx series. And of course NVIDIA is at the forefront of it. Where have you been hiding all these years, lol? NVIDIA has been very outspoken about GPU computing for quite some years now. That was the whole point of their "CUDA" architecture; it stands for "Compute Unified Device Architecture." CUDA isn't just software; it's an actual hardware architecture designed to make general computing on a GPU easier, faster, and more of a reality. It's also one of the main reasons NVIDIA's GPUs are so much bigger than ATI's, and thus use more electricity and generate more heat. The only GPU-computing thing ATI has really pushed for is OpenCL, but that's just an API you write driver support for; it's software-based, so it's nothing revolutionary when it comes to GPU computing. And even with that, NVIDIA was the first one out with stable WHQL-certified OpenCL drivers, haha. CUDA makes it much easier by being able to execute C/C++ code on the GPU using some added extensions. (But with Fermi, you won't even need those extensions anymore; it can natively execute the code like a CPU can.) And the 6xxx series and their Bulldozer/Bobcat CPUs aren't on the horizon; they're coming out in early 2011 if all goes smoothly for ATI, which it hardly ever does when it comes to whole new architectures, especially while also launching on an all-new fabrication size (28nm). And Bulldozer/Bobcat aren't going to have a GPU integrated in their first versions. http://www.anandtech.com/cp...
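To illustrate what those C/C++ extensions actually look like: a `__global__` keyword marks GPU code, and a `<<<blocks, threads>>>` launch syntax replaces a normal call. A minimal CUDA C++ sketch (a made-up kernel for illustration, compiled with nvcc; not anything from the whitepaper):

```cpp
#include <cuda_runtime.h>
#include <cstdio>

// __global__ marks a function that runs on the GPU; each thread
// handles one element of the array.
__global__ void scaleArray(float* data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1024;
    float host[n];
    for (int i = 0; i < n; ++i) host[i] = float(i);

    float* dev = nullptr;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    // <<<blocks, threads>>> is the CUDA launch extension: 4 blocks x 256 threads.
    scaleArray<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);
    std::printf("host[10] = %.1f\n", host[10]); // prints 20.0
    return 0;
}
```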
S.T.A.L.K.E.R.: Call of Pripyat added tessellation to the water, an enhancement to volumetric smoke, and another feature dealing with wet surfaces, like water dripping off your weapon when it rains. Those are DirectX 11 features in the game; I'm only running DirectX 10.1. DirectX 11 cards will easily push Crysis 2, because they're making it for the consoles, so the PC will get some ported crap. I'll post the HD 6000 link later; the codenames are Bulldozer and Bobcat, Bulldozer being a GPCPU on a 32nm die.
Yes, I'm aware of the DX11 enhancements. I merely left them out of my comment to you because I saw you don't have a DX11 GPU, so they had nothing to do with your experience of the game. DX11 is indeed great. And actually, no, Crysis 2 won't be ported crap on the PC. Crytek is still a dedicated PC gaming company; that's where their passion is, pushing games past the limit. They are merely expanding their games to the consoles for some increased revenue. CryEngine 3 has been built for easy development on PS3/360/PC all at the same time, so there's no "porting" that needs to be done. You make the game on the PC, for the PC, then reduce and optimize things according to a console's needs/requirements and output it for the consoles as well. (P.S.: there's a little "Reply" button below people's posts. You should try using it; it keeps things more organized, and people can tell which post you're replying to.)