
NVIDIA Proves PC Gaming Superiority - 1000% Gain in GPU Power

Gamers Nexus writes: "'First person shooters are on the edge of coolness,' said Tamasi at the conference, 'and reality is a really, really hard thing to get right. We know what it looks like.' It's been said before, but given the context of the discussion, he is absolutely correct. As both gamers and 'real' people, we are able to criticize things like smoke and shading technology because of our exposure to real-world environments, where shadows and smoke of course originated. When playing a cartoony game like Torchlight, however, it is much easier to overlook cel-shading and any visual artifacts that can be passed off as fitting the theme. The inherent difficulty of reproducing actual environments within a 3D world has only spurred NVIDIA on to meet the challenge. As Tamasi urged, 'rapid iterations are the key to creativity.'"

Read Full Story >>
gamersnexus.net
Substance101 2623d ago (Edited 2623d ago)

They can jump that figure to 100,000% and it won't make a damn difference as long as the majority of games are limited to console ports.

Lelldorianx 2623d ago

If you take a look at the article, you'll see that Tamasi had slides comparing the Xbox 360 to 2015 predicted technology. See that insane gap? Maybe I should title this "NVIDIA Flaunts PC Gaming Superiority" :P

jony_dols 2621d ago

Tamasi's comparisons are pointless. By 2015 we will have the next-gen consoles.

It's like Tamasi comparing the GPU power of current DX11 PCs to the PS2.

Pointless and irrelevant.

Kakkoii 2621d ago (Edited 2621d ago)

@jony_dols: And the next-gen consoles will likely be using current-gen GPUs, which won't be anywhere near as powerful as the ones coming in the next few years. As the fabrication size shrinks, the number of transistors per unit of area increases exponentially. We are getting to the point now where each new node size brings an immense increase in transistor count. Later this year 28nm GPUs are coming out, then hopefully 16nm by 2013, and 11nm by 2015.

http://en.wikipedia.org/wik...

We are getting quite close to the limit, where we are working at the scale of individual atoms. For a little perspective, one helium atom is roughly 0.1 nanometers, or 100 picometers, across. Soon we will have to transition to new chip types instead of just smaller sizes, such as carbon nanotube and graphene chips, and perhaps further in the future quantum computers, as well as producing chips that have multiple core layers.
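The scaling point above can be sanity-checked with quick arithmetic: if transistor density goes roughly as the inverse square of the feature size, each of the node steps quoted in this comment is a sizable jump. A minimal back-of-envelope sketch (the node figures are the ones from the comment, not official roadmap data):

```python
# Back-of-envelope density scaling for the node sizes quoted above.
# Area per transistor scales roughly with the square of the feature
# size, so density goes as the inverse square. Illustrative only.
nodes_nm = [28, 16, 11]
base = nodes_nm[0]
for node in nodes_nm:
    density = (base / node) ** 2  # relative to the 28 nm node
    print(f"{node} nm -> ~{density:.1f}x the transistor density of {base} nm")
```

So the quoted 28 nm to 11 nm path alone would mean roughly 6.5 times the transistors in the same area, before any architectural gains.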

DeadlyFire 2621d ago (Edited 2621d ago)

Am I the only one who thinks it's strange that the Xbox 360 GPU is mentioned here at all, given the rumor of an AMD GPU in the Wii 2/NES 6?

Another hint?

fluffydelusions 2622d ago (Edited 2622d ago)

Exactly this. Does it matter how much potential hardware has when no one is using it? Why look at 2015? Current tech has tons of potential but it isn't even utilized, and consoles are hardly to blame; there are tons of PC-exclusive developers out there. Didn't Nvidia give a huge chunk of cash to Crytek, and wasn't that supposed to produce the flagship game showcasing DX11's potential? What happened there? I'm not sure, but I know Nvidia took Crysis 2 down from its supported DX11 page. You also have to look at it from a development perspective: it takes a huge amount of both time and money to utilize this tech to its full capabilities, and it may not be financially worth it in the end.

soundslike 2621d ago

It's not even just console ports; most games, aside from those from the bigger studios, have familiar underbellies with fancy visual effects plastered on.
Just look at STALKER: an amazing-looking game with a great atmosphere, but so much of it is built on something they started years before the effects in the final version were available, so it seems almost like a "remake" of itself, so to speak. The textures were awesome, the lighting was great, and the AA was demanding, but it all revolves around a dated core, because they can't compete with bigger companies who can afford to restart from scratch and really flesh out the new technology.

ATiElite 2621d ago

"It takes a huge amount of both time and money to utilize this tech to full capabilities and it may not be financially worth it in the end."

You're right... especially when the company (Crytek) turns its back on the PC community by making a DX9 console port to "sell 10 million units" on consoles, yet so far trails Homefront, which has barely 1.5 million units sold.

BF3, which is DX11 only, will CRUSH Crysis 2.

macky301 2622d ago (Edited 2622d ago)

It is ridiculous... they are comparing a six-year-old console with GPU "performance predictions" for 2015.

I really wonder what that GPU will cost when it launches in 2015, and what the price of the 360 will be by then... probably around 30 USD (if it even exists), while that card will be 600+.

fluffydelusions 2622d ago (Edited 2622d ago)

Yep, and when new consoles launch it will be exactly the same as now: PC guys bashing consoles because the hardware is outdated and holding back games. Tech moves too fast for the average person to keep up, and most people don't want the hassle. I used to constantly upgrade my PC years ago but long since stopped. I'm quite happy with consoles and just being able to pop a game in and not have to worry about how well it will run. That said, my laptop still works great for any current PC game out there, but I mostly just use it for Starcraft 2 and other games that don't hit consoles, e.g. Amnesia.

macky301 2621d ago (Edited 2621d ago)

Actually, I am really glad consoles exist, because this is getting ridiculous. I bought an 1800 EUR laptop a few months ago, and if it weren't for consoles I would probably be having trouble even by now, because developers are well known for not optimizing anything when putting games on PC. Only a few developers actually avoid brute-forcing everything on PC.

And it is kind of sad that I have more VRAM on my GPU than three PS3s or 360s have in total, and as much main RAM as about 12 or 13 of those systems.

Yes, games usually look a bit better, but apart from resolution, higher-resolution assets, AA, and aniso, I really don't see that big a leap in visuals, so I've stopped caring when playing on a TV.

If the game is great I will want to play it (hell, I even loved Donkey Kong on the Wii, and that runs at a really low resolution; you can see jaggies from Mars).

I probably keep my PC only for some indie exclusives, Starcraft 2, and Diablo 3. I sometimes buy a multiplat now and then because they are cheaper, but after being an exclusive PC gamer for so many years, I have come to love some console-exclusive games. They are just better quality, more polished games, and I am really shocked at what I was missing out on for so many years. Even some games from a generation (or two) back still floor me with how well designed they are.

STONEY4 2621d ago (Edited 2621d ago)

"I've bought 1800eur laptop few months ago,.."

Jesus Christ... protip: don't waste money on laptops for PC gaming. Laptops are terrible for that. For that money you could build a VERY high-end dual-GPU PC that would max everything out at over 60fps, still hit 30fps at 1920x1200 in a few years, and last over 5 years. The 8800GTX launched nearly 5 years ago, and that thing is still running multiplats at higher-than-console quality. I wish I had that kind of money.

Anyway, consoles honestly haven't been helping PC optimization. If anything, it's gotten worse. The fact that GTA4 runs at nearly the same framerate as Crysis maxed out is sad. Just Cause 2, while it looks really great, once again doesn't justify not running at a constant 60fps while other, much better-looking games do. Dragon Age 2 runs worse than the first, and actually LOOKS worse (judging from the DA2 demo, pre-driver update). Mass Effect 2 looks a whole generation ahead of Dragon Age 2, yet runs at over 100fps. What is that about!?

To be honest, I wouldn't be surprised if The Witcher 2 (PC exclusive for now) ran the same as Dragon Age 2, except with its really mind-blowing graphics.

Enate 2621d ago (Edited 2621d ago)

STONEY4, don't go pulling the "it's the consoles' fault that devs botched those games' PC optimization" line. They just botched it, period. If they had wanted to, they could have optimized better, or optimized at all. It has nothing to do with consoles; put the blame where it belongs: on the dev, the game engine, or both.

ct03 2621d ago

Reading would help. The 10x comparison is against 2011 hardware.

Enate 2621d ago

Macky, exactly, man; it's what I have been saying for years now. The polish and overall quality in a lot of console exclusives is just far better. And I, too, like you, keep my PC for a few things here and there, like The Witcher 2; even though I didn't like The Witcher 1 very much, this one looks much improved. I have been asking for so long now: if PCs are so much more powerful, then why is my 275GTX buckling on Dragon Age II at 1024x768? The optimization as of late has been downright garbage on PC.

And the funny thing is, a lot of the time, as long as a game uses some new tech it gets applauded because it's hard to run, when it shouldn't be hard to run at all; it's just not optimized well.

tdrules 2622d ago

BREAKING NEWS: new hardware beats old hardware.
In other news, coding for many different systems is still not as good as coding for one.

Lelldorianx 2622d ago

Read the article; there's a lot more technical information on game development there than what you seem to be taking from the first sentence.

PS360PCROCKS 2622d ago

Jeebus!!! 32,039.8 GFLOPS!?!?!? LMAO!! I remember when the Xbox 360 came out and everyone was like "whoa, 228!!"
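The gap between those two numbers is easy to put a figure on. A quick check, assuming both values are in the same units (GFLOPS; the comment itself just quotes the raw figures):

```python
# Ratio between the two figures quoted in the comment above, both
# taken as GFLOPS (an assumption about units, not from the slide).
predicted_2015 = 32039.8  # the 2015 projection quoted above
xbox_360 = 228.0          # the Xbox 360 launch figure recalled above
print(f"~{predicted_2015 / xbox_360:.0f}x")  # roughly a 140x gap
```

That is a far bigger jump than the 10x in the headline, which is why the 10x figure has to be read against 2011 PC hardware, not against the 360.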

waterboy 2622d ago

The strangest thing is: why do they need to prove that the PC has this "superiority" at all?

fluffydelusions 2621d ago

Because Nvidia wants you to buy their cards of course :)
