240°
Submitted by Orpheus 775d ago | rumor

Rumor: GTX 680 Performance Revealed

The latest rumors about NVIDIA's GTX 680 point to its performance relative to the previous-generation GTX 580, as well as AMD's Radeon HD 7970 and 7950. The benchmarks, courtesy of VR-Zone, show the GTX 680 coming in anywhere from 0-28% faster than AMD's Radeon HD 7970. (NVIDIA, PC, Tech)

JsonHenry  +   775d ago
If those numbers are true, it is not exactly head and shoulders above the newest AMD flagship. But this isn't supposed to be the new high-end card, is it? I thought they were releasing the first step down from the flagship first, then releasing the actual high-end card a little later? Or am I completely incorrect?
#1 (Edited 775d ago ) | Agree(4) | Disagree(0) | Report | Reply
T900  +   775d ago
Well, the more important question is: what software is there to strain GPUs? Unless you want to be playing on multiple screens in 3D, or want to run some insane scientific calculation, there isn't any.

Most games run fine on an 8800 GTX from five years back, and if you want to stress DX11 effects, then even the most demanding game of this gen, like BF3, runs great on a 5870 from three years back. Leave out the few stressful games and there really isn't much to warrant an upgrade.

Hence one really has to question the point of more powerful hardware. GPU makers might as well close up shop until the next-gen consoles.
#1.1 (Edited 775d ago ) | Agree(3) | Disagree(1) | Report | Reply
JsonHenry  +   775d ago
I play in 3D on multiple monitors. That is why. :)

@ Below: I only have the 27-inch Acer full 3D/LED monitor for 3D, then two more Acers for three screens. I don't do all three in 3D. I saw it once where all three monitors were 3D, but the glasses cut off the two on the sides and it just didn't look right to me. I figured with the huge hit to performance and peripheral vision being so lousy, I would save myself some money and just game in 3D on one monitor.
#1.1.1 (Edited 775d ago ) | Agree(2) | Disagree(0) | Report
T900  +   775d ago
I do that too, bro. I personally have GTX 580s in SLI.

Three screens that are all 3D enabled. It's great playing games on three screens; 3D, not so much.

With the level of power my current GPUs are outputting, I don't think I will be upgrading any time soon. Mind you, these GPUs only get stressed when on three screens.

For the general public who want to be playing on one screen, a GPU like the 5870 is more than enough. Set aside the few games like BF3, which probably won't run on ultra settings; anything else will get steamrolled.
Red_eyes_Gremlin  +   775d ago
I understand how you think, but I don't agree with you. Because think about it: it's only a question of time before a lot of games start to need more GFX power.

Nvidia PhysX: future games in my mind:

Black & White 3 - with the power from the card that comes after the 680 (or is the 680 their top card?) and the new PhysX engine releasing soon (look @ the GeForce homepage), the game could look like Monsters, Inc. or close to it.

Unreal Engine 4: the new physics engine and sick GFX - OH BTW, THIS IS NOT A LIE! - will be in the new Unreal 4 (look for the interview on the homepage).

Sorry, didn't post the links:

There you go : )
http://www.youtube.com/watc...
1 more
http://www.youtube.com/watc...

Found them on YouTube instead : )
#1.1.3 (Edited 775d ago ) | Agree(1) | Disagree(0) | Report
OpenGL  +   775d ago
Well, that's definitely what a lot of people were expecting based on the fact that its memory bus is only 256-bit, but I find it hard to believe that Nvidia has a card considerably more powerful than this just over the horizon (that isn't dual-GPU), seeing as it seems to beat the Radeon 7970 in everything, and by fairly significant margins in some games. We would expect that from the flagship, not the successor to the GTX 560, but who knows; maybe Nvidia's new architecture is really that great.

Also, if these benchmarks are real, then it seems pretty clear that Nvidia's shader architecture has changed, as this card is nowhere near 3x as powerful as the GeForce GTX 580.
TABSF  +   773d ago
Yep, Nvidia definitely changed the architecture.

It needed to happen. Fermi was 529 mm², the largest chip I know of.
If they went with 1024 CUDA cores on Fermi, even at 28nm we'd be looking at around 700 mm², which is just idiotic; it needed to change.

Now they have a smaller die than Tahiti, it requires less power than Tahiti, and it probably runs cooler than Tahiti.
Best part is it could be 10-20% faster than the HD 7970.

It's no wonder Nvidia put the brakes on GK110; they can make more adjustments, improve that chip further, and release it as the GTX 780.
Nvidia said themselves they were surprised by the HD 7970's (lack of) performance.
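The die-size arithmetic in the comment above can be sanity-checked with a rough back-of-the-envelope sketch. The 529 mm² Fermi figure and the 40nm-to-28nm node shrink come from the thread; the assumption of ideal quadratic area scaling with process node (which real chips never achieve, since SRAM and I/O shrink far less than logic) is mine:

```python
# Back-of-the-envelope die-area estimate for a hypothetical 1024-core
# Fermi shrunk from 40 nm to 28 nm. Assumes area scales with the square
# of the process node and that doubling CUDA cores doubles the logic
# area -- both idealizations that real silicon falls short of.

FERMI_AREA_MM2 = 529   # Fermi die area cited in the comment
NODE_OLD_NM = 40       # Fermi's process node
NODE_NEW_NM = 28       # Kepler-generation node

def shrunk_area(area_mm2, old_nm, new_nm, logic_multiplier=1.0):
    """Ideal-scaling estimate: area shrinks by (new/old)^2, then
    grows by the logic multiplier (e.g. 2.0 for twice the cores)."""
    return area_mm2 * (new_nm / old_nm) ** 2 * logic_multiplier

ideal = shrunk_area(FERMI_AREA_MM2, NODE_OLD_NM, NODE_NEW_NM, logic_multiplier=2.0)
print(f"Ideal scaling, 2x cores at 28 nm: ~{ideal:.0f} mm^2")  # ~518 mm^2
```

Under ideal scaling the doubled chip lands around 518 mm²; the comment's ~700 mm² guess implicitly assumes the imperfect real-world scaling that large GPUs actually see, which is why it comes out bigger.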
ninjahunter  +   775d ago
The Pentagon called...
Red_eyes_Gremlin  +   775d ago
A NEW GENERATION OF GAMING IS ABOUT TO START PEOPLE :D

Enjoy the videos :)

There you go : )
http://www.youtube.com/watc...
1 more
http://www.youtube.com/watc...
Fkabbz  +   775d ago
http://videocardz.com/30564... This is a link to the original article. This is from February 12th. It is fake!
ChrisW  +   775d ago
So... the information on the link you posted is also fake?
