
NVIDIA Kepler GK104 Gaming Performance Figures Exposed

Gaming performance results for Nvidia's upcoming Kepler-based GK104 chip have been exposed over at the PCinlife forums.

Read Full Story >>
wccftech.com
T900 2074d ago (Edited 2074d ago)

Hardware is useless without the software. Games are now designed with 6-year-old hardware in mind. The only people really able to utilize this tech are those who want to play on multiple screens (oh wait, the last gen of GPUs already did that). This time you can have multiple screens and 3D all at once.

For anyone who wants to play at a lowly 1080p, the last two generations of hardware are more than enough.

@below

You're fooling yourself if you think any of those won't run maxed out on last-gen GPUs, unless they are terribly optimized.

Lol at Prey 2 and BioShock Infinite, I assure you both will be console ports.

Orpheus 2074d ago (Edited 2074d ago)

War of the Roses, PlanetSide 2, Prey 2, Project CARS, Anna, and BioShock Infinite say hi to you...

@Up

LOL, why not talk about the other games I listed?
Prey 2 uses PRT (a DX11.1 feature) and tessellation (see the developer interview conducted by Nvidia).
BioShock uses tessellation and high-res textures, which require a good enough GPU.

Pain_Killer 2074d ago

The DX11.1 + tessellation features for PC sound great, but there's a downside: they would probably arrive later in a patch/update, or the game itself would be delayed. id itself commented that they don't see PC as their leading platform and that the current market belongs to consoles.

Look at Crytek: even though it was us who made them popular, they developed for consoles first and ended up providing us a DX9 port and a patch that was delayed for five months.

But CryEngine 3 still shows the potential of what Crysis 2 could have been if it had been developed primarily for the PC, like Frostbite 2.0. Another great engine currently in use is Unreal Engine 3.9 with its PhysX technology, which in my opinion is a more immersive effect than DX11 tessellation.

reynod 2074d ago (Edited 2074d ago)

@Orpheus

Let's wise up a bit and not fall for gimmicks like our console brethren.

As Pain_Killer mentioned, the game is initially developed with console hardware in mind. The GPU makers have DX11 features patched in later. They really aren't even efficient most of the time. Hell, most of the time you can't even see a difference.

I personally have a very high-end setup; I don't think I will be upgrading anything until the next consoles are out. If that's how the GPU makers want things, then that's the way they are gonna get 'em.

SephirothX21 2074d ago

Are you mentally disabled? You think Infinite will be a port? Go look at Irrational Games' history. Their background is PC. Accept the fact that high-end PCs are better than consoles and move on.

T900 2074d ago (Edited 2074d ago)

Lol dude, I am a PC gamer. Never mind high-end PC gaming, even lower-end PC gaming is better than console. In fact, I don't even own a console.

However, it's time we faced the truth: games are developed for the lowest common denominator. When developers like Crytek are designing their games for consoles first, I see no reason why the next Bio won't be a port.

All of this makes a compelling argument against buying any more upgrades for the PC.

SephirothX21 2074d ago

Sorry, I jumped the gun there. I'll be pissed if Bio is a port because I spent a lot of money on a PC in the summer and I want to get use out of the hardware. The sooner next-gen consoles come out, the better, so developers start taking advantage of the hardware out there.

ChrisW 2074d ago

T900 said, "Hardware is useless without the software. Games are now designed with 6-year-old hardware in mind."

The most important thing to consider is that when next-gen consoles come out, such power will start to look a little more practical.

BTW, I remember when I was scoffed at for picking up a quad-core CPU when most AAA games were being built for single- or dual-core chips. Now a quad-core is part of the "recommended specifications".

Mikhail 2074d ago

Well, that was an off-topic comment...

I expect the performance of the Nvidia 600 series and AMD Radeon 7000 series to be a repeat of the present generation of video cards: the 680 would take the crown, with the 670 and 7970 duking it out, and the same goes for the 660 Ti and 7950. I just want to be conservative in my expectations. For PC hardware the key will always be price, along with temps and wattage.

Pain_Killer 2074d ago

The 680 is faster than the HD 7970 in some cases, so yeah, it would be like GeForce 500 vs. HD 6000. But one thing to note is that the 680/670 are based on the GK104 chip, which belongs to the performance segment.

The GK110 high-end chip, packing 2304 cores as the article states, is yet to arrive. It's also likely to be part of high-performance computing systems only, which would mean those cards would be limited to Quadro/Tesla designs.

As far as TDP goes, it's around the 225-250W mark, and the price, as other articles from SemiAccurate hint, would be $399.

I can't wait to get my hands on the 680 if the performance numbers shown here are true. It would go great with my Ivy Bridge setup in April :D

pandehz 2074d ago (Edited 2074d ago)

I believe ATI and Nvidia can still squeeze a lot more performance out of the previous two generations of GPUs before they make two newer gens.

They need to work on better patches and stability improvements.

On ATI's side, a lot of games still have problems from bad drivers.

Pain_Killer 2074d ago

Rage is the biggest example!

The bad Catalyst drivers led to intense texture pop-in and FPS stuttering due to bad OpenGL coding in both the drivers and the game itself.

GPU manufacturers are doing their part of the work, providing new driver revisions every month; game developers need to do the same and work in collaboration with them.

Crysis 1 had some bad coding, but the visuals were a millennium ahead of their time. Even when I play it today I see the immense potential of the visuals/textures, which put 80% of today's games to shame.

gamernova 2074d ago

Can't wait to tap into this power! As a PC gamer, I am also a benchmarker and I get a boner from having unnecessary power hahaha
