
Nvidia GTX 980/970 – First Synthetic Benchmark “3DMark” Scores Leaked

DSOGaming writes: "As we can see, the performance difference between Nvidia’s GTX980 and GTX780Ti is pretty small. In fact, some could say that these are disappointing results. Still, if these cards are priced at around $499, then we can safely say that there will be a market for them."

Razputin3516d ago

Just came in my pants.

Hopefully the 980 Ti comes out soon after this. I've been waiting patiently; I still have my GTX 670.

DeadRabbits3516d ago

Wow..........so did I!!!!

This is awkward @_@

lelo2play3516d ago (Edited 3516d ago )

Kind of disappointing. They skipped the GTX 800 series for this? I expected more.... way more.

infamous-butcher3516d ago

They didn't really skip the 800 series. This is it. It's just called the 900 series because the mobile GPUs are on the 800 nomenclature. It would just be more awkward if they kept going, named this the GTX 800 series, and then called the next mobile GPUs the GTX 900M series. It's also not the first time they've done it, either, with the 300 desktop series being skipped. It's still going to be the same performance whether it's called the GTX 800 or even the GTX 1000 series. It's just a name.

R6ex3516d ago

Extremely disappointing! Still on the 28nm node, i.e. no big improvements. GPUs are now like Intel, producing new CPUs with yearly 10% improvements. The GTX 980 may be, at best, around 10% better than the GTX 780 Ti. Sad. CD Projekt already stated that even the GTX 780 Ti can get 35-40 fps in The Witcher 3 at 1080p max settings with 8x AA.

My GTX 670 is struggling already, and I can't wait to upgrade. However, I'll at least wait for the GM200 part or new 20nm part later.

ProjectVulcan3516d ago

R6ex, it's a massive performance per watt improvement. It's a 170w card that's quicker than an R9 290X and 780ti. These are both 250w+ cards.

You'll see the huge raw performance gains when they get around to releasing the 250w Maxwells....They will come, but probably not for at least 6 more months.

Magicite3516d ago

Hopefully I can buy HD7950/7970 for cheap soon.

wannabe gamer3516d ago

Came in your pants over a 1% increase from the 780 to the 980... doesn't take much, does it.
This is not impressive at all.

Razputin3516d ago

LOL, your negativity makes me laugh, thank you.

I see many people can't take a joke or recognize exaggerated over-enthusiasm.

If your idiot eyes weren't stuck on my ejaculation, you'd see I'm waiting for a 980 Ti, which will be a lot better than the 1% of the 980. But again, you must be really smart if all you care about is me and what goes on in my pants.

compguru9103513d ago

Hmmm, maybe your eyes are bad. If you look, the 980 has more than a 20% gain over the 780. Maybe you were thinking of the 780 Ti.

Gamer19823516d ago

Next to no performance increase over the 780 Ti, yet it will cost a lot more.

DougLord3516d ago

Might be the best thing on the market at $500, but a 10% upgrade vs a 290x is pathetic for "next-gen". I guess Nvidia only has to run as fast as AMD makes it. Like Intel. Very sad.

user56695103516d ago

This always has you thinking: are these companies in bed with one another, when they don't try to outdo each other?

R6ex3516d ago

No. It's TSMC's and Apple's fault for not having 20nm wafers available for GPU makers.

Are_The_MaDNess3516d ago

Nvidia still has better driver support, more direct support for games, and features that AMD doesn't have.

ABizzel13516d ago

LMAO, I thought the first score was from a single 980. I was like, OMG, it's up there with the 295X, but it's SLI XD

Still, it has great performance, but I'm waiting for the real Maxwell before I upgrade, so it's SLI for me.

Letros3516d ago (Edited 3516d ago )

They are probably going to milk it: this is Maxwell, the next series will be higher-TDP Maxwell, then 20nm Maxwell (or Pascal) will be the big boost.

ProjectVulcan3516d ago (Edited 3516d ago )

Understand that this is akin to the GTX 680, aka GK104. This is not the 'top' size Maxwell die. Its TDP is allegedly just 170W, 80W less than a 780 Ti! I find that impressive, considering it'll be 10 percent faster.

The GTX 680 was superseded by the 'top' size Kepler, the GK110, after about a year: first the Titan, then the 780 using that design.

Therefore the GTX 980 will be slightly faster than a 780 Ti, but use an awful lot less power.

Then maybe 6-12 months after, you'll see the top dog Maxwell employed in Teslas and limited run Titans no doubt. Soon after, a consumer card that relegates the GTX980 to midrange.

Same strategy as the GTX680 > Titan > GTX780 series

Are_The_MaDNess3516d ago

My GTX 580 3GB is ready to be demoted to PhysX duty.

sourav933516d ago

Trust me, you don't wanna do that. New Nvidia cards have enough oomph to do PhysX on their own. Unless you've got a GTX 260 or something for PhysX, don't bother, as sometimes using a high-end card just for PhysX can reduce performance compared to letting the main card handle PhysX itself.

Are_The_MaDNess3516d ago

Yeah, I know about the timing bottleneck.

That still doesn't stop me from trying it out first. As long as the PhysX timing is as fast as the frame timing, you won't get worse performance, and I don't think I'll have a problem like that with a 580 and a 970/980.

And yes, the newer cards can do PhysX a lot better than the 500 series, but you still give the main card more room for a higher framerate, modding, or even higher res or supersampling.

Nerdmaster3516d ago

I don't understand this chart. Why are there three GTX 980 and two 780 Ti?

Nerdmaster3516d ago (Edited 3516d ago )

SLI is pretty obvious (if I had counted it, there'd be four GTX 980s, not three), but I didn't notice the different clocks (I forgot that GPU clocks are two numbers, for core and memory). Thanks for the information.

I still think it's too confusing to build a table this way, having to search for the lowest occurrence of a model in the list to find the (hopefully) stock model values. The 980 SLI entry in the list isn't even using stock-clocked 980s.

wannabe gamer3516d ago

some are for a single card and some are for dual card setups


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan12d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS, and AI in general, will lower the power draw. It seems like the days of just adding more VRAM and horsepower are over. Law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it. Reliance on AI is only going to pick up more and more.

Tapani12d ago (Edited 12d ago )

DLSS certainly lowers power consumption. Also, headline numbers such as the 4090's 450W don't tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPUs of 10 years ago. Plus, today you can undervolt + OC GPUs by a good margin to keep stock performance while utilizing 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.
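That claim works out with simple arithmetic. In the sketch below, the 90%-performance and 320W figures are taken straight from the comment above, and the 450W stock limit is the 4090's published TDP; none of it is a measured result:

```python
# Perf-per-watt arithmetic for the undervolting claim above.
# The 90% performance and 320 W figures are the commenter's, not measurements.
stock_perf, stock_watts = 1.00, 450   # RTX 4090 at its stock power limit
uv_perf, uv_watts = 0.90, 320         # claimed undervolt + power-limit result

stock_eff = stock_perf / stock_watts  # relative performance per watt
uv_eff = uv_perf / uv_watts

improvement = uv_eff / stock_eff - 1
print(f"perf/W gain from the 320 W limit: {improvement:.0%}")  # ~27%
```

So even taking the numbers at face value, the power-limited card comes out roughly a quarter more efficient per watt, which is why undervolting is popular on this class of card.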

However, chip manufacturing today is limited by physics, and for at least the next 5-10 years we will keep seeing power increases in order to move the technology forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech on the market that we have yet to invent, or perhaps we can solve existing technologies' problems with manufacturing or cost of production.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion, solar, etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that is a pipe dream.

MrBaskerville12d ago

I don't think fusion is the way forward. It will most likely be too late by the time it's finally ready, meaning it will probably never be ready. Something else might arrive before then, though, and then it becomes viable.

Firebird36011d ago

We need to stop the smear campaign against nuclear energy.
We could power everything forever if we wanted to.

Tacoboto12d ago

PS4 Pro had dedicated hardware in it for supporting checkerboard rendering that was used significantly in PS4 first party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?

InUrFoxHole12d ago

Well... it's a coffin, man. So at least 4?

Tacoboto12d ago

PSSR in the fall can assume that role.

anast12d ago

and those nails need to be replaced annually

Einhander197212d ago

I'm not sure what point you're trying to make, but the PS4 Pro came before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR 2.

Tacoboto12d ago

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack11d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, but in terms of image quality it's way behind what DLSS 3 or FSR 3 currently offer.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" debate (which has been going around the PC community since 2019) right out of the window.

Einhander197211d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboard.

But you're talking about FSR 3, which probably is better than checkerboard; FSR 3 has only started getting games this year, though, so checkerboard, which was the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first, and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for. Heck, Digital Foundry is already spreading misinformation about the Pro.
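For readers unfamiliar with the checkerboard rendering being debated here, the core idea is: render only half the pixels each frame, in a checkerboard pattern, and fill the holes from the previous frame. The sketch below is a deliberately simplified CPU toy; real implementations run on the GPU and use motion vectors and ID buffers to reproject, not a plain copy:

```python
import numpy as np

def checkerboard_merge(prev_full, cur_half, frame_parity):
    """Combine the half of the pixels rendered this frame (a checkerboard
    pattern selected by frame_parity) with the previous full frame."""
    h, w = prev_full.shape
    ys, xs = np.mgrid[0:h, 0:w]
    rendered = (ys + xs) % 2 == frame_parity  # pixels actually drawn this frame
    return np.where(rendered, cur_half, prev_full)

# Toy usage: the scene jumps from black (0.0) to white (1.0) in one frame.
prev_frame = np.zeros((4, 4))
cur_frame = np.ones((4, 4))
merged = checkerboard_merge(prev_frame, cur_frame, frame_parity=0)
print(merged)  # half the pixels show the new value, half lag one frame behind
```

Alternating `frame_parity` every frame means each pixel is at most one frame stale, which is why a static scene resolves to effectively full resolution.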

Rhythmattic12d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself.
And of course, then there was the idea of checkerboard rendering not being native...
For sure, maybe this tech makes the difference minimal when pixel counting, but alas, it seems "performant and close enough", not native, is what matters now...
I want to see it run native without DLSS. Why not?

RonsonPL12d ago

An almost-deaf person:
- lightweight portable $5 speakers of 0.5cm diameter are the final nail in the coffin of hi-fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (look at the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either true 500fps on a 500Hz LCD TN or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is just as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article, just to go with the flow.

There's no coffin for native-res quality and there never will be. Eventually we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorant people on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA is not "native" either, as it's ruined by the developers' design choices: you can cheat by rendering every 4th pixel when you plan to smear a TAA pass over it later. When you disable it, you will see a ruined image, horrible pixelation and other visual "glitches", but it is NOT what native would have looked like if you honestly compared the two.

Stay informed.
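The 500fps point is easy to quantify. Assuming, purely for illustration, a fixed ~1ms cost for an upscaling pass (not a measured DLSS figure), its share of the frame budget grows as the framerate climbs:

```python
# Share of the frame budget consumed by a fixed-cost upscaling pass.
# The 1 ms pass cost is an illustrative assumption, not a measured DLSS figure.
def frame_budget_ms(fps):
    return 1000.0 / fps

upscale_cost_ms = 1.0

for fps in (60, 144, 500):
    budget = frame_budget_ms(fps)
    share = upscale_cost_ms / budget
    print(f"{fps:>3} fps: {budget:5.2f} ms/frame, upscaling takes {share:.0%} of it")
```

At 60fps a 1ms pass is a rounding error; at 500fps it would eat half of the 2ms frame budget, which is the commenter's argument.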

RaidenBlack11d ago

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of that time, but it'll slowly narrow to the point where it's indistinguishable.
Something similar happened with generative AI like Sora... AI-generated videos were turds back when they were introduced (the infamous Will Smith eating video)... but now look at Sora, generating videos that just look like real life.

Yui_Suzumiya12d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple10111d ago

Maybe better to get a budget gaming laptop and link a DualSense to it.

= A portable console with far better graphics than a Steam Deck, plus a bigger screen, and you can use it for work, etc.


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal16915d ago

Echo sentiment here - I think the way GPUs are going, gaming could be secondary to deep learning. Wonder if the 40 series was the last true generation of GPUs?

Number1TailzFan15d ago

No... Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if you can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance, then there's always AMD, which is doing OK at the mid-range.

Christopher15d ago

***or play at just 1080p and 60fps or something***

My over-two-year-old laptop GPU still runs fine. I think this is more a reason why GPUs are shifting priority to other things: the market of new buyers is shrinking as more PC gamers hold on to older, still-working parts that run RT/AI fine enough as it is. Not to say there aren't people who still upgrade, but I think the market for having the latest and greatest is smaller than it has been over the past two decades. The problem is we aren't advancing at the rate we were; we're reaching the flattening of that exponential curve. We need another major technological breakthrough to restart that curve.

D0nkeyBoi15d ago

The unremovable ad makes it impossible to read the article.

Tzuno15d ago (Edited 15d ago )

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing15d ago

You also need to consider that Nvidia is heavily invested in cloud gaming, so they are likely going to make moves to push you into yet another lifetime subscription service.

Kayser8114d ago

Nvidia will never change their price point until AMD or Intel makes a GPU that is comparable and cheaper than theirs.
It happened before in the days of the GTX 280, when they dropped the price from $650 to $450 in a matter of two weeks because of the HD 4870, which was selling at $380.


Nvidia AI Demo Unwittingly Proves that Human Voice Actors, Artists, and Writers are Irreplaceable

Nvidia presented Covert Protocol, a tech demo aiming to showcase the "power" of the Nvidia Ace technology applied to video game characters.

Eonjay35d ago (Edited 35d ago )

They look like they are in pain. Almost begging to be put down. It was uncomfortable to watch.

PRIMORDUS36d ago

The tech is too early. Come back in 10+ years and see what it can do then.

N3mzor36d ago

That presentation sounds like it was written by an AI using corporate buzzwords.

CS736d ago

I don't know why people keep thinking of it as AI vs. no AI.

A much more likely scenario is the use of AI alongside human work.

E.g., AI voices used during side quests or banter to boost the number of lines of dialogue.

Or AI generating additional predetermined branches in dialogue tree options for more freedom in conversations with NPCs.

Smellsforfree35d ago

"AI generating additional predetermined branches in dialogue tree options for more freedom in conversations with NPCs"

I'm wondering about that last one. Will that make a game more fun or more immersive? In the end, how can it possibly be more than filler content? And if it is filler content, how much do I really want to engage with it if I know it will lead nowhere?

MrBaskerville35d ago

It's one of those things that sounds cool on paper. But will probably get old fast.

DivineHand12535d ago

The tech is now available, and it is up to creators to create something unique with it.

Profchaos36d ago (Edited 36d ago )

The biggest thing to talk about here is that every interaction requires communication with Inworld's servers, so there are three big impacts here:
1) Games are always online, no question about it.
2) Delays in processing on Inworld's servers, outages, or unexpected load from some astronomically popular game will cause real-time delays in games. Ever waited on a ChatGPT response? This will be similar, as the context must be pulled via the LLM.
3) As for the other impact, the artistic one: no, I don't think writers can be replaced. I've mentioned before that AI-generated writing is often word soup, and I still stand by that; it's evident in the video too. AI cannot accurately convey human emotion, and I don't think it ever will.

I know publishers are looking to cut development costs, but what happens when Inworld decides to charge per interaction or updates its pricing a year after your game goes live? You have no choice but to pay it or shutter the game.

I've felt for a while that we are heading towards games being disposable entertainment, and now that's feeling more and more accurate.
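The latency concern above can be put in rough numbers. Every figure in the sketch below is an illustrative assumption, not a measurement of Inworld's service:

```python
# Rough latency budget for a cloud-hosted NPC dialogue reply.
# Every number below is an assumption chosen only to illustrate the concern.
network_rtt_ms = 60        # round trip to the inference server
inference_ms = 400         # time for the LLM to generate the NPC's line
frame_ms = 1000 / 60       # one frame at 60 fps

total_ms = network_rtt_ms + inference_ms
frames_waited = total_ms / frame_ms
print(f"reply arrives after ~{total_ms} ms, i.e. ~{frames_waited:.0f} frames at 60 fps")
```

Even generous numbers leave the player waiting hundreds of milliseconds (dozens of frames) for a line, which is why such a system would need to mask the wait behind animations or pre-canned acknowledgements.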
