
Nvidia Geforce GTX 880 3dMark Benchmark Surfaces - Nearly 35% More Powerful Than the 780

Leaked benchmarks of the GTX 880 Maxwell Flagship have just surfaced and the results are VERY impressive.

Read Full Story >>
wccftech.com
wtopez 3575d ago

This being Nvidia, it'll probably cost around $50 at launch. $60 for the Ti version.

codelyoko 3575d ago

Not sure whether very good sarcasm or typo.

Volkama 3575d ago

I hear AMD aren't releasing any direct competition, so it is likely nVidia will hike the price up to about $65. You know, so they can share a little revenue with AMD.

r1sh12 3575d ago

this wccftech site is garbage.
Any quick article to generate hits is always posted.
Why not just post actual results from in-game and 3D/2D benchmarks etc.?
It's like the site has a Google Alert running for the smallest scrap of information, just to get hits.

codelyoko 3575d ago

"Why not just post actual results from in-game and 3D/2D benchmarks etc.?"

Because the card isn't officially released yet, lol? How out of touch with the tech scene are you?

r1sh12 3574d ago

Hence why I said post in-game results.
I never said anything about it being released..
I work with AMD/Nvidia at an enterprise level.

Look up what an Enterprise Architect does..

So the card isn't released yet, so you're posting 'hearsay'.
Hahaha.

Don't hate 'cos you generate tiny amounts of money from site hits.

Don't reply, I don't care.

ATi_Elite 3575d ago

I'm gonna get a GTX860 SLI set-up!

It should come in at a cheaper price than a GTX 880 but offer up to 25% more power than a single GTX 880 costing $560.

JBSleek 3575d ago

The next generation is here.

Hanuman 3575d ago

And still developers will aim for the medium range cards when designing their games!

Clunkyd 3574d ago

And this is why PC games will never be well optimized.
FACT!

XiSasukeUchiha 3575d ago

Next gen is here and it's ready!

Rob_Ko 3575d ago

next gen every 6 months /sigh

Codewow 3575d ago

Sigh? Be happy that they produce more powerful graphics units every 6 months. If they didn't do that, then games would have no reason to push their graphics higher and higher. It's a good thing. And for people who already have a GPU, they are fine. There's no reason to upgrade for a few years.

Rob_Ko 3575d ago

there are no games that push a 1-year-old card to the limit, so there's no need for new expensive tech that doesn't do anything extra.

Software_Lover 3575d ago

If they stopped producing cards and just let the devs focus on what is out, at least for 2 years, it could do nothing but help the industry.

1) Devs can focus more on the hardware that is there and optimize games for it on the PC side

2) The cards will become cheaper to produce over time and it would do nothing but help the bottom line of Nvidia/AMD

3) Devs can focus on the hardware that is there and optimize games for it on the PC side

4) Did I mention devs can focus on the hardware that is there

5) BF3 doesn't even use the full power or memory of my HD7950 at 1200p

BiggCMan 3575d ago

Here's what I don't like, though. These cards are REMARKABLY more powerful than what's found in a games console.

But the best-looking games on PC are the multiplatform ones, and while they do look way better on PC because of the extra settings to turn on, I feel like these cards could produce way more than what we currently see.

I think every game on PC could look as good as or better than what GTA 4 modders do with that game, or what Skyrim modders do with that game. But they don't, because games are just ported over from consoles.

Imagine a PC game taking full advantage of a GTX Titan Z, a high-end CPU, etc. I can't even imagine it because it doesn't happen.

I am a console gamer for the most part, but I have a nice PC that could go for a new GPU soon. I don't really want to, though, since the PS4 is handling games just as well as the PC version would nowadays. It shouldn't be like that.

edqe 3575d ago

@Rob_Ko: Maybe not from Ubisoft or EA who makes console games.

"Moore's law is our friend."

http://www.eurogamer.net/ar...

Kleptic 3574d ago (Edited 3574d ago)

Crytek is all but selling their office furniture...

NO DEVELOPER will touch this card until 2016... I almost guarantee that... not talking about tech demos, as I know the UE4 3DMark will be all over it... just released titles that actually utilize it...

It's like BiggCMan said... it shouldn't be this way... but it is... so w/e... the capability of hardware is so far beyond what's financially possible for game development anymore... the industry just doesn't make sense, at least the established market for PC hardware...

Codewow 3575d ago

That's because there aren't any PC dedicated AAA devs. So the indie devs will be pushing them. Try Star Citizen for example.

edqe 3575d ago (Edited 3575d ago)

... and 'Elite: Dangerous'
http://www.eurogamer.net/ar...

thisismyaccount 3575d ago

So you're saying that the 100+ (?) people behind/making Star Citizen can be considered an indie developer?

Okay....

Bladesfist 3574d ago

Indie means independent, which means you are self-publishing; it has nothing to do with team size.

JBSleek 3575d ago

I hate people who want to stop technology from advancing.

Shabutie13 3575d ago

Rob_Ko, you are just incorrect with this. There are games that will push any set of cards to the limit. The problem is that you are thinking of 1080p 60 FPS. I go for 144 FPS with my monitor, and my 780 Ti can't do it without turning some things down in newer games.

arkard 3575d ago

Why do you need 144 fps?

r1sh12 3575d ago (Edited 3575d ago)

Any progress is good progress.
DDR4 RAM is coming out soon, and Intel released the Devil's Canyon CPUs.

It's not a next gen every 6 months, it's an incremental step up within the gen.

Whereas with consoles, you're currently running a PS4/Xbone that is about 5 years behind.
Check the video on Linus Tech Tips; they got a 5-year-old Nvidia card to post the same performance as both consoles playing Watch Dogs.

I'm not hating on consoles, because I love the ones I have.
It's a realisation I came to when the specs were tested against real-world computing.

BVFTW 3575d ago

PC-side optimization works through developer patches and card drivers (most families of cards share a compatible architecture). Even 2010's GTX 480 cards are going strong; for example, the controversial game Watch Dogs reaches more than 30 fps at high settings at "1200p" ( http://static.techspot.com/... ) with FXAA anti-aliasing, and this game's performance will keep improving with Nvidia, AMD and Ubisoft patches (better optimization).
So whoever bought the 2010 card has been playing games with performance comparable to the "current console gen" since 2010, proving that you can effectively switch cards on a 5-year cycle and have great graphics. The new generation of cards is expected to have an even longer life/performance span.

DougLord 3575d ago

35% better than the 780. Isn't that roughly 780 Ti levels? No die shrink. No H.265. No thanks.

Volkama 3575d ago

The figure is based on a 3DMark result:
780: 4500
780 Ti: 5000
880: 6100

If it is accurate it is a fairly sizeable jump, but probably for an even more sizeable outlay.
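If the leaked scores are taken at face value, the headline percentage works out as follows. This is a quick sketch using only the rumoured numbers quoted above, which are unconfirmed:

```python
# Rumoured 3DMark scores from the leak discussed above (unconfirmed).
scores = {"780": 4500, "780 Ti": 5000, "880": 6100}

def uplift(new, old):
    """Percentage improvement of score `new` over score `old`."""
    return (new - old) / old * 100

gain_vs_780 = uplift(scores["880"], scores["780"])       # ~35.6%, the headline figure
gain_vs_780ti = uplift(scores["880"], scores["780 Ti"])  # 22.0%
```

So the "nearly 35%" headline holds against the 780, but the jump over the 780 Ti is a more modest 22%.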

Personally I am looking for a card in AMD's line that will boast HDMI 2.0 and sit in CrossFire with my existing R9 290s. Otherwise I won't be upgrading for a good 2 or 3 years.

Qrphe 3575d ago

The lack of die shrink kills it for me. There has been no sizable power jump in years.

ginsunuva 3575d ago

I know right, just tell them to get a shrink ray and shrink the die!

Sounds super easy!


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan 22d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS, and AI in general, will lower the power draw. It seems like the days of just adding more VRAM & horsepower are over. Law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more & more.

Tapani 22d ago (Edited 22d ago)

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating do not tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPU of 10 years ago. Plus, today you can undervolt + OC GPUs by a good margin to keep stock performance while utilizing 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.
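As a back-of-envelope check on that claim (using the commenter's own figures, which are estimates rather than measurements), 90% of stock performance at 320W instead of 450W is a sizeable perf-per-watt win:

```python
# Commenter's estimates: stock = 100% performance at 450W,
# undervolted = 90% performance at 320W.
stock_perf, stock_watts = 1.00, 450
uv_perf, uv_watts = 0.90, 320

stock_eff = stock_perf / stock_watts   # performance per watt at stock
uv_eff = uv_perf / uv_watts            # performance per watt undervolted
gain = (uv_eff / stock_eff - 1) * 100  # ~26.6% better perf/W
```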

However, chip manufacturing today is limited by physics, and we will see power increases for at least the next 5-10 years to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech on the market that we've yet to invent, or perhaps we can solve existing technologies' problems with manufacturing or cost of production.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion, solar, etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that is a pipedream.

MrBaskerville 22d ago

I don't think fusion is the way forward. It will most likely be too late by the time it's finally ready, meaning it will probably never be ready. Something else might arrive before it, though, and then that becomes the viable option.

Firebird360 21d ago

We need to stop the smear campaign against nuclear energy.
We could power everything forever if we wanted to.

Tacoboto 22d ago

PS4 Pro had dedicated hardware in it to support checkerboard rendering, which was used significantly in PS4 first-party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?
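For anyone unfamiliar with the technique, checkerboard rendering draws only half the pixels each frame in a checker pattern and reconstructs the rest. The sketch below is a toy illustration that fills gaps by averaging horizontal neighbours; real implementations, including the PS4 Pro's hardware path, also reproject from the previous frame:

```python
# Toy checkerboard reconstruction: pixels where (x + y) is even are
# "rendered" this frame; the gaps are filled from horizontal neighbours,
# which always land on the rendered half of the pattern.
def checkerboard_reconstruct(full_frame):
    h, w = len(full_frame), len(full_frame[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == 0:          # rendered this frame
                out[y][x] = full_frame[y][x]
            else:                          # gap: average the rendered neighbours
                left = full_frame[y][x - 1] if x > 0 else full_frame[y][x + 1]
                right = full_frame[y][x + 1] if x < w - 1 else full_frame[y][x - 1]
                out[y][x] = (left + right) / 2
    return out
```

Note that every gap pixel's horizontal neighbours fall on the rendered half of the pattern, which is what makes the cheap fill possible.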

InUrFoxHole 22d ago

Well... it's a coffin, man. So at least 4?

Tacoboto 22d ago

PSSR in the fall can assume that role.

anast 22d ago

and those nails need to be replaced annually

Einhander1972 22d ago

I'm not sure what point you're trying to make, but the PS4 Pro came before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR 2.

Tacoboto 22d ago

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack 22d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, but in terms of image quality it's way behind what DLSS 3 or FSR 3 currently offer.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" debate (that's been going around the PC community since 2019) right out of the window.

Einhander1972 21d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboard.

You're talking about FSR 3, which probably is better than checkerboard, but FSR 3 has only started getting games this year, so checkerboard, which was the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for. Heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic 22d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself...
And of course, then the idea of checkerboard rendering not being native...
For sure, maybe this tech makes the difference minimal when pixel counting, but alas, it seems "performant and close enough", not native, is what matters now.
I want to see it run native without DLSS. Why not?

RonsonPL 22d ago

An almost deaf person:
- lightweight portable $5 speakers of 0.5cm diameter are the final nail in the coffin of Hi-Fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either true 500fps on a 500Hz LCD TN or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article, just to go with the flow.

There's no coffin for native-res quality and there never will be. Eventually we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of the ignorant on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA is not "native" either, as it's ruined by the developers' design choices: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass over it later. When you disable it, you see a ruined image, horrible pixelation and other visual "glitches", but that is NOT what native would have looked like if you honestly compared the two.

Stay informed.

RaidenBlack 22d ago

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of the day, but it'll slowly narrow to the point of being indistinguishable.
Something similar happened with generative AI like Sora... AI-generated videos were turds back when they were introduced (the infamous Will Smith eating video), but now look at Sora, generating videos that just look like real life.

RonsonPL 3d ago

You can improve quality, but you will never be able to reach native quality in motion. The biggest reason these upscalers are so praised is that they use previous-frame data. You cannot do that without degrading latency and/or hurting motion quality. If you stack another flaw on top of it, from the sample-and-hold method of displaying the image or from low framerate, then sure, the difference between "screwed-up image" and "image screwed up even more" may seem small or non-existent. But if you talk about gaming, not interactive movies, the upscalers are overhyped and harmful tech for gamers and the whole gaming industry. For example, a game designed around screwed-up motion, like TAA-enabled games, will never be played at improved quality even 100 years later when hardware allows native 16K res. The motion quality will be broken, and even if you disable the AA pass you will still get a broken image, because the devs designed their effects with the smeary filter in mind. This is why you can disable TAA in some games today, manually, with some tinkering, but you get 1-in-16 undersampled crap.
It's annoying that nobody seems to understand the serious drawbacks of AI-assisted upscalers. Everyone just praises it, calling it a great revolution. Don't get me wrong: AI has its place in rendering. But NOT in gaming.
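The "previous frame data" point boils down to temporal accumulation. A minimal sketch (the blend weight is an illustrative value, not any vendor's actual tuning) shows why a pixel that changes takes several frames to converge, which is the lag/ghosting being described:

```python
# Minimal sketch of the temporal accumulation at the heart of TAA/DLSS-style
# techniques: each output pixel blends the current frame with reprojected
# history. ALPHA is an illustrative weight, not a real implementation's value.
ALPHA = 0.1  # weight of the new sample; history dominates

def temporal_blend(history, current, alpha=ALPHA):
    """Exponential moving average over frames for one pixel value."""
    return (1.0 - alpha) * history + alpha * current

# A sudden brightness change (0.0 -> 1.0) takes many frames to settle:
pixel = 0.0
for _ in range(10):
    pixel = temporal_blend(pixel, 1.0)
# pixel ≈ 0.65 after 10 frames, still short of the true value 1.0
```

After 10 frames the accumulated value is still only ~65% of the way to the new truth; motion exposes exactly this convergence lag.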

Yui_Suzumiya 22d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple101 22d ago

Maybe better to get a budget gaming laptop and link a DualSense to it.

= A portable console with far better graphics than a Steam Deck! Plus a bigger screen, and you're able to use it for work etc.


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal169 25d ago

Echoing the sentiment here - I think the way GPUs are going, gaming could become secondary to deep learning. I wonder if the 40 series was the last true generation of GPUs?

Number1TailzFan 25d ago

No... Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if they can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance, there's always AMD, which is doing OK at the mid-range.

Christopher 25d ago

***or play at just 1080p and 60fps or something***

My over-2-year-old laptop GPU still runs fine. I think this is more a reason why GPUs are shifting priority to other things: the market of new buyers is shrinking as more PC gamers stop replacing older, still-working parts that run RT/AI well enough as it is. Not to say there aren't people who still do it, but I think the market for having the latest and greatest is smaller than it has been for the past two decades. The problem is we aren't advancing at the rate we were; we're reaching the flattening of that exponential curve. We need another major technological advancement to restart that curve.

D0nkeyBoi 25d ago

The unremovable ad makes it impossible to read the article.

Tzuno 25d ago (Edited 25d ago)

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing 25d ago

You also need to consider that Nvidia is heavily invested in cloud gaming, so they are likely going to make moves to push you into yet another lifelong subscription service.

Kayser81 24d ago

Nvidia will never change their price point until AMD or Intel makes a GPU that is comparable and cheaper.
It happened before in the days of the GTX 280: they dropped the price from $650 to $450 in a matter of 2 weeks because of the HD 4870, which was selling at $380.


Nvidia AI Demo Unwittingly Proves that Human Voice Actors, Artists, and Writers are Irreplaceable

Nvidia presented Covert Protocol, a tech demo aiming to showcase the "power" of the Nvidia Ace technology applied to video game characters.

Read Full Story >>
techraptor.net
Eonjay 45d ago (Edited 45d ago)

They look like they are in pain. Almost begging to be put down. It was uncomfortable to watch.

PRIMORDUS 46d ago

The tech is too early. Come back in 10+ years and see what it can do then.

N3mzor 46d ago

That presentation sounds like it was written by an AI using corporate buzzwords.

CS7 46d ago

I don’t know why people keep thinking of it as AI vs no AI.

A much more likely scenario is the use of AI alongside human work.

E.g. AI voices used during side quests or banter to boost the number of dialog lines.

AI generating additional predetermined branches in dialog tree options for more freedom in conversations with NPCs.

Smellsforfree 45d ago

"AI generating additional predetermined branches in dialog tree options for more freedom in conversations with NPCs"

I'm wondering about that last one. Will it make a game more fun or more immersive? In the end, how can it possibly be more than filler content? And if it is filler content, how much do I really want to engage in conversation with it if I know it will lead nowhere?

MrBaskerville 45d ago

It's one of those things that sounds cool on paper. But will probably get old fast.

DivineHand125 45d ago

The tech is now available, and it is up to creators to create something unique with it.

Profchaos 46d ago (Edited 46d ago)

The biggest thing to talk about here is that every interaction requires communication with Inworld's servers, so there are three big impacts here:
1) Games are always online, no question about it.
2) Delays in processing on Inworld's servers, outages, or unexpected load from some astronomically popular game will cause real-time in-game delays. Ever waited for a ChatGPT response? This will be similar, as the context must be pulled via the LLM.

Now as for the other impact, the artistic one: no, I don't think writers can be replaced. I've mentioned before that AI-generated writing is often word soup; I still stand by that, and it's also evident in the video.
AI cannot accurately convey human emotions, and I don't think it ever will.

I know publishers are looking to cut down on development costs, but what happens when Inworld decides to charge per interaction or updates their pricing a year after your game goes live? You have no choice but to pay it or shutter the game.

I've felt for a while that we're heading towards games being disposable entertainment, and now that feels more and more accurate.
