
Nvidia GeForce 306.23 WHQL Released - Improvements, TXAA Enabled on GTX 600 series, New SLI profiles

DSOGaming writes: "Nvidia has released a new set of WHQL drivers for its graphics cards. The new NVIDIA GeForce 306.23 WHQL drivers are now available to download. An essential upgrade for every GeForce GTX user, the GeForce 306.23 WHQL drivers enable high-quality TXAA anti-aliasing on GTX 600 Series GPUs, improve game performance by up to 17% on all GeForce 400, 500, and 600 Series GPUs, enhance seven top titles with NVIDIA Control Panel MSAA and Ambient Occlusion profiles, improve the overall user experience for all GPU users."

FanboyPunisher4239d ago

Lmao, AMD never have and never will make good drivers.
Tons of issues with AMD.

iamgoatman4239d ago

Been using AMD drivers for a couple of years now and only had the odd problem. Their biggest issue is getting Crossfire profiles out promptly, but otherwise saying they never have or never will make good drivers is just ignorant and false.

Can say one thing though, never had any AMD drivers nearly fry my card, unlike those infamous Nvidia drivers I almost lost my trusty 8800GT to.

kharma454239d ago

Another one of those myths that's still being perpetuated.

Since the AMD takeover, drivers have improved hugely, and since they've moved away from a monthly release cycle the quality has risen again.

AMD aren't the only ones with driver issues; Nvidia has had their fair share too, but since AMD got the bad name first, it's stuck with them. At least AMD don't delete posts and threads about driver issues from their forums.

FanboyPunisher4239d ago (Edited 4239d ago )

It's not a myth; I'm a power user and I've had tons and tons of dual-screen issues. I'm speaking from personal experience: I've given AMD attempt after attempt and I always come across issues.

They've basically pushed me to Nvidia, where I've never had an issue.

I've owned tons of AMD cards to prove it.
7000
7500
8500
9800xl
X800 XT
2900
4850
7850

I've been buying AMD any time I see a good benchmark, but they always drop the f'n ball on their drivers.

Not a myth, so stop your damage control.
I'm an IT tech and programmer and this shit is real. I've had to do many workarounds for issues they've caused time and time again.

Not like I'm the only one anyway, Google it.

iamgoatman4239d ago

So you ALWAYS have issues with AMD drivers but NEVER had one with Nvidia drivers? Yeah, I don't believe that for a second.

JsonHenry4239d ago

I was using an AMD 5870 for almost 3 years. Never had any trouble. Now I am running a 680GTX and not having any trouble either.

hiredhelp4238d ago

Funny how there are articles about new Nvidia drivers but never about AMD.
As for this nonsense that AMD drivers are worse, that's not actually true, at least not this gen.
I started off as a huge fan of the green team for years and didn't take to ATI, as they were still named back then.
But my mate said he had a card for sale. "Nah, ATI, rubbish," I thought, since back then ATI weren't as good, but years on he was selling an HD 4870, not that old, so I sold my old 8800GT and I was surprised at how much they'd changed.
I kept that card till last year and never had a driver issue... but then the 8800GT never had an issue either. Last year I spent about £800 building another rig for myself.
Only this time I wanted to go back to the green team; it had been a while since the 8800.
Great. I ordered two KFA2 560s, my first SLI, and got another mate into PC gaming. It went well for a few weeks, then I got "display driver has recovered". Wtf. Hmmm, okay, carried on, relaunched the game; well, after updating several times, sending cards back and getting new ones, two months passed and I was left embarrassed, as the mate I'd just got into PC gaming had the same setup and card as me, and the same issues.
Every day I hit the forums and all I saw was people screaming about TDR issues.
In January I bought a reference design 7970, my mate followed a month after, and my other mate stuck with his 6950; even after 4 updates, not once have I had a TDR.
Except if I overclock and push the card too hard, lol. But... the moral of this pointless dribble is that if anyone has got worse over the years and dropped the ball, it's Nvidia, not AMD. Will I go back? ....... NO, not as long as they keep using drivers developed on the cheap by the same team that handled the 500 series and now does drivers for the 600 series. BTW, the forums are filled with complaints about the 600 series - funny that.

john24239d ago

Curious to see what SLI bits Nvidia used for Sleeping Dogs this time around

jetpacksheep4239d ago

Thanks for the heads up, getting mine right now, although the improvements don't sound as great for the 500 series as they do for the 600 series.

ninjahunter4238d ago

Have trouble with hardware not being detected, call Dell for support, they install their driver, performance drops 30%.

Nvidia sure knows how to make drivers better and better every day.

hellvaguy4238d ago

Who buys Dell? They probably have some software issue or Dell spamware. Nvidia is second to none in driver support. Sometimes AMD does decently with some cards, other cards not so much.

ninjahunter4238d ago

Lol, I see. I buy Alienware, but your two bubbles tell me you don't have the comprehension to understand why.

hellvaguy4238d ago

Alienware is ridiculously overpriced, fact. Also ASUS gaming laptops are far superior and cheaper.

As they say, a fool and his money soon part ways.

ninjahunter4237d ago

Alienware is not ridiculously overpriced; it's a stretch to say they even cost 10% more than their competition. And Alienware has more than a few advantages over other companies.
-Alienware has unmatched support. If your laptop breaks down for a reason that isn't your fault, it will either be fixed, replaced, or you'll be sent replacement parts within days.
-The build quality is top notch. Right now I could take my laptop, hold it over my head and drop it onto a hardwood floor, and I'd bet you money it would still work.
-The cooling systems in Alienware laptops are top notch, built with overclocking and high-power hardware in mind, and then some.
-Swag appeal. Let's face it, you can actually get away with bragging about an Alienware laptop, which doesn't look like an oversized netbook.
-FEATURES- You name it, an Alienware laptop probably has two of it.


5 Amazing Side Quests In Games That You Need To Experience

While gamers usually take notice of the mainline missions, these 5 side quests deserve more widespread attention for how entertaining they are.


It's A Crime That There's No Sleeping Dogs 2 Yet

Huzaifah from eXputer: "Sleeping Dogs from the early 2010s is one of the best open-world games out there but in dire need of a resurgence."

LG_Fox_Brazil8d ago

I agree, I consider the first one a cult classic already

isarai7d ago

You say "yet" as if it's even possible anymore. United Front Games is gone, along with anyone that made this game what it is

CrimsonWing697d ago

That’s what happens when games sell poorly. And I’ve seen people wonder why people cry when a game sells badly… this is your answer.

solideagle7d ago

The majority of the time it's true, but if a company/publisher is big (in terms of money), they can take a hit or two. E.g. I am not worried about Rebirth sales, as Square will make Remake part 3 anyway, but if FF 17 doesn't sell then Square might need to look for alternatives. <-- my humble opinion

Abnor_Mal7d ago

Doesn’t Microsoft own the IP now since they acquired Activision?

DaReapa7d ago

No. Square Enix owns the IP.

Abnor_Mal7d ago

Oh okay, Activision owned True Crime, but when that didn’t sell as intended it was canceled. Six months later Square Enix bought the rights and changed the title to Sleeping Dogs.*

*As per Wikipedia

boing17d ago (Edited 7d ago )

Sleeping Dogs was a sleeper hit back then. It was fantastic. It actually still is. Would love a sequel to this, or at least a revival of the True Crime series.


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan11d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious if using DLSS and AI in general will lower the power draw. It would seem like the days of just adding more VRAM and horsepower are over. Law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more and more.

Tapani10d ago (Edited 10d ago )

DLSS certainly lowers power consumption. Also, headline numbers such as the 4090's 450W don't tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPUs of 10 years ago. Plus, today you can undervolt and OC GPUs by a good margin to keep stock performance while using 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.
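As a quick sanity check on that claim, here's a minimal sketch of the perf-per-watt arithmetic using the figures quoted in this comment (450W stock, 320W limited, 90% performance retained); these are the commenter's numbers, not measurements:

```python
# Rough perf-per-watt arithmetic for the power-limit claim above.
# The 450 W / 320 W / 90% figures come from the comment, not from benchmarks.
stock_power_w = 450.0
limited_power_w = 320.0
retained_perf = 0.90          # fraction of stock performance kept at the lower limit

stock_efficiency = 1.0 / stock_power_w             # relative performance per watt at stock
limited_efficiency = retained_perf / limited_power_w

gain = limited_efficiency / stock_efficiency - 1.0
print(f"Perf-per-watt gain at 320 W: {gain:.0%}")  # roughly +27% efficiency
```

In other words, if those numbers hold, the power-limited card does about 27% more work per watt than stock.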

However, chip manufacturing today is limited by physics, and we will see power increases for at least the next 5-10 years to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech coming to market that we have yet to invent, or perhaps we can solve existing technologies' problems with manufacturing or cost of production.

On the other hand, if we were to solve the energy problem on Earth with fusion, solar, etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that is a pipe dream.

MrBaskerville10d ago

I don't think fusion is the way forward. It will most likely be too late by the time it's finally ready, meaning it will probably never really get there. Something else might arrive before then, though, and then it becomes viable.

Firebird36010d ago

We need to stop the smear campaign on nuclear energy.
We could power everything forever if we wanted to.

Tacoboto11d ago

PS4 Pro had dedicated hardware in it for supporting checkerboard rendering that was used significantly in PS4 first party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?
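For anyone unfamiliar with what checkerboard rendering actually does, here's a minimal, illustrative sketch (not Sony's implementation): each frame only shades half the pixels in an alternating 2x2-block checkerboard pattern, and the missing half is filled in from the previous frame. Real engines reproject that missing half with motion vectors and ID buffers rather than reusing stale pixels as done here.

```python
import numpy as np

def checkerboard_mask(h, w, frame_idx, block=2):
    """True where pixels are freshly shaded this frame: 2x2 blocks, phase flips every frame."""
    ys, xs = np.indices((h, w))
    return ((ys // block + xs // block + frame_idx) % 2) == 0

def render_checkerboarded(full_frame, prev_output, frame_idx):
    """Shade half the pixels this frame; carry the other half over from last frame's output."""
    mask = checkerboard_mask(full_frame.shape[0], full_frame.shape[1], frame_idx)
    out = prev_output.copy()
    out[mask] = full_frame[mask]   # only ~50% of pixels cost full shading work this frame
    return out

# Toy usage on 8x8 "frames" of noise; the shading cost per frame is roughly halved.
h, w = 8, 8
output = np.zeros((h, w))
for i in range(2):
    output = render_checkerboarded(np.random.rand(h, w), output, i)
```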

InUrFoxHole11d ago

Well... it's a coffin, man. So at least 4?

Tacoboto11d ago

PSSR in the fall can assume that role.

anast10d ago

and those nails need to be replaced annually

Einhander197210d ago

I'm not sure what the point you're trying to make is, but PS4 Pro was before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR 2.

Tacoboto10d ago

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack10d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, but in terms of image quality it's way behind what DLSS 3 or FSR 3 is currently offering.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" debate (that's been going around in the PC community since 2019) right out of the window.

Einhander19729d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboarding.

But you're talking about FSR 3, which probably is better than checkerboarding; FSR 3 has only started appearing in games this year, though, so checkerboarding, which was the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for. Heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic10d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself...
And of course, then the idea of checkerboard rendering not being native...
For sure, maybe this tech makes the difference minimal while pixel counting, but alas, it seems performance and "close enough", not native, is what matters now...
I want to see it run native without DLSS... why not?

RonsonPL11d ago

Almost deaf person:
- lightweight, portable $5 speakers of 0.5cm diameter are the final nail in the coffin of Hi-Fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either true 500fps on a 500Hz LCD TN or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is just as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article, just to go with the current.

There's no coffin for native-res quality and there never will be. Eventually we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorants on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA is not "native" either, as it's ruined by the developers' design choices: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass over it later. When you disable it, you see a ruined image, horrible pixellation and other visual "glitches", but that is NOT what native would have looked like if you honestly compared the two.
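To put a rough number on that 500fps point, here's a sketch of the frame-budget arithmetic; the 1.0ms upscaling cost is an assumed, illustrative figure, not a measured one:

```python
# Frame-budget arithmetic: what share of a frame a fixed-cost upscaling pass would consume.
# UPSCALE_COST_MS is an assumed, illustrative figure, not a benchmark of DLSS/FSR.
UPSCALE_COST_MS = 1.0

for fps in (60, 120, 500):
    frame_budget_ms = 1000.0 / fps
    share = UPSCALE_COST_MS / frame_budget_ms
    print(f"{fps:>3} fps: frame budget {frame_budget_ms:5.2f} ms, "
          f"upscaling would take {share:.0%} of it")
```

At 60fps the pass is a small fraction of the 16.7ms budget, but at 500fps the whole frame is only 2ms, so the same fixed cost eats half of it.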

Stay informed.

RaidenBlack10d ago

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, there will obviously be a gap compared to the native rendering tech of that time, but it'll slowly narrow to the point where it's indistinguishable.
Something similar is the genAI Sora... AI-generated videos were a turd back when they were introduced (the infamous Will Smith eating video), but now look at Sora, generating videos that just look like real life.

Yui_Suzumiya10d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple10110d ago

Maybe better to get a budget gaming laptop and link a DualSense to it.

= A portable console with far better graphics than a Steam Deck! Plus a bigger screen, and you can use it for work, etc.