
AMD says Nvidia’s GameWorks “completely sabotaged” Witcher 3 performance

There was a war of words earlier this week as the Internet decided that, once again, Nvidia's GameWorks technology was messing with the performance of its games on AMD hardware.

arstechnica.co.uk
chaldo (3261d ago)

They say this about every game... I'm starting to think it's AMD now.

Ezz2013 (3261d ago, edited)

I have to agree. I'm an AMD user, but seeing as they complain every time a game comes out, take forever to release new updates for ATI cards (which do little to nothing), and this whole gen never bother trying to fix the problems AMD users face with every game,

I'm really looking to get an Nvidia card in the next few months. It's not Nvidia's problem that AMD didn't want to use PhysX tech as well.

Lon3wolf (3261d ago)

I jumped ship just before Xmas last year for those reasons. The product is great, but if they spent more time on drivers etc. rather than crying foul over every game, they could improve a lot.

mikeslemonade (3261d ago)

Nvidia drivers messed up my system. My SLI setup doesn't even work right now, and there has never been a first-party driver that lets me enable SLI; I have to do it with a program called DifferentSLI.

Testfire (3261d ago)

I'm an AMD user and happy so far, but the trend I always see is devs saying AMD doesn't reach out to them while Nvidia does. You can't just make a product and say "here you go, make it work with your game." When it's time for an upgrade, I'll seriously have to consider whether or not I'll continue to support AMD.

assdan (3261d ago)

Nvidia makes it next to impossible to fix these issues because of their policies. Nvidia's practices are sketchy at best.

DevilOgreFish (3261d ago)

DX12 can't come soon enough. It may come in handy to own both an AMD and an Nvidia GPU, for TressFX and Nvidia features.

slappy508 (3261d ago)

Only a d'hoine can make such an outrageous claim

NovusTerminus (3261d ago, edited)

Here's the thing: it's not Nvidia. They offer a great set of features that devs (myself included) like to use; it's hard to blame them for making a dev's job easier and for games turning out looking better.

Nio-Nai (3261d ago)

Except it's pretty well known that Nvidia doesn't share its code with AMD.

Meanwhile, AMD has shared its code, with both TressFX and Mantle helping to build DX12.

It's pretty obvious Nvidia is a bunch of Dbags.

assdan (3261d ago)

Well, maybe AMD should start doing the same...

SniperControl (3261d ago)

"Except it's pretty well known that Nvidia doesn't share its code with AMD"

Look at it from a business perspective, would you?
Nvid is in it to make money; they want to sell their cards to as many people as they can. What's the point of giving your advantage to your main competitor?

Nio-Nai (3261d ago)

The advantage is to push the tech forward.

That's all AMD is interested in: pushing the tech, not limiting it to gain a larger profit.

SniperControl (3261d ago)

Ahh c'mon, pushing tech forward is a necessity to push business forward. AMD aren't making chips and cards out of the fondness of their hearts; they're doing it to make money, simple as that.

Gwiz (3261d ago)

AMD doing it to push tech forward? I can't really agree with that.
Look at the CPUs they produce. One thing they've done is make a cheap CPU+GPU on a single die (the APU), which did increase their GPU competition, but that's not looking at the whole situation. AMD was once highly competitive, and now they don't want to be known as the cheaper solution. Okay, so what's left?

Avernus (3261d ago)

As someone looking to build a gaming PC in the coming months, I just don't know if I want AMD or Nvid.

Battlefield games are better optimized for AMD, but other games favor Nvidia. Thoughts about this?

hiredhelp (3261d ago)

Know the budget you wish to spend on a video card, then look up the options on the market. Once you find the two cards from each brand that suit your pocket, start Googling benchmarks for such-and-such vs. such-and-such.

Shinuz (3261d ago)

Go nvidia for their support through drivers.

Nio-Nai (3261d ago)

Go with the one in your price range.

Running 3x R9 290, and I get better 4K benchmarks and in-game rates than 2x Titan X cards.

They are rated at 28k in 3DMark while mine's rated at 36k.

I'm playing games like GTA 5 on maxed-out settings at 3840x2160 above 100fps.

SniperControl (3261d ago, edited)

Go for Nvid: way better driver support, and the cards are fantastic as well.

I have 2 overclocked Asus Strix GTX 970s in SLI. Simply amazing, and they cost me £600 for both.

GTA 5 runs smoothly at around 60-70fps at 4K on ultra settings, The Witcher 3 (after the update) is around 60fps at 4K, and Shadow of Mordor (with the ultra graphics pack) is around 70fps at 4K.

Not saying AMD cards are bad, I just think Nvid cards are better. Besides, most new games have some sort of Nvid tie-in, while it's rare to see an AMD tie-in.

jdiggitty (3261d ago)

I run an AMD R9 290 and have had zero issues with it and the cost is great for the performance.

FunkYellowMonkey (3261d ago)

In the long term, Nvidia for their driver support; for me it was also DSR mode. It's great since I don't need to fork out more money for a new monitor! ;)

ElementX (3261d ago, edited)

I have always avoided AMD. Regardless of benchmarks, I'll choose Intel and Nvidia over AMD. I started messing with computers in the mid-90s and always thought of AMD as the lower-class, cheaper alternative in regards to processors and GPUs. With processors and video cards, I feel you get what you pay for in most cases. I have read numerous forums about AMD users having various issues with their hardware; not that other companies don't have problems, but I dunno, I'm just biased against AMD.


Make your next GPU upgrade AMD as these latest-gen Radeon cards receive a special promotion

AMD has long been the best value option if you're looking for a new GPU. Now even their latest Radeon RX 7000 series is getting cheaper.

Father__Merrin (3d ago)

Best for the money are the Arc cards.

just_looken (3d ago)

In the past, yes, but last-gen AMD has gotten cheaper, and their new cards are on the horizon, making the 6000 series even cheaper.

The Arc cards are no longer made by Intel, but Asus/ASRock have some, and the next line, Battlemage, is coming out, prices TBD.

Due to the longer software development, it's always best to go AMD over Intel if it's not too much more money, even though Intel makes a strong GPU; I own 2 of the 4 card versions.


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan (13d ago)

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS, and AI in general, will lower the power draw. It would seem the days of just adding more VRAM and horsepower are over: law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more and more.

Tapani (13d ago, edited)

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating don't tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPUs of 10 years ago. Plus, today you can undervolt and overclock GPUs by a good margin to keep stock performance while using 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.
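That perf-per-watt claim is easy to sanity-check. A minimal sketch using the figures from this comment (450W stock vs. roughly 90% performance at 320W; assumed round numbers, not measurements):

```python
# Perf-per-watt comparison for an undervolted GPU, using the rough
# figures above (assumed round numbers, not measured values).
stock_perf, stock_watts = 1.00, 450   # 4090 at stock power limit
uv_perf, uv_watts = 0.90, 320         # ~90% performance at 320 W

stock_eff = stock_perf / stock_watts  # performance per watt, stock
uv_eff = uv_perf / uv_watts           # performance per watt, undervolted
gain = uv_eff / stock_eff - 1
print(f"efficiency gain: {gain:.0%}")  # → efficiency gain: 27%
```

So giving up ~10% of performance buys roughly a quarter more performance per watt, which is exactly the trade-off described here.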

However, in today's world chip manufacturing is limited by physics, and we will see power increases for the next 5-10 years at the very least to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech coming to market that we've yet to invent, or perhaps we'll solve existing technologies' problems with manufacturing or cost of production.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion, solar, etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that's a pipe dream.

MrBaskerville (13d ago)

I don't think fusion is the way forward. It will most likely be too late by the time it's finally ready, meaning it will probably never be ready. Something else might arrive first, though, and then that becomes the viable option.

Firebird360 (12d ago)

We need to stop the smear campaign on nuclear energy.
We could power everything forever if we wanted to.

Tacoboto (13d ago)

PS4 Pro had dedicated hardware in it for supporting checkerboard rendering that was used significantly in PS4 first party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?
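For readers unfamiliar with the technique: checkerboard rendering shades only half the pixels each frame and reconstructs the rest. A toy sketch of the reconstruction step (illustrative only; the PS4 Pro's hardware also leans on previous-frame and ID-buffer data, which this ignores):

```python
# Toy checkerboard reconstruction: each frame natively shades only the
# pixels whose (x + y) parity matches the frame's parity; the missing
# pixels are filled in, here by averaging shaded horizontal neighbours.

def checkerboard_reconstruct(shaded, frame_parity):
    """shaded: 2D grid where only cells with (x + y) % 2 == frame_parity
    hold real values; the rest are None and get reconstructed."""
    h, w = len(shaded), len(shaded[0])
    out = [row[:] for row in shaded]
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 != frame_parity:   # pixel was not shaded this frame
                neighbours = [shaded[y][nx]
                              for nx in (x - 1, x + 1) if 0 <= nx < w]
                out[y][x] = sum(neighbours) / len(neighbours)
    return out

# Shade only the even-parity cells of a flat 2x4 test "image"
frame = [[10 if (x + y) % 2 == 0 else None for x in range(4)] for y in range(2)]
full = checkerboard_reconstruct(frame, frame_parity=0)
print(full)  # every pixel reconstructs to 10 (filled cells come out as 10.0)
```

Real implementations blend in the previous frame's alternating pixels instead of just averaging neighbours, which is where dedicated hardware support helps.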

InUrFoxHole (13d ago)

Well... it's a coffin, man. So at least 4?

Tacoboto (13d ago)

PSSR in the fall can assume that role.

anast (13d ago)

and those nails need to be replaced annually

Einhander1972 (13d ago)

I'm not sure what point you're trying to make, but the PS4 Pro came before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR 2.

Tacoboto (13d ago)

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack (12d ago)

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, yet in terms of image quality it's way behind what DLSS 3 or FSR 3 is currently offering.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" debate (which has been going around in the PC community since 2019) right out the window.

Einhander1972 (12d ago)

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboarding.

But you're talking about FSR 3, which probably is better than checkerboarding; FSR 3 only started getting games this year, though, so checkerboarding, the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for. Heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic (13d ago)

Tacoboto
Yes... it's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself...
And of course, then there was the idea of checkerboard rendering not being native...
For sure, maybe this tech makes the difference minimal while pixel counting, but alas, it seems performance and "close enough", not native, is what matters now...
I want to see it run native without DLSS. Why not?

RonsonPL (13d ago)

An almost deaf person:
- lightweight, portable $5 speakers of 0.5 cm diameter are the final nail in the coffin of hi-fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either true 500fps on a 500Hz LCD TN or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you'll publish crap like this article, just to go with the flow.

There's no coffin for native-res quality and there never will be. Eventually we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This stuff is only usable for cinematic content, like cutscenes and such. Not for gaming. Beware of ignorants on the internet. TAA is not "native", and the shoddy look of modern games when you disable any TAA is not "native" either, as it's ruined by the developer's design choices: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass on it later. When you disable it, you see a ruined image, horrible pixelation and other visual "glitches", but that is NOT what native would have looked like if you honestly compared the two.

Stay informed.
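The 500fps argument above boils down to frame-budget arithmetic: a fixed-cost upscaling pass consumes a growing share of each frame as the frame rate rises. A quick sketch (the ~0.5 ms pass cost is an assumed round number, not a measured DLSS figure):

```python
# Share of the frame budget consumed by a fixed-time upscaling pass
# at different target frame rates (the 0.5 ms cost is an assumed figure).
def frame_budget_ms(fps):
    return 1000.0 / fps

upscale_cost_ms = 0.5
for fps in (60, 144, 500):
    budget = frame_budget_ms(fps)
    share = upscale_cost_ms / budget
    print(f"{fps:>3} fps: {budget:5.2f} ms/frame, upscaling = {share:.0%} of budget")
# At 500 fps the whole frame budget is only 2 ms, so the same pass that
# is negligible at 60 fps eats a quarter of the budget.
```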

RaidenBlack (12d ago)

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of the time, but it'll slowly narrow to the point of being indistinguishable.
Something similar happened with generative AI like Sora: AI-generated videos were a turd back when they were introduced (the infamous Will Smith eating video), but now look at Sora, generating videos that just look like real life.

Yui_Suzumiya (12d ago)

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple101 (12d ago)

Maybe better to get a budget gaming laptop and link a DualSense to it: a portable console with far better graphics than a Steam Deck, plus a bigger screen, and you can use it for work, etc.


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal169 (16d ago)

Echo sentiment here - I think the way GPUs are going, gaming could be secondary to deep learning. Wonder if the 40 series was the last true generation of GPUs?

Number1TailzFan (16d ago)

No... Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if they can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance, there's always AMD, who are doing OK at the mid range.

Christopher (16d ago)

***or play at just 1080p and 60fps or something***

My over-two-year-old laptop GPU still runs fine. I think this is more a reason why GPUs are shifting priorities elsewhere: the market of new buyers is shrinking as more PC gamers hold off on replacing older, still-working parts that run RT/AI well enough as it is. Not to say there aren't people who still upgrade, but I think the market for having the latest and greatest is smaller than it has been for the past two decades. The problem is we aren't advancing at the rate we were; we're reaching the flattening of that exponential curve. We need another major technological advancement to restart that curve.

D0nkeyBoi (16d ago)

The unremovable ad makes it impossible to read the article.

Tzuno (16d ago, edited)

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing (15d ago)

You also need to consider that NVIDIA is heavily invested in cloud gaming, so they are likely going to make moves to push you into yet another lifelong subscription service.

Kayser81 (15d ago)

NVIDIA will never change their price point until AMD or Intel makes a GPU that is comparable and cheaper.
It happened before in the days of the GTX 280, when they dropped the price from $650 to $450 in a matter of two weeks because of the HD 4870, which was being sold at $380.
