720°

Why 'Watch Dogs' Is Bad News For AMD Users -- And Potentially The Entire PC Gaming Ecosystem

Forbes.com’s Jason Evangelho has a fascinating story on the disappointment that owners of AMD graphics cards are about to feel when they discover that Watch Dogs is largely unoptimized for their system.

NYC_Gamer4035d ago

Nvidia aren't wrong for not going the shareable route when it comes to their tools/features.

starchild4035d ago

True. It's no different than Sony or Microsoft not sharing their exclusives, features, or tools with each other.

UltraNova4035d ago (Edited 4035d ago )

Who said anything about sharing proprietary code?

It's the developers and their damn release date windows that are to blame here!

If they wanted to partner up with Nvidia and their GameWorks program, that's their right, but it's also their responsibility, when they optimize their game for Nvidia products, to sit down (if possible in parallel) with AMD and optimize their code for AMD's products as well. They have to do this since they cater to both sides of the fence. Plus, they're expecting sales from AMD users, right? Incidentally, 40% of the market!

After reading the whole article and their benchmarking tests, it's obvious that not only is WD an unoptimized mess in general, but AMD users are screwed big time. And don't get me started on Crossfire's and SLI's non-existent support.

That's disrespectful to AMD customers, and customers in general, to say the least. And all this because they didn't have the balls to delay the game a few more months and deliver something actually finished. I would have respected that. Shut up and wait for it!

I'm officially putting this game on the back burner until they fix it.

Born2Game834035d ago

You are talking about two entirely different console manufacturers vs PC. It's not the same thing.

ProjectVulcan4035d ago (Edited 4035d ago )

It's a poor show that Ubisoft have screwed AMD users, but this is why a lot of people will pay a bit more for Nvidia hardware: I daresay this happens less frequently for the green team.

The problem is that Nvidia is still considered the benchmark by most developers working on PC; they generally prefer to use Nvidia hardware in testing. Think back to Microsoft demonstrating their early Xbox One titles... on Nvidia-powered PCs. This despite the console itself being entirely AMD-powered!

TWIMTBP ("The Way It's Meant To Be Played") was a whole bunch of Nvidia engineers working hard to improve PC gaming and games support on Nvidia hardware. Many criticise it for allegedly biasing developers against AMD, but if Nvidia are going to provide all that support at their own expense, you can see exactly why many developers would take advantage of it.

AMD's counter to this Nvidia favouritism is mainly to pay to be associated with big titles and to support such titles themselves, Battlefield 4 being a recent example.

Ubisoft are at fault. But it's still really in AMD's interests to chase down developers of really big titles to try to ensure game compatibility.

Giul_Xainx4034d ago

I bought this game for PC. I have an ATI Radeon 5750....

I also own a PS3 and want a PS4.... I think I know what my choice is going to be from now on: buy a console and stick with it.

frostypants4034d ago (Edited 4034d ago )

No, it's totally different. PC is a largely open platform. Consoles are not.

Having gamers worry about whether or not their video card's proprietary crap is supported is a huge problem. It's not like people are going to install two cards.

mrmarx4035d ago

AMD is the Xbone of the CPU world... weak.

Jonny5isalive4035d ago

Yeah, and they also make the APU in the PS4. What a bunch of slow crap, huh?

brads44035d ago

The PS4 is built with AMD components. People are so stupid.

FlyingFoxy4034d ago (Edited 4034d ago )

That's why the R9 290 was so much cheaper than Nvidia's overpriced cards and forced them to drop prices a lot on the 780?

AMD are the ones keeping prices in check; if it weren't for them, Nvidia would keep ripping us off even more than they do now. And they are not "weak" in the least with their GPUs.

3-4-54034d ago

* Basically people WON'T buy this game because they have an AMD graphics card.

* Devs will take note of that in the future and make sure games are compatible and run well on AMD, and it will affect Nvidia's graphics card sales down the line.

^ Not a TON, or a lot, but there will be some cause and effect from this.

ITPython4034d ago (Edited 4034d ago )

One of these days I plan to build my own high-end PC with all the bells and whistles, but it's stuff like this that makes me want to stick to consoles for my gaming needs.

It may not have the power of a PC, but it sure is a heck of a lot more convenient knowing I don't have to mess with my console or worry about games not playing correctly. And if there are bugs and issues, they usually get resolved pretty quickly, because if one person has a problem, everybody likely has the exact same issue. Whereas with PC gaming and the insane variety of hardware, OSes, and software, if one person has a problem it's likely local to them, and the devs won't bother to look into it unless it's a large-scale problem affecting the majority of PC gamers.

Plus I'm a bit OCD when it comes to my PC's performance, and I get real irritated when it doesn't perform as expected; I sometimes spend hours tweaking settings and troubleshooting. So if I'm playing a game and, say, the frame rate gets a bit iffy or there are some performance issues, I'll probably spend more time messing with the game's settings and my computer's settings than enjoying the game. With consoles, if there's a performance issue, there's nothing I can do about it, so I don't let it get in the way of enjoying the game; it's out of my hands.

sourav934035d ago

Those GTX 770 benchmark numbers make me happy. Not because they're better than the 290X numbers (shame on you, Ubisoft!), but because it means I'll be able to run the game decently (GTX 770). Hopefully Ubisoft fixes this AMD issue, as I've got a few friends who use AMD cards, and I wanna play WD online with them without their game getting screwed up all the time.

starchild4035d ago

I can confirm that it runs well on my gtx 770. It's a demanding game but it's doing a lot and I think it looks great.

choujij4035d ago

I tested it on a GTX 770 with a 4770K processor at 1080p, and it often stutters and lags on ultra settings.

adorie4035d ago

How much VRAM do you have?

uso4035d ago

I have an AMD R9 280X and I can run the game at 1080p on ultra with 40 to 50 fps.

Kayant4035d ago

"Hopefully Ubisoft fixes this AMD issue" - They can never fix it. Neither Ubisoft nor AMD knows what is going on with the code running in the GameWorks library. It's a black box to everyone apart from AMD; that's the problem with this. When a 290X that goes toe to toe with a Titan, and keeps up quite well to beating it at times in non-GameWorks-optimized games, can't beat a 770, you know something is very wrong.

Kayant4035d ago

"It's a black box to everyone apart from AMD" - Meant to say Nvidia there :p.

ginsunuva4035d ago

You thought a 770 would have trouble with this game?

raWfodog4035d ago

According to choujij (#3.1.1), his gtx770 is having issues.

duplissi4035d ago

I was thinking it was running somewhat slower than I imagined it would. Regardless, it runs well enough on my 290X. I imagine a game update and a driver update will come soon to correct this.

flozn4034d ago

Keep in mind that this ONLY applies to the 4GB GTX 770.

sourav934034d ago

The only difference between the 4GB and 2GB 770 would be the texture settings. Everything else will be identical.

Are_The_MaDNess4035d ago

You get what you pay for, I guess.
I'm glad Nvidia is patching for games before release and teaming up with so many devs to support their cards and exclusive features. Don't think I'll change from NV any time soon.

VJGenova4035d ago (Edited 4035d ago )

Did you read the article? It stated that the $500 R9 290X didn't run as well as the $300 GTX 770... Not sure what you mean...

Ogygian4034d ago

Brand loyalty is only going to let Nvidia keep getting away with ludicrously high prices for their cards in the future.

lets_go_gunners4035d ago

It's one game... no need to overreact. DICE and other devs still support Mantle, so AMD will stay relevant going forward whether people want to believe it or not.

70°

NVIDIA Smooth Motion: Up to 70% More FPS Using Driver Level Frame Gen on RTX 50 GPUs

NVIDIA's RTX 50 "Blackwell" architecture has been a bit of a bore for us gamers. Apart from Multi Frame Generation, which has limited use-case scenarios, there isn't much to be excited about. It is achieved using GPU-side Flip Metering; the optical flow data is generated using AI models on the Tensor cores.
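For readers unfamiliar with frame generation, here is a toy sketch of the core idea: synthesizing an intermediate frame between two rendered frames. This is not NVIDIA's actual algorithm (the real thing uses motion/optical-flow data and AI models, and frames are images, not short lists); the linear blend below is just the simplest possible stand-in.

```python
# Toy illustration of frame generation (NOT NVIDIA's actual algorithm):
# synthesize an intermediate frame between two rendered frames.
# Real implementations use optical-flow/motion data and AI models;
# this sketch just linearly blends pixel intensities as a stand-in.

def interpolate_frame(frame_a, frame_b, t=0.5):
    """Blend two frames (lists of pixel intensities) at time t in [0, 1]."""
    return [a + (b - a) * t for a, b in zip(frame_a, frame_b)]

rendered_0 = [0, 20, 40]    # frame rendered at time 0
rendered_1 = [100, 20, 0]   # frame rendered at time 1
generated = interpolate_frame(rendered_0, rendered_1)  # inserted between them
print(generated)  # [50.0, 20.0, 20.0]
```

Doubling (or quadrupling) the displayed frame rate this way is why pacing hardware like Flip Metering matters: the generated frames must be presented at evenly spaced intervals or the result looks worse than the lower native frame rate.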

Read Full Story >>
pcoptimizedsettings.com
60°

PNY NVIDIA GeForce RTX 5060 Ti GPU Review

Between the price, performance and power draw, with the GeForce RTX 5060 Ti, NVIDIA nailed the mainstream formula.

Read Full Story >>
cgmagonline.com
57d ago
230°

Nintendo Switch 2 Leveled Up With NVIDIA AI-Powered DLSS and 4K Gaming

Nvidia writes:

The Nintendo Switch 2 takes performance to the next level, powered by a custom NVIDIA processor featuring an NVIDIA GPU with dedicated RT Cores and Tensor Cores for stunning visuals and AI-driven enhancements.

Read Full Story >>
blogs.nvidia.com
ZycoFox70d ago

The raytracing probably doesn't even equal a low-end PC GPU's, and even if it did, it would probably be mostly useless. They'll probably force it into some game that will run like shit, maybe 30fps at best, just because "it can do it".

B5R70d ago

Raytracing is so unnecessary for a handheld. I just hope you can turn it off.

Vits69d ago

A lot of gamers don’t realize that ray tracing isn’t really about making games look better. It’s mainly there to make development easier and cheaper, since it lets devs skip a bunch of old-school tricks to fake reflections and lighting. The visual upgrade is just a nice bonus, but that’s not the main reason the tech exists.

So you can be 100% sure that developers will try to implement it every chance they get.
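To make the "old-school tricks vs. ray tracing" contrast above concrete, here is a rough sketch with a hypothetical toy scene (all names and values invented for illustration). The baked approach answers lighting queries from a precomputed table; the traced approach shoots a shadow ray toward the light at runtime and checks for occlusion, no offline baking step required.

```python
# Toy contrast (hypothetical scene, illustrative only):
# baked lighting (old-school trick) vs. a runtime shadow-ray query.
import math

LIGHT = (0.0, 10.0, 0.0)            # point light position
OCCLUDER = ((0.0, 5.0, 0.0), 1.0)   # sphere (center, radius) between light and floor

# Old-school trick: lighting is precomputed ("baked") into a texture offline.
BAKED_LIGHTMAP = {(0, 0): 0.2, (1, 0): 0.9}  # texel -> precomputed brightness

def baked_lighting(texel):
    return BAKED_LIGHTMAP[texel]  # just a lookup; all the work happened offline

# Ray-traced approach: trace a shadow ray toward the light and test occlusion.
def ray_hits_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t > 0.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    return disc >= 0 and -b - math.sqrt(max(disc, 0.0)) > 0

def traced_lighting(point):
    to_light = tuple(l - p for l, p in zip(LIGHT, point))
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = tuple(v / dist for v in to_light)
    shadowed = ray_hits_sphere(point, direction, *OCCLUDER)
    return 0.0 if shadowed else 1.0  # decided at runtime, nothing baked

print(baked_lighting((0, 0)))            # 0.2 (answer fixed at bake time)
print(traced_lighting((0.0, 0.0, 0.0)))  # 0.0 (under the sphere -> shadowed)
print(traced_lighting((5.0, 0.0, 0.0)))  # 1.0 (off to the side -> lit)
```

The baked version is cheap but frozen: move the sphere and the lightmap is wrong until someone re-bakes it. The traced version reacts to the scene as it is, which is exactly why it saves developers the baking/faking workflow.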

RaidenBlack69d ago (Edited 69d ago )

Agree with Vits.... but to add: if devs and designers just implement RT in a game world, it won't always work as expected. RT is not just reflections but lighting and illumination as well. For example, if you create a room with minimal windows, it will look dark af if RTGI is enabled. Devs and designers need to sort out the game world design accordingly.
DF's Metro Exodus RT upgrade is an amazing reference video to go through, if anybody's interested.

darthv7269d ago

So is HDR... but they have it anyway.

thesoftware73069d ago

Some PS5 and SX games run at 30fps with RT...just like those systems, if you don't like it, turn it off.

I only say this to say, you make it seem like a problem exclusive to the Switch 2.

Neonridr69d ago (Edited 69d ago )

sour grapes much?

"It probably doesn't do it well because it's Nintendo and they suck". That's how your comment reads. Why don't you just wait and see before making these ridiculous statements?

Goodguy0170d ago

Please. I'd like to play my switch games on my 4k tv without it looking all doodoo.

PRIMORDUS70d ago

Nvidia could have said this months ago and cut the bullshit. Anyway the rumors were true.

Profchaos70d ago

Would have been nice, but an NDA likely prevented them from saying anything.

PRIMORDUS69d ago

TBH I don't think Nvidia would have cared if they broke the NDA. A little fine they'd pay, and then back to their AI shit. They don't even care about GPUs anymore. I myself would like them to leave the PC and console market.

Tacoboto69d ago

This story was written half a decade ago when the world knew Nvidia would provide the chip for Switch 2 and DLSS was taking off.

Profchaos69d ago

Yeah, but a similar thing happened a long time ago: 3dfx announced they were working with Sega when they took their company public, and in response Sega terminated the Dreamcast GPU contract and went with an ultimately weaker chipset.

So there's precedent, but it's not like Nintendo had much of an option; it's AMD, NVIDIA or Intel.

Profchaos70d ago

I'm not expecting anything from ray tracing, but DLSS will be the thing that sees the unit get some impossible ports.

andy8570d ago

Correct. All I'm seeing online is that it'll never run FF7 Rebirth. If it can run Cyberpunk, it'll run it. The DLSS will help. Obviously only 30fps, but a lot don't care.

Profchaos69d ago (Edited 69d ago )

Exactly right. When I buy a game on Switch I know what I'm getting into: I'm buying it for its portability, and I'm willing to sacrifice fidelity and performance to play on a train or comfortably from a hotel room when I travel for work.
