Why 'Watch Dogs' Is Bad News For AMD Users -- And Potentially The Entire PC Gaming Ecosystem

Forbes.com’s Jason Evangelho has a fascinating story on the disappointment that owners of AMD graphics cards are about to feel when they discover that Watch Dogs is largely unoptimized for their system.

NYC_Gamer (3679d ago)

Nvidia isn't wrong for not going the shareable route with its tools and features.

starchild (3679d ago)

True. It's no different from Sony or Microsoft not sharing their exclusives, features, or tools with each other.

UltraNova (3678d ago, edited)

Who said anything about sharing proprietary code?

It's the developers' fault; their damn release-date windows are to blame here!

If they wanted to partner with Nvidia and its GameWorks program, that's their right, but it's also their responsibility, when they optimize their game for Nvidia products, to sit down with AMD (in parallel, if possible) and optimize their code for its products as well. They have to do this since they cater to both sides of the fence. Plus, they're expecting sales from AMD users, right? Incidentally, 40% of the market!

After reading the whole article and their benchmarking tests, it's obvious that not only is WD an unoptimized mess in general, but AMD users are screwed big time. And don't get me started on the non-existent CrossFire and SLI support.

That's disrespectful to AMD customers, and to customers in general, to say the least. And all this because they didn't have the balls to delay the game a few more months and deliver something actually finished. I would have respected that: shut up and wait for it!

I'm officially putting this game on the back burner until they fix it.

Born2Game8 (3678d ago)

You're comparing two entirely different console manufacturers to the PC. It's not the same thing.

ProjectVulcan (3678d ago, edited)

It's a poor show that Ubisoft has screwed AMD users. However, this is why a lot of people will pay a bit more for Nvidia hardware: I daresay this happens less frequently for the green team.

The problem is that Nvidia is still considered the benchmark by most developers working on PC; they generally prefer to use Nvidia hardware for testing. Think back to Microsoft demonstrating its early Xbox One titles... on Nvidia-powered PCs. This despite the console itself being entirely AMD-powered!

TWIMTBP ("The Way It's Meant to Be Played") was a whole bunch of Nvidia engineers working hard to improve PC gaming and game support on Nvidia hardware. Many criticise it for allegedly biasing developers against AMD, but if Nvidia is going to provide all that support at its own expense, you can see exactly why many developers would take advantage of it.

AMD's counter to this Nvidia favouritism is mainly to pay to be associated with big titles and to support those titles itself, for example Battlefield 4 recently.

Ubisoft is at fault. But it's still really in AMD's interest to chase down the developers of really big titles and try to ensure game compatibility.

Giul_Xainx (3678d ago)

I bought this game for PC. I have an ATI Radeon 5750...

I also own a PS3 and want a PS4... I think I know what my choice is going to be from now on: buy a console and stick with it.

frostypants (3678d ago, edited)

No, it's totally different. PC is a largely open platform. Consoles are not.

Having gamers worry about whether or not their video card's proprietary crap is supported is a huge problem. It's not like people are going to install 2 cards.

mrmarx (3678d ago)

AMD is the Xbone of the CPU world... weak.

Jonny5isalive (3678d ago)

Yeah, and they also make the APU in the PS4. What a bunch of slow crap, huh?

brads4 (3678d ago)

The PS4 is built with AMD components. People are so stupid.

FlyingFoxy (3678d ago, edited)

That's why the R9 290 was so much cheaper than Nvidia's overpriced cards and forced them to drop prices a lot on the 780?

AMD is the one keeping prices in check; if it weren't for them, Nvidia would rip us off even more than it does now. And AMD's GPUs are not "weak" in the least.

3-4-5 (3678d ago)

* Basically, people WON'T buy this game because they have an AMD graphics card.

* Devs will take note of that in the future and make sure games are compatible and run well on AMD, and it will affect Nvidia's future graphics card sales.

^ Not a TON, or a lot, but there will be some cause and effect from this.

ITPython (3678d ago, edited)

One of these days I plan to build my own high-end PC with all the bells and whistles, but it's stuff like this that makes me want to stick to consoles for my gaming needs.

A console may not have the power of a PC, but it sure is a heck of a lot more convenient knowing I don't have to mess with it or worry about games not playing correctly. And if there are bugs and issues, they usually get resolved pretty quickly, because if one person has a problem, everybody likely has the exact same issue. Whereas with PC gaming and its insane variety of hardware, OSes, and software, if one person has a problem it's likely local to them, and the devs won't bother to look into it unless it's a large-scale problem affecting the majority of PC gamers.

Plus, I'm a bit OCD when it comes to my PC's performance, and I get really irritated when it doesn't perform as expected; I sometimes spend hours tweaking settings and troubleshooting. So if I'm playing a game and, say, the frame rate gets a bit iffy or there are some performance issues, I'll probably spend more time messing with the game's settings and my computer's settings than enjoying the game. With consoles, if there's a performance issue there's nothing I can do about it, so I don't let it get in the way of enjoying the game; it's out of my hands.

sourav93 (3679d ago)

Those GTX 770 benchmark numbers make me happy. Not because they're better than the 290X numbers (shame on you, Ubisoft!), but because it means I'll be able to run the game decently on my GTX 770. Hopefully Ubisoft fixes this AMD issue, as I've got a few friends who use AMD cards, and I wanna play WD online with them without their game getting screwed up all the time.

starchild (3679d ago)

I can confirm that it runs well on my GTX 770. It's a demanding game, but it's doing a lot, and I think it looks great.

choujij (3678d ago)

I tested it on a GTX 770 with a 4770K processor at 1080p, and it often stutters and lags on ultra settings.

adorie (3678d ago)

How much VRAM do you have?

uso (3678d ago)

I have an AMD R9 280X and I can run the game at 1080p on ultra, at 40 to 50 fps.

Kayant (3679d ago)

"Hopefully Ubisoft fixes this AMD issue" - They can never fix it; neither Ubisoft nor AMD knows what's going on with the code running in the GameWorks library. It's a black box to everyone apart from AMD; that's the problem with this. When a 290X, which goes toe to toe with a Titan in non-GameWorks-optimized games and keeps up well enough to beat it at times, can't beat a 770, you know something is very wrong.

Kayant (3678d ago)

"It's a black box to everyone apart from AMD" - Meant to say Nvidia there :p

ginsunuva (3678d ago)

You thought a 770 would have trouble with this game?

raWfodog (3678d ago)

According to choujij (#3.1.1), his GTX 770 is having issues.

duplissi (3678d ago)

I was thinking that it was running somewhat slower than I imagined it would. Regardless, it runs well enough on my 290X. I imagine there will be a game update and a driver update soon that will correct this.

flozn (3678d ago)

Keep in mind that this ONLY applies to the 4GB GTX 770.

sourav93 (3678d ago)

The only difference between the 4GB and 2GB 770 would be the texture settings; everything else will be identical.

Are_The_MaDNess (3679d ago)

You get what you pay for, I guess. I'm glad that Nvidia patches games before release and teams up with so many devs to support its cards and exclusive features. I don't think I'll switch from Nvidia any time soon.

VJGenova (3679d ago, edited)

Did you read the article? It stated that the $500 R9 290X didn't run as well as the $300 GTX 770... Not sure what you mean...