
DirectX 12 tested: An early win for AMD and disappointment for Nvidia

First DX12 gaming benchmark shows R9 290X going toe-to-toe with a GTX 980 Ti.

arstechnica.com
Sir_Simba3171d ago (Edited 3171d ago )

The only reason I don't buy AMD cards is power consumption. Unfortunately, that's an important part of my choice.

antikbaka3171d ago

I don't buy them because my three previous ones were glitchy and I had to exchange them for Nvidia cards.

ABizzel13171d ago

It is what it is.

AMD simply has the better drivers and support for DX12 already, which is kind of surprising considering NVIDIA claims every chance they get that they developed DX12 alongside MS.

My guess is DX12 won't be a big push for NVIDIA until Pascal drops sometime next year. In the meantime, AMD has been working with low-level APIs since their own Mantle dropped, which gives their cards an early boost and could very well bring them some much needed sales.

This article doesn't cover everything: not only do their high-end GPUs gain a fairly large boost, so do their APUs and, more importantly, their CPUs.

Their APUs gain CPU performance rivaling Intel's i3 (something that was only a dream before DX12), and their GPU performance gains around 40%, making APU-based systems an all-around viable option.

But it doesn't end there with CPUs. Their higher-end CPUs, the FX 6300 and FX 8300 series, have JUMPED in performance thanks to DX12 finally using multi-core configs, the low-level API, and core clocks counting for more, all of which AMD CPUs benefit from greatly. It's such an improvement that AMD's $100 - $200 CPUs now rival Intel's $200 - $350 4th-gen Core CPUs. The FX 6350 competes with the i5 4670K at a $100 savings, and the FX 8300 chips are literally half the price of the ~$350 i7 4790K and give it a run for its money. AMD can FINALLY send a jab Intel's way with CPUs that dominate the low and mid range, with an insane price advantage.

The problem, as @Sir_Simba said, is that AMD's TDP, and thus heat, is much higher than Intel's and NVIDIA's. The good news for CPUs and APUs is that AMD is dropping their new 14nm Zen architecture in 2016, which aims to fix their two biggest problems: single-core performance and TDP. Single-core performance is expected to gain up to 40% (which still isn't touching Intel's single cores, but is a much needed gain), while TDP is expected to take a nosedive to 10W for dual-core and 25W for quad-core mobile processors (again, still not the best in the business, but a MUCH needed improvement).

Finally, the new architecture lets more cores be added to chips, which brings us to the most important changes. We may finally see a 6-core APU with 12 GPU cores, if not 8 cores with 16 GPU cores (basically an XBO or PS4 with MUCH better CPU performance). And on the FX side we may very well see FX 10 and FX 12 series CPUs with 10- and 12-core setups.

This is a good time for AMD, and while their competitors won't just sit by and take a loss, it's the first big break they've had in a LONG time. Don't overlook them; DX12 removes a lot of the issues they had with drivers, once again making things a level playing field.

I'm usually Intel / NVIDIA as well, but I'm not counting AMD out on this one, and I might put together an AMD build pretty soon just to have it, and before they think of raising prices due to the performance gains.

pumpactionpimp3171d ago

I remember AMD making a big deal out of what they called Graphics Core Next tech in the 7000 and newer series, touting that with Mantle their GPUs would surpass Nvidia due to the architecture itself. Of course we all laughed then (myself included, and I own AMD cards currently) and dismissed it as the usual AMD big talk. I can't help but wonder if it's the drivers, which AMD is not known for, or the architecture they spoke so highly of in the past.

Either way I hope this at least evens the playing field, so there's actual competition again in the gpu market.

DevilOgreFish3171d ago (Edited 3171d ago )

I have both a Sapphire R9 290 and, more recently, a 970 from EVGA. (Can provide pictures)

A tip for Nvidia users on Windows 10: custom resolutions are all messed up. To work around this, go to the Nvidia Control Panel and open Manage 3D Settings. Enable DSR Factors and check the boxes (you can adjust DSR Smoothness, but I'd just leave it at the default). Then go to Change Resolution and click Apply first before picking your resolution, and you're pretty much done from there.

-------------------

For performance comparison, my R9 290 outperforms my 970 by a few frames. In 4K gaming the R9 290 is quite a bit better than my 970. At 1080p they're pretty much identical. At 1440p they're pretty close, though the R9 290 still leads by a couple of frames.

In TDP my 970 wins; in temperature my Sapphire cooler does a better job than EVGA's, though the card is also much longer because of the extra fan.

In overclocking, Nvidia's drivers take control of the settings. The R9 290 can be manually overclocked, though only to about a 10 percent increase. Water-cooled, it can go a little higher.

For DX12 my R9 290 is a beast, which is only because of how inefficient AMD's DX11 drivers were before. Nvidia's drivers were already efficient to begin with. The 970 is also a beast with DX12, picking up a few extra frames.

Besides that, the 970 is a quality graphics card, and with DX12 it's very satisfying. You won't have any issues at 1080p or 1440p, and in some modern games even 4K. With DX12, AMD has met and exceeded expectations with their graphics cards. The performance boost makes their budget high-end GPUs very tempting to own. The competition is strong with this one.

I also can't wait to test cross-matching GPUs! That's the main reason I got both GPUs in the first place.

assdan3171d ago

You mean performing almost exactly the same per watt used? Seriously, keep up to date on tech before looking like an idiot.

DevilOgreFish3171d ago (Edited 3171d ago )

@ assdan

Are you talking to me? Type "@" and the user's name, it helps.

HeavenFall3171d ago

I bet for the amount of money you save from getting an AMD card instead of nVidia, you can afford the electricity bill, and then some.

ABizzel13171d ago

@HeavenFall

According to electric company estimates, running a 200W desktop computer 2 hours per day for an entire month only adds about $1 to your monthly bill, so it's going to take several years before that AMD card's extra power costs make up the NVIDIA price difference, and by then it's about time for a replacement anyway.
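
If you want to sanity-check that, here's a rough back-of-the-envelope sketch; the 200W draw, 2 hours a day and the ~$0.13/kWh rate are just assumptions, so plug in your own numbers:

```python
# Rough monthly electricity cost of a gaming PC (all inputs are assumptions).
watts = 200          # assumed average draw while gaming
hours_per_day = 2    # assumed daily gaming time
rate_per_kwh = 0.13  # assumed electricity price in $/kWh

kwh_per_month = watts / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * rate_per_kwh
print(f"{kwh_per_month:.1f} kWh/month ≈ ${cost_per_month:.2f}/month")
# -> 12.0 kWh/month ≈ $1.56/month
```

Even doubling the card's power draw only moves that by a dollar or two a month.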

spoonard3171d ago

Yeah, that extra few dollars a year power bill would really break the bank!

Tzuno3170d ago

Do you know that in some countries electricity costs like a mofo?

babadivad3171d ago

I have an AMD GPU now but I'm looking hard at NVIDIA's Pascal GPU. If AMD's next GPU is still using the ancient GCN architecture *with just a die shrink*, then I'm definitely jumping ship.

ABizzel13171d ago

The die shrink would offer a good amount of performance increase... that being said, I'm looking at Pascal as well, but then again Volta is supposed to drop in 2018 and be significantly better still, and I try my best not to get into the annual / bi-annual GPU upgrade cycle.

I might have to give in -_-

Tzuno3170d ago (Edited 3170d ago )

Glad to hear there are some people who care about power consumption. I'm in the same boat; even though I own an AMD card, the one I have only draws 55W. You can't ignore TDP when you see Nvidia getting more performance at a lower TDP while AMD is lazy in that department. AMD needs to get its head out of its ass and recognize that this is their main problem. I plan to buy a GTX 750 Ti.

traumadisaster3171d ago

I would consider going AMD if their FreeSync gave me a big enough advantage for the money, maybe 15 fps, I'm not sure.

I don't have a G-Sync monitor, but I wonder if there isn't value there. If I could gain 15 or so fps just from a monitor, that would be nice.

I've had a Titan for years and several 4K TVs, but I'd like to see an affordable adaptive sync solution in a 4K TV. With all else being nearly equal, that option would push me in one direction for my next setup.

TurboGamer3171d ago

FreeSync and G-Sync don't give you more fps. Both solutions simply refresh the screen when the GPU presents a new frame instead of refreshing at a preset Hz.
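
A minimal sketch of the difference, just to illustrate the idea (this is toy pseudologic, not how any driver or display actually implements it):

```python
import random

def frame_times(n, avg_ms=22.0):
    """Simulate GPU frame times fluctuating around ~45 fps."""
    return [random.uniform(avg_ms - 6, avg_ms + 6) for _ in range(n)]

def fixed_refresh(frames, hz=60):
    """Fixed-Hz display: a finished frame waits for the next refresh tick."""
    tick = 1000.0 / hz
    shown, t = [], 0.0
    for ft in frames:
        t += ft
        shown.append((t // tick + 1) * tick)  # displayed at the next refresh boundary
    return shown

def variable_refresh(frames):
    """FreeSync / G-Sync style: the display refreshes the moment a frame is ready."""
    shown, t = [], 0.0
    for ft in frames:
        t += ft
        shown.append(t)  # no waiting for a tick; the fps itself is unchanged
    return shown

f = frame_times(5)
print("fixed   :", [round(x, 1) for x in fixed_refresh(f)])
print("variable:", [round(x, 1) for x in variable_refresh(f)])
```

Same number of frames either way; the variable-refresh display just shows each one sooner and more evenly, which is why it feels smoother without raising fps.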

traumadisaster3171d ago

I suppose I was thinking that if you could turn vsync off in game or in the NV control panel, that would free up resources that could then go to fps.

JsonHenry3171d ago

I don't think you understand what Free-sync/G-sync do, Traumadisaster..

traumadisaster3171d ago

See above and please explain what I'm missing. Thanks

lemoncake3171d ago

I just recently got a 4K G-Sync monitor. It doesn't give you more fps, but when your fps starts to fluctuate you don't notice it at all. It's an amazing bit of tech and a must-have tbh, especially for high-res gaming, much better than using vsync.

JsonHenry3171d ago

There is no noticeable difference with it on or off in terms of FPS gain/loss. I don't think I could say it any more simply. - http://www.anandtech.com/sh...

SteamPowered3171d ago

I like the extra features GeForce brings to the table. This benchmark is from the first game made with DX12, and it's not even the final version. Team green has engineers working round the clock on new drivers.

I'm willing to wait and see.

BeefCurtains3171d ago

I hear you. If you read the whole article, there was a lot of data missing and incomplete testing done too. I'm not knocking the huge boost AMD already showed (it was massive), but I wouldn't say the results for Nvidia are anywhere near final.

xTheMercenary_3171d ago

I'll put all my money on something being wrong with the drivers when working with DX12. Like you said, NVIDIA definitely would have their engineers working to find and fix this issue. Here's hoping it's a driver issue; it makes absolutely no sense to go from DX11 to DX12 and see a reduction in performance, something is not right here.

Death3171d ago

Way too early to give a win to anyone or fuel the Nvidia/AMD fanboy wars. Let's wait for optimization and games before declaring a winner. From the article, the only thing DX12 accomplished was raising AMD GPUs to Nvidia levels. I hope DX12 does more than that.

Angeljuice3171d ago

Well it actually accomplished much more.

AMD have a much older, weaker GPU suddenly competing on equal terms with a much more powerful (and more expensive) Nvidia card.

AMD have been producing strong hardware for years, but they have been hampered by a lack of support for their tech solutions. DX12 looks as if it is going to remedy this problem for AMD and draw out the best from their hardware (about time too).

lemoncake3171d ago

Nvidia does great driver support, so I expect them to be working overtime enhancing DX12 on their cards.


Make your next GPU upgrade AMD as these latest-gen Radeon cards receive a special promotion

AMD has long been the best value option if you're looking for a new GPU. Now even their latest Radeon RX 7000 series is getting cheaper.

Father__Merrin3d ago

Best for the money are the Arc cards.

just_looken3d ago

In the past, yes, but last-gen AMD has gotten cheaper, and their new cards are on the horizon, making the 6000 series even cheaper.

The Arc cards are no longer made by Intel itself, but ASUS/ASRock have some, and the next line, Battlemage, is coming out with prices TBD.

Due to the longer software development, it's always best to go AMD over Intel if it's not too much more money, even though Intel makes a strong GPU; I own 2 of the 4 card versions.


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan13d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS and AI in general will lower the power draw. It seems like the days of just adding more VRAM and horsepower are over. Law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more and more.

Tapani13d ago (Edited 13d ago )

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating don't tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPUs of 10 years ago. Plus, today you can undervolt and OC GPUs by a good margin to keep stock performance while using 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.
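
As a rough sketch of why that's such a win in efficiency terms (the 450W figure and the 90%-at-320W figure are just the numbers quoted above, not measurements):

```python
# Back-of-the-envelope perf-per-watt comparison for a power-limited 4090.
stock_perf, stock_watts = 1.00, 450      # stock performance (normalized) and rated board power
limited_perf, limited_watts = 0.90, 320  # assumed ~90% of performance at a 320W limit

stock_eff = stock_perf / stock_watts
limited_eff = limited_perf / limited_watts
print(f"stock:   {stock_eff * 1000:.2f} perf per kW")
print(f"limited: {limited_eff * 1000:.2f} perf per kW "
      f"(~{limited_eff / stock_eff - 1:.0%} better perf/W for a {1 - limited_perf:.0%} perf loss)")
```

Roughly a quarter better performance per watt in exchange for giving up a tenth of the frames.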

However, in today's world chip manufacturing is limited by physics, and we will see power increases over at least the next 5-10 years to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech coming to market that we have yet to invent, or perhaps we can solve existing technologies' problems with manufacturing or cost of production.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion, solar, etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that is a pipedream.

MrBaskerville12d ago

I don't think fusion is the way forward. It will most likely be too late by the time it's finally ready, meaning it will probably never be ready. Something else might arrive before then, though, and then it becomes viable.

Firebird36012d ago

We need to stop the smear campaign against nuclear energy.
We could power everything forever if we wanted to.

Tacoboto13d ago

PS4 Pro had dedicated hardware in it for supporting checkerboard rendering, which was used significantly in PS4 first-party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?

InUrFoxHole13d ago

Well... it's a coffin, man. So at least 4?

Tacoboto13d ago

PSSR in the fall can assume that role.

anast13d ago

and those nails need to be replaced annually

Einhander197213d ago

I'm not sure what the point you're trying to make is, but PS4 Pro was before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR 2.

Tacoboto13d ago

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack12d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, though in terms of image quality it's way behind what DLSS 3 or FSR 3 currently offer.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" debate (that's been going around in the PC community since 2019) right out of the window.
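
To put rough numbers on why upscaling is so attractive in the first place (the internal resolutions below are just the commonly cited Quality/Performance presets, treated as assumptions, not anything from the article):

```python
# Pixels shaded per frame at 4K output for common upscaler input resolutions.
output_w, output_h = 3840, 2160
presets = {
    "native 4K":                        (3840, 2160),
    "Quality preset (1440p input)":     (2560, 1440),
    "Performance preset (1080p input)": (1920, 1080),
}

native_pixels = output_w * output_h
for name, (w, h) in presets.items():
    pixels = w * h
    print(f"{name:34s} {pixels / 1e6:5.2f} MPix  ({pixels / native_pixels:.0%} of native pixel count)")
```

If the reconstructed image is nearly indistinguishable from native, that 2-4x cut in shading work is effectively free performance.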

Einhander197212d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboard.

But you're talking about FSR 3, which probably is better than checkerboard; FSR 3 has only started getting games this year, though, so checkerboard, which was the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for; heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic12d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself...
And of course, then came the idea of checkerboard rendering not being native...
Sure, maybe this tech keeps the differences minimal when pixel counting, but alas, it seems "performant and close enough", rather than native, is what matters now...
I want to see it run native without DLSS... why not?

RonsonPL13d ago

Almost deaf person:
- lightweight, portable $5 speakers with 0.5cm drivers are the final nail in the coffin of Hi-Fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either true 500fps on a 500Hz TN LCD or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article just to go with the current.

There's no coffin for native-res quality and there never will be. Eventually we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorants on the internet. TAA is not "native", and the awful look of modern games when you disable any TAA is not "native" either, since it's ruined by the developers' design choices: you can cheat by rendering every 4th pixel when you plan to smear a TAA pass over it later. When you disable it you'll see a ruined image, horrible pixelation and other visual "glitches", but that is NOT what native would have looked like if you honestly compared the two.

Stay informed.
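
For what it's worth, the frame-budget arithmetic behind the 500fps point is easy to check; the ~1 ms cost of an upscaling pass below is an assumption for illustration, not a measured number:

```python
# Frame-time budget vs. a fixed per-frame upscaling cost (the cost is an assumption).
upscale_ms = 1.0  # assumed per-frame cost of an AI upscaling pass, in milliseconds

for fps in (60, 120, 240, 500):
    budget_ms = 1000 / fps
    print(f"{fps:3d} fps -> {budget_ms:5.2f} ms per frame; "
          f"a {upscale_ms:.1f} ms upscale pass eats {upscale_ms / budget_ms:.0%} of the budget")
```

The same fixed cost that's a rounding error at 60fps becomes half the entire frame budget at 500fps.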

RaidenBlack12d ago

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of the time, but it'll slowly narrow to the point where it's indistinguishable.
Something similar happened with generative AI like Sora... AI-generated videos were a turd back when they were introduced (the infamous Will Smith eating video), but now look at Sora, generating videos that look like real life.

Yui_Suzumiya12d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple10112d ago

Maybe better to get a budget gaming laptop and link a DualSense to it.

= a portable console with far better graphics than a Steam Deck, plus a bigger screen, and you can use it for work, etc.


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal16916d ago

I echo the sentiment here - I think the way GPUs are going, gaming could become secondary to deep learning. I wonder if the 40 series was the last true generation of gaming GPUs?

Number1TailzFan16d ago

No... Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if you can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance, there's always AMD, which is doing OK at the mid range.

Christopher16d ago

***or play at just 1080p and 60fps or something***

My over-two-year-old laptop GPU still runs fine. I think this is more a reason why GPUs are being prioritized for other things: the market of new buyers is shrinking as more PC gamers put off replacing older, still-working parts that handle RT/AI well enough as it is. Not to say there aren't people who still chase the latest and greatest, but I think that market is smaller than it has been over the past two decades. The problem is we aren't advancing at the rate we were; we're reaching the flattening of that exponential curve. We need another major technological breakthrough to restart that curve.

D0nkeyBoi16d ago

The unremovable ad makes it impossible to read the article.

Tzuno15d ago (Edited 15d ago )

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing15d ago

You also need to consider that NVIDIA is heavily invested in cloud gaming, so they are likely going to make moves to push you into yet another lifetime subscription service.

Kayser8115d ago

NVIDIA will never change their price point until AMD or Intel makes a GPU that is comparable and cheaper than theirs.
It happened before in the days of the GTX 280, when they dropped the price from $650 to $450 in a matter of two weeks because of the Radeon HD 4870, which was being sold at $380.
