
ATI ‘cheating’ benchmarks and degrading game quality, says NVIDIA

Justin Robinson of Atomic: MPC writes "From our testing it's clear that ATI's implementation of FP16 Demotion does affect image quality in games, as charged by NVIDIA. However, the extent and prevalence of it is not universal - of the four games tested, only one showed any real visible impact. On the other side of the coin are performance improvements, which are plentiful in two of the four games: boosting performance by 17 per cent at high resolutions, and a still-impressive 10 per cent at lower resolutions."

Read Full Story >>
atomicmpc.com.au
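For the curious, here's a rough idea of what the controversy is about. This is a toy sketch, nothing to do with ATI's actual driver code: "FP16 demotion" swaps an FP16 render target (10 mantissa bits per channel) for the smaller R11G11B10 format (6 mantissa bits for red/green, 5 for blue), saving bandwidth at the cost of colour precision.

```python
# Toy model of the precision loss: round a value to a float with a
# given number of mantissa bits. Exponent-range limits are ignored
# for simplicity; this only illustrates the mantissa trade-off.
import math

def quantize(x, mant_bits):
    """Round x to the nearest float with `mant_bits` mantissa bits."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)              # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2 ** mant_bits
    return math.ldexp(round(m * scale) / scale, e)

value = 0.333                          # some HDR colour intensity
fp16_like = quantize(value, 10)        # FP16 channel: 10 mantissa bits
r11_like = quantize(value, 6)          # R11 channel: 6 mantissa bits
```

The 6-bit result lands visibly further from the original value than the 10-bit one, which is exactly the kind of banding testers went looking for.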
jack who4970d ago

this is like m$ saying sony is a 2-faced liar....oh wait

zootang4970d ago (Edited 4970d ago )

Would it not be more like Sony PS3 (Nvidia) calling out Microsoft Xbox 360 (ATI)?

golsilva4970d ago

so which of the two is better, ati or nvidia, in terms of sales and/or quality?

nnotdead4970d ago

Nvidia tends to be the bigger seller, but both companies make quality cards. ATI tends to give you better bang for the buck, while Nvidia has better overall performance - though this is not always the case. A much better idea is to start with the price you want to spend, then look up reviews for cards in that range.

Pandamobile4970d ago

Both are good. Don't let someone tell you that one or the other is better. Take it on a case by case basis.

nycredude4970d ago

For the price I'd say ATI is a better value.

NYC_Gamer4970d ago

ATI is better in my opinion

likedamaster4970d ago (Edited 4970d ago )

Nvidia has consistently delivered much more stable cards, although ATI has always been cheaper. Nvidia gets my money most times.

peowpeow4970d ago

The 5xxx series is VERY efficient and great in terms of price/performance. The 5770 was the best budget card, but now the GTX 460 kills the 5770, especially when SLI'd with another. I'm glad I was able to get one =)

XxZxX4969d ago

I was an ATI fan from the Radeon 8500 until recently, when I bought an Intel motherboard from ASUS that constantly failed with whatever ATI card I threw at it. I was using a 2850, which had random hang issues too, but not as many. Then I tried a 4870: random hangs in Mass Effect every time, so I went with a GTX 260, which was pretty good. I recently upgraded to a 5870: random hangs in StarCraft II... failed again, so I went back to the GTX 260. I'm going to stick with Nvidia for now. I hadn't bought an NVIDIA card since the GeForce 6800, but Nvidia seems more stable to me. Happy with the GTX 260, looking to upgrade to the next 4xx series.

jack_burt0n4969d ago

Nvidia for me; PureVideo etc. has always been more reliable.

i_like_ff74970d ago

nvidia has better drivers but ati is cheaper

crck4970d ago (Edited 4970d ago )

Historically true. But right now I think the GTX 460 and GTX 470 (if you can put up with the heat and power requirements) are the best bang-for-the-buck cards, since they usually come with 1 or 2 games you can flip for $25 to $35. But this could change soon when more info is available about ATI's 6xxx series.

mittwaffen4969d ago (Edited 4969d ago )

I have used the 5850/GTX 470.

If you can afford it and want to sit on an amazing card, get the 470. It's hot and a little loud, but it has the most consistent framerates I've ever seen from a card. Minimum framerates are amazingly stable on the 470.

My only problem is that ATI drastically sucks at DX11; I'd rather have a card I can hold onto for a year or two down the road. I ended up trading the cooler/quieter 5850 in for the faster/newer 470.

It's a trade-off, but at the end of the day I buy a video card for performance and upgrade headroom.

ProjectVulcan4970d ago (Edited 4970d ago )

Nvidia have been the market leader for years and offer different features, such as PhysX. ATI, though, have recently had the better offerings when it comes to price/performance, and their cards have had heat and power consumption advantages.

This is a bit of a cheek really, because Nvidia themselves are not above misleading benchmarks. For example, Nvidia used the Unigine tessellation benchmark to make ATI cards look very slow, when in real-world games the difference with tessellation enabled is minimal and doesn't impact overall performance nearly as much as that very specific benchmark would suggest.

This isn't new though. They have both been arguing about performance benchmarks for years.

likedamaster4969d ago (Edited 4969d ago )

You make a good point. However, real-world tests still show that current Nvidia cards are more powerful and get higher framerates with tessellation than the most powerful AMD cards.

http://www.hardwareheaven.c...

ProjectVulcan4969d ago (Edited 4969d ago )

My point is that with Unigine, Nvidia tried to show the GTX 480 as nearly twice as fast as a 5870 in a synthetic benchmark, whereas in DX11 games with tessellation enabled the advantage is minor, arguably not even related to tessellation performance in some games. The GTX 480 is a massive GPU, much, much larger than the 5870, but its size does not translate into the same increase in performance. Synthetic benchmarks are not to be trusted, especially when wielded by the designers of the hardware themselves...

The Radeon 68xx series will still be smaller than the GTX 480, and if it's not faster across the board in games despite still being on 40nm (6870), I'll eat my shoes. AMD are on a roll and have major architectural advantages in die size and efficiency versus Nvidia's designs. Smaller die = cheaper to produce.

This is obvious even looking at the midrange, where the GTS 450 cannot comprehensively beat a Radeon 5750 despite having a die 30 per cent larger. The 5750 isn't even the fastest chip based on that die - the 5770 is, and it soundly beats the GTS 450 despite being far smaller. This is a huge advantage for AMD, because they can price parts for profit far below what Nvidia could charge just to break even.

This sort of benchmark fudging goes back years; a recent example often cited is the PhysX score in 3DMark favouring Nvidia. Going waaaay back, I recall accusations flung about 3DMark 2003 and how the FX series results were being manipulated by Nvidia.

z1ck4970d ago

ATI is the best choice: better price/performance, lower power consumption (which means less heat and more headroom for overclocking), and now even the drivers are better. Nvidia still has the most powerful single-GPU cards, but they're very expensive and run too hot. If you plan on buying a new card, wait for the ATI 6000 series or until Nvidia drops prices (but waiting for the 6000 series is recommended).

nycredude4970d ago

Really no need to buy the 6000 series at launch since it will cost an arm and a leg. Better to wait until the 6000 series releases, then buy a late 5000 series card for cheap. They can handle pretty much any game out there at the highest settings and will last a while. Spend the money you save on games.

jakethesnake4970d ago

That's what I'm planning on doing. Wait til the 6000s come out and the 5000 prices go down, then get a good deal on a 5000 series! The only hard part is being patient!

peowpeow4970d ago

You can say that again. The hardest part is waiting for the price drop xD

joydestroy4970d ago

Like the guys have said, ATI is better bang for your buck.
I personally prefer Nvidia, though - you get PhysX on Nvidia cards.
I run 2 MSI GTX 460s.

Trroy4970d ago

If you're looking for the best performance first, and looking to save money as well, ATI is usually better.

If you're looking for the best performance, period, or the best performance in a low- or mid-range card, nVidia is better. nVidia really only overcharges for the high-end of the moment.

jerethdagryphon4970d ago

@below it does indeed depend.

In games using the PhysX API, Nvidia has a huge advantage because of CUDA; in games not running PhysX there's less difference.

For bang for the buck, ATI puts more cores in a chip at a lower price.

Optimization and performance in some areas are better with Nvidia, at a price.

Most reviews list same-generation cards as very similar, usually less than 10fps apart.

Of course some games are different and Nvidia shows ATI up in them, but they're few.

My £100 5770 runs the games I want at max spec and gives me a 3400 score on the FF14 benchmark.

rexus123454969d ago

As of right this moment, AMD leads the market (the ATI brand no longer exists), but Nvidia is gaining market share with their new GTX 460 series.

Sarcasm4969d ago

ATI is a better value and a lot of their cards run cooler.

As far as raw power goes, the high-end Nvidia cards are pretty hot stuff, that can't be denied.

Once they roll out either a GTX 490 or maybe even the 500 series, ATI/AMD will have a lot on their hands.

Personally, the best-value card on the entire market at this point is the GTX 460.

AwesomeJizz4970d ago (Edited 4970d ago )

ATI sucks because half of the games that come out don't support them properly, but overall they are a better company than NVIDIA IMO. I mean, their products are a lot cheaper than NVIDIA's.

Btw, I thought AMD dropped the ATI name. What happened?

nycredude4970d ago

I don't know what games you're playing or what ATI card you use, but I have over 30 Steam games and they all run on my 5870.

z1ck4970d ago

Really, half the games? Can you name one? Because I don't remember a single one that doesn't work on ATI. There was The Saboteur, but that was easily fixed.

AwesomeJizz4970d ago

That's not what I meant.
I know the games get fixed, but it takes ATI some time to fix them.

4cough4970d ago

Nvidia sound just like Sony: bitter and sour when their hardware is actually put to the test and just doesn't live up to the hype.

Seijoru4970d ago

What the... Have you seen God of War 3, Uncharted 2, GT5, Killzone 2???????

evrfighter4969d ago

Are you serious? GoW3 is all smoke and mirrors; outside of Kratos, everything else is a blurry, low-res texture mess. KZ2 is the land of crappy textures, low fps, and jaggies. Never played UC2 or GT5.

You're talking to PC gamers here. We expect quality across the board. Smoke and mirrors don't work on us.

wicko4969d ago

Here comes a flood of kids who have no idea what they're talking about.

Motorola4970d ago

Don't bother, he's only saying that because an NVIDIA card is in the PS3... I think.

RememberThe3574970d ago

He's just trying to piss people off.

Shackdaddy8364970d ago (Edited 4970d ago )

I like how you say that even though it's a fact that the PS3's hardware is slightly better than the 360's. Everyone has known that since they first came out.

BTW, I'm not one of those PS3 fanboys. I don't even own a PS3. I just don't like it when people say their stuff performs better when it really doesn't.

Zinc4970d ago (Edited 4970d ago )

I've owned cards from both companies. The TNT2 series lasted a long time and was very competitive. The Radeon X850 was a beast. I currently have an Nvidia 8800GTS 512 (G92) and it still kicks a good amount of ass, even to this day. I'm overall very happy with Nvidia, but both solutions have their positives.


Virtuos Working On A Multiplatform UE5 Remake, Rumored To Be The Elder Scrolls IV: Oblivion

Virtuos is currently working on a multiplatform Unreal Engine 5 remake, which is rumored to be The Elder Scrolls IV: Oblivion.

Read Full Story >>
nintendopal.com
BlackIceJoe5d ago

If this is true, I hope it leads to Morrowind getting remade next.

-Foxtrot4d ago

I thought they’d go for Morrowind first, to be honest, but this is a welcome surprise to tide us over before ES6

TheColbertinator4d ago

Well said. Exactly my thoughts too.

Tedakin4d ago

Unreal Engine isn't efficient for open world games, so I question the reliability of this story.

isarai4d ago

Yeah, first thing that came to my mind too. Although I CAN see it being likely for ease of use, it's not going to run very well if so.

Tacoboto4d ago (Edited 4d ago )

I'd wonder if it would just be exploiting Unreal for graphics but the underlying engine/logic is still the original framework.

Like how we had the graphical remakes of Halo Anniversary, Tomb Raider I-III, and Demon's Souls.

Could also end up a disaster like the GTA Trilogy

mkis0074d ago

I have heard that too, but even if it's modified, isn't Arkham Knight UE?

Fragslayer3d ago

"remake will run both an Unreal Engine 5 project and the old Oblivion project. For instance, new graphics are rendered in the Unreal Engine 5 project, but most of the gameplay and physics are still done on the original Oblivion engine"

So by this logic it'll run just fine using UE5 tool sets on top of Gamebryo's engine, or maybe even the Creation Engine - who knows.

kaos894d ago

Hopefully modders can fix the aged combat in this game if this is true. Enemies leveling up with you broke and defeated the whole purpose of leveling up.

Fragslayer3d ago

Yeah, it could use some tuning for sure, hence the need for a Remake, not a Remaster. I'm surprised it's even a conversation within the group, which makes me skeptical it's Oblivion. I wouldn't say enemies shouldn't level up at all, though - maybe just leave them a couple of levels behind to appease the masses.


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan13d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS and AI in general will lower the power draw. It seems like the days of just adding more VRAM and horsepower are over - the law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it. Reliance on AI is only going to pick up more and more.

Tapani12d ago (Edited 12d ago )

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating don't tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPUs of 10 years ago. Plus, today you can undervolt and overclock GPUs by a good margin to keep stock performance while using 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.

However, in today's world chip manufacturing is limited by physics, and we will see power increases for at least the next 5-10 years to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech on the market that we have yet to invent, or perhaps we can solve existing technologies' manufacturing and cost-of-production problems.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion, solar, etc., it would not matter how much these chips require. That being said, for the next 30-40 years that is a pipedream.
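A quick sanity check of the 90%-at-320W claim above (the figures are the commenter's, not measurements), just to show what it means in perf-per-watt terms:

```python
# Relative performance-per-watt of an undervolted card vs stock,
# using the claimed numbers: 90% of stock performance at 320 W
# against full performance at 450 W.
stock_perf, stock_watts = 1.00, 450.0
uv_perf, uv_watts = 0.90, 320.0

stock_eff = stock_perf / stock_watts       # relative frames per watt
uv_eff = uv_perf / uv_watts
gain = uv_eff / stock_eff - 1.0            # fractional perf/W improvement
print(f"perf-per-watt gain: {gain:.0%}")   # roughly +27%
```

So trading 10% of performance for a 130W cut buys about a quarter more efficiency, if the claimed numbers hold.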

MrBaskerville12d ago

I don't think fusion is the way forward. It will most likely be too late by the time it's finally ready, meaning it will probably never be ready. Something else might arrive first, though, and then that becomes viable.

Firebird36012d ago

We need to stop the smear campaign on nuclear energy.
We could power everything forever if we wanted to.

Tacoboto13d ago

The PS4 Pro had dedicated hardware for checkerboard rendering that was used extensively in PS4 first-party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?
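For anyone unfamiliar with the technique being discussed, the checkerboard idea can be sketched in a few lines. This is a toy illustration under simplifying assumptions (no motion vectors or ID buffers, which the real PS4 Pro technique uses), not Sony's implementation:

```python
# Checkerboard rendering, minimal version: each frame shades only
# half the pixels in a checker pattern, and the unshaded holes are
# filled in from the previous frame.
def shade_checker(render, h, w, phase):
    """Shade only pixels where (y + x) % 2 == phase; None elsewhere."""
    return [[render(y, x) if (y + x) % 2 == phase else None
             for x in range(w)] for y in range(h)]

def reconstruct(prev_frame, half_frame):
    """Fill this frame's holes with last frame's pixels."""
    return [[cur if cur is not None else old
             for cur, old in zip(cur_row, old_row)]
            for cur_row, old_row in zip(half_frame, prev_frame)]

h = w = 4
frame_t0 = [[0.0] * w for _ in range(h)]            # previous full frame
half_t1 = shade_checker(lambda y, x: 1.0, h, w, 0)  # new half-shaded frame
frame_t1 = reconstruct(frame_t0, half_t1)           # merged output
```

Each frame only pays for half the pixels, which is where the performance uplift comes from; the real version reprojects the old pixels along motion vectors instead of copying them in place.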

InUrFoxHole12d ago

Well... it's a coffin, man. So at least 4?

Tacoboto12d ago

PSSR in the fall can assume that role.

anast12d ago

and those nails need to be replaced annually

Einhander197212d ago

I'm not sure what point you're trying to make, but the PS4 Pro came before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR2.

Tacoboto12d ago

Um. That is my point. There have been so many nails in this "native performance" coffin, and they've been getting hammered in for years - even on PS4 Pro, before DLSS was even a thing.

RaidenBlack12d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, yet in terms of image quality it's way behind what DLSS 3 or FSR 3 currently offer.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" argument (that's been going around the PC community since 2019) right out of the window.

Einhander197211d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboard.

But you're talking about FSR 3, which probably is better than checkerboard; FSR 3 has only started getting games this year, though, so checkerboard, the first hardware upscaling solution, was and still is one of the best.

Give credit where credit is due: PlayStation was first and got it right from the get-go, and PSSR will almost certainly be better than the credit it will be given - heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic12d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself.
And of course, then came the idea of checkerboard rendering not being native...
For sure, maybe this tech makes the differences minimal when pixel counting, but alas, it seems performance and "close enough" - not native - is what matters now.
I want to see it run native without DLSS. Why not?

RonsonPL12d ago

An almost-deaf person:
- lightweight portable $5 speakers 0.5cm in diameter are the final nail in the coffin of hi-fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (look at the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either true 500fps on a 500Hz LCD TN or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as black frame insertion or backlight strobing.

Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article, just to go with the flow.

There's no coffin for native-res quality and there never will be. Eventually we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This stuff is only usable for cinematic content, like cutscenes. Not for gaming. Beware of ignorants on the internet. TAA is not "native", and the shitty look of modern games when you disable TAA is not "native" either, as it's ruined by the developers' design choices: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass over it later. When you disable it, you see a ruined image, horrible pixelation and other visual "glitches", but that is NOT what native would have looked like if you honestly compared the two.

Stay informed.
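The 500fps point above is really just frame-time arithmetic. The ~1 ms upscaling cost used here is an illustrative assumption, not a measured DLSS figure; the takeaway is only that a fixed-cost pass eats a growing share of the budget as the frame rate climbs:

```python
# Share of the frame budget consumed by a fixed-cost upscaling pass
# at different target frame rates.
def frame_budget_ms(fps):
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

UPSCALE_MS = 1.0  # assumed fixed cost of the upscaling pass

for fps in (60, 144, 500):
    share = UPSCALE_MS / frame_budget_ms(fps)
    print(f"{fps:>3} fps: {frame_budget_ms(fps):5.2f} ms budget, "
          f"upscale uses {share:.0%}")
```

At 60fps a 1 ms pass is a rounding error; at 500fps the budget is 2 ms, so the same pass would swallow half of it.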

RaidenBlack12d ago

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of the time, but it'll slowly narrow to the point of being indistinguishable.
Something similar happened with the generative AI Sora: AI-generated videos were a turd when they were introduced (the infamous Will Smith eating video), but now look at Sora, generating videos that look just like real life.

Yui_Suzumiya12d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple10112d ago

Maybe better to get a budget gaming laptop and link a DualSense to it.

= A portable console with far better graphics than a Steam Deck! Plus a bigger screen, and you can use it for work, etc.


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal16916d ago

Echoing the sentiment here - I think the way GPUs are going, gaming could become secondary to deep learning. I wonder if the 40 series was the last true generation of GPUs?

Number1TailzFan15d ago

No... Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if they can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance, there's always AMD, who are doing OK at the mid-range.

Christopher15d ago

***or play at just 1080p and 60fps or something***

My over-2-year-old laptop GPU still runs fine. I think this is more a reason why GPU makers are prioritizing other things: the market of new buyers is shrinking as more PC gamers put off replacing older, still-working parts that run RT/AI well enough as it is. Not to say there aren't people who still upgrade, but I think the market for having the latest and greatest is smaller than it has been for the past two decades. The problem is we aren't advancing at the rate we were; we're reaching the flattening of that exponential curve. We need another major technological breakthrough to restart that curve.

D0nkeyBoi15d ago

The unremovable ad makes it impossible to read the article.

Tzuno15d ago (Edited 15d ago )

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing15d ago

You also need to consider that NVIDIA is heavily invested in cloud gaming, so they are likely to make moves to push you into yet another lifetime subscription service.

Kayser8115d ago

NVIDIA will never change their pricing until AMD or Intel makes a GPU that is comparable and cheaper.
It happened before in the days of the GTX 280: they cut the price from $650 to $450 in a matter of 2 weeks because of the HD 4870, which was selling at $380.
