DSOGaming writes: "Let's find out whether the NVIDIA RTX2080Ti was able to achieve its initial 4K/Ultra with 60fps goal these past two years in 83 PC games."
"These results also justify why we were describing the RTX2080Ti as a "1440/Ultra settings" GPU." No, they don't. I don't agree with the premise here. A GPU isn't a 4k/60 GPU unless it can run 4k/60 at ultra settings across some arbitrary percentage of games? That's entirely subjective. And this completely ignores poorly optimized games such as Anthem, where the human element plays a significant part in the results. Nope.....just slap silly labels on the hardware anyway? lol.....you do you.
I mean, it's always going to be subjective - it's trying to apply a generalized label to categorize "performance". But at what point can we also stop saying "games just aren't optimized well enough" if a card struggles with all of them at 4k/60? I think most people just assume that this hardware should always be able to run all graphical options enabled at a smooth 4k/60, and anything less than that is the developers fucking up. Honestly I think it's pretty reasonable that most newer games have surpassed what the 20x0 series should be expected to handle at 4k.
I totally agree. I'm happy that there are ultra settings that even existing cards can't handle at 4k60. That way if I play a game years later with a new card (because I missed it or whatever), the game won't look dated because it had some headroom to shine. It's one of the benefits of PC gaming I think.
"But at what point can we also stop saying "games just aren't optimized well enough" if a card struggles with all of them at 4k/60?" I didn't say that at all. I'm talking about specific games with known optimization issues. And this card in particular doesn't "struggle with all of them". If these guys want to say a card isn't 4k/60/Ultra because it can't run whatever percentage of games, then that is a different conversation. That is not what they are doing, though. They're proclaiming the 2080ti to be a 1440p/60 card because of their own Ultra settings requirement. I don't buy into that. "I think most people just assume that this hardware should always be able to run all graphical options enabled at a smooth 4k/60 and anything less than that is the developers fucking up." No idea why anyone would make those assumptions, but that isn't the assertion being made here at all. "Honestly I think its pretty reasonable that most newer games have surpassed what the 20x0 series should expected to handle at 4k." Except that isn't the case. Again, the threshold in the article is 4k/60/Ultra settings.
@RazzerRedux I agree with you. Developers are not lazy or trying not to optimize games for PC. They have to do testing with multiple GPUs across the board, and they add PC ultra options on top of the games they develop for consoles as a baseline. Maybe they simply add them knowing that in the future new cards might be able to run them and at the same time devs can give options to gamers to run games 30fps/60fps/120fps/144fps etc.
But but the flops, PcSheepRace, the "best".
Who even wants to play Anthem? And yes, insisting on playing everything at ultra settings says more about the person running the benchmark being lazy.
The guy who wrote the article should use optimized settings per game. There's no point cranking everything up to ultra and thinking that's how the developer intended gamers to play the game. For example, the RTX2080ti with optimized settings for SOTR runs native 4K at 70fps+ all the time, and you can't really see any difference between "ultra" and "optimized" unless you zoom in 200-300% and start making comparisons. Who does that if they want to game??
The problem isn't the card. It's the ports of the games and the millions of PC configurations. A GPU with 13.5 or 14 teraflops should do 4K without any problem. The lack of optimization is also a huge deal. Using a card like this to play at 1440p, even on ultra settings.... 14 teraflops... To me that's just crazy.
It’s not even that, 4K 60 FPS is much harder to run than people seem to think. Just forget about teraflops, they don’t tell you much of anything unless both GPUs are from the same series and even then, it’s just a rough comparison.
If games were developed for it the same way they are for consoles, it would do it easily.
I don't think consoles run games at settings equivalent to "ultra" anyway, do they? In which case the article is saying no console is actually a "4k/60" console. Just seems silly to me.
The One X has an improved RX 580 with 12 GB of RAM, and guess what.. it runs just as expected. A few games are 4K 30 FPS with drops, so using math, it takes over twice that GPU's power for a solid 60 FPS at 4K. Most games aren't even native 4K on the One X, by the way.
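The comment's back-of-the-envelope math can be sketched out. The figures below are assumptions, not from the article: roughly ~6 TFLOPS for the One X GPU and ~13.45 TFLOPS for the RTX 2080 Ti, plus the naive premise that frame rate scales linearly with compute (it doesn't in practice, which is the point).

```python
# Assumed figures (not from the article): One X GPU ~6.0 TFLOPS,
# RTX 2080 Ti ~13.45 TFLOPS.
ONE_X_TFLOPS = 6.0
RTX_2080_TI_TFLOPS = 13.45

# One X manages roughly 30 fps at 4K, with drops.
one_x_fps = 30

# Naive linear-scaling estimate: a GPU with ~2.2x the compute
# would land in the mid-60s, i.e. barely at the 4K/60 line.
est_fps = one_x_fps * (RTX_2080_TI_TFLOPS / ONE_X_TFLOPS)
print(f"Naive estimate: {est_fps:.0f} fps at 4K")  # ~67 fps
```

Which matches the comment's conclusion: even on paper, "over twice the One X" only barely clears 60 FPS, with no margin for ultra settings.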
I enjoyed my 2080ti Strix card. I will be retiring it this year for the new 3080. I might not have gotten 60fps on ultra every time, but it was in the high 50s at its lowest FPS.
For 1440p I'll be looking at the 3080, mostly because I like playing well above 60fps, and it's pretty much guaranteed a solid high frame rate in more demanding games that come along, similar to RDR2. If people want to play at 4K, that's up to them; there's no chance of getting near 100fps at that resolution in any kind of demanding game though, and the high refresh screens are very expensive. Some game on big 4K TVs, but with a smaller monitor at 1440p you'd be getting similar/better pixel density anyway.
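The pixel density point checks out arithmetically: PPI is the diagonal pixel count divided by the diagonal size in inches. The specific screen sizes below (27" monitor, 55" TV) are hypothetical examples, not from the comment.

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a display of given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Hypothetical sizes, just to illustrate the density comparison:
print(f'27" 1440p monitor: {ppi(2560, 1440, 27):.0f} PPI')  # ~109
print(f'55" 4K TV:         {ppi(3840, 2160, 55):.0f} PPI')  # ~80
```

So a typical 27" 1440p monitor is noticeably denser than a 55" 4K TV at the same resolution class, as the comment suggests.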
Plus smaller monitors tend to have far less input lag.
In 2009? I've got two LGs and a TCL, and all are low. The 4K TCL is about 15ms, the C9 OLED is like 13, and the 86-inch is like 10ms. If you want to look at rtings.com, they put major effort into measuring it.
Interesting results, but if the RTX 2080 Ti managed to hit 4K60 at max settings in some games then it surely delivered on its premise?
Two options for me at 4K: 1. RTX3080ti DLSS+RTX 2. Big Navi + Res Scaling + RIS + Ray Tracing. Let's see what produces the best bang for buck. Right now Nitro+ 5700xt provides 2080ti performance at 80% res scaling + RIS, and pretty much native 4K (sure, internal rendering resolution is lower, but if nobody can see the difference without 200% zoom, who cares!)
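The "80% res scaling" in that comment hides a bigger saving than it sounds, since the scale factor typically applies per axis. This is a rough sketch assuming per-axis scaling (common, but engine-dependent):

```python
# Rough pixel-count math behind an "80% resolution scale" at 4K.
# Assumption: the scale factor applies per axis, so rendered pixel
# count is scale**2 of native.
native_w, native_h = 3840, 2160  # native 4K
scale = 0.80

render_w = int(native_w * scale)  # 3072
render_h = int(native_h * scale)  # 1728
fraction = (render_w * render_h) / (native_w * native_h)
print(f"{render_w}x{render_h} = {fraction:.0%} of native 4K pixels")  # 64%
```

Rendering only ~64% of the pixels explains how a midrange card plus sharpening (RIS) can approach 2080ti-at-native-4K frame rates.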
My 2080ti ftw3 ultra is a 3440x1440 on Ultra kind of Beast. It does do some games at 4K ultra but I knew it was better at maxing everything at Ultrawide 1440p at 110-120 fps. I'm not going to upgrade to a 30 series until I end up upgrading my mobo, cpu and monitor.
Might make sense to wait until DDR5 standard comes out. From 2080ti the only reasonable upgrade would be 3090, but I'd still wait for two years until upgrading. I'm thinking of moving from 5700xt to 6800xt, but I'll see what it looks like in reality for AMD's card before jumping in.
Click bait article.
I think saying a card only counts as a 4K/60fps card if it hits ultra settings is nonsense.
We've been constantly told it's a 4K 60fps card by PC fans on here. For example, if someone said they were purchasing Battlefield on PS4, someone would likely pop into the thread to say "I'm going to play it at 4K 60fps on my 2080ti", even though he's likely got a 1060 3GB....
N4G is a community of gamers posting and discussing the latest game news. It’s part of NewsBoiler, a network of social news sites covering today’s pop culture.