NVIDIA GeForce RTX 2080 and 2080 Ti Gaming Benchmarks Leaked

Today, we're seeing a leak that contains official benchmarks of the NVIDIA RTX 2080 and 2080 Ti cards in what are being termed the recommended titles.
Update: A concerned Redditor came up with a detailed graph showing the performance gains of the RTX cards over the previous generation, along with the FPS/$ gain.

Read Full Story >>
techquila.co.in
bluefox755 34d ago (Edited 34d ago)

Didn't they say both could run all modern games at 4k/60fps? Impressive numbers, but I guess the 2080 couldn't swing it with the new Tomb Raider, Andromeda, or Shadow of War.

ChrisW 34d ago

From what I've understood, the official nVidia GPU driver hasn't been released. So I'm going to say that all of this "leaked" stuff is bullcrap... which I've been noticing rather often from Techquila.co

darksky 34d ago

These are supposed to be leaked Nvidia slides so not exactly bullcrap if genuine. I'm sure Nvidia would only create slides with their best working drivers. Unlikely that any release driver would push the fps much higher.

mikeslemonade 33d ago

This is better than I thought. Now I will definitely get the 2080 Ti once they are available, and I will buy Nvidia stock.

mikeslemonade 33d ago

AMD’s grace period is now over

hanko14 34d ago

I'm surprised that these aren't HDMI 2.1.

haydenlake 34d ago (Edited 34d ago)

Benchmarks from NVIDIA are worthless to most people as they only account for the cream of the crop customers with 4K HDR G-Sync monitors. The standard NVIDIA-er running 1080p/1440p SDR monitors will not see these gains and the fps/$ stat will only get worse.

Larrysweet 34d ago

Huh? Makes no sense. If you play on lesser screens, you'll get way more FPS.

haydenlake 34d ago (Edited 34d ago)

Yes, my friend. If you look at the article you'll know I'm referring to the performance gain from Pascal to Turing at 4K HDR w/ G-Sync, and how at popular resolutions you won't see the same result.

fr0sty 34d ago

4K HDR requires more than 4x the processing of 1080p, so that statement makes no sense at all.
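
For context on the pixel math (illustrative only, not from the article): 4K has exactly four times the pixels of 1080p, before any HDR overhead is counted.

    # Pixel-count comparison between 1080p and 4K (illustrative figures only).
    pixels_1080p = 1920 * 1080       # 2,073,600 pixels
    pixels_4k = 3840 * 2160          # 8,294,400 pixels
    print(pixels_4k / pixels_1080p)  # 4.0 -- 4K pushes exactly 4x the pixels of 1080p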

haydenlake 34d ago

You're not understanding. Look at the article and see the performance gain from Pascal to Turing at 4K HDR w/ G-Sync. Do you think the increase will be the same at popular resolutions like 1080p and 1440p? No.

xRacer74x 34d ago

Sometimes the improvements only show when you're running at the higher resolutions, not at the lower 1080p.

Hungryalpaca 34d ago

Resolution is resolution. What are you even talking about?

A lower resolution will get higher FPS... are you not aware of how this works?

haydenlake 34d ago

I am, but you clearly don't understand that you won't see the same performance gains from Pascal to Turing at 4K HDR w/ G-Sync at more popular resolutions like 1080p/1440p.

Gwiz 34d ago

No, that's not how that works; shifting heavy loads towards the GPU will alleviate some of the workload on the CPU.
What he is saying is that the percentage performance gain at lower resolutions will be smaller because the workload shifts towards the CPU. Playing at 4K will show results based on the better GPU handling 4K better, not lower resolutions, where the difference is more reliant on your CPU.
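
A minimal sketch of that argument, using a toy max(CPU, GPU) frame-time model; none of the numbers below come from the leak, they're made up to show the effect.

    # Toy frame-time model: frame rate is capped by whichever of CPU or GPU takes longer.
    # All per-frame costs are hypothetical, for illustration only.
    def fps(cpu_ms, gpu_ms):
        return 1000.0 / max(cpu_ms, gpu_ms)

    cpu_ms = 8.0                    # CPU cost per frame, roughly resolution-independent
    gpu_4k, gpu_1080p = 20.0, 6.0   # old GPU's cost per frame at 4K vs 1080p
    speedup = 1.4                   # assume the new GPU is 40% faster

    print(fps(cpu_ms, gpu_4k), "->", fps(cpu_ms, gpu_4k / speedup))        # 50 -> 70 fps: the full gain shows at 4K
    print(fps(cpu_ms, gpu_1080p), "->", fps(cpu_ms, gpu_1080p / speedup))  # 125 -> 125 fps: the CPU cap hides the gain at 1080p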

Adexus 34d ago

You don't see as much of a gain at lower resolutions like 1080p/1440p because they're more CPU-bound than GPU-bound, so the gain won't be as high. That doesn't mean the newer card won't wipe the floor with the old generation, though.

haydenlake 34d ago (Edited 34d ago)

Not necessarily. Memory bandwidth is crucial to playing at certain resolutions, and typically the 1080 isn't a 4K card whereas the 1080 Ti is, because it uses 11Gbps GDDR5X modules capable of a sustained 484GB/s. As for CPU bottlenecking, you won't run into issues if you're using your GPU properly.

You'll see what I mean when legitimate benchmarks release. For the average 1080p/1440p SDR non-G-Sync gamer, the 15-45% performance gain shown here will look more like -5% to 25%, as Pascal is bad with HDR & HDR G-Sync.
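
As a back-of-the-envelope check on the 484GB/s figure (assuming the 1080 Ti's 352-bit memory bus, which the comment doesn't state):

    # Peak memory bandwidth = per-pin data rate x bus width / 8 bits per byte.
    data_rate_gbps = 11      # GDDR5X per-pin data rate, Gbit/s
    bus_width_bits = 352     # assumed GTX 1080 Ti memory interface width
    print(data_rate_gbps * bus_width_bits / 8)  # 484.0 GB/s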

WeebLord 34d ago

Use cases vary from person to person. 1440p/165Hz takes a bit more than a 1060 to reach; even a 1070 can't do that in every game at max settings.

aceloth 34d ago

Impressive results, as these aren’t yet optimized for DLSS...

haydenlake 34d ago

DLSS is fake performance, like checker-board rendering.

ossyc 34d ago

You couldn't be more wrong. It's still a native 4K render; DLSS is just an incredibly efficient algorithm - a number crunch on Nvidia's supercomputers - for anti-aliasing, smoothing out the edges, and the result is magnificent.

It just enables the Turing architecture to process it faster, resulting in a 2x+ FPS increase with unprecedented detail and quality...

...on top of the 50% out of the box.

MrCrimson 33d ago

Checkerboard looks amazing. The DLSS performance increase is real - as in you see little to no fidelity loss and a significant increase in performance.

Bobafret 34d ago

Give it 2 or 3 years and this benchmark will be crap. Rinse, repeat, cha ching.
