410°

NVIDIA GeForce Titan RTX has been leaked

DSOGaming writes: "It appears that NVIDIA will release a new Titan card based on the Turing architecture, called NVIDIA GeForce Titan RTX. Although we don’t know much, we expect this GPU to be more powerful than the RTX 2080 Ti, and it will be even more expensive (so we’re basically looking at a $3000 GPU?)."

Read Full Story >>
dsogaming.com
SierraGuy1970d ago

10 gigarays of raytracing power.

NarutoFox1970d ago (Edited 1970d ago )

I bet this is going to be so expensive. $3000 😳😀

Parasyte1970d ago

I was about to say the same thing!

Lionsguard1969d ago

I remember when Titans used to be the golden unicorn at $1000.

sander97021969d ago

We need some revolutionary hardware soon. I feel like we aren't getting the revolutionary changes we used to get.

Profchaos1969d ago

We definitely are not, but I think we're in a race between AMD, NVIDIA, and Intel to shrink GPUs and CPUs to 7nm, and right now the tech is overpriced and underwhelming.

MoshA1969d ago

Nvidia is exploiting the market right now. Wait for next-gen consoles and AMD/Intel GPUs in 2020. Nvidia will stop screwing around or they will die.

Hungryalpaca1969d ago

I highly doubt the new consoles will come close to a base-model 2080, and even that thing is disappointing. They'll likely be around the 1080 by the time they release.

Bladesfist1969d ago

The irony of this is that ray tracing is the most revolutionary change we've had in a while. I'm guessing you mean you want the same rasterization stuff, just faster, or do you not think that ray tracing will be the future of real-time rendering?

NarooN1969d ago

Raytracing in and of itself isn't anything new, though. It's been researched and even implemented in various ways for decades. RTX is just the latest leap in making it feasible in real time in conjunction with rasterization-based rendering. Even then, we see the hardware still has a ways to go. It will be a long time before raytracing is doable on a wide range of affordable hardware, and doable without utterly destroying performance. Not even Nvidia's top-end GPU gives a satisfying experience with it on at 1080p in BFV.

That said, it's a neat tech, but I wouldn't even jump on the hype train yet. I'd rather it get utilized in ways that are actually integrated properly into the gameplay rather than just being a thing that makes games look slightly prettier or cool.
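To make the "nothing new" point concrete, here's a tiny illustrative sketch in plain C++ (not NVIDIA's RTX API or any real renderer) of the operation a ray tracer repeats for every ray: an intersection test against geometry, in this case a single sphere. All names and numbers here are made up for illustration.

```cpp
// Minimal ray-sphere intersection sketch: the core test behind any ray tracer.
#include <cmath>
#include <cstdio>
#include <optional>

struct Vec3 { double x, y, z; };

Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Returns the distance along the ray to the nearest hit in front of the origin, if any.
std::optional<double> raySphere(Vec3 origin, Vec3 dir, Vec3 center, double radius) {
    Vec3 oc = sub(origin, center);
    double a = dot(dir, dir);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * a * c;
    if (disc < 0.0) return std::nullopt;            // ray misses the sphere
    double t = (-b - std::sqrt(disc)) / (2.0 * a);  // nearest root of the quadratic
    if (t > 0.0) return t;                          // hit is in front of the ray origin
    return std::nullopt;
}

int main() {
    // Camera at the origin looking down -z toward a unit sphere at z = -5.
    auto hit = raySphere({0, 0, 0}, {0, 0, -1}, {0, 0, -5}, 1.0);
    if (hit) std::printf("hit at t = %.2f\n", *hit);
}
```

Dedicated RT hardware is essentially about running billions of tests like this per second against full scenes, which is why doing it in real time alongside rasterization has taken this long.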

Hungryalpaca1969d ago

Ray tracing clearly isn't ready. No one is going to play at 1080p 30fps with a $2000 GPU because it makes shadows and shinies prettier.

When it doesn’t chug performance it will be incredible. Until then it’s pointless.

NarooN1969d ago

Agreed. The GPU space has gotten quite boring these past few years. A big part of it is the complexities and problems the foundry partners have run into; it's gotten increasingly difficult to achieve these smaller optical shrinks, and the R&D has become even more expensive. Hopefully with 7nm we get some true jumps in performance and energy efficiency.

ANIALATOR1361969d ago

We need the jump to next gen now so tech prices will come down

Asuka1969d ago

And here I am trying to justify spending $499 on an RTX 2070 LUL

NicSage1969d ago

I just picked up a Vega 64 on Black Friday for $319. I was rocking a 970; I'd never bought an AMD product and I was really nervous. Man, I am so happy I did: I'm running BF5 at 2560x1440 maxed out, butter smooth. The FreeSync monitor I've had for about 8 months can now be taken advantage of. I'm really happy.

FGHFGHFGH1969d ago

Isn't FreeSync for smoothing out low framerates? If you are getting higher fps than your monitor's FreeSync range, I hear that some people cap their fps so it doesn't go past the monitor's limit.

https://www.reddit.com/r/Am...

NarooN1969d ago

@FG

FreeSync/Adaptive Sync can have a refresh range much higher than 60Hz. It's not so much about smoothing out low framerates as it is about eliminating screen tearing completely by having the monitor change its native refresh rate on the fly to match the game's framerate. Some people cap their framerate so it remains within the FreeSync range and they don't get tearing again. If I had a 120Hz monitor, I would choose capping it at 119Hz over getting 300fps in-game any day of the week.
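As a rough illustration of that kind of cap, here's a minimal frame-limiter sketch in C++. It isn't any game's or driver's actual code, and the 119 fps / 120 Hz figures are just the example numbers from the comment above; real games usually expose this as an in-engine or driver-level fps limit.

```cpp
// Minimal frame cap sketch: sleep out the remainder of each frame so the game
// stays just under the monitor's maximum refresh, keeping it inside the
// FreeSync/Adaptive Sync range.
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const double capFps = 119.0;  // just under a hypothetical 120 Hz panel
    const auto frameBudget = std::chrono::duration<double>(1.0 / capFps);

    for (int frame = 0; frame < 600; ++frame) {  // stand-in for the game loop
        auto start = clock::now();

        // ... simulate and render the frame here ...

        auto elapsed = clock::now() - start;
        if (elapsed < frameBudget)
            std::this_thread::sleep_for(frameBudget - elapsed);  // wait out the rest of the frame
    }
}
```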

Profchaos1969d ago (Edited 1969d ago )

Same, I just picked up a 2070 for my new build. I had to choose something, and the Black Friday sales put it on par with the 1080 in terms of price.

But most new hardware this year has been disappointing. I was pumped for the i9 9900K until reviewers tore it to shreds for its poor power consumption and thermals, and the 9700K was a minimal improvement on the 8700K, so I ended up going back a gen to the 8700K.

40°

A New Era for Mixed Reality

Meta writes: "Xbox and Meta teamed up last year to bring Xbox Cloud Gaming (Beta) to Meta Quest, letting people play Xbox games on a large 2D virtual screen in mixed reality. Now, we’re working together again to create a limited-edition Meta Quest, inspired by Xbox."

60°

Razer Iskur V2 Gaming Chair Review - Lumbar Support Done Right

The Razer Iskur V2 is a high-quality, premium gaming chair that your back will thank you for.

Read Full Story >>
playstationlifestyle.net
100°

Make your next GPU upgrade AMD as these latest-gen Radeon cards receive a special promotion

AMD has long been the best value option if you're looking for a new GPU. Now even their latest Radeon RX 7000 series is getting cheaper.

Father__Merrin1d 2h ago

The Arc cards are the best for the money.

just_looken1d 1h ago

In the past, yes, but last-gen AMD has gotten cheaper, and their new cards are on the horizon, making the 6000 series even cheaper.

The Arc cards are no longer made by Intel, but ASUS/ASRock have some, and the next line, Battlemage, is coming out; prices TBD.

Due to the longer software development, it's always best to go AMD over Intel if it's not too much more money, even though Intel makes a strong GPU; I own the 2/4 card versions.