
Witcher 3 should look similar across all platforms, but PCs with NVIDIA cards will have the edge

During Łódź Game Summit 2013 in Poland, dobreprogramy.pl had a chance to ask CD Projekt RED many questions, one of them being about the difference in graphics between powerful PCs and the PS4 and Xbox One. The studio aims to deliver a similar visual experience on every platform and so far has not had to sacrifice anything for the consoles, but at the same time it confirmed a close working relationship with NVIDIA. While the consoles are based on AMD hardware, some effects may be reserved for PCs with GeForce cards.

Moncole3825d ago

Bad for my 560 Ti. I gotta upgrade, but I don't need to hurry because I mostly play games that don't take a lot of power.

LightofDarkness3825d ago

Wait till the 8 series drops sometime in Q1, then buy either a midrange Maxwell (should be Titan-level performance) or a high-end Kepler (770/780). Maxwell looks like it's going to be a pretty massive jump from even the Titan, and AMD look like they're trying to start a price war. Interesting times for GPUs coming soon.

Abriael3825d ago

Same. Can't wait for this game. It's my most anticipated next gen title so far.

ATi_Elite3825d ago (Edited 3825d ago )

Seriously, that is NOT bad for you. The R9 290X is a BEAST! LOL

GTX660ti SLI Good for me!
HD7970GE CFX Better for me!

Shuyin3825d ago

Yes y'all, go on, keep measuring dat e-peniz.

MadLad3825d ago

Same here.
Still a very, very capable card.

jy_mrnd3825d ago

Happy I bought the titan.

ATi_Elite3825d ago

Great for me!

Nvidia 6150 nforce 430 SLI

I should be able to easily run The Witcher 3 on Uber settings at 1600p/60fps in 3D!

/sarc

LAWSON723824d ago

Bad for me then, lol.
HD 7950

sofocado3824d ago

Bad for me then LOL
AMD A10-5800K APU in CrossFire with an HD 6670

tee_bag2423824d ago (Edited 3824d ago )

Great 780m SLI for me

MuhammadJA3824d ago

Good for me then... I hope.

GTX670M

jay23825d ago

That takes the piss: 2 consoles with ATI cards and the PC gets the better deal :@

Abriael3825d ago

Same with AC4. It's normal.

majiebeast3825d ago

Nvidia and AMD both moneyhat developers; it's something we have to deal with.

ProjectVulcan3825d ago (Edited 3825d ago )

I said it before and I'll say it again: whatever the consoles have inside them matters not to PC gaming.

Intel and Nvidia rule in PC land in market share, console hardware won't change that because it never has. It hasn't mattered in the slightest what has been in the consoles the past 20 years because PC always leads from the front and dictates the direction of videogame technology.

Having CELL or XDR memory in PS3, or eDRAM in 360 and a PowerPC CPU made a sum total of zero difference to PC gaming hardware the past 8 years.

As per usual all that happens is that newer consoles derive more and more from existing PC technology, not the other way around.

The new consoles are a perfect case in point. They have never been more like PC-derived hardware: x86 CPUs and Radeon-based graphics ripped right out of a PC architecture and shoved together on a PC-derived APU.

Dante813825d ago

Intel rules the CPU market; Nvidia does not rule the GPU market (not yet!!). They love to peddle their proprietary crap, sort of like Microsoft.

ProjectVulcan3824d ago (Edited 3824d ago )

Nvidia's GPU market share on PC is over 60 percent for discrete boards. That's a pretty significant lead. It's been that way since forever, so they do rule the GPU market and have done for years, just like Intel have always dominated the x86 market.

It's ironic that you accuse Nvidia of peddling proprietary crap when AMD have harped on about Mantle for a while now. Nvidia have their own technology IPs, same as anyone else, except Nvidia usually get decent support for them without having to pay people to use them, because of their aforementioned market share dominance...

Software_Lover3825d ago

It's coming to a point where we are gonna have to treat PCs like consoles. Devs are gonna need to make an Nvidia and an AMD (PS3/360) version of the same game.

Jovanian 3825d ago

I don't think we're at that point yet. I don't think PC gaming could survive such extreme hardware polarization; it would be bad for business for both of them.

kcuthbertson3825d ago

Still rocking my GTX 580 from a few years ago. I'll probably build a completely new rig once Maxwell drops, though. I love building new systems... there's just something exciting about pressing that power button for the first time knowing that you completely custom built it.


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan3d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious if using DLSS & AI in general will lower the power draw. It would seem like the days of just adding more VRAM & horsepower are over. Law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough to hardly notice a difference, if at all. AI is the future and it would be foolish to turn around and not incorporate it. Reliance on AI is only going to pick up more & more.

Tapani3d ago (Edited 3d ago )

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating do not tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPU of 10 years ago. Plus, today you can undervolt + OC GPUs by a good margin to keep stock performance while utilizing 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.

However, in today's world chip manufacturing is limited by physics, and we will see power increases for at least the next 5-10 years to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech on the market that we are yet to invent, or perhaps we can solve existing technologies' problems with manufacturing or cost of production.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion, solar, etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that is a pipe dream.
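The undervolting claim above is just perf-per-watt arithmetic. A quick sketch using the commenter's figures (90% of performance at 320W vs. stock at 450W); these numbers are the commenter's, not measured values:

```python
# Hypothetical figures from the comment above: stock 4090 at 450 W,
# tuned (undervolt + power limit) keeping ~90% performance at 320 W.
def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance delivered per watt of board power."""
    return relative_perf / watts

stock = perf_per_watt(1.00, 450)
tuned = perf_per_watt(0.90, 320)

# Efficiency gain from tuning: roughly +27% performance per watt.
gain = tuned / stock - 1
print(f"{gain:.0%}")
```

So giving up 10% of the frame rate buys back about a quarter more performance per watt, which is why power-limited tuning is so popular on this card.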

MrBaskerville3d ago

I don't think fusion is the way forward. It will most likely be too late when it's finally ready, meaning it will probably never be ready. Something else might arrive first, and then that becomes the viable option.

Firebird3603d ago

We need to stop the smear campaign on nuclear energy.
We could power everything forever if we wanted to.

Tacoboto3d ago

PS4 Pro had dedicated hardware in it to support checkerboard rendering, which was used significantly in PS4 first-party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?
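As a rough illustration of what checkerboard rendering does (a toy sketch of the fill pattern only; the PS4 Pro's actual resolve uses ID buffers and motion-aware blending, which this omits):

```python
import numpy as np

def checkerboard_composite(prev_frame: np.ndarray, new_half: np.ndarray,
                           phase: int) -> np.ndarray:
    """Each frame renders only the pixels whose (x + y) parity matches
    `phase`; the remaining pixels are reused from the previous output.
    Alternating the phase every frame covers the full grid over two frames."""
    h, w = prev_frame.shape
    yy, xx = np.indices((h, w))
    mask = (xx + yy) % 2 == phase      # half the pixels, in a checker pattern
    out = prev_frame.copy()
    out[mask] = new_half[mask]         # overwrite the freshly rendered half
    return out
```

The appeal is obvious from the sketch: each frame shades only half the pixels, so the GPU does roughly half the pixel work of native rendering while still presenting a full-resolution image.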

InUrFoxHole3d ago

Well... it's a coffin, man. So at least 4?

Tacoboto3d ago

PSSR in the fall can assume that role.

anast3d ago

and those nails need to be replaced annually

Einhander19723d ago

I'm not sure what point you're trying to make, but PS4 Pro was before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR 2.

Tacoboto3d ago

Um. That is my point. There have been so many nails in this "native performance" coffin, and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack3d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, but in terms of image quality it's way behind what DLSS 3 or FSR 3 is currently offering.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" debate (that's been going around the PC community since 2019) right out of the window.

Einhander19722d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboard.

But you're talking about FSR 3, which probably is better than checkerboard; FSR 3 has only started getting games this year, though, so checkerboard, the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for. Heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic3d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself...
And of course, then there was the idea of checkerboard rendering not being native...
For sure, maybe this tech makes the difference minimal while pixel counting, but alas, it seems "performant and close enough", not native, is what matters now...
I want to see it run native without DLSS... why not?

RonsonPL3d ago

An almost-deaf person:
- lightweight portable $5 speakers with 0.5cm drivers are the final nail in the coffin of Hi-Fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either one delivering a true 500fps on a 500Hz LCD TN or OLED (or faster tech), or one using a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article, just to flow with the current.

There's no coffin for native-res quality and there never will be. Eventually we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the time required for upscaling makes it completely useless.
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorants on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA is not "native" either, as it's ruined by the developer's design choices: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass over it later. When you disable it, you see a ruined image, horrible pixelation and other visual "glitches", but that is NOT what native would have looked like in an honest comparison of the two.

Stay informed.
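The 500fps budget argument above can be put in numbers. A small sketch; the 1 ms upscaler cost is an illustrative assumption, not a measured DLSS figure:

```python
# Frame-time budget at various refresh rates versus a hypothetical fixed
# upscaling cost. 1.0 ms is an assumed illustrative figure, not a
# benchmarked DLSS number.
UPSCALE_COST_MS = 1.0

def budget_left(fps: int, upscale_ms: float = UPSCALE_COST_MS) -> float:
    """Milliseconds left for actual rendering after the upscale pass."""
    frame_ms = 1000.0 / fps
    return frame_ms - upscale_ms

for fps in (60, 120, 500):
    print(fps, round(budget_left(fps), 2))
# At 60 fps a 1 ms pass eats ~6% of the 16.67 ms budget;
# at 500 fps it eats 50% of the 2 ms budget.
```

This is the core of the argument: a fixed per-frame cost that is negligible at 60fps becomes a dominant fraction of the frame time as the target frame rate climbs.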

RaidenBlack3d ago

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of the time, but it'll slowly narrow to the point of being indistinguishable.
Something similar happened with generative AI like Sora... AI-generated videos were turds back when they were introduced (the infamous Will Smith eating video)... but now look at Sora, generating videos that just look like real life.

Yui_Suzumiya3d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple1013d ago

Maybe better to get a budget gaming laptop and link a DualSense to it.

= Portable console with far better graphics than a Steam Deck! + a bigger screen, and you can use it for work, etc.


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal1696d ago

Echo the sentiment here - I think the way GPUs are going, gaming could become secondary to deep learning. Wonder if the 40 series was the last true generation of gaming GPUs?

Number1TailzFan6d ago

No.. Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it; if you can only afford a low-end option, you'll be playing at just 1080p and 60fps or something.

On the other hand, if you don't care about RT or AI performance, there's always AMD, who are doing OK at the mid range.

Christopher6d ago

***or play at just 1080p and 60fps or something***

My over-2-year-old laptop GPU still runs fine. I think this is more a reason why GPUs are shifting to other priorities: the market reach for new users is shrinking as more PC gamers put off replacing older, still-working parts that run RT/AI well enough as it is. Not to say there aren't people who still upgrade, but I think the market for having the latest and greatest is smaller than it has been the past two decades. The problem is we aren't advancing at the rate we were; we're reaching the flattening of that exponential curve. We need another major technological breakthrough to restart that curve.

D0nkeyBoi6d ago

The unremovable ad makes it impossible to read the article.

Tzuno6d ago (Edited 6d ago )

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing6d ago

You also need to consider that NVIDIA are heavily invested in cloud gaming. So they are likely going to make moves to push you into yet another lifelong subscription service.

Kayser816d ago

NVIDIA will never change their price point until AMD or Intel makes a GPU that is comparable and cheaper than theirs.
It happened before in the days of the GTX 280, when they cut the price from $650 to $450 in a matter of 2 weeks because of the HD 4870, which was selling at $380.


The 7 Best Western RPGs: Immersive Adventures

RPGs are often huge, sprawling endeavours. With limited playtime, we have to choose wisely, so here are the best western RPGs available today.

SimpleSlave10d ago

"I started playing games yesterday" the List... Meh!

How about a few RPGs that deserve some love instead?
1 - Alpha Protocol - Now on GOG
2 - else Heart.Break()
3 - Shadowrun Trilogy
4 - Wasteland 2
5 - UnderRail
6 - Tyranny
7 - Torment: Tides of Numenera

And for a bonus game that flew under the radar:
8 - Banishers: Ghosts of New Eden

DustMan10d ago

Loved Alpha Protocol in all its glorious jank. Great game.

SimpleSlave10d ago (Edited 10d ago )

Not only glorious jank, but the idea that the story can completely change depending on what you do, say, or side with makes it one of the most forward-thinking games ever. The amount of story permutation is the equivalent of a Hitman level, but in story form. And it wasn't just that the story changed; no, you met completely new characters, or missed them, depending on your choices. It made Mass Effect feel static in comparison.

Alpha Protocol was absolutely glorious, indeed. And it was, and still is, more Next Gen than most anything out there these days. In this regard at least.

Pity.