The Witcher 3 dev: targeting 30 FPS with PhysX destruction, skeptical on Mantle

CD Projekt RED is targeting 30 FPS with PhysX-based destruction for The Witcher 3. They are also skeptical of AMD's Mantle.

Festano1646d ago

I'm also skeptical about Mantle; I prefer DirectX, which is supported by both Nvidia and AMD. I'll certainly be getting this on PC.

SnakeCQC1646d ago

A massive problem with PC gaming is the lack of optimisation, because devs lack true low-level hardware access (which is why games on consoles look great even on decade-old hardware).
A lot of devs hate DirectX because of how it encumbers them. I'm pretty sure Nvidia is getting money from MS to stay with DirectX.
I really find it unimaginable that Mantle won't give a massive performance increase.
As a GTX 670 owner, the only thing keeping me with Nvidia at the moment is how good ShadowPlay is, but once Mantle is activated in BF4 and the framerate numbers are out, I may switch over to AMD.

decrypt1646d ago (Edited 1646d ago )

PC optimization is pretty damn good. In fact, I would say more games are better optimized on PC than on console. Check out games like Skyrim, Mafia 2, BF3, and Dragon Age; they all have problems on console.

Every platform has its share of badly optimized games. However, I would bet consoles have more of those than PC. The PS3 was notorious for its share of badly optimized games.

Exactly. People also tend to ignore the advantages of DX. Ask people who game on consoles how much it sucks not to have BC. Thousands of dollars' worth of games bought last gen, all rendered worthless this gen, unless they're willing to keep reinvesting in old hardware.

DX is pretty damn efficient for what it is. Check out a 7-year-old GPU like the 8800GTX. In theory it's about 2.5x as powerful as the PS3 or Xbox 360. Even today the 8800GTX will play most games at 1080p where the consoles do 720p; that's about a 2x difference in real-world performance.

Hence I would think DX does a pretty good job when it comes to efficiency, while giving us a bridge to backward compatibility that no other platform can offer.

NoLongerHereCBA1646d ago

I don't see why devs would hate on DirectX, because I don't see how they're being forced to use it. If they want, they can code in assembly, but they won't, since it's really difficult and a pain in the ass to optimize for different specs.

DirectX is there for developers/publishers who want to save an insane amount of money on optimization/compatibility. There is almost no reason for them to do it the hard way.

Alexious1645d ago

You're right, lower level hardware access would be good on PC but as pointed out by Torok (and previously by Carmack), some of these optimizations can be achieved with OpenGL as well, and OpenGL is compatible with pretty much every architecture out there - unlike Mantle.

Add to that the fact that Mantle requires more effort and more development time/resources, and I don't really see a majestic future for this API. Moreover, having too many APIs would just be a hassle for PC development in general.

As far as NVIDIA goes, you're forgetting G-SYNC which is another very good reason that is keeping me and many others in the "green field".

frostypants1645d ago (Edited 1645d ago )

@ decrypt: "PC optimization is pretty dam good."

No, it's terrible. There is very little consistency to the PC platform (to the extent that it hardly qualifies as one), so true optimization to the hardware level is almost impossible.

Prime1571645d ago

@decrypt "DX is pretty damn efficient for what it is. Check out a 7-year-old GPU like the 8800GTX."

I am still using that card. I prefer my PS3. Just saying, since I laughed at the mention, then got really embarrassed. I'm waiting for prices to drop about $200 in March before getting a new card.

AndrewLB1645d ago

nVidia has had a low level hardware access API for years now called NVAPI and it works very well. Additionally, nVidia has CUDA which provides both a low level and high level API for GPGPU and PhysX calculations to be done on the GPU itself.

This nonsense that AMD fanboys keep repeating, that nVidia is screwed now that Mantle is coming and that they need their own API if they think they're going to compete... what a joke.

pete0071645d ago

Changing hardware based on one game's benchmarks? Either you're lying, or you're a wealthy b***ard.
Like all proprietary technologies out there, I'm against it and won't support it. Even Nvidia PhysX, which I believe is a great feature in games, isn't sticking with the masses and isn't widely implemented in the gaming market.
I'm on a 680 SLI that still crunches everything at max settings, and while Mantle might be a nice performance jump, it's also very limited at the beginning: just two cards, only one game or two... whatever.

C-H-E-F1646d ago

Well I LOVE this, because that means I'll have closer to zero lag playing The Witcher 3 on my Vita, since it won't be downgrading from 60fps to 30fps :D... WOOOOOOT

Allsystemgamer1646d ago

It's not 60 fps on consoles, if that's what you're getting at...

L0Lcano1645d ago


What he's trying to say is that the game won't be a massive downgrade when he plays it through Remote Play on his Vita. This is because when you use Remote Play it's capped at 30fps, so it really is different from the 60fps fluidity you get playing directly on PS4. I agree with him, and I now actually see a sort of weird advantage in 30fps releases, since I yearn to play them on Vita.

Alexious1645d ago

DAT Remote Play. Let's hope they upgrade it to 60fps, anyway.

C-H-E-F1645d ago

Disagrees because I want the best experience on my Vita?? Lol, wow. Oh well.

FPS stands for FRAMES PER SECOND in this case, bro, NOT First Person Shooter.
Thanks for trying to clear it up for me; I was saying just that. And yes, the Vita makes 30fps way more desirable to me now, because I actually play my PS4 on my Vita as much as or more than I play it on my TV with the DS4.

YoungPlex1645d ago (Edited 1645d ago )

I understood what you were trying to say! I felt that way with NFS and ACIV: since both are running at 30fps, there isn't much of a difference, so I do enjoy using Remote Play on those two titles. As far as games running at 60fps go, like BF4 and Resogun, I won't dare use Remote Play; it doesn't ruin the experience, but it isn't as solid as playing directly off the TV. I hope they can improve Remote Play down the line, but it still works pretty well with games that run at 30fps; right now the main drawback is the resolution difference between the two. Hopefully it can handle games at 60fps in the near future.

pete0071645d ago

LOL... playing a next-gen title with all the eye candy on a 5-inch screen... you killed me

starchild1646d ago

All I know is I can't wait for this incredible game. The first two are among my favorite games of all time.

SlapHappyJesus1645d ago

The Witcher 2 might just be my favorite RPG, beating out even Morrowind and New Vegas.
That game was everything an action-RPG should try to be. The combat was actually fun and engaging while still being very tactical and challenging.
They also didn't sacrifice depth of mechanics; The Witcher 2 is actually a more complex game than most RPGs released today.
Just a fantastic release all around, and I can't wait for the third game.

Alexious1645d ago

To be honest, I wasn't the biggest fan of them, mainly because of the combat system, but it seems to have been refined a lot in The Witcher 3.

amodestoccasion1645d ago

The Witcher 1 has some major bullshit, but aside from that it's one of my favorite games ever. 2 improves on it in almost every way, especially replay value! And it looks just gorgeous. I love The Witcher universe, having read a couple of the books, and I can't wait to dive back in for more.

shutUpAndTakeMyMoney1646d ago

AMD needs to make it open source for all; that would move PC gaming in a new direction. But they want to profit from it, so it will probably die.

Dasteru1645d ago

It already is open source and always has been. People like to conveniently ignore official statements from ATI and make up their own versions of reality just to hate on Mantle for no reason.

tee_bag2421645d ago

AMD have said it's open source, but that's mainly just lip service. PhysX is available to AMD too, but that wasn't embraced either. It will be a cold day in hell before nVidia takes on Mantle.
Which is a shame, but they have their reasons.

DeadManIV1645d ago

Mantle can be supported by Nvidia if they choose - AMD are not stopping them.

sdozzo1646d ago (Edited 1646d ago )

Just put out a good game and everything else will be fine.

Wouldn't want Dark Souls to steal your thunder again.

KING851646d ago

Don't know if it stole its thunder, as both were excellent games. Personally, I prefer The Witcher.

Dark111646d ago (Edited 1646d ago )

Dark Souls? lol
The Witcher has a great STORY and CHARACTERS and interesting DIALOGUE, while DS is just about killing monsters randomly.

Both are great, but they're two different RPGs.

MrSwankSinatra1645d ago

You obviously never played Dark Souls if you honestly think that's what it's all about. Dark Souls tells a very grim story in a very unconventional way, which almost no game does aside from its predecessor, Demon's Souls. One of the main things is that it leaves stuff open to interpretation, allowing you to create your own theories and explore others' theories as well.

decrypt1646d ago

Lol, Dark Souls. There's really no comparison here; the Witcher series is in a league of its own.

SlapHappyJesus1645d ago

Is that something that happened?

I wasn't aware.
And here I thought Witcher 2 propelled CD Projekt to the heights of RPG development, cementing them as one of the truly great developers around at the moment.

Alexious1645d ago

Personally, I think Dark Souls is a bit overrated.

DeadlyFire1646d ago

Well I would be skeptical about an AMD API if I were using NVIDIA PhysX SDK in my engine as well.

Kayant1646d ago

That's not why he's skeptical.

OT - A few devs have said the same thing about adoption by other vendors, which makes sense, and about how OpenGL extensions can offer the same gains. If that's the case, why aren't more devs coding in or moving to OpenCL?

DeadlyFire1646d ago (Edited 1646d ago )

It's just utilizing more GPU compute; that's the whole point of Mantle on the PC. It frees up headroom for the CPU as more tasks are sent to the GPU. Who wouldn't want that? Basically, where the CPU would normally run the game at 98% use, Mantle could kick that down to 60% or something like that, leaving the CPU room to run even more tasks. That extra space could be used for any number of things inside a game: a longer draw distance is a basic example, and a framerate boost is very likely.

OpenCL can grow, I believe, but it won't happen overnight. It's been widely adopted in GPU tech, so it's there if developers choose to use it. I don't believe the extensions would offer the same effect as Mantle, but I do believe they could help.
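The CPU-headroom argument above can be sketched with a toy cost model. All numbers here are invented purely for illustration, and this is not real Mantle or DirectX API code; it only shows why an API that validates state once, instead of on every draw call, leaves the CPU with spare capacity:

```python
# Toy model of per-draw-call CPU overhead. The cost constants are
# hypothetical; real driver costs vary wildly by hardware and workload.

VALIDATE_COST = 10  # assumed CPU cost of driver-side state validation
SUBMIT_COST = 1     # assumed CPU cost of just queueing a command

def thick_api_frame(draw_calls):
    """High-overhead model: the driver re-validates state on every call."""
    return draw_calls * (VALIDATE_COST + SUBMIT_COST)

def thin_api_frame(draw_calls):
    """Low-overhead model: state is validated once, then calls are cheap."""
    return VALIDATE_COST + draw_calls * SUBMIT_COST

calls = 5000
print(thick_api_frame(calls))  # 55000 units of CPU work per frame
print(thin_api_frame(calls))   # 5010 units: most of the CPU is now free
```

Under this (simplistic) model, the freed-up CPU time is what could go toward more draw calls, longer draw distances, or a higher framerate.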

Kayant1646d ago

Ah thanks for the in-depth explanation :D

Gabenbrah1646d ago

Cannot wait to get this game on PC; it'll surely be a masterpiece just like the previous Witcher games. I'll most likely upgrade to a GTX 880 for this game and Star Citizen.

SlapHappyJesus1645d ago

Really hoping the 800 series ends up being all it's made out to be.
It would be the only real reason for me to upgrade from even my 680 at the moment.

R6ex1645d ago

I'm prepared for the long wait to 20nm GTX 980.

tee_bag2421645d ago

Same. It's looking like the 880 in Q1 might be a fizzer, and the die shrink to 20nm we all want will come in Q3 2014.

edgeofsins1646d ago

I wonder if allowing Nvidia cards to run PhysX applications as a secondary card while your main card is an AMD card would make sales go up or down. I think sales would go up for a little bit from a few AMD users but then would go down in the long run since a dedicated PhysX card does not need to be upgraded for a long time.

If Mantle is a success, I'll be very happy, because my laptop should be able to support it and could run games better if they support it. It's just unfortunate I can't do PhysX, since it's AMD and the processor isn't good enough to support it. Some games seem to shut it off even though I have the software installed, because they look for Nvidia cards. Or maybe I'm doing it wrong.

Lon3wolf1645d ago

Some games don't support software PhysX; that's why it gets disabled.

ATiElite1645d ago

There is NO such thing as software PhysX!

PhysX is a proprietary function of Nvidia GPUs and is all hardware based.

You can't run PhysX on an AMD GPU; the processing gets offloaded to your CPU instead, so your CPU gets bogged down, you take a HUGE performance hit, and not all PhysX features work through a CPU.

Now, you do have "software physics"; notice the spelling, as it matters a lot. We have always had software physics. Collision detection was one of the first physics processes and is STILL done by the CPU today!

All platforms had software physics at one time, but the trend nowadays is to offload physics to the GPU through GPGPU programming.
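The kind of CPU-side collision detection mentioned above can be illustrated in a few lines with an axis-aligned bounding-box (AABB) overlap test, one of the oldest and simplest "software physics" checks. This is a generic sketch, not code from any PhysX SDK:

```python
# Minimal CPU-side collision detection: two axis-aligned rectangles
# overlap only if they intersect on both the x and y axes.

def aabb_overlap(a, b):
    """a and b are (min_x, min_y, max_x, max_y) bounding boxes."""
    return a[0] < b[2] and a[2] > b[0] and a[1] < b[3] and a[3] > b[1]

player = (0, 0, 2, 2)
crate  = (1, 1, 3, 3)
wall   = (5, 0, 6, 4)

print(aabb_overlap(player, crate))  # True: the boxes intersect
print(aabb_overlap(player, wall))   # False: they are separated on x
```

Checks like this run every frame on the CPU regardless of GPU vendor; what GPU-accelerated physics adds is offloading the heavier work (particles, cloth, debris) to the graphics card.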

edgeofsins1645d ago

Oh, I know that. It's just that Borderlands 2 doesn't let me turn on PhysX even though I have the software installed. It should let it be processed by my CPU, since I don't have an Nvidia card.

Lon3wolf1645d ago (Edited 1645d ago )


I have the software PhysX runtimes installed right now, so I'm not sure why you think there's no software PhysX (it's the software that pushes the PhysX work over to the CPU for processing). Some older games I've recently installed actually require a certain version of the runtimes, and some Steam games install the PhysX software component too.

I can use PhysX in BL2 on my AMD card. I do have an 8350 CPU (dunno why that should make any difference, though), so I'm not sure why you can't. Most strange. Do you have the PhysX software listed in Add/Remove Programs?

Edit: just saw you had the software installed. 'Tis a mystery.

Alexious1645d ago

@ATiElite: you're misinformed. PhysX is a middleware SDK, and many titles use it on consoles, even current-generation ones.

NVIDIA just has hardware acceleration on their cards.
