Far Cry 3 - Offloading graphical effects to the CPU, differences between PC and console builds

DSOGaming writes: "Rage3D’s member ‘Napoleonic’ has spotted an interesting PDF from Ubisoft’s Far Cry 3 GDC 2012 presentation. In this presentation, Ubisoft has highlighted the differences between the PC and the console builds of Far Cry 3, and revealed that some of the graphical effects will be offloaded to the CPU in order to save some additional GPU cycles."



ProjectVulcan (2037d ago, edited)

I don't expect a huge amount from Far Cry 3, but I do expect it'll look miles better on PC than it does on console. Far Cry 2 did; it was waaaaaay better looking on PC, and with quick save it was a better game overall, cutting out a great deal of that trekking frustration.

Far Cry 2 made good use of multiple cores: with 3 or more the game sped up a lot, and a decent quad core was noticeably faster than a dual. It needed a powerful machine to max the game in DX10, BUT it also didn't need a monster PC to wallop the console quality...

...except that was FOUR years ago. PC hardware has moved on a long way too, and the gap between PC and console is far bigger now than it was with FC2.

Even a modest modern gaming PC is gonna obliterate consoles on this game. A Radeon 4850 had no problem beating up the consoles on FC2.

Four years later, a Radeon 7850 is more than twice as fast as that, and even budget desktops now sport AMD quad cores. Four years ago the average gaming machine might have been a dual core and an 8800GT; now it has to be at least twice as fast as that. The top 2 cards on Steam are the GTX560 and 550Ti, making up over 10 percent of gamers on the service.

You are all screaming as if you'll need some monster machine. To max it, maybe. To utterly destroy the consoles?....

....No problem even for less than stellar hardware

john2 (2037d ago, edited)

The CPU offload was introduced as a console optimization and should stay on consoles, plain and simple. If developers want to take advantage of the CPU, then by all means they should do so. Let them overhaul the AI, for example, with more complex calculations. CPUs, though, should not be doing graphical work on the PC, especially when we have all those powerful cards. What you also fail to understand is that even high-end CPUs would benefit from its absence on the PC. And last time I checked, developers should optimize their games on each platform and not rely on additional raw power to overcome their coding leftovers.

Persistantthug (2037d ago, edited)

"...especially when we have all those powerful cards."
Most PCs don't have the powerful cards you're implying.
Most PCs are WALMART'ish PCs, and they can't just be discounted the way some may want or think they should be.

Developers must cater their games to those lower-end PCs and laptops too, as they are needed and necessary to the total economy and ecosystem of PC gaming.
Without those proverbial Walmart'ish PCs (many of which are less capable game machines than the current HD consoles, btw)....PC gaming would crumble.

A lot of people seem to forget this.

wicko (2037d ago, edited)

@John2, grow up. What you fail to understand is that most programmers working on games like this are doing obscene amounts of overtime. It's not like this is a new development or anything; it's been going on for many years and has been reported on hundreds of times. So when you say stuff like "lazy developers" and "coding leftovers", you just sound like a tool, to me and to anyone else with a basic understanding of the industry.

Not to mention your arguments completely expose your knowledge (or lack thereof) of programming.

Yeah, imad.

john2 (2037d ago, edited)


Actually, you are wrong. If you're using a laptop to game, then you're doing it wrong. Not only that, but there are options to adjust the graphics if your laptop/Walmart'ish PC is not capable of running a game at High details. In fact, laptops would benefit from removing that: by reducing the details, they'd get a smoother gameplay experience. And a single GTX275 - yes, that really old GPU - is plenty for visuals that are even better than the consoles' (in the same 30fps target group). Saying that they left the console optimization in on purpose for the Walmart'ish PCs/laptops is like saying that excluding graphical options is the best thing to do, since laptop gamers would not otherwise max some games out.

Ducky (2037d ago, edited)


It depends on how the offloading happens.
If it happens after the GPU is capped out, then that's fine. It helps people with low-end hardware.

If it happens before the GPU is capped out, then it will actually hurt people with better hardware because a CPU will do things slower than a GPU would.
Since most PC gamers actually have a strong GPU, offloading to CPU isn't going to be seen as a desirable thing.

It remains to be seen how Ubisoft does it. It's either a smart move that benefits everyone, or a lazy move that leads to poor optimization.
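Ducky's capped-GPU condition can be sketched as a tiny scheduling heuristic. This is a toy illustration only; all names, the frame budget, and the 4x CPU-slowdown factor are hypothetical assumptions, not anything from Ubisoft's presentation:

```python
# Toy sketch of the offload heuristic described above (all names and
# numbers hypothetical): move an effect to the CPU only when the GPU
# is already past its frame budget.

FRAME_BUDGET_MS = 33.3  # 30 fps target

def choose_processor(gpu_frame_ms: float, cpu_idle_ms: float, effect_cost_ms: float) -> str:
    """Decide where a graphical effect should run for the next frame."""
    if gpu_frame_ms + effect_cost_ms <= FRAME_BUDGET_MS:
        # GPU has headroom: keep the effect on the faster hardware.
        return "gpu"
    if cpu_idle_ms >= effect_cost_ms * 4:
        # GPU is capped out; assume the CPU runs the effect ~4x slower,
        # so only offload if there is enough idle CPU time to absorb it.
        return "cpu"
    # Neither has room: drop or cheapen the effect instead.
    return "skip"
```

Under this sketch, strong-GPU machines never pay the offload cost, which is exactly the "after the GPU is capped out" case Ducky calls fine.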

fossilfern (2037d ago)

Exactly! Also, I want to see more OpenCL for physics so developers aren't stuck with Nvidia's PhysX for hardware-accelerated physics! GPUs have supported OpenCL for a while and no developer is using it!

ProjectVulcan (2037d ago, edited)

An x86 CPU is a very general-purpose thing and, in all honesty, is useless for most intensive rasteriser tasks. Anything that can be done with hardware acceleration, i.e. on the GPU, really MUST be done in hardware. It's just preposterously faster and more efficient.

If someone finds that a task costs a similar amount of processing time written for the CPU or the GPU - say physics - then it could be good to do it on the CPU.

However, graphics and physics these days are massively parallel tasks in general. That is, they do the same type of calculation over and over again, so the more you can do together at once, the faster it can be done.

GPUs have had a decade of development to make them capable of doing very specific things very fast: high parallelism with thousands of 'cores' and hundreds of gigabytes per second of memory bandwidth on dedicated chunks of memory, making them the fastest parallel processors around.

Architectures change, but there is a reason the average x86 CPU has gone from 1 core to maybe just 8 in a decade, while GPU shader cores have gone from half a dozen (in, say, the popular 2002 Nvidia GeForce4 Ti 4400, with 4 pixel shaders and 2 vertex) to several thousand (the 660Ti has 1344 unified in 2012!)

That's 8 times more cores for a CPU architecture, but what? Over 200 times more for an equivalent GPU architecture?

CPU tasks are just not as parallel and thus the hardware has not developed as much in that direction. Trying to use them for such tasks is often utterly pointless when you have a massively parallel GPU sat there...
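The data-parallelism point above can be illustrated with a toy sketch (Python standing in for shader code; the `shade` operation is a made-up example): the per-pixel work is independent, so it can be split across any number of cores without changing the result, which is exactly the property GPUs exploit.

```python
# Minimal illustration of a "massively parallel" graphics task: the SAME
# operation applied independently to every pixel, which is what a GPU's
# thousands of shader cores run side by side.

def shade(pixel: int) -> int:
    # Hypothetical per-pixel operation: darken by 25%, clamp to 0..255.
    return max(0, min(255, (pixel * 3) // 4))

def render_serial(framebuffer):
    # A single CPU core walks the pixels one at a time...
    return [shade(p) for p in framebuffer]

def render_in_chunks(framebuffer, cores):
    # ...while a parallel processor splits the work into independent chunks.
    # Each chunk could run on its own core with no coordination, because
    # shade() never looks at a neighbouring pixel.
    chunks = [framebuffer[i::cores] for i in range(cores)]
    shaded = [[shade(p) for p in chunk] for chunk in chunks]
    # Re-interleave the chunks back into the original pixel order.
    out = [0] * len(framebuffer)
    for c, chunk in enumerate(shaded):
        out[c::cores] = chunk
    return out
```

Because the two versions always produce identical output, the only difference is wall-clock time, so throughput scales with how many chunks can genuinely run at once.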

On PC there is less to gain from avoiding working the GPU hard than there is on consoles. Finding a balance is important, but that's a basic thing.

Let's not forget modern PC GPUs are massively more flexible than those in current consoles too; they are way more programmable and carry a wider set of abilities.

Often it is also easier to increase a machine's GPU performance than its CPU performance, which can require changing an entire platform, i.e. the chipset too. Another good reason why GPU performance is the focus on PC.

RyuStrife (2037d ago)

This is going to be troubling for GPU boosts, especially Nvidia's. The same thing happened with Guild Wars 2, where they offload to the CPU and kill the max performance of the (Nvidia) GPUs, creating frame rates that jump between 30 and 60+.

Blacktric (2036d ago)

The same thing is also present in ARMA 2 and Borderlands 2, although the latter uses the CPU more efficiently than the horribly optimized ARMA 2 engine, which doesn't use anything to its full potential.

Saryk (2037d ago)


I agree with you on that one. That is why World of Warcraft is the best MMO out there: it catered to the lowest PC settings. I think PC software developers should set decent industry-wide minimum specs for their games and increase them accordingly.

stragomccloud (2037d ago)

Sounds good to me. I'll be getting the new AMD FX-8350 Vishera CPU when it releases next month~

Still want to upgrade my HD5850, but I guess I can still wait. Leaning towards a GTX570 or HD7870....

DoomeDx (2037d ago)

Depends on the resolution you are playing at.

I play at 1440x900 resolution, and my GTX570 maxes out every game I throw at it, at 60 FPS.

But when I play on my Full HD TV, the video card has some problems.
So if you play at a 'lower' resolution than 1920x1080, you will be fine with the GTX.

TABSF (2037d ago, edited)

People really should not be going out and expecting Intel Pentium G620s or AMD Llano A8-3850s to run games well. These CPUs are not terrible, as you could play older games quite easily or run on the lowest settings possible.

What is not good is to expect a port to PC and then expect it to just work. We got a hash of a port with GTA IV; however, if you've got a really good quad (Sandy/Ivy Bridge) or you've got hyperthreading, then you should have no problems.

i5 2500k / i7 2700k Sandybridge
i5 3570k / i7 3770k Ivybridge

These CPUs are extremely powerful and should be a great source of performance for devs.

In terms of the GPU, I don't care what developers say when it comes to fragmentation: learn the hardware and stop relying on more power from Nvidia and AMD.

It is ridiculous that in 2007 the 8800 GTX or Ultra could not be touched by anything, let alone consoles, yet 5 years on these cards struggle to play new games while the consoles can.

@ stragomccloud

HD 7870 > GTX 570

stragomccloud (2037d ago)

You're absolutely right about engine optimization. Since PCs are so much more powerful, it seems like devs take that for granted.

I've seen the benchmarks. Seems pretty good. Sometimes I wonder about not being able to use PhysX, but I've heard of people putting really old low-end Nvidia cards in their systems as dedicated PhysX-only cards.

AMD always seems to give better performance for the price.

ProjectVulcan (2037d ago, edited)

The 7870 is technically a generation newer process than the GTX570, so it is far better for power consumption than the 570.

Really it is the GTX660 you should compare with the 7870. The Radeon probably offers slightly higher performance and better overclocking; the Nvidia offers useful extra software-level features like FXAA, PhysX and a dynamic Vsync mode.

Personally I feel you wouldn't notice the extra performance of the 7870 very much in practical terms, but in my opinion you would certainly notice the usefulness of those three extra features I mentioned on the Nvidia cards! I try to point out that value is more than just raw performance these days when it comes to graphics cards. FXAA in particular is a wonderful little software AA trick that gives nice edge smoothing to the majority of games for far less performance cost than 4xMSAA. It works brilliantly in many games that do not support MSAA.
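For the curious, the core idea behind FXAA can be sketched in a toy 1-D form. This is an illustration of the principle only, not Nvidia's actual algorithm: compute a luma value per pixel, find high-contrast neighbours, and blend only those pixels.

```python
# Toy sketch of the idea behind FXAA (NOT the real algorithm): detect
# high-contrast edges via luma, then blend just those pixels.

LUMA = (0.299, 0.587, 0.114)  # standard Rec. 601 luma weights

def luma(rgb):
    # Perceptual brightness of an (r, g, b) pixel.
    return sum(w * c for w, c in zip(LUMA, rgb))

def smooth_row(row, threshold=32.0):
    """Blend any pixel whose luma differs sharply from a neighbour's (1-D toy)."""
    out = list(row)
    for i in range(1, len(row) - 1):
        left, mid, right = luma(row[i - 1]), luma(row[i]), luma(row[i + 1])
        if max(abs(mid - left), abs(mid - right)) > threshold:
            # A simple weighted 3-tap average stands in for FXAA's
            # directional edge filter.
            out[i] = tuple(
                (a + 2 * b + c) // 4
                for a, b, c in zip(row[i - 1], row[i], row[i + 1])
            )
    return out
```

Flat areas pass through untouched, which is why the real thing is so cheap compared to 4xMSAA: it only does extra work along detected edges.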

As for engine optimisation, PC developers must do better, but many already do well. It is a niggling problem for some titles, but for many others it is not, and they run excellently on older and more modest hardware.

I don't believe this is a devastating problem for PC. I would say that games that are very poorly optimised are in a small minority, and notably, we can name a few obvious culprits.

Generally with any games these days, console or not, they are often buggy and broken when they launch and get improved after a couple of months. The same can be said of PC optimisations. Skyrim was pants for optimisation; within a week a mod fixed the broken CPU performance, and then Bethesda finally wised up and sorted it officially too. I don't think that many games STAY massively unoptimised hogs.

A Radeon 6750 (essentially the same thing as the now ancient Radeon 5750!) here has no trouble maxing Borderlands 2, a brand new title, at 16 x 10 rez. Maximum settings are actually way superior to console settings, which are equivalent to medium at best running @ 1280 x 720. It'll certainly have no problems doing 1080p with only a couple of settings toned down, minor adjustments that still put it ahead of consoles at a far higher resolution.

I game on a powerful desktop, but also a little laptop when I am away from home. It has a GTX460M in it, which is about as fast as a desktop GT440/5670.

It has little trouble beating the consoles in virtually everything I play on PC, either with better settings or more resolution, but usually both!

Blacktric (2036d ago)

Get a GTX660 Ti. You'll get performance very close to a GTX 580's while using less power than a GTX560 Ti. Also, as far as I know, most of the new ones come with a Borderlands 2 download coupon.

taquito (2037d ago, edited)

Watch your mouth, peasant!

I was being generous; Far Cry 3 on console won't even look as good as Far Cry 1 on PC maxed with a few mods. I'm TOTALLY BEING SERIOUS.

Not trolling, console games just look terrible.

decrypt (2037d ago)


Get with the times, console gamers don't care about graphics. No surprise they don't mind gaming on 6-year-old hardware.

kamakaz3md (2037d ago)

Very nice. If people actually knew what a good game was, they would pick this up and stop getting so overhyped about crap like AC3 and COD.
