270°
Submitted by john2 727d ago | news

Far Cry 3 - Offloading graphical effects to the CPU, differences between PC and console builds

DSOGaming writes: "Rage3D’s member ‘Napoleonic’ has spotted an interesting PDF from Ubisoft’s Far Cry 3 GDC 2012 presentation. In this presentation, Ubisoft has highlighted the differences between the PC and the console builds of Far Cry 3, and revealed that some of the graphical effects will be offloaded to the CPU in order to save some additional GPU cycles." (Far Cry 3, PC, PS3, Xbox 360)

yewles1  +   727d ago
Translation: "WHAT??? CPU POWER??? BUT I WANTED IT TO BE ALL ABOUT MY ULTRA POWERED 580 AND JUST LEAVE MY 1999 PENTIUM III ALONE!!! WAAAAAAAAHHH!!! T_T"
Ashriel  +   727d ago
pretty much
vulcanproject  +   727d ago
I don't expect a huge amount from Far Cry 3, but I do expect it'll look miles better on PC than it does on console. Far Cry 2 did; it was waaaaaay better looking on PC, and with quick save it was a better game overall, cutting out a great deal of that trekking frustration.

Far Cry 2 made good use of multiple cores: with three or more the game sped up a lot, and a decent quad core beat a dual handily. It needed a powerful machine to max the game in DX10, BUT it also didn't need a monster PC to wallop the console quality...

...except that was FOUR years ago. PC hardware has moved on a long way since, and the gap between PC and console is far more massive now than it was with FC2.

Even a modest modern gaming PC is gonna obliterate consoles on this game. A Radeon 4850 had no problem beating up the consoles on FC2.

Four years later, a Radeon 7850 is more than twice as fast as that, and even budget desktops now sport AMD quad cores. Four years ago the average gaming machine might have been a dual core with an 8800GT; now it is at least twice as fast as that. The top two cards on Steam are the GTX 560 and 550 Ti, together making up over 10 percent of gamers on the service.

You are all screaming as if you'll need some monster machine to max it and utterly destroy the consoles?....

....No problem even for less than stellar hardware
#1.1.1 (Edited 727d ago ) | Agree(8) | Disagree(0) | Report
john2  +   727d ago
The CPU offload was introduced as a console optimization and should stay on consoles, plain and simple. If developers want to take advantage of the CPU, then by all means they should do so; let them overhaul the AI, for example, with more complex calculations. CPUs, though, should not be doing graphical work on the PC, especially when we have all those powerful cards. What you also fail to understand is that even systems with high-end CPUs would benefit from the offload's absence on the PC. And last time I checked, developers should optimize their games for each platform, not rely on additional raw power to cover their coding leftovers.
#1.2 (Edited 727d ago ) | Agree(16) | Disagree(3) | Report | Reply
Persistantthug  +   727d ago
@john2.....Well, I think you got one part of your sentiment wrong....
"...especially when we have all those powerful cards."

Most PCs don't have the powerful cards you are implying.
Most PCs are WALMART'ish PCs, and they can't just be discounted the way some may want or think they should be.

Developers' games must cater to those lower-end PCs and laptops too, as they are necessary to the total economy and ecosystem of PC gaming.
Without those proverbial Walmart'ish PCs (many of which are less capable game machines than the current HD consoles, btw)....PC gaming would crumble.

A lot of people seem to forget this.
#1.2.1 (Edited 727d ago ) | Agree(4) | Disagree(14) | Report
wicko  +   727d ago
@John2, grow up. What you fail to understand is that most programmers working on games like this are doing obscene amounts of overtime. It's not like this is a new development or anything; it has been going on for many years and has been reported on hundreds of times. So when you say stuff like "lazy developers" and "coding leftovers", you just sound like a tool, to me and to anyone else with a basic understanding of the industry.

Not to mention your arguments completely expose your knowledge (or lack thereof) of programming.

Yeah, imad.
#1.2.2 (Edited 727d ago ) | Agree(1) | Disagree(14) | Report
john2  +   727d ago
@Persistantthug:

Actually, you are wrong. If you're using a laptop to game, then you're doing it wrong. Not only that, but there are options to adjust the graphics if your laptop/Walmart'ish PC is not capable of running a game at High details. In fact, laptops would benefit from removing that offload, as by reducing the details they'd get a smoother gameplay experience. And a single GTX 275 - yes, that really old GPU - is plenty for visuals that are even better than the consoles' (in the same 30fps target group). Saying that they kept the console optimization on purpose for the Walmart'ish PCs/laptops is like saying that excluding graphical options is the best thing to do, since laptop gamers could not otherwise max some games out.
#1.2.3 (Edited 727d ago ) | Agree(13) | Disagree(2) | Report
Ducky  +   727d ago
@Persistant

It depends on how the offloading happens.
If it happens after the GPU is capped out, then that's fine. It helps people with low-end hardware.

If it happens before the GPU is capped out, then it will actually hurt people with better hardware, because a CPU will do things slower than a GPU would.
Since most PC gamers actually have a strong GPU, offloading to the CPU isn't going to be seen as a desirable thing.

It remains to be seen how Ubisoft does it. It's either a smart move that benefits everyone, or a lazy move that leads to poor optimization.
#1.2.4 (Edited 727d ago ) | Agree(8) | Disagree(0) | Report
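
To make Ducky's distinction concrete, here is a minimal C++ sketch of the idea (purely hypothetical; nothing from Ubisoft's presentation): an effect is moved to the CPU only when the GPU is already over its frame budget and the CPU has headroom to spare.

```cpp
#include <cstdio>

enum class Unit { Gpu, Cpu };

struct FrameTimings {
    double gpuMs;  // measured GPU frame time (e.g. via GPU timer queries)
    double cpuMs;  // measured CPU frame time
};

// budgetMs: per-frame target, e.g. 33.3 ms for a 30 fps game.
// effectCostOnCpuMs: estimated CPU cost of the effect if offloaded.
Unit chooseUnit(const FrameTimings& t, double budgetMs, double effectCostOnCpuMs) {
    const bool gpuCapped   = t.gpuMs >= budgetMs;                    // GPU is the bottleneck
    const bool cpuHasSlack = t.cpuMs + effectCostOnCpuMs < budgetMs; // CPU has headroom
    // Offload only helps when the GPU is saturated AND the CPU can absorb
    // the extra work; otherwise the slower CPU path drags the frame down.
    return (gpuCapped && cpuHasSlack) ? Unit::Cpu : Unit::Gpu;
}

int main() {
    FrameTimings t{35.0, 20.0};  // GPU already over a 33.3 ms budget
    Unit u = chooseUnit(t, 33.3, 4.0);
    std::printf("%s\n", u == Unit::Cpu ? "offload to CPU" : "keep on GPU");
}
```

Under this kind of scheme low-end GPUs get help and high-end GPUs are left alone; the complaints in this thread are about the case where the offload happens unconditionally.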
fossilfern  +   727d ago
Exactly! Also, I want to see more OpenCL for physics so developers don't have to rely on Nvidia's PhysX for hardware-accelerated physics! GPUs have supported OpenCL for a while now and no developer is using it!
vulcanproject  +   727d ago
An x86 CPU is a very general-purpose thing and, in all honesty, is useless for most intensive rasteriser tasks. Anything that can be done with hardware acceleration, i.e. on the GPU, really MUST be done in hardware. It's just preposterously faster and more efficient.

If someone finds that a task costs a similar amount of processing time whether written for the CPU or the GPU, say physics, then it could be good to do it on the CPU.

However, graphics and physics these days are massively parallel tasks in general. That is, they do the same type of calculation over and over again, so the more you can do at once, the faster it all gets done.

GPUs have had a decade of development to make them capable of doing very specific things very fast: high parallelism with thousands of 'cores' and hundreds of gigabytes per second of memory bandwidth on dedicated chunks of memory, making them the fastest parallel processors around.

Architectures change, but there is a reason why the average x86 CPU has gone from 1 core to maybe just 8 in a decade, while GPU shader cores have gone from half a dozen (in, say, the popular 2002 Nvidia GeForce Ti 4400, with 4 pixel shaders and 2 vertex shaders) to over a thousand (the 660 Ti has 1344 unified shaders in 2012!).

That's 8 times more cores for a CPU architecture, but what, over 200 times more for an equivalent GPU architecture?

CPU tasks are just not as parallel and thus the hardware has not developed as much in that direction. Trying to use them for such tasks is often utterly pointless when you have a massively parallel GPU sat there...

On PC there is less to gain from sparing the GPU than there is on consoles. Finding a balance is important, but that's a basic thing.

Let's not forget modern PC GPUs are massively more flexible than those in current consoles too; they are far more programmable and carry a wider set of abilities.

Often it is also easier to increase the GPU performance of a machine than its CPU performance, which can require changing an entire platform, i.e. the chipset too. That's another good reason why GPU performance is the focus on PC.
#1.2.6 (Edited 727d ago ) | Agree(4) | Disagree(0) | Report
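
The "same calculation over and over" point is easy to demonstrate. The toy C++17 sketch below applies one gamma curve to every pixel independently, so the work spreads across all available cores; a GPU does the same thing, just with thousands of shader cores instead of a handful. (Assumes a compiler with parallel-algorithm support; GCC additionally needs TBB.)

```cpp
// Toy illustration of a massively parallel task: the per-element work is
// independent, so it can be spread across however many lanes/cores exist.
#include <algorithm>
#include <cmath>
#include <execution>
#include <vector>

int main() {
    std::vector<float> luminance(1920 * 1080, 0.5f);  // one value per pixel
    // Apply the same gamma curve to every pixel, independently and in parallel.
    std::transform(std::execution::par_unseq,
                   luminance.begin(), luminance.end(), luminance.begin(),
                   [](float v) { return std::pow(v, 1.0f / 2.2f); });
}
```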
RyuStrife  +   727d ago
This is going to be troubling for GPU boost features, especially Nvidia's. The same thing happened with Guild Wars 2, where they offload to the CPU, killing the max performance of the (Nvidia) GPUs and creating frame rates that jump between 30 and 60+.
Blacktric  +   726d ago
The same thing is also present in ARMA 2 and Borderlands 2, although the latter uses the CPU more efficiently than the horribly optimized ARMA 2 engine, which doesn't use anything to its full potential.
Muffins1223  +   727d ago
I'm tired of developers doing this shit....
Saryk  +   727d ago
@Persistantthug

I agree with you on that one. That is why World of Warcraft is the best MMO out there: it catered to the lowest PC specs. I think PC developers should set a decent industry-wide minimum spec for their games and increase it accordingly.
stragomccloud  +   727d ago
Sounds good to me. I'll be getting the new AMD FX-8350 Vishera CPU when it releases next month~

Still want to upgrade my HD 5850, but I guess I can still wait. Leaning towards a GTX 570 or HD 7870....
DoomeDx  +   727d ago
Depends on the resolution you are playing at.

I play at 1440x900, and my GTX 570 maxes out every game I throw at it at 60 FPS.

But when I play on my Full HD TV, the video card has some problems.
So if you play at a resolution lower than 1920x1080, you will be fine with the GTX 570.
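
For reference, the arithmetic behind that observation: 1920x1080 pushes about 1.6 times the pixels of 1440x900 each frame, so the same card has roughly 60 percent more shading work at the higher resolution.

```cpp
// Pixel-count arithmetic behind the resolution difference:
// 1440x900  = 1,296,000 pixels per frame
// 1920x1080 = 2,073,600 pixels per frame  ->  ~1.6x the shading work
#include <cstdio>

int main() {
    const double low  = 1440.0 * 900.0;
    const double high = 1920.0 * 1080.0;
    std::printf("1080p / 1440x900 = %.2fx\n", high / low);  // prints 1.60x
}
```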
TABSF  +   727d ago
People really should not go out expecting Intel Pentium G620s or AMD Llano A8-3850s to run well for gaming. These CPUs are not terrible, as you could play older games quite easily or run on the lowest settings possible.

What is not good is to take a port to PC and just expect it to work. We got a hash job with GTA IV; however, if you've got a really good quad (Sandy/Ivy Bridge) or you've got hyperthreading, then you should have no problems.

i5 2500k / i7 2700k Sandybridge
i5 3570k / i7 3770k Ivybridge

These CPUs are extremely powerful and should be a great source of performance for devs.

In terms of the GPU, I don't care what developers say when it comes to fragmentation: learn the hardware and stop relying on more power from Nvidia and AMD.

It is ridiculous that in 2007 the 8800 GTX or Ultra could not be touched by anything, let alone consoles, yet 5 years on these cards struggle to play new games that the consoles can.

@ stragomccloud

HD 7870 > GTX 570
http://www.anandtech.com/be...
#3.2 (Edited 727d ago ) | Agree(1) | Disagree(0) | Report | Reply
stragomccloud  +   727d ago
You're absolutely right about engine optimization. Since PCs are so much more powerful, it seems like devs take that power for granted.

I've seen the benchmarks. It looks pretty good. Sometimes I wonder about not being able to use PhysX, but I've heard of people putting really old low-end Nvidia cards in their systems as dedicated PhysX-only cards.

AMD always seems to give better performance for the price.
vulcanproject  +   727d ago
The 7870 is technically a generation newer process than the GTX 570, so it is far better for power consumption than the 570.

Really it is the GTX 660 you should compare with the 7870. The Radeon probably offers slightly higher performance and better overclocking; the Nvidia, useful extra software-level features like FXAA, PhysX and a dynamic Vsync mode.

Personally I feel you wouldn't notice the extra performance of the 7870 very much in practical terms, but in my opinion you would certainly notice the usefulness of those three extra features I mentioned on the Nvidia cards! I try to point out that value is more than just raw performance these days when it comes to graphics cards. FXAA in particular is a wonderful little software AA trick that gives nice edge smoothing to the majority of games for far less performance cost than 4xMSAA. It works brilliantly in many games that do not support MSAA.

As for engine optimisation, PC developers must do better, but many do well. It is a niggling problem for many titles, but for many others it is not, and they run excellently on older and more modest hardware.

I don't believe this is a devastating problem for PC. I would say that games that are very poorly optimised are in a small minority, and notably, we can name the few obvious culprits.

Generally, games these days, console or not, are often buggy and broken when they launch and get improved after a couple of months. The same can be said of PC optimisations. Skyrim was pants for optimisation; within a week a mod fixed the broken CPU performance, and then Bethesda finally wised up and sorted it officially too. I don't think many games STAY massively unoptimised hogs.

http://www.techspot.com/rev...

A Radeon 6750 (essentially the same thing as the now-ancient Radeon 5750!) here has no trouble maxing Borderlands 2, a brand new title, at a 16:10 rez. Maximum settings are actually way superior to console settings, which are equivalent to medium at best running at 1280x720. It'll certainly have no problem doing 1080p with only a couple of settings toned down, minor adjustments that still put it ahead of consoles at a far higher resolution.

I game on a powerful desktop, but also on a little laptop when I am away from home. It has a GTX 460M in it, which is about as fast as a desktop GT 440/5670.

It has little trouble beating the consoles in virtually everything I play on PC, either with better settings or more resolution, but usually both!
#3.2.2 (Edited 727d ago ) | Agree(1) | Disagree(0) | Report
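
For the curious, FXAA's cheapness comes from doing luma-based edge detection in a single post-process pass instead of supersampling geometry. The sketch below is a heavily simplified, CPU-side illustration of that idea; it is not Nvidia's actual shader, and the 0.1 contrast threshold is an arbitrary assumption.

```cpp
#include <algorithm>
#include <cstdio>

struct Rgb { float r, g, b; };

// Perceptual brightness estimate (Rec. 601 weights).
float luma(const Rgb& c) {
    return 0.299f * c.r + 0.587f * c.g + 0.114f * c.b;
}

// Blend a pixel toward its neighbourhood average, but only where local
// luma contrast suggests an edge; flat areas are left untouched.
Rgb filterPixel(const Rgb& c, const Rgb& l, const Rgb& r,
                const Rgb& u, const Rgb& d, float threshold = 0.1f) {
    const float lumas[5] = { luma(c), luma(l), luma(r), luma(u), luma(d) };
    const float lMin = *std::min_element(lumas, lumas + 5);
    const float lMax = *std::max_element(lumas, lumas + 5);
    if (lMax - lMin < threshold) return c;  // flat area: no edge here
    auto avg = [](float a, float b, float e, float f, float g) {
        return (a + b + e + f + g) / 5.0f;
    };
    return { avg(c.r, l.r, r.r, u.r, d.r),
             avg(c.g, l.g, r.g, u.g, d.g),
             avg(c.b, l.b, r.b, u.b, d.b) };
}

int main() {
    Rgb white{1, 1, 1}, black{0, 0, 0};
    Rgb out = filterPixel(black, white, white, white, white);  // hard edge
    std::printf("smoothed r = %.2f\n", out.r);  // 0.80: softened toward white
}
```

One cheap pass over already-rendered pixels is why FXAA costs so much less than 4xMSAA, which has to rasterise extra samples per pixel.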
Blacktric  +   726d ago
Get a GTX 660 Ti. You'll get performance very close to a GTX 580 while using less power than a GTX 560 Ti. Also, as far as I know, most of the new ones come with a Borderlands 2 download coupon.
taquito  +   727d ago
Far Cry 3 on PC:

http://www.patch-your-games...

Far Cry 3 on console:

http://www.techpowerup.com/...
modesign  +   727d ago
nice trolling toolbag
taquito  +   727d ago
Watch your mouth, peasant!

I was being generous; Far Cry 3 on console won't even look as good as Far Cry 1 on PC maxed out with a few mods. I'm TOTALLY BEING SERIOUS.

Not trolling, console games just look terrible.
#4.1.1 (Edited 727d ago ) | Agree(7) | Disagree(14) | Report
decrypt  +   727d ago
Modesign

Get with the times: console gamers don't care about graphics. No surprise they don't mind gaming on 6-year-old hardware.
kamakaz3md  +   727d ago
Very nice. If people actually knew what a good game was, they would pick this up and stop getting so overhyped about crap like AC3 and COD.
chukamachine  +   727d ago
Taquito.

I'm a PC gamer most of the time, but I also enjoy the PS3.

Saying that Killzone 2/3, Uncharted 1/2/3, and many others look terrible is comical at best.

I game at 1080p with 4xAA at 60fps in most games, but some of those PS3 games look sweet.

The only problem I have with some console games is dips below 30fps. A solid 30fps or a solid 60fps is nice.

I've played Crysis 1 and 2 on both PC and PS3.

And they look excellent on both machines.
nepdyse  +   727d ago
No, no they don't. I own a PS3 too, and Killzone/Uncharted do look like shit. Great art style? Yes. Great games? Yes. But running at 1280x720 doesn't do them justice. So please try not to lie next time.
Yukicore  +   727d ago
I have an Intel i5 3570K, but I am still shocked by this news and worried that it might hurt performance. Should I be? I know I might sound silly; it's the best processor you can get from the i5 series, as I wasn't interested in spending a fortune on virtual cores when no game can fully utilize even a 4-core processor's power.

I have not experienced the best results from this processor. It's cooled with an enormous cooler, and my whole system is very cool and well ventilated, but I don't know what the problem is. My guess is the hard drive... I should really get one of those SSDs.

Can anyone help me, please?
Lulz_Boat  +   727d ago
This is good news for PS3 owners.
ninjahunter  +   727d ago
And that's what happened with Skyrim: you could be sporting a $1000 PC and the game would still run like crap maxed out. SHADOWS ARE NOT SUPPOSED TO BE RENDERED BY THE CPU.
PiccoloGR  +   727d ago
Precisely. This also happens with both F1 2011 and F1 2012: shadows are rendered by the CPU, not the GPU.
#9.1 (Edited 727d ago ) | Agree(1) | Disagree(0) | Report | Reply
Ziggyvertang  +   726d ago
It's going to be a disappointing game anyway. The multiplayer side of it is just plain nasty.
