Crysis 3 maxing out current-gen consoles

Crytek believes it has pushed the Xbox 360 and PlayStation 3 graphically as far as they can go.

Arai1862d ago

Here we go again...
Crytek has been playing too much of their own games; every time it's max this, maximum that.

Crytek: Maximum troll suit engaged!

Thantalas1862d ago

Crysis 2 on a top-end PC does look fantastic. But I played a bit of multiplayer on PS3, and it obviously doesn't have the same level of visual fidelity; it came across as a bit generic.

Old McGroin1862d ago

From what I've been reading it's been maxing out top end PCs, let alone current gen consoles!

Kanzes1862d ago

Crytek: bullsh*t engaged

SCW19821862d ago

Agreed, all they talk about is graphics. It's like they're a broken record. They literally act like a 12-year-old talking about (Ta Gaphix). There is so much more to a game than just graphics, people, and when it comes to those key areas Crytek falls very short. I would call their art style generic and uninspired at best. And let's not even get into how stupid and convoluted this tale of Prophet is.

gameseveryday1862d ago

They always say that for each game they release. :\

coolmast3r1862d ago

And this fact doesn't make all the things they say a lie. =\

Coach_McGuirk1862d ago

doesn't Naughty Dog say the same?

SCW19821862d ago

Nope, they just talk about pushing the PS3, but they never say they've maxed it out.

givemeshelter1862d ago (Edited 1862d ago )

You are one brave soul here on N4G.COM to say that. Even though what you stated is 100% true, based on the very words of ND staff... you will lose bubbles galore and get disagrees up the ying-yang for bringing this up.

"In a PAX 09 interview with GTTV, Amy Hennig, Creative Director at Naughty Dog, said that the first Uncharted game used about 30% of the PS3's capabilities, while this time around with Uncharted 2 they are utilizing 100% of the PS3's power."

"I think Uncharted 1 used maybe 30 percent efficiency. Uncharted 2 we were finally using 100 percent, but it wasn't as efficient as it could be. Then, Uncharted 3 we got way more efficient," Minkoff explains. "With The Last of Us, we are as efficient as we can possibly be. It's just squeezing every last drop of power out of the system. And it's a system we know really, really well. We know its constraints, so we can push it to the edges and play it really fast and loose because we know what the system can handle."

Guess what, PEOPLE? ALL developers say they MAXED out a console system... even the darlings of game creation on N4G.COM, Naughty Dog. It's called P.R., folks. Everyone does it. :-P

Lior1862d ago

The game looks like pixel crap on PS3. I have a PC and a PS3 and played both versions; it's like night and day.

WeskerChildReborned1862d ago

I know what you mean. I've played the beta on PS3 and it doesn't look that great, but that's just the beta. I might still get Crysis 3 for PC, because it looks great on PC.

Lior1862d ago

Beta graphics don't improve

deSSy27241862d ago

Maybe they will improve it... the tessellation, maybe.

ZainabSaccal1862d ago

Obviously they have lots of homework to do

thetamer1862d ago

PS3 aliasing is crap. It doesn't look real. I don't like playing games on the PS3 for that reason.

SlyFamous021862d ago

Maybe you should get a better TV?

Tyre1862d ago

PS3 aliasing crap? First, it isn't even enabled in the MP beta. Second, have you played God of War III? It has the best AA of any console game. Third, Crytek will be using SMAA (Enhanced Subpixel Morphological Antialiasing) T2x in the single-player campaign of Crysis 3 on consoles (built specially into CryEngine 3 for consoles: less resource-demanding, but with AA comparable to the PC version), which is unique and wasn't used in the Crysis 2 console version. The only thing you guys are displaying here is ignorance and misguided hate towards Crytek. Crysis 3 SP will look spectacular, but I wish Cevat didn't run his mouth before the release, just so you guys have fuel for your hate campaign, bunch of losers.

plaZeHD1862d ago

You have no idea. If developers take advantage of the Cell processor, rather than relying so heavily on the Nvidia GPU, they can more than double their FLOPS, resulting in better anti-aliasing. Which means the combination of the Cell processor and the GPU is actually "superior" to 2010 high-end PC graphics. This technique was used in The Saboteur.

givemeshelter1862d ago (Edited 1862d ago )

What are you smoking? Better than high-end 2010 PC graphics?
YOU'RE KIDDING, RIGHT? Ahhhhh... I knew you were joking! You had us there for a moment with that post of insanity...

The FLOPS, since you seem to want to make that important, for a single 2010 GPU ALONE are ALMOST 5 times those of the ENTIRE PS3, INCLUDING THE CELL CPU and the RSX, which is nothing more than a 7800GTX.


"The ATI Radeon HD 5970 graphics card delivers nearly 5 TeraFLOPS of compute power. Grab two of them together in CrossFireX and you're looking at nearly 10 TeraFLOPS."

That's not even taking into account the CPU and the memory on board.
Do yourself a favor and don't compare consoles, whether it's the PS3 or Xbox 360, to a high-end PC, even going back to 2008... It won't end well for you.
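The back-of-the-envelope arithmetic here can be sketched in a few lines. These are commonly cited theoretical peak single-precision figures, not measured throughput, and the RSX number in particular is disputed (anywhere from ~176 to ~400 GFLOPS gets quoted), so treat the ratio as rough:

```python
# Back-of-the-envelope peak single-precision FLOPS comparison.
# All figures are commonly cited theoretical peaks, not measured
# throughput; the RSX number in particular is disputed.
peak_tflops = {
    "Cell (PPE + 6 usable SPEs)": 0.2304,  # ~230 GFLOPS
    "RSX (roughly a 7800 GTX)": 0.192,     # programmable shader ALUs only
    "Radeon HD 5870 (2009)": 2.72,
    "Radeon HD 5970 (2009)": 4.64,         # the "nearly 5 TeraFLOPS" card
}

ps3_total = (peak_tflops["Cell (PPE + 6 usable SPEs)"]
             + peak_tflops["RSX (roughly a 7800 GTX)"])
print(f"PS3 combined peak: ~{ps3_total:.2f} TFLOPS")  # ~0.42 TFLOPS
ratio = peak_tflops["Radeon HD 5970 (2009)"] / ps3_total
print(f"HD 5970 vs PS3: ~{ratio:.0f}x on paper")      # ~11x
```

Even taking the PS3's combined peak at face value, a single 2009-2010 high-end card sits roughly an order of magnitude above it on paper; real-world gaps are smaller because theoretical peaks are rarely sustained.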

Muerte24941862d ago (Edited 1862d ago )

But I will agree with you on the Cell+RSX. AMD's new APU (CPU+GPU) is exactly like the Cell from a conceptual standpoint: having your CPU perform GPU tasks, freeing up some workload for the GPU. I don't know why he's laughing when DICE gave an entire workshop on this while developing Battlefield 3 (the PS3 version looks closer to the PC's than the Xbox 360's). That's why it isn't a shock to see Sony going with AMD's APU+GPU: very similar to the PS3 design, but simpler to code for. Developers won't have to rely solely on the GPU to achieve better visuals. Also, the PC's OS bottlenecks the graphics card's capabilities, so even with all that horsepower devs don't have low-level GPU access, because Windows doesn't permit it.
Another reason why many developers move over to consoles from PC.

plaZeHD1862d ago

I meant 2009 [must have been a mistake]. I know it sounds deceiving, but it's actually true. I have a link to back up my comment.
It's complicated; you have to read it carefully.

givemeshelter1862d ago (Edited 1862d ago )

The method used is MLAA, which is a fantastic workaround for FSAA or MSAA on consoles that lack the bandwidth and power to run anti-aliasing at its various levels. However, it's not the best method for every game, as it blurs the image and reduces image quality. It works for some... not all.
FXAA is a similar solution.
These solutions are great because the performance hit is almost nil, saving resources; however, they're not always the best choice for image quality compared to traditional anti-aliasing, as I mentioned above.
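The shared idea behind MLAA and FXAA, finding high-contrast edges on the luma channel and then blending across them, can be sketched as a toy filter. This is purely illustrative (not Crytek's, AMD's, or Nvidia's actual implementation); real MLAA/FXAA reconstruct edge shapes and blend along them with sub-pixel weights:

```python
import numpy as np

def luminance(img):
    # Rec. 601 luma weights, the kind of perceptual weighting
    # FXAA-style filters use to find visible edges cheaply
    return img @ np.array([0.299, 0.587, 0.114])

def postprocess_aa(img, threshold=0.1):
    """Toy post-process AA: detect high-contrast luma edges, blend across them."""
    luma = luminance(img)
    out = img.copy()
    h, w = luma.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Local contrast: luma range over the pixel and its 4-neighbours
            vals = [luma[y, x], luma[y-1, x], luma[y+1, x],
                    luma[y, x-1], luma[y, x+1]]
            if max(vals) - min(vals) > threshold:
                # Soften the edge: average the pixel with its neighbours
                neigh = img[[y-1, y+1, y, y], [x, x, x-1, x+1]].mean(axis=0)
                out[y, x] = 0.5 * img[y, x] + 0.5 * neigh
    return out

# A hard black/white vertical edge: boundary pixels get blended,
# flat regions stay untouched
frame = np.zeros((5, 5, 3))
frame[:, 3:] = 1.0
smoothed = postprocess_aa(frame)
print(smoothed[2, 1, 0], smoothed[2, 2, 0], smoothed[2, 3, 0])  # 0.0 0.125 0.875
```

Because the blend fires anywhere contrast is high, it also softens legitimate texture detail, which is exactly the blurring trade-off described above.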

I was not laughing, just stunned by the comment. However, plaZeHD stated:

"Which means the combination of the Cell Processor Unit and the Graphics Processor Unit is actually "superior" than a 2010 high-end PC graphics."

Which is not true. He states that the combination of the RSX and the Cell CPU doubles the system's FLOPS, aiding superior anti-aliasing methods.
For example, as I noted above, the 5970 AMD/ATI GPU alone has almost 5 teraFLOPS of compute. That card's bandwidth alone exceeds the PS3's multiple times over.
Even a high-end GPU from ATI/AMD from 2009 has more total FLOPS than the entire PS3.

That card, which came out in 2009, has 2.72 teraFLOPS of processing power.
As for Battlefield 3, compare MSAA images from a high-end GPU running the game to those consoles running FXAA or MLAA.
The image quality from multi-sampling anti-aliasing is superior. Moreover, you can use a combination of BOTH techniques on the PC, unlike on these consoles.
MLAA and FXAA are amazing additions that aid video game image quality. However, comparing a high-end GPU to these consoles, even with the console running MLAA, is pointless at best, because only a handful of games look better running it than a PC running MSAA (and that's usually against a low-end GPU; a high-end GPU running MSAA sampled 4x or above almost always looks better than MLAA in the same game).

I think that, going forward, the best way to apply anti-aliasing in games will be a combination of hardware coverage-sampling AA and post-process AA such as MLAA or FXAA.
However, as I stated, high-end cards today can run MSAA sampled 8 or 16 times, and that will ALWAYS look better than MLAA or FXAA.
