Crytek believes it has pushed the Xbox 360 and PlayStation 3 graphically as far as they can go.
Here we go again... Crytek has been playing too much of their own games; every time it's max this, maximum that. Crytek: Maximum troll suit engaged!
Crysis 2 on a top end PC does look fantastic. But I played a bit of multiplayer on PS3 and it obviously doesn't have the same level of visual fidelity and it came across as being a bit generic.
From what I've been reading it's been maxing out top end PCs, let alone current gen consoles!
Crytek: bullsh*t engaged
Agreed, all they talk about is graphics. It's like they're a broken record. They literally act like a 12-year-old talking about (Ta Gaphix). There is so much more to a game than just graphics, people, and when it comes to those other areas, Crytek falls very short. I would call their art style generic and uninspired at best. And let's not even get into how stupid and convoluted this tale of Prophet is.
They always say that for each game they release. :\
And this fact doesn't make all the things they say a lie. =\
Doesn't Naughty Dog say the same?
Nope, they just talk about pushing the PS3, but they never say they've maxed it out.
@taiyed80 You are one brave soul here on N4G.COM to say that. Even though what you stated is 100% true based on the very words of ND staff, you will lose bubbles galore and get disagrees up the ying-yang for bringing this up.

http://www.joystiq.com/2009... "In a PAX 09 interview with GTTV, Amy Hennig, Creative Director at Naughty Dog, said that the first Uncharted game used about 30% of the PS3's capabilities, while this time around with Uncharted 2 they are utilizing 100% of the PS3's power."

"I think Uncharted 1 used maybe 30 percent efficiency. Uncharted 2 we were finally using 100 percent, but it wasn't as efficient as it could be. Then, Uncharted 3 we got way more efficient," Minkoff explains. "With The Last of Us, we are as efficient as we can possibly be. It's just squeezing every last drop of power out of the system. And it's a system we know really, really well. We know its constraints, so we can push it to the edges and play it really fast and loose because we know what the system can handle."

Guess what, PEOPLE? ALL developers say they MAXED out a console system... even the darlings of game creation on N4G.COM, Naughty Dog. It's called P.R., folks. Everyone does it. :-P
The game looks like pixelated crap on PS3. I have a PC and a PS3 and have played both versions; it's like night and day.
I know what you mean. I've played the beta on PS3 and it doesn't look that great, but that's just the beta. I might still get Crysis 3 for PC, though, because it looks great there.
Beta graphics don't improve
Maybe they will improve it... with tessellation, maybe.
Obviously they have lots of homework to do
PS3 aliasing is crap. It doesn't look real. I don't like playing games on the PS3 for that reason
Maybe you should get a better TV?
PS3 aliasing crap? First, AA isn't even used in the MP beta. Second, have you played God of War III? It has the best AA of any console game. Third, Crytek will be using SMAA (Enhanced Subpixel Morphological Antialiasing, T2) http://www.iryoku.com/smaa/ in the single-player campaign of Crysis 3 on consoles (a variant used in CryEngine 3 specifically for consoles, less resource-demanding but with AA comparable to the PC version), which is new and was not used in the Crysis 2 console version. The only thing you guys are displaying here is ignorance and misguided hate towards Crytek. Crysis 3 SP will look spectacular, though I wish Cevat hadn't run his mouth before release, just so you guys wouldn't have fuel for your hate campaign, bunch of losers.
You have no idea. If developers take advantage of the Cell processor rather than relying more heavily on the Nvidia graphics processing unit, they can more than double their FLOPS, resulting in better anti-aliasing. Which means the combination of the Cell processor and the GPU is actually "superior" to 2010 high-end PC graphics. This technique was used in a game called The Saboteur. Here is an image to give you an example. http://images.eurogamer.net...
What are you smoking? Better than high-end 2010 PC graphics? YOU'RE KIDDING, RIGHT? Ahhhhh... I knew you were joking! You had us there for a moment with that post of insanity... FYI: the FLOPS, since you seem to want to make that important, of a single 2010 GPU ALONE are many times those of the ENTIRE PS3, INCLUDING the Cell CPU and the RSX, which is little more than a 7800GTX. http://www.hardwareheaven.c... http://www.tomshardware.com... "The ATI Radeon HD 5970 graphics card delivers nearly 5 TeraFLOPS of compute power. Grab two of them together in CrossFireX and you're looking at nearly 10 TeraFLOPS." That's not even taking into account the CPU with its own memory on board. Do yourself a favor and don't compare consoles, whether it's the PS3 or the Xbox 360, to a high-end PC, even going back to 2008... It won't end well for you.
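For the number-crunchers, the comparison is easy to sanity-check. The sketch below uses commonly quoted theoretical peak figures, which are marketing-style numbers, not measured throughput; the RSX estimate in particular varies by source, and Sony's old "2 TFLOPS" figure for the PS3 counted operations differently. Treat every constant here as an assumption.

```python
# Back-of-the-envelope theoretical-peak FLOPS comparison.
# All figures are commonly quoted single-precision peaks (GFLOPS), not
# measured performance; they are assumptions for illustration only.
cell_gflops = 230.4    # PS3 Cell CPU, all SPEs (often-quoted peak)
rsx_gflops = 192.0     # PS3 RSX GPU, programmable-shader estimate
hd5970_gflops = 4640.0 # Radeon HD 5970 (2010 dual-GPU card), ~4.64 TFLOPS

ps3_total = cell_gflops + rsx_gflops
ratio = hd5970_gflops / ps3_total

print(f"PS3 combined peak: {ps3_total:.1f} GFLOPS")
print(f"HD 5970 peak:      {hd5970_gflops:.1f} GFLOPS")
print(f"Ratio:             {ratio:.1f}x")
```

With these particular figures the dual-GPU card comes out roughly an order of magnitude ahead of the whole console; the exact multiplier depends entirely on which peak numbers you trust, which is why FLOPS comparisons are a shaky argument to begin with.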
But I will agree with you on the Cell+RSX. AMD is using that same method with their new APU (CPU+GPU), which is just like the Cell from a conceptual standpoint: having your CPU perform GPU tasks, freeing up some workload for the GPU. I don't know why he's laughing when DICE gave an entire workshop on this while developing Battlefield 3 (the PS3 version looks closer to the PC one than the Xbox 360 version does). That's why it isn't a shock to see Sony going with AMD's APU+GPU: very similar to the PS3's design, but with simpler code for devs. Developers won't have to rely solely on the GPU to achieve better visuals. Also, the PC's OS bottlenecks the graphics card's capabilities, so even with all that horsepower, devs don't have low-level GPU access because Windows doesn't permit it. http://www.examiner.com/art... Another reason why many developers have moved over to consoles from PC.
@givemeshelter I meant 2009 [must have been a mistake]. I know it sounds deceiving, but it's actually true. I have a link to back up my comment: http://www.eurogamer.net/ar... It's complicated; you have to read it carefully.
The method used is MLAA, which is a fantastic workaround for FSAA or MSAA on consoles that lack the bandwidth and power to run anti-aliasing at its various sample counts. However, it's not always the best method, as it blurs the image and reduces quality; it works for some games, not all. FXAA is a similar solution. These techniques are great because the performance hit is almost nil, saving resources, but they're not always the best choice for image quality compared to traditional anti-aliasing, as I mentioned above.

@Muerte2494 I was not laughing, just stunned by the comment. However, plazHD stated: "Which means the combination of the Cell Processor Unit and the Graphics Processor Unit is actually "superior" than a 2010 high-end PC graphics." That is not true. He claims that combining the RSX and the Cell CPU doubles the system's FLOPS, aiding in superior anti-aliasing; but as I noted above, the AMD/ATI 5970 GPU alone has almost 5 teraFLOPS of compute power, and that card's bandwidth alone exceeds the PS3's many times over. Even a high-end ATI/AMD GPU from 2009 has more total FLOPS than the entire PS3. http://www.amd.com/uk/produ... That card, released in 2009, has 2.72 teraFLOPS of processing power.

As for Battlefield 3: compare MSAA screenshots from a high-end GPU running the game against the consoles running FXAA or MLAA. The image quality from multi-sample anti-aliasing is superior, and on PC you can even combine both techniques, unlike on these consoles. MLAA and FXAA are amazing additions for image quality, but comparing a high-end GPU to these consoles, even with the console running MLAA, is pointless at best, because only a handful of games look better running it than a PC running MSAA (and that's normally against a low-end GPU; a high-end GPU running MSAA at 4x or above almost always looks better than MLAA in the same game).

I think that, going forward, the best way to apply anti-aliasing in games will be a combination of coverage-sampling AA and post-process AA (the latter being MLAA and FXAA). However, as I said, high-end cards today can run MSAA at 8x or 16x, and that will ALWAYS look better than MLAA or FXAA.
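To make the MLAA/FXAA trade-off concrete, here is a toy sketch of post-process anti-aliasing: find high-contrast pixels in the finished frame and blend them with their neighbors. Real MLAA reconstructs edge shapes and FXAA works on luminance in a shader, so this is only the general idea, but it shows both why post-process AA is cheap (it's a filter over the final image, independent of scene complexity) and why it can soften detail that MSAA would preserve. Everything here (the grayscale 2D-list image, the threshold, the naive 4-neighbor blend) is an illustrative assumption, not any shipping algorithm.

```python
# Toy post-process anti-aliasing: detect strong luminance edges in the
# finished frame and blend those pixels with their neighbors. Flat regions
# are left untouched; edge pixels get averaged, which smooths jaggies but
# also explains the "blur" criticism of MLAA/FXAA-style filters.

def postprocess_aa(img, threshold=0.4):
    """Blend pixels sitting on a strong luminance edge; leave the rest alone."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]  # copy so reads always use the original frame
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbors = [img[y-1][x], img[y+1][x], img[y][x-1], img[y][x+1]]
            contrast = max(abs(img[y][x] - n) for n in neighbors)
            if contrast > threshold:  # looks like a jagged edge
                out[y][x] = (img[y][x] + sum(neighbors)) / 5.0
    return out

# A hard vertical edge: 0.0 (black) on the left, 1.0 (white) on the right.
frame = [[0.0] * 4 + [1.0] * 4 for _ in range(5)]
smoothed = postprocess_aa(frame)
print(smoothed[2][3], smoothed[2][4], smoothed[2][0])  # 0.2 0.8 0.0
```

The edge pixels (0.0 next to 1.0) get pulled toward each other, while interior pixels are untouched; a supersampling approach like MSAA would instead resolve the edge from multiple geometry samples per pixel, at far higher cost but without filtering already-correct texture detail.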
Hogwash! If the PS3 were still relevant two years from now, with no next-gen machines on the horizon, you would see that what Crytek is doing now can be bettered. You can always squeeze more out of hardware as developers find ways to streamline and code more effectively. It's more PR spin from a company you would expect it from ;/
Crysis 3 isn't even close to being the best-looking game on PS3.
Pfffft, for some reason I doubt it, Crytek. If first-party devs can squeeze more out of them, then I'm afraid you're just not optimizing your engine right. It's not the console's fault that third-party devs are incompetent.
Crytek has never been able to optimize an engine. How long did it take for PC hardware to be able to run Far Cry and Crysis? Much better-looking games kept being released for a year or so after each of those, and PCs still couldn't max them out.
I sort of agree... Optimization for the first Crysis sucked (relative to the hardware available at release), but I have to disagree that any game released even relatively soon after it could compete visually with Crysis (without mods) as a whole. It was the total package for years in terms of textures, foliage, physics, draw distance, etc., and in some ways it still beats most games on at least one of those fronts when compared side by side. The problem was that getting all of that into one game required a beast of a machine at max settings. You could find a few games that did one thing better now and again, but not all of it at once. That is what made Crysis such a huge benchmark: if you could run it at max, you could run anything for several years (Metro 2033 has since taken its place; that game can still make many PCs cry with heavy AA, high resolutions, tessellation, etc., especially in the city hubs). With newer hardware it isn't nearly as bad as it was at launch, either.
I honestly believe CRYTEK has maxed out the PS3 and 360, but that doesn't mean that their graphics are the best. What they're trying to say is... they've done the best they can and they just can't optimize any further to get better visuals. Obviously Naughty Dog are better developers.
Naughty Dog are smarter developers. They took what they had to work with, stylized their graphics, and never attempted anything that wouldn't work well.
I hope this game does not sell on consoles... they should have just kept it a PC exclusive. I said the same thing last week about them ("Crysis 2 has pushed the PS3/Xbox to their limits").
and it just looks like a big messy blur...
If this is as far as CryEngine 3 will go on consoles, then better, more focused console games will end up looking better.