Assassin’s Creed Unity leaker provides clarity on its graphical fidelity

Recently, an anonymous individual leaked gameplay from the PlayStation 4 version of Assassin’s Creed Unity.

The leaker has noticed the gaming community's criticisms regarding the graphical fidelity of Assassin's Creed Unity, and has subsequently provided some clarification on the game's graphics.

Eonjay1269d ago (Edited 1269d ago )

Regardless, you can be sure that the visuals will not look as good as the reveal at E3.

shloobmm31269d ago

Considering the game was running on an Xbox One at E3, I imagine it will look exactly the same.

radler1269d ago

It was probably running on a PC with an Xbox controller connected

Reibooi1268d ago

To be honest, what I've noticed with Ubisoft is that the majority of their games are shown off with incredible visuals when first revealed, then take a little hit in the gameplay demos we see later on, and then take another hit when the game comes out.

Watch Dogs did this, now it kinda looks like Unity is too, and I think Far Cry 4 may be the same, as the first unveil looked better than the gameplay stuff we have seen since then. Maybe it's just my eyes, but that is how it seems to me.

jatakk1268d ago

So much hate for AC:Unity, complaining about bull shots and "parity".

Truth is that the game looks identical to the E3 demo.

If you are looking for the best graphics, avoid the consoles and play it on a PC.

Stop whining, the choice is yours ;)

AngryTypingGuy1268d ago

This game will look stunning no matter which system you play it on, relax! Leave the obsession over the slightest imperfections to the critics!

user55757081268d ago

I'll skip this game, it's not 1080p.

starchild1269d ago

Everything I've seen from the game so far has looked as good as, or better than, that first trailer. No other Assassin's Creed game has ever been downgraded either. But you guys have made up your minds to hate this game, so proceed.

markyboy21811269d ago

@rahman nah, it was on Xbox One at the Microsoft press conference

DragonKnight1268d ago

"But you guys have made up your mind to hate this game, so proceed."

And you've made up your mind to bend to Ubisoft, so proceed too I guess.

joab7771269d ago (Edited 1269d ago )

1st, let's wait til release. 2nd, if they did lose some visual fidelity to give us a vibrant almost fully populated Paris, please do not criticize.

I do not want every dev to have graphics as the main motivation for everything they make this gen. I want to see games push forward, and I will definitely give them their props if they have... even if there are some blurry textures.

And the reason we see E3 demos like we do is simple. We have created this. We have made E3 demos and graphics so important. And often at the time, what we see is what they are aiming for.

It may be misleading to the general public. But anyone who reads N4G should know about E3 demos and their intentions. I laugh when I see articles here explaining the obvious to a community that should already know.

frezhblunts1269d ago

Ehhh, the graphics are still better than if it were on last gen. I'm not worried about the graphics, but I am still trying to avoid the game because of Ubisoft.

theatticusera1268d ago

Trying to avoid it? Just don't buy it. Job done.

Aloren1268d ago

So if it ends up being the game of the year, you'd still avoid it "because of ubisoft" ? Your loss I guess...

frezhblunts1268d ago

Yeah right. Best game I played so far this year was Bayonetta 2. Hopefully Far Cry 4 gets game of the year, if anything.

Aloren1268d ago

But that's still Ubi though... will you try to avoid it too ?

SnotyTheRocket1269d ago (Edited 1269d ago )

Seriously? It looks literally the same as the E3 demo. Stop it.

Edit: Also, I'll leave this here.

neoandrew1269d ago

Of course it will, but on a pc, not on consoles.

kingduqc1268d ago

*unless you play on pc.

XtraTrstrL1269d ago

Some of the issues with the screenshots have nothing to do with slight deterioration of quality from the PS4 snapshot function. There's clearly much less detail in general to the texture quality and such. Ubisoft is screwing up this gen; if they're putting it all on the huge crowds causing the strain, just dial back the crowds a bit. I'd rather have slightly less dumb AI NPCs running around me and better visuals. Plus, if the PS4 version ends up looking exactly the same as the X1 version and being locked at 30fps on both, you know they are lying, because there'd be tons of untapped power on the PS4 version then.

Ravenor1269d ago

You're exaggerating the power difference between the two platforms. The PS4 is undoubtedly more powerful, but it's not in another league; they are largely similar.

XtraTrstrL1269d ago

A 50% stronger GPU is pretty huge, dude. Throw in the 8GB GDDR5 vs 8GB DDR3, and that's a whole other area of superiority, one that will eventually start to show more as the SDKs improve and put that GDDR5 to better use for games/apps/OS.

sinspirit1269d ago

It's 50% more raw graphical performance, not to mention a lighter-weight OS. A 50% difference is big enough for PC video cards. It's going to be a lot different when developers properly make use of it in a gaming machine where they can design for its specific hardware.

AndrewLB1269d ago

The PS4's GPU is 40% more powerful than Xbone, yet somehow you guys think that's all you need to DOUBLE the number of pixels on-screen by raising the resolution from 900p to 1080p. lol.

XtraTrstrL- Hmm... 1.23 TFLOPS for Xbone and 1.84 TFLOPS for PS4. And FYI, the pathetic GPUs in these consoles are not starved for bandwidth, because they don't have enough throughput to fill the pipeline. I used to have a GTX 680 that was overclocked to 1.3GHz and did a whopping 4.0 TFLOPS, and STILL it didn't utilize all of a memory bandwidth that's only slightly higher than the PS4's.

Sinspirit- You can't even put these consoles in the same category performance-wise. I recently spent what it would have cost me to buy a PS4 on a GTX 780 Ti, which lays down a massive 6.8 TFLOPS when overclocked. I think that's almost 4x as much power as these consoles.
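The figures being traded in this exchange are easy to sanity-check. A quick sketch, using only the TFLOPS numbers the commenters themselves quote and the standard 900p/1080p resolutions (nothing here comes from Ubisoft):

```python
# Sanity check on the numbers quoted in this thread.
# TFLOPS figures (1.23 / 1.84) are the commenters' own; resolutions
# are the standard 1600x900 ("900p") and 1920x1080 ("1080p").

p_900 = 1600 * 900     # 1,440,000 pixels
p_1080 = 1920 * 1080   # 2,073,600 pixels

extra_pixels = p_1080 - p_900   # 633,600 more pixels at 1080p
pixel_ratio = p_1080 / p_900    # 1.44x, i.e. 44% more -- not double

tflops_ratio = 1.84 / 1.23      # ~1.50x, the "50% stronger GPU" claim

print(extra_pixels, round(pixel_ratio, 2), round(tflops_ratio, 2))
```

So going from 900p to 1080p means pushing 44% more pixels per frame, not double, while the quoted TFLOPS gap works out to roughly 50%.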

sinspirit1268d ago


Where's your common sense? You're going to sit here and talk to me as if I don't know that a graphics card costing as much as an entire console is stronger? I already shut down previous ridiculous statements you've made. Don't try to pretend you have some sort of argument.

Actually, console hardware, due to optimization, is equivalent to PC hardware of twice the paper performance. This is according to John Carmack. I don't think I need to tell you this AGAIN.

MarkusMcNugen1268d ago


"For the same given paper spec, a console will deliver twice the perf of a PC, and a PC will deliver twice the perf of a mobile part."

That is what John Carmack said. Not that he is the definitive voice for all video game development. It was an observation, and not one he even backed up with any kind of evidence. I don't know about you, but I question everything I'm blatantly told is a fact without evidence.

Plus, the consoles are partially made up of mobile parts...

Ravenor1269d ago

Saddled with underpowered CPUs, it's not as cut and dried as saying "50% more powerful GPU". It's the same silicon, with certain things disabled. You will not see things done on the PS4 that are just IMPOSSIBLE on the XB1. I'm not saying the PS4 won't likely always provide a superior experience; I'm saying that the gulf between them isn't so massive that it's going to affect assets being generated or features within games.

You guys are a little touchy about your plastic boxes.

generic-user-name1269d ago

"You will not see things done on the PS4 that are just IMPOSSIBLE on the XB1."

Ha, perhaps not until Naughty Dog step up.

XtraTrstrL1269d ago

You definitely 'MAY' see things done on PS4 that just can't be done on X1. The Tomorrow Children using the GPGPU the way Q-Games is using it might not be possible on X1. PS4 supports hUMA, X1 doesn't; that's pretty big.

sinspirit1269d ago (Edited 1269d ago )

Underpowered CPUs? They are 8-core CPUs, and 1.6GHz yields more performance on a console than on a PC. Not to mention a few more cores than a standard gaming PC (which typically never takes advantage of more than 2 cores, rarely 4), plus cores organized so that OS tasks aren't shared with the ones dedicated to gaming (slowing them down).

Actually, no, it's not the "same" silicon. It's the same CPU cores and GPU cores, except the PS4 GPU has 50% more GPU cores, period; the X1 simply does not have them. These being APUs, they all exist on the same die, and if they have different hardware they are not the same silicon, no matter how similar.

Wait... how? It's literally a raw 50% graphical difference, on a lighter-weight OS than the Xbox One's, with developers able to take advantage of specific graphical hardware. The point is that a GPU is far more important for a gaming system than a CPU; whether the CPU is as far ahead of the X1's as the GPU is doesn't matter. The CPU is just not the factor here; the GPU is. The CPU will bottleneck if a developer doesn't have enough time to design more around the GPU, but that's beside the point. And for the record, there is a benchmark that scored the PS4 CPU slightly higher in performance despite the X1 CPU being clocked slightly higher, even though they are the same cores. This is because the PS4 has a lighter-weight OS. The PS4 isn't ahead in just GPU power, and while that is in fact the most important thing, you still keep rambling on.

You don't actually know hardware and optimization benefits. You throw around misinformation and terms like "silicon" to sound "smart", and you end with generic childish insults about people and their hobbies as if you're above them, when you're also here arguing with them. Don't be a hypocrite.
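For what it's worth, the "50% more GPU cores" figure and the 1.23/1.84 TFLOPS figures quoted elsewhere in the thread are two views of the same arithmetic. A rough sketch, assuming the commonly reported launch specs (1152 vs 768 shader units, an 800 MHz GPU clock, 2 FLOPs per shader per cycle via fused multiply-add); these specs are assumptions on my part, not something stated in the thread:

```python
# Theoretical GPU throughput: shaders x clock x FLOPs-per-cycle.
# Shader counts and clock below are assumed launch specs, not figures
# taken from the article.

def theoretical_tflops(shaders: int, clock_ghz: float, flops_per_cycle: int = 2) -> float:
    return shaders * clock_ghz * flops_per_cycle / 1000.0

ps4 = theoretical_tflops(1152, 0.8)  # ~1.84 TFLOPS
xb1 = theoretical_tflops(768, 0.8)   # ~1.23 TFLOPS

# At equal clocks the ratio is exactly 1152/768 = 1.5, which is where
# the "50% more raw graphical performance" figure comes from.
print(round(ps4, 2), round(xb1, 2), round(ps4 / xb1, 2))
```

(The Xbox One GPU was later upclocked to 853 MHz, which is where the roughly 1.31 TFLOPS figure some commenters use comes from.)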

Yetter1269d ago

Yes, they use the same 8-core Jaguar CPU, and yes, the XB1's does clock slightly higher, but on top of that the XB1's is heavily modified: each core performs 6 ops/cycle as opposed to the PS4's 4 ops/cycle. The XB1 CPU memory bus also gets 30GB/s as opposed to the PS4's 20GB/s; according to the MS engineers this was specifically done to avoid bottlenecks between the CPU and GPU. Feel free to PM me and I'll send ya all the links to back this up, but just accept it: the CPU is more powerful in the XB1.

MysticStrummer1269d ago

@Yetter - "just accept it, the CPU is more powerful in the XB1."

"Interestingly while demonstrating texture generation speed, DXT compressed, using Substance Engine, for one CPU, it was found that the PlayStation 4 is able to generate 14MB/s of textures compared to 12MB/s for the Xbox One."

AndrewLB1269d ago

SinSpirit- The 8-core laptop processor in your PS4 is roughly as powerful as an Intel Haswell i3 CPU.

Also, your claim that PCs only use 2 cores is idiotic. Here are just a few games that utilize up to 8 CPU threads on a PC:
Crysis 2 and 3
Bioshock Infinite
AC: Black Flag
Witcher 2
Project Cars
Mass Effect 3
Battlefield 3, 4, etc.
... ok I'm getting bored. Just understand that pretty much all modern games can utilize 8 threads.

50% more cores does not mean you'll get 50% more performance, since it's well known that the graphics computational power required increases at an exponential rate, not a linear one.

MysticStrummer- Everyone with half a brain can see that the benchmark used to make that claim measured a single aspect of its performance, which just happened to be the one test that would benefit the most from high memory bandwidth. Using that one test as the basis for such a grand claim is beyond irresponsible. ALL legitimate benchmark "suites" consist of dozens of tests which hit every aspect of processing ability, not a single cherry-picked texture call test.

sinspirit1268d ago (Edited 1268d ago )


Cores are EXACTLY where the performance comes from. 50% more cores is exactly 50% more performance (in this situation). Quit making things up. That's like saying the output of 4 people running on treadmills at 10 mph wouldn't be doubled if you added another 4 people running at the same speed.

You clearly don't understand what it means to utilize a number of cores.

Anything to neglect a direct benchmark that proves PS4's CPU performance?

Oh look, he knows what Haswell is. Haswell was basically a refresh to reduce power consumption; in direct benchmarks the performance was nearly equivalent.

MarkusMcNugen1268d ago


You clearly do not know what you are talking about. More CPU cores do not directly equate to 1:1 performance gains. It's called the law of diminishing returns, and it applies to increasing core counts just as much as it does to ROPs on a graphics card.

If adding more cores were so perfectly efficient, we would have Intel CPUs with 8-10 cores already. Adding cores is only beneficial for highly parallel tasks. Unfortunately it always hits a wall, because x86 has an inherent problem with OOoE.

I'm not a fanboy, so I'll admit that the PS4 CPU may be better. But that is just one benchmark from one engine, and hardly enough data points for a true comparison. When is the last time you saw a hardware comparison that only used one benchmark? Never.
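The diminishing-returns point for CPU cores is usually formalized as Amdahl's law: if only a fraction p of a workload can run in parallel, speedup is capped by the serial remainder no matter how many cores you add. A minimal sketch (the 90% figure is an illustrative assumption, not a measured number from any console):

```python
# Amdahl's law: speedup from n cores when a fraction p of the work
# is parallelizable. The serial part (1 - p) never gets faster.

def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Even a workload that is 90% parallel tops out well short of linear:
for n in (1, 2, 4, 8):
    print(n, round(amdahl_speedup(0.9, n), 2))
# 8 cores yield ~4.71x, not 8x, and cores 5-8 add far less than 1-4 did.
```

GPU pixel work, by contrast, is close to embarrassingly parallel, which is why adding shader cores tends to scale much better than adding CPU cores does.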

sinspirit1268d ago (Edited 1268d ago )


I'm sorry... Is this a computer argument? And... Did I say CPU cores? No, I said GPU cores, and I wrote "(in this situation)" because I know CPU cores are not quite the same, especially on computers. If it's the same GPU architecture and it has 50% more cores, which is where all the performance comes from, that is basically a raw 50% performance gain in a GPU's case. Not to mention the importance of optimization.

As for your other comment: consoles have more than proven they are twice as powerful as PC equivalents. The PS3 was equivalent to a 6800GT, but The Last of Us, God of War, Killzone, etcetera look far better than anything that card can handle at 720p.

And what do mobile parts have to do with it? It doesn't matter what the heck the part is. The fact is that developers only have to design games around those few parts; it doesn't matter where the parts come from.

Idk, it looks like you have no evidence, assume your own facts, or are just confused. I don't see why you would shun the word of an experienced game developer who is also an icon on PC as well as classic consoles and clearly knows what he is talking about. Where's the evidence? Did you look? Did you even bother before trying to refute common knowledge on this topic? No? Then don't argue.

MarkusMcNugen1268d ago (Edited 1268d ago )

No, it's not. Adding 50% more GPU cores does not yield a 50% increase in raw power; it is affected by the law of diminishing returns. Compare a single GPU with its dual-GPU equivalent and you can see the performance does not scale 1:1. The fact that you think it does shows me you have no idea what you are talking about. You can post whatever you want, but you clearly don't understand the source material.

Once again, no they haven't. Consoles can be coded to the bare metal in machine language, but that does not scale to a 50% improvement. Optimization will only get you so far; being able to double the graphics capabilities just from optimization is a pipe dream.

Sure, optimization does account for a decent performance increase, but not on par with what you or Carmack are stating.

lol. Google the evidence for yourself; I'm not your mother. It's plainly out in the open for you to find. Carmack hasn't been relevant or made a good game in a long time. Like I said, he is not the end-all be-all of video game development.

It's not like he posted an article explaining himself. He literally sent a 140-character-or-less statement that people like you took as proof. I didn't see any proof, just an unverified statement.

Common knowledge. That is hilarious. One developer makes a statement about optimization on consoles and now it's common knowledge.

edgeofsins1267d ago


We are talking about GPU cores on the same die. It is entirely different from two separate GPUs. Cores have everything to do with 1:1 performance. Were you wondering, "Oh, but why don't we just add more cores to everything?" It doesn't work that way: more cores means more heat. Two of the same GPUs do give nearly twice the performance, and if you mean two separate video cards, it does come close, but it is held back by the motherboard, not by the GPUs themselves.

That quote on your profile: you take care of misinformation? Really? With what sources? You put down credible sources with your own personal imagination clouded with technical terms. You somehow come up with all your information from some mythical place, yet you don't have any of it posted, and you refuse to actually look into it while trying to put others down with zero credibility.

Not only do cores entirely determine the raw information that a GPU processes; these ARE different from CPUs, and you keep talking about different situations, GPU cores on different dies or on separate cards.

You ignore the fact that you can't explain the insane gap between the capabilities of hardware comparable to the PS3's and what the PS3 actually renders. Good job.

Of course it's not a perfect performance gain of twice their PC counterparts. It's theoretical, meaning there is still a large performance boost that gets pretty close to it.

He's not the "end-all be-all"? Okay? Does that have anything to do with how well he knows hardware, and with his hardware and software knowledge being levels above yours? He doesn't claim to be a talented developer, nor does he act like some elitist. But he is honest and has dozens of years of hands-on experience, which you cannot refute no matter how many times you cry no.

Like I said, the proof, which you refuse to even talk about, is in past consoles themselves: they perform much better than comparable PC hardware. There is plenty of evidence that shows their performance gain. But idk, I might just change my mind because some nobody, without any proof or links to back a single word up, told me to "Google it myself" when I already have, and have talked up and down about everything that completely shuts down all of your assumptions. I'll wait for something actually worthwhile that directly has to do with this. Don't speak another word about PC hardware, or about different situations like two separate GPUs compared to a single GPU (all cores on one die).

MarkusMcNugen1266d ago (Edited 1266d ago )

I tried to tell you that all you had to do was a quick search, even on N4G, to find the truth. You refused. So here is my proof.

First: an 8800GTX beating a PS3. You may say, "Well of course, that's a 2008 GPU against a 2006 PS3; that's not really fair." Well, that GPU has roughly 50% better performance than the PS3's GPU, so optimization should have limited the difference to nil, right? Nope. BTW, I owned that card until 2011 and it was always better than the PS3...

Second: the Radeon R9 295X2 by AMD has two Hawaii GPUs (290X) on a single card. Take a look at the information and benchmarks below, and tell me again how it equates to a 1:1 performance gain, which is how this whole argument got started in the first place.

I would also like to add that yes, they can just add more cores. All they have to do is lower the GPU clock frequency, which increases heat exponentially the higher it goes, and add a core. Lower-end cards don't give off that much heat and would definitely benefit from it. It's not industry standard because it's not cost-effective: the performance gains don't equal the increased cost of manufacturing.

My credibility has been established before. I don't have to prove myself to you. You are the person who refused to look up information that is blatantly accessible to the public and yourself.

Like I said before, I don't just assume everything I'm told is true, even if it is from an expert in the field, unless it is agreed upon by the majority in that field. Since Carmack isn't a majority, provided no evidence, and no one else besides console fanboys is claiming the same performance increases, I'm not inclined to believe his tweets.

Bring me evidence of the 50% optimization claim and I will change my position. I'm not arguing because I have to be right, I'm arguing because I know I am right. I pick my battles carefully.

xGrunty1269d ago (Edited 1269d ago )

Are you fucking kidding me? You'd rather have stupid AI and 100 more pixels on a piece of grass than better AI? Please leave the gaming community, kthx.

Neo_Zeed1268d ago

It's not 100 pixels, it's actually about 633,000 pixels. Also, the AI is going to be dumb anyway. Ubisoft isn't known for stunning AI programming skills, and now they've doubled down on crowd size.

I don't care if Ubi thinks it's better to cram the screen full of dull NPCs. With Ubi's lack of programming ability, they will be no better than Dead Rising zombies in French uniforms.

sinspirit1268d ago

Seeing how their CPUs are essentially the same, I'd rather have the same AI but with better graphics and animations.

xGrunty1268d ago

Obviously I was being sarcastic with the amount, and you're right, I have no doubt the AI isn't going to be that great anyway. But your words literally stated you'd rather have better graphics than better AI. Your words, not mine.

OutcastMosquito1269d ago

The PS4 version is GOING to look exactly the same as the Xbox One version because they want PARITY between both consoles. It doesn't matter that the PS4 has better hardware. The whole goddamn point of parity is so that ungrateful degenerates don't whine "that version has better resolution" or "framerate", but OBVIOUSLY, no matter how hard Ubisoft tries to make everyone happy, there are sour F$CKS ready to bitch.

SonyWarrior1269d ago

The PS4 will still run smoother regardless of whether they want parity or not.

OutcastMosquito1267d ago

And the downvotes are proof there exists a TON of crybabies on N4G.

edqe1269d ago

For some reason consoles seem to be all about FPS and resolution nowadays, and no longer "gameplay > graphics" as it was earlier.

frezhblunts1269d ago

The Wii U is 8 times weaker and Super Smash Bros is hitting 1080p 60fps, so Assassin's Creed should be like 2160p and 60fps ;)

methegreatone1269d ago (Edited 1269d ago )

I don't understand why people are calling downgrade on this. The screenshots they showed were obviously from the PC version (am I wrong though? If so, please correct me, and ignore everything else I wrote :D).
How on earth do you expect the PS4 version to have the same quality?

Yes, even the PC version will probably be downgraded. However, on consoles this downgrade business is to be expected; they aren't that powerful. Why people are comparing the PC and PS4 versions is beyond me.

This is a multiplat, not a PS exclusive. When Naughty Dog releases the next Uncharted, then we can all gawk at the generation-defining graphics they show. Until then, don't expect the PS4 version to look anywhere near as good as the PC version.

As for the PS4 and Xbox parity issue: there's a chance it won't be that big a deal. They may have avoided the resolution and fps change, but as a result you might get improved textures, lighting, rendering and a more stable 30fps on the PS4.

Dudebro901269d ago

People are still gonna crap all over the game. No matter what happens, the trolls will do anything they can to tarnish its image even further.

Pathetic really.

PaleMoonDeath1269d ago (Edited 1269d ago )

Guess you like false advertising and lying to customers. NOT fans, mate; people who spend money on these things.

Edit: You're part of the problem.

Yetter1269d ago

Like showing off suspend/resume and driveclub at the Feb 2013 PS4 reveal?

PaleMoonDeath1269d ago

The game we're going to be buying is a bit different from the one we were promised, like... Watch Dogs.

Expect this to happen quite a bit during this generation with Ubisoft.

sevilha821269d ago

Agreed, it makes me shaky about The Division; that's one of my most anticipated games for next year.

Props on the Ernesto pic, by the way =)

Dynasty20211268d ago

Uncharted 4, The Division, The Witcher 3 etc. will all be downgraded ON LAUNCH.

E3 demos and reveals should never be trusted, ever. No game in the history of gaming has looked as good on release as it did at reveal.

Only console owners seem to refuse to believe this, especially over Uncharted 4.

Stapleface1268d ago

@Dynasty, no, The Witcher 3 will not be downgraded. Have you ever played a Witcher game? The Witcher 2 was pushing GPUs pretty hard for its time; that Ubersampling was rough on some of the older cards. The Witcher games are built for PCs. Just because the consoles won't be able to look like the PC version doesn't mean it's a downgrade, and that will be the case for The Witcher 3. Just a heads up so you're not disappointed. But it would be pretty crazy to think a PS4 could output the same quality as, say, a 780 Ti.

Elit3Nick1269d ago

It really is worrying that this seems to be happening with two Ubi games now; lying like this will only get you so far before the backlash starts damaging your reputation. You'd think they would be more careful after seeing what happened to EA after BF4.

user56695101268d ago (Edited 1268d ago )

I'm pretty sure devs have been doing this for the longest time; suddenly it's somehow new. Maybe because it gives people a reason to bash the other console, and because it shows these consoles are not as powerful as they were made to seem, given that console versions of games can't run features that have been used in PC games for the last couple of years.

Third-party devs are used to working with more powerful GPUs and CPUs, so a 1.23 TFLOPS vs 1.84 TFLOPS difference does not seem like a big gap to people who have programmed on PC. Keep on thinking the world revolves around these consoles and their "revolution"; these are devs, they work with new technology even if they don't utilize it. Keep on thinking devs see 1.23 vs 1.84 TFLOPS as a big gap compared to new-gen consoles vs the PCs of today or even yesterday (a GTX 660 did 1.9 TFLOPS, and that's 2012).

Yukicore1268d ago

But in reality almost all advertisements misrepresent their products. Take fast-food ads: you will never see that perfect burger they advertise on screen in your hand.

sevilha821269d ago

What I believe bothers most of us (myself included) is not a few bad lines or not-so-good textures; the real problem is that Ubisoft chose to keep both versions equal, so one platform doesn't look worse than the other.

We invested in a platform that we know is more powerful than the other, and we know what it can do, and the developers chose not to use its full capabilities because of contracts, or whatever.

I'm sure the game will be really good and will look beautiful in the end, and I will most likely enjoy it (eventually) if I pick it up, but it's a shame that these devs don't make it 1080p 60 frames per second just because of investors and white collars. I'm not a graphics dude, not by any means; I prefer nice, fluid gameplay. But if we have the tech for both, why don't they use it?


methegreatone1269d ago

They can't make it 1080p 60fps on PS4. They can probably make it 1080p, yes, but not 1080p and 60fps; they'd have to tap into some magical processing power for that.

I wouldn't worry too much. See, they wanted to avoid debate and therefore kept the PS4 version at 900p. As a result, the graphics will probably be better on PS4: better textures, draw distances, lighting etc. These are things they don't have to explicitly state as a number, so they can do this and still avoid debate, at least until release.

Of course, this might not be the case at all; I'm just thinking out loud. Besides, it still sucks that they wanted to avoid debate and whatnot. The whole parity thing is so screwed up.

If it affects the PC version in a big way, that is even worse. We'll have to wait and see.