
Ex-Naughty Dog Dev Explains Why PS4/XB1 Will Never Achieve CGI Visuals, 4K Will Take Two Generations

"Trailers and feature-length movies simply have a much higher budget per second than what the full game can afford," says Filmic Worlds boss John Hable.

Read Full Story >>
gamingbolt.com
eyeofcore3264d ago

PS4's hardware is 28nm, and the smallest node available in the next couple of years is 14nm. Once foundries are experienced enough with it to bring initial failure rates down to an acceptable level, even on par with current 28nm, expect two times the performance at most if you want a console with a reasonable performance and power envelope in an acceptable form factor for a home console.

That is, unless everyone is willing to dish out $600 on a home console.

SourtreeDing3264d ago

im fine with 1080p, just make better games that are fun to play.. visuals are at a nice spot.. idk why push for higher when details are great right now. if we want higher, get ready to pay more.. but im good with where it's at

johndoe112113264d ago

@SourtreeDing

Exactly. They really don't need to push 4k on consoles, all they need to do is max the graphics and get framerates to a locked 60. Imagine an open world game like fallout with the graphics of The Order. Would anyone really care if it was 4k or not?

It's still early in this gen so I expect big things in a few years from these consoles, even though I know all games won't be 1080/60 at the end of this gen. The thing is, when next gen comes around we probably will be able to get The Order's graphics at a steady 60fps in all games. That, as far as I'm concerned, would be perfect. But who knows, it may even be better.

subtenko3264d ago

Just enough time for 12K to come out, lol. At least prices will go down, and maybe, just maybe, people will be giving away free 1080p tvs because no one wants them anymore XD

freshslicepizza3264d ago

these machines (xbox one even more so) have a hard time hitting a steady 30 frames at 1080p.

the main issue is consoles need to sell at a reasonable price for it to be a mainstream product. $400 seems to be the highest most are willing to spend. that means it will take a long time to get to 4k with decent frame rates.

the pc is already there but consoles take up the bulk of marketing and therefore in a sense console gaming is holding back technical advancements because most developers use the console market as their main source to sell their games.

johndoe112113264d ago (Edited 3264d ago )

@moldybread

Consoles are NOT holding back technical advancement. The reality is that most pc gamers do not have machines that can run games at 4k. If devs were to make 4k the standard for games then they would be catering to a very small audience.

That is the exact reason that pc games are designed with various settings. They are fully aware that most people cannot run games at max settings so they leave it up to users to adjust for their specific hardware. consoles are standard hardware so adjustable settings are not necessary.

The reality is 4k gaming is expensive for EVERYONE, console and pc gamers alike, so devs cater to both console and pc gamers (the majority of gamers) who cannot run games at 4k. Blaming consoles alone is ridiculous.

Ninjatogo3264d ago

This. Last gen was good, but it was always significantly behind PC. This gen, games are mostly at native 1080p and have most if not all of the graphics effects featured on the PC versions, albeit at a lower quality setting. Even though PC is still ahead, console graphics are still in a nice spot now. They're not as clean as PC, but they're not ugly anymore.

SourtreeDing3264d ago (Edited 3264d ago )

@johnDOE

well said^^^ PC gamers are blind..

Console specs are all the same, whereas PC is all over the place because everyone has a different rig

@ninjatogo

we don't care if consoles are behind PC and that will always be the case..

ProjectVulcan3264d ago (Edited 3264d ago )

When these consoles were launched, there was talk about how they would have a shorter lifespan than the previous machines, which were 7 and 8 years old.

They exceeded the usual expected lifespan of 5 years for various reasons, not least the worldwide recession making expensive new machines somewhat unpalatable. However, I wouldn't be surprised if PS4 and Xbox One were replaced with new hardware not long after they turn 5 years old.

Not least because they were somewhat more 'budget' machines in the first place compared to the more advanced technology the previous machines launched with, and they will age faster relative to PC hardware IMO. You really can already match and then exceed them with pretty budget PC components.

I still wouldn't expect 4k to be the target resolution of any new machines, but it's several years away anyway. Enjoy what we have right now and don't even START thinking about what new hardware can bring for at least another 2 years.

ABizzel13264d ago

As soon as I saw that the quote was on Gamingbolt I knew that wasn't exactly what he said, and lo and behold, I was right.

He said it'll be another 2 console generations before realtime CG graphics hit consoles, and add another if you want to render that in 4k. But even then I think it's only going to take 2 gens to hit 4k gaming with CG visuals (so PS6 / XB5); the only reason it wouldn't is because one of the two decides to force the other into an early generation, for money's and competition's sake.

QHD - 4k gaming is coming next console generation. 2015's big GPUs are supposed to be the R9 390 and 980 Ti. Those are 8.5 TFLOPS (+300w) and 6.2 TFLOPS (7.44 in comparison, 250w). Those single GPUs should be more than enough to run many of today's games in 4k @ 30fps, considering the standard 980 and R9 290x (weaker cards) can do just that.
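For anyone wondering where TFLOPS numbers like those come from, theoretical FP32 throughput is basically shader count x clock x 2 (one fused multiply-add per cycle). A rough back-of-the-envelope sketch in Python, using commonly quoted reference clocks, so treat the figures as approximate rather than official:

# Rough theoretical FP32 throughput: shaders * clock (GHz) * 2 ops (FMA) / 1000 = TFLOPS.
# Reference clocks are approximate; real boost clocks and game workloads vary.
gpus = {
    "PS4 (1152 shaders @ 0.800 GHz)":      (1152, 0.800),
    "Xbox One (768 shaders @ 0.853 GHz)":  (768, 0.853),
    "R9 290X (2816 shaders @ 1.000 GHz)":  (2816, 1.000),
    "GTX 980 (2048 shaders @ ~1.126 GHz)": (2048, 1.126),
}

def tflops(shaders, clock_ghz):
    # Two floating point operations per shader per cycle (fused multiply-add).
    return shaders * clock_ghz * 2 / 1000.0

for name, (shaders, clock) in gpus.items():
    print(f"{name}: ~{tflops(shaders, clock):.2f} TFLOPS")

That puts the PS4 around 1.8 TFLOPS and a 290X-class card around 5.6 TFLOPS, which is roughly the gap the comment is gesturing at.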

In 5 years' time those cards should be mid-range, with significantly lower power draw (likely 50% less), 1/3 of the original price ($200 range), and superior technology and software drivers, which makes them viable options for a console in their rebranded forms, R9 770x and UGTX 460 (2019/2020).

Now that being said if graphics evolve, which they will, then most graphically demanding games will be in the QHD (1440p) / 2k / 3k range, while smaller games and indies will aim for 4k.

Which means the following generation (PS6 / XB5), will be just like this generation and aim for 4k gaming for the majority of games (this one aims for 1080p). Ultimately resolution will be a non-issue next gen (PS5 / XB4), because QHD (1440p) and above pretty much produces a crystal clear image regardless since it's nearly 2x 1080p. And most importantly the gen after (PS6 / XB5) will be the end of graphics and resolution wars for all but the biggest fanboys, because 4k resolutions and higher will be the norm, and graphics will be good for every major game, meaning gameplay, fun factor, and originality will be the main selling and discussion points of games once again.

garrettbobbyferguson3264d ago

@Johndoe11211 "Imagine an open world game like fallout with graphics of the order"

So how do you propose these consoles do that? The Order is a linear shooter and it can't even achieve 1080p/60.

Also in regards to "no one has a capable PC" I'm just gonna assume you're trolling.

http://store.steampowered.c...

os
46.94% on windows 7 x64
29.47% on windows 8.1 x64
highest x86 is XP at 3.29%

ram
30.62% on 8gb
21.88% on 4gb
13.36% on 12+gb

gpu
32.8% on 1gb vram
23.05% on 2gb vram
12.98% on 970s

cpu
44% on quad core
48% on dual core

johndoe112113264d ago

@garrettbobbyferguson

It's pretty obvious that you're here just to disagree or poke fire at some personal issue that you probably have with me, for you to completely misquote both my statements. Please read my posts over again; if you have a problem with the English language, PM me and I'll try to explain myself better to you privately. I never said I expect The Order graphics at 1080/60 from this gen, and nowhere did I say "no one has a capable PC". Something is wrong with you.

badz1493263d ago

I am also pretty happy with what the PS4 can achieve at the moment because games like inFamous SS, Driveclub and The Order still manage to amaze me with their visuals. I think we all can agree that these games look amazing!

Sure high end PCs still have the upper hand but 4k is still not a norm today just like how 1080p wasn't in 2006. It will take some more time for it to be mainstream and I am ok with that. I honestly am hoping that next gen, 4k is in the checklist of the PS5 or it will be a disappointment! I think 4-5 years is enough for 4k to mature and for the price of a 4k setup to drop to a more affordable price point.

2 more gens for 4k? That's BS!

mikeslemonade3263d ago

As long as it's 1080p and has AA it's fine for now. Because we watch Blu-ray movies in 1080p and they look phenomenal. Resolution is not the bottleneck, it's the graphics.

freshslicepizza3263d ago (Edited 3263d ago )

@johndoe
"Consoles are NOT holding back technical advancement."

yes they are. developers are catering to the larger market. games like destiny also support last generation hardware because of the huge install base which hinders advancements even further. the pc should always be the lead platform but isn't always with multiplat games.

have you seen the requirements for oculus rift? they are quite high and that's good. that is why project morpheus will have an uphill battle when it comes to performance on demanding games. instead you will likely see indie style games being supported. if oculus supported the xbox one and ps4, the games would be held back to accommodate the hardware, as would oculus rift itself be held back from being as advanced as it is.

as long as consoles remain popular the growth going into 4k gaming will be a long route. 4k tv's are still rather expensive; the more popular they get, the more prices will come down. that same scenario works with gaming, they will always cater to the most popular userbase. console gamers don't care that much about frame rates either, which is why so few games go above 30 frames and we've even seen some pc games capped at 30 frames because they were designed for consoles. the call of duty franchise could also host more online players on pc than on consoles, but that franchise has been dumbed down for consoles because that is where its largest market is now. so tell me again how consoles don't hold back pc gaming.

Revolver_X_3263d ago

@Johndoe

It is true that average pc gamers can't run 4k effectively.

https://www.youtube.com/wat...
http://www.geforce.com/hard...

Thanks to DSR technology, though, an average pc gamer can run 4k downsampled to 1080p. It's still something, and by messing around with settings you can get a lot more out of a $350 PC than a PS4 or X1.

While the argument is always "most pc gamers", the fact is "most" pc gamers can play games @ 1080p 60fps, something consoles struggle to do. I suspect "most" pc gamers will regularly play @ 4k before consoles ever adopt it. The best we can hope for is solid 1080p 60fps next gen, then hopefully 2k (1440p) after that. 4k for consoles is more like 3-4 gens from now. I do expect console cycles to become a 5yr standard from now on, so that's 15-20 yrs from now. Just my thoughts though. I would love to be wrong, trust me.

BitbyDeath3263d ago (Edited 3263d ago )

@garretbobbyferguson, The Order is 1080p same as how blu-ray movies are 1080p. Do you think they are also not 1080p? The resolutions are the same on both.

awi59513263d ago

@johndoe11211

PC players very easily have machines that do 60 fps on max settings at 1080p. I can't say that for consoles.

awi59513263d ago

@SourtreeDing

We are not blind; we are playing on pc and we can see the downgrades clearly.

Locknuts3263d ago

Higher details and higher framerate. 30fps is a joke and some games can't even achieve that.

Dee_913263d ago (Edited 3263d ago )

@moldybread
my pc isn't "there".
What you mean is the technology is there. What's holding back advancements on a mainstream scale is price. Not many people are willing to pay over $800 for prettier graphics... unless you want the install base to drop dramatically, creating a domino effect resulting in a handful of games releasing each year or, possibly, consoles dying altogether... Well, I think that is the PC master race's plan for consoles.. so yea. Pony up the dough or gtfo, right?
tfoh

Ike_Broflovski3263d ago

Intel's next gen of Core CPUs will be 10-12 nm. I have no clue what eyeofcore is on about, and neither does he TBH.

4k will be maxed on budget GPUs by the time this stagnant gen comes to an end.

BeefCurtains3263d ago

Higher and higher resolution can only do so much for gaming right now. Total immersion, that's where it's at. And tech is finally at our fingertips for good VR and AR. I hope to see some major advancements in consumers' hands.

4k? Not so much right now. 1080 or 2k VR??? I'll take it, thank you very much.

_-EDMIX-_3263d ago

Johndoe is 100% correct with that notion.

Console gaming is holding no one back. If those teams want to make demanding games ONLY for PC....they are free to do so. No one is stopping them.

@Garrett-"So how do you propose these consoles do that? The Order is a linear shooter and it can't even achieve 1080p/60" Won't, not can't. It does lessor settings because of what the game is and what they focus on...that is a choice. If you game on PC...you clearly know that going 1080p, 60fps is nothing more then turning off some effects.....I don't see how the y "can't" if its their own game...they very much can if they really wanted that.

Like I've stated before....if they wanted it SOOOO BAD...they would not make new engines, just use a last gen, dated engine and do 1080p 60fps all day. Clearly...that isn't what all developers want.

1080p and 60fps are NOT ALL the settings to actually judge a game by...I mean...I'm sure we all know that right?

Thats like saying those 2 numbers mean more then a new engine. Soo, HL1 look better then HL2? What if I told you HL1 is in 1080p 60fps and HL2 is in 720p 30fps? I mean...that setting only really means soooo much. Its not a night and day difference and it doesn't go over new engines. Not even slightly.

@Ninjia- "Last gen was good, but it was always significantly behind PC."

All gens, generally speaking, will always be behind PC, but that is generally speaking. All gaming PC's will be behind NASA's.....so? I mean.. lol, it means very little if that hardware is not actually being used as the minimum in terms of development, i.e. do we see R9's being used as minimum specs right now?

PC will always have the edge in making a game look "better" by comparison, but PC at the same time won't be making exclusive games that have minimums beyond that of consoles. It has to do with what many have stated already: not enough own those beast PC's to really solely develop for that crowd.

Many on here need to really ask themselves....if this was what developers wanted....what's stopping them from making a PC exclusive that has a Titan card as its minimum? They can crowd fund if they really wanted that too...

You didn't see it in any other gen, you won't see it now. It has to do with the fact that MOST don't own such rigs to even justify such development.

I game on both PC and console and can say it's a double edged sword. You "can" have the better graphics, but that option to have "better" also means it's not exactly like console.....which results in less exclusive HIGH END AAA development.

PC gets the hardware price down; console gets the developers working on higher end hardware, which ultimately gets PC versions made.

I'm sorry but Witcher 3 and AC Unity are only made on PC because a PS4 and XONE exist to justify the engine and development. Yet we didn't see both titles last gen on PC despite the hardware existing.

Dee_913263d ago (Edited 3263d ago )

@garrettbobbyferguson

I have
windows 8
8gbram
1.8 gb vram
i7 quad core

I can barely run GTA V. My Project CARS is at medium settings with 30fps and 720p.
You need maybe double the power consoles have to get graphics on PC to look as good as they do on consoles. Optimization >>> brute power. Don't get me wrong, I may be able to play those games at higher settings, but running at 80C temps for long periods of time will drastically shorten the lifespan of my gpu, and the same goes for my cpu. That's why the requirements to play such games are much higher than what consoles have.

abstractel3263d ago (Edited 3263d ago )

It's all relative though. Look at CGI 7 years ago. Or 12 years ago. I think gaming has surpassed that :P. Of course offline rendering of CGI will always have the advantage of time, meaning it can take hours per frame and be acceptable. It doesn't need to worry about refreshing at 30 frames per second or more.

So a bit of a duh comment based on the headline alone (yes, I have not read the article).

umair_s513264d ago

1080p is fine for me too, but they should find a way to make next gen 100% backwards compatible.
I don't want to rebuy all my games again for next gen.

johndoe112113264d ago

I think it will. This gen probably marks the beginning of a standard set of hardware for consoles. It won't be like in the ps2 or ps3 era where hardware design was out of the ordinary. The ps5 and xbox two will probably be designed almost identically to the ps4 and xbox one but with higher specs. If they do that, then games will definitely be backwards compatible.

kneon3264d ago

I agree with johndoe11211, now that they have gone with x86 they are unlikely to move on to some other architecture. The only viable one being ARM, and the only reason to do that would be to make it easier to run the same game across mobile and console.

But I find that highly unlikely, they will stay the course with x86, and because of that backwards compatibility will be quite easy. And they may even start releasing new consoles sooner than you'd expect since the development cost is so much lower.

bumnut3263d ago

That is one of the main reasons I stick with PC, no BC troubles.

johndoe112113264d ago

@eyeofcore

I've read your comment about 15 times and I still don't have a bloody clue what you're trying to say.

NuggetsOfGod3264d ago (Edited 3264d ago )

Next couple years?

AMD "zen" cpu comes next year.
And it's 14nm FinFet.

Also Amd & nvidia gpus next year will be 16nm with HBM2.

4k monitors are in the $500 range now and in 2 years 1440p/4k will be a normal thing.

If it's gonna take the ps6 to do 4k then the ps5 is already holding back pc lol

But if console gamers are already willing to wait from 2005 to the end of 2013 to move from 720p 30fps to 1080p 30fps, then it won't be a problem to play at 4k 30fps in 2029 lol

My lord two more generations of downgrades.

I hope pc keeps rising.

Paying for multiplayer plus skins and so many things wrong with consoles.

Consoles are lame as fake.

@DarkOcelet
Because it takes place in a shoe box like ps3 games did.

You won't see an open world game that looks like this on ps4.
http://www.youtube.com/watc...

And before u ask "buh how many pc's can play that??!"

$80M worth.

itisallaboutps3264d ago

Priorities, my man. Some people like to spend 400 dollars, others like to spend 2k on a pc. Nothing wrong with that, but others might want to travel. Or hire a really high end concubine.

Ippiki Okami3264d ago

Gotta love the ignorant PC enthusiasts talking about stuff they don't understand LOL. This dev is talking about rendering game assets in 4k. The only games that have done this so far are Ryse (2k textures, I believe) and Dragon Age Inquisition (the shiny armor).

The games on PC that say "4k" are only upscaled 1080p native games. All modern games are rendered at 1080p to display in 1080p. Epic Games already said rendering in 4k now adds thousands/millions of dollars to a game's budget. The Crysis devs already went near bankrupt thanks to Ryse's expensive 2k rendering costs.

I really wish 'PC Enthusiasts' would learn about games cuz you really look like idiots when you talk about stuff you're clueless about.

Revolver_X_3263d ago

@Ippiki

Maybe you should educate yourself!

https://www.youtube.com/wat...
http://4k.com/gaming/

They do exist!

iloveallgames3263d ago (Edited 3263d ago )

This is definitely a case of the pot calling the kettle black!

In context, 2k and 4k are referencing resolutions. For digital display, DCI defines them as 2048x1080 and 4096x2160 where the 2k and 4k are references to the first digit in the resolutions. That's the professional world, the most common consumer equivalents for 2k and 4k are 1920x1080(full hd) and 3840x2160(ultra hd). If a game is running at 1920x1080 then it is displaying a 2k render. If it is running at 3840x2160 then it is displaying a 4k render.

When talking assets, we are talking the resolutions of textures. A 2k asset would be a 2048x2048 resolution texture and a 4k asset would be a 4096x4096 resolution texture. Now here's the catch, these two aren't conjoined. You can run any combinations you like at any resolutions you like. You could run a 2k render with 4k assets, or even higher, if you like.

There aren't a ton of pc games that are shipping with 4k assets but there are some and more are showing up all the time and there are plenty of 4k mods if you really want them. Anyway, modern design is less about single massive texture sizes, and more about multiple smaller textures blended together in the shader.
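To put rough numbers on the render-versus-texture distinction above, here's a tiny illustrative sketch in Python; nothing here comes from the article, it's just the pixel arithmetic for the resolutions already mentioned:

# Pixel counts for the render resolutions and texture sizes mentioned above.
full_hd = 1920 * 1080
resolutions = {
    "DCI 2K render (2048x1080)":    2048 * 1080,
    "Full HD render (1920x1080)":   full_hd,
    "DCI 4K render (4096x2160)":    4096 * 2160,
    "Ultra HD render (3840x2160)":  3840 * 2160,
    "2K texture asset (2048x2048)": 2048 * 2048,
    "4K texture asset (4096x4096)": 4096 * 4096,
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd:.2f}x Full HD)")

# Note: render resolution and texture resolution are independent choices;
# a 1080p render can sample 4096x4096 textures, and vice versa.

The takeaway matches the point above: going from 1080p to Ultra HD is a 4x jump in rendered pixels, while a 4096x4096 texture is 4x the texels of a 2048x2048 one, and the two settings can be mixed freely.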

But what do I know, I'm just a clueless, ignorant pc enthusiast that needs to learn about games so I don't look like an idiot.

wegetsignalx3263d ago

Consoles will remain a part of the industry because most console owners don't want to build and maintain a PC.

ABizzel13263d ago

@Ippiki

I see what you were trying to do by replying to NuggetsOfGold (which is a waste of time since he's a 1 bubble troll, and ignorant), but you yourself went off topic with your explanation of what the dev actually said.

He said it's going to take 2 generations (PS6/XB5) before consoles can produce CG quality graphics in real-time similar to what movies use, and possibly another generation (PS7/XB6) before those CG graphics can be rendered at 4K in real-time.

His main point was talking about rendering CG quality visuals on console, not 4k gaming in general.

@Revolver_x_

What @Ippiki was saying is that most games don't develop games with 4K Assets. Many of the tools are still 1080p textures, and the image is simply scaled to 4k, rather than everything being rendered in 4k to begin with.

For example
http://www.nexusmods.com/sk...

I know it says 2k texture mod, but that's what he's talking about. Most games still use 1080p assets, but are rendered at 4k.

clouds53263d ago

@ippiki: dude... sure, texture resolution and rendering resolution are different things. But they are not dependent on each other. On PC, in almost all cases, you can choose your rendering resolution and your texture resolution with different settings, usually called "resolution" and "texture quality". You can set those to any level you want.
Now, your max output resolution is determined by your display. If you have a 1080p monitor/TV, that is the highest _native_ resolution you can output. You may choose to render at 720p and upscale it to 1080p to save performance, or render at 1440p and downscale to 1080p to improve visuals.

It's true that most games don't have a 4k texture option, but that has nothing to do with rendering resolution and isn't required to output at 4k. There are enough titles that offer 4k textures through mods though.
I personally usually play at 1440p downscaled to 1080p, btw.

DevilOgreFish3264d ago (Edited 3264d ago )

"well said^^^ PC gamers are blind..

Console specs are all the same, whereas PC is all over the place because everyone has a different rig"

...console specs are all over the place too, some own xbox 360s, ps3s, ps vitas, Wii-Us, Xbox ones, PS4s...and each of them developers have to support too. And PCs are less complicated, they all read the same programming language and support the same APIs. All of the consoles read different languages and support different APIs.

And once again we have people downplaying CG graphics; just 9 years ago people were buying into the FFVII tech demo. And not to mention, people 9 years ago were buying into the 1080p standard, and now all of a sudden 4k's too high? ...........I thought 1080p was a bigger leap, coming from 480p SD consoles. 4k is only 2x the leap.

johndoe112113263d ago

"..console specs are all over the place too, some own xbox 360s, ps3s, ps vitas, Wii-Us, Xbox ones, PS4s...and each of them developers have to support too. And PCs are less complicated, they all read the same programming language and support the same APIs. All of the consoles read different languages and support different APIs. "

The absurdity of this comment is mind numbing.

DevilOgreFish3263d ago (Edited 3263d ago )

That's because you're seeing it from a console gamer's perspective. You're not seeing it from a developer's.

- PS, Xbox, and Nintendo are a lot different from one another. If a developer develops on DX, all PCs can read it; the only difference is the power capabilities. Consoles differ from programming language to hardware.

And at least PC gamers will be able to link up competing GPUs with the next DX. I'd like to see someone linking an Xbox, PS, and Wii-U together. ;)

purpleblau3263d ago

The Order 1886 is very close to CGI. If we don't need 4K, we might just wait another cycle. It's close.

Ike_Broflovski3263d ago

No, it's not at all. Average washed out greys at best. Not to mention that the game sucked overall and it's now stuck in the bargain bin where it belongs.

If you want more games like The Order to pop up then you are what's wrong with gaming.

DarkOcelet3264d ago (Edited 3264d ago )

The Order 1886 already almost looks CGI.

Hellsvacancy3264d ago

Some of God of War 3 looked CGI.

eyeofcore3264d ago

GOW3 achieved CGI-level quality and maybe even exceeded what was available at the time of the original God of War (early 2005).

PrinceOfAnger3264d ago (Edited 3264d ago )

Ryse

http://gearnuke.com/wp-cont...

i know some scenes are pre-rendered in engine, but hell, that pic above looks better than CGI. It looks real.

MasterCornholio3264d ago

Isn't that from a cutscene? As far as I know cutscenes in Ryse are prerendered. Correct me if I'm wrong.

PrinceOfAnger3264d ago (Edited 3264d ago )

@masterCornholio

here are DigitalFoundry's frame-rate tests; these are realtime scenes
https://www.youtube.com/wat...

skip to 2:54 for an in-game shot
http://www.picisto.com/phot...

wegetsignalx3263d ago

All Ryse cutscenes are pre-rendered including that one. I've played it on PC, it's very easy to spot when the game switches from 60 fps to 30 during cutscenes.

A frame counter during a cutscene doesn't mean it's realtime; it just means the frame counter is still running during the cutscene.

Cy3264d ago

It was also a 4 hour long game. I'd much rather have games like Dragon Age: Inquisition and Witcher 3 at 1080p, 30fps than have a 4 hour, linear action game which looks like a CG movie.

WeAreLegion3264d ago (Edited 3264d ago )

You beat it in 4 hours? My first playthrough clocked in at just over 12 hours.

Transistor3264d ago

What you dislike about the Order doesn't have anything to do with the fact it almost looks like CGI.

Cy3264d ago

@Transistor

Actually it does. My point is that I doubt you can have long, incredibly intricate games that look like a CG cutscene 100% of the time. The Order focused on graphics over everything else, and in a lot of ways it's barely a complete game. I'd much rather have a deep game than a pretty one.

UltraNova3263d ago (Edited 3263d ago )

@Cy

Not all games need to be open world 100 hr RPGs.

It's called choice, and whether some like it or not, I love being able to choose between types and not have to pick between only 100hr open world games with stories so stretched out it becomes a chore to finish them.

Plus not all of us have 15 hrs per day to spend on games anymore...

Look, don't get me wrong, I love RPGs. You see, they're like steak: I love them so much it's my favorite type of food, but I don't want to eat steak at every single meal because I will be fed up with it sooner rather than later.

You see, I love my steak and my chicken and my pork and my salads and everything in between in equal measure.

So you disliking The Order like that is you saying no to more choice. It's bad for you and for everyone else you persuade.

christocolus3264d ago

Can't wait to see that game at E3. Sam Lake says the team has made remarkable progress since the last time it was shown. Also, one of the guys working on QB also worked with the team that did the visual effects for the movie "Gravity".

Dirtnapstor3264d ago (Edited 3264d ago )

I've yet to get an Xbone, Quantum Break may be the game that persuades me.

wegetsignalx3263d ago (Edited 3263d ago )

The Quantum Break reveal trailer is pre-rendered CG, including the first screen shot you posted.

jukins3264d ago

Yeah, those games looked CGI, but this article is referring to CGI-like visuals at 4k in real time. Not even 40 Titans could achieve that.

Jalva3263d ago

Lol at all the people who agree that The Order: 1886 and God of War 3 look CGI but disagree that Ryse does, just goes to show that these people don't even know what CGI is and it's just a matter of one-upping Sony and downplaying Microsoft as usual.

wegetsignalx3263d ago (Edited 3263d ago )

Ryse cutscenes are pre-rendered CG. The Order: 1886 cutscenes are realtime. That is the difference. It has nothing to do with your personal attacks.

DevilOgreFish3263d ago (Edited 3263d ago )

@ DarkOcelet "The Order 1886 already almost looks CGI."

...CGI of 14 years ago still beats it. Spirits Within used fully rendered hair and, of course, 400,000 polygons spent on characters.
http://gamehall.uol.com.br/...
http://static1.gamespot.com...

That being said, Assassin's Creed Unity is the game at the moment that's actually tried to push for rendered hair on characters, and lighting too.

http://i.imgur.com/jyUimVt....
http://i.imgur.com/kX7ZwrF....

Ike_Broflovski3263d ago

At least you said almost.

To most non-hype-driven fanboys The Order was average at best, even graphics wise. To seasoned PC gamers those "awesome" graphics are laughable. Then there's the fact those consoles can't render a game with "decent" graphics like The Order and have the game still be fun to play and not over in mere hours.

The Order was nothing more than a bargain bin game that had never before seen levels of hype to sell it to dummies.

One of the worst games I have played in the last 2 years.

DarkOcelet3263d ago

To each his own my friend but The Order 1886 was enjoyed by many people.

Also, The Last Of Us looked incredible and had amazing gameplay, and so did Gears Of War 3.

So I am pretty sure Gears 4 and The Last Of Us 2 will reach the graphical fidelity of The Order 1886 and be awesome to play.

Transistor3264d ago

This just reminded me how excited I am to see Quantic Dream's PS4 game. What they achieved on PS3 was pretty crazy, I could see their PS4 game getting pretty close to CGI.

Tankbusta403264d ago

I'm fine with the PS4's hardware... let's develop great games before we worry about superficial stuff.

The guy is just butthurt he doesn't work for Naughty Dog anymore.

Minute Man 7213264d ago

We all know this and it's 2 to 3 generations away

PhucSeeker3264d ago

4K, maybe. But with The Order 1886, inFamous 2nd Son, Driveclub, Killzone Shadow Fall and Ryse, I think we're good on that CGI visual part.

Minute Man 7213263d ago

Those games you listed do look good; I have the PS ones and later this year will be picking up Ryse & an X1. To say that these games are on CG level is just inaccurate. In the next 2 gens we will all see what CG gameplay looks like.


Improving Graphics Performance Using Cloud Is Going To Be Really Hard: Ex-Naughty Dog Dev

The idea is tempting but simply offloading assets to the cloud won't work, says Filmic Worlds boss and ex-Naughty Dog developer John Hable.

Read Full Story >>
gamingbolt.com
Sonyslave33250d ago (Edited 3250d ago )

I can’t comment on any specific applications

In other words, I can't really comment on what's going on at MS because I have no idea what they are doing. -___-

Eonjay3249d ago (Edited 3249d ago )

Well, the last demo they showed was cloud based physics calculations. They never said anything about graphics processing in the cloud. But they have demonstrated other uses like cloud based software updating and cloud based physics.

But indeed, the issue is that Microsoft hasn't actually shown anything new or compelling yet. This is part of the problem. They made a lot of comments at the beginning of the gen about the potential but it hasn't come to fruition yet.

vega2753249d ago

http://www.pcgamer.com/nvid...

He must have forgot nvidia was all so messing with cloud computing with lighting effects in games. So I call B.S on his statement.

Dark_king3249d ago

@vega275 That is not graphics though, it's just doing the calculations and then sending the data back to be rendered.

“for computing indirect lighting in the Cloud to support real-time rendering for interactive 3D applications on a user's local device.”

says so right there

Lennoxb633250d ago (Edited 3250d ago )

And he would know about everything that's going on behind the scenes at MS. I mean he does work for Naughty... Oh wait. He wouldn't know a d@mn thing would he? lol

Fez3249d ago (Edited 3249d ago )

This is not a specific comment on MS... It's to do with offloading graphical computations to remote servers.

Did you even read and think about the article or did you just see the thumbnail and go into "console war mode"? Pretty low of the submitter to do that but I would expect almost everyone to see past this media manipulation by now. RTFA.

Of course there are limitations in trying to perform low-latency graphical computations over a network - this is all that has been stated and is common sense.

It will be cool to see what does emerge from cloud computing in terms of gaming. Maybe some AI can finally make big leaps, esp. as the generation unfolds and there are more servers and power.
Tbh it is hard to think of something that isn't required almost immediately in gaming though - I don't know what the average would be but something like a >100ms round trip time seems reasonable... and is an awfully long time. But perhaps there are tricks and techniques to overcome this. Maybe some things can be calculated in advance for the next few frames.
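As a purely illustrative bit of arithmetic on that round-trip point (the RTT figures are hypothetical, just plausible ballpark values):

# How many frames go by while waiting on a cloud round trip? Illustrative numbers only.
frame_rates = [30, 60]                 # frames per second
round_trip_times_ms = [50, 100, 250]   # hypothetical round-trip times

for fps in frame_rates:
    frame_budget_ms = 1000.0 / fps
    for rtt in round_trip_times_ms:
        frames_late = rtt / frame_budget_ms
        print(f"{fps} fps (budget {frame_budget_ms:.1f} ms/frame), "
              f"{rtt} ms RTT -> result arrives ~{frames_late:.1f} frames later")

At 30fps a frame lasts about 33ms, so a 100ms round trip means any offloaded result shows up roughly three frames after it was requested, which is why only work that can tolerate being a few frames stale (AI, far-field simulation) is a realistic candidate.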

Lennoxb633249d ago

It doesn't make sense to even talk about offloading graphical assets to the cloud since that was never the goal of cloud compute in the first place. It was meant to take care of some of the CPU tasks (most very small) in order to give the CPU more room to take on other things. The cloud is not rendering graphics, the console is still doing that. All it's doing is taking bits and pieces of data that the console needs for things like AI or physics, and sending it to the console. Not rendering it in the cloud.

rainslacker3249d ago

The paper that Nvidia provided with their demonstration has a lot of great information on what is going on, as well as the requirements to make it work.

http://graphics.cs.williams...

It's a pretty technical read, but good nonetheless.

Nvidia claims that the standard ping time in latency is sufficient in most cases; however, the actual real world results would be highly dependent on a lot of things. Certain applications work better with lower ping.

This of course only took into account 3 different types of lighting systems, but they are fairly common approaches in today's games, though they often don't happen on the same scale, as it'd be resource prohibitive.

This is of course only one implementation of this kind of tech. There are others out there, and I'm sure there are some that have yet to be revealed.

@Lennox

That's actually true. However, Wardell, in his infinite wisdom, decided to discuss offloading the lighting, which many in the media took as MS actually saying it. MS has mentioned this procedure, but never specifically in regards to the X1. Much of the expectation some people have for the tech comes from misrepresentation by a 3rd party who was only talking theoretical possibilities and not actual intended application.

Cloud compute to MS was 3X the resources; however, resources do not equate to 3X the power. For example, 3X the resources in a physics engine means that you can calculate 3X the number of physics calculations per interval. 3X the power means that you could likely process 300X the number of physics calculations per interval, or process much more complex physics calculations than what would be necessary for a game.

That being said, whether or not games need to calculate 3X the amount of physics per game is questionable. I imagine there are times when it could be nice, but I'm not sure of the overall practicality of such a feat, as there does need to be an object for every physics calculation.

AI makes a lot more sense, as it can be quite complex, but the results, and variables to derive those results are typically very small and allow for quick and easy transport through standard latency scenarios. Any lag introduced would likely be imperceptible to the end user, unless there was a huge spike or disconnect, and there would likely be backup routines should the data not arrive in time.

Fez3249d ago (Edited 3249d ago )

It does make sense to talk about offloading graphical assets to the cloud if you're asked the question in an interview though lol. And it's also an interesting topic outwith any narrow console war you may or may not be involved in.

The goal of cloud computing (in particular console gaming) is to provide a better experience to the user by working around the limitations of the hardware in whatever way is feasible.
No doubt work is being done on distributed graphics right now and if it is a feasible option, you can bet it will be tried at some point.

Asking the question to a developer (ex or otherwise) for their thoughts on this subject seems okay to me.

Spotie3249d ago

Lennoxb63 says, "It doesn't make sense to even talk about offloading graphical assets to the cloud since that was never the goal of cloud compute in the first place."

Ars Technica interviews Matt Booty.
http://arstechnica.com/gami...

I'm sorry, what was that?

Bizzare213250d ago

I like how old developers from Sony are commenting on the new things MS is trying to do. lol

XeyedGamer3250d ago

I don't see a direct reference to MS... You just made the assumption that the cloud tech is exclusive to them.
That being said, I think it's actually Sony who are pushing it forward at this point with game streaming. I've seen very little other than a couple demos touting one thing or another from the super bs pr team over at MS.

TheCommentator3250d ago

Game streaming is as different from cloud processing as playing games online is.

MS has built the XB1 architecture around this function. Each of the 4 multipurpose DME's can do data movement at 26Gb/sec to/from any 2 locations at no cost to CPU, GPU, or memory bandwidth. My understanding is that server systems use these types of accelerators to communicate between nodes within the server bank. MS will show Crackdown at E3, so they will talk more about the cloud's usefulness in gaming then.

Bizzare213249d ago

Well the thumbnail is XB1 vs PS4...

madmonkey013249d ago

nothing new about remote servers, just a new buzzword to market it.

Fin_The_Human3250d ago (Edited 3250d ago )

Read the article; to sum it all up, it all depends on internet speed and reliability.

Cloud graphics and physics improvements are still too early because of the slow and unreliable internet speeds that 80% of gamers have.

MS is onto something but it's way ahead of its time.

kaizokuspy3250d ago

This, exactly. Microsoft is right, but it's not practical yet; when it becomes practical it will be amazing.

Brisco3249d ago

80% of the gamers? Maybe in the States, but Europe is ready for this.

MysticStrummer3249d ago

The states are the biggest gaming market. I'd say 80% is conservative.

dcbronco3249d ago

Eighty is way overblown. The Washington D.C.-Baltimore region and northern Virginia have somewhere around 10-13 million people. The average internet speed here is 30Mbps, easy. I would bet most of the major cities are similar. I would bet a third of this country has around that speed or higher. The problem is low population states like North and South Dakota and Montana bring the average down. And we have too many states like that. But more than half of us live in the major cities.

On topic I think this guy is too busy with his new business to be up on what Microsoft is doing. Some of his comments make that clear.

He mentioned server cost.

It's like he doesn't know what Azure is. So the answer to who will pay for all those servers is the companies that use Azure. Microsoft offers them to developers for free. He has a business; he knows businesses pay a premium for every service they use.

Bandwidth.

Microsoft and Duke University have already cut bandwidth needs by 83%. Plus, if he looks up some of the information on Project Orleans, a big part of that is the instantaneous hydration and dehydration of information to reduce bandwidth needs.

Wi-Fi?

Not even sure that is a real problem. If your Wi-Fi sucks buy a long cat-5.

Server goes down.

Again Azure has protocols in place to switch anything running instantly over to another server. He has to remember Azure is being sold as a business tool with Quality of Service guarantees. They want to use this to create mobile disaster infrastructure that can quickly switch over to a new host if needed. Look up agents in Project Orleans.

Latency is an issue.

Some of the other things will help address that. But the predictive technology Microsoft has been working on may play a part in that. As well as lost packets.

The reality is it may be really, really hard. But there are people thinking outside the box working on it. One of the things they were working on was the console handling everything in a certain area around the player and the cloud handling the things further away. Plus I think people should remember some of the rumors we've heard over the years. It could be a matter of the cloud just adding more details to things the console draws.

I think people need to wait a couple of more weeks.

Fez3249d ago

I think the round-trip time is the big problem though, not bandwidth or reliability. And unless servers are going to be in your country and close to your location you might not be able to benefit from cloud computing.

It's a really interesting problem to overcome. So many variables that will affect everyone's experience differently.
50ms RTT vs 250ms RTT.
Internet cutting out.
Internet traffic at peak times.

I wouldn't like to be the guys programming that... actually that's a lie it would be awesome but very very difficult!

Outthink_The_Room3249d ago

It's always funny seeing people talk about RTT, but never bring up how MMO's receive data, update state and game logic, apply data and then return said packet.

Why, if a Cloud Compute approach is held back by RTT, would any MMO be playable if it requires a similar approach for data?

Fez3249d ago (Edited 3249d ago )

That's a good point... also any online game in general must do the same.

I'm not an expert on this at all but I think there are a lot of tricks to achieve this and that's why the experience is not always optimal and varies per game.
For example: Client-Side Prediction and Server Reconciliation discussion here ( http://www.gabrielgambetta....

This kind of workaround to lag may not be possible because you would need the result (at least for graphical computations) immediately. But maybe things like AI could lag a few frames behind and be better than local AI.
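For anyone curious, here's a bare-bones sketch of the client-side prediction / server reconciliation idea mentioned above. It's hypothetical 1D movement only, not any particular engine's API, and the class and method names are made up for illustration:

# Minimal client-side prediction + server reconciliation sketch (1D position).
# The client applies inputs immediately, remembers them, and when an
# authoritative (possibly old) server state arrives it rewinds to that state
# and replays the inputs the server hasn't acknowledged yet.

class PredictingClient:
    def __init__(self):
        self.position = 0.0
        self.next_seq = 0
        self.pending = []  # inputs sent but not yet acknowledged by the server

    def apply_input(self, move):
        # Predict locally so the player sees an instant response.
        self.position += move
        self.pending.append((self.next_seq, move))
        self.next_seq += 1
        return self.next_seq - 1  # sequence number that would be sent to the server

    def on_server_state(self, server_position, last_acked_seq):
        # Snap to the authoritative state, then replay unacknowledged inputs.
        self.position = server_position
        self.pending = [(s, m) for (s, m) in self.pending if s > last_acked_seq]
        for _, move in self.pending:
            self.position += move

client = PredictingClient()
client.apply_input(+1.0)   # seq 0
client.apply_input(+1.0)   # seq 1
# A late server update acknowledging only seq 0 arrives:
client.on_server_state(server_position=1.0, last_acked_seq=0)
print(client.position)     # 2.0 -- prediction preserved, state reconciled

The trick only works because the client can cheaply redo its own inputs; as the comment says, it doesn't transfer to offloaded graphics work, where the result itself is what's missing.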

It could be a great thing for some people, just like game streaming could be, but in my experience with my internet connection the lag sometimes creates noticeable problems.

rainslacker3249d ago

In terms of physics, beyond a feasibility standpoint, the developers still have to want to implement these things on such a grand scale. Implementing all these little extras takes time and resources that honestly could often be spent better elsewhere. What's the point in having a billion pieces of a destructible window, when one million will be sufficient?

Over a great period of time, obviously things will become available which make this kind of stuff more feasible on a development level, but there comes a point of diminishing returns. No matter how much a computer system may be able to do something, that something still has to be implemented at some point, and that takes time and money. One of the basic tenets of AAA game design is to make the complex out of the simple, because the simple is cheaper and more flexible across multiple implementations.

In graphics there is a term called "Level of Detail", or LOD. The premise behind this is that objects that are very close to the user's view have a higher level of detail applied to them, whereas things that are very far away have very little detail applied to them. The same is true with physics calculations. How many objects can a user reasonably have within their immediate view that requires such vast amounts of physics processing to make move? Again, diminishing returns for what amounts to lots of work on the development side.
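A toy illustration of that distance-based LOD selection; the thresholds and mesh names below are made up, it's just the idea in code:

# Pick a level of detail based on distance from the camera.
# Thresholds and mesh names are hypothetical.
LOD_TABLE = [
    (10.0,  "lod0_high"),    # close to the camera: full detail
    (50.0,  "lod1_medium"),
    (200.0, "lod2_low"),
]

def select_lod(distance):
    for max_distance, mesh in LOD_TABLE:
        if distance <= max_distance:
            return mesh
    return "lod3_billboard"  # very far away: flat impostor, almost free to draw

for d in (5, 75, 500):
    print(d, "->", select_lod(d))

The same budget logic applies to physics: spend the detailed simulation on what's near the player and let distant objects get by with something far cheaper.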

On top of all this, for graphics rendering, GPUs are becoming more powerful faster than the internet infrastructure is becoming faster and more ubiquitous, so over time, the idea of rendering in the cloud may actually hinder the almost routine abilities a graphics processor is designed for. The idea of remote rendering makes sense on certain types of devices, say mobile, due to the issues with heat in a very compact device, as one day those devices are not going to be able to go any further based on today's technology. Because of this, Moore's law is actually coming to an end at twice the speed of the average PC component. However, should a device have a rather recent GPU, then chances are that its abilities are going to far outpace what the extra cloud rendering could provide.


Ex-Naughty Dog Dev Explains Difference Between DX12 And DX11, Less Gains On Consoles Compared To PC

John Hable on how DX12 will impact consoles and PC.

Read Full Story >>
gamingbolt.com
nicksetzer13257d ago (Edited 3257d ago )

Why would a random ex-Naughty Dog dev who isn't currently working with DX12 on XB1 or PC, or in gaming at all (he does film graphics), be a reliable source to quote? Seems pointless.

Not to mention both consoles are strongly CPU bound as their core speeds are terrible. Not saying he is wrong or right, but someone who is actually using the software would make much more sense to quote; these comments from him are essentially guesses.

I feel like Gamingbolt decided to interview this guy for no reason other than he worked for Naughty Dog and wants to start a flame war.

NuggetsOfGod3257d ago (Edited 3256d ago )

"The short answer is that newer APIs will make the CPU faster, but will probably not have much effect on the GPU," Hable said to GamingBolt. "The improved APIs change how long it takes for you to tell the GPU what you want it to do, but they have no effect on how long it takes the GPU to actually execute those instructions." - ex-Naughty Dog dev

"They might be able to push more triangles to the GPU but they are not going to be able to shade them, which defeats the purpose." - CD Projekt Red

http://www.cinemablend.com/...

Same thing?

An apu is gonna suck no matter what you do to it.

Sorry.

What will dx12 do for a toaster?

What will dx12/Vulkan do for a gtx 960 and an i3/i5? Or an I7 with a r9 390x with HBM?

Now that is a worthy discussion.

jmc88883256d ago (Edited 3256d ago )

You're getting disagrees for being right...the fanboys are strong with this article.

The reason the CPU's are struggling in the new consoles is not because of the overhead...it's because the CPU's are weak.

A $229 CPU from 2008, the i7 920, at stock 2.66ghz let alone the easily achievable on air cooling 4 ghz runs circles around the consoles.

That is why the CPU's are struggling...it's because they are weak.

8 cores doesn't mean much, when each is pretty weak.

Or else some people might think you can take one of the new octacore chips in a phone and think it's powerful.

I love my PS4, but a $329 GPU added to a near 7 year old PC will run circles around it.

I don't see why people still are in denial, we've had these discussions for nearly a year that PC's will get a much bigger boost then any console, because PC's have the overhead wasting the power they contain.

Why people don't think Naughty Dog or even a layman can't know this and needs further proof is the height of ignorance.

TheCommentator3256d ago

So when developers who never worked with DX12 say DX12 will do nothing for the GPU, their speculation is magically right? Brad Wardell, on the other hand, has worked with DX12 and he is repeatedly discredited by people here. Love the double standard.

We'll see at E3 who was right when DX12 gets its XB1 debut.

XanderZane3256d ago

Well, I'm pretty sure both AMD and Microsoft were well aware of the GPU bottleneck with the old DDR3 prior to building the custom chips for the XB1. I'm pretty sure M$ prepared the XB1 to handle that bottleneck that Mr. Hable discussed. The XB1 was never built to run the low level DX11 API, which is pretty obvious from the earlier games. It was built with DX12/Win10 in mind. The system launched some 18 months early. So we'll see what happens after the key is inserted into the system.

KnowledgeIsPower3256d ago

LOL, quoting CinemaBlend as a source. Pathetic, that site is garbage and is run by fanboys.

Docknoss3256d ago (Edited 3256d ago )

And I'm supposed to believe some random guy that "USED" to work for Naughty Dog? Nah, pass! Microsoft isn't going to "cheerlead" DirectX 12 unless it actually does help out the Xbox in a noticeable way. If it didn't, it would come back to haunt them worse than E3 2013, and Phil Spencer isn't that stupid.

+ Show (2) more repliesLast reply 3256d ago
Tsubasa-Oozora3256d ago

You know what's pointless? Microsoft's cheerleaders talking about how dx12 will change the world and up the graphics and framerate... even though dx12 isn't released yet. Neither is Windows 10 or any games with dx12. If anybody can put in their 2 cents, so can he. Deal with it.

nicksetzer13256d ago (Edited 3256d ago )

I agree, those people are also stupid. Are you suggesting that fanboys in a comment section somehow validate an article based on an interview with someone who has no experience with the software they are talking about?

If so, that is just as stupid. In fact, you claiming it will do nothing is equally stupid, as you have no clue, and it has been proven that there will be improvements (just how much is the question). The reality is, the software is neither released, completed nor implemented yet. How about we wait and see instead of pumping more into it? People are going to believe what they believe until shown otherwise, so these articles are pointless unless given by someone using it.

Funny that you think your assumption is better than everyone else's... to the point of chastising those who have different assumptions, and excusing those that don't.

Outthink_The_Room3256d ago (Edited 3256d ago )

@Tsubasa

That would be true......if there weren't dozens of benchmarks already shown to the world...

...but there are.

And we can already see a massive improvement from DX11 to DX12.

Azzanation3256d ago

There will be fewer cheerleaders if there are fewer haters.

*logic*

MS fanboys aren't just console gamers, which means they're the ones getting the best out of DX12.

dantesparda3256d ago

@Nicksetzer

And yet you have no idea what experience he has with DX12 or what he knows about it. You do know that a lot of these guys all know each other, right? And that they all talk, right? I'm pretty sure that by now everybody who is anybody in the know knows a whole lot about DX12. It's not like a low level API is even some new mystical thing. It's been done on consoles forever now. So let's not fall back on that default fanboy argument of "how would he know if he hasn't used it yet"? How do you know he hasn't?

@Out
"And we can already see a massive improvement from DX11 to DX12"

On the PC! Repeat, PC, not X1. You are making assumptions. We don't know what it's going to do for the X1.

+ Show (1) more replyLast reply 3256d ago
bleedsoe9mm3256d ago (Edited 3256d ago )

Well, of course they did. I'm just surprised the clickbait went the other way this time; it's normally asking every indie dev why the XB1 sucks lol. He really has no idea what tools are in DX12 for XB1.

corroios3256d ago

lol, both are strongly CPU bound, lol. The weakest part of both is the CPU, an average mobile CPU...

ChrisW3256d ago (Edited 3256d ago )

Well... not "average mobile CPU". More like the latest smartphone.

The Galaxy S6 (which was released recently) has both a Quad-core 1.5 GHz and a Quad-core 2.1 GHz. The PS4 and Xbox One Octa-core CPUs are respectively 1.6 and 1.75GHz per core.

Sure does look comparable, though.

JWiLL5523256d ago

He's a high level developer who worked at one of the most difficult game companies to get hired at. He sure as hell knows what he's talking about, whether he's currently developing something on it or not.

People seem to be fine taking the word of MS employees who aren't even developers regarding DX12. I'd hold the opinion of a talented, unbiased developer a little bit higher.

Kiwi663256d ago

He may be a high level developer, but he still hasn't had anything to do with DX12. And why does it matter if those MS employees aren't devs? At least they actually had hands-on time with DX12, yet you're saying that his opinion on something he has no first-hand knowledge of is more valid than that of those who have.

rainslacker3256d ago

Many take Wardell's word for everything, despite him clearly stating that he doesn't know the X1 well enough to say for certain. I can respect Wardell's comments because he does at least have the knowledge to reason out what is likely to be the case, and I also realize that many of the things he says simply get attributed to the X1 despite most of the time he's only talking about PC.

This guy could probably get picked up at any MS studio if he wanted to, and go in without more than a couple days to get up to speed on DX12 specific syntax. People really don't realize how talented game developers have to be to get jobs at studios like Naughty Dog. It's not like he was some sort of intern who worked there for 3 months working on linking the menus to different parts of the game.

I've done both console programming and DX12 programming(for PC) and I can tell you there isn't a major difference in how operations are handled between the two. Syntax is different, implementation is different, but DX12 operates pretty much the same way consoles do.

It's funny though. This is a great break down of a major difference in DX12, and it's a great thing, but some people are more concerned with discrediting the statements without the knowledge or the research to refute it with something factual.

nicksetzer13256d ago (Edited 3256d ago )

@rain C++ and Visual Basic are the primary languages for D3D of any kind; very few changes in that. Weird you claim to be some god-like programmer but don't know that...

Not to mention Crytek, Unreal and Square have all had tech demos showing there is an impactful change. So should people believe you (the random self-proclaimed pro) or the people who actually presented something with the software?
http://m.windowscentral.com...
http://wccftech.com/king-wu...

So if you want to believe it does nothing, enjoy your misbelief. The only question is the effect it will have on XB1.

hamburgerhill3256d ago

I'm with Nick! It would be dumb to take seriously the words of those (even with huge reputations) that have absolutely no experience dealing with dx12 over those that have some.

rainslacker3256d ago

By learning new syntax I meant the new functions that exist within DX12. For console programming, they're either going to use C, C++, or the assembly API. There are also some game engine scripts that they will likely use, and many many 3rd party tools which will get licensed to make things work. Visual Basic won't be used because it works off a framework which isn't suitable for AAA games, but can be used for simpler games.

When I say there isn't a major difference, what I mean is that overall, the differences are on levels that aren't actually programmed in individual games. To the average developer, they're just going to use the engine, and then provide special functions through the low level API if necessary. It's EXACTLY the way console programming is done now. Not much will change. Sorry for being unclear.

I never claimed to be some god-like programmer. In fact, that's my point. It doesn't even take a genius programmer to go from one to the other if you know the basics of one of them. No people shouldn't listen to me, but they should at least verify or research what I say to determine for themselves if what I say has merit. I don't often dismiss other people's comments without at least trying to verify if they may have some merit.

Did Crytek, Unreal, and SE show off anything for the X1? Because that's really what this discussion is about. This guy is discredited for his work at ND, yet you point to all those developers who haven't made any DX12 games for X1? Seems legit.

On PC, I have said many, many times the differences in how it operates are substantial. I can attest to this based on my own work, and I am very impressed at what it can do, and I'm a little miffed that PCs have been gimped for so long because this kind of stuff wasn't available years ago without 3rd party tools.

And that's what I'm saying. DX12 brings to PC exactly what has been on consoles for decades. It's a touch more high-level, but it has an extremely efficient low-level API, just like consoles.

Let me know if you want to misread and misrepresent my comment to discredit me some more. I'll be happy to respond.

If you wish to continue on with your eyes closed and fingers in your ears going lalalala, ignoring anyone with a comment contrary to your own, and can't be bothered to provide me with any kind of response that actually addresses anything I say with factual information, then please just put me on ignore. Let my comments be used by those who want to take the effort to learn more, rather than doing what you do: throwing out a few computing terms hoping to seem knowledgeable enough to discuss the topic properly.

I don't care what your perceptions of what it will do are, but I do care when you make it out to be something that it's not. You set an expectation for others which MS cannot possibly match. At least do me the common courtesy of responding with actual facts pertinent to my comment, and not with more exaggeration and PR nonsense that only validates what you already want to believe.

TheCommentator3256d ago

What??? MS built DX12, but some guy who worked for Sony is more credible? That's just stupid.

I'll tell you what. I've got some gold to sell you... yeah, I know the Periodic Table says it's lead but they don't know what they're talking about. Trust me.

DragonbornZ3256d ago (Edited 3256d ago )

Come on now. It depends, but for the most part those MS employees actually work at Microsoft and are communicating with and/or have had experience with DX12 or those working on DX12, unlike the ex-Naughty Dog dev. I mean, geez, the difference is that simple.

Man, I wonder how you all would react if some ex-Halo dev started talking about Nintendo's new API or a Mario dev started talking about Vulkan. The tune would change here.

And there is a reason no one else is talking about this besides Gamingbolt lol.

Gamer19823256d ago

oh dear the legions are out today

rainslacker3256d ago

Why indeed? Probably because GamingBolt asked him. He's not wrong with what he says, though. It's not like you need to know each specific API inside and out to be able to work with them or understand what they'll be doing. Good game developers don't live in a bubble; they all have access to forums, documentation, or personal contacts where they talk about this stuff, and if they're good they keep up on everything that may be available to them.

I knew nothing about DX12 when I started working with it; there wasn't even documentation available to me until after I started work. But once I had it, I was able to see what it does differently from both OpenGL and DX11, and adapt my algorithms to implement what I needed in DX12. Since that time, MS has provided a lot of that information more publicly to developers, if you know where to look. It doesn't make their PR rounds as much though.

Also, consoles are not CPU bound. They haven't been for a while now. The PS3 was designed to be GPU bound, with offloading onto the CPU/SPEs when needed, and the 360 had a similar setup but did have some overhead with its memory controller in the cache, nothing major though. The PS4 is most definitely supposed to be GPU bound, as stated by Mark Cerny himself, and I'd imagine the X1 is supposed to be as well, as it's the best way to get the most performance out of these machines.

Otherwise, I didn't notice this was a GamingBolt article before clicking on it, but it is a pretty good breakdown and simple explanation of a basic difference between current APIs and the new ones coming in DX12 and Vulkan. His bunny example pretty much nails it, but he doesn't really explain what overdraw is to help understand why it's not optimal (a small sketch of what overdraw means follows below).

This quote is pretty much what a lot of people have been saying for a while.

"The improved APIs change how long it takes for you to tell the GPU what you want it to do, but they have no effect on how long it takes the GPU to actually execute those instructions."

In other words, you can't make hardware do more than it's designed to do.

Even Wardell has said something similar.
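
To make that overdraw point concrete, here is a minimal, self-contained sketch of counting overdraw on a toy software framebuffer. It is not code from the article or from any engine; the scene (three stacked quads) and all the numbers are assumptions for illustration only. Overdraw means the same pixel gets shaded more than once by overlapping geometry, so the GPU spends work on fragments that never make it to the screen.

```cpp
// Toy example: count overdraw on a tiny software framebuffer.
// "Overdraw" = the same pixel is shaded more than once because overlapping
// geometry is drawn without being culled or sorted, so the GPU burns shading
// work on fragments that never reach the screen.
#include <cstdio>
#include <vector>

int main() {
    const int width = 4, height = 4;
    std::vector<int> shadeCount(width * height, 0);

    // Hypothetical scene: three quads stacked on top of each other,
    // each covering the same 3x3 pixel region of the framebuffer.
    struct Quad { int x0, y0, x1, y1; };
    const Quad quads[] = { {0, 0, 3, 3}, {0, 0, 3, 3}, {0, 0, 3, 3} };

    for (const Quad& q : quads)
        for (int y = q.y0; y < q.y1; ++y)
            for (int x = q.x0; x < q.x1; ++x)
                ++shadeCount[y * width + x];   // every overlapping draw shades the pixel again

    int shaded = 0, visible = 0;
    for (int c : shadeCount) { shaded += c; visible += (c > 0); }

    // Prints: 27 shading ops for 9 visible pixels -- a 3.0x overdraw factor.
    std::printf("%d shading ops for %d visible pixels (overdraw %.1fx)\n",
                shaded, visible, double(shaded) / visible);
}
```

Real renderers fight this with depth testing, front-to-back sorting, or a depth pre-pass; the point of the bunny example is that this is GPU-side work, which no amount of API streamlining makes cheaper.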

JamesBroski3256d ago

With all due respect, I'm pretty sure that guy knows a little more about programming than you do.

ngecenk3256d ago (Edited 3256d ago )

Let me explain what he was trying to say. And I'm a computer scientist, so please consider that.

Imagine you're trying to send a set of orders to your employee. You have two options: you can do it via a middleman, asking someone to relay the orders to the employee, or you can send them directly to your employee. Naturally the second option is way faster because... well, it's just naturally faster.

Programming is a set of orders, the employee is your hardware, and the middlemen are any middleware, such as DirectX 12. So what DirectX 12 is trying to do is speed up the middleman, but it does nothing to speed up the employee. Does it make the game render better? Of course! But does it make the hardware better? No! (The sketch below puts some rough numbers on this.)

So to sum up, DirectX 12 WILL make Xbox One games better, but the PS4 is already better hardware, so it's impossible for the Xbone to top the PS4.

And considering game dev (especially for triple-A games) is already much closer to the hardware (which minimizes the role of middleware), the effect will be very minimal in console development.
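
A minimal sketch of the middleman point, with made-up per-call costs (these numbers are assumptions for illustration, not measurements): the API layer spends CPU time relaying each order, the hardware's time per order stays fixed, and a thinner layer only shrinks the relay cost.

```cpp
#include <cstdio>

int main() {
    const int ordersPerFrame = 5000;                 // e.g. draw calls in one frame
    const double hardwareMsPerOrder = 0.002;         // set by the GPU; no API changes this

    const double thickMiddlemanMsPerOrder = 0.004;   // hypothetical thick-API relay cost
    const double thinMiddlemanMsPerOrder  = 0.001;   // hypothetical thin-API relay cost

    const double gpuMs   = ordersPerFrame * hardwareMsPerOrder;          // 10 ms either way
    const double thickMs = ordersPerFrame * thickMiddlemanMsPerOrder;    // 20 ms of CPU relay
    const double thinMs  = ordersPerFrame * thinMiddlemanMsPerOrder;     //  5 ms of CPU relay

    std::printf("GPU time per frame:         %.1f ms (unchanged either way)\n", gpuMs);
    std::printf("CPU relay cost, thick API:  %.1f ms\n", thickMs);
    std::printf("CPU relay cost, thin API:   %.1f ms\n", thinMs);
}
```

The GPU line is identical in both cases, which is the whole argument: the middleman got faster, the employee did not.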

nicksetzer13256d ago

Hmm, I find that hard to believe. Especially considering "computer scientist" is the most outlandish job description I have ever heard. It would be like calling the people at NASA "spaceship guys."

That all said, you contradict yourself regardless. DirectX 12 is a kernel and it does transmit data between the hardware and an application, but bettering that process can actually benefit the quality of a game. Will it make the physical hardware more capable? No. It does, however, have the same result, as it makes the hardware work more efficiently.

A better example would be a helicopter's propellers. If you set the blades perfectly flat they have little effect, but if you skew them slightly they allow for much more lift. Same wings, same engine, same weight.

Proof of this would be to take two similar pieces of hardware, one running old software and one running new (let's say DX9 vs DX12). There would be an absolutely massive difference, despite the hardware being the same.

Point is, we know DX12 is a massive improvement, we just don't know how much it expands on the current toolset for XB1. From what we have heard, the toolset for XB1 is a bit under par. So this could be a two-fold upgrade: a stronger API with a more reliable set of tools. Not to mention, if XB1 uses DX12 properly it could allow devs to make engines that accommodate it more readily. It would be like last gen: the PS3 was better hardware, but the 360 was easier to create for. I think that is the goal MS is shooting for. That said, this gen the PS4 is streamlined enough AND powerful enough that I doubt any gains will overpower it.

rainslacker3256d ago

It would appear nick, that you want to discredit anyone who thinks differently than you.

While I can't claim ng's credentials, Computer Scientist is indeed an actual thing. It is more on the theoretical side of computing, as opposed to an engineer who works on the hardware or software. A CS would have to have enough knowledge of computer engineering, but the reverse isn't always true.

Here's a list of jobs looking for computer scientist positions.

http://www.indeed.com/q-Com...

Everything he said is true, though, whereas your comment is just you throwing out some computing terms again, trying to look smart. So let's look at what you say.

"Directx 12 is a kernal and it does transmit data between the hardware and an application,"

No, DirectX is an API. An API is an application programming interface. It is simply a set of protocols and routines which facilitate the building of software applications. There are thousands of APIs which are used every day, and their main purpose is to dictate how the software interacts with the OS (emphasis on OS).

A kernel is a program which manages I/O requests from software and translates those requests into instructions for the hardware.

A kernel sits between the hardware and the OS, whereas an API sits between the user (software) and the kernel. The idea of low-level access bypasses the kernel to a degree, but on a PC that's not ideal as it can cause serious security issues, which is why you need a low-level API; opening up a system fully to low-level access can cause all sorts of unwanted side effects. I don't even think consoles allow for complete bypassing of the OS kernel, unless it's through the hypervisor. Outside of closed systems, I can't think of any API that directly interacts with the hardware. Almost everything is done through a level of abstraction, and DX is no different. The whole idea behind DX was to remove the hardware from the equation to make games more compatible and easier to program for on PC (a rough sketch of this layering follows below).

" but bettering that process can actually benefit the quality of a game"

Yes. And DX12 does that. That's what the article says, that's what I've said, that's what ng said. It can benefit the quality of the game, and it will, but the effect is that the hardware can get more data faster, or more efficiently, or spend less time waiting for data to perform its task.

That isn't the same effect, because better hardware can do more, whereas the current hardware can only do what it's designed to do. The efficiency is that it's not waiting on data, or that the data can be output without waiting, or that the data can be supplied in a more suitable manner for processing, but the data itself is still processed based on the rendering algorithm used. I know it's a nuance, and you aren't exactly wrong, but you are making assumptions about the gains.

"From what we have heard the toolset for XB1 is a bit under par"

What we've also learned is that an implementation of DX12 is already installed in the X1 API. It's not the full DX12 because it's still not finalized, but the core is implemented from what I understand, and the current low-level API probably isn't going to change either way, as the current LL API was already built for the X1 and never existed in DX11. Same as last gen, the X1 has a custom version of DX.
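
To picture the API-versus-kernel layering described above, here is a rough sketch; every name in it is invented for illustration, and it is not how any real driver stack is spelled. The point is only that the application talks to an API, the API talks to a kernel-mode driver, and only the driver touches the hardware; a "thinner" API does less per-call work on the way down.

```cpp
#include <cstdio>

struct HardwareQueue {                          // stands in for the GPU's command queue
    void execute(int packets) { std::printf("hardware executes %d packets\n", packets); }
};

struct KernelDriver {                           // the only layer that touches the hardware
    HardwareQueue queue;
    void submit(int packets) { queue.execute(packets); }
};

struct GraphicsApi {                            // what game code actually calls
    KernelDriver& driver;
    bool validateEveryCall;                     // thick API: per-call checks; thin API: checks up front
    void draw(int packets) {
        if (validateEveryCall) { /* state validation, translation, bookkeeping... */ }
        driver.submit(packets);                 // the path below the API is identical
    }
};

int main() {
    KernelDriver driver;
    GraphicsApi thick{driver, true};
    GraphicsApi thin{driver, false};
    thick.draw(100);                            // same hardware work either way;
    thin.draw(100);                             // the difference is CPU-side overhead per call
}
```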

extermin8or3256d ago

@nick Wow... are you being serious? A computer scientist is technically anyone with a job position titled that (research-based, I'd assume) and/or anyone with, or studying for, a degree which here in the UK anyway is known as "Computer Science". I know several people currently studying it; I study physics myself and therefore know many people at uni studying other sciences because... well, you're just more likely to meet them. You seem to like discrediting others' comments without any real evidence, so I'm going to assume you have no background in any form of science, because if you did you'd know: evidence is key, and without some your credibility is zero. Both the guys above seem to have evidence for their comments and potentially experience in the field. Same goes for the guy being interviewed.

nicksetzer13256d ago

Computer scientist is not a position nor a degree. It is a description of such. Anyone who has a degree or position would know that. Generally, all it means is a major in mathematics or a computer-related field. Really simple to understand. Just like saying you work on Broadway: there are lots of jobs on Broadway, not just one.

http://en.m.wikipedia.org/w...

NextGen24Gamer3256d ago

The xbox one "beta tested in the future" was specifically made with DX12 in mind. The Xbox One is a windows 10 device. DX12 will not only make use of all the cores (Currently only one core being used with dx11) but it will also make much better use of its super fast ESRAM. Tiled resources, etc...The main reason for the difficulty for some 3rd party games not reaching full 1080p. Moving to DX12 is huge in this regard and it's something many conveniently leave out when discussing the true benefits of games developed using DX12 on the console.

Another huge benefit not talked about much, but expect MS to really do a deep dive into this at E3: the sheer ease for developers when using DX12 on PC. With essentially the push of a button, your game is scaled and optimized for ALL Windows 10 devices. Devs will save money, time, etc. And for consumers/gamers, when you buy a game on ONE Windows 10 device, you own it for ALL your Windows 10 devices. That, to me, is the game changer.

Lastly, if a game is built using DX12 and Windows 10, cross-play will be simple and easy. For years I've dreamed of a day when PC gamers could play with console gamers. Well, after July that dream will be a reality.

DX12 and Windows 10 are definitely a game changer for Xbox One. The Xbox One will finally have the software for developers that the box was built for from the beginning. It's amazing to me how, with the older DX11, the Xbox One has been able to keep up with the PS4 and in some cases do better. Fanboys say that Xbox One can't do 1080p. Strange, because I have over 10 Xbox One games that are native 1080p.

DX12 will open up all kinds of doors for the Xbox One. Soon enough, gamers will see.
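
On the "all the cores" claim, here is a conceptual sketch in plain C++ with standard threads; it is not actual D3D12 code, and the types are invented for illustration. Several threads each record their own command buffer in parallel, then one thread submits everything in order, which is the kind of parallel recording DX12/Vulkan-style APIs permit; the GPU still executes the same commands at the same speed.

```cpp
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

struct Command { std::string desc; };           // stand-in for a real GPU command
using CommandBuffer = std::vector<Command>;

// Each worker fills its own buffer, so no locking is needed while recording.
void recordCommands(CommandBuffer& buffer, int firstDraw, int count) {
    for (int i = 0; i < count; ++i)
        buffer.push_back({"draw object " + std::to_string(firstDraw + i)});
}

int main() {
    const int workers = 4, drawsPerWorker = 3;
    std::vector<CommandBuffer> buffers(workers);
    std::vector<std::thread> threads;

    for (int w = 0; w < workers; ++w)
        threads.emplace_back([&buffers, w, drawsPerWorker] {
            recordCommands(buffers[w], w * drawsPerWorker, drawsPerWorker);
        });
    for (auto& t : threads) t.join();

    // "Submission": a single thread hands the pre-built buffers to the queue in order.
    int submitted = 0;
    for (const auto& buf : buffers) submitted += static_cast<int>(buf.size());
    std::printf("submitted %d commands recorded on %d threads\n", submitted, workers);
}
```

The win is purely CPU-side: recording thousands of draws on four cores instead of one frees up CPU time for the rest of the frame.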

filchron3256d ago

He probably knows a thing or two about PC programming, numbnuts. What do you think the PS4 and Xbox One basically are?

LostDjinn3257d ago (Edited 3257d ago )

Oh gamingblart. Playing both sides off against each other again. Classy stuff.

Strange that anyone with even a modicum of API knowledge has been saying what's posted here (and more) for months, while the whole time you've been posting BS about massive gains for the Xbone, PC and even mobile.

Serious question: do you ever feel bad for the hit-whore tactics you employ? All they do is cause division and angst.

What was that old chestnut about the ends justifying the...something?

Edit: Lol at the damage control attempt by nic (first comment).
This isn't a Naughty Dog dev. It's the guy who runs Filmic Worlds (a graphics-solutions company that's mostly focused on PC). He worked for Naughty Dog at some point. That (the ND reference) was used to generate hits and cause an argument. Seems you fell for it... 'coz... you know, creating fighting fanboys is how this site does business.

Edit 2: nice edit nic. Keep up the good work. *smfh*

N0TaB0T3256d ago

Just think, a pack of script kiddies are going to tear this news outlet apart along with GamingBolt.

IamTylerDurden13257d ago

Honestly, GamingBolt is a farce of a site. People put way too much stock in the differences between DirectX and OpenGL. When a new version comes out, of course it's better, but in the end it all pretty much evens out.

jhoward5853257d ago

I think the ex-Naughty Dog dev is mixing up older technological advances with today's advances in gaming hardware and software.

Thing is, MS took a completely different approach to designing the X1's hardware. They went beyond the standard engineering process to make some of the X1's parts custom built. By doing this, they gain a bit more hardware control over the X1's CPU/GPU and how it sends and processes data back and forth.

The X1 was designed with DX12 in mind. That makes a world of difference. The way I look at it, we don't know enough about the X1 hardware to talk about its capabilities.

DragonbornZ3257d ago (Edited 3257d ago )

That's what has me interested. We don't know enough about the hardware, and whatever details we don't have, I feel like we will get around the time Windows 10 and DirectX 12 come out. I'm not saying there will be some magical boost, but like you said, the X1 was built with DirectX 12 in mind. They planned it out from the beginning, and everything they've talked about seems to go together like pieces of a puzzle.
It's all very interesting, imo.

sinspirit3256d ago

It's hardly custom built just because it has ESRAM.

They didn't take a completely different approach whatsoever. It's literally just like the 360, just stronger, obviously with added functionality.

I don't see where they went beyond any standard engineering process. It's a console. Of course it has more hardware-level access, but it's no different than any other console. It sends data back and forth no differently, aside from the ESRAM, which requires work to get anything out of.

Yes, it was designed with DX12 in mind... no, wait. DX12 was designed with consoles in mind, not the other way around. DX12 gives PCs the lower-level access that consoles already have. That is all.

Nothing is revolutionary about its design.

jhoward5853256d ago (Edited 3256d ago )

"Nothing is revolutionary about its design." Oh really?

The X1 GPU is custom built. It's the only GPU in the world that has dual lanes; this tech feature is the first of its kind. There's no other GPU like it. Besides, MS wouldn't spend 2 billion bucks on the X1's custom-built GPU for no reason. Keep in mind, you can't test something that isn't readily available on the market. That said, I can clearly see why MS had to wait until the X1's custom-built GPU was born before writing DX12. I think if MS had used a regular GPU for the X1's design, they could have started writing DX12 well before the X1's hardware was completed.

http://www.reddit.com/r/xbo...

Another thing: logically speaking, what I find obvious about the X1's custom-built GPU is that if it has dual lanes, then some of the other X1 hardware components also had to change to sustain data communication between the two lanes the X1 GPU has.

I'm guessing that's why MS had to implement the move engines, to take full advantage of those two data lanes. I could be wrong on that note.

Anyways, my point is there is some hidden mystery about the X1 hardware. So making claims about what it can do is plain crazy thinking, especially if you don't have (or know) the X1's full specification.

LostDjinn3256d ago (Edited 3256d ago )

Jhoward...what you've just written and linked to is complete crap.
The "dual lane" you're waffling on about is for gpgpu compute and refers to ACEs. They're used for parallel computing. They each run 8 queues (lanes). It has 2 ACEs for a total of 16 queues (lanes). Pretty impressive huh?

For reference, the PS4 (and the R9 280 and up) runs 8 ACEs for a total of 64 queues (lanes).

One of the reasons MS would have chosen the older/fewer-ACEs configuration is to accommodate the ESRAM. Said RAM takes up more than half the transistor count on the die. There is no secret sauce in the hardware.

Now that Kinect has been 86'd, they may attempt to repurpose the move engines. Given their ultra-limited bandwidth and lack of direct access to the ESRAM, their usefulness will be limited.

All (yes, all) the Xbone's specs can be found online. Beyond3D and other sites have all of that for you if you like.

Now did you really want to talk about hardware or were you just grasping at straws? Tell ya what. I'll wait for your reply and we can continue this if you like.

Here's a tip though. Don't link to Mr X style reddits if you want to be taken seriously. ;)

Edit: LOL, the link you provided proves my point if you just look at the info given. OMG! You couldn't make this stuff up (but someone did).

sinspirit3256d ago

@jhoward585

For one, you are talking about complete speculation from a rough translation that still hasn't been clarified.

Logically speaking... you're still talking about "new" and "one of a kind" tech that isn't actually confirmed yet anyway. So I don't see why you are trying to explain how it works.

Again, still unconfirmed. Don't harp on about secret X1 sauce like everyone else does. You always have to grasp onto something...

And, just as LostDjinn said..

So. Yeah.

Love your console. Don't lie about it or find excuses, though. The reasons to love it are the games and features you like, not what you hope it to be.

jhoward5853256d ago (Edited 3256d ago )

@sinspirit
YOU: For one, you are talking about complete speculation from a rough translation that still hasn't been clarified.

Speculation? LOL

ME: Fact is, the X1 does have a dual-lane custom-built GPU. No one knows how it works but MS.

-----

YOU: Logically speaking... you're still talking about "new" and "one of a kind" tech that isn't actually confirmed yet anyway. So I don't see why you are trying to explain how it works.

ME: I brought it up because in my mind I know any piece of new technology will eventually improve as it passes through the trial-and-error phase. The X1 GPU is the first of its kind, so there are going to be a lot of test runs on the software side of things to refine its performance.

YOU: Again, still unconfirmed. Don't harp on about secret X1 sauce like everyone else does. You always have to grasp onto something...

ME: I was thinking the same. Until I get more info on the X1 hardware specification, I won't take in another false rumor as fact.

YOU: Love your console. Don't lie about it or find excuses, though. The reasons to love it are the games and features you like, not what you hope it to be.

ME: For the record, I own a PS4, not an X1. Maybe in time I will.
Another thing: I do my best to make sense of everything I read or hear, especially when I gather information on the internet.
A lie is always a lie. But one thing I do know is that most major companies won't take the blame for another company's false claims (misinformation).

With that being said, MS has made some claims in the past that involve AMD as far as the engineering and hardware design of the X1 goes.
And yes, I'm talking about the secret sauce stuff that's been spreading all over the internet.

Fact is, AMD has their reputation to protect, and MS has theirs. One thing is for sure: AMD would've ended some of those false claims to protect their image.

So that, my friend, is all I need to draw my conclusion on what is fact and what is not.

jhoward5853256d ago (Edited 3256d ago )

@LostDjinn

OK, the link I provided in my previous post may not be the greatest information as far as what the X1 hardware can actually do.

Honestly though, I don't think anyone can say what the X1's full specifications really are, because there are just too many contradictions regarding the X1 hardware specification on the internet. One site says one thing while another site says another.

Truth is, I was more interested in the past business decisions MS made to fund and create the X1's hardware than in the physical hardware (and specification) itself.

Fact is, MS spent well over 2 billion dollars on the X1 GPU technology. That alone says a lot to me. To me, it means MS took a chance which could've either worked to their advantage or not.

Truth is, we really don't know if it was a good or bad business call. We don't yet have all the details; until then I will remain completely optimistic.

LostDjinn3256d ago

"MS spent well over 2 billion dollars on the x1 GPU technology" - link?

GenuineGamer3256d ago (Edited 3256d ago )

I think what you meant before about dual lanes is that the X1 GPU is split into two sets of CUs with two individual graphics command processors. That's what Brad Wardell meant when he referred to the XB1 having dual lanes and how the PS4 and PC don't.

jhoward5853256d ago (Edited 3256d ago )

@LostDjinn

If my memory serves me correctly, I think Brad Wardell was the one who stated that the X1's custom-built GPU cost close to 2 billion dollars.

The way he explained it, the deal between MS and AMD to engineer and build the X1 hardware cost MS over 3 billion. Two billion dollars went into the development of the custom GPU for the X1, while the remaining 1 billion dollars went into the rest of the X1's hardware components and design.

http://www.gamespot.com/art...

http://www.vg247.com/2013/0...

LostDjinn3256d ago (Edited 3256d ago )

Genuine - if it's command processors, I think you'll find they pertain to OS task distribution, with the discrete (system) OS and gaming OS requiring high- and low-priority access. It's the only way they'd be efficient. The hypervisor would simply access the discrete-level OS priority solution.

What's the point?
Well, think of something like the snap function. The game would be given a high priority while, say, the browser would be given a low priority (as missing your render budget on a web page would be preferable to the game doing it).

Edit: Jhow, neither of your links says anything of the sort. A deal between MS and AMD is all that's mentioned, nothing about GPUs. Please make sure you provide proof of your claims in future; otherwise you'll paint yourself in a bad light.

LostDjinn3256d ago

You just completely edited your comment to cover the fact you can't provide a link. Jhoward, I now have a very different view of you. It's not about the truth. You were simply clutching at straws this whole time.

Thanks for that.

rainslacker3256d ago (Edited 3256d ago )

From that thread, as much as I cared to read since it kind of devolved after a bit, all I can gather is that no one seems to know what the dual data lanes are for.

For those that don't know, a data lane is just a controller for data, and in this case the X1 has two supplying the CUs in its GPU. I'm not sure the move engines are really important for two data paths, as they may just add more overhead since the data controller has direct access to memory and the CPU. In the case of the X1, the different controllers appear to control a split set of CUs.

Anyhow, I'm not going to speculate for now, because I'd have to do some more research. I understand what's being talked about, but not enough in relation to the X1 to be able to form an assumption. I do think the article which reported the leak was a bit presumptuous.

To me, the best thing to take from that thread was a comment from iroboto,

"It's very important to not get stuck into conspiracy style thinking. It's like if MS denies the functionality of the second graphics command processor you take that as the opposite. If they agree with you, you take as truth, and if they say nothing about it that means you also take that as admission that you are correct. In all scenarios you are only agreeing with what aligns to wishful thinking and that severely hampers your abilities to make good sound decisions."

Not saying you're wrong or anything, just that it may be wise to temper your expectations and wait until more information is out before postulating a conclusion on what MS did and didn't do. I'll also readily admit that I should probably follow my own advice sometimes. There are a couple people on that thread which seem really knowledgeable about hardware, and while they're offering possibilities, none of them are saying anything definite.

Otherwise, I wouldn't really call it a revolutionary design. It's a different design, done to achieve some task which is currently unknown. One person speculated that the extra channel is for the media and overlay features of the X1, so the extra data path may simply be a workaround so as not to take away from the actual abilities of the GPU.

I can think of several reasons why a second data controller would be beneficial in gaming, but it's not a feature of DX12 that I've seen. It may be specific to X1 though, in which case I wouldn't have bothered looking too much into it yet.

However, even with two data controllers, unless there is some specific gaming purpose for them, I can't really think of any reason why you would need a second one on the GPU given the rather low number of compute units. The data controller that came standard with the GPU should be able to handle it perfectly fine, since graphics are serial in nature. But when it comes to GPU compute, it can actually help tremendously.

Edit @LostDjinn

Appears you went into more detail about it possibly being a multitasking thing. While that's perfectly reasonable, I still wonder if it's actually necessary to have two controllers. Overlay has such low overhead that it seems rather unnecessary to split the CUs and memory controllers. I can't imagine that system features would run off GPU compute.

LostDjinn3256d ago (Edited 3256d ago )

Rain, that's not where I was going with it. Overlay indeed takes next to no overhead. The efficiency increase I was referring to pertains to running two controllers with a conventional overlay as opposed to running them with hardware-based prioritization. It has nothing to do with compute-based packet distribution to the GPU; it's simply the simultaneous rendering of assets from two separate OSes at a hardware level.

Nice to chat with someone who actually just cares about the facts though. If I run outta bubbles just pm me. :)

rainslacker3256d ago

Ah. Yeah, I think I missed a bit of your comment there. Indeed it does make sense to do that, as it would require hardware to be dedicated to the actual secondary rendering to prevent slowdowns in the game render. Since console developers have the ability to control memory controllers, it would be reasonable to isolate a secondary controller that is managed by the OS to perform its functions. This leaves everything still available to the developers and prevents unintended conflicts.

I think one example of why this might be beneficial is the PS3. While a game can render in the background when you hit the home button, it can also have a stuttery frame rate should it keep running. It's mostly obscured, so it can be hard to notice, but it is there in some games. I'd imagine in a multitasking view the stuttering would be extremely noticeable.

It's an interesting approach to handle what could be done with a rather inexpensive secondary graphics chip running synchronously with the main GPU.

XanderZane3256d ago

The thing these developers are missing is that the GPU bottleneck he's talking about won't be in the XB1. Because the XB1 was designed specifically for the DX12/Win10 API/OS, there are things in the XB1 that will eliminate most of the bottlenecks that would normally hit a next-gen console. I expect a lot of PC games will get ported to the XB1 with few problems. I'm interested to see how The Witcher 3 gets updated for Win10/DX12. Most of these developers who don't work for M$ haven't a clue how the XB1 will handle DX12, because they don't know everything about the hardware.

Ark_3256d ago

What these APIs do is lessen the strain on the CPU. So if you have a game that is bottlenecked by the CPU, which basically means the GPU cannot unfold its full potential, you get some gains with a better API. But the GPU stays limited to its specs. There is nothing you can do about it.

Example: BF4 with Mantle

Slow CPU + Fast GPU ... gave you a crappy experience in DX11, since the GPU was bottlenecked by the limited CPU. Mantle helped there and freed up some capacity on the CPU, so the GPU could better unfold.

Fast CPU + slow GPU ... no gains with Mantle.

Fast CPU + fast GPU ... if there is no CPU bottleneck, you get no gains with Mantle.

Slow CPU + slow GPU ... minimal gains or no gains, if the GPU was already at its limit with DX11.

DX12 will basically work the same way. On PC and X1, the GPU will limit its effect. It really only shines on older CPUs combined with mid- to top-range graphics cards (the sketch below puts rough numbers on these cases).
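
A minimal model of the four cases above, with made-up millisecond numbers: a frame takes roughly max(CPU time, GPU time), and a better API only shrinks the CPU side, so the gain shows up exactly when the CPU was the bottleneck.

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    struct Case { const char* name; double cpuMs, gpuMs; };
    const Case cases[] = {
        {"slow CPU + fast GPU", 33.0, 12.0},   // CPU-bound: big win from a thinner API
        {"fast CPU + slow GPU", 10.0, 33.0},   // GPU-bound: no win
        {"fast CPU + fast GPU", 10.0, 12.0},   // already GPU-bound: little to gain
        {"slow CPU + slow GPU", 33.0, 30.0},   // small win, then the GPU limits again
    };
    const double apiSpeedup = 2.0;             // hypothetical: CPU submission cost halved

    for (const Case& c : cases) {
        double before = std::max(c.cpuMs, c.gpuMs);           // frame time under the old API
        double after  = std::max(c.cpuMs / apiSpeedup, c.gpuMs); // only the CPU side shrinks
        std::printf("%-22s %.1f ms -> %.1f ms per frame\n", c.name, before, after);
    }
}
```

The apiSpeedup figure is an assumption for illustration; the point is only which side of the max() it touches.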

Lennoxb633257d ago

Can current and ex Naughty Dog employees stop commenting on DX12? It's none of their concern. I don't see MS constantly talking about PS4's API.

Sonyslave33257d ago

I know, right? PS4 guys talk more about the Xbox One than the system they make games for.

Pandamobile3256d ago

Graphics programmers aren't bound to platforms like you somehow think they're supposed to be.

Lennoxb633256d ago (Edited 3256d ago )

Being a graphics programmer doesn't make you an expert on every graphics API. A dev that only makes iOS games doesn't necessarily know everything about developing for Android.

Naughty Dog devs have never worked with DX12, any Xbox game, or any PC game, so they would know little to nothing about it. Asking current and ex Naughty Dog employees is useless.

@Pandamobile
While that is true, at the same time it makes no sense to ask a developer from a totally different studio, with totally different programming methods. They still don't know much about it. Even AMD, the company that worked with MS on it, doesn't know much about its capabilities. And you expect someone who hasn't touched DX12 at all to know something? You can't really compare previous versions to DX12 either, as it unlocks low-level access on PCs that has been dormant for years. So it's a completely different case this time.

Pandamobile3256d ago (Edited 3256d ago )

Every single game developer on the planet has made games for PC. Do you really think graphics programmers jump straight out of university or previous jobs onto a PS4 dev kit without ever learning DirectX and OpenGL?

Seriously?

APIs are transparent to graphics programmers. Just because they've never used DX12 doesn't mean their opinion on it is completely invalid. They know all the shortcomings and pitfalls of graphics architectures, regardless of whether they're employed by Sony or not.

JasonKCK3256d ago

Thank you for logic Pandamobile.

TheGreyWarrior3256d ago (Edited 3256d ago )

@Lennoxb63

"Being a graphic programmer doesn't make you an expert of every graphics API. A dev that only makes iOS games doesn't necessarily know everything about developing for Android OS."

True, but in the case of Naughty Dog, believe it or not, they have Xbox One and X360 development kits at their office. If you don't believe me, look online.

rainslacker3256d ago (Edited 3256d ago )

@Lennox

I'll agree with you if you can agree that Phil Spencer isn't qualified to talk technical specs on DX12. He's not even a game programmer, he's more of a hardware-side technical engineer, yet if he said something about DX12 you would take it as gospel.

Otherwise, graphics programmers who understand how an API works, regardless of which one they primarily use, are far beyond the level of the less astute programmers who simply use pre-made functions to draw a screen. If you think programmers at Naughty Dog don't attend classes and go to things like Build, then you are sadly misinformed. Naughty Dog houses the ICE team, which works on the PlayStation APIs and SDKs. I don't know if this guy was part of the ICE team, but do you truly believe these guys are clueless about different APIs that are doing exactly what console APIs have been doing for almost 30 years now? Any console graphics programmer probably knows this stuff better than any PC graphics programmer, because it's what they already do.

By your own reasoning, there aren't many DX12 developers out there at all, so no one we've heard from is actually qualified to talk about it in regard to consoles, except maybe a few privileged devs who got early access to it for the DX12 X1 games. There aren't many of those out there right now, and most of it is isolated to engine makers for the time being. So who exactly should we listen to? MS? That'd be fine, but they aren't exactly unbiased. So that leaves no one to listen to on how DX12 relates to the X1.

JWiLL5523256d ago

I'm sure you didn't even bother to read the article.

He's a programmer who worked at an acclaimed studio, he knows his stuff. Gamingbolt asked him questions, he answered.

This article was a step up from their usual 'posting a series of tweets' style of journalism.

240°

PS3 Was More Or Less Maxed Out With Uncharted 2; Naughty Dog Has a Very Talented Graphics Team

Filmic Worlds boss John Hable also talks about the selection process at Naughty Dog.

Read Full Story >>
gamingbolt.com
Blackleg-sanji3268d ago

Naughty Gods, man, I tell you. Some of the best in the industry.

breakpad3268d ago (Edited 3268d ago )

They weren't just talented... they also weren't too bored (like many lazy others) to work on the PS3, which was a beast. Kojima, FromSoft and Naughty Dog made their own engines specifically for the PS3 (Sony helped them a bit too), which may have been difficult at the start, but they benefited a lot from it later, as now they have their own engines perfected and are delivering some very well-made games (Kojima started with the MGS4 engine for PS3, which has evolved into the Fox Engine; Sony gave FromSoft the PhyreEngine, which evolved into the Bloodborne engine, etc.).

MegaRay3268d ago

Bullshit. The PS3 wasn't maxed out by U2. TLoU, maybe, but not U2.

PrinceOfAnger3268d ago (Edited 3268d ago )

How is the PS3 maxed out with Uncharted 2 or TLoU when there is this?

Crysis 3 on console, real-time:

http://www.gamersyde.com/po...
http://www.gamersyde.com/po...
http://www.gamersyde.com/po...

Uncharted 2 pre-rendered cutscenes:

https://itani15.files.wordp...

http://media1.gameinformer....

yeah I am blind lol...

Gwiz3268d ago

Third parties never maxed out the PS3; they had to change a lot about Crysis 3 in terms of attributes to even make it portable. Crytek uses high-end solutions and doesn't necessarily optimize them for much of anything, whereas Uncharted 2 was developed with the CBE in mind and took full advantage of its capabilities, well, in comparison. I would love to see a next-gen CBE with more bandwidth and a better graphics solution; it's a very underrated chip.

mochachino3268d ago (Edited 3268d ago )

Crysis 3 looked so bad when running. The framerate was in the 15-20 range the vast majority of the time and the jaggies were oppressive.

For all of its effects, Crysis 3 looked worse than Crysis 1 and Crysis 2 on PS3 - Crytek tried to do too much, and the horrible performance ruined the experience and the graphics.

Utalkin2me3268d ago (Edited 3268d ago )

You can max out a system, but you can also optimize your engine on a maxed-out system. You can only get so much power out of a system; your next step is to optimize your engine to get more performance. I think you're confusing the two.

CertifiedGamer3268d ago

It was obvious that it was maxed out with Uncharted 2, just as it will be maxed out with Uncharted 4 and Star Wars Battlefront.

Spotie3268d ago

Dunno how the PS3 will be maxed out with games it won't even have...

CertifiedGamer3264d ago

I messed up; I meant the PS4 will be maxed out with Uncharted 4 and Uncharted 3, as both games are coming out around the system's third birthday.

NeverHeavyMan3268d ago (Edited 3268d ago )

A system can be "maxed out" and still show improvement, as he is saying (I'd advise reading beyond the title). Despite what Uncharted 2 did/looked like, God of War III, Heavy Rain, Uncharted 3, and Beyond: Two Souls looked better. Why? Maxing out a machine doesn't mean you can't make improvements in other areas.

The latter portion of the console's life had developers (as Hable put it) "squeezing out" what they could, though he admits there wasn't really much left. I don't expect Uncharted 4 to be Naughty Dog pushing the PS4 to the point Uncharted 2 reached on the PS3, but their second game? No doubt.

firelogic3268d ago (Edited 3268d ago )

Not really. "maxed out" means the limit has been reached. If you've reached the outer limits of what it can do, you can't go beyond that. If they were able to tweak something here and optimize something there to get another 1fps out of it or improve textures or improve loading times, it means they didn't hit the max with U2.

What he wanted to say was worded incorrectly.

NeverHeavyMan3268d ago (Edited 3268d ago )

If the PS3's limit had been 100% reached, you wouldn't have seen better games than Uncharted 2. That is not what maxed out means. Its RESOURCES can't be pushed any further, but the engine code can be modified in a way that would allow for optimization (as plenty on the PS3 proved). The Naughty Dog 2.0 engine had reached its limit on the PS3, but there was still enough juice to make games look and run better after Uncharted 2 released.

The engine, to that point, was maxed. The PS3's capabilities were not. Something I see people get confused on all of the time.

kneon3268d ago

You can max out any machine due to crappy code. It's easy to do, any idiot can do it.

A computer is only truly maxed out once the software has been fully optimized, and that takes a long time as someone will keep finding clever solutions to certain problems.

firelogic3267d ago

You're not getting it. If you can max out something that's it. You keep bringing up optimization. If it can be optimized, it didn't hit max. Max is the end of the road. Like I said above, if they can optimize to get better performance, it means it was never maxed to begin with. Max is the end. Being able to optimize means it wasn't at max.
