The idea is tempting, but simply offloading assets to the cloud won't work, says Filmic Worlds boss and ex-Naughty Dog developer John Hable.
I can't comment on any specific applications. In other words, I can't really comment on what's going on at MS because I have no idea what they are doing. -___-
Well, the last demo they showed was cloud-based physics calculations. They never said anything about graphics processing in the cloud, but they have demonstrated other uses like cloud-based software updating. The issue, though, is that Microsoft hasn't actually shown anything new or compelling yet. This is part of the problem. They made a lot of comments at the beginning of the gen about the potential, but it hasn't come to fruition yet.
http://www.pcgamer.com/nvid... He must have forgotten Nvidia was also messing with cloud computing for lighting effects in games. So I call B.S. on his statement.
@vega275 That is not graphics though, it's just doing the calculations then sending the data back to be rendered. “for computing indirect lighting in the Cloud to support real-time rendering for interactive 3D applications on a user's local device.” Says so right there.
And he would know about everything that's going on behind the scenes at MS. I mean he does work for Naughty... Oh wait. He wouldn't know a d@mn thing, would he? lol
This is not a specific comment on MS... It's to do with offloading graphical computations to remote servers. Did you even read and think about the article, or did you just see the thumbnail and go into "console war mode"? Pretty low of the submitter to frame it that way, but I would expect almost everyone to see past this media manipulation by now. RTFA. Of course there are limitations in trying to perform low-latency graphical computations over a network - this is all that has been stated, and it's common sense. It will be cool to see what does emerge from cloud computing in terms of gaming. Maybe AI can finally make big leaps, especially as the generation unfolds and there are more servers and more power. Tbh it is hard to think of anything in gaming that isn't required almost immediately, though - I don't know what the average would be, but something like a >100ms round trip time seems a reasonable assumption... and that's an awfully long time in frame terms. But perhaps there are tricks and techniques to overcome this. Maybe some things can be calculated in advance for the next few frames.
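To make the "calculate a few frames ahead" idea a bit more concrete, here's a minimal sketch (purely illustrative, assuming a hypothetical ~100ms round trip and a 60fps loop; none of these names come from any real engine or from MS): the game requests the cloud result several frames before it is needed and falls back to a cheap local approximation if the answer hasn't arrived in time.

```python
import math
import time
from collections import deque

FRAME_TIME = 1 / 60      # ~16.7 ms per frame at 60 fps
ASSUMED_RTT = 0.100      # hypothetical 100 ms round trip to the server
LEAD_FRAMES = math.ceil(ASSUMED_RTT / FRAME_TIME) + 1   # ~7 frames of headroom

def request_cloud_result(frame):
    """Stand-in for an async cloud request: returns (time it will be ready, value)."""
    return time.monotonic() + ASSUMED_RTT, f"cloud result for frame {frame}"

def local_fallback(frame):
    """Cheaper local approximation used whenever the cloud result is late."""
    return f"local approximation for frame {frame}"

pending = deque()        # entries of (target_frame, ready_time, value)
for frame in range(30):
    # Kick off the request for the frame we'll need LEAD_FRAMES from now.
    ready_at, value = request_cloud_result(frame + LEAD_FRAMES)
    pending.append((frame + LEAD_FRAMES, ready_at, value))

    # Use the cloud answer for *this* frame only if it actually arrived in time.
    result = local_fallback(frame)
    if pending and pending[0][0] == frame:
        _, ready_at, value = pending.popleft()
        if ready_at <= time.monotonic():
            result = value

    # ... simulate/render the frame using `result` here ...
    time.sleep(FRAME_TIME)
```

The point is just that anything you ask the cloud for has to be something the game can tolerate receiving several frames late, or can live without.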
It doesn't make sense to even talk about offloading graphical assets to the cloud since that was never the goal of cloud compute in the first place. It was meant to take care of some of the CPU tasks (most very small) in order to give the CPU more room to take on other things. The cloud is not rendering graphics; the console is still doing that. All it's doing is taking bits and pieces of work the console needs done for things like AI or physics, computing them remotely, and sending the results back to the console. Not rendering it in the cloud.
The paper that Nvidia provided with their demonstration has a lot of great information on what is going on, as well as the requirements to make it work. http://graphics.cs.williams... It's a pretty technical read, but good nonetheless. Nvidia claims that standard ping times are sufficient in most cases; however, the actual real-world results would be highly dependent on a lot of things, and certain applications work better with lower ping. It only took into account three different types of lighting systems, but they are fairly common approaches in today's games, though they often don't happen on the same scale because it'd be resource prohibitive. This is of course only one implementation of this kind of tech. There are others out there, and I'm sure there are some that have yet to be revealed.

@Lennox That's actually true. However, Wardell, in his infinite wisdom, decided to discuss offloading the lighting, which many in the media took as MS actually saying it. MS has mentioned this procedure, but never specifically in regards to the X1. Much of the expectation some people have for the tech comes from misrepresentation by a third party who was only talking theoretical possibilities, not actual intended application.

Cloud compute to MS was 3X the resources; however, resources do not equate to 3X the power. For example, 3X the resources in a physics engine means that you can calculate 3X the number of physics calculations per interval. 3X the power means that you could likely process 300X the number of physics calculations per interval, or process much more complex physics calculations than what would be necessary for a game. That being said, whether or not games need to calculate 3X the amount of physics per game is questionable. I imagine there are times when it could be nice, but I'm not sure of the overall practicality, as there does need to be an object for every physics calculation. AI makes a lot more sense, as it can be quite complex, but the results, and the variables to derive those results, are typically very small and allow for quick and easy transport through standard latency scenarios. Any lag introduced would likely be imperceptible to the end user, unless there was a huge spike or disconnect, and there would likely be backup routines should the data not arrive in time.
It does make sense to talk about offloading graphical assets to the cloud if you're asked the question in an interview though, lol. And it's also an interesting topic outside whatever narrow console war you may or may not be involved in. The goal of cloud computing (in particular for console gaming) is to provide a better experience to the user by working around the limitations of the hardware in whatever way is feasible. No doubt work is being done on distributed graphics right now, and if it is a feasible option you can bet it will be tried at some point. Asking a developer (ex or otherwise) for their thoughts on this subject seems okay to me.
Lennoxb63 says, "It doesn't make sense to even talk about offloading graphical assets to the cloud since that was never the goal of cloud compute in the first place." Ars Technica interviews Matt Booty. http://arstechnica.com/gami... I'm sorry, what was that?
I like how old developers from Sony are commenting on the new things MS is trying to do. lol
I don't see a direct reference to MS... You just made the assumption that the cloud tech is exclusive to them. That being said, I think it's actually Sony who are pushing it forward at this point with game streaming. I've seen very little other than a couple of demos touting one thing or another from the super BS PR team over at MS.
Game streaming is as different from cloud processing as playing games online is. MS has built the XB1 architecture around this function. Each of the 4 multipurpose DMEs can move data at 26GB/sec to/from any 2 locations at no cost to CPU, GPU, or memory bandwidth. My understanding is that server systems use these types of accelerators to communicate between nodes within the server bank. MS will show Crackdown at E3, so they will talk more about the cloud's usefulness in gaming then.
Well the thumbnail is XB1 vs PS4...
nothing new about remote servers, just a new buzzword to market it.
Read the article, and to sum it all up, it all depends on internet speed and reliability. Cloud graphics and physics improvements are still too early because of the slow and unreliable internet speeds that 80% of gamers have. MS is onto something, but it's way ahead of its time.
This, exactly. Microsoft is right, but it's not practical yet; when it becomes practical it will be amazing.
80% of gamers? Maybe in the States, but Europe is ready for this.
The States are the biggest gaming market. I'd say 80% is conservative.
Eighty is way overblown. The Washington D.C./Baltimore region and northern Virginia have over 10-13 million people, and the average internet speed here is 30 Mbps easy. I would bet most of the major cities are similar, and that a third of this country has around that speed or higher. The problem is low-population states like North and South Dakota and Montana bring the average down, and we have too many states like that. But more than half of us live in the major cities.

On topic, I think this guy is too busy with his new business to be up on what Microsoft is doing. Some of his comments make that clear. He mentioned server cost; it's like he doesn't know what Azure is. The answer to who will pay for all those servers is the companies that use Azure. Microsoft offers them to developers for free, and since he has a business, he knows businesses pay a premium for every service they use. Bandwidth? Microsoft and Duke University have already cut bandwidth needs by 83%. Plus, if he looks up some of the information on Project Orleans, a big part of that is instantaneous hydration and dehydration of information to reduce bandwidth needs. Wi-Fi? Not even sure that is a real problem; if your Wi-Fi sucks, buy a long Cat 5. Server goes down? Again, Azure has protocols in place to switch anything running instantly over to another server. He has to remember Azure is being sold as a business tool with quality-of-service guarantees. They want to use this to create mobile disaster infrastructure that can quickly switch over to a new host if needed. Look up agents in Project Orleans. Latency is an issue, but some of the other things will help address that, and the predictive technology Microsoft has been working on may play a part there, as well as with lost packets.

The reality is it may be really, really hard, but there are people thinking outside the box working on it. One of the things they were working on was the console handling everything in a certain area around the player and the cloud handling the things further away. Plus, I think people should remember some of the rumors we've heard over the years: it could be a matter of the cloud just adding more details to things the console draws. I think people need to wait a couple more weeks.
I think the round-trip time is the big problem though, not bandwidth or reliability. And unless servers are going to be in your country and close to your location, you might not be able to benefit from cloud computing. It's a really interesting problem to overcome. So many variables that will affect everyone's experience differently: 50ms RTT vs 250ms RTT, internet cutting out, internet traffic at peak times. I wouldn't like to be the guys programming that... actually, that's a lie, it would be awesome but very, very difficult!
It's always funny seeing people talk about RTT but never bring up how MMOs receive data, update state and game logic, apply the data, and then return the packet. If a cloud compute approach is held back by RTT, why would any MMO be playable when it requires a similar approach for data?
That's a good point... any online game in general must do the same. I'm not an expert on this at all, but I think there are a lot of tricks to achieve it, and that's why the experience is not always optimal and varies per game. For example: client-side prediction and server reconciliation, discussed here: http://www.gabrielgambetta.... This kind of workaround to lag may not be possible here because you would need the result (at least for graphical computations) immediately. But maybe things like AI could lag a few frames behind and still be better than local AI. It could be a great thing for some people, just like game streaming could be, but in my experience with my internet connection the lag sometimes creates noticeable problems.
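For anyone wondering what client-side prediction and server reconciliation actually look like in code, here's a bare-bones sketch in the spirit of the technique described at that link (not taken from it, and all the names are made up): the client applies its own inputs immediately, remembers them, and when an authoritative update arrives it snaps to the server's state and replays any inputs the server hasn't acknowledged yet.

```python
from dataclasses import dataclass

@dataclass
class Input:
    seq: int      # sequence number so the server can acknowledge it
    dx: float     # movement requested this tick

class PredictingClient:
    def __init__(self):
        self.x = 0.0              # predicted position shown to the player
        self.next_seq = 0
        self.unacked = []         # inputs the server hasn't confirmed yet

    def apply_local_input(self, dx):
        """Apply the input immediately (prediction) and remember it."""
        inp = Input(self.next_seq, dx)
        self.next_seq += 1
        self.x += inp.dx
        self.unacked.append(inp)
        return inp                # in a real game this would also be sent to the server

    def on_server_state(self, server_x, last_acked_seq):
        """Reconciliation: adopt the authoritative state, replay unacked inputs."""
        self.x = server_x
        self.unacked = [i for i in self.unacked if i.seq > last_acked_seq]
        for inp in self.unacked:
            self.x += inp.dx      # re-apply inputs the server hasn't seen yet

# Tiny usage example: the player moves twice, then a laggy server update
# arrives that has only processed the first input.
client = PredictingClient()
client.apply_local_input(1.0)
client.apply_local_input(1.0)
client.on_server_state(server_x=1.0, last_acked_seq=0)
print(client.x)  # 2.0 -- server position plus the still-unacked second input
```

That's roughly why online games feel responsive despite 50-100ms round trips: the player sees their own actions instantly, and the lag only shows up when the server disagrees with the prediction.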
In terms of physics, beyond a feasibility standpoint, the developers still have to want to implement these things on such a grand scale. Implementing all these little extras takes time and resources that honestly could often be better spent elsewhere. What's the point in having a billion pieces of a destructible window when one million will be sufficient? Over a long period of time, things will obviously become available that make this kind of stuff more feasible on a development level, but there comes a point of diminishing returns. No matter how well a computer system may be able to do something, that something still has to be implemented at some point, and that takes time and money. One of the basic tenets of AAA game design is to make the complex out of the simple, because the simple is cheaper and more flexible across multiple implementations.

In graphics there is a term called "Level of Detail", or LOD. The premise is that objects very close to the user's view have a higher level of detail applied to them, whereas things that are very far away have very little detail applied to them. The same is true of physics calculations. How many objects can a user reasonably have within their immediate view that require such vast amounts of physics processing to move? Again, diminishing returns for what amounts to lots of work on the development side.

On top of all this, for graphics rendering, GPUs are becoming more powerful faster than the internet infrastructure is becoming faster and more ubiquitous, so over time the idea of rendering in the cloud may actually hinder what is an almost routine ability of a graphics processor by design. Remote rendering makes sense on certain types of devices, say mobile, due to the issues with heat in a very compact device, as one day those devices are not going to be able to go any further based on today's technology; because of this, Moore's law is coming to an end for them at twice the speed of the average PC component. However, if a device has a reasonably recent GPU, then chances are its abilities are going to far outpace what the extra cloud rendering could provide.
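The LOD idea is simple enough to sketch. The thresholds and level names below are invented for illustration; the point is just that the expensive geometry and physics budget only gets spent on things near the viewer.

```python
# Hypothetical distance thresholds (world units) mapping to detail levels.
LOD_THRESHOLDS = [
    (10.0, "high"),    # near the camera: full mesh, fine-grained debris physics
    (50.0, "medium"),  # mid-range: reduced mesh, coarse physics chunks
    (200.0, "low"),    # far away: billboard/impostor, no per-piece physics
]

def pick_lod(distance):
    """Return the detail level for an object at `distance` from the viewer."""
    for max_dist, level in LOD_THRESHOLDS:
        if distance <= max_dist:
            return level
    return "culled"  # beyond the last threshold: don't simulate or draw it

print(pick_lod(4.0))    # high
print(pick_lod(120.0))  # low
print(pick_lod(999.0))  # culled
```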
Here we go. An actual developer, you know, someone who's made games and has experience with the tech necessary to do so, is saying that this is going to be really hard. And true to form, the Defense Squad will try to undermine the statement by saying "oh he doesn't work for MS" or "oh he doesn't know what's going on behind the scenes" and focus on stuff like specific applications. Because you all know better than him, right? I mean, MS says it, it must be true; a developer says it would be hard, it must be false because he's never made games before, right? So many heads in the clouds.
Because developers are never wrong...oh wait.
"hard" =/= "impossible".. If Microsoft has figured out a way for this to work, I'm sure it was not only hard but it was VERY hard... Point?
We just had an article on here the other day where a dev said cloud gaming could be amazing if done right, and it seemed to get dismissed by a bunch of people on this site, but we should all of a sudden listen to this guy. Funny how that works, huh?
Wow. Your argument is flawed. First of all, I don't recall any MS developer ever claiming improved graphics with the cloud. They are saying things like bigger worlds, improved AI, offloaded processing, and nothing regarding a graphics upgrade. If anyone has seen the Crackdown 3 cloud demo that was on display last May, they showed a version without the cloud and a version upgraded with the help of the cloud engine; the speed and destruction are the only differences, to show what offloading processing with the help of the cloud could do. The graphics are the same, the only difference is physics and AI. This Naughty Dog dev said offloading assets to the cloud is hard, but the Crackdown demo shows it can be done. Sony doesn't have the skilled people or money to run a cloud service like MS does, and maybe that is where the "hard" from a Sony developer comes from. Here is the Eurogamer piece describing the new engine for Crackdown 3. http://www.eurogamer.net/ar... "We also know that Dave Jones, who directed the first game for Xbox 360 at his now closed studio Realtime Worlds, is on board. Jones' latest company, Cloudgine, is working with Microsoft Studios on the game. Cloudgine specialises in cloud gaming technology and using data centres to boost computationally intensive game components such as PHYSICS and AI."
But wouldn't a software team that only makes software know more than a game developer? Just curious.
@Dudebro: So this dev is wrong but a dev you agree with is right? Classic. @Antwan3k: "Point?" That's what I'm trying to figure out from your post. Was it your goal to make an irrelevant post? You're taking the position that MS has already made this work. Well, they haven't really shown that in uncontrolled conditions now have they? But not only are you taking the position that they've made it work, you're also being asinine with your "Hurr durr, I'm sure it was very hard" nonsense. @marlinfan10: "Could be amazing" is not the same as "da cloud will make Xbox One have 4 times better grfx" which is the general attitude coming from you Defense Force people. "Could be amazing" is an open ended, uncommitted statement. It creates no actual position, and isn't a concrete opinion. Basically that dev was not taking a position in the debate, he was saying it "could" be great. This dev is not saying it's going to be great or not, he's saying it's going to be hard to do. And immediately the inundation has begun to dismiss what he said by all the veteran developers here on N4G. @Rookie_Monster: My argument that the Defense Force is dismissing what this dev is saying because he doesn't work at Redmond? That argument? Because I didn't say anything about the cloud being able or unable to improve graphics. Read what I said again. @castillo: When you're trying to use that software in game development, you're making it to be used by game developers.
Dragonknight: "Because I didn't say anything about the cloud being able or unable to improve graphics. Read what I said again." LOL, then why are you commenting on what the article is about? The article is about a Naughty Dog developer talking about the difficulty of improving graphics with the cloud. Agenda much?
"LOL, then why are you commenting of what the article is about?" I'm sorry but are you really this stupid? Why am I commenting on what the article is about? Did you seriously just ask this question? *sigh* Alright then. 1. This is a public website. Part of this public website is the ability to make comments. The reason people make comments on what the article is about is literally because they can. 2. You and the rest of the Defense Force are flat out dismissing what the dev says for no actually legitimate reason other than to kiss Microsoft's big green posterior. Ranging the gamut from "He's a former Sony dev" to "He doesn't work for Microsoft" absolutely NONE of you have actually been able to refute what he has said. All you've done is chant "In Microsoft We Trust" and left it at that. My initial comment reflects that attitude. 3. Who the hell do you think you are to dictate who can comment on what, where, when, and why? Do you honestly think that the only people who should comment on news are those with an affirmative/positive attitude toward the news being reported? Imagine what would have happened if no one criticized Microsoft for their B.S. DRM at the reveal/launch of the Xbox One. If everyone acted like the you and the gang and insisted that only people with agendas could ever say anything bad about the Lord and Savior Microsoft. Get over yourself and take off the green goggles.
Everything about programming on XB1 is really hard compared to Sony. What's your point? While Sony went the easy route, walking around AMD with a shopping cart and said, "I'll take an extra bus and some more of those aces," MS went with specialized processors up the wazoo, 15 in total, modifying the system for things like massive data movement between components. Besides, I'm sure MS wouldn't invest in 300,000 servers, or be laying their own internet infrastructure just so their lies would seem more believable. If everyone took the Sony approach to life, nothing new would ever get accomplished and we'd eventually end up living in a world resembling Idiocracy. Conversely, taking risks sometimes results in making mistakes but is also how new ideas are born. These fanboy conflicts are much like Edison vs. Tesla, or Newtonian Physics vs. Quantum Science when it was trying to gain traction some years ago. New ideas often get shunned by those content to walk the same outdated approaches in the name of tradition. Win10, DX12, and the Cloud are just getting started and MS predicted a 10 year life cycle for XB1, which if they are to be believed, means this debate will carry on for a long time.
"Everything about programming on XB1 is really hard compared to Sony." Anyone with any remote knowledge about the general makeup of both the PS4 and Xbox One would stop reading your comment right there. You don't know what you're talking about. "While Sony went the easy route, walking around AMD with a shopping cart and said, "I'll take an extra bus and some more of those aces," MS went with specialized processors up the wazoo, 15 in total, modifying the system for things like massive data movement between components." LMAO! Do you even read the stuff you post? "Besides, I'm sure MS wouldn't invest in 300,000 servers, or be laying their own internet infrastructure just so their lies would seem more believable." The majority of those servers are not used for Xbox, they are used for business applications and renting them out to other businesses. You can throw out "300,000 servers" all you want to, it means diddly when most of them aren't even for the Xbox One. "If everyone took the Sony approach to life, nothing new would ever get accomplished and we'd eventually end up living in a world resembling Idiocracy." Do you remember how earlier you said that it was "harder" to program on the Xbox One? Assuming that you're not completely wrong, which you are, guess where they would have got that idea from? The PS2 and PS3, both consoles being the most difficult to program for of their respective generations. Who pioneered game sharing, share play, and DVR in consoles? Hint: It wasn't Microsoft. "Conversely, taking risks sometimes results in making mistakes but is also how new ideas are born." Which all 3 of the companies have done. What's your point? "These fanboy conflicts are much like Edison vs. Tesla, or Newtonian Physics vs. Quantum Science when it was trying to gain traction some years ago." Hmm, Edison tried to either completely steal, or discredit, Tesla's ideas so I suppose that fits Microsoft to a tee right? I mean, I already listed what Sony pioneered this gen, and all you're doing is defending unproven cloud tech like it's been proven and is the most glorious tech ever. "Win10, DX12, and the Cloud are just getting started and MS predicted a 10 year life cycle for XB1, which if they are to be believed, means this debate will carry on for a long time." And there's the pamphlet speech. Win10 and DX12 are not going to do what you think they're going to do for the Xbox One. I'm content to let that 10 years prove it to you because I'm very patient. The long game is always more entertaining than the present boasting and cult like mentality coming from brainwashed followers.
@dragonknight Lol. "The wheel on the bus goes round and round, all the live long day."
I'm not interested in the songs you sing on your short bus.
Of course you are, that's why you had to post again.
Replying to a reply is just a courtesy; it doesn't necessarily denote having an interest. I know I just used some complex words, so let me see if I can put it in words you'll understand: "Me no care about you song."
Azure was put in place first and foremost as a customer cloud for OneDrive and other MS services. You're implying that all those servers were built exclusively for Xbox with cloud processing in mind. That's ridiculous.
@TheCommentator, you really, really need to lay off mrxmedia's site. And comparing fanboy arguments to Edison and Tesla, wtf? Talk about hyperbole. Gotta be the funniest thing I've read all day, lol. It's common knowledge the Xbone uses pretty much bog-standard components because, as MS themselves admitted, they didn't aim for graphical fidelity with the Xbone as the market they were targeting doesn't really care about it. Time to wake up and smell the coffee.
@midget_gem A GPU with 2 GCPs and 2 CCPs, 4 DMEs, and eSRAM now qualifies as bog standard? Where did I mention anything about graphics fidelity? About "secret sauce"? All of the info can be found on sites like DF, ExtremeTech, and AnandTech. It's common ignorance that you "no sauce" guys just fling poop like monkeys at the zoo. You're not even smart enough to see that the PS4 uses only standard PC parts while the XB1 is hardly standard at all, so it's no surprise you have trouble understanding analogies either.
I have not moved on to next gen yet because neither the PS4 nor the XB1 offers me anything I'd say is truly next gen. Before anyone starts to throw out a list of games, let me explain myself. Yes, we have had some games with top-notch graphics like Driveclub and The Order, but I am talking about games where physics are taken to the next level, where we can enter or destroy any building or vehicle, where NPCs are not copy and paste, where every playthrough is different and not scripted, where interactions and decisions actually mean something... I could go on, but you get the point. Hopefully this E3 the second wave of next-gen games proves me wrong and offers games that will make me a believer. No Man's Sky is something to that degree and has the potential for greatness.
Meh. I'm a gamer. The best versions of all the new games are on them, and lots of games have already been released or are coming that aren't on the old consoles. Waiting 2 years to save myself 100 dollars at the expense of a top-notch experience seems pointless. It was a no-brainer for me to make the jump immediately.
He does speak on a general technical level. For PS Now to work, you have to have a consistent and solid internet connection. As long as you have that, then you can offload things. It'll be interesting to see the approach MS takes with this. Do you go for a full offload of graphics to the cloud, so the console only has to communicate reference locations and input, or do you go a hybrid approach with some local and some remote... or even both? If we want games to run at 60fps, which is ~16.7ms a frame, then communicating the data, processing, and receiving the frame needs to take roughly that long (in an ideal world). IMO it's doable, but I agree it's a hard problem.
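Some rough back-of-the-envelope arithmetic on that frame budget (all numbers here are assumptions, not measurements): at 60fps each frame is ~16.7ms, so a cloud result with any realistic round trip lands several frames after it was requested and has to be scheduled for a later frame rather than the current one.

```python
import math

def frames_of_delay(rtt_ms, server_ms=2.0, fps=60):
    """How many whole frames pass before a cloud result requested now is usable.

    server_ms is an assumed processing time on the remote end, purely illustrative.
    """
    frame_ms = 1000.0 / fps
    return math.ceil((rtt_ms + server_ms) / frame_ms)

for rtt in (16, 50, 100, 250):   # illustrative round-trip times in ms
    print(f"{rtt:>3} ms RTT -> result usable ~{frames_of_delay(rtt)} frames later")
# e.g. 50 ms RTT -> ~4 frames late; 250 ms RTT -> ~16 frames late at 60 fps
```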
Getting to where we are at was hard. Why stop now?
Is he giving this opinion because it's something he actually works with, or is he just a dropped Naughty Dog dev wanting attention? If he were really that important he'd at least still be at ND, but he doesn't work there anymore.
I don't think cloud computing was ever intended to improve graphics performance... It's a way to calculate things like small AI code and a way to store things that aren't entirely necessary for the game to function. One thing you could use it for would be a dynamic accident in GTA that happens in front of you: instead of your PC calculating that AI, the server could do it and send the response to you. It's difficult to do for major things, though, due to latency. So until optimizations are made or they figure out some tricks, it won't be much more than cloud save data and Drivatars, pretty much.
No tricks will ever get around network issues. Without a stable, low-latency network connection you're screwed. I know Microsoft has suggested some nonsense about sending multiple pre-calculated frames for the various possible user inputs, but that's going to massively increase the bandwidth requirements and would likely be detrimental rather than helpful.
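Some purely illustrative numbers on why speculating frames for multiple possible inputs gets expensive fast: the bandwidth scales linearly with however many inputs you speculate over.

```python
def stream_mbps(width, height, bits_per_pixel, fps, speculative_inputs=1):
    """Raw, uncompressed video bandwidth in megabits per second."""
    bits_per_frame = width * height * bits_per_pixel
    return bits_per_frame * fps * speculative_inputs / 1_000_000

# One 720p/60 stream vs. speculating over 8 possible inputs (made-up figures).
print(stream_mbps(1280, 720, 24, 60))                        # ~1327 Mbps uncompressed
print(stream_mbps(1280, 720, 24, 60, speculative_inputs=8))  # ~10617 Mbps
# Real systems compress heavily, but the multiplier from speculation remains.
```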
Let's not disregard the fact that MMOs could fully utilize cloud computing for special aspects of their games. It may not be viable for a single-player game or even general multiplayer to use it, but some things could get use out of it.
I'm sure it will be if Naughty Dog uses it to the perfection it usually does. All for the better though, they'd do amazing things on it, I'm sure.
People who know their stuff have been saying this since day one. Worldwide internet speeds are way too bad today.
Don't tell them that. People like to live in the clouds...
It's gonna be hard for Naughty Dog because they are a PlayStation-exclusive company and Sony doesn't have that type of technology yet for the PlayStation brand, so of course it will be hard for Naughty Dog, but not for any first-party developer for Xbox. Love it or hate it, Microsoft is light years ahead of Sony in terms of servers, networking, and even cloud processing (offloading graphical tasks to the network). So I'm sure Xbox will see this as a possibility before Sony does. They have the brains and the money to make it happen, and a lot sooner rather than later.
Well said, but you will probably get disagrees because of your avatar and because you said something negative about Sony. I stated above why I don't believe we are ready for cloud gaming, but if MS can somehow make cloud gaming not rely 100% on the internet, then yes, MS will be a pioneer of cloud gaming. Edit: @PS4our I do agree that MS still has to show us instead of telling us. The demo with the building falling apart was very impressive, and it's said to be part of the new Crackdown game. This E3 will be critical for MS, as they have no choice but to show us what all the hoopla is about and that they are not just blowing smoke. Great time to be a gamer.
This is not the only person to debunk graphics processing in the cloud. A simple Google search will show you that. But whatever keeps Xbox fans happy, I guess. MS has, after all, shown tangible proof of actual work-in-progress games where the cloud increases Xbox One performance 3-fold, as they stated at a Japanese stage presentation... oh wait...
http://www.pcgamer.com/nvid... So Nvidia showing off global illumination using cloud computing is a lie now? It was supposedly debunked, yet Nvidia actually showed what it can do.
This guy must not be very busy or is dying to go work for Microsoft.
Like the cloud is an MS idea. lol. It was an idea put forward long ago by many people; Sony were even going to use Cell processors linked up worldwide to produce power. It's not possible and won't be for 10 years.
I was under the impression the cloud was to assist in CPU-based functions like physics, animation, etc. I never heard anything about the cloud improving graphics. Even that demo they showed a while back was all about how many destruction objects were being calculated in real time with the cloud, split-screen with a computer not using the cloud. That demo was all about frame rate vs. object count, not graphics, and in that case the frame rate was vastly superior. Seems like a strawman debate article to me.
“What graphics features are in a typical game where you could live with getting the results a few seconds after starting the calculation? Not many. I think that a compelling cloud rendering technique would have to be an amazing new feature that no one has really done before. If we think outside the box there might be some really cool things that we could do. But simply offloading existing work into the cloud is hard to justify because of the roundtrip latency and all the things that can go wrong on a network.” Read more at http://gamingbolt.com/impro... Well, he isn't really criticizing Microsoft. He's just criticizing offloading computations via the cloud, and he does have a point. Streaming is a heck of a lot easier to do than cloud computation, that's for sure. The question you should all be asking is: will developers make the extra effort to use the cloud to obtain a minimal gain in performance? In my opinion, I don't think so. However, I do believe the cloud is useful for other things, like streaming for example.
That quote? Been saying it since day 1. Apparently it makes me a Sony fanboy and Xbox hater. I thought I was just using common sense.
What bothers me the most is that some people believe the cloud is only Xbox Live compute, when the reality is the cloud includes streaming services and storage as well. From what I've seen, the cloud is pretty much proven for streaming and data storage. But as for offloading computations, to be honest I have never seen that done from an average Joe's system. Microsoft still has to prove that, and I thought they would last E3, but so far they haven't done it yet.
@MasterCornholio Actually, you have: almost every MMO uses cloud compute. The server calculates the damage done, then sends that data to the player.
@Dark_king: Yeah, but that's almost entirely streaming. And that's sorta the point. How many flawless MMOs are there that don't run into lag issues and such? And this is for games generated almost entirely server-side. If they still suffer due to internet limitations, what makes you think games that do more locally won't have the same limitations? It's the same inadequate internet being used, after all.
MMOs actually are a perfect example of cloud compute, as MMOs do exactly what cloud compute is said to be intended for. You have a local client application, which does all the work of displaying the final picture and receiving/sending input commands, holds the resources to display that picture, and has the code to turn the calculations the servers send back into something for the end user. That is asynchronous computing to a tee. It's not some magical invention of MS that recently came about. Almost the entirety of the game logic of an MMO, outside perceptible things like animation or whatnot, is calculated in the cloud. User position, user action, enemy action, damage calculation, event triggering, etc. are all handled on the server, and those calculations are processed into the calculations of every user on the server to provide the MMO experience. In the end, it doesn't really matter how much of the game is processed server-side, or what those server-side calculations are; the premise of cloud compute is the same. The idea of applying it to graphics has only gone mainstream since the reveal of the X1, although the theory had been out there for a few years before that.

@Spotie specifically: Yes, MMOs are often hindered by server or internet lag, and that has been a pretty constant criticism throughout this whole cloud debate. Only the most ardent fanboy says that latency isn't an issue, or that no one will ever experience issues, or that everyone has good enough internet to take advantage of cloud compute. I imagine these are the same people who don't have issues with the MP portion of their exclusive games on day one, or months after.
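A stripped-down sketch of the split described above, with invented names and a made-up damage formula (not how any particular MMO actually does it): the client only sends intents and draws whatever state comes back; the server owns all of the game logic.

```python
import random

class AuthoritativeServer:
    """Owns the game state; clients never compute outcomes themselves."""
    def __init__(self):
        self.hp = {"player": 100, "boss": 500}

    def handle_attack(self, attacker, target, weapon_power):
        # All damage math happens server-side so clients can't cheat
        # and every player sees a consistent result.
        damage = weapon_power + random.randint(0, 5)
        self.hp[target] = max(0, self.hp[target] - damage)
        return {"target": target, "damage": damage, "hp_left": self.hp[target]}

class ThinClient:
    """Sends intents, renders whatever the server says happened."""
    def __init__(self, server):
        self.server = server

    def attack(self, target, weapon_power):
        result = self.server.handle_attack("player", target, weapon_power)
        print(f"You hit {result['target']} for {result['damage']} "
              f"({result['hp_left']} HP left)")

client = ThinClient(AuthoritativeServer())
client.attack("boss", weapon_power=20)
```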
Cloud Processing helps the CPU not GPU. This has already been explained many times before.
But physics etc. can be done efficiently on the GPU these days. Everyone seems to have forgotten how GPGPU will grow in usage and be way faster than any cloud approach.
What annoys me is that Microsoft has announced a game that will offload destruction to the cloud, and then you have people claiming it's not real? Crackdown not real? Gamingbolt, a game in development is real, it's not nothing. We even know the name of the studio doing it, Cloudgine, and the names of some of the devs developing it are known. PS4 fans are still waiting on The Last Guardian; who is making that, who are the devs developing it, name the studio. So Sony's stuff is all fake then? Microsoft's plans are real. Sony's plans?