Roll 7's Thomas Hegarty talks about the potential that the cloud holds for gaming.
I will believe it when I see it.
Yeah we'll see. Show me.
ask and ye shall receive... https://youtu.be/eNAr1lbqkf...
Me too; not sure why anyone would disagree with Dark and Relient's comments. So far it's all talk, and there's been no visual proof of it. I know for a fact that any tasks that need to be rendered within tiny millisecond intervals can't be handled by cloud networks; they just can't react quickly enough. Any baked-in elements of a scene are fine, and letting those be rendered by the cloud would in itself be awesome: you could basically have real-time pre-rendered portions of a game world with gorgeous visuals. But any parts of that scene that need to change very quickly would have to be dealt with by your local hardware, like a huge building in Battlefield or any areas of the world that undergo fast-changing damage; that needs to be handled locally. It's basically a matter of using the right hardware for the right jobs. The cloud can help for sure, but it has its place. Anyone thinking the cloud can do it all, and that you won't have huge performance and latency issues if you go full cloud, needs to do their research and realize it has its limitations.
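To put the "right hardware for the right jobs" point in concrete terms, here is a minimal sketch of the kind of dispatch logic being described: offload only work whose deadline comfortably exceeds the network round trip. The task names and the 100 ms round-trip figure are illustrative assumptions, not anything from a real engine.

```python
# Hypothetical dispatch: the task list and 100 ms round trip are illustrative.
ROUND_TRIP_MS = 100  # assumed cloud round-trip latency

tasks = [
    ("baked global illumination", 5000),  # result needed within seconds: tolerant
    ("distant world streaming", 2000),
    ("building collapse physics", 16),    # needed next frame (~16 ms at 60 fps)
    ("player hit detection", 16),
]

for name, deadline_ms in tasks:
    # Offload only when the deadline dwarfs the round trip.
    where = "cloud" if deadline_ms > 2 * ROUND_TRIP_MS else "local"
    print(f"{name:28} -> {where}")
```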
@darthv72, that video is just awful. First of all, that is an 11-year-old game, and second, its textures have been reduced to ugly flat polygons. If that is the future of cloud gaming, count me out.
@darthv72 Not only would that be unbearable to play, it's done as college research. They are not in it for the money, strictly to learn and research. No mention of latency or anything.
@Reginald it's a technology demonstration applied to a now completely open source game. Somehow I'm not surprised Duke University didn't fund the development of a brand new AAA game with cutting edge graphics just to demonstrate their backend technology...
Hey Darth. Didn't Duke have a partner on that project? Oh yeah, it's Microsoft. Doubt it if you like, but cloud compute is the future; even Sony knows it. Reginald probably doesn't stream online or even download videos. I remember when people said they would never watch recorded video online, and now Mark Cuban is rich. You would think it doesn't need to be pointed out again, but it seems it does: technology moves on. Some of these doubters thought the PS3 was future-proof.

There is a huge lack of understanding of how to utilize something intelligently. The cloud can generate the world in the distance, or in the next room, and load it for you. Games where every door actually opens to something. Plus there is a complete lack of awareness of the whole picture. The amount of information they can compress travels over a pipeline that is constantly growing. Phones will provide gigabit service by 2020. The main ISPs are losing their main source of income: cable. At the same time, Google is spreading their fiber network along with others. Others will follow HBO and offer streaming services, and cable companies will become strictly ISPs. The big ISPs only need to spend a little more money to offer what Google does, and HBO and Netflix are forcing them to do it, along with the new FCC chairman, who we should all applaud.

In addition to all of that, Microsoft isn't just waiting on anyone else. They are laying undersea fiber of their own, in partnership with a couple of companies that do that work all over the world, to guarantee they have quality service to their data centers. All of the technology needed is converging and will come together sooner than people think, and doubters will look like blind men. Here are a couple of links you might find interesting: http://www.digitaltrends.co... http://www.neowin.net/news/...

Utalkin2me, colleges are only in it for the money. Hate to break it to you like that, but it's true.
^ I would think your arguments would work better for an actual game-streaming future, not necessarily cloud compute. Although cloud compute is a good idea, the future looks like a game-streaming future, much like the way video and television are going. Since Sony owns the major patents for those, they are in a pretty good position for gaming's future. But so is Microsoft, in many ways.
lol... fireseed... a breath of fresh air to see someone using basic reason.
Jonnydoe, the thing is Microsoft is actually working all angles. I don't know if you saw their predictive streaming tech; if you're streaming, that helps in that arena. But compute is the real future. If you look at the way Windows 10 works with multiple GPUs, it sees them as just GPU units with combined memory, regardless of architecture. To me, and I believe to them too, that is an opportunity for local and offsite cloud computing. One of the things they have pushed lately seems to be just that. I need to look into it more, but that is what it seems to be.

I also believe it will go beyond data centers at companies; I believe it will include home networks, including Xbox. If you look at their Continuum technology, where what you do on one device seamlessly switches to another, I believe that is an opportunity to use resources from another machine to assist the Xbox One, whether it be a PC, a tablet, or maybe even a phone. I brought up the idea some time ago on the Beyond3D forums. Ultimately it was agreed that it might be possible, but no one liked the idea of having to program for that. That was before it was announced that Windows 10 only sees assets, so really it seems they don't have to program for it.

I also believe we may move away from developers doing all of the physics work and more towards a middleware model. What I mean by that is something like "Real Tree," I believe it's called, the middleware that created lifelike foliage. This is another reason I believe Microsoft wanted always-on. Say you're playing Alan Wake 2 (hopefully) and you need a tornado. The game is rendered locally but mirrored on a server. The server knows where everything is, determines the path for the tornado, and sends it and the resulting destruction solutions to the console for implementation.

That Duke demo ran on 1 Mbps, and I doubt anyone who games only has a 1 Mbps connection. In this area the average has to be 30 Mbps, and that might be low. I think people are focused too much on the end game to consider the vast space in between. There is so much that can be done between now and everyone having gigabit service.
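To make the tornado example concrete, here is a minimal sketch of the pattern being described, assuming a mirrored server that returns only numbers (a path and debris velocities) for the console to apply locally. All message shapes and field names here are hypothetical.

```python
# Hypothetical sketch of the mirrored-server pattern; message shapes are made up.
import json

def server_compute_tornado(world_state):
    # On the server: heavy simulation over the mirrored world; the result
    # is plain numbers (a path and debris velocities), not rendered assets.
    return {
        "path": [[10.0, 0.0, 5.0], [12.5, 0.0, 6.0]],         # tornado waypoints
        "debris": [{"id": 42, "velocity": [3.1, 9.0, -1.2]}],
    }

def console_apply(solution, objects):
    # On the console: apply the server's numbers; rendering, sound and
    # destruction effects all stay local.
    for d in solution["debris"]:
        objects[d["id"]]["velocity"] = d["velocity"]

solution = json.loads(json.dumps(server_compute_tornado({})))  # stand-in for the network hop
objects = {42: {"velocity": [0.0, 0.0, 0.0]}}
console_apply(solution, objects)
print(objects[42])  # {'velocity': [3.1, 9.0, -1.2]}
```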
Good to know you already believe it then, since it's already been shown numerous times. Now we're just waiting on the games to make use of it all.
Pretty much. Less talk and more action. I'll give them a chance though.
Cloud is evil, cloud is bad, don't bother innovating or even trying because you're all evil; moving things forward is scary, so it has to be the evil big bad cloud.
Cavemen gonna hate. F--- Grock and his da-- wheel.
Cloud is essentially remote compute, that is, where the processing is done on a computer/server other than the one you are using. We have all seen remote compute being used in the likes of SharePlay and PS Now. The evidence is there and it works.
Shhh!! You'll burst the marketing bubble that makes "cloud" seem magical :) Yes remote computing works just fine, the problem is that too many people have internet connections that are too slow, have high latency or low data caps. Because of that you wouldn't want to design a game that relies on remote compute for anything essential.
Kneon, are you quoting old messages from when Live first launched? I swear I've heard that exact same argument before.
It works for older titles on a good, low-latency ISP. If you want modern graphics at full-fledged clarity on your HD TV, then it's not going to be the way to go. Great idea, but it won't replace physical hardware at home for at least two decades at the earliest, and only for older titles. And it will definitely not replace competitive gaming. GREAT idea though. PS Now is available on Samsung smart TVs now, Vita, some phones, current consoles, etcetera. I hope it makes its way to PC with an app. And I would love the same thing with Nintendo. Ahh, the nostalgia anywhere I go.
https://www.youtube.com/wat... Obviously not much is disclosed on specs.
This looks good, but sort of confirms that Crackdown will be an online only title unless we want to play with 4fps offline.
then you will believe it @ E3 :)
What we see @ E3 does not equate to the quality of the cloud gaming @ home. Sadly :(
Damage control early, huh? How would you know? Perhaps it needs as little as 10 Mbps... wait till you see it and get more info before saying home will be different, smh...
They will never agree. When you show it to them, they will insist a supercomputer is hidden under the table. If you give them a laptop with no wires, they will insist you are using a local 802.11ac connection to run it. If you show them you are connected to a server 3,000 miles away, they will ask "what about all the gamers in Indonesia with dial-up?" or "what about the two times a year my Internet is down?" Nvidia is doing some REALLY cool things here. OnLive was too early, but in a few years you might be playing AAA games with just a wireless controller with an ARM chip in it.
Let me defend myself a bit here: I believe it when I see it. Regardless, I'm an advocate for streaming, i.e. cloud gaming. My Vita streams a lot of my PS4 content even today, and I'm also interested in Nvidia's new streaming tech. I'd love to stream Steam and GOG Galaxy to a portable screen which I could then control seamlessly with a controller on the go. At the same time, though, it's hard to believe I can get a decent AAA experience on a train going across any continent on the planet. 5G might be different: on paper, the ping and speed would be suitable for streaming titles like The Witcher 3 on a Shield 2 or a similar device dedicated to streaming games. Real-world applications and show-floor stuff are just so far apart that I want to remain cautiously optimistic. Here's hoping it all works as planned! Edit: here's a bad joke: Steam's new stream service is called "StReam!". "Ha ha," you can go home now.
I remember when Nvidia released that proof-of-concept video showing cloud compute, and everyone used it as proof that it could work, and was going to work, for the X1. But no one actually read the analysis of what was going on or where it was in production. Nvidia, who has been working on this tech longer than MS, said in the review that the technology, while available, was not up to the level of widespread distribution, both from a server standpoint (it would require massive resources on a large scale) and from an availability standpoint (the bandwidth required on a massive scale would be prohibitive or not available). Nvidia stated that the tech would be quite a ways off, and quite a ways is more than a few years, although no specific time frame was given.

Cloud compute for games could see some minor implementations, but probably nothing so major that a standard dedicated server couldn't do the same. Basically, the tech is available to everyone now, regardless of platform. The big question is: is the tech exciting or wanted enough in its present or foreseeable form to warrant people really getting excited for its implementation? It makes sense for online-only games, but not all games are online-only, and forcing features onto the cloud, minimal as they will end up being, is probably going to do more harm than good in selling this tech to the public.
Server-side physics processing isn't new; the Source engine has been doing it since 2004. Albeit with fewer gibs.
I wouldn't rely so heavily on something that requires you to stay connected to the internet at all times.
So online multiplayer is out for you, is what you're saying?
I'm sure he can live just fine if he doesn't get to play online MP all the time. Point being, I'm sure he's not HEAVILY RELYING ON ONLINE GAMING to be happy.
Why do people make this stupid analogy?
Online multiplayer isn't a requirement; it's a luxury.
depends on the game
Many games rely on an Internet connection already anyway, and if this tech can bring some new experiences, I'm all for it.
Whether it's Sony with game rental services delivered directly to TVs or MS with their "enhanced experience," this boils down to making online a necessity, retaining IP ownership after what used to be direct consumer sales, and above all generating constant revenue.
Look at the comments about the new Need for Speed to see what people think about always-online games. Yet for some reason this is defended, and if you question it you're a blind fanboy who can't see the genius of it.
Both game streaming and compute can be great for the future. Both have been done but not on Sony's and Microsoft's level, respectively.
It could be awesome, but unfortunately the limitations of Internet service in the U.S. make it just not plausible at this time.
Cool, and the future gets brighter and brighter.
Maybe in 2025.
Cloud gaming is already defunct - it simply doesn't work.
Cloud gaming done correctly simply doesn't exist, nor will it. Latency will always be an issue, no matter how fast you think your connection is. That's to say nothing of the iron grip companies would have on access to the gameplay content you consume.
Well, they wouldn't send latency-sensitive computations to the server.
Someday it might just be the natural way of doing things, but that future needs 10X the internet speeds we have now. The way I see it, it won't be like that for a long time, at least in the US, where internet providers don't seem interested in bringing high speeds unless we fork over a lot of $$$.
Speed isn't really the issue; it's more the latency. Most stuff being computed in the cloud requires minimal bandwidth. However, since there is a limit on what can actually be calculated in the cloud and then sent back to the system in a reasonable time frame (fractions of a fraction of a second), latency becomes the problem. Internet speed and latency are not connected things: you can have super high bandwidth but really high latency, and vice versa.

Unfortunately, this is a matter of physics, which can't be overcome barring some change to the way light works, because no matter how fast data can move, it still has to move within the construct of the cables that make up the internet itself. Since cloud compute for gaming is likely going to be a fairly low concern for ISPs in terms of market share, the likelihood they'll opt to focus on lowering latency over increasing the already choked bandwidth of the internet is relatively low. In the end, it's business which makes the most money for ISPs, and for business, latency isn't the biggest concern, since most programs aren't hindered by the standard latency of the internet (around 80-160 ms). Gaming requires that the latency remain a fairly constant sub-100 ms ping, if not much better, and no matter what ISPs do, lag spikes happen which drive this time up, so for the time being it's not optimal to rely on cloud compute for anything time-sensitive.
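The bandwidth-versus-latency point lends itself to simple arithmetic: delivery time is roughly latency plus size divided by bandwidth, so a small, urgent packet is dominated by the latency floor no matter how fat the pipe is. The figures below are illustrative assumptions.

```python
# Illustrative arithmetic: delivery time ~= latency + size / bandwidth.
def delivery_ms(size_bits, bandwidth_bps, latency_ms):
    return latency_ms + size_bits / bandwidth_bps * 1000

packet = 2_000 * 8  # a small 2 KB physics result

# 10 Mbps pipe, 80 ms latency: ~81.6 ms total
print(f"{delivery_ms(packet, 10e6, 80):.1f} ms on 10 Mbps")
# 1 Gbps pipe, same latency: ~80.0 ms total
print(f"{delivery_ms(packet, 1e9, 80):.1f} ms on 1 Gbps")
# 100x the bandwidth saves ~1.6 ms; the 80 ms latency floor stays.
```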
Always interesting to read detailed technical analysis from so many qualified professionals /s.
Everyone in this comment section keeps talking about the latency of cloud compute. Well, if you did your research, you'd know it has a lot less latency than cloud streaming.
Latency in displaying a picture and latency when computing are two very different things. A person isn't likely to notice a 1/10th of a second of lag, but a program, particularly a game, will choke and stop completely in that amount of time. Of course, any game written for cloud compute would have to account for this, but that doesn't change the fact that cloud compute and streaming are two very different technologies.

Just to put it in perspective, DDR3 RAM has an access latency on the order of 10 nanoseconds. The internet, taking both directions into account, at a minimum has a latency of 160 ms (assuming 80 ms each way, which is an extremely optimistic ping), and that doesn't count the latency added on the server, call it another 10 ms or so. That's roughly a sixth of a second, millions of times slower than local RAM, on a processor which is calculating a game loop 30-60 times per second. Theoretically, a fully streamed game could have as much power as the servers are willing to dedicate to it, but cloud compute games will always be limited by the latency of the internet and the hardware on which the host resides.

So yeah, doing research is the best thing about your comment, and I would suggest you do some yourself. Latency is latency; it doesn't really matter what it's used for. The difference is the perception of that latency and the way the human brain works. When everything is done on the cloud (streaming), latency isn't as much of an issue, since the human brain can adapt. Computers, however, have to be programmed to adapt, and don't handle things they haven't accounted for very well. This is why huge lag spikes are annoying for streamed games, but for cloud compute games they will render a game virtually unplayable, as eventually the host machine has to have the information to proceed.
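As a worked version of the magnitudes above (ballpark assumptions, not measurements): a 60 fps game loop has about 16.7 ms per frame, so an optimistic ~170 ms cloud round trip spans roughly ten frames, while a ~10 ns RAM access is millions of times faster.

```python
# Ballpark magnitudes only; none of these figures are measurements.
FRAME_MS = 1000 / 60        # ~16.7 ms per frame at 60 fps
RAM_MS = 10e-6              # ~10 ns DDR3 access, in milliseconds (assumed)
CLOUD_RTT_MS = 80 * 2 + 10  # optimistic 80 ms each way plus ~10 ms server time

print(f"frames a cloud round trip spans: {CLOUD_RTT_MS / FRAME_MS:.1f}")
print(f"round trip vs RAM access: {CLOUD_RTT_MS / RAM_MS:,.0f}x slower")
```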
The Xbox One was built with a very low-latency architecture; that explains the choice of DDR3 and the super fast, low-latency ESRAM. What people are confused about is the allocation of resources between console and server. The console is rendering almost everything: graphics, lighting, sound, AI, etc. What the cloud does is take on some tasks that can tolerate latency, like AI. Lighting can tolerate it too, since it's mostly static, and terrain is mostly static as well; the ground doesn't change much at all in video games. Now, deformation is what is going to take a little more latency, but not much. The console is still taking care of the sound, graphics, lighting, and effects of this destruction. The only thing the cloud is doing is processing the XYZ coordinates of the pieces of debris flying out, that is, the direction and speed of these objects. These are nothing but numbers, not entire objects being processed on the cloud. The reason cloud streaming has even more latency is that it processes all of that on the cloud, not just bits and pieces of it like compute: everything from graphics, lighting, physics, and AI to controller input. That is way more than what cloud compute is taking on.
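The "nothing but numbers" point can be quantified with a back-of-the-envelope payload: one position and one velocity per debris piece is six floats, so even a hundred pieces fit in a few kilobytes. The packing layout below is a made-up example, not any actual protocol.

```python
# Hypothetical payload layout: position + velocity per debris piece.
import struct

def pack_debris(debris):
    # Six 32-bit floats = 24 bytes per piece.
    return b"".join(struct.pack("<6f", *pos, *vel) for pos, vel in debris)

debris = [((1.0, 2.0, 3.0), (0.5, 9.0, -0.2))] * 100
payload = pack_debris(debris)
print(len(payload), "bytes for 100 debris pieces")  # 2400 bytes
```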
"The Xbox One was built with a very low latency architecture." No, it wasn't. Internally, the XBox bus is a standard x86 system bus, and there is a cache in ESRAM to overcome the bottleneck of the slower RAM. The cache is extremely fast, and it'll probably do it's job, but that doesn't make the system low latency. Anyhow, we're talking INTERNET latency, and MS has absolutely no control over that. None. There are too many factors and variables and service providers and home network configurations for MS to overcome the latency of the internet. Internally, the Xbox can probably handle any game logic thrown at it, within reason, but that doesn't change that the system will stall if it has to wait for the internet. It wouldn't be constant, but it will happen, giving a highly variable game play experience. I'm curious...what is your ping on your Xbox One? Or if that isn't available, what is it from your PC when pinging the closest MS cloud server? Is it below 80ms? My guess is that it's much higher than that, and there's no reason to assume it would be lower just because MS has ESRAM or some special co-processor in the Xbox One. It defies all logic in network infrastructures. I'm not personally confused about the allocation of resources. I'm quite well versed in cloud compute and asynchronous computing. I spent months researching and implementing test samples for a job I did, and in the end, it was determined insufficient for what the client needed, and that was for something that didn't require data the same way games do, and yes, I did use Azure and yes the internet provided to me was the best you can get with low latency(<60ms in most cases) and high bandwidth. I understand the difference between what I did, and how it will be implemented for game cloud compute. The reason that streaming can work the way it does is because the actual program does not have to rely on the shortcomings of the internet. Basically the entire game is run on a remote server, with only input commands being sent to the server, usually through tunneling or VPN which can help reduce latency, and then the image is simply sent back to the client. There is latency involved, but the human mind can not often see that kind of latency. And that's the difference I am talking about, a human mind and a programming routine work in entirely different ways, so a human mind can fill in the gaps when latency occurs, where as a computer can not fill in those gaps, and instead requires a back up plan to handle processes when data is not available. Those XYZ coordinates, known as a transform if you include the rotational on axis, are the heart of the rendering process. Without them, there is no place to draw an object. The direction is another variable of the transform. I understand those are the things that would be calculated on the cloud, and I also know that follow up calculations to update positions would be handled on the client, making the whole thing rather pointless to offload unless you want to have billions of objects exploding at once. Basically the whole idea here is that you're throwing a twig onto a bonfire and expecting it to have a substantial effect, when in reality you're actually taking more resources to offload the data. AI is a more interesting proposition for justifying cloud compute, and even lighting. Lighting itself is going to be a ways off by NVidia's own demonstration brief, and it's not ready for prime time. 
That leaves mostly just AI, and while that could be substantial, it isn't something that would be singular to MS's platform; it's available to anyone who cares to pay for cloud compute services from whoever provides them.