"Though it's been a little over a month since the Xbox One and PS4 released, there are still doubts whether memory will be sufficient to last an entire generation."
In a common 3D SPH (smoothed-particle hydrodynamics) fluid simulation, it is fairly easy to max out 8 GB of RAM. A new, more efficient programming solution is needed to not only reduce memory usage but also speed up the simulation. Another route would be a cloud solution, such as Gaikai. AMD's TressFX is awesome, and I wonder what other physics tech AMD is working on; I hope it's destructible environments and SPH physics. I'm still waiting for AMD's GPGPU tech demo showing Ruby fighting in the rain, with visible rainwater splashing on her body and dripping down her tight battlesuit, for the advancement of Science! :D
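The claim that SPH can max out 8 GB is easy to sanity-check with back-of-envelope math. The particle layout and count below are illustrative assumptions, not from any real engine:

```python
# Rough (illustrative) estimate of memory for a naive 3D SPH simulation.
# Field layout and particle count are assumptions, not from a real engine.
BYTES_PER_FLOAT = 4
# position, velocity, force (3 components each) + density + pressure
floats_per_particle = 3 + 3 + 3 + 1 + 1           # = 11
bytes_per_particle = floats_per_particle * BYTES_PER_FLOAT  # 44 bytes

particles = 50_000_000  # 50 million particles for a dense fluid volume
gigabytes = particles * bytes_per_particle / 1024**3
print(f"{gigabytes:.1f} GB")  # ~2 GB for raw particle state alone
```

And that is only the raw state: neighbor lists, spatial hash grids, and double-buffering for the integrator can easily multiply this several times over, which is how 8 GB disappears.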
As a network technician I don't see server-side calculations coming to fruition any time this generation, due to the undeveloped internet infrastructure around the world. Streaming full content (Gaikai), on the other hand, is feasible.
Agreed with nypifisel. Sending computational data across a network for the unit to deconstruct just sounds risky: a sudden loss of packets could totally screw up a video game if it is constantly polling that information over the internet. Cloud computing vs cloud gaming... I'd go with the much more stable platform, because it has been proven (Gaikai).
Yes. Here in the US I have Time Warner and the freakin' connection drops often due to the old cable line infrastructure.
"I don't see server sided calculations coming into fruition any time this generation due to the undeveloped internet infrastructure in the world." Exactly what I've been saying since all the cloud talk started. I'm amazed you have no disagrees at the time I'm posting this.
I think it can work; you just need to split the game's world into what needs to be computed locally and what should be computed via the cloud. Things that should be done locally are everything within the vicinity of the player that needs to be constantly updated: visual things like lighting, as well as things that affect gameplay like physics or some of the enemy AI.

What I think the cloud has the potential to do really well is AI. Latency can seem like a big issue for AI, but it doesn't have to be. For online games, people's ping is usually between 100-250 ms on average, while a good connection can average closer to 30-50 ms. The human mind, however, takes about 300-700 ms to make a decision. So we could have the computing for AI done via the cloud and have the result come back faster than it would take a human to decide; it really just depends on how fast the servers are.

Here is an example of how to mix local and cloud computing. In a sandbox like the next Fallout, imagine new enemies that stalk you for days: the cloud tracks where and how you travel, and these stalkers use that data to follow you and plan an ambush. When they make contact with you, that is when your console spends more resources on their AI. Even in combat, the AI can still have pathfinding done through the cloud, so they can change their positions or have an escape plan for any situation. Sure, the U.S. might not have the best network infrastructure, but gaming is quite popular in Japan as well, so I expect to see some crazy stuff coming from there soon.
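The latency-budget argument above can be sketched as a toy check: an offloaded AI decision works if the network round trip plus server compute fits inside the human reaction window. All the timings here are illustrative assumptions, not measurements:

```python
# Toy latency-budget check for offloading AI decisions to a server.
# All timings are illustrative assumptions, not measured values.
def can_offload(round_trip_ms: float, server_compute_ms: float,
                deadline_ms: float = 300.0) -> bool:
    """True if a cloud AI decision can arrive before the deadline.

    deadline_ms defaults to the low end of the ~300-700 ms human
    reaction time cited in the discussion above.
    """
    return round_trip_ms + server_compute_ms <= deadline_ms

# A 40 ms ping with 100 ms of server-side planning fits the budget...
print(can_offload(40, 100))    # True
# ...but a 250 ms ping with heavy pathfinding does not.
print(can_offload(250, 120))   # False
```

The design point is that only decisions with a deadline looser than the round trip (stalker routing, city-scale planning) are candidates; anything frame-critical stays on the console.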
It will do fine. You guys' sentiments are no different from doomsayers' whenever something new is on the rise and most people just don't know better. This is Microsoft, one of the biggest corporations; they have the resources to weather any transition. If anything, compared to Sony they are truly taking innovative risks for the gaming industry, and I believe they will succeed. I may look like a fool now with that statement, but I'll have a fat smile on my face when you see some amazing titles come out.
Disagrees will come regardless of what you say. I am a computer scientist and I agree with what you just said. Right now, algorithms and theories about that happening are just a fantasy. You'd need a super stable connection that behaves exactly like the components inside the hardware (speed, bandwidth, consistency, etc.). Some people might ask why. The answer is software interaction: streaming something to you is far easier than interacting with the software environment, because the devs have to take into consideration every possible situation, including the worst-case scenario. Sending data out over the network is gambling with that data; you don't know when it will come back to you, maybe 20 ms before the event or 1 s after it. That's why right now it's very hard to do something serious with the cloud, except for things like background updates, draw distance, and player interactions. That's what comes to the top of my head at the moment.
Artists have been using pencils and paints for centuries, yet we somehow have people creating things that have never been accomplished before, using THE SAME TOOLS! What we can do with 8 GB of RAM now is different from what we can achieve with it 6 years from now. Person A has a $50,000 music studio but their music sounds like crap. Person B has $500 in recording gear and it sounds beautiful. What you have helps, but it's how you use what you have.
Exactly! Look at what happened with the Xbox One fiasco and its plan to work only online, and the fact that if consoles had adopted the cloud they wouldn't have sold as much, or as fast, as they did at launch! Companies are trying their hardest to push for an always-online, one-service solution, and the truth is people will not accept it, because it's costly and there's almost no advantage to going digital thanks to price fixing and download sizes!
Those of you who say this will work just fine seriously don't know what you're talking about, and have no knowledge of the subject other than "but MS said so". The idea mentioned above about a consistent, stalking enemy, for instance, is just a waste of resources; why spend computational power on that when, for the player, an enemy spawned in at random times is just the same? You wouldn't be able to tell whether this enemy stalked you or not. Sonic989 made a great point. I'll develop it further with an open question: what is the bandwidth of your RAM? Now what is your internet bandwidth? Hint: the world's average internet speed today is around 2 Mbit/s, or for my argument's sake around 250 KB/s (0.25 MB/s). The XBO's system memory runs at 68 GB/s (~69,000 MB/s), the PS4's at 176 GB/s (~180,000 MB/s), and only the PS4's memory is considered sufficient to move big assets like textures fast enough. This is a very crude and not wholly accurate example, but it's enough to realize the difference we're talking about here.
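The scale of the gap in that comparison is worth making concrete. Using the same figures quoted above (68 GB/s and 176 GB/s local memory vs a ~0.25 MB/s average connection):

```python
# Crude comparison, using the figures from the comment above, of local
# memory bandwidth vs a typical internet connection.
XBO_RAM_MBPS = 68 * 1024      # ~68 GB/s system memory, in MB/s
PS4_RAM_MBPS = 176 * 1024     # ~176 GB/s GDDR5, in MB/s
NET_MBPS     = 0.25           # ~2 Mbit/s average connection ≈ 0.25 MB/s

print(f"XBO RAM is ~{XBO_RAM_MBPS / NET_MBPS:,.0f}x faster than the net")
print(f"PS4 RAM is ~{PS4_RAM_MBPS / NET_MBPS:,.0f}x faster than the net")
# -> roughly 280,000x and 720,000x respectively
```

A five-to-six orders-of-magnitude gap is why the cloud can only ever carry small, latency-tolerant data (AI decisions, state deltas), never anything resembling "extra RAM".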
This is blind fanboyism; the cloud cannot be used to improve games' graphics. It's not possible with the current internet infrastructure: your internet isn't fast enough. This is ridiculous; even if your internet were the fastest, it's not stable. Internet speed fluctuates depending on how many people are on it and how far away you are from the hub, unless you have fiber optics or a dedicated line. Please stop listening to M$ marketing PR spin. There have been several articles written debunking this cloud nonsense, one by Digital Foundry: http://www.eurogamer.net/ar... See also "TRiG ep 12 - Tearing Down Xbone's Cloud" on YouTube: https://www.youtube.com/wat...
I saw an article on here a few weeks ago about a programming solution where hi-def textures could be streamed as needed from the hard drive instead of run out of system memory. I'm no programmer myself, but the article claimed this method could save up to 75% in resources, so it sounded good to me.
Streaming textures from the HDD is common already; most games do this these days, so that's not really the biggest issue any more. A 4 GB "cache" for textures is quite sufficient. The problem I see is that, with the memory now at hand, some people throw their brains away and forget what we did over the last 15 years, thinking "hey, awesome, we don't need to optimize any more". It's quite the opposite: things are getting much more complex, and optimization is needed more than ever.
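The texture "cache" being described is essentially a fixed memory budget with least-recently-used eviction. Here is a minimal sketch of that idea; the class name, texture names, and sizes are all made up for illustration:

```python
from collections import OrderedDict

# Minimal sketch of a fixed-budget texture streaming cache (LRU eviction).
# Names and sizes are illustrative, not from any real engine.
class TextureCache:
    def __init__(self, budget_bytes: int):
        self.budget = budget_bytes
        self.used = 0
        self.cache = OrderedDict()  # name -> size, oldest entry first

    def request(self, name: str, size: int) -> str:
        """Return 'hit' if resident, else 'stream' it in, evicting LRU."""
        if name in self.cache:
            self.cache.move_to_end(name)      # mark as recently used
            return "hit"
        while self.used + size > self.budget and self.cache:
            _, evicted_size = self.cache.popitem(last=False)  # evict LRU
            self.used -= evicted_size
        self.cache[name] = size               # "stream" in from disk
        self.used += size
        return "miss"

cache = TextureCache(budget_bytes=3)
print(cache.request("rock", 1))   # miss (streamed in)
print(cache.request("grass", 1))  # miss
print(cache.request("rock", 1))   # hit (still resident)
print(cache.request("sky", 2))    # miss, evicts least-recently-used "grass"
print("grass" in cache.cache)     # False
```

A real engine adds asynchronous disk I/O and mip-level fallbacks on a miss, but the budget-plus-eviction core is the same.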
PC games have done that for the last 4-5 years. Rage is a good example of using this, while also having the graphics card compress textures as needed.
This feature has already been in use. Black ops 2 used this feature on last gen consoles.
As mentioned, texture streaming has been common for a while. I think we are seeing more articles on it lately because the recently released DirectX 11.2 and OpenGL 4.4 specs provide direct support for a lot of what used to have to be implemented in game code and middleware. It won't change last gen, but it should be another tool for next-gen titles.
Hogwash. There's plenty of PC games with 2000x2000, and some with 4000x4000, textures working fine on a 1 GB graphics card. It's only in the last 2-3 years that PC graphics cards have had more than 1 GB on the card to hold more textures, if any games ever came out to use it. This is a poorly researched story written by someone who has no idea what they are writing about.
Those PCs have graphics cards that are far more powerful, with dedicated RAM just for graphics. They have CPUs, offloading those tasks, that are 2-3 times as powerful as what's in an Xbox One or a PS4. They ALSO have 4-24 GB of dedicated system RAM that can be used as well. It's a totally different scenario; there is a reason PCs can do far more than consoles.
You can't get gaming on a 1 GB GDDR5 graphics card; sorry, I don't know where you're pulling that from. At best you need 2 GB of GDDR5 just for watching movies and such. If you want to play a game at 20-30 fps at that resolution, the best GPUs use something like 4-6 GB of GDDR5. Then if you want 60 fps you need 2-3 of those babies next to each other, which can cost you upwards of $2,400-3,000 in GPU costs alone.
@The Hitman, LOL! Do some research, bud, because you are waaaaay off. I have a single 660 Ti that can handle 60 fps in nearly every game I throw at it; some games go well over 200 fps (think Minecraft). Movies are nothing on my card, even pushing 1080p 3D. The card happens to have 3 GB of GDDR5, but very few games use more than 1 GB at the moment. You can literally buy a $200-300 computer that has at least an i5 and 8 GB of RAM, slap in a $200-300 graphics card and a better power supply (usually around 50 bucks), and play everything on the market right now. Please know what you are talking about before you spout nonsense.
@Frag, do some research? The 660 Ti has at least 2 GB of GDDR5: http://www.tomshardware.com... At 2560x1600, which is still significantly lower than 4K, BF3 on high (not even ultra) settings runs at about 38-45 fps. Putting it on ultra would probably shave off 10 fps, and upping the resolution to 4K would take off another 15-20 fps. So instead of spouting off about your probably-pretend 660 Ti: there is no way you can reach 60 fps at 4K resolution without spending some real cash on multiple very high-end GPUs. I should have known better when you mentioned Minecraft as a means of benchmarking /facepalm. So like I said before, there is no way you're pushing 2-4K resolution through 1 GB of GDDR5, like zag said. Any and every benchmark you'll find says otherwise, unless you're playing Facebook games.
Rather than VRAM, GPU shader/core clocks have a FAR greater impact on fps. Of course, if you exceed your VRAM capacity that will kill your frame rate, but lately VRAM is mostly a marketing number. At 1080p, 2 GB of VRAM is ample, and in most cases 1 GB is too. But a 4-6 GB VRAM GPU with SLOW clocks is going to suck regardless; likewise, a 6 GB card won't run 1080p any better than a 2 GB card at the same clocks. Anyway, everything needs to be extreme to run 4K at 60 fps right now, and that's not even taking HDMI/DVI cabling into account. 4K @ 60 fps: TVs can't even do it, let alone anemic console hardware, or even typical high-end PC hardware for that matter. An extreme PC can pull it off, and so far the only person I know with bragging rights to that is ATIElite.
This really depends on what you do. One GB of really fast GDDR is great for frame rate, and for textures which fit into the GDDR. But for high-detail open-world scenarios, where e.g. (exaggerated) each rock has its own 2K texture, a PC with a 1-2 GB card will run into PCIe bandwidth bottlenecks, because the game will have to swap textures to and from system memory. No matter how fast your GPU is (clock speed, shaders), if you can't get the data into GDDR fast enough it will stall. Most of those games are designed so that they don't need more than 1-2 GB of VRAM at any given time, swapping only when a new level is loaded, or "streaming" from DDR; that's why you won't notice. Textures are reused and geometry is reused (instanced), all to limit memory usage. For open-world games, one console today has a huge advantage in being able to use almost 6 GB of texture memory at any given time. Those games won't use all the memory for static textures, but you can safely assume that you can fill 4 GB at any given time; this will probably be true for exclusive titles only. Look at KZ:SF, for example; ever wondered why each particular environment (e.g. the forest) looks unique? Even a Titan with 3 GB can't compete. The other problem is that developers can't rely on a PC having that much VRAM. Some engines store multiple resolutions of textures and load the lower res accordingly. But yet again, I guess 80-90% of PC graphics cards have between 1-2 GB of GDDR.
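The "each rock has its own 2K texture" scenario is easy to quantify with a back-of-envelope count of how many unique textures a given VRAM budget can hold. This ignores compression and mipmaps, which real engines rely on, so treat the numbers as upper-bound illustrations only:

```python
# How many unique uncompressed 2K RGBA textures fit in a VRAM budget?
# Back-of-envelope only; real engines use compressed formats and mipmaps.
def textures_that_fit(vram_gb: float, side: int = 2048,
                      bytes_per_pixel: int = 4) -> int:
    tex_bytes = side * side * bytes_per_pixel   # 2048*2048*4 = 16 MiB
    return int(vram_gb * 1024**3 // tex_bytes)

print(textures_that_fit(1))   # 64 unique textures in a 1 GB card
print(textures_that_fit(6))   # 384 in the ~6 GB a console could expose
```

64 unique 2K textures per GB is why a 1-2 GB card must constantly reuse and swap assets, while a 4-6 GB pool lets every environment look distinct.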
Heh, you almost had them admit it ;) Then they suddenly realised it and went "WAIT A MINUTE! RAAAAAH!". Streaming pixels and streaming xyz values are about the same thing. If the water movement is not that latency-dependent, it will work more than fine. For instance, the water movement in a distant waterfall, or the waves of an ocean outside the area your ship affects locally, can all be calculated like a stream and the result sent to the client. Furthermore, to the computer scientist in this discussion: you do know that a lot of the matrices involved are sparse, and thus most of the values need no calculation, nor to be sent back and forth; you only need to send the differences. Now, I'm not saying the "cloud" that Microsoft paints is real either. One has to be very careful and think very hard about what you can offload to it. Simplest would be some AI, like Forza 5 does. This can be extended to simulating a whole city, with the AI that is further away not being updated as often, or not requiring as low a latency as those closer by (who might even need to be calculated by the console itself). Approached creatively, cloud computing can add some nice benefits to gaming, even with a slow connection.
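The "send only the differences" point above can be sketched in a few lines: the server computes new state, but only cells that actually changed are transmitted back, since most of the state is unchanged between updates. The state keys and threshold here are illustrative assumptions:

```python
# Sketch of the "send only the differences" idea for offloaded simulation.
# The server computes new state; only entries that changed beyond a
# threshold are sent back, since most of the state is unchanged (sparse).
def delta_update(old: dict, new: dict, eps: float = 1e-6) -> dict:
    """Return only the entries of `new` that differ from `old`."""
    return {k: v for k, v in new.items()
            if abs(v - old.get(k, 0.0)) > eps}

old_state = {"h_0": 1.0, "h_1": 2.0, "h_2": 3.0}   # e.g. wave heights
new_state = {"h_0": 1.0, "h_1": 2.5, "h_2": 3.0}   # only one cell moved
patch = delta_update(old_state, new_state)
print(patch)             # {'h_1': 2.5} -- far smaller than the full state
old_state.update(patch)  # client applies the patch locally
```

This is the same principle behind delta compression in networked games generally: bandwidth scales with what changed, not with the size of the simulation.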
I'm no computer scientist, but if what you say is true then I came out of reading your post with a little more understanding of what cloud computing can do for us. So technically we could potentially see a much richer, more diverse and interwoven city population in games like GTA in the future, thanks to offloading AI without impacting system resources. As for the networking side of cloud computing, companies would be cutting off whole sections of the world. Taking South Africa as an example: I travel out there very often and have made quite a few friends across many walks of life, and one thing they all have in common is terrible connections, with heavy throttling, port shaping and silly caps, and for this they pay an absolute fortune compared to people elsewhere (Europe and NA). Until governments get it into their heads that the internet has become an essential commodity for business and home life, and force ISPs to offer good services at fair prices, the world will be too segmented for a global roll-out. Even here in the UK, the difference of a couple of miles can mean the difference between a 40 Mb line and a 256 Kb line (speaking from experience: I moved 6 months ago and went from paying £14.99 a month for 256 Kb, advertised as 2 Mb but never tested above 280 Kb, to paying £8 a month for 40 Mb, with a true speed around 38 Mb).
Not sure if you'd be able to load data fast enough from an online server. Online storage is one thing; online RAM is another entirely. Though I guess if you do the computing online as well, it's possible.
To me, Games are about gameplay and not the fancy graphics. Not everything needs photo-realistic graphics... EDIT: Though if they can balance the graphics with the fun factor, Then I'm all for it but I don't like being bored to death with the gameplay because the graphics took priority...
To me it's about the wow factor, which this generation lacks because they chose cheap hardware over powerful hardware. No excitement = no fun to me. I envy you xP
And all you console gamers were saying how expensive the PC was to buy and now it's not when it comes to PC parts in a console? I wish you'd make up your mind :)
@Seafort exactly. They became the thing they hated so much when comparing console specs rather than games/gameplay possibilities. It's kinda ridiculous actually, because they often contradict themselves in a single sentence because of it...
Why all the disagrees? I said "to me", which is opinion :P As for you guys below, you misunderstood; the situation I'm talking about is different. When the PS3 released, the games shocked us; it was a huge step. The PS4 wasn't as impressive. That's what I'm saying. We all know PC is more powerful and upgradable, but I prefer the PS4 because we get awesome exclusives :P If I have extra money I'll buy a PC for the better versions of multiplatform games (which won't happen for 2 years).
Hey, go play some KZ:SF. Seriously. Sure, everybody can pretend this doesn't have a wow factor, but if you say that, you weren't really interested in the first place. You need to want to get "wowed", which I find people don't really want any more. And don't give me that "I wasn't impressed because I got my 5 minutes of hands-on at a friend's house" excuse.
I finished Killzone Shadow Fall. It's the most beautiful game I've ever seen so far, but the gameplay and story weren't as good as the previous games. That's what killed it for me; it feels like a graphics demo. But it's better than playing a Call of Duty story :P So yes, I'm not as impressed as I was with Killzone 2. It's the start of a new generation, so I was expecting something impressive, which I didn't get. I know I will get it in the future, though... :P
"So yes. I'm not as impressed as killzone 2. It's the start of a new generation.." Thats exactly it though, its the start of the generation. Killzone Shadowfall was a launch title, Killzone 2 came out almost 2 1/2 years after ps3's launch. Just wait for devs to truly make these systems sing, i cant even imagine the last of us like swansong they make for ps4 :O
I want good graphics AND good gameplay.
That's the thing, right? Why do people seem to think those two things are mutually exclusive?! :S For one, if the suggested fluidity of FF15 is a reality, then that game will be phenomenal in both areas.
Crytek do a decent enough job at demonstrating why that's incredibly difficult to achieve.
What you've seen from the UE4 demos is what we should be expecting, maybe a little more or less, for both consoles and probably even the Wii U. Remember the UE3 demo: Gears of War pretty much matched it, and when Gears 3 came it surpassed it, so we know from the past that Epic wouldn't put up bullshots. Yeah, the demos ran on expensive graphics cards, but that's because they were rushed out to show the public. UE is made to be optimized; it's the best engine for optimizing.
Watch the UE3 Samaritan demo... tell me if we've seen anything within a hundred million friking miles of that.
@Beast, well duh, UE4 isn't available yet. The first major game won't be available til late 2014 I think, except for an indie horror game coming in the spring, and that's just the tip of the iceberg. Once we get skilled programmers involved it should look like the demos. Just look at the Black Tusk game running in-engine.
Sorry you need both gameplay and fancy graphics which = the overall package.
Bubble up...well said!
What's your point? No one said anything about ALL games maxing out the RAM. Why is it that some people feel the need to make a "gameplay before graphics" comment EVERY TIME someone so much as MENTIONS graphics? NO ONE is saying that graphics are more important.
@Seafort and andrewer: and you are generalizing an entire population, as well as trying to argue about the PS4 having PC parts? It's not a PC; it has PC parts, and it's cheaper than a good gaming PC. What is your argument? Did you just compliment consoles while trying to insult them? I'm sorry, I don't see the point in you acting like ignorant children about platforms. I see great games to play on consoles that are only on consoles, and consoles have great graphics even if PC has better graphics. I have 161 games on Steam and counting, as well as games not on Steam, but I don't post negative generalizations on the internet for attention because I prefer a specific platform.
Well said. I have many consoles: PS3, Nintendo Wii, Wii U, Xbox 360, PS4 and so on, and also a gaming PC with an i5 4670K and a GTX 780 3GB. We tested BF4 on two Samsung LED TVs side by side: PC (with ultra graphics) vs PS4. We found the only big difference was the price: PC 1500 euros, PS4 400 euros. I am just tired of hearing bullshit from PC-only users. And these are the first games for the PS4; think about 2 years from now, the PS4's future is looking good.
The cloud will give more power
I do hope that's sarcasm
Nope, sephiroth is stronger
You mizzsplat POWAH!!!111!11!1! Derp.
Lol... Reminds me of "POWAH OF THE CELL" from ps3.
@etownone Your comment reminded me of this meme http://global3.memecdn.com/...
Big ass "duh". Any game artist, anyone with a remote understanding of how these things work knows that we are really not at a point where we have nearly enough memory and bandwidth to do absolutely everything we want to do. Doesn't mean it's not a great leap from just having 512mb though.
It's a great leap, but given the power of next-gen consoles, developers will aim for more. So it will be a repeat of what we already know: more is eventually less.
It's a great leap. But IMO artists now throw polygons at the problem where before it actually required creative thinking to create the same effect. It is really strange that the very same people somehow forgot how to solve this problem. The tools are getting more powerful, but at the same time this responsibility is delegated to the tools, generating suboptimal results. It's like how first-generation compilers created shitty code and still required hand optimization. Now they have these fancy tools which create shitty asset utilization, and nobody really thinks they need to do low-level (polygon/texture) optimization. I guess there will be lessons learned from an early launch generation nonetheless. Two years from now, when they (we) finally accept those new boundaries, the games will blow us away.
Better hardware making programmers less hackish, or "lazy", or "less able to solve problems" is nothing new. Same could be said about NES and SNES programmers simply being able to throw bitmaps on the screen when compared to mapping objects to scan lines on the Atari 2600 because there was not enough memory to handle a screen full of data at once.
Sorry screen jumped
Well, GTA V with all its size and glory was done in 512 MB, and TLOU in 512 MB too (actually 256 on PS3 and 310 on Xbox 360). With good work and optimization, 8 GB is more than enough.
Hopefully the developers are not lazy this time around.
Same thing I was feeling while reading the article. Lazy developer alert.
I'm expecting the developers to get even lazier this time around, as they don't have to fight to fit all their textures into the RAM at their disposal. So optimisation will go out the window, and brute-forcing their way to better performance will be much more common. I mean, look at CoD Ghosts: the minimum required RAM on PC was 6 GB until they changed that in a patch, because the devs were inflating the system specs required to run the game to make it seem like a "next-gen game".
I thought it was 512 MB on the Xbox 360 (the reason Gears of War was on the Xbox 360). I want to say MS was going with 256 MB, and Epic stressed more RAM, so they jumped it to 512 MB.
Last gen consoles only had 512MB total and then you have to take away for the os and other tasks it performs making the useable memory for devs around half.
The 360 had eDRAM, which helped.
Oh, ok, yeah, I forgot the OS uses some as well.
The RSX can read/write the 256 MB of XDR (with the same bandwidth as the GDDR). So, from a graphics perspective, the PS3 has the full 512 MB of RAM. Especially relevant because you mentioned TLoU: ICE is one of the groups extensively using XDR/SPUs for graphical effects.
good insight, bubble up
Since when was TLOU on the 360? Secondly, you console gamers have never entered the realm of RAM. What you need to learn is: 1) RAM finishes EXTREMELY quickly with high resolutions and high-resolution textures. 2) RAM finishes very quickly when a world is large. 3) RAM is not affected too badly by polycount; instead it is affected by textures (because on one model devs can apply parallax diffuse, parallax specularity, parallax normals, etc.). We can expect to see these things in the future: larger worlds, higher resolutions and far crisper models (again, usually not from polycount but from higher-resolution normal mapping). With 8 GB of RAM these tasks won't be held up too greatly (then again, these things WILL significantly improve next gen, but it won't be such a massive jump as PS2-to-PS3 or PS1-to-PS2). If you ever watch one of Unreal Engine's optimization videos, they will tell you to use textures in mobile games EXTREMELY conservatively, because THIS is what uses the RAM, along with the other things I listed. Secondly, devs have already optimized for x86 over the past 5 years, so there will be very little optimization left. Basically, what I'm getting at is that this generation will be more of a step where PS2-to-PS3 was a leap; it won't be something mindblowing or crazy. What we are seeing now with The Witcher 3 and The Division will only improve slightly by the end of the generation. Also, I have a PC with 16 GB of RAM AND 1.25 GB of VRAM, and I can EASILY reach about 13 GB of RAM and 1 GB of VRAM usage (that's when the OS is taking around 4). I know you will disagree with me, but if you do, PLEASE provide a reason.
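The claim that textures, not polycount, dominate memory is easy to illustrate with rough numbers. The vertex layout, vertex count, and texture set below are illustrative assumptions, not from any particular game:

```python
# Rough comparison (illustrative numbers): geometry vs texture memory cost.
VERTEX_BYTES = 32                 # position + normal + UV, a common layout
verts = 100_000                   # a fairly dense character model
mesh_mb = verts * VERTEX_BYTES / 1024**2

TEX_SIDE = 4096                   # one uncompressed 4K RGBA map
tex_mb = TEX_SIDE * TEX_SIDE * 4 / 1024**2
maps = 3                          # diffuse + normal + specular maps

print(f"mesh: ~{mesh_mb:.1f} MB")            # ~3.1 MB for the geometry
print(f"textures: ~{maps * tex_mb:.0f} MB")  # ~192 MB -- textures dominate
```

Even before compression, a single model's texture set here outweighs its geometry by a factor of roughly 60, which is exactly why engine optimization guides focus on texture budgets rather than polycounts.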
" Since when was TLOU on the 360?" i was meaning gta v for 360/ps3 and TLOU for ps3 "(then again these things WILL SIGNIFICANTLY improve next gen but wont be such a massive jump from ps2-ps3 or ps1-ps2)" that's not because of X86 or powerpc, that's becuase it's the nature of vidoegame graphics advancement "secondly devs have already optimized for x86 the past 5 years so there will be very little optimization left" some devs already optimize it, sony, MS and some devs said ps4/X1will be fully optimized in 4-5 years, plus it's great when u have a high selling easy to develop for fixed units (nextgen consoles), it will bring a new world, many small devs will be able to do magic like never before "Basically what we are seeing now with witcher 3