Submitted by Nike 774d ago | news

Havok on PS4 & Xbox One Memory: Will Take No Time For Artists to Use All RAM, Even 8 GB

"Though it's been a little over a month since the Xbox One and PS4 released, there are still doubts whether memory will be sufficient to last an entire generation." (Havok, PS4, Xbox One)

HelpfulGamer  +   774d ago
In a common 3D SPH fluid physics simulation, it is fairly easy to max out 8 gigabytes of RAM. A new, more efficient programming solution is needed to not only reduce memory usage but also speed up the simulation.

Another method would be the Cloud solution, such as Gaikai.

AMD's TressFX is awesome. I wonder what other physics tech AMD is working on; I hope it's destructible environments & SPH physics.

I'm still waiting for AMD's GPGPU tech demo showing Ruby fighting in the rain, with visible rainwater splashing on her body, dripping down her tight battlesuit, for the advancement of Science! :D
#1 (Edited 774d ago ) | Agree(33) | Disagree(23) | Report | Reply
nypifisel  +   774d ago | Well said
As a network technician I don't see server-side calculations coming to fruition any time this generation due to the undeveloped internet infrastructure around the world. Streaming full content (Gaikai), on the other hand, is feasible.
Giul_Xainx  +   774d ago
Agreed with nypifisel.

Sending computational data across a network for the unit to deconstruct just sounds risky, as a sudden loss of packet data could totally screw up a video game if it is constantly polling this information over the internet. Cloud computing vs. cloud gaming... I'd go with the much more stable platform, because it has been proven (Gaikai).
specialguest  +   774d ago
Yes. Here in the US I have Time Warner and the freakin' connection drops often due to the old cable line infrastructure.
MysticStrummer  +   774d ago
"I don't see server sided calculations coming into fruition any time this generation due to the undeveloped internet infrastructure in the world."

Exactly what I've been saying since all the cloud talk started. I'm amazed you have no disagrees at the time I'm posting this.
mcgrottys  +   774d ago
I think it can work, you just need to split the game's world into what needs to be computed locally and what should be computed via the cloud.

Things that should be done locally are everything within the vicinity of the player that needs to be constantly updated: visual things like lighting, as well as things that affect gameplay like physics or some of the enemy AI.

What I think the cloud has the potential to do really well is AI. Latency can seem like a big issue with AI, but it doesn't have to be. For online games, people's ping is usually between 100-250 ms on average, while with a good connection you can average closer to 30-50 ms. The human mind, however, takes about 300-700 ms to make a decision. Therefore we could possibly have the computing for AI done via the cloud and have it come back faster than it would take the human mind to make a decision; it really just depends on how fast the servers are.

Here is an example of how to mix local and cloud computing.

Here's what I would like to see in a sandbox like the next Fallout: new enemies that stalk you for days. The cloud is used to track where and how you travel, and these new stalkers can use that data to follow you and plan to ambush you. When they make contact with you, that is when your console can use more resources for AI. Even in combat the AI can still have pathfinding done through the cloud, so they can change their positions or have an escape plan for any situation.

Sure, the U.S. might not have the best network infrastructure, but gaming is quite popular in Japan as well, so I expect to see some crazy stuff coming from there soon.
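For what it's worth, mcgrottys' latency-budget argument above can be sketched in a few lines of Python. The function name and numbers below are illustrative assumptions taken from the comment, not measurements:

```python
# Hypothetical sketch of the latency-budget argument: cloud AI is viable
# when the network round trip plus server compute time stays under the
# ~300-700 ms it takes a human to make a decision.

def cloud_ai_is_viable(ping_ms, server_compute_ms, human_reaction_ms=300):
    """Return True if a cloud AI decision would plausibly arrive before
    the player could notice the delay."""
    round_trip_ms = ping_ms + server_compute_ms
    return round_trip_ms < human_reaction_ms

# A good connection (~40 ms ping) leaves plenty of budget:
print(cloud_ai_is_viable(40, 100))   # True
# A typical 250 ms ping with slow servers does not:
print(cloud_ai_is_viable(250, 100))  # False
```

As the comment says, it really just depends on how fast the servers are: the server compute term is the one the platform holder controls.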
P0werVR  +   774d ago
It will do fine. You guys' sentiments are no different from doomsayers' whenever something new is on the rise and most people just don't know better. This is Microsoft, one of the biggest corporations. They have the resources to weather any transition. If anything, compared to Sony they are truly taking innovative risks for the gaming industry, and I believe they will succeed.

I'll look like a fool now with that statement but I'll have a fat smile on my face when you see some amazing titles come out.
#1.1.5 (Edited 774d ago ) | Agree(6) | Disagree(21) | Report
sonic989  +   774d ago
Disagrees will come regardless of what you say.
I am a computer scientist and I agree with what you just said.
Right now, algorithms and theories about that happening are just a fantasy. You would need a super-stable connection that works exactly like the components inside the hardware (speed, bandwidth, consistency, etc.).
Some people might ask why.
The answer is software interaction. Streaming something to you is far easier than interacting with the software environment, because the devs take into consideration every possible situation, including the worst-case scenario. Sending data out over the network is just like gambling with the data: you don't know when it will come back to you, maybe 20 ms before the event or 1 s after the event. That's why right now it's very hard to do something serious with the cloud, except for things like background updates, draw distance, and players' interactions. That's what comes to the top of my head at the moment.
#1.1.6 (Edited 774d ago ) | Agree(7) | Disagree(7) | Report
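sonic989's "gambling with the data" point is essentially a deadline problem. A hypothetical sketch of the usual workaround: use the cloud answer only if it arrives in time, otherwise fall back to a cheap local approximation (all names and numbers here are illustrative):

```python
# A game tick cannot block on a cloud result, so a common pattern is to
# accept the cloud answer only if it beats the tick's deadline and to
# simulate locally otherwise.

def resolve_tick(cloud_result, arrival_ms, deadline_ms, local_fallback):
    """Use the cloud result only if it arrived before this tick's deadline."""
    if cloud_result is not None and arrival_ms <= deadline_ms:
        return cloud_result
    return local_fallback  # stale or missing data: fall back to local sim

# Result arrived well before a 33 ms (30 fps) tick deadline -> usable:
print(resolve_tick("cloud_path", 13, 33, "local_path"))    # cloud_path
# Result arrived a second late -> ignored:
print(resolve_tick("cloud_path", 1033, 33, "local_path"))  # local_path
```

This is why "background updates, draw distance" style work suits the cloud: a late answer degrades quality instead of breaking the simulation.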
3-4-5  +   774d ago
Artists have been using pencils and paints for centuries, yet we still have people creating things that have never been accomplished before.


What we can do with 8 GB of RAM now is different from what we can achieve with it 6 years from now.

Person A has a $50,000 music studio, but their music sounds like crap.

Person B has $500 in recording gear and it sounds beautiful.

What you have helps, but it's how you use what you have.
johny5  +   774d ago

Look at what happened with the XBOX ONE fiasco and its plan to work only online, and consider the fact that if consoles had adopted the cloud they wouldn't have sold as much or as fast as they did at launch!

Companies are trying their hardest to push for an always-online, one-service solution, and the truth is people will not accept it, because it's costly and there's almost no advantage to going digital given price fixing and download sizes!
nypifisel  +   774d ago
Those of you who say this will work just fine seriously don't know what you're talking about and have no knowledge in the subject other than "But MS said so". The idea mentioned above about a consistent, stalking enemy, for instance, is just a waste of resources. Why spend computational power on that when, for the player, an enemy spawned at random times is just the same?! You wouldn't be able to tell whether this enemy stalked you or not.

Sonic989 made a great point. I'll develop it further with an open question: What is the bandwidth of your RAM? Now what is your internet bandwidth?

Hint: the world's average internet speed today is around 2 Mbit/s, or for my argument's sake around 250 KB/s (0.25 MB/s). The XBO's system memory runs at 68 GB/s (~69,000 MB/s), the PS4's at 176 GB/s (~180,000 MB/s), and only the PS4's memory is considered sufficient to move big assets like textures fast enough.

Now, this is a very crude and not wholly accurate example, but it's just for you to realize the difference we're talking about here.
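nypifisel's hint can be made concrete with a few lines of arithmetic, using the figures quoted in the comment (2 Mbit/s average internet, 68 GB/s and 176 GB/s memory bandwidth):

```python
# How many times faster is console system memory than an average
# internet connection? Figures are the ones quoted in the thread.

internet_bytes_per_s = 2_000_000 / 8   # 2 Mbit/s ~= 250 KB/s
xbo_bytes_per_s = 68 * 1_000_000_000   # 68 GB/s
ps4_bytes_per_s = 176 * 1_000_000_000  # 176 GB/s

print(f"XBO RAM: {xbo_bytes_per_s / internet_bytes_per_s:,.0f}x faster")
print(f"PS4 RAM: {ps4_bytes_per_s / internet_bytes_per_s:,.0f}x faster")
# XBO RAM: 272,000x faster
# PS4 RAM: 704,000x faster
```

A five-orders-of-magnitude gap, which is the whole point: the internet cannot stand in for local memory, only feed it small, latency-tolerant results.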
Evilsnuggle  +   773d ago
This is blind fanboyism. The cloud cannot be used to improve a game's graphics; it's not possible with the current internet infrastructure. Your internet isn't fast enough. This is ridiculous: even if your internet were the fastest, it's not stable. Internet speed fluctuates depending on how many people are on it and how far away you are from the hub, unless you have fiber optics or a dedicated line. Please stop listening to M$ marketing PR spin. There have been several articles written debunking this cloud nonsense, one by Digital Foundry. See "TRiG ep 12 - Tearing Down Xbone's Cloud" on YouTube.
#1.1.10 (Edited 773d ago ) | Agree(1) | Disagree(0) | Report
zerog  +   774d ago
I saw an article on here a few weeks ago about a programming solution where hi-def textures could be streamed as needed from the hard drive instead of run from system memory. I'm no programmer myself, but the article claimed this method could save up to 75% in resources, so it sounded good to me.
Ju  +   774d ago
Streaming of textures from HDD is common already. Most games do this these days. That's not really the biggest issue any more. 4GB "cache" for textures is quite sufficient.

The problem I see is that, with the memory at hand, some people throw their brains away and forget what we did for the last 15 years, thinking "hey, awesome, we don't need to optimize any more".

It's quite the opposite. Things are getting much more complex. Optimization is needed more than ever.
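The texture "cache" Ju mentions is typically a least-recently-used scheme: keep a fixed RAM budget and evict the coldest texture when streaming a new one in from disk. A toy sketch (class name and sizes are made up for illustration):

```python
# Minimal LRU texture cache: textures within budget stay resident;
# requesting a new one past the budget evicts the least recently used.
from collections import OrderedDict

class TextureCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self.textures = OrderedDict()  # name -> size_mb, coldest first

    def request(self, name, size_mb):
        if name in self.textures:
            self.textures.move_to_end(name)  # already resident: mark hot
            return "hit"
        while self.used_mb + size_mb > self.budget_mb:
            _, evicted_mb = self.textures.popitem(last=False)  # evict coldest
            self.used_mb -= evicted_mb
        self.textures[name] = size_mb  # "stream" it in from HDD
        self.used_mb += size_mb
        return "streamed"

cache = TextureCache(budget_mb=100)
print(cache.request("rock_diffuse", 40))   # streamed
print(cache.request("rock_normal", 30))    # streamed
print(cache.request("rock_diffuse", 40))   # hit
print(cache.request("grass_diffuse", 50))  # streamed (evicts rock_normal)
```

Which is why a 4 GB "cache" can feel sufficient: only the working set of visible textures needs to be resident at once.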
zag  +   774d ago
PC games have done that for the last 4-5 years.

Rage is a good example of using this, while also having the gfx card compress textures as needed.
#1.2.2 (Edited 774d ago ) | Agree(0) | Disagree(4) | Report
FITgamer  +   774d ago
This feature has already been in use. Black Ops 2 used it on last-gen consoles.
AaronMK  +   774d ago
As mentioned, texture streaming has been common for a while. I think we are seeing more articles on it lately because the recently released DirectX 11.2 and OpenGL 4.4 specs provide direct support for a lot of what used to have to be implemented in game code and middleware.

Won't change last gen, but should be another tool for next-gen titles.
zag  +   774d ago

There are plenty of PC games with 2000x2000, and some with 4000x4000, textures working fine on a 1 GB gfx card.

It's only in the last 2-3 years that PC gfx cards have had more than 1 GB on board to hold more textures, in case any games ever came out to use it up.

This is a poorly researched story that really is written by someone who has no idea what they are writing about.
Ipunchbabiesforfun  +   774d ago
Those PCs have graphics cards that are far more powerful and have dedicated RAM just for graphics. They have CPUs, 2-3 times as powerful as what's in an Xbox One or a PS4, that offload tasks. They ALSO have 4-24 GB of dedicated system RAM that can be used as well. It's a totally different scenario; there is a reason PCs can do far more things than consoles.
thehitman  +   774d ago
You can't get gaming at that resolution on a 1 GB GDDR5 graphics card, sorry, don't know where you're pulling that from. At best you need 2 GB of GDDR5 just for watching movies and such. If you want to play a game at 20-30 fps at that resolution, the best GPUs use something like 4-6 GB of GDDR5 memory. Then if you want 60 fps you need like 2-3 of those babies next to each other, which can cost you upwards of 2400-3k dollars in GPU costs alone.
FragMnTagM  +   774d ago
@The Hitman,


Do some research bud, because you are waaaaaaaaaaaaaaaaaaaaaaaaaaay off.

I have a single 660 Ti that can handle 60 fps in nearly every game I throw at it. Some games go well over 200 fps (think Minecraft). Movies are nothing on my card, even pushing 1080p in 3D. The card happens to have 3 GB of GDDR5, but very few games use more than 1 GB at the moment.

You can literally buy a 200-300 dollar computer that has at least an i5 and 8 GB of RAM, slap in a 2-300 dollar graphics card and a better power supply (usually around 50 bucks), and play everything on the market right now.

Please know what you are talking about before you spout nonsense.
thehitman  +   774d ago
@ Frag

Do some research?

The 660ti has 2gb GDDR5 at least.

At 2560x1600, which is still significantly lower than 4K resolution, BF3 on High (not even Ultra) settings runs at about 38-45 FPS. Putting it on Ultra would probably shave off 10 FPS, then upping the resolution to 4K would take off another 15-20 FPS. So instead of you spouting off bullshit about your probably pretend 660 Ti, there is no way you can reach 60 fps at 4K resolution without spending some real cash on multiple very high-end GPUs.

I should have known better when you mentioned Minecraft as a means of benchmarking /facepalm.

So like I said before, there is no way you're pushing 2-4K resolution through 1 GB of GDDR5 like zag said. Any and every benchmark test you will find says otherwise, unless you're playing Facebook games, ya sure.
tee_bag242  +   774d ago
Rather than VRAM, GPU shader/core clocks have a FAR greater impact on FPS. Of course, if you exceed your VRAM capacity that will kill FPS, but mostly VRAM has been a marketing number of late. At 1080p, 2 GB of VRAM is ample, and in most cases 1 GB is too. But a 4-6 GB VRAM GPU with SLOW clocks is going to suck regardless. Also, a 6 GB card won't run 1080p any better than a 2 GB card at the same clocks.

Anyway, everything needs to be extreme to run 4K at 60 fps right now, and that's not even taking HDMI/DVI cabling into account.
4K @ 60 fps: TVs can't even do it, let alone anemic console hardware, or even typical high-end PC hardware for that matter. An extreme PC can pull it off, and so far the only person I know who has bragging rights to that is ATIElite.
#1.3.5 (Edited 774d ago ) | Agree(2) | Disagree(0) | Report
Ju  +   773d ago
This really depends what you do.

One GB of really fast GDDR is great for frame rate and for textures that fit into the GDDR. But for high-detail open-world scenarios, where, e.g. (exaggerated), each rock has its own 2K texture, a PC with a 1-2 GB card will run into PCIe bandwidth bottlenecks, because the game will have to swap textures to and from system memory.

No matter how fast your GPU is (clock speed, shaders), if you can't get the data into GDDR fast enough it will stall.

Most of those games are designed so that they don't need more than 1-2 GB of VRAM at any given time, swapping only when a new level is loaded, or "streaming" from DDR. That's why you won't notice. Textures are reused, geometry is reused (instanced), all to limit memory usage.

For open-world games, one console has a huge advantage today, being able to use almost 6 GB of texture memory at any given time. Those games won't use all the memory for static textures, but you can safely assume you can fill 4 GB at any given time; this will probably be true for exclusive titles only. Look at KZ:SF, for example. Ever wondered why each particular environment (e.g. in the forest) looks unique?

Even a Titan with 3GB can't compete. The other problem is, developers can't rely on a PC having that much VRAM. Some engines store multiple resolutions of textures and load the lower res accordingly. But then again, I guess 80-90% of PC graphics cards have between 1-2 GB of GDDR.
#1.3.6 (Edited 773d ago ) | Agree(1) | Disagree(0) | Report
Dehnus  +   774d ago
Heh, you almost had them admit it ;). Then they suddenly realised it and went "WAIT A MINUTE! RAAAAAAAAH!".

Streaming pixels or streaming xyz values is about the same thing. If the water movement is not latency-dependent then it will work more than fine. For instance, the water movement in a distant waterfall, or the waves of an ocean outside of the local effect of your ship, can all be calculated like a stream and the result sent to the client.

Furthermore, to the computer scientist in this discussion: you do know that a lot of the matrices involved are "sparse", and thus most of those values need no calculation and don't need to be sent back and forth? You only need to send the differences.

Now, I'm not saying the "cloud" that Microsoft paints is true either. One has to be very careful and think very hard about what you can offload to it. The simplest thing would be some AI, like Forza 5 does. This can be extended to simulate a whole city, with AI that is further away not being updated as often or requiring as low a latency as those closer by (who might even need to be calculated by the console itself). But approached creatively, cloud computing can add some nice benefits to gaming, even with a slow connection.
#1.4 (Edited 774d ago ) | Agree(1) | Disagree(0) | Report | Reply
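Dehnus' "send only the differences" idea can be illustrated with a toy delta encoder (the state keys and values below are hypothetical):

```python
# Instead of shipping the whole simulation state every tick, the server
# sends only the values that changed, which stays tiny on a slow link.

def delta(prev_state, new_state):
    """Return only the entries that changed since the last tick."""
    return {k: v for k, v in new_state.items() if prev_state.get(k) != v}

def apply_delta(state, changes):
    """Merge the received changes into the client's copy of the state."""
    return {**state, **changes}

prev = {"wave_0": 1.0, "wave_1": 2.0, "wave_2": 3.0}
new = {"wave_0": 1.0, "wave_1": 2.5, "wave_2": 3.0}

changes = delta(prev, new)
print(changes)                             # {'wave_1': 2.5} -- one value, not three
print(apply_delta(prev, changes) == new)   # True
```

With genuinely sparse updates (most of the world unchanged each tick), the payload scales with what moved, not with the size of the world.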
liquidhalos  +   773d ago
I'm no computer scientist, but if what you say is true then I came out of reading your post with a little more understanding of what cloud computing can do for us.

So technically we could potentially see a much richer, more diverse and interwoven city population in games like GTA in the future, thanks to offloading AI without impacting system resources.

As for the networking side of cloud computing, companies would be cutting off whole sections of the world. Taking South Africa as an example: I travel out there very often and have made quite a few friends across many walks of life, and one thing they all have in common is terrible connections with heavy throttling, port shaping, and silly caps, for which they pay an absolute fortune compared to people elsewhere (Europe and NA).

Until governments get it into their heads that the internet has become an essential commodity to business and home life and force ISPs to offer good services, at fair prices, the world will be too segmented for global roll-out.

Even here in the UK, the difference of a couple of miles can mean the difference between a 40 Mb line and a 256 Kb line. (Speaking from experience: I moved 6 months ago and went from paying £14.99 a month for 256 Kb (advertised as 2 Mb*, never tested above 280 Kb/s) to paying £8 a month for 40 Mb, with a true speed around 38 Mb.)
Tommykrem  +   773d ago
Not sure if you'd be able to load data fast enough from an online server. Online storage is one thing; online RAM is another entirely. Though I guess if you do the computing online as well, it's possible.
GentlemenRUs  +   774d ago
To me, Games are about gameplay and not the fancy graphics.

Not everything needs photo-realistic graphics...

EDIT: Though if they can balance the graphics with the fun factor, then I'm all for it. I just don't like being bored to death by the gameplay because the graphics took priority...
#2 (Edited 774d ago ) | Agree(26) | Disagree(15) | Report | Reply
Salooh  +   774d ago
To me it's about the wow factor, which this generation lacks because they chose cheap hardware over powerful hardware. No excitement = no fun to me. I envy you xP
Seafort  +   774d ago
And all you console gamers were saying how expensive the PC was to buy, and now it's not when it comes to PC parts in a console?

I wish you'd make up your mind :)
andrewer  +   774d ago
@Seafort exactly. They became something they hated so much by comparing the consoles' specs rather than games/gameplay possibilities. It's kinda ridiculous actually, because they often contradict themselves in a single sentence because of this...
Salooh  +   774d ago
Why all the disagrees? I said "to me", which is an opinion :P

As for you guys below: you misunderstood. The situation I'm talking about is different. When the PS3 released, the games shocked us; it was a huge step. The PS4 wasn't as impressive. That's what I'm saying.

We all know the PC is more powerful and upgradable. But I prefer the PS4 because we get awesome exclusives :P. If I have extra money I will buy a PC for better versions of multiplatforms (which won't happen for another 2 years).
Ju  +   774d ago
Hey, go play some KZ:SF. Seriously. Sure, everybody pretends this doesn't have a wow factor; if you say that, you weren't really interested in the first place. You need to want to get "wowed", which I find people don't really want any more. And don't give me that "I wasn't impressed because I got my 5 minutes hands-on at a friend's house" excuse.
#2.1.4 (Edited 774d ago ) | Agree(13) | Disagree(3) | Report
Salooh  +   774d ago
I finished Killzone Shadow Fall. It's the most beautiful game I've ever seen so far, but the gameplay and story weren't as good as the previous games. That's what killed it for me. It feels like a graphics demo. But it's better than playing a Call of Duty story :P.

So yes, I'm not as impressed as I was by Killzone 2. It's the start of a new generation, so I was expecting something impressive, which I didn't get. I know I will get it in the future though.. :P
bratman  +   774d ago
"So yes. I'm not as impressed as killzone 2. It's the start of a new generation.."

That's exactly it though, it's the start of the generation. Killzone Shadow Fall was a launch title; Killzone 2 came out almost 2 1/2 years after the PS3's launch. Just wait for devs to truly make these systems sing. I can't even imagine the The Last of Us-like swansong they'll make for PS4 :O
TheLostCause  +   774d ago
I want good graphics AND good gameplay.
nypifisel  +   774d ago
That's the thing, right? Why do people seem to think those two things are mutually exclusive?! :S I for one think that if FF15's suggested fluidity is a reality, then that game will be phenomenal in both areas.
Reverent  +   774d ago
Crytek do a decent enough job of demonstrating why that's incredibly difficult to achieve.
Hercules189  +   774d ago
What you've seen from the UE4 demos is what we should be expecting, maybe a little more or less, for both consoles and probably even the Wii U. Remember the UE3 demo: Gears of War pretty much matched it, and when Gears 3 came it surpassed it, so we know from the past that Epic wouldn't put up bullshots. Yeah, they ran on expensive graphics cards, but that's because they rushed it to show to the public. UE is made to be optimized; it's the best engine for optimizing.
Beastforlifenoob  +   774d ago
Watch the UE3 Samaritan demo...

Tell me if we've seen anything within a hundred million miles of that.
#2.3.1 (Edited 774d ago ) | Agree(2) | Disagree(1) | Report
Hercules189  +   774d ago
@Beast, well duh, UE4 isn't available yet. The first major game won't be available til late 2014 I think, except for an indie horror game coming in the spring, and that's just the tip of the iceberg. Once we get skilled programmers involved it should look like the demos. Just look at the Black Tusk game running in-engine.
#2.3.2 (Edited 774d ago ) | Agree(0) | Disagree(1) | Report
Rageanitus  +   774d ago
Sorry, you need both gameplay and fancy graphics, which = the overall package.
hazardman  +   774d ago
Bubble up...well said!
SilentNegotiator  +   774d ago
What's your point? No one said anything about ALL games maxing out the RAM.

Why is it that some people feel the need to make a "gameplay before graphics" comment EVERY TIME someone so much as MENTIONS graphics? NO ONE is saying that graphics are more important.
#2.6 (Edited 774d ago ) | Agree(6) | Disagree(7) | Report | Reply
edgeofsins  +   774d ago
@Seafort and andrewer

And you are generalizing an entire population, as well as trying to argue against the PS4 having PC parts? It's not a PC; it has PC parts, and it's cheaper than a good gaming PC. What is your argument? Did you just compliment consoles while trying to insult them?

I'm sorry, I don't see the point in you acting like ignorant children about platforms. I see great games to play on consoles, games that are only on consoles, and consoles have great graphics even if PC has better graphics. I have 161 games on Steam and counting, as well as games not on Steam, but I don't post negative generalizations on the internet for attention because I prefer a specific platform.
make72  +   774d ago
Well said. I have many consoles: PS3, Nintendo Wii, Wii U, Xbox 360, PS4 and so on, and also a gaming PC with an i5 4670K and a GTX 780 3GB. We tested BF4 on two Samsung LED TVs side by side, PC (with ultra graphics) vs. PS4. The only big difference we found was the price: PC 1500 euros, PS4 400 euros. I am just tired of hearing bullshit from PC-only users. And these are the first games for PS4; think about 2 years from now. The PS4's future is looking good.
redcar121  +   774d ago
The cloud will give more power
Toolster  +   774d ago
I do hope that's sarcasm
MegaRay  +   774d ago
Nope, sephiroth is stronger
frostypants  +   774d ago
You mizzsplat POWAH!!!111!11!1! Derp.
etownone  +   774d ago

Reminds me of "POWAH OF THE CELL" from ps3.
TheLostCause  +   774d ago
@etownone Your comment reminded me of this meme
NeloAnjelo  +   774d ago
thunderbear  +   774d ago
Big ass "duh". Any game artist, anyone with a remote understanding of how these things work knows that we are really not at a point where we have nearly enough memory and bandwidth to do absolutely everything we want to do. Doesn't mean it's not a great leap from just having 512mb though.
#4 (Edited 774d ago ) | Agree(11) | Disagree(2) | Report | Reply
Rashid Sayed  +   774d ago
It's a great leap, but given the power of next-gen consoles, developers will aim for more. So it will be a repeat of what we already know: more is eventually less.
Ju  +   774d ago
It's a great leap. But IMO artists throw polygons at the problem where before it actually required creative thinking to create the same effect. It is really strange that the very same people somehow forgot how to solve this problem. The tools are getting more powerful, but at the same time this responsibility is delegated to the tools, generating suboptimal results. That's like how first-generation compilers created shitty code and still required hand optimization. Now they have these fancy tools which create shitty asset utilization, but nobody really thinks they need to do low-level (polygon/texture) optimization. I guess there will be lessons to be learned from an early launch generation nonetheless. Two years from now, when they (we) finally accept those new boundaries, the games will blow us away.
#4.2 (Edited 774d ago ) | Agree(3) | Disagree(0) | Report | Reply
AaronMK  +   774d ago
Better hardware making programmers less hackish, or "lazy", or "less able to solve problems" is nothing new. The same could be said about NES and SNES programmers simply being able to throw bitmaps on the screen, compared to mapping objects to scan lines on the Atari 2600 because there was not enough memory to handle a screen full of data at once.
Goku781  +   774d ago
Sorry screen jumped
#5 (Edited 774d ago ) | Agree(0) | Disagree(0) | Report | Reply
ape007  +   774d ago
Well, GTA V with all its size and glory was done on 512 MB, and TLOU on 512 MB too (actually 256 for PS3 and 310 for Xbox 360).

With good work and optimization, 8 GB is more than enough.
Infamous298  +   774d ago
Hopefully the developers are not lazy this time around.
hazardman  +   774d ago
Same thing I was feeling while reading the article. Lazy developer alert.
Seafort  +   774d ago
I'm expecting developers to get even lazier this time around, as they don't have to try to fit all their textures into the RAM at their disposal. So optimisation will go out the window, and brute-forcing their way to better performance will be much more common.

I mean, look at CoD Ghosts: the minimum required RAM on PC was 6 GB until they changed it in a patch, as the devs were inflating the system specs required to run the game to make it seem like a "next gen game".
#6.1.2 (Edited 774d ago ) | Agree(6) | Disagree(0) | Report
hazardman  +   774d ago
I thought it was 512 MB on Xbox 360 (the reason Gears of War was on Xbox 360). I want to say MS was going with 256 MB and Epic stressed more RAM, so they jumped it to 512 MB.
zerog  +   774d ago
Last-gen consoles only had 512 MB total, and then you have to take away what the OS and other tasks use, making the usable memory for devs around half.
ape007  +   774d ago
360 had Edram which Helped
hazardman  +   774d ago
Oh ok, yeah, I forgot the OS uses some as well.
Ju  +   774d ago
RSX can read/write the 256 MB of XDR (with the same bandwidth as the GDDR). So from a graphics perspective the PS3 has the full 512 MB of RAM. Especially relevant because you mentioned TLoU: ICE is one of the groups extensively using XDR/SPUs for graphical effects.
ape007  +   774d ago
good insight, bubble up
Beastforlifenoob  +   774d ago
Since when was TLOU on the 360?

Secondly, you console gamers have never entered the realm of RAM. What you need to learn is:
1.) RAM runs out EXTREMELY quickly with high resolutions and high-resolution textures.
2.) RAM runs out very quickly when a world is large.
3.) RAM is not affected too badly by polycount; instead it is affected by textures (because on one model devs can apply parallax diffuse, parallax specularity, parallax normals, etc.).
We can expect to see these things in the future: larger worlds, higher resolutions, and far crisper models (again, usually not from polycount but from higher-resolution normal mapping).

With 8 GB of RAM, these tasks won't be held up too greatly (then again, these things WILL significantly improve next gen, but it won't be such a massive jump as PS2-PS3 or PS1-PS2).

If you ever watch one of Unreal Engine's optimization videos, they will tell you to use textures in mobile games EXTREMELY conservatively, because THIS is what uses the RAM, along with everything else I listed...

Secondly, devs have already been optimizing for x86 for the past 5 years, so there will be very little optimization left.

Basically what I'm getting at is that this generation will be more of a step, whereas PS2-PS3 was a leap; it won't be something mindblowing or crazy. Basically, what we are seeing now with The Witcher 3 and The Division will only improve slightly by the end of the generation.

Also, I have a PC with 16 GB of RAM AND 1.25 GB of VRAM, and I can EASILY reach about 13 GB of RAM and 1 GB of VRAM usage (that's when the OS is taking around 4).

I know you will disagree with me, but if you do, PLEASE provide me with a reason.
#6.4 (Edited 774d ago ) | Agree(6) | Disagree(2) | Report | Reply
ape007  +   774d ago
"Since when was TLOU on the 360?" I meant GTA V for 360/PS3 and TLOU for PS3.

"(then again these things WILL SIGNIFICANTLY improve next gen but wont be such a massive jump from ps2-ps3 or ps1-ps2)"

That's not because of x86 or PowerPC; that's because of the nature of videogame graphics advancement.

"secondly devs have already optimized for x86 the past 5 years so there will be very little optimization left"

Some devs already optimize for it, but Sony, MS and some devs have said the PS4/X1 will be fully optimized for in 4-5 years. Plus it's great when you have high-selling, easy-to-develop-for fixed units (next-gen consoles); it will bring a new world, and many small devs will be able to do magic like never before.

"Basically what we are seeing now with witcher 3 and division will only improve slightly by the end of the generation"

We still don't know; some devs have said otherwise (Activision, Turn 10 and DICE). And even if we assumed it's only slightly better than those two games, that's more than enough. Think of the variations in art style and game design with that power. Plus The Witcher and The Division are OPEN WORLD games, so imagine smaller games.

"I know you will disagree with me but please if you disagree with me PLEASE provide me with a reason"

LoL, don't worry man, anytime :)

Conclusion: what we saw with TLOU, Halo 4, GTA V, Rage, Gears 3, and Uncharted 3 was beyond amazing, and all on 512 MB of RAM. Imagine what will happen with these amazing next-gen systems and the door they will open for devs for years to come. They sell great and are easy to develop for; it really doesn't get much better than that.
#6.4.1 (Edited 774d ago ) | Agree(2) | Disagree(2) | Report
awi5951  +   773d ago
8 GB of RAM is the norm on PC; actually, it's starting not to be enough at this point. I see my PC hit 11 GB of RAM usage sometimes, but that's because I turn virtual RAM off; it's slow, and my games run faster with it off. I have 16 GB anyway.
neoandrew  +   774d ago
No they won't, they just can't; the max is about 6 GB for games on each console, so no 8 GB, sorry.
bebojet  +   774d ago
I'll take great gameplay over great graphics any day. Take a look at what was achieved with last-gen consoles: Uncharted, GTA V, TLOU, God of War, GT5, Forza, Halo 4 and Assassin's Creed. All of these amazing games barely had half a GB to work with. Now devs have 5 GB at their disposal.
dcj0524  +   774d ago
Why not both?
tommygunzII  +   774d ago
It seems all games start out with a vision of great gameplay and graphics. From then on it's about priorities, and what happens when the time/money runs out.

As an AI fanatic I'll take smart AI over anything else, but I know I'm in the minority.
kingduqc  +   774d ago
Take a look at the "great" games: boring-ass story: the movie, online not working 5, follow the rail, boring cool car screenshot, copy pasta: the game, pewpewpew 4, and finally press X to not die as a pirate.
Dlacy13g  +   774d ago
Of course they will use all 8GB of RAM. Will they use it as efficiently early on as later? Probably not. Extra resources will be gobbled up early by sloppy/lazy programming, as developers can just put stuff in without compression, etc. As tools advance and games strive for more, once the hardware gets into its 3rd year you will see developers start doing things to squeeze more in and optimize, as space won't be the luxury it was in the 1st year.
TruthInsider  +   774d ago
How can they use 8GB when there is "only" ~5.5GB for games?
frostypants  +   774d ago
Being a little pedantic, doncha think?
BitbyDeath  +   774d ago
5.5GB was only rumoured; we still don't know how much is available for devs.
frostypants  +   774d ago
All the guy means is that developers will take advantage of everything you throw at them. It's not a knock on any sort of hardware limitation; they will use what you give them. It's really a compliment to the efficiency of modern development tools that they can leverage new hardware so quickly.
bub16  +   774d ago
Uncharted 3 ran on 512MB of RAM. Bring on Naughty Dog and 8GB.
Doink  +   774d ago
That game is ultra linear; a better example is GTA V.
YodaCracker  +   774d ago
I still don't know how Rockstar managed to get GTA V running on PS3 and 360. It is the greatest technical marvel of the last generation, for sure. It feels more next gen than anything I've seen on the new consoles, to be honest!
hellvaguy  +   774d ago

I thought the same exact thing about Skyrim on consoles.
nope111  +   774d ago
I hope sandbox games and destructible environments become the new trend this gen.
Dlacy13g  +   774d ago
I care less about sandboxes than I do about destructible environments, but would love to see both become trends.
larrysdirtydrawss  +   774d ago
"As for you guys below. You misunderstood. The situation I'm talking about is different. When the PS3 released, the games shocked us; it was a huge step. The PS4 wasn't as impressive. That's what I'm saying."

Exactly which early PS3 games shocked you? Resistance? Even on release day, people were like "meh" graphically. Not until UC1/GOW3 did people go "woah". The PS4 launched with KZ:SF, and that's one of the best looking games I've ever seen anywhere.
kingPoS  +   774d ago
I trust that necessity will once again force innovation. After all, who would have thought the PS3's split memory of 512MB could do as much as it already has.

Gateway MT6706 2008
snookiegamer  +   774d ago
It's my understanding that any developer can utilize X amount of RAM if little or no optimization is done. So it's a moot point.
deadfrag  +   774d ago
More RAM leads to some developers being lazy, because they will not optimize as they should. Some devs will just stick things in the code without proper optimization, counting on the huge amount of RAM to hide how bad they are. Expect to see some games looking shitty and the devs saying they used all the resources.
#17 (Edited 774d ago ) | Agree(3) | Disagree(2) | Report | Reply
PersonMan  +   774d ago
That happened this gen too.
hellvaguy  +   774d ago

You paint a very glass-half-empty perspective. Devs do not have unlimited time for projects, so in reality, less time spent in one area (optimizing for 512MB of RAM) means more time spent in other areas (AI, storyline, effects, etc.).
#17.2 (Edited 774d ago ) | Agree(0) | Disagree(0) | Report | Reply
PigBenis  +   774d ago
Well the new gen graphics cards use dedicated 4gb GDDR5, plus everyone has like 6gb system memory.. so 10gb total, ps4 uses 3.5 for os right? so only 4.5gb available for games.. Hmm.. not a whole lot to be honest, but i guess it will get by.
FragMnTagM  +   774d ago

You have no idea what you are talking about.

Both consoles have 8gb of RAM. The PS4 has GDDR5 RAM, and the XBOX One has DDR3 RAM.

The principle differences are:
•DDR3 runs at a higher voltage than GDDR5 (typically 1.25-1.65V versus ~1V)
•DDR3 uses a 64-bit memory controller per channel (so, a 128-bit bus for dual channel, 256-bit for quad channel), whereas GDDR5 is paired with controllers of a nominal 32 bits (16 bits each for input and output). But whereas the CPU's memory controller is 64-bit per channel, a GPU can utilise any number of 32-bit I/Os (at the cost of die size) depending upon application (2 for a 64-bit bus, 4 for 128-bit, 6 for 192-bit, 8 for 256-bit, 12 for 384-bit, etc.). The GDDR5 setup also allows for doubled or asymmetric memory configurations. Normally (using this generation of cards as an example) GDDR5 memory uses 2Gbit memory chips for each 32-bit I/O (i.e. for a 256-bit bus/2GB card: 8 x 32-bit I/Os, each connected by a circuit to a 2Gbit IC = 8 x 2Gbit = 16Gbit = 2GB), but GDDR5 can also operate in what is known as clamshell mode, where the 32-bit I/O, instead of being connected to one IC, is split between two (one on each side of the PCB), allowing for a doubling of memory capacity. Mixing the arrangement of 32-bit memory controllers, memory IC density, and memory circuit splitting allows for asymmetric configurations (192-bit, 2GB VRAM, for example).
•Physically, a GDDR5 controller/IC doubles the I/O of DDR3. With DDR, the I/O handles an input (written to memory) or an output (read from memory), but not both on the same cycle; GDDR handles input and output on the same cycle.

The memory is also fundamentally set up specifically for the application it uses:
System memory (DDR3) benefits from low latency (tight timings) at the expense of bandwidth; GDDR5's case is the opposite. Timings for GDDR5 would seem unbelievably slow in relation to DDR3, but the speed of VRAM is blazing fast in comparison with desktop RAM. This has resulted from the relative workloads that a CPU and GPU undertake. Latency isn't much of an issue for GPUs, since their parallel nature allows them to move to other calculations when latency cycles cause a stall in the current workload/thread. The performance of a graphics card, for instance, is greatly affected (as a percentage) by altering the internal bandwidth, yet altering the external bandwidth (the PCI-Express bus, say lowering from x16 to x8 or x4 lanes) has a minimal effect. This is because there is a great deal of I/O (textures, for example) that gets swapped in and out of VRAM continuously; the nature of a GPU is many parallel computations, whereas a CPU computes in a basically linear way.
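To put rough numbers on the bandwidth gap described above, here's a minimal sketch of the peak-bandwidth arithmetic, using the widely reported console specs (256-bit buses on both machines, GDDR5 at an effective 5.5 GT/s on PS4, DDR3-2133 on Xbox One); the function name is just for illustration:

```python
# Peak theoretical bandwidth = bus width (in bytes) * transfers per second.
# Console figures below are the commonly cited launch specs, used purely
# as an illustration of why GDDR5 trades latency for bandwidth.

def peak_bandwidth_gbs(bus_width_bits: int, transfers_per_sec: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * transfers_per_sec / 1e9

# PS4: 256-bit GDDR5 at 5500 MT/s effective
ps4 = peak_bandwidth_gbs(256, 5.5e9)

# Xbox One: 256-bit DDR3-2133 (2133 MT/s effective)
xb1 = peak_bandwidth_gbs(256, 2.133e9)

print(f"PS4 GDDR5 peak:  {ps4:.0f} GB/s")   # ~176 GB/s
print(f"XB1 DDR3 peak:  {xb1:.1f} GB/s")    # ~68.3 GB/s
```

These are theoretical peaks, not sustained figures, and the Xbox One additionally has 32MB of fast ESRAM on-die to offset the DDR3 number; the sketch only shows how bus width and transfer rate combine.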
TruthInsider  +   774d ago
But they still DO NOT have access to 8GB!
kingPoS  +   774d ago
I don't know why someone would disagree with fact. APUs are inherently bandwidth-thirsty. Adding fast RAM to help saturate that big pipe of an I/O makes sense, right?

Gateway MT6706 2008
mysteryraz11  +   774d ago
Consoles don't need high-end hardware. Look what the PS3 could produce with less than 512MB to use for games. Case closed.
snookiegamer  +   774d ago
Why you got a disagree I will never know. You're absolutely correct. The PS3 has shown what can be achieved (visually) with only 256MB of video RAM.

...People tend to forget consoles have significantly less overhead than PCs.

Just saying;/
#19.1 (Edited 774d ago ) | Agree(1) | Disagree(3) | Report | Reply
mysteryraz11  +   774d ago
Just people who don't wanna accept the fact that consoles can do more within RAM limitations than PCs, due to all the hardware being the same, devs working with it for a long period of time, and the fact that consoles aren't memory hogs like PCs. With stuff like UC3 and The Last of Us on PS3, I can't imagine what we will see on the new consoles.
theizzzeee  +   774d ago
Wow! That's a big statement considering their software makes memory about 9 times more efficient.
PersonMan  +   774d ago
The graphics can look like real life, but it's not going to make the game more fun to play. I can see myself getting just as bored playing Killzone Shadow Fall as I got playing Killzone 3 as the core gameplay is the same.
beatled  +   774d ago
Isn't it obvious they are NOT powerful enough?

A game like BF4 is 720p on Xbox One and 900p on PS4.

Both are covered in jaggies and don't have nearly the level of detail of the PC version.

They were a gen behind when they launched.

Sure, there will be great exclusives for each platform that will be awesome and amazing experiences.

But they are both DREADFULLY underpowered, the Xbox One even more so than the PS4, though both are very weak compared to even gaming PCs from the 2011 era.

Doesn't mean I can't love the games, and I will. Hell, I love playing my DS/PSP more than I play my PS3/360/PS4/Wii U.

Something about gaming right before bed on a handheld just makes me happy!
n4gamingm  +   773d ago
Yeah, but I blame late drivers and EA rushing DICE to release BF4; obviously the game wasn't ready.
NotSoSilentBob  +   774d ago
I think if developers actually take their time to write the code, and continue optimizing it, 8 gigs can go a long way. But with the present model of starting a game and having it finished in 6 weeks, optimization takes a back seat to profits.
starscream420  +   774d ago
Azure will be successful; server-side computing can be done. How much is the real question. And as far as the PC/console argument... I have a Core i7 with a Titan. Guess what? The PS4 and the Xbox One can't compete graphically. But who cares? I still have the PS4 for its exclusives and the Xbox One for its exclusives. Exclusives (timed and permanent) will be a huge factor in the long run. The developers will, in time, optimize the available RAM for these consoles. Relax.
psDrake  +   773d ago
I think some gamers care way too much about the hardware specs of consoles. Look, The Last of Us and GTA V were made with pretty outdated specs, and yet they are really great in terms of graphics (that alone doesn't make a great game, but in this case they both are).

The PS4 & Xbox One are here to stay for a looong time. Their specs are good enough to handle wild imaginations for years to come, and all you have to do is enjoy. Let engineers/developers worry about maximizing the specs.
