Dev On PS4, Xbox One & PC Memory: Unlikely That Devs Use All of 8GB In A Frame, 16GB Is A Waste

"It is unlikely you will get close to using all of that 8GB in a single frame, so this is actually more than enough," says Dan Walters, Director of Calvino Noir.

Paytaa1754d ago

I'm not the smartest when it comes to the insides of these machines (although I'm learning), but if games like The Last of Us and Gears 3 could be made with only 256MB of RAM, then I can't wait to see what games look like at the end of this gen. We already have Uncharted 4, Horizon, and Quantum Break, which look like some of the best-looking games I've ever seen, and the systems aren't even maxed out yet. Although the devs will say they are to sell their games, but I digress.

Justiceleague1754d ago (Edited 1754d ago )

Lol, it all comes down to optimization

uRaDecepticon1754d ago

Well actually those two games used more than 256MB of RAM, I'm sure, especially Gears, seeing as the 360 had 512MB of unified RAM. As for what games will look like by the end of this gen, I personally think that'll be a long time coming, but once development costs stabilize, the industry will show us that up until now we haven't seen anything yet of what these machines are capable of. When you're spending the same amount of money on game development that you did on last-gen titles, you're not exactly going to be pushing things to the limit or next level.

Think of it like this: these machines are canvases, bigger ones than last gen's. Last gen's canvases more or less got fully covered with paint, and doing that cost a certain amount of money. Now that the new canvases are bigger, it'll take more paint to fully cover them, and more paint costs more money.

Paytaa1754d ago

Whoops, I meant 512MB. I read Justiceleague's comment with the 256MB and ended up typing that. And yeah, I agree with everything you said.

meche3341754d ago

Yup, plus the 10MB of eDRAM, which helped output the resolution.

RonsonPL1754d ago (Edited 1754d ago )

Let me explain then:
Graphics would look exactly the same as they do now, in terms of what you see in a single frame, if the PS4 had 4GB of memory or even less. It's about how much you can actually utilize. It's no use having 100TB of memory if your graphics chip (GPU) can't handle more than 100MB. PS4 games look like they use 1.5-2GB, maybe 3GB, of memory for geometry, draw distance, resolution and textures. A lot of memory goes to effects put on top of that, dynamic lighting etc., but since that's not as interesting to me, I don't know the exact numbers, so you'd have to ask someone else.
But the things I mentioned are the most important, and it's clear the CPU and GPU are far too slow for the amount of memory. The PS3 and its 256+256MB of memory was a different situation: it had more power relative to its memory amount. Those games could have looked better if the memory had been doubled.
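A quick back-of-the-envelope sketch of the "single frame" point from the headline: memory bandwidth caps how many bytes the GPU can even touch per frame, regardless of pool size. The 176 GB/s figure is the commonly quoted peak bandwidth of the PS4's GDDR5; the function itself is just my illustration, not from the article.

```python
def max_gb_per_frame(bandwidth_gb_s: float, fps: float) -> float:
    """Theoretical ceiling (in GB) the GPU can read in one frame.

    Real access patterns (read + write, cache misses, contention with
    the CPU) land well below this ceiling.
    """
    return bandwidth_gb_s / fps

# PS4 GDDR5 peak bandwidth is commonly quoted as 176 GB/s.
print(max_gb_per_frame(176, 30))  # ~5.87 GB touchable per 30fps frame
print(max_gb_per_frame(176, 60))  # ~2.93 GB touchable per 60fps frame
```

So even in theory, a 60fps game can't come close to reading all 8GB inside one frame, which is the article's point.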
The bigger memory pool is in the PS4 because:
- Microsoft planned to out-advertise the Xbox One as the better console if the PS4 had 4GB and the Xbox One had 8GB, so Sony switched from 4GB to 8GB.
- It'll come in handy when the PS5 releases. Cross-gen games can be made more easily, and more money can be milked from old hardware and games (Sony profits from every game sold, thanks to license fees).
- It enables devs to create bigger open-world games with less loading, less stutter from streaming, and less costly streaming algorithms. (Just look at how much work it used to require in the past; only the best could do it properly. Remember Jak & Daxter on PS2? It streamed from a DVD, as did many games on the 360.)
- Some effects might be better; textures imitating distant scenery can be fetched quicker.
- Memory was cheaper than going for a faster CPU/GPU. You cannot use a cheap "crAPU" (CPU+GPU on the same die) and have a fast console at the same time. The crAPU they used was the best they could get at the low price point they demanded from AMD.
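The streaming point above can be sketched with a toy asset cache: a bigger resident budget means fewer slow disc fetches (the stutter risk) for the same access pattern. Everything here, the class name, sizes, and access pattern, is hypothetical and just illustrates the principle.

```python
from collections import OrderedDict

class AssetCache:
    """Toy least-recently-used asset cache with a fixed byte budget."""

    def __init__(self, budget_bytes: int):
        self.budget = budget_bytes
        self.used = 0
        self.cache = OrderedDict()  # asset_id -> size in bytes
        self.disc_loads = 0         # slow fetches = loading/stutter risk

    def request(self, asset_id: str, size: int) -> None:
        if asset_id in self.cache:
            self.cache.move_to_end(asset_id)  # cache hit: free
            return
        self.disc_loads += 1                  # cache miss: hit the disc
        while self.used + size > self.budget and self.cache:
            _, evicted_size = self.cache.popitem(last=False)  # evict LRU
            self.used -= evicted_size
        self.cache[asset_id] = size
        self.used += size

# Same access pattern, two budgets: the bigger pool keeps the whole
# working set resident, so revisits never touch the disc.
small, big = AssetCache(3), AssetCache(6)
for asset in ["a", "b", "c", "a", "b", "c"]:
    small.request(asset, 2)
    big.request(asset, 2)
print(small.disc_loads, big.disc_loads)  # 6 3
```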

About maxing out: it's already done. There will be no magic progress. This console generation is easy to develop for, and you won't see anything nicer. The recent games look good, but look closely at WHY that is. The latest shiny horror game runs at sub-30fps, often 20fps. That's the limit. It's the devs who say they'll get much more out of these low-end consoles who are lying.
And about the best-looking games you've ever seen, what did you expect?
- There are NO games designed with fast PCs in mind. No game after Crysis showed what PCs can do, so you were looking at the same target hardware for games created in 2005 and 2012. In late 2013 a new generation of consoles came out, most publishers preferred to wait, and big AAA games take 2-3 years to develop; that's the ONLY reason the games shown at E3 2015, scheduled for 2016-17, look better than what you saw in 2014. But look at Uncharted 4: the best team of coders in the world raised the white flag and backed out of their 60fps promise. After two years they didn't achieve 60fps, which proves my point. There's no hidden power left to be discovered in the PS4/Xbox One. Future games may have new tricks to cover the lack of power (blur filters, bloom, particles, fog, night scenes, dusty air, poor draw distance, etc.) but won't magically squeeze more out of the PS4. And some PS4 games already make the first PS4 models turn "jet engine" mode on (loud fans, since the console is using everything it has, which produces a lot of heat for its cooling design and dimensions).

The guy in the article, IMO, is talking about exactly that. You would see much better graphics if you just replaced the PS4's GPU with a faster one, even with 8GB of memory or less. The memory-to-power ratio was never like this in any console generation before.
People need to stop thinking about the Xbox One/PS4 in the terms that applied in the PS2 or PS3 era.

AndrewLB1753d ago (Edited 1753d ago )

"- there are NO games designed with fast PCs in mind."


Star Citizen.

It's designed specifically to showcase the power and capabilities of the modern PC platform. The amount of power required for the full flight-model simulation of 10 ships is hard to handle even on an i7-3770K, which is in a whole different league than the consoles. DX12 will help some things, but those calculations are done on the CPU... plus it's not like the GPU isn't already going to be maxed out anyway.

Chris Roberts was approached by console manufacturers, but they had limitations they wanted to enforce on the game, like graphics parity, that CR would not allow, especially exclusivity (CR will not choose sides when it comes to five-year-old tech). They also refused to let their online services (Xbox Live or PSN) sync with PC access for a persistent universe across platforms, and that is basically the entire point of the game.

Here is a 2400p screenshot I took.

Or even better, check out this youtube clip:

RonsonPL1751d ago (Edited 1751d ago )


Release date: 2017

IMO, SC's graphics are "meh" to me.
It's a space sim; I'd like to see some scenery, during a clear day, that blows everything we've seen before out of the water.
Like Far Cry 1 and HL2 did, like Crysis 1 did.
Since Crysis 1, we've been stuck. The next-gen consoles moved us from 2007 to 2009-2010, so we're still 5 years behind now. Star Citizen doesn't look like 5 years of progress over the PS4, which is the hardware level that 100% of released games are designed for. In 2017 we'll have 16nm HBM2 cards available, so it sure as hell isn't a "great AAA game that shows how far ahead the current PC is".
Far Cry 1 was released in 2004 and more or less showed properly what 2004 hardware was capable of (textures were optimized for 128MB cards, and 2004's 6800 GT had 256MB and the power to utilize it).
Crysis 1 could utilize all 2007 cards, and even a bit more with some settings-file tweaking. A properly made PC game has no problem running on slower machines; details can be adjusted. Besides, devs can always take the easy path and force aggressive draw-distance/LOD settings when the lowest settings are chosen. No problem of "oh no! We won't make money on this game, since only 1% of PC gamers have a PC capable of running it".
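The "details can be adjusted" point boils down to one content set scaled by quality presets, so the same game stays playable on slow PCs while fast ones get the full draw distance. A minimal sketch; all preset names and values are made up for illustration:

```python
# Hypothetical quality presets scaling the same scene content.
PRESETS = {
    "low":    {"draw_distance_m": 400,  "texture_scale": 0.25},
    "medium": {"draw_distance_m": 800,  "texture_scale": 0.5},
    "high":   {"draw_distance_m": 1600, "texture_scale": 1.0},
}

def visible_objects(objects: list, preset_name: str) -> list:
    """Cull scene objects beyond the preset's draw distance."""
    limit = PRESETS[preset_name]["draw_distance_m"]
    return [o for o in objects if o["distance_m"] <= limit]

scene = [
    {"name": "tree", "distance_m": 300},
    {"name": "hill", "distance_m": 900},
    {"name": "peak", "distance_m": 1500},
]
print(len(visible_objects(scene, "low")))   # 1 object drawn on low
print(len(visible_objects(scene, "high")))  # 3 objects drawn on high
```

Same world, same assets; only the culling limit changes per preset, which is why targeting high-end hardware doesn't have to lock out slower machines.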
No. It's all BS we're being fed nowadays. It can be done, it was done in the past, and it would blow our socks off if someone REALLY created a game designed for PCs (if someone started today, they would have to target PCs from the year the game releases, so 2017-18). A game designed for PCs in 2016 would show a bigger leap in graphics than the 8 years it took to go from the 360/PS3 to the Xbox One/PS4.
HBM2 is a mini-revolution. And the foundries skipped 22nm, so we'll get 16nm FF+ and HBM2 at the same time; those cards are already being prototyped, and the prototypes are operational. Within 8-12 months, PC performance limits will change drastically. And then there are things like Vulkan and dual/multi-GPU solutions working properly for the first time (no lag, no stutter, just pure performance gains).

I'd like to see a game from a dev who targeted 2016 high-end hardware. I wouldn't care if the game ran at 5fps on my PC at the highest settings; I could still play it at 60fps on lower settings, so I'd be happy.
But if history repeated itself, I would do everything I could to earn the money for an upgrade, and then enjoy the game in full glory.
People would finally have a reason to buy better hardware. If AMD/Nvidia/Intel weren't stupid and lowered the prices of their near-top-end hardware, PC gaming would flourish. Every company would earn good money, PC gamers could enjoy PC (not "PC") games again, and other devs would follow once they realized there are lots of people with hardware much better than the consoles.
All it would take is one or two AAA games.
And yet, since 2007, not a single company. Not AMD, not Intel, not Nvidia, not Valve, not Microsoft, not EA/Ubisoft. None. Not a single game.
And now they cry that PC gaming isn't as prosperous as they wanted it to be. They say it's because people prefer consoles or mobiles; it's a trend, a fashion. It's everything but their own damn stupidity in the decisions they made or didn't make.
MS recently pumped millions into... Minecraft.
Intel has a monopoly and is going full-crazy with gaming CPU prices.
Oculus invests $50 million in some mobile crap-contest, while that $50 million could be enough to create the AAA game I mentioned (with $0 for advertising, since it would naturally go viral).

ABizzel11754d ago

These systems are much closer to being maxed out than you think. Porting PS3 games has been standardized by Naughty Dog with TLoU Remastered. Bluepoint took everything they learned from ND, used the exact same method for the Uncharted Collection, and added a few more optimizations learned along the way.

By holiday 2016 the PS4 and XBO will be maxed out among developers in the traditional sense (i.e. first parties have used 100%; now it's time to share what we've learned, then optimize). After that, first parties will squeeze the last bit of juice out of them over the next 2 years.

The technique of porting PS3 games to PS4 has already been mastered, and it's all down to optimizations, which means we have at most another 2 years to go (2017/2018). The consoles are 99.9% maxed, and it finally boils down to engine trickery and the best techniques to squeeze out that last 0.1%.

donthate1754d ago

I'm sorry to disappoint you, but doesn't the fact that Gears 3 and The Last of Us were made with 256-512MB of RAM, compared with what we get from 8GB of RAM, show you that we're getting diminishing returns?

Yup, we're hitting diminishing returns, and even a powerful PC needs to be orders of magnitude more powerful for the difference to be noticeable, let alone to have a wow factor.
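The quick arithmetic behind the diminishing-returns point (my own numbers, not from the article): memory grew 16x between generations, while the pixels on screen only grew 2.25x going from 720p to 1080p.

```python
# 360's 512 MB of unified RAM vs the PS4/XBO's 8 GB:
ram_ratio = (8 * 1024) / 512                 # 16x more memory
# Typical output resolution went 720p -> 1080p:
pixel_ratio = (1920 * 1080) / (1280 * 720)   # 2.25x more pixels
print(ram_ratio, pixel_ratio)  # 16.0 2.25
```

The extra memory mostly goes into richer per-pixel detail and bigger worlds rather than more pixels, which is why the jump looks smaller than the spec sheet suggests.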

Sad, but true!

Artemidorus1754d ago

Gamingbolt, using the "dev" card again

As time goes on and games improve, they will use more memory. I got this information from a "dev", honest.....

Mate17891754d ago

I think we're good with 8GB for now..

RegorL1754d ago

Will developers want to use more or less memory in future?
(Rhetorical question)
