Submitted by Rashid Sayed 374d ago | news

PS4 Is Powerful But Achieving 1080p/60fps Without Major Development Challenges Is Arguable

Housemarque explains that achieving 1080p and 60fps with superior graphical effects is a challenge. (Alienation, PS4)

Alternative Sources
Whitey2k  +   374d ago
Housemarque is a great indie team. I can personally see them as top class, along with ND and the Black Tusk developers.
SoapShoes  +   374d ago
Uh, what... Blacktusk has yet to make anything, how are they top class? I realize they have high profile people there but it takes the right people to make a great team.
ShinMaster  +   373d ago
I think it all depends on the game.
Azzanation  +   373d ago
It's not about the company but about the people working for it. Look at Rare: a great brand name with no one behind it. Black Tusk has some of the best in the industry working for it, and it's unproven. Doesn't mean they're not up there.
calis  +   373d ago
"Doesn't mean they're not up there."

Ahh yes it does.

No one claims Rare is a great brand name these days either.
sonarus  +   373d ago
They were though. Seriously, Rare should never have separated from Nintendo. I LOVED all their games, from the originals on the SNES all the way to Banjo-Kazooie on the 64.
AngelicIceDiamond  +   374d ago
@Whitey Black Tusk is unproven.

As for this: looks like indies are REALLY stepping up their game. Their games are getting bigger, demanding more from these systems: world size, graphics, content, etc.

I said once that indie arcade games could eventually outdo AAA in quality and creativity.

Actually, some indie arcade devs have already won on the creativity front.

Can't wait to see more from this.
#1.2 (Edited 374d ago ) | Agree(31) | Disagree(3) | Report | Reply
Evilsnuggle  +   374d ago
I hate this idea of "indies" vs AAA. "Indie" or independent only tells you who owns the studio. Independent games are only about how the game is financed. Independent games sometimes have smaller budgets than AAA games; AAA games financed by big publishers have larger budgets than independent games. Being independently financed is about who owns the IP and funds the game. Being an "indie" or independent studio has nothing to do with whether the game is AAA. Double Fine, Ninja Theory, and CD Projekt Red all make AAA games, and all are indie/independent studios.
#1.2.1 (Edited 374d ago ) | Agree(10) | Disagree(1) | Report
DigitalRaptor  +   373d ago
@ Angelic

I keep reiterating this, but certain people still see indies as a threat to their console or just something to laugh at. I think indies have been shelling out more creativity for years now, not just on these consoles but last gen too, and on PC even before that.

Looking at games like Ethan Carter, SOMA, Dreamfall Chapters, No Man's Sky, WiLD, Everybody's Gone to the Rapture, The Witness, RiME - you can no longer simply look at indies and consider them small fish by default.
#1.2.2 (Edited 373d ago ) | Agree(2) | Disagree(0) | Report
IndoAssassin  +   373d ago
You guys should play counter spy. Sweet game man.
warczar  +   373d ago
@ evilsnuggle

Being an indie is more about leaving behind asshole publishers like EA and Activision. Talented people like Tim Schafer and Keiji Inafune shouldn't have to make games through crowdfunding, but that's what they do to stay away from the shithead publishers of the world.
DVS-Zev  +   374d ago
That's insulting, adding Black Tusk (who had to can their first game due to bad management) to the same page as industry leaders like ND.

Hell, it's insulting to Housemarque even.
DevilishSix  +   374d ago
Totally agree....Black Tusk has nothing to show since they opened their doors. They are nothing more than a money sink until they deliver a game....good or bad..just deliver a game.
donthate  +   374d ago
It is insulting to lie and say that Black Tusk's first game was canned due to "bad" management, when all signs point to the opposite. Reports were that the prototype they were working on looked great.

In fact, BT were entrusted with Gears of War which is a huge, huge, HUGELY important franchise to MS which points to them being well managed.
DVS-Zev  +   374d ago

What reports are you talking about? And everything was apparently fine? Year after year after year of nothing to show, then the game gets scrapped, and that's a good sign to you?

By all accounts, from proven insiders at MS, blacktusk was a poorly managed money pit with no idea what they were doing.

The end result was a new IP, years in the making, getting scrapped for more Gears of War.
AngelicIceDiamond  +   374d ago
@DVC Black Tusk is also a brand-new studio, so of course they're not anything like ND, Remedy, 343, Sony Santa Monica or any of those devs.

But from what I've heard, Black Tusk is made up of "super devs": vets who are former AAA developers from various other studios.

(according to an insider) These guys are the real deal because of such superior talent all in one studio.

We'll see though.
dcbronco  +   373d ago
Zev, year after year after year with nothing to show sounds a lot like The Last Guardian too. I assume that is also mismanagement.
rainslacker  +   373d ago
I didn't know any reason had been given as to why their game was cancelled. I know they got put on the next GeOW, so I guess MS trusts them enough to at least manage a flagship IP for them.

However, I don't believe working on a GeOW game will really help them be as creative as they could have been given their staff's experience in the industry.

I will agree they're unproven as of yet, but for the time being it's worth looking forward to what they will come out with.
darthv72  +   374d ago
Okay, with most fixated on 'resolutiongate', I have to ask: why can't there be a happy medium? Devs opting to improve the fluidity of their game at the expense of some clarity. These systems are capable of upscaling, just as today's TVs can.

Is there really that much of a noticeable difference between a game that outputs 960p and is then upscaled to 1080p?

Pixel counting is not what these games are developed for. They are meant to be played. Personally, my order of importance is:

Fun factor
Frame Rate

Having fun playing a game is something I have been doing since the 2600. When did that change for some gamers?

@canadian... wouldn't that depend on the game?
#1.4 (Edited 374d ago ) | Agree(13) | Disagree(14) | Report | Reply
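As a rough way to quantify the 960p-vs-1080p question above: a quick sketch comparing raw pixel counts (960p is assumed here to mean a 1706x960 16:9 frame, which is this thread's example, not an official figure).

```python
# Rough pixel-count comparison for the 960p-vs-1080p question above.
# 960p is taken as 1706x960, the usual 16:9 frame at that height.

def pixels(width, height):
    """Total pixels in a single frame."""
    return width * height

full_hd = pixels(1920, 1080)   # 2,073,600 pixels
sub_hd = pixels(1706, 960)     # 1,637,760 pixels

savings = 1 - sub_hd / full_hd
print(f"1080p frame: {full_hd:,} px")
print(f"960p frame:  {sub_hd:,} px ({savings:.0%} fewer pixels to render)")
```

So a 960p frame carries roughly a fifth fewer pixels than a native 1080p one, which is the rendering headroom a dev buys by upscaling.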
I_AM_ CANADIAN_1989  +   374d ago
Having framerate that low on the list must mean you have the vision of a shark. Framerate/screen tearing should be first on that list. Saw someone playing Titanfall on the Xbone the other day; it had the worst screen tearing I have ever seen. Stutters and screen tearing are just as bad as listening to a skipping song on a scratched CD, if you have good eyes.
#1.4.1 (Edited 374d ago ) | Agree(13) | Disagree(8) | Report
AngelicIceDiamond  +   374d ago
I agree with everything you said.

"Fun factor
Frame Rate"

My order is just a tad bit different.

Fun factor

Well said bubble up.
Boody-Bandit  +   374d ago
"When did that change?"

You know the answer as well as any N4G member or lurker who has been on this site for more than a year or two. It changed now that Sony has the edge with multiplat titles, and by a lot more than MS did last generation, because their hardware is more capable and easier to develop for.

So if you are going to be coy, at least be an honest broker while doing so. This whole gameplay/fun-factor-over-graphics line doesn't mean a damn thing to a PS4 consumer, because their console is more capable of offering the best of all worlds.

Can we stop the hypocrisy now? I mean I've been on this site for years, and both sides are guilty of it, but the whole it doesn't matter until it does is why articles like this one dominate this site and always will.
#1.4.3 (Edited 374d ago ) | Agree(7) | Disagree(5) | Report
darthv72  +   374d ago
@boody, if you are going to be selective, at least be respectful of the selected text. The actual comment is: "when did that change for some gamers"

Some... not all. And I am being genuine with my question. Maybe it's because I have been gaming for so long that I don't follow the notion of "it doesn't matter until it does", as you put it.

I am an equal-opportunity gamer. I play games that are fun and appealing. I don't play sales or graphics like many others do. I play the games, and if I want to know when the notion of playing games for fun changed... there must be a reasonable answer.

Was it the 5th, 6th, 7th gen? Or older? For many, it was no longer about just having fun but about how to be disrespectful to your fellow gamer because they didn't buy or play the same game as another.

I am just one person looking for a logical reason behind the great divide. Gaming is supposed to bring people together, not drive them apart.

If you don't know the answer then just say you don't know the answer. But there is an answer, and the sooner we can figure it out, the sooner we can get back to enjoying this entertainment rather than bickering about it.
DougLord  +   374d ago
I'll go you one better.

Fun factor is #1, #2, #3 and #4
Compelling story (unless it's a racer, fighter, sports game)
Compelling characters (unless it's a racer, fighter or sports game)
Novel Mechanics / features
Competitive multiplayer with hosted servers, good matchmaking, and control over cheaters and griefers
Model complexity and texture detail
Special effects
Lack of crazy visual errors
Great soundtrack
DefenderOfDoom2  +   373d ago
Yeah, i also agree "fun factor" is number 1 on my list ! Smooth gameplay mechanics would be number 2 on my list .
Boody-Bandit  +   373d ago

Graphics ALWAYS mattered. You say you've been gaming since Atari 2600. I was gaming before PONG. We called it pinball.

I go all the way back to before conventional home gaming started. In the very beginning people didn't say much about the graphics of Pong or the first Atari console, but with nearly every console from that point forward, most gamers I knew would discuss graphics and how much they kept improving.

In the beginning, as you probably know, gaming was done with a paddle/stick and usually one action button. Now we have flight and mech assault setups with seemingly unlimited buttons. But graphics ALWAYS mattered. Were they the most important thing? No. But the biggest draw to newer consoles was nearly always the graphics, first and foremost.

It was that way with the NES and the SNES (OMG, I couldn't believe how incredible Super Mario World LOOKED), and the 3DO really grabbed the eye-candy enthusiasts and started to blur the line between traditional sprite-based graphics and polygons, as the near-photorealistic age of gaming began.

I used to hang out in gaming stores a friend of mine owned (26 stores) in the Philadelphia / Bucks County area of PA. I worked part time for him, and we went from store to store checking up and hanging out in his various locations. The workers and customers were always discussing which console had the better graphics and which games looked the best, all the way back to the ColecoVision era.

It really got heated when the SNES and Genesis were launched.

You ask whether 960p vs 1080p has a noticeable difference. These guys were arguing over 16-bit sprite-based games. It's just that now social media amplifies everything x1000. Graphics ALWAYS mattered.

The only reason it's being downplayed now is that MS screwed up. Point blank, period. They chose a gimmick and poor vision and choices over processing power. For years on this site the members were arguing over blades of grass, like I said previously, but now 720p to 1080p is supposedly not a discernible difference. That's flat-out BS, and anyone who has done actual side-by-side comparisons knows it. Hell, websites were created just to do comparisons of video games. Whatever happened to Lens of Truth? It appears they lost interest as soon as this generation started. Odd, really /s

So again, stop being coy. I say this because you have been a member long enough to know this. You check my history and you will see I rarely ever got caught up in comparison or sales articles. I'm just keeping it real.

Graphics always mattered. It's only being downplayed to this level now because one console is clearly out in front when it comes to performance. Last gen the differences were significantly smaller than they are today, yet this site was a warzone over discernible differences that took magnifying glasses to see. Now 720p to 900p to 1080p is hardly noticeable? SMDH
#1.4.7 (Edited 373d ago ) | Agree(5) | Disagree(2) | Report
Boody-Bandit  +   373d ago
"It doesn't matter until it does" is aimed at all the hypocrites who downplay power, options and features that their console of preference doesn't have. Sooner or later, parity in most of those options and features happens through updates and hardware revisions, and then all of a sudden, lo and behold: I never said it didn't matter or doesn't make a difference, I merely said it wasn't necessary and didn't really add to the gaming experience.

Again, all sides are guilty of this, but I find it shameless that a vast majority of members who got so caught up in the most minuscule of differences just a year ago want to pretend bigger differences today don't matter at all.

Gaming isn't world peace. It's a hobby. It's not meant to topple divisions and unite humanity across the world. It's just something most do to relax or hang out with friends after a long day at work or school. To escape the daily grind and get lost and immersed into something to take your mind off of the hustle and bustle of every day life.

Gaming is like any other hobby or source of entertainment. It's divisive and usually segregated. They're called fans. Some are more rabid than others. Myself, I couldn't care less about any of it. I just like to go into my man cave, kick back and relax. I leave taking sides to those who have nothing better to do.

I'm just giving my perspective as someone who has been a member of or lurker on gaming sites since their inception.

As far back as I can remember, graphics always mattered. Where they rank is up to the individual, and it's all subjective. It's the totality of it all that is the true gaming experience.
#1.4.8 (Edited 373d ago ) | Agree(1) | Disagree(1) | Report
AndrewLB  +   373d ago
BoodyBandit- Why do you say PS4 is easier to develop for? Because AAA Sony studios say so? lol.

A while back, Balazs Torok, a developer on CD Projekt Red's The Witcher 3, claimed that it is actually the Xbox One that is the easier console to develop for.

"The Xbox One is pretty easy to understand because not just the hardware is similar to the PC, but everything like the SDK, the API is really similar to what you would find on a PC," said Torok, speaking to Eurogamer.

"On the PS4 it's very good to have the fast memory. Everyone is really happy about that," Torok said. (Faster memory is the main reason why PS4 runs higher resolution)

"I don't see a major power difference. The memory is very different but I already said that before. Pure computation power, if you just measure that, there's no major difference."
nypifisel  +   373d ago
The whole situation is misunderstood by, let's face it, uneducated games journalists who don't know how hardware functions.

Resolution and FPS are not something that can be achieved just by trying hard enough; it's a question of whether the hardware can push that many pixels in a given amount of time or not. Software creates shortcuts to make the task easier for the hardware, but the hardware is always the upper limit.

There's no problem for any game to achieve 1080p/60fps; it comes down to how demanding the calculation will be, i.e. how much graphical flair you want in a game. Can you accept games that look static in their graphical elements, or do we as consumers demand that fidelity rises all the time and want more realistic-looking games? This is a trade-off: you have to balance one against the other, and less powerful hardware means you have to cut more on the fidelity side to achieve good performance.

The whole discussion is stupid, a self-playing piano by now, with stupid people feeding the machine of stupidity.

This is why Xbox One games won't ever look as good, and if you want an Xbox One game and a PS4 game to have the same assets and so on, you will need to lower the resolution on the Xbox One. This will become true for the PS4 too further into the generation; the problem is that the Xbox One already has this issue.
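The pixels-per-unit-time point above has a simple arithmetic core. As a sketch (the resolutions and framerates are the standard targets discussed in this thread, nothing console-specific):

```python
# Raw pixel throughput a GPU must sustain for common resolution/framerate
# targets. Real cost per pixel then depends on shading complexity.

RESOLUTIONS = {
    "720p":  (1280, 720),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
}

def pixels_per_second(width, height, fps):
    """Pixels that must be rendered each second at a given target."""
    return width * height * fps

for name, (w, h) in RESOLUTIONS.items():
    for fps in (30, 60):
        rate = pixels_per_second(w, h, fps)
        print(f"{name}@{fps}fps -> {rate / 1e6:.1f} Mpix/s")
```

1080p60 demands 2.25x the raw throughput of 720p60, so holding framerate at the higher resolution means the GPU gets proportionally less time per pixel for effects, which is exactly the fidelity trade-off described above.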
DoubleM70  +   373d ago
You got disagrees, but you're right. 720 and above is fine with me. You're from my era; these kids don't know the pain we went through... lol. The Atari 2600 was not exactly knocking the graphics out of the park.
marcofdeath  +   373d ago
I know that most of you don't do homework on both these systems.
Let me just say, PS4 is not the powerhouse you may think. Developers are always going to say how good their game is, but we must always keep that in mind.

PS4 is easy to code for due to it being a standard AMD GPU of today. Its memory system is, in my opinion, horrible.
PS4 is not full HSA.
PS4 is not 176GB/s effective bandwidth.
That is why developers don't tell you the PEAK!
PS4's GPU really is balanced to 14 of the 18 CUs; the other 4 will be for GPGPU.

XB1 still has a long way to go while software catches up (SDKs), but as I have said before, bandwidth is the best way to indicate which system is and will be more powerful. Go on the Internet and look at the history of bandwidth in consoles; it will tell you that 99% of the time the system with the most bandwidth is the most powerful. And as it stands, that score right now is as follows.
Effective bandwidth
XB1 ESRAM + DDR3 = 190-200GB/s effective bandwidth
PS4 GDDR5 = 135-140GB/s effective bandwidth
Theoretical PEAKs
XB1 ESRAM read: 1024 bits x 853MHz / 8 = 109GB/s PEAK
XB1 ESRAM write: 1024 bits x 853MHz / 8 = 109GB/s PEAK
XB1 DDR3: 256 bits x 2133MHz / 8 = 68GB/s PEAK
218GB/s + 68GB/s = 286GB/s PEAK
PS4 GDDR5: 256 bits x 5500MHz / 8 = 176GB/s PEAK

Now sooner or later you must ask yourself, "why is it that the XB1 has that much BW for a weak console?"
#1.4.12 (Edited 373d ago ) | Agree(0) | Disagree(5) | Report
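The peak figures in the comment above all come from the same textbook formula: bus width in bits, times effective transfer rate in MT/s, divided by 8 to get MB/s. A sketch reproducing that arithmetic (note these are theoretical peaks only; whether separate pools can simply be summed is disputed in the replies below):

```python
# Theoretical peak bandwidth: bus_width_bits * transfer_rate_MT/s / 8 -> MB/s.
# This reproduces the arithmetic in the comment above; it says nothing about
# sustained ("effective") bandwidth, and summing separate pools is contested.

def peak_gbps(bus_width_bits, transfer_rate_mts):
    """Peak bandwidth in GB/s (decimal) for a single memory bus."""
    return bus_width_bits * transfer_rate_mts / 8 / 1000

esram_one_way = peak_gbps(1024, 853)   # ~109 GB/s per direction
ddr3 = peak_gbps(256, 2133)            # ~68 GB/s
gddr5 = peak_gbps(256, 5500)           # 176 GB/s (5500 MT/s effective rate)

print(f"ESRAM one-way: {esram_one_way:.0f} GB/s")
print(f"DDR3:          {ddr3:.0f} GB/s")
print(f"GDDR5:         {gddr5:.0f} GB/s")
```

The "286GB/s" figure is obtained by adding ESRAM read + write + DDR3 (109 + 109 + 68), which assumes perfectly overlapped traffic on every bus at once.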
nypifisel  +   373d ago

That's just a bunch of bullshit. You can't add the two pools' (DDR3 + ESRAM) bandwidths and call that a factual end number. In reality the split pools create choke points where the theoretical max will never be achievable.

Also, your conclusion is seriously flawed, since it ignores any software gains that will be made on the PS4; in a sense your conclusion just disproves your own point. And that's ignoring the fact that memory bandwidth has nothing to do with calculating pixels, only with moving them. The ESRAM is also way too small for the bigger buffers that come with higher resolutions, which demands a reduction in graphics to push more pixels with any sort of AA applied.

Since you obviously can't read numbers maybe you can look at a picture:

#1.4.13 (Edited 373d ago ) | Agree(3) | Disagree(0) | Report
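The "ESRAM is too small" point above can be made concrete with back-of-the-envelope buffer sizes. This sketch assumes a common 4-bytes-per-pixel format (e.g. RGBA8); real engines use several render targets with varying formats, so these are illustrative numbers only.

```python
# Back-of-the-envelope render-target sizes vs. a 32 MB ESRAM budget.
# Assumes 4 bytes/pixel (e.g. RGBA8); real engines differ per target.

ESRAM_MB = 32

def buffer_mb(width, height, bytes_per_pixel=4):
    """Size of one render target in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

fb_1080 = buffer_mb(1920, 1080)   # ~7.9 MiB for one 1080p target
fb_720 = buffer_mb(1280, 720)     # ~3.5 MiB for one 720p target

print(f"1080p target: {fb_1080:.1f} MiB -> {ESRAM_MB / fb_1080:.1f} targets fit")
print(f"720p target:  {fb_720:.1f} MiB -> {ESRAM_MB / fb_720:.1f} targets fit")
```

With colour, depth, and a couple of G-buffer targets, a 1080p pipeline quickly outgrows 32 MB, while a 720p one still fits with room to spare, which is the pressure toward lower resolutions the comment describes.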
ziggurcat  +   373d ago

you keep regurgitating the same nonsense. none of it is right, no matter how many times you repeat it over, and over again.

PS4 is a more powerful machine than the xbone. it's time to let it go, and accept that fact.
DanteVFenris666  +   373d ago
I agree with you, except my frame rate and resolution would be switched. After playing ESO on a laptop that can't handle it on good settings, I would much prefer higher res than I have now over an fps increase; I'm at 20-30fps.
Blaze929  +   374d ago
something ain't quite making sense about these "new" consoles lately. From both parties.
DoubleM70  +   373d ago
It's called a learning curve; things get better as the generation goes on. Let them learn the hardware. I own both consoles.
C-H-E-F  +   373d ago
I find this funny because SOE can run a 2000-player MMO on the PS4 at PC ULTRA settings @ 1080p/60fps... Devs have zero excuse from here on out. At all.
Vystrel  +   373d ago
.... I had no idea every game ever runs on the PlanetSide 2 engine. Different games, different engines; different optimizations need to be done.
AndrewLB  +   373d ago
PlanetSide 2 was released over two years ago on PC, and at the time its graphics were not spectacular. Have you even bothered to look at the screenshots?


And FYI, the PS4 version will be lacking some features found on PC, like Nvidia PhysX effects: vehicle motion and handling, volumetric fog and clouds, as well as particle effects.
Killzoner99  +   373d ago
Black Tusk? Nope. You have to earn your spot at the top like Naughty Dog has. What has Black Tusk done except spend millions with nothing to show for it? It's been proven time and again that all-star dev teams can't deliver the goods, because the team members can't see past their own egos to cooperate on developing a game. Too many cooks in the kitchen, as they say. I'm pretty sure the company will dissolve before they release anything.
DoubleM70  +   373d ago
I think you're just talking about the quality of talent at Black Tusk. I just hope they are working on maybe two or three games, not just Gears of War. We need new IPs from them; I know they're creative people, so they need to use that talent.
TheTwelve  +   374d ago
...but at least it's a discussion and a possibility when it comes to the PS4.

I'll be VERY curious to see how this gen goes and how many games achieve this on the PS4.

Different game than the subject, but I'll be very curious to see what Witcher 3 can do on PS4. That will tell me a lot.
joab777  +   374d ago
There's no doubt at all that this gen will cap out much sooner. Devs will find ways to optimize and push the hardware, which is great, and others will focus on gameplay and unique experiences to offer something new, but graphics, resolution and framerate will hit a wall.

I was hoping so much that they would have offered two different models, or upgradable tech for the future. It could remain a closed system, with minimum and maximum settings more static than on PC. It would have given more breathing room. Either that, or in three years at E3 we will see the PS5 shown for another $400 investment.

But we already see how games like The Witcher 3, while beautiful, require more than is even offered right now to push the limits. I would have dropped $600-700 for a PS4... and imagine what that would be capable of.
solar  +   374d ago
It shouldn't be a "discussion or a possibility". Both next-gen consoles should run at that res and fps as the baseline, not "hope to achieve" it.

So if you look at the whole situation objectively, you have been sold a bill of goods, especially when Mark Cerny came out and called the PS4 a "supercharged PC".
ShowGun901  +   374d ago
...the question is HOW far they (any dev, not just these guys) can push the graphics while keeping 1080/60. Farther than the Xbone, I'd wager, but not anywhere near a $1600 gaming rig.

If The Order: 1886 and UC4 are examples of the PS4 not being maxed out yet, I'll be just fine with that! They look amazing to me! (What do people want?)

Personally, I'd rather see advances in physics (no more clipping, please!) and AI this gen, rather than games getting any more photorealistic than The Order: 1886... games always look more real in pictures than in video, because interactions, whether with the environment or with the AI, always look a little... fake. That's what I'd like improvement on in the next 5 years!
#3 (Edited 374d ago ) | Agree(14) | Disagree(1) | Report | Reply
uth11  +   374d ago
"what do people want?"

If you like photorealistic, immersive environments, I predict that you will be happy this gen. :)

But on the other hand, if you are a pixel-counter (or polygon-counter, or frame-counter), then you will frequently find reasons to be disappointed. Too much focus on the numbers, not enough on the game.
edqe  +   374d ago
A $1600 gaming rig is a bit of an exaggeration. Intel Broadwell will already have a very capable iGPU; even the older Iris Pro 5200 [1] is surprisingly good. An Intel NUC [2] with Broadwell as a gaming machine could be quite interesting already.

I wouldn't be surprised to see gamers having Broadwell + 64GB RAM + 12GB dGPU(s) gaming rigs next year [3] (of course not just for gaming but for doing some work as well).

At the same time, I enjoy having games that are complex and demanding in ways other than graphics.
Dwarf Fortress - http://www.bay12games.com/d...
Unreal World RPG - http://urw.fi/
Europa Universalis - http://www.europauniversali...
X Motor Racing - http://www.xmotorracing.com...

[1] http://www.youtube.com/watc...
[2] http://www.intel.com/conten...
[3] http://www.guru3d.com/news-...
#3.2 (Edited 374d ago ) | Agree(3) | Disagree(2) | Report | Reply
Mithan  +   374d ago
You do realize that much memory is useless?
So is 12GB on a GPU that won't use it.
edqe  +   373d ago
@Mithan: Maybe today, in games that have been ported from consoles. For a programmer, and especially for people working on graphics and media stuff, more memory is only a good thing.

Hopefully in the future games will start to use PCs' capabilities, but that's not going to happen for AAA games from Ubisoft, EA, or similar publishers that have to make sure games run on much lower-spec consoles and PCs as well.

PCs have very strong CPUs, for AI for example. I guess it's much easier to find highly skilled graphics programmers than AI programmers.
#3.2.2 (Edited 373d ago ) | Agree(0) | Disagree(0) | Report
Neixus  +   374d ago
Go hard or go home
etownone  +   374d ago
So disappointed this gen.

These consoles are supposedly like 4-6 times more powerful than last gen, and still no 1080p/60fps.
angelsx  +   374d ago
Me too. I think the new consoles need new engines to reach the gold standard of 1080p/60.
#5.1 (Edited 374d ago ) | Agree(7) | Disagree(0) | Report | Reply
dantesparda  +   374d ago
Me too!
DefenderOfDoom2  +   373d ago
The funny thing is, I heard developers like John Carmack say, months before the 8th-gen consoles came out, that they would have a hard time running games at 60fps and 1080p with a lot going on the screen at the same time. Why did a lot of journalists and gamers not know this?
krouse93  +   373d ago
Not to mention 4K is right around the corner. 4K TVs are starting to flood the market, and the consoles can't keep up. I have a PS4, but I can't help but feel slightly unfulfilled knowing 4K games aren't in the pipeline.
Kayant  +   373d ago
And they can continue to; 4K won't be mainstream for 5+ years, probably closer to 8+, because of prices and lack of content.

They don't need to be doing 4K gaming yet but they should support 4K media playback.

Current console games could do 1080p/60fps if devs wanted. The consoles are powerful enough in most cases; it's devs who choose graphical effects and other features as a priority over framerate. You only need to look at Nintendo and all the 60fps games on Wii U for that.
marcofdeath  +   373d ago
It's only been one year. XB1 and PS4 will get more powerful as time goes on. One of the consoles has already given you a glimpse of its potential. Most of the engines are old ones retooled from last generation; therefore they do not take a closed-box environment into consideration.
imt558  +   373d ago
Quote :

I know that most of you don't do homework on both these systems.
Let me just say, PS4 is not the powerhouse you may think. Developers are always going to say how good their game is, but we must always keep that in mind.

PS4 is easy to code for due to it being a standard AMD GPU of today. Its memory system is, in my opinion, horrible.
PS4 is not full HSA.
PS4 is not 176GB/s effective bandwidth.
That is why developers don't tell you the PEAK!
PS4's GPU really is balanced to 14 of the 18 CUs; the other 4 will be for GPGPU.

XB1 still has a long way to go while software catches up (SDKs), but as I have said before, bandwidth is the best way to indicate which system is and will be more powerful. Go on the Internet and look at the history of bandwidth in consoles; it will tell you that 99% of the time the system with the most bandwidth is the most powerful. And as it stands, that score right now is as follows.
Effective bandwidth
XB1 ESRAM + DDR3 = 190-200GB/s effective bandwidth
PS4 GDDR5 = 135-140GB/s effective bandwidth
Theoretical PEAKs
XB1 ESRAM read: 1024 bits x 853MHz / 8 = 109GB/s PEAK
XB1 ESRAM write: 1024 bits x 853MHz / 8 = 109GB/s PEAK
XB1 DDR3: 256 bits x 2133MHz / 8 = 68GB/s PEAK
218GB/s + 68GB/s = 286GB/s PEAK
PS4 GDDR5: 256 bits x 5500MHz / 8 = 176GB/s PEAK

Now sooner or later you must ask yourself, "why is it that the XB1 has that much BW for a weak console?"

The Oddworld developer says hello:


So much for PS4 bandwidth; and Diablo III has framerate drops on the Xbone.
Oh yes, there is Metro Redux also. http://www.eurogamer.net/ar...

If the Xbone's SDK improves, the PS4's SDK will improve too. Oh yes, and about Star Citizen and DX12: he was referring to the PC, dude. He doesn't even work on Xbone or PS4. He is STRICTLY a PC developer. Of course, you and others on the MRX blog decided it was about the Xbone. Funny!

Nobody takes you seriously, dude. Especially when you wrote that the Xbone has dual-GPU crap:


Try harder!
#5.5.1 (Edited 373d ago ) | Agree(1) | Disagree(1) | Report
marcofdeath  +   372d ago

I was right about PS4's weak GDDR5 (terrible memory system) only being 135GB/s. You still don't "GET IT": 176GB/s is PS4's theoretical bandwidth; its effective bandwidth is 135GB/s. Just like XB1's theoretical bandwidth is 286GB/s, but its effective bandwidth is only 190-200GB/s, as said in the Digital Foundry article.

Developers are bound by agreements, and in the gaming business are always the last to know. Case in point: Diablo 3 at 1080p was given the code by Microsoft. Also, as I have said, it's an old engine, so it should run great on PS4.
This developer at GamingBolt:
I didn't even have to click on it, because I know what he can say; that's why there's no context for the bandwidth. Publicly he can only say what has been announced by Sony.
He is giving you Sony's PR talk. The proof of that is here:

"Jenner wouldn't go into "details" on the levels of "bandwidth available" for each bus owing to confidentiality agreements"

Note how the DF article writer falls in line with the same PR figure of 176GB/s. The developer knows that he cannot give the real bandwidth specifications; the only thing that can be repeated publicly is the PR line of 176GB/s. That development document protects Sony from any libel; that's why it's still up.

Beyond3D now fully agrees with my findings; get over it. It's 135GB/s.

As to your claim that the Xbox One doesn't have two GPUs: I will follow the proven information that I and others can find, just like I did with the PS4 effective bandwidth. It's not about you or me; it's about the information that can be proven as fact. My history proves that. It is not only me saying the Xbox One has multiple GPUs (a 768 GPU and a 768 GPGPU for ALU only, so technically not a GPU), but unlike you I get some information from the horse's mouth, like Charlie Demerjian (no Microsoft fan) over at http://semiaccurate.com/?s=... you need to read them and look for the word "GPUs". Then there is this one here from the Xbox One architecture software designer.

"The "GPUs" are really complicated "beasts" this time around."


That is why I said that over time it will pass the PS4... an ALU GPGPU that I can only imagine how much work it's going to take to program properly.

Next up is the actual physical evidence.
Your PS4 GPU has what is called global shared memory, commonly known as GSM. It controls the number of threads (64) for your PS4, which has 32kB for its GPU. The Xbox One has 64kB for its GSM, 2x as much, for 128 threads.

So you see, I don't make blind accusations; I back my shit up. But it's still only speculation at this point, until I can find confirmation from other sources. You cannot find other sources to back up your 176GB/s "effective bandwidth", and that's why I always win when talking to you. If people disregard me, that's fine, but I have been right more than you. "ESRAM IS A BOTTLENECK"

Thank you, you're making me famous. I love to speculate and voice my theory, but I will always tell people that I am speculating, and that's why I post there. Just like the PS4 facts: 176GB/s peak, 135GB/s effective bandwidth. That's not speculation, it's FACT.
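For what it's worth, the 176GB/s figure everyone trades back and forth is just bus arithmetic, not insider data. A quick sanity check, assuming the widely reported 256-bit GDDR5 interface at 5500MT/s:

```python
# Theoretical peak bandwidth of the PS4's GDDR5 interface, from the
# widely reported specs (256-bit bus, 5500 MT/s). This is the ceiling
# the marketing quotes; sustained "effective" bandwidth is always lower.
bus_width_bits = 256
transfer_rate_mts = 5500                       # mega-transfers per second

bytes_per_transfer = bus_width_bits // 8       # 32 bytes per transfer
peak_gbs = bytes_per_transfer * transfer_rate_mts / 1000  # in GB/s

print(peak_gbs)  # 176.0
```

Where sustained throughput actually lands depends on access patterns and CPU/GPU contention, which is exactly why the two numbers in this thread keep talking past each other.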
#5.5.2 (Edited 372d ago ) | Agree(0) | Disagree(3) | Report
ziggurcat  +   372d ago

"I was right about PS4's weak GDDR5 (terrible memory system) only being 135GB/s."

no, you're not right about that at all.

"Case and point Diablo 3 1080p was given the code by Microsoft. Also as i have said it's a old engine so it should run great on PS4. "

hahaha no, they didn't just give them a magical code. and if diablo is running on an old engine, why were blizzard struggling to get it to 1080p on xbone? i mean, if the hardware is beyond next-gen, then it shouldn't have a problem, right??

"Beyond3D now fully agrees with my findings, get over it. It's 135GB/s."

yeah... no. i doubt they fully agree with you, since they banned you from that website for being horribly inaccurate and for throwing out numbers generated by misterx's rear end.

"He is giving you Sony's PR talk. The proof of that is here:

"Jenner wouldn't go into "details" on the levels of "bandwidth available" for each bus owing to confidentiality agreements"

and you conveniently left out the continuation of the sentence here:

"but based on our information the GPU has full access to the 176GB/s bandwidth of the PS4's GDDR5 via Garlic"

"My history proves that. It is not only me that was saying the Xbox One has multiple GPUs, "

bahahahahahaha no it doesn't.

"ESRAM IS A BOTTLENECK"

actually, it's right, and it's why xbone has been struggling to get to 1080p most of the time.

"Just like PS4's fact 176GB/s peak and effective bandwidth is 135GB/s, that's not speculation it's FACT."

nope. not fact at all.
BX81  +   373d ago
I'm not disappointed at all. The PS4/XB1 have and will continue to have great games. It's just getting started.
etownone  +   373d ago
I don't doubt great games are coming.

But easily... the first wave of games should be at 1080p/60fps.

Oh well.
BX81  +   373d ago
If it was promised by devs and Sony/MS heads, then I agree. I don't remember if the console launches addressed this topic. I do know the missing features were a letdown. The consoles felt rushed, then finished with updates.
DanteVFenris666  +   373d ago
Because that alone causes 4x the amount of work, plus all of the other upgraded visuals and bigger landscapes.
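The "4x" figure above is roughly right as pure fill-rate arithmetic (real rendering cost doesn't scale perfectly linearly with pixel count, so treat this as a rough model):

```python
# Pixels pushed per second at each target. Going from 720p/30 to
# 1080p/60 multiplies the raw pixel throughput by 4.5x.
def pixels_per_second(width, height, fps):
    return width * height * fps

p720_30 = pixels_per_second(1280, 720, 30)     # 27,648,000 px/s
p1080_60 = pixels_per_second(1920, 1080, 60)   # 124,416,000 px/s

print(p1080_60 / p720_30)  # 4.5
```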
DevilishSix  +   374d ago
I hate when people call other gamers pixel counters, like it's a bad thing to want a game to be clear (resolution) and to run smooth (fps). It's not a bad thing; in fact it's one of the most important things. You could have the greatest game of all time, but if it runs like s*** and looks like vaseline smeared all over it, then it's just not worth it.
Spotie  +   374d ago
I hate it because they act like that's the only thing that matters, period.

I mean, it could be an article about graphics, and folks complain about people debating the subject.

That makes no sense to me. The same as the ones who say "suddenly gamers only care about sales" in a sales article.

Are these people broken?
DanteVFenris666  +   372d ago
I'm only 20, so I can't even begin to imagine how some of you older gamers lived before HD. Having 1080p is a benefit, but 700-900p is great too. 30fps is great, 60fps is better. As long as a game is 720p and 30 frames, it's at a resolution and frame rate that is solid. Anything more is icing on the cake.

I'd rather have increased effects, more objects and other things before frames and res are pushed past those points.
MilkMan  +   374d ago
This better be a joke. WTF more do you need, bro?
Get outta here with that nonsense. Go on and make games and use those big brains of yours. Stop with the "limitations" of the hardware already.
HanzoHattori  +   374d ago
How many GB of RAM are these dev kits using now? I know that dev kits given out in the first few months after the PS4's release only made use of 4GB of RAM. I'm just wondering if that's still the case, since it has been confirmed that the PS4 operating system only needs 1GB of RAM to run in the background while an application/game is running.
mochachino  +   374d ago
It's a $400 box; devs should make the games 720p with 60 FPS.
Chard  +   374d ago
I'm just happy that we're past the era of devs struggling to get 720p/30fps/vsync on consoles.
user367272  +   374d ago
It is all cool... I play mostly exclusive games on consoles, where I don't really care too much about graphics as long as the game is fun, and I buy third-party games on Steam for my PC rig.

People expect too much from these consoles when both companies are trying to save a few bucks and avoid taking a loss on each console sold by using a low-powered laptop CPU and an outdated graphics card. There is only so much they can do with those specs, and "10x more powerful than previous systems" is just marketing talk. In real-world benchmarking, both the PS4 and XB1 are more like 3-4x more powerful than the previous gen.
#11 (Edited 374d ago ) | Agree(6) | Disagree(6) | Report | Reply
mochachino  +   374d ago
That's not true; they're probably closer to 8x more powerful. The PS3 and 360 often struggled to hit 720p at 30 FPS. If you go back and play your old games after playing on the new consoles, you'll see just how bad the frame rates, aliasing and shadows were. I can't even look at Crysis 3 on my 360, and that's supposed to have good graphics.

Remember, devs had only 512MB of RAM and MUCH weaker GPUs with the 360 and PS3.

Pick any PS4/Xone game and it would have the best graphics of the year on PS3/360. People say games like Wolfenstein and Watch Dogs don't have good graphics, but they'd be incredible graphics on the old consoles.

Infamous and Ryse, which are only launch window titles, look much better than anything on the old consoles and definitely many times better than any of the launch games on 360/PS3.
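The ~8x estimate roughly matches the commonly cited raw shader-throughput numbers (ballpark figures only; marketing numbers for the old consoles were far higher, which is where much of the confusion comes from):

```python
# Commonly cited programmable-shader throughput in GFLOPS. Ballpark
# figures; they ignore CPU, memory and architectural differences.
gflops = {
    "PS3 (RSX)": 230,
    "Xbox 360 (Xenos)": 240,
    "PS4": 1843,        # 1152 shaders x 2 ops/clock x 800 MHz
    "Xbox One": 1310,   # 768 shaders x 2 ops/clock x 853 MHz
}

print(round(gflops["PS4"] / gflops["PS3 (RSX)"], 1))              # 8.0
print(round(gflops["Xbox One"] / gflops["Xbox 360 (Xenos)"], 1))  # 5.5
```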
#11.1 (Edited 374d ago ) | Agree(2) | Disagree(1) | Report | Reply
mysteryraz11  +   374d ago
yet The Last of Us on PS4 and MGS5 are 1080p/60fps; the PS4 has several games running at it. it's understandable for a game like The Witcher 3, which is super demanding
tee_bag242  +   373d ago
You just sound like another fangirl making concessions. So it's fine that The Witcher 3 doesn't run at 60fps because it happens to be on the console you own?
mysteryraz11  +   373d ago
Lol, you are a retard. Did I hurt your feelings, child? You can't get over the fact that I owned you.
tee_bag242  +   373d ago
.. fail ... again

Another flaccid argument from the world's most gaping ring-stung cheerleader :p

Go back to school and learn some more
#12.2 (Edited 373d ago ) | Agree(0) | Disagree(1) | Report | Reply
SPARTAN3  +   374d ago
Well, do 720p/60 with shit popping off everywhere and I'm fine with that.
ATi_Elite  +   374d ago
Both platforms are, in essence, really a JOKE.

Sure, they're very powerful if you want to continue making last-gen games with better graphics.

But for the sake of gaming, it's time to move far beyond corridor shooters and games with lame AI and only two on-screen enemies at a time.

Time to move forward to huge game worlds, varied gameplay, lots and lots of physics, destruction, advanced AI, and massive amounts of on-screen enemy NPCs, etc.

And if you have to sacrifice down to 720p or 900p, then so be it.
CertifiedGamer  +   373d ago
To developers, anything is a challenge compared to PC, because the PC, being the most powerful platform, has more resources to work with.
Amsterdamsters  +   373d ago
Hmmm, I can play pretty much any game at 1080p/60+fps on Ultra settings on my PC, and it does so much more...
SaturdayNightBeaver  +   373d ago
you shouldn't say that here, just a tip. we hate PC gamers who can enjoy games better than we do on console. developers will soon learn how to code to the metal on PS4, and on Xbox One DX12 and the secret GPU will change everything.
#16.1 (Edited 373d ago ) | Agree(4) | Disagree(2) | Report | Reply
colony  +   373d ago
Try not coming off as a pretentious cunt. It helps
wheatley  +   373d ago
Very excited for Alienation :) Housemarque are fantastic
Golden30  +   373d ago
Uncharted 4 says Hello!
colony  +   373d ago
Don't mind me.
I just came here to watch the PC elitists argue and whine about the consoles, defending their purchase.
Not that they know jack shit about computers other than numbers and hertz.
incendy35  +   373d ago
From a pure CPU standpoint, the PS4 is pretty weak. However, the big thing these consoles do that PCs cannot "currently" match is heterogeneous computing over unified memory, and that changes the development game. On a PC today, the CPU and GPU are separate entities with separate memory. For the most part, if you want them to work together, each has to update the bits of memory on its end and then copy them over to the other's memory for it to do work.

On the next-gen consoles that isn't the case. The CPU and GPU access the same bits of memory. This is the future of all computing, and it increases performance by up to 2.4 times according to AMD. CPUs excel at serial, branch-heavy tasks; GPUs excel at massively parallel ones. Put them together and you get the best of both.

Unified memory is the future of PCs as well, and many newer GPUs offer direct memory access to the CPU, but it isn't widely available enough for game engines to make it a priority. DirectX 12 is also key to this movement, as it provides ways to flag operations to run in parallel where a path is available. With DX11 and OpenGL there is no such path right now. AMD's Mantle also provides a path, but DX12 will be the key to wide adoption in the PC market.

The developers in this article aren't wrong. It isn't easy to get the most out of the PS4 and X1, and it requires a new way of thinking. But the push toward their limits is just getting started.
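The copy overhead described above can be sketched with a toy cost model (the bus speed and compute time below are illustrative assumptions, not benchmarks of any real hardware):

```python
# Toy model: dispatching work to a discrete GPU pays for a copy over
# the bus in each direction; a shared-memory console pays only compute.
def discrete_gpu_dispatch_ms(data_mb, bus_gbs=16.0, compute_ms=2.0):
    copy_ms = data_mb / (bus_gbs * 1000) * 1000   # MB over a GB/s bus
    return copy_ms + compute_ms + copy_ms         # copy in, compute, copy out

def unified_memory_dispatch_ms(data_mb, compute_ms=2.0):
    return compute_ms                             # CPU and GPU share the bytes

print(discrete_gpu_dispatch_ms(256))    # 34.0
print(unified_memory_dispatch_ms(256))  # 2.0
```

In practice pinned memory, DMA overlap and caching shrink the gap, but the structural point stands: with unified memory the copies simply aren't in the loop.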
#20 (Edited 373d ago ) | Agree(1) | Disagree(0) | Report | Reply
JoshOnTech  +   373d ago
I want developers to focus on making the games as good as possible, then worry about resolution. I don't need 1080p. Hell, when I was on PS3 and Xbox 360 I thought I WAS playing in 1080p. I want great games to come before great resolution. Though both would be nice.
Broburger  +   373d ago
You can only do so much with a gimped HD 7870.
