Housemarque explains that achieving 1080p and 60fps with superior graphical effects is a challenge.
Housemarque is a great indie team. I can personally see them as top class along with ND and the Black Tusk developers.
Uh, what... Black Tusk has yet to make anything, how are they top class? I realize they have high-profile people there, but it takes the right people to make a great team.
I think it all depends on the game.
It's not about the company but about the people working for it. Look at Rare: a great brand name with no one behind it. Black Tusk has some of the best in the industry working for it and is unproven. Doesn't mean they're not up there.
"Doesn't mean they're not up there." Ahh, yes it does. No one claims Rare is a great brand name these days either.
They were, though. Like seriously, Rare should never have separated from Nintendo. I LOVED all their games, from their SNES originals all the way to Banjo-Kazooie on the 64.
@Whitey Black Tusk is unproven. As for this: looks like indies are REALLY stepping up their game. Their games are getting bigger, demanding more from these systems: world size, graphics, content, etc. I said this once: indie/arcade could eventually outdo AAA in quality and creativity. Actually, some indie arcade devs have already won on the creativity front. Can't wait to see more of this.
I hate this idea of "indies" vs AAA. "Indie studio" or "independent" only tells you who owns the studio. Being independent is only about how the game is financed. Independent games sometimes have smaller budgets than AAA games, and AAA games financed by big publishers have larger budgets than independent games. Being independently financed is about who owns the IP and funds the game; being an "indie" or independent studio has nothing to do with whether a game is AAA. Double Fine, Ninja Theory, and CD Projekt Red make AAA games, and all are indie/independent studios.
@Angelic I keep reiterating this, but certain people still see indies as a threat to their console, or just something to laugh at. I think indies have been shelling out more creativity for years now, not just on these consoles but last gen too, and on PC even before that. Looking at games like Ethan Carter, SOMA, Dreamfall Chapters, No Man's Sky, WiLD, Everybody's Gone to the Rapture, The Witness, RiME - you can no longer simply look at indies and consider them small fish by default.
You guys should play CounterSpy. Sweet game, man.
@evilsnuggle Being an indie is more about leaving behind asshole publishers like EA and Activision. Talented people like Tim Schafer and Keiji Inafune shouldn't have to make games through crowdfunding, but that's what they do to stay away from the shithead publishers of the world.
That's insulting, putting Black Tusk (who had to can their first game due to bad management) on the same page as industry leaders like ND. Hell, it's insulting to Housemarque even.
Totally agree... Black Tusk has had nothing to show since they opened their doors. They are nothing more than a money sink until they deliver a game, good or bad. Just deliver a game.
It is insulting to lie and say that Black Tusk's first game was canned due to "bad" management, when all signs point to the opposite. Reports were that the prototype they were working on looked great. In fact, BT were entrusted with Gears of War, which is a huge, huge, HUGELY important franchise to MS, which points to them being well managed.
^ What reports are you talking about? And everything was apparently fine, for year after year after year of nothing to show, until the game got scrapped, and that's a good sign to you? By all accounts, from proven insiders at MS, Black Tusk was a poorly managed money pit with no idea what they were doing. The end result was a new IP, years in the making, getting scrapped for more Gears of War.
@DVC Black Tusk is also a brand-new studio, so of course they're not anything like ND, Remedy, 343, Sony Santa Monica, or any of those devs. But from what I've heard, Black Tusk is comprised of "super devs": vets who are former AAA developers from various other studios (according to an insider). These guys are the real deal because of such superior talent all in one studio. We'll see, though.
Zev, year after year after year and nothing to show sounds a lot like The Last Guardian too. I assume that is also mismanagement.
I didn't know any reason had been given as to why their game was cancelled. I know they got put on the next GeOW, so I guess MS trusts them enough to at least manage a flagship IP for them. However, I don't believe working on a GeOW game will really help them be as creative as they could have been given their staff's experience in the industry. I will agree they're unproven as of yet, but for the time being it's worth looking forward to what they will come out with.
Okay, with most people fixated on "resolutiongate", I have to ask: why can't there be a happy medium, with devs opting to improve the fluidity of their game at the expense of clarity? These systems are capable of upscaling, just as today's TVs can do the same thing. Is there really that much of a noticeable difference between a game that outputs 960p and is then upscaled to 1080p? Pixel counting is not what these games are developed for; they are meant to be played. Personally, my order of importance is: fun factor, gameplay, frame rate, resolution. Having fun playing a game is something I have been doing since the 2600. When did that change for some gamers? @canadian... wouldn't that depend on the game?
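For what it's worth, the 960p-vs-1080p gap can be put in rough numbers. A quick sketch; it assumes every resolution is rendered at a 16:9 aspect ratio, which isn't true of every game:

```python
# Rough pixel-count comparison for 16:9 render resolutions.
# Assumes width = height * 16/9, an approximation; actual render
# targets (e.g. 1600x900 vs 1920x1080) vary per game.

def pixels(height):
    """Total pixels in a 16:9 frame with the given vertical resolution."""
    width = round(height * 16 / 9)
    return width * height

for h in (720, 900, 960, 1080):
    print(f"{h}p: {pixels(h):,} pixels ({pixels(h) / pixels(1080):.0%} of 1080p)")
```

So a 960p frame carries roughly 79% of the pixels of native 1080p, which is why an upscaled image can look close to, but not identical to, the real thing.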
Having frame rate that low on your list must mean you have the vision of a shark. Frame rate/screen tearing should be first on that list. Saw someone playing Titanfall on the Xbone the other day with the worst screen tearing I have ever seen. Stutter and screen tearing are just as bad as listening to a skipping song on a scratched CD, if you have good eyes.
I agree with everything you said. "Fun factor, gameplay, frame rate, resolution." My order is just a tad different: gameplay, fun factor, frame rate, resolution. Well said, bubble up.
"When did that change?" You know, as well as any N4G member or lurker that has been on this site for more than a year or two. It changed now that Sony has the edge with multiplat titles, and by a lot more than MS had last generation, because their hardware is more capable and easier to develop for. So if you are going to be coy, at least be an honest broker while doing so. This whole gameplay/fun-factor-over-graphics thing doesn't mean a damn thing to a PS4 consumer, because their console is more capable of offering the best of all worlds. Can we stop the hypocrisy now? I mean, I've been on this site for years, and both sides are guilty of it, but the whole "it doesn't matter until it does" is why articles like this one dominate this site and always will.
@boody, if you are going to be selective, at least be respectful to the selected text. The actual comment is: "when did that change for some gamers". Some... not all. And I am being genuine with my question. Maybe it's because I have been gaming for so long that I don't follow the notion of "it doesn't matter until it does", as you put it. I am an equal-opportunity gamer. I play games that are fun and appealing; I don't play sales or graphics like many others do. I play the games, and if I want to know when the notion of playing games for fun changed, there must be a reasonable answer. Was it the 5th, 6th, 7th gen? Or older? For many, it was no longer about just having fun but about how to be disrespectful to your fellow gamer because they didn't buy or play the same game as another. I am just one person looking for a logical reason behind the great divide. Gaming is supposed to bring people together, not drive them apart. If you don't know the answer, then just say you don't know the answer. But there is an answer, and the sooner we can figure it out, perhaps the sooner we can get back to enjoying this entertainment rather than bickering about it.
I'll go you one better. Fun factor is #1, #2, #3 and #4. After that:
- Compelling story (unless it's a racer, fighter, or sports game)
- Compelling characters (unless it's a racer, fighter, or sports game)
- Novel mechanics/features
- Competitive multiplayer with hosted servers, good matchmaking, and control over cheaters and griefers
- Co-op
- Model complexity and texture detail
- FPS
- Special effects
- Lack of crazy visual errors
- Great soundtrack
- Resolution
Yeah, I also agree "fun factor" is number 1 on my list! Smooth gameplay mechanics would be number 2 on my list.
darth, graphics ALWAYS mattered. You say you've been gaming since the Atari 2600. I was gaming before PONG; we called it pinball. I go all the way back to before conventional home gaming started. Now, in the very beginning people didn't say much about graphics with Pong or the first Atari console, but from that point forward, on nearly every console, most gamers I knew would discuss graphics and how their depth and sophistication kept increasing. In the beginning, as you probably know, gaming was done with a paddle/stick and usually one action button; now we have flight and mech assault setups with unlimited buttons. But graphics ALWAYS mattered. Were they the most important? No. But the biggest draw to newer consoles was nearly always the graphics, first and foremost. It was that way with the NES, with the SNES (OMG, I can't believe how incredible Super Mario World LOOKS), and the 3DO really grabbed the eye-candy enthusiasts and started to blur the lines of traditional gaming graphics (what was once sprite-based becoming polygons) as the near-photorealistic age of gaming began. I used to hang out in gaming stores a friend of mine owned (26 stores) in the Philadelphia/Bucks County area of PA. I worked part time for him, and we went from store to store checking up and hanging out in his various locations. The workers and customers were always discussing which console had the better graphics and which games looked the best, all the way back to the ColecoVision era. It really got heated when the SNES and Genesis launched. You ask how, or whether, 960p vs 1080p makes a noticeable difference; these guys were arguing over 16-bit sprite-based games. It's just that now social media amplifies everything x1000. Graphics ALWAYS mattered. The only reason it's being downplayed now is that MS screwed up, point blank and period. They chose a gimmick and poor vision and choices over processing power.
For years on this site the members were arguing over blades of grass, like I said previously, but now 720p to 1080p is not a discernible difference? That's flat-out BS, and anyone that has done actual side-by-side comparisons knows it. Hell, websites were created just to do comparisons of video games. Whatever happened to Lens of Truth? It appears they lost interest as soon as this generation started. Odd, really /s. So again, stop being coy. I say this because you have been a member long enough to know this. Check my history and you will see I rarely ever got caught up in comparison or sales articles. I'm just keeping it real. Graphics always mattered. It's only being downplayed to this level now because one console is clearly out in front when it comes to performance. Last gen the differences were significantly smaller than they are today, yet this site was a warzone over discernible differences that took magnifying glasses to see. Now 720p to 900p to 1080p is hardly noticeable? SMDH.
"It doesn't matter until it does" is aimed at all the hypocrites that downplay power, options, and features that their console of preference doesn't have. Sooner or later parity in most of those options and features happens through updates and hardware revisions, and then all of a sudden, lo and behold: "I never said it didn't matter or doesn't make a difference. I merely said it wasn't necessary and didn't really add to the gaming experience." Again, all sides are guilty of this, but I find it shameless that a mass of members who got so caught up in the most minuscule of differences just a year ago want to pretend bigger differences today don't matter at all. Gaming isn't world peace. It's a hobby. It's not meant to topple divisions and unite humanity across the world. It's just something most do to relax or hang out with friends after a long day at work or school; to escape the daily grind and get lost and immersed in something that takes your mind off the hustle and bustle of everyday life. Gaming is like any other hobby or source of entertainment: it's divisive and usually segregated. They're called fans, and some are more rabid than others. Myself, I couldn't care less about any of it. I just like to go into my man cave, kick back, and relax. I leave taking sides to those that have nothing better to do. I'm just giving my perspective as someone that has been lurking on, or a member of, gaming sites since their inception. As far back as I can remember, graphics always mattered. Where they rank is up to the individual, and it's all subjective. It's the totality of it all that is the true gaming experience.
BoodyBandit, why do you say the PS4 is easier to develop for? Because AAA Sony studios say so? lol. A while back, The Witcher 3 developer Balazs Torok of CD Projekt Red claimed that it is actually the Xbox One that is the easier console to develop for. "The Xbox One is pretty easy to understand because not just the hardware is similar to the PC, but everything like the SDK, the API is really similar to what you would find on a PC," said Torok, speaking to Eurogamer. "On the PS4 it's very good to have the fast memory. Everyone is really happy about that," Torok said. (Faster memory is the main reason why the PS4 runs higher resolutions.) "I don't see a major power difference. The memory is very different but I already said that before. Pure computation power, if you just measure that, there's no major difference."
The whole situation is wrongly understood by, let's face it, uneducated games journalists who don't know how hardware functions. Resolution and FPS are not something that can be achieved if one just tries hard enough; it's a question of whether the hardware can push that many pixels in a given amount of time or not. Software creates shortcuts to make the task easier for the hardware, but the hardware is always the upper limit. There's no problem for any game to achieve 1080p/60fps; it comes down to how expensive the calculation per frame will be, i.e. how much graphical flair you want in a game. Can you accept games that look static in their graphical elements, or do we as consumers in modern times demand that fidelity rise all the time, that games look ever more realistic? It's a trade-off; you have to balance one against the other, and less powerful hardware means you have to cut more on the fidelity side to achieve good performance. The whole discussion is stupid, a self-playing piano by now, with stupid people feeding the machine of stupidity. This is why Xbox One games won't ever look as good, and if you want an Xbox One game and a PS4 game to have the same assets and so on, you will need to lower the resolution on the Xbox One. This will become true for the PS4 too further into the generation; the problem is that the Xbox One has this issue already.
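The hardware ceiling described above boils down to a fixed time budget per frame: every effect must fit inside it, or the frame rate drops. A minimal sketch (the millisecond arithmetic is exact; everything else about real frame cost is far messier):

```python
# Per-frame time budget at a target frame rate. A game "achieves" 60fps
# only if every frame's work fits inside ~16.7ms; more graphical flair
# per pixel means fewer pixels (lower resolution) or a lower fps target.

def frame_budget_ms(fps):
    """Milliseconds available to render one frame at the given frame rate."""
    return 1000.0 / fps

print(f"30fps -> {frame_budget_ms(30):.2f} ms per frame")
print(f"60fps -> {frame_budget_ms(60):.2f} ms per frame")
```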
You got disagrees, but you're right. 720p and above is fine with me. You're from my era; these kids don't know the pain we went through... lol. The Atari 2600 was not exactly knocking the graphics out of the park.
I know that most of you don't do homework on both these systems. Let me just say, the PS4 is not the powerhouse you may think. Developers are always going to say how good their game is, but we must always keep that in mind. The PS4 is easy to code for due to it being a standard AMD GPU of today. Its memory system is, in my opinion, horrible. The PS4 is not full HSA. The PS4 is not 176GB/s effective bandwidth; that is why developers don't tell you the PEAK! The PS4's GPU is really balanced for 14 of the 18 CUs; the other 4 will be for GPGPU. The XB1 still has a long way to go while the software (SDKs) catches up, but as I have said before, bandwidth is the best way to indicate which system is and will be more powerful. Go on the Internet and look at the history of bandwidth in consoles; it will tell you that 99% of the time the system with the most bandwidth is the most powerful. And as it stands, that score right now is as follows.
Effective bandwidth:
XB1 ESRAM + DDR3 = 190-200GB/s effective
PS4 GDDR5 = 135-140GB/s effective
Theoretical peaks:
XB1 ESRAM read: 1024 bits x 853MHz / 8 = 109GB/s peak
XB1 ESRAM write: 1024 bits x 853MHz / 8 = 109GB/s peak
XB1 DDR3: 256 bits x 2133MHz / 8 = 68GB/s peak
218GB/s + 68GB/s = 286GB/s peak
PS4 GDDR5: 256 bits x 5500MHz / 8 = 176GB/s peak
Now sooner or later you must ask yourself, "why does the XB1 have that much BW for a weak console?"
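The "bits x MHz / 8" peak figures quoted above reduce to one line of arithmetic. A sketch of the calculation only; whether these theoretical peaks say anything about real-world performance is exactly what's disputed in this thread:

```python
# Theoretical peak bandwidth: bus width (bits) x effective transfer
# rate (MT/s) / 8 bits-per-byte, expressed in GB/s. These are peak
# numbers only; sustained ("effective") bandwidth is always lower.

def peak_gb_per_s(bus_bits, mtps):
    """Peak bandwidth in GB/s for a bus width and effective transfer rate."""
    return bus_bits * mtps / 8 / 1000

print(peak_gb_per_s(1024, 853))   # XB1 ESRAM, one direction: ~109
print(peak_gb_per_s(256, 2133))   # XB1 DDR3: ~68
print(peak_gb_per_s(256, 5500))   # PS4 GDDR5: 176
```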
@marcofdeath That's just a bunch of bullshit. You can't add the two pools' (DDR3 + ESRAM) bandwidth together and call that a factual end number. In reality, the split pools create choke points where the theoretical max will never be achievable. Your conclusion is also seriously flawed, since it ignores any software gains that will be made on the PS4; in a sense, your conclusion just disproved your own point. And that's ignoring the fact that memory bandwidth has nothing to do with calculating pixels, only with moving them. The ESRAM is also way too small for the bigger BPPs that come with higher resolutions, which demands a reduction in graphics to push more pixels with any sort of AA applied. Since you obviously can't read numbers, maybe you can look at a picture: http://www.spawnfirst.com/w...
@marcofdeath: you keep regurgitating the same nonsense. none of it is right, no matter how many times you repeat it over, and over again. PS4 is a more powerful machine than the xbone. it's time to let it go, and accept that fact.
I agree with you, except my frame rate and resolution would be switched. After playing ESO on a laptop that can't handle it on good settings, I would much prefer higher res than I have now over an fps increase; I'm at 20-30fps as it is.
something ain't quite making sense about these "new" consoles lately. From both parties.
It's called a learning curve; things get better as the generation goes on. Let them learn the hardware. I own both consoles.
I find this funny because SOE can run a 2000-player MMO on the PS4 at PC ULTRA settings @ 1080p/60fps... Devs have zero excuse from here on out. At all.
.... I had no idea every game ever made runs on the PlanetSide 2 engine. Different games, different engines; different optimizations need to be done.
PlanetSide 2 was released over two years ago on PC, and at the time its graphics were not spectacular. Have you even bothered to look at the screenshots? http://i.ytimg.com/vi/7HCh5... And FYI, the PS4 version will be lacking some features found on PC, like nVidia PhysX effects: vehicle motion and handling, volumetric fog and clouds, as well as particle effects.
Black Tusk? Nope. You have to earn your spot at the top like Naughty Dog has. What has Black Tusk done except spend millions with nothing to show for it? It's been proven time and again that all-star dev teams can't deliver the goods, because the team members can't see past their own egos to cooperate on developing a game. Too many chefs in the kitchen, as they say. I'm pretty sure the company will dissolve before they release anything.
I think you're just talking about the quality of talent that's there at Black Tusk. I just hope they are working on maybe 2 or 3 games, not just Gears of War. We need new IPs from them. I know they're creative people, so they need to use that talent.
...but at least it's a discussion and a possibility when it comes to the PS4. I'll be VERY curious to see how this gen goes and how many games achieve this on the PS4. Different game than the subject, but I'll be very curious to see what Witcher 3 can do on PS4. That will tell me a lot.
There's no doubt at all that this gen will hit its ceiling much sooner. Devs will find ways to optimize and push the hardware, which is great, and others will focus on gameplay and unique experiences to offer something new, but graphics, resolution, and frame rate will hit a wall. I was hoping so much that they would have offered two different models, or upgradable tech for the future. It could remain a closed system, with minimum and maximum settings more static than on PC; it would have given more breathing room. Either that, or in three years at E3 we will see the PS5 shown for another $400 investment. But we can already see how games like The Witcher 3, while beautiful, require more than is even offered right now to push the limits. I would have dropped $600-700 for a PS4... and imagine what that would be capable of.
It shouldn't be a "discussion or a possibility". Both next-gen consoles should run at that res and fps as the baseline, not "hope to achieve" it. So if you look at the whole situation objectively, you have been sold a bill of goods, especially when Mark Cerny came out and called the PS4 a "supercharged PC".
...the question is HOW far can they (any dev, not just these guys) push the graphics while keeping 1080/60. Farther than the Xbone, I'd wager, but not anywhere near a $1600 gaming rig. If The Order: 1886 and UC4 are examples of the PS4 not being maxed out yet, I'll be just fine with that! They look amazing to me! (What do people want?) Personally, I'd rather see advances in physics (no more clipping, please!) and AI this gen, rather than it getting any more photorealistic than The Order: 1886... Games always look more real in pictures than in video, because interactions, whether with the environment or with the AI, always look a little... fake. That's what I'd like improvement on in the next 5 years!
What do people want? If you like photorealistic, immersive environments, I predict that you will be happy this gen. :) But on the other hand, if you are a pixel-counter (or polygon-counter, or frame-counter), then you will frequently find reasons to be disappointed. Too much focus on the numbers, not enough on the game.
A $1600 gaming rig is a bit of an exaggeration. Intel Broadwell will already have a very capable iGPU; even the older Iris Pro 5200 is surprisingly good. An Intel NUC with Broadwell as a gaming machine could be quite interesting already. I wouldn't be surprised to see gamers with Broadwell + 64GB RAM + 12GB dGPU(s) gaming rigs next year (of course not just for gaming but for doing some work as well). At the same time, I enjoy games that are complex and demanding in ways other than graphics. Dwarf Fortress - http://www.bay12games.com/d... Unreal World RPG - http://urw.fi/ Europa Universalis - http://www.europauniversali... X Motor Racing - http://www.xmotorracing.com... ... http://www.youtube.com/watc... http://www.intel.com/conten... http://www.guru3d.com/news-...
You do realize that much memory is useless? So is 12GB on a GPU that won't use it.
@Mithan: Maybe today, in games that have been ported from consoles. As a programmer, and especially for people working on graphics and media stuff, more memory is only good. Hopefully in the future games will start to use PCs' capabilities, but that's not going to happen for AAA games from Ubisoft, EA, or similar, who have to make sure their games run on much lower-spec consoles and PCs as well. PCs could have very strong CPUs for AI, for example. I guess it's much easier to find highly skilled graphics programmers than AI programmers.
Go hard or go home
So disappointed this gen. These consoles are supposedly like 4-6 times more powerful than last gen, and still no 1080p/60fps.
Me too. I think the new consoles need new engines to reach the gold standard of 1080p/60fps.
The funny thing is, I heard developers like JOHN CARMACK say, months before the 8th-gen consoles came out, that they would have a hard time running games at 60fps and 1080p with a lot going on the screen at the same time. Why did so many journalists and gamers not know this????
Not to mention 4K is right around the corner, 4K TV's are starting to flood the market, and the consoles can't keep up. I have a PS4, but I can't help but feel slightly unfulfilled knowing 4K games aren't in the pipeline.
And they can continue to; 4K won't be mainstream for 5+ years, probably closer to 8+, because of prices and lack of content. They don't need to be doing 4K gaming yet, but they should support 4K media playback. Current console games can do 1080p/60fps if they want to; the consoles are powerful enough in most cases. It's devs that choose graphical effects and other effects as a priority over frame rate. You only need to look at Nintendo and all the 60fps games on Wii U for that.
It's only been one year. The XB1 and PS4 will get more powerful as time goes on; one of the consoles has already given you a glimpse of its potential. Most of the engines are old ones retooled from last generation, therefore they do not take a closed-box environment into consideration.
Quote: "I know that most of you don't do homework on both these systems. Let me just say, PS4 is not the powerhouse you may think. [...] Now sooner or later you must ask yourself, 'why is it that XB1 has that much BW for a weak console?'" The Oddworld developer says hello: http://gamingbolt.com/oddwo... So much for PS4 bandwidth. And Diablo III has framerate drops on the Xbone. Oh yes, there is Metro Redux also: http://www.eurogamer.net/ar... If the Xbone's SDK will improve, then the PS4's SDK will improve also. Oh yes, and about Star Citizen DX12: he was referring to the PC, dude. He doesn't even work on the Xbone or PS4; he is STRICTLY a PC developer. Of course, you and the others on the MRX blog decided it was about the Xbone. Funny! Nobody takes you seriously, dude. Especially when you wrote that the Xbone has dual-GPU crap: http://i.imgur.com/MJhH0uK.... Try harder!
@imt558 I was right about the PS4's weak GDDR5 (terrible memory system) only being 135GB/s. You still don't "GET IT": 176GB/s is the PS4's theoretical bandwidth; its effective bandwidth is 135GB/s. Just like the XB1's theoretical bandwidth is 286GB/s, but its effective bandwidth is only 190-200GB/s, as said in the Digital Foundry article. Developers are bound by agreements, and in the gaming business are always the last to know. Case in point: Diablo 3 at 1080p was given the code by Microsoft. Also, as I have said, it's an old engine, so it should run great on PS4. This developer at GamingBolt - I didn't even have to click on it, because I know what he can say; that's why there's no context for the bandwidth. Publicly he can only say what has been announced by Sony. He is giving you Sony's PR talk. The proof of that is here: http://www.eurogamer.net/ar... "Jenner wouldn't go into 'details' on the levels of 'bandwidth available' for each bus owing to confidentiality agreements". Note how the DF article writer falls into the same PR line of 176GB/s. The developer knows that he cannot give the real bandwidth specifications; the only thing that can be repeated publicly is the PR line of 176GB/s. That development document protects Sony from any libel; that's why it's still up. Beyond3D now fully agrees with my findings, get over it. It's 135GB/s. As to your "the Xbox One doesn't have two GPUs": I will follow the proven information that I and others can find, just like I did with the PS4's effective bandwidth. It's not about you or me; it's about information that can be proven as fact. My history proves that. It is not only me saying the Xbox One has multiple GPUs (a 768 GPU and a 768 GPGPU for ALU only, so technically not a GPU), but unlike you, I get some information from the horse's mouth, like Charlie Demerjian (no Microsoft fan) over at http://semiaccurate.com/?s=... You need to read them and look for the word "GPUs".
Then there is this one, from the Xbox One architecture software designer: "The 'GPUs' are really complicated 'beasts' this time around." http://www.totalxbox.com/74... That is why I said that, over time, it will pass the PS4... an ALU GPGPU that I can only imagine how much work it's going to take to program properly. Next up is the actual physical evidence. Your PS4 GPU has what is called global shared memory, commonly known as GSM. It controls the number of threads (64) for your PS4, which has 32kB for its GPU. Now, the Xbox One has 64kB for its GSM, 2x that, for 128 threads. So you see, I don't make blind accusations; I back my shit up. But it's still only speculation at this point, until I can find confirmation from other sources. You cannot find other sources to back up your 176GB/s "effective bandwidth", and that's why I always win when talking to you. If people disregard me, that's fine, but I have been right more than you. "ESRAM IS A BOTTLENECK": wrong. Thank you, you're making me famous. I love to speculate and voice my theory, but I will always tell people that I am speculating, and that's why I post there. Just like the PS4's 176GB/s peak and 135GB/s effective bandwidth: that's not speculation, it's FACT.
@marcofdeath: "I was right about PS4's weak GDDR5 (terrible memory system) only being 135GB/s." No, you're not right about that at all. "Case in point: Diablo 3 at 1080p was given the code by Microsoft. Also, as I have said, it's an old engine, so it should run great on PS4." Hahaha, no, they didn't just give them magical code. And if Diablo is running on an old engine, why was Blizzard struggling to get it to 1080p on the Xbone? I mean, if the hardware is beyond next-gen, then it shouldn't have a problem, right?? "Beyond3D now fully agrees with my findings, get over it. It's 135GB/s." Yeah... no. I doubt they fully agree with you, since they banned you from that website for being horribly inaccurate, because you were throwing out numbers generated by MisterX's rear end. "He is giving you Sony's PR talk. The proof of that is here: http://www.eurogamer.net/ar... 'Jenner wouldn't go into details on the levels of bandwidth available for each bus owing to confidentiality agreements'" - and you conveniently left out the continuation of the sentence: "but based on our information the GPU has full access to the 176GB/s bandwidth of the PS4's GDDR5 via Garlic". "My history proves that. It is not only me saying the Xbox One has multiple GPUs." Bahahahahaha, no it doesn't: https://twitter.com/albertp... "'ESRAM IS A BOTTLENECK': wrong." Actually, it's right, and it's why the Xbone has been struggling to get to 1080p most of the time. "Just like the PS4's 176GB/s peak and 135GB/s effective bandwidth: that's not speculation, it's FACT." Nope. Not fact at all.
@etownone, I'm not disappointed at all. The PS4/XB1 have and will continue to have great games. It's just started.
I don't doubt great games are coming. But easily, the first wave of games should have been at 1080p/60fps... Oh well.
If promised by devs and Sony/MS heads, then I agree. I don't remember if the console reveals addressed this topic? I do know the specific features were a letdown. The consoles felt rushed, then finished with updates.
Because that alone causes 4x the amount of work, plus all of the other upgraded visuals and bigger landscapes.
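The "4x the work" figure checks out as a rough pixel-throughput ratio. A sketch only; rendering cost doesn't scale purely with pixel count, so treat it as a ballpark:

```python
# Pixels pushed per second at a given resolution and frame rate,
# used as a crude proxy for rendering work.

def pixels_per_second(width, height, fps):
    """Raw pixel throughput: total pixels drawn per second."""
    return width * height * fps

last_gen = pixels_per_second(1280, 720, 30)    # a common last-gen target
this_gen = pixels_per_second(1920, 1080, 60)   # the 1080p/60 "gold standard"
print(this_gen / last_gen)  # 4.5x the raw pixel throughput
```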
I hate when people call other gamers pixel counters, like it's a bad thing to want a game to be clear (resolution) and to run smooth (fps). It's not a bad thing; in fact, it's one of the most important things. You could have the greatest game of all time, but if it runs like s*** and looks like Vaseline is smeared all over it, then it's just not worth it.
I hate it because they act like that's the only thing that matters, period. I mean, it could be an article about graphics, and folks complain about people debating the subject. That makes no sense to me. The same as the ones who say "suddenly gamers only play sales" in a sales article. Are these people broken?
I'm only 20, so I can't even begin to imagine how some of you older gamers lived before HD. Having 1080p is a benefit, but 700-900p is great too. 30fps is great; 60fps is better. As long as a game is 720p and 30 frames, it's at a resolution and frame rate that is solid. Anything more is icing on the cake. I'd rather have increased effects, number of objects, and other things before frames and res are pushed past those points.
This better be a joke. WTF do you need bro! Get outta here with that nonsense. Go on and make games and use those big brains of yours. Stop with the "limitations" of the hardware already.
How many GB of RAM are these dev kits using now? I know that dev kits were given out that only made use of 4GB RAM in the first few months of the PS4's release. I'm just wondering if that's still the case since it has been confirmed that the PS4 operating system only needs 1GB of RAM to run in the background while an application/game is running.