


Thoughts on the Next-Gen and the Industry (a technical article from a Software Developer)

The "next-gen" is coming; in fact it is "technically" already here with the WiiU. After going through countless posts on N4G and other sites, and reading so many comments, I decided to write a quick blog post, as objective as possible.

First, my general background: I currently work as a Software Developer specialising in data encryption algorithms, mainly in Java, with pseudo-parallel programming using Java threads. For the last 2 months I have been making an Android game with some colleagues for fun (a Tron-like, just to explore the Android API). I won a game development competition run by MS in the past, so I am no stranger to how game developers think. I did a BSc in Computing and applied for a PhD with Web Development as my research topic -I can't be more specific in a public post ;-) As you can see, I have a technical background and this post will try to reflect it. I will try to avoid quoting numbers, as everyone here has memorised them and I do not want to write another "numbers talk" post. In fact I find most recent "gaming journalism" pieces are written by people who, in my opinion, just copy numbers and compare them using primary-school mathematics.

My (recent) gaming background: I own a PS3 and an Xbox 360, plus a maxed-out MacBook Pro. I used to own a gaming rig, but due to work I had to switch to the MacBook. I still game on the laptop; in fact I'm playing games such as BF3 and Dishonored via Boot Camp on ultra (full HD) at 30-60 FPS, and I play DOTA 2 in a virtual machine, again on ultra. So yes, I can play ALL games except Nintendo exclusives. In the interest of full disclosure, due to limited time I do not game as much as I used to. In the past I owned a PS2 and an original Xbox.

Now that you have an idea of who I am, let's start ;-)

PS4 announcement and specs:

The PS4 is interesting from a developer's point of view, mainly due to the discrete ARM CPU. Having a second CPU in a commercial product is rare, and I wonder whether developers will be allowed to use it, or whether the OS will run on it, leaving the main CPU entirely to developers.

The 8GB of GDDR5 RAM is the hottest topic, however. PC fanboys argue that GDDR5 is not needed, or that PCs have had it in their graphics cards for some time now. Indeed, the AMD/ATI Radeon 4870 was the first card to use 1GB of GDDR5, back in 2008. Does this count? No.
Let's see how a PC works in a nutshell:
The game asks the engine, which asks the OS, to load data. The OS loads "blocks" of it into main memory (RAM). If the data needs CPU processing, it goes through the northbridge to the CPU (on older Intel platforms) or through the memory controller integrated into the CPU (on AMD chips). If the data needs GPU processing, it travels over the PCIe bus to the graphics card and is stored in the card's own RAM. Textures are usually loaded from the hard drive to the graphics card directly -if the developer is sensible enough not to leave everything to the OS. This is how Windows works: the OS layer always sits on top of the hardware. This, together with the time it takes to move data to the graphics card, are the two main bottlenecks.
Does the graphics card benefit from GDDR5 over GDDR3 or 4? Yes. The GPU is a parallel processor, not a serial one like the CPU: multiple (hundreds of) calculations can take place at the same time. Thus high memory bandwidth is needed to keep the GPU fed with data.
Will a CPU benefit from GDDR5 over DDR3? Yes and no. The performance difference would be minor and would not justify the cost. Thanks to Intel's Hyper-Threading and the multiple cores both Intel and AMD offer, CPUs can already do "pseudo-parallel" processing.
This is why nVidia is at "war" with Intel. nVidia is promoting CUDA as a parallel-processing standard that makes the CPU a secondary component compared to the GPU! CUDA is already used for physics calculations in games, as well as financial and academic workloads: JP Morgan, the financial giant, uses GPU computing, as it is called, for everyday financial calculations.
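As a rough sketch of what "parallel processing" means here, the snippet below splits one array-wide calculation across CPU threads. CPU threads only stand in for a GPU's hundreds of cores -real GPGPU code would use CUDA or OpenCL- and the function names are my own, purely for illustration:

```cpp
#include <cassert>
#include <thread>
#include <vector>

// Data-parallel "kernel": each worker transforms its own slice of the
// array independently -- the same shape of work a GPU runs across
// hundreds of cores at once.
void scale_slice(std::vector<float>& data, size_t begin, size_t end, float k) {
    for (size_t i = begin; i < end; ++i)
        data[i] *= k;
}

std::vector<float> parallel_scale(std::vector<float> data, float k, unsigned workers) {
    std::vector<std::thread> pool;
    size_t chunk = data.size() / workers;
    for (unsigned w = 0; w < workers; ++w) {
        size_t begin = w * chunk;
        size_t end = (w + 1 == workers) ? data.size() : begin + chunk;
        pool.emplace_back(scale_slice, std::ref(data), begin, end, k);
    }
    for (auto& t : pool) t.join();  // wait for every slice to finish
    return data;
}
```

Note that every element is read and written exactly once, so with enough workers the run is limited by memory bandwidth rather than arithmetic -which is exactly why a GPU wants GDDR5's bandwidth.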
The PS3, for better or worse, used Cell, a prime example of parallel processing. This was the PS3's main advantage and its main disadvantage, because developers are LAZY. One of the first things we learn at university is DRY: Don't Repeat Yourself. Multiple platforms with different architectures? Write code for one, port and edit the syntax for the rest -and hope it works!

Now, what does all this have to do with the PS4's GDDR5?
Sony learned from the PS3 that they can't ignore standards and that developers are really lazy. Creating a new architecture or upgrading Cell wasn't going to work. Parallel computing is the future, however: PCs are getting more and more powerful graphics cards, and with new APIs it is becoming easier and easier to program. That is why they showed the physics demo, a prime example of parallel processing. To "utilise" parallel processing, as explained above, fast memory for the graphics card is a must, and GDDR5 was the obvious choice. That explains the GPU using it; what about the CPU?
This is where I take my hat off to Sony -if I am right and Sony gives developers the freedom to program in low-level languages. They knew the GPU needed GDDR5, but they also knew the bottleneck of moving data from the CPU to the GPU; to this day, x86 needs a CPU, for better or worse. By using the same memory pool for both the CPU and the GPU, no data has to be transferred between them! The developer only has to hand over a pointer/reference (in C/C++ terms) from the CPU to the GPU.
Because of this, they couldn't go with half DDR3 and half GDDR5. Will the CPU benefit from GDDR5? Not much. Will the system benefit from GDDR5's insane speeds? Not much directly, but a lot indirectly.
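The pointer hand-over above can be sketched in C++. This is only a toy model of the idea -both "sides" here live in ordinary system RAM, and the names are mine, not any real console API:

```cpp
#include <cassert>
#include <cstring>
#include <vector>

// Split-memory model (today's PC): the "GPU" works on its own copy,
// so every hand-off pays for a full transfer of the buffer.
std::vector<float> to_gpu_by_copy(const std::vector<float>& cpu_buf) {
    std::vector<float> gpu_buf(cpu_buf.size());
    std::memcpy(gpu_buf.data(), cpu_buf.data(),
                cpu_buf.size() * sizeof(float));  // simulated bus transfer
    return gpu_buf;
}

// Unified-memory model (the PS4 idea): CPU and GPU share one pool,
// so "handing over" the data is just passing the same pointer.
float* to_gpu_by_pointer(std::vector<float>& shared_buf) {
    return shared_buf.data();  // no bytes move at all
}
```

The copy version touches every byte twice (read plus write) before the GPU even starts; the pointer version is free, no matter how large the buffer is.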

The PS4 -like all consoles- will match a mid-range PC on paper at release, perform like a high-end PC (assuming the developer is good enough) at its peak, then perform like a mid-range PC again, declining towards low-end PC quality by the end of its life.

Just a note: specs on paper aren't everything. You can see this everywhere, even in smartphones: Windows Phone 8 performs fast on a dual-core CPU, while Android needs a quad-core for similar performance. iOS has some amazing, good-looking games despite iPhones having lower specs on paper than Android flagships. I won't go into why, nor do I want to start a smartphone flamewar.

The NextBox:
Yes, we have rumoured specs, and people are already comparing it to the PS4. I will keep it simple in this post:
IF the NextBox has the rumoured specs, it may be inferior to the PS4 on paper, but in reality I doubt it will show in 3rd-party games.
In 3rd-party games we will see ports from one console to the other, with similar performance and results. The only visible differences I expect are in colour output and sharpness, due to the different rendering. I just hope Microsoft doesn't try to ENFORCE their own standards again -as they are doing now with RTC for the Web- and make DirectX mandatory. That would limit developers...

The WiiU:
Is it a next-gen console? Technically, yes. Is it a current-gen console? Again, yes. In terms of raw power the WiiU is falling behind... but is it as weak as people say? In my opinion: no. The WiiU was built with GPGPU computing in mind. However, the whole architecture is a giant bottleneck for developers, and unless they spend extra time optimising, games won't look as good as PS4/NextBox games. Sure, exclusives will look nice ;-) In fact, gamers who want the Nintendo experience have a fantastic journey ahead of them! I am tempted to get one, but not at the moment.
The Wii was a success due to its innovation and price; the WiiU's price point is too much. Parents and casuals buying their first console may even go for a PS3/360 now that they are cheaper. Upgraders from the Wii are either Nintendo gamers who will get a WiiU -if they haven't already- or casual gamers who may go for a PS3/360 and still call it an upgrade. In any case I hope the WiiU sells well, and EVERY gamer should hope so too. It is in everyone's best interest to have multiple players in the console market... competition drives innovation and price drops ;-)

The PC:
Oh boy, this is the hard one. I expect people to have started defending it already from my PS4 section above. The PC is in a "different" league from the rest: you get graphics as good as you are willing to pay for. It is unfair, in my opinion, to compare graphics between PC and consoles. If a console performed better in a multi-plat game than a high-end PC, the developers should stop working.

The main advantage -and, in the long run, disadvantage- is that it is an "open platform": ANYONE can program and distribute a game/application on it. This leads to piracy, and the industry nowadays either uses DRM to counter it or treats the PC as second-class *looks at Infinity Ward*. PC-only developers are slowly going multi-plat, not just to other platforms but also to Mac/Linux. It has been shown that Mac owners are more likely to buy a game; even if they are a minority, it is added revenue. Blizzard, ArenaNet and others are doing this successfully. The Steambox will use a Linux distribution, so more developers will try to support UNIX OSes...

We had digital distribution services in the past (e.g. Direct2Drive)... but now games REQUIRE you to register them with Steam/Origin. Sure, with some file editing you can bypass that; you can even bypass the online-only requirement in Diablo 3. However, it makes the average gamer wonder whether it is worth the trouble, and they end up buying the game.

Steam is a complete environment, with achievements, cloud storage and so on. This leads to the question: how different is a console from a PC now? Not very, is it? Especially with the Windows 8 marketplace, and Microsoft pushing the DirectX and XNA APIs for games over OpenGL.
At the end of the day, Steam and Origin aren't much different from PSN/Live -especially the latter. Long gone is the idea of an "open" system, especially if we follow MS's way of using DirectX and their marketplace...

Hey, are you claiming PC gaming is dying? No, far from it. I am saying it is evolving. Steam, while a closed system, gave so many small developers the opportunity to publish their games through Greenlight... so they no longer need to go "console exclusive" for arcade titles. Small developers now have these choices:
-Publish in the MS/Apple app stores using those stores' standards.
-Publish for consoles, as PSN/Live are there, with almost no fear of piracy.
-Go multi-plat for an increased user base and thus revenue, using Steam for PC/Mac, and maybe try the MS/Apple/Amazon/... app stores as well.

At the beginning of this generation, arcade developers were publishing only on PSN/Live or trying to sell directly from their own webpages. Steam helped them... In return, developers will publish their games on the Steambox. PC gaming is becoming less Windows-centric... sure, AAA games are still built on DirectX, but it is only a matter of time. After all, if the PS4 uses OpenGL/CL on x86, it will make sense for developers to use them for PC ports or as the lead platform.

Final thoughts:
The industry as a whole is declining. We have lost publishers such as Sierra and THQ. It is stupid of us to argue over which platform is better when EVERYONE just wants our money. Annual re-releases, the milking of IPs, and DLC are prime examples of what publishers do nowadays.

One thing that has always amazed me about gaming journalists is when they attribute performance gains to "new algorithms". This is complete CRAP. Anyone with a decent Computing degree knows Big-O notation and how algorithmic efficiency is measured. If you managed a genuine algorithmic breakthrough -say, an efficient algorithm for the travelling salesman problem- you would settle P vs NP, collect the Clay Institute's million dollars and most likely a Turing Award... Don't listen to them! The improvements you see are due to:
-Marketing: software-side "graphics boosts" sell more and keep console sales up.
-Developers actually starting to learn the APIs and optimise their games, instead of just porting...
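To illustrate the Big-O point above, the toy snippet below counts the probes a linear search and a binary search make over the same sorted array. A genuinely better algorithm wins through a better growth rate (O(n) vs O(log n)), not through the vague "new algorithms" journalists cite. The function names are mine, purely for illustration:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Count the probes a linear search makes: O(n) in the worst case.
size_t linear_probes(const std::vector<int>& sorted, int key) {
    size_t probes = 0;
    for (int x : sorted) {
        ++probes;
        if (x == key) break;
    }
    return probes;
}

// Count the probes a binary search makes: O(log n) in the worst case.
size_t binary_probes(const std::vector<int>& sorted, int key) {
    size_t probes = 0, lo = 0, hi = sorted.size();
    while (lo < hi) {
        ++probes;
        size_t mid = lo + (hi - lo) / 2;
        if (sorted[mid] == key) break;
        if (sorted[mid] < key) lo = mid + 1; else hi = mid;
    }
    return probes;
}
```

On a 1024-element array, the worst case is over a thousand probes for the linear search but only about ten for the binary search -and that gap widens as n grows. That is what a real algorithmic improvement looks like.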

This generation isn't as much about hardware as it is about prices and features, such as sharing on the PS4 or whatever the NextBox will include. I hope that, thanks to Steam and the consoles' digital stores, we will see smaller developers making awesome games, available to all! ;-) See the Ouya, for example: the hardware is awful compared to other consoles, but lots of games are coming. Why? It is affordable, a distribution store is there, and it is easy to program for and port to.

I hope I made sense, and I will happily answer any questions. :-) I may add to this article in the future. Sorry for my English; it is my second language (3rd if you count Java ;-)).

Derekvinyard13 (1936d ago)

Great read. Can't believe Sierra is gone :-( The industry is declining. Your final thoughts are dead accurate; I'm glad some people understand what's going on. If you don't mind me asking, how old are you? Veteran gamers seem to see the industry differently than kids who grew up Call-of-Duty-ified. 20s? 30s?

Athonline (1936d ago)

Thanks! :-)

I am in my 20s. I graduated high school, served in the army for two years (it was mandatory), and left the gaming world for a while. After the army I went to uni and found the industry like this... declining at an accelerating rate. Now, as I said above, I have recently started working as a software developer.

Back then I was saying we should stop thinking in brands and try to discourage developers from milking franchises. Now the effects are here, with all these season passes, annual "re-releases" and so on...

The industry used to be about "interactive art", a medium for storytelling or pure entertainment... this past generation I found myself no longer enjoying games, just completing them for that extra achievement point or trophy... or having to replay a game 3-4 times to justify the price.

It used to be single-player with OPTIONAL multiplayer, or JUST multiplayer with a few bots. This generation it is like a rule: now that network programming has become easy, every game must have both multiplayer and single-player, even if most of the time one of them feels out of place...

I don't blame the developers at IW for Call of Duty; after all, FIFA did the same thing for years. It is just mainstream now, widely accepted, and all this "gaming journalism" cares more and more about such games. It is us gamers who are to blame, as we are "dutyified", as you said. We accept an annual re-release of the same game and, on top of that, season passes, etc.

RIP Sierra. I miss them. Same with Westwood and Bullfrog Studios...

Derekvinyard13 (1936d ago)

Couldn't agree with your opinion more. I'm in my 20s too. The thing is, we came from such a great, exploitation-free generation that it's sad to see this decline from what games used to be -like you said, a form of art. Now it's all about money. There is no innovation, only sequels, because developers cannot afford a single flop; they have to stick with what will sell, and we have seen studios close down because of this. Which dev that really changes things up do you have a lot of respect for? I think I know what you're going to say, being from the same era.