I read an interesting article today from G4 regarding graphics in video games. It got me thinking about the games I grew up with and what we have today. Video games have a long and well-documented history going back several decades, from the games played on big mainframe computers to today, when a game can be played on a cellphone. Graphics have improved substantially with each successive generation.
When the first video games came out, the graphics were, to put it mildly, less than mind-blowing. The screens usually had a white object (à la Spacewar!), or whatever you controlled, on a black background. The vector graphics of early games became something to behold. Eventually games evolved to 2D sprites, and from there to 3D polygons. The jump to 3D was a huge advance for gaming and for showing the realism that was possible. With advances in textures, lighting, and processing power, we have some amazing-looking games today, but is that really what we need?
We all like to see the eye candy in today's games. Who doesn't like to see the way the foliage moves in Crysis? Uncharted includes realistic-looking and realistically moving characters in addition to a good-looking environment, but is all this realism something we really need? Could realistic games be bad for gaming? Let's take a look at the costs of getting all that pretty scenery.

As games become more realistic, we see a huge increase in the amount of data required to render those environments and characters. It takes a lot of information to get Crysis to look as drop-dead gorgeous as it does. What price do we pay for that? Obviously, we need a system with enough processing power to read all that information and display it on the screen as it was meant to be seen. If our computer or console is unable to read and send out the information fast enough, we see drastic problems with the rendering. Slowdown is to be expected in games with highly detailed environments, although developers work hard to make sure their games are compatible with as many systems as possible.

Another issue we see more and more these days concerns the length of the game. Developers face a hard choice in striking the right balance between quality and quantity. We all want more of our favorite games. Wanting to see more and different environments in Uncharted, or another area to explore in Halo 3, are natural reactions from gamers. For the game makers, though, that one more level may mean another $500,000 in development costs, or adding a second disc to hold the extra content. Both are important considerations when trying to keep development costs down.
For some games, though, realism is the only way they work. The Normandy beach level in Medal of Honor would not have made the impression it did if you didn't see the bodies, hear the bullets and mortars, or see the Germans in the distance with their machine guns. You needed to feel a connection with the world, the real world that was, in order to get the full impact and emotional response of the moment. When you charge the beach, you can imagine being with the soldiers who were actually there, some 50 years or so before you were even born. It brings us a connection to what was, and maybe even what is to come.
Realism may be a challenge for developers, but in the right world, under the right circumstances, it is a requirement.
As always, I'd like to know what you think. Yes, you, the one who kept muttering to himself while reading this article. You just need to speak up a little more so I can hear.