HD and Next-Gen. Why are our games still sub-HD?
This is going to be my first post on here, so bear with me and I'll try not to ramble too much.
I've seen many discussions in the past that are full of arguments about resolution this and fps that, but I decided to put this together after reading through a recent post about The Order: 1886. With next-gen upon us, many people are wondering why we are still seeing resolution and fps numbers reminiscent of last-gen. Considering the rhetoric used and promises made by both Sony and Microsoft, I don't blame anyone for asking. Well, let's disregard whatever was said and break this down a bit.
The main problem with assuming next-gen would mean a strict 1080p60 is that it ignores everything else competing for performance. All else being equal, the new hardware could easily run last-gen games at 1080p60. But what about higher-resolution textures? Higher-polygon character models? More realistic facial animations? The added horsepower of the new consoles is needed for all of this and more, on top of keeping the game running at a smooth framerate and at HD resolutions.
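To put rough numbers on that, here's a back-of-the-envelope sketch (Python, purely for illustration; GPU cost isn't strictly linear in pixel count, but it's a useful first approximation) of how much raw pixel throughput 1080p60 demands compared to a common last-gen target of 720p30:

```python
# Back-of-the-envelope: raw pixels a GPU must shade per second.
# Real rendering cost isn't strictly proportional to pixel count,
# but this is a decent first approximation of the gap.

def pixels_per_second(width, height, fps):
    """Pixels shaded per second at a given resolution and framerate."""
    return width * height * fps

last_gen = pixels_per_second(1280, 720, 30)   # a common last-gen target
next_gen = pixels_per_second(1920, 1080, 60)  # the hoped-for "standard"

print(f"720p30:  {last_gen:>13,} pixels/s")          # 27,648,000
print(f"1080p60: {next_gen:>13,} pixels/s")          # 124,416,000
print(f"ratio:   {next_gen / last_gen:.1f}x")        # 4.5x
```

That's 4.5x the raw pixel work before a single texture, polygon, or facial animation gets any richer.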
I recently saw someone make a sarcastic comment about The Order and its aspect ratio, complaining about it being an "artistic choice." I would say that it actually is. The artistic choice comes from the balance each developer strikes for its game. Graphical "prettiness," framerate, and resolution are all competing for a console's horsepower. Sure, we could demand that every new game run at 1080p60, but then we wouldn't see nearly as much of an increase in the "graphics" of the game. Some think this is how it should be, but developers realize that no one category is most important. They won't be able to make a graphically stunning game if all this newly available power is spent solely on running it at a steady 1080p60.
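To make that balance concrete: The Order is widely reported to render at 1920x800 inside a 2.40:1 letterbox rather than a full 1920x1080 frame (take the exact figure as my assumption, not gospel). A quick calculation shows what that buys the developer:

```python
# Pixel budget freed up by The Order's reported 1920x800 letterbox,
# compared with a full 1920x1080 frame.
full_frame = 1920 * 1080   # 2,073,600 pixels
letterbox  = 1920 * 800    # 1,536,000 pixels

savings = 1 - letterbox / full_frame
print(f"{savings:.0%} fewer pixels to shade per frame")  # ~26%
```

Roughly a quarter of every frame's shading budget gets freed up for lighting, materials, and effects, which is exactly the kind of trade I'm describing.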
For those who disagree with this style of game development, I'm sorry, but I don't think it will change anytime soon. When the generation comes that console gaming reaches 4K, we will probably have many titles running at less than 4K, and many running at less than 120 or even 60 fps (crossing my fingers for a 120fps console "standard"!). I personally think this mentality is our fault to begin with. We as consumers are so critical of how our games look that developers have to give up something from the other two categories to keep the visuals at a level we are happy with. Ultimately, rendering a game at a slightly lower resolution and then upscaling it still gives great results. Many of the so-called 900p games still look great. I feel that developers have really found a sweet spot between resolution and post-processing such as anti-aliasing to produce a beautiful image.
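Upscaling deserves its own post, but as a quick preview, here's a minimal sketch of bilinear filtering, one common way a lower-resolution frame gets stretched to the display's native size (a 1600x900 frame has roughly 31% fewer pixels than a 1920x1080 one, which is where the savings come from). The function and the tiny test image are mine, purely for illustration; a real console scaler is dedicated hardware and considerably more sophisticated:

```python
# A minimal sketch of bilinear upscaling on a grayscale image.
# Assumes the output is at least 2x2; no gamma handling or other
# refinements a real scaler would have.

def bilinear_upscale(src, out_w, out_h):
    """src: list of rows of grayscale values; returns an out_h x out_w image."""
    src_h, src_w = len(src), len(src[0])
    out = []
    for y in range(out_h):
        # Map this output row back into source coordinates.
        fy = y * (src_h - 1) / (out_h - 1)
        y0 = int(fy)
        y1 = min(y0 + 1, src_h - 1)
        wy = fy - y0
        row = []
        for x in range(out_w):
            fx = x * (src_w - 1) / (out_w - 1)
            x0 = int(fx)
            x1 = min(x0 + 1, src_w - 1)
            wx = fx - x0
            # Blend the four surrounding source pixels.
            top = src[y0][x0] * (1 - wx) + src[y0][x1] * wx
            bot = src[y1][x0] * (1 - wx) + src[y1][x1] * wx
            row.append(top * (1 - wy) + bot * wy)
        out.append(row)
    return out

# A 2x2 gradient blown up to 4x4: every new pixel is a weighted blend.
tiny = [[0, 90], [90, 180]]
for row in bilinear_upscale(tiny, 4, 4):
    print([round(v) for v in row])
```

Each output pixel is a weighted blend of its four nearest source pixels, which is why a modest upscale softens the image rather than blocking it up, and why a pass of anti-aliasing on top can bring 900p surprisingly close to native 1080p.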
I hope this post has given you a little better sense of the issue. We shouldn't look at it as a matter of underpowered consoles, but rather as a conscious decision by the developers of each game about where best to allocate system resources to make the game they've imagined. I plan on making a post sometime soon going into more detail on the process of upscaling and how it isn't really as bad as everyone makes it out to be. Thanks for reading!