John: "Most veteran gamers are wise to it now, but in the not-too-distant past few years we have been seeing some games get noticeable downgrades in their graphical presentation after they’re revealed."
Does anyone really need to ask this question? Target renders are meant to get people's attention and get them interested in the game. Developers no doubt want to hit those targets, but most times their ambitions shoot above a console's, or even a PC's, pay grade, forcing them to scale back to realistic current hardware, which then means downgrades in a game's graphics and effects. It isn't as if developers want to downgrade anything, but sometimes their hopes get the best of them and we end up with something like Watch Dogs, which truly looked next gen in its unveiling video... but ended up losing a lot of what made the game's atmosphere. Wreckless was the same: people thought it was going to be the next GTA and it ended up being a police chase kind of thing.
You basically said it all. People also have to understand that at the time the initial trailers are shown, the game isn't fully complete. So especially if a game is open world, it will be missing NPCs and environmental elements such as trees or buildings, and some areas of the map might not even be in the game yet. At that point the game can look good and fully run on the hardware in its current state, but add those missing factors I mentioned and suddenly you have to start scaling back things like graphics or resolution to get the game to a playable state with core features intact on current hardware. A lot of gamers don't understand this, so they see it as being deceived.
You're correct. Most devs are extremely ambitious and want to shoot above a console's, or even a PC's, pay grade. Well, I have worked with 3D applications such as Maya and 3ds Max, and one thing I always notice I do (as do some of my classmates) is that we tend to add lots of effects to a character model or an environment because it looks better. But the reality is that when you start to render a scene and it doesn't work, you have to cut out lots of things like lighting, fog effects, reflections, refractions, etc.
Two people downvoted you, so that makes you wrong! Clearly, it's because they want to suck us in and then go "HA HA" and render the graphics worse, because "why not", right?
It's more that things come up during testing which have to be worked around, and to get things running smoothly all around, graphics are usually the thing that frees up memory enough to make that happen. That, and as you say, most games are shown early on hardware that is beyond what you'll see on consoles, or even the average PC.
Agreed, underpowered hardware is always the reason for the downgrade. Anyway, I really hope the next-gen consoles have a faster CPU.
Double agree, although there are many more factors involved. Storage capacity is one of the largest setbacks, especially on consoles. As resolution increases, so do file sizes. Consumers don't want multiple discs, and current Blu-ray discs can only hold so much. Crap, people moan about downloading 30 GB games; imagine when games are 1 TB. The problem is people focus too much on beefy GPUs, the famous teraflop buzzword. GPUs are important for caching and post-rendering effects such as lighting, particles, water effects, etc. They don't run games, which is why higher clock speeds on CPUs are so important. Making a game look pretty is easy, although time consuming, but making a game look pretty AND run smoothly is a whole new ballgame. Hence why consoles are a decade behind, just now hitting the basic 60 fps. Devs have to compromise performance for graphics because gamers buy games that look pretty first, then complain later. Things needed to progress gaming: much higher CPU clock speeds, higher storage capacity, and advanced compression.
I don't get why people are disagreeing with you; it's a hard truth. Consoles are underpowered, running on roughly five-year-old tech now. Sure, we're getting mid-gen upgrades, but the games still have to be made with the weaker models in mind.
Because that's a value judgment based on unrealistic expectations. Consoles have to hit a mass-market price point, and most console users don't care about graphics as much as purists do. That's why a system like the Switch is not only possible, but highly successful. Other considerations take priority, not graphical power.
Everything's underpowered compared to a prerendered artistic vision.
What you fail to realize is that today's gaming consoles are not balanced: the GPU is much more powerful than the CPU, which sucks. The CPU and GPU need to be comparable in processing power to simulate things properly.
Because most games are designed and shown on a high-end PC first, then ported to a console dev kit.
Change "some" to "every AAA game ever".
To increase framerate. What a dumb question.