Some food for thought in this video which goes back to the idea that games consoles could be upgraded for better performance.
I honestly hope not. The whole point of a gaming console is you buy it once, and that's it. The developers have one set of hardware to work with, and the consumer doesn't have to worry about upgrading. If they do, then we all might as well buy PCs.
That's exactly why the Move, while it sold well enough, never became huge: it was a post-release accessory that developers didn't have to develop for. If Sony wants their new Move 2.0/EyeToy 2.0 to be used, it NEEDS to be bundled with ALL PS4s.
So yeah, keep that "upgrade" crap away from my consoles. I have a PC already, I don't want to have to worry about my console as well.
There is no need for upgradeable consoles when you have the Cell architecture; the games just keep getting better and better. Just look at GOW: A. It's about making a console that lasts.
I want an upgradable console, at least for two console cycles, with just the GPU being replaced.
Imagine an Xbox/PS4 with a pool of 8GB RAM and a great CPU, plus a custom GPU solution on a PCIe bus that you just pop off the side and can replace for peanuts the next generation (5-6 years down the line). I would want that.
It would save the customer money and cut down on R&D every other console cycle. They wouldn't need to create a new GPU every year, just one with each console cycle.
Sony already makes an all-in-one PC with Windows 8.
All they have to do is ditch Windows and make a Linux-based OS with full accessory and game support.
That probably won't happen though.
I agree with SynGamer on this one.
Console gaming puts everyone on the same "level", whereas gamers playing on super-customizable PCs often have an advantage.
With consoles, game developers have a "least common denominator" that they know everyone has because of the built-in features/specs of a particular system, so they can maximize the gaming experience from there.
If you want more/better/"bleeding-edge" technology, then PC gaming is probably the best bet for you.
If you want simplicity/"plug-n-play gaming", then a console (of your choice) may be more appropriate for your situation.
Either way, it's great that there are so many options out there for gamers.
Although making systems "slimmer" as time goes by is great, it'd be even better to see the makers of these systems integrate a few "upgrades" along the way with those changes, as systems evolve.
Of course, this may lengthen the lifespan of the console-- which may or may not align with the plans of the video game console manufacturers in the long run.
The video game console manufacturers have to constantly ask themselves: "What if other console manufacturers don't include these upgrades in their systems?" "What if they do?" "How will this affect our strategic decisions when developing new systems?" and "How does it ultimately alter our timeline when doing so?"
Even the smallest "upgrade" may have unintended consequences that haven't been foreseen.
Let's face it; everyone loves new technology-- we just need to be careful when/how we integrate it-- that's all.