GamesAreEvil writes "...Blu-ray is simply not worth the money."
At a time when GPUs are more available than ever, it appears that PC gamers aren't upgrading as often as they used to.
For me, the primary concern with new software is how it's often exclusive to a new GPU series. This not only frustrates me but also raises questions about hardware lifespan. With GPUs no longer offering significant generational performance boosts, vendors lean heavily on software enhancements instead.
However, this reliance is contingent on developer support. When the new 5000 series hits shelves, it's likely that the 4000 series won't be compatible with Nvidia's new software. That would negate any advantage it had over the 3000 series, leaving owners to wonder why they upgraded in the first place. And the same will keep happening as we move through the generations.
AMD is a bit better in that regard, as they often use open standards, which offer wider compatibility. However, they have even less developer support, and their software solutions tend to lag behind Nvidia by at least one whole generation. So if you have an Nvidia 3000 series card right now, it doesn't really make much sense to upgrade to AMD's 7000 series, because feature-wise they're at a similar level.
oh my god, these "Here's why" articles are always about the most obvious shit ever, like do people actually read these?
Because they last for generations. You don't need to upgrade every 1, 2, or even 3 years. I went from a 1080 Ti, which served me very well, to a 3080, with years in between. I won't even consider upgrading until the 5000 series at the earliest, and will most likely wait for the 6000 series.
Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.
I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS, and AI in general, will lower the power draw. It seems like the days of just adding more VRAM and horsepower are over; law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more and more.
The PS4 Pro had dedicated hardware for checkerboard rendering, which was used extensively in PS4 first-party titles, so you don't even need to look to PC, or to modern gaming at all. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?
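(For anyone unfamiliar with the technique, here's a minimal conceptual sketch of the checkerboard idea in Python/NumPy. It's illustrative only: real implementations like Sony's also reproject the previous frame using motion vectors and ID buffers rather than reusing it verbatim.)

```python
import numpy as np

def checkerboard_mask(height: int, width: int, frame_index: int) -> np.ndarray:
    """Which pixels get shaded this frame; the pattern alternates every frame."""
    yy, xx = np.mgrid[0:height, 0:width]
    return (yy + xx) % 2 == (frame_index % 2)

def reconstruct(current: np.ndarray, previous: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Full-resolution output: this frame's shaded half plus the complementary
    half carried over from the previous frame. Only ~50% of pixels are shaded
    per frame, which is where the performance win comes from."""
    return np.where(mask[..., None], current, previous)

# Tiny usage example on a 4x4 RGB image:
prev = np.zeros((4, 4, 3))
cur = np.ones((4, 4, 3))
out = reconstruct(cur, prev, checkerboard_mask(4, 4, frame_index=0))
```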
An almost deaf person:
- "Lightweight, portable $5 speakers with 0.5 cm drivers are the final nail in the coffin of Hi-Fi audio!"
Some people in 2010:
- "Smartphones are the final nail in console gaming's coffin!"
This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone isn't aware of it (see the deaf-guy example above) doesn't mean the flaws aren't there. They are. And all it takes to see them is a display that handles motion well: either one doing a true 500 fps on a 500 Hz panel (TN LCD, OLED, or faster tech), or one using a low-persistence mode, also known as Black Frame Insertion or backlight strobing (check blurbusters.com if you don't know what that means).
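(For context, a rough rule of thumb popularized by Blur Busters is that perceived motion blur scales with display persistence: blur in pixels ≈ motion speed in px/s × persistence in seconds. A quick sketch with illustrative numbers:)

```python
def motion_blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate blur trail length: how far an object moves while a frame stays lit."""
    return speed_px_per_s * persistence_ms / 1000

# Sample-and-hold 60 Hz (~16.7 ms persistence) vs a true 500 fps @ 500 Hz (~2 ms),
# for an object panning at 1000 px/s (an illustrative speed):
print(motion_blur_px(1000, 16.7))  # ~16.7 px of smear
print(motion_blur_px(1000, 2.0))   # ~2 px; BFI/strobing lowers persistence further
```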
Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you'll publish crap like this article just to go with the flow.
There's no coffin for native-res quality, and there never will be. Eventually we'll have enough rasterization performance to drive 500 fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500 fps, the amount of time an upscaling pass takes makes it completely useless.
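(Back-of-the-envelope, assuming an upscaling pass costs on the order of 1 ms; the actual cost varies by GPU, resolution, and technique:)

```python
# At 500 fps the entire frame budget is 1/500 s = 2 ms.
frame_budget_ms = 1000 / 500   # 2.0 ms per frame
upscale_cost_ms = 1.0          # assumed AI-upscaling pass cost (illustrative)
render_budget_ms = frame_budget_ms - upscale_cost_ms
print(f"Upscaling would eat {upscale_cost_ms / frame_budget_ms:.0%} of the 2 ms budget, "
      f"leaving {render_budget_ms:.1f} ms for everything else.")
```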
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorant people on the internet. TAA is not "native", and the ugly look of modern games when you disable TAA isn't "native" either, because it's ruined by the developers' design choices: you can cheat by rendering only every fourth pixel when you plan to smear a TAA pass over it later. Disable that pass and you'll see a ruined image, with horrible pixelation and other visual "glitches", but it is NOT what native would have looked like, if you want to compare the two honestly.
Stay informed.
How much VRAM is standard today? My laptop has a 1080p QLED display but only Intel Iris Xe integrated graphics, which report just 128 MB of dedicated VRAM (integrated GPUs borrow the rest from system RAM). I currently do all my gaming on it, but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer, more demanding titles.
Blu-ray is like an added bonus for PS3 users.
"I’m not expert when it comes to technical aspects of a Blu-ray disk, nor will I pretend to know exactly what makes a Blu-ray so much better than a regular DVD."
So what's the point of this article?
Then STFU.
Must be one of these selectively blind people. You know, the ones who can notice when a PS3 game is missing a pixel but can't tell the difference between DVD and Blu-ray.
Forget the technical aspects.
If you can't tell the difference when watching either, you need your eyes testing!!
And you have no right going any further with your opinion.
So stop moaning and look at it as a bonus.
A state-of-the-art HD bonus!!