"For nearly 20 years Sony in Japan has been plagued by the myth of the "Sony Timer" – but is there really a kill-switch that destroys your device just after its warranty runs out? Many Japanese genuinely believe that there is."
KnowTechie writes: "The GameScent is a unique way to add immersion to your games. Thanks to its HDMI and 3.5mm jack, the device is versatile, so you can use it across your consoles and PC. However, it would be great to see a new range of scents since there are only six, and many smell similar to each other."
At a time when GPUs are more available than ever, it seems PC gamers aren't upgrading as often as they used to.
For me, the primary concern with new software is that it's often exclusive to the newest series of cards. That's not just frustrating; it also raises questions about the hardware's useful lifespan. With generational performance gains shrinking, GPUs increasingly rely on software features to justify an upgrade.
However, that reliance is contingent on developer support. When the new 5000 series hits shelves, it's likely the 4000 series won't get Nvidia's new software features, which would wipe out the advantage it had over the 3000 series and leave you wondering why you upgraded in the first place. And the same thing will keep happening as we move through the generations.
AMD is a bit better in that regard, since they tend to use open standards with wider compatibility. However, they have even less developer support, and their software features tend to lag Nvidia's by at least a full generation. So if you have an Nvidia 3000-series card right now, it doesn't make much sense to upgrade to AMD's 7000 series, because feature-wise they're at a pretty similar level.
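Here's a rough sketch of what I mean – a toy Python example with a deliberately simplified feature matrix. The feature sets below are illustrative assumptions, not an authoritative compatibility list (the one grounded data point is that DLSS 3 Frame Generation is marketed as requiring the RTX 40 series, while AMD's FSR is open and runs on other vendors' cards):

```python
# Toy model of generation-gated GPU features. The mapping is a simplified
# assumption for illustration, not a real compatibility matrix.
GENERATION_FEATURES = {
    "nvidia_3000": {"dlss2", "reflex", "fsr2", "fsr3_frame_gen"},
    "nvidia_4000": {"dlss2", "reflex", "fsr2", "fsr3_frame_gen", "dlss3_frame_gen"},
    "amd_7000":    {"fsr2", "fsr3_frame_gen", "anti_lag"},
}

def upgrade_gain(current: str, candidate: str) -> set[str]:
    """Features the candidate generation would add over the current one."""
    return GENERATION_FEATURES[candidate] - GENERATION_FEATURES[current]

# A 3000-series owner eyeing AMD's 7000 series gains almost nothing on the
# feature side, which is the point above: the open standards already run on
# the older card, and the vendor-locked features stay locked.
print(upgrade_gain("nvidia_3000", "amd_7000"))     # {'anti_lag'}
print(upgrade_gain("nvidia_3000", "nvidia_4000"))  # {'dlss3_frame_gen'}
```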
oh my god, these "Here's why" articles are always about the most obvious shit ever, like do people actually read these?
Because they last for generations. You don't need to upgrade every one, two, or even three years. I went from a 1080 Ti, which served me well, to a 3080, with years in between. I won't even consider upgrading until the 5000 series at the earliest, and will most likely wait for the 6000 series.
Kind of like the 360's kill switch... The "ON" button.
Looks like no one is gonna approve this one :(
Apparently I offended a lot of PS3 users (got a lot of reports on it).
Just saw it at Engadget and thought it could be interesting, since a lot of us own Sony devices (I own 4 Sony gaming devices: PS2, PS, PSP Go, PSX, and a Sony TV) :I
The reports claim I used a PS3 picture to offend, but I didn't!
I just took the picture from the article on Engadget.com...
EDIT: Anyway, I changed the categories and picture to please the fanboys... Hope I didn't offend anyone and that it can be approved now...
And to everyone reporting it: I mean, come on, reporting "fake" on this one? It's actually a good and interesting article.
I used to think that about the PS2 phat back when mine broke. It stopped reading discs exactly one year to the day after purchase, and at least 20 other people I knew with original PS2s had theirs end up the same way, all around the one-year mark.
That was one of the reasons I waited two years to get a PS3, which also broke three days shy of the one-year mark due to the YLOD.
/tin foil hat
I bought my 60 gig (now 250 gig) PS3 used nearly 3 years ago, use it quite heavily and haven't had any problems at all. I think it all comes down to common sense and taking care of your electronics.
My 60 gig broke, but that's the exception rather than the rule.
(Oddly enough, my PS2 and PS1 still work like the day I bought them, and I was an early adopter.)