
Latest Nvidia & AMD high-end GPUs can handle pretty well 4K resolution games

DSOGaming writes: "Toshiba has demonstrated the first games, running at a 4K ( 3840×2160 ) resolution on its 4K Quad-FHD screen."

Hatsune-Miku4245d ago (Edited 4245d ago )

Awesome, now I can't wait for the PS4. Let's see if people will continue to believe that the Sony PS4 won't be able to play some AAA games in 4K resolution, especially when Sony is selling 4K tellies now.

john24245d ago

Sony said that the PS4 will support 4K resolutions, meaning support for these screens and upcoming movies (provided we get those kinds of movies). Don't expect PS4 games to run natively at 4K resolutions (obviously they will be upscaled).

DigitalAnalog4245d ago (Edited 4245d ago )

Didn't you hear what the promoter just said? He said the 4K demo of GRID (a current-gen game) is running on the MOST powerful graphics cards at the moment, the 680 and 7970, at 30fps. That is how demanding this resolution is. These cards cost at least $499 ALONE. If Sony were to render games at 4K, they would NOT be current-gen games. Most likely only PS2 games could run at this resolution.

Even if the PS4 does sport a 680, it will NOT run next-gen games at 4K. More likely current-gen games like Uncharted or Resistance.
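The arithmetic behind this demand is easy to check: 4K (2160p) has exactly four times the pixels of 1080p, so at a fixed frame rate the GPU has to shade roughly 4x as many fragments. A quick sketch:

```python
# Pixel-count math behind the "4K is demanding" claim.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_4k = 3840 * 2160      # 8,294,400 pixels

print(pixels_4k / pixels_1080p)  # 4.0 -> ~4x the shading work per frame
```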

hellvaguy4245d ago

"the MOST powerful graphic cards at the moment 680 and 7970 at 30fps"

690 is the most powerful of them all, says the Hulk.

http://www.newegg.com/Produ...

KMCROC544245d ago

Not many people are going to say anything at first, because not everyone is well off enough, or foolish enough, to drop $20-25K on a TV.

fr0sty4245d ago

Are we talking current-gen 4K, or next-gen 4K? The PS3 can also run PS2 games at native 1080p (maybe even higher), but you won't find many PS3 games running at native 1080p.

shutUpAndTakeMyMoney4245d ago (Edited 4245d ago )

Yeah, consoles won't have 4K gaming. For one, console devs want to squeeze out all the power, which means 1080p. As the hardware gets older and games look better, they might lower fps and resolution like they do now.
Also, a GTX 680 won't fit in a console box.

Tech is moving too fast for consoles. Wait until Nvidia's Maxwell (delayed till 2014); consoles will be surpassed early this time.
http://www.youtube.com/watc...
By 2015, 4K will be available on all high-end/mid-range PC cards.

Fishy Fingers4245d ago (Edited 4245d ago )

Even the new Ivy Bridge CPUs support 4K resolution with their onboard GPU, as do Nvidia's and AMD's flagship cards.

There's nothing special about it; many current GPUs could support 4K (other than their output), there just haven't been the panels to warrant it.
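The "other than their output" caveat is largely a link-bandwidth issue. As a rough sketch (assuming 24-bit color and ignoring blanking intervals, so real requirements are somewhat higher), uncompressed 4K60 already exceeds the ~8.16 Gbps of video bandwidth HDMI 1.4 can carry, which is why early 4K panels were driven at 30Hz:

```python
# Approximate uncompressed video bandwidth: width * height * bits-per-pixel * refresh.
# Real links also carry blanking intervals, so this is a lower bound.
def video_gbps(width, height, bpp, hz):
    return width * height * bpp * hz / 1e9

hdmi_1_4_gbps = 8.16  # usable video bandwidth of HDMI 1.4 (10.2 Gbps raw, 8b/10b coded)

print(video_gbps(3840, 2160, 24, 30))  # ~5.97 Gbps -> fits in HDMI 1.4
print(video_gbps(3840, 2160, 24, 60))  # ~11.94 Gbps -> does not fit
```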

Raf1k14245d ago

There's a big difference between hardware supporting 4K and it being able to run core games at 4K. Running an operating system at 4K is nothing like running a game at that resolution.

We certainly won't be seeing next-gen console games running at 4K unless they're not very demanding, like a 2D fighter, e.g. BlazBlue. I think the 4K support is mainly for 4K Blu-ray, which we're likely to see at some point in the future, as I've stated before.

This would certainly help in future-proofing the PS4 somewhat, not that you can fully future-proof something like that.

Pain_Killer4244d ago

Running and Supporting are two different things.

Low end and mainstream hardware can support higher resolutions but can the same hardware run them?

TheoreticalParticle4245d ago

"Latest Nvidia & AMD high-end GPUs can handle pretty well 4K resolution games"

Are you guys even trying any more? If your headline sounds like it was written by someone who has the writing skills of a 2nd grader, why would I ever want to visit your site?

ZoyosJD4245d ago

That's what the guy in the video said, and I'm not blaming him because English clearly isn't his first language.

TheoreticalParticle4245d ago

If he's quoting the guy in the video, then he needs to use quotation marks.

LAWSON724245d ago

Well, considering these TVs are expensive as hell, I won't care about them until they're down to at least $1000.

TABSF4245d ago (Edited 4245d ago )

4K on the next Xbox and PS4? Don't think so. Maybe movies and arcade titles.

He was talking about the HD 7970 and GTX 680. These GPUs cost £350-£450, and they can only run at 30Hz; it looked like it was lagging in that presentation of DiRT Showdown.

Sorry, but 30fps feels like lag to me; 30fps is welfare.

If you're expecting the PS4 and next Xbox to run at 4K resolution, then be prepared to see £600-£800 consoles.

Those two GPUs are the most powerful single chips on the market and they are struggling. Maxwell and Sea Islands are not out till 2013/2014; next gen will have already started.

LAWSON724245d ago

Yeah, some actually aren't ignorant.

Pain_Killer4244d ago

The GTX 680 and HD 7970 both run 4K resolution at 30-35 average FPS on the most demanding titles, such as Battlefield 3.

http://www.brightsideofnews...

DiRT Showdown isn't as much of a resource hog as BF3, so the only limiting factor is the 30Hz on the monitor, not the GPUs. Give them a 60Hz panel and they would easily churn out better frame rates.

And about that so-called next gen: even if they do bring out these consoles by 2013, do you really expect them to outperform current-series GPUs?

NVIDIA will put out the GK110 at the consumer level by 1H 2013, and so will AMD its Sea Islands. Those are enough to take the performance crown back from these so-called next-gen consoles, if the consoles even manage to run faster than current-gen cards.

TABSF4244d ago

Of course.

If anyone thinks that consoles will have GPUs that can even compete with the GTX 580 or HD 6970, then they are completely mistaken. Same, of course, with the newer HD 7970 and GTX 680.

Hell, I'll be surprised if the PS4/next Xbox can compete with the GTX 480 and HD 5870.

The GK110 will have 2880 cores with a die size way over 500mm2; it's gonna be a beast. Expect very high temps, noise and price :)

Maxwell and Sea Islands are gonna be out either just before or just after the next-gen consoles get released, making it impossible for the consoles to feature them.


Make your next GPU upgrade AMD as these latest-gen Radeon cards receive a special promotion

AMD has long been the best value option if you're looking for a new GPU. Now even their latest Radeon RX 7000 series is getting cheaper.

Father__Merrin15h ago

Best for the money is the Arc cards

just_looken13h ago

In the past, yes, but last-gen AMD has gotten cheaper, and their new cards are on the horizon, making the 6000 series even cheaper.

The Arc cards are no longer made by Intel, but ASUS/ASRock have some; the next line, Battlemage, is coming out, prices TBD.

Due to the longer software development, it's always best to go AMD over Intel if it's not too much more money, even though Intel makes a strong GPU; I own 2 of the 4 card versions.


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan10d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS, and AI in general, will lower the power draw. It seems like the days of just adding more VRAM and horsepower are over; law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future, and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more and more.

Tapani10d ago (Edited 10d ago )

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating don't tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPU of 10 years ago. Plus, today you can undervolt + OC GPUs by a good margin to keep stock performance while utilizing 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.
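Taking the commenter's figures at face value (90% of stock performance at 320W instead of 450W; these are the comment's own numbers, not a benchmark), the efficiency gain is easy to quantify:

```python
# Hypothetical undervolt scenario using the figures from the comment above.
stock_perf, stock_watts = 1.00, 450  # stock 4090
uv_perf, uv_watts = 0.90, 320        # claimed undervolted result

gain = (uv_perf / uv_watts) / (stock_perf / stock_watts)
print(f"perf-per-watt gain: {gain:.2f}x")  # ~1.27x for giving up ~10% performance
```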

However, in today's world chip manufacturing is limited by physics, and we will see power increases for the next 5-10 years at the very least to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech on the market that we have yet to invent, or perhaps we can solve existing technologies' problems with manufacturing or cost of production.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion, solar, etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that is a pipe dream.

MrBaskerville9d ago

I don't think fusion is the way forward. It will most likely be too late by the time it's finally ready, meaning it will probably never be ready. Something else might arrive before then, though, and then it becomes viable.

Firebird3609d ago

We need to stop the smear campaign against nuclear energy.
We could power everything forever if we wanted to.

Tacoboto10d ago

PS4 Pro had dedicated hardware in it for supporting checkerboard rendering that was used significantly in PS4 first party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?

InUrFoxHole10d ago

Well... it's a coffin, man. So at least 4?

Tacoboto10d ago

PSSR in the fall can assume that role.

anast10d ago

and those nails need to be replaced annually

Einhander197210d ago

I'm not sure what point you're trying to make, but the PS4 Pro came before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR 2.
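As a very loose sketch of the checkerboard idea (not Sony's actual implementation, which also uses ID buffers and motion vectors; the function name here is hypothetical): each frame shades only half the pixels in an alternating checkerboard pattern and fills the other half from the previous frame, so a full image accumulates over two frames.

```python
import numpy as np

def checkerboard_frame(render, prev, frame_idx):
    """Shade half the pixels this frame; reuse the other half from prev.
    render: function (y, x) -> pixel value; prev: last reconstructed frame."""
    h, w = prev.shape
    out = prev.copy()
    ys, xs = np.mgrid[0:h, 0:w]
    mask = (ys + xs + frame_idx) % 2 == 0  # checkerboard, alternating each frame
    out[mask] = render(ys[mask], xs[mask])
    return out

# Toy usage: a gradient "scene"; after two frames every pixel has been shaded.
scene = lambda y, x: (x + y).astype(float)
f0 = checkerboard_frame(scene, np.zeros((4, 4)), 0)
f1 = checkerboard_frame(scene, f0, 1)
print(np.array_equal(f1, scene(*np.mgrid[0:4, 0:4])))  # True
```

The catch, of course, is the reuse step: on a static gradient this reconstruction is exact, while in a real game the stale half must be reprojected, which is where the artifacts come from.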

Tacoboto10d ago

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack9d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time; in terms of image quality, though, it's way behind what DLSS 3 or FSR 3 is currently offering.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" argument (that's been going around the PC community since 2019) right out of the window.

Einhander19729d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboard.

But you're talking about FSR 3, which probably is better than checkerboard. FSR 3 has only started getting games this year, though, so checkerboard, which was the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for; heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic10d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself...
And of course, then the idea of checkerboard rendering not being native...
For sure, maybe this tech makes it minimal while pixel counting, but alas, it seems "performant and close enough," not native, is what matters now...
I want to see it run native without DLSS... why not?

RonsonPL10d ago

An almost-deaf person:
- lightweight portable $5 speakers of 0.5cm diameter are the final nail in the coffin of Hi-Fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either get true 500fps on a 500Hz LCD TN or OLED (or faster tech), or use a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use." It's nowhere near it. But if you're an ignorant "journalist," you will publish crap like this article, just to flow with the current.

There's no coffin for native-res quality and there never will be. Eventually, we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorance on the internet. TAA is not "native," and the shoddy look of modern games when you disable any TAA is not "native" either, as it's ruined by the developer's design choices: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass over it later. When you disable it, you will see a ruined image, horrible pixelation and other visual "glitches," but it is NOT what native would have looked like if you wanted to honestly compare the two.

Stay informed.
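Two of the claims above can be put into numbers. This is a sketch using the standard sample-and-hold rule of thumb popularized by Blur Busters (perceived blur ≈ tracking speed × frame visibility time) and an assumed ~2ms upscaler cost, which in reality varies by GPU, model, and resolution:

```python
def sample_and_hold_blur_px(speed_px_per_s, fps):
    # On a sample-and-hold display each frame stays lit for 1/fps seconds
    # while the eye keeps tracking the object, smearing it across the retina.
    return speed_px_per_s / fps

print(sample_and_hold_blur_px(3000, 60))   # 50.0 px of smear at 60fps
print(sample_and_hold_blur_px(3000, 500))  # 6.0 px of smear at 500fps

# Frame-time budget: at 500fps a whole frame is 2ms; a hypothetical ~2ms
# upscaling pass would consume the entire budget by itself.
frame_budget_ms = 1000 / 500
upscale_cost_ms = 2.0  # assumption, not a measured figure
print(upscale_cost_ms >= frame_budget_ms)  # True
```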

RaidenBlack9d ago

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of the day, but it'll slowly narrow to the point of being indistinguishable.
Something similar happened with the genAI Sora... AI-generated videos were turds back when they were introduced (the infamous Will Smith eating video)... but now look at Sora, generating videos that just look like real life.

Yui_Suzumiya9d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple1019d ago

Maybe better to get a budget gaming laptop and link a DualSense to it.

= A portable console with far better graphics than a Steam Deck! Plus a bigger screen, and you can use it for work, etc.


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal16913d ago

I echo the sentiment here - I think the way GPUs are going, gaming could become secondary to deep learning. Wonder if the 40 series was the last true generation of GPUs?

Number1TailzFan13d ago

No... Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if they can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance, then there's always AMD, who are doing OK at the mid-range.

Christopher13d ago

***or play at just 1080p and 60fps or something***

My over-two-year-old laptop GPU still runs fine. I think this is more a reason why GPU makers are shifting their priorities elsewhere: the market of new buyers is shrinking as more PC gamers hold off on replacing older, still-working parts that run RT/AI well enough as it is. Not to say there aren't people who still upgrade, but I think the market for having the latest and greatest is smaller than it has been for the past two decades. The problem is we aren't advancing at the rate we once were; we're reaching the flattening of that exponential curve. We need another major technological advancement to restart that curve.

D0nkeyBoi13d ago

The unremovable ad makes it impossible to read the article.

Tzuno13d ago (Edited 13d ago )

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing12d ago

You also need to consider that NVIDIA is heavily invested in cloud gaming, so they are likely going to make moves to push you into yet another lifelong subscription service.

Kayser8112d ago

NVIDIA will never change their price point until AMD or Intel makes a GPU that is comparable to and cheaper than theirs.
It happened before in the days of the GTX 280, when they cut the price from $650 to $450 in a matter of two weeks because of the HD 4870, which was selling at $380.
