
It's All About The Grass - New NVIDIA Video Shows Off Impressive Turf Effects

NVIDIA has released a new video showing off Turf Effects, a new NVIDIA GameWorks technology that lets developers simulate and render massive fields of grass with physical interaction.

Read Full Story >>
dsogaming.com
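The demo itself is closed source, but to make "massive grass simulation with physical interaction" concrete, here is a toy sketch of the basic idea: lots of cheap per-blade state, a collider that pushes nearby blades aside, and a spring term that lets them recover. All names and constants below are made up for illustration; this is a rough CPU model, not NVIDIA's GPU implementation.

```python
# Toy model of interactive turf: many instanced blades, each storing only a
# small bend offset, pushed aside by a moving collider and springing back.
import math
import random

BLADE_COUNT = 10_000
PUSH_RADIUS = 1.0      # collider influence radius in metres (assumed)
STIFFNESS   = 4.0      # how quickly blades spring back upright (assumed)
DT          = 1 / 60   # simulation timestep in seconds

# Each blade: (x, z) root position plus a 2D bend offset of its tip.
blades = [{"x": random.uniform(-10, 10), "z": random.uniform(-10, 10),
           "bend_x": 0.0, "bend_z": 0.0} for _ in range(BLADE_COUNT)]

def step(collider_x: float, collider_z: float) -> None:
    """Advance the turf one frame: push blades near the collider, relax the rest."""
    decay = math.exp(-STIFFNESS * DT)   # exponential spring-back factor
    for b in blades:
        dx, dz = b["x"] - collider_x, b["z"] - collider_z
        dist = math.hypot(dx, dz)
        if 1e-6 < dist < PUSH_RADIUS:
            # Bend away from the collider, harder the closer it is.
            push = (PUSH_RADIUS - dist) / PUSH_RADIUS
            b["bend_x"] += (dx / dist) * push * 0.5
            b["bend_z"] += (dz / dist) * push * 0.5
        # Relax towards upright.
        b["bend_x"] *= decay
        b["bend_z"] *= decay

# Walk a "character" across the field and report how much grass stays disturbed.
for frame in range(120):
    step(collider_x=-10 + frame * 0.2, collider_z=0.0)
bent = sum(1 for b in blades if math.hypot(b["bend_x"], b["bend_z"]) > 0.05)
print(f"{bent} of {BLADE_COUNT} blades still visibly bent")
```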
ratrace3469d ago

Your workout is my warmup.

frostypants3469d ago (Edited 3469d ago )

Yeah...you go running out your door to buy a $600 video card upgrade because "ZOMG GRASS!". Different mentality, I suppose...

Kleptic3469d ago (Edited 3469d ago )

since when are 680 equivalents or higher $600?

Maxor3469d ago

This coming from console fanboys who flame on for days about 1080p vs 900p. Specs that the master race consider to be table scraps.

thehobbyist3469d ago

Actually, Nvidia's newest card is only about $350

:I

starchild3468d ago

@Maxor

Didn't you know? Graphics only matter if they are being used to bash the Xbox One or Wii U and to put the PS4 on a pedestal. If PC has better graphics..."it's just those damn elitist PC fanboys being graphic whores". I love the hypocrisy on this site.

tee_bag2423468d ago (Edited 3468d ago )

It runs on a 680, which is ancient. You can buy the far faster 780 for $249 on special, which is at least two times faster than all the weak-sauce consoles combined.

https://www.ozbargain.com.a...

Don't know why you're here bashing every PC article. I can only conclude you're sour that others are enjoying the fruits of cutting-edge tech while you're stuck twiddling your thumbs staring at the walls.

Let's not fool ourselves. I have a PS4 and X1 too, and with all the extras and online fees they are far from cheap either.

Sketchy_Galore3469d ago

True but I have kissed a real girl who wasn't even my mother or anything so I win.

awi59513468d ago

Come back when you do her. Kissing is nothing lol.

TheStrokes3469d ago (Edited 3469d ago )

Nice.
It's little things like this that can add so much more realism to a game. Not sure when this can be utilized to that extent though... I'm not very knowledgeable in that area.

LightofDarkness3469d ago (Edited 3469d ago )

Well, it's a GameWorks tech, so technically you can just plug it in now if you want. But it's like the fur simulation on a massive scale, so you may see small patches of it in some games. Within about 5 years you might see this becoming standard.

TheStrokes3469d ago

Ah. Thanks for the info. +

cannon88003469d ago

On the Nvidia GameWorks site it says it will be available in early 2015, so we have to wait :(

lemoncake3469d ago

Wish one of these new consoles had gone with a decent nvidia card instead of both going amd this gen.

XB1_PS43469d ago

Eh, it would have been more effective if they went Intel for the processors. The only reason AMD GPUs aren't as good is because of drivers. That's a PC-specific problem, because the consoles each have a single fixed GPU; there's no coding for hundreds of different cards.

Fake edit: I mean all PS4s have the same GPU, and all XB1s have the same GPU. Not that both platforms share the same GPU.

lemoncake3469d ago

Nvidia cards are very different from AMD cards, with different architectures and built-in features. To suggest AMD and Nvidia cards are the same and only differ because AMD sucks at drivers is very wrong.

awi59513468d ago

That driver crap only comes from someone who hasn't had an AMD card in the last 10 years. There aren't any AMD driver problems; the only one I can think of in the last 10 years is the recent 280X driver screw-up. For some reason it was acting crazy because the 280X is really an overclocked 7970 and the drivers were getting confused. That has been fixed in recent drivers.

bmf73643469d ago

They went with AMD APUs this gen. However, they could've optimized the system to run both the AMD APU's GPU and an Nvidia card to create a more unique hybrid PC capable of Mantle as well as Nvidia's PhysX and Turf Effects.

chaldo3469d ago

@LogicalReason

I wouldn't do it if I was Nvidia obviously. Business reasons son. Not because they are "selfish" ???

Kleptic3469d ago

It's very likely that Nvidia was never even approached by Sony or MS before this gen for the PS4 and XB1...

AMD wrapped that up effortlessly with SoCs that have desktop CPUs (albeit what ended up being extremely low-power, low-IPC desktop CPUs) AND powerful enough GPUs on a single die...

While Nvidia is attempting to make headway in the mobile market with integrated SoCs, they have nothing in the foreseeable future in the PC/console market CPU-wise... maybe they'll never need to as the market continues to shift towards low-power mobile devices... But with AMD having a solution entirely from their own production infrastructure (as in, neither Sony nor MS would need to source a separate CPU from another company), they were pretty much given this generation without even having to impress...

bmf73643469d ago

These are tech companies with the capability of fully utilizing these systems. If the PS4/Xbox One are running a custom-built and optimized AMD A10, they could have optimized both GPUs to run a game together and fully utilized both companies' technologies.

However, what I left out is the possibility of complicating a system's architecture if they were to Crossfire/SLI an AMD and Nvidia GPU like this. Such a thing is far-fetched even in PC gaming, let alone the fact that SLI/Crossfire is not something generally well supported in PC games.

awi59513468d ago

No, Nvidia is just greedy; they wanted to lock consoles into unfair deals, which hurt the first Xbox. The Xbox died because Nvidia wanted to charge Microsoft full price for its GPU for the lifespan of the console, which is why the Xbox couldn't drop in price. That's why Microsoft told them to go to h(&(& for the 360, and that's why Sony and Nintendo told them the same this gen.

MK24ever3469d ago

Flower did this many years ago on PS3; I'd even say the grass looked better...

https://www.youtube.com/wat...

ONESHOTV23469d ago

Really? I don't think so, man. Look at the video and compare it with your link, then look at which one is better...

decrypt3469d ago

Hard to see with those Sony goggles he's wearing.

ATi_Elite3469d ago

I laughed so hard. Flower is nice but not Nvidia turf nice.

I think Crysis's grass was better than Flower's.

Play2Win3469d ago

lol, those console gamers. So funny.

Saryk3469d ago

Whatcha talkin bout Willis?

aLiEnViSiToR3469d ago

Dude you should really go and check with your eye doctor...

starchild3468d ago

Wow, I see no self shadows or shading there at all. I don't see any kind of dynamic crushing of the grass, only a generic animation of the grass parting a little bit. How can people be so delusional?

tee_bag2423468d ago

I know.. those 60fps really kill the filmic effect too lol

What are you on? Try watching the video, and at 1080p. Every strand of grass is casting a shadow, not to mention it has simulated physics.

MK24ever3469d ago

Oh "PC gamers", do you realize 99% of console gamers also own a PC and game on it? Only the few stupid ones who chose to play ONLY on PCs come here and think they are better than every other gamer there is. The grass on flower was done on almost a decade old hardware, it also had millions of independent grass leafs that were dynamic and still look amazing. So why should I be impressed with a tech-demo showing a little better looking grass done with high-end 2014 hardware?...

Saryk3469d ago

Oh "PC gamers", do you realize 99% of console gamers also own a PC and game on it?

Really?

99%?

MK24ever3469d ago

Of adult gamers, most likely yes; who doesn't own a PC nowadays? Or is it forbidden for a "peasant console gamer" to own a PC? I own a PC, I game on my PC, yet since I also own a console, I'm already a console fanboy and I can't be underwhelmed by something a "PC" can do... -_-"

aLiEnViSiToR3469d ago

LOL you must be on some fancy drug O_o your comments are hilarious xD

starchild3468d ago

"do you realize 99% of console gamers also own a PC and game on it?"

Sure, that's why Sony fanboys are always saying that consoles and PCs are "totally different markets and shouldn't be compared".

If what you are saying is true then they definitely aren't separate markets at all.

Anyway, most PC gamers I've seen comment here on N4G also own consoles. I own a PS4, PS3 and Xbox 360 in addition to my PC. But I would bet my life that most of the console fanboys on here do not own a real gaming PC.

You would be impressed by the demo if you weren't so blindly devoted to Sony and if you had a clue what was really being accomplished in that demo.

tee_bag2423468d ago (Edited 3468d ago )

And that's just it. I have a gaming PC, PS4, X1, PS3, 360 and most consoles before.

I value someone's opinion who has experienced both sides and can appreciate the evolution of cutting edge gaming and boundary pushing.
Not a child who picks a side then blindly follows the herd.
Sympathizing that either Sony's or MS's console is powerful enough, or that PC optimizations are good enough, does nothing to help them raise the bar or benefit us as consumers. Those excuses are simply fanboy rhetoric and do nothing but tell the industry it's OK not to raise the bar.

A classic example of fanboyism is paying for online.
Last gen 360 users were the joke. Now it's suddenly not a big deal for Sony fans to eat their hat and pay for online.
I can guarantee that if we all stopped paying for online it would very soon be free for all. Yet again, sympathizing gets us nowhere.

Credit when credit is due, no matter what platform.

MK24ever3466d ago

WOW, damn, some of you people got mad because I wasn't impressed by this tech demo, and since I mentioned PS3... BAM! I'm a Sony fanboy, a console fanboy, ignorant, blind, etc., etc., etc. I'm 33 years old and I've played on PCs all my life as well as consoles; I've seen and played games since the Atari 2600 days, so I've seen a lot of gaming evolution since then. I wasn't impressed with this demo and said very old hardware had already made very good-looking grass; can't I have my opinion without getting insulted? I didn't say the technology was worse, I said the grass in Flower looked better. I didn't offend anyone either... I love technology as probably most of you do, no need for all this shit.


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan11d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious whether using DLSS and AI in general will lower the power draw. It seems like the days of just adding more VRAM and horsepower are over; law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough that you'll hardly notice a difference, if at all. AI is the future and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more and more.

Tapani10d ago (Edited 10d ago )

DLSS certainly lowers power consumption. Also, numbers such as the 4090's 450W rating don't tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPU of 10 years ago. Plus, today you can undervolt + OC GPUs by a good margin to keep stock performance while using about 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W (a rough sketch of reading and capping board power follows this comment).

However, in today's world chip manufacturing is limited by physics, and we will see power increases for at least the next 5-10 years to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we'll have new tech on the market that we have yet to invent, or perhaps we can solve existing technologies' manufacturing or production-cost problems.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion, solar, etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that is a pipe dream.
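To make the power-cap point above concrete, here is a minimal sketch of reading and capping board power with NVIDIA's NVML Python bindings (pynvml). The 320 W target is just the figure mentioned in the comment, not a recommendation; changing the limit requires admin rights and must fall within the card's allowed range.

```python
# Read current draw and enforced limit, then optionally cap board power.
# NVML reports power in milliwatts throughout.
import pynvml

TARGET_WATTS = 320  # illustrative figure from the comment above

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)            # first GPU
    draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000
    limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000
    lo, hi = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    print(f"current draw: {draw_w:.0f} W, enforced limit: {limit_w:.0f} W, "
          f"allowed range: {lo / 1000:.0f}-{hi / 1000:.0f} W")

    target_mw = TARGET_WATTS * 1000
    if lo <= target_mw <= hi:
        # Requires root/admin privileges; NVML raises an error otherwise.
        pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
        print(f"power limit set to {TARGET_WATTS} W")
finally:
    pynvml.nvmlShutdown()
```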

MrBaskerville10d ago

I don't think fusion is the way forward. It will most likely be too late by the time it's finally ready, meaning it will probably never be ready. Something else might arrive before then, though, and then that becomes the viable option.

Firebird36010d ago

We need to stop the smear campaign against nuclear energy.
We could power everything forever if we wanted to.

Tacoboto11d ago

PS4 Pro had dedicated hardware in it for supporting checkerboard rendering that was used significantly in PS4 first party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?

InUrFoxHole11d ago

Well... it's a coffin, man. So at least 4?

Tacoboto11d ago

PSSR in the fall can assume that role.

anast10d ago

and those nails need to be replaced annually

Einhander197211d ago

I'm not sure what the point you're trying to make is, but PS4 Pro was before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR 2.

Tacoboto11d ago

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack10d ago

Don't know what OP's point is either, but... checkerboard rendering was good enough for its time, but in terms of image quality it's way behind what DLSS 3 or FSR 3 currently offer.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering, and it basically throws the "it's still blurry and inferior to native rendering" argument (that's been going around in the PC community since 2019) right out of the window.

Einhander197210d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboarding.

But you're talking about FSR 3, which probably is better than checkerboarding; FSR 3 has only started to get games this year, though, so checkerboarding, which was the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for. Heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic10d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out that the explosions were lower res than the frame itself...
And of course, then there was the idea of checkerboard rendering not being native...
For sure, maybe this tech makes the difference minimal when pixel counting, but alas, it seems performance and "close enough", not native, is what matters now...
I want to see it run native without DLSS... why not?

RonsonPL11d ago

An almost-deaf person:
- lightweight portable $5 speakers with 0.5 cm drivers are the final nail in the coffin of hi-fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (see the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either true 500fps on a 500Hz TN LCD or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what that means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is just as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article, just to flow with the current.

There's no coffin for native-res quality and there never will be. Eventually we'll have enough rasterization performance to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless (see the quick arithmetic after this comment).
This crap is only usable for cinematic stuff, like cutscenes and such, not for gaming. Beware of ignorants on the internet. TAA is not "native", and the shoddy look of modern games when you disable any TAA is not "native" either, as it's ruined by the developer's design choices: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass over it later. When you disable it, you will see a ruined image, horrible pixelation and other visual "glitches", but it is NOT what native would've looked like if you wanted to honestly compare the two.

Stay informed.
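The frame-budget claim is easy to put numbers on. A minimal sketch, assuming an illustrative fixed upscaler cost of 1.0 ms (real costs vary by GPU, resolution and technique, so treat the figure as an assumption):

```python
# Frame-time budget vs. a fixed upscaling cost. The 1.0 ms upscaler cost is an
# assumed illustrative value, not a measured one.
def upscaler_share(target_fps: float, upscaler_ms: float = 1.0) -> float:
    """Fraction of each frame's time budget consumed by the upscaling pass."""
    frame_budget_ms = 1000.0 / target_fps
    return upscaler_ms / frame_budget_ms

for fps in (60, 120, 240, 500):
    print(f"{fps:>3} fps -> {1000 / fps:5.2f} ms budget, "
          f"upscaler takes {upscaler_share(fps):5.1%} of it")
```

At 500 fps the whole frame budget is 2 ms, so under that assumption a 1 ms pass would eat half of it, which is the point being made above.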

RaidenBlack10d ago

The main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a gap will obviously remain compared to the native rendering tech of the time, but it'll slowly narrow to the point where it's indistinguishable.
Something similar happened with the generative AI Sora... AI-generated videos were turds back when they were introduced (the infamous Will Smith eating spaghetti video)... but now look at Sora, generating videos that just look like real life.

Yui_Suzumiya10d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple10110d ago

Maybe better to get a budget gaming laptop and link a dualsense to it

= Portable console with far better graphics than a steam deck! + bigger screen and able to use it for work / etc


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal16914d ago

Echo sentiment here - I think the way GPUs are going, gaming could be secondary to deep learning. Wonder if the 40 series was the last true generation of GPUs?

Number1TailzFan14d ago

No... Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if they can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance, then there's always AMD, which is doing OK at the mid-range.

Christopher14d ago

***or play at just 1080p and 60fps or something***

My over-2-year-old laptop GPU still runs fine. I think this is more a reason why GPUs are shifting priority to other things: the market reach for new users is shrinking as more PC gamers focus less on replacing older, still-working parts that run RT/AI fine enough as it is. Not to say there aren't people who still do it, but I think the market for having the latest and greatest is shrinking compared to the past two decades. The problem is we aren't advancing at the rate we were; we're reaching the flattening of that exponential curve in regard to advancement. We need another major technological breakthrough to restart that curve.

D0nkeyBoi14d ago

The unremovable ad makes it impossible to read the article.

Tzuno13d ago (Edited 13d ago )

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing13d ago

You also need to consider that NVIDIA is heavily invested in cloud gaming, so they are likely going to make moves to push you into yet another lifelong subscription service.

Kayser8113d ago

NVIDIA will never change their price point until AMD or Intel makes a GPU that is comparable and cheaper than theirs.
It happened before in the days of the GTX 280, when they changed the price from $650 to $450 in a matter of 2 weeks because of the Radeon 4870, which was being sold at $380.


Nvidia AI Demo Unwittingly Proves that Human Voice Actors, Artists, and Writers are Irreplaceable

Nvidia presented Covert Protocol, a tech demo aiming to showcase the "power" of the Nvidia Ace technology applied to video game characters.

Read Full Story >>
techraptor.net
Eonjay34d ago (Edited 34d ago )

They look like they are in pain. Almost begging to be put down. It was uncomfortable to watch.

PRIMORDUS34d ago

The tech is too early. Come back in 10+ years and see what it can do then.

N3mzor34d ago

That presentation sounds like it was written by an AI using corporate buzzwords.

CS734d ago

I don’t know why people keep thinking of it as AI vs no AI.

A much more likely scenario is the use of AI alongside human work.

E.g. AI voices used during side quests or banter to boost the number of lines of dialog.

AI generating additional predetermined branches in dialog tree options for more freedom in conversations with NPCs
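As a rough sketch of that last idea: generate candidate NPC lines offline, have a writer approve them, and bake the survivors into an ordinary dialog tree so nothing is generated at runtime. The generate_line function below is a hypothetical stand-in for whatever text-generation backend a studio might plug in.

```python
# Offline "pre-generated branches" pipeline: propose lines, gate them behind
# human approval, and ship only the approved ones as a normal game asset.
import json

def generate_line(npc: str, player_prompt: str, tone: str) -> str:
    # Hypothetical stand-in: a real pipeline would call a text model here.
    return f"[{npc}, {tone}] placeholder reply to: {player_prompt}"

def build_branches(npc: str, player_prompt: str, tones: list[str]) -> list[dict]:
    """Offline step: propose one candidate branch per tone for a writer to review."""
    return [{"npc": npc,
             "player_prompt": player_prompt,
             "tone": tone,
             "text": generate_line(npc, player_prompt, tone),
             "approved": False}          # flipped to True by a human editor
            for tone in tones]

branches = build_branches("Dockworker", "Seen anything strange tonight?",
                          ["friendly", "evasive", "hostile"])

# Only human-approved lines ship; the game reads this file like any other asset.
with open("dockworker_branches.json", "w") as f:
    json.dump([b for b in branches if b["approved"]], f, indent=2)
```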

Smellsforfree34d ago

"AI generating additional pre determined branches in dialog tree options for more freedom in conversations with NPCs"

I'm wondering about that last one. Will that make a game more fun or more immersive? In the end, how can it possibly be more than filler content and then if it is filler content how much do I really want to engage with conversing with it if I know it will lead no where?

MrBaskerville34d ago

It's one of those things that sounds cool on paper. But will probably get old fast.

DivineHand12534d ago

The tech is now available, and it is up to creators to create something unique with it.

Profchaos34d ago (Edited 34d ago )

The biggest thing to talk about here is that every interaction requires communication with Inworld's servers, so there are three big impacts here:
1) Games are always online, no question about it.
2) Delays in processing on Inworld's servers, outages, or unexpected load from some astronomically popular game will cause real-time delays in the game. Ever waited for a ChatGPT response? This will be similar, since the context must be pulled through the LLM (a sketch of handling this with a timeout and a canned fallback follows this comment).

Now as for the other impact, the artistic one: no, I don't think writers can be replaced. I've mentioned before that AI-generated writing is often word soup, and I still stand by that; it's also evident in the video too.
AI cannot accurately convey human emotions and I don't think it ever will.

I know publishers are looking to cut down on development costs, but what happens when Inworld decides to charge per interaction or updates its pricing a year after your game goes live? You have no choice but to pay it or shutter the game.

I've felt for a while that we are heading towards games being disposable entertainment, and now it's feeling more and more accurate.
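A minimal sketch of the failure mode described above: an NPC line that normally comes from a remote inference service, wrapped in a hard timeout with a canned fallback so an outage or load spike degrades to scripted dialog instead of a stall. The fetch_remote_line function is a simulated stand-in, not Inworld's actual API.

```python
# Remote NPC dialog with a timeout and scripted fallback lines.
import random
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout

FALLBACK_LINES = ["Not now, I'm busy.", "Come back later."]
BUDGET_SECONDS = 0.25   # how long the game is willing to wait for a reply

def fetch_remote_line(player_utterance: str) -> str:
    # Simulated stand-in for the real network call: sometimes fast, sometimes not.
    time.sleep(random.choice([0.05, 0.05, 1.5]))
    return f"(server-generated reply to '{player_utterance}')"

_pool = ThreadPoolExecutor(max_workers=2)

def npc_reply(player_utterance: str) -> str:
    """Return a generated line if the service answers in time, else a canned one."""
    future = _pool.submit(fetch_remote_line, player_utterance)
    try:
        return future.result(timeout=BUDGET_SECONDS)
    except FutureTimeout:
        return random.choice(FALLBACK_LINES)

for _ in range(3):
    print(npc_reply("Where is the director's office?"))
```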

Show all comments (23)