
Nvidia seeks peace with Linux

Few companies have been the target of as much criticism in the Linux community as Nvidia. Linus Torvalds himself last year called Nvidia the "single worst company" Linux developers have ever worked with, giving the company his middle finger in a public talk.

Nvidia is now trying to get on Linux developers' good side. Yesterday, Nvidia's Andy Ritger e-mailed developers of Nouveau, an open source driver for Nvidia cards that is built by reverse engineering Nvidia's proprietary drivers. Ritger wrote that "NVIDIA is releasing public documentation on certain aspects of our GPUs, with the intent to address areas that impact the out-of-the-box usability of NVIDIA GPUs with Nouveau. We intend to provide more documentation over time, and guidance in additional areas as we are able."

Read Full Story >>
pcgamesn.com
Dark_Overlord3868d ago

Wonder if this is in reaction to the Valve OS announcement?

Feralkitsune3867d ago

If more devs actually begin to make Linux ports, this will practically kill some of Nvidia's sales, since they have not done much to help on the Linux side of things in the past. The drivers I use on Linux are ones that the fans themselves had to create, since Nvidia refuses to release up-to-date open source drivers.


Nvidia DLSS 3.7 drives a further nail in the coffin of native performance

Nvidia DLSS 3.7 is the latest update to the long-running AI upscaling technology, and it further shows native performance doesn't matter.

DustMan15d ago

I think hardware development is at a point where they need to figure out how to draw less power. These beefy high-end cards eat wattage, and I'm curious if using DLSS & AI in general will lower the power draw. It would seem like the days of just adding more VRAM & horsepower are over. Law of diminishing returns. Pretty soon DLSS/FSR will be incorporated into everything, and eventually the tech will be good enough to hardly notice a difference, if at all. AI is the future and it would be foolish to turn around and not incorporate it at all. Reliance on AI is only going to pick up more & more.

Tapani15d ago (Edited 15d ago )

DLSS certainly lowers power consumption. Also, numbers such as the 4090 at 450W do not tell you everything; most of the time the GPU stays between 200-350W in gameplay, which is not too different from the highest-end GPU of 10 years ago. Plus, today you can undervolt + OC GPUs by a good margin to keep stock performance while utilizing 80% of the power limit.

You can make the 4090 extremely power efficient and keep 90% of its performance at 320W.
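As a rough sketch of the power-capping half of that claim: the NVML Python bindings (pynvml) can read a card's power draw and lower its board power limit, which is roughly what `nvidia-smi -pl` does. The 320 W value is just the figure from the comment above, undervolting itself still needs a vendor tool, and the set call requires admin/root rights.

import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# NVML reports power in milliwatts.
draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000
limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000
print(f"current draw: {draw_w:.0f} W, current limit: {limit_w:.0f} W")

# Cap board power to 320 W (illustrative value from the comment; needs admin rights).
pynvml.nvmlDeviceSetPowerManagementLimit(handle, 320_000)

pynvml.nvmlShutdown()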

However, in today's world chip manufacturing is limited by physics, and we will see power increases over at least the next 5-10 years to keep the technology moving forward at a pace that satisfies both businesses and consumers.

Maybe in 10 years we will have new tech on the market that we have yet to invent, or perhaps we can solve existing technologies' problems with manufacturing or cost of production.

On the other hand, if we were to solve the energy problem on Earth by utilizing fusion, solar, etc., it would not matter how much power these chips require. That being said, for the next 30-40 years that is a pipe dream.

MrBaskerville15d ago

I don't think fusion is the way forward. It will most likely be too late when it's finally ready, meaning it will probably never be ready. Something else might arrive before it, though, and then that becomes viable.

Firebird36015d ago

We need to stop the smear campaign on nuclear energy.
We could power everything forever if we wanted to.

Tacoboto15d ago

PS4 Pro had dedicated hardware in it for supporting checkerboard rendering that was used significantly in PS4 first party titles, so you don't need to look to PC or even modern PC gaming. The first RTX cards released nearly 6 years ago, so how many nails does this coffin need?
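Since the thread keeps coming back to it, here is a toy sketch of the core idea behind checkerboard rendering: shade only half the pixels each frame in an alternating pattern and fill the gaps from the previous frame, so each frame costs roughly half the shading work of native. This is a conceptual NumPy illustration, not the PS4 Pro's actual pipeline, which adds motion vectors and other reconstruction tricks.

import numpy as np

def checkerboard_mask(h, w, frame_index):
    # Boolean mask selecting the half of the pixels shaded this frame;
    # the pattern alternates every frame.
    yy, xx = np.mgrid[0:h, 0:w]
    return (xx + yy + frame_index) % 2 == 0

def reconstruct(shaded, previous_full, mask):
    # Keep last frame's values everywhere, overwrite the freshly shaded half.
    out = previous_full.copy()
    out[mask] = shaded[mask]
    return out

# Simulate two frames of a tiny 4x4 image.
h, w = 4, 4
previous = np.zeros((h, w))
for frame in range(2):
    mask = checkerboard_mask(h, w, frame)
    shaded = np.full((h, w), frame + 1.0)  # stand-in for newly rendered pixels
    previous = reconstruct(shaded, previous, mask)
print(previous)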

InUrFoxHole15d ago

Well... it's a coffin, man. So at least 4?

Tacoboto15d ago

PSSR in the fall can assume that role.

anast15d ago

and those nails need to be replaced annually

Einhander197215d ago

I'm not sure what the point you're trying to make is, but PS4 Pro was before DLSS and FSR, and it still provides one of the highest performance uplifts while maintaining good image quality.

DLSS is its own thing, but checkerboarding on PS5 is still a rival to the likes of FSR2.

Tacoboto15d ago

Um. That is my point. That there have been so many nails in this "native performance" coffin and they've been getting hammered in for years, even on PS4 Pro before DLSS was even a thing.

RaidenBlack15d ago

Don't know what OP's point is either, but ... checkerboard rendering was good enough for its time, but in terms of image quality it's way behind what DLSS 3 or FSR 3 is currently offering.
The main point of the article, and what OP missed here, is that DLSS 3.7 is so good that it's nearly indistinguishable from native rendering and basically throws the "it's still blurry and inferior to native rendering" debate (that's been going around in the PC community since 2019) right out of the window.

Einhander197214d ago

RaidenBlack

DLSS is, as I said, a different thing from FSR and checkerboard.

But you're talking about FSR 3, which probably is better than checkerboard; FSR 3 has only started to get games this year, though, so checkerboard, which was the first hardware upscaling solution, was and still is one of the best upscaling solutions.

Give credit where credit is due: PlayStation was first and they got it right from the get-go, and PSSR will almost certainly be better than it will be given credit for. Heck, Digital Foundry is already spreading misinformation about the Pro.

Rhythmattic15d ago

Tacoboto
Yes... It's amazing how many talked about KZ2's deferred rendering, pointing out the explosions were lower res than the frame itself...
And of course, then the idea of checkerboard rendering not being native...
For sure, maybe this tech makes the difference minimal while pixel counting, but alas, it seems performance and "close enough", not native, is what matters now...
I want to see it run native without DLSS.. why not?

RonsonPL15d ago

Almost deaf person:
- lightweight portable $5 speakers of 0.5cm diameter are the final nail in the coffin of Hi-Fi audio!

Some people in 2010:
- smartphones are the final nail in console gaming's coffin!

This is just the same.
AI upscaling is complete dogshit in terms of motion quality. The fact that someone is not aware of it (look at the deaf-guy example) doesn't mean the flaws are not there. They are. And all it takes to see them is a display that handles motion well: either true 500fps on a 500Hz LCD TN or OLED (or faster tech), or a low-persistence mode (check blurbusters.com if you don't know what it means), also known as Black Frame Insertion or backlight strobing.

Also, an image ruined by any type of TAA is as much a "native image" as a $0.50 Chinese screwdriver is "high quality, heavy duty, for professional use". It's nowhere near it. But if you're an ignorant "journalist", you will publish crap like this article, just to flow with the current.

There's no coffin for native-res quality and there never will be. Eventually, we'll have enough performance in rasterization to drive 500fps, which will be a game changer for motion quality while also adding another benefit: lower latency.
And at 500fps, the amount of time required for upscaling makes it completely useless.
This crap is only usable for cinematic stuff, like cutscenes and such. Not for gaming. Beware of ignorant people on the internet. TAA is not "native", and the shitty look of modern games when you disable any TAA is not "native" either, as it's ruined by the developers' design choices: you can cheat by rendering every 4th pixel when you plan to put a smeary TAA pass on it later. When you disable it, you will see a ruined image, horrible pixelation and other visual "glitches", but it is NOT what native would have looked like if you wanted to honestly compare the two.

Stay informed.

RaidenBlack15d ago

Main point of the article is how far DLSS has come with v3.7 since 2018. If this is what we're getting already, then imagine what we'll get within the next ~3 years. Yes, a disparity will obviously be there compared to the native rendering tech of that time, but it'll slowly narrow down to the point it'll be indistinguishable.
Something similar is happening with generative AI like Sora ... AI-generated videos were turds back when they were introduced (the infamous Will Smith eating video) ... but now look at Sora, generating videos that just look like real life.

Yui_Suzumiya15d ago

How much VRAM is standard today? My laptop has a 1080p QLED display but only an Intel Iris Xe with 128MB of VRAM. I currently do all my gaming on it but certain titles do suffer because of it. I plan on getting a Steam Deck OLED soon to play the newer and more demanding titles.

purple10115d ago

Maybe better to get a budget gaming laptop and link a DualSense to it

= a portable console with far better graphics than a Steam Deck! Plus a bigger screen, and you can use it for work, etc.


Why I'm worried about the Nvidia RTX 50 series

Aleksha writes: "Nvidia has established itself as a dominant force in the world of AI, but I can't shake the worry of what this means for the RTX 50 series."

Tal16918d ago

Echo sentiment here - I think the way GPUs are going, gaming could be secondary to deep learning. Wonder if the 40 series was the last true generation of GPUs?

Number1TailzFan18d ago

No.. Jensen believes GPUs should stay expensive. Those wanting a top-end GPU will have to splash out for it, or play at just 1080p and 60fps or something if they can only afford a low-end option.

On the other hand, if you don't care about RT or AI performance, there's always AMD, who are doing OK at the mid range.

Christopher18d ago

***or play at just 1080p and 60fps or something***

My over-2-year-old laptop GPU still runs fine. I think this is more a reason why GPUs are shifting priority to other things: the market reach for new users is shrinking as more PC gamers focus less on replacing older, still-working parts that run RT/AI fine enough as it is. Not to say there aren't people who still do it, but I think the market for having the latest and greatest is shrinking compared to what it has been the past two decades. Problem is we aren't growing things at the rate we were; we're reaching the flattening of that exponential curve in regards to advancement. We need another major technological advancement to restart that curve.

D0nkeyBoi18d ago

The unremovable ad makes it impossible to read the article.

Tzuno18d ago (Edited 18d ago )

I hope Intel takes some of the lead and puts a big dent in Nvidia's sales.

Jingsing18d ago

You also need to consider that NVIDIA are heavily invested in cloud gaming. So they are likely going to make moves to push you into yet another lifelong subscription service.

Kayser8118d ago

NVIDIA will never change their price point until AMD or Intel makes a GPU that is comparable and cheaper than theirs.
It happened before in the days of the GTX 280, when they changed the price from $650 to $450 in a matter of 2 weeks because of the 4870, which was being sold at $380.


Battlefield V now broken on Steam Deck / Linux with EA anticheat live

That's all folks. EA anticheat has now been added into Battlefield V, so it's the end of being able to play it on Steam Deck and other Linux systems.

This joins the likes of Plants vs. Zombies Garden Warfare 2: Deluxe Edition, EA SPORTS FC 24, EA SPORTS FIFA 23, Battlefield 2042 and Madden NFL 24, all of which have EA's own homegrown anti-cheat that makes them simply unplayable on systems running Linux.

Now if you try to run it, you'll be greeted with an error. A shame to see a game that's multiple years old get broken like this, and no doubt EA will continue to use their own EA anticheat in future online games. Battlefield 1 is still okay, and Apex Legends is also still running, but perhaps it's only a matter of time before EA forces it onto those too?

Read Full Story >>
gamingonlinux.com
just_looken23d ago

more comments here:
https://www.reddit.com/r/St...
https://steamcommunity.com/...

47 comments in hours: BF5 won't launch anymore, EA is saying "meh, not my prob, bitch" and tosses out a FAQ
https://answers.ea.com/t5/B...

Yes, countering cheating is good, but when the EA app on PC is still totally broken and these cheaters are on 2042 with the same anti-cheat, this is just putting scotch tape on Trump's file cabinet to prevent leaks; it's nothing more than a stopgap.

I mention the EA app because for 4 years it was in beta, then it was tossed out with 3-year-old issues, and I still cannot get the products that I purchased on Origin. Yeah, I purchased digital, but guess what, there are no DVD games on PC anymore.

Philaroni22d ago

Likewise, I'm about to do some games with some friends here, but if you wish to PM me I could share my Discord if you'd like to talk further. Oddly enough, you bring up THQ, which I did some work for in the past.

Also, for Starfield I have just a stock GTX 1080, not even the 1080 Ti version, on an ultrawide, and with some .ini changes and other weird things I got it working at Series X frame rates. Sure, not the fancy graphics options. I know Bethesda games have always been more CPU than GPU limited. A good application I don't see many using is Process Lasso. It helps with many CPU-heavy games on PC. (Battletech is the main one I use it for.) Can find it here if you like. https://bitsum.com/
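For what it's worth, what Process Lasso automates is mostly CPU affinity and priority; below is a minimal Python sketch of the same idea using psutil. The process name and core list are made-up examples, not a recommendation for any specific game.

import psutil

TARGET = "BattleTech.exe"   # hypothetical process name
CORES = [0, 1, 2, 3]        # cores to dedicate to the game (illustrative)

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.cpu_affinity(CORES)  # restrict the process to these cores
        print(f"Pinned PID {proc.pid} to cores {CORES}")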

""Xbox has made custom engines like forza/unreal they used to support epic back in the day like slipstream like you said but they make so much cash they do not need to work as much as sony.

But the issue i find is the lack of passion behind games aswell as the mass amount of users that love the broken/microtransantion filled games we get every day. Back in the day when a game was crap or had something wrong it was a oh crap are we going to get shutdown or like thq just poof gone. But today its na who cares fix it later or write it off then oh wait never mind they like it such as the new cod.""

Going to combine my response here. You are right: MS, and what they do with not just Forza but also the tech they use in Flight Sim, is not given enough credit. If I am correct, the next Fable game is going to be using a mix of things from the Turn 10 team, maybe Unreal? Hard to say at this point, but MS, along with Sony, both own a full license of the tech (Days Gone for Sony). I agree with you on the talent and passion part; games have become too much, at least AAA games, the "milk, butter, eggs and toast" that support places nowadays. Innovation in gameplay and such has fallen lower and lower. I still don't see mechanics and ways of playing games like I did so many years back; it's a bummer.

For MS it may be the end, as even though they might have a massive advantage in software and services (I used to work in Azure), they can kind of do some things at cost but seem to fail at it a lot. The cloud gaming push I don't think is going to have the effect so many hope for (not sure why anyone wants that). I still want MS to have a chance, but if I sit here with my magic dice and roll it on MS games over the last forever, it could land on Crackdown 3 (remember that?), the cloud-powered game... and now they are trying to do it again with Kojima, somewhat. As an engineer, it's not happening now, tomorrow or anything in the next 24 years. I remember back on this interview with John,

https://www.reddit.com/r/nv...

We are not even at that point yet. So I hope we get there, and I hope MS is still a player, but I do feel you are right in many ways; they are already slowly trying to retreat, judging by the rhetoric alone.

Either way, again, PM me and we should talk more; would love to do so.

Philaroni22d ago

The rough part of all of this is that for those of us on PC, be it Windows or Linux based, the anti-cheat sucks up a lot of resources. It kind of reminds me of the conversation between security and freedom. Too much of either is good if managed well, or very bad. We as gamers just keep seeing this pop up over and over again. Wish I had some magic wand to stop cheaters... but I don't... hell, we just saw what happened with Apex at a high-level event... getting that kind of access and such is not good... but at the same time, can anyone, one person, provide me an example where anti-cheat has even, like, a hit rate above 50%? I know I'm pulling numbers out of my ass, but it seems the cheaters beat the systems in place time and time again. Where there is a lock and a smith to make it, we will always have a thief that breaks it.

just_looken22d ago (Edited 22d ago )

I can

Since everyone seems to have brain rot and forgot the PS3: that is the solution.

The PS3, for the first half of its life, had systems that would only launch the game if it saw you had a PS3 controller, along with an in-house operating system and a real GPU/CPU. It was way harder to run cheats on, and it had the ability to make custom rooms.

Yes, there was P2P lag switching, and later on, thanks to that dev kit leak, cheat menus, but for years games like Killzone/MAG had dedicated servers that were hard to crack, on an operating system/CPU the public had zero knowledge of how to make stuff for.

Right now, if they put all MP files on the server so we just stream the game, that would be a huge step, or make the multiplayer run in a sandboxed mode separate on the system.

Philaroni22d ago

I'm glad you brought up that Killzone had dedicated servers. I remember the Yellow Dog Linux days of the PS3 as well. EA used to do more dedicated servers for games, but as time went on they would "rent them out" and slowly the official servers would die down.

P2P lag always had that magical thing (Gears 2 (aka the shotgun) and Halo 2/3) fans know as host advantage. That alone allowed for lag switches, booting others from games, etc. (Though I admit, it was the most fun I had when our team could beat cheaters. Man, was that a good time.)

I fully agree with you; the fact that so much works the way it does has, in a hard way, created a network/client security issue. I'll pick on COD for example, where some assets are local and others are on the server, between games and even between evolutions/iterations of the engines used.

just_looken22d ago

@phil

Thank you for the reply, I am glad others on here remember.

Now back to the Series X having games with the same frame rate as a 2005 Xbox 360, but the masses thinking that is fine and next gen.

Starfield on Series X is native ~780p; Fallout 3 on Xbox 360 was 720p; both 30fps. Gaming has "grown so much".

Sadly, those of us who see that MP/SP, tech-wise and cheater-wise, has gotten worse, not better, are not in large numbers.

Philaroni22d ago

I hate to say it about Starfield, but I'd have to look at reports from way back when. I swear to god they said it was going to launch with Creation Kit 2 (I worked with 1 a lot in Skyrim mods, in the block-based aka cell-based structure). Back then it was great, but for the life of me I don't know how a studio like Obsidian can make a game the same as Bethesda but with fewer bugs and less bullshit (Fallout: New Vegas and The Outer Worlds). Not that they were without bugs, but when a 3rd party does better with your own tech... I find an issue... with it all.

Frame rate stuff I only understand from two points, design and marketing. On the design part, yes, back in the day 30FPS was, I swear, almost more common than nowadays. It was not a design compromise to keep it at 30, as the hardware and the way the game was being "displayed" (key thing there) were as impactful. Nowadays I feel the marketing side wants a 4K trailer, likely rendered on an Xbox or PS system, with little text saying (oh wait, it was on a PC that no one is able to afford... of course teasing there a bit).

I do understand that Starfield is a "simulated world" where, like, every item you drop is there, for like forever... cool, but that is not too new nowadays and, design-wise, is kind of a dumb thing... who cares how many carrots you can collect... is it cool, sure, makes for good PR in some ways. Lack of tangible gameplay is the issue. I can't take credit for this, but a buddy of mine worked in advertising, and I shit you not, in fast food. They would make wax models, and nowadays even 3D print and post-render, of the foods you would get, just so you as the client at a Burger King are not given what you were sold on. (He did not work for them FYI... but others.)

The issue with gaming, as you are saying, is that too much rests on the "presentation." That type of advertising does not sell like it used to. Most of it is word of mouth: what my friends tell me is a good game or not, what my friends are playing. Then come the reviews.

It's dumb that our next-gen systems feel like we've gone up only a step after already going down two or three. Take Uncharted 4 for example: 30FPS SP and 60 MP. That I can deal with, and that was a PS4 game.

I do at times dis Xbox a lot, but come on, they still have yet to make their own damn game engine. Slipstream failed heavily, whereas Sony and most of their own studios have tech they made for the games they are trying to build. I am unsure how Xbox is again now saying the "next" system will be the best ever. Sure it will be, duh? Tech changes and grows, but I have never seen a system use it so poorly. (I blame a lot on the bloated Xbox OS, FYI.)

just_looken22d ago

@phil

You can have both presentation and framerate; PCs have been doing both for decades now. 4K ray tracing, blah blah, yeah that is different, but Starfield with its seed tech is no maxed-out Minecraft with ray tracing; Minecraft uses seeds too, with huge buildings, and for years PCs have been able to do 60fps on that.

I mentioned 30fps because back in the day we had HD consoles while a lot of users had SD TVs and were just getting into HD TVs, so I get the graphics difference, but we are talking about 20 years of hardware difference.

Real console-class hardware (a 6700, a 3700X, 16GB of RAM) can run Starfield at 60fps, not maxed out, but it's possible, and this was pre-performance patches:
https://youtu.be/hNM488QIKO...

Remember, the APU/iGPU tablet crap the consoles are using is based off the 6700.

The Xbox operating system has always been Windows based: from Windows 2000/XP on the Xbox and Xbox 360, to the tail end of the 360 using Vista/7, then to the Series X using Windows 10. That is why backwards compatibility works great; it's all DirectX based, with the same BC as a Windows PC. What we see today is just the change from needing games to survive to making games as a product; M$ makes more money in a week than Sony can make in months.

Xbox has made custom engines like Forza/Unreal; they used to support Epic back in the day, like Slipstream as you said, but they make so much cash they do not need to work as much as Sony.

But the issue I find is the lack of passion behind games, as well as the mass amount of users that love the broken, microtransaction-filled games we get every day. Back in the day, when a game was crap or had something wrong, it was "oh crap, are we going to get shut down," or, like THQ, just poof, gone. But today it's "nah, who cares, fix it later," or write it off, then "oh wait, never mind, they like it," such as the new COD.

The new COD is making bank; yes, users hated it, but the sales show that the masses who do not post love it, sadly.

Heck, did you see that WoW has limited-time store items now, with hundreds online defending it because they think the store is what's needed to keep the game online, despite the $15-a-month payment and M$ ownership?

I think at the end of this generation Microsoft will go the way of Sega. They might also just buy Sony, as everything at Sony except music/games is not making money; they are a cheap buy for Microsoft right now. So we would just have 2 companies fighting in the gaming ring with papa OS watching from the sidelines.

Great chatting with you. May I recommend looking at RPG Maker games? They are out there. Even though I have an i9/4090 custom rig right now along with a PS5, I have been playing 4-year-old games or RPG Maker stuff this year. It's funny, I remember being a 360/PS3 owner with stacks of adventures; now I am like, well, time to see what is in the past.
