- simple ( @simple@lemm.ee ) 95•6 months ago
An article about Nvidia in the Linux community? Surely all the comments will be productive and discuss the topic at hand.
- RandomLegend [He/Him] ( @RandomLegend@lemmy.dbzer0.com ) English59•6 months ago
Too little too late.
Already sold my 3070 and went for a 7900 XT because I got fed up with Nvidia being lazy
- morrowind ( @morrowind@lemmy.ml ) 16•6 months ago
Good. This is the better overall solution
- RandomLegend [He/Him] ( @RandomLegend@lemmy.dbzer0.com ) English11•6 months ago
well I dearly miss CUDA, as I can’t get ZLUDA to work properly with Stable Diffusion, and FSR is still leagues behind DLSS… but yeah, overall I am very happy
- Possibly linux ( @possiblylinux127@lemmy.zip ) English2•6 months ago
You can run Ollama with AMD acceleration
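For reference, the upstream Ollama docs describe a ROCm-enabled Docker image for AMD acceleration (image tag and flags per those docs; device paths assume a standard AMDGPU setup, and the model name is just an example):

```shell
# Run Ollama's ROCm build, passing the AMD GPU device nodes through
docker run -d \
  --device /dev/kfd --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama ollama/ollama:rocm

# Then pull and chat with a model as usual
docker exec -it ollama ollama run llama3
```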
- RandomLegend [He/Him] ( @RandomLegend@lemmy.dbzer0.com ) 11•6 months ago
- yes I know, but CUDA is faster
- Ollama is for LLMs; Stable Diffusion is for images
- Possibly linux ( @possiblylinux127@lemmy.zip ) English3•6 months ago
I’m aware. I wanted to point out that AMD isn’t totally useless for AI.
- RandomLegend [He/Him] ( @RandomLegend@lemmy.dbzer0.com ) 2•6 months ago
Oh, it definitely isn’t
Everything I need runs, and I finally don’t run out of VRAM so easily 😅
- 30p87 ( @30p87@feddit.de ) 9•6 months ago
Even now, choosing between a free 4090 or a free 7900 XTX would be easy.
- RandomLegend [He/Him] ( @RandomLegend@lemmy.dbzer0.com ) English12•6 months ago
It totally depends on your usecase.
Nvidia runs 100% rock-solid on X11.
If you’re someone who really uses CUDA and all their stuff and doesn’t care about Wayland, Nvidia is the choice you have to make. Simple as that.
If you don’t care about those things or are willing to sacrifice time and tinker around with AMDs subpar alternatives, AMD is the way to go.
Because let’s face it: AMD didn’t care about machine learning, and they are only now beginning to dabble in it. They lost a huge number of people who work with those things as their day job. Those people can’t tell their bosses and/or clients that they can’t work for a week or two until they’ve figured out how to get an alternative running from a vendor that is just starting to care about that field of work.
- 30p87 ( @30p87@feddit.de ) 6•6 months ago
Luckily the only way I’m gonna use ML is on my workstation server, which will have its Quadro M2000 replaced/complemented by my GTX 1070 once I have an AMD GPU in my main PC, because on that I mainly care about running games in 4K, with high settings but without much ray tracing, on Wayland.
- CrabAndBroom ( @CrabAndBroom@lemmy.ml ) English9•6 months ago
I had to upgrade my laptop about two years ago and decided to go full AMD, and it’s been awesome. I’ve been running Wayland as a daily driver the whole time and I don’t even really notice it anymore.
- Andrenikous ( @Andrenikous@lemm.ee ) English6•6 months ago
If the day comes I want to upgrade my 3080 I’ll switch to an AMD solution but until then I’ll take any improvement I can get from Nvidia.
- Bulletdust ( @Bulletdust@lemmy.ml ) 3•6 months ago
I don’t believe Nvidia were the ones being lazy in this regard; they submitted the merge request for explicit sync quite some time ago now. The Wayland devs essentially took their sweet time merging the code.
- Catsrules ( @Catsrules@lemmy.ml ) 28•6 months ago
Explicit Sync sounds like some kind of porn syncing program.
- eveninghere ( @eveninghere@beehaw.org ) 4•6 months ago
It’s like Apple syncs videos with explicit lyrics in Apple Music when you play a song.
- pmk ( @pmk@lemmy.sdf.org ) 24•6 months ago
I will never buy anything with Nvidia again.
- EccTM ( @EccTM@lemmy.ml ) English12•6 months ago
That’s great.
I’d still like my Nvidia card to work so I’m happy about this, and when AMD on Linux eventually starts swapping over to explicit sync, I’ll be happy for those users then too.
- Possibly linux ( @possiblylinux127@lemmy.zip ) English2•6 months ago
AMD on Linux doesn’t need explicit sync
- umbrella ( @umbrella@lemmy.ml ) 20•6 months ago
hey look, the yearly “nvidia is finally fixing wayland support” post!
- lemmyvore ( @lemmyvore@feddit.nl ) English15•6 months ago
It will not though. Explicit sync is not a magic solution, it’s just another way of syncing GPU work. Unlike implicit sync, it needs to be implemented by every part of the graphical stack. Nvidia implementing it alone won’t solve issues with compositors not having it, graphical libraries not having it, apps not supporting it, and so on and so forth. It’s a step in the right direction, but it won’t fix everything overnight like some people think.
Also, it’s silly that this piece mentions Wayland and Nvidia, because (1) Wayland doesn’t implement sync of any kind (they probably meant to say “the Wayland stack”) and (2) Nvidia’s is not the only driver that needs to implement explicit sync.
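A toy sketch of what “explicit” means here, with Python threads standing in for the GPU and the compositor (all names are hypothetical; a real implementation would use a `VkSemaphore` or DRM syncobj, not a `threading.Event`): the client hands the compositor a fence along with the buffer, and the compositor waits on that fence before reading, instead of trusting the kernel to order the accesses implicitly.

```python
import threading
import time

class Buffer:
    """Stands in for a shared GPU buffer (dma-buf)."""
    def __init__(self):
        self.pixels = None

def client_render(buf: Buffer, fence: threading.Event):
    """Submits 'GPU work' that finishes asynchronously, after submit returns."""
    def gpu_work():
        time.sleep(0.05)          # GPU still busy after the submit call
        buf.pixels = "frame-1"    # the write actually lands here
        fence.set()               # explicit signal: buffer is now safe to read
    threading.Thread(target=gpu_work).start()

def compositor_present(buf: Buffer, fence: threading.Event) -> str:
    fence.wait()                  # explicit wait before sampling the buffer
    return buf.pixels

buf, fence = Buffer(), threading.Event()
client_render(buf, fence)
print(compositor_present(buf, fence))  # frame-1, never a half-written buffer
```

The point of the sketch: every reader of the buffer has to know about the fence and wait on it, which is why each compositor, toolkit, and driver in the stack needs its own explicit-sync support.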
- Bulletdust ( @Bulletdust@lemmy.ml ) 7•6 months ago
Now all they need is a complete nvidia-settings application under Wayland that allows coolbits to be set, and I may be able to use Wayland. For some reason, my RTX 2070S boosts far higher than its factory-overclocked boost clocks, resulting in random crashes; I have to use GWE (GreenWithEnvy) to limit boost clocks to OEM specs to prevent crashing.
Strangely enough, this was never a problem under Windows.
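If tinkering is acceptable, recent proprietary drivers can also pin clocks from the command line with `nvidia-smi`, which may work where a Wayland nvidia-settings doesn’t (the clock values below are illustrative, not 2070S-specific; requires root and the proprietary driver):

```shell
# First, list the clock ranges the card actually supports
nvidia-smi -q -d SUPPORTED_CLOCKS

# Lock GPU core clocks to a conservative range (min,max in MHz)
sudo nvidia-smi --lock-gpu-clocks=300,1905

# Restore the default boost behaviour
sudo nvidia-smi --reset-gpu-clocks
```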
- onlinepersona ( @onlinepersona@programming.dev ) English5•6 months ago
Doesn’t this mean application developers will have to explicitly sync the graphical state? If that’s the case, then devs will have to write custom code for it to work on NVIDIA, correct? If so, I doubt this will “finally solve” any issues, only finally provide the ability to solve them… explicitly and with a lot of dev work + required awareness.
How come AMD doesn’t need this?
P.S. Obligatory Fuck NVIDIA
Anti Commercial AI thingy
CC BY-NC-SA 4.0