- henfredemars ( @henfredemars@infosec.pub ) 32•3 days ago
That’s rather depressing to hear. AI is often used as a crutch to pave over crappy code that would cost money to properly optimize. Maybe Nvidia is also using AI as a crutch instead of developing better GPUs that can actually render more pixels?
- catloaf ( @catloaf@lemm.ee ) 8•3 days ago
Usually people are against just throwing more hardware at a problem.
They’re going to keep making more powerful hardware either way, since parallel processing capability supports graphics and AI just fine. But if they can use a novel software solution to drastically increase performance, why not?
- tunetardis ( @tunetardis@lemmy.ca ) 9•3 days ago
> They’re going to keep making more powerful hardware either way, since parallel processing capability supports graphics and AI just fine.
It’s not quite as simple as that. AI needs less precision than regular graphics, so chips developed with AI in mind do not necessarily translate into higher performance for other things.
In science and engineering, people want more precision, not less. So we look for GPUs with capable 64-bit (FP64) support, while AI is driving the industry in the other direction, from 32-bit down to 16-bit. There's a quick demo of the difference below.
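For anyone who wants to see the effect, here's a minimal NumPy sketch (nothing Nvidia-specific, just IEEE floating-point behavior): summing 10,000 copies of 0.0001 should give exactly 1.0, but half precision gives up long before that.

```python
import numpy as np

# Sum 10,000 copies of 0.0001 at three precisions; the exact answer is 1.0.
# FP16's 10-bit mantissa stops absorbing the tiny increment long before the
# sum reaches 1, while FP64 stays accurate to ~13 decimal digits.
for dtype in (np.float16, np.float32, np.float64):
    acc = dtype(0.0)
    step = dtype(0.0001)
    for _ in range(10_000):
        acc = acc + step  # rounding happens at every addition
    print(np.dtype(dtype).name, acc)

# Roughly (exact values depend on platform/rounding):
#   float16  0.25               <- stalls once 0.0001 < half an ULP
#   float32  ~1.0000535
#   float64  ~0.9999999999999
```

AI training tolerates that kind of sloppiness (and even exploits it for speed); a physics simulation generally can't.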
- catloaf ( @catloaf@lemm.ee ) 3•3 days ago
For science and engineering, workstation cards like the A6000 aren’t going anywhere.
- henfredemars ( @henfredemars@infosec.pub ) 1•3 days ago
That’s true, but I would like to see improvements driven along the consumer segment also. AI rendering is a nice software addition but I could easily see it becoming a distraction from hardware improvements.
Consumers generally can’t just throw more money at a problem in the way that professionals and businesses can.
- averyminya ( @averyminya@beehaw.org ) 3•3 days ago
It’s funny because we don’t even need GPUs. There’s tech that offloads the model’s “search” to an analog computer which is ~98% accurate for a fraction of the energy.
I imagine NVIDIA isn’t too excited about that side of AI, though.
- warm ( @warm@kbin.earth ) 15•3 days ago
Fuck off Nvidia.
- jlow (he/him) ( @jlow@beehaw.org ) 14•3 days ago
You’d better look for a new job and let someone who can take over, then.
- Artyom ( @Artyom@lemm.ee ) 1•2 days ago
Lazy CEOs just don’t want to work anymore!
- szczuroarturo ( @szczuroarturo@programming.dev ) 2•2 days ago
Yes, they can’t. DLSS is something they developed, and every single one of their GPUs has CUDA cores (not only for AI; they’re just generally useful). People are expecting them to work with DLSS. It’s kinda stating the obvious.