AMD Quietly Funded A Drop-In CUDA Implementation Built On ROCm: It's Now Open-Source (www.phoronix.com)
Posted by rvlobato (@rvlobato@lemmy.ml) to Open Source@lemmy.ml • 4 months ago
Cross-posted to: technology@lemmy.zip, stable_diffusion@lemmy.dbzer0.com, linuxfurs@pawb.social, linux@lemmy.ml, hackernews@lemmy.smeargle.fans
mayooooo (@MayonnaiseArch@beehaw.org) • 4 months ago
A serious question: when will Nvidia stop selling their products and start charging rent? Say $50 a month gets you a 4070; your hardware could be a 4090, but that's $100 a month. I give it a year.
poVoq (@poVoq@slrpnk.net) • 4 months ago
It's more efficient to rent the same GPU to multiple people at the same time, and Nvidia is already doing that with GeForce Now.
umbrella (@umbrella@lemmy.ml) • 4 months ago
Whenever the infrastructure is good enough, they can keep the hardware and stream your workload to you.
Ludrol (@Ludrol@szmer.info) • 4 months ago
When AI and data center hardware stop being profitable.