- Norgur ( @Norgur@kbin.social ) 74•7 months ago
Thing is: there is always the “next better thing” around the corner. That’s what progress is about. The only thing you can do is choose the best available option for you when you need new hardware and be done with it until you need another upgrade.
- Sigmatics ( @Sigmatics@lemmy.ca ) 62•7 months ago
Exactly. The best time to buy a graphics card is never
- wrath_of_grunge ( @wrath_of_grunge@kbin.social ) 15•7 months ago
Really, my rule of thumb has always been to buy when it’s a significant upgrade.
For a long time I didn’t upgrade until it was a 4x increase over my old card. Certain exceptions were occasionally made. Nowadays I’m a bit more opportunistic in my upgrades, but I still seek out ‘meaningful’ ones: a decent jump over the old, typically a 50% improvement in performance, or upgrades I can get really cheap.
- jmcs ( @jmcs@discuss.tchncs.de ) 7•7 months ago
It depends on what you need. Usually you get the best bang for your buck by buying the now-previous generation when the new one is released.
- massive_bereavement ( @massive_bereavement@kbin.social ) 6•7 months ago
Graphics card. Not even once.
- AeroLemming ( @AeroLemming@lemm.ee ) 19•7 months ago
You have a magical button. If you press it now, you get $100 and the button disappears. Every year you don’t press it, the amount you’d get if you did press it goes up by 20%. When should you press the button? At any given point in time, waiting just one more year adds another 20% to your eventual prize, so it never makes sense to press it, but you have to eventually or you get nothing.
Same thing with graphics cards.
- Bizarroland ( @Bizarroland@kbin.social ) 8•7 months ago
Is it compound or a straight percentage?
Cuz if it’s a straight percentage, then it’s a flat $20 a year, whereas if it’s compound, it’s roughly a 2x multiplier every four years.
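The difference is easy to sketch in a few lines of Python (the $100 and 20% are just the thread’s hypothetical, not real prices):

```python
import math

START = 100.0  # initial prize in dollars (the thread's hypothetical)
RATE = 0.20    # 20% growth per year

def simple(years: int) -> float:
    """Straight percentage: a flat $20 (20% of the original) each year."""
    return START + START * RATE * years

def compound(years: int) -> float:
    """Compound: 20% of the current prize each year."""
    return START * (1 + RATE) ** years

# Doubling time under compound growth: ln(2) / ln(1.2), about 3.8 years
doubling = math.log(2) / math.log(1 + RATE)

print(f"after 10 years: simple ${simple(10):,.0f}, compound ${compound(10):,.0f}")
print(f"compound prize doubles roughly every {doubling:.1f} years")
```

After 10 years the straight-percentage prize is $300, while the compound one is already about $619, and the gap only widens from there.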
- AeroLemming ( @AeroLemming@lemm.ee ) 4•7 months ago
Compound, which more closely models the actual rate at which computing power has grown over the years.
- Bizarroland ( @Bizarroland@kbin.social ) 2•7 months ago
So if I waited roughly 50 years, I would get about $1 million…
- AeroLemming ( @AeroLemming@lemm.ee ) 2•7 months ago
Or you could wait 70 years and leave 34 million to people in your will… The point is that there is no mathematically correct choice.
- Bizarroland ( @Bizarroland@kbin.social ) 2•7 months ago
I think I got about 77 years left in me, unless somebody comes along and kills me that is.
That at least would be $125 million which isn’t too shabby. I find it hard to believe that anybody would say that $125 million 77 years from now would not be a considerable amount of money.
- SkyeStarfall ( @SkyeStarfall@lemmy.blahaj.zone ) 4•7 months ago
Once you need it, or, alternatively, once you have enough to live comfortably for the rest of your life. It’s exponential growth, you only get one chance, just gotta decide what your goal with the money actually is.
- AeroLemming ( @AeroLemming@lemm.ee ) 1•7 months ago
Yep. My point is that there’s no easily calculable, mathematically “correct” moment to push the button. Same goes for buying a graphics card.
- Sigmatics ( @Sigmatics@lemmy.ca ) 3•7 months ago
Press it before you retire
Same with graphics cards
- Nik282000 ( @nik282000@lemmy.ca ) 4•7 months ago
I bought a 1080 for my last PC build, downloaded the driver installer and ran the setup. There were ads in the setup for the 20 series, which had launched the day before. FML
- Norgur ( @Norgur@kbin.social ) 6•7 months ago
Yep. I bought a 4080 just a few weeks ago, and now there are ads for the refresh all over… Thing is: your card didn’t get any worse. You thought the card was a good value proposition for you when you bought it, and it hasn’t lost any of that.
choose the best available option
That’s the point: which is the best available option?
The simplest answer would be “price per fps”.
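That metric is trivial to compute. A minimal sketch, where the card names, prices, and average-fps figures are made up purely for illustration (plug in real benchmark numbers):

```python
# Hypothetical cards with invented prices and average-fps numbers,
# just to illustrate the "price per fps" metric.
cards = {
    "card_a": {"price": 500, "avg_fps": 100},
    "card_b": {"price": 800, "avg_fps": 130},
    "card_c": {"price": 1200, "avg_fps": 160},
}

def price_per_fps(card: dict) -> float:
    """Dollars paid per frame per second: lower means better value."""
    return card["price"] / card["avg_fps"]

best = min(cards, key=lambda name: price_per_fps(cards[name]))
for name, card in cards.items():
    print(f"{name}: ${price_per_fps(card):.2f} per fps")
print(f"best value: {best}")
```

Note how the cheapest card wins here even though it is the slowest, which is exactly the limitation the next reply raises: the metric ignores workloads where fps isn’t what you care about.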
- Norgur ( @Norgur@kbin.social ) 5•7 months ago
Not always. I’m doing a lot of rendering and such. So FPS aren’t my primary concern.
- gnuplusmatt ( @gnuplusmatt@reddthat.com ) 10•7 months ago
As a Linux gamer, this really wasn’t on the cards anyway
- BCsven ( @BCsven@lemmy.ca ) 2•7 months ago
AMD is the safer choice, but my Nvidia card works great with Linux. I’m on openSUSE, and Nvidia hosts its own openSUSE drivers, so it works from the get-go once you add the Nvidia repo.
- gnuplusmatt ( @gnuplusmatt@reddthat.com ) 2•7 months ago
I had an Nvidia GTX 660 back in 2013, and it was a pain in the arse on a leading-edge distro: it used to break Xorg for a couple of months every time there was an Xorg release (which, admittedly, is really rare these days since it’s in sunset mode). Buying an AMD card was the best hardware decision; no hassles, and I’ve been on Wayland since Fedora 35.
- sederx ( @sederx@programming.dev ) 9•7 months ago
I saw a 4080 on Amazon for $1,200. Shit’s crazy.
- joneskind ( @joneskind@beehaw.org ) 8•7 months ago
It really is a risky bet to make.
I doubt a full-price RTX 4080 SUPER will be worth it over a discounted regular RTX 4080.
SUPER refreshes have never crossed +10%.
I’d rather wait for the Ti version.
- wrath_of_grunge ( @wrath_of_grunge@kbin.social ) 3•7 months ago
Really, the RTX 4080 is going to be a sweet spot in terms of performance envelope. That’s a card you’ll see have some decent longevity, even if it’s not recognized as such currently.
- joneskind ( @joneskind@beehaw.org ) 1•7 months ago
It will depend on the power upgrade offered by the 50XX series and on game development studios’ appetite for more power.
But TBH, I don’t see Nvidia being able to mass-produce a chip twice as fast without increasing its price again.
Meaning, almost nobody will get the next gen’s most powerful chip, game devs will have to take that into account, and the RTX 4080 will stay relevant for a longer time.
Besides, according to SteamDB, most gamers still have an RTX 2080 or a less powerful GPU. Studios won’t sell their games if you can’t play them decently on those cards.
The power gap between high-end GPUs is growing exponentially. It won’t stay sustainable for very long.
- Anony Moose ( @anonymoose@lemmy.ca ) 1•7 months ago
I’m looking to get a 4090 this Black Friday, and even with these refreshes, it doesn’t seem like my purchasing decision would really be affected, unless they’re also refreshing the 4090.
- GarytheSnail ( @GarytheSnail@programming.dev ) 7•7 months ago
All three cards are rumored to come with the same memory configuration as their base models…
Sigh.
- Excel ( @excel@lemmy.megumin.org ) 4•7 months ago
It would help if they had any competitors. AMD and Intel aren’t cutting it.
- body_by_make ( @Cqrd@lemmy.dbzer0.com ) 7•7 months ago
AMD is definitely pulling its weight, but more competitors are always better.
- UnspecificGravity ( @UnspecificGravity@discuss.tchncs.de ) 6•7 months ago
For the vast majority of customers, who aren’t looking to spend close to a grand on a card that is infinitesimally better than one for half the price, AMD has plenty to offer.
- Kit ( @Kit@lemmy.blahaj.zone ) 4•7 months ago
Meh I’m still gonna buy a 4070 Ti on Black Friday. Wish I could wait but my other half wants a PC for Christmas.
- state_electrician ( @state_electrician@discuss.tchncs.de ) 1•7 months ago
Only slightly related question: is there such a thing as an external Nvidia GPU for AI models? I know I can rent cloud GPUs, but I’m wondering if long-term something like an external GPU might be worth it.
- baconisaveg ( @baconisaveg@lemmy.ca ) 2•7 months ago
A 3090 (used) is the best bang for your buck for any LLM / StableDiffusion work right now. I’ve seen external GPU enclosures, though they probably cost as much as slapping a used 3090 into a barebones rig and running it headless in a closet.
- AnotherDirtyAnglo ( @AnotherDirtyAnglo@lemmy.ca ) 1•7 months ago
Generally speaking, buying outright is cheaper than renting in the long run, because you can keep running the device for years, or sell it to reclaim some of the capital.