Not quite there yet … from left on the surface: 5G internet, WireGuard router, Pi-hole on a Zero W, and a 4x4 N95 HTPC, plus a 1080p projector. When a computer that size (actually smaller, since I don’t need a SATA bay) can outperform my tower, though …
This photo of Meteor Lake shows 16GB of LPDDR5X on the package. AMD’s looking to kill the low-midrange GPU in the next couple of generations of APUs, with Intel attempting to reach parity. And all of this in a fraction of the power envelope of a midrange gaming rig.
Maybe it’s next-quarter-itis dominating the tech press, but these developments feel like they deserve a bit more attention given that all signs point to gaming 4x4 PCs with a wall wart in the next two years. This actually makes Intel’s exit from the NUC space somewhat surprising, but they’ve been shedding products pretty consistently and this may just be a part of that.
I’m in the situation of having a 5-year-old gaming rig that’s still going strong (caveat: I’m a factory/city-builder gamer, so an RX 6600 works fine for me at 4K60) and moving into a stepvan in the next couple of weeks, which makes me suddenly very aware of power draw. So all of this may be more exciting to me than to the average bear, as I could see finally upgrading on account of a dead component in the next couple of years.
Yet there’s still that part of me from college that wants to keep abreast of the latest developments, and as I’ve watched now six desktop Intel generations hit benchmarks since I was the lucky winner of an 8086K, there’s been nothing that really draws a line in the sand and says “this will be the clear new minimum target.”
Intel starting over at 1st gen for Meteor Lake shows they see this finally changing. It honestly could have happened anywhere from introduction of E-cores to the seeming destination of Rentable Units, which have finally popped up outside of MLID. I’ve seen nothing about what AMD’s disaggregated endpoint looks like, even though I’m definitely looking to Strix Halo as where I may be able to ditch the ITX sandwich tower completely. Couple this with swapping out my TV for a native 1080p mini projector (a “maybe” suggestion that turned into having to try one at $40, and wow!), and I could be gaming in a van in fucking style with essentially zero dedicated hardware space in just a couple years!
Anyway, in situations like this, I’ve found that I may have inadequate sources, so I thought I’d see if anyone had suggestions.
- cooopsspace ( @cooopsspace@infosec.pub ) English • 46•1 year ago
Pry my gaming PC from my cold dead hands.
Upgradability is king.
My desktop will become my server, and the hand me down process doesn’t stop there.
- loops ( @loops@beehaw.org ) English • 28•1 year ago
My desktop will become my server.
*5 years later
My desktop will be integrated into my personal server farm.
- cooopsspace ( @cooopsspace@infosec.pub ) English • 4•1 year ago
I’ll get 6-8 out of mine, especially since none of my games are AAA.
I had a slew of desktop-cum-servers (you know what, I’m leaving it … y’all know what I meant) back in the day, and what broke the chain was realizing that my server files fit on an SSD. I’m not doing anything on the server end that would necessitate an i7, so that little $99 computer works fine outside of Jellyfin attempting to transcode 4K, at which point switching to VLC gets the job done with far less than 99% dropped frames.
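(For anyone wondering why VLC wins there: it just moves the decode to the client. A rough sketch of the decision the server is making — hypothetical codec lists for illustration, not Jellyfin’s actual API:)

```python
# Simplified sketch: a media server transcodes only when the client
# can't decode the stream natively. Direct play costs the server almost
# nothing; transcoding 4K is what buries a low-power CPU like an N95.
def needs_transcode(video_codec: str, client_codecs: set[str]) -> bool:
    """True if the server must transcode this stream for this client."""
    return video_codec not in client_codecs

# Hypothetical capability sets: an HTPC client (e.g. VLC using the
# box's hardware decoder) handles HEVC; a bare browser client may not.
htpc_client = {"h264", "hevc", "av1"}
browser_client = {"h264"}

assert not needs_transcode("hevc", htpc_client)   # direct play, cheap
assert needs_transcode("hevc", browser_client)    # server-side transcode
```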
I’m dicey on being locked into a GPU for years, but my 580 was fine at 4K until Satisfactory, and I don’t see myself suddenly discovering an interest in AAA gaming in my 40s, so a beefy iGPU could be sufficient.
- CalcProgrammer1 ( @CalcProgrammer1@lemmy.ml ) 35•1 year ago
The death of the tower/server/workstation/supercomputer/etc. is a pretty bad take. Computers have been getting better for over half a century and these big machines still exist. As computing power grows, so do software demands. If we can make a phone with the power of today’s gaming PC, then with the same technology we can make a gaming PC many times more powerful, and games will take advantage of that. A modern smartphone can run PC games of the 2000s and maybe early 2010s with proper emulation. The Steam Deck can run most games released today. That doesn’t mean demand for high end systems disappears.
Can you point out where I extrapolated about all use cases for all towers? A lot of people seem to have read that, and I’d really like to understand where my post went so wildly off the rails that this was the predominant takeaway. For someone with a heavy background in communications, I’m apparently terrible with words.
- fox_the_apprentice ( @fox_the_apprentice@lemmynsfw.com ) English • 2•1 year ago
Can you point out where I extrapolated about all use cases for all towers?
Very well, I’ll bite. From your OP:
Not quite there yet … […] When a computer that size (actually smaller, since I don’t need a SATA bay) can outperform my tower, though …
all signs point to gaming 4x4 PCs with a wall wart in the next two years
From one of your other posts:
at no point did I say towers are going to die, just that for a system for gaming without the need for a high-end GPU
These comments imply that compact computers will start to outperform full-sized computers, and that GPUs will become useless due to the performance of APUs.
If you’re just talking about your personal preferences, then go build yourself an ITX form factor PC and have fun with it! Oh wait, you already did! Good for you - you have a PC that fits your needs. You aren’t alone in that. The Steam Deck is generally well-received, for example.
But it really sounds like you aren’t just talking about your personal preferences. It sounds like you really believe that APUs are going to replace high-end GPUs. It sounds like you think gaming laptops are going to take over the world.
I’ll repeat your quote from earlier: “a system for gaming without the need for a high-end GPU”. APUs are going to replace low-end GPUs - in fact, they already have! The $200 new GPU market no longer exists! But they don’t provide enough performance to max out graphics of new games and, in your own words, remove the need for high-end GPUs.
It seems like your entire post is about “man I can’t wait for the full/mid-tower PC market to die” and then you’re acting surprised when people say things like “my PC sits on my desk and never moves so I’d rather have a full-sized cooling solution.”
And your other comments just reinforce it.
What peripherals are people using that necessitate so many add-in cards for non-HPC needs that ATX is a must, and why is it assumed that anything less than an i9 will freeze opening Notepad and thus the only power envelope worth validly addressing is that of an i9?
You’re creating a strawman. Nobody except you is saying that an i9 will freeze opening Notepad. It’s also very hostile in implying that “I’m fine with it, therefore most of the world must also be fine with it.” Some folks play CPU-intensive games and prefer having a powerful PC. That’s their money to spend how they want.
Full-sized PCs permit adaptability.
- Onboard Wi-Fi died? Get an add-in card replacement. This applies to onboard audio, network, USB, etc.
- Phone no longer has a 3.5mm jack and I now have to use a Bluetooth headset? Time to add Bluetooth to my PC via add-in card.
- Got a cool new monitor from a friend that’s higher resolution? I can upgrade my GPU independently. APUs limit me to what my motherboard socket supports.
Full-sized PCs permit better cooling. Even mid-range GPUs benefit from that.
Full-sized PCs permit easier dust cleaning and maintenance.
Full-sized PCs are easier to work in for those of us that like to tinker.
There’s more to it than “What peripherals are people using that necessitate so many add-in cards for non-HPC needs that ATX is a must”; that’s a deliberately dumb take.
To summarize:
People who want a compact or power-efficient PC already have one.
People who prefer more power/cooling have a mid-tower or full-tower PC.
Games and other software will keep pace with the power/cooling available in common PCs, so don’t expect the full-sized ones to go away just because the compact ones get more powerful.
I guess I’m a bit confused about a lot of reactions here, because at no point did I say towers are going to die
Yes you did. You’ve been saying it repeatedly in different forms throughout your various replies. I think you’re wrong, and it seems several other commenters do too.
I’m honestly not trying to be combative here … I’m just surprised by the sorts of responses, so I appreciate the explanation, even if it’s to a certain extent more confusing.
Very well, I’ll bite. From your OP:
Not quite there yet … […] When a computer that size (actually smaller, since I don’t need a SATA bay) can outperform my tower, though … all signs point to gaming 4x4 PCs with a wall wart in the next two years
From one of your other posts:
at no point did I say towers are going to die, just that for a system for gaming without the need for a high-end GPU
These comments imply that compact computers will start to outperform full-sized computers, and that GPUs will become useless due to the performance of APUs.
There’s the disconnect. How are you getting from A to B? “Can outperform my tower” from five years ago is not “compact computers will start to outperform full-sized computers” and certainly not “GPUs will become useless due to the performance of APUs.” This is the extrapolation that’s confusing me.
What peripherals are people using that necessitate so many add-in cards for non-HPC needs that ATX is a must, and why is it assumed that anything less than an i9 will freeze opening Notepad and thus the only power envelope worth validly addressing is that of an i9?
You’re creating a strawman. Nobody except you is saying that an i9 will freeze opening Notepad. It’s also very hostile in implying that “I’m fine with it, therefore most of the world must also be fine with it.” Some folks play CPU-intensive games and prefer having a powerful PC. That’s their money to spend how they want.
In context, I was referring to the sorts of things that transpire on Reddit when it comes to CPU recommendations if gaming is mentioned at all, where it’s often i9 or nothing, and if it came out two weeks ago, it’s already too slow by orders of magnitude. The middle ground is all but ignored, which is what I’m referring to.
I guess I’m a bit confused about a lot of reactions here, because at no point did I say towers are going to die

Yes you did. You’ve been saying it repeatedly in different forms throughout your various replies. I think you’re wrong, and it seems several other commenters do too.
Please provide examples of this since I’m doing it all over the place. I can’t find one where I talk about how towers are an endangered species.
- fox_the_apprentice ( @fox_the_apprentice@lemmynsfw.com ) English • 1•1 year ago
There’s the disconnect. How are you getting from A to B? “Can outperform my tower” from five years ago is not “compact computers will start to outperform full-sized computers” and certainly not “GPUs will become useless due to the performance of APUs.” This is the extrapolation that’s confusing me.
You literally made the statement “at no point did I say towers are going to die, just that for a system for gaming without the need for a high-end GPU” which says that you think GPUs (except those at the very high end) will be made obsolete by APUs.
In context, I was referring to the sorts of things that transpire on Reddit when it comes to CPU recommendations if gaming is mentioned at all, where it’s often i9 or nothing, and if it came out two weeks ago, it’s already too slow by orders of magnitude. The middle ground is all but ignored, which is what I’m referring to.
And yet you used it as an actual argument against people here - not on Reddit - who disagree with you. You don’t get to use that argument here and then try to say “no no, I only meant it in reference to people way over there who aren’t even on this social media.”
Please provide examples of this since I’m doing it all over the place. I can’t find one where I talk about how towers are an endangered species.
I provided several exact quotes in my previous comment. I’ll add one more, since apparently those weren’t enough. In the title of this thread you state “It feels like we’re on the cusp of […] and a real shot at relegating towers to the extreme high end.” Emphasis mine. That very clearly says that you think towers are going to become endangered. That they will be preserved only by, and I quote, people at “the extreme high end.”
I thank you for your time reading my comment and replying to me. I think this will be my final word on the subject - but I’ll be sure to read any replies in case you think there’s still a misunderstanding.
- BananaTrifleViolin ( @BananaTrifleViolin@kbin.social ) 22•1 year ago
This doesn’t make sense. It’s more likely we’ll pack more into high end devices than say goodbye to them for tasks like gaming.
Computing power has been constantly improving for decades and miniaturisation is part of that. I have desktop PCs at work in small form factors that are more powerful than the gaming PC I used to have 10 years ago. It’s impressive how far things have come.
However, at the top end, the bleeding edge in CPUs, GPUs and APUs, high-powered kit needs more space for very good reasons. One is cooling - if you want to push any chip to its limits, you’ll get heat, so you need space to cool it. The vast majority of the space in my desktop is for fans and airflow. Even the vast majority of the bulk of my graphics card is actually space for cooling.
The second is scale - in a small form factor device you cram in as much as you can get, and these days you can get a lot in a small space. But in my desktop gaming tower I’m not constrained by such limits. So I have space for a high quality power supply unit, a spacious motherboard with a wealth of expansion options, a large graphics card so I can have a cutting edge chip and keep it cool, space for multiple storage devices, lots and lots of fans, and a cooling system for the CPU.
Yes, in 5 years a smaller device will be more capable for today’s games. But the cutting edge will also have moved on and you’ll still need a cutting edge large form factor device for the really bleeding edge stuff. Just as now - a gaming laptop or a games console is powerful but they have hard upper limits. A large form factor device is where you go for high end experiences such as the highest end graphics and now increasingly high fidelity VR.
The exceptions to that are certain computing tasks that don’t need anything like high end hardware any more (office software, web browsing, 4K movies), and other tasks that largely don’t (like video editing), so big desktops are becoming more niche in the sense that high end gaming is their main use for many home users. That’s been a long running trend, and it’s not related to APUs.
The other exception is cloud streaming of games and offloading processing into the cloud. In my opinion that is what will really bring an end to needing large form factor devices. We’re not quite there yet, but I suspect that’s what will really push form factors down, rather than APUs etc.
- tburkhol ( @tburkhol@beehaw.org ) 9•1 year ago
I thought Apple was on to something with their cylindrical form factor. It’s inconvenient from a mobo perspective, but imagine a central core motherboard with CPU, GPU, even memory and M.2 sticking out like fins. Get the mobo down to segments of around 25mm x 250mm, and stacking the whole thing over a massive - like 250-350mm - low-RPM fan would be great for heat management and noise. Heck, put it in a venturi tunnel driven by a 500mm fan.
I guess I’m a bit confused about a lot of reactions here, because at no point did I say towers are going to die, just that for a system for gaming without the need for a high-end GPU, things are looking like miniaturization is finally coming for the desktop.
“I play a lot of games, including new ones, but none of them needs high-end hardware” seems to be a weird place that few people will admit to being in on online fora. I don’t care about performance past 60Hz, and my current hardware is more than sufficient for that. I don’t understand the appeal of combat in games and didn’t buy Satisfactory until the update that added a passive-enemy mode.
But that’s where I’m at. I’m not attempting to extrapolate anything beyond my use case, though I’m aware my needs already far exceed content consumption, which has been covered by iGPUs for a decade at this point.
Heat being brought up as though I’m unfamiliar with thermodynamics is also baffling. I have an AIO for my current rig and wouldn’t dream of trying to do everything on air at current dissipation levels.
But I also have an HTPC that handles everything I need it to and draws 11W during content consumption. Yes, the top end will always need beefy cooling solutions; I feel like what’s being missed is that the top end has been overkill for me, as aspirational future use cases run into boots-on-the-ground experience as the system ages. Nothing else on the market could rival the performance of an 8086K at $0, so that’s what I have, but I’m willing to admit it’s more than what I needed.
Intel has done a very good job of convincing people that if they needed an i7 years ago, nothing less will suffice in the future (and until AMD finally forced i7s above four cores, that was pretty true). I’ve built a completely new system to gain 34MHz (in the '90s, obviously), so I get the drive to always go for faster and better and bigger … and if people want to keep doing that for bragging rights, more power to them, but I look at people asking if an i9-13900K will suffice with RPL-R on the horizon and can’t help but wonder what the use case is where a 13th gen would crawl while a 14th flies.
The other thing about physical space: we’ve come a long way since the days of the I/O plate being a couple of PS/2 ports, a parallel port and an RS-232 port. I’ve needed six add-in cards before, including a PCI-SCSI card, but I can’t imagine the consumer use cases for full ATX in 2023. Sound and networking are onboard, and insufficient USB ports is to me a failing of mobo selection, not something that needs a full echelon of x1-x4 slots to rectify. I’ve also had media servers that made a mid-tower feel cramped, but those data now fit on a 2280 SSD.
I went ITX with this build since the HDDs were already offloaded to a server in 2018, bypassing mATX completely.
So my questions are: What peripherals are people using that necessitate so many add-in cards for non-HPC needs that ATX is a must, and why is it assumed that anything less than an i9 will freeze opening Notepad and thus the only power envelope worth validly addressing is that of an i9? Yes, heat needs to be mitigated, but using vastly more power than needed in the first place causes that mitigation need.
- Takatakatakatakatak ( @bandario@lemmy.dbzer0.com ) English • 3•1 year ago
I think most of what you have said is true, and I’m glad for it. I will continue to build enthusiast level computers and sit close enough to the bleeding edge because I enjoy it.
What I believe is likely to happen is that serious performance-oriented gaming PCs will once again become fairly niche. I bucked against the idea of micro form factor mini PCs being a valid choice for gamers for a while.
It wasn’t until I saw the YouTube video I’ve linked below that I realised something like the HX99G would MORE than fill the desires of most of my friend group in terms of gaming performance, thermals and user experience.
It’s not as small as the APU powered boxes that OP was talking about, and it has a dedicated wedge of silicon for the GPU but it is extremely cheap, extremely capable and seems to run fairly cool whilst being smaller than 99.9% of normal PCs.
My wife recently asked that I build her a gaming PC. She’s pretty casual and doesn’t mind 1080p gaming. All of my spare parts and previous gen hardware has already been put to work in a gaming PC for my daughter so I began the task of speccing up a reasonably decent 1080p gaming build from new parts. I can’t beat the price:performance ratio of the HX99G. Watch the video and see for yourself.
Keeping in mind that these are now previous-generation components and a next-gen replacement is almost certainly due any month now … it’s nuts. Not only is he playing current-release games at 1080p, in some titles they were happily over 100fps at 1440p, with fighting games even running at 4K without issue.
You and I will happily keep our server-sized monsters, but I know a LOT of people that will happily sit in this lane. The price is right and so is the performance. It’s like a console without so many limitations, as well as being a powerful PC in its own right.
- SenorBolsa ( @SenorBolsa@beehaw.org ) 13•1 year ago
Putting it in a bigger box with more cooling capacity will always make a much faster computer, so that’s not going away anytime soon and someone will always find a way to use 20% more power than is available every time a faster computer is made. A lot of things just come down to how well you can cool something, engines, brakes, lights, computers, batteries… how hard do you want to go and how long do you want to do it often determines the form of things.
My computer fits on my desk as it is so making it smaller gains me nothing and just makes it less useful.
Maybe tower PCs will become slightly more niche again in the future, but they’ll always be around for enthusiasts like me.
- wintrparkgrl ( @wintrparkgrl@beehaw.org ) English • 2•1 year ago
Just like I’ll never have something smaller than a full tower. With just cheap air cooling I can hold a stable 68°C with heavy overclocking.
- SenorBolsa ( @SenorBolsa@beehaw.org ) 1•1 year ago
Unless you have bleeding edge hardware, yes; the highest end stuff usually requires that you dissipate 600+ W of heat continuously at full tilt. I’m fine with running the hardware just below its stock throttling limits (which are well below safety cutoffs), which these days is in the 90s. It’s just the reality of it if you don’t want to experience what it’s like to game on the deck of an aircraft carrier, or go through the trouble of water-cooling everything.
FWIW I’ve put a lot of cards through this kind of “abuse” and then handed them down, they all worked for many years after.
My GPU is even hotrodded with the fans and shrouds removed and two side panel fans close to it, and the gaps sealed with gaffers tape to improve static pressure. Works really well but still, it’s a lot of heat to move out of a relatively small device.
- stealth_cookies ( @stealth_cookies@lemmy.ca ) 5•1 year ago
On one hand, we are already at a point where most people don’t need any more computing power than their phone has. At the same time, for the people that game or need the most performance, you are limited by how much size you need for the cooling solution, and I don’t see that changing significantly anytime soon.
- d3Xt3r ( @d3Xt3r@beehaw.org ) 5•1 year ago
I did the TV -> projector swap last year, got myself a 4K projector that sits above my bed and projects a massive 100" image on the wall opposite, and it’s awesome. I’ve got my PS5 and Switch hooked up to it, and I’m currently living the dream of being able to game and watch movies on a giant screen, all from the comfort of my bed. Some games really shine on a screen like that and you see them in a new light - TotK, the Horizon series, Spider-Man, etc. - and it’s 100% worth the switch, IMO.
Now I also have a regular monitor - a nice low-latency QHD 16:10 monitor with HDR, hooked up to my PC, which also uses a 6600 XT btw. The main reason I use this setup is productivity, plus running some PC games that don’t have console equivalents, and the colors look much nicer compared to my projector. Maybe if I bought a laser projector and had one of those special ALR screens I could get nicer colors, but all that is way beyond my budget. Although these days I’m not on my desktop as much as I used to be (I also have a Ryzen 6000 series laptop that I game on btw), I still like my desktop because of the flexibility and upgradability.

I also explored the option of switching to a cloud-first setup and ditching my rig, back when I wanted to upgrade my PC and we had all those supply chain issues during Covid, but in the end cloud gaming didn’t really work out for me. In fact, after exploring all the cloud options, I’ve been kind of put off by cloud computing in general - at least the public clouds offered by the likes of Amazon and Microsoft - they’re just in it to squeeze you dry and take control away from you, and I don’t like that one bit. If I were to lean towards cloud anything, it’d be rolling my own, maybe using something like a Linode VM with a GPU, but the pricing doesn’t justify it if you’re looking at anything beyond casual usage. And that’s one of the things I like about the PC: I could have it running 24x7 if I wanted to and not worry about getting a $200 bill at the end of the month, like I got with Azure, because scummy Microsoft didn’t explain anywhere that you’d be paying for bastion even if the VM was fully powered off…
Anyways, back to the topic of CPUs: I don’t really think we’re at the cusp of any re-imagining; what we’ve been seeing is just gradual and natural improvement, maybe with the PC taking inspiration from the mobile world. I haven’t seen anything revolutionary yet; it’s all been evolutionary. At most, I think we’ll see more ARM-like models - the integrated RAM you mentioned, more SoC/integrated solutions, maybe AI/ML cores being the new thing to look for in a CPU, maybe ARM itself making more inroads into the desktop and laptop space, since Apple has shown that you can use ARM for mainstream computing.
On the revolutionary side, the things I’ve been reading about are stuff like quantum CPUs or DNA computers, but these are still very experimental, with very niche use-cases. In the future I imagine we might have something like a hybrid semi-organic computer, with a literal brain that forms organic neural networks and evolves as per requirements; I think that would be truly revolutionary, but we’re not there yet, not even at the cusp of it. Everything else I’ve seen from the likes of Intel and AMD has just been evolutionary.
Short-throw 4K is certainly the stretch goal; my two 32" 4K60 HDR LGs are coming along in the meanwhile. Sometimes, you just can’t beat a shitton of pixels a couple of feet from your face.
I’ve been going toward more control this year, having switched to KDE after one too many Win11 nags about OneDrive. It floors me that they really could have upped the telemetry without me jumping ship, but instead they toasted themselves off my desktop. I will not be participating in any sort of thin-client/VDI dystopia anytime soon for the reasons you enumerated, which is what makes the idea of that photo being my entire internet connection/VPN/pihole/gaming PC/display sometime soon so appealing even without the van situation.
Yeah, there’s been a lot of progress on a lot of fronts, but SoCs coming to replace gaming towers that are essentially unchanged since the adoption of ATX as a form factor is to me bigger than the cores on a chip themselves. And the power envelope for a single-package CPU/GPU (RAM notwithstanding) with that level of performance would to me, as an enthusiast, obliterate everything up to the 80/800 GPU level. I’m sure people would still build towers because they like building towers, but I’m happy to let PSUs and power connectors sail quietly into the night.
- SenorBolsa ( @SenorBolsa@beehaw.org ) 1•1 year ago
Just a heads up: with short throw you have to be really sure to have a perfectly flat surface to project on. Even just a 0.5 throw ratio means any kind of pull-down screen will be a nightmare to use, and even tab-tensioned isn’t great (but acceptable).
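(For anyone sizing this up: throw ratio is just throw distance divided by image width, so - back-of-the-envelope, assuming that standard definition:)

```python
# Throw ratio = throw distance / image width, so for a given distance:
def image_width(distance_m: float, throw_ratio: float) -> float:
    """Projected image width in meters for a given throw distance."""
    return distance_m / throw_ratio

# A standard-throw (1.5) projector needs ~3.3 m to make a 2.2 m wide
# image; a 0.5 short-throw makes the same image from just ~1.1 m.
# That much steeper angle of incidence is why any ripple in a
# pull-down screen distorts the picture so badly.
assert abs(image_width(3.3, 1.5) - 2.2) < 0.01
assert abs(image_width(1.1, 0.5) - 2.2) < 0.01
```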
- crow ( @crow@beehaw.org ) English • 3•1 year ago
I’ve always believed the next step of computing will be using photonics to better connect the PC into a more integrated machine, with shared removable RAM, swappable cache, and other things that previously needed to be close to other parts on the board. Intel has finally just started testing this a little bit.