Not quite there yet … from left on the surface: 5G internet, WireGuard router, Pi-hole on a Zero W, and a 4x4 N95 HTPC, plus a 1080p projector. When a computer that size (actually smaller, since I don’t need a SATA bay) can outperform my tower, though …

This photo of Meteor Lake shows 16GB of LPDDR5X on the package. AMD’s looking to kill the low-midrange GPU in the next couple of generations of APUs, with Intel attempting to reach parity. And all of this in a fraction of the power envelope of a midrange gaming rig.

Maybe it’s next-quarter-itis dominating the tech press, but these developments feel like they deserve a bit more attention given that all signs point to gaming 4x4 PCs with a wall wart in the next two years. This actually makes Intel’s exit from the NUC space somewhat surprising, but they’ve been shedding products pretty consistently and this may just be a part of that.

I’m in the situation of having a 5-year-old gaming rig that’s still going strong (caveat: I’m a factory/city-builder gamer, so an RX 6600 works fine for me at 4K60) and moving into a stepvan in the next couple of weeks, which has made me suddenly very aware of power draw. So all of this may be more exciting to me than to the average bear, as I could see finally upgrading on account of a dead component in the next couple of years.
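To make the power-draw concern concrete, here’s a back-of-the-envelope sketch of off-grid runtime. All the numbers are assumptions for illustration (a hypothetical 100Ah 12V LiFePO4 bank, guessed load figures), not measurements from my actual setup:

```python
def runtime_hours(battery_wh: float, load_w: float, inverter_eff: float = 0.9) -> float:
    """Hours a battery bank can sustain a given continuous load,
    after inverter/conversion losses (assumed 90% efficient)."""
    return battery_wh * inverter_eff / load_w

BATTERY_WH = 1200   # assumed: 100 Ah x 12 V LiFePO4 bank
MINI_PC_W = 15      # assumed: N95-class mini PC under light gaming load
TOWER_W = 350       # assumed: midrange gaming tower under load

print(f"mini PC: {runtime_hours(BATTERY_WH, MINI_PC_W):.1f} h")
print(f"tower:   {runtime_hours(BATTERY_WH, TOWER_W):.1f} h")
```

Under those assumptions the mini PC runs for about three days on a charge while the tower gets an evening, which is the whole reason a gaming-capable 4x4 box is interesting in a van.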

Yet there’s still that part of me from college that wants to keep abreast of the latest developments, and as I’ve now watched six desktop Intel generations hit benchmarks since I was the lucky winner of an 8086K, there’s been nothing that really draws a line in the sand and says “this will be the clear new minimum target.”

Intel starting over at 1st gen for Meteor Lake shows they see this finally changing. It honestly could have happened anywhere from introduction of E-cores to the seeming destination of Rentable Units, which have finally popped up outside of MLID. I’ve seen nothing about what AMD’s disaggregated endpoint looks like, even though I’m definitely looking to Strix Halo as where I may be able to ditch the ITX sandwich tower completely. Couple this with swapping out my TV for a native 1080p mini projector (a “maybe” suggestion that turned into having to try one at $40, and wow!), and I could be gaming in a van in fucking style with essentially zero dedicated hardware space in just a couple years!

Anyway, in situations like this, I’ve found that I may have inadequate sources, so I thought I’d see if anyone had suggestions.

  • I guess I’m a bit confused about a lot of the reactions here, because at no point did I say towers are going to die; I just said that, for a gaming system without the need for a high-end GPU, miniaturization is finally coming for the desktop.

    “I play a lot of games, including new ones, but none of them needs high-end hardware” seems to be a weird place few people will admit to being in on online fora. I don’t care about performance past 60Hz, and my current hardware is more than sufficient for that. I do not understand the appeal of combat in games and did not buy Satisfactory until the update that added a passive-enemy mode.

    But that’s where I’m at. I’m not attempting to extrapolate anything beyond my use case, though I’m aware my needs already far exceed content consumption, which has been covered by iGPUs for a decade at this point.

    Heat being brought up as though I’m unfamiliar with thermodynamics is also baffling. I have an AIO for my current rig and wouldn’t dream of trying to do everything on air at current dissipation levels.

    But I also have an HTPC that handles everything I need it to and draws 11W during content consumption. Yes, the top end will always need beefy cooling solutions; what I feel is being missed is that the top end has been overkill for me, as aspirational future use cases run into boots-on-the-ground experience as the system ages. Nothing else on the market could rival the performance of an 8086K at $0, so that’s what I have, but I’m willing to admit it’s more than what I needed.

    Intel has done a very good job of convincing people that if they needed an i7 years ago, nothing less will suffice in the future (and until AMD finally forced i7s above four cores, that was pretty true). I’ve built a completely new system to gain 34MHz (in the '90s, obviously), so I get the drive to always go for faster and better and bigger … and if people want to keep doing that for bragging rights, more power to them, but I look at people asking if an i9-13900K will suffice with RPL-R on the horizon and can’t help but wonder what the use case is where a 13th gen would crawl while a 14th flies.

    The other thing about physical space is that we’ve come a long way since the days of the I/O plate being a couple of PS/2 ports, a parallel port and an RS-232 port. I’ve needed six add-in cards before, including a PCI-SCSI card, but I can’t imagine the consumer use cases for full ATX in 2023. Sound and networking are onboard, and insufficient USB ports is to me more a failing of mobo selection than a reason to need a full echelon of x1-x4 slots to rectify it. I’ve also had media servers that made a mid-tower feel cramped, but that data now fits on a 2280 SSD.

    I went ITX with this build since the HDDs were already offloaded to a server in 2018, bypassing mATX completely.

    So my questions are: What peripherals are people using that necessitate so many add-in cards for non-HPC needs that ATX is a must? And why is it assumed that anything less than an i9 will freeze opening Notepad, and thus that the i9's power envelope is the only one worth addressing? Yes, heat needs to be mitigated, but drawing vastly more power than needed is what creates that mitigation need in the first place.