i got sick again, so the financial update and this thread are both late. i’ll get the financial update up at a later point, or i might just combine it with january’s since there’s not much to report as far as i can tell

  • I have the same problem; my flat is only about 50 sqm. Judging by the way things are going, I think there’s a chance Nvidia will release some consumer-grade hardware meant for LLMs in the near-ish future. Until they reveal their next lineup I’m sticking to running LLMs in the cloud, even though that may seem like a poor financial decision.

    I’m also hoping to get my hands on some Raspberry Pis. I’d like to build a toy k3s cluster at some point and maybe run my own Mastodon instance. :)
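
    For the k3s part, a rough sketch of the sanity check I’d run once the Pis are imaged: copy the kubeconfig that k3s writes to /etc/rancher/k3s/k3s.yaml off the server node, point it at that node’s address, save it somewhere local (the path below is just a guess), and then a few lines with the official kubernetes Python client can confirm every Pi actually joined:

    ```python
    import os

    from kubernetes import client, config

    # Kubeconfig copied from the k3s server node (/etc/rancher/k3s/k3s.yaml),
    # with the server address edited to point at that Pi; the local path is hypothetical.
    config.load_kube_config(config_file=os.path.expanduser("~/.kube/k3s-pis.yaml"))

    v1 = client.CoreV1Api()
    for node in v1.list_node().items:
        # Collect node conditions so we can check readiness by name.
        conditions = {c.type: c.status for c in node.status.conditions}
        print(
            f"{node.metadata.name}: "
            f"arch={node.status.node_info.architecture}, "
            f"ready={conditions.get('Ready')}"
        )
    ```

    Mostly just a way to check that every Pi shows up as Ready (and as arm64) before I try to put Mastodon on top of it.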

    • Well, at least I’m not the only one whose homelab ambitions are being crushed by their apartment layout. I think I’m going to end up with a 2U compute rack, which means I’ll probably limp along on one or two low-profile consumer GPUs. Now if only I could work out the details of the actual rack server hardware…

      A Raspberry Pi cluster is interesting! My only real exposure to Pis in a homelab was an old 1B I ran Pi-hole on. It was great right up until it stopped working.