I am using duplicati and thinking of switching to Borg. What do you use and why?
Fryboyter ( @Fryboyter@discuss.tchncs.de ) 18•2 years agoThere is no such thing as the objectively best solution. Each tool has advantages and disadvantages. And every user has different preferences and requirements.
Personally, I have been using Borg for years. And I have had to restore data several times, which has worked every time.
In addition to Borg, you can also look at Borgmatic. This wrapper extends the functionality and makes some things easier.
And if you want to use a graphical user interface, you can have a look at Vorta or Pika.
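For unattended use, a Borg run is usually wrapped in a small script. Here is a minimal sketch (the repository path and source directories are hypothetical, and `borg compact` needs Borg 1.2+); it defaults to a dry run that only logs the commands it would execute:

```shell
#!/bin/sh
# Hypothetical paths; set DRY_RUN=0 once the printed plan looks right.
set -eu

REPO="${BORG_REPO:-/mnt/backup/borg-repo}"
PLAN="${PLAN:-/tmp/borg-plan.log}"
DRY_RUN="${DRY_RUN:-1}"
: > "$PLAN"

run() {
  if [ "$DRY_RUN" = 1 ]; then
    echo "would run: $*" | tee -a "$PLAN"
  else
    "$@"
  fi
}

# One archive per run, named after host and date; zstd keeps CPU cost low.
run borg create --stats --compression zstd \
  "$REPO::{hostname}-{now:%Y-%m-%d}" /home /etc

# Thin out old archives so the repository does not grow without bound.
run borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 "$REPO"
run borg compact "$REPO"
```

Borgmatic expresses the same create/prune cycle declaratively in a config file, and Vorta/Pika do it through a GUI.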
TedvdB ( @tedvdb@feddit.nl ) 2•2 years agoOh I love borg backups and the ability to script it.
I’m making encrypted backups of a lot of servers this way, including a Lemmy instance.
Agree. Should say ‘best for you’. Cool, thanks. I know of Vorta, which I intended to use. Gonna read up on the other ones.
lawliot ( @lawliot@beehaw.org ) 14•2 years agoI use restic. For local backups, Timeshift.
*ira ( @ira@beehaw.org ) 2•2 years agoSeconded, I use restic with remote blob storage and it works nicely
CjkOvPDwQW ( @CjkOvPDwQW@lemmy.pt ) 13•2 years agoUsing Borg backup, just because there are some nice frontends for the GNOME ecosystem (when I am using GNOME, I love to use GNOME apps), and it has a nice CLI for scripting when using something else (I use it on servers)
Hrafn Blóðbók ( @sudoreboot@beehaw.org ) 6•2 years agoAnd there is a nice graphical frontend for it too: Vorta
CjkOvPDwQW ( @CjkOvPDwQW@lemmy.pt ) 3•2 years agoPersonally more of Pika Backup user ;)
mrmanager ( @mrmanager@lemmy.today ) 10•2 years agoI don’t have backups. :/
And I will regret it some day.
I use github for code so that’s backed up though.
exu ( @exu@feditown.com ) 11•2 years agoThere are two kinds of people.
Those who make backups and those who will.
Yote.zip ( @yote_zip@pawb.social ) 4•2 years agoYou very much will. It’s easier than you’d think.
IncidentalIncidence ( @IncidentalIncidence@feddit.de ) 1•2 years agoautomated/networked backups like people are talking about here are great, but even just an external SSD and the nautilus copy function will give you at least some insurance.
flux ( @flux@beehaw.org ) 8•2 years agoKopia has served me great. I back up to my local Ceph S3 storage and then keep a second clone of that on a raid.
Kopia has good performance and multiple hosts can back up to it concurrently while preserving deduplication – unlike borgbackup.
aliens ( @aliens@infosec.pub ) English3•2 years agoKopia has been working great for me as well. It’s simple, versatile and reliable. I previously used Duplicati but kept running into jobs failing for no reason, backup configurations missing randomly and simple restores taking hours. It was a hot mess and I’m happy I switched.
Parsnip8904 ( @Parsnip8904@beehaw.org ) English3•2 years agoI want to love kopia but the command line syntax feels unnatural to me. I don’t know why either. For the whole month I test drove it, I had to look up every single time how to do something. Contrast this with restic which is less featureful in some ways but a few days in it felt like I was just using git.
aliens ( @aliens@infosec.pub ) English3•2 years agoI never used the command line with Kopia besides starting it up in server mode; I used the web-based GUI to configure it, and it was pretty simple to get everything set up that way. You may want to give it another try using Kopia in that mode.
Parsnip8904 ( @Parsnip8904@beehaw.org ) English2•2 years agoMy use case is for headless machines which makes it a no go in that regard unfortunately.
flux ( @flux@beehaw.org ) English3•2 years agoYou can use the web ui remotely.
Personally I use it from the command line, though, and my only complaint is that it’s too easy to start a backup you didn’t intend to… But if you’re careful about using the
kopia snapshot
command then it’s fine.
Parsnip8904 ( @Parsnip8904@beehaw.org ) English2•2 years agoOh I thought the webui was only for server mode.
I just quickly glanced through the manuals of both restic and kopia. I think my trouble with kopia is that its style feels kind of weird. I’m just not able to wrap my head around it well.
kopia snapshot create /dir
is shorter but more confusing than restic -r repo backup /dir
derek ( @derek@lemmy.one ) 7•2 years ago- Btrfs for local system backups based on snapshots
- Photoprism for photos
- Syncthing for other media
flux ( @flux@beehaw.org ) 2•2 years agoYou will reconsider calling that strategy a backup should the filesystem get corrupted for whatever reason.
I’ve tested my full system backup restore once with btrfs. Worked out fine.
derek ( @derek@lemmy.one ) 1•2 years agoMaybe Photoprism isn’t a backup strategy, but Syncthing for sure is, because you can have multiple backup units in it.
I additionally use software RAID on one of the devices that receives Syncthing backups.
Karce ( @karce@wizanons.dev ) 7•2 years agoI use btrfs snapshots and btrbk
btrfs is a great filesystem and btrbk complements it easily. Switching between snapshots is also really easy if something goes wrong and you need to restore.
Archwiki docs for btrfs: https://wiki.archlinux.org/title/Btrfs#Incremental_backup_to_external_drive
Of course you’d still want a remote location to back up to. You can use an encrypted volume with cloud storage, so Google Drive etc. all work.
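For reference, btrbk is driven by a single config file. A sketch along the lines of the examples in its documentation (pool path, target path, and retention values are all hypothetical, and option names can differ between btrbk versions):

```text
# /etc/btrbk/btrbk.conf (illustrative)
snapshot_preserve_min   2d
snapshot_preserve       14d
target_preserve_min     no
target_preserve         20d 10w *m

volume /mnt/btr_pool
  snapshot_dir btrbk_snapshots
  subvolume home
    target /mnt/backup/btrbk
```

Under the hood this amounts to read-only snapshots plus incremental `btrfs send -p <parent> | btrfs receive` transfers, which is also what you would script by hand for a remote target.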
CoffeeBot ( @CoffeeBot@lemmy.ca ) 4•2 years agoOh interesting! I might take a look at btrbk
brandhout ( @brandhout@feddit.nl ) 4•2 years agoThis is the way !
Thanks. Heard a lot about it. Will check it.
kamin ( @kamin@lemmy.kghorvath.com ) English3•2 years agoThis is what I do. Btrfs snapshots and use send/receive with my NAS.
Klaymore ( @Klaymore@sh.itjust.works ) 6•2 years agoI use NixOS so all my system configuration is already saved in my NixOS configs, which I save on GitHub. For dotfiles that aren’t managed by NixOS I use syncthing to sync them between my devices, but no real backup cause I can just remake them if I need to, and things like my Neovim and VSCode configs are managed by my NixOS configs so they’re backed up as well.
Demonstrable_Legume ( @Demonstrable_Legume@beehaw.org ) 2•2 years agoYou can take this to the extreme too by erasing your root partition each boot: https://grahamc.com/blog/erase-your-darlings/
Using that method you isolate all important state on the system for backup with zfs send.
Klaymore ( @Klaymore@sh.itjust.works ) 3•2 years agoYeah I have a full impermanence setup using tmpfs, which is really nice. I did it like on the NixOS wiki and it’s been helpful for organizing my dotfiles and keeping track of all the random stuff that programs put everywhere.
I actually have all my stuff in a separate /stuff folder kinda by accident so my /home only has dotfiles and things like that.
professed ( @professed@beehaw.org ) 6•2 years agoI started using Timeshift when it was included with a distro I was using and haven’t had reason to shift away from it. Have already used it once to do a full restore.
I_Am_Jacks_____ ( @I_Am_Jacks_____@beehaw.org ) 6•2 years agoI’ve been using restic. It has built-in dedup & encryption and supports both local and remote storage. I’m using it to back up to a local restic-server (pointing to a USB drive) and Backblaze B2.
Restores of single files or small sets of files are easy:
restic -r $REPO mount /mnt
Then browse through the filesystem view of your snapshots and copy just like any other filesystem.
TDCN ( @TDCN@feddit.dk ) 5•2 years agoRsync is great, but if you want snapshots and file history, rsnapshot works pretty well. It’s based on rsync, but for every sync it hard-links existing files and only copies changed and new files. It saves space and remains transparent to the user. FreeFileSync is also amazing
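The hard-link trick is easy to see with plain coreutils. This runnable sketch uses `cp -al` to stand in for rsnapshot’s `rsync -a --delete --link-dest` rotation (the /tmp paths are only for the demo):

```shell
#!/bin/sh
set -eu
work=/tmp/snapdemo
rm -rf "$work"; mkdir -p "$work/src"
echo "big unchanged file" > "$work/src/keep.txt"

# First snapshot: a full copy.
cp -a "$work/src" "$work/snap.0"

# The source changes a little before the next run.
echo "new file" > "$work/src/new.txt"

# Second snapshot: hard-link everything from the previous snapshot,
# then copy only what changed on top (rsnapshot does this via
# `rsync -a --delete --link-dest=../snap.0`).
cp -al "$work/snap.0" "$work/snap.1"
cp -a "$work/src/new.txt" "$work/snap.1/new.txt"

# Unchanged files share one inode, so snap.1 costs almost no space.
stat -c %i "$work/snap.0/keep.txt" "$work/snap.1/keep.txt"
```

Every snapshot directory looks like a full copy to the user, but only changed files occupy new disk space.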
ipkpjersi ( @ipkpjersi@lemmy.one ) 5•2 years agoI use my own scripts with
rsync
etc. I don’t back up my OS itself since installing it is automated with scripts as well. I just back up specific things I need with my scripts.
automated with scripts
would you like to share those, or do you have references for creating such scripts? This has been on my to-do list for years, but I always struggle with where to begin.
ipkpjersi ( @ipkpjersi@lemmy.one ) 2•2 years agoThey’re very personalized to my setup, so they’re not particularly useful in a general sense - I’d recommend something more like using this guide which seems to be pretty good: https://jumpcloud.com/blog/how-to-use-rsync-remote-backup-linux-system
Learning Bash has been great for me; it’s helped a ton being able to automate so many different things, even just installing and configuring specific applications to work the way I want, etc.
I think a script to manually run for manual backups plus a different script to run for automatic backups scheduled via cronjob is a great way to go.
There’s of course more advanced things like zfs snapshots which I won’t get into, but I think my explanation as a general concept should be fairly useful.
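The manual-plus-cron split can be sketched as one small script that both modes share (paths are hypothetical; it defaults to a dry run that only logs what it would do):

```shell
#!/bin/sh
# Hypothetical paths. A cron entry like
#   0 3 * * * DRY_RUN=0 /usr/local/bin/backup.sh
# would run the same script unattended; run it by hand for a manual backup.
set -eu

SRC="${SRC:-${HOME:-/root}/documents}"
DEST="${DEST:-/mnt/backup/documents}"
LOG="${LOG:-/tmp/backup-demo.log}"
DRY_RUN="${DRY_RUN:-1}"
: > "$LOG"

if [ "$DRY_RUN" = 1 ]; then
  echo "would run: rsync -a --delete $SRC/ $DEST/" >> "$LOG"
else
  rsync -a --delete "$SRC/" "$DEST/" >> "$LOG" 2>&1
fi
echo "finished at $(date)" >> "$LOG"
```

Keeping the log makes it easy to notice when a scheduled run silently stops happening, which is a common failure mode of home-grown backups.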
flatbield ( @furrowsofar@beehaw.org ) 4•2 years agoI am old school. I just use GNU Tar with the pax format and multiple external detachable encrypted hard drives. The reason is that it is simple and a well-known, very common tool with a standard archive format.
GnomeComedy ( @GnomeComedy@beehaw.org ) 2•2 years agoI’m curious - how much data are you backing up with that method and how frequently are you doing your backups? Doesn’t sound like it would scale well, but I’m also wondering if maybe this is perfect and I’ve just been over thinking it.
flatbield ( @furrowsofar@beehaw.org ) 3•2 years agoThere is not a size limit. A lot of these other methods actually use GNU Tar behind the scenes anyway. More than that, GNU Tar has been used for decades for this purpose. Pull out any Unix book from two decades ago and you will see “tar”, “cpio”, and “dump/restore” as the way. The newer tool out there is pax, and in fact GNU Tar supports the new “pax” format. Moreover, GNU Tar with the pax format can back up almost the full disk structure, including hard links, ACLs, and extended attributes, which a lot of tools do not do. It is still useful to archive some things at a lower level, like your partition table and boot blocks, of course. You also have to decide what run level (such as rescue) you want to archive in, and/or what services you should stop or provide separate file-system dumps for, depending on your system. Databases and things like ecryptfs take some special thought (though they do for any tool). It is also good to do test restores to verify your disaster plan.
I use tar on many systems. My workstation is about 1 TB of data. A backup takes about 11 hours, though I think it could be faster if I disabled compression (I currently use standard gzip compression, which is not optimal). I think the process is CPU-bound by the compression at the moment. Going uncompressed, or using parallel gzip at level 2, is probably the fastest you can do and should speed things up by 4X or more. I have played with this some for my wife, and her raw backup is a lot faster now.
My wife uses USB 3 external drives specifically plugged into USB 3 ports (the ones with the SS symbol and the blue interior), with a USB 3 cable. I use 6 TB bare SATA drives I insert into a hot-mount enclosure and store in storage boxes.
My backup system can theoretically do incrementals too, but it has had some issues since I moved to BTRFS, so I do not use that at the moment; I always did before. I have an idea how to fix it, but I need to debug and test incrementals first.
How often: I back up monthly. When my incrementals were working, I used to do it weekly, or whenever I got nervous. Another option for the BTRFS file systems would be to use their native backup tools. Not sure, though; I like to use generic stuff. There is a lot to be said for generic.
Big downside of tar is the mind numbing man page. Getting the options correct takes some real thought. You also have to be comfortable with the shell and Bash scripting. Big upside you can customize exactly what you want.
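As a concrete starting point, here is a runnable sketch of that kind of invocation: GNU tar, pax format, gzip, plus a test restore. The /tmp paths are only for the demo; add --acls --xattrs if your tar build supports them.

```shell
#!/bin/sh
set -eu
work=/tmp/tar-pax-demo
rm -rf "$work"; mkdir -p "$work/src"
echo "hello" > "$work/src/file.txt"
ln "$work/src/file.txt" "$work/src/hardlink.txt"  # pax preserves hard links

# --format=pax selects the POSIX.1-2001 pax archive format (GNU tar).
# For real backups, add --acls --xattrs if your build supports them.
tar --format=pax -czf "$work/backup.tar.gz" -C "$work" src

# Always do a test restore to verify the disaster plan.
mkdir "$work/restore"
tar -xzf "$work/backup.tar.gz" -C "$work/restore"
```

Swapping -z for --use-compress-program='pigz -2' is one way to get the parallel gzip at level 2 mentioned above (pigz must be installed).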
davefischer ( @davefischer@beehaw.org ) 2•2 years agotar dates all the way back to the 70s.
flatbield ( @furrowsofar@beehaw.org ) 3•2 years agoYes, I actually did not know how far back, thanks. Wikipedia seems to say 1979. I know my system-admin book dated 1992 talks about it, and it was common then. I think my brother used to use it in the early 1980s for his job, and maybe I did too a few times. Wikipedia says GNU Tar is newer and traces back to 1987. The formats have changed some and there are several. The pax format is much newer; I think it was standardized in 2001, but GNU Tar would have taken time to implement it. I do not know that date.
People seem to forget that tar worked well back then and still does.
davefischer ( @davefischer@beehaw.org ) 2•2 years agoI had the chance to play with late 70s Unix for a bit a few years ago. (Hardware on loan from a museum.) VERY minimal, but still recognizable. (Well, my Unix reflexes are old - I started in the mid 80s.)
flatbield ( @furrowsofar@beehaw.org ) 3•2 years agoInteresting. About then I was using a VAX. Somehow I spend most of my time on other stuff until I switched to Linux around 2000.
davefischer ( @davefischer@beehaw.org ) 2•2 years agoMy first Unix was 4.3BSD on a VAX-11/750. (There was another 11/750 running VMS, but I didn’t like that nearly as much.)
winety ( @winety@dataterm.digital ) English1•2 years agoI am using a similarly “dumb” back-up system. I have two external USB HDDs, to which I copy my home folder every 4 to 6 months. The back-up folder is currently around 250 GiB, but I don’t use any compression, and I also probably do not need to back up my Steam library multiple times.
Yes, it doesn’t scale very well, but at the same time, I do not need to hoard 5 year old data. Yes, I should have an off-site back-up, but if my house burns down, I have bigger problems than losing my old photos.
Yote.zip ( @yote_zip@pawb.social ) 4•2 years agoI’ve used borg for a while and like it a lot. I would say your best option for pure linux is borg+borgmatic/vorta just because borg is battle-tested.
If you run any other OSs and don’t mind a relative newcomer, I’ve found kopia to be easy to recommend to my windows friends. At this point kopia has been around long enough (~4 years of actual beta) that I think it’s safe to trust its integrity with personal data. It has all the important features from borg in a cross-platform solution, so it’s also a viable alternative for borg on linux if you don’t like borg’s frontends for whatever reason.
Kovu ( @Kovu@beehaw.org ) 4•2 years agoI like Pika Backup; it’s based on Borg.
heartlessevil ( @heartlessevil@lemmy.one ) 4•2 years agoYeah, this is what I’ve found to be the best option. The encryption and deduplication is great.