Can you please share your backup strategies for Linux? I’m curious to know what tools you use and why. How do you automate/schedule backups? Which files/folders do you back up? What is your preferred hardware/cloud storage, and how do you manage storage space?

  • Earth Walker@lemmy.world · 35 points · 1 month ago (edited)

    I use Borg Backup, automated with a bash script that Borg provides. A cron job runs the script at the desired frequency. I keep backups on different computers; ideally I would recommend one copy in the cloud and one copy on a local machine. Borg compresses and encrypts its backups.
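
    A minimal sketch of what that kind of Borg-plus-cron setup can look like (the repo path, passphrase handling, and retention numbers below are placeholders, not the commenter’s actual script):

    #!/bin/bash
    # Nightly Borg backup (sketch). Assumes the repo was created once with:
    #   borg init --encryption=repokey /path/to/backup-repo
    export BORG_REPO=/path/to/backup-repo     # placeholder: local disk or ssh:// target
    export BORG_PASSPHRASE='change-me'        # better: read this from a root-only file

    # Create a dated, compressed, encrypted archive of the directories worth keeping
    borg create --stats --compression zstd \
        ::'{hostname}-{now:%Y-%m-%d}' \
        /home /etc /var/www

    # Thin out old archives so the repository does not grow without bound
    borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 "$BORG_REPO"

    A matching crontab entry (crontab -e) to run it every night at 02:00: 0 2 * * * /usr/local/bin/borg-backup.sh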

    Edit: I migrated a server once using the backups from this system and it worked great.

  • astrsk@fedia.io · 28 points · 1 month ago

    Borg Backup is the gold standard, with Vorta as a very nice GUI on machines that need it. All my other Linux machines run in Proxmox hypervisors and get regular container/snapshot/VM backups to another machine through Proxmox Backup Server. All that backup data is then replicated regularly and remotely via TrueNAS SCALE replication tasks.

    • GenderNeutralBro@lemmy.sdf.org · 6 points · 1 month ago

      Borg via Vorta handles the hard parts: encryption, compression, deduplication, and archiving. You can mount backup snapshots like drives, without needing to expand them. It splits archives into small chunks so you can easily upload them to your cloud service of choice.
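
      Mounting an archive really is that simple. A quick sketch, with placeholder repo and archive names:

      # List archives in the repo, then mount one read-only via FUSE
      borg list /path/to/repo
      borg mount /path/to/repo::myhost-2025-01-01 /mnt/borg
      cp /mnt/borg/home/user/notes.txt ~/restored-notes.txt   # browse and restore like a normal filesystem
      borg umount /mnt/borg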

    • NotAnArdvark@lemmy.ca · 4 points · 1 month ago

      Adding my “Me too” to Vorta/Borg. I use it with Borgbase, which I like because it’s legitimately cheap and they support Borg development. As well, you can set Borg backups with Borgbase to “append only,” which prevents ransomware or other unexpected “whoopsies” from wiping out your backup history.

      I back up most of my computer every hour, but have pruning rules that make sure things don’t get too out of hand. I have a second backup that backs everything up to my NAS (using Vorta, again). This is helpful for things like my downloads folder, virtual machines, or Steam library - things I wouldn’t want to back up over the network, but on occasion I do find myself going “whoops, I wanted that.”
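
      Vorta exposes the pruning rules as settings; the equivalent Borg CLI looks roughly like this (retention numbers are made up, not necessarily the commenter’s):

      # Keep 24 hourlies, 7 dailies, 4 weeklies, 6 monthlies; everything older is dropped
      borg prune --list \
          --keep-hourly 24 --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
          /path/to/repo
      borg compact /path/to/repo   # actually frees the space (Borg >= 1.2)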

      I also have Vorta working on my Mom’s MacBook, and have Borgbase send me an email when there isn’t any activity for longer than a couple of days. Once I got automatic pruning working right I never had to touch this again.

  • shapis@lemmy.ml · 15 points · 1 month ago

    All my code and projects are on GitHub/codeberg.

    All my personal info and photos are on proton drive.

    If Linux shits itself (and it does, often), who cares? I can have it up and running again with a fresh install in ten minutes.

    • krash@lemmy.ml · 1 point · 29 days ago

      But Proton Drive doesn’t have a Linux client yet - I suppose you just upload your files there once through the web interface and don’t sync?

      • shapis@lemmy.ml · 1 point · 29 days ago

        Personal stuff is mostly on my phone. And I’ll just sync to the computer what’s needed.

  • Lucy :3@feddit.org · 14 points · 1 month ago

    I use rsync to incrementally back up / to a separate drive, as well as to a drive on another device (my server), which then packs, compresses, and encrypts the latest backup of all devices daily and uploads them to Hetzner as well as GDrive.
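
    A rough sketch of that kind of pipeline (paths, hostnames, and the rclone remote are placeholders; the comment doesn’t say which upload tool is used):

    # Incremental rsync snapshot: unchanged files are hard-linked against yesterday's copy
    rsync -aAX --delete \
        --exclude={/proc,/sys,/dev,/run,/tmp} \
        --link-dest=/backup/host1/yesterday \
        / /backup/host1/today

    # On the server: pack, compress, and encrypt the latest snapshot, then push it off-site
    tar -C /backup/host1 -cf - today | zstd | \
        gpg --symmetric --cipher-algo AES256 -o /backup/host1-$(date +%F).tar.zst.gpg
    rclone copy /backup/host1-$(date +%F).tar.zst.gpg remote:backups/   # e.g. a Hetzner or Google Drive remote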

  • TimeSquirrel@kbin.melroy.org · 15 points (1 down) · 1 month ago

    I plug in an external drive every so often and drag and drop parts of my home dir into it like it’s 1997. I’m not running a data center here. The boomer method is good enough and I don’t do anything important enough to warrant going all out with professional snapshot based backup solutions and stuff. And I only save personal documents, media, and custom config files. Everything else is replaceable.

    • Papamousse@beehaw.org · 4 points · 1 month ago

      Yeah, about the same - old coot here. I plug in a USB3 SSD (encrypted with LUKS) and rsync from the internal HD to this external HD. That’s it.
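
      For anyone who hasn’t done the LUKS part before, the whole routine is only a few commands (device names and mount points are placeholders):

      sudo cryptsetup open /dev/sdX1 backupdrive      # prompts for the LUKS passphrase
      sudo mount /dev/mapper/backupdrive /mnt/backup
      rsync -av --delete /home/me/ /mnt/backup/home/  # mirror the internal HD onto the external SSD
      sudo umount /mnt/backup
      sudo cryptsetup close backupdrive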

    • Fonzie!@ttrpg.network · 2 points · 30 days ago

      I do exactly this but with a little shell script that just has some rsync -av and mv -f calls instead of dragging and dropping.
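
      Presumably something in this spirit - a sketch with made-up paths, where mv -f keeps one previous generation around:

      #!/bin/sh
      # Rotate the previous run, then refresh the current copy from the live directories
      rm -rf /mnt/usb/backup.old
      mv -f /mnt/usb/backup /mnt/usb/backup.old
      mkdir -p /mnt/usb/backup
      rsync -av ~/Documents ~/Pictures ~/.config /mnt/usb/backup/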

  • tetris11@lemmy.ml · 11 points (1 down) · 29 days ago (edited)

    I was talking with a techhead from the 80s about what he did when his tape drives failed, and the folly of keeping data alive on a system that doesn’t need to be. His foolproof backup strategy is as follows.

    1. At Christmas, buy a new hard drive. If Moore’s law allows, it should be double the size of the one you currently have.
    2. Put your current backup hard drive into a SATA drive slot. Copy the backup over to the new hard drive.
    3. Write the date this was done on the new hard drive with a Sharpie. The new hard drive is now your current backup.
    4. Put the now-old backup drive into your drawer and forget about it.
    5. On New Year’s Day, load each of the drives into a SATA drive slot and fix any filesystem issues (see the sketch after this list).
    6. Put them back into the drawer. Go to step 1.
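
    He didn’t name specific commands for the New Year’s check, but a plausible yearly once-over of each drive looks like this (device names are placeholders; check lsblk first):

    sudo smartctl -H /dev/sdX    # quick SMART health verdict (smartmontools)
    sudo fsck -f /dev/sdX1       # check and repair the filesystem while it is unmounted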

  • mvirts@lemmy.world · 10 points · 29 days ago

    Shout out to all the homies with nothing, I’m still waiting to buy a larger disk in hopes of rescuing as much data from a failing 3TB disk as I can. I got some read errors and unplugged it about 3 months ago.
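
    When the bigger disk arrives, one common approach (not something the commenter has committed to) is to image the failing drive with GNU ddrescue before poking at the data:

    # Copy the failing disk to an image on the new disk, keeping a map of the bad areas
    sudo ddrescue -d -r3 /dev/sdX /mnt/newdisk/old-3tb.img /mnt/newdisk/old-3tb.map
    # The image can then be mounted loopback or handed to testdisk/photorec for recovery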

  • _spiffy@lemmy.ca · 9 points · 30 days ago

    Dump configs to backup drive. Pray to the machine spirit that things don’t blow up. Only update when I remember. I’m a terrible admin for my own stuff.

  • Kongar@lemmy.dbzer0.com · 7 points · 1 month ago

    Synology NAS. I really love that thing. I use their Synology Drive software to back up the Linux home folder, as well as Windows PCs, iPads, iPhones, etc. I use their Photos mobile app to automatically back up phone photos and videos. I also synchronize a few select folders between PCs so certain in-use files are always up to date. I set the NAS to keep 30 old versions of every file. This works great for my college kids - dad has a copy of everything in case they nuke a paper or something (which has happened).

    I stopped cloning drives long ago. Now I just reinstall the OS and packages. With Linux, this is honestly faster than restoring a backup - a single pacman command installs everything I want. Then I just log into things as I open them. Ya, I might have to futz around with some settings or redownload some big games on Steam - but the eye candy and games can wait - I can be productive pretty quickly after an install.
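
    The “single pacman command” trick works if you keep an exported package list around; a sketch (the file name is arbitrary):

    pacman -Qqen > pkglist.txt               # save explicitly installed native packages
    # ...later, on the fresh install:
    sudo pacman -S --needed - < pkglist.txt  # reinstall everything on the list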

    I DO use btrfs with automatic snapshots (snapper and Btrfs Assistant). This saves me from myself when I bork an update (which I’ve done more than once). If I make a mistake, I just roll back to a snapshot and try again without my stupid mistakes. This has saved my install 3 or 4 times now.
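
    With snapper the safety net is roughly this (config name and snapshot number are placeholders; tools like snap-pac can create the pre-update snapshot automatically):

    sudo snapper -c root create --description "before update"   # manual pre-update snapshot
    sudo snapper -c root list                                   # find the snapshot number to return to
    sudo snapper rollback 42                                    # takes effect on the next boot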

    Lastly, I sneaker net an external hard drive to my office. On it is a manual backup of the NAS. I do this once per month. This protects from catastrophic failures like my house burning down. I might lose a month or so of pictures in the worst case scenario, but I still have my 25+ years of pictures of my kids, wedding videos, etc.

    In the end, the only thing that really matters is not losing my lifetime of family pictures and the good memories they provoke.

      • ddh@lemmy.sdf.org · 2 points · 29 days ago

      Synology NAS here also, divided into private (family stuff, docker volumes etc) and public (Linux ISOs and anything that can be redownloaded). Both get backed up weekly to an older NAS with Hyper Backup. Private additionally goes onto a LUKS encrypted drive monthly which is spot-checked, taken offsite, and the previous offsite drive brought back. I don’t back up any PC (don’t care, just reinstall) or phones (they are backed up on iCloud).

  • aquafunkalisticbootywhap@lemmy.sdf.org · 8 points (1 down) · 1 month ago (edited)

    etckeeper, and borg/vorta for /home
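
    For anyone who hasn’t used it: etckeeper keeps /etc in a version-control repo and commits automatically around package operations. Getting started is just:

    sudo etckeeper init                       # put /etc under version control (git by default)
    sudo etckeeper commit "initial import"    # manual commits work the same way afterwards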

    I try to be good about everything being installed as packages, even if I’m the one who made the package. That means I only have to worry about backing up my local package archive. But I’ve never actually recreated a personal system from a backup; I usually end up starting from a fresh install and slowly adding back things from the backup if I missed them. This tends to cut down on cruft and no-longer-needed hacks and fixes. It also makes for a good way to be exposed to new paradigms (desktop environments, shells, etc.).

    Something that helps is daily notes: one file for any day I’m working on my system, where I record what a custom file, config edit, or downloaded/created package does and why. These get saved separately, and I try to remember to grep them before asking the internet.

    I see the benefit of snapshots, but disk space is expensive, and I’m (usually) careful (enough) not to lock myself out or prevent boots. Anything catastrophic I have to fix is usually seen as a fun, stressful learning experience! That rarely happens anymore, for better or for worse.

  • sntx@lemm.ee · 7 points · 30 days ago

    I’m using rustic, a lock-free, Rust-written drop-in replacement for restic, which (I’m referring to restic, and therefore by extension to rustic) supports always-encrypted, deduplicating, compressed, and easy backups without you needing to worry about whether to do a full or incremental backup.

    All my machines run hourly backups of all mounted partitions to an append-only repo at Borgbase. I have a file with ignore-pattern globs to skip unwanted files and dirs (e.g. **/.cache).

    While I think Borgbase is OK, they’re just using Hetzner storage boxes in the background, which are cheaper if you use them directly. I’m thinking of migrating my backups to a handful of homelabs of trusted friends and family instead.

    The backups have a randomized delay of 5m and typically take about 8-9s each (unless big new files need to be uploaded). They are triggered by persistent systemd-timers.
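
    The timer side of that is plain systemd; a sketch of the two units (names and the backup command are placeholders, not the actual setup):

    # /etc/systemd/system/backup.timer
    [Unit]
    Description=Hourly backup

    [Timer]
    OnCalendar=hourly
    RandomizedDelaySec=5m    # the randomized delay mentioned above
    Persistent=true          # run a missed backup once the machine is back up

    [Install]
    WantedBy=timers.target

    # /etc/systemd/system/backup.service
    [Unit]
    Description=Run the backup

    [Service]
    Type=oneshot
    ExecStart=/usr/local/bin/run-backup.sh   # hypothetical wrapper that invokes rustic

    Enable it with systemctl enable --now backup.timer.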

    The backups have been running across my laptop, PC, and server for about 6 months now, and I’m at ~380 GiB total storage usage.

    I’ve mounted backup snapshots on multiple occasions already to either get an old version of a file, or restore it entirely.

    There is a tool called redu which is like ncdu but works on restic/rustic repos. This makes it easy to identify which files blow up your backup size.

  • fireshell@lemmy.ml · 6 points · 30 days ago

    Example of a Bash script that performs the following tasks:

    1. Checks the availability of an important web server.
    2. Checks disk space usage.
    3. Makes a backup of the specified directories.
    4. Sends a report to the administrator’s email.

    Example script:

    #!/bin/bash
    
    # Settings
    WEB_SERVER="https://example.com"
    BACKUP_DIR="/backup"
    TARGET_DIRS="/var/www /etc"
    DISK_USAGE_THRESHOLD=90
    ADMIN_EMAIL="admin@example.com"
    DATE=$(date +"%Y-%m-%d")
    BACKUP_FILE="$BACKUP_DIR/backup-$DATE.tar.gz"
    
    # Checking web server availability
    echo "Checking web server availability..."
    # Look only at the status line so "HTTP/2 200" (no "OK") also counts as success
    if curl -s --head "$WEB_SERVER" | head -n 1 | grep -q " 200"; then
        echo "Web server is available."
    else
        echo "Warning: Web server is unavailable!" | mail -s "Problem with web server" "$ADMIN_EMAIL"
    fi
    
    # Checking disk space
    echo "Checking disk space..."
    DISK_USAGE=$(df / | awk 'NR==2 { print $5 }' | tr -d '%')
    if [ "$DISK_USAGE" -gt "$DISK_USAGE_THRESHOLD" ]; then
        echo "Warning: Disk space usage exceeded $DISK_USAGE_THRESHOLD%!" | mail -s "Problem with disk space" "$ADMIN_EMAIL"
    else
        echo "There is enough disk space."
    fi
    
    # Creating backup
    echo "Creating backup..."
    tar -czf "$BACKUP_FILE" $TARGET_DIRS   # TARGET_DIRS is deliberately unquoted: each directory is a separate argument

    if [ $? -eq 0 ]; then
        echo "Backup created successfully: $BACKUP_FILE"
    else
        echo "Error creating backup!" | mail -s "Error creating backup" "$ADMIN_EMAIL"
    fi
    
    # Sending report
    echo "Sending report to $ADMIN_EMAIL..."
    REPORT="Report for $DATE\n\n"
    REPORT+="Web server status: $(curl -s --head $WEB_SERVER | head -n 1)\n"
    REPORT+="Disk space usage: $DISK_USAGE%\n"
    REPORT+="Backup location: $BACKUP_FILE\n"
    
    echo -e $REPORT | mail -s "Daily system report" $ADMIN_EMAIL
    
    echo "Done."
    

    Description:

    1. Check web server: Uses the curl command to check whether the site is available.
    2. Check disk space: Uses df and awk to check disk usage. If the threshold (90%) is exceeded, a notification is sent.
    3. Create a backup: The tar command archives and compresses the directories specified in the TARGET_DIRS variable.
    4. Send a report: A report on all operations is sent to the administrator’s email using mail.

    How to use:

    1. Set the desired parameters, such as the web server address, directories for backup, disk usage threshold and email.
    2. Make the script executable:
    chmod +x /path/to/your/script.sh
    
    3. Add the script to cron to run on a regular basis:
    crontab -e
    

    Example to run every day at 00:00:

    0 0 * * * /path/to/your/script.sh