I’m particularly interested in low-bandwidth solutions. My internet connection is pretty rough: 20 Mbps down and 1 Mbps up, with no option to upgrade.

That said, this isn’t limited to low-bandwidth solutions.

I’m planning on redoing my entire setup soon to run on Kubernetes, followed by expanding the scope of what my server does (currently Plex, an SFTP server, and local client backups). Before I do that I need a proper offsite backup solution.

  • sudneo@lemmy.world · 1 year ago

    For Kubernetes you can use Velero. I tried it, but I didn’t like it (overly complex for my use case), so I wrote my own tool.

    Essentially the strategy for me is fairly straightforward, but it depends on the data you have.

    I have mostly 2 types:

    • Manifests and configuration. These I have all in Git (as I am using Flux).
    • Persistent volumes. I use OpenEBS, but for a low-resource cluster I use host volumes only. For these I have written my own tool, which simply runs as a DaemonSet with the whole root of the host mounted read-only and the DAC_READ_SEARCH capability, queries the API for volumes, and backs up the whole PV to Backblaze using restic (a rough sketch of such a DaemonSet is below). Incidentally, this is also how I do all my other backups outside K8s (i.e. Borg or restic to B2).

    I chose B2 mostly for the price, but any S3-compatible storage will do. Since everything I upload there is encrypted anyway, I don’t need to worry about the privacy implications of a third party potentially having access to my data.
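
    To make the approach concrete, here is a minimal sketch of what such a DaemonSet could look like. The image, namespace, and bucket below are placeholders (the commenter’s actual tool isn’t published here); only the read-only host mount and the DAC_READ_SEARCH capability reflect the described setup.

    ```yaml
    # Hypothetical sketch of a per-node backup DaemonSet; image, namespace,
    # and bucket are placeholders, not the commenter's actual tool.
    apiVersion: apps/v1
    kind: DaemonSet
    metadata:
      name: pv-backup
      namespace: backup
    spec:
      selector:
        matchLabels:
          app: pv-backup
      template:
        metadata:
          labels:
            app: pv-backup
        spec:
          containers:
            - name: backup
              image: example.org/pv-backup:latest    # placeholder image running restic
              securityContext:
                capabilities:
                  add: ["DAC_READ_SEARCH"]           # read host files regardless of permissions
              env:
                - name: RESTIC_REPOSITORY
                  value: b2:my-bucket:/k8s-pvs       # placeholder B2 repository
              volumeMounts:
                - name: host-root
                  mountPath: /host
                  readOnly: true                     # whole host root mounted read-only
          volumes:
            - name: host-root
              hostPath:
                path: /
    ```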

  • krolden@lemmy.ml · 1 year ago

    Multiple ~2 TB seedboxes. They can get shut down randomly, but that hasn’t happened to me.

    Cheap storage

  • hydralisk@lemmy.ml · 1 year ago

    I am a fan of using Restic, more specifically Autorestic, which is a wrapper that lets you configure restic easily with YAML files. Since all of my services are in Docker containers, I just have a hook to shut down all my containers, do the backup, and then start them all again. Downtime is not an issue since it runs while I’m sleeping. I just have it back up to Backblaze B2, where I think you get 10 GB free, which is plenty for me right now.
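
    For reference, an Autorestic config with such hooks can be sketched roughly like this; the bucket, credentials, data directory, and compose file are placeholders, so check the Autorestic docs for the exact schema:

    ```yaml
    # Hypothetical .autorestic.yml sketch: stop containers, back up, start them again.
    version: 2

    backends:
      b2-backup:
        type: b2
        path: 'my-bucket:server-backups'     # placeholder bucket and prefix
        env:
          B2_ACCOUNT_ID: 'xxxxxxxx'          # placeholder credentials
          B2_ACCOUNT_KEY: 'xxxxxxxx'

    locations:
      services:
        from: /opt/services                  # placeholder data directory
        to: b2-backup
        hooks:
          before:
            - docker compose -f /opt/services/docker-compose.yml down
          after:
            - docker compose -f /opt/services/docker-compose.yml up -d
    ```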

  • ipkpjersi@lemmy.one · 1 year ago

    I have a cottage, so I rsync to my computer there, and I also have a computer and a Synology NAS there for further backups. If I end up selling the cottage though… I’m not sure lol. I don’t really have anything too irreplaceable honestly, outside of stuff I already back up to multiple cloud services anyway.

  • 486@kbin.social · 1 year ago

    Any backup software that supports incremental backups should work similarly bandwidth-wise. I like Restic. You can even do incremental backups with plain rsync, if you want. If your data does not change much, then you should be okay. For the initial backup run it would be helpful to have physical access to the remote location, so you can bring a full backup there without having to upload it over your slow uplink.
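
    As a concrete illustration of the plain-rsync approach, dated snapshots can be kept incremental with --link-dest, so unchanged files are hard-linked against the previous run and only changed files cross the slow uplink. The hostname and paths below are placeholders:

    ```sh
    # Hypothetical sketch: dated snapshots, hard-linked against the previous one.
    TODAY=$(date +%F)
    rsync -a --delete \
      --link-dest=/backups/latest \
      /data/ backuphost:/backups/"$TODAY"/
    # Point "latest" at the new snapshot for the next run.
    ssh backuphost ln -sfn /backups/"$TODAY" /backups/latest
    ```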

    • PrettyFlyForAFatGuy@lemmy.ml (OP) · 1 year ago

      Definitely an option if I’m a bit more selective with what I back up. At the moment, for the client backups I’m zipping and encrypting the entire home folder for each client once a week. I could probably write something that looks for file changes and uploads just those.