I have a 56 TB local Unraid NAS that is parity protected against single drive failure, and while I think a single drive failing and being parity recovered covers data loss 95% of the time, I’m always concerned about two drives failing or a site-/system-wide disaster that takes out the whole NAS.

For other larger local hosters who are smarter and more prepared, what do you do? Do you sync it off site? How do you deal with cost and bandwidth needs if so? What other backup strategies do you use?

(Sorry if this standard scenario has been discussed - searching didn’t turn up anything.)

  • Shadow@lemmy.ca · +41 · 2 hours ago

    I don’t. Of my 120tb, I only care about the 4tb of personal data and I push that to a cloud backup. The rest can just be downloaded again.

    • NekoKoneko@lemmy.world (OP) · +8 · 2 hours ago

      Do you have logs or software that keeps track of what you'd need to re-download? A big stress for me with that method is remembering or keeping track of what was lost when neither I nor any software can see the filesystem anymore.

      • BakedCatboy@lemmy.ml · +5 · 58 minutes ago

        My *arrstack DBs are part of my backed up portion, so they’ll remember what I have downloaded in my non-backed up portion.

      • Sibbo@sopuli.xyz · +16/−2 · 2 hours ago

        If you can’t remember what you lost, did you really need it to begin with?

        Unless it’s personal memories of course.

        • Onomatopoeia@lemmy.cafe · +3 · 1 hour ago · edited

          I can’t remember the name of an Excel spreadsheet I created years ago, one that has continually matured through lots of changes. I often have to search for it among the many spreadsheets I keep for different purposes.

          Trusting your memory is a naive, amateur approach.

          • frongt@lemmy.zip · +2/−1 · 1 hour ago

            So you do remember that you have several frequently-used spreadsheets.

        • NekoKoneko@lemmy.world (OP) · +2 · 2 hours ago

          In my case, I have a bad memory. I might remember a childhood movie (a nickname I give to special Linux ISOs) that I hadn’t thought of in 10 years and track down a copy, sometimes excavating obscure sources; that can be hours of one-time inspiration and work, repeated many times over. Having a complete list is a good helper, but a full backup is of course best.

      • Kurotora@lemmy.world · +12 · 2 hours ago

        In my case, for Linux ISOs, I only need to log in to my usual private trackers and re-download my leeched torrents. For more niche content, like old-school TV shows in my local language, I would rely on the community. For even more niche content, like tankoubons that were only available at the time on DD services, I have a specific job for it, but it relies on the same backup provider that I'm using for personal data.

        Also, it's important to remind everyone: you must encrypt your backup no matter where you store it.
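        Encrypting before upload can be as simple as piping a tarball through a symmetric cipher. A minimal sketch with openssl; the paths and passphrase file below are placeholders, not anything from this thread, and dedicated tools like restic or borg do this (plus deduplication) out of the box:

        ```shell
        # Encrypt a backup archive client-side so the storage provider
        # only ever holds ciphertext. BACKUP_SRC and PASS_FILE are
        # example paths; keep the passphrase somewhere outside the backup.
        BACKUP_SRC="${BACKUP_SRC:-/mnt/user/personal}"
        PASS_FILE="${PASS_FILE:-/root/.backup-passphrase}"

        tar -czf - "$BACKUP_SRC" | \
          openssl enc -aes-256-cbc -pbkdf2 -salt \
            -pass "file:$PASS_FILE" -out backup.tar.gz.enc
        ```

        Decryption is the same openssl command with -d, piped back into tar for extraction.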

      • i_stole_ur_taco@lemmy.ca · +2 · 2 hours ago

        Set up a job to write the file names of everything in your file system to a text file and make sure that text file gets backed up. I did that on my Unraid server for years in lieu of fully backing up the whole array.
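        A minimal version of that job might look like the sketch below; the share path and output location are example values, not the commenter's actual setup:

        ```shell
        # Dump a dated file listing of the whole array so that, after a
        # total loss, you at least know what to re-acquire.
        # ARRAY_ROOT and LIST_DIR are example paths; adjust to your shares.
        ARRAY_ROOT="${ARRAY_ROOT:-/mnt/user}"
        LIST_DIR="${LIST_DIR:-/mnt/user/backups/filelists}"

        mkdir -p "$LIST_DIR"
        # One size + path per line, so re-downloads can be sanity-checked too.
        find "$ARRAY_ROOT" -type f -printf '%s\t%p\n' \
          > "$LIST_DIR/filelist_$(date +%F).txt"
        ```

        Run it from cron (or Unraid's User Scripts plugin) and make sure the listing directory is inside the portion that actually gets backed up.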

      • ShortN0te@lemmy.ml · +1 · 2 hours ago

        That should be part of the backup configuration: you select in the backup tool of your choice what gets backed up. When you lose your array, you just download that stuff again.

    • BakedCatboy@lemmy.ml · +1 · 59 minutes ago

      Same here, ~30 TB currently, but my personal-artifacts portion is only about 2 TB, which is very affordable with rsync.net. It conveniently has an alert setting for when less than X KB has changed in Y days. (I have my Synology set up to spit out daily security reports to meet that threshold, so even if I don't change anything myself I won't get bugged.)
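      Pushing just the irreplaceable portion offsite over SSH can be a one-liner; the host, user, and paths below are placeholders for an rsync.net-style account, not the commenter's actual setup:

      ```shell
      # Mirror only the small personal portion to an offsite SSH host.
      # -a preserves permissions/timestamps, -z compresses in transit,
      # --delete keeps the remote copy an exact mirror of the source.
      rsync -az --delete \
        /mnt/user/personal/ \
        you@backuphost.example.net:personal/
      ```

      Note the trailing slash on the source: it syncs the directory's contents rather than nesting the directory itself on the remote side.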

    • givesomefucks@lemmy.world · +3 · 2 hours ago

      I only care about the 4tb of personal data and I push that to a cloud backup

      I have doubles of the data. Some of 'em. That way I know I have a pristine one in backup. Then I can use it, it gets corrupted, I don’t care.

      Actually, I have triples of the W2s. I have triples, right? If I don’t, the other stuff’s not true.

      See, the W2s the one I have triples of. Oh, no, actually, I also have triples of the kids photos, too. But just those two. And your dad and I are the same age, and I’m rich and I have triples of the W2s and the kids photos.

      Triples makes it safe.

      Triples is best.

      https://www.youtube.com/watch?v=8Inf1Yz_fgk

    • hendrik@palaver.p3x.de · +3 · 2 hours ago · edited

      I follow a similar strategy. I back up my important stuff. And I’m gonna have to re-rip my DVD collection and redownload the Linux ISOs in the unlikely case the RAID falls apart. That massively cuts down on the amount of storage needed.