Hi all, I have my home lab set up as a single git repo. All of the infrastructure lives there as OpenTofu / Ansible configs, and I'm using git-crypt to protect the secret files (tofu state, Ansible secret values, etc.).
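
For context, the git-crypt side is roughly this (the patterns below are just examples of how I lay things out):

    # one-time setup in the repo
    git-crypt init
    git-crypt add-gpg-user me@example.com   # key that can unlock it elsewhere

    # .gitattributes — anything matching these patterns gets encrypted
    *.tfstate              filter=git-crypt diff=git-crypt
    *.tfstate.backup       filter=git-crypt diff=git-crypt
    ansible/secrets/**     filter=git-crypt diff=git-crypt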

How would you back up such a system? Keeping it on my self-hosted git creates a circular dependency. I'm hesitant to use a private Codeberg repo in case I leak secrets. Just wondering what the rest of you are doing.

  • r0ertel@lemmy.world · 9 points · 8 days ago

    As others have said, a traditional off-site backup will work. How do you plan to perform a restore, though? If you need the self-hosted source repo, it won't be available until the infrastructure is stood up, creating another circular dependency.

    I'm still in the early stages of exploring this, too. My solution is to keep a local filesystem git clone of the "main" repo and drive it with a Taskfile that builds a Docker image, then run the Ansible infrastructure build from inside that image. It's somewhat manual, but I have performed a full rebuild a few times after some Big Mistakes.
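
    Roughly, the flow is something like this (image name and playbook paths are placeholders):

        # clone from the local filesystem copy, no forge needed
        git clone /mnt/backup/homelab.git homelab && cd homelab

        # build a container with ansible and the needed collections baked in
        docker build -t homelab-provisioner .

        # run the playbook from inside that image against the inventory
        docker run --rm -v "$PWD:/work" -w /work \
          homelab-provisioner ansible-playbook -i inventory/ site.yml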

    • ch8zer@lemmy.caOP · 4 points · 8 days ago

      You pretty much got it. I need a quick way to restore the repo, and ideally have git do a self-backup. Seems like a cheap VPS may be the way to go.
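
      Something like a bare repo on the VPS that I push --mirror to on a timer would probably cover it (hostname is made up):

          # on the VPS: an empty bare repo to receive the mirror
          ssh backup-vps 'git init --bare ~/homelab.git'

          # locally: add it as a second remote and mirror everything
          git remote add backup backup-vps:homelab.git
          git push --mirror backup

      Since git-crypt encrypts the blobs themselves, the mirrored copy never holds plaintext secrets.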

      • r0ertel@lemmy.world · 2 points · 8 days ago

        For my own curiosity, how do you perform a build? Is it all done in pipelines, kicked off on change? Do you execute the whole infra build each time you release an update?

        • ch8zer@lemmy.caOP · 3 points · 8 days ago

          Honestly, I just run it from the CLI myself.

          I've wasted so much time fighting with CI and automation that when I migrated to Forgejo I didn't bother to set it up again.
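
          A full run is basically just (file names approximate):

              git-crypt unlock                 # decrypt state + secrets
              tofu plan && tofu apply          # infrastructure layer
              ansible-playbook site.yml        # config layer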

          • r0ertel@lemmy.world · 2 points · 7 days ago

            Same. I have spent way more time troubleshooting a pipeline than it saves. I like the idea of automation but laziness prevails.

  • oranki@sopuli.xyz · 4 points · 8 days ago

    Borgbackup in addition to git. Since there’s probably not much data, any cheap VPS could act as storage.
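
    Rough idea, assuming an SSH-reachable VPS with borg installed (hostname and paths are made up):

        # one-time: create an encrypted borg repo on the VPS
        borg init --encryption=repokey ssh://backup@vps.example/./homelab-borg

        # nightly: archive the repo directory, then prune old archives
        borg create ssh://backup@vps.example/./homelab-borg::'{hostname}-{now}' ~/homelab
        borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
          ssh://backup@vps.example/./homelab-borg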

  • Possibly linux@lemmy.zip · +5/−1 · edited 7 days ago

    I would set aside a dedicated device that acts as a sort of "provisioner" and admin node. It can be something like a Raspberry Pi or a desktop computer.

    From a backup perspective I would evaluate risk vs. cost/effort. If you lost your home, would it really matter that you lost some config files?

  • Grimm665@lemm.ee · 3 points · 7 days ago

    I would configure a Backblaze B2 bucket and copy your repos and configs there; it should be dirt cheap compared to a VPS and very durable.
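
    With something like rclone it's a one-liner once the remote is configured (remote and bucket names are placeholders):

        # one-time: interactively set up a "b2" remote with your key ID + application key
        rclone config

        # push the repo (and anything else) into the bucket
        rclone sync ~/homelab b2:my-homelab-backups/homelab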

  • HelloRoot@lemy.lol · 2 points · 8 days ago

    What I did is set up a NAS at my parents' house, which I can log into for near-zero-cost offsite backups.

    And at home I have a couple of local drives with borgbackups.

    • ch8zer@lemmy.caOP · 2 points · edited 8 days ago

      My parents have a NAS! Maybe I'll set up Tailscale and send backups over there…

      Although they live 3 streets away from me, so I worry it's not remote enough in case of a flood, etc.
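
      If I do go that route, it'd probably just be rsync over the tailnet (hostname and path are made up):

          # once tailscale is up on both ends, the NAS is just another hostname
          rsync -az --delete ~/homelab/ backup@parents-nas:/volume1/backups/homelab/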