My goal is to fully ditch Google Photos for Immich. I have about 3 TB of photos and videos. I'm looking for a super simple way of backing up the library to cloud storage in case of a drive failure, without spending a ton.
Ideally, this will require nothing on my part besides copying files into a given folder, and the storage will be encrypted with basic privacy assurances.
Also, if it matters, my home server is running Debian. I'd prefer something that runs in Docker so I can check on it remotely more easily.
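One low-effort option that fits those requirements is rclone with a "crypt" remote: files get encrypted client-side before upload, and the sync can run from cron (or the official rclone/rclone Docker image). A dry-run sketch; the remote names ("gdrive", "immich-crypt") and the library path (/srv/immich) are assumptions you'd swap for your own:

```shell
#!/bin/sh
# Dry-run sketch of a client-side-encrypted cloud backup with rclone.
# As written, commands are only printed; set DRY_RUN="" to execute for real.
# Assumptions: a cloud remote named "gdrive" already exists in rclone's config,
# and the Immich library lives under /srv/immich.
DRY_RUN=echo

# One-time setup: wrap the cloud remote in a "crypt" remote so file contents
# and names are encrypted before they ever leave the server.
$DRY_RUN rclone config create immich-crypt crypt \
    remote=gdrive:immich-backup password=change-me --obscure

# Recurring job (stick it in cron): one-way sync of the library to the
# encrypted remote.
$DRY_RUN rclone sync /srv/immich immich-crypt: --transfers 4
```

With a crypt remote, even the provider only sees encrypted blobs and scrambled filenames, which covers the "basic privacy assurances" part.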


If you’re already running ZFS, sanoid would be an option.
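For context, sanoid itself only takes and prunes snapshots on a schedule; the policy lives in /etc/sanoid/sanoid.conf (syncoid handles shipping them elsewhere). A minimal sketch, where the dataset name and retention counts are assumptions to adjust:

```ini
# Minimal /etc/sanoid/sanoid.conf sketch -- dataset name and retention
# numbers are assumptions, adjust to taste.
[Orico2tera4/data/immich]
        use_template = production
        recursive = yes

[template_production]
        hourly = 24
        daily = 30
        monthly = 3
        autosnap = yes
        autoprune = yes
```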
Okay, how do you get sanoid and syncoid to run? I've tried, and I'm just too much of a dummy. When it makes a backup, is it literally making a ZFS dataset/pool/whatever on the other machine, or is it more like a file? I have a Proxmox box running Cockpit (SMB & NFS), and the machine is connected to a USB drive bay that has ZFS. My Immich is saving pictures to my ZFS drive bay via SMB.
I’ve tried to do
syncoid pool_name/data/immich root@cockpit.service.IP.addr:mnt/samba/backups

but I get hit with a long-ass error message:

WARNING: ZFS resume feature not available on target machine - sync will continue without resume support.
INFO: Sending oldest full snapshot Orico2tera4/data/immich@syncoid_nova_2026-01-27:13:38:44-GMT-05:00 to new target filesystem root@192.168.0.246:/mnt/samba/backups (~ 42 KB):
/dev/zfs and /proc/self/mounts are required.
Try running 'udevadm trigger' and 'mount -t proc proc /proc' as root.
44.2KiB 0:00:00 [ 694KiB/s] [===========================================] 103%
CRITICAL ERROR: zfs send 'Orico2tera4/data/immich'@'syncoid_nova_2026-01-27:13:38:44-GMT-05:00' | pv -p -t -e -r -b -s 43632 | lzop | mbuffer -q -s 128k -m 16M | ssh -S /tmp/syncoid-root1921680246-1772385641-845218-1784 root@192.168.0.246 ' mbuffer -q -s 128k -m 16M | lzop -dfc | zfs receive -F '"'"'/mnt/samba/backups'"'"' 2>&1' failed: 256

I've tried reading the GitHub docs and some forums, but I'm a dummy. I just want to have backups that I can encrypt and keep in the cloud for cheap somewhere. Does it literally have to be two different machines (god I'm dumb)? Can I just auto-run ZFS snapshots, encrypt them, and save those to Drive/OneDrive/whoever?
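One likely culprit in that error: the zfs receive at the end was handed /mnt/samba/backups, which is a mount-point path, and zfs receive wants a dataset name (pool/dataset). A dry-run sketch of what a corrected call could look like, where "backuppool" is a hypothetical pool name on the receiving box (check zfs list there for the real one):

```shell
#!/bin/sh
# Dry-run sketch (set DRY_RUN="" to execute for real). syncoid replicates
# dataset -> dataset, so the target must be a ZFS dataset name like
# pool/dataset, not a directory path. "backuppool" is a hypothetical pool
# name on the receiving machine.
DRY_RUN=echo

# One-time, on the target: create a dataset to receive into.
$DRY_RUN ssh root@192.168.0.246 zfs create backuppool/immich-backup

# Then replicate into that dataset, not into a /mnt path:
$DRY_RUN syncoid Orico2tera4/data/immich root@192.168.0.246:backuppool/immich-backup
```

The "/dev/zfs and /proc/self/mounts are required" line also suggests the receiving side can't talk to ZFS at all (e.g. an unprivileged container), so the target account needs to be somewhere that can actually run zfs commands.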
You can do a syncoid sync to another zpool or dataset on the same machine or on a remote host; they behave the same. It's replicating that dataset on the other machine, then sending the snapshots taken after that point over via zfs send. You can instruct sanoid to prune those snapshots after the send and start new ones for the next send, or just let them accumulate so you have points in time to revert to.

IIRC, you can send a ZFS snapshot to a file, but I can't recall how to do that, so AFAIK you can't just send it to a file-based service like OneDrive. You can use a service like zfs.rent: send them a hard drive with your base sync on it (encrypt it), and once they've brought it online, you can sync to that. Best to test out your method with the drive hooked up locally first.
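On the send-to-a-file question: zfs send writes a plain byte stream to stdout, so it can be redirected to a file, encrypted, and uploaded to any file-based storage. A dry-run sketch; the snapshot name, output path, and "gdrive" rclone remote are assumptions:

```shell
#!/bin/sh
# Dry-run sketch (set DRY_RUN="" to execute for real). `zfs send` emits a
# plain byte stream on stdout, so: stream -> gpg encryption -> one file
# you can upload anywhere. Snapshot and remote names are assumptions.
DRY_RUN=echo

$DRY_RUN zfs snapshot Orico2tera4/data/immich@offsite-2026-01-27

# Full stream, symmetrically encrypted into a single uploadable file:
$DRY_RUN sh -c 'zfs send Orico2tera4/data/immich@offsite-2026-01-27 | gpg --symmetric --cipher-algo AES256 -o /tmp/immich-offsite.zfs.gpg'

$DRY_RUN rclone copy /tmp/immich-offsite.zfs.gpg gdrive:zfs-backups/

# Restore path: gpg -d immich-offsite.zfs.gpg | zfs receive pool/immich-restored
```

Caveat: a full send of a 3 TB library is one giant opaque file you can't browse or partially restore, and each follow-up would need an incremental stream (zfs send -i old_snap new_snap) against a snapshot you've kept locally.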
I know it's anathema to Lemmy, but the best help you'll get is from Claude, where you can paste the errors in and have it sort things out as you troubleshoot. It's pretty good at shit like that.