Simple Linux backup just using rclone

28 October 2024 · By nonsavant

In my post Linux backup using rclone and restic running in Docker containers I described my method for backing up files on a Linux host using rclone and restic.

However, I decided that for some use cases, the full functionality of restic is overkill, as I’m really looking for a cloud-based archive for my files.

Also, while restic lets you define a maximum backup size along with retention and pruning rules, my use case is that I want to retain a file even after it has been deleted from the source. Restic's pruning rules would eventually remove such files.

Therefore, I investigated just using rclone on its own, and this is the result:

rclone container in docker-compose

The main difference from my previous setup is moving away from having rclone serve a restic-compatible API.

That means moving from rclone serve ... to rclone sync ....
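To make the difference concrete, here is a minimal sketch of the two commands (the remote and folder names are placeholders, and the serve flags shown are purely illustrative, not my exact previous configuration):

# Previous approach: expose a restic-compatible REST API for restic to push to
rclone serve restic remote:backup-folder --addr :8080

# New approach: sync the source directory straight to the remote
rclone sync /data remote:/backup-folder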

However, I also wanted the backup to run on a regular basis, so I needed some sort of cron job.

Thanks mostly to this post, I discovered that it’s possible to

  • change the container entrypoint to /bin/sh
  • provide a shell command as the container command

This allowed me to create a cron job inside the container which executes the desired rclone command according to the cron schedule.

docker-compose changes:

entrypoint: '/bin/sh'
command: ["-c", "(echo -e \"SHELL=/bin/sh\n0 0 4 * * * if ! pidof rclone; then rclone sync -v --retries 1 --max-transfer 5G --cutoff-mode=soft --exclude-from exclude.txt --backup-dir remote:/archive-folder /data remote:/backup-folder > /dev/stdout; fi\" | crontab -) && crond -f"]

There’s a lot going on in there!

  • entrypoint is set to rclone by default, but it can be overridden in the docker-compose
  • The command has to be provided as an array [ '...', '...', ] as the quotes were causing problems otherwise
  • The shell command does the following:
    • Pipes a crontab entry into crontab (echo -e ... | crontab -) and then starts the cron daemon in the foreground (crond -f)
    • In this case it also adds a shell definition to the crontab (SHELL=/bin/sh), as the job didn’t seem to run otherwise
  • The crontab entry
    • Checks to see if rclone is currently running (pidof rclone) and if not:
    • executes the rclone sync command, redirecting its output to stdout so that it is available via docker logs
  • The rclone sync flags (the full command is also written out as a script below)
    • -v so that rclone outputs some interesting info about the transfer
    • --retries 1 to stop retries
    • --max-transfer 5G to limit any one session to 5 GB
      • --cutoff-mode=soft so that when max-transfer is reached, rclone completes any transfers already started
    • --exclude-from points to a text file listing some standard Mac hidden/temporary files that shouldn’t be transferred
    • --backup-dir defines a remote folder where remote files are moved if they are no longer found in the source (thus fulfilling one of my main requirements!)
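For readability, here is the crontab entry’s command written out as a standalone script, using the same flags, paths and remote names as the compose snippet above:

#!/bin/sh
# Skip this run if a previous rclone process is still going
if ! pidof rclone; then
  rclone sync -v \
    --retries 1 \
    --max-transfer 5G \
    --cutoff-mode=soft \
    --exclude-from exclude.txt \
    --backup-dir remote:/archive-folder \
    /data remote:/backup-folder
fi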

Encryption

I still wanted the files to be encrypted on the remote storage. This is really easy to set up with rclone and this page helped a little with the process.

Step 1: Make sure the remote storage is configured (in my case OneDrive personal)
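A quick way to sanity-check that the underlying remote is working (the remote name here is just an example) is to list its top-level directories:

rclone lsd remote: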

Step 2: Run rclone config again and create a new remote with storage type crypt

Step 3: Define the remote to be encrypted (remote:folder)

Step 4: Choose to encrypt files and folders and provide encryption and salt passwords… that’s it!
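For reference, here is a sketch of how the same crypt remote could be created non-interactively and then used. The remote name, folder and password values are placeholders, and on older rclone versions the parameters may need to be passed as space-separated key value pairs rather than key=value:

# Create a crypt remote called "encrypted" that wraps remote:backup-folder;
# passwords are obscured before being written to rclone.conf
rclone config create encrypted crypt \
    remote=remote:backup-folder \
    password=$(rclone obscure "my-encryption-password") \
    password2=$(rclone obscure "my-salt-password")

# Point the sync at the crypt remote and files are encrypted before upload;
# listings through the crypt remote are decrypted transparently
rclone sync /data encrypted:
rclone ls encrypted: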