Rsync Permission denied backing up a remote directory to the local machine

file-permissions rsync

I'm getting the error mentioned in the title.

I found this similar question: Run rsync with root permission on remote machine.
That doesn't answer my question.

I'm the admin on the remote server and I want to use rsync to back up files to my local box. Here's my rsync command:

$ rsync -avz me@myserver.com:/var/www/ /backups/Sites/MySite/

It mostly works. Login is via an SSH key pair; I can't use a password to log in via SSH. Just a few files won't transfer due to permissions, and I don't want to change those permissions.

Here's the error:

receiving file list ... done
rsync: send_files failed to open "/var/www/webapp/securestuff/install.php": Permission denied (13)

I do not want to change the permissions on that file. It (and others like it) should not be readable (except by root).

This has to run in a cron job, and I'd prefer a simple one-line solution using only the rsync command. The next choice would be a shell script I can call from the cron job. In no case can I manually log into the remote machine and become root (because I'll be sleeping when this runs).

How can I use rsync to back it up to my local box?

Best Answer

You cannot back up a file which you cannot read, so the permissions will have to be either changed or overridden by root.

Your options in more detail:

  • Override the permissions by rsync'ing as root@myserver.com directly.

  • ...or by configuring sudo on the server to allow password-less running of the rsync server-side component.

    me    ALL=(root) NOPASSWD: /usr/bin/rsync --server --sender -vlogDtprze.iLsf . /var/www/
    

    and

    rsync --rsync-path="sudo rsync" -avz me@myserver.com:/var/www/ /backups/...
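The exact `--server` flag string in the sudoers rule varies with the rsync version and the options you pass, so it has to match what your client actually sends. One way to capture it (an illustrative sketch, not part of the original answer; the `/tmp/probe/` destination is just a scratch directory) is to run the transfer once with a verbose SSH transport:

```shell
# ssh -v prints the remote command it is about to execute, e.g.
#   debug1: Sending command: rsync --server --sender -vlogDtprze.iLsf . /var/www/
# Copy that command verbatim into the sudoers entry.
rsync -avz -e "ssh -v" me@myserver.com:/var/www/ /tmp/probe/ 2>&1 \
  | grep 'Sending command'
```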
    
  • Create a dedicated "website-backup" account on the server. Change the files' permissions to make them readable by the "website-backup" account; you may use ACLs and setfacl for that. Do not use this account for anything else.

    rsync -avz website-backup@myserver.com:/var/www/ /backups/sites/mysite/
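Granting read access via ACLs leaves the regular owner/group/other permission bits untouched, so the files stay unreadable to everyone else. A sketch of the server-side setup, assuming the filesystem supports POSIX ACLs and the account is named website-backup:

```shell
# Run as root on the server. Capital X grants execute (directory
# search) only where something already has execute permission,
# i.e. it won't make plain files executable.
setfacl -R -m u:website-backup:rX /var/www/

# Default ACLs apply only to directories; set them so files created
# later inherit the read grant:
find /var/www/ -type d -exec setfacl -d -m u:website-backup:rX {} +

# Verify on one of the previously unreadable files:
getfacl /var/www/webapp/securestuff/install.php
```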
    
  • Write a script on the server which would dump /var/www/ into an encrypted tarball. Again, this can be done as root (via crontab) or by configuring sudo to not require a password for that script. For example:

    #!/bin/sh
    # /usr/sbin/dump-website: stream an encrypted tarball of the site to stdout
    tar c /var/www/ | gpg -e -r mountainx@example.com
    

    Backup would be done by pulling the entire tarball every time, which might be inefficient with large sites:

    ssh me@myserver.com "sudo /usr/sbin/dump-website" > /backups/sites/mysite.tar.gpg
    

    The password requirement would be removed by editing sudoers:

    me     ALL=(root) NOPASSWD: /usr/sbin/dump-website
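To run the pull nightly from the local box, as the question requires, the ssh+sudo one-liner can go straight into the local user's crontab. A sketch (the 03:00 schedule is illustrative):

```shell
# Local crontab entry (edit with `crontab -e`): pull the encrypted
# tarball from the server at 03:00 every night.
0 3 * * * ssh me@myserver.com "sudo /usr/sbin/dump-website" > /backups/sites/mysite.tar.gpg
```

Note that cron runs without an SSH agent by default, so the key used for the login must be usable non-interactively (which matches the passphrase-free key setup described in the question).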
    