What’s the best way to back up DB files over the network on Linux and Solaris machines?


I’m wondering about the best way to back up files over the network.

I have Solaris machines running Oracle 10g in a Veritas cluster. The machines are connected to EMC storage.

The /data/oracle directory is mounted on the EMC storage.

What I want is to back up the /data/oracle directory (70 GB) to a backup machine over the network, including soft-linked files. The reliability of the copy is very important.

I looked around and found a few ways to do this.

For example, the first option is to use rsync:

rsync -WavH --progress /data/oracle $backup_server_ip:/Backup_dir

The second option is transferring the files with tar and ssh, dealing with compression on both sides of the pipe:

cd /directory_that_we_want_to_backup
tar cpf - . | bzip2 -c | \
  ssh  $backup_server_ip  "cd /Backup_dir && bzip2 -d | tar xpf -"
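Since reliability of the copy is the stated priority, whichever transfer method is used it is worth verifying checksums afterwards. A minimal sketch, assuming GNU md5sum is available on both sides (stock Solaris may only have `digest -a md5`); it is shown against local scratch directories, with the verification step standing in for what would run on the backup server over ssh:

```shell
# Scratch directories stand in for /data/oracle and the remote Backup_dir.
SRC=$(mktemp -d); DST=$(mktemp -d); MANIFEST=$(mktemp)
echo "datafile contents" > "$SRC/users01.dbf"
cp "$SRC/users01.dbf" "$DST/users01.dbf"   # stand-in for the rsync/tar transfer

# Build a checksum manifest on the source side...
( cd "$SRC" && find . -type f -exec md5sum {} + ) > "$MANIFEST"

# ...and verify it against the copy; on a real setup this second step
# would run on the backup server, with the manifest shipped over ssh.
( cd "$DST" && md5sum -c "$MANIFEST" )
```

A non-zero exit status from `md5sum -c` tells you the copy does not match the source, which is exactly the failure you want to catch before trusting the backup.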

and so on.

I need advice on which of the options above is more reliable, and perhaps on what other good options there are for this.

Best Answer

Well, theoretically you can put the tablespaces into backup mode with `ALTER TABLESPACE ... BEGIN BACKUP` before you run rsync and take them out again with `... END BACKUP` afterwards. You will also need a backup copy of the control file. This approach was used for many years before RMAN was introduced, and the restore really requires Oracle experience. It is described, for example, in the book "Unix Backup and Recovery".
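The sequence can be sketched as follows; the tablespace name `users` and the control file path are illustrative placeholders, and in a real database you would repeat the BEGIN/END pair for every tablespace being copied:

```sql
-- Put the tablespace into backup mode before copying its datafiles.
ALTER TABLESPACE users BEGIN BACKUP;

-- (run the rsync of /data/oracle from the shell at this point)

-- Take the tablespace out of backup mode afterwards.
ALTER TABLESPACE users END BACKUP;

-- Also capture a usable copy of the control file.
ALTER DATABASE BACKUP CONTROLFILE TO '/Backup_dir/control.bkp';
```

While a tablespace is in backup mode Oracle keeps the copied datafiles recoverable, but it also generates extra redo, so you don't want to leave it in that state longer than the copy takes.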

I really recommend you get familiar with RMAN. Even if you do not use a backup agent like TSM/NetBackup, you can still create local disk backups using RMAN and then synchronize that storage with rsync.
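A minimal sketch of that two-step approach; the staging path `/backup_staging` is an assumption, and this is a command fragment that needs a running Oracle instance, not something to run as-is:

```shell
# Step 1: RMAN writes a consistent backup (plus control file) to local disk.
rman target / <<'EOF'
CONFIGURE CONTROLFILE AUTOBACKUP ON;
BACKUP DATABASE FORMAT '/backup_staging/%U';
EOF

# Step 2: ship the staging area to the backup server with rsync.
rsync -avH /backup_staging/ $backup_server_ip:/Backup_dir/
```

The advantage over copying /data/oracle directly is that RMAN produces backups that are guaranteed consistent and that it can validate, so the rsync step only has to move ordinary files.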