
Duplicating a data set offsite


I have an office full of clients and server clients backing up to a series of data sets on a 1TB hard drive. I want to mirror those data sets to an FTP volume at another office. What is the best way to do that so as to avoid having to recopy the several-gigabyte data sets every time? Ideally, I'd like to copy each data set to the other site the first time via sneakernet and then just have Retrospect sync the two copies of the data set (local and offsite).


Thanks in advance for any help that you can provide!


There is no way of which I am aware unless you run two backup scripts each day, one to the local set and one to the remote set. And be aware that the two sets won't necessarily be identical: files may change in the interim, a different set of clients may be up during the second backup script, and so on. It's also difficult (read: impossible) to get clients to shut down using the Dantz/EMC/Insignia client shutdown script unless you modify it heavily so it shuts down after the second backup. It's a difficult problem to solve, and not one that Retrospect was designed to handle.


You might be able to cobble something up using the "Transfer" operation, to transfer from one backup set to the other (See the Retrospect Mac 6.x Users Guide, page 60), but that's problematic because Transfer is not incremental, and you have to craft a selector to do what you want.


You might try to turn in a feature request for some future version of Retrospect. Good luck.





I have an office full of clients and server clients backing up to a series of data sets on a 1TB hard drive. I want to mirror those data sets to an FTP volume at another office.



Since "data set" is not a term used in Retrospect, it's not precisely clear what you are doing, although the most probable scenario is that you have multiple File Backup Sets being stored on a hard drive, and you want to mirror that data to another place for safety.


Since File Backup Sets are large single files, the only way to make copies of them is to, well, make copies of them. That means recopying the entire large file for any small change made during a backup.


One way you _might_ be able to get the offsite redundancy you want is if you used an Internet Backup Set on your local physical network as your primary backup, and then use some unix tools to make redundant copies of the individual data files that such Backup Sets create (after tweaking Retrospect's resources to make the size of those sets appropriate for your needs). Since Internet Backup Sets create new, sequentially numbered files as they grow, you would only have to copy over the newest files for each "incremental" offsite backup.
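The incremental copy described above could be sketched in shell roughly like this. Everything here is an assumption for illustration: the `foo.data###` naming, the paths, and the idea that the FTP volume is mounted locally. The demo uses scratch directories so it runs anywhere; in practice `SRC` would be the local backup folder and `DST` the mounted offsite volume.

```shell
# Hypothetical sketch: since Internet Backup Sets grow by adding new,
# sequentially numbered member files, a plain name comparison finds the
# increment to copy offsite.
SRC=$(mktemp -d)   # stands in for the local backup folder
DST=$(mktemp -d)   # stands in for the mounted FTP volume

# Pretend the set already has two members, one of which is offsite.
printf 'aaa' > "$SRC/foo.data100"
printf 'bbb' > "$SRC/foo.data101"
cp -p "$SRC/foo.data100" "$DST/"

# Incremental pass: copy every member missing from the destination.
for f in "$SRC"/foo.data*; do
  base=$(basename "$f")
  [ -e "$DST/$base" ] || cp -p "$f" "$DST/"
done
```

After the pass, only `foo.data101` had to travel, however large the rest of the set is.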


There might be issues with the final file in each session, however. The last file (say, foo.data100) might be only half the size it's destined to grow to when that backup ends. The next execution would see Retrospect append to that same file again before creating foo.data101, so any script you created would have to deal with that somehow (shell scripts can do anything; shame that I have no idea how!). Something like: always re-copy the last-numbered old file in addition to all the new files.
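The "always re-copy the last old file" idea above might look something like this. Again, the file names and behavior are assumptions; the sketch also assumes equal-width numbering so a lexicographic sort finds the highest-numbered member (foo.data99 vs. foo.data100 would sort wrong and need smarter handling).

```shell
# Hypothetical sketch: the highest-numbered member offsite may be a stale,
# partial copy of a file Retrospect is still appending to, so refresh it
# unconditionally every run, then copy anything newer.
SRC=$(mktemp -d); DST=$(mktemp -d)

printf 'half'      > "$DST/foo.data100"   # stale, partially copied member
printf 'half+rest' > "$SRC/foo.data100"   # Retrospect appended to it since
printf 'new'       > "$SRC/foo.data101"   # created by the latest execution

# Re-copy the highest-numbered member already offsite...
last=$(ls "$DST" | sort | tail -n 1)
cp -p "$SRC/$last" "$DST/"

# ...then copy every member not yet offsite.
for f in "$SRC"/foo.data*; do
  base=$(basename "$f")
  [ -e "$DST/$base" ] || cp -p "$f" "$DST/"
done
```

This wastes one file's worth of transfer per run, but guarantees the possibly-half-written member is always brought up to date.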


Any such configuration attempts would have to include some robust testing to be sure you could actually restore from your offsite mirror.



