Duplicate function and volumes

dheiert:

I need to be able to run concurrent Duplicate jobs to a large RAID array, moving SQL backups from the servers to the storage server. I can't figure out how to configure the destination volumes so that the duplicate jobs aren't all waiting on the same Windows drive.

 

Any help on this? Is it possible to define a folder as a volume and not a subvolume?


Lennart:

I strongly recommend running "backups" instead of "Duplicates". With duplicates you lose all history: a file deleted from the original will also be deleted from the duplicate.

 

Create two "Disk Backup Sets" on the raid volume. Set up "Grooming" to groom older than (say) 10.

Create two backup scripts, backing up to each backup set.

Create two grooming scripts, that each weekend grooms out the oldest backups, keeping the last 10 backups.

 

That way, you have at least a few generations of your files.
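
If it helps to see the idea outside Retrospect, here is a rough Python sketch of the same keep-the-last-10 retention rule. The folder path, the ".bak" extension, and the groom() helper are placeholders of mine, not anything Retrospect provides; Retrospect's real grooming works inside the backup set, not on loose files.

    from pathlib import Path

    def groom(backup_dir: str, keep: int = 10) -> None:
        """Delete all but the newest `keep` backup files in backup_dir.

        Files are ranked by modification time, not by name, so this
        works even when every backup file has a different name.
        """
        files = sorted(
            Path(backup_dir).glob("*.bak"),   # assumed SQL backup extension
            key=lambda p: p.stat().st_mtime,  # oldest first
        )
        for old in files[:-keep]:             # all but the newest `keep`
            old.unlink()
            print(f"groomed out {old.name}")

    # Run once a week, e.g. from Windows Task Scheduler (path is made up):
    groom(r"D:\Backups\SetA", keep=10)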


dheiert:

I can't use Duplicate anyway. There's no way to access the volume concurrently that way. I'll have to use Archive.

 

The point of the exercise is to get the files off the servers daily and onto the storage machine. Then they will be sent to tape weekly with a Recycle backup.

 

Grooming won't help because the file name is always different.


Lennart:

> I can't use Duplicate anyway. There's no way to access the volume concurrently that way. I'll have to use Archive.

Right: you should create an "Archive" script.

> The point of the exercise is to get the files off the servers daily and onto the storage machine. Then they will be sent to tape weekly with a Recycle backup.

OK, understood.

> Grooming won't help because the file name is always different.

Grooming doesn't go by file name, so that doesn't matter. The backup set will eventually fill the disk, so you have to Groom or Recycle. With grooming, you keep the latest backups; with Recycle you wipe them all out, so if your tapes are ever lost you have nothing at all. Grooming is the safer choice.

 


I presume he uses the backup feature of SQL Server itself and wants to use Retrospect to back up the resulting files?

 

In that case it's probably also convenient to make a selector (filter) for, let's say, files created in the last 24 hours, so that only the latest SQL backup is copied (a sketch of that rule follows below). If you do that every day, you can easily use the grooming setup Lennart mentioned.
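
Roughly, that selection rule looks like this as a Python sketch; in Retrospect you would build it as a selector in the UI, and the UNC path and ".bak" extension below are just assumptions of mine:

    import time
    from pathlib import Path

    def recent_backups(backup_dir: str, max_age_hours: float = 24):
        """Yield backup files written within the last max_age_hours."""
        cutoff = time.time() - max_age_hours * 3600
        for p in Path(backup_dir).glob("*.bak"):  # assumed SQL backup extension
            # A fresh SQL dump is written once and never touched again,
            # so its modification time is effectively its creation time.
            if p.stat().st_mtime >= cutoff:
                yield p

    # Hypothetical path to the server's backup share:
    for f in recent_backups(r"\\sqlserver\Backups"):
        print(f)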

