
Duplicate function and volumes


dheiert


I need to be able to run concurrent Duplicate jobs to a large RAID array, moving SQL backups from several servers to the storage server. I cannot figure out how to configure the destination volumes so that the duplicate jobs are not all waiting on the Windows drive.

 

Any help on this? Is it possible to define a folder as a volume and not a subvolume?
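Just to illustrate what I'm after (this is not Retrospect, only a rough sketch of the goal, and all the paths are made-up examples): one copy job per server, each writing to its own folder on the array, all running at the same time instead of queuing behind one destination volume.

    # Hypothetical sketch only: copy each server's SQL backup folder to its own
    # destination folder on the RAID array, one worker per source, in parallel.
    import shutil
    from concurrent.futures import ThreadPoolExecutor
    from pathlib import Path

    SOURCES = {
        r"\\sqlserver1\Backups": r"E:\SQLCopies\sqlserver1",
        r"\\sqlserver2\Backups": r"E:\SQLCopies\sqlserver2",
    }

    def copy_backups(src: str, dst: str) -> None:
        # Copy every .bak file, creating the destination folder if needed.
        Path(dst).mkdir(parents=True, exist_ok=True)
        for f in Path(src).glob("*.bak"):
            shutil.copy2(f, Path(dst) / f.name)

    with ThreadPoolExecutor(max_workers=len(SOURCES)) as pool:
        for src, dst in SOURCES.items():
            pool.submit(copy_backups, src, dst)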


I strongly recommend running "backups" instead of "Duplicates". With duplicates you lose all history: a file deleted from the original will also be deleted from the duplicate.

 

Create two "Disk Backup Sets" on the raid volume. Set up "Grooming" to groom older than (say) 10.

Create two backup scripts, backing up to each backup set.

Create two grooming scripts that run each weekend and groom out the oldest backups, keeping the last 10.

 

That way, you have at least a few generations of your files.
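For illustration only (this is not how Retrospect's grooming works internally, just the idea), keeping the last 10 boils down to something like this, assuming one backup file per run in a folder whose name I've made up:

    # Minimal sketch of "keep the last 10" outside of Retrospect.
    from pathlib import Path

    KEEP = 10
    backup_dir = Path(r"E:\DiskBackupSetA")  # example folder name

    # Sort backup files newest-first by modification time, then delete the rest.
    files = sorted(backup_dir.glob("*.bak"),
                   key=lambda f: f.stat().st_mtime, reverse=True)
    for old in files[KEEP:]:
        old.unlink()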


I can't use Duplicate anyway; there is no way to access the volume concurrently that way. I will have to use Archive instead.

 

The point of the exercise is to get the files off the servers daily and onto the storage machine. They will then be sent to tape weekly with a Recycle backup.

 

Grooming won't help because the file name is always different.


I can't use Duplicate anyway; there is no way to access the volume concurrently that way. I will have to use Archive instead.
Right: you should create an "Archive" script.
The point of the exercise is to get the files off the servers daily and onto the storage machine. They will then be sent to tape weekly with a Recycle backup.
OK, understood.
Grooming won't help because the file name is always different.
Well, the backup set will eventually fill the disk, so you have to Groom or Recycle. With grooming, you keep the most recent backups. With Recycle you wipe them all out, so if your tapes are lost you have nothing at all. Grooming is therefore safer.

 


I presume he uses the backup feature of SQL itself and wants to use Retrospect to back up the resulting files?

 

In that case it's probably also convenient to make a filter for, let's say, files created in the last 24 hours, so that only the latest SQL backup is copied. If you do that every day, you can easily use the grooming setup Lennart mentioned.
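As a rough sketch of that "last 24 hours" filter outside of Retrospect (in Retrospect itself this would be a selector, and the folder path here is just an example):

    # Hypothetical sketch: pick out only the backup files touched in the last 24 hours.
    import time
    from pathlib import Path

    backup_dir = Path(r"\\sqlserver1\Backups")  # example path
    cutoff = time.time() - 24 * 60 * 60

    # Modification time is used here as a stand-in for creation time.
    recent = [f for f in backup_dir.glob("*.bak") if f.stat().st_mtime >= cutoff]
    print(recent)  # only these would be copied in the daily run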

