dheiert Posted February 10, 2011
I need to run concurrent Duplicate jobs to a large RAID array, moving SQL backups from several servers to the storage server. I can't figure out how to configure the destination volumes so that the duplicate jobs aren't all waiting on the single Windows drive. Any help with this? Is it possible to define a folder as a volume rather than a subvolume?
Lennart_T Posted February 10, 2011
I strongly recommend running "Backups" instead of "Duplicates". With duplicates you lose all history: a file deleted from the original will also be deleted from the duplicate.
1. Create two "Disk Backup Sets" on the RAID volume.
2. Set up "Grooming" to keep (say) the last 10 backups.
3. Create two backup scripts, one backing up to each backup set.
4. Create two grooming scripts that each weekend groom out the oldest backups, keeping the last 10.
That way, you have at least a few generations of your files.
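Retrospect configures grooming in its GUI rather than in code, but as a standalone illustration of the retention policy described above ("keep the last 10 backups, groom out the rest"), a minimal sketch in Python might look like this. The folder layout and the `.bak` extension are assumptions, not anything Retrospect itself uses:

```python
# Conceptual sketch only: "keep the newest `keep` backup files, delete the rest".
# Retrospect's grooming works on backup sets, not loose files; this just
# illustrates the retention rule.
import os

def groom(folder, keep=10):
    """Delete all but the `keep` newest .bak files in `folder`."""
    backups = sorted(
        (f for f in os.listdir(folder) if f.endswith(".bak")),
        key=lambda f: os.path.getmtime(os.path.join(folder, f)),
        reverse=True,  # newest first
    )
    for name in backups[keep:]:
        os.remove(os.path.join(folder, name))
```

The point of keeping the newest N rather than recycling everything is exactly Lennart's: the most recent generations always survive a groom.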
dheiert Posted February 10, 2011 Author (edited)
I can't use Duplicate anyway; there's no way to access the volume concurrently that way, so I'll have to use Archive. The point of the exercise is to get files off the servers daily and onto the storage machine. Then they will be sent to tape weekly with Recycle. Grooming won't help because the file name is always different.
Edited February 10, 2011 by Guest
Lennart_T Posted February 11, 2011
> I can't use Duplicate anyway. No way to access the volume concurrently that way. Will have to use Archive.
Right: you should create an "Archive" script.
> The point of the exercise is to get files off the servers daily and onto the storage machine. Then they will be sent to tape weekly with Recycle.
OK, understood.
> Grooming won't help because the file name is always different.
Well, the backup set will eventually fill the disk, so you have to Groom or Recycle. With grooming, you can keep the latest backups. With Recycle you wipe them all out, so if your tapes are lost you have nothing at all. So grooming is safer.
Ramon88 Posted February 11, 2011
I presume he uses the backup feature of SQL Server itself and wants to use Retrospect to back up the resulting files? In that case it's probably also convenient to make a filter for, say, files created in the last 24 hours, so as to only back up the latest SQL backup. If you do that every day, you can easily use the grooming setup Lennart mentioned.
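In Retrospect the filter Ramon describes would be built as a Selector in the GUI. As a standalone sketch of the same idea, assuming the SQL backups land in one folder (the path handling and 24-hour window here are illustrative assumptions, not Retrospect behavior):

```python
# Illustration only: select files modified within the last 24 hours,
# i.e. yesterday's SQL backup dumps and nothing older.
import os
import time

def recent_backups(folder, max_age_hours=24):
    """Return paths of files in `folder` modified within `max_age_hours`."""
    cutoff = time.time() - max_age_hours * 3600
    recent = []
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path) and os.path.getmtime(path) >= cutoff:
            recent.append(path)
    return sorted(recent)
```

Filtering on modification time rather than file name sidesteps the problem dheiert mentioned: SQL backup files get a different name every day, so a name-based rule can't track them, but a "last 24 hours" rule can.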