
Grooming Help


As I understand it, the new grooming feature can be used to free up disk space used by a Media Set.

 

I have 15 scripts, backing up 11 servers to 17 Media Sets that are currently on the same hard drive.

 

I would like to know the easiest/recommended way to move the oldest backup data, on a schedule, onto another hard drive (attached via FireWire) to make more room for current and future backups.

Grooming won't move the data, so you will want to try a Copy Media Set script or a Copy Backup script, which will let you copy data from a specific media set to another media set on the other hard disk.

 

You can then do a recycle backup on the original media set, or perform a grooming operation.
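A minimal sketch of that copy-then-recycle order, assuming it were driven by an outside script; the helper names and paths below are hypothetical stand-ins, not a real Retrospect API.

```python
#!/usr/bin/env python3
# Illustration only: the copy-then-recycle ordering described above,
# expressed as plain Python. run_copy() and recycle_source() are
# hypothetical stand-ins, not a real Retrospect API.

MEDIA_SETS = [f"MediaSet-{n:02d}" for n in range(1, 18)]  # the 17 sets
ARCHIVE_DESTINATION = "/Volumes/FireWireArchive"          # hypothetical path

def run_copy(source_set: str, destination: str) -> bool:
    """Pretend to copy every backup in source_set to destination.
    Returns True only if the copy finished without errors."""
    print(f"copying {source_set} -> {destination}")
    return True

def recycle_source(source_set: str) -> None:
    """Pretend to recycle (erase) the source set to reclaim disk space."""
    print(f"recycling {source_set}")

for media_set in MEDIA_SETS:
    # The ordering constraint that matters: never erase a set until
    # its copy has completed successfully.
    if run_copy(media_set, ARCHIVE_DESTINATION):
        recycle_source(media_set)
    else:
        print(f"copy of {media_set} failed; leaving it untouched")
```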

OK. So I created a Copy Media Set script, selected all 17 of my media sets, told it to copy to the external drive, and then scheduled it to run each Saturday.

 

Here is the (potential?) problem I see: it needs to run after the normal backup scripts have run. How can you ensure that?

 

And then, after the copy is complete, I would need to automate the erasing of the previous Media Sets. How do you automate that and, again, ensure that it happens AFTER the copy is done?

This script can actually read data from the source media set while a backup is still being written to that media set; at least, that is the way it should work.

 

Basically the backup script should not need to be stopped before this copy script starts.

 

This is true for Disk Media Sets, but not for File Media Sets.

 

The copy script does have an option to recycle the source media set after the copy is done, but if you do this, then I would not recommend allowing a backup to write into the set at the same time as the copy.

The copy script does have an option to recycle the source media set after the copy is done, but if you do this, then I would not recommend allowing a backup to write into the set at the same time as the copy.

So how do you do that?

 

And how do you run one script after another?

 

Another thing: I manually ran my "archive" script, which copies my 17 media sets.

 

When I went to see if it had finished yet, I was confused:

If I clicked on Activities > Running, nothing was shown.

If I clicked on Activities > Past, it did not appear in the list.

If I clicked on Past Backups, I saw a number of items going to the "Archive" destination I created, but oddly enough, they all showed "Yesterday at 10:00 PM" as the time, even though I manually triggered the script this afternoon, after creating it at 3:00 PM.

Additionally, not all of the sources appeared in the list, and checking the size of the Archive, I don't think it is big enough to contain all of my previous backups.

Even more confusing, when I click on the Media Set, select the Archive, and click on the Backups tab, only two items are shown, even though the Past Backups window shows eight, which is nowhere near the 17 that should have run.

 

I hope this isn't another glaring bug!

 

Isn't there a non-filtered view of the log that would give me a clue as to what happened?

 

Do you have any ideas?

OK.

 

This gets pretty frustrating.

 

I have to have an individual script for each physical server, plus a script and a media set for each data type on each physical server, in order to take advantage of increased performance through threading.

 

Now, in order to execute one script after another, I have to assign a thread to each "set".

 

The management problem becomes determining how to optimize the use of the eight threads. Obviously some backups are going to take longer than others, so now I have to run, test, calculate, and tweak all of these sets to optimize the use of matched threads...
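Balancing jobs of unequal length across a fixed number of threads is essentially the classic multiprocessor-scheduling problem, and a rough-and-ready heuristic is longest-processing-time-first. A minimal sketch, assuming estimated durations for each backup are already known; all script names and numbers below are invented:

```python
import heapq

# Hypothetical estimated durations (minutes) per backup script, e.g.
# taken from previous runs. All names and numbers here are made up.
estimated_minutes = {
    "ServerA-users": 240, "ServerA-system": 45, "ServerB-mail": 180,
    "ServerB-web": 90, "ServerC-db": 150, "ServerC-files": 60,
    "ServerD-users": 120, "ServerD-system": 30, "ServerE-mail": 200,
}

THREADS = 8

def assign_longest_first(jobs, threads):
    """Longest-processing-time-first: hand each job to the thread with the
    least total work so far. A heuristic, not a guaranteed optimum."""
    heap = [(0, t, []) for t in range(threads)]  # (total minutes, thread id, jobs)
    heapq.heapify(heap)
    for name, minutes in sorted(jobs.items(), key=lambda kv: -kv[1]):
        load, tid, assigned = heapq.heappop(heap)
        assigned.append(name)
        heapq.heappush(heap, (load + minutes, tid, assigned))
    return sorted(heap, key=lambda entry: entry[1])

for load, tid, assigned in assign_longest_first(estimated_minutes, THREADS):
    print(f"thread {tid}: ~{load} min  {assigned}")
```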

 

I mean, it's doable, but don't you think the software should be taking care of these management features? Either by figuring out the most optimal combination of scripts or, probably way easier, by offering a way to have multiple steps in a script.

 

For example, you should be able to have one script do the following (roughly sketched in code after the list):

1. Back up a group of servers, making use of multiple threads, to an attached hard drive

2. When all of the backups are complete, copy the backups to a tape or different drive

3. When the copy is complete, delete the oldest session of backup data from the attached drive
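A minimal sketch of what such a multi-step script could look like; back_up, copy_backups, delete_oldest_session, and the server names and paths are hypothetical stand-ins for whatever the application would actually do at each step:

```python
from concurrent.futures import ThreadPoolExecutor

SERVERS = ["server01", "server02", "server03"]  # hypothetical names
LOCAL_DRIVE = "/Volumes/BackupDrive"            # attached hard drive
ARCHIVE = "/Volumes/FireWireArchive"            # tape or second drive

def back_up(server: str, destination: str) -> None:
    """Step 1 stand-in: back up one server to the attached drive."""
    print(f"backing up {server} -> {destination}")

def copy_backups(source: str, destination: str) -> None:
    """Step 2 stand-in: copy the finished backups elsewhere."""
    print(f"copying {source} -> {destination}")

def delete_oldest_session(location: str) -> None:
    """Step 3 stand-in: remove the oldest session to free space."""
    print(f"deleting oldest session on {location}")

# Step 1: back up the group of servers on multiple threads and wait
# for every backup to finish before moving on.
with ThreadPoolExecutor(max_workers=8) as pool:
    list(pool.map(lambda server: back_up(server, LOCAL_DRIVE), SERVERS))

# Step 2: runs only after all of the backups above have completed.
copy_backups(LOCAL_DRIVE, ARCHIVE)

# Step 3: runs only after the copy has completed.
delete_oldest_session(LOCAL_DRIVE)
```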
