Retrospect and large disk backup sets

jlg
I have a large disk-based backup set: about 4,700 files, roughly 1.2 TB in size. The Retrospect server accesses the set over a GigE network via a Windows file share hosted by a Windows 2000 server in another location (it's an online offsite backup). Retrospect chokes on anything involving this backup set: even opening the set's properties window takes 10-15 minutes, and switching between tabs or making any change in that window takes just as long. When Retrospect scans the set for grooming, the main Retrospect application window goes blank white and *might* refresh every half-hour or so. It's unusable.

 

The wrong response is what I got from Retrospect support in Europe, which was essentially, "Why would anyone have a 1.2 TB backup set?" That's not an answer. I want to know whether Retrospect is supposed to be able to handle large disk-based backup sets, and if not, why not.


Hi

 

That is indeed the wrong answer.

 

A 1 TB backup set should be no problem at all. I suspect the root cause is link speed: when the catalog file is big (probably close to 2 GB in your case), you can expect this kind of sluggishness whenever it's accessed over a slow link.

 

I would keep the catalog file on the local disk and keep only the data on the remote server. That will speed up all operations, especially grooming. Keep in mind, though, that grooming still has to move a lot of data around, so it can still be a slow process.

 

Thanks

Nate


The catalog file is local; it's just the disk set itself that's across the GigE connection. The set was created in the last rev of Retrospect, which Dantz admits had several problems with disk backup sets. I created a new disk backup set on the same volume, local catalog file, same setup; that set is now at 650+ GB with an 800 MB catalog, and it's behaving normally. So I suspect both the old catalog and the backup set itself were in bad shape (a recatalog didn't help, which points at the set and not just the catalog). Thankfully it's an offsite backup that can be replaced easily; if it had been a critical archive, I don't know what we would have done. I've set the new catalog to groom to the last 14 backups rather than using the Retrospect default; I'm not sure I trust the defaults at this point. That's what I get for using rev 1 of a new feature, I suppose. :-)


Quote:

The set was created in the last rev of Retrospect, which Dantz admits had several problems with disk backup sets.

What do you mean by the last rev, a release of 7.0 or a release of 6.5? My backup sets were all created with the first release of 7.0, and I was lucky if I could get Retrospect to update the screen at all; it would either stay white or just crash outright. I was, however, trying to run four backup sets, each about the same size as yours, simultaneously! With three, it got much further, and on the next go-round it almost acted as if it had "gotten used to" grooming. After two weeks of this, I was able to get Retrospect to complete grooming on seven backup sets. :-)


Once the size of the backup catalog reaches a certain magic number, Retrospect has problems accessing it. Backup jobs going to that backup set simply sit in the "waiting" queue waiting on the backup set to be available, even though the catalog and backup set files are online. Calling up the properties window of the backup catalog takes several minutes, as does changing to any of the various tabs within the catalog properties dialog. Once I change to another tab of the properties dialog, though, it is as if I have forced Retrospect to admit that the backup set is really available, and it will start the waiting backups.

 

I don't know what the "magic number" is, other than to say that my catalog file is almost 5 GB, and the backup set has 3,227 files on disk taking up 1.46 TB. The catalog file is local on the Retrospect server, and the backup set files are on a Windows shared network drive across GigE.
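For comparison with other people's sets, here's a quick sketch of how to pull those numbers yourself. The paths are hypothetical, and I'm assuming `.rdb` for disk-set member files and `.rbc` for the catalog; adjust to match what's actually in your set's folder:

```python
import os

def set_stats(member_dir, catalog_path):
    """Count .rdb member files, total their bytes, and get the catalog size."""
    members = [f for f in os.listdir(member_dir) if f.lower().endswith(".rdb")]
    data_bytes = sum(os.path.getsize(os.path.join(member_dir, f)) for f in members)
    return len(members), data_bytes, os.path.getsize(catalog_path)

# Hypothetical paths -- substitute your own share and catalog locations:
# n, data, cat = set_stats(r"\\fileserver\backups\OffsiteSet",
#                          r"C:\RetroCatalogs\OffsiteSet.rbc")
# print(f"{n} members, {data / 1024**4:.2f} TB data, {cat / 1024**3:.2f} GB catalog")
```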


Quote:

Backup jobs going to that backup set simply sit in the "waiting" queue waiting on the backup set to be available,

I call this the "Waiting for backup set" bug.

 

I don't think the size of the backup set is what triggers it. There seems to be a flag that marks a backup set as in use; the grooming code sets it but never clears it. Once Retrospect starts having to groom, it seems to run the grooming code on every pass, so this bug constantly leaves backup sets marked unavailable. As you discovered, opening the backup set's properties lets it be accessed again, so I'm assuming that action clears the flag.
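To make the hypothesis concrete, here's a toy model of the failure mode. This is pure speculation about Retrospect's internals, not its actual code; it just shows why jobs would sit in "waiting" until the properties window is opened:

```python
# Toy model of the hypothesized "Waiting for backup set" bug.
class BackupSet:
    def __init__(self):
        self.busy = False          # the in-use / availability flag

    def groom(self):
        self.busy = True           # grooming marks the set busy...
        # ...bug: it never sets self.busy = False when it finishes

    def open_properties(self):
        self.busy = False          # opening properties clears the flag

    def can_run_backup(self):
        return not self.busy

s = BackupSet()
s.groom()
print(s.can_run_backup())   # False -> jobs sit in the "waiting" queue
s.open_properties()
print(s.can_run_backup())   # True  -> the waiting backups start
```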


Well, this problem needs to be addressed soon. My daily backup-to-disk (b2d) backups just aren't happening unless I manually open the properties of the b2d set every day.


Hopefully Dantz fixed the grooming bug in the latest build; we'll see pretty quickly. Regardless, the fact remains that if a disk backup set gets too large, or at least if its catalog file gets too large, Retrospect can't handle it. I had a disk backup set of almost 2 TB with a (locally stored) 5 GB catalog file, and Retrospect just wouldn't open the thing. The server has 2 GB of RAM. I hope Dantz doesn't seriously expect us to buy a cop-out like "you need at least as much RAM as the size of the catalog file." That's crap.

 

I don't even think Retrospect was grooming the backup set yet, since it hadn't completely filled the disk, and I think this is part of the problem: the size of the backup set isn't the only limitation. Retrospect needs to groom the catalog file as well, because once the catalog gets too large it won't matter how much space is left on the disk; Retrospect won't be able to access the catalog at all.

 

I'm trying a different approach: creating multiple smaller disk sets, limited to 250 or 500 GB each, and splitting the backup jobs among them. That should keep the catalog files down to a size Retrospect can manage. At least, I hope so.
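Rough arithmetic on why the smaller sets should help, assuming catalog size scales roughly with the amount of data backed up. It really tracks file count and session history, so treat this as a ballpark only; the 5 GB catalog / 1.46 TB data figures are from the set described earlier in the thread:

```python
# Observed earlier in the thread: ~5 GB of catalog for ~1.46 TB in one set.
cat_gb_per_tb = 5 / 1.46   # roughly 3.4 GB of catalog per TB of data

for set_size_gb in (250, 500):
    est_catalog_gb = cat_gb_per_tb * set_size_gb / 1024
    print(f"{set_size_gb} GB set -> ~{est_catalog_gb:.2f} GB catalog")
# 250 GB set -> ~0.84 GB catalog
# 500 GB set -> ~1.67 GB catalog
```

Both estimates land well under the 5 GB catalog that choked Retrospect, which is the point of the split.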

 

It shouldn't have to be this hard. We're looking at a data lifecycle management solution so we can get something reliable for our data, and leave just the simple stuff to Retrospect. It's apparently just not built for heavy lifting.

