
Bad backup set header found ... which RDB files?


I have a very large backup set; it's getting close to 1 TB.

I recently ran a verify, and it reported 310 Bad Backup Set header errors.

So I decided to recreate the catalog, hoping that would fix it. It didn't.

 

They all appear to be in the range of

Bad Backup Set header found (0x0a408c1e at 1,381,031,224)
to
Bad Backup Set header found (0x679f7d8e at 1,382,719,480)
 
Is there any way to identify which RDB files are corrupted and responsible for that sequence?
 
It seems like it would be easy to just delete the offending RDB files, rebuild the catalog, and then let the regular backup script fill in the missing files.
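For what it's worth, here is a rough Python sketch of what I had in mind for identifying them. It leans on two unconfirmed assumptions: that the number after "at" in the error message is a cumulative offset across the set's .rdb members taken in filename order, and that the unit is either bytes or 512-byte blocks. The folder path is just a placeholder.

import glob
import os

# Placeholder path to the backup set's member folder -- adjust for your set.
RDB_DIR = r"D:\Retrospect\New Win2003"

# ASSUMPTION: the unit of the offsets in the "Bad Backup Set header found"
# messages is unknown. Try 1 (bytes) or 512 (blocks) and see which lands
# inside the set instead of past its end.
UNIT = 512

# First and last offsets reported by the verify.
FIRST_BAD = 1381031224 * UNIT
LAST_BAD = 1382719480 * UNIT

# ASSUMPTION: members are laid out in sorted filename order.
members = sorted(glob.glob(os.path.join(RDB_DIR, "*.rdb")))

offset = 0
for path in members:
    size = os.path.getsize(path)
    start, end = offset, offset + size
    # Report any member whose byte range overlaps the bad span.
    if start <= LAST_BAD and end > FIRST_BAD:
        print(f"{os.path.basename(path)}: covers {start:,} - {end:,}")
    offset = end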
 
 
Thanks,
 
Matt Ferraro

 


Which version of Retrospect are you using?

 

This looks to be an ongoing problem with Retrospect where .RDB files work fine for a catalog rebuild but fail when grooming or verifying. The only solution I have found is to manually delete the offending .RDB file and rebuild the catalog. Luckily, when grooming, the specific file is named.


I'm using 9.5.3.103

I'm running a groom job now to see if it identifies any RDB files.

The last groom showed no errors:

 

Grooming Backup Set New Win2003...
Groomed 24.1 GB from Backup Set New Win2003.
4/11/2015 3:45:15 AM: Groomed 24.1 GB from Backup Set New Win2003.
 
Here are the results of the last verify. There is one MD5 digest error.
+ Executing Verify at 4/11/2015 12:51 PM (Execution unit 2)
To Backup Set New Win2003...
Bad Backup Set header found (0x0a408c1e at 1,381,031,224)
...
Bad Backup Set header found (0xcf9f7fb1 at 1,382,671,480)
Bad Backup Set header found (0xe3ffe8c3 at 1,382,671,544)
Generated MD5 digest for file "\\xxxxxxxxxx\2132 rear.png" does not match stored MD5 digest.
Bad Backup Set header found (0x436f6e74 at 1,382,671,611)
.....
Bad Backup Set header found (0x679f7d8e at 1,382,719,480)
4/11/2015 5:15:04 PM: 310 execution errors
Remaining: 88 files, zero KB
Completed: 573752 files, 842.3 GB
Performance: 3419.2 MB/minute
Duration: 04:23:10 (00:10:55 idle/loading/preparing)
4/11/2015 5:15:04 PM: Script "Verify" completed with 310 errors
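For that one MD5 mismatch, my thought is to restore the file somewhere and compare digests against the live source copy. A quick Python sketch of what I mean; both paths are placeholders, since the real path is masked above:

import hashlib

def md5_of(path, chunk_size=1 << 20):
    """Return the MD5 hex digest of a file, read in 1 MiB chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            digest.update(chunk)
    return digest.hexdigest()

# Placeholder paths: the live source file and a copy restored from the set.
source = r"\\server\share\2132 rear.png"
restored = r"C:\restore-test\2132 rear.png"

print("match" if md5_of(source) == md5_of(restored) else "MISMATCH")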
 
 
Any other suggestions?
I was thinking of creating a new backup set and then transferring all the snapshots over, hoping that Retrospect will skip the corrupt segments.
 


If you have the storage space available, you could create a new backup set and populate it from the last snapshots for each source in the old backup set. Once the new backup set is ready, you can retire the original backup set and retain it for however long your backup maintenance policy requires. This is the approach I use when backup sets start to give problems.



Ditto to Scillonian's post. Oftentimes you will save time (and patience) in the long run by creating a brand-new backup set and starting from scratch. Then just hold onto the old one for the length of time your retention policy requires.

 

Unfortunately, this is not ideal in many cases, especially when you have large backup sets that take days to rebuild from scratch. However, I've tried to groom/repair on several occasions, only to have to turn around and recreate the backup set anyway. The real downside is the cost of drive space to maintain two backup sets of the data, but if you have the room, it shouldn't be much of an issue.

 

This has probably been my biggest complaint about Retrospect over the five years we've been using it. Unfortunately, it isn't just Retrospect that suffers from this problem. I've also used CrashPlan and Acronis True Image Home and found that they suffer from the same behavior: given enough time, the automated backups eventually have a problem of some kind. It's not a matter of "if" but "when" your backup data, or the associated catalog file, will become corrupted. The constant reading, writing, grooming, unplanned reboots (patching while a backup is running), etc., will eventually cause an issue somewhere. As a result, starting fresh with a backup set every now and then is a good idea if you have the space to hold onto the old one for a while as you create the new one and let it build up some history.

 

Probably not what you wanted to hear, but that has been my experience anyway.

