Bad backup set header found ... which RDB files?


mferraro


I have a very large backup set; it's getting close to 1 TB.

I recently did a verify, and it reported 310 Bad Backup Set header errors.

So I decided to recreate the catalog, hoping that would fix it.

Which it didn't.

 

They all appear to be in the range of

Bad Backup Set header found (0x0a408c1e at 1,381,031,224)
to
Bad Backup Set header found (0x679f7d8e at 1,382,719,480)
 
Is there any way to identify which corrupted RDB files are responsible for that sequence?
 
It seems like it would be easy to just delete the offending RDB files, rebuild the catalog, and then let the regular backup script fill in the missing files.
 
 
Thanks,
 
Matt Ferraro

 


Which version of Retrospect are you using?

 

This looks to be an ongoing problem with Retrospect where .RDB files work fine for a catalog rebuild but fail when grooming or verifying. The only solution I have found is to manually delete the offending .RDB file and rebuild the catalog. Luckily, when grooming, the specific file is named.
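
When verify only gives you offsets rather than file names, one generic check you can run outside Retrospect is to read every .RDB segment end to end and see whether any of them fail at the disk level. Here is a rough Python sketch of what I mean (the backup set folder path is a placeholder); it also prints each segment's cumulative byte range on the guess that the offsets in the error messages count across the segments in name order, which is only an assumption on my part since Retrospect doesn't document what the offset refers to.

# Sketch: read every .rdb segment of a disk backup set end to end to surface
# low-level read errors, and print a cumulative byte range per segment.
# BACKUP_SET_DIR is a placeholder; point it at your backup set folder.
import os
import sys

BACKUP_SET_DIR = r"D:\Retrospect\New Win2003"  # placeholder path

def check_segments(folder):
    rdb_files = sorted(f for f in os.listdir(folder) if f.lower().endswith(".rdb"))
    offset = 0
    for name in rdb_files:
        path = os.path.join(folder, name)
        size = os.path.getsize(path)
        print(f"{name}: bytes {offset:,} - {offset + size - 1:,}")
        try:
            with open(path, "rb") as fh:
                while fh.read(1024 * 1024):  # full read; a bad sector shows up as an OSError
                    pass
        except OSError as exc:
            print(f"  READ ERROR in {name}: {exc}", file=sys.stderr)
        offset += size

if __name__ == "__main__":
    check_segments(BACKUP_SET_DIR)

A clean run doesn't prove the segments are logically intact, but a read error here points straight at the file (and probably the disk) you need to deal with.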


I'm using 9.5.3.103

I'm running a Groom job now to see if it identifies any RDB files.

The last groom showed no errors.

 

Grooming Backup Set New Win2003...
Groomed 24.1 GB from Backup Set New Win2003.
4/11/2015 3:45:15 AM: Groomed 24.1 GB from Backup Set New Win2003.
 
Here are the results of the last verify. There is one MD5 digest error.
+ Executing Verify at 4/11/2015 12:51 PM (Execution unit 2)
To Backup Set New Win2003...
Bad Backup Set header found (0x0a408c1e at 1,381,031,224)
...
Bad Backup Set header found (0xcf9f7fb1 at 1,382,671,480)
Bad Backup Set header found (0xe3ffe8c3 at 1,382,671,544)
Generated MD5 digest for file "\\xxxxxxxxxx\2132 rear.png" does not match stored MD5 digest.
Bad Backup Set header found (0x436f6e74 at 1,382,671,611)
.....
Bad Backup Set header found (0x679f7d8e at 1,382,719,480)
4/11/2015 5:15:04 PM: 310 execution errors
Remaining: 88 files, zero KB
Completed: 573752 files, 842.3 GB
Performance: 3419.2 MB/minute
Duration: 04:23:10 (00:10:55 idle/loading/preparing)
4/11/2015 5:15:04 PM: Script "Verify" completed with 310 errors
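
For the one MD5 digest error, I figure I can restore a copy of that file to a scratch folder and compare hashes against the original on the source share myself, outside of Retrospect. Something along these lines (both paths are placeholders):

# Sketch: hash the original file and a restored copy and compare the two.
# Both paths are placeholders.
import hashlib

ORIGINAL = r"\\server\share\2132 rear.png"   # file on the source volume
RESTORED = r"C:\RestoreTest\2132 rear.png"   # copy restored from the backup set

def md5_of(path, chunk_size=1024 * 1024):
    digest = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    a, b = md5_of(ORIGINAL), md5_of(RESTORED)
    print("original:", a)
    print("restored:", b)
    print("MATCH" if a == b else "MISMATCH")

Of course, if the source file has changed since the backup, the hashes won't match even when the backup copy is fine.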
 
 
Any other suggestions?
I was thinking of creating a new backup set, then transferring all the snapshots over, hoping that Retrospect will skip the corrupt segments.
 

If you have the storage space available, you could create a new backup set and populate it from the last snapshots for each source in the backup set. Once the new backup set is ready you can then retire the original backup set and retain it for however long your backup maintenance policy requires. This is the approach I use when backup sets start to give problems.


Ditto to Scillonian's post. Oftentimes you will save time (and patience) in the long run by creating a brand new backup set and starting from scratch. Then just hold onto the old one for the length of your retention policy.

 

Unfortunately, this is not ideal in many cases, especially when you have large backup sets that take days to rebuild from scratch. However, I've tried to groom/repair on several occasions, only to have to turn around and recreate the backup set anyway. The real downside is the cost of drive space to maintain two backup sets of the data, but if you have the room, it shouldn't be much of an issue.

 

This has probably been my biggest complaint about Retrospect over the five years we've been using it. Unfortunately, it isn't just Retrospect that suffers from this problem. I've also used CrashPlan and Acronis True Image Home and found that they suffer from the same behavior (given enough time, the automated backups eventually have a problem of some kind). It's not a matter of "if" but "when" your backup data, or the associated catalog file, will become corrupted. The constant reading, writing, grooming, unplanned reboots (patching while a backup is running), etc., will eventually cause an issue somewhere. As a result, starting fresh every now and then with a backup set is a good idea if you have the space to hold onto the old one for a while, while you create the new one and let it build up some history.

 

Probably not what you wanted to hear, but that has been my experience anyway.

