
Backing Up Too Many Files



I have Retrospect Backup 6.1.230. I am backing up several computers on my network with the same type of script: a Duplicate that replaces the entire contents of the destination drive, omitting all system and library files. This works fine on most of the computers; they each copy a couple hundred files a day. However, my wife's computer copies 128,000 files a day. It's using the same script, so why is it backing up so many files? How can I find out what it's doing and stop it?


Retrospect is seeing the files as having changed in some way and is therefore copying them again.

 

You don't say what OS each of the computers is running; if her machine is running 10.6, that might be an issue. If ACLs are enabled on your wife's computer and you haven't configured your script to ignore extended attribute modifications, that could also cause files to be backed up more frequently than expected.
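To check for that, a rough script along these lines could scan for files carrying extended attributes; files with ACLs also show a "+" in "ls -le" output. This is only a sketch against the Python and /usr/bin/xattr that ship with Leopard, and the path is a placeholder:

#!/usr/bin/env python
# Sketch: walk a directory and flag files that carry extended attributes,
# using the /usr/bin/xattr tool that ships with Mac OS X. Uses the Python 2
# print statement to match the system Python. ROOT is a placeholder path.
import os
import subprocess

ROOT = "/Users/suspect"  # placeholder -- point at the problem area

for dirpath, dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        # "xattr <file>" prints one attribute name per line, nothing if none
        out = subprocess.Popen(["/usr/bin/xattr", path],
                               stdout=subprocess.PIPE).communicate()[0]
        if out.strip():
            print path, "->", " ".join(out.split())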

 

If your wife's computer is somehow losing track of the correct time or time zone (such as if the backup battery is failing), that would also cause files to appear to have changed.
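If you want to check the drift without changing anything, this trivial sketch just asks Apple's public time server for the clock offset, via the ntpdate tool that ships with OS X; run it on the suspect machine:

#!/usr/bin/env python
# Sketch: query an NTP server for this machine's clock offset without
# actually setting the clock. /usr/sbin/ntpdate ships with Mac OS X.
import subprocess

subprocess.call(["/usr/sbin/ntpdate", "-q", "time.apple.com"])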

 

Since you are performing duplicates rather than backups, you won't have a nice catalog history to dig through; you'll have to manually compare the files on the source and destination to see how they differ over time.
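A rough script like this could do the comparison, walking both volumes and reporting files whose size or modification time disagree; it's only a sketch (the two volume paths are placeholders) written for the Python that ships with 10.5:

#!/usr/bin/env python
# Sketch: walk the source and the duplicate in parallel and report files
# whose size or modification time disagree -- the ones a Duplicate would
# copy again. SRC and DST are placeholder paths.
import os

SRC = "/Volumes/Source"     # placeholder -- the live volume
DST = "/Volumes/Duplicate"  # placeholder -- the duplicate destination

for dirpath, dirnames, filenames in os.walk(SRC):
    for name in filenames:
        src = os.path.join(dirpath, name)
        dst = os.path.join(DST, src[len(SRC) + 1:])  # same relative path
        if not os.path.exists(dst):
            print "missing on duplicate:", dst
            continue
        s, d = os.lstat(src), os.lstat(dst)
        if s.st_size != d.st_size or int(s.st_mtime) != int(d.st_mtime):
            print "differs:", src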


  • 2 months later...

I understand _why_ Retrospect would want to back up all the files, but none of those conditions apply here. I have checked her computer and it has the correct date. I have also checked numerous files that it backed up, and they haven't been used or changed for months. She is running Mac OS X 10.5.8. I have several other computers using the same versions of Mac OS X and the Retrospect client, and they work fine. I need to know what needs to be changed on this client/computer to fix this problem.


Something is different about those files, causing Retrospect to want to copy them again. I suspect file metadata such as ACLs.

 

Try this: in your script, go to Options > Matching and uncheck the option to use the attribute modification date when matching; see if that solves your issue.
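If you want to confirm the metadata theory first, a rough script like this could list the files whose attributes have been touched without their contents changing; the path is a placeholder:

#!/usr/bin/env python
# Sketch: list files whose attribute-change time (st_ctime) is well ahead
# of their content-modification time (st_mtime). Such files have had
# metadata (permissions, ACLs, xattrs) touched without the data changing,
# which is what the attribute-modification-date option reacts to.
import os

ROOT = "/Users/suspect"  # placeholder
DAY = 86400              # one day, in seconds

for dirpath, dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            st = os.lstat(path)
        except OSError:
            continue
        if st.st_ctime - st.st_mtime > DAY:
            print path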


  • 3 months later...

Hi all,

 

I have just started encountering the same issue the OP described.

 

We do a duplicate every 2 hours from our main file server to a backup file server. As of 10 am last Thursday (Oct 14, 2010), the server started duplicating the same 6,600 files (4.2 GB) to the backup server.

 

I have tried changing the destination where the duplicate is written, as well as creating a new script.

 

We are running Mac OS X 10.5.8 and Retrospect 6.1.230.

 

Nothing I have done seems to change the resulting backup.

 

I have run the backup script repeatedly, and the same set gets backed up.

 

Because we are using a "duplicate" procedure, I can't get a list of the files that are being backed up. I have tried to read random file names as they fly through the backup window, and the files seem to be scattered all over the file server (292 GB and 300,000 files).
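One way to reconstruct the list might be a script that flags anything whose modification or attribute-change time is newer than the moment the problem started; this is only a sketch, and the server path is a placeholder:

#!/usr/bin/env python
# Sketch: list files whose modification or attribute-change time is newer
# than 10am on Oct 14, 2010, when the problem started -- roughly the set a
# Duplicate would consider changed. ROOT is a placeholder path.
import os
import time

ROOT = "/Volumes/FileServer"  # placeholder
# (year, month, day, hour, min, sec, wday, yday, isdst); isdst=-1 = guess
CUTOFF = time.mktime((2010, 10, 14, 10, 0, 0, 0, 0, -1))

for dirpath, dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            st = os.lstat(path)
        except OSError:
            continue
        if st.st_mtime >= CUTOFF or st.st_ctime >= CUTOFF:
            print path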

 

I have tried turning off the attribute-modification-date matching option as suggested, and that has yielded no change.

 

Any help here would be greatly appreciated.

 

Thank you.

