Qlite
Posted February 16, 2006

I'm seeking advice on how to deal with large files that change frequently and need to be backed up. Two examples come to mind. Virtual PC creates a large virtual disk file that may be several gigabytes in size. Every time a user runs that virtual disk, the entire file gets appended to the backup set at the next backup. Obviously, this can eat up backup drive space very quickly. Is there a way to tell Retrospect not to back that file up incrementally, but instead to erase previous iterations of the file?

I have the same problem with some clients who use Microsoft Entourage for e-mail. Entourage stores all mail in a single large database file, so every time a new e-mail arrives the whole file changes and gets copied again in its entirety at the next backup.

Any advice would be most appreciated. Thanks in advance.

BW
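Some background on why this happens: Retrospect, like most file-level backup tools, decides per file whether anything changed, and a changed file is always copied in its entirety; there is no block-level delta. A minimal sketch of that decision logic, in Python and purely illustrative (this is not Retrospect's actual implementation, and the catalog format is made up):

    import os
    import shutil

    def incremental_backup(source_dir, backup_dir, catalog):
        """Copy only files whose size or mtime differs from the catalog.

        The catalog dict persists between runs (in a real tool it would
        live alongside the backup set). A changed file is always copied
        whole, so a multi-gigabyte virtual disk image that changed by
        one byte is re-copied entirely.
        """
        for root, _dirs, files in os.walk(source_dir):
            for name in files:
                src = os.path.join(root, name)
                stat = os.stat(src)
                key = os.path.relpath(src, source_dir)
                if catalog.get(key) == (stat.st_size, stat.st_mtime):
                    continue  # unchanged since last run: skip
                # New or changed: the entire file joins the backup set.
                dst = os.path.join(backup_dir, key)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.copy2(src, dst)
                catalog[key] = (stat.st_size, stat.st_mtime)

So a 4 GB virtual disk image that changed by a single byte adds another 4 GB to the set on the next run, which is exactly the behavior described above.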
twickland
Posted February 16, 2006

Whatever you do, you first need to decide how often these large files need to be backed up, and how many previous versions of the files you need to retain.

One strategy may be to back these files up less frequently to the existing backup set. You can do this by creating a selector like the following:

    Always exclude files matching:
        Enclosing Folder name contains "Virtual PC"
        and
        Backup date is greater than or equal to Today minus 6 days

With this selector, the file is skipped whenever a copy from the past six days already exists in the set, so it gets backed up about once a week.

Another strategy would be to exclude these files from your regular incremental backups, create one or more new backup sets (one for each previous version you wish to retain), and use a Duplicate script to write only these particular files to the new backup sets; a rough sketch of that rotation idea follows below.
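In Retrospect itself, the Duplicate script and its schedule are configured in the GUI rather than in code, but the rotation logic behind the second strategy looks roughly like this Python sketch (the paths, file list, and retention count are all hypothetical, purely for illustration):

    import os
    import shutil
    from datetime import date

    # Hypothetical settings: which large files to duplicate, where to,
    # and how many previous versions to retain.
    LARGE_FILES = [
        "/Users/bw/Documents/Virtual PC/WindowsXP.vhd",
        "/Users/bw/Documents/Microsoft User Data/Database",
    ]
    DEST_ROOT = "/Volumes/Backup/LargeFileSets"
    VERSIONS_TO_KEEP = 3

    def duplicate_large_files():
        # Rotate among N destination folders, one per retained version,
        # so each run overwrites only the oldest copy.
        slot = date.today().toordinal() % VERSIONS_TO_KEEP
        dest = os.path.join(DEST_ROOT, "set-%d" % slot)
        os.makedirs(dest, exist_ok=True)
        for path in LARGE_FILES:
            # Duplicate means a plain copy: unlike an incremental backup
            # set, each slot holds exactly one copy instead of appending
            # a new multi-gigabyte version on every run.
            shutil.copy2(path, dest)

    if __name__ == "__main__":
        duplicate_large_files()

Run on a schedule, this keeps VERSIONS_TO_KEEP complete copies of each large file, while the regular incremental set no longer grows by the file's full size every time it changes.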