Showing results for tags 'duplicate'.
To all Retrospect users: please read the following feature suggestion and reply with a +1 vote if you would like to see this feature added in a future version of Retrospect. If you have time to send a +1 vote to the Retrospect support team as well, even better. Thank you!

I contacted Retrospect support and proposed a new feature that would avoid redundant backups of renamed files whose content, date, size, and attributes are otherwise unchanged. Currently, Retrospect's progressive backup avoids duplicates as long as a file's name stays the same, even if the folder portion of its path changes. However, if a file stays in the same folder and is merely renamed, Retrospect backs it up as if it were a new file, duplicating the data within the backup set. This costs time and disk space when a massive number of files are renamed but otherwise left unchanged, or when the same file (identical in content, date, size, and attributes) appears in several places throughout a backup source under different names.

If this proposed feature were implemented, you could rename a file in a backup source without it being redundantly backed up, provided its contents, date, size, and attributes did not change; a file name change alone would not cause a duplicate backup.

I made this suggestion after renaming a batch of large files caused Retrospect to re-back up a large amount of data it had already backed up, merely because the names changed. I had mistakenly assumed Retrospect's progressive backup avoided such duplication, because I had seen it do so when changing a file's folder. For a folder name change, Retrospect is progressive and avoids duplicates; for a file name change, it is not, and it backs up a duplicate as if the file were completely new.
If you +1 vote this suggestion, you will be supporting the possible implementation of a feature that lets you rename files without incurring a duplicate backup of each renamed file. That would allow you to reorganize a large library of files under new names without having to re-back up the entire library. Thanks for your time in reading this feature suggestion.
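To make the distinction in the suggestion above concrete, here is a minimal sketch (not Retrospect's actual code, and the function names are hypothetical) contrasting name-based matching, which treats a renamed file as new, with content-hash matching, under which a rename alone never triggers a duplicate backup:

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 of the file's contents, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def needs_backup_by_name(path: Path, size: int, mtime: float, catalog: dict) -> bool:
    """Name-based matching (the current behavior described above):
    a renamed file looks new even though its contents are unchanged."""
    return catalog.get(path.name) != (size, mtime)

def needs_backup_by_content(path: Path, catalog_hashes: set) -> bool:
    """Content-based matching (the proposed feature): the file is
    skipped whenever identical content is already in the backup set."""
    return file_digest(path) not in catalog_hashes
```

With the content-based check, renaming a file leaves its digest unchanged, so the file is not backed up again; with the name-based check, the new name has no catalog entry and the file is treated as new.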
(Retro 10.2.0 (201) on Mac OS X 10.8.4, Mac mini, 16 GB RAM) I have had several incidents where I duplicated a script while it was running, presuming that the script would be copied and I could tweak the duplicate without disturbing the original, running script. I have stopped doing this because it seems to corrupt the original script in strange ways. Thinking about what I have seen, though, I believe I know what's happening: the "Duplicate" button does indeed duplicate the script, but it renames the original as the "copy", which leads me to modify the original and leave the copy unchanged. I have seen errors in the logs, backups written to the wrong media set, and other strange happenings, and this most recent incident would be fully explained by that scenario. I encourage the folks at Retrospect to try this on a long-running script; I bet I'm right. BTW, changing a running script should either be disallowed or fixed so the running script cannot be corrupted, or at least a warning would be nice. Thanks,
I use a laptop for when I'm out of the (home) office, so it has most of the same apps that are installed on the desktop I use in the office: same versions, e.g. Office 2010 Professional Plus, Quicken, Firefox, the same utilities, etc. Both systems run Win 7 Pro 64. I have always kept data in a separate D partition, apart from the C partitions used for the OS and applications. I just started to back up the Programs partition on my laptop. I expected very few actual files to be backed up, since the various OS and app files are the same on both systems. However, judging by the size of the backup data set relative to the space used by the installed files on both laptop and desktop, and by the time required for each backup session in the Operations log, Retrospect 7.7 appears to have done a complete fresh backup of all those files on my laptop. Do I need to change a parameter setting so that this doesn't continue? I don't know much about disk grooming. Can I use grooming to eliminate the duplicate files? Thanks.