Search the Community

Showing results for tags 'duplicate'.

Found 6 results

  1. To All Retrospect Users: can you please read the following feature suggestion and offer a +1 vote response if you would like to see this feature added in a future version of Retrospect? If you have time to send a +1 vote to the Retrospect support team, that would be even better. Thank you! I contacted Retrospect support and proposed a new feature that would avoid redundant backups of renamed files which are otherwise identical in content, date, size, and attributes. Currently, Retrospect performs progressive backups, avoiding duplicates, if a file's name remains the same, even if the folder portion of the path has changed. However, if a file stays in the same folder and is merely renamed, Retrospect will back up the file as if it were new, duplicating the data within the backup set. This costs time and disk space if a massive number of files are renamed but otherwise left unchanged, or if the same file (in content, date, size, attributes) appears in various places throughout a backup source under different names. If this proposed feature is implemented, a Retrospect user could rename a file in a backup source and it would not subsequently be redundantly backed up, as long as the file's contents, date, size, and attributes did not change (i.e., a file name change alone wouldn't cause a duplicate backup). I made this suggestion after renaming a bunch of large files, which caused Retrospect to want to re-back up tons of data it had already backed up, merely because I changed the files' names. I had mistakenly thought Retrospect's progressive backup avoided such duplication, because I had observed it doing so when changing a file's folder: for a folder change, Retrospect is progressive and avoids duplicates, but if a file is renamed, Retrospect is not progressive and backs up a duplicate as if it were a completely new file (a rough sketch of the difference appears after this list). If you +1 this suggestion, you will be supporting the possible implementation of a feature that lets you rename files without incurring a duplicate backup of each renamed file. This would allow you to reorganize a large library of files with new names to your liking without having to re-back up the entire library. Thanks for your time in reading this feature suggestion.
  2. Hi, I'm using Retrospect 7.7 on Windows 7 64-bit. I would like to run duplication jobs every second day without the "save state" option, because the duplication task itself takes only 15 to 20 minutes but saving the state takes an additional 3 hours. Although I have set the duplication script option NOT to save state, it still does so every time. Is this a known issue or bug? Is there any way to force the script to skip saving state? Many thanks. (I may have placed this in the wrong forum, so I am posting it in the Pro forum too; if the moderator wants to delete this one from here, please do so. Thanks.)
  3. (Retrospect 10.2.0 (201) on Mac OS X 10.8.4, Mac Mini, 16 GB memory.) I have had several incidents where I duplicated a script while it was running, presuming that the script would be copied and I could tweak the duplicate without disturbing the original, running script. I have stopped doing this because it seems to screw up the original script in strange ways. Thinking about what I have seen, though, I think I know what's happening: the "duplicate" button does indeed duplicate the script, but it names the original as the "copy", which leads me to modify the original and leave the copy unchanged. I have seen errors in the logs, backups written to the wrong media set, and other strange happenings, and this most recent incident would be fully explained by that scenario. I encourage the folks at Retrospect to try this on a long-running script; I bet I'm right. BTW, changing a running script should either be disallowed or fixed so the running script cannot be corrupted, or at least a warning would be nice. Thanks,
  4. I use a laptop for when I'm out of the (home) office, so it has most of the same apps installed that are on the desktop I use in the office: same versions, e.g. Office 2010 Professional Plus, Quicken, Firefox, the same utilities, etc. Both systems are running Win 7 Pro 64. I have always kept data in a separate D: partition, apart from the C: partitions used for the OS and applications. I just started to back up the Programs partition on my laptop. I thought that very few files would actually be backed up, since the various OS and app files are the same on both systems. However, judging by the size of the backup data set relative to the space used by the installed files on both laptop and desktop, and by the time required for each backup session in the Operations log, Retrospect 7.7 appears to have done a complete fresh backup of all those files on my laptop. Do I need to change a parameter setting so that this doesn't continue? I don't know much about disk grooming. Can I use grooming to eliminate the duplicate files? Thanks.
  5. We have Retrospect version 7.7.562. Our regular backup jobs go to our onsite backup drive, and from there we have a Retrospect job that duplicates our onsite Retrospect folder to our offsite backup folder. We have the onsite Retrospect folder selected as the source in the duplicate job and the USB drive as the destination. When we looked at our offsite copy today, we realized the files underneath the selected folder were not being copied and could not be seen through the source selection dialog box. It turns out that there was a "Dantz" file in the directory which, when removed, allows us and the duplicate script to view the directory and copy it off to the offsite drive. What is this Dantz file (I understand it is an internal processing file of some sort), and why is it causing this to happen? We could potentially have lost all of our data due to this issue. Is this process of duplicating the RDB Retrospect files to the USB drive correct, or are we doing this incorrectly? Thank you, Maxtex
  6. Hi, I'm using Retrospect 7.7 on Windows 7 64-bit. I would like to run duplication jobs every second day without the "save state" option, because the duplication task itself takes only 15 to 20 minutes but saving the state takes an additional 3 hours. Although I have set the duplication script option NOT to save state, it still does so every time. Is this a known issue or bug? Is there any way to force the script to skip saving state? Many thanks.
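The behavior described in result 1 (and echoed in result 4) comes down to how a progressive backup decides whether a source file is already in the backup set. Below is a minimal sketch in Python of that idea; it is not Retrospect's actual matching logic, and the matching rules are assumptions for illustration only. Matching on name + size + modification date treats a renamed file as new, while matching on a content hash (plus size and date) would recognize it as a duplicate regardless of its name.

# Illustrative sketch only -- NOT Retrospect's real matching logic.
# It contrasts name-based matching (renames look "new") with
# content-hash matching (renames are recognized as duplicates).
import hashlib
from pathlib import Path


def file_digest(path: Path) -> str:
    """Hash file contents in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def name_based_key(path: Path) -> tuple:
    """Roughly how a name-based matcher might see a file:
    rename the file and this key changes, so it gets backed up again."""
    st = path.stat()
    return (path.name, st.st_size, int(st.st_mtime))


def content_based_key(path: Path) -> tuple:
    """A content-aware matcher: the key survives a rename,
    so only the new name (metadata) would need recording."""
    st = path.stat()
    return (file_digest(path), st.st_size, int(st.st_mtime))


def files_needing_backup(source_dir: Path, already_backed_up: set, key_fn) -> list:
    """Return files whose key is not yet present in the backup set."""
    return [
        path
        for path in source_dir.rglob("*")
        if path.is_file() and key_fn(path) not in already_backed_up
    ]

With name_based_key, renaming a file changes its key, so it reappears in files_needing_backup; with content_based_key it does not, which is essentially what the feature request asks for. The trade-off is that hashing every file costs I/O on each scan, which is presumably one reason a backup product would lean on names and dates instead.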