About Humptydank

  • Rank
    Occasional Forum Poster
  1. Hi folks -- Sorry, as I mentioned in a previous post I'm a newbie at this stuff, but I still have the responsibility for evaluating a backup solution for our small company, so I apologize for my learning curve here. I'd just like to make sure I understand the general theory at work:

     -- Am I correct in describing the file portion of a Retrospect Backup Set as an unduplicated collection of the files on all the machines backed up to that set? So files that are on every machine but are exactly the same (like system files would be) are only added once to the set?

     -- But if I back up the same machines to a new Backup Set (the next night, for example), all those files are copied again to the new set. So if I'm creating a new Backup Set every night for a week, I have to wait a week to come back around to the first set before a "progressive" backup takes place?

     -- Am I correct in describing a Snapshot as just an index, indicating which of the many files in a Backup Set's file collection existed on a particular machine at a particular point in time?

     -- So if a file that is already in the Backup Set is revised, it sounds like it is simply added to the set again as a unique instance. Rather than trying to be smart about whether it's a new file or not, Retrospect simply says "this doesn't match anything in the Backup Set's collection" and adds it. Conversely, if it finds a file anywhere, on any machine, that exactly matches a file in its collection, it doesn't add it. It then lets the Snapshots sort out what goes where when it needs to.

     -- Are Retrospect Catalog Files then essentially the table of contents for a Backup Set, listing all the unique files and Snapshots contained in the set? That seems to be the case, since a Catalog can be re-created from the Backup Set itself.

     -- If the above is correct, then it sounds like a Progressive Backup looks at a machine's files, adds any new/changed files to the Backup Set in their entirety as new instances, stores a Snapshot, and moves on. Is that accurate? (There's a rough code sketch of how I'm picturing this after my posts below.)

     Just quick answers are all I'm asking for here; thanks so much for any and all help. -- Dave
  2. Hello! I'm the tech guy for a small company, and I'm not a backup pro, but I'm the best we have, so forgive the newbie-ish questions. :-) We're evaluating Retrospect, and one of the major attractions for us was the Proactive Backup feature. In particular, we have a remote workstation that, although it's on a broadband connection, isn't always up, and the connection isn't always reliable. The initial backup is big, 35 GB, and if I'm reading things right it should take about four days of transfer time to complete (rough arithmetic after the posts below). It's nearly guaranteed to be disconnected many times before it's done, but if I understand Proactive Backup correctly, it should then reschedule itself and try again. My questions are:

     -- On some previous tries, I saw in the logs a whole series of attempts, each getting more files as Proactive Backup re-connected and went about its business. How do I then know when I have a completed Backup Set? Among the multiple tries it's very hard to determine which errors were corrected on later attempts, etc. Is there an easy way to tell when the initial backup has completed and we've achieved at least the first level of safety?

     -- I assume that once the full backup is completed, the progressive backups will take considerably less time under normal usage of the machine?

     -- Did I read the help file correctly (yes, I read the help file; I'm so ashamed...) that within a single Backup Set, Retrospect will not duplicate the same files even if they are on different machines? So if I have common.dll in the c:\windows directory on machines A and B, Retrospect will detect that they are the same file and only store it once, using Snapshots to determine who gets what on a restore? If that's true, then it sounds like it would benefit me to put the two remote machines, both running XP and both needing Proactive Backup, into the same Backup Set. That way Retrospect would only try to fetch all the system files over the slow connection once. Is that true?

     Sorry to make this so long, but thanks in advance for any advice! -- Dave
  3. Hi folks - I'm not particularly Mac savvy, but we have one legacy Mac running a still-important database in 4th Dimension on an older version of Mac OS (not OS X). An evaluation version of Retrospect 6.5 is running on a Windows server, and we'll be putting the Mac client on the Mac tomorrow for testing. Does anyone have any insight into how well the 4D database is backed up under these circumstances? I assume Retrospect isn't 4D-aware, so I'm not sure whether we'll need to shut down the 4D server application completely during the backup periods, or whether it will make a clean backup in place. Anyone have any experience here? Thanks! -- Dave
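
Post 1 describes a de-duplicating backup model: one collection of unique file contents per Backup Set, plus Snapshots that record which files a given machine had at a given time. As a minimal sketch of that mental model only (a toy, not Retrospect's actual on-disk format or behavior; the class and method names are invented for illustration):

    import hashlib

    class BackupSet:
        """Toy model of post 1's picture: a Backup Set stores each unique
        file's content exactly once, and a Snapshot is just an index of which
        of those files existed on one machine at one point in time."""

        def __init__(self):
            self.files = {}      # digest -> file content (the unduplicated collection)
            self.snapshots = []  # (machine, timestamp, {path: digest}) tuples

        def backup(self, machine, timestamp, source_files):
            """Progressive backup of one machine: copy in only content the set
            has never seen, then record a Snapshot mapping paths to digests."""
            index = {}
            for path, content in source_files.items():
                digest = hashlib.sha256(content).hexdigest()
                if digest not in self.files:   # new or changed content: store a new instance
                    self.files[digest] = content
                index[path] = digest           # already-known content is only referenced
            self.snapshots.append((machine, timestamp, index))

        def catalog(self):
            """Rough stand-in for a Catalog File: a table of contents listing
            the unique files and Snapshots, rebuildable from the set itself."""
            return {"unique_files": list(self.files), "snapshots": len(self.snapshots)}

    # Two machines share common.dll, so its content lands in the set only once;
    # the Snapshots decide which machine gets which files back on a restore.
    bset = BackupSet()
    bset.backup("A", "night1", {r"c:\windows\common.dll": b"same bytes", r"c:\docs\a.txt": b"A's file"})
    bset.backup("B", "night1", {r"c:\windows\common.dll": b"same bytes", r"c:\docs\b.txt": b"B's file"})
    print(bset.catalog())   # 3 unique files, 2 snapshots

In this toy, a revised file hashes differently, so it is simply added as another instance, and a second Backup Set would start with an empty collection and copy everything again, which matches the questions above.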
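The four-day estimate for the 35 GB initial backup in post 2 is easy to sanity-check. The link speed below is an assumption (the post doesn't give one); at a sustained upload of roughly 1 Mbit/s the figure lands in the same ballpark:

    # Back-of-the-envelope check on "35 GB takes about 4 days".
    backup_gb = 35
    link_mbit_per_s = 1.0                     # assumed sustained upload speed

    total_bits = backup_gb * 8 * 1e9          # 35 GB expressed in bits
    seconds = total_bits / (link_mbit_per_s * 1e6)
    days = seconds / 86_400
    print(f"{days:.1f} days")                 # ~3.2 days, before retries and overhead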