Everything posted by jgowing

  1. Perhaps this Technote may help
  2. jgowing

    Error -1101 after Windows October Update

    Have you seen this - Microsoft suspends Windows 10 update, citing data loss reports
  3. Firstly, I suggest using the Tick button to check your Selector against a particular source; it will show what gets included and excluded. Secondly, remember that when Retrospect takes a backup, it takes a Snapshot of the ENTIRE target to record its state at that time. Thus the Snapshot will show ALL the files on the target, while the Session will show ONLY the files that were backed up during that particular session. This difference between a Snapshot and a Session catches many people out. The Snapshot is a necessary part of Retrospect's Progressive Backup technique: it is what allows a target to be restored to the state it was in at the time of any backup. Having a comprehensive understanding of the Progressive Backup process is fundamental to using Retrospect effectively. See: Progressive Backup & Snapshots
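     If it helps to see the distinction in code, here is a rough sketch (plain Python; all names are invented for illustration and have nothing to do with Retrospect's internals). The Snapshot records the state of every file on the source at backup time, while the Session records only the files actually copied in that run.
[code]
# Toy model of the Snapshot vs Session distinction. Illustration only.

def run_backup(source_files, already_backed_up):
    """source_files: {path: modification_time} for the whole volume."""
    # The Snapshot records the state of EVERY file on the source,
    # whether or not it gets copied in this run.
    snapshot = dict(source_files)

    # The Session records only files that actually need copying
    # (new or changed since the last backup) - Progressive Backup.
    session = {
        path: mtime
        for path, mtime in source_files.items()
        if already_backed_up.get(path) != mtime
    }
    return snapshot, session

volume = {"C:/docs/a.txt": 100, "C:/docs/b.txt": 200}
snap1, sess1 = run_backup(volume, already_backed_up={})
volume["C:/docs/b.txt"] = 300                      # b.txt changes before the next run
snap2, sess2 = run_backup(volume, already_backed_up=snap1)

print(len(snap2))   # 2 - the snapshot still lists the whole volume
print(list(sess2))  # ['C:/docs/b.txt'] - the session holds only what was copied
[/code]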
  4. jgowing

    Does Restore require Write access?

     Probably not. While the file system might be BTRFS, Synology sets up its disks using the Linux volume manager, which would not be recognised on a Windows machine. Your disk will have two partitions: one Synology DSM system partition and one data partition, usually called Volume1. However, if you were to boot a machine into a Linux derivative and mount the drive, it may well be visible. For this to work your disk must be connected to the motherboard and be supported and visible in the BIOS (USB won't work). I would recommend creating a PartedMagic boot disk (partedmagic.com), booting your Windows machine with it (it won't touch your Windows installation), and using the disk tools in PartedMagic to identify and mount the disk. With luck your data may be recoverable.
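     If you prefer to script the Linux side once you have booted the rescue environment, something along these lines can activate and mount an LVM data volume read-only. This is only a sketch: the volume group and logical volume names (vg1 / volume_1 here) vary between Synology models, it assumes the lvm2 tools are present (they are in PartedMagic), and it must be run as root.
[code]
# Sketch: activate and mount a Synology data volume from a Linux rescue boot.
# Run as root. VG/LV names (vg1, volume_1) and the mount point are assumptions;
# check the output of 'lvs' on your own disk before mounting.
import subprocess

def sh(cmd):
    print("+", " ".join(cmd))
    return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

sh(["vgscan"])                 # find LVM volume groups on the attached disk
sh(["vgchange", "-ay"])        # activate them
print(sh(["lvs"]))             # list logical volumes - note the data LV's name
sh(["mkdir", "-p", "/mnt/syno"])
# Mount read-only so nothing on the damaged disk is changed
sh(["mount", "-o", "ro", "/dev/vg1/volume_1", "/mnt/syno"])
print(sh(["ls", "/mnt/syno"])) # your shared folders should appear here
[/code]
     If the volume group does not show up at all, the disk may also be part of a Linux software RAID set that needs assembling first; PartedMagic's disk tools can handle that too.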
  5. My script has only the Retrospect R:\ drive as the source. (I have a separate script and backup set for the C:\ system drive.)
  6. Retrospect 9.5 Server on Win7 Pro. Retrospect is installed on its own drive (best practice), not the C:\ system drive, in the folder "Program Files". There are also folders for Retrospect Catalogs, Reports, and Resources, plus a bunch of other assorted folders. I have a script to back up the Retrospect server volume with a Selector which includes everything except the other assorted folders. I have just discovered that the "Program Files" folder is never backed up. I have tried variations of the Selector, explicitly including "Program Files" and including the Windows Special Folders "Program Files", but no matter what I try it will not back up that folder. I suspect that Retrospect recognises "Program Files" as a special Windows folder, and that the Windows Special Folders selector expects those folders to be on the system drive. This could be a major issue, as many server applications are installed on non-system drives in a "Program Files" folder, which could prevent Retrospect from properly backing them up in these situations. Any thoughts or ideas?
  7. jgowing

    Rebuilding Retrospect Server

     Recovering the catalogs is amazingly simple. Once you have your new Retrospect up, just drag and drop the catalog files onto the Retrospect window; it will automatically register them and create the backup sets for you. For the future, it is good practice to make a Duplicate job to copy the \all users\retrospect directory and the catalogs to some safe place, at least on another machine. I normally just overwrite them every day, so there is always one good copy of that data handy for a DR scenario. See the user guide chapter Management > Moving Retrospect for tips on what needs to be copied. Then, once you have a basic server, rebuilding Retrospect is a 10-minute job: copy the config.dat, restart, drop in the catalog files, and start restoring client data. Cheers
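     If you would rather not rely on remembering to refresh that safety copy, a small scheduled script can do it daily (or keep using a Retrospect Duplicate job, which has the advantage of being logged). A rough sketch in Python; the paths below are examples only, so use the locations given in the Moving Retrospect section of your version's user guide.
[code]
# Sketch: copy Retrospect's config and catalog files to another machine/share
# so they are on hand for disaster recovery. All paths are examples only.
import shutil
from pathlib import Path

CONFIG_DIR = Path(r"C:\Documents and Settings\All Users\Application Data\Retrospect")
CATALOG_DIR = Path(r"R:\Retrospect Catalogs")          # wherever your .rbc files live
DEST = Path(r"\\othermachine\drcopies\retrospect")     # a share on another machine

def mirror(src: Path, dest: Path) -> None:
    dest.mkdir(parents=True, exist_ok=True)
    for item in src.iterdir():
        if item.is_file():
            shutil.copy2(item, dest / item.name)       # overwrite yesterday's copy

mirror(CONFIG_DIR, DEST / "config")     # config*.dat and friends
mirror(CATALOG_DIR, DEST / "catalogs")  # *.rbc catalog files

print("DR copy refreshed at", DEST)
[/code]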
  8. jgowing

     V7.7.341 released, what's changed?

    I found the "Proper" Release notes here on the Dantz KB http://kb.dantz.com/display/2n/kb/article.asp?aid=9729.
  9. Thought I'd share this little problem I had today. The error occurs in two scenarios: one during a compare, possibly involving large files over 1 GB (Retrospect catalog files specifically); the other as a VSS snapshot is being taken of a volume prior to Open File Backup. The logs look like this:
     Normal backup using at 8/26/2010 6:00 PM (Execution unit 2) To Backup Set ...
     T-8: VssWSnapVolume: DoSnapshotSet failed, winerr -2147024882, error -1018
     Can't use Open File Backup option for on server01, error -1018 (not enough memory)
     Normal backup using at 8/30/2010 6:00 PM (Execution unit 2) To Backup Set ...
     T-8: VssWSnapVolume: DoSnapshotSet failed, winerr -2147024882, error -1018
     Can't use Open File Backup option for on server, error -1018 (not enough memory)
     Further investigation revealed that the first job was still doing its verification when the snapshot for the second one fired off. This overwhelmed the memory on the server, resulting in the errors. Solution: for a given server, make sure no other Retrospect tasks are running at the time a snapshot for Open File Backup needs to be taken.
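     A quick way to spot this kind of collision after the fact is to compare the start and end times of each execution touching a given server and flag any overlap. A rough sketch; the job list here is invented, so fill it in from your own operations log.
[code]
# Sketch: flag Retrospect executions on the same source server whose
# run windows overlap - the situation that triggered error -1018 above.
# The job data below is made up for illustration.
from datetime import datetime

jobs = [
    ("server01 data backup",  datetime(2010, 8, 26, 18, 0), datetime(2010, 8, 26, 21, 40)),
    ("server01 OFB snapshot", datetime(2010, 8, 26, 20, 0), datetime(2010, 8, 26, 20, 5)),
]

def overlaps(a, b):
    """True if the two (name, start, end) windows intersect."""
    return a[1] < b[2] and b[1] < a[2]

for i, a in enumerate(jobs):
    for b in jobs[i + 1:]:
        if overlaps(a, b):
            print(f"WARNING: '{a[0]}' and '{b[0]}' overlap - stagger their schedules")
[/code]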
  10. No prob, you're welcome. Of course, if you were to run a Thorough verify at a later stage you do run the risk of some source data having changed, and thus getting an error. However, since the MD5 hashes are taken at the time of backup, that problem won't occur with a later MD5 verify. The other cool thing with scripted verify, which I forgot to mention, is that Retrospect is clever and knows what it has already verified, so you can just run a verify whenever, and it will only verify new stuff. Modern tapes, especially LTO, are pretty robust, but the "dailies" do tend to get a bit more wear and tear, so it is a good idea to cycle new ones into your high-use backup sets (the "New Member" function in Retrospect).
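     The reason a later MD5 verify is immune to source changes is that the digest is captured from the data as it is written, and the later check re-reads only the backup media. A rough sketch of the idea in Python; nothing here is Retrospect's actual format.
[code]
# Sketch of why a deferred MD5 verify ignores later source changes:
# the digest is recorded at backup time, and verification re-reads only
# the backed-up copy, never the (possibly changed) original.
import hashlib

def md5(data: bytes) -> str:
    return hashlib.md5(data).hexdigest()

# At backup time: copy the data and remember its digest.
source = b"quarterly figures v1"
backup_copy, recorded_digest = source, md5(source)

# Later, the live file changes on the source volume...
source = b"quarterly figures v2"

# A Thorough verify (re-reading the source) would now report a mismatch,
# but the MD5 verify only checks the media against the recorded digest:
assert md5(backup_copy) == recorded_digest
print("media verified OK, regardless of what the source looks like now")
[/code]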
  11. jgowing

    Backup Sets & Removable Drives

     For my final solution, which has now been running beautifully for several months, see http://forums.dantz.com/showpost.php?post/142042/ Thanks for the help.
  12. I suspect I may have an issue around using removable USB drives, and I cannot find a clear explanation in the docs regarding how Retrospect treats removable drives. There is mention of it making provision for them, but exactly how is not elaborated.
     I have two disk backup sets, Romeo and Sierra, each with a single member on a removable drive; the volumes and physical drives are also labelled Romeo and Sierra. A single Proactive script points to both the Romeo and Sierra backup sets. The theory is that Retrospect will back up to whichever disk drive is available, and the user alternately takes one drive off site each day. This is based on the white paper describing using Proactive Backup with removable drives.
     All was well until Romeo kept filling and Sierra did not, and backups failed entirely if the Romeo drive was off site. On investigation, by inspecting the directory structures on the disks and the properties of the members, I discovered that BOTH the Romeo and Sierra backup sets were pointing to the Romeo disk drive. Quite how this happened I don't know. I checked that all of the Sierra backup set's container files were accounted for, copied them all back to the Sierra disk drive, rebuilt the catalog, and all has been well since.
     However, on any given day one of the drives is off site. The Proactive monitor tab (Backup Sets) shows the present drive as "Ready" and the missing drive as "Media", which seems reasonable, BUT if I go to Backup Sets and check the member properties for the backup set on the off-site drive, it points to the "other", present drive. I suspect this is OK provided no backup attempts to run to the "missing" backup set; if one does, I think Retrospect will create a directory structure for that backup set on any available removable drive.
     I am beginning to wonder if Retrospect would prefer to have a single backup set with two members, one on each physical drive; it might then write to whichever MEMBER is available. Can anyone elaborate on how best to set up Retrospect to deal with this situation and explain how it works, or point me to some documentation on this? TIA Trafford
  13. I wanted to do pretty much exactly this with 2 drives, and I eventually figured it out. I have posted my solution as a separate post; see http://forums.dantz.com/showpost.php?post/142042/
  14. For a simple, fully automatic backup with minimal operator intervention, I thought this would be desirable. After much experimentation, and some discussion, I got the following setup to work nicely.
     WARNING: If you create a backup set pointing to the root of a USB drive it erases the whole drive. If this is not what you want, point the BUSet to a directory on the drive. Retrospect WILL relabel the drive volume to the BUSet member name. This volume name is most important: since you cannot predict the drive letter when mounting a USB drive, Retrospect uses the volume name to track the drives.
     SETUP
     Create a BUSet for each drive, adding the member by pointing to the root of, or a directory on, the drive.
     Create a Proactive script specifying ALL the above BUSets as destinations.
     If all drives are connected, consecutive Proactive backups will cycle round the BUSets, and Retrospect attempts to maintain this cycle. If a specific USB drive is not available it uses any other BUSet drive that is. I suspect that if there is a choice it will use the media with the oldest backup, to bring it up to date.
     OPERATION
     When swapping out BUSet drives, open the Proactive monitoring tab and select Backup Sets.
     Dismount drive(s) using "Safely Remove Hardware".
     Mount the new drives and give Retrospect a few minutes to poll the hardware.
     Check that the appropriate BUSets show Ready or Media.
     Leave Proactive Backup to do its job. Welcome to painless backup.
     See the white paper "Backing Up to External Hard Drives" in the knowledge base. For detail of the discussion mentioned above see http://forums.dantz.com/showpost.php?post/137475/
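     The volume-label trick is easy to reproduce in a script if you ever need to locate one of these drives yourself, for example to check it before a swap. A rough Python sketch for Windows; the label names are just whatever your BUSet members are called, and none of this is anything Retrospect provides.
[code]
# Sketch: find a removable drive by its volume label rather than its drive
# letter, which is what makes the BUSet-per-drive scheme workable.
# Windows only; the labels below are examples.
import ctypes
import os
import string

def volume_label(root):
    buf = ctypes.create_unicode_buffer(261)
    ok = ctypes.windll.kernel32.GetVolumeInformationW(
        root, buf, len(buf), None, None, None, None, 0)
    return buf.value if ok else ""

def find_drive_by_label(label):
    for letter in string.ascii_uppercase:
        root = f"{letter}:\\"
        if not os.path.exists(root):
            continue
        if volume_label(root).lower() == label.lower():
            return root
    return None

for buset_member in ("Romeo", "Sierra"):      # your BUSet member names
    print(buset_member, "->", find_drive_by_label(buset_member) or "not mounted")
[/code]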
  15. Hi Katie, I suspect you may have a permissions issue. It sounds like you are backing up clients from a server; you don't mention whether you have a domain or not. Retrospect runs on the backup server either under the System account or as a specific user, as determined in Config > Prefs > Security, so I suspect that this user may not have permissions on your clients. If you have a domain, I suggest creating a special user for Retrospect and giving it at least backup operator permissions, if not admin. If there is no domain, you may need to use a common machine administrator account, or create a common account on all the clients. See the manual, Ch 1 > Creating a Retrospect User Account and Ch 9 > Administration > Preferences > Security, for more on this. HTH
  16. If all the copies and compares can be completed within your backup window, without interfering with any other production processes, do you really care? If, however, you are squeezed for window, and need to get all the copying done IN the window and would then like to get the compares done OUTSIDE the window, you have a couple of options.
     Full compare with the original data:
     1. On your backup script, switch OFF Verify in the script options. (Optionally specify a specific execution unit number. Bear with me...)
     2. Create a new Verify script, select the relevant backup set, and schedule the job to start a few minutes after the backup script starts and before it finishes. (Optionally specify the SAME execution unit number as the backup.)
     If you specify the execution units as above, the verify will try to run while the backup is running but CANNOT, because the execution unit is occupied. The verify will then wait until the execution unit becomes free when the backup finishes, and will run immediately after. Of course you could also just schedule the verify to run at any convenient time. A full verify will still cause the original data to be read, which may interfere with production processes, in which case use the following approach (my personal favourite).
     MD5 hash verify as a separate job:
     Check that MD5 checksums are enabled under Config > Prefs > Media > Verification.
     Disable Verify in your backup script.
     Run your verification as a separate job using either method described above.
     The advantage of this is that your source data is not read, and thus production processes are unaffected. This technique lets you use your full overnight window for backup; verification can then take place during the day without affecting production.
     See the manual, Chapter 5 > Scripted Verification, for more details, or Help > Automated Operations > Scripted Verify. See also http://forums.dantz.com/showtopic.php?tid/32369/post/132228/hl//fromsearch/1/#132228 for a similar discussion involving tapes and getting them to eject properly. Good luck
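     The execution-unit trick works because jobs pinned to the same unit queue up and run one after another. A toy Python model of that behaviour; this is not Retrospect's scheduler, just the idea.
[code]
# Toy model of the execution-unit trick: jobs assigned to the same unit are
# serialized, so a verify pinned to the backup's unit starts only after the
# backup finishes. Jobs on other units run independently.
from collections import defaultdict

jobs = [
    ("Nightly backup",  "unit 2"),
    ("Scripted verify", "unit 2"),   # same unit -> waits for the backup
    ("Other script",    "unit 1"),   # different unit -> runs in parallel
]

queues = defaultdict(list)
for name, unit in jobs:
    queues[unit].append(name)

for unit, queued in sorted(queues.items()):
    print(f"{unit}: runs in order -> {' then '.join(queued)}")
[/code]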
  17. jgowing

    Files do not compare

     I am afraid I am also experiencing this problem, on a trial copy of 7.7 for Windows we are running as a test at a client. I am a great fan of the MD5 digest verification and have used it seamlessly at several of my customers; this is my first experience of this problem. I get a simple statement that Path\file.jpg didn't compare. On the same server, during the same run, I am also getting:
     An error occurred during the verification step. The MD5 digest for the file "C:\DATA\Home old do not use\benita\My Documents\nero.nrg" did not match, error -1129 (MD5 digest mismatch)
     and
     necoIncoming: empty stream packet, tid 34 or 33
     The client is MS SBS 2003 SP2, and Retrospect is on a separate server. Has anyone else had this bug reappearing?
  18. If I set grooming to nn backups, then presumably at the first grooming the backup to be deleted will be the original first full backup. Will the grooming process recreate a new synthetic full at the date of the oldest backup to be retained? Also, if it does that, then effectively what the grooming process actually does is recreate a synthetic full at the grooming point. Is this correct?
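     For what it's worth, here is how I picture it; a toy model only, not a statement of what the engine actually does. Because each snapshot references the file versions it needs, retaining the last nn snapshots and dropping any file version no retained snapshot references would be equivalent to leaving a synthetic full at the grooming point.
[code]
# Toy model of retain-last-N grooming. Each snapshot lists the file versions
# it needs; grooming keeps the newest N snapshots and discards only data that
# no retained snapshot references. This is a mental model, not Retrospect code.

backups = [
    {"name": "full Mon", "refs": {"a_v1", "b_v1", "c_v1"}},
    {"name": "incr Tue", "refs": {"a_v1", "b_v2", "c_v1"}},
    {"name": "incr Wed", "refs": {"a_v2", "b_v2", "c_v1"}},
]

KEEP = 2  # groom to the last 2 backups
retained = backups[-KEEP:]
still_needed = set().union(*(b["refs"] for b in retained))

print("oldest retained backup:", retained[0]["name"])
print("it can still restore:", sorted(retained[0]["refs"]))   # a full point-in-time set
print("data groomed away:", sorted(backups[0]["refs"] - still_needed))
[/code]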
  19. jgowing

    remotely install client

     As far as I know, Retrospect does not have a "push client" facility as found in some other products. The client is a standard executable which needs to be run on the client. You can generate private keys instead of using a password, and you can then use SMS or a group policy as described in the manual under Networked Clients > Installing Clients. Once installed, there IS a push update facility, accessible from the Configure > Clients function.
  20. I haven't used Open File Backup extensively, but it can require a bit of tuning. It needs a reasonable amount of free space on the volume to allow for the snapshots, and there is also a "disk inactivity threshold" which you may need to adjust. Check the tips in the manual under "Working with Open Files", and see these KB articles; the first has a lengthy discussion of open file issues: http://kb.dantz.com/display/2n/articleDirect/index.asp?aid=5595&r=0.1422235 http://kb.dantz.com/display/2n/articleDirect/index.asp?aid=6388&r=0.8234064 Also on the support site is a comprehensive list of errors with links to some solutions for them: http://kb.dantz.com/display/2n/_index1.asp?tab=opt2&r=0.9851583 Hope this helps
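     Since free space for the snapshot is the usual culprit, a quick check can save time before digging into the other settings. A trivial sketch; the 15% threshold is my own arbitrary example, not a Retrospect requirement, so tune it to your volume sizes.
[code]
# Sketch: warn when a volume looks too full for Open File Backup's snapshot.
# The 15% threshold is an arbitrary example, not a Retrospect requirement.
import shutil

def check_free(path, min_free_fraction=0.15):
    usage = shutil.disk_usage(path)
    free_fraction = usage.free / usage.total
    status = "OK" if free_fraction >= min_free_fraction else "LOW - snapshot may fail"
    print(f"{path}: {free_fraction:.0%} free ({usage.free // 2**30} GiB) -> {status}")

for volume in ("C:\\", "D:\\"):   # volumes you back up with Open File Backup
    check_free(volume)
[/code]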
  21. Copying the volume is fine but inefficient: it involves copying the entire volume, there is no record or log, and there is no tracking of the target backups. If you use the Transfer Snapshots tool you gain the benefit of Retrospect's Progressive Backup. It will add the most recent snapshots from your primary disk backup sets to your removable drives, and each time a removable comes round for refresh only the latest changes will be added, speeding up the process. You also get records of everything in the logs, and the removable-disk backup sets are tracked in Retrospect, all of which helps with managing the whole thing. All this is described in detail in the white paper mentioned above; although it discusses tape as the target, the principle works exactly the same with removable disks. I have implemented this and it really works well.
  22. I am guessing that you are trying to make a "backup" of a volume containing the Retrospect disk backup sets. It is reasonable to expect backup apps to protect themselves from a situation where backing up the backup server attempts to back up the very backup sets to which the backup is being written, if you follow my drift. If you want to make a second copy of the backup sets, it is better to use the tools provided: from the Tools menu you can transfer entire backup sets, or snapshots. These tools are the foundation of the staged backup facility, where you back up to disk and then transfer backups to another medium, usually removable for off-site storage, though it could also just be a second disk. Retrospect has a number of features and strategies to facilitate doing this and to make it fully automatic and self-regulating. Check out the manual chapter on Management > Backup Strategies, check out Backup Set and Snapshot Transfers, and Grooming, and see the white paper "Backup to Disk to Tape" for strategies for two-stage backups: http://kb.dantz.com/display/2n/kb/article.asp?aid=8020&n=2&s= If I missed the mark here, perhaps you could explain a little more about what you are trying to achieve.
  23. This was back in December last year, so I cannot remember exactly whether it was the scanning or the matching; it was probably the scanning. The file system in question is pretty dense (a large number of files on a small volume), which is a challenge for any backup. But yes, I know the performance was awful, and we eventually discovered that the problem was in the motherboard's onboard RAID controller. Once we (eventually, after 5 days) got a full backup of the server, we rebuilt it with an Adaptec RAID card, which improved the RAID performance by about 7 times. Since then we have divided the volume up into 6 subvolumes, and with the improved performance backups are now going fine. That said, at the time the issues highlighted the way Retrospect works under the skin, and once that is understood you can design and tune your config in a better way, so I thought I would share my findings.
  24. I recently had Retrospect Support clarify some subtleties of how Retrospect works under the skin with respect to partial backups that exclude data using selectors, particularly Retrospect's apparent behaviour of restoring stuff that was not backed up. Support's comments are marked "Support:".
     Scenario: you have a full backup of a volume in BUSet-Full, and you take a backup of the same volume, excluding some data with a selector, to BUSet-Partial.
     Q: For BUSet-Partial, will the Snapshot reflect the complete volume or the partial backup according to the selector? (I suspect it will be the full volume.)
     Support: The snapshot for the partial backup will show all of the folders on the drive, even the ones that have been excluded (there will be no data in the folders), and if you do a full volume restore it will restore those folders.
     Q: Confirm that the Session will contain only the partial data. (I suspect it will.)
     Support: The session data will show only what was copied for a given date.
     Q: If the Snapshot reflects the full volume, and the Session contains only the partial data, and I do a restore based on the snapshot in BUSet-Partial, will Retrospect attempt to find the files missing from the session in BUSet-Partial by looking in BUSet-Full? (I suspect it does.)
     Support: It will not try to restore the missing data from BUSet-Full; it will restore the empty folders from the snapshot (this is only if you do a full volume restore).
     Q: When I run the partial backup, it appears that Retrospect scans the entire volume first, THEN applies the selector to copy the partial data to the session. Is this the case? (I suspect it is.)
     Support: This is the expected behavior; it has to do a full scan of the drive first to see what is on the drive before it can apply the includes or excludes of the selector.
     Q: If so, is there a way of preventing this? It is particularly a problem in the following scenario: I have a poorly performing server with 2 million files, and a full scan takes over 24 hours. I want to back up a small portion of the data, so I set up a selector to exclude most of the data. When I run the backup the scan STILL takes 24 hours, and then the actual file copy is quick for the small amount of data selected.
     Support: Unfortunately there is no way of preventing this. I can put in a feature request to improve the selector and scanning process; I cannot guarantee anything, however, as that would require a major change in the program in how we scan the drive and determine what to exclude or include.
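     To make the support answers concrete, here is a small toy model of that behaviour (my own illustration, not Retrospect code): the snapshot keeps the full folder tree, the session holds only the selected files, and a full-volume restore from the partial set recreates the excluded folders empty.
[code]
# Toy model of the support answers above: a partial backup's snapshot records
# the whole folder tree, its session holds only the selected files, and a full
# volume restore recreates excluded folders empty. Illustration only.

volume = {
    "Data/keep/report.doc": "v1",
    "Data/skip/huge.iso":   "v1",   # excluded by the selector
}
selector = lambda path: not path.startswith("Data/skip/")

snapshot_folders = {p.rsplit("/", 1)[0] for p in volume}          # every folder
session_files = {p: v for p, v in volume.items() if selector(p)}  # selected files only

# Full volume restore from the partial backup set:
restored_folders = snapshot_folders                 # includes the excluded folder...
restored_files = session_files                      # ...but none of its files

print(sorted(restored_folders))   # ['Data/keep', 'Data/skip']
print(sorted(restored_files))     # ['Data/keep/report.doc']
[/code]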
  25. jgowing

    Backup Sets & Removable Drives

     Experimenting further with the two-BUSet, two-drive approach, I noticed the following (note I am using two flash drives to simulate the USB hard drives). When creating the backup sets: specify the root of the drive, and Retrospect warns and then overwrites the whole drive, setting the volume label to the backup set member name; specify a directory on the drive, and Retrospect creates the BUSet structure in the directory specified and sets the volume name to the BUSet member name. I seem to remember reading somewhere that this is the clever bit: recognising a USB drive, and the fact that, being removable, its drive letter is unpredictable, Retrospect labels the volume for the BUSet member so it always knows which BUSet the member belongs to, even if the drive letter changes.