Everything posted by jgowing

  1. Have you seen this - Microsoft suspends Windows 10 update, citing data loss reports
  2. Firstly, I suggest using the Tick Button to check your Selector against a particular source; it will show what gets included and excluded. Secondly, remember that when Retrospect takes a backup, it takes a Snapshot of the ENTIRE target to record its state at that time. Thus the Snapshot will show ALL the files on the target, while the Session will show ONLY the files that were backed up during that particular session. This difference between a Snapshot and a Session catches many people out. The Snapshot is a necessary part of Retrospect's Progressive Backup technique: it is what makes it possible to restore a target to the state it was in at the time of any backup. A thorough understanding of the Progressive Backup process is fundamental to using Retrospect effectively. Progressive Backup & Snapshots
  3. Probably not. Whilst the file system might be BTRFS, Synology sets up its disks using the Linux volume manager (LVM), which would not be recognised on a Windows machine. Your disk will have 2 partitions: 1 Synology DSM system partition and 1 data partition, usually called Volume1. However, if you were to boot a machine into a Linux derivative and mount the drive, it may well be visible. For this to work your disk must be mounted on the motherboard and be supported and visible in the BIOS (USB won't work). I would recommend creating a PartedMagic boot disk ( partedmagic.com ) and booting your Windows machine with it (it won't touch your Windows installation). Use the disk tools in PartedMagic to identify and mount the disk. With luck your data may be recoverable
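As a rough illustration of the Linux-derivative route above, the mount sequence on a generic live environment looks something like the following. This is only a sketch: the device and volume names (/dev/vg1000/lv, /mnt/syno) are assumptions that vary between Synology models, which typically layer mdadm RAID and LVM even on single-disk units.

```shell
# Sketch only -- device and volume-group names are assumptions, not a recipe.
# Synology data partitions normally sit on an mdadm array with LVM on top.
sudo mdadm --assemble --scan          # assemble the Synology RAID set(s)
sudo vgchange -ay                     # activate any LVM volume groups found
lsblk -f                              # find the data volume (often vg1000/lv, BTRFS)
sudo mkdir -p /mnt/syno
sudo mount -o ro /dev/vg1000/lv /mnt/syno   # read-only, to protect the data
ls /mnt/syno                          # the NAS shares should appear here
```

Mounting read-only (`-o ro`) is the important part: it ensures nothing can write to the data you are trying to rescue.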
  4. My script has only the Retrospect R:\ drive as the source. ( I have a separate Script and Backupset for the C:\ System drive )
  5. Retrospect 9.5 Server on Win7 Pro. Retrospect is installed on its own drive (best practice), not the C:\ system drive, in the folder "Program Files". There are also folders for Retrospect Catalogs, Reports, and Resources, plus a bunch of other assorted folders. I have a script to back up the Retrospect server volume with a Selector which includes everything except the other assorted folders. I have just discovered that the "Program Files" folder is never backed up. I tried variations of the Selector, explicitly including "Program Files" and including the Windows Special Folders "Program Files", but no matter what I try it will not back up that folder. I suspect that Retrospect recognises "Program Files" as being a special Windows folder, and the Windows Special Folders Selector expects those folders to be on the system drive. This could be a major issue, as many server applications are installed on non-system drives in a "Program Files" folder, which could prevent Retrospect from properly backing up in these situations. Any thoughts or ideas?
  6. Recovering the Catalogs is amazingly simple. Once you have your new Retrospect up, just drag and drop the catalog files onto the Retrospect window. It will automatically register them and create the Backup Sets for you. For the future, it is good practice to make a duplication job to copy the \all users\retrospect directory and the catalogs to some safe place, at least on another machine. I normally just overwrite them every day, so you have one good copy of that data handy for a DR scenario. See the user guide, chapter Management > Moving Retrospect, for tips on what needs to be copied. Then, once you have a basic server, rebuilding Retrospect is a 10-minute job: copy the config.dat, restart, drop in the catalog files, and start restoring clients' data. Cheers
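As a concrete sketch of such a duplication job, here is a tiny shell version of the "overwrite one safe copy each day" idea. The paths and file name are stand-ins (the demo uses temporary directories); in practice the source would be your Retrospect catalog folder and the destination a share on another machine.

```shell
set -e
# Demo stand-ins: in real use, SRC would be the Retrospect catalog folder
# and DST a safe location on another machine.
SRC=$(mktemp -d)
DST=$(mktemp -d)
echo "demo catalog" > "$SRC/MyBackupSet.rbc"   # fake catalog file for the demo
# Overwrite yesterday's copy with today's, keeping one good copy handy:
cp -r "$SRC/." "$DST/"
ls "$DST"
```

Overwriting in place keeps exactly one current copy, which is all a DR rebuild needs.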
  7. I found the "Proper" Release notes here on the Dantz KB http://kb.dantz.com/display/2n/kb/article.asp?aid=9729.
  8. Thought I'd share this little problem I had today. The error occurs in two scenarios: one during a compare, possibly involving large files over 1 GB (Retrospect catalog files specifically); the other as a VSS snapshot is being taken of a volume prior to Open File Backup. The logs look like this:
Normal backup using at 8/26/2010 6:00 PM (Execution unit 2) To Backup Set ... T-8: VssWSnapVolume: DoSnapshotSet failed, winerr -2147024882, error -1018 Can't use Open File Backup option for on server01, error -1018 (not enough memory)
Normal backup using at 8/30/2010 6:00 PM (Execution unit 2) To Backup Set ... T-8: VssWSnapVolume: DoSnapshotSet failed, winerr -2147024882, error -1018 Can't use Open File Backup option for on server, error -1018 (not enough memory)
Further investigation revealed that the first job was still doing its verification when the snapshot for the second one fired off. This overwhelmed the memory in the server, resulting in the errors. Solution: for a given server, make sure there are no other Retrospect tasks running at the same time a snapshot for Open File Backup needs to be taken.
  9. No prob, you're welcome. Of course, if you were to run a thorough verify at a later stage, you do run the risk of some source data changing and thus giving an error. However, since the MD5 hashes are taken at the time of backup, that problem won't occur with a later MD5 verify. The other cool thing with scripted verify, which I forgot to mention, is that Retrospect is clever and knows what it has already verified, so you can just run a verify whenever, and Retrospect will only verify new stuff. Modern tapes, especially LTO, are pretty robust, but the "dailies" do tend to get a bit more wear and tear, so it is a good idea to cycle new ones into your high-use Backup Sets (the "New Member" function in Retrospect).
  10. For my final solution, which has now been running beautifully for several months, see http://forums.dantz.com/showpost.php?post/142042/ Thanks for the help
  11. I wanted to do pretty much exactly this with 2 drives. I eventually figured it out. I have posted my solution as a separate post see http://forums.dantz.com/showpost.php?post/142042/
  12. For a simple, fully automatic backup with minimal operator intervention, I thought this would be desirable. After much experimentation, and some discussion, I got the following setup to work nicely.
[color:red]WARNING ======[/color] If you create a backup set pointing to the root of a USB drive, it erases the whole drive. If this is not what you want, point the BUSet to a directory on the drive. Retrospect WILL relabel the drive volume to the BUSet member name. This volume name is most important: since you cannot predict the drive letter when mounting a USB drive, Retrospect uses the volume name to track the drives.
SETUP Create a BUSet for each drive, adding the member by pointing to the root of, or a directory on, the drive. Create a ProActive script specifying ALL the above BUSets as destinations. If all drives are connected, consecutive ProActive backups will cycle round the BUSets, and RS attempts to maintain this cycle. If a specific USB drive is not available, it uses any other BUSet drive which is. I suspect that if there is a choice, it will use the media with the oldest backup, to bring it up to date.
OPERATION When swapping out BUSet drives: open the ProActive monitoring tab and select Backup Sets. Dismount the drive(s) using "Safely Remove Hardware". Mount the new drives. Give Retrospect a few minutes to poll the hardware. Check that the appropriate BUSets show Ready or Media. Leave ProActive Backup to do its job. Welcome to painless backup.
See the white paper "Backing Up to External Hard Drives" in the knowledge base. For detail of the discussion mentioned above, see http://forums.dantz.com/showpost.php?post/137475/
  13. Hi Katie, I suspect you may have a permissions issue. It sounds like you are backing up clients from a server. You don't mention whether you have a domain or not. Retrospect runs on the backup server either under the System account or as a specific user, as determined in Config > Prefs > Security. So I suspect that this user may not have permissions on your clients. If you have a domain, I suggest creating a special user for Retrospect and giving it at least Backup Operator permissions, if not admin. If there is no domain, you may need to use a common machine administrator account, or create a common account on all the clients. See the manual, Ch 1 > Creating a Retrospect User Account, and Ch 9 > Administration > Preferences > Security, for more on this. HTH
  14. If all the copies and compares can be completed within your backup window, without interfering with any other production processes, do you really care? If however you are squeezed for window, and need to get all the copying done IN the window, and would then like to get the compares done OUTSIDE the window, you have a couple of options.
1. Full compare with original data: On your backup script, switch OFF Verify in the script options. (Optionally specify a specific execution engine number. Bear with me...)
2. Create a new Verify script, select the relevant Backup Set, and schedule the job to start a few minutes after the backup script starts and before it finishes. (Optionally specify the SAME execution engine number as the backup.) If you specify the execution engines as above, the verify will try to run while the backup is running but CANNOT, because the execution engine is occupied. The verify will then wait until the execution engine becomes free when the backup finishes, and will run immediately after. Of course you could also just schedule the verify to run at any convenient time. A full verify will still cause the original data to be read, which may interfere with production processes, in which case use the following approach (my personal favourite).
MD5 hash verify as a separate job: Check that MD5 checksums are enabled (Config > Prefs > Media > Verification). Disable Verify in your backup script. Run your verification in a separate job using either method described above. The advantage of this is that your source data is not read, and thus production processes are unaffected. Using this technique allows you to use your full overnight window for backup. Verification can then take place during the day without affecting production.
See the manual, Chapter 5 > Scripted Verification, for more details, or Help > Automated Operations > Scripted Verify. See also http://forums.dantz.com/showtopic.php?tid/32369/post/132228/hl//fromsearch/1/#132228 for a similar discussion involving tapes and getting them to eject properly. Good luck
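The reason the MD5 approach leaves the source untouched can be sketched in a few lines of shell: the hash is recorded at backup time, so a later verify compares only the stored backup copy against that hash, and changes to the source in the meantime don't matter. File names here are demo stand-ins, not anything Retrospect actually writes.

```shell
set -e
WORK=$(mktemp -d)
echo "payroll data" > "$WORK/source.txt"            # production file
cp "$WORK/source.txt" "$WORK/backup.txt"            # the "backup" copy
# Hash recorded at backup time:
md5sum "$WORK/backup.txt" | awk '{print $1}' > "$WORK/stored.md5"
echo "payroll data v2" > "$WORK/source.txt"         # source changes later -- irrelevant
# Later, the scripted verify reads ONLY the backup copy, never the source:
now=$(md5sum "$WORK/backup.txt" | awk '{print $1}')
[ "$now" = "$(cat "$WORK/stored.md5")" ] && echo "verify OK"
```

This is why a deferred MD5 verify never produces the spurious mismatches that a deferred full compare against live source data can.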
  15. I am afraid I am also experiencing this problem on a trial copy of 7.7 for Windows we are running as a test at a client. I am a great fan of the MD5 digest verification, and have used it seamlessly at several of my customers. This is my first experience of this problem. I get a simple statement that Path\file.jpg didn't compare. On the same server, during the same run, I am also getting:
An error occurred during the verification step. The MD5 digest for the file "C:\DATA\Home old do not use\benita\My Documents\nero.nrg" did not match, error -1129 (MD5 digest mismatch)
And: necoIncoming: empty stream packet, tid 34 or 33
The client is MS SBS 2003 SP2, and Retrospect is on a separate server. Has anyone else had this bug reappearing?
  16. As far as I know Retrospect does not have a "Push Client" facility as found in some other products. The client is a standard executable which needs to be run on the client. You can generate private keys instead of using a password, and you can then use SMS or a group policy as described in the manual under Networked Clients > Installing Clients. Once installed there IS a push update facility accessible from the Configure > Clients function.
  17. I haven't used Open File Backup extensively, but it can require a bit of tuning. It requires a reasonable amount of free space on the volume to allow for the snapshots, and there is also a "Disk Inactivity" threshold which you may need to adjust. Check the tips in the manual, "Working with Open Files". See these KB articles; the first has a lengthy discussion on open file issues: http://kb.dantz.com/display/2n/articleDirect/index.asp?aid=5595&r=0.1422235 http://kb.dantz.com/display/2n/articleDirect/index.asp?aid=6388&r=0.8234064 Also on the support site is a comprehensive list of errors with links to some solutions for them: http://kb.dantz.com/display/2n/_index1.asp?tab=opt2&r=0.9851583 Hope this helps
  18. Copying the volume is fine but inefficient: it involves copying the entire volume, there is no record or log, and no tracking of the target backups. If you use the Transfer Snapshot tool, you will gain the benefit of Retrospect's Progressive Backup. It will add the most recent Snapshots from your primary disk Backup Sets to your removable drives. Each time a removable comes round for refresh, only the latest changes will be added, speeding up the process. You also get records of everything in the logs, and the removable-disk Backup Sets are tracked in Retrospect, which all helps with managing the whole thing. All this is described in detail in the white paper mentioned above. Although it discusses tape as the target, the principle works exactly the same with removable disks. I have implemented this and it really works well.
  19. I am guessing that you are trying to make a "backup" of a volume containing the Retrospect disk Backup Sets. It is reasonable to expect backup apps to protect themselves from a situation where backing up the backup server attempts to back up the Backup Sets to which the backup is being written, if you follow my drift. If you want to make a second copy of the Backup Sets, it is better to use the tools provided. From the Tools menu you can transfer entire Backup Sets, or Snapshots. These tools are the foundation of the staged backup facility, where you back up to disk and then transfer backups to other media, usually removable for off-site storage, though it could also just be a second disk. Retrospect has a number of features and strategies to facilitate doing this and to make it fully automatic and self-regulating. Check out the manual chapter Management > Backup Strategies, and check out Backup Set & Snapshot Transfers, and Grooming. See the white paper "Backup to Disk to Tape" for strategies for two-stage backups: http://kb.dantz.com/display/2n/kb/article.asp?aid=8020&n=2&s= If I missed the mark here, perhaps you could explain a little more what you are trying to achieve.
  20. If I set grooming to nn backups, presumably at the first grooming the backup to be deleted will be the original first full backup. Will the grooming process recreate a new synthetic full at the date of the oldest backup to be retained? Also, if it does that, then effectively what the grooming process actually does is recreate a synthetic full at the grooming point. Is this correct?
  21. This was back in Dec last year, so I cannot remember exactly whether it was the scanning or the matching; it was probably the scanning. The file system in question is pretty dense (a large number of files on a small volume), which is a challenge for any backup. But yes, I know the performance was awful, and we eventually discovered that the problem was in the motherboard's onboard RAID controller. Once we (eventually, after 5 days) got a full backup of the server, we rebuilt it with an Adaptec RAID card, which improved the RAID performance by about 7 times. Since then we have divided the volume up into 6 subvolumes, and with the improved performance backups are now going fine. That said, at the time the issues highlighted the way Retrospect works under the skin, which, once understood, allows you to design and tune your config in a better way. So I thought I would share my findings.
  22. Experimenting further with the 2-BUSet, 2-drive approach, I noticed the following (note I am using 2 flash drives to simulate the USB hard drives). When creating the Backup Sets: specify the root of the drive, and RS warns and then overwrites the whole drive, setting the volume label to the Backup Set member name; specify a directory on the drive, and RS creates the BUSet structure in the directory specified and sets the volume name to the BUSet member name. I seem to remember reading somewhere that this is the clever bit. Recognising a USB drive, and the fact that, being removable, the drive letter is unpredictable, RS labels the volume for the BUSet member so it always knows which BUSet the member belongs to, even if the drive letter changes.
  23. I use a single ProActive script with both disk backupsets as destinations.
  24. No, should I have? I tried again, stopping ProActive backup, but it did not make any difference. In fact it was worse, in that this time it did not pop up the Media Request window; the status of the source in ProActive monitoring just showed "Media". So it looks like the correct approach is individual Backup Sets on individual drives. That indicates that the logic to choose whichever backup target resource is available resides in the script processing rather than the Backup Set.