All Activity

  1. Last week
  2. I am out of town for a week, so I will check SMART status when I return. x509
  3. Could you go back to whatever version you were using before you "upgraded" to 16.1?
  4. Surface test only checks "usable" disk space -- it doesn't include bad sectors that have already been mapped out. So it's probable that the "missing" files were on a bad sector that has already been "fixed" by Windows. Or possibly, given the numbers involved, a problem in NTFS's equivalent of the file allocation table (the Master File Table). You can check by viewing the SMART data for the drive, which will tell you how many sectors have already been re-allocated.
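    If smartmontools is installed, that re-allocated count can be read from the command line. A minimal Python sketch, assuming smartctl is on the PATH and the backup drive shows up as /dev/sda (adjust the device name for your system):

```python
# A sketch: read the SMART reallocated-sector count via smartctl
# (from smartmontools). Assumes smartctl is on the PATH and the
# drive is /dev/sda -- adjust the device name for your system.
import subprocess

out = subprocess.run(
    ["smartctl", "-A", "/dev/sda"],  # -A prints the SMART attribute table
    capture_output=True, text=True
).stdout

for line in out.splitlines():
    if "Reallocated_Sector_Ct" in line:
        print(line)  # a non-zero raw value means sectors were remapped
```

    A non-zero, and especially a growing, raw value there supports the bad-sector explanation.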
  5. About 6 weeks ago I started getting -559 network connection timeout errors after about 2 hours while running my weekly "Sat. Backup" full backup of my MacBook Pro (the first of 3 drives plus a Favorite Folder backed up in that script). There had not been any change in any of my software or hardware, so I guessed that at least one of my two Netgear Gbps Ethernet switches was starting to feel its age. I replaced both switches with non-Netgear 100Mbps switches I had lying around, and the problem went away—without slowing anything down except (moderately) the MBP's Compare phase of "Sat. Backup".

    Since NewEgg was having a US$15 sale on TP-Link 8-port Gbps switches (Heavens to Betsy, my home LAN is becoming obsolete because I'm not upgrading it to 10Gbps), I ordered a pair of them. Even though replacing both 100Mbps switches with the Netgear Gbps switches one at a time didn't cause the -559 problem to recur, a week ago Sunday I replaced both switches with the newly-arrived TP-Link Gbps switches. The -559 problem still didn't recur last Saturday, so Sunday night I went into experimental mode and Deleted-Added my MBP with Use Multicast. Both a pre-scheduled "sacrificial" "NoOp Sun.-Fri Backup" script and the "real" "Sun.-Fri Backup" script failed with -530 errors when I booted my Mac Pro "backup server" machine after the time when they were scheduled to run. I therefore Deleted-Added my MBP with Add Source Directly, and have had no further problems.

    IMHO this experience proves that my -530 Bugs 1 and 2 are not caused by a "security improvement" that was made solely in Netgear's Gbps Ethernet switches. Because I started getting -530 Bug 1 immediately after I replaced the failed D-Link 100Mbps switch in my study with a Netgear Gbps switch on 30 January 2017, without any change in my then-current Verizon DSL "gateway" router, my guess is that the "security improvement" was made to several manufacturers' Gbps Ethernet switches. And no doubt there is a contributing factor of Retrospect's implementation of its Multicast feature failing to keep up with the "security improvement".
  6. Update. Surface test showed the drive had zero surface defects. The Verify Media operation on my 2019 Media Library dataset just completed, after almost 3 hours. There were many errors. Here is a sample:

    Next step is to recreate the dataset from files, and then verify again. I also want to verify six other datasets, which I have started and then paused. Those operations will take another five hours in total. However, in about 15 minutes we are leaving for a vacation, so I have to hibernate this system. I'll complete all of this the week after we get back, and I'll post an update. Again, appreciation to Lennart.
  7. So following Lennart's suggestion in message #2 of this thread Massive number of "bad backup set header" messages I started to verify all the datasets on my 2019 backup volume. Retrospect could not find any of them. 😱 But that was because I was trying to verify 2019 datasets on my 2018 backup volume. Here is what happened. I always use Drive G for the Retrospect backup volume. If I need to retrieve files from a different year, I install that year's drive into my system (via docking station) and then use Minitool Partition Wizard to reassign the letter G to that year's backup volume. Works very well, but this time I forgot to switch Drive G back to my 2019 backup volume before doing the dataset verification. Ooops. 😅 x509
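    As an aside, the same letter swap can be scripted with Windows' built-in Set-Partition cmdlet instead of Minitool. A minimal Python sketch, assuming it is run from an elevated prompt and that "T" is a hypothetical placeholder for the letter Windows gave the docked drive (G must be free first, as it is after undocking):

```python
# A sketch: reassign drive letter G to a docked backup drive using the
# built-in Set-Partition PowerShell cmdlet. Run from an elevated prompt.
# "T" is a hypothetical placeholder for the docked drive's current
# letter -- adjust both letters for your system.
import subprocess

subprocess.run([
    "powershell", "-Command",
    "Set-Partition -DriveLetter T -NewDriveLetter G"
], check=True)
```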
  8. Lennart, Thanks. I'm running a disk surface test right now. My backup drive is 6 TB, and the test will need another 8+ hours to complete. I'm going to let the computer run overnight.
  9. Yes, it could, but not likely by itself. It looks like you have a hard disk problem, since some files can't be read properly. Test the hard drive, including a surface scan.

    To test your other datasets, see page 459, "Verifying Backup Set Media": http://download.retrospect.com/docs/win/v16/user_guide/Retrospect_Win_User_Guide-EN.pdf

    As for repairing this dataset: maybe you can. See page 454 in the above user guide: "Recreating a Catalog".
  10. I just tried to retrieve some files from my 2019 Media Library dataset and got all these error messages in the logfile. The logfile was 1375 lines total, and the activity monitor shows 1358 errors, no warnings. I'm running Retrospect Pro 16.1.1.102 on Win 10 Pro 64. My backup drive is internal SATA 6G. What could have caused all these messages? Just the other day, I ran the built-in Windows "defrag" command, and it ran normally. Could this operation have screwed up the dataset? How can I test my other datasets? Can this dataset be repaired, or do I need to create a new 2019 Media Library backup dataset? Fortunately, I can retrieve the files I need from my 2018 backup volume, but these messages have really scared me.
  11. Earlier
  12. OK. The copy finished and I got around to testing it. I can navigate into the copied directory for a Verify, but when I select the data member, it goes straight back to wanting me to Choose Media again, without putting anything useful into the log. Also, very oddly, when the top level directory of the copy is called "NewRetro", I can navigate in the directories inside it, but if I rename it to "Retrospect" (the original name), I can no longer navigate inside it. And when the old copy is renamed from its original name "Retrospect" to "OldRetro", I can navigate into its directories, but a Verify of its members fails in the same way as the Verify of the copy. Looks like I'll need to set up new backups.
  13. DavidHertzberg

    .mkv files and backups in Retrospect v.16

    CherylB, Here's a Retrospect 8 Forum thread from early 2011 that discusses this problem. IIRC Retrospect Mac 9 wasn't released until late in 2011. A key question is whether your user's .mkv files have the .mkv extension on the name of the file. If not, you've got a problem, unless you restrict what is scanned via Favorite Folders etc. as later posts in that thread suggest. BTW, this "Retrospect Mac bug reports" forum is no longer routinely looked at by anyone from Tech Support, so you might as well have posted your problem in the parent "Retrospect 9 or higher for Macintosh" forum. Besides, it seems the problem isn't a Retrospect bug so much as a Retrospect limitation—it can't tell the type of a file that doesn't have an extension on its name.
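    If you want to check whether extension-less Matroska files are the culprit, you can scan for them yourself: Matroska files start with the four-byte EBML magic number 1A 45 DF A3. A minimal Python sketch, where the root path is a hypothetical placeholder for the area you back up:

```python
# A sketch: find Matroska (.mkv-format) files that lack the .mkv
# extension, since Retrospect's rules can only match on the name.
# Matroska files begin with the 4-byte EBML magic 1A 45 DF A3.
import os

EBML_MAGIC = b"\x1a\x45\xdf\xa3"
root = "/Users/Shared/Movies"  # hypothetical path -- point at what you back up

for dirpath, _dirnames, filenames in os.walk(root):
    for name in filenames:
        if name.lower().endswith(".mkv"):
            continue  # already has the extension; a name rule will catch it
        path = os.path.join(dirpath, name)
        try:
            with open(path, "rb") as f:
                if f.read(4) == EBML_MAGIC:
                    print("Matroska file without .mkv extension:", path)
        except OSError:
            pass  # unreadable file; skip it
```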
  14. I am running a ProactiveAI backup on a MacBook Pro (no Parallels installed) and specifically selected in Rules: "User Files and Settings except movies and music." I started this backup at noon and it is backing up .mkv movie files! After waiting a few hours I stopped the backup because the user is leaving the office for the day. Is there another trick to tell Retrospect NOT to back up the .mkv files??
  15. Apologies for the long delay in updating this thread. Among other things, I have been very busy supervising a house extension. I must confess that I never noticed Lennart's reference above to antivirus software, but about two weeks later I found a suggestion in some other forum that made me suspect that Microsoft Windows Defender (which now seems to be called Windows Security) might be the cause of my problems. Within its Ransomware Protection settings there is an option called 'Controlled folder access'. I turned off that option on 29 July and my daily Retrospect backups have been running fine ever since! One strange byproduct of using the 'Controlled folder access' option is that I found my attempts to save a modified Word or Access file under a new name were rejected on the grounds that the file did not exist. Of course the file didn't exist. That error message made no sense at all. I can only assume that Microsoft introduced a bug into its security software via a recent update.
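    For anyone who would rather keep the ransomware protection switched on, Defender can instead whitelist individual programs: the Add-MpPreference cmdlet takes a -ControlledFolderAccessAllowedApplications parameter. A minimal Python sketch, to be run from an elevated prompt; the Retrospect.exe path shown is a hypothetical guess, so check where your copy is actually installed:

```python
# A sketch: allow Retrospect through Controlled folder access rather
# than disabling the feature outright. Run from an elevated prompt.
# The install path below is a hypothetical guess -- verify yours.
import subprocess

retrospect = r"C:\Program Files\Retrospect\Retrospect 16.0\Retrospect.exe"

# Show the current setting (0 = Disabled, 1 = Enabled, 2 = AuditMode).
subprocess.run([
    "powershell", "-Command",
    "(Get-MpPreference).EnableControlledFolderAccess"
], check=True)

# Whitelist the Retrospect executable.
subprocess.run([
    "powershell", "-Command",
    f'Add-MpPreference -ControlledFolderAccessAllowedApplications "{retrospect}"'
], check=True)
```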
  16. I tried doing a partial copy of one of the disk media sets, and it seemed to be accessible from Retrospect. I'm currently doing a full copy of both disk media sets. It's s.l.o.w. I'll let you know how it goes. Thanks for the suggestion. It's still no clearer why two long-functioning media sets decided to "disappear" from Retrospect.
  17. You might still be carrying some cruft over from the "old" directory structure. Instead of what you did, copy (not move) the members from the old to the new directory, creating any required sub-directories by hand as you go. Set all permissions to the same as the newly-created top level Retrospect directory (a sketch of that copy-and-permissions step follows below). Get Retrospect to "Rebuild" the media set, adding members as required, but make sure to save the new catalog in a different location so you don't overwrite the old one. That's quite a lot of work. But it could get you out of a hole if you need to keep the old sets available for restores -- you never said in the OP if Retrospect still had the read-access that restores require. If it does then I wouldn't bother, just move onto the new sets.
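    A minimal Python sketch of that copy-and-permissions step, assuming the NAS is mounted and the paths below (which are hypothetical placeholders) are adjusted to your share:

```python
# A sketch: copy (not move) the old members into the new directory and
# give everything the same mode bits as the newly-created top level
# Retrospect directory. Paths are hypothetical -- adjust for your NAS.
import os
import shutil
import stat

OLD = "/Volumes/NAS/OldRetro"    # old member directory
NEW = "/Volumes/NAS/Retrospect"  # newly created by Retrospect

reference_mode = stat.S_IMODE(os.stat(NEW).st_mode)  # mode to propagate
dest = os.path.join(NEW, "members")

shutil.copytree(OLD, dest, dirs_exist_ok=True)  # Python 3.8+

# Apply the reference permissions to everything we just copied.
os.chmod(dest, reference_mode)
for dirpath, dirnames, filenames in os.walk(dest):
    for entry in dirnames + filenames:
        os.chmod(os.path.join(dirpath, entry), reference_mode)
```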
  18. OK, I tried making a new backup into a directory that Retrospect had newly created on the NAS, and it all worked just fine. I then renamed that directory to the name of the original backup top level directory and moved the backup set directories into it, but still no good. I think I'll abandon the old backup sets and create new ones. All a bit annoying.
  19. Doesn't need much. NASs usually use Windows ACLs for permission control, which don't directly translate to POSIX/OS X permissions. So it's always a "best approximation", can be tighter or looser than expected/intended, and can be interpreted in different ways by different programs (if they aren't using OS X's APIs). I'm not expecting my workaround to work, but it's worth trying before you contact Support -- more data points will help them help you.
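    One quick way to see what that "best approximation" actually looks like from the Mac side is to print the POSIX mode bits the share reports. A minimal Python sketch; the mount point is a hypothetical placeholder:

```python
# A sketch: print the POSIX permissions a mounted NAS share reports,
# to spot entries tighter or looser than intended.
# The mount point is a hypothetical placeholder.
import os
import stat

MOUNT = "/Volumes/NAS/Retrospect"

for dirpath, dirnames, filenames in os.walk(MOUNT):
    for entry in dirnames + filenames:
        path = os.path.join(dirpath, entry)
        st = os.stat(path)
        # e.g. "drwxr-xr-x 501 20 /Volumes/NAS/Retrospect/1-Media Set A"
        print(stat.filemode(st.st_mode), st.st_uid, st.st_gid, path)
```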
  20. Thanks for the suggestion, but from Retrospect>About: "Version 16.1.2 (102)". So no luck there. Nigel, thanks for your suggestion. I haven't had time to try it out yet, but I will. I may also try moving the existing backup members to another directory on the NAS, too. The NAS doesn't have much in the way of permission control. Only registering users with passwords, specifying the name of the directory subtrees that are exported to them and whether they have read or write permission. The NAS feature of the router seems to have been a bit of an "oh, look, there's room for Samba, so let's put it in" effort. It was removed a while back for space reasons, but there were enough user grumbles that they found space for it again.
  21. DavidHertzberg

    AWS virtual tape library

    blm14, What Nigel Smith said. Let me add two pieces of information. First, it is a not-evident fact (see page 225 of the Retrospect Windows 11 User's Guide; last I heard you were still using that version) that Transfer Backup Sets operations can use Selectors. You could use Selectors in running Transfer Backup Sets operations whose source is a regular Amazon S3 bucket and whose destination is a Glacier bucket, specifying a date to separate the "archival" backups from the "recent" backups. But that would still leave you with the "archival" backups also in regular S3. Second, it is a fact that, if you upgraded to at least Retrospect Windows 15, you could use Selectors in Grooming scripts to groom out the "archival" backups from regular S3 after you have migrated them to Glacier. This Knowledge Base article talks about using Selectors in Grooming scripts specifically for satisfying GDPR requirements, which is why the Selector capability was added to Grooming scripts in Retrospect 15. My guess is that you could specify the faster and cheaper (no Amazon download fees) Performance-Optimized Grooming in those scripts, since your grooming Selector would specify a date—not the name of an individual customer having a "right to be forgotten".

    In any case, my impression from this article is that you would have to have an on-premises appliance in order to use an AWS VTL, because that would require some kind of "intermediary" local storage. Since you seem to have an aversion to taking local tapes or disks off-site, how would that differ from Nigel's proposal?

    P.S.: Here's rforgaard's 2016 post on how to set up AWS S3 cloud backup for Retrospect Windows 11. Here's the only-slightly-later KB article; here and here are other applicable KB articles.

    P.P.S.: Just to make it crystal-clear, blm14, I don't think you need an Amazon VTL. Transfer Backup Sets in Retrospect Windows 11 gives you the capability to transfer "archival" backups from a local tape Backup Set directly to Amazon Glacier (after a 1-day stay in regular AWS). Retrospect Windows 15's enhanced Grooming scripts would give you the capability to keep the "recent" backups in regular AWS, without duplicating what you have in Glacier. If you need multiple "conceptual tape" Backup Sets to keep the "recent" vs. "archival" dates straight, you'll probably have to pay for extra hardware even if Retrospect can interface with an Amazon VTL—and I don't know if it can (or needs to).

    P.P.P.S.: Here and here are articles discussing VTLs. The bottom of the second page of the first article discusses a solution that "is available either as a fully-built appliance or as a software component of ***'s Virtual SAN SDS solution that can run on a customer's existing hardware." The STK L700 is a physical tape library originally from StorageTek; my impression is that its instruction set is widely emulated. Again, I don't see why you can't simply run Transfer Backup Sets scripts to copy your "recent" backups to either local HDDs or regular AWS, and then run Transfer Backup Sets scripts to move the "archival" data to Glacier—followed by Grooming it from the "recent" backups.
  22. I had the same problem, and after talking to RS Support I found that there was a bug in an early version of 16.1.2 that was fixed in build 102. This solved my problem, but you state that this is the build you are using, so maybe there is something else amiss. I'd check that you are for sure using 16.1.2 (102) and then check with RS Support if it persists.
  23. Nigel Smith

    AWS virtual tape library

    But S3 would also offer you Standard, Glacier and even Deep Archive. So you could use Retrospect's Cloud Backup or Disk Set(s). I've never used Cloud (someone who has might want to chip in) so I'd probably default to Disk Sets with reasonably small member sizes, generated on-prem then uploaded to S3 Standard once each is "full", then migrated from Standard to Glacier after a year or so (that migration can even be handed to S3 itself -- see the sketch below).

    There are many ways to skin this cat -- how exactly you do it will be prompted by your data, how you manage your backups, how the regulations say you *must* manage them, how much money you're allocated for the job, etc...

    The use-case for VTL I thought of first was "lots of physical tapes which I can put into cloud storage by duplicating through the Gateway", i.e. you need to generate an exact virtual replica of your current physical holdings. If that isn't a regulatory requirement and you can manage the migration using Transfer operations from tape set to Cloud/disk set, VTL support may not be needed at all.
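    A minimal boto3 sketch of that Standard-to-Glacier automation via an S3 lifecycle rule; the bucket name and key prefix are hypothetical placeholders for your own:

```python
# A sketch: have S3 itself migrate objects from Standard to Glacier
# after a year, via a bucket lifecycle rule. Bucket name and prefix
# are hypothetical placeholders. Requires boto3 and AWS credentials.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="my-retrospect-members",          # placeholder bucket
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-after-a-year",
            "Filter": {"Prefix": "disk-sets/"},  # placeholder prefix
            "Status": "Enabled",
            "Transitions": [{
                "Days": 365,                # "after a year or so"
                "StorageClass": "GLACIER",  # or "DEEP_ARCHIVE"
            }],
        }]
    },
)
```

    Note that Retrospect would still need to read those members back for restores, and Glacier retrievals aren't immediate, so this only suits sets you genuinely treat as archival.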
  24. blm14

    AWS virtual tape library

    The short answer is that I am backing up data which for various regulatory reasons must be retained for very long periods (minimum 7 years), and I would like to be able to segment this stuff into "recent" backups that are available more readily, and "archival" data which is stored and housed in Glacier...
  25. Nigel Smith

    AWS virtual tape library

    Why use a VTL instead of doing a "normal" disk backup set to an S3 bucket or similar cloud service? (I can think of some reasons, but would be interested to hear yours.) Regardless, Retrospect isn't listed in Amazon's supported applications table and neither media changer is listed by Retrospect (unless the STK-L700 is close enough to e.g. the SL500 that it'll work), so I think you're still left waiting. Or very much on your own if you try to make something work. As David suggested, try contacting Sales. You may get lucky, and at least you'll flag it as something for them to consider in the future.
  26. DavidHertzberg

    AWS virtual tape library

    blm14, I still suggest contacting Retrospect Sales. Did you do that, as I suggested in the P.P.S. of my preceding post in this thread? Also, as I said in the P.S. of that same post, AFAIK there would be a non-trivial hardware cost; however, a re-listen to the brief mention in the NB video and a Google search suggest that it may be replaced by a virtual appliance in an AWS EC2 instance.
  27. blm14

    AWS virtual tape library

    It's now been over a year and a half, and I was wondering if there had been any updates here. It would be really nice to be able to attach Retrospect to an AWS VTL.