pmcdonald

Members
  • Content count

    18
  • Joined

  • Last visited

  • Days Won

    2

pmcdonald last won the day on April 15 2013

pmcdonald had the most liked content!

Community Reputation

4 Neutral

About pmcdonald

  • Rank
    Member
  1. pmcdonald

    How to verify a very large backup set

    Hi Lennart, Thank you for the quick response and the solution. I hadn't considered marking the other tapes missing, but it's a good lateral solution, cheers!

    In the event of a volume failure we have a mirrored NAS that can be swapped over with replicated content. If THAT volume failed as well (and both are RAID6, so that would be quite unlucky) I'd cherry-pick the specific priority active jobs we needed to continue working, while slowly restoring the rest later.

    This backup script is typically more for the odd missing file from older projects, and for files that are accidentally deleted or overwritten. One recent example: while restoring a large design project we found a few missing assets that lived in another, now-deleted job. I was able to select those specific files from the catalog, relink and continue working. I think of this backup set as the rings on a tree trunk - it gives us a complete historical copy of every modified and added file over the better part of the last decade, and it has saved us on a few occasions.
  2. One of the scripts I run is a nightly backup to add any new or altered files to a tape set. The backup set now spans about 70 LTO5 tapes, 100TB of data and several thousand snapshots. All well and good so far. The catalog is surprisingly quick to index and compare, too.

    My only issue is that some of the earlier tapes are getting on in age now, and I'd like some way of verifying and retensioning them from time to time to get a heads-up when tape age is starting to become an issue. I'd do this via the 'Verify' job, but it would take about 3 solid 24/7 months to complete, assuming someone was around to babysit it and swap tapes, and the Windows machine didn't restart at any point during that process (it has been known to). We only have one tape deck, so that also ties up the backups for that time.

    Is there any way around this? Can I selectively verify specific tapes rather than files? I don't want to split the catalog up, as some active jobs really do predate this script.
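One alternative to a single marathon verify run is a rolling schedule: check a few tapes per cycle so the whole set is eventually covered without monopolising the deck. The sketch below only illustrates the scheduling arithmetic outside Retrospect; the tape names and the three-per-week rate are hypothetical.

```python
# Sketch of a rolling verification plan: instead of verifying the whole
# set in one marathon run, read a few tapes per week so every tape is
# re-checked within a fixed window. Tape IDs here are hypothetical.

from itertools import islice

def rolling_plan(tape_ids, tapes_per_week):
    """Yield (week_number, [tapes]) batches covering the set incrementally."""
    it = iter(tape_ids)
    week = 1
    while True:
        batch = list(islice(it, tapes_per_week))
        if not batch:
            break
        yield week, batch
        week += 1

tapes = [f"Tape-{n:03d}" for n in range(1, 71)]  # the 70-tape set from the post
plan = list(rolling_plan(tapes, tapes_per_week=3))
print(len(plan), "weeks to cover the whole set")  # ceil(70 / 3) -> 24 weeks
```

At three tapes a week the full 70-tape set is re-read roughly twice a year, and the deck stays free for the nightly backups the rest of the time.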
  3. pmcdonald

    New NAS, trouble recognising duplicate files

    A belated thanks for the reply. I've just thrown the blanket 'Ignore before x date' condition on the script and that seems to work fine. There are a few other backup strategies in place aside from Retrospect, so I'm not too worried about any files falling through the cracks.
  4. Hi all, The situation is this: we've been working off a Synology NAS for the last few years. I have a Retrospect 12 script set to crawl over the NAS nightly and back up new and changed files to LTO. Due to bottlenecking I have added a QNAP NAS to the mix and copied key projects from the Synology NAS to the QNAP. The paths are the same but the volume names are different. Both NASes are on the local network and mapped as shared drives to the PC running Retrospect.

    The issue (unsurprisingly) is that when I add the QNAP drive to the script, Retrospect identifies most or all of the volume as new or changed files. We're talking about 10TB of data here, so I'd like to avoid this if at all possible. Some digging around the forum has found users in a similar situation who solved the problem by unticking the Security options and the Macintosh/Client/Use attribute modification date... option. I've tried some and all of these in varying combinations with no luck.

    Any ideas? I'm thinking I might just have to apply a blanket 'Ignore all files created before x date' rule to the QNAP to only scoop up changed and worked-on files. Cheers, Paul
  5. pmcdonald

    Restart Verify job

    All good BUT our daily backup task goes to backup set A and our archival library is stored on backup set B, which is the one I'm trying to verify. (The reasoning being that if a catalog or tape set becomes irreparably corrupted, we essentially have a duplicate to fall back on.) I think the sequence mentioned above only works if the daily task and the verify job are referencing the same project? I tried the method where I inserted the daily backup tape when the verify job was requesting the next tape, and nothing - the scheduled daily task didn't begin.

    Search time is perfectly acceptable - it takes about 20 seconds to search and display an index of the entire 40-tape backup set. The daily backup is quite slow at about 40 minutes to read our 40TB NAS, but I suspect that's due to the sheer number of files it has to crawl, as opposed to the time taken to cross-reference the existing backup contents.

    I see the logic in smaller backup sets, but often staff without knowledge of a job will need to cast a wider net to find random elements (quotes, stock shots, sound effects, that sort of thing), hence the reluctant necessity for an all-inclusive set.
  6. pmcdonald

    Restart Verify job

    Thanks, good trick. I'll give that a go. The reason for the crazy huge tape count is the backup set stores finished archived jobs. So typically we would only ever ingest one or two jobs spanning a tape or two at a time. At the time it made more sense in my head to put all the archived jobs in one set to maximise tape capacity and ensure we could search the entire archive of work if we were searching for elements that we weren't sure belonged to a specific job. Does that make sense? Afraid not re: the autochanger. See my response above. Running Retrospect 7 on a Windows 7 PC and backing up to a single slot Dell LTO5 deck. I'll try altering the options after I finish my backup, cheers. Thanks for the responses, I appreciate them!
  7. pmcdonald

    Restart Verify job

    I've had a few tape anomalies lately and need to run a verify job to check the media. The problem is the backup set consists of about 40 tapes, each taking half a day to verify. I need to run nightly backups so can't afford the downtime of the month or two the verify job would take to complete. Is there any way to pause or stop a verify job and pick it up where it left off at a later date? I'm guessing no but thought I'd ask anyway... Thanks
  8. pmcdonald

    How to manage swapping drives?

    Ah, that makes sense. Thank you kindly for the reply and explanation Scillonian. NAS 1 was running DSM 3.x and NAS 2 was on DSM 4.3, so that could be the cause. My IT knowledge is limited at best - I fumble through what I need to fumble through and am blissfully ignorant of the rest - so I didn't even consider the possibility that updates would impact file metadata.

    As I type, NAS 2 is dumping huge amounts of data back onto a freshly formatted and reconfigured NAS 1, and no doubt will continue to do so for the next several days, but the LTO backup seems to be happily up and running again using the method I detailed above. So we've got a solid backup in place until the middle of next week, when we'll have two solid backups in place.

    To recap:
      • NAS 1 was primary storage
      • NAS 2 was used as a hot backup, running a scheduled clone backup every few hours
      • Retrospect ran an LTO crawl nightly over NAS 1, adding new and updated files only
      • NAS roles were swapped, so NAS 2 became the primary storage and NAS 1 was wiped, reconfigured in a more efficient array and then used as a hot backup running a clone of NAS 2
      • When the backup shifted from NAS 1 to NAS 2, Retrospect saw the whole volume as new (all 17-odd TB of it)
      • I got around this by adding a date exclusion to the Retrospect script so it ignored data older than the date of the NAS swap above

    All now runs well!
  9. pmcdonald

    How to manage swapping drives?

    Okay, so I'm still not sure what is causing Retrospect to see the copied files as new, but I've gotten around the problem by adding a date variable to the scheduled backup. By telling it to ignore everything older than the most recent NAS 1 backup, it simply backs up only the new material on NAS 2 that has been revised SINCE the copy, if that makes sense. Works for me.
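The date-variable workaround above boils down to a simple filter: only consider files whose modification time falls on or after the cutoff date. Here's a minimal Python sketch of that selection logic, running outside Retrospect; the example path and cutoff date are illustrative only.

```python
# Minimal sketch of the "ignore everything older than X" workaround:
# walk a tree and keep only files modified on or after a cutoff date.
# This mimics the date-condition idea outside Retrospect itself.

import os
from datetime import datetime, timezone

def files_since(root, cutoff):
    """Yield paths under `root` whose mtime is >= cutoff (an aware datetime)."""
    cutoff_ts = cutoff.timestamp()
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) >= cutoff_ts:
                yield path

# Illustrative usage - the share name and swap date are hypothetical:
# for p in files_since(r"\\NAS2\projects", datetime(2014, 6, 1, tzinfo=timezone.utc)):
#     print(p)
```

Anything copied over with its original (pre-cutoff) timestamps is skipped, while files revised since the swap still fall through the filter and get backed up.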
  10. pmcdonald

    How to manage swapping drives?

    I run Retrospect 7.7 on a PC backing up to LTO5. Retrospect is scheduled to perform a nightly incremental backup of a NAS device (let's call it NAS 1). This entails reading the current contents of the NAS, matching it against the files in the backup set, and adding any new or updated files to the backup library. The system works well and means at most I lose a day's work in the event of a catastrophic failure of our NAS.

    I've recently purchased a new, larger-capacity NAS (let's call it NAS 2) that I've copied all the material from NAS 1 onto. NAS 2 is now our working drive and NAS 1 is relegated to being a hot backup. You can never have too many forms of backup, right?

    The only problem I'm encountering is that the backup set sees the copied material on NAS 2 as ALL being different to what lives in the archive, so it wants to try to re-backup the entire drive. If it were only a few hundred GB I wouldn't fuss, but we're talking around 17 TB of data - weeks of backing up and close to a grand of extra media. Is there any trick or setting to get Retrospect to back up just the material updated since the last backup? In the Matching options 'Don't add duplicates to Backup Set' is ticked and usually does the job - just not in this case. 'Match only files in same location' is unticked. The NAS-to-NAS copy did bring over the date and time stamps too, so as far as I can see the metadata of all the files is unchanged since the last backup; only the location is now a new NAS. Thanks for reading
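For what it's worth, the matching behaviour described above ('Don't add duplicates' with 'Match only files in same location' unticked) amounts to treating a file's name, size and modification time as its identity, ignoring which volume it lives on. A minimal sketch of that idea outside Retrospect; the function names are mine, not anything from the product.

```python
# Illustration of duplicate matching by metadata rather than location:
# a file counts as already backed up when its (name, size, mtime) key
# matches an entry in the set, regardless of which volume it lives on.

import os

def match_key(path):
    """Identity used for duplicate detection: (name, size, whole-second mtime)."""
    st = os.stat(path)
    return (os.path.basename(path), st.st_size, int(st.st_mtime))

def is_already_backed_up(path, backed_up_keys):
    """True if a file with the same metadata identity is already in the set."""
    return match_key(path) in backed_up_keys
```

The fragility is in the timestamp component: if the NAS-to-NAS copy shifted mtimes by even a second (some copy tools round or truncate them, or apply a timezone offset), every key changes and the whole volume looks new, which would produce exactly the symptom described.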
  11. Well, uninstalling, restarting and reinstalling Retrospect seemed to do the trick. I left my catalogues, preferences, etc. in their original locations (with backups, of course), so on reinstall Retrospect picked up pretty much where I left off. I guess these occasional errors are just the price of admission with the software.
  12. Can anyone please help me with an error I'm getting? Aside from the occasional nuked tape, Retrospect 7.7 on Windows 7 x64 has been running flawlessly for the last 12 months. However, on Friday it threw up the following error message and then crashed:

    Assertion failure at "elem.cpp-1124"

    I'm attempting to add incremental files to a large backup set of our server. Retrospect is located on a backup machine with an LTO drive hanging off it. The data I'm attempting to back up is located on a separate Synology Diskstation RAID array, using an EXT4 file system (supposedly). I'm not sure what has brought about the error all of a sudden - all the hardware, structures, etc. are unchanged. I'm just keen to fix the bug and get on with protecting my data.

    I've read somewhere that trashing configuration files can help. I've tried removing all the files in the ProgramData/Retrospect folder, but all it does is prompt me for my serial again, ask where to find the catalogs and then crash again. Any ideas? Thanks for reading!

    EDIT: Log file of the error is attached. assert_log.utx
  13. pmcdonald

    Move backed up folders?

    In the previous HP tape backup software I used, you had the option of moving folders in the catalogue once they were backed up. Let's say half a project sat in directory A and half in directory B. I could use this command to consolidate the two directories into one, or move both into a third directory. From then on they would restore in the newly defined structure, instead of how they were initially committed to tape. I know, I know - projects should be backed up in the right structure in the first place, but occasionally the feature did come in handy for revisions, etc. Is there any way of doing this in Retrospect? I'm running Retrospect 7.7 for PC. Cheers
  14. I'm having exactly the same problem. Ticking off as many of the KB suggestions as was possible hasn't solved the problem. My wonderful workaround is to mark the tape as missing and move on, which so far has worked as we run duplicate LTO backups for just this sort of scenario, but I would love to know why Retrospect baulks at manually ejected tapes.
  15. Hello everyone, This morning I ran a restore job to verify a job I backed up late last week. Retrospect listed the media as 'content unrecognized'. I've read the KB article on the error and followed as many steps as I could to get the tape recognised, but no luck. Fortunately we also run a parallel safety backup set, so I have another copy of the media from the lost tape - the only problem is I don't know specifically what files were on there. Is there any way within Retrospect to list which files live on which tape?

    I'm at a bit of a loss to understand what went wrong with the tape. We're running a six-month-old Dell PowerVault LTO5 and have about 30 tapes across two sets. The tapes are stored in a cool, dry safe. The tape in question was functioning fine the Friday prior. It was fresh and purchased only a few weeks ago. The host computer appears sound. Every other tape I have tried since has functioned fine.

    Is there any way to rebuild or at least verify the tape? I think other programs I have used provided that function. I'd like to get to the bottom of what the problem is - I don't sleep well knowing something is destroying our backups. Thanks for reading.