Retrospect doing full backup each night on multiple servers



I am working as a contractor for a company that has tasked me with fixing their backup issues (I am fairly new to Retrospect, although the last 6 days have been a major-league crash course in the app). Among many issues, most of the servers are being backed up completely (rather than just changed files) on a daily basis (I'm told that their scripts are set to do incremental backups during the week with a full over the weekend). Any ideas why this might be happening?


Thanks in advance for any insight into what might be causing this!


Retrospect version 7.5.370 running on Server 2003.

Backing up a number of different clients, ranging from Server 2003 with client 7.5.111 to OS X 10.4 with Mac client 6.0.109.



Among many issues, most of the servers are being backed up completely (rather than just changed files) on a daily basis


What leads you to this conclusion? It may just be, because you are new to Retrospect, that you don't understand its snapshot paradigm: each snapshot presents an apparent "full" backup from which to restore, collecting the information from the first full backup and all subsequent incrementals (to use the terminology of other programs).
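To illustrate the paradigm (a hypothetical model for intuition only, not Retrospect's actual data structures): each backup session stores only new or changed files, yet a snapshot presents the merged view, so every restore point looks "full":

```python
# Hypothetical sketch of the snapshot paradigm: each session stores only
# new/changed files, but a snapshot merges them into a "full" view.
def build_snapshot(sessions):
    """sessions: list of dicts mapping path -> file version, oldest first."""
    snapshot = {}
    for session in sessions:
        snapshot.update(session)  # later versions override earlier ones
    return snapshot

full = {"a.txt": "v1", "b.txt": "v1", "c.txt": "v1"}  # weekend full
mon  = {"b.txt": "v2"}                                # Monday incremental
tue  = {"c.txt": "v3"}                                # Tuesday incremental

# Every file appears in the snapshot, even though the Monday and Tuesday
# sessions each stored only a single changed file.
print(build_snapshot([full, mon, tue]))
```

So a snapshot that *lists* 51,000 files is normal; the question is how many files each session actually *copied*.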




No, I mean all files are being backed up. I don't think it is due to lack of understanding (it might very well be, but the staff, who have used the product for some time, are stumped as well).

For example, I found these facts:

I looked at the session (make that snapshot) information for a server. A new backup set was just created on Friday (which would have been the 'full' backup). During this backup, 51,000 files were backed up. The job that ran last night, Feb 26 at 8:45 pm, was a 'normal' job, which means it was their 'progressive' backup. During this job, 51,000 files were backed up. Either the people here are SUPER busy, or there is something up.

Looking at another server (full backup on Friday, 'normal' last night), a server with 182,000 files has all of the files marked as being backed up last night. Once again, I think I am working with industry legends here!!

Is there something that I am missing (or, for that matter, the full-time staff here)?


Yeah, they're all to the same backup set. They start a new set each weekend, and I'm noticing that, say, on Tuesday, the job still backs up each and every file on the servers.


No erase or no new media.


As far as I know, there is nothing done to the server except being used by users. No other backups done, no nothing.


This is just an aside (but it might show the condition of things in this location): it seems like every night there is something wrong with the backups. At my old job, the only frustration the guy who handled our backups had was the constant requests to restore files. Here, every night it's a new error. One server has a -530 error, so it's rebooted. Another has a -541; again, rebooted, it's fine. Another has a -1019. Another a -519. Most of these end up being 'false positives' and are resolved by the old standby: reboot! Are there that many issues with this application, or is it this location? (If this gets to be too major a question and ends up hijacking the thread, we'll talk about it another time.)




Some side information: the 'full' backup this past Friday took 1d 6 hours. The job that ran last night was on pace to take about 1d 5 hours. Again, this is using the 'normal' backup type after creating the new backup set on Friday.

So, this further suggests that *every* file is being backed up nightly instead of the 'progressive/incremental' backup that should be happening.


I saw some posts in the forums about MD5 and verification in general. Verification is turned off on all jobs (boy, could you imagine what it would be like if verification was on?).


You know what, Fletch, you might not be going mad after all. Since upgrading from 7.0 (Multi Server) to 7.5, I have also noticed our backups taking a huge amount of time (i.e. 8 hours to back up our 200GB fileserver). The backups are never erased, yet it seems to initiate a full backup every week.


I recall reading that when upgrading to Retrospect 7.5 something needs doing to the catalog, so from today I have started recataloging all the disks in our backup set to try and remedy this.


If this fails I'm stumped, as I've checked and double-checked the settings here and nothing has changed (except that upgrading Retrospect turned on a verification level during the upgrade).


I'll try to remember to get back with a progress report, but I suspect the catalog is being corrupted by 7.5 'updating' the 7.0 structure, so that it can no longer be read properly by 7.5 to filter out already-backed-up files.


That's my current guess anyway - can anyone confirm this may be the cause?


edit: we use Symantec Corp. Edition and have done since installing version 7.0 without ANY problems.


I should point out my source server is a SCSI RAID 5 array, my backup media is a fast SATA drive, and the data is pumped across gigabit Ethernet. On larger files I see backup rates of 710MB/min, but being software developers, this throughput suffers greatly when backing up our CVS repository containing many jar/html files. Normally the entire repository is already on the backup set, so little needs updating each night - until now.




Here's an update from my end: in anticipation of the daylight saving time change, we updated all of our servers with the latest Microsoft patches. Since doing so, our backups have gone from taking 18 hours to taking 8 hours. So the backup *time* has gone down. That is a good thing.


However, that doesn't change the fact that every single file on darn near every server we have is being backed up nightly.


Thanks for the info, Richy Boy. I'll talk to the folks here, see when they upgraded to the latest version, and see if I can trace things back.


Ok, a quick update from me also... it didn't work.


I rebuilt the catalog using the only backup set disk for that day (which took about 2.5 hours!) and then set about backing up our file server again. After matching, it said '45' files and about 45MB of nearly 120GB of data matched (!), which is just crazy. The disk should have had MUCH more matching data than that.


It's now sitting here backing up 112GB of data, most of which will match itself anyway: when builds are made of our product, many files remain unchanged, yet duplicated. I don't get it, and neither will our development team when they find CVS on our file server running like a dog again while Retrospect is still chugging away.


I'll look for M$ updates, though, and hope that something magical happens, although I probably run different servers - i.e. the fileserver is Windows 2000 Server and the backup machine is XP Pro SP2.




I think I have read about similar situations in other threads at this forum.


Those of you retro-experts out there, correct me if I am off base; however, I think you might want to check your backup settings for either the script or proactive backup.


In the properties window for all settings for a backup (detailed, not using the wizard) you will see a series of sections within the window: [sources] [destinations] [selecting] [options] [schedule].


Under the options section, make sure you are using "more choices" and look under the "windows" section for the security options. Select security. There are 4 choices: backup file security (servers), backup file security (workstations), backup folder security (servers), backup folder security (workstations).


Check whether you have the flag set for backup file security (servers). For some customers this is necessary; for others it is not. However, it almost always causes every file to be backed up each time any kind of backup is done.


If your customer has individual security settings for specific files within folders (Joe can read this file, but Nancy can both read and write it) instead of the regular "inherits permissions from its folder", then you DO need file security backup. However, if these rules are limited to only a few folders on the entire hard drive, you could create a separate backup for just that set of folders and do a non-file-security backup for everything else.


If your customer has specific tracking needs for which user last updated a particular file (usually changing ownership), then you DO need security backup. I'm not positive whether you need this setting where the customer is trying to enforce a disk-quota system.
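As a rough illustration of why that flag matters (a hypothetical model; the attribute names and matching logic are simplified for illustration, not Retrospect's actual internals): when the security descriptor is included in the match criteria, a file whose ACL no longer matches the catalog copy looks "changed" even though its contents haven't moved, so it gets backed up again:

```python
# Simplified model of incremental matching. Attribute names ("acl" etc.)
# are illustrative only, not Retrospect's real catalog fields.
def needs_backup(file_attrs, catalog_attrs, include_security=False):
    """Return True if the on-disk file no longer matches its catalog entry."""
    keys = ["name", "size", "mtime"]
    if include_security:
        keys.append("acl")  # per-file security descriptor joins the match
    return any(file_attrs.get(k) != catalog_attrs.get(k) for k in keys)

on_disk    = {"name": "report.doc", "size": 1024, "mtime": 1172534700, "acl": "S-new"}
in_catalog = {"name": "report.doc", "size": 1024, "mtime": 1172534700, "acl": "S-old"}

print(needs_backup(on_disk, in_catalog))                         # matches: skip it
print(needs_backup(on_disk, in_catalog, include_security=True))  # ACL differs: back it up
```

If something (an antivirus scan, a permissions sweep, a domain policy refresh) touches the security descriptors nightly, every file fails the match and you get a de facto full backup every night.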


To find out what the system thinks is different between these daily versions, I suggest performing all except the last step of a single-file restore.


Under the restore menu, choose the option "find files"

source = your disk backup or tape, whichever

destination = some hard drive on one of your servers; it doesn't matter, as the restore won't actually happen

searching = the exact spelling of the name of some file that gets backed up every day, any one of the many will do (include = universal -> name = file: [some file name] )


Then click on the [files chosen] button. Normally you see a hierarchy view; right-click for the context menu and choose "sorted view".


You should now see a list of all backed-up "versions" of this file. You can right-click on each one and get its properties. Compare the windows of several versions to see if you can figure out what Retrospect thinks was different.
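Conceptually, comparing those property windows is just a metadata diff between two versions. A tiny sketch (hypothetical property names, not Retrospect's actual fields) of what you're doing by eye:

```python
# Diff the properties of two backed-up versions of the same file;
# whatever survives the diff is what made the file look "changed".
def diff_versions(v1, v2):
    """Return {property: (old, new)} for every property that differs."""
    return {k: (v1.get(k), v2.get(k))
            for k in set(v1) | set(v2)
            if v1.get(k) != v2.get(k)}

# Illustrative property dicts for a Friday and a Monday version.
friday = {"size": 2048, "mtime": "2007-02-23 20:45", "attrs": "A"}
monday = {"size": 2048, "mtime": "2007-02-23 20:45", "attrs": "A", "acl": "changed"}

print(diff_versions(friday, monday))  # only the property that differs remains
```

If size and modify date match across versions but some other attribute differs, that attribute is your culprit.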



This topic is now archived and is closed to further replies.
