

I'm using Retrospect 6.0.193 on a Mac server running OS X 10.3, backing up to a LaCie tape drive on an automated schedule.

 

Apparently the backup is capturing several gigabytes of hidden files as well as the files I need. Is there any way to stop backing up the hidden files?


  • 10 months later...

I'm also having trouble with what I think are "hidden" files. I'm new to a Macintosh environment so I'm sure most of my issues can be resolved with some Mac knowledge.

 

I'm trying to trim down a set of backup scripts that our company runs for backing up users' desktops. Currently, each of the backups (just "Duplicates") contains around 20-30 GB, generally from duplicating each user's home directory. There are many redundant files in these directories, so I'm building Selectors and using them in my script definitions. They are pretty simple, just identifying local Entourage database backups and local Microsoft data (Microsoft User Data\Office 2004 Identities\Main Identity [backed up...) backups.

 

This works great and has reduced the data size dramatically (duh... eliminating duplicate copies of the largest files, the Entourage databases at 2-6 GB each), but I still get file sets around 8 GB on machines where the user data only totals around 4 GB. When I Get Info on a user's directory (on the Mac), it reports 8 GB, but the same information for each of its 4 or 5 subfolders only adds up to around 4 GB. From the Terminal it looks like there are hidden files that I can't see in the Finder, items like .Trash, .DS_Store, etc.
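One quick way to see how much those hidden entries actually weigh is to ask `du` for them directly from the Terminal. This is just a sketch (the path is a placeholder for whichever home directory you are checking), not anything Retrospect-specific:

```shell
# Placeholder path; substitute the home directory you are investigating.
cd /Users/someuser

# List each top-level dot-file/dot-folder with its size in KB, largest first.
# The glob ".??*" matches names starting with "." but skips "." and ".." themselves.
du -sk .??* 2>/dev/null | sort -rn
```

Anything large near the top of that listing (.Trash is a frequent offender) is a candidate for a Selector rule or a cleanup before the backup runs.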

 

Since I'm not a Unix expert either, I don't want to mess with these too much, but I suspect they are contributing to my bloated backups. I tried a file selector with "name starts with" ".", but that didn't eliminate the problem. I also tried the "special folders" "Trash" selector, but that didn't work either.

 

If anyone has any ideas, please let me know.


The ".DS_Store" files are per-folder databases holding the Finder's view settings for that folder (list vs. icon view, sort order, etc.). That may not matter to you. Simply FYI, there are certain bugs in AFP on an Xserve that can be triggered by corrupted .DS_Store files, and there is a discussion on afp548.com about fixing some of those bugs by removing .DS_Store files.
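If you do decide to clear them out, one common approach is a recursive `find`. This is a sketch with a placeholder path; the Finder will quietly recreate .DS_Store files as needed, but the affected folders lose their saved view settings:

```shell
# Placeholder path; point this at the tree you want to clean.
# -type f guards against matching anything but plain files;
# -exec rm is used (rather than -delete) for portability to older finds.
find /Users/someuser -name .DS_Store -type f -exec rm {} \;
```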

 

By the way, "Duplicates" are not "backups"; they are more akin to cloning/copying. A "backup", and the real power of Retrospect, backs up each day only what has changed since the previous backup. Then, through the illusion of "snapshots", it presents the user, for purposes of restore, with the entire state of the drive at the time of each backup by combining the succession of incrementals with the original full backup, plus remembering which files were deleted, etc.

 

All of those files are on the disk for a reason. I bet that the data needs would drop dramatically if you instead did "backups" rather than "duplicates".

 

The real things that you can punt are the browser caches, etc.; the standard Selectors that ship with Retrospect are a good place to start there.

 

Just a thought.

 

Russ


Quote:

By the way, "duplicates" are not "backups"

 


 

Yeah, I understand the difference, even if I didn't make it clear in my original post. I understand the limitations of a duplicate strategy (periodically making a copy of the files and overwriting the last copy with the new one): no rolling back past the single most recent copy, no protection from things like viruses or corrupted files that aren't caught before they are backed up, etc. But I have a reason for my strategy...

 

Quote:

I bet that the data needs would drop dramatically if you instead did "backups" rather than "duplicates".

 


 

The majority of the data I am backing up is encapsulated in a handful of files. For example, a single user may have 4 GB of data that I'm protecting, but 3.8 GB of that is in a single file (the Microsoft Entourage database). So if that single file changes (and it always does between backups), the whole file has to be rewritten. Unless Retrospect performs some form of byte-level differencing and only catalogs the changed bytes, I don't think a Retrospect "backup" will make much of a size difference compared with a "Duplicate." In fact, I suspect the "backup" would actually grow rapidly, because each execution would add another complete copy of that large file (because it has changed).

 

Am I wrong? Would Retrospect just record the byte changes in the 3.8 GB file and store those with the backup set? If so, then I would be glad to have all of the additional protections of maintaining a series of "snapshots."

 

Another strategy might be to save the small collection of large files using a "duplicate" methodology and then save all the other user files with a "backup" methodology.

 

I don't know. I'm still investigating.

 

Quote:

There are certain bugs in AFP on an Xserve that can be triggered by corrupted .DS_Store files, simply FYI, and there is a discussion on afp548.com on fixing some of these bugs by removal of .DS_Store files.

 


 

Thanks. I guess my primary question was more of a Macintosh OS question than a Retrospect one. I'll check out the site and see if I can't figure out where all these extra GBs are coming from. I'm just getting conflicting directory size reports from the Mac's "Get Info" option: the parent says 8 GB, but the total of all child directories and files is only about 4 GB. So the Mac must be counting a large set of hidden files (the ones I can see, I can account for).
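That parent-versus-children gap can be confirmed from the Terminal by comparing the folder's total against its visible contents; whatever is left over is sitting in dot-entries the Finder doesn't show. The path here is a placeholder:

```shell
# Placeholder path; substitute the directory whose Get Info numbers disagree.
dir=/Users/someuser

du -sk "$dir"        # total size in KB, hidden entries included
du -sk "$dir"/*      # visible children only (the shell glob skips dot-names)
du -sk "$dir"/.??*   # the hidden entries that make up the difference
```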

 

Thanks, Josh


Josh,

 

Thanks for the explanation. Your strategy makes sense now. No, Retrospect's decision to back up is made on a per-file basis, not on the bytes within a file. It might help you get a handle on where the big files are by using the free WhatSize utility. You can also do a find piped to fgrep piped to sort to get more specialized listings. WhatSize is here:

WhatSize download
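One possible shape for that find/fgrep/sort pipeline, as a sketch: the path, the 100 MB threshold, and the caches filter are all just illustrative choices, not anything Russ prescribed.

```shell
# Placeholder path. -size is in 512-byte blocks, so +204800 means > 100 MB.
# fgrep -v drops cache paths; du -sk gives each file's size in KB for sorting.
find /Users/someuser -type f -size +204800 -print 2>/dev/null \
  | fgrep -v '/Library/Caches/' \
  | while read f; do du -sk "$f"; done \
  | sort -rn
```

The output is a largest-first list of the big files, which makes it easy to spot the handful of items dominating a backup set.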

 

I just gave up when we migrated our office to Mac OS X and put in big tapes with an autoloader. If/when the Mac version of Retrospect achieves feature parity with the Windows version, we will move to disk-to-disk-to-tape to shorten the backup window. I've accepted the explosion in data as inevitable.

 

You really might think, though, about what you are backing up and the cost to recover. We had a fire at our office almost two years back, and Retrospect and our off-site backups saved our butt. Computers were lost, but no data. Just buy new computers, restore the server, and go.

 

The other thing that you might want to investigate is the use of network homes on your server, such that each user's data is kept only on the server. Requires a fast network infrastructure, and there are issues that you have to iron out with some crappy apps that don't like networked homes, but it means that you care less about the data on individual clients. It's also nice that any user can go to any client computer and receive the same environment as on their "usual" computer.

 

Russ


