
Retrospect 10.2.0 (221) engine on OS X 10.8.4 on a Mac mini.


I have been having trouble with media set copies, so I have been watching the operations I perform closely to make sure the output is sound.


I had a small media set of about 7.8 GB; we'll call it orig. I copied it to arch with compression enabled in the script. The resulting media set was 2.2 GB but had the requisite 203 files. I wanted to verify that the 2.2 GB was "enough" by running a script *without* compression enabled to "re-inflate" the data.


I copied the copy to "zjunk" without compression, and the result had 203 files and 2.2 GB.


I then tried resetting zjunk, and copying orig to zjunk, which resulted in 7.8 GB and 203 files.


I then tried copying arch (compressed) on top of the files already in zjunk, with options unchecked so that no duplicate elimination was done. The resulting zjunk held 406 files and 9.9 GB.


It appears that compression is a one-way operation on media sets: unchecking the "compress" option in the script will NOT un-compress the resulting copy. It also appears that compressed and uncompressed data can be mixed in a given media set. Media sets, therefore, have no "compress" attribute of their own.
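The behavior described above can be sketched in a toy model (this is purely illustrative, not Retrospect's actual implementation): each item carries its own compressed flag, a copy with compression enabled compresses any still-uncompressed items, and a copy with compression disabled simply transfers stored bytes as-is, never decompressing.

```python
import zlib

def copy_media_set(source, compress):
    """Copy (data, is_compressed) items. With compress=True, compress
    uncompressed items; with compress=False, copy bytes verbatim.
    Nothing is ever decompressed, matching the observed one-way behavior."""
    dest = []
    for data, is_compressed in source:
        if compress and not is_compressed:
            dest.append((zlib.compress(data), True))
        else:
            # Already-compressed items stay compressed even when the
            # destination script has compression turned off.
            dest.append((data, is_compressed))
    return dest

# Mimic the experiment: orig -> arch (compressed) -> zjunk (no compression)
orig = [(b"example data " * 1000, False)]
arch = copy_media_set(orig, compress=True)      # much smaller than orig
zjunk = copy_media_set(arch, compress=False)    # same size as arch, not re-inflated

# Copying orig on top of zjunk without compression mixes both kinds of items.
mixed = zjunk + copy_media_set(orig, compress=False)
```

Under this model, the media set itself carries no compress attribute; only the individual items do, which is consistent with the mixed 9.9 GB result.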


Is this as it should be?


It would be nice to get more information on compression. There are a few places where "amount saved" appears in the logs, but when estimating the amount of space/tape that will be required, that compression-performance data can be really useful.


Correct, compression is a script option and not a backup set option. Aside from both Proactive and regular backup scripts, compression is also an option for Copy Media Set and Copy Backup scripts, so you could compress a previously uncompressed backup during the transfer. As you have seen, it is a one-way option.


True, it would be nice to have a better estimate of what the data will compress to, but unfortunately this is heavily dependent on the type of data. Effectively, the more random the data, the less it compresses, and many data types, such as JPEG, are already compressed and so do not compress further.
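This data-type dependence is easy to demonstrate with a general-purpose compressor. The sketch below uses Python's zlib (not Retrospect's compressor, so the exact ratios will differ, but the trend is the same): repetitive text shrinks dramatically, while random bytes barely shrink at all.

```python
import os
import zlib

samples = {
    "repetitive text": b"the quick brown fox jumps over the lazy dog " * 250,
    "random bytes": os.urandom(11_000),
}

for name, data in samples.items():
    compressed = zlib.compress(data)
    ratio = len(compressed) / len(data)
    print(f"{name}: {len(data)} -> {len(compressed)} bytes (ratio {ratio:.2f})")
# The repetitive sample compresses to a small fraction of its size;
# the random sample stays close to ratio 1.0 (slight overhead added).
```

The same effect explains why already-compressed formats like JPEG or ZIP gain nothing from a second pass: their byte streams are close to random from the compressor's point of view.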


Thank you. It sounds like the data quantity reported is the number of bytes *after* compression, not *before*, right? That's what I'm surmising from the sizes of the media sets, and I presume it's consistent about it.
