1000 GB limit on storage set is a problem


I administer a large prepress network. The file server recently grew to about 800 GB. I use two Macs, each with an AIT-3 tape drive (100 GB native, 200+ GB compressed). One Mac and drive is for backup, the other is for restore. Here's my backup schedule from before the upgrade:

- Make a backup set named "year-letter", such as "2002-A", followed by "2002-B", etc.

- Let the backup server run as a "normal" backup. Catching up - that is, backing up all current files from the server - takes a couple of days and several tapes. Once it's caught up, it runs once a day with only new files, which comes to several gigabytes per day.

- When the total of all the data on that tape set reaches 1000 GB, it stops with the error "can't add any more data to catalog set, 1000gb limit reached", and I make a new set.

So far, so good. The problem is that since doubling my network size, I reach the 1000 GB limit in a week rather than in a month or two. Most of it is filled by the initial full backup. It's a huge waste of time and tapes.
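To put rough numbers on it, here's a quick back-of-the-envelope sketch in Python. The daily-growth figure is just a guess on my part, not anything measured or reported by Retrospect:

# Back-of-the-envelope estimate of how long one backup set lasts before
# hitting the 1000 GB cap. All figures are assumptions, not measurements.

SET_LIMIT_GB = 1000         # per-set ceiling described above
FULL_BACKUP_GB = 800        # roughly the current size of the file server
DAILY_NEW_DATA_GB = 25      # assumed daily churn after doubling the network

def days_until_limit(full_gb, daily_gb, limit_gb=SET_LIMIT_GB):
    """Days of incremental backups that fit after the initial full backup."""
    headroom_gb = limit_gb - full_gb
    return max(headroom_gb, 0) / daily_gb

print(days_until_limit(FULL_BACKUP_GB, DAILY_NEW_DATA_GB))  # 8.0  -> about a week
print(days_until_limit(400, 12))                            # 50.0 -> the old month-or-two range

With numbers in that ballpark, almost all of a set's capacity goes to the initial full backup, which is exactly where the time and tapes get wasted.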

Why is there a 1000 GB limit on the size of a catalog file? It seems like an arbitrary limit: it's not based on the number of files or tapes, and it's not based on the size of the catalog file itself. I think it's left over from when 1000 GB sounded like a huge, unfillable amount of space.

Has this limit changed, or will it change, in the next Mac version? Is it there in v6 for Windows?

For now, I've changed my plan a bit: two concurrent backup sets, each covering half the server ("2002-A1", "2002-A2"). Not pretty.
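Rough math on why the split buys time, using the same guessed figures as the sketch above: each half-set starts with roughly a 400 GB full backup, leaving about 600 GB of headroom per set (1000 - 400) instead of 200 GB (1000 - 800), so each set should last several times longer before hitting the cap - at the cost of juggling twice as many sets and catalogs.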

Any other suggestions would be appreciated.