
[Retrospect 6.0 Single Server Edition] too many DLTs... :(


strauss

Recommended Posts

Hello all,

 

I recently became responsible for our backup procedures. At our firm, we have a backup server (Windows 2000 Server) running Retrospect 6.0 Single Server Edition.

The clients are Mac OS X and Windows XP Pro desktops.

Every day, certain folders on all the desktops are backed up:

the copied folders are all stored on a second hard disk in the server; afterwards, a backup to tape is done according to the following policy: only new files are backed up.

But ever since I arrived, every day I get a message from Retrospect asking me to insert a new DLT. We do work with large files, but not enough to exceed 40 GB (the free space on a DLT). Furthermore, Retrospect reports that it needs to back up more than 119 GB (this is the message I get), yet the second disk in the server is only 107 GB, with more than 40 GB of free space.

I don't understand where it can find all this data to back up: is it possible that an aborted backup stays active until it completes?

Thanks in advance for your help, I really need it :)


Hello,

Thanks for your quick answer.

 

Where can I check my sources?

 

As for your second note: some desktops do back up more than others, but the largest backup I found in the logs is 3.7 GB, no more...

 

The desktop with the largest backups gets more than 1,000 errors per day, always with the same template:

 

File "C:\my_file.txt": different modification dates and times (source: 07/04/2003 15:57:43, dest: 07/04/2003 15:57:43)

 

 

 

Thanks for your help, and sorry for my poor English ;)


Well,

 

I have checked my backup scripts:

- every day, scripts duplicate certain network folders from the 6 desktops to a hard drive.

Here's an example of these duplicate scripts:

 

- source: a network folder on the specified desktop

- destination: name_of_the_desktop on drive D

- selection: exclude files whose creation time equals the file's backup date

- options:

verify

don't duplicate the Windows System State

don't duplicate Windows ACLs from the desktop

- schedule:

every day
 

 

All these scripts run at the beginning of the evening.

Later in the night, another script does the real backup:

 

- source: all folders on drive D

- destination: DLT (for example, "tuesday")

- selection: all files

- options:

don't verify

compress

don't compare

don't duplicate the Windows System State

don't duplicate ACLs

 

 

Well, I don't understand why it's too big to fit on a 40 GB DLT... :S

If you have any ideas / tips...

Thanks in advance for your help.


Hi natew,

 

Well, I have updated the duplicate scripts and the backup script: the duplicate scripts now copy all files, and the backup script selects only the files to back up.

For the incremental backup, I chose the following parameters:

- exclude files whose time equals the backup date (the previous setting)

- include files backed up at a time earlier than the DLT backup date.

 

The duplicate folders have the following sizes today (in GB):

11.9, 3, 7.2, 13.3, 5.6
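Incidentally, those folder sizes may already explain the original tape problem: summing them gives a total just over the 40 GB capacity mentioned earlier, so a nightly "all files" backup of drive D could never fit on one tape. A quick sketch (sizes taken from the list above; the 40 GB figure is the DLT capacity from the first post):

```python
# Hypothetical sanity check: total the duplicate-folder sizes reported
# in this thread and compare against the stated 40 GB DLT capacity.
folder_sizes_gb = [11.9, 3.0, 7.2, 13.3, 5.6]  # sizes from the post, in GB

total = sum(folder_sizes_gb)
dlt_capacity_gb = 40.0

print(f"Total to back up: {total:.1f} GB")
print(f"Fits on one 40 GB DLT: {total <= dlt_capacity_gb}")
```

If the total stays above the tape capacity, Retrospect would have to request a second tape every run, which would match the daily "insert a new DLT" prompts.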


Archived

This topic is now archived and is closed to further replies.
