markcuss Posted June 11, 2006

Is there some restriction on the size of a disk backup set in v7.5? I have a big disk backup set (a terabyte or so) and Retrospect just struggles with it. It reports the catalog file as damaged, so grooms and backups fail. Attempts at rebuilding the catalog get stuck about 20% into the recatalog: in the execution monitor, the performance figure and elapsed time keep updating, but the file count and byte count haven't changed in two days.

It looks like I'm going to have to toss this backup set, which makes me very unhappy. Is there some size restriction on how big a disk backup set can be? Has anyone else had problems like this?

Thanks

Mark
Mayoff Posted June 12, 2006

Source disks have been tested with 4 million files, and Backup Sets should be kept within that size range when possible. Several gigabytes of RAM are required to use large sets. How much RAM do you have?
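A rough way to check a source volume against that 4-million-file guideline is to walk it and count files before pointing Retrospect at it. The sketch below is not part of Retrospect; the D:\ path and the 4,000,000 threshold are assumptions based on the figure quoted above.

import os

SOURCE_VOLUME = "D:\\"        # assumed example path; change to the real source volume
FILE_GUIDELINE = 4_000_000    # figure quoted in the reply above

def count_files(root):
    """Walk the volume and count regular files."""
    total = 0
    for _dirpath, _dirnames, filenames in os.walk(root):
        total += len(filenames)
    return total

if __name__ == "__main__":
    n = count_files(SOURCE_VOLUME)
    print(f"{n:,} files on {SOURCE_VOLUME}")
    if n > FILE_GUIDELINE:
        print("Over the ~4 million file guideline; consider splitting the source across Backup Sets.")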
markcuss (Author) Posted June 16, 2006

My box is a 3 GHz Xeon with 2 GB of RAM. It's a server, so there is room for more CPUs or RAM if need be. The backup set in question consumed about 1 TB on disk.

Speaking of RAM, I had the following error message in one of last night's backups:

TMemory::createMapFile: Could not create the paging file, error 80

This definitely sounds RAM related. Any thoughts?

Thanks

Mark
nekr0phage Posted June 20, 2006

Quote:
TMemory::createMapFile: Could not create the paging file, error 80

Hi Mark,

How are you set for free space on your OS volume?
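If the map file can't be created, low free space on the system volume is one likely culprit, which is what the question above is getting at. A minimal way to check, assuming a typical Windows install with the OS on C:\ (an assumed path, not something from the thread):

import shutil

# Free space on the OS volume, where the paging/map file would be created.
# "C:\\" is an assumed location for a typical Windows install.
total, used, free = shutil.disk_usage("C:\\")
print(f"C:\\ free: {free / 2**30:.1f} GiB of {total / 2**30:.1f} GiB")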