liamtu

Multi-processor, multi-core, infinite matching

Recommended Posts

I've been trying to set up a new install of Multi Server, but when I try to back up our file server for a second time, it gets stuck at "matching". This is a new Dell PE 2950 with dual quad-core Xeons and 4GB of RAM. The backup reports about 2.2 million files in 150GB of data. Retrospect only seems capable of using one core of one processor, and is thus "pegged" at 13% of total CPU. Is there any way to get Retrospect to use more of the available CPU? Any tips other than reducing the number of files or finding a multi-threaded backup product?

 

-Liam


Is it just me, or does an 8-core server seem like massive overkill for a backup server? I can't imagine you're going to find any backup product that will parallelize file matching. At best, you'll be able to run 8 jobs at a time and have each one use a core. Sadly, this is the drawback of going for multi-core systems instead of faster cores. We're going to see this problem more and more in the coming years.
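
For what it's worth, matching looks parallelizable in principle: each worker only needs its slice of the file list plus a read-only view of the catalog. Here's a toy sketch of the idea in Python (purely an illustration of the concept, not anything Retrospect actually does; the catalog layout, the path, and the eight-way split are all made up):

# Illustration only: split incremental "matching" (which files changed since the
# last backup?) across worker processes. This is NOT how Retrospect works.
import os
from multiprocessing import Pool

def scan_volume(root):
    """Walk the volume and return {path: (size, mtime)} for every file."""
    entries = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            entries[path] = (st.st_size, st.st_mtime)
    return entries

def match_chunk(args):
    """Return the paths in this chunk that are new or changed versus the catalog."""
    chunk, catalog = args
    return [path for path, meta in chunk if catalog.get(path) != meta]

def find_changed_files(root, catalog, workers=8):
    items = list(scan_volume(root).items())
    step = max(1, len(items) // workers)
    chunks = [items[i:i + step] for i in range(0, len(items), step)]
    # Handing every worker the whole catalog is wasteful with millions of files;
    # it keeps the sketch short, but a real tool would share or shard it.
    with Pool(workers) as pool:
        results = pool.map(match_chunk, [(c, catalog) for c in chunks])
    return [path for chunk in results for path in chunk]

if __name__ == "__main__":
    previous_catalog = {}  # would normally come from the last backup's catalog file
    changed = find_changed_files(r"D:\shares", previous_catalog, workers=8)
    print(len(changed), "files would need to be copied")

Whether that would actually help is another question: with millions of small files the directory walk tends to be disk-bound rather than CPU-bound, which may be why none of the vendors bother.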

 

I have a dual-CPU Athlon with 4GB of RAM (only 3.6GB available because it's 32-bit) and Retrospect doesn't seem to utilize both CPUs. But my big problem is that once you get to around 2.5 million files, you'll have trouble getting a full backup at all. I've been unable to get a full backup of my user home directories (400GB, 2.6 million files). After spending all night copying files, Retrospect eventually gives up because it is unable to allocate enough memory when building the snapshot. (See my post about this.)

 

-matthew


Quote:

But my big problem is that once you get to around 2.5 million files, you'll have trouble getting a full backup at all. I've been unable to get a full backup of my user home directories (400GB, 2.6 million files). After spending all night copying files, Retrospect eventually gives up because it is unable to allocate enough memory when building the snapshot.

 


 

One of our backup jobs is about 1.5 million files and about 550GB, and we don't have problems with it. Our server is just a simple Windows XP Pro machine: a Pentium 4 2.8GHz with 1GB of RAM. It's just a cheap Dell workstation that we've installed an Adaptec 29160 SCSI card and an Intel Gigabit Ethernet card in. We use an Exabyte 10-tape autoloader.

 

Just thought I'd add my 2 cents...

 

David


It isn't the size of the files copied, it is the number of files that is the problem, AFAICT. Just wait until you have more than 2.5 million files.
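
To put rough numbers on why the snapshot blows up: suppose each file costs somewhere around 500 bytes of snapshot metadata (path, attributes, security info, bookkeeping; that per-file figure is just a guess on my part), and that Retrospect is an ordinary 32-bit process with the standard 2GB of user address space on Windows:

# Back-of-envelope only; the 500 bytes/file is an assumed figure, not a measured one.
files = 2_600_000
bytes_per_file = 500
snapshot_bytes = files * bytes_per_file
print(snapshot_bytes / 2**30)                 # ~1.2 GB just for the snapshot in memory
print((2 * 2**30 - snapshot_bytes) / 2**30)   # ~0.8 GB left of a 32-bit process's 2GB space

And even before the 2GB line is crossed, one big contiguous allocation can fail once the address space is fragmented, which would fit the "unable to allocate enough memory" error.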

 

The worst part is that I have no easy way of breaking it up into smaller chunks. It sucks. The only way I'm going to get this backed up is with a good ol' fashioned manual copy to an external drive or something, and that will get dated pretty quickly.

 

Even when I COULD get a full backup with slightly fewer files, it would take HOURS just to load the snapshot and select a file to restore.

 

-matthew


This server is a hardware backup for an Exchange server... I guess I'll have to go shopping for a dedicated single-processor, single-core 1U server. Any suggestions for specs? What is the fastest single-core processor made? Even with multiple jobs running, Retrospect never goes above 14% CPU utilization. How much memory can Retrospect actually use?

 

As for parallel file matching, why not? I'm not using tape. I'm gonna have to do some research.

 

-Liam

