
Memory Issues?


tkatz


I have Multi Server 6.0 set up on a Win2k box with 2 GB of RAM. I have various clients installed on Windows, Mac, and Linux machines. The Windows clients back up flawlessly, but I'm having some issues backing up some of the larger Linux machines.

I'm trying to back up a NAS volume via one of my Linux machines; it contains a few million files weighing in at over 150 GB. Trying to back that up causes errors like this:

TMemory::mhalloc: VirtualAlloc(790.5 M, MEM_RESERVE) failed, error 8

 

... a bunch more similar errors ... and then:

Scanning incomplete, error -625 (not enough memory)

I saw in a FAQ that this was a known Retrospect problem with large volumes in earlier versions, so I tried to subvolume as much as I could, but there was still a volume with over a million files. I get a "Not enough application memory" error when it tries to back it up. It would be very inconvenient to subvolume further, as there are a few thousand website directories within it.
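
Since splitting a few thousand website directories into subvolumes by hand isn't practical, a script can at least plan the split. This is a rough sketch (the function name, the file-count cap, and the demo values are my own, not anything Retrospect provides) that groups top-level directories into batches by file count, each batch small enough to define as one subvolume:

```python
import os

def plan_subvolumes(root, max_files):
    """Group the top-level directories of `root` into batches whose
    combined file count stays at or below `max_files`. Each batch could
    then be defined as its own subvolume. Purely illustrative; the cap
    you actually need depends on how much memory the scanner has."""
    batches, current, count = [], [], 0
    for name in sorted(os.listdir(root)):
        path = os.path.join(root, name)
        if not os.path.isdir(path):
            continue
        # total files anywhere under this directory
        n = sum(len(files) for _, _, files in os.walk(path))
        if current and count + n > max_files:
            batches.append(current)   # cap reached; start a new batch
            current, count = [], 0
        current.append(path)
        count += n
    if current:
        batches.append(current)
    return batches
```

Each returned batch is a list of directory paths; you'd still have to register the batches as subvolumes in Retrospect by hand, but at least the grouping is automatic.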

I'm using version 6.0.143 of the Linux client.

Is there some way I can get Retrospect to back this volume up? Maybe turn off a matching option or something? It takes a long time to hit the error, so I thought I'd ask here first before trying a bunch of different combinations.

Thanks in advance,

Terry Katz


We have successfully tested with over a million files, but the folder structure plays a role. The flatter the file structure, the more files we can back up, since flatter structures take less memory to track. At this time, we don't have a model of what will and won't work. If the application is running out of memory, the only workaround is to continue to subvolume the source into manageable chunks of data.
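
To put a rough number on this (the per-entry byte figure below is my own assumption, not anything Dantz has published): if the scanner keeps a couple hundred bytes of metadata per file in one contiguous allocation, a few million files lands in the same ballpark as the failed 790.5 MB reservation in the log above:

```python
# Back-of-envelope scan-memory estimate. BYTES_PER_ENTRY is an assumed
# figure for illustration; Retrospect's real per-file cost isn't documented,
# and deeper folder trees would add per-directory overhead on top.
BYTES_PER_ENTRY = 200
files = 4_000_000            # "a few million files"
needed_mb = files * BYTES_PER_ENTRY / 2**20
print(f"roughly {needed_mb:.0f} MB for the file list alone")
```

That comes out around 763 MB before any per-directory overhead, which is within spitting distance of the reservation that failed, so the file count (not the 150 GB of data) is plausibly what's hurting here.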


  • 2 weeks later...

I am seeing the same problem backing up OS X clients and Windows clients from a server with 512 MB of real RAM and over 1 GB of virtual memory. It appears even the 6.0 release is not able to use virtual memory properly, which was a problem with previous Windows releases but has never been a problem with the Mac releases. As soon as it gets even slightly close to using the real RAM minus what the OS (Windows 2000) is using, it starts giving these sorts of error messages. It appears to be worse in the 6.0 release than it was in 5.6, since I have seen about 200 MB of real RAM go to waste when it logs a "not enough application memory" error. At least 5.6 waited to log this error until it really had run out of real RAM. This needs to be fixed by Dantz, since not every backup server is going to be stuffed with several GB of RAM, and Unix and Windows systems typically have over 100,000 files on a single volume nowadays. I cannot upgrade the backup server that backs up our servers until this problem is fixed, since we have very large volumes with up to 1 million files on several of our servers.


  • 7 months later...

I don't understand that. I have a server with 1 GB of real memory and a 4 GB swap file that gets those same errors backing up large Linux systems, with peak usage at about 830 MB, i.e., using up all the real memory that the OS and other processes aren't using, but not taking advantage of the swap file. Is there any way to get around that?

Rhian Merris

merrisr@saic.com


By the way, to add to my last post, AmyC, I'm not saying you're wrong. In fact, the errors in the execution log (TMemory::mhalloc: VirtualAlloc(104.5 M, MEM_RESERVE) failed, error 8) look like they bear out what you said. I just don't understand why Retrospect isn't using more of the swap file. I'm not saying that's necessarily a Retrospect bug as opposed to a Windows bug, but I don't care for it either way.
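
One possible reading of why the swap file goes unused (my own assumption, not anything Dantz or Microsoft has confirmed here): VirtualAlloc with MEM_RESERVE claims a contiguous range of the process's virtual address space, and a 32-bit process on Windows 2000 gets only 2 GB of user address space no matter how large the swap file is. The reservation can therefore fail with plenty of swap free, once the address space is used up or fragmented. Roughly (the 400 MB overhead figure below is an assumption):

```python
# Illustrative 32-bit address-space budget. MEM_RESERVE consumes address
# space, not RAM or swap, so swap-file size never enters this arithmetic.
USER_ADDRESS_SPACE_MB = 2048   # per-process user space, 32-bit Windows 2000
peak_usage_mb = 830            # peak usage reported in the post above
other_overhead_mb = 400        # assumed: EXE image, DLLs, other heaps, stacks
free_mb = USER_ADDRESS_SPACE_MB - peak_usage_mb - other_overhead_mb
print(f"at most ~{free_mb} MB of address space left")
# Fragmentation shrinks the largest contiguous free block well below this
# total, so even a 104.5 MB reservation can fail while swap sits idle.
```

If that reading is right, adding swap (or even RAM) wouldn't help; only reducing how much address space one scan needs, e.g., via subvolumes, would.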

Rhian Merris

merrisr@saic.com


Archived

This topic is now archived and is closed to further replies.
