a76448b2-0887-42b3-9161-d6b5e42f70d9 Posted March 22, 2011 (edited) After backing up slightly more than 507 GB of data, every remaining file fails with "can't read file, error -1019". It's backing up to disk. I tried two different drives with the same result. Any backup smaller than 507 GB works just fine. Latest version of Retrospect Server 7.7 and agent; the agent is on Windows Server 2008. Edited March 23, 2011 by villav74
be542399-681c-4963-9b7a-993e0d984d4c Posted May 11, 2011 (edited) I'm seeing exactly the same problem backing up data from one of our Windows 2008 server machines and would be interested in finding a solution. It normally fails at the 510.0 GB mark. If I copy the same data across to another server (running Windows 2003), it backs up with no problem. This makes me believe it's an issue at the client end. Maybe because the software is running in 32-bit mode? Edited May 11, 2011 by Peteski
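The 32-bit hypothesis above is at least arithmetically plausible: a signed 32-bit counter of 256-byte units wraps negative exactly at 512 GiB, close to the ~507-510 GB failure point reported in this thread. The 256-byte unit size is purely an assumption chosen to make the boundary land there; nothing here is confirmed about Retrospect's internals. A minimal sketch of the wrap-around:

```python
import ctypes

def add_units(counter, nbytes, unit=256):
    # Hypothetical: accumulate file sizes as 256-byte units in a
    # signed 32-bit counter, as a 32-bit client build might.
    # ctypes.c_int32 silently truncates to 32 bits, so overflow wraps.
    return ctypes.c_int32(counter + nbytes // unit).value

total = 0
# Back up 510 GiB in 1 GiB increments: counter stays positive.
for _ in range(510):
    total = add_units(total, 1 << 30)
print(total)            # 2139095040 units, still below 2**31

# Two more GiB pushes the counter to exactly 2**31 units...
total = add_units(total, 2 << 30)
print(total < 0)        # True: the counter has wrapped negative
```

If the client treated a negative running total as a fatal read/resource error, every file after the boundary would fail, which matches the "all remaining files" symptom described above.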
g.bevan Posted January 25, 2012 I had this problem and tried everything, including the registry hack for paged pool memory and all the suggestions from forum members and Retrospect staff. I'm afraid the only "fix" in my case was to re-install version 7.6. It's depressing to see that the issue hasn't been fixed after such a long time. Hopefully development of the Windows version will improve now that Retrospect is its own boss (although it looks like all the development effort is going into the Mac version at the moment!)
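The "registry hack for the paged pool memory" mentioned here is most likely the standard Windows `PagedPoolSize` tweak under the Memory Management key; the thread doesn't spell it out, so treat this as a sketch of the commonly cited version rather than the exact change the poster made (and note it did not resolve the issue for them). Setting the value to `0xFFFFFFFF` tells Windows to use the maximum paged pool size; a reboot is required, and backing up the registry first is strongly advised:

```
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management" /v PagedPoolSize /t REG_DWORD /d 0xFFFFFFFF /f
```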
amcmis Posted February 2, 2012 This is a very interesting post to bump. I am wondering if this is related to a problem we have had for years, namely that grooming fails on large disk backups from time to time, requiring us to rebuild (not repair) the catalog and reattach it to all scripts afterwards. (Quite a pain, considering how long a rebuild takes. And if you forget, there are no backups until you remember.) We have even taken to simply resetting backup sets to solve the problem quickly, but that leaves us with no historical snapshots. I never kept statistics on this, but the last time I was informed the catalog was corrupt (about a week ago), the backup set was just about half a terabyte. I have never heard of a paged pool memory hack. Can someone point me to any other information on this 'limit', and whether it has been acknowledged by the powers that be? And yes, I'm still on 7.6, the latest release with the latest hotfix.