
Backup Failed After 507 GB. Can't Read File Error -1019


After backing up slightly more than 507 GB of data, all remaining files fail with "can't read file, error -1019".

It's backing up to disk. I tried two different drives with the same result. Any backup of less than 507 GB works just fine.

 

Latest versions of Retrospect server 7.7 and the agent; the agent is running on Windows Server 2008.

Edited by villav74


I'm seeing exactly the same problem backing up data from one of our Windows 2008 server machines and would be interested in finding a solution. It normally fails at the 510.0 GB mark. If I copy the same data across to another server (running Windows 2003), it backs up with no problem. This makes me believe it's an issue at the client end. Maybe because the software is running in 32-bit mode?

Edited by Peteski


I had this problem and tried everything, including the registry hack for the paged pool memory and all the suggestions from the forum members and Retrospect staff. I'm afraid the only "fix" in my case was to reinstall version 7.6. It's depressing to see the issue hasn't been fixed in such a long time. Hopefully development of the Windows version will improve now that Retrospect is its own boss (although it looks like all the development effort is going into the Mac version at the moment!).
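For anyone who hasn't seen it, the paged pool registry hack mentioned above is usually some variant of the Memory Management tweak below. This is only a sketch of the common recommendation, not an official fix; the exact values people suggest vary, so back up the registry first and reboot after applying it:

```
Windows Registry Editor Version 5.00

; Values under Session Manager\Memory Management that the "paged pool hack" changes.
; PagedPoolSize = 0xFFFFFFFF tells Windows to grow the paged pool to its maximum.
; PoolUsageMaximum = 60 (decimal) makes the system start trimming at 60% pool usage.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management]
"PagedPoolSize"=dword:ffffffff
"PoolUsageMaximum"=dword:0000003c
```

Whether this helps at all with the ~507 GB failure is another question; as noted above, it didn't fix the problem for me.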


This is a very interesting post to bump. I am wondering if this is related to a problem we have had for years, namely that grooming will fail on large disk backups from time to time, requiring us to rebuild (not repair) the catalog and then reattach the backup set to all scripts afterwards. (Quite a pain, considering how long a rebuild takes. And if you forget to reattach, there are no backups until you remember.) We have even taken to simply resetting backups to solve the problem quickly, but that leaves us with no historical snapshots.

 

I never took statistics on this, but the last time I was informed the catalog was corrupt (about a week ago), the backup set was just about half a terabyte. I have never heard about a paged pool memory hack, etc. Can someone point me to any other info on this "limit" and whether it has been acknowledged by the powers that be? And yes, I'm still on 7.6, the latest release with the latest hotfix.

