Backup Failed After 507 GB. Can't Read File Error -1019



After backing up slightly more than 507 GB of data, every remaining file fails with "can't read file, error -1019".

It's backing up to disk. I tried two different drives with the same result. Any backup of less than 507 GB works just fine.

 

This is the latest version of Retrospect Server 7.7 with the latest client agent; the agent runs on Windows Server 2008.

Edited by villav74

  • 1 month later...

I'm seeing exactly the same problem backing up data from one of our Windows Server 2008 machines and would be interested in finding a solution. It normally fails at the 510.0 GB mark. If I copy the same data across to another server (running Windows 2003), it backs up with no problem. This makes me believe it's an issue at the client end. Maybe because the software is running in 32-bit mode? (If Retrospect's "GB" are binary gigabytes, 510 GB is just shy of 512 GiB = 2^39 bytes, exactly the kind of boundary a 32-bit counter would hit.)

Edited by Peteski

  • 8 months later...

I had this problem and tried everything, including the registry hack for paged pool memory and all the suggestions from forum members and Retrospect staff. I'm afraid the only "fix" in my case was to reinstall version 7.6. It's depressing to see that the issue hasn't been fixed after such a long time. Hopefully development of the Windows version will improve now that Retrospect is its own boss (although it looks like all the development effort is going into the Mac version at the moment!)
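For anyone who hasn't come across it: the "registry hack" people mention generally means the PagedPoolSize and PoolUsageMaximum values under HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management. Here's a small read-only Python sketch to check what a server currently has set; it only reads the values, it doesn't change anything, and the value names are the commonly cited ones, so double-check against Microsoft's documentation before touching them:

import winreg

# Registry key that holds the paged pool tuning values
KEY_PATH = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    for name in ("PagedPoolSize", "PoolUsageMaximum"):
        try:
            value, _ = winreg.QueryValueEx(key, name)
            print(f"{name} = {value:#x}")
        except FileNotFoundError:
            # Absent value means Windows is using its built-in default
            print(f"{name} not set (Windows default in effect)")

As I understand it, setting PagedPoolSize to 0xFFFFFFFF tells a 32-bit kernel to grow the paged pool as large as it can, which is why the hack mostly matters on 32-bit systems.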


  • 2 weeks later...

This is a very interesting post to bump. I am wondering if this is related to a problem we have had for years, namely that grooming fails on large Disk backup sets from time to time, requiring us to rebuild (not repair) the catalog and reattach the backup set to all scripts afterwards. (Quite a pain, considering how long a rebuild takes. And if you forget to reattach, there are no backups until you remember.) We have even taken to simply resetting backup sets to solve the problem quickly, but that leaves us with no historical snapshots.

 

I never kept statistics on this, but the last time I was told the catalog was corrupt (about a week ago), the backup set was just about half a terabyte. I had never heard of the paged pool memory hack. Can someone point me to any other info on this 'limit' and whether it has been acknowledged by the powers that be? And yes, I'm still on 7.6, latest release with the latest hotfix.
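I'm not aware of an official acknowledgement either, but you can at least check whether the paged pool is what's actually running out: watch kernel paged pool usage while a big backup runs. A quick Python sketch using the Win32 GetPerformanceInfo call (run it every so often during the job):

import ctypes
from ctypes import wintypes

class PERFORMANCE_INFORMATION(ctypes.Structure):
    # Mirrors the Win32 PERFORMANCE_INFORMATION struct;
    # memory figures are counted in pages of PageSize bytes.
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("CommitTotal", ctypes.c_size_t),
        ("CommitLimit", ctypes.c_size_t),
        ("CommitPeak", ctypes.c_size_t),
        ("PhysicalTotal", ctypes.c_size_t),
        ("PhysicalAvailable", ctypes.c_size_t),
        ("SystemCache", ctypes.c_size_t),
        ("KernelTotal", ctypes.c_size_t),
        ("KernelPaged", ctypes.c_size_t),
        ("KernelNonpaged", ctypes.c_size_t),
        ("PageSize", ctypes.c_size_t),
        ("HandleCount", wintypes.DWORD),
        ("ProcessCount", wintypes.DWORD),
        ("ThreadCount", wintypes.DWORD),
    ]

pi = PERFORMANCE_INFORMATION()
pi.cb = ctypes.sizeof(pi)
if ctypes.WinDLL("psapi").GetPerformanceInfo(ctypes.byref(pi), pi.cb):
    mb = pi.KernelPaged * pi.PageSize / 2**20
    print(f"Kernel paged pool in use: {mb:.0f} MB")

If that figure climbs steadily during the backup and the job dies when it levels off, that points at the pool limit rather than a hard size cap inside Retrospect itself.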

