
Infinite copy loop

randyv Posts: 166
edited August 30, 2010 1:05AM in SQL Backup Previous Versions
Sorry to bring this one back up, but SQL Backup 6 is flooding our Exchange server and I believe there is a bug in the product.

Here is the issue: I DO want the resilience feature.
But I do not want a never-ending flood of "copy failed" messages when the backup file to be copied has been removed, and I assert that this is the bug in the product.

For example: I have a backup job set for a full backup. It writes a copy to a local drive and a copy to a network share. Backups more than 5 days old are removed. Based on the messages, what appears to be happening is that a file fails to copy over the network (for some reason), and the retries occur over and over and over, indefinitely. Then, 5 days downstream, the local file is deleted, and now it can NEVER be copied to the network.
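
For reference, the job described is roughly of this shape (a sketch only, using SQL Backup's COPYTO and ERASEFILES keywords as I understand them; the database name is a placeholder, not my actual configuration):

```sql
-- Sketch of a SQL Backup job: local backup file, copy to a network
-- share, and removal of local backups older than 5 days.
-- [MyDatabase] is a placeholder name.
EXECUTE master..sqlbackup
  '-SQL "BACKUP DATABASE [MyDatabase]
         TO DISK = [M:\LDF BACKUP\<AUTO>.sqb]
         WITH COPYTO = [\\whaley10\Backups\WFR-DailyBackups\Whaley08],
              ERASEFILES = 5"';
```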

I'm sure you don't intend it to be that way, but that is what appears to be happening based on the copy failed messages I keep getting.

SQL Backup file copy process failed to copy "M:\LDF BACKUP\LOG_(local)_DBA_20100603_163000.sqb" to "\\whaley10\Backups\WFR-DailyBackups\Whaley08\LOG_(local)_DBA_20100603_163000.sqb"

Last attempt started on 6/4/2010 1:31:20 PM, and failed with the error "COPYTO error: Unable to copy M:\LDF BACKUP\LOG_(local)_DBA_20100603_163000.sqb to \\whaley10\Backups\WFR-DailyBackups\Whaley08\LOG_(local)_DBA_20100603_163000.sqb (Source file does not exist: M:\LDF BACKUP\LOG_(local)_DBA_20100603_163000.sqb)."

61 attempts have already been made to copy this file.
What we do in life echoes in eternity <><
Randy Volters

Comments

  • Oh, and I ran the statement below on the server...

    exec master..sqbdata 'DELETE FROM backupfiles_copylist WHERE status <> ''S '''
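
    For anyone following along, it may be worth inspecting the queue before clearing it; something along these lines (a sketch against the same backupfiles_copylist table, assuming the status value 'S ' marks a successful copy, as the DELETE above implies):

    ```sql
    -- Inspect the pending copy queue before deleting anything;
    -- rows with status <> 'S ' appear to be the unfinished copies.
    exec master..sqbdata 'SELECT * FROM backupfiles_copylist WHERE status <> ''S '''
    ```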
    What we do in life echoes in eternity <><
    Randy Volters
  • Thank you for your forum post.

    When you execute the command in your second post, does the problem cease?

    Given how long the jobs are staying in your log copy queue, it sounds like you may have network problems that are preventing the files from being copied. If you can fix that problem, the 'spam' issue should go away.

    Otherwise, the only way to solve your 'spam' problem would be to use the USESIMPLECOPY keyword in the command, which disables the network resilience feature and sends only one email when a copy fails.

    Our development team is looking into reducing the number of emails sent in this type of case in a future version of the product.
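
    A backup command using that keyword might look like this (a sketch only; the database name is a placeholder, and the WITH options shown are the COPYTO and USESIMPLECOPY keywords discussed in this thread):

    ```sql
    -- Sketch: same network copy, but with USESIMPLECOPY so a failed
    -- copy is not queued and retried by the resilience feature.
    -- [MyDatabase] is a placeholder name.
    EXECUTE master..sqlbackup
      '-SQL "BACKUP LOG [MyDatabase]
             TO DISK = [M:\LDF BACKUP\<AUTO>.sqb]
             WITH COPYTO = [\\whaley10\Backups\WFR-DailyBackups\Whaley08],
                  USESIMPLECOPY"';
    ```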
  • petey Posts: 2,358 New member
    SQL Backup will attempt to copy the file for the next 24 hours, after which it simply aborts the copy process for that file. It would make more sense to abort the copy process immediately if the source file doesn't exist.

    However, there may be cases where the file is located in a remote share, in which case a temporary network disruption should not immediately abort the copy process. A compromise may be to abort the copy process only if the source file is a local file.
    Peter Yeoh
    SQL Backup Consultant Developer
    Associate, Yohz Software
    Beyond compression - SQL Backup goodies under the hood, updated for version 8
  • I was having this same issue, which lasted 2 days before I found the USESIMPLECOPY keyword to put into my script. My question now is: should I leave it in there, now that all my processes have caught up and everything looks the way it should? I want to make sure this 'fix' is something that can be used long term, or whether it is just a workaround for the immediate issue.

    Thanks
  • petey Posts: 2,358 New member
    USESIMPLECOPY tells SQL Backup to attempt the copy to the remote share only once, using standard Windows API functions. The file will not be placed in a queue where the copy is reattempted over a period of 24 hours until it succeeds. Note that it only applies to backup files created with the USESIMPLECOPY keyword; it is not applied retrospectively to files already in the queue.

    Why was SQL Backup not able to copy the files originally?

    Thanks.
    Peter Yeoh
    SQL Backup Consultant Developer
    Associate, Yohz Software
    Beyond compression - SQL Backup goodies under the hood, updated for version 8