Log Shipping workaround

DonFerguson Posts: 196 Silver 5
edited April 19, 2010 3:18AM in SQL Backup Previous Versions
Since I have not received an answer to my last question in http://www.red-gate.com/messageboard/vi ... php?t=9438, I am posting the question again in a new thread.

Transaction log backups of the source database occasionally don't get recorded in the "Server Cache", a.k.a. the SQL CE database, usually during times of heavy activity. Even excluding the SQL CE files from the file system backup doesn't prevent an occasional failure to record a SQL Backup transaction log backup in the CE database. When a backup fails to get recorded in CE, the file doesn't get shipped and log shipping breaks. I then need to manually identify the missed file and copy it over to repair log shipping.

I am considering coding a workaround to this problem. It would be along the lines of adding a SQL Agent job step that compares the output of a sqbdata query against the msdb..backup* tables. If a log backup is recorded in msdb but not in SQL CE, I would then perform sqbdata inserts of the missing data.
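
For what it's worth, here is a minimal sketch of the comparison step I have in mind. Several things in it are assumptions rather than anything confirmed in this thread: the single-argument EXEC master..sqbdata '<query>' call form, that its result set can be captured with INSERT ... EXEC, that backupfiles_copylist.name holds the full path of the .sqb file, and that the file name embeds the backup start time as yyyymmdd_hhmmss. The database name and the 24-hour window are placeholders, and the matching rule may need loosening if the file-name timestamp doesn't line up with msdb's backup_start_date to the second.

-- Sketch: find transaction log backups recorded in msdb..backupset that have no
-- matching row in SQL Backup's copy queue (backupfiles_copylist).
CREATE TABLE #copylist (name VARCHAR(512))

-- Pull the file names SQL Backup knows about from its local data store.
-- ASSUMPTION: sqbdata returns just the query's result set.
INSERT INTO #copylist (name)
EXEC master..sqbdata 'SELECT name FROM backupfiles_copylist'

-- Log backups in msdb with no copy-queue entry whose file name embeds the
-- same database name and backup start time (yyyymmdd_hhmmss).
SELECT bs.database_name, bs.backup_start_date, bs.backup_finish_date
FROM   msdb.dbo.backupset bs
WHERE  bs.type = 'L'                                         -- log backups only
AND    bs.database_name = 'pubs'                             -- placeholder
AND    bs.backup_finish_date > DATEADD(hour, -24, GETDATE())
AND    NOT EXISTS
       (SELECT 1
        FROM   #copylist c
        WHERE  c.name LIKE '%' + bs.database_name + '\_'
                   + CONVERT(VARCHAR(8), bs.backup_start_date, 112) + '\_'
                   + REPLACE(CONVERT(VARCHAR(8), bs.backup_start_date, 108), ':', '')
                   + '.sqb' ESCAPE '\')

DROP TABLE #copylist

Any rows this returns would then be candidates for the sqbdata insert asked about below.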

My question is: for the sqbdata insert, would it be sufficient to just insert a row into the backupfiles_copylist table, or would I need to perform inserts into the other tables as well in order to get the missed backup file onto the copy queue? Any guidance along those lines would be appreciated. Thanks.

Comments

  • petey Posts: 2,358 New member
    Inserting into the backupfiles_copylist table is enough.

    This is a sample of the INSERT command that SQL Backup uses:
    INSERT INTO backupfiles_copylist 
    (backup_id,name,copyto,overwrite,status,count,diskretryinterval,diskretrycount,mailto,mailto_onerror,created,lastupdated)
    VALUES 
    (182,'e:\backups\LOG_(local)_pubs_20100414_200733.sqb','\\staging01\backups\pubs\LOG_(local)_pubs_20100414_200733.sqb',0,'P',0,30,10,'','',GETDATE(),'2010-04-14 12:07:34')
    
    Note that 'created' is the local time, but 'lastupdated' is the UTC equivalent of that local time (i.e. GETUTCDATE() in SQL Server). A sketch of issuing this insert through sqbdata follows the comments below.
    Peter Yeoh
    SQL Backup Consultant Developer
    Associate, Yohz Software
    Beyond compression - SQL Backup goodies under the hood, updated for version 8
  • DonFerguson Posts: 196 Silver 5
    Thanks Peter. I will work on building a script that parses the correct values and performs the insert when needed. I will let you know how it goes.

    In an effort to make the log shipping process more stable, I have one other issue. Just as a record occasionally fails to get inserted into backupfiles_copylist, sometimes after a successful copy the status doesn't get updated, causing a second copy of the file to be sent to the log shipping destination. When this happens, I get an error message on the destination that the file is out of sequence. When the destination tries to move the file to the MOVETO folder, the file already exists there and I get an error message that the existing file can't be overwritten. Hence I need to manually delete the file when this happens. Is there an option that allows the MOVETO destination file to be overwritten when it already exists, or alternatively deletes the duplicate file in the log shipping destination directory?
  • petey Posts: 2,358 New member
    No, there isn't an option to overwrite existing files in the MOVETO folder, or delete the duplicate files.
    Peter Yeoh
    SQL Backup Consultant Developer
    Associate, Yohz Software
    Beyond compression - SQL Backup goodies under the hood, updated for version 8
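
As referenced above, here is a hedged sketch of issuing the repair insert through sqbdata, built around Peter's sample INSERT. The column list and literal values (including the 'P' status) are copied from that sample; the backup_id, the file paths, and the EXEC master..sqbdata call form are placeholders or assumptions. It also isn't clear from this thread what backup_id should be when the backup was never recorded in the data store at all; working that out is part of what the repair script would need to do.

-- Sketch: queue a missed log backup file for copying by inserting a row into
-- backupfiles_copylist via sqbdata. Values below are placeholders; the column
-- list and value formats mirror Peter's sample insert above.
DECLARE @backup_id  INT
DECLARE @local_file VARCHAR(512)
DECLARE @copy_to    VARCHAR(512)
DECLARE @sql        VARCHAR(2000)

SET @backup_id  = 182    -- placeholder: id of the related row in the data store
SET @local_file = 'e:\backups\LOG_(local)_pubs_20100414_200733.sqb'
SET @copy_to    = '\\staging01\backups\pubs\LOG_(local)_pubs_20100414_200733.sqb'

SET @sql = 'INSERT INTO backupfiles_copylist '
         + '(backup_id,name,copyto,overwrite,status,count,diskretryinterval,diskretrycount,mailto,mailto_onerror,created,lastupdated) '
         + 'VALUES ('
         + CAST(@backup_id AS VARCHAR(10)) + ','
         + '''' + @local_file + ''','
         + '''' + @copy_to + ''','
         + '0,''P'',0,30,10,'''','''','
         + 'GETDATE(),'                                           -- created: local time, as in the sample
         + '''' + CONVERT(VARCHAR(19), GETUTCDATE(), 120) + ''')' -- lastupdated: UTC

EXEC master..sqbdata @sql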