Question about Log Shipping + FTP + Restore...

mikebridge Posts: 19
edited March 31, 2008 1:35PM in SQL Backup Previous Versions
Hi-

I have a backup scheme which works well, running a full backup and differential backup daily, plus a log file backup hourly. Now I want to try using log shipping to take these transaction logs and synchronize to an offsite database.

I found that I can take my full 4 GB backup offsite via FTP and manually apply the transaction logs to it without any problem. When I tried to automate this, I used the "sqlbackup" command (as per http://www.yohz.com/logship.html) and pointed it at the directory containing the log files. Unfortunately, it doesn't seem to be able to tell which log files have already been applied and which haven't; it assumes you're submitting only the unapplied transaction log files. So it seems I need to write some sort of script to work out which files to pass to "sqlbackup" from the big mess of LOG files I'm synching off the server. (I know I can use MOVETO, but that creates problems for our FTP-synching routine, which is only smart enough to download anything that exists on our database server but not on the backup server.)

So my question is: given a directory of maybe 70 transaction log files, is there an easy way to figure out which 24 need to be passed to sqlbackup, before I resort to writing an ugly script to do it? (For example, is there a way to get the timestamp of the last applied log file, and work out from that which transaction log files are new?)

Thanks!

-Mike

Comments

  • petey Posts: 2,358 New member
    No, SQL Backup does not yet have a function to selectively restore only transaction log files that have not been applied.
    Peter Yeoh
    SQL Backup Consultant Developer
    Associate, Yohz Software
    Beyond compression - SQL Backup goodies under the hood, updated for version 8
  • Hi-

    This would be a very cool feature: it would mean you could create multiple offsite backups without any scripting, just by having client machines access the transaction logs. Once the logs are all accessible in a particular directory (either locally or across a network), each client would only need to point sqlbackup's "restore" function at that directory.

    The only issue requiring scripting (apart from transferring the files) is figuring out which TX logs haven't been applied yet.

    Cheers!

    -Mike

    petey wrote:
    No, SQL Backup does not yet have a function to selectively restore only transaction log files that have not been applied.
  • Brian Donahue Posts: 6,590 New member
    Hi Mike,

    You could write a script that queries msdb for the latest full backup's checkpoint LSN and then examines the header of each backup file to find the next one to restore. I'm not 100% sure this would work, but hopefully it gives you an idea of where to start!
    /* get the database checkpoint LSN */
    SELECT TOP 1 checkpoint_lsn
    FROM msdb.dbo.backupset
    WHERE database_name = 'MyDb'
      AND type = 'D' /* 'D' = full database backup */
    ORDER BY backup_finish_date DESC
    
    /* Check the SQL Backup file header LSN */
    
    execute master..sqlbackup '-sql "RESTORE SQBHEADERONLY FROM DISK=''d:\sql2005\mssql.1\mssql\backup\Log_MyDb.sqb'' WITH PASSWORD=''x''"'
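    Once you have the checkpoint LSN and the first/last LSN of each log file (e.g. parsed out of the RESTORE SQBHEADERONLY output above), the selection step could be sketched like this. This is a hypothetical helper, not part of SQL Backup; the tuple format is an assumption about how you'd collect the header data:

```python
def logs_to_apply(checkpoint_lsn, log_headers):
    """Pick the log files still needed after the last full restore.

    log_headers: list of (filename, first_lsn, last_lsn) tuples,
    e.g. parsed from RESTORE SQBHEADERONLY output (format assumed here).
    Returns filenames whose LSN range ends after the checkpoint LSN,
    sorted by first_lsn so they can be restored in sequence.
    """
    pending = [h for h in log_headers if h[2] > checkpoint_lsn]
    return [name for name, first_lsn, last_lsn in sorted(pending, key=lambda h: h[1])]
```

    Files whose range ends at or before the checkpoint LSN are already covered by the full backup and can be skipped.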
    
  • Hi Brian,

    Thanks for the pointer. I've just started working on selecting the files in the file transfer script, but if that doesn't work I'll have a closer look at doing it from the SQL side. I suspect that managing this outside SQL will be easier, because there are some issues to address such as avoiding applying partially-downloaded files.

    Thanks!

    -Mike

    Brian Donahue wrote:
    You could write a script that queries msdb for the latest full backup's checkpoint LSN and then examines the header of each backup file to find the next one to restore.
    
  • petey Posts: 2,358 New member
    Try the FTPTool utility, downloadable here.

    ## This is not an official Red Gate tool, just something I threw together. No warranties are expressed or implied here.

    FTPTool copies files from a source folder to an FTP site. It records the names of the files it has already copied in a state file, and skips those files on subsequent runs. It's a companion tool to CopyTool, described here. Note that it can only 'push' files to an FTP site; it cannot 'pull' them.
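    The push-only, remember-what-was-sent behaviour described above can be sketched in Python with the standard ftplib module. This is only an illustration of the idea, not how FTPTool actually works internally; the host, credentials, and state-file format are all placeholders:

```python
import ftplib
import os

def load_sent(state_file):
    """Names already pushed, one per line in the state file."""
    if not os.path.exists(state_file):
        return set()
    with open(state_file) as f:
        return {line.strip() for line in f if line.strip()}

def unsent_files(src_dir, sent):
    """Files in src_dir not yet recorded as copied."""
    return sorted(n for n in os.listdir(src_dir) if n not in sent)

def push_new_files(src_dir, state_file, host, user, password):
    """Upload each new file, recording it so the next run skips it."""
    sent = load_sent(state_file)
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        for name in unsent_files(src_dir, sent):
            with open(os.path.join(src_dir, name), "rb") as f:
                ftp.storbinary("STOR " + name, f)
            # record immediately so a crash mid-run doesn't re-send
            with open(state_file, "a") as log:
                log.write(name + "\n")
```

    Appending each name to the state file as soon as its upload finishes means an interrupted run picks up where it left off.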

    Feedback welcomed.
    Peter Yeoh
    SQL Backup Consultant Developer
    Associate, Yohz Software
    Beyond compression - SQL Backup goodies under the hood, updated for version 8
  • Hi-

    We actually just finished writing a little FTP utility that downloads everything matching a filename pattern from a directory and then deletes the downloaded files from the server. The strategy we're about to deploy is this:

    1) Configure the TX log backup routine to create a copy of each log in a separate directory on the main MSSQL machine.
    2) Keep the FTP synchronization service running on the remote backup machine to download from that directory into a local directory.
    3) Run sqlbackup from the task scheduler to restore everything in that local directory, then move the files elsewhere (or delete them).
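    Step 3 might look roughly like this sketch. The database name, folder paths, and the use of sqlcmd to invoke the sqlbackup extended stored procedure are all placeholders for whatever your setup actually uses; it also assumes the log filenames sort into restore order (SQL Backup's timestamped names usually do):

```python
import os
import shutil
import subprocess

APPLIED_DIR = r"d:\backup\applied"  # hypothetical "move it elsewhere" folder

def restore_statement(log_path, database="MyDb"):
    """Build a sqlbackup RESTORE LOG call for one transaction log file
    (same shape as the commands shown earlier in this thread)."""
    return ("EXECUTE master..sqlbackup "
            "'-sql \"RESTORE LOG [{db}] FROM DISK=''{path}'' "
            "WITH NORECOVERY\"'").format(db=database, path=log_path)

def apply_local_logs(local_dir):
    """Restore every downloaded .sqb log in name order, then archive it."""
    for name in sorted(os.listdir(local_dir)):
        if not name.endswith(".sqb"):
            continue
        path = os.path.join(local_dir, name)
        subprocess.run(["sqlcmd", "-Q", restore_statement(path)], check=True)
        shutil.move(path, os.path.join(APPLIED_DIR, name))
```

    Moving each file out of the watched directory after a successful restore is what keeps the next scheduled run from re-applying it.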

    Using FTP is not ideal, but it should be ok for a simple setup like ours.

    Thanks!

    -Mike


    petey wrote:
    Try the FTPTool utility, downloadable here.

  • petey Posts: 2,358 New member
    I'm interested to know how you avoided downloading files that are in the process of being copied. Was this a feature of the FTP server you were using?

    Thanks.
    Peter Yeoh
    SQL Backup Consultant Developer
    Associate, Yohz Software
    Beyond compression - SQL Backup goodies under the hood, updated for version 8
  • Hi-

    Yes, our server (Serv-U) lets you see the name of an in-progress file via "LS", but returns a 450 error when you try to download it. I'm not sure why they don't make the filename invisible altogether, but whatever.

    On the client side, we download files to a temporary directory and then do an atomic "move" operation once the download is complete, so the backup machine doesn't have the same partial-file problem.
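    That download-to-temp-then-move pattern is roughly the following; the assumption (loudly flagged) is that the temp directory is on the same volume as the watched directory, since os.replace is only atomic within one filesystem:

```python
import os

def publish_download(tmp_path, watch_dir):
    """Move a fully-downloaded file into the directory the restore job
    watches. The rename is atomic when tmp_path and watch_dir are on
    the same volume, so the restore job never sees a half-written file."""
    final_path = os.path.join(watch_dir, os.path.basename(tmp_path))
    os.replace(tmp_path, final_path)
    return final_path
```

    Until the move happens, the restore job's directory listing simply doesn't contain the file at all, which is the whole point.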

    So far it seems to be working fine: we download the new TX logs every hour, then restore them via a scheduled task on the backup machine overnight. We can deploy this to as many backup machines as we want this way.

    Cheers!

    -Mike


    petey wrote:
    I'm interested to know how you avoided downloading files that are in the process of being copied. Was this a feature of the FTP server you were using?

    Thanks.