Backup Copy of multiple databases at end of job
cmthomas68
Posts: 8 Bronze 1
in SQL Backup
We've been using Redgate backup for about 6 months now and love it; it makes managing backups of our server, which has upwards of 50 databases, much easier to manage and track.
What we do struggle with is the way the copy component is implemented: if we have a job that backs up all databases (except ones we've excluded), it performs the copy after each backup completes. In theory this should be fine, but we've found quite a few times that network performance issues on our system cause the actual backups to drag on while each one waits for its copy to finish.
We run a data warehouse with a nightly load that runs from 12-7am each day, so we definitely don't want backups running during this window and consuming CPU. Copying backup files during this window is less of a concern, since network load has minimal impact on our builds.
Is there a way to do this easily? We've recently switched to a separate robocopy job that runs at a set time, but it feels quite disconnected compared to doing it all in the same place.
Answers
When executing a backup job that backs up multiple databases using the COPYTO keyword, the job backs up each database and copies its backup file to the COPYTO location before moving on to the next database.
There is no feature within SQL Backup to delay the copying process until all of the backups specified in a job have completed, or to run the COPYTO process on its own. That leaves creating a separate copy job, as you have already configured, but the SQL Backup GUI will have no knowledge of that job to help you manage the copying process.
An alternative is to use the SQL Backup Cloud Storage Settings (see the product documentation for further reading); however, this does require that you have your own AWS S3 or Azure storage. The upload to cloud storage is performed by a separate service outside of the backup job, but you will have visibility of it within the SQL Backup GUI.
Many Thanks
Eddie
Senior Product Support Engineer
Redgate Software Ltd
Email: support@red-gate.com
I will feed your suggestion back to the SQL Backup Product Manager and Development Team.
I have submitted an enhancement request whose internal reference number is SB-5994.
Sadly, I cannot guarantee that the enhancement request will be successful, or if approval is given, what future version of SQL Backup it will appear in. I will update this post if I receive any further information.
A colleague has made me aware of a rarely used feature in SQL Backup, first introduced in v7.5. It requires editing the existing job via SSMS and adding a SQL Backup keyword called USEQUEUEDCOPY. This keyword allows you to place a non-transaction-log backup that uses COPYTO into the Log Copy Queue.
The Log Copy Queue is primarily used with Transaction Log Backups, where on completion of a backup using the COPYTO keyword, an entry is added to the Log Copy Queue and the copying process takes place outside of the Log Backup job. This is the default behaviour for Log Backups and is useful in a log shipping configuration to keep retrying the copy process in the event of a longer network outage.
By adding the USEQUEUEDCOPY keyword to a Full or Differential backup that uses the COPYTO keyword, on completion of the backup the backup file is not copied immediately; instead, an entry is added to the Log Copy Queue. The SQL Backup Agent service will then copy the backup file to the COPYTO location, retrying over a 24-hour period if necessary.
Below is example syntax you could try. The USEQUEUEDCOPY keyword is not configurable using the backup wizards in the GUI, so you will need to add the keyword manually by editing the job step in SSMS.
Example syntax before the change might look like the following:
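A sketch of what the existing job step might contain; the disk path, network share, compression level, and ERASEFILES retention below are placeholders, not your actual settings:

```sql
EXECUTE master..sqlbackup '-SQL "BACKUP DATABASES [*]
  TO DISK = [D:\Backups\<AUTO>.sqb]
  WITH COPYTO = [\\BackupServer\SQLBackups\],
  COMPRESSION = 3, ERASEFILES = 7"'
```

Here the copy to the COPYTO share runs inline, after each database's backup and before the next one starts.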
After adding the USEQUEUEDCOPY keyword:
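Continuing the sketch above (same placeholder paths and options), the only change is the extra keyword in the WITH clause:

```sql
EXECUTE master..sqlbackup '-SQL "BACKUP DATABASES [*]
  TO DISK = [D:\Backups\<AUTO>.sqb]
  WITH COPYTO = [\\BackupServer\SQLBackups\],
  USEQUEUEDCOPY,
  COMPRESSION = 3, ERASEFILES = 7"'
```

With USEQUEUEDCOPY present, the job step finishes as soon as the backups complete, and the SQL Backup Agent service performs the copies from the Log Copy Queue in the background.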
PLEASE NOTE:
You need to consider the impact the above may have on the Log Copy Queue and the workload of the SQL Backup Agent service. I recommend that you test carefully, first in a test environment and then in production.
If you already perform log backups at a high frequency (every 10 minutes or less), I suspect that adding the USEQUEUEDCOPY keyword will greatly increase the workload of the SQL Backup Agent and may overwhelm it.