Deploy-Once post-deployment scripts
andymcinnes
in ReadyRoll
Hi
We currently use ReadyRoll, but we have a scenario our setup doesn't currently seem to cater for.
At the moment everything in our database, from stored procedures to table schema changes and deploy-once scripts, is handled by ReadyRoll. When we deploy our product we tend to run our SQL the day before we deploy our code, and consequently we need to ensure we have no breaking changes.
This is great in theory, but there are situations where you can't avoid a breaking change, and so we require the use of a post-deployment script.
Currently, to do this, we have to write a script manually and send it over to be run manually after the deployment.
For our internal QA environments we use Octopus Deploy; for cloud we use Chef and a few manual steps; and for on-premise we package up our code and an installer runs the deployment.
So this got me thinking: is there a better way we can work this? I've done a little googling and noticed a few references to post-deployment scripts, and we also have some old post-deployment scripts in our solution, but I've not found any information on how you can specify that these post-deployment scripts should only be run once, or how you can manage this within your deployment configuration.
Please can you point me in the right direction?
Many thanks
Andy
Answers
Thanks for your question and it's great to hear you're still using ReadyRoll after all these years!
To clarify your deployment use case: when you refer to the need to deploy breaking changes, is this an event that is performed separately from the "main" database deployment due to a limitation you're running into with ReadyRoll's order of script execution?
Or is there an event, perhaps some sort of manual intervention or a rollback, that is occurring after the execution of Deploy-Once migrations (incidentally, for anyone else reading this: Deploy-Once migrations have since been renamed to Migrations) that necessitates the execution of a post-deployment script?
The manual intervention case, e.g. executing a long-running data transformation step, is a little difficult to prescribe without going more into the specifics. However in the case of a rollback, we tend to recommend simply making the required changes to the project within source control, e.g. by adding a new migration/updating a programmable object, and promoting it up through the pipeline again.
If I can get a better understanding of the use case, some other options might come to mind. Appreciate any further details you can provide!
Product Manager
Redgate Software
Thanks for the quick response. In terms of breaking changes, sometimes we need to make changes to the database that will only work once the new application code is deployed. We try to mitigate this situation as much as possible, but sometimes it can't be avoided.
Some basic examples might be changing a column type, or renaming variables we read from the database as part of tech debt clean-up, etc.
As we deploy our SQL a day before we deploy our code, we can't apply migrations that require the new code to be in place at that time. So in these instances we have to create a separate script that is run as we deploy the application code.
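To make that concrete, here's a rough sketch of the kind of split we end up doing by hand today (the table and column names below are invented purely for illustration):

-- Pre-deployment migration, run the day before the application code ships.
-- Add the new column alongside the old one so the existing code keeps working.
ALTER TABLE dbo.Customer ADD EmailAddress NVARCHAR(256) NULL;
GO
UPDATE dbo.Customer SET EmailAddress = Email;  -- backfill from the old column
GO

-- Post-deployment script, currently run by hand once the new code is live.
-- The old column can only be dropped after nothing reads from it any more.
ALTER TABLE dbo.Customer DROP COLUMN Email;
GO

It's that second part that we currently have no automated home for.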
Especially in the cloud, we may have a large number of tenants on a pod, each with their own database. Sometimes the migrations we run before we deploy our application code can take a long time, as we run the scripts one tenant at a time to avoid maxing out CPU on the database cluster. For this reason, we leave a day before deploying the application code.
Any post-deployment scripts tend to be quick to run, so we can deploy the application code and the post-deployment scripts together with very little disruption in between.
What I was hoping for was a way to leverage ReadyRoll so that the deploy-once scripts (migrations) and stored procedures etc. get bundled, as they do currently, for the pre-application deployment, but with another set of post-deployment migrations forming a separate bundle of scripts we can run after we deploy the application code.
From a Visual Studio/developer perspective, all scripts would still get run when you click deploy, as you will obviously have the new code anyway.
Many thanks
Andy
Thank you for elaborating on the deployment scenario you had in mind.
The short answer is that there's currently no specific support for orchestrating deployments in multiple stages.
The closest thing to achieving this would be to use two versions of your deployment package: one that includes the migrations for Stage A of the deployment (along with the Programmable Object changes), and another that includes the Stage B migrations. You could use semver-style folders to distinguish the release stages, e.g. 2.0.0-StageA and 2.0.0-StageB folders under Migrations.
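For instance, the project's Migrations folder might end up looking something like this (the version and script names are just placeholders):

Migrations
  1.0.0
    0001_Initial_Schema.sql
  2.0.0-StageA
    0001_Add_New_Column.sql
  2.0.0-StageB
    0001_Drop_Old_Column.sql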
Once development is complete and you're ready to test your release, move the 2.0.0-StageB folder out of the Migrations folder and commit to source control, so that the package for the initial deployment stage can be built (when this is eventually deployed to the SQL Server instance, only the scripts in 2.0.0-StageA will be executed).
Then, once the breaking code changes have been deployed, move the 2.0.0-StageB folder back into Migrations and commit and build again to produce the second package.
You'll then have two package versions in Octopus which can be executed in turn:
1. Run package 1 to release the Stage A migrations and Programmable Objects
2. Deploy the breaking code changes
3. Run package 2 to release the Stage B migrations
There's probably some optimisation that could be done here to reduce the friction of moving folders around, such as keeping a branch open for the Stage B changes until it's time to perform that part of the deployment. But unfortunately that's probably the most automated/systematic approach to handling releases in stages for the moment.
I'd be interested to hear whether any part of this approach would be workable for your team, given the complexities involved in your environments.
Product Manager
Redgate Software