
Auto Build/Deploy fails: SchemasAndStaticDataAreNotIdentical

alanna Posts: 10
edited March 25, 2015 10:47AM in DLM Automation
I have configured an automatic build/deployment in TeamCity using SQL CI and Octopus Deploy. It is set up to build the database from source control and deploy it to database Test_B. The automatic deployment raises an error that I haven't been able to figure out yet.

I should also point out that we are just evaluating all these products, so don't have much experience with them/how they work.

I have local database Test_A, which is connected to source control (Git) using SQL Source Control. I took a backup of Test_A and restored it to another local database called Test_B.
I made a change to one record in a static data table in Test_A and committed it to our Git repository.

If I use SQL Compare between Test_A & Test_B, it shows that the 2 databases are identical.
SQL Data Compare shows that all but one of the static tables are identical - the one that differs is the table whose data I'd updated. It also shows that the remaining tables couldn't be compared - these are the tables I'd chosen to exclude from the comparison (they're not static, so I don't care about their contents).

So far there are no issues.

If we now go into TeamCity and run the build project, it doesn't complete successfully. The process builds the database and publishes the NuGet package to Octopus Deploy, but fails when it actually does the deploy. SQL Compare appears to complete successfully, but then the TeamCity build reports the following error (from SQL Release, I believe):
[11:11:40][Failed: Tentacle script execution] Use-DatabaseRelease : Pre-update schema check failed: the schema in the Database Release object (created at 2015-03-20 
[11:11:40][Failed: Tentacle script execution] 11:10:40 AM -04:00) and database 'Test_B' on server 'DEV-LAPTOPMSI' are different, or static data is different.
[11:11:40][Failed: Tentacle script execution] At C:\Users\<userName>\AppData\Local\Tentacle\Temp\9b2741b0-3724-4b82-bc9b-7393f60a5b28.ps1:40 char:12
[11:11:40][Failed: Tentacle script execution] + $release | Use-DatabaseRelease -DeployTo $targetDB -Verbose
[11:11:40][Failed: Tentacle script execution] +            ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
[11:11:40][Failed: Tentacle script execution]     + CategoryInfo          : InvalidResult: (:) [Use-DatabaseRelease], TerminatingException
[11:11:40][Failed: Tentacle script execution]     + FullyQualifiedErrorId : SchemasAndStaticDataAreNotIdentical,RedGate.SQLRelease.PowerShell.Commands.UseDatabaseReleaseCommand
[11:11:40][Failed: Tentacle script execution] ==============================================
[11:11:40][Failed: Tentacle script execution] PowerShell exit code: 1
[11:11:40][Failed: Tentacle script execution] ==============================================
[11:11:40][Failed: Tentacle script execution] PowerShell script returned a non-zero exit code: 1

We do use a filter during the SQL CI process that excludes the following from the build:
- one schema (its contents vary from one environment to the next)
- a series of local/remote views (we have a stored procedure that needs to be run after every build in order to update/create these)
- a number of tables whose structure varies by production environment - the first few columns of each of these tables are the same in every environment, but beyond that most of the column names differ, and the number of columns in the same table often differs across environments

Why would it error out stating that static data is different? Shouldn't it be able to update static data (and schemas, for that matter)? I have a feeling there's a property I should be setting, but I can't find much guidance online as to what I should be looking for.


    Hi Alanna,

    Your error tells me that $targetDB is not the same as the -Target parameter of New-DatabaseRelease. There are a number of reasons this could happen:
    1. The source and target parameters have been mixed up (Source should be Test_A and Target is Test_B)
    2. Unlikely - Test_B has been changed between creating the release and using it
    3. A bug in SQL Release
    To help diagnose the problem, could you please send us the script you are using and the log with the debug output? To get the debug output, add this line at the top of your script:
    $DebugPreference = "Continue"
    If you don't want to post these to a public forum, you can send them to us at SQLReleaseSupport@red-gate.com.
    Peter Gerrard

    Software Engineer
    Redgate Software
    Thanks Peter.

    In TeamCity, the deploy step is configured as follows:
    Runner Type: Command Line
    Step Name: Auto deploy to Testing
    Run: Executable with parameters
    Command Executable: C:\Octopus\OctopusTools.
    Command parameters: create-release --server="http://<OurInternalServerName>/Octopus/" --apiKey=<OurAPIKey> "--project=Compas" --version=0.%build.number% --deployTo=Testing --waitfordeployment --enableServiceMessages --progress

    In Octopus Deploy, there is an environment configured called Testing that points to my local machine. Clicking on "check health" reports that it's able to connect to the machine OK. The Compas project is configured with 2 process steps: 1. Download the package for the database - this step completes successfully. 2. Deploy the package to the database - this is the step that fails.

    In the Octopus Deploy step responsible for deploying the nuget package to the database in question, I'm using the Redgate template for the "Redgate - Deploy from Package" step type.

    The template is set up as follows:
    Database package step: The previous step for downloading the package for the database is selected.
    Target SQL Server instance is set to the variable #{DATABASE_SERVER}, which is assigned the value of the name of my local machine/SQL Server instance: DEV-LAPTOPMSI
    Target database name is set to the variable #{DATABASE_NAME}, which is assigned the value of Test_B
    Username & Password are blank (NOTE: Every time I check this process step, it pre-populates my username & password. If I make changes, I have to remember to clear these fields again before hitting save)
    Filter path: empty. I have tried setting it to C:\SQLCompareFiltersForTentacle\mainDatabaseExclusions.scpf. NOTE: This is the same filter that TeamCity uses when it builds the database NuGet package initially. I have tried it both with and without the filter at this point, and the outcome is the same.

    Presumably this configuration will properly set the target and source databases.

    I'm unsure of how to get my hands on the script that this process is using - I did spend some time last week trying to find it because I figured if I could just try running it myself, it may help point me in the right direction. However, I wasn't successful - it seems as though all the scripts/files that are created get deleted as soon as it fails (which makes sense) and I haven't been fast enough to copy them before they're removed.

    I can, however, email you a copy of the log files from both Octopus Deploy and TeamCity. Hopefully those will help.
    I'd like to thank SQL Release Support for helping me figure this out. Just in case anyone else encounters this issue, I'll explain what went wrong and how it was resolved.

    A quick summary of what the issue was:
    The first issue we encountered was due to the fill factor bug that they say will be fixed in an upcoming release - in the meantime, the workaround is to skip the pre-update check (see step 2 below).

    Once that was in place, we discovered an issue with the foreign keys on two of our static tables - these foreign keys reference the static table that I'd updated as part of my test. When data was added to that table, the constraints had to be dropped and re-added, and re-adding them had the side effect of leaving them untrusted (WITH NOCHECK). This caused the post-update check to fail. It is possible to skip this check too (add -SkipPostUpdateSchemaCheck to the script mentioned in step 2 below), but the SQL Release support team doesn't recommend it.
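    To illustrate what was happening under the hood, here's a sketch of the pattern in T-SQL (the table and constraint names are made up, not our actual schema):

```sql
-- Dropping and re-adding a foreign key the way the deployment did.
-- Re-adding it WITH NOCHECK skips validation of the existing rows, so SQL
-- Server marks the constraint as untrusted (is_not_trusted = 1), which is
-- what tripped the post-update check.
ALTER TABLE dbo.OrderStatusMap
    DROP CONSTRAINT FK_OrderStatusMap_Status;

ALTER TABLE dbo.OrderStatusMap WITH NOCHECK
    ADD CONSTRAINT FK_OrderStatusMap_Status
    FOREIGN KEY (StatusId) REFERENCES dbo.StatusLookup (StatusId);
```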

    Instead, here's what I had to do to fix it:

    1. In Octopus Deploy, open the Step Template for Redgate - Deploy from Package
    2. At the end of the script, change: $release | Use-DatabaseRelease -DeployTo $targetDB -Verbose
    to: $release | Use-DatabaseRelease -DeployTo $targetDB -SkipPreUpdateSchemaCheck -Verbose
    3. Click Save and be sure to update the process step in your project to use the updated step template
    4. Go into the Octopus Deploy project, click on Process and select the process step associated with the Redgate - Deploy from Package template (for me, this was process step 2, the first one being the one that downloads the package)
    5. In the SQL Compare options field, enter the following then click Save: IgnoreWithNoCheck
    6. Add a new step, also using the Redgate - Deploy from Package template
    7. Enter all the same information as in step 2, but this time leave the SQL Compare options field blank.

    Instead of doing steps 6 & 7, you could achieve the same thing by either manually running SQL Compare after the deployment completes (not really an option if you're looking to fully automate the whole process), or by using Octopus Deploy to run a SQL script that drops and re-adds the constraints for the tables in question. The latter wasn't an option for us, as we'd like to remove all manual intervention/customization from our deployments, and it would require creating a custom script for every deployment depending on which static tables were updated.
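    For anyone who does go the script route: rather than literally dropping and re-creating each key, you can re-validate the untrusted ones, which makes them trusted again if all existing rows pass. A sketch (the table name is made up):

```sql
-- List the foreign keys the deployment left untrusted.
SELECT OBJECT_NAME(parent_object_id) AS table_name, name AS constraint_name
FROM sys.foreign_keys
WHERE is_not_trusted = 1;

-- Re-validate every constraint on an affected table; if all existing rows
-- pass, SQL Server marks the constraints as trusted again.
ALTER TABLE dbo.OrderStatusMap WITH CHECK CHECK CONSTRAINT ALL;
```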

    NOTE: They did say that they'd look into the issue surrounding the constraints - hopefully at some point in the future it won't be an issue anymore and I can remove that last step that I'd added to Octopus Deploy.