
Best practices for automating deleting a local/development database

During full deployments to logically transient non-prod environments, we want databases to be recreated from scratch every time.  For that matter, not every dev will be concerned with deploying migrations locally.

I've investigated a few options; is there anything considered best practice by those who have been using the product(s) for a while?

For devs specifically...
  • Have I missed some SCA/VS user configuration such that VS deployments "overwrite"?  Sometimes devs won't care; sometimes we just want to blow away the DB and start from scratch before writing new migrations, without changing tooling.
For automation...
  • Is there some argument I've missed on the SCA scripts that already does this?
  • Roll our own process to drop the DB(s) before entering the SCA process?
  • Roll our own conditional SQL in the project's normal script process, triggered by a SQLCMD variable? ...though it's concerning to deploy a DROP DATABASE to prod.
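For what it's worth, that third option could be sketched roughly like this (a minimal SQLCMD fragment, not a tested pre-script; the variable names `AllowDropDatabase` and `DatabaseName` are hypothetical, and the script would have to run against master, not the target database itself):

```sql
-- Safe default: the drop never happens unless the pipeline explicitly
-- overrides this variable, e.g. sqlcmd ... -v AllowDropDatabase="True"
:setvar AllowDropDatabase "False"

IF '$(AllowDropDatabase)' = 'True' AND DB_ID(N'$(DatabaseName)') IS NOT NULL
BEGIN
    -- Kick out existing connections, then drop.
    ALTER DATABASE [$(DatabaseName)] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE [$(DatabaseName)];
END
```

The default-to-"False" guard is the point: a prod deployment that forgets to set the variable does nothing destructive.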
Thank you! :)

Best Answer

  • Andrea_Angella Posts: 66 Silver 4
    I am not aware of any argument you can provide to get the behavior you expect.

    The last option you suggest can be achieved using a pre-script but I would be very concerned to drop production unintentionally.

    So I suggest the second option of adding an extra step to drop the database before calling SCA. Add this extra step only when you deploy to transient environments. 

    Andrea Angella
    Lead Software Engineer | Red Gate
    Website: www.productivecsharp.com
    Twitter: @angella_andrea
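A minimal sketch of that extra step, assuming the SqlServer module is available (its `Invoke-Sqlcmd -ConnectionString` parameter needs a reasonably recent module version) and that `$targetConnectionString` and an `$IsTransientEnvironment` flag are supplied by your deployment tooling; both names are hypothetical:

```powershell
Import-Module SqlServer

# Only transient (non-prod) environments opt in to the drop.
if ($IsTransientEnvironment) {
    $csb = New-Object System.Data.Common.DbConnectionStringBuilder
    $csb.set_ConnectionString($targetConnectionString)
    $dbName = $csb.'Database'
    $null = $csb.Remove('Database')   # connect at server level, not to the DB

    # Drop and recreate so SCA always deploys to an empty database.
    Invoke-Sqlcmd -ConnectionString $csb.ConnectionString -Query @"
IF DB_ID(N'$dbName') IS NOT NULL
BEGIN
    ALTER DATABASE [$dbName] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE [$dbName];
END
CREATE DATABASE [$dbName];
"@
}
# ...then run the normal SCA release against $targetConnectionString.
```

Because the flag lives in the pipeline configuration rather than in the deployed SQL, there is nothing drop-related in the artifact that reaches prod.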

Answers

  • pmenard Posts: 5 New member
    Thanks Andrea :)

    For the devs,
    A toggle in SCA's UI would be great.  There are limited options for automating this for the devs (a new configuration copied from Debug, with an added drop task?).

    As for automation,
    It has to run somewhere, and with a push for consistent deployment tooling across environments, the drop code would end up somewhere that is also included in production deployments.
    It would be great if SCA had an argument which could be driven from configuration so roll-your-own wasn't necessary.

    But I agree: if my current choices are conditional SQL or our overall management scripts, the latter wins.

    Thank you!
  • Are you thinking of non-prod environments like an integration/qa/staging/etc. environment? Or a dev environment? For the former, part of what I would do, if needed, is orchestrate this in the pipeline to clean out or drop the database.
  • pmenard Posts: 5 New member
    @way0utwest
    >integration/qa/staging/etc environment? Or a dev environment? 

    Automation => all common environments, dev -> prod.  Only lower environments ever get an automated 'drop' database, however.

    UI => local/personal dev environments



  • I suppose you could write a script in the pre area that checks for a set of instance names (either the intermediate environments, or just prod) and then executes, or skips, the DROP DATABASE.
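That instance-name check might look something like this (a sketch only; the server names are placeholders, and the actual drop logic is elided):

```sql
-- Guard the drop on the instance we are actually connected to.
DECLARE @server sysname = CAST(SERVERPROPERTY('ServerName') AS sysname);

IF @server IN (N'DevServer01', N'QAServer01')  -- transient environments only
BEGIN
    PRINT 'Transient instance ' + @server + '; safe to drop.';
    -- DROP DATABASE logic would go here.
END
ELSE
    PRINT 'Protected instance ' + @server + '; skipping drop.';
```

An allow-list (drop only on named dev/QA instances) fails safer than a deny-list, since a new, unlisted prod server is skipped by default.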

    While I think the idea of starting your db from scratch is great, there is the data issue. Loading a set of test data is something you need to consider, or I'd expect you would. In any case, if you start each dev session with a new db, I think you're ahead of most shops.

    I'll drop a note to the team about how someone might integrate this with SCA-Client.
  • pmenard Posts: 5 New member
    Thanks @way0utwest
    >write a script in the pre area
    I think I'm still leaning towards keeping the drop controls outside of SQL scripts for now.

    >Loading a set of test data is something you need to consider
    Absolutely :)

    We're in the midst of migrating an existing home-grown deployment system to SCA...

    Primary development occurs within a personal environment, and gets built from scratch with basic metadata.  Test-specific data is loaded only as necessary.

    For non-prod shared environments, we have mixtures of fully redeployed and long-lived/migrated databases.  We drive test-data loads from the tests scheduled against the particular database.

    ...as such, test-data will likely continue to be non-SCA, though now that you have me thinking about it... 💡 I do wonder if additional, data-only, SCA projects could be paired with our integration tests to package data & tests side by side. 🤔   ...that won't be until later this year though :)
  • Phil_Fact0r Posts: 20 Bronze 3
    edited August 28, 2019 10:38AM
    @pmenard I may have misunderstood the problem, but you can easily use SCA to create and build a whole set of clean databases, recreated from scratch every time.

    It is probably best not to wire this into SCA as a feature because it is so simple to do. I like the way that SCA actually allows you to do things the way you prefer.

    The trick is to delete the database and re-create it before you do the release. Because all your releases are identical (same source and the target is an empty database) you can use just one Release object. This saves a lot of time!

    Here is the whole script, including the database deletion and recreation. Please let me know if it isn't quite what you're looking for and I'll fix it.
    Import-Module SqlChangeAutomation
    Import-Module SqlServer
    <# here you specify the source of the release, a source control directory in our case but it could
    be a single build script or a connection string to a source database
    You need to specify the project details under 'projects' and then a list of connection strings
    for each of the targets on which you'd like a clean install
    #>
    $Databases =
    @{
           'source' = 'MyPathToTheScriptDirectory';
           'Project' = @{ 'name' = 'KillnFill'; 'version' = '1.4.5'; 'description' = 'This demonstrates how to do a clean build' }
           
           'CleanTargets' = @(
        <# list of connection strings for each of the targets on which you'd like a clean install    #>
                  'Server=Philf01;Database=Thomas;User Id=PhilFactor;Password=Wouldntyouliketoknow;Persist Security Info=False',
                  'Server=Philf01;Database=Richard;User Id=PhilFactor;Password=Wouldntyouliketoknow;Persist Security Info=False',
                  'Server=Philf01;Database=Harold;User Id=PhilFactor;Password=Wouldntyouliketoknow;Persist Security Info=False',
                  'Server=Philf01;Database=Ebeneezer;User Id=PhilFactor;Password=Wouldntyouliketoknow;Persist Security Info=False'
                  
           )
           
    }
    
    $ReleaseArtifact = $null; #start off with a null release artifact so you know when to make one
    $databases.CleanTargets | foreach {
           $csb = New-Object System.Data.Common.DbConnectionStringBuilder
           $csb.set_ConnectionString($_) #we need to remove the database to connect at server level
           $null = $csb.Remove('database') #Remove() returns a bool; we only need the side effect
           try # now we make an SMO connection  to the server, using the modified connection string
           {
                  $sqlConnection = new-object System.Data.SqlClient.SqlConnection $csb.ConnectionString
                  $conn = new-object Microsoft.SqlServer.Management.Common.ServerConnection $sqlConnection
                  $srv = new-object Microsoft.SqlServer.Management.Smo.Server $conn
           }
           catch
           {
                  Write-Error "Could not connect to SQL Server instance '$($csb.'server')': $($error[0].ToString() +
                         $error[0].InvocationInfo.PositionMessage). Script is aborted"
                  exit -1
           }
           $csb.set_ConnectionString($_) #restore the connection string as we need the name of the database
           $Db = $csb.'database'
           
           #if the database already exists, then kill it
           If (@($srv.Databases | % { $_.Name }) -contains $csb.'Database') # search for the name
           { $srv.KillDatabase($csb.'Database') } # if it is there then kill it
           # in one of my articles, I show how to write out any changes
           # Now we create the database
           $DestinationDatabaseObject = `
           New-Object Microsoft.SqlServer.Management.Smo.Database ($srv, $csb.'Database')
           $DestinationDatabaseObject.Create()
           # double check that it all worked
           if ($DestinationDatabaseObject.name -ne $csb.Database)
           {
                  Write-Error "Could not create database $($csb.'Server').$($csb.'Database'): $($error[0].ToString() +
                         $error[0].InvocationInfo.PositionMessage). Script is aborted"
                  exit -1
           }
           if ($ReleaseArtifact -eq $null)
           {
                  #only do this once since all release artifacts will be the same
                  Write-Verbose "Now creating the build object using ( $($csb.'Server').$($csb.'Database'))"
        <# we create a build artifact so we can put in the package version etc#>
                  $buildArtifact = $databases.source |
                  New-DatabaseProjectObject | #wrap up the script and create a build artefact
                  New-DatabaseBuildArtifact -PackageId $Databases.Project.name -PackageVersion $Databases.Project.version `
                                                             -PackageDescription $Databases.Project.Description
                  Write-verbose "Now creating the release object using ( $($csb.'Server').$($csb.'Database'))"
                  
                  $ReleaseArtifact = new-DatabaseReleaseArtifact  `
                                      -Source $buildArtifact  `
                                      -Target $csb.ConnectionString   `
                                      -AbortOnWarningLevel None -SQLCompareOptions IgnoreSystemNamedConstraintNames
           }
           #at this point, we will have the release object so we can use it
           Use-DatabaseReleaseArtifact -InputObject $ReleaseArtifact -DeployTo $csb.ConnectionString
    }
    