Differences between Cmdlets
ChrisGStevens
My goal is to create specific objects and then deploy the static data. I have a few different static data scenarios.
I have a database with several different schemas. My goal is to start small with just a single schema in the repository.
I created a Filter to only look at a specific DatabaseUser, Role, Schema and 4 tables.
Basically, we have a single database in which all of our applications exist. The applications are split by schema, and I want to be able to automate build, testing, and deployment at the schema level, and then run tests on the whole thing.
The following script validates the schema using the filter but the output contains ALL objects.
If I am only validating based on the filter, shouldn't the output of Invoke-DlmDatabaseSchemaValidation contain only the objects that were validated?
When executing the Documentation and Package cmdlets, I can see all of the objects which I am trying to filter out.
$myScriptPath = "C:\Dev\MyDatabase"
$myFilterFile = "C:\Dev\SSO_Static_Deployment_Filter.scpf"
$compareOptions = "-IncludeDependencies,-IgnoreUserProperties"
$validatedSchema = Invoke-DlmDatabaseSchemaValidation $myScriptPath -Filter $myFilterFile -SQLCompareOptions $compareOptions
$documentation = New-DlmDatabaseDocumentation $validatedSchema
$databasePackage = New-DlmDatabasePackage $validatedSchema -PackageId "MyDatabase" -PackageVersion "1.0" -Documentation $documentation
Export-DlmDatabasePackage $databasePackage -Path "C:\Dev\Packages"
---------------------------------------------------
The other issue I noticed with the validation is that -IgnoreStaticData should be an option. Every cmdlet should have it.
If we are going to be able to include static data, then there should be an option to use a SQL Data Compare filter/configuration. Perhaps you can and I just haven't found it yet.
I have tables with fields I don't want to include in my data compares, such as CreateDate and ModifyDate. Since I can't limit the fields, DLM Automation complains that you can't insert NULL values into a NOT NULL column.
I get around this by limiting the fields in a SQL Data Compare project, which forces the column defaults to kick in when executing SQLDataCompare from PowerShell.
I also have tables that hold static data in some columns but dynamic or environment-specific data in others, which is another reason I want to include only the columns whose data is static and the same in every environment.
Answers
I didn't notice the difference between Invoke-DlmDatabaseSchemaValidation and New-DlmDatabaseDocumentation: one takes -Filter and the other takes -FilterPath.
I need to make sure I apply the same filter and config to the entire flow.
Now I just need to be able to use the Filter and such when creating my package.
You don't need to set a filter when creating the NuGet package, because New-DlmDatabasePackage will use the schema that was validated in Invoke-DlmDatabaseSchemaValidation, taking into account any filtering that took place in that step.
The static data feature is more limited than SQL Data Compare, and it will not allow you to use data filters or any SQL Data Compare options. If you need this, I would advise you to run SQLDataCompare.exe, which you can find in the SDC subfolder of your DLM Automation install folder, as a separate step in your process.
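As a rough illustration, such a separate step could call the command-line tool with a saved SQL Data Compare project (the install path, project file name, and exit-code handling below are assumptions; check the switches against your SQL Data Compare version):

```powershell
# Hedged sketch: run SQL Data Compare as a separate static-data step.
# Paths and the project file name are illustrative, not from the thread.
$sdcExe = "C:\Program Files (x86)\Red Gate\DLM Automation\SDC\SQLDataCompare.exe"

# /project loads a saved comparison project (which can exclude columns
# like CreateDate/ModifyDate); /sync applies the changes to the target.
& $sdcExe /project:"C:\Dev\SSO_StaticData.sdc" /sync

# Treat a nonzero exit code as a failed deployment step.
if ($LASTEXITCODE -ne 0) {
    throw "SQLDataCompare.exe failed with exit code $LASTEXITCODE"
}
```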
Product Support Engineer
Redgate Software Ltd
Please see our Help Center for detailed guides on how to use our tools
Thanks for getting back to me.
When I do the following, Invoke-DlmDatabaseSchemaValidation will use my Filter and CompareOptions.
$validatedSchema = Invoke-DlmDatabaseSchemaValidation $myScriptPath -Filter $myFilterFile -SQLCompareOptions $compareOptions
But the output in $validatedSchema includes ALL objects. Am I missing a flag or something?
I would expect it to only contain the objects which are not filtered out. In my case a single schema.
I found that I have to pass the Filter and CompareOptions when creating the documentation, even though I am only passing it the $validatedSchema results. If not, it will use the whole repo and try to document it.
$documentation = New-DlmDatabaseDocumentation $validatedSchema -FilterPath $myFilterFile -SQLCompareOptions $compareOptions
I did notice that if I don't include the filter when creating the documentation I get an error because of a synonym pointing to a linked server.
But even though I received this error, it still creates the documentation based on what was validated.
So it is almost as if, yes, New-DlmDatabaseDocumentation does use the $validatedSchema when creating the documentation, but if I don't pass the filter, this cmdlet wants to validate objects which should not be checked, or whatever is happening under the covers.
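For reference, the end-to-end flow I have working so far passes the same filter file and compare options at every step that accepts them (note the -Filter versus -FilterPath parameter names); this is just my own working sketch built from the cmdlets above:

```powershell
# Apply the same filter and compare options at every step that accepts them.
# Invoke-DlmDatabaseSchemaValidation takes -Filter; New-DlmDatabaseDocumentation takes -FilterPath.
$myScriptPath   = "C:\Dev\MyDatabase"
$myFilterFile   = "C:\Dev\SSO_Static_Deployment_Filter.scpf"
$compareOptions = "-IncludeDependencies,-IgnoreUserProperties"

$validatedSchema = Invoke-DlmDatabaseSchemaValidation $myScriptPath -Filter $myFilterFile -SQLCompareOptions $compareOptions
$documentation   = New-DlmDatabaseDocumentation $validatedSchema -FilterPath $myFilterFile -SQLCompareOptions $compareOptions

# New-DlmDatabasePackage exposes no filter parameter; it packages the validated schema as-is.
$databasePackage = New-DlmDatabasePackage $validatedSchema -PackageId "MyDatabase" -PackageVersion "1.0" -Documentation $documentation
Export-DlmDatabasePackage $databasePackage -Path "C:\Dev\Packages"
```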
Now, using New-DlmDatabasePackage: it doesn't allow me to use the Filter or CompareOptions, so I get all of the repo plus the filtered documentation, and the two don't match.
So right now I don't see a way for me to create a package based on what has been validated or filtered down to.
I have tried the following using the Package and ValidateSchema results.
When I sync, I can use the validated schema, but unless I use the filter I will get objects that should have been filtered out.
Sync-DlmDatabaseSchema -Source $validatedSchema -Target $myTempDB #-FilterPath $myFilterFile -SQLCompareOptions $compareOptions
Thank you for the info about the data. I have been working on a way to do that using SQLDataCompare.exe.
But first I would love to get my NuGet package created with only the objects that I want.
Basically, our apps are separated by schema, and I want to be able to automate building a single schema and its dependencies.
The first schema contains "Shared" objects which are used by other schemas but not all.
I went ahead and created a second filter for a different schema, "Customer", which has a dependency on the first schema.
I have successfully created two PowerShell scripts, and this seems to work great.
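The two scripts are essentially the same flow with a different filter file and package id; a parameterized sketch (the parameter names, filter path, and package id here are illustrative, not my actual files):

```powershell
# Hedged sketch of a per-schema build script (e.g. Shared.ps1 / Customer.ps1).
# Only the filter file and package id differ between schemas.
param(
    [string]$FilterFile = "C:\Dev\Customer_Filter.scpf",   # illustrative path
    [string]$PackageId  = "MyDatabase.Customer"            # illustrative id
)

$myScriptPath   = "C:\Dev\MyDatabase"
$compareOptions = "-IncludeDependencies,-IgnoreUserProperties"

# Validate only the filtered schema, document it with the same filter,
# then package and export the validated result.
$validatedSchema = Invoke-DlmDatabaseSchemaValidation $myScriptPath -Filter $FilterFile -SQLCompareOptions $compareOptions
$documentation   = New-DlmDatabaseDocumentation $validatedSchema -FilterPath $FilterFile -SQLCompareOptions $compareOptions
$databasePackage = New-DlmDatabasePackage $validatedSchema -PackageId $PackageId -PackageVersion "1.0" -Documentation $documentation
Export-DlmDatabasePackage $databasePackage -Path "C:\Dev\Packages"
```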
So right now, besides the NuGet package containing all of the objects, I thought this was going to work...
My new issue in my attempt is Migrations.
Once I added a new migration to my Shared schema, I was no longer able to deploy the Customer schema.
Basically my migration does this.
I can't get past the validation when I am running for the Customer schema; it is comparing the Shared schema, as I get the following error.
I can go back, remove the field, remove my migration history for the Shared sync, and rerun Shared.ps1.
It runs fine, as does the next run with no change.
But for some reason, once I added the migrations, Customer.ps1, which only looks at the Customer schema based on the filter, is looking at the Shared objects and those objects' migrations.