Error When Getting Latest
dmeyer
Posts: 3
I am trying to get latest on a database table that has ~58,000 records. When I get latest, I receive a "System.OutOfMemoryException" message. I have already gotten latest on all of the other database objects, so this is the only thing I have left to sync. I have tried closing SQL management studio, restarting Windows, etc. Does anyone have any suggestions? The complete error message is as follows:
System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
   at System.Windows.Forms.Control.MarshaledInvoke(Control caller, Delegate method, Object[] args, Boolean synchronous)
   at System.Windows.Forms.Control.Invoke(Delegate method, Object[] args)
   at System.Windows.Forms.Control.Invoke(Delegate method)
   at #JLc.#fXA.Invoke(MethodInvoker )
   at #JLc.#fXA.#mN7(MethodInvoker )
   at #JLc.#PLc.#ETc(Object , #eTb )
   at #JLc.#PLc.#t.#Jfb.#EJf()
   at RedGate.SQLSourceControl.Engine.SmartAssembly.ExceptionReporting.ErrorReporterBase.Do(Action , Predicate`1 , Boolean )
   at RedGate.SQLSourceControl.Engine.SmartAssembly.ExceptionReporting.ErrorReporterBase.DoWithObviousExceptionsRethrowAll(Action action)
   at RedGate.SQLSourceControl.CommonUI.Forms.ErrorDialog.DoWithObviousExceptionsRethrowAll(Action action)
   at #JLc.#PLc.#CTc(ICancellableOperation`1 , Object )
Comments
Getting the data synced into an empty database is one thing, but when SSC has to detect changes before a commit to source control, a significant amount of data causes it to spend a long time working out the data differences.
It got so bad for us on our platform that we decided to take data out of source control until such time as SSC can handle it much more effectively. We now run the data in as a manual post-build script, and I hear on the grapevine that the next version of SSC may have support for this type of pre/post-build script, which would be great!
Sorry not to be able to offer a solution. I just wanted to share our experience, and I'd be keen to find out if there are workarounds to speed up the data sync process in SSC.
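The post-build data script approach described above can be sketched roughly as follows. This is a minimal, hypothetical illustration (using Python's built-in sqlite3 as a stand-in for SQL Server; the table and data are invented): the point is that the script is idempotent, so it can safely run on every build/deployment.

```python
import sqlite3

# Hypothetical static lookup data that would otherwise live in source control.
COUNTRY_CODES = [
    ("US", "United States"),
    ("GB", "United Kingdom"),
    ("DE", "Germany"),
]

def load_static_data(conn):
    """Idempotently (re)load a lookup table - the job a post-build data script does."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS CountryCodes (Code TEXT PRIMARY KEY, Name TEXT)"
    )
    # INSERT OR REPLACE (an upsert) keeps the script safe to re-run:
    # existing rows are refreshed rather than duplicated.
    conn.executemany(
        "INSERT OR REPLACE INTO CountryCodes (Code, Name) VALUES (?, ?)",
        COUNTRY_CODES,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load_static_data(conn)
load_static_data(conn)  # re-running leaves the table unchanged: still 3 rows
print(conn.execute("SELECT COUNT(*) FROM CountryCodes").fetchone()[0])  # -> 3
```

On SQL Server the same pattern would typically use MERGE or an IF NOT EXISTS guard, but the idempotency idea is the same.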
aolcot is right, pretty much. Currently, the source controlling of data isn't really designed to handle large numbers of records. When we say you can source control static data, we mean the kind of lookup tables your application uses that don't change too frequently and don't hold large numbers of records: things like country codes, categories of users, and product types.
58,000 records sounds more like transactional data, such as a list of customers; that's not really the kind of thing we'd currently expect you to manage through source control, unfortunately. If this *is* static data that developers need to maintain, we'd love to know the details behind it though!
As a workaround, you can use SQL Data Compare to sync the records for these tables between developers' databases.
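The core idea behind a data-compare sync can be sketched in a few lines. This is only an illustrative toy (again using sqlite3, with invented table names, and assuming the first column is the primary key), not how SQL Data Compare itself works internally: rows are classified as needing an INSERT, UPDATE, or DELETE to bring the target in line with the source.

```python
import sqlite3

def diff_tables(conn, source, target):
    """Row-level diff of two identically-shaped tables, keyed on the first
    column: returns keys to insert into, update in, and delete from target."""
    src = {row[0]: row[1:] for row in conn.execute(f"SELECT * FROM {source}")}
    tgt = {row[0]: row[1:] for row in conn.execute(f"SELECT * FROM {target}")}
    inserts = sorted(k for k in src if k not in tgt)            # only in source
    deletes = sorted(k for k in tgt if k not in src)            # only in target
    updates = sorted(k for k in src if k in tgt and src[k] != tgt[k])
    return inserts, updates, deletes

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Src (Code TEXT PRIMARY KEY, Name TEXT)")
conn.execute("CREATE TABLE Tgt (Code TEXT PRIMARY KEY, Name TEXT)")
conn.executemany("INSERT INTO Src VALUES (?, ?)",
                 [("US", "United States"), ("GB", "United Kingdom")])
conn.executemany("INSERT INTO Tgt VALUES (?, ?)",
                 [("GB", "Great Britain"), ("FR", "France")])
print(diff_tables(conn, "Src", "Tgt"))  # -> (['US'], ['GB'], ['FR'])
```

Loading both tables into memory is exactly what becomes expensive at scale, which is consistent with the memory problems described in this thread.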
In any case, we are hoping to improve performance of SQL Source Control in the future, and this sort of information is really useful in establishing the expectations users have.
Redgate Software
SSC is painful with the zip code recordset and, in some people's opinion, unusable with the ICD-9 diagnosis recordset.
James, the data in question is fairly static. We extend/maintain a third-party application that uses this table to dynamically generate asp.net forms and their associated controls/layouts. Let me know if you'd like more information about the system and we can talk in more detail offline.
I'm trying to get some indication of what we'd see as an acceptable amount of data from this end, as I realise that me saying "not much" is a bit vague!
As ccollins suggested, a full set of zip codes is common as "static" data. Testing this here (on a set of ~42,192 records), a "Get Latest" down to my database from an SVN server on our LAN took just over 2 minutes, if that's of interest as a comparison point.
Redgate Software