Slow Performance on Large Datasets - possible suggestion

jassonk Posts: 2
I’m not sure how your source code builds the transfer/update scripts, but based on the demo it’s too slow to be usable for our company’s needs. However, I would like to offer a possible avenue to explore as a resolution to part of the issue.

Based on the scripts it outputs, I’m assuming you are simply appending each new line of the script to the end of one string as you go, and then exporting the full string to a file.

Ugly pseudo-code as follows:
ChangesScript as String

	While not end of recordset
		If RowA <> RowB then
			ChangesScript = ChangesScript + NewLineToUpdate
		End if
	End
	Write ChangesScript to file
If this is the case, restructuring it as follows should offer a HUGE performance improvement.
ChangesScript as String
CurrentChangesScript as String
RowCounter as Integer

	While not end of recordset
		If RowA <> RowB then
			CurrentChangesScript = CurrentChangesScript + NewLineToUpdate
			RowCounter = RowCounter + 1
		End if

		If RowCounter > 500 then   //You may have to play with the counter a bit depending on your standard record size
			ChangesScript = ChangesScript + CurrentChangesScript
			CurrentChangesScript = ""
			RowCounter = 0
		End if
	End

	If RowCounter > 0 then 
		ChangesScript = ChangesScript + CurrentChangesScript
	End if
	Write ChangesScript to file
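
In case it helps, here is the same structure as a runnable sketch. I’ve written it in Java purely as a stand-in, since I have no idea what language your tool is actually built in, and buildChangesScript, updateLines, and chunkSize are all made-up names for illustration:

import java.util.List;

public class ChunkedScriptBuilder {

    // Append each change line to a small buffer, and only fold that buffer
    // into the full script every chunkSize rows, so the big string is not
    // re-copied on every single append.
    static String buildChangesScript(List<String> updateLines, int chunkSize) {
        String changesScript = "";
        String currentChunk = "";
        int rowCounter = 0;

        for (String line : updateLines) {
            currentChunk = currentChunk + line + "\n";
            rowCounter++;

            if (rowCounter >= chunkSize) {
                changesScript = changesScript + currentChunk;
                currentChunk = "";
                rowCounter = 0;
            }
        }

        // Flush whatever is left over after the loop ends.
        if (!currentChunk.isEmpty()) {
            changesScript = changesScript + currentChunk;
        }
        return changesScript;
    }

    public static void main(String[] args) {
        // Tiny smoke test with made-up rows.
        List<String> rows = List.of(
                "UPDATE SomeTable SET SomeColumn = 1 WHERE Id = 1",
                "UPDATE SomeTable SET SomeColumn = 1 WHERE Id = 2");
        System.out.println(buildChangesScript(rows, 500));
    }
}

If the implementation language has a dedicated string-builder type, that would sidestep the re-copying altogether, but the chunking above is the closest drop-in change to the structure you presumably already have.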
The gain comes from how string memory is allocated: every time you append to a string, the whole thing typically has to be reallocated and copied, so continuously adding to one big script bogs down as it grows. Keeping the additions in a small buffer and only folding it into the big string every few hundred rows makes those big copies rare. If I’m off base, skip this and move on. :D If I'm on the mark, send check or money order to... j/k ;)
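
To put a rough number on that, here is a small self-contained benchmark sketch (again in Java, with a made-up 20,000-row script) comparing line-by-line appending against the chunked version. The exact timings will differ on your hardware, but the gap widens quickly as the row count grows:

import java.util.ArrayList;
import java.util.List;

public class ConcatBenchmark {

    // Line-by-line append: the full script is re-copied on every row.
    static String naive(List<String> lines) {
        String script = "";
        for (String line : lines) {
            script = script + line + "\n";
        }
        return script;
    }

    // Chunked append: the full script is only re-copied once per chunk.
    static String chunked(List<String> lines, int chunkSize) {
        String script = "";
        String chunk = "";
        int count = 0;
        for (String line : lines) {
            chunk = chunk + line + "\n";
            if (++count >= chunkSize) {
                script = script + chunk;
                chunk = "";
                count = 0;
            }
        }
        return script + chunk;
    }

    public static void main(String[] args) {
        // Kept to 20,000 rows because the naive version slows down fast.
        List<String> lines = new ArrayList<>();
        for (int i = 0; i < 20_000; i++) {
            lines.add("UPDATE SomeTable SET SomeColumn = 1 WHERE Id = " + i);
        }

        long start = System.nanoTime();
        naive(lines);
        System.out.println("naive:   " + (System.nanoTime() - start) / 1_000_000 + " ms");

        start = System.nanoTime();
        chunked(lines, 500);
        System.out.println("chunked: " + (System.nanoTime() - start) / 1_000_000 + " ms");
    }
}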
