Bottlenecks / Optimization?
mfc2
Posts: 17 Bronze 2
SQL Data Compare can be very slow on large databases, sometimes taking hours to determine the differences (over a VPN connection). What are the common speed bottlenecks? My guess is that the network speed is the main one, but I'm looking to confirm that assumption. Other than removing the very large tables from the comparison, are there any good optimization techniques?
Thank you,
Mike Chabot
Comments
I suspect that the network bandwidth and the amount of data to be compared are the cause of the performance issues you are seeing with SQL Data Compare.
The machine running the SQL Data Compare application makes a connection to each data source and pulls the required data across the network or VPN connection. That data is written to local disk as temporary files; once SQL Data Compare has everything it needs, it performs the comparison locally.
If you are comparing large databases, this process can take a very long time.
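As a rough illustration (the figures here are only assumptions for the example): pulling a 20 GB table over a 10 Mbit/s VPN link works out at about 1.25 MB/s, so around 16,000 seconds, or roughly four and a half hours, just to transfer the data before the comparison itself can start. This is why a slow link can easily dominate the overall time.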
By default the application will not compare or show identical data unless you have enabled the "Show identical values in results" option in the 'Comparison Behavior' section. If you have this option enabled, please disable it.
You can reduce the amount of data to be compared by setting a 'WHERE' clause filter. Edit the SQL Data Compare project, go to the Tables & Views tab, and add a filter so that only the rows that meet its conditions are compared.
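For example, if a table has a last-modified column, a filter along these lines (the column name and date are only placeholders for this example) restricts the comparison to recently changed rows:

    -- example only: column name and date are placeholders
    [LastModified] >= '2025-01-01'

The clause is set per table, so you can filter just the very large tables and leave the rest compared in full.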
Also on the Tables & Views tab, you can select just the tables you are interested in comparing. Finally, you can reduce the amount of data further by reducing the number of columns compared in each table.
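If your edition includes the command-line version, the same idea applies when scripting comparisons. A sketch along these lines limits the comparison to a single named table; the switch names are from memory and the server, database and table names are just placeholders, so please check the command-line documentation for your version:

    rem Example only - switch names and object names here are assumptions
    sqldatacompare /server1:LONDON /database1:WidgetStaging /server2:PARIS /database2:WidgetProduction /include:Table:\[Orders\]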
I hope the above helps.
Many Thanks
Eddie
Senior Product Support Engineer
Redgate Software Ltd
Email: support@red-gate.com