Synchronizing Large Files
Chris
Posts: 6
Hi all,
I have a rather unusual situation: I need to transfer databases via text files from one network to another, and these networks have no access to each other. Consequently, I have to burn a CD with the SQL generated by SQL Data Compare. No problems yet.
The problem comes when I try to run the script on the other network. The SQL file is 240MB. Query Analyzer can't open a file that large without truncating it, and OSQL also bombs out on me.
I used SQL Data Compare to synchronize on the first network, and it worked like a charm. Ran the 240MB of inserts easily, though it took a while.
Can you open a SQL file created by SQL Data Compare with SQL Data Compare and run it? If not, that would be a nice feature.
The only other way I can think of to get around the issue is to save a few tables at a time. Some of the scripts generated could only be done as 1 table. Another nice feature might be to save the scripts based on the max size that Query Analyzer could handle in a series of files.
I actually bought 2 copies of the suite, so being able to run the entire script would be the easiest solution for me, but any suggestions are most welcome.
Am I missing something, or am I going to have to save my data 1 table at a time?
-Chris
Comments
I was not aware of a file size limitation in Query Analyzer. There is a limitation if you send the script directly from Data Compare to QA, due to the way it is sent, and that limit is 2 megabytes.
If QA can't open the whole file, you can use osql.exe to run it. Since that looks like the best option at this point, I think the next step is to figure out why it doesn't finish running the script. Does osql.exe return an error message?
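For example, something along these lines should work from a command prompt (the server, database, and file names are placeholders to replace with your own, and Windows authentication is assumed):

osql -E -S YourServer -d YourDatabase -i "D:\sync_script.sql" -o "C:\osql_output.txt"

The -i switch reads the script from a file, and -o writes everything osql prints, including any error messages, to a log file you can check afterwards.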
I was not aware of a file size limitation either. However, I think I found what may be the problem. I believe that when SQL Data Compare saved the script, an EOF marker was somehow written into the data. There was a box-looking character in the file when I opened it in Notepad, and nothing would display after that character when I tried to open the file in QA.
Let me test a little more and I will post back my findings.
OSQL did not return an error. It just did nothing.
-Chris
I see. Maybe you need to make sure that when you save the script, you've chosen Unicode as the text format. You can do this in SQL Compare 3.1.8 (not in earlier versions) to prevent exactly this sort of thing.
I am still looking at the other issue I mentioned before, though. How is running the SQL through Data Compare different from running it in QA? It's SUPER-FAST, relatively speaking, when run through Data Compare as opposed to QA. It would be nice to have that option (opening/running a SQL file with Data Compare) if there is indeed a difference.
I am guessing that I could use the APIs to code my own application, but I haven't had the time to get around to that quite yet.
-Chris
The time difference between SQL Data Compare's script execution and SQL Query Analyzer's script execution can probably be attributed to the fact that Query Analyzer is based on SQLDMO.dll (SQL Distributed Management Objects), while SQL Data Compare uses ADO.NET.
Older versions of SQL Data Compare were based on DMO and they were terribly slow compared to Data Compare 3.0+, so I'd assume that's the reason.
If you feel like using Data Compare's functionality in your own .NET program, feel free to have a look at the SQL Bundle sample projects in the Red Gate SQL Bundle installation folder. You can use the shared libraries to add data synchronization capability in as few as 20 or so lines of code, roughly along the lines of the sketch below.
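To give you a feel for the shape of it, here is a rough sketch of that pattern. It follows the sample projects rather than your exact version, so the namespaces, class names, and method signatures may well differ in the assemblies you have installed; the server and database names are placeholders.

// Rough sketch only: assumes the comparison/synchronization classes shipped with
// the SQL Bundle samples. Check the sample projects for the exact API in your version.
using RedGate.SQLCompare.Engine;
using RedGate.SQLDataCompare.Engine;

class DataSyncSketch
{
    static void Main()
    {
        // Register both databases for data comparison (placeholder connection
        // details, Windows authentication assumed).
        Database source = new Database();
        Database target = new Database();
        source.RegisterForDataCompare(new ConnectionProperties("SourceServer", "SourceDb"), Options.Default);
        target.RegisterForDataCompare(new ConnectionProperties("TargetServer", "TargetDb"), Options.Default);

        // Map the tables by name and compare the data in the mapped tables.
        TableMappings mappings = new TableMappings();
        mappings.CreateMappings(source.Tables, target.Tables);

        using (ComparisonSession session = new ComparisonSession())
        {
            session.CompareDatabases(source, target, mappings);

            // Build the migration script and run it directly against the target database.
            SqlProvider provider = new SqlProvider();
            ExecutionBlock block = provider.GetMigrationSQL(session, true);
            new BlockExecutor().ExecuteBlock(block, "TargetServer", "TargetDb");
        }
    }
}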
-Chris
I compared a blank instance of the database to its current state, and the SQL file generated is 210MB. I ran the script from the command prompt; here is the statement: osql -E -d [database_name] < [script_file_name]. The query was chugging along and showed progress, but it was taking way too long; after 15 hours I gave up.
I must be doing something wrong, since it takes me only 2 hours to generate this exact same data by performing the operations through my application's UI. I assumed that doing a straight SQL dump would be much faster. Do I even have the right software for what I need?