Is AWE supported?

Hello.

Regarding this KB article, does SQL Data Compare 7 support Address Windowing Extensions (AWE) for 32-bit systems that have more than 2 GB of RAM?

In my case, we have 32 GB of RAM, and I've set a policy that grants "Lock pages in memory" to the user running the tool. These are usually the steps necessary to enable AWE for a user/program. I'm aware that supporting AWE would most likely require some native API calls, so it's understandable if this is not the case.
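
For reference, the kind of native calls I have in mind look roughly like the sketch below. This is only an illustration of the usual Win32 AWE sequence (privilege, physical page allocation, address window, mapping), not anything SQL Data Compare actually does, and the 64 MB allocation size is just a placeholder:

    /* Rough sketch of the Win32 AWE sequence; the user running this needs
       the "Lock pages in memory" privilege. Plain Win32 C; link against
       advapi32.lib for the token calls. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* 1. Enable SeLockMemoryPrivilege on the current process token. */
        HANDLE token;
        TOKEN_PRIVILEGES tp = {0};
        OpenProcessToken(GetCurrentProcess(), TOKEN_ADJUST_PRIVILEGES, &token);
        LookupPrivilegeValue(NULL, SE_LOCK_MEMORY_NAME, &tp.Privileges[0].Luid);
        tp.PrivilegeCount = 1;
        tp.Privileges[0].Attributes = SE_PRIVILEGE_ENABLED;
        if (!AdjustTokenPrivileges(token, FALSE, &tp, 0, NULL, NULL) ||
            GetLastError() != ERROR_SUCCESS) {
            printf("Lock pages in memory is not granted to this user\n");
            return 1;
        }

        /* 2. Allocate physical pages (64 MB here, purely as an example). */
        SYSTEM_INFO si;
        GetSystemInfo(&si);
        ULONG_PTR pages = (64 * 1024 * 1024) / si.dwPageSize;
        ULONG_PTR *pfns = (ULONG_PTR *)HeapAlloc(GetProcessHeap(), 0,
                                                 pages * sizeof(ULONG_PTR));
        if (!AllocateUserPhysicalPages(GetCurrentProcess(), &pages, pfns)) {
            printf("AllocateUserPhysicalPages failed: %lu\n", GetLastError());
            return 1;
        }

        /* 3. Reserve a virtual address window and map the physical pages
              into it; a real consumer would remap different pages through
              this window to reach memory beyond the 2 GB limit. */
        void *window = VirtualAlloc(NULL, pages * si.dwPageSize,
                                    MEM_RESERVE | MEM_PHYSICAL, PAGE_READWRITE);
        if (!window || !MapUserPhysicalPages(window, pages, pfns)) {
            printf("Mapping failed: %lu\n", GetLastError());
            return 1;
        }
        printf("Mapped %lu AWE pages at %p\n", (unsigned long)pages, window);

        /* Unmap and return the pages before exiting. */
        MapUserPhysicalPages(window, pages, NULL);
        FreeUserPhysicalPages(GetCurrentProcess(), &pages, pfns);
        return 0;
    }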

Thank you,

-Matt

Comments

  • Thanks for your post.

    We don't currently use AWE in our applications, so you will be unable to use more than 2 GB of memory in the SQL Data Compare process. However, unless you have a truly vast schema and/or massive amounts of data in each row, SQL Data Compare shouldn't really be using that much physical memory.

    SQL Data Compare does quite a lot of caching to disk. If you get an OutOfMemoryException, it could be a lack of physical memory, but in some circumstances a lack of disk space is also a factor. If low disk space is the problem, you can configure a new environment variable, which SQL Data Compare will then use to cache to a different hard disk, perhaps one with more space.

    There is more information here on our knowledge base, including instructions for setting RGTEMP.
    http://www.red-gate.com/support/kb/KB200705000041.htm
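
    If you want to double-check which directory the cache will land in and how much free space it has, a small check along these lines would do it. This is just a rough sketch that assumes RGTEMP is read from the environment of the user running the tool, and that the standard temp directory is used when it isn't set:

        /* Report the likely cache directory and its free space.
           Plain Win32 C. */
        #include <windows.h>
        #include <stdio.h>

        int main(void)
        {
            char dir[MAX_PATH];
            /* Fall back to the normal temp directory if RGTEMP is not set. */
            if (GetEnvironmentVariableA("RGTEMP", dir, MAX_PATH) == 0)
                GetTempPathA(MAX_PATH, dir);

            ULARGE_INTEGER freeBytes;
            if (GetDiskFreeSpaceExA(dir, &freeBytes, NULL, NULL)) {
                printf("Cache directory: %s\n", dir);
                printf("Free space:      %.1f GB\n",
                       freeBytes.QuadPart / (1024.0 * 1024.0 * 1024.0));
            } else {
                printf("Could not query %s: %lu\n", dir, GetLastError());
            }
            return 0;
        }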

    With regard to the physical memory SQL Data Compare uses, it keeps the schema (a stripped-down version of what SQL Compare gathers) plus the matching row being compared. I would only expect SQL Data Compare to run out of memory if the data in the two rows being compared exceeds the process limit, e.g. something like 700+ MB of data in each row.

    I hope this is helpful.
    Chris
  • Thanks for the reply, Chris.

    You are describing a scenario similar to the one I am dealing with. I am tasked with moving a database to a new server with minimal downtime for the application. It is a 3rd-party application/database, so I can luckily point the finger elsewhere. The data files exceed 50 GB, so dragging them over the network will take time (about 45 minutes in my test environment).

    Naturally, I turned to SQL Data Compare as a nice way to catch the new server up with changes made since the backup/restore, until we "flip the switch" to the new server. The issue is that there were huge BLOBs in the DB (around 600 MB in one row) until I had those rows removed.

    Now, the largest row in either DB is around 100 MB according to the datalength function for the only image column in the table. Yet I am still receiving the OutOfMemoryException during comparisons on that table. I have 137 GB free on my primary drive. I changed the RGTEMP environment variable to verify that the temporary files were being used during the comparisons.

    Can you offer me any other suggestions on circumventing this issue?

    Thanks for some great applications!
  • Are you using the 'Compress temporary files' option? There is a known issue that can cause OutOfMemoryExceptions when this option is enabled.

    If you are not using this option and disk space isn't an issue, can you monitor the memory usage of the application during the compare and see if it hits the process limit? I think that, due to other .NET overheads, the SQL Data Compare process will abort when it uses about 1.5 GB of memory.
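
    If watching it in Task Manager is awkward, a small watcher along these lines would log the figures while the compare runs. This is only a sketch: the 5-second polling interval is arbitrary, and you would pass it the PID of the SQL Data Compare process:

        /* Poll another process's memory counters until it exits or the
           query fails. Win32 C; link against psapi.lib. Usage: memwatch <pid> */
        #include <windows.h>
        #include <psapi.h>
        #include <stdio.h>
        #include <stdlib.h>

        int main(int argc, char **argv)
        {
            if (argc < 2) {
                printf("usage: memwatch <pid>\n");
                return 1;
            }
            HANDLE proc = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_VM_READ,
                                      FALSE, (DWORD)atoi(argv[1]));
            if (!proc) {
                printf("OpenProcess failed: %lu\n", GetLastError());
                return 1;
            }

            /* Report working set (physical) and private bytes (committed). */
            for (;;) {
                PROCESS_MEMORY_COUNTERS_EX pmc;
                if (!GetProcessMemoryInfo(proc, (PROCESS_MEMORY_COUNTERS *)&pmc,
                                          sizeof(pmc)))
                    break;
                printf("working set: %6.0f MB   private bytes: %6.0f MB\n",
                       pmc.WorkingSetSize / (1024.0 * 1024.0),
                       pmc.PrivateUsage / (1024.0 * 1024.0));
                Sleep(5000);
            }
            CloseHandle(proc);
            return 0;
        }
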
    Chris
  • I am not using the 'Compress temporary files' option. The physical memory of the application peaked at around 750 MB. The virtual memory reached around 1.25 GB at the point the exception was generated.

    The good news is, I only need to perform this data move one time to get it all on the new server. Judging by the time it took for SQL Data Compare to get to the 27% completion point, it will take about the same time to copy the compressed backup over the network. Of course, I have SQL Backup to thank for that.

    I appreciate your assistance with this issue. I am obviously missing something in the database that is just huge. I honestly didn't expect the compare to work, so I am glad to see that it got as far as it did. Imagine the size of the script file it would have generated for all that binary data.