Which Tool for a Large Database?
AMoney
Posts: 1 New member
Hi all!
I've inherited a database that has become hugely bloated and needs to be shrunk. How big? It's over 4 TB on disk, and more than 45% of that is bloat / empty pages, i.e. more than 2 TB could be reclaimed, which would really help with our AWS costs.
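For anyone who wants to sanity-check those numbers, reclaimable space per data file can be estimated with something like the query below (assuming SQL Server here; sizes come back in 8 KB pages, so dividing by 128 gives MB):

-- Estimate reclaimable space per data file in the current database
SELECT
    name                                           AS file_name,
    size / 128                                     AS total_mb,
    FILEPROPERTY(name, 'SpaceUsed') / 128          AS used_mb,
    (size - FILEPROPERTY(name, 'SpaceUsed')) / 128 AS free_mb
FROM sys.database_files
WHERE type_desc = 'ROWS';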
Note: the problem is compounded because several copies of this "starter" database were made, so across all of the servers there's actually about 12 TB of disk space in use and almost 7 TB that could be freed up.
Running a shrink took 4 days on a cloned copy of the system with no reads or writes happening, which we obviously can't do in production. Some of the tables are full of binary data (that's a whole different problem).
I do have an extended 12-hour maintenance window in which I could shut down the server, export / copy the data to a new database, and drop the old one.
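For illustration, the two routes I can see look roughly like this; database, file, and table names are placeholders and the target sizes are made up, so treat it as a sketch only:

-- Option 1: shrink the existing data file in several smaller passes,
-- so each pass is short enough to run during normal operating hours
USE MyBigDb;
DBCC SHRINKFILE (N'MyBigDb_Data', 3500000);  -- first pass: target ~3.5 TB (size is in MB)
-- ...repeat later with progressively smaller targets...
DBCC SHRINKFILE (N'MyBigDb_Data', 2200000);  -- later pass: target ~2.2 TB
-- note: shrinking fragments indexes heavily, so they would need rebuilding afterwards

-- Option 2: copy each table into a brand-new database during the window, then drop the old one
SELECT *
INTO NewDb.dbo.SomeLargeTable        -- indexes, constraints and permissions must be recreated separately
FROM MyBigDb.dbo.SomeLargeTable;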
Any suggestions? Is there a Red Gate tool to help here?
-
A
Answers
Unfortunately, we don't have a tool that would be able to help in this scenario.
The first thing to check is where the space is actually being used. If a handful of tables with very large indexes account for most of it, index bloat and fragmentation may be one of the reasons behind the issue you are facing.
You can reorganize (or rebuild) those indexes to free up space.
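A minimal sketch of that check and fix, assuming SQL Server DMVs are available (the index and table names at the end are just examples):

-- Find the largest / most fragmented indexes in the current database
SELECT
    OBJECT_NAME(ips.object_id)          AS table_name,
    i.name                              AS index_name,
    ips.avg_fragmentation_in_percent,
    ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
    ON i.object_id = ips.object_id
   AND i.index_id  = ips.index_id
WHERE ips.page_count > 1000             -- skip tiny indexes
ORDER BY ips.page_count DESC;

-- Then, per index: REORGANIZE is online and lighter; REBUILD is heavier but reclaims more space
ALTER INDEX IX_Example ON dbo.ExampleTable REORGANIZE;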
Hi,
I'm planning a new data warehouse project and I'm interested in SSDT as a change-management tool that deploys my changes easily from development to test to production. But I'm a bit worried about some blog posts and articles I've read. As you may know, a data warehouse is a huge database with lots of tables and data. I've read that deploying a DACPAC (which is what SSDT generates?) results in an empty (structure-only?) copy of the database being created as another database, the changes being applied there, the data being copied across, and finally the database being renamed (?).
That is not desirable behaviour for a huge data warehouse project.
Are my conclusions right? Or has this been changed in the latest release?