
'Timeout Expired' Error Message

rsg2 Posts: 4
edited January 13, 2006 6:04PM in SQL Log Rescue
Just installed SQL Log Rescue. After specifying the database, getting the list of .bak/.trn files, and selecting the ones to view, I click 'Next' and then (after several minutes) get this error message:

"Database Error (title bar).
"Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding."

I had selected 4 .BAK files and their corresponding 4 .TRN files.

Nothing shows to view after this error.

What is wrong? I can find nothing in the menus to set anything related to timeout periods or whatever.

Comments

  • Brian Donahue Posts: 6,590 Bronze 1
    Hello,

    There is no option for setting a connection or query timeout, as this should be handled 'seamlessly' by the .NET Framework.

    The thing to do would be to try to track down the cause -- can you tell us if the software gets to the point where it says it has validated the backup files, or does the error crop up before this?
  • Hi, I work with the original poster.

    The problem occurs during the 'choose data backups' step. No matter how many backup files we choose, after hitting 'Next' the 'Analyzing transaction log' box pops up for probably 10 minutes, and then we receive the timeout/server-not-responding error.
  • A bit more info - I saw the page in the Help file about 'locked files' due to incomplete transactions. I don't think that is the case here, because the same error occurs when I select a single backup pair of files (TRN and BAK) that ran a week ago. I have set a Maintenance Plan to back up the database daily at 0400L, and this has been running for several months now. The BAK/TRN files I tried to analyze were from this daily backup. Surely any incomplete transactions would have stopped before the system backed up the database, and choosing a backup file set from a week ago should not have any problems with incomplete transactions. I thus do not understand why the files seem to be 'locked' and why I am getting the SLR timeout error message.

    The error occurs after I select a backup set and click the 'Next' button. The screen shows the 'gas gauge' form labeled 'Analyzing files', but the gauge never displays any bars, as if the process never gets started.
  • Hi

    When you select the backup files, you should get a screen informing you that the files were OK. You then have to click the Finish button to continue and have the log analysed. The same applies if you don't select any backup files; you just receive a warning.

    If you are having a problem while analysing the files, where are they located, and do they all still exist? You may have to deselect some of them, or add your own selection.

    If you leave the files to be analysed for some time, does the application run out of memory?

    Regards
    Dan
    Daniel Handley
    Red Gate Software Ltd
  • Just tried the operation again. Program loaded, connected to the desired target DB in SS2K. Selected 'Create New Project', went to the 'Project Settings' form - logged into the DB, saw the screen listing all .BAK and .TRN files - selected one set for 17 Oct (1 .BAK and 1 .TRN file for that date) - clicked the 'Next' button - got the 'gas gauge' form titled 'Analyzing transaction log...' - it sat there for 5 minutes - then it displayed the 'Timeout expired' error message as in my original post. The 'gas gauge' on this form did not show even a single bit of activity (blue squares). No other error messages were given. The program is installed on our network SS2K server with 1 GB of RAM. I access it using Remote Desktop, which I also use to manage the SS2K databases - I have never had any problem with that and have been doing this at my present employer since early June of this year.

    My impression is that SQL Log Rescue is not working correctly. I am disappointed in the support we have received from this vendor. Last week I tried a $50 program from another vendor and received several responses from them while troubleshooting it, but so far not a thing from the vendor of this program. I don't know whether those replying to my posts are vendor support people or other users.
  • PDinCA Posts: 642 Silver 1
    I installed the software on our staging server, which is barely used and has a decent configuration, then pointed it at the 1.4 GB .bak file and 10 .trn files totalling about 50 MB. Same "Analyzing" dialog, then the timeout, no progress on the meter.

    In our case, the analyze process ate through so much temp disk space that the server naturally couldn't go on - hence the timeout. The C: drive had 2 GB free, but that appears inadequate...
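    A rough pre-flight check along these lines could have caught the problem before the ten-minute wait. This is a hypothetical sketch, not a Log Rescue feature: it compares free space on the temp drive against the combined size of the selected backup files, with a guessed 2x overhead factor for the analysis phase.

    ```python
    import os
    import shutil
    import tempfile

    # Hypothetical pre-flight check (not part of Log Rescue): compare
    # free space on the temp drive against the total size of the
    # selected backup files. The 2x safety factor is an assumption
    # about analysis-phase overhead, not a documented figure.
    def enough_temp_space(backup_paths, factor=2.0):
        needed = sum(os.path.getsize(p) for p in backup_paths
                     if os.path.exists(p))
        free = shutil.disk_usage(tempfile.gettempdir()).free
        return free >= needed * factor

    # Example with made-up file names; missing files count as zero bytes.
    print(enough_temp_space(["full_20051017.bak", "log_20051017.trn"]))
    ```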

    Perhaps the way that Log Rescue handles its analysis phase needs a look under the hood (bonnet). Does it really need to chew up all that space and keep it? Can it estimate the space it needs and tell me it doesn't have enough on the drive it's going to use and offer me the choice of another? If I tried this on a large DB, am I likely to wait an hour while it "analyses" during which time my ear is being sorely bent by managers who need the DB fixed!? I noticed other posts/comments on other sites that cited analysis times that were far too long for practical use with large DBs - what is the truth?

    I'd hate to give up on the product as, on the face of it, it looks to have a very good feature set - one that would save me gobs of development effort to get the same kind of undo-transaction functionality and "who's the culprit?" detection capabilities.
    Jesus Christ: Lunatic, liar or Lord?
    Decide wisely...
  • Brian Donahue Posts: 6,590 Bronze 1
    Hello,

    If you do not receive an answer from Red Gate within 24 hours, chances are very good that we never got your email. If you suspect this, we're not shy about giving out our telephone number - in fact, I believe it is on every page of our website at www.red-gate.com. Go ahead and give us a ring if you want!

    There are some distinct differences in the way Log Rescue works compared to other vendors' tools. First off, log analysis incurs a performance penalty, and you can pay this either at the server end or at the client end. I believe we have chosen wisely by designing the software so that it pulls all of the relevant log data off the server and stores it in temporary files on the client that is running Log Rescue. This is also done for performance reasons, to prevent contention between Log Rescue and SQL Server over the log file.

    Second, for a better user experience, lots of data, particularly BLOBs, is cached at the client. In order to produce the most accurate results, we ask for your log files, which will also have information parsed from them and stored in memory and on disk.

    I'm sorry that you weren't happy with the software, but I believe that this design prevents many more problems than it causes. You will need to ensure that you have ample computing power for analyzing your logs, so it is entirely conceivable that 2 GB of free space may not be enough to analyze a 1.4 GB log file.
  • PDinCA Posts: 642 Silver 1
    Thanks for your response, Brian.

    I still have some questions/points:

    1. Could you develop a "space required" algorithm and apply it to the "default" temp file drive to see if you can complete the "analyze" stage?

    2. Could you give us a configuration option to tell Log Rescue where to put its temp files?

    3. Can you give some kind of "Estimated Time Until Ready" on the analyze progress meter? You should be able to do this given the file sizes you are asked to analyze and your software's progress through them...

    4. If we know the approximate time when the transactions we need to analyze were started and ended, can you not "skip read" through the logs and only save data to temp files for the time period we need, rather than going wholesale when we only need nibbles? If we, the users, find we need to expand our timeslice, that's our problem in incurring another parse - it still stands a chance of being much faster than saving everything, especially when dealing with high-volume, large databases. Even if you started saving all the data from the timestamp we request, we'd likely be able to get into recovery before our bosses go spare! If your software could allow us to page through time as soon as possible - i.e., some kind of async process spooling to temp files in the background while the UI satisfies "walk through time" requests on data already spooled - we could at least get started rather than waiting until the last log byte has been spooled.

    Cheers.
  • Hello,

    I'll certainly suggest it and we'll see if it's possible to do these things.

    You can already change the location for temp files. Like all Red Gate products, we ask the operating system to furnish us with temporary storage, so you can change the location by setting the TMP environment variable in your system properties. Change this to a larger drive, log off and on again, and Log Rescue will store its working data set in the new physical location.
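    The effect of that advice can be sketched in a few lines: code that asks the OS for temporary storage picks up the TMP environment variable. This is a cross-platform illustration, not Log Rescue itself; the directory name is made up, and on a real Windows server you would point TMP at a larger drive in System Properties.

    ```python
    import os
    import tempfile

    # Illustration (assumed setup, not Log Rescue code): temp lookups
    # honour the TMP environment variable once higher-priority
    # variables are out of the way and any cached value is cleared.
    target = os.path.join(os.getcwd(), "logrescue_temp")
    os.makedirs(target, exist_ok=True)

    os.environ.pop("TMPDIR", None)   # consulted before TMP
    os.environ.pop("TEMP", None)     # also consulted before TMP
    os.environ["TMP"] = target
    tempfile.tempdir = None          # discard any cached location

    print(tempfile.gettempdir())     # now reports the new location
    ```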
  • PDinCA Posts: 642 Silver 1
    Pity the TMP location is used... If "standard operating procedures" require this to be the C: drive on the server, or one only has a single hard drive on one's workstation, one is hosed!

    Perhaps you should also consider making the temp location your own configurable parameter...?
  • Brian Donahue Posts: 6,590 Bronze 1
    You can configure this using Windows system properties by setting the environment variable called TMP to a larger drive.
  • PDinCA Posts: 642 Silver 1
    The package is on a SERVER and that server is not just for Log Rescue. The option to change the TMP environment variable is not open to me (as the user of the software, not the server admin, and I'm prevented by Security Policy from mucking with environment variables).

    If the package used its own Environment Variable for temp files, e.g., LR_TMP, configurable as a system variable, the issue would be solved very simply.

    The "larger drive" the server has access to is actually a NETWORK DRIVE and to use that as the Windows Temp folder is flat out suicidal.
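    The suggestion above amounts to a one-line lookup: check an application-specific variable first, then fall back to the operating system's default temp location. LR_TMP is the poster's proposed name, not an existing Log Rescue setting.

    ```python
    import os
    import tempfile

    # Sketch of the poster's suggestion (LR_TMP is hypothetical):
    # prefer an app-specific override, fall back to the OS default.
    def working_dir():
        return os.environ.get("LR_TMP") or tempfile.gettempdir()

    os.environ["LR_TMP"] = os.getcwd()   # stand-in for a larger drive
    print(working_dir())
    ```

    A server admin could then set LR_TMP as a system variable once, without touching the machine-wide TMP that other software depends on.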
This discussion has been closed.