Command-line ignoring include/exclude options

daveburns Posts: 4
edited November 2, 2010 3:50PM in SQL Packager Previous Versions
I'm running on WinXP against SQL Server 2008. (I'm using the latest patch because things failed with the latest supported release, though I can't remember exactly what right now.)

I'm trying to run Packager from the command line so I can automate nightly backups, and it fails because one of my tables does not have a primary key. I'm not sure why this matters, but fine, I'll exclude it for now to get things working. I add /excludedata:table:foo to the command line (no other include or exclude options), and yet I still get the error:

Error: Cannot uniquely identify row in table 'foo' because there is no primary key or unique index defined on it. Output script generation failed.

For the purposes of just getting my nightly automation working, I've tried including only one table with /includeschema:Table:blah /includedata:Table:blah. When using /verbose, I see that it only dumps that table's schema but then gets to "Comparing databases" and takes forever. It turns out it's not frozen; it's just comparing the entire database, which results in the same error as above.

Is this known, is there a workaround, and is there anything I can do to help diagnose further?



  • Thanks for your post and sorry you're having this issue with SQL Packager.

    I'm not sure SQL Packager is the ideal choice for an automated backup utility. I would recommend using SQL Server's native backup, or a third-party backup tool, for this job. If you're using SQL Server Express and can't schedule a backup using the SQL Agent, then you could use the command-line interface to SQL Server (SQLCmd.exe) and schedule a command-line backup using a batch file triggered through a Windows scheduled task.
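    As a rough sketch of that approach (the database name, backup path, and server instance below are placeholders, not values from this thread), the scheduled batch file would just invoke sqlcmd with a standard T-SQL backup statement:

    ```sql
    -- backup.sql -- run nightly via a scheduled task, e.g.:
    --   sqlcmd -S .\SQLEXPRESS -E -i backup.sql
    -- [MyDatabase] and the disk path are placeholders; adjust to suit.
    BACKUP DATABASE [MyDatabase]
    TO DISK = N'C:\Backups\MyDatabase.bak'
    WITH INIT, NAME = N'Nightly full backup';
    ```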

    Anyway, with regards to:
    Error: Cannot uniquely identify row in table 'foo' because there is no primary key or unique index defined on it. Output script generation failed

    I've seen this kind of error before if a table contains rows that only differ by the data in a BLOB column. Could you send me the table structure for table 'foo'?
  • Chris -

    I agree SQL Packager is not ideal, but I'm curious what you would recommend for my client's situation: they're using SQL Server in a shared hosting environment where they cannot run a native backup. The hosting provider offers backups at $5 a pop; if you want daily backups, that gets expensive and you need to find another way.

    My initial idea was to replicate the database to one on my machine using SQL Compare and SQL Data Compare and then back up that copy with native backup. I ran into a hitch when I realized my client did not buy the Pro versions of those, so I couldn't automate that approach. SQL Packager will run from the command line though, and the backups *seem* OK - nicely compressed and mostly portable as a handful of files.

    Would you recommend a different approach given the constraints?

    As for the table structure, that's pasted from SSMS below. There were no indexes on this table, and I've since worked around this problem by placing a unique key on UNIQUE_ID.

    Column Name    Data Type        Allow Nulls
    UNIQUE_ID      nvarchar(12)     Checked
    CTIME          smalldatetime    Checked
    ETIME          smalldatetime    Checked
    MTIME          smalldatetime    Checked
    [USER]         nvarchar(128)    Checked
    TYPEID         float            Checked
    TYPE           nvarchar(50)     Checked
    REGARDING      ntext            Checked
    USER_TIME      smalldatetime    Checked
    ATTACHMENT     nvarchar(255)    Checked
    CONTACTID      nvarchar(12)     Checked
    GROUPID        nvarchar(12)     Checked
    RECORDMGR      nvarchar(50)     Checked
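    For reference, the fix I applied was roughly this (a sketch; 'foo' stands in for the real table name, and the constraint name is just one I picked):

    ```sql
    -- Workaround: give the comparison engine a way to uniquely identify rows
    -- by putting a unique constraint on UNIQUE_ID. 'foo' is a placeholder.
    ALTER TABLE foo
    ADD CONSTRAINT UQ_foo_UNIQUE_ID UNIQUE (UNIQUE_ID);
    ```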

  • I suspect the problem is that there are some rows that are identical apart from having a possible difference in the ntext column.

    The comparison engine cannot use any BLOB columns as a comparison key, so when all the other columns are identical, it can't uniquely identify a row because it can't consider what's in the BLOB column. It wouldn't matter if there were duplicate rows; it's the uncertainty about what's in the BLOB column (ntext) that stops things.

    Putting a unique index on the table solves the problem as the rows can be uniquely identified.

    Can you check if you have any duplicate rows in the table?
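    Something along these lines would find them (a sketch; 'foo' stands in for the real table name, and it groups on every column except the ntext one, since that's the column the engine can't compare):

    ```sql
    -- Find rows that are identical in every column the comparison engine
    -- can use as a key, i.e. everything except the ntext column REGARDING.
    SELECT UNIQUE_ID, CTIME, ETIME, MTIME, [USER], TYPEID, [TYPE],
           USER_TIME, ATTACHMENT, CONTACTID, GROUPID, RECORDMGR,
           COUNT(*) AS dup_count
    FROM foo
    GROUP BY UNIQUE_ID, CTIME, ETIME, MTIME, [USER], TYPEID, [TYPE],
             USER_TIME, ATTACHMENT, CONTACTID, GROUPID, RECORDMGR
    HAVING COUNT(*) > 1;
    ```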
  • Chris -

    Thanks for your reply but I've had to move on since I found the workaround. I'd very much like to hear your thoughts on the overall backup constraints I described above though.

  • I think $5 for a backup is scandalous, but they really have you by the soft and danglies if that's their policy.

    If they deny you any backup privileges, then backing up the database with scripts is probably your only other option.

    You could have a database on your server, and then use a combination of SQL Compare and SQL Data Compare to keep these databases in sync on a schedule. If you run the sync fairly often, then the changes would be small and there is less chance of things going wrong. You can then take as many backups of your local copy as you like.

    I hope this helps.