SQL Server 'Resume Next' Equivalent
I’m working on a project in VB.NET which takes large text files containing T-SQL and executes them against a local SQL Server database, but I’ve hit a problem with error handling.
I’m using the following technologies:
- Framework 3.5
- SQL Express 2005
The SQL I’m trying to execute is mostly straightforward, but my app is completely unaware of the schema or the data contained within. For example:
```sql
UPDATE mytable SET mycol2='data' WHERE mycol1=1
INSERT INTO mytable (mycol1, mycol2) VALUES (1,'data')
UPDATE mytable SET mycol2='data' WHERE mycol1=2
INSERT INTO mytable (mycol1, mycol2) VALUES (1,'data')
UPDATE mytable SET mycol2='data' WHERE mycol1=3
```
The above is a sample of the sort of thing I’m executing, but these files will contain around 10,000 to 20,000 statements each.
My problem is that when using sqlCommand.ExecuteNonQuery(), an exception is raised because the second INSERT statement violates the primary key constraint on the table.
I need to know that this error happened and log it, but also process any subsequent statements. I’ve tried wrapping these statements in TRY/CATCH blocks, but I can’t work out a way to handle the error and then continue processing the remaining statements.
Query Analyzer seems to behave this way, but sqlCommand.ExecuteNonQuery() does not.
So is there a T-SQL equivalent of ‘Resume Next’ or some other way I can do this without introducing massive amounts of string handling on my part?
Any help greatly appreciated.
6 Answers
Actually, your batch executed to the end, since key violations do not interrupt batch execution. If you run the same SQL file from Management Studio you’ll see that all the valid statements were executed, and the Messages pane contains an error for each key violation. The SqlClient of ADO.NET behaves much the same way, but at the end of the batch (when SqlCommand.ExecuteNonQuery returns) it parses the messages returned and throws an exception. The exception is a single SqlException, but its Errors collection contains a SqlError for each key violation that occurred.
Unfortunately there is no silver bullet. Ideally the SQL files should not cause errors. You can choose to iterate through the SqlErrors of the exception and decide, on an individual basis, whether each error was serious or can be ignored, knowing that the SQL files have data quality problems. Some errors may be serious and cannot be ignored. See Database Engine Error Severities.
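A minimal sketch of that filtering, assuming `cmd` is the SqlCommand that ran the batch (the severity cutoff is illustrative; 2627 is the error number for a primary key/unique constraint violation):

```vb
Try
    cmd.ExecuteNonQuery()
Catch ex As SqlException
    For Each sqlErr As SqlError In ex.Errors
        ' Class is the severity; 2627 = PK/unique constraint violation.
        If sqlErr.Class <= 16 AndAlso sqlErr.Number = 2627 Then
            ' Key violation: log it and move on.
            Console.WriteLine("Ignored: " & sqlErr.Message)
        Else
            Throw  ' Serious error: do not swallow it.
        End If
    Next
End Try
```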
Another alternative is to explicitly tell the SqlClient not to throw. If you set the FireInfoMessageEventOnUserErrors property of the connection to true, it will raise the SqlConnection.InfoMessage event instead of throwing an exception.
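A sketch of that approach; the connection string and logging destination are placeholders:

```vb
Imports System.Data.SqlClient

Module BatchRunner
    Sub Run(ByVal connectionString As String, ByVal batchText As String)
        Using conn As New SqlConnection(connectionString)
            ' With this flag set, user errors (severity 16 and below) raise
            ' InfoMessage instead of an exception at the end of the batch.
            conn.FireInfoMessageEventOnUserErrors = True
            AddHandler conn.InfoMessage, AddressOf OnInfoMessage
            conn.Open()
            Using cmd As New SqlCommand(batchText, conn)
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub

    Private Sub OnInfoMessage(ByVal sender As Object, ByVal e As SqlInfoMessageEventArgs)
        For Each sqlErr As SqlError In e.Errors
            ' Log the error; execution of the batch continues.
            Console.WriteLine("Error " & sqlErr.Number & ": " & sqlErr.Message)
        Next
    End Sub
End Module
```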
At the risk of slowing down the process (by making thousands of trips to SQL Server rather than one), you could handle this issue by splitting the file into individual queries, one per INSERT or UPDATE. Then you can catch each individual error as it takes place and log it or deal with it as your business logic requires.
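A sketch of that approach, assuming (as in the sample above) one statement per line; `conn` is an open SqlConnection and `path` is the text file:

```vb
Dim errorLog As New List(Of String)

For Each stmt As String In File.ReadAllLines(path)
    If stmt.Trim().Length = 0 Then Continue For
    Try
        Using cmd As New SqlCommand(stmt, conn)
            cmd.ExecuteNonQuery()
        End Using
    Catch ex As SqlException
        ' Record the failure and move on to the next statement.
        errorLog.Add(stmt & " -- " & ex.Message)
    End Try
Next
```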
SQL Server does have TRY/CATCH syntax; see the TRY...CATCH (Transact-SQL) topic in Books Online.
To use this with your file, you would either have to rewrite the files themselves to wrap each line with the try/catch syntax, or else your code would have to programmatically modify the file contents to wrap each line.
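Each wrapped statement would look something like this (the logging via PRINT is illustrative):

```sql
BEGIN TRY
    INSERT INTO mytable (mycol1, mycol2) VALUES (1, 'data')
END TRY
BEGIN CATCH
    -- Log and carry on; the next wrapped statement still runs.
    PRINT 'Error ' + CAST(ERROR_NUMBER() AS varchar(10)) + ': ' + ERROR_MESSAGE()
END CATCH
```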
There is no T-SQL equivalent of “On Error Resume Next”, and thank Cthulhu for that.
One technique I use is to use a try/catch and within the catch raise an event with the exception information. The caller can then hook up an event handler to do whatever she pleases with the information (log it, collect it for reporting in the UI, whatever).
You can also include the technique (used in many .NET Framework areas, e.g. Windows.Forms events and XSD validation) of passing a cancelable event-args object (cf. System.ComponentModel.CancelEventArgs) as the second event argument, with a Boolean property the event handler can set to indicate that processing should abort after all.
Another thing I urge you to do is to prepare your INSERTs and UPDATEs, then call them many times with varying arguments.
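For example, a prepared, parameterized UPDATE against the sample table (the column types are assumptions):

```vb
Using cmd As New SqlCommand("UPDATE mytable SET mycol2 = @val WHERE mycol1 = @id", conn)
    cmd.Parameters.Add("@id", SqlDbType.Int)
    cmd.Parameters.Add("@val", SqlDbType.VarChar, 100)
    cmd.Prepare()

    ' Execute many times with varying argument values.
    cmd.Parameters("@id").Value = 1
    cmd.Parameters("@val").Value = "data"
    cmd.ExecuteNonQuery()
End Using
```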
I’m not aware of a way to support resume next, but one approach would be to use a local table variable to prevent the errors in the first place, e.g.:
```sql
DECLARE @Table table (id int, value varchar(100))

UPDATE mytable SET mycol2 = 'data' WHERE mycol1 = 1

-- Insert values for later usage
INSERT INTO @Table (id, value) VALUES (1, 'data')

-- Insert only if the data does not already exist
INSERT INTO mytable (mycol1, mycol2)
SELECT t.id, t.value
FROM @Table t
LEFT JOIN mytable t2 ON t.id = t2.mycol1
WHERE t2.mycol1 IS NULL AND t.id = 1
```
OK, I don’t know that I suggest this per se, but you could achieve a sort of resume next by wrapping the TRY/CATCH in a WHILE loop, if you set an exit condition at the end of all the steps and keep track of which step you performed last:
```sql
DECLARE @Table table (id int, value varchar(100))
DECLARE @Step int
SET @Step = 0

WHILE (1 = 1)
BEGIN
    BEGIN TRY
        IF @Step < 1
        BEGIN
            -- Deliberately fails: 's' cannot be converted to int
            INSERT INTO @Table (id, value) VALUES ('s', 1)
            SET @Step = @Step + 1
        END
        IF @Step < 2
        BEGIN
            INSERT INTO @Table VALUES (1, 's')
            SET @Step = @Step + 1
        END
        BREAK
    END TRY
    BEGIN CATCH
        -- Skip past the failed step and loop again
        SET @Step = @Step + 1
    END CATCH
END

SELECT * FROM @Table
```
Unfortunately, I don’t think there’s a way to force the SqlCommand to keep processing once an error has been returned.
If you’re unsure whether any of the commands will cause an error (and at some performance cost), you could split the commands in the text file into individual SqlCommands, which would enable you to use a try/catch block and find the offending statements.
…this, of course, depends on each T-SQL command in the text file being on a separate line (or otherwise delineated).