Database Isolation Models
I have a database with an “ID” column. Whenever there is a new entry for the database, I fetch the last ID from the database, increment the value, and then use it in the Insert statement.
EDIT : I need the ID to use in multiple Insert statements. I will fetch this ID from the primary table and use this ID to insert values into related tables.
```sql
NextID = SELECT MAX(ID) + 1 FROM Table

INSERT INTO Table1 (ID, Col1, Col2...) VALUES (NextID, Value1, Value2...)
INSERT INTO Table2 (ID, Col1, Col2...) VALUES (NextID, Value1, Value2...)
```
I don't know if this is a good approach, because I know there will be concurrency issues.
When my application tries to read the NextID, there is a chance that another instance of the application is also trying to read the same value and thus concurrency issues may arise.
Is there a proper way to deal with this situation? I know there are ways to set the database isolation level — which isolation level would be appropriate here?
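To make the isolation-level question concrete, here is a sketch of the read-then-insert batch wrapped in a SERIALIZABLE transaction (table and column names are the placeholders from the question, not real identifiers):

```sql
-- Sketch only: SERIALIZABLE makes the engine hold a range lock on the rows
-- read by MAX(ID), so a second session blocks until the first one commits.
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;
BEGIN TRANSACTION;

DECLARE @NextID int;
SELECT @NextID = MAX(ID) + 1 FROM [Table];

INSERT INTO Table1 (ID, Col1, Col2) VALUES (@NextID, Value1, Value2);
INSERT INTO Table2 (ID, Col1, Col2) VALUES (@NextID, Value1, Value2);

COMMIT TRANSACTION;
```

One caveat: with plain SERIALIZABLE, two sessions can each take the shared range lock on the SELECT and then deadlock when both try to insert; adding an UPDLOCK hint to the SELECT avoids that.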
Also, if anybody can suggest an alternate way to maintain and manually increment the ID in the database, I'm open to that as well.
If this information is not enough, please let me know what you require.
I am working with ASP.Net with VB and MS Sql Server 2008. I do not want to use the built-in “Identity” of SQL Server.
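As an illustration of what such an alternate way could look like: one common pattern is a small key table that is incremented atomically in a single UPDATE. The names below (`KeyTable`, `LastID`) are illustrative, not from the question:

```sql
-- Illustrative schema: one row per table whose IDs are handed out manually
CREATE TABLE KeyTable (
    TableName sysname NOT NULL PRIMARY KEY,
    LastID    int     NOT NULL
);

-- Increment and read back the new value in one atomic statement;
-- the update lock on the row serializes concurrent callers.
DECLARE @NextID int;
UPDATE KeyTable
SET @NextID = LastID = LastID + 1
WHERE TableName = 'Table1';
```

The compound assignment `SET @NextID = LastID = LastID + 1` is T-SQL syntax that updates the column and captures the new value in one statement, so no separate SELECT (and no window for another session to slip in) is needed.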
3 Solutions
The only way to get the next ID is to actually insert the row, and use identity. Everything else will fail. So you must start by inserting into the parent table:
```sql
BEGIN TRANSACTION;

DECLARE @Id int;

INSERT INTO [Table] (Col1, Col2, Col3)
VALUES (Value1, Value2, Value3);

SET @Id = SCOPE_IDENTITY();

INSERT INTO Table1 (ID, Col1, Col2) VALUES (@Id, ...);
INSERT INTO Table2 (ID, Col1, Col2) VALUES (@Id, ...);

COMMIT;
```
This is atomic and concurrency safe.
> I do not want to use the built-in “Identity” of SQL Server.
tl;dr: what you ‘want’ matters little unless you can give a clear justification why. You can do it correctly, or you can spend time till oblivion reinventing the wheel.
Essentially you have a batch of three SQL statements: one select and two inserts. The database engine can execute a statement from a different session anywhere between them, breaking your data consistency — another session can read the same MAX() value you got and use it for its own inserts. The way to prevent the engine from doing that is to use transactions with an appropriate isolation level: wrap your batch in BEGIN TRANSACTION … COMMIT. Note that under the default READ COMMITTED level a transaction alone does not stop two sessions from reading the same MAX(); you also need SERIALIZABLE isolation or locking hints on the SELECT.
Your way of doing this is fine; what you would need is transaction handling:
```sql
BEGIN TRANSACTION;
BEGIN TRY
    DECLARE @NextID int;

    -- UPDLOCK + HOLDLOCK keep the read locked until commit, so two
    -- concurrent sessions cannot obtain the same MAX(ID)
    SELECT @NextID = MAX(ID) + 1
    FROM [Table] WITH (UPDLOCK, HOLDLOCK);

    INSERT INTO Table1 (ID, Col1, Col2...) VALUES (@NextID, Value1, Value2...);
    INSERT INTO Table2 (ID, Col1, Col2...) VALUES (@NextID, Value1, Value2...);

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    ROLLBACK TRANSACTION;
    -- exception logging goes here
END CATCH
```