Implement a SQL Server Queue for parallel Job Execution using Integration Services SQL Tasks


The difficult part is avoiding deadlocks between the parallel workers.

 

Step 1) Create a table that stores the jobs, with a start_date column that indicates whether a job has already been picked up: tbl_AUFTRAG (id, start_date)

It is important to add a clustered index on the id column to get the right locking behavior.
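A minimal sketch of that table might look as follows. The post only names the columns, so the types are assumptions: varchar(36) for id matches the variable declared in the worker script below, and datetime is a plausible choice for start_date.

```sql
-- Sketch only: column types are assumed, the post just names the columns.
CREATE TABLE tbl_AUFTRAG
(
    id         varchar(36) NOT NULL,  -- GUID string, matching the worker's varchar(36)
    start_date datetime    NULL,      -- NULL = job not yet started
    -- The clustered index on id is what makes the XLOCK/READPAST
    -- pattern below lock individual rows instead of ranges or pages.
    CONSTRAINT PK_tbl_AUFTRAG PRIMARY KEY CLUSTERED (id)
);
```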

 

Step 2) Create an Integration Services package and add Execute SQL Tasks with the following code; each task acts as a worker that consumes jobs from the job table:

 

while exists (select 1 from tbl_AUFTRAG with (readpast) where start_date is null)
begin

    begin transaction

    declare @id varchar(36)

    -- XLOCK keeps the exclusive lock until the end of the transaction;
    -- READPAST skips rows that are already locked by other workers.
    set @id = (select top 1 id from tbl_AUFTRAG with (xlock, readpast) where start_date is null)

    update tbl_AUFTRAG set start_date = getdate() where id = @id

    commit

    -- @id can be null if another worker claimed the last open job
    -- between the while check and the select, so guard the call.
    if @id is not null
        EXEC storedprocedure  -- Do the actual work here. Pass parameters from the tbl_AUFTRAG job table.

end
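To drive the workers, jobs are enqueued by inserting rows with a NULL start_date. A sketch, assuming GUID strings as ids (consistent with the varchar(36) declaration in the worker script; the post does not show how ids are generated):

```sql
-- Enqueue one job: a NULL start_date marks it as not yet started.
INSERT INTO tbl_AUFTRAG (id, start_date)
VALUES (CAST(NEWID() AS varchar(36)), NULL);
```

Once rows exist, any number of the Execute SQL Task workers can run this loop in parallel; READPAST ensures each worker simply skips jobs another worker has already locked.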


Comments (3)

  1. Good stuff. Both internal and external processes can send and receive messages by using extensions to Transact-SQL Data Manipulation Language, which is convenient.

  2. Leiner says:

    I appreciate the German flair but why (1) prefix the table with "tbl_" and (2) why inadvertently obfuscate by saying "auftrag" instead of "order"? Apologies, I don't mean to sound rhetorical, these are genuine questions.

    1. LukasSteindl says:

      most of the scripts are the result of customer engagements /sometimes are the result of interactive live demos. if a certain idea is reused multiple times i take the time to polish the code. (i realized that i can write many more blog posts if i lower the initial quality a bit. 🙂 )
