Enqueue each row of a large table into a Service Broker queue
I have a table with 2.5 million rows; each row has a single column of type xml. When a message arrives in another queue (the trigger queue), all records should be deleted from the table and enqueued into a SQL Server Service Broker queue. Performance is very important, and right now it is too slow. What would be the best way to achieve this?
Currently we use an activated stored procedure on the trigger queue which runs this in a while (@message is not null) loop:

    begin transaction;
        delete top (1) from table
            output deleted.message into @tempTable;
        select top (1) @message = message from @tempTable;
        send on conversation @handle (@message);
    commit transaction;
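For context, here is a minimal sketch of what such an activated procedure might look like end to end. All object names (queue, table, service, contract, and message type) are hypothetical, not from the original post:

```sql
-- Sketch of an activated procedure on the trigger queue.
-- All object names below are hypothetical placeholders.
create procedure dbo.ProcessTriggerQueue
as
begin
    set nocount on;
    declare @dialog uniqueidentifier, @msgType sysname;

    -- Pick up the trigger message that caused activation.
    receive top (1) @dialog = conversation_handle,
                    @msgType = message_type_name
    from dbo.TriggerQueue;

    if (@dialog is null) return;  -- spurious activation, nothing to do

    declare @message xml;
    declare @tempTable table (message xml);
    declare @handle uniqueidentifier;

    -- Open one conversation to the target service, then drain the
    -- table one row per transaction, as in the question.
    begin dialog conversation @handle
        from service [InitiatorService]
        to service 'TargetService'
        on contract [RowContract]
        with encryption = off;

    while (1 = 1)
    begin
        begin transaction;
            delete @tempTable;
            delete top (1) from dbo.SourceTable
                output deleted.message into @tempTable;
            select top (1) @message = message from @tempTable;
            if (@message is null)
            begin
                commit transaction;
                break;  -- table is empty
            end
            send on conversation @handle
                message type [RowMessage] (@message);
        commit transaction;
    end
end
```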
Are there faster ways to tackle this problem?
By the way, before someone asks: we need to start from the table because it is filled with the output of an earlier MERGE statement.
So your performance problem is on the send side rather than the receive side, right? (It's a bit unclear from your question.) In that case, you'll want to start by trying the following:
- Batch many operations into a single transaction. Your biggest cost is most likely the synchronous log flush at each commit, so committing once per row means one log flush per row.
- Process the table more efficiently: for example, select many rows at once into the temp table and then use a cursor to iterate over it and send the messages.
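Putting both suggestions together, a batched variant might look roughly like this. The batch size and all object names are assumptions for illustration, not from the original post:

```sql
-- Batched variant: delete up to @batchSize rows and send them all
-- inside a single transaction, so the log is flushed once per batch
-- instead of once per row. Names and batch size are hypothetical.
declare @batchSize int = 1000;
declare @handle uniqueidentifier;  -- conversation opened elsewhere
declare @message xml;
declare @batch table (id int identity primary key, message xml);

while (1 = 1)
begin
    begin transaction;
        delete @batch;
        delete top (@batchSize) from dbo.SourceTable
            output deleted.message into @batch (message);

        if not exists (select 1 from @batch)
        begin
            commit transaction;
            break;  -- table is empty
        end

        -- Iterate the batch with a cursor and send each row;
        -- all sends commit together with their deletes.
        declare c cursor local fast_forward for
            select message from @batch order by id;
        open c;
        fetch next from c into @message;
        while (@@fetch_status = 0)
        begin
            send on conversation @handle
                message type [RowMessage] (@message);
            fetch next from c into @message;
        end
        close c;
        deallocate c;
    commit transaction;
end
```

Tuning @batchSize trades commit overhead against transaction duration and log growth; values in the hundreds to low thousands are a common starting point.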
In case you're experiencing problems on the receive side, take a look at this great article by Remus.