My Adventures in Coding

March 6, 2014

SQL Server – Alter database in Single User mode to Multi User mode

Filed under: SQL,SQL Server 2005 — Brian @ 8:43 am

We have some test environment databases that get rebuilt nightly. The job that does the rebuild switches the database into single user mode while it is being rebuilt and switches it back to multi user mode when the rebuild is finished. However, if a script fails and the job aborts, the database is left in single user mode.

First, open a SQL Server Management Studio query window connected to database “master”.

The command to change the database back to multi user mode is:

ALTER DATABASE {InsertDatabaseNameHere} SET MULTI_USER WITH ROLLBACK IMMEDIATE
GO

However, if there is an existing process blocking this alter, you may get the error message:

“Transaction (Process ID 864) was deadlocked on lock resources with another process and has been chosen as the deadlock victim”

Since the existing running process and your current request have the same deadlock priority, preference is given to the longer-running process, which is why your ALTER DATABASE command is chosen as the deadlock victim. To deal with this, set the deadlock priority to HIGH when you run the command again. If that still does not work, you will need to kill the existing connection.

To find the “spid” of the existing connection, you can use the following query:

SELECT sd.[name], sp.spid, sp.login_time, sp.loginame 
FROM sysprocesses sp 
INNER JOIN sysdatabases sd on sp.dbid = sd.dbid  
WHERE sd.[name] = 'DatabaseNameGoesHere'

As an alternative, you can also use the command “sp_who” to get the “spid” of the open connection:

exec sp_who

So, here is our revised command:

KILL SpidToKillGoesHere
GO
SET DEADLOCK_PRIORITY HIGH
GO
ALTER DATABASE DatabaseNameGoesHere SET MULTI_USER WITH ROLLBACK IMMEDIATE
GO

That should do it!
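
If you want to double check that the change took effect, the user_access_desc column in sys.databases should now show MULTI_USER for the database:

SELECT [name], user_access_desc
FROM sys.databases
WHERE [name] = 'DatabaseNameGoesHere'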


May 27, 2012

C# – Bulk copying data into MS SQL Server with DataTables

Filed under: c#,SQL Server 2005 — Brian @ 9:45 pm

Recently, we were asked to start pulling data daily from a number of sources (e.g., several REST APIs), aggregating the data, and saving it to a database to be used for generating reports. As usual we want to make sure the application is easy to test (we need to make sure those stats are correct!), but we also need to ensure it performs well, because every time this job runs it can add hundreds of thousands of rows to a number of different tables. We were worried that the bottleneck in this application would be running all those insert statements against our MS SQL Server database. However, we were in luck: .NET has a handy class called DataTable.

A DataTable is basically an in-memory representation of an MS SQL Server table. DataTables allow you to create the table in memory, add rows to it, edit values in specific columns of a row, etc., until all the data is exactly what you want. Once the DataTable is ready, it is just a simple SqlBulkCopy call to insert all the data at once. So rather than hundreds of thousands of insert statements, it is just one bulk copy, and rather than taking minutes or longer to run, it takes only seconds to dump all the data into MS SQL Server. Also, because the data is all in memory, it is very easy to test our stats: we simply pass in the data we would receive and assert on the values in the DataTables. That’s all!

The following is a simple example where we are saving daily sales figures for each sales person.

Create the table

CREATE TABLE [dbo].[DailySalesStats](
	[Date] [smalldatetime] NOT NULL,
	[SalesPersonId] [int] NOT NULL,
	[TotalSales] [int] NOT NULL,
 CONSTRAINT [PK_DailySalesStats] PRIMARY KEY CLUSTERED 
(
	[Date] ASC,
	[SalesPersonId] ASC
)) ON [PRIMARY]

Example of writing sales stats to the dbo.DailySalesStats table using a DataTable and SqlBulkCopy

using System;
using System.Data;
using System.Data.SqlClient;

namespace DataTableExample
{
    class Program
    {
        static void Main(string[] args)
        {
            // Create a datatable with three columns:
            DataTable dailySalesStats = new DataTable("DailySalesStats");

            // Create Column 1: Date
            DataColumn dateColumn = new DataColumn();
            dateColumn.DataType = Type.GetType("System.DateTime");
            dateColumn.ColumnName = "Date";

            // Create Column 2: SalesPersonId
            DataColumn salesPersonIdColumn = new DataColumn();
            salesPersonIdColumn.DataType = Type.GetType("System.Int32");
            salesPersonIdColumn.ColumnName = "SalesPersonId";

            // Create Column 3: TotalSales
            DataColumn totalSalesColumn = new DataColumn();
            totalSalesColumn.DataType = Type.GetType("System.Int32");
            totalSalesColumn.ColumnName = "TotalSales";

            // Add the columns to the dailySalesStats DataTable
            dailySalesStats.Columns.Add(dateColumn);
            dailySalesStats.Columns.Add(salesPersonIdColumn);
            dailySalesStats.Columns.Add(totalSalesColumn);

            // Let's populate the datatable with our stats, which could come from REST APIs, etc.
            // You can add as many rows as you want here!

            // Create a new row
            DataRow dailySalesStatsRow = dailySalesStats.NewRow();
            dailySalesStatsRow["Date"] = DateTime.Now.ToString("yyyy-MM-dd");
            dailySalesStatsRow["SalesPersonId"] = 1;
            dailySalesStatsRow["TotalSales"] = 2;

            // Add the row to the dailySalesStats DataTable
            dailySalesStats.Rows.Add(dailySalesStatsRow);

            // Copy the DataTable to SQL Server
            using(SqlConnection dbConnection = new SqlConnection("Data Source=dbhost;Initial Catalog=dbname;Integrated Security=SSPI;Connection Timeout=60;Min Pool Size=2;Max Pool Size=20;"))
            {
                dbConnection.Open();
                using (SqlBulkCopy s = new SqlBulkCopy(dbConnection))
                {
                    s.DestinationTableName = dailySalesStats.TableName;
                    foreach (var column in dailySalesStats.Columns)
                        s.ColumnMappings.Add(column.ToString(), column.ToString());
                    s.WriteToServer(dailySalesStats);
                }
            }

            // That's it, we are done!
        }
    }
}

The Results

select * from dbo.DailySalesStats

Date                SalesPersonId  TotalSales
2012-05-27 00:00:00	   1	        2

Overall I have been really happy with the DataTable and SqlBulkCopy features in .NET. It is a fast and easy way to load a lot of data into an MS SQL Server database and is a good fit for our statistics gathering job!

February 29, 2012

Scala – Connecting a Scala app to MS SQL Server with Squeryl

Filed under: Scala,SQL Server 2005 — Brian @ 9:21 pm

For all of our Scala applications up until now, we have been using MongoDB. However, recently we needed to connect one of our Scala applications to a Microsoft SQL Server database. One of my excellent co-workers pointed us at a really nice project called Squeryl, which provides a very simple library for doing just that. The documentation is excellent and great examples are provided to help get you up and running quickly.

Let’s look at a simple example.

Create a table in your SQL Server database:

CREATE TABLE [dbo].[Customer](
	[id] [int] NOT NULL,
	[name] [varchar] (100) NOT NULL,
 CONSTRAINT [PK_Customer] PRIMARY KEY CLUSTERED 
(
	[id] ASC
)) ON [PRIMARY]

Insert a row:

INSERT INTO dbo.Customer(id, name)
VALUES(1,'John Smith')

Get all of the libraries you will need to make the program run:

  • Download Squeryl jar (2.9.0)
  • Download jtds driver jar (1.2.5)
  • Download cglib jar (2.2.2)
  • Download asm jar (3.1)

Now, let’s run our test code to select a row from our Customer table:

package spikes

import org.squeryl.Session
import org.squeryl.adapters.MSSQLServer
import org.squeryl.SessionFactory
import org.squeryl.PrimitiveTypeMode._
import org.squeryl.Schema

// Create a Customer class which has the same fields as the Customer table in SQL Server
case class Customer(id: Long, name: String) {}

object SqlSpike extends App with Schema {
  val databaseConnectionUrl = "jdbc:jtds:sqlserver://myservername;DatabaseName=mydatabasename"
  val databaseUsername = "myusername"
  val databasePassword = "password"

  // Set the jtds driver
  Class.forName("net.sourceforge.jtds.jdbc.Driver")

  // Connect to the database
  SessionFactory.concreteFactory = Some(()=>
    Session.create(
      java.sql.DriverManager.getConnection(databaseConnectionUrl, databaseUsername, databasePassword),
      new MSSQLServer))

  // Setup the Customer class to be mapped to the "Customer" table in SQL Server
  val customers = table[Customer]("Customer")

  // Select Customer with id=1 from the Customer table
  transaction {
    val customer = customers.where(c=> c.id === 1).single
    println("Customer name: " + customer.name)
  }
}

Finally, the output of the program:

[info] Running spikes.SqlSpike 
Customer name: John Smith
[success] Total time: 4 s, completed 29-Feb-2012 8:42:24 PM

Overall we have found the Squeryl library to be very useful. If you need to connect a Scala application to a SQL database I recommend you give it a try, it should save you some aggravation.

January 25, 2009

SQL Tips – Update From, Delete Into, and more

Filed under: SQL,SQL Server 2005 — Brian @ 10:43 pm

Over the last few years I have spent a lot of time working with Microsoft SQL Server. I frequently get asked SQL questions by my co-workers, so I thought I would compile a list of some of the most commonly asked SQL questions.

Checking Index Fragmentation

As an index becomes more fragmented, performance of the index will degrade. You can check the fragmentation of an index with the following query:

declare @table sysname;
set @table = 'dbo.MYTABLE';
declare @indexid int;
select      @indexid = index_id
from        sys.indexes
where       object_id = object_id(@table) and [name] = 'IX_MYTABLE_MYINDEX';

select      *
from        sys.dm_db_index_physical_stats(db_id(), object_id(@table), @indexid, NULL, NULL);
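
Once you know how fragmented an index is (avg_fragmentation_in_percent is the column to watch in that result), the usual follow-up is an ALTER INDEX. Here is a rough sketch using the same example index; the commonly quoted rule of thumb is to reorganize at low-to-moderate fragmentation and rebuild when it gets heavy, but tune the thresholds for your own environment:

-- Light to moderate fragmentation: reorganize (always an online operation)
ALTER INDEX IX_MYTABLE_MYINDEX ON dbo.MYTABLE REORGANIZE;

-- Heavy fragmentation: rebuild the index
ALTER INDEX IX_MYTABLE_MYINDEX ON dbo.MYTABLE REBUILD;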

Update From (With Joins)

When migrating data it is common to update a value in one table, filtering by a value stored in another table. Usually the simple answer is to do the following:

UPDATE dbo.Friends
SET status='Declined'
WHERE LastName IN
(
	Select LastName
	From dbo.Enemies
	Where status='Active'
)

However, sometimes you may need to filter an update based on the values in two columns such as LastName and FirstName. This is where the SQL statement “UPDATE FROM” becomes very useful:

UPDATE fr
SET status='Declined'
FROM dbo.Friends fr
JOIN dbo.Enemies en
	ON fr.LastName = en.LastName
	AND fr.FirstName = en.FirstName
WHERE en.status='Active'

Delete Into (Output)

Let’s say that I want to delete old friends, but at the same time I also need to capture the list of friends deleted so that I can clean up data in other tables. This could be accomplished with two statements, a SELECT followed by a DELETE, but it is much cleaner to do it in one statement using the OUTPUT clause of DELETE. Here is a simple example:

Declare a table to store the deleted friendIDs:

declare @deletedFriendIDs table (FriendID int);

Delete old Friends, and output the list into the table variable:

delete	
	from	dbo.Friends
	output	deleted.FriendID INTO @deletedFriendIDs
	where	LastAccessDate < (getdate() - 90)

Now let’s display the list of the deleted FriendIDs to make sure that it worked

Select * from @deletedFriendIDs
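
From here, cleaning up related tables is easy because the deleted IDs are sitting in the table variable. For example (dbo.FriendPhotos is just a made up table for illustration):

DELETE FROM dbo.FriendPhotos
WHERE FriendID IN (SELECT FriendID FROM @deletedFriendIDs)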

Case Sensitive Query

By default, SQL Server string comparisons are case insensitive. However, sometimes you may need to run a query where case matters. The following example will return all Friends with the first name “chris” but exclude “Chris”.

select * from dbo.Friends
where FirstName = 'chris'
COLLATE SQL_Latin1_General_CP1_CS_AS

How to Capitalize the first letter and make all other letters lower case

It is common that you might have some data, such as a list of names where the first letter is capitalized in some names and not in others, so you will want to clean up this data. Here is a simple query to capitalize the first letter in each name and make the rest of the name lower case:

update dbo.Friends
Set firstname = upper(left(firstname, 1)) + lower(right(firstname, len(firstname) - 1)),
lastname = upper(left(lastname, 1)) + lower(right(lastname, len(lastname) - 1))

Removing spaces from a varchar

It is possible that some data slipped into your database without spaces being trimmed off. If this happens, then once the code bug is fixed you will need to clean up the existing data. Here is a simple query to do it:

UPDATE dbo.Friends
set LastName = replace (LastName,' ','')
where LastName IS NOT NULL

How to get the total row count of a very large table without using a “Select count(*)…” statement

For very large production tables, the typical “Select count(*) from dbo.Friends” statement can be expensive and slow. SQL Server keeps a row count for each table in its metadata, so rather than running an expensive select you can just read that value:

select      ISNULL(sum(spart.rows), 0)
from        sys.partitions spart
where       spart.object_id = object_id('dbo.Friends') and spart.index_id < 2

December 21, 2008

SQL Server Quirk – Index Being Ignored

Filed under: SQL Server 2005 — Brian @ 12:50 pm

I recently ran into an interesting case with an index. The table had an index on the column “Status”, however when querying on “Status” the SQL optimizer was deciding not to use the index, even though skipping it actually hurt performance. The issue was in checking multiple values of “Status” in the select:

Where Status='Active' – uses index
Where Status='New' – uses index
Where Status IN ('Active','New') – ignores index
Where Status='Active' OR Status='New' – ignores index

The interesting thing was that when this table had fewer than 50,000 rows it used the index, but after the table grew in size the SQL optimizer decided that the query was no longer “selective” enough to justify using the index. By ignoring the index the optimizer lowered the CPU usage of the query but as a side effect increased disk IO (the optimizer figured this trade-off was better overall). However, this caused the number of reads for a lookup to be significantly higher (on average reading 500,000+ rows on every lookup), costing us a great deal of disk IO. For our system IO was the bottleneck (not CPU), and ignoring this index caused significant production issues.

So the fix was to use a UNION ALL and do two selects, rather than one select that checks two values (yes, you could also hard code the index usage into the select, but that has its own problems; an example of that index hint approach is shown after the query below). Note: the reason we use UNION ALL here is that it avoids the cost of having to check for duplicate values in the results. In our case we do not have to worry about duplicates (no row can have both status 'Active' and 'New' at the same time), so we can use UNION ALL to improve performance.

Select FirstName, LastName From dbo.Friends
Where Status='Active'
UNION ALL
Select FirstName, LastName From dbo.Friends
Where Status='New'
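
For comparison, this is roughly what the hard coded index hint approach mentioned above would look like (IX_Friends_Status is just a placeholder for whatever the actual index on Status is called). The downside is that the hint is always obeyed, even when the optimizer would have chosen a better plan:

SELECT FirstName, LastName
FROM dbo.Friends WITH (INDEX(IX_Friends_Status))
WHERE Status IN ('Active', 'New')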

The Tipping Point

If you are interested in reading about the cases in which an index in SQL Server will no longer be used you can find some great information at sqlskills.com. Kimberly Tripp has an article explaining The Tipping Point.

December 7, 2007

How to use Recursion with FOR XML PATH

Filed under: SQL,SQL Server 2005 — Brian @ 10:17 pm

If you have tried writing a recursive function using FOR XML EXPLICIT you will know how tedious it can be. However, with FOR XML PATH recursion is simple and incredibly fast. I recently had to write a recursive function to return all province and city data in Canada in our system as a formatted XML document. The following is just a simple example I wrote to demonstrate how to accomplish this task with FOR XML PATH (NOTE: this function is just a basic example, it is not the full version I wrote for the task I was working on). If you need to write a recursive function, I hope this helps you get started!

This example recursive function takes in a LocationID and produces an XML document with every Location descendant of the given LocationID:

CREATE FUNCTION dbo.GetAllLocationsFromParent(@Parent int)
RETURNS XML
WITH RETURNS NULL ON NULL INPUT
BEGIN RETURN
  (SELECT loc.LocationID as '@id',
	loc.[name] as '@name',
	loct.[Name] as '@type',
      CASE WHEN loc.Parent=@Parent
      THEN dbo.GetAllLocationsFromParent(loc.LocationID) --Recursive call
      END
	FROM dbo.Location loc
		JOIN dbo.LocationType loct ON loct.LocationTypeID = loc.LocationTypeID
	WHERE loc.Parent=@Parent
		and loc.[Status] = 'Approved'
   FOR XML PATH('location'), TYPE)
END
GO
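
Calling the function is then just a single select, passing in the parent LocationID you want to start from:

SELECT dbo.GetAllLocationsFromParent(0) AS LocationsXml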

The following is a sample of the format of the XML document if the parent LocationID passed in was 0 for World, where all countries and their corresponding descendants would be returned:

<location id="1" name="Canada" type="Country">
    <location id="123" name="Saskatchewan" type="State">
      <location id="12345" name="Saskatoon" type="City">
        </location>
        ...
      </location>
      ...
    </location>
    ...
  </location>
   ...
  </location>
  ...

That’s all! Writing a recursive function with FOR XML PATH is very straightforward.

November 26, 2007

What to do when a Service Broker Queue Appears Stalled

Filed under: SQL Server 2005 — Brian @ 10:21 am

What happens sometimes with a service broker queue, usually after a db failure or some other issue, is that you end up with a queue that is enabled and has activation turned on, but is not processing requests.
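
A quick way to confirm the queue really is in this state is to check sys.service_queues (is_receive_enabled and is_activation_enabled should both be 1):

SELECT [name], is_receive_enabled, is_activation_enabled, activation_procedure
FROM sys.service_queues
WHERE [name] = 'MyQueue'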

So you will try and run the following to fix the issue:

ALTER QUEUE dbo.MyQueue
WITH STATUS = ON , RETENTION = OFF ,
ACTIVATION ( STATUS = ON )

However, this will NOT fix the issue.

The reason is that service broker sees that activation is already turned on for the queue and simply does nothing. So what you need to do is first turn activation OFF, then turn activation back ON. I know this seems strange, however it is the only way we have found to restart a queue once this problem has occurred.

So the fix is to run the following script:

ALTER QUEUE dbo.MyQueue
WITH STATUS = ON , RETENTION = OFF ,
ACTIVATION ( STATUS = OFF )
GO

ALTER QUEUE dbo.MyQueue
WITH STATUS = ON , RETENTION = OFF ,
ACTIVATION ( STATUS = ON )
GO

November 22, 2007

SQL Server: Service Broker Tips and Tricks

Filed under: SQL Server 2005 — Brian @ 8:31 am

In using Service Broker Queues for the last few months I have come up with a list of very handy frequently used queries.

Is Service Broker Enabled?

SQL Server has a database wide setting that allows all service broker queues in the database to be disabled by setting a single flag. The problem is that if a database is moved or restored, this flag is automatically reset to false, and there is no way to override that. Any messages sent to queues while service broker is disabled will be lost, because this setting disables receiving of messages on all queues.

Check if service broker is enabled

SELECT is_broker_enabled FROM sys.databases WHERE name = 'MyDatabaseName'

Enable service broker

--Waits for all open transactions to finish before making change
ALTER DATABASE MyDatabaseName set NEW_BROKER
OR
--Rolls back all other transactions instead of waiting
ALTER DATABASE MyDatabaseName set NEW_BROKER WITH ROLLBACK IMMEDIATE

Those commands work fine on a local machine, but if you need to enable service broker on a database where others have open connections (e.g., query window), you will need the following:

ALTER DATABASE myDatabase set single_user with rollback immediate
GO
ALTER DATABASE myDatabase set new_broker
GO
ALTER DATABASE myDatabase set multi_user

Activate/Enable a Queue

Activation on a queue allows the attached stored procedure to be run any time a new message is inserted into the queue. (Note: I am also ensuring the queue itself is enabled in this ALTER statement.) If a queue is not enabled, it will not accept messages, so any messages sent to the queue will be lost!

Enable activation on the queue to process messages

ALTER QUEUE dbo.MyQueue
WITH STATUS = ON , RETENTION = OFF ,
ACTIVATION ( STATUS = ON )

Disable activation on the queue to stop processing messages

ALTER QUEUE dbo.MyQueue
WITH STATUS = ON , RETENTION = OFF ,
ACTIVATION ( STATUS = OFF )

Service Broker General Queue Information

This query gives you all information about each queue in the database. Use this query to find out if a queue is enabled, if activation is enabled on the queue, etc.

select * from sys.service_queues

Viewing Messages on a Queue

If you need to view the body of messages on a service broker queue, you can use this query to return the XML content of each message on the queue.

SELECT cast(Message_Body as XML)
FROM dbo.QueueName with(nolock)

Count Messages on a Queue

You can use “Select count(*) from dbo.QueueName”, however that is slow, especially if there are thousands of messages presently on the queue. It is much faster to use the message count that SQL Server stores for each queue.

select p.rows
from sys.objects as o
join sys.partitions as p on p.object_id = o.object_id
join sys.objects as q on o.parent_object_id = q.object_id
where q.name = 'queuename'
and p.index_id = 1

Get the SPID of a Running Activation Sproc

When an activation sproc is running on prod and causing problems, it can be difficult to figure out which SPID is associated with that sproc when we need to kill that process. The following query makes that easy to find:

select * from sys.dm_broker_activated_tasks
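
If you only care about one particular queue, a filter like the following narrows it down (dbo.MyQueue is just a placeholder for your queue name):

SELECT t.spid, t.procedure_name, sp.login_time, sp.loginame
FROM sys.dm_broker_activated_tasks t
JOIN sysprocesses sp ON sp.spid = t.spid
WHERE t.queue_id = object_id('dbo.MyQueue')
AND t.database_id = db_id()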

Find out if a conversation has been ended

If you ever get this error message, then you have a conversation that has not been ended.

Msg 8429, Level 16, State 1, Line 46
The conversation endpoint is not in a valid state for SEND. The current endpoint state is 'DI'.

DI means “Disconnected Inbound”: the other end of the conversation has been ended with an END CONVERSATION command, but this side of the conversation has been left open. To check for unended conversations use the following command:

select * from sys.conversation_endpoints

To close the conversation, get the conversation_handle from the above command, then run an END CONVERSATION on that conversation handle. Here is an example:

END CONVERSATION '73645350-72C4-DC11-9374-0003FF3A5C87'
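
If you end up with a lot of conversations stuck in the 'DI' state, something along these lines will end them all in one go. This is only a sketch; review the list from sys.conversation_endpoints before running it:

DECLARE @handle UNIQUEIDENTIFIER;
DECLARE diConversations CURSOR LOCAL FAST_FORWARD FOR
	SELECT conversation_handle
	FROM sys.conversation_endpoints
	WHERE state = 'DI';
OPEN diConversations;
FETCH NEXT FROM diConversations INTO @handle;
WHILE @@FETCH_STATUS = 0
BEGIN
	END CONVERSATION @handle;
	FETCH NEXT FROM diConversations INTO @handle;
END
CLOSE diConversations;
DEALLOCATE diConversations;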

Useful Resources

How to troubleshoot service broker problems

October 23, 2007

SQL Server: Handling Failures Gracefully in Service Broker Queues

Filed under: SQL Server 2005 — Brian @ 8:23 am

In my previous post I provided an example for setting up a Service Broker Queue. One of the issues you will most likely encounter soon after using Service Broker Queues is a message that fails when being processed. With Service Broker Queues it is strictly enforced that all requests in the queue be processed in order. The problem is that when an error occurs, it stops the queue and prevents any other messages from being processed until someone intervenes and fixes the bad message (usually a poor systems administrator who is called in the middle of the night!).

What if you are using the queue to process messages where order is not that important? If a message fails, you would rather have it skipped and placed on a dead letter queue instead of interrupting processing of the rest of the messages on the queue. Unfortunately, service broker does not offer this feature, unlike many other queuing systems.

The following is an example of how to implement a simple dead letter queue for your Service Broker Queue. Note: many items, such as the schema and message type, will be the same as the ones used by your main queue, because both queues will be handling the same messages.

STEP 1: Create a RECEIVE sproc for clearing the queue

This sproc will be used to empty the dead letter queue when you want to reprocess any messages sitting in the queue. This may seem like a lot of SQL for this task, but it will make reprocessing entries in the queue later a simple one line of SQL.

CREATE PROCEDURE [dbo].[fooDeadLetterReceive]
AS
SET NOCOUNT ON;
DECLARE @dh UNIQUEIDENTIFIER;
DECLARE @mt SYSNAME;
DECLARE @MessageBody XML;

BEGIN TRANSACTION;
WAITFOR (RECEIVE TOP (1) @dh = conversation_handle,
	@mt = message_type_name,
	@MessageBody = message_body
	FROM [dbo].[fooDeadLetterQueue]), TIMEOUT 1000;
WHILE (@dh IS NOT NULL)
BEGIN
	BEGIN TRY
		
		--Send message back to the main queue to be processed
		DECLARE @handle uniqueidentifier;

		IF (convert(varchar, @MessageBody.query('count(/Root/*)')) != 0)
		BEGIN
			BEGIN DIALOG CONVERSATION	@handle
			FROM SERVICE	fooService
			TO SERVICE	'fooService'
			ON CONTRACT	fooMessageContract
			WITH ENCRYPTION = OFF;
			SEND
			ON CONVERSATION	@handle
			MESSAGE TYPE	fooMessage (@MessageBody);
			END CONVERSATION @handle;
		END
		
		END CONVERSATION @dh;

		COMMIT;

	END TRY
	BEGIN CATCH
			-- If there are any exceptions then end conversation,
			-- rollback the transaction and stop processing the queue
			END CONVERSATION @dh;
			ROLLBACK TRANSACTION;
			ALTER QUEUE [dbo].[fooDeadLetterQueue] WITH ACTIVATION ( STATUS = OFF );
			BEGIN TRANSACTION
			BREAK
	END CATCH	-- Try to loop once more if there are more messages
	SELECT @dh = NULL;
	BEGIN TRANSACTION;
	WAITFOR (RECEIVE TOP (1) @dh = conversation_handle,
		@mt = message_type_name,
		@MessageBody = message_body
		FROM [dbo].[fooDeadLetterQueue]), TIMEOUT 1000;
END
COMMIT;

--Shutdown processing of the deadletter queue when we have gone through
--all existing messages in the queue
ALTER QUEUE [dbo].[fooDeadLetterQueue] WITH ACTIVATION ( STATUS = OFF );
GO

STEP 2: Create a SEND sproc for inserting messages into the queue

This sproc will be used later to insert messages into the dead letter queue when the main queue runs into a problem processing a message (called from the CATCH block of the main queue's receive sproc).

CREATE PROCEDURE [dbo].[fooDeadLetterSend]
	@body XML
AS
DECLARE @handle uniqueidentifier;

IF (convert(varchar, @body.query('count(/Root/*)')) != 0)
BEGIN
	BEGIN DIALOG CONVERSATION	@handle
	FROM SERVICE	fooDeadLetterService
	TO SERVICE	'fooDeadLetterService'
	ON CONTRACT	fooMessageContract
	WITH ENCRYPTION = OFF;		
	SEND
	ON CONVERSATION	@handle
	MESSAGE TYPE	fooMessage (@body);
	END CONVERSATION @handle;
END

STEP 3: Create a Dead Letter queue

Nothing special here, just create a new queue called fooDeadLetterQueue. We also attach the activation sproc that will be used for reprocessing failed messages.

CREATE QUEUE [dbo].[fooDeadLetterQueue] WITH STATUS = ON , RETENTION = OFF , 
ACTIVATION (  STATUS = OFF , PROCEDURE_NAME = [dbo].[fooDeadLetterReceive] , 
MAX_QUEUE_READERS = 1 , EXECUTE AS OWNER  ) ON [PRIMARY]

Also note that activation on the queue (STATUS=OFF) is currently disabled. This is done on purpose. On a normal queue we always enable activation so that any time a message is received by the queue, it will be immediately processed. In the case of a dead letter queue we do not want messages processed until the messages have been checked/fixed.

STEP 4: Update the Catch block in the receive sproc of your main queue

Normally, in the CATCH block of the TRY CATCH in a service broker receive sproc, if an exception is thrown while processing a message you just log the error and then turn activation off on the queue so no more messages will be processed.

With your new dead letter queue, rather than turning off the main queue, we instead take the message that failed and send it to the dead letter queue. The problem message is removed from the main queue and we can continue processing the rest of the messages in our queue. Now those systems administrators can finally relax.

BEGIN CATCH
	-- If there are any exceptions then end conversation rollback the transaction
	END CONVERSATION @dh;
	ROLLBACK TRANSACTION;
	
	--Send problem message to the dead letter queue
	exec dbo.fooDeadLetterSend @MessageBody

	BEGIN TRANSACTION
END CATCH	-- Try to loop once more if there are more messages

STEP 5: Reprocessing messages from the dead letter queue

To reprocess the queue you simply have to copy the messages in the dead letter queue back to the main queue. Now that RECEIVE sproc we wrote earlier comes in handy. Simply turn activation on for the dead letter queue. This will take all existing messages on the queue and push them back to the main queue for processing. When the dead letter queue is empty it shuts itself off again.

ALTER QUEUE dbo.fooDeadLetterQueue
WITH STATUS = ON , RETENTION = OFF ,
ACTIVATION ( STATUS = ON )
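
One last tip: before flipping activation on, it can be worth peeking at what is actually sitting in the dead letter queue (the same trick used for viewing messages on any other queue):

SELECT cast(message_body as XML)
FROM dbo.fooDeadLetterQueue with(nolock)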

October 15, 2007

Service Broker Tutorial – SQL Server 2005

Filed under: SQL Server 2005 — Brian @ 5:46 pm

The following is meant as a “Quick Start” tutorial for anyone needing to create a service broker queue under extreme time pressure! This is a generalized example of a db script for a new service broker queue called fooQueue. Replace attributes foo1, foo2 with whatever attributes (and types) you need in your message. Replace foo with a relevant name, e.g. CustomerOrders. Replace schemaName with dbo or with whatever schema name you are using. (I have experienced tricky permissions issues in the past with schema names other than dbo.)

NOTE: During our use of service broker, we ran into a number of “Gotchas”. The bonus of these examples is that all of those fixes are built into them. So if you need a high performance service broker queue where all of the problems have been ironed out, then feel free to cut and paste this example.

STEP 1: Create the message schema

This schema defines the XML messages that can be placed onto the service broker queue.

CREATE XML SCHEMA COLLECTION [schemaName].[fooMessageXmlSchema] 
AS N'<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
      <xsd:complexType name="foo">
            <xsd:attribute name="foo1" type="xsd:integer" />
            <xsd:attribute name="foo2" type="xsd:integer" />
      </xsd:complexType>
      <xsd:complexType name="Root">
            <xsd:sequence>
                 <xsd:element name="foo" type="foo" minOccurs="0" maxOccurs="unbounded"/>
            </xsd:sequence>
      </xsd:complexType>
      <xsd:element name="Root" type="Root" />
</xsd:schema>'

STEP 2: Create the Send Request Stored Procedure

This procedure is responsible for taking a properly formatted message and sending it to the service broker queue.

CREATE PROCEDURE [schemaName].[fooSend]
      @body XML (DOCUMENT fooMessageXmlSchema)
AS
      DECLARE @handle uniqueidentifier;

      IF (convert(varchar, @body.query('count(/Root/*)')) != 0)
      BEGIN
            BEGIN DIALOG CONVERSATION     @handle
            FROM SERVICE      fooService
            TO SERVICE  'fooService'
            ON CONTRACT fooMessageContract
            WITH ENCRYPTION = OFF;
            SEND
            ON CONVERSATION   @handle
            MESSAGE TYPE      fooMessage (@body);
            END CONVERSATION @handle;
      END
GO

STEP 3: Send Stored Procedure Caller

This piece of code creates a new message that can then be placed on the queue. Place this code inside some calling stored procedure and/or trigger.

DECLARE @body XML (DOCUMENT fooMessageXmlSchema);
      SELECT @body =  (
           SELECT (
                SELECT
                     i.foo1 as "@foo1",
                     i.foo2 as "@foo2"
                FROM inserted i
                FOR XML PATH ('foo'), TYPE
           ) FOR XML PATH ('Root'), TYPE
     );

      EXEC schemaName.fooSend @body;

STEP 4: Create the Receive Stored Procedure

The receive stored procedure is where the bulk of your work will be done. This is the sproc that will be called by the service broker queue every time a new message is inserted into the queue.

CREATE PROCEDURE [schemaName].[fooReceive]
AS
SET NOCOUNT ON;
DECLARE @dh UNIQUEIDENTIFIER;
DECLARE @mt SYSNAME;
DECLARE @MessageBody XML (DOCUMENT fooMessageXmlSchema);
DECLARE @foo1 int;
DECLARE @foo2 int;
DECLARE @fooCount int;
DECLARE @fooCurrent int;
BEGIN TRANSACTION;
WAITFOR (RECEIVE TOP (1) @dh = conversation_handle,
      @mt = message_type_name,
      @MessageBody = message_body
      FROM [schemaName].[fooQueue]), TIMEOUT 1000;
WHILE (@dh IS NOT NULL)
BEGIN
      BEGIN TRY
            IF @mt = N'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog'
            BEGIN
		END CONVERSATION @dh;
            END
            ELSE IF (@mt = N'fooMessage')
            BEGIN
                  SET @fooCount = @MessageBody.value('count(/Root/foo)', 'int');
                  SET @fooCurrent = 1;

                  WHILE (@fooCurrent <= @fooCount)
                  BEGIN
                        --set variable values from xml message body
                        SELECT @foo1 = Root.foo.value('@foo1','int'),
                               @foo2 = Root.foo.value('@foo2','int')
                        FROM @MessageBody.nodes('(/Root/foo[sql:variable("@fooCurrent")])') AS Root(foo)

                        -- DO YOUR WORK HERE

                        -- increment the position

                        SET @fooCurrent = @fooCurrent + 1;
                  END
                  END CONVERSATION @dh;
            END
            COMMIT;
      END TRY
      BEGIN CATCH
                  -- If there are any exceptions then end conversation,
                  -- rollback the transaction and stop processing the queue
                  END CONVERSATION @dh;
                  ROLLBACK TRANSACTION;
                  INSERT INTO ServiceBrokerErrors (queuename, errornumber, errormessage, errordate)
                  VALUES ('fooQueue', ERROR_NUMBER(), ERROR_MESSAGE(), getdate())
                  ALTER QUEUE [schemaName].[fooQueue] WITH ACTIVATION ( STATUS = OFF );
                  BEGIN TRANSACTION
      END CATCH   -- Try to loop once more if there are more messages
      SELECT @dh = NULL;
      BEGIN TRANSACTION;
      WAITFOR (RECEIVE TOP (1) @dh = conversation_handle,
            @mt = message_type_name,
            @MessageBody = message_body
            FROM [schemaName].[fooQueue]), TIMEOUT 1000;
END
COMMIT;
GO

Note: The RECEIVE TOP (1) used in the above stored procedure is good for a single reader. If you need to use multiple readers, you can use the following to receive all messages for a conversation group:

DECLARE  @Messages TABLE (MessageBody XML, MessageTypeName SYSNAME, ddh UNIQUEIDENTIFIER)
WAITFOR (RECEIVE  message_body, message_type_name, conversation_handle
	FROM [schemaName].[fooQueue]
	INTO @Messages), TIMEOUT 1000

You can then loop through the messages in the table variable and end each conversation appropriately, as in the sketch below.
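
Here is a rough sketch of that loop (assuming the @Messages table variable declared above): walk the distinct conversation handles, do your work for each one, and then end the conversation.

DECLARE @ch UNIQUEIDENTIFIER;
DECLARE msgConversations CURSOR LOCAL FAST_FORWARD FOR
      SELECT DISTINCT ddh FROM @Messages;
OPEN msgConversations;
FETCH NEXT FROM msgConversations INTO @ch;
WHILE @@FETCH_STATUS = 0
BEGIN
      -- DO YOUR WORK HERE for all rows in @Messages with ddh = @ch

      END CONVERSATION @ch;
      FETCH NEXT FROM msgConversations INTO @ch;
END
CLOSE msgConversations;
DEALLOCATE msgConversations;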

STEP 5: Setting up the Service Broker Queue

Create the Message Type for the queue. This states what the queue will accept as a valid message. In this Message Type we are saying it must be an XML document and must conform to the schema fooMessageXmlSchema.

CREATE MESSAGE TYPE [fooMessage] AUTHORIZATION [dbo] VALIDATION = VALID_XML
WITH SCHEMA COLLECTION [schemaName].[fooMessageXmlSchema]
GO

The Contract just states who is authorized to add messages to the queue and references the Message Type.

CREATE CONTRACT [fooMessageContract]
AUTHORIZATION [dbo] ([fooMessage] SENT BY INITIATOR)
GO

Create the new service broker queue.
Note: When creating the service broker queue, the Activation section allows you to attach the receive stored procedure (e.g., fooReceive) that you have already created. This stored procedure will be called any time a new message is inserted into the queue.

CREATE QUEUE [schemaName].[fooQueue]
WITH STATUS = ON , RETENTION = OFF ,
ACTIVATION 
(
      STATUS = ON ,
      PROCEDURE_NAME = [schemaName].[fooReceive] ,
      MAX_QUEUE_READERS = 1 ,
      EXECUTE AS OWNER
)
ON [PRIMARY]
GO

Finally, create a new Service to run the new queue.

CREATE SERVICE [fooService]
AUTHORIZATION [dbo]  ON QUEUE [schemaName].[fooQueue] ([fooMessageContract])
GO
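
Finally, a quick way to smoke test the whole setup is to hand craft a message that conforms to the schema and push it through the send sproc. With activation on, it should be processed almost immediately, and anything still sitting on the queue can be inspected afterwards:

DECLARE @body XML (DOCUMENT fooMessageXmlSchema);
SET @body = N'<Root><foo foo1="1" foo2="2" /></Root>';
EXEC schemaName.fooSend @body;

SELECT cast(message_body as XML) FROM [schemaName].[fooQueue] with(nolock)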

Well that is all! I hope this helps!
