Real Time Data Integration with Service Broker and Other SQL Techniques

This article discusses how to use various SQL Server technologies to accomplish real time data integration between SQL Server instances. It provides sample code to help users with their own development. The document focuses on the usage of each technology that is incorporated into the data integration service; please refer to the provided links for detailed information about the technologies.

Real time data integration definition

Real time data integration supports event-driven data movement and transformation between SQL Server instances that host databases with different schemas. The data integration should be transparent to source systems, capturing and delivering events without significantly impacting those systems. The technique also supports an intermediate format that decouples the schemas of the source and destination systems, so either system can change its schema without breaking the application in the other system. The data integration provides fast and efficient data delivery to a destination in an event-driven model, without polling the source system for new data.


Sales data integration

The real time data integration demo shows sales data integration between two databases, AdventureWorks (AW) and AdventureWorksDW (AWDW). The data integration service captures sales data changes in AW and transforms the data from the AW schema into a general XML format. The service sends the XML data to AWDW, where it is transformed to conform to the AWDW schema.

The demo uses the SQL Server sample databases. Please refer to the following link for detailed information about the databases [https://msdn.microsoft.com/en-us/library/ms124659.aspx]. Users can download and install the databases for SQL Server 2008 from the following link [https://technet.microsoft.com/en-us/library/ms124501(SQL.100).aspx].


Techniques

Change tracking

Change tracking provides a mechanism to query for changes to data and to access information related to the changes. It answers the following questions: Which rows have changed for a user table? What is the latest data in those rows? Change Tracking requires only a small amount of storage for each changed row, but it only provides the latest data. Please refer to the following link for detailed information about Change Tracking [https://msdn.microsoft.com/en-us/library/bb933874(SQL.100).aspx].

If an application requires information about all the changes and the intermediate values of the changed data, it should use Change Data Capture (CDC) instead. Please refer to the following document for a comparison of the two techniques [https://msdn.microsoft.com/en-us/library/cc280519(SQL.100).aspx]. We plan to write another document that shows how to use CDC as a change tracking option. The following code block shows how to enable Change Tracking at the database and table levels.

ALTER DATABASE AdventureWorks
SET CHANGE_TRACKING = ON
(CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE [AdventureWorks].[Sales].[SalesOrderHeader]
ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = ON);

ALTER TABLE [AdventureWorks].[Sales].[SalesOrderDetail]
ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = ON);
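Once Change Tracking is enabled, an application needs a baseline synchronization version so that it can ask only for changes made after that point. The demo keeps this baseline in a LastVersion table, which the SendChanges procedure shown later reads and updates. The original initialization script is not reproduced in this document; the following block is a minimal sketch, assuming a single-column LastVersion table, that initializes the baseline with CHANGE_TRACKING_CURRENT_VERSION() and answers the two questions posed above for the SalesOrderHeader table.

--Illustrative sketch (not the original demo script): keep a baseline version
--and query the rows changed since that version.
CREATE TABLE LastVersion (lastVersion bigint);

--Initialize the baseline with the current change tracking version
INSERT INTO LastVersion (lastVersion)
SELECT CHANGE_TRACKING_CURRENT_VERSION();

--Which rows changed, and what is their latest data?
DECLARE @last_sync_version bigint;
SELECT TOP (1) @last_sync_version = lastVersion FROM LastVersion;

SELECT ct.SalesOrderID, ct.SYS_CHANGE_OPERATION, soh.*
FROM CHANGETABLE (CHANGES [AdventureWorks].[Sales].[SalesOrderHeader],
                  @last_sync_version) AS ct
--LEFT JOIN keeps deleted rows, which no longer exist in the base table
LEFT JOIN [AdventureWorks].[Sales].[SalesOrderHeader] soh
    ON soh.SalesOrderID = ct.SalesOrderID;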

Changed data in XML

After change tracking is enabled on a database and its tables, the change tables are populated with change information whenever data is inserted, deleted, or updated in the tracked tables. The data integration service uses the following code block to fetch the change information and produce an XML document describing the data change. Using the CHANGETABLE function it retrieves change tracking information for the tables 'SalesOrderHeader' and 'SalesOrderDetail'. The code generates an XML document containing this information using the FOR XML PATH mode. In the XML document the top-level root element is named 'Sales', and each sales order header corresponds to an element named 'SalesOrderHeader'. A 'SalesOrderHeader' element contains one or more 'SalesOrderDetail' elements that describe the change information for the 'SalesOrderDetail' table. The INNER JOIN clauses match the change tracking rows to the corresponding rows in the base tables.

SET @changeReportXML =
(
    SELECT SYS_CHANGE_OPERATION, c_soh.SalesOrderID,
    (
        SELECT SYS_CHANGE_OPERATION, c_sod.SalesOrderID,
               c_sod.SalesOrderDetailID
        FROM CHANGETABLE (CHANGES [AdventureWorks].[Sales].[SalesOrderDetail],
                          @last_sync_version) AS c_sod
        INNER JOIN [AdventureWorks].[Sales].[SalesOrderDetail] sod
            ON sod.SalesOrderDetailID = c_sod.SalesOrderDetailID
        WHERE c_soh.SalesOrderID = c_sod.SalesOrderID
        FOR XML PATH ('SalesOrderDetail'), TYPE, ELEMENTS XSINIL
    )
    FROM CHANGETABLE (CHANGES [AdventureWorks].[Sales].[SalesOrderHeader],
                      @last_sync_version) AS c_soh
    INNER JOIN [AdventureWorks].[Sales].[SalesOrderHeader] soh
        ON soh.SalesOrderID = c_soh.SalesOrderID
    WHERE @salesOrderID = c_soh.SalesOrderID
    FOR XML PATH ('SalesOrderHeader'), ROOT ('Sales'), ELEMENTS XSINIL
);
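For reference, the generated document has roughly the following shape. This is an illustrative sketch only: the xsi namespace declarations added by ELEMENTS XSINIL are omitted, and the values are made up.

<Sales>
  <SalesOrderHeader>
    <SYS_CHANGE_OPERATION>I</SYS_CHANGE_OPERATION>
    <SalesOrderID>43659</SalesOrderID>
    <SalesOrderDetail>
      <SYS_CHANGE_OPERATION>I</SYS_CHANGE_OPERATION>
      <SalesOrderID>43659</SalesOrderID>
      <SalesOrderDetailID>1</SalesOrderDetailID>
    </SalesOrderDetail>
  </SalesOrderHeader>
</Sales>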

Change notification

SQL Server provides several mechanisms for notifying an application of data changes, for example triggers [https://msdn.microsoft.com/en-us/library/ms189599.aspx] and Query Notifications (QN) [https://msdn.microsoft.com/en-us/library/ms130764.aspx]. A trigger provides a simple way to raise the notification, but it only supports a synchronous mechanism. QN supports asynchronous notification and rich filtering semantics; however, QN cannot be configured in TSQL within SQL Server. In the real time data integration demo we use a technique that integrates Service Broker with a trigger. It provides a simple way to support event notification with asynchronous semantics in TSQL within SQL Server. The following code blocks show the event notification in the demo.

CREATE TABLE ConversationHandle
(
    conversationHandle uniqueidentifier
);

--Create a dialog to send all the transactions on
BEGIN TRANSACTION;

DECLARE @conversationHandle uniqueidentifier;

--Create a new conversation on the table
BEGIN DIALOG @conversationHandle
    FROM SERVICE AsynchTriggerInitiatorService
    TO SERVICE N'AsynchTriggerTargetService'
    ON CONTRACT [AsynchTriggerContract]
    WITH ENCRYPTION = OFF;

INSERT ConversationHandle (conversationHandle)
VALUES (@conversationHandle);

COMMIT;

-- TRIGGER for initiating the change tracking demo
CREATE TRIGGER ChangeTrackingTrigger
ON [AdventureWorks].[Sales].[SalesOrderHeader]
AFTER INSERT, DELETE, UPDATE
AS
BEGIN TRANSACTION;

DECLARE @conversationHandle uniqueidentifier;

SELECT TOP (1) @conversationHandle = conversationHandle
FROM ConversationHandle;

SEND ON CONVERSATION @conversationHandle
    MESSAGE TYPE [AsynchTriggerMessageType];

COMMIT;
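The blocks above assume that the Service Broker objects for the asynchronous trigger (message type, contract, queues, and services) already exist. The full setup script is not listed in this document; the following is a minimal sketch in which the queue names and the VALIDATION setting are assumptions, while the message type, contract, and service names come from the code above.

--Illustrative setup sketch (not the original demo script)
CREATE MESSAGE TYPE [AsynchTriggerMessageType]
    VALIDATION = NONE;          --assumption; the message carries no body in the trigger

CREATE CONTRACT [AsynchTriggerContract]
    ([AsynchTriggerMessageType] SENT BY INITIATOR);

CREATE QUEUE AsynchTriggerInitiatorQueue;   --queue names are assumptions
CREATE QUEUE AsynchTriggerTargetQueue;

CREATE SERVICE AsynchTriggerInitiatorService
    ON QUEUE AsynchTriggerInitiatorQueue ([AsynchTriggerContract]);

CREATE SERVICE AsynchTriggerTargetService
    ON QUEUE AsynchTriggerTargetQueue ([AsynchTriggerContract]);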

Reliable data movement

Service Broker provides asynchronous and reliable data movement. It supports a TSQL programming model built into the SQL Server database engine. Please refer to the following link for detailed information about Service Broker [https://technet.microsoft.com/en-us/sqlserver/bb671396.aspx].

The data integration service uses multiple conversations for message delivery to increase throughput. Using multiple dialogs enables data parallelism on the receiving side: multiple threads can receive and process the messages in the dialogs independently. However, initiating conversations puts load on a system, so the number of conversations should be chosen carefully. In the real time data integration demo we choose four conversations to process the messages with high throughput. The service initiates four dialogs and stores them in a table, and the demo uses these dialogs to send messages about changed data. Please refer to the following code block for the dialog creation. The subsequent code block presents the SendChanges procedure, which sends an XML message using Service Broker. The procedure distributes messages evenly across the four conversations based on the sales order ID (SET @dialogHandleID = @salesOrderID % 4). Because all messages for the same sales order ID are delivered on a single conversation, they are guaranteed to be delivered exactly once and in order.
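The full dialog-creation script is not listed in this document. The block below is a minimal sketch of the idea: it initiates four dialogs and stores them in the DialogHandles table that SendChanges reads. The initiator service name follows the demo's naming, while the target service name (RealTime_DI_Target_Service) and contract name (RealTimeDIContract) are assumptions.

--Illustrative sketch: create four dialogs and store their handles keyed by ID 0..3.
--The target service and contract names are assumptions.
CREATE TABLE DialogHandles (ID INT PRIMARY KEY, dialogHandle uniqueidentifier);

DECLARE @i INT;
DECLARE @handle uniqueidentifier;
SET @i = 0;

WHILE (@i < 4)
BEGIN
    BEGIN DIALOG @handle
        FROM SERVICE [RealTime_DI_Initiator_Service]
        TO SERVICE N'RealTime_DI_Target_Service'
        ON CONTRACT [RealTimeDIContract]
        WITH ENCRYPTION = OFF;

    INSERT INTO DialogHandles (ID, dialogHandle)
    VALUES (@i, @handle);

    SET @i = @i + 1;
END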

CREATE PROCEDURE SendChanges
AS
BEGIN
    DECLARE @last_sync_version bigint;
    DECLARE @salesOrderID bigint;
    DECLARE @dialogHandleID INT;
    DECLARE @dialogHandle uniqueidentifier;
    DECLARE @changeReportXML XML;
    DECLARE @next_baseline bigint;
    DECLARE @TotalDialogs INT;
    DECLARE @logMsg VARCHAR(MAX);

    BEGIN TRANSACTION;

    SELECT TOP (1) @last_sync_version = lastVersion
    FROM LastVersion;

    SET @TotalDialogs = 4;

    --Create a cursor on the change table for [SalesOrderHeader]
    DECLARE cursorChangeOrderHeader CURSOR FORWARD_ONLY READ_ONLY
    FOR SELECT SalesOrderID
        FROM CHANGETABLE (CHANGES [SalesOrderHeader], @last_sync_version) AS Cursor_CH
        ORDER BY SYS_CHANGE_VERSION;

    --Open the cursor on the change table for [SalesOrderHeader]
    --Loop for each changed sales order id
    OPEN cursorChangeOrderHeader;

    WHILE (1 = 1)
    BEGIN
        FETCH NEXT FROM cursorChangeOrderHeader INTO @salesOrderID;

        --If there is no more changed sales order then exit
        IF (@@FETCH_STATUS != 0) BREAK;

        --<Fetching changed data and creating XML document.>
        --<Please refer to the code block in the 'Changed data in XML' section.>

        --Find the conversation handle for the sales order
        --from the dialog handle table
        SET @dialogHandleID = @salesOrderID % @TotalDialogs;

        SELECT @dialogHandle = dialogHandle
        FROM DialogHandles
        WHERE ID = @dialogHandleID;

        --Capture last version info
        SELECT @next_baseline = SYS_CHANGE_VERSION
        FROM CHANGETABLE (CHANGES [AdventureWorks].[Sales].[SalesOrderHeader],
                          @last_sync_version) AS c_soh
        WHERE @salesOrderID = SalesOrderID;

        --Send the message using Broker
        SEND ON CONVERSATION @dialogHandle
            MESSAGE TYPE [RealTimeDImessagetype] (@changeReportXML);
    END

    CLOSE cursorChangeOrderHeader;
    DEALLOCATE cursorChangeOrderHeader;

    UPDATE LastVersion SET lastVersion = @next_baseline;

    COMMIT;
END;

Activation

Activation allows message processing logic to be launched when a message arrives on a Service Broker queue. When internal activation is used to process messages, a stored procedure is declared on a Service Broker queue and is invoked on a background thread when a message arrives. A user can also specify an executable, called an external activator, to process the messages. For example, SQL Server Integration Services (SSIS) can be used as an external activation process to handle messages. Please refer to the following link for the code sample and documentation of the External Activator [https://www.codeplex.com/SQLSrvSrvcBrkr/Release/ProjectReleases.aspx?ReleaseId=3853].

In the real time data integration demo we use internal activation to process event notification messages on the initiator service as well as changed data messages on the target service. We briefly describe below how the services process the messages in their activation procedures.

  • Message processing in the initiator

The real time data integration initiator handles messages from two different services: the asynchronous event notification service and the real time data integration target service. A single activation procedure in the initiator handles messages from the two sources based on message types and service names. The following pseudo-code block describes the message processing logic in the initiator; a TSQL sketch of the procedure follows the pseudo-code.


WHILE there is any message on 'RealTime_DI_Initiator_queue'
    RECEIVE a message FROM the queue
    IF message type is 'EndDialog' THEN
        END CONVERSATION
    ELSE IF message type is 'ERROR' THEN
        IF service name is 'RealTime_DI_Initiator_Service' THEN
            Raise error;
            Create a new dialog;
            Resend pending messages using the dialog;
            Replace old dialog with the new one;
            END CONVERSATION (old dialog);
        IF service name is 'Asynchronous_Trigger_Target_Service' THEN
            Raise error;
            END CONVERSATION;
    ELSE IF message type is 'Asynchronous triggering' THEN
        RECEIVE WHERE conversation_handle is identical with this message's handle
        EXEC SendChanges PROCEDURE
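The original activation procedure for the initiator queue is not listed in this document. The following block is a minimal TSQL sketch of the dispatch logic above; the procedure name is an assumption, the asynchronous trigger message type is the [AsynchTriggerMessageType] created earlier, and the error-recovery steps are reduced to comments.

--Illustrative sketch of the initiator activation procedure (not the original code)
CREATE PROCEDURE ProcessMessagesInitiator
AS
BEGIN
    DECLARE @handle uniqueidentifier;
    DECLARE @messageTypeName sysname;
    DECLARE @serviceName sysname;

    WHILE (1 = 1)
    BEGIN
        BEGIN TRANSACTION;

        WAITFOR(RECEIVE TOP (1)
                    @handle = conversation_handle,
                    @messageTypeName = message_type_name,
                    @serviceName = service_name
                FROM [RealTime_DI_Initiator_queue]), TIMEOUT 1000;

        IF (@@ROWCOUNT = 0)
        BEGIN
            COMMIT;
            BREAK;
        END

        IF (@messageTypeName = N'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog')
        BEGIN
            END CONVERSATION @handle;
        END
        ELSE IF (@messageTypeName = N'http://schemas.microsoft.com/SQL/ServiceBroker/Error')
        BEGIN
            --Raise the error. For the data integration dialogs, create a replacement
            --dialog, resend pending messages on it, and retire the broken dialog.
            END CONVERSATION @handle;
        END
        ELSE IF (@messageTypeName = N'AsynchTriggerMessageType')
        BEGIN
            --Drain any further trigger notifications queued on the same conversation,
            --then run the change-sending procedure once.
            EXEC SendChanges;
        END

        COMMIT;
    END
END;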

  • Message processing in the target

The message processing procedure in the real time data integration target receives messages from the target queue and transforms them from the XML format into the supported schema. A simple and straightforward approach is to receive one message from the queue and transform it, repeating until all the messages on the queue are processed. However, that mechanism may hurt the performance of the data integration target. Instead of receiving and transforming a single message at a time, the target service uses a cursor-based processing mechanism. It receives all available messages from the target queue and stores them in a table variable. A cursor then iterates over the table to fetch each message and convert it from the XML format to the desired schema. The following code block shows the activation procedure on the target.

CREATE PROCEDURE ProcessMessagesDW
AS
BEGIN
    DECLARE @handle uniqueidentifier;
    DECLARE @messageBody XML;
    DECLARE @tableMessages TABLE(
        queuing_order        BIGINT,
        conversation_handle  UNIQUEIDENTIFIER,
        message_body         VARBINARY(MAX));

    DECLARE cursorMessages CURSOR FORWARD_ONLY READ_ONLY
    FOR SELECT conversation_handle, message_body
        FROM @tableMessages
        ORDER BY queuing_order;

    WHILE (1 = 1)
    BEGIN
        BEGIN TRANSACTION;

        WAITFOR(RECEIVE queuing_order,
                        conversation_handle,
                        message_body
                FROM [RealTimeDItargetqueue]
                INTO @tableMessages), TIMEOUT 1000;

        IF (@@ROWCOUNT = 0)
        BEGIN
            COMMIT;
            BREAK;
        END

        OPEN cursorMessages;

        WHILE (1 = 1)
        BEGIN
            FETCH NEXT FROM cursorMessages
            INTO @handle, @messageBody;

            IF (@@FETCH_STATUS != 0) BREAK;

            -- <Message transformation>
        END

        CLOSE cursorMessages;
        DELETE FROM @tableMessages;
        COMMIT;
    END

    DEALLOCATE cursorMessages;
END
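The procedure is attached to the target queue as its internal activation procedure so that it is launched automatically when messages arrive. The queue definition itself is not reproduced in this document; the following is a minimal sketch, in which the MAX_QUEUE_READERS value is an assumption.

--Illustrative sketch: attach ProcessMessagesDW as the internal activation
--procedure of the target queue.
ALTER QUEUE [RealTimeDItargetqueue]
WITH ACTIVATION (
    STATUS = ON,
    PROCEDURE_NAME = ProcessMessagesDW,
    MAX_QUEUE_READERS = 4,       --assumption; tune to the desired parallelism
    EXECUTE AS OWNER);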

Data transformation

After receiving messages, the target service transforms them and populates tables with the changed data extracted from the messages. The received messages are in XML format. The service processes each message to obtain the required information using TSQL coupled with its integrated XML support. The following code block shows a sample of the transformation using TSQL. In this example, the transformation occurs only for data insert events.

INSERT INTO [AdventureWorksDW].[dbo].[FactInternetSales]
SELECT
     N1.SOH.value('CustomerID[1]', 'int')                      AS [CustomerKey]
    ,N2.SOD.value('SpecialOfferID[1]', 'int')                  AS [PromotionKey]
    ,N2.SOD.value('CarrierTrackingNumber[1]', 'NCHAR(9)')      AS CarrierTrackingNumber
    ,N1.SOH.value('PurchaseOrderNumber[1]', 'NVARCHAR(25)')    AS [CustomerPONumber]
FROM @messageBody.nodes('/Sales/SalesOrderHeader') N1(SOH)
CROSS APPLY N1.SOH.nodes('SalesOrderDetail') N2(SOD)
WHERE N1.SOH.value('CustomerType[1]', 'CHAR') = 'I'
  AND N2.SOD.value('SYS_CHANGE_OPERATION[1]', 'CHAR') = 'I';

SQL Server Integration Services (SSIS) also provides data transformation capabilities. SSIS supports various forms of data transformation between heterogeneous sources. Please refer to the following link for more detailed information about SSIS [https://technet.microsoft.com/en-us/sqlserver/bb671392.aspx].

Summary

This document has discussed real-time data integration built on the coordination of a set of SQL Server technologies. The resulting service provides reliable and transparent data integration between instances and is composed of the following technologies.

Data tracking: Change tracking

Change notification: Triggers and Service Broker

Reliable data movement: Service Broker

Activation: Internal, Blocking with WAITFOR RECEIVE

Transformation: TSQL with XML support

The complete code listing for the demo can be found at the following link [https://www.codeplex.com/SQLSrvSrvcBrkr/Release/ProjectReleases.aspx?ReleaseId=15139].