SSIS: Logging in pipeline components

Ethan from AMB Dataminers had a great question about how to do logging from within his pipeline component.


One way to do it is to call the Fire* methods on IDTSComponentMetaData90. However, that incurs the cost of firing an event and running any associated event handlers. That's appropriate at times, but at other times you just want to log something as quickly as you can. In addition, some situations call for your log entries to be separated from the entries that other components raise. For example, the OnInformation log event can be pretty chatty, and you might not want to force the user to enable it and wade through a plethora of other events they aren't interested in.
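For comparison, the event-based route looks roughly like this from inside the component class (a minimal sketch; the information code and message text are just placeholders):

// Raises the OnInformation event; the runtime routes it through any attached
// event handlers and the logging infrastructure, which is heavier than posting
// a plain log message.
bool fireAgain = true;
ComponentMetaData.FireInformation(0, ComponentMetaData.Name,
    "Opened the source connection", string.Empty, 0, ref fireAgain);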


So, here's another way to do logging in pipeline components. Thanks to Ted and Matt for their help with this investigation!


IDTSComponentMetaData90 also exposes another method (in addition to the Fire* ones) called PostLogMessage.


Here are the steps for this route:

1. Register your log entry with the runtime environment so that it's available as a log entry in the logging dialog for the pipeline task.
2. Add your log entry!


As an example of doing this, let's add a log entry that will describe the SQL statement used by the ADOSource sample component that gets installed by Setup.


Registering log entries

In the component, I define a couple of consts:

private const string MyLogEntryName = "AdoNet Source Log Entry";
private const string MyLogEntryDescription = "Log entry from the AdoNet Custom Source";

and then override the 'RegisterLogEntries' method like this:

this.LogEntryInfos.Add(MyLogEntryName, MyLogEntryDescription,
    Microsoft.SqlServer.Dts.Runtime.Wrapper.DTSLogEntryFrequency.DTSLEF_CONSISTENT);

DTSLogEntryFrequency provides a hint to the runtime about how frequently the entry will be logged:

DTSLEF_OCCASIONAL: only sometimes, not during every execution
DTSLEF_CONSISTENT: a constant number of times during every execution
DTSLEF_PROPORTIONAL: proportional to the amount of work completed

The call above, then, creates a log entry with the name and description we want. I'll be logging once per execution, so I'm using DTSLEF_CONSISTENT.
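Putting it together, the whole override ends up looking something like this (a sketch, assuming the standard parameterless RegisterLogEntries override on PipelineComponent):

public override void RegisterLogEntries()
{
    // Make the custom log entry known to the runtime so that it shows up in the
    // Logging dialog; the frequency value is only a hint about how often it fires.
    this.LogEntryInfos.Add(MyLogEntryName, MyLogEntryDescription,
        Microsoft.SqlServer.Dts.Runtime.Wrapper.DTSLogEntryFrequency.DTSLEF_CONSISTENT);
}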

This magically means that when I add the component to the pipeline task, the 'Logging' dialog in the designer shows a new log entry, "AdoNet Source Log Entry", now available for that task. Cool!

Logging

Now it's time to create the log entry. In this case, I just want to log the SQL statement that'll be used, and the existing code already has that statement available in PreExecute. So, the following lines go at the end of that function:

DateTime now = DateTime.Now;
byte[] additionalData = null;
ComponentMetaData.PostLogMessage(MyLogEntryName, ComponentMetaData.Name, "Command Sent was: " + oledbCommand.CommandText, now, now, 0, ref additionalData);
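In context, this call sits at the end of the sample's PreExecute override, roughly like so (a sketch; the surrounding code is abbreviated, and oledbCommand is the command object the existing sample code already builds):

public override void PreExecute()
{
    // ... existing sample code that creates oledbCommand and sets its CommandText ...

    // Post the log entry registered earlier. The start and end times are the same
    // because this is a point-in-time message, and no extra byte[] payload is attached.
    DateTime now = DateTime.Now;
    byte[] additionalData = null;
    ComponentMetaData.PostLogMessage(MyLogEntryName, ComponentMetaData.Name,
        "Command Sent was: " + oledbCommand.CommandText, now, now, 0, ref additionalData);
}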

Testing

After registering the component, I configure the statement in the source to be 'select * from INFORMATION_SCHEMA.TABLES', ask this log event to go to a file, and get the following:

User:AdoNet Source Log Entry,ASHVINIS-MOBILE,REDMOND\ashvinis,Ado Source CS Sample,{0000004D-0000-0000-0000-000000000000},{9FD94716-EEB8-48A8-A396-79E3CB705A0A},7/13/2005 5:15:38 PM,7/13/2005 5:15:38 PM,0,(null),Command Sent was: select * from INFORMATION_SCHEMA.TABLES

Hope this helps. Thanks again, Ethan, for asking such an interesting question!

regards,
ash