A while back, I talked about the Locks and Threads performance counter data collected by version 2 of the .NET Compact Framework. That post mentioned that the counters collected data for ThreadPool threads.
What is a ThreadPool?
Some readers may wonder, “What is a thread pool?” A thread pool is a collection of threads on which work items can be scheduled.
The threads in the thread pool are typically managed by a class library (or the operating system) rather than by the application. Having a pool of threads that is managed by the system is very handy in that it makes multi-threaded applications a bit simpler to write. I say “a bit simpler” because, while an application need not manage thread creation or manually limit the number of simultaneous threads, it must still ensure that thread-sensitive operations (updating the user interface, for example) are handled properly.
It is also important to note that, as with application-managed threads, an application should not depend upon the execution order of its work items when using a thread pool. Threads are not guaranteed to complete in the order in which their work items were added to the ThreadPool queue.
Also, please keep in mind that thread pool threads are aborted when the application exits (as mentioned in the following note from the MSDN ThreadPool documentation). On the .NET Compact Framework, this information applies to version 2.
The threads in the managed thread pool are background threads. That is, their IsBackground properties are true. This means that a ThreadPool thread will not keep an application running after all foreground threads have exited.
This is a very important point that cannot be stressed strongly enough. Take care to ensure that critical operations whose completion is required (saving a user’s data, for example) are allowed to complete. This can be done in a number of ways, including:
- creating a foreground thread
- blocking the main thread from exiting until the critical operation is complete
- executing the critical operation on the main thread at application exit time
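As a sketch of the second option, the main thread can wait on an event that the work item signals when it finishes. The SaveUserData method and the use of a ManualResetEvent here are illustrative, not part of the original example:

```csharp
using System;
using System.Threading;

class Program
{
    // signaled by the work item when the critical operation completes
    static ManualResetEvent saveComplete = new ManualResetEvent(false);

    // hypothetical critical operation run on a ThreadPool thread
    static void SaveUserData(Object state)
    {
        // ... write the user's data to storage ...
        saveComplete.Set();
    }

    public static void Main()
    {
        ThreadPool.QueueUserWorkItem(new WaitCallback(SaveUserData));

        // block the main (foreground) thread until the save completes,
        // so the background ThreadPool thread is not aborted at exit
        saveComplete.WaitOne();
    }
}
```

Because the main thread is a foreground thread, holding it in WaitOne keeps the process alive until the background work item signals completion.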
Using the ThreadPool
Using the ThreadPool is a straightforward task. You write a method to perform the desired work and queue that method into the thread pool. The snippet below shows a simple example.
/// <summary>Main application entry point</summary>
public static void Main()
{
    // — application code goes here —

    // queue a work item into the ThreadPool
    String workerData = "David";
    ThreadPool.QueueUserWorkItem(new WaitCallback(WorkerTask), workerData);

    // — application code goes here —
}

/// <summary>Method to be queued to the ThreadPool</summary>
/// <param name="state">Data to be processed by the method. In this method, a username.</param>
private static void WorkerTask(Object state)
{
    // the state that we are passed is a string
    String workerData = state as String;

    // ensure that we were passed data of the proper type
    Debug.Assert(workerData != null, "Invalid data");

    // simulate some processing
    Thread.Sleep(1000);

    // — perform any desired application notification —
}
In the above example, the Main method queues an instance of the work item implemented by the WorkerTask method and resumes processing. WorkerTask casts the state object to a String (its expected type) and performs its processing (simulated by the Sleep statement). Work items can optionally notify the application upon completion. I often call a registered event handler to allow for user interface updates when the task completes.
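One way that completion-notification pattern might look is sketched below; the TaskCompleted event and the WorkerForm class are illustrative names, not part of the original example. Note the use of Control.Invoke to marshal the notification back to the UI thread, since a ThreadPool thread must not touch the user interface directly:

```csharp
using System;
using System.Threading;
using System.Windows.Forms;

public class WorkerForm : Form
{
    // hypothetical event raised when the work item finishes
    public event EventHandler TaskCompleted;

    private void WorkerTask(Object state)
    {
        // ... perform the work ...

        // marshal the notification back to the UI thread so the
        // handler can safely update controls
        if (TaskCompleted != null)
        {
            this.Invoke(new EventHandler(TaskCompleted));
        }
    }
}
```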
You may notice the Debug.Assert statement in the WorkerTask method. I placed this here to underscore the fact that the application can send an object of any type as the state argument. Using the as operator to cast state means that workerData will be null if the state object is not of the specified type. By asserting that workerData is non-null, you will be notified of any incorrect use of the method during development and testing, provided a debug build is used.
The size of the thread pool can be queried (GetMaxThreads) and set (SetMaxThreads). When setting the maximum number of thread pool threads, be aware that your changes affect the .NET Compact Framework runtime as well as your application. Setting the maximum number of threads too high or too low can cause application performance to suffer.
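A minimal sketch of querying and adjusting the pool size follows; the value passed to SetMaxThreads is purely illustrative, and you should measure before tuning:

```csharp
using System;
using System.Threading;

class PoolConfig
{
    public static void Main()
    {
        int workerThreads;
        int completionPortThreads;

        // query the current maximum number of pool threads
        ThreadPool.GetMaxThreads(out workerThreads, out completionPortThreads);
        Console.WriteLine("Max worker threads: " + workerThreads);

        // raise the worker-thread limit; 25 is an illustrative value,
        // and the change affects the runtime as well as the application
        bool succeeded = ThreadPool.SetMaxThreads(25, completionPortThreads);
        Console.WriteLine("SetMaxThreads succeeded: " + succeeded);
    }
}
```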
This posting is provided “AS IS” with no warranties, and confers no rights.