The Managed Thread Pool

The ThreadPool class provides your application with a pool of worker threads that are managed by the system, allowing you to concentrate on application tasks rather than thread management. If you have short tasks that require background processing, the managed thread pool is an easy way to take advantage of multiple threads. For example, beginning with the .NET Framework 4 you can create Task and Task<TResult> objects, which perform asynchronous tasks on thread pool threads.

Note

Starting with the .NET Framework 2.0 Service Pack 1, the throughput of the thread pool is significantly improved in three key areas that were identified as bottlenecks in previous releases of the .NET Framework: queuing tasks, dispatching thread pool threads, and dispatching I/O completion threads. To use this functionality, your application should target the .NET Framework 3.5 or later.

For background tasks that interact with the user interface, the .NET Framework version 2.0 also provides the BackgroundWorker class, which communicates using events raised on the user interface thread.
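
The following is a minimal sketch of the BackgroundWorker pattern (not part of the original samples). In a Windows Forms or WPF application, the RunWorkerCompleted event is raised on the UI thread; in this console sketch there is no UI synchronization context, so the event handler runs on a thread pool thread instead.

// A minimal sketch (not part of the original samples) of the
// BackgroundWorker pattern. DoWork always runs on a thread pool thread;
// in a UI application, RunWorkerCompleted is raised on the UI thread.
using System;
using System.ComponentModel;
using System.Threading;

class BackgroundWorkerSketch
{
    static void Main()
    {
        BackgroundWorker worker = new BackgroundWorker();

        // DoWork executes on a thread pool thread.
        worker.DoWork += (sender, e) =>
        {
            Thread.Sleep(500);   // Simulate a short background task.
            e.Result = 42;       // Pass a result to RunWorkerCompleted.
        };

        // RunWorkerCompleted reports the outcome of the background task.
        worker.RunWorkerCompleted += (sender, e) =>
        {
            Console.WriteLine("Background task finished; result = {0}", e.Result);
        };

        worker.RunWorkerAsync();

        // Keep the console process alive long enough for the events to fire.
        Thread.Sleep(1000);
    }
}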

The .NET Framework uses thread pool threads for many purposes, including asynchronous I/O completion, timer callbacks, registered wait operations, asynchronous method calls using delegates, and System.Net socket connections.
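
For instance, the callback of a System.Threading.Timer runs on a thread pool thread. The following minimal sketch (not part of the original samples) uses the Thread.CurrentThread.IsThreadPoolThread property to confirm this.

// A brief sketch (not part of the original samples) showing that a
// System.Threading.Timer callback executes on a thread pool thread.
using System;
using System.Threading;

class TimerCallbackSketch
{
    static void Main()
    {
        // Fire the callback once, 100 milliseconds from now.
        using (Timer timer = new Timer(
            state => Console.WriteLine(
                "Timer callback on thread pool thread: {0}",
                Thread.CurrentThread.IsThreadPoolThread),
            null, 100, Timeout.Infinite))
        {
            // Give the callback time to run before the timer is disposed.
            Thread.Sleep(500);
        }
    }
}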

There are several scenarios in which it is appropriate to create and manage your own threads instead of using thread pool threads (a brief sketch follows this list):

  • You require a foreground thread.

  • You require a thread to have a particular priority.

  • You have tasks that cause the thread to block for long periods of time. The thread pool has a maximum number of threads, so a large number of blocked thread pool threads might prevent tasks from starting.

  • You need to place threads into a single-threaded apartment. All ThreadPool threads are in the multithreaded apartment.

  • You need to have a stable identity associated with the thread, or to dedicate a thread to a task.
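
The following sketch (an illustration, not part of the original samples) shows how such a dedicated thread might be created: a foreground thread with an explicit priority that is placed in a single-threaded apartment and waited on directly.

// A minimal sketch (not part of the original samples) of creating and
// managing your own thread instead of using the thread pool.
using System;
using System.Threading;

class DedicatedThreadSketch
{
    static void Main()
    {
        Thread worker = new Thread(() =>
        {
            Console.WriteLine("Dedicated thread is running.");
        });

        worker.IsBackground = false;                   // Foreground thread (the default).
        worker.Priority = ThreadPriority.AboveNormal;  // A specific priority.
        worker.SetApartmentState(ApartmentState.STA);  // Single-threaded apartment.

        worker.Start();
        worker.Join();   // Unlike thread pool threads, you can wait for this thread directly.
    }
}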

Thread pool threads are background threads (see Foreground and Background Threads). Each thread pool thread uses the default stack size, runs at the default priority, and is in the multithreaded apartment.

There is only one thread pool per process.

Unhandled exceptions on thread pool threads terminate the process. There are three exceptions to this rule:

  • A ThreadAbortException is thrown in a thread pool thread, because Abort was called.

  • An AppDomainUnloadedException is thrown in a thread pool thread, because the application domain is being unloaded.

  • The common language runtime or a host process terminates the thread.

For more information, see Exceptions in Managed Threads.
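
If you want a queued work item to survive its own failures, catch exceptions inside the work item so that they never propagate as unhandled exceptions on the thread pool thread. The following is a minimal sketch of that approach (an illustration, not part of the original samples).

// A minimal sketch (not part of the original samples): catching
// exceptions inside the queued work item so that they never become
// unhandled exceptions on the thread pool thread.
using System;
using System.Threading;

class SafeWorkItemSketch
{
    static void Main()
    {
        ThreadPool.QueueUserWorkItem(state =>
        {
            try
            {
                throw new InvalidOperationException("Something went wrong in the work item.");
            }
            catch (Exception ex)
            {
                // Handle or log the exception here; the process keeps running.
                Console.WriteLine("Work item caught: {0}", ex.Message);
            }
        });

        Thread.Sleep(1000);   // Give the work item time to run.
    }
}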

Note

In the .NET Framework versions 1.0 and 1.1, the common language runtime silently traps unhandled exceptions in thread pool threads. This might corrupt application state and eventually cause applications to hang, which might be very difficult to debug.

The number of operations that can be queued to the thread pool is limited only by available memory; however, the thread pool limits the number of threads that can be active in the process simultaneously. Beginning with the .NET Framework 4, the default size of the thread pool for a process depends on several factors, such as the size of the virtual address space. A process can call the GetMaxThreads method to determine the number of threads.

You can control the maximum number of threads by using the GetMaxThreads and SetMaxThreads methods.
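
The following sketch (not part of the original samples) queries the current limits; the specific numbers printed depend on the machine and the .NET Framework version.

// A brief sketch (not part of the original samples) that reads the
// thread pool limits for the current process.
using System;
using System.Threading;

class ThreadPoolLimitsSketch
{
    static void Main()
    {
        int maxWorkerThreads, maxCompletionPortThreads;
        ThreadPool.GetMaxThreads(out maxWorkerThreads, out maxCompletionPortThreads);
        Console.WriteLine("Maximum worker threads: {0}", maxWorkerThreads);
        Console.WriteLine("Maximum I/O completion threads: {0}", maxCompletionPortThreads);

        int availableWorkerThreads, availableCompletionPortThreads;
        ThreadPool.GetAvailableThreads(out availableWorkerThreads, out availableCompletionPortThreads);
        Console.WriteLine("Available worker threads: {0}", availableWorkerThreads);
        Console.WriteLine("Available I/O completion threads: {0}", availableCompletionPortThreads);
    }
}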

Note

In the .NET Framework versions 1.0 and 1.1, the size of the thread pool cannot be set from managed code. Code that hosts the common language runtime can set the size using CorSetMaxThreads, defined in mscoree.h.

The thread pool provides new worker threads or I/O completion threads on demand until it reaches a specified minimum for each category. You can use the GetMinThreads method to obtain these minimum values.

Note

When demand is low, the actual number of thread pool threads can fall below the minimum values.

When a minimum is reached, the thread pool can create additional threads or wait until some tasks complete. Beginning with the .NET Framework 4, the thread pool creates and destroys worker threads in order to optimize throughput, which is defined as the number of tasks that complete per unit of time. Too few threads might not make optimal use of available resources, whereas too many threads could increase resource contention.

Caution

You can use the SetMinThreads method to increase the minimum number of idle threads. However, unnecessarily increasing these values can cause performance problems. If too many tasks start at the same time, all of them might appear to be slow. In most cases the thread pool will perform better with its own algorithm for allocating threads.
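
If you do decide to raise the minimums, the call itself is straightforward. The following sketch (not part of the original samples) reads the current minimums and then doubles the worker-thread minimum as an arbitrary illustration.

// A minimal sketch (not part of the original samples) of reading and
// raising the thread pool minimums. The new values here are arbitrary;
// in most applications the default algorithm performs better.
using System;
using System.Threading;

class MinThreadsSketch
{
    static void Main()
    {
        int minWorkerThreads, minCompletionPortThreads;
        ThreadPool.GetMinThreads(out minWorkerThreads, out minCompletionPortThreads);
        Console.WriteLine("Current minimums: {0} worker, {1} I/O completion",
                          minWorkerThreads, minCompletionPortThreads);

        // SetMinThreads returns false if the requested values are out of range.
        bool changed = ThreadPool.SetMinThreads(minWorkerThreads * 2, minCompletionPortThreads);
        Console.WriteLine("SetMinThreads succeeded: {0}", changed);
    }
}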

The thread pool also provides the ThreadPool.UnsafeQueueUserWorkItem and ThreadPool.UnsafeRegisterWaitForSingleObject methods. Use these methods only when you are certain that the caller's stack is irrelevant to any security checks performed during the execution of the queued task. QueueUserWorkItem and RegisterWaitForSingleObject both capture the caller's stack, which is merged into the stack of the thread pool thread when the thread begins to execute a task. If a security check is required, the entire stack must be checked. Although the check provides safety, it also has a performance cost.
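
The call pattern is the same as for QueueUserWorkItem. The following minimal sketch (not part of the original samples) assumes the queued work does not rely on the caller's security context.

// A minimal sketch (not part of the original samples) of
// UnsafeQueueUserWorkItem. Because the caller's stack is not captured,
// use this only when the queued work does not depend on the caller's
// security context.
using System;
using System.Threading;

class UnsafeQueueSketch
{
    static void Main()
    {
        ThreadPool.UnsafeQueueUserWorkItem(
            new WaitCallback(state => Console.WriteLine("Work item state: {0}", state)),
            "some state");

        Thread.Sleep(1000);   // Give the work item time to run.
    }
}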

Beginning with the .NET Framework 4, the easiest way to use the thread pool is to use the Task Parallel Library (TPL). By default, parallel library types like Task and Task<TResult> use thread pool threads to run tasks. You can also use the thread pool by calling ThreadPool.QueueUserWorkItem from managed code (or CorQueueUserWorkItem from unmanaged code) and passing a WaitCallback delegate representing the method that performs the task. Another way to use the thread pool is to queue work items that are related to a wait operation by using the ThreadPool.RegisterWaitForSingleObject method and passing a WaitHandle that, when signaled or when timed out, calls the method represented by the WaitOrTimerCallback delegate. Thread pool threads are used to invoke callback methods.

The code examples in this section demonstrate the thread pool by using the Task class, the ThreadPool.QueueUserWorkItem method, and the ThreadPool.RegisterWaitForSingleObject method.

The following example shows how to create and use a Task object by calling the TaskFactory.StartNew method. For an example that uses the Task<TResult> class to return a value from an asynchronous task, see How to: Return a Value from a Task.

// Demonstrated features: 
//		Task ctor() 
// 		Task.Factory 
//		Task.Wait() 
//		Task.RunSynchronously() 
// Expected results: 
// 		Task t1 (alpha) is created unstarted. 
//		Task t2 (beta) is created started. 
//		Task t1's (alpha) start is held until after t2 (beta) is started. 
//		Both tasks t1 (alpha) and t2 (beta) are potentially executed on  
//           threads other than the main thread on multi-core machines. 
//		Task t3 (gamma) is executed synchronously on the main thread. 
using System;
using System.Threading;
using System.Threading.Tasks;

class StartNewDemo
{
    static void Main()
    {
        Action<object> action = (object obj) =>
        {
            Console.WriteLine("Task={0}, obj={1}, Thread={2}", 
                              Task.CurrentId, obj.ToString(), 
                              Thread.CurrentThread.ManagedThreadId);
        };

        // Construct an unstarted task
        Task t1 = new Task(action, "alpha");

        // Construct a started task
        Task t2 = Task.Factory.StartNew(action, "beta");

        // Block the main thread to demonstrate that t2 is executing
        t2.Wait();

        // Launch t1 
        t1.Start();

        Console.WriteLine("t1 has been launched. (Main Thread={0})", 
                          Thread.CurrentThread.ManagedThreadId);

        // Wait for the task to finish. 
        // You may optionally provide a timeout interval or a cancellation token 
        // to mitigate situations when the task takes too long to finish.
        t1.Wait();

        // Construct an unstarted task
        Task t3 = new Task(action, "gamma");

        // Run it synchronously
        t3.RunSynchronously();

        // Although the task was run synchronously, it is a good practice 
        // to wait for it in the event exceptions were thrown by the task.
        t3.Wait();
    }
}

The following example queues a very simple task, represented by the ThreadProc method, using the QueueUserWorkItem method.

using System;
using System.Threading;

public class Example
{
    public static void Main()
    {
        // Queue the task.
        ThreadPool.QueueUserWorkItem(new WaitCallback(ThreadProc));

        Console.WriteLine("Main thread does some work, then sleeps.");
        // If you comment out the Sleep, the main thread exits before 
        // the thread pool task runs.  The thread pool uses background 
        // threads, which do not keep the application running.  (This 
        // is a simple example of a race condition.)
        Thread.Sleep(1000);

        Console.WriteLine("Main thread exits.");
    }

    // This thread procedure performs the task. 
    static void ThreadProc(Object stateInfo)
    {
        // No state object was passed to QueueUserWorkItem, so 
        // stateInfo is null.
        Console.WriteLine("Hello from the thread pool.");
    }
}

The following code example uses the QueueUserWorkItem method to queue a task and supply the data for the task.

using System;
using System.Threading;

// TaskInfo holds state information for a task that will be 
// executed by a ThreadPool thread. 
public class TaskInfo
{
    // State information for the task.  These members 
    // can be implemented as read-only properties, read/write 
    // properties with validation, and so on, as required. 
    public string Boilerplate;
    public int Value;

    // Public constructor provides an easy way to supply all 
    // the information needed for the task. 
    public TaskInfo(string text, int number)
    {
        Boilerplate = text;
        Value = number;
    }
}

public class Example
{
    public static void Main()
    {
        // Create an object containing the information needed 
        // for the task.
        TaskInfo ti = new TaskInfo("This report displays the number {0}.", 42);

        // Queue the task and data. 
        if (ThreadPool.QueueUserWorkItem(new WaitCallback(ThreadProc), ti))
        {
            Console.WriteLine("Main thread does some work, then sleeps.");

            // If you comment out the Sleep, the main thread exits before 
            // the ThreadPool task has a chance to run.  ThreadPool uses 
            // background threads, which do not keep the application 
            // running.  (This is a simple example of a race condition.)
            Thread.Sleep(1000);

            Console.WriteLine("Main thread exits.");
        }
        else
        {
            Console.WriteLine("Unable to queue ThreadPool request.");
        }
    }

    // The thread procedure performs the independent task, in this case 
    // formatting and printing a very simple report. 
    // 
    static void ThreadProc(Object stateInfo)
    {
        TaskInfo ti = (TaskInfo) stateInfo;
        Console.WriteLine(ti.Boilerplate, ti.Value);
    }
}

The following example demonstrates several threading features.

using System;
using System.Threading;

// TaskInfo contains data that will be passed to the callback 
// method. 
public class TaskInfo
{
    public RegisteredWaitHandle Handle = null;
    public string OtherInfo = "default";
}

public class Example
{
    public static void Main(string[] args)
    {
        // The main thread uses AutoResetEvent to signal the 
        // registered wait handle, which executes the callback 
        // method.
        AutoResetEvent ev = new AutoResetEvent(false);

        TaskInfo ti = new TaskInfo();
        ti.OtherInfo = "First task";
        // The TaskInfo for the task includes the registered wait 
        // handle returned by RegisterWaitForSingleObject.  This 
        // allows the wait to be terminated when the object has 
        // been signaled once (see WaitProc).
        ti.Handle = ThreadPool.RegisterWaitForSingleObject(
            ev,
            new WaitOrTimerCallback(WaitProc),
            ti,
            1000,
            false );

        // The main thread waits three seconds, to demonstrate the 
        // time-outs on the queued thread, and then signals.
        Thread.Sleep(3100);
        Console.WriteLine("Main thread signals.");
        ev.Set();

        // The main thread sleeps, which should give the callback 
        // method time to execute.  If you comment out this line, the 
        // program usually ends before the ThreadPool thread can execute.
        Thread.Sleep(1000);
        // If you start a thread yourself, you can wait for it to end 
        // by calling Thread.Join.  This option is not available with 
        // thread pool threads.
    }

    // The callback method executes when the registered wait times out, 
    // or when the WaitHandle (in this case AutoResetEvent) is signaled. 
    // WaitProc unregisters the WaitHandle the first time the event is 
    // signaled. 
    public static void WaitProc(object state, bool timedOut)
    {
        // The state object must be cast to the correct type, because the 
        // signature of the WaitOrTimerCallback delegate specifies type 
        // Object.
        TaskInfo ti = (TaskInfo) state;

        string cause = "TIMED OUT";
        if (!timedOut)
        {
            cause = "SIGNALED";
            // If the callback method executes because the WaitHandle is 
            // signaled, stop future execution of the callback method 
            // by unregistering the WaitHandle. 
            if (ti.Handle != null)
                ti.Handle.Unregister(null);
        }

        Console.WriteLine("WaitProc( {0} ) executes on thread {1}; cause = {2}.",
            ti.OtherInfo,
            Thread.CurrentThread.ManagedThreadId,
            cause
        );
    }
}