Synchronous & Asynchronous and MultiThread Programming -6 — ThreadPool & WaitHandles

Alperen Öz
8 min read · Jul 7, 2024


ThreadPool & WaitHandles

What is a Thread Pool?

  • In C#, the Thread Pool is a multi-threading mechanism designed to manage a large number of threads efficiently and make them reusable.
  • Creating a thread reserves roughly 1 MB of stack memory, and starting a new thread incurs additional processor cost.
  • The Thread Pool reduces the overhead of managing many threads and allows CPU cores and other processor resources to be used more efficiently.
  • This mechanism performs multi-threaded operations at a lower performance cost by preventing unnecessary thread creation and minimizing thread overhead.

How is a Thread Pool Structured?

  • A Thread Pool consists of a group of pre-created threads. These threads wait in the pool to perform specific tasks. Once a task is completed, the thread returns to the pool and is ready for a new task. This approach removes the cost of creating and destroying threads from the software.

How is a Thread Pool Used?

  • Work is handed to the Thread Pool in C# with the ThreadPool.QueueUserWorkItem method, which queues a callback (here, WorkerMethod) to run on a pool thread.

Here’s an example:

ThreadPool.QueueUserWorkItem(WorkerMethod, "Task 1");
ThreadPool.QueueUserWorkItem(WorkerMethod, "Task 2");
ThreadPool.QueueUserWorkItem(WorkerMethod, "Task 3");
ThreadPool.QueueUserWorkItem(WorkerMethod, "Task 4");
ThreadPool.QueueUserWorkItem(WorkerMethod, "Task 5");
ThreadPool.QueueUserWorkItem(WorkerMethod, "Task 6");
Console.ReadLine();

void WorkerMethod(object state)
{
    Console.WriteLine($"*** Thread Count: {ThreadPool.ThreadCount}");
    string jobName = (string)state;
    Console.WriteLine($"Task Started: {jobName}");
    Thread.Sleep(new Random().Next(1000, 5000));
    Console.WriteLine($"Task Done: {jobName}");
}

As shown above, the program queues WorkerMethod six times and executes each call on a thread taken from the thread pool.

In a sample run, the thread pool assigned a thread first to the call carrying the parameter Task 2 and then continued with the others; this order can vary from run to run.

The number of threads currently in the pool is also printed on each call.

The thread pool can autonomously increase or decrease the number of active threads.

How to Optimize the Thread Pool?

The Thread Pool starts with a single thread when it is first created. As new tasks arrive, the pool expands, creating new threads as demand grows and retiring idle threads when they are no longer needed.

Additionally, as an optimization, the Thread Pool marks its threads as background threads, so when the application exits, all pool threads are terminated automatically.

There is no built-in way to cancel a work item once it has been queued to the Thread Pool.

The only tuning we can perform on the Thread Pool is setting the maximum and minimum number of threads that can run concurrently: the SetMaxThreads method sets the maximum number of concurrent pool threads, and the SetMinThreads method sets the minimum.

The SetMinThreads method might initially seem insignificant, but it is critical for optimization. Each thread in the Thread Pool has a cost in terms of system resources, similar to the costs encountered when creating a thread. Therefore, the goal is to minimize this cost in the Thread Pool.

To avoid these costs and ensure a certain number of threads are always active, you can use the SetMinThreads method. This ensures that a specific number of threads are always ready and able to handle the workload.

ThreadPool.SetMaxThreads(4, 4);
ThreadPool.SetMinThreads(2, 2);

ThreadPool.QueueUserWorkItem(WorkerMethod, "Task 1");
ThreadPool.QueueUserWorkItem(WorkerMethod, "Task 2");
ThreadPool.QueueUserWorkItem(WorkerMethod, "Task 3");
ThreadPool.QueueUserWorkItem(WorkerMethod, "Task 4");
ThreadPool.QueueUserWorkItem(WorkerMethod, "Task 5");
ThreadPool.QueueUserWorkItem(WorkerMethod, "Task 6");
Console.ReadLine();
void WorkerMethod(object state)
{
    Console.WriteLine($"*** Thread Count: {ThreadPool.ThreadCount}");
    string jobName = (string)state;
    Console.WriteLine($"Task Started: {jobName}");
    Thread.Sleep(new Random().Next(1000, 5000));
    Console.WriteLine($"Task Done: {jobName}");
}
// *** Thread Count: 6
// Task Started: Task 2
// *** Thread Count: 6
// Task Started: Task 1
// *** Thread Count: 7
// Task Started: Task 5
// *** Thread Count: 6
// Task Started: Task 3
// *** Thread Count: 8
// Task Started: Task 6
// *** Thread Count: 6
// Task Started: Task 4
// Task Done: Task 1
// Task Done: Task 2
// Task Done: Task 3
// Task Done: Task 5
// Task Done: Task 6
// Task Done: Task 4

In the example above, the minimum and maximum numbers of threads in the Thread Pool are set with the SetMinThreads and SetMaxThreads methods. However, as the output shows, the Thread Pool does not strictly adhere to these limits and can operate outside them.

Even when max and min values are specified, the Thread Pool does not guarantee adherence to them. The values are treated as recommendations (for example, SetMaxThreads rejects and returns false for a maximum smaller than the number of processors, leaving the previous limit in effect), and the final decision rests with the Thread Pool itself.

The second parameter of the SetMaxThreads and SetMinThreads methods, completionPortThreads, controls the number of I/O completion-port threads: a separate set of pool threads that handle the completion callbacks of asynchronous I/O operations. Keeping these separate prevents worker threads from being blocked until I/O operations finish and lets the Thread Pool use its resources more effectively.
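
As a side note, a minimal sketch of checking whether the requested limits were actually accepted, using the matching query methods ThreadPool.GetMinThreads, GetMaxThreads, and GetAvailableThreads:

bool maxApplied = ThreadPool.SetMaxThreads(4, 4); // likely false on machines with more than 4 cores
bool minApplied = ThreadPool.SetMinThreads(2, 2);
Console.WriteLine($"Max applied: {maxApplied}, Min applied: {minApplied}");

// Query the limits actually in effect and the threads currently available.
ThreadPool.GetMinThreads(out int minWorker, out int minIo);
ThreadPool.GetMaxThreads(out int maxWorker, out int maxIo);
ThreadPool.GetAvailableThreads(out int availableWorker, out int availableIo);
Console.WriteLine($"Min: {minWorker}/{minIo}, Max: {maxWorker}/{maxIo}, Available: {availableWorker}/{availableIo}");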

When Should the Thread Pool Be Used?

High workload applications: The Thread Pool helps manage processes more efficiently under heavy workloads, especially when many threads need to be run concurrently.

Network-based applications: Applications requiring network communication can use the Thread Pool to handle network requests in parallel, minimizing network-related delays and improving performance.

Web servers and services: Web servers and services can use the Thread Pool to process incoming requests in parallel, allowing them to handle a large number of requests simultaneously and respond faster.

GUI applications: Graphical user interface applications can use the Thread Pool to manage user interactions and processes in parallel, enhancing user experience and ensuring smoother application performance.

Database operations: Database operations, which are often delayed processes, can be managed in parallel using the Thread Pool, making the application faster and more efficient.

Using the Thread Pool in these scenarios ensures more efficient management of threads and enhances system performance.

The Purpose of Using the Thread Pool

Performance improvement: The Thread Pool prevents unnecessary thread creation and optimizes thread management, thereby improving performance. This allows for more efficient use of CPU resources and enhances the overall performance of the application.

Scalability support: The Thread Pool can manage a large number of threads and balance system performance effectively, making the application scalable and adaptable to different load levels.

Resource management optimization: By reusing threads, the Thread Pool ensures effective resource management, increasing system stability and preventing resource wastage.

Load balancing and system stabilization: The Thread Pool distributes the workload evenly, ensuring balanced performance across the system. It also queues pending operations, balancing system load and enhancing overall application stability.

Application responsiveness: By reusing threads and reducing the cost of unnecessary thread creation and termination, the Thread Pool makes the application faster and more responsive. This is especially critical for applications requiring quick user interactions.

Disadvantages of the Thread Pool

Limited control: The Thread Pool offers little control over its threads. You cannot set the priority of a pool thread or decide when it will run.

Task Class and the Thread Pool

The Task class represents and executes asynchronous operations. It leverages the Thread Pool to perform its work in the background: methods like Task.Run and Task.Factory.StartNew create a Task object that runs on a Thread Pool thread. Working with .NET's Task Parallel Library (TPL), the Task class simplifies parallel programming and thread management, and its integration with the Thread Pool leads to more efficient resource utilization and better system performance.
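
For example, a minimal sketch showing that a delegate passed to Task.Run executes on a pool thread, which can be verified with Thread.CurrentThread.IsThreadPoolThread:

Task task = Task.Run(() =>
{
    // The delegate runs on a Thread Pool thread.
    Console.WriteLine($"Pool thread: {Thread.CurrentThread.IsThreadPoolThread}"); // True
    Console.WriteLine($"*** Thread Count: {ThreadPool.ThreadCount}");
});
task.Wait();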

Thread Class and the Thread Pool

The Thread class is designed for directly creating and managing threads. It operates independently of the Thread Pool, with each new thread creation resulting in a new system thread. This approach offers more flexibility and control but consumes more resources and is harder to manage.
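
By contrast, a minimal sketch of creating a thread directly; it produces a brand-new system thread rather than a pool thread, and it must be configured and started by hand:

Thread worker = new Thread(() =>
{
    // A manually created thread is not part of the Thread Pool.
    Console.WriteLine($"Pool thread: {Thread.CurrentThread.IsThreadPoolThread}"); // False
});
worker.IsBackground = true; // end together with the application, like pool threads
worker.Start();
worker.Join(); // wait for it to finish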

Summary

The Thread Pool is a powerful tool for managing threads effectively in C# and .NET applications. It reduces the cost of creating and destroying threads, enhancing performance and optimizing resource usage. The Thread Pool is ideal for short-lived and frequently performed operations, and when used with the Task class, it simplifies and makes asynchronous programming and parallel processing more efficient.

What are Wait Handles?

Wait Handles are signaling mechanisms that let one thread wait until another thread completes its work. AutoResetEvent and ManualResetEvent, covered in the previous signaling article, are both types of wait handles and form the core components of this mechanism.
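
As a quick refresher, a minimal sketch of the signaling idea: the main thread blocks on WaitOne until a worker thread calls Set on the shared wait handle.

AutoResetEvent done = new(false);

new Thread(() =>
{
    Thread.Sleep(1000); // simulate some work
    done.Set();         // signal: the work is finished
}).Start();

done.WaitOne();         // block here until signaled
Console.WriteLine("Worker signaled completion.");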

Using Wait Handles with ThreadPool

The ThreadPool.RegisterWaitForSingleObject method registers a callback that runs on a ThreadPool thread when a specified WaitHandle is signaled (or when an optional timeout elapses). It is particularly useful for reacting to the completion of asynchronous operations without blocking a thread while waiting, which lets the ThreadPool manage synchronization and event-driven work efficiently and improves application performance.

AutoResetEvent autoResetEvent = new(false);
RegisteredWaitHandle registeredWaitHandle = ThreadPool.RegisterWaitForSingleObject(autoResetEvent, WorkerMethod, "Task 1 Wait Handle", -1, true); // -1: no timeout
// RegisteredWaitHandle registeredWaitHandle = ThreadPool.RegisterWaitForSingleObject(autoResetEvent, WorkerMethod, "Task 1 Wait Handle", 15000, true); // would fire automatically after 15 seconds if no signal arrives
Thread.Sleep(2500);
autoResetEvent.Set();
registeredWaitHandle.Unregister(autoResetEvent);
Console.Read();

void WorkerMethod(object state, bool timedOut)
{
    Console.WriteLine($"*** Thread Count: {ThreadPool.ThreadCount}");
    string name = (string)state;
    Console.WriteLine($"{name} started");
    Thread.Sleep(new Random().Next(1000, 5000));
    Console.WriteLine($"{name} done");
}
// *** Thread Count: 5
// Task 1 Wait Handle started
// Task 1 Wait Handle done

Passing -1 as the timeout parameter (millisecondsTimeOutInterval) means there is no timeout: the callback waits indefinitely until the wait handle is signaled. If a finite value were provided, the callback would also run automatically once that period elapsed. The true value in the fifth parameter (executeOnlyOnce) means the callback runs only once; if false were provided, it would run again on every subsequent timeout or signal.

It’s important to use the Unregister method to release system resources once the process is complete.

registeredWaitHandle.Unregister(autoResetEvent);
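
For illustration, a small sketch (with hypothetical timeout values) of the repeating behavior: with a finite timeout and executeOnlyOnce set to false, the callback fires on every timeout and on every signal until Unregister is called.

AutoResetEvent signal = new(false);
RegisteredWaitHandle handle = ThreadPool.RegisterWaitForSingleObject(
    signal,
    (state, timedOut) => Console.WriteLine(timedOut ? "Timed out" : "Signaled"),
    null,
    1000,   // fire on timeout every 1000 ms...
    false); // ...repeatedly, because executeOnlyOnce is false

Thread.Sleep(3500);        // allow a few timeout callbacks
signal.Set();              // trigger one "Signaled" callback
Thread.Sleep(500);
handle.Unregister(signal); // stop further callbacks and release the registration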

Signaling — Helper Methods

When working with signaling, the following helper methods let us coordinate multiple wait handles of different types at once.

WaitHandle.WaitAll

The WaitHandle.WaitAll method takes an array of wait handles and blocks the calling thread until all of them have been signaled. This can be thought of as an AND (&&) condition across the signals.

AutoResetEvent autoResetEvent1 = new(false);
AutoResetEvent autoResetEvent2 = new(false);
ManualResetEvent manualResetEvent1 = new(false);
ManualResetEvent manualResetEvent2 = new(false);

autoResetEvent1.Set();
autoResetEvent2.Set();
manualResetEvent1.Set();
manualResetEvent2.Set();
WaitHandle.WaitAll(new WaitHandle[]
{
    autoResetEvent1, autoResetEvent2, manualResetEvent1, manualResetEvent2
});
Console.WriteLine("Wait All");
Console.ReadLine();

WaitHandle.WaitAny

The WaitHandle.WaitAny method takes an array of signals and blocks the calling thread until any one of the signals has been received. This can be thought of as an OR (||) condition between the signals.

AutoResetEvent autoResetEvent1 = new(false);
AutoResetEvent autoResetEvent2 = new(false);
ManualResetEvent manualResetEvent1 = new(false);
ManualResetEvent manualResetEvent2 = new(false);

manualResetEvent1.Set();
WaitHandle.WaitAny(new WaitHandle[]
{
    autoResetEvent1, autoResetEvent2, manualResetEvent1, manualResetEvent2
});
Console.WriteLine("Wait Any");
Console.ReadLine();

WaitHandle.SignalAndWait

The WaitHandle.SignalAndWait method takes two wait handles: it signals the first one and then blocks on the second by calling WaitOne, waiting until that handle is signaled.

AutoResetEvent autoResetEvent1 = new(false);
AutoResetEvent autoResetEvent2 = new(false);

autoResetEvent1.Set();
autoResetEvent2.Set();
WaitHandle.SignalAndWait(autoResetEvent1, autoResetEvent2);
Console.WriteLine("Wait SignalAndWait");
Console.ReadLine();

Source: Gençay Yıldız — Asenkron & Multithread Programlama
