
Notice that this Main() method is not making a call to Console.ReadLine() to force the console to remain visible until you press the Enter key. Thus, when you run the application, it will shut down immediately because the Thread object has been configured as a background thread. Given that the Main() method triggers the creation of the primary foreground thread, as soon as the logic in Main() completes, the AppDomain unloads before the secondary thread is able to complete its work. However, if you comment out the line that sets the IsBackground property, you will find that each number prints to the console, as all foreground threads must finish their work before the AppDomain is unloaded from the hosting process.
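To frame the discussion, here is a minimal sketch of what such a Main() method might look like (assuming the Printer helper class used throughout this chapter; the details here are an approximation and may differ slightly from the full listing):
static void Main(string[] args)
{
  Console.WriteLine("***** Background Threads *****\n");
  Printer p = new Printer();
  Thread bgroundThread =
    new Thread(new ThreadStart(p.PrintNumbers));

  // This secondary thread is now a background thread; comment out this
  // line and every number will print before the AppDomain unloads.
  bgroundThread.IsBackground = true;
  bgroundThread.Start();
}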
For the most part, configuring a thread to run as a background type can be helpful when the worker thread in question is performing a noncritical task that is no longer needed when the main task of the program is finished.
■Source Code The BackgroundThread project is included under the Chapter 18 subdirectory.
The Issue of Concurrency
All the multithreaded sample applications you have written over the course of this chapter have been thread-safe, given that only a single Thread object was executing the method in question. While some of your applications may be this simplistic in nature, a good deal of your multithreaded applications may contain numerous secondary threads. Given that all threads in an AppDomain have concurrent access to the shared data of the application, imagine what might happen if multiple threads were accessing the same point of data. As the thread scheduler will force threads to suspend their work at random, what if thread A is kicked out of the way before it has fully completed its work? Thread B is now reading unstable data.
To illustrate the problem of concurrency, let’s build another C# Console Application project named MultiThreadedPrinting. This application will once again make use of the Printer class created previously, but this time the PrintNumbers() method will force the current thread to pause for a randomly generated amount of time:
public class Printer
{
  public void PrintNumbers()
  {
    ...
    for (int i = 0; i < 10; i++)
    {
      // Put thread to sleep for a random amount of time.
      Random r = new Random();
      Thread.Sleep(1000 * r.Next(5));
      Console.Write("{0}, ", i);
    }
    Console.WriteLine();
  }
}
The Main() method is responsible for creating an array of ten (uniquely named) Thread objects, each of which is making calls on the same instance of the Printer object:
class Program
{
  static void Main(string[] args)
  {
    Console.WriteLine("***** Synchronizing Threads *****\n");
    Printer p = new Printer();

    // Make 10 threads that are all pointing to the same
    // method on the same object.
    Thread[] threads = new Thread[10];
    for (int i = 0; i < 10; i++)
    {
      threads[i] =
        new Thread(new ThreadStart(p.PrintNumbers));
      threads[i].Name = string.Format("Worker thread #{0}", i);
    }

    // Now start each one.
    foreach (Thread t in threads)
      t.Start();
    Console.ReadLine();
  }
}
Before looking at some test runs, let’s recap the problem. The primary thread within this AppDomain begins life by spawning ten secondary worker threads. Each worker thread is told to make calls on the PrintNumbers() method on the same Printer instance. Given that you have taken no precautions to lock down this object’s shared resources (the console), there is a good chance that the current thread will be kicked out of the way before the PrintNumbers() method is able to print out the complete results. Because you don’t know exactly when (or if) this might happen, you are bound to get unpredictable results. For example, you might find the output shown in Figure 18-8.
Figure 18-8. Concurrency in action, take one
Now run the application a few more times. Figure 18-9 shows another possibility (your results will obviously differ).

Figure 18-9. Concurrency in action, take two
■Note If you are unable to generate unpredictable outputs, increase the number of threads from 10 to 100 (for example) or introduce another call to Thread.Sleep() within your program. Eventually, you will encounter the concurrency issue.
There are clearly some problems here. As each thread is telling the Printer to print out the numerical data, the thread scheduler is happily swapping threads in the background. The result is inconsistent output. What we need is a way to programmatically enforce synchronized access to the shared resources. As you would guess, the System.Threading namespace provides a number of synchronization-centric types. The C# programming language also provides a particular keyword for the very task of synchronizing shared data in multithreaded applications.
Synchronization Using the C# lock Keyword
The first technique you can use to synchronize access to shared resources is the C# lock keyword. This keyword allows you to define a scope of statements that must be synchronized between threads. By doing so, incoming threads cannot interrupt the current thread and prevent it from finishing its work. The lock keyword requires you to specify a token (an object reference) that must be acquired by a thread to enter within the lock scope. When you are attempting to lock down a private instance-level method, you can simply pass in a reference to the current object:
private void SomePrivateMethod()
{
  // Use the current object as the thread token.
  lock(this)
  {
    // All code within this scope is thread-safe.
  }
}
However, if you are locking down a region of code within a public member, it is safer (and a best practice) to declare a private object member variable to serve as the lock token:
public class Printer
{
  // Lock token.
  private object threadLock = new object();
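  // What follows is a hedged sketch of the lock-based PrintNumbers() method;
  // the body mirrors the Monitor-based expansion shown in the next section.
  public void PrintNumbers()
  {
    // Use the private lock token rather than "this".
    lock (threadLock)
    {
      // Display Thread info.
      Console.WriteLine("-> {0} is executing PrintNumbers()",
        Thread.CurrentThread.Name);

      // Print out numbers.
      Console.Write("Your numbers: ");
      for (int i = 0; i < 10; i++)
      {
        Random r = new Random();
        Thread.Sleep(1000 * r.Next(5));
        Console.Write("{0}, ", i);
      }
      Console.WriteLine();
    }
  }
}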


Figure 18-10. All threads are now synchronized.
Synchronization Using the System.Threading.Monitor Type
The C# lock statement is really just a shorthand notation for working with the System.Threading.Monitor class type. Once processed by the C# compiler, a lock scope actually resolves to the following (which you can verify using ildasm.exe or reflector.exe):
public void PrintNumbers()
{
  Monitor.Enter(threadLock);
  try
  {
    // Display Thread info.
    Console.WriteLine("-> {0} is executing PrintNumbers()",
      Thread.CurrentThread.Name);

    // Print out numbers.
    Console.Write("Your numbers: ");
    for (int i = 0; i < 10; i++)
    {
      Random r = new Random();
      Thread.Sleep(1000 * r.Next(5));
      Console.Write("{0}, ", i);
    }
    Console.WriteLine();
  }
  finally
  {
    Monitor.Exit(threadLock);
  }
}
First, notice that the Monitor.Enter() method is the ultimate recipient of the thread token you specified as the argument to the lock keyword. Next, all code within a lock scope is wrapped within a try block. The corresponding finally clause ensures that the thread token is released (via the Monitor.Exit() method), regardless of any possible runtime exception. If you were to modify the MultiThreadedPrinting program to make direct use of the Monitor type (as just shown), you would find the output is identical.
Now, given that the lock keyword seems to require less code than making explicit use of the System.Threading.Monitor type, you may wonder about the benefits of using the Monitor type directly. The short answer is control. If you make use of the Monitor type, you are able to instruct the active thread to wait for some duration of time (via the Wait() method), inform waiting threads when the current thread is completed (via the Pulse() and PulseAll() methods), and so on.
As you would expect, in a great number of cases, the C# lock keyword will fit the bill. However, if you are interested in checking out additional members of the Monitor class, consult the .NET Framework 3.5 SDK documentation.
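To see what this extra control can buy you, consider a hedged sketch (the MessageBuffer type shown here is hypothetical and not part of the chapter's sample projects) in which one thread uses Wait() to block until another thread signals it via Pulse():
class MessageBuffer
{
  private object token = new object();
  private string message;

  public void Post(string msg)
  {
    lock (token)
    {
      message = msg;
      // Wake a thread blocked inside WaitForMessage().
      Monitor.Pulse(token);
    }
  }

  public string WaitForMessage()
  {
    lock (token)
    {
      // Wait() releases the lock and blocks until another thread calls Pulse().
      while (message == null)
        Monitor.Wait(token);
      string result = message;
      message = null;
      return result;
    }
  }
}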
Synchronization Using the System.Threading.Interlocked Type
Although it is always hard to believe until you look at the underlying CIL code, assignments and simple arithmetic operations are not atomic. For this reason, the System.Threading namespace provides a type that allows you to operate on a single point of data atomically with less overhead than with the Monitor type. The Interlocked class type defines the static members shown in Table 18-4.
Table 18-4. Members of the System.Threading.Interlocked Type
Member              Meaning in Life
CompareExchange()   Safely tests two values for equality and, if equal, changes one of the values with a third
Decrement()         Safely decrements a value by 1
Exchange()          Safely swaps two values
Increment()         Safely increments a value by 1
Although it might not seem like it at the outset, the process of atomically altering a single value is quite common in a multithreaded environment. Assume you have a method named AddOne() that increments an integer member variable named intVal. Rather than writing synchronization code such as the following:
public void AddOne()
{
lock(myLockToken)
{
intVal++;
}
}
you can simplify your code via the static Interlocked.Increment() method. Simply pass in the variable to increment by reference. Do note that the Increment() method not only adjusts the value of the incoming parameter, but also returns the new value:
public void AddOne()
{
int newVal = Interlocked.Increment(ref intVal);
}
In addition to Increment() and Decrement(), the Interlocked type allows you to atomically assign numerical and object data. For example, if you wish to assign the value of a member variable to the value 83, you can avoid the need to use an explicit lock statement (or explicit Monitor logic) and make use of the Interlocked.Exchange() method:
public void SafeAssignment()
{
Interlocked.Exchange(ref myInt, 83);
}
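Exchange() also provides overloads for object references, so entire references can be swapped atomically in the same manner. Here is a hedged sketch (the currentPrinter field and SwapPrinter() method are hypothetical, not part of the chapter's samples):
private Printer currentPrinter;

public void SwapPrinter(Printer newPrinter)
{
  // Atomically publish the new reference; the previous reference is returned.
  Printer previous = Interlocked.Exchange(ref currentPrinter, newPrinter);
  Console.WriteLine("Replaced printer: {0}", previous);
}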
Finally, if you wish to test two values for equality and, if they are equal, change the point of comparison in a thread-safe manner, you are able to leverage the Interlocked.CompareExchange() method as follows:
public void CompareAndExchange()
{
// If the value of i is currently 83, change i to 99.
Interlocked.CompareExchange(ref i, 99, 83);
}
Synchronization Using the [Synchronization] Attribute
The final synchronization primitive examined here is the [Synchronization] attribute, which is a member of the System.Runtime.Remoting.Contexts namespace. In essence, this class-level attribute effectively locks down all instance member code of the object for thread safety. When the CLR allocates objects attributed with [Synchronization], it will place the object within a synchronized context. As you may recall from Chapter 17, objects that must remain within a given contextual boundary derive from ContextBoundObject. Therefore, if you wish to make the Printer class type thread-safe (without explicitly writing thread-safe code within the class members), you could update the definition as follows:
using System.Runtime.Remoting.Contexts;
...
// All methods of Printer are now thread-safe!
[Synchronization]
public class Printer : ContextBoundObject
{
public void PrintNumbers()
{
...
}
}
In some ways, this approach can be seen as the lazy way to write thread-safe code, given that you are not required to dive into the details of which aspects of the type are truly manipulating thread-sensitive data. The major downfall of this approach, however, is that even if a given method is not making use of thread-sensitive data, the CLR will still lock invocations to the method. Obviously, this could degrade the overall performance of the type, so use this technique with care.
At this point, you have seen a number of ways you are able to provide synchronized access to shared blocks of data. To be sure, additional synchronization types are available within the System.Threading namespace, which I encourage you to explore at your leisure. To wrap up our examination of thread programming, allow me to introduce four additional types: TimerCallback, Timer, ThreadPool, and BackgroundWorker.

Programming with Timer Callbacks
Many applications have the need to call a specific method during regular intervals of time. For example, you may have an application that needs to display the current time on a status bar via a given helper function. As another example, you may wish to have your application call a helper function every so often to perform noncritical background tasks such as checking for new e-mail messages. For situations such as these, you can use the System.Threading.Timer type in conjunction with a related delegate named TimerCallback.
■Note The Windows Forms API provides a GUI-based Timer control that offers the same functionality as the TimerCallback type. In fact, the GUI-based Timer type is typically simpler to use, as it can be configured at design time.
To illustrate, assume you have a Console Application (TimerApp) that will print the current time every second until the user presses a key to terminate the application. The first obvious step is to write the method that will be called by the Timer type:
class Program
{
static void PrintTime(object state)
{
Console.WriteLine("Time is: {0}", DateTime.Now.ToLongTimeString());
}
static void Main(string[] args)
{
}
}
Notice how this method has a single parameter of type System.Object and returns void. This is not optional, given that the TimerCallback delegate can only call methods that match this signature. The value passed into the target of your TimerCallback delegate can be any bit of information whatsoever (in the case of the e-mail example, this parameter might represent the name of the Microsoft Exchange server to interact with during the process). Also note that given that this parameter is indeed a System.Object, you are able to pass in multiple arguments using a System.Array or custom class/structure.
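For example, here is a hedged sketch (the TimerState class and this variation of PrintTime() are hypothetical, not part of the TimerApp sample) of bundling several values into a single state object:
class TimerState
{
  public string Greeting;
  public string MailServer;
}

static void PrintTime(object state)
{
  // Cast the incoming System.Object back to the custom state type.
  TimerState ts = (TimerState)state;
  Console.WriteLine("Time is: {0}, {1} (watching {2})",
    DateTime.Now.ToLongTimeString(), ts.Greeting, ts.MailServer);
}
An instance of such a class would then be passed as the second argument to the Timer constructor in place of the null value seen shortly.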
The next step is to configure an instance of the TimerCallback delegate and pass it into the Timer object. In addition to configuring a TimerCallback delegate, the Timer constructor allows you to specify the optional parameter information to pass into the delegate target (defined as a System.Object), the interval to poll the method, and the amount of time to wait (in milliseconds) before making the first call, for example:
static void Main(string[] args)
{
  Console.WriteLine("***** Working with Timer type *****\n");

  // Create the delegate for the Timer type.
  TimerCallback timeCB = new TimerCallback(PrintTime);

  // Establish timer settings.
  Timer t = new Timer(
    timeCB,  // The TimerCallback delegate type.
    null,    // Any info to pass into the called method (null for no info).
    0,       // Amount of time to wait before starting.
    1000);   // Interval of time between calls (in milliseconds).

  Console.WriteLine("Hit key to terminate...");
  Console.ReadLine();
}
In this case, the PrintTime() method will be called roughly every second and will pass in no additional information to said method. If you did wish to send in some information for use by the delegate target, simply substitute the null value of the second constructor parameter with the appropriate information:
// Establish timer settings.
Timer t = new Timer(timeCB, "Hello From Main", 0, 1000);
We can then obtain the incoming data as follows:
static void PrintTime(object state)
{
Console.WriteLine("Time is: {0}, Param is: {1}", DateTime.Now.ToLongTimeString(), state.ToString());
}
Figure 18-11 shows the output.
Figure 18-11. Timers at work
■Source Code The TimerApp project is included under the Chapter 18 subdirectory.
Understanding the CLR ThreadPool
The next thread-centric topic we will examine in this chapter is the role of the CLR thread pool. When you invoke a method asynchronously using delegate types (via the BeginInvoke() method), the CLR does not literally create a brand-new thread. For purposes of efficiency, a delegate’s BeginInvoke() method leverages a pool of worker threads that is maintained by the runtime. To allow you to interact with this pool of waiting threads, the System.Threading namespace provides the ThreadPool class type.
If you wish to queue a method call for processing by a worker thread in the pool, you can make use of the ThreadPool.QueueUserWorkItem() method. This method has been overloaded to allow you to specify an optional System.Object for custom state data in addition to an instance of the
WaitCallback delegate:

public sealed class ThreadPool
{
  ...
  public static bool QueueUserWorkItem(WaitCallback callBack);
  public static bool QueueUserWorkItem(WaitCallback callBack,
    object state);
}
The WaitCallback delegate can point to any method that takes a System.Object as its sole parameter (which represents the optional state data) and returns nothing. Do note that if you do not provide a System.Object when calling QueueUserWorkItem(), the CLR automatically passes a null value. To illustrate queuing methods for use by the CLR thread pool, ponder the following program, which makes use of the Printer type once again. In this case, however, you are not manually creating an array of Thread types; rather, you are assigning members of the pool to the PrintNumbers() method:
class Program
{
  static void Main(string[] args)
  {
    Console.WriteLine("***** Fun with the CLR Thread Pool *****\n");
    Console.WriteLine("Main thread started. ThreadID = {0}",
      Thread.CurrentThread.ManagedThreadId);
    Printer p = new Printer();
    WaitCallback workItem = new WaitCallback(PrintTheNumbers);

    // Queue the method ten times.
    for (int i = 0; i < 10; i++)
    {
      ThreadPool.QueueUserWorkItem(workItem, p);
    }
    Console.WriteLine("All tasks queued");
    Console.ReadLine();
  }

  static void PrintTheNumbers(object state)
  {
    Printer task = (Printer)state;
    task.PrintNumbers();
  }
}
At this point, you may be wondering if it would be advantageous to make use of the CLR-maintained thread pool rather than explicitly creating Thread objects. Consider these benefits of leveraging the thread pool:
• The thread pool manages threads efficiently by minimizing the number of threads that must be created, started, and stopped.
• By using the thread pool, you can focus on your business problem rather than the application's threading infrastructure.