All posts tagged 'threading'


I dream in code

About the author

Robert Williams is an internet application developer for the Salem Web Network.

The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

Delayed Automatic Exit from Console Application v2.0

This is an alternate version of the one originally posted here. It occurred to me that if someone was actually at the console and the program quit without warning, while allegedly waiting for a key press, that would be a bit surprising and/or annoying. This new version displays a warning with an on-screen countdown leading up to the automatic exit:


    using System;
    using System.Threading;

    namespace ConsoleApplication1
    {
        class Program
        {
            static void Main(string[] args)
            {
                //... code here for whatever your console app does
                Console.WriteLine("[Result of the Program]" + Environment.NewLine);

                Console.WriteLine("Press any key to exit." + Environment.NewLine);

                delay = new ExitDelay();
                delay.Start();
                MyTimer = new Timer(TimesUp, null, 1000, 1000);
            }

            static ExitDelay delay;
            static Timer MyTimer;
            static int CountDown = 10;

            // Timer callback: they didn't press any key, but we don't want this window open forever!
            private static void TimesUp(object state)
            {
                if (CountDown > 0)
                {
                    Console.CursorLeft = 0;
                    Console.Write("Program will exit automatically in " + CountDown + " ");
                    CountDown--;
                }
                else
                {
                    Console.CursorLeft = 0;
                    Console.WriteLine("Exiting - Bye!                                   ");
                    Thread.Sleep(1000);
                    delay.Stop();
                    MyTimer.Dispose();
                    Environment.Exit(0);
                }
            }
        }

        public class ExitDelay
        {
            private readonly Thread workerThread;

            public ExitDelay()
            {
                this.workerThread = new Thread(new ThreadStart(this.work));
                this.workerThread.Priority = ThreadPriority.Lowest;
                this.workerThread.Name = "ExitTimer";
            }

            public void Start()
            {
                this.workerThread.Start();
            }

            public void Stop()
            {
                this.workerThread.Abort();
            }

            private void work()
            {
                // Block on this dedicated thread until the user presses a key,
                // then exit at once rather than waiting out the countdown.
                Console.ReadKey();
                Environment.Exit(0);
            }
        }
    }


Categories: C#
Posted by Williarob on Saturday, December 13, 2008 12:08 PM
Permalink | Comments (0) | Post RSSRSS comment feed

Instantly Increase ASP.NET Scalability

Each time a request for a .NET resource (an .aspx page, .ascx control, etc.) comes in, a thread is grabbed from the available worker thread pool in the worker process (aspnet_wp.exe on IIS 5.x, w3wp.exe on IIS 6/7) and assigned to the request. That thread is not released back into the thread pool until the final page has been rendered to the client and the request is complete.

Inside the ASP.NET worker process there are two thread pools. The worker thread pool handles all incoming requests, and the I/O thread pool handles the I/O (accessing the file system, web services, databases, etc.). But how many threads are there in these pools? I had assumed that the number of threads would vary from machine to machine – that ASP.NET and IIS would carefully balance the number of available threads against the available hardware – but that is simply not the case. ASP.NET installs with a fixed, default number of threads to play with: the CLR for the 1.x Framework sets these defaults at just 20 worker threads and 20 I/O threads per CPU. This can be increased by modifying machine.config, but if you are not aware of this, then 20 threads is all you're playing with. If you have multiple sites sharing the same worker process, then they are all sharing this same thread pool.

So long as the number of concurrent requests does not exceed the number of threads available in the pool, all is well. But when you are building enterprise-level applications, the thread pool can become depleted under heavy load – and remember, by default heavy load is anything more than 20 simultaneous requests. When this happens, new requests are entered into the request queue (and the users making those requests watch an hourglass spin). ASP.NET will allow the request queue to grow only so big before it starts to reject requests, at which point it returns Error 503, Service Unavailable.

If you are not aware of this "glass ceiling of scalability", this is a perplexing error – one that never happened in testing and may not be reproducible in your test environment, as it only appears under extreme load.

So the first thing you can do to improve scalability is raise those values. The defaults for the ASP.NET 2.0 CLR are 100 threads in each pool per CPU, and the defaults for the ASP.NET 3.x CLR are 250 worker threads and 1,000 I/O threads per CPU; however, you can tune them further using the guidelines below. 32-bit Windows can handle about 1,400 concurrent threads; 64-bit Windows can handle more, though I don't have the figures.

You can tune the ASP.NET thread pool using the maxWorkerThreads, minWorkerThreads, maxIoThreads, minFreeThreads, minLocalRequestFreeThreads and maxconnection attributes in your machine.config file. Here are the default settings.

  <system.net>
    <connectionManagement>
      <add address="*" maxconnection="2" />
    </connectionManagement>
  </system.net>
  <system.web>
    <httpRuntime minFreeThreads="8" minLocalRequestFreeThreads="4" />
    <processModel maxWorkerThreads="100" maxIoThreads="100" />
  </system.web>

Here is the formula to reduce contention. Apply the recommended changes that are described below across all of the settings, not in isolation.

  Configuration setting        | Default value (.NET Framework 1.1) | Recommended value
  maxconnection                | 2                                  | 12 * #CPUs
  maxIoThreads                 | 20                                 | 100
  maxWorkerThreads             | 20                                 | 100
  minFreeThreads               | 8                                  | 88 * #CPUs
  minLocalRequestFreeThreads   | 4                                  | 76 * #CPUs

  • Set maxconnection to 12 * # of CPUs. This setting controls the maximum number of outgoing HTTP connections that you can initiate from a client. In this case, ASP.NET is the client.
  • Set maxIoThreads to 100. This setting controls the maximum number of I/O threads in the .NET thread pool. This number is automatically multiplied by the number of available CPUs.
  • Set maxWorkerThreads to 100. This setting controls the maximum number of worker threads in the thread pool. This number is also automatically multiplied by the number of available CPUs.
  • Set minFreeThreads to 88 * # of CPUs. The worker process uses this setting to queue all incoming requests if the number of available threads in the thread pool falls below this value. It effectively limits the number of requests that can run concurrently to maxWorkerThreads – minFreeThreads, i.e. 12 per CPU with the values above.
  • Set minLocalRequestFreeThreads to 76 * # of CPUs. The worker process uses this setting to queue requests from localhost (where a web application sends requests to a local web service) if the number of available threads falls below this number. It is similar to minFreeThreads, but only applies to requests from the local computer.

Note: The recommendations provided in this section are not rules; they are a starting point. Test to determine the appropriate settings for your scenario. If you move your application to a new computer, ensure that you recalculate and reconfigure the settings based on the number of CPUs in the new computer.
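Putting the recommendations above together, here is a sketch of what the tuned sections of machine.config might look like for a 2-CPU server. The element names are the standard machine.config ones; the numbers are simply the formula worked out for #CPUs = 2, since maxconnection, minFreeThreads and minLocalRequestFreeThreads are absolute values while the processModel thread counts are multiplied per CPU automatically:

```xml
<!-- Sketch: thread-pool tuning for a 2-CPU box, per the formula above. -->
<system.net>
  <connectionManagement>
    <!-- 12 * 2 CPUs -->
    <add address="*" maxconnection="24" />
  </connectionManagement>
</system.net>
<system.web>
  <!-- Per-CPU values; the CLR multiplies these by the CPU count itself. -->
  <processModel maxWorkerThreads="100" maxIoThreads="100" />
  <!-- 88 * 2 CPUs and 76 * 2 CPUs respectively -->
  <httpRuntime minFreeThreads="176" minLocalRequestFreeThreads="152" />
</system.web>
```

Recalculate these if the CPU count changes; the values are not portable between machines.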

By raising these values, you raise the "glass ceiling of scalability", and in many cases that may be all you need to do. But what happens when you start getting more than 250 simultaneous requests? To make optimum use of the thread pool, all I/O requests that you know could take a second or more to process should be made asynchronously. More information on asynchronous programming in ASP.NET is coming soon.

I recommend testing your new machine.config locally or virtually somewhere first, because if you make a mistake – for example, you paste this into the wrong area, or there is already a <system.web> element and you paste in a second one – ALL websites on the box will stop functioning, as ASP.NET will not be able to parse the machine.config!

Posted by Williarob on Tuesday, December 02, 2008 7:41 AM
Permalink | Comments (1) | Post RSSRSS comment feed

Process and Thread Basics

Programs, Processes and Threads

In .NET terms, a program can be defined as an assembly, or group of assemblies, that work together to accomplish a task. Assemblies are little more than a way of packaging instructions into maintainable elements and are generally compiled into a dynamic link library (DLL) or an executable (EXE), or a combination of the two.

A process gives a program a place to run, allowing access to memory and resources. Generally, each process runs relatively independent of other processes. In particular, the memory where your program variables will reside is completely separate from the memory used by other processes. Your email program cannot directly assign a new value to a variable in the web browser program. If your email program can communicate with your web browser—for instance, to have it open a web page from a link you received in email—it does so with some form of communication that takes much more time than a memory access.

Putting programs into processes and using only a restricted, mutually agreed-upon communication between them has a number of advantages. One advantage is that an error in one process is less likely to interfere with other processes. Before multitasking operating systems, it was much more common for a single program to be able to crash the entire machine. Putting tasks into processes, and limiting interaction with other processes and the operating system, has greatly added to system stability.

All modern operating systems support the subdivision of processes into multiple threads of execution. Threads run independently, like processes, and no thread knows what other threads are running or where they are in the program unless they synchronize explicitly. The key difference between threads and processes is that the threads within a process share all the data of the process. Thus, a simple memory access can accomplish the task of setting a variable in another thread. Every program will have at least one thread.
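To make the "simple memory access" point concrete, here is a minimal sketch (the names are mine, not from any library): two threads in the same process read and write the same static field, something two separate processes could never do directly.

```csharp
using System;
using System.Threading;

class SharedMemoryDemo
{
    // This field lives in the process's memory, so every thread in the
    // process can read and write it directly.
    static int sharedCounter = 0;
    static readonly object sync = new object();

    static void Main()
    {
        Thread worker = new Thread(() =>
        {
            for (int i = 0; i < 100000; i++)
            {
                lock (sync) { sharedCounter++; }   // same variable the main thread uses
            }
        });

        worker.Start();
        for (int i = 0; i < 100000; i++)
        {
            lock (sync) { sharedCounter++; }
        }
        worker.Join();

        Console.WriteLine(sharedCounter);
    }
}
```

The lock is the explicit synchronization the paragraph mentions: without it, the two threads' increments could interleave and updates would be lost.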

In his book ".NET Multithreading", Alan Dennis compares a process to a house and a thread to a housecat. He writes:

The cat spends most of its time sleeping, but occasionally it wakes up and performs some action, such as eating. The house shares many characteristics with a process. It contains resources available to beings in it, such as a litter box. These resources are available to things within the house, but generally not to things outside the house. Things in the house are protected from things outside of the house. This level of isolation helps protect resources from misuse. One house can easily be differentiated from another by examining its address. Most important, houses contain things, such as furniture, litter boxes, and cats.

Cats perform actions. A cat interacts with elements in its environment, like the house it lives in. A housecat generally has a name. This helps identify it from other cats that might share the same household. It has access to some or the entire house depending on its owner’s permission. A thread’s access to elements may also be restricted based on permissions, in this case, the system’s security settings.


Multitasking means that more than one program can be active at a time. You may take it for granted that you can have an email program and a web browser running at the same time, yet not that long ago this was not the case. In the days of DOS you would need to save and close your spreadsheet before opening your word processor. With the advent of Windows, you could open multiple applications at once. Windows 3.x used something called cooperative multitasking, which is based on the assumption that all running processes will yield control to the operating system at frequent intervals. The problem with this model was that not all software developers followed the rules, and a program that did not return control to the system, or did so very infrequently, could destabilize the operating system and cause it to "lock up". When Windows 3.x started a new application, that application was invoked from the main thread; Windows passed control to the application with the understanding that control would quickly be returned. If that didn't happen, all other running applications, including the operating system, could no longer execute instructions. Today, Windows employs preemptive multitasking: instead of relying on programs to return control to the system at regular intervals, the OS simply takes it.

The main thread of a typical Windows program executes a loop (called a message pump). The loop checks a message queue to see if there is work to do, and if there is, it does the work. For example, when a user clicks a button in a Windows application, the click event adds work to the message queue indicating which method should be executed. This method is, of course, known as an event handler. While the loop is executing an event handler, it cannot process additional messages. Multithreading (literally, using more than one thread) is how we can work around this limitation: instead of having the main thread that was assigned to this program do the time-consuming work, we assign the work to a separate thread.
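The idea can be sketched in a few lines of console code (everything here is mine and deliberately simplified, not real Win32 message-pump plumbing): the loop below keeps draining its queue while a slow job runs on a separate thread, so new messages are still handled in the meantime.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

class PumpSketch
{
    static readonly Queue<string> messages = new Queue<string>();
    static readonly object sync = new object();

    static void Main()
    {
        // A slow "event handler" body runs off the main thread, so the pump
        // below is free to keep servicing the queue while it works.
        Thread worker = new Thread(() =>
        {
            Thread.Sleep(300);                       // stand-in for slow work
            lock (sync) { messages.Enqueue("work done"); }
        });
        worker.Start();

        lock (sync) { messages.Enqueue("click"); }

        // A toy message pump: check the queue, handle a message, repeat.
        int handled = 0;
        while (handled < 2)
        {
            string msg = null;
            lock (sync) { if (messages.Count > 0) msg = messages.Dequeue(); }
            if (msg != null)
            {
                Console.WriteLine("handled: " + msg);
                handled++;
            }
            else
            {
                Thread.Sleep(10);                    // idle until a message arrives
            }
        }
        worker.Join();
    }
}
```

Had the sleep run on the pump's own thread instead, the "click" message would have sat unhandled for those 300 milliseconds, which is exactly the frozen-UI effect described above.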

There are a number of ways to create and manage these new threads – using System.Threading, creating a delegate method, implementing the event-based asynchronous pattern, using wait handles, etc. – and I intend to explore all of them in future articles. On a single-core processor, this execution on a separate thread would be periodically interrupted by the operating system to allow other threads a chance to get work done; but after decades in a world where most computers had only one central processing unit (CPU), we are now in a world where only "old" computers have one CPU. Multi-core processors are now the norm. Therefore, every software developer needs to think parallel.


Multithreading allows a process to overlap I/O and computation: one thread can execute while another thread is waiting for an I/O operation to complete. Multithreading also makes a GUI (graphical user interface) more responsive. The thread that handles GUI events, such as mouse clicks and button presses, can create additional threads to perform long-running tasks in response to those events, leaving the event-handler thread free to respond to more GUI events. Finally, multithreading can speed up performance through parallelism: a program that makes full use of two processors may run in close to half the time, though this level of speedup usually cannot be obtained, due to the communication overhead required to coordinate the threads.
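The overlap of I/O and computation is easy to demonstrate. In this sketch (my own toy example), one thread sleeps for 200 ms to stand in for a disk or network wait while the main thread computes a sum; the total elapsed time ends up close to the longer of the two jobs, not their sum.

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class OverlapSketch
{
    static void Main()
    {
        var clock = Stopwatch.StartNew();

        // "I/O" thread: pretend to wait on a disk or network call.
        Thread io = new Thread(() => Thread.Sleep(200));
        io.Start();

        // Meanwhile, the main thread computes.
        long sum = 0;
        for (int i = 1; i <= 1000000; i++) sum += i;

        io.Join();
        clock.Stop();

        Console.WriteLine(sum);
        // Overlapped, the total is close to max(io, compute), not their sum.
        Console.WriteLine(clock.ElapsedMilliseconds < 400);
    }
}
```

Run sequentially, the same work would take the sleep time plus the compute time; overlapped, the compute time effectively disappears inside the I/O wait.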

Posted by Williarob on Saturday, June 28, 2008 9:00 PM
Permalink | Comments (0) | Post RSSRSS comment feed

Asynchronous DataSet

If you haven't yet read the article "Scalable Apps with Asynchronous Programming in ASP.NET" by Jeff Prosise, or seen his presentation at TechEd, then you should certainly take the time to do so; however, I'll summarize the key points briefly here. Basically, there is a finite number of threads available to ASP.NET for request handling, and by making database calls the way most textbooks and articles recommend, many of the threads that should be handling requests are actually tapping their feet waiting for your database request to complete before the page can be served and the thread returned to the pool. When all the threads are busy, incoming requests are queued, and if that queue becomes too long, users start to see HTTP 503 Service Unavailable errors. In other words, there is a glass ceiling to scalability with synchronous I/O requests in ASP.NET.

To make optimum use of the thread pool, all I/O requests that you know could take a second or more to process should be made asynchronously, and the links above will give you plenty of examples of how to do this. The purpose of this article is to demonstrate how you can return a DataSet asynchronously, which is not something I could find an example of anywhere. Another thing I could not find in my research was how to use asynchronous database calls when you have a data layer, as opposed to a page or user control that contacts the database directly; I will provide you with both here.

If you have looked at examples of asynchronous database calls elsewhere on the web before arriving here, you have probably become familiar with the BeginExecuteReader and EndExecuteReader methods of the SqlClient namespace. But where is the BeginExecuteDataSet method? If you need a DataSet, why should that have to be a synchronous request? I could not find any asynchronous methods on the DataAdapter, and while I did find ways to call a WebMethod or web service that returns a DataSet asynchronously, why should you have to break your DataSet-returning methods out into web services? I also found some examples of how to create delegates or use System.Threading.Thread.Start to create DataSets on their own thread but, according to Mr. Prosise, these are worthless in ASP.NET because both of these methods actually steal threads from the same thread pool ASP.NET is using anyway! So by using System.Threading.Thread.Start to create your DataSet, all you are doing is returning a thread to the thread pool and immediately grabbing another one. So how can you do it?

For this example I borrowed the Datalayer from the Job Site Starter Kit and added some Asynchronous methods to it. Here is my BeginExecuteDataSet method:

       Public Function BeginExecuteDataSet(ByVal callback As System.AsyncCallback, _
          ByVal stateObject As Object, ByVal behavior As CommandBehavior) As System.IAsyncResult   
             Dim res As IAsyncResult = Nothing
             res = cmd.BeginExecuteReader(callback, stateObject, behavior)
             Return res
       End Function

But how is that different from BeginExecuteReader? It isn't – it is exactly the same. I don't see the need to rewrite the DataAdapter class from scratch to support asynchronous functions when I can simply use a DataReader to populate a DataSet. The key differences are in the EndExecuteDataSet method:

       Public Function EndExecuteDataSet(ByVal asyncResult As System.IAsyncResult) As DataSet
             Dim rdr As SqlClient.SqlDataReader = cmd.EndExecuteReader(asyncResult)
             Dim dt As DataTable = New DataTable()
             dt.Load(rdr)
             Dim ds As DataSet = New DataSet()
             ds.Tables.Add(dt)
             Return ds
       End Function

Calling these methods is therefore no different from calling BeginExecuteReader. For example, the following code would work from both an .aspx page and an .ascx control.

    Dim db As Classes.Data.DAL

    Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
        'Append async attribute to connection string
        db = New Classes.Data.DAL(String.Concat _
        (ConfigurationManager.ConnectionStrings("MyDB").ConnectionString, ";async=true"))
        db.CommandText = "asyncTest"
        ' Launch data request asynchronously using page async task
        Page.RegisterAsyncTask(New PageAsyncTask(New BeginEventHandler(AddressOf BeginGetData), _
        New EndEventHandler(AddressOf EndGetData), New EndEventHandler(AddressOf GetDataTimeout), _
        Nothing))
    End Sub

    Function BeginGetData(ByVal sender As Object, ByVal e As EventArgs, _
        ByVal cb As AsyncCallback, ByVal state As Object) As IAsyncResult
        Return db.BeginExecuteDataSet(cb, state, CommandBehavior.CloseConnection)
    End Function

    Sub EndGetData(ByVal ar As IAsyncResult)
        Try
            gv1.DataSource = db.EndExecuteDataSet(ar)
            gv1.DataBind()
        Catch ex As Exception
            lblMsg.Text = ex.ToString()
        End Try
    End Sub

    Sub GetDataTimeout(ByVal ar As IAsyncResult)
        lblMsg.Text = "Async connection timed out"
    End Sub

There you have it, a DataSet returned asynchronously, from a Data Access Layer. Download the entire Data Access Layer (zip file contains both Visual Basic and C# versions - 3.05 kb).

Posted by Williarob on Wednesday, December 19, 2007 8:07 AM
Permalink | Comments (0) | Post RSSRSS comment feed