Custom parallel looping constructs

Stephen Toub - MSFT

For those of you who have examined the internals of the Task Parallel Library in our December ’07 CTP release, you’ve likely noticed that the methods on the System.Threading.Parallel type are implemented on top of the System.Threading.Tasks.Task type, and that they take advantage of Task’s self-replicating functionality.  The idea behind replication is that the Task makes just enough copies of itself to ensure that all available processing cores are saturated with work.  This is useful not only for the Parallel constructs we implement, but also for custom loop constructs that you might want to implement for your own needs.  We’re providing what we believe to be the most commonly desired constructs, but we’re also quite aware that there are plenty of useful constructs beyond what we can provide.
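
To make the idea concrete, here’s a minimal sketch of the pattern (using the same Task.Create overload shown later in this post); the delegate handed to a self-replicating task has to be safe to run concurrently with copies of itself:

Task t = Task.Create(delegate
{
    // ... work that can safely run on multiple cores at once ...
}, TaskCreationOptions.SelfReplicating);
t.Wait(); // returns once the original task and any replicas have completed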

As an example, we don’t currently provide a parallel while construct, but one could easily be built on top of Task.  Consider a typical while loop:

while (condition()) body();

Here, body is executed over and over while condition is true; when condition is false, the loop exits.  We can create a parallel loop that behaves very much like this, but running on all available processing cores.  How might we do so?  First, let’s try with the ThreadPool:

public static void ParallelWhile(
    Func<bool> condition, Action body)
{
    int n = Environment.ProcessorCount;
    int count = n;

    using (ManualResetEvent mre =
        new ManualResetEvent(false))
    {
        // Each worker loops until condition returns false; the last
        // worker to finish signals the event so the caller can return.
        WaitCallback wc = delegate
        {
            while (condition()) body();
            if (Interlocked.Decrement(ref count) == 0)
                mre.Set();
        };

        // Queue n-1 workers to the ThreadPool and run the final
        // one inline on the calling thread.
        for (int i = 0; i < n; i++)
        {
            if (i == n - 1) wc(null);
            else ThreadPool.QueueUserWorkItem(wc);
        }

        mre.WaitOne();
    }
}

Certainly not an infeasible amount of code to write, but there are some difficulties here (the number of workers is fixed at the processor count regardless of what else is running on the machine, the calling thread ends up blocked on an event until the queued workers finish, and an exception thrown from condition or body on one of those workers has nowhere useful to go), and the implementation is not as efficient as one might like.  We can implement a similar construct with the Task Parallel Library:

public static void ParallelWhile(
    Func<bool> condition, Action body)
{
    Task.Create(delegate
    {
        while (condition()) body();
    },
    TaskCreationOptions.SelfReplicating).Wait();
}

Here, there’s significantly less boilerplate necessary, and many of the potential performance issues with the previous example are addressed by the underlying library (or at least will be by the final release; there are performance issues with the current CTP, but that’s all part of the beauty of a technology preview ;). 

Now, neither of these completely duplicates the semantics of the sequential while loop, which might be fine, but you might also have special requirements for how you want the loop to function.  As an example, in the sequential implementation, once condition returns false, no additional invocations of body will occur, even if another call to condition would return true; that’s not the case with this parallel implementation.  Imagine this were running on four threads: one thread could see condition return false and exit the loop, but other threads could see condition return true and continue on (e.g. the condition delegate randomly returns true or false).  If those semantics were important to us, we could allow such information to be communicated between multiple iterations of the loop, at the expense of some volatile reads and writes:

private class VolatileBool { public volatile bool Value; }

public static void ParallelWhile(
    Func<bool> condition, Action body)
{
    VolatileBool shouldExit = new VolatileBool();
    Task.Create(delegate
    {
        while (!shouldExit.Value && condition()) body();
        shouldExit.Value = true;
    },
    TaskCreationOptions.SelfReplicating).Wait();
}
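
For instance (a purely hypothetical usage of the construct above), with a condition that occasionally returns false, the first iteration to observe false now stops every replica before it starts its next iteration:

// Hypothetical illustration: the condition randomly returns false about
// 1% of the time.  Random isn't thread-safe, hence the lock around it.
Random rand = new Random();
object randLock = new object();
ParallelWhile(
    delegate { lock (randLock) return rand.Next(100) != 0; },
    delegate { Thread.Sleep(1); /* stand-in for real work */ });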

Now when one iteration sees condition return false, it’ll signal to all other threads that they should stop as soon as possible, such that even if condition returned true, they should exit before starting their next iteration.  As another example, if one iteration in the sequential implementation throws an exception, no additional iterations will start; while we can’t mimic that entirely in the parallel implementation, as with the previous example we can at least signal to other threads that they should bail as soon as possible:

private class VolatileBool { public volatile bool Value; }

public static void ParallelWhile(
    Func<bool> condition, Action body)
{
    VolatileBool shouldExit = new VolatileBool();
    Task.Create(delegate
    {
        try
        {
            while (!shouldExit.Value && condition()) body();
        }
        finally { shouldExit.Value = true; }
    },
    TaskCreationOptions.SelfReplicating).Wait();
}

Again, not a lot of code, but a fairly complete implementation (a few things remain, like validating that the arguments to the method are not null).
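
For what it’s worth, that validation is only a couple of lines; here’s a sketch of what might sit at the top of ParallelWhile:

// Sketch of the argument validation mentioned above, placed at the
// start of ParallelWhile before the self-replicating task is created.
if (condition == null) throw new ArgumentNullException("condition");
if (body == null) throw new ArgumentNullException("body");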

When would a parallel while loop like this be useful?  Consider a problem where you might use a heuristic such as simulated annealing to approximate the best result.  You might have, say, five seconds to find the best result, and thus you might run the heuristic over and over as many times as you can within that five seconds looking for the best possible answer.  With multiple cores, you could run this over and over in parallel, and for that a ParallelWhile construct like the one we just created would be very handy:

object monitor = new object();
double bestResult = 0;
DateTime endTime = DateTime.UtcNow.AddSeconds(5);
ParallelWhile(() => DateTime.UtcNow < endTime, delegate
{
    double result = SimulatedAnnealing();
    lock (monitor) if (result > bestResult) bestResult = result;
});

We can’t provide all possible variations of looping constructs that all customers will be interested in.  We do, however, want to provide the lower-level mechanisms that will make building custom parallel loop implementations easy.  If you have comments or suggestions, we’d really love to hear them.
