One of the major requirements most developers are up against these days is making sure their code runs as fast as possible. No one likes to wait for an application; users want seamless, invisible, instant speed. Multithreading is becoming more popular now that many desktops and certainly most servers ship with multiple processors or multiple cores. By spinning up multiple threads to execute different code paths simultaneously, you can often increase the performance of your application.
That mentality is increasingly making its way into Outlook development as well. In recent months, a rash of developers have reported that their add-ins crash Outlook when they do multithreaded work against it. While multithreading may give the perception of a better-performing add-in, the reality is that it isn't going to help here, nor will it work.
The Outlook Object Model (OOM) runs in a single-threaded apartment (STA) COM server. This means that all OOM calls execute on the main thread. If your add-in makes OOM calls from any thread other than the main thread, those calls need to be properly marshaled back to thread 0. In unmanaged code, use CoMarshalInterThreadInterfaceInStream (or CoMarshalInterface) to marshal your object pointer across threads; this ensures your OOM calls execute on thread 0. In managed code, using the OOM through COM interop by means of the PIA handles this marshaling for you.
All of this talk about marshaling interfaces back to the main thread raises the question, "What's the point?" If the OOM calls need to run on thread 0, why spin up a new thread to do OOM work in the first place? That's a great question. You don't gain any performance, because all the calls run on the same thread anyway, and you incur the overhead of the marshaling on top of that. There's really no advantage to multithreading the Outlook Object Model.
Here are links to some smart folks discussing the same topic: