PSA: Pinvokes may be 100x slower under the debugger.

I need to make an embarrassing public service announcement (PSA): Pinvokes can be 100x slower under the debugger in VS2005 / Whidbey.  For example, this is the issue that came up here. It may also manifest as your app "hanging" under the debugger with 100% CPU usage.

How did this happen?
I'll explain why, but I first want to make it clear this explanation is not an attempt to justify this horribly broken behavior.

1. In Whidbey, we introduced Managed Debugging Assistants, which are runtime checks designed to catch a certain class of end-user bugs that corrupt the runtime. These catch things like stack-imbalance on pinvokes.

2. A certain set of MDAs are on by default under the debugger. Debuggers aren't supposed to change behavior, so we tried to be very rigorous about scrubbing this list. Any MDA with notable side-effects (eg, side-effects larger than the overhead of the managed-debugging channel) was cut from the list. For example, we cut the QueryInterface MDA (which made additional QI calls to verify that an object was honoring the QI contract) because making additional QI calls was too big of a change: if one of those extra QIs crashed, we'd get "this app only crashes under the debugger" problems.

3. We didn't identify performance side-effects in our MDA scrub. One of the pinvoke MDAs (PInvokeStackImbalance, which checks for stack balance) changes the code-gen of the pinvoke stubs to go down a much slower path, and we never explicitly checked for performance side-effects.

This slipped through because several individual teams did the right thing in isolation, but didn't really step back and look at the whole system:
On one hand: This pinvoke check is empirically extremely valuable. Stack-imbalanced pinvokes corrupt the runtime and are near impossible for an end-user to diagnose. This single error check may save you days of debugging. Thus, the pinvoke team had a strong motivation to push for this to be on by default under a debugger. Furthermore, the pinvoke team advises that pinvokes should only be used very rarely (relative to your overall managed code). Thus we didn't explicitly stress high-volume pinvokes under the debugger because we didn't believe it was a core scenario (this frees our limited resources up to test other core scenarios).
On the other hand: The MC++ team uses pinvokes extremely liberally. (In a worst case, every function call in MC++ could be a pinvoke).
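To make the failure mode concrete, here's a minimal sketch of the bug class PInvokeStackImbalance exists to catch: a calling-convention mismatch. (The library name and exports here are hypothetical; the DllImport defaults are from the interop documentation.)

```csharp
using System.Runtime.InteropServices;

static class NativeMethods
{
    // If the native export is __cdecl, this default declaration is wrong:
    // DllImport defaults to CallingConvention.Winapi (StdCall on x86), so
    // every call leaves the stack unbalanced -- silent corruption that is
    // nearly impossible to diagnose without the MDA.
    [DllImport("mynative.dll")]
    internal static extern int Sum(int a, int b);

    // Correct declaration for a __cdecl export:
    [DllImport("mynative.dll", CallingConvention = CallingConvention.Cdecl)]
    internal static extern int SumCdecl(int a, int b);
}
```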

What's the workaround?
If you get hit with this slowdown, disable the PInvokeStackImbalance MDA.  See here for steps on how to disable MDAs (there are a variety of options).
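For example, one of those options is a per-application .mda.config file placed next to your EXE. A sketch, assuming the schema described in the MDA documentation (the app name here is hypothetical; the file must be named after your EXE):

```xml
<!-- MyApp.exe.mda.config, placed next to MyApp.exe -->
<mdaConfig>
  <assistants>
    <!-- Turn off just the stack-imbalance check; other default MDAs stay on. -->
    <pInvokeStackImbalance enable="false" />
  </assistants>
</mdaConfig>
```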

Comments (12)
  1. How does this compare to VS2003? Or is that what you were implying?

    btw, Awesome blog 🙂 I’m a regular reader but it’s my first comment.

  2. It should not impact VS2003, as MDAs were first introduced in Whidbey (2.0).

    / also a regular reader, and yeah it is an awesome blog:)

  3. Paul Kline says:

    You wouldn’t believe how many hours two developers have spent on this in the last two months. We sent emails to Microsoft and got no response on this problem, but as luck would have it, I was looking at the blogs and came across our problem.

    Thanks guys, we were majorly worried about PInvoke running SLOW. Mostly because we use a very highly specialized math library that would be slower if we wrote it in .NET (FSINCOS, etc., in assembly).

  4. Paul – I’m sorry about the grief this has caused. If you can give me a ping at:, I’d like to follow up with you more about this. I’m particularly interested to know who you sent mail to.

  5. Mike,

    It was my MSDN bug report that you linked to in this post, and I’m appreciative of the fact that there was a work-around :-). I really thought I was sunk when I ran into this.

    Yes indeed, C++ /clr users are quite likely to pinvoke (e.g., C & C++ libraries are still native). However, C++ users generally call these via C++ interop and not DllImports, which calls into question the usefulness of a stack-imbalance check in this context.

    –jeff anderson

  6. Adam M. says:


    This was a huge issue for me. I went out and bought VS2005 Professional, spent all that money, only to see the game I was developing slow to an absolute crawl in the debugger.

    It’s good to know that there’s a workaround.

    — Adam

  7. j.stagner says:

    Is there any way to disable MDAs when creating your own debugger?  I notice that in Mdbg all of my programs are 100x slower, but in the VS2005 debugger and its cousins there is much less difference in performance.  I don’t know whether to attribute this difference to MDAs or not, but it’s my best guess at this point.

  8. j.stagner says:

    Further testing (by disabling all MDAs using an environment variable) demonstrates that the perf difference was related to MDA.

    Unfortunately, I can’t justify turning off all MDAs on a client’s machine 😉  I just can’t seem to find any way to do so on a "during this debugging session" basis.

  9. J – if you disable MDAs via an environment variable, then it’s not machine-wide. The debugger can provide the environment to the debuggee, so only your debuggee will be affected.

    (simplest case is to just set the env vars in your debugger process and let the debuggee inherit them).
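In script form, that simplest case might look like this. (A sketch assuming the CLR 2.0 COMPLUS_MDA environment variable from the MDA docs; shown in POSIX syntax, on Windows cmd it would be `set COMPLUS_MDA=0`.)

```shell
# Disable all MDAs for one debugging session only: set the CLR's
# COMPLUS_MDA variable in the process that launches the debuggee,
# so only the debuggee (and its children) inherit it.
export COMPLUS_MDA=0
echo "COMPLUS_MDA=$COMPLUS_MDA"
```

A debugger that spawns the debuggee itself can do the equivalent by setting the variable in its own process environment before creating the debuggee process.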

  10. j.stagner says:

    WOW!  I thought I had another workaround that was more involved but wasn’t working quite right, but this sounds perfect!  Sounds so simple when you hear it, now I wonder why I didn’t think of that 😉

    Thanks again, your blog is a real gift to aspiring debugger developers.

  11. Here are two main ways a debugger can pass an environment variable to the debuggee. 1. Set the var in…

  12. How to Disable a Specific MDA

    As Mike Stall has noted in his post, the…


Comments are closed.
