AnyCPU Exes are usually more trouble than they’re worth

Over the past few months I've had some interesting debates with folks here (and some customers) about the cost/benefit trade-off of "AnyCPU" (architecture-neutral) managed EXEs.  I think we've converged on a consensus that most of the time they're not what you want and so shouldn't be the default in Visual Studio.  I suspect this topic may interest (and even shock) some folks, so I thought I'd share the rationale with you here.

Background - .NET and 64-bit
With the introduction of Win64 (64-bit versions of Windows), PE files (EXEs and DLLs) can be marked as either 32-bit or 64-bit.  When a 32-bit EXE is launched on Win64, it runs in "the WOW" (Windows-32 on Windows-64) to present the illusion of a 32-bit operating system to the process.  Generally only 32-bit DLLs can be loaded into a 32-bit process, and only 64-bit DLLs into a 64-bit process.  When the CLR added 64-bit support in version 2.0, we had an interesting decision to make: should we mark our binaries as 32-bit or 64-bit by default?  Technically managed binaries have no hard CPU dependency, so they could be either (actually there is a small loader thunk, but that's unused on any newer OS, including all the 64-bit ones, since the OS loader knows about managed EXEs explicitly).  Since we wanted people to be able to write .NET libraries that they could re-use from both 32-bit and 64-bit processes, we worked with Windows to extend the OS loader support to enable architecture-neutral ("AnyCPU") PE files.

Managed architecture-neutral DLLs are fairly straightforward: they can be loaded into either 32-bit or 64-bit processes, and the (32-bit or 64-bit) CLR in the process will do the right thing with them.  AnyCPU EXEs are a little more complicated, since the OS loader needs to decide how to initialize the process.  On 64-bit OSes they run as 64-bit processes (unless the 'ldr64' master OS switch says otherwise), and on 32-bit OSes they run as 32-bit processes.  In Visual Studio 2008, AnyCPU is the default platform for C# and VB projects.  This means that by default, applications you compile will run in 64-bit processes on 64-bit OSes and 32-bit processes on 32-bit OSes.  This often works fine, but there are a number of minor downsides.

The costs of architecture-neutral EXEs
There are a number of reasons to think that AnyCPU should not be the default for EXEs.  Don't get me wrong, 64-bit hardware and OSes are definitely the way to go (in fact all 4 of my development machines have 64-bit OSes on them - I stopped bothering to install 32-bit OSes years ago).  But that doesn't necessarily mean that most processes should be 64-bit.  Here's the list I've been using in our discussions to justify making x86 the default for EXE projects in Visual Studio:

  1. Running in two very different modes increases product complexity and the cost of testing
    Often people don't realize the implications of architecture-neutral assemblies for native interop.  It means you need to ensure that equivalent 32-bit and 64-bit versions of the native DLLs you depend on are available, and (most significantly) that the appropriate one is selected automatically.  This is fairly easy when calling OS APIs, due to the OS re-mapping of c:\windows\system32 to c:\windows\syswow64 when running in the WOW and the extensive testing the OS team does.  But many people who ship native DLLs alongside their managed app get this wrong at first, and are surprised when their application blows up on 64-bit systems with an exception about their 32-bit DLL being in a bad format.  Also, although it's much rarer than for native code, pointer-size bugs can still manifest in .NET (eg. assuming IntPtr is the same as Int32, or incorrect marshalling declarations when interopping with native code).
    Also, in addition to the rules you need to know and follow, there's the simple fact that you've now got twice as much code to test.  Eg., there could easily be (and certainly have been) CLR bugs that reproduce on only one architecture of the CLR, and this applies all the way up the stack (from OS, framework and 3rd-party libraries to your code).  Of course, in an ideal world everyone does a great job testing both 32-bit and 64-bit and you won't see any differences, but in practice for any large application that tends not to be the case, and (at Microsoft at least) we end up duplicating our entire test system for 32-bit and 64-bit and paying a significant ongoing cost to test and support both platforms.
    [Edit: Rico - of CLR and VS performance architect fame - just posted a great blog entry on why Visual Studio will not be a pure 64-bit application anytime soon]
  2. 32-bit tends to be faster anyway
    When an application can run fine in either 32-bit or 64-bit mode, the 32-bit mode tends to be a little faster.  Larger pointers mean more memory and cache consumption, and the number of bytes of CPU cache available is the same for both 32-bit and 64-bit processes.  Of course the WOW layer does add some overhead, but the performance numbers I've seen indicate that in most real-world scenarios, running in the WOW is faster than running as a native 64-bit process.
  3. Some features aren't available in 64-bit
    Although we all want to have perfect parity between 32-bit and 64-bit, the reality is that we're not quite there yet.  CLR v2 only supported mixed-mode debugging on x86, and although we've finally added x64 support in CLR V4, edit-and-continue still doesn't support x64.  On the CLR team, we consider x64 to be a first-class citizen whenever we add new functionality, but the reality is that we've got a complicated code-base (eg. completely separate 32-bit and 64-bit JIT compilers) and we sometimes have to make trade-offs (for example, adding 64-bit EnC would have been a very significant cost to the JIT team, and we decided that their time was better spent on higher priority features).  There are other cool features outside of the CLR that are also specific to x86 - like historical debugging in VS 2010.  Complicating matters here is that we haven't always done a great job with the error messages, and so sometimes people "upgrade" to a 64-bit OS and are then disgusted to see that some of their features no longer appear to work (without realizing that if they just re-targeted the WOW they'd work fine).  For example, the EnC error in VS isn't very clear ("Changes to 64-bit applications are not allowed"), and has led to some confusion in practice.  I believe we're doing the right thing in VS2010 and fixing that dialog to make it clear that switching your project to x86 can resolve the issue, but still there's no getting back the time people have wasted on errors like this.
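To make the pointer-size pitfall from point 1 concrete, here's a minimal sketch (names and values are mine, purely for illustration) of a bug that compiles cleanly but only reproduces when an AnyCPU binary happens to run as a 64-bit process:

```csharp
using System;

class PointerSizeDemo
{
    static void Main()
    {
        // IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit process,
        // so the very same AnyCPU binary behaves differently per OS.
        Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
        Console.WriteLine("IntPtr.Size:    " + IntPtr.Size);

        if (Environment.Is64BitProcess)
        {
            // A pointer value that needs more than 32 bits.
            IntPtr p = new IntPtr(0x100000000L);

            // Code that assumes IntPtr fits in an Int32 fails at runtime -
            // and only in the 64-bit flavor of the process:
            try
            {
                int truncated = p.ToInt32();
            }
            catch (OverflowException)
            {
                Console.WriteLine("ToInt32 overflowed - a 64-bit-only bug.");
            }
        }
    }
}
```

This is exactly the kind of difference that doubles your test matrix: an x86-only EXE makes the behavior identical on both OS flavors.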

When do 64-bit processes make sense?
The biggest benefit of 64-bit processes is obviously the increased address space.  Many programs are bumping up against the 2GB limit of traditional 32-bit processes (even though they may not be using anywhere near 2GB of RAM).  One thing that can be done for such programs is to have them opt into 4GB mode so that they can get a full 4GB of address space when running in the WOW on a 64-bit OS [Edit: softened wording here due to some guidance to the contrary].  If more address space would be useful, then sometimes the right thing to do is to target JUST x64 and avoid the cost of supporting two platforms (Exchange Server has done this, for example).  But often the right trade-off is to support both 32-bit and 64-bit processes, so that you can still run on 32-bit OSes, but take advantage of the large address space when running on 64-bit OSes.

This is where architecture-neutral assemblies make a lot of sense.  If you're a library vendor, then building as AnyCPU and testing on all supported architectures absolutely makes sense.  If you are producing an application EXE and have reason to believe your application may need more than 4GB of address space, then switching to AnyCPU may be a good idea.  But, as with native code, really needing that much address space is still pretty rare and usually only necessary for large, complex applications, and so opting in to AnyCPU should really not be a burden.  You've got to think about what you want your testing strategy to be anyway, so making this step explicit seems to make sense.

Another argument for AnyCPU being the default which I think deserves serious thought is the impact on the Windows ecosystem and the desire to move to a pure 64-bit world someday.  No doubt the WOW adds extra complexity and confusion, and it would be great to just kill it as quickly as we can.  I definitely agree with that sentiment, and we should work to get to that place.  But realistically, we're a long way from being able to seriously consider killing the WOW on client OSes (it is already optional on Server Core - but how many EXE projects are really server apps?).  Here are some things that need to happen before we can seriously consider killing the WOW: Windows needs to stop shipping new 32-bit-only client OSes, most new native applications need to fully support 64-bit, all popular existing apps need to move to 64-bit (including VS), etc.  I'm sure we'll get there some day (as we did with the 16-bit to 32-bit transition), but I don't think defaulting to x86 in Visual Studio is going to be a major barrier here.  When it starts looking like killing the WOW may be a feasible option in the near future, then perhaps we should just switch the default to be x64-only, and let people opt in to supporting 'legacy' 32-bit platforms.

So how is Visual Studio 2010 and .NET 4.0 changing?
We are not changing anything here in the CLR or compilers - they continue to support both modes.  However, after discussing these issues, the VS project system team has agreed to make EXE projects default to the x86 platform in VS 2010.  Unfortunately there is a bug in Beta1 where ALL managed project types default to x86 (I take the blame for this - I didn't think to check DLL projects when validating the changes were in).  AnyCPU is still incredibly valuable for DLLs (you may not always know what processes your DLL will be loaded into), and since it just enables a DLL to be used in more places without actually affecting the bitness of the process, there isn't sufficient justification to disable AnyCPU by default.  This bug has been fixed, and so the plan is to ship Beta2 with just the EXE projects defaulting to x86, and all DLL projects remaining AnyCPU.

That said, there's still time to get customer feedback and so this could change for VS2010 RTM.  We've already heard a lot of surprised reactions where people seem to think we're treating x64 as second-class, or otherwise resisting the natural evolution to 64-bit systems.  Many people have mentioned that 32-bit hardware and 32-bit OSes are quickly becoming a thing of the past.  I agree completely and this is a good trend, but it's completely orthogonal.  64-bit hardware and OSes give us the ability to run processes with 64-bit address spaces, but they by no means make using them a requirement, or necessarily even preferable to using the WOW.  The Windows folks did such a good job building the WOW, and the CPU designers did a good job supporting 32-bit modes (for x64 at least; ia64 is a different story), so there aren't a lot of downsides to relying on them.  Someday I'm sure we'll decide the WOW has outlived its useful lifetime and Windows should kill it rather than maintain it, but I'm sure that day is a LONG way off (eg. when did Windows finally remove support for 16-bit processes, and did anyone really notice?).

When I actually get into debating this issue on its merits, almost everyone I've talked to has agreed that making x86 the default seems to be the best choice - at least for the next several years.  This by no means signifies decreased support for 64-bit OSes and frameworks.  I can tell you that most CLR developers work almost exclusively on x64 OSes, and do much of their testing and development with 64-bit processes.  Our testing (like that of most teams at Microsoft) treats x86 and x64 as first-class and generally equal priority (except for products like Silverlight, of course, which are still x86-only).  But when we put ourselves in the shoes of our users and study this issue on its merits, it just makes practical sense for x86 to be the default for EXE projects.

Let me know if you agree or disagree.  Regardless, I hope you enjoy using VS2010 - it's really shaping up to be a great release!

[Edit - added section about helping the ecosystem move to all 64-bit]

Comments (18)
  1. Stephen Cleary says:

    I remain ambiguous to your conclusion. 🙂


    1A. Native DLL interop is a common stumbling point, but one that is quickly resolved with a mere Google search. If a company advertises support for a 64-bit platform, they should test on it.

    1B. CLR bugs are rare; I’ve never encountered one (BCL bugs are much more common). A 32-bit platform has a 32-bit OS and 32-bit CLR underneath the code; a 64-bit platform has a 64-bit OS, WoW, and 32-bit CLR underneath, which removes one possibility for platform bugs (the 64-bit CLR) but adds another (WoW).

    2. Hmmm. I don’t actually know about this one. Never tested. 🙂

    3. I’m sad to hear that historical debugging won’t be supported on 64-bit. I never use EnC, though. I do all my development on a 64-bit platform quite happily.

    I think there’s only one really good reason behind this change: the BadImageFormatException on 32-bit P/Invoke.

    My opinion is that x86 *should have been* the default in VS2008, but should be AnyCPU in the future:

    1. Developers as a whole are in the process of moving from 32-bit to 64-bit platforms for development. This means all the "64-bit only" mistakes (such as P/Invoking a 32-bit DLL) will be caught immediately instead of in the testing phase. This will also naturally increase demand for historical debugging / EnC on 64-bit platforms.

    2. WoW64 is not installed by default on Server 2008 R2 Core:

    It’s possible that future OS releases (e.g., Server 2012 Standard) may follow the same path.

    VS2010 is a tough call. I believe the default should definitely be AnyCPU in VS2012, so I can see where keeping the default as AnyCPU in VS2010 would have fewer surprises for developers. On the other hand, people still do trip up on 32-bit P/Invoke, so changing the default to x86 in VS2010 would benefit them.

    Conclusion: ambiguity. There is no correct choice. 🙂

    I know! Do it both ways! Have a checkbox at install time that allows the developer to choose their own default. 😉

  2. Jason Haley says:

    Interesting Finds: June 9, 2009

  3. ShadowChaser says:

    I must admit, I was a bit disappointed when I saw the change to x86. I’m right on the fence in terms of the logic – my company generally uses AnyCPU when we can, and fully test against 64-bit.

    WoW may be "good enough" or "just as fast today", but it won’t be forever. Case in point – when the NTVDM & WOW32 were ripped out of 64-bit Windows, there were still legacy apps such as major installation packages shipping 16-bit binaries.

    The next server version of Windows, Windows Server 2008 R2, won’t have support for 32-bit hardware. I’d be shocked if the client OS release after Windows 7 shipped with 32-bit support too.

    Most people treat the Visual Studio defaults as the "best practices". A percentage of assemblies written as AnyCPU might not have been tested on a 64-bit OS, but factor that against the huge number of 100% managed applications that work perfectly.

    I’m not sure if Edit and Continue is a good argument against 64-bit. I’ve always considered it a "dubious feature" and one of the weirdo VB features that got carried into C#. I’m sure people use it, but (so far) I’ve never met a developer that does. You said it yourself – it’s not high enough priority to add 64-bit Edit and Continue.

    I know it’s not Microsoft’s intent, but 64-bit still feels like second class. Silverlight is a really good example of the chicken and the egg problem we’re facing:

    * Flash doesn’t have a 64-bit IE plugin

    * No one runs 64-bit IE because Flash doesn’t work

    * Silverlight doesn’t provide a 64-bit assembly because no one runs 64-bit IE

    * adobe doesn’t build a 64-bit plugin for Flash because no one is using 64-bit IE and even Microsoft isn’t providing 64-bit plugins. Argh!

    I love 64-bit and the extra memory capacity I get, but it’s extremely frustrating supporting a Frankenstein installation of 32-bit in a 64-bit world. I have programs that literally have 32-bit MSI installers, 64-bit Explorer plugins, and 32-bit helper processes that are loaded by the 64-bit Explorer plugins. Ugh.

    It’s a bit psychological – many people don’t want Visual Studio generating what is perceived to be "legacy applications" by default. Regardless of how well architected WOW64 is, it will still feel second class and inferior to many people.

    That said, I frequently encounter two common problems when writing AnyCPU .NET apps:

    * The 64-bit version of SetWindowLong has a different name than the 32-bit version (i.e. it’s now SetWindowLongPtr). It’s probably the #1 bug most 64-bit interop developers make – the MSDN documentation says to call "SetWindowLong", but internally the Win32 SDK maps it to SetWindowLongPtr. This is a tough one to work around and typically requires wrapper/helper functions. It would be great if there was a cleaner way to do this, or at the very least if it could be detected as a compiler warning somehow.

    * It’s difficult to deploy a "side by side" 32-bit and 64-bit unmanaged assembly and have an AnyCPU compiled executable intelligently decide which one to load. It’s difficult even when developers want to hard code the names of the two different DLLs – from what I’ve seen, most developers give up and compile two separate .NET executables, even though the unmanaged assemblies are being dynamically linked. I’m not sure if there’s an easy way to fix this, but it sure would be nice to provide separate 32-bit and 64-bit unmanaged DLL names to P/Invoke.
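    The usual community workaround for the SetWindowLong problem described above (not an official API, just a common pattern) is a pair of private P/Invoke declarations behind one public wrapper that picks the right entry point at runtime:

```csharp
using System;
using System.Runtime.InteropServices;

static class NativeMethods
{
    // 32-bit entry point: SetWindowLong takes and returns a 32-bit value.
    [DllImport("user32.dll", EntryPoint = "SetWindowLong", SetLastError = true)]
    private static extern int SetWindowLong32(IntPtr hWnd, int nIndex, int dwNewLong);

    // 64-bit entry point: SetWindowLongPtr takes and returns pointer-sized values.
    // (On 32-bit Windows this export doesn't exist - the SDK macro maps it back.)
    [DllImport("user32.dll", EntryPoint = "SetWindowLongPtr", SetLastError = true)]
    private static extern IntPtr SetWindowLongPtr64(IntPtr hWnd, int nIndex, IntPtr dwNewLong);

    // One public wrapper that works in both 32-bit and 64-bit processes.
    public static IntPtr SetWindowLongPtr(IntPtr hWnd, int nIndex, IntPtr dwNewLong)
    {
        if (IntPtr.Size == 8)
            return SetWindowLongPtr64(hWnd, nIndex, dwNewLong);
        return new IntPtr(SetWindowLong32(hWnd, nIndex, dwNewLong.ToInt32()));
    }
}
```

    Since the DllImport stubs are only resolved on first call, declaring both is harmless; only the branch matching the current process bitness ever executes.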

    Put me down in the AnyCPU default camp. Tough decision for the VS team though!

  4. Some good points, thanks.  

    One argument against which I think is legitimate and deserves some real thought is that we’d all like to get to a simpler 64-bit-only world as quickly as we can (to avoid the complexity, confusion and cost of the WOW).  By changing the defaults, we might be slowing down that transition (chicken-and-egg problem).  That’s definitely worth taking into account, but I don’t think it’s a terribly strong argument until at least a large fraction of native applications out there support 64-bit.  Maybe the right thing to do here at some future point is to make x64 the default (so people have to actually opt into supporting ‘ancient legacy OSes’ <grin>) – but we’re clearly not there yet.

    Anyway, I’m sure there’s going to be lots of great points here, and I’m not going to try to debate them all.  I will, however, make sure the product team responsible for this change knows about the comments so they can consider whether anything should change for RTM.  Thanks!

  5. ssteesy says:

    Excellent discussion of the x86/x64 transition in development.

    I also have been running on strictly x64 OSes for years now, and customers are finally catching up.  In the past 6 months we have had a huge uptick in requests for x64 versions of our applications.  I’ve also noticed a lot of laptops and PCs being advertised coming preinstalled with Vista x64 … so I think we may make a majority of the transition in the next 2-3 years, but I’m also sure we’ll be keeping the WOW for a while. So, I’d rather you keep the default as "Any CPU".

    Now I have a complaint … I have been waiting for Edit and Continue on x64 ever since it didn’t make it into Whidbey.  Someone above said they never use it, don’t know any developers that do, and said it comes from VB.  I disagree!  I’ve never been a VB developer; I was VERY pleased when I started using Visual Studio 6’s C/C++ compiler and it supported making code changes on the fly and continuing execution.  I do it all the time: when you are deep in a trace and debug session and it has taken literally hours to get to the source of a problem, it is absolutely wonderful to be able to make a minor logic change, move the line of execution back in front of the change, step through it again, see that it works, and continue tracing to find other issues.  Now that customers want x64 support because they have machines with 4+ GB of RAM (and our applications can use it), it is becoming a major pain trying to debug in x64 without Edit and Continue.

  6. Thanks for the feedback ssteesy.

    I hear you on x64 EnC – we really wanted to add it in CLR v4, but since it requires some significant work on the x64 JIT compiler it had to be traded off against features that affect ALL users (like perf improvements), and ultimately we decided the cost/benefit was better for those other features.  One of the things we use in determining priority is the number/strength of ‘connect’ votes.  This one currently has only 5, feel free to increase the vote here:

    That said, we have a plan for getting this nearly "for free" in a future release by coupling it with some other JIT work we’re planning on doing.  That, coupled with increasing x64 development as you mention, should make the cost/benefit ratio much more clearly in favor of doing the work.

  7. Yuhong Bao says:

    "eg. when did Windows finally remove support for 16-bit processes"

    Even 32-bit Windows 7 still supports it. But 64-bit Windows never did, and in fact the virtual 8086 mode needed for DOS virtual machines was never supported in long mode, making it impossible.

  8. Thomas says:

    I understand that your decision for x86 was driven by EnC and performance concerns, fine. But is there a (simple) way to load an assembly that was produced by VS2010 without exception for postprocessing purposes? I don't want to run it, just for introspection? I also do not want to distribute different executables / msbuild tasks for x86/x64/ia64/msil? Plain MSIL should be good enough!

  9. Thomas,

    DLLs produced by VS2010 are AnyCPU by default and so you shouldn't have any trouble there.  EXEs are x86 by default and so won't load for EXECUTION in a 64-bit process (eg. Assembly.Load/LoadFrom will fail).  But if you're just loading an EXE for inspection purposes, Assembly.ReflectionOnlyLoad should be able to load it for you – even if it isn't compatible with your process's bitness (and you should be using ReflectionOnlyLoad anyway if you don't want to execute anything from the assembly).
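    A minimal sketch of that inspection approach (using ReflectionOnlyLoadFrom, the path-based variant, since Thomas has a file on disk; this applies to the desktop CLR under discussion here):

```csharp
using System;
using System.Reflection;

class Inspector
{
    static void Main(string[] args)
    {
        if (args.Length == 0)
        {
            Console.WriteLine("usage: Inspector <assembly-path>");
            return;
        }

        // Load for inspection only - no code from the assembly can execute,
        // and a bitness mismatch (eg. an x86-only EXE loaded into a 64-bit
        // process) does not throw BadImageFormatException.
        Assembly asm = Assembly.ReflectionOnlyLoadFrom(args[0]);

        Console.WriteLine(asm.FullName);
        foreach (Type t in asm.GetTypes())
            Console.WriteLine("  " + t.FullName);
    }
}
```

    Note that GetTypes can still throw if the assembly's references can't be resolved; for robust tooling you'd also handle the ReflectionOnlyAssemblyResolve event.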

    Hope this helps,


  10. Andreas M. says:

    Thank you for this detailed clarification on the subject. You say "The first thing that should usually be done to such programs is to have them opt-into 4GB mode so that they can get a full 4GB of address space when running in the WOW on a 64-bit OS". Can you point me to an MSDN article or something that explains how this is to be done? Thanks in advance!

  11. Andreas,

    The EXE for the process must have the LARGEADDRESSAWARE bit set in its header.  If it's a .NET EXE then you need to use the 'editbin' tool after it's created to set this bit (see here).  It seems the managed compilers have chosen not to make this an option due to concerns over some DLLs you may load that don't support LAA (see here).  I've added these links and softened my guidance above to account for this.  Thanks for bringing this up.


  12. Jesper says:

    why thank you. I just spent hours debugging a crash: a BadImageFormatException in kernelbase.dll apparently. Turns out to be because of your little shortcut to make Edit&Continue work by default on 64-bit machines. One of our libraries had been set to x86 because of it, and our Any CPU process running on a 64-bit machine apparently didn't like that. Setting everything consistently to Any CPU fixed it.

    So, when changing this default, it didn't occur to anyone that some users might not create every project anew in VS2k10, and let its output type stay fixed forever? People importing a project from VS2008 get a nice AnyCPU process. And if you dare change the output type of an executable project to Class Library, you get a x86 .dll. And if you try to load that into your AnyCPU process on a 64-bit machine, hilarity ensues.

    Yes, I can understand the thinking behind this decision, but it seems like no one spent as much as 5 minutes thinking about the negative consequences of this change. I don't know about everyone else, but I prefer a product that doesn't crash over Edit & Continue support.

  13. Jesper, I'm sorry to hear you've had this trouble.  Just to make sure I understand your scenario: you created an EXE project in VS2010 (which is set to x86), and then later changed it to build as a class library instead, but didn't realize that it was building as an x86-only class library?  I agree that's a nasty trap to fall into – it would have been nice if we had thought to add a warning in that case suggesting the user may want to change the platform as well.

    That said, this is far from an issue of "a product that crashes vs. Edit & Continue support".  As I mention above, by far the strongest reason for this change is about reliability of testing and consistent behavior.  I'd bet you'd prefer having your application crash reliably and quickly whenever you test it on a 64-bit machine, rather than having one that seems to work OK but occasionally causes weird problems for your customers on 64-bit machines.  The bottom line is that if you really want your application to switch automatically between running as a 32-bit and 64-bit process, that's an important decision – and one we think you should make intentionally, with knowledge of the implications it has for your testing and deployment.

  14. I should also add: I think a core part of the problem here is the diagnostics experience for BadImageFormatException.  Would it have helped if you got at least a more precise error message here?  Something like "BadImageFormatException: Attempt to load an x86-only DLL 'pathfoo.dll' into a 64-bit process"?  That would be a GREAT feature suggestion for the CLR, and I'd be happy to talk to some people here to see if we can get that done for CLR 4.5.

  15. SamC says:

    I'm staggered at the attitude of "let's not do anything".

    1. Test and fix issues. Software always has bugs anyway. Test on a 64-bit platform. Everyone has one now.

    2. Who cares that you lose a few percent performance? > 2GB memory available for processes on 64-bit. More and more applications will NEED this.

    3. People like you are the REASON the tools haven't been made yet. Microsoft are using YOUR BLOG as an example of reason to JUSTIFY not doing anything. See Microsoft Connect ID 431200.

    Your points are irrelevant to future (and some current) application requirements.

  16. Simon Waterer says:

    I'll just share an issue I had upgrading a VS2008 project to VS2010 relating to AnyCPU EXEs and COM interop assemblies.

    The project was created in VS2008 as an AnyCPU exe and referenced the "Windows Script Host Object Model" COM library, which causes the Interop.IWshRuntimeLibrary.dll assembly to be generated at compile time. With VS2008 all was well running on both x86 and x64 platforms.

    After upgrading the project to VS2010 the project appeared to remain targeted to the AnyCPU platform but it would encounter a BadImageFormatException when run on x64 platforms at the point of loading the Interop.IWshRuntimeLibrary.dll assembly. The exception did not occur on x86 platforms.

    I eventually tracked down the problem by creating a new project afresh in VS2010 and diffing the .csproj files until I found that the freshly created VS2010 project file had a <PlatformTarget>AnyCPU</PlatformTarget> element present within the <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">…</PropertyGroup> and <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">…</PropertyGroup> elements, whereas the project upgraded from VS2008 to VS2010 did not define the <PlatformTarget>AnyCPU</PlatformTarget> element at all.

    I've been able to reproduce this issue several times, every time it appears that the VS2008 to VS2010 upgrade does not place the <PlatformTarget>AnyCPU</PlatformTarget> element in the .csproj file.

    I'm guessing here that the BadImageFormatException arose because the C# compilation target in MSBuild uses the Platform property (which was correctly left at AnyCPU during the upgrade) as the platform target, whereas the COM reference generation target in MSBuild uses the PlatformTarget property, and since it was not defined after the upgrade it defaulted to x86. There was no problem earlier with VS2008 because it would have defaulted to AnyCPU.

    What misled me about this was that the VS2010 GUI was displaying Any CPU in the dropdown for Platform target under the project property pages. It wasn't until I changed it to x86 or x64 (i.e. something other than Any CPU), saved the project, then changed it back to Any CPU, and saved the project again that the <PlatformTarget…/> element appeared in the project file.
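    For reference, the element Simon describes looks like this once it is present (a trimmed fragment of a .csproj, not a complete project file; other settings elided):

```xml
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
  <PlatformTarget>AnyCPU</PlatformTarget>
  <!-- ...other Debug settings... -->
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <PlatformTarget>AnyCPU</PlatformTarget>
  <!-- ...other Release settings... -->
</PropertyGroup>
```

    Adding <PlatformTarget>AnyCPU</PlatformTarget> to both configurations of an upgraded project should give the COM reference generation target the same answer the C# compilation target was already getting from the Platform property.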

  17. Simon Cooper says:

    For those interested, I've figured out how you can change it back –…/94124.aspx

  18. Ethan says:

    This breaks SharePoint development, since all SharePoint 2010 bits are built 64-bit.
