Visual Studio: Why is there no 64 bit version? (yet)


Disclaimer: This is yet another of my trademarked “approximately correct” discussions


From time to time customers or partners ask me about our plans to create a 64 bit version of Visual Studio. When is it coming? Why aren’t we making it a priority? Haven’t we noticed that 64 bit PCs are very popular? Things like that. We just had an internal discussion about “the 64 bit issue” and so I thought I would elaborate a bit on that discussion for the blog-o-sphere.


So why not 64 bit right away?


Well, there are several concerns with such an endeavor.


First, from a performance perspective the pointers get larger, so data structures get larger, and the processor cache stays the same size. That basically results in a raw speed hit (your mileage may vary).  So you start in a hole and you have to dig yourself out of that hole by using the extra memory above 4G to your advantage.  In Visual Studio this can happen in some large solutions but I think a preferable thing to do is to just use less memory in the first place.  Many of VS’s algorithms are amenable to this.  Here’s an old article that discusses the performance issues at some length: http://blogs.msdn.com/joshwil/archive/2006/07/18/670090.aspx
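
To make the pointer-growth point concrete, here is a tiny C++ sketch (illustrative only, not actual Visual Studio code) of how a pointer-heavy node grows when you simply recompile it for 64 bit:

    #include <cstdio>

    struct SymbolNode {
        SymbolNode* parent;       // 4 bytes in an x86 build, 8 in an x64 build
        SymbolNode* firstChild;
        SymbolNode* nextSibling;
        const char* name;
        unsigned    flags;        // stays 4 bytes either way
    };

    int main() {
        // x86: 4 pointers * 4 + 4 = 20 bytes; x64: 4 pointers * 8 + 4 (+ padding) = 40 bytes.
        // Same data, roughly double the footprint, and the processor cache did not get any bigger.
        std::printf("sizeof(SymbolNode) = %u bytes\n", (unsigned)sizeof(SymbolNode));
        return 0;
    }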


Secondly, from a cost perspective, probably the shortest path to porting Visual Studio to 64 bit is to port most of it to managed code incrementally and then port the rest.  The cost of a full port of that much native code is going to be quite high and of course all known extensions would break and we’d basically have to create a 64 bit ecosystem pretty much like you do for drivers.  Ouch.


[Clarification 6/11/09: The issue is this:  If all you wanted to do was move the code to 64 bit then yes the shortest path is to do a direct port.  But that’s never the case.  In practice porting has an opportunity cost; it competes with other desires.  So what happens is more like this:  you get teams that have C++ code written for 32 bits and they say “I want to write feature X, if I port to managed I can do feature X plus other things more easily, that seems like a good investment” so they go to managed code for other reasons.  But now they also have a path to 64 bit.  What’s happening in practice is that more and more of Visual Studio is becoming managed for reasons unrelated to bitness. Hence a sort of net-hybrid porting strategy over time.]


So, all things considered, my feeling is that the best place to run VS for this generation is in the 32 bit emulation mode of a 64 bit operating system; this doubles your available address space without taking the data-space hit and it gives you extra benefits associated with that 64 bit OS.  More on those benefits later.


Having said that, I know there are customers that would benefit from a 64 bit version but I actually think that amount of effort would be better spent in reducing the memory footprint of the IDE’s existing structures rather than doing a port.  There are many tradeoffs here and  the opportunity cost of the port is high.


Is it expensive because the code is old and of poor quality?


It’s not so much about the quality of the code – a lot of it is only a few releases old – as it is about the amount of code involved.  Visual Studio is huge and most of its packages wouldn’t benefit from 64 bit addressing, but nearly all of it would benefit from using more lazy algorithms – the tendency to load too much about the current solution is a general problem which results in slowness even when there is enough memory to do the necessary work.  Adding more memory to facilitate doing even more work that we shouldn’t be doing in the first place tends to incent the wrong behavior.  I want to load less, not more.


Doesn’t being a 64 bit application save you all kinds of page faults and so forth?


A 64 bit address space for the process isn’t going to help you with page faults except in maybe indirect ways, and it will definitely hurt you in direct ways because your data is bigger.  In contrast a 64 bit operating system could help you a lot!  If you’re running as a 32 bit app on a 64 bit OS then you get all of the 4G address space and all of that could be backed by physical memory (if you have the RAM) even without you using 64 bit pointers yourself.   You’ll see potentially huge improvements related to the size of the disk cache (not in your address space) and the fact that your working set won’t need to be eroded in favor of other processes as much.  Transient components and data (like C++ compilers and their big .pch files) stay cached  in physical memory, but not in your address space.  32 bit processes accrue all these benefits just as surely as 64 bit ones.
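
One nuance worth calling out: a 32 bit process only gets the full 4G of user-mode address space under a 64 bit OS if its image is linked with /LARGEADDRESSAWARE; otherwise it stays capped at 2G. Here is a minimal C++ sketch that simply asks Windows how much user-mode address space the current process actually has:

    #include <windows.h>
    #include <cstdio>

    int main() {
        MEMORYSTATUSEX ms = { sizeof(ms) };
        GlobalMemoryStatusEx(&ms);
        // Under WOW64 this reports roughly 4096 MB for a /LARGEADDRESSAWARE 32 bit image
        // and roughly 2048 MB for one built without the flag.
        std::printf("user-mode address space: %llu MB\n",
                    ms.ullTotalVirtual / (1024 * 1024));
        return 0;
    }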


In fact, the only direct benefit you get from having more address space for your process is that you can allocate more total memory, but if we’re talking about scenarios that already fit in 4G then making the pointers bigger could cause them to not fit and certainly will make them take more memory, never less.  If you don’t have abundant memory, that growth might make you page, and even if you do have the memory it will certainly make you miss the cache more often.  Remember, the cache size does not grow in 64 bit mode but your data structures do.  Where you might get savings is if the bigger address space allowed you to have less fragmentation and more sharing.  But Vista+ auto-relocates images efficiently anyway for other reasons so this is less of a win.  You might also get benefits if the 64 bit instruction set is especially good for your application (e.g. if you do a ton of 64 bit math).


So, the only way you’re going to see serious benefits is if you have scenarios that simply will not fit into 4G at all.  But, in Visual Studio anyway, when we don’t fit into 4G of memory I have never once found myself thinking “wow, System X needs more address space”; I always think “wow, System X needs to go on a diet.”


Your mileage may vary and you can of course imagine certain VS packages (such as a hypothetical data analytics debugging system) that might require staggering amounts of memory, but those should be handled as special cases. And it is possible for us to do a hybrid plan that includes some 64 bit slave processes.
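
To sketch what such a hybrid might look like (the names here are made up, this is not a description of any actual VS component): a 32 bit front end launches a separate 64 bit worker for the memory-hungry part and talks to it over ordinary IPC.

    #include <windows.h>

    int main() {
        STARTUPINFOA si = { sizeof(si) };
        PROCESS_INFORMATION pi = {};
        // "AnalyticsWorker64.exe" is a hypothetical name for the 64 bit slave process;
        // the real work and the real IPC mechanism are left out of this sketch.
        char cmd[] = "AnalyticsWorker64.exe --pipe vs_debug_session";
        if (CreateProcessA(NULL, cmd, NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi)) {
            WaitForSingleObject(pi.hProcess, INFINITE);  // the 32 bit host just waits here
            CloseHandle(pi.hThread);
            CloseHandle(pi.hProcess);
        }
        return 0;
    }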


I do think we might seem less cool because we’re 32 bit only but I think the right way to fight that battle is with good information, and a great product.


Then why did Office make the decision to go 64 bit?


This section is entirely recreational speculation because I didn’t ask them (though frankly I should). But I think I can guess why. Maybe a kind reader can tell me how wrong I am 🙂


First, some of the hardest porting issues aren’t about getting the code to run properly but are about making sure that the file formats the new code generates remain compatible with previous (and future) versions of those formats. Remember, the ported code now thinks it has 64 bit offsets in some data structures.  That compatibility could be expensive to achieve because these things find their way into subtle places – potentially any binary file format could have pointer-size issues. However, Office already did a pass on all its file formats to standardize them on compressed XML, so they cannot possibly have embedded pointers anymore. That’s a nice cost saver on the road to 64 bit products.
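
Here’s a hypothetical illustration of that hazard (not an actual Office or VS format): a record that some old code dumped to disk as raw memory changes size the moment you rebuild for 64 bit.

    #include <cstdio>

    // Hypothetical on-disk record written as raw memory.
    struct IndexRecord {
        size_t payloadOffset;   // pointer-sized: 4 bytes on x86, 8 on x64
        int    payloadLength;
    };

    int main() {
        // 8 bytes in a 32 bit build, 16 in a 64 bit build (with padding), so files
        // written by one build no longer round-trip through the other.  The fix
        // during a port is to pin the width explicitly (e.g. a 4 byte offset field).
        std::printf("on-disk record size: %u bytes\n", (unsigned)sizeof(IndexRecord));
        return 0;
    }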


Secondly, on the benefit side, there are customers out there that would love to load enormous datasets into Excel or Access and process them interactively. Now in Visual Studio I can look you in the face and say “even if your solution has more than 4G of files I shouldn’t have to load it all for you to build and refactor it” but that’s a much harder argument to make for say Excel.


In Visual Studio if you needed to do a new feature like debugging of a giant analytics system that used a lot of memory I would say “make that analytics debugging package 64 bit, the rest can stay the way they are” but porting say half of Excel to 64 bits isn’t exactly practical.


So the Office folks have different motivations and costs and therefore came to different conclusions — the above are just my personal uninformed guesses as to why that might be the case.


One thing is for sure though: I definitely think that the benefits of the 64 bit operating system are huge for everyone. Even if it were nothing more than using all that extra memory as a giant disk cache, just that can be fabulous, and you get a lot more than that!

Comments (78)

  1. You may have noticed that there is not a native 64-bit version of Visual Studio 2010. With all of the

  2. Thanks a lot for this great explanation Rico… this was a common question I think and your explanation is clear.

    However, I think that many of us are waiting for a 64bit version of VS in the future…

  3. ShuggyCoUk says:

    You could do with making some of the existing slave processes 64bit. fsi.exe is one example. You can use corflags to patch it, just ignore the warning about the stripping of the strong name since it still works despite this. (though if you strip the strong name by hand then it *does* break it)

  4. The pointer size increase is a bane.  However, the 64-bit ISA’s advantages (larger and more registers) can cause a significant performance boost.  An obvious compromise would be to use just 4GB of address space in a 64-bit process and simply make do with 32-bit pointers.  Although code-size would be slightly larger due to instruction size issues, this looks like an almost pure win, otherwise (it’d be nice to have this option for other programs too).

    With fancier tricks, some people seem to think that it’s possible to use 32-bit pointers even in 64-bit Java programs that use more than 4GB of memory:

    http://docs.thinkfree.com/docs/view.php?dsn=855108

  5. ricom says:

    Back when I used to do mixed model tricks with 16 bit pointers addressing 64k segments we had interesting compiler magic that would let you use more than one pointer size, sort of.  Far and near pointers, based pointers.  You could imagine that world all over again with 64 bit — 32 bit ‘near’ pointers in a 4G segment.  I guess it could work but boy it was no fun then.  Maybe some special cases or something.

    More registers can make a difference but as I was saying the other day, the regular 32 bit instruction set actually has a lot more registers than you might think.  With L1 being what it is, [EBP+4], [EBP+8], and [EBP+12] are all nearly as good as registers.

    Interestingly, when we changed our code-gen patterns in VS2008 so that EBP was no longer used as a general purpose register, thereby reducing the actual number of registers by 1 (which is a big deal if you consider that this takes you from 7 to 6, depending on how you count it; that’s a significant percentage), there was actually no difference.

    Sometimes there is, but really with out-of-order execution, register renaming, and a good L1 cache it isn’t nearly as bad as it used to be.

  6. Rik Hemsley says:

    "That basically results in a raw speed hit (your mileage may vary)"

    Have you measured the performance hit? What was it?

  7. Fred Morrison says:

    Makes me long for the days when 640KB was the memory limit.  Or maybe we should go back to punched cards.  On second thought, we’d just end up debating the merits of 90-column round-hole cards vs. 80-column rectangle-hole cards. 🙂

  8. Jeronimo says:

    Everyone is saying "Go 64bit!". I have listened to them and now I have driver problems, compatibility problems etc. If you have 64bit OS then you have to have 64bit (native) tools, native drivers etc. Plain and simple as that.

  9. Jason Haley says:

    Interesting Finds: June 11, 2009

  10. Ooh says:

    Maybe Visual Studio doesn’t need to be 64-bit — but please give us an environment that brings parity for 64-bit and 32-bit development!

    So for example bring the improvements made to the 32-bit-JIT-compiler in 3.5 SP1 to 64-bit. And bring Historical Debugging to 64-bit…

  11. Apart from memory there is another reason to want a 64 bits VS IDE: it’s getting WOW out of the way in VS extensions.

    One disadvantage from using 32 bits Visual Studio on a 64 bits OS is that VS extensions also have to run on the 32 bits WOW layer.

    WOW gets in the way when you want to automate development tasks against 64 bits server products that run on your development box – e.g. SharePoint – from within Visual Studio.

    E.g. when you want to access the registry you get the ‘wrong’ one (32 bits). There are also differences in program files and GAC.

    We encounter this problem in our Factory Guide VS extension for our "Macaw Solutions Factory".

    So the lack of a 64 bits VS IDE is hindering integration when developing for 64 bits MS server products.
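
    One concrete workaround for the registry redirection mentioned above: a 32 bit process can explicitly ask for the 64 bit view of a key. A minimal C++ sketch (the key path is hypothetical):

        #include <windows.h>
        #include <cstdio>

        int main() {
            HKEY key;
            // KEY_WOW64_64KEY requests the native 64 bit registry view even from a
            // 32 bit process; the key path below is just a placeholder.
            LONG rc = RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                                    "SOFTWARE\\ExampleServerProduct",
                                    0,
                                    KEY_READ | KEY_WOW64_64KEY,
                                    &key);
            if (rc == ERROR_SUCCESS) {
                std::printf("opened the 64 bit view of the key\n");
                RegCloseKey(key);
            }
            return 0;
        }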

  12. Our Visual Studio Chief Architect, Rico Mariani, wrote up a great blog about why we haven’t moved to

  13. Greg Low says:

    I see two issues with this argument:

    1. Things don’t work the same in WOW. For example, much noise was made about Edit and Continue for C# (and already for VB) in recent versions. It doesn’t work in WOW.

    2. Lots of other apps have been built to run in the VS shell under the VSIP program. A good example is the SQL toolset. Can you really see SQL Server Management Studio staying as a 32 bit tool for many years to come?

  14. ricom says:

    Porting the VS shell to 64 bit would be a much more interesting project.  Many people could use it (like SSMS) and it’s much less code.

    Could be a great incremental step.

    Some folks pointed out that *runtime* support for 64 bit isn’t as good as it could/should be (e.g. full edit and continue support is desired).  I totally agree, but of course that isn’t because the IDE itself isn’t 64 bit.

  15. ricom says:

    @vincent:

    For VS extensions that really need to be 64 bit because of (e.g.) 64 bit services they use we have been recommending that the extension be split into a 64 bit part that does the heavy lifting and a 32 bit part that hooks into Visual Studio.  This is the strategy for integrating with 64 bit Office for us.  This actually has many architectural benefits as well but it is more work.  Sorry 🙁

  16. Ray says:

    Don’t care much about a ‘true’ 64 bit VS.NET, but how about fixing Edit and Continue for 64 bit managed applications? Cheating your way out of the problem by making the default project suddenly x86 in vs.net 2010 is just a really, really poor and deceiving way to hide the failure that, after 5 years since we got E&C, you are still either unwilling or unable to make it work on 64 bit.

  17. Rick Byers says:

    Over the past few months I’ve had some interesting debates with folks here (and some customers) about

  18. ricom says:

    I hear that loud and clear.

  19. Keith Farmer says:

    E&C under x64 is a red herring.  If you’re editing code while running, you’re invalidating the run itself.  You may come back to me and say "Oh, but we need to do end-to-end testing, and we just had to tweak this little bit…", and I’ll respond "then you’re still developing, and should be isolating that code to be exercised by an inexpensive unit test."

    If there is *any* motivation for an x64 VS, E&C is *not* it.  I wouldn’t even call it a "nice to have".. maybe a "cute to have" at best.

  20. Visual Studio 2010 will ship with a 32bit version, and no 64bit version.  My team built a very smart

  21. I’d much rather have the VS IDE team (and just about every other MS dev team) work on making their tools use multiple procs.  There’s way too much single-threaded execution going on in there.

    I understand this is easiest when writing/moving to managed code (especially with 4.0 improvements) – yet more reason to continue moving in that direction!

  22. Gerhard Junker says:

    Yes, 64bit is sometimes nice to have. VS works nicely on 64 bit systems.  What brings more speed? I think using multi-core CPUs will be a great benefit. Building a solution – running on 4 cores in parallel – should give me a factor of 3. But I have NOT checked whether the compiler etc. will use more cores. Speed up the documentation browser, speed up building the solutions – that’s a help for me.

  23. A performance hit?  Seems unlikely given what I’ve seen in practice.

  24. This makes perfect sense to me. From the perspective of cramming as much code into the CPU cache as possible, 32-bit code is the better optimization, and 64-bit is the wrong implementation choice.

    And yet I encounter people asking "When will we get 128-bit?", despite 64-bit addressing an astronomically bigger range than would be needed to address every bit of storage in the world. When you have that much storage, use URLs instead.

    Ironically the one program I use regularly that would benefit (Pro Tools with plugins loading multi-GBs of drum samples into memory) is unfortunately 32-bit for the foreseeable future.

    And yet I have a 64-bit version of Notepad! Should come in handy if a readme.txt file happens to be 10^19 bytes long.

  25. Lee says:

    Personally, I don’t care whether VS or any other program is 32- or 64-bit as long as they work on my system.

    @Daniel Earwicker: Nevermind 64-bit Pro Tools. Is there ANY audio interface with USABLE 64-bit Vista drivers? Maybe by the time Windows 8 hits the streets! LOL

  26. This week on Channel 9, Brian and Dan cover the top developer news, including: – XNA Game Studio 3.1

  27. Jalf says:

    Wait, so you, of all people, are stating that something is slower without offering benchmark data to support it? 😉

    Of course the larger pointers have a cost, but what about the benefits? More registers have already been mentioned, but there’s also niceties like a faster calling convention (and one which saves a handful of push/pop instructions, meaning the code size may decrease a bit), a more efficient exception handling mechanism (as far as I know. Haven’t studied the details, but it seems like it avoids a lot of potentially expensive setup code for exception handlers), and presumably, not having to go through a WoW emulation layer must save a few cycles too. I’m not saying 64-bit should be expected to be faster by default, but there are a few more factors at play than just "larger pointers and larger address space". Do you have any real-world benchmarking data for this, or is it just a guesstimate that you "start in a hole and have to dig ourself out"?

    Not that I expect it to make a big difference either way in practice of course. You’re right that there doesn’t seem to be much direct benefit for VS.

  28. Doug Gale says:

    Forget 64-bit, I’d rather you do something that will actually help productivity…

    The default project settings are utterly terrible. Ever setup a project consisting of an exe and a couple of DLLs with 32-bit and 64-bit targets, and tried to separate the object files from the output files? I generally spend several hours tediously digging through the poorly implemented project settings dialog. Sometimes I resort to opening up the vsproj files and tediously (and dangerously) editing the settings there by performing search-and-replace operations.

    Please, please, please, implement a way to setup "solution" default "project settings", and allow the settings to get inherited by the actual projects. This way, you can actually leverage the $(IntDir), $(OutDir), etc. settings throughout the project.

    For example, say my $(IntDir) is "$(SolutionDir)obj$(ProjectName)$(ConfigurationName)$(PlatformName)", allow the "solution default" setting to inherit its way into the project, and perform the expansion at the last moment. I can’t tell you how much time that would have saved me over the years.

    That’s my only issue with VS. Besides this one big complaint, I love VS :-).

  29. ricom says:

    @Jalf:

    Actually, it’s a semi-political decision.  I have assorted data and experience that tells me how much slower it usually is but I thought if I posted it then it would create a storm of "no it’s not 1.x slower it’s only 1.y slower" followed by "no no it’s actually 2.x", and that isn’t the point.

    Ultimately any measurements I have aren’t generally useful to others anyway because YMMV so I just went for vagueness.

    Cowardly I know.

    Sorry 🙂

  30. Jalf says:

    Hehe, no prob, just couldn’t resist pointing it out. 🙂

    Again, I doubt it’d make a noticeable difference either way. I wouldn’t notice VS becoming 3% faster, and 3% slower wouldn’t really bother me either.

    Although I’d be curious to see a rough "common case" percentage based on your data. I promise not to argue about it. 😉

    @Doug: You know about property sheets, right? They let you specify settings once, that your project can inherit, so that solves part of the problem. Of course the default projects still override virtually everything, so you still have to go through every project you create and manually set everything back to "inherit". Of course, if the project wizards weren’t this horribly jumbled mess of undocumented javascript, it’d be easy enough to just create new project types ourselves, with sane default settings.

    This probably isn’t Rico’s responsibility though. But if you do come across someone who’s involved in this stuff, hit them over the head from us… 🙂

  31. Vijay says:

    Microsoft disappoints yet again. I always believed their dev tools were way ahead of the Java ecosystem. But VS 2010 is not going to have it? I have enjoyed using VS2008.

    The lack of future vision is all the more reason to avoid an MS-based ecosystem. It’s falling apart, little piece by piece. I have seen the other side of the fence (Python, Ruby, Php, Linux) and the programmer’s investment of time and energy looks far, far more worthwhile.

    Not that Vista 64-bit is great – it’s a prettier skin (yes I value that) on top of XP.

  32. Anders says:

    Forget 64-bit version, Visual Studio is something of a disgrace in the IDE line-up

    off the top of my head it’s missing

    * decent performance on a small (600KLOC) project

       * decent build times

       * decent debugger speed (switching between the  dev and debug layouts is painful)

       * decent unit testing speed (again launching is very slow)

    * decent code navigation features (open type, type hierarchy, call hierarchy, mark occurrences etc..)

    * decent refactoring support

    * language interoperability (c# refactoring forgets about f# projects in the solution)

    this isn’t intended as a rant though it may look like one – my point is that worrying about 64-bit versions of VS is entirely beside the point – it hasn’t even reached mediocrity in 32-bit space yet :/

  33. secretGeek says:

    Why not just switch VS to emulate a harvard style architecture with separate code and data address spaces and then continue using 32 bit for code, but allow 64 bit addressing for data. Sheesh, this is easy! You guys must be simple morons!

    (I’m only kidding… that was my attempt at being an undergrad idiot)

    Excellent write up, thank you! Very good to read such a frank and interesting discussion!

    lb

  34. Mitch says:

    I suspect the answer is easy – cost.  VS has been around at least since 1997, and now there are 3000 people involved with its development and several million lines of source code.  The cost to properly port versus the value it would provide makes it a no-brainer.

  35. Keith says:

    I’ve been writing software since 1982 on everything from TRS-DOS to CP/M to Windows to AS-400.  I remember when the 16bit/32bit Visual Basic 4.0 hybrid came out and all the issues around that.

    I suggest a couple of things (MSFT: I hope you’re listening):

    1) Do stabilize what you currently have and stop confusing us as to "which" data access technology to use.

    2) Annoyances such as linking the SSRS report builder IDE to a specific version of VS.NET and SQL Server are stupid!!!!!  We’ve got VS2008 apps out there, but have to have VS2005 installed as well in order to work on our reports that pull data from sql2005. (If the report builder is a true add-in, then it should work for vs2008 and 2010.)

    3) I can certainly justify the need for a 64-bit version of Visual Studio.  We (developers) don’t need our hands held. Get a 64-bit version out there.  For goodness sakes, you all have 42,000 different Windows Vistas out there (business, home, home premium, enterprise), why not a 32-bit VS and a 64-bit VS?

    Focus more on simplifying things for us and your other customers.  We don’t need 42,000 different Windows SKUs.  Keep it simple.  (Windows 7 [or whatever the heck you decide to name it])

    You can include all of the nice bells and whistles for home users on the DVD to be installed later or by the PC vendor.

  36. gOODiDEA.NET says:

    Web How to Easily Create a JavaScript Framework, Part 1 – Part 2 Asynchronous innerHTML 10 HTML Tag Crimes

  37. guy says:

    Wow.  Apple managed to port Xcode to four architectures, two of which are 64 bit, and make it 100% garbage collected.  Yes, Xcode is a lot smaller than Visual Studio – but Visual Studio has a hell of a lot more people working on it too.

    It’s pretty pathetic that Microsoft can’t do this.

  38. Zephiris says:

    Microsoft generating FUD about 64-bit…amazing.

    Microsoft’s own documentation points out that, without LARGEADDRESSAWARE (and the proper programming), user apps are limited to 2GB. And beyond the 4GB, it also applies to any virtual addressing.

    x86_64 has, in effect, three times as many free GP registers: 15 vs. 5 for x86.

    Larger pointers don’t automagically mean all code size is doubled.

    Considering Visual Studio’s track record, limitations, and that it’s playing catch up to NetBeans and the like now as far as ease of use and features, 64-bit should be extremely basic, and built in to VS2010. Especially when porting for 64-bit, generally means "follow standard code practices, including Microsoft guidelines, that have been out there since the late 1990s".

    If Microsoft can’t ‘be bothered’ to get it right and fully supported for their own development tools, five years after the wide availability of user-end 64-bit processors, and four years after the release of their own compatible version of XP, is it any wonder that the Windows Ecosystem has extremely few 64-bit programs in general, while the vast majority of programs (including major ones) for other operating systems have been compatible for years?

  39. Jalf says:

    @Zephiris: The number of registers or changes to code/data size doesn’t really matter as much as overall performance. If the code becomes slower from being ported to 64-bit, it doesn’t really matter how many GP registers are available.

    Apart from that, I think you do have a valid point with the last bit. From a political/marketing/ecosystem point of view, I agree it’d be a good signal to send if Microsoft went that extra mile and made sure their *own* software was available in 64-bit versions. Not because I desperately need a 64-bit version (like I said above, I don’t think it makes much difference), but because it’s hard to take Microsoft’s guidelines seriously when it’s so painfully clear that most Microsoft software ignores them.

    If there’s one thing that characterizes the Windows software ecosystem, it is "Microsoft doesn’t care about following rules or guidelines, so why should we?"

    But that’s more a political/strategic decision. As far as I understand it, Rico’s job at the moment is simply to make VS go fast. (And that means he has his hands full already ;))

  40. ricom says:

    Be careful reading too much into what I say, or generalizing it.  I don’t engage in FUD. This isn’t about creating uncertainty; it’s about explaining the reasons behind one particular choice, thereby hopefully helping others to make informed choices themselves.

    I’ve been very consistent for a half decade on this point: The benefit of large address space is just that, large address space.  One should not assume that it comes with other free benefits.  One should also not assume that it comes at no cost.

    There are many applications I can think of that would benefit from being 64 bit — I use quite a few of them.  However that is not a universal situation.

    So, like every other engineering decision, you should go into it carefully.

    There is nothing nefarious here. I’m far too busy to be nefarious because, as Jalf pointed out, my job right now is to make VS go as fast as possible.  I’m very busy 🙂

  41. Nick says:

    Microsoft’s ongoing lack of support for x64 while they increase the minimum requirements for their software is an embarrassment. While there is certainly a performance hit when going from 32bit to 64bit, I find there’s more of a performance hit when limited to 3-odd GB of RAM by an operating system that really requires at least 2GB to boot itself, never mind running a game or something like that.

  42. Jalf says:

    @Nick: Then use 64-bit Windows. Problem solved.

  43. Jay says:

    @Vijay – If you think Vista is just XP with a prettier skin you have no understanding of either OS.

  44. Tom says:

    I have to agree with some of the other readers – I do think that Microsoft deserves some criticism in not fully following their own guidance.  I realize there is no business case to be made for porting Visual Studio or Office to 64-bit.  But there is immense value in doing so anyway to provide LEADERSHIP in terms of embracing current technologies.

    For example, suppose I’m an architect trying to sell upper management on the idea of embracing a current Microsoft technology, such as .NET or 64-bit.  You know what ends that conversation?  When management asks, "Does Microsoft use <current MS technology> in <major MS product>?" the conversation is pretty much over and I’ve lost before I even started when I have to answer "no."

    The fact that Microsoft has neither fully embraced .NET nor 64-bit for its flagship products sends a clear signal about the perceived value/risks/etc. of development on those platforms, plain and simple.

    I consider myself a "Microsoft advocate" in general, and I personally specify Microsoft technologies – I just wish Microsoft would fully put itself behind the development methodologies it recommends its customers use.

  45. Owen says:

    You might want to have a talk with whoever does the feed Visual Studio uses for news items.  The title of the article according to that feed is "Why There is 64 Bit Version of Visual Studio".

    Does this suggest some poor sap has to manually type entries into that feed?

  46. Tom says:

    I just read Dustin Campbell’s comment on Somasegar’s blog stating that the default project type for .NET apps has been changed from "Any CPU" to "x86" by default!  Yikes – talk about a step in the wrong direction!

    That means, by default, .NET apps running on 64-bit Windows will run as 32-bit apps.  Yuck!

    From all external signs, it would seem that Microsoft has completely thrown in the towel on 64-bit application development!

  47. Bruce says:

    Rico, I think you’ve touched upon what I consider a raw nerve. There is so much development going on without consideration of the Engineering side of design. Case in point is the request to port software X to 64 bit just to be in a "Pure" 64 bit environment, without thinking about what it means for the app in question.

    Having said that, Visual Studio isn’t just a product as it is; rather it’s the backbone of the VSIP ecosystem (for good or for worse). Going 64 bit would demand more L1, L2 and pointer memory; but isn’t that a problem that processor manufacturers are happy to solve with more cache, as we’ve seen recently?

    This fork in the road provides you with a fantastic opportunity to break cleanly with 32 bit code and optimize the whole environment for 64bit; while also implementing the lazy loading as a service pack sometime down the line. Seeing that VS will now use WPF as its rendering pipeline, taking the effort to migrate an increasing number of components to managed code would probably be the best direction in my ignorant opinion.

    That said, thanks for trying to make VS faster. I can’t appreciate that enough.

  48. Greg says:

    We would be more interested in improvements to 64 bit machine code generation in terms of speed (first priority) and code size (second priority).  The VS environment comes a distant third.

  49. Theo Zographos says:

    I think MS should be bothered with fixing the (many) bugs in VS2008 rather than porting to 64-bit.

    Rewriting parts of VS2010 in WPF is a step forward. We would expect most of it to be written in WPF.

    The major issue however, is not VS; it’s .NET 4.0 and what it has to contribute to the software industry in the year 2010.

  50. Derek says:

    I’m not really one to demand a 64-bit IDE just for sake of it.  I have noticed that depending on my development platform, I have various features that are unavailable to me.  First, I tried Server2003 and many of the debug windows just disappeared out of the app (Exceptions dialog, immediate window, and processes window all come to mind).  Those windows suddenly reappeared when I reinstalled using XP64, but I still can’t edit and continue.  And in moving from the server to XP I’ve lost the ability to install SQL2005-64bit.

    So, I really don’t care if the IDE itself is 32bit, 64bit, or "some prime number" bit.  What I really want is for my choice of a 64-bit development platform to be fully supported.

  51. Evgeny says:

    The real issue for me is not performance, it’s compatibility. When you use a 64-bit OS running a 32-bit development environment is a pain. Sure, it SHOULD all "just work" – but it never does.

    One specific issue I can think of is that when I started my program under the debugger it worked, but when I started it without debugging it failed. Can you guess why? I eventually figured it out, but it wasted a lot of time. Even after I figured it out it continued to waste time, because I kept forgetting about it and the symptoms may be different enough from time to time that I don’t realise it’s the same problem.

    Another, related issue is that post-build steps which registered type libraries failed to work. That is, they didn’t return an error, but the type library just wasn’t registered. The underlying cause was the same: 32-bit registry redirection. I couldn’t just call the 64-bit version of regtlib or regasm, either (I don’t remember now which one of those it was), because it kindly ran the 32-bit version for me when started from a 32-bit process. So I had to write my own 64-bit EXE, which would call it and call that from the post-build step. That is an absolutely ridiculous hoop to have to jump through!

    There were some other issues with the GAC and COM+ as well, but I don’t remember the details now, because I went back to 32-bit OS. I do remember that there were quite a few things which should have worked in theory, but didn’t in practice.

    Mixed-mode debugging is another thing that was sorely lacking from VS 2008. At least that is (apparently) in 2010, so we can finally debug properly.

    Microsoft, we DO need a 64-bit IDE to run on a 64-bit OS, performance issues be damned! Even if it’s twice as slow it’s still better than not having that option available at all.

  52. Greg says:

    It’s not entirely MS’s fault on the 32 vs 64 bit since MS has had 64 bit versions of the OS back for the DEC Alpha CPU over 10 years ago.  Dec Alpha failed to gain market share, thus the 64 bit OS and dev tools market dried up.  It wasn’t until quite recently (3 years) that INTEL/AMD had a new ‘standard’ for 64 bit extensions to x86 for Microsoft to target.

    MS tools and end user products target their end users’ desktop architecture, which is 32 bit for almost all users.  The server-side push to 64 bit only (e.g., Exchange was the first 64-bit-only product) has had a moderate effect in pushing 64 bit into the back office server machine.

    When the majority or a sizable amount, such as 33%, of new desktops/laptops have 64 bit is when we will see MS push out 64 bit for most end user and developer tool applications.

  53. Shane says:

    I frankly do not really see why it’s such a necessity that debug interfaces force threads to be 64bit so that they can debug a 64bit target.  If debugging is available over a firewire or serial port for the kernel, why not have the same interface at least exposed over the debug port?

    If I know it’s a 64 bit application which I am debugging, it’s a lot less painful to un/pack sets of 64 bit values than to write an entire 64bit application of my own, for the explicit purpose of debugging another 64 bit application.

    Debugging 64 bit is much more of a road block to development in the 64 bit space than I think most give credit, especially when considering the need to deliver a compatible option for consumers, who expect a 64 bit clean version of your tool/plugin/etc… saving the hassles of split registry and filesystem WoW confusion.

    What’s so scary about allowing easy 64 bit debugging, too many bugs in MS bins? 🙂  But really, I’ve heard stories about debugging Windows remotely over NDIS based debug ports, why can’t we use it from the same system with a thread in a WoW?  (oh ya I am guessing it’s got more to do with expectations of debug libs assuming symbols will be where the debugger is instead of the debuggee)…

  54. ben k says:

    Disagree that the larger register sizes in 64 bit are a big help; less than 1% of non-float data types are 64 bit, and even more so for moves and compares you can use the SSE/MMX instructions which can do 256 bits at a go.

    64 bit for apps (except for a few cases like a huge DB) is mainly marketing; this is probably the reason Excel is moving.

  55. Nick says:

    I’m not too fussed about the whole do / don’t port argument. Certainly performance should be a key concern. But there are two functionality issues with the current approach used by VS2008:

    * Edit and continue (as per many comments above, this is annoying)

    * Handling of 32 bit only and / or 64 bit only assemblies when defining references / building.

    On the second point, I’ve had a few cases where I couldn’t get a build to succeed because a dependent assembly was only available in a 64 bit flavour – some bizarre issue with the compiler where it obviously couldn’t see the assembly at some key step.

    Conversely, I recently had an issue with TFS assemblies (which are only available as 32 bit) not being visible to the VS GUI – I had to hand code the references in the project file.

    So Microsoft – do all your cost / benefit analysis on the pros / cons of swallowing the bitter pill to do a full port, but in the mean time, please address these two functionality issues – by whatever means.

  56. Roman says:

    OK, got the story on 64bit, but is there ANY CHANCE of having references to different 3rd party .NET dlls for different platforms? (like it was with C++, where you could have different .libs for x86 and x64 platform configurations)?

    Because right now the only option I have is to change project files manually, adding conditions like this:

    instead of:

        <Reference Include="Leadtools, Version=16.5.0.0, Culture=neutral, PublicKeyToken=9cf889f53ea9b907, processorArchitecture=x86">
          <SpecificVersion>False</SpecificVersion>
          <HintPath>..ThirdPartyLeadTools16_5RedistDotNetWin32Leadtools.dll</HintPath>
        </Reference>

    manually edit to:

        <Reference Include="Leadtools, Version=16.5.0.0, Culture=neutral, PublicKeyToken=9cf889f53ea9b907">
          <SpecificVersion>False</SpecificVersion>
          <HintPath Condition=" '$(Platform)' == 'x64' ">..ThirdPartyLeadTools16_5RedistDotNetx64Leadtools.dll</HintPath>
          <HintPath Condition=" '$(Platform)' == 'AnyCPU' ">..ThirdPartyLeadTools16_5RedistDotNetWin32Leadtools.dll</HintPath>
        </Reference>

  57. Matt Neerincx (MSFT) says:

    In my opinion VS.NET has avoided 64-bit for too long.  Stop making excuses for it.  If VS.NET wants to be a serious developer tool it must support 64-bit and must also be built on 64-bit, otherwise how can I as a developer take it seriously?

    For example if all of the developers of VS.NET are writing 32-bit code and debugging 32-bit code, they are NOT testing and using the 64-bit features of VS.NET.

    VS.NET needs to eat its own dogfood, so to speak; then it will become a more serious development tool for 64-bit developers.

  58. HardwareIsCheap says:

    Stop thunking around and git r done.

  59. Perica says:

    Hi there,

    well this is all fine but what we REALLY need is a unit testing engine for 64 bit. I don’t care if the VS client is 32 or 64 bit as long as everything still works in 64 bit (including unit tests).

    cheers

    Perica

  60. Jason Short says:

    The entire MS stack is going 64 bit.  The fact that I don’t have edit and continue for 64 bit processes seems a huge deal for a lot of people.

    Visit BestBuy or any store, how many machines can you get that are 32 bit?  Not many.  That doesn’t mean you HAVE to make everything 64 bit, but it definitely suggests that furthering 32 bit code is a wasted effort (IMHO).

    I think the real reason is that MS can’t move some stuff to 64 bit.  COM drivers, ActiveX controls, etc.  A lot of users who want to run 64 bit can’t because they chose Access as their database; you can’t load it in a 64 bit app.

    And if running 32 bit is not a big deal, then why is Windows 7 running 32 bit apps as emulated applications?  Why the whole XP VM in the OS?

  61. T.J. says:

    My opinion on the matter is pretty basic:  64 bit processor, 64 bit OS.  Why?  Memory addressing.  It’s really that simple.  I want to use the 6GB of RAM I paid for.  

    With the majority of MS software using 32 bit with a disabled PAE, we are stuck with a memory limitation and stack issues.

    Even open source operating systems have recognized all the problems mentioned above, but have successfully made the transition to 64 bit with their entire application stack – easily maintaining a common codebase for both processor types.

    C’mon guys.  You are supposed to lead the pack, and you are falling behind.

  62. JACK says:

    I DON’T SEE WHEN VS WILL HAVE 64. VS DIE AT 32? TOO YOUNG TO DIE. VS DIE AT 64 MAY BE BETTER.

  63. Srikanth says:

    SharePoint Server 2010 is coming as 64 bit only. If Visual Studio 2010 is not ready for 64 bit, how will developers work on SharePoint Server 2010?

  64. I would like to comment that I have seen vast improvements running VS on my 64 bit Lenovo with 4GB of RAM over my 32bit Dell XPS with 4GB of RAM, but I do have a world of frustration with the whole WOW layer, and for me getting rid of WOW would be the real win.

    I would also like to point out that Visual Studio without a tool like ReSharper is next to useless for true Agile development, and I resent having to buy a 3rd party tool to do what Visual Studio should actually do. I would much rather the Visual Studio team smelt the coffee (hey, put it on my Starbucks tab if it gets me a useful product for my teams) and started making Visual Studio more than an entry level development product. If you take a walk through the ReSharper or DevExpress feature set you see it’s not rocket science and it can’t be hard to do; there must be commercial drivers for doing it. And yeah, I know you have to consider your 3rd party vendors and so on and so forth, but I think you have a bigger responsibility to the community who use and promote your product. My view would be: put your investment into making a better, more usable product; 64bit can wait.

  65. Ed says:

    Since when has Microsoft been concerned about bloatware?  8 bytes per pointer… ha, that ain’t nuttin!  🙂

  66. RS says:

    Excellent explanation, but…

    "The cost of a full port of that much native code is going to be quite high and of course all known extensions would break and we’d basically have to create a 64 bit ecosystem pretty much like you do for drivers.  Ouch."

    Isn’t it ironic that Microsoft won’t do full overhaul due to costs, yet expects the rest of the world to do exactly this?

  67. Dan says:

    We’ve got a 32-bit product that needs to scale up.  We’re currently running out of virtual address space due to the 32-bit limit (no, it doesn’t have memory leaks…) and while it’s of course possible to re-architect it, it would be much more cost effective for us to move it to 64-bit.

    Of course, when I tried to move the build chain onto a 64-bit machine, it broke even when building a 32-bit version.  Sheesh…

  68. p0wer says:

    For me, it’s most important that the IDE works fast and is responsive. I’m not a very advanced developer, my projects never grew over a few dozen files (yet of many different types at once), but I guess the performance impact related to loading TONS of unneeded stuff is visible even in small projects (or especially in small projects, since you’d expect them to load faster). I’d also settle for closer integration of the debugging process, or faster code, or whatever, since this would speed up my development process. When your project hits the 4GB limit then either you’re doing analysis of a year’s worth of data or accounting for huge data flow, or you have some terribly wrong assumptions and algorithms and you should trim the data amount as much as possible. That’s my opinion.

  69. Andy-Pennell says:

    There was, briefly, a real 64-bit version of the VS IDE: DBGW64. It ran native 64-bit on the Itanium, around 2002 as I recall, based on the [pre-Everett version of the] Whidbey codebase. Poor Vadim even ported the Office toolbar code over, which was probably the most painful part of the work. It could edit code and debug (which was its actual purpose). http://blogs.msdn.com/andypennell/archive/2005/02/21/377621.aspx

  70. ashok Grafton WI 53024 says:

    Hi Rico,

    Very convincing explanation..gr8!!

    But will the 32 bit processing speed be the same as 64 bit processing?

  71. Joey says:

    I didn’t know that there wasn’t a 64 Bit version… I guess I have to get two Pc’s now otherwise I can’t game on 32 Bit… No 8 Gigs ram…

  72. wlad says:

    Native 64 bit Itanium Visual Studio was what I sorely missed back in the day. Whidbey betas all ran OK on IA-64, but by RTM time they stopped installing. This made IA64 development pretty much a nightmare (any cross-development and cross-compiling/cross-debugging is a nightmare). Now I do not have to support Itaniums anymore and that is quite a relief. Interesting: what are MS’s plans for Itanium? Will Notepad remain the only GUI application on this platform?

  73. Franklin says:

    Why doesn’t MS give up on VS development and start making extensions for Eclipse?

    It is way better than VS.

    Franklin

  74. David says:

    My scientific research requires managing more than 100GB of memory. I like C# very much, but it seems it is not possible to run my research code in C#.

  75. David says:

    I tested the following C# code with Visual Studio 2008:

    using System;

    namespace CSharp411
    {
        class Program
        {
            static void Main( string[] args )
            {
                int bits = IntPtr.Size * 8;
                Console.WriteLine( "{0}-bit", bits );
                Console.ReadLine();
            }
        }
    }

    The result is "64-bit".

    It seems to me that a C# application written with Visual Studio 2008 can use 64-bit addresses. My scientific research requires more than 100GB of memory (RAM). Will my application still be limited to 4GB of memory? Thanks!

  76. Sandy says:

    Ahhh Rico… I feel for you man. It seems though you got some useful information out of this crapstorm. A 64 bit version of VS would be useful it seems for some of the same reason as a 64 bit version of Word ;-).

  77. aoa says:

    In the open source world, usually the core tools like gcc and the kernel are ported to new architectures FIRST. Then the user applications follow the example.

    Usually the core applications in the Linux world are the ones with the top quality, performance, etc. So they LEAD by EXAMPLE.

    The fact that Visual Studio has not been able to catch up with Windows in the 64 bit arena has much to say about the quality of the MS core dev tools, and that is kind of sad.