"Orcas" .NET Framework compatibility


As you know, work is currently underway in my group on the next version of Visual Studio, which we internally call “Orcas”.

Visual Studio “Orcas” is designed to enable customers to rapidly create connected applications on the latest platforms, with the highest-quality user experience, whether working individually or as part of a team.

We’re shipping the next-generation programming framework, which includes Windows Presentation Foundation (WPF), Windows Communication Foundation (WCF) and Windows Workflow Foundation (WF), in Windows Vista.  We’re also making it available for the Windows XP platform.  Both will be available prior to the release of Visual Studio “Orcas”, so we have been spending a lot of time thinking about how we should service these components.  We want to be genuinely customer-focused about how this impacts you, and to ensure that your experience with the next version of Visual Studio is a seamless one.

Jason Zander wrote a blog post on Framework compatibility last year, in the VS 2005 context; it is still a good read on the subject.

The approach we are taking is aimed at minimizing the impact of delivering new features and functionality: we will service existing runtime components in place and then advance the platform with new assemblies.  Internally we’re referring to this as the “Red and Green” model, which I’ll explain below.

“Red” assemblies include things like WPF, WCF and WF (which will ship as part of Windows Vista) and the .NET Framework 2.0 that shipped with Visual Studio 2005.  Our goal for the “Red” items is to deliver service-pack-like compatibility.  We are going to try very hard to minimize the overall number of changes made in the “red bits” to help reduce churn and achieve very high levels of compatibility.

“Green” assemblies are where we will be introducing new functionality into the platform.  The aim is for any new feature to be additive (for example, new controls) rather than a reimplementation of an existing feature (outside of the servicing I mentioned above).  Finally, we want to ensure that installing any new assembly will not impact an existing application.  For example, for the C# language in Orcas, we will make a small “red bits” change to let you get at the new compiler, and the new compiler itself is then a “green bits” addition.
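
To make the model concrete, here is a minimal sketch of what application code looks like under this split. The specific green-bits names used below (the System.Linq namespace in System.Core.dll) are those that ultimately shipped as part of .NET 3.5 and are shown purely for illustration:

```csharp
// "Red and Green" from the application's point of view.
// Assumes the green-bits names that later shipped in .NET 3.5 (System.Core.dll / System.Linq).
using System;                       // red: existing 2.0 assemblies, serviced in place
using System.Collections.Generic;   // red
using System.Linq;                  // green: new functionality delivered in a new assembly

class RedGreenSketch
{
    static void Main()
    {
        // Existing .NET 2.0 type, behaving exactly as before.
        var numbers = new List<int> { 1, 2, 3, 4, 5 };

        // New compiler features (var, lambdas) plus a new green-bits API (LINQ),
        // layered on top of the old types rather than replacing them.
        int sumOfEvens = numbers.Where(n => n % 2 == 0).Sum();
        Console.WriteLine(sumOfEvens);   // prints 6
    }
}
```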

If you write a component that depends on a “Red” assembly, you can be assured that we’re going to do everything we can to ensure binary compatibility for your code, even as we update the .NET Framework, WPF, WCF and WF going forward.  All the major new features you’ll see from us will appear in the “Green” assemblies, so you can feel comfortable that both your new and old code will work on the target machine.

The “Red and Green” model will help us deliver a great new product while making your adoption experience as easy as we can.  Look for more details on this as we make further progress.

Namaste!

Comments (52)

  1. JohnGalt says:

    There is absolutely NO reason to maintain backwards compatibility with previous versions of .NET in new versions.  First, you’re artificially tying your hands; second, since the frameworks can live side-by-side (except the web.config bug between .NET 2.0 and .NET 1.1 virtual directories), there is absolutely no reason for it. (Ship the version that your application requires with your application!)

    You tried this with .NET 2.0 and failed to the point where you can’t possibly assume that an application will work the same on a machine with .NET 1.1 or .NET 2.0. (i.e. all of the byte manipulation stuff breaks between versions)

    Go build the very best thing you can, and drop all of the crap that isn’t needed and forget about backwards compatibility. This isn’t about storage space anymore. Do it right, and forget MS’s backwards compatible legacy that got it into the security mess.

    Oh and make sure VS.net isn’t betaware when it’s released like Vs.net 2005… (code editing at 1 cpm anyone????)
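
    As a reference point for the side-by-side argument above: a .NET Framework application can already pin the runtime it wants in its app.config, listed in order of preference, so installing a newer framework does not silently move it forward. A minimal sketch (the version strings are the standard 2.0 and 1.1 build numbers):

    ```xml
    <!-- app.config: prefer the .NET 2.0 runtime, fall back to 1.1 if 2.0 is not installed -->
    <configuration>
      <startup>
        <supportedRuntime version="v2.0.50727" />
        <supportedRuntime version="v1.1.4322" />
      </startup>
    </configuration>
    ```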

  2. Don says:

    I have to agree with JohnGalt’s comment.  I thought the whole point of .NET’s versioning system was that you could run side-by-side.  

    You’re already putting wiggle-words into your sentences like "we’re going to do everything we can to ensure binary compatibility for your code."  That’s very definitely not the same as "we will assure binary compatibility."  You already have a versioning system that makes it so you never have to touch a version after release.  Why would you want to mess with that?

    This sounds like a really dumb idea, and the whole "Red Green" thing just makes it sound worse.  Ever seen the "Red Green" show?  Not the guy I want planning my version compatibility scenarios.

    -Don
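
    For context on the versioning system Don refers to: the runtime binds a strongly named assembly to the exact version it was compiled against, and picking up a newer version is an explicit, per-application opt-in. A sketch of such an opt-in redirect in app.config (the assembly name and public key token below are placeholders, not a real library):

    ```xml
    <!-- app.config: explicitly opt this application into a newer build of one dependency -->
    <configuration>
      <runtime>
        <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
          <dependentAssembly>
            <!-- Placeholder identity, for illustration only -->
            <assemblyIdentity name="Contoso.Widgets" publicKeyToken="0123456789abcdef" culture="neutral" />
            <bindingRedirect oldVersion="1.0.0.0" newVersion="2.0.0.0" />
          </dependentAssembly>
        </assemblyBinding>
      </runtime>
    </configuration>
    ```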

  3. jesam says:

    I agree with the first two posters.

    Jens Samson

  4. FullMetal says:

    I basically agree with the first two comments: why do you want to ensure backwards compatibility at all costs?

    The Framework is relatively new, so this is the best moment in its history to introduce new features and do some heavy refactoring/replacing of the old ones.

    .NET is beautiful, so learn from your – inevitable – errors and take advantage of its early age to render it the best framework around. Look more seriously at the feedback center, there are a lot of great suggestions to consider.

    Remember that we, the programmers using .NET, are at the cutting edge, we are early adopters and experimenters, at this time we expect new features but, more than that, we expect improvement over the existing ones.

    You will have to ensure backwards compatibility in future years, when .NET is too widespread to change heavily. That will be the time to focus on consolidation, but right now it’s really just too early.

    Yes, like JohnGalt said you will end up tying up your hands: the product is new but you are already suffocating under the burden of legacy.

    You risk raising a contorted tree.

    Let me say that we like the framework very much: that’s why we are so passionate about it and its future.

    We are talking about survival of the fittest framework!

  5. bz says:

    I think you’re doing the right thing.

    Aim for compatibility and performance.

  6. With the recent LINQ CTP, XLinq’s feature set is getting close to what we plan to release in "Orcas". …

  7. Doesn’t Side by Side Cure All?

    Soma posted in his blog how we plan to produce an updated version of…

  8. dalmuti509 says:

    I agree with the previous posters regarding backwards compatibility.  One other thing I think Microsoft should do is always make fields, properties and methods in base classes protected, not private.  This is a HUGE HEADACHE: so little is exposed that if I need to make a simple change, I have to rewrite half the functionality.  This is a poor example of how to write base controls.  I could have written two lines of code to add custom sort functionality to the GridView control, but because you didn’t expose anything I had to rewrite about half the code.
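
    To illustrate the kind of extension point being asked for: when a base control does expose a step as a protected virtual method (as GridView does with OnSorting), a small behavior change really can be a couple of lines. The derived class and mapping helper below are hypothetical:

    ```csharp
    using System.Web.UI.WebControls;

    // Hypothetical derived control: override the protected virtual hook,
    // adjust the input, and reuse the rest of the base class's behavior.
    public class NaturalSortGridView : GridView
    {
        protected override void OnSorting(GridViewSortEventArgs e)
        {
            e.SortExpression = MapToNaturalSortExpression(e.SortExpression);
            base.OnSorting(e);
        }

        // Placeholder for whatever custom sort mapping the application needs.
        private static string MapToNaturalSortExpression(string expression)
        {
            return expression;
        }
    }
    ```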

  9. To assist developers, Microsoft is assigning a traffic light color scheme – red and green – to assess .Net Framework updates. Or, if you prefer, a Christmas tree color combo. In a blog entry entitled, "Orcas, .Net Framework compatability," Microsoft’s..

  10. Sean.McLellan says:

    Even though I have a strong sense of pride in the level of backward compatibility that 1.1/2.0 provides, I find myself agreeing with the previous posters.

    Let’s step back and ask why someone would want to upgrade their application to a newer version — probably the biggest reason is to take advantage of the new features in vNext. Keep in mind that when I say features, I primarily mean new functionality that the BCL provides, but also be aware that new features include things such as IDE improvements and other things that increase productivity. By nature, using and incorporating new functionality requires refactoring — stripping out and improving my existing code to take advantage of features introduced in vNext.

    When I’m upgrading my application to vNext I’m not concerned with total backwards compatibility because I KNOW I’m going to have to change my code anyway to incorporate new functionality. In fact, I’d bet that most devs, as part of this effort, would be happy fixing breaking changes in the BCL classes if those changes are an improvement in some way, consolidate functionality scattered elsewhere, or fix a bad design.

    If I’m fat, dumb and happy with the current version and am not interested in new features and functionality, I’m probably not going to upgrade my application just to say that it runs on the latest version of the framework. This seems to be the story that most of the backwards-compat work is done for, and as indicated, I feel that it’s not needed.

    As history has shown with 1.1 to 2.0, moving an application forward to a new major version of the framework is an undertaking that requires at least some effort. If part of that development effort is needed to reflect changes made in vNext of the BCL, so long as those changes are documented and make the quality and surface of the BCL better for new and existing developers alike, I think that engineering breaking changes between major versions is definitely acceptable.

  11. M.Sha says:

    Does ‘Orcas’ support existing .NET 2.0 code?

  12. So there you have it, MS.

    Fix stuff. Make it work right, and screw backwards compatibility. We don’t want or need it (we being everyone except one person who has replied).

    Do the right thing instead of sticking to your broken "backwards compatible" mantra that got you in trouble with security in the first place and just give us the best possible thing you can and change things if you need to.

    As maddening as development on WSE is, I like the fact that they change it and improve it and screw backwards compatibility. It makes me use the new system, but I get benefits, and it isn’t tacked on; it’s stuff that makes sense from the ground up.  That’s the hallmark of the .NET framework and should be the mantra in MS.

  13. During the past few months, the Visual C++ team allocated many of its development resources towards addressing…

  14. For the Orcas release we want to only fix bugs that won’t create barriers to adoption. We have created…

  15. On two life altering occasions I heard that phrase spoken, and writing it down still makes me weak in…

  16. Earlier today, I was working on my article on the timeline for .NET releases over the next 18 months….

  17. Sven's Blog says:

    The title of this post might actually be more appropriate if it was called what aren’t they planning….

  18. Kathy Kam says:

    Didn’t I say it two weeks ago that API naming is the most difficult thing? :) My BCL post on System.TimeZone2

  19. Ever since Kathy Kam announced on her weblog that a new type named TimeZone2 will be introduced into

  20. I finally got around to downloading the Orcas September 2006 CTP bits, and started to play with it a

  21. Microsoft has released the October 2006 CTP of Visual Studio Codename "Orcas". If you have the available

  22. Dennes says:

    Three of the feedback items I posted on Connect received a response today. According to the response, the feedback items…

  23. Microsoft has released the September 2006 CTP of Visual Studio “Orcas” as an image for Virtual

  24. Microsoft has released the September 2006 CTP of Visual Studio “Orcas” as an image for Virtual

  25. Microsoft has released the September 2006 CTP of Visual Studio “Orcas” as an image for Virtual

  26. Soon we will have a new version, .Net Framework 3.5, which, like 3.0, will apparently build on the base .Net

  27. Green and red bits were introduced at the release of the .NET Framework 3.0. Somasegar talked about

  28. The title has nothing to do with Olympiacos and Panathinaikos. Red and green are used

  29. While installing Visual Studio 2008 Beta2 I was surprised that the .NET Framework 2.0 installation got

  30. There’s a whole lot of new stuff in Networking in Orcas. Some features are going out as part of redbits

  31. There’s a whole lot of new stuff in Networking in Orcas. Some features are going out as part of redbits

  32. Visual Studio .NET 2008 (codename Orcas) is just around the corner. With the release of Beta2 a few weeks

  33. Two months ago, Scott blogged about the multi-targeting support in Visual Studio 2008 . I worked on this

  34. Bill says:

    Sorry for joining this topic so late.

    The importance of backward compatibility varies depending on the customer.  I work for a large bank, and the industrial-strength applications we develop cost millions of dollars and take years to build; after all, we’re counting your money!  These applications often run our business for 10-20 years and we maintain them (with enhancements) for that period. We absolutely need backwards compatibility, and we need the old versions supported for that period.

    Perhaps once every 10-15 years we will re-write an entire system, but only if the business need demands it. I was a techie myself once, but this cutting-edge, new-technology, upgrade-every-few-years “scam” doesn’t pay the bills.  We’re sometimes forced to pick Unix or even mainframes to avoid the Microsoft upgrade game.

  35. .NET 3.5 November Ship Date Announced

  36. There’s one problem with the "screw backwards compatibility" approach. Add-ins. If you write an Office add-in, each process can only hold one version of the CLR. So if Winword.exe or Outlook.exe loads .NET 3.5, and .NET 3.5 isn’t compatible with .NET 2.0, then the add-in written against .NET 2.0 breaks.

    Now if you can make it so that a process isn’t limited to one CLR instance, and Outlook.exe and Winword.exe, etc. can run .NET 2.0 and .NET 3.5 add-ins side by side in the same process, then I agree with everyone else and screw backwards compatibility. That’s how you wind up with stupid names like TimeZone2.
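
    A small sketch of the constraint described above: because the host process loads a single CLR (as of .NET 2.0/3.5), an add-in cannot choose its runtime; it can only observe which one the host actually loaded. The entry point below is a stand-in for whatever the real add-in model provides (for example IDTExtensibility2 in Office):

    ```csharp
    using System;

    // Hypothetical add-in: detect which CLR the host process loaded,
    // since the add-in itself has no say in the matter.
    public class MyAddIn
    {
        public void OnConnection()
        {
            Version loaded = Environment.Version;       // the CLR the host actually loaded
            Version builtAgainst = new Version(2, 0);   // what this add-in was built and tested on

            if (loaded.Major != builtAgainst.Major || loaded.Minor != builtAgainst.Minor)
            {
                // Best we can do is detect the mismatch and degrade gracefully.
                Console.WriteLine("Warning: running on CLR " + loaded + ", built for " + builtAgainst);
            }
        }
    }
    ```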

  37. David Nelson says:

    Late to the game as usual, but I just wanted to say that I agree with the majority of posters that backward compatibility should not be a primary design goal of a new version. It’s not that it has no value, but it’s just not worth it if it means holding back other more important fixes or features. I completely agree with Sean McLellan’s post; if I upgrade to a new version, I already know that I will have to make some code changes (especially since you can’t guarantee 100% backward compatibility, no matter how hard you try). So what if I have to make some extra changes to deal with breaking changes? EVEN IF those changes are actually fairly extensive, it’s worth it if it means that the new version is better and more usable. And if it’s not worth it to people like Bill and his banking system, they don’t have to upgrade! If you’re so worried about your ancient mission-critical system, why are you upgrading to a new version in the first place? Shouldn’t you stay with what you know works? In short, backward compatibility simply isn’t worth the price you (MS) are forcing us to pay.

  38. Tim Sneath says:

    As many people will have noticed, we released Windows Vista Service Pack 1 this week ( read about the

  39. As many people will have noticed, we released Windows Vista Service Pack 1 this week ( read about the

  40. TheCPUWizard says:

    Old post, I know, but the "issue" is going to come up again with .NET 4.0 (already in CTP).

    Backwards compatibility is an absolute requirement for the ISV [Independent Software Vendor]. I have a library of over 2 million lines of code (simple line-break counter, includes non-code lines).

    If I re-compile with a new version I "hope" that the code will function identically, but I "need" to know that if there is a breaking change in functionality, it will break the build.

    Yes, I have a LOT of unit and functional testing code [actually included in the above line count], but even the most robust APPLICATION testing does not rigorously test the internal functioning of the vendor-supplied code.