Big Step/Small step


I’ve spent the last couple of days responding to a customer bug report complaining about the performance of the WinForms designer (WFD) when you use C# versus when you use VB.  For those who haven’t used it, the WFD uses a round-tripping model where you can visually design your form, have it spit out the corresponding language-specific code, edit that code (albeit with many restrictions), and see your changes reflected in the designer.

 

So why was the experience so much better in VB than in C#?  Well, certain parts of the WFD architecture are language-neutral (specifically the GUI designer), but the code-spit and code-analysis portions are implemented by the specific language teams.  Now, the code-spit part was no problem whatsoever, but as it turns out the code-analysis portion was a significant hot-spot for C# but not for VB.

 

Now, I’ve talked in the past about how C# differs from VB in that they have a full incremental compiler whereas we have a more demand-driven model that analyzes and deciphers code as necessary to satisfy the current request.  We also cache that information, but aggressively invalidate that cache to ensure that we’re correctly understanding the code you’re changing.  (I’d prefer to have the VB model, but this is how it’s currently done for VS2005 and we have to work within those constraints.)  Now, for the code-analysis portion of the WFD we were doing things very naively (but at the same time very safely).  For example, when compiling each statement we were computing a lot of the same information over and over again instead of reusing the information we’d computed for the previous statement and adding onto it.  We were also not caching when we could have.  For example, in the statements:

 

                  this.ForeColor = System.Drawing.Color.Red;

                  this.BackColor = System.Drawing.Color.Black;

 

When we were trying to bind the identifiers “System”, “Drawing”, and “Color” in the second line, we went through exactly the same logic we’d already gone through in the first line.  So I updated the code to deal with both issues, and we have so far gotten dramatic performance increases from it.  However, even though I’ve spent quite a bit of time talking about this, this post is not about the WFD, profiling, or the C# implementation.  What I actually wanted to talk about was the customer, his problem, and software development in general.
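To make the caching idea concrete, here is a hypothetical sketch in Python (not the actual C# language-service code, and the names are made up): when binding a dotted name, each resolved prefix is cached, so the second statement’s lookups become cache hits instead of repeated analysis.

```python
# Hypothetical sketch of prefix-caching during name binding.
# "expensive_lookup" stands in for the costly symbol-table search
# the real analyzer performs; none of these names come from VS itself.

binding_cache = {}   # maps a dotted prefix like "System.Drawing" -> resolved symbol
lookups = 0          # counts how often the expensive path actually runs

def expensive_lookup(container, name):
    """Stand-in for the costly per-identifier resolution."""
    global lookups
    lookups += 1
    return f"{container}.{name}".lstrip(".")

def bind(dotted_name):
    """Resolve each segment, reusing any prefix already bound earlier."""
    symbol, prefix = "", ""
    for segment in dotted_name.split("."):
        prefix = f"{prefix}.{segment}".lstrip(".")
        if prefix in binding_cache:
            symbol = binding_cache[prefix]      # cache hit: skip re-analysis
        else:
            symbol = expensive_lookup(symbol, segment)
            binding_cache[prefix] = symbol
    return symbol

bind("System.Drawing.Color.Red")    # 4 expensive lookups
bind("System.Drawing.Color.Black")  # only 1: the "System.Drawing.Color" prefix is reused
```

Without the cache, the second statement would repeat all of the first statement’s work; with it, only the final segment costs anything.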

 

In my last post I talked a bit about driving VS2005 to be shippable and making decisions about which issues need to be addressed and which don’t meet the necessary bar.  It’s quite possible that this bug wouldn’t have met the bar, and it’s altogether possible that the fix I’ve created won’t make it in because it could be too risky.  That is, we have a working designer right now, and it’s not clear whether I’ve missed something that will cause functionality to regress with this change, so is the risk of possibly breaking the designer worth the performance gain?

 

So what does that mean for the customer?  He’ll end up with another release where he wants to use C#, but finds the experience decidedly suboptimal.  I have no idea when the next version of the product will be, but I’d guess ~2 years after this one.  That means he’ll have to wait several years to get this issue resolved.  Personally, I’d feel terrible if that happened.  This customer has been great to us, providing an excellent scenario and performance data they’d already collected, and they’ve been incredibly patient so far.  Normally I don’t mind deciding not to fix a bug because I feel it won’t really affect anyone terribly, but in this case I know full well that there is a company out there that would really benefit from this, and if we don’t work on it they will be affected.

 

Now, consider two different models under which we could be producing VS: an incremental model with fast releases every few weeks or months, and an open-source model.  I think both would be fantastic for addressing the issue this customer is facing.  In the former we’d still have to tell him “sorry, we weren’t able to address this for VS2005.  But hey, here’s VS2005.1.”  In the latter, the community could provide a patch for the issue that the customer could use.  We could then examine the patch and incorporate it into the main source once we’d verified it wasn’t too risky.  It still wouldn’t get into the product for everyone until a later release; however, the customer’s problem would be solved and he wouldn’t be left waiting.

 

I’d really like the VS group to start investigating avenues that allow for this kind of rapid response to customer needs.  We’re never going to be able to ship the perfect product, and being able to address issues quickly rather than on a multi-year cycle is something I think our customers want and deserve.

 

What do you think?


Comments (20)

  1. Senkwe says:

    I like the open source idea, but the question is, would we still have to pay for a product that we’ve essentially helped to write? Visual Studio being free would be a GREAT problem for MS to have IMO 🙂

  2. I think the incremental releases for the VS.Net editor could work. I would be happy to run ‘Visual Studio Update’ every few weeks. The challenge for Microsoft would be ensuring the quality of each of these incremental releases (the NT4 SP6 debacles etc are still vivid for me). I imagine the shift from the multi-year release cycle to monthly releases would require a substantial culture shift for your team.

    The real problem would be changes to the actual .Net framework. Instead of saying to users, ‘you need the .Net framework v1.1 to run this program’, we’d have to start saying, ‘you need the .Net framework build x’ where x could be any of these incremental releases. Keeping the entire .Net framework install base up to date with monthly releases would be a real issue.

    I don’t envy the Java programmers who will soon be spending time explaining to their customers that they need to navigate Sun’s website to download the 1.5 JVM.

  3. Thank you, thank you, thank you… finally someone acknowledging the fact that Visual Studio releases are too far apart (regularly a few years).

    I would really appreciate it if MS would start releasing somewhat more aggressively. At least some features that are safe (in other words, do not introduce regressions) could be added this way. I won’t even mention bug fixes…

    Actually, since the original Visual Studio .NET there’s been a button to check for service releases, yet there hasn’t been a single one – what’s the deal with that? Isn’t a service release exactly the kind of thing you are proposing here?

  4. David Levine says:

    I like the incremental release idea. It would be useful if it were combined with the ability to rollback an upgrade in case something broke that I couldn’t live without. The problem now is that if something doesn’t work right it can be a l-o-n-g time until a fix is released.

    Open source would be nice, but it would take a lot of careful management to prevent it from mutating in harmful directions. I think that would have more benefit as a tutorial on how to write code than as a way to get bug fixes done in a timely manner.

  5. Mike Schrag says:

    I think this is a really interesting issue for any software vendor, not just the VS team.

    I do some contract work for Rio, and they’ve allowed me to release updates to my part of the system independently from the core product. It’s been very interesting to see the comparisons/contrast between the core app, which releases updates on the 3-month scale, versus my part of the suite where I tend to respond to a bug report in the days-to-week timeframe.

    I think the biggest issue is testing. Obviously if you’re releasing builds much more frequently, you have two options: 1) your test team has to be constantly testing, since almost every build might become a production build, or 2) put out intermediate releases with limited testing (perhaps less system testing, with fixes that have just passed unit testing) and make it obvious to your customers that if they want the fix, they must accept something along the lines of a beta build. Given this choice, I guarantee your customers will be OK using beta-quality fixes if it means solving a problem they’ve been having. Especially knowing that if they find a bug with the patch, it will be a relatively short turnaround time to get a fix for THAT as well. Additionally, they certainly can’t be any WORSE off than with the original production build, which they could always roll back to if a patch is really wacky.

    The last concern is from the development side, and that is branching and maintenance on the branch. Before, it’s likely that you would tag/branch VS 2005 and then just start working on VS 2006, which could involve massive architectural overhauls. Now, though, you will be actively maintaining two branches, so you will need to move your fixes constantly between both. It could become pretty annoying. In my case, I release off the head of the tree all the time, so I’m only maintaining the trunk, but in your case, you won’t want to release intermediates that include new features of VS 2006/2007, so you won’t have that luxury.

  6. WPoust says:

    Updates/incremental releases every 6 months would work well. That should give enough testing time but also show customers that you are making progress.

    For our development group though, the reality is we do NOT update on every release. We have too much code and not enough time to spend a couple of weeks updating our developers and all the source code. Generally, we shoot for updating on every other release.

    It is also important that new releases work well with old releases. An example is VS2003 converting a solution so VS2002 can’t use it. That causes a lot of problems. We need experience with new software tools before some developers/managers give acceptance for everyone upgrading. With backwards compatibility, the early adopters can start using the tools on our projects and speed up the acceptance phase.

  7. chrisbro says:

    The biggest problem with doing releases every few weeks is that it’s difficult to do anything more significant than small bug fixes or trivial features. Even a release every few months gives you about 8-10 weeks of code then a bunch of stabilization time. You never get to do refactoring or cleanup because you’re always in crunch mode to ship. There are no beta releases where you can decide something just didn’t work well and pull it.

    So while you get to spin on quick fixes, you don’t get to do the really exciting new stuff.

  8. Jeff Atwood says:

    "Now, I’ve talked in the past about how C# differs from VB in that they have a full incremental compiler whereas we have a more demand-driven model that analyzes and deciphers code as necessary to satisfy the current request. We also cache that information, but aggressively invalidate that cache to ensure that we’re correctly understanding the code you’re changing. (I’d prefer to have the VB model, but this is how it’s currently done for VS2005 and we have to work within those constraints.)"

    I’d like to find the person responsible for "Elvis" and "Mort" so I can kick them squarely in the nuts. Whoever you are, I totally blame you for this sad state of affairs.

    Incremental compilation and Edit-And-Continue are important for every developer, not just "marketing persona A".

    As for your other point: what the heck happened to service pack releases for VS.NET anyway? There were like six service packs for Visual Studio 6. I have yet to see ANY for VS.NET 2002 or 2003.

  9. Mark Levison says:

    Thank you, Thank you, Thank you – there are a lot of fixes (big and small) that I’m waiting for in the Visual Studio environment. Decouple framework revisions from tools revisions and have releases every 6-8 weeks.

    As for the concerns about frequent releases – look into any of the agile methodologies (XP, Crystal Clear, …), they encourage it. It certainly seems to work for Eclipse.

  10. Senkwe: I certainly hope we move to free releases for VS. 🙂

    Unfortunately I’m not in management. I could move to management… but then I wouldn’t be able to do the dev work I love!

  11. Drazen: I agree. There are lots of little features I want to do that we just don’t have time for now, and I don’t want to wait years more until they’re released.

  12. ChrisBro: "The biggest problem with doing releases every few weeks is that it’s difficult to do anything more significant than small bug fixes or trivial features. Even a release every few months gives you about 8-10 weeks of code then a bunch of stabilization time. You never get to do refactoring or cleanup because you’re always in crunch mode to ship. There are no beta releases where you can decide something just didn’t work well and pull it.

    So while you get to spin on quick fixes, you don’t get to do the really exciting new stuff."

    I agree/disagree. I think you could do staggered releases where you do work on some pretty big things (like generics) that require a lot of coordination to get right, but at the same time you could be producing your incremental work as well.

  13. Jeff: "I’d like to find the person responsible for "Elvis" and "Mort" so I can kick them squarely in the nuts. Whoever you are, I totally blame you for this sad state of affairs.

    Incremental compilation and Edit-And-Continue are important for every developer, not just "marketing persona A". "

    The personas were not what dictated which features the languages got. C# didn’t lack incremental compilation and EnC because someone thought those features were unimportant; it lacked them because it was felt (especially after communicating with the community) that refactorings were more important, and that given our limited resources that was the right choice.

    We don’t sit here twiddling our thumbs (well, sometimes when a long link is happening) during the product cycle. Every day is jam-packed and we’re working darn hard 🙂

    FWIW, incremental compilation and EnC are not simple problems. If I could come in on a weekend and get them done I most certainly would. However, they’re features that would take a team a significant amount of time and would add a ton of risk.

    How do you implement incremental compilation? Will it scale to projects with tens or hundreds of MB of code? How do you test it? Etc., etc.

    When C# was being created, it was decided for simplicity and speed to go with the current architecture. I believe that at the time it was a reasonable choice to make (even though I disagree with it), but we’re definitely feeling the effects now. And, as I said before, it’s very much something I want us to be working on.

    Just to make it clear: It’s not marketing 🙂

    We’re just not supermen, that’s all.

  14. Mark: "Thank you, Thank you, Thank you – there are a lot of fixes (big and small) that I’m waiting for in the Visual Studio environment. Decouple framework revisions from tools revisions and have releases every 6-8 weeks."

    I’d definitely like that. It would be nice to be able to ship without being tied to the runtime, and even nicer to ship tools that weren’t so tied to the IDE. Not yet, though. Sigh…

    "As for the concerns about frequent releases – look into any of the agile methodologies (XP, Crystal Clear, …), they encourage it. It certainly seems to work for Eclipse."

    Our team has been focusing a lot on agile and XP, and we’d like to move completely over to it. Jay (http://blogs.msdn.com/jaybaz_MS) talks about it a fair bit and is pretty passionate about it.

  15. Vince P says:

    I’m still waiting for the bugs in VS2003 to be fixed

  16. Shital Shah says:

    I was in a similar situation with my PM a year back. Our company has a policy not to ship anything until not only every feature is incorporated but also a full QA cycle is run AND there are absolutely no bugs left to fix. If we had scheduled the last build for Friday and QA came in with a one-in-a-million-years issue on Sunday, the whole schedule would be postponed. To make this even worse, after we had released the product to customers, if we found a bug we weren’t allowed to fix it. If tremendous pressure built up, they would ask us to branch the whole project (literally 30+ component sub-projects), make the change, and run the ENTIRE QA cycle all over again for weeks. The product wasn’t used for medical, life-critical, aviation, or financial banking stuff, but they did all of this because they wanted a "solid" bug-free product. This, unfortunately, isn’t realistic. I’d been aggressively trying to push an incremental deployment and automatic on-demand updates model, but they just didn’t seem to get over their "perfect product" dream.

    I absolutely believe in the evolving product model. Many times what the customer wants is a fast turnaround – something real to play with rather than waiting years to see the final product, and then more years for more features.

    Mathematically, it’s impossible to have a solid product at large scale without putting in tremendous time and effort – which in modern markets might just prove too late.

    Many MS products fall into this category. A new version of Windows or Office takes years and years to reach the customer.

    I guess the key is the number of interruptive and data-loss bugs. The interruptive bugs are those which interrupt the *usual* way of using the software and make the customer call support. If the count of these two types of bugs is 0, the product should get out the door, syncing with new builds monthly or even weekly. These builds should also be available on the Internet rather than in retail boxes, to make sure the people downloading them have the ability to update in the future. For retail boxes, I still believe it should be a "solid" version of the product. However, this is more applicable to non-mission-critical applications outside domains like military, aerospace, banking, medical, etc. All others can and should follow the evolving builds model.
