Howdy – I’m a software design engineer at Microsoft, in the Windows core technologies group. My work involves isolating applications, components, and the operating system from each other. I don’t have much original or interesting to say, but by gum – I’ll say it here. Like many of my generation, I grew up using computers. I won’t enumerate them all, but the list starts with an Atari 800 XL and hasn’t ended yet.

The unfortunate state of the world today is that once an installer exits (whether for an operating system or a software package), the resulting state of the system is completely unknown and undefined. Sure, judicious use of Regmon and Filemon will let you guess at what those installers did, and maybe you’ll even be able to uninstall cleanly. Compound this with the large set of installations required to get a running system, differing versions of shared components, and even more fun around points of extensibility, and you end up with a morass of crap – the only hope is to flatten and reinstall the system.

So what happens when we uninstall? Well, generally, the uninstaller has a list of objects that were installed, and some sort of reference tracking to know when stuff should go away. The uninstaller decrements those references and whacks whatever should be deleted. Too bad the installer for FoobleSoft’s Bongoblam III forgot to increment the reference on the shared component ponkfoo.dll – the count hit zero, so the uninstaller does away with it. Practical upshot? Next time you run Bongoblam III, it crashes while binding static imports. Well, crap.
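That refcounting scheme, and the way one sloppy installer poisons it, can be sketched in a few lines. This is a minimal illustration, not real installer code; the product names come from the post, and the data structures are hypothetical:

```python
# Sketch of shared-component reference counting as installers are supposed to do it.
refcounts = {}  # shared component -> number of installed products that registered a use

def install(product, shared_components):
    # A well-behaved installer increments the count for every shared component it uses.
    for comp in shared_components:
        refcounts[comp] = refcounts.get(comp, 0) + 1

def uninstall(product, shared_components):
    # The uninstaller decrements, and deletes anything that hits zero.
    deleted = []
    for comp in shared_components:
        refcounts[comp] = refcounts.get(comp, 0) - 1
        if refcounts[comp] <= 0:
            deleted.append(comp)  # the file gets whacked
            del refcounts[comp]
    return deleted

# Some other product correctly registers its use of ponkfoo.dll...
install("Barbaszot", ["ponkfoo.dll"])
# ...but Bongoblam III's installer forgets to register its use at all:
install("Bongoblam III", [])

# Uninstalling the other product drops the count to zero, so ponkfoo.dll is
# deleted even though Bongoblam III still links against it.
print(uninstall("Barbaszot", ["ponkfoo.dll"]))  # ['ponkfoo.dll']
```

The bug isn’t in the refcounting logic itself – it’s that the scheme only works if every installer participates correctly, which is exactly the trust you can’t have.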

How do you help this? You could move to a single installer (MSI), so that only that installer knows how to do the right thing. You could move to a world where the system deeply understands the dependencies between DLLs and EXEs (or images in general) and refuses to allow deletion of images while any references to them are alive. Both of these, when done correctly, fix the problem of shared components accidentally disappearing out from under you. That’s a good step forward…

But it’s not sufficient. That package you uninstalled – call it Barbaszot – and Bongoblam III both wanted to be the .xyz shell extension handler. The uninstaller’s simple logic notices that Barbaszot had written to HKCU\.xyz, so it deletes everything that was updated. Now when the shell wants to activate a handler, Bongoblam III could do the job, but the shell can’t find it! The .xyz handler metadata is gone, and the shell pops up a “what do you want to do with this file” dialog.

Smarter installers – like MSI – can help with this as well, by recording the state of the world before values are written or updated in the registry. Maybe when you uninstall Barbaszot, it knows to rewrite the old values, and suddenly Bongoblam III is the handler again. That’s groovy. But wait – what if Bongoblam was installed after Barbaszot, but Barbaszot stole back the extension handler (this horrible practice started about five minutes after the first alternate handler for .whatever files was created…)? The uninstaller, in all its wisdom, says “well, there was no value before” and helpfully deletes the keys. Now Bongoblam III is out in the cold again.
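The save-and-restore trick, and the install-order case that defeats it, can be sketched the same way. Again, this is a toy model with hypothetical names and a dictionary standing in for the registry, not how MSI actually stores its rollback data:

```python
# Sketch of "remember the old value" uninstall logic for a registry value,
# and why out-of-band handler stealing breaks it.
registry = {}  # pretend registry: value name -> current handler
saved = {}     # per-product snapshot of the value as it was before install

def install(product, value_name, handler):
    saved[product] = registry.get(value_name)  # None means "didn't exist before"
    registry[value_name] = handler

def uninstall(product, value_name):
    previous = saved.pop(product)
    if previous is None:
        del registry[value_name]   # "there was no value before" -> delete the key
    else:
        registry[value_name] = previous

# Barbaszot installs first and claims .xyz; its snapshot says "nothing was there".
install("Barbaszot", ".xyz", "Barbaszot")
# Bongoblam III installs later and takes over the extension...
install("Bongoblam III", ".xyz", "Bongoblam III")
# ...then Barbaszot steals the handler back, outside any installer's knowledge:
registry[".xyz"] = "Barbaszot"

# Uninstalling Barbaszot "restores" its snapshot: nothing. Bongoblam is orphaned.
uninstall("Barbaszot", ".xyz")
print(registry.get(".xyz"))  # None
```

The snapshot was accurate when it was taken; the world just changed underneath it. Any scheme that reasons from a single saved-at-install-time value has this hole.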

What can we do about all this? My work involves making operations on a running system completely self-described, simple, and nondestructive. In my world, when the shell went looking for an .xyz handler after Barbaszot had been removed, it would look through a data-driven list of applications that had declared themselves .xyz handlers. Maybe it would prompt the user, maybe it would pick the last-used handler, maybe it would pick one at random. In any case, removing Barbaszot would not have destroyed the list of possible choices.
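The key difference in the data-driven model is that removal only touches the departing application’s own entry, never a shared slot. A minimal sketch of that idea, with hypothetical names (the selection policy shown here is just one of the options mentioned above):

```python
# Sketch of a declarative handler registry: applications declare themselves,
# and removing one app deletes only that app's declarations.
handlers = {}  # extension -> list of applications that declared themselves handlers

def declare(extension, app):
    handlers.setdefault(extension, []).append(app)

def remove_app(app):
    # Removal is nondestructive to everyone else: only this app's entries go away.
    for apps in handlers.values():
        if app in apps:
            apps.remove(app)

def pick_handler(extension):
    apps = handlers.get(extension, [])
    # Policy is an open choice: prompt the user, pick the last-used, pick any.
    # Here we arbitrarily take the most recently declared survivor.
    return apps[-1] if apps else None

declare(".xyz", "Barbaszot")
declare(".xyz", "Bongoblam III")
remove_app("Barbaszot")
print(pick_handler(".xyz"))  # Bongoblam III
```

Because no application ever overwrites another’s declaration, there is nothing to “restore” at uninstall time – the stealing-the-handler-back game from the previous paragraph simply has no move to make.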

It would appear I’m rambling again. I’ll have more later – less ranting on how bad software is now, and more things I’ve discovered while working here.

Comments (6)

  1. Welcome aboard:) Good to see you here.

  2. Stephane Rodriguez says:

    I have a question for you. In many if not all .NET-related apps out there, deployment looks trivial. In fact, pretty much all .NET apps being demoed openly are simple demos or POCs, not complex apps that need to share the system folder, be integrated with a 3rd party app, or behave nicely with other apps (an Excel add-in installed over existing Excel add-ins).

    What do you think? Are the demos and POCs available out there really representative of real-world apps, or is it just that there is no really strong, complex .NET app available yet – leaving aside the CLR itself, though we know the versioning/SxS issues that it raises?

    Do you think that .NET provides a better mechanism for the static and dynamic diagnosis of component dependencies? If yes, what are the arguments? If no, isn’t it simpler to stick with existing Win32/COM/MFC/ATL apps whose behavior is well known (especially in VS 6.0 compiled environments)?

  3. denny says:

    Take a bit of Longhorn and .NET, then go into the "way back machine" to get the Amiga OS, and use them with a dash of Linux to just build a new OS that works right.

    Now, getting a billion or so users to switch – that’s the rub.

    The Amiga had the right idea on things like address 0004, and it handled shared .libraries in limited RAM very well… I think due in part to the relative branch opcodes that the 680xx procs used.

    .NET is a great step toward the right model.

    But we still have a ton of COM and classic DLLs under the hood to muck with.

    Can we have a way to "version" code so that we do *NOT* have oodles of local copies of a DLL?

    Hard disk space may be getting cheaper all the time, but down that path you wind up having to copy and back up way too much stuff!

    I think the art and science of building components has some issues that need to be addressed….

    for example the classic:

    V1.0 has function Foo() that given an input of 1,2,3 returns 55

    V1.0.5 returns 66

    the contract is broken!

    so some apps need 1.0 and others need 1.0.5

    And the whole deal of to GAC or not to GAC.

    I’d love to see a way… to make vendors want to and need to "certify" a .dll as backward compatible…

    If we could keep new versions compatible, then we could GAC 99% of the code one time, in one place, and keep each app’s /bin folder slim and trim!

    but how could one create a test system for this?

    just some of my random ideas for you….

  4. I’ve been telling people for years that if we do our job right, the most important thing for our team to do eventually is tools for analyzing and testing for backwards compatibility.

    Really the problem is (a) figuring out what the actual contract is of the implementation since even the best engineers can’t specify the contract for nontrivial software to the "n"th degree and (b) understanding the contractual changes from version to version to see if they meet a reasonable bar for backwards compatibility.

    Even an obvious bug fix /is/ a contractual change; it’s a matter of economics of the costs around fixing it vs. not fixing it.

  5. Welcome aboard Jon, subscribed.

  6. Andrew Shuttlewood says:

    This sounds somewhat like the Debian package manager (although I believe they have since removed this functionality).

    In the old days (about two years ago?), installing new Debian packages would ask you about MIME types, and ask you where you wanted the package to sit – for example, if you installed a new image viewer, it would ask you its priority in the MIME configuration. Once you picked a priority, it’s fairly easy to handle removal of a package – you simply eliminate its entry, and other applications then bump to the top.

    Likewise, for common applications, Debian creates symlinks through /etc/alternatives for each package. A brief example:

    /usr/bin/x-terminal-emulator is a link to /etc/alternatives/x-terminal-emulator.

    /etc/alternatives/x-terminal-emulator then links in my case to /usr/X11R6/bin/uxterm.

    If I removed that particular package, then another application that provided that alternative would be configured instead. I believe this is done randomly, but I’d have to look at the source code to check.

    Of course, in the fairly hostile world of Windows, this is a lot harder – with applications abusing APIs to put themselves at the top whenever they can – so maybe asking the user is a good thing in this case.

    (Actually, having looked at it, the alternatives stuff does have a priority associated with it – when a package is uninstalled "update-alternatives" is called, which notices any removed packages and then changes the symlinks to them).

    (And the mime stuff is done using "mailcap.order", which has a manpage on my debian box to explain how to use it).

    Most of this is a lot more useful when you have control over the packaging environment and you can assume that installed packages are non-hostile, but the system does handle some of your scenarios anyway. It might give you some good ideas 🙂