Historically, Windows didn’t tend to provide functions for things you can already do yourself

Back in the old days, programmers were assumed to be smart and hardworking. Windows didn't provide functions for things that programs could already do on their own; it worried about providing functionality for things that programs couldn't do. That was the traditional separation of responsibilities in operating systems of that era. If you wanted somebody to help you with stuff you could in principle do yourself, you could use a runtime library or a programming framework.

You know how to open files, read them, and write to them; therefore, you could write your own file copy function. You know how to walk a linked list; the operating system didn't provide a linked list management library. There are apparently some people who think that it's the job of an operating system to alleviate the need to implement these things yourself; actually, that's the job of a programming framework or tools library. Windows doesn't come with a finite element analysis library either.
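
For illustration, here is a minimal sketch in plain C, using a hypothetical application-defined node type, of the sort of linked-list walking the operating system never needed to provide:

    #include <stdio.h>

    /* A hypothetical, application-defined node type. */
    struct node {
        int value;
        struct node *next;
    };

    /* Walking the list is a three-line loop; no operating system help needed. */
    void print_list(const struct node *head)
    {
        for (const struct node *p = head; p != NULL; p = p->next)
            printf("%d\n", p->value);
    }

    int main(void)
    {
        struct node c = { 3, NULL }, b = { 2, &c }, a = { 1, &b };
        print_list(&a);
        return 0;
    }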

You can muse all you want about how things would have been better if Windows had had an installer library built-in from the start or even blame Windows for having been released without one, but then again, the core unix operating system doesn't have an application installer library either. The unix kernel has functions for manipulating the file system and requesting memory from the operating system. Standards for installing applications didn't arrive until decades later. And even though such standards exist today (as they do in Windows), there's no law of physics preventing a vendor from writing their own installation program that doesn't adhere to those standards and which can do anything they want to the system during install. After all, at the end of the day, installing an application's files is just calling creat and write with the right arguments.
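
To make that last sentence concrete, here is a minimal sketch in POSIX-style C, with purely illustrative arguments and most error handling omitted, of what placing an application's files on disk boils down to:

    #include <fcntl.h>
    #include <unistd.h>

    /* "Install" one file by reading the source and writing the destination
       with nothing but the primitive file-system calls. */
    int install_file(const char *src, const char *dst)
    {
        char buf[65536];
        ssize_t n;

        int in = open(src, O_RDONLY);
        if (in < 0)
            return -1;

        int out = creat(dst, 0644);   /* same as open(dst, O_WRONLY|O_CREAT|O_TRUNC, 0644) */
        if (out < 0) {
            close(in);
            return -1;
        }

        while ((n = read(in, buf, sizeof buf)) > 0) {
            if (write(out, buf, (size_t)n) != n) {   /* treat short writes as failure */
                n = -1;
                break;
            }
        }

        close(in);
        close(out);
        return n < 0 ? -1 : 0;
    }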

Commenter Archangel remarks, "At least if the ACL route had been taken, the installers would have had to be fixed - and fixed they would have been, when the vendors realised they didn't run on XP."

These arguments remind me of the infamous "Step 3: Profit" business plan of the Underpants Gnomes.

  • Step 1: Require every Windows application to adhere to new rules or they won't run on the next version of Windows.
  • ...
  • Step 3: Windows is a successful operating system without applications which cause trouble when they break those rules.

It's that step 2 that's the killer. Because the unwritten step 2 is "All applications stop working until the vendors fix them."

Who's going to fix the bill-printing system that a twelve-year-old kid wrote over a decade ago, but which you still use to run your business? (I'm not making this up.) What about that shareware program you downloaded three years ago? And it's not just software where the authors are no longer available. The authors may simply not have the resources to go back and update every single program that they released over the past twenty years. There are organizations with thousands of install scripts which are used to deploy their line-of-business applications. Even if they could fix ten scripts a day, it'd take them three years before they could even start thinking about upgrading to the next version of Windows. (And what about those 16-bit applications? Will they have to be rewritten as 32-bit applications? How long will that take? Is there even anybody still around who understands 16-bit Windows enough to be able to undertake the port?)

Comments (62)
  1. nathan_works says:

    Well, wasn't Henry Ford known to search scrap yards to see which parts of his cars were still in good working order, and then switch to cheaper suppliers for those parts? By not making things to last, he increased his profits.

    Not sure who'd profit here – MSFT or the ISV shops – but building in some kind of planned obsolescence is a strategy of so many industries already (in addition to offering limited to no warranty, making repairs as expensive as replacement via parts pricing, etc.).

  2. Mc says:

    We're currently having fun trying to get an 8-bit DOS program (.com file) that drives an optical mark reader via a serial port (mission-critical application, of course) working on a new PC which has Windows 7 and no serial ports. It's currently live on a Windows 2000 PC which is getting a bit long in the tooth. The application was written by an external company that no longer exists and there's no source code.

    But we appreciate the fact that we were even able to get it working on W2000 8 years ago.

  3. Mc says:

    Sorry, that .COM file must actually be 16-bit, as it runs on an 80x86 processor. But it's OLD anyway.  :-)

  4. acq says:

    Windows didn't have to have something to handle the installation of every application.

    However, it was a serious design error to allow replacing DLLs belonging to the system "just so", forcing every plain application programmer to update the OS with plain file copies over the system files in order to make his application functional(!) The result is that today (but note I'm still on XP) there is some service which allows stupid applications to replace system DLLs for a moment, and then the service copies the original ones back over them (or something like that)(!)

    Windows really had a lot of good design decisions — hey, it worked much better on weaker PCs than anything else! In the time of computers with 2 or 4 MB nothing could beat it. And of course there's no time machine, but that doesn't mean other programmers shouldn't learn from examples of bad design. Again, not having an API call for updating system DLLs was one bad decision.

    [But if applications couldn’t replace system DLLs, how would an app redist a DLL upgrade that it requires? (Remember, this is pre-Internet.) Oh, and the recommended way to replace system DLLs was not to do a plain copy; it was to use VerInstallFile. But of course most people said “VerInstallFile is too hard; I’ll just do a plain copy.” -Raymond]
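
    (For readers who have never seen it: a minimal sketch of a VerInstallFile-based update is below. The file and directory names are purely illustrative, and the VIF_MISMATCH retry shown is just one plausible way to handle a conflict, not a recommendation.)

        #include <windows.h>
        #pragma comment(lib, "Version.lib")

        /* Sketch: let VerInstallFile compare version resources instead of
           blindly copying a DLL over whatever is already installed. */
        BOOL InstallSharedDllSketch(void)
        {
            char tmpFile[MAX_PATH];
            UINT tmpLen = sizeof(tmpFile);

            DWORD result = VerInstallFileA(
                0,                      /* no flags on the first attempt             */
                "MYCTRLS.DLL",          /* source file name (on the install media)   */
                "MYCTRLS.DLL",          /* destination file name                     */
                "A:\\",                 /* source directory                          */
                "C:\\WINDOWS\\SYSTEM",  /* destination directory                     */
                "C:\\WINDOWS\\SYSTEM",  /* directory of any currently installed copy */
                tmpFile, &tmpLen);

            if (result & VIF_MISMATCH) {
                /* Existing file differs (version, language, ...); a real installer
                   would ask the user before forcing the install. */
                tmpLen = sizeof(tmpFile);
                result = VerInstallFileA(VIFF_FORCEINSTALL,
                                         "MYCTRLS.DLL", "MYCTRLS.DLL",
                                         "A:\\", "C:\\WINDOWS\\SYSTEM",
                                         "C:\\WINDOWS\\SYSTEM", tmpFile, &tmpLen);
            }

            /* Zero means success; any VIF_* bits say what went wrong (and a leftover
               temporary file named in tmpFile may need to be cleaned up). */
            return result == 0;
        }
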
  5. Jim says:

    Aren't we lucky that we have Windows dominating the PC? Imagine if we had ten or a dozen operating systems competing in the market. That would be a chaotic world. This way we can blame or praise or ask for help from a single source, and the single source will make a proper solution for the majority of our problems.

  6. tobi says:

    @raymond in Thursday, January 21, 2010 9:12 AM by acq:

    You are right, they should have used VerInstallFile IF they had had the same interests as Microsoft. But they didn't, so they did what was best for them. And that is why Microsoft has to force them to serve Microsoft's needs by making it hard to copy the DLL over.

  7. Barry Kelly says:

    Hey! I wrote a bill printing system for a company when I was 15…

  8. a random passerby says:

    @Jim: I politely disagree. I think that if Windows hadn’t emerged as the clear leader in PC markets then MacOS would be the one and only choice… outside of Linux, which would still be mired in documentation and config-file hell and would still never come into significant use among casual users.

    Without Windows or MacOS, I think Linux might have had a chance, and I don't know what would have happened. That'd be a very different world. The potential for chaos is high, but the same can be said of letting any and all applications copy whatever DLLs they want into C:\Windows\System32.

  9. Cooney says:

    Sure, Windows didn't ship with an updater. We saw how well that worked; meanwhile unix (the OS, not the kernel) generally does have package management. The Linux variants generally use RPMs, and those things seem to work a whole lot better than on Windows.

    That said, this seems a bit incoherent – why are you bolting a gripe about installers onto a reasonable complaint about people's expectations of the OS? It isn't reasonable to allow apps to randomly update pieces of Windows, and that's a large part of why the rule for a very long time was to reinstall Windows every 12 to 18 months.

    [Once again, people are focusing on the example instead of the principle. Pretend I never talked about installers, then. (Oh, unix didn’t have package management for the first few decades of its existence either.) -Raymond]
  10. dave says:

    "You are right, they should have used VerInstallFile IF they had had the same interests as Microsoft. But they didn't, so they did what was best for them."

    The problem with that is, what was best for them was generally the worst possible thing for the customer.

    "You’ve got foo.dll version 42?  Too bad, we want foo.dll version 41, so we’re replacing it because our app needs to work".

    Preventing idiot application programmers from screwing up my PC is not "serving Microsoft’s needs", it’s serving my needs.

    (Though I find the SFP mechanism to be, uh, strange at best. I’d much prefer an approach where Windows determined the home address of the guy who wrote the installer, and then sent a hit squad there).

  11. PhilW says:

    The install problem is actually a bit harder than just converting scripts. If that install CD from 1997 doesn’t install on Windows 7, nobody is ever going to give you a replacement CD image that does install. So there are things like installer elevation and the program compatibility assistant to help get the actual bits onto the system. After that, there is at least a chance that the app will run, or if it doesn’t that the vendor might supply you a fix.

  12. Anonymous says:

    @random passerby

    You’re saying that if it weren’t for Microsoft and Apple, Linux would have been the only choice?  I don’t think so.  It’s naive to think that if MS-DOS had never existed, some other player we don’t think of today or maybe haven’t heard of wouldn’t have come along and ended up playing the same role MSFT did.  Linux’s introduction in the 90s is too late to have done this.


    You are confusing the modern history of Linux (with its RPMs and DEBs and whatnot), or even 1990s Unix (with the various incompatible package managers that the vendors wrote), with the older history of Unix.  Unix didn’t have a package manager for a long time.  What Raymond writes is accurate.  And if you don’t think Raymond has his Unix bona fides, grep the Linux kernel’s credits file for "raymondc".  :P

  13. Alexandre Grigoriev says:


    Why don't you run that legacy program under a VM (VirtualPC or other)? You could use a USB-to-COM adapter then.

  14. Alexandre Grigoriev says:

    One possible solution would be to provide virtualized environment compatible with older OSs for legacy software, and make it run completely isolated, except for some shared folders.

    Overall, the tendency should be toward more process/app isolation: installed DLLs not shared, child window handles process-local (unless specifically asked for), inter-process SendMessage blocked except for messages well known and specifically permitted by the target, etc. And kill DDE broadcasts.

  15. dasuxullebt says:

    Does Win64 still support Win16 applications, anyway?

  16. Conor says:

    You cannot stop developers doing something other than "Best Practice" with their software. After all, that’s why MS is still in business (as opposed to Lotus/Broderbund/Visicorp/etc).

    You can however say "You cannot have the Designed for Windows XP/Vista/7 logo unless your application meets certain requirements".

    Like running LUA. Like not adding Everyone:Full Control to the Program Files directory (yes, I've seen that done). Like not polling the registry, something that gets up Mark Russinovich's nose, apparently.

    MS has bent over backwards to meet demands that old software works (UAC?) and hasn’t got much praise for it.

  17. Gabe says:

    Nathan: you talk about Henry Ford’s visits to scrap yards as if they were a bad thing. As a car owner, there’s little value in, say, some axle outlasting the rest of my car. If Ford can make the axle using less steel, the car can not only be cheaper (saving me money) but also lighter (making the car accelerate faster, handle better, stop shorter, and use less fuel — saving me money for the lifetime of the car).

    Since this scheme saves me money and enables Ford to sell more cars (because they’re cheaper), why would any of us not want it?

  18. Teo says:

    "Does Win64 still support Win16 applications, anyway?"

    It does not, because the CPU cannot run them. Rewriting the NTVDM as a CPU emulator is not exactly a simple project, and I do not believe that it benefits enough people to be implemented.

  19. asdf says:

    "Imaging if we had a 10 or dozen of the operating system competing in the Market. That would be a chaotic world."

    There are at least ten Linux distributions alone.

  20. Teo says:

    Please read "So your installation has to distinguish between 7.." as "So your imaginable self-made installation program has to distinguish…"

  21. mikeb says:

    > Does Win64 still support Win16 applications, anyway?

    No – Microsoft finally decided that Win64 was far enough removed from Win16, and Win16 support unimportant enough today, that it wasn't worth supporting.

    There's supposed to be some support for recognizing certain installers (why do installers keep getting mentioned on this thread?) that use Win16 code as part of their platform detection, but that's a pretty particular edge case. As I understand it, this isn't handled by actually running the Win16 code, but by giving the installer whatever result it would expect on a Win32 system. Or something like that.

  22. Cooney says:

    [Once again, people are focusing on the example instead of the principle. Pretend I never talked about installers, then. (Oh, unix didn’t have package management for the first few decades of its existence either.) -Raymond]

    Installers are 2/3 of the post and a serious point of pain. Further up, you asked how an app could install shared components without permissions – the answer is that it probably shouldn’t be doing anything in a shared location, and this has nothing to do with the internet.

    If you hadn’t brought up installers, I would have nothing to gripe about.

  23. Duke of New York says:

    grrr why doesn’t Windows provide an API to do X in ten lines or less

    also why is it so friggin’ BLOATED >:-[

  24. WndSks says:

    The law of physics might not stop you, but the law of MS says: if you want the Windows logo, your installer has to use MSI; no NSIS/Inno/etc. for you!

  25. Teo says:

    @WndSks, that WAS true up to 7/2008 R2. Microsoft came to their senses (oh gods, thank you, thank you), and the rules for certification are now much, much simpler. And MSI is gone as a requirement :-D

    Raymond, did you miss the fact that MS actually put us into your 3-step plan? Because of the infamous PCA job, my perfectly working program, which adheres to the XP/Vista logo requirements, BREAKS on 7 (step 1). Until I add the totally magic GUID to the manifest, the program is broken (step 2). So, what again is your point? That MS tries to keep working programs that adhere to the contracts from breaking? It doesn't. It used to, up to the 2000 pre-SP1 era.

    [Now I will have more ammunition when people advocate the three-step plan. “We tried it and Teo hates it.” More proof that no matter what you do, somebody will call you an idiot. -Raymond]
  26. Cooney says:

    Unix installs apps under /usr/local/appname, the apps themselves don’t install random system libs, multiple versions of a lib can exist at the same time, you as a user, can install an app under your home dir and not disturb the system.

    Also, I can overwrite a file that’s open and restart the process for very fast turnaround. This is useful in a number of places, although not specifically relevant to installs.

    [“Unix installs apps under /usr/local/appname”. “Unix” doesn’t install anything. Installers install stuff. It may be that by convention apps install under /usr/local/appname, but that’s just a convention. As with Windows, there’s no law of physics preventing an installer from violating the conventions. -Raymond]
  27. CN says:

    Now I feel bad for writing that bill-printing system when I was 12 (I am not making this up). The worst part was that I had to do it in Win16, which was already, well, not state of the art. But it had to run on an already ancient 386.

    It was pretty neat, though. I even got a 24x CD player and 16 MB worth of SIMMs.

  28. @Cooney

    There's nothing preventing you from installing to /bin, /usr/bin, etc. In fact, if you install a lot of stuff from source, it will install there by default and bork your system…

    The main difference is that Linux ./configure scripts will typically accept a prefix flag, but that's not really enforced by the system. That's just how autotools does things.

  29. Dean Harding says:

    I think the main difference between Linux and Windows is that on Linux, installation is maintained by the distribution maintainers rather than the application developers.

    If Microsoft had a department where I could send my program to them and they’d send me back a fully-functioning .MSI (or added it to the "Windows package database" automatically) that adhered to all of the standards and then promised to maintain that .MSI through subsequent releases of Windows then all of these problems would be solved!

  30. hexatron says:

    Well, last week I told my neighbor who popped for a nice new Win7 64bit PC that his wife’s favorite Scrabble game would not run on it. Ever. At least I found him a $10 replacement (32bit version), but it was still not what he wanted to hear. Note: this user is a NORMAL PERSON with no interest in or knowledge of computer guts.

    There is probably significant demand for 16-on-64 emulation, and I expect it will come, though perhaps not terribly legally. After all we have had Z80 & etc emulation for years, all wrapped neatly together in MAME.

  31. hexatron says:

    Responding to the point of this post–I think most programmers are still quite hard-working, but the sheer number of programmers employed today means many many of them will be less than smart.

    A smart programmer can produce canned-type code, specialized for a particular application, pretty damn quick. Say, a tree optimized for quick insertion, or (as I actually did today) replacing all substrings of the form ^n^ (n an ASCII integer) with the UTF-8 superscript-n characters – took me about 15 minutes with 2 recompiles for errors.

    But an average programmer could spend at least a day producing a version of a linked list that compiles and often works correctly. He might leave that steaming turd in working code, where the stench might not be detected for years. I believe this is the base motivation for all the libraries of frequently-performed operations.

  32. MarcT says:

    "Is there even anybody still around who understands 16-bit Windows enough to be able to undertake the port?"

    If only such a person had a blog…

  33. Anonymous Coward says:

    ‘Historically, Windows didn’t tend to provide functions for things you can already do yourself’

    Well, X certainly took that to the extreme. Widgets? Draw them yourself. Nowadays on a modern GNU/Linux distribution you get KDE and most things look consistent, but every now and then you still bump into something that draws its own widgets and without exception does so badly.

    As for the installer issue: when 95 came, the developers could have decided, without causing much more trouble to software developers (who were having to switch anyway), that no, you cannot install things yourself; you must let the OS do it. By the time 98 came it was too late, of course; I realize that.

    But the fact remains that whether you let the application do it or the OS do it is a design decision, and you cannot sweep it under the carpet by saying 'the OS doesn't need to do it because you can do it yourself.' You could have decided not to let the application do it, or decided that since almost every application needs to do it, there's a benefit in a safe and consistent unified design.

    [If Windows 95 did what you suggested, then any app purchased prior to 1995 couldn’t be installed on a Windows 95 machine. I suspect customers wouldn’t like that. And Teo wouldn’t like it either – he hates MSI. -Raymond]
  34. Alexandre Grigoriev says:

    @Duke of New York,

    Non-bloated Windows is 10 MB or so. The rest ("BLOAT") is the part that allows you to do X in 10 lines of code or less.

  35. Cheong says:

    >Unix installs apps under /usr/local/appname, the apps themselves don’t install random system libs, multiple versions of a lib can exist at the same time, you as a user, can install an app under your home dir and not disturb the system.

    I think it would be better if there were no "strong name" requirement to install files as side-by-side assemblies. Perhaps Windows should automatically rename DLLs copied to system32 by appending the company name and file version to the file name (although not as exact as a strong name, a DLL from the same company with the same version number is most likely the same file, and it seems like over 75% of the DLLs out in the wild contain version info), and then software companies would be less likely to "install" them in the broken way. For files without version information, Windows could give me the option to deny the copy (with a checkbox to apply the same decision forever).

    >Also, I can overwrite a file that’s open and restart the process for very fast turnaround. This is useful in a number of places, although not specifically relevant to installs.

    I really like the idea of replacing the DLL files I need without restarts. It significantly reduces the downtime for the systems.

    [It reduces the downtime but makes writing the DLL much, much harder since it now has to interoperate with earlier versions of itself. -Raymond]
  36. Teo says:

    1. Windows does something wrong (lacks an installation/update framework), but unix does the same wrong thing too, therefore it is not wrong? Either I am missing something (English is not my strongest language) or your logic is flawed.

    2. “After all, at the end of the day, installing an application’s files is just calling creat and write with the right arguments.” You have manually installed a Win32 assembly in the assembly store, right? FYI, it is not even documented (“use Windows Installer” is what the docs say). Or perhaps you have installed a driver recently? Hint: the syntax of the .inf files is incompatible between 2000 and later OSes. Oh, wait. I am convinced that you had finished installing a WMI provider *just before* writing that sentence! Because, well, .mof files are incompatible between XP and later OSes. By the way, running the MOF compiler at the end is just creat and write, but reaching that end might be a very long and perilous journey. And do not get me started with installing:

     COM servers

     NT4-style services

     WDM drivers

     non-WDM drivers

     .Net assemblies participating in COM interop

     SQL Server Express redist.

     Wait, wait, have you tried installing a Microsoft Cluster resource? Please explain to me how I can do it with just creat and write. Let this be your next code-based article; it will be enlightening how the real men do it!

    After I sleep on it, I will probably come up with more examples. Moral: back in the Windows 3.0 days, installing could be as simple as copying some files and writing two .ini files, but 20 years later things have changed. Windows has rich functionality, but this costs in terms of complexity. Getting even a simple application installed correctly is increasingly difficult. Example? Imagine you have a game that consists of one exe and one .dat file with a total size of 8 GB. Aside from the Game Explorer registration stuff (yeah, it is not exactly “just creat and write”), Windows 7 ARP has the bug that it accepts only sizes below 4 GB (or MS did a poor job of documenting it). So your installation has to distinguish between 7 and older versions of Windows. And so on and so on. Installing consists of more than copying files (at least if you want to be able to uninstall cleanly).

    [By “installing an application’s files” I meant “placing an application’s files on the hard drive”. Sorry it wasn’t clear from context. -Raymond]
  37. Dean Harding says:

    @Teo: 99% of applications written for Windows 2000/NT (and earlier) will install and run unmodified on Windows 7. So if yours is in that 1% bracket that doesn't, you still can't extrapolate from one example and say that Microsoft has stopped caring about app compat.

    Also, I think a lot of people have missed the point with the Unix thing. In Unix, there's *also* nothing stopping an installer from overwriting stuff in /sbin or /lib or whatever. So if MSI is the Windows equivalent of RPM/DEB/etc and Windows\System32 is the equivalent of /sbin, then what, exactly, does Unix do differently/better?

  38. Teo says:

    @Dean Harding, thanks for the heartening words; unfortunately for me, I have to rewrite, retest, and redeploy *my* application, not the 99% of the apps written for 2000/NT. My customers prefer me to bring them new features. I prefer fixing my bugs. Instead, I am forced to work around MS stuff. My program might be in the 1%, but it invalidates the whole second part of Raymond's post – if we were out of business, who would fix this problem? Furthermore, my customers, being pure innocent souls, blame ME for changes that MS introduced. Cool, ain't it?

  39. Worf says:

    @Mc: Either use a VM, Win7's XP Mode, or maybe even DOSBox will work… the latter for a 64-bit OS.

    @hexatron – the problem is, AMD (who created the x64 mode) decided that no one runs 16-bit code anymore and dropped 16-bit support while in 64-bit mode. The CPU can only run 32-bit and 64-bit programs when running a 64-bit OS. They decided that those who really need it can run a 32-bit OS just fine. (x64 adds a pile of new stuff and is quite a departure from x86.) Intel got to the x64 game late.

    Your solution then is to dual boot Win7-x64 and Win7-x86 – you can probably use the same key and everything. x86 for the old stuff, which works just fine, and x64 for whatever needs 64-bit… and no, running a 32-bit VM probably won't work.

  40. alex.r. says:

    Your post is insightful, and I clearly understand that decisions are not always purely technical and that they are rarely reversible.

    But the people asking for high-level services in the OS aren't completely misguided, even if you consider how things were many decades ago. I think the line between what an OS should do and what a program should do isn't as easy to draw as you make it out to be.

    It’s always possible to let developers do something on their own instead of providing functionality.

    You want to write that ‘file’ to that disk? I give you the disk manufacturer and its model and you can send the device some command by writing to this address. Now go, do it on your own.

    Sure, some application could corrupt the whole disk, and that would be bad… bad enough not to let people do it on their own — hence the need for an API provided by the OS.

    But the decision of what warrants this kind of separation is subjective, and some OSes do (did would be more accurate…) a lot more than others… even list management.

    I’m a nitpicker I guess.

    [Since the file system is a shared resource, it lies more on the side of something an OS would get involved in. But a linked list is a private application data structure; there is no reason for an OS to get involved. And sometimes the OS involvement is only in the form of guidance (as installation has historically been – you can ignore the guidance and write your own custom setup program instead of using MSI, just ask Teo above). -Raymond]
  41. Pi says:

    It's fascinating. For as long as this blog has existed, Raymond has repeatedly tried to explain the thinking that goes into decisions affected by backwards compatibility issues. It's always the same:

    • An old version of Windows did something that wasn’t good or didn’t use a concept that wasn’t even known back then.
    • Enforcing the "correct" way of doing things now would break old applications, causing grief to a group of people.

    • Not enforcing the "correct" way of doing things now allows for annoyances in new applications, causing grief to another group of people (although someone could belong to both)

    Microsoft will more often than not decide to go the way that doesn't break backwards compatibility. Sometimes it will not.

    -> The comment section is filled with people who don't understand the logic behind it and/or think that Microsoft is wrong beyond any shred of a doubt, simply because they belong to the group that is affected negatively by the decision Microsoft made in that instance. Hint: you are not the center of the world!

    I am.

  42. Duke of New York says:


  43. scorpion007 says:

    Commenter Pi captures my thoughts completely.

  44. Neil says:

    @Mc: often DOS apps only recognise COM ports that use the I/O ranges reserved for legacy COM1-4 i.e. 3F8, 2F8, 3E8 and 2E8, which unfortunately lie outside the range of ports allowed for PCI/USB virtual ports. (I can’t speak for VM serial port emulation.)

  45. MarkJ says:

    I really do think it’s great that Microsoft work so hard on backward compatibility for Windows programs.

    Off Topic: Can we have backward compatibility for VB6 source code, please? Why is it so hard to migrate it to VB.Net? How come the Visual Studio VB6->VB.Net upgrade wizard is so weak that third parties can make a good living selling replacements for it? How about buying those companies and making the products free?

  46. Arno says:

    As hinted in other posts, it would help to let programs specify which OS version and thus set of rules they are programmed against. Then you’d never break existing binaries, but you’d force vendors who want to benefit from newer OS features to not only cherry-pick those features but fix up their programs so they are current with respect to other aspects of the new OS version (stricter enforcement of API rules, deprecated APIs, etc.). MS is partially going that route with app-specific compatibility code and DLL manifests, but it hasn’t been made a principle.

  47. Ben says:

    Not this crap again.

    You can have a managed installation system; it doesn't imply breaking old apps. You just need to have strong motivators in place for apps to use the new system (i.e. no UAC prompt if all there is to do is copy a few files to a new folder under Program Files and create a shortcut; then make UAC more annoying, and eventually put up a warning that says "unless this application is from 2010 or before, you really shouldn't allow this").

    However, the solution I prefer, and by far the most pragmatic, is decent "virtualization" (I mean at the filesystem/registry level, like it's already done to a certain extent). That way the only apps that can't be isolated are the ones that break anyway (the ones that use drivers, or worse).

    My gut feeling tells me Windows will go with the second route, but not before it has lost more market share (strong competition is the only way Microsoft ever tries to improve anything). I hope web stuff puts more pressure on it.

    [I guess I don’t understand what you mean by a “managed installation system”. To me, that means something like MSI or APT or RPM, where the app doesn’t actually install itself but rather describes how it wants to be installed and lets an installation engine do the work. How do you reconcile that with old apps who use a custom setup.exe program? If you just run the setup.exe program and let it do whatever it wants, then that’s not managed! -Raymond]
  48. PhilW says:

    To Cooney:

    "Installers are 2/3 of the post and a serious point of pain."

    And that’s because nobody (slight exaggeration) follows best practices for setups. It is in reality a specialisation, like writing drivers, designing databases, etc, not a twenty minute rush job to get the app out the door. It doesn’t help that many companies underpay setup developers and overestimate the ease of building a setup.

  49. clodney says:

    MC:  Out of curiosity, what is the old OMR device you have?  OMR machines tend to change very slowly, and the manufacturers love to pick up each other’s customers, so it is quite possible that there is replacement software/hardware for you.

  50. Mihai says:


    My two cents: I think the version in the name is nice if it stops at level 2, or max 3; otherwise you end up with stuff like WinSxS, with tens of thousands of files and the same file in tens of versions.

    Real example from my machine: 6.0.6000.16681, 6.0.6000.16717, 6.0.6000.16757, 6.0.6000.16764, 6.0.6000.16809, 6.0.6000.16830, 6.0.6000.16851, 6.0.6000.16890, 6.0.6000.20823, 6.0.6000.20879, 6.0.6000.20927, 6.0.6000.20937, 6.0.6000.20996, 6.0.6000.21023, 6.0.6000.21046, 6.0.6000.21089, 6.0.6001.18000, 8.0.6001.18702

    Ok, backward compatibility is hard, but allowing differentiation all the way to build number encourages really sloppy programming discipline.

  51. hexatron says:

    I popped back here just now, and will put another 2×2¢ in:

    Worf says "Your solution then is to dual boot Win7-x64 and Win7-x86" (to run 16-bit programs)

    This was in reference to a neighbor, not me. He's doing well to successfully use a single boot.

    I wouldn’t dream of suggesting to him dual-boot or virtual PC or any other normal-person-untenable solution. It’s like the joke about the two old-west miners. One of them is bit on the privates by a rattlesnake. The other guy has a first-aid book. He looks up what to do for a rattlesnake bite. He tells his pal, "You’re gonna die."

    As for burgeoning WinSxS: I looked at my work PC and was mildly aghast to discover several gigs of repetitive glarg there. But when I looked at real users' PCs, WinSxS was less than 100 MB. And it works, it really works, and is painless if you don't look under the hood. This is not true of the linux/unix .so approach. So I think it is a big win for users and for Microsoft.

  52. Teo says:

    Sorry Raymond, I used to hate MSI. About half a year ago I realised that it is exactly as unable to solve my problems as any other installation program I've tried, so now I am indifferent to it. But what does MSI have to do with Windows 7 breaking a working application which strictly adheres to the rules of the previous version of Windows?

    I find even more problems with your post. If Windows is not supposed to come with programming frameworks, why does it come with them?

    * MFC

    * .Net (Bonus point – it even includes COMPILERS to 3 languages – C#, VB.Net, JScript.Net)

    * Jet Red

    * Jet Blue

    * XML pull parser – xmllite.dll

    * XML DOM parser – msxml6.dll

    * XML SAX parser – msxml6.dll

    * Complete scripting environment with runtime and a VM – the activescript engine with jscript

    * SQL databases – two of them – Access in the form of MDAC and Windows Internal database, which is a repackaged SQL Server 2005 express

    * and so on and so forth.

    Every one of these can be solved outside Windows, and some of them are solved better. Yet Windows includes them.

    If Windows isn't supposed to implement linked lists, why did the NT team choose to implement them and export them for use by kernel-mode code? I believe (but am not sure) that they did this back in the early 90s for the first version of NT, so it's hardly something new and flashy. Oh wait, while I was checking MSDN for the singly-linked-list functions, I found this treasure: RtlLargeIntegerShiftLeft! Yes, Windows ships with a function that shifts integers left! Raymond, you really must accept your defeat :-D

    [Windows comes with that stuff but that stuff is not part of Windows. They are frameworks that Windows itself uses. The kernel folks exported those functions out of the goodness of their heart, much like the shell team exported GetEffectiveClientRect. Just because you get a gift once in a while doesn’t mean gift-giving has become a policy that you can count on. -Raymond]
  53. Leo Davidson says:


    "it would help to let programs specify which OS version and thus set of rules they are programmed against"

    That exists now as part of application manifests. There’s a general OS version and you can also turn on/off certain features/compatibility options individually (e.g. "high DPI" support) for your exe.

  54. Teo says:

    [Now I will have more ammunition when people advocate the three-step plan. "We tried it and Teo hates it." More proof that no matter what you do, somebody will call you an idiot. -Raymond]

    You are not idiots, that's for sure. But Microsoft very often mis-communicates important changes, the "I am compatible with Windows 7" manifest being just a random manifestation of it. When it is almost impossible to find this information on MSDN, and I know it only because I follow almost two dozen MS blogs, there is a problem. That problem is for me as a Windows programmer, because it prevents me from creating well-behaved programs; for MS, who must support them; and for users, who are infuriated. It is created by Microsoft, so MS is best suited to fix it. Another good example of the same problem was when, almost 6 months after the release of the Vista WDK, Google knew about KeExpandKernelStackAndCallout but MSDN was adamant that no such thing existed.

  55. Miles Archer says:

    "no matter what you do, someone will call you an idiot"

    I think I’m going to call this Raymond’s Law from now on.

  56. violet says:

    “Windows doesn’t come with a finite element analysis library either.”

    No, but it does come with DirectCompute, which would have been an unthinkably specialized niche API ten years ago.

    The broader point being that OS requirements, APIs, and integration points change. I mean, in principle, I can rewrite Explorer, right? And in the days of 3.1, that might even have been a reasonable idea (speaking of File Manager, of course)! But outside a very small niche, it’s a stupid idea now, because even though Explorer is Just Another Program ™, it is for all intents and purposes very much part of Windows. As the OS changes, the line between what programmers can and can’t reasonably do shifts.

    “Once again, people are focusing on the example instead of the principle. Pretend I never talked about installers, then. (Oh, unix didn’t have package management for the first few decades of its existence either.)”

    It didn’t. And that presumably sucked. Which is why the majority of Unixes do now.

    It’s not reasonable to ask why Windows didn’t *always* have a package manager. A better question might be: given the popularity of the iPhone’s App store, and it must be said, the various Linux packagers, why doesn’t it *now*? (There may well be a variety of good reasons. But it’s not a terrible question.)

    [MSI has been around for a long time now, but everybody seems to hate it. -Raymond]
  57. Gabe says:

    violet: DirectCompute is an OS-level API because it manages a shared resource (the GPU). It is needed to allow multiple applications to access the GPU for things other than graphics.

  58. The cat says:

    Can I refuse the gift of roughly 300 MB (the .NET Framework) bundled with the OS?

  59. violet says:

    "[MSI has been around for a long time now, but everybody seems to hate it. -Raymond]"

    Like I said, there might be good reasons. "The Windows software ecosystem just isn’t geared towards that," would be one. That said, MSIs aren’t really the same thing. The value of apt isn’t that I can double-click on a .deb file and have it automagically install. Nor is it that apt guarantees that random .deb packages will cleanly install and uninstall–it doesn’t. The value of apt is that I can browse a functionally complete listing of all software I might want to install on my system.

  60. Teo says:

    If the first paragraph were correct, please explain the existence of DeleteFile, which is a one-line function, or CopyFile/CopyFileEx, which you claim should not exist. After all, DeleteFile is CloseHandle(CreateFile(… FILE_FLAG_DELETE_ON_CLOSE)).

    While CopyFile is slightly longer, it’s still less than 5 pages of code, and that’s just because of the awful hellishly horrible Win32 API (compared to the undocumented Zw* one).

    Bonus question. If “Windows worried about providing functionality for things that programs couldn’t do” is right, please explain why Windows NT [3.1 – 6.0) lacks an API to enumerate named file streams, which is required to implement CopyFile. Yes, BackupRead can be coerced into getting that information, but that's misuse of a side effect, not a proper API. The same applies to APIs for extended file attributes.

    [I was personally surprised to find CopyFile in Win32. (And DeleteFile is probably there because it corresponded to the DOS “delete” function and people would be weirded out that there was no DeleteFile function.) -Raymond]
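
    (A minimal sketch of the delete-on-close equivalence Teo describes, assuming the file isn't already open elsewhere without FILE_SHARE_DELETE:)

        #include <windows.h>

        /* "DeleteFile by hand": open the file with delete access and the
           delete-on-close flag; the file goes away when the last handle closes. */
        BOOL DeleteFileByHand(LPCWSTR path)
        {
            HANDLE h = CreateFileW(path,
                                   DELETE,
                                   FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
                                   NULL,
                                   OPEN_EXISTING,
                                   FILE_FLAG_DELETE_ON_CLOSE,
                                   NULL);
            if (h == INVALID_HANDLE_VALUE)
                return FALSE;
            return CloseHandle(h);
        }
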
  61. Teo says:

    Ha ha, yeah, nice one for DeleteFile. I believe I know the answer to why CopyFile exists – it has to manage things like named streams, extended attributes, and compressed and encrypted files. If it wants to be efficient, it has to account for the peculiarities of the source and target file systems, volume managers, the memory manager, the cache manager, and very often the network redirector. And it has to work reliably and fast with unknown redirectors, like the ones coming with Hyper-V/VMware/etc., the WebDAV one, and what not. Basically, what looks like a single while loop suddenly explodes into unimaginable complexity.

  62. David Moisan says:

    DOSBox works fine for games and is good enough that one company uses it to republish classic DOS games so they work on contemporary Windows, including my 64-bit system.

Comments are closed.