It takes only one program to foul an upgrade


"Worst software ever." That was Aaron Zupancic's cousin's reaction to the fact that Windows XP was incompatible with one program originally designed for Windows 98.

Then again, commenter Aaargh! says "The bad code should be fixed, period. If it can't be fixed, it breaks, too bad." Perhaps Aaargh! can send a message to Aaron's cousin saying, "Too bad." I'm sure that'll placate Aaron's cousin.

Comments (32)
  1. not a .net developer says:

    Neither of the links worked for me.

  2. Bap says:

    Post Aaron’s cousin’s e-mail address and I’m sure at least one of your fine readers will be happy to tell her "too bad".

    Whether it placates her or not is her problem to work out.  Sounds like she’s got issues anyway.

  3. Cooney says:

    AARGH! makes a good point in a sideways sort of way – the shocking thing about Windows is not that it works poorly (sometimes/frequently), but that it works at all. It really is a feat that XP still runs software written against Windows 3.1.

    That said, this is probably a case of poorly written software relying on some version-specific artifact, like writing to HKLM or something. Add a shim if you must, but don’t allow that in the name of compatibility.

  4. Oof says:

    Wow. That Aaron Zupancic critter has one of the most hilariously awful prose styles I’ve ever endured. I love the way subnormal types always go straight for the malapropisms when they want to "sound smart". I wonder if it’s even capable of understanding that words have meanings? Probably not.

  5. M$ says:

    Supporting old buggy apps is microsofts game. The backward compatibility would have been dropped a long time ago if it wasn’t lucrative.

  6. Dean Harding says:

    The backward compatibility would have been dropped a long time ago if it wasn’t lucrative.

    Duh!

    I don’t understand, are you saying it’s a bad thing that Microsoft makes money?

  7. Cooney says:

    I don’t understand, are you saying it’s a bad thing that Microsoft makes money?

    I actually do run into people who object to me making money – it’s as if me selling something to them for more than I paid for it is immoral.

  8. Mitch Tenderson says:

    Supporting old buggy apps is microsofts game. The backward compatibility would have been dropped a long time ago if it wasn’t lucrative.

    Lucrative as in… the customers request it? Yeah, that sounds like a terrible way to make software.

  9. Dan says:

    A major place where some old 9x apps fail is when they try to access serial or parallel ports or write to the hard drive directly (i.e. raw data)… no-nos in NT.

    You’re really supposed to use drivers instead of the former two.  Programs that support legacy serial/parallel connections… and I mean more obscure hardware other than printers or scanners… typically use special drivers to gain low-level access to the port they need, to work around this.

    As for the last, I imagine it’s a security blockade against malware rendering your machine unbootable by messing with boot sectors, or wiping your hard drive by fiddling with the partition table.  I don’t believe there is a workaround for this.  I have a hex editor that can read raw data from drives at the disk and partition levels, but the "write" command is always grayed out on XP… probably only enabled on 9x/ME, methinks, if I’m right about all this.

  10. Aaargh! says:

    The only reason this person is complaining is because people EXPECT things to continue working after an upgrade because this has always been the case.

    No one is complaining that their old VHS tapes won’t work in their DVD player because no one ever promised them they would.

    How much time does the average computer spend trying to determine if it should execute the normal code path or the backwards-compatible one? When is MS going to stop providing backwards compatibility?

    You could have done it the way Apple did with the ‘classic’ environment. Ensure backwards compatibility in a way that doesn’t compromise the integrity of the new OS and at the same time make it very unattractive to keep using the old software.

  11. Robert Moir says:

    Aaargh,

    The strengths and weaknesses of a virtualised or emulated environment have been discussed here before. It’s good. It’s very good indeed. But it isn’t a panacea.

  12. foxyshadis says:

    Mac Classic actually generates a whole lot more complaints than Windows’ backcompat has, even with fewer users. It’s good, but I know several people who downgraded back to OS 9 or never upgraded because they hated it so much.

  13. James Risto says:

    OldNewThing, you talk about compat a lot. Any insights on the
    future? Can you see a day where Windows ships with a virtual subsystem
    that runs old apps, perhaps freeing the real machine for new things?

    [1. Suggestions go into the Suggestion Box. 2.
    Please read the “Topics I am not inclined to cover” list before making
    a suggestion. -Raymond
    ]
  14. Myron A. Semack says:

    Aaargh raises an interesting question (indirectly).

    How expensive (in terms of CPU and memory) are these backwards-compatibility "hacks" that pervade Windows?

    If Microsoft produced a version of Windows that skipped all of the backwards compatibility stuff, how much faster would it be?  And how much smaller would the memory footprint be?

    Can some Microsoft employee answer this?  Or at least provide a ballpark estimate?

  15. Ben Ryves says:

    I find it absolutely remarkable that the Windows 2 control panel applet runs under XP SP2, and can even change system properties (window border sizes and so on). Windows 2’s Notepad also runs.

    The fact that XP’s DOS support is sufficient to run a handful of VESA games (Tomb Raider runs in high-resolution mode – with sound) is also staggering.

  16. JamesNT says:

    Ben,

    And most of that stuff runs on Windows Server 2003 and Small Business Server, also.

    I believe Raymond made a post about this…

    JamesNT

  17. James says:

    "Can you see a day where Windows ships with a virtual subsystem that runs old apps, perhaps freeing the real machine for new things?"

    I can. I can see that virtual machine being called NTVDM.EXE (the NT Virtual DOS Machine), and running a big chunk of Windows 3.1 (krnl386.exe) for compatibility with 16 bit apps, under the name WOW (Windows On Windows). That day started quite a few years ago, though.

  18. J says:

    "No one is complaining that their old VHS tapes won’t work in their DVD player because no one ever promised them it would."

    If your HD-DVD player didn’t play regular DVDs, that wouldn’t bother you at all as long as the manufacturer never promised you it would?

    (assuming you have a large DVD collection)

  19. Cooney says:

    If your HD-DVD player didn’t play regular DVDs, that wouldn’t bother you at all as long as the manufacturer never promised you it would?

    Who in their right mind would buy an HD-DVD player that won’t play DVDs? Actually, why would you ever buy HD-DVD? That thing is so hamstrung as to be completely useless and, as an added benefit, it can be declared incompatible with your new HDTV if the manufacturer feels like it.

  20. Dean Harding says:

    Who in their right mind would buy an HD-DVD player that won’t play DVDs?

    I think that’s his point: you wouldn’t buy an HD-DVD player that doesn’t play DVDs, just like you wouldn’t buy Windows version N+1 if it didn’t run all of your programs. It doesn’t matter what the manufacturer promises or doesn’t. Microsoft WANT you to buy Windows N+1, so they make sure all of your old programs will work.

    Let’s not get into the (many) vices of HD-DVD/BluRay, though… ;)

  21. Norman Diamond says:

    Tuesday, November 21, 2006 6:55 PM by Dean Harding

    you wouldn’t buy an HD-DVD player that doesn’t play DVDs

    In principle I agree.  (In practice it’s vacuous for me, because the region code garbage persuaded me not to buy DVD videos.)

    just like you wouldn’t buy Windows version N+1 if it didn’t run all of your programs

    Not really true.  I bought NT4 for one of my home machines knowing that it wouldn’t run a lot of Windows 95 and MS-DOS programs.  Later I bought Windows 2000 knowing that it still wouldn’t run some Windows 95/98 and MS-DOS programs.  I was astounded to read a previous reply here saying that XP SP2 will run Tomb Raider with sound, but I’m sure I have other programs that won’t run under XP.

  22. Chris says:

    Myron has a point. I’d add: what’s the cost in terms of features Windows doesn’t have/can’t have because compatibility gets in the way?

    I’m not advocating blowing off compatibility, but I am saying it is worthwhile pursuing a more balanced approach. There comes a time to drop support for ancient software where it is actively hindering newer features.

    For the record, the people who said they’d never upgrade to OS X are the people who hated Aqua or the new (still borked) Finder. The Classic environment worked and continues to work well, as long as you didn’t need most third party drivers (except USB drivers, which were/are actually supported).

  23. Anony Moose says:

    The cost? In terms of market share? I’d say absolutely nothing. I suspect that Microsoft hasn’t lost one ten-thousandth of a percent of their market share due to putting "compatibility" ahead of "new features that break compatibility for a single 15-year-old DOS program".

  24. Dean Harding says:

    Anony Moose: But how do you know that? I’d say Apple’s and Microsoft’s policies towards backwards compatibility are a pretty good indication.

    Apple are pretty good TODAY in terms of backwards compatibility, but if I had wanted to upgrade to a Mac back in the day, I would have also had to replace all of my APPLE II software. When I upgraded from MS-DOS to Windows, it all still worked.

  25. Myron A. Semack says:

    FYI, I asked my question because I’m genuinely curious.  Do the backwards compatibility “hacks” of Windows reduce the system’s performance?  If so, by how much?

    Note that I’m NOT asking about the political and economic costs
    (market share, engineering man-hours, etc).  I’m asking about the
    performance and memory footprint costs.

    I’m just genuinely curious if we live with performance degradation
    because of backwards compatibility.  There must be at least SOME
    cost, at least in terms of memory footprint, but just how much?

    I think the only people who can really answer that question are ones
    that are familiar with the Windows codebase.  Those of us on the
    outside don’t really know the implementation details of these
    compatibility “hacks”.

    IF there is a significant drop in performance, or increase in memory
    footprint, then that lends some weight to the argument that backwards
    compatibility is “holding back” Windows.

    If Raymond (or some other MS employee familiar with the Windows
    source code) tells us that performance and memory footprint ARE NOT
    significantly impacted, then it takes the teeth out of the people who
    argue that backwards compatibility is holding Windows back.

    I also put this question into the Suggestion Box for Raymond.  I hope that he makes a blog posting about it some day.

    [Okay, posting the same question three
    times is excessive. I already wrote up an answer for you but now I’m
    tempted to delete it since you’re being so annoying about it. -Raymond
    ]
  26. JamesNT says:

    Mr. Chen,

    Although Myron A. Semack is being a noob, I would ask that you avoid the temptation of deleting your response to his question.  I too must confess a serious curiosity regarding the performance impact of the many back compat hacks in Windows.

    Indeed, not only would such a discussion possibly remove the teeth from those who say back compat hurts Windows, but I feel it would also give many of us a much clearer picture of just how elegant Windows is.  

    Think about it, Windows does all the things it does with acceptable performance even with all those hacks.  

    Incredible.

    JamesNT

  27. Myron A. Semack says:

    Raymond,

    I wasn’t trying to annoy you.  I apologize if I was doing so.  It seemed that some of the other posters here were missing the point of my question, so I was trying to clarify myself and also explain the reasoning behind my question.

  28. Dean Harding says:

    I can’t see how returning 2 instead of 3* would come with any "serious" performance cost…

    * That being the compatibility "hack" required in the linked-to post…

  29. Igor says:

    "I can’t see how returning 2 instead of 3* would come with any "serious" performance cost…"

    Hmm, let’s see:

    if (strcmp(appname, "buggycrap.exe") == 0 && (file_length == 64392) && file_crc32("buggycrap.exe") == 0xDEADBEEF) {

       return 2;

    }

    return 3;

    People are just being stupid. They want new faster hardware, a new OS, new features, and when they get all that they run some 20-year-old crap on it.

    As an example just look at all those overclocking morons running SUPERPI as a CPU benchmark. For God’s sake, it is 386 code. It even has waits for floating point instructions, and it changes rounding modes so much (a large penalty on modern CPUs) that just a simple patch to use FISTTP instead of FISTP can speed it up by a whole 12%.

    I am really tempted to start distributing patched kernel versions which would just do NOP on CreateProcess("SUPERPI.EXE") or any similar legacy junk.
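    For the curious, here is a self-contained version of the kind of per-application shim check Igor parodies. The application name, file size, and checksum value are all invented, and this is a sketch of the idea, not actual Windows code:

```c
#include <stdint.h>
#include <string.h>

/* Standard CRC-32 (reflected, polynomial 0xEDB88320), bit-by-bit for brevity. */
static uint32_t crc32_buf(const void *data, size_t len)
{
    const uint8_t *p = (const uint8_t *)data;
    uint32_t crc = 0xFFFFFFFFu;
    while (len--) {
        crc ^= *p++;
        for (int i = 0; i < 8; i++)
            crc = (crc & 1) ? (crc >> 1) ^ 0xEDB88320u : crc >> 1;
    }
    return ~crc;
}

/* Hypothetical shim check: only the one known-buggy app (identified by
 * name, size, and checksum; all three values are made up here) gets the
 * old error code, and every other program gets the correct one. */
static int shim_error_code(const char *appname, size_t file_length, uint32_t file_crc)
{
    if (strcmp(appname, "buggycrap.exe") == 0 &&
        file_length == 64392 &&
        file_crc == 0xDEADBEEF)
        return 2;   /* old behavior the buggy app depends on */
    return 3;       /* correct value for everyone else */
}
```

    Note that the expensive part (checksumming the executable) would only need to happen once, when the process starts, not on every call.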

  30. Dean Harding says:

    Igor: Let’s keep things in perspective here. The code would actually be more like:

    if (filename[0] == 0) { return 2; }

    And, more importantly, this is in code that opens files – an operation which is inherently quite expensive (requiring access to the file system), so even if you had 100 such tests, the difference is totally insignificant.

    To be honest, I haven’t heard of any compatibility fixes on this blog that WOULD significantly impact performance. The main downside to compatibility fixes is in complexity. They would make it harder for the Windows developers to get in there and fix actual bugs, because it’s hard to tell what’s a bug and what’s a compatibility fix. But I guess sufficient commenting could help to mitigate that.
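    Dean’s one-liner, expanded into a self-contained toy so the cost of the branch is visible. The function name and the success value are invented for illustration; per Raymond’s note further down, error 2 for an empty filename is what MS-DOS itself always returned:

```c
#include <string.h>

/* Toy model of the compatibility branch under discussion: the
 * empty-filename case short-circuits with error 2 ("file not found")
 * before any of the comparatively expensive file-system work happens.
 * Not actual Windows code; just an illustration of how cheap the
 * compatibility check is relative to the operation it guards. */
static int open_file(const char *filename)
{
    if (filename[0] == '\0')
        return 2;   /* compat: the answer old programs expect */

    /* ... the real (expensive) open would happen here ... */
    return 0;       /* success, in this toy model */
}
```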

  31. Igor says:

    Dean Harding said:

    “The code would actually be more like:

    if (filename[0] == 0) { return 2; }”

    Your code would always return 2 if filename is “”.

    That would do fine for those programs which do not check return code and just use it as a handle.

    But if you have a properly written program which prompts for a filename (for example a BIOS flash utility) and the user just hits <Enter>, it will get 2 instead of the rightfully expected 3 which is mentioned in the docs, if I understand correctly.

    So you still have to test for misbehaving programs. That certainly wastes some cycles.

    [Any program that expects error 3 didn’t work on
    any version of MS-DOS since it always returned 2 in this case. I don’t
    know where you’re concluding that 3 is the return value required by the
    documentation. -Raymond
    ]

Comments are closed.