The real cost of compatibility is not in the hacks; the hacks are small potatoes


Commenter Myron A. Semack asks how much faster Windows would be if you took out the backward compatibility stuff. Myron is so anxious about this that he asked the question a second time. Asking a question twice typically counts as a reason not to answer it, but since I had already written up the answer, I figured I'd post it anyway. Oh great, and now he asked it a third time. Myron is so lucky I already wrote up the answer, because if I hadn't I would've just skipped the topic altogether. I don't respond well to nagging.

The answer is, "Not much, really."

Because the real cost of compatibility is not in the hacks. The hacks are small potatoes. Most hacks are just a few lines of code (sometimes as few as zero), so the impact on performance is fairly low. Consider a compatibility hack for programs that mess up IUnknown::QueryInterface:

...
ITargetInterface *pti = NULL;
HRESULT hr = pobj->QueryInterface(
                 IID_ITargetInterface, (void**)&pti);
// Some applications return a success code from QueryInterface
// but forget to set the output pointer; treat that as a failure.
if (SUCCEEDED(hr) && !pti) hr = E_FAIL;

The compatibility hack here was just two lines of code: one to set the pti variable to NULL, and another to check for a common application error and work around it. The incremental cost of this is negligible.

Here's an example of a hack that takes zero lines of code:

HINSTANCE ShellExecute(...)
{
    ...
    return (HINSTANCE)42;
}

I count this as zero lines of code because the function has to return something. You may as well return a carefully-crafted value chosen for compatibility. The incremental cost of this is zero.

No, the real cost of compatibility is in the design.

If you're going to design a feature that enhances the window manager in some way, you have to think about how existing programs are going to react to your feature. These are programs that predate your feature and naturally know nothing about it. Does your feature alter the message order? Does it introduce a new point of re-entrancy? Does it cause a function to begin dispatching messages that previously did not? You may be forced to design your feature differently in order to accommodate these concerns. These issues aren't things you can "take out"; they are inherently part of the feature design.

Consider for example color NTSC. (Videophiles like to say that NTSC stands for "never twice the same color.")

The NTSC color model is backward compatible with the existing system for black-and-white television. How much cheaper would your color television be if you could take out the backward compatibility circuitry? That question misses the point. The backward compatibility is in the design of the NTSC color signal. It's not a circuit board (or, to be more historically accurate, a set of vacuum tubes) that you can pull out. You can't "take out" the compatibility stuff from your television set. The compatibility is fundamentally part of the way the NTSC color signal works.

Comments (42)
  1. Rosyna says:

    I wouldn’t consider good error checking to be backwards compatibility hacks.

    But, how different would Windows be if it didn’t have to take backwards compatibility for undocumented behaviour into consideration when improving features? Well, I guess it’d partially be Windows x64 which could have tossed most backwards compatibility hacks if the app was 64-bit since obviously those apps don’t need backwards compatibility with anything.

    [In other words, you want to make it harder for people to port their programs to 64-bit Windows. “I have this program that works just fine on 32-bit Windows, but it crashes randomly on 64-bit Windows. 64-bit Windows is so buggy.” -Raymond]
  2. CGomez says:

    So what is it exactly about Vista that is making life harder for people?

    Let’s ignore for the moment that people just might not get it.  You’ve said many times, when people have problems, they blame the OS.

    My basic theory is that since Vista pushes the "standard user" login, many applications that are still writing to Program Files and HKLM are broken, and were always broken (people just ran as admin) and it’s easy to just blame Vista.  Games are notorious for "requiring Admin privileges" to cover up for lazy developers.  You can even see this requirement on the box of some of the top-selling games of all time.

    Is this just that particular competitors made enough funny but inaccurate commercials?  After all, I made zero upgrades, haven’t seen a UAC prompt for ages, and still use all of my peripherals but one from a nasty vendor who has simply decided to drop support.

  3. Adrian says:

    Another cost of the backward compatibility hacks is that they hide bugs, making it much harder to find them.  Defensive code is great, but not in your debug builds.

    If my QueryInterface is messed up, it may be because of a deeper problem.  Fixing up the HRESULT may mask the problem, reducing the chance it’ll get noticed or extending the time it takes to diagnose.

    Are there debug versions of core system DLLs (kernel, user, gdi, etc.) that developers can use while debugging?  It would be cool if you could enable something assertion-like that would notify developers every time a compatibility hack saved their butts.

    [Debug versions of Windows have been included with the SDK since Windows 1.0. Just goes to show that even if you make it available, nobody will use it. And then they’ll complain that you didn’t make it available. -Raymond]
  4. Rosyna says:

    Raymond, no, it’s about enforcing the documentation. The app developer is going to have to recompile their application for x64 anywho, why not take the opportunity to make it better? I mean, I thought the SDL features were also about making things more secure, which may mean old hacks didn’t work.

    And I’m not sure I understand, if you go x64 on windows, a lot of features are automatically enabled by default (especially security ones). They’re disabled by default (like DEP in IE7) on 32-bit due to backwards compatibility issues.

    [Engineering is all about trade-offs. Sometimes it tips one way, sometimes it tips the other way. -Raymond]
  5. Mike Dimmick says:

    @CGomez: If Windows Vista’s UAC is enabled, and an application is run which doesn’t have a requestedExecutionLevel in its manifest (or no manifest at all), file system and registry redirection kick in. These redirect writes to some per-machine areas of the file system (e.g. Program Files, Windows) and the registry (e.g. HKLM\Software) to separate per-user stores. In Windows Explorer, you get a Show Compatibility Files button if there’s a corresponding redirect folder for the current folder.

    If UAC is disabled the ACLs are processed and an Access Denied error may be generated.

    If a UAC-compatible manifest is present which sets a requestedExecutionLevel (asInvoker, highestAvailable, requireAdministrator) then, if the user allows the program to run (if highestAvailable or requireAdministrator), the redirects do not occur – you are expected to have fixed your program. As such you’ll get Access Denied if you try to write to privileged locations (and you’ve not changed the default ACLs).

    UAC isn’t actually that hard to understand for a developer.

  6. markus says:

    Don’t forget the cost to any newcomer when learning the API, since old API concepts, calls, argument types, and result types linger around long after they are obsolete. To understand such an "organically grown" API, you effectively have to understand many of the preceding versions, even if they are totally irrelevant today.

  7. Mike Dimmick says:

    Oh, and the technical reason is that the colour part of the NTSC signal only carries so-called ‘chroma’ information. The black-and-white ‘luma’ signal is used as the actual value for one of the colours (green, IIRC) and the chrominance subcarrier (which is filtered out by a black-and-white TV’s tuning circuits) contains only the differences between the levels of the other colours and the luminance level. This has the useful additional effect of generally reducing the bandwidth required for the chroma signal.

    PAL works much the same except that the phase of the chroma signal is inverted for each line compared to the previous one (hence Phase Alternating Line) to reduce the effect of noise. The French SECAM system has a compatible mono signal but encodes the colour differently, so UK receivers, if able to pick up a signal, could traditionally only see a monochrome picture (nowadays European receivers are multistandard and can decode and display PAL-60 and SECAM on their analogue tuners).

    Similarly in FM stereo radio, the compatible mono signal is treated as Left + Right, while the ‘stereo’ subcarrier carries the differences between left and right (Left – Right). To get ‘left’ you sum the two signals (L+R + L-R = 2L), while to get ‘right’ you invert the stereo signal and sum (L+R + -(L-R) = L+R + -L + R = 2R).

  8. DEngh says:

    <OT>One would think that a person who "specializes in the development of low-level software for mission-cricial … systems" and a student of education could spell- and grammar-check their web presence before getting persnickety with others.  Sorry, this adds nothing but I couldn’t help myself.

  9. Gabe says:

    The kind of compatibility hacks that can be removed in Win64 include a few major categories:

    1. Things that were required for 16-bit; since 16-bit apps cannot run on a CPU in 64-bit mode, anything required strictly for 16-bit apps can be removed.

    2. Things that require code to be rewritten; for example, anything that generates code will need a 64-bit code generator anyway, so making the generated code work with DEP (instead of relying on DEP being off) can be part of that rewrite, and thus DEP can be enabled by default. There are many 32-bit programs that generate code (JITters) or have self-modifying code which would break if DEP were on by default.

    3. Things that require a recompile; for example it’s impossible to make a 32-bit driver work on Win64, so at the very least it will have to be recompiled. As long as it can be recompiled, it can be signed, so signed drivers can be made a requirement on Win64.

  10. Illuminator says:

    Once upon a time while debugging a nasty localization bug, my coworker and I had to look at the source for the Windows edit control.  I wear that emotional scar to this day.

  11. wndpteam says:

    In addition to the Debug Builds of various Windows binaries, there is a nice tool called the application verifier (http://msdn2.microsoft.com/en-us/library/Aa480483.aspx). Some of us (in networking at least) are thinking of adding more rules to that for pointing out to developers that they are using an API incorrectly, but taking advantage of the existing set of warnings it generates is a good thing.

     — Ari

  12. Wolf Logan says:

    Nitpicking the comments now…I’ve sunk to a new low…

    "The black-and-white ‘luma’ signal is used as the actual value for one of the colours (green, IIRC) and the chrominance subcarrier (which is filtered out by a black-and-white TV’s tuning circuits) contains only the differences between the levels of the other colours and the luminance level."

    Sorry, this isn’t correct (and assuming there’s anyone reading who really wanted to know the scoop on NTSC and its backwards compatibility…)

    The original broadcast TV signal in the US was a simple DC voltage level, with some special timing characteristics. The level was used to drive the intensity of the (one) electron gun in early TVs, producing the fabled black-and-white TV signal.

    When color TV was introduced, the NTSC wanted to be able to transmit color TV signals that would still work on older B&W TVs. That meant that any changes to the signal format to include color information had to be designed such that an older TV would produce a reasonable B&W picture. Introducing a "breaking change" to the signal would have cut out about 95% of the TV market.

    They came up with a remarkably clever scheme. They divided the color signal into three parts: the "Luma" (luminosity, the brightness of the picture), the "Chroma Hue" (the particular color of the picture) and the "Chroma Saturation" (how intense the color was). The Luma signal was the same as the old broadcast signal, so older TVs would see this as their entire picture. The Chroma was encoded as a high-frequency "rider" on the Luma, with its frequency fixed (at 3.58MHz), its phase encoding the Chroma Hue, and its amplitude encoding the Chroma Saturation. The Chroma signal’s phase was synchronized (in the days before PLLs and highly-accurate oscillators) to a "color burst" repeated regularly in the signal.

    I’m an old hand at NTSC signals, and I still think the engineering solution was stunningly elegant for its time. For more information (than you ever wanted), check Wikipedia [http://en.wikipedia.org/wiki/NTSC].

    And now there’s a "breaking change": the ATSC broadcast standard, which is a digital encoding, provides multiple aspect ratios, resolutions, and compression parameters…and is hardly used at all.

  13. CGomez says:

    @Mike Dimmick;

    I never thought UAC was difficult for a developer.  I’m just positing why there is all this negative press on Vista not being compatible with anything.  Every review or "reputable" magazine out there says "Make sure all your hardware and software works, and be prepared to buy new."

    This is said every time there is an OS release, but for some reason this time it seems to really be resonating.  Reasonable developers that I know are saying "Oh, Vista is crap… just a pretty interface over XP."

    Back to the point.  With everything Raymond espouses about MSFT taking great pains to ensure backward compatibility, why is there such a negative view of Vista?  I mean, assuming that many people at MSFT share the view that breaking changes just make users grumpy, there shouldn’t be many problems?

    And I shared my example to show just how few problems I’ve had.  One to be exact… vendor’s fault.

  14. James Risto says:

    Perhaps what people need to understand better is that a major Windows design point is backwards compatibility. Pretend that the MS developers are told that this is job #1. Easy to understand decisions now … and since performance is a close #2, it is great that it performs as well as it does, and I am glad that my kids’ crappy games still run. Saves me $$ … so WINDOWS SAVES ME MONEY.

  15. Rich Shealer says:

    The backwards compatibility entries remind me of DOS and 16-bit Windows applications that would work fine in Windows ’95, but would fail in NT and OS/2.

    The operating system was simply enforcing boundaries the application was crossing. But the blame was squarely placed on the platform.

    I also remember the old Parity Check error screen, sort of a Blue Screen of Death in those good old DOS days. The memory system was halting the machine because of a known hardware problem that would eventually corrupt data. Many folks flipped the non-parity switch to "avoid" the issue. Yes, it was rotten when you lost that spreadsheet, but it was worse when the hard disk’s FAT became scrambled.

  16. Raymond,

    For what it’s worth, I honestly wasn’t trying to nag you.  I’m sorry if it came across that way.

    Post #1: I asked the question in a blog comment.  I had forgotten that your blog has a Suggestion Box.  I screwed up.

    Post #2: I said to myself, "Oops, I should have put the question in the Suggestion Box".  So, I re-posted it there.  I figured that if I didn’t post the question in the Suggestion Box, it wouldn’t get answered.  I was trying to fix my mistake.

    Post #3: I was actually responding to one of the other people who were posting comments.  I was trying to explain my question and the reasoning behind it.  Post #3 wasn’t directed at you.

    So, actually, I only asked you the question twice, not three times. :-)  And I only asked it twice because I made a mistake about where I posted it the first time.

  17. John Dempsey says:

    Coming at it from further out, NT was an upgrade; Win95 was a compatibility hack. As was Win98, and Millennium. Not entirely true, but not entirely false, as fleets of 3.1 applications took liberties with the underlying 16-bit USER.EXE architecture. Win95 programs were slowed by this, as lots of core calls were serialized. For clean-and-fast, run NT!

  18. Alan says:

    In my opinion backwards compatibility is a non-issue. Suppose an application is designed and tested for Windows 2000. When Windows XP or Vista came out, it should not be assumed to work for those platforms. Rather, appropriate porting and testing needs to be done first.

    When .Net 1.0 initially came out, it appeared that the versioning features would actually work like this. So if an application targeting .Net version 1.0 was developed, it would execute under 1.0 even if 1.1, 2.0, 3.0 etc. co-existed. This gave me so much hope that .Net could make a clean break from backward compatibility and actually fix the real problems instead of just supplementing them.

    However, in reality nothing has changed. People want their old code to work unported and untested on new platforms and new versions. So old broken code remains and new, fixed functions are added.

    As a developer, it’s a little depressing sometimes.

  19. J says:

    "As a developer, it’s a little depressing sometimes."

    As a developer, I love it.  I hate being called back to work on old projects to port them to new systems.  BORING.

  20. Craig Williams says:

    J,

    That is a rather irresponsible attitude. Even with assumed backward compatibility, you need to at least verify that your application still functions appropriately on a new version and platform. Depending on the results, some porting may be necessary.

  21. Andrew R says:

    I don’t think NTSC is a good example of cheap compatibility; it wasn’t (and isn’t) cheap at all; the compatibility “circuit” is everything involved in YUV.

    The only reason the YUV colour-space exists at all is because original B&W is roughly what we now consider the “luminance value”.  Every CRT/LCD television can display only 3 colours; Red, Green and Blue.  YUV doesn’t map into that colour-space except by a massive loss of precision (RGB555 usually goes to 11.5 bits).  

    It requires special conversion circuits and sometimes multiple conversion passes.  There is utter madness going on inside MPEG-2/4 set-top-boxes because of this compatibility.

    BTW; PAL has the same problem.

    [I didn’t say it was an example of cheap compatibility. On the contrary, it’s an example of how compatibility is intertwined with design. Oh, right, here’s where I said it: “The backward compatibility is in the design of the NTSC color signal. It’s not a circuit board (or, to be more historically accurate, a set of vacuum tubes) that you can pull out. You can’t “take out” the compatibility stuff from your television set. The compatibility is fundamentally part of the way the NTSC color signal works.” -Raymond]
  22. Raptor says:

    Well Raymond, you’ve pointed out that compatibility hacks don’t put such a high penalty on performance, since, and I quote, “the real cost of compatibility is in the design.”

    So logically the question should be rephrased as: how much “better designed” Windows would be if you took out the backward compatibility stuff?

    And the answer is?

    Raptor

    [I don’t know what designs were scrapped for compatibility reasons or how good they were. -Raymond]
  23. Dean Harding says:

    And the answer is?

    The answer is totally irrelevant because nobody would be using it. You can have the most elegant design in the world, but what’s the point if nobody wants to use it?

  24. Worf says:

    "And now there’s a "breaking change": the ATSC broadcast standard, which is a digital encoding, provides multiple aspect ratios, resolutions, and compression parameters…and is hardly used at all."

    Maybe you meant not used as much, considering TVs that have ATSC decoders only tend to be a few years old, and stations broadcasting in ATSC really came online the past few years?

    Fact is, people have noticed that their cable and satellite feeds of local channels look worse than their ATSC counterparts. Cable and satellite compress the heck out of their HDTV signals, while ATSC tends to have few others sharing subchannels, thus leaving plenty of bandwidth for the signal. One regular channel on cable can easily host 4-16 channels using digital cable – 64-128 channels for music. If your TV supports QAM, do a channel scan – even encrypted channels will cause it to lock, but it won’t decode.

    The recent resurgence in the humble UHF loop antenna is proof – this time, people who want the best picture and sound ditch their cable/satellite locals, and use the OTA versions. Which is why satellite receivers and the Series3 TiVo have not only their regular cable/satellite input, but also ATSC.

  25. Craig Williams says:

    Dean Harding,

    Please speak for yourself. I for one would use it and welcome it. I am tired of all of the “compatibility hacks”. As a developer I spend more time reading documentation for all of the technical limitations, workarounds, and “gotchas” to poorly designed functions than actually writing code. .Net 1.0 was a huge improvement. However, with future versions it’s just a repeat performance.

    Backward compatibility is not necessary and only gives developers a false sense of security. Assuming that a product developed for a specific platform and version will automatically work untested and unmodified on a future unknown platform and version is irresponsible and unrealistic. Even with the current backward compatibility, a certain level of testing is still required and some changes are occasionally needed.

    [There’s a great operating system for you; it’s called OS/2. Designed by the same people who designed Windows, but without the compatibility. The function names and parameters are more consistent, the interfaces more uniform, check it out. -Raymond]
  26. Dean Harding says:

    Craig: I’m sure TV manufacturers bemoan the fact that NTSC had to be backwards compatible with B&W, too…

  27. Ben Cooke says:

    I wonder if lessons learned with past APIs cause Microsoft platform developers to write their API functions a lot more defensively than those of the early days?

    I’ve been burned by similar things (on a much smaller scale!) myself, and I’ve learned that callers will take any liberty they can get away with, so it’s best to make them crash and burn rather than silently fixing their stuff. Obviously old functions can’t crash and burn like this, but new functions can.

    I wonder if the .NET Framework wasn’t intended in part to solve this problem. "Managed code" is by definition a lot more rigid and "un-abusable" than the native code functions of the Windows API.

  28. Jon says:

    Nobody correctly nitpicked the NTSC example:

    The color frame rate of NTSC is 29.97 fps (interlaced) due to a very complicated choice of color carrier frequencies in order to prevent distortion to black and white TVs (the luma carrier in general). It involved several years of work by the Rand corporation and has prime numbers and common denominators.

    The fact that the frame rate was dropped by 0.03 fps meant the power supply could no longer be synchronized to AC power. This greatly complicates the high-voltage power supply design of CRT televisions to this day.

    PAL is 25 FPS even, but due to the large number of countries and their compatibility issues there’s an amazing number of variants (PAL-A thru N + K’)

  29. John Topley says:

    "Backward compatibility is not necessary and only gives developers a false sense of security."

    Backward compatibility is necessary for Microsoft’s bottom line!

  30. Triangle says:

    Since the hacks are so small, why is the AppPatch directory on Windows XP almost 5 MB?

    [Those are a different category – application-specific compatibility hacks. Most general-purpose hacks are small. -Raymond]
  31. mikey says:

    i think godwin’s law is going to need to be supplemented with ‘mikeys law of windows’ which says: ‘As an online discussion grows longer, the probability of a comparison involving UAC approaches one.’

  32. mikey says:

    @ craig. you are pretty cute and naive. it will be nice to see you on the other side of the ‘newbie to commercial compatibility’ phase.

    basically, you need to realise that ms is satisfying corporate need here. they don’t necessarily write the programs they are using, but they want them on the new os anyway. if it doesn’t work, they won’t upgrade, and ms does not earn money. the simple answer? make them work. if it takes compat hacks, then so be it. at least windows continues to be commercially viable, and while it’s viable we can add NEW things and GOOD things, while keeping the big paying customers on board.

    the blog would be a much quieter place if people understood this, and similar, concepts :)

    i, as a developer, am very happy and often impressed with ms’ compat implementations and stories.

  33. J says:

    "That is a rather irresponsible attitude. Even with assumed backward compatibility, you need to at least verify that you application still functions appropriately on a new version and platform."

    No I don’t.  That’s what we pay our testers for.  If no issues appear, then I don’t do anything.

  34. Craig Williams says:

    mikey,

    With over 30 years of experience in software development, I am hardly a "newbie" anymore.

    I fully realize why backward compatibility exists. However, developers constantly abuse it: assuming that their application targeting a specific platform and version will magically work on future versions without any testing (and needed porting). I have seen countless applications fail due to this.

    J,

    If you’re at a large enough company there will often be testers. However, at many smaller places, developers are the testers. And developers usually do not want to test (not talking about unit tests).

  35. Julian says:

    I agree with Alan and Craig.

    Each new release of the API and framework gets uglier, dirtier, and more complicated. The broken functions and types are left largely frozen (backward compatibility) while new ones are made to replace them.

    I was much more productive with older versions of the Windows API than I am now. Perhaps the most productive I have ever been was with the initial version of the .Net Framework. However, since 1.1 and especially 2.0, it is taking more time to sift through the compatibility "trash".

    While I would like to always write perfect code, I do make mistakes. When this happens, I would hope that my application would fail so that I can fix it. However, all too often internal hacks hide the problem.

    That is not to say that I do not appreciate the newer functionality. However, I would like the trash that it replaces to be removed. It will not break my app, because when I developed it, it was for a specific version, which will not change.

    The trash continues to grow, but no one is willing to take it out.

  36. Triangle says:

    One person’s trash is another person’s treasure.

    You can bet that there are multi-billion-dollar corporations with internal tools that rely heavily on that trash.

  37. Cheong says:

    [quote user="Craig Williams"]

    Backward compatibility is not necessary and only gives developers a false sense of security. Assuming that a product developed for a specific platform and version will automatically work untested and unmodified on a future unknown platform and version is irresponsible and unrealistic. Even with the current backward compatibility, a certain level of testing is still required and some changes are occasionally needed.

    [/quote]

    As a customer, I’ll be sad if the manufacturer no longer maintains the code.

    Consider CompanyA’s multi-port modem boards. They include drivers for Win2k, but the driver software does not work on WinXP or above. CompanyA was later bought by CompanyB, which refuses to provide free software updates for original CompanyA customers. An additional HKD$5000 must be paid for the new driver software bundle. That’s the exact reason why we still have one Win2k server that has not been upgraded to Win2003.

    It’s sad that both CompanyA and CompanyB are large and famous ones.

  38. Craig Williams says:

    Cheong,

    Sounds to me like the customer got what they paid for: drivers designed and tested for Windows 2000. Why would you expect it to magically work on another platform? Even _if_ it did work on Windows XP or 2003, that is only by chance and is not something that you can rely on without proper testing and verification. Same with other versions. There are a great many applications (especially games) designed for XP that will not run properly under Vista. Backward compatibility did not help as much as thought.

    [Thank you. Next time a magazine article says “Windows (k+1) sucks because of crappy hardware support”, please write them a letter explaining that they shouldn’t expect Windows (k+1) to support Windows (k) drivers in the first place. -Raymond]
  39. Neil says:

    [In other words, you want to make it harder for people to port their programs to 64-bit Windows. “I have this program that works just fine on 32-bit Windows, but it crashes randomly on 64-bit Windows. 64-bit Windows is so buggy.” -Raymond]

    So, all the 16-bit hacks were faithfully reproduced in 32-bit and now in 64-bit?

    [Not sure what you mean by a “16-bit hack”. 64-bit Windows doesn’t support 16-bit applications, so all of the “If this is a 16-bit program then do X differently” hacks, even if present, never fire. -Raymond]
  40. Dean Harding says:

    So, all the 16-bit hacks were faithfully reproduced in 32-bit and now in 64-bit?

    Wow, you’re so smart for showing up Raymond like that! Please give your full name and address and I’ll send you a medal.

  41. Zagor says:

    So each time new Windows (k + n) version comes out we should buy all our peripherals over and over again?

  42. rolfhub says:

    So each time new Windows (k + n) version comes out we should buy all our peripherals over and over again?

    No, clearly not, but we should keep in mind that it isn’t only Microsoft’s responsibility to make sure that the hardware still works flawlessly. In a perfect world, the hardware companies would provide driver updates for every piece of hardware that is still in use by their customers. The same goes for software that relies on the APIs provided by Microsoft.

    Clearly, the real world isn’t perfect, so Microsoft adds enough hacks so that most of the old hard- and software works with newer OS versions, because otherwise, nobody (=not enough people to make it a success) would use the new OS.

    So basically, MS doesn’t have much of a choice, so it would be a bit unfair if we blamed developers like Raymond for slowing down the OS due to compatibility hacks, OK?

    Also, I think the application and hardware developers don’t always have much of a choice either, because I wrote "… the hardware companies would provide driver updates to every piece of hardware that is still in use by the customers …" – and that can be a long time. Sometimes, hard- and software is still in use that hasn’t been sold for many years. So it would be damn expensive to still provide updates.

    And the customers also don’t have that much of a choice, because buying a completely new system every few years can be just too expensive, so old hard- and software is sometimes used as long as possible.

    Seems as if nobody really has much of a choice …

Comments are closed.