Getting MS-DOS games to run on Windows 95: Too much memory!


Piggybacking on Roger Lipscombe's story of the MS-DOS extender that didn't work if you had 64MB of RAM:

There was a popular MS-DOS game from 1994 that didn't run in Windows 95. After some investigation, the conclusion was that the game didn't work if your computer had more than 16MB of memory (physical, if running under MS-DOS; virtual, if running under Windows). The 16MB limit comes into play because the game was written for the 80286 processor, and that processor supports a maximum of 16MB of RAM. I guess that when the game found more than 16MB of memory, it didn't know what to do with the extra memory; maybe it overflowed a buffer, or a calculation overflowed. Whatever. Doesn't matter.
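
To make the "maybe it overflowed a buffer" guess concrete, here is a minimal sketch of one way that could have played out; this is hypothetical code, not the actual game's. A memory table sized for the 80286's 16MB maximum (256 blocks of 64KB) is fine until the host reports more memory than the table can describe:

```c
/* Hypothetical sketch, not the actual game's code: a memory table sized
 * for the 80286's 16MB maximum, indexed by 64KB block. */
#include <stdio.h>

#define BLOCK_SIZE      (64UL * 1024UL)
#define MAX_286_BLOCKS  (16UL * 1024UL * 1024UL / BLOCK_SIZE)   /* 256 blocks = 16MB */

struct block_desc {
    unsigned long base;
    unsigned char in_use;
};

static struct block_desc blocks[MAX_286_BLOCKS];   /* "can never need more than 256" */

/* Build the block table from however much memory the host reports.
 * With 16MB or less this is fine; report 24MB and the loop writes
 * 128 descriptors past the end of the array. */
static void init_memory_table(unsigned long reported_bytes)
{
    unsigned long nblocks = reported_bytes / BLOCK_SIZE;   /* no clamp! */
    unsigned long i;
    for (i = 0; i < nblocks; i++) {
        blocks[i].base = i * BLOCK_SIZE;
        blocks[i].in_use = 0;
    }
}

int main(void)
{
    init_memory_table(24UL * 1024UL * 1024UL);   /* a 24MB machine -> buffer overrun */
    printf("table has room for %lu blocks\n", (unsigned long)MAX_286_BLOCKS);
    return 0;
}
```

With the custom configuration capping reported memory at 16MB, a loop like this never runs off the end of its table.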

We fixed the problem by creating a custom configuration for that game that said, "Never give this program more than 16MB of memory."

This case was interesting because the custom configuration means that the program runs better under Windows 95 than it does under raw MS-DOS: Under raw MS-DOS, it would have crashed!

Comments (32)
  1. Mc says:

    Not quite too much memory, but too much speed: I remember I had an MS-DOS game which I had to stop playing because it ran too fast on my “modern” PC. Switching off the turbo button almost made it playable (went from 32 MHz to 8 MHz, I think), but it was still too fast to be fair. Can’t remember the name, but it was a Dig Dug clone in brown, green and red CGA mode. I tried some TSR programs that slowed the processor down in the background, but the game was then too jerky to be playable.

    1. Mc says:

      Ah! It was called Digger by Windmill Software

    2. Rob Yull says:

      I remember trying to play the original MechWarrior on a modern (and by modern I mean first-generation Pentium) machine. It took all the processor power it could and ran as fast as it could. In combat, the enemy would run up, destroy you, and run away before you even had a chance to react.

    3. BZ says:

      I remember that “Nibbles”, a sample game that came with QBASIC, would attempt to compensate for speed differences by measuring how long a certain number of for-loop iterations took… and by the time Windows 95 was around, this would fail with a division-by-zero error, because the timer it used wasn’t precise enough to return a non-zero difference. Of course, since it came with source code, you could just increase the number of iterations so the check wouldn’t fail.

      1. Scarlet Manuka says:

        Turbo/Borland Pascal had a similar issue: any program using the “CRT” unit would fail when run on a computer with a clock speed over 200 MHz. The CRT unit had an initialisation step which ran a counter for 55ms and then divided by 55 to calibrate a 1ms delay. If the CPU was too fast, the result overflowed, though the error was somewhat confusingly reported as a divide by zero.
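
        The shared pattern behind both of these failures looks roughly like the sketch below (for illustration only; this is not the actual QBASIC or CRT-unit source). A startup routine busy-loops for one 55ms timer tick to calibrate a "loops per millisecond" figure, and it ages badly on faster CPUs in two ways: the measured interval can round down to zero (a genuine division by zero), or the calibrated value can exceed what a 16-bit variable holds (an overflow that Turbo Pascal reported, confusingly, as a divide-by-zero error):

        ```c
        /* Rough sketch of the calibration pattern (not the actual QBASIC or
         * CRT-unit source).  Spin for ~55ms, then derive loops-per-millisecond. */
        #include <stdio.h>
        #include <time.h>

        int main(void)
        {
            unsigned long loops = 0;
            clock_t start = clock();

            while (clock() - start < CLOCKS_PER_SEC / 18)   /* ~55ms, one timer tick */
                loops++;

            unsigned long elapsed_ms =
                (unsigned long)(clock() - start) * 1000UL / CLOCKS_PER_SEC;

            if (elapsed_ms == 0) {                  /* failure mode 1: timer too coarse */
                fprintf(stderr, "division by zero\n");
                return 1;
            }

            unsigned long wide = loops / elapsed_ms;
            unsigned short loops_per_ms = (unsigned short)wide;   /* 16-bit, 1990-style */

            if (wide > 0xFFFFUL)                    /* failure mode 2: too fast a CPU */
                fprintf(stderr, "calibration overflowed 16 bits (runtime error 200 territory)\n");

            printf("calibrated %u loops per ms\n", loops_per_ms);
            return 0;
        }
        ```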

  2. Karellen says:

    My guess would be that it’s some kind of assertion/sanity check. If we’ve calculated that the computer has more than 16MB of memory, something is very wrong, because that “can’t happen”. We *must* have made a miscalculation, or something unexpectedly overflowed/underflowed/got corrupted. If that’s the case, we don’t know what data we can trust, so the safest course of action is to bail out.

    It always worries me when I see a “can’t happen” comment in an “else”/”default” clause… which then *does nothing* of significance. No logging, no bailing out, not even an early return with an error code. Your assumptions or invariants have just been violated, and you’re going to carry on regardless?? Ugh.
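
    For what it's worth, a minimal illustration of the alternative (hypothetical code, not from any particular product): a "can't happen" branch that at least records the violated assumption before giving up, instead of silently carrying on:

    ```c
    /* Hypothetical illustration: a "can't happen" branch that records the
     * violated assumption instead of silently carrying on. */
    #include <assert.h>
    #include <stdio.h>
    #include <stdlib.h>

    enum state { IDLE, RUNNING, DONE };

    void step(enum state s)
    {
        switch (s) {
        case IDLE:    /* ... */ break;
        case RUNNING: /* ... */ break;
        case DONE:    /* ... */ break;
        default:
            /* Can't happen -- but if it somehow does, our state is no longer
             * trustworthy, so log it and bail rather than limping onward. */
            fprintf(stderr, "step: impossible state %d\n", (int)s);
            assert(!"unreachable state in step()");
            abort();
        }
    }

    int main(void)
    {
        step(RUNNING);
        return 0;
    }
    ```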

    1. IanBoyd says:

      Or possibly they were using the high 8 bits of a 32-bit pointer for some flags or extra storage.
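
      If that was the trick, here is a sketch of how it breaks (hypothetical; we don't know the game's actual code): flags stashed in the top byte are harmless while every address fits in 24 bits, i.e. below 16MB, but masking them off corrupts any address above that:

      ```c
      /* Hypothetical sketch of stashing flags in the top byte of a 32-bit
       * pointer value.  Harmless below 16MB (24-bit addresses); lossy above. */
      #include <stdint.h>
      #include <stdio.h>

      #define ADDR_MASK 0x00FFFFFFu            /* the bits that are "really" address */

      static uint32_t pack(uint32_t addr, uint32_t flags) { return (flags << 24) | (addr & ADDR_MASK); }
      static uint32_t addr_of(uint32_t tagged)            { return tagged & ADDR_MASK; }

      int main(void)
      {
          uint32_t below = 0x00A00000u;   /* 10MB: survives the round trip        */
          uint32_t above = 0x01200000u;   /* 18MB: the 0x01 byte is silently lost */

          printf("%08X -> %08X\n", (unsigned)below, (unsigned)addr_of(pack(below, 3)));
          printf("%08X -> %08X\n", (unsigned)above, (unsigned)addr_of(pack(above, 3)));
          return 0;
      }
      ```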

      1. Karellen says:

        I guess it depends on what form of “not working” the game exhibited.

        I wonder if Raymond can remember if it crashed, hung, corrupted itself, or just unexpectedly exited.

    2. Kevin says:

      It depends.

      When you’re making a game, bailing out is almost always wrong. You are guaranteed to annoy the player that way. Glitches from false assumptions might not even be noticed, and if they are, the player probably won’t be as annoyed as they would be by a crash (the only major exception being save-file corruption – crashing may be preferable to writing bad data, depending on how bad the data actually is).

      If you’re making a financial (or worse, medical) app, on the other hand, bailing out is obviously the Right Thing to do. But not everyone is in that boat, and I’m a little tired of developers assuming it.

  3. DWalker says:

    This reminds me of when the popular installer that worked like a shield would complain about “not enough disk space” if there was exactly, or almost exactly, 4GB of free space on the hard drive (back when hard drives were often 10 GB in size). There was a calculation error… which they fixed. They probably designed the software using 1 GB hard drives.

    I don’t know if it’s really that hard to write software that anticipates everything growing in capacity (memory, disk sizes, etc.), but the errors seem awfully frequent.
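
    A hedged guess at the arithmetic behind that installer bug (not the vendor's actual code): free space computed in bytes in a 32-bit variable wraps at 4GB, so a drive with just over 4GB free can look nearly full to a modest space check:

    ```c
    /* Hypothetical reconstruction, not the vendor's code: free space summed
     * into a 32-bit value wraps at 4GB, so "lots of free space" looks tiny. */
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* figures in the style of the old cluster-based free-space query */
        uint32_t sectors_per_cluster = 64;
        uint32_t bytes_per_sector    = 512;
        uint32_t free_clusters       = 131200;          /* just over 4GB free */

        uint32_t free_bytes =                           /* wraps: ~4GB becomes ~4MB */
            free_clusters * sectors_per_cluster * bytes_per_sector;

        uint32_t needed_bytes = 200UL * 1024 * 1024;    /* installer wants 200MB */

        if (free_bytes < needed_bytes)
            printf("Not enough disk space: only %lu bytes free\n",
                   (unsigned long)free_bytes);
        return 0;
    }
    ```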

    1. Darran Rowe says:

      A lot of these problems come down to people never thinking something can happen.
      Disk access was one of the first areas to hit this, and we have seen some wonderful cases of software having issues, generally because hard disks have grown in size the fastest. There was even that driver problem with Windows XP where drivers would wrap around and start writing to the beginning of the disk when the disk got too large. (I think this was the issue with sector addressing.)
      Physical RAM has also had some really interesting issues. I think it was one of the Grand Theft Auto games that used a 32-bit integer to get how much RAM the system had, and then refused to run, claiming there wasn’t enough, on systems with 4GB. I have seen some other games do similar things too, either refusing to run or locking the game at certain quality settings. The only way to get this to work was to not have one of these problematic amounts of RAM; at least Windows doesn’t require you to take it out these days, since you can use boot options to tell Windows to use less RAM.
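
      A sketch of the kind of RAM check described here (hypothetical, not the game's code): querying total memory in bytes into a signed 32-bit value means an exactly-4GB machine wraps to zero and fails the minimum-RAM test:

      ```c
      /* Hypothetical sketch of the RAM check described above, not the game's
       * actual code: total memory in bytes forced into a signed 32-bit value. */
      #include <stdint.h>
      #include <stdio.h>

      int main(void)
      {
          uint64_t actual_bytes = 4ULL * 1024 * 1024 * 1024;   /* a 4GB machine       */
          int32_t  total_ram    = (int32_t)actual_bytes;       /* wraps to 0 here     */
          int32_t  minimum      = 256 * 1024 * 1024;           /* "requires 256MB"    */

          if (total_ram < minimum)
              printf("Not enough memory: %ld bytes detected\n", (long)total_ram);
          return 0;
      }
      ```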

      1. alegr1 says:

        The driver problem with LBA48… (remainder of message deleted due to disrespectful behavior, per ground rules).

        1. smf says:

          Adults make mistakes too.

          Windows 2000 required a hotfix for LBA48 support; Windows XP seems to have shipped with it disabled, though it could be enabled with a registry key; in XP SP1 it was enabled, but you needed a hotfix to stop hibernation and memory dumps from causing corruption; LBA48 finally worked properly in XP SP2.

          Software is written to a budget and nobody wants to pay the price for doing it well, someone messed up. It happens. Adults know this and don’t get huffy.

      2. JanH says:

        Or a certain game dating from approximately 10 years ago, which would automatically set certain configuration settings based on your CPU speed. The trouble was that those CPU speeds were defined in terms of a Pentium 4, which needs a relatively high clock speed per unit of computing power. So if at that time you had a 2 GHz Athlon, which was approximately as fast as a 3.2 GHz Pentium 4, the game nevertheless locked you out of the highest configuration settings unless you manually edited the relevant configuration file.

        1. ender says:

          Game? About a month ago we had to quickly buy a Xeon E5-2620v3 to install some medical database software (which isn’t CPU-bound at all), because the installer refused to proceed on an E5-2603v3, saying it wasn’t at least 2GHz (that same software ran just fine on a Core 2-based Xeon, which had a slightly higher clock).

    2. cheong00 says:

      Adobe software before CS had this problem too (since they’ve published a KB article on it and it happened a long time ago, I guess there’s little problem naming them). I had to create multiple 100GB empty files just to bring the available space on the server below 1TB so that users could save their work on the file server.

    3. ender says:

      Several installers had problems with 4GB wraparound – I remember seeing the problem mentioned in release notes of a fairly popular open-source installer.

  4. skSdnW says:

    Are these compatibility entries stored under …\MS-DOS Emulation\AppCompat\ or in win.ini? The registry entries seem to key off the presence of additional program files, which would make sense because these DOS apps don’t have embedded version information.

  5. Scott H. says:

    One that I remember (because I still play it a lot) is SimCity 2000 for DOS. It pops up a warning that your computer only has “-32767” kilobytes of RAM and it may not work correctly. Fortunately it lets you try anyway and it works fine.
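
    A plausible explanation for that exact number (a guess, not the SimCity source): free memory counted in kilobytes and stored in a signed 16-bit integer wraps negative above 32767KB, and a figure just over 32MB lands right at -32767:

    ```c
    /* A guess at the arithmetic, not the SimCity source: free memory in KB
     * stored in a signed 16-bit integer.  32769KB (just over 32MB) comes
     * out as exactly -32767 on two's-complement machines. */
    #include <stdio.h>

    int main(void)
    {
        long  free_kb  = 32769L;            /* just over 32MB of free memory */
        short reported = (short)free_kb;    /* 16-bit field: wraps negative  */
        printf("Your computer only has %d kilobytes of RAM\n", reported);
        return 0;
    }
    ```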

  6. Yuhong Bao says:

    I think Windows 3.0 had a similar problem that led to a 16MB limit in himem.sys.

  7. Joshua says:

    I recall running a program on Win95 that demanded real mode and said as much up front when you ran it. It had an option to turn off the check, which didn’t help, for the obvious reason that it still needed real mode to actually work. (I’m guessing the idea of turning off the check was to allow it to function with EMM386.exe loaded.)

  8. henke37 says:

    “intentinally” isn’t a word; you might want to fix the summary.

  9. Erik F says:

    It’s not just software that messes up with unexpectedly high values: I remember having several old 8-bit ISA expansion cards that refused to work if the ISA bus speed exceeded around 8 MHz, because they were designed in the PC and XT era, when the bus ran at the same rate as the CPU. Obviously computers would never get any faster, right? Of course! Thus motherboards had to desync the bus from the CPU to maintain compatibility, making the ISA bus slower than molasses compared to the rest of the board; the settings in some BIOSes regarding VGA palette snooping and the 15–16MB memory gap are related to ISA as well.

    I’m glad that hardware makers are generally committed to supporting old hardware almost as much as PC OS developers are to software. Both have to work in conjunction to get systems working at all! I’m aware that there are many quirks in both areas, but for the most part I’ve had very few issues, or at least far fewer than I would expect given the complexity of the systems.

    1. Martin says:

      This might just be due to a physical limitation of the technology used in the cards rather than a lack of forward thinking.

      1. Brian_EE says:

        Things like propagation delay, rise and fall times, slew rates, setup and hold times. You can’t fault the card designers for using the standard TTL devices of the time. Does the boss care if you can run at some unknown future faster speed? No. Does he care that AHC-type TTL devices cost more? Yes.

    2. Falcon says:

      I remember reading about some hardware compatibility problems involving the I/O address space – there were some ISA cards that only decoded the bottom 10 bits of I/O addresses, so any addresses above 0x3FF would wrap around (the x86 I/O address space was, possibly still is, 16-bit).

      If you had one of these cards in the system along with a PCI card that used a higher I/O address range, they could conflict with each other. For instance, the address range 0xF7F8 – 0xF7FF would “overlap” with 0x3F8 – 0x3FF (1st serial port).
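
      The aliasing is easy to see as arithmetic (illustrative snippet only, assuming a card that decodes just the low 10 address lines):

      ```c
      /* Illustrative only: a card that decodes just the low 10 address lines
       * sees every I/O port modulo 0x400. */
      #include <stdio.h>

      int main(void)
      {
          unsigned port;
          for (port = 0xF7F8; port <= 0xF7FF; port++)
              printf("PCI device at %04X -> 10-bit ISA card decodes %04X\n",
                     port, port & 0x3FF);   /* 0xF7F8..0xF7FF alias COM1 at 0x3F8..0x3FF */
          return 0;
      }
      ```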

  10. Dmitry Onoshko says:

    I wonder if Raymond has some knowledge about NTVDM, especially its Sound Blaster emulation details. I wanted to write a small example for my students, who are currently learning x86 assembly and real mode, and it turned out that my Windows XP notebook has big trouble running most (but NOT ALL) of the MS-DOS programs that use any means of sound output other than the internal speaker. Anyway, there are a few of them that still work. Of course, I also tried to write my own program using all the docs available, but it doesn’t play sound in NTVDM either. It does work in DOSBox, though.

    So I guess there’s some kind of trick a program must perform to make NTVDM recognize its intentions.

    Or maybe someone could point me to a useful post or blog by someone from the NTVDM team?

    1. Falcon says:

      There was a third party application which provided improved audio hardware emulation in NTVDM. (I won’t name it, to be on the safe side, but all the keywords required to find it using a search engine are contained in your post; I will leave it up to you to figure out which ones.)

    2. Azarien says:

      NTVDM’s Sound Blaster emulation is so poor and incompatible that it would be better had it not existed.

    3. Dmitry Onoshko says:

      Maybe just to bring some attention to the fact: my investigation shows that, for some reason, NTVDM delays the “execution” of a MIDI message until the next MIDI message is received, i.e. the last one is always delayed. That includes messages using MIDI running status. I can force a MIDI message to be “used” by inserting a status byte (say, an undefined/reserved one like 0xF4)… But that’s what Raymond’s whole blog is about, right? Using undefined or reserved things is asking for compatibility trouble.
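
      For anyone trying to reproduce this, here is the byte-level picture of that workaround (the NTVDM behaviour is as observed above; only the MIDI byte stream itself is standard, and the code is just an illustration):

      ```c
      /* Illustration of the byte stream involved; 0xF4 really is an
       * undefined System Common status byte in the MIDI specification. */
      #include <stdio.h>

      int main(void)
      {
          unsigned char stream[] = {
              0x90, 0x3C, 0x64,   /* Note On, channel 1: middle C, velocity 100   */
                    0x40, 0x64,   /* running status: E above it, same status byte */
              0xF4                /* padding status byte that nudges the emulator */
                                  /* into finally acting on the previous message  */
          };
          unsigned i;
          for (i = 0; i < sizeof stream; i++)
              printf("%02X ", stream[i]);
          printf("\n");
          return 0;
      }
      ```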

  11. Azarien says:

    I played one such game (maybe the same one you talk about in this article); it was possible to trick it under plain DOS by eating memory with a disk cache (SMARTDRV) or a RAM disk, leaving only 16 MB of free memory.

  12. Ray Koopa says:

    If I recall correctly, my SimCity 2000 DOS installer overflowed somewhere around 32MB of RAM in an emulator and displayed negative available memory.

Comments are closed.
