Why was the Windows 95 precursor project code-named Panther abandoned?


A long-ago discussion of the fates of the various cat-themed precursor projects that led to Windows 95 prompted Jack to ask whether the Panther project was abandoned because it was impossible, or because doing it properly would take much more work than just writing a new kernel.

From an engineering standpoint, the two options are almost the same. Something is rarely "impossible" so much as "not feasible from a cost/benefit point of view." Even "make a change that breaks every program ever written" is, strictly speaking, not "impossible", but it definitely comes at a very high cost.

I suspect that the cost/benefit calculations also included the discovery/realization that getting the Windows NT kernel to run in 4MB of memory¹ would require so many changes that it would no longer be what it wanted to be. You couldn't really call it a scaled-down Windows NT kernel any more.

But it was worth a try.

Bonus clarification: My colleague who provided the Windows 95 ship date predictor (and also missed out on his Nobel Prize in Project Management) corrected some of my hazy memories of the Panther project.

The Panther project was more than just running the Windows NT kernel in 4MB. It also ported the 32-bit USER and GDI from Windows NT, and the 16-bit versions of USER and GDI forwarded to their 32-bit counterparts.

When the Panther project was abandoned, the design flipped: Instead of having the 16-bit window manager and graphics engine forward to the 32-bit window manager and graphics engine, Windows 95 had the 32-bit window manager and graphics engine forward to the 16-bit window manager and graphics engine. Over time, the 16-bit window manager and graphics engine themselves did some growing up and became a hybrid 16/32 system.
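To make the direction of that forwarding concrete, here is a minimal C sketch of the idea, with entirely hypothetical names; the real thunks were generated assembly that also had to switch between 16-bit and 32-bit segments:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical stand-in for the 16-bit window manager; in Windows 95
   the real work lived in the 16-bit USER and GDI modules. */
static int16_t SetCaption16(uint16_t hwnd, const char *text)
{
    printf("16-bit engine: window %u -> \"%s\"\n", (unsigned)hwnd, text);
    return 1;
}

/* Hypothetical 32-bit entry point: it implements nothing itself; it
   narrows its parameters and forwards ("thunks") down to the 16-bit
   engine. Panther's plan was the same arrow pointing the other way. */
static int32_t SetCaption32(uint32_t hwnd, const char *text)
{
    return SetCaption16((uint16_t)hwnd, text);
}

int main(void)
{
    SetCaption32(0x1234, "hello");
    return 0;
}
```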

The memory manager from Panther's version of KERNEL32 remained, but the rest of the kernel was rewritten, using Win32s as a reference in many places. Win32s didn't support multithreading, so that code needed to be written from scratch.
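For a sense of what "didn't support multithreading" means in practice, here is a minimal sketch using the real Win32 CreateThread API, the sort of call Win32s had nothing behind and Windows 95 therefore had to implement from scratch:

```c
#include <windows.h>
#include <stdio.h>

/* Thread entry point: the signature CreateThread requires. */
static DWORD WINAPI Worker(LPVOID param)
{
    printf("worker thread got %d\n", (int)(INT_PTR)param);
    return 0;
}

int main(void)
{
    /* One call creates a second thread of execution in this process.
       Win32s had no real implementation behind this, so the Windows 95
       kernel team could not crib it and wrote the code from scratch. */
    HANDLE h = CreateThread(NULL, 0, Worker, (LPVOID)(INT_PTR)42, 0, NULL);
    if (h != NULL) {
        WaitForSingleObject(h, INFINITE);  /* wait for the thread to finish */
        CloseHandle(h);
    }
    return 0;
}
```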

¹ Given that 4MB was the memory for the entire system, getting the Windows NT kernel to run in 4MB of memory meant in practice that it had to run in under 2MB of memory, in order to leave enough memory available to run, y'know, programs.

Comments (38)
  1. kantos says:

    I remember when I got a machine with 32MB of EDO RAM around the 98 (not SE) era; we thought that was awesome. Of course, the same machine had 256MB within a year. Then we bought an AMD Athlon in the Windows ME era, which had a massive 1GB. It’s amazing just how much we take the absurd amounts of memory we have today for granted.

    1. Steve says:

      Pah, youngster. I remember buying a new computer and paying about 100UKP extra to have 4MB of RAM in it rather than the standard 2MB …

      1. Scott H. says:

        Megabytes! I remember my dad taking our Atari 400 from 16K to 64K, and then later going from the 800XL to the Atari 130XE. 128 whole kilobytes!

        1. DWalker07 says:

          I remember my company paying about $50,000 for 256K of memory (on an IBM mainframe computer) sometime in the 1980s or 1990s.

          1. David says:

            I worked on a Univac 1108 in 1969. 131K words cost $823,500 according to this price list: http://www.fourmilab.ch/documents/univac/config1108.html

            Before that I was using a Honeywell 200 with 16K bytes. That had the 4K update on the standard 12K.

        2. thebecwar says:

          With apologies to Monty Python,
          “Aye … I like yer fancy ‘kilobytes of memory’. Is that what they’re using up in Yorkshire now?”

        3. CarlD says:

          Youngsters! My first “PC” had 2K of RAM – 2K! We were thrilled when we hacked together enough parts to increase the RAM to 16K. (Literally hacked together – chips hand-soldered on top of other chips, wires running around, good times).

          1. Paul Veitch says:

            Pah, my first computer had 3K RAM and an expansion pack to take it to 7K!

            Children these days!

          2. smf says:

            RAM? RAM? You don’t know you’re born. We used to store programs in mud.

          3. RobThree says:

            Back in my day we carved our own CPUs out of wood!

          4. Stephen Hewitt says:

            The old piggyback with the chip select pins bent aside.

    2. Brian_EE says:

      Ha. I still design systems with only 16K of RAM. Of course, that’s primarily only used for heap and stack. The code executes from a whopping 64K of flash.

      1. smf says:

        I’ve been working with 16K RAM for data and code recently. Back to the ’80s…

    3. Yuhong Bao says:

      It is unfortunate that DRAM prices stayed constant from 1993-1995, and that it took until 1996 for DRAM prices to fall.

    4. ender9 says:

      …but OEMs still tried to sell Vista machines with 512MB RAM…

  2. Yuhong Bao says:

    The fun thing is that all this happened after OS/2 2.0 was abandoned, obviously. Was it even considered before then?

    1. smf says:

      OS/2 2.0 wasn’t abandoned, IBM developed and released it. OS/2 3.0 was abandoned, or at least it became Windows NT.

      1. Yuhong Bao says:

        I mean abandoned by MS, which even made unethical attacks on it later.

        1. DWalker07 says:

          How could Microsoft “abandon” OS/2 2.0? It didn’t belong to MS.

          What’s the purpose of the “unethical attacks” comment?

          1. Yuhong Bao says:

            Except it did. You don’t remember the MS OS/2 2.0 SDKs from 1990?

      2. Joe says:

        Well, yes and no, though Raymond could probably clarify this. The OS/2 3.0 that turned into Windows NT was not the same as the OS/2 3.0 that IBM released a year later. What I do remember about OS/2 was having to reinstall it every time its config file got corrupted and it got stuck in a loop. IIRC, it was 33 3.5-inch disks.

  3. Anon says:

    I always liked the way with VxDs you could have real mode, 16 bit protected mode and 32 bit protected mode code all in the same LE/LX file and it had the relocation entries to support them all. No other x86 executable file format could do that.

    And it had to be that way because you needed to have code running before and after the switch to protected mode.

  4. Darran Rowe says:

    Bah, running programs is for losers.

  5. Nico says:

    According to Wikipedia, the first release of Windows NT required 12 MB of memory. Cutting that by two-thirds is certainly ambitious!

  6. smf says:

    > also included the discovery/realization that getting the Windows NT kernel to run in 4MB of memory¹ would require so many changes that it would no longer be what it wanted to be

    Did anyone think it would be possible to optimise the code and data so that it would be functionally identical in much less memory?

    What changed when Windows CE was based on NT? That one can run in 1MB of RAM.

    1. Yuhong Bao says:

      I think at the time the Win32 stuff was in CSRSS in NT. I wonder what the plan was for Panther.

    2. Anon says:

      Windows CE was a different kernel from NT. So at one point Microsoft had three complete Win32 implementations – Windows 9x, Windows NT and Windows CE, each with a different kernel.

      Windows CE was a lot smaller than NT and was hard realtime. Curiously, it only supported the wide-character versions of APIs, not the ANSI ones.

      1. ErikF says:

        It seems like the right move if you’re trying to remove complexity: By getting rid of the ANSI functions, you don’t have to include code page support (at least I can’t think of any reason for keeping that in!)

        1. SimonRev says:

          CE still needs code page support because it has to support things like the MultiByteToWideChar family of APIs (which you need since you may encounter things like .txt files that are stored as ASCII). You also need to convert between UTF-16 and UTF-8.

          What CE doesn’t try to do is auto convert for most APIs, so you don’t need A and W versions of every function that takes a string.
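
          As a concrete illustration of the conversion described above, here is a minimal desktop Win32 sketch (CE’s CRT and headers differ in the details) using the usual two-call pattern:

          ```c
          #include <windows.h>
          #include <stdio.h>
          #include <stdlib.h>

          int main(void)
          {
              const char *ansi = "contents of an old ASCII .txt file";

              /* First call: pass a NULL buffer to ask how many wide
                 characters are needed (including the terminating NUL,
                 since the source length of -1 means "NUL-terminated"). */
              int cch = MultiByteToWideChar(CP_ACP, 0, ansi, -1, NULL, 0);
              if (cch > 0) {
                  wchar_t *wide = malloc(cch * sizeof(wchar_t));
                  if (wide != NULL) {
                      /* Second call: perform the actual conversion. */
                      MultiByteToWideChar(CP_ACP, 0, ansi, -1, wide, cch);
                      wprintf(L"%ls\n", wide);
                      free(wide);
                  }
              }
              return 0;
          }
          ```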

  7. Ivan K says:

    I wonder if Windows 95 would have been as successful with an NT core. My memory is hazy but I think it wasn’t until Windows 2000-ish that 3D games and such worked on NT with its HAL or whatever. But maybe this Panther thing was supposed to solve that at the same time as it trimmed memory.

    1. Antonio Rodríguez says:

      It has nothing to do with the HAL, but with support for the different DirectX versions. Windows 95 OSR2 and Windows NT 4, for example, both included DirectX 2 out of the box, so they could run the same games and multimedia applications. NT required more memory and sometimes had a lower framerate, but compatibility at that point was excellent, and it didn’t let crashing games take down the OS the way Windows 95 sometimes did. DirectX 3, which introduced Direct3D, was also available on both platforms.

      Your memories may be based on the fact that DirectX 5 and 6 were never released for NT 4 – only Windows 2000 (NT 5) would get DirectX 7 at launch, two years later, putting NT and 9x on par. Anyway, in that timeframe, there were a surprising number of games that worked in Windows NT, maybe because their developers used NT on their own machines. Better OpenGL support was also a big win: OpenGL, which originated as an API for graphics workstations, came to Windows on NT first, and a subset was then ported to 9x, so OpenGL-based games (such as the Quake series) actually ran better on NT 4 than on Windows 95/98.

      But what happened to DirectX 4? Raymond has the answer: https://blogs.msdn.microsoft.com/oldnewthing/20040122-00/?p=40963

      1. Joe says:

        My memory is rusty, but wasn’t part of the problem that NT originally ran video in user mode, which severely limited its performance?

        1. Joe says:

          Looked it up on Wikipedia. NT 3.x did run video in user mode, but it was moved to the kernel in NT 4. I don’t recall running any games in NT 4, but did in Windows 2000 and, of course, XP later that year. (My kids had two games, though, that ran only in Windows 98, so I had a second system set up for that. I remember having to run chkdsk a lot on that system.)

      2. Ivan K says:

        Thanks Antonio.

      3. NT 4 didn’t support hardware-accelerated Direct3D (it only supported accelerated OpenGL and vendor-specific APIs). NT 5, on the other hand, did support hardware-accelerated Direct3D.

  8. Gary Keramidas says:

    Panther is still referenced today, as a folder in the windows.~bt\sources folder for Windows 10 installs.

  9. Mark Grant says:

    > Windows 95 had the 32-bit window manager and graphics engine forward to the 16-bit window manager and graphics engine.

    And then we third-party developers would go and stuff 32-bit code into our 16-bit video drivers to get decent performance on more modern CPUs, and kludge up the segments to make it work… If I remember correctly, it also required using an undocumented return instruction to convince the CPU to go back to where it came from without having to mangle the return address from 16-bit to 32-bit on every driver entry point (a big enough cost to show up in lower benchmark scores).

    Then there was figuring out how to call from 32-bit assembler code in a ’16-bit’ driver into 16-bit C code. And stuffing read-only data into code segments because the 16-bit linker wouldn’t let us have more than 64k of initialized data.

    Programming was much hairier in those days.

Comments are closed.
