Why can you set each monitor to a different color depth?

Random832 seemed horrified by the fact that it is possible to run multiple monitors, with different color formats on each monitor. "Seriously, why does it let you do that?"

Well, of course you can do that. Why shouldn't it let you do that?

When multiple monitors were introduced to Windows, video hardware was nowhere near as advanced as it is today. One common discovery was that your computer, which came with a video card in one of the expansion slots, actually had a video chip baked into the motherboard, but disabled in the BIOS. In other words, your computer was actually multiple-monitor-capable; it's just that the capability was disabled.

Once you got it enabled, you would discover that the onboard video adapter was not as good as the one in the expansion slot. (Duh. If it were as good as the one in the expansion slot, then the manufacturer would have saved a lot of money and not bothered shipping a video card in the expansion slot!) Usually, the onboard video card didn't have a lot of video RAM. You still want to run it at 1024×768 (hey, that was high resolution in those days), but in order to do that, you need to reduce the color depth. On the other hand, the card in the expansion slot has a huge amount of video RAM (four megabytes!), so you take advantage of it by running at a higher color depth.
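
To put rough numbers on that tradeoff, here's a quick back-of-the-envelope sketch in C; it assumes a plain linear framebuffer and ignores whatever the driver sets aside for fonts, cursors, and other overhead (the helper function is purely illustrative):

    /* Illustrative only: rough framebuffer sizes for a plain linear framebuffer. */
    #include <stdio.h>

    static unsigned long FramebufferBytes(unsigned long width,
                                          unsigned long height,
                                          unsigned long bitsPerPixel)
    {
        return width * height * (bitsPerPixel / 8);
    }

    int main(void)
    {
        /* 1024x768 at 8bpp (256 colors) = 786,432 bytes: fits in 1MB of video RAM. */
        printf("8bpp:  %lu bytes\n", FramebufferBytes(1024, 768, 8));

        /* 1024x768 at 32bpp = 3,145,728 bytes: that's what the 4MB card is for. */
        printf("32bpp: %lu bytes\n", FramebufferBytes(1024, 768, 32));
        return 0;
    }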

You're now getting the most out of your machine; each video card is cranked up as high as it can go. (The lame-o card doesn't go very high, of course.) What could possibly be wrong with that?

Bonus chatter: It so happened that some of these "secret video card" motherboards had a feature where they would disable the ROM BIOS on the on-board video card if they detected a plug-in video card. To get multi-monitor support on these recalcitrant machines, one of my colleagues wrote a tool that you used like this:

  • Turn off the computer and remove the plug-in video card.
  • Boot the computer with only the lame-o on-board video.
  • Run the tool, which captures the ROM BIOS.
  • Turn off the computer, put the plug-in video card back in, and boot the computer again.
  • Run the tool again and tell it "Hey, take that ROM BIOS you captured and map it into memory where a ROM BIOS would normally go, and then initialize the video card from it and add it as my second monitor."

It was a bit of a hassle, but it let you squeak crappy multi-monitor support out of these lame-o motherboards.

Comments (31)
  1. SimonRev says:

    I used to run Win98 with 5 video cards and as they varied from 1 to 4 MB of RAM, I was grateful that I could run each monitor at a different color depth.

    I also ran into the lame-o built in video card issue, but personally found it easier to go grab another video card from the dead PC heap than to try that BIOS hack.

    Actually the monitors were much harder to come by at the company I was working for back then.  I used to have to wait until someone left the company and then scavenge their monitor (we didn't have much of an IT department back then so no one really noticed much except that I kept getting more and more monitors).

    Back when I had 5 CRT monitors my coworkers would occasionally come into my office with whatever hardware they were working on (touch screens usually) and see if the EMI generated by the monitors would interfere with their hardware.  We found several touch screen issues that way.

  2. Sven says:

    One thing that I wish you could do is run each monitor at a different DPI setting. I have a laptop with a 13" 1920×1080 screen, which I like to run at 120 or 144 DPI. But I often connect an external monitor to that laptop (using that one as the primary and the laptop's panel as the secondary), and that monitor is 1920×1200, except it's 24", so I want to run that one at 96 DPI.

    So either I have ridiculously huge stuff on the external monitor, or I have to squint to be able to read anything on the laptop panel. There is no compromise that works well for both monitors.

    I understand the technical reasons for why this would be extremely difficult (if not impossible) to support. But it would still be nice to live in that fairy-tale land where this is possible. :)

  3. Antonio Rodríguez says:

    Once again, the title's question can be answered by another question: what would happen if you couldn't?

  4. henke37 says:

    Well, you could still crank up the card to the highest setting supported; it just wouldn't be the same as if it were alone.

  5. gfx says:

    The graphics hardware/drivers don't necessarily support any common bit depth.

  6. Tom says:

    Even today, most BIOSes will disable the on-board video card if they detect a PCI-Express card plugged in.  This is often done to conserve PCI-E lanes — in other words, to save fifty cents.

    Fortunately, triple-monitor cards are so cheap these days that it's less hassle to throw hardware at the problem than to work around firmware limitations like this.

  7. benjamin says:

    The whole presto-changeo BIOS trick to fool the motherboard sounds like a disaster just waiting to happen.

  8. 640k says:

    Raymond forgot to mention that the reason for this hassle is Sandy Bridge's hardware "support" for the protected video path, aka Palladium/TCPA.

  9. Craig says:

    @640k – No, it's not. This is referring to kit from way back – not recent motherboards, though some still behave similarly but not for trusted path reasons.

  10. Torkell says:

    The other nifty thing that you can do is have different monitors set to different refresh rates (useful when using a mismatched pair of old CRTs), which amazingly doesn't seem to cause any problems.

  11. SimonRev says:

    Yeah, the different refresh rates thing was pretty important back with older CRTs.  Otherwise you would get a resonance pattern in the display, which I always assumed came about because the refresh rates were slightly out of sync.  It also helped slightly to put some metal between the CRTs to block some of the EMI.

  12. Kemp says:

    @640k – Sometimes you make it too easy to tell you're just here to troll.

    On topic: I always found the auto-disabling of the onboard graphics annoying. There actually were a couple of times when having it available would have been really useful. On the other hand, these days it's easier to just pick up a second card in the rare case that your existing card doesn't already support multiple monitors. Just to cover all possibilities, I am about to build a machine using the built-in graphics capability of the Sandy Bridge line.

  13. Tim says:

    Sven, to save Raymond the effort, it's on the list. http://blogs.msdn.com/b/oldnewthing/archive/2010/05/31/10017567.aspx

  14. Gabe says:

    I'm guessing that it would actually have taken significant work to require that all monitors be set to the same color depth. Imagine what would happen if you had a system working with several monitors, then somebody plugged in one whose EDID information is incompatible with the current resolution/color depth, causing that graphics card to change color depth. Then every other graphics card on the system would have to change its color depth simultaneously, possibly requiring them to change resolution too!

    There might not even be a common color depth supported by all monitor/graphics card/resolution combinations. Who could possibly want this situation?

  15. Joe Schmoe says:

    Can you please explain why Windows and most of its applications do not fully support the 10-bit large-gamut monitors that are on the market today, when Windows has supported 32-bit color depth in its video settings for more than a decade?

    I know that the configured 32-bit color depth is actually just 24-bit color (8 bits × 3 colors), but why has it not been possible, in all this time, for the Windows GDI and other application programming interfaces to fully support 10 bits (or more) per color when the hardware does?

    I'm asking this question because I've almost gone mad trying to find the right combination of large-gamut/high-resolution monitor, image editing applications, video cards, and connectors that will work. I can't believe that it is this complicated. I shudder to think how the average consumer is supposed to understand any of this when their $100 Blu-ray player simply works with deep color via HDMI on their $500 HDTV, but their $2,000 computer + $1,000 monitor + $300 video card doesn't.

    Was this another case of "nobody will ever need more than 512KB (8 bits per color)"?

    [Going past 8bpp would break a lot of GDI functions. (How would you express 10-bit color formats in GetSysColor? RGBQUAD? COLORREF? GetPixel?) You are welcome to stop using GDI. DirectX, for example, supports many extended color formats. -Raymond]
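
    To see where the 8-bits-per-channel assumption lives, here's a minimal sketch against plain Win32/GDI (the function is purely illustrative): COLORREF itself only has room for 8 bits per channel.

        /* COLORREF is a 32-bit value laid out as 0x00bbggrr: exactly
           8 bits per channel, with the top byte reserved. */
        #include <windows.h>

        void ColorExample(HDC hdc)   /* illustration-only function */
        {
            COLORREF c = RGB(255, 128, 0);

            BYTE r = GetRValue(c);   /* the macros mask off 8 bits each */
            BYTE g = GetGValue(c);
            BYTE b = GetBValue(c);

            /* GetPixel and GetSysColor return COLORREFs too, so a 10-bit
               component simply has nowhere to go. */
            COLORREF pixel = GetPixel(hdc, 0, 0);
            COLORREF face  = GetSysColor(COLOR_BTNFACE);

            (void)r; (void)g; (void)b; (void)pixel; (void)face;
        }
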
  16. Joe Schmoe says:

    Raymond, I understand that some parts of the GDI cannot deal with more than 8 bits per color. I was wondering whether Microsoft had, or now has, a plan to fully support more than 8 bits per color in normal applications, and why deep color simply doesn't work correctly even with those rewritten DirectX applications unless you're using select video cards (NVIDIA Quadro, for example) and DisplayPort cables (but not HDMI).

    [How would I know? I'm not a Microsoft spokesperson, and I certainly don't go to GDI meetings. You seem to have forgotten that I don't cover future plans on this Web site. -Raymond]
  17. Troll says:

    Too bad Windows 7 and Vista do not support horizontal and vertical spanned desktop modes for multiple monitors, that is, stretching a single desktop resolution across multiple monitors. They only support dual view (extending the desktop onto multiple monitors). Earlier versions of Windows did.

  18. Worf says:

    32-bit color is not just a neat speed hack for Windows – i.e., it's not just 8 bits × 3 channels; it's often 8 bits × 4 channels, where the last channel is alpha and controls transparency. Usually called RGBA mode. 10-bit color in 32 bits just puts us back to the nasty mask-and-shift pixel manipulations from when we dealt with RGB565 or RGB555. Great on memory, at the price of speed.
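
    For anyone who never had the pleasure, this is roughly what that mask-and-shift looks like; a sketch assuming 5:6:5 packing with red in the high bits versus byte-per-channel 32-bit RGBA (real formats vary in channel order):

        #include <stdint.h>

        /* RGB565: 5 bits red, 6 bits green, 5 bits blue in one 16-bit word. */
        static void UnpackRGB565(uint16_t p, uint8_t *r, uint8_t *g, uint8_t *b)
        {
            *r = (uint8_t)((p >> 11) & 0x1F);  /* top 5 bits */
            *g = (uint8_t)((p >> 5)  & 0x3F);  /* middle 6 bits */
            *b = (uint8_t)( p        & 0x1F);  /* bottom 5 bits */
        }

        /* 32-bit RGBA: each channel is already a whole byte, so "unpacking"
           is just indexing, with no shifting or masking. */
        static void UnpackRGBA32(const uint8_t *p, uint8_t *r, uint8_t *g,
                                 uint8_t *b, uint8_t *a)
        {
            *r = p[0]; *g = p[1]; *b = p[2]; *a = p[3];
        }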

  19. ender says:

    Heh, my current computer came with on-board (on-CPU?) Intel HD graphics and a GeForce 420 in the PCIe slot. The on-board card has DVI and HDMI ports, both of which were plugged (with the DVI port having a piece of plastic with "Do not remove" stamped on it screwed in). I could enable the on-board card in the BIOS, but that resulted in Windows BSODing as soon as a monitor was connected to the card (guess that explains "Do not remove") – then I found out that the BSOD goes away if I set the on-board card as primary.

    As for 10-bits per channel, I had a Matrox card long ago that supported that. Not sure how it worked, but the included viewer did seem to show a difference on the sample images – not that I cared.

    (and I still have to post every comment twice before it's actually accepted)

  20. Someone says:

    Enabling more than 8 bits per color channel could be done like the introduction of Unicode: there would be RGBQUADH, COLORREFH, etc., with 16 bits per color channel. Every GDI structure and function depending on the current format gets a counterpart with an H suffix, and there would be a preprocessor symbol to switch the pixel color format from "standard" to "high". The handling of Windows messages transporting color values also needs some attention :) Maybe it's only possible on 64-bit systems.

    Just kidding.

  21. Neil says:

    @ender If you're quick enough it only takes one submit.

  22. Adam Rosenfield says:

    @Someone: Oh god, you almost had me there for a minute.  Having to #undef generic names such as GetObject is bad enough, but then having to deal with 4 potential variants of a function that happened to take both a string and a color?  Madness.

  23. Not Norman Diamond says:

    @Adam Rosenfield: In that case, it would make sense to only implement the xxxH structures on the W variant of the function – it seems that a bunch of APIs are now only being implemented in W variants anyways, so it makes sense that development of future functionality would ignore the seemingly deprecated A variants. In all likelihood, if you're going to want to implement high colours, you'd probably also be supporting Unicode, and if you aren't, you can always do the conversions where necessary.

    [Adding the new APIs is easy (if cumbersome). The hard part is making them interop with the old ones. If an old app calls GetObjectW(DIBSECTION) on a new bitmap, what should happen? -Raymond]
  24. Unsure says:

    "If an old app calls GetObjectW(DIBSECTION) on a new bitmap, what should happen?"

    If a window is shown on two monitors (half on monitor 1, half on monitor 2) and I retrieve its bits via GDI functions: what happens if the monitors have different color depths? I'm not that familiar with GDI bitmap functions, but I thought that the color depth of the resulting bitmap could be dictated by the program calling the GDI function, via some parameter or header value in the bitmap structure.

    [DIB sections allow direct access to pixels in their original format. If you want a conversion, use GetDIBits. We are now way off topic and I think I'm going to delete all these off-topic comments in a week or so. -Raymond]
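
    A minimal sketch of that GetDIBits conversion path (error handling omitted; the helper name is made up): you describe the format you want in a BITMAPINFO, and GDI converts to it regardless of the depth the bitmap or the screen happens to use.

        #include <windows.h>
        #include <stdlib.h>

        /* Illustration-only helper: copy any bitmap's pixels as 32bpp. */
        void *CopyPixelsAs32bpp(HBITMAP hbm, int width, int height)
        {
            BITMAPINFO bmi = {0};
            bmi.bmiHeader.biSize        = sizeof(bmi.bmiHeader);
            bmi.bmiHeader.biWidth       = width;
            bmi.bmiHeader.biHeight      = -height;      /* top-down rows */
            bmi.bmiHeader.biPlanes      = 1;
            bmi.bmiHeader.biBitCount    = 32;           /* ask for 32bpp regardless */
            bmi.bmiHeader.biCompression = BI_RGB;

            void *bits = malloc((size_t)width * height * 4);
            HDC hdc = GetDC(NULL);
            GetDIBits(hdc, hbm, 0, (UINT)height, bits, &bmi, DIB_RGB_COLORS);
            ReleaseDC(NULL, hdc);
            return bits;                                /* caller frees */
        }
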
  25. 640k says:

    [That's a pretty awesome breaking change you're suggesting. "All existing applications will stop working until they recompile with the new header files. Even if you don't care about 10-bit color formats." -Raymond]

    A simple int can change from one day to another (16 -> 32 bit). Defined types are black boxes; you should not know anything about how they are defined inside. You cannot just compile your code with a new compiler/SDK and assume it works. That's why you should have a dwSize member or a similar check, and that's why you should use sizeof instead of hard-coded struct sizes. This is (C) programming 101.

    I understand that most Windows APIs are quick hacks which aren't forward compatible. A clean breaking change in those cases would be changing the API semantics when upgrading architectures; you already had two opportunities, 16->32 bit and 32->64 bit x86. And you will have this opportunity again with ARM in the next Windows release.

    [Okay, now I'm convinced you're just a troll. -Raymond]
  26. Yeah says:

    "but then having to deal with 4 potential variants of a function that happened to take both a string and a color?  Madness."

    Don't forget 32 vs. 64 bit. All in all, there would be 8 variants that can be called wrong, especially via p/invoke :)

    But hopefully there exists no function or structure or Windows message containing strings *and* INT_PTR *and* color values.

  27. 640k says:

    @Raymond: How would you express 10-bit color formats in GetSysColor? RGBQUAD? COLORREF? GetPixel?

    The reason for using include files instead of hardcoded types is that defined types, like RGBQUAD and COLORREF, can be *redefined* when appropriate.

    [That's a pretty awesome breaking change you're suggesting. "All existing applications will stop working until they recompile with the new header files. Even if you don't care about 10-bit color formats." -Raymond]
  28. 640k says:

    @Raymond: DIB sections allow direct access to pixels in their original format.

    Are you saying that DIBs are device *dependent*? What a mess.

    [They are device-independent. You can create a 32bpp DIB even if your video card is only 4bpp. -Raymond]
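
    That independence shows up right in the creation call; a minimal sketch (the wrapper name is made up):

        #include <windows.h>

        /* Illustration-only wrapper: create a 32bpp DIB section no matter
           what depth the display is running at. */
        HBITMAP Create32bppDib(int width, int height, void **bits)
        {
            BITMAPINFO bmi = {0};
            bmi.bmiHeader.biSize        = sizeof(bmi.bmiHeader);
            bmi.bmiHeader.biWidth       = width;
            bmi.bmiHeader.biHeight      = -height;   /* top-down */
            bmi.bmiHeader.biPlanes      = 1;
            bmi.bmiHeader.biBitCount    = 32;        /* 32bpp even on a 4bpp display */
            bmi.bmiHeader.biCompression = BI_RGB;

            /* *bits receives a pointer straight to the pixels, in the 32bpp
               format requested above: that's the "direct access to pixels in
               their original format" part. */
            return CreateDIBSection(NULL, &bmi, DIB_RGB_COLORS, bits, NULL, 0);
        }
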
  29. Gabe says:

    Raymond, I've discovered that apparently it is common that people who post on the Internet don't know that changing a header file means recompiling ALL code that includes that header. They don't seem to realize that recompiling only SOME of the code makes it binary incompatible with the code that wasn't recompiled.

  30. 640k says:

    @Gabe: Of course a 16-bit integer isn't binary compatible with an 8-bit integer. What did you expect?

  31. Gabe says:

    640k: Wow. I don't know where to begin telling you what's wrong with that.

Comments are closed.