# How do I get the color depth of the screen?

How do I get the color depth of the screen? This question already makes an assumption that isn't always true, but we'll answer the question first, then discuss why the answer is wrong.

If you have a device context for the screen, you can query the color depth with a simple arithmetic calculation:

```
colorDepth = GetDeviceCaps(hdc, BITSPIXEL) *
             GetDeviceCaps(hdc, PLANES);
```
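For completeness, here is a minimal sketch of how that calculation might be wrapped up, assuming no error handling; the helper name `GetScreenColorDepth` is hypothetical:

```cpp
#include <windows.h>

// hypothetical helper: query the color depth of the primary display
int GetScreenColorDepth()
{
    // GetDC(NULL) returns a device context for the screen
    HDC hdc = GetDC(NULL);
    int colorDepth = GetDeviceCaps(hdc, BITSPIXEL) *
                     GetDeviceCaps(hdc, PLANES);
    ReleaseDC(NULL, hdc);
    return colorDepth;
}
```

Remember to pair every `GetDC` with a matching `ReleaseDC`, or you will leak the DC.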

Now that you have the answer, I'll explain why it's wrong, but you can probably guess the reason already.

Two words: Multiple monitors.

If you have multiple monitors connected to your system, each one can be running at a different color depth. For example, your primary monitor might be running at 32 bits per pixel, while the secondary is stuck at 16 bits per pixel. When there was only one monitor, there was such a thing as the color depth of the screen, but when there's more than one, you first have to answer the question, "Which screen?"

To get the color depth of each monitor, you can take your device context and ask the window manager to chop the device context into pieces, each corresponding to a different monitor.

```
EnumDisplayMonitors(hdc, NULL, MonitorEnumProc, 0);

// this function is called once for each "piece"
BOOL CALLBACK MonitorEnumProc(HMONITOR hmon, HDC hdc,
                              LPRECT prc, LPARAM lParam)
{
    // compute the color depth of monitor "hmon"
    int colorDepth = GetDeviceCaps(hdc, BITSPIXEL) *
                     GetDeviceCaps(hdc, PLANES);
    return TRUE;
}
```
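If you need to carry a result out of the callback, the `lParam` parameter can smuggle a pointer to it. Here is a sketch (helper names hypothetical) that records the lowest color depth among the monitors the DC spans:

```cpp
#include <windows.h>
#include <climits>

// hypothetical callback: track the smallest color depth seen so far
BOOL CALLBACK MinDepthProc(HMONITOR hmon, HDC hdc,
                           LPRECT prc, LPARAM lParam)
{
    int *minDepth = reinterpret_cast<int *>(lParam);
    int depth = GetDeviceCaps(hdc, BITSPIXEL) *
                GetDeviceCaps(hdc, PLANES);
    if (depth < *minDepth) *minDepth = depth;
    return TRUE; // keep enumerating
}

// hypothetical helper: lowest color depth among the monitors
// covered by the device context
int GetMinimumColorDepth(HDC hdc)
{
    int minDepth = INT_MAX;
    EnumDisplayMonitors(hdc, NULL, MinDepthProc,
                        reinterpret_cast<LPARAM>(&minDepth));
    return minDepth;
}
```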

If you decide to forego splitting the DC into pieces and just ask for "the" color depth, you'll get the color depth information for the primary monitor.

As a bonus (and possible optimization), there is a system metric `GetSystemMetrics(SM_SAMEDISPLAYFORMAT)` which has a nonzero value if all the monitors in the system have the same color format.
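Putting the two together, a program could use that metric as a fast path and fall back to enumeration only when the monitors actually differ; a sketch, reusing the `MonitorEnumProc` callback from above:

```cpp
// if every monitor shares the same color format, one query suffices
if (GetSystemMetrics(SM_SAMEDISPLAYFORMAT)) {
    int colorDepth = GetDeviceCaps(hdc, BITSPIXEL) *
                     GetDeviceCaps(hdc, PLANES);
    // ... this value applies to the entire desktop ...
} else {
    // monitors differ; ask about each piece individually
    EnumDisplayMonitors(hdc, NULL, MonitorEnumProc, 0);
}
```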


1. Farproc says:

Isn't it usually more appropriate to call hdc = GetDC(NULL); to get a "desktop" DC and use that? This topic is not well explained, but very few applications seem to use real device DCs anymore – typically an application doesn't bother to render to separate DCs per enumerated device and just uses the DC returned by BeginPaint – which would be the same kind of DC returned by GetDC(hwnd) or even GetDC(NULL). Window DCs seem to be some kind of virtual DC that represents some subset of the functionality of the various display devices that make up the desktop, and it is the bit depth of that device that most applications are interested in.

[I was assuming the question was being asked in the context of a WM_PAINT handler, in which case you should use the DC you were given. Even if being asked outside of a WM_PAINT handler, you should use GetDC(hwnd) to get a DC for your window. Screen DCs give GDI the heebie-jeebies because they are global and force certain optimizations to be disabled. -Raymond]
2. obelix says:

<quote>

If you decide to forego splitting the DC into pieces and just ask for "the" color depth, you'll get the color depth information for the primary monitor.

</quote>

What if there are three monitors connected and the window is across the two secondary displays? What would you get then?

["If you decide to forego splitting the DC into pieces and just ask for "the" color depth, you'll get the color depth information for the primary monitor." If you don't believe me, then try it and report your findings. -Raymond]
3. obelix says:

I realize the correctness of your method. I was just asking because I was curious.

I'm guessing the answer is that you could get the colour depth of either – there are no guarantees and it is subject to change :)

4. DWalker says:

You mentioned "multiple monitors", but my first thought was "no monitors".  How does that affect things?  Or, is there always at least one "monitor" even if the system is booted without a connected monitor?

5. Dan Bugglin says:

@obelix You might still get the primary monitor.  The API would have to return the same monitor every time it is called (otherwise it could confuse legacy apps that don't know about multiple monitors).  Even if the app is off the primary monitor, it could be moved onto it later, or vice versa, so I would think it would always return the primary monitor information.

Maybe it's more complicated than that though, I dunno.  I don't use those APIs.

6. Matthew W. Jackson says:

How does the new window compositing play into this? Even if the displays are running at different depths, wouldn't the Window Manager handle everything internally at the highest color depth and then convert them when rendering the final output? In that case, I think most programs would be most interested in the maximum color depth of the system.

7. hard says:

A DC can be rendered to several different pieces of hardware with different color spaces.

8. Jack Mathews says:

Unless you're doing very high fidelity rendering (like perhaps photoshop or another image manipulation program), if you're asking this question, then you're probably worrying about something you don't need to anyway.  It's been years since we've had to worry about 256-color displays, which means that you're either running in 16-bit or 32-bit (likely 32-bit, with 16-bit for grandma's computer).  Either way, since they're not palettized, just do everything assuming 32-bit and let Windows do the work of bringing you down to 16-bit if needed.

9. Anonymous says:

@Jack Mathews

(likely 32-bit, with 16-bit for grandma's computer)

Off the top of my head: Remote Desktop, Windows Server compatible applications, embedded Windows systems, certain current sub-sub-notebooks.

To paraphrase the old quote, "Of course I don't look like I'm worried about compatibility issues popping up in the future – I did it right the first time."

10. Gabe says:

DWalker59: I haven't tried it, but I'm pretty sure it's easy to get an application to have no monitors — just start up your app and lock the workstation. Since the application's desktop is not displayed on any video hardware, it doesn't have a "monitor". Other similar situations are disconnected Remote Desktop sessions and services that don't have access to the console. I believe the UAC prompt could also create this situation.

Jack Mathews: You're right; if you aren't rendering images (like Photoshop or a web browser), you probably shouldn't care what the monitor's color depth is.

11. Farproc says:

You know, I really don't know how screen DCs, window DCs and device DCs interplay any more, especially on Windows ver 6+ where there is a DWM. When I call BeginPaint in an application with a window that spans multiple monitors, what exactly do I get? Given the layers of indirection implicit in such a thing, how is a window DC different from the "desktop" DC? EnumDisplayMonitors implies some kind of relationship where window DCs can be made up of display DCs – Feng Yuan used to refer to 'mirror drivers' when answering this kind of question on the GDI newsgroups years ago, but there's no non-DDK documentation that describes what the hell those are and how they relate to calling CreateDC vs GetDC vs BeginPaint vs EnumDisplayMonitors.

12. Jack Mathews says:

Anonymous:

Remote Desktop – Fidelity not an issue as lots of stuff looks crappy.  Render in 32-bit, it will get sent across in 16-bit (or less).

Windows Server Compatible Applications – Same as RD – No need to be pixel perfect, 32-bit will scale down

Embedded Windows Systems – Specialized system, if you want to optimize for 16-bit so your ATM machine's graphics don't band, knock yourself out

Sub-sub notebooks – What are you going to do?  Make 16-bit graphics for horrible machines, which will probably look the same as the 32-bit ones scaled down?

To paraphrase the old quote, "Of course I don't look like I'm worried about compatibility issues popping up in the future – I did it right the first time."

How about the fact that if you have less code paths, you're less likely to have compat issues and regressions?  Let's say you check for 16-bit, then you do it wrong, like, say, if multimon happens in the future.  You've now got a compatibility issue in the future.  Congrats.

13. Joe says:

I've given up caring. I write my graphics routines for 32-bit. If you run at a different color depth, that's your problem.

14. 640k says:

I write all my graphics routines for 64-bit to be future proof.

15. Boris says:

What about safe mode? Doesn't that run in 256 colors?

16. DWalker says:

Gabe:  So, I wonder … what's the color depth of a system with no monitor?  And what are the monitor dimensions?  Zeros all around?

17. Yuhong Bao says:

"What about safe mode? Doesn't that run in 256 colors?"

Until XP, when VESA VBE support was added to the VGA driver used in safe mode (at the same time they boosted the minimum resolution to 800×600), it used the standard 640x480x16 VGA resolution. Now of course it can use any video mode the VESA VBE BIOS supports.

18. Gabe says:

DWalker59: I haven't tried it, but I'd guess that you would end up with zero monitors, so you would be unable to ask for its dimensions or get a DC to ask what its color depth is.

19. Anonymous says:

@Jack Mathews

How about the fact that if you have less code paths, you're less likely to have compat issues and regressions?

Why, certainly.  If you omit the seat belts and airbags on an automobile, you are less likely to have failures as well.

Let's say you check for 16-bit, then you do it wrong, like, say, if multimon happens in the future.  You've now got a compatibility issue in the future.  Congrats.

I rather think there's a difference between "I didn't do it the right way even though I know better" and "Someone else changed something in the future I couldn't predict".  Instead of one source of compat issues and regressions, now you have two.

Congrats indeed.