Having More Colors Is A Good Thing, Isn't It?

When was the last time you saw a Windows Mobile device that couldn't do color?  (It's a trick question.  The last grayscale device predates the name change to "Windows Mobile.")  Subtle differences in screen quality aside, all WM devices have had the same ability to show colors--65536 of them.  Modern LCD screens, however, are capable of showing four times that many (262144).  So I've been asked, "Why doesn't WM allow OEMs to make use of these better screens?"

The short answer is: "We do."

The medium answer is: "But don't be so sure you want them to."

And the long answer is ... well, the rest of this entry.

Kibbles and Bits
That medium answer is pretty shocking.  "How could I possibly not want them to have more colors?  Having more colors is a good thing, isn't it?"  To understand this answer, we're going to have to understand how computers show colors.  65536 isn't an arbitrary number.  It's the number of combinations possible in 16 bits of information (a "bit" is a binary digit that can be either 1 or 0). 
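
If you want to check the arithmetic, a couple of lines of C (purely my own illustration) make the relationship clear: each extra bit doubles the number of possible combinations.

    #include <stdio.h>

    int main(void)
    {
        /* Each additional bit doubles the number of possible values. */
        printf("16 bits: %d colors\n", 1 << 16);   /* 65536    */
        printf("18 bits: %d colors\n", 1 << 18);   /* 262144   */
        printf("24 bits: %d colors\n", 1 << 24);   /* 16777216 */
        return 0;
    }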

Every dot on the screen (the technical term for "dot on the screen" is "pixel") is represented by 16 bits of information.  Those 16 bits give the color of that dot.  If you're really interested, it's broken up into 5 bits for the red intensity, 6 bits for the green, and 5 bits for the blue.  This format is known as "RGB 565." 
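
If you like to see that in code, here's a small sketch (my own illustration, not code from the OS) of how 8 bit red, green, and blue values get packed into one RGB 565 pixel:

    /* Pack 8 bit red, green, and blue values into a 16 bit RGB 565 pixel.
       Red and blue keep their top 5 bits; green keeps its top 6. */
    typedef unsigned short PIXEL565;

    PIXEL565 PackRGB565(unsigned char r, unsigned char g, unsigned char b)
    {
        return (PIXEL565)(((r >> 3) << 11) |   /* 5 bits of red   */
                          ((g >> 2) << 5)  |   /* 6 bits of green */
                           (b >> 3));          /* 5 bits of blue  */
    }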

Modern mobile CPUs are extremely good at dealing with 32 bits of data at a time.  They're pretty good at dealing with 16 bits of data at a time, and they're reasonably good at dealing with 8 bits of data at a time.  They're pretty lousy at dealing with other amounts, though.

What's it take to do 262144 colors?  18 bits.  Since that's not 8, 16, or 32, it's not a particularly CPU-friendly number.  For those keeping track of image formats, this would be 6 bits each of red, green, and blue, or "RGB 666." 

I was Framed!
The next thing to understand is how the bits turn into colors on the screen.  Say you've got a typical PocketPC with a resolution of 240x320 and 65536 colors.  That means you've got 320 rows of 240 pixels (dots), each of which has 16 bits of data representing its color.  All of that information is stored in a chunk of memory known as the "Frame Buffer."  The LCD hardware takes whatever is in the Frame Buffer and converts it directly to what's on the screen.  Want to change what's on the screen?  Change what's in the Frame Buffer and the screen will update.

Okay, so we need 16 bits for every pixel, and we've got 240 times 320 dots.  16 bits is two bytes, so that's a total of 153600 bytes, or 150K of RAM used to hold what's on the screen. 
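
Here's a rough sketch (the names and layout are mine, not the real display driver's) of what "change what's in the Frame Buffer" looks like from the CPU's side, along with the size arithmetic:

    #define SCREEN_WIDTH      240
    #define SCREEN_HEIGHT     320
    #define BYTES_PER_PIXEL   2     /* 16 bits */

    /* 240 * 320 * 2 = 153600 bytes, i.e. 150K of RAM for the frame buffer. */
    #define FRAME_BUFFER_SIZE (SCREEN_WIDTH * SCREEN_HEIGHT * BYTES_PER_PIXEL)

    /* Change what's on the screen by changing what's in the frame buffer;
       the LCD hardware picks up the new value on its next refresh. */
    void SetPixel(unsigned short *pFrameBuffer, int x, int y, unsigned short color565)
    {
        pFrameBuffer[y * SCREEN_WIDTH + x] = color565;
    }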

A Packing Problem
You need the CPU to write things into the Frame Buffer.  Since a pixel is 16 bits and the CPU really likes to deal with data 32 bits at a time, you just write 2 pixels at once.  This makes everything happy.  But what would happen if you had 18 bits per pixel?  The first pixel would line up on a boundary that the CPU is happy with.  But the next one wouldn't.  Neither would the one after that.  Things wouldn't line up again until the 16th pixel.  So, 15 out of every 16 pixels are handled in a way that's really inefficient for the CPU.  18 bits just don't fit together well in a world of 32s. 
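
As a sketch of the difference (again, just an illustration): two 16 bit pixels fit exactly into one 32 bit word, so the CPU can write them with a single aligned store.  There's no equivalent trick for 18 bit pixels.

    /* Two 16 bit pixels pack exactly into one aligned 32 bit write.
       (Assumes a little-endian device, which is the usual case.) */
    void WriteTwoPixels(unsigned int *pFrameBuffer32, int index,
                        unsigned short pixel0, unsigned short pixel1)
    {
        pFrameBuffer32[index] = (unsigned int)pixel0 | ((unsigned int)pixel1 << 16);
    }

    /* With 18 bit pixels, pixel n would start at bit 18 * n.  That's only a
       multiple of 32 when n is a multiple of 16, so just 1 pixel in 16 starts
       on a word boundary; the other 15 straddle two words. */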

For this reason, the RGB 666 pixel format (18 bit) just doesn't exist.  It's not natively supported in any operating system I'm aware of.  Not CE, XP, Mac, Unix, Palm, or Symbian.

But wait.  There's a competitor's phone out right now that advertises having a 262144 color screen.  They must support 18 bit color, right?  Actually, no they don't.  They support 24 bit color, and then throw away the bits the 18 bit screen can't use (the low 2 bits of each color channel).  Compared to a 16 bit screen, that means only 2 of the extra 8 bits actually add color information. 

Have you ever tried to fit two suitcases into a trunk only to find that they're just a tiny bit too big to fit?  What did you do?  Most likely, you put a smaller suitcase in the extra space.  But what if you didn't have a smaller suitcase?  In that case, you probably put the one suitcase in the trunk and left a lot of wasted space around it.  That's what's happening with anyone who is supporting an 18 bit screen.  Because the 18 bits don't pack together well, they leave empty space around them.

Bitfield Wasteland
There are two ways to do 24 bit color.  In both cases, there are 8 bits each for red, green, and blue (RGB 888).  In the first case, each pixel takes 24 bits, and in the second it takes 32 bits (the remaining 8 bits are either used for transparency or are unused). 

The 32 bit case is the best for the CPU.  It's one pixel for every 32 bits, so everything is guaranteed to line up the way the CPU likes it.  The 24 bit case isn't as good as 32, but it's way better than 18 bits.  At least everything in 24 bits lines up on an 8 bit boundary, and you get back to a full 32 bit boundary every 4 pixels.  But, remember, while the system thinks it's doing 24 bits of color information, the screen is only using 18 of those bits.  The rest are being thrown away.  So, depending on how you implement this, you're either wasting 6 bits of every 24, or you're wasting 14 bits of every 32. 
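
To make the waste concrete, here's a hypothetical sketch of the 32 bit case: the pixel carries a full RGB 888 value, but an 18 bit panel only ever looks at the top 6 bits of each channel.  The names here are mine, purely for illustration.

    /* One pixel stored as 32 bits: 8 bits each of red, green, and blue,
       plus 8 unused (or alpha) bits that keep everything word aligned. */
    typedef unsigned int PIXEL888;

    PIXEL888 PackRGB888(unsigned char r, unsigned char g, unsigned char b)
    {
        return ((PIXEL888)r << 16) | ((PIXEL888)g << 8) | b;   /* top 8 bits unused */
    }

    /* An 18 bit panel keeps only the top 6 bits of each channel.  The low
       2 bits of red, green, and blue, plus the 8 padding bits, do nothing:
       14 of the 32 bits are wasted. */
    unsigned int To666(PIXEL888 p)
    {
        unsigned int r6 = (p >> 18) & 0x3F;   /* top 6 of the 8 red bits   */
        unsigned int g6 = (p >> 10) & 0x3F;   /* top 6 of the 8 green bits */
        unsigned int b6 = (p >> 2)  & 0x3F;   /* top 6 of the 8 blue bits  */
        return (r6 << 12) | (g6 << 6) | b6;   /* an 18 bit RGB 666 value   */
    }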

Enough mumbo jumbo. What does this really mean?
It comes down to this.  If you want to make use of those extra colors, your frame buffer needs to be either 50% or 100% bigger than it is now.  Filling that frame buffer will be correspondingly 50% to 100% slower.  Things that require the screen to be updated frequently (watching movies, taking pictures, playing games) will go slower.  Games, in particular, are hard hit.  Games based on the old Game API (GAPI) will need to do a slow conversion, or they won't work at all.  Newer DirectDraw games will do better, but they will still be slower on a 24 bit system than on a 16 bit one. 
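
For the same 240x320 screen, the arithmetic looks like this:

    16 bits per pixel: 240 x 320 x 2 bytes = 153600 bytes (150K)
    24 bits per pixel: 240 x 320 x 3 bytes = 230400 bytes (225K), 50% bigger
    32 bits per pixel: 240 x 320 x 4 bytes = 307200 bytes (300K), 100% bigger

And every full-screen update has to move that much more data.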

Windows CE supports both forms of 24 bit color (24 and 32 bits).  If an OEM really wants to "make use" of that 18 bit screen, they can.  The question you need to be asking yourselves, however, is whether or not it's worth the cost.  Is a marginally better image worth slower graphics? 

It is my belief that the tradeoff isn't worthwhile.  I see this as a marketing battle, not a technical one.  The competition wants to put the bigger number in their bullet points so they can be "better" even though, in my opinion at least, it makes them worse. 

Here's a quick and dirty desktop app written by one of our hardware guys.  It lets you load a 24 bit image and compare it to what it would look like in 18 and 16 bit.  Because it's so hard to see a difference, the app also shows a 12 bit image.  Nothing does 12 bit anymore, but you can look at the faults in that image to get an idea of where to look for differences in the 16 and 18 bit ones.  The app requires that you have .NET 2.0 installed.  Sorry, it's what we had handy on short notice.  You can decide for yourself whether the differences between 18 and 16 bit are worth the corresponding RAM and speed costs.

So, when will we see more colors?
I believe two things need to happen before this will be worthwhile. 

First, we need screens that use all 24 bits.  You pay the same speed penalty for 24 bit color as you do for 18.  But 18 is just a tiny bit better than 16.  Let's not take the hit until we get the full benefit out of it.

Second, we need devices to get fast enough that the speed hit isn't significant.  The desktop is there.  Mobile devices aren't yet.  Mobile devices are getting faster, but they're still RAM and CPU constrained. 

Alternatively, a clever hardware company could make a mobile LCD controller that takes a 16 bit Frame Buffer and dithers it up to 18 bits to match the screen.  That would get you some benefit out of the better screen with no hit in speed.  We would very much like to see this happen, but, to my knowledge, it hasn't yet.
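
Just to illustrate the idea (this is not a description of any real controller, and the code is purely hypothetical), such a part could expand each 16 bit pixel to 18 bits and nudge the low bit based on the pixel's position, so neighboring pixels average out to an in-between shade:

    /* Hypothetical sketch: expand a 16 bit RGB 565 pixel to 18 bit RGB 666,
       using a simple 2x2 checkerboard dither for the extra red and blue bit. */
    unsigned int DitherTo666(unsigned short p565, int x, int y)
    {
        unsigned int r5 = (p565 >> 11) & 0x1F;
        unsigned int g6 = (p565 >> 5)  & 0x3F;
        unsigned int b5 =  p565        & 0x1F;

        unsigned int noise = (x ^ y) & 1;     /* half the pixels round up */

        unsigned int r6 = (r5 << 1) | noise;  /* 5 bits -> 6 bits */
        unsigned int b6 = (b5 << 1) | noise;  /* 5 bits -> 6 bits */

        return (r6 << 12) | (g6 << 6) | b6;   /* 18 bit RGB 666 */
    }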

Mike Calligaro