The quest for the three screens

Like most developers, I have an overwhelming neurotic tendency to become insanely focused on trivial issues. One particular thorn in my side is my failure to get a good triple-monitor system set up at work. (Three monitors is the only proper way to work. That simply goes without question. If you don't believe me, then this article probably isn't for you.)

Back in the days of XP, this wasn't actually a big problem. All of the decent AGP video cards supported two monitors, so I just had to dig up any old PCI video card to complete the trinity. The third monitor would be a bit laggy, but it wasn't a big problem.

Unfortunately, the Vista display driver model made this solution more difficult. Vista's display architecture took several steps forward and enabled a lot of new capabilities, but to make it possible, a few rarely-used capabilities of the older architecture were left behind. One of these capabilities was the ability for Windows to work with more than one display driver at once.

Being a creative genius, I was able to resolve this issue by getting two new nVidia graphics cards. One was AGP, the other was PCI, but otherwise they were identical (same chipset and GPU). This worked, but everything on the PCI display was slow. This was especially noticeable when dragging windows from one screen to the other, and also when switching from one window station to another (as required for Ctrl+Alt+Del and UAC prompts).

I later upgraded to a PCI Express system. Unfortunately, it only had one x16 PCI Express slot, but I still managed to get the triple-monitor configuration working with two ATI video cards (one x16, the other x1). Again, the slower card was definitely noticeable, but I couldn't figure out a good solution that didn't involve a new motherboard, so I just tried to live with it.

At one point, I got a video card with three outputs (VGA, DVI, and HDMI), only to discover that only two of the outputs could be active at once. Apparently the video card cabal has a rule: thou shalt not create a video card that can support more than two monitors.

Recently, I found out that ATI had released a new line of video cards with "Eyefinity" technology. In theory, it supports three monitors, but only if one of them uses DisplayPort. None of my existing monitors supports DisplayPort, but minor technical issues like that have never stopped me before, so I immediately ordered a 5770 (I would have gotten a 5750, but my case didn't have room for a two-slot card, and the only single-slot card in stock at NewEgg was a 5770) and a DisplayPort-to-DVI adapter.

To make a long story short, after some work with a pair of tin snips (the single slot card fits perfectly, but only for certain definitions of "perfectly") and a bigger power supply (they're not kidding when they say you need a 450 watt power supply - the card was not reliable with just 350 watts), I did get the card installed and running on my desktop. But it still only worked with two monitors at once. Any two, but only two.

Apparently, all DisplayPort adapters are created equal, but some are more equal than others. ATI isn't particularly forthcoming about this, but the "passive" DisplayPort adapters (read: anything less than $50) apparently don't count as a DisplayPort monitor. The video card only has enough clock-generator circuitry to drive two VGA/DVI/HDMI outputs at once, and a passive DisplayPort-to-DVI adapter counts as a DVI output.

So I've ordered a second DisplayPort adapter. This time, it's an "active" DisplayPort-to-VGA adapter (one of my monitors is VGA only, and I suspect that by the time it is replaced, the replacement will have a DisplayPort input), so hopefully it will work. If it doesn't, I'll probably rip my hair out. Tune in next week for the exciting conclusion!