When last we left my dead desktop computer, it had returned to the world of the living with the assistance of the onboard video adapter. The screen was fuzzy because I was running my LCD monitor through the analog VGA cable. Performing an auto-adjust helped a little but it was still blurry. Still, it was within the realm of acceptability for casual low-volume use.
Well, the computer once again died, and before it finally kicked the bucket, the onboard video adapter started pumping out corrupted pixels. My suspicion that my motherboard ate video cards was correct. In fact, its appetite for video cards was so voracious, it ate itself.
Okay, so here are the options I've been able to come up with so far.
- Feed the computer video cards.
- Replace just the motherboard.
- Replace the entire computer.
The first option is out because the rate of consumption appears to be one video card per month, which is a rather expensive diet.
The second option is a possibility, but the computer was purchased pre-assembled from a name-brand manufacturer, so the odds of finding a motherboard that exactly fits into the original case are pretty slim. I'll probably have to move everything to a new case.
The third option is the lazy way out, and is in fact the solution employed by most non-technical users.
For now I'm going to investigate option two. I'll have to take the computer apart to get at the motherboard anyway, and then I can figure out what type of replacement I need. (In terms of socketry and stuff.) Though who knows how long it will be before I actually get around to fixing the computer.
Meanwhile, my laptop, which was manufactured back in 2000, continues to chug along happily.