Is IT software development BETTER than embedded software development?

We hear all the time, especially in IT, about the dismal failure rate for software development projects.  So many millions wasted on project X or project Y that was cancelled or rewritten within a year.

So I've been part of a group of frustrated 'change agents' who, for years, have set out to find better ways to improve.  Better requirements, better estimation, better design.  More agile, more deliveries, more quality.  Tight feedback loops.  All that.  It works.

But then I get in my Toyota Prius and I can't figure out how to find the darned FM radio station that is tuned to my MP3 transmitter because it involves a complex interaction of pressing a panel button, followed by a button on the touch-screen, followed by a completely different panel button.

The labels on the buttons are meaningless.  The layout is seemingly random.  No IT software development process that I know of would get NEAR production with an interaction design like this, yet here it is, in one of the most technologically advanced cars in the world, from a world class innovative engineering company, after the model spent numerous years in consumer trials in Japan. 

That isn't the only example, of course.  Consumer electronics are full of bad interface designs.  I have a wall-mounted stereo whose LCD backlight is on in the default display mode, except that the default mode shows you the radio station, not the clock, and if you switch to the clock display, the backlight goes out.

How about the remote control that requires a complicated sequence of button presses to let you watch one channel while you record another (on your XP Media Center, TiVo or VCR)?  Or the clock radio with a "Power" button on the face to turn the radio on, but which reuses the Snooze button on top to turn it off, unless you happen to hit the 'Sounds' button in the middle, in which case you now have to hit the Power button first, followed by the Snooze button, to turn it off (I'm not kidding).

I have an MP3 player that doesn't let you move forward two songs quickly until it fully displays the title of the first song on the scrolling LCD display.  If it is not playing, and you press the play button for one second, it plays, but if you mistakenly hold the play button down for two seconds, it turns off.  Quick: Find song 30 and play it while driving... I dare you.

I use a scale that shows my weight and body fat and supposedly records previous settings, although I have yet to figure out, from looking at the six buttons (on a scale, no less), what combination of magic and voodoo is needed to actually get the previous weight to pop up.

How about the cell phone that makes me press six buttons to add a new entry to the internal phone book, or the microwave oven with 20 buttons labeled with things like "Popcorn" and "Soup," but which proves inscrutable if you just want to run it for 90 seconds on High?

All of these are software bugs or usability issues embedded in hardware devices.  Nearly all of these devices are inexpensive consumer electronics (except the car), and so their manufacturers were not particularly motivated to produce an excellent interface.

Yet, if a software application like Word or MS Money were to have some of these issues, that application would be BLASTED in the media and shunned by the public.  Software developers in hardware companies seem to get a pass from the criticism... that is, until a hardware company comes along that does it VERY VERY WELL (example: the Apple iPod) and puts the rest to shame.

I used to write software for embedded devices.  I understand the mindset.  Usability is not the first concern.  However, it shouldn't be the last either.

I think it is high time that we turn that bright light of derision on hardware products with sucky usability and goofy embedded software, with the same gusto that we normally reserve for source code control tools.

My expectations of good design have been raised.  You see, I work in IT.