Why no Easter Eggs?


Yesterday’s post caused a bit of a furor in the comments thread.  A large number of people leaving comments (and others) didn’t understand why the OS division has a “no Easter Eggs” policy.

If you think about it, it’s not really that surprising.  One of the aspects of Trustworthy Computing is that you can trust what’s on your computer.  Part of that means that there’s absolutely NOTHING on your computer that isn’t planned.  If the manufacturer of the software that’s on every desktop in your company can’t stop its developers from sneaking undocumented features into the product (even features as relatively benign as an Easter Egg), how can you be sure that they haven’t snuck some other undocumented feature into the code?

Even mandating that you have access to the entire source code to the product doesn’t guarantee that – first off, nobody in their right mind would audit all 10 million+ lines of code in the product before deployment, and even if you DID audit the source code, that wouldn’t prove anything – Ken Thompson made that quite clear in his Turing Award lecture.  Once you’ve lost the trust of your customers, they’re gone – they’re going to take their business somewhere else.

And there are LOTS of businesses and governments that have the sources to Microsoft products.  Imagine how they’d react when they discovered the code – especially when they were told that it was a “Special Surprise” for our users.  Their only reaction would be to wonder what other “Special Surprises” were in the code.

It’s even more than that.  What happens when the Easter Egg has a security bug in it?  It’s not that implausible – the NT 3.1 Easter Egg had a bug in it: the egg was designed to be triggered when someone typed I LOVE NT, but apparently it could also be triggered by any anagram of “I LOVE NT” – as a result, “NOT EVIL” was also a trigger.
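The original NT 3.1 code isn’t public, so the following is purely a sketch of how that class of bug arises – assume the trigger check compared the *letters* of the typed phrase rather than the phrase itself:

```python
SECRET = "I LOVE NT"

def letters(s: str) -> list[str]:
    """Normalize a phrase: uppercase, drop spaces, sort the letters."""
    return sorted(s.upper().replace(" ", ""))

def is_trigger_buggy(typed: str) -> bool:
    # Buggy check: compares letter multisets, so ANY anagram of the
    # secret phrase fires the Easter Egg.
    return letters(typed) == letters(SECRET)

def is_trigger_fixed(typed: str) -> bool:
    # Correct check: the exact phrase, and nothing else.
    return typed.upper() == SECRET

print(is_trigger_buggy("I LOVE NT"))   # True, as intended
print(is_trigger_buggy("NOT EVIL"))    # True – the unintended anagram
print(is_trigger_fixed("NOT EVIL"))    # False
```

The point of the sketch: once the code sneaks past review, nobody tests it, so an over-permissive matcher like this ships unnoticed.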

Going still further, Easter Eggs are perceived as a symptom of bloat, and lots of people get upset when they find them.  From Adequacy.org:

Now if you followed the link above and read the article you may be thinking to yourself…
  • Is this what MS developers do when they should be concentrating on security?
  • How often do they audit their code?
  • What’s to stop someone from inserting malicious code?
  • Is this why I pay so much for Windows and MS Office?
  • I know other non-MS software contains EEs but this is ridiculous.
  • One more reason why peer review is better as EEs and malicious code can be removed quickly.
  • Is this why security patches take so long to be released?
  • This is TrustWorthy Computing!?!
    From technofile:

    Even more disturbing is the vision of Microsoft as the purveyor of foolishness. Already, the cloying “Easter eggs” that Microsoft hides away in its software — surprise messages, sounds or images that show off the skill of the programmers but have no usefulness otherwise — are forcing many users to question the seriousness of Microsoft’s management.
       A company whose engineers can spend dozens or even hundreds of hours placing nonsensical “Easter eggs” in various programs would seem to have no excuse for releasing Windows with any bugs at all. Microsoft’s priorities are upside down if “Easter egg” frills and other non-essential features are more important than getting the basic software to work right.

    From Agathering.net:

    “and some users might like to know exactly why the company allows such huge eggs to bloat already big applications even further”

    I’ve been involved in Easter Eggs in the past – the Exchange 5.0 POP3 and NNTP servers had easter eggs in them.  In our case, we actually followed the rules – we filed a bug in the database (“Exchange POP3 server doesn’t have an Easter Egg”), we had the PM write up a spec for it, and the test lead developed test cases for it.  We even contacted the legal department to determine how we should reference the contingent staff who were included in the Easter Egg.

    But it didn’t matter – we still shouldn’t have done it.  Why?  Because it was utterly irresponsible.  We didn’t tell the customers about it, and that was unforgivable, ESPECIALLY in a network server.  What would have happened if there had been a buffer overflow or other security bug in the Easter Egg code?  How could we POSSIBLY explain to our customers that the reason we allowed a worm to propagate on the internet was the vanity of our developers?  Why on EARTH would they trust us in the future?

    Not to mention that we messed up.  Just like the NT 3.1 Easter Egg, we had a bug in our Easter Egg, and we would send the Easter Egg out in response to protocol elements other than the intended ones.  When I was discussing this topic with Raymond Chen, he pointed out that his real-world IMAP client hit this bug – and he was more than slightly upset at us for it.
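The details of that Exchange bug were never published, so here is a purely hypothetical illustration of the failure class: a toy POP3-style dispatcher (all command names invented) whose egg trigger uses a sloppy substring test, so protocol elements it was never meant to handle fire it too:

```python
def handle(line: str) -> str:
    """Toy command dispatcher; the commands and trigger are invented."""
    verb = line.strip().split(" ", 1)[0].upper()
    # Buggy trigger: a substring test instead of an exact match on the
    # verb, so ANY command containing "EGG" returns the hidden credits.
    if "EGG" in line.upper():
        return "+OK secret developer credits follow"
    if verb == "STAT":
        return "+OK 0 0"
    return "-ERR unknown command"

print(handle("STAT"))              # normal command, normal response
print(handle("XEGG"))              # the intended trigger
print(handle("RETR eggnog.txt"))   # unintended: an argument fires it too
```

An exact comparison on the parsed verb (`verb == "XEGG"`) would have confined the egg to its intended trigger – but of course untested, undocumented code never gets that scrutiny.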


    It’s about trust.  It’s about being professional.  Yeah, it’s cool seeing your name up there in lights.  It’s cool when developers get a chance to let loose and show their creativity.  But it’s not cool when doing it costs us the trust of our customers.


    Thanks to Raymond, KC and Betsy for their spirited email discussion that inspired this post, and especially to Raymond for the awesome links (and the dirt on my broken Easter Egg).


    Edit: Fixed some HTML weirdness.

    Edit2: s/anacronym/anagram/

    Comments (35)

    1. I think you meant anagram instead of acronym

    2. Anonymous says:

      Wow. So Exchange POP3 server actually had an egg where it would return a list of developers as an email message instead of a "normal" email?

      Yeah, I can see how that would be a problem. Especially if the key phrase is the one listed here:

      http://www.eeggs.com/items/19978.html

    3. SteveJS says:

      Jensen Harris has an amusing blog post about a bad encounter with a Microsoft Easter Egg when he was four years old:

      http://blogs.msdn.com/jensenh/archive/2005/10/20/483041.aspx

      I was really bummed about the No Easter Egg policy when it was announced. Now, the last thing I want to do is Security Review a single line of unnecessary code.

    4. Anonymous says:

      The Exchange 4.0 server had a 30MB Easter Egg… although at that size it was pressing the definition of "egg". What struck me was your comment:

      "But it didn’t matter – we still shouldn’t have done it. Why? Because it was utterly irresponsible."

      I guess we’ve all gotten a little older and wiser, eh Larry?

    5. Victoria French says:

      We still put Easter Eggs in our products, but they are not installed by default. They are an extra set of code that the user must opt to install. The Easter Egg component is also removable once they find it.

      People do like Easter Eggs and I disagree with most of the quotes presented. But giving the user a choice to install it allows for the customer to trust you.

    6. Anonymous says:

      The REAL question is: without having the source to build the final executable with a compiler (which was itself built from source, ad infinitum…), can you ever be sure that they’ve not snuck some other undocumented feature into the code?

      People who have even a little common sense and are that suspicious will ask for source. They know that’s the only way to be sure.

      The explanation you gave is not even childish – it’s something sillier than that: asking for the removal of easter eggs and accepting that as proof that they’ll never get malware embedded in the software, for instance.

      Thing is, someone at MS may still put in Easter Eggs (and, less likely, malware) which will never be discovered – like the WAV files built with warez’ed tools that ship with Windows XP – who knows if they don’t cause a buffer overflow somewhere?

    7. Anonymous says:

      I understand the security issues w/ Easter eggs. But those concerns must be RELATIVELY small compared to actual security issues in Windows. I mean, it’s like comparing a tornado w/ wind speed of 200 mph and one with 202 mph.

      I must say, my XP SP 2 is pretty solid. But the UX of running a LUA account leaves much to be desired. I hope Vista does LUA well, and MS could have another hit on its hands. Good luck.

    8. Garry Trinder says:

      Wow.. So many reasons except most important one – There is no way to make money from Easter Eggs in software.

    9. The interesting thing about the 30MB easter egg was that it was never installed. It was just an AVI file stuck on the Exchange CD (renamed to be .MDB).

    10. Anonymous says:

      Adequacy.org is a well known troll site! YHBT, HAND.

    11. Anonymous says:

      A good example of how these things can get out of control is the "NSA_Key" issue from some years ago. Presumably someone chose that name as a joke, or without considering its potential interpretation. Whatever the rationale behind the name, it caused a firestorm of protest about "NSA backdoors" in Microsoft products. Oops!

    12. Anonymous says:

      What constitutes an easter egg? Would the list of developers that used to be in IE’s (4.x IIRC) about box be considered an easter egg? I think there’s a gray area in there: a list of team members is not exactly an easter egg, at least not in the way the IE team’s done it – it did not interfere with the normal (expected) operation of the software, it was a part of the about box. If Exchange returned an e-mail with its team member list in response to an e-mail with a trigger phrase, then that might have crossed the line, as that would not be expected behavior. But if it sent an e-mail in response to an action in the about or acknowledgment dialog boxes, that might be acceptable, even though still considered an easter egg.

    13. Anonymous says:

      The scary thing is that if one developer can slip an innocent Easter egg into an OS, another could also implement a back door.

    14. Anonymous says:

      Jerry wrote:

      > What constitutes an easter egg?

      Larry is saying (if I understand him) that the key thing is, is the behaviour documented? So, if you /documented/ that the keystroke sequence Alt, Ctrl, 1, *, 8 displayed a list of developers then sent an email to Father Xmas, then, that is not an easter egg!

    15. Anonymous says:

      So Microsoft responded to corporations and other institutions who care greatly about security and removed the so-called easter eggs. It was an inexpensive thing to do, but did it really add much to the security of Windows?


      I’m all for trustworthy computing. How about we have a defined and spec’d "credits" box that you can only get on if you have done something for the product that improved its security?


      In the entertainment industry and esp. the movie industry, contracts spell out what people have to do to get on the credits and how long they run on screen etc. While software isn’t there, it might be another incentive for developers, testers, PMs, PSS people, Beta testers etc to do the right thing to get their name on the product. Just a simple acknowledgement could be enough for many people to want to contribute.


      The original Mac had the development team’s names on the inside of the case. They took pride in their work and Mr. Jobs took pride in them.


      Personal aside: I was the video tester for Windows NT when the screen saver with the developers names in it was shipped. The developer definitely went out of his way to obscure the code. (Strings.exe would not find them) The first I heard of it was from PSS people who wanted to be put on the list i.e. I was contacted to enter a bug. This was in the 1995 time frame so it predates any corporate dictums.

    16. Anonymous says:

      >> The explanation you gave is not even childish – some thing sillier than that. Asking for removing easter eggs and accepting that as a proof that they’ll never get malware embedded in the software for instance.

      Ummm, he never said that…

    17. barrkel says:

      > One of the aspects of Trustworthy Computing

      > is that you can trust what’s on your computer.

      Trustworthy computing is a DRM initiative. That sentence should read:

      + One of the aspects of Trustworthy Computing

      + is that Microsoft can trust what’s on your

      + computer.

    18. Anonymous says:

      "A good example of how these things can get out of control, is the "NSA_Key" issue from some years ago. Presumeably someone chose that name as a joke, or without considering its potential interpretation. Whatever the rationale behind the name, it caused a firestorm of protest about "NSA backdoors" in microsoft products. Oops! "

      It’s more than just that, Windows used to have entry points named:

      Death

      Resurrection

      PrestoChangoSelector

      TabTheTextOutForWimps

      WinOldAppHackOMatic

      UserSeeUserDo

      Bunny_351

      Brute

      FixUpBogusPublisherMetaFile

      I don’t think these names are innately bad _private_ names, but to have them exposed in export tables was pretty bad.

    19. Anonymous says:

      Of course, the easy way around all of the easter egg stuff is for MS to start putting credits in the About screen for each product. That way the devs get their names up in lights… and everyone’s happy.

      That doesn’t address the "cool factor" of writing a cool easter egg though, of course.

    20. Anonymous says:

      People will complain no matter what you do; it’s pointless to try to pander to everybody. The no-easter-eggs policy just reinforces in my mind that Microsoft is nothing more than a faceless, humorless Big Corporation, and therefore not trustworthy.

      There’s nothing at all wrong with them as long as they aren’t "snuck in" and go through the same quality review that everything else does.

    21. Anonymous says:

      In fact, an EE is nothing more than just one feature. In any MS product there are many features which are never used by a particular user – either because the user does not know about them, or because he can’t imagine how he can use them or what they were added for. In the case of an EE, the usage is quite obvious – a bit of fun. So an EE looks even better compared with other unused features. The same is true about security: unused features carry more security risk than an EE, because an EE is quite simple (usually a bit of UI).

    22. Anonymous says:

      SPJ wrote:

      "People who have even a little common sense and are that suspicious will ask for source. They know that’s the only way to be sure."

      Read the link Larry gave to Ken Thompson’s lecture. Even if you compile from source, you *cannot* be sure what the program will do.

    23. Anonymous says:

      re adequacy.org

      Have you read that site before, or did you just google for contrary opinions? It sits well alongside the story on AMD: http://www.adequacy.org/public/stories/2002.1.28.153048.268.html

      Not everything is as it seems….

    24. me: Actually, Raymond gave me the links, and I used them. Afterwards, I learned that Adequacy was an intentional troll site, but it doesn’t really matter – even though that post is a troll, the others aren’t.

    25. Anonymous says:

      Carlos –

      From the Ken Thompson article – "Figure 6 shows a simple modification to the compiler that will deliberately miscompile source whenever a particular pattern is matched. If this were not deliberate, it would be called a compiler "bug." Since it is deliberate, it should be called a "Trojan horse." "

      In my post, I said we need the compiler source and library source and the OS source etc. to be sure enough to trust a program. You _can_ trust source code, but not an infected compiler binary that produces untrustworthy code from trustworthy source. So if you have source for everything, you can theoretically verify that it does what it is supposed to do and just that.

      Oh and yes, the CPU which executes the code needs to be trusted!

    26. Anonymous says:

      SPJ wrote:

      > Oh and yes, the CPU which executes the code needs to be trusted!

      Indeed, I gather that some of the CPU microcode update procedures are now well known publicly. So I guess, with enough smarts, you /could/ trojan a modern CPU. For those not aware, google on "microcode update" (including the quotes) for lots of relevant hits.

    27. Anonymous says:

      A good story about "why no easter eggs" is the tale of the Adobe Photoshop easter egg that was snuck in by a developer and therefore wasn’t tested. Of course, it had a globalization bug which caused it to crash on systems that used double-byte characters, so there was much embarrassment when, one April 1st, the application became useless in various squiggly-text-using countries around the world. The embarrassment was so great in Japan that Adobe actually went to the trouble of printing stickers apologizing for the problem and slapped them on the boxes sold in Japan.

    28. Anonymous says:

      I applaud this policy as a security engineer.

    29. Anonymous says:

      Well, this year I didn’t miss the anniversary of my first blog post.

      I still can’t quite believe it’s…