Breaking into Apple

Larry’s in a curmudgeonly mood today, so…

Over the weekend, I noticed this post on Digg: “How I learned to break into Apple and Code for them without Permission”.

It’s an “interesting” story, and I have to say that I was aghast when I read it.  And my jaw dropped even further when I read the Digg comments about it (yeah, I know – self-selecting trolls, just like on /., but…).

For some reason, the people reading the story actually thought it was COOL!

Now think about it.  These people were so dedicated to their product that they were willing to BREAK THE LAW to get it into Apple’s OS.

That’s cool (sort-of, if you’re willing to condone breaking the law).

But what does it say about Apple’s internal controls when the QA process on the final disks has things like:

Once again, my sanity was saved by the kindness of a stranger. At 2:00 one morning, a visitor appeared in my office: the engineer responsible for making the PowerPC system disk master. <snip> He told me that if I gave him our software the day before the production run began, it could appear on the Golden Master disk. Then, before anyone realized it was there, thirty thousand units with our software on the disks would be boxed in a warehouse. (In retrospect, he may have been joking. But we didn’t know that, so it allowed us to move forward with confidence.)

Wow.  So the contents of the software on the FINAL MASTERS for the operating system can change at the whim of a single release manager?  Doesn’t anyone ever check this stuff?

In addition, what was the state of Apple’s physical security?  Admittedly this was 1994, and things were somewhat looser, but still…  I’m sorry, but if you’ve got random ex-employees running around the halls spending HOURS with access to your CORPORATE NETWORK, what does that say about the level of security in your physical plant?  Apple got hugely lucky that these guys weren’t bent on corporate espionage.


Don’t get me wrong, I’m sure that Graphing Calculator is a SERIOUSLY cool app, and it’s clear that at some point a formal decision was made to include it in the product, and it got the attention it deserved (as mentioned in the article, Apple eventually decided to include it in the product).

Think of what would have happened if Apple hadn’t: Once the decision was made to include it in the product, they got test resources, they got localization resources, they got usability testing, etc.  None of this would have happened if it had continued as a “skunkworks” project, and they’d have shipped a product with serious flaws.


And this is a model we’re supposed to admire?


Comments (24)

  1. Chris says:

    Larry, come on, it’s not *that* clear cut. I used to work at Apple; I met Ron there and heard the story firsthand. I was also acquainted with Greg.

    0. These guys were known quantities to important individuals on the system software team at Apple. If they weren’t, they’d have been kicked out of the building.

    1. Controls of various sorts were much looser at Apple then. They’ve been tightened up considerably since.

    2. People in glass houses. Do you really want to encourage Slashdot trolls? Post about law breaking on an MS blog.

  2. John says:

    Somehow, the saying about stones and glass houses comes to mind.

    Perhaps Microsoft should get its own house in order before its employees whine about the procedures used in the past by Apple. Easter Eggs were rampant in older MS apps. And with the boatloads of patches coming on a monthly basis for "the world’s most secure operating system", it is clear that MS procedures need a lot of work. Underselling and overdelivering would be nice too.

  3. Chris, to me it is.  Apple CANCELED their project.  Apple made an executive decision to not produce the product.

    And the developers decided to override it by BREAKING INTO APPLE.

    There are a bazillion other ways they could have handled it: Building it on their own and trying to convince Apple to put it in the product, etc.  But no, they chose to STEAL from Apple.  And the policies of Apple were sufficiently lax that they let them get away with it.

    John, that’s apples and oranges.  There’s a HUGE difference between an easter egg inserted by employees and someone breaking into a building, and STEALING company resources for a skunkworks project.  And security holes have absolutely nothing to do with an almost total lack of control over process and facilities.

  4. David says:

    Process seems to be a hot topic, for better and for worse. For comparison, what would be the correct Microsoft way to include a simple but useful application like the XP PowerToy Calculator as an accessory installed by default in Vista? Apart from obvious stuff such as complete localization and fancy use of Glass in the graph window, how does one go from an idea and a set of source files to bits on the golden master? I’m assuming it is possible, but it involves "a few" steps and checks on the way.

    Is today’s process much different from what it was during the development of Windows 95 (since that seems to coincide with the Apple event)?

  5. Lauren Smith says:

    Think about it though. These guys were so loyal (in their minds) to this company and its OS that they were willing to bypass security to make the product better.

    The question isn’t why Apple was so bad at keeping these type of people out. It should be why Microsoft can’t engender the same type of loyalty and fanaticism that Apple and the MacOS can.

    Just like the commercials with the two computer guys. The Mac guy is self-assured to the point of cockiness. He believes he is the best OS out there with everyone else following his lead. Mac fanatics believe this. They see the OS as something that they personally have a stake in.

    From a project management perspective, having rogue developers and secret deliveries is the last thing you want because it brings in too many unknown variables (QA being a huge issue, as you mentioned). But knowing that people are breaking down your doors to make your product better is a huge boost to the community surrounding that product. This is one major positive point to OSS. The community rallies around the project because in a real sense they own the project.

    Is it insane that this type of thing was allowed to occur? You bet. But it is also incredibly cool.

  6. Dave says:

    Larry, I see your point and I agree in a "they broke the rules and must be spanked" way. Yet, I have to admire their dedication. They were looking for every possible way to ship their bits. Big companies like Microsoft look for reasons to NOT ship bits. Mark Lucovsky’s article about shipping software comes to mind:

  7. Lauren, that’s a very valid point, and you ARE right, the passion shown by these guys is very impressive. I just seriously disagree with their methods, and the environment that allowed those methods to succeed.

    David, the answer depends on who’s making the cool toy.  In general, a team in Windows owns each of the features.  There are also different vehicles for releasing products.  You already mentioned the powertoy example.  Because powertoys aren’t supported, the quality bar is much lower for them.  Another example of a distribution mechanism is the Windows Ultimate Extras stuff – that has a much higher quality bar.

    For getting a feature into the product (and continuing on your example, adding the PowerToy Calculator would be considered a feature), you need to have a series of specs (functional spec, design spec, test plan, threat model) for the app, you need to have dev and test signed up to work on it.  And you need to have approval from the set of people who are authorized to approve the feature.  There’s other stuff that has to happen as well, that’s well out of scope of a comment.

  8. Scott says:


    If you had known that the bug you fixed would have caused all the havoc during the infamous "text to speech" demo you recently blogged about, and had an opportunity, by breaking into an office at MS, to place a fixed version on the demo machine before the demo happened, would you have?

  9. denis bider says:

    Larry, I think you’re being a bit, and I’m just using this word because psychologists do, "anal". 🙂

    I think this is really a story about [1] how civilized people in general are driven by the desire to make a positive contribution, and [2] how individuals, being flexible, will readily recognize a positive avenue of development whereas an organization, being rigid, often will not, usually at the expense of the organization and the world at large.

    This story is an example of how lax policies are effective, not about what danger they cause. It shows how it’s the people and the mindset that matters; not the structure, and not the bureaucracy. It shows how the facilities manager who threw these people out at one point and the public relations person at Apple are rigid assholes who, with their anal reactions, probably didn’t contribute much good to the company or the society.

  10. Tim says:

    "We wanted to release a Windows version as part of Windows 98, but sadly, Microsoft has effective building security."

    I know this to be true, from personal experience. I was working on a saturday once, and wore an IBM T-Shirt from my last job. A security guard stopped me and asked to see my badge. 🙂

    Lauren: Microsoft does have that kind of fanatical loyalty to its products among some of the employees, but sometimes this gets dampened by bureaucracy.

  11. Barry Kelly says:

    I think you’re focusing on the negative in a hero story from olden days, when standards weren’t the same. It doesn’t smell nice – coming from a MS developer. In fact, it stinks.

  12. denis bider says:

    Here’s another way of looking at it.

    A security system succeeds if it lets in the good guys (people who contribute) and keeps out the bad guys (people who destroy). Apple’s bureaucracy, as described in the story, failed in this by keeping these guys out even though they had a positive contribution to make and were willing to do so at no one’s expense. Apple’s reality, being the people working there, corrected this system failure informally by ignoring the bureaucracy where it was harmful. The rules wouldn’t have been ignored had they been generally perceived as good.

    What Larry is proposing is that a system designed to be suboptimal should perform suboptimally, as designed. I propose that such anecdotes serve to teach us that systems should be designed around people, and not the other way around. If a company has people able to produce good work and willing to do this for free if the company’s financial reality isn’t good enough to payroll the work, then the company and everyone else is better off to the extent that it manages to capture this creativity, and the company and everyone else is worse off to the extent that it blocks it.

    Bureaucracies are usually created with the most common cases in mind, not the exceptions. To the extent that bureaucracies fail to gracefully handle exceptions, people that comprise those bureaucracies should compensate. After all, a foolish consistency is the hobgoblin of little minds.

  13. Alan De Smet says:

    You are being a bit curmudgeonly today.  Both American and hacker cultures make heroes of people who bend and break the rules.  If you break the rules and accomplish something generally perceived to be "good" you’re a hero.  You can even engage in outright crime and be admired for your daring if you’re bold enough (D. B. Cooper).  The Graphing Calculator story is old history (thanks to Internet-time); people are admiring people who wrote Good Software and weren’t afraid to Break The Rules To Do It.  (Sounds like an ad for a movie.)  So, yeah, what they did was cool; just like D. B. Cooper.

    I don’t think people are quite as irrational as you might fear.  The author wisely didn’t post it for several years.  If it were revealed that something like this had happened last year, I believe the response would be different.  While a subset might still admire the work, many more people would worry about the security of the operating system.

  14. Dale says:

    "Over the weekend, I noticed this post on Digg: "How I learned to break into Apple and Code for them without Permission"."

    Gawd, it makes you wonder how many other people were slipping into the place to write software.

  15. JS says:

    Why is anyone getting worked up over this? It happened 12 years ago. What’s the point, that 12 years ago the QA controls Apple had in place were bad? 12 years ago everything in software was bad, or perhaps I should say even worse than now.

  16. denis bider says:

    The argument that this somehow reflects bad QA controls at Apple (or any other company) is bogus. In the end, everything goes through the hands of people. Rules don’t decide anything. People do. People determine the rules, and sometimes people are smarter than the rules.

    The fact that something happened through an informal process rather than a formal one doesn’t mean that it was bad. In fact, the reverse is also true. Just because something happens through a formal process, it doesn’t mean it’s good.

    Suppose Microsoft’s bureaucracy was much more effective in those days, and informal decisionmaking like that did not occur. Did that prevent Larry from inadvertently – or advertently – introducing a buffer overrun into LAN Manager that could later be exploited as a remotely exploitable arbitrary code execution vulnerability? No, it did not. And the number of people who sign off on the master has nothing to do with it.

    Even if there are 100 sign off people, they aren’t going to go overnight through 50 million lines of the source code. Or else, there would be no bugs in Windows.

    Formal policies are OK as guidelines, but relying on policy to do the thinking instead of the individual just lulls the company into a false sense of security.

  17. In answer to the question you ended with, no, this is not a model we’re supposed to admire.  It’s not a model at all; it isn’t meant to describe the way Apple planned in the past, or plans now, to develop software.  We’re supposed to be amused by the pluck of the rogue programmers and the others who helped them, and, I think, to wonder a bit at how Apple managed to so bungle the dismissal of contractors that they managed to keep coming to work for months.

  18. Shog9 says:

    It’s the sort of story that seems terribly unlikely and slightly foolhardy. Like stories of early aviators, or that guy who tested that heart catheter on himself. That’s what makes it interesting. Conversely, it’s also what makes most of these jobs so soul-suckingly dull – nothing new, nothing unexpected, nothing tried, nothing gained.

    Yeah, if i was the building manager, QA manager, product manager, blah blah blah… But i’m not. And so i *can* enjoy it.

  19. orcmid says:

    I figured this comment thread would be interesting.  There are two factors that come up for me.

    The first is developers placing their judgment ahead of the users and the responsible organization.  Whatever the sense of commitment and sense of rightness, subverting the system removes all checks *and* involves the users and customers without anything like their consent.  I first noticed this back when "tyranny of the technician" became part of the language in conjunction with more serious matters (like arms-control policies and building things like Star Wars).  There is considerable arrogance around such actions, it would seem, or in the case at hand, gross carelessness justified by the rightness of a course of action.

    My first thought was about a later experience.  For a time I was in an organization that built serious hardware products with embedded computer systems.  As a manufacturing company, there was a pretty serious product development process that worked from product conception to turnover to manufacturing.  There was also a larger customer delivery process.  I was impressed to learn of the second, but as a participant in the first, I quickly recognized that engineering was perturbing the second without any righteous feedback (let alone virtuous) from the customer-delivery side regarding maintainability, usability, etc.   The engineers were pretty much making it up.  And it was interesting to see some pretty major projects get cancelled late in the product development because of some serious operability failing.

    I think programmers often disdain the serious non-development considerations that enter into deploying and supporting a software product.  I know I did.  I still tend to approach software for its own sake.  It may be great for me and satisfying, but it is not the same as putting in the extra effort it takes to satisfy real customers and users with a reliable product.  

    It would be great to have a lighter-weight means for putting software into people’s hands where it is not worth it to either the developers or the users to carry the burden of productization and hardening, as well as the costs raised by commercial deployment and support.  It looks like Microsoft is slowly introducing such alternative pathways.  I don’t know what Apple is doing or what the Apple community is doing.

    I confess I was amused when I first read of the incident some time ago.  Thanks, Larry, for pointing out that there are deeper lessons that we can bring into our present-day conduct.

    – Dennis

  20. Ron Avitzur says:

    What a fascinating discussion!

    I’ve given these issues a lot of thought, both at the time and over the years since. I still ponder the ethical ambiguity of the situation. It is only trespassing if we did not have permission. We were given permission later, retroactively making it legitimate. But there was no way we could know that at the time. I certainly told myself that we were "doing the right thing" and believed that. But at the same time, I was self-aware enough to realize that that was just frustration and bull-headedness and my ego talking. Even looking back on it now, I’m still not sure if I pulled one over on them, or if I was horribly naive and taken advantage of, or perhaps both.

    It was certainly arrogant and egotistic of me at the beginning to think that I knew what was in Apple’s best interest better than the folks tasked with making project decisions. Had I been arrested I would have accepted the consequences knowing full well that I had earned them. Yet, when push came to shove and we came clean with what we had been doing, management confirmed my judgment and agreed that it was worthwhile and ought to ship.

    We could not have done any of this without the support of a lot of people who knew us, trusted us, and helped us at every step along the way. We were not strangers breaking in during the dark of night.

    And yes, to Alan: the events are 13 years old, I wrote the story 10 years ago, but waited to publish it until less than 2 years ago, now that Apple is no longer "beleaguered" and the story is just an amusing history. The Apple of 1993 was a place of chaos, lacking in vision and leadership at the very top, entirely unlike the Apple of today. Nothing like this would happen there today – nor would there be any need to operate that way.

    As to the question of Quality, we took that extremely seriously as professionals. So much so, that for nearly a decade after we finished, Apple used Graphing Calculator to test sick machines. Apple repair folks would run Graphing Calculator in Demo mode overnight, and if it crashed, they classified that as a hardware failure. I like to think of that as the theoretical limit of software stability.

    >And this is a model we’re supposed to…

    I was not trying to preach. If anything, the tale is a Rorschach test. Everyone reacts to it differently, reflecting their own experience. For me, I wrote the story as memoir. I like to reminisce.

    By the way, I’ve continued working on it ever since. There is a Windows release now. Send me an e-mail if you’d like to play with it, or if it’s not too late and you know Microsoft folks looking for extras to go in Vista. 😉

    Best regards,


  21. Doug Karr says:

    Imagine working on a painting for months only to not finish, or work on a car for years to only have it taken away, or work on a symphony and never be able to finish it, build a house and almost finish… I can say first hand that one of the worst things you can do to a programmer is to not let them finish their work.  

    I work for a company right now that is extremely successful, but we’re just pulling our developers out of a rut.  The issue was that some folks had them developing on a few projects in a row where the plug was pulled.  It’s had a terrible effect on their morale and pride in the products they are developing.

    A good developer is like a good craftsman.  The satisfaction of the job only comes once it is complete.  This isn’t about QA, stealing, breaking in… it’s about passion and vision.

    Dare I say that it’s also about ‘open source’ development and its success.

  23. Abiel says:

    I agree with those who think Larry’s spouting like an anal bureaucrat. Given the slack, people will work around flaws in the rules-based system. Rules can never deal with all the exceptions and contingencies, and people who write software for a living should know this. In fact, even the most carefully contrived formal systems are incomplete (Gödel’s 1931 paper on formal systems).

    These guys did a good thing. People inside of Apple knew about it. Once revealed, they were handled as the positive exception they were. People who rigidly and thoughtlessly follow rules, never recognizing that the rules aren’t always "right", need to wake up. Just because it’s in the rules doesn’t mean it’s either good or right. Just because it’s a law doesn’t mean it’s just or moral. Just because it’s "provable in a court of law" doesn’t mean it’s true. Rules based systems are never without flaws. Most of them, because they are human contrivances, are terribly flawed. Luckily, many people aren’t rigid assholes and can work to correct them as the exceptions emerge. We need more "correctors" in the system, not more rigid assholes.

  24. beorg says:

    remember, laws create criminals. by definition