Of Intel Macs and Red Herrings


Scott Byer at Adobe put up a very nice post about the switch to Intel and some of the growing pains many of us are going through. While I’ve added my own comments to Scott’s piece, a number of my colleagues in the Mac BU have asked me to weigh in on the subject here on my blog.

First off, it’s difficult for me to discuss specific facts. I’m under an NDA with Apple that precludes me from discussing specific issues with the transition, particularly issues related to the tools.

Nevertheless, there are some red herrings out there, and I’m going to try to dispel a few of them.

1) Steve said it was just a recompile! What gives?

I’ve discussed this one before, but it bears repeating. When Steve Jobs did his demo at the last WWDC, he was talking to an audience of developers. He knew, as did every other developer in that room, that getting the code to recompile was only the first step in a long process of testing and verification. No two compilers generate code in exactly the same way, and no software developer worth the name goes through a compiler switch without extensive testing.

A few reporters who’ve never written a line of code in their lives, and who didn’t take the time to interview a variety of developers for a different angle on the story, propagated the notion that developers would be able to move to Intel quickly. But that’s not what Steve was saying. Steve was only saying that the tools wouldn’t be a major obstacle–or at least that Apple was prepared to get the tools to the point where they wouldn’t be an obstacle.

2) Apple’s been advising developers to move to XCode for years. All you had to do was follow Apple’s advice.

At the time of Steve’s speech at WWDC, most of the major applications weren’t built using XCode. That’s true for both Adobe and Microsoft. That fact alone complicates matters considerably. When this fact has been pointed out, a variation on the “Steve said it was just a recompile” meme has emerged. According to this meme, all we really needed to do was follow Apple’s advice, and, well, we’re the bad guys for not having done so.

While Apple did advise developers to move to XCode, Apple was rather tight-lipped as to the fundamental reason why. In the meantime, developers had to weigh that advice against the fact that Metrowerks’ toolset was both significantly faster than the XCode/GCC combination and generated better code. No sane developer would sacrifice both a significant level of productivity and the quality of their product merely because Apple said so.

But all of that’s very much beside the point. Even if we had gone through the pain of porting to XCode/GCC in some earlier release of our products, we’d still have had to go through this pain. The time spent doing that work then would have come out of the features we were, instead, adding to our programs. Arguing that we should have, somehow, absorbed this pain earlier really has little bearing on the nature and extent of the pain. We’d still have had to do the work, and customers would have suffered as much then as they are now. The only difference is that, now, we have a tangibly legitimate reason for the suffering.

3) If you’d just ported from Carbon to Cocoa, your problems would have been solved.

I love this one. It so clearly demonstrates an astounding level of ignorance. I’ve already posted on the “Carbon vs Cocoa” issue, so I won’t really belabor the point here. I’m just amazed to see people arguing that porting not only from one compiler to another but from one application framework to another would, somehow, magically be less expensive than simply porting from one compiler to the other.

4) Some apps have done it, why can’t you?

In some ways, this is a legitimate criticism, but it glosses over significant differences between, say, PowerPoint or Photoshop and BBEdit. In this regard, there are two points worth considering.

The first is that the amount of work required to port a code base does not grow linearly in proportion to the size of the code base. There are a number of reasons for this, all of them related to complexity. For example, a more complex C++ program is far more likely to make use of certain language constructs (e.g. RTTI, templates and multiple inheritance) than would a less complex program. With more complex projects, the amount of work grows more on an exponential scale: i.e. twice the size of the code would require nearly ten times the amount of work to port.

The other is specifically related to GCC. GCC uses STABS to describe symbolic debugging information, which lets the debugger map the code it’s observing back to the programmer’s symbols so the programmer doesn’t have to do that translation by hand. For those who care to take the time, Red Hat has decent documentation on the STABS format.

The problem with STABS is that it’s very verbose compared to other formats (particularly the xSYM format that CodeWarrior uses). It’s fairly easy to verify this. If you have XCode installed on your computer, create a simple “Hello World” Carbon project. Rename the main.c file to main.cpp (C++ mangles names to support overloading), and add a couple of simple classes to your app. Now, build both the debug configuration and the retail configuration. To be even safer, run strip on your retail build. Now, compare the difference in sizes between the two builds.

I can’t say which application, but I have it on good authority that a modern C++ application has actually hit the virtual memory wall (i.e. the combined code + generated symbols resulted in executable code larger than would fit in virtual memory). And, no, it’s not an application that I’ve ever worked on or contributed to in any way.

 

Before I close this, I want to make two points very clear. First, I agree wholeheartedly with Scott’s primary thesis: I can’t imagine any developer who would prefer a long, drawn-out process of porting code from one build system to another to writing features that solve people’s problems. We are already doing as much as we can possibly do. Indeed, one of the reasons I haven’t posted very much as of late is because there really is a great deal of tedious work to do.

Secondly, I want to commend the tools group at Apple for the yeoman’s job they’ve been doing to help us make the transition. I’m reluctant to name names, but I honestly wish I could. I’ve met with these people, and they really are an outstanding crew. Thanks, everybody. You know who you are.

 

Rick

Currently playing in iTunes: Dixie Chicken by Little Feat

Comments (63)

  1. Dave Thorup says:

    Rick,

    After reading through Scott’s thread and realizing who you were, I came over here to read this great article on your blog.  I too was frustrated by the noise of arm-chair developers spewing all the common mischaracterizations of how easy the transition is and how _insert_name_here_ isn’t doing their job by being “slow” to update their applications.

    I too felt like writing my own article to address some of the myths about the transition, but your article hits the nail on the head regarding many of the points that I wanted to discuss.  I’ll probably still write my own article, as there are a few issues that I still want to address, but I’ll be sure to make yours required reading when I do.  Now I’ve just got to get a blog up and running.

    I just hope you don’t get the same tired arguments that Scott has had to deal with.

    Dave

  2. Chip says:

    Well I don’t know what you know about computers but anyone who is listening to Little Feat is on top of things.

  3. Scott says:

    I clearly remember a slide which presented the level of complexity of an app and the proposed migration steps, most of which started with "Move to XCode". I don’t know why any tech reporters who viewed the slides would walk away with "Just recompile" as the only step unless they are just lazy reporters. Most of the successful migrations to universal binaries I’ve seen have been simple apps or ones written by ex-NeXT devs (aside from the Apple apps). Hopefully they are working to increase Rosetta performance every day, but I fear there’s only so much they can do in that area.

    "generated better code than XCode/GCC."

    By this, I assume you mean tighter assembly code. Are the differences between GCC and the Metrowerks compiler known to the GCC team? Where do you see the biggest differences?

  4. mark says:

    Thanks for your intelligent and no-finger-pointing post.

    Who cares who’s at fault?  It’s always a combination of factors. All we can do is learn from it and use the info to try to avoid doing the same thing again.

    Thanks for working hard.  There are users out here eager to use your stuff to accomplish great things.

  5. Good job Rick. I think that if more of the major application devs from Microsoft, Adobe, etc., were to make these kinds of posts, it could really help avoid the "Adobe ported Photoshop to Carbon in two weeks, what’s your problem" syndrome that we are seeing now and that was really bad early on in Mac OS X.

  6. JD on EP says:

    Byer on MacTel, II: Photoshop engineer Scott Byer’s essay this week on porting to new hardware/OS hit a need and was heavily linked… this morning I see he has collected an incredible number of comments at the blog. I can see waves coming in from the

  7. Etienne Travailles says:

    Hey Rick:

    I wonder why you guys are porting Office and Word to the Intel/Mac platform anyway.  It runs perfectly well under Rosetta; I tried it out at the Apple store today on a MacBook Pro.  You aren’t going to make the MacBU any money doing the port; you are bound to reveal tons of obscure issues that are submerged in your decades-old codebase.  To use your lingo, the "opportunity cost" is just too high.

    A way better strategy, it seems to me, would be to get your OS guys to write a good compatibility layer for the Intel side of Mac OS X and just run the Windows Office natively, under Windows XP.  This has a lot of benefits for Microsoft; you’d be able to have everybody run the same code, so the cost of future development would be for one platform only.

  8. Preston says:

    Hell, I’m just extremely grateful Microsoft puts out a Mac version of Office at all.  The last thing I’m going to do is complain that they’re not ready with an x86 version the second Apple unloads those machines onto the world.  Office runs great under Rosetta.

  9. RM says:

    I wonder, Etienne, if you have used both Win Office and Mac Office.  I can’t imagine someone who has used both encouraging someone to drop production of the Mac version.  Rather, I would imagine everyone treating the very idea of it like the contents of Pandora’s box.  It’s not Mac fanboyism, it’s just that MOffice is such a better app to use.  Or, if you don’t believe that, it’s at least a better match for Mac users and also an independent source of innovation for a mostly stagnant area of software.

  10. pfb says:

    Hey, your ports will still probably beat Vista to market 🙂

  11. <I>A way better strategy, it seems to me, would be to get your OS guys to write a good compatibility layer for the Intel side of Mac OS X and just run the Windows Office natively, under Windows XP.  This has a lot of benefits for Microsoft; you’d be able to have everybody run the same code, so the cost of future development would be for one platform only.</I>

    I take it you never used Word 6? That’s what happened the last time a "good compatibility layer" was used to run Windows code on a Mac. That’s in quotes because the reality for end users was pretty awful – but, like the dog that plays checkers and only beats you one game in ten, it was amazing technical work to get the damn thing running.

    Mac users don’t want Windows ports that violate good Mac UI paradigms, don’t take advantage of any Mac OS functionality or features, and feel like shoddy Windows ports. And the fact is, the Mac BU is a rounding error in terms of Microsoft’s worldwide workforce, but the couple hundred million or so they bring in in annual revenue certainly is NOT.

    And the other thing is, this is old hat for the Mac BU. In the last 10 years, they’ve had multiple OS revs to deal with (7, 8, 9, X), porting to a different microprocessor architecture (Motorola 680x0 to PowerPC, PowerPC to Intel), oh, and having to go to a different API model for a lot of their stuff (Mac Toolbox to Carbon). It’s not like "OMGWTFBBQ OH NOES THIS IS TEH FIRST TIME WE’VE EVAR DEALT WITH APPLE HANDING US A SURPRISE!!11111". Sheesh. Welcome to developing for Apple, home of "We know it will cause pain for you developers, but we have a company to run".

  12. John McEnerney says:

    The depressing side of this story, from the perspective of one of the original Metrowerks developers, is that Metrowerks possessed the technology necessary to make this transition painless for Microsoft and Adobe: an x86 compiler using the same C/C++ front-end that ran on the Macintosh and generated (decent, if not equal to gcc) x86 code. With a small amount of engineering work on the object file format and linker, and some assistance from Apple on debugging, they could have had an x86 version of Codewarrior available for the transition.

    Sadly, the relationship between Apple and Metrowerks was never a good one, and the revenue from Macintosh tools was never enough to be worthwhile to Freescale. Add in the fact that aiding the transition would just hasten the drop in Freescale’s PowerPC revenue from Apple, and it was a nonstarter.

    I always hate to see a good product die (THINK C, CodeWarrior) and it irks me no end that Apple was able to achieve its desired goal of replacing CodeWarrior without having to actually build a better product.

    Porting a large software product (or, in the case of Microsoft and Adobe, 5-10 large software products) from one toolchain to another is an unpleasant, time-consuming, and mostly unrewarding job. I am waiting patiently for the Microsoft apps before switching to a MacBook Pro, and I am not blaming Microsoft for the delay…

  13. Andy Pastuszak says:

    Now how about some cleanup of your code when you port to Intel.  Can we PLEASE get rid of the Microsoft User Data folder from Documents and put it in Library where it belongs?

    I understand Office 2004 is a huge product, but I tend to feel developers like Adobe and Microsoft never plan to make their code more portable, which would have made these kinds of sudden changes from Apple a tad less painful.

    Though I am not a developer, I do work in IT and have seen LOTS of projects suffer from the "We don’t have time for that!" syndrome, where the bare minimum is done to get something out the door, knowing full well that there will be huge pain down the road when cleanup time comes.

  14. Rick Schaut has a great post on why moving to Intel is not just a simple recompile as some make it out…

  15. Kyle Wheeler says:

    I wonder… would it be cheaper for Microsoft or Adobe to buy CodeWarrior from FreeScale and make that compiler do what they need, rather than do all the work moving their software to Xcode?

  16. Pecos Bill says:

    Nope. CW sold their x86 compiler to someone else before being bought (or so I read).

  17. David Perlman says:

    "…According to this meme, all we really needed to to was follow Apple’s advice, and, well, we’re the bad guys for not having done so.

    "While Apple did advise developers to move to XCode, Apple was rather tight-lipped as to the fundamental reason why. In the mean time, developers had to consider that advice along with the fact that Metrowerks’ toolset was both significantly faster than the XCode/GCC combination and generated better code than XCode/GCC. No sane developer would sacrifice both a significant level of productivity and the quality of their product merely because Apple said so."

    I don’t want to be accusatory, because I know Rick is in the Mac BU, and I have a friend there, and I know you guys are on my side.  But I have to wonder about the contrast between the emphasis on fast-code generation here, and my experience with Mac Office.

    I use Office X.  When I use one of the built-in invoice templates, it takes about 300,000,000 clock cycles for each character I type to show up on the screen.  In general, in all other contexts on my computer characters show up instantly as I type (including right now as I type this message.)  I have a hard time believing that the code in Microsoft Office X for Macintosh is really as optimized as, well, as almost any other program I’ve ever seen or used.  Also, I somehow suspect that Office on Windows is dramatically more efficient than the Mac version, and that no tears are shed in upper management that non-Windows users might have a somewhat sub-optimal experience…

    I realize this is quite tangential to the question of porting to x86, but I’m just curious if someone who knows what they’re talking about better than I do might have something illuminating to say about this.  For instance, I’d like to think things would be getting better as time goes on; but I tried Office 2004 and it was even slower and harder to use…

  18. Jason Umiker says:

    I switched about six months ago to a PowerBook, in part because I preferred Mac Office to the Windows alternative. I can point to a few things like the Formatting Palette but, in general, it just felt easier to use and interact with. And it is able to do that with relative feature parity on all of the things I’d use, or want to use, in Office 2003.

    I have to admit that after watching this video (http://www.microsoft.com/office/preview/asx/OfficeUIIntro.asx) I am really impressed with what Microsoft is doing with the interface of the Windows Office, and I yearn for some of what they are doing as far as the new interface and ready templates to create good-looking documents and effects. It also struck me that their interface is very Mac-ish – it is the first MS interface on Windows that I think they could plunk down on a Mac machine and it wouldn’t look too out of place with very little change. The categories on the Ribbon even look quite a bit like the top of the Apple website.

    I know that the MacBU is going to have their hands full trying to get Office ported to the new XCode and Intel platforms and architectures, but to what extent and at what point can we expect some of that goodness? It would be the first thing in a long time that made me jealous of my Windows-using friends if their Office has a nicer interface and can easily make better looking documents than my Mac Office…

  19. slow_today says:

    Apple is on the working group which released the DWARF-3 spec in Jan 06 – not sure why Apple is still using STABS when the FSF GCC/GDB moved toward DWARF a while ago.

    I am surprised, though, to hear of the virtual address space exhaustion due to debugging symbols! Either GCC is mind-numbingly terrible when doing STABS or the project doesn’t have any concept of modularization at all. All bigger C++ projects (especially those dealing with UI) already have the concept of doing everything via ‘plugins’: only load them when they are required and optionally dump them when not required. It’s hard to understand how one would hit the 2GB limit just with debug information – something is horribly wrong.

  20. Ralph says:

    It has all been well-known for a _very_ long time, also that Mach-O binaries are the native format for OS X, not CFM. If Kevin Browne was right, more than half of your codebase is shared with the Windows version anyway (I forget the source of that interview but it has been said in public).

    In other news, Final Cut Studio is a Universal Binary shipping _today_. Doesn’t seem like a small project to me. If XCode cannot be used for large projects, how did Apple compile this beast?

    Even Quark has a beta out… I guess this is more a problem of justifying the cost of a full version upgrade. I understand that, the effort must be worth it, after all.

    But PLEASE be more honest to your users. It may not be simple to do the transition but even the Intel announcement is 9 months old, even older if you believed Scoble, your very own apologist, err, evangelist.

  21. Rick:

    About your comment that "GCC uses STABS to describe symbolic information".

    GCC 4.0.0 on Mac OS X 10.4.5 says it can generate debugging info in at least 7 different formats:

     -g          Generate debug information in default format
     -gcoff      Generate debug information in COFF format
     -gdwarf-2   Generate debug information in DWARF v2 format
     -ggdb       Generate debug information in default extended format
     -gstabs     Generate debug information in STABS format
     -gstabs+    Generate debug information in extended STABS format
     -gvms       Generate debug information in VMS format
     -gxcoff     Generate debug information in XCOFF format
     -gxcoff+    Generate debug information in extended XCOFF format

    I realize that VMS and COFF are other platform/dead formats, but just for grins, have you guys tried to use anything besides STABS?  "cc -gdwarf-2" works for me (and produces a "Hello, world!" binary 10% smaller than using "cc -gstabs" 🙂 )

  22. anonymous says:

    "All bigger C++ projects (especially dealing with UI) already have concept of doing everything via ‘plugins’."

    The application in question uses a lot of plugins. And that is a big part of why it hit the address space limit: duplicated symbols, lots of duplicated debugging info, address space fragmentation, etc.

    Plugins are not a panacea.

    DWARF: I think Apple’s GCC builds are a little behind the FSF (technically, they’re a branch with a lot of custom code added), and Apple would have to adapt all of their tools to handle DWARF before they can make that jump.

  23. anonymous developer says:

    Ralph:  CFM is just as native to OS X as Mach-O.  The OS can load and use both binary formats.  And both have the facility to contain binaries for multiple architectures, languages, etc.

    Avie, er, I mean Apple, may prefer Mach-O — but there are a number of drawbacks to using Mach-O, and very few benefits.

    Don’t you suspect that Final Cut had a little advantage because they’re an Apple application and have been using XCode/GCC already for a few years?  And have you asked any of the Final Cut engineers about how difficult their transition really was?

    The fact is, Rick is being honest, and Scott is being honest.

    And both are holding back a bit.

  24. Alan Sky says:

    I agree fully that Metrowerks Codewarrior was a great product and Office a great and huge software to port.

    Apple is not behaving very professionally with all these changes in the past 3 years. They have a history of messing with their partners, and now they have enough money to buy some and make a sub-micropoly.

    The Mac BU, on the other hand, should fix its Office 2004, especially the VBA and memory management, because Office 2004 is buggy software even without the Intel story. Partly because OS X Server file sharing support is flawed, and partly because Office looks like a monster to debug. The latest patch doesn’t fix anything in the VBA area.

    Fix Office, let Rosetta run it, and take a year to port your code to Intel: who cares about being native as long as Office works?

  25. MalEbenSo says:

    I agree about the disaster that Word 6 was as far as the Mac experience is concerned. What a relief when Word 98 came out. And I am also glad to see that Microsoft ports to Intel CPUs at all.

    But I cannot see how MS Office is really optimized in any way for the Mac: The Microsoft User Data folder has been mentioned.

    Worse than that, I find that all the Office apps consume CPU time when they should sit idle in the background. Open an empty Word, Excel, or PowerPoint document and Entourage, then check in Activity Monitor how they each eat anywhere between 0.5 and 5% while doing nothing for the user. What a nuisance, specifically for laptop users.

    Then there’s the font issue: many of the great typographic features of OS X’s font system are not used in Word X (I cannot say for Word 2004, although the non-idle issue has been confirmed for the current version). Just type "Zapfino" using the Zapfino font in TextEdit and in Word and see the difference. And there is a great deal more if you open the "Typography" menu and play with the advanced options of OS X’s text system.

    This is not to bash MS. Despite these deficiencies I still find it the best office package for the Mac available and I recommend it whenever people ask although the price beyond the student/teacher version is steep.

    I very much hope that MS uses the UB "opportunity" to bring out a great Mac Office that is indeed optimized and feels native not just in the GUI sense but also behind the scenes.

  26. slow_today says:

    anonymous – Why would you need to load all plugins at one time? If you don’t dynamically load a plugin, why would it consume address space, present duplicate symbols, and duplicate debug information?

    I don’t think that Debug info hitting address space limits can become a practical problem easily.

    Riot Nrrrd™ – GCC 4.0.3 says it can and so does Apple’s documentation, but if you try to use -gdwarf it errors out saying unknown option to -g.

  27. slow_today says:

    Riot Nrrrd™ – Scratch that – I had 3.3 in my path – 4.0.1 works fine with -gdwarf-2.

  28. Rick Schaut says:

    Clarence!

    We sorely miss your talents.

    Yes, I did make eWeek.  What’s more, the reporter actually got it right!  I’m stunned.

  29. Alexey says:

    hmm, intel cpu not so bad…

  30. I’m a couple of weeks behind in my podcasts, which is why I just noticed that the folks over at Your…

  31. What do you do in the down time between feedings when you’re on parental leave pulling night duty with…