Some parts of an interface can change but others can't


When I wrote about asking the compiler to answer calling convention questions, some people were concerned about whether this was a reliable mechanism or whether it was relying on something that can change in the future.

This is a special case of the question, "What parts of an interface can change, and what can't?" And it all boils down to compile-time versus run-time.

Assuming you are interested in binary compatibility (as opposed to merely source compatibility), then a decision made at compile-time can never be changed because the decision is already hard-coded into the application. For example, if you have a function that takes a parameter that is an enumeration, say,

enum FOO_OPTIONS
{
    FOO_HOP = 0,
    FOO_SKIP = 1,
    FOO_JUMP = 2,
};

then the values of FOO_HOP, FOO_SKIP, and FOO_JUMP are hard-coded into any program that uses them. The compiler will generate code like this:

; foo(FOO_JUMP);

    push 2
    call foo

Suppose you later change the header file to

enum FOO_OPTIONS
{
    FOO_HOP = 2,
    FOO_SKIP = 3,
    FOO_JUMP = 4,
};

Making a change in the new version of a header file has no effect on any existing programs which were compiled with the old version of the header file. There is no way for the foo function to tell whether the 2 it received as a parameter is a FOO_JUMP from the old header file or a FOO_HOP from the new one.

Therefore, you cannot reuse values in any existing enumerations or #define's because the values are already compiled into existing programs. If you had given the value 2 different meanings in different versions of the header file, you would have in principle no way of knowing which header file the caller used. Of course, you can invent external cues to let you figure it out; for example, there may be a separate set_foo_version function that callers use in order to specify whether they are using the old or new header file. Of course, that also means that if there are multiple components that disagree on what version of foo they want, you have another problem.
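
To illustrate, such an external cue might look like this sketch, where set_foo_version and FOO_VERSION_2 are hypothetical names invented for this example:

// New callers announce which header they were compiled against before
// calling foo; old callers compiled against the original header do nothing.
set_foo_version(FOO_VERSION_2); // "when I pass 2, I mean the new FOO_HOP"
foo(FOO_HOP);                   // still compiles down to "push 2"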

Note that this is not the same as saying that the value of a symbol cannot change. We've seen this happen in the past with the PSH_WIZARD97 flag, but these sorts of redirections are rare in practice.

Another thing that is hard-coded into an application is the calling convention. Once code is generated by the compiler to call a function, that's that. You can't change the calling convention without breaking existing code. That's why you can ask the compiler, "How would you call this function?" and trust the answer: If the compiler generates code to call the function using technique X (register set-up, what gets pushed on the stack first, etc.), then the function had better accept technique X in perpetuity. Of course, you need to be sure that what you observe is in fact all there is. There may be parts of the calling convention that are not obvious to you, such as the presence of a red zone or maintaining a particular stack alignment. Or it could be that the function is called only from within the module, and the compiler's whole-program optimization decided to use a custom nonstandard calling convention.
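
As a concrete way of asking, you can compile a tiny probe that makes the call and read the generated listing. This is only a sketch; it assumes the Microsoft compiler, and probe.c, foo.h, and foo are placeholder names:

// probe.c: compile with "cl /c /FAs probe.c" and read probe.asm to see
// which registers are loaded, what gets pushed, and who cleans up the stack.
#include "foo.h"        // whatever header declares the function in question

void probe(void)
{
    foo(FOO_JUMP);      // the listing shows exactly how this call is made
}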

On the other hand, things determined at run-time can be changed, provided they are changed in a manner consistent with their original documentation. For example, the message numbers returned by RegisterWindowMessage can change because the documentation specifically requires you to call RegisterWindowMessage to obtain the message number for a particular named message.
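
For example, here is a sketch of the intended usage pattern; the message name "MyApp_DataReady" is made up for illustration, while RegisterWindowMessage and PostMessage are the real APIs:

#include <windows.h>

// The numeric message value is assigned at run time; only the registered
// name is the stable contract, so every component must look it up this way.
void announce_data_ready(void)
{
    UINT msg = RegisterWindowMessage(TEXT("MyApp_DataReady"));
    if (msg != 0) {
        PostMessage(HWND_BROADCAST, msg, 0, 0);
    }
}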

If you want to know how to call a function, it's perfectly valid to ask the compiler, because at the end of the day, that's how the function gets called. It's all machine code. Whether that machine code was generated by a compiler or by an assembler is irrelevant.

Caveat: I'm ignoring whole-program optimization and link-time code generation, which allow the toolchain to rewrite calling conventions if all callers can be identified. We'll see more about this in a future article. The technique described in this article works best with exported/imported functions, because it is not possible to identify all callers of such functions, and the compiler is forced to use the official calling convention. (You can also use it when inspecting .COD files for functions defined in a separate translation unit, for the same reason. That's the technique I used in the linked article.)

Comments (25)
  1. Joshua says:

    Which is why you can export a variable from a DLL. The fixup is the same as for taking the address of a function, so the loader has always handled it just fine.

    I keep having trouble explaining binary compatibility to coworkers. Somehow "don't change enum values for enums written to the database" doesn't sink in. One guy thought he could get out of it by using implicit enums. Figures he tried to insert one in the middle.

  2. Mordachai says:

    Or, "If you think your compiler is going to generate working code _at all_, then you can trust it."

  3. Mordachai says:

    @Joshua Perhaps everyone should have to take assembler at some point.  If you don't grok what an enum really is, mistakes like your coworker's are going to repeat themselves ad nauseam.  Other "magical" things would appear far less magical as well.

  4. Evan says:

    @Steve Wolf: "Perhaps everyone should have to take assembler at some point."

    Assuming you're talking about a CS degree, I'm under the impression that that's pretty much already true most places if not all. It was definitely part of the CS bachelor's requirement at both my undergrad and grad schools, and my undergrad in particular isn't exactly a stand-out university.

  5. Joshua says:

    Unfortunately, Evan is right. They teach assembler at the undergrad level, but it's really easy to get a passing grade without understanding how assembly language corresponds to compiler output at all, so they still think it's magical. Maybe if the compiler design class actually had to compile a primitive language all the way down to assembly they would get it, but that's a grad-student level course.

    Maybe there's a better way to chase away magical thinking but I don't know of one.

  6. Crescens2k says:

    @Evan:

    These days they seem to like Java more than anything.

    While I was doing mine, there was mostly Delphi/C with a few other things mixed in. But it has been transitioning more and more away from the lower level machine.

  7. Crescens2k says:

    @Joshua:

    The compiler design courses that I know of don't actually output working code of any sort. They went to three-address code at most.

  8. Azarien says:

    > Of course, you can invent external cues to let you figure it out

    And one of the worst ways to do that, which was actually implemented, was the invention of the Common Controls manifest.

  9. Muzer_ says:

    We did a little ARM assembly at Southampton in first year. It was great. There was a lab where you had to write a load of ARM Thumb assembly that was loaded by a small C wrapper, for which they provided only the compiler's assembly output. An extension task, which I performed, was to decompile the assembly output for the wrapper program back to C. That was fun.

  10. Evan says:

    @Muzer: "An extension task, which I performed, was to decompile the assembly output for the wrapper program back to C. That was fun."

    IIRC, our final ASM project in undergrad was to write a partial disassembler in assembly. (MIPS.) That was kind of a fun project, though I perhaps broke the spirit slightly by coding it first in a strange way in C and then "manually compiling" it to MIPS after I ran into a couple annoying bugs.

    One thing I was really proud of was that the program didn't even come close to following the usual calling conventions -- except for a couple functions, I was able to partition the registers between the different functions so that you could just call back and forth without doing any saving or restoring of registers, because you knew what all of the transitive callees would use and that they wouldn't clobber your data. I figured if you're going to program in ASM, might as well try to eke some extra performance out of it and do something the compiler would be very unlikely to do. Despite explicit instructions at the start of the course that we didn't _have_ to follow calling conventions, we got docked for that; one of the few times I've actually argued for points lost on a project.

  11. Evan says:

    "we got docked for that"

    I even submitted the text file that had the call graph of the programs and which registers were used where so that you could see at a glance the program was correct. :-)

  12. Matt says:

    Worth remembering that calling-conventions are, of course, architecture specific.

    So if your compiler says "Oh, the way you call functions in KERNEL32 is you push all the parameters on the stack and call it", that assumption holds for your current architecture - x86 - but not for other architectures like ARM or x64.

    Or if you say, "Oh, I know that parameter is eight bytes above the address of this local variable I just got via the & operator", that assumption is extremely fragile and liable to break on other compilers and architectures.

  13. Matt says:

    @Evan "I figured if you're going to program in ASM, might as well try to eke some extra performance out of it and do something the compiler would be very unlikely to do"

    Compilers routinely do that. It's called "Whole Program Optimization". You were just doing that by hand.

  14. Gabe says:

    My feeling is that if you can't grok that you can't change an enum that's already been saved in a database, taking an assembly class isn't going to help you.

    I once had an argument with somebody who thought it was possible that Microsoft might someday increase MAXIMUM_WAIT_OBJECTS to be more than 64. He didn't seem to understand that the number 64 is now hard-coded into countless different DLLs that aren't going to be recompiled.

  15. Evan says:

    @Crescens2k: "These days they seem to like Java more than anything."

    It's been a little while since undergrad (<10 yrs, but not a lot less), but not since I was in grad school. Sure, the curriculum still starts out in Java, and some higher-level classes use it. (I, um, may have taught a compiler class a couple times that targeted JVM bytecode. That was an undergrad class, BTW, which *also* seems to be available quite standardly -- though as an elective and not required.) But what I say still applies: I think a class that involves assembly (often combined with processor design and some VHDL work) is still _very_ typical.

    I think Joshua's diagnosis is probably closer to the mark: the connection between the C and asm isn't there. Although I'd argue it's also not really the place of University classes to hit something like the enum thing: that's a fairly C-specific artifact I'd say.

  16. Joshua says:

    @Gabe: Well it could be increased to (WAIT_ABANDONED_0 - WAIT_OBJECT_0). I'm too lazy to see if that really is 64 or not.

  17. Alex Cohn says:

    Gabe, the value could probably be increased. It cannot be changed to less than 64 without causing innumerable problems.

  18. Jan Ringoš says:

    @Joshua: It can't be increased, because WAIT_IO_COMPLETION - WAIT_ABANDONED_0 = 64. You would need a completely new API, which would be great by the way, but it would simultaneously need to be seriously backported for it to be useful sooner than in 20 years... and that rarely ever happens.

  19. Kevin says:

    @Evan: Were they docking you for violating standard calling conventions, or for not following *any* convention, standard or otherwise?  Because the latter seems a little more defensible to me.

  20. Cesar says:

    @Jan Ringoš: no need to backport, just create a helper library which calls the new API using GetProcAddress, and if it's not found, emulates it. You lose some performance due to the impedance mismatch on older systems, and lose a very small amount of performance on new systems due to the indirect call, but gain in programming effort and cleaner code. After a few decades, the code can be changed to call the new API directly (removing the helper library) to force your users to upgrade.
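
    (A minimal sketch of the fallback pattern Cesar describes; WaitForMultipleObjectsEx2 is a hypothetical "new API" name used purely for illustration.)

    #include <windows.h>

    typedef DWORD (WINAPI *PFNWAITEX2)(DWORD, const HANDLE *, BOOL, DWORD);

    DWORD WaitForManyObjects(DWORD count, const HANDLE *handles,
                             BOOL waitAll, DWORD timeout)
    {
        static PFNWAITEX2 pfn;
        static BOOL looked;
        if (!looked) {
            /* resolve at run time so the program still loads on systems
               that lack the new export */
            pfn = (PFNWAITEX2)GetProcAddress(
                GetModuleHandleW(L"kernel32.dll"), "WaitForMultipleObjectsEx2");
            looked = TRUE;
        }
        if (pfn) return pfn(count, handles, waitAll, timeout);
        /* emulation path: here just clamp to the existing 64-handle API;
           a real shim would fan out to wait threads */
        return WaitForMultipleObjects(min(count, MAXIMUM_WAIT_OBJECTS),
                                      handles, waitAll, timeout);
    }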

  21. Marc K says:

    @Azarien: "...was the invention of Common Controls manifest."

    Indeed...

  22. Joshua says:

    @Cesar: Indeed a form of said emulation already exists in Cygwin, where they can wait for up to 1024 sockets.

  23. Myria says:

    The rumor has been that link-time code generation on the original Xbox was the biggest reason that the Xbox 360 had so little backward compatibility: with link-time code generation, it was impossible in general to do high-level emulation of the DirectX API.

  24. Joker_vD says:

    @Myria: Wait, but LTCG doesn't affect functions linked from DLLs? And if you link against static libraries built without /LTCG, it won't do dirty tricks either, so... the rumor is the original Xbox games were linked against static DirectX library binaries built with the /LTCG flag on?

  25. Myria says:

    @Joker_VD: I know that they were.  The kernel had very little to do with the graphics system, and the original Xbox had no DLLs.  (Well, it had them in debug mode, but release titles weren't allowed to use them.  The retail kernel had the ability to load DLLs compiled out.)

Comments are closed.
