Alternatives to using the #error directive to check whether the compiler even sees you

In response to my description of how you can use the #error directive to check whether the compiler even sees you, some commenters proposed alternatives. I never claimed that my technique was the only one, just that it was another option available to you. Here are some other options.

scott suggested merely typing asdasdasd into the header file and seeing if you get an error. This usually works, but it can be problematic if the code does not already compile. And of course it doesn't compile, because the reason you're doing this investigation in the first place is that you can't get your code to compile and you're trying to figure out why. Consequently, it's not always clear whether any particular error was due to your asdasdasd or to the fact that, well, your code doesn't compile. For example, after adding your asdasdasd to line 41 of the file problem.h, you get the error Error: Semicolon expected at line 412 of file unrelated.h. Was that caused by your asdasdasd? It doesn't seem that way, but it actually was, because the preprocessed output looked like this:

int GlobalVariable;

After your asdasdasd, all that was generated was a bunch of #defines, #ifs, #endifs, and #includes. None of them generates output, so the compiler proper doesn't see anything; the preprocessor ate it all. Then, at line 412 of unrelated.h, a header file finally tried to do something other than just define a macro, and only at that point was the error detected.

But if you can pick the new error out of the error spew, then go for it. (There are also obscure cases where an extra asdasdasd doesn't introduce a new error.)

Since the string #error is shorter than asdasdasd, and it works in more places, I just go with #error.

Another suggestion came from Miguel Duarte, who suggested generating the preprocessed file and studying it. That helps, but the preprocessor output tends to be huge, and, as I noted in the base article, #define directives don't show up, so it can be hard for you to find your place. I also noted in the base article that if you use Visual Studio's precompiled header files, the contents of the preprocessed output may not match what the compiler sees. In fact, that's the most common reason I've found for a line being ignored: You put the #include directive in a place that the preprocessor sees but which the compiler doesn't see because you violated one of the precompiled header consistency rules, usually the source file consistency rule.

Comments (21)
  1. Tom says:

    I suppose that, for very large projects at least, pre-compiled headers can be useful to reduce compile times.  Most of my projects are not that large, however; and I have found that using pre-compiled headers sometimes leads to subtle compile or even link errors that I spend far too much time trying to track down.  Naturally, the solution is to "Rebuild All" whenever one of these problems arises, but that certainly seems to make using pre-compiled headers kind of pointless, doesn’t it?

    Thanks for the link to the pre-compiled header consistency rules.  I never knew about that stuff, and it goes some way to explaining how I introduced all those subtle compile and link errors into my own code. ;)

  2. Doug says:

    This reminds me of the time I joined a project with 500+ source files.  The OBJ directory had compile dates spread across 5 months.  And when I asked the "lead" if I could do a clean compile, he asked "why?".   The only good thing about that response was that it allowed me to assign the correct assessment of his skills….

    #error is usually the best way of detecting compile errors, for at least as many reasons as Raymond pointed out.  Probably the only reason not to is if you have a compiler that doesn’t support it…  Or does that date me too well.

  3. ChrisMcB says:

    @doug I’m curious what will happen if the compiler doesn’t support #error? Does it just ignore a preprocessor command it doesn’t understand?

    And what answer did you give your lead? And what was your assessment of his skills?

  5. kip says:

    You could always add:


    then the error will clearly say something along the lines of "unrecognized symbol: asdasdasd"

  6. Worf says:

    I understand the reasoning behind precompiled headers, but it seems like the most frustrating problems I run into are precompiled headers.

    The easiest are the ones where the compiler barfs and identifies a precompiled header as the cause, but I’ve had obscure build errors that I fixed by disabling precompiled headers.

    Most annoying yet is when I get fresh source code, build it, and get a precompiled header error – it’s a frickin’ clean build and the precompiled header is already out of date?

  7. #error asdasdasd says:


    So, let’s say the rebuild takes something like 10 hours and the old obj files are perfectly OK.

    Why do a rebuild? It’s not as though older obj files rot or rust over time. Unless there is a compiler or build-options change, there is no reason to rebuild all.

  8. Leo Davidson says:

    "the reason why you’re doing this investigation in the first place is that you can’t get your code to compile and you’re trying to figure out why"

    I’m not arguing against #error, but sometimes I go the "asdasdasd" route if the project is compiling but the changes I’m making don’t seem to have any effect and I want to prove that the file/part I’m editing is actually used.

    (There are a few ways to fall into that situation but a simple one is using Find In Files to find some code which turns out to be in an "old/unused code backup" file that’s no longer in the project, with the real code elsewhere.)

    #error definitely generates a clearer error message, and works in more situations, but it also takes slightly more thought*/effort to type than mashing the keyboard in frustration and hitting F7. :-D

    Sometimes I write a little message to the compiler, like "Hello!? Are you seeing this!??" which takes much longer to write, and produces very spurious error messages, but has a pleasant venting effect on the programmer. :)

    (*The main reason is habit, really. I didn’t learn about #error until after I was used to mashing, so I still instinctively mash.)

  9. Anonymous Coward says:

    I’d have to agree with Raymond that #error is generally the easiest way. However, I’ve met situations where #error didn’t seem to help, and I worked my way through the generated source instead, which was tedious but doable, mostly thanks to the preprocessor inserting #line directives and such so you don’t lose your way. In both cases it turned out the #error was reached in both the good version and the bad version, but in a different way. Still, most of the time it works, and it’s certainly the easiest, so it’s the first thing I try in such situations.

  10. ATZ Man says:

    @error asdasdasd

    There could be latent binary incompatibilities between the old OBJs and the ones that have recompiled.

    A header:

        struct old_school {
            int field1;
            int field2_was_short_previously;
            int field3;
        };

        void useful(struct old_school*);

    Source to the five-month-old OBJ:

        void i_never_change()
        {
            struct old_school x;
            x.field3 = 1;
        }

  11. Jonathan Wilson says:

    I know of one piece of software that does some hacks with #include

    What it does in one file called lets say abc.c:

    #ifndef x

    #define x

    #define abc(x) <blah>

    #include "abc.c"

    #undef abc

    #define abc(x) <blah 2>

    #include "abc.c"

    #endif

    If there was ever a weird hack, this is it…

  12. Jonathan Wilson says:

    Many times on some of my projects (all of which are stock Visual C++ 2008 projects with no special build stuff set up) I have had a bug show up that disappears when I do a "rebuild all".

  13. Katie Lucas says:

    "when I asked the "lead" if I could do a clean compile, he asked "why?"… it allowed me to assign the correct assessment level of his skills…."

    You’d likely get me asking why as well, for two reasons: 1 – I believe my build system works properly. And hence 2 – if you think it doesn’t, I’d like to know *why* you do, so I can find out if it’s a problem that needs fixing.

    Clean builds are a solution to a symptom, not the actual underlying problem.

    Please don’t turn "why" questions into that sort of nerd pissing contest that means problems don’t get solved.

  14. Mike Dimmick says:

    @Jonathan Wilson: then you have a regular, common-or-garden, dependency bug, which you should track down and fix.

    I will always do a rebuild all for an actual handover-to-someone-else release, but for development, an incremental build is fine.

    Virtually any Windows program (that uses windows.h) will benefit from precompiled headers.

  15. scott says:

    Hey, I remember that comment… Over a year later, and I just found myself doing it again yesterday, so I guess I haven’t changed my ways much.

    I often do it to make sure that I have the correct define (#ifdef AMD64 was the one I was looking for yesterday) and that I’m compiling what I think I’m compiling. In the case where the code isn’t compiling at all, though, I’d agree that #error makes more sense; why confuse the mess even further?

  16. I always use

    #pragma message("SOME TEXT")

    which has the side effect of not breaking the build, and turns up for every file in which it gets included.

    One way I use it is if I have some code I’ve included temporarily — e.g., some OutputDebugString calls.  I add some #pragma message stuff in all-caps so that I can see it in the build output and remember to remove the code when I’m done testing.

  17. Christian Kaiser says:

    There’s a small but nice compiler switch "/showIncludes". Guess what it does.

    This is useful if you don’t know from WHERE an include file was included, for example.

  18. Dave Harris says:

    @Mike Dimmick: one cause of dependency problems is Resource.h, which in VC++ is excluded from dependency tracking by default. If you remove a #define from it, code that relies on the #define will no longer compile, and will probably crash when run, but if you don’t rebuild all manually you may not notice.

  19. peterchen says:

    My reasons for having a clean build:

    VC itself has some holes in the dependency check (e.g. #pragma comment(lib, …), handling of resource files).

    Also, a clean build will turn up some dependency / build order bugs that can go unnoticed for a long time.

    A clean build process is de-facto documentation of *how* to build your sources, in case all that’s left is the source code repository.

    It also makes Joel happy.

    Still, incremental build should work in the general case – waiting for full rebuilds is no fun.

  20. GregM says:

    "one cause of dependency problems is Resource.h, which in VC++ is excluded from dependency tracking by default."

    For anyone who hasn’t run into this before, that is because of this line at the top:

    //{{NO_DEPENDENCIES}}
    which you can add to your own files to get the same behavior.  There’s nothing magical about the name resource.h.  (See TechNote TN035.)

    I’ve done that for a similar header file full of function IDs that is automatically generated and then included everywhere.  The existing IDs are stable, but new ones can be added.

  21. In response to my description of how you can use the #error directive to check whether the compiler…

Comments are closed.
