The C runtime library cannot be mixed and matched


In 2011, a customer had an application, written in C++ with Visual Studio 2003, that consumed a static library provided by a third party; let's call it contoso.lib. Now, contoso.lib is a static library compiled with Visual C++ 6. The customer was migrating from Visual Studio 2003 to Visual Studio 2008, but they were still using that old contoso.lib library that was compiled with Visual C++ 6. They were afraid that they would encounter unresolved externals due to name mangling issues, but they were pleasantly surprised when there were no such issues.

Now the questions.

  1. Is it correct to link a VC6 static library into a VS2008 project?
  2. Even if the linking is successful, do you see any issues or disadvantages with this approach?

The customer liaison's opinion was, "Due to missing security features like SAFESEH, GS, DYNAMICBASE, and NXCOMPAT, there may be a lot of drawbacks to using VC6 libraries in a VS2008 project. What do you think?"

It's nice that you're thinking about the security features added in recent versions of Visual Studio, using a generous definition of recent to mean less than nine years old. But the issue is more fundamental than just security. The issue is correctness.

You cannot mix libraries across compiler versions. That you're trying to mix libraries with compiler versions that are nineteen years apart in age is mind-boggling. The Win32 ABI does not extend into compiler-specific behavior, like the compiler's internal lookup tables for exception dispatching, private helper functions for RTTI, class member layout, the order of base classes, the implementations of STL classes, the layout of various internal structures, or what happens if you return FALSE from DllMain.

Name mangling will not catch any of these issues. If you modify a class, say by adding a new member variable or base class, the mangled names of functions that use that class do not change, even though the new class is probably incompatible with the old one.
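To make that concrete, here is a hypothetical sketch (the class and function are made up; they are not from the customer's code). The decorated name records the class's name but nothing about its layout, so the linker happily matches the two declarations:

```cpp
// Translation unit built the old way (what contoso.lib was compiled against):
class Widget
{
public:
    int id;
};
void ProcessWidget(Widget& w);   // decorates to something like ?ProcessWidget@@YAXAAVWidget@@@Z

// Translation unit built the new way (the revised header your project sees):
class Widget
{
public:
    int id;
    int flags;                   // new member: different size and layout...
};
void ProcessWidget(Widget& w);   // ...but the decorated name is identical, so the link
                                 // succeeds and the mismatch surfaces only at run time.
```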

Assuming you manage to dodge all the link errors, these discrepancies will manifest themselves as random failures or memory corruption at run time.

You will have to go back to Contoso and ask them for a version of the library that is compatible with Visual Studio 2008.

(Then again, since this question was asked in 2011, they may want to go straight to Visual Studio 2010, which was the most recent version of Visual Studio available at the time.)

Bonus chatter: Another solution is to create a project in Visual Studio 2003 whose sole job is to wrap the static library in a DLL. The rest of your program can be developed in Visual Studio 2008, using the DLL interface to access the static library.
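Here is a minimal sketch of what such a wrapper might export, assuming a hypothetical ContosoEngine class inside contoso.lib (the real names will differ). The point is that only plain C types cross the module boundary, so the old compiler's class layouts, CRT, heap, and exception machinery never leak out of the DLL:

```cpp
// contoso_wrapper.cpp - built as a DLL with Visual Studio 2003, statically
// linking the old contoso.lib. All names below are hypothetical.
#include "contoso.h"

extern "C" __declspec(dllexport) void* __cdecl ContosoCreate(void)
{
    return new ContosoEngine();                   // allocated by this DLL's CRT
}

extern "C" __declspec(dllexport) int __cdecl ContosoProcess(void* engine, const char* input)
{
    return static_cast<ContosoEngine*>(engine)->Process(input);
}

extern "C" __declspec(dllexport) void __cdecl ContosoDestroy(void* engine)
{
    delete static_cast<ContosoEngine*>(engine);   // freed by the same CRT that allocated it
}
```

The Visual Studio 2008 application then links against the wrapper's import library (or calls LoadLibrary/GetProcAddress) and never sees a VC6 class layout, STL object, or exception.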

Comments (29)
  1. Joshua says:

    The bonus chatter is the right way to do it. Now if it had used only C functions (and you can check this) you might have a chance. The only ones likely to break are getc and putc.

  2. AC says:

    If, on the other hand, Contoso already supplies you a DLL which dynamically links to the old CRT, you're in a world of pain too. Loading several DLLs (from different 3rd party providers) which depend on different MSVCRT versions doesn't really work. Or it does, until you look at it funny.

  3. Chris Long says:

    I had these symptoms (random failures, stack corruption, etc) in a piece of code when it was loaded and executed as a shell extension, but not when the same code ran as a stand-alone EXE. Both the DLL and the EXE were statically linked to the runtime so I suspected some sort of incompatibility when the code was running in Explorer's process, but various things I tried at the time didn't seem to work. What is the correct way to compile/link a shell extension DLL to avoid issues?

  4. Anonymous Cowherd says:

    This bit Flexera (authors of license enforcement software FlexNet Publisher) hard with VS 2015.  We had a bit of schadenfreude where I work, though unfortunately it means we can't use VS 2015 in production until they get their act together.

    connect.microsoft.com/.../1144980

    [Note the bug submitter is FNP-ENG]

  5. joe says:

    Can anyone please tell me:

    1) Does VS 2003 with VC 6 compiler give the same problem (like using VS 2008 with VC 6 together)?

    2) How does a DLL built in VS 2003 solve the problem?

    Thanks

    joe

  6. Random User 36183755 says:

    Semi-rhetorical question:

    Would it be reasonable to be a bit worried about a product that, via its many DLLs of varying ages, eventually links to the msvcr/p DLLs for nearly every version of 6.0 through 12.0?

  7. Brian says:

    C 6.0 was released in 1989 (19 years before 2008).  Visual C++ 6.0 was released in 1998, which is only 10 years apart from VS2008.  Still a long way, but not 19 years long.

  8. dimkaz says:

    @joe

    1. VS2003 with a VC 6 library is likely (unless for some reason MSFT explicitly worked hard on keeping them compatible) to have the same issues as VS2008 & VC6.

    2. A static lib is missing all the code from VC 6 (in this case) that it needs to link with: commonly, but not limited to, new/malloc/exceptions/static initialization/structures used from the C/C++ library, etc...

    By creating a DLL in VC6 you are telling the linker to merge all of those dependencies into that DLL (or link it against the older VC runtime DLL).

    The project in VS2008 will have its own copy of the C runtime (or link to a different DLL version of it).

    You can still run into trouble. Common issues: memory allocation/deallocation across the boundary, C++ exceptions across the boundary.

  9. Darran Rowe says:

    @joe:

    It will always be a problem. Compiler features often need support from the CRT; for example, C++ exception handling relies on a function in the MS CRT. This is one prime example because not only has the implementation changed, but the name has changed over the years. IIRC, this changed between VC6 and VC.net. What's more, VC.net was the first version that started work towards standard C++ behaviour.

    For how the DLL solves the problem, the DLL is a separate executable module to the EXE file. This means the .exe file can be linked to one version of the CRT and the .dll can be linked to another. This allows you to link all objects to the correct versions without mixing them, and if you have a C ABI for the DLL, then you can call functions in the DLL from the executable without issue.

  10. Myria says:

    Windows is somewhat unique in the modern native-code programming world in that the C runtime library is provided by the application, not the operating system.  In the UNIX world, there is a standard C library that comes with the operating system.  Changes are handled by newer compilers and headers linking to different internal symbol names.

    Then...there's C++ in the UNIX world.  That's pretty much the same as the Windows world.

  11. rob green says:

    I think the bigger problem is mixing the memory models between the components.  For example if the library function allocates memory it will come off of the vc6 heap, but if the contract is for the caller to free the buffer then it will free some random memory in the application heap.
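    A hypothetical sketch of that mismatch (ContosoGetName and its ownership contract are made up for illustration):

    ```cpp
    // Contract: the caller is supposed to free the returned buffer.
    char* name = ContosoGetName();   // allocated by the VC6 CRT, on the VC6 CRT's heap
    /* ... use name ... */
    free(name);                      // handed to the VS2008 CRT's free(): wrong heap,
                                     // so you get heap corruption or a crash later on
    ```

    The usual fix is for the library to export its own ContosoFreeName (also hypothetical) so the pointer goes back to the CRT that allocated it.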

  12. Darran Rowe says:

    @Myria:

    That is changing, though: Microsoft introduced the UCRT with Windows 10 and VC2015. The plan is to have an OS-supplied CRT.

  13. Cesar says:

    @Myria: "Then...there's C++ in the UNIX world.  That's pretty much the same as the Windows world."

    Not really nowadays, at least in the modern Linux world. Everything links to GCC's standard C++ library, even when you compile with another compiler (like LLVM's clang). It does the same kinds of things as the C library (different internal symbol names, etc), plus it can use namespaces. The name mangling and exception handling follow a standard (Intel's IA-64 C++ ABI).

    As a recent example, the C++ standard changed to mandate that the linked list's size() be constant-time. So now GCC's standard C++ library has two linked list implementations, in separate namespaces.

  14. John Doe says:

    @Myria, @Cesar, last time I heard, people are launching stock virtual machines (say, in Azure or a book store's leftover CPU power) with a stock OS (say, Linus' pet project) and *statically-linked* compiled binaries living on a cloud storage (say, SkyDrive or 3 Ss).

    It's still common practice to do this.  The problems are just about the same as in Windows.

    As long as you don't have to actually share the CRT, you're fine.  Things to look out for: having one library fopen and its consumer fclose, same for malloc/realloc/free, etc.

    Here's another solution, if you have the source code: export proper functions that call the CRT from the originating library, e.g. foo* create_foo() and void release_foo(foo*) (error checking elided for simplicity).  Or have the caller allocate and free memory.  As for files, the same goes, either export your file handling functions or have the caller implement an interface (struct with function pointers).

    In fact, this is good advice for your own libraries, so all of you just go now and do this.  Fast.

    Boy, this reminds me a lot of the very basics of in-process COM!

  15. dmitry_vk says:

    >if the library function allocates memory it will come off of the vc6 heap, but if the contract is for the caller to free the buffer then it will free some random memory in the application heap.

    That's why libraries should export their own resource management functions (which would simply wrap malloc/free/etc but guarantee that their implementation comes from the same library) or be able to use user-supplied resource management functions (via function pointers; like a vtable).

    That's pretty much a standard practice for libraries.
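    A sketch of the function-pointer approach (the names are hypothetical, not any particular library's API):

    ```cpp
    #include <stdlib.h>

    // The host hands the library its own CRT's allocator, so everything the
    // library allocates is freed by the same CRT that allocated it.
    struct ContosoAllocator
    {
        void* (__cdecl* alloc)(size_t size);
        void  (__cdecl* release)(void* block);
    };

    // Host side:
    ContosoAllocator allocator = { &malloc, &free };
    // ContosoInitialize(&allocator);   // hypothetical library entry point
    ```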

  16. Ken Hagan says:

    @ AC & Random User 36183755:

    I've had very little trouble mixing DLLs that use VC2005, VC2008, VC2010, VC2012 & VC2013 all in the same address space. They all did resource management properly (i.e., they provided functions to free whatever they allocated).

    The only gotcha I've seen is the handling of environment variables. At least for the older versions, the CRT maintains its own cache of the environment, so if you putenv() in one version and getenv() in another then the latter does not see the change you just made. Passing global context through environment variables is, thankfully, rare and so I only had to work around this once. (I used runtime dynamic linking to call the putenv() of both versions, and also SetEnvironmentVariable since I didn't want to be guilty of the same error.)
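    Roughly what that workaround looks like (a sketch; msvcr90.dll is my guess at the other module's CRT and would need to match whatever it actually loads):

    ```cpp
    #include <windows.h>
    #include <stdio.h>
    #include <stdlib.h>

    typedef int (__cdecl *PutEnvFn)(const char*);

    // Sets NAME=value in this module's CRT, in the other CRT (if loaded), and in Win32.
    void SetEnvEverywhere(const char* name, const char* value)
    {
        char assignment[512];
        sprintf_s(assignment, "%s=%s", name, value);

        _putenv(assignment);                        // this module's CRT cache

        if (HMODULE otherCrt = GetModuleHandleA("msvcr90.dll"))
            if (PutEnvFn putEnv = (PutEnvFn)GetProcAddress(otherCrt, "_putenv"))
                putEnv(assignment);                 // the other CRT's cache

        SetEnvironmentVariableA(name, value);       // the real Win32 environment block
    }
    ```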

  17. Azarien says:

    And a few days ago I finally threw out a static .lib that was compiled back in 1997 (so it must have been older than VC6, which shipped in 1998).

  18. DrPizza says:

    > Not really nowadays, at least in the modern Linux world. Everything links to GCC's standard C++ library, even when you compile with another compiler (like LLVM's clang).

    Unless you care about keeping pace with the latest standard, in which case you use LLVM's libc++ instead of gcc's libstdc++. And of course, in so doing you have incompatible binary representations of most objects.

  19. RS says:

    @Myria And that makes it very hard to compile software which has to run on older versions of Linux. Basically, if your executable has to work on e.g. Ubuntu 10.04, then your build machine has to run Ubuntu 10.04 or older. While for Windows you can use your Windows 10 build machine to build an executable which will still run on Windows XP (and probably older if you want).

  20. 640k says:

    @Ken: Don't forget to test all service packs also. There's no guarantee that the ABI won't change between service packs.

  21. cheong00 says:

    @640k: I thought that the compiler should be able to handle the binaries generated by one or two previous versions.

    After all, those component vendors seem to have no problem releasing library packages for one or two versions of Visual Studio.

    So I think even if the ABI can change across service packs, if you're using the latest SP level of VS to compile it, it should work fine.

  22. Yuhong Bao says:

    @Darran Rowe: I have an entire blog article on the history of the CRT. I haven't updated it for the final VC2015 yet, though.

  23. Anon says:

    @RS

    Why are you building for Ubuntu 10.04? Ubuntu 10.04 has been unsupported for 4+ months.

  24. Zhila says:

    Maybe he is developing for Yahoo! Connected TV, which only supports development specifically on Ubuntu 10.04 (though they do provide a compressed VMWare image).

  25. @Anon says:

    @anon Ubuntu is one example, and half a year ago it was almost 5 years old and still supported. There are other systems with a much longer life span. CentOS 5 was released in 2007 and will still be supported for another half a year.

    So what if, for some reason, you need to support CentOS 5? Then you'll only be able to use the compiler toolchain from around 2007. That's before even the first bit of C++11 was supported. If you're lucky you may be able to install a somewhat more recent version, but the tight coupling between the OS, the C library, and the specific compiler version makes this a very tricky exercise.

    I would consider the system used on Windows a big advantage. Even if you still need to support Windows XP, you can use the most recent Visual Studio version on the most recent Windows system.

  26. Ben Voigt says:

    @Cesar: That was true six years ago.  About five years ago, C++0x made copy-on-write string illegal, and ever since, treating G++'s libstdc++ as if it were "The One True C++ Runtime (TM)" has been a losing proposition.

    Oh the irony, that the Linux crowd who'd been lecturing Windows developers that "C++ has an ABI, sharing standard library objects across modules is perfectly ok"... no longer did, while Windows continues to have a highly compatible ABI for a fairly large subset of C++ features (the ones used by COM).

  27. Cesar says:

    @@Anon: Can't you just compile a new gcc on CentOS 5? There's no tight coupling between the C compiler and the C library or the kernel; you can compile a newer C compiler using the older C library and it should just work (I've done it before more than once, though not on CentOS). The only difficulty you might face is that newer versions of gcc are written in C++, so you might need to compile an intermediate version of gcc if it depends on newer C++ features; other than that, there's no reason it shouldn't work.

    I agree that not being able to specify the minimum ABI on the command line like you can on MacOS (-mmacosx-version-min=10.X) is a bit of a pain, though nowadays with Docker and similar it's gotten much simpler.

    @Ben Voigt: Well, the subset of C++ features used by COM doesn't include std::string or std::list (the two main pain points). If you used the same subset, the ABI would also be highly compatible on Linux, and it has been unchanged since they moved to the IA-64 ABI (which dictates things like the layout of vtables and the name mangling) a long time ago.

  28. Zan Lynx' says:

    @Cesar: If you have to provide and support a shared object built for RHEL 5 and intended for use by developers compiling with the RHEL 5 toolchain, then you have to limit yourself to C++03 because you cannot require your customers to hack up their officially supported RHEL 5 installs. Plus code updates and bug fixes need to be drop-in replacements that don't require rebuilding the applications that use the library.

  29. Rich Skorski says:

    The DLL wouldn't protect you from changes in class member layout if a class or class pointer is passed to the DLL though, right? Or am I missing something?

    [Right. If that's a problem, you will have to make the DLL deal with it. Everything that is dependent on class layout or sizeof(class) needs to stay inside the DLL. The interface to the DLL needs to be compiler-independent, which probably means a flat C-like interface. -Raymond]

Comments are closed.
