DLinq Mapping: External Or Not (Attributes or XML file)

At PDC (and before that on ObjectSpaces project), this was a hot discussion topic. What approach is better: attributes or external source like an XML file. I would like to open up this topic beyond the 1:1 conversations at PDC to the blogosphere. I will start with a small backgrounder and you tell us which one makes more sense to you.

Advantage attributes

  1. Really simple for the basic cases. No need to create or consult (e.g. for debugging) another source file. Works with current tools that handle attributes.
  2. Easy access to mapping metadata without additional APIs.
  3. Requires recompilation for changes. No, this is not a mistake: although some situations make recompilation problematic, having the compiler check things is usually better than getting run-time exceptions.
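For the basic case, attribute-based mapping looks roughly like this (a sketch based on the DLinq preview bits; exact attribute names and properties may change before release):

```csharp
using System.Data.DLinq; // preview namespace; subject to change

[Table(Name = "Customers")]
public class Customer
{
    [Column(Id = true)]             // primary key
    public string CustomerID;

    [Column(Name = "CompanyName")]  // maps to a differently named column if needed
    public string CompanyName;
}
```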

Advantage external mapping

  1. Same classes can be mapped to multiple data models using different mapping files (this includes different DB/table/column names, different server vendors etc.)
  2. Avoids clutter in source code, especially in case of complex mapping
  3. Hides database-specific information from users of object model
  4. Some schema changes can be handled without recompilation (IMO, the class of changes that can actually be handled this way is often exaggerated)
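For comparison, an external mapping file equivalent to a set of [Table]/[Column] attributes might look something like this (a hypothetical shape using a Customer/Customers example; the actual schema is not settled):

```xml
<Database Name="Northwind">
  <Table Name="Customers" Type="Customer">
    <Column Name="CustomerID" Member="CustomerID" IsPrimaryKey="true" />
    <Column Name="CompanyName" Member="CompanyName" />
  </Table>
</Database>
```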

Of course, a natural question is why not support both? Well, as a user, which one should I choose? How do I decide? Do I need to learn both ways? As builders of the ORM component, do we split our resources between two things that achieve more or less the same result instead of focusing on one?

So, what do you think? Do you have compelling scenarios that cannot be handled using one of the options?

Comments (27)
  1. Dinesh, I think you might remember who I am (I’m on the customer council, and was a little harsh when it came to ObjectSpaces).

    Personally, I really like using attributes. I agree that complex mappings might make for cleaner source code, but keeping the source code simple doesn’t mean that the complexity is erased.

    In that case, out of sight doesn’t equate to out of mind. In the cases of additions or deletions (of fields or relations) to your schema, you are going to have to change the source code anyways, since those fields will no longer be valid, and your assumptions will have changed.

    The only advantage from your list that I see as being valid is #1. However, I think that with the right design of the attributes, you can achieve this easily using an attribute-based approach.

  2. Douglas McClean says:

    I think mapping the same types to multiple data models is a compelling reason. I frequently use my existing mappers to map the same types to different schemata for server side and smart client stores. I may also wish to map the same types to both XML and relational DB schemata.

    I also think that isolating business objects from any knowledge of the database is a compelling reason in and of itself.

    Compile-time checking of external mapping information is still definitely possible, and in fact I would expect it from a tool as sophisticated as Visual Studio. Let us put the mapping information in a separate project, in some kind of mapping file, and you can compile that to ensure that it makes sense. You could even provide compile-time checking against the actual database metadata.

  3. I’m a fan of external. My own ORM tool (http://savannah.nongnu.org/projects/nrdo) uses external files for one very important reason: it actually *builds* the database using the information in those files, as well as creating relevant indexes, foreign keys, etc. At compile time the schema is checked and if any changes are needed (add/drop columns, indexes, tables, etc) the changes are made accordingly. Oh, and it also generates the C# source files.

    While it’s *possible* to put enough information into attributes to build the database from it, you’d be hard-pressed to do that and still make the files as easy to write as an external file can be.

    It goes the other way, too. You’d be hard-pressed to get enough information out of the database structure alone to be able to create classes that represent that structure in the most natural way – in particular, it’s not possible to tell what "get" methods to provide based on the database structure alone (this particular limitation matters less with Linq, since it allows the programmer to specify an arbitrary where clause, but there are others).

    Using an external file and building both the database and code from that makes life very easy, in that you only have to define your table structure once, rather than building the database, then writing a class, then providing a means (attributes or mapping file) to connect the two together. It also lets the table structure itself be in source code control, and makes moving schema changes from dev to live very automatic.

    Attributes can’t do any of this 🙂

  4. Keith Farmer says:

    The only external mapping scenarios that I’ve ever seen as compelling were for database providers and database command wrapping when the database provider (and, therefore, SQL syntax) may change.

    So if the entity update command needs to vary, then some way needs to be provided for it. Perhaps [UpdateAttribute(providerName, command)], [UpdateAttribute(providerName, xmlFile, commandId)] would be suitable overloads?
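The overloads suggested here could be declared along these lines (purely illustrative: UpdateAttribute and its parameters are the commenter’s hypothetical names, not a shipping API):

```csharp
[AttributeUsage(AttributeTargets.Class, AllowMultiple = true)]
public class UpdateAttribute : Attribute
{
    // Inline command text for a specific provider.
    public UpdateAttribute(string providerName, string command)
    {
        ProviderName = providerName;
        Command = command;
    }

    // Indirect form: look the command up by id in an external XML file.
    public UpdateAttribute(string providerName, string xmlFile, string commandId)
    {
        ProviderName = providerName;
        XmlFile = xmlFile;
        CommandId = commandId;
    }

    public string ProviderName;
    public string Command;
    public string XmlFile;
    public string CommandId;
}
```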

  5. Paul Wilson says:

    I have quite a few people that use my WilsonORMapper for database independence, because they create/sell products that must support multiple database vendors. At times this requires slightly different mappings for different databases, even though the code is the same in many of those cases.

    Probably even more significant is the number of people that must support their database tables having a configurable prefix so that collisions with existing database tables do not occur. In this case the table names, and possibly even the field names (since customers can be peculiar and they are paying the bills), are determined by the end customer — simple to handle in an external mapping file.

    I also know of cases in highly regulated environments (medical/FDA) where a change in the source code that would require a recompile is simply not legally allowed, even though an external "configuration" is allowed — and the law doesn’t care whether or not you think that is just as risky or even more so.

    There are also cases where you make a reusable set of domain objects that you use in multiple projects with different tables — something which external mappings make easily reusable, but internal attributes that mark something external to the class make more difficult.

    So the only advantages I see to internal attributes are that some people (but not all) view them as simpler, that they reduce your need to have another file (which no one argues for in most other cases), and that they make things easier for tools (which is a tool issue, and isn’t really true anyhow). I’m more than willing to admit that some people prefer internal mappings; but in that case you should also be aware that more people seem to prefer external mappings — so you either have to allow either, or require the one that solves more problems without creating more in the process. And while many people may never see the types of problems that others have, those of us who work in O/R Mapping are very much familiar with the very real-world problems that external mappings solve.

  6. Gabe says:

    I think the important scenario you are forgetting is when you don’t have a consistent object store. I might want to pull data out of an XML file on my PDA, a SQL Server DB on my server, and a WinFS store on my client. Who would want to have to ship a different set of DLLs for each one?

    While the attribute approach is fine for my custom in-house app that will only ever run against SQL Server, anything that goes out to clients will need to be able to configure the data access layer at run-time.

  7. Soumitra Sengupta says:


    Aside from the advantage of loose coupling (think of a query over a multiple-database scenario), the biggest benefit of an external mapping file is less corruption of the language. As you may have noticed, in the Java world annotations are evolving into an expression language of their own, and many people have commented on how horrible it is and how it messes up the readability of programs.

    What I would like to see is limited custom attributes for mapping, external mapping files with deep Intellisense integration for complex scenarios, and a good way to deploy compiled mapping files along with the DLLs and EXEs.

    Does this make sense?




  9. Eron Wright says:

    An obvious reason is that external files go "against the grain" of .NET development. They create an impedance mismatch between the files that collectively specify a class.

    Attributes are also strongly-typed. I realize that the XML designer in VS2005 is extensible and can be made aware of the mapping schema. Can VS2005 provide intellisense over the +values+ that would appear in the document? A typical value might be the name of a corresponding member in a class, or even a class name itself.

    Lastly, partial types are the typical answer to the desire to have external and/or generated files.

  10. Wolf Logan says:

    I don’t have a strong opinion one way or the other at the moment — I’m still getting used to the possibilities in the brave new world of LINQ. But I note that the distinction between attributes (internal) and mapping relations (external) boils down to two patterns: internal mappings are essentially driven by the database, while external mappings allow (to a certain extent) both the code and the database to vary independently.

    Internal mappings make the most sense to me when the code is written as support for the database. Usually this happens in legacy systems, where the database exists a priori and is maintained by a cadre of DBAs who have more important things to worry about than your little application. In this world, the database is forever, and your application is only there to support it. Attributes are a perfect solution here, since the database is only going to change slowly, if at all, but your code may well have to do complex things to get its job done.

    When the database "grows up" alongside the code, though, there’s no clear driver. Some days the code will change rapidly, and other days the database will change. Occasionally, either before or after deployment, the database may be completely replaced. In this world, it’s important to decouple both sides, and an external mapping is a good choice.

    The complex scenario that comes to mind is the case where several pieces of code are brokering data among several databases at once, and each database (and perhaps each code assembly) is owned by a different entity (perhaps the databases are in different departments, perhaps they’re in different companies). In this world, external mappings are better, since they can be shared, but they seem brittle. It’s easy in this scenario for changes to get out of sync, even in a carefully controlled development environment (which we don’t always have).

    This scenario starts to sound like the web services problem, with producers and consumers having to negotiate their schemata in order to communicate. I don’t have a grand solution, but I figured it was worth mentioning the scenario.

  11. Martin Nyborg says:

    Gentle.NET uses attributes and is able to work with many different DB providers. Try it out. I have just used it for a commercial product and I was very pleased with it.

  12. Hi, I posted a fairly long comment yesterday and it hasn’t shown up. Any particular reason for that?

  13. I prefer attributes myself, and that’s what I built into my ORM. I like that it is close to the code and that you only have to write half of the mapping–with the external files, you have to specify both the class structure and the database structure. However, attributes allow your existing class declarations to serve as half of that.

    I also agree that it is rare to make changes to the database interface that don’t affect the code (and thus would require recompilation anyways). You can’t totally decouple the two, and I don’t see how having the external file loosens the coupling in a very meaningful way.

    If you write your ORM against the System.Data.Common namespace, you can plug-n-play different vendors to a great extent anyways, so if that’s really a concern for folks, there are ways to address it using the attribute approach.
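The System.Data.Common route mentioned above is real: in ADO.NET 2.0 the provider factory model lets the same data-access code target different vendors, with only the provider name and connection string coming from configuration. A minimal sketch (the Customers table is an assumed example):

```csharp
using System.Data.Common;

public static object CountCustomers(string providerName, string connectionString)
{
    // providerName might be "System.Data.SqlClient", "System.Data.OracleClient", etc.,
    // typically read from app.config rather than hard-coded.
    DbProviderFactory factory = DbProviderFactories.GetFactory(providerName);
    using (DbConnection conn = factory.CreateConnection())
    {
        conn.ConnectionString = connectionString;
        conn.Open();
        DbCommand cmd = conn.CreateCommand();
        cmd.CommandText = "SELECT COUNT(*) FROM Customers";
        return cmd.ExecuteScalar();
    }
}
```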

    Regions in code can be used to hide lengthy attribute declarations, if you really care. That’s hardly a good reason not to use attributes.

    I don’t think it makes sense to want to hide high-level database mapping information from the classes. Most applications that need persistence services will need to know they’re there and, at a high level, what is persisted (and, potentially, how). So I actually think having the mapping inline is a good thing on that count: there’s no hidden or unexpected persistence behavior.

    So as I see it, the attribute approach offers, more or less, everything the external file approach offers without the drawbacks of additional file management, extra mapping information declaration, extra debugging sources, etc.

    While I’m not opposed to having the XML mapping as an option (I actually was working toward that in my ORM), I really, really hope you at least support the attributed approach.

    Heck, you could even be able to specify an XPath value on an attribute to mapping information in a supporting XML file if you thought that particular mapping would be volatile or was particularly verbose/complex. An analogy would be how you can import XML comments from external files while still being able to keep other comments inline with the code.

    I think the main drawback to attribute approach is that a mapping designer would have to muck with the source because we can’t declare the target of an attribute in another partial class (because we’d have to redeclare the member). I actually see this as a problem with the attribute syntax in general and made a suggestion for it on the Feedback Center (http://lab.msdn.microsoft.com/productfeedback/viewfeedback.aspx?feedbackid=b8ca2beb-c1c2-4002-b7f7-d2677d42c0fe) that I hope you reconsider. You’re already adding new stuff to C# for LINQ anyways, so it’d be a good time to consider adding something like this to make it possible to define only attributes in a partial class (without redefining the member) and to be able to override them more easily.

    I think that custom attributes are the key to truly reusable code; we just need to make them a little easier to deal with. 🙂

    There is a third approach to mapping which is better than attributes or XML. We used it in our O/RM DataBlock (see http://www.voidsoft.ro). That’s mapping directly through code (using class inheritance). The advantages are speed (it’s much faster because you don’t have the reflection overhead of attribute mapping) and also that you don’t have to cache the mapping information.

    Also, you mention "easy access to mapping metadata without additional APIs". That’s not true, because with attributes you still have to write reflection code to get the mapping information.

  15. Dave Brann says:

    My team is working on an O/R mapper similar to DLINQ today. We chose an external XML-based metadata file because we must support customer-driven customization (adding a column to a table). We generate strongly typed objects (STOs) from the metadata file. These STOs wrap the O/R mapper machinery. If a customer adds a DB column, we regenerate the STOs with a new property that maps to the new DB column. We could not do this if the metadata were built into the STO.



  16. omen says:

    I think the best solution would be having attributes that define the default mappings, then having the option to override them through some mechanism:

    Northwind.UpdateMapping(MappingSource, mappingoptions…);

    with MappingSource implementing IMappingSource or something.

    This can be generalized to having a facility that takes a class and generates a MappingSource from it based on its attributes, so that the attributes defined in the DLinq class get applied on startup as the default:

    self.UpdateMapping(new MappingSource(self));

    An IDE facility can also be provided that allows the specification of a file, an existing class, a database source, or anything that implements IMappingSource, and then updates the attributes of the DLinq class automatically.
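A minimal shape for the interface being proposed might be (IMappingSource is the commenter’s hypothetical name, not a real DLinq API):

```csharp
public interface IMappingSource
{
    // Return the table name a CLR type maps to, however the mapping
    // was obtained (attributes, an XML file, a database, ...).
    string GetTableName(Type entityType);

    // Return the column name a given member of that type maps to.
    string GetColumnName(Type entityType, string memberName);
}
```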

    I’m still getting my head wrapped around this, but I envision in the future total integration with SQL Server. The client and the database will be seamless at design time and execution time, taking full advantage of seamless query optimization, shared type systems, remoting and security. No more distinction between the middle tier and the database tier… the middle tier is just an execution context. Distributed systems nirvana.

  17. Ernst Naezer says:

    Hello Dinesh,

    I would very much appreciate an external mapping file. It provides a natural border between the business logic and the database domain, much in the same way that the app.config file can be used to configure external connections for an application.

    The other reason to stick with XML files (or at least support them) is that they can be generated and parsed a lot easier than code.

    Why not support both of them? I was reading the introduction to the Windows Workflow Foundation (found at: http://www.msdn.microsoft.com/windowsvista/building/workflow/default.aspx?pull=/library/en-us/dnlong/html/WWFIntro.asp) and although they have a much more complicated scenario that includes more than just ‘simple’ attribute mapping, they share this view:

    "Why provide this markup-based option for defining workflows? One reason is that some developers prefer to work in this style, at least for certain kinds of applications. Another reason is that many tool builders find it easier to create tools that generate and parse XML rather than generating and parsing code. In fact, Windows Workflow Foundation’s Workflow Designer generates XML, so developers can always see the XML version of workflows created using this tool. A workflow can be built from any combination of Workflow Designer output, developer-written code, and XML, and however it’s created, it’s ultimately compiled into a standard .NET assembly."

    Keep up the good work!

    Kind regards,

    Ernst Naezer.

  18. In my book, advantage #3 for external mapping is the big one. Does anyone in their right mind think that hard coding physical table names in the assembly is a good thing? An external mapping that overrides the attributes might be a fair middle road but overall I believe external mappings are the correct choice. Another advantage is the ability to store query definitions in the mapping file. Intellisense for LINQ queries defined in the mapping file would be great!

  19. jemiller says:

    I think you should do what they did with EJB 3.0 Persistence and that is to support both. Personally, I would probably prefer to use attributes. However, I think point #1 under Advantage external mapping is important, so, I think XML mapping files should be supported as well. I think it should be left to the developer to choose which option they prefer.

  20. Just as a note. For the design-time scenario, DLINQ already has a mapping file. The SqlMetal tool that generates source also reads and writes an XML file with mapping information in it. It’s just not currently used at runtime. Design-time tools are expected to use this file.

  21. Bart Fibrich says:

    My vote is for attributes, for a number of reasons mostly discussed in the comments already. One benefit that I don’t think has been discussed is that using attributes allows for user extensibility. Rather than having a fixed implementation bound to XML, if you allowed these attributes to be inherited and extended, then a heap of magic is possible.

    The most obvious is adding functionality for the attributes to get their mapping values from an XML file or any other data source for that matter. Which makes the XML or attributes discussion go away too.

  22. Hi,

    Nobody has mentioned the fact that with attributes I can’t persist value or reference types over which I have no control (e.g. Point, of which I’d like to store Point.X in a table column named PointX and Point.Y in a table column named PointY). Object members don’t always map to one column (e.g. DateTimeRange with a From and To property, where I’d like to store the From and To properties in separate columns), or they have a data type mismatch (e.g. I’d like to store an XmlDocument as a byte[] in an Image or VarBinary column (not using SQL2K5 here)), or they are stored inline (e.g. Customer.Address, where Address is a class of its own, but in the table I’d like to store Street, StreetNumber, ZipCode, State, Country in the Customer table).
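In the absence of direct support for multi-column value mappings, the usual attribute-based workaround is wrapper properties on the owning entity, one per column (a sketch; [Column] here stands for whatever DLinq’s column attribute ends up being, and Shape is a made-up entity):

```csharp
using System.Drawing;

public class Shape
{
    private Point location; // can't put mapping attributes on System.Drawing.Point itself

    // Expose each component of the value type as its own mapped member.
    [Column(Name = "PointX")]
    public int PointX
    {
        get { return location.X; }
        set { location = new Point(value, location.Y); }
    }

    [Column(Name = "PointY")]
    public int PointY
    {
        get { return location.Y; }
        set { location = new Point(location.X, value); }
    }
}
```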

  23. Matthew Hobbs says:

    The principle of "Separation of Concerns" suggests the virtue of external mapping files over source-code attributes. This, and also an in-code API to configure mappings (as I think Marius is suggesting), allow for many interesting design approaches, so supporting all these options would be a beautiful thing and should be easy enough to do. (Attribute support is already there and no doubt won’t go away.)

  24. Jonathan Davlin says:

    I don’t know how LINQ works under the hood, but why not use an XML file or config section that has the relevant information and merely overrides or creates attributes, the handling of which is done via custom type descriptors?

    Maybe the runtime is just too slow to depend on reflection?  Who cares, use shadow copy to gen the "strong" attribute based model at runtime.

    It’s easy enough to create the build steps for the XML to accommodate both scenarios, obviously, where you gen defaults at compile time and allow only certain changes at runtime.

    This brings up a language request.  I want .h files in C#, sort of.  Just like we have partial classes, I want partial properties so that I can put all my attributes in one place.  If I can put all my declarations in one place that allows attributes, it also allows me to very easily generate a lot of code that adheres to templatable patterns without restricting myself to something like creating a DSM package.

  25. Kasper Oldby says:


    I have written a small project about LINQ and DLinq. This might be an interesting read for some of you guys.

  26. vikas says:

    The application server that I work with interacts with different databases, providing a common set of services such as querying, caching and user-defined entity types. Essentially we have a set of common tables per instance of our product, and on top of this we provide the ability to define custom entity types which are translated/mapped into tables. At present we use our own object/relational translation layer, but we would like to use LINQ. Now, attribute-based mapping will work for the built-in types of our product, but we will have to use external mapping for the custom types, and perhaps dynamically generate C# classes first and then update the XML map. Question: how are changes to the external map file handled at runtime, i.e. once the DataContext is initialized from the external map?

Comments are closed.
