Discussion about API changes necessary for POCO:

Evolving an API to support new requirements, like POCO, while maintaining backward compatibility is challenging.

The following design discussion from members of the Object Services team illustrates some of the issues and hard calls involved.

Have a read, and tell us what you think.

In particular, are we missing something, or overstating the importance of something? Let us know...

Anyway over to Diego and Mirek...

POCO API Discussion: Snapshot State Management

What is in a snapshot?

In Entity Framework v1 there is a single way for the state manager to learn about changes in entity instances: the change tracking mechanism is set in such a way that the entity instances themselves notify the state manager of any property change.

This works seamlessly if you use default code-generated classes, and it is also part of the IPOCO contract for developers willing to create their own entity class hierarchies.

For version 2, we are currently working on a snapshot-based change tracking mechanism that removes the requirement from classes to send notifications, and enables us to provide full POCO support.

The basic idea with snapshots is that when you start tracking an entity, a copy of its scalar values is made so that at a later point – typically but not always at the moment of saving changes into the database - you can detect whether anything has changed and needs to be persisted.
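The mechanism can be sketched in a few lines of plain C#. The Snapshot class below is purely illustrative, not part of the Entity Framework API; the real state manager uses metadata rather than delegates, but the principle is the same:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch only: copy an entity's scalar values at tracking
// time, then diff the copy against the live object later.
public class Snapshot<T>
{
    private readonly Dictionary<string, Func<T, object>> _getters;
    private readonly Dictionary<string, object> _originalValues = new Dictionary<string, object>();

    public Snapshot(T entity, Dictionary<string, Func<T, object>> getters)
    {
        _getters = getters;
        foreach (var getter in getters)
        {
            // Copy of the scalar values, made when tracking starts.
            _originalValues[getter.Key] = getter.Value(entity);
        }
    }

    // Property-by-property comparison against the stored copy.
    public IEnumerable<string> GetChangedProperties(T entity)
    {
        return _getters
            .Where(g => !Equals(g.Value(entity), _originalValues[g.Key]))
            .Select(g => g.Key);
    }
}
```

For instance, a snapshot taken when a Category is attached would record its current Name; after assigning a new Name, GetChangedProperties would report "Name" as needing to be persisted.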

The challenge

When we created the v1 API, we made a few assumptions that were totally safe with notification-based change tracking but don’t completely hold in a snapshot change tracking world.

We now need to choose the right set of adjustments for the API for it to gracefully adapt to the new scenarios we want to support.

Mainline scenario: SaveChanges() Method

In notification-based change tracking, by the time SaveChanges is invoked, the required entity state information is ready to use.

With snapshot, though, a property-by-property comparison needs to be computed for each tracked entity just to know whether it is unchanged or modified.

Once the snapshot comparison has taken place, SaveChanges can proceed normally.

In fact, assuming the process is triggered implicitly on every call to SaveChanges, a typical unit of work in which the program queries and attaches entities, then modifies and adds new entities, and finally persists the changes to the database, works unmodified with POCO classes:

Category category = context.Categories.First();

category.Name = "some new name"; // modify existing entity

Category newCategory = new Category(); // create a new entity

newCategory.ID = 2;

newCategory.Name = "new category";

context.AddObject("Categories", newCategory); // add a new entity

context.SaveChanges(); // detects all changes before saving

Things get more complicated when a user wants to use lower level APIs that deal with state management.

State Manager Public API

The ObjectStateManager and ObjectStateEntry classes make up the API you need to deal with if you want to either get input from, or customize the behavior of, Entity Framework’s state management in your own code.

Typically, you use these APIs if you want to:

  • Query for objects that are already loaded into memory
  • Manipulate the state of tracked entities
  • Validate state transitions or data just before persisting to the database
  • Etc.

As the name implies, ObjectStateManager is Entity Framework’s state manager object, which maintains state and original values for each tracked entity and performs identity management.

ObjectStateEntries represent entities and relationships tracked in the state manager. ObjectStateEntry functions as a wrapper around each tracked entity or relationship and allows you to get its current state (Unchanged, Modified, Added, Deleted and Detached) as well as the current and original values of its properties.

Needless to say, the primary client for these APIs is the Entity Framework itself.

ObjectStateEntry.State Property

The fundamental issue with snapshot is exemplified by this property.

With notification-based change tracking, the value for the new state is computed on the fly on each notification and saved to an internal field. Getting the state later only involves reading that internal field.

With snapshot, the state manager no longer gets notifications, so the actual state at any time depends on the state when the entity began being tracked, on whatever state transitions have happened since, and on the current contents of the object.

Example: Use ObjectStateEntry to check the current state of the object.

Category category = context.Categories.First();

category.Name = "some new name"; // modify existing entity

ObjectStateEntry entry = context.ObjectStateManager.GetObjectStateEntry(category);

Console.WriteLine("State of the object: " + entry.State);

Question #1: What are the interesting scenarios for using the state management API in POCO scenarios?

Proposed solutions

In the above example there are two possible behaviors, and it is not obvious to us which one is better:

Alternative 1: Public ObjectStateManager.DetectChanges() Method

In the first alternative, computation of the snapshot comparison for the whole ObjectStateManager is deferred until a new ObjectStateManager method called DetectChanges is invoked. DetectChanges would iterate through all entities tracked by the state manager and would detect changes in each entity's scalar values, references and collections using a snapshot comparison, in order to compute the actual state for each ObjectStateEntry.

In the example below, the first time ObjectStateEntry.State is accessed, it returns EntityState.Unchanged. In order to get the current state of the "category" entity, we would need to invoke DetectChanges first:

Category category = context.Categories.First();

category.Name = "some new name"; // modify existing entity

ObjectStateEntry entry = context.ObjectStateManager.GetObjectStateEntry(category);

Console.WriteLine("State of the object: " + entry.State); // Displays "Unchanged"

context.ObjectStateManager.DetectChanges(); // compute actual states from the snapshots

Console.WriteLine("State of the object: " + entry.State); // Displays "Modified"

DetectChanges would be implicitly invoked from within ObjectContext.SaveChanges.
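Conceptually, DetectChanges would boil down to a loop over the tracked entries. The sketch below is just our reading of the proposal: SnapshotDiffersFrom is an invented helper name, while GetObjectStateEntries, Entity and SetModified exist in the v1 API.

```csharp
// Sketch of alternative #1, not the actual implementation.
public void DetectChanges()
{
    foreach (ObjectStateEntry entry in
        this.GetObjectStateEntries(EntityState.Unchanged))
    {
        // Compare the entity's current scalar values, references and
        // collections against the snapshot taken when tracking started.
        if (entry.SnapshotDiffersFrom(entry.Entity)) // hypothetical helper
        {
            entry.SetModified(); // promote the entry to Modified
        }
    }
}
```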


Pros:

· User knows when detection of changes is performed

· DetectChanges is a method, so users would expect it to throw exceptions if some constraint is violated

· This alternative requires minimal changes to the current Entity Framework API implementation


Cons:

· Since DetectChanges iterates through all the ObjectStateEntries, it is a potentially expensive method

· User has to remember to explicitly call DetectChanges() before using several methods/properties, otherwise their results will be inaccurate:

o ObjectStateEntry.State

o ObjectStateEntry.GetModifiedProperties()

o ObjectStateManager.GetObjectStateEntries()

· This alternative implies adding a new method to ObjectStateManager

Alternative 2: Private ObjectStateEntry.DetectChanges() Method

In the second alternative, there is no public DetectChanges method. Instead, the computation of the current state of an individual entity or relationship is deferred until the state of the entry is accessed.
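A sketch of how the State getter might look in this alternative; the _state field, the UsesSnapshotTracking flag and the DetectChangesInEntry helper are all assumed names, not actual members:

```csharp
// Sketch of alternative #2: the snapshot comparison is hidden inside
// the property getter, so callers always see an up-to-date value.
public EntityState State
{
    get
    {
        if (_state == EntityState.Unchanged && UsesSnapshotTracking)
        {
            // Recompute this entry's state from its snapshot on demand.
            DetectChangesInEntry();
        }
        return _state;
    }
}
```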


Pros:

· User doesn't have to remember to explicitly call DetectChanges to get other APIs to work correctly

· Existing API works as expected in positive cases regardless of notification-based or snapshot tracking

· No new public API is added


Cons:

· The following methods now require additional processing to return accurate results in snapshot:

o ObjectStateEntry.State

o ObjectStateEntry.GetModifiedProperties()

o ObjectStateManager.GetObjectStateEntries()

· Existing API works differently in negative cases between notification-based and snapshot tracking:

o some of the existing methods would start throwing exceptions

o it would require introducing a new entry state – Invalid (see below for details)

Question #2: What API pattern is better? Having an explicit method to compute the current state based on the snapshot comparisons, or having the state computed automatically when it is accessed?

How this affects other APIs

While the State property exemplifies the issue, there are other APIs that would have different behaviors with the two proposed solutions.

ObjectStateEntry.GetModifiedProperties() Method

GetModifiedProperties returns the names of the properties that have been modified in an entity. Similar to the State property, with notification-based change tracking, an internal structure is modified on the fly on each notification. Producing the list later on only involves iterating through that structure.

With snapshot, the state manager no longer gets notifications, and at any given point in time, the actual list of modified properties really depends on a comparison between the original value and the current value of each property.

Therefore, for alternative #1, this API will potentially return wrong results unless it is invoked immediately after DetectChanges. For alternative #2, the behavior would always be correct.
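A usage sketch contrasting the two alternatives for this method:

```csharp
Category category = context.Categories.First();
category.Name = "some new name"; // modify existing entity

ObjectStateEntry entry =
    context.ObjectStateManager.GetObjectStateEntry(category);

// Alternative #1: without this call the list below would be empty.
// context.ObjectStateManager.DetectChanges();

// Alternative #2: accurate without any extra call.
foreach (string propertyName in entry.GetModifiedProperties())
{
    Console.WriteLine("Modified: " + propertyName);
}
```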


ObjectStateManager.GetObjectStateEntries() Method

This is a case in which an implementation detail that was a good idea for notification-based change tracking stops offering performance benefits in snapshot. Internally, ObjectStateManager stores ObjectStateEntries in separate dictionaries depending on their state. But in snapshot, any unchanged or modified entity can become deleted because of a referential integrity constraint, and any unchanged entity can become modified.

In alternative #1, DetectChanges would iterate through the whole contents of the ObjectStateManager, and thus it would update the internal dictionaries. Once this is done, it becomes safe to do in-memory queries using GetObjectStateEntries the same way it is done today.

In alternative #2, GetObjectStateEntries would need to look in the unchanged, modified and deleted storage when asked for deleted entries, and in the unchanged and modified storage when asked for modified entries.
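The widened lookup in alternative #2 might look roughly like this; _unchangedEntries and _modifiedEntries are assumptions about the internal dictionaries, not actual fields:

```csharp
// Sketch of alternative #2 for GetObjectStateEntries(EntityState.Modified).
public IEnumerable<ObjectStateEntry> GetModifiedEntries()
{
    // A modified entity may still be filed under "unchanged", so both
    // buckets must be scanned and each entry's actual state recomputed
    // from its snapshot (reading State triggers the comparison).
    foreach (ObjectStateEntry entry in _unchangedEntries.Concat(_modifiedEntries))
    {
        if (entry.State == EntityState.Modified)
        {
            yield return entry;
        }
    }
}
```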

Querying with MergeOption.PreserveChanges

In Entity Framework, MergeOption is a setting that changes the behavior of a query with regard to its effects on the state manager. Of all the possibilities, PreserveChanges requires an entity to be in the Unchanged state in order to overwrite it with values coming from the database. In order for PreserveChanges to work correctly, accurate information on the actual state of entities is needed.

Therefore, with alternative #1, querying with PreserveChanges will not behave correctly unless it is done immediately after invoking DetectChanges. For alternative #2, querying with PreserveChanges would behave correctly at any time.
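For example, under alternative #1 a PreserveChanges query would need to be preceded by a DetectChanges call (a usage sketch):

```csharp
Category category = context.Categories.First();
category.Name = "locally edited name"; // local, unsaved change

// Alternative #1 only: make sure the entity is seen as Modified,
// so the query below does not overwrite the local edit.
context.ObjectStateManager.DetectChanges();

ObjectQuery<Category> query = context.Categories;
query.MergeOption = MergeOption.PreserveChanges;

// Unchanged entities are refreshed from the database; the modified
// Name above is preserved.
List<Category> categories = query.ToList();
```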

Referential Integrity constraints

When there is a referential integrity constraint defined in the model, a dependent entity may become deleted if the principal entity or the relationship with the principal entity becomes deleted.

For alternative #1, DetectChanges would also trigger deletes to propagate downwards through all referential integrity constraints.

For alternative #2, finding out the current state of an entity that is the dependent in a referential integrity constraint would require traversing the graph upwards to find out whether either the principal entity or the relationship has been deleted.
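A sketch of that upward traversal; GetRelationshipEntries and GetPrincipalEntry are invented helper names, not actual API:

```csharp
// Sketch of alternative #2 for a dependent in a referential integrity
// constraint.
private bool IsDeletedThroughConstraint(ObjectStateEntry dependent)
{
    foreach (ObjectStateEntry relationship in GetRelationshipEntries(dependent))
    {
        // If the relationship itself, or the principal at its other
        // end, is deleted, the dependent must be reported as Deleted too.
        if (relationship.State == EntityState.Deleted ||
            GetPrincipalEntry(relationship).State == EntityState.Deleted)
        {
            return true;
        }
    }
    return false;
}
```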

Mixed Mode Change Tracking

At the same time we are working on snapshot change tracking, we are also working on another feature, called dynamic proxies. This feature consists of Entity Framework automatically creating derived classes of POCOs that override virtual properties. Overriding properties enables us to inject automatic lazy loading on property getters, and notifications for notification-based change tracking in property setters. This introduces a subtle scenario: when using proxies, it is possible that:

a. Not all properties in the class are declared virtual. The remaining properties still need to be processed using snapshot.

b. Sometimes, non-proxy instances of POCOs and proxy instances of the same type have to coexist in the same ObjectStateManager. For the former, the state manager will have to use snapshot tracking; for the latter, a combination of snapshot and notifications.

All in all, it becomes clear that the actual state of an entity does not entirely depend on the value of the internal state field, nor on the snapshot comparison, but on a combination of both.
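For example, a proxy can only override the virtual members of a POCO like the following (an illustrative class, not from the post):

```csharp
// Mixed mode example: a dynamic proxy derived from Product can inject
// notifications into ID and Name, but Rating is not virtual and can
// only be tracked by snapshot comparison.
public class Product
{
    public virtual int ID { get; set; }      // notification-based via proxy
    public virtual string Name { get; set; } // notification-based via proxy
    public int Rating { get; set; }          // snapshot only
}
```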


ObjectStateEntry.SetModified() Method

As with mixed mode change tracking, SetModified() requires a combination of the internal state and the snapshot comparison to return valid results.

Handling of invalid states

When working in notification-based change tracking, Entity Framework throws exceptions as soon as it learns that entity key values have been modified. With the default code-generated classes, the exception prevents the change from being accepted.

For alternative #1, DetectChanges can throw exceptions if some model constraint (e.g. key immutability) is violated. At that point, it is too late to prevent the value from changing.


Category category = context.Categories.First();

category.ID = 123;  // Modify a key property. This would throw if category wasn't a POCO class.

context.ObjectStateManager.DetectChanges(); // Throws because key property change was detected.

For alternative #2, reading the state of modified properties from an entity with modified keys could throw an exception:


Category category = context.Categories.First();

category.ID = 123;  // Modify a key property. This would throw if category wasn't a POCO class.

ObjectStateEntry entry = context.ObjectStateManager.GetObjectStateEntry(category);

Console.WriteLine("State of the object: " + entry.State);

// Throws because key property change was detected.

Getting an exception thrown here would be unexpected. An alternative design is to define a new EntityState value that indicates that an entity is Invalid. This new state would account for the fact that POCO classes per se do not enforce immutable keys.

Since EntityState is a flag enum, Invalid could be potentially combined with other states.

SaveChanges would still need to throw an exception if any entity in the state manager is invalid.

It would be possible to query the state manager for entities in the Invalid state using the GetObjectStateEntries method.
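Under that design, code could probe for invalid entries before saving. A usage sketch; note that EntityState.Invalid does not exist today, it is the alternative design under discussion:

```csharp
Category category = context.Categories.First();
category.ID = 123; // illegal key change on a POCO class

// EntityState is a flags enum, so Invalid could be combined with the
// state the entry would otherwise have, e.g. "Modified, Invalid".
foreach (ObjectStateEntry entry in
    context.ObjectStateManager.GetObjectStateEntries(EntityState.Invalid))
{
    Console.WriteLine("Invalid entry: " + entry.State);
}

context.SaveChanges(); // would still throw while any entry is Invalid
```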

Question #3: Is it better to have an Invalid state for entries or should the state manager just throw exceptions immediately every time it finds a change on a key?

Our questions:

1. What are the interesting scenarios for using the state management API in POCO scenarios?

2. What API pattern is better? Having an explicit method to compute the current state based on the snapshot comparisons, or having the state computed automatically when it is accessed?

3. Is it better to have an Invalid state for entries or should the state manager just throw exceptions immediately every time it finds a change on a key?


We really want to hear your thoughts on the above questions.

Alex James
Program Manager,
Entity Framework Team

This post is part of the transparent design exercise in the Entity Framework Team. To understand how it works and how your feedback will be used please look at this post.

Comments (20)
  1. Kosher says:

    I certainly hope this isn’t replacing ADO.NET.  I do appreciate the rewrite and there are some drawbacks to the existing ADO.NET paradigm but I think it has evolved quite nicely with the introduction of LINQ to DATASET.

    I hope you realize this is duplicating many of the ideas and features that already exist in DataSets.

  2. simmdan says:

    The Entity Framework will not replace ADO.NET–it builds upon ADO.NET, and we will support both.  There will continue to be reasons for using ADO.NET.  It might be that this is just the style of coding you prefer or it might be that you want to use the EF but occassionally need to go around the EF and talk more directly to the DB or whatever.

    The EF has its place and so does ADO.NET (including DataSets).

    – Danny

  3. jstangroome says:

    Regarding Q3, a big pain with DataSets is that invalid data cannot be entered. Sure, invalid data should not be able to be committed to the database but the in-memory object should be able to be in an inconsistent state until the app/user corrects it.

    Therefore EF POCOs should allow invalid state without throwing exceptions.



  4. Entity Classes & Architecture Patterns Part of the Entity Framework FAQ . 2. Architecture and Patterns


  6. Colin Jack says:

    Q1 – UOW and validation, whilst UOW is great it would be good if we could perform validation before you commit to ensure that totally invalid information is not persisted.  This should be done in a nice clean way and should allow us to do the validation any way we choose.

    Q2 – Make it implicit so alternative 2. People are used to the UOW pattern so I think this approach is fine.

    Q3 – Just throw the exception at the last minute, this isn’t a major issue so in my view so don’t worry about it. Reassigning the ID of a mapped class is a coding error and if we don’t catch it in tests an exception at run-time is totally fine. EntityState is not an attractive option at all.

  7. Jarod Ferguson says:

    I’ve had some issues with my comments not posting when logged in. Finally got a chance to type this again…



    Tracking state to POCO entities/graphs while detached from the context

    -Scenarios: WCF/Web Services, SilverLight, ASP.NET presentation tier (non EDS)



    Though DetectChanges is not ‘that’ intrusive (dev should be aware) it seems the default behavior should be ‘built-in’ as in alternative 2. I guess it would all come down to perf. Though a con of alternative 1 is potentially iterating through a large number of items in the state manager dictionaries, it seems as though the same issue might exist for alternative 2 ( as in saving changes). – Could it be possible to support both at some point by configuration?


    #3 –

    Invalid state seems as though it could have advantages being combined with other states. Again, it would be cool at some point to support both, defaulting to throwing exceptions by default with the ability to enable Invalid States.


    I think the best scenario is the Dynamic Proxy with notification based tracking. Would code enhancement (IL modification/injection) be an option vs. dynamic proxy to alleviate the need for virtual properties?

    If mixed mode was used I would prefer to call DetectChanges. It seems like it might be advantageous to have some properties state explicitly controlled, while others are tracking automatically with notification based. This would allow a dev to update parts of an entity. (Cant think of any good reason for this yet, but I’m sure I could find something)

  8. Colin Jack says:

    "I’ve had some issues with my comments not posting when logged in. Finally got a chance to type this again…"

    Yeah I’ve had this every time I try to post on the EF blog, tried 4 times on one post and nothing appeared. Very frustrating.

  9. macsgold says:

    Add the ability to require a POCO entity to be explicitly ‘registered’ for snapshot tracking (including its relationships?).  E.g. POCOs may not all come from the context anyway.  This obviates the performance problem of DetectChanges having to iterate through too much, because you only have to iterate through what is registered (only a subset of entities may need to be modified).

    Snapshot (or notification) state management must support true ‘savepoints’ to model nested units of work (e.g. popup cancelable modal edit dialogs in a UI, or any type of undo to a pre-specified savepoint).

    Following on from the above, state managements need to support TransactionScope – i.e. implicit savepoints and rollbacks to them on a TransactionScope rollback.  Ideally to be able to participate in distributed transations.

    State managements need to be extensible (interface based), with events to hook or methods to override, and also be fully replaceable.

    DetachChanges() should be public and not throw exceptions for validation changes (separate structure or method for querying that).  EDM constraints are only a subset of an overall validation model so don’t have exceptions dictate an implementation approach (i.e. the output of DetectChanges() is valuable with or without EDM validation errors). (Jason says this earlier.)

    Making and restoring (overwriting an entity) of snapshots of entities should be able to use a custom ‘serialization’ mechanism if desired, e.g. to support customization of the ‘restore’ process (talking about an undo or ‘rollback’ here) for any custom code that needs to be run (e.g. (re)setting any transient properties).

    POCO scenario:  ‘context-free’, save-pointable, TransactionScope-aware, efficient querying of what is modified.


    -Mat Hobbs

  10. Eugene Burd says:

    1.  I have had times where it was not enough to know that an object was changed, but I actually needed to know what changed.  The fact that the object changed, means that it needs to be persisted; however, if I have another system that only cares about certain changes, i wouldn’t want to notify it of these changes unless it was necessary.

    2.  Interesting dilemma.  From a pure academic point of view, I would say that since ObjectStateEntry.State is a property, best design practices would say it shouldn’t do any heavy lifting (should be deterministic).  However, in this case, I think the best decision is to break this rule.  I just think its dangerous to return wrong values to a developer who is ill aware of the new changes to the API.  This is almost worse than breaking the interface, it preserves it, but alters the functionality.  So alternative #2 seems better to me.  One potential way to speed things up is to take a checksum of the scalar fields of an entity and store it along with the state.  That way when subsequent calls are made to properties / methods that require the recomputation of state, it could be done faster, since if the checksum was the same, you would know that there were no changes from the previous computation.  Just an idea.  

    #3 – I don’t have strong feelings on this one either way you go.  Since its too late to correct anything, it seems like throwing an error would be a good thing.

  11. Alexnaldo Santos says:

    Q1, Q2 and Q3

    To manager business I need sinals to make decisions at any time, no matter scenarios.

    If I’m working with business objects (tables) and it have components (columns)

    I need to know the states of both at same time and together, no matter the Tier(WCF,ASP,WinForm, etc.).

    IMHO, IPOCO is the best option but without to implement many

    interfaces. I don’t like the idea that another object(ObjectStateManager)

    control the states of objects.No problem to "identity management" but wrong for "State management".

    If I’m bad I say to my doctor "I’m bad.."

    and not my doctor say to me "you are bad". The state of me and any part of my

    body can’t live out me. I think like it to object too.

    So, I would like to see in my objects :

    myObject.State   ( readonly, already exists )

    myObject.Undo()  ( undo the changes )

    myObject.Columns["MyColumn1"].State     ( readonly, the state on my column )

    myObject.Columns["MyColumn1"].OldValue  ( readonly, the old value of my column )

    myObject.Columns["MyColumn1"].Value     ( readwrite, the current value )

    myObject.Columns["MyColumn1"].Undo()    ( undo the changes )

    myObject.MyColumn1                      ( readwrite, the current value )

    I’m using this approach with "Linq to Sql" and work very well.

    Now, I would like to "migrate" my objects to EF.

    My table will be like it :



    public class MyIPocoClass : IEntity
    {
        public MyIPocoClass()
        {
        }

        private EntityState _state;

        public EntityState State
        {
            get { return _state; }
        }

        private EntityColumns _columns;

        public EntityColumns Columns
        {
            get
            {
                if (_columns == null)
                {
                    _columns = new EntityColumns();
                    _columns.PropertyChanging += new PropertyChangingEventHandler(OnPropertyChanging);
                    _columns.PropertyChanged += new PropertyChangedEventHandler(OnPropertyChanged);
                }
                return _columns;
            }
        }

        public event PropertyChangingEventHandler PropertyChanging;

        public event PropertyChangedEventHandler PropertyChanged;

        protected virtual void OnPropertyChanging(object sender, PropertyChangingEventArgs e)
        {
            if (PropertyChanging != null)
            {
                PropertyChanging(sender, e);
            }
        }

        protected virtual void OnPropertyChanged(object sender, PropertyChangedEventArgs e)
        {
            if (PropertyChanged != null)
            {
                PropertyChanged(sender, e);
            }
        }

        public string MyColumn1
        {
            get { return Columns.GetValue("MyColumn1") as string; }
            set { Columns.SetValue("MyColumn1", value); }
        }
    }


  12. jrista says:

    Q1:  WCF & Disconnected Entities.

    In v1 of EF, its a bit difficult to effectively use entities on the client in an SOA environment. Part of this is due to how the available tools generate WCF proxies…either it generates all types for you, or you can share the exact same types in the exact same assembly as the service. The problem with EF entities is they come with an ugly parent class chain from the System.Data.Entity assembly, and all that stuff ends up cluttering your Presentation Layer. POCO can solve this problem, since they are just plain old classes, but you have the problem of serializing a modified POCO graph back from the client to the service and persisting those modifications to your data store.

    There are some options here, like keeping the ObjectContext around until the UOW is complete…but that means keeping it around accross multiple calls to the service, which means plumbing code. You could wrap up all object graphs in a utility class that tracks changes for you on the client, but then your binding both your service and your clients to a utility class that gets in the way of clean, simple use of your entities. Some kind of mechanism is needed to keep track of state accross object context instantiations, and it should be transparent so that POCO entities in an SOA environment can be used as simply as possible without requiring loads of plumbing and support code to keep track of changes and apply those changes to the ObjectStateManager.

    Q2: Neither is better, both are neccesary.

    Your developing a base platform upon which thousands and eventually millions of applications will be built on top of. In my experience over the last 6-8 months, the greatest problem in EF is its seemingly endless LIMITATIONS. In its current form, EF has such a nitch place because it limits and restricts to the nth degree. For EF to become truely viable as an O/R Mapping platform that can provide true value to the very wide variety of applications written on and for the windows platform, you need to eliminate limitations and restrictions on what is possible.

    So, in regards to question 2…don’t pick one…use both. Keep the framework as flexible as possible and support as many scenarios as possible. There is no one single way to write a program, and while there are few proper ways and many improper ways to implement a design, there are still a few proper ways. Support both state management methodologies, and you’ll keep more people happy and make EF viable for the broadest range of applications.

    Q3: Don’t Restrict and Limit.

    Just as in Q2, your building a base framework upon which applications will be built. Don’t impose MORE limitations on the consumers of your framework…eliminate them. There are going to be times when temporarily having Invalid State for a property is going to be neccesary to support what a developer needs. By immediately throwing an exception, you completely eliminate that option, and force a developer to invest more time finding a way around your imposed limitation.

    Allow invalid state…only throw an exception when its absolutely neccesary (i.e. persisting state to the data store.)

    I think EF has immense potential, but my overall experience, and particularly my experience since the release of v1, has been dismal. I have effectively given up on implementing EF in the near future, possibly for the forseeable future, because of the boundless limitations that I am continually running into. Most of the practical, proven, and effective methodologies for software development, such as DDD (Domain Driven Design), offer wonderful guidelines to developing stable, maintainable, and extensible software…yet EF does not support these common architectures and scenarios except in very limited capacity. (i.e. EF is still highly data-driven and limited in its O/R Mapping capabilities, and does not allow the DOMAIN to drive design decisions…it enforces schema-bound restrictions). That should change with EF v2.

  13. Shlomi says:

    Talking about snapshots – my wish for your next release is to see a rollback mechanism which supports TrasactionScope. if the scope fails – the context returns to what it was before the transaction.

  14. Gatecrasher says:

    I created, like many others, entity classes derived from the database tables/Sp’s.  I also implemented the "State" field (modified, deleted etc) and latched that to events used to implememt the databinding interfaces.

    Like someone else has said knowing it’s changed is sometimes not enough, you need to know what has changed.  You could argue that if that is the case use a DataSet which implements the "before/after" model. This of course has the drawback in that it doubles that amount of data being passed around.

    The other area that needs consideration is DataBinding at the client.  Various interfaces allow data binding to "undo" changes (start/endedit) and synchronise code/screen updates through their respective managers.  Databinding works very well with datasets but not so well with Linq to SQL (I understand) so maybe some lessons there.

    My problem was with databinding and serialization. To support Databinding I implemented the required interface and the notification events. Tied into those properties I modifed the "state" at the same time.

    Of course the properties that raised these events are called by the serializer.  So during a web service call "everything" becomes "modified".

    Being able to pass "work in progress" across serialization boundaries and have the state maintained is vital.

    Ideally the properties need to know when they are being called by the serializer and in .NET2 the XML serializer is not very helpful, there are no "before" and "after" events to latch onto.  The Binary serializer and WCF both implement these now I think.

    My "cludge" was to event a "first" and "last" property to set and clear a flag and fix the "field order"!! These properties would only ever get called when serializing (I hope) so could enable and disable the change events.  With WCF is can de-gludge my code I think.

    One thing I considered was using a Property/Hashtable type structure rather than the usual Property/Private field.  Optionally using 2 hashtables to store before and after.  The 2nd hashtable storing only fields that have changed and storing the original value.  Of course you need a serializable Dictionary<> object which I wrote too.

    I like the hastable approach, you only store twice at the field level when the field changes.  Any records in the 2nd table means "modified".  No changes means no XML when serialized unlike a DataSet which always stores everything twice so doubles the XML created.

    Where I work data load across the network is a "big thing", hence our interest in WCF and Binary serialization!!

    Breaking "hassle free" databinding is a reason we haven’t touched LINQ to SQL.

    My story is simply pointing out that serialization, data load size, network performance and support for client Data Binding must all be factored into any decision.

  15. Gatecrasher says:

    I forgot to say that Dictionary<> objects do serialize the "key", which creates big XML too.  One reason I looked at it and then didn’t bother!

  16. Greg Young says:

    @Alexnaldo Santos:

    What you describe there seems to get rid of all the benefits of having typed objects in the first place (like type safety).

    Also, why would your "POCO" objects know how to undo themselves? This sounds like a responsibility that should live elsewhere. Generally, if I am doing this type of action, I explicitly model my state transitions as commands, which allows this to happen easily.

    On the original topic, one of the most interesting bits of this for me would be in detaching an object from the current context and how that story would work.

    I think macsgold has a good concept with allowing explicit registration for tracking, but let’s stop calling these things entities and instead allow the registration of an [Aggregate Root] for the tracking of changes. Obviously, when objects are viewed as entities with bi-directional relationships, the work involved in traversal can be quite expensive. The [Aggregate Root] pattern and a preference for uni-directional relationships (object paradigm vs relational paradigm) make the search space significantly smaller.

    A side note: dynamic proxies are scary for many users, and one can quickly reach the "too much magic" point. In an ideal world the dynamic proxying would be transparent, but it is unfortunately far from it at this point. Too many people expect to be able to do things like if (obj.GetType() == typeof(Customer)) (terrible code though that is), and dynamic proxies will seriously confuse that type of developer.

    EF is running into an issue many others have run into. A run-time weave is what is really wanted, but dynamic proxies, with their limitations and confusing behavior, are a less than transparent substitute. Many of us have been asking for a way to run-time instrument IL (other than the profiler API :)) to be built into the CLR. This would allow the run-time weave behavior being sought while maintaining a much better level of transparency for the developer.

  17. Harvey says:

    In a DDD project, there will usually be a service object responsible for persisting the domain object to the data store, such as UserService.SaveUser(user). So why not just let the consumer of EF tell EF to persist the entity: no tracking, no comparison, no notification. We just need something like _context.SaveObject(object), _context.DeleteObject(object) or _context.InsertObject(object). It is easy to implement and easy to use.
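    A minimal shape for the explicit-persistence contract being suggested might look like the following. This is a hypothetical sketch, not an actual EF API; the interface, User, and UserService names are illustrative:

```csharp
// Hypothetical context contract: the caller states the intended operation,
// so no change tracking, snapshot comparison, or notification is needed.
public interface IExplicitPersistenceContext
{
    void InsertObject(object entity);  // issue an INSERT for all properties
    void SaveObject(object entity);    // issue an UPDATE for all properties
    void DeleteObject(object entity);  // issue a DELETE by key
}

public class User
{
    public string Name;
}

// A DDD-style service decides which call to make; the context stays dumb.
public class UserService
{
    private readonly IExplicitPersistenceContext _context;

    public UserService(IExplicitPersistenceContext context)
    {
        _context = context;
    }

    public void SaveUser(User user)
    {
        _context.SaveObject(user);
    }
}
```

    The trade-off, of course, is that every update writes all properties, since without tracking the context cannot know which ones changed.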

  18. Simon Segal says:

    I would really love to see the equivalent of the "Fetching Strategies" that appear in NHibernate. In following a DDD approach, POCO along with fetching strategies would allow me to inject and specify what my loading intentions are, whether to eager fetch or lazy load. I could today perhaps build some kind of interpreter that allows my Repository to build a chain of .Includes for eager fetching, but lazy loading is currently extremely difficult to accommodate in this respect. Essentially, what I would love is the ability to create single fetching strategies that embody my loading intentions, either eager or lazy. For example, in LINQ To SQL I was able to achieve this with eager loading in this fashion:

    public interface IFetchingStrategy
    {
        /// <summary>
        /// LINQ To SQL DataLoadOptions
        /// to use on the given DataContext.
        /// </summary>
        DataLoadOptions LoadOptions { get; }
    }

    /// <summary>
    /// Marker Interface
    /// </summary>
    /// <typeparam name="TRole">
    /// Specification of a role for
    /// dynamic runtime type lookup.
    /// </typeparam>
    public interface IFetchingStrategy<TRole> : IFetchingStrategy { }

    public class CustomerLoyaltyDiscountFetchingStrategy : IFetchingStrategy<ICustomerLoyaltyDiscount>
    {
        private readonly DataLoadOptions _loadOptions;

        #region IFetchingStrategy<ICustomerLoyaltyDiscount> Members

        /// <summary>
        /// The DataLoadOptions used for
        /// eager loading.
        /// </summary>
        public DataLoadOptions LoadOptions
        {
            get { return _loadOptions; }
        }

        #endregion

        #region Constructor

        /// <summary>
        /// The constructor which sets the fetching strategy's
        /// eager loading options.
        /// </summary>
        public CustomerLoyaltyDiscountFetchingStrategy()
        {
            _loadOptions = new DataLoadOptions();
            _loadOptions.LoadWith<Customer>(c => c.Orders);
            _loadOptions.LoadWith<Order>(o => o.OrderLines);
        }

        #endregion
    }


  19. Tanveer Badar says:

    @Q1 and Q2: I think approach 1, always giving correct results, is the right way to go forward with snapshots. It preserves existing code and provides correct results; that seems like a win-win to me.

    @Q3: Throwing an exception just limits the user even more. There won’t be any choice except wrapping the whole thing in try { } catch (blah b) { }.

    Another approach would be to fail gracefully and populate an Errors collection instead.

Comments are closed.
