Approach – how to sync custom entities and datasets


The Problem Statement:


Web Service A communicates with Web Service B, which updates the database. Web Service B uses a DataSet to read and update the database, and relies on the DataSet to handle concurrency issues. Web Service A takes the DataSet from Web Service B, converts it to custom entities, and passes those to the Web Application for modification. In reverse, the Web Application sends the modified data as custom entities back to Web Service A, which converts the custom entities back into a DataSet and sends it to Web Service B. The problem is that Web Service B handles concurrency on the basis of the DataSet's original values; because the DataSet was converted into entities, the DataSet reconstructed for Web Service B contains only the new values. How do we solve the concurrency issue?
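To make the problem concrete, here is a minimal sketch (in Python, purely illustrative; the class and field names are hypothetical, not from the original .NET system) of why the original values matter: an optimistic-concurrency UPDATE matches the old values in its WHERE clause, and converting a tracked row to a plain entity drops exactly that snapshot.

```python
# Hypothetical sketch: a DataSet-style row keeps both the original and the
# current value of each column; an optimistic UPDATE uses the original
# values in its WHERE clause.

class TrackedRow:
    def __init__(self, values):
        self.original = dict(values)   # snapshot taken when the row was loaded
        self.current = dict(values)    # mutated as the application edits it

    def update_sql(self, table):
        """Build an optimistic-concurrency UPDATE: the WHERE clause matches
        the *original* values, so the update affects no rows if someone else
        changed the record in the meantime."""
        sets = ", ".join(f"{c} = :new_{c}" for c in self.current)
        where = " AND ".join(f"{c} = :old_{c}" for c in self.original)
        return f"UPDATE {table} SET {sets} WHERE {where}"

row = TrackedRow({"id": 1, "name": "Ann"})
row.current["name"] = "Bob"
print(row.update_sql("people"))

# Converting to a plain entity keeps only the current values; the original
# snapshot needed for the WHERE clause is lost, which is the problem above.
entity = dict(row.current)
```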


The solution necessarily involves either passing the DataSet through all the layers to the Web Application, or somehow persisting the old data when the DataSet cannot be passed through the layers because of business constraints.


As the title suggests, I will discuss the latter case, i.e. where the DataSet cannot be passed to the Web Application but must be converted into entities:


The first approach:



  • The Web Application makes a call to Web Service A
  • Web Service A calls Web Service B to get the DataSet
  • Web Service A transforms the DataSet into a custom entity, sends it back to the Web Application, and discards the DataSet
  • The Web Application persists the entity using ViewState/Session
  • The Web Application makes changes to the entity and then calls Web Service A with (oldEntity, newEntity)
  • Web Service A creates a DataSet, sets the old values from oldEntity, updates the DataSet with the new values from newEntity, and calls Web Service B with a DataSet carrying both old and new values
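The last step above can be sketched as follows (illustrative Python; `TrackedRow` and the entity fields are hypothetical stand-ins for the real DataSet and entities): the row is seeded with the old values and the new values are applied on top, so both versions travel to Web Service B.

```python
# Sketch of Web Service A rebuilding a row from the (oldEntity, newEntity)
# pair it received from the Web Application.

class TrackedRow:
    def __init__(self, original, current):
        self.original = dict(original)   # values as loaded from the database
        self.current = dict(current)     # values after the UI's edits

def rebuild_row(old_entity, new_entity):
    # Seed the row with the old values, then apply the new values on top,
    # so both versions are available for the concurrency check downstream.
    return TrackedRow(old_entity, new_entity)

old_entity = {"id": 1, "name": "Ann", "email": "ann@example.com"}
new_entity = {"id": 1, "name": "Ann", "email": "ann@new.example.com"}

row = rebuild_row(old_entity, new_entity)
changed = {c for c in row.current if row.current[c] != row.original[c]}
print(changed)  # {'email'}
```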

The second approach:



  • The Web Application makes a call to Web Service A
  • Web Service A calls Web Service B to get the DataSet
  • The DataSet is persisted at Web Service A using the Cache, keyed on the UserID and the requesting page, i.e. the Cache is used as a Session
  • Web Service A transforms the DataSet into a custom entity and sends it back to the Web Application
  • The Web Application makes changes to the entity and then calls Web Service A with the new values
  • Web Service A retrieves the DataSet based on the user and the requested page and updates it with the new values
  • The cache entry is invalidated, and Web Service B is called with a DataSet containing both old and new values
  • Cache expiry and cache scavenging are also implemented to free up the cache
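The "Cache as Session" idea above can be sketched like this (illustrative Python; the original would use the ASP.NET `Cache`, and all names here are made up for the example): entries are keyed by (user, requesting page), expire after a time-to-live, and are invalidated once the DataSet has been forwarded to Web Service B.

```python
import time

class DatasetCache:
    """Per-user, per-page cache with lazy expiry, standing in for using the
    ASP.NET Cache as a session store on Web Service A."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._store = {}          # (user_id, page) -> (dataset, stored_at)

    def put(self, user_id, page, dataset):
        self._store[(user_id, page)] = (dataset, time.monotonic())

    def get(self, user_id, page):
        entry = self._store.get((user_id, page))
        if entry is None:
            return None
        dataset, stored_at = entry
        if time.monotonic() - stored_at > self.ttl:
            del self._store[(user_id, page)]   # expired: scavenge lazily
            return None
        return dataset

    def invalidate(self, user_id, page):
        # Called once the dataset has been sent on to Web Service B.
        self._store.pop((user_id, page), None)

cache = DatasetCache(ttl_seconds=60)
cache.put("u42", "/orders/edit", {"row": {"id": 1}})
ds = cache.get("u42", "/orders/edit")
cache.invalidate("u42", "/orders/edit")
print(ds is not None, cache.get("u42", "/orders/edit"))  # True None
```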

Comparing the two approaches:


From a service-orientation perspective, the presentation tier should only consume data and persist it to the underlying store through these business services; it should not be cognizant of the idiosyncrasies of the underlying persistence mechanism. The first approach makes the presentation tier aware of the underlying concurrency handling (by passing two entities, old and new). It should pass one and only one instance of the entity to the business service to persist the changes made in the UI.


 


The first approach, cloning and sending both the old values and the altered values, is essentially what the DataSet does internally. The DataSet (whose DataView implements the IBindingList interface) keeps track of changes such as deletions and modifications by retaining the original row versions alongside the modified ones, so it can maintain both the existing and the changed data. While analyzing this for our problem, I realized we can tweak it slightly by passing the old and new entities as separate parameters, since we do not need a list of objects returned in one go; only one entity is modified per call. I therefore presented the approach in this simpler, tweaked form, gaining on performance and making it easier to implement. The approach can also be achieved without any tweaking by passing a single entity to the UI layer that follows the IBindingList-style cloning approach internally: the entity encapsulates its cloned original state and is returned to the service layer as a single parameter.
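The untweaked, single-parameter variant can be sketched as a self-tracking entity (illustrative Python, in the spirit of IBindingList/IEditableObject edit tracking; the class and field names are made up): the entity clones its own state once, just before the first edit, so one object carries both the original and the current values back to the service layer.

```python
class SelfTrackingEntity:
    """Entity that snapshots its original field values on first modification,
    so a single parameter carries both versions to the service layer."""

    def __init__(self, **fields):
        object.__setattr__(self, "_original", None)
        object.__setattr__(self, "_fields", dict(fields))

    def __setattr__(self, name, value):
        if self._original is None:              # clone once, before first edit
            object.__setattr__(self, "_original", dict(self._fields))
        self._fields[name] = value

    def __getattr__(self, name):
        try:
            return self._fields[name]
        except KeyError:
            raise AttributeError(name)

    def original_values(self):
        # If never edited, the current values are also the original ones.
        return self._original or dict(self._fields)

e = SelfTrackingEntity(id=1, name="Ann")
e.name = "Bob"
print(e.name, e.original_values()["name"])  # Bob Ann
```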



If you are planning to cache all the data on the service layer anyway, then caching the DataSet might be a good approach. If that is not the case, you must ask whether it is worth dedicating resources on the service layer and implementing a cache just to preserve the existing values received from the web services.


 


Even though the first approach has merit, why would we pay a performance penalty for instantiating a few more objects? Not to mention the cost, in the update method, of first constructing a DataSet from the old entity and then updating it with the new entity: a double update of the DataSet! Furthermore, from a maintainability standpoint, why not have a simpler model with less code rather than writing more?



The debate is still open; please feel free to continue it and share your views.
