Splitting Typed DataSets from TableAdapters – Sharing Validation Code across the tiers


You’re building a Smart Client application.  You want this Smart Client to be smart enough to provide validation feedback to users as they enter information.  However, to make the application scale to thousands of users you can’t open connections directly to the database from the client, because your server can’t handle that many connections.  You probably don’t want the data access information on the client anyway, so you turn to Web Services to abstract away the intimate knowledge of your database.  This way the client only has the Web Service address and authentication information.  Since web services can be consumed by many different types of clients, you can’t trust that all the validation rules have been enforced.  So, what do you do? 



  • You could put all the validation rules in the database, but that gives your database that much more processing to do, and your Smart Client won’t be so smart.  It will still have to make a round trip to the database each time a user tabs off a control. 

  • You could use the XSD schema validation extensions, but they aren’t all that extensible and they don’t deal with dynamic values.

  • You could write the validation code twice: once for the server, then again for the client.  But that wouldn’t be very maintainable. 

The likely solution is to simply write the validation logic once and share it on the client and within the web service.  But wait, that’s type sharing… that’s bad…  Well, yes and no.  In the old days of COM/DCOM, and even with Remoting, you really were sharing the same actual object.  The client would instantiate a component and it would actually run on a remote server.  In addition to scalability issues, this had a maintenance issue: if you ever updated the component on the server, even if the change didn’t involve the public interface, all the clients had to be updated as well.  To make this more complicated, since clients were running stateful operations on shared servers, you usually had to completely shut down the entire system to make a simple update.  Web Services solves these problems because the object being used and what travels across the wire are really separate things.  The serialization format of a particular object has no direct correlation to the actual object.  Yes, certain objects serialize to certain formats, but that’s not a hard and fast rule.  I can talk in English, French, German, or Hungarian, but I’d still be saying the same thing.  Likewise, three different people can all say the same thing in English, and it would have the same meaning.  Once you separate the wire protocol from the functionality, you can change the internal implementation all you want, and as long as you don’t change the public interface your clients can continue to consume the service. 


Now I know that many would argue that changing functionality within a service is breaking the contract.  And to some extent you’re correct.  If you’re exposing a public service, and don’t own all the clients then you do have an obligation to maintain consistency.  However, if you do own the clients and the services are only consumed as part of your application then this becomes a very flexible model. 


Take the following scenario:


You roll out your application over the weekend.  Thousands of users fire up the app first thing Monday morning.  They all get the updated version with ClickOnce deployment.  Life is good.  Later that day you notice that you forgot to put in a validation rule that orders can’t be shipped for 7 days after the order date.  You quickly add a validation rule to enforce that DueDate must be at least Today + 7 days, and you add a default value setting DueDate to Today + 7 days, but how do you deploy this?  You could announce over the PA system that everyone must exit the sales system for 5 minutes, which effectively shuts the company down and directly affects the day’s profits.  That probably wouldn’t reflect well on your next review. 


Or, you could update the validation logic on the server without touching the client.  Since the public interface didn’t change, the clients continue to function.  Since ASP.NET Web Services use shadow-copied assemblies, you aren’t blocked from updating an assembly on the web server.  Once the file is finished being copied, the web server detects that one of the assemblies it’s using has changed and reloads that assembly into memory.  The next request will immediately get the new functionality.  At this point the client doesn’t have the validation logic; however, when the sales rep attempts to save an order, the additional validation logic will kick in and send the error back to the client.  If the developer updates the client app with the new assembly, the clients will get the update the next time they restart the app. 


You could use the ClickOnce background APIs to constantly check for updates, and that’s a good thing, but you don’t want to be checking for updates every 10 seconds.  That sort of defeats the purpose here.  In the above example, you were able to immediately effect a change without interrupting any of the sales reps, which means no downtime, and you’ve got a great review again.
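For reference, a background update check with the ClickOnce APIs might look something like the following sketch.  This is only a sketch: it assumes the project references System.Deployment and runs as a ClickOnce-deployed app; the `UpdateChecker` class name is made up for illustration.

```vb
Imports System.Deployment.Application

Public Class UpdateChecker

    ' Sketch only: kicks off a background check against the deployment server.
    Public Sub BeginUpdateCheck()
        ' IsNetworkDeployed is False when running under the debugger,
        ' so guard the call to CurrentDeployment.
        If ApplicationDeployment.IsNetworkDeployed Then
            Dim deployment As ApplicationDeployment = ApplicationDeployment.CurrentDeployment
            AddHandler deployment.CheckForUpdateCompleted, AddressOf OnCheckForUpdateCompleted
            deployment.CheckForUpdateAsync()
        End If
    End Sub

    Private Sub OnCheckForUpdateCompleted(ByVal sender As Object, _
            ByVal e As CheckForUpdateCompletedEventArgs)
        If e.UpdateAvailable Then
            ' Prompt the user here, or call UpdateAsync() to download in the background.
        End If
    End Sub

End Class
```

You’d typically call something like BeginUpdateCheck on startup or on a long timer, rather than polling constantly.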


 


So, how do you implement all this?  The basic concepts, including some of the Web Services proxy generation issues, apply equally to custom objects, but for this article I’ll focus on how to leverage Typed DataSets to enable validation on the client but enforce it on the server. 


 


One of the major features of Visual Studio 2005 is the new typed data access components called TableAdapters.  These can be used within your Data Access Layer to handle all the CRUD operations against the database.  TableAdapters are effectively strongly typed DataAdapters, bringing the DataAdapter to parity with the Typed DataSet experience introduced in Visual Studio 2002.  Another VS 2005 enhancement to Typed DataSets is the use of partial classes.  In 2002/03 it was very difficult to add custom logic to the Typed DataSet.  Since Visual Studio uses something called Single File Generators to generate Typed DataSets, any code the developer added to the Typed DataSet would be lost the next time the generator ran.  Developers would sometimes inherit from the Typed DataSet, but that had lots of other complications as well.  It further complicated things because you now had two types: the one generated by Visual Studio and the one that had your validation code. 


Visual Studio 2005 Typed DataSets leverage a feature known as partial classes.  Using partial classes, developers can now directly enhance the functionality of the generated Typed DataSet without the possibility of their code being overwritten by the Single File Generator.  This means we can easily add our validation code directly to the Typed DataSet and enforce it on the server.  However, because Visual Studio generates the Typed DataSet in the same file and project as the TableAdapters, we don’t have an easy way to leverage the Typed DataSet on the client.  We could add a reference on the client to the assembly that contains the Typed DataSet and TableAdapters, but then the client has the intimate knowledge of the server that we’re trying to avoid. 
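As a quick illustration of the mechanism, your hand-written file sits alongside the generated one and the compiler merges the two partial declarations.  The sketch below uses the Northwind names from later in this walkthrough; the `HasShippedOrders` helper is made up for illustration (the generated `IsShippedDateNull` accessor exists for nullable columns in typed datasets):

```vb
' NorthwindDataSet.vb -- your file.  The generated code lives in
' NorthwindDataSet.Designer.vb and can be regenerated at any time
' without touching this file.
Partial Public Class NorthwindDataSet

    Partial Class OrdersDataTable

        ' Custom logic added here survives regeneration because the
        ' compiler merges the partial class declarations at build time.
        Public Function HasShippedOrders() As Boolean
            For Each row As OrdersRow In Me.Rows
                If Not row.IsShippedDateNull() Then Return True
            Next
            Return False
        End Function

    End Class

End Class
```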


Great, so now what?  Well, it turns out it’s not all that difficult to separate these.  It’s not very discoverable, but it’s not difficult.  What we’re going to do is separate the Typed DataSet from the DataLayer into its own DataEntities assembly.  We’ll create a Web Service that returns the Typed DataSet and uses the DataLayer assembly behind the web service.  The client application will reference the DataEntities assembly, but it won’t have the DataLayer, so we’ll keep the intimate details of the database away from the client.  Since Web Services serialize DataSets as XML, we can leverage the same type on both sides of the wire, but we’re not actually sharing the same instance, so we get a lot of flexibility in how we update the application. 


 


The rest of this article is a walkthrough of how to accomplish this.  I’ll highlight some best practices for this scenario to minimize the complexity of working around the default behavior of the tools. 


To get things started we’ll create a solution with the projects to represent the logical and physical tiers.


Creating the solution


1.     Using Visual Studio 2005, create a new solution containing 4 projects.  It doesn’t matter whether you’re using VB or C#. 

Project Name: Project Type & Description

Client: A Windows Forms project

DataEntities: A Class Library that will contain our Typed DataSet with validation logic

DataLayer: A Class Library that will load and save the Typed DataSet in the DataEntities dll

BizServices: A Web Service that will simply wrap calls to the DataLayer


2.     Delete Class1 in the Class Libraries


3.     Delete the Service.asmx and Service.vb/cs file in the App_Code directory


4.     You should now have a solution that looks something like the following:


 


Adding a Typed DataSet/TableAdapters to the DataLayer


1.     With the DataLayer project selected, open the Data Sources Window.  (You can do this from the Data menu by selecting Show Data Sources.) 


2.     Add a new Data Source and choose Database


3.     Add a connection to your database.  In this walkthrough we’ll use the Northwind database


4.     Using the treeview, choose your tables.  You could use sprocs or views, but to keep this walkthrough focused, we’ll just select the Orders table and press Finish.


5.     You should now have a Typed DataSet and TableAdapter for the Northwind.Orders table


 


Moving the DataSet to the DataEntities project


We’ll now cut and paste the Typed DataSet definition from the DataLayer to the DataEntities project to isolate the intimate knowledge of the database from the data entities.



  1. Show the hidden files by pressing the “Show All Files” button at the top of Solution Explorer.

  2. Open the NorthwindDataSet.Designer.vb/cs file.
    In this file you’ll see two major sections: the definition of the Typed DataSet, and a namespace for all the TableAdapters associated with the Typed DataSet.
    It’s easiest to see the source if you collapse the regions for the Typed DataSet and the TableAdapter namespace.

  3. Copy the source code from the first line through the end of the Typed DataSet definition.

  4. In the DataEntities project, add a new class file and name it the same as your DataSet.  You’ll want to add the .Designer extension to maintain separation between the generated code and the validation code we’ll add later on.  In this case we’ll create NorthwindDataSet.Designer.vb.

  5. In this file, paste over the entire contents with the source you copied from the DataLayer project.

  6. Back in the DataLayer project, delete the DataSet definition.  Be sure to leave the headers, Imports and, if using VB, the Option Strict code.

 


Fixing up the references


If you build the project you’ll now see a bunch of errors in the task list.  The TableAdapters reference a type that is no longer “visible”. 



  1. In the DataLayer project, add a reference to the DataEntities project.  This will solve the issue of finding the types. 

  2. You’ll now need to resolve any namespace differences between the entities and the DataLayer. 
    VB: Select the DataLayer project and double-click the My Project node to open the project properties.  Select the References tab and, under Imported Namespaces, check the DataEntities namespace.  The errors should now disappear from the task list.

    C#: Open the NorthwindDataSet.Designer.cs file and add an additional using statement for the DataEntities project:

    using DataEntities;

  3. Once you build the solution you should no longer see any build errors.

 


We now have the data access and data entities isolated, with the proper references.  Notice that while the DataLayer has a reference to the DataEntities, the opposite is not true.  This means the DataEntities can be used without any intimate knowledge of the database.  The next problem is how to get the DataEntities from the DataLayer to the client.  This is where Web Services come in.  Using Web Services we can abstract the client from the server logic.  Note that this same process will work for the upcoming Windows Communication Foundation (WCF) in WinFX.


 


Exposing the DataEntities via Web Services


In order to expose the DataEntities via Web Services, we’ll create some thin wrappers that delegate any calls to the DataLayer.



  1. In the BizServices web service project select add new Web Service

  2. Name the Web Service Orders.  Visual Studio will create two files: one for IIS to expose as an address, Orders.asmx, and another for the code, Orders.vb/cs.

  3. In the BizServices project add references to the DataEntities and the DataLayer projects.

  4. In the Orders.vb/cs file add the following code to expose the Load and Save methods:

    Private _ordersTableAdapter As DataLayer.NorthwindDataSetTableAdapters.OrdersTableAdapter

    Public Sub New()
        _ordersTableAdapter = New DataLayer.NorthwindDataSetTableAdapters.OrdersTableAdapter()
    End Sub

    <WebMethod()> _
    Public Function GetOrders() As DataEntities.NorthwindDataSet.OrdersDataTable
        Return _ordersTableAdapter.GetData()
    End Function

    <WebMethod()> _
    Public Function SaveOrders(ByVal orders As DataEntities.NorthwindDataSet.OrdersDataTable) As DataEntities.NorthwindDataSet.OrdersDataTable
        If orders.HasErrors Then
            Return orders
        Else
            _ordersTableAdapter.Update(orders)
            Return orders
        End If
    End Function



  5. We now have the ability to return and save Orders to the database.  The sample above does some validation testing within the web service, which you’d likely want to place in a separate assembly; you could place it in the DataLayer as well.  The point is to minimize the actual code in the Web Services project so you’ll have more flexibility in where to use this code.  For instance, when WCF ships you’ll likely want to leverage some of the new transports.  By keeping the code in the web service to an absolute minimum you’ll ease your migration process.  
    Another thing worth noticing: while this sample focuses on a simple 1:1 mapping of the database to the returned DataSet, that isn’t a limitation of DataSets.  The TableAdapter wizards are optimized for relatively simple mappings, but using ADO.NET commands, stored procedures and/or views, you can do quite a bit of additional work to differentiate the schema of your DataSet from the underlying database.  Using the merge capabilities of DataSet, you can have a single DataSet or even a single DataTable represent data from two or more database systems.  But that’s a discussion for another day.
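As a rough sketch of that merge capability, two DataTables with the same schema can be combined into one.  The “Products” table and its columns below are made up for illustration; they aren’t part of this walkthrough:

```vb
' Sketch: merging rows from two sources into a single DataTable.
Dim fromSqlServer As New DataTable("Products")
fromSqlServer.Columns.Add("Id", GetType(Integer))
fromSqlServer.Columns.Add("Name", GetType(String))
fromSqlServer.PrimaryKey = New DataColumn() {fromSqlServer.Columns("Id")}
fromSqlServer.Rows.Add(1, "Chai")

' Clone copies the schema (columns, keys) but not the rows.
Dim fromOracle As DataTable = fromSqlServer.Clone()
fromOracle.Rows.Add(2, "Chang")

' Rows with matching primary keys are updated; new keys are appended.
fromSqlServer.Merge(fromOracle)
' fromSqlServer now contains both rows.
```

The primary key is what lets Merge match existing rows instead of blindly appending duplicates.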

 


Consuming the DataEntities in your WinForms client


We’ll now add the ability to load and save a form full of orders.  In the Client project we’ll add a Web Reference to our BizServices to get the load and save functionality, and we’ll add a reference to the DataEntities dll to leverage common validation logic on the client and the server.  Note that we won’t have any reference to the DataLayer dll, so the client will be clean of data access knowledge.



  1. Using Solution Explorer, select Add Web Reference on the Client project.  You could use the Data Sources Window, but we’ll want to rename the service, and that option isn’t available in the Data Sources Window wizard.

  2. In the Web Reference wizard, browse to Web services in this solution.  You should now see your Orders service. 

  3. Since LocalHost is fairly meaningless, select the Orders service and rename it OrdersService.  Press Finish to continue.

  4. If you look at the Data Sources Window you’ll see the dataset returned by the web service is now available.  What’s important to note is that while this dataset is the same “shape” as the DataSet in the DataEntities project, it’s not the same type.  The Add Web Reference dialog created a proxy type for the dataset.  If you show the hidden files you can navigate through OrdersService, Reference.map to find the Reference.vb file.  This contains the code necessary to call the OrdersService, but it also contains a type definition for the NorthwindDataSet.  While it is declared as Partial, we’re not going to leverage this feature. 

    Due to time constraints in the Visual Studio 2005 schedule we were unable to complete a feature we refer to as Proxy Type Sharing.  If you consume multiple web services from the same project that return the same type, Visual Studio will create multiple proxy types on the client.  This means you could actually wind up with several Typed DataSets on the client that all share the same shape.  However, even if we had completed Proxy Type Sharing, we still wouldn’t have exactly what we need in this scenario.  In this case, we already have the type on the client, so we don’t need a proxy at all.  In a few steps, we’ll see how to consolidate these.

 


Merging the proxy and original types


Although we’ll use the Web Service to get and save orders, we’re not going to use its proxy Typed DataSet to represent our data within our client app.  Instead we’re going to use the DataEntities DataSet.



  1. With the Client project active, use the Data Sources Window to add an Object Data Source.  Add a reference to the DataEntities project and select the NorthwindDataSet as your Object Data Source. 
    Visual Studio differentiates Object Data Sources from database data sources by providing a different design experience.  When the developer chooses a database data source, Visual Studio creates Typed DataSets and TableAdapters, and when you drag & drop them from the Data Sources Window, Visual Studio adds load and save code to your form.  For Object Data Sources, Visual Studio doesn’t make any assumptions about how you load and save your objects.  Typed DataSets are simply very specialized data object containers that have all the necessary plumbing for serialization and databinding.

  2. With Form1 open, select the Orders object under the DataEntities NorthwindDataSet in the Data Sources Window and drag it to the form.  Be sure you’re getting the right DataSet; it can be confusing as they’re both named similarly. 

    1. You can remove the Web Service DataSet proxy from the Data Sources Window with a little workaround.  If you show hidden files in the Client project and navigate through OrdersService -> Reference.map you’ll see a NorthwindDataSet.datasource file.  This file is what Visual Studio uses to show the object in the Data Sources Window.  If you delete this file the entry will be removed from the Data Sources Window, but the Web Service will still exist.  If you ever update the web reference, the .datasource file will be recreated.

  3. Double click the form to get the Form.Load event and add the following code to merge the DataSet returned by the web service with the DataEntities DataSet.

    1. We could do another workaround and change the code generated by the web service proxy generator; however, that is much more complicated to maintain.  Since DataSets and DataTables support merge capabilities, we can avoid changing the generated code.  For plain old objects (POOs) that don’t support merge capabilities, you’d likely want to edit the proxy-generated code.

    Private Sub Form1_Load(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles MyBase.Load
        Me.NorthwindDataSet.Orders.Merge(My.WebServices.Orders.GetOrders())
    End Sub


Notice that DataTables now support merging directly.  In 1.x, you had to merge the entire DataSet.  Also note that we were able to return an individual Typed DataTable rather than the entire DataSet.  This only works for Typed DataTables; you can’t return a single untyped DataTable.



  1. Activate the form designer again and enable the Save button that was created in the BindingNavigator.

  2. Double click the Save button and add the following code to commit any changes made by the user and send the changes to the server.

    Private Sub OrdersBindingNavigatorSaveItem_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles OrdersBindingNavigatorSaveItem.Click
        Me.Validate()
        Me.OrdersBindingSource.EndEdit()
        Dim orderServiceOrderTable As New Client.OrdersService.NorthwindDataSet.OrdersDataTable()
        orderServiceOrderTable.Merge(Me.NorthwindDataSet.Orders.GetChanges())
        orderServiceOrderTable = My.WebServices.Orders.SaveOrders(orderServiceOrderTable)
        If orderServiceOrderTable.HasErrors Then
            MessageBox.Show(Me, "Errors on attempt to save")
        End If
        Me.NorthwindDataSet.Orders.Merge(orderServiceOrderTable)
    End Sub



  3. Hit F5 to run your app.

  4. If you look in the bin directory of the Client app you’ll notice we only have the DataEntities.dll and Client.exe files.  The DataLayer.dll, which contains all the intimate knowledge of the database, is left out of the picture. 

  5. Using ClickOnce, publish the Client project and install it on your machine.  Be sure to use the default behavior so you get the Start menu item.

Adding Validation Code


Now that we’ve got the client project set up and deployed, let’s add some validation logic.  To demonstrate the lack of tight coupling, we’ll add the validation logic to the DataEntities project, but we won’t actually deploy it to the client.  So when testing this, be sure not to republish the client.



  1. In the DataEntities project add a new class file and name it NorthwindDataSet.vb/cs.  Note that we’re using the same convention as forms and the DataSet designer to isolate designer-generated code from your code.

  2. In the file add the following code.  Note the validation handlers live inside a partial class of the OrdersDataTable, which is how Handles Me.ColumnChanged can hook the table’s own event:

Partial Public Class NorthwindDataSet

    Partial Class OrdersDataTable

        Private Sub OrdersDataTable_ColumnChanged(ByVal sender As Object, ByVal e As System.Data.DataColumnChangeEventArgs) Handles Me.ColumnChanged
            ' We use the Changed event as we try to avoid deleting information a user enters.
            ' Rather, we show them what they typed, but provide information indicating the specific error.
            ' This helps users understand if they mistyped, or the information is just not permitted.

            ' Use the strongly typed names of the columns to benefit from compile-time verification.
            ' If the user changed the OrderDate or ShippedDate columns, verify the dates.
            If e.Column.ColumnName = Me.OrderDateColumn.ColumnName Or _
               e.Column.ColumnName = Me.ShippedDateColumn.ColumnName Then
                ValidateDates(CType(e.Row, OrdersRow))
            End If
        End Sub

        Private Sub ValidateDates(ByVal row As OrdersRow)
            ' Check for specific column errors and set or clear the errors.
            ' Using this model, DataSets will surface the errors via the IDataErrorInfo interface.
            ' DataBinding in the DataGridView and ErrorProvider will pick up these errors
            ' and surface them to the user.
            ' You may want to use strongly typed resources for the error strings.

            ' Guard against a null ShippedDate before comparing the typed columns.
            If Not row.IsShippedDateNull() AndAlso row.OrderDate > row.ShippedDate Then
                row.SetColumnError(Me.OrderDateColumn, "Can't ship before it's been ordered")
                row.SetColumnError(Me.ShippedDateColumn, "Can't ship before it's been ordered")
            Else
                row.SetColumnError(Me.OrderDateColumn, "")
                row.SetColumnError(Me.ShippedDateColumn, "")
            End If
        End Sub

    End Class

End Class



  3. Rebuild the entire solution to make sure the updated DataEntities dll gets pushed up to the Web Service project.

  4. With the app still running, change the OrderDate to be after the ShippedDate.  Note that if you closed the app above, simply start it again from the Start menu so you get the version that doesn’t have the updated DataEntities on the client.  Notice that you don’t see any errors in the DataGridView.  

  5. Press the Save button.  You may notice a slight delay as the Web Service loads the updated assembly, but when it returns you should now see ErrorProviders indicating that the ShippedDate can’t be before the OrderDate.  

 


Updating the client


To update the client, simply republish the Client project using the ClickOnce deployment wizard and restart your client app.  


 


Wrapping Up


With the above scenario you were able to make changes to the DataEntities without breaking the client applications.  Your users maintained their productivity, yet you were able to quickly add new validation rules.  The clients could then be updated to complete the rollout, but you didn’t break your users’ workflow in the process.  With the basic tooling support in Visual Studio and these tricks you can achieve a truly factored n-tier application that leverages code where you want it, scales to thousands of users, can be updated mid-day without kicking your users off the system, and delivers on the promise of Smart Clients for productive users.


Post-Whidbey, we are working to enable this scenario without all the workarounds noted above, but for now…


 


Steve


 

Comments (17)

  1. SD says:

    Are you really recommending people to return typed datasets from webmethods?

  2. Steve Lasker says:

    Absolutely, returning Typed DataSets from WebServices or even Indigo is a very common and tooled, recommended and supported scenario.  While Typed DataSets may not be as friendly for generic services to the open public, they are perfect when you’re using Web Services as your means to scale out connections to your database for applications that live on both sides of the web services.  

    Is there a specific reason why you’re concerned?

    Steve

  3. Greg Donovan says:

When can we expect a release that will allow us to configure code gen to separate the DataSets from the TableAdapters?

    greg

  4. Steve Lasker says:

    We’re working on our Orcas planning now and this is something we’re looking at closely.  Orcas will be a short product cycle for us, so we’re not sure if we can get it done, but we are doing some initial designs and costing to try and get it in the schedule.  Is this something that’s really important to you?

    Steve

  5. Greg Donovan says:

It’s important in that we are an ISV (www.irmcorporation.com) and we want our solution to be database agnostic.  Some of our clients still run 98, which eliminates SQL Express.  And some of our clients have such large datasets that SQL Express is difficult because we have to use multiple databases to avoid the 4GB limit.  Manually separating the containers from the DAL isn’t too much trouble.  If I could have a wish granted – it would be for the 4GB limit to be raised to 10GB – or unlimited.

  6. Cleber says:

    Steve,
    I spent about a week trying to do this!
    I have a question: I want my client consumer app to just get the data from the Web Service, because it’s in another VLAN in my network and these LANs only communicate over the HTTP protocol.
    So, will the client app work without communicating with the SQL Server directly? (That’s the point that I did not understand.)

  7. Beth Massi says:

    This seems to be a really flexible design but does it change if you want to eventually move to a disconnected architecture? In other words, what would be the recommended approach if you wanted to add offline capabilities to this scenario by saving to SQL-Mobile first and synchronizing the data on a background thread? Would you leave the table adapters in the DataEntities assembly and use the web services instead to do the data sync? Or would you create a proxy that determined whether the system was offline or online and send data in realtime via the webservice when it could as described above and then switch to saving in the database cache when going offline? Hmmm…..

  8. Chris Sullivan says:

    Hello Steve,

    Firstly thank you for a great article. Right now I am part of an R and D team for my companies next generation product. So far we have crafted an architecture that supports a windows UI client that we will write. Long term goal is that the architecture will support a web client and various other clients in a SOA type environment.

    A sticking point for us has been whether to use domain objects or not to pass information through the system. Although domain objects have their place I feel that exposing data and function does not fit in a SOA world or indeed the web client, therefore want to keep the objects simple. Hence I have coined the phrase Just a Bunch Of Data (JBoD, yes I know it conflicts with some other acronym :)) to describe what the information objects should be. Ultimately we will receive data (of any kind) and reproduce that data in a known format for the business layer to validate and apply rules, the information the business layer receives is dumb. To make our UI client as usable as possible and give a great user experience we have had to duplicate some of the business layer functionality.

    The prototype that I created was classic 3 layer affair (4 actually but that’s another story). Instead of cutting and pasting a bunch of code as you have for the purpose of the prototype I created one huge strongly typed set for all of our tables in the DAL – because I couldn’t be bothered to write all the adapter code for the CRUD. Then for the client I used the dataset designer to manually create strongly typed sets to be consumed by the client. Using a modified version Glen Johnsons DAL (as described in chapter 6 of ADO.Net Applications book) I glued the client objects to the DAL objects on a like for like name basis i.e. people on the client maps to people on the DAL. The ADO.Net datatables ImportRow was used to copy the data across from one object to another. What was really cool was that we could change the structure of the client objects without any effect on the DAL objects and the mapping continued to work fine. This meant that we could use cut down versions of the objects on the client for search and read only display purposes. Typically an example being that is non editable data displayed in a grid.

    Unfortunately the flavour of the month for some of our developers seems to be ORM style objects. To be honest I do not find these as powerful as an ADO.Net dataset whether it is strongly typed or not. There is an ongoing debate regarding the use of these tools, although I do like Wilsons for the fact it’s meta data based.

    What would be nice for the future would be data adapter / table adapter to perform update across multiple tables (without having to handcode). Table adapter to be able to handle a subset of the schema. For example GetById() and GetBySurname() will both return the same number of columns, it would be nice to use the same table adapter and return a reduced number of columns.

    Would also be good if we had a strongly typed object designer instead of a dataset designer so that we can take those designed objects and place them into multiple datasets. Right now you have to manually go back to the data source for each object generated in the dataset designer. Obviously the developer would need to provide a wiring up mechanism so that the typed object maps back to a data source (or multiple data sources :)). Therefore you would only require one adapter (generated elsewhere) for multiple objects as it is the same data source.

    Chris.

  9. Jason says:

    We’re developing a smart client application that will leverage an Oracle database and will be using a Web Service to alleviate the need to have the Oracle client on every client PC.  I’ve used a similar approach to what you describe in this article in previous applications I’ve developed using v1.1 of the framework, except I didn’t use strongly typed datasets because of the single-file generator issues you describe above and the absence of strongly typed DataTables.  

    I’ve been playing with VS2005 a bit the last few weeks and the code generation tools seem to work quite well.  What I’m wondering, after reading your suggestions is this: If you manually split the strongly-typed DataSet/DataTable class from the TableAdapter classes, wouldn’t you be eliminating your ability to use the schema editor to make further changes to the DataSet/DataTable and the code generation features?  

    Thanks

  10. Robert Ensor says:

    "Handles Me.ColumnChanged" is fine in VB but there’s no equivalent in C#! And we can’t bind the event in the DataTable constructor because the parameterless constructor is already in the designer.cs file.

    So how do you do validation inside C# datasets 🙂

  11. Now that we’ve gone public with SQL Server Everywhere I was able to talk a little more openly about some…

  12. Darren says:

    “Handles Me.ColumnChanged” – not fine in VB.
    Can’t get the validation code to work – there’s no “Me.ColumnChanged” event to handle.
    There’s no “Me.OrderDateColumn” etc but there is a “Me.tableOrders.OrderDateColumn” etc
    How to wire this up?

  13. Steve says:

    I had to do this as well and the response from the dev’s on it upset me a tad.

    1. it seemed a bit short sighted not to make this a part of the adapter

    2. speaking of adapters – it makes me equally upset that the conn string can’t be set programmatically (ie. for encrypted strings, etc…)

    3. whoever came up with a ‘settings.setting’ file to store the conn string – WHY???? Just use the app.config – I don’t understand why they make these decisions – ie. we have used app.config , why this setting.setting file.

    I was a big fan of table adapters – but these design decisions have been major drawbacks to me – I’ve switched to using ORM tools instead.

  14. Here’s a list of new data tools features in Visual Studio. I will be discussing each one in more detail

  15. Here’s a list of new data tools features in Visual Studio. I will be discussing each one in more detail

  16. RSS It All says:

    Here’s a list of new data tools features in Visual Studio. I will be discussing each one in more

  17. Here’s a list of new data tools features in Visual Studio. I will be discussing each one in more detail
