Validation Application Block: Revealed!


I’m sitting at home, still slowly digesting turkey, watching the snow fall outside the window, and trying to deal with the prospect of the imminent full work week. But while the 4-day weekend has definitely been a welcome break, the next week should actually be pretty interesting, with almost all of the Enterprise Library team in the same city for a change (alas, it’s Redmond 🙂) and the project starting to take shape.


Probably the most interesting new inclusion in the v3 release will be the Validation Application Block. We’re really looking forward to sharing some early drops with you soon, but as we’re not quite ready to do this, I wanted to briefly share a few of our plans and give you the chance to provide some feedback. Naturally, all the usual standard disclaimers apply (side effects may include headaches and nose bleeds etc), and there’s a good chance that the details will change before the release, especially if you ask us to change them :-).


Here are a few of the key scenarios we plan to support with the new block:


Defining Validation Rules


The Validation Application Block will include a comprehensive library of common validation rules that apply to primitive data types. For example, we’ll include rules like string length, numeric range, date range, regular expressions and so on. However your applications will typically deal with more complex objects such as Customers or Orders (yes, here at Microsoft we assume every application is based on Northwind ;-), so while the built-in Validators should be great building blocks, you’ll need to do some additional work to specify how these primitive rules apply to more complex objects. We plan on letting you do this in two primary ways: in configuration (which is ideal if you want the rules to be easily changed after deployment), or in code (which allows better encapsulation of rules and ensures the behavior won’t change unless the code does).


Defining Validation Rules using configuration


As with every other Enterprise Library block, we would expect most people to use the Enterprise Library Configuration console to define externalized validation rules. The following pseudo-configuration shows how you might be able to build validation rules using the tool:



  • Validation Application Block

    • System.String

      • Rules

        • EmailAddress

          • Validators

            • RegExValidator (Pattern:xxxxx)

        • ShortString

          • Validators

            • NotNullValidator
            • StringLengthValidator (Min=1, Max=5)

    • GlobalBank.Customer

      • Rules

        • ValidCustomer (Default=true)

          • Validators

            • PropertyValueValidator (Property: Name, Type: System.String)

              • Validators

                • NotNullValidator
                • StringLengthValidator (Min=1, Max=50)

            • PropertyValueValidator (Property: DateJoined, Type: System.DateTime)

              • Validators

                • RelativeDateValidator (Kind=Before, OffsetFromNow=0)

        • GoldCustomer

          • Validators

            • PropertyValueValidator (Property: DateJoined, Type: System.DateTime)

              • Validators

                • RelativeDateValidator (Kind=Before, OffsetFromNow=5, Units=Years)

Some interesting things to note here are that validation rules and individual validators are specified for a specific type – this should let us effectively filter the type browser dialog in the tool. As you’ll see later, we’ll also have a generic-based API to provide similar benefits at the code level. Also note how you can supply multiple validators for a single rule, either using simple Boolean AND logic, or by attaching validators to properties (or potentially fields or methods) of the target object type. Finally, note how it is possible to define multiple different rules for the same type, letting you do interesting things like differentiate between multiple types of “validity”.


Defining Validation Rules using attributes


Another interesting scenario for validation rules is to declare them within the objects that are being validated. This will be supported using attributes, as the following example shows: 

public class Customer
{
    // Using fields instead of properties for brevity
    [NotNullValidator]
    [StringLengthValidator(1, 50)]
    public string Name;

    [RelativeDateValidator(RelativeDateKind.Before, OffsetFromNow=0)]
    public DateTime DateJoined;
}

In the previous example, the class has only a single, anonymous rule set defined. However as you saw from the configuration-based example, it may be interesting to have multiple validation rule sets for a single class. We’re planning on allowing this too, by letting you specify which rule set each validation attribute belongs to:

public class Customer
{
    // Using fields instead of properties for brevity
    [NotNullValidator("ValidCustomer")]
    [StringLengthValidator("ValidCustomer", 1, 50)]
    public string Name;

    [RelativeDateValidator("ValidCustomer", RelativeDateKind.Before, OffsetFromNow=0)]
    [RelativeDateValidator("GoldCustomer", RelativeDateKind.Before, OffsetFromNow=5, Units=DateInterval.Year)]
    public DateTime DateJoined;
}

Defining Validation Rules using code


Of course, the attribute-based approach is only viable if you own the code for the type being validated. In some cases this won’t be an option – maybe the class was written by another team, maybe you only have access to the binary assembly (such as for a .NET Framework class), or maybe the class was generated by a tool like wsdl.exe. In these cases, you’ll be able to build up single or composite validators using code, for example:

IValidator<string> emailAddressValidator = new RegExValidator("xxxx");
IValidator<string> shortStringValidator = new AndCompositeValidator<string>(
    new NotNullValidator<string>(), new StringLengthValidator(1, 5));
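Used directly, a composite like this applies simple Boolean AND logic across its children. For example (a purely hypothetical sketch – the Validate method and the ValidationResults type are covered in the next section and may well change):

// Hypothetical usage sketch: assumes IValidator<T> exposes a Validate method
// returning the ValidationResults collection described in the next section.
ValidationResults shortResults = shortStringValidator.Validate("Hi");        // passes both child validators
ValidationResults longResults = shortStringValidator.Validate("Too long");   // fails the StringLengthValidator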

Validating objects


Regardless of which of the above approaches you use to specify your validation rules, they won’t be of much use unless you actually plan on validating some objects. We also plan on providing a few ways of doing this:


Validating objects from custom code


The most flexible way of validating objects will be by writing code directly against the block’s API. First, you’ll need to get a reference to the appropriate Validator object. You’ll already have this if you defined the validation rules in code, but if you used configuration or attributes you’ll need to use a factory:

IValidator<string> emailAddressValidator = ValidationFactory.CreateValidator<string>("EmailAddress");
IValidator<Customer> customerValidator = ValidationFactory.CreateValidator<Customer>(); // assumes default rule
IValidator<Customer> goldCustomerValidator = ValidationFactory.CreateValidator<Customer>("GoldCustomer");

Once you have your validator, you’re ready to do some validation! We’re expecting this code to look something like this:

ValidationResults results = customerValidator.Validate(myCustomer);

…where ValidationResults is a collection of ValidationResult objects, each of which will report an individual violation, complete with references to the validator, the object and property causing the failure, and error messages.
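For example, consuming the results might look something like this (again, purely a sketch – members such as IsValid and Message are illustrative placeholders rather than a committed API):

// Hypothetical sketch: assumes ValidationResults exposes an IsValid flag and is
// enumerable, and that each ValidationResult carries a Message. Names may change.
ValidationResults results = customerValidator.Validate(myCustomer);
if (!results.IsValid)
{
    foreach (ValidationResult result in results)
    {
        Console.WriteLine(result.Message);
    }
}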


Alternatively, if you aren’t doing anything wacky with polymorphism, we’d like to provide a façade which lets you create the validator and do the validation in one go:

ValidationResults results = Validation.Validate(myCustomer);

Integrating Validation into your application


While the approach above should work in almost any case, it isn’t optimized for use for any specific technologies or layers. To make it easier to integrate validation into your application, we’re looking to build some “adapters” that will plug the validation engine cleanly into different technologies. We haven’t decided exactly which technologies will be included, but we are looking at ASP.NET (server-side and AJAX), Windows Forms, Windows Presentation Foundation and Windows Communication Foundation. We’ll give more details on this as we work them out, but feel free to provide any suggestions or feedback in the meantime.


Creating your own Validators


While we have a pretty sizable list of primitive Validators we plan to include in the block, it’s still pretty likely that you will have some interesting requirements that can’t be met by stringing together our built-in validators. To get around this, you’ll want to create your own Validator classes. These could validate primitive objects in new and interesting ways, or you could build individual validators that can deal with more complex types (Customers and the like).
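To give a feel for it, a custom validator might end up looking something like the following sketch (illustrative only – the IValidator<T> members and the ValidationResult constructor used here are assumptions and may not match the final design):

// Hypothetical sketch of a custom validator. The interface members and the
// ValidationResult constructor used here are assumptions, not the final API.
public class AustralianPostcodeValidator : IValidator<string>
{
    public ValidationResults Validate(string target)
    {
        ValidationResults results = new ValidationResults();

        // Illustrative rule: an Australian postcode is exactly four digits.
        if (target == null || !System.Text.RegularExpressions.Regex.IsMatch(target, @"^\d{4}$"))
        {
            results.Add(new ValidationResult("The value is not a valid Australian postcode."));
        }

        return results;
    }
}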


Finally, one additional scenario we are considering is letting you “in-line” the validation logic within the classes themselves:

public class TemperatureRange
{
    private int min;
    private int max;

    [SelfValidation]
    // To scope this to a specific rule name, you would use [SelfValidation("Gold")]
    public ValidationResult IsValid()
    {
        ValidationResult retval = (max < min) ? ValidationResult.Failed("Max has to be larger than min") : null;
        return retval;
    }
}

 


I hope this gives you an idea on what we have planned. It’s not too late for us to tweak things, so if you have any important scenarios that you don’t think will be covered in anything I’ve described above, you know how to find me!

Comments (79)
  1. As mentioned before, the Patterns and Practices team has already started working on Enterprise Library

  2. Luis says:

    Tom,

    Can this application block validate the parameters in a method using attributes, for example in a service, without defining a match with a field or a property?

    Thank you.

  3. kris kilton says:

    Is this designed to fit in or replace a rules engine?

  4. Luis – we haven’t built anything like this yet, but it could be an interesting scenario for the WCF adapter. Would you be able to provide some more details on the usage scenario for this? If it’s too big for a comment, feel free to mail me directly (tom.hollander at microsoft.com).

    Kris – No this isn’t designed to be a fully-fledged rules engine. For example, each validator will only return true or false – there’s no way for the validation engine to calculate values or choose from alternative branch paths. However it should be easy (and useful) to call the Validation Application Block from within a rules engine such as WF or BizTalk.

    Tom

  5. don mai says:

    Can this application block flag SQL/LDAP injection strings?

  6. CoqBlog says:

    Tom Hollander presents the validation application block that will be part of version 3 of EntLib

  7. Marty Bell says:

    Are you considering any kind of localisation support in defining validators? This would be useful if the app was culture sensitive, so the validators could be swapped based on culture too.

  8. Scott Harwood says:

    How about a WinForms adapter including failure strings and making them available to the ErrorProvider when an object is data bound to controls?  Perhaps this would clog up what you are trying to accomplish.

  9. Sam Judson says:

    This looks really exciting! Just what we need 🙂

  10. Cool says:

    Finally I am going to get it

  11. Don: We haven’t planned a validator dedicated to script injection, since what constitutes valid or invalid input will be very dependent on the technology and usage scenario. However, we will have a RegEx validator which should work well for checking script injection, provided you can come up with the appropriate regular expressions.

    Marty: Will the support for multiple rulesets on the same types (eg the ValidCustomer and GoldCustomer example in my post) be enough to support culture-specific validation? If not, what else would you want to see?

    Scott: Yes we are looking at doing something along these lines for both WinForms and ASP.NET.

  12. Tom Hollander, a Product Manager working for the Microsoft patterns & practices group, discusses

  13. Marty Bell says:

    Possibly, I guess I was thinking that if we swapped the culture from say en-UK to en-IE the validator for Customer Name would be different (to account for the extra characters that could be used).  It would be nice to factory these in based on the current culture….

  14. Ronen says:

    Tom Hi,

    Can you elaborate on the exception handling for the different validation?

  15. Vikas Goyal says:

    Hi, when is the production-ready v3 expected to be out?

  16. BjornS says:

    Hi Tom,

    I’ve heard that there are plans to include a Visual Studio Plug-In for the EntLib Config. tool. If this is the case, would it be possible to add validation rules there like one does in Workflow Foundation with Policy and Rule Sets?

  17. Kanz says:

    How would you recommend using the Validation Application Block (VAB) in an n-tier Web Application? Specifically at the UI layer, the Facade layer, and the Data Access layer.

  18. El Bruno says:

    Hi, Tom Hollander had already told us a bit about this at TechEd, but at last … we have another AppBlock


  20. Jose says:

    There are a lot of cases in which you create your business entities based on the database model; it would be great to add an easy way to build and "sync" the basic rules (not null, min and max values, length …) with the schema of the tables.

  21. Michel Grootjans says:

    How will the validations interact with visual validation widgets like the asp.net, winforms and third party validation controls?

  22. Blog-a-Styx says:

    Those who have already had the chance to look into the excellent Enterprise Library will be pleased

  23. odalet says:

    Reading this kind of constructs:

    "new AndCompositeValidator<string>(v1, v2)"

    lets me think:

    this sounds a little like functional (or even mock objects syntax), so shouldn’t this be "linqable"?

    You could keep C# 3 in mind  while building this block, and thus, when it becomes available, one could write some "linq" code of this sort:

    ValidationResults r = Validate myObject

    where v1 and v2 or not v3;

    Feasible? Planned?

  24. Simon says:

    Sounds really promising!!

    I also totally agree with Luis – being able to add annotations to cause method parameters to be validated would be great.

    In particular, this would allow us to get rid of code that checks arguments for null etc in the body of the method – although I’m sure much wider uses would become apparent too. Anything that keeps method content to pure business logic or flow is positive in my mind.

  25. Dan Blanchard says:

    Although the Validation Application Block’s rules are aimed at primitive data types, one frequently encounters validation scenarios involving two or more fields.  

    It would be nice if the VAB addressed such cross field validation scenarios. (Which is not to say that what you’ve described so far isn’t welcome:)

  26. Geert Klinckaert says:

    1. "in-lining" the validation logic within the classes themselves should definitely be included.

    2. Although not entirely ‘Validation’ related is to have the Validator provide some additional info like which properties are required.

    I already use this concept in my applications where I am able to do something like validator.CalculateUICues().

    This returns a collection indicating which properties are required, readonly, invisible, …

    ‘Required’

    |__ ‘Name’, ‘Age’, …

    ‘ReadOnly’

    |__ ‘Title’, …

    ‘Invisible’

    |__ ‘Salary’, …

    This can then be used to have the UI react to this information, and it is very useful in situations where, for instance, depending on the user input in a drop-down the requiredness of the properties changes, or depending on the role of the user some fields become readonly or …

  27. Thomas Beck says:

    Tom – As always, thanks for involving the community on this. It looks like you have a laundry list of ideas to work through already. I’ll go ahead and add my $0.02 to the bottom of your list:

    • I like the attribute-based approach. As someone who is increasingly straddling .NET and Java these days, I would encourage you to look at the usage of annotations in Java5/EJB3 to see how they have addressed similar challenges, including validation.

    • I’m sure that you guys are already going in this direction, but I’d recommend making sure that the collection of ValidationResult objects that you return is bindable. Yeah, this displays a bit of affinity towards UI-oriented interfaces but it saves a lot of headaches in the long run.

    • Speaking from experience, externalization of rules into a cached XML file is something that turns up as a pretty frequent requirement in this area and isn’t too hard to implement. Business users love knowing that they can change these validations at a whim, even if the need crops up relatively infrequently in the real world.

    As a short aside and judging from some of the responses above, it would be really great if the P&P group could offer some guidance on the recommended application of different .NET-based mechanisms to enforce business logic. From attribute-based validations and compiled business logic to the use of products for rule-based decisions or full-out business processes,  I think the community would benefit tremendously from some formal guidance in these areas.

    Thomas Beck

    http://www.beckshome.com

  28. Sam Gentile says:

    Workflow/BPM/WCF/SOA David Chappell presents arguments both pro and con as to whether Microsoft qualifies

  29. paul de vries says:

    Are you also planning class validator attributes? By class validators I mean, for example, a validator for a customer class, with fields country and postal code, which can validate the combination.

  30. Kevin Idzi says:

    It is great to see how the patterns and practices team is approaching business rules.  We’ve worked on making something similar where I work, and tied it to work with Typed Datasets. We made the rulesets a collection of rules which only know the datasource table and the column name (as well as any dependent columns).

    Our rules can have group validators which have a series of validation rules for the criteria, and then a series of rules to actually validate against if the criteria is true.

    The way we set it up, our ruleset can be saved and cached globally.  Then a dataset can attach the ruleset and be validated as a whole (loop each table, grab all of the validation rules from the ruleset which are tied to the table, and execute).  

    It ties into our UI because we have a control manager which handles which controls are bound to which datafield in a datatable/dataset.  So when a user changes a field, we can query the ruleset for all rules which have a dependency on that column, then grab all of the controls from the controlmanager which are bound to those fields, and revalidate all of the rules and update all of the visible controls which are affected.  This allows us to do real time validation of just what is needed. And then the whole thing can be validated against if needed, but in most cases we only want to validate what is visible on the screen (so the user can actually change it to resolve the problem).

    I’d like to see more of the conditional grouping with the enterprise version, a dynamic error message which can return data from the ‘object’/row being validated in addition to a message, a way to get all of the rules associated with just a ‘table’ or a ‘column’ – and a way to store the dependencies for the rules.  Also making the ruleset cached globally is critical for large applications which do not want the continuous overhead of creating these rulesets each time they are needed.  

    The attribute technique is interesting, but I am not sure it fits for typed datasets. The API variant is interesting, but not storing the dependencies for a rule seems to be a mistake as far as tying it into a UI which only knows about the data.  The UI needs to be able to query the rules to revalidate for a given field.  And keep in mind, some rules might be dependent on multiple fields (and even calculated columns).  

    Just some thoughts I wanted to share, this is a good start though.

  31. NoMan says:

    So, like, when is the Configuration Tool going to do something useful, like, oh, I don’t know, actually saving config info when you hit save? Where is some documentation for troubleshooting this piece of crap?

  32. Eric M says:

    Anything that can let us specify validation rules once and then use these rules in both the UI (ASP.NET especially) and in the business objects would be great.

  33. Sam Judson says:

    Is what you’re talking about anything to do with this:

    http://www.codeplex.com/ValidationFramework

    or is that something else? (looks very similar)

  34. Kevin Idzi says:

    Sam –

    That is an interesting site, thanks for the link.  I like the way it generates the asp.net client side validators as needed, that’s very clever. What I’m talking about is something we wrote internally, so it is not public.

    Having validations declaratively on the business object just seems a bit too strict for me.  I like someone’s post above about having the rules serialized to XML, that is something we have the option to do as well (or a database). It also allows you to have many more business rules because they can be cached, and the burden of the creation isn’t added for every instantiation of your object. But you need to disassociate the rule from the target to do so.  You need to do a ‘lazy’ bind, or have the ability to do so since some may well prefer a static binding for simpler applications.

    And I am not a personal fan of the IErrorProvider for these sorts of things, since there are multiple rules firing for a given data column, there might be multiple errors set at any one time, and changing one field might trigger others.  The code in the ValidationFramework is way too tedious for setting those rules.  I’d prefer an object where the ruleset is defined, and then let the ruleset manager know how to get the dependencies and keep track of the bindings for each rule and provide lookup services, and then have a control manager to handle the binding using a rule monitor to watch for changes and update the controls accordingly.  It might seem to add some extra layers, but the flexibility is amazing – and this would work just fine for backend validation with just the ruleset by itself (and the business object of course).  

  35. Ryan B says:

    Looks interesting. My two cents would be that I would think about separating the definition of the rules from the actual validation of the rules. It appears that the validators are actually both.

    The reason I say this is that I think it is going to be difficult to create validators that work in different layers of an application (UI,Domain,etc). While it might be possible for some applications, I think it would definitely be fragile.

    Instead why not have different types of Rule specifications that are simply data objects and store information about a rule, but do not actually evaluate the rule. You could provide a library of validators that use these rule specifications to do validation. If the provided validator does not work for a particular situation (works for domain validation, but not UI validation) then a developer could create a new validator to work in that case. The key is that the Rule Specification does not change, only the validator does.

    We are using this technique currently in our application and it works very well.  

  36. Enterprise Library is going to be extended with a Validation Application Block

  37. MSDN Austria says:

    The first details about the new Validation Application Block were recently posted on Tom Hollander’s weblog

  38. Kevin I says:

    There is some work that might need to be rethought a bit as far as ASP.net client side goes, but currently we’re using the ruleset generically in the backend and in the UI.  All the validation does is check values in a datarow. And since we standardized on datasets, that makes it easier.  For domain, that is passed into the ruleset (if needed) and rules are dynamically created based on the domain values (CvvLength, etc). Those rulesets are created once, cached, and the datasets are used during the validation process.  

    We have discussed, after getting the initial version of these out, having these ‘rulesets’ be something that can load from a database, but that doesn’t always work very well.

    For example, if you have a payment type domain that has different credit card types and information on each (cvvlength, pinlength, etc), making a ‘static’ rule definition would not work. Instead, the rule definition needs to be made with this lookup data available to loop through the domain objects and create the rules accordingly (in a general fashion for all payments — I’m not talking VISA do this, AMEX do that, the domain should contain the needed information for if a pin is required, or the regex for the creditcard string, etc).  

    We do have a library of validators, that is how it is designed currently. We have a base object the rules need, other things can be added as needed.

    Due to the way the rules are created, they are self-aware, so our ruleset knows all of the fields dependent on the rules within the set.  This is used by our UI rule monitor: when a field changes, the rules are retrieved to get the controls and display the messaging.  The backend just tells the ruleset to validate the dataset 🙂   So I don’t think it’s quite as fragile as you think.

    For our asp.net integration, we would generate the client side information needed when the page is loaded, since we would know all of the rules needed for the given screen (due to the self-aware nature of the rules).  I wouldn’t want a rule which can dump out ASP.Net sitting in the backend, I’d rather that all be in the frontend. Keeps it more separated and decoupled.

  39. Thanks for all of the feedback, everyone! I’ll try to get some time to respond to some of the individual questions later. For now, I’d like to drill into the excellent suggestion for using the block to validate parameters. We were imagining this would let you decorate method parameters as follows:

    public void DoStuff(
        [NotNullValidator] string x,
        [NumericRangeValidator(10, 20)] int y)
    {
        ...
    }

    If we had a generic interception mechanism, it would be very cool to have the validation enforced automatically. But in the absence of such a beast, we would need to provide an API that walked the stack to find the parameters, extract the attributes and then perform the validation. The reflection APIs let you query the parameter’s attributes, but not the parameter’s runtime values, so unfortunately it would be necessary to pass the parameter values to the validation call from code – the API would use a parameter array and assume the parameters are passed in the same order as in the method signature.

    We were thinking of two methods: one which validated the parameters and throws an exception if anything is invalid, and another which just returns the ValidationResults, eg:

    Validation.CheckParameters(x, y);

    ValidationResults results = Validation.ValidateParameters(x, y);

    What do you think of this idea? Any suggestions to improve it?

    Tom

  40. SimonC says:

    Tom

    A question about validating parameters. I don’t see how your method signature would work. Below is the closest I could come up with. Note the two options, one with black magic and one without.

    public class ParamValidationAttribute : Attribute
    {
        public void Validate(object parameterValue)
        {
            // Call some validation based on the attribute
            // Possibly throw an ArgumentException
            Debug.WriteLine(parameterValue);
        }
    }

    public class MyClass
    {
        public void Method([ParamValidationAttribute] string hello)
        {
            ParameterValidator.Validate<MyClass>("Method", hello);
        }

        public void Method2([ParamValidationAttribute] string hello)
        {
            ParameterValidator.Validate(hello);
        }
    }

    public static class ParameterValidator
    {
        public static void Validate(params object[] parameters)
        {
            StackFrame fr = new StackFrame(1, true);
            MethodBase method = fr.GetMethod();
            ValidateMethodInfo((MethodInfo)method, parameters);
        }

        public static void Validate<T>(string method, params object[] parameters)
        {
            MethodInfo methodInfo = typeof(T).GetMethod(method);
            ValidateMethodInfo(methodInfo, parameters);
        }

        private static void ValidateMethodInfo(MethodInfo methodInfo, params object[] parameters)
        {
            ParameterInfo[] parameterInfos = methodInfo.GetParameters();
            foreach (ParameterInfo parameterInfo in parameterInfos)
            {
                object paramValue = parameters[parameterInfo.Position];
                object[] attributes = parameterInfo.GetCustomAttributes(typeof(ParamValidationAttribute), true);
                foreach (ParamValidationAttribute attribute in attributes)
                {
                    attribute.Validate(paramValue);
                }
            }
        }
    }

  41. Thanks for the code sample SimonC! This looks pretty close to what I was imagining. The only real difference I can see is in the return values/exceptions from the validation methods. Which part of my API sample were you having trouble with?

    Tom

  42. SimonC says:

    The code I was trying to work out was

       Validation.CheckParameters(x, y);

       ValidationResults results=Validation.ValidateParameters(x, y);

    I could not work out how you were going to extract the method info (and hence the validation attributes) without either passing in the type (I used a generic to do this) and the method name, or taking the performance hit of walking the stack.

    I was actually hoping there was something I had missed as I would prefer not to force the consumer to pass through the method.

  43. SimonC – not that I know of. I was assuming a stack walk like in your "black magic" example. Obviously there will be performance implications to this approach; we would need to do some testing to determine how significant this is.

  44. Mr. Underhill says:

    Please consider the integration of these validations with the UI.  Think about this: once a rule is broken in a given object, you want to let the user know about it. PLEASE PLEASE do not keep the validation block only at the class level; it will be really beneficial to prescribe the integration with the UI.

  45. thardy says:

    As to integration with UI, I see two components being necessary:

    1. The user needs to be notified of the results – This is simply accomplished by having a common mechanism to allow your layers to place messages somewhere that they will be displayed to the user.  This is easily handled in your own application framework and probably is well beyond the responsibility of this validation framework.  Just do something like Ruby’s "Flash" mechanism.  Whatever framework you are using should allow this functionality to begin with.

    2. The declarative validation attributes should/could cause client-side validation to occur – This is the kicker.  You should be able to declare all validation requirements in one place, then have them executed both client-side and in the business methods (not code-behind).  That way any APIs you write against your business methods will include the validation.  

    The main requirement I see for the second is a mapping between controls and either entity properties or method parameters.  The mapping between entity properties and controls should be straightforward – something like the following:

    validation.AddValidation("txtFirstName", "FirstName");
    formPerson.DataBound += delegate(object s, EventArgs args)
        { validation.GenerateValidators(formPerson, formPerson.DataItem); };

    (A code snippet straight from an article by Steve Michelotti for Visual Studio Magazine – https://www.ftponline.com/vsm/2006_06/magazine/features/smichelotti/)

    It would be great if the framework could refactor some of that out of sight and automatically wire up the validator generation, but I don’t see the above being very problematic.

    The real difficulty comes in mapping business method parameters to UI elements.  For starters, I get a dirty feeling exposing business method details in my pages.  We use a MVC2/MVP framework and controllers handle all the logic to call the business methods.  Perhaps we could do that mapping in our controllers and I’d feel better about it.  

    But even if I could get past that, we’d still need a mechanism to do it.  We’d need to map the class, method, and parameter to a UI element.  Perhaps the concept of a ParmObject would fill the gap.  Have our framework create a simple ParmObject for every method signature, and make these readily available to all layers that might use them.  Something like the following might result…

    Validation.MapValidator("txtFirstName", ParmObjects.MyClass.MyMethod.FirstName);

    Any thoughts?

  46. thardy says:

    I just realized that perhaps some of the comments on UI integration were referring to the actual message being displayed for a failed validation, not the mechanism to display the message.  I totally agree that the message to display for failure (or a resource key that will get the message) should be available for declaration within the validation attributes themselves.

    [RequiredValidator("SuperValue is required.")]

    or

    [RequiredValidator(ResourceKeys.SuperValueReq)]

    That’s definitely a must-have, especially for properties/parameters with multiple validators attached to them.

  47. SimonC says:

    Tom

    re performance of validating attributes, below seems to be the most efficient. Not the friendliest API but it avoids the use of the stack. Perhaps provide both the stackframe and getcurrentmethod options with some lengthy doco on the performance. Better performance can also be achieved through a static cache of parameterinfos

    public static class ParameterValidator
    {
        public static void Validate(MethodBase methodBase, params object[] parameters)
        {
            ParameterInfo[] parameterInfos = methodBase.GetParameters();
            ParamValidationAttribute[] attributes;
            foreach (ParameterInfo parameterInfo in parameterInfos)
            {
                attributes =
                    (ParamValidationAttribute[])parameterInfo.GetCustomAttributes(typeof(ParamValidationAttribute), true);
                foreach (ParamValidationAttribute attribute in attributes)
                {
                    object paramValue = parameters[parameterInfo.Position];
                    attribute.Validate(paramValue);
                }
            }
        }
    }

    public void Method([ParamValidationAttribute] string hello)
    {
        ParameterValidator.Validate(MethodBase.GetCurrentMethod(), hello);
    }

  48. As a web developer, I would prefer an AJAX implementation.

    Much cleaner.  So I vote for AJAX!

    Thanks!

    Nathan

  49. Omari says:

    You should think about the future and C# 3.0

  50. This is very interesting.  For comparison I’d like to make sure you’ve seen how CSLA handles rules using its RulesManager (http://www.lhotka.net/Article.aspx?id=12983bcf-4599-4a11-917c-72f3d473883e).  Your approach so far seems to lack the concept of a rules manager to "catch" all the results.  The advantage of the rules manager is that it makes it easy to collect broken rules, and display them in another tier, or log them all.  Without it, rules are disconnected from the rest of the application.

    For those who talked about other rules engines like BizTalk and WF (Workflow Foundation), the CSLA author has this to say about the topic (http://www.lhotka.net/Article.aspx?id=245d288e-ed3c-481a-9148-b88883d735e4).  I can definitely see what he’s talking about, and that something like the Validation Application Block looks more straightforward to integrate than something like Workflow Foundation.  I’d be interested to hear confirmation from the authors of the Validation Application Block as to whether you agree about the applicability of different types of "rules engines"–it sounds as though you agree that there is a "right tool for the job" and that no one engine is appropriate for everything (even if you could stretch yourself and model validation as a Workflow Activity, why would you want to?).

    As an aside, when IdeaBlade (http://www.ideablade.com/) created the FunHouse project as a test application for their DevForce project, the validation they implemented was using a modified version of CSLA’s rules.  It would be interesting to see a similar framework, only taken to the next level (as the use of attributes and XML serialization of rules can provide).

  51. Tatis says:

    The P&P team is already working on the next version of the Application Blocks

  52. TerryLee says:

    Tom Hollander introduces the Validation Application Block, a new member of the next generation of Enterprise Library (code-named Enterprise Library v3), on his blog, and shares some ideas on how validation will be implemented. Anyone following Enterprise Library closely should take a first look!

  53. Enterprise Library 3.0 Dev CTP is available on Code Plex since yesterday 🙂 This CTP Highlights are


  55. The first CTP release of the Enterprise Library, which is available for download from CodePlex , contains

  56. Kevin I says:

    The CSLA project makes a good attempt at business rules, but they are tied a bit too tightly to the business objects. The business rules are more useful when used in a context, not all or nothing.  And tying each object to a bunch of function pointers makes it a bit too tight, and creates extra weight when the rules either aren’t needed/used – as well as having them externally loaded.  There are some rules which it is possible to share between a frontend and backend system (if no database interactivity is needed), and a good framework would be able to leverage both cases – as well as provide information on the actual error and provide at least a mechanism for the UI to be able to query out which fields are in error, and validate only the rules associated with a given field (under a given context).

  57. After a momentary interruption in my participation in the blogosphere, I’m taking advantage of a passing lull

  58. I created a sample on my blog (http://www.delarou.net) showing how you can extend the BoundField of ASP.NET for validation through the VAB.

    Tom,

    Is it something similar that you will provide through the VAB? Or can you give an idea how it would look like?

    How can I map, for example a RegexValidator to a RegularExpressionValidator, given that the ‘pattern’ property is private? Or will that be changed in the future?

    thx

  59. manuelra says:

    Some thoughts…

    1. Multi Layer Architecture ==> Replicating Rules

    As per Michael Howard, data should be validated at chokepoints, not everywhere

    "all validations have a point in the design where the data is believed to be well-formed and safe because it has been checked. Once the data is inside that trusted boundary, there should be no reason to check it again for validity […] you should employ multiple layers of defense in case a layer is compromised (Writing Secure Code, 2nd Ed, p345)

    Thus Multi layer architecture ==> multiple points of validation ==> It SHOULD be straightforward to replicate these rules across layers

    And Multi Layer architecture does not ==> validations everywhere ==> Bindable (not intrinsic) rules

    (As far as I remember from Secrets & Lies) As per Bruce Schneier, security validation algorithms should preferably be public (e.g., security by obscurity is not desirable), and thus subject to wide scrutiny; thus validation algorithms MAY be visible

    Possibly each layer will have its own technology nuances. A validation framework should preferably take into account how to replicate rules on the diverse technologies typically used (e.g., javascript, .NET Code, SQL CONSTRAINTS).

    2. Localization ==> (or not?) Propagating contexts & errors across layers

    Q: On a multi layer localizable architecture, Where do you localize error messages?

    A1: on the UI layer

    A2: on the layer that detects the error

    A3: either

    A1 ==>

    – The UI layer has to know all possible error messages

    – All information needed to build the error message should be propagated upwards on its native form (values, ranges, etc)

    A2 ==>

    – Localization context (language + regional settings) should be propagated downwards to enable proper construction of error messages

    – field / property / attribute names within error messages should be mapped at layer boundaries (e.g., MaxAge –> Maximum Age) OR

     error messages should include placeholders for those names OR

     field / property / attribute names are the same on all layers

    A3 = A1 + A2

     Notes:

     * Wouldn’t it be neat to have a WS-RegionalSettings standard that specified header formats for defining desirable error message languages? (if there is one I would like to know)

    3. What error information to replicate

    – fields failing validation

    – values

    – valid domains: should these replicate? sometimes not (e.g., checkdigits), sometimes yes (e.g. min age)

    – "base language" error messages?

    4. Back to bindable rules

    If bindable is desirable, then dynamic binding is heaven 🙂 ==> management & configuration and MAY imply caching, repository strategies, etc

    5. Form Factor

    Although I don’t know the language, this concept (in its non-bindable form) sounds similar to the preconditions that I believe are available in Eiffel. IMHO the parameter attribute based form factor does not provide a significant advantage over standard language constructs within the methods (unless we come up with some form to publish that information through some WS-Policy like mechanism 😉

    As for the object properties the attribute version may compromise the bindable approach.

  60. We’re making great progress with the Validation Application Block in Enterprise Library 3.0, and one

  61. Tatis says:

    One of the things I’ve liked most about the new Validation Application Block is the integration

  62. There are many ways to implement Model-View-Presenter; Supervising Controller and Passive View are just

  63. Yes, it’s finally here. The patterns & practices team is pleased to announce the official release

  64. neuhawk says:

    Enterprise Library 3.0 – April 2007 released

  65. Bashmohandes says:

    Enterprise Library 3.0 Released

  66. The patterns & practices team has announced the official release of Enterprise Library 3.0 – April

  67. Loser-X says:

    Most excellent. I have looked at Enterprise Library in the past, but never quite found a suitable project

  68. Overview The patterns & practices Enterprise Library is a library of application blocks designed

  69. After a long wait, the official release of Enterprise Library 3.0 – April 2007 for .NET Framework 2.0 / 3.0 is here. Highlights: if you were following the CTPs as closely as I was, there won’t be many surprises …

  70. EntLib 3.0 just dropped. Go get the bits here. For more details, see Tom Hollander’s most excellent

  71. Erics Blog says:

    While I was gone (three weeks in the US on vacation) patterns and practices have released Enterprise Library

  72. When Enterprise Library 3.0 was released a month ago, I barely paid any attention to it. Hmmm, what

  73. Haven’t you ever found yourselves spending ages on all sorts of odd validation controls for a few poor forms (and sometimes



Comments are closed.