ASP.NET 2.0, NUnit, and TDD


I got my website set up last night and got a real ASP.NET page pulling data from a database and writing it out.


I did the bulk of my web programming when web-servers were still steam powered (does the name "EMWAC" mean anything to you?), so my skills are mostly Perl-based, and I'm in search of a few opinions...


A few questions:


1) Do you usually build the website locally (using whatever the new desktop-only server is called) and then only copy it over when you're done, or do you work on the live site? (I think I know the answer to this one already...)


2) What's the best approach for doing TDD on website-based programs? I took a quick look at NUnitAsp, but I think I'm more likely to write a set of classes that sit under a very thin asp.net presentation layer, and do unit testing on the classes instead. What has worked well for you?


3) Any issues with using NUnit and Whidbey beta 2? I could never get it to run with Beta 1.


4) For the database, should I run based on a local database, use the real database, or just talk to a mock database?


5) Any other comments about tools/development approaches?


Comments (13)

  1. 1) I generally build locally and either copy the site over, or build an installer for it.

    2) I take a similar approach: write as thin a web layer as possible. This becomes more difficult with complicated GUIs; I would really like to make sure that everything is correct, but I haven’t seen something that really works for me yet.

    4) I have a SQL Server (under an MSDN license) that I use for testing purposes on a separate server. If I didn’t have that I would probably just use MSDE. Of course all the connection info is stored in a config file, so that deploying to production is pretty simple.

    5) I use a bunch of common tools like NDoc, CodeSmith, Reflector, FxCop, and a couple of others I wrote myself.
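
    The config-file setup described above might look like this in an ASP.NET 2.0 web.config (a sketch — "Main", the server name, and the database name are placeholders):

    ```xml
    <configuration>
      <connectionStrings>
        <!-- "testserver" and "TestDb" are placeholders; point this at your
             local MSDE/SQL Server for development, and swap the value when
             deploying to production. -->
        <add name="Main"
             connectionString="Data Source=testserver;Initial Catalog=TestDb;Integrated Security=True"
             providerName="System.Data.SqlClient" />
      </connectionStrings>
    </configuration>
    ```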

  2. Brad Wilson says:

    1. Yes, build locally first.

    2. What you’re describing is the underlying philosophy behind "The Humble Dialog Box", except applied to the web. That’s a very good pattern.

    3. Getting NUnit to run on Whidbey is easy. You just need to modify the .config files for nunit-console and nunit-gui to allow the 2.0 runtime.

    Better yet, use TestDriven.net. Jamie has a version that fixes some Whidbey-related issues.

    http://www.mutantdesign.co.uk/downloads/TestDriven.NET-1.1.1048d.zip

    http://weblogs.asp.net/nunitaddin/archive/2005/04/30/405148.aspx

    4. A tough question. The answer I like is "get away from the database as soon as possible". Definitely no tests above the data layer should require any kind of database.
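
    The config change described in item 3 is a supportedRuntime entry in nunit-console.exe.config and nunit-gui.exe.config. A sketch, assuming the beta 2 version string (v2.0.50215 — the exact number may differ on your build):

    ```xml
    <configuration>
      <startup>
        <!-- List the 2.0 runtime first so NUnit loads under Whidbey.
             v2.0.50215 is the beta 2 build number; check yours. -->
        <supportedRuntime version="v2.0.50215" />
        <supportedRuntime version="v1.1.4322" />
      </startup>
    </configuration>
    ```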

  3. Kevin Dente says:

    1) Locally. Just be sure to test under IIS periodically (or you can just develop under it always), as the development web server can behave a bit differently than a real IIS server. I’ve been bitten by that one.

    2) I’ve found Model/View/Presenter to be very effective. Using dynamic mocks for the View class works well, and is very productive.

    3) Nope – works great. Just be sure to run the latest (2.2.2) and update the supported runtime in the config file. I’ve found TestDriven.Net to be more finicky, however. It mostly works, but crashes the IDE periodically.

    4) I always prefer to mock or stub as much as possible, then separately unit test the db layer against a real database. Newkirk’s TDD in .NET has some good ideas around this.
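
    The Model/View/Presenter split mentioned in item 2 can be sketched like this (a minimal, hand-rolled example — the interface and class names are made up, and in a real project the fake view would typically be generated by a dynamic mock library and the assertions would live in an NUnit [Test] method):

    ```csharp
    using System;

    // The view interface: the only thing the ASP.NET page has to implement.
    public interface ICustomerView
    {
        string CustomerName { set; }
        string ErrorMessage { set; }
    }

    // The presenter owns the logic, so it can be tested with no web server at all.
    public class CustomerPresenter
    {
        private readonly ICustomerView view;

        public CustomerPresenter(ICustomerView view) { this.view = view; }

        public void Load(string name)
        {
            if (name == null || name.Length == 0)
                view.ErrorMessage = "No customer selected.";
            else
                view.CustomerName = name;
        }
    }

    // A hand-rolled fake view; a dynamic mock library would build this for you.
    public class FakeCustomerView : ICustomerView
    {
        public string Name, Error;
        public string CustomerName { set { Name = value; } }
        public string ErrorMessage { set { Error = value; } }
    }
    ```

    The test is then just: create a FakeCustomerView, run the presenter against it, and assert on the fake's fields — no IIS, no browser.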

  4. Haacked says:

    I agree with most of what everyone has said already. A couple of things I’d like to add.

    If I have custom controls or HTTP handlers that need unit testing, I can often apply this approach (Simulating HTTP Context for Unit Tests Without Using Cassini or IIS:

    http://haacked.com/archive/2005/06/11/4617.aspx)

    For unit testing databases, stubs and mocks are fine to a point, but eventually you do want to make sure the database interaction is working. For this, I use a custom Rollback attribute within NUnit (http://haacked.com/archive/2005/06/10/4580.aspx)

    I usually test against a local database and script all changes. I keep the scripts in source control and notify the rest of the team when they need to apply it to their local builds.

    Phil
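
    The rollback idea can be sketched with .NET 2.0's System.Transactions (a hand-rolled equivalent for illustration — the actual attribute from the linked post wraps similar plumbing up as a custom NUnit attribute, and its internals may differ):

    ```csharp
    using System;
    using System.Transactions;

    // Base fixture: every test runs inside an ambient transaction that is
    // deliberately never completed, so all database changes roll back.
    public abstract class RollbackFixture
    {
        private TransactionScope scope;

        public void SetUp()      // would carry [SetUp] under NUnit
        {
            scope = new TransactionScope();
        }

        public void TearDown()   // would carry [TearDown] under NUnit
        {
            // Dispose() without Complete() aborts the transaction.
            scope.Dispose();
        }
    }
    ```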

  5. nik says:

    1) locally

    2) you answered it yourself

    4) local

  6. Erno says:

    1) Although working on the live site might be fine for personal home pages, for enterprise apps you always build, test, create a setup, and deploy…

    2) Agree!

    3) I had it running in beta 1 (I used a drop, not a real CTP…)

    4) Well, that depends on what you want to test… for a new application the db might be empty, so you need a mock to test it with fake data. (Never forget to also test with an empty db…) Always try to test the real thing unless it’s impossible or too expensive…

    Cheers

  7. 1) This is built on the build server and then run through the QA tests before being packaged for deployment. Deployment would be either a zip file or, better still, an installer (for rollbacks etc.). This allows the package to be versioned so you can correlate defects/new additions in your issue tracker with your release.

    2) NUnitASP is a great bit of kit and runs with NUnit and MbUnit, but as stated before TD.NET (which also runs both frameworks) is a great help with this.

    3) As far as I know NUnit runs fine on beta 2, and I can confirm that MbUnit runs on beta 2, but we have run into an MSBuild custom task issue (for the MbUnit MSBuild custom task) that was introduced between beta 1 and beta 2.

    4) I would suggest either using mocks or, better still, a test database that is dedicated to testing (and reflects what you have in dev).

  8. Thinking more about the database testing, it’s worth noting that MbUnit comes prepackaged with a database testing fixture and other fixture types that will aid in unit testing your database. MbUnit also fully supports NMock if you’re mocking your database objects.

    http://www.mertner.com/confluence/display/MbUnit/RollbackAttribute

    http://www.mertner.com/confluence/display/MbUnit/FixtureDependencies

    Roy’s articles are also well worth reading:

    http://weblogs.asp.net/rosherove/archive/2004/07/20/187863.aspx

    http://weblogs.asp.net/rosherove/articles/dbunittesting.aspx

  9. Mehran Nikoo says:

    I would use my local machine to develop the web site, especially in the case of VS.NET 2005, as it can use the local development web server, which is much, much faster than IIS (though if you need specific features of IIS then this is not an option).

    Even when you are using IIS you get more functionality (e.g. exception details) when running on the local machine.

    I would then rely on a build engine (like MSBuild, NAnt) or a setup package to deploy the site to test/production environments.

    For unit testing, as others have mentioned, if you are following the MVC pattern or have a UI process layer it should be pretty straightforward to test.

    The database could run on the local machine or on another machine; in most cases it depends on the team size, the frequency of changes, and your unit testing policy.
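
    A deployment target along the lines described above might look something like this in MSBuild (a sketch — the paths, server name, and item names are placeholders):

    ```xml
    <Project DefaultTargets="Deploy"
             xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <ItemGroup>
        <!-- Everything under the site except source files; adjust to taste. -->
        <SiteFiles Include="WebSite\**\*.*" Exclude="WebSite\**\*.cs" />
      </ItemGroup>
      <Target Name="Deploy">
        <!-- Mirror the site tree onto the test server share. -->
        <Copy SourceFiles="@(SiteFiles)"
              DestinationFiles="@(SiteFiles->'\\testserver\wwwroot\MySite\%(RecursiveDir)%(Filename)%(Extension)')" />
      </Target>
    </Project>
    ```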

  10. MbUnit ships with tools to automatically populate a database (creating tables, constraints, etc.) given a dataset. In the background, it generates the graph of tables/relations and uses it to figure out the order of table population. It also has a SqlAdministrator class that takes care of backup/restore/cleaning tables, etc.

    TestFu also contains a framework for automatically populating a database (given the dataset). Of course, this is rather undocumented…

  11. Nicole Calinoiu says:

    Since it sounds like you might be hosting the site on a shared server, you might want to account for restricted CAS permissions in your testing. Unfortunately, last time I checked, most of the available unit testing frameworks for .NET (including NUnit) were not able to identify tests in assemblies with restricted permission grants. The only one that I could find that didn’t have this limitation was csUnit (which also happened to be the only one in my evaluation pool that didn’t use attributes to identify tests).

    It’s been a few months since I last looked into this, and perhaps newer releases no longer have this problem. However, if you are interested in testing under restricted CAS permissions, it may still be a factor in selecting your tool set.

  12. Chaz Haws says:

    I’ve had trouble using the debugger with NUnit. But running tests outside of the debugger has certainly worked just fine.

    Regarding TDD & database testing:

    If you’re testing a class, use a mock.

    If you’re testing an application, use a real database.

    TDD is often described solely in terms of Unit Testing, but you have to test the integration at some point.

    When testing against a real database, the only question worth asking is: what’s the difference between this server and the real one? Where it lives isn’t important. If it’s the same product, same version, same data, then you’ve got a really good test of everything (except hundreds of users hitting it at once). If it’s the same product and same structure but no data, you’ve got a good process test but maybe not as good a performance test (more than once I’ve discovered afterwards that I’d missed a needed index). And so on.

    Automate the creation of the test server, even if it’s just the structure, from your live database.

    When you change any structure of the test database, script it and test the deployment of the change so you can roll it out the same way onto the live server.

    Can’t comment on anything else, I’m a Winforms guy. But I do a lot of database testing. I’ll eventually look into the database testing tools, but I haven’t used them yet.

  13. Red Forks says:

    I’m using SQLite as a test database.

    Recreate and initialize the database before each test, and delete it after each test.

    SQLite is very fast, and no daemon is required to run it. All you need is sqlite3.dll and SQLite.Net.

    Well, I have a db-transparent library and an O/R mapping library.

    When the db tests run OK against SQLite, they should be OK against SQL Server, Oracle…
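
    A per-test SQLite setup along these lines might look like this (a sketch assuming the System.Data.SQLite ADO.NET provider; an in-memory database stands in for the recreate/delete cycle, since it vanishes when the connection closes — the table and class names are made up):

    ```csharp
    using System;
    using System.Data.SQLite;

    public class ProductDbFixture
    {
        private SQLiteConnection conn;

        public void SetUp()      // [SetUp] under NUnit: a fresh database per test
        {
            conn = new SQLiteConnection("Data Source=:memory:");
            conn.Open();
            Execute("CREATE TABLE Products (Id INTEGER PRIMARY KEY, Name TEXT)");
            Execute("INSERT INTO Products (Name) VALUES ('Widget')");
        }

        public void TearDown()   // [TearDown]: closing the connection drops the db
        {
            conn.Close();
        }

        public long CountProducts()
        {
            using (SQLiteCommand cmd = new SQLiteCommand("SELECT COUNT(*) FROM Products", conn))
                return (long)cmd.ExecuteScalar();
        }

        private void Execute(string sql)
        {
            using (SQLiteCommand cmd = new SQLiteCommand(sql, conn))
                cmd.ExecuteNonQuery();
        }
    }
    ```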
