XP2005

DSL + Agile workshop

I spent most of Monday running a DSL workshop at XP2005, with Steven Kelly of MetaCase. There were 22 participants including ourselves, and people said it was an enjoyable and useful day. Results posted at https://www.dsmforum.org/Events/ADDSL05/results.html

Main interesting points:

· Quite a lot of interest in using DSLs (both graphical and textual) to script and describe tests, especially in a model-based way, where the diagram is a state model, for example (see the first sketch after this list).

· How to develop a DSL: three inputs:

o Variability in the developed software

§ Which bits vary between delivered instances, or across the life of the system? These are the aspects that need at least to be separated out from the more constant stuff, and they are good candidates for generating, or for generalizing and driving from a DSL (see the second sketch after this list).

o Ontology/vocabulary of the domain (as seen in user stories)

§ Make a model of the whole domain, then separate out the variable bits identified from the software variability: these are the concepts you want to express in your language.

o Existing user notations

§ Look for drawings and informal notations that the users already make.

· The issue of “do DSLs help scale up agile?” was more or less accepted without question. Of course they do, provided you carefully manage the evolution of the language and framework: don’t do too much up front, and stick to agile principles.
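
To make the first point concrete, here is a minimal sketch (in modern Java; the states, events, and script syntax are all invented for illustration, not from the workshop) of a textual test DSL driven by a state model: the model records the allowed transitions, and each line of the test script says which event to fire and which state to expect.

import java.util.Map;

public class StateModelTest {

    // The state model: "currentState|event" -> nextState.
    // In practice this would be derived from the state diagram.
    static final Map<String, String> TRANSITIONS = Map.of(
            "LoggedOut|login", "LoggedIn",
            "LoggedIn|browse", "Browsing",
            "Browsing|logout", "LoggedOut",
            "LoggedIn|logout", "LoggedOut");

    public static void main(String[] args) {
        // A test script in the little DSL: "on EVENT expect STATE".
        String script = """
                on login expect LoggedIn
                on browse expect Browsing
                on logout expect LoggedOut
                """;

        String state = "LoggedOut"; // initial state of the model
        for (String line : script.strip().split("\n")) {
            String[] w = line.trim().split("\\s+");
            String event = w[1], expected = w[3];
            String next = TRANSITIONS.get(state + "|" + event);
            if (!expected.equals(next)) {
                System.out.println("FAIL: in " + state + ", on " + event
                        + " expected " + expected + " but got " + next);
                return;
            }
            state = next;
            System.out.println("ok: " + line.trim());
        }
        System.out.println("All steps passed.");
    }
}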
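
And a second sketch, for the point about variability, again with an invented domain and rule syntax: the discount rules are the bit that varies between delivered instances, captured in two lines of a tiny textual DSL, while the interpreter is the constant framework shared by every delivery.

public class DiscountRules {

    // Variable part: per-delivery rules in a tiny invented DSL,
    // "over AMOUNT discount PERCENT". Only this text changes
    // between delivered instances.
    static final String RULES = """
            over 1000 discount 10
            over 500 discount 5
            """;

    // Constant part: the interpreter, the same in every delivery.
    static double discountFor(double orderTotal, String rules) {
        for (String line : rules.strip().split("\n")) {
            String[] w = line.trim().split("\\s+");
            double threshold = Double.parseDouble(w[1]);
            double percent = Double.parseDouble(w[3]);
            if (orderTotal > threshold) {
                return orderTotal * percent / 100.0; // first matching rule wins
            }
        }
        return 0.0;
    }

    public static void main(String[] args) {
        System.out.println(discountFor(1200, RULES)); // 120.0
        System.out.println(discountFor(600, RULES));  // 30.0
        System.out.println(discountFor(100, RULES));  // 0.0
    }
}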

Other stuff at the conference:

Jutta Eckstein, a well-respected process consultant, gave a talk on “Agile: Coming of Age”. Apparently the German armed forces (among others) now stipulate many principles and practices of agile methods for software contracts, in particular two-way transparency and feedback; so it seems we’re getting past the early-adopter stage.

The essence of agile methods is to meet the real needs of the customer by seeking early feedback and planning for change, and also by planning to tune your own process. Jutta said, “If it doesn’t include Retrospectives, it’s not agile.”

Acceptance Testing. There was a session on tools for managing acceptance tests, including FIT, which is about integrating requirements with acceptance testing. It’s a bit as if we associated a test script with every feature in our features DB, and then had the testing apparatus automatically record the results there. Part of the emphasis of FIT, and of the other acceptance-test approaches on show, was to make the tests and their results more accessible to the customer, by working from and to readable documents.
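
For a flavour of how FIT works: the customer writes a table in an ordinary document (HTML, say) whose header names a fixture class and whose columns name its fields and methods; FIT fills in the inputs from each row, calls the calculated columns, and writes the results back into the document. The fixture below is in the style of the classic division example from the FIT distribution (it assumes the Java version of the library, with fit.ColumnFixture on the classpath):

import fit.ColumnFixture;

public class Division extends ColumnFixture {
    // Input columns: FIT sets these public fields from each table row.
    public double numerator;
    public double denominator;

    // Calculated column "quotient()": FIT calls this and compares the
    // result with the expected value in the row.
    public double quotient() {
        return numerator / denominator;
    }
}

A row of the customer’s table might read: numerator 10, denominator 2, quotient() 5; the results document is the same table with the quotient cell coloured green or red according to whether the check passed.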