Stop work everyone, it’s time to check in…


I had to chuckle.  Steven Kelly over at MetaCase often gives us a hard time over the fact that DSL Tools is still at V1 whilst his company’s tool has been in the marketplace much longer.  However, he’s just written a white paper on versioning, from which I nabbed the following instructions for teams working on models:

When the team leader is happy with a set of changes, he can have the modelers log out, suspend the MetaEdit+ database server for that repository, zip it and check it in to version control.

Since a MetaEdit+ repository consists of many files, which must be treated together as one conceptual whole, the repository directories should not be kept under automatic version control at the file system level. Most version control systems only really understand a single file as a versionable unit, so the repository must be zipped into one file, and that file placed under version control.

Blimey, that doesn’t sound much like an agile process to me!
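In plain version-control terms, the snapshot process quoted above amounts to something like the following sketch. The paths, names, and the svn commands in the comment are all invented for illustration, and the server suspend step is tool-specific:

```shell
# Sketch of the snapshot-style check-in described in the white paper.
# Directory and file names are hypothetical; the archive could equally be a zip.
REPO_DIR="${REPO_DIR:-./myproject-repo}"
mkdir -p "$REPO_DIR"   # stands in for an existing repository directory

# 1. Modelers log out; the team leader suspends the database server (tool-specific).
# 2. Archive the many repository files into the single unit version control expects.
tar czf myproject-repo.tar.gz "$REPO_DIR"

# 3. Check the single archive file in, e.g.:
#    svn add myproject-repo.tar.gz && svn commit -m "Model snapshot"
```

The key point is step 2: because most SCC systems treat a file as the versionable unit, the whole multi-file repository has to collapse into one opaque archive before check-in, which is exactly what makes later diffing and merging so awkward.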

In our out-of-the-box experience, we stick to a rigidly simple file-based system in DSL Tools, with the idea that you use regular SCC systems for day-to-day version control. We’re not very mature yet at links between those files, but it’s something we’re looking at closely right now.  We tend to think of a repository as something you publish major version snapshots of models to, as they become part of your company’s knowledge repository.

Given that any major software project with multiple releases will have branched versions, and hence usually a need to merge those branches later, dealing with models can be tricky.  I thought it was sometimes a bit rough in the DSL Tools world to merge XML files (albeit nice clean ones that match your domain model), but I don’t much fancy the idea of trying to merge a zipped snapshot of a whole repository.

I think it’s fair to say that this is an area where our corner of the industry has a lot of work still to do.

[edited as I forgot to put any hyperlinks or tags in]


Comments (5)

  1. Steven Kelly says:

    Heh, glad that it was good for a chuckle, Gareth! For people who like current version control practices, the important missing piece of context from Gareth’s selective quoting was that I recommend they use the single user version of MetaEdit+. With that, it’s just like normal version control with source code files: when you want to check in, the only developer who stops working is you.

    Only if you hit a situation where a single "module" would be too large for one developer, and couldn’t be split into sub-modules, would you move to multi-user editing of that one module. To draw the analogy with current source code version control, this equates to having multiple people simultaneously editing the same file, AND seeing each other’s changes. The locks on areas being edited are made at a fine level of granularity, preventing conflicting edits whilst allowing everybody to keep working. There’s no need for a separate merge step, since you are all working on the same data rather than separate copies.

    As the article makes clear, both single and multi-user versions of MetaEdit+ offer a lot more than current versioning of source files. When you want to refer to something elsewhere in your module, you just link directly to it. If you later want to change its name, you just make the change in one place: since all the referrers are pointing directly to it, the links remain valid. Compare that to the code world, where every reference is made by typing the same sequence of characters in the reference as in the definition, and you have to hand-edit, search/replace, or perform a refactoring to update every single reference if you want to change the name.

    As Gareth mentions, this ability to link to and reuse parts of models is something that is not yet ready in Microsoft’s DSL Tools. That may make current version control practices more applicable, since you still have the same string-based references as in source code files. On the other hand it’s a real shame, as direct linking and reuse is one of the major contributors to the increased productivity of DSM. It’s been proven over the years in MetaEdit+: all our customers use it, and once you’ve tasted it there’s no going back to search/replace.

  2. Andrey says:

    Sorry for posting it here, it may be a bit off-topic.

    But I was curious about the future of the T4 templating engine. I am evaluating it as a means of local code generation for our project, but there is almost no documentation and only 2-3 blog posts on it per year.

    Also I was interested in creating a language service (with embedded C#, like ASP.NET uses), but right now it is simpler to write a parser from scratch than to reuse parts of the T4 parser.

  3. Gareth Jones says:

    Hi Andrey,

    T4 is alive and well.  We don’t have many plans to change the core engine right now, as most folks have found the language itself sufficient.  We are thinking about how we might make calling it more sophisticated, as right now our default Visual Studio experience for it is somewhat basic.

    Interesting that you’d like the parser.  It’s actually just a rather complex RegEx – I’ll look into seeing if we can get it published so folks can have a consistent experience.
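    For readers who haven’t seen one, a minimal T4 template looks something like this. The entity names and the output extension are invented for illustration; a real project might read the names from a model file:

    ```
    <#@ template language="C#" #>
    <#@ output extension=".generated.cs" #>
    <#
        // Hypothetical entity names, hard-coded for the sketch.
        var entities = new[] { "Customer", "Order" };
        foreach (var e in entities)
        {
    #>
    public partial class <#= e #> { }
    <#
        }
    #>
    ```

    The engine runs the C# control blocks between `<# #>` markers, substitutes `<#= #>` expressions, and emits the literal text in between, so this template would generate one empty partial class per entity name.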

  4. Gareth Jones says:

    BTW, there are quite a lot of docs in the SDK, including hosting etc.

  5. With domain-specific languages applied to software engineering, versioning becomes a whole new challenge…