Why Are You NOT Automating Software?

  • What kind of automation?
  • What is built today?
  • Why does it not work?
  • What has been missing?
  • Why is that so beneficial?
  • What can do this today?
  • We can do this today!

Automated development & deployment tools? Everyone loves them – right? Certainly, seasoned developers and infrastructure engineers love using time-saving tools that help them develop or deploy software faster, more reliably and more consistently – as long as those tools promote a proven practice, a best practice or some recommended guidelines that the engineers subscribe to in their solution.

But even in today’s software industry, there does not seem to be much expectation among software professionals that we should have the capability or tools to build our own tools – tools that do much of the project work for us. I am not talking about simple automation, things like rename/refactor, autocorrect or IntelliSense; I am talking about the kind of automation and tooling that writes or deploys software for us, and saves the hours, days and weeks of our time otherwise spent handcrafting software solutions from scratch yet again. Custom tooling that is precisely tailored by us, for us, to produce or deploy software exactly the way we want it on every project. Not necessarily all the software, but certainly the parts that we have pre-determined can be repeated the way we have done them before.

It is not clear what the appetite for this kind of tooling is today. It certainly makes logical sense to have tooling like this, but there is still a large proportion of engineers (typically the unseasoned) who like doing everything painstakingly and expensively by hand, just so they can rest assured that they control everything themselves and can put their personal mark on their work. And judging by the modest size of the communities who do have the appetite for more automation and reuse, it seems clear that the broader software development/deployment communities are still largely happily numb about it – not needing, or not able, to do automation very often or very well in their work.

What kind of automation?

So what kind of tooling are we talking about here? We are all familiar with most of the basic toolsets that are provided to us by tool vendors along with the latest technologies: for example, the ASP.NET MVC tooling in Visual Studio that came with the release of the ASP.NET MVC 3 framework, or the PowerShell console/shell for Windows that came with the PowerShell engine – you name your favorite framework and toolset. Technology vendors spend heaps of time and money on providing these basic tools to be relevant to the mass audience, so that they can promote the use of their technology as broadly as possible – that’s what brings home the bacon, after all. Most IT experts (let’s just call them experts for brevity here) who develop and deliver solutions built on these general technologies with these general tools get to know them back to front, and learn how to use them for their limited, designed purpose somewhere in the building of real-world solutions. Funnily enough, some experts are not even aware of a distinction between the technology and the tools that aid in programming it (e.g. the .NET Framework versus Visual Studio, or PowerShell script versus the shell). But in most software organizations those experts often come to a point in their mastery of those tools and technologies where they have created similar enough solutions for long enough that they see overarching patterns emerge from their solutions: common themes, common strategies, common practices. And now they desire to apply what they have learned to their next projects using automation, to get more consistent and supportable results – results that ultimately save time and money in implementation and maintenance. And what tools do they have at their disposal to create this kind of automation and share it with others? If you ask a developer to quickly create some automation, they will probably write a dedicated desktop/console application, create a macro in their favorite IDE, or at best create a rudimentary plug-in. An infrastructure architect will almost certainly create a *.ps1 script.

What is built today?

Inevitably, engineers start with what they know, and invariably go off into a dark corner and start cobbling together their automation with the various bespoke automation frameworks, tools and technologies they are already familiar with – it’s creative. It’s also super hard to do. They build tools that they think others would tolerate using, and could possibly use (with some degree of understanding) to apply what the expert has learned to be proven from their experiences on real projects. Tools that will often be parameterized to deal with variance in the requirements of the solution. And these tools are accompanied by detailed documents explaining how to use them, and what they do. These kinds of tools we can generally refer to as ‘Domain Specific Tooling’ (DST). They are specific to solving problems only in a particular solution domain, for a specific audience, promoting/enforcing/implementing specific practices/principles/patterns/implementations given a specific set of requirements and technology assumptions, environments, constraints, etc. The kind of tooling an expert takes on to the next project to repeat what they learned on the last project, to save them doing it all from scratch again – reliably.

Why does it not work?

But in practice, these DST tools are often cobbled together across a jumble of platforms with technologies that don’t/won’t integrate well together. “No worries! Just follow the steps in the documentation!” Brilliant! Usability is an afterthought, because the expert building the automation is often too consumed with the details and complexities of getting the automation to just work for them. To make up for that deficiency, the resulting tooling is often delivered with detailed step-by-step manuals that teach how to install and use it properly (without which, only the expert can use these tools). Those manuals typically aren’t very scalable across unskilled or uninitiated audiences (and that’s if you can even find them – they are usually stored in a different location than the tool).

And what if you want to adapt the tool for the next project, which has slightly different requirements? Often, these kinds of tools are so hard to learn, parameterize or adapt that they are simply discarded, and their value and the expertise behind them are lost entirely. Not really the intent of the expert who invested in creating them.

In larger, better-funded organizations – where support and maintenance have been recognized as a cost burden – these kinds of DST tools may ultimately result from policy pushed down to engineers to implement on their projects, to gain some semblance of consistency and supportability across the organization. This is usually the case when the folks in the ivory tower (who were perhaps once proficient in the field) get an opportunity to be creative and demonstrate how much they should know (or once knew), and what they know or think everyone needs to use now. They get license to create and manage long-running programs that eventually deliver out-of-date tools and badly integrated automation that govern or enforce uncompromising guidelines which must be followed on all projects – no exceptions! But sadly, with this kind of governance, these kinds of ‘tools’ and guidelines are either designed to be too general to realize any productivity benefit, or too specialized or rigid to realize any value for the specific needs of the current project underway. They are designed by the people furthest away from the customer’s problems, and built by people far away from the continually evolving solutions. They are often paid lip-service and simply ignored, or deemed unjustified, moved out of scope, or worked around.

In practice, simply because of how difficult it is to build good, relevant, up-to-date, usable, customizable DST tooling – and because of who ends up doing it – the automation and tools that do get built become unsupportable all too quickly, and their value is lost to future generations of projects. Mission failed.

What has been missing?

There are some very specific and critical requirements that have been missed in many of these past attempts at supporting experts in building custom guidance and automation tools and frameworks. This is mainly because those building the tools consider themselves the target users of their own tools, and as such they assume that if they find them easy to understand and use, so will all other users. Nothing could be further from the truth, of course. There are some key success criteria missing in almost all of these approaches, some poor assumptions, and some very specialized tooling missing that is needed to dramatically increase the success of building maintainable and evolving DSTs. Let’s look at some of them here:

We recognize that the primary activities of the ‘expert’ when building a DST *should* be limited to:

  • Identifying the patterns of existing solution implementations that could be parameterized, repeated or automated.
  • Identifying and naming a simplified description (a.k.a. a domain model) of the part of the solution that needs programming/configuring by an end-user: i.e. its name, scope, language, vocabulary, relationships, cardinality, etc. (a sketch of such a model appears below).
  • Identifying the values and types of parameters of elements in the domain model (i.e. data types, selections, ranges, bounds of values).
  • Defining rules and formulas that can be evaluated against instances of the domain model, to verify configuration of the solution.
  • Designing the user’s experience in terms of how the domain model is represented, understood and manipulated, and what guidance and cues are required for self-guidance and self-discovery of the solution.
  • Capturing and templatizing the specialized assets that have been acquired or developed through previous experiences from existing solutions.
  • Mapping these templated assets to elements of the domain model, and defining the automation that generates, populates and modifies the templated assets.
  • Defining the structure of the solution implementation (i.e. files and folders) and/or integration (i.e. interaction with services) from the current state of the domain model configuration.
  • Creating instructive guidance that shows a user how to create and configure instances of the solution with the assets and automation.

Note that, ideally, none of these kinds of activities should require any training in actual tool building or automation. In practice, beyond the simpler kinds of automation (i.e. generating files and folders from templates, managing solution structure, executing scripts and that kind of thing), the expert may need to learn the details of the automation framework to whatever depth gives them the degree of automation and integration they desire. But that level of understanding should not be a prerequisite for performing all the other activities; it should be staged.
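To make these activities a little more concrete, here is a minimal sketch (in C#, since the tooling discussed here lives in the Visual Studio/.NET world) of the kind of simplified domain model an expert might write down for a ‘service contract’ pattern harvested from previous projects. The names and shapes here (ServiceContract, Operation, RequestStyle) are purely illustrative assumptions, not part of any actual toolkit or the VSPAT object model:

```csharp
using System.Collections.Generic;

// Illustrative sketch only: a simplified domain model for a hypothetical
// 'service contract' pattern, showing vocabulary, relationships, cardinality
// and bounded parameters. Not a real VSPAT or Visual Studio API.
public enum RequestStyle { RequestReply, OneWay }       // a selection parameter with a fixed range

public class ServiceContract                            // an element of the domain vocabulary
{
    public string Name { get; set; }                    // named/configured by the end-user
    public List<Operation> Operations { get; } = new List<Operation>();   // relationship, cardinality 1..*
}

public class Operation
{
    public string Name { get; set; }
    public RequestStyle Style { get; set; }             // selection from the bounded range above
    public int TimeoutSeconds { get; set; } = 30;       // a value parameter with a sensible default
}
```

Everything the platform later automates – templates, validation rules, guidance – hangs off a small, named model like this, which is exactly why the expert’s effort should go into describing the domain rather than writing plumbing.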

The primary requirements of any automation platform that yields and executes DSTs – one that takes as its inputs the outputs of the activities above – would include:

  • Provide simple-to-use graphical/textual designers to create a simplified domain model of the expert’s solution, i.e. provide a minimal vocabulary and meta-language in which elements, relationships, cardinalities, attributes, automation, templates, assets and guidance for the solution can be quickly specified.
  • Provide a rich and extensible automation framework API that can be used to respond to and navigate the domain model, read metadata about the domain, as well as the configured parameters of model elements when a solution is created or updated in it.
  • Provide an extensible integration and abstraction layer to other tools, frameworks and services of the IDE for rich automation and integration with them.
  • Provide an extensible library of automation classes that perform common automation tasks for most domains, e.g. generating text files, prompting users for input, running batch commands, etc. (see the sketch following this list).
  • Generate a platform specific set of tools (a toolkit) that packages up and automates the domain model, assets, automation and guidance into a single versionable, installable and executable toolset.
  • Provide easy means to customize (and control the customization of) an installed toolkit.
  • Provide simple means to compose and extend installed toolkits.
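
As one hedged illustration of what such an automation library and framework API might look like, the C# sketch below shows a single reusable automation class: a command that generates a text artifact from a templatized asset using the configured parameters of a model element. The contracts shown (IAutomationCommand, ISolutionElement) and the substitution scheme are assumptions invented for this example; they are not the actual API of VSPAT or any other platform:

```csharp
using System.IO;

// Illustrative sketch only: one reusable automation class from a hypothetical
// automation library, plus the minimal contracts it depends on.
public interface ISolutionElement
{
    string SolutionDirectory { get; }                 // root of the solution structure on disk
    string GetPropertyValue(string propertyName);     // a configured parameter of the model element
}

public interface IAutomationCommand
{
    void Execute(ISolutionElement element);           // invoked when the element is created/updated
}

public class GenerateTextFileCommand : IAutomationCommand
{
    public string TemplatePath { get; set; }          // the harvested, templatized asset
    public string TargetFileName { get; set; }        // where the generated artifact lands

    public void Execute(ISolutionElement element)
    {
        // Substitute the element's configured parameters into the template and write the artifact.
        string template = File.ReadAllText(TemplatePath);
        string output = template.Replace("{Name}", element.GetPropertyValue("Name"));
        File.WriteAllText(Path.Combine(element.SolutionDirectory, TargetFileName), output);
    }
}
```

The point is not the few lines of file handling; it is that the expert only wires a template to a model element and lets the platform worry about when and how the command runs.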

These generated toolkits will need to support a core set of basic and standardized user interface and lifecycle features, so that the experts building them can focus on the solution domain, rather than how the automation is presented and managed, such as:

  • Publishing, browsing, and installation of the toolkits, and activation/deactivation of installed toolkits.
  • Browsing, creating, modifying, deleting instances of a solution from any toolkit at any time, integrated into an existing solution.
  • Displaying, navigating, inspecting elements of each instance of a solution from a toolkit in a familiar and integrated user interface in the IDE.
  • Navigating seamlessly between instances of a solution to other tools and services in the IDE.
  • Automatic validation and error reporting of incorrectly configured instances of a solution (illustrated in the sketch following this list).
  • Persistence of instances of solutions in human readable (and source-controllable) files.
  • Integration of guidance, so that a user can easily navigate to the guidance specific to the current solution instance.
  • Support for a properties inspector and wizard framework for capturing the configuration of properties of elements in the instances of solutions.
  • Support for managed version updates of solutions when new toolkit versions are released.
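
To show how a couple of these features (the rules defined by the expert, and the automatic validation and error reporting on instances) might fit together, here is another small, hedged C# sketch. It reuses the hypothetical ISolutionElement contract from the earlier sketch, and the rule and error types are again illustrative assumptions rather than a real API:

```csharp
using System.Collections.Generic;

// Illustrative sketch only: a declarative validation rule evaluated against a
// configured instance of the domain model. ISolutionElement is the hypothetical
// contract from the earlier sketch; none of this is a real VSPAT API.
public class ValidationError
{
    public ValidationError(string message) { Message = message; }
    public string Message { get; }
}

public interface IValidationRule
{
    IEnumerable<ValidationError> Validate(ISolutionElement element);
}

// Example rule: an operation's timeout must be a whole number within a sensible range.
public class TimeoutRangeRule : IValidationRule
{
    public IEnumerable<ValidationError> Validate(ISolutionElement element)
    {
        int timeout;
        if (!int.TryParse(element.GetPropertyValue("TimeoutSeconds"), out timeout)
            || timeout < 1 || timeout > 300)
        {
            yield return new ValidationError(
                "TimeoutSeconds must be a whole number between 1 and 300.");
        }
    }
}
```

Because rules like this are evaluated by the platform whenever an instance changes, the end-user gets immediate, specific error reporting instead of discovering a misconfiguration at build or deploy time.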

Why is that so beneficial?

In this way, this class of DST toolkits can be easily generated from the simplified descriptions, rule sets and assets provided by the experts, because the management, lifecycle, operation and integration of the toolkits have been standardized for consistency and usability. And because the experts can focus intensely on their solution domains and on harvesting their assets, rich, highly usable and relevant tooling can be generated (and updated) far more economically and reliably than with existing bespoke automation technologies.

By making the building and customization of DST tooling so economical, and by giving the experts on the front line the capability to create and manage it themselves, the DST tooling that results is always open, always relevant, always up to date, and highly applicable to each project they work on. In this way, more experts start to trust and adopt the tooling provided by other experts, since they can provide their own input to it and continue optimizing it, slowly standardizing on agreed, proven practices. Because the tooling is so easy to customize and extend, it continues to evolve as the technology and practices do, freeing up the experts to focus on other areas of solution development/deployment for automation. As a result, consistency, quality, maintainability and supportability all increase dramatically. If organizations want to prescribe policies for solution development/deployment, they can invest in that too by building their own DST tooling; the experts on the front line can then take that tooling and specialize or generalize it just as easily for their projects, feeding their expertise back into the tools themselves, and realizing the combined value of the organization and the individuals in their marketplace.

What can do this today?

This was precisely the realization behind the VSPAT toolset when it was conceived back in late 2009. It was recognized then that just providing a powerful and flexible tooling framework – the incumbent thinking inside the Visual Studio team at the time – was not enough to address the problem that software professionals faced with bespoke automation. “Just give them powerful frameworks and they can use them!” was a position of grand arrogance, and presumed that every software professional (development or deployment) was an automation and tooling expert [in Visual Studio automation]. The key was in fact to bring the automation experience well within the grasp of all software professionals (both development and deployment), so that any tooling produced was so easy and economical to produce (and to customize or extend) that the initial (and ongoing) investments required to create and maintain great tooling were not prohibitively high, and could be achieved very quickly.

As well as that, making the toolkits themselves an open and extensible system opened up possibilities for customization and tailoring not tackled in past generations of automation frameworks. If you didn’t like a toolkit, you could change it yourself – no source required. A toolkit could just as easily be created from scratch as modified, customized or extended. If you found a toolkit that was 80% of what you needed, you could open it in the designer and customize the remaining 20% to suit your needs just as quickly. If you didn’t like the code that was generated by a toolkit, you could open it in the designer and modify the code template to suit your needs. No longer did you need to invest in esoteric file formats or tinker with aging automation frameworks; everything was graphically designed, related together in a simple view of your domain, with all programming extensibility points typed to your domain. You could even build toolkits iteratively and incrementally, and present them to users to try out every cycle without having to commit to all the automation, so that you could explore and experiment with their look, feel and function first.

If you wanted to build a toolkit and have others implement some aspect of it, or perhaps enable someone to plug something into it (i.e. a technology variant, or an architectural extension), then you shipped the base toolkit with a defined ‘extension point’ in it, and others in the organization/community shipped extension toolkits that plugged into that extension point, allowing the two toolkits to interact and collaborate. With this kind of customizability and extensibility, using the same experience as creating new toolkits, the traditional economic barriers to creating and maintaining such automation started toppling.
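
For instance, the ‘extension point’ idea could be as simple as the base toolkit publishing a small contract that extension toolkits implement. The sketch below is again a hypothetical C# illustration (the contract, the names and the SQL Server variant are invented for this example, not an actual toolkit API):

```csharp
// Illustrative sketch only: a base toolkit publishes an extension point as a contract,
// and a separately shipped extension toolkit plugs a technology variant into it.
// These names are invented for illustration; they are not a real VSPAT API.

// Shipped with the base toolkit: an extension point for alternative persistence technologies.
public interface IPersistenceExtension
{
    string TechnologyName { get; }                      // e.g. "SqlServer", "DocumentDb"
    string GenerateRepositoryCode(string entityName);   // contributes the variant-specific code
}

// Shipped in a separate extension toolkit by someone else in the organization/community.
public class SqlServerPersistenceExtension : IPersistenceExtension
{
    public string TechnologyName
    {
        get { return "SqlServer"; }
    }

    public string GenerateRepositoryCode(string entityName)
    {
        // A trivial stand-in for the real generated artifact.
        return "public class " + entityName + "Repository { /* SQL Server specific data access */ }";
    }
}
```

The base toolkit does not need to know which variants exist; anyone can ship a new extension toolkit later, and the two collaborate through the published contract.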

We can do this today!

With the advances made in the VSPAT technology, and the approach it has proven over and over again, it now makes sense for all software organizations to make small investments in tooling as a software project progresses, as a parallel activity. This works especially well on agile projects, where the tooling and the automated software are changing rapidly together.

As an open source project, VSPAT is not finished by any means; there is more to learn and improve in the real-world customization and versioning scenarios that communities may need. Now that the project is free of its corporate bonds, it can flourish in the wider development community, and we are looking forward to many organizations realizing the potential of this technology and moving the project forward.

Please check out the VSPAT project site, and let’s get the automation discussion and communities going strong again, and realize higher levels of consistency, predictability and supportability in bespoke software solutions.