Øredev 2013

In this post I summarize my highlights from visiting the Øredev 2013 conference. First of all, I really enjoyed the broad scope of the conference! In my opinion, it is always healthy to think outside of your technology/approach box to avoid a certain routine-blindness.

Here is my at-a-glance-list of hot topics from Øredev 2013:

  • JavaScript
  • Functional Programming
  • Domain-Driven Design
  • Micro Services
  • Continuous Delivery


JavaScript

JavaScript is everywhere - with NodeJS even on the server side. Some people already talk about JavaScript as the VM or assembly language of the web. I do not want to rehash all of these statements. Instead I would like to share the impressions about JavaScript I got from the sessions at Øredev. A lot of speakers had something to say about JavaScript. Unsurprisingly, all of them agreed that JavaScript is a flawed language - and I concur. The most interesting session on JavaScript was held by Douglas Crockford. He predicted that we might be stuck with JavaScript and its shortcomings for another 40(!) years. Looking back at our industry's history of dealing with sub-optimal programming concepts like the Go To statement, a number like 40 years might even be an optimistic guess. It took us around 30 years from Edsger Dijkstra's famous letter "Go To Statement Considered Harmful" to the actual demise of the Go To statement in a mainstream programming language (Java, 1995). And that was a rather small issue to fix.

What can you do today if you are not happy with JavaScript but need to work with it? Tools like JSLint or JSHint can help you identify the obvious bad parts of JavaScript in your code base (look up the "WAT" talk if you don't know what I mean). In addition, you might benefit from TypeScript or CoffeeScript and their approach of compiling into JavaScript. They bring things like classical inheritance to the table. TypeScript even introduces type safety to the language.
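To make the type-safety point concrete, here is a minimal sketch (function name is my own invention) of the kind of "WAT" coercion plain JavaScript allows and TypeScript rejects at compile time:

```typescript
// In plain JavaScript, "2" + 1 silently yields the string "21",
// while "2" - 1 yields the number 1 -- classic "WAT" material.
// With type annotations, TypeScript catches the mix-up before it runs.
function add(a: number, b: number): number {
  return a + b;
}

// add("2", 1); // compile error: string is not assignable to number
console.log(add(2, 1)); // 3
```

The fix costs nothing at runtime: the annotations are erased when the code is compiled to JavaScript.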

As a side note: Douglas Crockford was not really enthusiastic about bringing the classical inheritance concept to JavaScript (nor about the existing prototypal inheritance concept). He argues that statefulness and conditional logic in OO languages are also flawed and tend to over-complicate things. In his view, the future lies more in functional programming. No surprise that functional programming was also one of the hot topics at the conference.

Functional Programming

Functional programming was present in a lot of sessions. Why has functional programming been getting so much traction over the last few years?

One of the biggest challenges in today's systems is dealing with their high complexity. Complexity arises from mutating state, piling up conditional logic and producing vast amounts of code. Functional programming is no silver bullet, but it offers concepts to deal with these challenges. Immutable types, pattern matching and a succinct but readable notation help developers write better code. There was quite a controversial talk by Bodil Stokke titled "Programming, Only Better". Unfortunately it does not appear in the Øredev video listing. It analyzed the state of our current mainstream programming concepts and demonstrated how functional languages may address some of the issues.
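Two of those concepts - immutable types and pattern matching - can be sketched even in TypeScript, via `readonly` fields and discriminated unions (the `Shape` type below is my own illustrative example, not from any talk):

```typescript
// An immutable, closed set of cases: each variant is tagged by "kind",
// and all fields are readonly, so values cannot be mutated after creation.
type Shape =
  | { readonly kind: "circle"; readonly radius: number }
  | { readonly kind: "rect"; readonly width: number; readonly height: number };

// Switching on the tag is a poor man's pattern match: the compiler
// narrows the type in each branch and checks the match is exhaustive.
function area(s: Shape): number {
  switch (s.kind) {
    case "circle":
      return Math.PI * s.radius ** 2;
    case "rect":
      return s.width * s.height;
  }
}

console.log(area({ kind: "rect", width: 2, height: 3 })); // 6
```

Languages like F# or Haskell make this far terser, but the idea - model data as immutable cases and let the compiler check you handled them all - carries over.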

If you are programming on the .NET platform, you are in the lucky position of already having a lot of functional goodness at your fingertips. C# 3 arrived in 2007 and added features like lambdas and LINQ to the language. And you can go even further into functional programming on the .NET platform with F#. The great thing is that it is absolutely easy to combine C# and F# in one code base: they both compile to the same CLR bytecode. You should watch Phil Trelford's talk to get an in-depth impression. From his talk I took away the idea of modelling my domain classes in F# from now on. I really loved the terseness (I hope to blog about it soon).
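To give a feel for what terse, immutable domain modelling looks like outside of F# itself, here is a rough TypeScript analog of an F#-record-style domain type (all names here are my own invention, not from Phil Trelford's talk):

```typescript
// A terse, immutable domain model: every field is readonly,
// and an "update" produces a new value instead of mutating the old one --
// the same spirit as F# records with copy-and-update expressions.
type CustomerId = string;

interface Customer {
  readonly id: CustomerId;
  readonly name: string;
  readonly email: string;
}

// Copy-and-update: spread the old record, override one field.
const rename = (c: Customer, name: string): Customer => ({ ...c, name });

const alice: Customer = { id: "c-1", name: "Alice", email: "a@example.com" };
const renamed = rename(alice, "Alice B.");

console.log(alice.name);   // Alice   (original untouched)
console.log(renamed.name); // Alice B.
```

In F# the record definition collapses to a couple of lines, which is exactly the terseness that won me over.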

Domain-Driven Design

Domain-driven design (DDD) cannot be overlooked these days as a conceptual approach to developing and architecting complex software. If you have not heard of it, you might want to read a quick summary. Understanding the idea of a bounded context will be one of the "aha" moments of your software development career (at least it was for me). Although not a silver bullet, it seems to be the best approach we have these days to tackle complexity at an architectural level. A couple of sessions addressed DDD. One of them was Julie Lerman's talk on applying DDD concepts with Entity Framework as the persistence infrastructure. Another one was Tom Scott's talk on CQRS/Event Sourcing and DDD.
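The bounded-context "aha" is easiest to see in code: the same real-world concept gets a different model in each context, instead of one bloated shared class. A minimal sketch (context and field names are purely illustrative):

```typescript
// Two bounded contexts, each with its OWN model of "Customer".
// Billing cares about credit; Shipping cares about addresses.
// Neither model leaks into the other -- they share only an identifier.
namespace Billing {
  export interface Customer {
    readonly id: string;
    readonly creditLimit: number;
  }
}

namespace Shipping {
  export interface Customer {
    readonly id: string;
    readonly address: string;
  }
}

const billingView: Billing.Customer = { id: "c-1", creditLimit: 500 };
const shippingView: Shipping.Customer = { id: "c-1", address: "Main St 1" };

console.log(billingView.creditLimit, shippingView.address);
```

The payoff: each context stays small and focused, and changes to one model cannot ripple into the other.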

Micro Services

If you drive the DDD idea of bounded contexts to the absolute extreme, you end up with something called Micro Services. Think of masses of tiny REST/web services that are hooked together via a high-throughput bus. Mix that with a few radical ideas such as a wild mix of technologies, a tolerance for copy & paste programming and a deliberate throw-away mentality. Sounds a bit crazy? Yes, that was my reaction, too. Fred George's talk was the most unsettling one at Øredev. Unfortunately, the session recording is not available (yet). As an alternative you can watch the recording from a Ruby conference in 2012. Side note: I read about a similar concept on Ralf Westphal's blog. Unfortunately, his article is only available in German.

Continuous Delivery

How often do you deploy your software? Once every two months? Every week? There are people out there deploying their software more than 100 times - EACH DAY. Yes, deployments to the live system - not just to an alpha or staging area. Sounds crazy? Not when you actually think it through. What is the biggest issue in software development these days? Yes, you read that already in another paragraph - it is complexity. How many changes are typically rolled out when you deploy to production once every two months? 10 features and a couple of bug fixes, summing up to maybe 3000 touched lines of code? Is your system always stable after deployment? Is it likely that - although you applied best practices such as automated testing - some issues will still pop up? How would you monitor your production system to detect issues introduced by a deployment?

Wouldn't it be easier if you could deploy each tiny feature/fix immediately and monitor its effectiveness in a sophisticated fashion? Touching just a couple of lines of code also reduces the time you need to resolve potential issues. And while you resolve an issue, the tiny increment can simply be reverted. That is the main idea of continuous delivery (CD) - a high-speed feedback loop that allows you to deliver new value with your software as often as you need to.

But who actually needs several updates a day? Companies want to be more nimble than their competitors when it comes to exploring new possibilities for products and offerings! They want to experiment with ideas quickly and easily. There are statistics suggesting that only a third of all features that are incorporated into a piece of software are used to the extent originally envisioned. Yes, you read that right. Two thirds are usually waste that most people neither need nor use. In lean development you want to detect and eliminate waste quickly. With CD you have a tool to put some initial effort into a new idea and gather statistics on how users respond to it. If they like it, continue the development. If not, shut it down quickly. The short turnaround times allow you to experiment.
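The "shut it down quickly" part is usually implemented with feature flags: the experimental code ships dark and is switched on or off without a redeploy. A minimal sketch (the flag store and names are hypothetical, not from any specific talk):

```typescript
// A trivial in-memory feature-flag store. In a real CD setup this would
// be backed by a config service so flags can flip without a redeploy.
const flags: Record<string, boolean> = {
  "new-checkout": true,
};

function isEnabled(flag: string): boolean {
  return flags[flag] ?? false; // unknown flags default to off
}

// The experimental path ships alongside the old one; the flag decides.
function checkoutFlow(): string {
  return isEnabled("new-checkout") ? "new flow" : "old flow";
}

console.log(checkoutFlow()); // new flow
```

If the experiment flops, flipping `"new-checkout"` to `false` retires it instantly - no rollback deployment needed.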

As you can imagine, the shift to CD is not something you just do overnight. It requires a very sophisticated environment that allows operations and development to work seamlessly together - also known as DevOps. Jez Humble's talk gives a great introduction to adopting continuous delivery. There is also a book by Jez for anyone who wants to dive deep into the topic.

When putting the idea of Micro Services into context with CD, it becomes clear that both approaches fit together nicely. The Micro Services architecture is all about quick integration and easy discontinuation of services - exactly what you want to achieve with CD.

My Takeaway

The conference confirmed my feeling that our industry is evolving at different speeds: a majority that does not evolve at all or only very slowly, and a minority that picks up change and (r)evolutionary ideas pretty quickly. The latter are present at conferences, develop cool new products and drive the community. The others - the bulk of our industry - cling longer to old things and do not embrace change. Scott Hanselman recently called them dark matter programmers. They make up the majority of our industry, but you do not know that they exist. They program procedurally, are typically a bit waterfall-y, and are not eager to try new things. Their systems get bigger and bigger day by day, and they drown in complexity. And instead of investing in things like functional programming, Continuous Delivery, DDD or Micro Services, they continue to do the same stuff as 10-40 years ago. Lost opportunities?

Comments (1)
  1. Kjeld says:

    Sounds like a cool conference you attended there. Crockford's prediction is pretty bold but probably right. 🙂
