Fragility of Markets and Architecture Failure

As the NASDAQ exchange came to a grinding halt this past week, I, like many of you, thought about system failure and the series of events and initial conditions that preceded the shutdown.

Did a human do something strange that set a series of events in motion that jeopardized the integrity of the market? Or was it a technological problem that created the condition termed “information asymmetry,” in which traders did not have equal access to information, the information was delayed, or the information quality was not consistent for all parties? I will be reading a lot more on this notion, as there have been systems-dynamics studies of principal-agent problems in the marketplace.

There are numerous questions that require answers before we can understand what occurred. One aspect I feel has not been acknowledged in the press is that although there was a failure, the shutdown and restart were very orderly, judging by how the market behaved before and after. From an outsider's point of view, it appears that a potential catastrophe was avoided because someone interceded before things reached a chaotic state.

Robert Greifeld, CEO of the Nasdaq OMX group, said on Bloomberg TV:

“We have to make sure no matter what happens the system stays up and the system has a resiliency and a robustness that we did not exhibit yesterday.”

I can only assume that Greifeld is referring strictly to the “technological system,” because taken as a whole, the entire system, including the people, processes, and procedures, did function well. That does not mean the failure was acceptable; I am sure there will be financial and legal repercussions that may not yet be completely known. But again, the markets appeared to shrug it off after the event.

Now what about the technological system? First and foremost, the environment in which it operates (the marketplace) has a very large number of actors, variables, conditions, and components. The whole notion of the market is an incredibly complex phenomenon. Even so-called experts who try to predict the market are befuddled. The most successful traders are the ones who accept that markets are chaotic and dynamic and are smart enough to take advantage of the entropic nature of the market. They operate in the world of “deterministic chaos,” in which there is a degree of entropy.

The technology elements that support the market exchanges comprise an equally large number of incredibly diverse, highly interconnected components. Over the years, the technologies that support markets and exchanges have become very complicated, growing by bolting new functionality onto older platforms that then require integration. Pair a complex system with an overly complicated one, and you can guess what may occur. With that acknowledged, the technological system must not be fragile; it must be antifragile.

In previous blogs, I mentioned there are four systems that comprise an enterprise suprasystem: the sociosystem, the technosystem, the biosystem, and the econosystem.

The “actors” in the sociosystem acted in an orderly fashion, which fortunately did not affect the overall econosystem, or “marketplace.” The biosystem (how and when we react to an event) actually worked well from the sociosystem side, but not so much from the technosystem side. Therefore the technosystem failed and was certainly fragile.

The technosystem within these markets provides one OUTPUT that is valuable: INFORMATION.

Fair and equal access to information is the lubricant of a healthy market. One has to wonder whether the CTOs or Enterprise Architects within these exchanges even have the ability to re-architect something that has become so complicated, and has grown so organically, that it is now beyond comprehension. When something is beyond understanding, the technology drifts into the world of the overcomplicated. In some cases that will be fine for a while, as long as you can predict the outcomes most of the time and handle events gracefully when a result falls outside the prediction range. (Which will eventually happen.) Over time these systems erode and decay; they become more complicated and then move into the complex domain, where the outcomes become erratic and impossible to predict. You have moved from deterministic chaos to chaos that is not even probabilistic.
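The idea of handling events gracefully when a result falls outside the prediction range can be made concrete with a circuit-breaker pattern, which exchanges use in spirit when they halt trading. The sketch below is purely illustrative (the class, thresholds, and values are my own assumptions, not NASDAQ's mechanism): after repeated out-of-band results, the system stops and degrades gracefully rather than letting errors cascade.

```python
# Minimal circuit-breaker sketch. All names and thresholds are
# illustrative assumptions, not any exchange's actual mechanism.

class CircuitBreaker:
    def __init__(self, max_failures=3):
        self.max_failures = max_failures  # tolerated consecutive excursions
        self.failures = 0
        self.open = False                 # open = processing halted

    def record(self, value, low, high):
        """Record an observed value against its predicted band.

        Returns True while the system may keep operating,
        False once the breaker opens and a graceful halt is required.
        """
        if low <= value <= high:
            self.failures = 0             # back inside the band; reset
        else:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.open = True          # halt gracefully, don't cascade
        return not self.open

breaker = CircuitBreaker(max_failures=2)
print(breaker.record(100, 90, 110))  # True: within the predicted band
print(breaker.record(150, 90, 110))  # True: one excursion is tolerated
print(breaker.record(160, 90, 110))  # False: breaker opens, system halts
```

The point of the pattern is that the failure mode is designed in advance: the system knows what to do when prediction fails, instead of discovering it live.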

I believe I can assert that building a complicated technology environment to address a market that is complex by its very nature will lead to more such events in the future, perhaps with graver results than what we witnessed this past week. This event should be a wake-up call for everyone in the enterprise and IT architecture professions to strongly reconsider how we design systems in this new age.

We cannot just say, “Well, I followed this framework, so my architecture is perfect.” We cannot point at the implementation and say it was the implementers' fault. We architects must have skin in the game. Our challenge is that the architecture of soft systems is not as predictable as we would like, especially for those of us who come from the world of engineering, where prediction is vital. We need to architect and design systems that operate within a band of tolerated outcomes, so that the structures within the system can govern, provide, or guide the right behaviors.
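A “band of tolerated outcomes” can be sketched in a few lines: rather than demanding an exact result, the system accepts any outcome inside a tolerance band and flags anything outside it for intervention. The function, the reference value, and the tolerance below are all hypothetical, chosen only to illustrate the idea.

```python
# Sketch of a "band of tolerated outcomes". The reference price and
# 5% tolerance are illustrative assumptions, not real market rules.

def within_band(outcome, reference, tolerance=0.05):
    """Accept outcomes within +/- tolerance (fraction) of a reference."""
    return abs(outcome - reference) <= tolerance * reference

orders = [100.2, 99.5, 123.0, 100.9]
accepted = [p for p in orders if within_band(p, reference=100.0)]
flagged = [p for p in orders if not within_band(p, reference=100.0)]
print(accepted)  # [100.2, 99.5, 100.9]
print(flagged)   # [123.0]
```

The design choice is that the structure itself guides behavior: outliers are not silently processed, nor do they crash the system; they are routed to a defined path.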

Architects fail in their job when they design and implement systems that are fragile. I think an architect gets an average score if the system is robust or resilient. An excellent architecture is antifragile: it promotes the right behaviors through smart governance and elegant design.

As always, I am interested in your thoughts…..