Why the Zachman Framework is not an ontology

John Zachman has been making an interesting claim in the last few years: that the Zachman Framework for Enterprise Architecture, a creation of his that, through many revisions over the years, has become a cornerstone of Enterprise Architecture practice in many parts of the world, is an ontology.

It is not.

Let’s be clear, first off, about the definition of an ontology.  There are two commonly used definitions.  In philosophy, ontology is the study of the nature of being.  An ontology, in that sense, is a statement of “what is real”: a conceptual account that helps to answer key questions like “what things exist?” and “how does one thing relate to another?”  Philosophical ontology attempts to answer questions like “does truth exist?” or “does energy exist?”

Not long ago, the term ‘ontology’ took on a new meaning in the context of computer science.  In the late 1980s and early 90s, Artificial Intelligence researchers were attempting to create large maps of human understanding, and the term “ontology” started to creep into general use in AI circles.

Tom Gruber is widely credited with bringing the term into focus in computer science.  In 1993, he wrote a paper titled "Toward Principles for the Design of Ontologies Used for Knowledge Sharing," later published in the International Journal of Human-Computer Studies.  The paper can be found here.  A more general discussion can be found here.

Gruber’s use of the term “ontology” is fairly clear.  In the discussion cited above, he stated:

“In the context of knowledge sharing, I use the term ontology to mean a specification of a conceptualization. That is, an ontology is a description (like a formal specification of a program) of the concepts and relationships that can exist for an agent or a community of agents. This definition is consistent with the usage of ontology as set-of-concept-definitions, but more general. And it is certainly a different sense of the word than its use in philosophy.” (emphasis added).

Gruber’s work on ontologies was focused on Artificial Intelligence.  His challenge was complex: how could he get intelligent agents to converse with one another in a consistent fashion?  Ontologies provided that ability far better than simple definitions would.  A concept in an ontology is defined rigorously, and the relationships between terms are specific and declared.

It is important to note that there are no implicit relationships between terms in an ontology.  Relationships, where they exist, are explicitly declared and, in many cases, constrained.  There is no notion that two terms may be related to one another, or not, as needed.
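To make that concrete, here is a minimal sketch in Python of what “declared and constrained” means.  It is not OWL or any standard ontology language, and the concept kinds and the “realizes” relationship are illustrative assumptions, not drawn from any particular framework.

```python
# A minimal sketch (not OWL or any standard ontology language) of the idea that
# relationships in an ontology exist only because they were explicitly declared,
# and only between the kinds of concepts the declaration allows.
# The concept kinds and the "realizes" relationship below are illustrative assumptions.

from dataclasses import dataclass


@dataclass(frozen=True)
class Concept:
    name: str
    kind: str  # e.g. "Application" or "BusinessCapability"


@dataclass(frozen=True)
class RelationshipType:
    name: str    # e.g. "realizes"
    domain: str  # concept kind allowed on the left-hand side
    range: str   # concept kind allowed on the right-hand side


class Ontology:
    def __init__(self):
        self.relationship_types = {}
        self.assertions = []

    def declare(self, rel: RelationshipType):
        self.relationship_types[rel.name] = rel

    def relate(self, subject: Concept, rel_name: str, obj: Concept):
        # No implicit relationships: an undeclared relationship is an error,
        # and a declared one is constrained to the kinds it names.
        rel = self.relationship_types.get(rel_name)
        if rel is None:
            raise ValueError(f"relationship '{rel_name}' is not declared")
        if subject.kind != rel.domain or obj.kind != rel.range:
            raise ValueError(f"'{rel_name}' only relates {rel.domain} to {rel.range}")
        self.assertions.append((subject, rel_name, obj))


ont = Ontology()
ont.declare(RelationshipType("realizes", domain="Application", range="BusinessCapability"))

billing = Concept("Billing System", kind="Application")
invoicing = Concept("Customer Invoicing", kind="BusinessCapability")

ont.relate(billing, "realizes", invoicing)    # allowed: declared, and the kinds match
# ont.relate(invoicing, "realizes", billing)  # would raise: the declaration forbids it
```

The point of the sketch is simply that a relationship either exists because it was declared, with its constraints, or it does not exist at all.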

And here is where we come back around to Zachman.  While the Zachman Framework (ZF) defines, in a 6x6 grid, a self-contained set of terms, it does not declare the existence of any relationships between them.  In fact, the framework implies that anything can be related to anything, without constraint.  While practitioners can reasonably discuss the composition of primitives into composite objects, the Zachman Framework does not describe any composite objects at all.

In defining a “type system” for “things related to enterprise organizations,” the Zachman Framework provides a simple set of data types, and little else.  It is useful, but insufficient.
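As an illustration, here is a sketch of what that “type system” amounts to, in Python.  The row and column labels are paraphrased from common presentations of the framework and vary across its versions; they are only meant to show that the grid, by itself, is a classification scheme.

```python
# A sketch of the Zachman Framework treated purely as a classification scheme.
# The row and column labels are paraphrased from common presentations of the
# framework and vary across its versions; they are illustrative only.

from enum import Enum


class Interrogative(Enum):  # columns: the six questions
    WHAT = "What (data)"
    HOW = "How (function)"
    WHERE = "Where (network)"
    WHO = "Who (people)"
    WHEN = "When (time)"
    WHY = "Why (motivation)"


class Perspective(Enum):  # rows: the audience perspectives
    EXECUTIVE = "Executive"
    BUSINESS_MANAGEMENT = "Business Management"
    ARCHITECT = "Architect"
    ENGINEER = "Engineer"
    TECHNICIAN = "Technician"
    ENTERPRISE = "Enterprise"


def classify(name: str, column: Interrogative, row: Perspective) -> dict:
    # All the grid gives us is a cell: a broad "type" for the thing.
    # Nothing here says how a thing in one cell relates to anything else.
    return {"name": name, "cell": (row.value, column.value)}


print(classify("Customer invoicing process", Interrogative.HOW, Perspective.BUSINESS_MANAGEMENT))
```

Every enterprise artifact can be dropped into a cell, which is genuinely useful, but nothing in the grid says how the contents of one cell relate to the contents of another.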

Using ZF, I can look at a thing and figure out “where in my taxonomy does this thing belong?”  But the type system is too broad to imply many constraints.  The Zachman Framework is full of generalities; specific types are not described.  This means you can classify a “thing” but not really reason about it at a conceptual level, because it is too vaguely defined.  Specificity and clarity are required to draw conclusions, and that requires a great many more objects, plus specific, constrained, declared relationships between them.

Without concrete objects and specified relationships, ZF cannot solve many EA-specific problems.  You cannot, using Zachman alone, have two people draw the same conclusions from the data that you collect, because your data is unstructured.  Conclusions come from structure and relationships, not definitional types.  You need a model that includes relationships in order to infer conclusions from the data.  EA needs ontologies (metamodels).
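Here is a small Python sketch of the kind of inference a declared relationship makes possible.  The “depends on” relationship and its transitivity rule are illustrative assumptions; the point is that the conclusion follows from the declared structure, not from the type labels of the individual items.

```python
# A sketch of the kind of conclusion that declared relationships make possible.
# The "depends on" relationship and its transitivity rule are illustrative
# assumptions; the conclusion follows from the declared structure, not from
# the type labels of the individual items.

depends_on = {
    ("Online Store", "Payment Service"),
    ("Payment Service", "Customer Database"),
}


def transitive_closure(pairs):
    """Repeatedly add (a, d) whenever (a, b) and (b, d) are already present."""
    closure = set(pairs)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure


inferred = transitive_closure(depends_on)

# The inference: the Online Store ultimately depends on the Customer Database,
# even though that fact was never stated directly.
print(("Online Store", "Customer Database") in inferred)  # True
```

Two people working from the same declared relationships will reach the same conclusion; two people working from a pile of classified-but-unrelated items will not.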

An ontology allows inference.  The Zachman Framework does not.

The Zachman Framework is not an ontology.