A Blog about Business, Enterprise Architecture and Software Design by the Founders of ZenModeler

Simple and Easy Software Design – QCon London 2012

The talks I remember most from QCon's second day are those by Rich Hickey. I was lucky enough to attend both the keynote in the morning (slides available) and the evening talk about the modeling process (slides also available on the GotoCon website), given at the London Java Community Group.

I had already been impressed by the clarity of Rich's thinking in the Clojure documentation, but the unveiling of the Datomic service a few days ago revealed a real visionary. He is a sort of philosopher who first goes very deep to grasp the essence of concepts like state, identity or time, and then applies his thoughts to technology. I am fond of this way of thinking and doing.

I already wrote about state, identity and behavior a few weeks ago. Here is a more general take on simplicity, which was also the subject of Rich's keynote at QCon: « Simple Made Easy ». Simplicity is a subject that fascinates me, as I see it as one of the ultimate goals of my design activities. In the following sections I will humbly mix my own thoughts on the subject (sentences starting with « I ») with Rich's (sentences starting with « Rich »).

Simple and Easy

Rich started by talking about the core and boundaries of these two words:

  • Simple: one fold, not composed
    • From the Latin simplicem, "without a fold"
    • It is the opposite of complex
    • It is an objective concept, independent of the human observer
  • Easy: to lie near
    • Unlike simple, it is relative: what lies near one person's capabilities may be far from another's

Complication or complexity?

As a first step to grasp the concept of « simplicity », I started by looking at its opposite, complexity. I first separate complication from complexity:

  • Complication:
    • Deterministic, with lots of details and connections, but linear « cause and effects »
  • Complexity:
    • Non-linear « cause and effects »: the behavior of the whole is hard to predict from its parts

Finally, even though it's interesting from an intellectual point of view to separate complication and complexity, I think we can simply use the following definition:

Complexity is a measurement of how difficult it is for someone to understand how a system will behave or to predict the consequences of changing it.

This definition is relative to the subject trying to understand the system, so I think what is pertinent to analyze here is the subject's perception of complexity.

What makes a system difficult to understand?

  • High level of variety in its components
    • Figure: low variety in the system vs. high variety in the system's constituents
  • Lack of order
    • Figures: the same relations between the elements, with some organization added (partitioning in one, grouping in the other)
  • High degree of connectivity
    • Figures: Linux/Apache system calls for serving an image vs. Windows/IIS system calls for serving an image

Which one would you prefer to work with? :-) But is simplicity only the absence of complication and complexity? That's a good start (high order, low connectivity and low variety), but I think there is something more to it: simplicity is also defined by what you add with clarity, purpose and intentionality (look at the DDD intention-revealing interface pattern). I think the most difficult part of simplicity lies in the latter.
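
To make the intention-revealing interface idea concrete, here is a minimal sketch in Clojure; it is my own illustration (the function names and the discount domain are hypothetical), not an example from Rich's talk or from Eric Evans' book:

    ;; Intention hidden behind a generic name and a boolean flag:
    ;; the caller has to read the body to know what `process` does.
    (defn process [order flag]
      (if flag
        (assoc order :total (* 0.9 (:total order)))
        order))

    ;; Intention revealed by the name; no flag, no guessing.
    (defn apply-loyalty-discount [order]
      (assoc order :total (* 0.9 (:total order))))

    (apply-loyalty-discount {:id 1 :total 100.0})
    ;; => {:id 1, :total 90.0}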

Complexity and Simplicity toolkit

Rich's morning talk was in praise of Clojure and a line of argument about why Clojure does better on simplicity and easiness. Rich revives an old English verb, « complect » (« to join by weaving »), and characterizes complexity and simplicity constructs by what they complect. It was, in effect, a confrontation of the OOP/imperative way of thinking with the functional one, laying out the concepts on each side:

Complexity toolkit

  • State complects everything that touches it
  • Objects complect state, identity, value and operations
  • Methods complect function and state, namespaces
  • Syntax complects meaning (derives meaning) and order
  • Inheritance complects types
  • Switch/matching complects multiple who/what pairs
  • Variables complect value and time
  • Imperative loops and fold complect what and how
  • Actors complect what and who
  • ORM complects... OMG
  • Conditionals complect why and the rest of the program
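
To make « complect » concrete, here is a tiny Clojure sketch of my own (not from Rich's slides) for the « variables complect value and time » row: with a mutable reference, what you observe depends on when you look, whereas a value does not carry that hidden time dimension:

    ;; A mutable reference complects value and time: the answer
    ;; depends on when you dereference it.
    (def balance (atom 100))
    (def seen-before @balance)   ; 100
    (swap! balance + 50)
    (def seen-after @balance)    ; 150

    ;; A plain value is just a value: everyone holding it sees the
    ;; same thing, whenever they look.
    (def snapshot {:balance 100})
    (:balance snapshot)          ; => 100, now and forever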

Simplicity toolkit

For each construct, how to get it:

  • Values → final, persistent collections
  • Functions → a.k.a. stateless methods
  • Namespaces → language support
  • Data → maps, arrays, sets, XML, JSON, etc.
  • Polymorphism à la carte → protocols, type classes
  • Managed refs → Clojure/Haskell refs
  • Set functions → libraries
  • Queues → libraries
  • Declarative data manipulation → SQL, LINQ, Datalog
  • Rules → libraries, Prolog
  • Consistency → transactions, values
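
As a small sketch of the first row (my own example, not from the slides), Clojure's persistent collections give you values: « updating » one returns a new collection and leaves the original untouched:

    (def team ["Alice" "Bob"])

    ;; conj returns a *new* vector; team itself is never modified.
    (def bigger-team (conj team "Carol"))

    team         ; => ["Alice" "Bob"]
    bigger-team  ; => ["Alice" "Bob" "Carol"]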

Unsurprisingly, the Clojure concepts end up with all the simplicity labels :-). Nevertheless, it's a good kick in the ass for the OOP approach.

What brings simplicity in our programs?

  • Composition of simple components
    • This is really the fundamental way to get a simple design, a direct application of the Single Responsibility Principle.
    • It is in tune with the Unix philosophy, which composes small programs (cat, grep, sed, awk) and links them together through a text interface (the pipe); see the sketch after this list. Actually, I think every point of the Unix philosophy helps bring simplicity to any design.
  • Modularity by partitioning or grouping
    • But be careful: grouping and partitioning are enabled by simplicity, not the other way around.
  • State always brings complexity
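
Here is the promised sketch, entirely my own illustration (the log lines and function names are hypothetical): the grep-then-count pipeline expressed by composing small Clojure functions, each doing one thing:

    (require '[clojure.string :as str])

    ;; Each function does one small thing; the pipeline composes them,
    ;; much like `cat access.log | grep ERROR | wc -l` in a shell.
    (defn error-line? [line]
      (str/includes? line "ERROR"))

    (defn count-errors [lines]
      (->> lines
           (filter error-line?)
           count))

    (count-errors ["ok" "ERROR: disk full" "ok" "ERROR: timeout"])
    ;; => 2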

Example of incidental complexity with the « order » concept

Rich then talks about the « order » concept, a concrete example of something that brings incidental complexity and infects a lot of our usual programming constructs without being useful. Why does « order » bring incidental complexity? Because it inhibits modification: a change in the order impacts every consumer of the collection. Hence we find the « order » concept in the constructs on the left below, while the constructs on the right fulfil the same needs without that over-added « order ».

Order in the Wild

Complex → simple:

  • Positional arguments → named arguments or a map
  • Syntax → data
  • Product types → associative records
  • Imperative programs → declarative programs
  • Prolog → Datalog
  • Call chains → queues
  • XML → JSON, Clojure literals
  • ...
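
As a small illustration of the first row (my own sketch; the function and keys are hypothetical), positional arguments force every caller to remember an order, while a map carries no such incidental order:

    ;; Positional: every caller must remember the argument order.
    (defn make-user-positional [name email age admin?]
      {:name name :email email :age age :admin? admin?})

    ;; Named: a map has no incidental order; arguments can be given,
    ;; read and extended in any order.
    (defn make-user [{:keys [name email age admin?]
                      :or   {admin? false}}]
      {:name name :email email :age age :admin? admin?})

    (make-user {:email "ada@example.org" :name "Ada" :age 36})
    ;; => {:name "Ada", :email "ada@example.org", :age 36, :admin? false}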

Conclusion

Why take all this time to write about simplicity, in what can be seen as a very philosophical way? Because simplicity is a key concept and a key skill in designing our software, not only in its technological aspects but first and foremost in its domain.

Yet the most significant complexity of many applications is not technical. It is in the domain itself, the activity or business of the user. When this domain complexity is not dealt with in the design, it won't matter that the infrastructural technology is well-conceived. A successful design must systematically deal with this central aspect of the software.

Eric Evans, author of Domain-Driven Design

« Simplicity » and « Common Sense over Common Practice » were also the corollary of the « test considered harmful » and « ORM OMG! » themes that were highlighted several times during the conference. What I like most about QCon is the « think differently » attitude I could feel in some talks given by opinionated people. And after that exposure I started looking at things and thinking differently, so thank you Rich for having given me that!

Other write-ups about that second day at QCon:

Know The Trade-offs Of Your Design Decisions – QCon London 2012

Trade-off is the keyword I remember most from that first QCon London 2012 day. It was a key point of Dan North's presentation yesterday: each decision implies a trade-off, whether you know it or not.

Trade-off means there is no black-and-white choice; you always have a trade-off, and the point is to make an informed decision, not one influenced by emotion or fashion.

Dan talks about four topics and the usual decisions that go along with them:

  • Team composition: co-located or distributed? feature or layer teams? experienced or inexperienced? small or big?
  • Development style: automated or manual builds? test-first? TDD?
  • Architecture: monolith or components? resources or messages? synchronous or asynchronous? single event loop or multiple threads?
  • Deployment: automated or manual deployment? vertical or horizontal scaling? hosted or in-house? bespoke or commodity?

Common Sense is better than Common Practice

Dan North.

This is a quote from when he talked about the times when tests are the worst thing to do given the trade-offs: longer time-to-market, a disposable application, low return on investment, lower agility for change, etc.

Design is all about the decisions you made

All of your architecture is about the informed decisions you make. A decision can have a system-wide impact on the quality attributes of the system, or a localized one; these define, respectively, macro (cross-system) and micro (system-internal) architecture, as Stephan Tilkov put it in his presentation. Design is all about the macro- and micro-level decisions you make.

Stephan also talks about the objectives and rules that go with each kind of architecture. Here are some topics that can have rules associated with them, each one categorized as macro or micro:

  • Cross-system (macro): responsibilities, UI integration, communication protocols, data formats, redundant data, BI interfaces, ...
  • System-internal (micro): programming languages, development tools, frameworks, process/workflow controls, persistence, design patterns, ...

To take account of time, each set of rules is versioned, with the assumption that two systems on the same rules version can interoperate seamlessly.

At the beginning of a project the objectives are mostly ease of development, homogeneity, simplicity, etc.; as time goes by (and as the artefacts pile up) the preponderant objectives become modularity, decoupling and autonomy. Nice point: Stephan talks about the domain architecture, which is often the most stable one.

Finally, Stephan talks about integration, especially UI and data integration. I leave the UI part aside to focus on data, with these pieces of wisdom shared by Stephan, Greg Young and Dan:

Each time you share something between two components (code or data), you introduce a new coupling

The inverse of « DRY » (Don’t Repeat Yourself) is « Decoupled » !

That's particularly true for data replication: it is often considered bad, but replication permits more autonomy and avoids the synchronous calls that directly impact availability and reliability. The solution is asynchronous: use events to notify interested components of a change.
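
As a minimal sketch of that event-based notification, in Clojure and entirely my own illustration (the component and event names are hypothetical; a real system would publish on a message queue rather than stay in-process): the ordering side emits an event, and the reporting side maintains its own replica asynchronously:

    ;; The reporting component keeps its own replicated view and is
    ;; updated asynchronously via events, never called synchronously.
    (def reporting (agent {:orders-seen 0}))

    (defn publish-order-created [event]
      ;; send is asynchronous: the publisher does not wait for reporting.
      (send reporting update :orders-seen inc)
      event)

    (publish-order-created {:order-id "order-42" :total 99.0})

    ;; a little later, once the agent has processed the event...
    @reporting ; => {:orders-seen 1}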

The second keyword I noticed several times is « entanglement » (actually there were several keywords, but I group them together: cyclic dependencies, braid, intricacies). Reducing dependencies is a point I have been promoting for a long time. A dependency is often added as a micro-level decision, but every time one is added it increases the overall system complexity. I often use Structure 101 to manage the dependencies in the Java systems I work with, and I'm very satisfied with it.

To conclude this first-day report, I quote Rich Hickey, the creator of Clojure:

programmers know the benefits of everything but the trade-offs of nothing