Sunday 19 October 2014

Denying State

Recently I attended a great talk about Lambda Architecture by Carlos. I particularly liked how information is kept under a tight control of accuracy, a concept that systems like HyperLogLog or Storm with Algebird have embodied for some time. In my JAX 2012 keynote I stated that Eventual Consistency will move towards Hyper-Consistency, which focuses more on perception than on technical isolation. I could very well imagine Lambda Architecture being combined with CQRS, where some parts of the UI are "noisier" (a bit like VoIP scatter, coming from a queues-and-workers implementation) than others (maybe implemented using a functional paradigm). In functional programming as well as in event-driven systems, state is mostly neglected. But sometimes state is necessary, because it occurs in the real world, and addressing it in an immutable manner, by just storing all information, can become costly, slow, inaccurate and insecure. By encapsulating functional logic and immutability where they make sense, and pairing them with a more state-driven, fuzzy approach, an architecture can scale more heterogeneously. The functional parts could be RESTful, the fuzzy ones streams.
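As a side note on controlled accuracy: the HyperLogLog idea trades exactness for fixed memory with a known error bound. A minimal Python sketch of the core estimator (heavily simplified, and not the Storm/Algebird implementation) might look like this:

```python
import hashlib
import math

class HyperLogLog:
    """Minimal HyperLogLog sketch: estimates distinct counts in fixed memory."""

    def __init__(self, p=10):
        self.p = p                     # precision: 2^p registers
        self.m = 1 << p
        self.registers = [0] * self.m

    def add(self, item):
        # 64-bit hash of the item
        h = int.from_bytes(hashlib.sha1(str(item).encode()).digest()[:8], "big")
        idx = h >> (64 - self.p)                   # top p bits pick a register
        w = h & ((1 << (64 - self.p)) - 1)         # remaining 64-p bits
        rank = (64 - self.p) - w.bit_length() + 1  # leading zeros + 1
        self.registers[idx] = max(self.registers[idx], rank)

    def count(self):
        alpha = 0.7213 / (1 + 1.079 / self.m)      # bias correction (m >= 128)
        est = alpha * self.m ** 2 / sum(2.0 ** -r for r in self.registers)
        zeros = self.registers.count(0)
        if est <= 2.5 * self.m and zeros:          # small-range correction
            est = self.m * math.log(self.m / zeros)
        return int(est)

hll = HyperLogLog(p=10)                # 1024 registers, ~3% standard error
for i in range(100_000):
    hll.add(f"user-{i}")
print(hll.count())                     # roughly 100000, give or take a few percent
```

The point is that accuracy becomes a design parameter: more registers buy a tighter error bound, which is exactly the kind of dial a "noisier" part of the UI could be built around.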

But the more complex these ideas become, the more they seem to be used as an excuse for random technology choices - especially in multi-paradigm solutions, e.g. ones based on JavaScript. They get used by people I refer to as Hipster Coders. Although the term hipster has become non-descriptive and contradictory, there is a core to it. You can call it quality or design, but in the end it's simple: not caring. Caring is the opposite of cool; it's boring. Hipster coders don't actually care about appropriate technology, the business context or the user. They care about their personal fun. The hipsters were just the last step in a long history of defaming care. They took away the rebellion - from Hip Hop, from Rock, and last but not least from the Geek and Nerd world - hollowing out every possible idea until it could be mass-customized into an ironic pop-culture t-shirt. Hipsters celebrate obsolescence. Their hedonic need to be "on the cutting edge" (Boym) rather than beyond the edge thinly covers the capitalistic rationale behind it: the perpetual, infinite reformulation of techno-scientific merchandise (Lyotard). Hipster coders use technology like fashion uses materials: as a status symbol with an arbitrarily defined lifespan.

A good example is Microservices, an architecture pattern that gained traction as fast as a cat meme. A mix of SOA, DevOps, REST and Cell Architecture, Microservices build mainly on the availability of automated virtualization and out-of-the-box distributed infrastructure. They are a worthwhile answer to heavy infrastructure blocks that come with corporate processes designed to kill any form of agile solutioning. But they come at the cost of "integration at every level" - a long-term technical debt a Hipster Coder could not care less about. As mobile and social require distributed architectures, this debt is easily accepted in favour of time to market. But once the systems stabilise, operations costs will rise. I agree, versioned services running in their own containers make a lot of sense. As does decoupling code from nonsensical, anachronistic operations onboarding processes. For some components, though, a shared infrastructure simply makes sense. Forcing everyone into a directed messaging model will eventually lead to an uncontrollable network of dependencies. Once the append-only databases start stalling, debugging will be hard. Microservices should be used to get rid of noise in the development cycle, not to isolate your fancy pants from the context of the world.
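To make the versioning point concrete, here is a minimal sketch - the endpoints and pricing logic are hypothetical - of how two service contract versions can live side by side, each free to evolve (or run in its own container) without breaking the other:

```python
# Minimal versioned-service sketch (Python 3.9+, stdlib only).
# Each version keeps its contract stable; new behaviour lands in a new version.
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

def price_v1(item):
    # v1 contract: item and price only
    return {"item": item, "price": 9.99}

def price_v2(item):
    # v2 extends the contract with a currency field; v1 stays untouched
    return {"item": item, "price": 9.99, "currency": "EUR"}

ROUTES = {"/v1/price": price_v1, "/v2/price": price_v2}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        path, _, query = self.path.partition("?")
        handler = ROUTES.get(path)
        if handler is None:
            self.send_error(404)
            return
        item = query.removeprefix("item=") or "unknown"
        body = json.dumps(handler(item)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # e.g. curl "localhost:8080/v1/price?item=book"
    HTTPServer(("localhost", 8080), Handler).serve_forever()
```

In a real deployment each version would typically sit in its own container behind a router, but the principle is the same: old clients keep working while new contracts are introduced next to them.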

Dear Hipsters, maybe you would be better off with a bit of Normcore.

Sunday 3 August 2014

Fog Computing

Cisco has recently been promoting a new term, Fog Computing. It's the application of cloud (IaaS) concepts "to the edge of the network", incorporating ideas such as CDN, SDN, IoT and BaaS.

In building architecture, this convergence between what they call public and private space is much discussed and a main feature of contemporary buildings - for instance the Blur Building, which has fog instead of a boundary (or edge, for the sake of the argument):
[Image: the Blur Building, (c) Diller & Scofidio, via Designboom]
What began in Learning from Las Vegas as a discussion about buildings that have a flexible interior but a very clear, black-box exterior has moved towards a view of information flows alongside people flows in cities (see Trüby, Lynch or the Quantum City). The bottom line: from the smallest spatial piece of information - where we are located, how we experience space - to the urban landscape evolving over time, the same patterns emerge. There is no clear distinction between inside and outside, static and dynamic; it's a Matryoshka that unfolds at different speeds.

Building on the ongoing Microservices discussion, this means that Microservices, or Fog Computing, won't solve the complexity. They are just in the same state as OOP, CASE tools or dynamic languages were in the beginning - before they hit the Norris wall. Our systems will grow in complexity, and while it's great not to have to worry about heterogeneous platforms, we need to understand them nevertheless. Scalability concepts still need to be understood from the bottom up, in combination with the real-world problems they are trying to solve. They can be masked away by high-speed data synchronization concepts between devices and algorithms, but eventually they will pop up as technical debt when change happens. It's the story of the magic Excel sheet - which turns into software and then back. Just like a Matryoshka.

The Blur Building was not really comfortable - it's pretty wet. Many edgy contemporary buildings with "Open Space" concepts are not comfortable either. They seem to give maximum flexibility in a simple concept but are actually just a mask around technical debt. Which is fine, as long as that is clear to the user. It won't make IT easier or more commoditized; in fact it promotes the opposite: having to pay for highly specialized experts when you can't go back and are stuck with a tool that once seemed easy enough. This situation occurred in the past as well, but it was covered by new needs - client-server systems, GUIs, the Internet, Mobile and Omnichannel. If there is no new paradigm around the corner, many users will sooner or later realize the technical debt they finally have to pay off.

Let's see which technologies will be chosen then, when they finally need to be harmonized with real change - when the users realize they work for the computers. I hope it's simple components that need a fair bit of understanding, and not oversimplified magic bullets. The more important change in IT should be a culture that embraces problem solving, diversity and organization rather than fancy facades.

[Video: "The Next Challenge of the Web is Us" by Christian Heilmann]

Sunday 11 May 2014

How doctors and pilots verify

The current discussion around Test-Driven Development (see here, here and the Hangout) seems a bit far-fetched to me, as I have rarely seen such dogmatic applications of it, but some aspects of it are indeed interesting.

Mocking clearly has its limitations (whoever has used Spring or O/R mappers knows the uncanny feeling of mocking tons of Spring internals). What I find more interesting, though, is the cultural discussion of pre-planned versus experimental approaches. In the Hangout, Kent Beck makes the point that he defines the problem and incrementally finds a solution, whereas Hansson does not even define a problem but rather prototypes, testing something directly with an "owner". From their examples it looks like Beck usually solves algorithmic problems, whereas Hansson seems to face interface and integration issues. Fowler seems to see advantages in both and tries to reconcile them without taking a strong position.
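To make the mocking point concrete, here is a small sketch (OrderService and its repository are hypothetical stand-ins for framework-heavy code) contrasting a mock-heavy test, which largely re-states its own setup, with a direct test of the extracted rule:

```python
from unittest.mock import MagicMock

def total_with_discount(prices, threshold=100, rate=0.1):
    """Pure business rule: 10% off once the total exceeds the threshold."""
    total = sum(prices)
    return total * (1 - rate) if total > threshold else total

class OrderService:
    """Thin service delegating to a repository - the kind of layering
    that tempts you into mock-heavy tests."""
    def __init__(self, repo):
        self.repo = repo

    def checkout(self, order_id):
        return total_with_discount(self.repo.find_prices(order_id))

def test_through_mocks():
    # Mock-heavy style: mostly verifies the wiring we just set up
    repo = MagicMock()
    repo.find_prices.return_value = [60, 70]
    assert OrderService(repo).checkout(42) == 117.0
    repo.find_prices.assert_called_once_with(42)

def test_pure_logic():
    # The same rule, tested directly and without any mocking
    assert total_with_discount([60, 70]) == 117.0
    assert total_with_discount([10, 20]) == 30
```

The extracted function is where the actual behaviour lives; the mock-based test mainly pins down plumbing, which is exactly where mocking starts to feel uncanny once the plumbing belongs to a framework.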

The more interesting underlying question is where verification is required and how it is achieved. This reminded me of a chat I recently had with two friends, a doctor and a pilot. The pilot explained how, nowadays, conflict resolution between the various (social and technological) subsystems of a plane, including between the various autopilot systems, is a major part of the training. The human mind becomes the reason between algorithms (an argument Lehrer has already written about; also see Flash Boys). Understanding feedback cycles and how systems interact is of course an important tool for resolving conflicts. The doctor added that, interestingly, this systemic understanding is becoming more and more relevant in her work too. She went on to explain that, just like software engineering, medicine has a discussion about whether it's more of an art or more of a science. In her opinion, which I liked very much, medicine is an art based on science*.

Software engineering could be the same: an art based on science. Hypothesis-driven (i.e. scientific) validation in the form of TDD would therefore be required for the scientific part: the algorithms and processes. Empirically driven (i.e. psychological, social) validation would be required for the arts part: the composition, experience and elegance. The latter would be hard to test; it would probably need to be monitored and replayed in simulations or regression tests, or at least documented, in order to be preserved for future generations of programmers. Clearly defining which part of a software system belongs to which domain might end many of the discussions, and bring back reason.
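A minimal sketch of what that split could look like in practice - the names are purely illustrative - with a hypothesis-driven test for the algorithmic part and a recorded snapshot for the experiential part:

```python
def median(xs):
    """Algorithm: a precise hypothesis that can be verified up front."""
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

def render_report(stats):
    """Composition/experience: there is no single 'correct' answer."""
    return "\n".join(f"{k}: {v}" for k, v in sorted(stats.items()))

def test_median_hypothesis():
    # Scientific part: assertions follow from the definition itself
    assert median([3, 1, 2]) == 2
    assert median([1, 2, 3, 4]) == 2.5

# The previously approved output - in practice recorded once a human
# has accepted it, then kept under version control
EXPECTED_REPORT = "errors: 0\nusers: 10"

def test_report_snapshot():
    # Arts part: a failure means "the experience changed", which calls
    # for human review rather than signalling a wrong algorithm
    assert render_report({"users": 10, "errors": 0}) == EXPECTED_REPORT
```

The two tests fail for different reasons: the first because a hypothesis was falsified, the second because an accepted experience drifted - and only the first can be judged without a human in the loop.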

*) Apparently, this quote originates from William Osler

Sunday 20 April 2014

Simplicity Coach

Recently I had a chat with colleagues from a very complex global omnichannel project, rebuilding a huge solution from scratch. The end-user was only supposed to notice performance improvements and a nicer user experience - no service interruption. The colleagues involved a service design firm in the process. Although the project was complex, it was so from a technical point of view; business processes would only be changed afterwards. Hence I asked the guys: why involve a service design firm?

Their answer: We needed a Simplicity Coach.

You will find this term used in many ways around the web - in fact there has been a true simplicity hype in recent years (e.g. Maeda, Küstenmacher). What struck me, though, was how informally they used it as a standard role. The service design team was part of the architecture team; in fact it was their daily business to challenge each other's ideas - the architects from a quality, scalability and maintainability perspective, the design team from a simplicity perspective. The architects asked: "Can it be done safer?" whereas the service designers asked: "Can it be done easier?". Together they achieved true resilience, building an architecture based on small, versatile services, yet with a strong vision, common understanding and buy-in from everyone in the project.

Tuesday 18 March 2014

Permanent Crisis

Maybe it's just my selective perception, but it seems that in the last few weeks there has been a burst of interesting IT industry topics that covered more than just technology.
  1. Uncle Bob's move towards more governance, and how meritocracy might not be the right solution if myths and stereotypes persist
  2. The question of whether estimation is evil, whether we should stop using the word Agile, and why software development still sucks
  3. Fowler's Microservices and the "history repeating" argument about SOA
Despite the obvious lack of reason in most of the discussions*, for me they also have something else in common: they are all about trends that a new generation of coders has embraced particularly dogmatically over the last 5-10 years, and which now turn out to be no silver bullets after all. It's as if all the hype cycles ended up in the Trough of Disillusionment at the same time.

Are we still in a software crisis?

At Baruco 2012, Paolo Perrotta wittily diagnosed: Software Crisis? You cannot be in crisis for 20 years!

It's been 54 years.

Maybe software development just doesn't really get better? Maybe we are just moving with the real world, like everyone else - not "special" or "visionary" by definition, but actually just normal people with (or without) a degree in computer science, influenced by the same socioeconomic value shifts and biases? Maybe, as Software Is Eating The World, software just becomes as chaotic as the world itself?

Yes, the field improves, but it's a little signal in a lot of noise. I am currently writing a book on building architecture versus software architecture, which is why I haven't been blogging recently. It's not that I want the ideas to sell - no, a book does not bring you much money. But I see that I have to think ideas through, reading stacks of books and old notes dating back 5 years, re-reading articles and posts. And I see some improvement while doing this. But I also see many themes repeat over and over again. Sadly, instead of converging on common sense, views seem to become more dogmatic and drift further apart. It's always the same pattern (the Excel Pattern):

Someone has a simple idea. Followers who believe in control systems and analytic engineering turn it into a methodology. Someone else sees the methodology and has a simpler idea. Then, both the old idea and methodology are declared evil.

As long as we are not working on the core biases and expectations towards software engineering and architecture, this vicious circle will never stop, and we will never get to real improvement - in technology as well as in team building, fairness and business value. Maybe this will thereby turn IT into an art. That's the risk we should be willing to take.

My current book stack - wish me luck:

[Photo of the book stack]

*) Reason is slowly returning, though - and the original posts were mostly reasonable.