CraftConf: "The rules of the game have changed"

This week I attended the two-day Craft Conference in Budapest. As the tagline "software craftsmanship matters" suggests, the main theme was craft culture and such, but I discovered another common thread, which was more about how the context of software engineering has changed in the last 10-15 years, and how engineering practices and paradigms are trying (and should try) to catch up.

But before I get to that, I must note that it feels super reassuring to go to a conference on writing, building and operating software and software organizations, and learn almost nothing new. It may sound strange, but a reality check is almost as useful as learning something. I sat through a lot of really good presentations by very smart people, just nodding my head: yeah, that's exactly how we did it too, check, check, check. And I mean talks from people like Jeff Hodges (Twitter) on distributed systems, or Bruce Eckel on what makes a good development process, or basically all the talks on continuous delivery and automation. It's not always obvious, but we are definitely on the right path.

So back to my point, which is basically what Jonas Bonér had on his slide, pictured below: the rules of the game have changed. Here's how:

[Image: Jonas Bonér's slide, "the rules of the game have changed", contrasting the old challenges with the new ones (photo by @keyvanakbary)]

Those in the left column are the challenges engineers faced for many decades, and their practices and paradigms were centered around them. CPU time, memory and disk were scarce, while concurrency was not an issue. A good example is what Douglas Crockford brought up in his talk later that day: Java has 7 different number types (byte, short, char, int, long, float, double), so that we can store everything using the fewest possible bits. Who needs that anymore?
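For reference, here's a minimal sketch of those seven types and their fixed bit widths (the sizes themselves are guaranteed by the Java language spec):

```java
// The seven Java number types Crockford referred to, and their fixed
// bit widths -- a design born of the era when every bit counted.
public class NumberTypes {
    public static void main(String[] args) {
        System.out.println("byte:   " + Byte.SIZE + " bits");      // 8
        System.out.println("short:  " + Short.SIZE + " bits");     // 16
        System.out.println("char:   " + Character.SIZE + " bits"); // 16, unsigned
        System.out.println("int:    " + Integer.SIZE + " bits");   // 32
        System.out.println("long:   " + Long.SIZE + " bits");      // 64
        System.out.println("float:  " + Float.SIZE + " bits");     // 32
        System.out.println("double: " + Double.SIZE + " bits");    // 64
    }
}
```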

[Image: Douglas Crockford on stage]

By the way, I must say here that Douglas Crockford's talk was without doubt the most entertaining one at Craft, and his style and personality are so captivating that you just want to hug him and thank him for being.

Let me put one of his slides here, which is kind of related to what I'm writing about:

[Image: Douglas Crockford's slide]

We don't optimize for memory or CPU anymore, we optimize for response time, because CPU, memory (and bandwidth, and disk etc.) are no longer scarce but abundant, while our systems have hundreds or thousands of interactions a second with millions of users.

Also, CPU performance stopped being a function of how many transistors we shove onto the silicon; it is now a function of the number of cores, which process in parallel. Concurrency is not specific to systems interacting with extreme numbers of users (like web apps), it is everywhere. There's a quad-core processor in Bonér's phone, and as he said, the von Neumann architecture, which assumes a single computational context working with mutable state, is not valid anymore; we need a fundamentally new model if we want to tackle the challenges of concurrency, like complexity.
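To make that concrete, here's a minimal sketch (my example, not Bonér's) of the classic way shared mutable state breaks down on multiple cores: two threads incrementing a plain field race each other and lose updates.

```java
import java.util.concurrent.atomic.AtomicLong;

// Two threads incrementing a plain long race each other and lose updates,
// while an AtomicLong (or, better, no shared state at all) stays correct.
public class SharedStateRace {
    static long unsafeCounter = 0;                       // shared mutable state
    static final AtomicLong safeCounter = new AtomicLong();

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 1_000_000; i++) {
                unsafeCounter++;               // read-modify-write, not atomic
                safeCounter.incrementAndGet();
            }
        };
        Thread t1 = new Thread(work), t2 = new Thread(work);
        t1.start(); t2.start();
        t1.join(); t2.join();
        // unsafeCounter will almost certainly come out below 2,000,000
        System.out.println("unsafe: " + unsafeCounter + ", safe: " + safeCounter);
    }
}
```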

The slides below are from Bodil Stokke's keynote speech, another brilliant presentation, during which I just felt that I somehow want to work with this person, because she's so awesome. First she went through topics related to the main theme of the conference, mentioning agile and TDD and so on, but the talk quickly turned into a case for functional programming once she got to the point that complexity is the reason for bad code. There was nothing new here of course (except for the My Little Ponies - those were new at a software engineering conference); we have been hearing this argument for many years now: in the age of concurrency the functional paradigm is the natural choice, as it almost fully eliminates mutable state, which is the ultimate source of complexity (it makes us deal with the order in which things happen, and that infinitely complicates control flow). Encapsulation of state, while useful in many ways, doesn't help with the real problem; referential transparency is the way to attack the main issue.

[Image: slides from Bodil Stokke's keynote]
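A tiny illustration of the referential transparency point (my example, not Stokke's): a pure function can always be replaced by its result, while a function reading hidden mutable state cannot, because suddenly the order and number of calls matter.

```java
// Referential transparency in miniature: pure vs. impure.
public class Transparency {
    private static int calls = 0;                // hidden mutable state

    // Pure: same input, same output, no observable effects.
    // Any call site could be replaced by the value it returns.
    static int square(int x) {
        return x * x;
    }

    // Impure: the result depends on how many times it has been called,
    // so the order in which things happen starts to matter.
    static int squarePlusCalls(int x) {
        calls++;
        return x * x + calls;
    }

    public static void main(String[] args) {
        System.out.println(square(3) + square(3));                   // always 18
        System.out.println(squarePlusCalls(3) + squarePlusCalls(3)); // 10 + 11 = 21, this time
    }
}
```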

I would actually go further and say that this is not only about evicting mutable state from our application into a transactional database. Storing normalized data and mutating it is a practice of the past, from the days when we optimized for disk. It doesn't matter that it sits in a database instead of local memory; it is still shared mutable state fenced around with locks. In my opinion, storing immutable data is key.
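A minimal sketch of what I mean, in the spirit of event sourcing (all the names here are mine, for illustration): instead of updating a row in place, append immutable facts and derive the current state from the history.

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// Instead of mutating an "account balance" row in place, append
// immutable facts and fold over them to get the current state.
public class AppendOnlyLedger {
    // An immutable fact: once written, never updated or deleted.
    record Deposit(String account, long amountCents, Instant at) {}

    private final List<Deposit> log = new ArrayList<>();

    void append(Deposit d) {
        log.add(d);                      // the only write operation: append
    }

    long balanceOf(String account) {
        return log.stream()              // current state is a pure function
                  .filter(d -> d.account().equals(account))
                  .mapToLong(Deposit::amountCents)
                  .sum();                // of the immutable history
    }

    public static void main(String[] args) {
        AppendOnlyLedger ledger = new AppendOnlyLedger();
        ledger.append(new Deposit("alice", 10_00, Instant.now()));
        ledger.append(new Deposit("alice", 5_00, Instant.now()));
        System.out.println(ledger.balanceOf("alice")); // 1500
    }
}
```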

By the way, it was surprising to me how many times declarative programming came up during these two days. Stokke brought it up too, and in the context of all the above, it makes complete sense:

[Image: Bodil Stokke's slide on declarative programming]
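To show the flavor of the idea, here's a small sketch of the same computation written imperatively (spelling out how) and declaratively (stating what), using Java streams:

```java
import java.util.List;

// The same computation twice: imperative control flow with a mutable
// accumulator vs. a declarative description of the result.
public class DeclarativeDemo {
    public static void main(String[] args) {
        List<Integer> numbers = List.of(4, 8, 15, 16, 23, 42);

        // Imperative: explicit loop, branch and mutable accumulator.
        int sumImperative = 0;
        for (int n : numbers) {
            if (n % 2 == 0) {
                sumImperative += n;
            }
        }

        // Declarative: describe the result and let the library pick the
        // steps -- which is also what makes it trivially parallelizable.
        int sumDeclarative = numbers.stream()
                                    .filter(n -> n % 2 == 0)
                                    .mapToInt(Integer::intValue)
                                    .sum();

        System.out.println(sumImperative + " == " + sumDeclarative); // 70 == 70
    }
}
```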

Tim Felgentreff gave an interesting presentation on a language called Babelsberg, a declarative, constraint-based extension to Ruby and other object-oriented languages. Oh, and he played the very same 1962 video of a program called Sketchpad that was also featured in Bret Victor's legendary "The Future of Programming" presentation at the DBX conference (the best conference talk I've ever seen).

So turning back to the original point, and continuing Bonér's thought that our systems are inherently distributed and we have to start dealing with this and thinking in terms of distributed computing, let me highlight two more talks at Craft. One of them was by a colleague of Bonér's, Endre Varga, a member of the Akka team at Typesafe, who gave an enthralling presentation titled "How to Think Distributed" about how abstract epistemological questions like "does the present exist", "does the past exist" and "is there an objective reality" become practical matters and everyday reality for someone dealing with distributed systems.

[Image: a slide from Endre Varga's talk, with the venue in the background]

(Yeah, the venue was spectacular.)

A friend of mine, who's not an engineer, told me when he saw this picture that although he doesn't doubt for a second that all this makes sense, the slide would hold water perfectly well at some esoteric pseudoscientific session too. This talk was certainly deep, examining some of the core questions of distributed systems, and it was also somewhat academic, but Varga gave the audience a few pieces of practical advice at the end, and those were among the most important conclusions for me at Craft.

Somewhere at the other end of this theoretical-practical spectrum was Jeff Hodges' talk. He's been an engineer at Twitter for more than four years, and everything he knows about distributed systems he has learnt in the trenches, so, as he said, we should listen to him. His talk was dense with practical advice, but interestingly he made almost the same points as Varga did in his theoretical explorations.

[Image: slide from Jeff Hodges' talk]

Hodges talked about failure as one of the key characteristics of distributed systems, and about how metrics are essential for dealing with it. Varga used the term "failure model": something we should have in order to know systematically what is expected to fail in our system, and how; he also stressed that we should never assume reliable communication. The central idea of Varga's presentation was that components of a distributed system have no objective knowledge of each other's state, and Hodges likewise mentioned several times that consensus is not trivial in such a system, and that we should avoid coordination as much as possible.
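To make "never assume reliable communication" a bit more tangible, here's a sketch (names and numbers are mine, not from either talk) of a remote call wrapped in a deadline and bounded retries, where "no answer" is an explicit, expected outcome:

```java
import java.util.Optional;
import java.util.concurrent.*;

// Treat failure as the normal case: every remote call gets a deadline and
// a bounded number of retries, and the caller must handle the honest
// "we simply don't know" outcome instead of assuming a reliable network.
public class UnreliableCall {
    static <T> Optional<T> callWithRetries(Callable<T> remoteCall,
                                           int maxAttempts,
                                           long timeoutMillis) {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        try {
            for (int attempt = 1; attempt <= maxAttempts; attempt++) {
                Future<T> future = executor.submit(remoteCall);
                try {
                    return Optional.of(future.get(timeoutMillis, TimeUnit.MILLISECONDS));
                } catch (TimeoutException | ExecutionException e) {
                    future.cancel(true);    // timeout or error: expected, retry
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    break;
                }
            }
            return Optional.empty();        // exhausted: caller must plan for this
        } finally {
            executor.shutdownNow();
        }
    }

    public static void main(String[] args) {
        Optional<String> reply = callWithRetries(
                () -> { Thread.sleep(50); return "pong"; }, 3, 100);
        System.out.println(reply.orElse("no answer -- and that's a valid state"));
    }
}
```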

So what are the problems and challenges that keep software engineers awake at night and make them scratch their heads? It's not how to fit a program into 32K anymore, or how to use CPU cycles efficiently, or how to represent data so that it takes up the least possible space. It's how to build a network of interoperating, intercommunicating entities that compute in parallel; what the characteristics of such networks are, and how to exploit them. The impact of all the change of the last decade or two is that it has completely shifted the focus of engineers to a new set of problems.

The funny thing is that the tools and practices needed for dealing with this new set of problems have actually been out there since the sixties; we just need to keep re-discovering them.