Navigating the IT complexity spiral

From computers to cars to Kalashnikovs – what we can learn about 'over-engineering'.
20 January 2020

Technology used to be a lot simpler. Source: Shutterstock

Everything is getting more complicated. There's an increasing amount of complexity being built into every single product and service we touch. The intention is to evolve products and make them better, but is that always good for the end user?

Most of us wouldn’t draw a direct parallel between computers, cars and Kalashnikov rifles, but let’s look at why they share a common development trait… and then consider what that means for the complexity conundrum.

Mikhail Kalashnikov designed what became the world's most pervasive automatic rifle around a concept of design simplicity. Although horrifically deadly, the AK-47's 'beauty' (for want of a better word) stems from its low cost of production and its simplicity to manufacture and maintain.

If an AK-47 could be stripped down and reassembled in under a minute (and you can go and look for the YouTube videos to your heart’s content), then the core design basics of the product must have been fairly logically and sensibly engineered in the first place.

But even automatic rifles have been subject to technology-based disruptive change. Newer models seek to give us a lot more bang (for want of a better word) for our engineering buck, should we decide we need to purchase a firearm.

From Kalashnikovs to cars

As anyone who remembers the 1960s (or is still familiar with the cars of the era) knows, automobiles used to be built with that same kind of basic engineering approach. 

Although the internal combustion engine itself may still have been beyond the scope of most home mechanics, many people would change their own oil, replace their own brake discs or at least perform some tinkering under the hood to keep their motor running.

That kind of hands-on tuning in cars is all but gone. According to Visual Capitalist, “With the advent of sophisticated, cloud-connected infotainment systems, the car software in a modern vehicle apparently uses 100 million lines of code.”

For many people this is a wonderful thing: it brings intelligent safety controls to us while we are driving, puts GPS satellite navigation systems on our dashboards to ensure we never get lost, and even tells us which tire needs more air pressure at any given moment.

For others, this is over-engineering and its complexity means that humans are kept out of the engineering loop… so when a problem occurs, it’s harder to fix.

From cars to computers

The parallel throughout here is what has happened to computers, which also started from a similarly straightforward base of design.

In computing, the core design arrangement is called the Von Neumann architecture. In this design specification the memory and processor are separate, meaning that the act of computation requires data to be moved back and forth between them.

Other elements of the Von Neumann computing machine are similarly clearly demarcated: there's data storage, input/output and memory. In other words, you know where to put the oil in and change the tires.
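That separation of memory and processor can be pictured with a toy sketch. This is purely illustrative (the names `memory` and `alu_add` are invented for the example, not any real hardware API): every computation means loading operands from memory into the processor and writing the result back.

```python
# Toy model of the Von Neumann layout: memory and processor are separate,
# so each computation shuttles data across the boundary between them.

memory = {"a": 3, "b": 4, "result": None}  # the separate memory store

def alu_add(x, y):
    """The processor's arithmetic unit: it only sees operands loaded into it."""
    return x + y

# Each step below is a trip across the memory/processor boundary.
x = memory["a"]           # load operand 1 from memory
y = memory["b"]           # load operand 2 from memory
total = alu_add(x, y)     # compute inside the processor
memory["result"] = total  # store the result back to memory

print(memory["result"])   # → 7
```

Three of the four steps are data movement rather than computation, which is exactly the overhead the next section comes back to.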

But, as in Kalashnikovs and as in cars, things have changed.

The rapid development of data-driven AI algorithms means that in some cases, the hardware of the computing device itself has become the bottleneck for the execution of advanced algorithms. The next advancement then is in-memory computing.

Widely believed to be the gateway to the next generation of complex Artificial Intelligence (AI), in-memory computing breaks with the Von Neumann architecture: memory and processor are fused together, and computations are performed where the data is stored, with minimal data movement.

This means that computation parallelism (the ability for a machine to do two or more things concurrently) and power efficiency can be significantly improved. 
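A back-of-the-envelope sketch shows why minimal data movement matters. The transfer counts here are a simplified assumption, not measurements of any real chip: under a Von Neumann model, summing a vector means moving every element to the processor, while an in-memory model computes where the data lives and only moves the final answer.

```python
# Hypothetical transfer-count comparison for summing a 1,000-element vector.

data = list(range(1000))

# Von Neumann model: every element crosses the memory/processor boundary,
# plus one transfer to store the result back.
von_neumann_transfers = len(data) + 1

# In-memory model: the sum happens where the data is stored; only the
# final result crosses the boundary.
in_memory_transfers = 1

print(von_neumann_transfers, in_memory_transfers)  # → 1001 1
```

The gap widens with the size of the data, which is why data-hungry AI workloads are the motivating case.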

To autonomy… and beyond

So computers are being reinvented with a new degree of hands-off complexity that human beings will be further separated from. This should be good news, but only if those systems stay up and running when we need them. With the prospect of even more complex quantum computing on the almost-visible horizon, things are about to get even more complex.

The technology industry (perhaps somewhat predictably) thinks it has the answer, and it comes in the shape of autonomous controls, which themselves stem from AI in the first place.

Autonomous computing means we can predict when databases need patching, when mobile devices need updating and when system maintenance will be required, and we can do all of it without human input.
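At its simplest, that kind of prediction is a risk model that flags a system for maintenance before anyone asks. The sketch below is entirely hypothetical (the function, its inputs and the threshold are invented for illustration, not drawn from any real autonomous platform):

```python
# Hypothetical autonomous-maintenance check: flag a system for patching
# when a naive predicted risk score crosses a threshold, no human input.

def needs_maintenance(error_rate: float, uptime_days: int) -> bool:
    # Toy risk score: errors and long stretches without maintenance
    # both push the score upward.
    risk = error_rate * 10 + uptime_days / 365
    return risk > 1.0

print(needs_maintenance(error_rate=0.02, uptime_days=400))  # → True
print(needs_maintenance(error_rate=0.01, uptime_days=30))   # → False
```

Real autonomous systems replace the toy formula with learned models, but the shape is the same: a prediction triggers the maintenance action automatically.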

In theory at least, the future is still bright. Well, until someone needs to go in and manually change the oil on an autonomous cloud computing AI engine, right? 

Don’t worry, it’ll never happen… probably.