Beyond Open Source

The time has come once again for me to make a professional move. After seven years at Collabora, I’ll be taking a break in January 2024. Life is full of new beginnings, and this is one of them.

As far as KernelCI is concerned, news will be shared on the project’s dedicated blog later on, as things fall into place. Stay tuned!

What I’ve learnt

One of the largest open-source projects out there is the Linux kernel. It was named after Linus Torvalds, who started it, and “Linux” rhymes with the class of operating systems called Unix. As he himself once put it, describing the whole institution around public contributions and code reviews:

It’s a social project. It’s about technology and the technology is what makes people able to agree on issues, because […] there’s usually a fairly clear right and wrong.

Now, anyone who has been exposed to the open-source world will know that there are basically two sides: the community, public, “upstream” side and the corporate, private, “downstream” side. They both interact continuously, but there is clearly a line to be drawn there. The downstream goals are well understood: they’re very similar to those of any product industry using proprietary software. It’s basically about making profits and turning the cogs of consumerism, while occasionally making the world a better place as a side-effect.

The upstream goals aren’t that obvious. This is where I feel I’ve learnt something critical about how it all fits together. The usual narrative is that contributing upstream avoids duplicated efforts downstream, so everyone benefits from it. Many product manufacturers may each be privately impacted by the same bug or missing feature, and a single set of changes sent upstream can solve the issue once and for all. However, nobody can ship a product with a plain upstream Linux kernel, and this also applies to nearly all open-source projects. Why is this? Here’s my humble take on it.

Going back to the earlier quote from Torvalds, the upstream side doesn’t actually care about downstream. The community doesn’t see the industry as its top priority. It’s common to hear developers say that if some code is not upstream, then it doesn’t exist. What matters is that people get to work together on solving issues, because this is what humans are really good at: surviving as a community in the wild. It was carved into our instincts over the millennia by the chisel of natural selection. Add to this the ability to believe in stories and abstract ideas, and you can have an ever-growing group of individuals who have never even met each other physically but all work towards a common ideal.

The industry is just providing a particular type of fire to keep the open-source Olympic torch alight. If upstream were supported by, say, some kind of universal income, decoupled from how profitable the industry was, it would most likely still carry on even without any sustained commercial success. People would still want to keep developing the code, fulfil their roles as maintainers and solve interesting problems together, for as long as the flame of the story held. This is how the Linux kernel started, and it’s probably also how it will end.

Humans and Machines

In the beginning, programming was about setting machines up to perform specific tasks. Like a drum machine: you directly flick a bunch of switches to play particular sounds on each beat of a looping sequence. Then software appeared, somehow abstracted away from the machine but still broadly aimed at the same goals. So far so good.

And then, relatively recently, things started to change. After all the fiction, and amidst all the ongoing buzz, artificial intelligence finally started to get real. It can already write code, soon it will be designing new hardware, and who knows what things might look like a decade from now. What we’re witnessing today are just snowflakes landing on the tip of the iceberg. One could easily argue that the boundaries between software and hardware will get blurred; in fact, it’s already the case with neural networks. Is it software, data, or can it be seen as continuously reprogrammable hardware, much like biological neurons? It’s a spectrum.

If software becomes a thing of the past, and conventional programming languages get superseded by other non-human forms of technology, then ageing developers will be turning into actual hobbyists akin to today’s steam engine enthusiasts. Legal frameworks will also need a full rewrite, or what’s copyleft got to do with it? Can the spirit of Open Source and Free Software evolve fast enough to meet these new challenges? Will society be able to stick to a coherent set of ethics once new forms of intelligence far exceeding human capabilities are omnipresent?

We’re now setting things up, not for the machines to perform arbitrary and predetermined tasks any more, but for them to evolve in one way or another. A bit like ballistics, I suppose: once it’s outside our reach, once the stone has left the arm of the catapult or the rocket has left its launchpad, we can only hope it did so at the right time and with the right acceleration and inclination. If open source can be of any help at all here, it would probably be to put the design documents under public scrutiny before the final countdown.

À bientôt

I once missed a boat even though it was delayed by over an hour; I was a little less late than the boat, but still too late to board. It finally departed, well after I’d arrived at the terminal. So after buying an even later ticket for a night crossing, there I was on the Portsmouth seafront, smelling the smoke of that freaking ferry heading off to naughty Normandy (see actual picture). Leaving that analogy aside, I’m now learning to swim fast enough to catch up with some of the things that haven’t quite taken off yet but that I somehow feel a little bit late for, starting with myself. The gate is still open.

No time was ever wasted, and it’s never too late to be alive.
