Interview

On August 25, 1991, Linus Torvalds, then a student at the University of Helsinki in Finland, sent a message to the comp.os.minix newsgroup soliciting feature suggestions for a free Unix-like operating system he was developing as a hobby.
Thirty years later, that software, now known as Linux, is everywhere.
It dominates the supercomputer world: every system on the TOP500 list runs it, a 100 per cent market share. According to Google, the Linux kernel is at the heart of more than three billion active devices running Android, the most-used operating system in the world.
Linux also powers the vast majority of web-facing servers Netcraft surveyed. It is even used more than Microsoft Windows on Microsoft’s own Azure cloud. And then there are the embedded electronics and Internet-of-Things spaces, and other areas.
Linux has, however, failed to gain traction among mainstream desktop users, where it has a market share of about 2.38 per cent, or 3.59 per cent if you include ChromeOS, compared to Windows (73.04 per cent) and macOS (15.43 per cent).
But the importance of Linux has more to do with the triumph of an idea: of free, open-source software.
“It cannot be overstated how critical Linux is to today’s internet ecosystem,” Kees Cook, security and Linux kernel engineer at Google, told The Register via email. “Linux currently runs on everything from the smartphone we rely on every day to the International Space Station. To rely on the internet is to rely on Linux.”
The next 30 years of Linux, Cook contends, will require the tech industry to work together on security and to provide more resources for maintenance and testing.
Its first 30 years are plainer to see in hindsight, but the past still requires some interpretation. So The Register asked Greg Kroah-Hartman, the Linux Foundation fellow who oversees stable Linux kernel releases, to explain what just happened, and where it’s all going.
The Register: Linux is now about 30 years old and, as I understand it, you’ve been working on the kernel for more than 20 years. Looking at the initial ambitions for the project and at current goals, what has changed the most and why?
GK-H: The original “goal” of “provide a working kernel for everyone to use that works on all hardware” seems to be still our current goal, not much has changed there 🙂
Also, there was our stretch-goal of “world domination,” and that actually happened, so our job is to keep that up 🙂
Seriously, we all just wanted to make a kernel that would work for us, and for others, to get their real tasks done. That is still what we do today, and will be doing tomorrow. It’s not magic, or special, but it is what we like doing.
The Register: What lessons might the Linux project offer to other open source projects?
GK-H: Focus on the technology and providing a solution that is useful for people. But this isn’t a big secret, everyone knows how we do our work as it’s all in the open 🙂
The Register: What’s been the greatest challenge for the project and how have the kernel maintainers dealt with it?
GK-H: We’ve had loads of challenges over the years, mostly all having to deal with how our development model needed to change with the increase in developers and users. We constantly discuss how our current process is going and what can be done to make it better, making lots of small changes over the years that at the time do not seem like a lot, but over the long-run have helped out immensely.
Things like going to a time-based release model (every 2 1/2 months), and always keeping each release stable, and having stable kernels that only contain fixes that are in the current tree are ways we changed to make both the developers’ lives easier, as well as users of the kernel and companies that want to use Linux in their devices.
The Register: How is the exploration of Rust in the kernel going, given its integration so far? Is its adoption within the kernel moving faster or slower than anticipated, and what needs to happen next for more of the kernel to be rewritten in Rust? Is that even the goal, or is it for totally new code to be in Rust?
GK-H: Take a look at [this summary of the latest kernel developer thread about Rust], and the original email thread that is linked to at that page for more details on this topic than you could ever imagine. Also see the “here’s an existing driver in C converted into Rust as an example of how this might all work”, which is a great proof-of-concept for some of the issues involved here.
As you can see, the idea of bringing in a new language with different lifetime rules, into the kernel which already has a set of existing lifetime rules, is going to be complex to say the least.
That being said, the developers working on this are making great progress, but in the end, we will only know if this works if they get their code merged. It’s still a long way off, but they are getting there.
As for “slower or faster” than anticipated, no one “anticipated” anything here. Kernel development does not have deadlines, we merge code “when it is ready,” and sometimes that can take a while to get there based on all of the issues involved. For something like this, it is a complex thing and of course it will take a while. No idea on when that will be, there’s too many unknowns to ever be able to guess.
And as for the goal of rewriting things, no, right now the goal is to just write new code in Rust if at all possible. Replacing existing code is not on the roadmap of these developers, as their announcements say. But that’s not to say that, once support is merged, replacing existing C code would not be possible, only that it is not the original goal here at all.
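For readers unfamiliar with the lifetime rules Kroah-Hartman mentions, here is a minimal sketch, not kernel code, of what makes the two languages' rules differ: the `Device` type and `device_name` function below are hypothetical, but the mechanism is real. In C, keeping a pointer to a freed structure compiles silently; in Rust, the compiler ties each reference to the lifetime of the value it borrows from and rejects a use-after-free at build time.

```rust
// Hypothetical example: a structure standing in for something a driver
// might hand out references to.
struct Device {
    name: String,
}

// The lifetime parameter 'a ties the returned &str to the borrowed Device:
// the reference cannot outlive the device it points into.
fn device_name<'a>(dev: &'a Device) -> &'a str {
    &dev.name
}

fn main() {
    let dev = Device { name: String::from("eth0") };
    let n = device_name(&dev);
    println!("{}", n); // prints "eth0"

    // Uncommenting the lines below would fail to compile: `dev` is dropped
    // while `n` still borrows from it, which C would happily allow.
    // drop(dev);
    // println!("{}", n); // error[E0505]: cannot move out of `dev` while borrowed
}
```

Interfacing code like this with a C codebase whose lifetimes are enforced only by convention and documentation is exactly the complexity the kernel developers are working through.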
The Register: Are there any significant upcoming challenges that you expect will need to be addressed? (In terms of evolving chip architecture considerations, emerging use cases, organizational shifts, or whatever.)
GK-H: We don’t really plan for the future, we only react to what is given to us with changes in code or hardware architectures, so we will handle those changes just like we have in the past.
The Register: Why did Linux succeed?
GK-H: People have written theses about this topic. I think I’m a bit too close to answer it in an unbiased way. There are a few things that I think might be the reason, but others might disagree, so I’ll let everyone else argue about it while I get back to work reviewing code that people submit to us.
The Register: Will there be another Linux-style kernel, a large, vastly deployed open-source kernel produced as a collaborative effort? Writing a fully fledged web browser from scratch, for example, seems unwise and impractical, so does the same apply to Linux? And how does that affect the development direction of the kernel?
GK-H: No one would have thought that a second open source browser could have been done, and yet it was. Who knows if the same thing could happen for an operating system kernel as well?
The Register: Do the Linux maintainers see any other technology projects in a competitive light? (e.g., Google’s Fuchsia effort) If not, why not? And if so, how does that inform decisions?
GK-H: I would love some real competition in operating system kernels. We lost some good feedback loops when we would work with the BSD kernel developers in the past, as most of them went to work for Apple and disappeared. Some of the ideas in Fuchsia look interesting, and I’ve talked with the developers there about some of them.
The nice thing is, if any good ideas do come out of other projects, and they make sense for Linux as well, we can add them to Linux fitting them into our project as needed. It’s not like we all work in a vacuum of ideas, most kernel developers have loads of things they want to change and add and improve if only we can carve off the time to do it.
The Register: Is Linux development more affected by geopolitics/nationalism than it used to be? If so, how?
GK-H: What is the timeframe you…