'It's hard to find maintainers': Linus Torvalds ponders Linux's future
For me, this is one of the most interesting aspects of the development of Linux, and one I've been studying for years at various levels.
Even though I don't understand a lick of most of the code, watching the LKML and the process itself is quite interesting. However, there are a few ways to try to put the kernel in a better position for the future.
Each major release can be considered a "story". Take, for example, the journey of WireGuard from inception to submission to the kernel: that's an entire process, involving many, many steps and employing any number of tools; in-person meetings, email, VoIP, Slack and anything else you can think of.
Conversations that turn into code are a vital part of kernel development, but one that I feel is less well known (after all, from the outside we just see code shuffling about git repos).
Each bit of code possibly represents hours of work and many conversations, and yet git cannot (and does not) preserve the history of how code reaches Linus's tree.
So, how can we improve this situation?
Document, document, document!!
I'd love to see someone like Greg Kroah-Hartman, David Miller, Stephen Rothwell et al document, extensively, how they do their work.
I'm talking about high resolution: screen captures, text, images, audio and anything else that we can preserve going forward, perhaps all neatly tucked away in a git repo and backed up many, many times.
Seriously, we're at risk of losing a vital understanding of how people do their work, especially those who are core to the kernel. Of course, people may develop new methods, but I feel the kernel is quite mature and its processes are, in theory, quite stable too (such as how Greg actually releases a stable kernel once a week).
Of course, not everything can be documented; software ages, and someone's favorite email client may not be transferable, but the general process can be.
So yes, this is something to think about and prepare for, otherwise it may hit the kernel quite hard.
I always wonder why most projects don't have an onboarding process where you get in touch with the maintainers and discuss the roadmap and how to help, even if you personally don't have a specific issue you want to work on. Personally, I submit pull requests to open source projects when I need them and then leave once the work is complete. That's basically the opposite of what it takes to retain maintainers. Maintainers usually write code for the sake of others, not for themselves, and they stay for much longer periods of time.
These are some very good suggestions. I think they should take note of how the Kubernetes project operates, with its various interest groups, Slack channels, and meeting notes plus their recordings on YouTube. I think those channels capture a large part of the history of the development. I'm not sure if Linux uses anything beyond mailing lists.
I think the heart of what you're getting at lies in the world of ethnography. Perhaps a cultural anthropologist could take a hand at documenting the process of developing for the Linux kernel, its rich history, and all the minutiae that go into it.
Read up on the field of Science and Technology Studies (STS). STS researchers have been doing this kind of work since the 1970s.
More specifically, check out the work of Matt Ratto, who defended his 2003 dissertation on this exact topic: https://bit.ly/2YQL1yf
A few points irk me on "the beginning of the end" conjecture in modern technology, rather than focusing on "as one door closes, another one opens" philosophies that keep it all going.
If I were to do coffee with Linus, I would tell him to fork the kernel and cleancode a kernel for the future (and take the reins), while letting the complexities of the current kernel continue to flourish in its present form (possibly seen as letting the rope go in the middle of a heated tug-of-war, which needs to happen on a public stage more often).
The "between the lines" story on Linux over the years looks like [they] have been thrust into a role of placating the miserable instead of writing brilliant code (it's happening outside of tech too).
As far as switching gears to salaried maintainers, OSS could start an ISP (a core function), similar to AOL, and be a hub for accessing the fruits of their labor. It would be like bringing the Earthlink/Mindspring 110% support model back to life (a reputation for being stewards of all open tech, in addition to top-notch support).
Just having an ad-free network as an option would be worth the price of admission.
Linux is successful because of "WE DO NOT BREAK USERSPACE". It is Linus's policy, and his work with the community, that made it the most popular OS.
I believe the way to "next generation" while keeping the Linux userspace compatibility promise is basically gVisor running on a non-Linux kernel.
If it runs your container just as well, well, it's "Linux" as far as your app is concerned.
Case in point: 1% of Google Cloud Run could just as well run on Fuchsia today and you just wouldn't know. All you see is the inside of the gVisor sandbox -- yet at the same time it'll run pretty much any http-serving docker image.
(Of course, in the real world Linux rules because of its huge collection of drivers, filesystems, etc.)
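For anyone curious how gVisor slots in underneath an unmodified container: registering its runsc runtime with Docker is a small, documented config change (the binary path below is the common install location, but yours may differ):

```json
{
  "runtimes": {
    "runsc": {
      "path": "/usr/local/bin/runsc"
    }
  }
}
```

With that in /etc/docker/daemon.json (and the daemon restarted), `docker run --runtime=runsc <image>` runs the container against gVisor's user-space kernel rather than directly against host syscalls, which is what makes the "you wouldn't know what's underneath" argument plausible.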
Don't quote me on this, but I hear:
creeps like other users' data
I think the next billion uses (not users) of the linux kernel will be on bare metal (or bare SOCs, wtcmb), so storage yes, but no networking or cloud required for optimal operations.
Well, with all of the swinging C.o.C. puns available, I'll tread safely and just point out L.T. never seemed to be about compromising quality over making people happy. Solid state does not have squeaky wheels... Something's amiss here...
I think Linus is at fault here. Having a PR process that requires sending patches over email, with threads handled by ancient mailing-list software, instead of the more modern workflow of forks and PRs in a web UI, is tedious and annoying. Making kernel development and maintenance more accessible to younger people would increase the pool of potential maintainers.
Keep in mind that Linux is a 25+ year old project with established practices which might well outlast fads of web-driven development that change every few years. Yes, GitHub and friends provide a friendlier interface for newbies (and might well be worth considering seriously), but note that there is almost no open source project on the same scale of collaboration complexity as the Linux kernel, and it is far from obvious that it would be well served by popular platforms. Not to mention the whole BitKeeper SNAFU that came out of becoming dependent on a platform they did not control; they would be very reluctant to put themselves in such a vulnerable position again. GitHub can decide on a whim to de-platform certain projects/customers/users (and has demonstrated a willingness to do so).
There's no need to rely on a third-party provider given that it's possible to host, say, GitLab yourself. For a project the size of the Linux kernel, that's more than a worthwhile investment.
In this context it’s wise to move conservatively. Gitlab has “come of age” only in the last few years. We see that KDE is now moving. If that works well, I wouldn’t be surprised if other OSS projects slowly start migrating.
Remember that there is heavy organizational inertia and culture on the LKML. Moving to GitLab is practically akin to gutting that and starting a fresh project with forked code: the kind of cultural makeover that most companies don't survive, and won't attempt unless forced to. And maintaining the founding culture is even more crucial for volunteer-driven projects than for corporate projects with very different incentives.
I really don't see a reason to not use both. Even right now, a maintainer of some subsystem could conceivably create a repo on Gitlab* and accept PRs there, which they could then (rebase-?)merge in and sync with git.kernel.org with Linus never having to know about it.
The problem is cultural, as well as organizational. Firstly, I doubt any of the current maintainers would be willing to do that, especially without Linus's blessing and secondly, without a fully managed Gitlab* instance that takes care of sync and whatever else automatically (provided by the Foundation), this would be an unreasonable burden on the maintainers.
> I really don't see a reason to not use both. Even right now, a maintainer of some subsystem could conceivably create a repo on Gitlab* and accept PRs there, which they could then (rebase-?)merge in and sync with git.kernel.org with Linus never having to know about it.
FWIW that's already happening with the BPF subsystem https://github.com/libbpf/libbpf
Or Gitea!
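Mechanically, the dual-remote setup described above is nothing exotic; it's just a second git remote. A minimal local simulation (two bare repos standing in for git.kernel.org and the GitLab mirror; all names and paths are invented for illustration):

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"

# Two bare repos stand in for the remotes:
git init -q --bare canonical.git   # plays the role of git.kernel.org
git init -q --bare mirror.git      # plays the role of the GitLab mirror

git clone -q canonical.git work && cd work
git config user.email "maint@example.org"
git config user.name  "Subsystem Maintainer"
git remote add mirror ../mirror.git

# A change merged locally from (say) a GitLab MR:
echo 'obj-y += demo.o' > Makefile
git add Makefile && git commit -qm "demo: hook up build"

# Push the identical history to both remotes:
git push -q origin HEAD:refs/heads/master
git push -q mirror HEAD:refs/heads/master
```

The maintainer's local tree stays the single source of truth; both remotes simply receive the same commits, which is essentially what the libbpf mirror mentioned above does.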
I'd put LLVM/Clang, Chrome, and AOSP into those categories. They all have more modern development tools than e-mailing patch-sets around.
I'm often skeptical about how many people really have a potential future as motivated long-term participants but are turned off by a few bumpy roads at the onset. And I don't just mean in kernel development; learning the guitar isn't fun either.
Who gets into software as a profession, and then lets a bad UI turn them away from a project? When so much of the valuable skillset in programming is understanding and navigating a possibility space where the UI isn't built yet? (And I'm sure some kernel contributors would argue it's more _unusual_ than actively _worse_).
I know this paints me as old and out of touch. I know this. But I think it still applies. When I look at my interns, the good developers tend not to be the ones tripped up by onboarding to systems and workflows that aren't exactly like the trendy development tools or what they saw in school.
I don't think this is an entirely fair comparison. I think it's closer to the Post Office worrying about finding enough new drivers for its mail routes while refusing to upgrade any of its delivery trucks from a manual transmission to an automatic one.
Sure, you can argue that learning to drive a standard isn't that hard, and if you want to do a valuable public service like delivering mail, then learning to drive one is a small price to pay. But it's a barrier to entry for most young drivers because automatic transmissions are far, far more common. Besides, the constant starting and stopping a mail truck does is not only usually the hardest part of driving a manual, it isn't even fundamentally coupled to the act of delivering mail.
There are plenty of competent drivers who might be willing to deliver mail using an automatic (or developers willing to maintain the kernel on a platform with a decent UX), and that kind of process overhead isn't something that should just be waved away.
> automatic transmissions are far, far more common
The Post Office example doesn't work for European or African postal services.
If there were a better way to learn guitar, everyone would jump on it.
Software doesn't have to be so difficult and tedious to use, as a decade of UX work in the UI space has demonstrated. Nowadays, nobody would ever consider a gordian knot for their video editing, and I don't blame them.
This idea that one must suffer to learn is as outmoded as alchemy. You don't need to take the good with the bad; learn from the bad and make more of it good. I lived through the old times, and they sucked!
Nowadays, if a developer doesn't give any thought to UX, people won't give any thought to their creation, and rightly so. The days of cryptic incantations and tedious rituals are largely over.
> If there were a better way to learn guitar, everyone would jump on it.
This type of assertion is made in a lot of contexts, but I don't believe it holds true. I believe that there are a subset of people who are willing to learn a new skill. Of those people, there are very few who are willing to learn a new skill but won't because of some perceived barrier to learning it. The vast majority of that subset will learn the new skill despite the perceived barrier.
Of the people who aren't in the set of people who are willing to learn a new skill, there will be very few who will be motivated to learn the new skill just because the perceived barrier to learning it has changed in some way.
I agree. Bad UI is not a barrier. After spending three weeks with FreeCAD I got used to all the weirdness; the hardest problem was finding good teaching resources and actually going through them. All you really need is a clear path for people who want to become kernel maintainers. If you rely on random luck it's obviously going to be difficult, and nowadays there are more closed platforms than ever, so fewer people even get to throw their dice.
> Making kernel development and maintenance more accessible to younger people would increase the pool of potential maintainers.
Is there an established open source project out there that originally used a mailing list or some other review software like gerrit, reviewboard, or phabricator, that transitioned to Github or Gitlab and actually increased the number of contributors and maintainers?
I could be wrong, but ReactOS?
I suspect he also deserves some of the "blame", though maybe not for the reasons you list. He has not exactly made the Linux kernel community a warm and welcoming place historically. Many people can probably think of more fun things to do than get publicly browbeaten by Linus.
Surely you jest. Compared to digging around in the kernel and device drivers, the current PR process is probably the most user-friendly and simple-to-use thing that kernel developers deal with in their work.
Of course, Linus wrote git, so it's not as if he's against modern tooling. There are kernel forks, of course; I think not using a web front end is partly due to reluctance to become dependent on a commercial service that Linux people don't control, and partly because they like to keep their hands in the mud.
The flip side is that there is something to be said for the kernel not jumping on whatever platform is hot this month.
Plus I suspect the crowd capable of contributing meaningfully is quite comfortable with low level old school tech.
Modern does not mean better. Email-based collaboration is most welcoming to new people and good enough for the regular participants. I for one do not want to create an account and learn how some slow web application works for each project when I can do with sending an email.
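Concretely, the email flow being defended here is just git's built-in patch tooling. A self-contained sketch (the repo contents and addresses are made up; the final send-email step is shown as a comment because it needs a configured mail setup):

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q demo && cd demo
git config user.email "dev@example.com"
git config user.name  "Demo Dev"

echo 'int main(void) { return 0; }' > stub.c
git add stub.c
git commit -qm "stub: add minimal entry point"

# Turn the latest commit into a mail-ready patch file:
git format-patch -1 --stdout > 0001-stub.patch
grep '^Subject:' 0001-stub.patch

# In the real workflow you would then run something like:
#   git send-email --to=some-list@example.org 0001-stub.patch
```

No account, no web UI: the patch is a plain text file any mail client (or `git am` on the receiving end) can handle, which is the portability the parent comment is pointing at.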
> Email-based collaboration is most welcoming to new people
What? Aren't new collaborators way more likely to be younger and acquainted with Web-based tools like GitHub/GitLab?
New wannabe collaborators maybe. But does the project want to attract only young people who don't care about corporate control creep over free software? Or does it want to attract all able young people including those who understand the benefits of the email based collaboration, which is being the only convenient, decentralized, non-corporate-controlled and the most equitable way for strangers to communicate over the Internet?
Interesting point of view:
> Is C, the language the kernel is for the most part written in, being displaced by the likes of Go and Rust, such that there is "a risk that we're becoming the COBOL programmers of the 2030s?" Hohndel asked. "C is still one of the top 10 languages," answered Torvalds. However, he said that for things "not very central to the kernel itself", like drivers, the kernel team is looking at "having interfaces to do those, for example, in Rust... I'm convinced it's going to happen. It might not be Rust. But it is going to happen that we will have different models for writing these kinds of things, and C won't be the only one."
For people asking why the mailing list, among other things: GKH did an AMA session[0] a couple of weeks ago, worth reading before jumping to conclusions too quickly.
0: https://www.reddit.com/r/linux/comments/fx5e4v/im_greg_kroah...
If Linus wants a maintainer he can just sign kernel maintainership duties over to the entity that will end up with them anyway: Red Hat.
Why would Red Hat want to maintain all of kernel subsystems? They care about some of them, not all.
This kind of reminds me of a similar article I saw, saying that Ruby needed maintainers/contributors.
I contacted the Ruby development team, and was told: "find something to improve, maybe do an optimization" with no further guidance. So, I moved on to other things.
'It's hard to find maintainers' (that will work for free)
> Apple is now likely to deliver the kind of Arm-based machine Torvalds has been waiting for. ®
But the question is, will it run Linux® ?
(Serious reply to joking question...)
No, the bootloader is locked.
Microsoft will come to the rescue
Ahaha. Let me correct that.
Microsoft has arrived and is rescuing Linux.
"rescuing"
Like the US “rescued” Iraq.
I thought that Google did it first.
Speaking of! Any Android devs interested in helping maintain/build out planned features for:
GrapheneOS (most secure AOSP variant), the Vanadium browser, and Auditor (attestation.app).
Get in touch by commenting here, or GrapheneOS.org