Open Letter to Debian and Ubuntu Developers
linas.org> I mean, I have a really rather high IQ (just look at the web page below), and I have patience that is perhaps unmatched.
I'm guessing you didn't write this, but don't write this.
One, your IQ is irrelevant; it's an outdated way to measure anything practical. Two, patience is relative (and vague). Three, it makes you sound like a conceited jerk.
> We need to figure out what is going wrong, not just at the technical level, but at the social and political level
This is not constructive criticism.
All in all this post is one of many examples of non-constructive criticism that plagues the open-source world. We know it's supposed to just work. We know it doesn't. Rants don't add value.
By the way:
> Its supposed to be a holdiay weekend. I'm not being paid to run these servers.
Then don't. What are you even doing? You're either working for free or expecting others to, both of which I believe to be a detriment to the free software cause.
>your IQ is irrelevant, it's an outdated way to measure anything practical
This is a common misconception. IQ predicts a lot about a person's life outcomes; see https://en.wikipedia.org/wiki/Intelligence_quotient#Social_c... for a basic overview.
It is usually a faux pas to bring up your IQ online but that is not because IQ lacks predictive power.
Do any of those studies address the predictive power of IQ with regard to online discussions of the state of Linux?
Yes, the result known as the positive manifold says that high performers in one area will be high performers in other areas.
Why is IQ irrelevant? It certainly helps when trying to understand complex systems. And the gentleman's accomplishments are consistent with high IQ so at least to me it does not sound like hollow bragging. More like making it plain that stupidity is not the cause of his troubles.
It's a counter-signalling thing. People who do very clever things and brag about their IQ are generally less intelligent than people who do very clever things and don't brag. If you have to brag, then you're worried about getting confused with people who aren't as clever.
IQ is a copout as a whole. There is nothing good that comes from trying to measure something as highly subjective as intelligence.
Great you're smart on paper, now what? Maybe you get easily frustrated, maybe you aren't very dedicated, maybe you don't like things that aren't inherently interesting, etc. etc.
Use something objectively measurable. Be scientific. Don't be a copout.
The curse of Free Software. You are root. You can do anything on your own machine. This means that you have the freedom to break it.
The problem with traditional distributions today is that the packaging gives you something, but you are expected (particularly on a server) to take that packaging and then modify your system configuration to suit.
Unfortunately, it's then impossible for the packaging to cover every case on upgrade, since packaging can't possibly know what you did. Suddenly upgrade path code has to magically cover every conceivable use case and more. This route is doomed to fail.
Distributions are working on this problem, but it requires a huge paradigm shift that will leave behind traditionalists kicking and screaming. Take Ubuntu Snappy, for example, with its read-only filesystem and image-based updates. But this sort of thing is the only sensible way forward if you want upgrades to work without failure.
Alternatively, as a user you can take a different course. Make your deployment "immutable", and manage any necessary state independently. In other words, define your deployment as a delta to be applied over the distribution default. If things go wrong, don't try to recover; instead blow it away and redeploy. Manage your delta in version control, and make it easy to test and deploy. This is what the configuration management crowd are doing, as well as the Docker crowd, and just about everyone else.
Of course, this is a little crazy on a desktop system, which is why Ubuntu is doing Snappy.
Or, stick to doing things the traditional way, but don't expect your experience to change.
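The "deployment as a delta" idea above can be sketched as a Dockerfile (a hypothetical minimal example; the base image, package, and config file names are placeholders for whatever your deployment actually needs):

```dockerfile
# Start from the distribution default...
FROM debian:8

# ...and express your deployment as a delta applied on top of it.
# The package list here is purely illustrative.
RUN apt-get update && \
    apt-get install -y --no-install-recommends nginx && \
    rm -rf /var/lib/apt/lists/*

# Configuration lives in version control next to this file,
# not as hand-edits on a live machine.
COPY nginx.conf /etc/nginx/nginx.conf
```

When an upgrade goes wrong, you don't repair the running system; you rebuild from the new base image and redeploy, and the whole delta stays testable in version control.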
I mean, most people don't have that many problems with Ubuntu and Debian boxes, right? My first thought is, what is the author doing that most people aren't? Maybe stop doing that?
I don't think the author is alone in having problems with Ubuntu and Debian. There are numerous forums, mailing lists, bug trackers, and so forth which are full of reports of people encountering problems like the author describes.
Many of those users, especially if they're new to Linux, probably just end up abandoning Linux instead of talking about their problems in more detail like the author did.
Even experienced Linux users are moving on. I used Ubuntu and Debian, among other Linux distros, daily for many years. But I too started noticing quality problems. So I've moved to OS X. It isn't perfect, but it generally gives a much better experience than I was getting with Linux, while still letting me use much of the software I was accustomed to using.
The Intel video driver likes to freeze occasionally on Debian 8 when it was never an issue on 7. Can't exactly stop "doing" X.
That's worthy of a bug report - anything in Xorg.log when the system comes back up?
I can only speak from experience, but the initramfs system seems to have become a lot more "fragile" recently. I've had upgrades result in being dropped to the "emergency" shell, unable to mount the rootfs, like what the author experienced.
I may end up going back to running without an initramfs, at least for VMs that are going to last for more than a day.
I run a bunch of debian and ubuntu boxes.
The kinds of problems he's reporting are more consistent with hardware failure that exhibits only after a reboot, or with misconfiguration of his software, than defects in the OS boot mechanism.
I have a lot of respect for Linas (I used his Extrusion code to make the glextrusion xscreensaver) but I think in this case he's doing things to his machines that cause these problems, and it's not usually debian or ubuntu's fault.
Ad hominem attacks on this page notwithstanding, he's not wrong.
The reliability and simplicity of 'getting to a shell prompt' out-of-the box for ubuntu also seems to me to be on the decline over the last ten years.
And going much farther back, I would say that inscrutability of boot problems might be at its all time worst.
I have ubuntu systems that will hang for 60 seconds on network failures, or sometimes just refuse to boot. It is very, very frustrating.
And, I'm not a linux newbie -- the first linux kernel I installed was 0.99pl14 on a floppy(!) based slackware system, and I spent years overseeing slackware, then redhat, then ubuntu systems.
I don't think it's a big surprise that CoreOS is so appealing; there's just an awful lot of magic and surprise baked into your standard ubuntu install right now.
I think a lot of it boils down to the Linux ecosystem worship of C99.
Almost everything underneath your GUI (and often times, it even is your GUI, cough Gnome...) is written in C.
C is good for micro-systems where you do not want to implement all the abstractions higher level languages require. It is good for implementing low level functionality in other languages (like Python) or for writing a first-try compiler in on a new platform because of its simplicity.
It is not appropriate for an entire OS stack comprising tens of millions of LOCs across thousands of projects and a hundred thousand developers. At least not in the modern age, when we have everything from OCaml (1996) to Rust (2015) showing how to do bare metal safe and fast. Even C++ is moving towards a safe yet fast subset of the language in which you should never use new/delete anymore.
Going forward, probably the most important realization the free software world needs to come to is this: for your own personal projects, being a whiz C expert who can hyper-optimize pointer math is great. But as soon as you start accepting merge requests, or worse, start delegating maintenance of your codebase across multiple people, C is going to cripple you.
Like I said, it is not C's fault. It is the dogma of Unix that the community holds sacred: you write it in C, you use pipes and raw IO buffers, and to question that holistic view is to be seen as opposed to everything about it, even if all you take issue with is the complications imposed by using C everywhere (OpenSSL, systemd, the kernel, udev, NetworkManager; and in personal projects I have had to contend with deep buffer overflows, pointer misalignment, and offset miscalculation in things like PulseAudio, SDL, Mesa, Wine, etc.).
Maybe when the systemd developers pick their next slice of userspace to bring into the collective, they might consider implementing it in a higher-level language. Not because they are bad developers, but because the code they write is not just about them. Think of how many headaches new work could avoid by using something safer like D, Rust, or even a restricted subset of C++.
> I think a lot of it boils down to the Linux ecosystem worship of C99.
Hardly. If anything it stems from a dual attempt by Linux user space devs to turn Linux into a merger of OS X and Solaris, while applying copious amounts of NIH-ism and second-system thinking.
They are far too willing to throw away whole generations of software and concepts over some esoteric corner case or other, and keep chasing platonic ideals that will never stand up to an encounter with actual usage.
And likely when Torvalds steps down, and thus no longer holds the kernel devs to the "do not break user space" mantra, we will see the kernel fall to the same mentality quite quickly.
It's CADT cubed.
1) I'm not sure I get what your theory about C is. I could get it if his machine was segfaulting, but how does C cause these problems? 2) C trivially can't explain any decline that's happened, because old Linux was also written in C.
1. The breakage caused by software interactions only gets worse as complexity increases. This is where the piece's overarching theme of "Linux is getting worse" comes from. More moving parts, more interactions between parts, more errant behavior, and millions more LOCs getting chugged through a CPU give you a larger surface area for problems.
2. This is just a reiteration of the main point, but old Linux was... a PITA to work with. Xorg configurations, manually activating peripherals, no common bus interface like dbus, etc.
The problem, again, is that when your foundation is built on C and you used C throughout the entire construction, leaks and problems come through the cracks, especially at the joints and points of interaction (and especially between multiple versions and upgrades, which is a major point of the OP). Linux has gotten bigger to do more and be more, but as long as we keep piling on unsafe, unabstracted code as the foundation, our technical debt grows so incredibly huge that you reach the point of pulling a Debian Stable rather than an Arch, because upgrades are so terrifying.
Most of the delays, problems and incompatibilities do not happen in C code.
It happens because nobody stopped to think about an interface, or because a configuration file or shell script does not handle an error correctly, or because, since nobody wanted to create a sane error handling system, a script has no option but to just keep trying and trying.
I do agree that most of the software on one's computer shouldn't be written in C, but those are not the problems this migration would solve.
> The reliability and simplicity of 'getting to a shell prompt' out-of-the box for ubuntu also seems to me to be on the decline over the last ten years.
It has literally been the same for ten years: Ctrl-Alt-T or Ctrl-Alt-F-keys.
> I don't think it's a big surprise that CoreOS is so appealing
Hilarious, given that his complaints can be blamed squarely on the core (heh) part of CoreOS, systemd.
Blah blah blah, things changed, blah blah.
Ubuntu is a better experience than ever. Open source web browsers are better than ever. Maybe this guy thinks he's more clever than he is, and fucks around with too many things. Maybe he hates change. But, trying to be as objective as possible, this post comes across as whining for the sake of whining.
PS. I haven't seen such an ugly website since the 90's...
"We are all Microsoft Windows, now." Oh, how that hurts. And he's not wrong. I've been putting off buying a new machine because for the first time since school I don't know what OS to put on it. It's not going to be Linux. :( Sad, sad panda.
tl;dr A rant from a guy who manages servers like it was 1995 complaining how he can't reboot and how things were much better back then.
I reboot, upgrade and recreate machines on a daily basis. The Linux Servers and Desktops work pretty well these days, despite Systemd, Gnome 3 or Unity.
If you need days to boot a server, it might not be the server's fault... just saying...
Was that page deliberately designed to be unreadable?
I am honestly trying not to troll. Tried to read and had to give up.
I copy and pasted the text to a text editor.
This struck me as well, and yet this very article complains about lack of good "design and usability" : )
There is a lot going on under the hood in a Desktop Environment to make everything work as you would expect from MS Windows or OSX. Gnome has gotten a lot better in the past 3-4 years. When Gnome 3 came out around the same time as Unity, both were a terrible mess. Gnome has come a long way since then. That being said, I don't use Gnome.

Yes, if you don't understand dbus or systemd or udev, or any of the other Linux conventions for that matter, it's going to be frustrating if you upgrade and your system suddenly won't boot. Who uses busybox anymore anyways? I'm not sure how this man's rant got upvoted to the FP of HN. It sure seems like a lot of ignorance and complaining on his part. Systemd has made "managing" or "administering" Linux systems much easier for me, and I can in no way see how going back to init scripts would be better or make your system more reliable to boot.

An analogy that comes to mind is automobiles. Engines and transmissions have come a long way in the past 20 years: drive-by-wire, variable valve timing, direct injection, dual-clutch automatics. These new technologies clearly come with a learning curve, but they have definite advantages.

I'm not sure why this guy is complaining; if he understands Linux as he claims, he shouldn't be having these trivial problems, much less spending time ranting on the internet. I'm not sure anyone will take him seriously.
> There's also the nuttiness known as gnome-shell and unity.
I wish more people knew about Xfce (Xubuntu). It offers a better Gnome 2-like experience than Gnome 2 ever did.
Although the out-of-the-box theme and look-and-feel of Xfce/Xubuntu look very dated. Changing the theme and desktop background picture and adding some transparency is easy for a geek, and gives you a nice modern-looking desktop. But the default theme, with its gray and blue colors, gives a somewhat Windows XP-like feeling.
I have become a big fan of XUbuntu as well. When Ubuntu phased out Gnome 2 I looked for alternatives and I found XUbuntu. Like you said, it's a better Gnome 2 than Gnome 2 ever was.
This is what my XUbuntu desktop looked like a few years ago: http://imgur.com/BK2leWF
I don't seem to have any of these problems ... on desktops or on servers. I guess I don't understand what's so hard. (And I haven't lost the ability to boot since the advent of ReiserFS and Ext4.)
I will say it is awfully annoying to have the Ubuntu boot process freeze because some USB external hard drive didn't initialize in time. Just skip it, for god's sake; don't make me find a keyboard and plug it into the machine just to press S to skip mounting and boot.
This seems to be an odd systemd problem. I'm not sure what makes it decide that this USB drive / SD card / SMB share is so utterly critical that it's worth hanging the boot for 3+ minutes.
A brand new Debian install on an SSD with UEFI can get me to the desktop in about 4 seconds. Please keep it that way.
This predates systemd on Ubuntu; Ubuntu has done this for some time. I don't know why. To fix it, you have to configure the mount with nobootwait.
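For reference, the usual fix (assuming the drive has an entry in /etc/fstab) looks something like the line below; nobootwait was the Upstart/mountall-era Ubuntu option, and nofail is the standard equivalent that systemd honors. The device UUID and mount point here are illustrative:

```
# /etc/fstab: mark the external drive as non-critical so a missing
# device doesn't hang the boot
UUID=xxxx-xxxx  /mnt/backup  ext4  defaults,nofail  0  2
```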
I feel the same way every time I have to work with systemd.
I think I would have the same hatred if someone forced me to work on a Windows 8 machine or a Microsoft server. You seem to have some buggy boxes; why not throw your servers on linode.com and take a snapshot of everything when it's "working"? Unless your line of work constantly requires physical access, but I can't imagine something I couldn't stream to a server from some backend connection.
>And its not just the low-level stuff, either. There's also the nuttiness known as gnome-shell and unity. Which crash or hang or draw garbage on your screen. And when they do work, they're unusable, from the day-to-day usability perspective. This wasn't a problem with gnome2. Gnome2 rocked. It was excellent. Why did you take something that worked really really well, and replace it with a borken, unusable mess? What happened, Gnome and UI developers? What were you thinking? In the grips of what madness? In what design universe is it OK to list 100 apps, whose names I don't recognize, in alphabetical order? Whoever your design and usability hero is, I am pretty sure they would not approve of this.
Just use Slackware. Or FreeSlack if you are a GNU zealot like me.
>Its spreading, too. Like cancer. Before 2013, web browswers worked flawlessly. Now, both mozilla firefox and google chrome are almost unusable. Why, oh why, can't I watch youtube videos on firefox? Why does Chrome have to crash whenever I visit adware-infested websites? What's wrong with the concept of a web browser that doesn't crash? Why does googling my error messages bring up web forums with six thousand posts of people saying "me too, I have this same problem?" When you have umpteen tens of thousands of users with the exact same symptoms, why do you continue to blame the user?
uBlock.
>I can understand temporary insanity and mass hysteria. It usually passes. I can wait a year or two or three. Or maybe four. Or more. But a trifecta of the Linux boot, the Linux dekstop, and the Linux web-browser? What software crisis do we live in, that so many things can be going so badly, so consistently, for so long? Its one thing to blame Lennart Poettering for creating buggy, mal-designed, untested software. But why are the Gnome developers creating unusable user interfaces at the same time? And what does any of this have to do with the web browser?
Gnome works really well on sysadmins' workstations. No more cluttered taskbars.
I would switch to a BSD if I had his troubles.
I did (for a NAS), and the learning curve (after 10+ years of Debian) is steep. I'm hoping one day I can justify the ~40 hours I've put into wrangling FreeNAS, and then FreeBSD 10 (needed the newer igb driver).
> I'm hoping one day I can justify the ~40 hours I've put into wrangling FreeNAS, and then FreeBSD 10 (needed the newer igb driver).
We'll get the answer in a few years :)
None of these issues affect my Gentoo box, first built in 2003 and upgraded ever since. Avoided PulseAudio and systemd, use Xfce; it's stable and boots the same every time. The hardest part was the quad monitor setup (Nvidia): it took almost 4 hours, once in a 3-year period.
I want to have sympathy, but I don't. Get a UPS. And after seeing the website I have to believe the other problems are as much a symptom of the man/setup as anything.
The page background makes it difficult to even read the article.
That's the first thing I fixed in order to read it: inspect element, remove the background.
(2015)
linas != linus?