The Need for Stable Foundations in Software Development

pointersgonewild.com

70 points by kennethfriedman 5 years ago · 62 comments

tuatoru 5 years ago

> In software and hardware design, the things that we should be the most wary about breaking, are the interfaces that people rely on.

And yet the big boys just can't help themselves. I've been doing a bit of work with Google Sheets over the last few months, and even in that time the user interface has changed a few times. "Only in small ways" will be the offered excuse.

People are not just playing with this software, they're trying to use it. Usually as a minor component of some workflow where other things occupy working memory and attention.

dgellow 5 years ago

> There is probably some truth to the idea that many newcomers will go for the latest features. However, on the flip side, I’ve been wondering if there can’t be a competitive edge in designing a language that is minimalistic, stable and slow to evolve by design.

That's exactly how I see Go, and why I personally use it, even though it has clear limitations and can be frustrating at times. And the language gets quite a lot of hate because of these choices and their consequences. Minimalistic often means that you have to implement your own solution to some common problems, and/or write a good amount of boilerplate. Stable means that you don't change things much over time, even if trends are shifting and new patterns/paradigms are developed, so you're potentially losing people and/or powerful techniques (you also avoid the ones that haven't been proven yet and may be seen as harmful in the future).

On the other hand, you have the real benefits of something small that you can easily keep in your mind, a tool that won't eat into your maintenance budget; you can learn it once and be good for years without constantly needing to catch up with the newest changes and ideas. That's quite relaxing IMHO.

  • tuatoru 5 years ago

    > > a language that is minimalistic, stable and slow to evolve by design.

    The upper case languages. C, AWK, SQL, COBOL, FORTRAN (yes yes I know the latter two like to be written in mixed case these days).

    Also Ada ... hmm, perhaps not so minimalistic.

    And Scheme.

  • christophilus 5 years ago

    Agreed. Also, Clojure. It’s one of the most stable languages I know of that is actively developed. It’s refreshing to install a 5 year old library and have it just work. In many cases such libraries aren’t abandoned; they’re just done. That’s a rarity in other languages I use.

    • mark_l_watson 5 years ago

      Right on, and this also applies to Common Lisp. I often use very old CL libraries, not modified in a long time because they are “done.”

  • Jeff_Brown 5 years ago

    > something small that you can easily keep in your mind

    The tradeoff is that a small language can lead your program to be big, hard to read, hard to keep in your head. Abstractions like polymorphism, monads, recursion schemes or lenses add (what initially seems like a lot, but later feels like) a little to the language, in return for (if used appropriately) immensely cutting down on the amount of code you end up writing.
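
    To make that tradeoff concrete, here is a minimal hypothetical sketch in C (itself one of the small languages praised elsewhere in this thread): without polymorphism, every element type needs its own comparator, and type safety is funneled through void pointers.

        #include <stdio.h>
        #include <stdlib.h>

        /* One comparator per element type: the boilerplate a small
           language forces in place of parametric polymorphism. */
        static int cmp_int(const void *a, const void *b)
        {
            int x = *(const int *)a, y = *(const int *)b;
            return (x > y) - (x < y);
        }

        static int cmp_double(const void *a, const void *b)
        {
            double x = *(const double *)a, y = *(const double *)b;
            return (x > y) - (x < y);
        }

        int main(void)
        {
            int xs[] = {3, 1, 2};
            double ys[] = {0.3, 0.1, 0.2};
            qsort(xs, 3, sizeof xs[0], cmp_int);
            qsort(ys, 3, sizeof ys[0], cmp_double);
            printf("%d %g\n", xs[0], ys[0]);   /* prints "1 0.1" */
            return 0;
        }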

  • pwdisswordfish0 5 years ago

    Go isn't as rosy as is often claimed, unfortunately. I have a lot of familiarity with the Wirth school that shaped it, and the design and conservative philosophy of the language really appeal to me. But of course what it's really about is the economy (ecosystem), silly.

    Take my recent experience with Gogs and Gitea as an example—both not just incidentally coming out of the Go world, but also consciously branding themselves as "a painless self-hosted Git service". The latter abandons this and the gestalt of Go right out of the gate: builds are not in any sense fast, it depends heavily on the NPM ecosystem, and there are other issues. The former is the original and is supposed to eschew all the "move fast and break things" attitude of the latter, right? Try building it and it fails. The fix was upgrading to Go 1.14, which was the latest version at the time and had only been released a few months earlier. So then you build the binary on your local machine, move it to the VPS, and find out that you can't set it up because sqlite support hasn't been built in: that requires building with cgo, which proves to be a real pain in the neck on a different system, so you're better off building on the system it's going to run on. So much for Go's thoughtful compiler architecture. So you re-clone the repo, only this time on the remote system. And of course that won't build, because (remember?) it needs the latest version of Go. And now you're running as a non-root user on a system with a system-level go binary, so that's going to involve twiddling the symlinks and/or your PATH so `go` invokes the right version.

    I say all this with an awareness of the tremendous unrest against the trajectory of Go's "simplicity" from within its own fanbase, and the Go team's recent capitulations in the last year or so. As the blog post lays out, this is not unique; it's the way mainstream software development goes in general. The reason is what I said before: it's the "economy"—the interactions of a bunch of stakeholders.

    What seems to be the problem with every system, no matter what tech stack it's built on or the principles it sets out to embody at the beginning, is that it tends towards consultant-driven, devops-scale complexity. Intentional or not, there's an invisible force that moves things in the direction of maximizing "payoff", which most often means literal money ending up in the hands of the person putting up with all this stuff. Pretty much everyone in the workforce benefits when their line of work accumulates more esoterica, thus proving the value of consultants and others who derive their paycheck from being wizards taming incomprehensible systems. Even well-meaning, non-nefarious actors are susceptible to it. What also happens is that most developers get so far up their own ass with respect to their chosen tech stack that they're not able to see the problems their "progress" is causing, and they've usually got a whole community egging them on. This isn't too far off from what we've seen with the Twitter-driven polarization in politics, as everyone is able to find a set of likeminded people telling them that they're doing and thinking the right things.

    • dgellow 5 years ago

      I think you have a good point with the Gitea example: the language and other tools can be as minimalistic and clean as you could wish for, and you can have the best intentions of building a stable ecosystem, yet you're still likely to run into the same kinds of issues.

fergie 5 years ago

"you can charge [a MacBook Pro] using any of the 4 USB-C ports, but you really should only ever charge it from the right side."

...wait what?

tannhaeuser 5 years ago

I don't think the rant about Ubuntu Unity is fair. If anything, Ubuntu 18.04/20.04 LTS releases based on Gnome shell are serious regressions in so many ways (FF not opening new windows on top, std menu missing wtf, etc etc) that I'm considering something KDE-based next.

  • franciscop 5 years ago

    Strongly agreed; Ubuntu 14 and 16 were amazing to run, and 18+ is a huge mess with full menus compressed into burger icons.

    Connecting to Wi-Fi takes me 8 clicks! As someone who regularly goes to cafes, this is very frustrating:

    1. Open the generic top-right menu
    2. Open the Wi-Fi submenu
    3. Turn it on (the modal closes automatically!)
    4. Open the generic top-right menu again
    5. Open the Wi-Fi submenu again
    6. Select Network (a popup opens)
    7. Click the desired network
    8. Connect

  • pjmlp 5 years ago

    It made me move to XFCE; GNOME's focus on GJS and extensions everywhere is just too much.

    KDE would have been the alternative.

    I really don't get the hate against Unity.

  • mrkeen 5 years ago

    Well Unity sucked and Gnome 3 also kinda sucked. These are relative terms - I only compare them to what existed at the time.

    Right now I'm using Ubuntu Mate 20.04 since Ubuntu 20.04 won't run on my machine. After a fresh install it just boots up into a black screen, and I can't ctrl-alt-f2 my way into a terminal, etc.

    I've always found KDE pretty decent.

  • beagle3 5 years ago

    Unity is still an apt-get install away. I’m using it with 20.04 and haven’t missed a thing, even though it’s not getting any new features.

  • fiddlerwoaroof 5 years ago

    Yeah, I think Unity happened because of Gnome 3? Gnome 2 EOLed and Canonical didn’t like Gnome 3.

    • bregma 5 years ago

      There are a number of reasons why Unity happened, all of them good. Difficulties getting Gnome 2 changes upstream, differences in vision between the Ubuntu desktop needs and the Gnome developers, and delivery date issues with Gnome 3 were among some of the reasons, yes.

      There was a very vocal minority of shouties in various forums who spewed their venomous hate at Unity, but by and large most people who tried it really liked it, and I still get very positive feedback from non-technical users even today when they find out I was heavily involved in that project.

      The criticism levied in the featured article is the same tired old one that boils down to "I didn't like it because it wasn't the Microsoft Windows I used when I was first learning." There is always a certain merit to the "all change is bad" argument, but since it's entirely based on visceral reaction and not technical merit or rational discourse, it can be difficult to use to convince others without appearing petulant.

      • fiddlerwoaroof 5 years ago

        I'm mostly sympathetic to the decision to leave Gnome: I myself abandoned KDE after the 3 -> 4 transition and then Gnome after the 2 -> 3 transition. I just never found the Unity desktop pleasant to use, so I ended up running tiling window managers for a while.

    • beagle3 5 years ago

      Unity happened mostly because Canonical was betting on convergence - which neither Gnome2 nor Gnome3 were practical for.

      Supposedly it’s also super nice on mobile and tablet, though I didn’t get a chance to try it.

      If you want gnome 2, Mint is still maintaining it as MATE and it’s in the Ubuntu repositories.

zokier 5 years ago

Does anyone have experience using one of the more formal method oriented languages (e.g. Idris, Agda, F* etc) as their "daily driver" language, i.e. using them for general purpose programming, for any extended period?

I'm dreaming that by focusing on correctness one could reduce the maintenance churn that can then lead to various other spurious changes. But I don't know if any of those languages are really suitable for "real world" use, nor whether they provide such a dramatic reduction in bugs as one would hope.

  • exdsq 5 years ago

    My colleagues use Agda, but only for things like protocols or core features; Haskell is better for general-purpose code.

  • nanagojo 5 years ago

    Idris is actually a general purpose language unlike the rest you listed.

    • zokier 5 years ago

      Agda might be a bit borderline, but is there any particular reason why you consider F* less general-purpose than Idris? Its own intro says

      > F* (pronounced F star) is a general-purpose functional programming language with effects aimed at program verification

      I haven't used either, so that's why I'm asking. Certainly going by their marketing, F* seems very practically oriented, fusing common sensibilities from F# with formal methods.

baylessj 5 years ago

A minor nit: the author references MX Linux as being the most popular Linux distribution currently, when that is not truly the case. Yes, MX Linux receives the most page views on DistroWatch [1], but that does not directly correlate with actual users or downloads. I'm still looking for a clear set of data showing the market share of various Linux distributions, but everything I have found so far points to Ubuntu still being the most popular distro. MX Linux is growing in popularity, sure, but it's certainly not a runaway winner in the market.

1 - https://distrowatch.com/table.php?distribution=mx

megameter 5 years ago

Here are the three principles I currently believe all mesh together in discussing stable foundations.

1. Sustainability

2. Chaos

3. Reorganization

Programming in the large has a natural-ecosystem quality to it. It resists standardization and falls into patchworks easily. So I have come around to the idea that one should embrace the change and discover points of stability by accident. The sustainable part happens when the system survives chaos and remains flexible enough to be reorganized, i.e. there is a benchmark for passing the test.

Long story short, it doesn't come easily by design. Designing small and designing to retarget your output are good ideas, because that reduces two forms of change cost. But we trip over the problem of having immediate solutions at the cost of complexity and single-use implementation. Designing for extension turns your system into a platform, which gives it a definite kind of lifespan and ability to address new problems.

I worked with Go for a while and gradually got fed up with the accumulation of little issues. I have come to the conclusion that Haxe, and most transpiled languages, actually do the job of sustainability better than Wirthian languages, because being retargetable gives your code some insulation from platform changes. The intent is preserved, and the bulk of breaking changes occur outside the compiler tooling. A cost is paid in having to debug multiple times, often at a distance from the original source, and in having imperfect access to platform features, but this is a much smaller cost than a codebase with hidden dependencies, the kind that constantly sneaks into native-code systems and makes VM language runtimes grow over time.

lmm 5 years ago

Software development just isn't mature enough yet. A bug-free, stable version of an IDE from 10 years ago would be much less pleasant to work with than the flaky version with all the features that have been invented in the intervening years.

  • beagle3 5 years ago

    Visual C++ 6, circa 1998, was the pinnacle of responsive IDEs, and was more responsive in 2000, on 2000-era hardware, than the Visual Studio 2013 I last used in 2015. Are new versions any better, or do you still wait a couple of seconds after pressing “run” just for the IDE to figure out that no compiling needs to be done?

    • mrkeen 5 years ago

      Yes, so much this.

      My thought the entire time I used Eclipse was "why can't this be as fast and reliable as VB6?"

      • beagle3 5 years ago

        It is ridiculous.

        In 2015, when I had to port something to Windows, I was so frustrated by the IDE speed that I was running “make ; make test” outside instead, only using the IDE when I needed the debugger. (I had makefiles from Linux; I would probably have been too lazy to create them if I didn’t.)

      • fomine3 5 years ago

        I refuse to use the VB6 IDE, which truncates characters in the variable viewer.

  • pjmlp 5 years ago

    Delphi, C++ Builder vs Visual C++.

    Visual C++ 6.0 vs Visual.NET

    Visual Studio 2010 vs Visual Studio 2015

  • AnIdiotOnTheNet 5 years ago

    I know of several programmers still running Visual Studio 6 because of how much more pleasant it is to work with than any modern VS version due to its responsiveness. That's without it even being bug free.

mark_l_watson 5 years ago

In 2005 I wrote a little rant about ancient software [1]. The idea is that far in the future there would be very stable software that had not seen changes in centuries.

[1] https://markwatson.com/blog/2005/08/04/ancient-software.html

  • machello13 5 years ago

    A Deepness in the Sky by Vernor Vinge also mentioned this — it's a scifi book that takes place in space thousands of years from now, but the foundations of the computer systems are many of the same libraries in use today.

pjmlp 5 years ago

Programming languages are software products like everything else in the industry, they either stagnate or evolve.

  • HPsquared 5 years ago

    As in biological evolution, the parts where changes would be the most "breaking" tend to remain quite stable.

  • sagichmal 5 years ago

    At a far far slower pace, maybe.

    • pjmlp 5 years ago

      Depends on the language; ISO/ECMA ones take their time, around three years, while others release every year or six months.

  • acp2020 5 years ago

    Then what happened to C?

    • WJW 5 years ago

      C has been evolving; there are C89, C90, C99, C11 and C17 standards, and a new one is in the making.

      • Chris_Newton 5 years ago

        But the last version with major changes was C11, nearly a decade ago. Even that was almost entirely backward-compatible, other than a few very specific cases like removing the inherently dangerous `gets` function from the standard library. And even that removal followed formal deprecation several years earlier, and widespread advice not to use it at all for many years before that.
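
        For illustration, a minimal hypothetical sketch of that one migration: `gets` was valid C99, is gone from strict C11, and the bounded `fgets` is the drop-in repair.

            #include <stdio.h>

            int main(void)
            {
                char buf[64];
                /* gets(buf);  was valid C99; removed in C11 because it
                   cannot bound its write and invites buffer overflows. */
                if (fgets(buf, sizeof buf, stdin))   /* bounded replacement */
                    fputs(buf, stdout);
                return 0;
            }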

        Put another way, while the C language has evolved, you can still compile C code written several decades ago using modern tool chains with an excellent chance of having it work immediately or at worst needing some very minor changes.

        It’s depressing that some of the comments in today’s discussion are praising Clojure because code often still works after 5 years. Obviously that is itself a good thing, but is only 5 years now a remarkably long time for code to still work?

        • pjmlp 5 years ago

          Actually no, because C11 has removed features from C89 and C99, while C17 has done some UB "improvements".

          • Chris_Newton 5 years ago

            There is very little that has changed that isn’t backward compatible, though, and what there is has mostly been in areas that no sensible programmer should ever have relied on in more than toy programs anyway, like inherently unsafe functions or undefined behaviour. You probably have to go back to C99 prohibiting some things that used to be implicit and therefore to code written over two decades ago to find “real” exceptions to that. Compared to many “modern” languages, I’d say that’s still an excellent track record for longevity and compatibility.

            • pjmlp 5 years ago

              Toy programs like the Linux kernel, once a heavy user of VLAs, which only GCC and Clang ever implemented, while everyone else just moved from C89 to C11 without ever caring about them.

              • Chris_Newton 5 years ago

                Sorry, I’m not sure I’m understanding your point here. Are you claiming that variable length arrays have been removed from C in recent standards after they were added in C99? Or is your objection that some compilers are very slow to support language features even after they have been standardised? Or something else?

                • pjmlp 5 years ago

                  I am not claiming, it is a fact.

                  It was dropped from C11, moved into optional annexes, and all the commercial compilers that were yet to be fully C99-compliant just moved to C11 instead, without bothering with them.

                  The world of C compilers isn't only GCC and clang.
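
                  To illustrate with a minimal, hypothetical sketch: portable code now has to probe for the feature itself, via the __STDC_NO_VLA__ macro that a conforming C11 compiler may define.

                      #include <stdio.h>

                      int main(void)
                      {
                          int n = 4;
                      #if defined(__STDC_NO_VLA__)
                          /* VLAs are only optional in C11; a conforming
                             compiler may reject them outright. */
                          int a[4];
                      #else
                          int a[n];   /* mandatory in C99, optional in C11 */
                      #endif
                          for (int i = 0; i < n; i++)
                              a[i] = i;
                          printf("%d\n", a[n - 1]);
                          return 0;
                      }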

                  • Chris_Newton 5 years ago

                    Something being dropped is not the same as making something optional. In this case, the latter action was an explicit recognition of the reality that many compilers never fully implemented the previous standard anyway.

                    The point here was about backward compatibility. What standard C99 code is not also standard C11 code? What used to compile a few years ago but won’t compile today? We all understand that there are some minor exceptions, but I think it’s fair to say the answer to both questions is: not much.

                    • pjmlp 5 years ago

                      VLAs; I can't make it any clearer than that.

                      Any use of Annex K functions, gets(), any code that is semantically invalid under the C11 memory model, or anything wiped away by the new UB cases from C17.
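
                      Annex K works the same way; as a minimal, hypothetical sketch, a caller has to opt in explicitly, and the implementation only advertises support through __STDC_LIB_EXT1__:

                          /* Opt in to the optional Annex K ("_s") interfaces. */
                          #define __STDC_WANT_LIB_EXT1__ 1
                          #include <string.h>
                          #include <stdio.h>

                          int main(void)
                          {
                          #if defined(__STDC_LIB_EXT1__)
                              char dst[16];
                              strcpy_s(dst, sizeof dst, "hello");  /* bounds-checked copy */
                              puts(dst);
                          #else
                              puts("no Annex K on this implementation");
                          #endif
                              return 0;
                          }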

    • psychoslave 5 years ago

      Maybe it was a unique historical opportunity: a tool "good enough" to do the job seized the chance of widespread adoption, leaving no room in this ecological niche.

      That is, yes, there is room for stable languages, but it’s already fully occupied. So only languages targeting markets with a mindset of "better to keep moving than risk sclerosis" can flourish.

      Now, that might be a bit of a caricature, of course.

    • pjmlp 5 years ago

      C got C89, C90, C99, C11, C17, and is on its way to get C2X.

orestis 5 years ago

Obligatory Clojure plug :)

Most, if not all, code from 5 years ago runs with no changes whatsoever.

The stewards of Clojure heavily advocate and practice “stable” development where backwards compatibility is a requirement. Combined with the JVM’s famous backward compatibility, this means the whole enterprise is a very stable foundation for building stuff.
