Ease of maintenance is a feature

ronakgothi.com

82 points by ronakjain90 a year ago · 38 comments

cglan a year ago

Super true. One of the best tests of this is setting up a new laptop: the best experiences are when you can just clone the codebase and everything works as it did before, no special magic. Golang with vendored dependencies seems to be wonderful for this, but I've also had relatively decent experiences with newer Java projects.

My worst experiences universally have always been python projects. I don't think I've had a single time where I cloned a python project and had it just work.

Beyond just the code, I've had lots of mixed experiences with CI/CD being smooth. I unfortunately don't think I've been in a single shop where deployments or CI have been a good experience. They often feel very fragile, undocumented, and hard for newcomers.

  • sethammons a year ago

    I have a couple decades of experience, have ridden a small startup through going public, and have worked intimately with 6 companies. I know about taking a product and ultra-scaling it, both technically and organizationally.

    I will never recommend Python outside of a small team. It is organizational molasses. My current company has multiple teams striving to keep our Python tech stack serving our growing technical and organizational scale.

    I have fixed this in two companies, in no small part by migrating to Go. I am on my third.

    • aaronblohowiak a year ago

      The hoops that people go through to solve this sometimes create something even more complex and not great, like forcing all development into a Docker container.

      Ever try conda though? I’ve had moderate success with pipenv, but tbh I don’t love it as it hides too many things when installing a package fails.

      • bunderbunder a year ago

        Doing a Python development environment inside of Docker can get particularly obnoxious in the long run because there are approximately a zillion ways that a base image upgrade can break things by changing something about the system Python packages.

        (And by the time it happens you might have a real mess on your hands because by that point the dev container's dockerfile has quite possibly grown into an undocumented spaghetti tangle of band-aids as a result of every dev on the team tweaking things in whatever way seemed to make the most sense at the time without a whole lot of regard for the end-to-end cohesiveness of the situation.)

        The standard advice in the Python community is "never trust the system Python", but tools like pyenv that we have for protecting ourselves from the operating system aren't always straightforward to get working sensibly inside of a container. It seems like it should be easy, but I've seen people get it wrong far more often than I've seen them get it right.

        A big part of the problem is that the Python community has developed an extremely severe case of TMTOWTDI when it comes to dependency management, packaging and deployment. It's led to a situation where, if you're just googling around for problem solutions in an ad-hoc manner, you're likely to end up with a horrible chimera of different philosophies of how to do Python devops, and they won't necessarily mesh well together.

        • aaronblohowiak a year ago

          TMTOWTDI is antithetical to Python as a language; it's so odd to me that its packaging and env management is the opposite.

          Do you have a suggested solution? I'm a solo dev for now but will be adding more folks in the foreseeable future. Stuck with Python for some things (notebooks, model development).

          • bunderbunder a year ago

            Honestly I don't think there's any quick answer, and providing specific advice on Hacker News would be playing into exactly the problem I was pointing out in the last paragraph of my previous post.

            My usual advice is to just bite the bullet and invest the time it takes to understand how Python package management and resolution really works under the hood, and how all the various devops approaches that are built on top of it work with it, so that you can make informed decisions and truly own your own stack.

      • imp0cat a year ago

        I quite like docker for local development.

        Docker and docker compose do make it incredibly easy to start everything that's required for local development and testing. Your service A needs B and C? Grab those images of B and C and run them all on your machine. The only limitation is the amount of RAM you have available locally.

        And if you think a bit about your Dockerfiles (i.e. set up the layers to take advantage of caching, add icecc+ccache mounts for C++ projects to distribute compilation and cache results, add mounts so apt or other package managers can cache the packages you download), local image rebuilds can be quite fast. Those are the little tricks that make your life with Docker less miserable.
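
        Roughly the pattern I mean, sketched for a Python service (image, paths and package names are just placeholders):

            FROM python:3.11-slim

            WORKDIR /app

            # BuildKit cache mount: apt downloads survive across rebuilds
            RUN --mount=type=cache,target=/var/cache/apt \
                apt-get update && apt-get install -y --no-install-recommends build-essential

            # Copy only the dependency list first, so this layer stays cached
            # until requirements.txt actually changes
            COPY requirements.txt .
            RUN --mount=type=cache,target=/root/.cache/pip \
                pip install -r requirements.txt

            # Source changes only invalidate the layers below this line
            COPY . .
            CMD ["python", "app.py"]

        (The cache mounts need BuildKit, which recent Docker versions use by default.)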

        • aaronblohowiak a year ago

          Yea, all of that is stuff I do not want to bother with. At all. "It's nice if you take on the maintenance burden of a bunch of additional moving parts" is the opposite of what I want. If you have to support a diverse set of languages / runtimes / environments and you deploy using containers, maybe it makes sense, but that seems like a use of the complexity budget I'd rather spend on… something else

          • adamc a year ago

            In big organizations, it solves a lot of problems. Docker isn't perfect (hence interest and growth of Nix), but in day-to-day use it's fairly replicable.

  • senko a year ago

    > My worst experiences universally have always been python projects. I don't think I've had a single time where I cloned a python project and had it just work.

    I'm curious if you can spot a pattern in the platform (win/osx/linux) or type of project, or is it all over the place?

    My own experience with Python boils down to creating a virtualenv, installing the deps, setting up configuration (or just copying it from somewhere) and creating a database, and I'm off to the races. The only exception in recent memory was a project with two dozen microservices, half of the codebase on a private package repository, and Poetry in the mix; that combo required a somewhat more involved setup. That said, IIRC all the projects had fully pinned package versions (package==x.y.z).
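
    Concretely, that usually amounts to something like this (a sketch; the config file name and database command are placeholders that differ per project):

        python -m venv .venv
        source .venv/bin/activate
        pip install -r requirements.txt   # fully pinned: package==x.y.z
        cp config.example.ini config.ini  # or wherever the project keeps its config
        createdb myproject_dev            # assumes a local PostgreSQL; adjust for your DB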

    In contrast, every time I touch something in JS land I get the same experience you described for Python. On one project we literally copied node_modules across machines (including servers), because a full reinstall took an unbounded amount of time. Anecdotally, the amount of churn in JS is much higher, and the maintenance load increases proportionally.

    Usually it's something like:

    - have a project in JS with some dependency X that's no longer on the bleeding edge, but works nice

    - want to depend on a new package Y for some new feature

    - the new package Y depends on a library Z that's higher than what the other dependency (X) can work with

    - try to update the original dependency (X)

    - wailing, gnashing of teeth, and considering the switch to agriculture instead

    In my experience, if you're not closely tracking the bleeding edge, upgrading packages and updating your code accordingly, your JS developer experience will be abysmal.

    Agree on the CD part, especially the fragility and the extra manual work when the deploy is some manually driven (semi-)automated process.

    • imp0cat a year ago

      You can get the same JS/node_modules experience with Python, just use pdm. ;)

  • AlienRobot a year ago

    I love Python but it always amazes me how hard it is for it to just... work.

    So there is venv, built in, but... if there is a venv directory, Python doesn't just use it.

    Like, you have app.py, and you run python app.py; that doesn't run it with the venv's Python. This leads to all sorts of problems with scripts that assume they're running under the venv. Which means you probably want a script that sources the venv just so you don't forget; but if you place it in the same directory you may forget you need to call the script, so you probably want an extra directory hiding all the Python code, so you only see the shell script that sets up the environment and runs the Python code. Or just use an IDE.
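
    Something as dumb as this wrapper is what I mean (a sketch; adjust the venv directory name to whatever the project uses):

        #!/usr/bin/env bash
        # run.sh - always run app.py with the project's venv, creating it if missing
        set -e
        cd "$(dirname "$0")"
        [ -d venv ] || python3 -m venv venv
        source venv/bin/activate
        exec python app.py "$@"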

    Just "pip install." But pip isn't installed and ensure pip doesn't work? What do I even do then?

    I recall downloading a project that required a library that wasn't available for the newest version of python, so when you tried to install the requirements pip wouldn't find it. I discovered this, naturally, because I updated my operating system so the python version changed which means the project that used to work stopped working! What is the solution for installing multiple python versions side by side? Hint: it's not an official project by the Python organization but something you can find on github.

    • adammarples a year ago

      My recent workflow is to use a great program called mise. You have a config file in your directory and, hey presto, Python venvs work: they install themselves if they don't already exist, and mise installs the exact version of Python you specify in your config. On top of that it will set environment variables for you and unload them when you change directory. If you combine this with uv (just tell mise you want uv installed in the config), you can run uv pip sync and instantly reflect any changes in your requirements file directly into your venv, very quickly.
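
      The config is a small TOML file, roughly like this (from memory, so treat the exact keys, especially the venv line, as approximate):

          # .mise.toml in the project root
          [tools]
          python = "3.12.4"   # mise installs exactly this interpreter if it's missing
          uv = "latest"

          [env]
          DATABASE_URL = "postgres://localhost/myapp_dev"   # set on cd in, unset on cd out
          # auto-create and activate a per-project venv (key name may vary by mise version)
          _.python.venv = { path = ".venv", create = true }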

    • tmnvix a year ago

      For the past 4-5 years this is what has worked exceptionally well for me:

      - pyenv for installing multiple versions of python on my machine

      - direnv for managing environments (env variables, python version, and virtual environment)

      - pip for installing dependencies (pinning versions and only referencing primary packages in requirements.txt - none of their dependencies)

      This makes everything extremely easy to work with. When I cd into a project directory direnv loads everything necessary for that environment.

      Each project directory has a .env and a .envrc file. The .envrc looks something like this:

          layout python ~/.pyenv/versions/3.11.0/bin/python3
          dotenv .env
      
      Absolutely no headaches working on dozens of local python projects.

      • KronisLV a year ago

        > Absolutely no headaches working on dozens of local python projects.

        The other day, I moved over to a new container base image that's supposed to run Ansible inside of it. Almost immediately, when trying to manage a RHEL8 compatible host, I got this error: https://github.com/ansible/ansible/issues/82068

        I've had issues not only with Python projects that I write, but also with software that relies on it. Then again, while there are both problems and ways around them, my experience has been similar with pretty much every tech stack out there: Java apps that refuse to run on anything newer than JDK 8 (good luck updating dozens of Spring dependencies across approx. half a million lines of code in a codebase that's a decade old), hopelessly outdated PHP versions, software that works on ancient Yarn versions but not on newer ones and doesn't even build correctly when you move over to Node with npm, software that's stuck on old Gulp versions. The same goes for which Ruby and Rails versions will run, or for .NET and ASP.NET codebases where the framework code ends up tightly coupled to the business logic; don't even get me started on front ends that rely on Angular (or AngularJS), Vue (2 to 3 migrations) or React.

        I've had Debian updates break GRUB, AMD video drivers for the iGPU on the 200GE prevent it from booting, differences between Oracle JDK and OpenJDK cause a 10x difference in performance, Nextcloud updates corrupt the install, the same happen with GitLab installs, and just a day ago I had a PostgreSQL instance refuse to start up with: PANIC: could not locate a valid checkpoint record. PostgreSQL, of all software.

        Sometimes churn feels unavoidable if you don't want code to rot and basically everything is brittle. All software kind of sucks, sometimes certain qualities just suck a bit more than the average. Containers and various version management tools make it suck a bit less, though!

    • fragmede a year ago

      yeah, I wrote python-wool and set it as my local alias for python, so it does just look for a venv in the called program's path and use that.

      https://github.com/fragmede/python-wool

  • nzach a year ago

    > My worst experiences universally have always been python projects.

    Do you mind sharing why you think this happens? Although I've never worked professionally with Python, this sentiment matches my experience as a user, so I don't have a lot of context for why this is the case.

    Some sibling comments in this thread provide explanations that mostly boil down to 'bad tooling' in one form or another. But that doesn't feel right.

    In my opinion, if it were just bad tooling this problem would have been solved by now.

  • ronakjain90OP a year ago

    Every time I set up a JS project that's older than a few years, it's:

    1. Extremely difficult to set up the codebase, because of dependency spaghetti.

    2. Lots of breaking changes across different libraries, making maintenance not so easy.

    The easiest projects to maintain were written in Go, Java, or Ruby.

  • humanfromearth9 a year ago

    You may want to consider using Nix, with nix flakes.

    • bryanlarsen a year ago

      How much of that just hides complexity? I remember back in the day hiding a large amount of complexity behind vagrant.

      A new dev could get up and running quickly with "install vagrant; vagrant up", but that was hiding a lot of complexity behind a very leaky abstraction.

  • pphysch a year ago

    > My worst experiences universally have always been python projects. I don't think I've had a single time where I cloned a python project and had it just work.

    I got a new Chromebook from work, and had VSCode+Docker running an existing Postgres+Django+etc dev environment in literally 15 minutes. I was shocked. Devcontainers are magic, and poor Python DX is a skill issue.

    • porridgeraisin a year ago

      > Poor Python DX is a skill issue

      Oh yes, the language whose ecosystem only hears about backwards compatibility in their own death marches? Not their problem. It's the developers, it's _their_ problem.

      Not the standard library which _removes_ packages, breaking code which I recently cloned. See "imp".

      And not the next python version, which throws a syntax error on bare excepts, breaking old code for absolutely zero benefit beyond pretending to be a linter.

  • sameoldtune a year ago

uludag a year ago

I was reflecting on this article, thinking about what software tools and languages I use that reflect this property, and a weird realization came to mind: Emacs Lisp is by far the best language I use in this regard. I literally copy-paste 20+ year old code, eval it, and every time it just works. Then if I want to debug it: C-u C-M-x, and I'm instantly stepping through the code.

Something this old shouldn't have this property. Nothing "modern" even comes close. Look at the top languages, Python, JavaScript, and Java, and you don't have to think very hard to see how abysmal they are in this regard.

  • karthink a year ago

    > Something this old shouldn't have this property.

    It's not an accident -- reading through the emacs-devel mailing list, it's easy to see how much effort the maintainers pour into backward compatibility. It's one of Emacs' unspoken guiding principles[1].

    At the same time, it's not that surprising either. Emacs does not have other objectives that more modern languages/ecosystems do: no revenue or growth targets, corporations or VCs breathing down its neck, or a mandate to be "modern". Its most vocal and experienced users, who are also its volunteer maintainers, decide what its priorities should be. Since they've been using it for decades, backward compatibility is high on the list.

    [1]: Its "spoken" guiding principle being to further the goals of the GNU project.

sethammons a year ago

As a principal software engineer, my life is about moving larger orgs closer to this model. I have lived it when it works; it is critical to so many things. Coupling this stuff with structured logs, metrics, dashboards, and alerts feels like stepping out of Plato's Cave. So many shops don't understand that this stuff gives you wings.

aaronblohowiak a year ago

I like this. Couple things to add:

Fast setup and revision are important, but they're an incomplete list of maintenance tasks. Are metrics/logs predictably named and accessed? Can you perform manual experimentation without a hard-to-configure client (i.e. hit the server with a browser or run a CLI)?

Also, "cycle time" or "revision time" are soo important, but I havent found a good way to do that with AI model development :( any tips here?

anonyfox a year ago

Nowadays I've come to the conclusion that "ease of maintenance" is the most important feature a project can have. The only thing more critical is that the project itself is valuable enough; so many engineers optimize things that shouldn't exist in the first place.

Easy to maintain is not only about keeping something alive with minimal effort over long periods of time. It also plays a pivotal role in scalability in any direction: adding more engineers/teams, adding unforeseen features, iterating quickly in general, surviving more traffic/load, removing technical bottlenecks... everything is so much easier when the project is easy to work with and maintainable.

chanux a year ago

Cannot be overstated! I've spent countless hours trying to understand systems built by others (dozens of people of various skill levels) and bring the code to a more maintainable posture. Sometimes it feels like a thankless job, but it's a rather selfish endeavor because, first and foremost, I want to save my future self from suffering.

stevepike a year ago

I think the kind of application here matters a lot, specifically whether you're trying to make a change to a web app or if you're hacking on library code.

In ruby, for example, I can pretty trivially clone any open source gem and run the specs in < 5 minutes. Patching something and opening a PR in under an hour is definitely standard.
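
The whole loop is usually just something like this (assuming a typical gem with an rspec suite; the URL is a placeholder):

    git clone https://github.com/someone/some_gem.git
    cd some_gem
    bundle install      # installs the gem's development dependencies
    bundle exec rspec   # or `bundle exec rake`, depending on the gem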

On the other hand, getting a development environment running for a company's proprietary web app is often a hassle. Mostly though this isn't because of the language or dependencies, it's because of:

  - Getting all the dependent services up and running (postgres version X, redis Y, whatever else) with appropriate seed data.
  - Getting access to development secrets.

My company (infield.ai) upgrades legacy apps, so we deal with setting up a lot of these. We run them in individual siloed remote developer environments using devcontainers. It works OK once we've configured the service containers.

afiodorov a year ago

More software is written than kept. It's harder to write useful software than to configure CI/CD: the latter is a problem that has been solved before, whereas the chances of any given codebase being useful enough to even be worth maintaining are very low.

godshatter a year ago

Based on the subject I thought this would go in a different direction: document well, take the simple approach where possible instead of the most clever one, modularize well, etc.

  • bobertlo a year ago

    I think the author did a good job of challenging my assumptions going in at least, which is nice.

    My initial reaction was that it was a list of fairly complex things, but they are not necessarily complex to implement, even if people commonly over-complicate a couple of them or make them a pain for other developers to set up, which seems to be part of the point.

megamix a year ago

This is so important. It's the most practical and future-proof way forward.

braza a year ago

For folks who want a structured reference on maintenance, this book [1] is one of the best on the topic.

[1] - The Innovation Delusion: How Our Obsession with the New Has Disrupted the Work That Matters Most - https://a.co/d/eInjwZD
