Polylith

polylith.gitbook.io

104 points by 0x63_Problems a year ago · 18 comments

scubbo a year ago

> Creating a library also harms the development experience, because changes to the code no longer take effect immediately, as we need to build a library every time we make a change.[0]

I see this criticism of libraries from monolith fans a lot, and it confuses me. That would be true anyway - you cannot use code until you've built it (unless it's in a language which is interpreted/unbuilt/uncompiled, etc.). It would be true whether the code lived in a separate repo or in the same monorepo.

I suspect what they really mean is "you can't consume a new version of a library until that new version has been built _and pushed to a package registry_ from which it can then be referenced", which is just false in any language I have any familiarity with - Yarn Workspaces[1] allows the "live" use of TypeScript libraries in a local development environment, Python has plenty of solutions[2], and even the minimalist Go has Workspaces[3]. What am I missing? Is there some language which lacks this Workspaces support for "depending on local versions of dependencies" that is driving all this monolith-fever?

[0] https://polylith.gitbook.io/polylith/introduction/sharing-co... [1] https://yarnpkg.com/features/workspaces [2] https://stackoverflow.com/questions/1112618/import-python-pa... [3] https://go.dev/doc/tutorial/workspaces

  • c0balt a year ago

    IME the support for workspaces/monorepos is often mixed. It takes more time to set up correctly, in comparison to having multiple repositories, and can sometimes lead to weird errors.

    In the NPM/Yarn world at least this was something we encountered recently. Python has matured here but can also sometimes have weird edge cases.

    • n00bskoolbus a year ago

      Not sure that multiple repos is a good comparison for local development, at least. If you go the polyrepo route with NPM/Yarn (for example), you'll probably still be using linking or a filepath in the package.json dependencies. I don't think monorepo vs. polyrepo eliminates the problems that come with this, like making sure changes from a library are hot-reloaded into another, which still seems to require a lot of configuration.

    • scubbo a year ago

      > Ime the support for workspaces/monorepos is often mixed. It takes more time to setup correctly, in comparison to having multiple repositories

      Ah - this highlights that I have misunderstood something, then - because, to my mind, a workspace _is_ multiple repositories, which are siblings within a directory. Something like:

      .
      └── Root of the workspace/
          ├── repository-1/
          │   ├── src
          │   ├── tst
          │   ├── Dockerfile
          │   └── ...
          └── repository-2/
              ├── src
              ├── tst
              └── ...
      
      Configured such that, if the code in repository-1 depended on the library built from repository-2, repository-1 would "see" the live-updated version immediately.

      But it sounds like you're saying that that's not the case, and that your mental model of a workspace _is_ just a single repo? Am I understanding that right?

  • d0mine a year ago

    Manually hacking `sys.path` somewhere in the guts of your project is a path to madness.

    During development, `pip install -e .` installs the package from the local directory in editable mode, so changes are available immediately.
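
    For example, a minimal sketch (the package and path names here are hypothetical):

        # Fragile approach the parent comment warns against: patching sys.path by hand
        # so a sibling package becomes importable from this particular checkout.
        import sys
        sys.path.insert(0, "../shared-lib/src")

        # Preferred: install the sibling package in editable mode once, from the
        # consuming project's environment:
        #     pip install -e ../shared-lib
        # After that, `import shared_lib` resolves to the library's working tree,
        # so edits are visible immediately without rebuilding or republishing.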

stavros a year ago

We decided to move from (micro)services to a Django monolith, so we consolidated our services there. We added pre-commit hooks so that nobody could call the internals of each app, only a strongly-typed interface that each app exposes (in a file called api.py).

This way, if any one service changes its API, the developer can just see what breaks and change all dependents. API changes are atomic, so we don't need any API migrations or versioning, and it's much, much easier to document strongly-typed API functions than HTTP calls.
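
A rough sketch of the idea (the app name, types, and function here are illustrative, not the actual code):

    # payments/api.py - the only module other apps are allowed to import from.
    from dataclasses import dataclass
    from decimal import Decimal


    @dataclass(frozen=True)
    class ChargeResult:
        charge_id: int
        amount: Decimal
        succeeded: bool


    def charge_customer(customer_id: int, amount: Decimal) -> ChargeResult:
        """Strongly-typed entry point; the app's models, tasks, etc. stay private."""
        # ...delegates to this app's internal services and models...
        return ChargeResult(charge_id=1, amount=amount, succeeded=True)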

The tooling is also great: to deploy a new application you just run the Django `startapp` command and that's it. You get authentication, background processes, events, linters, etc. for free. It's very convenient; you don't even need to talk to infra.

So far so good.

dang a year ago

Related:

Understanding Polylith through the lens of Hexagonal architecture - https://news.ycombinator.com/item?id=38109928 - Nov 2023 (2 comments)

Polylith is a functional software architecture at the system scale - https://news.ycombinator.com/item?id=30697724 - March 2022 (82 comments)

Show HN: Polylith – the last architecture you will ever need? - https://news.ycombinator.com/item?id=25253731 - Nov 2020 (10 comments)

Show HN: Polylith – A software architecture based on Lego-like blocks - https://news.ycombinator.com/item?id=18123996 - Oct 2018 (5 comments)

crabmusket a year ago

Fun opening a project page to immediately step on a LEGO brick analogy! This is a pet peeve of mine to the degree I had to write about it a little while ago: https://listed.to/@crabmusket/30495/lego-blocks-are-never-a-...

As for Polylith itself, it seems to have some fairly sensible non-controversial ideas like building atop a foundation of pure functions, building up shared libraries, separating code by component instead of layer, etc.

barryhennessy a year ago

I've seen polylith over the years and it's always piqued my interest.

I'm curious as to what has been built (by yourselves or others) in the 4 (?) years since its release. Have the experiences held up to the initial goals and designs?

skybrian a year ago

I found it difficult to understand what it actually is by looking around the docs and blog posts. The high-level software philosophy only gives me a vague idea, relying on analogies to other things like Lego bricks. There's a lot of vocabulary, but the definitions are vague.

It seems to be a way to arrange your source code in a monorepo? Is there a core API? Are there servers?

languagehacker a year ago

Something about the tone of this book gives me the "time cube" vibe.

Nobody "just" refactors their monolithic code into a new architecture.

Any sufficiently advanced organization with a microservices architecture has a platform team helping to standardize applications and build out shared tooling that's already being used across multiple services.

It seems like there's plenty of room for an architecture like this, but that maybe it's closer to what already exists in practice at the places where it will work, and doesn't exist at organizations where it won't work for good reason.

webel0 a year ago

I've been mostly pleased with our use of python-polylith [1] with Poetry in a production application. We output a webapp, Python SDK, and CLI as separate "projects."
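
For context, a Polylith workspace separates small reusable components from the deployable projects assembled from them; a rough sketch of the layout (directory names follow the Polylith docs, not this particular codebase):

    .
    ├── components/   # small reusable building blocks shared across everything
    ├── bases/        # entry points (e.g. web API, CLI) that expose components
    └── projects/     # deployable artifacts, each combining a base with the components it needs
        ├── webapp/
        ├── sdk/
        └── cli/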

It doesn't _really_ solve Python dependency/import issues. Instead, it helps you keep your project in discrete chunks and well-organized. It also makes it easy to package up the separate projects as artifacts.

I've run into some issues with versioning separate projects but I suspect that is a matter of bandwidth rather than an actual, insoluble issue.

I'd use it again at a startup or on a project where you need to ship a bunch of artifacts but don't have a lot of bandwidth.

[1] https://github.com/DavidVujic/python-polylith

bsima a year ago

This just seems like a monorepo with extra steps.

  • pensatoio a year ago

    It is.

    A monorepo with strong boundaries between components is ideal, as opposed to a monorepo-monolith, where everything depends on everything. I think it'd be far easier to explain and justify this polylith pattern if you approached it from that angle - rather than pitching it as some unique new thing.
