To Microservices and Back Again – Why Segment Went Back to a Monolith
infoq.com

> A period of hypergrowth at Segment, around 2016-2017, added over 50 new destinations, about three per month. Having a code repository for each service was manageable for a handful of destination workers, but became a problem as the scale increased.
I’m not familiar with what Segment does, but this just sounds like bad design, not anything to do with microservices vs. monolith. Why does each “destination” need its own repo? Surely they could come up with an abstraction for a “destination” and create a solution that works for all destinations, with the differences between them captured in the configuration of the service.
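Something along these lines is what I have in mind, as a minimal sketch with made-up names (not Segment's actual code): one worker, one code path, and the per-destination differences pushed into configuration.

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class DestinationConfig:
    """Per-destination settings: the only place where destinations differ."""
    name: str
    endpoint: str
    auth_header: str
    # Optional hook for the rare destination-specific payload tweak.
    transform: Optional[Callable[[dict], dict]] = None


def deliver(event: dict, cfg: DestinationConfig) -> None:
    """One shared code path for every destination."""
    payload = cfg.transform(event) if cfg.transform else event
    # Stand-in for the actual HTTP call.
    print(f"POST {cfg.endpoint} ({cfg.name}) auth={cfg.auth_header!r} payload={payload}")


if __name__ == "__main__":
    destinations = [
        DestinationConfig("webhook-a", "https://a.example.com/track", "Bearer TOKEN_A"),
        DestinationConfig("webhook-b", "https://b.example.com/v2/events", "Basic TOKEN_B",
                          transform=lambda e: {"properties": e}),
    ]
    for cfg in destinations:
        deliver({"event": "Signed Up", "userId": "123"}, cfg)
```

Adding a new destination then becomes adding one more config entry, not one more repo.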
Is there a particular reason why this discussion is always a matter of all or nothing, with companies bouncing back and forth between the two, unhappy with both?
Is there not a viable paradigm somewhere in the middle?
The article is too general to get a clear picture of the exact challenges Segment faced when moving to microservices, but a few things stand out to me.
> Shared libraries were created to provide behavior that was similar for all workers. However, this created a new bottleneck, where changes to the shared code could require a week of developer effort, mostly due to testing constraints.
This I don't understand. Creating libraries for shared code makes perfect sense, of course, but that should never compromise developer productivity, which makes me think they had bad testing routines.
At the last company I worked for, I was part of a team rewriting a huge monolith into 50+ microservices, and it was a joy to behold. Whenever one library changes, all of the microservices are rebuilt and tested (automatically, of course). However, only the microservice that needed the library change is deployed.
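The rule boils down to something like this sketch (service names, library names, and the dependency map are invented for illustration, not our actual build scripts):

```python
# "Rebuild and test everything, deploy only what the changed library affects."
SERVICE_DEPS = {
    "billing-service":   {"lib-auth", "lib-http"},
    "email-service":     {"lib-http", "lib-templates"},
    "analytics-service": {"lib-http"},
}


def plan(changed_libs: set[str]) -> tuple[list[str], list[str]]:
    """All services are rebuilt and tested; only services using a changed lib are deployed."""
    build_and_test = sorted(SERVICE_DEPS)                        # always everything
    deploy = sorted(s for s, deps in SERVICE_DEPS.items()
                    if deps & changed_libs)                      # only the affected ones
    return build_and_test, deploy


if __name__ == "__main__":
    build, deploy = plan({"lib-templates"})
    print("build+test:", build)   # all three services
    print("deploy:    ", deploy)  # only email-service
```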
We did this in addition to library versioning, where we felt versioning was necessary, mostly for paranoia's sake. :) Segment also tried this, it seems:
> Creating versions of the shared libraries made code changes quicker to implement, but reversed the benefit the shared code was intended to provide.
From what I can understand from the article, this means they put functionality into the shared libraries that shouldn't have been there, because it was clearly too specific to be shared among multiple different services.
Personally, I love working with microservices. I feel they make almost everything easier, even though they require more time to set up and get running, especially in terms of proper testing, failover, etc. They also need a lot of thinking through, because most developers (an assumption on my part) are still used to thinking "monolith" style, me included. Whenever I prototype something, I start out with one application and then start thinking about how to separate the different parts into microservices that can be reused, or that otherwise benefit from running on their own.
There is also a middle ground: a combination where most things stay in the monolith and only certain features that require it get their own services. Why everything needs its own microservice, I don't get.