Versioned Go Commands (research.swtch.com)
TOC for all articles in the series: https://research.swtch.com/vgo.
Links to previous discussions about articles in the series:
Part 1: https://news.ycombinator.com/item?id=16421966
Part 3: https://news.ycombinator.com/item?id=16431299
I'm really excited about all this! No more GOPATH, no need for vendor directories; I think this is a positive step forward for Go, and package management in general. I'm also interested in seeing if there's any wider adoption of these ideas in the larger package management ecosphere.
One thing I've been mulling is whether this setup makes more sense in compiled languages than it would with dynamic languages. In the latter, your code base is your executable, so there may not be much difference between this and having a traditional vendor directory.
Of course, using semantic import versioning is something that can stand alone.
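For anyone unfamiliar, the heart of semantic import versioning is that the major version becomes part of the import path, so two major versions can coexist in one build. A sketch with illustrative paths (not a real library):

    import (
        lib   "example.com/lib"    // v1 (or v0): no version suffix in the path
        libv2 "example.com/lib/v2" // v2 and later: /vN suffix -- a distinct
                                   // import path, so both majors can coexist
    )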
GOPATH is one weird thing that I have never been able to justify to myself when trying out Go. I keep all my code under ~/co, I clone others' repos under ~/co/External, and many times I clone something to /tmp first and move it to ~/co/External if I see fit. Then this language comes around and says "well, that's not how you're supposed to do it, or you'll have to set this variable every time". So: good riddance.
> I'm also interested in seeing if there's any wider adoption of these ideas in the larger package management ecosphere.
If I'm not mistaken, isn't most of this how things work already? The minimal version selection algorithm may be novel, but things like locking dependency versions, no weird preconceptions about where you put code, easily caching dependencies, etc. are nothing new in the greater programming community. But whatever, I'm happy that Go's taking this path.
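For what it's worth, the novel part fits in a few lines: each module states the minimum version it needs of each dependency, and the build uses the maximum of those minimums, not the newest published release. A toy sketch with made-up version numbers (not vgo's actual API):

    package main

    import "fmt"

    func main() {
        // Minimum minor versions of dependency D demanded by the modules
        // in this build (made-up numbers for illustration).
        required := []int{2, 5, 3} // i.e. D v1.2.0, v1.5.0, v1.3.0
        selected := 0
        for _, v := range required {
            if v > selected {
                selected = v
            }
        }
        // v1.5.0 is chosen even if v1.9.0 has been published since,
        // which is what makes the result deterministic.
        fmt.Printf("selected D v1.%d.0\n", selected)
    }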
Vendor directories are the simplest way to include the deps with your project. Removing that ability would complicate many existing projects. Now, instead of just git, we will need an additional tool to properly vendor our deps.
I am afraid I don't exactly agree. If you are publishing a library, for one, it's best not to include any vendored dependencies to begin with. But, ideally, no one should include vendored dependencies, whether in a library or an executable; those are things that can be fetched at build time. ("But what if the dependencies disappear?" you may ask—this would be handled through the caching proxy in Russ's proposal.)
The additional tool you would need to build your software would be vgo.
For smaller teams/apps, it's super-handy to be able to just include dependencies in the repo. I'm guessing this is one of the capabilities vgo will gain in response to community feedback.
(The Go folks have a bit of a blind spot for environments that don't have heavy-duty infra. Recall how their initial solution to a monotonic clock was "just arrange for your ntp server to not do that...")
> those are things that can be fetched at build time
At least for Linux distributions, the package builders are normally not allowed any network access; this was an issue for packages using Rust before cargo-vendor existed and before cargo gained a flag to never try to update the lockfile.
Not allowing network access for build hosts is probably not limited to Linux distributions, so this use case will have to be addressed sooner or later.
It's a simple system. If the modules are already present in your local cache, it will just use them directly. So not having network access at build time is trivial to work around: just put the packages you depend on in your local cache and you're done. Note that with this system the selected versions are deterministic, so if the modules are present there's no reason to touch the network anyway.
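Roughly like this (the prototype's exact cache layout may change, but the idea holds):

    # on a machine with network access:
    vgo build ./...   # resolves versions, fills the local module cache
    # later, with no network access, the same command
    # reuses the cached modules and fetches nothing:
    vgo build ./...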
I have been out of the go space for a while, and recently took up a small project that uses it.
I was glad to see that 3rd-party tooling (the VSCode extension, for example) had improved a lot. And after the initial hump, using the dep[0] tool was a breath of fresh air compared to the kludgy alternatives back in 1.4/1.5.
As of a month or two ago, dep was slated to become the "go dep ..." command. I'm really curious (not in a demanding way, but legitimately curious) why:
1) it's taken them this long to stabilize dependency management, in what seems like an otherwise polished, well-supported ecosystem?
2) they could abandon a well-thought-out, reliable, community-supported tool that was endorsed by the Go team itself?
After a lot of trying I'm finally warming to the language but the community and ecosystem still leave a lot to be desired. I'll be interested to see how vgo plays out. Third time lucky I guess (get, dep, vgo)?
I'm not sure about (1), though Google's internal infrastructure making it a low priority seems plausible. I also think it's more common in Go to build directly on top of the standard library than in many other languages.
For (2), dep was endorsed only as the "official experiment", whatever that means. It looks like they went with a different design for a bunch of specific technical reasons outlined in these blog posts. According to the last one there are 67(!) pages of content. I'd recommend at least reading part 3 though, "Semantic Import Versioning".
The main advantage of the new system seems to be that it handles major version upgrades in a way that doesn't leave the user exposed to difficult build problems in situations involving shared dependencies. As a trade-off, it makes it a little less convenient to upgrade major versions (since the import paths change), although I'm sure tooling can make this easier.
I like a lot of the properties of it: builds should rarely break, the whole system is pretty easy to understand, you can make API v1 a wrapper for API v2 (so users who haven't migrated still get the new implementation), you get library versions with the dependencies they were tested with, backwards compatibility is encouraged, a single module can't force you to use old versions of sub-packages (it can only force upgrades), no lock files are needed since builds are stable without them, upgrades are easy but explicit, and so on... I guess the main thing is it would have been nice if it had been around earlier!
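To illustrate the wrapper point: because v1 and v2 live at different import paths, v1 can itself depend on v2 and just forward to it. A sketch with hypothetical module paths:

    // Package foo is the v1 API, implemented as a thin shim over v2.
    package foo

    import foov2 "example.com/foo/v2"

    // Client is an alias, so v1 and v2 callers share the identical type.
    type Client = foov2.Client

    // Dial forwards to the v2 implementation.
    func Dial(addr string) (*Client, error) {
        return foov2.Dial(addr)
    }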
Hopefully the transition from dep will be pretty smooth (I believe they are contracting with the main dep author to help with this). I think the vgo prototype already uses some information from other Go package managers (not only dep).
The dependency management story is awful because Go is developed for Google’s purposes, and Google builds from a monorepo.
The monorepo seems like a bit of a red herring. A fork maintained in a separate repository (like pressing the fork button on GitHub) would achieve the same. In fact, the type aliasing feature was added because Google (and others) struggled to make modifications in a single commit, so the atomicity the monorepo could theoretically provide has already proven not to be there in practice.
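(Type aliases, for reference, let a moved type keep answering to its old name during a gradual, multi-commit refactor. Hypothetical packages:)

    package oldpkg

    import "example.com/newpkg"

    // Buffer now lives in newpkg; the alias means oldpkg.Buffer and
    // newpkg.Buffer are one and the same type, not two copies.
    type Buffer = newpkg.Buffer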
The main difference is that Google is comfortable maintaining a fork that they can update from the mainline branch when needed/desired, but that will otherwise stay stable for their software. Others, who are used to more traditional package tools, are not. And perhaps their concern is justified, but interestingly I've never seen a Go experience report explaining why that approach failed them.
I took that to mean that most of the major "sanctioned" programs in Go do all of the heavy lifting "in repo."
Which is funny, because it is easy/natural to view dependencies as technical debt. You are literally building against someone else's technical assets, so in that sense most dependencies you have are easily argued to be technical debt. If you have the capital of Google, why do that? (For the rest of us, the answer is easy: we don't have that kind of capital, so we have to.)
The main pain point in Go dependency management is that it lacks a central package registry and hardcodes dependency paths in the source code. So developers have to vendor their dependencies in case a remote dependency disappears. And if a dependency's path changes, every source file that imports it has to be changed.
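Concretely, the host is baked into every importing file (illustrative path):

    import "github.com/someuser/somelib"
    // If somelib moves to another host or account, this line has to be
    // edited in every file that imports it.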
From the series of posts, vgo uses a proxy to cache dependencies, which solves the first issue. But I wonder whether the import path has to be changed to the proxy URL. At first I thought modules would solve the second issue, but part 6 of the series doesn't say whether the packages imported in source code still have to include the full package path with the URL, so that problem may remain. Correct me if I misunderstand it.