The GitHub registry public beta is live

help.github.com

268 points by talal7860 6 years ago · 136 comments

carapace 6 years ago

FWIW, Software Heritage already has your github repos: https://www.softwareheritage.org/

https://hn.algolia.com/?query=Software%20Heritage&sort=byPop...

And GNU Guix at least will transparently fallback to them:

> Since Software Heritage archives source code for the long term, Guix can fall back to the Software Heritage archive whenever it fails to download source code from its original location. The way this fallback has been designed, package definitions don’t need to be modified: they still refer to the original source code URL, but the downloading machinery transparently comes to Software Heritage when needed.

https://www.softwareheritage.org/2019/04/18/software-heritag...

  • jolmg 6 years ago

    You know what'd be really cool? For either Nix or Guix to transparently support installation of any version of any program, without hacks like putting an alternate version under a different name or some such. I wish programs didn't require continued maintenance for dependency updates, at the risk of becoming impossible to install without putting them and a number of dependencies under different package names. I wish I could install Firefox or Chrome from a decade ago as easily as I can install the latest versions.

    I wonder if it'd be too crazy to add git-awareness to the Nix utils. Like tell nix to install version X of some package, so in the nixpkgs repo it would check the history of the corresponding file to find the commit where it last was that version, checkout that commit, build the package and its dependencies, and then return to the branch it was at.
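
    A rough sketch of that flow in shell (the nixpkgs file path and the version string here are illustrative, and this isn't an existing nix feature):

        # in a nixpkgs checkout: find the newest commit that touched the
        # version string in git's package expression
        cd nixpkgs
        rev=$(git log -1 --format=%H -S 'version = "2.10.0"' \
          -- pkgs/applications/version-management/git/default.nix)
        # the newest pickaxe hit is usually the bump *away* from 2.10.0,
        # so its parent is the last tree that still had it
        git checkout "$rev^"
        nix-build -A git    # build git 2.10.0 and its dependencies as of that commit
        git checkout -      # return to the branch we were on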

    • eridius 6 years ago

      That would be cool, though I'd prefer an approach where the channels maintain a database of the pname/version combo for every package in the channel, and an API where you can query to find out what version of the channel contained a given pname/version combo for a given attribute path. Then you can just download the tarball for that version, no git needed anywhere.

      In theory, that database could also just be included in the channel so it could be queried locally, but I don't know how large it would end up being.

      • jolmg 6 years ago

        Honestly, I think I like doing it with git more. If you maintain such a "database", you'd have to explicitly state version info for the dependencies of every package. That's done automatically with git. For a particular version of a package, the versions of the dependencies it's compatible with are those that are present in the same commits.

        Also, to maintain package build scripts in a database outside of git, you'd need one of a few approaches. One is a single script per package that builds every version of that package ever, which sounds inconceivably hard. Another is separate scripts per version, which sounds redundant. Another is for newer scripts to import code from previous versions' scripts and override parts, which sounds convoluted. If you use git, the script only has to worry about the current version; not forgetting how to build previous versions is handled automatically by git.

        Re-reading your comment though, I think you're talking about pre-built packages. If that's the case, then I think I agree. I'm talking about the repo containing the scripts for building such packages.

        • eridius 6 years ago

          The default way to consume nixpkgs is through channels which today does not involve git. And a nixpkgs git checkout is over 1GB (my .git dir is currently sitting at 1.4GB on this machine). So that's not great.

          > If you maintain such a "database", you'd have to explicitly state version info for the dependencies of every package.

          I don't know what you mean. If I say "I want git v2.10.0" I don't care about dependencies; whatever nixpkgs tarball I download that has that will have the dependencies too. There will be a whole range of nixpkgs tarballs that contain the requested git version of course, but with such a database I could also say "find me a channel tarball that contains both git v2.10.0 and some-dependency v7.23" if I want that particular combination (assuming the two coexisted in the same nixpkgs at some point).

          > For a particular version of a package, the versions of the dependencies it's compatible with are those that are present in the same commits.

          This is true of downloading a nixpkgs channel tarball too.

          > Also, to maintain package building scripts in a database outside of git would require either making 1 script per package that builds every version of that package ever.

          I don't know what you mean by this either.

          > Re-reading your comment though, I think you're talking about pre-built packages.

          No I'm not.

          What I'm talking about is just any time hydra builds a channel, it can run a script that collects the pname/version combination for every derivation in nixpkgs (or at least, every searchable derivation; we probably don't need to collect this from all the invisible helper derivations). This can then be appended to a database from the previous version, such that we end up with this information for every single tarball.
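
          Such a collection step could be tiny. A sketch, assuming nix-env's --json output carries pname/version for each attribute (the output file name and the $CHANNEL_RELEASE variable are made up):

              # run once per channel build; append one row per visible derivation
              nix-env -qaP --json -f ./nixpkgs |
                jq -r --arg rel "$CHANNEL_RELEASE" \
                  'to_entries[] | [$rel, .value.pname, .value.version, .key] | @tsv' \
                >> pname-version-index.tsv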

          In fact, this is pretty much exactly what you'd do for git, except you'd build it incrementally rather than running a script that processes the entire git history from scratch (which is pretty much building it incrementally for every commit).

          • jolmg 6 years ago

            I see I had completely misunderstood what you meant. When I last used NixOS, some 3 years ago I think, I didn't really use the channels. Since I wanted to make some modifications to some files in nixpkgs, I preferred to have the nixpkgs repo locally. Seems I forgot about them.

            EDIT:

            > The default way to consume nixpkgs is through channels which today does not involve git. And a nixpkgs git checkout is over 1GB (my .git dir is currently sitting at 1.4GB on this machine). So that's not great.

            One could also add tags to the repo in the form of pname-version for every package. I wonder how well git can handle that many tags...

            In any case, the advantage of being able to do this from the git repo is that you wouldn't depend on someone forever hosting every version of a channel. I would think one would discard old channels before they discard git history.

            • eridius 6 years ago

              I suppose you could build this database on top of git first, and then transform it to be relative to nixpkgs channel tarballs, since each channel release maps to a git commit.

    • matthewbauer 6 years ago

      I think a tool like that could be very useful! The main issue with basing it off of Git commits is that there is no guarantee that the last commit with a given version is actually a good version. Consider the case where a-1.0 depends on b-1.0, and b is updated to b-2.0 in a commit such that a is not compatible with b-2.0. Even though a-1.0 is still around, it's not going to work until we update it to a-2.0, so you need some more complex constraint solving on top of your commits to figure out what works.

      I would prefer doing it based on the 6-month release channels, so you get multiple versions, one for every 6 months. You end up with some gaps between versions, but also have more guarantees that everything actually works together. Basically "nix search" with multiple channels.

      I actually had to do something similar with GHC versions for a project of mine. It turns out you can run Nixpkgs all the way back to the first release, 13.10 (LC_ALL=C is needed). Obviously that's not very long ago right now, but it should continue to work as time goes on & eventually give us 10+ years.

      https://gist.github.com/matthewbauer/ecdb665f84b7f6100f3f6e3...

    • civodul 6 years ago

      The `--with-commit` and related options of Guix is one step in that direction: https://guix.gnu.org/manual/en/html_node/Package-Transformat... .

      What you suggest in the second paragraph sounds great and definitely doable!

    • Crinus 6 years ago

      > I wish I could install Firefox or Chrome from a decade ago as easily as I can install the latest versions.

      You can, on Windows. The secret is the system keeping backwards compatibility in mind for its core APIs, something that most desktop APIs on Unix (outside of X11) do not do.

      • jolmg 6 years ago

        The main problem in Linux is dependency updates, and I never understood how dependencies in Windows work. Are most programs built as fat binaries that carry all their dependencies with them? Or do Windows programs just never build on top of other 3rd party programs and always just depend on what's provided by the base OS?

        • Crinus 6 years ago

          A bit of both. The base OS itself provides way more functionality out of the box than Linux (imagine an asterisk here), and that is done through APIs that have remained backwards compatible going back to Windows 95 (of course, new stuff got added in later versions). When applications want more, they bundle their dependencies with them (either via static or dynamic linking), and sadly that is indeed repeated. However, the base OS provides enough functionality that you do not need to, e.g., bundle an entire GUI toolkit that implements everything from scratch (ignoring Qt/Java/etc. programs here - they do that for portability, but they wouldn't have to if they only cared about Windows).

          (Now the asterisk: in terms of the functionality above, and taking an average desktop distribution in mind, you'd probably find more of it in Linux. But applications can't rely on most of that functionality being available on all distros as a "standard", and even for the stuff they could expect, way more often than not they have to rely on specific version ranges - e.g. an application cannot rely on "Gtk", it can only rely on "Gtk <version>.xxx", as Gtk itself isn't backwards compatible across major versions.)

        • batat 6 years ago

          Depends, also WinSxS[1] helps.

          [1] https://en.wikipedia.org/wiki/Side-by-side_assembly

danShumway 6 years ago

> is a software package hosting service, similar to npmjs.org, rubygems.org, or hub.docker.com, that allows you to host your packages and code in one place. You can host software packages privately or publicly and use them as dependencies in your projects.

I am... really confused by this.

Isn't this just Github? Github is a hosting service that allows you to host your packages and code in one place. It has testing and publishing pipeline support, you can add artifacts/releases, make your packages private or public, host different types of software at the same time, and it's compatible with most existing dependency systems, including NodeJS.

I can see this has more download statistics, which is nice. And it has a policy that artifacts can't be deleted, which is very nice.

Is that it though? I know I have to be missing something; what can I do now that I couldn't already do with Github as is?

  • brown9-2 6 years ago

    They are implementing the APIs that the package managers expect to fetch artifacts, rather than the package managers having to know how to fetch files via Git.

  • Pxl_Buzzard 6 years ago

    This appears to be a move to meet package managers halfway. Yes, tools like npm and go have great integration with git repositories, but others like NuGet still require a hosting source. Long-term, you could imagine package managers forgoing their own hosting services in favor of letting GitHub be the primary host who takes on the issues of bandwidth, availability, access controls, etc. It's another vector for GitHub to compete in FOSS.

    • bbatha 6 years ago

      It's also good for GitHub's economics to be able to offload data from git itself to easily mirrored, versioned tarballs. It's much more cost effective than mirroring constantly changing master branches or, worse, whole git repos for clones.

  • hn_throwaway_99 6 years ago

    If you are familiar with Nexus or Artifactory or Verdaccio, which all essentially let you have private NPM repos (among other formats like Maven, etc.), that's what this is.

    • danShumway 6 years ago

      What's the difference between publishing a binary file on Artifactory and linking to a binary file in a Github release[0]?

      Is Artifactory immutable? Or I guess that it handles versioning/publishing better?

      [0]: https://help.github.com/en/articles/linking-to-releases

      • WatchDog 6 years ago

        Each dependency management tool has its own nuances about how artifacts should be uploaded and retrieved, and what metadata should be stored alongside them.

      • oblio 6 years ago

        Artifactory can also proxy stuff. It's widely used as a corporate proxy for public repos so that you don't rely on the whims of the internets.

  • matharmin 6 years ago

    In my experience, using git as a dependency source for NPM (including yarn) or Ruby never worked well. It works for a simple case, but it's usually much slower, has issues around managing credentials for private repos, and doesn't have a nice way to publish built files.

    • bad_user 6 years ago

      Also, most importantly, Git repositories are not immutable, and any package repo that's not immutable is a terrible, terrible idea.

      • rkeene2 6 years ago

        This is one reason I created hashcache [0][1], for referencing remote immutable resources that can be addressed by their cryptographic hash. I used this in my Linux distribution to download source tarballs for every package by their SHA2-256.

        [0] https://chiselapp.com/user/rkeene/repository/hashcache/ [1] http://hashcache.rkeene.org/
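
        For anyone unfamiliar with the pattern, a generic shell equivalent (not hashcache's actual interface) is to refuse any download that doesn't match the pinned digest:

            # fetch, then verify against the expected SHA2-256 before using it
            curl -fsSLO https://example.com/src/foo-1.0.tar.gz
            echo "<expected-sha256>  foo-1.0.tar.gz" | sha256sum -c -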

      • ageofwant 6 years ago

        Git repos can be as immutable as you want. You just need to point your package manager to a commit or tag, instead of a branch head. If you are worried about a rebase, well, you have that issue with any public artefact store.

        • bad_user 6 years ago

          The point is that it's not your Git repo, usually, when talking of dependencies, so it's not really about what you want.

          SHAs can't be changed, but they can be deleted. And on GitHub, entire projects, usernames, and orgs can be deleted. Or renamed. In the case of a user rename, GitHub does maintain redirects for a while. Until that username is taken by somebody else.

          • ageofwant 6 years ago

            If that is a big concern, you can fork. If you are building production systems with dependencies on eggs you can't find in PyPI, you probably should take control of those in your own copies. I can't recall once that I had to do that for things that I ask money for, though... if it's not in PyPI, it's probably not worth using. And if it is useful, forking or just copying the module or package into your own code base takes care of any shifting dependencies.

            So yea, does not seem to be a problem that actually exists.

            • bad_user 6 years ago

              > If that is a big concern you can fork

              Surely you must be joking.

              Yes it is a big concern and the solution is to use repositories that aren't so volatile.

      • juliusmusseau 6 years ago

        Copyright law prevents package repos from being truly immutable.

        Fortunately a copyright takedown request is not a typical scenario, but it does happen, even with "immutable" repositories like maven-central.

        • bad_user 6 years ago

          Once a piece of software is released as open source, it can be freely distributed. And Maven Central packages require an open source license. The author might own the copyright, but he licensed that copyright away when publishing on Maven Central.

          In other words a "copyright takedown request" isn't valid, unless the author was in violation of the copyright of somebody else while publishing those packages and this was decided in a court of law.

          It might happen, but I have never heard of Maven Central packages being removed.

          But I do see GitHub repos being renamed or removed all the time, and I have seen NPM packages removed for no reason other than that the author wanted to, screwing the entire JavaScript ecosystem.

          • lmm 6 years ago

            > In other words a "copyright takedown request" isn't valid, unless the author was in violation of the copyright of somebody else while publishing those packages and this was decided in a court of law.

            The DMCA process is law. Maven Central (like anyone else who hosts things) have to respond to valid takedown requests (which means taking down content long before any court case; even if a counter-notice is filed the content still has to be taken down temporarily) or else they'd become liable for infringement themselves. It's less common than on github or NPM, sure (which I suspect has more to do with the complexity of maven central's registration process than anything else), but it happens and any host on the scale of maven central needs a process in place for doing it.

            • juliusmusseau 6 years ago

              Even bad_user's assertion that "Once a piece of software is [legitimately] released as open source, it can be freely distributed" is not 100% true. There is a mechanism in US copyright law through which copyright holders and their heirs can unilaterally retract copyright grants and licenses 56 years after the initial grant or license.

              Granted this is quite the esoteric edge case... at least for now. ;-)

      • aianus 6 years ago

        At least for Ruby, it was easy to pin a specific commit SHA in the Gemfile to guarantee immutability.
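
        In Gemfile terms it looks like this (gem name and URL are made up; git: and ref: are standard bundler options):

            # bundler fetches exactly this commit, whatever the branch does later
            gem "somegem", git: "https://github.com/example/somegem.git",
                           ref: "0123abc456def7890123abc456def7890123abc4"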

        • koolba 6 years ago

          That does not guarantee it exists at the source repo though. You can't create different content at that same hash, but you can rewrite history or delete the branch containing that hash entirely, and it will eventually be GCed away.

  • chrisseaton 6 years ago

    > what can I do now that I couldn't already do with Github as is?

    Before this new service how would you use GitHub as a source for installing, for example, Maven packages?

    • farmerbb 6 years ago

      There's https://jitpack.io, but it's a third-party service. Having first-party support for Maven artifacts served by GitHub will be quite nice.

    • danShumway 6 years ago

      I guess I'm not sure how Maven works then -- I thought it was just downloading package binaries? I would use Github releases for that and link to the binary directly. I'd use a CI to auto-build and publish a new release binary whenever I pushed to master.

      Does Maven do something more complicated like automatically figure out which platform binary to pull?

      • chrisseaton 6 years ago

        It's the file layout, for just one thing. You can't just point Maven (or many package managers) at a simple HTTP server without the correct layout.

        If you could... they wouldn't have built this.

        • danShumway 6 years ago

          Can't you? http://repo.maven.apache.org/maven2/ looks a lot to me like a simple HTTP server.

          I'll take your word on it though. I don't know much about how Java package management works, and like you said, I assume the Github team wouldn't waste their time building something that wasn't necessary.

          I guess if nothing else it would be a pain in the neck to have to know in advance how release files had to be laid out.

          • chrisseaton 6 years ago

            > ...without the correct layout

            How are you going to recreate that directory structure with GitHub releases? You can't even have any custom directories - they're just release-name/file-name.
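
            For reference, a Maven repository expects roughly this shape (the coordinates here are invented):

                com/example/mylib/maven-metadata.xml           # lists available versions
                com/example/mylib/1.0.0/mylib-1.0.0.pom        # dependency metadata
                com/example/mylib/1.0.0/mylib-1.0.0.jar
                com/example/mylib/1.0.0/mylib-1.0.0.jar.sha1   # checksum file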

            I mean just try recreating it yourself and see how far you get.

            You could try using GitHub Pages instead, but GitHub very actively pesters you if it even looks like you're distributing binaries there.

    • dragonwriter 6 years ago

      > Before this new service how would you use GitHub as a source for installing, for example, Maven packages?

      http://www.lordofthejars.com/2011/09/questa-di-marinella-e-l...

  • dahfizz 6 years ago

    It may seem less useful for interpreted languages, like JS, where the code is the artifact. For compiled languages, having a separate place to store official artifacts is much more important.

    • koolba 6 years ago

      Compiled and interpreted are not mutually exclusive either. In particular in the JS world it’s very common to transpile from modern JS or TypeScript to lowest common denominator JS.

jrochkind1 6 years ago

Looking at the ruby docs, my interpretation is that if a gem is published only on the github registry, there's no good way to use it as an indirect dependency (no good way for another gem to list it as a dependency). Any app using such a gem would have to know the list of all of these indirect dependencies hosted on the github registry and list them individually in the top-level Gemfile, along with their correct github source.

This seems to limit the utility for ruby. I'm not sure if other supported platforms have similar issues?

You could already do a lot of what the github registry for ruby does by using an existing feature that lets you point to a git repo (not just GH) in your `Gemfile`. What this adds is just the ability to resolve multiple versions from github using ordinary rubygems resolution. The existing feature forced you to manually specify a tag (hoping there was a predictable tag for a version) or SHA, or use whatever is on master HEAD.
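
For reference, the existing feature looks like this in a `Gemfile` (names are illustrative):

    # pre-existing bundler feature: fetch a gem straight from a git repo
    gem "somegem", git: "https://github.com/example/somegem.git", tag: "v1.2.3"
    # ...or pin a SHA with ref:, or omit both and track the default branch HEAD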

  • tehbeard 6 years ago

    Other platforms (maven/java comes to mind) benefit somewhat due to the compiled nature of artifacts.

    The immutability of the packages is also handy; as you pointed out, relying on a tag staying static is a hope and a prayer.

    Is there not a global config for rubygems that would specify a list of registries to search for a package instead of having to add them to each project?

    • jrochkind1 6 years ago

      The way they have set things up, every github account/organization (the first thing after a slash) is its own separate 'source' to rubygems. (I am sure they have done this because it would be inconvenient to integrate with rubygems/bundler any other way.)

      So you'd still need to add a separate source for each dependency hosted on github to your own project Gemfile. Including for each indirect dependency, knowing which indirect dependencies exist that need a github repo source.
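
      Concretely, each owner becomes its own source block in your Gemfile, something like this (OWNER is a placeholder; see the linked help page below for the exact URL):

          source "https://rubygems.pkg.github.com/OWNER" do
            gem "some_gem_published_by_that_owner"
          end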

      If you could list this for the entire project... it'd probably be a performance issue as rubygems/bundler check every repo source you list for every dependency (including every indirect dependency; a Rails app has hundreds, still an order of magnitude or two less than a react JS project heh).

      Even if you could list "github's ruby registry" only once (per project? for your account? and keep in mind this is hypothetical, you can't), it would still mean any gem expressing a dependency on another gem hosted on github would have to include in its instructions "oh, if you use this, you need to manually make sure to add github to your sources. Or you'll get an error that says some gem you've never heard of can't be found, and have no idea how to fix it." Unless it's a bid to get _everyone_ to do that, and basically make the github ruby registry a standard part of the ecosystem that everyone just always adds to every project.

      I don't think there's enough/any value added by the github ruby registry to get the ecosystem to shift like that. It's unclear what it does that the 'standard' rubygems.org gem source doesn't do already (unless rubygems.org can't solve its recent severe compromised-account security problems... but as it is, with the indirect dependency problem, I think the github registry will be too painful to use even if you'd like to use it to escape rubygems.org's security issues).

      https://help.github.com/en/articles/configuring-rubygems-for...

psadauskas 6 years ago

Deja vu https://github.blog/2008-04-25-github-s-rubygem-server/

And then removed 16 months later: https://github.blog/2009-10-08-gem-building-is-defunct/

Hopefully this one lasts longer.

  • rogerkirkness 6 years ago

    As a Product Manager, it's a little bit hard to overestimate customers' ability to 1. punish innovation and 2. punish a lack of innovation. Experimentation = bad, no experimentation = also bad. It's like how Google makes some of the best software ever, and people also savagely denounce them every time they kill a failing product. As if they would have learned as fast if they either didn't make the product to begin with or kept it around to languish in maintenance.

    • tidepod12 6 years ago

      It isn't a customer's responsibility to be a company's guinea pig, and it's not a secret that customers would be unhappy that tech companies treat them as such. This is especially true when Product Managers intentionally implement features that take advantage of users by monetizing their data and then implementing high switching costs that make it even more painful for the customer once the Product Manager ends their "experiment". If tech companies want to perform market research by experimenting on customers, they should do the same thing that other industries do and compensate the experiment subjects, not take advantage of them.

      If you want to disrespect customers by treating them like disposable guinea pigs (and not even giving them the courtesy of notifying them they're part of an experiment), don't be surprised if they start to catch on and treat your company as if it's disposable, too.

      • rogerkirkness 6 years ago

        The world would have almost no innovative technology if not for a period where customers tolerated "not good enough".

  • btown 6 years ago

    It's incredible to read the casual tone of those postings. Ten years ago GitHub was just an amazing innovation that could spin a service shutdown as an experiment; now everything it does is vital infrastructure for modern development.

    • oblio 6 years ago

      It's also a service they launched in their first year as a company... I think they were still looking for an MVP at the time; I have no idea how anyone could compare the situations.

  • jrochkind1 6 years ago

    I would say that went away because the rubygems ecosystem (including the invention of bundler) improved enough that there was really no reason to use github as your rubygem server. It didn't give you anything but a slightly confusing proprietary alternative with no added value. So people rightly stopped using it, and it rightly went away.

    It's not clear to me what the value added for ruby specifically is now. (Yes, I know rubygems.org has problems; but this has feature problems with indirect dependencies compared to rubygems.org hosting).

hprotagonist 6 years ago

There's no immediate mention of this on the site, but -- why did they select the package formats that they did?

I'd love to be able to host wheels for my python projects, or {rpm, deb, flatpak, etc...} for effectively arbitrary code. Is that in the works?

  • alexbecker 6 years ago

    Running a python package registry has some unique challenges, so it makes sense not to start with it (I run such a registry: https://pydist.com).

    For example, Python has a distinction between distributions (the actual file downloaded, e.g. a tarfile or a manylinux1 wheel) and versions that doesn't exist in most other languages.

    • takeda 6 years ago

      All of these concerns are handled on the client side; in the end, all Python needs is an HTTP server. It can actually be hosted on S3.

  • OJFord 6 years ago

    Some function of popularity on the platform and ease of implementation, I'm sure.

    And, I suspect (from python's noteworthy absence), degree to which the language's users were already (mis)using GitHub releases (or sites) for this purpose.

  • ageofwant 6 years ago

    Perhaps because pip already supports git repos:

        git+ssh://git@bitbucket.org/foo/bar.git@fixit/atemp69#egg=hotshit
  • Guillaume86 6 years ago

    > There's no immediate mention of this on the site, but -- why did they select the package formats that they did?

    Don't know, but the formats do not seem to match the Azure DevOps package feed formats (some overlap of course, but some are missing in one, some in the other), so it's not from shared code.

hobofan 6 years ago

At least in the case of NPM (I don't know as much about the other ones): Doesn't that create a huge opportunity for hijacking attacks, where someone publishes a malicious NPM package in the default NPM registry under a scope identical to a GitHub organization/username?

  • quickthrower2 6 years ago

    That is an interesting idea, playing on people's confusion as to where to install from. And someone is going to put the super terse `npm i -g mytool` on their README.md page (because it's all about the easy installs, isn't it!) and forget to say "change your registry to github" and boom!

    • hobofan 6 years ago

      Not even "someone". Exactly that command is available to copy to clipboard on the page of this new feature. Yeah, a small link to the instructions is printed underneath it, but most users - especially the ones that are new to package managers and the most vulnerable - will ignore that.

  • IggleSniggle 6 years ago

    Maaaaaaybe. But I don’t think so in practice. At least not yet. It would probably be good to see package.json evolve to allow specifying the registry. In my personal experience of using a private npm registry, you pay a lot more attention to package-lock; package.json could probably evolve to specify the registry in a similar way.
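
    For what it's worth, npm can already bind a scope to a registry in `.npmrc`, which would blunt the confusion above (OWNER is a placeholder):

        # .npmrc: installs of @OWNER/* go to GitHub; everything else stays on npmjs
        @OWNER:registry=https://npm.pkg.github.com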

hiccuphippo 6 years ago

Any word on trying to tackle package build verification/reproducibility, so users can be guaranteed that the package was built from the source code?

Problems like the rubygems one from yesterday, and npm's a few weeks back, would be gone with something like that.

mvanbaak 6 years ago

Deleting packages is not supported. So how do you handle a compromised package? Looks like you have to contact GitHub and hope they act fast.

Oh, and no pip registry :(

  • gkoberger 6 years ago

    The alternative is that critical infrastructure can just... disappear. Like "leftpad", but worse.

    GitHub is already really great about alerting you to critical issues. Whenever there's a security bug, it pops up in our repo (and with Dependabot, it's become automatic).

    • jmb12686 6 years ago

      I have appreciated the automated notifications from GitHub for projects that have known vulnerable dependencies in my package.json(s).

      I just looked up Dependabot and linked it with a repo that I already have robust testing and CI pipeline for. Preliminarily Dependabot is great!

      It automatically updates my dependencies to the latest versions and submits individual PRs. Since I have TravisCI hooked up to this particular repo, I can see all the test results for each PR and can (confidently) merge the changes into master without manually firing up my personal dev machine(s) and performing what Dependabot just did by hand.

      Anyway, thanks for the tip!

  • aaomidi 6 years ago

    Probably because deleting causes a lot of issues.

    They should have release and snapshot branches.

  • smudgymcscmudge 6 years ago

    Deleting was supported at initial release. Removing delete support was one of the first changes they made. My guess is that was because of the feedback they got here and on twitter.

nickjj 6 years ago

Any word on what the price will be for private repos after the beta ends?

Would be interesting to see how it compares to Docker Hub for hosting private images.

  • andyfleming 6 years ago

    I think if it's at all comparable, GitHub will win out. It just seems convenient to not have one more subscription with another provider. Plus, hopefully, it will all be integrated well workflow-wise with the repository, with actions that publish to the registry.

    • devmunchies 6 years ago

      If Docker loses, then who will maintain Docker?

      • andyfleming 6 years ago

        There's more to docker than Docker Hub, but I'm not sure what % of their revenue is from Docker Hub.

        Even if GitHub "wins" there will still be a lot on Docker Hub.

        Also, it's already possible to use other registries like Amazon's ECR or Google Cloud's GCR, or even your own private one.

wbillingsley 6 years ago

I came across this one the other day, which looks like it does this plus producing the binary package for you:

https://jitpack.io/

(No, I'm not related to that company in any way. I just saw it yesterday and thought it seemed like a neater solution.)

craigds 6 years ago

Still no python support :(

  • ageofwant 6 years ago

    A bit disappointing, yes, but pip has had support for git repos for many years. In requirements.txt:

        git+ssh://git@bitbucket.org/foo/bar.git@fixit/atemp69#egg=hotshit
    
    So perhaps that's why. Still, I would like to have a github-hosted devpi.

    • craigds 6 years ago

      Git URLs are tricky to use with many tools (like pip-compile) though. At best they're slow, since things like "what's the latest version?" require downloading the repo.

      We've forked some things into a private DevPI instance at present for that reason (well, also for latency).

      • orf 6 years ago

        Pipenv locks the VCS dependency to the commit, making pulling very fast.

andyfleming 6 years ago

I hope they add robots accounts like Quay.io (https://docs.quay.io/glossary/robot-accounts.html).

  • justincormack 6 years ago

    It says that permissions are the same as the github repo, so you can create github accounts and grant access, or use tokens.

    • andyfleming 6 years ago

      It's nice to be able to manage machine/robot accounts more directly though.

      On top of that, for orgs, you'd be paying monthly for each user you add.

gotts 6 years ago

I believe it's going to affect the whole developer community in a bad way.

Right now, all the major package managers are indirectly making each other better: they experiment, improve, and borrow good ideas from each other. It's open source, and there is little barrier for developers to contribute.

If, n years from now, GitHub becomes the de facto standard for package managers and replaces all the existing ones, further innovation will be much slower.

It might transform into "Want to improve package managers? You have to work for Microsoft"

  • txcwpalpha 6 years ago

    I think you're conflating package managers and package registries here. This GitHub product has almost no overlap with package managers like pip, gem, maven, or npm. It is not a replacement for docker.

    This is a replacement for npmjs.com (the hosting service, which is not the same thing as npm the package manager), or for rubygems.org (again, the hosting service, not the gem tool), or for Docker Hub.

    If anything, this may actually improve the collaboration between package manager developers because there will now be a large development team that will be working with the backend of various package registries and will have better insight into what each one is doing wrong and what each one is doing right.

andyfleming 6 years ago

I think this beta sign-up has been up for a bit already. It still just adds you to a wait list as far as I can tell, unless I missed something.

  • max23_ 6 years ago

    I signed up for this when they first announced it, but I still haven't gotten an invite to access it. I am not sure how they choose who gets early access.

samcat116 6 years ago

Weird that they announced support for SPM packages, but I don't see that anywhere on this page.

thefounder 6 years ago

I like Go more. The git/hg/svn/bzr repository is the "package". No custom (and central) "registry".

  • atombender 6 years ago

    It's worth noting that Go is getting its own registry system soon, the Go Module Index [1].

    [1] https://blog.golang.org/modules2019

    • thefounder 6 years ago

      You can use modules without a registry. It remains to be seen how many people prefer a centralised registry.

  • jayflux 6 years ago

    There was a Go dependency that got deleted (repo removed) and screwed everyone else’s project up, so I’m not sure this is perfect either.

    • tacticus 6 years ago

      Which won't stop it happening with a package manager either.

      Even immutable repos are going to have fun with DMCAs.

      • jayflux 6 years ago

        npm hosted the packages, so it acted like a cache; the same didn't happen when people deleted repos.

        • thefounder 6 years ago

          What happened when the npm owner decided to remove some packages? Why would you trust the npm owner more than the package author? Can't you just cache the package if you need a cache?

          • jayflux 6 years ago

            > What happened when the npm owner decided to remove some packages?

            They can't, at least not for older packages; I believe they need to contact NPM if they wish to unpublish packages. https://docs.npmjs.com/cli/unpublish

            • thefounder 6 years ago

              I mean when the registry owner (i.e. Joyent or the Node org) removes your package for various reasons. Not to mention that private packages are a pain (i.e. you need to spin up your own registry).

  • brown9-2 6 years ago

    It helps that Go’s “distributable package” is the source code itself rather than a binary artifact.

  • penagwin 6 years ago

    Same with npm, at least with git.

  • sooheon 6 years ago

    This is not unique to Go, every language I work with can use dependencies in this way.

catern 6 years ago

The lack of "the" makes this read a bit weirdly:

>GitHub Package Registry allows you to develop your code and host your packages in one place. You can use packages from GitHub Package Registry as a dependency in your source code on GitHub.

"Package registry" is a fairly generic term, so to me it would be natural to refer to this product as "the Github package registry" (capitalized or not).

Is there a name for deliberately avoiding "the" in this way?

needusername 6 years ago

They do not have a published list of requirements for Maven artifacts. This does not give a good first impression.

tasogare 6 years ago

> limited public beta

"Limited" should have been in the title, because it makes it a not so public beta.

no_wizard 6 years ago

For all the features GitHub has, this is the only one that has made me, and those I know personally, care about and watch very closely what GitHub does.

We've been looking for a simple way to streamline releases. Right now everything we have at my job is on GitLab and I use GitLab personally (though I have a github account, of course).

I prefer GitLab in every way, but this feature alone might be a good enough reason to switch. It would make releases just so darn easy. The only thing I hope (which is not made clear) is that the stipulation that you can't easily delete a package on the registry (according to the link, it's only for GDPR requests and legal reasons) is something that, for instance, an Enterprise account wouldn't have. I already have our purchasing team looking into it; that's how serious this is.

If the API for hitting these packages is any good, it's gonna be so hard to resist.

I really hope GitLab has a good response to this.

To wit, since GitLab can be self-hosted, I wonder how hard it would be to add this to the CE edition....

With all that said, I wonder what the hidden limits will be. Imagine if, instead of NPM maintaining all of its servers, it was just a thin database that routed names to GitHub releases. Would that fall afoul of GitHub?

I mean, what's the point of maintaining your own distribution server when GitHub can front all the hosting costs and all you have to do is map the name of a package to its GitHub Package release URL? I could see NPM, PyPI, et al. just doing that, instead of having their own servers. Maybe it's a good idea to run additional cache nodes, but GitHub being the main place where release code lives for your package index would cut the bills significantly, no?

  • itslennysfault 6 years ago

    This is a feature Azure DevOps (formerly Visual Studio Team Services) has had for at least 3 years now. Their registries support Maven, Gradle, Pip, and NuGet in addition to NPM. I'm always surprised more people don't use it. It's a full-featured ticket system, git hosting (PRs, etc.), package feeds, and CI/CD in one neat package.

    • no_wizard 6 years ago

      I did not know that. Though we aren't on Azure for anything at all (AWS for some HIPAA stuff, Google Cloud or our own proxmox cluster for the rest).

      I know Azure Pipelines is becoming the sort of de facto automated CI/CD pipeline though (it used to be Travis for so long), and I've heard nothing but good things about it. Might have to take a look.

      • GordonS 6 years ago

        It's called Azure DevOps, but beyond technically being (transparently) hosted in Azure data centers, the "Azure" part of the name is pretty meaningless.

  • tr-gitlab 6 years ago

    GitLab PM here: The GitHub registry looks really interesting. I like how they incorporated search and how they are encouraging people to host their packages on GitHub instead of npm.

    At GitLab, the CE edition currently offers a container registry, that allows users to build, push and share images using the Docker client and/or GitLab CI/CD.

    The EE edition offers an NPM and Maven registry, that allows users to publish, download and share dependencies. Both also integrate with GitLab CI/CD. We are currently working on Conan (C/C++) and NuGet (.NET). We are evaluating moving these features to CE as well.

    We also offer a proxy for Docker images (which will be extended to each registry) that improves reliability and performance and (in the future) will help mitigate and remediate open source risk.

    If you end up trying GitHub's registry, I'd love to hear more about what you thought.

  • sytse 6 years ago

    At GitLab we already have a package stage and support Docker, npm, and C packages. More are in the works.

    GitLab can also work as a proxy to upstream registries, for performance and, in the future, for security.

baybal2 6 years ago

No RPM or DEB support.

penagwin 6 years ago

Any word on pricing?

codingslave 6 years ago

GitHub is going to be the source of code and data that neural networks use to write software. There is so much data on there, and it will only increase. There are only so many coding patterns. Get ready to be a fill-in-the-blanks developer.

jonny383 6 years ago

Honestly, GitHub has been going downhill for about 18 months now. It all started with the ":D Set status" feature. I give it another two years before Microsoft has officially turned GitHub into a 2021 version of Skype.

  • cranky_coder 6 years ago

    I’m confused, is this an example of one of those “:D Set status features”? It seems useful to me...

    • jonny383 6 years ago

      This is a glorified wrapper written around existing package managers. I wouldn't call that useful.

      • dstaley 6 years ago

        No, this is a reimplementation of basically every major package server. Now, instead of hosting your Ruby gems on rubygems, NPM packages on the NPM registry, and Python packages on PyPI, you can host them directly alongside your source code. The tools by which you access these package registries are the same, but they can now be backed directly by GitHub.

        • jonny383 6 years ago

          Which is a glorified wrapper. Why on earth would you make a decision to lock up all of your eggs in the Microsoft basket? Diversity is a good thing. Just look at what's happened to the web industry with Chrome, and now it's basically too late.

          • gotts 6 years ago

            I couldn't agree more with jonny383. Why do so many people consider centralizing everything to be always and universally a good thing? Some people just don't seem to want to learn from history.

            Some short-term conveniences come at a very high price in the long run.
