Gitlab – Static passwords set during OmniAuth-based registration (CVE-2022-1162)

about.gitlab.com

66 points by altharaz 4 years ago · 23 comments

thcipriani 4 years ago

To save folks some digging on what exactly this means—it's exactly what it sounds like: https://gitlab.com/gitlab-org/gitlab/-/commit/e2fb87ec5d4e23...

  • altharazOP 4 years ago

    The hardcoded password seems to be:

    Gitlab::Password.test_default(21) => "123qweQWE!@#000000000"
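
    For context, here's a minimal sketch of a helper consistent with that output (a fixed prefix padded with zeros to the requested length). This is a reconstruction inferred from the value above, not necessarily GitLab's verbatim code:

      # Reconstruction for illustration only; names and padding logic are
      # inferred from the output above, not copied from GitLab's source.
      module Gitlab
        class Password
          TEST_DEFAULT_PREFIX = '123qweQWE!@#'.freeze

          # Pad the fixed prefix with zeros up to the requested length,
          # so test_default(21) => "123qweQWE!@#000000000"
          def self.test_default(length = TEST_DEFAULT_PREFIX.length)
            pad = [length - TEST_DEFAULT_PREFIX.length, 0].max
            TEST_DEFAULT_PREFIX + ('0' * pad)
          end
        end
      end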

  • eek2121 4 years ago

    Do they not have a code review process?

    • mdaniel 4 years ago

      https://gitlab.com/gitlab-org/gitlab/-/merge_requests/76318

      and I actually suspected it was a matter of the change hiding in an absolute sea of diffs, but there's a comment on the file right below the change: https://gitlab.com/gitlab-org/gitlab/-/merge_requests/76318/...

      > (10 Jan, 2022 1 commit) JH need more complex passwords

      > (30 Mar, 2022 1 commit) Revert "JH need more complex passwords"

      oops

      ---

      In case others are wondering what's up with the JiHu label and its matching "gitlab-jh" group: https://about.gitlab.com/handbook/ceo/chief-of-staff-team/ji...

      • bloomburger 4 years ago

        Why has the person who wrote this vuln, Zhu Shung, not made any commits or contributions since they pushed this out two months ago? https://gitlab.com/memorycancel

        Can somebody who understands what they were trying to do here opine on whether this might have been malicious or more likely just an honest mistake?

      • xPaw 4 years ago

        What I find mildly curious is that this is also the only place where a length was passed as an argument to `Gitlab::Password.test_default`.

      • theptip 4 years ago

        Thanks for doing the issue sleuthing. This is an excruciatingly bad look.

        You'd have thought with all the code-owner functionality that GL has, they would lock down the `/lib/gitlab/auth/` files to require a security engineer to give additional signoff on top of a normal review. It looks like anyone at Gitlab can approve changes to the auth code (except LDAP): https://gitlab.com/gitlab-org/gitlab/-/blob/master/.gitlab/C... which is terrifying if true.

      • fy20 4 years ago

        Is there a better way to catch errors like this?

        Looking through the PR, it looks like this file was accidentally changed, I assume by a project-wide search and replace.

        I could easily imagine myself missing this when reviewing the PR "oh it's just changing a whole bunch of specs, go ahead".

        • mdaniel 4 years ago

          > I could easily imagine myself missing this when reviewing the PR "oh it's just changing a whole bunch of specs, go ahead".

          That's one of the major benefits of the "tree view" in MRs, because one can collapse the "spec" folder, collapse the "ee/spec" folder, and it leaves "db" and "lib/gitlab/auth" visible which should for sure set off mental alarm bells: https://docs.gitlab.com/ee/user/project/merge_requests/chang...

        • NavinF 4 years ago

          An integration test that creates dummy accounts using every method including SSO and then attempts to bruteforce the password should find “12345678” within an hour.

          I think a test like this would also have found the Dropbox and macOS bugs that let you log in to any account with an empty password:

          https://techcrunch.com/2011/06/20/dropbox-security-bug-made-...

          https://arstechnica.com/information-technology/2017/11/macos...

          Edit: Oh, the password was "123qweQWE!@#000000000". Technically doable with an efficient password cracker that favors common patterns. zxcvbn’s entropy estimate says it will take 10^10.5 guesses. That’s 1 week at 50k/s. That’s a hell of an integration test for most software.

          https://lowe.github.io/tryzxcvbn/
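
          A cheaper variant of that idea, sketched below: rather than actually brute-forcing, have the test assert that accounts created through each registration path never authenticate with a short list of known-weak or hardcoded passwords. The factory name and password list are assumptions for illustration, not GitLab's actual suite:

            # Hypothetical RSpec sketch; the :omniauth_user factory and the
            # weak-password list are assumptions, not GitLab's real tests.
            require 'spec_helper'

            RSpec.describe 'registration password hygiene', :integration do
              it 'does not leave SSO-created users with a guessable password' do
                weak_guesses = ['12345678', 'password', '123qweQWE!@#000000000']
                user = create(:omniauth_user) # assumes a FactoryBot factory for SSO signups

                weak_guesses.each do |guess|
                  # Devise's valid_password? checks the guess against the stored hash
                  expect(user.valid_password?(guess)).to be(false),
                    "SSO-created user authenticates with #{guess.inspect}"
                end
              end
            end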

          • dewey 4 years ago

            > then attempts to bruteforce the password should find “12345678” within an hour

            But only if there's no rate limiting or increasing timeouts for wrong passwords, which in most cases exist.

            • NavinF 4 years ago

              Right, but you can bypass that in a testing environment.

              • dewey 4 years ago

                Which then opens you up to exactly the class of bugs that caused this issue. Having test-specific code and feature flags, and then testing a tweaked version, doesn't really cover all the cases.

                Just like in this case, where a hardcoded password was set, presumably so a test could log in, going by the name "test_default".

        • theptip 4 years ago

          Posted some of this in a sibling comment, but there are a few ways you can address this (not all viable for Gitlab).

          First of all, in code review: requiring merge requests that touch security-sensitive code (e.g. /lib/gitlab/auth) to get sign-off from a security engineer via CODEOWNERS would close the obvious process gap.
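
          For illustration, a hypothetical section of `.gitlab/CODEOWNERS` along those lines could look like the snippet below; the group handle and file paths are placeholders, not GitLab's real entries:

            # Hypothetical CODEOWNERS section; the group handle is a placeholder.
            [Authentication]
            /lib/gitlab/auth/       @gitlab-org/security-team
            /lib/gitlab/password.rb @gitlab-org/security-team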

          If you're paranoid, have security engineers review the diffs on all releases before they ship (in practice the security engineer would filter the `git diff` down to just the files you care about, like `/lib/gitlab/auth/`). This is expensive, but probably not as expensive as "whoops, our Chinese subsidiary completely broke our auth for 3 months". If you have a good enough bug bounty program, I'd expect your pentesters to spot this in less than 3 months, since watching commits to auth code is a good way to catch potential bugs as they land. This is hard if you're doing continuous deployment (I don't know if GL does this), but since they have numbered releases, I'm guessing (hoping?) they don't run gitlab.com off the tip of master. If you are doing continuous deployment, you need to be damn sure that your CI/CD pipeline is rock solid and that you have the right reviewers and controls on the sensitive bits of the codebase.
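
          The release-diff review doesn't have to be heavyweight, either; restricting the diff to the sensitive paths is a one-liner (the tag names below are just examples):

            # Review only auth-related changes between two release tags
            git diff v14.8.0 v14.9.0 -- lib/gitlab/auth/ lib/gitlab/password.rb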

          These are processes; you can also make this kind of error less likely at the architectural level. A general approach for larger applications is to split Auth out into its own service with its own repository, release cycle, restricted set of contributors, etc; while microservices are probably somewhere between Peak of Inflated Expectations and Trough of Disillusionment, splitting out services that require a strong security boundary is a good use-case. You can even use an OSS off-the-shelf system like https://www.keycloak.org/ to avoid having to build your own LDAP/SAML/JWT authority. (Or outsource this to Okta, though that seems a bit more dubious these days).

          This is harder for GL because they have the FOSS monorepo and splitting out auth code would be awkward. But Gitlab currently deploys a bunch of components from the monorepo (https://docs.gitlab.com/ee/development/architecture.html) and if there was a separate auth component, a paranoid security team could reasonably deploy their auth service from a fork that they update in a more controlled fashion, instead of just auto-deploying all changes to security-sensitive code as appears to have been the strategy here (if security engineers actually looked at this change and OK'd it, there are bigger problems).

          • bloomburger 4 years ago

            Is the vulnerability in the code change glaringly obvious to you?

            Looks like two security engineers actually took a look at it: https://gitlab.com/gitlab-org/gitlab/-/merge_requests/76318#...

            • theptip 4 years ago

              It’s easy to spot in hindsight. I can see how it slipped through if they didn’t actually closely read the changes in each file; it’s easy to let your guard down with many-file changesets like this. (That’s a cognitive bias to keep in mind when doing security review, particularly if that’s your only control in place to prevent issues like this.)

              Fundamentally, “fix tests” changesets like this should not touch non-test code, and anything touching /lib/gitlab/ would be a big code smell to me if I were reviewing.

              The advantage of the code owners approach is that it highlights changes to the prod auth code (it shows the triggered rule in the MR, I believe), which gives you an additional chance to spot issues. Seems they also have bots running checks, so they could have a bot ask for an extra review whenever …/auth/ is touched. Test helpers shouldn’t live next to prod code, for obvious reasons, so the “prod auth code owner” rule being hit would have been an alarm bell on this MR.

              I do think it’s concerning that security review was given here, so I’m keen to see the post-mortem and what other gaps they identify.

      • altcognito 4 years ago

        I thought I had an account there for a while, I don't know how long (2-3 years?), and all of a sudden I got a password reset out of the blue regarding this particular change. There are no other emails in my history regarding Gitlab whatsoever, so it's even possible I didn't have an account at all.

        I go over to the website and try to log in with my Gmail account via SSO, which fails because I "already have an account". So I proceed to reset the password via email alone.

        After I'm in, it tells me I've had an account since January 22nd, 2022. It's super unlikely I created the account this year, so I don't know what's going on over there, but it's not accurate bookkeeping.

krebsonsecurity 4 years ago

This appears to be related. One GitHub user shared an alert they got today, two days after connecting their GitHub account to Gitlab, about an app being added to their account. Their GitHub has 2FA turned on and a very strong password:

https://twitter.com/briankrebs/status/1509910113716514822
