Yubico: Secure Hardware vs. Open Source

yubico.com

190 points by francois2 10 years ago · 113 comments

davideous 10 years ago

In discussions like this the phrase "security by obscurity" gets used as an accusation. We all agree "security by obscurity" does not work. But that's not what is happening here.

Wikipedia's definition: "the reliance on the secrecy of the design or implementation as the main method of providing security for a system or component of a system."

Yubico isn't saying that the security of the device is increased by keeping the source code secret.

They say they are increasing the security by things like this: disabling user-loading of new firmware (which could be a bad actor loading bad firmware), using hardware with built-in side-channel countermeasures, and disabling JTAG ports (which could be used for key extraction).

This isn't obscurity. These are some good engineering arguments. Engineering is always full of trade-offs.

  • colemickens 10 years ago

    None of which precludes the implementation from being open source. In fact, it just means that even if the software were open source, it would be near-meaningless since I can't verify the code running on the device and can't reflash it myself.

    "Youbico isn't saying that the security of the device is increased by keeping the source code secret."

    Yeah, they're not really saying anything other than trying to provide an excuse for why they won't release it. "You can't use it anyway" isn't much of a response (I actually find it rather patronizing and dismissive).

    Not to pile on, but regarding: "Engineering is always full of trade-offs."... what exactly is the supposed trade-off here? (Maybe they're using licensed code that they can't redistribute?)

    • tadfisher 10 years ago

      If I'm reading the statement correctly, they are unable to release the source due to an NDA with their hardware provider, which is at least a reason other than "it's not software under the Free Software definition".

      • pstrateman 10 years ago

        You are indeed reading their statement correctly.

      • mtgx 10 years ago

        What would be the purpose of an NDA with the hardware provider? Surely not to hide it from GCHQ/NSA?! I imagine a company like Yubico has all of its employees on GCHQ/NSA lists and may even have cell tower simulators outside of its offices.

        The NDA makes this even more suspicious. Who's the hardware provider? Huawei?

        • yc-kraln 10 years ago

          NXP makes you sign an NDA to use their secure stuff.

          The purpose is anti-competitive, preventing NXP's competitors from learning how the devices work. These devices often have advanced hardware and firmware countermeasures.

          The secure modules are considered weapons technology if they're allowed to be updated after sale; the company is responsible for tracking each one, they're impossible to ship overseas, etc.

          It's not suspicious, it's SOP. Choose between open and secure, or make your own silicon.

        • makomk 10 years ago

          Pretty much all of the providers of secure hardware are like this because they're all reliant on security by obscurity. They rely on keeping secret things like their instruction set, register locations, what countermeasures against intrusion they have, etc in order to make it harder for a hacker to compromise them.

          • rkangel 10 years ago

            > in order to make it harder for a hacker to compromise them

            Keeping implementation details secret DOES make it harder for a hacker to compromise them, when used as a defence on top of a decent security infrastructure. "Security through obscurity" is when a company only uses the secrecy as a defence. This is not true:

            > they're all reliant on security by obscurity

            They're generally reliant on some secure and proven methods of security, with a layer of design obscurity over the top (and in practice as others have pointed out, they don't keep the design secret for security reasons, they do it for commercial ones).

    • davideous 10 years ago

      I think if they released the source, but you weren't able to reflash the device (which is a design trade-off they chose to close some attack vectors), people would be up in arms saying "it's not true open source because I can't re-flash or verify the device."

      • colemickens 10 years ago

        Except that I was just able to make the distinction... If their response wasn't patronizing enough, now you're adding on by saying we're too stupid to acknowledge the difference?

        Nah.

    • na85 10 years ago

      There's a market disruption opportunity here. Carpe consumer base.

  • sigmar 10 years ago

    >They say they are increasing the security by things like this: disabling user-loading of new firmware (which could be a bad actor loading bad firmware), using hardware with built-in side-channel countermeasures, and disabling JTAG ports (which could be used for key extraction).

    Are all of those listed features only possible with secret code? And if yes, once someone unobscures the code or methods, they'll be able to defeat the security. Isn't that the exact definition of 'security through obscurity'?

    • pritambaral 10 years ago

      I think what Yubico meant is that they aren't closing code for the sake of closing code, but since it can't be loaded onto the devices anyway, there's no need for the code to remain open.

      • sigmar 10 years ago

        Oh. Now I'm reading davideous' comment much differently. But the title of the blog post (i.e. "vs") makes it seem like they aren't making it open source so that the hardware can be secure.

        • davideous 10 years ago

          Yes, pritambaral described what I'm trying to point out.

          I think the "vs" in the title is saying this: they had to choose between open source that is functional (meaning you can really use the code and re-flash the device) and the secure hardware. It was a trade-off of one "vs" the other, and this is their reasoning behind that trade-off.

          • sigmar 10 years ago

            Open source doesn't necessarily mean that you can put it on the device. I'm sure a lot of Yubico's critics would be happy with seeing the code even if it can't be flashed.

            • davideous 10 years ago

              Yes, it would technically meet the Open Source Initiative's definition (https://opensource.org/osd), but if there was no way to re-flash the device, no way to verify the binary on the device, or possibly even no way to build a binary (which may require proprietary tools under NDA from the chip manufacturer) -- I think a lot of critics would still be critics, but I could be wrong.

              If Yubico did this it would be very interesting to see the reaction.

              • jevinskie 10 years ago

                It would allow a third party to discover a vulnerability similar to the one in the NEO just by reading the code.

              • mavhc 10 years ago

                The general issue is that when all hardware has software in it, in the end that software has to be open source. Going even further: the distinction between hardware, firmware, and software is logically irrelevant in terms of trust.

  • zobzu 10 years ago

    This isn't about security. It's that it was open source and user-modifiable before, and it no longer is. You can force a wipe on flash, for example.

    They clearly changed stance to ensure users cannot play with the hardware and competitors cannot copy the code. Which is fine. But it's always weird when the security argument is used instead of being genuine.

    You can copy the freaking key by removing the plastic of the YubiKey 4. You don't need a JTAG port; you just connect to the pins. And guess what, it's no big deal. You can't do that remotely, and it's not a device for 007 spies.

    • uola 10 years ago

      "They clearly changed stance to ensure users cannot play with the hardware"

      As per the statement (and earlier statements), you can't change the firmware unless you have a YubiKey NEO developer edition, which was only sold during 2012 and 2013. The change here is that the YubiKey 4 doesn't run open source code (for the PGP part) as a result of changing platforms. The best way to show that you support open source is to buy the YubiKey NEO instead of the YubiKey 4.

      • xaduha 10 years ago

        > The best way to show that you support open source is to buy the YubiKey NEO instead of the YubiKey 4.

        The YubiKey NEO isn't a unique product; it's basically a card reader and a Java smartcard all in one. There are plenty of vendors for both, and it can probably even be cheaper in some circumstances/regions.

        If you support open source, then give https://github.com/philipWendland/IsoApplet a look instead.

        A separate card reader also means that you can use several smartcards for various things.

        • mindslight 10 years ago

          A feature the Yubi has over a smartcard is the button. You can get smartcard readers with pinpads etc., but not ones that fit into an ExpressCard slot.

          I was pretty close to getting a Yubi, until I realized that you couldn't modify the PGP applet on the default version, and I couldn't find exactly where to order the special "developer edition" either.

          At this point it probably makes more sense to find/make a dongle based on an STM32 or the like. The problems with non-hardened hardware discussed in the article are real, but I'd bet the features/innovation enabled by a Free design will outweigh those tradeoffs (eg an audit log, indication of what you're signing/unlocking, actual encrypted key material when the device is "cold").

          • xaduha 10 years ago

            You can still have PIN-protected stuff; both the Security Officer and the ordinary user can have PINs. It's part of the PKCS #11 standard, probably. Also, we were talking about the NEO.

            To me it makes more sense not to do crypto yourself, but to trust an established technology, which is the smartcard. They are used everywhere from SIM cards to chip-and-PIN credit cards.

            • mindslight 10 years ago

              Sure, but smartcards have traditionally fulfilled a narrow purpose - creating a notion of non-cloneable identity for some centralized top-down entity. The technology of a hardened mini computer could be applied to many other things, but the closed philosophy of the industry really hinders that. I'd love to get some samples of ST23 and create a board with an appropriate hardware UI for end-user signing, but alas this industry has not seen the light of Kerckhoffs's principle.

              My problem with PINs is twofold. First, the reader required to use them in a transparent manner does not fit with the form factor of a laptop. Second, they're obviously less secure than a passphrase - relying completely on hardened hardware. If I'm willing to enter a passphrase for every session, why should I be carrying around the key in the clear?

        • uola 10 years ago

          It's a unique product in the sense that it has a nice form factor and holds additional functionality for more mainstream uses. I have a number of these devices, including the external card reader, the USB key card reader and the integrated rubber USB key. Everyone can decide what they want of course, just don't be surprised when they discontinue the NEO.

          If you read between the lines of how it went from closed, to very open, to less open, to now not open at all, it seems like they tried open source but failed. They were probably looking for people to integrate it into some e-mail client, chat application or even bitcoin wallet. Now they've gone back to focusing on their core customers and using a cheaper, more integrated chip.

          • xaduha 10 years ago

            Point being that if you support open source, then Yubico isn't your champion. They failed in the sense that it harms their business, not much more than that.

            > just don't be surprised when they discontinue the NEO.

            I'd say that Yubico isn't a big deal, therefore them discontinuing NEO isn't a big deal either.

            • uola 10 years ago

              If you care about crypto you should probably care what happens to the most appealing device out there, and if you care about open source you should probably care what happens to companies that make open sourcing part of their business.

              • xaduha 10 years ago

                This is getting nowhere, so let's stop. It's one of those "let's agree to disagree" moments. I certainly don't find yubikey particularly appealing and you already said that they "failed" when it comes to open-source, which I agreed with.

        • drazvan 10 years ago

          Feitian has a similar product with known keys and running JavaCard in the form of a USB token - http://www.ftsafe.com/product/epass/eJavaToken . It's also much cheaper than the Yubikey, see http://javacardos.com/store/smartcard_eJavaToken.php . No NXP proprietary stuff on it and no NDA required either.

          • xaduha 10 years ago

            I did visit their site some weeks ago, but I'm not a fan of bundling a card reader with a card like that. It's better to buy a separate card reader; Java cards themselves can go for as low as $2 each, maybe even cheaper.

            NXP is just one of many vendors; they sell blank Java cards too.

            EDIT: $25 shipping fee is very inflexible.

        • mpnordland 10 years ago

          I use PGP on my mobile devices too, so I would prefer something I could use for both my phone and my computer. The NEO would have filled that role. In my research I haven't found anything like that so far. I would love to be enlightened though if anyone knows about something that can do the same!

    • BuildTheRobots 10 years ago

      > You can copy the freaking key by removing the plastic of the YubiKey 4

      Any more information available? Googling for "yubikey 4 takeapart" got me nowhere.

  • datenwolf 10 years ago

    > In discussions like this the phrase "security by obscurity" gets used as an accusation. We all agree "security by obscurity" does not work. But that's not what is happening here.

    Well, sort of.

    In the linked article Jakob Ehrensvard (Yubico CTO) wrote:

    >> (…) One could say it actually works the other way. In fact, the attacker’s job becomes much easier as the code to attack is fully known and the attacker owns the hardware freely. (…)

    While the rest of the article makes good points, this particular sentence hints at "security through obscurity".

    • rkangel 10 years ago

      Security through obscurity is when obscurity is your only security measure. When used on top of an otherwise secure system, obscurity actually makes finding vulnerabilities harder.

      The principle with open source is that you can trade that obscurity away in favour of the "many eyes" on your code and the fact that it is then proven secure. That tradeoff is definitely worth it, but that doesn't mean that the obscurity doesn't help security.

  • Alupis 10 years ago

    To take an alternate approach...

    Could this be a sly attempt to close up the source (and hardware) before they have a Tangibot[1] situation?

    That scenario played out poorly for MakerBot, and perhaps YubiCo learned the wrong lessons from the entire ordeal.

    [1] http://www.cnet.com/news/pulling-back-from-open-source-hardw...

    • bonzini 10 years ago

      Unlikely. The MCU in a Yubikey is not something you can order from aliexpress.

  • discreditable 10 years ago

    > disabling user-loading of new firmware

    Am I understanding correctly that these devices can never have their firmware updated? That there is no update mechanism seems insane. They could prevent bad firmware updates by wiping keys on upgrade. The risk now is that some firmware version is discovered to have flaws, and that device is vulnerable forever.

fpgaminer 10 years ago

> we, as a product company

The most important thing any security company needs to realize is that their primary product is their reputation, not the physical or digital goods that they produce. "We, as a product company" is totally the wrong attitude. There's really no question about it: every ounce of closed source software/hardware in a security offering is something the customer should be concerned about.

From a product perspective it totally makes sense to be worried about open sourcing the entire design. "Our competition will make clones!" And that may be true of every other kind of product. But would you buy a cheap knockoff Yubikey? I certainly wouldn't. Again, reputation is the key here. That's what a security company sells to their customers. Confidence that when they buy from company X they know that company X has put the best engineers to the task and crafted a device that will protect their valuable digital information.

A company can build up a reputation in the security industry, produce world class hardware and software, and charge a sharp premium on it, because security is _so_ important and protects some of our most valuable assets. That premium is completely derived from the trust that they've garnered. It's insane for Yubico to squander theirs under some false sense of IP security.

EDIT: And all that said, I totally understand where they're coming from on some of their points. They have to depend on chip manufacturers, and chip manufacturers are just the absolute worst when it comes to open source and security. Sometimes there are hard constraints and compromises have to be made. Most of cryptography is a trade-off. So don't take my comment to mean that designs absolutely have to be 100% open source. That's infeasible most of the time for hardware. But Yubico should be striving for it and pressuring the market.

  • jfindley 10 years ago

    > A company can build up a reputation in the security industry, produce world class hardware and software, and charge a sharp premium on it, because security is _so_ important and protects some of our most valuable assets.

    Hmm. I think there are considerable limits on how true this is. I would argue Yubikey's current security is more than good enough for almost everyone.

    As mentioned in your edit, there's not a lot Yubico can do about the hardware restrictions. Given these restrictions, a common way companies in this industry assure users of the security of their device is FIPS 140-2 certifications, which range from levels 1 to 4.

    Level 4-certified devices are extremely expensive, and the market for them is tiny, which seems to indicate that there's a definite limit on the amount people and organisations are prepared to pay to ensure security.

  • nickpsecurity 10 years ago

    "The most important thing any security company needs to realize is that their primary product is their reputation, not the physical or digital goods that they produce."

    That's semi-true. They're both important. The belief that the product is worth buying, and the effort put into selling it, are of primary importance. Getting hacked or sued in public diminishes sales. So, the most important aspect of security for these kinds of companies is perversely minimizing the potential for their image to be hit by hackers, even if the products have no security. Not an accusation at Yubico but a common strategy in this market. So, they just have to present a good impression to the target market.

    " every ounce of closed source software/hardware in a security offering is something the customer should be concerned about it."

    Not really. It might surprise you but many companies have run for decades on proprietary platforms. They generated ridiculous sums of money in the process. All kinds of people got jobs, made money, and retired in this time. Nothing to worry about apparently most of the time. The reasons to worry are there but smaller than you think. One must balance many needs in a business. For most, this kind of thing is a checklist item about reducing liability. They're fine if it looks good on paper.

    " But would you buy a cheap knockoff Yubikey? I certainly wouldn't. "

    Most would. They want something as an obstacle to hackers while minimizing cost. They don't know if Yubikey has any real quality underneath given how businesses often do things. So, it's a real Yubikey vs a cheaper one. Many, not all, will choose the cheaper one. See Cisco and mobile manufacturers vs Huawei to see how big of a market share that can lead to.

    "Confidence that when they buy from company X they know that company X has put the best engineers to the task and crafted a device that will protect their valuable digital information."

    There's a market for that. I used to try to serve it. It's tiny and fickle. Yet, I question what confidence people have in those engineers to begin with as they've never assessed their capabilities in INFOSEC and strong attacks rarely are publicized. It's not like Googling rate of car crashes.

    " produce world class hardware and software, and charge a sharp premium on it, because security is _so_ important and protects some of our most valuable assets. "

    Many tried. The market rejected almost all of it. Still does. They want security-defeating feature X, protocol Y, and fall-back Z. They want it to run as fast as the competition despite security or safety checks, on insecure, potentially-backdoored hardware to get COTS HW benefits. They also don't want to pay hardly anything extra for it, despite whole teams of extra people being put into every other component for rigor and the price of external evaluations. The market for high-assurance guards is so small that they have to charge over $100,000 per unit to make the money back. Hell, Signal is free and Threema charges $1-2, but they're barely a fraction of 1% of WhatsApp or Facebook in market share. The demand side is the problem.

    So, Yubico is doing what's good for business. All of them are and should until market shows it's willing to make the compromises necessary for strong security. They won't. So, wasting money on it is foolish outside defense sector, academia, and a few niches (eg smartcards) where one can keep a job doing it.

    • the_ancient 10 years ago

      >>Cisco and mobile manufacturers vs Huawei to see how big of a market share that can lead to.

      Implying that Huawei is the "cheap knock-off" and Cisco/Apple/Samsung/etc. are the noble high-quality products fighting the good fight....

      My Huawei Nexus 6P has been the best phone I have ever owned, far exceeding the quality and usability of every Motorola, Samsung, and other phone I have owned.

      As to Cisco, after their fiasco with the NSA I would not trust them at all for security.

      • nickpsecurity 10 years ago

        That's an accusation and implication. The Chinese strategy, which isn't entirely secret, is to use their hackers to get trade secrets out of firms in all kinds of sectors to hand to their own firms. Each time, their firms leverage those as a head start on their own products, which combine their own innovations, labor advantage, and money from the vast market in China. It's a proven model. As far as Cisco and Samsung go, it's been clear Huawei has been knocking them off the same way.

        Besides, what are you even questioning given that Huawei admitted they had and removed Cisco source code? Of course they robbed them. :P

        "As to Cisco, after their fasco with the NSA I would not trust them at all for security."

        Which is totally irrelevant to my point that cloners... especially Chinese cloners... will make knock-offs of a hardware product in any country, hurting that company's business, if the product is worth it to them. The NSA collecting secret information to determine if you're a terrorist, felon, or threat to foreign policy != Chinese intelligence giving your competition your I.P. who then operate in your market with cheaper labor. The NSA is a hypothetical threat for most companies, whereas the Chinese tech and labor market have been doing my country (the U.S.) in for decades, with many companies achieving parity or dominance in some sector through stolen I.P. It didn't help that idiots running our companies put R&D centers over there to reduce labor costs. (rolls eyes) Such stuff is an existential threat to small hardware providers worth cloning, given what Shenzhen can pull off.

        • grawlinson 10 years ago
          • nickpsecurity 10 years ago

            Name five foreign companies off the top of your head whose product lines US companies have cloned... with source and such... and which then became huge players in the market, taking huge sums from the original. I'm interested in seeing them, as I blast the NSA for what little industrial espionage I find.

        • the_ancient 10 years ago

          >Chinese intelligence giving your competition your I.P. who then operate in your market with cheaper labor.

          Well, first and foremost, I do not accept the concept of IP in the first place. Information is not property and should not be protected.

          Nor are they "my competition"; they might be Cisco's, but I do not support nationalism or protectionism.

          Let me guess, you're a Trump supporter?

          • nickpsecurity 10 years ago

            How do you get from admitting the Chinese use spies to rob American companies of R&D to thinking anyone avoiding that is a Trump supporter? You have a powerful imagination or loose standards of logic to make a leap like that.

            More like a company acting in rational self-interest should keep any IP they depend on away from the Chinese. Or expect to be cloned, but leverage them and dominate their market as much as possible before being displaced by a homegrown offering.

davb 10 years ago

It's a shame to see that they used the goodwill of security-conscious cryptonerds to gain a foothold in the market only to, effectively, say "We're now targeting enterprise and government, who can afford to pay for contracting third-party security auditors. You can't, so just take our word that it's secure."

Other companies have managed secret distribution for secure devices just fine - randomise the card manager key and bundle a tamper proof packet containing the key along with the product. Provide instructions on how to verify the integrity of the packet, and confirm a digitally signed affirmation of the key against Yubico's public key online.
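
For illustration only, a minimal sketch of the kind of verification step described above, assuming an Ed25519 vendor key, the Python cryptography package, and a made-up serial:fingerprint message format (none of this is Yubico's or any vendor's actual scheme):

  # Hypothetical check: does the vendor's published public key really vouch
  # for the card manager key found in the tamper-proof packet?
  import hashlib
  from cryptography.exceptions import InvalidSignature
  from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

  def verify_affirmation(vendor_pubkey_bytes: bytes, card_manager_key: bytes,
                         serial: str, signature: bytes) -> bool:
      """True if the vendor signed this (serial, key fingerprint) pair."""
      fingerprint = hashlib.sha256(card_manager_key).hexdigest()
      message = f"{serial}:{fingerprint}".encode()  # assumed message format
      try:
          Ed25519PublicKey.from_public_bytes(vendor_pubkey_bytes).verify(signature, message)
          return True
      except InvalidSignature:
          return False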

That's more than RSA offers for SecurID seed verification and more than my business bank offers for two factor device PIN integrity checking.

I'm not sure who they use for their Secure Element (NXP?) but it also sounds like Yubico has gone along with their request (and NDA) to keep implementation details secret. We've seen a similar situation in SE implementations in mobile phones (for contactless payment, primarily).

Again, enterprise customers don't care (mid-sized ones have insurance that will cover the loss if their Common Criteria EAL 5+ vendor's hardware is compromised; big enterprises can pay for auditing). Governments don't care (they'll pay for auditing or negotiate it in any significantly high volume contract).

End users and the tech community are the only groups who'll really lose out here.

nickpsecurity 10 years ago

I've studied high-assurance security and hardware for a long time. This looks to be motivated by a few things:

1. Hardware costs money to develop, has to make it back, and is easy to clone. They'll keep the hardware secret by default for this reason, like everyone does. It also lowers the odds of patent suits. All kinds of people demand open, secure hardware but almost nobody will buy it. Just like software. Number 1 problem in the INFOSEC industry.

2. There are three companies, IIRC, building the kinds of secure ICs they need. They NDA the stuff critical to understanding them. Plus, the implementations are secret, with tamper-resistance mechanisms. It's pointless to rely on an open-source model to understand or evaluate such a thing. There would be some marginal benefits, but the major risks would still be there. Whereas open-sourcing the stuff adds risk in terms of issues with the suppliers. So, no OSS is an acceptable choice here.

3. Restricting some of the firmware/software is a tradeoff of the protection methods they're using. Again, reduces value in open-sourcing it as you'd have to dump it off the chip to verify it anyway. The kind of people that can do that don't need Yubico's help.

4. Yubico might not know how to build secure HW/SW combos. It's a rare skill whose techniques are a mix of published and trade secrets. Plus, attackers are always coming up with new stuff. So, obfuscation... not security by obscurity... but obfuscation of aspects of the design to increase the work of attackers between product releases is both justified and a proven method. If no other measures existed, then it would be the garbage known as security by obscurity. This seems to be the better practice: proven mechanisms plus obfuscation, which can hamper even nation-state hackers. Who knows how good their mechanisms are going to be, but there's potential.

So, it seems like a combination of sustaining their business by stopping clones and lawsuits with improved branding from effects of obfuscation & hardened IC's on low-skilled attacks that dominate the press. Two, very-good reasons to make a decision in this market. It's just economics in action. :)

  • uola 10 years ago

    1. The hardware design per se isn't that valuable. It's quite easy to reverse engineer and is probably more like a reference design than anything. More likely NXP (?) doesn't want open designs and open software because it makes it easier to reverse engineer and clone the chips themselves. For YubiKey themselves it's mainly the firmware that is valuable (well, the design and access to chips too, of course), which is why part of their firmware isn't open source.

    • nickpsecurity 10 years ago

      "The hardware design per se isn't that valuable"

      People that spend considerable effort turning a good idea into hardware that sells tell me otherwise. ;)

      "because it makes it easier to reverse engineer and clone the chips themselves."

      You first said it's easy to reverse engineer and not valuable. Then, said they want closed designs to reduce reverse engineering and cloning. Which is it?

      "For YubiKey themselves it's mainly the firmware"

      That may be true. I can't speak to that.

      • uola 10 years ago

        "People that spend considerable effort turning a good idea into hardware that sells tell me otherwise. ;)"

        The execution and the overall ecosystem of course matter. But the hardware design, how the chips are connected, isn't really a secret as such and is easy to reverse engineer and recreate. It's just not very complex.

        http://www.hexview.com/~scl/neo/

        "Which is it?"

        The hardware design is easy to clone; the chips themselves aren't necessarily. Chips have a very low marginal cost, and a functionally identical clone could easily be sold for 1/100th the cost in volume, since all the cost is R&D. Companies therefore try to protect their IP as much as possible by making reverse engineering harder and by "owning the ecosystem". There have been cases where clones have been made by emulating chips on much more capable (but cheaper) hardware and sold for 1/10th the price.

        • nickpsecurity 10 years ago

          "But the hardware design, how the chips are connected, isn't really a secret as such and is easy to reverse engineer and recreate. It's just not very complex."

          Hardware design is a combo of how the chips are connected, the firmware, and getting it to users. Your link supports my assertion that they should put in whatever obstacles they can.

          "Companies therefor try to protect their IP as much as possible by making reverse engineering harder and by "owning the ecosystem"."

          Point 1 in my original comment.

viraptor 10 years ago

After thinking through the initial "this is terrible" reaction, I actually don't mind what they're doing. Though if there were an equivalent solution based on open source, I'd definitely choose it over the YK 4.

I also don't see anything that would really prevent them from just releasing the source they're using, even if we can't realistically do anything useful with it. The whole point of those systems is that they're secure via algorithms and hardware silos - releasing the sources shouldn't change anything.

But in practice it doesn't really matter that much - as long as they use standard interfaces and replace your key for free if someone finds a vulnerability, I'm (cautiously) fine with their new position. I think a big part of the issue is that they did something better before, but if they started with the current design, people wouldn't really complain about it that much.

rcthompson 10 years ago

Couldn't a hardware vendor theoretically provide read-only access to the firmware and then have an open-source reproducible build process so that anyone can build their own copy of the firmware and verify that the firmware on the device is bit-for-bit identical? Wouldn't that satisfy people who want to be sure of what code is running on their device while still preventing an attacker from loading custom firmware?
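
As a purely illustrative sketch of the verification half of that idea (the build output path and firmware dump path are hypothetical, and no real Yubico tooling is implied):

  # Compare a reproducibly built image against a read-only dump from the device.
  import hashlib

  def sha256_file(path: str) -> str:
      h = hashlib.sha256()
      with open(path, "rb") as f:
          for chunk in iter(lambda: f.read(65536), b""):
              h.update(chunk)
      return h.hexdigest()

  local = sha256_file("build/firmware.bin")        # output of the reproducible build
  onchip = sha256_file("dump/firmware-dump.bin")   # read back from the device
  print("bit-for-bit identical" if local == onchip else "MISMATCH - do not trust")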

  • qrmn 10 years ago

    Absolutely. I built a concept that essentially did that.

    Separate program and data memory, with only one executable. The USB host would get (in hardware) an outright memory dump of the program memory on connection, so it could hash it and compare it to known-good firmware. If you flashed the firmware, the data memory would get wiped, and if you flashed it with anything the driver didn't recognize as a good build, you'd be warned unless you had manually whitelisted it.

    That seems like a better approach to me. (It turns out I really suck at designing hardware, let alone secure hardware.)

    Doing the same kind of general thing with, say, a RISC-V microcontroller and trying to secure the RAM seems like a generally fruitful possible course of action? Let's see how Lowrisc turns out.

  • vbernat 10 years ago

    The read-only copy could be different from the running copy.

    • rcthompson 10 years ago

      If you trust the hardware enough to use it for 2-factor authentication, then I think you trust it enough to be honest with you about its contents.

      • dfox 10 years ago

        The problem there is that in the usual case, the read-only access to software will not be provided directly by the hardware, but by the same software you are trying to verify.

        In theory, this could be solved by verifying the whole memory of the device, but that still depends on you believing that the device does not have more memory than it should have.

        • leni536 10 years ago

          > the read-only access to software will not be provided directly by the hardware, but by the same software you are trying to verify.

          Why not?

          • dfox 10 years ago

            Because in the usual case you want to do such verification through the same interface as normal operation, both for usability reasons and to limit the number of interfaces that cross the security boundary.

    • makomk 10 years ago

      The usual trick is to ensure that the firmware + user data fills all available storage space on the hardware so there's no room for other code, then add time limits and complexity to the verification code so they can't do any tricks with stuff like compression.
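
      A toy model of that trick, purely illustrative and not any vendor's real protocol: the verifier knows the expected memory image, challenges with a fresh nonce, and rejects answers that are wrong or that arrive too slowly (hiding extra code would force decompression or recomputation, which costs time):

        import hashlib, os, time

        class ToyDevice:
            def __init__(self, memory: bytes):
                self.memory = memory  # firmware + user data filling all storage

            def respond(self, nonce: bytes) -> bytes:
                # Honest device hashes the nonce plus its entire memory.
                return hashlib.sha256(nonce + self.memory).digest()

        def attest(device: ToyDevice, expected_image: bytes, timeout_s: float = 0.05) -> bool:
            nonce = os.urandom(32)
            start = time.monotonic()
            answer = device.respond(nonce)
            elapsed = time.monotonic() - start
            expected = hashlib.sha256(nonce + expected_image).digest()
            return answer == expected and elapsed < timeout_s

        image = os.urandom(64 * 1024)       # stand-in for the full flash contents
        print(attest(ToyDevice(image), image))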

kerkeslager 10 years ago

The argument for disabling loading new firmware on your own device is valid. It prevents an outside actor loading malicious firmware. But it's a tradeoff: it means that if a vulnerability is found, the device has to be replaced, and users can't customize their firmware. That's a good tradeoff; I'd rather risk paying for a new Yubikey than risk a security compromise, and most users are unqualified to verify the security of firmware being loaded onto the device.

The problem is, it's not a tradeoff Yubico have to make. They can allow users to achieve the same goals by distributing the device un-flashed, with the source code to the firmware. Upon flashing, the firmware would disable further flashing. If the user doesn't like this tradeoff, the user can choose to change the code. As a courtesy to more trusting users they could provide the service of optionally flashing devices for you. And qualified users can verify the security of the firmware before loading it.

But by flashing the devices themselves, Yubico has chosen the worst of both worlds. Now an outside actor can once again add malicious firmware: Yubico is an outside actor. AND nobody can verify the security of the firmware. This isn't even a tradeoff, it's just a loss.
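
A toy model of the ship-unflashed scheme proposed above (illustrative only; a real device would use something like a one-time-programmable fuse rather than a Python flag):

  class ToyToken:
      def __init__(self):
          self.firmware = None
          self.locked = False   # stand-in for a one-time lock bit set by the firmware
          self.keys = {}

      def flash(self, image: bytes) -> bool:
          if self.locked:
              return False      # further flashing permanently disabled
          self.keys.clear()     # never carry key material across a flash
          self.firmware = image
          self.locked = True
          return True

  token = ToyToken()
  print(token.flash(b"user-audited firmware"))  # True: first flash accepted
  print(token.flash(b"malicious replacement"))  # False: refused from now on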

  • H3g3m0n 10 years ago

    > They can allow users to achieve the same goals by distributing the device un-flashed

    There is the possibility of the device being intercepted before it reaches you. Or before you have gotten around to locking it down. Or when you plug it into your (compromised) system to lock it down.

    Since all communication is done over the USB port, the problem is that the device can be flashed with a backdoored firmware that appears to be normal/unflashed: one that still appears flashable (by basically having a virtual machine/emulator run the flashed image), appears to get locked down when you go through any lockdown process (since you just end up locking down the VM), but still has the backdoor in place.

    Firmware aside, people can modify the hardware too. Unless you crack open the device and inspect the internals (which many devices are designed to prevent). And even then a really sophisticated attack could replace the chips with identical looking ones. If you are using off the shelf ones then it wouldn't be that hard. They can also add an extra chip before the real one that intercepts the communication. Or maybe compromise the 'insecure' USB chip (if it's programmable).

    With locked down hardware the manufacturer can bake private keys onto the chips and ensure that the official stuff checks the hardware by asking it to digitally sign something with a private key. But if the attacker has added their own chip between the USB and the legit chip, they can pass through the requests to the official chip.

    A TPM will do something like keep a running hash of all the instructions that are sent to the hardware and use the resulting hash as part of the digital signature verification, but if you mirror the requests that doesn't help.

    The next stage is to use the keys on the chip to encrypt all communication with the 'secure' chip, so any 'pirate' chip won't get anything useful.

    Users could be allowed to 'bake' their own keys in, but that leaves us with the intercepted-hardware problem. The attacker gets the hardware, installs fake firmware that appears to accept your custom key and performs the encryption.

    Personally I think worrying about security to that level is overkill, even if you're dealing with quite a bit of money. It would have to be quite an organised attack. They would have to gain physical access to the device, compromise it, return it without you noticing, and then gain physical access again later. That requires both physical and digital security skills.

    That's much more work than just stealing it or applying rubber-hose cryptanalysis. Attackers can also just compromise the system being used to access whatever you're protecting.

eggy 10 years ago

I am pleased they took the time to respond at length. It makes a bit more sense now (NDAs, hardware manufacturers, etc.) vs. the "security by obscurity" mantra prevalent in the replies.

I have had my own business, and the one thing I would say to the critics of Yubico: if you have a way, given existing hardware and software tools and suppliers, to do a better job, step up and do it. AFAIK, Apple didn't open-source their hardware related to crypto, or their software.

I think you will find it takes more than wishful thinking; more like, put your money (or your time) where your mouth is. Engineers, and I don't just mean CI engineers here, know it is a long way from a math equation or set of equations to a real-world working object. I would love to see, and would contribute money to, an open-source solution. I just don't think it is as cookie-cutter simple as the majority of comments on this forum seem to intimate.

drazvan 10 years ago

BTW, the two major manufacturers they're talking about are NXP (http://www.nxp.com/) and Infineon (http://www.infineon.com/). STMicroelectronics (http://www.st.com/) is also a player here, and Feitian has also started doing it (http://www.ftsafe.com/product/epass/eJavaToken). NXP and Infineon are notoriously hard to get started with for small companies and independent developers, but they have some very clever proprietary stuff in their chips.

tptacek 10 years ago

Very long post. Apparently, a very simple explanation: they want to use NXP hardware, and NXP requires NDAs, preventing them from meaningfully open-sourcing code for the platform.

  • infinite8s 10 years ago

    This is the only comment that has figured out the real reason for not releasing the code - they can't due to NDA.

dmitrygr 10 years ago

This reads more like an excuse than a reason. Nothing of what he says is a reason that prevents them from being more open.

All that he says is summarized in "it was too hard to think of a solution, so we didn't do it."

  • rrego 10 years ago

    That's a very disingenuous summary. It seems impossible to make the device open due to the NDAs. Can you explain how they would get around these?

    With regard to the applet manager, that seems to be an issue of customer friction more than of it being too hard. While "crypto nerds" would be fine, business applications could be affected.

    • dublinben 10 years ago

      Not using hardware components that would require NDAs would be the obvious alternative.

      • sgift 10 years ago

        Obvious? As in "I didn't read the post, didn't read that there are only two suppliers for this kind of hardware, and didn't read that both of them require these NDAs"-obvious?

        Quote for your convenience:

          So — why not combine the best of two worlds then, i.e.
          using secure hardware in an open-source design? There 
          are a few problems with that:
        
          - There is an inverse relationship between making a 
          chip open and achieving security certifications, such as 
          Common Criteria. In order to achieve these higher levels
           of certifications, certain requirements are put on the 
          final products and their use and available modes.
        
          - There are, in practice, only two major players 
          providing secure silicon and none of their 
          products/platforms are available on the open market for 
          developers except in very large volumes.
        
          - Even for large volume orders, there is a highly 
          bureaucratic process to even get started with these 
          suppliers: procedures, non-disclosure agreements, secure 
          access to datasheets, export control, licensing terms, 
          IP, etc.
        
          - Since there is no debug port, embedded development
          becomes a matter of having an expensive emulator and 
          special developer licenses, again available only under 
          NDA.
        
          - Although this does not prevent the source code from 
          being published, without the datasheets, security 
          guidelines, and a platform for performing tests, the 
          outcome is questionable, with little practical value.
        
        You can disagree with these arguments, but just ignoring them to provide an "obvious" answer is a cheap tactic.

  • skybrian 10 years ago

    I'm not sure what distinction you're making? "Too hard" is often a valid reason for not doing something.

    • dmitrygr 10 years ago

      Yes, unless you refuse to admit it and instead claim that security is improved by the obscurity you chose to engage in.

      • jolux 10 years ago

        But this isn't completely security by obscurity, it's security granted by hardware that is built for secure purposes for which it is difficult to provide a software platform.

sigmar 10 years ago

My opinion on this is that physical security is paramount. Your threat model can't possibly eliminate all threats from an adversary that has physical access.

No hardware is 100% secure and for Yubico to say this issue is about "Secure Hardware vs. Open Source" seems like a red herring. Perhaps they are just trying to protect their business model? After all, there isn't anything particularly unique about the hardware.

  • nickpsecurity 10 years ago

    Physical security is a moving target and a spectrum. Basic mechanisms can protect my computer if I leave it unattended in front of common hackers for a few minutes to take a leak at a restaurant. Another level of security is necessary for people with more access or tooling. At some point, basically nothing I do will help given enough resources by pro's.

    So, it's not so simple. Otherwise, all buildings containing valuables protected by locks and such would be compromised because enemies had the potential of physical access. They aren't. That's telling you something.

foxhill 10 years ago

well, it's a shame that poor arguments get recycled like this, but it does make for easy dismissal - cryptography is based on the idea that the methods used are totally transparent; the power to decrypt comes from possession of the appropriate keys. closing a design, hiding it from scrutiny by the majority of hackers like ourselves, helps no one other than the individuals who wish to gain unauthorised/unwanted access.

this is a fundamental concept in FOSS and for anyone to try and rationalise their way out of it - be it out of some corrupted sense of trying to do the right thing - is absurd.

fortunately i feel that the very people that would be interested in this device will be aware of this; i hope the folks at yubico reverse this decision.

captainmuon 10 years ago

This story made me think a bit about devices like the Yubikey. I'd really like one to store my keys to sign mail, or for two-factor authentication. But the main selling point, the tamper-resistant secure-enclave-like chip, is something I don't need. I'd rather have a tiny microcontroller in USB format that I can program myself and understand nearly 100%, with no secret code going on.

My reasoning: I don't need physical tamper-resistance for my threat scenario - if it is stolen by a random thief, a coworker, a "friend", etc..

But if I was attacked by a nation-state-like actor, I cannot trust any security measure of the device. How do I know the NSA does not have a copy of every "random" card-manager key? How do I know that generated keys are not subtly biased so that they can be guessed easily? Or that there is not a secret function to extract them? Even if Yubico is 100% honest and their device is clean, I must assume that if e.g. the NSA were after me, they have the technology to extract the keys from the device, no matter what protection it has.

microcolonel 10 years ago

I understand where they're coming from. Though it would be even braver for them to get into the IC design game, and make a chip with the properties they desire. They can then publish whatever they would like about that chip.

kriro 10 years ago

tl;dr: code is closed and I can't change it anyway so it shouldn't matter to me.

I hope the response from consumers will be: we understand your position; unfortunately, that is unacceptable, and we'll look for another vendor. It's certainly mine. I own a NEO and am not getting any of their future products.

Also, as a strategic guideline... maybe if you're in the business of security... don't use hardware that requires NDAs. Yes, it'll make some things impossible and other things more expensive, but I'd say there's really no room for compromise there.

ansible 10 years ago

While it is good that they are implementing all these hardware security features, I think that we are, in general, overthinking the whole thing.

Their current industrial design very clearly says "hey, I am an important security key", which is exactly the wrong thing to do.

It should instead look like a cheap flash drive. And when the thief plugs it in, he sees exactly that, a low capacity USB flash drive, unencrypted, with some random documents on it.

Is the thief at this point going to perform some sophisticated hardware hacking? No, it will just get thrown away.

hlandau 10 years ago

The industry of smartcards and similar devices has annoyed me for a long time, mainly due to its failure to provide a secure general purpose computing environment and get out of the way. I wrote about it some time ago: https://www.devever.net/~hl/smartcards

  • nickpsecurity 10 years ago

    You don't know why they use interpreters? It's for the combo of security and app development. Just like long ago, the development of a high-assurance MULTOS or JavaCard system means you certify the interface one time. Apps get to build on that into an ecosystem. Then, only new implementations of that have to be certified [in theory]. MULTOS requires it while I think it's optional with JavaCard. I have less clear answers than most since I don't sign the NDA's either. At least let me reverse engineer and post some answers. ;)

    Regarding DES: the smartcards and HSMs were originally developed for use by both government and the financial industry. They originally standardized on DES, then used 3DES to reuse their HW and SW. It was one of the few tradeoffs that made long-term sense, given that a three-key version of a 1975 algorithm is still secure in 2016. That's 41 years of security through variants of that algorithm. Unheard of in our industry. That you call 3DES, itself going strong almost 20 years, something that should be repellent shows the difference between the security-critical sector and the mainstream. The former prefers what's proven longest, with the latter preferring what's popular and good in theory. Both AES and 3DES are valid choices given peer review. That their money-makers came from 3DES customers made the best choice obvious.

    Regarding NDAs: a HW guru who taught me what I initially knew on the subject mentioned patent suits. He said his company refuses to do business in the U.S. since those companies get sued into the ground. There are so many patents on HW, esp. microarchitecture, that it's impossible to avoid them all. So, he said keeping things as trade secrets was a common strategy of smaller firms to reduce legal risks and ensure profits. It also reduces copying and attacks by hobbyists. And of course they didn't say "hides infringement" in the datasheet. :P

    I stopped there since I think these should address your concerns. At the least, it should start to make sense what those companies are doing whether we like it on our end or not. Personally, I'm more a fan of Caernarvon OS for smartcards as one of the inventors of INFOSEC (Paul Karger, grandmaster of high-security) made it. Look it up for interesting lessons on what smartcard OS's deal with in terms of development and certification difficulties.

jwildeboer 10 years ago

I have been a long-term user/promoter of YubiKeys. But today I ordered a Nitrokey Pro. They seem to be the better choice now. Definitely more open, and with pen tests of hardware and firmware on their website. All schematics and firmware are on GitHub.

xaduha 10 years ago

You can get blank 'java' smart cards and load open source applets on them; you don't need Yubico.

Personally I've only tried IsoApplet, but the OpenPGP applet should work too.

exabrial 10 years ago

Each key's signature is randomized.... brilliant. So buying 1000 of the keys doesn't give an attacker an advantage. Certainly flies in the face of FOSS, but this is about security... I'll be watching this closely to see Yubico's actions. I think so far this is a great response and I look forward to a non-sensationalist rebuttal.

amluto 10 years ago

I thought about this for a while, and here are my thoughts about having the source code:

With the older YubiKey NEO devices, the applet source was available and I could freely upload an applet. This was great for a few reasons. I could modify or upgrade the app (of course, doing so would cause me to lose existing keys, which makes sense from a security PoV). (I actually did this on my old YubiKey.) I could also, in principle, audit the app. And, if I trusted Yubico to get their security right, I would trust that my freshly-arrived-in-the-mail device was secure. Moreover, if I trusted Yubico not to act maliciously, then the applet on the device I got in the mail would match the firmware on github, and I could trust that it did what I thought it did.

There were, of course, problems. The GlobalPlatform platform is awkward to use, the toolchain is terrible, and the key management is awkward at best.

I could not trust that a key I installed in the OpenPGP applet while my computer was compromised was secure.

With the new locked-down NEO devices, I can't change out the applets, and the bad guys would also have trouble doing so. As before, if I trusted Yubico not to act maliciously, then the applet on the device I got in the mail would match the firmware on github, and I could trust that it did what I thought it did. Also, as before, I could not trust that a key I installed in the OpenPGP applet while my computer was compromised was secure (because an attacker would simply export it before uploading rather than swapping out the whole applet).

Enter the YubiKey 4. If I use one, I am completely at the mercy of Yubico and their third-party audits. I cannot audit the code myself. Even if I trust Yubico not to act maliciously, I have to take them entirely at their word that they didn't accidentally mess up. And, of course, I cannot trust that a key I installed in the OpenPGP applet while my computer was compromised is secure.

In other words, there's a big difference between source-available and source-not-available, even if I can't personally verify that the source I think I'm running is the source I'm running.

As an aside:

> There is an inverse relationship between making a chip open and achieving security certifications, such as Common Criteria. In order to achieve these higher levels of certifications, certain requirements are put on the final products and their use and available modes.

This may well be true, but, if so, it's a sad statement about Common Criteria and their misguided rules. Publicly disclosing the source code of an EAL5+ device should not reduce its supposed security level.

With SGX, Intel had the chance to offer a widely available security token (built in to every new CPU!) that anyone could freely program and use for their own security purposes. They blew it when they created their "launch control" policy, which essentially says that developers who don't sign lots of contracts (which you can't even read without an NDA AFAICT) can write an applet but can't run it. The Linux community, at least, is pushing back hard, and this just might change in the next generation of CPUs or maybe even sooner. Fingers crossed.

This inspires a challenge to Yubico: give me a hardware token that runs applets. Let the token attest to the hash of a running applet, but let it run any applet whatsoever. If I want to verify that I'm running the bona fide Yubico OpenPGP applet, I can check the hash myself. If I want to replace it, I can, but then the hash will change. It'll be hard: you'll have to figure out a real isolated execution environment. It's definitely doable, though.
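
A minimal sketch of the host-side check in that challenge, with the applet path and the way the attested hash is obtained left as assumptions (no real token API is implied):

  import hashlib

  def applet_hash(path: str) -> str:
      # Hash of the applet you believe is installed, e.g. the bona fide OpenPGP applet.
      with open(path, "rb") as f:
          return hashlib.sha256(f.read()).hexdigest()

  def check_attestation(attested_hash_hex: str, reference_applet_path: str) -> bool:
      return attested_hash_hex.lower() == applet_hash(reference_applet_path)

  # Hypothetical usage: compare the token's attested hash with a known-good build.
  # check_attestation(token.attest_running_applet(), "openpgp-applet.cap")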

  • ctz 10 years ago

    > With SGX, Intel had the chance to offer a widely available security token (built in to every new CPU!) that anyone could freely program and use for their own security purposes. They blew it when they created their "launch control" policy

    Now rescinded.

    • stijnhoop 10 years ago

      Could you detail that with a link to this news?

      • amluto 10 years ago

        The Intel SDM, Volume 3, version 058 has a new set of MSRs called IA32_SGXLEPUBKEYHASH along with a new feature control bit for them. The intended policy is not specified anywhere that I can see, nor can I find any PR announcement or whitepaper. I also don't know what CPU generation will support that feature.

  • c2352466 10 years ago

    Attackers will just use the JTAG/debug port and be done with it. They probably didn't even give a single thought to whether there is a vulnerability to exploit in the open-sourced firmware.

    The YubiKey NEO was always "unsecure"; now, with the YubiKey 4, it's only possibly "unsecure".

    • amluto 10 years ago

      I think you may be reading the OP wrong. The YK NEO used a secure element chip, too.

franciscop 10 years ago

Summary:

"If you have to pick only one, is it more important to have the source code available for review or to have a product that includes serious countermeasures for attacks against the integrity of your keys?"

bb88 10 years ago

Simply put, all things being equal, the device will be reverse engineered and exploited sooner by well-funded governments than by a hacker collective.

Even worse, you won't know when the device becomes obsolete. So you might be buying an insecure solution from the start.

mindslight 10 years ago

A lot of handwaving, which may throw off those who haven't pondered the design constraints of hardened hardware. But alas, it essentially boils down to the same reason that every productized solution goes closed: it's the expedient, lazy option. Age-old antisecure solipsism.
