Basic Electron Framework Exploitation

contextis.com

169 points by adambb 6 years ago · 137 comments

dvt 6 years ago

This is clickbait nonsense. Unfortunately, because it's so popular to hate on Electron these days, it's going to get a lot of traction on HN and elsewhere. The premise of the blog post is:

> It’s important to note that this technique requires access to the machine, which could either be a shell or physical access to it

I mean... what? I can literally do code injection on (almost) any application I'm running given that I have shell or physical access to the machine. It's like the author never heard of Detours[1] or VTable injection[2]. This is a low-effort clickbaity post that brings nothing to the table for serious security researchers or even hobbyist hackers.

It's a shame, too, because there are a lot of very interesting techniques out there for injection and remote execution, but they are OS-dependent and require a lot of research. Clearly, a more interesting post would have been too much effort for OP and instead we're going to pile on Electron.

PS: ASAR code-signing is not fool-proof, as we can still do in-memory patching, etc. Game hackers have been patching (signed) OpenGL and DirectX drivers for decades. It's a very common technique.

[1] https://www.microsoft.com/en-us/research/project/detours/

[2] https://defuse.ca/exploiting-cpp-vtables.htm

  • pimterry 6 years ago

    I think it's been exacerbated significantly by the reporting elsewhere: https://arstechnica.com/information-technology/2019/08/skype...

    Notably, according to that Ars Technica coverage:

    > attackers could backdoor applications and then redistribute them, and the modified applications would be unlikely to trigger warnings—since their digital signature is not modified

    That isn't a claim in the original post, and doesn't seem to be true afaict: every distribution mechanism I can think of signs the entire distributable, so you really can't just modify the ASAR without breaking the signature. Windows & macOS both require you to only install from signed application bundles/installers (or at least they make it very difficult for you to use unsigned software). On Linux you could get caught out, but only if you download and install software with no signing/verification whatsoever, and that's a whole other can of worms.

    If that claim were true this would be a bigger concern, but given that it's not I'm inclined to agree this is basically nonsense.

    • FreakLegion 6 years ago

      > every distribution mechanism I can think of signs the entire distributable, so you really can't just modify the ASAR without breaking the signature. Windows & macOS both require you to only install from signed application bundles/installers (or at least they make it very difficult for you to use unsigned software)

      Only drivers have to be signed on Windows, and even then not all kinds until Windows 8. Also many apps, including Visual Studio Code, are available in 'run from USB' form, so there's no installer, just an archive you unpack and run. Those archives can be modified and redistributed without invalidating any of the PE signatures within, but since nobody pays attention to these signatures anyway and Windows doesn't enforce them, yeah, this is typical Black Hat-week PR nonsense.

      • dvt 6 years ago

        > Only drivers have to be signed on Windows

        This is half-true.

        Windows and macOS both make it difficult to install self-signed (or unsigned) software. For example, I made http://www.lofi.rocks (an open source Electron-based music player) and I'm not going to spend like a few hundred bucks a year to have a non-self-signed cert. This makes both macOS and Windows complain when users install the app. More draconian practices (that "protect users from themselves") will make it even harder for independent open source devs like me to share cool projects with a wide audience.

  • giancarlostoro 6 years ago

    This is like saying a Python or Ruby application could be exploited if someone snuck code onto your machine. This is a known scripting-language "flaw" that nobody cares deeply about.

  • cloudego 6 years ago

    I’m not sure why you’re so upset by this. Electron is installed on our machines and deserves to be scrutinized.

    The author presents the info clearly and even includes videos demonstrating the “technique,” so it doesn’t seem “low effort” and click-baity to me.

    I’m not sure I can support your view that this is unworthy of attention or a fix because of in-memory patching, etc. If I told my customers not to worry about my product because there are much scarier ways they could get hacked elsewhere, they would still ask why I didn’t put my best effort into closing a known loophole.

    • dvt 6 years ago

      It's clickbaity and low-effort because this is no more an "exploit" than running a random .exe is an "exploit." It can be "fixed" by always installing software from trusted vendors and not running random executables you download from IRC. In other words, it doesn't even really qualify as an attack vector. Electron isn't any more vulnerable than any given native app.

      Compare that with an actual Chromium RCE vulnerability (a very clever PDF heap corruption exploit): https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1748...

      • anonymous6969 6 years ago

        The interesting thing to me is that the techniques you are talking about get lit up like a Christmas tree by some modern endpoint protection products, whereas a backdoored Electron app is squeaky clean on Virustotal...

        • dvt 6 years ago

          This is untrue. A "backdoored" native app (i.e. an app where an executable or DLL was modified) would also be squeaky clean.

          • anonymous6969 6 years ago

            No, if you embedded malicious code (say a full-blown RAT, like this tool gives you) into an exe, modern AV models will do static analysis of that code and flag it as potentially malicious. Because none of the JavaScript code in this backdoored Electron app is even looked at by any engine (none of the engines on VirusTotal do analysis of JavaScript), the binary features are indistinguishable from the legitimate version.

            Backdoored ccleaner flagged as malicious by multiple ml based products: https://www.virustotal.com/gui/file/6f7840c77f99049d788155c1...

            Backdoored xmanager flagged by multiple ml based products: https://www.virustotal.com/gui/file/d484b9b8c44558c18ef6147c...

            Countless other examples.

            • dvt 6 years ago

              True, but those are backdoored apps whose signatures have been identified and stored in some AV database. The solution is (provably) impossible to generalize with static analysis. Clearly, it's also reactive (people need to report the backdoored application before you know its signature). There are also fairly well-documented ways to get around this signature approach to AV (polymorphism comes to mind).

              • anonymous6969 6 years ago

                No, signatures have nothing to do with this. The ML models embedded in those products (and what is evaluated on VirusTotal) can flag legit software that has had a RAT executable inserted into it by modifying the binary. The ML models are trained on thousands of features, and are pretty good at classifying malware. USCYBERCOM has been tweeting APT malware that was not seen by these models or anyone in the public, and yet was still flagged: https://twitter.com/CNMF_VirusAlert . That would be completely impossible if these products were relying on signatures.

                Regardless, the entire point I was making in my original comment is that this article is far from clickbait nonsense, because you have a chance, significant from what I've seen, of flagging something like the backdoored pieces of software I linked, or never-before-seen malware like in the tweets above, because the malware exists as compiled code. JavaScript is currently not evaluated whatsoever by ANY software security product, so the chances of it being flagged and blocked are 0.

                Signatures and polymorphism are 10 years ago, quite frankly. Backdoored Slack exfilling data in steganographic images over https to giphy.com and instagram and twitter and shit is one future realm of malware. Both the binary and the network traffic are completely indistinguishable from legitimate usage.

      • pvg 6 years ago

        The claim is it's easier to bypass some app integrity protection mechanisms when the target is an Electron app.

        • hnbroseph 6 years ago

          "easier" than what? and is it particularly noteworthy? if malicious code has write access to any given app's constituent files, there's effectively no app that's hard to subvert.

          • pvg 6 years ago

            Easier than apps that are better covered by system app integrity protection? I'm not sure what's unclear about this, it's right in the writeup.

            • dvt 6 years ago

              If you're talking about installing apps, every installed app needs to be signed (unless you ignore Windows/macOS warnings). If you're talking about injection or modifying program files (be they executables, DLLs, or ASARs) post-install, every app is equally vulnerable. There is no functional difference between a native app and an Electron app in that regard, so maybe you can clarify what you mean by "system app integrity protection."

              • pvg 6 years ago

                > so maybe you can clarify

                I didn't write this thing, I'm just saying that the claims it makes are not the claims you say it makes. 'Functionally equivalent' is a bit like 'Turing complete' - it makes it easy to say something so true it's not actually interesting.

                It's not some major discovery or controversial claim that Electron apps are an even more convenient and easier-to-leverage vector for exploitation than regular old binaries. But writing some blog post about it (they didn't give the vuln a name, they didn't rent it shoes, they aren't buying it a beer) does not warrant the weird invective you're throwing at it.

                • dvt 6 years ago

                  I wasn't trying to be snippy, I genuinely didn't understand what you meant since the term "system app integrity protection" isn't anywhere in the original blog post. Also, just to clarify, by "functionally equivalent" I meant "exactly the same."

felixrieseberg 6 years ago

Electron co-maintainer here, so I'm a bit biased.

1) We should absolutely work towards allowing developers to sign their JavaScript.

2) Re-packaging apps and including some menacing component as a threat vector isn't really all that unique. We should ensure that you can sign "the whole" app, but once we've done that, an attacker could still take the whole thing, modify or add code, and repackage. We sadly know that getting Windows SmartScreen and macOS to accept a code signature doesn't necessarily require exposing your identity, and I'd _suggest_ that most people don't _actually_ check who signed their code.

3) If you ship your app as a setup bundle (say, an AppSetup.exe, an App.dmg, or rpm/deb files), you should code-sign the whole thing, which completely sidesteps this issue. The same is true if you use the Mac App Store, Windows Store, or Snapcraft Store.

  • some_furry 6 years ago

    > 1) We should absolutely work towards allowing developers to sign their JavaScript.

    I've already been working on this for my own projects. It might be something that can be generalized for all Electron projects.

    https://github.com/soatok/libvalence

    https://github.com/soatok/valence-updateserver

    https://github.com/soatok/valence-devtools

    This uses Ed25519 signatures and an append-only cryptographic ledger to provide secure code delivery. The only piece it's currently missing is reproducible builds.

    For greater context: https://defuse.ca/triangle-of-secure-code-delivery.htm
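
    To make the core of that concrete: a minimal sketch of the verification step (this is not libvalence's actual API; the file names and key layout are invented here), assuming an Ed25519 public key in PEM form shipped next to the app and Node 12+:

        // Verify an Ed25519 signature over app.asar before loading it.
        const crypto = require('crypto');
        const fs = require('fs');

        const publicKey = crypto.createPublicKey(
          fs.readFileSync('publisher-ed25519.pub') // hypothetical PEM file
        );
        const asar = fs.readFileSync('resources/app.asar');
        const sig = fs.readFileSync('resources/app.asar.sig'); // hypothetical

        // For Ed25519, Node's crypto.verify takes null as the algorithm.
        if (!crypto.verify(null, asar, publicKey, sig)) {
          throw new Error('app.asar signature check failed; refusing to load');
        }

    (Of course, a check living in the app can be patched out by whoever can patch the ASAR, which is why OS-level enforcement still matters; see the reply below.)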

    • cjbprime 6 years ago

      I think you need OS codesigning integration for this threat model. Otherwise whatever special app runtime check code you add just gets removed by the malicious overwrite of your app code.

  • cjbprime 6 years ago

    I don't think it's correct that 3) sidesteps the issue, if I'm understanding it. Electron App is installed via a codesigned setup bundle. Then Malicious App runs on the machine later and overwrites your ASAR. The OS doesn't complain because the ASAR isn't receiving codesigning protection, and Electron App has been backdoored in a way that the system's use of codesigning suggests wouldn't be possible.

    • felixrieseberg 6 years ago

      If you're already running code on the victim's machine, presumably with sudo rights to change `/Applications`, you've already hit the jackpot. Yes, you can change apps, but if you're the victim, that's _probably_ not the biggest issue. It's the rootkit on your machine.

      • cjbprime 6 years ago

        This (FS write access == game over) is usually true on Linux, but the Mac and Windows codesigning infrastructures exist to offer some protections and user warnings in this case, and they're what's being defeated by this attack.

        • kogir 6 years ago

          With FS access you can just strip the signature entirely and it’ll run without any fuss. In this case it’s the machine that’s compromised, not the app.

      • anonymous6969 6 years ago

        OSX is getting rid of the ability to run unsigned kernel extensions pretty soon. Compiled off-the-shelf RATs are usually lit up pretty well by modern AV, as can be seen by Virustotal results. And a noisy python/ruby/whatever executable on the marketing person's computer would raise a few eyebrows in some organizations. Slack/Discord on the other hand...

  • floatingatoll 6 years ago

    3) is not a valid protection on macOS once the application is copied away from the signed DMG (which is then discarded).

    macOS code signing does not extend to Contents/Resources/ which, unfortunately, is where — without exception — every application on my system stores 'electron.asar'.

        /Applications/VMware Fusion.app/Contents/Library/VMware Fusion Applications Menu.app/Contents/Resources/electron.asar
        /Applications/balenaEtcher.app/Contents/Resources/electron.asar
        /Applications/itch.app/Contents/Resources/electron.asar
        /Applications/lghub.app/Contents/Resources/electron.asar
        /Applications/Boxy SVG.app/Contents/Resources/electron.asar
        /Applications/Slack.app/Contents/Resources/electron.asar
        /Applications/Discord.app/Contents/Resources/electron.asar

    • ilikehurdles 6 years ago

      This response from elsewhere [1] seems relevant:

      > Here's the thing with how gatekeeper works: that application had already passed gatekeeper and will never be _fully_ validated ever again.

      > If you zipped your modified Slack.app up, uploaded it to Google Drive, and downloaded it again, Gatekeeper would 100% reject that application; the ASAR file is included as part of the application signature. You can prove this by checking the "CodeResources" file in the app's signature.

      > You can't re-distribute the app without gatekeeper completely shutting you down.

      [1]: https://news.ycombinator.com/item?id=20637738

      • floatingatoll 6 years ago

        Hooray! I am glad to be wrong. For others looking to test this,

            $ codesign -dv /Applications/xyz.app
            ...
            Sealed Resources version=2 rules=13 files=122
            ...
        
        For version=2, all resources are signed.

  • burtonator 6 years ago

    Electron developer here. I work on this project:

    https://getpolarized.io/

    We ship binaries for MacOS, Linux and Windows. ALL our binaries are signed. You're INSANE if you don't do it. It's still a MAJOR pain though and wish it was a lot easier.

    If ANYTHING what we need to do is make it easier for MacOS and Windows developers to ship code signed binaries.

    It took me about 2-3 weeks of time to actually get them shipped. Code signing is very difficult to set up, and while Electron tries to make it easy, it's still rather frustrating.

    The biggest threat to Electron is the configuration of the app and permissions like disabling web security. If you're making silly decisions you might be able to get Electron to do privilege escalation.

    • mceachen 6 years ago

      Can confirm. It took the better part of a month to get both windows and mac code signing certificates provisioned for PhotoStructure.

      The diligence applied for both platforms at least exceeded pure security theater. They actually made a modicum of effort to ensure I was who I said I was, but it wasn't much. It just took a lot of wall time.

      • mgoetzke 6 years ago

        Which is really weird. A Let's Encrypt-style approach to validate ownership of a domain should be sufficient: if the app is from a domain you trust, that should be enough for most apps. Bonus checks for high-risk applications (banking/LoB, etc.).

        • mceachen 6 years ago

          I don't think it's analogous.

          If you need a certificate to prove you own a domain, changing DNS TXT records for that domain, or serving a secret from that domain, proves you own the domain.

          If I need a certificate that proves I am the corporate entity on some signature, say, "PhotoStructure, Inc.", there isn't some magick TXT record I can add that uniquely identifies me as the owner of that business.

pfraze 6 years ago

I feel like the headline is a bit click-baity but I don't want to jump to conclusions.

> Tsakalidis said that in order to make modifications to Electron apps, local access is needed, so remote attacks to modify Electron apps aren't (currently) a threat. But attackers could backdoor applications and then redistribute them, and the modified applications would be unlikely to trigger warnings—since their digital signature is not modified.

So the issue is that Electron app distributions don't include a signed integrity check, so there's no way for end-users to detect if they got a modified version. I thought that the macOS builds did do this, but maybe the ASAR bundles aren't included in the hash, or maybe I'm wrong entirely.

I assume a solution would store the signing pubkey on initial install and then check updates against that. The only way the signing key could be checked other than trust-on-first-install would be through some kind of registry, which is what I assume the Windows and Mac stores are geared toward. Am I correct on all this?

EDIT: Either way, it seems like the solution is to only use the projects' official distribution channels. Signed integrity checks would be useful but probably not change the situation that dramatically. Is that accurate?

  • cjbprime 6 years ago

    I'm still trying to figure it out too:

    > I thought that the MacOS builds did do this, but maybe the ASAR bundles aren't included in the hash?

    Yeah, I think that's the problem they're describing. It sounds like the Mac setup will require binaries -- like the Electron runtime itself -- to be codesigned, but if the first thing your codesigned binary does is to read an unprotected JS file off disk and execute it, there's no codesigning benefit.

    > Either way, I assume a solution would store the signing pubkey on initial install and then check updates against that

    Not just updates that you initiate yourself, though -- I think the idea is that any other app on the system could backdoor the JS in the ASAR at any time. That's pretty hard to defend against.

    • marshallofsound 6 years ago

      Hey Electron maintainer here

      > but if the first thing your codesigned binary does is to read an unprotected JS file off disk and execute it, there's no codesigning benefit.

      The ASAR files described in this post are part of the signature of the application though. You can't modify that file and then redistribute the app to another machine without gatekeeper getting incredibly angry at you.

      E.g. Try modifying the ASAR file, zip the app up, upload to Google Drive, download again and try to run the app. Gatekeeper will boot it into the shadow realm :)

    • pfraze 6 years ago

      > Not just updates that you initiate yourself, though -- I think the idea is that any other app on the system could backdoor the JS in the ASAR at any time. That's pretty hard to defend against.

      Good point, but if the attacker has filesystem access you're already hosed. I suppose there could be some other risk where the ASAR could be modified without full FS access? But I'd want to know what that attack is, if that's the case.

      • cjbprime 6 years ago

        > If the attacker has filesystem access you're already hosed.

        I think that's not supposed to be true in modern (e.g. latest macOS) threat models. App Y isn't permitted to just replace App X unannounced, and on both Mac and Win there's a large codesigning infrastructure in place to provide that protection.

        • applecrazy 6 years ago

          Also, sandboxing is designed to prevent unfettered filesystem access on macOS, meaning this isn’t part of the threat model if all apps are sandboxed and packaged.

    • ilikehurdles 6 years ago

      What makes this unique to electron as opposed to any other application that doesn't run as a completely closed binary (not that binaries can't be backdoored, of course)?

      • seandougall 6 years ago

        On macOS, if my understanding of the current situation is correct, code signing normally covers all binaries in an application bundle, including binaries in all bundled frameworks. What's different about Electron is that it puts application code, which is not a binary, into the Resources/ directory, which is not signed.

        I just tried this out with Slack on macOS, and it did work... almost as advertised. I had to use sudo to change the application files, which means this isn't really much of a novel attack surface, but it did bypass the code signing checks quite handily.

        So, is this a "vulnerability"? That may be a stretch, as far as I can see, but putting application code in Resources/ definitely counts as a "smell" in my book.

        • marshallofsound 6 years ago

          Hi, Electron maintainer here.

          > I just tried this out with Slack on macOS, and it did work

          Here's the thing with how gatekeeper works: that application had already passed gatekeeper and will never be _fully_ validated ever again.

          If you zipped your modified Slack.app up, uploaded it to Google Drive, and downloaded it again, Gatekeeper would 100% reject that application; the ASAR file is included as part of the application signature. You can prove this by checking the "CodeResources" file in the app's signature.

          You can't re-distribute the app without gatekeeper completely shutting you down.
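
          To poke at this yourself: the seal lives inside the bundle, and the ASAR shows up in it. A quick check, using Slack purely as an example path:

              grep -A 3 'app.asar' /Applications/Slack.app/Contents/_CodeSignature/CodeResources

          The surrounding plist entries are the per-file hashes that gatekeeper validates while the app is still quarantined.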

          • seandougall 6 years ago

            Thanks for taking the time to reply! Like many here, I've been a critic of Electron, but I think it also does some amazing stuff, and I'm sorry you have to go into PR maintenance mode over such a weaksauce article.

            I was coming back to follow up and say that that's exactly what I found -- running `codesign --verify` does show the modification. It makes sense that Gatekeeper wouldn't re-verify a 185 MB bundle on every launch, which makes me wonder if there's something else macOS could be doing at the FS level to see if any files have been modified and trigger a new check.

            At any rate, while I don't quite take back what I said about application code in Resources/, I do take back the implication that it had anything to do with this; I suppose there doesn't seem to be anything Electron-specific about TFA, other than that exposing raw JS lowers the bar for who can write the code to inject. (Assuming you can get FS access to inject code in the first place, of course.)

    • hnbroseph 6 years ago

      if you have malicious code executing that has the access and opportunity to modify files, a modified electron app is likely just the beginning of your troubles.

  • withinrafael 6 years ago

    Developers on Windows, in this scenario, can generate a catalog of all files in their app, sign it, and verify it at runtime [1], negating the need to rely on upstream to incorporate signature support into the asar file spec. There may be workable equivalents on macOS and Linux.

    [1] https://docs.microsoft.com/en-us/windows-hardware/drivers/in...

    But this will all be in vain if the attacker scenario includes unfettered file-system access. (They can modify the app to not perform these checks, for example.)
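
    For the curious, the catalog flow is roughly this (a sketch; file names are hypothetical, makecat ships with the WDK, and the <HASH> tags are arbitrary labels):

        ; app.cdf -- catalog definition listing the files to seal
        [CatalogHeader]
        Name=app.cat
        CatalogVersion=2
        HashAlgorithms=SHA256
        [CatalogFiles]
        <HASH>app_asar=resources\app.asar

        rem Build and sign the catalog, then spot-check a member file:
        makecat -v app.cdf
        signtool sign /fd SHA256 /a app.cat
        signtool verify /pa /c app.cat resources\app.asar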

Rotten194 6 years ago

I don't see why this is a big deal -- a native app can also be distributed with malicious patches or dlls, and those are common methods for e.g. game modding and cracking. If you're worried about the integrity of a program, check the hash.

efficax 6 years ago

If you can write to my binaries, you can do anything you want to me. Boring.

saagarjha 6 years ago

> The problem lies in the fact that Electron ASAR files themselves are not encrypted or signed

Resources on macOS get signed as part of the application bundle. I wonder why this isn't possible for Electron apps as well.

  • marshallofsound 6 years ago

    Hi Electron maintainer here

    ASAR files are signed as part of the application bundle. The issue is that folks don't understand how gatekeeper works, so let me try to explain it here.

    When you download an application from the internet, macOS initially considers it "quarantined". When a quarantined application is first opened, gatekeeper scans it _completely_ and, if it's happy, removes the quarantine tag and lets it launch.

    Once that quarantine tag is removed, gatekeeper will never run a complete check of that application again. Meaning the ASAR files are validated once, when the application is first launched.

    What people are seeing here is they're taking an application that gatekeeper has already signed off on, modifying it, and then asking why gatekeeper didn't stop them.

    If you took that modified application, zipped it up, uploaded it somewhere, downloaded it again and tried to run it, it would NOT work. Gatekeeper would boot that invalid application to the shadow realm.
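
    You can watch this mechanism from the command line (a sketch; Slack is just an arbitrary installed app here):

        # Is the quarantine flag still set? (fails once it's been cleared)
        xattr -p com.apple.quarantine /Applications/Slack.app
        # Force the full signature check gatekeeper only runs while quarantined
        codesign --verify --deep --strict --verbose=2 /Applications/Slack.app
        # Ask the assessment subsystem what it would decide for this app
        spctl --assess --type execute -vv /Applications/Slack.app

    After modifying the ASAR in place, the codesign check should fail even though the app still launches, which is exactly the behaviour described above.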

    • saagarjha 6 years ago

      Once you can establish that the main application binary is codesigned correctly (which AFAIK macOS will do at each launch?), why can't you put signature checks into that to validate the ASAR files?

    • ilikehurdles 6 years ago

      So this sounds like a non-issue -- or at least not a new or novel one. How did this get published so far and wide?

      • mceachen 6 years ago

        How does any nonsense get published far and wide?

        People are trying to be helpful, perhaps, by amplifying some concern, while at the same time not having the expertise necessary to see it as false.

  • adambb (OP) 6 years ago

    This appears to be the issue that is referenced in the article about why they don't sign currently:

    https://github.com/electron/electron-packager/issues/656#iss...

f00b4r666 6 years ago

This article seems a bit clickbait-y considering this means that you'd have to download the application from an untrusted source for this "exploit" to be taken advantage of. The same could be said for most applications if people aren't checking that the hashes match.

I feel like this will get a ton of discussion here anyway due to the Electron hate train.

  • blackflame7000 6 years ago

    What if you installed it via a trusted source, and then someone swapped the ASAR files without your knowledge? A flash-drive programmed to operate as a keyboard could easily swap in a malicious file simply by plugging it into the victim's computer when they aren't paying attention.

    • volkk 6 years ago

      can't you do far worse if you're actually plugging in and running code from a flash drive on someone's computer?

      • blackflame7000 6 years ago

        Perhaps, depending on what sort of Anti-virus/Monitoring software is installed. It would definitely leave a bigger trace to install, run, and persist a malevolent executable than to hijack an already trusted one. Like, if you saw a random exe running in Task Manager you would be much more paranoid than if you just saw Slack.

        I guess a better example might be if you have 2 admins on one computer and one could edit the files in the programs directory to spy on the other. This assumes that only trusted executables are run by the victim (i.e. Word) and you don't have the ability to modify its source code to make it malicious.

hnbroseph 6 years ago

You could say the same about (for example) a Python-based QT app. Or any scripting-language based application or framework.

It's also true to say something like "Rails can be back-doored by modifying the code and redistributing it to unsuspecting developers!"

Actually, you could say the same or similar things about many applications, including binary distributions. With some analysis you can figure out what conditions a jump instruction is using, and modify it to always jump where you want. Cheat Engine lets you analyse game memory at runtime and substantially modify behavior.

davej 6 years ago

Here's the corresponding issue on Github: https://github.com/electron/asar/issues/123

As you can see from the issue, this exploit has been known for 2 years and probably longer than that. As I said (November 2018) in the linked issue, I believe it's only a matter of time before Skype/Slack/VSCode gets packaged up with malicious code and flies under the radar of SmartScreen and Gatekeeper. It probably won't be downloaded from the official websites but there are plenty of other ways of distributing the software. I get the feeling that the Electron team aren't taking it too seriously. I think this has the potential for a really dangerous exploit.

My startup (ToDesktop[1]) uses Electron and I've put a huge effort into securing certificates on HSMs (Hardware Security Modules). But it's mostly a pointless exercise when a hacker can simply edit the javascript source.

[1] https://www.todesktop.com/

howlett 6 years ago

I wrote the original post. The main issue I was trying to highlight is that you can make signed apps run your code from a local perspective. Here's a real-life scenario that happened:

I was doing a security assessment for a client, and after gaining a foothold on the host we needed to establish persistence. As the endpoint protection was blocking anything non-signed, I used Slack to inject a PowerShell payload that executed on startup and gave us access back to the internal network.

So the risk is there, not for the individual user but for the organisations using it. I didn't expect this to become a big deal over "redistribution"; the point I hoped would land was command execution without modifying the binary.

Having said that, this can be solved with a simple integrity check of the asar files. Sure, the attacker can modify the binary file too, but then it's not signed anymore.
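
The simplest version of that check, as a sketch (the expected hash is a placeholder here; in practice it would be baked into the signed native binary at build time):

    // Run from the signed main executable before loading the archive.
    const crypto = require('crypto');
    const fs = require('fs');

    const EXPECTED_SHA256 = '<hash baked in at build time>'; // placeholder
    const actual = crypto.createHash('sha256')
      .update(fs.readFileSync('resources/app.asar'))
      .digest('hex');

    if (actual !== EXPECTED_SHA256) {
      throw new Error('app.asar does not match the shipped build; aborting');
    }

Since the binary carrying the expected hash is covered by the platform code signature, an attacker has to modify that binary to bypass the check, and then the signature no longer verifies.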

pimterry 6 years ago

I'm unclear about how this attack works. The article says:

> attackers could backdoor applications and then redistribute them

Most distribution mechanisms, however, ship a single signed bundle, containing & thereby signing the entire application, including resources like ASARs. Any that don't sign the application are of course vulnerable to all sorts of trivial attacks (replace the whole binary with anything you like).

To make this a danger from a distribution POV, it seems you would need the application to be partly signed; i.e. the executable but not the included resources. Where does that happen?

For macOS for example, all resources (including ASAR files) are signed, and macOS makes it intentionally difficult to install anything that isn't signed.

Similarly for Windows you'll see large warnings if you open an unsigned application; Electron apps are almost always distributed as a single signed installer exe file, including the ASAR file.

On Linux it depends wildly, but most of the time either the entire package (e.g. a deb from the official repos) is signed, or nothing is signed and you're vulnerable regardless.

What am I missing?

(I'm not addressing the risk of altering an already-installed application - that's a separate attack also mentioned, but it requires local access to rewrite files on the target machine, at which stage there are many other options)

EDIT: URL has now been updated, here I'm discussing points from https://arstechnica.com/information-technology/2019/08/skype.... The post now referenced doesn't mention redistribution, and I suspect that in fact Ars is wrong, and allowing signed redistribution of subverted versions isn't a real vulnerability here. I'd love to hear if I'm wrong though!

  • seandougall 6 years ago

    > For macOS for example, all resources (including ASAR files) are signed, and macOS makes it intentionally difficult to install anything that isn't signed.

    I just tried this with Slack on macOS, and it launched without a single complaint about code signing. It would appear that either the ASAR files are not included in the signature, or the OS doesn't check the entire application bundle on every launch.

    (Edit: That said, I needed sudo to do the mod in the first place, so I'm not about to start panicking about this as an attack vector.)

    (Edit 2: As 'marshallofsound pointed out below and elsewhere, it is the latter case; the OS doesn't check the entire bundle on every launch. Which makes sense, and also means TFA is not really about Electron at all.)

thrax 6 years ago

This is pure FUD. Literally "hacker with her hands on your keyboard can compromise your machine."

barnson 6 years ago

The high-order bit is that if you install your apps to user-writable locations in the file system, your app is vulnerable to any other app the user runs. There's no reason Electron apps can't be installed to protected locations. VSCode provides a "system installer" that does, for example (on Windows). However, updates then require elevation, so to reduce friction the per-user installer is the recommended default for VSCode.

SamuelAdams 6 years ago

For those that do not read the article:

>Tsakalidis said that in order to make modifications to Electron apps, local access is needed, so remote attacks to modify Electron apps aren't (currently) a threat. But attackers could backdoor applications and then redistribute them, and the modified applications would be unlikely to trigger warnings—since their digital signature is not modified.

unnouinceput 6 years ago

Oh, I have power over my own applications? Thanks, Captain Obvious. Those who also read The Old New Thing know what I'm talking about.

c-smile 6 years ago

Ideally, an HTML/CSS/script application should be a single monolithic signed executable.

But that's achievable only with Sciter :)

sctb 6 years ago

We've updated the link from https://arstechnica.com/information-technology/2019/08/skype..., which points to this.

mavdi 6 years ago

Electron is the new Flash. Change my mind. ¯\_(ツ)_/¯

  • bsmith0 6 years ago

    It makes cross-platform desktop development much easier.

    VSCode, Discord, and (the new) Slack are written in Electron, and those absolutely excel, even at the cost of a bit of extra memory usage.

    There's a circlejerk of Electron hate, but there's a reason it's so popular: the ease of development outweighs its weighty memory drawbacks for many companies and individuals.

    Edit: Not to mention that it uses Node.js, HTML, CSS, so moving from web-apps => desktop becomes a much simpler endeavor.

    • packet_nerd 6 years ago

      Slack excels? In my experience it's really slow and clunky and consumes way more memory than it should.

      As the sysadmin at my company I ban all electron apps unless it's clear they are exceptionally well written and/or there are absolutely no alternatives. VSCode is really good and one of my few exceptions. I strongly suspect it would have been even better if they had developed with a more performant platform, but who knows.

      Edit: To expand on my reasoning:

      Say an operation on a well-built, performant application is 5 seconds faster than the electron (or otherwise bloated) version. Say employees do that operation on average 20 times a day. Say I have 2500 employees who work 246 days a year and get paid $25/h on average. The slow version will cost the company $427,083 every year. That's the amount of money I'd be willing to spend per year for the fast version of this hypothetical application.
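
      (Worked through: 20 operations × 5 s = 100 s per employee per day; 100 s × 2500 employees × 246 days = 61,500,000 s ≈ 17,083 hours; 17,083 h × $25/h ≈ $427,083.)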

      A company like Slack has hundreds of thousands of users, and the poor performance must be costing someone millions. It boggles my mind that with all that money, they still can't find the resources to make a performant application.

      And that's the naive calculation; there's also the administrative aspect of installing, upgrading, and supporting the application. (The worse the application's quality, the more time I, who am paid a lot more than $25/h, spend supporting it.) While there are multiple variables here, a development team that prioritizes "easy and fast" development doesn't inspire me with confidence that they have also prioritized building a quality product.

    • d2mw 6 years ago

      > It makes cross-platform desktop development much easier.

      That is literally how Adobe Air was billed

      • benbristow 6 years ago

        Except to run Adobe Air applications you needed to install/distribute the proprietary closed-source Adobe Air runtime.

        Electron is just web technologies wrapped with a library to call system APIs.

        • inferiorhuman 6 years ago

          > Except to run Adobe Air applications you needed to install/distribute the proprietary closed-source Adobe Air runtime.

          Electron may be open source in theory, but it's developed by a fairly insular group that actively rejects improvements to portability. Getting FreeBSD patches merged back into Node was hard enough; the Electron crew simply rejects them.

          Have you ever tried to build something like Electron or VS Code on FreeBSD? It's horrendous.

        • miyoyo 6 years ago

          Except to run Electron applications you need to distribute the open source Electron runtime.

          Just so happens that it's shipped with every application.

          • benbristow 6 years ago

            True. With all binaries you're going to be running bundled software without being fully sure what's going on.

            Send me the source for an Adobe Air application though and I still need to download Adobe Air proprietary tools to run it.

            For an Electron application I can run it directly from open source tooling.

    • JustSomeNobody 6 years ago

      > It makes cross-platform desktop development much easier.

      For the developer, while the user has to deal with horrid battery and memory consumption.

      Devs are paid well enough that "easy" shouldn't be a top priority.

      • bsmith0 6 years ago

        What about the potential technical debt and additional manpower it requires to build out per-platform desktop applications?

        Easy is a priority for many OSS projects and companies who want to ship something quick and reliable.

    • sixothree 6 years ago

      I don't know why people complain about performance. VSCode is my primary experience with Electron and it runs extremely well. It handles large files just as well as Notepad++.

      edit: Just to be clear, I don't mean to suggest Notepad++ is best at large files. EditPad is clearly the winner in that category. But I do mean to say VSCode and N++ are on par with each other.

      Besides, VSCode provides an incredible amount of functionality. So what if it uses as much RAM as my browser. I routinely run multiple browsers, each with multiple windows.

      • vanilla_nut 6 years ago

        VSCode is generally considered to be the canonical example of an Electron app well optimized. They have put a significant amount of work (much moreso than your average company producing an Electron app) into making it very responsive. All you have to do is use Slack or Discord or Atom or Skype and you'll quickly understand why people hate on the performance.

        Spotify is a good example of an application that doesn't use Electron (though it uses a similar framework) and still manages to be incredibly slow and bloated. As are most websites. So the problem might not be that Electron is slow... it might just be that the web in general is really slow unless you work extra hard.

        • inferiorhuman 6 years ago

          > VSCode is generally considered to be the canonical example of an Electron app well optimized. They have put a significant amount of work (much moreso than your average company producing an Electron app) into making it very responsive.

          And it's still significantly slower than something like Emacs or Sublime (or, dare I say, IntelliJ?). I like, and use, Visual Studio Code, but it's hardly performant.

  • tomatotomato37 6 years ago

    Makes sense. Instead of bringing slow risky software to the web just so people didn't have to learn javascript, it brings slow risky software to the desktop just so people don't have to learn C.

    • t0astbread 6 years ago

      Why risky? This vulnerability applies to any runtime or program that isn't signed as far as I can tell. It's not specific to Electron or JavaScript.

      Aside from that, I don't know what the situation is in the year 2019 but does C still allow you to easily mess up memory management?

    • pgcj_poster 6 years ago

      But Flash was faster than JavaScript for a long time.

  • metalliqaz 6 years ago

    I certainly avoid it the way I avoid flash webpages.

  • pjc50 6 years ago

    It doesn't have a convenient but proprietary IDE for creating it.

  • sixothree 6 years ago

    You can't run Electron apps in a browser, nor is there a runtime needing installation. :) Since we're doing that sort of thing.

  • floatboth 6 years ago

    It's actually the new XULRunner.

    • thecopy 6 years ago

      Or the new Java

      • saagarjha 6 years ago

        Java actually made an effort to look like the platform it was on, though.

        • xigency 6 years ago

          That has never been my impression. Java applications have a Java look.

          What has actually impressed me as an easy way to do native looking UIs, albeit simple ones, is PyTK. On Windows, you can even select between the different styles that are internal.

  • ilikehurdles 6 years ago

    How? Honestly, I don't see the comparison.

  • notatoad 6 years ago

    name one way in which they're similar (beyond "I don't like either of them")

throwaway8879 6 years ago

At this point in time, it's reasonably healthy to assume that everything has backdoors. The only place where information can be kept safe and hidden is deep within our minds. Any method used to share said information with another human being is subject to surveillance and backdoors. Only share what you don't mind being read by the state and its friends.

  • he0001 6 years ago

    For some reason, the key to my mind’s backdoor is beer.

  • loceng 6 years ago

    Unless you're using [or forced to use?] Neuralink; probably.

  • kd3 6 years ago

    This is actually good advice. Anything you want to keep secret should stay in your mind. Anything else gets progressively risky.

    • Ahwleung 6 years ago

      If you want to apply this advice practically, instead of using and trusting any of the various password managers out there, use a brain-stored hash algorithm for all password management. For example your hash could be <some secret phrase> + the last 4 letters of the website/service being visited, with the last 2 letters flipped. Combine the phrase in some non-intuitive way.

      Only other considerations are to have a more basic hash for certain financial websites/insurance companies (cough Allstate) that for some reason think an 11-character max password is still okay in this millennium, and to have a method of "incrementing" the password in case you have a service that forces rotations. The only reason to write the hash down is for financial service access in the case of estate planning - store it securely/safely, of course.

      Ever since switching to this, I've found it's even more convenient than a password manager. You get used to running your hash in a very short time, and don't need to have access to an electronic device to recall a password.
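
      A toy sketch of that flavor of scheme (the secret phrase and combination rule here are placeholders, not a recommendation):

          // Derive a per-site password: secret phrase + the last 4 letters
          // of the service name, with the final two letters swapped.
          function derive(secret, service) {
            const tail = service.slice(-4);                       // "gmail" -> "mail"
            const flipped = tail.slice(0, 2) + tail[3] + tail[2]; // "mail" -> "mali"
            return secret + flipped;
          }
          // derive('correct-horse', 'gmail') === 'correct-horsemali'

      The point is that the algorithm runs in your head; the code is only to make the rule concrete.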

      • kd3 6 years ago

        I had thought of doing that but the various differences and requirements for password length and characters everywhere make it difficult to standardize on one hash. Before you know it you're keeping track of different hashes and it becomes risky to memorize. Or is your experience different?

  • rudiv 6 years ago

    You heard it here first, friends. Or maybe you heard it earlier from Huxley or Orwell.

StreamBright 6 years ago

It is amazing to see how large, fat, over-engineered frameworks are taking over the internet. Not only are they easy to backdoor, but they usually consume an enormous amount of memory and CPU. Not sure how we ended up here.

  • weego 6 years ago

    Because app development insisted on a high-barrier-to-entry approach to paradigms and tooling that put it out of reach of most developers, enough of whom valued a pragmatic approach to getting their ideas out into the world.

    The fact that billion-dollar companies insist on continuing to take the shortcut approach when they have the resources available to "be better" is not the fault of the framework developers who originally innovated to fill the demand.

  • pferde 6 years ago

    Cost saving, plus the "suck it up, it runs so it's good enough" user peer pressure to accept the lowest common denominator.

  • t-writescode 6 years ago

    Cross-platform GUIs are hard or ugly, and HTML+CSS came to save the day.

  • peteradio 6 years ago

    Bloat and corruption is the only conceivable way to keep X billion people employed?

  • sixothree 6 years ago

    > Not sure how we ended up here.

    Then you don't understand why people use them.
