Temporary fork enables Node.js to optionally use the Chakra JavaScript engine

github.com

170 points by chcokr 11 years ago · 103 comments

elisee 11 years ago

"This temporary fork enables Node.js to optionally use the Chakra JavaScript engine on Windows 10, allowing Node.js to run on Windows on ARM." (the submission title has been updated after I posted this, was initially "MS releases a fork of Node that uses the Chakra JavaScript engine instead of V8")

Looks like they intend to merge back with node mainline... ?

EDIT: Found this: http://blogs.windows.com/buildingapps/2015/05/12/bringing-no...

They're doing this to be able to run Node.js apps on Windows 10 ARM (on which V8 supposedly doesn't run?)

"We will be submitting a pull request to Node.js after stabilizing this code, fixing key gaps and responding to early community feedback."

"Going forward, we plan to work closely with the Node Foundation, the Node.js Technical Committee(s), IO.js contributors and the community to discuss and participate in conversations around creating JavaScript engine agnostic hosting APIs for Node.js, which provide developers a choice of JavaScript engine that they would want to use in their Node.js workflow"

Looks like the pull request will consist mostly of exposing new hooks to integrate with Chakra / other JS engines and won't involve pulling any Chakra code into Node.js (which would be unlikely to be merged). Might lead to a SpiderMonkey version of Node.js at some point, too. Nice to see IO.js mentioned. Looks like a very positive initiative (assuming it doesn't complicate Node core too much)

  • zpao 11 years ago

    > Might lead to a SpiderMonkey version of Node.js at some point, too.

    I worked on that 4 years ago :) <http://zpao.com/posts/about-that-hybrid-v8monkey-engine/>. The Node community at the time wasn't a huge fan, though it's effectively the same thing that MS just did (build a minimal V8 API shim on top of another JS engine). I guess everybody is ok with a little fragmentation now. Our intention was also to try to get this upstreamed, however with low interest and other things to do, we didn't follow through.

    I'm excited to see this, and especially to have the MS folks involved with the TC. I'd love to see an engine-agnostic API but realistically I don't think it'll happen, at least not anytime soon. Right now Node itself definitely relies pretty heavily on the V8 APIs. Those APIs can be abstracted for the most part (even if each engine is just shimming those parts of the V8 API), but the other problem is the longer tail of binary npm modules. Right now they have the full V8 API to work with. If they do a shim layer, it will come at a cost for every vendor except V8. And then there's maintaining that layer as V8 changes its APIs. If you go the engine-agnostic API route, then you will need coordination between engine vendors. That opens the door to a multitude of problems.
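
The shim approach described here (other engines implementing the V8 API surface) happens in C++, but the shape of an engine-agnostic hosting layer can be sketched in plain JavaScript. Everything below is invented for illustration - the engine objects, names, and methods are not real APIs:

```javascript
// Hypothetical sketch of an engine-agnostic hosting API: a thin
// adapter maps a common surface onto engine-specific entry points.
function createEngineAdapter(engine) {
  // Each engine "vendor" registers how to perform the common operations.
  const bindings = {
    v8:     { run: (src) => engine.compileAndRun(src) },
    chakra: { run: (src) => engine.execute(src) },
  };
  const impl = bindings[engine.name];
  if (!impl) throw new Error('unsupported engine: ' + engine.name);
  return { run: (src) => impl.run(src) };
}

// A fake "chakra" engine to exercise the adapter.
const fakeChakra = { name: 'chakra', execute: (src) => eval(src) };
const adapter = createEngineAdapter(fakeChakra);
console.log(adapter.run('1 + 2')); // 3
```

The coordination problem the comment mentions lives in the `bindings` table: every common operation must be agreed on and maintained per vendor.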

    • Twirrim 11 years ago

      Also, that's going to potentially add a ton of work for people developing libraries. With a single engine behind it, coding efforts can be focused on writing and profiling something to be fast on V8. If you get replaceable engines, developers now have to either: 1) ignore all but V8 (or SpiderMonkey, or Chakra, etc.); 2) create different versions of their libraries for different engines; or 3) write multiple code paths depending on the target engine.

      I'm not sure exactly where I sit on this one. In many regards I think I like it, allowing developers to use the right engine for the right job, for example. Each engine has its own strengths and it may be that v8 isn't the engine for you / your workload. It's just this could hurt as much as it helps.

      • bsimpson 11 years ago

        Using @zpao's example, most Python developers use CPython and either don't know about PyPy or have never used it. As such, I'd expect most of the long-tail of Python libraries to have been written and tested against CPython.

        That hasn't prevented PyPy, IronPython, JPython, etc. from existing/thriving.

      • angersock 11 years ago

        Having a decent shim to code against instead of having to deal with the internal gooiness of different JS engines is actually very much preferred. By throwing a shim over the JS execution layer, they actually make the life of extension developers much easier.

  • magicalist 11 years ago

    > They're doing this to be able to run Node.js apps on Windows 10 ARM (on which V8 supposedly doesn't run?)

    V8 definitely has an ARM runtime, so maybe this is a result of the restrictions on what's allowed to run on the platform? (e.g. iOS and Windows Phone don't allow JIT compilers except the ones provided by the platform itself)

  • themgt 11 years ago

    I wonder how difficult it will be to manage pull requests for merging against node.js & io.js simultaneously - also, it creates an interesting political situation if one project accepts the PR but the other doesn't (I would guess io.js will be more eager to do a release against the new code)

  • silverwind 11 years ago

    The question is why node/io.js should take such a pull request. It adds support for an irrelevant platform and would add the burden of maintaining the 'wrapper', which will probably break on every v8 update.

    • rlidwka 11 years ago

      If it adds an engine abstraction layer which would allow users to choose an engine (and somebody implements SpiderMonkey support there), sure why not.

      Though this is a job that is already done by JXCore.

bhouston 11 years ago

It would be cool if the JavaScript engine were interchangeable in Node.js and io.js so you could pick Chakra, V8 or SpiderMonkey very easily. SpiderMonkey is faster than V8 these days on a lot of benchmarks.

  • petercooper 11 years ago

    For anyone who wants to run SpiderMonkey with Node, JXcore is a Node fork that enables this.

  • consptheorist 11 years ago

    I can attest to that.

    I was experimenting a couple of days ago with heavy-duty, extreme DOM node crunching, and FF's SpiderMonkey blew Chrome's V8 out of the water with a 7-9x gain in performance, measured in time elapsed to complete the operations.

    Chrome's V8 engine at this point is so overrated

    • esailija 11 years ago

      How do you live in such blissful ignorance, where a supposed order-of-magnitude difference in performance between two state-of-the-art browsers doesn't make you reconsider, even for a second, that you might not have written a working benchmark? :)

      Sorry but unless you have deep knowledge of how both engines and browsers work (knowing how `appendChild` is actually implemented for starters), you simply cannot write a working benchmark. Even then, it's very hard and tedious.

      If you don't have time to obtain such expertise, you could take a shortcut and compare a realistic end-to-end benchmark. E.g. if your game runs at 210-270 fps in Firefox but only at 30 fps in Chrome, then you could claim that "firefox blows chrome out of the water".

      It's very easy (just look at 80%+ of jsperfs) to construct benchmarks that don't look completely broken to the untrained eye but actually are. The common theme is the benchmark missing many aspects of realistic code and being reduced to measuring irrelevant optimizing-compiler features. For example, the benchmark could end up measuring only how thorough the engine's dead-code-elimination pass is, even though what you wanted to benchmark was string concatenation performance.

    • magicalist 11 years ago

      > I was experimenting a couple of days ago with heavy-duty, extreme DOM node crunching, and FF's SpiderMonkey blew Chrome's V8 out of the water with a 7-9x gain in performance, measured in time elapsed to complete the operations.

      This likely has nothing to do with the JS engines themselves and everything to do with the browser they were running in. To actually benchmark something like that you'd need to simulate the dom with something like https://github.com/tmpvar/jsdom

      • consptheorist 11 years ago

        Are you suggesting that the remarkable disparity in performance was DOM specific?

        Strange, because I used a very common method, appendChild(), and I was under the impression that both browsers had optimized their respective inner workings long ago to the point that we should not notice such divergence in performance.

        • dpe82 11 years ago

          Yes. DOM manipulation in all major browsers is implemented in C/C++. The JS engine is just a wrapper; any noticeable performance difference in DOM manipulation is almost certainly due to differences in the underlying layout engine and not in the JS engine.

          • gsnedders 11 years ago

            Well, there's one way in which the JS engine affects it: how efficiently one can call into C++ from JS. Mozilla has done a lot of work to reduce the cost of that in SpiderMonkey.

            • dpe82 11 years ago

              Sure, though in the grand scheme of things that penalty is pretty small when compared to the DOM operation itself. Eg. doing an appendChild() on an attached element and causing a reflow.

              • gsnedders 11 years ago

                It depends a lot on what you're doing - if you're hitting fast paths (esp. if you're dealing with out-of-tree nodes), it's entirely possible to end up with the JS/C++ trampoline being a significant part of the bottleneck, for much the same reasons that Array.prototype.reduce can be in several implementations.

    • agumonkey 11 years ago

      This reinforces my feeling that most of the issues with Firefox are in the GUI framework.

  • TazeTSchnitzel 11 years ago

    Given node.js supports native code modules I can't see why it'd matter, but SpiderMonkey has a special asm.js compilation unit, OdinMonkey, giving best-in-class asm.js speed.
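
For illustration, here is a minimal asm.js-style module - engines without an asm.js compiler still run it as ordinary JavaScript, while SpiderMonkey's OdinMonkey could ahead-of-time compile code marked with the "use asm" pragma:

```javascript
// A tiny asm.js-style module: the "use asm" pragma plus |0 coercions
// tell a supporting engine that everything here is 32-bit integer math.
function AsmAdder(stdlib) {
  'use asm';
  function add(a, b) {
    a = a | 0;        // coerce arguments to int32
    b = b | 0;
    return (a + b) | 0;
  }
  return { add: add };
}

const mod = AsmAdder(globalThis);
console.log(mod.add(40, 2)); // 42
```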

  • cleverjake 11 years ago

    And Chakra is currently faster than both on the Octane and JetStream benchmarks.

cpeterso 11 years ago

Interesting that Microsoft forked Node instead of io.js. The Microsoft repo says, "This branch is 16 commits ahead, 29 commits behind joyent:master".

  • jjcm 11 years ago

    I work for MS, right now we're working on win10. Some of the UI is written in html/js now, so I'm not surprised by this at all. I'm guessing we'll see some native node.js apps on windows in the future.

    • untog 11 years ago

      I think what the OP meant was that it is interesting that MS forked Node, and not the io.js project that's more advanced than Node.

      IMO it's not too surprising - it's relatively simple to fast-forward to io.js from where it is now, whereas reverse-engineering it backwards to Node compatibility would be mayhem.

cmwelsh 11 years ago

Can I browse my node_modules folder in Explorer yet? [1]

[1] https://github.com/joyent/node/issues/6960

  • dyscrete 11 years ago

    Wow, what's up Microsoft?

    • pionar 11 years ago

      Actually, I was thinking, what's up node? Each dependency keeps a private copy of its dependencies? How messed up is that? Or am I just reading that wrong?

      • bryanlarsen 11 years ago

        That's a feature, not a bug. It lets A rely on version 0.9 of X while B relies on version 0.8.
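
A toy sketch of why the nesting gives that isolation. The lookup below is a heavily simplified, in-memory stand-in for Node's real node_modules resolution (package names and versions are made up):

```javascript
// Each package's own node_modules is searched first, so A and B each
// see the version of X they shipped with.
const tree = {
  'node_modules/A/node_modules/X': '0.9.0',
  'node_modules/B/node_modules/X': '0.8.0',
};

// Resolve a dependency from inside a package by checking its own
// node_modules before falling back to the root - a simplified version
// of Node's lookup order.
function resolveFrom(pkgDir, dep) {
  const local = pkgDir + '/node_modules/' + dep;
  return tree[local] || tree['node_modules/' + dep];
}

console.log(resolveFrom('node_modules/A', 'X')); // 0.9.0
console.log(resolveFrom('node_modules/B', 'X')); // 0.8.0
```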

        • pionar 11 years ago

          I don't call that a feature. There are other ways of doing version pinning without junking up my filesystem.

          • tdicola 11 years ago

            Regardless of who is right or wrong, at the end of the day it's still broken for many people. If Microsoft and Joyent/the node & npm community care about people using node & npm on Windows, then the issue needs to be resolved, period.

        • ksherlock 11 years ago
          • nemothekid 11 years ago

            I don't see how this is npm's problem - the language itself doesn't let you catch this problem - and the pattern itself is dangerous. I'd have to agree with IsaacSchlueter here.

            If I have a library (libA) that uses fooV1 and another (libB) that uses fooV2, there is no reason I should expect those two dependencies to interop with each other (unless explicitly stated by the developer).

            The proposed solution (that there should be one set of dependencies) creates an even worse problem, one that currently afflicts the Golang community, which has decided to solve it the same way npm did: vendoring.

            Now with Golang, if you only have one set of dependencies and one of your libraries depends on an older version, your program just won't compile (unless you update the library to use fooV2), and I'm not sure how that's useful to anyone. Most people will just tell you that you should have versioned that dependency.

            • spion 11 years ago

              In the vast majority of cases, it's not. But in the cases where people want to invent new globally useful abstractions, like bignum, promises, cleaner streams, etc. - i.e. to reshape the platform - they can't do that. For example, the overhead of adding bluebird as a dependency to every module quickly adds up, and bluebird had to fight with many issues stemming from multiple versions of it communicating with each other.

              IMO while this has contributed to the growth of node the ecosystem, it also stifled the growth of node the platform.

        • efdee 11 years ago

          No, it's a "bug". It is not required for loading specific versions -- those can be kept in a flat structure just as well. No need for the idiotic nesting.

          • bryanlarsen 11 years ago

            IMO a flat structure would just be a confused mess. There would be hundreds of packages in that directory.

            • efdee 11 years ago

              I'd rather have hundreds of packages in a well-sorted flat directory than the same number of packages in an "Alice in Wonderland"-style rabbit hole.

        • espadrine 11 years ago

          Having node_modules/X@0.9/ and node_modules/X@0.8/ would allow it as well. Yet each dependency would not keep a private copy of its own dependencies.

        • kuschku 11 years ago

          Unless you create file paths that are so long that they break the default tools of the OS. That’s the point where you went too far.

          • pmontra 11 years ago

            Or the OS went too short, so to say. There are two problems here.

            1) Windows' limit of 260 characters per path name is incredibly small for a modern OS and that was true many years ago as well.

            2) Npm's approach relies on well-behaved OSes and file systems allowing long path names. It's not future-proof, because you don't know which OS and file system you'll need to run on - maybe some limited embedded device.

            Ruby's rvm solved that problem with a single directory level and a gem@version naming scheme.
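
Back-of-the-envelope arithmetic for the collision between the two problems above - every nesting level adds 'node_modules/' plus a package name, so even modest names overrun 260 characters within a dozen levels (the base path and package name here are just examples):

```javascript
// Each level adds 'node_modules/' (13 chars), a separator, and the
// package name, on top of the project's base path.
function nestedPathLength(base, pkgName, depth) {
  return base.length + depth * ('node_modules/'.length + pkgName.length + 1);
}

const base = 'C:/Users/someone/projects/app'; // 29 characters
console.log(nestedPathLength(base, 'some-package', 9)); // 263: over the 260 limit
console.log(nestedPathLength(base, 'some-package', 5)); // 159: still fine
```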

            • kuschku 11 years ago

              Exactly. Ruby’s solution will run even on FAT12 (which has only single-level directories), while npm isn’t really compatible.

              In general, node does this quite often – their memory model also assumes they have infinite memory.

      • tracker1 11 years ago

        npm 3 will adjust dependencies to flatten these things out as much as possible, only creating nested dependencies when there's a version conflict between modules.
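
A rough sketch of that flattening rule (illustrative only, not npm 3's actual algorithm): hoist each dependency to the top level, and keep a nested copy only when a different version is already hoisted:

```javascript
// Flatten a dependency list: first version of each package wins the
// top-level node_modules; conflicting versions stay nested under
// their dependent.
function hoist(deps) {
  const top = {};    // name -> version at the node_modules root
  const nested = {}; // "parent/name" -> version kept nested
  for (const { parent, name, version } of deps) {
    if (!(name in top)) {
      top[name] = version;
    } else if (top[name] !== version) {
      nested[parent + '/' + name] = version;
    }
  }
  return { top, nested };
}

const { top, nested } = hoist([
  { parent: 'A', name: 'X', version: '0.9.0' },
  { parent: 'B', name: 'X', version: '0.8.0' },
  { parent: 'B', name: 'Y', version: '1.0.0' },
]);
console.log(top);    // { X: '0.9.0', Y: '1.0.0' }
console.log(nested); // { 'B/X': '0.8.0' }
```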

  • rational-future 11 years ago

    With every new version of Windows, Explorer gets worse and worse. I personally switched to Directory Opus a long time ago.

    • orik 11 years ago

      This isn't a limitation of Explorer, but actually a very well-documented part of the Windows API. MAX_PATH has always been 260 characters.

      https://msdn.microsoft.com/en-us/library/aa365247(VS.85).asp...

      • frik 11 years ago

        How long will Microsoft wait until they fix MAX_PATH and the various other Win32 limitations that are usually legacy porting helpers left over from Win16? The introduction of the Win64 API would have been a good opportunity, but Microsoft apparently forgot about it and was busy with something else.

        • CHY872 11 years ago

          I doubt it's too high up on the list of priorities. It would require really careful work to work in a backwards compatible way (there are almost certainly a tonne of apps that expect <=260 character filenames).

          I guess the main thing is that it's one of those 'who cares' problems. The only time I've ever seen this limitation being complained about, it's by people who've had problems with npm.

          That directory structure is undoubtedly horrible and is not mirrored by any other piece of software that I've seen.

          • jessaustin 11 years ago

            Meanwhile it's also a "who cares?" problem for the npm people. They don't use this OS, and don't expect ever to do so.

            I think the node_modules directory structure is a novel, though straightforward, solution to the problem of interdependent module versioning. It's very Unix; it reminds me a bit of GNU stow. It makes perfect sense that it will be tweaked in the new npm to be less redundant, but it only really makes sense to tweak systems that already work perfectly. (Otherwise they should be fixed first, then tweaked.) Certainly it's better than having a separate LD_LIBRARY_PATH setting for every command invocation! (even that doesn't fix everything...)

            • xienze 11 years ago

              It doesn't even seem like an obvious solution to the problem. Take Maven for instance: a global dependency repository under which the dependencies are stored in the form <groupId>/<artifactId>/<version>. If X depends on Y.1 and Z depends on Y.2, so what? You have all the dependencies stored on your filesystem in a relatively flat structure.

              • jessaustin 11 years ago

                If I had coded in Java on Windows for years, I wouldn't trust my sense of what's "obvious". I'm not too impressed by a "global" repository either. Python struggled against that stupid architecture for years before they got virtualenv in good working order. Node just took a shortcut to the future.

                "All direct dependencies are in the node_modules directory, full-stop" is a pretty simple rule. Really, that's the only rule, and that's all of it. You don't need to worry about second-order dependencies, because those are direct dependencies of some other module, which means... they are in that module's node_modules directory.

          • ygra 11 years ago

            I had this problem with a Java codebase in SVN (back when your user profile was still under C:\Dokumente und Einstellungen\Username\...).

            Apart from that there is a backwards-compatible way of using longer paths, which is prefixing with \\?\. Since MAX_PATH is a hard-coded constant there can only be an opt-in way of dealing with the problem. Sadly many application or framework developers these days still don't opt in.

            It also creates the problem that if Application A can create such paths and Application B cannot read them, you'll be annoyed too. And Application A might just disable long path support to mitigate the problem, leaving the whole state as it is.
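
The opt-in mechanism looks roughly like this - a hypothetical helper for illustration (real code should also normalize to an absolute path with backslashes, which this sketch skips):

```javascript
// Windows' Unicode file APIs accept paths longer than MAX_PATH when
// they carry the \\?\ extended-length prefix. In a JS string literal,
// '\\\\?\\' encodes the four characters \\?\.
function toExtendedLengthPath(p) {
  if (p.startsWith('\\\\?\\')) return p; // already prefixed: leave as-is
  return '\\\\?\\' + p;
}

console.log(toExtendedLengthPath('C:\\very\\deep\\path'));
// prints \\?\C:\very\deep\path
```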

          • dspillett 11 years ago

            > The only time I've ever seen this limitation being complained about, it's by people who've had problems with npm.

            Recently that is where you'll have heard about it a lot, but people have been hitting the problem for years. I've hit it in a couple of distinct contexts in my time.

            You hardly hear about it generally because people moan a bit then work around it so the problem goes away until next time, and sometimes people don't even moan about it because it is such an old problem they just think "oh, that again, better make my directory/file names and/or paths a bit shorter" and get on with their day.

            The difference between this and npm in recent times (causing the greater noise around the issue) is twofold:

            * Linux/similar people hitting the problem in the wild for the first time (they've probably heard about it, but never had to deal with it), because their code is getting used on Windows - a much rarer occurrence for them in the past - and being incredulous that such a problem exists in this decade

            * Windows people wanting to use node and similar cool new tech on their preferred and/or mandated platform, but hitting issues due to this problem in their environment (and in some cases being incredulous that someone wouldn't consider the implications of the limitation in their design)

          • noinsight 11 years ago

            > I guess the main thing is that it's one of those 'who cares' problems. The only time I've ever seen this limitation being complained about, it's by people who've had problems with npm.

            If you work on the sysadmin side of things, this comes up almost every time with directories on network shares. Explorer, cmd and PowerShell can't handle them and you need to resort to using robocopy or 3rd party tools like FastCopy. To delete stuff you often have to resort to hacks like robocopy mirroring an empty directory into the path before deleting it.

      • efdee 11 years ago

        It is a limitation of Explorer and CMD.EXE - Windows has had Unicode APIs to access paths up to 32,767 characters, but for some reason Explorer and CMD.EXE use the older ANSI API, which does not support them.

    • chimeracoder 11 years ago

      > With every new version of Windows, Explorer gets worse and worse.

      This criticism is a bit misplaced here, since the whole reason for this limitation is due to backward compatibility with older versions of Windows (and software written for older versions).

      It's not like this is a new change in Windows; it's been there for ages.

cpeterso 11 years ago

JXcore is another fork of Node that makes the VM pluggable, supporting both V8 and SpiderMonkey. I wonder how similar Microsoft's and JXcore's VM abstraction layers are and whether Node upstream would accept them. Drawing a hard line between the Node native code and the VM would make binary addon compatibility more stable (and lessen the need for NaN, the "Native Abstractions for Node").

  • streamline92 11 years ago

    The problem with the way JXcore did the SpiderMonkey port is their extensive use of C++ macros - not unlike NAN for Node.js. This makes the code hard to debug and maintain. The Microsoft Chakra Node port is more elegant because they've mimicked the V8 C++ API, making it much more likely that it will be merged into Node.js and io.js. In time I suspect Mozilla and other JavaScript engine vendors will make V8-compatible API shims similar to what Microsoft did:

    https://github.com/Microsoft/node/tree/ch0.12.2/deps/chakras...

htilford 11 years ago

This was a long time coming. I remember some MS folks talking to Ryan Dahl about doing this back at nodeconf 2011.

Ezhik 11 years ago

I wonder if MS is going to end up open sourcing Edge? Between this and the fact that Visual Studio Code uses Chromium, it really seems like where MS should head, but who knows.

  • bastawhiz 11 years ago

    I'd expect them to open source various components of it before they open source the whole shebang. I.e., I'd expect to see them put Chakra out, maybe the browser chrome, the parsers, etc. before they put out all of edgehtml.dll. I could be wrong.

    Edge (and particularly IE) is fairly heavily tied to the OS in a bunch of places. IE, for example, can do weird FTP and Windows Explorer stuff. The infamous "Internet Settings" dialog and the way IE deals with stuff like proxy servers are only sort-of part of IE. IE's network stack is largely dependent on the bits and pieces available in the OS below (consider that IE11 can only use SPDY on Windows 8). I wouldn't be surprised if open sourcing the browser wholesale would start unraveling a lot of things that MS doesn't intend to be public.

    • Ezhik 11 years ago

      I wonder if making it standalone will be a part of the whole 'ditching IE legacy' process?

Hansi 11 years ago

Is there a benchmark comparison available anywhere?

frik 11 years ago

What about the license of Node.js/Chromium? Isn't linking a closed-source library (Chakra) problematic?

You know it includes multiple code parts under various licenses, Wikipedia says: BSD license, MIT License, LGPL, MS-PL and MPL/GPL/LGPL tri-licensed ( http://en.wikipedia.org/wiki/Chromium_(web_browser) )

There is a reason why major open source projects like Linux, etc. choose licenses like GNU GPL v2+. http://en.wikipedia.org/wiki/Embrace,_extend_and_extinguish and http://en.wikipedia.org/wiki/Fear,_uncertainty_and_doubt

  • poizan42 11 years ago

    It's already linked against tons of closed source libraries when built on windows, why would one more make a difference?

  • chc 11 years ago

    Based on that license list, I don't see how any of those would present a problem. Do you?

  • coldtea 11 years ago

    >Isn't linking a closed source library (Chakra) problematic?

    Yeah, because the tech world didn't have enough problems with projects being immature, unreliable, stale 30+ year designs, abandoned, incompatible, not provided by a specific distribution, conflicting, patented, and 100 other issues to consider.

    It just had to also add 200 legal distinctions around what you can and cannot do, and how you can link stuff and under what circumstances.

    • magicalist 11 years ago

      If you're asking if software licenses are important, the answer is yes, they're very important.

      To the GP, though, I don't immediately see how linking to Chakra in this way would be a license issue. The more important thing is the license information for Node, though, not Chromium: https://github.com/joyent/node/blob/master/LICENSE (some overlap but quite a bit that doesn't)

      • coldtea 11 years ago

        >If you're asking if software licenses are important, the answer is yes, they're very important.

        I'm not asking about their importance; I'm complaining about their existence (and the need for them).

rational-future 11 years ago

Will this work on Raspberry Pi?

z3t4 11 years ago

I wonder if they have made it so that you can use millisecond (1/1000 s) resolution instead of 1/100 s in setTimeout and setInterval. It was one of the things that annoyed me the most running Node.js on Windows ...
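
What that complaint amounts to, sketched as arithmetic (this models clock quantization in general, not Node's actual Windows timer implementation):

```javascript
// If the underlying clock ticks at a coarse interval, short timeouts
// get rounded up to the next tick.
function effectiveDelay(requestedMs, tickMs) {
  return Math.ceil(requestedMs / tickMs) * tickMs;
}

console.log(effectiveDelay(4, 10)); // 10 on a 1/100 s clock
console.log(effectiveDelay(4, 1));  // 4 on a 1/1000 s clock
```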

flipchart 11 years ago

I wasn't aware that Chakra could be used standalone in this manner. I had assumed that it was tightly bundled to IE. Is this new?

pluma 11 years ago

I know Microsoft apparently has changed, but just to play devil's advocate:

[x] embrace

[x] extend

[ ] extinguish

  • chralieboy 11 years ago

    They are very open that this is not their plan here. They are _temporarily_ forking Node.js to add support for their JS engine. They want to extend Node to abstract away the JS engine so that it doesn't rely on V8 or Chakra or SpiderMonkey but can sit on any one of them.

    It's actually exactly the opposite. Their API abstraction work will only increase competition, especially since they aren't trying to run a competing fork. Despite their history, those in favor of a more open Node.js platform should commend this.

  • nivla 11 years ago

    To play the devil's advocate of the devil's advocate, how exactly would an [x]extinguish work in an open source world especially for something that is under a liberal licence (MIT vs GPL)? Isn't that the whole point of open source? That if even something gets abandoned or ignored, as long as there is still an active interest in it, it can still be used or improved upon?

    • pluma 11 years ago

      You're probably aware I'm not considering it to be likely that this is Old Microsoft in action but to humour the thought experiment: I don't think they could succeed either. IE is still only barely recovering from Microsoft's history and Windows has largely been defeated by OSX both in the consumer and developer space. We're unlikely to see Microsoft Space Nazis descend upon us from a hidden moon base any time soon.

      That said, there are plenty of examples of the extinguish phase not working out or resulting in less of a bang and more of a whimper. It's always been more of an infected blanket than nuclear warheads.

  • jbigelow76 11 years ago

    Extinguish Javascript? Good luck with that.

ilaksh 11 years ago

I don't see how anyone can rationally trust Microsoft here given their track record.
