Show HN: Million 3 – Optimizing compiler for React

million.dev

97 points by aidenyb 2 years ago · 55 comments

lukevp 2 years ago

I met Aiden (the < 20 yo who started Million) a year or so ago. He presented about Million in front of a room full of 40+ grizzled JS devs. I don't personally see any reason to use Million JS, React is fast enough as it is if you memoize and use selectors correctly. Aiden said some similar things at the time (a solution in search of a problem, that got unexpectedly popular) but I gotta say, he's a hype man for sure. I wish him luck, I think if he made a more compelling library, it would be a rocket ship with his marketing. I do think he should take some hints from the other post regarding the deceptive benchmarks and make sure he can back up his marketing materials, but 14k stars on GitHub for something that (to me) seems pretty useless is truly bananas skill.

  • quickthrower2 2 years ago

    But if it works this is one of those “chuck this in and see if you get a speedup” things, a bit like a platform upgrade for Java or .NET or moving to a new cloud SKU. Devs and managers love this kind of thing. No code just chuck it in and stuff goes faster (assuming that is how it works).

  • paulddraper 2 years ago

    Yeah, memo() solves 98% of cases
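
    For what it's worth, a minimal sketch of memoizing a child component (the names here are illustrative):

       import React from "react";

       // Re-renders only when `items` changes by reference, not every time
       // the parent re-renders.
       const ItemList = React.memo(function ItemList({ items }) {
         return (
           <ul>
             {items.map((item) => (
               <li key={item.id}>{item.label}</li>
             ))}
           </ul>
         );
       });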

    • jitl 2 years ago

      To me this kind of assessment can read like a C dev saying something like “snprintf solves 98% of buffer overflows” or something. Yes, every React application can be performant if coded carefully by a small team of experts, and every C program can be secure if coded by a small team of experts, but often applications are built by a large team of median engineers, so tooling to make median code automatically 90th percentile could be well worth it.

      I’m not saying Million is that tool, just that such dismissal of the problem addressed rings a little hollow.

  • Existenceblinks 2 years ago

    I feel like this is all missing the point. The point is React is mid. Engineering metrics are very strange these days. A guy has made your tool wicked fast, and it gets dismissed - "fast enough".. wat?

zackify 2 years ago

Or people could structure their react components efficiently and be fine as is.

I’m sure I’ll get a lot of downvotes for that.

But after 10 years of using React, most performance issues were really just poor state management in the wrong locations, not the virtual DOM holding it back.
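
A hedged sketch of what "state in the wrong location" can look like (the components here are made up):

   import { useState } from "react";

   // Stand-in for any costly child component.
   function ExpensiveTable({ rows }) {
     return <table>{/* ...expensive render of `rows`... */}</table>;
   }

   // State only the input needs, held in the same component as the heavy
   // table: every keystroke re-renders ExpensiveTable as well.
   function Page({ rows }) {
     const [draft, setDraft] = useState("");
     return (
       <>
         <input value={draft} onChange={(e) => setDraft(e.target.value)} />
         <ExpensiveTable rows={rows} />
       </>
     );
   }

   // Moving the draft state into its own small component (or wrapping the
   // table in React.memo) keeps those re-renders local to the input.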

  • ficklepickle 2 years ago

    > poor state management in the wrong locations

    Can you expand on that? I've never hit perf issues with react but I've been curious how it commonly happens.

agluszak 2 years ago

> Instead of traversing every node, Million uses a compiler to directly update dynamic nodes, resulting in O(1) time complexity.

This sounds very hand-wavy. What does it mean to "use a compiler to directly update dynamic nodes"?

  • nardi 2 years ago

    It means you compile-in a direct reference to the node that needs to be updated when some property changes, so instead of searching the tree of n nodes to find it, you already have the reference.
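
    Roughly this shape, as an illustrative sketch (not Million's actual output):

       // Build the static structure once...
       const template = document.createElement("template");
       template.innerHTML = `<div>Price: <span data-dyn="price"></span></div>`;
       const root = template.content.firstElementChild;
       // ...and keep a direct reference to the one dynamic node.
       const priceNode = root.querySelector("[data-dyn='price']");

       // Updating is then a single write, independent of tree size, instead
       // of diffing every node to find what changed.
       function patch(data) {
         priceNode.textContent = String(data.price);
       }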

Hrun0 2 years ago

OP lied about benchmarks in the past:

https://www.reddit.com/r/javascript/comments/x2iwim/askjs_mi...

  • aidenybOP 2 years ago

    hi, this is aiden

    around 2 yrs ago, i messed up benchmarks with million v1. i'm sorry about putting out false information. once the reddit post came out i stopped working on/advertising million entirely. i then spent 3 months redesigning the entire library. i tried my best to make it as fast and accurate as possible. benchmarks are real now, see here: https://krausest.github.io/js-framework-benchmark/current.ht...

  • brailsafe 2 years ago

    People make mistakes, and should hopefully learn from them, but it's not right to comment with nothing but dug up dirt, especially with such a negative accusation of malice, while they try and move on.

    • hitekker 2 years ago

      The GP said the OP misrepresented their benchmarks before. It's good context and the rest of the Reddit thread is also informative.

      "Lie" is a bit edgey but I think adults should be able to stomach a little sourness instead of, ironically, accusing people of dirt and malice.

  • chrismorgan 2 years ago

    > OP lied

    Lying is a deliberate action. The thread you link to seems rather to expose accidental bad benchmarking and poor communication/explanation of results, probably in good faith and due to inexperience.

    • Hrun0 2 years ago

      It seemed deliberate to me because it happened more than once. But he apologized, and as far as I'm concerned, that's that.

698969 2 years ago

Million's optimizations are only relevant if you're rendering a large number of identical stateless components (exactly like JS Framework Benchmark).

Real world applications are mostly deep trees of stateful components.

c-smile 2 years ago

How does M3 compare with SvelteJS?

Seems like these are two conceptually similar things.

> React traverses the virtual DOM tree to update the UI, resulting in O(n) time complexity.

That's the worst case, on initial load. On most UI changes nothing stops React from updating only local portions of the tree - elements whose state has changed.

Educated guess: in both the Million and React cases, I think the major bottleneck is in the browser's re-layout mechanism, not on the JS side.

  • acdha 2 years ago

    > On most UI changes nothing stops React from updating only local portions of the tree - elements whose state has changed.

    In practice, it still re-renders a lot. It’s easy to get a significant performance increase by not using React (usually at least one order of magnitude) - browsers have improved a lot and what looked like an optimization for IE6 is largely overhead now.

    • c-smile 2 years ago

      > In practice, it still re-renders a lot.

      Neither framework will fix programmer's errors.

      This

         let i = 0;
         for (const child of container.children)
            child.innerHTML = getHtmlFor(i++);
      
      will always be slower than this:

         let html = "";
         for (let i = 0; i < N; ++i)
            html += getHtmlFor(i);
         container.innerHTML = html;
      
      N transactions versus 1 transaction.

      I mean that compiling to N separate "small" DOM access calls is not always faster than one integral update.

      • acdha 2 years ago

        My point was that that is enormously slower than storing a reference to the elements when you create them and setting only the specific attributes which changed. innerHtml string parsing forces the browser to do too much work unless you’re literally changing everything. Each time I’ve replaced that with DOM operations it’s been an order of magnitude or more performance gain - on one project, getting rid of fully-optimized React was 4 orders, but I think they’ve optimized it enough since then that it’s only 1-2.
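
        Something along these lines, as an illustrative sketch (`container` and `rows` are assumed to already exist):

           // Create the nodes once and keep the references...
           const cells = rows.map((row) => {
             const li = document.createElement("li");
             li.textContent = row.label;
             container.appendChild(li);
             return li;
           });

           // ...then later touch only the node that changed: no innerHTML
           // re-parse, no tree diff.
           function updateCell(i, newLabel) {
             cells[i].textContent = newLabel;
           }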

RyanHamilton 2 years ago

I've built a real-time updating app using React and I'm struggling to see the benefit here. For the very core parts I'm already avoiding the React overhead by using useRef to maintain the same object. This reduces "hydration" and traversing to nil cost, with no new concepts to learn. Why would I use Million?
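
Roughly this pattern, sketched here with an assumed `feed.subscribe()` API:

   import { useEffect, useRef } from "react";

   // High-frequency updates are written straight to the DOM node through a
   // ref, so they never trigger a React re-render or reconciliation pass.
   function Ticker({ feed }) {
     const priceRef = useRef(null);
     useEffect(() => {
       const unsubscribe = feed.subscribe((price) => {
         if (priceRef.current) priceRef.current.textContent = String(price);
       });
       return unsubscribe;
     }, [feed]);
     return <span ref={priceRef} />;
   }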

  • ricardobeat 2 years ago

    React is still doing reconciliation for your full element tree, even if you minimize updates. The approaches are not really comparable.

frfl 2 years ago

I haven't used React in a while and never used SolidJS, but wouldn't SolidJS basically be an optimizing compiler for "React" as well? Technically SolidJS is a separate framework and I don't know if it's a 100% drop-in replacement like this may be.

Edit: My comment is probably not accurate. Please ignore what I said.

  • impulser_ 2 years ago

    SolidJS and React are on the opposite ends of the JS framework worlds.

    Solid doesn't use a virtual DOM like React. Solid is close to Svelte; Vue is in the middle, leaning more towards Solid and Svelte today, and it will eventually sit alongside them once Vapor comes out; React is on the other end.

    Solid and React do feel similar to the developer because they both use JSX and similar APIs, but they aren't compatible.

    • frfl 2 years ago

      Yeah, I meant "optimizing compiler" in the sense that it takes in JSX and outputs optimized code

      • Aeolun 2 years ago

        Solid doesn’t have a compiler as I recall? (Other than JSX)

        • frfl 2 years ago

          Sorry, my mistake. I misunderstood Solid as working the same way as Svelte, with a compiler.

          • worble 2 years ago

            I also understood it that way, and the FAQ seems to imply that it does.

            https://www.solidjs.com/guides/faq

            > We have a compiler working for you to give you optimal native DOM updates, but you have all the freedom of a library like React.

photon_collider 2 years ago

This looks really cool! It's interesting that this compiler provides performance optimizations by looking for data diffs rather than diffs in the Virtual DOM. Is this intended to be an alternative to React Forget (still in development)?

ggregoire 2 years ago

Never heard of this, sounds super cool! Gratz on the milestone!

Theoretically, could this be merged in the main React project or would this break something?

agluszak 2 years ago

Aiden seems to have hopped on the AI hype train, too.

https://million.dev/ai

voat 2 years ago

Ignore the haters. Million is a great project, and def provides a great solution to speeding up react.

tibbydudeza 2 years ago

Interesting - tried it but it did not like me using decorators.

quickthrower2 2 years ago

Anyone knows how this compares with Svelte’s approach?

gothink 2 years ago

I'm not a React dev, so I can't comment on the project itself. Something I noticed on the blog post, though: The image at the top of the page is served uncompressed at a whopping 18.5MB (9751px * 6132px)! Seems a bit extreme for what amounts to a simple logo and some text.

  • Hrun0 2 years ago

    Nothing a couple of rocket emojis won't fix :D

  • ricardobeat 2 years ago

    Ouch. It’s also downloaded twice (in Safari at least), putting the total page weight at almost 40MB.

  • lifthrasiir 2 years ago

    And the file has a very, very faint gradient background with heavy dithering, so lossless compression won't actually help that much (ECT was only able to shave 13% of the file after minutes). In fact it is one of the worst imaginable cases for raster image formats.

  • blackoil 2 years ago

    18 MB may be odd, but most pages now have multiple images, making each page tens of MB. That makes all the discussions about shaving a few hundred KB off some framework funny.

  • paulddraper 2 years ago

    Yeah for all the complaints about an extra 500Kb of JS, the real offenders are often images.

    • quickthrower2 2 years ago

      But don’t forget you need to download and parse the JS and load that into memory. Plus depending on what it does it may be battery intensive.

  • quickthrower2 2 years ago

    Looks like a job for pure css, or if you are feeling brave, svg!

  • aidenybOP 2 years ago

    Holy crap you're right, that was a VERY large image, fixed: 18.5MB -> 21KB

spyke112 2 years ago

Kudos to the team, but why on earth should I choose React when we've now reached a point where it needs an optimization compiler? Seems silly, to be honest.

  • chrisco255 2 years ago

    It doesn't need one; probably less than 1% of React apps make use of this tool. It's an optimization that would be premature for most apps unless they've got large numbers of dynamic onscreen components.

    • wredue 2 years ago

      It has absolutely needed significant performance improvements for years. Vanilla JS is somewhere in the ballpark of 30x (x, not percent) faster than React. If that's not calling for significant performance optimization, I don't know what is.

      Nobody has ever complained about their app feeling better to use because it performs better.

      Users do not sit there like “man, I really wish the web was SLOWER”.

      Additionally, assuming this lives up to the claims, or even lives up to a quarter of the claims, then the optimization is, by definition, NOT premature. Premature optimization is the act of optimizing before you even know if something is slow, or before you measure.

      I suppose you are probably working under the Functional Programming definition of “premature optimization” where they tell you to never measure (because it just makes FP look bad).

      • kaba0 2 years ago

        The one-order-of-magnitude difference is on some todo-app benchmark with mass updates and the like; it is not really representative of most real-world applications, and you might as well just use an escape hatch in plain JS for certain parts of your site if you expect/measure significant slowdown from React itself.

        It is absolutely not significant compared to site load, images, initial DOM layouting, etc. Plus your FP paragraph is straight up uninformed flame-bait.

        • wredue 2 years ago

          >your FP paragraph is uninformed flame-bait

          It’s weird how functional fans always tell you something, and then the moment you start calling them on it, they say “no you!”

          According to functional programmers, all optimization is premature optimization, unless your program is “noticeably slow”. What is “noticeably slow” you might ask, and the answer is: nobody knows!

          Being able to type faster than VSCode registers your keystrokes is not “noticeably slow”. This counts as “fast enough”.

          Processing 10,000 lines of data in 5 minutes is fast enough. Just make it a task and toss it on a highly parallel cluster, then you won’t notice that this should take milliseconds, move on to the next thing.

          While you’re contemplating what “noticeably slow” means, let’s also toss up some articles claiming Haskell is faster than C, but not providing any evidence for said claim, contrary to the countless measurements demonstrating the falseness of this claim.

          >it’s not significant compared to

          Is that really what our counter argument is? That by the time your ill-conceived images download on a clean browser cache, your garbage code may have finished generating the DOM? You regularly build apps that you expect people to use once and fill said apps with large images?

      • blackoil 2 years ago

        > in the ballpark of 30x (x not percent) faster than react

        In real world apps or theoretical benchmarks?

  • root_axis 2 years ago

    The benefits of React have been belabored for a decade; if none of that reasoning makes sense to you, just don't use it.

  • klysm 2 years ago

    Ecosystem and ability to hire folks who are already competent. Sure it's not the most technically optimal framework but you can build great products with it.
