Show HN: Million 3 – Optimizing compiler for React
million.dev

I met Aiden (the under-20-year-old who started Million) a year or so ago. He presented Million in front of a room full of 40+ grizzled JS devs. I don't personally see any reason to use Million JS; React is fast enough as it is if you memoize and use selectors correctly. Aiden said some similar things at the time (a solution in search of a problem that got unexpectedly popular), but I gotta say, he's a hype man for sure. I wish him luck. I think if he made a more compelling library, it would be a rocket ship with his marketing. I do think he should take some hints from the other post regarding the deceptive benchmarks and make sure he can back up his marketing materials, but 14k stars on GitHub for something that (to me) seems pretty useless is truly bananas skill.
But if it works, this is one of those "chuck this in and see if you get a speedup" things, a bit like a platform upgrade for Java or .NET, or moving to a new cloud SKU. Devs and managers love this kind of thing: no code changes, just chuck it in and stuff goes faster (assuming that is how it works).
Yeah, memo() solves 98% of cases
To me this kind of assessment reads like a C dev saying "snprintf solves 98% of buffer overflows". Yes, every React application can be performant if coded carefully by a small team of experts, and every C program can be secure if coded by a small team of experts, but often applications are built by a large team of median engineers, so tooling that automatically makes median code 90th-percentile could be well worth it.
I’m not saying Million is that tool, just that such dismissal of the problem addressed rings a little hollow.
memo() does not require a team of experts.
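It's literally a one-line wrap. A minimal sketch (hypothetical Row/List components):

    import React, { memo, useState } from "react";

    // Row re-renders only when its `label` prop changes,
    // not every time its parent re-renders.
    const Row = memo(function Row({ label }) {
      return <li>{label}</li>;
    });

    function List() {
      const [query, setQuery] = useState("");
      return (
        <>
          <input value={query} onChange={e => setQuery(e.target.value)} />
          <ul>
            {["alpha", "beta", "gamma"].map(label => (
              <Row key={label} label={label} />
            ))}
          </ul>
        </>
      );
    }

Typing in the input re-renders List, but each memoized Row bails out because its props are unchanged.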
I feel like this is all missing the point. The point is that React is mid. Engineering metrics are very strange these days: a guy has made your tool wicked fast, and it's dismissed as "fast enough"... wat?
Or people could structure their react components efficiently and be fine as is.
I’m sure I’ll get a lot of downvotes for that.
But after 10 years of using React, most performance issues I've seen were really just poor state management in the wrong locations, not the virtual DOM holding it back.
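A minimal sketch of what I mean (hypothetical components): fast-changing state kept too high in the tree drags everything under it into every re-render.

    import React, { useState } from "react";

    // Anti-pattern: every keystroke re-renders the whole page,
    // including the (hypothetical) expensive dashboard.
    function PageBefore() {
      const [query, setQuery] = useState("");
      return (
        <>
          <input value={query} onChange={e => setQuery(e.target.value)} />
          <ExpensiveDashboard />
        </>
      );
    }

    // Fix: push the state down so only SearchBox re-renders per keystroke.
    function SearchBox() {
      const [query, setQuery] = useState("");
      return <input value={query} onChange={e => setQuery(e.target.value)} />;
    }

    function PageAfter() {
      return (
        <>
          <SearchBox />
          <ExpensiveDashboard />
        </>
      );
    }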
> poor state management in the wrong locations
Can you expand on that? I've never hit perf issues with react but I've been curious how it commonly happens.
> Instead of traversing every node, Million uses a compiler to directly update dynamic nodes, resulting in O(1) time complexity.
This sounds very hand-wavy. What does it mean to "use a compiler to directly update dynamic nodes"?
It means you compile in a direct reference to the node that needs to be updated when some property changes, so instead of searching the tree of n nodes to find it, you already have the reference.
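Roughly like this - a sketch of the general technique, not Million's actual compiled output:

    // What a compiler might emit for <div>Hello, {name}!</div>.
    function mount(container, name) {
      const div = document.createElement("div");
      const text = document.createTextNode(name);
      div.append("Hello, ", text, "!");
      container.appendChild(div);

      // The update path writes the one dynamic node directly:
      // O(1), no tree traversal, no diffing.
      return function update(newName) {
        text.data = newName;
      };
    }

    const update = mount(document.body, "world");
    update("HN"); // only the text node is touched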
OP lied about benchmarks in the past:
https://www.reddit.com/r/javascript/comments/x2iwim/askjs_mi...
hi, this is aiden
around 2 yrs ago, i messed up benchmarks with million v1. i'm sorry about putting out false information. once the reddit post came out i stopped working on/advertising million entirely. i then spent 3 months redesigning the entire library. i tried my best to make it as fast and accurate as possible. benchmarks are real now, see here: https://krausest.github.io/js-framework-benchmark/current.ht...
Fair enough! Your levelheaded response gives me hope.
People make mistakes and should hopefully learn from them, but it's not right to comment with nothing but dug-up dirt, especially with such a negative accusation of malice, while they try to move on.
The GP said the OP misrepresented their benchmarks before. It's good context and the rest of the Reddit thread is also informative.
"Lie" is a bit edgey but I think adults should be able to stomach a little sourness instead of, ironically, accusing people of dirt and malice.
> OP lied
Lying is a deliberate action. The thread you link to seems rather to expose accidental bad benchmarking and poor communication/explanation of results, probably in good faith and due to inexperience.
It seemed deliberate to me because it happened more than once. But he apologized, and as far as I'm concerned, that's that.
Million's optimizations are only relevant if you're rendering a large number of identical stateless components (exactly like JS Framework Benchmark).
Real world applications are mostly deep trees of stateful components.
How does M3 compare with SvelteJS?
Seems like these are two conceptually similar things.
> React traverses the virtual DOM tree to update the UI, resulting in O(n) time complexity.
That's the worst case, on initial load. On most UI changes, nothing stops React from updating only local portions of the tree - the elements whose state changed.
Educated guess: in both Million's and React's case, the major bottleneck is inside the browser's re-layout mechanism, not on the JS side, I think.
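For example (a minimal sketch): state set inside a leaf component re-renders that component alone.

    import React, { useState } from "react";

    function Counter() {
      const [n, setN] = useState(0);
      // A click re-renders only Counter; siblings and parents are untouched.
      return <button onClick={() => setN(n + 1)}>{n}</button>;
    }

    function Page() {
      return (
        <>
          <HeavyStaticContent /> {/* hypothetical; never re-rendered by clicks */}
          <Counter />
        </>
      );
    }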
> On most UI changes, nothing stops React from updating only local portions of the tree - the elements whose state changed.
In practice, it still re-renders a lot. It’s easy to get a significant performance increase by not using React (usually at least one order of magnitude) - browsers have improved a lot and what looked like an optimization for IE6 is largely overhead now.
> In practice, it still re-renders a lot.
Neither framework will fix programmer's errors.
This

    let i = 0;
    for (let child of container.children) child.innerHTML = getHtmlFor(i++);

will always be slower than this:

    let html = "";
    for (let i = 0; i < N; ++i) html += getHtmlFor(i);
    container.innerHTML = html;

N transactions versus 1 transaction. I mean that compiling N separate "small" DOM access calls is not always faster than one integral update.
My point was that that is enormously slower than storing a reference to the elements when you create them and setting only the specific attributes which changed. innerHTML string parsing forces the browser to do too much work unless you're literally changing everything. Each time I've replaced that with DOM operations, it's been an order of magnitude or more performance gain - on one project, getting rid of fully-optimized React was 4 orders, but I think they've optimized it enough since then that it's only 1-2.
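The pattern I mean, sketched (hypothetical data shape):

    // Create once, keeping direct references to the nodes.
    const row = document.createElement("tr");
    const items = [{ value: "a" }, { value: "b" }, { value: "c" }];
    const cells = items.map(item => {
      const td = document.createElement("td");
      td.textContent = item.value;
      row.appendChild(td);
      return td;
    });

    // Later: write only the cell that changed. No HTML string parsing,
    // no subtree rebuild, minimal layout work for the browser.
    function onCellChange(index, newValue) {
      cells[index].textContent = newValue;
    }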
I've built a real-time updating app using React and I'm struggling to see the benefit here. For the very core parts I'm already avoiding the React overhead by using useRef to maintain the same object. This reduces "hydration" and traversal to nil cost with no new concepts to learn. Why would I use Million?
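Roughly like this, if it helps (a minimal sketch; the `stream.subscribe` API is hypothetical):

    import React, { useEffect, useRef } from "react";

    function Ticker({ stream }) {
      const ref = useRef(null);
      useEffect(() => {
        // Write straight to the DOM node on each message:
        // no setState, no re-render, no reconciliation.
        const unsubscribe = stream.subscribe(value => {
          if (ref.current) ref.current.textContent = value;
        });
        return unsubscribe;
      }, [stream]);
      return <span ref={ref} />;
    }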
React is still doing reconciliation for your full element tree, even if you minimize updates. The approaches are not really comparable.
I haven't used React in a while and never used SolidJS, but wouldn't SolidJS basically be an optimizing compiler as well for "React"? Technically SolidJS is a separate framework, and I don't know if it's a 100% drop-in replacement like this may be.
Edit: My comment is probably not accurate. Please ignore what I said.
SolidJS and React are on the opposite ends of the JS framework worlds.
Solid doesn't use a virtual DOM like React does. Solid is close to Svelte; Vue is in the middle today, leaning towards Solid and Svelte, and will eventually sit alongside them when Vapor comes out; React is on the other end.
Solid and React do feel similar to the developer because they both use JSX and similar APIs, but they aren't compatible.
Yeah, I meant "optimizing compiler" in the sense that it takes in JSX and outputs optimized code.
Solid doesn’t have a compiler as I recall? (Other than JSX)
Sorry, my mistake. I misunderstood Solid as working the same way as Svelte, with a compiler.
I also understood it like that, and the FAQ seems to imply that it is.
https://www.solidjs.com/guides/faq
> We have a compiler working for you to give you optimal native DOM updates, but you have all the freedom of a library like React.
This looks really cool! It's interesting that this compiler provides performance optimizations by looking for data diffs rather than diffs in the Virtual DOM. Is this intended to be an alternative to React Forget (still in development)?
Never heard of this, sounds super cool! Gratz on the milestone!
Theoretically, could this be merged in the main React project or would this break something?
Aiden seems to have hopped on the AI hype train, too.
Ignore the haters. Million is a great project, and def provides a great solution to speeding up react.
Interesting - tried it but it did not like me using decorators.
Anyone know how this compares with Svelte's approach?
I'm not a React dev, so I can't comment on the project itself. Something I noticed on the blog post, though: The image at the top of the page is served uncompressed at a whopping 18.5MB (9751px * 6132px)! Seems a bit extreme for what amounts to a simple logo and some text.
Nothing a couple of rocket emojis won't fix :D
Ouch. It’s also downloaded twice (in Safari at least), putting the total page weight at almost 40MB.
And the file has a very, very faint gradient background with heavy dithering, so lossless compression won't actually help much (ECT was only able to shave 13% off the file after minutes of work). In fact, it is one of the worst imaginable cases for raster image formats.
18 MB may be an outlier, but most pages now have multiple images, pushing each page to tens of MB. That makes all the discussions about some framework being a few hundred KB smaller funny.
Yeah for all the complaints about an extra 500Kb of JS, the real offenders are often images.
But don’t forget you need to download and parse the JS and load that into memory. Plus depending on what it does it may be battery intensive.
Looks like a job for pure css, or if you are feeling brave, svg!
Holy crap you're right, that was a VERY large image, fixed: 18.5MB -> 21KB
Kudos to the team, but why on earth should I choose React when we've now reached a point where it needs an optimizing compiler? Seems silly, to be honest.
It doesn't need one; probably less than 1% of React apps would make use of this tool. It's an optimization that would be premature for most apps unless they've got large numbers of dynamic onscreen components.
It has absolutely needed significant performance improvements for years. Vanilla JS is somewhere in the ballpark of 30x (x, not percent) faster than React. If that's not calling for significant performance optimization, I don't know what is.
Nobody has ever complained about their app feeling better to use because it performs better.
Users do not sit there like “man, I really wish the web was SLOWER”.
Additionally, assuming this lives up to the claims, or even lives up to a quarter of the claims, then the optimization is, by definition, NOT premature. Premature optimization is the act of optimizing before you even know if something is slow, or before you measure.
I suppose you are probably working under the Functional Programming definition of “premature optimization” where they tell you to never measure (because it just makes FP look bad).
The one-order-of-magnitude difference shows up on some todo-app benchmark with mass updates and the like; it is not really representative of most real-world applications, and you might as well just use an escape hatch in plain JS for the parts of your site where you expect/measure significant slowdown from React itself.
It is absolutely not significant compared to site load, images, initial DOM layout, etc. Plus, your FP paragraph is straight-up uninformed flame-bait.
>your FP paragraph is uninformed flame-bait
It’s weird how functional fans always tell you something, and then the moment you start calling them on it, they say “no you!”
According to functional programmers, all optimization is premature optimization, unless your program is “noticeably slow”. What is “noticeably slow” you might ask, and the answer is: nobody knows!
Being able to type faster than VSCode registers your keystrokes is not “noticeably slow”. This counts as “fast enough”.
Processing 10,000 lines of data in 5 minutes is fast enough. Just make it a task and toss it on a highly parallel cluster; then you won't notice that this should take milliseconds. Move on to the next thing.
While you're contemplating what "noticeably slow" means, let's also toss up some articles claiming Haskell is faster than C without providing any evidence for the claim, contrary to the countless measurements demonstrating otherwise.
>it’s not significant compared to
Is that really what our counterargument is? That by the time your ill-conceived images download on a clean browser cache, your garbage code may have finished generating the DOM? You regularly build apps that you expect people to use once, and fill said apps with large images?
> in the ballpark of 30x (x not percent) faster than react
In real world apps or theoretical benchmarks?
The benefits of React have been belabored for a decade; if none of that reasoning makes sense to you, just don't use it.
Ecosystem, and the ability to hire folks who are already competent. Sure, it's not the most technically optimal framework, but you can build great products with it.