Computing with JavaScript's Undefined (2020)
esoteric.codes
Unavailable at the moment (HN hug of death?). Wayback Machine: https://web.archive.org/web/20240204131916/https://esoteric....
Or on archive.*, faster, nicer:
https://archive.ph/oldest/https://esoteric.codes/blog/calcul...
Hi, I wrote the piece. Site should be responding well again now.
> Service Unavailable
> HTTP Error 503. The service is unavailable.
It is not well at present.
https://en.wikipedia.org/wiki/Tony_Hoare#Research_and_career
>Speaking at a software conference in 2009, Tony Hoare apologized for inventing the null reference:
>"I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years." -Tony Hoare
https://news.ycombinator.com/item?id=19568378
>"My favorite is always the Billion-Dollar Mistake of having null in the language. And since JavaScript has both null and undefined, it's the Two-Billion-Dollar Mistake." -Anders Hejlsberg
>"It is by far the most problematic part of language design. And it's a single value that -- ha ha ha ha -- that if only that wasn't there, imagine all the problems we wouldn't have, right? If type systems were designed that way. And some type systems are, and some type systems are getting there, but boy, trying to retrofit that on top of a type system that has null in the first place is quite an undertaking." -Anders Hejlsberg
Oh, don't worry, we're talking about undefined, not null :)
What's better than null?
Two nulls, obviously.
As the author mentioned, JSFuck uses a very similar mechanism.
I interviewed Martin Kleppe of JSFuck for the same blog: https://esoteric.codes/blog/interview-with-martin-kleppe
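For anyone who hasn't seen the trick, the core of it is a handful of coercions JavaScript applies when you mix undefined, empty arrays, and booleans. A quick illustration (not taken from the article, just the usual building blocks):

    [][[]]            // undefined: indexing [] with [] (the key coerces to "")
    [][[]] + []       // "undefined": adding an array stringifies both sides
    ([][[]] + [])[0]  // "u": individual characters of "undefined"
    +[]               // 0: unary plus on an empty array
    +!![]             // 1: !![] is true, which coerces to 1
    +[][[]]           // NaN: unary plus on undefined

Once you can spell arbitrary characters, you can build property names like "constructor", get at Function, and run arbitrary code, which is essentially the route JSFuck takes.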
It reminds me of the earliest implementations of Lisp, where numbers were represented as lists of a certain length.
How would that work? I've only come across recursive function application as a way to implement Peano/Church numerals, similar to how it might be done in Prolog:
    natnum(0).
    natnum(s(N)) :- natnum(N).

I have no idea if this is actually the case with early Lisp, but when I was making my own vaguely Lisp-alike interpreter, it became awfully tempting to make everything into a list. After all, arrays are lists, strings are lists of characters, and even structs can be thought of as a list with unorthodox indexing.
At one point I was faced with the fact that I only really had two datatypes in my interpreter: the list and the integer. So I bit the bullet and made my integer type into a list with a single byte entry, reducing my total number of datatypes to one. Every piece of my interpreter code knew it would get lists as inputs, and it was kinda beautiful. CPU-wise it was wasteful, but it shrank the interpreter size by a lot.
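A rough sketch of that collapse, in JavaScript for illustration (the names and details here are made up, not the actual interpreter):

    // Everything is a list; an integer is a one-element list holding its value.
    const int = n => [n];
    const car = xs => xs[0];                                   // first element
    const add = (a, b) => int(car(a) + car(b));                // unwrap, add, rewrap
    const str = s => [...s].map(ch => int(ch.charCodeAt(0)));  // a string is a list of ints

    add(int(2), int(3));   // [5]
    str("hi");             // [[104], [105]]

Every primitive then only has to handle one shape of input, which is where the "kinda beautiful" part comes from.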
I believe the microprocessor from Steele and Sussman's paper "Design of a LISP-based microprocessor" [1] used Peano numbers.
Under "10. Discussion" they cover the lack of numbers and how they use CONS as a kind of 'add 1' and CAR/CDR as a kind of 'subtract 1' when calculating positions in memory, similar to how integer indices/pointers might be used on a different type of machine, as if the length of a list were an implicit pointer.
They seem almost giddy about having designed a computer that can perform interesting mathematical tasks like integrals and derivatives without numbers.
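The idea translates fairly directly: if a number n is just a list of length n, then CONS is +1, CDR is -1, and a null test is a zero test. A toy version in JavaScript (purely illustrative, nothing to do with the paper's actual hardware):

    const zero = [];
    const add1 = n => [null, ...n];        // CONS: one cell longer
    const sub1 = n => n.slice(1);          // CDR: one cell shorter
    const isZero = n => n.length === 0;    // NULL test
    const add = (a, b) => isZero(a) ? b : add1(add(sub1(a), b));   // Peano addition

    add(add1(add1(zero)), add1(zero)).length;   // 3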
What Lisp would that be? Considering it is not (as far as I can see) mentioned in McCarthy's History of Lisp (https://dl.acm.org/doi/pdf/10.1145/800025.1198360), I think no actual Lisp used unary arithmetic based on list lengths.
Sure, there might be toy examples used for teaching that did this, but that is something different.
Reminds me of Tom 7's NaN computer: http://tom7.org/nand/
This is quite cool and a reminder that SIGBOVIK is happening again next week
This could be used for a really good obfuscation.
There are quite simple ways to reverse this kind of obfuscation though, as shown here: https://steakenthusiast.github.io/2022/06/14/Deobfuscating-J...
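If I remember right, that write-up builds on Babel's AST tooling; either way, the core trick is constant folding: walk the tree, ask Babel to evaluate each expression statically, and replace it with the literal it produces. A minimal sketch of that idea (real deobfuscation needs several more passes than this):

    // npm install @babel/parser @babel/traverse @babel/generator @babel/types
    const parser = require("@babel/parser");
    const traverse = require("@babel/traverse").default;
    const generate = require("@babel/generator").default;
    const t = require("@babel/types");

    function fold(source) {
      const ast = parser.parse(source);
      traverse(ast, {
        "BinaryExpression|UnaryExpression"(path) {
          // path.evaluate() tries to compute the expression at "compile time".
          const { confident, value } = path.evaluate();
          if (confident && ["string", "number", "boolean"].includes(typeof value)) {
            path.replaceWith(t.valueToNode(value));
            path.skip();
          }
        },
      });
      return generate(ast).code;
    }

    console.log(fold("'de' + 'obfuscate'"));   // prints "deobfuscate";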
As the sibling thread already mentioned, it is pretty much just doing what JSFuck is doing. Before we go all crazy and start transpiling all our JS to this, be aware that running obfuscated code like that will have performance penalties.
> be aware that running obfuscated code like that will have performance penalties.
Both computationally and societally.