A stochastic method to generate the Sierpinski triangle

github.com

110 points by wenderen 4 years ago · 63 comments

klntsky 4 years ago

I played with a similar approach a while ago. By changing the algorithm a bit it's possible to create various other beautiful images:

https://drive.google.com/drive/folders/1TSgLRdO0tKIFUb6Oj8yN...

Here's a link to my post (unfortunately, in Russian) describing the process: https://t.me/little_pieces/822?single

A quick translation:

1. We have a 2D space and a regular polygon with N vertices, where N is a parameter.

2. Choose a function f : R -> R (e.g. f(x) = x / 2 for the Sierpinski triangle).

3. Choose a random "current" point.

4. Choose a random vertex of the N-gon.

5. Calculate the distance D between the two points. Move the current point toward the selected vertex by a distance of f(D).

6. Repeat steps 4-5, saving the position of the current point at each step, until there is enough data to draw a density map.
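
A minimal Python sketch of the steps above (the parameter names and the plotting step are illustrative; with N = 3 and f(d) = d/2 it reproduces the Sierpinski triangle):

  import math, random

  N = 3                                  # number of vertices of the regular N-gon
  f = lambda d: d / 2                    # the chosen function f : R -> R

  vertices = [(math.cos(2 * math.pi * k / N), math.sin(2 * math.pi * k / N))
              for k in range(N)]

  x, y = random.uniform(-1, 1), random.uniform(-1, 1)   # random "current" point
  points = []
  for _ in range(100_000):
      vx, vy = random.choice(vertices)   # random vertex of the N-gon
      d = math.hypot(vx - x, vy - y)     # distance D to that vertex
      if d > 0:
          t = f(d) / d                   # move toward the vertex by f(D)
          x, y = x + (vx - x) * t, y + (vy - y) * t
      points.append((x, y))
  # bin `points` into a 2D histogram to get the density map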

Another person from gen-club [0] (a Russian Telegram chat dedicated to generative art) made a 3D version of it [1][2]:

[0] https://t.me/gen_c

[1] https://t.me/mathimages/156

[2] https://codepen.io/strangerintheq/full/ZEKXamr (zoom out at startup)

arutar 4 years ago

Here's a quick proof of why it works. Suppose the triangle has vertices (0,0), (1,0), and (1/2, sqrt(3)/2). Given a point (x,y), the three transformations then become

f1(x,y) = (x/2, y/2)

f2(x,y) = (x/2+1/2, y/2)

f3(x,y) = (x/2 + 1/4, y/2 + sqrt(3)/4)

which are precisely the maps of the Iterated Function System [1] that generates the Sierpinski triangle!

It's a very nice observation that the three maps have a concise representation: each just takes the midpoint of the given point and one of the triangle's vertices.
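
A quick numeric sanity check of that observation (a sketch; the test point p is arbitrary):

  import math

  vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]

  def midpoint(p, v):
      return ((p[0] + v[0]) / 2, (p[1] + v[1]) / 2)

  maps = [lambda x, y: (x / 2, y / 2),
          lambda x, y: (x / 2 + 1 / 2, y / 2),
          lambda x, y: (x / 2 + 1 / 4, y / 2 + math.sqrt(3) / 4)]

  p = (0.3, 0.4)
  for v, fi in zip(vertices, maps):
      assert all(math.isclose(a, b) for a, b in zip(midpoint(p, v), fi(*p)))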

[1] https://en.wikipedia.org/wiki/Iterated_function_system

  • codeflo 4 years ago

    I'm sorry, but that gives me "draw the rest of the fucking owl" vibes. You're basically restating the algorithm's definition -- it's the rest of the proof that contains the interesting steps.

    Edit: In my view, to show that this draws the Sierpinski triangle, one would need to show that (1) we only draw points that are in the Sierpinski triangle and (2) we draw all the points of the Sierpinski triangle.

    (1) is clearly false (we start with a random point), but the claim is of course only that we draw an “approximation”. What that means exactly would need to be defined. I assume a rigorous argument would involve 2D probability densities. Given that, (2) isn’t obvious to me either: what’s the argument that all the parts of the Sierpinski shape correspond to areas with high probability density?

    • scotty79 4 years ago

      I really don't see how any proof is necessary.

      Taking the midpoint between a point and a triangle vertex is a transformation that scales everything by 1/2 with that vertex as the pivot.

      By choosing one of these three transformations at random, you are scaling the whole triangle into the three smaller copies of itself that it consists of.

      If you start with a point belonging to the Sierpinski triangle, you keep adding more and more points belonging to that triangle.

      The fun thing is that you can use the same algorithm with a different number and kind of transformations to get other fractals from such a random walk: for example a fractal tree, a fern leaf, or the Sierpinski carpet.
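
      For example, a sketch of the same random-walk loop with the standard published Barnsley fern maps swapped in (the coefficients below are the usual ones from the literature, nothing specific to this thread):

        import random

        # Four affine maps (x, y) -> (a*x + b*y + e, c*x + d*y + f),
        # each with its usual selection probability.
        maps = [
            (0.0,   0.0,   0.0,  0.16, 0.0, 0.0,  0.01),
            (0.85,  0.04, -0.04, 0.85, 0.0, 1.6,  0.85),
            (0.2,  -0.26,  0.23, 0.22, 0.0, 1.6,  0.07),
            (-0.15, 0.28,  0.26, 0.24, 0.0, 0.44, 0.07),
        ]

        x, y = 0.0, 0.0
        points = []
        for _ in range(100_000):
            a, b, c, d, e, f, _p = random.choices(maps, weights=[m[-1] for m in maps])[0]
            x, y = a * x + b * y + e, c * x + d * y + f
            points.append((x, y))
        # scatter-plot `points` to see the fern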

      Another interesting thing is that you can start with a point not belonging to the fractal and it pretty quickly converges. That's because fractals are attractors.

      • codeflo 4 years ago

        But you don’t start with a point belonging to the triangle, that’s the whole idea!

        • gus_massa 4 years ago

          There are probably many better explanations out there, but I'll try one:

          It's important to note that if you pick any point p in the Sierpinski triangle and you calculate f1(p), f2(p) and f3(p), you get three points that are also in the Sierpinski triangle. [f1, f2, f3 as defined in the top comment]

          ---

          You pick any point in the plane, let's call it q, and you draw it on the screen.

          You pick any point in the Sierpinski triangle, let's call it p; p is a secret point.

          The important number is the distance between p and q. You could have picked a better point in the Sierpinski triangle, but any one you choose is fine.

          Now you select one of the functions f1, f2, f3, let's call it f. And you calculate p'=f(p) and q'=f(q) (the same function for both).

          Now you have two new points p' and q'. p' is again a point in the Sierpinski triangle, as I said at the beginning. q' is a point that you draw on the screen.

          Again, the important number is the distance between p' and q'. What happens if you compare the distance between p' and q' with the initial distance between p and q? The new distance is one half of the initial distance, so q' is closer to p'. You could have picked a better initial point p in the Sierpinski triangle, or pick a better point p* in the Sierpinski triangle now, but don't worry, p' is fine.

          And now you repeat the process with cut&paste...

          You select one of the functions f1, f2, f3, let's call it f. It may be the same function as before or another one. You calculate p''=f(p') and q''=f(q') (the same function for both).

          Now you have two new points p'' and q''. p'' is again a point in the Sierpinski triangle, as I said at the beginning. q'' is a point that you draw on the screen.

          Again, the important number is the distance between p'' and q''. What happens if you compare the distance between p'' and q'' with the previous distance between p' and q' and the initial distance between p and q? The new distance is one half of the previous distance and one quarter of the initial distance, so q'' is closer to p''. You could have picked a better initial point p in the Sierpinski triangle, or pick a better point p** in the Sierpinski triangle now, but don't worry, p'' is fine.

          And now you repeat the process with cut&paste...

          ...

          After enough halvings, the distance between p'''''...'''' and q'''''...'''' is smaller than any level of precision you select initially, like 1E-15. If you need more precision because you are using a high zoom level, just add more steps and more halvings.

          Now you have a sequence of points q, q', q'', q''', q'''', ... that are drawn on the screen and p, p', p'', p''', p'''', ... that are only in your imagination.

          Except for a few of the initial points, all the other points q'''''...'''' are very close to the Sierpinski triangle. One trick is to not draw the first 1000 points, so you don't get points in weird locations. You should prove that 1000 is enough, or use a bigger number, or just cross your fingers and hope nobody notices them.

          Note that this method doesn't actually draw the Sierpinski triangle. It just draws a cloud of points that are very close to the Sierpinski triangle. (Unless you are very lucky and the initial point q is also in the Sierpinski triangle, or something like that.)
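
          A tiny sketch of this argument (illustrative; it just prints the shrinking distance between the imaginary point p and the drawn point q):

            import math, random

            V = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]
            mid = lambda p, v: ((p[0] + v[0]) / 2, (p[1] + v[1]) / 2)

            p = (0.0, 0.0)            # a point known to be in the Sierpinski triangle (a vertex)
            q = (37.0, -12.0)         # an arbitrary starting point, drawn on the screen
            for step in range(1, 11):
                v = random.choice(V)  # the SAME random vertex for both points
                p, q = mid(p, v), mid(q, v)
                print(step, math.hypot(p[0] - q[0], p[1] - q[1]))   # halves at every step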

        • scotty79 4 years ago

          Yeah, it's easier to see if you start not with a random point (which doesn't even have to be in the triangle) but with one of the vertices.

          To understand why you can start with any point you'd have to understand why fractals are attractors and that's some serious math I presume, with epsilons and quantifiers.

          • floatingatoll 4 years ago

            Tell us some more serious math! You’re doing great so far and that’s where the “huh?” ends up answered.

            • klyrs 4 years ago

              The point here is that you'll almost certainly end up inside the Sierpinski triangle within a logarithmic number of steps.

              Let's do the 1-dimensional version, the Cantor set.

                x = (anything)              # any starting point on the real line
                while True:
                  draw(x)                   # plot the current value of x
                  if flip_coin() == HEADS:
                     x = (x+1)/2
                  else:
                     x = x/2
              
              Supposing we start with a large negative number N, and our coin is secretly TAILS on both sides, our sequence will be

                N, N/2, N/4, N/8, N/16...
              
              So we'll rapidly converge on the point 0, but never quite make it into the interval [0, 1] because N is negative. But, if we draw our point with a diameter of say, 1/128, it will rapidly become visually indistinguishable from zero.

              If the coin is fair, then something really cool happens -- if we hit a HEADS and x was already in the interval [-1, 0] then our result will be in the interval [0, 1/2] -- which is inside the interval [0, 1] where the problem is easier to think about. Approaching the problem in this way requires a bunch of case analysis, but the gist is that an attractor will quickly suck any point into the fractal -- if you get really unlucky, it will only get very close to the fractal in question.

            • scotty79 4 years ago

              I'd be happy to tell you if I actually knew it. I just know it exists. Sorry if I gave the wrong impression.

              If I were to look for it, I think I'd search for a proof that an Iterated Function System has a fixed point or attractor: https://en.m.wikipedia.org/wiki/Iterated_function_system

        • jheriko 4 years ago

          The algorithm ensures that the point is approximately in the triangle. Using the midpoint with one of the vertices excludes the triangle in the middle...

    • CrazyStat 4 years ago

      Here's a sketch of a proof using tools from Markov chain Monte Carlo. Writing on mobile so forgive any typos.

      Suppose you start by drawing a point uniformly at random from the triangle. We'll call this step 0. With probability 1 this point is not in the Sierpinski triangle, but if we consider finite approximations of the Sierpinski triangle it is in the level 0 Sierpinski triangle (see this random illustration I found on Google [1]). In particular, since the level 0 Sierpinski triangle is just the whole triangle, the initial point is uniformly distributed in the level 0 Sierpinski triangle.

      This is a base case. Then we can do an induction step: if we have a point which is uniformly distributed in the level k Sierpinski triangle, applying the transformation (pick one vertex of the triangle at random and move halfway towards it) results in a point which is uniformly distributed in the level k+1 Sierpinski triangle. This is because each corner of the level k+1 Sierpinski triangle is a scaled copy of the level k Sierpinski triangle.

      Putting the base case and induction step together, we deduce that when we repeatedly apply the transformation of picking a vertex and moving halfway towards it, at each step k we have a point which is uniformly distributed in the level k Sierpinski triangle.

      This is enough to show that if we start with a sample of N points drawn uniformly from the triangle, after k steps we'll have N points drawn uniformly from the level k Sierpinski triangle (i.e. an arbitrarily good approximation of the Sierpinski triangle, if we make k large enough). It's not quite enough to show that if we start with a single point and record the location after each step we'll end up with something that looks like the Sierpinski triangle. For that we need to establish ergodicity of the Markov chain. The details are technical but not difficult to show; basically the Markov chain needs to not get caught in fixed-length loops (aperiodicity) and never get stuck unable to reach part of the triangle (irreducibility). Both these conditions are satisfied, so the Markov chain is ergodic and starting from a single point will approximate a Sierpinski triangle.

      I've glossed over a bunch of details but hopefully that shows the basic idea.
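
      A sketch of the ensemble version of this argument (illustrative: N points sampled uniformly from the triangle are pushed through k steps of the vertex-halving move):

        import math, random

        A, B, C = (0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)

        def uniform_in_triangle():
            # sample the unit square, then fold the half with u + v > 1 back in
            u, v = random.random(), random.random()
            if u + v > 1:
                u, v = 1 - u, 1 - v
            return (A[0] + u * (B[0] - A[0]) + v * (C[0] - A[0]),
                    A[1] + u * (B[1] - A[1]) + v * (C[1] - A[1]))

        pts = [uniform_in_triangle() for _ in range(50_000)]   # level 0: uniform in the triangle
        for level in range(8):                                 # each pass: level k -> level k+1
            new_pts = []
            for p in pts:
                v = random.choice((A, B, C))
                new_pts.append(((p[0] + v[0]) / 2, (p[1] + v[1]) / 2))
            pts = new_pts
        # plotting `pts` now shows the level-8 approximation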

      [1] https://www.researchgate.net/profile/Ali-Bashir-4/publicatio...

    • gus_massa 4 years ago

      The rest of the proof is in the Wikipedia page linked by the GP. It even has a description of the method used by the OP:

      > The most common algorithm to compute IFS fractals is called the "chaos game". It consists of picking a random point in the plane, then iteratively applying one of the functions chosen at random from the function system to transform the point to get a next point. An alternative algorithm is to generate each possible sequence of functions up to a given maximum length, and then to plot the results of applying each of these sequences of functions to an initial point or shape.

      • codeflo 4 years ago

        Are we looking at a different version of the page? I see descriptions and definitions, but no proof.

    • reedf1 4 years ago

      You're reaching a classic physics/mathematical limit that arises from asking "why" questions. To understand this concept further you need to understand concepts from chaos theory, specifically attractors, but it may be difficult to explore further if you are still not satisfied.

    • arutar 4 years ago

      This is fair - there are a decent number of details omitted here. However, a lot of the details are hidden inside the proof that an IFS has a unique fixed point. At least to me, this "proof" is nice because it converts a non-obvious procedure (taking midpoints) into a "standard" procedure (the IFS construction).

      A sketch of the argument in the IFS goes like this: consider the set of maps {f1, f2, f3} as acting on compact subsets of R^2 by the action F -> f1(F) U f2(F) U f3(F). Since the maps fi are continuous, the image is a compact set. But one can also prove that this action is a contraction mapping on the space of compact subsets of R^2 equipped with Hausdorff metric [1], and appealing to the Banach fixed point theorem [2],

      (1) there is a unique compact set K fixed by this action, and

      (2) the images of F converge "geometrically fast" to the set K

      In other words, there is some constant 0 < r < 1 such that after k steps of this algorithm (not the random process), the distance from the k^th approximation to the Sierpinski triangle is at most r^k.
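
      A discretized sketch of that set iteration (a grid-friendly, affinely distorted version of f1, f2, f3; the resolution and iteration count are arbitrary):

        SIZE = 512    # grid resolution; the attractor lives in [0, SIZE) x [0, SIZE)

        def g1(x, y): return (x // 2, y // 2)
        def g2(x, y): return (x // 2 + SIZE // 2, y // 2)
        def g3(x, y): return (x // 2 + SIZE // 4, y // 2 + SIZE // 2)

        F = {(x, y) for x in range(SIZE) for y in range(SIZE)}   # start from a full square
        for _ in range(8):
            # one application of the Hutchinson operator F -> g1(F) U g2(F) U g3(F)
            F = {g(x, y) for (x, y) in F for g in (g1, g2, g3)}
        # F now approximates the attractor; each pass shrinks the Hausdorff
        # distance to it by a factor of about 1/2 (down to the grid resolution)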

      Actual convergence rates of the corresponding random process are more complicated. One of my colleagues recently wrote a paper on convergence rates of the "chaos game" (essentially what this is doing), and one can get really sharp information on how fast the random process converges [3]. However, this uses some non-trivial covering time theory for Markov processes. It's not too complex to follow, but definitely way more detail than I can write up in this note here.

      [1] https://en.wikipedia.org/wiki/Hausdorff_distance

      [2] https://en.wikipedia.org/wiki/Banach_fixed-point_theorem

      [3] https://arxiv.org/abs/2007.11517

      Edit: To see that the points converge to the Sierpinski triangle (with no quantitative information) is a "bit less challenging" (i.e. within the realm of "standard" material - not necessarily easier!) - you can essentially reduce this problem to an application of the Birkhoff ergodic theorem [4]. Firstly, we can consider this process as a random map

      pi: {1,2,3}^{\mathbb{N}} -> R^2

      which is defined by taking finite prefixes (i1, i2, ..., ik), mapping them to the images fi1(fi2(... fik(x0) ...)), and taking the limit in k, for some fixed starting point x0.

      Then "apply map fi" can be interpreted as looking at pi(sigma(i1,i2,...)) where sigma is the left-shift map sigma(i1, i2, ...) = (i2, i3, ...). But the shift map plays very nicely with the measure on the sequence space (the Bernoulli measure), in the sense that the Birkhoff ergodic theorem can be applied. This gives that for "typical" sequences of random function applications, this process will converge to the Sierpinski triangle.

      Here's another way to see it: let's say I apply maps fi1, fi2, ..., fik to my starting point x0. Then if we code the "level k" approximations to the Sierpinski triangle in the same way [triangle(i1, ..., ik) = fi1(...(fik(initial triangle)))], the image of the point x0 will lie in triangle(i1, ..., ik). But these triangles, for large k, approximate the Sierpinski triangle. Some more argumentation is required to deal with the case when the starting point does not sit in the Sierpinski triangle, but this isn't a problem because of the fast convergence result explained above (if you wait a long time, all the points will be "very close"). The above ergodic theory argument is just showing that every possible finite subsequence will show up for "typical" random function applications.

      [4] https://en.wikipedia.org/wiki/Ergodic_theory

  • JJMcJ 4 years ago

    Michael Barnsley has two books that discuss this in greater detail.

    https://maths.anu.edu.au/people/academics/michael-barnsley

    and

    http://superfractals.com/wpfiles/

  • guimplen 4 years ago

    Like it. The theory of iterated function systems beautifully utilizes the Banach fixed-point theorem.

user-the-name 4 years ago

"Pick a random vertex of the triangle and find its midpoint with p" is actually "Take the whole triangle, and scale it down to half size, and place it so that it shares a random vertex". This is the basic structure of the Sierpinski triangle, so repeating this process creates the fractal shape of the Sierpinski.

  • asxd 4 years ago

    Thanks. I wasn't quite sure what "pick a random vertex of the triangle and find its midpoint with p" meant. What's the midpoint of a vertex?

a_e_k 4 years ago

I think this is basically the "chaos game."

https://en.wikipedia.org/wiki/Chaos_game

frob 4 years ago

I believe this is also the algorithm included in the TI-83 (plus) manual: https://www.manualowl.com/m/Texas%20Instruments/TI-83-Plus/M...

I remember sitting on the floor of our living room typing this in character by character on that TI keyboard. It's the first BASIC program I ever wrote. Everything before that was JS and HTML.

  • mawise 4 years ago

    I have fond memories of learning to program on the TI-83 and following that same example!

_archon_ 4 years ago

I thought I'd note that one of my favorite pages on the web pertains to exploring Sierpinski triangles [0]

"The sierpinski triangle page to end most sierpinski triangle pages ™"

[0]:http://www.oftenpaper.net/sierpinski.htm

daneel_w 4 years ago

The method presented by the author is the only method I've ever known. It was both simple enough for me to understand as a kid, and fast enough that I could plot the results within a minute using HiSoft BASIC on my Amiga 500.

Just to add a curiosum, the method as it was explained to me as a kid: "Plot a pixel on any corner of the triangle. Pick a new corner at random and plot a pixel halfway between there and where your last pixel was, then just repeat."

  • marcodiego 4 years ago

    > The method presented by the author is the only method I've ever known.

    The recursive implementation always seemed more intuitive for me.

  • kragen 4 years ago

    There are an astonishing number of ways to generate the Sierpinski triangle:

    1. The Chaos Game, as described in this post.

    2. Other ways of rendering the IFS; for example, iterating over the possibility tree of transforming a single point 10 times and plotting the 59049 leaves, or searching over the tree of inverse transformations from a given pixel to some depth like 6 and coloring the pixel with the length of the longest chain that doesn't blow up. It's a very well behaved IFS, so even methods that have trouble with some IFSs will do fine.

    3. Coloring the odd numbers in Pascal's Triangle, aka Yang Hui's Triangle or the triangle from मेरु प्रस्तार.

    4. As a generalization, plotting histories of an astonishingly wide range of 1-D cellular automata, starting with a single "live" cell. For example, the totalistic rule where a cell is live iff exactly 1 of itself and its neighbors were alive (rule 14). About a third of the 256 2-state neighborhood-3 CAs like this are some deformation of the Triangle IIRC.

    5. A surprisingly large variety of L-systems also generate the Triangle. In http://canonical.org/~kragen/laserboot/cut-6 I used, I think, F = !F!-F-!F!, where ! swaps left and right. An interesting thing about this curve in particular is that it avoids self-intersections, which is what makes it somewhat suitable for laser cutting.

    6. And of course you can start with a solid triangle, cut a hole in its middle to make a triforce, and then recursively do the same for each of the resulting three remaining triangles.

    7. For use as a calendar, http://canonical.org/~kragen/sw/dev3/siercal I generated a tour of the triangle with a non-L-system method, just subdividing the triangle recursively.

    8. It's also interesting to note that the state space of the Towers of Hanoi puzzle has the shape of the Sierpinski Triangle. So a suitable 2D graph layout algorithm (all edges equal length, maximizing total distance from the center) applied to the state space graph ought to generate it. This is a little handwavy, though.

    9. I'd forgotten this, but as John D. Cook points out in https://www.johndcook.com/blog/2019/11/26/fractal-via-bit-tw..., the pixels where the bitwise AND of the X and Y coordinates is zero form a distorted Sierpinski Triangle (see the sketch just after this list).

    10. Cook points out in https://www.johndcook.com/blog/2019/10/19/binary-surprise/ that the numbers of sides in the regular n-gons constructible with compass and straightedge, when written down in binary, also form the Sierpinski Triangle.
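
    A minimal sketch of method 9, the bitwise-AND trick (the grid size is arbitrary):

      # Mark the pixels whose X and Y coordinates share no set bits.
      SIZE = 64
      for y in range(SIZE):
          print("".join("#" if (x & y) == 0 else " " for x in range(SIZE)))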

pfedak 4 years ago

As others have pointed out, a generalization of this method (iterated function systems) can generate a large class of fractals. There's some fun software for generating these (see https://en.wikipedia.org/wiki/Fractal_flame) which also adds coloring based on the sequence of function applications, resulting in a pattern that respects the fractal structure.

Going a step further, changing the "functions" over time can create smoothly animated fractals. Electric Sheep (https://electricsheep.org/) is a screensaver version of this which doubles as a distributed computing network to generate them. It also mixes in non-linear transformations which makes for some impressive, mesmerizing patterns.

baliex 4 years ago

You’ll find some more insight from Numberphile here, https://youtu.be/kbKtFN71Lfs

arunchaganty 4 years ago

The chaos game is one of my favorite things! I first came across it nearly 15 years ago from James Gleick's excellent book Chaos and in fact I learned how to program so that I could actually implement the chaos game.

After spending many years idly reflecting on why the chaos game converged to the Sierpinski triangle, I'd like to share an illustrated proof of mine that might be enjoyed by this audience: https://arun.chagantys.org/technical/2020/04/28/chaos-game.h...

IngoBlechschmid 4 years ago

Here is a version which can be played by up to three persons sharing a keyboard:

https://www.speicherleck.de/iblech/stuff/chaosspiel-2015/cha...

It was developed as part of a mathematics day for children; the worksheet is available here: https://www.speicherleck.de/iblech/stuff/chaosspiel-2015/arb...

paol 4 years ago

This is how I learned to draw the Sierpinski triangle; I don't even know the analytical method. I got this method from fractint [0], I think; it had great docs explaining some of the fractal types. This was sometime in the mid-90s.

[0] https://en.wikipedia.org/wiki/Fractint

EDIT: Also, as a bonus this method is very easy to do with a pen and paper. I remember trying that, but it takes a lot of points before the structure starts to become visible.

  • Sharlin 4 years ago

    Well, the standard nonstochastic way is the obvious recursive subdivision: given a triangle, find the midpoints of the edges to subdivide it into four smaller triangles, then continue recursively for the new triangles except the center one. Once you hit the recursion limit, simply draw a triangle.
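
    A recursive sketch of that (collecting the leaf triangles into a list rather than drawing, to keep the example self-contained):

      def midpoint(p, q):
          return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)

      def sierpinski(a, b, c, depth, out):
          if depth == 0:
              out.append((a, b, c))               # one small triangle to draw
              return
          ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
          sierpinski(a, ab, ca, depth - 1, out)   # recurse into the three corner
          sierpinski(ab, b, bc, depth - 1, out)   # triangles, skipping the center one
          sierpinski(ca, bc, c, depth - 1, out)

      triangles = []
      sierpinski((0, 0), (1, 0), (0.5, 3 ** 0.5 / 2), 6, triangles)
      # `triangles` now holds 3**6 small triangles ready to be drawn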

londons_explore 4 years ago

This is what got me into programming... I had fiddled with Print "My Name Is Bob" in an infinite loop, and decided that there was nothing interesting programming could do...

Then a relative wrote this algorithm in about 10 lines of qbasic. I was amazed that such simple code could do something so complex and unexpected. As a 5 yr old child, I spent hours trying to figure out how on earth this algorithm worked - even though it's blatantly obvious to adult me.

Jyaif 4 years ago

Kudos to whoever included it in the TI-82's manual; it successfully tickled my curiosity when I was a kid in the 90s and played a part in me becoming a programmer.

https://www.manualslib.com/manual/325928/Ti-Ti-82.html?page=...

aimor 4 years ago

It's funny: each point is only guaranteed to be in that iteration of the triangle. So if you want a Sierpinski triangle accurate to n levels deep, you want to throw away the first n-1 points. If you want to plot a specific Sierpinski triangle, you have to re-run the algorithm from the beginning to get more points.

ralferoo 4 years ago

This isn't new. I first learned about Sierpinski triangles with this algorithm in the early 90s. Later, I learned how it can be generalised to IFS, which opened my eyes to much more interesting examples. I think there's even an entire chapter on this specific algorithm in Barnsley's Fractals Everywhere book.

sizediterable 4 years ago

I remember there being a section demonstrating this in the manual for my TI-83 calculator [0]

It prompted me to learn a bit more about IFS and discover one of my favorite fractals ever: the word "chaos" where each stroke is made up of the word "chaos" [1]. I spent a bunch of time coding up the fractals on that site [2] on my calculator.

[0] http://manualsdump.com/en/manuals/texas_instruments-ti-83/13...

[1] http://paulbourke.net/fractals/ifs/chaos_1.gif

[2] http://paulbourke.net/fractals/ifs/

Nydhal 4 years ago

You can make it using Cellular Automata too.

Python code: https://nbviewer.org/github/Nydhal/Python-Notebooks/blob/mas...

jankovicsandras 4 years ago

Cool. Shameless plug:

Sierpinski carpet L-system

677 bytes will render a Sierpinski carpet using an L-system to create a Z-order curve:

https://github.com/jankovicsandras/misc

RonInDune 4 years ago

The Sierpinski triangle was the first "complex" system I had managed to draw, using the Logo programming language in the 90s. I recently tried to do a VTK 3D extension (Sierpinski cube) [1] using the same process as in the OP, but it was surprisingly computationally expensive!

[1] https://en.wikipedia.org/wiki/Menger_sponge

ArtWomb 4 years ago

Love comp prob & geom ;) The coolest part is the algo to generate uniform random points bounded within any triangle. I feel like it's useful in loads of applications: FEM, Monte Carlo rendering, etc.

https://blogs.sas.com/content/iml/2020/10/19/random-points-i...

jventura 4 years ago

I did the same thing, but in Python, some years ago [1]. Never understood how it works, but it's easy to implement...

[1] https://joaoventura.net/blog/2016/sierpinski-triangle/

rileyphone 4 years ago

Also, rule 90 tends to generate them from the nil seed - try clicking on the banner I made and setting the window size to something even and the seed to zero.

https://rileystew.art/posts/banner

  • hasmanean 4 years ago

    When I was in high school my teacher told me a trick: take Pascal's triangle and colour the even numbers black and the odd numbers white. You'll end up with a Sierpinski triangle.

    Each row of Pascal's triangle is generated from the row just above it:

    P(x,y) = P(x,y-1) + P(x-1,y-1)

    and for any even or odd numbers,

    even + even = even, odd + odd = even, odd + even = odd,

    so really you can compute the parity of any item in Pascal's triangle by just checking the parity of the two numbers above it and XORing them.

    So you can compute each pixel as S(x,y) = xor(S(x,y-1), S(x-1,y-1)),

    where the initial row is all 0s with a single 1 in it.

    I wrote a program on the Mac II in high school to compute this…it was quite a lot of fun. The process of going from integer to floating point to large integer to single bits was instructive.
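
    A minimal sketch of that XOR rule (the width and number of rows are arbitrary):

      WIDTH, ROWS = 64, 64
      row = [1] + [0] * (WIDTH - 1)        # initial row: all 0s with a single 1
      for _ in range(ROWS):
          print("".join("#" if bit else " " for bit in row))
          row = [(row[i - 1] if i else 0) ^ row[i] for i in range(WIDTH)]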

richm44 4 years ago

Here's a version of the same thing as a BBC Microbot tweet: https://twitter.com/bbcmicrobot/status/1333542986588753921

londons_explore 4 years ago

Why it works:

Imagine what would be required to put a dot in an area that ought to be blank... The previous point would have had to have been outside the triangle, or in another blank area at the previous step.

Hence, it is impossible to draw in the blank areas.

  • guimplen 4 years ago

    I think your argument only shows that the k-th point lies inside the k-th approximation of the Sierpinski triangle, but says nothing about why the final picture looks like the Sierpinski triangle.

RantyDave 4 years ago

Could you write a function so that for point (x,y) it returns true if the point is "in" the Sierpinski triangle? Yes, I am wondering if it could be expressed as a fragment shader :)

punnerud 4 years ago

Video here: https://imgur.com/1Svgp0G

(From the repo)

ssm008 4 years ago

How lovely! I thoroughly enjoyed this
