Psychedelic Graphics 0: Introduction

benpence.com

306 points by tasshin a year ago · 72 comments

dtristram a year ago

Hi, David Tristram here, founding member of the Raster Masters, a 1990s computer graphics performance ensemble. As @hopkins has mentioned, we used high-end Silicon Graphics workstations to create synthetic imagery to accompany live music, including notably the Grateful Dead, Herbie Hancock, and Graham Nash.

After many iterations I'm currently working mainly in 2D video processing environments, Resolume Avenue and TouchDesigner. The links here are inspiring, thanks for posting.

satyarthms a year ago

If anyone wants to play around with psychedelic graphics without going too low-level, [hydra](https://hydra.ojack.xyz/) is a cool JavaScript-based livecoding environment with a gentle learning curve.

  • jerjerjer a year ago

    Is there anything which supports music input? I liked Winamp era visualizers, but the art seems to be dead today.

    • joenot443 a year ago

      I've been working on a free, open-source macOS app for just that - https://nottawa.app Hoping to release in the next couple of months!

      The UI has been greatly improved since I recorded the original demo on the site; the real thing is MUCH better now. Same base idea: chain together shaders, videos, or webcams and then drive their parameters via an audio signal, BPM, oscillator, MIDI board, or manual sliders.

      The beta link on the site isn't really worth trying yet - if you're interested in getting on the TestFlight just shoot me a message at joe@nottawa.app. Would love some HN feedback :)

      • jerjerjer a year ago

        > I've been working on an free open-source

        May I ask where the sources are? Looks great - any plans for a Windows or Linux (Docker) version?

        • joenot443 a year ago

          Sure! It should have been linked on the site, that’s totally my oversight.

          https://github.com/joenot443/dude_wake_up

          The code isn’t anything to write home about, it’s in C++ leveraging OpenFrameworks and OpenGL. I’m an iOS and macOS dev, but after the initial release I’ll get started on porting to Windows and Linux. OF generally works well multi-platform so I’m hoping it won’t be too hairy.

          I’m specifically targeting the non-technical artist/creator market, ideally with optional macOS App Store distribution. I’ve been involved in the live visuals scene in NYC a bit, and something I commonly heard was that musicians and DJs want visual accompaniment that just works out of the box. TouchDesigner et al. are incredibly powerful, but generally out of reach for non-technical folks.

          I’ve contracted a great artist from UpWork who’s been making presets which will be included. There should ideally be as little friction as possible for a user to go from first launch to live, audio-reactive visuals.

          Thanks for checking it out! :)

    • satyarthms a year ago

      Hydra actually works well with music input! It grabs audio from the mic and `a.show()` will show you the frequency bins. Then any numerical parameter can be modulated by the intensity of a bin, for example:

      `noise().thresh(()=>a.fft[0]*2).out()`

      • jerjerjer a year ago

        > It grabs audio from the mic

        Is it possible to grab from the default audio output device instead of the mic? Probably not, as it's browser-based. I suppose the mic can be faked at the OS level somehow.

    • progmetaldev a year ago

      I used to spend so much time messing around with MilkDrop in Winamp. You could grab existing visualizations and see what they were doing, and make your own edits. Thanks for the nostalgia hit!

    • jcelerier a year ago

      You can do that easily with https://ossia.io :)

  • leptons a year ago

    There are a lot of examples of using JavaScript for "psychedelic graphics" on dwitter.net

dtristram a year ago

Regarding the OP doc and UV coordinates: a major area of investigation for us back in the day was finding interesting ways to displace the UV texture coordinates for each corner of the rectangular mesh. We used per-vertex colors; these days one would use a fragment (pixel) shader like those on ShaderToy.

A very interesting process displaces the texture coordinates by advecting them along a flow field. Use any 2D vector field and apply displacement to each coordinate iteratively. Even inaccurate explicit methods give good results.

After the coordinates have been distorted to a far distance, the image becomes unrecognizable. A simple hack is to have a "restore" force applied to the coordinates, and they spring back to their original position, like flattening a piece of mirroring foil.

Just now I am using feedback along with these displacement effects. Very small displacements applied iteratively result in motion that looks quite a bit like fluid flow.
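The advection-plus-restore process described above can be sketched outside a shader. Below is a minimal NumPy sketch (all names and parameter values are illustrative, not from any of the tools mentioned): a grid of UV coordinates is displaced iteratively along a 2D vector field with a simple explicit step, while a weak spring pulls each coordinate back toward its rest position.

```python
import numpy as np

def advect_uvs(uv0, flow, steps=100, dt=0.01, restore=0.05):
    """uv0: (H, W, 2) rest coordinates; flow(uv) -> (H, W, 2) vector field."""
    uv = uv0.copy()
    for _ in range(steps):
        uv += dt * flow(uv)         # explicit (inaccurate but good-looking) advection
        uv += restore * (uv0 - uv)  # "restore" spring back toward the undistorted grid
    return uv

# Example: a simple rotational flow around the center of the texture.
h, w = 64, 64
ys, xs = np.mgrid[0:h, 0:w]
uv0 = np.stack([xs / (w - 1), ys / (h - 1)], axis=-1)

def swirl(uv):
    c = uv - 0.5                                       # offset from texture center
    return np.stack([-c[..., 1], c[..., 0]], axis=-1)  # rotate each offset 90 degrees

uv = advect_uvs(uv0, swirl)
```

Rendering would then sample the source image at the distorted coordinates each frame; feeding the result back in iteratively, as described above, is what starts to look like fluid flow.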

  • DonHopkins a year ago

    That was how Jeremy Huxtable's PostScript "melt" worked (he invented the original NeWS "Big Brother" Eyes that inspired XEyes): choose a random rectangle, blit it with a random offset, lather, rinse, repeat. It shows how, by repeating a very digital square, sharp, angular effect with a little randomness (dithering), you get a nice smooth organic effect -- this worked fine in black and white too, of course -- it's just PostScript:

    https://www.donhopkins.com/home/archive/news-tape/fun/melt/m...

        %!
        %
        % Date: Tue, 26 Jul 88 21:25:03 EDT
        % To: NeWS-makers@brillig.umd.edu
        % Subject: NeWS meltdown
        % From: eagle!icdoc!Ist!jh@ucbvax.Berkeley.EDU  (Jeremy Huxtable)
        % 
        % I thought it was time one of these appeared as well....
    
        % NeWS screen meltdown
        %
        % Jeremy Huxtable
        %
        % Mon Jul 25 17:36:06 BST 1988
    
        % The procedure "melt" implements the ever-popular screen meltdown feature.
    
        /melt {
            3 dict begin
            /c framebuffer newcanvas def
            framebuffer setcanvas clippath c reshapecanvas
            clippath pathbbox /height exch def /width exch def pop pop
            c /Transparent true put
            c /Mapped true put
            c setcanvas
    
            1 1 1000 {
                pop
                random 800 mul
                random 600 mul
                random width 3 index sub mul
                random height 2 index sub mul
                4 2 roll
                rectpath
                0
                random -5 mul
                copyarea
                pause
            } for
    
            framebuffer setcanvas
            c /Mapped false put
            /c null def
            end
        } def
    
        melt
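For readers without a NeWS emulator handy, here is a rough NumPy transliteration of the melt loop above (a sketch, not the original: the rectangle sizes and the stand-in framebuffer are made up, and array row order stands in for screen coordinates):

```python
import numpy as np

# Rough transliteration of the NeWS melt loop: 1000 times, pick a random
# rectangle of the framebuffer and blit it by a small random offset.

rng = np.random.default_rng(0)
screen = rng.integers(0, 256, size=(600, 800), dtype=np.uint8)  # stand-in framebuffer

def melt_step(img, rng):
    h, w = img.shape
    rw = int(rng.integers(1, 200))   # random rectangle width
    rh = int(rng.integers(1, 150))   # random rectangle height
    x = int(rng.integers(0, w - rw))
    y = int(rng.integers(0, h - rh))
    dy = int(rng.integers(1, 6))     # like `random -5 mul`: shift a few pixels
    if y + rh + dy <= h:
        # copy the rectangle, offset downward (copy() avoids overlapping views)
        img[y + dy:y + rh + dy, x:x + rw] = img[y:y + rh, x:x + rw].copy()

for _ in range(1000):
    melt_step(screen, rng)
```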
    
    Here's Jeremy's original "Big Brother" eye.ps, which was the quintessential demo of round NeWS Eyeball windows:

    https://www.donhopkins.com/home/archive/news-tape/fun/eye/ey...

    • interroboink a year ago

      Are these animated, or somesuch?

      I tried naïvely using `ps2pdf` (Ghostscript), but got errors on both of them. I guess they're meant to be consumed by some other sort of system?

      • DonHopkins a year ago

        Oh sorry, I didn't explain: they're interactive PostScript scripts for the NeWS window system, so they don't actually print - they animate on the screen! The "pause" yields the lightweight PostScript thread and lets the rest of the window system's tasks run, and NeWS had an object-oriented programming system that was used to implement the user interface toolkit, window management, interactive front ends, and even entire applications written in object-oriented PostScript. NeWS is long obsolete, but you can run it in a Sun emulator!

        https://en.wikipedia.org/wiki/NeWS

        For example, here's a heavily commented demo application called PizzaTool:

        https://donhopkins.medium.com/the-story-of-sun-microsystems-...

        Source code:

        https://www.donhopkins.com/home/archive/NeWS/pizzatool.txt

        It uses an iterated feedback pixel-warping technique, kind of like melt.ps, to spin the pizza rotationally, which "melts" the cheese and pizza toppings, instead of melting the screen by simply blitting random rectangles vertically like melt.ps does. Note the randomization of the rotation to "dither" it and smooth out the artifacts you'd get by always rotating it exactly the same amount:

          % Spin the pizza around a bit.
          %
          /Spin { % - => -
            gsave
              /size self send    % w h
              2 div exch 2 div exch   % w/2 h/2
              2 copy translate
              SpinAngle random add rotate
              neg exch neg exch translate  %
              self imagecanvas
             grestore
          } def
        
        It animates rotating a bitmap around its center again and again as fast as you "spin" it with the mouse, plus a little jitter, so the jaggies of the rotation (not anti-aliased, 8 bit pixels, nearest neighbor sampling) give it a "cooked" effect!

        It measures the size of the pizza canvas, translates to the center, rotates around the middle, then translates back to the corner of the image, then blits it with rotation and clipping to the round pizza window.

  • DonHopkins a year ago

    Aaaah, remember the simple, directly manipulative pleasures of Kai's Power Goo:

    LGR: Kai's Power Goo – Classic 90s Funware for PC!

    https://www.youtube.com/watch?v=xt06OSIQ0PE

AndrewStephens a year ago

I love how easy it is to write shaders that operate on images in HTML. My skills in this area are mediocre but I love seeing how far people can take it. Even providing a simple approximation of a depth map can really make the results interesting.

Some years ago I did a similar project to smoothly crossfade (with "interesting effects") between images using some of the same techniques. My writeup (and a demo):

https://sheep.horse/2017/9/crossfading_photos_with_webgl_-_b...

coffeecantcode a year ago

I’ll be honest, I’m far more interested in the rolling hills article that accompanies this one.

Specifically, the part about halfway through the process where it applies:

    uv.x = uv.x + sin(time + uv.x * 30.0) * 0.02;
    uv.y = uv.y + sin(time + uv.y * 30.0) * 0.02;

to the static image. Having had a range of psychedelic experiences in my life, this appears to be the closest visual match to the real thing, at least at low, non-heroic doses. Maybe slow the waves down and lessen the range of motion a bit.
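That sine warp is easy to experiment with outside a shader, too. A minimal NumPy version of the same displacement (grid size and names are illustrative):

```python
import numpy as np

# NumPy version of the sine warp above: each UV coordinate is offset by a
# small sine of (time + coordinate * frequency), which makes an image appear
# to "breathe" as time advances.

def warp(uv, t, freq=30.0, amp=0.02):
    out = uv.copy()
    out[..., 0] += np.sin(t + uv[..., 0] * freq) * amp
    out[..., 1] += np.sin(t + uv[..., 1] * freq) * amp
    return out

ys, xs = np.mgrid[0:64, 0:64] / 63.0   # 64x64 grid of coordinates in [0, 1]
uv = np.stack([xs, ys], axis=-1)
warped = warp(uv, t=1.0)
```

Slowing the waves down amounts to advancing `t` more slowly per frame, and lessening the range of motion to reducing `amp` below 0.02.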

Note: I am far more interested in replicating the visual hallucinations induced by psychedelic compounds than by making cool visuals for concerts and shows, utmost respect for both sets of artists though.

There is an artist (and I’m sure many more) who does a fantastic job with psychedelic visuals using fully modern stacks to edit, unfortunately their account name entirely escapes me. I’ll comment below if I find it.

The comparison that I would make with this portion of the Rolling Hills article would be the mushroom tea scene from Midsommar, specifically with the tree bark. The effect of objects “breathing” and flowing is such a unique visual and I love to see artists accomplishing it in different ways.

  • progmetaldev a year ago

    It's probably not who you were talking about, but this account on YouTube does a good job of representing the visual experience, while also talking about other effects. The videos looking at nature, and the way the visuals start to form geometric patterns and that "breathing" effect, are powerful. The author covers various substances, and how the effects can range from minor (slight "breathing" or pulsing of surfaces) to full geometric "worlds" (such as from DMT - although I've never dipped into that substance).

    https://www.youtube.com/@josikinz

    • coffeecantcode a year ago

      That is not who I had in mind but after looking through their account I’m going to binge their videos, very cool stuff. I always found that studying the minute differences in these substances is such a genuinely interesting topic. It’s covered a lot in Mike Jay’s Psychonauts.

cancerhacker a year ago

In the early 90s, Todd Rundgren released a Mac app called Flowfazer [1] - it didn’t simulate your experience but was helpful as a distraction to move you along. Some people used it to provide guidance for their own creations.[2]

[1] https://grokware.com/ [2] https://m.youtube.com/watch?v=3Z4X4FmIhIw

It was a time of screensavers and palette animation.

brotchie a year ago

If this is your kind of thing and you ever get a chance to see the musical artist Tipper alongside Fractaled Visions driving the visuals, you’re in for a treat.

The most spot-on visual depiction of psychedelic artifacts I’ve witnessed.

Saw them together last year and it’s the no. 1 artistic experience of my life. The richness and complexity of Fractaled Visions’ visuals are almost unbelievable.

Even knowing a lot about shader programming, etc., some of the effects had me going “wtf, how did he do that?”

Here’s the set. It doesn’t fully capture the experience, but it gives a feel; seeing this in 4K at 60fps was next level.

https://youtu.be/qMcqw12-eSk?si=R5mCaIbR01w3Tbyv

trollied a year ago

This needs a link to Shadertoy: https://www.shadertoy.com

cess11 a year ago

Reminds me of an old Flash classic in this area, Flashback.swf. Here's a video render of it: https://m.youtube.com/watch?v=KaSqrx93rS0

  • progmetaldev a year ago

    This video (back in the Flash days) is how I discovered the electronic group Shpongle. Their remix of Divine Moments of Truth is used in this animation. I believe the version is the "Russian Bootleg" version. I had been into electronic music before this, but this genre of electronic really blew my mind when I heard it.

tylertyler a year ago

I've been writing WebGL shaders at work this week, noodling with the details to make things look like physical camera effects. Occasionally I'll get something wrong and see results similar to the stuff in this article, and I have to say it is just so much more fun than the standard image effects.

Sure, there might be limited use cases for it visually, but playing with the models we've built up around how computer graphics work is a great way to learn about each of these systems: not just graphics, but fundamental math in programming, how GPUs work and their connection to memory and CPUs, how our eyes work, how to handle animation/time, and so on.

mwfogleman a year ago

Here's a music video the OP and I made with these techniques: https://www.youtube.com/watch?v=5GOciie5Pjk

alanbernstein a year ago

This might have been written just for me, I love the premise.

I am truly fascinated by people who attempt to reproduce the actual physiological vision effects of psychedelic drugs.

Psychoactive drugs can be probes into the inner workings of our minds - in some scientific sense - and exploring the vision effects seems likely to suggest interesting things about how our visual system works.

Mostly, I am just impressed when anyone is able to capture the visual experience in graphical effects, with any level of realism.

  • caseyohara a year ago

    > Mostly, I am just impressed when anyone is able to capture the visual experience in graphical effects, with any level of realism.

    I have to say that the cliche of super-bright, super-saturated, geometric or melty shapes like in the article is not a great reproduction of the typical visual effects of psychedelics. Except at very high doses, the visual effects are much more subtle.

    The /r/replications subreddit has GIFs and short videos with a much higher degree of realism https://www.reddit.com/r/replications/top/?t=year

  • helboi4 a year ago

    This is 100% not what psychedelics look like. It's generally just mildly more saturated colours and the feeling that everything is possibly breathing or swaying in a more natural way. I dunno what happens if you take insane amounts tbf. I always thought that psychedelic art was a bit more about the sort of thing that is super appealing to look at while tripping.

    • icameron a year ago

      The trick is to go out of body. Eyes closed, and let your mind create all the visuals. Then it's like being in Alex Grey land.

  • GuB-42 a year ago

    Maybe the most "scientifically accurate" replication of psychedelics is in these "DeepDream" images.

    They were originally made to debug neural networks for image recognition. The idea is to run the neural network in reverse while amplifying certain aspects, to get an idea of what it "sees". So if you are trying to recognize dogs, running the network in reverse will increase the "dogginess" of the image, revealing an image full of dog features. Depending on the layer on which you work, you may get some very recognizable dog faces, or something more abstract.

    The result is very psychedelic. It may not be the most faithful representation of an acid trip, but it is close. The interesting part is that it wasn't intended to simulate an acid trip. The neural network is loosely modeled after human vision, and messing with the artificial neurons has an effect similar to how some drugs mess with our natural neurons.
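The "run it in reverse while amplifying" idea reduces to gradient ascent on the input image. Here is a deliberately tiny sketch with a single fixed linear feature standing in for a trained layer (real DeepDream ascends activations of a trained CNN; everything here is made up for illustration):

```python
import numpy as np

# Toy skeleton of the DeepDream idea: amplify whatever a "feature" responds to
# by doing gradient ascent on the image itself.

rng = np.random.default_rng(42)
feature = rng.standard_normal((32, 32))      # pretend "dogginess" detector
image = rng.standard_normal((32, 32)) * 0.1  # start from faint noise

def dogginess(img):
    return float((img * feature).sum())      # activation of the fake feature

# For a linear feature, the gradient w.r.t. the image is just `feature` itself,
# so each ascent step adds the feature pattern into the image.
before = dogginess(image)
for _ in range(50):
    image += 0.1 * feature
after = dogginess(image)
```

With a real multi-layer network the gradient depends on the current image rather than being a fixed pattern, which is where the hallucinated dog faces and abstract textures come from.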

openrisk a year ago

Fun thing: in relativity u,v are typical variable names used for a really funky coordinate transformation that mixes space and time, sometimes called Penrose coordinates [1]. So when I saw this:

> uv.x = uv.x + sin(time + uv.x * 50.0) * 0.01;

> uv.y = uv.y + sin(time + uv.y * 50.0) * 0.01;

I thought, wow, what on Earth is going on here? But no, it turns out that it's not that psychedelic. They could have used p,q or any other variable pair, but it's still quite interesting geometrically [2].

[1] https://en.wikipedia.org/wiki/Penrose_diagram

[2] https://en.wikipedia.org/wiki/UV_mapping

z3phyr a year ago

Slightly offtopic: is there a way to create meshes and animate them directly inside Blender, programmatically? Sort of like Shadertoy, but instead of drawing, sculpting, and rigging manually, I write some code that generates meshes and runs shaders on them for effect?

  • zipy124 a year ago

    Yes, all of Blender is extensible with Python, and the last time I used it in a project at university it was surprisingly easy to do, too.

  • DonHopkins a year ago

    Blender is deeply extensible with Python, and it also has a full-blown visual node programming language for procedurally modifying and generating textures, shaders, 3D geometry and meshes, parametric objects, etc.!

    Actually, Blender has an abstract base "Node" set of Python classes and user interfaces that you can subclass and tailor for different domains, to create all kinds of domain-specific visual programming languages.

    So you can visually program 2D video filters, GPU shaders, 3D geometry, animations, constraints, state machines, simulations, procedural city generators, etc., and each can have its own compilation/execution model, tailored user interface, node libraries, and connection types. Geometry Nodes have the visual programming language equivalent of lambdas: functions you can pass to other functions that parameterize and apply them repeatedly, iterating over 3D geometry, texture pixels, etc.

    Blender extensions can add nodes to the existing languages and even define their own new visual programming languages. So you can use a bunch of integrated tightly focused domain specific visual programming languages together, instead of trying to use one giant general purpose but huge incoherent "uber" language (cough cough Max/MSP/Jitter cough).

    https://docs.blender.org/manual/nb/2.79/render/blender_rende...

    What are Geometry Nodes:

    https://www.youtube.com/watch?v=kMDB7c0ZiKA

    Geometry Nodes From Scratch:

    https://studio.blender.org/training/geometry-nodes-from-scra...

    Free blender City Generator Addon:

    https://www.youtube.com/watch?v=9nLsew8I7KM

    Here's a paid product, an incredibly detailed and customizable city generator (and traffic simulator!) that shows off what you can do with Geometry Nodes, well worth the price just to play with as a video game, and learning geometry nodes:

    Using The City Generator 2.0 in Blender | Tutorial:

    https://www.youtube.com/watch?v=kRHkGoTQKM8

    How to Create Procedural Buildings | Blender Geometry Nodes | Procedural City:

    https://www.youtube.com/watch?v=hGgAEp-n0uk

VinLucero a year ago

Very cool explanation of the connection between time as a variable and graphical design “aberrations”.

b4ckup a year ago

Really interesting! I'm very much interested in psychedelic graphics. I played around with Shadertoy a little bit; maybe I should give it another go. For anyone interested, I made some cool visuals by interpolating prompts in Stable Diffusion 1.5, like https://m.youtube.com/watch?v=ajfMlJuDswc. I found that the older diffusion models are better for abstract graphics, as the output looks more "raw" and creative.

DonHopkins a year ago

https://news.ycombinator.com/item?id=33071119

DonHopkins 11 months ago | parent | context | favorite | on: John Walker, founder of Autodesk, has died

Jim Crutchfield is DOCTOR CHAOS -- he's got a PhD in Complexity Science!

https://www.youtube.com/watch?v=B4Kn3djJMCE

Space-Time Dynamics in Video Feedback

A film by Jim Crutchfield, Entropy Productions, Santa Cruz (1984). Original U-matic video transferred to digital video. 16 minutes.

James P. Crutchfield. Center for Nonlinear Studies, Los Alamos National Laboratories, Los Alamos, NM 87545, USA.

ABSTRACT: Video feedback provides a readily available experimental system to study complex spatial and temporal dynamics. This article outlines the use and modeling of video feedback systems. It includes a discussion of video physics and proposes two models for video feedback, based on a discrete-time iterated functional equation and on a reaction-diffusion partial differential equation. Color photographs illustrate results from actual video experiments. Digital computer simulations of the models reproduce the basic spatio-temporal dynamics found in the experiments.

1. In the beginning there was feedback ...

James P. Crutchfield. "Space-Time Dynamics in Video Feedback." Physica 10D 1984: 229-245.

[pdf] https://csc.ucdavis.edu/~cmg/papers/Crutchfield.PhysicaD1984...

[Plates 1-4] https://csc.ucdavis.edu/~cmg/papers/Crutchfield.PhysicaD1984...

[Plates 5-7] https://csc.ucdavis.edu/~cmg/papers/Crutchfield.PhysicaD1984...

https://csc.ucdavis.edu/~chaos/

calebm a year ago

https://gods.art/math_videos/strange_faces_thumb.html

rikroots a year ago

If we're sharing, this is my effort at psychedelic graphics - animating a gradient over a live video feed (all done using a boring 2D canvas, because I don't have the brain capacity for shaders) over on CodePen: https://codepen.io/kaliedarik/pen/MWMQyJZ

jasonjmcghee a year ago

The fully interactive nature of the post is such a great way to communicate about a topic. Also it's just really clean design.

Appreciate you taking the time!

JansjoFromIkea a year ago

A bit of a tangent, but I'm surprised how heavily visualisers and the like always seem to focus on packing in as much colour as possible. With OLED screens it feels like there's a ton of potential for making really great black-heavy ambient visuals, so that an idle TV can become a feature of a room's decor rather than just a big black rectangle in the middle of it.

  • fourteenfour a year ago

    Yeah, I have an older LG and it has a disappointingly simple 4K fireworks visualization when it "sleeps" that always makes me wish I could create a custom replacement.

mbreese a year ago

The next link in the series was better, IMHO.

https://benpence.com/blog/post/psychedelic-graphics-1

This gets more into how to introduce motion and new visuals instead of the building blocks. The rolling hills graphic was really interesting.

whism a year ago

I write semi-psychedelic paint and video mixing software for personal use. Here’s a video from last year of mixing a few things together, hopefully some here enjoy it :)

https://youtu.be/IgpcJN4qAAg?si=Khq6U1nxXi-9n73A

dghf a year ago

> Basically any color that humans can perceive can be created from a mixture of these three colors.

Many, but not any. No finite set of real primary colours can produce every perceivable colour. Some will always be out of gamut.
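A quick numeric illustration of the gamut point (using approximate CIE 1931 color-matching table values and the standard XYZ-to-linear-sRGB matrix): a monochromatic green around 520 nm falls outside the sRGB gamut, so its linear red component comes out negative.

```python
import numpy as np

# XYZ coordinates of monochromatic ~520 nm light: approximate CIE 1931
# color-matching function values x̄, ȳ, z̄ at that wavelength.
xyz_520nm = np.array([0.0633, 0.7100, 0.0782])

# Standard XYZ -> linear sRGB matrix (D65 white point).
xyz_to_srgb = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

rgb = xyz_to_srgb @ xyz_520nm
# rgb[0] is negative: this color is out of the sRGB gamut.
```

A negative component means no physical (non-negative) mixture of the three sRGB primaries can match that color; a display has to clip it back into gamut.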

epiccoleman a year ago

Ben - so glad I stumbled on this article. Love this kind of graphical stuff (I'm a huge sucker for psychedelia) and I really enjoyed your videos on your channel. Thanks for sharing!

DonHopkins a year ago

Here's a classic video by Rudy Rucker demonstrating his CALab product that he made with John Walker at Autodesk:

https://www.youtube.com/watch?v=lyZUzakG3bE

At 24:28 he shows a running Belousov–Zhabotinsky reaction mapped onto a 3d model's texture:

https://youtu.be/lyZUzakG3bE?t=1468

I wrote about it in the discussion of John Walker passing away, and Josh Gordon, who worked on Chaos at Autodesk, joined the discussion:

https://news.ycombinator.com/item?id=39300605

>DonHopkins 11 months ago | parent | context | favorite | on: John Walker, founder of Autodesk, has died

>I really love and was deeply inspired by the great work that John Walker did with Rudy Rucker on cellular automata, starting with Autodesk's product CelLab, then James Gleick's CHAOS -- The Software, Rudy's Artificial Life Lab, John's Home Planet, then later the JavaScript version WebCA, and lots of extensive documentation and historical information on his web page. CelLab:

https://www.fourmilab.ch/cellab/

https://www.fourmilab.ch/cellab/classic/

https://www.fourmilab.ch/homeplanet/

https://www.rudyrucker.com/oldhomepage/cellab.htm

[...]

>josh_gordon 11 months ago | prev [–]

>I'm amazed that my beloved CHAOS still runs beautifully on emulators like DOSbox. It was the last programming project where I could completely roll my own interface - and maybe my last really fun one.

Here's some stuff I did that was inspired by Rudy Rucker and John Walker's work, as well as Tommaso Toffoli and Norm Margolus's wonderful book, "Cellular Automata Machines: A New Environment for Modeling":

https://news.ycombinator.com/item?id=37035627

by DonHopkins on Aug 7, 2023 | parent | context | favorite | on: My history with Forth and stack machines (2010)

>"Cellular Automata Machines: A New Environment for Modeling" is one of my favorite books of all time! It shows lots of peculiarly indented Forth code. https://donhopkins.com/home/cam-book.pdf

>CAM6 Simulator Demo:

https://www.youtube.com/watch?v=LyLMHxRNuck

>Forth source code for CAM-6 hardware:

https://donhopkins.com/home/code/tomt-cam-forth-scr.txt

https://donhopkins.com/home/code/tomt-users-forth-scr.txt

And a couple more recent videos to music using the SimCity/Micropolis tile set and WebGL tile engine to display cells:

SimCity Tile Sets Space Inventory Cellular Automata Chill Resolve 1

https://www.youtube.com/watch?v=319i7slXcbI

I performed it in real time in response to the music (see the demo below to try it yourself), and there's a particularly vivid excursion that starts here:

https://youtu.be/319i7slXcbI?t=314

The following longer demo starts out with an homage to "Powers of Ten" and is focused on SimCity, but it shows how you can switch between simulators with different rules and parameters: setting rings of fire with the heat-diffusion cellular automaton, switching to the city simulator to watch it all burn as the fires spread out and leave ashes behind, then switching back to another CA rule to zap it into another totally different pattern (you can see a trail of destruction left by not-Godzilla at 0:50 while the city simulator is running).

I had to fix some bugs in the original SimCity code so it didn't crash when presented with the arbitrarily scrambled tile arrangements that the CA handed it -- think of it as fuzz testing. Due to the sequential groups of 9 tiles for 3x3 zones, and the consecutive arrangements of different zone types and growth states, the smoothing heat diffusion creates all these smeared-out concentric rings of zones for the city simulator to animate and simulate: rings of water, looping animations of fire, permutations of roads and traffic density, rippling smokestacks, spinning radars, burbling fountains, an explosion animation that ends in ash, etc.

Chaim Gingold's "SimCity Reverse Diagrams" visually describes the SimCity tiles, simulator, data models, etc:

https://smalltalkzoo.thechm.org/users/Dan/uploads/SimCityRev...

Micropolis Web Space Inventory Cellular Automata Music 1:

https://www.youtube.com/watch?v=BBVyCpmVQew

You can play with it here. Click the "X" in the upper left corner to get rid of the about box, use the space bar to toggle between SimCity and Cellular Automata mode, the letters to switch between cities, + and - switch between tile sets (the original SimCity monochrome tiles are especially nice for cleansing the palette between blasts of psychedelic skittles rainbows, and the medieval theme includes an animated retro lores 8 bit pixel art knight on a horse), the digits to control the speed, and 0 toggles pause. (It's nice to slow down and watch close up, actually!):

https://micropolisweb.com/

As you can see it's really fun to play with to music and cannabis, but if you're going to use any harder stuff I recommend you get used to it first and have a babysitter with you. Actually the whole point of my working on this for decades is so that you don't need the harder stuff, and you can put it on pause when your mom calls in the middle of your trip and you have to snap back to coherency, and close the tab when you've had enough.

CodeWriter23 a year ago

That humanoid figure, looks like dude is getting some bad acid.

2-3-7-43-1807 a year ago

And where are the psychedelic graphics now?

  • leptons a year ago

    Yeah, I did not find anything psychedelic in the article or the pages linked from it.

sowut a year ago

nice

DonHopkins a year ago

https://news.ycombinator.com/item?id=33105030

>deepnet on Oct 6, 2022 | parent | context | favorite | on: Recording the Grateful Dead: The Culture of Tapers

>The overlap between early nerd culture and The Grateful Dead was very significant.

>Taping and sharing culture and its benefits were very apparent in many net forums.

>As were democratisation of the new tools, public terminals with BBS access and the Deadheads community spirit exemplified on Usenet and Arpanet.

>Look no further than John Perry Barlow, EFF co-founder and his Manifesto of Cyberspace - he was a Grateful Dead Lyricist !

https://www.wired.com/2016/02/its-been-20-years-since-this-m...

>Barlow's paradigm seems cheeky without awareness of the Net's public roots, how it came up through BBS and Fidonet culture, is forgotten by those who only saw the view of the Net as a gift from the ivory towers of academia and the military rather than bedroom z80 & 6502 modem culture.

q.v. Fidonet BBS documentary

https://m.youtube.com/watch?v=Dddbe9OuJLU

DonHopkins on Oct 6, 2022 [–]

>In another comment reply to Gumby, I mentioned how I often accidentally call them "Grateful Dead Conferences", because so many tech people I knew and worked with in Silicon Valley and the Free Software community and regularly saw at computer conferences and trade shows would show up at Dead shows.

>The Raster Masters would lug enormous million dollar high end SGI workstations across North Shoreline Boulevard from SGI headquarters to Shoreline Amphitheater, and actually pack them into trucks and travel on tour with the Dead, performing live improvisational psychedelic graphics on the screen behind the band in real time to their live music, using an ensemble of custom software they wrote themselves, mixing together and feeding back the video of several SGI workstations in real time.

>At one concert, some hippie came up to me, pointed at the graphics on the screen behind the stage in awe, and said, "I took all these shrooms, I'm tripping my balls off, and you would not fucking believe what they're making me seeing on the screen up there!!!" I explained to him that I hadn't taken any shrooms, but I could see the exact same thing!

>The Raster Masters wrote and performed their own software, which reflected the taping and sharing culture of the Dead scene, including ElectroPaint and the Panel Library from NASA, whose source code and recorded live performances were distributed with SGI's demo software and free source code library.

>The improvisational software was like a musical instrument performed in real time along with the music.

[...Lots more stuff with links and videos at the link:...]

https://news.ycombinator.com/item?id=33105030
