I was discussing AI with several friends: what it does to programming and to the world. One of the themes was connecting AI to physical stuff. On an unrelated path I bought some Roland synths to play with and ended up not using them much, because there is too much other stuff happening in the world. And then I had this idea to connect the dots, to connect the two hobbies, and find out whether AI can be hooked up to HW synths to play some music.
In the end it seems almost unrealistically easy. Connect a synth to a computer and tell the AI to use it. Ha. That easy. Ok, maybe not always; I actually had to make a few iterations of the video, because in some iterations the video recording was not on, in some the AI got lost, in some I gave it too much information, in some too little. In some iterations the music was just too ugly. A decent compromise follows. The music may not be top notch, but read the AI conversations. I find some parts quite nice - maybe, just maybe, we can in the end cooperate with the AIs and avoid the Terminator drama.
I made a few more attempts. I wanted the AI to be writing new music while I play the existing one, and then transition seamlessly. Some attempts did not go too well (for example, a Unix-signals-based implementation had too much overlap, with the old and new patches playing over each other; and simply killing one Python program and starting another usually hits the music mid-bar and produces a really awful transition).
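To show what "not mid-bar" means in practice, here is a minimal sketch of bar-aligned switching in Python with the mido library. The pattern format, the tempo, and the `pending` variable are assumptions made for illustration; this is not the code from the videos.

```python
# Minimal sketch of bar-aligned patch switching (assumptions: mido with a
# default MIDI output, 16-step patterns, drum notes on channel 9).
import time
import mido

BPM = 120
STEP_SEC = 60.0 / BPM / 4                 # one 16th note
STEPS_PER_BAR = 16

out = mido.open_output()                  # default MIDI output port

current = [(0, 36), (4, 38), (8, 36), (12, 38)]   # toy kick/snare pattern
pending = None                            # set by whoever schedules the next patch

def play_bar(pattern):
    for step in range(STEPS_PER_BAR):
        hits = [note for s, note in pattern if s == step]
        for note in hits:
            out.send(mido.Message('note_on', note=note, velocity=100, channel=9))
        time.sleep(STEP_SEC)
        for note in hits:
            out.send(mido.Message('note_off', note=note, channel=9))

while True:
    play_bar(current)
    if pending is not None:               # swap only between bars, never mid-bar
        current, pending = pending, None
```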
Kind of by accident, the AI was also creating visualizations in the MIDI Python patches. Which was nice. So I vibe coded this visualization paired with a directory-scanning and patch-replacement micro-app. It scans a directory, offers a menu of the patches found there, and lets the user schedule the next patch. If a new patch is discovered mid-play, it is added to the menu. In the bottom half there are some visualizations - some just for looks, but some help time the physical knob tweaks by displaying how much time is left until a transition.
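I am not reproducing the micro-app here, but a rough sketch of its discovery and scheduling half could look like the following. The `patches` directory, the menu printout, and the scheduling queue are hypothetical placeholders; the real app also handles the transitions and draws the visualizations.

```python
# Rough sketch of the discovery/scheduling half only (hypothetical ./patches
# directory of standalone patch scripts).
from pathlib import Path

PATCH_DIR = Path("patches")       # assumed location of patch scripts
known = {}                        # name -> Path of every patch seen so far
queue = []                        # names scheduled to play next

def rescan():
    """Pick up patches added to the directory, even mid-play."""
    for path in sorted(PATCH_DIR.glob("*.py")):
        if path.name not in known:
            known[path.name] = path

def show_menu():
    for i, name in enumerate(sorted(known), start=1):
        print(f"{i}. {name}")

def schedule(name):
    """Queue a patch; the player swaps to it at the next bar boundary."""
    if name in known:
        queue.append(name)

rescan()
show_menu()
```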
Speaking of the knobs - the Roland T-8 is uniquely suitable for this collaborative experiment because it has no MIDI CC. The AI may play the bass notes and the beats, but it has no way to change any other parameter, so the human must turn the knobs. Hence the AI-human teaming for live jamming.
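As a small illustration of that constraint (again assuming a mido output wired to the T-8), the sequencer side only ever sends note messages; there is nothing for a control-change message to target, so timbre stays in human hands.

```python
# Illustration of the note-only constraint (assumes a mido output connected to the T-8).
import mido

out = mido.open_output()

# The AI-generated patch can trigger bass notes and drum hits...
out.send(mido.Message('note_on', note=40, velocity=100, channel=1))

# ...but with no CC map on the synth, parameters like cutoff or resonance
# can only be changed by a human turning the physical knobs.
```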
The next video is an attempt at AI-augmented melodic techno.
This was great fun. Maybe too much fun actually.
It is true that, technically, this is not a new concept. It is just a software-based MIDI sequencer connected to a MIDI-capable HW synth, with an LLM generating the sequences on the other side. But to my knowledge not much prior art exists, at least publicly on the internet. So enjoy the videos, and maybe try it yourself. If you do not want to vibe code your own visualization/patch switcher, use the one from the second video - just be warned that it is vibe-coded, meant to be run and played, not meant to be read.