MIDI 2.0 Specifications Available for Download

ask.audio

107 points by dmoreno 6 years ago · 35 comments

hjanssen 6 years ago

MIDI is such an interesting standard. Anyone who has anything to do with music equipment can attest that it is literally everywhere. Every (decent) synth nowadays has at least a MIDI input. You can sync practically everything with MIDI's clock function (which was VERY painful to do beforehand, especially if you tried to sync multiple synths). It's interesting to see if MIDI 2.0 can follow up on that. Certainly, MIDI is here to stay and I can't wait to see what cool things people are going to create with it!
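The clock sync mentioned here is simple arithmetic: MIDI 1.0 broadcasts a System Real-Time clock byte (0xF8) 24 times per quarter note, so any device can recover the tempo from the spacing between ticks. A minimal sketch in Python:

```python
# MIDI 1.0 sends 24 clock ticks (status byte 0xF8) per quarter note,
# so the interval between consecutive ticks maps directly to tempo.

def bpm_from_tick_interval(seconds_per_tick: float) -> float:
    """Tempo implied by the spacing between consecutive 0xF8 ticks."""
    return 60.0 / (24.0 * seconds_per_tick)

# At 120 BPM a quarter note lasts 0.5 s, so ticks arrive every
# 0.5 / 24 s (about 20.8 ms).
print(bpm_from_tick_interval(0.5 / 24))  # about 120
```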

  • tannhaeuser 6 years ago

    > Every (decent) synth nowadays has at least a MIDI input

    Well, that sentence could've been straight from a music/computer mag of the early-to-mid 1980s - that's how old MIDI is, DIN jacks and all.

    Source: owned an Atari ST using Steinberg 24 track sequencing software to control Yamaha DX-7 (every New Wave band had one of those on stage), early digital drum machines, digital reverbs/echoes, and other devices around 1985.

    • croon 6 years ago

      Cubase represent!

      My dad was a music teacher and a big proponent of getting the school to purchase Ataris for this reason.

      He in later years also decided to build an electric organ with complete foot pedals and all.

      MIDI never says die.

    • core-questions 6 years ago

      You posted all that, and didn't link to any audio of what you made? Come on, man, we can never have enough DX-7.

  • ptah 6 years ago

    > It's interesting to see if MIDI 2.0 can follow up on that.

    therein lies the rub. MIDI is quite simple. if MIDI 2.0 is error-prone to implement, it will fail, as literally no one has time for that

    • dmorenoOP 6 years ago

      I have not checked the docs yet, but from what I read so far the protocol is actually easier, with clearly defined fixed-size messages for the usual operations that carry much more information.

      But MIDI 1.0 compatibility is always required, so if your device needs the extra bits, it activates the MIDI 2.0 protocol. IIRC it is possible to use other parts of MIDI 2.0, such as MIDI-CI, with the old MIDI protocol.

      Also, things like MPE are really improved, I think, as current MPE is an ugly hack using channels. In terms of implementation time, MIDI 2.0 should be easier than MPE, IMHO.

      The most difficult part would be MIDI-CI (Capability Inquiry): first to set up the MIDI 2.0 protocol, and then to send information about the current device. But that device info is actually a JSON file, so not really difficult.
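The device information described above is exchanged via MIDI-CI Property Exchange, which does carry JSON payloads. A hedged sketch of the idea; the field names below are made up for illustration, not taken from the official MIDI-CI resource definitions:

```python
import json

# Hypothetical device-info payload of the kind a MIDI-CI Property
# Exchange reply might carry as JSON. Field names are illustrative;
# the real resource schemas live in the MIDI-CI specification.
device_info = {
    "manufacturer": "ExampleCo",
    "model": "HypotheticalSynth-1",
    "firmwareVersion": "1.0.0",
    "supportsMidi2": True,
}

# Serialize; in practice the bytes would be chunked into SysEx
# messages for transport to the inquiring device.
payload = json.dumps(device_info).encode("utf-8")
print(payload.decode("utf-8"))
```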

  • nkozyra 6 years ago

    While I won't deny that MIDI has been a great and long-lasting standard, syncing clock via CV was always pretty straightforward; you just lacked thru, so everything was daisy-chained directly.

    CV being effectively monoplexed meant a lot of connections to control a lot of parameters on one or more instruments.

    Of course, the modular crowd is still happy to live in this world, but that's more an aesthetic choice, imo.

    • H1Supreme 6 years ago

      > more an aesthetic choice

      What's the alternative?

      • squarefoot 6 years ago

        Tunneling the analog control signal into digital packets would be easy; then it's just a matter of picking the best PHY interface to send them around.

        Don't worry about sound issues, though. Old hybrid synths (analog generators with digital control and parameter storage) were very recognizable next to fully analog ones (purely CV controlled) because, say, a pitch bend mapped to an 8-bit sampling of a potentiometer (7 bits up and 7 bits down) was quantized too coarsely for the listener to perceive it as an analog control. Today cheap MCUs allow adding digital control to analog generators, filters, envelopes, etc. at a granularity (16+ bits) where any realtime variation sounds identical to a fully analog controlled synth. Latency wouldn't be an issue either, as we're talking about control and not sound generation: MCUs with no OS, or a minimal realtime one, would never impose latencies as high as virtual instruments do. Having an ADSR kick in two milliseconds after the key is pressed would hardly be a problem.
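The quantization argument above can be put in numbers. Assuming a typical +/-2 semitone (400 cent) total pitch-bend range, the step size at different bit depths works out to:

```python
# Step size of a digitized pitch-bend control at various bit depths,
# assuming a typical +/-2 semitone (400 cent) total bend range.

def cents_per_step(bits: int, bend_range_cents: float = 400.0) -> float:
    return bend_range_cents / (2 ** bits)

print(cents_per_step(8))   # 1.5625 cents: audible stepping on slow bends
print(cents_per_step(16))  # ~0.006 cents: far below what anyone can hear
```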

        • core-questions 6 years ago

          > Tunneling the analog control signal into digital packets would be easy

          Yeah, but at that point, you're already giving up the "analog purity" that analog synth fans love so much. Might as well just use midi; and then you might as well just use soft-synths; and then you might as well just use Propellerhead Reason, so you can have all the flexibility of gate/CV fully virtualized.

          • squarefoot 6 years ago

            I meant the control signal, not the audio signal. The audio stays analog, but the CV, once sampled into a digital word, can be packed together with other controls and moved around without fear of losing accuracy, and it makes it trivial, for example, to record and replay it.

thomasfl 6 years ago

The timing must be perfect for startups creating new kinds of MIDI input devices that can support the new expressiveness controls MIDI 2.0 offers. Roli probably has a head start with their Seaboard "keyboard" made of neoprene. The Osmose keyboard from Expressive E also looks promising. Osmose looks like a traditional keyboard, but detects vibration on the keys.

  • atoav 6 years ago

    As a (happy) owner of a Linnstrument, I think supporting MPE would already be a great thing, as it removes a lot of the problems I had with traditional MIDI (e.g. it allows per-finger pressure, pitchbend and mod instead of one pitchbend for all notes played).

    Knowing Roger Linn, however, he is probably already in the process of writing MIDI 2.0 support for the Linnstrument firmware.

    • thomasfl 6 years ago

      The Linnstrument looks awesome. I am the happy owner of a Roli Music block, that actually looks like a small and cute version of the Linnstrument.

  • nkozyra 6 years ago

    Osmose isn't the first to use horizontal key bend/pressure/direction; MIDI MPE already supported that, and things like polyphonic aftertouch.

    I think it could mark a resurgence in midi guitars, where things like bends tend to quantize a bit too much.

    • nitrogen 6 years ago

      Are there any synthesizers or virtual instruments that bend the fundamental+harmonics with the right stretch, and separately from the formants/inharmonics, for a more realistic bend? When I bend a guitar or violin (or piano) on my Korg workstation it sounds like a nasty digital resampling bend, very unnatural.

K0SM0S 6 years ago

> MIDI 2.0 delivers more nuanced expressiveness for electronic instruments. It’s now possible to convey the same kind of subtle expression normally associated with acoustic instruments, thanks to higher-resolution dynamics and control data, vastly extended controller options (including per-note controllers for exceptional articulation), and simplified controller assignments.

This could be a musical revolution in the making, some 10-20 years from now. Huge, huge implications for the entire industry and craftsmanship of "instruments". At the bottom of the market, this could be the proverbial end of the 'cheap' analog stuff for the masses, a world of fantastic-sounding budget instruments. At the state of the art, a whole new category of instruments with potentially crazy original software-defined features.

MIDI 2.0 would have been sci-fi not so long ago. It's fantastic that we are here.

  • hootbootscoot 6 years ago

    are you referring to a control protocol or a magical signal path effect?

    you could always do what you described, for at least 20 years now, as MIDI need not describe your synth's parameter controls

    the issues with MIDI are not so much dynamics as timing resolution. 7 bits of dynamics might not sound like much, but I'm not confident in any known musician's ability to express dynamics with more than 127 discrete levels lol...

    timing resolution of MIDI is great for more quantized musical styles, but accurately capturing nuanced rubato performances is the area that needs to be improved.

    the primary issue with timing in physical MIDI interfaces is timing STABILITY. this is arguably worse on a modern Mac with CoreMIDI than on an Atari 1040ST... This is directly a product of scheduler vagaries and even firmware. MIDI should be locked to the sample clock, perhaps updating as often as once a buffer or even less... (there was even a recent MacBook Pro that had its audio clock jittering all over the place due to a power-management IC hardware rev, aka you can't download an update to fix THAT one, but I digress...)

    • K0SM0S 6 years ago

      > 7 bits of dynamics might not sound like much but I'm not confident in any known musicians ability to express dynamics with more than 127 discrete levels lol...

      Think one step further: when I hit a key on the piano, or a fret on a guitar, virtually all other strings resonate to some degree, however minutely, and this has to do with harmonic resonances, the geometry of the piano, etc. (Fourier + chaos). Now the only way to convey that kind of subtlety currently is either to digitize "as a whole" (microphone) or discretely (e.g. individual string sensors); but each has its tradeoff that you don't get from the other (no discreteness in your microphone, and the discrete approach probably won't render any acoustic feel, let alone room shape, etc.).

      Basically, at a mathematical level, it seems like we should be able to get both worlds — a discrete yet complete description of an "instrument", which obviously has to be designed for the purpose. It's really breaking wide open the barrier between the physics of real-world tangible objects and the mathematics of software objects, computation for music. You may thus simulate analog stuff 'perfectly' (good enough to human ear), or quantize real analog also 'perfectly' (enough).

      Obviously you could do all of that now, building your own stuff, instruments and software and protocols. But having it baked into MIDI is a game changer in terms of actual mainstream use, and thus products to market.

    • nitrogen 6 years ago

      I share your disappointment in what seems like a downward slope in audio latency and jitter on modern hardware. My old AMD Athlon 64x2 with a Sound Blaster Live 5.1 was more consistent than my modern Core i7 workstation with a pro/prosumer audio interface.

  • 72deluxe 6 years ago

    But will anyone hear such subtlety with the heavy-handed compression applied to everything these days, and the low streaming quality 99.9% of listeners are satisfied with?

    It is indeed great news, but from what I hear, I do not notice a lot of subtlety in most music, particularly electronic genres that believe loud square and triangle waves are the future.

    I must be getting old.

    • derefr 6 years ago

      > low streaming quality

      You mean AM radio? ;)

      Keep in mind, MIDI is also used for live performances—and not just in raucous club concert venues, but also in pin-drop quiet orchestral performance halls (if the genre isn’t classical.)

      Also, more than ever, people are releasing music in high-bitrate lossless formats, making even a compressed soundstage not particularly lossy.

    • unlinked_dll 6 years ago

      The difference between 7 bit control changes and continuous floating point automation is night and day, and neither dynamic range compression nor lossy compression affects that.

      EDM in particular is full of sonic subtleties that MIDI 1.0 did not really support. I don't think you're really listening to it if you think it's all triangle waves; those aren't even particularly popular in contemporary sound design.

      • hootbootscoot 6 years ago

        I'm not sure you are referring to a control protocol here... "dynamics" are not directly correlated to velocity; see my lengthy spiel above for why 7 bits is enough, lol...

        I really don't follow you on this "EDM full of sonic subtleties that MIDI 1.0 didn't support". 1) EDM full of sonic subtleties? pray tell. 2) how does a control protocol support sonic obtusities or subtleties in any way, shape or form?

        MIDI = note numbers, note ON, note OFF (or you need to hit reset, as the note will hang), CC or continuous controller, velocity, modulation, pitch-bend... https://www.midi.org/specifications-old/item/table-1-summary...

        you can sniff MIDI messages... there's not much to them.
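The channel messages listed above really are about all there is to the MIDI 1.0 wire format; a minimal Python decoder for them (running status, where a repeated status byte is omitted, is left out for brevity):

```python
# Minimal decoder for the core MIDI 1.0 channel voice messages.
def decode(msg: bytes) -> str:
    status, channel = msg[0] & 0xF0, msg[0] & 0x0F
    if status == 0x90 and msg[2] > 0:
        return f"note_on ch={channel} note={msg[1]} vel={msg[2]}"
    if status == 0x80 or status == 0x90:  # note on, velocity 0 = note off
        return f"note_off ch={channel} note={msg[1]}"
    if status == 0xB0:
        return f"cc ch={channel} controller={msg[1]} value={msg[2]}"
    if status == 0xE0:  # 14-bit pitch bend, LSB first; 8192 is center
        return f"pitch_bend ch={channel} value={(msg[2] << 7) | msg[1]}"
    return "other"

print(decode(bytes([0x90, 60, 100])))  # -> note_on ch=0 note=60 vel=100
```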

        I'm trying to understand how the control protocol would influence the sonic subtleties. It's my distinct experience that most EDM features little to no velocity variance, nor any mapping of velocity to the volume of oscillators etc. I've also noted a distinct lack of anything resembling subtlety in EDM. Call me biased, as I produced dance music for about 30 years and lived from touring as a performing act for over a decade, but I digress, as I am wont to do from time to time...

        triangle waves? do you mean square waves? aka the artifact of ridiculously over-compressed music?

        • nitrogen 6 years ago

          7 bits makes zips on control sweeps. 7 bits is also not nearly enough to cover the expressive velocity range of e.g. a piano.

          • hootbootscoot 6 years ago

            7 bits directly coupled to a control of a filter cutoff, for example, will create a zip. I doubt any living pianists can present substantially more than 127 discrete velocity values, however.

            Regarding the former point you made: let's not confuse a simple low-level means of remotely issuing continuous controller values at a limited bit depth (MIDI CCs) with how you map them in your synthesizer or effect. Smoothing, ramping, interpolating, filtering... You are essentially mapping a coarse range to a finer range, and you can apply acceleration curves, etc. There's no limit here, and there hasn't been in software synths/fx for 20 years. Zipper noise is amateur.
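The smoothing described here can be sketched as a one-pole (exponential) filter on the incoming 7-bit CC stream; the `alpha` coefficient below is an illustrative choice, not anything from the MIDI spec:

```python
# One-pole (exponential) smoothing of a coarse 7-bit CC stream, the
# kind of interpolation described above for avoiding zipper noise.
def smooth(cc_values, alpha=0.1):
    out, y = [], float(cc_values[0])
    for v in cc_values:
        y += alpha * (v - y)   # move a fraction alpha toward the target
        out.append(y)
    return out

# A hard CC jump from 0 to 127 becomes a gradual ramp instead of a step.
ramp = smooth([0] + [127] * 20)
print(round(ramp[1], 1), round(ramp[-1], 1))
```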

            MIDI is essentially a very simple synchronous wire, a stream of events in which the transport timing is THE timing framework, leaving MTC aside for the moment.

            to call this protocol MIDI 2.0 is not accurate, as it's more like a meta-MIDI protocol much more like OSC.

            all are physical-transport agnostic; it's just that MIDI and its 5-pin DIN connector are ubiquitous, present on old gear (old gear is valued by music people...)

            and you can connect one of those wires, pin 2 or 3 IIRC, directly to your microcontroller and be toggling your sound thingy faster than you can say "debounce"... it's stupidly simple, and the notion of "protocol negotiation" runs counter to the spirit of the original entity.

            call this OSC ALT 2 or something...

            • nitrogen 6 years ago

              > 7 bits directly coupled to a control of a filter cutoff, for example, will create a zip. I doubt any living pianists can present substantially more than 127 discrete velocity values, however.

              Filter cutoffs are another example where 7 bits aren't enough, and if they linearly map to 20..20kHz, then even 14 bits isn't enough.
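The linear-mapping point can be checked with quick arithmetic: spreading a CC linearly across 20 Hz..20 kHz gives steps that are enormous relative to the low end of the range, where the ear is most sensitive to cutoff movement:

```python
# Frequency step size when a CC is mapped linearly across the audio
# band. Near 20 Hz a ~157 Hz step spans several octaves' worth of
# perceived cutoff movement, which is why linear mappings zip badly.

def linear_step(bits: int, lo: float = 20.0, hi: float = 20000.0) -> float:
    return (hi - lo) / (2 ** bits - 1)

print(linear_step(7))   # about 157 Hz per step
print(linear_step(14))  # about 1.2 Hz per step
```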

              As for conscious piano velocities, there are ppp, pp, p, mp, mf, f, ff, fff, and you could maybe add pppp and ffff for 10. But those are velocity ranges, and you definitely will notice if every note within a passage is quantized to one of 12 or 13 velocity levels.

              First of all there's the accent pattern of each measure where in 6/8 time you'd want 6 velocities. There's also expression within a chord and from note to note on a melodic sequence, where e.g. a note struck by the pinky might be expected to be just a little bit quieter. There are gradual crescendos that might last for more than 12 notes as well. And finally there's just the subtle randomness of the player and the instrument that makes things sound natural instead of artificial.

              So for a piano piece to sound natural, you absolutely must have more than 128 velocity levels. Maybe 4096 would get you by. Boesendorfer's older CEUS computer piano system used more, but I never used it and it looks like they have switched to Yamaha's Disklavier.

              ----

              All that said, I think you do have a fair point that the jump in complexity is significant from MIDI1 to MIDI2. Each protocol seems to have been designed near the state of the art of its respective time period. Maybe in another 30 years it will be just as easy to drop a $1 microcontroller on a board and talk MIDI2.

lioeters 6 years ago

I've been following the development of MIDI 2.0 specs ¹ with interest (just personal, not for work).

In the past year, I got fairly deep into Open Sound Control ², which has so much fun potential and can practically be a superset of MIDI. In fact, I implemented encode/decode from OSC <-> MIDI in C++ and Node.js for a hobby project.

So I wonder, could MIDI and OSC converge in the future?

It seems to me that, the latter being a generic protocol for any kind of message, including musical data, it could supersede MIDI depending on industry adoption (think MIDI+OSC instruments).
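For readers unfamiliar with OSC's wire format: an OSC message is an address pattern, a type tag string, and big-endian arguments, with strings padded to 4-byte boundaries. A minimal encoder sketch (the address below is made up for illustration):

```python
import struct

# Minimal OSC message encoder. Strings are null-terminated and padded
# to 4-byte boundaries; float32 arguments are big-endian, per the
# OSC 1.0 specification.

def osc_pad(b: bytes) -> bytes:
    b += b"\x00"                        # required null terminator
    return b + b"\x00" * (-len(b) % 4)  # pad to a 4-byte boundary

def osc_message(address: str, *args: float) -> bytes:
    tags = "," + "f" * len(args)        # type tag string, e.g. ",ff"
    data = osc_pad(address.encode()) + osc_pad(tags.encode())
    for a in args:
        data += struct.pack(">f", a)    # big-endian float32
    return data

msg = osc_message("/synth/note", 60.0, 0.8)
print(len(msg), len(msg) % 4 == 0)  # message length is 4-byte aligned
```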

---

¹ https://www.midi.org/articles-old/details-about-midi-2-0-mid...

² http://opensoundcontrol.org/

dacohenii 6 years ago

I'm glad to see the MIDI spec is available for download! It was my understanding that the only way to get the specification for MIDI 1.0 was to buy a hard copy of the spec, and that sales of the book were the only way the MIDI association made money. Looks like now both are free.

I think I had heard that from an interesting youtube lecture on how MIDI works, which I'll link to just in case anyone is interested: https://www.youtube.com/watch?v=ZPteB_LpHoM

hootbootscoot 6 years ago

yeah, great... "protocol negotiation" lol... this wants to be OSC 2.0 or maybe just everything to everyone?

See, MIDI has an elegant simplicity in that you can connect one stupid wire (either pin 2 or 3, IIRC) to your microcontroller and before you can even say "debounce" you can be triggering envelopes...

MIDI, as in normal old-assed MIDI is low-level.

Look, I get it! if ONLY we had the full 8 bits for CC's instead of only 7, etc...

The temptation to "improve and update everything" results in the crap 2020 software-engineering artifacts that we leave as our legacy: firmly reminding eternity of just how far we believed that modern trendy habits are always the best for everyone, always, in everything...

I would have joked about making your "profiles" in XML, but then as we know it's 2020 so that would be JSON, lol...

FraKtus 6 years ago

When can we expect to see it in macOS / Windows?

Did any big actor already commit to it (Ableton, Cubase)?

polyterative 6 years ago

Does MIDI 2 suffer from MIDI timing jitter? I'm using modular (Eurorack) now and that issue made MIDI unusable for me.

  • dmorenoOP 6 years ago

    It has a mechanism to reduce it, quite probably by adding some latency. But a constant 8 ms of latency always beats random jitter between 1 ms and 8 ms.

    This will also help with WiFi + rtpmidi and bluetooth.

    From the (oldish) midi.org article[1]:

    > The Universal MIDI Packet format adds a Jitter Reduction Timestamp mechanism. A Timestamp can be prepended to any MIDI 1.0 Protocol message or MIDI 2.0 Protocol message for improved timing accuracy.

    1: https://www.midi.org/articles-old/details-about-midi-2-0-mid...
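Based only on the midi.org description quoted above, a JR Timestamp is a 32-bit Utility message in the Universal MIDI Packet carrying a 16-bit sender clock time. The bit layout below is an assumption for illustration, not verified against the spec:

```python
# Assumed layout (not spec-verified): message type 0x0 (Utility) in
# the top nibble, then group, then status 0x2 for JR Timestamp, with
# the 16-bit sender clock time in the low half of the 32-bit word.
def jr_timestamp_word(group: int, clock_time: int) -> int:
    MT_UTILITY, STATUS_JR_TIMESTAMP = 0x0, 0x2
    return ((MT_UTILITY & 0xF) << 28) | ((group & 0xF) << 24) \
        | (STATUS_JR_TIMESTAMP << 20) | (clock_time & 0xFFFF)

word = jr_timestamp_word(group=0, clock_time=1234)
print(f"{word:08x}")  # -> 002004d2
```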

  • hootbootscoot 6 years ago

    Well, there should be a sample timestamp in there, so that's ONE benefit a high-level async protocol provides. The delivery timing does not have to correlate to the playback timing, so this will reduce the need for real-time determinacy of the data stream's transport.

    make sense?
