UAPMD: Building MIDI 2.0 Ecosystems on Desktop
Last year (okay, I'm a bit late; read it as 2024, and other years likewise), I wrote a post titled "Building MIDI 2.0 Ecosystems on Android". There I explained that every aspect of a MIDI 2.0 ecosystem can be built on Android. Not much happened on that platform this year, because I have been focusing on MIDI 2.0 on desktop instead. I named the effort UAPMD (ubiquitous audio plugin MIDI device). I kind of started it last year, but for many months this year I was too busy to accomplish something "releasable". At last, though, I tagged version 0.1 within this year.
Here is a screenshot of the v0.1 status quo, covering many of the features it supports.

You can find the sources and its essential information here: https://github.com/atsushieno/uapmd
What UAPMD offers:
- uapmd-app: a plugin host application that instantiates arbitrary audio plugins and exposes them as virtual MIDI 2.0 devices
- remidy: a plugin hosting foundation for VST3, AudioUnit, LV2, and CLAP (without API stability though)
- uapmd: a virtual MIDI 2.0 device bridge layer to the plugin host
The entire codebase is MIT licensed. The modules I use are under permissive licenses like MIT, ISC, or zlib. It involves a lot of Claude Code and Codex work from the last few months, ranging from packaging setup to implementing a handful of plugin features, done by feeding them the spec documents and FLOSS sources to find defects and problematic implementations.
I wrote almost everything we need in the repository (README, docs, wiki). What I'm going to write here is the rest that doesn't fit there.
Rationale
UAPMD achieves something no one else has yet.
Last year I wrote:
It works fairly well with MIDI 1.0 protocol. But what we want now is MIDI 2.0 music player. While it is able to send UMPs to MIDI output devices, we don't really have MIDI 2.0 devices. It is probably not a big deal as UMP outputs from those MIDI 2.0 apps will be translated to MIDI 1.0 messages if the connected device is MIDI 1.0. But then it does not make sense to use MIDI 2.0 there. What I want instead is to be able to play "MIDI 2.0 music" over a "MIDI 2.0 device" which has more than 16 channels. Now I'm exploring that kind of fields, but I need more time to achieve something useful.
So, UAPMD fills the missing piece.
Actually, I already had a handful of virtual MIDI 2.0 devices on Android (blogged last year), but that only works on Android, which is not a "creative" platform to me; I need a PC. Also, those virtual devices are not what I find applicable here (explained later). What I needed was virtual MIDI 2.0 devices on the desktop.
In the very early stage, I figured that I was going to need my own plugin hosting foundation. There were only JUCE and JUCE-based projects (even including Carla, which contains an old fork of JUCE v4 named "water"). I need UMP processing in the common hosting part, and JUCE only deals with input events in the MIDI 1.0 manner. (They announced their plan to support UMP this year, but it still won't happen until v9.0, and I don't plan to wait for them.)
I'll explain more details below, but UAPMD itself ended up being an extra stack that exposes MIDI 2.0 control points for plugin instances on top of the entire plugin hosting foundation. So the actual development effort went mostly into the plugin hosting foundation (apparently).
UAPMD v0.1 is not a solid foundation for multi-track sequencers yet, but it should be at some stage, because hosting one plugin is not enough for my ultimate plan for a "MIDI 2.0 music player". It is also why AAP virtual MIDI 2.0 devices are not enough. But once someone implements a decent multi-track sequencer over the (lib)uapmd sequencer engine, anyone will be able to build a DAW featuring many audio plugin capabilities.
Recap: AAP as MidiUmpDeviceService
Back when I was working primarily on Android plugin stuff, I added support for virtual MIDI 2.0 devices last year. It exposes arbitrary audio plugins as UMP devices, so I could use plugins including (but not limited to) MDA, Sfizz, Dexed, OPNplug(-AE), Odin2, Vital OSS (when it was open source), airwindows... a lot.
The way I mapped audio plugin features to MIDI 2.0 was twofold:
- parameter changes are mapped to Assignable Controllers (NRPNs), preset changes are mapped to Program Changes, and other UMPs are sent to the MIDI event port.
- for more complicated features such as saving states and controlling GUI, we use our own Universal SysEx8 to transmit plugin API requests and responses in an async manner. (More details are blogged here.)
This was kind of interoperable at the UMP level, but we still had no public way to expose the parameter list and program list. They are specified in MIDI-CI Property Exchange, notably as AllCtrlList, ProgramList, and even Device States, but no one had realized them (except for ProgramList on the KORG KeyStage and its siblings).
midicci: MIDI-CI implementation
Creating virtual MIDI 2.0 devices became possible when ALSA gained MIDI 2.0 support in Linux kernel 6.5 (CoreMIDI had supported it earlier). And thanks to libremidi, creating those virtual devices is quite easy. libremidi is so far the only option I know of if we want to achieve that.
Now that we can expose virtual UMP devices, the next step is to have MIDI-CI integrated. Last year I built a fairly advanced MIDI-CI implementation (detailed in two blog posts), but in Kotlin. Kotlin works in general, and ktmidi-ci builds as a Kotlin/Native module, but when it comes to providing fully compilable C++ sources without binary dependencies, it is awkward.
So I decided to port it to C++, now that somewhat higher-level porting can be done easily using coding AI agents like Claude Code and Codex. That is how midicci was born.

It is an experimental app called midicci-keyboard. It does not only act as a UMP keyboard; it also implements a controller list and a program selector by name. This was not doable with MIDI 1.0; it is only doable with MIDI 2.0 plus proper MIDI-CI support. Note that this MIDI 2.0 client application does not resort to any audio plugin features; it should work with any MIDI 2.0 device that supports AllCtrlList and ProgramList. As I mentioned earlier, there are no MIDI 2.0 devices that support AllCtrlList as far as I know. You can still try KORG MIDI 2.0 products for ProgramList.
remidy: audio plugin host
The primary audio plugin host component, "remidy", is a common public API designed in a cross-platform, multi-format manner. It started more like a research project to figure out which plugin features can be commonized, which are format-specific (e.g. CPU multicore threading is specific to CLAP), etc. The primary content of the uapmd repository at that time was the docs directory.
I never thought I was going to implement such a "fully" featured plugin host. The initial development idea was optimistic: if I provided a minimum plugin API implementation, things would just work for simple MIDI features. That was all wrong. If I do not provide appropriate audio bus buffers or event bus buffers, plugins often crash. If I do not implement some interfaces on VST3 IHostApplication, they crash. If I leave some interfaces unimplemented on IEditController, they don't sound. I only had some experience with LV2 from working on Android plugins; I knew almost nothing about the VST3 and AudioUnit APIs.
I was also using DPF's VST3 compatibility layer called Travesty, and while it worked perfectly, it is a C API with manual vtable tricks, and I sometimes got stuck on my own misimplementations. Fortunately, VST 3.8.0 became available under the MIT license very recently, so I could switch to the Steinberg API and directly follow how other developers use it. The translation from Travesty to the VST3 SDK was quite straightforward thanks to Claude Code or Codex (I don't remember which). Of course, they understood the Travesty API fairly well, especially once I told them it is a C compatibility API.
The plugin hosting implementation was actually GUI-less for the first year (though I had only been working on it for about 3 months of it), which meant almost no working IEditController. Technically we could implement that interface and component without a GUI, but it is easier to get it working with one. It is almost impossible to get the right parameter sets for most plugins without a GUI. You might think we could choose a preset by some index or ID, but that is hardly possible: most plugins do not expose presets through each plugin API's presets API.
That is most likely because there are no good enough ways to provide both factory presets and user presets in a consistent manner. For example, how do you deal with preset IDs? Of course, each plugin format offers its own way to handle that appropriately, but bringing them all into a decent common API in plugin SDKs (such as JUCE) would not be easy, especially if they had to cover old APIs such as VST2.
I had a proof-of-concept standalone plugin host called "remidy-plugin-host" (which became part of uapmd-app) using a Web UI, and thought that everything could be implemented through URL handlers. I quickly gave up on that idea as the feature requirements became complicated. I later switched to ImGui, which reduced a lot of boilerplate binding work.
uapmd: UMP mapping and MIDI-CI bridge
My initial plan for the virtual MIDI device management layer was that one simple UMP device would target one single instrument plugin, like how Timidity, FluidSynth, Microsoft GM Synth, etc. worked for MIDI 1.0. This layer was not supposed to be very complicated: once a plugin instance was working, I could simply set up a virtual MIDI device (quite easy with libremidi, as mentioned earlier) and map the incoming UMPs to note on/off, parameter changes, etc., while detecting MIDI-CI universal SysEx and routing it to the MIDI-CI device I set up for the plugin.
But when it comes to a "MIDI 2.0 music player", it becomes a lot more: more like a DAW sequencer engine that is supposed to process and merge multiple audio tracks simultaneously. A potential solution here is Tracktion Engine, except that JUCE projects are in general not suited as reusable libraries. Their graph library might still be useful if people don't mind its (A)GPL license, but it doesn't perform well on Android, which I also target.
Support for a decent multi-track engine is one of the post-0.1 milestones, and it would come with a simple-ish imaginary MIDI 2.0 Container file player (imaginary because the spec document is still unpublished), which would look more like a DAW.
One thing that still looks annoying is that uapmd exposes ALL the instantiated plugins as UMP devices, including effect plugins connected to the instrument plugin. So there will be hundreds of virtual MIDI devices once they are used to construct a song. uapmd-app can disable those UMP devices, but it would be nicer if one track exposed only one UMP device. My initial idea was to behave like that, but I was then mixing all the plugin parameters from those multiple plugins, which of course doesn't work. The ideal solution would be to have a separate Function Block for each plugin, but that needs some design changes in midicci and ktmidi-ci (its origin). It is another post-0.1 task.
The UMP transport layer was (and still is) the cause of a lot of problems. Things get unstable on macOS when I fix some issues on Linux, and vice versa. AllCtrlList is one of the largest UMP blobs I had to deal with, and now State binaries as well (if I remember correctly, KORG's MIDI-CI developers mentioned that too when I talked to them).
misc. notes
AI coding agents: this is my first experience shipping code generated by AI coding agents. midicci began as an automated translation from Kotlin to C++ by Devin. Then I switched my primary engine to Claude Code (and Codex as well). I kept it an AI-only project for a while. While it needed a lot more work, the initial conversion from Compose Multiplatform to Qt was impressive: as you may know, their architectures are totally different, but the conversion was done in one pass (apparently; there were actually a lot of stubs in the converted code, though).
Then I used the same approach to convert the plugin host app UI from the Web UI to ImGui to simplify use of various plugin API features (i.e. within the same C++ code), and that was quite successful. While I know how typical GUI frameworks work (e.g. from implementing XIM support in Mono's Windows Forms around 2008), I had almost no experience with Qt or ImGui, and I don't bother to dig in depth.
Apart from the GUI, the uapmd module was left untouched by the AI engines, but after starting the migration from Travesty to the VST3 SDK, I figured they should perform fairly well at implementing well-documented specifications like plugin format APIs. So I tried creating a common plugin UI API based on CLAP and let them implement it for each plugin format as well (it was partly intentional, to see how they would figure out the mental-model differences, as UI extension design is quite different between CLAP, VST3, AU, and LV2). Then I expanded their use almost everywhere. It's good that remidy was API-first, so I don't have to worry much about awkward copy-pasting from somewhere else.
Testing: uapmd lacks comprehensive tests. What I really miss in the audio world is a solid testing foundation; it has been one of the recurring topics at ADC (audiodevcon) in recent years. Plugins run in a realtime manner and are not very reproducible. Even for the non-realtime parts, audio plugins have to be installed first, then listed and instantiated on the CI servers. I had some experience with that a few years ago, but it was very specific to the song project in question.
I had some plans to get plugins easily installed on GitHub Actions using StudioRack, namely studiorack-cli, but I have to wait for some fixes before it can really install plugins and process audio (currently it does not work).
vNext
There are a couple of things I want to improve for the v0.2 milestone:
- sequencer for multiple tracks and clips (as mentioned above)
- non-desktop targets: migration to the AUv3 API, Android, maybe Web (WebCLAP)?
- hopefully more integrated Function Blocks
I'm not sure how to prioritize them, but hopefully I'll have time to work on them.