acb's technical journal

Loading macOS AudioUnit instruments in Swift

Recently, I have been experimenting with writing macOS code which loads and uses AudioUnit instruments. More specifically, code which does this not with Apple's DLSMusicDevice or AUMIDISynth but with commercially available software instruments, from vendors such as Native Instruments, KORG and Applied Acoustic Systems. The end goal is to write an app which can open any instrument installed on the system, allow the user to interact with its user interface and play MIDI notes on it, with the results going through an AudioUnit graph of the app's own choosing. All of which sounds fairly straightforward, or so one would think.

Fortunately, Apple have provided their own code examples, in the form of AudioUnitV3Example, a suite containing sample units and hosting applications for both macOS and iOS. Unfortunately, though, while this works perfectly with Apple's own units which come with the OS, its operation with third-party instruments, particularly larger and more complex ones, is variable. While some seem to work perfectly (KORG's recreation of their M1 synthesizer, for example), others sometimes crash the hosting app at various stages (Native Instruments' plugins have tended to do so when changing presets), and at other times behave erratically.

This appears to be an artefact of the AudioUnit V3 API, or possibly of the internal code which bridges V2 AudioUnits to the V3 API. Presumably, while it works well enough for simple components, more complex ones can run into issues with initialisation, memory management or something similar, causing them to malfunction. The fact that most of the many apps which host software instruments are old enough to predate the AudioUnit V3 API suggests that, at least for now, the key to stability is to work with the V2 API. Apps doing so will be in good company, alongside the likes of Logic and Ableton Live, and because of this, as V3-based software instruments come out, there will be a strong incentive to ensure that the adapters allowing them to run under the V2 API are solid. Of course, eventually the bugs will be fixed and the V2 API will be retired, so even if adopting the AudioUnit V3 API is postponed, it should not be abandoned.

One problem with the V2 API, however, is that it is old. It dates back to the Carbon days of Mac OS, and doesn't even touch the Objective-C runtime, let alone Swift. Objects are referred to by opaque integer handles (which may or may not be pointers to undocumented structures); they are created, modified and manipulated with functions which take pointers to memory buffers and return numeric status codes. While the AudioUnit system itself is elegant (being a network of composable unit generators with standard interfaces for parameters and a standard API for composition), the same can't be said, at least these days, of the V2 API for dealing with it. One can access most of it from Swift (with small amounts of C code for some parts, such as initialising structures which have not been bridged to Swift), but the code is not particularly Swifty.
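To illustrate, here is a minimal sketch of what driving the V2 API from Swift looks like: finding an instrument component, instantiating it, and sending it a MIDI note-on, with every call returning a numeric OSStatus that must be checked by hand. (The wildcard search criteria and the fatalError handling are illustrative; a real host would present the list of components to the user and fail more gracefully.)

```swift
import AudioToolbox

// Describe what we want: any music-device (instrument) component.
var desc = AudioComponentDescription(componentType: kAudioUnitType_MusicDevice,
                                     componentSubType: 0,
                                     componentManufacturer: 0,
                                     componentFlags: 0,
                                     componentFlagsMask: 0)

// Walk the component registry; AudioComponent is one of those opaque handles.
guard let component = AudioComponentFindNext(nil, &desc) else {
    fatalError("no instrument components installed")
}

// Instantiate and initialise the unit, checking the status code each time.
var instrument: AudioUnit?
var status = AudioComponentInstanceNew(component, &instrument)
guard status == noErr, let unit = instrument else { fatalError("\(status)") }
status = AudioUnitInitialize(unit)
guard status == noErr else { fatalError("\(status)") }

// Send a note-on (middle C, velocity 100) as raw MIDI bytes.
// (To actually hear anything, the unit would also need to be connected
// to an output unit and rendered.)
MusicDeviceMIDIEvent(unit, 0x90, 60, 100, 0)
```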

I have written an example project which allows the user to load and play AudioUnit instruments, playing MIDI notes with the computer keyboard and operating the instrument's GUI in a window. The program is written in a modular fashion, and can be built to use either the V2 or V3 API; the default is V2, as that is the one which currently works correctly. The two API interfaces live in interchangeable classes implementing a protocol named InstrumentHost, along the lines sketched below.
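The protocol itself can be small; something like the following is enough for an app of this kind. (This is a hypothetical sketch: the names and signatures here are illustrative, not necessarily those of the actual project.)

```swift
import AudioToolbox
import AppKit

// A hypothetical sketch of such a protocol; names and signatures are
// illustrative rather than the project's actual ones.
protocol InstrumentHost {
    /// Load the instrument described by `desc`; call `completion` once ready.
    func loadInstrument(described desc: AudioComponentDescription,
                        completion: @escaping (Bool) -> Void)

    /// Send MIDI note events, as triggered by the computer keyboard.
    func noteOn(_ note: UInt8, velocity: UInt8)
    func noteOff(_ note: UInt8)

    /// Fetch the instrument's custom view, for embedding in a window.
    func loadInstrumentView(completion: @escaping (NSView?) -> Void)
}
```

The rest of the app then holds a single InstrumentHost and neither knows nor cares which API sits behind it; switching between the V2 and V3 implementations becomes a build-time choice.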

The V3 system is fairly straightforward, and uses AVFoundation to initialise its AudioUnits, much as Apple's AudioUnitV3Example does. The V2 system is somewhat more involved; to keep things tidy, it is implemented using some simple Swift wrapper types which encapsulate the parts of AUGraph and AudioUnit functionality it uses. Not counting the V2 wrapper code, the two InstrumentHost implementations are of roughly the same length.
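For reference, the V3 path amounts to little more than the following; a minimal sketch, using Apple's built-in DLS synth as a stand-in for whichever instrument the user actually picks (the component description and the note played are purely illustrative):

```swift
import AVFoundation
import CoreAudioKit

let engine = AVAudioEngine()

// For illustration, Apple's built-in DLS synth; a real host would let the
// user pick a component, e.g. via AVAudioUnitComponentManager.
let desc = AudioComponentDescription(componentType: kAudioUnitType_MusicDevice,
                                     componentSubType: kAudioUnitSubType_DLSSynth,
                                     componentManufacturer: kAudioUnitManufacturer_Apple,
                                     componentFlags: 0,
                                     componentFlagsMask: 0)

AVAudioUnit.instantiate(with: desc, options: []) { avUnit, error in
    guard let avUnit = avUnit else {
        print("instantiation failed: \(String(describing: error))")
        return
    }
    // Wire the instrument into the engine's graph and start rendering.
    engine.attach(avUnit)
    engine.connect(avUnit, to: engine.mainMixerNode, format: nil)
    try? engine.start()

    // For music devices, the returned unit is an AVAudioUnitMIDIInstrument,
    // so notes can be sent directly (middle C, velocity 100).
    (avUnit as? AVAudioUnitMIDIInstrument)?.startNote(60, withVelocity: 100, onChannel: 0)

    // The V3 API will also hand back the unit's view controller for embedding.
    avUnit.auAudioUnit.requestViewController { viewController in
        // viewController?.view can be placed in a window here.
    }
}
```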

This code is by no means exhaustive; just enough of the V2 AudioUnit API has been wrapped in Swift to demonstrate the functionality of this toy app. A number of things are missing (preset management and parameter editing, for example), though hopefully the existing code and the comments in Apple's AudioToolbox.h will suffice to allow those to be implemented if needed. It is hoped that this project will fill a gap on the web, providing an example of how to interoperate with the existing base of AudioUnit instrument plugins from Swift.
