acb's technical journal


2017/12/8

Loading macOS AudioUnit instruments in Swift

Recently, I have been experimenting with writing macOS code which loads and uses AudioUnit instruments. More specifically, code which does this not with Apple's DLSMusicDevice or AUMIDISynth but with commercially available software instruments, from vendors such as Native Instruments, KORG and Applied Acoustic Systems. The end goal is to write an app which can open any instrument installed on the system, allow the user to interact with its user interface and play MIDI notes on it, with the results going through an AudioUnit graph of the app's own choosing. This sounds fairly straightforward, or so one would think.

Fortunately, Apple have provided their own code examples, in the form of AudioUnitV3Example, a suite containing sample units and hosting applications for both macOS and iOS. Unfortunately, though, while this works perfectly with Apple's own units which come with the OS, its operation with third-party instruments, particularly larger and more complex ones, is variable. While some seem to work perfectly (KORG's recreation of their M1 synthesizer, for example), others sometimes crash the hosting app at various stages (Native Instruments' plugins have tended to do so when changing presets), and at other times behave erratically.
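For concreteness, the V3 hosting path looks roughly like the following minimal sketch: installed instruments are discovered through AVAudioUnitComponentManager and instantiated asynchronously, with V2 units bridged behind the scenes. (The setup in AudioUnitV3Example is considerably more involved; this is just an illustration, with error handling abbreviated.)

    import AVFoundation

    // Discover installed software instruments: a zeroed description with only
    // componentType set acts as a wildcard match.
    var desc = AudioComponentDescription()
    desc.componentType = kAudioUnitType_MusicDevice
    let instruments = AVAudioUnitComponentManager.shared().components(matching: desc)

    // Instantiate the first one asynchronously; any V2 units installed on the
    // system are bridged into the V3 world at this point.
    if let component = instruments.first {
        AVAudioUnit.instantiate(with: component.audioComponentDescription,
                                options: []) { avUnit, error in
            guard let avUnit = avUnit else {
                print("instantiation failed: \(String(describing: error))")
                return
            }
            print("loaded \(avUnit.name)")
            // avUnit can now be attached to an AVAudioEngine graph.
        }
    }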

This appears to be an artefact of the AudioUnit V3 API, or possibly of the internal code which bridges V2 AudioUnits to the V3 API. Presumably, while the bridging works well enough for simple components, some more complex ones have issues with initialisation, memory management or something similar, causing them to malfunction. The fact that most of the many apps which host software instruments are old enough to predate the AudioUnit V3 API suggests that, at least for now, the key to stability is to work with the AudioUnit V2 API. Apps doing so will be in good company, alongside the likes of Logic and Ableton Live, and because of this, as V3-based software instruments come out, there will be a strong incentive to ensure that the adapters allowing them to run under the V2 API are solid. Of course, eventually the bugs will be fixed and the V2 API will be retired, so even if adoption of the AudioUnit V3 API is postponed, it should not be abandoned.

One problem with the V2 API, however, is that it is old. It dates back to the Carbon days of Mac OS X, and doesn't even touch the Objective-C runtime, let alone Swift. Objects are referred to by opaque integer handles (which may or may not be pointers to undocumented structures); they are created, modified and manipulated with functions which take pointers to memory buffers and return numeric status codes. While the AudioUnit system itself is elegant (being a network of composable unit generators with standard interfaces for parameters and a standard API for composition), the same can't be said, at least these days, for the V2 API for dealing with it. One can access most of it from Swift (with small amounts of C code for some parts, such as initialising structures which have not been bridged to Swift), but the code is not particularly Swifty.
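To give a flavour of what this looks like in practice, here is a minimal sketch of opening an instrument and sending it a note through the V2 C API from Swift. (Apple's DLS synth stands in for a third-party instrument's description here; for a commercial instrument, the subtype and manufacturer codes would come from its component registration.)

    import AudioToolbox

    // Describe the unit to load. Everything below is opaque handles,
    // pointer arguments and OSStatus return codes.
    var desc = AudioComponentDescription(componentType: kAudioUnitType_MusicDevice,
                                         componentSubType: kAudioUnitSubType_DLSSynth,
                                         componentManufacturer: kAudioUnitManufacturer_Apple,
                                         componentFlags: 0,
                                         componentFlagsMask: 0)
    guard let component = AudioComponentFindNext(nil, &desc) else {
        fatalError("component not found")
    }

    var unit: AudioUnit?
    var status = AudioComponentInstanceNew(component, &unit)
    guard status == noErr, let instrument = unit else { fatalError("\(status)") }
    status = AudioUnitInitialize(instrument)

    // Note-on as raw MIDI bytes, not objects (0x90 = note on, channel 0;
    // note 60 = middle C, velocity 100).
    status = MusicDeviceMIDIEvent(instrument, 0x90, 60, 100, 0)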

I have written an example project which allows the user to load and play AudioUnit instruments, playing MIDI notes through the computer keyboard and operating the instrument's GUI in a window. This program is written in a modular fashion, and can be built to operate using either the V2 or V3 API; the default is V2, as that's the one that currently works correctly. The two API interfaces are in interchangeable classes implementing a protocol named InstrumentHost.
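The real protocol is in the project itself; as a purely hypothetical illustration of its shape (the method names below are my own invention, not necessarily the project's actual API), it might look something like:

    import AVFoundation
    import Cocoa

    // A hypothetical sketch of what an InstrumentHost protocol could look
    // like; the method names are illustrative, not the project's actual API.
    protocol InstrumentHost {
        /// Load the instrument identified by a component description.
        func loadInstrument(_ description: AudioComponentDescription,
                            completion: @escaping (Bool) -> Void)

        /// Note events driven from the computer keyboard.
        func noteOn(_ note: UInt8, velocity: UInt8)
        func noteOff(_ note: UInt8)

        /// Fetch the instrument's GUI for display in a window.
        func requestView(completion: @escaping (NSView?) -> Void)
    }

A design along these lines lets the rest of the app remain indifferent to which API generation is doing the hosting, so the V3 implementation can be swapped back in once it stabilises.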

 [...]

Tags: audio, audiounit, macos, swift
