How AudioRoute works

AudioRoute gets system audio into Ableton without aggregate devices — drop one plugin on a track and your Mac's output stays pointed at your normal speakers or interface. Below: how that's actually implemented, and why the usual BlackHole-style output swap isn't necessary.

The short version

Getting system audio into a DAW on macOS has traditionally meant redirecting your Mac's output through a virtual device (BlackHole, Soundflower) and then reassembling a path to your speakers with an aggregate or multi-output device. It works, but it hijacks your audio setup and breaks monitoring in subtle ways.

AudioRoute uses CoreAudio process taps, an API Apple shipped in macOS 14.2. Process taps let a trusted helper observe the system mix without intercepting it — your Mac's output continues going to whatever device you had selected, and AudioRoute gets a parallel copy of the audio to deliver to your DAW.

The four components

AudioRoute ships as four small pieces that talk to each other over local IPC. Each one has a narrow job.

┌─────────────┐          ┌──────────────┐
│  Tray app   │ ◀──────▶ │    Daemon    │
│  (UI/ctrl)  │   UDS    │  (capture)   │
└─────────────┘          └──────┬───────┘
                                │ shared memory
                                │ (lock-free ring)
                                ▼
                         ┌─────────────┐
                         │  VST / AU   │────▶ DAW track
                         │   plugin    │
                         └─────────────┘
                                ▲
                                │ (or)
                         ┌─────────────┐
                         │ Virtual HAL │────▶ DAW input
                         │   device    │
                         └─────────────┘

1. Daemon

A user-mode background process (not a kernel extension). It installs the CoreAudio process tap, reads the captured audio, and writes it into a shared-memory ring buffer. Runs per-user as a LaunchAgent — starts with your session, not with the machine.

This is where essentially all of the audio work happens. It's intentionally the component with the fewest responsibilities beyond "capture and publish."
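A per-user LaunchAgent is just a property list in ~/Library/LaunchAgents. A minimal sketch of what such a manifest could look like — the label and binary path here are invented for illustration, not AudioRoute's actual identifiers:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Hypothetical label and path, for illustration only. -->
    <key>Label</key>
    <string>com.example.audioroute.daemon</string>
    <key>ProgramArguments</key>
    <array>
        <string>/Library/Application Support/AudioRoute/audiorouted</string>
    </array>
    <!-- Start with the user's login session; restart if the daemon exits. -->
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>
```

Because it's a LaunchAgent rather than a LaunchDaemon, it runs in the user's session with the user's permissions — which is also what the audio-capture consent model expects.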

2. Tray app

A menu-bar UI built with JUCE. It doesn't touch audio — it just talks to the daemon over a Unix domain socket for control (start/stop, status, device selection, level metering). If you quit it, capture keeps working; it's just a remote control.

3. VST3 / AU plugin

Drop it on an Ableton (or any DAW) track. The plugin opens the same shared-memory ring the daemon writes to, reads audio in the DAW's buffer size, and outputs it as the track's signal. No audio preferences to change in the DAW — your interface stays selected, the plugin just appears as a new sound source on whatever track you put it on.
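The read side can be sketched in a few lines. This is assumed behavior, not AudioRoute's actual code — `readBlock` and the deque standing in for the shared ring are hypothetical — but the key property holds for any realtime-safe reader: when fewer samples are available than the DAW asks for, it fills the remainder with silence rather than waiting.

```cpp
#include <algorithm>
#include <cstddef>
#include <deque>

// Sketch of a plugin's per-buffer read. `ring` stands in for the
// shared-memory queue; `numSamples` is whatever block size the DAW uses.
// Never blocks: an underrun produces silence, not a stall.
size_t readBlock(std::deque<float>& ring, float* out, size_t numSamples) {
    size_t got = std::min(numSamples, ring.size());
    std::copy_n(ring.begin(), got, out);           // deliver what we have
    ring.erase(ring.begin(), ring.begin() + got);
    std::fill(out + got, out + numSamples, 0.0f);  // underrun -> silence
    return got;                                    // samples actually delivered
}
```

The DAW's block size and the daemon's capture cadence never need to agree; the ring absorbs the difference.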

4. Virtual audio device

A CoreAudio HAL plugin installed at /Library/Audio/Plug-Ins/HAL/. If you'd rather use AudioRoute as a standard input device — in Audacity, OBS, Zoom, or any DAW's input dropdown — this is the path. Under the hood it reads from the same shared-memory ring as the VST.

Why shared memory?

Audio plugins run inside the DAW's realtime audio thread, which has hard timing constraints. You cannot block that thread on a network call, a mutex, or a system call that might take longer than a few hundred microseconds. So the plugin and the daemon don't talk "live" — they meet at a shared-memory ring buffer, written by the daemon and read by the plugin without locks (a single-producer single-consumer queue).
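The single-producer single-consumer discipline can be sketched in C++. This is the general SPSC pattern, not AudioRoute's actual memory layout — in the real thing the buffer lives in a shared-memory segment (something like shm_open + mmap) and carries audio frames, not single floats — but the index discipline is the same:

```cpp
#include <atomic>
#include <cstddef>
#include <vector>

// Lock-free single-producer single-consumer ring buffer.
// The daemon (producer) owns the write index, the plugin (consumer)
// owns the read index; acquire/release atomics replace any lock.
class SpscRing {
public:
    explicit SpscRing(size_t capacity) : buf_(capacity + 1) {}

    // Producer side (daemon). Returns false when full: drop, never block.
    bool push(float sample) {
        size_t w = write_.load(std::memory_order_relaxed);
        size_t next = (w + 1) % buf_.size();
        if (next == read_.load(std::memory_order_acquire))
            return false;
        buf_[w] = sample;
        write_.store(next, std::memory_order_release);
        return true;
    }

    // Consumer side (plugin). Returns false when empty: caller fills silence.
    bool pop(float& out) {
        size_t r = read_.load(std::memory_order_relaxed);
        if (r == write_.load(std::memory_order_acquire))
            return false;
        out = buf_[r];
        read_.store((r + 1) % buf_.size(), std::memory_order_release);
        return true;
    }

private:
    std::vector<float> buf_;
    std::atomic<size_t> write_{0};
    std::atomic<size_t> read_{0};
};
```

Neither side ever touches the other's index with anything stronger than an atomic load, which is why the DAW's realtime thread can read from it safely.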

Control messages (start, stop, change source) go over a Unix domain socket separately, because those can tolerate the latency of a syscall.
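As a sketch of that split — here a datagram socketpair() stands in for the tray-app and daemon ends of the control socket (the actual protocol and socket path aren't public), exchanging short text commands, which is fine because control traffic is not latency-critical:

```cpp
#include <string>
#include <sys/socket.h>
#include <unistd.h>

// Toy round trip over a Unix-domain datagram pair: one end plays the
// tray app sending a command, the other plays the daemon acknowledging.
std::string roundTrip(const std::string& cmd) {
    int fds[2];
    if (socketpair(AF_UNIX, SOCK_DGRAM, 0, fds) != 0) return "";
    // "Tray app" sends a command...
    send(fds[0], cmd.data(), cmd.size(), 0);
    // ..."daemon" receives it and replies with an acknowledgement.
    char buf[64] = {};
    ssize_t n = recv(fds[1], buf, sizeof(buf) - 1, 0);
    std::string ack = "ok:" + std::string(buf, n);
    send(fds[1], ack.data(), ack.size(), 0);
    char reply[64] = {};
    ssize_t m = recv(fds[0], reply, sizeof(reply) - 1, 0);
    close(fds[0]);
    close(fds[1]);
    return std::string(reply, m);
}
```

A real daemon would bind() the socket at a filesystem path so separate processes can find it; the pair just keeps the sketch self-contained.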

Where the code lives

AudioRoute is not open source today, but I'm happy to go deeper on any of this — specifically the shared-memory layout, the tap lifecycle, latency numbers, or the Windows plan. Drop a note and I'll reply.

Ask me anything

[email protected]

Architecture questions, build questions, "does this do X?" — all welcome. I usually reply within a day.