Freeform / Modular / DAW-less

How the iOS Music Scene is growing its own groove

Bram Bos
5 min read · Jun 9, 2018

There’s no denying the world of mobile music is growing up. With hundreds of synths, sequencers and audio effects to choose from, Apple’s iOS platform offers a rich toolshed full of virtual instruments for the musician who wants to get away from his or her desktop-based studio. But one of the defining aspects of growing up is finding one’s own identity.

One of the defining aspects of growing up is finding one’s own identity

Plenty of musicians have come to iOS in search of an ‘Abletonesque’ workflow, or ways to translate their familiar DAW-centric music process into one that works the same on the bus/couch/plane. And just as many of them have come away frustrated and unsatisfied, complaining about the limitations and restrictions of iOS.

Why is there no Ableton Live equivalent on iOS? That’s an interesting discussion for another time. It would be about technology, economics, UI patterns, failed expectation management and the mysterious ways of Apple. My own one-liner hypothesis: creating a DAW with the scope, depth and technical complexity of Live is too grand a project for indie developers, and not economically sustainable for the big dogs. Maybe we will get there one day.

Freeform, Modular, MIDI-centric

But then again… maybe not. Because it looks as if iOS is already evolving a successful musicking paradigm of its own, ignited by its peculiar sandboxed economy and embraced by a community of open-minded musicians trying to escape the conventional DAW paradigms of the desktop world. I would describe it as being: freeform, modular and MIDI-centric.

The best way to spot it is by taking a step back from the apps, and seeing the entire iPad as the modular synthesizer, the individual apps and plugins as the modules, and MIDI/audio as the wires connecting everything.

iOS music has always had its origins in an unconventional modular undercurrent. Back in the early days, when every app was a discrete standalone entity, the clever cats behind Audiobus came up with the brilliant idea to repurpose an obscure networking protocol for routing audio between apps. Over the next few years this concept evolved into a thriving ecosystem of music apps that would run autonomously yet also cooperatively, not unlike the modules of a modular synthesizer.

iOS music has always had its origins in an unconventional modular undercurrent.

In recent years, actual plugin technology (Audio Units, or AUv3 in Apple lingo) was brought to the platform, allowing hosts and plugins to work conceptually like conventional desktop music applications, with obvious benefits such as being able to run multiple instances of a plugin, easy saving of all plugin states into a single project file, painless tempo/song-position synchronization with the host, etc. And although it has led to mobile DAWs becoming more like their desktop counterparts (just look at Cubasis), it has also opened the door to something novel. Something unique to this mobile platform.
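
To give a feel for what this looks like from a developer’s perspective: below is a minimal Swift sketch of how a host might discover and instantiate an AUv3 instrument using Apple’s AVFoundation APIs. The wildcard component description and the out-of-process loading option are my illustrative assumptions about a typical host, not any specific app’s code.

```swift
import AVFoundation
import AudioToolbox

// Wildcard description: match every installed AUv3 instrument
// (a zero in any field acts as a wildcard when matching components).
let anyInstrument = AudioComponentDescription(
    componentType: kAudioUnitType_MusicDevice,
    componentSubType: 0,
    componentManufacturer: 0,
    componentFlags: 0,
    componentFlagsMask: 0
)

// Ask the system which instrument plugins are installed on this device.
let instruments = AVAudioUnitComponentManager.shared().components(matching: anyInstrument)

// Instantiate the first match out of process (the AUv3 sandbox model).
if let component = instruments.first {
    AVAudioUnit.instantiate(with: component.audioComponentDescription,
                            options: .loadOutOfProcess) { audioUnit, error in
        guard let audioUnit = audioUnit else { return }
        // A host can stash auAudioUnit.fullState in its project file
        // and hand it back later to restore the plugin exactly as it was.
        let savedState = audioUnit.auAudioUnit.fullState
        print("Loaded \(component.name); saved \(savedState?.count ?? 0) state keys")
    }
}
```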

Shrinking the DAW

What’s happening is that hosting plugins without a central DAW environment is growing in popularity. Instead of a linear or clip-based ‘master sequencer’ that controls everything, multiple plugins run simultaneously, each playing its own part: some send out MIDI to control other plugins, some generate audio, some process audio. Together these plugins combine into a freeform web of interactions, creating an evolving music session, often without a clear beginning or end.

In such a workflow the main role of the DAW/host has been ‘reduced’ to hosting and synchronizing the plugins and facilitating the routing of audio and MIDI (and saving/restoring sessions so they can be suspended and resumed seamlessly — a critical feature for a mobile platform).

The main role of the DAW has been ‘reduced’ to hosting and synchronizing the plugins and facilitating the routing of audio and MIDI
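
To make the ‘host as patch bay’ idea concrete, here is a hedged Swift sketch of the kind of wiring such a minimal host performs with AVAudioEngine: an instrument plugin feeds an effect plugin, and a MIDI note is pushed into the instrument the way a MIDI plugin’s output would be forwarded. The function name, and the assumption that both AVAudioUnits were already instantiated as in the earlier sketch, are mine, for illustration only.

```swift
import AVFoundation
import AudioToolbox

// Illustrative wiring for a minimal host: instrument -> effect -> mixer,
// plus one MIDI message forwarded into the instrument.
func patch(instrumentAU: AVAudioUnit, effectAU: AVAudioUnit,
           into engine: AVAudioEngine) throws {
    engine.attach(instrumentAU)
    engine.attach(effectAU)

    // The audio “wires” of the virtual modular rig.
    engine.connect(instrumentAU, to: effectAU, format: nil)
    engine.connect(effectAU, to: engine.mainMixerNode, format: nil)

    try engine.start()

    // The MIDI “wire”: a note-on (channel 1, middle C, velocity 100),
    // delivered the way a host forwards a MIDI plugin’s output.
    if let sendMIDI = instrumentAU.auAudioUnit.scheduleMIDIEventBlock {
        let noteOn: [UInt8] = [0x90, 60, 100]
        noteOn.withUnsafeBufferPointer { bytes in
            sendMIDI(AUEventSampleTimeImmediate, 0, bytes.count, bytes.baseAddress!)
        }
    }
}
```

Note how little the host itself contributes here: the notes and evolving patterns live inside the plugins, while the host merely keeps the wires connected and the clock running.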

Interestingly, this modular workflow has many similarities to two other major music trends of the past five years:

  1. the exploding popularity of analog modular synthesizers and
  2. the phenomenon of “DAW-less jamming” (meaning: making music on synths and samplers without using a computer with DAW software).

In both paradigms, computer-based master DAW software has been replaced by much simpler discrete sequencers, integrated into analog modules or built into individual hardware synths, drum machines, samplers and other music boxes. Rather than ‘composing’ a full track in a DAW, both workflows are a hybrid between programming and performing. The resulting piece of music evolves in real time, as the musician builds up his or her sounds and sequences.

The resulting piece of music evolves in real time, as the musician builds up his or her sounds and sequences.

So what we’re witnessing is iOS spawning its own flavor of DAW-less modular jamming, combined with the mobile freedom to jam wherever and whenever creativity strikes. The advent of MIDI plugins and hosts built around the routing of MIDI and audio will likely only accelerate this trend. It may take people coming from desktop studios some time for things to ‘click’ (in other words: to unlearn what they know). But now that iOS music has found its own identity, hopefully we will see fewer people turn away disappointed, and more new creatives and musicians drawn in instead.

