Tagged: opinion

iOS State of the Music

Well, it’s that time of year — WWDC time. This is my annual chance to gaze into the future (arriving in fall, with the release of iOS 9) put forth by our Apple overlords, and attempt to prepare for it. Sometimes the changes are very small, and I only have to do a little maintenance; sometimes it’s a HUGE, paradigm-shifting, app-obsoleting cannonball.

This year appears to be a cannonball. In a way, iOS 9 does for music app developers what iOS 8 did for every other developer. With the introduction of App Extensions, iOS 8 greatly expanded the scope of interaction between apps and the OS in general. It made developers rethink how their apps could fit into a user’s workflow. This was a big game changer for most apps, especially those concerned with productivity. For music apps… there was not a lot to make use of. I mean, there was some potential for streamlining WAV file editing with extensions, but the big pain points in the pro-audio workflow went largely unaddressed.

Well, the future has finally remembered us: with iOS 9, Apple will be introducing Audio Unit extensions. Anyone familiar with audio production in the Apple world will know that Audio Units are virtual instruments and effects modules usable in a digital audio workstation, or DAW (like Ableton Live, Cubase, AudioMulch, or Renoise). You can think of them as Apple’s answer to VSTs.

This new API will give us what many have longed for on iOS for years: a way to compose entirely within one app, using third-party instruments and audio processors, just as on a PC. The only way to approach this sort of workflow now is to write your song in a DAW (like BeatMaker 2 or Cubasis), use Core MIDI to send notes and modulation to each instrument or effect app, then set up Audiobus to route all of the audio back into the DAW and record it. Don’t get me wrong, it’s great to have these tools available, but it’s a pain in the neck to get it all set up and working right. Not only that, but the composer also has to worry quite a lot about CPU and memory load. Not all apps are well optimized, and latency is a big concern on this long round trip through multiple standalone apps.

With iOS 9, third-party instruments and effects can be loaded and used directly within your host app. Just as on OS X, Audio Units will be able to have their own interface, which is projected into your composition app. You can tweak a setting, write a few notes, and tweak it again, without app switching! How crazy is that?
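From the host side, this looks roughly like the following sketch. It uses the iOS 9 APIs `AVAudioUnitComponentManager` and `AVAudioUnit.instantiate(with:options:)` to find an installed third-party instrument and wire it into an `AVAudioEngine`; the surrounding setup (error handling, UI embedding) is omitted, and the “first matching instrument” choice is just for illustration.

```swift
import AVFoundation

// Search the component registry for installed instrument Audio Units.
// Leaving subtype and manufacturer zeroed matches any instrument.
var description = AudioComponentDescription()
description.componentType = kAudioUnitType_MusicDevice

let components = AVAudioUnitComponentManager.shared()
    .components(matching: description)

let engine = AVAudioEngine()

if let component = components.first {
    // .loadOutOfProcess runs the extension in its own process,
    // so a crashing plugin can't take the host down with it.
    AVAudioUnit.instantiate(with: component.audioComponentDescription,
                            options: .loadOutOfProcess) { avUnit, error in
        guard let instrument = avUnit else { return }
        engine.attach(instrument)
        engine.connect(instrument, to: engine.mainMixerNode, format: nil)
        // The plugin's view controller can then be requested from
        // instrument.auAudioUnit and embedded in the host's own UI.
    }
}
```

The out-of-process option is what makes hosting strangers’ code palatable: the DAW keeps running even if a plugin misbehaves.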

Of course, one of the first things that comes to mind is that devs will now be able to port existing Audio Units from OS X to iOS. It’s important to emphasize that Audio Units are not fully interchangeable between the two platforms: you can’t just take all of your old AUs and copy them onto your iPad. Audio Units on iOS are implemented as extensions, so the developer will need to create a standalone app and build the AU as an extension. The good news is that version 3 Audio Units are fully cross-platform between OS X and iOS, as long as they don’t have a custom GUI (graphical user interface). The AUViewController class, which developers will use to build a custom GUI, is cross-platform, but UIKit and AppKit, the frameworks that provide the basic building blocks of the user interface on each OS, are not. So there is some work to create compatibility. Version 2 Audio Units will be able to use a bridge class that helps developers port old plugins to iOS.
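On the plugin side, the shape of a version 3 Audio Unit is a view controller that also acts as the factory for the DSP object, via the `AUAudioUnitFactory` protocol. Here’s a minimal sketch; `MyFilterAudioUnit` is a hypothetical `AUAudioUnit` subclass standing in for your actual DSP code, and the conditional import is what lets one file build against UIKit on iOS and AppKit on OS X.

```swift
import CoreAudioKit   // AUViewController lives here, cross-platform

#if os(iOS)
import UIKit          // UI building blocks on iOS
#else
import AppKit         // and on OS X
#endif

// A hypothetical effect plugin. AUViewController resolves to the right
// platform base class, so this file compiles on both OSes.
class MyFilterViewController: AUViewController, AUAudioUnitFactory {

    var audioUnit: AUAudioUnit?

    // The host reaches this through the extension machinery to get the
    // DSP object; the view controller doubles as the factory.
    func createAudioUnit(with componentDescription: AudioComponentDescription)
            throws -> AUAudioUnit {
        let unit = try MyFilterAudioUnit(componentDescription: componentDescription)
        audioUnit = unit
        return unit
    }
}
```

The GUI-less case is even simpler: skip the custom view, and the same `AUAudioUnit` subclass can ship on both platforms untouched.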

While all of this progress is amazingly cool, we are now left to wonder what will happen to the infrastructure we’ve built to get around the limitations of the old iOS. Developers have spent countless hours integrating with Inter-App Audio, Audiobus, and Core MIDI in an attempt to liberate audio functionality from its app sandbox. It’s unclear to me how much of my MIDI code will be reusable (probably very little), and it’s a little frustrating to have the rug pulled out from under you. I believe I will continue to support Audiobus, because it’s still a powerful tool, and the Audiobus community is amazing: they are really enthusiastic about pro-audio experiences on iPads and iPhones.

I’m now scrapping plans to add a sequencer to SquareSynth, because I’m not sure there’s a point: you’ll soon be able to use the app inside a DAW, so why bother? This will totally change the way I build audio apps in the future. Standalone features will suffer for sure. There’s less reason to make your standalone app a musical hub if it can be run in multiple instances and seamlessly chained together with other apps; instrument and effect apps will get much, much thinner, and DAWs will get much fatter. On one hand, that’s great, because we can make very focused tools and write less code. On the other, a lot of existing code now seems redundant. We will have to wait and see how users adopt this new model. Perhaps for some people a heavy-duty standalone synth will still be preferable.

In spite of the friction and uncertainty, I am very excited for the arrival of the modular iOS universe. It’s been a long time coming.