Tagged: iOS

SqrSyn Export

Recently I read a review of SquareSynth where the user was confused by the audio export options. I’d like to address that question here (since Apple doesn’t provide a way to respond to App Store reviews). When you tap the Export button in SquareSynth, two things happen: the audio is copied to the clipboard, and a copy is shared with iTunes.

Clipboard

Here’s how the clipboard works: SquareSynth copies a stereo wav file onto the clipboard. You then go into another app and paste it, with one caveat: not all apps support audio paste. When another app tries to use the clipboard, it looks at the type of the data on it. If it recognizes the type as something it knows how to use, it will do so; otherwise it does nothing.
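To make that concrete, here is a rough Swift sketch of the kind of data such an export carries. This is illustrative, not SquareSynth’s actual code: the helper builds a minimal 16-bit PCM wav blob from interleaved samples, which the copying app would then place on the pasteboard tagged with the wav type identifier ("com.microsoft.waveform-audio") so receiving apps can recognize it.

```swift
import Foundation

// Build a minimal 16-bit PCM wav file from interleaved samples.
// A pasting app doesn't sniff these bytes; it decides whether to use
// the clipboard based on the declared type tag (the wav UTI).
func makeWav(samples: [Int16], channels: UInt16 = 2, sampleRate: UInt32 = 44_100) -> Data {
    let bytesPerSample: UInt32 = 2
    let dataSize = UInt32(samples.count) * bytesPerSample
    var wav = Data()
    func put<T>(_ value: T) {
        withUnsafeBytes(of: value) { wav.append(contentsOf: $0) }
    }
    wav.append(contentsOf: Array("RIFF".utf8))
    put(UInt32(36 + dataSize).littleEndian)          // total file size minus 8
    wav.append(contentsOf: Array("WAVE".utf8))
    wav.append(contentsOf: Array("fmt ".utf8))
    put(UInt32(16).littleEndian)                     // fmt chunk size
    put(UInt16(1).littleEndian)                      // 1 = PCM
    put(channels.littleEndian)
    put(sampleRate.littleEndian)
    put((sampleRate * UInt32(channels) * bytesPerSample).littleEndian)  // byte rate
    put((channels * UInt16(bytesPerSample)).littleEndian)               // block align
    put(UInt16(16).littleEndian)                     // bits per sample
    wav.append(contentsOf: Array("data".utf8))
    put(dataSize.littleEndian)
    for sample in samples { put(sample.littleEndian) }
    return wav
}
```

On the copying side, the app would then do something like `UIPasteboard.general.setData(wav, forPasteboardType: "com.microsoft.waveform-audio")`, and a pasting app would check for that same type before reading the data.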

So, to copy wav data from SquareSynth to another app using the clipboard, first tap the Export button in the Record overlay.

the button is in here

Now switch to another app that knows how to use wav data from the clipboard. I recommend AudioShare! Here are two lists of apps that support audio paste:

https://iosaudio.wordpress.com/audio-copy-paste-the-master-list/

http://www.sonomawireworks.com/iphone/audiocopy/

These apps will have an Audio Paste option for you to use (consult their manuals). The clipboard will hold your audio until you copy something else.

File Sharing

The other thing that happens when you hit the Export button is that a wav file gets shared with iTunes (for your Mac/PC). To access this file:

◦ Connect your iDevice to your Mac/PC.

◦ Go to the “Apps” section in iTunes.

◦ Locate the File Sharing area at the bottom of the page.

◦ Locate SquareSynth in the list of File Sharing apps, and click it.

◦ You will see your file in the SquareSynth Documents list. Drag it to your desktop, or use the Save To… button.

Enjoy!

iOS State of the Music

Well, it’s that time of year: WWDC time. This is my annual chance to gaze into the future put forth by our Apple overlords (arriving this fall, with the release of iOS 9) and attempt to prepare for it. Sometimes the changes are very small, and I only have to do a little maintenance; sometimes it’s a HUGE, paradigm-shifting, app-obsoleting cannonball.

This year appears to be a cannonball. In a way, iOS 9 does for music app developers what iOS 8 did for every other developer. With the introduction of App Extensions, iOS 8 greatly expanded the scope of interaction between apps and the OS in general. It made developers rethink how their apps could fit into a user’s workflow. This was a big game changer for most apps, especially those concerned with productivity. For music apps, though, there was not a lot to make use of. There was some potential for streamlining wav file editing with extensions, but the big pain points in the pro-audio workflow went largely unaddressed.

Well, the future has finally remembered us: with iOS 9, Apple will be introducing Audio Unit extensions. Anyone familiar with audio production in the Apple world should know that Audio Units are virtual instruments and effects modules usable in a Digital Audio Workstation (like Ableton Live, Cubase, Audio Mulch, or Renoise). You can think of them as Apple’s answer to VSTs.

This new API will give us what many have longed for on iOS for years: a way of writing music where you can compose entirely within one app and use third-party instruments and audio processors, just as on a PC. The only way to approach this sort of workflow now is to write your song in a DAW (like BeatMaker 2 or Cubasis), use CoreMIDI to send notes and modulation to each instrument or effect app, then set up Audiobus to route all of the audio back into the DAW and record it. Don’t get me wrong, it’s great to have these tools available, but it’s a pain in the neck to get it all set up and working right. Not only that, but the composer also needs to worry quite a lot about CPU and memory load. Not all apps are well optimized, and latency is a big concern in this long round trip through multiple standalone apps.

Well, in iOS 9 third-party instruments and effects can be loaded and used within your host app. Just as on OS X, Audio Units will be able to have their own interface, which will be projected into your composition app. You can tweak a setting, write a few notes, and tweak it again, without app switching! How crazy is that?

Of course, one of the first things that comes to mind is that devs will now be able to port existing Audio Units from OS X to iOS. It’s important to emphasize that Audio Units are not fully interchangeable between iOS and OS X; you can’t just take all of your old AUs and copy them onto your iPad. Audio Units on iOS are implemented as extensions, so the developer will need to create a standalone app and build the AU as an extension. The good news is that version 3 Audio Units are fully cross-platform between OS X and iOS, as long as they don’t have a custom GUI (graphical user interface). The AUViewController class, which developers will use to make a custom GUI, is cross-platform, but UIKit and AppKit (which provide the basic building blocks of the user interface) are specific to each OS, so there is some work involved in creating compatibility. Version 2 Audio Units will be able to use a “bridge” class that will help developers port old plugins to iOS.
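For the curious, here is a rough Swift sketch of what the host side of this looks like, based on the AVFoundation additions announced for iOS 9. The four-character codes "Sqsy" and "Demo" are made-up placeholders; a real host would discover installed components rather than hard-coding them.

```swift
import AVFoundation

// Audio Units are identified by four-character codes packed into a UInt32.
func fourCC(_ code: String) -> UInt32 {
    code.utf8.reduce(0) { ($0 << 8) | UInt32($1) }
}

// Describe the plugin we want: "aumu" is the music-device (instrument)
// type; the subtype "Sqsy" and manufacturer "Demo" are hypothetical.
let description = AudioComponentDescription(
    componentType: fourCC("aumu"),
    componentSubType: fourCC("Sqsy"),
    componentManufacturer: fourCC("Demo"),
    componentFlags: 0,
    componentFlagsMask: 0)

// The host asks the system to load the extension; the unit arrives
// asynchronously and can then be attached to an AVAudioEngine graph.
AVAudioUnit.instantiate(with: description, options: []) { avUnit, error in
    guard let avUnit = avUnit else {
        print("Could not load unit: \(String(describing: error))")
        return
    }
    // avUnit.auAudioUnit is the v3 AUAudioUnit object; the host can ask
    // it for a view controller to embed the plugin's custom GUI.
    print("Loaded:", avUnit.auAudioUnit.audioUnitName ?? "unnamed unit")
}
```

The key point for the discussion above is that the plugin runs in its own extension process, while its interface is projected into the host app.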

While all of this progress is amazingly cool, we are now left to wonder what will happen to the infrastructure we’ve built in order to get around the limitations of the old iOS. Developers have spent countless hours integrating with Inter-App Audio, Audiobus, and CoreMIDI in an attempt to liberate audio functionality from its app sandbox. It’s unclear to me how much of my MIDI code will be reusable (probably very little), and it’s a little frustrating to have the rug pulled out from under you. I believe I will continue to support Audiobus, because it’s still a powerful tool, and the Audiobus community is amazing; they are really enthusiastic about pro-audio experiences on iPads and iPhones.

I’m now scrapping plans to add a sequencer to SquareSynth, because I’m not sure there’s a point. You will soon be able to use the app in a DAW, so why bother? This will totally change the way I build audio apps in the future. Standalone features will suffer for sure. There’s less reason to make your standalone app a musical hub if it can be run in many different instances and seamlessly chained together with other apps; instrument and effect apps will get much, much thinner, and DAWs will get much fatter. On one hand, it’s great, because we can make very focused tools and write less code. On the other, a lot of existing code now seems redundant. We will have to wait and see how users adopt this new model. Perhaps for some people a heavy-duty standalone synth will still be preferable.

In spite of the friction and uncertainty, I am very excited for the arrival of the modular iOS universe. It’s been a long time coming.

CM Feed update

It’s been a while, but new features are on the way for the app. I’ve been working on adding support for offline playlists, which will let you mark certain playlists to play while not connected to the internet. Once a song is added to one of these lists, it will become available offline throughout the app. The Feed tab will now be searchable. There are also lots of optimizations and bug fixes. The update should be complete this week, and on your iPhone soon after.

The next feature coming will be commenting for songs.

New projects are also in the pipeline, so more to come.

Update: The commenting feature is being delayed. Sorry. :(