MOBILE DEVICES AND ARTISTIC PRACTICE

Today we’ll look at using the Accelerate framework to run optimized FFTs under iOS (the FFT code we’ll use also runs under OS X). The Accelerate framework is a little bit tricky to use but not too bad; luckily for us, someone has gone and taken what little trickery there is out of it (assuming you want to do a one-dimensional real-to-complex FFT, which is the most common one for audio).

Parag K Mital’s FFT code on GitHub

I created an example project that visualizes the bin magnitudes generated by the FFT in an OpenGL ES 1.1 view. The code to set up and run the FFT is actually only four lines: two to allocate memory, one to set up the FFT and one to run it. The pkmFFT code really seems to be as simple as anyone could possibly make it.
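
If you want to peek under the hood, here’s roughly what a bare-bones real-to-complex FFT looks like using the vDSP functions in Accelerate directly. This is a minimal sketch of mine (no windowing, no scaling, and the FFT setup should really be cached between calls), not the pkmFFT wrapper itself:

#include <Accelerate/Accelerate.h>
#include <math.h>

// Compute per-bin squared magnitudes for n mono samples (n must be a power of two).
void magnitudesForSamples(const float *input, float *magnitudes, int n)
{
    vDSP_Length log2n = (vDSP_Length)log2f((float)n);

    // Working buffers for the split-complex (real / imaginary) representation.
    DSPSplitComplex split;
    split.realp = (float *)malloc(sizeof(float) * n / 2);
    split.imagp = (float *)malloc(sizeof(float) * n / 2);

    // One call to set up, one to pack the real input, one to run the forward
    // FFT in place, and one to pull magnitudes out of the bins.
    FFTSetup setup = vDSP_create_fftsetup(log2n, kFFTRadix2);
    vDSP_ctoz((const DSPComplex *)input, 2, &split, 1, n / 2);
    vDSP_fft_zrip(setup, &split, 1, log2n, kFFTDirection_Forward);
    vDSP_zvmags(&split, 1, magnitudes, 1, n / 2);

    vDSP_destroy_fftsetup(setup);
    free(split.realp);
    free(split.imagp);
}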

Today we’ll be looking at Android Development. As I’m not particularly fond of Eclipse or NetBeans, we’ll be kicking it old-skool with a terminal and text editor. The first step is to download and install the Java Development Kit. After that, the next step is to install the Android SDK. We’ll then add a couple of the available Android platforms to our development environment.

Class today will be split amongst a variety of topics. First we’ll look at a brief code example of how to pick images out of the image library and get access to individual pixel data from them. We’ll use the UIImagePickerController to do this. After that we’ll take turns discussing the architectures of our final projects and brainstorm ways to design them better. We’ll then look at some basic graphic design principles in an effort to make our projects look a little more hip. Finally, we’ll have some work time to tackle problems that seem to be common in the final projects, like mapping video to shaders.

Image Picker demo
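
Here’s a rough sketch of the shape that code takes; everything beyond the UIKit and CoreGraphics calls themselves (the method names, freeing the pixels right away) is my own simplification rather than the demo project verbatim:

// Present the photo library picker (called from a view controller that
// declares UIImagePickerControllerDelegate and UINavigationControllerDelegate).
- (void)pickImage {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];
    [picker release];
}

// Delegate callback: draw the chosen image into a bitmap context so we can
// read its RGBA pixel values directly.
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    CGImageRef cgImage = image.CGImage;
    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    unsigned char *pixels = (unsigned char *)malloc(width * height * 4);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8, width * 4,
                                             colorSpace,
                                             kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), cgImage);
    // pixels[0..3] is now the RGBA value of the first pixel, and so on.

    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);
    free(pixels); // keep these around in a real app, of course
    [self dismissModalViewControllerAnimated:YES];
}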

Graphic Design Resources:

We’ll be spending most of Day 10 looking at application architecture, specifically how UIViewController and its subclasses are used, and at the process of creating a Universal iPhone / iPad application. Once this is done we’ll also look at CoreLocation and MapKit. CoreLocation allows you to access both the GPS and the magnetometer (compass) on iOS devices. MapKit provides an easy way to embed Google Maps (at least for now). Together they provide a simple solution for showing a user’s location and adding annotations and overlays about the surrounding geographic area. Although the GPS on iOS devices only yields an average accuracy of about 55 feet, that’s enough to provide coarse location info for large installations / museums.
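
As a preview, a minimal CoreLocation / MapKit hookup looks something like the sketch below; the ivar names (locationManager, mapView) and the 250-meter region size are my own choices, not anything from the class demo:

// The class adopting this declares CLLocationManagerDelegate, and
// 'mapView' is an MKMapView wired up in the nib.
- (void)startLocating {
    locationManager = [[CLLocationManager alloc] init];
    locationManager.delegate = self;
    locationManager.desiredAccuracy = kCLLocationAccuracyBest;
    [locationManager startUpdatingLocation];

    mapView.showsUserLocation = YES; // MapKit draws the blue dot for us
}

// Called whenever the GPS produces a new fix; center the map on it.
- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation {
    MKCoordinateRegion region =
        MKCoordinateRegionMakeWithDistance(newLocation.coordinate, 250, 250);
    [mapView setRegion:region animated:YES];
}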

There are a few different techniques for presenting custom graphics on iOS devices. We’ll look at two of these today: CoreGraphics, the native iOS solution for 2D graphics, and OpenGL ES, the cross-platform solution for 2D and 3D graphics.

In general I’d say that if you’re doing 2D graphics, CoreGraphics is the way to go. The first set of demos will build off of the audio file reader demo from last week. We’ll use CoreGraphics to draw the waveform in the first demo; the second demo caches and reuses our initial drawing of the waveform to save on CPU. The third demo embeds the waveform in a UIScrollView that allows users to pinch zoom and scrub (visually) through the waveform at a higher resolution.
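
Here’s a rough sketch of the drawing step in the first demo, assuming the audio file reader has already left the sample data in a plain float array (the ivar names are mine):

// 'samples' is a float array of mono samples, 'numSamples' its length.
- (void)drawRect:(CGRect)rect {
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetStrokeColorWithColor(ctx, [UIColor whiteColor].CGColor);
    CGContextSetLineWidth(ctx, 1.0);

    CGFloat midY  = self.bounds.size.height / 2.0;
    CGFloat width = self.bounds.size.width;

    // One line segment per pixel column, scaled to the view's height.
    CGContextMoveToPoint(ctx, 0, midY);
    for (CGFloat x = 0; x < width; x++) {
        NSUInteger i = (NSUInteger)((x / width) * numSamples);
        CGContextAddLineToPoint(ctx, x, midY + samples[i] * midY);
    }
    CGContextStrokePath(ctx);
}

The second demo would render something like this once into an image and reuse that, rather than re-tracing the path on every redraw.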

After looking at this we’ll dive into simple OpenGL ES 1.1. The first example below just draws a line in as few lines of code as possible (for me), the second animates that line using an NSTimer, and the third displays the waveform of an audio file using OpenGL instead of CoreGraphics.
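
For a sense of scale, the line-drawing example boils down to something like this (assuming the usual EAGLView setup has already created and bound the context and framebuffer):

#import <OpenGLES/ES1/gl.h>

// Draw a single line with OpenGL ES 1.1. With the default identity matrices
// these coordinates are in normalized device space.
- (void)drawLine {
    static const GLfloat vertices[] = {
        -0.5f, -0.5f,   // line start
         0.5f,  0.5f,   // line end
    };

    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, vertices);
    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
    glDrawArrays(GL_LINES, 0, 2);
    glDisableClientState(GL_VERTEX_ARRAY);
}

The animated version just calls a draw method like this from an NSTimer scheduled at roughly 60 frames per second.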

Finally we’ll take a look at how to read in video input, modify the pixels of the input and then display them. This is useful for augmented reality and camera-based photo apps.
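
The capture side of that follows the standard AVFoundation pattern; the sketch below is my own minimal version (BGRA output, delegate callbacks on the main queue), not the exact demo code:

#import <AVFoundation/AVFoundation.h>

// The class adopting this declares AVCaptureVideoDataOutputSampleBufferDelegate.
// In a real app, keep 'session' in an ivar so it stays alive.
- (void)startCamera {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
        forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    [output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    [session addOutput:output];
    [session startRunning];
}

// Called once per frame; lock the pixel buffer and touch the raw BGRA bytes.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
  didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
         fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    unsigned char *pixels = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);
    pixels[0] = 255; // e.g. blow out the blue channel of the first pixel
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    // ...hand the modified frame to whatever view is displaying it...
}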

Gamma is a synthesis library written in C++ by Lance Putnam. It’s actually very easy to embed in iOS applications using the AudioUnit programming techniques we covered in class this week. I’m posting a sample iOS project that uses Gamma to play pink noise. Gamma provides a lot of unit generators to experiment with; hopefully people will be able to get some interesting sounds going with it for the musical instrument assignment.

Last week we looked at how to set up basic callbacks for audio input and output. Today we’ll look at how to run multiple audio units concurrently in an AudioUnit graph and how to read and write audio files. We’ll also look at the process of moving files between the desktop and iOS devices using the iTunes interface; this is how we’ll get audio files on and off devices.
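
For reference, the skeleton of an AUGraph containing a single RemoteIO output node looks roughly like this; error checking is omitted, and the silence-filling render callback is just a stand-in for real synthesis code:

#import <AudioToolbox/AudioToolbox.h>
#include <string.h>

// Placeholder render callback: fills the buffers with silence.
static OSStatus MyRenderCallback(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber, UInt32 inNumberFrames,
                                 AudioBufferList *ioData)
{
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++)
        memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
    return noErr;
}

static void SetupAudioGraph(void)
{
    AUGraph graph;
    AUNode ioNode;
    AudioUnit ioUnit;

    AudioComponentDescription ioDesc = {0};
    ioDesc.componentType = kAudioUnitType_Output;
    ioDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    ioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    NewAUGraph(&graph);
    AUGraphAddNode(graph, &ioDesc, &ioNode);
    AUGraphOpen(graph);
    AUGraphNodeInfo(graph, ioNode, NULL, &ioUnit); // grab the unit to configure it

    // Pull audio for the output node's input from the C render callback above.
    AURenderCallbackStruct callback = { MyRenderCallback, NULL };
    AUGraphSetNodeInputCallback(graph, ioNode, 0, &callback);

    AUGraphInitialize(graph);
    AUGraphStart(graph);
}

For the iTunes file-transfer part, the relevant switch is the UIFileSharingEnabled key in the app’s Info.plist; with that set to YES, whatever the app writes into its Documents directory shows up in the iTunes file sharing pane.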

In addition to listening to ideas for final projects, we’ll also look at various methods for synthesizing and playing back audio and how to read audio input.

The first demo looks at the highest level API for playing audio, AVFoundation. The demo assigns a button to play a short audio file asynchronously. It also shows a quick example of how to make the phone vibrate.
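
The gist of it, with a made-up file name (beep.wav) standing in for whatever the demo actually plays:

#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

// Button action: play a bundled sound with AVAudioPlayer, then vibrate.
// 'player' should be an AVAudioPlayer ivar so it isn't deallocated mid-playback.
- (IBAction)playSound:(id)sender {
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"beep" withExtension:@"wav"];
    player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
    [player play];

    // One-line vibration, courtesy of AudioToolbox.
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
}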

The second demo looks at a mid-level API for audio, the AudioQueue framework. The demo plays a sine wave at a fixed frequency. Audio queues require less setup than AudioUnits, but do not allow full-duplex audio programming.
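
A stripped-down version of that setup looks something like the following; the 440 Hz frequency, 16-bit mono format, buffer size and buffer count are my own choices rather than necessarily the demo’s:

#include <AudioToolbox/AudioToolbox.h>
#include <math.h>

static double phase = 0.0;

// Output callback: fill the buffer with 16-bit sine samples and re-enqueue it.
static void SineCallback(void *userData, AudioQueueRef queue,
                         AudioQueueBufferRef buffer) {
    SInt16 *samples = (SInt16 *)buffer->mAudioData;
    UInt32 count = buffer->mAudioDataBytesCapacity / sizeof(SInt16);
    for (UInt32 i = 0; i < count; i++) {
        samples[i] = (SInt16)(sin(phase) * 32767.0);
        phase += 2.0 * M_PI * 440.0 / 44100.0;  // 440 Hz at 44.1 kHz
    }
    buffer->mAudioDataByteSize = count * sizeof(SInt16);
    AudioQueueEnqueueBuffer(queue, buffer, 0, NULL);
}

// Describe 16-bit mono PCM, create the queue, prime three buffers, start.
static void StartSineQueue(void) {
    AudioStreamBasicDescription format = {0};
    format.mSampleRate       = 44100.0;
    format.mFormatID         = kAudioFormatLinearPCM;
    format.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    format.mChannelsPerFrame = 1;
    format.mBitsPerChannel   = 16;
    format.mBytesPerFrame    = 2;
    format.mBytesPerPacket   = 2;
    format.mFramesPerPacket  = 1;

    AudioQueueRef queue;
    AudioQueueNewOutput(&format, SineCallback, NULL, NULL, NULL, 0, &queue);
    for (int i = 0; i < 3; i++) {
        AudioQueueBufferRef buffer;
        AudioQueueAllocateBuffer(queue, 2048, &buffer);
        SineCallback(NULL, queue, buffer); // fill and enqueue once to prime the queue
    }
    AudioQueueStart(queue, NULL);
}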

The third demo is basically the same as the second (playing a synthesized sine wave) except that it uses an AudioUnit instead of an AudioQueue.
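
The interesting difference is the render callback itself: the RemoteIO unit hands you a buffer list and you write samples straight into it. The sketch below assumes the unit’s stream format has been set to 16-bit mono PCM at 44.1 kHz (via kAudioUnitProperty_StreamFormat) and that the callback is installed with kAudioUnitProperty_SetRenderCallback:

#import <AudioToolbox/AudioToolbox.h>
#include <math.h>

// Render callback for the RemoteIO unit: synthesize a 440 Hz sine in place.
static OSStatus SineRenderCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData)
{
    static double phase = 0.0;
    SInt16 *samples = (SInt16 *)ioData->mBuffers[0].mData;
    for (UInt32 i = 0; i < inNumberFrames; i++) {
        samples[i] = (SInt16)(sin(phase) * 32767.0);
        phase += 2.0 * M_PI * 440.0 / 44100.0;
    }
    return noErr;
}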

The fourth and final demo shows the basics of reading microphone input using an AudioUnit.
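
The key extra step is enabling IO on the RemoteIO unit’s input bus (bus 1). MicInputCallback below is a placeholder name for your own AURenderCallback, which would call AudioUnitRender to actually pull the incoming samples:

#import <AudioToolbox/AudioToolbox.h>

// 'ioUnit' is the RemoteIO AudioUnit created during setup.
static void EnableMicInput(AudioUnit ioUnit) {
    UInt32 enable = 1;
    AudioUnitSetProperty(ioUnit, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input, 1,          // bus 1 = microphone
                         &enable, sizeof(enable));

    AURenderCallbackStruct inputCallback = { MicInputCallback, NULL };
    AudioUnitSetProperty(ioUnit, kAudioOutputUnitProperty_SetInputCallback,
                         kAudioUnitScope_Global, 0,
                         &inputCallback, sizeof(inputCallback));
}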

Today we’ll be talking about design patterns in Cocoa and trying to get a better feel for how Cocoa / UIKit work. I’ll be giving a variety of demos that are linked here; I’ll also post links to some external resources that have further explanations of how stuff works.

Memento:
Archiving Objective-C Objects With NSCoding
Demo of Adding A Button To A Window Without Using Interface Builder
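
The heart of the NSCoding demo is just two methods on the archived class; the name and score properties below are hypothetical:

- (void)encodeWithCoder:(NSCoder *)coder {
    [coder encodeObject:self.name forKey:@"name"];
    [coder encodeInt:self.score forKey:@"score"];
}

- (id)initWithCoder:(NSCoder *)coder {
    if ((self = [super init])) {
        self.name  = [coder decodeObjectForKey:@"name"];
        self.score = [coder decodeIntForKey:@"score"];
    }
    return self;
}

Writing the object to disk then becomes a one-liner with [NSKeyedArchiver archiveRootObject:object toFile:path], and [NSKeyedUnarchiver unarchiveObjectWithFile:path] brings it back.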

Hierarchy:
Apple View Hierarchy Documentation
View Hierarchy and Core Animation Demo

Observer:
Notification Demo
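
The notification flavor of the observer pattern boils down to two calls; the notification name here is made up:

// The observer registers itself once...
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(somethingChanged:)
                                             name:@"MyThingDidChangeNotification"
                                           object:nil];

// ...and anything, anywhere, can post it; every registered observer gets called.
[[NSNotificationCenter defaultCenter] postNotificationName:@"MyThingDidChangeNotification"
                                                    object:self];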

Delegation:
TableView Delegation Demo
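
Strictly speaking the methods below belong to UITableViewDataSource rather than UITableViewDelegate, but the mechanism is the same protocol-based delegation: the table view calls back into your object for its content. The items array is a hypothetical ivar:

- (NSInteger)tableView:(UITableView *)tableView
 numberOfRowsInSection:(NSInteger)section {
    return [items count];
}

- (UITableViewCell *)tableView:(UITableView *)tableView
         cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    UITableViewCell *cell =
        [tableView dequeueReusableCellWithIdentifier:@"Cell"];
    if (cell == nil) {
        cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault
                                       reuseIdentifier:@"Cell"] autorelease];
    }
    cell.textLabel.text = [items objectAtIndex:indexPath.row];
    return cell;
}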

MVC:
Cocoa Bindings Demo

In today’s class we’ll be looking at using the CoreMIDI framework to send wireless RTP MIDI information to a personal computer. Although OS X has RTP MIDI built into it, on Windows you need to install some extra drivers to get RTP MIDI working.

Here’s the example project we’ll look at in class. It allows you to both send and receive wireless MIDI data. In case anyone is interested, there is also an RTP MIDI library available for Android.
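
The iOS side of that is surprisingly small; here’s a rough sketch of enabling the network session and sending a single note-on (the client and port names are mine, and error checking is omitted):

#import <CoreMIDI/CoreMIDI.h>

- (void)sendTestNote {
    // Turn on the network (RTP) MIDI session and accept connections from anyone.
    MIDINetworkSession *session = [MIDINetworkSession defaultSession];
    session.enabled = YES;
    session.connectionPolicy = MIDINetworkConnectionPolicy_Anyone;

    MIDIClientRef client;
    MIDIPortRef outPort;
    MIDIClientCreate(CFSTR("MIDIDemo"), NULL, NULL, &client);
    MIDIOutputPortCreate(client, CFSTR("Output"), &outPort);

    // One-packet list holding a note-on: channel 1, middle C, velocity 100.
    Byte noteOn[3] = { 0x90, 60, 100 };
    Byte packetBuffer[128];
    MIDIPacketList *packetList = (MIDIPacketList *)packetBuffer;
    MIDIPacket *packet = MIDIPacketListInit(packetList);
    packet = MIDIPacketListAdd(packetList, sizeof(packetBuffer), packet, 0,
                               sizeof(noteOn), noteOn);

    MIDISend(outPort, session.destinationEndpoint, packetList);
}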

We’ll also be looking at using the CoreMotion framework to handle Accelerometer and Gyroscope data. In order to use CoreMotion successfully, we need to learn about a language feature of Obj-C (it also works in C and Obj-C++) called blocks. I’ve given some links below that discuss blocks and their usefulness.
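
The short version, before you read those links: a block is an inline, anonymous function that can capture variables from the surrounding scope.

// 'greeting' is captured by the block when the block is created.
NSString *greeting = @"Hello";
void (^sayHello)(NSString *) = ^(NSString *name) {
    NSLog(@"%@, %@!", greeting, name);
};
sayHello(@"class");   // prints "Hello, class!"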

Here’s the demo file for basic CoreMotion behavior. It prints out acceleration, attitude and rotational rate information. It also allows you to reset the reference attitude via the GUI.
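
The core of that demo looks something like the sketch below; the motionManager ivar name and the 60 Hz update interval are my own choices:

#import <CoreMotion/CoreMotion.h>

// 'motionManager' should be a CMMotionManager ivar so it outlives this method.
- (void)startMotionUpdates {
    motionManager = [[CMMotionManager alloc] init];
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;

    // The handler is a block, called on the queue we hand in with each new reading.
    [motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                       withHandler:^(CMDeviceMotion *motion, NSError *error) {
        CMAcceleration a = motion.userAcceleration;
        CMRotationRate r = motion.rotationRate;
        NSLog(@"accel %.2f %.2f %.2f  roll %.2f  rotation %.2f %.2f %.2f",
              a.x, a.y, a.z, motion.attitude.roll, r.x, r.y, r.z);
    }];
}

Resetting the reference attitude amounts to keeping a copy of the current attitude around and calling multiplyByInverseOfAttitude: on later readings.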

One more link related to accelerometers: an interesting blog post on setting accelerometer sensitivity and the use of accelerometers in games.

As requested by the class, here is a simple Cocoa threading demo.
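
One common way to structure this in Cocoa, sketched here with made-up method names: push the work onto a background thread, then hop back to the main thread before touching UIKit.

- (void)startWork {
    [self performSelectorInBackground:@selector(doWork) withObject:nil];
}

- (void)doWork {
    // Background threads need their own autorelease pool (pre-ARC).
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    // ...long-running task here...
    [self performSelectorOnMainThread:@selector(workFinished)
                           withObject:nil
                        waitUntilDone:NO];
    [pool release];
}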