Camera and microphone are standard accessories on most mobile devices, and Android devices are no exception. In the previous article we dealt with visual input via the camera. This article covers capturing raw audio from the device microphone and encoding it to WAV or MP3 for use on other platforms and systems.
All of the recipes in this article are represented as pure ActionScript 3 classes and are not dependent upon external libraries or the Flex framework. Therefore, we will be able to use these examples in any IDE we wish.
The reader is advised to refer to the first recipe of Flash Development for Android: Visual Input via Camera for detecting microphone support.
By monitoring the sample data returned from the Android device microphone through the ActionScript Microphone API, we can gather much information about the sound being captured and respond to it within our application. Such input can be used in utility applications, learning modules, and even games.
We will set up an event listener to respond to sample data reported through the Microphone API:
import flash.display.Sprite;
import flash.display.Stage;
import flash.display.StageAlign;
import flash.display.StageScaleMode;
import flash.events.SampleDataEvent;
import flash.media.Microphone;
import flash.text.TextField;
import flash.text.TextFormat;
private var mic:Microphone;
private var traceField:TextField;
private var traceFormat:TextFormat;
protected function setupTextField():void {
traceFormat = new TextFormat();
traceFormat.bold = true;
traceFormat.font = "_sans";
traceFormat.size = 44;
traceFormat.align = "center";
traceFormat.color = 0x333333;
traceField = new TextField();
traceField.defaultTextFormat = traceFormat;
traceField.selectable = false;
traceField.mouseEnabled = false;
traceField.width = stage.stageWidth;
traceField.height = stage.stageHeight;
addChild(traceField);
}
protected function setupMic():void {
mic = Microphone.getMicrophone();
// Dispatch activity events no matter how quiet the input is
mic.setSilenceLevel(0);
// Request a 44 kHz sample rate for the highest-quality capture
mic.rate = 44;
// Do not route microphone audio back out through the device speaker
mic.setLoopBack(false);
}
protected function registerListeners():void {
mic.addEventListener(SampleDataEvent.SAMPLE_DATA, onMicData);
}
public function onMicData(e:SampleDataEvent):void {
traceField.text = "";
// Report properties of the Microphone (the event target)
// and of the sample data ByteArray it delivers
traceField.appendText("activityLevel: " +
e.target.activityLevel + "\n");
traceField.appendText("codec: " + e.target.codec + "\n");
traceField.appendText("gain: " + e.target.gain + "\n");
traceField.appendText("bytesAvailable: " +
e.data.bytesAvailable + "\n");
traceField.appendText("length: " + e.data.length + "\n");
traceField.appendText("position: " + e.data.position + "\n");
}
When we instantiate a Microphone object and register a SampleDataEvent.SAMPLE_DATA event listener, we can easily monitor various properties of our Android device microphone and the associated sample data being gathered. We can then respond to that data in many ways. One example would be to move objects across the Stage based upon the Microphone.activityLevel property. Another example would be to write the sample data to a ByteArray for later analysis.
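To illustrate the second example, the handler below is a minimal sketch (not part of the recipe itself) of copying each batch of incoming samples into a persistent ByteArray for later analysis or encoding; the soundBytes variable and onMicSample handler name are our own, not part of the API.

import flash.events.SampleDataEvent;
import flash.utils.ByteArray;
// Buffer that accumulates every sample the microphone reports
private var soundBytes:ByteArray = new ByteArray();
public function onMicSample(e:SampleDataEvent):void {
// e.data holds 32-bit float samples; read them all
// and append each one to our buffer
while (e.data.bytesAvailable) {
soundBytes.writeFloat(e.data.readFloat());
}
}

Once recording stops, soundBytes holds the raw floating-point samples, ready to be analyzed or passed to a WAV or MP3 encoder as described later in this article.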