Android audio framework
I will try to explain streaming audio in Android in as much detail as possible. For this purpose, SDK 1.5 added the AudioRecord and AudioTrack classes: AudioRecord is responsible for getting samples from the microphone, and AudioTrack is responsible for playing samples back.
These classes work only with PCM-encoded data (which really means the samples are raw and not compressed at all :)
Play one buffer loop
AudioRecord record = null;
AudioTrack track = null;

// Use several times the minimum buffer size to survive scheduling jitter.
final int AUDIO_SYSTEM_BUFFER_SIZE = AudioRecord.getMinBufferSize(8000,
        AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT) * 4;

record = new AudioRecord(MediaRecorder.AudioSource.MIC, 8000,
        AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT,
        AUDIO_SYSTEM_BUFFER_SIZE);

Log.d(TAG, "AudioTrack min buffer size: "
        + AudioTrack.getMinBufferSize(8000, AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT));

track = new AudioTrack(AudioManager.STREAM_VOICE_CALL, 8000,
        AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT,
        AUDIO_SYSTEM_BUFFER_SIZE, AudioTrack.MODE_STREAM);
Initialization is quite simple: all you have to do is create the AudioRecord and AudioTrack instances with a matching sample rate, channel configuration, and encoding.
Start audio player
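Assuming the `track` object created above, starting the player is a single call; in MODE_STREAM, sound only comes out once data is written:

```java
// Put the AudioTrack into the playing state. In MODE_STREAM, actual
// playback begins once enough data has been supplied via track.write().
track.play();
```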
Start audio recorder
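A sketch of the record-and-play loop, assuming the `record` and `track` objects from the initialization above and a hypothetical `isRunning` flag controlled elsewhere:

```java
// Start capturing; samples begin accumulating in the internal buffer
// immediately after this call returns.
record.startRecording();

short[] buffer = new short[160]; // 20 ms of audio at 8 kHz mono, 16-bit
while (isRunning) {
    // Blocks until buffer.length samples are available (or an error occurs).
    int read = record.read(buffer, 0, buffer.length);
    if (read > 0) {
        // Echo the captured samples straight to the player.
        track.write(buffer, 0, read);
    }
}
```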
Stop audio player/recorder
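Shutting down is symmetric; the important part is calling release(), since (as noted below) the garbage collector cannot reclaim the underlying system resources on its own. A sketch:

```java
// Stop playback/recording, then free the native resources behind each object.
track.stop();
track.release();
track = null;

record.stop();
record.release();
record = null;
```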
Because these objects work with system resources, the developer has to release them when they are no longer in use. The Java garbage collector will not release them automatically, because the instances hold native system handles.
I hope you've got the idea...
Now a few practical tips.
When you call record.startRecording(), recording has already started! To keep latency low, you have to start reading that data immediately.
You should realize that Android is not a real-time operating system, and callbacks often do not arrive exactly on schedule. Latency will keep growing over time unless you apply some kind of throttling algorithm on the playback side.
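One simple throttling approach is to bound the playback backlog and drop the oldest buffers when the recorder outruns the player. This is a hypothetical sketch in plain Java (the class name, `MAX_BACKLOG` value, and methods are my own, not part of the Android API):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical playback queue that keeps latency bounded: when the
// backlog exceeds MAX_BACKLOG buffers, the oldest audio is discarded.
public class PlaybackQueue {
    private static final int MAX_BACKLOG = 5; // ~100 ms at 20 ms per buffer
    private final Deque<short[]> queue = new ArrayDeque<short[]>();

    public synchronized void offer(short[] buffer) {
        queue.addLast(buffer);
        // Drop the oldest buffers so latency cannot grow without bound.
        while (queue.size() > MAX_BACKLOG) {
            queue.removeFirst();
        }
    }

    public synchronized short[] poll() {
        return queue.pollFirst(); // null when there is nothing to play yet
    }

    public synchronized int backlog() {
        return queue.size();
    }
}
```

The recorder thread calls offer() and the player thread calls poll(); dropping whole buffers causes small audible glitches, but that is usually preferable to ever-increasing delay in a voice call.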