Converting PCM-16 to AMR using AmrInputStream - Android

I'm converting PCM-16 to AMR using AmrInputStream. The source for AmrInputStream can be found here: http://hi-android.info/src/android/media/AmrInputStream.java.html
I'm quite new to programming, so while that source talks about using JNI and such, I have no idea what JNI is, and I don't think it is required for this discussion. The AmrInputStream class is apparently not in the SDK or the NDK either, but I have been able to use it.
I've searched the internet for examples of how to use the stream but found none. In the end I experimented and found it works much like any other InputStream. Here's a code snippet:
// Wrap the raw PCM-16 source in an AmrInputStream; reading from it
// yields AMR-encoded bytes.
InputStream inStream = new FileInputStream("abc.wav");
AmrInputStream aStream = new AmrInputStream(inStream);

File file = new File("xyz.amr");
file.createNewFile();
OutputStream out = new FileOutputStream(file);

byte[] x = new byte[1024];
int len;
while ((len = aStream.read(x)) > 0) {
    out.write(x, 0, len);
}
out.close();
aStream.close();
I have tested this and it works (after prepending the #!AMR\n magic header to the output file so that players recognize it).
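For completeness, the header can be written from code instead of being added by hand afterwards; a minimal addition to the snippet above:
OutputStream out = new FileOutputStream(file);
// Write the AMR magic header before copying the encoded frames.
out.write("#!AMR\n".getBytes());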
My problem is that I have only managed to get this to work on a PCM-16 file sampled at 8000 Hz. Any higher sample rate for the original PCM-16 file results in output that sounds undersampled. There is a SAMPLES_PER_FRAME variable in AmrInputStream.java which I have tried playing with, but it does not seem to affect anything.
Any advice or related discussion is welcome!

SAMPLES_PER_FRAME is the block of data the AMR encoder acts on in one go; it maps to 20 ms of audio, i.e. 160 samples at 8 kHz.
Judging from the signatures of the AMR encoder functions (at the bottom of http://hi-android.info/src/android/media/AmrInputStream.java.html):
private static native int GsmAmrEncoderNew();
private static native void GsmAmrEncoderInitialize(int gae);
private static native int GsmAmrEncoderEncode(int gae,
        byte[] pcm, int pcmOffset, byte[] amr, int amrOffset) throws IOException;
private static native void GsmAmrEncoderCleanup(int gae);
private static native void GsmAmrEncoderDelete(int gae);
there doesn't seem to be a way to pass a sample rate to the encoder (gae is the native handle).
The sample rate is hardcoded to 8 kHz, at least with this API.
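A practical workaround is to downsample the PCM data to 8 kHz before handing it to AmrInputStream. A minimal sketch, assuming 16-bit mono samples and a source rate that is an integer multiple of 8000 Hz (the helper name is made up):
// Hypothetical helper: naive decimation, e.g. factor 2 for 16 kHz -> 8 kHz.
// Keeps every factor-th sample; a proper resampler would low-pass filter
// first to avoid aliasing.
static short[] downsampleTo8k(short[] input, int factor) {
    short[] output = new short[input.length / factor];
    for (int i = 0; i < output.length; i++) {
        output[i] = input[i * factor];
    }
    return output;
}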

Related

Buffered file reading in Flutter

I have the following Dart code and I am trying to make reading the file buffered, just like Java's BufferedReader or C++'s ifstream. Is there such functionality? I cannot even find a buffer mentioned in file.dart or file_impl.dart. If I understood my debugging correctly, it seems that Dart reads the whole file at once.
So could anybody help me make it buffered, or point me in the right direction to where the buffer is?
final file = File(join(documentsDirectory, "xxx.txt"));
final List<String> lines = await file.readAsLines(); // or file.readAsLinesSync()
lines.forEach((line) {
  ....
});
Use file.openRead(). This returns a Stream of the file's bytes. If you want to read it as characters, transform the stream using the appropriate decoder (probably utf8).
As the documentation says, you must read the stream to the end or cancel it.
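A minimal sketch of that approach, assuming a UTF-8 text file (the function name readBuffered is made up for illustration):
import 'dart:convert';
import 'dart:io';

Future<void> readBuffered(File file) async {
  // openRead() streams the file in chunks rather than loading it whole.
  final lines = file
      .openRead()
      .transform(utf8.decoder)          // bytes -> strings
      .transform(const LineSplitter()); // strings -> individual lines
  await for (final line in lines) {
    print(line);
  }
}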

Get audio input stream from a local file on Android

I am trying to get an audio input stream from a file in the local file system on an Android device.
This is so that I can use this library to show a waveform for the audio file:
https://github.com/newventuresoftware/WaveformControl/blob/master/app/src/main/java/com/newventuresoftware/waveformdemo/MainActivity.java#L125
The example in the project uses a raw resource, like so:
InputStream is = getResources().openRawResource(R.raw.jinglebells);
This input stream is later converted into a byte array and passed to something that uses it to paint the waveform and play the sound.
However, when I tried
InputStream is = new FileInputStream(new File(filePath));
it did not work properly. The image generated is wrong, and the sound played is nothing like what the file actually is.
This is the body of the function in that library that gets the input stream and converts it into an array of samples:
private short[] getAudioSample() throws IOException {
    // If I replace this part with new FileInputStream(new File(filePath)),
    // the generated "samples" do not work properly with the library.
    InputStream is = getResources().openRawResource(R.raw.jinglebells);
    byte[] data;
    try {
        data = IOUtils.toByteArray(is);
    } finally {
        if (is != null) {
            is.close();
        }
    }

    // Interpret the raw bytes as little-endian 16-bit PCM samples.
    ShortBuffer sb = ByteBuffer.wrap(data).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
    short[] samples = new short[sb.limit()];
    sb.get(samples);
    return samples;
}
The sound file that I would like to have processed and passed to that library is created by a MediaRecorder with the following configuration:
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
Basically, that library requires PCM samples; feeding it the 3gp FileInputStream produced by MediaRecorder directly is not going to cut it.
What I did was use MediaExtractor and MediaCodec to decode the 3gp audio data into PCM, sample by sample, and then feed that into the library. Then everything worked =)
The logic for decoding the audio data can be taken almost directly from this awesome GitHub repo.
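For reference, a minimal sketch of that decode path (not the original poster's exact code): it assumes API 21+, a single audio track at index 0, and a caller-supplied filePath; the helper name decodeToPcm is made up, and error handling is trimmed.
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;

// Decodes an audio file (e.g. the MediaRecorder 3gp) into raw PCM bytes
// using the synchronous MediaCodec API. Format-changed events are
// ignored for brevity.
static byte[] decodeToPcm(String filePath) throws Exception {
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(filePath);
    MediaFormat format = extractor.getTrackFormat(0); // assume track 0 is audio
    extractor.selectTrack(0);

    MediaCodec codec = MediaCodec.createDecoderByType(
            format.getString(MediaFormat.KEY_MIME));
    codec.configure(format, null, null, 0);
    codec.start();

    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    ByteArrayOutputStream pcm = new ByteArrayOutputStream();
    boolean inputDone = false, outputDone = false;
    while (!outputDone) {
        if (!inputDone) {
            int inIndex = codec.dequeueInputBuffer(10_000);
            if (inIndex >= 0) {
                ByteBuffer inBuf = codec.getInputBuffer(inIndex);
                int size = extractor.readSampleData(inBuf, 0);
                if (size < 0) { // no more compressed data: signal end of stream
                    codec.queueInputBuffer(inIndex, 0, 0, 0,
                            MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                    inputDone = true;
                } else {
                    codec.queueInputBuffer(inIndex, 0, size,
                            extractor.getSampleTime(), 0);
                    extractor.advance();
                }
            }
        }
        int outIndex = codec.dequeueOutputBuffer(info, 10_000);
        if (outIndex >= 0) {
            ByteBuffer outBuf = codec.getOutputBuffer(outIndex);
            byte[] chunk = new byte[info.size];
            outBuf.position(info.offset);
            outBuf.get(chunk);
            pcm.write(chunk, 0, chunk.length);
            codec.releaseOutputBuffer(outIndex, false);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                outputDone = true;
            }
        }
    }
    codec.stop();
    codec.release();
    extractor.release();
    // The result is 16-bit little-endian PCM, ready for the
    // ByteBuffer/ShortBuffer conversion shown in getAudioSample().
    return pcm.toByteArray();
}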

Using AudioTrack in AudioTrack.MODE_STATIC?

I've found lots of tutorials and posts showing how to use AudioTrack to play wav files in AudioTrack.MODE_STREAM, and I've successfully implemented this example.
However, I'm having performance issues when playing multiple audio tracks at once, and I'm thinking I should first create the tracks using AudioTrack.MODE_STATIC and then just call play each time.
I can't find any resources on how to implement this. How can I do it?
Thanks
The two main sticking points for me were realizing that write() comes first and that the instantiated player must be given the size of the entire clip as its bufferSizeInBytes.
Assuming you have recorded a PCM file using AudioRecord, you can play it back in MODE_STATIC like so...
File file = new File(FILENAME);
int audioLength = (int) file.length();
byte[] filedata = new byte[audioLength];
try {
    InputStream inputStream = new BufferedInputStream(new FileInputStream(FILENAME));
    int lengthOfAudioClip = inputStream.read(filedata, 0, audioLength);
    inputStream.close();
    // The whole clip must fit in the track's buffer, so audioLength
    // is passed as bufferSizeInBytes (AUDIO_MODE is AudioTrack.MODE_STATIC).
    player = new AudioTrack(STREAM_TYPE, SAMPLE_RATE, CHANNEL_OUT_CONFIG,
            AUDIO_FORMAT, audioLength, AUDIO_MODE);
    player.write(filedata, OFFSET, lengthOfAudioClip);
    player.setPlaybackRate(playbackRate);
    player.play();
} catch (IOException e) {
    e.printStackTrace();
}
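For concreteness, here is one plausible set of values for those placeholder constants (an assumption, not from the original post), for 16-bit mono PCM recorded at 44.1 kHz:
// Hypothetical concrete values for the placeholders above.
player = new AudioTrack(
        AudioManager.STREAM_MUSIC,        // STREAM_TYPE
        44100,                            // SAMPLE_RATE
        AudioFormat.CHANNEL_OUT_MONO,     // CHANNEL_OUT_CONFIG
        AudioFormat.ENCODING_PCM_16BIT,   // AUDIO_FORMAT
        audioLength,                      // entire clip as bufferSizeInBytes
        AudioTrack.MODE_STATIC);          // AUDIO_MODE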

Android InputStreamReader reads exactly the given buffer size?

I am reading a file with:
char[] buffer = new char[300];
FileInputStream istream = new FileInputStream(path);
InputStreamReader file = new InputStreamReader(istream);
size = file.read(buffer);
file.close();
After a few tries, it turns out that file.read(buffer) reads exactly the number of chars allocated for the buffer (in this case 300, even though the file has many more characters in it).
Can I rely on read() always reading as much as it can, without any exception being thrown?
Or is this an undocumented feature?
The read method's description says:
"Reads characters from this reader and stores them in the character array buf starting at offset 0. Returns the number of characters actually read or -1 if the end of the reader has been reached."
There is no mention of the buffer allocation issue.
This is very important, and it is a good thing that it works this way, because it allows you to define the buffer size as you want/need, with no need to guess and no need to code for exceptions. Effectively the method is read(char[] buffer), but it behaves as read(buffer, 0, buffer.length).
Yes, you can rely on this call unless an I/O error occurs, which is already mentioned in the API.
If you look at the code of read(char[] cbuf), you'll notice that it calls the method public int read(char[] buffer, int offset, int length).
From the Android source code:
public int read(char cbuf[]) throws IOException {
    return read(cbuf, 0, cbuf.length);
}
In your implementation, you need to keep calling file.read(buffer) to obtain the remaining characters. The contents of buffer need to be appended to another, growing buffer, sized according to the file you're reading.
You could also allocate that buffer up front with the size of the file, obtained via File.length().
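A minimal sketch of such a read loop (the helper name readAll is hypothetical):
import java.io.*;

// Hypothetical helper: reads a whole text file by looping read()
// until it returns -1, growing a StringBuilder as it goes.
static String readAll(String path) throws IOException {
    StringBuilder sb = new StringBuilder();
    try (Reader reader = new InputStreamReader(new FileInputStream(path))) {
        char[] buffer = new char[300];
        int n;
        while ((n = reader.read(buffer)) != -1) {
            sb.append(buffer, 0, n); // append only the chars actually read
        }
    }
    return sb.toString();
}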

How to merge two mp3 files into one (combine/join)

Can anyone tell me how to combine/merge two media files into one?
I found a topic about AudioInputStream, but it's not supported on Android, and all the code is for desktop Java. On StackOverflow I found this link here, but I can't find a solution there; those links are only about streaming audio. Can anyone help?
P.S. And why can't I start a bounty? :(
import java.io.*;

public class TwoFiles {
    public static void main(String args[]) throws IOException {
        FileInputStream fistream1 = new FileInputStream("C:\\Temp\\1.mp3"); // first source file
        FileInputStream fistream2 = new FileInputStream("C:\\Temp\\2.mp3"); // second source file
        SequenceInputStream sistream = new SequenceInputStream(fistream1, fistream2);
        FileOutputStream fostream = new FileOutputStream("C:\\Temp\\final.mp3"); // destination file

        int temp;
        while ((temp = sistream.read()) != -1) {
            // System.out.print((char) temp); // to print at DOS prompt
            fostream.write(temp); // to write to file
        }

        fostream.close();
        sistream.close();
        fistream1.close();
        fistream2.close();
    }
}
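As a side note, reading one byte at a time is slow for large files; a buffered variant of the same copy loop (same result, far fewer calls) would be:
byte[] buf = new byte[8192];
int len;
while ((len = sistream.read(buf)) != -1) {
    fostream.write(buf, 0, len);
}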
Consider two cases for .mp3 files:
1. Files with the same sampling frequency and number of channels. In this case, we can simply append the second file to the end of the first. This can be achieved using the File classes available on Android.
2. Files with a different sampling frequency or number of channels. In this case, one of the clips has to be re-encoded so that both files share the same sampling frequency and number of channels. To do this, we would need to decode the MP3, get the PCM samples, process them to change the sampling frequency, and then re-encode to MP3. As far as I know, Android does not have transcoding or re-encoding APIs. One option is to use an external library such as LAME or FFmpeg via JNI for the re-encode.
