Saving a ByteArray of audio into an audio file - Android

I'm working on a test app to integrate SoundTouch (an open-source audio processing library) on Android.
My test app can already receive input from the mic, pass the audio through SoundTouch, and output the processed audio to an AudioTrack instance.
My question is: how can I redirect the output from AudioTrack into a new file on my device?
Here's the relevant code in my app (where I'm processing the output of SoundTouch into the input for AudioTrack):
// The following code is a stripped-down version of my code.
// It is not supposed to compile or work; it's here for reference purposes.
// pre-conditions
// parameters - input : byte[]
soundTouchJNIInstance.putBytes(input);
int bytesReceived = soundTouchJNIInstance.getBytes(input);
audioTrackInstance.write(input, 0, bytesReceived);
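Is it as simple as also writing the same processed buffer to a FileOutputStream? Something like this hypothetical extra line after the write call:
// hypothetical: persist the processed bytes as well?
fileOutputStream.write(input, 0, bytesReceived);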
Any ideas on how to approach this problem? Thanks!

I hope you are already getting the input voice from the microphone and saving it to a file.
First, load the JNI libraries, for example in your onCreate method:
System.loadLibrary("soundtouch");
System.loadLibrary("soundstretch");
The SoundStretch library:
public class AndroidJNI extends SoundStretch {
    public final static SoundStretch soundStretch = new SoundStretch();
}
Now call soundStretch.process with the input file path and the desired output file path (where the processed voice will be stored) as parameters:
AndroidJNI.soundStretch.process(dataPath + "inputFile.wav", dataPath + "outputFile.wav", tempo, pitch, rate);
File sound = new File(dataPath + "outputFile.wav");
File sound2 = new File(dataPath + "inputFile.wav");
Uri soundUri = Uri.fromFile(sound);
The soundUri can be provided as the source to a MediaPlayer for playback:
MediaPlayer mediaPlayer = MediaPlayer.create(this, soundUri);
mediaPlayer.start();
Also note that the sample rate for recording should be selected dynamically, by declaring an array of candidate sample rates:
int[] sampleRates = { 44100, 22050, 11025, 8000 };
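A minimal sketch of that dynamic selection (pickSupportedSampleRate is a hypothetical helper name; AudioRecord.getMinBufferSize returns a negative error code for configurations the device does not support):
private static int pickSupportedSampleRate() {
    int[] sampleRates = { 44100, 22050, 11025, 8000 };
    for (int rate : sampleRates) {
        // getMinBufferSize rejects unsupported rate/format combinations
        int bufferSize = AudioRecord.getMinBufferSize(rate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        if (bufferSize > 0) {
            return rate; // first rate the hardware accepts
        }
    }
    return -1; // none of the candidate rates is supported
}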

The best way to write a byte array to a file is this:
public void writeToFile(byte[] array) {
    try {
        String path = "Your path.mp3";
        File file = new File(path);
        if (!file.exists()) {
            file.createNewFile();
        }
        FileOutputStream stream = new FileOutputStream(path);
        try {
            stream.write(array);
        } finally {
            stream.close(); // always release the file handle
        }
    } catch (IOException e) {
        // IOException also covers the FileNotFoundException case
        e.printStackTrace();
    }
}
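Note that new FileOutputStream(path) truncates an existing file, so calling writeToFile repeatedly for successive audio buffers overwrites earlier data; to keep appending, open the stream in append mode:
FileOutputStream stream = new FileOutputStream(path, true); // true = append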

I am not aware of SoundTouch at all, and the link I am providing nowhere deals with JNI code, but you can have a look at it if it helps you in any way: http://i-liger.com/article/android-wav-audio-recording

I think the best way to achieve this is to convert that audio to a byte[] array. Assuming you have already done that (if not, leave a comment and I'll provide an example), the code below should work. It assumes you're saving the file on the external SD card, in a new directory called AudioRecording, as audiofile.mp3.
final File soundFile = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/AudioRecording/");
soundFile.mkdirs();
final File outFile = new File(soundFile, "audiofile.mp3");
try {
    final FileOutputStream output = new FileOutputStream(outFile);
    output.write(yourByteArrayWithYourAudioFileConverted);
    output.close();
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
The mkdirs() method will try to create all the missing parent directories, so if you plan to store the file in a directory two or more levels deep, this call will create the whole structure.

I use a simple test code snippet to write my audio byte arrays:
public void saveAudio(byte[] array, String pathAndName) throws IOException {
    FileOutputStream stream = new FileOutputStream(pathAndName);
    try {
        stream.write(array);
    } finally {
        stream.close();
    }
}
You will probably need to add some exception handling if you are going to use this in a production environment, but I utilise the above to save audio whenever I am in the development phase or for personal non-release projects.
Addendum
After some brief thought I have changed my snippet to the following slightly more robust format; try-with-resources closes the stream automatically, so no explicit close is needed:
public void saveAudio(byte[] array, String pathAndName) {
    try (FileOutputStream stream = new FileOutputStream(pathAndName)) {
        stream.write(array);
    } catch (IOException ioe) {
        ioe.printStackTrace();
    }
}
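Hypothetical usage (processedBuffer is an invented name), saving into the app's private files directory:
saveAudio(processedBuffer, new File(getFilesDir(), "take1.pcm").getAbsolutePath());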

You can merge the files using SequenceInputStream; in my app I just merge MP3 files into one and play it using the JNI library MPG123, but I tested the merged file using MediaPlayer without problems.
This code isn't the best, but it works...
private void mergeSongs(File mergedFile, File... mp3Files) {
    FileOutputStream fos = null;
    SequenceInputStream sis = null;
    try {
        // Chain the input files into one continuous stream
        Vector<InputStream> streams = new Vector<InputStream>();
        for (File mp3File : mp3Files) {
            if (!mp3File.exists())
                continue;
            streams.add(new FileInputStream(mp3File));
        }
        sis = new SequenceInputStream(streams.elements());
        fos = new FileOutputStream(mergedFile);
        byte[] buf = new byte[1024];
        for (int readNum; (readNum = sis.read(buf)) != -1; ) {
            fos.write(buf, 0, readNum);
        }
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        try {
            if (sis != null) {
                sis.close(); // also closes the chained FileInputStreams
            }
            if (fos != null) {
                fos.flush();
                fos.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
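Hypothetical usage (musicDir and the file names are invented): merge two tracks into the cache directory and play the result:
File merged = new File(getCacheDir(), "merged.mp3");
mergeSongs(merged, new File(musicDir, "a.mp3"), new File(musicDir, "b.mp3"));
MediaPlayer player = MediaPlayer.create(this, Uri.fromFile(merged));
player.start();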

Related

How to play Base64 encoded audio in android?

What I have tried so far:
I have a Base64 encoded String of an audio file.
I am trying to play it like this:
public void PlayAudio(String base64EncodedString) {
    try {
        String url = "data:audio/mp3;base64," + base64EncodedString;
        Uri uri = Uri.parse(url);
        mediaPlayer.setDataSource(MainActivity.this, uri);
        /*mediaPlayer.prepare();*/
        mediaPlayer.start();
    } catch (Exception ex) {
        System.out.print(ex.getMessage());
    }
}
and
private void playMp3(byte[] mp3SoundByteArray) {
    try {
        // create temp file that will hold byte array
        File tempMp3 = File.createTempFile("kurchina", "mp3", getCacheDir());
        tempMp3.deleteOnExit();
        FileOutputStream fos = new FileOutputStream(tempMp3);
        fos.write(mp3SoundByteArray);
        fos.close();
        // resetting mediaplayer instance to evade problems
        mediaPlayer.reset();
        // In case you run into issues with threading consider new instance like:
        // MediaPlayer mediaPlayer = new MediaPlayer();
        // Tried passing path directly, but kept getting
        // "Prepare failed.: status=0x1"
        // so using file descriptor instead
        FileInputStream fis = new FileInputStream(tempMp3);
        mediaPlayer.setDataSource(fis.getFD());
        //mediaPlayer.prepare();
        mediaPlayer.start();
    } catch (IOException ex) {
        String s = ex.toString();
        ex.printStackTrace();
    }
}
But the audio is not playing.
Can anyone assist me with this?
You actually need to decode the Base64 encoded String before you play it. Please check the decoding procedure here.
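A minimal sketch of that decoding step, assuming the string holds a complete MP3 file: android.util.Base64 converts it back to raw bytes, and the playMp3 helper from the question can then play them (with mediaPlayer.prepare() uncommented):
byte[] mp3SoundByteArray = Base64.decode(base64EncodedString, Base64.DEFAULT);
playMp3(mp3SoundByteArray); // write to a temp file and play, as above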

How can I get a Picasa image folder on my Marshmallow device?

My application allows users to select an image to upload. When users select an image from a Picasa album, my data intent comes back with dat=content://com.sec.android.gallery3d.provider/picasa/item/....
Apparently, when selecting an image from a Picasa folder, I must handle getting the image differently, as noted in this answer.
But before I implement a fix, I want to be able to reproduce the crash so I can verify my fix actually works. So how can I get a Picasa folder on my new (Marshmallow) Android test device, given that Picasa has been killed by Google?
The most reliable way of getting a file sent inside an intent is to open a stream to it and copy it over to a private folder of your app.
This works for local files, content URIs, Picasa, all of it.
Something like this:
private File getSharedFile() {
    Uri uri = intent.getExtras().getParcelable(Intent.EXTRA_STREAM);
    // or using the new compat lib:
    // Uri uri = ShareCompat.IntentReader.from(this).getStream();
    InputStream is = null;
    OutputStream os = null;
    try {
        File f = ... define here a temp file // maybe getCacheDir();
        is = getContentResolver().openInputStream(uri);
        os = new BufferedOutputStream(new FileOutputStream(f));
        int read;
        byte[] bytes = new byte[2048];
        while ((read = is.read(bytes)) != -1) {
            os.write(bytes, 0, read);
        }
        return f;
    } catch (Exception e) {
        ... handle exceptions, buffer underflow, NPE, etc
    } finally {
        try { is.close(); } catch (Exception e) { /* you never know */ }
        try {
            os.flush();
            os.close();
        } catch (Exception e) { /* seriously, can happen */ }
    }
    return null;
}
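The temp-file line is deliberately left open; one hypothetical choice is the app's private cache directory:
File f = new File(getCacheDir(), "shared_upload.tmp"); // hypothetical name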

Is there any way to copy byte by byte from a file and play it using VideoView in Android?

I have tried this, but it's not working: it copies all the bytes at once and plays them. But I want to copy byte by byte from a file and play the video while it is still being received...
Some Code
tempFile = new File(Environment.getExternalStorageDirectory() + "/" + "temp.mp4");
int length = 0;
byte[] vidBytes = new byte[(int) file.length()];
try {
    //while (length < 100)
    //{
    fileInputStream = new FileInputStream(file);
    fileInputStream.read(vidBytes, 0, vidBytes.length);
    fileInputStream.close();
    videoPlayer.setVideoPath(tempFile.getAbsolutePath());
    Toast.makeText(this, tempFile.getPath(), Toast.LENGTH_LONG).show();
    videoPlayer.start();
    length++;
    //}
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
try {
    FileOutputStream fileOutputStream = new FileOutputStream(tempFile);
    fileOutputStream.write(vidBytes, 0, vidBytes.length);
    fileOutputStream.close();
} catch (IOException e) {
    e.printStackTrace();
}
What the hell are you doing? The fileInputStream is practically useless! You read the bytes into the byte array vidBytes but never actually use it anywhere. VideoView doesn't require you to open an input stream; all VideoView requires is the path to the file, and the VideoView code takes care of opening the stream and loading the data efficiently.
Also, why are you writing the same byte array that you read just a few statements ago, without modifying it, straight back out to another file? What are you trying to accomplish here?
P.S. This is all you need to do to play the MP4 file in the VideoView:
tempFile = new File(Environment.getExternalStorageDirectory()+ "/" +"temp.mp4");
videoPlayer.setVideoPath(tempFile.getAbsolutePath());
Toast.makeText(this,tempFile.getPath(),Toast.LENGTH_LONG).show();
videoPlayer.start();

Accessing the output video while recording

In short, I'm looking for a way to get the byte stream from the camera while recording video.
The aim is to continuously record while saving certain portions of the current recording, without stopping the actual recording process to access the output file. Is this even possible, or will I need to actually stop the recording and save it for it to be playable?
I've seen projects and open-source libraries that allow live streaming from the camera to a server via a local socket and the ParcelFileDescriptor class, so I assume (maybe incorrectly) that the recorder's byte stream must be accessible somehow.
Any suggestions or help would be greatly appreciated.
Set the output file to a FileDescriptor:
mRecorder.setOutputFile(getStreamFd());
Then use this function:
private FileDescriptor getStreamFd() {
    ParcelFileDescriptor[] pipe = null;
    try {
        pipe = ParcelFileDescriptor.createPipe();
        new TransferThread(new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]),
                new FileOutputStream(getOutputFile())).start();
    } catch (IOException e) {
        Log.e(getClass().getSimpleName(), "Exception opening pipe", e);
    }
    return pipe[1].getFileDescriptor();
}
private File getOutputFile() {
    return new File(Environment.getExternalStorageDirectory().getPath() + "/YourDirectory/filename");
}
New thread code:
static class TransferThread extends Thread {
    InputStream in;
    FileOutputStream out;

    TransferThread(InputStream in, FileOutputStream out) {
        this.in = in;
        this.out = out;
    }

    @Override
    public void run() {
        byte[] buf = new byte[8192];
        int len;
        try {
            while ((len = in.read(buf)) > 0) {
                out.write(buf, 0, len);
            }
            in.close();
            out.flush();
            out.getFD().sync();
            out.close();
        } catch (IOException e) {
            Log.e(getClass().getSimpleName(), "Exception transferring file", e);
        }
    }
}
Don't forget to add permissions to your manifest file:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
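Note that on Android 6.0+ (API 23) WRITE_EXTERNAL_STORAGE must additionally be granted at runtime; a minimal sketch using the support library:
// Check and, if needed, request the storage permission before recording
if (ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this,
            new String[] { Manifest.permission.WRITE_EXTERNAL_STORAGE }, 1);
}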
I had a similar problem and wanted to record H264 into an MP4 file while accessing the H264 NAL units from the camera byte stream (to redirect them to libRTMP). The following examples helped a lot (they require at least Android 4.3):
http://bigflake.com/mediacodec/ <- the CameraToMpegTest.java and "Android Breakout game recorder patch" examples
Basically, Android's MediaCodec class provides low-level access to the device's encoders/decoders. Take a look at the drainEncoder() function in the examples above:
the video data is sent to MediaMuxer to create an output file;
you can easily access the H264 NAL units from the encodedData ByteBuffer and process them any way you want.
Example:
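// Save the buffer's read position, rewind to the start, copy the
// encoded payload into a plain byte array, then restore the position.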
int old_pos = encodedData.position();
encodedData.position(0);
byte[] encoded_array = new byte[encodedData.remaining()];
encodedData.get(encoded_array);
encodedData.position(old_pos);

RandomAccessFile in Android raw resource file

I tried to create a RandomAccessFile object from a raw resource file in the Android resource directory, without success.
I'm only able to get an InputStream object from a raw resource:
getResources().openRawResource(R.raw.file);
Is it possible to create a RandomAccessFile object from a raw asset file, or do I need to stick with InputStream?
It's simply not possible to seek forward and back in an input stream without buffering everything in between into memory. That can be extremely costly and isn't a scalable solution for reading a (binary) file of arbitrary size.
You're right: ideally one would use a RandomAccessFile, but reading from the resources provides an input stream instead. The suggestion mentioned in the comments above is to use the input stream to write the file to the SD card and randomly access the file from there. You could instead consider writing the file to a temporary directory, reading it, and deleting it after use:
String file = "your_binary_file.bin";
AssetFileDescriptor afd = null;
FileInputStream fis = null;
File tmpFile = null;
RandomAccessFile raf = null;
try {
    afd = context.getAssets().openFd(file);
    long len = afd.getLength();
    fis = afd.createInputStream();
    // We'll create a file in the application's cache directory
    File dir = context.getCacheDir();
    dir.mkdirs();
    tmpFile = new File(dir, file);
    if (tmpFile.exists()) {
        // Delete the temporary file if it already exists
        tmpFile.delete();
    }
    FileOutputStream fos = null;
    try {
        // Write the asset file to the temporary location
        fos = new FileOutputStream(tmpFile);
        byte[] buffer = new byte[1024];
        int bufferLen;
        while ((bufferLen = fis.read(buffer)) != -1) {
            fos.write(buffer, 0, bufferLen);
        }
    } finally {
        if (fos != null) {
            try {
                fos.close();
            } catch (IOException e) {}
        }
    }
    // Read the newly created file
    raf = new RandomAccessFile(tmpFile, "r");
    // Read your file here
} catch (IOException e) {
    Log.e(TAG, "Failed reading asset", e);
} finally {
    if (raf != null) {
        try {
            raf.close();
        } catch (IOException e) {}
    }
    if (fis != null) {
        try {
            fis.close();
        } catch (IOException e) {}
    }
    if (afd != null) {
        try {
            afd.close();
        } catch (IOException e) {}
    }
    // Clean up
    if (tmpFile != null) {
        tmpFile.delete();
    }
}
Why not get a new AssetFileDescriptor each time you need to seek? It does not seem to be a CPU-cycle-intensive task (or is it?)
//seek to your first start position
InputStream ins = getAssets().openFd("your_file_name").createInputStream();
ins.skip(start);
//read some bytes
ins.read(toThisBuffer, 0, length);
//later on, seek to a different position: you need to call openFd again,
//because createInputStream can be called on an asset file descriptor only once.
//The new stream starts at file offset 0, so seek (skip()) to the new
//position relative to the beginning of the file.
ins = getAssets().openFd("your_file_name").createInputStream();
ins.skip(start2);
//read some bytes
ins.read(toThatBuffer, 0, length);
I've used this method in an app that needs random access to a 20 MB resource file hundreds of times per second.
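A hypothetical helper wrapping that pattern (the name readAt is invented); note that skip() may skip fewer bytes than requested, so it is called in a loop:
private int readAt(String assetName, long offset, byte[] dst, int length) throws IOException {
    InputStream ins = getAssets().openFd(assetName).createInputStream();
    try {
        long skipped = 0;
        while (skipped < offset) {
            long n = ins.skip(offset - skipped); // skip() may stop short
            if (n <= 0) break;
            skipped += n;
        }
        return ins.read(dst, 0, length);
    } finally {
        ins.close();
    }
}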
