Strange issue combining audio files and playing them on different API versions - Android

I am using MediaRecorder for recording audio.
Case 1: On devices running Android 2.2, my recorded audio files are combined and play well.
Case 2: On devices running Android 1.6, I am not able to play the combined audio file.
Only the very first recorded audio plays; the subsequent recordings are empty, with no sound.
I am also not getting any errors in Logcat.
I used the following code for recording audio:
mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.RAW_AMR);
mRecorder.setOutputFile(main_record_file);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mRecorder.prepare();
mRecorder.start();
I also tried mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
Code for combining the audio files:
public void createCombineRecFile(){
combined_file_stored_path=getFilename_combined_raw(); // File path in String to store recorded audio
byte fileContent[]=null;
FileInputStream ins;
FileOutputStream fos = null;
try{
fos = new FileOutputStream(combined_file_stored_path,true);
}
catch (FileNotFoundException e1){
// TODO Auto-generated catch block
e1.printStackTrace();
}
for(int i=0;i<audNames.size();i++){
try{
File f=new File(audNames.get(i));
Log.v("Record Message", "File Length=========>>>"+f.length());
fileContent = new byte[(int)f.length()];
ins=new FileInputStream(audNames.get(i));
int r=ins.read(fileContent);// Reads the file content as bytes from the list.
Log.v("Record Message", "Number Of Bytes Read=====>>>"+r);
fos.write(fileContent);//Write the bytes into the combined file.
ins.close();//Close the input stream so the file handle is released.
Log.v("Record Message", "File======="+i+" is Appended");
}
catch (FileNotFoundException e){
// TODO Auto-generated catch block
e.printStackTrace();
}
catch (IOException e)
{
// TODO Auto-generated catch block
e.printStackTrace();
}
}
try{
fos.close();
Log.v("Record Message", "===== Combine File Closed =====");
}
catch (IOException e){
// TODO Auto-generated catch block
e.printStackTrace();
}
}
Let me know if you need any further details. Thanks.

Every audio file has its own header (which includes information about length/samples etc.). By combining the files the way you do, the resulting file has multiple headers, one per source file (depending on the exact format, file offsets, etc.).
Thus the resulting file is NOT correct in terms of the file format spec.
The newer Android versions are more permissive and will play files with multiple headers present... the older versions do not...
To create a correctly combined audio file you must conform to the spec, which among other things means writing one new header that describes all the included audio...
Use a different approach to combine the audio files - for example via ffmpeg (see this for how to build ffmpeg for Android).

Foreword: Haven't tested this, but I don't see why it shouldn't work.
Provided the headers ARE the cause of this problem, you can solve it really easily. Using the code you've given, the encoding is AMR-NB. According to this document the AMR header is simply the first 6 bytes, which are 0x23, 0x21, 0x41, 0x4D, 0x52, 0x0A. If the headers in subsequent files are causing the issue, simply omit those bytes from subsequent files e.g.
write all bytes of first file
write from byte[6] -> byte[end] of subsequent files
Let me know how it goes.
EDIT: At request, change the try block to:
try{
File f=new File(audNames.get(i));
Log.v("Record Message", "File Length=========>>>"+f.length());
fileContent = new byte[(int)f.length()];
///////////////new bit////////
//same as you had, this opens a byte stream to the file
ins=new FileInputStream(audNames.get(i));
//reads fileContent.length bytes
int r=ins.read(fileContent);
//now fileContent contains the entire audio file - in bytes.
if(i>0){
//we are not writing the first audio recording, but subsequent ones
//so we don't want the header included in the write
//copy the entire file, but not the first 6 bytes
byte[] headerlessFileContent = new byte[fileContent.length-6];
for(int j=6; j<fileContent.length;j++){
headerlessFileContent[j-6] = fileContent[j];
}
fileContent = headerlessFileContent;
}
////////////////////////////
Log.v("Record Message", "Number Of Bytes Readed=====>>>"+r);
fos.write(fileContent);//Write the byte into the combine file.
Log.v("Record Message", "File======="+i+"is Appended");
}
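As a side note, the manual copy loop above can be replaced by a single System.arraycopy call that does the same thing:
// copy everything after the 6-byte AMR header in one call
byte[] headerlessFileContent = new byte[fileContent.length - 6];
System.arraycopy(fileContent, 6, headerlessFileContent, 0, headerlessFileContent.length);
fileContent = headerlessFileContent;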

Related

How to break a video into pieces using android?

I want Java code to split a video file into parts of a specific size.
e.g. Consider a video of 20 MB; I want pieces of 5 MB each, so we get 4 parts.
I have used the code below, but it only creates .MP4 part files; it is not creating playable video files.
public static void divideFile(File f) {
int partCounter = 1;//I like to name parts from 001, 002, 003,
//you can change it to 0 if you want 000, 001, 002
int sizeOfFiles = 1024 * 1024;// 1MB
byte[] buffer = new byte[sizeOfFiles];
String fileName = f.getName();
//try-with-resources to ensure closing stream
try (FileInputStream fis = new FileInputStream(f);
BufferedInputStream bis = new BufferedInputStream(fis)) {
int bytesAmount = 0;
while ((bytesAmount = bis.read(buffer)) > 0) {
//write each chunk of data into separate file with different number in name
String filePartName = String.format("%s.%03d", fileName, partCounter++);
File newFile = new File(f.getParent(), filePartName);
try (FileOutputStream out = new FileOutputStream(newFile)) {
out.write(buffer, 0, bytesAmount);
}
}
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
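As an aside, to get the 5 MB parts described in the question, the chunk size constant in this code would need to change accordingly; the input path below is just a placeholder. Note that, as the answer below explains, the resulting parts still won't be playable videos.
int sizeOfFiles = 5 * 1024 * 1024; // 5 MB per part
divideFile(new File("/sdcard/myvideo.mp4")); // placeholder path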
I think you want each part to be a playable video itself - in this case each part needs the correct metadata headers to allow a player to handle it properly.
Breaking the video up just by bytes will mean that metadata is not present or not correct in any of the chunks.
You can use ffmpeg to do this correctly with the following command (for mp4; -ss is the start time and -t is the duration of the chunk):
ffmpeg -i videoPath -ss startTime -t duration -c copy outputVideoChunk.mp4
There are several ways to use ffmpeg within an Android app but one of the easiest is to use a well supported wrapper library like:
https://github.com/WritingMinds/ffmpeg-android-java
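For illustration, here is a rough sketch of running such a chunking command through that wrapper. The FFmpeg.getInstance(...)/execute(...) calls and the handler class are how I recall that library's API, so treat the exact signatures as assumptions; the paths and timestamps are placeholders, and the library also needs a one-time loadBinary(...) call that is omitted here:
// assumes the wrapper library is on the classpath and loadBinary(...) has already succeeded
void cutChunk(Context context, String inputPath, final String outputPath) {
    String[] cmd = {
            "-i", inputPath,
            "-ss", "00:00:00",   // start time of this chunk (placeholder)
            "-t", "00:05:00",    // duration of this chunk (placeholder)
            "-c", "copy",        // stream copy, no re-encoding
            outputPath
    };
    try {
        FFmpeg ffmpeg = FFmpeg.getInstance(context);
        ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
            @Override
            public void onSuccess(String message) {
                Log.v("Chunk", "Chunk created: " + outputPath);
            }

            @Override
            public void onFailure(String message) {
                Log.e("Chunk", "ffmpeg failed: " + message);
            }
        });
    } catch (Exception e) {
        // execute() can throw if another ffmpeg command is already running
        e.printStackTrace();
    }
}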

How to save a textview's result into txt or pdf file on the internal storage of the device?

I'm new to Android development, and I plan to print the result of a calculation in my app through the print API; I will use a txt or pdf file for this.
For every calculation there is an output which is shown in a TextView, like this:
Result.setText("Weight " + i1 + "\nHeight " + i2);
The result may reach 20 lines!
First of all I need to save it as a txt or pdf file on the internal storage of the device.
So, if there is any way to do that, please help.
I'm assuming that you have some other method for creating a String or byte[] containing your data. When you place them into internal storage, they're saved as raw bytes usually, and the extension name really doesn't mean much (but you can add one if you want).
//create an output stream
FileOutputStream out = null;
try {
    //open output to internal storage. filename here is whatever
    //you want to use to call it (extensions are optional)
    //context can be application or activity context
    out = context.openFileOutput(filename, Context.MODE_PRIVATE);
    //data is your byte[]; if you have a String, pass data.getBytes() instead
    out.write(data);
//catch all exceptions
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
} finally {
    //make sure the stream is closed so the data is flushed to disk
    if (out != null) {
        try {
            out.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Saving ByteArray of audio into an Audio File

I'm working on a test app to integrate soundtouch (an open source audio processing library) on Android.
My test app already can receive input from the mic, pass the audio thru soundtouch and output the processed audio to an AudioTrack instance.
My question is, how can I change the output from AudioTrack to a new File on my device?
Here's the relevant code in my app (where I'm processing the output of soundtouch, into the input for AudioTrack)
// the following code is a stripped-down version of my code
// it is not supposed to compile or work; it's here for reference purposes
// pre-conditions
// parameters - input : byte[]
soundTouchJNIInstance.putBytes(input);
int bytesReceived = soundTouchJNIInstance.getBytes(input);
audioTrackInstance.write(input, 0, bytesReceived);
Any ideas on how to approach this problem? Thanks!
I hope you are already capturing the input voice from the microphone and saving it to a file.
First, load the JNI libraries in your onCreate method:
System.loadLibrary("soundtouch");
System.loadLibrary("soundstretch");
SoundStretch library:
public class AndroidJNI extends SoundStretch{
public final static SoundStretch soundStretch = new SoundStretch();
}
Now you need to call soundStretch.process with the input file path and the desired output file (to store the processed voice) as parameters:
AndroidJNI.soundStretch.process(dataPath + "inputFile.wav", dataPath + "outputFile.wav", tempo, pitch, rate);
File sound = new File(dataPath + "outputFile.wav");
File sound2 = new File(dataPath + "inputFile.wav");
Uri soundUri = Uri.fromFile(sound);
The soundUri can be provided as a source to MediaPlayer for playback:
MediaPlayer mediaPlayer = MediaPlayer.create(this, soundUri);
mediaPlayer.start();
Also note that the sample rate for recording should be selected dynamically by declaring an array of sample rates:
int[] sampleRates = { 44100, 22050, 11025, 8000 };
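For illustration, a minimal sketch (using android.media.AudioRecord/AudioFormat) of probing those rates to find one the device actually supports; the mono/16-bit configuration is just a common recording setup assumed here, not something mandated by the answer:
// try each candidate rate until AudioRecord reports a valid minimum buffer size
int chosenRate = -1;
for (int rate : sampleRates) {
    int bufferSize = AudioRecord.getMinBufferSize(rate,
            AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT);
    if (bufferSize > 0) { // ERROR and ERROR_BAD_VALUE are negative
        chosenRate = rate;
        break;
    }
}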
The best way to write the byte array is this:
public void writeToFile(byte[] array)
{
    try
    {
        String path = "Your path.mp3";
        File file = new File(path);
        if (!file.exists()) {
            file.createNewFile();
        }
        FileOutputStream stream = new FileOutputStream(path);
        stream.write(array);
        stream.close();
    }
    catch (IOException e1)
    {
        // FileNotFoundException is a subclass of IOException, so both are covered
        e1.printStackTrace();
    }
}
I am not aware of SoundTouch at all, and the link I am providing nowhere deals with JNI code, but you can have a look at it in case it helps you in any way: http://i-liger.com/article/android-wav-audio-recording
I think the best way to achieve this is to convert that audio to a byte[] array. Assuming you have already done that (if not, comment and I'll provide an example), the code below should work. It assumes you're saving it on the external SD card in a new directory called AudioRecording, as audiofile.mp3.
final File soundFile = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/AudioRecording/");
soundFile.mkdirs();
final File outFile = new File(soundFile, "audiofile.mp3");
try {
final FileOutputStream output = new FileOutputStream(outFile);
output.write(yourByteArrayWithYourAudioFileConverted);
output.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
The mkdirs() method will try to create all the missing parent directories, so if you're planning to store the file two or more directory levels deep, this will create the whole structure.
I use a simple test code snippet to write my audio byte arrays:
public void saveAudio(byte[] array, String pathAndName) throws IOException
{
    FileOutputStream stream = new FileOutputStream(pathAndName);
    try {
        stream.write(array);
    } finally {
        stream.close();
    }
}
You will probably need to add some exception handling if you are going to use this in a production environment, but I utilise the above to save audio whenever I am in the development phase or for personal non-release projects.
Addendum
After some brief thought I have changed my snippet to the following slightly more robust format (try-with-resources closes the stream automatically, so no explicit close is needed):
public void saveAudio(byte[] array, String pathAndName)
{
    try (FileOutputStream stream = new FileOutputStream(pathAndName)) {
        stream.write(array);
    } catch (IOException ioe) {
        ioe.printStackTrace();
    }
}
You can use the approach below based on SequenceInputStream. In my app I just merge MP3 files into one and play it using the JNI library MPG123, but I also tested the merged file with MediaPlayer without problems.
This code isn't the best, but it works...
private void mergeSongs(File mergedFile,File...mp3Files){
FileInputStream fisToFinal = null;
FileOutputStream fos = null;
try {
fos = new FileOutputStream(mergedFile);
fisToFinal = new FileInputStream(mergedFile);
for(File mp3File:mp3Files){
if(!mp3File.exists())
continue;
FileInputStream fisSong = new FileInputStream(mp3File);
SequenceInputStream sis = new SequenceInputStream(fisToFinal, fisSong);
byte[] buf = new byte[1024];
try {
for (int readNum; (readNum = fisSong.read(buf)) != -1;)
fos.write(buf, 0, readNum);
} finally {
if(fisSong!=null){
fisSong.close();
}
if(sis!=null){
sis.close();
}
}
}
} catch (IOException e) {
e.printStackTrace();
}finally{
try {
if(fos!=null){
fos.flush();
fos.close();
}
if(fisToFinal!=null){
fisToFinal.close();
}
} catch (IOException e) {
e.printStackTrace();
}
}
}
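For reference, a call to the method above might look like this (the directory and file names are placeholders for this example):
// merge two recordings that already exist in the app's files directory
File dir = getFilesDir();
mergeSongs(new File(dir, "merged.mp3"),
        new File(dir, "part1.mp3"),
        new File(dir, "part2.mp3"));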

encode video file at specific time of the video

I have a video file on the SD card. I want to encode the part of the video from minute 3 to minute 5. What I'm trying to say is:
Let's say the complete video file is 10 minutes long. Now I need to retrieve the video data from minute 3 to minute 5 of the video, encode it, and do the next processing step.
I have done this before, but only by encoding the complete video file; I don't have any idea how to encode the video at a specific time range. Please help with this. Below is the code where I encode the complete video file:
public void loadVideo(){
//convert whole file into byte
File file = new File("/sdcard/videooutput.mp4");
byte[] fileData = new byte[(int) file.length()];
FileInputStream in;
try {
in = new FileInputStream(file);
try {
// read the whole file into the byte array
// (note: a single read() may return fewer bytes than the file length;
// new DataInputStream(in).readFully(fileData) would be the safer variant)
in.read(fileData);
} catch (IOException e) {
Toast.makeText(this, "IO exc READ file", 2500).show();
e.printStackTrace();
}
try {
in.close();
} catch (IOException e) {
Toast.makeText(this, "IO exc CLOSE file", 2500).show();
e.printStackTrace();
}
String encoded = Base64.encodeToString(fileData, Base64.DEFAULT);
hantar(encoded);
} catch (FileNotFoundException e) {
Toast.makeText(this, "FILE NOT FOUND", 2500).show();
e.printStackTrace();
}
}
Please guide me with this. Thanks in advance.
You can try to use this class. It crops the video from a start to an end position and saves the cropped video to a file on your file system. Or you can use it as a basis for your own solution.
There is already answered question about cutting or trimming video on Android:
Android sdk cut/trim video file
You can try INDE Media for mobile - https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials
It has transcoding/remuxing functionality in its MediaComposer class and the possibility to select segments for the resulting files.
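If the goal is only to cut out the 3 to 5 minute segment without re-encoding, another option not mentioned in the answers above is the platform MediaExtractor/MediaMuxer pair (API 18+). A rough sketch, with paths and times as placeholders; the flag mapping between extractor and muxer is simplified, so treat this as a starting point rather than production code:
import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.media.MediaMuxer;

// copies the samples between startUs and endUs from srcPath into dstPath (no re-encode)
public static void cutSegment(String srcPath, String dstPath, long startUs, long endUs) throws Exception {
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(srcPath);
    MediaMuxer muxer = new MediaMuxer(dstPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

    // register every track of the source with the muxer
    int trackCount = extractor.getTrackCount();
    int[] dstTrack = new int[trackCount];
    for (int i = 0; i < trackCount; i++) {
        extractor.selectTrack(i);
        MediaFormat format = extractor.getTrackFormat(i);
        dstTrack[i] = muxer.addTrack(format);
    }
    muxer.start();

    // jump to the nearest sync frame before the start position
    extractor.seekTo(startUs, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);

    ByteBuffer buffer = ByteBuffer.allocate(1024 * 1024);
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    while (true) {
        info.offset = 0;
        info.size = extractor.readSampleData(buffer, 0);
        if (info.size < 0) break;            // end of stream
        long sampleTimeUs = extractor.getSampleTime();
        if (sampleTimeUs > endUs) break;     // past the requested segment
        info.presentationTimeUs = sampleTimeUs - startUs;
        info.flags = extractor.getSampleFlags();
        muxer.writeSampleData(dstTrack[extractor.getSampleTrackIndex()], buffer, info);
        extractor.advance();
    }

    muxer.stop();
    muxer.release();
    extractor.release();
}
For the 3 to 5 minute range this would be called with startUs = 3 * 60 * 1000000L and endUs = 5 * 60 * 1000000L.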

Managing text files Android

I've got a little problem managing .txt files on Android. I've found this link (it's in Spanish) that explains how to use text files on an Android device.
What I want to do is create a text file in the internal storage of the device, as I don't want the user to depend on an SD card, and a raw resource text file only allows reading, not writing. So I want a text file that I can append information to and, in a particular case, whose content I can delete entirely to reset it.
I've written this code for the writing side:
OutputStreamWriter fout = null;
try
{
fout = new OutputStreamWriter(openFileOutput("measures.txt", Context.MODE_APPEND));
fout.write(measure);
}
catch (Exception ex)
{
Log.e("Files", "Error while opening to write file measures.txt");
}
finally
{
try
{
if (fout != null) {
fout.close();
}
}
catch (IOException e)
{
e.printStackTrace();
}
}
I guess this part opens the file "measures.txt" in APPEND mode, or creates it in APPEND mode if it doesn't exist yet.
When I try to read from it:
BufferedReader fin = null;
try
{
fin = new BufferedReader(new InputStreamReader(context.openFileInput("measures.txt")));
String line = fin.readLine();
// Some stuff with this line
fin.close();
}
// catch stuff
What I want to do is delete all the content of the text file before I close it. The idea is to store the information in another variable and then, when I finish reading from the file, reset its content.
How can I do that?
OK, I solved my problem by doing this:
deleteFile("measures.txt");
And that will definitely erase the file... :P
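If you'd rather reset the file's content without deleting it, another option (a sketch, not part of the original answer) is to reopen it with MODE_PRIVATE, which truncates the existing file to zero length:
try {
    // MODE_PRIVATE (unlike MODE_APPEND) truncates an existing file,
    // effectively resetting its content
    OutputStreamWriter reset = new OutputStreamWriter(
            openFileOutput("measures.txt", Context.MODE_PRIVATE));
    reset.close();
} catch (IOException e) {
    e.printStackTrace();
}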
