My problem is really simple: I'm using a MediaRecorder to record voice while the user is pressing on a FAB, and playing it afterwards (when he/she releases). The issue is that I lose a few seconds near the end of the recording, and I can't figure out why (they never get played back). Code (only relevant parts are shown):
Variables
double record_length = 0;
boolean recording = false;
String outputFile;
Handler myHandler = new Handler();
MediaRecorder recorder = new MediaRecorder();
OnTouchListener
findViewById(R.id.record_record).setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
if (event.getAction() == MotionEvent.ACTION_DOWN) {
findViewById(R.id.delete_swipe).setVisibility(View.VISIBLE);
StartRecord();
} else if (event.getAction() == MotionEvent.ACTION_UP) {
if(recording){
EndRecord();
}
findViewById(R.id.delete_swipe).setVisibility(View.INVISIBLE);
}
return true;
}
});
public void StartRecord() {
recording = true;
record_length = 0;
SharedPreferences saved_login = getSharedPreferences("FalloundLogin", 0);
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.OutputFormat.AMR_NB);
//removed construction of outputFile, but it is generated correctly - I checked
recorder.setOutputFile(outputFile);
try {
recorder.prepare();
recorder.start();
} catch (IOException e) {
e.printStackTrace();
}
myHandler.postDelayed(UpdateUploadLength, 200);
}
public void EndRecord() {
recording = false;
try {
recorder.stop();
recorder.reset();
recorder = null;
} catch (IllegalStateException e) {
e.printStackTrace();
}
MediaPlayer m = new MediaPlayer();
try {
m.setDataSource(outputFile);
} catch (IOException e) {
e.printStackTrace();
}
try {
m.prepare();
} catch (IOException e) {
e.printStackTrace();
}
m.start();
}
I need the recording to be a maximum of 27 seconds. To avoid complications, I tested without this extra termination condition and am including the Runnable just for completeness.
private Runnable UpdateUploadLength = new Runnable(){
@Override
public void run() {
if (recording) {
record_length += 0.2;
if (record_length < 27) {
myHandler.postDelayed(UpdateUploadLength, 200);
} else {
//TODO: stop recording
myHandler.removeCallbacks(UpdateUploadLength);
}
}
}
};
I've been trying for a few hours with no luck, so any help is appreciated. (Also, I don't know if it's bad to ask multiple questions in the same post, but is there any way to get better audio quality from MediaRecorder?)
Thanks in advance.
This answers your second question: yes, you can get much better quality. The library offers more encoding types, file formats, and parameters. Example:
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
mediaRecorder.setAudioSamplingRate(44100);
mediaRecorder.setAudioEncodingBitRate(256000);
This code will set your recorder to produce MPEG-4 (.m4a) files with AAC encoding, a 44.1 kHz sampling rate, and roughly 256 kbps.
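Putting those calls together, a minimal sketch of a full higher-quality setup might look like this (the output path and permission handling are assumptions, not part of the original answer):
MediaRecorder mediaRecorder = new MediaRecorder();
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
mediaRecorder.setAudioSamplingRate(44100);      // 44.1 kHz
mediaRecorder.setAudioEncodingBitRate(256000);  // ~256 kbps
mediaRecorder.setOutputFile(outputFile);        // e.g. a path ending in .m4a (hypothetical)
try {
    mediaRecorder.prepare();
    mediaRecorder.start();
} catch (IOException e) {
    e.printStackTrace();
}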
Related
The issue is that call recording works fine up to Android 6.0.1, but it does not work properly on Android versions above that.
Problem: the call runs for 1 minute, but the recording stops after 2 to 3 seconds.
Here is the EditText for the contact:
edt_attempt_contact.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
final int DRAWABLE_RIGHT = 2;
if (event.getAction() == MotionEvent.ACTION_UP) {
if (event.getX() >= (edt_attempt_contact.getRight() - edt_attempt_contact.getCompoundDrawables()[DRAWABLE_RIGHT].getBounds().width())) {
if (!edt_attempt_contact.getText().toString().isEmpty()) {
Intent i = new Intent(Intent.ACTION_CALL, Uri.parse("tel:" + edt_attempt_contact.getText().toString()));
try {
startActivity(i);
}catch (SecurityException s){
s.printStackTrace();
}
try {
audioRecord();
} catch (IOException e) {
e.printStackTrace();
}
} else {
Toast.makeText(MainActivity.this, "Attempt Contact Number is required to call", Toast.LENGTH_SHORT).show();
}
return true;
}
}
return false;
}
});
}
Here is the main code for Call Recording.
private void audioRecord() throws IOException {
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(root + "/"
.concat("_")
.concat(generateUniqueFileName())
.concat(".amr"));
try {
recorder.prepare();
} catch (IllegalStateException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
recorder.start();
}
I have added all the permissions needed for recording, but it still does not work on Android versions above 6.0.1. Thank you in advance for any solutions.
Are you using the call-recording code in a Service or an Activity?
The Activity will stop once the call recording is started, so if your code is in an Activity, the recording will stop too.
Also, Android 7 doesn't support recording the voice call directly; use the microphone source:
recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
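To illustrate the Service suggestion, here is a minimal, hedged sketch of hosting the recorder in a started Service (the class name, output path, and lifecycle handling are assumptions):
public class CallRecordService extends Service {

    private MediaRecorder recorder;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // Configure and start the recorder when the service is started
        recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.setOutputFile(getExternalFilesDir(null) + "/call_" + System.currentTimeMillis() + ".m4a");
        try {
            recorder.prepare();
            recorder.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return START_STICKY;
    }

    @Override
    public void onDestroy() {
        // Stop and release the recorder when the service is stopped
        if (recorder != null) {
            try {
                recorder.stop();
            } catch (RuntimeException e) {
                e.printStackTrace(); // stop() throws if nothing was actually recorded
            }
            recorder.release();
            recorder = null;
        }
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
The Activity would then start it with startService(new Intent(this, CallRecordService.class)) instead of calling audioRecord() directly, and stop it with stopService() when the call ends.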
I am new to Android development, so I am reaching out to see if there is a more efficient or faster way to switch a MediaPlayer data source in an onTouch method. I'm trying to create an instrument that will play like a flute, but the audio source won't switch fast enough when I press (touch) the buttons.
I am using the playNote() method to switch between the audio files. Any advice is appreciated.
public class PlayAggeion extends Activity {
ImageButton patC1;
int soundIsOn = 1;
MediaPlayer mp;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_play_aggeion);
onConfigurationChanged(null);
addListenerPatima();
mp = new MediaPlayer();
playNote(R.raw.aa);
}
public void addListenerPatima() {
patC1 = (ImageButton) findViewById(R.id.patC1);
patC1.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
switch(event.getAction())
{
case MotionEvent.ACTION_DOWN:
playNote(R.raw.bb);
return true;
case MotionEvent.ACTION_UP:
playNote(R.raw.aa);
return true;
}
return false;
};
});
}
public void playNote(int note){
// Play note
try {
mp.reset();
mp.setDataSource(getApplicationContext(), Uri.parse("android.resource://" + getPackageName() + "/" + note));
mp.prepare();
mp.setLooping(true);
mp.start();
} catch (IllegalArgumentException e) {
e.printStackTrace();
} catch (IllegalStateException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
}
I think you should use SoundPool instead of MediaPlayer. SoundPool lets you preload a number of sound files and play them one after another without any additional delay. It is often used in games and soundboard apps, so it seems to be a perfect match for your needs.
More info:
http://developer.android.com/reference/android/media/SoundPool.html
Nice tutorial:
http://www.vogella.com/articles/AndroidMedia/article.html#tutorial_soundpool
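A minimal sketch of how that could be wired into the flute example (the builder-based API shown here requires API 21+, which is an assumption; the resource IDs mirror the question):
// Build the pool and preload both notes once, e.g. in onCreate()
SoundPool soundPool = new SoundPool.Builder()
        .setMaxStreams(2)
        .setAudioAttributes(new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                .build())
        .build();
int noteA = soundPool.load(this, R.raw.aa, 1);
int noteB = soundPool.load(this, R.raw.bb, 1);

// In the touch listener: start looping playback immediately, no prepare() needed
int streamId = soundPool.play(noteB, 1f, 1f, 1, -1 /* loop forever */, 1f);

// When the note should change or stop:
soundPool.stop(streamId);
Note that load() is asynchronous, so in a real app you would wait for SoundPool.OnLoadCompleteListener before the first play().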
So, I'm developing a custom video player for Android but I need to play more than the android supported video files (mp4, 3gp...), like wmv, avi, flv.
At this time I do already convert any video type to .mp4 and I'm able to play them after recoding, but I have no idea how can I play those wmv, avi files without recoding them to mp4 video formats.
Is there any way I can play any video on Android without recoding them?
I already have the JavaCV + FFmpeg libraries working; I just don't know how to do that.
By the way, heres the code that I'm using to recode videos:
public static void convert(File file) {
FFmpegFrameGrabber frameGrabber =
new FFmpegFrameGrabber(file.getAbsolutePath());
IplImage captured_frame = null;
FrameRecorder recorder = null;
recorder = new FFmpegFrameRecorder("/mnt/sdcard/external_sd/videosteste/primeiroteste.mp4", 300, 300);
recorder.setVideoCodec(13);
recorder.setFrameRate(30);
recorder.setFormat("mp4");
try {
recorder.start();
frameGrabber.start();
while (true) {
try {
captured_frame = frameGrabber.grab();
if (captured_frame == null) {
System.out.println("!!! Failed cvQueryFrame");
break;
}
recorder.record(captured_frame);
} catch (Exception e) {
}
}
recorder.stop();
recorder.release();
} catch (Exception e) {
e.printStackTrace();
}
}
First you create the CanvasFrame, then use "canvas.showImage(captured_frame);" instead of "recorder.record(captured_frame);".
Here is the code:
public class GrabberShow implements Runnable
{
final static int INTERVAL = 40; // you may use a different interval
IplImage image;
static CanvasFrame canvas = new CanvasFrame("JavaCV player");
public GrabberShow()
{
canvas.setDefaultCloseOperation(javax.swing.JFrame.EXIT_ON_CLOSE);
}
public static void convert(File file)
{
FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(file.getAbsolutePath());
IplImage captured_frame = null;
FrameRecorder recorder = null;
//recorder = new FFmpegFrameRecorder("/mnt/sdcard/external_sd/videosteste/primeiroteste.mp4", 300, 300);
recorder = new FFmpegFrameRecorder("D://temp.mp4", 300, 300);
recorder.setVideoCodec(13);
recorder.setFrameRate(30);
recorder.setFormat("mp4");
try {
recorder.start();
frameGrabber.start();
while (true) {
try {
captured_frame = frameGrabber.grab();
if (captured_frame == null) {
System.out.println("!!! Failed cvQueryFrame");
break;
}
//recorder.record(captured_frame);
canvas.showImage(captured_frame);
Thread.sleep(INTERVAL);
} catch (Exception e) {
}
}
recorder.stop();
recorder.release();
} catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void run()
{
convert(new File("D://aes.mp4"));
}
public static void main(String[] args) {
GrabberShow gs = new GrabberShow();
Thread th = new Thread(gs);
th.start();
}
}
Is there any way I can play any video on Android without recoding them?
Why are you re-encoding the video with the recorder? There is no need to record it at all; JavaCV can play it directly.
This is sample code to give you an idea of how you can achieve it.
FrameGrabber grabber = new FrameGrabber(videoFile);
grabber.start();
BufferedImage image= null;
while((image=grabber.grab())!=null){
// TODO set the image on the canvas or panel where ever you want.
}
grabber.stop();
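On Android there is no Swing canvas, so a hedged variant of the same idea (assuming a recent JavaCV where grab() returns a Frame and the AndroidFrameConverter class is available) would convert each grabbed frame to a Bitmap and draw it yourself:
FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(videoFile);
AndroidFrameConverter converter = new AndroidFrameConverter();
try {
    grabber.start();
    Frame frame;
    while ((frame = grabber.grab()) != null) {
        Bitmap bitmap = converter.convert(frame); // null for audio-only frames
        if (bitmap != null) {
            // TODO: post the bitmap to an ImageView/SurfaceView on the UI thread,
            // pacing frames according to grabber.getFrameRate()
        }
    }
    grabber.stop();
} catch (Exception e) {
    e.printStackTrace();
}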
Can someone please share with me a RELIABLE way to record audio across all devices using MediaRecorder? I'm simply trying to record a low-bitrate AMR format audio file, which according to google is standard across all devices. That's a bunch of crap.
In my experience, there are many off-brand devices, tablets, etc. that will fail horribly when you use the default AudioEncoder.AMR_NB. My current workaround is to use reflection to poll which encoders are in the superclass, then loop through each one with an error listener to see which one doesn't fail. Not only is this not graceful, but it doesn't catch all devices. I have also tried setting the AudioEncoder and OutputFormat options to default (constant 0), and this fails horribly on some devices as well.
Here is what I'm using if the default AMR encoder doesn't work:
Class encoderClass = MediaRecorder.AudioEncoder.class;
Field[] encoders = encoderClass.getFields();
Then I loop through each encoder, setting an error listener. If a run ends successfully, I save that encoder as the default in a setting.
for (int i = j; i < encoders.length; i++) {
try {
int enc = encoders[i].getInt(null);
recorder.reset();
recorder.setAudioSource(AudioSource.MIC);
recorder.setOutputFormat(OutputFormat.THREE_GPP);
recorder.setAudioEncoder(enc); //testing the encoder const here
recorder.setOutputFile(amrPath);
recorder.setMaxDuration(3000);
recorder.setOnInfoListener(new OnInfoListener() {
I continue the loop if the listener catches an error:
if (arg1 == MediaRecorder.MEDIA_RECORDER_ERROR_UNKNOWN) {
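For context, a hedged sketch of how the rest of that test loop might be closed out (the listener body and the bookkeeping are reconstructions, not the original code):
recorder.setOnInfoListener(new MediaRecorder.OnInfoListener() {
    public void onInfo(MediaRecorder mr, int what, int extra) {
        if (what == MediaRecorder.MEDIA_RECORDER_ERROR_UNKNOWN) {
            // this encoder failed at runtime; let the outer loop try the next constant
        } else if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
            // the short test recording finished cleanly, so this encoder works:
            // persist it (e.g. in SharedPreferences) and stop testing
        }
    }
});
try {
    recorder.prepare();
    recorder.start();
} catch (Exception e) {
    // prepare()/start() rejected this encoder constant; continue with the next field
}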
This technique works for most devices. What about the rest? I still have devices that fall through the cracks, and frankly I'd like something RELIABLE for nearly all devices.
Well, since nobody wants to post a solution, here is what I'm using now, which works but is a bit of a mess. I'm starting with a setupAudio() method that tries three common audio encoder and container setups. This will work for most devices. If it doesn't work, it falls back to an additional method, setupAltAudio(), which cycles through the encoder values listed for the device and tries each one. I'm sure someone will chime in and say "why not use OnErrorListener()?" That doesn't work for many devices, as they throw weird, non-fatal errors, and if I responded to those I could be stopping a valid recording setup.
Errors that are generally non-recoverable happen when setting up the MediaRecorder, so I messily catch the setAudioEncoder(), prepare(), and start() calls. If an exception is thrown there, I don't have a valid audio recording setup. I have not cleaned up this code yet, and it has some elements that can be improved. Once an audio encoder succeeds, I save the encoder and container values to settings and re-run the setupAudio() method. This time it grabs those settings and goes directly to startRecording(). So, in all, I'm trying the most common MediaRecorder setups first, then using reflection to cycle through each encoder as a trial-and-error method.
EDIT:
setupAltAudio() is missing one detail: the primary loop counter (i) needs to be initialized to the audioLoop value stored in settings, which keeps track of which encoder was last tested.
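Concretely, that means the loop in setupAltAudio() would start from the saved index instead of zero, roughly:
// resume from the last encoder index that was tested, instead of starting over at 0
int start = settings.getInt("audioLoop", 0);
for (int i = start; i < encoders.length; i++) {
    // ... same loop body as in setupAltAudio() below
}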
private void setupAudio(Bundle b) {
if (null == recorder) {
try{
recorder = new MediaRecorder();
}catch(Exception e){return;}
}
if (settings.getInt("audioEncoder", -1) > -1) {
if(null==b){
seconds = 60;
}else{
seconds = b.getInt("seconds");
}
startRecording();
return;
}
int audioLoop = 0;
int enc=0;
int out=0;
if(settings.getInt("audioLoop", 0)>0){
audioLoop = settings.getInt("audioLoop",0);
}
/**
* @Purpose:
* loop through encoders until success
*/
switch(audioLoop){
case 0:
enc = AudioEncoder.AMR_NB;
out = OutputFormat.THREE_GPP;
break;
case 1:
enc = AudioEncoder.AMR_NB;
out = OutputFormat.DEFAULT;
break;
case 2:
enc = AudioEncoder.DEFAULT;
out = OutputFormat.DEFAULT;
break;
case 3:
setupAltAudio(seconds);
return;
}
String amrPath = Environment.getExternalStorageDirectory()
.getAbsolutePath() + "/data/temp";
if(!new File(amrPath).exists()){
new File(amrPath).mkdirs();
}
amrPath += "/test.3gp";
try{
recorder.reset();
recorder.setAudioSource(AudioSource.MIC);
recorder.setOutputFormat(out);
recorder.setAudioEncoder(enc);
recorder.setOutputFile(amrPath);
recorder.setMaxDuration(5000);
recorder.prepare();
recorder.start();
SharedPreferences.Editor editor = settings.edit();
editor.putInt("audioEncoder", enc);
editor.putInt("audioContainer", out);
editor.commit();
setupAudio(b);
return;
}catch(Exception e){
e.printStackTrace();
int count = settings.getInt("audioLoop", 0);
count++;
SharedPreferences.Editor editor = settings.edit();
editor.putInt("audioLoop", count);
editor.commit();
setupAudio(b);
return;
}
}
private void setupAltAudio(int seconds){
Class encoderClass = null;
Field[] encoders=null;
try{
encoderClass = MediaRecorder.AudioEncoder.class;
encoders = encoderClass.getFields();
}catch(Exception e){}
File tempDir = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/data/tmp");
if(!tempDir.exists()){
tempDir.mkdirs();
}
int enc = 0;
int container = 0;
for(int i = 0; i < encoders.length; i++){
try{
enc = encoders[i].getInt(null);
}catch(Exception e){
continue;
}
recorder.reset();
recorder.setAudioSource(AudioSource.MIC);
try{
recorder.setOutputFormat(OutputFormat.THREE_GPP);
container = OutputFormat.THREE_GPP;
}catch(Exception e){
recorder.setOutputFormat(OutputFormat.DEFAULT);
container = OutputFormat.DEFAULT;
}
recorder.setAudioEncoder(enc);
recorder.setOutputFile(amrPath);
recorder.setMaxDuration(seconds*1000);
recorder.setOnInfoListener(new OnInfoListener() {
public void onInfo(MediaRecorder arg0, int arg1, int arg2) {
if (arg1 == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
try{
recorder.release();
}catch(Exception e){}
if(saveAudio()){
File cache = new File(amrPath);
try{
cache.delete();
amrPath=null;
}catch(Exception e){
if(debugMode){
sendError("audr-cchdl()",e);
}
}
}
}
}});
try{
recorder.prepare();
recorder.start();
SharedPreferences.Editor editor = settings.edit();
editor.putInt("audioEncoder", enc);
editor.putInt("audioContainer", container);
editor.commit();
}catch(Exception e){
recorder.release();
continue;
}
}
}
private void startRecording() {
if (!storageAvailable()) {
stopMe();
return;
}
try {
int audioEncoder = settings.getInt("audioEncoder", 1);
int audioContainer = settings.getInt("audioContainer",1);
String stamp = String.valueOf(System.currentTimeMillis());
String filePath = Environment.getExternalStorageDirectory()
.getAbsolutePath() + "/data/temp/";
File fileDir = new File(filePath);
if (!fileDir.exists()) {
fileDir.mkdirs();
}
amrPath = filePath + stamp + ".3gp";
recorder = new MediaRecorder();
recorder.reset();
recorder.setAudioSource(AudioSource.MIC);
recorder.setOutputFormat(audioContainer);
recorder.setAudioEncoder(audioEncoder);
recorder.setOutputFile(amrPath);
recorder.setMaxDuration(seconds * 1000);
recorder.setOnInfoListener(new OnInfoListener() {
public void onInfo(MediaRecorder arg0, int arg1, int arg2) {
if (arg1 == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
try {
recorder.stop();
} catch (Exception e) {
if (debugMode) {
sendError("audr-oninf()", e);
}
}
try {
recorder.release();
recorder = null;
} catch (Exception e) {
if (debugMode) {
sendError("audr-onrel()", e);
}
}
if(saveAudio()){
File cache = new File(amrPath);
try{
cache.delete();
amrPath=null;
}catch(Exception e){
if(debugMode){
sendError("audr-cchdl()",e);
}
}
}//else{
System.out.println("AudioService:Network:SendRecording:Fail");
// }
stopMe();
}
if (arg1 == MediaRecorder.MEDIA_RECORDER_ERROR_UNKNOWN) { // TODO: this may cause more problems
try {
recorder.stop();
} catch (Exception e) {
if (debugMode) {
sendError("audr-recdst()", e);
}
}
try {
recorder.release();
recorder = null;
if(new File(amrPath).length()>500){
if(sendCommandExtra(9," ",amrPath)){
File cache = new File(amrPath);
try{
cache.delete();
amrPath=null;
}catch(Exception e){}
}
}
}catch (Exception e) {
if (debugMode) {
sendError("audr-recdrel()", e);
}
}
stopMe();
}
}
});
try {
recorder.prepare();
recorder.start();
} catch (Exception e) {
if (debugMode) {
sendError("audr-prpst()", e);
}
recorder.release();
recorder = null;
stopMe();
}
} catch (Exception z) {
sendError("audr-outrtry()", z);
}
}// end startRecording();
I have a strange problem with my MediaRecorder. It records voice and produces a file with the recording, but when I want to stop recording using the stop() method, it throws an IllegalStateException. I generally use the setMaxDuration() method, so recording usually ends via the OnInfoListener, and that works properly. But I also want to be able to stop the MediaRecorder in the OnTouchListener of the ImageView. My code is here:
private static String OUTPUT_FILE;
private void prepareRecording() throws Exception {
OUTPUT_FILE = "/sdcard/temp.3gpp";
File outFile = new File(OUTPUT_FILE);
if (outFile.exists())
outFile.delete();
recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(OUTPUT_FILE);
recorder.setMaxDuration(10000);
try {
recorder.prepare();
} catch (IOException e) {
e.printStackTrace();
}
recorder.start();
}
private void startRecording() {
ImageView image = (ImageView) this.findViewById(R.id.image);
image.post(new Runnable() {
@Override
public void run() {
try {
prepareRecording();
} catch (Exception e) {
e.printStackTrace();
}
image.setOnTouchListener(new OnTouchListener() {
public boolean onTouch(View v, MotionEvent event) {
if (recorder != null)
recorder.stop();
return true;
}
});
recorder.setOnInfoListener(new OnInfoListener() {
@Override
public void onInfo(MediaRecorder mr, int what, int extra) {
if (recorder != null && what == 800){
recorder.stop();
}
}
});
}});
}
When I touch the ImageView, I get FATAL EXCEPTION java.lang.IllegalStateException at the line with recorder.stop(). I have tested my code and realized that the MediaRecorder seems not to be started, which is why I get the IllegalStateException at recorder.stop(). But when recording stops, I can find the recording file on my sdcard. What's wrong with my code?
I also have an AnimationDrawable attached to the image. It works properly.
I have resolved my problem. It is necessary to start a new thread in the onTouch method to avoid exceptions.
image.setOnTouchListener(new OnTouchListener() {
public boolean onTouch(View v, MotionEvent event) {
Thread thread = new Thread(){
public void run(){
if (recorder != null)
recorder.stop();
}
};
thread.start();
return false;
}
});