I am struggling with this. I have already changed the bitrate to reduce the recording file size; my app correctly posts audio files to a server, but I want to shrink the files further. This is my recording code:
private void startRecording() throws IOException {
    String state = android.os.Environment.getExternalStorageState();
    if (!state.equals(android.os.Environment.MEDIA_MOUNTED)) {
        throw new IOException("No SD mounted. It is " + state + ".");
    }
    mRecorder = new MediaRecorder();
    mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    mRecorder.setAudioSamplingRate(44100);
    mRecorder.setAudioEncodingBitRate(44100);
    mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    File path = new File(Environment.getExternalStorageDirectory().getPath());
    if (!path.exists() && !path.mkdirs()) {
        throw new IOException("The file directory is invalid.");
    }
    // Let a failure here propagate; swallowing it would leave archivo null
    // and crash on the setOutputFile() call below
    archivo = File.createTempFile("audio", ".3gp", path);
    mRecorder.setOutputFile(archivo.getAbsolutePath());
    mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
    try {
        mRecorder.prepare();
    } catch (IOException e) {
        Log.e(LOG_TAG, "prepare() failed");
    }
    mRecorder.start();
}
I am getting about 336 KB for a 1 minute recording right now; I want to decrease that to around 100-200 KB per minute without losing too much quality.
A couple of things you can try.
1) Use AMR (Adaptive Multi-Rate) for higher compression:
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
AMR Wideband and AMR Narrowband are the encodings devices use for telephone calls, so they might not have the quality you require.
2) Use mono (a single channel):
mRecorder.setAudioChannels(1);
// Sets the number of audio channels for recording. Call this method before
// prepare(). Usually it is either 1 (mono) or 2 (stereo).
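Putting both tips together: the question's 44100 bps encoding bitrate works out to 44100 × 60 / 8 ≈ 330 KB per minute, which lines up with the ~336 KB observed, so hitting 100-200 KB per minute means roughly 13-27 kbps. A minimal sketch of a recorder configured in that range (the 12200 bps and 8000 Hz values are my assumptions, matching AMR-NB's highest mode, not something from the question):

private void startLowBitrateRecording(String outputPath) throws IOException {
    // Sketch: low-bitrate mono voice recording, roughly 90 KB per minute
    mRecorder = new MediaRecorder();
    mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    mRecorder.setAudioChannels(1);             // mono: half the data of stereo
    mRecorder.setAudioSamplingRate(8000);      // AMR-NB is defined at 8 kHz
    mRecorder.setAudioEncodingBitRate(12200);  // 12.2 kbps, the highest AMR-NB mode
    mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    mRecorder.setOutputFile(outputPath);       // outputPath: wherever you already write files
    mRecorder.prepare();
    mRecorder.start();
}

If AMR-NB sounds too compressed, AAC at 24000-32000 bps (with an MPEG_4 container) lands around 180-240 KB per minute and should sound noticeably better.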
Related
I am recording communication voice in my app. I added the storage and audio-record permissions to the manifest and also request them programmatically.
My code works fine on one device (Android 6.0, Lenovo K3 Note) but not on another (Android 8.1, ONEPLUS A5010).
On the second device the output is saved as a blank file of 3.15 KB.
I am adding the code I am using; please tell me what I am doing wrong.
MediaRecorder mRecorder;
String mFileName;
Code in OnCreate
File file = new File(getFilesDir(), "engwingoLastCall.3gp");
mFileName = file.getAbsolutePath();
try {
    if (mRecorder == null) {
        mRecorder = new MediaRecorder();
        mRecorder.setAudioSource(MediaRecorder.AudioSource.VOICE_COMMUNICATION);
        mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        mRecorder.setOutputFile(mFileName);
        mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    }
} catch (Exception e) {
    Log.d(TAG, "Recorder Error: " + e.getMessage());
}
Methods
public void startRecording() {
    try {
        if (mRecorder != null) {
            mRecorder.prepare();
            mRecorder.start();
        }
    } catch (Exception e) {
        Log.d("Recorder", "prepare() failed");
    }
}
public void stopRecording() {
    if (mRecorder != null) {
        try {
            mRecorder.stop();
            mRecorder.release();
            mRecorder = null;
        } catch (IllegalStateException e) {
            e.printStackTrace();
        } catch (Exception e) {
            Log.d(TAG, e.getMessage());
        }
    }
}
Since you are not setting a profile with the setProfile() method, you may need to set the audio channels, bitrate, and sampling rate yourself. Here is an example:
mRecorder.setAudioChannels(1);
// mono is enough for voice; stereo would just double the data
mRecorder.setAudioEncodingBitRate(128000);
// you can set this to 64000 or 96000 to lower the quality and therefore shrink the file
mRecorder.setAudioSamplingRate(44100);
// AFAIK this is the default value
Hope this helps.
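For completeness, a sketch of where those calls fit in the overall sequence (the MPEG_4/AAC combination here is my assumption; with the question's AMR_NB encoder, a bitrate above the 12.2 kbps AMR-NB ceiling would, as far as I know, simply be clamped):

mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.VOICE_COMMUNICATION);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4); // assumption: AAC in an MP4 container
mRecorder.setAudioChannels(1);
mRecorder.setAudioEncodingBitRate(128000);  // meaningful for AAC; AMR_NB would clamp this
mRecorder.setAudioSamplingRate(44100);
mRecorder.setOutputFile(mFileName);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
// ...then prepare() and start() as in the question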
My code was OK. The reason for this behavior was that some other service was also using the recorder at the time, which is why the file was saved empty (3.15 KB).
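A defensive sketch of my own (not part of the answer): when another app or service holds the microphone, prepare() or start() usually fails with an exception, so wrapping them lets the app detect the condition instead of silently uploading an empty file:

try {
    mRecorder.prepare();
    mRecorder.start();
} catch (IOException | RuntimeException e) {
    // A busy microphone typically surfaces here rather than as a blank recording
    Log.w(TAG, "Could not start recording; mic may be in use", e);
    mRecorder.release();
    mRecorder = null;
}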
I am using spydroid from https://github.com/fyhertz/spydroid-ipcamera.
The requirement is to send and receive the stream on the device; from the local network we should be able to view the RTSP stream, e.g. in VLC Media Player.
The issue I am facing: when I change the resolution, e.g. to 640*480, I get a black screen even though the stream is live. The default demo supports 320*240, which works fine. I have also changed the bitrate and framerate to match the 640*480 resolution, but that didn't help.
Any help would be appreciated.
You might be using the old library that the SpyDroid demo ships with.
I had the same issue; I fixed it the following way.
Steps:
1.) Include the LibStreaming library, as it is the latest version and supports Android versions above Lollipop.
2.) Find the H263Stream class and change the following method:
From
@SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
    createCamera();
    updateCamera();
    try {
        if (mQuality.resX >= 640) {
            // Using the MediaCodec API with the buffer method for high resolutions is too slow
            mMode = MODE_MEDIARECORDER_API;
        }
        EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
        return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
    } catch (Exception e) {
        // Fallback on the old streaming method using the MediaRecorder API
        Log.e(TAG, "Resolution not supported with the MediaCodec API, we fallback on the old streaming method.");
        mMode = MODE_MEDIARECORDER_API;
        return testH264();
    }
}
To
@SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
    createCamera();
    updateCamera();
    try {
        if (mQuality.resX >= 1080) {
            // Using the MediaCodec API with the buffer method for high resolutions is too slow
            mMode = MODE_MEDIARECORDER_API;
        }
        EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
        return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
    } catch (Exception e) {
        // Fallback on the old streaming method using the MediaRecorder API
        Log.e(TAG, "Resolution not supported with the MediaCodec API, we fallback on the old streaming method.");
        mMode = MODE_MEDIARECORDER_API;
        return testH264();
    }
}
- The only difference is the resolution threshold, changed from "640" to "1080"; with the higher threshold, 640*480 stays on the MediaCodec path instead of falling back to the MediaRecorder path.
- I don't know the exact reason, but the change above worked for me.
- Get back to me if you hit any problems.
In my case the problem was in MediaRecorder:
file: VideoStream.java
method: encodeWithMediaRecorder
MediaRecorder doesn't handle a ParcelFileDescriptor correctly. I created a local file to save the stream from MediaRecorder:
mMediaRecorder.setOutputFile(this.tmpFileToStream);
// mMediaRecorder.setOutputFile(fd); // disabled
and then I run a new thread to copy bytes from tmpFileToStream to mParcelWrite:
public void run() {
    FileDescriptor fd = mParcelWrite.getFileDescriptor();
    try {
        InputStream isS = new FileInputStream(tmpFileToStream); // read from the local file
        FileOutputStream outputStream = new FileOutputStream(fd);
        while (!Thread.interrupted()) {
            int content;
            // read() returns -1 at the current end of file, but the file keeps
            // growing while MediaRecorder records, so sleep briefly and poll again
            while ((content = isS.read()) != -1) {
                outputStream.write(content);
            }
            Thread.sleep(10);
        }
    } catch (Exception e) {
        Log.e(TAG, "E.. " + e.getMessage());
    }
}
/**
 * Video encoding is done by a MediaRecorder.
 */
protected void encodeWithMediaRecorder() throws IOException, ConfNotSupportedException {
    Log.d(TAG, "Video encoded using the MediaRecorder API");
    Log.d(TAG, "Res " + mRequestedQuality.resX + " x " + mRequestedQuality.resY + " frame " + mRequestedQuality.framerate);
    // We need a local socket to forward data output by the camera to the packetizer
    createSockets();
    // Reopens the camera if needed
    destroyCamera();
    createCamera();
    // The camera must be unlocked before the MediaRecorder can use it
    unlockCamera();
    this.tmpFileToStream = this.getOutputMediaFile(MEDIA_TYPE_VIDEO);
    Log.d(TAG, "Video record to " + this.tmpFileToStream.getAbsolutePath());
    try {
        mMediaRecorder = new MediaRecorder();
        mMediaRecorder.setCamera(mCamera);
        mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        // mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        mMediaRecorder.setVideoEncoder(mVideoEncoder);
        mMediaRecorder.setPreviewDisplay(mSurfaceView.getHolder().getSurface());
        mMediaRecorder.setVideoSize(mRequestedQuality.resX, mRequestedQuality.resY);
        mMediaRecorder.setVideoFrameRate(mRequestedQuality.framerate);
        // The bandwidth actually consumed is often above what was requested
        mMediaRecorder.setVideoEncodingBitRate((int) (mRequestedQuality.bitrate * 0.8));
        // We write the output of the camera in a local socket instead of a file !
        // This one little trick makes streaming feasible quite simply: data from the camera
        // can then be manipulated at the other end of the socket
        FileDescriptor fd = null;
        if (sPipeApi == PIPE_API_PFD) {
            fd = mParcelWrite.getFileDescriptor();
        } else {
            fd = mSender.getFileDescriptor();
        }
        mMediaRecorder.setOutputFile(this.tmpFileToStream); // save locally, then read and stream that
        // mMediaRecorder.setOutputFile(fd); // disabled
        // copy bytes from the local file to mParcelWrite
        if (tInput == null) {
            this.tInput = new Thread(this);
            this.tInput.start();
        }
        mMediaRecorder.prepare();
        mMediaRecorder.start();
    } catch (Exception e) {
        StringWriter sw = new StringWriter();
        PrintWriter pw = new PrintWriter(sw);
        e.printStackTrace(pw);
        Log.d(TAG, "Error stack " + sw.toString());
        throw new ConfNotSupportedException(e.getMessage());
    }
    InputStream is = null;
    if (sPipeApi == PIPE_API_PFD) {
        is = new ParcelFileDescriptor.AutoCloseInputStream(mParcelRead);
    } else {
        is = mReceiver.getInputStream();
    }
    // This will skip the MPEG4 header; if this step fails we can't stream anything :(
    try {
        byte[] buffer = new byte[4];
        // Skip all atoms preceding the mdat atom
        while (!Thread.interrupted()) {
            while (is.read() != 'm');
            is.read(buffer, 0, 3);
            if (buffer[0] == 'd' && buffer[1] == 'a' && buffer[2] == 't') break;
        }
    } catch (IOException e) {
        Log.e(TAG, "Couldn't skip mp4 header :/");
        stop();
        throw e;
    }
    // The packetizer encapsulates the bit stream in an RTP stream and sends it over the network
    mPacketizer.setInputStream(is);
    mPacketizer.start();
    mStreaming = true;
}
That's an unusual solution, but it works for other resolutions.
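As a side note, a buffered variant of the tail-copy thread above (my own sketch, same idea with far fewer read()/write() calls; the field names are the ones from the answer):

public void run() {
    byte[] buf = new byte[8192];
    try {
        FileInputStream in = new FileInputStream(tmpFileToStream);
        FileOutputStream out = new FileOutputStream(mParcelWrite.getFileDescriptor());
        while (!Thread.interrupted()) {
            int n = in.read(buf);
            if (n > 0) {
                out.write(buf, 0, n); // forward the newly appended bytes to the pipe
            } else {
                Thread.sleep(10);     // at the current EOF; wait for MediaRecorder to append more
            }
        }
    } catch (Exception e) {
        Log.e(TAG, "copy thread stopped: " + e.getMessage());
    }
}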
I got this issue and can't get my head around it. I'm recording calls directly from the speaker, so when I get TelephonyManager.CALL_STATE_OFFHOOK I instantly start recording audio from VOICE_CALL. That part is OK; recording starts. But if the call ends and a new one starts, I get a java.lang.IllegalStateException.
I think this is because the first call is still being recorded... I've tried:
mRecorder.stop();
mRecorder.release();
mRecorder.reset();
but no luck; they all gave me an IllegalStateException. I just want to know how to stop the first call's recording and record a new one without errors.
Here's my code for recording and call handling:
// At least one call exists that is dialing, active, or on hold, and no calls are ringing or waiting.
if (state == TelephonyManager.CALL_STATE_OFFHOOK) {
    if (record_calls == 1) {
        record_enviroment();
    }
}
// When idle = no activity.
if (state == TelephonyManager.CALL_STATE_IDLE) {
    // Check for recorded audio and, if it exists, post the audio to the server
}

public void record_enviroment() {
    path = context.getFilesDir().getAbsolutePath() + "/";
    try {
        // Random number for the file name
        Random r = new Random(System.currentTimeMillis());
        i = 10000 + r.nextInt(20000);
        // Save the file locally to the app
        mFileName = path + i + "_call_" + id_asociado + ".3gp";
        mRecorder = new MediaRecorder();
        mRecorder.setAudioSource(MediaRecorder.AudioSource.VOICE_CALL);
        mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        mRecorder.setOutputFile(mFileName);
        mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        try {
            mRecorder.prepare();
        } catch (IOException e) {
            Log.e("AUDIO_RECORDER", "prepare() failed");
        }
        mRecorder.start();
    } catch (JSONException e) { // JSON-handling code is elided from this excerpt
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
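For reference, a sketch of my own (not from the question) of a guarded stop that follows MediaRecorder's documented lifecycle: stop(), then reset(), then release(). The snippet above calls reset() after release(), which leaves the recorder in an invalid state, and stop() itself can throw if it is called right after start():

private void stopCallRecording() {
    if (mRecorder == null) return;
    try {
        mRecorder.stop();      // throws if no valid audio was captured yet
    } catch (RuntimeException e) {
        Log.w("AUDIO_RECORDER", "stop() failed; discarding this recording", e);
    } finally {
        mRecorder.reset();     // reset before release, never after
        mRecorder.release();
        mRecorder = null;      // force a fresh instance for the next call
    }
}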
I am writing an app that records voice from the microphone in AMR format using MediaRecorder, and then plays the data back using MediaPlayer.
That's the goal anyway.
I am fairly confident my MediaRecorder side is working; I'm producing the data file in the right place at the right data rate. Here's how I start and stop my MediaRecorder:
public void OnStartRecord(View v) {
    System.out.println("StartRecord");
    try {
        audioFile = File.createTempFile("amrtmp", ".amr", getApplicationContext().getFilesDir());
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    System.out.println("Recording to " + audioFile.getAbsolutePath());
    mRecorder = new MediaRecorder();
    mRecorder.setAudioSource(MediaRecorder.AudioSource.VOICE_COMMUNICATION);
    mRecorder.setOutputFormat(MediaRecorder.OutputFormat.AMR_NB);
    mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    mRecorder.setAudioEncodingBitRate(4750);
    mRecorder.setAudioSamplingRate(8000);
    mRecorder.setOutputFile(audioFile.getAbsolutePath());
    try {
        mRecorder.prepare();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    mRecorder.start();
}

public void OnStopRecord(View v) {
    System.out.println("StopRecord");
    mRecorder.stop();
    mRecorder.release();
}
This works like a charm. Typical output is something like
StartRecord
Recording to /data/data/com.test.playback/files/amrtmp-235967797.amr
And when I start, then stop recording I can see that the file has been created and it has a certain amount of data in it that properly corresponds to the settings.
Side note: I hear an odd buzzing from my speaker while this runs. Any idea what that is?
When I try to play the file back, however, I have no end of trouble. I have tried the following:
public void OnPlay(View v) {
    MediaPlayer mPlayer = new MediaPlayer();
    mPlayer.setAudioStreamType(AudioManager.STREAM_VOICE_CALL);
    FileInputStream FIS = null;
    try {
        FIS = new FileInputStream(audioFile.getAbsolutePath());
        mPlayer.setDataSource(FIS.getFD());
        mPlayer.prepare();
    } catch (Exception e) {
        e.printStackTrace();
    }
    mPlayer.start();
}
This results in nothing being played at all with the following output from MediaPlayer:
start() mURI is null
I have also tried the same code, but setting mPlayer's data source differently:
mPlayer.setDataSource(audioFile.getAbsolutePath());
This fails when prepare() is called, with a java.io.IOException, status 0x1.
I have to imagine there is something else I need to do with MediaPlayer to set it up properly. Any suggestions?
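For reference, a playback sketch of my own (not from the question): keep the FileInputStream open at least until prepare() returns, close it afterwards, and release the player when playback completes. The "mURI is null" log is consistent with prepare() having failed while the swallowed exception let start() run anyway:

public void OnPlay(View v) {
    MediaPlayer mPlayer = new MediaPlayer();
    mPlayer.setAudioStreamType(AudioManager.STREAM_VOICE_CALL);
    FileInputStream fis = null;
    try {
        fis = new FileInputStream(audioFile);
        mPlayer.setDataSource(fis.getFD()); // keep the stream open across prepare()
        mPlayer.prepare();
        mPlayer.start();
        mPlayer.setOnCompletionListener(mp -> mp.release()); // free the player when done
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (fis != null) {
            try { fis.close(); } catch (IOException ignored) {} // safe once prepare() has returned
        }
    }
}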
I am trying to record audio from the microphone on the Android emulator with this code:
recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(Environment.getExternalStorageDirectory() + "/test/test.3gp");
try {
    recorder.prepare();
} catch (IOException io) {
    Log.v(LOG_TAG, "Could not prepare the audio " + io.getMessage());
}
recorder.start();
For stopping the audio, this is the code:
recorder.stop();
recorder.reset();
recorder.release();
The recording process works fine, but the resulting audio is distorted. When I record 60 seconds of audio and play it back, its duration is shown as 120 seconds. The measurement is not exact; this is just to give you an idea.
Only the AMR_NB encoder is working on my emulator. I have tried different output formats but the result is always the same.
Is it a limitation of the emulator or am I doing something wrong here?
Edit 1:
I have tried the AudioRecord class too, and the result is the same stretched-out audio.
Thanks.
I was working on the same thing and found a solution. Try using the following code:
private void startRecording() {
    this.recorder = new MediaRecorder();
    this.recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    this.recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    this.recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    MediaRecorder.getAudioSourceMax(); // return value unused in the original
    this.recorder.setOutputFile(this.getFilename());
    this.recorder.setOnErrorListener(this.errorListener);
    this.recorder.setOnInfoListener(this.infoListener);
    try {
        this.recorder.prepare();
        this.recorder.start();
    } catch (final IllegalStateException e) {
        e.printStackTrace();
    } catch (final IOException e) {
        e.printStackTrace();
    }
}
This is working perfectly. Hope it helps you :)
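The snippet references getFilename(), errorListener, and infoListener without showing them; a hypothetical sketch of what they might look like (my assumptions, not the author's code):

// Hypothetical stand-ins for the helpers the answer does not show
private String getFilename() {
    File dir = new File(getFilesDir(), "recordings"); // assumed app-private location
    if (!dir.exists()) dir.mkdirs();
    return new File(dir, System.currentTimeMillis() + ".mp4").getAbsolutePath();
}

private final MediaRecorder.OnErrorListener errorListener =
        (mr, what, extra) -> Log.e("Recorder", "recorder error: " + what + ", " + extra);

private final MediaRecorder.OnInfoListener infoListener =
        (mr, what, extra) -> Log.i("Recorder", "recorder info: " + what + ", " + extra);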