I'm trying to build an app with a record function. I want the user to be able to choose the recording quality; one of the options is AAC. I am using the code below, but I keep getting errors when setting the audio sampling rate. Any ideas?
mediaRecorder = new MediaRecorder();
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioSamplingRate(96000);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
And the error:
04-29 10:32:17.477: E/MediaRecorder(18750): setParameters(audio-param-sampling-rate=96000) failed: -22
04-29 10:32:17.477: E/com.test.com.AudioRecorder(18750): setParameter failed.
04-29 10:32:17.477: E/com.test.com.AudioRecorder(18750): prepare() method called on illegal state
Not every sampling rate is supported on every device, so an unsupported rate will either produce an error or be clipped internally to something suitable. According to the AudioRecord documentation, 44100 Hz is the only sample rate guaranteed to work on all Android devices. For quality, you can also set the audio encoding bit rate along with the sample rate, choosing a bit rate according to your needs. This formula may help you:
Bit rate = (sampling rate) × (bit depth) × (number of channels)
Bit depth is generally 16 on Android devices (ENCODING_PCM_16BIT is the format guaranteed to be supported).
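As a rough sanity check, the formula can be applied to the guaranteed-safe configuration (44100 Hz, 16-bit, mono); the numbers here are illustrative, not required values:

```java
public class BitRateCheck {
    // Uncompressed PCM bit rate = sampling rate x bit depth x channel count.
    static int pcmBitRate(int sampleRateHz, int bitDepth, int channels) {
        return sampleRateHz * bitDepth * channels;
    }

    public static void main(String[] args) {
        // 44100 Hz, 16-bit, mono -> 705600 bits/s of raw PCM.
        System.out.println(pcmBitRate(44100, 16, 1));
    }
}
```

The compressed bit rate you pass to setAudioEncodingBitRate will of course be much lower than the raw PCM figure; the formula just gives an upper bound to reason from.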
Related
I'm using the MediaRecorder to record the device's screen. I'd like to be able to control the frame rate of the video captured.
I've tried setting both the Capture Rate and the VideoFrameRate properties of the MediaRecorder, but they seem to have no effect:
this.mRecorder.setCaptureRate(this.fps);
this.mRecorder.setVideoFrameRate(this.fps);
As per the documentation (setCaptureRate, setVideoFrameRate), I am calling setCaptureRate and setVideoFrameRate after setting the format and video source, and before calling prepare:
this.mRecorder = new MediaRecorder();
this.mRecorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
this.mRecorder.setVideoSource(2);   // MediaRecorder.VideoSource.SURFACE
this.mRecorder.setOutputFormat(2);  // MediaRecorder.OutputFormat.MPEG_4
this.mRecorder.setOutputFile(this.mFilePath);
this.mRecorder.setVideoSize(screenWidth, screenHeight);
this.mRecorder.setVideoEncoder(2);  // MediaRecorder.VideoEncoder.H264
this.mRecorder.setAudioEncoder(2);  // MediaRecorder.AudioEncoder.AMR_WB
this.mRecorder.setVideoEncodingBitRate(this.mBitRate);
this.mRecorder.setCaptureRate(this.fps);
this.mRecorder.setVideoFrameRate(this.fps);
this.mRecorder.prepare();
I've checked, and this.fps is set to 5, so it's not some obviously invalid value... and the documentation says:
The fps can go as low as desired
Does anyone know how to set the FPS?
I am using Camera2 with MediaRecorder. For a 30-second recorded video, the reported length comes out as 8-9 hours.
When we analyzed the file, we saw that its frame rate is 0.01 FPS. This happens only on some Samsung devices, such as the J6 and J7.
I am also not able to reproduce the issue myself.
Here is my code for setting the MediaRecorder configuration:
mediaRecorder?.apply {
setAudioSource(MediaRecorder.AudioSource.MIC)
setVideoSource(MediaRecorder.VideoSource.SURFACE)
setOutputFormat(MediaRecorder.OutputFormat.MPEG_4)
setOutputFile(nextVideoAbsolutePath)
setVideoEncodingBitRate(1000000)
setVideoSize(320, 240)
setVideoFrameRate(20)
setVideoEncoder(MediaRecorder.VideoEncoder.H264)
setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
prepare()
}
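For what it's worth, the symptom is consistent with the frames themselves being fine but the track's frame-rate metadata being wrong. A hypothetical back-of-the-envelope check (the 300-frame figure is an assumed example, not taken from the question):

```java
public class DurationCheck {
    // Apparent duration in seconds if frameCount frames are played
    // back at the frame rate stamped into the container.
    static double apparentDurationSeconds(int frameCount, double playbackFps) {
        return frameCount / playbackFps;
    }

    public static void main(String[] args) {
        // If ~300 captured frames get stamped as 0.01 fps, the clip
        // plays back for 30000 s, i.e. roughly 8.3 hours.
        double seconds = apparentDurationSeconds(300, 0.01);
        System.out.println(seconds / 3600.0);
    }
}
```

That lands in the ballpark of the reported 8-9 hours, which suggests checking the container's frame-rate metadata rather than the encoder itself.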
I'm using the Android Oboe library for high-performance audio in a music game.
In the assets folder I have two .raw files (both 48000 Hz 16-bit PCM, about 60 kB each):
std_kit_sn.raw
std_kit_ht.raw
These are loaded into memory as SoundRecordings and added to a Mixer. kSampleRateHz is 48000:
stdSN= SoundRecording::loadFromAssets(mAssetManager, "std_kit_sn.raw");
stdHT= SoundRecording::loadFromAssets(mAssetManager, "std_kit_ht.raw");
mMixer.addTrack(stdSN);
mMixer.addTrack(stdFT);
// Create a builder
AudioStreamBuilder builder;
builder.setFormat(AudioFormat::I16);
builder.setChannelCount(1);
builder.setSampleRate(kSampleRateHz);
builder.setCallback(this);
builder.setPerformanceMode(PerformanceMode::LowLatency);
builder.setSharingMode(SharingMode::Exclusive);
LOGD("After creating a builder");
// Open stream
Result result = builder.openStream(&mAudioStream);
if (result != Result::OK){
LOGE("Failed to open stream. Error: %s", convertToText(result));
}
LOGD("After openstream");
// Reduce stream latency by setting the buffer size to a multiple of the burst size
mAudioStream->setBufferSizeInFrames(mAudioStream->getFramesPerBurst() * 2);
// Start the stream
result = mAudioStream->requestStart();
if (result != Result::OK){
LOGE("Failed to start stream. Error: %s", convertToText(result));
}
LOGD("After starting stream");
They are triggered at the required times with the standard code from the Google tutorials:
stdSN->setPlaying(true);
stdHT->setPlaying(true); //Nasty Sound
The audio callback is standard (as per Google tutorials):
DataCallbackResult SoundFunctions::onAudioReady(AudioStream *mAudioStream, void *audioData, int32_t numFrames) {
// Play the stream
mMixer.renderAudio(static_cast<int16_t*>(audioData), numFrames);
return DataCallbackResult::Continue;
}
The std_kit_sn.raw plays fine, but std_kit_ht.raw has a nasty distortion. Both play with low latency. Why does one play fine while the other is distorted?
I loaded your sample project and I believe the distortion you hear is caused by clipping/wraparound during mixing of sounds.
The Mixer object from the sample is a summing mixer. It just adds the values of each track together and outputs the sum.
You need to add some code to reduce the volume of each track to avoid exceeding the limits of an int16_t (although you're welcome to file a bug on the oboe project and I'll try to add this in an upcoming version). If you exceed this limit you'll get wraparound which is causing the distortion.
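To illustrate the wraparound, here is a minimal Java sketch (the Mixer itself is C++, but the int16 arithmetic is the same): summing two loud 16-bit samples overflows, while attenuating each track first keeps the sum in range.

```java
public class MixDemo {
    // Naive summing mix: the sum overflows and wraps when cast back to 16 bits.
    static short naiveMix(short a, short b) {
        return (short) (a + b);
    }

    // Attenuate each track first, then clamp to the int16 range as a safety net.
    static short safeMix(short a, short b, float gainPerTrack) {
        int sum = Math.round(a * gainPerTrack) + Math.round(b * gainPerTrack);
        return (short) Math.max(Short.MIN_VALUE, Math.min(Short.MAX_VALUE, sum));
    }

    public static void main(String[] args) {
        short loud = 30000;
        System.out.println(naiveMix(loud, loud));       // wraps to a large negative value
        System.out.println(safeMix(loud, loud, 0.5f));  // stays at 30000
    }
}
```

The wrapped negative value is exactly the kind of harsh, buzzy artifact described in the question: the waveform jumps from near the positive rail to near the negative rail.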
Additionally, your app is hardcoded to run at 22050 frames/sec. This results in sub-optimal latency on most mobile devices because the stream must be upsampled to the audio device's native frame rate. A better approach is to leave the sample rate undefined when opening the stream, which gives you the optimal frame rate for the current audio device, and then use a resampler on your source files to supply audio at that frame rate.
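A minimal linear-interpolation resampler sketch for mono 16-bit PCM, written here in Java for illustration (a production app would want a proper polyphase resampler, such as the one used in Oboe's samples):

```java
public class LinearResampler {
    // Resample mono 16-bit PCM from srcRate to dstRate by linear interpolation.
    static short[] resample(short[] src, int srcRate, int dstRate) {
        int dstLen = (int) ((long) src.length * dstRate / srcRate);
        short[] dst = new short[dstLen];
        double step = (double) srcRate / dstRate;
        for (int i = 0; i < dstLen; i++) {
            double pos = i * step;           // fractional position in the source
            int idx = (int) pos;
            double frac = pos - idx;
            int next = Math.min(idx + 1, src.length - 1);
            dst[i] = (short) Math.round(src[idx] * (1.0 - frac) + src[next] * frac);
        }
        return dst;
    }

    public static void main(String[] args) {
        short[] src = {0, 100, 200, 300};
        short[] up = resample(src, 22050, 44100); // doubles the frame count
        System.out.println(up.length);
    }
}
```

Linear interpolation introduces some high-frequency rolloff and aliasing, but it is usually good enough for short one-shot samples like drum hits.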
I compiled VLC for Android at version 1.8, and I found an official demo at:
https://bitbucket.org/edwardcw/libvlc-android-sample . It works fine with local video. I am trying to play an HTTP stream, so I changed the code as below:
// Create LibVLC
// TODO: make this more robust, and sync with audio demo
ArrayList<String> options = new ArrayList<String>();
//options.add("--subsdec-encoding <encoding>");
options.add("--aout=opensles");
options.add("--audio-time-stretch"); // time stretching
options.add("-vvv"); // verbosity
libvlc = new LibVLC(options);
libvlc.setOnHardwareAccelerationError(this);
holder.setKeepScreenOn(true);
// Create media player
mMediaPlayer = new MediaPlayer(libvlc);
mMediaPlayer.setEventListener(mPlayerListener);
// Set up video output
final IVLCVout vout = mMediaPlayer.getVLCVout();
vout.setVideoView(mSurface);
//vout.setSubtitlesView(mSurfaceSubtitles);
vout.addCallback(this);
vout.attachViews();
//Media m = new Media(libvlc, media);
Uri uri = Uri.parse(httpAddress);
Media m = new Media(libvlc, uri);
mMediaPlayer.setMedia(m);
mMediaPlayer.play();
It works fine on a Samsung running Android 4.1.2, but it crashes on a Mi 4. On start there are 2 seconds of sound without any image, then it hangs like an ANR and stays on a black screen forever.
Here is the logcat:
core video output: picture is too late to be displayed (missing 953 ms)
core vout display: Failed to change zoom
android_window vout display: change source crop/aspect
core video output: picture is too late to be displayed (missing 1156 ms)
core vout display: auto hiding mouse cursor
core audio output: playback too late (66254): up-sampling
core video output: picture is too late to be displayed (missing 1155 ms)
core video output: picture is too late to be displayed (missing 1153 ms)
[OMX.qcom.video.decoder.avc] ERROR(0x80001009)
Codec reported an error. (omx error 0x80001009, internalError -2147483648)
mediacodec decoder: Exception in MediaCodec.dequeueOutputBuffer
mediacodec decoder: dequeue_out failed
mediacodec decoder: OutThread stopped
threadid=16: thread exiting, not yet detached (count=0)
Error with hardware acceleration
more log info
Where can I find an HTTP stream demo built against VLC 1.8?
Thanks for your help.
Issues of this nature are due to the hardware decoder on board; this awesome page compiles all the decoding capabilities on Android.
The MediaCodec class first became available in Android 4.1 (API 16).
In Android 4.3 (API 18), MediaCodec was expanded to include a way to provide input through a Surface (via the createInputSurface method).
Even though Android introduced MediaCodec in a big way in 4.3, not all vendors support it well, so you need an intelligent player/decoder that can switch to software decoding.
LibVLC does this intelligently, but it needs to be managed through LibVLC options.
The AAC encoder is initialized as below:
MediaFormat outfmt = new MediaFormat();
outfmt.setString(MediaFormat.KEY_MIME, "audio/mp4a-latm");
outfmt.setInteger(MediaFormat.KEY_AAC_PROFILE, mAudioProfile);
mSampleRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE); // `format` is the source track format
outfmt.setInteger(MediaFormat.KEY_SAMPLE_RATE, mSampleRate);
mChannels = format.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
outfmt.setInteger(MediaFormat.KEY_CHANNEL_COUNT, mChannels);
outfmt.setInteger(MediaFormat.KEY_BIT_RATE, 64000);
audioEncoder.configure(outfmt, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
audioEncoder.start();
But the encoder behaves differently on the two devices.
One outputs normal presentation timestamps:
64000 128000 192000 256000 320000
The other outputs each timestamp twice, as if there were two channels:
64000 64000 128000 128000 192000 192000 256000 256000 320000 320000
And the format extracted using MediaExtractor differs between the two devices.
The normal one is:
{max-input-size=1572864, aac-profile=2,
csd-0=java.nio.ByteArrayBuffer[position=0,limit=2,capacity=2], sample-rate=16000,
durationUs=8640000, channel-count=1, mime=audio/mp4a-latm, isDMCMMExtractor=1}
The other is
{max-input-size=798, durationUs=8640000, channel-count=1, mime=audio/mp4a-latm,
csd-0=java.nio.ByteArrayBuffer[position=0,limit=2,capacity=2], sample-rate=16000}
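For context, those output values look like presentation timestamps in microseconds: an AAC frame holds 1024 PCM samples, so at the 16000 Hz sample rate shown in the extracted formats above, each frame covers exactly 64000 us, matching the 64000/128000/... sequence. (This interpretation of the numbers is an assumption, not stated in the question.)

```java
public class AacFrameDuration {
    // Duration of one AAC frame (1024 PCM samples) in microseconds.
    static long frameDurationUs(int sampleRateHz) {
        return 1024L * 1_000_000L / sampleRateHz;
    }

    public static void main(String[] args) {
        System.out.println(frameDurationUs(16000)); // 64000
    }
}
```

Under that reading, the "two channels" device is emitting each timestamp twice rather than actually producing stereo data.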
So the original audio has one channel and the encoder is configured with one channel too, but the encoder output behaves as if there were two channels.
Does the isDMCMMExtractor flag matter here?
Help!
@fadden
First off, the question is very hard to understand - both of the listed MediaFormat contents show channel-count=1, so there's very little actual explanation of the issue itself, only an explanation of other surrounding details.
However - the software AAC decoder in some android versions (4.1 if I remember correctly, possibly 4.2 as well) will decode mono AAC into stereo - not sure if some of the hardware AAC decoders do the same. You can argue whether this is a bug or just unexpected behaviour, but it's something you have to live with. In the case that the decoder returns stereo data even though the input was mono, both stereo channels will have the same (mono) content.
So basically, you have to be prepared to handle this - either pass the actual format information from the decoder (not from MediaExtractor) to whoever is using the data (e.g. reconfigure the audio output to stereo), or be prepared to mix down stereo back into mono if you really need to have the output in mono format.
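If you do need mono output, mixing interleaved 16-bit stereo back down to mono can be sketched like this (the names are illustrative):

```java
public class Downmix {
    // Average each interleaved L/R pair of 16-bit samples into one mono sample.
    static short[] stereoToMono(short[] interleaved) {
        short[] mono = new short[interleaved.length / 2];
        for (int i = 0; i < mono.length; i++) {
            int left = interleaved[2 * i];
            int right = interleaved[2 * i + 1];
            mono[i] = (short) ((left + right) / 2);
        }
        return mono;
    }

    public static void main(String[] args) {
        // A decoder that duplicated mono into stereo yields identical L/R pairs,
        // so the downmix recovers the original samples exactly.
        short[] stereo = {100, 100, -200, -200, 32000, 32000};
        short[] mono = stereoToMono(stereo);
        System.out.println(mono.length);
    }
}
```

In the duplicated-mono case described above, averaging identical channels is lossless; for genuinely different channels it is a simple (and slightly lossy) downmix.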