AAC format in Android

I have a problem with audio on android.
Short Question:
I need to play and record audio in AAC format on all Android devices.
I found that it is supported starting from API level 10, and on my BLU device (Android 2.3.5) it works using MediaRecorder and MediaPlayer.
But on an HTC Nexus One it doesn't work.
Do you have any suggestions?
Long question:
To record and play audio in AAC format I'm using the following code. It's pretty simple and crude, but it works for testing.
String pathForAppFiles = getFilesDir().getAbsolutePath();
pathForAppFiles += "/bla.mp4";

if (audioRecorder == null) {
    File file = new File(pathForAppFiles);
    if (file.exists())
        file.delete();

    audioRecorder = new MediaRecorder();
    audioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    audioRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    audioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
    audioRecorder.setOutputFile(pathForAppFiles);
    try {
        audioRecorder.prepare();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    audioRecorder.start();
} else {
    audioRecorder.stop();
    audioRecorder.release();
    audioRecorder = null;
    new AudioUtils().playSound(pathForAppFiles);
}
new AudioUtils().playSound(pathForAppFiles) creates a MediaPlayer and plays the sound from the file.
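For reference, a minimal sketch of what such a playSound() helper might look like. AudioUtils is the asker's own class, so this implementation is an assumption, not the actual code:

import java.io.IOException;
import android.media.MediaPlayer;

// Hypothetical implementation of the asker's AudioUtils.playSound();
// the real class is not shown in the question.
public class AudioUtils {
    public void playSound(String path) {
        final MediaPlayer player = new MediaPlayer();
        try {
            player.setDataSource(path);   // file written by MediaRecorder above
            player.prepare();             // synchronous prepare is fine for a local file
            player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
                @Override
                public void onCompletion(MediaPlayer mp) {
                    mp.release();         // free the player when playback finishes
                }
            });
            player.start();
        } catch (IOException e) {
            e.printStackTrace();
            player.release();
        }
    }
}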
To make it work on the Nexus I tried aac-decoder, but it doesn't play the file to the end (it plays only 6 seconds of a 10-second file), and it doesn't play sound recorded by the code above.
I also tried to install FFmpeg, but I don't have the experience to make this library work.
So can you recommend something?

I resolved this issue by changing the audio type to MP3, because some devices (like the Kindle Fire) do not play AAC from any source.
My recommendation is:
If you want cross-platform sounds, use MP3. You can convert any sound to MP3 with the LAME encoder.

Related

Get information about audio file in Android

I am new to Android and I want to load an audio file (WAV or MP3) from the file system and display audio information, such as the sampling rate.
How can I do this? Do you know of any examples?
You can approximate it by dividing the file size by the length of the audio in seconds. For instance, from a random AAC-encoded M4A in my library:
File Size: 10.3MB (87013064 bits)
Length: 5:16 (316 seconds)
Which gives: 87013064 bits / 316 seconds = 275357.8 bits/sec, or ~275kbps
Actual Bitrate: 259kbps
Since most audio files use a known set of standard bitrates, you can snap the computed value to the nearest level for display.
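A minimal sketch of that snapping step; the list of levels here is an assumption, so adjust it for the codecs you care about:

// Snap an estimated bitrate (bits/sec) to the nearest common level for display.
// The set of levels below is illustrative, not exhaustive.
static int snapBitrate(double estimatedBps) {
    int[] levels = {96000, 128000, 160000, 192000, 256000, 320000};
    int best = levels[0];
    for (int level : levels) {
        if (Math.abs(level - estimatedBps) < Math.abs(best - estimatedBps)) {
            best = level;
        }
    }
    return best;
}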
Link to original answer by Jake Basile
Or use this code to get a much more accurate result:
MediaExtractor mex = new MediaExtractor();
try {
    mex.setDataSource(path); // path to the sound file on the sdcard
} catch (IOException e) {
    e.printStackTrace();
}

MediaFormat mf = mex.getTrackFormat(0);
int bitRate = mf.getInteger(MediaFormat.KEY_BIT_RATE);
int sampleRate = mf.getInteger(MediaFormat.KEY_SAMPLE_RATE);
Link to original answer by architjn

Getting bit rate or bit depth of an audio wav file

I am using AudioTrack to play a .wav audio file. Everything works, but for now I have hard-coded the bit depth of the audio file while initializing the AudioTrack object in STATIC_MODE.
mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, mSampleRate, mChannels,
AudioFormat.ENCODING_PCM_16BIT, dataSize, AudioTrack.MODE_STATIC);
I want to get the bit depth/bitrate of the .wav file programmatically and then set the encoding on the AudioTrack object. I have tried MediaExtractor and MediaFormat, but they give me only the following information:
mediaFormat:{mime=audio/raw, durationUs=10080000, channel-count=1, channel-mask=0, sample-rate=16000}
The documentation of MediaFormat says that KEY_BIT_RATE is encoder-only. Does that mean I can only use this key while encoding raw PCM bits? If so, what other way is there to read the bitrate/bit depth programmatically? I have already checked the same file with the mediainfo binary on the terminal, and it reports the correct bit depth.
You could always look at bytes 34 and 35 of the WAV file's header, which hold the bits-per-sample field. See this resource.
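A minimal sketch of that header read, assuming a canonical 44-byte WAV header; files with extra chunks before the "fmt " chunk would need real chunk parsing:

import java.io.IOException;
import java.io.RandomAccessFile;

// Read the bits-per-sample field from a canonical WAV header.
// Bytes 34-35 hold the value as a little-endian 16-bit integer.
static int readWavBitDepth(String path) throws IOException {
    RandomAccessFile raf = new RandomAccessFile(path, "r");
    try {
        raf.seek(34);
        int lo = raf.read();   // low byte first (little-endian)
        int hi = raf.read();
        return (hi << 8) | lo;
    } finally {
        raf.close();
    }
}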
MediaExtractor mediaExtractor = new MediaExtractor();
try {
    mediaExtractor.setDataSource(path);
    return mediaExtractor.getTrackFormat(0).getInteger("bit-per-sample");
} catch (Exception e) {
    e.printStackTrace();
}
int currentApiVersion = android.os.Build.VERSION.SDK_INT;
int bitDepth;
if (currentApiVersion >= android.os.Build.VERSION_CODES.N) {
    bitDepth = format.getInteger("pcm-encoding");
} else {
    bitDepth = format.getInteger("bit-width");
}
On Android 7.0 and above the format looks like:
{mime: string(audio/raw), channel-count: int32(2), sample-rate: int32(48000), pcm-encoding: int32(2)}
and below Android 7.0 like:
{mime: string(audio/raw), channel-count: int32(2), sample-rate: int32(48000), bit-width: int32(16), what: int32(1869968451)}
https://developer.android.com/reference/android/media/MediaFormat.html#KEY_PCM_ENCODING
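Note that on API 24+ the pcm-encoding value is an AudioFormat encoding constant, not a literal bit depth. A small sketch of the mapping; only the common PCM encodings are handled here:

import android.media.AudioFormat;

// Map a MediaFormat "pcm-encoding" value (an AudioFormat constant)
// to a bit depth. pcm-encoding: int32(2) == ENCODING_PCM_16BIT.
static int encodingToBitDepth(int pcmEncoding) {
    switch (pcmEncoding) {
        case AudioFormat.ENCODING_PCM_8BIT:  return 8;
        case AudioFormat.ENCODING_PCM_16BIT: return 16;
        case AudioFormat.ENCODING_PCM_FLOAT: return 32;
        default:
            throw new IllegalArgumentException("Unhandled encoding: " + pcmEncoding);
    }
}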

SurfaceView, SurfaceTexture and MediaPlayer can't play my video in Android

I am trying to play live streaming video in my app using SurfaceView. When I try it with Vitamio it plays well, but since it is just an HTTP link I wanted to get rid of any 3rd-party library and use the native classes. I tried VideoView as I always do, then a basic SurfaceView implementation, and after that failed I tried TextureView like this:
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
    Surface surface = new Surface(surfaceTexture);
    try {
        mMediaPlayer = new MediaPlayer();
        mMediaPlayer.setDataSource(getApplicationContext(), Uri.parse(link));
        mMediaPlayer.setSurface(surface);
        mMediaPlayer.setLooping(true);
        mMediaPlayer.prepareAsync();
        // Play video when the media source is ready for playback.
        mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mediaPlayer) {
                mediaPlayer.start();
            }
        });
        mMediaPlayer.setOnErrorListener(new MediaPlayer.OnErrorListener() {
            @Override
            public boolean onError(MediaPlayer mp, int what, int extra) {
                Log.d(TAG, "Error occurred");
                return false;
            }
        });
    } catch (IllegalArgumentException e) {
        Log.d(TAG, e.getMessage());
    } catch (SecurityException e) {
        Log.d(TAG, e.getMessage());
    } catch (IllegalStateException e) {
        Log.d(TAG, e.getMessage());
    } catch (IOException e) {
        Log.d(TAG, e.getMessage());
    }
}
But no luck; every time MediaPlayer's onError is called, and in logcat I get this:
06-28 16:00:56.612 144-8044/? E/GenericSource﹕ Failed to prefill data cache!
06-28 16:00:56.614 7997-8016/? E/MediaPlayer﹕ error (1, -2147483648)
06-28 16:00:56.614 7997-7997/? E/MediaPlayer﹕ Error (1,-2147483648)
But the thing is, there is no problem with the URL: it plays fine in Vitamio and every other player I could test on. Please help!
I've had my own pain trying to get video to play on Android via MediaPlayer, and I have also tried Vitamio. Most of the time, if a video didn't play properly on Android's MediaPlayer, it was because it was not in a supported format.
http://developer.android.com/guide/appendix/media-formats.html
This may not be the answer you want, but you're likely going to have to re-encode whatever you're trying to play into a supported format. Android's video-playing capabilities are far weaker than the iPhone's, and this is just something you're going to have to accept.
If instead you're willing to put in (a lot) more work, you can compile ffmpeg yourself for Android, write a JNI interface to its many components, and render videos into the surface/texture view. I don't personally recommend this route, as my experience streaming 1080p video via ffmpeg wasn't great.
Your best and easiest bet is to simply re-encode your videos.
Background: I made an app that played up to 5 videos simultaneously from a variety of vendors.
It seems to be one of two problems: either the format is incorrect, or there are permission issues with the file and the player is unable to open it.
First, convert the video using ffmpeg. I use this command to convert to a streamable mp4:
ffmpeg -i InputVideo.mp4 -c:v libx264 -profile:v baseline -c:a libfaac -ar 44100 -ac 2 -b:a 128k -movflags faststart OutputVideo.mp4
Second, try loading the video as a file first and then pass the file descriptor to the media player. This is needed at times because, as I have noticed, opening the file directly through MediaPlayer triggers an OS-level call to load it, and while the file is in the app's private folder it cannot be opened by the OS. We do so like this:
AssetFileDescriptor afd = context.getResources().openRawResourceFd(R.raw.prepare_artwork);
if (afd == null) {
    Log.e(TAG, "Failed to load video.");
} else {
    mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
    afd.close();
}

Voice recognition fails to work when the voice is being recorded

I am working on a function so that when a button is pressed, it launches voice recognition and at the same time records what the user says. Code as follows:
button_start.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View arg0, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) { // assumed: the brace balance of the original implies this check
            if (pressed == false) {
                Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
                intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
                intent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE, "voice.recognition.test");
                intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "zh-HK");
                intent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 1);
                sr.startListening(intent);
                Log.i("111111", "11111111");
                pressed = true;
            }
            recordAudio();
        }
        if (event.getAction() == MotionEvent.ACTION_UP || event.getAction() == MotionEvent.ACTION_CANCEL) {
            stopRecording();
        }
        return false;
    }
});
public void recordAudio() {
    isRecording = true;
    try {
        mediaRecorder = new MediaRecorder();
        mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        mediaRecorder.setOutputFile(audioFilePath);
        mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        mediaRecorder.prepare();
    } catch (Exception e) {
        e.printStackTrace();
    }
    mediaRecorder.start();
}
public void stopRecording() {
    if (isRecording) {
        mediaRecorder.stop();
        mediaRecorder.reset();   // set state to idle
        mediaRecorder.release();
        mediaRecorder = null;
        isRecording = false;
    } else {
        mediaPlayer.reset();     // reset before release; the reverse order is invalid
        mediaPlayer.release();
        mediaPlayer = null;
    }
}
class listener implements RecognitionListener {
    // standard code: onReadyForSpeech, onBeginningOfSpeech, etc.
}
Questions:
I built the app step by step; at first it did not have recording functions, and the voice recognition worked perfectly.
After testing many times and deciding the voice recognition was OK, I started to incorporate the recording functions using MediaRecorder.
When I then tested, an ERROR3 AUDIO message appeared immediately after button_start was pressed, even before I tried to speak.
I played back the voice recording; the voice is recorded and saved properly.
What is happening? Why can't I record at the same time as using voice recognition?
Thanks!
--EDIT-- module for Opus-Record WHILE speech recognition also runs
--EDIT-- 'V1BETA1' streaming, continuous recognition with a minor change to the sample project. Alter that 'readData()' so the raw PCM in 'sData' is shared by 2 threads (the fileSink thread and the recognizer API thread from the sample project). For the sink, just hook up an encoder using a PCM stream refreshed at each 'sData' IO. Remember to close the stream and it will work. Review 'writeAudioDataToFile()' for more on the fileSink.
--EDIT-- see this thread
There is going to be a basic conflict over the HAL and the microphone buffer when you try to do:
speechRecognizer.startListening(recognizerIntent); // <-- needs mutex use of mic
and
mediaRecorder.start(); // <-- needs mutex use of mic
You can only choose one or the other of the above actions to own the audio APIs underlying the mic!
If you want to mimic the functionality of Google Keep, where you talk only once and from that one input (your speech into the mic) you get two separate types of output (STT and a fileSink of, say, an MP3), then you must split something as it exits the HAL layer from the mic.
For example:
Pick up the raw audio as PCM-16 coming out of the mic's buffer
Split the above buffer's bytes (you can get a stream from the buffer and pipe the stream to 2 places)
Stream 1 goes to the API for STT, either before or after you encode it (there are STT APIs accepting both raw PCM-16 and encoded audio)
Stream 2 goes to an encoder, then to the fileSink for your capture of the recording
Split can operate on either the actual buffer produced by the mic or on a derivative stream of those same bytes.
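A minimal sketch of that split, assuming one AudioRecord as the single owner of the mic; feedRecognizer() is a placeholder for whatever STT API you use, not a real method:

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.io.FileOutputStream;
import java.io.IOException;

// One reader on the mic, two consumers of the same PCM-16 bytes.
void captureAndSplit(String pcmPath) throws IOException {
    int sampleRate = 16000;
    int minBuf = AudioRecord.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
            sampleRate, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT, minBuf * 4);

    byte[] buffer = new byte[minBuf];
    FileOutputStream fileSink = new FileOutputStream(pcmPath); // raw PCM capture

    recorder.startRecording();
    while (isRecording) {                  // your own flag
        int n = recorder.read(buffer, 0, buffer.length);
        if (n > 0) {
            feedRecognizer(buffer, n);     // stream 1: STT API (placeholder)
            fileSink.write(buffer, 0, n);  // stream 2: file sink / encoder input
        }
    }
    recorder.stop();
    recorder.release();
    fileSink.close();
}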
For what you are getting into, I recommend you look at getCurrentRecording() and consumeRecording() here.
STT API reference: Google "pultz speech-api". Note that there are use cases for the APIs mentioned there.

JB Media codec decoder issue

I am using Jelly Bean's hardware MediaCodec. I am trying to encode a video, then decode it and display it using the codecs (in video/avc format).
I am using two buttons to "start" and "stop" the video rendering. The first time I render the video it is displayed correctly. When I start the video a second time, it is not displayed and the following error is thrown:
"NOT in AVI Mode"
I copy-paste the code snippets for the Start and Stop buttons.
public void Stop() {
    try {
        // stopping the decoder alone
        decoderMediaCodec.flush();
        decoderMediaCodec.stop();
        decoderMediaCodec.release();
        // Tried various combinations of flush(), stop() and release()
    } catch (Exception e) {
        e.printStackTrace();
    }
}
public void Start(Surface view) {
    try {
        decoderMediaCodec = MediaCodec.createDecoderByType(mime); // initialize the decoder again
        MediaFormat format = MediaFormat.createVideoFormat(mime, mWidth, mHeight);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        format.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, framerate);
        decoderMediaCodec.configure(format, view, null, 0);
        decoderMediaCodec.start();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Kindly help me with the video rendering.
Note: the data received in the decoder is valid; it was checked using the Beyond Compare tool.
I am getting -1 for outputBufferIndex:
int outputBufferIndex = decoderMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
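For what it's worth, -1 from dequeueOutputBuffer() is MediaCodec.INFO_TRY_AGAIN_LATER, i.e. no output is ready within the timeout (here 0). A typical handling loop looks something like this; a sketch reusing decoderMediaCodec and bufferInfo from the question, not the asker's actual code:

// Typical handling of dequeueOutputBuffer return values.
int outputBufferIndex = decoderMediaCodec.dequeueOutputBuffer(bufferInfo, 10000); // 10 ms timeout
if (outputBufferIndex >= 0) {
    // A filled output buffer; render it to the Surface.
    decoderMediaCodec.releaseOutputBuffer(outputBufferIndex, true);
} else if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
    // No output available yet; feed more input and retry.
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    MediaFormat newFormat = decoderMediaCodec.getOutputFormat();
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
    // Only relevant before API 21: re-fetch the output buffer array.
}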
In the logs I get:
E/( 271):
E/( 271):not in avi mode
E/( 271):
E/( 271): not in avi mode
It would help if you could share more logs from when you encounter the issue. From the description, could you confirm that the Surface being passed in your 2nd Start call is a valid handle?
If you can rebuild Android, enabling log traces in MediaCodec.cpp, specifically the MediaCodec::setNativeWindow method, would probably be helpful.
P.S.: For a decoder, why are the I-frame interval, bitrate and framerate being set?
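For comparison, a decoder configure typically needs only the mime type, the dimensions, and the codec-specific data (SPS/PPS for video/avc), not encoder parameters. A sketch, where the sps and pps byte arrays are assumed to be captured from the encoder's output:

import java.nio.ByteBuffer;

// Minimal decoder configuration for video/avc (a sketch; sps/pps byte
// arrays are assumed to come from the encoder side).
MediaFormat format = MediaFormat.createVideoFormat("video/avc", mWidth, mHeight);
format.setByteBuffer("csd-0", ByteBuffer.wrap(sps)); // SPS NAL unit
format.setByteBuffer("csd-1", ByteBuffer.wrap(pps)); // PPS NAL unit
decoderMediaCodec = MediaCodec.createDecoderByType("video/avc");
decoderMediaCodec.configure(format, view, null, 0);  // flags = 0: decoder, not encoder
decoderMediaCodec.start();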
