SurfaceView, SurfaceTexture and MediaPlayer can't play my video on Android

I am trying to play live streaming video in my app using SurfaceView. When I try it with Vitamio it plays well, but since it is a plain HTTP link I wanted to get rid of any 3rd-party library and use the native classes instead. I tried VideoView as I always do, then the basic SurfaceView implementation, and after that failed I tried TextureView like this:
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
    Surface surface = new Surface(surfaceTexture);

    try {
        mMediaPlayer = new MediaPlayer();
        mMediaPlayer.setDataSource(getApplicationContext(), Uri.parse(link));
        mMediaPlayer.setSurface(surface);
        mMediaPlayer.setLooping(true);
        mMediaPlayer.prepareAsync();

        // Play video when the media source is ready for playback.
        mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mediaPlayer) {
                mediaPlayer.start();
            }
        });

        mMediaPlayer.setOnErrorListener(new MediaPlayer.OnErrorListener() {
            @Override
            public boolean onError(MediaPlayer mp, int what, int extra) {
                Log.d(TAG, "Error occurred");
                return false;
            }
        });
    } catch (IllegalArgumentException e) {
        Log.d(TAG, e.getMessage());
    } catch (SecurityException e) {
        Log.d(TAG, e.getMessage());
    } catch (IllegalStateException e) {
        Log.d(TAG, e.getMessage());
    } catch (IOException e) {
        Log.d(TAG, e.getMessage());
    }
}
but no luck; every time, MediaPlayer's onError is called and in logcat I get this:
06-28 16:00:56.612 144-8044/? E/GenericSource﹕ Failed to prefill data cache!
06-28 16:00:56.614 7997-8016/? E/MediaPlayer﹕ error (1, -2147483648)
06-28 16:00:56.614 7997-7997/? E/MediaPlayer﹕ Error (1,-2147483648)
But the thing is, there is no problem with the URL: it plays fine in Vitamio and every other player that I could test with. Please help!
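For reference, the two numbers reported by onError map to MediaPlayer error constants (what = 1 is MEDIA_ERROR_UNKNOWN). A minimal sketch of logging them more descriptively; note the extra-code constants such as MEDIA_ERROR_IO only exist as named constants on API 17+:

mMediaPlayer.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // what is MediaPlayer.MEDIA_ERROR_UNKNOWN (1) or MEDIA_ERROR_SERVER_DIED (100).
        String cause = (what == MediaPlayer.MEDIA_ERROR_SERVER_DIED) ? "server died" : "unknown";
        // extra carries the detail, e.g. MEDIA_ERROR_IO (-1004), MEDIA_ERROR_MALFORMED (-1007),
        // MEDIA_ERROR_UNSUPPORTED (-1010); -2147483648 typically means an unspecified low-level failure.
        Log.e(TAG, "MediaPlayer error: " + cause + " (what=" + what + ", extra=" + extra + ")");
        return true; // handled, so onCompletion will not fire
    }
});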

I've had my own pain trying to get video to play on Android via MediaPlayer, and I have tried Vitamio as well. Most of the time, if a video didn't play properly in Android's MediaPlayer, it was because it was not in a supported format.
http://developer.android.com/guide/appendix/media-formats.html
This may not be the answer you want, but you're likely going to have to re-encode whatever you're trying to play into a supported format. Android's video playback capabilities are far weaker than those of the iPhone, and this is just something you're going to have to accept.
If instead you're willing to put in (a lot) more work, you can compile ffmpeg yourself for Android, write a JNI interface to its many components, and render video into the surface/texture view. I don't personally recommend this route, as my experience with streaming 1080p video via ffmpeg wasn't great.
Your best and easiest bet is to simply re-encode your videos.
Background: I made an app that played up to 5 videos simultaneously from a variety of vendors.
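One quick way to check whether the stream's codecs are the problem is to inspect its track formats with MediaExtractor (HTTP sources are supported on API 16+). A minimal sketch, using a helper name of my own choosing; call it off the main thread:

// Hypothetical helper: logs the MIME type of each track in the stream.
private void dumpTrackFormats(String url) {
    MediaExtractor extractor = new MediaExtractor();
    try {
        extractor.setDataSource(url); // network access, so keep this off the UI thread
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            Log.d(TAG, "Track " + i + ": " + format.getString(MediaFormat.KEY_MIME));
        }
    } catch (IOException e) {
        Log.e(TAG, "Could not read stream", e);
    } finally {
        extractor.release();
    }
}

If the reported MIME types are not in the supported-formats table linked above, re-encoding is the way to go.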

This seems to be one of two problems: either the format is incorrect, or there are permission issues with the file and MediaPlayer is unable to open it.
First, convert the video using ffmpeg. I use this command to convert to a streamable MP4:
ffmpeg -i InputVideo.mp4 -c:v libx264 -profile:v baseline -c:a libfaac -ar 44100 -ac 2 -b:a 128k -movflags faststart OutputVideo.mp4
Second, try loading the video as a file first and then passing the file descriptor to MediaPlayer. This is needed at times: I have noticed that when MediaPlayer opens a file itself, it triggers an OS-level call to load it, and if the file is in the app's private folder the OS cannot open it. We do so like this:
AssetFileDescriptor afd = context.getResources().openRawResourceFd(R.raw.prepare_artwork);
if (afd == null) {
    Log.e(TAG, "Failed to load video.");
} else {
    mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
    afd.close();
}
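Along the same lines, if the video lives in the app's private storage instead of res/raw, you can open it yourself and hand MediaPlayer the file descriptor. A minimal sketch, assuming a file called video.mp4 in the app's files directory (the name is just an example; IOException handling omitted for brevity):

File videoFile = new File(getFilesDir(), "video.mp4"); // example file name
FileInputStream fis = new FileInputStream(videoFile);
try {
    mMediaPlayer.setDataSource(fis.getFD()); // MediaPlayer reads through our descriptor
    mMediaPlayer.prepareAsync();
} finally {
    fis.close();
}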

Related

Android Java: how to disable microphone input while streaming system audio?

Basically, my program records user input via the microphone and stores it as a .pcm file in the sdcard/ directory. It will be overwritten if one already exists. The file is later sent for playback and analysis (mainly FFT and RMS computation).
I have added another function which allows the program to record system audio, so users' mp3 files can be analyzed as well. It streams the system audio and stores it as a .pcm file for later playback and analysis.
It's all functioning well. However, there's a slight issue: when the program streams audio, it also captures input from the mic, and there is noise in the playback. I do not want this, as it will affect the analysis readings. I googled for a solution and found that I can actually mute the mic. So now I want to mute the mic while the mp3 file is being streamed.
The code I have found is:
AudioManager.setMicrophoneMute(true);
I tried to implement it, but it just crashes my application. I have been trying to find a solution for the last few days but cannot seem to get anywhere.
Here is my code snippet for the part where I stream system audio and mute the microphone before streaming starts.
// Create a new AudioRecord object to record the audio data of an mp3 file
int bufferSize = AudioRecord.getMinBufferSize(frequency, channelConfiguration, audioEncoding);
audioRecord = new AudioRecord(AudioManager.STREAM_MUSIC,
        frequency, channelConfiguration,
        audioEncoding, bufferSize);

// A short array to store raw pcm data
short[] buffer = new short[bufferSize];
Log.i("decoder", "The audio record created fine ready to record");

try {
    audioManager.setMicrophoneMute(true);
} catch (Exception e) {
    e.printStackTrace();
}

audioRecord.startRecording();
isDecoding = true;
When the setMicrophoneMute(true) line is surrounded with try-catch, the program only crashes when I send the recording for playback. The errors are as follows:
"AudioFlinger could not create track, status: -12"
"Error initializing AudioTrack"
"[android.media.AudioTrack] Error code -20 when initializing AudioTrack."
When it is not surrounded with try-catch, the program just crashes the moment I click the start-streaming button.
"Decoding failed" (this is an error log from catching a throwable).
How can I mute the microphone input while streaming the system audio? Let me know if I can provide you with more codes. Thank you!
EDIT:
I have implemented my microphone mute successfully; it even returns true for isMicrophoneMute(). However, it is not actually muted, as it still records from the microphone; it's a false true.
Based on the suggested answer, I have already created a class for audio focus as below:
public class AudioFocus {

    private final Context c;

    private final AudioManager.OnAudioFocusChangeListener changeListener =
            new AudioManager.OnAudioFocusChangeListener() {
                public void onAudioFocusChange(int focusChange) {
                    // nothing to do
                }
            };

    AudioFocus(Context context) {
        c = context;
    }

    public void grabFocus() {
        final AudioManager am = (AudioManager) c.getSystemService(Context.AUDIO_SERVICE);
        final int result = am.requestAudioFocus(changeListener,
                AudioManager.STREAM_MUSIC,
                AudioManager.AUDIOFOCUS_GAIN);
        Log.d("audiofocus", "Grab audio focus: " + result);
    }

    public void releaseFocus() {
        final AudioManager am = (AudioManager) c.getSystemService(Context.AUDIO_SERVICE);
        final int result = am.abandonAudioFocus(changeListener);
        Log.d("audiofocus", "Abandon audio focus: " + result);
    }
}
I then call the method from my Decoder class to request audio focus:
int bufferSize = AudioRecord.getMinBufferSize(frequency, channelConfiguration, audioEncoding);

audioFocus.grabFocus();

audioRecord = new AudioRecord(AudioManager.STREAM_MUSIC,
        frequency, channelConfiguration,
        audioEncoding, bufferSize);

// A short array to store raw pcm data
short[] buffer = new short[bufferSize];
Log.i("decoder", "The audio record created fine ready to record");

audioRecord.startRecording();
isDecoding = true;
Log.i("decoder", "Start recording fine");
And then release the focus when stop decoding is pressed:
// Stops recording
public void stopDecoding() {
    isDecoding = false;
    Log.i("decoder", "Out of recording");

    audioRecord.stop();

    try {
        dos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }

    mp.stop();
    mp.release();

    audioFocus.releaseFocus();
}
However, this makes my application crash. Where did I go wrong?
The following snippet requests permanent audio focus on the music audio stream. You should request the audio focus immediately before you begin playback, such as when the user presses play. I think this would be the way to go rather than muting the input microphone. Check out the developer audio focus docs for more information.
AudioManager am = (AudioManager) mContext.getSystemService(Context.AUDIO_SERVICE);
...
// Request audio focus for playback
int result = am.requestAudioFocus(afChangeListener,
        // Use the music stream.
        AudioManager.STREAM_MUSIC,
        // Request permanent focus.
        AudioManager.AUDIOFOCUS_GAIN);

if (result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
    am.registerMediaButtonEventReceiver(RemoteControlReceiver);
    // Start playback.
}
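The snippet above assumes an afChangeListener field that reacts to focus changes. A minimal sketch of what that listener could look like (the pause/resume behaviour shown is just an example):

AudioManager.OnAudioFocusChangeListener afChangeListener =
        new AudioManager.OnAudioFocusChangeListener() {
            @Override
            public void onAudioFocusChange(int focusChange) {
                if (focusChange == AudioManager.AUDIOFOCUS_LOSS) {
                    // Another app took permanent focus: stop playback and abandon focus.
                } else if (focusChange == AudioManager.AUDIOFOCUS_LOSS_TRANSIENT) {
                    // Temporary loss (e.g. an incoming call): pause playback.
                } else if (focusChange == AudioManager.AUDIOFOCUS_GAIN) {
                    // Focus (re)gained: resume playback.
                }
            }
        };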

How to capture and encode audio in a video system - Android

I am trying to build an open-source video system in Android, since we have no access to the data in a closed system. In this system, we can modify the raw data captured by the camera.
I used MediaCodec and MediaMuxer to do the video encoding and muxing job, and that works. But I have no idea about the audio part. I used onPreviewFrame to get each frame and do the encoding/muxing work frame by frame. But how do I do the audio recording at the same time (I mean capturing the audio in chunks, encoding it and sending the data to the MediaMuxer)?
I've done some research. It seems that we use AudioRecord to get the raw audio data. But AudioRecord does a continuous recording job, and I don't think that can work.
Can anyone give me a hint? Thank you!
Create audioRecorder like this:
private AudioRecord getRecorderInstance() {
    AudioRecord ar = null;
    try {
        // Get an AudioRecord
        int N = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        ar = new AudioRecord(AudioSource.MIC, 8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, N * 10);
    } catch (Exception e) {
    }
    return ar; // Returns null if mic is unavailable
}
Prepare and send the data for encoding and muxing later, like this, in a separate thread:
public class MicrophoneInput implements Runnable {

    @Override
    public void run() {
        // Buffer of 800 bytes, i.e. 400 16-bit samples (50 ms of audio at 8 kHz).
        byte[] buffer200ms = new byte[8000 / 10];

        try {
            while (recording) {
                audioRecorder.read(buffer200ms, 0, buffer200ms.length);
                // Process the buffer, i.e. send it to the encoder.
                // Don't forget to set correct timestamps synchronized with the video.
            }
        } catch (Throwable x) {
            //
        }
    }
}
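To get those PCM chunks into the same output file as the video, one option is to run them through a second MediaCodec configured as an AAC encoder and feed its output to the shared MediaMuxer. A rough sketch, assuming API 16+ for the codec calls (MediaMuxer itself needs API 18) and using illustrative names: encoder, muxer, audioTrackIndex, pcmChunk and presentationTimeUs are placeholders you would wire into your own pipeline; error handling is omitted.

// Configure an AAC encoder (8 kHz mono to match the AudioRecord above).
MediaFormat audioFormat = MediaFormat.createAudioFormat("audio/mp4a-latm", 8000, 1);
audioFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
audioFormat.setInteger(MediaFormat.KEY_BIT_RATE, 64000);

MediaCodec encoder = MediaCodec.createEncoderByType("audio/mp4a-latm");
encoder.configure(audioFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();

// For each PCM chunk read from the AudioRecord:
int inIndex = encoder.dequeueInputBuffer(10000);
if (inIndex >= 0) {
    ByteBuffer in = encoder.getInputBuffers()[inIndex];
    in.clear();
    in.put(pcmChunk);
    encoder.queueInputBuffer(inIndex, 0, pcmChunk.length, presentationTimeUs, 0);
}

// Drain the encoder and hand AAC frames to the muxer.
// When dequeueOutputBuffer returns INFO_OUTPUT_FORMAT_CHANGED, call
// muxer.addTrack(encoder.getOutputFormat()) and start the muxer before writing samples.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int outIndex = encoder.dequeueOutputBuffer(info, 0);
while (outIndex >= 0) {
    ByteBuffer out = encoder.getOutputBuffers()[outIndex];
    muxer.writeSampleData(audioTrackIndex, out, info);
    encoder.releaseOutputBuffer(outIndex, false);
    outIndex = encoder.dequeueOutputBuffer(info, 0);
}

The audio timestamps need to be on the same clock as the video frames so the muxer interleaves the two tracks correctly.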

Voice recognition fails to work while the voice is being recorded

I am working on a function where, when a button is pressed, it launches voice recognition and at the same time records what the user says. Code as follows:
button_start.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View arg0, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            if (pressed == false) {
                Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
                intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
                intent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE, "voice.recognition.test");
                intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "zh-HK");
                intent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 1);
                sr.startListening(intent);
                Log.i("111111", "11111111");
                pressed = true;
            }
            recordAudio();
        }
        if (event.getAction() == MotionEvent.ACTION_UP || event.getAction() == MotionEvent.ACTION_CANCEL) {
            stopRecording();
        }
        return false;
    }
});
}
public void recordAudio() {
    isRecording = true;
    try {
        mediaRecorder = new MediaRecorder();
        mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        mediaRecorder.setOutputFile(audioFilePath);
        mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        mediaRecorder.prepare();
    } catch (Exception e) {
        e.printStackTrace();
    }
    mediaRecorder.start();
}
public void stopRecording() {
    if (isRecording) {
        mediaRecorder.stop();
        mediaRecorder.reset(); // set state to idle
        mediaRecorder.release();
        mediaRecorder = null;
        isRecording = false;
    } else {
        mediaPlayer.release();
        mediaPlayer.reset();
        mediaPlayer = null;
    }
}
class listener implements RecognitionListener {
    // standard code: onReadyForSpeech, onBeginningOfSpeech, etc.
}
Questions:
I built the app step by step; at first it did not have the recording functions, and the voice recognition worked perfectly.
After testing many times and deciding the voice recognition was OK, I started to incorporate the recording functions using MediaRecorder.
I then tested it: once button_start is pressed, the ERROR3 AUDIO message immediately appears, even before I try to speak.
When I play back the voice recording, the voice was recorded and saved properly too.
What is happening? Why can't I record at the same time as using voice recognition?
Thanks!
--EDIT-- module for Opus-Record WHILE Speech-Recognition also runs
--EDIT-- 'V1BETA1' streaming, continuous recognition with a minor change to the sample project. Alter that 'readData()' so the raw PCM in 'sData' is shared by two threads (the fileSink thread and the recognizerAPI thread from the sample project). For the sink, just hook up an encoder using a PCM stream refreshed at each 'sData' IO. Remember to close the stream and it will work. Review 'writeAudioDataToFile()' for more on the fileSink...
--EDIT-- see this thread
There is going to be a basic conflict over the HAL and the microphone buffer when you try to do:
speechRecognizer.startListening(recognizerIntent); // <-- needs mutex use of mic
and
mediaRecorder.start(); // <-- needs mutex use of mic
You can only choose one or the other of the above actions to own the audio APIs underlying the mic!
If you want to mimic the functionality of Google Keep, where you talk only once and, as output from that one input process (your speech into the mic), you get two separate types of output (STT and a fileSink of, say, an MP3), then you must split something as it exits the HAL layer from the mic.
For example:
Pick up the RAW audio as PCM 16 coming out of the mic's buffer
Split the above buffer's bytes (you can get a stream from the buffer and pipe the stream to 2 places)
STRM 1 to the API for STT, either before or after you encode it (there are STT APIs accepting both raw PCM 16 and encoded audio)
STRM 2 to an encoder, then to the fileSink for your capture of the recording
Split can operate on either the actual buffer produced by the mic or on a derivative stream of those same bytes.
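As an illustration of the split, a minimal sketch that duplicates each AudioRecord read into two sinks; recognizerStream and fileSinkStream are placeholder OutputStreams standing in for the STT feed and the encoder/file path:

byte[] chunk = new byte[bufferSize];
try {
    while (recording) {
        int read = audioRecord.read(chunk, 0, chunk.length);
        if (read > 0) {
            // The same PCM bytes go to both consumers.
            recognizerStream.write(chunk, 0, read); // feeds the STT side
            fileSinkStream.write(chunk, 0, read);   // feeds the encoder / file sink
        }
    }
} catch (IOException e) {
    Log.e(TAG, "Split failed", e);
}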
For what you are getting into, I recommend you look at getCurrentRecording() and consumeRecording() here.
STT API reference: Google "pultz speech-api". Note that there are use cases on the APIs mentioned there.

AAC format in Android

I have a problem with audio on Android.
Short Question:
I need to play and record audio in AAC format on all Android devices.
I found that it should be possible starting from API 10, and on my BLU device (2.3.5) it does work using MediaRecorder and MediaPlayer.
But on an HTC Nexus One it doesn't work.
Have you any suggestions?
Long question:
To record and play audio in AAC format I'm using the following code. It's pretty simple and stupid, but it works for testing.
String pathForAppFiles = getFilesDir().getAbsolutePath();
pathForAppFiles += "/bla.mp4";

if (audioRecorder == null) {
    File file = new File(pathForAppFiles);
    if (file.exists())
        file.delete();

    audioRecorder = new MediaRecorder();
    audioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    audioRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    audioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
    audioRecorder.setOutputFile(pathForAppFiles);
    try {
        audioRecorder.prepare();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    audioRecorder.start();
} else {
    audioRecorder.stop();
    audioRecorder.release();
    audioRecorder = null;
    new AudioUtils().playSound(pathForAppFiles);
}
new AudioUtils().playSound(pathForAppFiles); creates a MediaPlayer and plays the sound from the file.
To make it work on the Nexus I tried aac-decoder, but it doesn't play the file to the end (it plays only 6 seconds of a 10-second file), and it doesn't play the sound recorded by the code above.
I also tried to install FFmpeg, but I don't have the experience to make this library work.
So can you recommend something?
I resolved this issue by changing the audio type to MP3, because some devices (like the Kindle Fire) do not play AAC from every source.
My recommendation is:
If you want cross-platform sounds, use MP3. You can convert any sound to MP3 with the LAME encoder.
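If you do need to stay with AAC, one way to find out up front whether a given device can decode it is to query the codec list. A minimal sketch using the pre-API-21 MediaCodecList calls, so it only helps on API 16+ devices:

// Returns true if the device advertises a decoder for the given MIME type,
// e.g. "audio/mp4a-latm" for AAC or "audio/mpeg" for MP3.
private boolean hasDecoderFor(String mimeType) {
    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (info.isEncoder()) {
            continue;
        }
        for (String type : info.getSupportedTypes()) {
            if (type.equalsIgnoreCase(mimeType)) {
                return true;
            }
        }
    }
    return false;
}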

JB MediaCodec decoder issue

I am using JB's hardware MediaCodec. I am trying to encode a video, then decode it and display it using the codecs (in video/avc format).
I am using two buttons to "start" and "stop" the video rendering. The first time I render the video it is displayed correctly. When I start the video a second time, it is not displayed and the following error is thrown:
"NOT in AVI Mode"
I have copy-pasted the code snippets for the Start and Stop buttons below.
public void Stop() {
    try {
        // Stopping the decoder alone
        decoderMediaCodec.flush();
        decoderMediaCodec.stop();
        decoderMediaCodec.release();
        // Tried various combinations of flush(), stop() and release()
    } catch (Exception e) {
        e.printStackTrace();
    }
}

public void Start(Surface view) {
    try {
        decoderMediaCodec = MediaCodec.createDecoderByType(mime); // Initialize the decoder again
        MediaFormat format = MediaFormat.createVideoFormat(mime, mWidth, mHeight);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        format.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, framerate);
        decoderMediaCodec.configure(format, view, null, 0);
        decoderMediaCodec.start();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Kindly help me with the video rendering.
Note: the data received in the decoder is valid... the data was checked using the Beyond Compare tool.
I am getting -1 for outputBufferIndex
int outputBufferIndex = decoderMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
In the logs I get:
E/( 271):
E/( 271):not in avi mode
E/( 271):
E/( 271): not in avi mode
It would be good if you could share more logs when you encounter the issue. From the description of your issue, could you confirm that the Surface being passed for your 2nd Start call is a valid handle?
If you can rebuild Android, enabling log traces in MediaCodec.cpp would probably be helpful, specifically the MediaCodec::setNativeWindow method.
P.S: For a decoder, why are I-frame interval, bitrate and framerate being set?
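To act on that suggestion, a small check you could add at the top of Start() before reconfiguring the decoder; Surface.isValid() is the standard way to test whether the handle is still usable:

public void Start(Surface view) {
    if (view == null || !view.isValid()) {
        // The surface from the first run may have been released;
        // bail out (or recreate it) instead of configuring the decoder with a dead handle.
        Log.e(TAG, "Start called with an invalid Surface");
        return;
    }
    // ... existing decoder setup follows
}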
