Video stream lag and disconnect with videoview using ip camera stream - android

So I have an IP camera that outputs a video stream I can connect to via the RTSP protocol. I want to display this in my Android application, so I've set up a VideoView using the following code:
VideoView vv = (VideoView)this.findViewById(R.id.videoView);
Uri uri = Uri.parse("rtsp://username:password@192.168.0.1:554/1/stream3");
vv.setVideoURI(uri);
vv.requestFocus();
vv.start();
I'm putting this in the onCreate() of the main activity class, so when the app loads up it automatically connects and starts streaming. My experience with this is that it works - but eventually gets choppy and/or just stops randomly and doesn't seem to ever get back to running again. I have to close the app, clear it from memory and restart it to get it back - but then it loses the connection shortly after, meaning it's pretty much useless.
I also found it seemed to lag a bit when touching on-screen objects like menus or buttons, but that might just be a coincidence - I can't say for sure.
The thing is, the stream is perfect from a PC on the same network via VLC using the same URL. So what am I doing wrong, and is there a better method of handling streaming video? I ultimately wanted to combine the VideoView with some overlaid text and buttons, and potentially take screenshots when necessary. At the moment I'm lucky if I get video for a few seconds before it cuts out...
Some additional comments:
I've had some success running it for a longer period of time, so it's not always bad, which makes things difficult to diagnose. But when it stops, it stops.
Does VideoView actively try to reconnect if it has lost the connection?
Is there a way of indicating this with a progress indicator perhaps, so it doesn't look like it's doing nothing?

A bit late, but for others with the same problem: try debugging by setting listeners on your VideoView, i.e. onError, onCompletion, etc.
For example:
vv.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // Logs the error you're running into
        Log.d("VideoViewError", Integer.toString(what));
        // You can also put switch cases here to determine
        // exactly what error it is running into:
        String errorString = "Media Player Error: ";
        switch (what) {
            case MediaPlayer.MEDIA_ERROR_UNKNOWN:
                errorString += "Unspecified media player error. ";
                break;
            case MediaPlayer.MEDIA_ERROR_SERVER_DIED:
                errorString += "Media server died. ";
                break;
        }
        switch (extra) {
            case MediaPlayer.MEDIA_ERROR_IO:
                errorString += "File or network related operation error.";
                break;
            case MediaPlayer.MEDIA_ERROR_MALFORMED:
                errorString += "Bitstream is not conforming to the related coding standard or file spec.";
                break;
            case MediaPlayer.MEDIA_ERROR_UNSUPPORTED:
                errorString += "Bitstream is conforming to the related coding standard or file spec, but the media framework does not support the feature.";
                break;
            case MediaPlayer.MEDIA_ERROR_TIMED_OUT:
                errorString += "Media operation timed out.";
                break;
        }
        Log.d("VideoViewError", errorString);
        return true;
    }
});
If the stream is 'ending', you will get an onCompletion callback:
vv.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer mp) {
        Log.d("VideoViewError", "Media Player reached end of file");
    }
});
You can do likewise with setOnInfoListener, which lets you know the status of the video view during playback. (Codes are here: http://developer.android.com/reference/android/media/MediaPlayer.OnInfoListener.html)
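To address the progress-indicator part of the question, here is a minimal sketch of such an info listener (API 17+ for VideoView.setOnInfoListener). It assumes a ProgressBar named progressBar in the same layout - that name is not from the original post. MEDIA_INFO_BUFFERING_START/END fire when the stream stalls and recovers. Note that VideoView does not reconnect on its own after an error, so a retry would typically be kicked off from the onError listener above.
vv.setOnInfoListener(new MediaPlayer.OnInfoListener() {
    @Override
    public boolean onInfo(MediaPlayer mp, int what, int extra) {
        switch (what) {
            case MediaPlayer.MEDIA_INFO_BUFFERING_START:
                progressBar.setVisibility(View.VISIBLE); // stream stalled, re-buffering
                break;
            case MediaPlayer.MEDIA_INFO_BUFFERING_END:
                progressBar.setVisibility(View.GONE);    // playback resumed
                break;
        }
        return true;
    }
});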
Maybe not the answer you're looking for, but will hopefully lead you to the right one!

Related

Play music synchronous using 3 MediaPlayer Objects on Android/Eclipse

What I have:
I have implemented three MediaPlayer objects in my app.
All three are created using a thread:
@Override
protected void onResume() {
    // Threads
    mTT1 = new TrackThread(this, R.raw.audiofile1, 1, mHandler);
    mTT2 = new TrackThread(this, R.raw.audiofile2, 2, mHandler);
    mTT3 = new TrackThread(this, R.raw.audiofile3, 3, mHandler);
    // start threads
    mTT1.start();
    mTT2.start();
    mTT3.start();
    super.onResume();
}
"simplified" Code in the Thread for creating:
public class TrackThread extends Thread implements OnPreparedListener {
    ...
    ...
    ...
    public void run() {
        super.run();
        try {
            mMp.setDataSource(afd.getFileDescriptor(),
                    afd.getStartOffset(), afd.getDeclaredLength());
            mMp.prepare();
        } catch (IllegalArgumentException | IllegalStateException
                | IOException e) {
            Log.e(TAG, "Unable to play audio queue due to exception: "
                    + e.getMessage(), e);
        }
    }
As I read in several tutorials, the "prepare()" method takes a little bit of time to finish. Therefore I implemented a "waiting loop" which waits until all MPs are prepared and created.
When "prepare and create" are done, I enable the Start button and I want to start all 3 MediaPlayers SIMULTANEOUSLY.
I again use a thread for doing so:
public void onClick(View v) {
    // Button 1
    if (mBtn.getId() == v.getId()) {
        mTT1.startMusic();
        mTT2.startMusic();
        mTT3.startMusic();
    }
}
Code in the thread:
public class TrackThread extends Thread implements OnPreparedListener {
    ...
    ...
    ...
    // start
    public void startMusic() {
        if (mMp == null)
            return;
        mMp.start();
    }
Please note that the code above is not the full code, but it should be enough to define my problem.
What I want / my problem:
All MPs should play their music in sync; unfortunately, sometimes when I start the music there is a time delay between them.
The MPs must start at the exact same time, as the 3 audio files must be played simultaneously (and exactly in sync).
What I have already tried:
+) Using SoundPool: my audio files are too big (5 MB and larger) for SoundPool.
+) seekTo(msec): I wanted to seek every MP to a specific time, e.g. 0, but this did not solve the problem.
+) To reach more programmers I also asked this question on coderanch.com.
I hope somebody can help me!
Thanks in advance
The bottleneck here will certainly be preparing the MediaPlayers to play. The Android framework provides an asynchronous method to perform this loading, so with a bit of synchronization code you should be able to get these audio sources to start at roughly the same time. To avoid audible artifacts, you'll want less than 10 ms of latency between them.
Initialize an atomic counter, C, to the number of players to load.
Set a listener on each MediaPlayer using setOnPreparedListener(listener), then call prepareAsync() on all three.
Inside this listener, decrement C and check the value. If the value is greater than 0, wait on a shared object using Object.wait() (inside a synchronized block). If the value is equal to 0, call notifyAll() on that object to wake up all of the other MediaPlayer prepared-listener callback threads.
public void startMediaPlayers(List<MediaPlayer> mediaPlayers) {
    // Note: this assumes the onPrepared callbacks can arrive on different threads
    // (e.g. each MediaPlayer was created on its own Looper thread); if they all
    // arrive on the same thread, the first waiter would block the rest.
    final AtomicInteger counter = new AtomicInteger(mediaPlayers.size());
    final Object barrier = new Object();
    /* start off all media players */
    for (MediaPlayer player : mediaPlayers) {
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(final MediaPlayer mediaPlayer) {
                synchronized (barrier) {
                    if (counter.decrementAndGet() == 0) {
                        // all media players are done loading.
                        // wake up all the ones that are asleep
                        barrier.notifyAll();
                    } else {
                        while (counter.get() > 0) {
                            try {
                                // wait for everyone else to load
                                barrier.wait();
                            } catch (InterruptedException e) {
                                // ignore
                            }
                        }
                    }
                }
                mediaPlayer.start();
                // callback.success(true); // notify your own code here, if needed
            }
        });
        player.prepareAsync();
    }
}
As nobody could help me, I found a solution on my own. MediaPlayer did not fulfill my requirements, but the Android JETPlayer in combination with JETCreator did.
CAUTION: Installing Python for using JETCreator is very tricky, therefore
follow this tutorial. And be careful with the versions of Python and wxPython; not all versions support JETCreator.
I used:
Python Version 2.5.4 (python-2.5.4.msi)
wxPython 2.8 (wxPython2.8-win32-unicode-2.8.7.1-py25.exe)
For those who do not know how to implement the JETPlayer, watch this video
(at minute 5 he starts programming the JETPlayer).
Unfortunately I do not speak French, so I just followed the code, which worked for me.
Using Android JETCreator you can create your own JET files and use them as resources.
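For reference, a minimal JetPlayer usage sketch, assuming a JET file created with JETCreator has been placed in res/raw as R.raw.mysound (that resource name is made up here). All tracks inside a JET segment are rendered by a single synthesizer, which is what keeps them in sync:
JetPlayer jet = JetPlayer.getJetPlayer();
jet.clearQueue();
// hypothetical resource name; point this at your own .jet file in res/raw
AssetFileDescriptor afd = getResources().openRawResourceFd(R.raw.mysound);
jet.loadJetFile(afd);
// queue segment 0: library -1 (no DLS library), no repeat, no transpose, no mutes, user id 0
jet.queueJetSegment(0, -1, 0, 0, 0, (byte) 0);
jet.play();
// ... later, when playback is finished:
// jet.release();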
Useful links:
Demo data
Manual
Code/class

Play audio and video at the same time in Android

I'm trying to play two different files at the same time.
I have tried to find players and tried to extend the default player to achieve this, but couldn't get it to work. So please help me by letting me know the best way to play an audio file and a video at the same time.
The reason I'm using separate files is to save space: the app will be localized, and having one audio file per language instead of multiple videos saves space. That's important because Android doesn't allow an app download size above 50 MB.
Any help with this would be appreciated, and sample code would be a great help.
Thanks in advance.
You can handle this with Audio Focus. Two or more Android apps can play audio to the same output stream simultaneously. The system mixes everything together. While this is technically impressive, it can be very aggravating to a user. To avoid every music app playing at the same time, Android introduces the idea of audio focus. Only one app can hold audio focus at a time.
When your app needs to output audio, it should request audio focus. When it has focus, it can play sound. However, after you acquire audio focus you may not be able to keep it until you’re done playing. Another app can request focus, which preempts your hold on audio focus. If that happens your app should pause playing or lower its volume to let users hear the new audio source more easily.
Beginning with Android 8.0 (API level 26), when you call requestAudioFocus() you must supply an AudioFocusRequest parameter. To release audio focus, call the method abandonAudioFocusRequest() which also takes an AudioFocusRequest as its argument. The same AudioFocusRequest instance should be used when requesting and abandoning focus.
To create an AudioFocusRequest, use an AudioFocusRequest.Builder. Since a focus request must always specify the type of the request, the type is included in the constructor for the builder. Use the builder's methods to set the other fields of the request.
The following example shows how to use an AudioFocusRequest.Builder to build an AudioFocusRequest and request and abandon audio focus:
audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
playbackAttributes = new AudioAttributes.Builder()
        .setUsage(AudioAttributes.USAGE_GAME)
        .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
        .build();
focusRequest = new AudioFocusRequest.Builder(AudioManager.AUDIOFOCUS_GAIN)
        .setAudioAttributes(playbackAttributes)
        .setAcceptsDelayedFocusGain(true)
        .setOnAudioFocusChangeListener(afChangeListener, handler)
        .build();
mediaPlayer = new MediaPlayer();
final Object focusLock = new Object();
boolean playbackDelayed = false;
boolean playbackNowAuthorized = false;
// ...
int res = audioManager.requestAudioFocus(focusRequest);
synchronized (focusLock) {
    if (res == AudioManager.AUDIOFOCUS_REQUEST_FAILED) {
        playbackNowAuthorized = false;
    } else if (res == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
        playbackNowAuthorized = true;
        playbackNow();
    } else if (res == AudioManager.AUDIOFOCUS_REQUEST_DELAYED) {
        playbackDelayed = true;
        playbackNowAuthorized = false;
    }
}
// ...
@Override
public void onAudioFocusChange(int focusChange) {
    switch (focusChange) {
        case AudioManager.AUDIOFOCUS_GAIN:
            if (playbackDelayed || resumeOnFocusGain) {
                synchronized (focusLock) {
                    playbackDelayed = false;
                    resumeOnFocusGain = false;
                }
                playbackNow();
            }
            break;
        case AudioManager.AUDIOFOCUS_LOSS:
            synchronized (focusLock) {
                resumeOnFocusGain = false;
                playbackDelayed = false;
            }
            pausePlayback();
            break;
        case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT:
            synchronized (focusLock) {
                resumeOnFocusGain = true;
                playbackDelayed = false;
            }
            pausePlayback();
            break;
        case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK:
            // ... pausing or ducking depends on your app
            break;
    }
}
Hope this helps! Also you can check Android's official documentation. If this doesn't help, you can check this site and this site for more documentation.
To play audio: Audio Track reference
To play video: Media Player reference
And now you can start on the main thread by showing the video you want in a VideoView, and when it is time to play the sound, you start playing the AudioTrack. The tricky part will be to synchronize the audio with the video.
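As a rough illustration of that idea, here is a sketch that uses a second MediaPlayer for the localized audio instead of a raw AudioTrack, which is a simpler variation than the answer above suggests. The names videoView, audioPlayer and the R.raw resources are assumptions, not part of the original answer:
final VideoView videoView = (VideoView) findViewById(R.id.videoView);
// hypothetical per-language audio resource; create() returns an already-prepared player
final MediaPlayer audioPlayer = MediaPlayer.create(this, R.raw.localized_audio);

// hypothetical video-only resource packaged with the app
videoView.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.video_only));
videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.setVolume(0f, 0f);   // mute the video's own track, if it has one
        videoView.start();
        audioPlayer.start();    // start both as close together as possible
    }
});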

Widevine DRM and Android videoplayback issue

I am facing a really annoying problem with the Widevine library for Android. For some reason, when trying to stream HLS with Widevine, some devices (especially Samsung devices) give the following error when trying to play:
WV_Info_GetCodecConfig ESDS returned error 2002
After this, as it does not start playing, I stop, reset and release the media player as well as the DrmClient.
@Override
public void onCompletion(MediaPlayer mp) {
    mPlayIndicator.setEnabled(false);
    mSurfaceView.setVisibility(View.INVISIBLE);
    mTimelineSeekBar.setMax(0);
    if (PlayerEnvConfig.USE_DEBUG_LOGGING) {
        Log.d("VideoPlayer", "ClosingVideoThread");
    }
    hideLoading();
    if (mScheduleTaskExecutor != null) {
        mScheduleTaskExecutor.shutdown();
    }
    if (mp != null) {
        if (PlayerEnvConfig.USE_DEBUG_LOGGING) {
            Log.d("VideoPlayer", "Stop Playing");
        }
        if (mp.isPlaying()) {
            mp.stop(); // It's always safe to call stop()
        }
        if (PlayerEnvConfig.USE_DEBUG_LOGGING) {
            Log.d("VideoPlayer", "Reset");
        }
        mp.reset();
        if (PlayerEnvConfig.USE_DEBUG_LOGGING) {
            Log.d("VideoPlayer", "Release");
        }
        mp.release(); // release resources internal to the MediaPlayer
        mMediaPlayer = null; // remove reference to MediaPlayer to allow GC
    }
    // IMPORTANT: It is important to release the DRM client after releasing the media player, otherwise there are
    // situations where the media player is left in a bad state and does not play any more DRM protected content
    // until the device is restarted.
    // If the video is DRM protected then release the resources associated with the DRM.
    if (mIsDrmProtected) {
        mDrmManager.releaseDrmClient();
    }
    if (!mStopEventFired) {
        // Fire the event onto the bus so the subscribed views can update accordingly
        fireStopVideoPlaybackEvent();
    }
}
And the code in the DrmManager to release the DrmClient:
@SuppressLint("NewApi")
public void releaseDrmClient() {
    BusProvider.getInstance().unregister(this);
    if (mDrmManagerClient != null) {
        if (PlayerEnvConfig.USE_DEBUG_LOGGING) {
            Log.d("DRMManager", "Releasing DRM");
        }
        mDrmManagerClient.removeAllRights();
        // Starting from API 16 they included this function to release the DRM client.
        int currentApiVersion = android.os.Build.VERSION.SDK_INT;
        if (currentApiVersion >= android.os.Build.VERSION_CODES.JELLY_BEAN) {
            mDrmManagerClient.release();
        }
        // Set to null so it will be removed by the garbage collector
        mDrmManagerClient = null;
        if (PlayerEnvConfig.USE_DEBUG_LOGGING) {
            Log.d("DRMManager", "Releasing DRM Finally");
        }
    }
}
OK, well, it does not play. I can confirm that all this code is executed, as it appears in the logs. BUT here is the really big problem: after this, a process is left running in the background (I cannot find anywhere on the device which process it is) as if the video was still being played, and the following error is shown constantly in the logs:
WVSession::SetWarning: status=2014, desc=MPEG2-TS continuity counter error
I realised this because, firstly, the device gets really hot, and secondly, I used Wireshark to sniff the traffic and I can see the requests being made in the background.
This only happens when using HLS and Widevine, and only when the latter fails to play. (The rights are actually retrieved and installed correctly, but playback fails when it tries to start.)
Does anyone have any clue why this could be happening and, especially, how to avoid it?
By the way: the media player is embedded in a fragment, and this fragment is inside another fragment.
Thanks!

Best practices for audio streaming

I'm writing an application to play audio from a remote server. I've tried several ways to implement streaming audio, but none of them are good enough for me.
That's what I've tried:
Naive use of MediaPlayer
Something like:
MediaPlayer player = new MediaPlayer();
player.setDataSource(context, Uri.parse("http://whatever.com/track.mp3"));
player.prepare();
player.start();
(or prepareAsync, it doesn't matter)
But the standard MediaPlayer is quite unstable when playing remote content. It often fails or stops playback, and I can't handle this. On top of that, I want to implement media caching, but I haven't found any way to get the buffered content out of MediaPlayer to save it somewhere on the device.
Implementing custom buffering
Then I had the idea to download the media file in chunks, combine them into one local file and play that file. Downloading the whole file up front can be slow on a bad connection, so it would be fine to download a big enough initial piece, then start playback while continuing to download and append to the local file. As a bonus, we get caching functionality.
Sounds like a plan, but it didn't always work. It works perfectly on an HTC Sensation XE, but on a 4.1 tablet playback stopped after finishing the initial piece. I don't know why that is. I've asked a question about this, but received no answers.
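For reference, a bare-bones sketch of the download-and-append idea described above; the URL handling, cache file and 10 KB chunk size are illustrative only, and real code would need range/resume handling and proper error recovery:
// Read the remote file in small chunks and append them to one local cache file
// that the MediaPlayer points at.
private void downloadToCache(String urlString, File cacheFile) throws IOException {
    HttpURLConnection conn = (HttpURLConnection) new URL(urlString).openConnection();
    InputStream in = conn.getInputStream();
    OutputStream out = new FileOutputStream(cacheFile, true); // append mode
    try {
        byte[] chunk = new byte[10 * 1024]; // 10 KB chunks, purely illustrative
        int read;
        while ((read = in.read(chunk)) != -1) {
            out.write(chunk, 0, read);
            out.flush(); // make the appended data visible to the playing MediaPlayer
        }
    } finally {
        in.close();
        out.close();
        conn.disconnect();
    }
}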
Using two MediaPlayers
I've created two MediaPlayer instances and tried to make them take over from each other. The logic is as follows:
Start downloading the initial piece of media
When it is downloaded, start playback via currentMediaPlayer. The rest of the media continues downloading
When the downloaded piece is almost played (1 sec before the end), prepare secondaryMediaPlayer with the same source file (as it was appended during playback)
261 ms before the end of currentMediaPlayer – pause it, start the secondary player, set the secondary as current, and schedule preparation of the next secondary player.
The source:
private static final String FILE_NAME = "local.mp3";
private static final String URL = ...;
private static final long FILE_SIZE = 7084032;
private static final long PREPARE_NEXT_PLAYER_OFFSET = 1000;
private static final int START_NEXT_OFFSET = 261;
private static final int INIT_PERCENTAGE = 3;

private MediaPlayer mPlayer;
private MediaPlayer mSecondaryPlayer;
private Handler mHandler = new Handler();

public void startDownload() {
    mDownloader = new Mp3Downloader(FILE_NAME, URL, getExternalCacheDir());
    mDownloader.setDownloadListener(mInitDownloadListener);
    mDownloader.startDownload();
}

private Mp3Downloader.DownloadListener mInitDownloadListener = new Mp3Downloader.DownloadListener() {
    public void onDownloaded(long bytes) {
        int percentage = Math.round(bytes * 100f / FILE_SIZE);
        // Start playback when appropriate piece of media downloaded
        if (percentage >= INIT_PERCENTAGE) {
            mPlayer = new MediaPlayer();
            try {
                mPlayer.setDataSource(mDownloader.getDownloadingFile().getAbsolutePath());
                mPlayer.prepare();
                mPlayer.start();
                mHandler.postDelayed(prepareSecondaryPlayerRunnable, mPlayer.getDuration() - PREPARE_NEXT_PLAYER_OFFSET);
                mHandler.postDelayed(startNextPlayerRunnable, mPlayer.getDuration() - START_NEXT_OFFSET);
            } catch (IOException e) {
                Log.e(e);
            }
            mDownloader.setDownloadListener(null);
        }
    }
};

// Starting to prepare secondary MediaPlayer
private Runnable prepareSecondaryPlayerRunnable = new Runnable() {
    public void run() {
        mSecondaryPlayer = new MediaPlayer();
        try {
            mSecondaryPlayer.setDataSource(mDownloader.getDownloadingFile().getAbsolutePath());
            mSecondaryPlayer.prepare();
            mSecondaryPlayer.seekTo(mPlayer.getDuration() - START_NEXT_OFFSET);
        } catch (IOException e) {
            Log.e(e);
        }
    }
};

// Starting secondary MediaPlayer playback, scheduling creating next MediaPlayer
private Runnable startNextPlayerRunnable = new Runnable() {
    public void run() {
        mSecondaryPlayer.start();
        mHandler.postDelayed(prepareSecondaryPlayerRunnable, mSecondaryPlayer.getDuration() - mPlayer.getCurrentPosition() - PREPARE_NEXT_PLAYER_OFFSET);
        mHandler.postDelayed(startNextPlayerRunnable, mSecondaryPlayer.getDuration() - mPlayer.getCurrentPosition() - START_NEXT_OFFSET);
        mPlayer.pause();
        mPlayer.release();
        mPlayer = mSecondaryPlayer;
    }
};
Again – sounds like a plan, but it doesn't work perfectly. The moments of switching MediaPlayers are quite audible. Here I have the opposite situation: on the 4.1 tablet it's OK, but on the HTC Sensation there are evident lags.
I also tried different download techniques: downloading in 10 KB chunks and downloading by MP3 frames. I don't know exactly why, but it seems that with MP3 frames, seekTo and start work better. But that's just a feeling; I don't have an explanation.
StreamingMediaPlayer
I've seen this term several times while googling, and found this implementation: https://code.google.com/p/mynpr/source/browse/trunk/mynpr/src/com/webeclubbin/mynpr/StreamingMediaPlayer.java?r=18
Is this the solution everybody uses?
If yes, that's sad, because it is not working well for me either, and I don't see any fresh ideas in the implementation.
So, the question
How do you guys implement audio streaming in your applications? I don't believe I am the only person who has faced problems like this. There should be some good practices.
In my case I use FFmpeg with OpenSL ES. The disadvantage is complexity: you must be familiar with a lot of things (JNI, OpenSL ES, FFmpeg), and it's also hard to debug compared with a pure Java Android app. In your case I suggest you try the low-level media APIs. The only problem is the lack of examples, but there is a unit test which shows how you can handle audio (you need to change the InputStream reference - line 82). A rough sketch of that approach follows below.
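To make the low-level suggestion more concrete, here is a compressed sketch of my own (not taken from the referenced unit test) that decodes a remote file with MediaExtractor/MediaCodec and pushes the PCM into an AudioTrack. It uses the API 21+ getInputBuffer/getOutputBuffer calls, hard-codes stereo 16-bit output, and skips format-change and error handling, so treat it as a starting point rather than a working player:
void playWithMediaCodec(String url) throws IOException {
    // Must run off the main thread (network access + blocking decode loop).
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(url);               // http(s) URLs are accepted
    MediaFormat format = extractor.getTrackFormat(0);
    extractor.selectTrack(0);

    MediaCodec codec = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
    codec.configure(format, null, null, 0);
    codec.start();

    int sampleRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE);
    int minBuf = AudioTrack.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
            AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
            minBuf, AudioTrack.MODE_STREAM);
    track.play();

    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    boolean inputDone = false;
    boolean outputDone = false;
    while (!outputDone) {
        if (!inputDone) {
            int inIndex = codec.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                ByteBuffer inBuf = codec.getInputBuffer(inIndex);   // API 21+
                int size = extractor.readSampleData(inBuf, 0);
                if (size < 0) {
                    codec.queueInputBuffer(inIndex, 0, 0, 0,
                            MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                    inputDone = true;
                } else {
                    codec.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
                    extractor.advance();
                }
            }
        }
        int outIndex = codec.dequeueOutputBuffer(info, 10000);
        if (outIndex >= 0) {
            ByteBuffer outBuf = codec.getOutputBuffer(outIndex);    // API 21+
            byte[] pcm = new byte[info.size];
            outBuf.get(pcm);
            track.write(pcm, 0, pcm.length);    // push decoded PCM to the speaker
            codec.releaseOutputBuffer(outIndex, false);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                outputDone = true;
            }
        }
        // INFO_OUTPUT_FORMAT_CHANGED / INFO_TRY_AGAIN_LATER deliberately ignored here
    }
    codec.stop();
    codec.release();
    extractor.release();
    track.release();
}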

Android MediaPlayer is preparing too long

Hey,
I'm using MediaPlayer to play a regular ShoutCast stream. The code is straightforward with prepareAsync() and a handler to start the playback. While it works flawlessly with some streams like DI.FM or ETN.FM (http://u10.di.fm:80/di_progressive), with others (http://mp3.wpsu.org:8000/) it won't go past the prepare state. No other listeners are called either.
//Uri streamUri = Uri.parse("http://u10.di.fm:80/di_progressive"); /* works */
Uri streamUri = Uri.parse("http://mp3.wpsu.org:8000/"); /* stuck on prepare state */
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setOnPreparedListener(new OnPreparedListener() {
    public void onPrepared(MediaPlayer mp) {
        mp.start();
    }
});
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDataSource(this.getBaseContext(), streamUri);
mediaPlayer.prepareAsync();
Any feedback is appreciated!
I think that there are some compatibility problems with the server end.
This is rather strange since the emulator handles it ok in my case - just not on my Froyo Galaxy S, even though it is the same API version.
It could be a codec issue or an HTTP streaming issue, I do not know.
But all the servers that fail tend to be old ones, with "Copyright 1998 - 2004" at the bottom... Not exactly recent or up to date you would think.
One potential workaround (which I have not tried yet) would be to use the StreamProxy, which would also make your code compatible with 2.1 and possibly earlier versions too. At the cost of extra work, extra code, and without doubt extra bugs...
In case you are not aware of it, there is another player bug report for 2.2 which may be relevant too:
Basic streaming audio works in 2.1 but not in 2.2
I'm facing an issue where the MP "hangs" in the preparing state for too long (on a stream) and I'm trying to stop it using reset(). This causes the MP to hang and thus my whole app freezes. It seems like there is no way to stop the MP in the preparing state. I'm thinking of using prepare() wrapped in a thread instead of prepareAsync(); then I'd be able to kill that thread. For now I did it in the following way:
private void actionCancel() {
    try {
        mp.setDataSource(new String());
    } catch (Exception e) {
        e.printStackTrace();
        android.util.Log.d(TAG, "actionCancel(): mp.setDataSource() exception");
        mp.reset();
    }
}
and it works for me.
Additionally I have the following counter:
@Override
public void onBufferingUpdate(final MediaPlayer mp, final int percent) {
    if (!mp.isPlaying()) {
        // android.util.Log.d(TAG,"onBufferingUpdate(): onBufferingUpdateCount = "+onBufferingUpdateCount);
        if (onBufferingUpdateCount > MAX_BUFFERING_UPDATES_AT_PREPARING_STATE)
            restartMP();
        onBufferingUpdateCount++;
        return;
    }
}
I discovered that this listener always triggers in the preparing state. So if it triggers more than 10 times and the MP is still not playing, I just restart it:
private void restartMP() {
    if (mp != null) {
        if (mpState == MediaPlayerState.Preparing) {
            actionCancel();
        } else {
            mp.reset();
        }
    } else {
        mp = new MediaPlayer();
    }
    mpState = MediaPlayerState.Idle;
    onBufferingUpdateCount = 0;
    //isRequestCancelled=false;
    requestTrackInfoStartedAt = 0;
    requestPlay();
}
Note: MediaPlayerState is my custom enum which has a "Preparing" value. Also, mpState is a class field which holds the current MediaPlayerState. Before starting prepareAsync() I set mpState to MediaPlayerState.Preparing; after it completes I set it to MediaPlayerState.Started or another corresponding value.
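For what it's worth, here is a rough sketch of the prepare()-wrapped-in-a-thread idea mentioned above. Java can't safely kill a thread that is stuck inside prepare(), so this version just waits on a CountDownLatch with a timeout and lets the caller abandon and release the player if it never becomes ready. The helper name and the timeout handling are my own, not from the answer:
// Uses java.util.concurrent.CountDownLatch / TimeUnit.
// Call this from a worker thread, since it blocks for up to timeoutSec seconds.
private boolean prepareWithTimeout(final MediaPlayer player, long timeoutSec)
        throws InterruptedException {
    final CountDownLatch prepared = new CountDownLatch(1);
    new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                player.prepare();       // blocking prepare, off the caller's thread
                prepared.countDown();
            } catch (Exception e) {
                android.util.Log.e("PrepareThread", "prepare() failed", e);
            }
        }
    }).start();
    // true if prepare() finished in time; false if it is still hanging,
    // in which case the caller can release() this player and create a new one
    return prepared.await(timeoutSec, TimeUnit.SECONDS);
}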
