Android ExoPlayer changing selected track

I am using Android ExoPlayer to stream content from the internet over HTTP (HLS). When I start the player everything works fine, but when I try to change the quality of the playing content, for example with
player.setSelectedTrack(TYPE_VIDEO, 1)
I get an HTTP 403 Forbidden error. If I re-initialize the player and play the content again, it works fine. I am using the Demo project as a base.
Do you know what might be causing this behavior, and what is the difference between playing the initial stream and changing the track?
Is there a way to reset the stream without re-initializing the whole player, given that the URL is passed to the builder when the player is initialized?

I found the problem. The URL I was passing to ExoPlayer was returned from an external API and was rather long, so I hadn't noticed that it was actually two URLs concatenated with "|" (http://my_url_1.m3u8|http://my_url_2.m3u8). The strange thing is that if you pass this string to ExoPlayer it plays the stream without error, but if you try to change the quality of the playing stream you run into problems.
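For anyone hitting the same thing, a quick defensive check before handing the URL to the player avoids the problem. This is just a hypothetical guard for the "|"-separated format my particular API returned:

String rawUrl = apiResponse; // hypothetical variable, e.g. "http://my_url_1.m3u8|http://my_url_2.m3u8"
// Keep only the first URL if the API returned two joined by "|".
String streamUrl = rawUrl.contains("|") ? rawUrl.substring(0, rawUrl.indexOf('|')) : rawUrl;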

You do not change the quality of the stream using setSelectedTrack(). That method is for selecting what to play from the available tracks (for example which audio language or subtitle track; for video it is quite rare, but you could, say, switch between different camera angles of a sporting event).
Each of these tracks can have multiple quality levels, and ExoPlayer's FormatEvaluator selects which quality to download based on the network conditions.
If the decoder is different when you select a new track, then the player needs to be reinitialized to continue playback.
I suggest downloading the HLS manifest (.m3u8) manually, checking the information it lists, and trying the URLs one by one in a browser. All of them should work; you should not get a 403 in this test either.
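If what you actually want is to switch between tracks (say, audio languages), here is a minimal sketch, assuming the demo project's DemoPlayer wrapper from ExoPlayer 1.x (TYPE_AUDIO, getTrackCount(), getTrackFormat() and setSelectedTrack() all come from that wrapper, so exact names may differ in your code):

// List the available audio tracks and, if there is more than one, switch to the second.
int audioTracks = player.getTrackCount(DemoPlayer.TYPE_AUDIO);
for (int i = 0; i < audioTracks; i++) {
    Log.d("Tracks", "Audio track " + i + ": " + player.getTrackFormat(DemoPlayer.TYPE_AUDIO, i));
}
if (audioTracks > 1) {
    player.setSelectedTrack(DemoPlayer.TYPE_AUDIO, 1);
}

Quality switching within the selected track is left to ExoPlayer's adaptive logic, as described above.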

Related

ExoPlayer playing currently recording media files

Let me rephrase my question, I wrote it in a hurry.
Current situation:
I have set up a digital video recorder to record broadcasts delivered via DVB-C. It runs on a Raspberry Pi 3B using TVHeadend and jetty/cling to provide UPnP and other ways to access the media files. For watching recordings, I wrote an Android player app using IJKPlayer, which runs on smartphones, Fire TV and Android TV.
One hassle when playing media files that are currently being recorded is that IJKPlayer does not support timeshifting. That means when I start playing a file that is still being recorded, I can only watch the length known to the player at that moment; anything recorded afterwards cannot be played, and I need to exit the player activity and start it again. I have resolved that issue by "simulating" a completed recording using a custom servlet implementation. Since the complete length of the recording is already known, I can use ffmpeg to accomplish this.
Future situation:
I plan to move away from IJKPlayer to ExoPlayer, because it supports hardware decoding and is much faster when playing H.264 media. I can of course use the same solution as above, but as far as I have found out so far, ExoPlayer can support media files that are still being recorded by using the Timeline class. However, I can find neither useful documentation nor a good example. Hence, I would appreciate any help with the Timeline object.
Regards
Harry
It looks like my approach won't work; at least, I didn't find a solution. The problem is that the server returns the stream size as it is at player start time, and I didn't find a way to update the media duration for "regular" files.
However, I can solve the problem by changing the server side. Instead of serving a regular file, I convert it to an m3u8 playlist in real time using ffmpeg. I then hand the m3u8 URI to the player, and it updates the duration of the stream while playing, without any additional code on the client side.
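For completeness, the client side needs nothing special. Here is a minimal sketch, assuming an ExoPlayer 2.x era API (exact factory and constructor signatures vary between releases, and the playlist URL is only a placeholder for my TVHeadend/ffmpeg setup):

// Play the live-updating HLS playlist; ExoPlayer keeps refreshing it, and the
// Timeline (and reported duration) grows as ffmpeg appends new segments.
SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(context, new DefaultTrackSelector());
DataSource.Factory dataSourceFactory =
        new DefaultDataSourceFactory(context, Util.getUserAgent(context, "RecorderPlayer"));
MediaSource hlsSource = new HlsMediaSource.Factory(dataSourceFactory)
        .createMediaSource(Uri.parse("http://recorder.local/recording.m3u8")); // placeholder URL
player.prepare(hlsSource);
player.setPlayWhenReady(true);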

buffer and play videos faster in VideoView

Using Picasso I was able to download and display my images very quickly in my Android app. Now I want to stream my videos from my S3 server and play them in my app faster than with the code I have here:
try {
    MediaController videoController = new MediaController(VideoPlayerActivity.this); // Create a media controller for this activity.
    videoController.setAnchorView(AdVideoView); // Anchor the controller to the video view.
    Uri video = Uri.parse(VideoURL); // Parse the URL of the video into a Uri.
    AdVideoView.setMediaController(videoController); // Attach the media controller to the video view.
    AdVideoView.setVideoURI(video); // Tell the video view to play from the Uri.
} catch (Exception e) {
    Log.e("Video Stream Error", "Unable to start video playback", e); // Log the error together with its stack trace.
}
Is there a faster way to display videos, either through a library on GitHub or with better code?
Thanks in advance!
The things that usually slow down streamed video playback are server and network related rather than client side: unless you have a very slow or very busy device, it should be able to play the video back at the rate it is received over the network.
With that in mind, and assuming you are seeing delays in your streamed videos, there are a couple of common things to look for.
First, MP4 videos in their normal format have the metadata at the end of the file, which is not good for streaming. There is a technique called 'fast start', which moves the metadata (the moov atom) to the start of the file, and you definitely want to use it. More info here:
http://multimedia.cx/eggs/improving-qt-faststart/
Secondly, network conditions obviously vary, and slow networks make streaming high-quality video a problem. A technique called adaptive bit rate streaming (ABR) allows the client to request lower-quality video 'chunks' when network quality is bad, and then switch to higher quality when it improves.
ABR also helps startup time: it lets the video start quickly at a lower quality level, and hence with smaller chunks, and then increase the quality as playback progresses. You can see this effect when you start up most online video services, such as Netflix, today (July 2016).
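For illustration, a hypothetical HLS master playlist offering three quality levels that the client can switch between as network conditions change might look like this:

#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/index.m3u8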
One thing to note is that video hosting and streaming is a specialist area, so it is generally easier to leverage existing streaming technologies and services rather than building them yourself. Some good places to look to get a feel for open source solutions:
https://gstreamer.freedesktop.org
http://www.videolan.org/vlc/streaming.html

Android RTSP stream from IP camera without delay

I need to stream RTSP video from an IP camera on the local network to my Android app. It is very easy to use a VideoView and play the URL, or a SurfaceView and play the stream on it with the native MediaPlayer. But when I stream that way I get about a 6-second delay while my phone buffers the video. As far as I have read, there is no way to change the buffer size of MediaPlayer. But I have seen several apps that stream video from my camera in almost real time. I have read a lot about this, since I am not the first one to encounter the problem, but didn't find any useful information.
Many thanks for any help!
I'm using vlc-android; it works well for playing my cameras' RTSP links:
https://github.com/mrmaffen/vlc-android-sdk#get-it-via-maven-central
The delay is about 1 second.
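A minimal sketch of how that can look with the vlc-android SDK; the caching value, the camera URL and the SurfaceView are assumptions you would adapt (a smaller --network-caching buys lower latency at the cost of robustness):

ArrayList<String> options = new ArrayList<>();
options.add("--network-caching=150"); // assumption: ~150 ms buffer for low latency
LibVLC libVlc = new LibVLC(context, options);
org.videolan.libvlc.MediaPlayer vlcPlayer = new org.videolan.libvlc.MediaPlayer(libVlc);
vlcPlayer.getVLCVout().setVideoView(surfaceView); // render into an existing SurfaceView
vlcPlayer.getVLCVout().attachViews();
Media media = new Media(libVlc, Uri.parse("rtsp://192.168.1.10:554/stream1")); // placeholder camera URL
media.setHWDecoderEnabled(true, false); // prefer hardware decoding where available
vlcPlayer.setMedia(media);
media.release();
vlcPlayer.play();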

How to get start for video streaming

I have never worked on a video-related project before, but now we have to.
1. What we tried to do
Build an Android application which can capture a real-time stream of video and audio.
Send the captured stream to a server.
Other clients (Android, iOS or HTML5) can view these streams.
All of the above three steps should work at the same time.
Video streamed to the server should be cached for future playback.
2. What I know at the moment
I have searched on Google and sf to see if someone has the same requirement.
From that, I now know a little about video transmission:
Protocol:
RTSP/RTP/RTCP
RTSP: controls the state of the transmission, like PLAY, PAUSE, STOP...
RTP: does the actual transport job
RTCP: works in conjunction with RTP (synchronizes the streams)
HTTP:
1) Download small pieces of the video file and play them, using `Range` requests to control the download (playback) position.
2) HLS, by Apple. Even though it is called live streaming, it is based on a `.m3u8` playlist file, and the live part works by continually updating that index.
RTMP, by Adobe.
Encoding:
Nothing I know yet.
And it seems that RTSP/RTP/RTCP can be used both for uploading to the server and for playing on the client, so it suits applications that need low latency. However, since RTSP/RTP/RTCP runs directly over TCP/UDP, getting through routers (NAT) can be a problem.
HTTP, by contrast, can only be used for playing at the client (technically you can stream small pieces of a file over HTTP, but I think it is not a good idea), so it can be used to play an existing video stream, whether from a file or something else. And you don't need to worry about the router, which means it can be used in complex network environments. A small sketch of the range-request mechanism follows below.
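As a tiny illustration of the range-request idea mentioned above, assuming plain HttpURLConnection and a placeholder URL:

// Ask the server for only the first megabyte of the file; a server that supports
// range requests answers with HTTP 206 Partial Content.
static void fetchFirstChunk() throws IOException {
    URL url = new URL("http://example.com/video.mp4"); // placeholder
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestProperty("Range", "bytes=0-1048575");
    int code = conn.getResponseCode(); // expect 206 if ranges are supported
    try (InputStream in = conn.getInputStream()) {
        byte[] chunk = new byte[8192];
        int read;
        while ((read = in.read(chunk)) != -1) {
            // hand the bytes to a local buffer or decoder here
        }
    } finally {
        conn.disconnect();
    }
}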
For our application, since we do not have a strict requirement for real-time playback, we are planning to stream the video from the Android client to the server via RTSP/RTP/RTCP, and to serve those streams over HTTP.
3. Questions:
Is anything wrong in all of the above?
Is my idea feasible: streaming via RTSP/RTP/RTCP and serving via HTTP?
If yes, it seems the server should do something to cache the video in a proper format for further serving. I am not sure whether a video server can do this job out of the box, or whether I have to do it myself.
What more should I know about streaming development (at least for my current project)? Any tutorials are welcome.

Streaming audio file and caching it

I want to stream an MP3 audio file and then play it through the Android MediaPlayer, and I also want to cache the file so that the MediaPlayer doesn't have to stream recently played tracks again.
I have tried using the prepareAsync method, but it doesn't give me access to the buffer content, so I decided to stream the audio file myself and then pass it to the MediaPlayer for playing. I achieved this by following the article here, but this approach has a problem: while transferring the file to the MediaPlayer it goes into an error state, which makes my player behave inconsistently.
When the MediaPlayer enters its error state it doesn't come out of it automatically, so I am forced to create a new MediaPlayer and re-provide it with the downloaded file. This workaround causes the user to experience an undesired pause in the song.
So, does anyone have an improved version of the code given in the link above? Or do you know a better solution to this problem, or is there actually a library for streaming audio files on Android?
Thanks
The link you provided looks like a less than ideal solution (not to mention outdated). What you probably want is a local proxy server that gives you access to byte data before the MediaPlayer gets it. See my answer here for a little more explanation.
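Very roughly, the local-proxy idea looks like this. It is a hypothetical, stripped-down sketch (a real implementation must handle Range requests, Content-Length, errors and reconnects, and this runs inside a method that may throw IOException); cacheFile, mediaPlayer and the remote URL are assumptions:

// MediaPlayer connects to 127.0.0.1; we fetch the real file and tee every byte
// into both the player's socket and a local cache file for later replays.
final ServerSocket serverSocket = new ServerSocket(0, 1, InetAddress.getByName("127.0.0.1"));
mediaPlayer.setDataSource("http://127.0.0.1:" + serverSocket.getLocalPort() + "/track.mp3");
new Thread(new Runnable() {
    @Override
    public void run() {
        try (Socket client = serverSocket.accept();
             OutputStream toPlayer = client.getOutputStream();
             OutputStream toCache = new FileOutputStream(cacheFile)) { // cacheFile is an assumed destination
            HttpURLConnection remote =
                    (HttpURLConnection) new URL("http://example.com/track.mp3").openConnection(); // placeholder remote URL
            toPlayer.write("HTTP/1.1 200 OK\r\nContent-Type: audio/mpeg\r\n\r\n".getBytes());
            try (InputStream in = remote.getInputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    toPlayer.write(buf, 0, n); // feed the MediaPlayer
                    toCache.write(buf, 0, n);  // and keep a copy for the next playback
                }
            }
        } catch (IOException e) {
            Log.e("StreamProxy", "proxy failed", e);
        }
    }
}).start();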
