I'm working on a native Android application that streams HLS video. I'm trying to get CEA-608 closed caption data to display, but I'm running into a problem.
Methodology:
I'm running on a Jelly Bean device (Android 4.1+, API level 16), which supports closed captions and HLS playback (http://developer.android.com/about/versions/android-4.1.html#Multimedia).
Test Feed:
I'm testing using the Apple sample HLS bip-bop feed that contains caption data (https://devimages.apple.com.edgekey.net/resources/http-streaming/examples/bipbop_16x9/bipbop_16x9_variant.m3u8)
Grabbing Caption Data:
As per the Android documentation, I'm attempting to get the caption track by calling:
"MediaPlayer.getTrackInfo()"
Problem:
The player crashes when I execute the getTrackInfo() line, with the following output in LogCat:
E/AndroidRuntime(7311): FATAL EXCEPTION: main
E/AndroidRuntime(7311): java.lang.RuntimeException: failure code: -38
Questions:
1) Is closed caption rendering from in-stream caption data on an HLS feed supported in native Android apps?
2) If not, are there alternatives?
3) The documentation in the link above (quoted below) indicates that in-stream captions are supported in MP4 or 3GPP media sources. Does that necessarily exclude HLS (which I believe is MPEG-2 TS based) from working?
The MediaPlayer now handles both in-band and out-of-band text tracks. In-band text tracks come as a text track within an MP4 or 3GPP media source. Out-of-band text tracks can be added as an external text source via addTimedTextSource() method. After all external text track sources are added, getTrackInfo() should be called to get the refreshed list of all available tracks in a data source.
Thanks for any help you can offer!
We went through exactly the same exercise. MediaPlayer on Android doesn't seem to support CEA-608 embedded in HLS. OnTimedTextListener was never called while the MediaPlayer (MP) was playing, and calling getTrackInfo() while it was playing resulted in the crash you mentioned.
Maybe the Android MediaPlayer can only decode SRT tracks included in MP4.
An alternative would be to use an external timed text track, but in a live streaming environment that is going to be difficult.
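For completeness, the out-of-band route looks roughly like this (a sketch; videoPath and the .srt path are hypothetical, exception handling is omitted, and producing such a file on the fly for a live stream is the hard part):

    MediaPlayer mp = new MediaPlayer();
    mp.setDataSource(videoPath);
    mp.prepare();
    // Attach an external SubRip track (API 16+):
    mp.addTimedTextSource("/sdcard/captions.srt",
            MediaPlayer.MEDIA_MIMETYPE_TEXT_SUBRIP);
    // Re-query the track list and select the timed text track:
    MediaPlayer.TrackInfo[] tracks = mp.getTrackInfo();
    for (int i = 0; i < tracks.length; i++) {
        if (tracks[i].getTrackType()
                == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_TIMEDTEXT) {
            mp.selectTrack(i);
            break;
        }
    }
    mp.setOnTimedTextListener(new MediaPlayer.OnTimedTextListener() {
        @Override
        public void onTimedText(MediaPlayer player, TimedText text) {
            // Render text.getText() in your own caption view.
        }
    });
    mp.start();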
The question is quite old, but some people might still face this problem.
ExoPlayer tries to solve the issue as a general multimedia library that works independently of the Android version.
It currently supports CEA-608, SubRip, TTML, TX3g and WebVTT subtitles - of course not every corner of each standard, but a useful part of them.
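For example, once a player is prepared with an HLS media source, the embedded CEA-608 cues arrive through a text listener (a sketch against the ExoPlayer 2.x API; exact method names vary a little between versions, and subtitleView is assumed to be a SubtitleView from the exoplayer-ui module):

    player.addTextOutput(new TextOutput() {
        @Override
        public void onCues(List<Cue> cues) {
            // CEA-608 cues decoded from the TS segments end up here.
            subtitleView.onCues(cues);
        }
    });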
Let me rephrase my question; I wrote it in a hurry.
Current situation:
I have set up a digital video recorder to record broadcasts provided via DVB-C. It is running on a Raspberry Pi 3B, using TVHeadend and Jetty/Cling to provide UPnP and other ways to access the media files. For watching recordings, I wrote an Android player app using IJKPlayer, which runs on smartphones, Fire TV and Android TV.
One hassle when playing media files that are currently being recorded is that IJKPlayer does not support timeshifting. That means when I start playing a file that is still recording, I can only watch up to the length known to the player at that moment; anything recorded afterwards cannot be played, and I need to exit the player activity and start it again. I have resolved that issue by "simulating" a completed recording using a custom servlet implementation. Since the complete length of the recording is already known, I can use ffmpeg to accomplish this.
Future situation:
I plan to move away from IJKPlayer to ExoPlayer, because it supports hardware playback and is much faster when playing H.264 media. I could of course use the same solution as above, but as far as I have found out so far, ExoPlayer can support media files that are currently being recorded by using the Timeline class. However, I have found neither useful documentation nor any good example. Hence, I would appreciate any help with the Timeline object.
Regards
Harry
Looks like my approach won't work; at least, I didn't find a solution. The problem is that the server returns the stream size as it was at player start time, and I didn't find a method to update the media duration for "regular" files.
However, I can solve the problem by changing the server side. Instead of accessing a regular file, I convert the file to m3u8 in real time using ffmpeg. I then hand the m3u8 URI to the player, and it updates the duration of the stream (while playing) without the need for any additional code on the client side.
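For reference, the server-side conversion is a single ffmpeg invocation along these lines (paths and segment length are examples to be tuned; -hls_list_size 0 keeps every segment in the playlist, so the reported duration grows as the recording does):

    ffmpeg -i /recordings/show.ts -c copy -f hls \
        -hls_time 6 -hls_list_size 0 -hls_flags append_list \
        /var/www/hls/show.m3u8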
I am developing an app with Xamarin, currently for Android, later iOS.
I have successfully started a SIP session as a client and received an SDP.
With the session description I start my RTPClient (I am using Managed Media Aggregation, https://net7mma.codeplex.com/), also successfully, it seems.
I get a URI ending with a media port.
I have tried different ways to play back that stream:
Android.Widget.VideoView:
videoview.SetVideoURI(Android.Net.Uri.Parse(fullPath));
videoview.Start();
says in a popup box "Cannot Playback Video" or something like that (just in German, in my case).
So I tried an Android.Media.MediaPlayer:
player.Reset();
player.SetDataSource(fullPath);
player.Prepare();
runs into an exception and
player.PrepareAsync();
seems to lead nowhere.
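For what it's worth, here is how I understand PrepareAsync is supposed to be wired, sketched in plain Android Java (the Xamarin bindings closely mirror these calls; whether MediaPlayer accepts a bare RTP URI at all is part of my question):

    MediaPlayer player = new MediaPlayer();
    player.setDataSource(fullPath); // the media URI from the SDP (exception handling omitted)
    player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start(); // start only once preparation has completed
        }
    });
    player.setOnErrorListener(new MediaPlayer.OnErrorListener() {
        @Override
        public boolean onError(MediaPlayer mp, int what, int extra) {
            Log.e("Player", "error what=" + what + " extra=" + extra);
            return true;
        }
    });
    player.prepareAsync(); // non-blocking; completion arrives via the listener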
Should these work somehow, and am I just doing something wrong?
Or must I give the player more info, e.g. that the stream is H.264 encoded, has 640x480 pixels, and other details I know exactly from my SDP media description?
I have taken a look at the MediaFormat and MediaCodec classes from Android, but I still do not know how to use them for my case (an RTP connection with a known media description).
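In other words, would it come down to something like this (a sketch only, again in plain Android Java; 640x480 and H.264 are the values from my SDP, surface, nalUnit and ptsMicros are placeholders, and the RTP depacketization per RFC 6184 is not shown)?

    // Configure a decoder from the SDP-known values:
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
    MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
    decoder.configure(format, surface, null, 0); // surface: where frames are rendered
    decoder.start();

    // Feed H.264 NAL units reassembled from the RTP payload (not shown):
    int inIndex = decoder.dequeueInputBuffer(10000);
    if (inIndex >= 0) {
        ByteBuffer in = decoder.getInputBuffers()[inIndex];
        in.clear();
        in.put(nalUnit); // byte[] assembled from the RTP packets
        decoder.queueInputBuffer(inIndex, 0, nalUnit.length, ptsMicros, 0);
    }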
Thanks a lot and have a nice weekend!
Eric
You should use an external component like Google's ExoPlayer to play media with this kind of encoding. There is a NuGet package available for this: https://github.com/martijn00/ExoPlayerXamarin
That's my project (net7mma); i.e., I am the author...
You can use either the RTSP or RTP client quite easily, and I have verified as much on Android recently.
If you have specific questions, start a thread on the project's home page and I will address them.
I'm developing an Android application that allows users to watch TV channels via streaming.
The user taps on a channel (for example, channel 1) and an activity shows the real-time video. My question: are there solutions other than using a WebView to show the live video?
Do any more "professional" or functional solutions exist?
You can use ExoPlayer to play streams. Take a look at the demo app. As the official documentation says:
ExoPlayer has support for Dynamic Adaptive Streaming over HTTP (DASH) and SmoothStreaming, neither of which are supported by MediaPlayer (it also supports HTTP Live Streaming (HLS), MP4, MP3, WebM, M4A, MPEG-TS and AAC).
But make sure you can get the direct link to your streams.
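A minimal setup looks roughly like this (a sketch against ExoPlayer 2.x; factory names vary a little between versions, playerView is a PlayerView from the exoplayer-ui module, and channelUrl stands for your direct stream link):

    SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(context);
    playerView.setPlayer(player);
    DataSource.Factory dataSourceFactory =
            new DefaultHttpDataSourceFactory("tv-app");
    MediaSource source = new HlsMediaSource.Factory(dataSourceFactory)
            .createMediaSource(Uri.parse(channelUrl)); // direct .m3u8 link, assumed
    player.prepare(source);
    player.setPlayWhenReady(true);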
I've recently been struggling a lot with video playback inside an Android app. The video in question is an M3U8 playlist, which in turn links to a series of secondary M3U8s, each of which lists the component MPEG-TS video files, plus a single M3U8 containing the audio components in AAC format. I had a fair bit of trouble making the M3U8 cooperate; however, that now seems to be working OK.
Unfortunately, the audio lags about a second behind the video. The lag is present both in my own Activity containing a MediaPlayer and when simply launching Android's default video player pointed at the stream. There is also frequent visual stuttering or corruption in the playback. Neither issue is present when the stream is played with the VLC Beta or in the iPhone version of this app, so it's not a problem with the video file itself. I tried the Vitamio library; it ran even worse (far more frequent corruption, and no audio at all). Regrettably, changing the video format is not an option.
It seems to me that this issue may be caused by Android's limited support for MPEG-TS - the list of supported media formats notes that MPEG-TS is supported with AAC audio only (whereas I assume the iOS version simply plays both video and audio from the .ts file). Any recommendations or solutions for fixing this lag?
You could try GStreamer, which seems to work well on Android (http://gstreamer.com/); also, Vitamio is
Can someone please explain the steps I need to take in order to add a new codec to Android?
Also, I would like the codec to be installed as part of an application installation (or first launch) and NOT as part of a full Android OS build.
The reason I want to do this is that I have an application that needs to show a video in an unsupported format (HLS or TS), but I wouldn't want to build a full-blown video player - just integrate with the existing, built-in player.
Thanks,
Alik.
Can someone please explain the steps I need to take in order to add a new codec to Android?
Build your own firmware, or build your own media player (like VLC for Android).
Also, I would like the codec to be installed as part of an application installation (or first launch) and NOT as part of a full Android OS build.
That is not possible, unless you build your own media player.
The reason I want to do this is that I have an application that needs to show a video in an unsupported format (HLS or TS), but I wouldn't want to build a full-blown video player - just integrate with the existing, built-in player.
VLC for Android is due out (at least for some phones) shortly, so it may be able to play your format.
I think it may be possible to add a custom codec (though I have not tried) by referring to the Android developer page on adding a custom codec to Android.
You can try adding your codec through the OpenMAX IL layer and then calling up the Android media player to play it (I believe VLC has done it this way, but it uses its own player). AwesomePlayer, the Android default player, just fetches the list of available codecs through the OpenMAX API, and if a matching codec is there, it plays. So it is worth trying to add your codec during your app's initialization and then calling up the media player.
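As a first step from the app side, you can at least enumerate which decoders the device's OpenMAX-backed codec list already exposes - a sketch using MediaCodecList (API 16+); note this only inspects the list, it cannot add to it:

    // Imports: android.media.MediaCodecList, android.media.MediaCodecInfo, android.util.Log
    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (info.isEncoder()) continue; // only interested in decoders here
        for (String type : info.getSupportedTypes()) {
            Log.d("Codecs", info.getName() + " decodes " + type);
        }
    }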