Real-time video player to embed in an Android app

I need a library that supports real-time video streaming from an RTSP connection, to embed in an Android application I've built. It must have really low latency (1-2 s would be fine). I've already tried a simple VideoView: it works, but it has HUGE latency (more than 10 s) because its buffer size cannot be lowered.
Is there any good and reliable solution?
I would prefer not to build my own player from scratch...
ExoPlayer doesn't seem to support RTSP.

I solved it using a modified version of ExoPlayer (the RTSP ExoPlayer GitHub pull request). The buffer size can be edited, so I think it's the best choice for this use case.
It works flawlessly!
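For anyone who lands here, the key change is swapping ExoPlayer's default LoadControl for one that targets a much smaller buffer. Below is a minimal sketch, assuming a build of ExoPlayer that includes RTSP support (as in the linked pull request; upstream ExoPlayer later added RtspMediaSource in 2.14). The buffer values and the class/method names around them are illustrative, not the exact code from the PR:

```java
import android.content.Context;

import com.google.android.exoplayer2.DefaultLoadControl;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.source.MediaSource;
import com.google.android.exoplayer2.source.rtsp.RtspMediaSource;
import com.google.android.exoplayer2.ui.PlayerView;

public final class LowLatencyRtspPlayer {

    // Builds a player whose buffer targets roughly 1-2 s of media
    // instead of ExoPlayer's multi-second defaults.
    public static SimpleExoPlayer create(Context context, PlayerView playerView, String rtspUri) {
        DefaultLoadControl loadControl = new DefaultLoadControl.Builder()
                .setBufferDurationsMs(
                        /* minBufferMs= */ 1000,
                        /* maxBufferMs= */ 2000,
                        /* bufferForPlaybackMs= */ 500,
                        /* bufferForPlaybackAfterRebufferMs= */ 1000)
                .build();

        SimpleExoPlayer player = new SimpleExoPlayer.Builder(context)
                .setLoadControl(loadControl)
                .build();

        MediaSource source = new RtspMediaSource.Factory()
                .createMediaSource(MediaItem.fromUri(rtspUri));

        playerView.setPlayer(player);
        player.setMediaSource(source);
        player.prepare();
        player.play();
        return player;
    }
}
```

With buffers this small, expect rebuffering on a lossy network; 1-2 s is a realistic floor for RTSP over Wi-Fi.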

Android VLC Embed vs Android VideoView

=== BACKGROUND SUMMARY ===
At this moment, we are using Android VideoView to perform video playback. Everything seems to be working great until we encounter live streaming.
VideoView tends to have a 10-15 second delay behind the live stream, even within a local network (LAN).
While attempting to solve this issue, we came across VLC Embed for Android. After searching the Internet, it seems there isn't any article comparing the pros and cons of Android VLC Embed vs. Android VideoView.
=== QUESTION ===
What are the advantages (pros) and disadvantages (cons) of using Android VLC Embed vs. Android VideoView?
Is VLC Embed stable?
Is there anything I should be careful about when switching from the existing VideoView to VLC?
Thank you all in advance.
My view may not be very professional, but it reflects what I've experienced so far.
First, Android VideoView is good since it comes with the Android SDK, so it does not require an external library. But it has some limits: for example, as far as I know, it doesn't support the MMS and MMSH protocols, among others I won't list here. That is not the case for the Android VLC SDK; that library is complete and supports almost every media format I've come across so far.
It just increases your APK size; on my side, that's the only disadvantage.
Is the Android VLC SDK stable? Yes, it's stable and maintained by a huge community.
Is there anything I should be careful about when switching from the existing VideoView to VLC?
You can keep your media sources the same, but take care with aspect ratio handling.
What are the advantages (pros) and disadvantages (cons) of using Android VLC Embed vs. Android VideoView?
Advantage:
More features. VLC supports almost all media formats as well as hardware decoding; audio tracks, subtitles, and chapters are also supported.
More integrated, simpler logic. You can easily get media information and cache it. The playback engine proactively notifies you of state changes and events; just register a player event listener (see the sketch after these lists).
Disadvantage:
APK file size increase. If both arm64-v8a and armeabi-v7a are supported, it adds more than 30 MB.
Multiple instances are not perfect. For example, playing two videos at the same time is a hassle.
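To illustrate the event-listener point above, here is a minimal sketch of registering for libVLC playback events in Java. The org.videolan.libvlc classes and event constants are the real libVLC Android API, but the setup is deliberately stripped down (no LibVLC options, no video output attachment), and the stream URI is a placeholder:

```java
import android.content.Context;
import android.net.Uri;

import org.videolan.libvlc.LibVLC;
import org.videolan.libvlc.Media;
import org.videolan.libvlc.MediaPlayer;

public final class VlcEventsDemo {

    public static MediaPlayer play(Context context, Uri streamUri) {
        LibVLC libVlc = new LibVLC(context);
        MediaPlayer mediaPlayer = new MediaPlayer(libVlc);

        // Register once; libVLC then pushes state changes to this callback.
        mediaPlayer.setEventListener(event -> {
            switch (event.type) {
                case MediaPlayer.Event.Playing:
                    // Playback has started.
                    break;
                case MediaPlayer.Event.EndReached:
                    // Stream or file finished.
                    break;
                case MediaPlayer.Event.EncounteredError:
                    // Surface the failure to the user.
                    break;
            }
        });

        Media media = new Media(libVlc, streamUri);
        mediaPlayer.setMedia(media);
        media.release(); // The player keeps its own reference.
        mediaPlayer.play();
        return mediaPlayer;
    }
}
```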
Is VLC Embed stable?
Stable. Starting with VLC 2.0.x (and now 3.0.x), I have used the VLC library in my Android app. It runs steadily from Android 5.1 to Android 8.0. A small number of 4K H.265 videos do not play back normally, but that can be handled by displaying "Cannot play".
Is there anything I should be careful about when switching from the existing VideoView to VLC?
To use LibVLC on Android, the Medialibrary (org.videolan.medialibrary) is also required. You also need to take note of the licenses:
VLC for Android is licensed under the GPLv3.
This may be a concern for you if your project uses a different license.

How to record (and process?) a video that is streamable from Android

My company's app relies heavily on video recording and playback of web-based videos. I use the MediaRecorder API to record videos, through this library that I designed: https://github.com/afollestad/material-camera.
For playback, I use this library which is basically a wrapper around Google's ExoPlayer library: https://github.com/brianwernick/ExoMedia.
It works fine for the most part with small videos, especially if I decrease the bit rates for audio and video. However, larger and higher-quality videos have many issues: sometimes they seem to buffer forever, sometimes playback doesn't even start successfully, and so on. Again, these videos are being streamed over HTTP from Amazon S3.
I've read a little bit about FFmpeg and how it can process MP4s for "faststart", split files into chunks for DASH, etc. However, FFmpeg solutions for Android seem a bit complex, so...
Is there any way to record MP4s from Android, with MediaRecorder, MediaCodec, or some other API, that results in a video file that is fast to stream? It amazes me how well Snapchat has figured this out.
Well, I ended up trying the stock MediaPlayer API again. As long as you're on API 16 or above, there should be no major issues with the default hardcoded buffer size.
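For reference, the stock approach boils down to something like the following sketch (the URL and SurfaceHolder are placeholders, and error handling is omitted):

```java
import java.io.IOException;

import android.media.MediaPlayer;
import android.view.SurfaceHolder;

public final class HttpMp4Playback {

    // Streams a progressive MP4 over HTTP with the framework MediaPlayer.
    public static MediaPlayer play(String url, SurfaceHolder holder) throws IOException {
        MediaPlayer mp = new MediaPlayer();
        mp.setDataSource(url);                        // e.g. an S3 HTTPS URL
        mp.setDisplay(holder);                        // render into your SurfaceView
        mp.setOnPreparedListener(MediaPlayer::start); // start once enough is buffered
        mp.prepareAsync();                            // prepare off the UI thread
        return mp;
    }
}
```

Note that progressive streaming still benefits from MP4 "faststart" (the moov atom moved to the front of the file); without it, the player may need extra range requests, or even the whole file, before playback can start.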
I ended up making a small library in case others need a nice solution to video playback: https://github.com/afollestad/easy-video-player

Most instant way to stream live video to iOS and Android

I'm making an app that needs to send a video feed from a single source to a server where it can be accessed by desktop browsers and mobile apps.
So far, I've been using Adobe Media Server 5 with a live RTMP stream. This gives me about a 2.5-second delay in desktop browsers, but no native support for iOS; it does leave me the option of using Air to export the app for iOS, which produces a minimum 5-6 second delay.
The iOS docs strongly recommend the use of HTTP Live Streaming, which segments the stream into chunks and serves it using a dynamic playlist in an .m3u8 file. Doing this produces a 15+ second delay on desktop browsers and mobile devices. A Google search seemed to reveal that this is to be expected from HLS.
I need a maximum of 2-4 seconds of delay across all devices, if possible. I've gotten poor results with Wowza, but am open to revisiting it. FFmpeg seems inefficient, but I'm open to that as well if someone has had good results with it. Anybody have any suggestions? Thanks in advance.
I haven't even begun to find the most efficient way to stream to Android, so any help in that department would be much appreciated.
EDIT: Just to be clear, my plan is to make an iOS app, whether it's written natively or in Air. Same goes for Android, but I've yet to start on that.
In the iOS browser, HLS is the only way to serve live video. The absolute lowest latency would be to use 2-second segments with a 2-segment window in the manifest. This will give you 4 seconds of latency on the client, plus another 2 to 4 on the server. There is no way to do better without writing an app.
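For concreteness, a live playlist with a 2-segment window over 2-second segments might look like the illustrative example below (segment names and sequence numbers are made up). The client has to buffer the advertised window before it starts, which is where the ~4 seconds of client-side latency comes from:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:2.0,
segment120.ts
#EXTINF:2.0,
segment121.ts
```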
A 15-second delay for HLS streams is pretty good; to provide lower latency you need to use a different streaming protocol.
RTP/RTSP will give you the lowest latency and is typically used for VoIP and video conferencing, but you will find it very difficult to use over multiple mobile and Wi-Fi networks (some of them unintentionally block RTP).
If you can write an iOS app that supports RTMP, then that is the easiest way to go and should work on Android too (only old Android versions support Flash/RTMP natively). Decoding in software will result in poor battery life. There are other iOS apps that don't use HLS for streaming, but I think you need to limit them to your own service (not a generic video player).
Also, please remember that higher latency means higher video quality, less buffering, and a better user experience, so don't reduce latency unnecessarily.

Is it possible to create (deploy) HTTP Live Streaming (HLS) on Android (4.x)?

Is it possible to invoke (deploy) HTTP Live Streaming (HLS) on Android (4.x)?
https://developer.apple.com/streaming/
Obviously iOS devices can both capture and play, and I know Android can at least play, but what about capturing? I'm wondering about interoperability.
Thanks.
The best answer I found so far is:
Creating a HLS video stream with FFmpeg (12 May 2013)
http://walterebert.com/blog/creating-on-hls-video-stream-with-ffmpeg/
For video conversion I use FFmpeg. Creation of HLS is possible with FFmpeg, but not really well documented, so I had to figure out how to create the video streams myself. After a lot of research and experimentation I created my FFmpeg HLS reference implementation, which is available on Bitbucket.
On iOS the created video plays without problems on new devices. Older iOS devices with a maximum resolution of 480×320 pixels seem to select the best-quality stream available, even if they cannot play it. For Android you have to create an MP4 video first and convert it into an MPEG stream afterwards; doing this in a single command creates a choppy stream on Android. Flash playback still has some issues when the bitrate changes. So I still have some work to do.
Yes. HLS is widely used on Android 4.x.
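On the playback side, Android 4.x can consume an HLS playlist with nothing beyond the framework; a minimal sketch follows (the URL is a placeholder). Creating the stream itself still has to happen on the encoding side, e.g. with FFmpeg as described above:

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.VideoView;

public class HlsPlaybackActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        VideoView videoView = new VideoView(this);
        setContentView(videoView);

        // The stock player understands .m3u8 playlists on Android 4.x+.
        videoView.setVideoURI(Uri.parse("http://example.com/live/playlist.m3u8"));
        videoView.setOnPreparedListener(mp -> videoView.start());
    }
}
```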

Receive and decode an H.264 live stream in Android

I am evaluating the possibility of displaying a continuous H.264 live feed (RTSP) on an Android device (2.3+ or even 4.0). I need the delay time (source to display; you can assume source encoding adds zero delay) to be within 1 second or so. I wonder if anybody has done this already? What would be a good approach to achieve this?
Thanks in advance
Michael
Maybe you could just give it a try. H.264 is supported by Android and is even hardware-accelerated on the latest generation of upper- and middle-class devices. There is the MediaPlayer class, which can be used to play video files, and it also supports streaming from http and rtsp URIs.
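"Just give it a try" in code might look like the sketch below: point the framework MediaPlayer at the RTSP URI and watch the error callback. The address is a placeholder, and note that the stock player's buffering is not tunable, so the ~1-second target may not be reachable this way:

```java
import java.io.IOException;

import android.media.MediaPlayer;
import android.view.SurfaceHolder;

public final class RtspProbe {

    public static MediaPlayer probe(String rtspUri, SurfaceHolder holder) throws IOException {
        MediaPlayer mp = new MediaPlayer();
        mp.setDataSource(rtspUri);   // e.g. "rtsp://192.168.1.10/h264feed"
        mp.setDisplay(holder);       // render into your SurfaceView
        mp.setOnErrorListener((player, what, extra) -> {
            // what/extra identify the failure, e.g. MEDIA_ERROR_UNSUPPORTED.
            return true; // Mark as handled so onCompletion is not called.
        });
        mp.setOnPreparedListener(MediaPlayer::start);
        mp.prepareAsync();
        return mp;
    }
}
```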
