I'm trying to build an Android application that takes in an RTSP stream and displays the audio/video. I've been using VideoView and still getting a delay between 3 and 10 seconds from real time. I need this delay to be under 3 seconds.
On a PC, I can see the same RTSP stream using VLC with only a 1-2 second delay. How can I replicate this on Android? Even when I use other apps like MoboPlayer/RockPlayer, the delay is still 3 to 10 seconds. (If it matters, I'm connecting to the RTSP stream wirelessly on both PC and Android)
I've started looking into using FFmpeg for Android as well as the Gstreamer SDK for Android as alternatives to MediaPlayer, but they're both hard to work with for a novice like myself and I'm running into multiple problems.
Any and all information would be greatly appreciated!
First, I think the delay is caused by the initial buffer size, which is not controllable with Android's hardware-backed MediaPlayer. This can be fixed by using a custom software media player such as FFmpeg (now avconv) or GStreamer, but it will cost you performance (depending on the video stream's compression, resolution, and fps) and device battery life.
And of course, native development with FFmpeg or a similar C/C++ framework is complex and requires a lot of time and experience.
Another option: you can try the new Android 4.1 Java API for accessing the video and audio decoders, and build your own RTSP player using this API plus some code to pull the video stream from the RTSP server (a Java library for that probably already exists).
Android 4.1 Multimedia
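To make that 4.1 route concrete, here is a minimal sketch of the decoder API. The MediaCodec/MediaFormat calls below are the actual Android 4.1 (API 16) interfaces, but RtspSource is a hypothetical stand-in for whatever RTSP/RTP depacketizing code or Java library you end up using, and the resolution and timeouts are placeholders:

```java
import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

public class RtspH264Decoder {

    // Hypothetical source of depacketized H.264 access units from the RTSP session.
    public interface RtspSource {
        byte[] nextAccessUnit();
        long presentationTimeUs();
    }

    public void decodeLoop(Surface surface, RtspSource source) throws Exception {
        MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
        // SPS/PPS must also be supplied, either here as csd-0/csd-1 or as the
        // first queued buffer with BUFFER_FLAG_CODEC_CONFIG.
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        decoder.configure(format, surface, null, 0);
        decoder.start();

        ByteBuffer[] inputBuffers = decoder.getInputBuffers(); // API 16 style
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();

        while (!Thread.interrupted()) {
            int inIndex = decoder.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                byte[] unit = source.nextAccessUnit();
                inputBuffers[inIndex].clear();
                inputBuffers[inIndex].put(unit);
                decoder.queueInputBuffer(inIndex, 0, unit.length,
                        source.presentationTimeUs(), 0);
            }
            int outIndex = decoder.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                decoder.releaseOutputBuffer(outIndex, true); // render straight to the Surface
            }
        }
        decoder.stop();
        decoder.release();
    }
}
```

The latency then comes down to how much you buffer before queuing access units into the decoder, which is entirely under your control.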
Related
I need a library that supports real-time video streaming from an RTSP connection to embed in an Android application I've built. It must have really low latency (1-2 s would be fine). I've already tried a simple VideoView: it works, but it has a HUGE latency (more than 10 s) because its buffer size cannot be lowered.
Is there any good and reliable solution?
I would prefer not to build my own player from scratch...
ExoPlayer doesn't seem to support RTSP.
I solved it using a modified version of ExoPlayer (the RTSP ExoPlayer GitHub pull request). The buffer size can be edited, so I think it's the best choice for this use case.
It works flawlessly!
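For reference, a minimal sketch of how the buffering can be tightened; it assumes an ExoPlayer 2 build that includes RTSP support (the official exoplayer-rtsp module grew out of that pull request), and the buffer values and class names may differ in the exact fork you use:

```java
import android.content.Context;
import com.google.android.exoplayer2.DefaultLoadControl;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;

public final class LowLatencyRtspPlayer {

    // The millisecond values below are illustrative; ExoPlayer enforces some
    // internal constraints, so tune them against the version you build.
    public static SimpleExoPlayer create(Context context, String rtspUrl) {
        DefaultLoadControl loadControl = new DefaultLoadControl.Builder()
                .setBufferDurationsMs(
                        /* minBufferMs= */ 500,
                        /* maxBufferMs= */ 2000,
                        /* bufferForPlaybackMs= */ 500,
                        /* bufferForPlaybackAfterRebufferMs= */ 500)
                .build();

        SimpleExoPlayer player = new SimpleExoPlayer.Builder(context)
                .setLoadControl(loadControl)
                .build();
        player.setMediaItem(MediaItem.fromUri(rtspUrl)); // needs the RTSP module on the classpath
        player.prepare();
        player.play();
        return player;
    }
}
```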
I am searching for a library that offers the ability to stream video from an Android device (5.1+) and record it at the same time.
I tried MediaRecorder (the usual way to record videos on Android), but with it I am not able to stream over WebRTC or RTSP because the camera is busy.
Currently I am using libstreaming. With a little modification the app can record and stream over RTSP concurrently, but this lib lacks support for the hardware codecs in MTK and SPRG chipsets.
I wonder if you can recommend a solution or another lib.
At the moment the lib works only on the Nexus 4 with a Qualcomm chipset.
After several days of research, I decided to use a combination of FFmpeg and MediaCodec.
It seems that the only way to get frames from the camera at a high rate is to use the Android MediaCodec API. But MediaCodec supports only the MP4 file format, which is not an option for me (I need TS), while FFmpeg can process/create practically any known video format.
Currently I am trying to make them work together (read a ByteBuffer from MediaCodec and feed the FFmpeg recorder with it).
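Here is a rough sketch of the MediaCodec side of that glue. Draining the encoder is standard MediaCodec API (API 21+ here), but PacketSink is a hypothetical interface standing in for the FFmpeg/javacv recorder, since accepting pre-encoded packets is exactly the part that is still missing (see the javacv issue linked below):

```java
import java.nio.ByteBuffer;
import android.media.MediaCodec;

public class EncoderDrain {

    // Hypothetical consumer; in my case it would wrap an FFmpeg-based recorder.
    public interface PacketSink {
        void writePacket(byte[] data, long presentationTimeUs, boolean keyFrame);
    }

    // Pull every encoded buffer currently available from the video encoder.
    public static void drain(MediaCodec encoder, PacketSink sink) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int index;
        while ((index = encoder.dequeueOutputBuffer(info, 0)) >= 0) {
            ByteBuffer buffer = encoder.getOutputBuffer(index);
            byte[] packet = new byte[info.size];
            buffer.position(info.offset);
            buffer.limit(info.offset + info.size);
            buffer.get(packet);
            boolean keyFrame = (info.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0;
            sink.writePacket(packet, info.presentationTimeUs, keyFrame);
            encoder.releaseOutputBuffer(index, false);
        }
    }
}
```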
Useful links:
Grafika project: https://github.com/google/grafika
ContinuousCapture and Show + record are the most interesting parts to check
javacpp (specifically FFMpeg wrapper): https://github.com/bytedeco/javacpp
Has example with recording and streaming.
kickflip sdk: https://github.com/Kickflip/kickflip-android-sdk
A library that makes the two tools mentioned above work together, and it is also open source. Sadly it doesn't solve my problem fully; the feature I need has been requested but not yet implemented: https://github.com/bytedeco/javacv/issues/95
I want to stream an RTSP stream on Android, and I have finally come to the conclusion that I can't use the Android API's MediaPlayer, VideoView, etc., because latency is a big issue for me. I need a latency of <500 ms. Now I am planning to use GStreamer or FFmpeg to create an Android RTSP client. I just have a few doubts:
1) Will the GStreamer or FFmpeg client be able to provide latency <500 ms? I read there are some parameters which I can tweak to get very low latency; I just want to confirm. I have very good network bandwidth, and the frame size is generally 1920x1080.
2) I read that GStreamer is one level above FFmpeg and uses FFmpeg codecs to work. I want to know which one is easier to work with for creating an Android client: working with GStreamer, or working directly with FFmpeg?
3) If I use a GStreamer Android client, will I have to use the GStreamer server as well to stream the data? Currently I am using a Live555 RTSP server to stream the data.
I can't speak about ffmpeg, but for GStreamer:
1) Yes, you can get latencies much lower than 500 ms with GStreamer as an RTSP client. See the latency property on rtspsrc (which e.g. can be accessed via the setup-source signal if you use playbin... and you should); a small illustration follows after point 3. By default this is set to 2000 milliseconds (which is a safe default), but if your network is fast enough you can set it much lower.
2) That depends on your experience with both APIs. For me a GStreamer application would be much easier, and you can find a few samples on the internet:
https://coaxion.net/blog/2014/08/gstreamer-playback-api/
http://cgit.freedesktop.org/~slomo/gst-sdk-tutorials/tree/gst-sdk/tutorials (the android tutorials)
3) You can use any standards-conformant RTSP server; both should work. GStreamer's own RTSP server has a very simple but powerful API, and it is included with the GStreamer binaries for Android that you can get here: http://gstreamer.freedesktop.org/data/pkg/android/1.4.3/
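As a small illustration of point 1: on Android you would normally set this up in C through the tutorials' JNI code, but the latency property itself can be shown with the desktop gst1-java-core bindings (my assumption here, these are not part of the Android binaries linked above); the pipeline string and the 200 ms value are only examples:

```java
import org.freedesktop.gstreamer.Gst;
import org.freedesktop.gstreamer.Pipeline;

public class LowLatencyRtsp {
    public static void main(String[] args) {
        Gst.init("low-latency-rtsp", args);
        // rtspsrc's "latency" property is the jitterbuffer size in milliseconds;
        // the 2000 ms default is what produces the multi-second delay.
        Pipeline pipeline = (Pipeline) Gst.parseLaunch(
                "rtspsrc location=rtsp://192.168.1.10:8554/stream latency=200 "
                + "! decodebin ! videoconvert ! autovideosink");
        pipeline.play();
        Gst.main(); // run until interrupted
    }
}
```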
Is it possible to use (deploy) HTTP Live Streaming (HLS) on Android (4.x)?
https://developer.apple.com/streaming/
Obviously iOS devices can both capture and play, and I know Android can at least play, but what about capturing? I'm wondering about interoperability.
Thanks.
The best answer I found so far is
Creating a HLS video stream with FFmpeg
12 May 2013
http://walterebert.com/blog/creating-on-hls-video-stream-with-ffmpeg/
For video conversion I use FFmpeg. Creation of HLS is possible with FFmpeg, but not really well documented. So I had to figure out how to create the video streams. After a lot of research and experimentation I created my FFmpeg HLS reference implementation that is available on Bitbucket.
On iOS the created video plays without problems on newer devices. Older iOS devices with a maximum resolution of 480×320 pixels seem to select the best-quality stream available, even if they cannot play it. For Android you have to create an MP4 video first and convert it into an MPEG stream afterwards; doing this in a single command creates a choppy stream on Android. Flash playback still has some issues if you change the bitrate. So I still have some work to do.
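As a concrete illustration of the kind of command involved (this is not the linked reference implementation, just a sketch with placeholder file names), here is a small Java wrapper that shells out to ffmpeg to segment an already-encoded MP4 into an HLS playlist:

```java
import java.util.Arrays;
import java.util.List;

public class HlsSegmenter {
    public static void main(String[] args) throws Exception {
        List<String> cmd = Arrays.asList(
                "ffmpeg",
                "-i", "input.mp4",        // MP4 created in a separate, earlier step
                "-codec", "copy",          // repackage only, no re-encoding
                "-start_number", "0",
                "-hls_time", "10",         // segment duration in seconds
                "-hls_list_size", "0",     // keep all segments in the playlist (VOD style)
                "-f", "hls",
                "index.m3u8");
        Process process = new ProcessBuilder(cmd).inheritIO().start();
        System.exit(process.waitFor());
    }
}
```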
Yes. HLS is widely used on Android 4.x.
We are developing a sort of intercom system. We need to play a real-time audio stream in an Android application using the RTSP or HTTP protocol with minimal delay. The standard approach with MediaPlayer.setDataSource(URL) gives too big a delay (about 2-3 seconds). We are using Android 2.2. As I understand it, the buffer size in MediaPlayer can only be set at the firmware level. Can you give me some advice on how to do this, or should I dig deep into real VoIP?
I found a flexible solution: use the AudioTrack API. There is also an interesting article about the audio APIs available in Android: http://www.wiseandroid.com/post/2010/07/13/Intro-to-the-three-Android-Audio-APIs.aspx
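A minimal sketch of that AudioTrack approach: you control the buffer yourself instead of letting MediaPlayer buffer for seconds. The InputStream of decoded 16-bit PCM is assumed to come from your own RTSP/HTTP handling, which is the part you still have to write:

```java
import java.io.InputStream;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class PcmStreamPlayer {

    // Plays mono 16-bit PCM from the given stream with a near-minimal buffer.
    public void play(InputStream pcmIn, int sampleRate) throws Exception {
        int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf, AudioTrack.MODE_STREAM);
        track.play();

        byte[] chunk = new byte[minBuf];
        int read;
        while ((read = pcmIn.read(chunk)) > 0) {
            track.write(chunk, 0, read); // blocks until the data is queued
        }
        track.stop();
        track.release();
    }
}
```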