Using GStreamer or FFmpeg to create an RTSP client on Android

I want to stream an RTSP stream on Android, and I have finally come to the conclusion that I can't use the Android APIs (MediaPlayer, VideoView, etc.) because latency is a big issue for me: I need a latency below 500 ms. Now I am planning to use GStreamer or FFmpeg to create an Android RTSP client. I just have a few doubts:
1) Will a GStreamer or FFmpeg client be able to provide latency below 500 ms? I read there are some parameters I can tweak to get very low latency; I just want to confirm. I have very good network bandwidth, and the frame size is generally 1920x1080.
2) I read that GStreamer sits one level above FFmpeg and can use FFmpeg's codecs. I want to know which one is easier to work with for creating an Android client: working with GStreamer, or working directly with FFmpeg.
3) If I use a GStreamer Android client, will I have to use a GStreamer server as well to stream the data? Currently I am using the Live555 RTSP server to stream data.

I can't speak about ffmpeg, but for GStreamer:
1) Yes, you can get latencies much lower than 500 ms with GStreamer as an RTSP client. See the latency property on rtspsrc (which can e.g. be accessed via the source-setup signal if you use playbin... and you should). By default this is set to 2000 milliseconds (a safe default), but if your network is fast enough you can set it much lower; see the sketch after this list.
2) That depends on your experience with both APIs. For me, a GStreamer application would be much easier to write, and you can find a few samples on the internet:
https://coaxion.net/blog/2014/08/gstreamer-playback-api/
http://cgit.freedesktop.org/~slomo/gst-sdk-tutorials/tree/gst-sdk/tutorials (the Android tutorials)
3) You can use any standards-conformant RTSP server; both should work. GStreamer's own RTSP server has a very simple but powerful API, and it is included with the GStreamer binaries for Android that you can get here: http://gstreamer.freedesktop.org/data/pkg/android/1.4.3/
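A minimal sketch of point 1), assuming GStreamer 1.x on the native side. The RTSP URL is made up, and 200 ms is an assumption for a fast local network; tune it for yours:

```c
/* Sketch: playbin-based RTSP playback with a reduced jitterbuffer
 * (GStreamer 1.x). The URL is a made-up example and 200 ms is an
 * assumption for a fast local network. */
#include <gst/gst.h>

static void
on_source_setup (GstElement *playbin, GstElement *source, gpointer user_data)
{
  /* For rtsp:// URIs the source is rtspsrc; its "latency" property is
   * the jitterbuffer size in milliseconds (default: 2000). */
  g_object_set (source, "latency", 200, NULL);
}

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GstElement *playbin = gst_element_factory_make ("playbin", "player");
  g_object_set (playbin, "uri", "rtsp://192.168.1.10:8554/test", NULL);
  g_signal_connect (playbin, "source-setup",
                    G_CALLBACK (on_source_setup), NULL);

  gst_element_set_state (playbin, GST_STATE_PLAYING);
  g_main_loop_run (g_main_loop_new (NULL, FALSE));
  return 0;
}
```

The same source-setup/latency calls drop straight into the native code of the Android tutorials linked above.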

Related

Streaming video from Android camera to PC?

What is the best (performance-wise) way to capture and stream video from an Android device's camera to a PC?
I have seen this question asked here before, and there are a few open source programs that do just that, but there are multiple approaches and I don't know which one is best!
For example:
Should the Android part be written in C++ or Java (performance/API wise)?
Which API should I use to get the video from the camera?
What is the best way to stream the video?
I don't intend to support old Android versions (<4.x), so if the best way/API is relatively new, that's fine by me.
I'm not familiar with Android development but I'll try to answer.
I suppose that the actual encoding of the raw image data is done on a hardware chip (otherwise software encoding would probably kill your battery), and it looks like the MediaCodec class is exactly what you need. I suppose you want to implement some kind of live streaming service where latency is important; if so, you should stick to UDP-based transmission methods. Using the RTP protocol or the MPEG-TS container format would be the best choice for this purpose. Of course, you can also use TCP-based methods for streaming, like HLS or DASH (both of them use HTTP).
You should also take a look at Table 1, "Core media format and codec support", in the Android Supported Media Formats documentation. It tells us, for example, that the H.264 AVC encoder supports the MPEG-TS container, and that HLS version 3 is supported from Android 4.0 onward.
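To make the encoder side concrete, here is a hedged sketch of configuring a hardware H.264 encoder. It uses the NDK's AMediaCodec C API for brevity (available from API 21; on Android 4.x the Java MediaCodec class exposes the same calls); the resolution, bitrate, fps, and color format values are illustrative assumptions:

```c
/* Sketch: configure a hardware H.264 encoder via the NDK AMediaCodec
 * C API (API 21+; the Java MediaCodec API offers the same calls on 4.x).
 * Resolution, bitrate, fps and color format are illustrative values. */
#include <media/NdkMediaCodec.h>
#include <media/NdkMediaFormat.h>

AMediaCodec *create_h264_encoder(void) {
    AMediaCodec *enc = AMediaCodec_createEncoderByType("video/avc");

    AMediaFormat *fmt = AMediaFormat_new();
    AMediaFormat_setString(fmt, AMEDIAFORMAT_KEY_MIME, "video/avc");
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_WIDTH, 1280);
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_HEIGHT, 720);
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_BIT_RATE, 2000000);
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_FRAME_RATE, 30);
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_I_FRAME_INTERVAL, 1);
    /* 21 == COLOR_FormatYUV420SemiPlanar: raw camera frames are queued
     * into the encoder's input buffers in this layout. */
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_COLOR_FORMAT, 21);

    AMediaCodec_configure(enc, fmt, NULL, NULL,
                          AMEDIACODEC_CONFIGURE_FLAG_ENCODE);
    AMediaFormat_delete(fmt);
    AMediaCodec_start(enc);

    /* Draining AMediaCodec_dequeueOutputBuffer() then yields H.264
     * access units to hand to an RTP or MPEG-TS packetizer. */
    return enc;
}
```

The encoded access units you drain from the output side are what you would hand to an RTP or MPEG-TS packetizer.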

Decoding h264 raw stream on Android 2.3.3

I'm trying to decode a raw h264 stream on "older" Android versions.
I've tried the MediaPlayer class, but it does not seem to support the stream format.
I can see the stream on other Cam Viewer apps from the market, so I figure there must be a way to do it, probably using the NDK.
I've read about OpenMAX and Stagefright, but couldn't find a working example about streaming.
Could someone please point me in the right direction?
Also, I'm reading in several places about "frameworks/av/include/media/stagefright/MediaSource.h" and other sources, but they don't seem to be in either the regular SDK or the NDK.
Where is this source located? Is there another SDK?
Thanks in advance.
Update: I'm receiving the stream over an RTSP connection.
If you only wish to perform a simple experiment to verify certain functionality, you can consider employing the command line stagefright utility. Do keep in mind that your streaming input may not be supported.
If you wish to build a more comprehensive player pipeline, you can consider the handling for rtsp as in here or http as in here. Please note that the NuCachedSource2 implementation is essential for streaming input, as it provides a page cache implementation which acts as a jitter buffer for the streaming data.
Please do note one critical point: the command line stagefright utility doesn't render to the screen. Hence, if you wish to render, you will have to implement the complete playback pipeline with rendering.
On a related note, if your input is a streaming input, the standard player implementation does have support for streaming inputs, as can be observed here. Did you face any issues with the same?
As fadden has already pointed out, your work is made far simpler by the introduction of MediaCodec in Android 4.x.
You could use third-party libraries like android-h264-decoder, which uses JNI to increase performance! Also look at this library from Intel.
Update: MediaCodec wasn't exposed until API 16 (Android 4.1), so it won't work on a 2.3.3 device.
Stagefright and OpenMAX IL were (and still are) internal components of Android. You can find the code (including MediaSource.h) at https://android.googlesource.com/platform/frameworks/av/+/master. Note that the media framework moved to the separate frameworks/av tree only recently; before that it was part of frameworks/base, e.g. https://android.googlesource.com/platform/frameworks/base/+/gingerbread/media/

Android: Any way to reduce media delay/latency on VideoView/MediaPlayer?

I'm trying to build an Android application that takes in an RTSP stream and displays the audio/video. I've been using VideoView and still getting a delay between 3 and 10 seconds from real time. I need this delay to be under 3 seconds.
On a PC, I can see the same RTSP stream using VLC with only a 1-2 second delay. How can I replicate this on Android? Even when I use other apps like MoboPlayer/RockPlayer, the delay is still 3 to 10 seconds. (If it matters, I'm connecting to the RTSP stream wirelessly on both PC and Android)
I've started looking into using FFmpeg for Android as well as the GStreamer SDK for Android as alternatives to MediaPlayer, but they're both hard to work with for a novice like myself, and I'm running into multiple problems.
Any and all information would be greatly appreciated!
First, I think the delay comes from the initial buffer size, which is not controllable on Android's hardware MediaPlayer. This can be fixed by using a custom software media player based on FFmpeg (or its avconv fork) or GStreamer, but it will cost you performance (depending on the video stream's compression, resolution, and fps) and device battery life.
And of course, native development using FFmpeg or a similar C/C++ framework is complex and requires a lot of time and experience.
Another option: you can try the new Android 4.1 Java API (MediaCodec) to access the video and audio decoders, and build your own RTSP player using this API plus some code to load the video stream from the RTSP server (some Java library probably already exists).
Android 4.1 Multimedia
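For a feel of what such a player's decode loop looks like, here is a hedged sketch. It uses the NDK's AMediaCodec C API (API 21+), which mirrors the Android 4.1 Java MediaCodec API call for call; get_next_nal() and the output window are hypothetical placeholders for your RTSP depacketizer and rendering surface:

```c
/* Sketch: H.264 decode loop via the NDK AMediaCodec C API (API 21+).
 * The Android 4.1 Java MediaCodec class mirrors these calls
 * (dequeueInputBuffer / queueInputBuffer / dequeueOutputBuffer).
 * get_next_nal() and 'window' are hypothetical placeholders for the
 * RTSP depacketizer and the output surface. */
#include <stdbool.h>
#include <stdint.h>
#include <string.h>
#include <android/native_window.h>
#include <media/NdkMediaCodec.h>
#include <media/NdkMediaFormat.h>

/* Hypothetical: returns the next H.264 NAL unit from the RTSP session
 * (SPS/PPS must be fed to the decoder before the first frame). */
bool get_next_nal(uint8_t **nal, size_t *len, int64_t *pts_us);

void decode_loop(ANativeWindow *window) {
    AMediaCodec *codec = AMediaCodec_createDecoderByType("video/avc");
    AMediaFormat *fmt = AMediaFormat_new();
    AMediaFormat_setString(fmt, AMEDIAFORMAT_KEY_MIME, "video/avc");
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_WIDTH, 1920);
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_HEIGHT, 1080);
    AMediaCodec_configure(codec, fmt, window, NULL, 0);
    AMediaCodec_start(codec);

    uint8_t *nal; size_t len; int64_t pts;
    while (get_next_nal(&nal, &len, &pts)) {
        ssize_t in = AMediaCodec_dequeueInputBuffer(codec, 10000 /* us */);
        if (in >= 0) {
            size_t cap;
            uint8_t *buf = AMediaCodec_getInputBuffer(codec, (size_t) in, &cap);
            memcpy(buf, nal, len);
            AMediaCodec_queueInputBuffer(codec, (size_t) in, 0, len, pts, 0);
        }

        /* Render whatever is ready; a zero timeout keeps latency low. */
        AMediaCodecBufferInfo info;
        ssize_t out = AMediaCodec_dequeueOutputBuffer(codec, &info, 0);
        if (out >= 0)
            AMediaCodec_releaseOutputBuffer(codec, (size_t) out, true);
    }

    AMediaCodec_stop(codec);
    AMediaCodec_delete(codec);
    AMediaFormat_delete(fmt);
}
```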

Advice about streaming live video to Android/iOS/PC

I would like some advice about the best way to stream a video-only live stream from a server to:
Android (>4.0 is ok)
PC with web-browser
iOS
I would like to keep latency as low as 1/2 second.
I can use:
Flash: works on PC, but not on iOS and not on Android (works only on some tablets)
HLS: not good because of latency
Proprietary library: it should work, but I have to implement it everywhere
RTSP: works only on Android
Any other way? Is a proprietary library the way to go?
I'm working on Linux, but I'm mainly interested in "use this technology" and not "use this code".
Not sure, but you can try HTTP streaming of MP4/3GP formats using a web server. Both Android and iOS support HTTP streaming, but you need to implement progressive download.
Please specify on which OS you want to implement your server.
For Windows, you can use the following binary to relocate the moov atom to the beginning of the media file, enabling progressive download (FFmpeg can do the same with its -movflags +faststart option):
http://notboring.org/devblog/2009/07/qt-faststartexe-binary-for-windows/
Let us know your progress.
You can use ffserver (FFmpeg's streaming server) for live broadcast. It gives you various options, which you enable/disable in its configuration file, located at /etc/ffserver.conf.
You can find detailed documentation at
http://ffmpeg.org/ffserver.html
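For illustration, a minimal ffserver.conf fragment might look like the following; the feed/stream names and limits are placeholders, and note that ffserver has since been removed from newer FFmpeg releases, so this only applies to older builds:

```
# Sketch of /etc/ffserver.conf: one feed, one live stream.
# Names and limits are illustrative placeholders.
Port 8090
BindAddress 0.0.0.0

<Feed camera.ffm>            # ffmpeg pushes its input here
File /tmp/camera.ffm
FileMaxSize 64M
</Feed>

<Stream live.mpg>            # clients pull http://server:8090/live.mpg
Feed camera.ffm
Format mpeg
VideoFrameRate 25
VideoSize 640x480
NoAudio
</Stream>
```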
RTSP might be the way to go, but that half-second latency might be hard to get.
I guess for video only, and if you don't buffer at all, this may work for iOS anyway:
https://github.com/mooncatventures-group/FFPlayer-tests
Android supports RTSP natively, but its support is not very good.
You can compile FFmpeg for Android and write a simple player using OpenGL. I can't share the code because we did it for a client, but it's not too difficult.
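As a starting point, the decode side of such a player boils down to the standard libavformat/libavcodec loop. Below is a hedged sketch using the modern send/receive API (the RTSP URL is made up; uploading the decoded YUV planes as OpenGL textures is left out):

```c
/* Sketch: open an RTSP stream and decode video frames with FFmpeg
 * (modern 3.1+ send/receive API; most error handling omitted).
 * The URL is made up; each decoded AVFrame carries the YUV planes
 * you would upload as OpenGL textures. */
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

int main(void) {
    const char *url = "rtsp://192.168.1.10:8554/test"; /* made-up URL */
    AVFormatContext *ic = NULL;

    avformat_network_init();
    if (avformat_open_input(&ic, url, NULL, NULL) < 0)
        return 1;
    avformat_find_stream_info(ic, NULL);

    int vidx = av_find_best_stream(ic, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
    const AVCodec *dec =
        avcodec_find_decoder(ic->streams[vidx]->codecpar->codec_id);
    AVCodecContext *ctx = avcodec_alloc_context3(dec);
    avcodec_parameters_to_context(ctx, ic->streams[vidx]->codecpar);
    avcodec_open2(ctx, dec, NULL);

    AVPacket *pkt = av_packet_alloc();
    AVFrame *frame = av_frame_alloc();
    while (av_read_frame(ic, pkt) >= 0) {
        if (pkt->stream_index == vidx) {
            avcodec_send_packet(ctx, pkt);
            while (avcodec_receive_frame(ctx, frame) == 0) {
                /* frame->data[0..2]: Y/U/V planes -> OpenGL textures */
            }
        }
        av_packet_unref(pkt);
    }

    av_frame_free(&frame);
    av_packet_free(&pkt);
    avcodec_free_context(&ctx);
    avformat_close_input(&ic);
    return 0;
}
```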

Convert video Input Stream to RTMP

I want to stream a video recording from my Android phone to a network media server.
The first problem is that when setting the MediaRecorder output to a socket, the stream is missing some mdat size headers. This can be fixed by preprocessing the stream locally and adding the missing data to the stream in order to produce a valid output stream.
The question is how to proceed from there.
How can I output that stream as an RTMP stream?
First, let's unwind your question. As you've surmised, RTMP isn't currently supported by Android. You can use a few side libraries to add support, but these may not be full implementations, or they may have other undesirable side effects and bugs that cause them to fail to meet your needs.
The common alternative in this case is to use RTSP. It provides a comparable session format that has its own RFC, and its packet structure when combined with RTP is very similar (sans some details) to your desired protocol. You could perform the necessary fixups here to transmute RTP/RTSP into RTMP, but as mentioned, such effort is currently outside the development scope of your application.
So, let's assume you would like to use RTMP (invalidating this thread) and that the above-linked library does not meet your needs.
You could, for example, follow this tutorial for recording and playback using Livu, Wowza, and Adobe Flash Player, talking with the Livu developer(s) about licensing their client. Or, you could use this client library and its full Android recorder example to build your client.
To summarize:
RTSP
This thread, using Darwin Media Server, Windows Media Services, or VLC
RTMP
This library
This thread and this tutorial, using Livu, Wowza, and Adobe Flash Player
This client library and this example recorder
Best of luck with your application. I admit that I have a less than comprehensive understanding of all of these libraries, but these appear to be the standard solutions in this space at the time of this writing.
Edit:
According to the OP, walking the RTMP library set:
This library: He couldn't make the library demos work. More importantly, RTMP functionality is incomplete.
This thread and this tutorial, using Livu, Wowza, and Adobe Flash Player: This has a long tutorial on how to consume video, but its tutorial on publication is potentially terse and insufficient.
This client library and this example recorder: The given example only covers audio publication. More work is needed to make this complete.
In short: more work is needed. Other answers, and improvements upon these examples, are what's needed here.
If you are using a web browser on the Android device, you can use WebRTC for video capture and server-side recording, e.g. with Web Call Server 4.
Thus the full path would be:
Android Chrome [WebRTC] > WCS4 > recording
So you don't need RTMP protocol here.
If you are using a standalone RTMP app, you can use any RTMP server for video recording. As far as I know, Wowza supports H.264 + Speex recording.
