VideoView RTSP delay - android

I'm currently working on a project that involves RTSP streaming from an IP camera to an Android device.
The phone and the IP camera are connected to the same access point / router.
The problem is that the stream has a very big delay, around 5 seconds. For a stream watched over the Internet I assume that amount of buffering is fine, but for my kind of application it's just unacceptable: it isn't real time anymore, so it's useless... Of course, just to be sure the camera isn't the one to blame, I tested my IP camera's stream on my PC in VLC with the cache set to 0, and it works perfectly.
I didn't find any property of the VideoView class related to my problem, so I started looking through the OpenCORE sources hoping to find something I could modify to reduce the cache/buffer for RTSP. I tried to understand how they work, but they are very complicated and I didn't manage to.
Now I'm stuck at this point in my project and can't seem to find a way out... and the application's deadline is approaching very fast. :|
If anyone has any idea how to get this resolved, please help me, because I'm kind of desperate right now.

As stated here, the buffer size for the standard VideoView is hardcoded into the firmware, so you will have to use a custom library like Vitamio to get some control over the buffer size (more on that particular matter in their tutorial).

Set the buffer size to 1000 before you start playing:
mVideoView.setBufferSize(1000);
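
For context, a minimal sketch of how that could look inside an Activity, assuming Vitamio's io.vov.vitamio.widget.VideoView exposes setBufferSize as used above; the layout, view ID, and stream URL are placeholders, and Vitamio's native libraries still need to be initialized as shown in its demo project:

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import io.vov.vitamio.widget.VideoView;

public class LowLatencyPlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Vitamio's native libraries must also be initialized here, as in its demo project.
        setContentView(R.layout.activity_player);                            // placeholder layout

        VideoView mVideoView = (VideoView) findViewById(R.id.surface_view);  // placeholder view ID
        mVideoView.setVideoURI(Uri.parse("rtsp://192.168.1.10:554/live"));   // placeholder camera URL
        mVideoView.setBufferSize(1000);   // set the buffer before playback starts, per the answer above
        mVideoView.start();
    }
}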

Related

Android RTSP streaming buffer size for delayed playback

I can access and view RTSP streams from IP cameras on Android via the VideoView component without problems.
Now I need to play the RTSP stream with a delay (i.e. if I specify a 30-second delay, the playback on screen should be 30 seconds behind the source; the delay needs to be variable, though only at the point of connecting to the source, not during playback).
I originally thought this would not be a problem, as I could simply change the RTSP buffer duration before connecting to the camera, but unfortunately it seems the buffer size is baked into the firmware and cannot be changed in software. Now I have a horrible feeling that my way forward will be to compile a version of FFmpeg for Android, somehow get the stream data out of the library, buffer it, and then render it myself, and I have no experience with FFmpeg.
I am unsure how I would now go about solving this problem and any help or pointers in the right direction would be greatly appreciated.
Update:
Sorry, I forgot to mention: the RTSP stream is being accessed over WiFi on a LAN, so there are no huge latency issues from going over the Internet.

Decode RTP stream (MPEG4) on Android

Hi everybody, I hope you can help me out with this.
The problem:
I have an RTP stream which I'm multicasting on my private network (WiFi). I would like to use a number of Android tablets for displaying the stream. The number of tablets cannot be restricted and the quality should not degrade with an increasing number of clients. This explains why I need multicasting rather than unicasts.
The approach:
Theoretically, by creating an RTSP or HTTP stream on the server side I should be able to serve the video to my clients. However, my understanding is that the server would take a performance hit when too many clients connect at the same time, which I need to avoid. Ideally I would like all clients to simply be listening on the very same multicast. That way the number of clients would have no impact on server performance. [NOTE: The IP is local and TTL is set to 0/1, so there is no danger of clogging anything other than my own network with the multicast packets.]
The implementation
To implement the approach above I thought to write a multicast client in Android that receives the RTP packets and stitches the stream back together. I tried this with a JPEG payload and it works quite well. The problem with JPEG, however, is that the BitmapFactory.decodeByteArray call to decode each frame is very expensive (almost 100 ms!), which limits the frame rate considerably. The load on the network is also quite high, since JPEG is not an efficient format for video streaming.
What I would like to do is to do for video what I already did for pictures, i.e. stitch together the payload stream (e.g. MPEG-4) from the RTP packets and feed it to "something". Initially I thought VideoView would work with a raw input stream, but I was wrong; VideoView seems to work only with an RTSP or HTTP URL (correct?).
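For illustration only, a rough sketch of the receiving side described above, using the standard java.net.MulticastSocket; the group address and port are placeholders, the app needs the CHANGE_WIFI_MULTICAST_STATE and INTERNET permissions, and a WifiManager.MulticastLock usually has to be held or the WiFi driver may filter out multicast packets:

import android.content.Context;
import android.net.wifi.WifiManager;
import java.net.DatagramPacket;
import java.net.InetAddress;
import java.net.MulticastSocket;

class RtpMulticastReceiver {
    void receive(Context context) throws Exception {
        WifiManager wifi = (WifiManager) context.getSystemService(Context.WIFI_SERVICE);
        WifiManager.MulticastLock lock = wifi.createMulticastLock("rtp");
        lock.acquire();                                               // allow multicast reception on WiFi
        MulticastSocket socket = new MulticastSocket(5004);           // placeholder RTP port
        socket.joinGroup(InetAddress.getByName("239.0.0.1"));         // placeholder group address
        byte[] buf = new byte[2048];
        while (!Thread.currentThread().isInterrupted()) {
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            socket.receive(packet);
            // the fixed RTP header is 12 bytes; the payload to stitch together starts
            // at offset 12 of packet.getData() and runs to packet.getLength()
        }
        socket.close();
        lock.release();
    }
}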
Solution?
Now, what are my options? I'd like to avoid setting up an RTSP server from the raw RTP stream and serving all the tablets, for the reasons above. I looked around for two days and checked all the solutions proposed on SO and on the net, but nothing seemed to apply to my problem (an RTSP URL or a unicast was the solution in most cases, but I don't think I can use either), so I thought it was finally time to ask this question.
Any help is very appreciated!
cheers
After reading your post again, I picked up on something I missed the first time. I used BitmapFactory.decodeByteArray for MJPEG over HTTP from an Axis camera multicast. The call can be done in a few ms. The problem there is that it normally wants to make a new Bitmap every call. There is a way to make the Bitmap persist and that will get the times way down. I just can't remember the call offhand and my normal dev computer is currently being destroyed... err, 'upgraded' by our IT, so I can't tell you off the top of my head, but you should find it if you search a bit. I was able to get 30fps on a Xoom and Galaxy Tab 10.1 and some others no problem.
Mark Boettcher
mboettcher#ara.com
OK, I checked: I used the overloaded BitmapFactory.decodeByteArray with a reusable Bitmap passed via the inBitmap field of BitmapFactory.Options. There may have been something else I had to do for the Bitmap itself; I probably made it static at the very least. There may have been some other flags to set as well, but you should definitely have enough to go on now.
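
A hedged sketch of the reuse idea described above; the key pieces are BitmapFactory.Options.inMutable and inBitmap (available from API 11, and before KitKat the reused bitmap must match the frame size exactly), while the class and method names here are just placeholders:

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

class FrameDecoder {
    private final BitmapFactory.Options opts = new BitmapFactory.Options();
    private Bitmap reusable;                 // the previously decoded frame, reused for the next one

    FrameDecoder() {
        opts.inMutable = true;               // inBitmap reuse requires a mutable bitmap
    }

    Bitmap decode(byte[] jpeg) {
        opts.inBitmap = reusable;            // ask decodeByteArray to decode into this bitmap's memory
        Bitmap frame = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length, opts);
        if (frame != null) {
            reusable = frame;                // keep it around for the next call
        }
        return frame;
    }
}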
We had a problem trying to play MJPEG streaming over RTSP on Android. The multicast video server we had was not able to send MJPEG over HTTP, and we did not want to use H.264 over RTSP because of latency. The application was an ROV sending live video back to a Droid for display.
Just to save you a lot of trouble, if I understand the problem correctly, you simply cannot do it with anything in the Android SDK, like MediaPlayer, etc. In the end we got it working by paying a guy to do some custom code using MPlayer, ffmpeg and Live555.
Hope this helps.

receive and decode H.264 live stream in Android

I am evaluating the possibility of displaying a continuous H.264 live feed (RTSP) on an Android device (2.3+ or even 4.0). I need the delay (source to display; we can assume the source encoding adds zero delay) to be within 1 second or so. I wonder if anybody has done this already? What would be a good approach to achieving this?
Thanks in advance
Michael
Maybe you could just give it a try. H.264 is supported by Android and is even hardware-accelerated on the latest generation of upper- and middle-class devices. There is the MediaPlayer class, which can be used to play video files, and it also supports streaming from HTTP and RTSP URIs.
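
A minimal sketch of trying exactly that with the stock VideoView (which drives MediaPlayer internally); the camera URL and resource IDs are placeholders, and the end-to-end delay is still determined by the platform's internal buffering:

import android.app.Activity;
import android.media.MediaPlayer;
import android.net.Uri;
import android.os.Bundle;
import android.widget.VideoView;

public class RtspPlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_player);                               // placeholder layout

        final VideoView videoView = (VideoView) findViewById(R.id.video_view);  // placeholder view ID
        videoView.setVideoURI(Uri.parse("rtsp://192.168.1.10:554/h264"));       // placeholder camera URL
        videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                videoView.start();       // start as soon as the stream is prepared
            }
        });
    }
}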

Viewing MJPEG Video Streams

I know Android doesn't support MJPEG natively but are there any jar files/drivers available that can be added to a project to make it possible?
There is a View available to display MJPEG streams:
Android and MJPEG Topic
Hardly, unless it's your Android platform (i.e. you are the integrator of special-purpose devices running Android).
A good place to start looking on how the Android framework handles video streams is here:
http://opencore.net/files/opencore_framework_capabilities.pdf
If you want to cook up something entirely incompatible, I guess you could do that with the NDK, jam ffmpeg into there, and with a bit of luck (and a nightmare supporting different Android devices) you can have it working.
What is the root problem you are trying to solve, perhaps we could work something out.
You can of course write or port software to handle any documented video format; the problem is that you won't have the same degree of hardware-optimized code as the built-in video codecs, and you won't have as efficient low-level access to the framebuffer. So your code is likely not to be able to play back at full speed. Sometimes that might be okay, if you just want to get a sense of something. Also, MJPEG compresses frames individually, so it should be trivial to write something that just skips a lot of frames and only decodes whatever fraction of them it can keep up with.
I think some people have managed to build ffmpeg or mplayer using the optional features of the CPUs in some phones and get to full frame rate for some videos, but it's tricky and device-specific.
I'm probably stating the obvious here, but MJPEG consists simply of multiple JPEGs. If you can grab the frames by cutting the data apart at frame boundaries, you can probably get each frame displayed like any other image.
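To make the frame-cutting idea concrete, here is an illustrative sketch that splits an MJPEG byte stream on the JPEG start-of-image (0xFF 0xD8) and end-of-image (0xFF 0xD9) markers; a real reader would also honor the multipart boundary and Content-Length headers that cameras usually send, and the class name here is just a placeholder:

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

class MjpegFrameReader {
    private final InputStream in;

    MjpegFrameReader(InputStream in) {
        this.in = in;
    }

    // Returns the next complete JPEG frame, or null at end of stream.
    byte[] nextFrame() throws IOException {
        ByteArrayOutputStream frame = new ByteArrayOutputStream();
        int prev = -1;
        int cur;
        boolean inFrame = false;
        while ((cur = in.read()) != -1) {
            if (!inFrame) {
                if (prev == 0xFF && cur == 0xD8) {   // start-of-image marker
                    inFrame = true;
                    frame.write(0xFF);
                    frame.write(0xD8);
                }
            } else {
                frame.write(cur);
                if (prev == 0xFF && cur == 0xD9) {   // end-of-image marker
                    return frame.toByteArray();
                }
            }
            prev = cur;
        }
        return null;
    }
}

Each returned byte array could then be handed to BitmapFactory.decodeByteArray and drawn like any other image, as in the answer further up.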
I couldn't find any information on when exactly this was implemented, but as of now (testing on Android 8) you can view an MJPEG stream just fine using a WebView.
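
For reference, loading the stream in a WebView is just a URL load; the address below is a placeholder for the camera's MJPEG endpoint:

import android.app.Activity;
import android.os.Bundle;
import android.webkit.WebView;

public class MjpegWebViewActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        WebView webView = new WebView(this);
        setContentView(webView);
        webView.loadUrl("http://192.168.1.10/mjpeg");   // placeholder camera MJPEG endpoint
    }
}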

Does anyone have an example of how to stream the camera to a server on android?

I can find plenty of examples of streaming video down from a server to an Android device, but I actually want to stream live images from my droid to a server.
I know that Qik claims to do this. But as I am now reading the Wikipedia article more closely, it says that it doesn't really do live streaming for the iPhone or for Android.
For iPhone, it starts uploading after the recording has finished, and for Android, there are 15-20 seconds delay (according to Wikipedia).
So it seems that if even the Qik folks, who have experience with live streaming, can't do it, it's a very hard problem.
On the other hand, I have not tried Qik. Maybe you can install and test it, and do some traffic sniffing with Wireshark to see how they do it on the network level.
