Decode RTP stream (MPEG4) on Android

Hi everybody, I hope you can help me out with this.
The problem:
I have an RTP stream which I'm multicasting on my private network (WiFi). I would like to use a number of Android tablets to display the stream. The number of tablets cannot be restricted and the quality should not degrade with an increasing number of clients. This explains why I need multicasting rather than unicasts.
The approach:
Theoretically, by creating an RTSP or HTTP stream on the server side, I should be able to serve the video to my clients. However, my understanding is that the server would take a performance hit when too many clients connect at the same time, which I need to avoid. Ideally I would like all clients to simply be listening on the very same multicast; that way the number of clients would have no impact on server performance. [NOTE: The IP is local and the TTL is set to 0/1, so there is no danger of clogging anything other than my own network with the multicast packets.]
The implementation
To implement the approach above I thought to write a multicast client in Android that receives the RTP packets and stitches the stream back together. I tried this with JPEG payloads and it works quite well. The problem with JPEG, however, is that the BitmapFactory.decodeByteArray call to decode each frame is very expensive (almost 100 ms!), which limits the frame rate considerably. The load on the network is also quite high, since JPEG is not an efficient format for video streaming.
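A minimal sketch of such a receiver (the group address and port here are placeholders; on Android you also need to hold a WifiManager.MulticastLock and declare the CHANGE_WIFI_MULTICAST_STATE permission, since many devices filter multicast packets otherwise):

```java
import java.net.DatagramPacket;
import java.net.InetAddress;
import java.net.MulticastSocket;

// Sketch of a multicast RTP receiver. Group address and port are placeholders.
// On Android, acquire a WifiManager.MulticastLock before receiving, or the
// driver may silently drop multicast traffic.
class RtpMulticastReceiver {
    void run() throws Exception {
        MulticastSocket socket = new MulticastSocket(5004);
        socket.joinGroup(InetAddress.getByName("239.0.0.1"));
        byte[] buf = new byte[2048]; // larger than any expected RTP packet
        while (true) {
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            socket.receive(packet);
            // Skip the 12-byte fixed RTP header (plus any CSRC entries), then
            // reassemble the payload: sequence numbers order the packets and
            // the marker bit delimits frames.
            handlePayload(buf, 12, packet.getLength() - 12);
        }
    }

    void handlePayload(byte[] data, int offset, int length) { /* reassembly */ }
}
```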
What I would like to do is to do for video what I already did for pictures, i.e. stitch the payload stream (e.g. MPEG4) back together from the RTP packets and feed it to "something". Initially I thought VideoView would work with a raw input stream, but I was wrong; VideoView seems to work only with an RTSP or HTTP URL (correct?).
Solution?
Now, what are my options? I'd like to avoid setting up an RTSP server on top of the raw RTP stream and serving all tablets from it, for the reasons above. I did look around for 2 days and checked all the solutions proposed on SO and on the net, but nothing seemed to apply to my problem (an RTSP URL or a unicast was the solution in most cases, but I don't think I can use either), so I thought it was finally time to ask this question.
Any help is very appreciated!
cheers

After reading your post again, I picked up on something I missed the first time. I used BitmapFactory.decodeByteArray for MJPEG over HTTP from an Axis camera multicast. The call can be done in a few ms. The problem is that it normally wants to allocate a new Bitmap on every call. There is a way to make the Bitmap persist, and that will get the times way down. I just can't remember the call offhand, and my normal dev computer is currently being destroyed... err, 'upgraded' by our IT, so I can't tell you off the top of my head, but you should find it if you search a bit. I was able to get 30 fps on a Xoom and a Galaxy Tab 10.1 and some others, no problem.
Mark Boettcher
mboettcher#ara.com

OK, I checked: I used the overload of BitmapFactory.decodeByteArray that takes a BitmapFactory.Options, with the inBitmap field set so that a mutable Bitmap is reused across calls. There may have been something else I had to do for the Bitmap itself; I probably made it static at the very least. There may have been some other flags to set as well, but you should definitely have enough to go on now.
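A minimal sketch of that reuse pattern (the class and field names are mine; note that pre-KitKat the reused bitmap must match the decoded frame dimensions exactly):

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

// Reusing one bitmap through BitmapFactory.Options.inBitmap (API 11+) avoids a
// fresh allocation per frame, which is what makes repeated decodeByteArray
// calls fast. Assumes each frame arrives as a complete JPEG byte array.
class FrameDecoder {
    private final BitmapFactory.Options opts = new BitmapFactory.Options();
    private Bitmap reusable; // decoded into on every call after the first

    FrameDecoder() {
        opts.inMutable = true; // inBitmap requires a mutable target
    }

    Bitmap decode(byte[] jpeg) {
        opts.inBitmap = reusable; // null on the first call is allowed
        Bitmap frame = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length, opts);
        reusable = frame;         // keep it for the next frame
        return frame;
    }
}
```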

We had a problem trying to play MJPEG streaming over RTSP on Android. The multicast video server we had was not able to send MJPEG over HTTP, and we did not want to use H.264 over RTSP because of latency. The application was a ROV sending live video back to a Droid for display.
Just to save you a lot of trouble: if I understand the problem correctly, you simply cannot do it with anything in the Android SDK, like MediaPlayer, etc. In the end we got it working by paying a guy to write some custom code using MPlayer, ffmpeg and Live555.
Hope this helps.

Related

How to sequentially play chunks of videos in ExoPlayer

I'm trying to develop an Android app that reads a video coming from either the server or other peers that have the video. For this use case I have to split my videos into smaller pieces to optimize transfer times, and each piece can be provided by either the central server or another peer.
I would like to know if ExoPlayer is able to read a sequence of video pieces without interruption?
I am free to do whatever I want in the splitting process, e.g. split the video with the Linux split command.
Most adaptive bitrate streaming formats work something like your description: they split the video into multiple chunks and the video player requests them one at a time. Examples of adaptive bitrate streaming protocols are HLS, MPEG-DASH and SmoothStreaming.
It is possible to have the URL for the next 'chunk' of video route to a 'central' server which could proxy the request to another 'peer', if this would meet your needs.
It's worth noting that many videos are delivered via CDNs, which might interfere with your desired approach (or alternatively might actually match what you want, depending on what your underlying requirements are), so you may want to check this also.
Update
Assuming you mean that some chunks will come from the server and some chunks from peer devices on the network, then the above server proxy method would not work.
One way you could do this would be to have all the chunks delivered to the device from whatever source is best for each chunk, then put them together as required on the device and stream the result from 'localhost' on your device to the player (see the sketch below).
This sounds like a huge amount of overhead and something that would never work, but I believe it is actually a technique used in some apps to convert from one streaming format to another (I can't provide an example, sorry...).
One example of a 'localhost' server on Android that might be useful to look at is:
http://www.laptopmag.com/articles/android-web-server
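As a very rough illustration of that 'localhost' idea, a sketch of a minimal local server that plays already-downloaded chunks back to the player as one continuous HTTP response (the chunk names and port are placeholders; a real server would parse the request and support range headers):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Serves a fixed list of downloaded chunks, concatenated, to a single client.
// A real implementation would handle multiple requests, Content-Length, and
// Range headers so the player can seek.
public class LocalChunkServer {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(8090)) {
            try (Socket client = server.accept();
                 OutputStream out = client.getOutputStream()) {
                out.write("HTTP/1.1 200 OK\r\nContent-Type: video/mp2t\r\n\r\n"
                        .getBytes("US-ASCII"));
                // Stream each chunk in order, wherever it originally came from.
                for (String name : new String[]{"chunk0.ts", "chunk1.ts", "chunk2.ts"}) {
                    try (FileInputStream in = new FileInputStream(new File(name))) {
                        byte[] buf = new byte[8192];
                        int n;
                        while ((n = in.read(buf)) != -1) {
                            out.write(buf, 0, n);
                        }
                    }
                }
            }
        }
    }
}
```

The player on the device would then simply be pointed at http://127.0.0.1:8090/.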
An alternative: if you were to use HTML5 inside a web page on the device, you could use the Media Source Extensions (MSE) mechanism to load the video chunks from the different sources before passing them to the player. This does require Chrome at this point rather than the stock Android browser, as the latter does not support MSE at the time of writing.
In all these approaches you obviously need to make sure you load enough in advance to keep the video pipeline and buffer full, to avoid pauses.

Android: how to show a stream from an IP camera in a SurfaceView

I am trying to develop a simple application that shows the video stream from an IP camera in a SurfaceView.
I am totally new to video decoding/encoding. In the last few days I have read a lot of information about the MediaCodec API and how to use it, but I cannot find the right way. I still have to fully understand how the buffers work, and how to depacketize the RTP packets from UDP and pass each frame to MediaCodec.
I have a couple of Sony EP521 IP cameras. From the CGI Command Manual I gather that the cameras support an MPEG-4/H.264 HTTP bit stream ("GET /h264...": the camera will send H.264 raw data as its response) or an RTP (UDP) bit stream.
My problem is that I do not know where to start:
Which is the "best" way to implement this? (By "best" I mean the most reliable/correct but still reasonably easy way.)
Should I use HTTP bit stream or RTP?
Is MediaCodec strictly needed, or can I implement this in another way? (E.g. does the android.media.MediaPlayer class already support H.264 raw data over RTP? I do not know whether it actually does or not.)
How can I extract the video data from an HTTP bit stream?
I know that there are a lot of similar questions, but none seems to fully answer my doubts.
The camera also supports MJPEG. That would be easier to implement, but for the moment I do not want to use MJPEG encoding.
Here the Camera CGI manual: http://wikisend.com/download/740040/G5%20Camera%20CGI%20manual.pdf
Thank you, and sorry if this has already been discussed.
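For orientation, a rough sketch of how a MediaCodec H.264 decoder is typically driven once you have complete access units (e.g. NAL units reassembled from the RTP payload per RFC 6184). nextAccessUnit() and nextPtsUs() are placeholders for your own depacketizer, not SDK calls:

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.nio.ByteBuffer;

// Minimal MediaCodec H.264 decode loop. The Surface from the SurfaceView is
// passed in configure(), so releaseOutputBuffer(..., true) renders directly.
class H264Player {
    void play(Surface surface, int width, int height) throws Exception {
        MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        decoder.configure(format, surface, null, 0);
        decoder.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) {
            int inIndex = decoder.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                // getInputBuffer(int) is API 21+; use getInputBuffers() earlier.
                ByteBuffer in = decoder.getInputBuffer(inIndex);
                byte[] unit = nextAccessUnit(); // placeholder: your depacketizer
                in.clear();
                in.put(unit);
                decoder.queueInputBuffer(inIndex, 0, unit.length, nextPtsUs(), 0);
            }
            int outIndex = decoder.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                decoder.releaseOutputBuffer(outIndex, true); // render to surface
            }
        }
    }

    private byte[] nextAccessUnit() { /* reassemble from RTP */ return new byte[0]; }
    private long nextPtsUs() { return 0; }
}
```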

Transfer real-time video stream to server using Android

We have to capture real-time video using the Android camera and send it to the server; other users would then watch it through a browser or something else.
I have Googled and searched on SO, and there are some examples of video streaming apps, like:
1. Android-eye: https://github.com/Teaonly/android-eye
2. Spydroid-ipcamera: https://code.google.com/p/spydroid-ipcamera/
However, it seems that they target a different environment: most of these apps start an HTTP server for stream requests, and the client then visits the page through the local network and sees the video.
In those cases the video stream source and the server are both on the device, like this:
But we need Internet support, like this:
So I wonder if there are any alternative ideas.
I can see you have designed the three stages correctly, in your second diagram.
So what you need is to determine how to choose among these protocols and how to interface them.
No one can give you a complete solution, but having completed an enterprise project on Android video streaming, I will try to point you in the right direction.
There are three parts in your picture; I'll elaborate from left to right:
1. Android Streamer Device
Based on my experience, I can say Android does well sending camera streams over RTP, due to native support, while converting your video to FLV gives you headaches (in many cases, e.g. if later you want to deliver the stream to other Android devices).
So I would suggest building on something like spyDroid.
2. Streaming Server
There are tools like Wowza Server which can take a source stream and put it on the output of the server for other clients. I guess VLC can do this too, via the File --> Stream menu and then entering the RTSP video stream address from your spyDroid-based app. But I have not tried it personally.
Also, it is not hard to implement your own streaming server.
I'll give you an example:
For an implementation of an HLS server, you just need three things:
Video files, segmented into 10-second MPEG-TS chunks (i.e. .ts files).
An M3U8 playlist of the chunks (see the example below).
A web server with a simple web service that delivers the playlist to the clients (PC, Android, iPhone, nearly every device) over HTTP. The clients will then look up the playlist file and ask for the appropriate chunks at the right times, since nearly all players have built-in HLS support.
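For illustration, a minimal playlist of that kind might look like this (segment names and durations are placeholders; a live playlist would be refreshed periodically and would omit #EXT-X-ENDLIST):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
#EXT-X-ENDLIST
```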
3. The Client-Side
Based on our comments, I suggest you might want to dig deeper into Android Video Streaming.
To complete a project this big, you need much more research. For example, you should be able to distinguish RTP from RTSP and understand how they relate to each other.
Read my answer here to get a sense of state-of-the-art Video Streaming and please feel free to ask for more.
Hope you got the big picture of the journey ahead,
Good Luck and Have Fun
Quite a general question, but I will try to give you a direction for research:
First of all, you will need to answer several questions:
1) What is the nature and purpose of the video stream? Is it a security application, where detail in stills is vital (then you will have to use something like an MJPEG codec), or will it be viewed only in motion?
2) Are the stream source, server and clients on the same network, so that RTSP might be used for more exact timing, or will a WAN be involved, so that something more stable like HTTP should be used?
3) What is the number of simultaneous output connections? In other words, is it worth paying for something like Wowza with the transcoding add-on (and maybe nDVR too) or Flussonic, or will a simple solution like ffserver suffice?
To cut a long story short, for a quick-and-dirty solution for a couple of viewers, you may use something like IP Webcam -> ffserver -> VLC for Android and avoid writing your own software.
You can handle it this way:
Prepare the camera preview in the way described here. The Camera object has a setPreviewCallback method with which you register the preview callback. This callback provides a data buffer (byte array) in YUV format that you can stream to your server.
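A minimal sketch of that callback wiring, using the android.hardware.Camera API of that era (sendToServer() is a placeholder for your own encoding and network code, not an SDK call):

```java
import android.hardware.Camera;

// Each onPreviewFrame call delivers one raw frame in YUV (NV21 by default).
// In a real app you must also set a preview Surface/SurfaceTexture via
// setPreviewDisplay()/setPreviewTexture() before startPreview().
class PreviewStreamer {
    void start() {
        Camera camera = Camera.open();
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera cam) {
                sendToServer(data); // placeholder: encode (e.g. MediaCodec) and transmit
            }
        });
        camera.startPreview(); // the callback only fires while preview is running
    }

    private void sendToServer(byte[] yuvFrame) { /* your network code */ }
}
```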

How to do time-shifting of live audio stream on Android?

I'm working on a radio Android app in which I'd like to have options to rewind, fast-forward, and jump back to live in the audio stream.
It seems that this is not possible with MediaPlayer (I can't find any method to do it), so how can I do it?
The developer of the iOS version of the app is using the RadioKit SDK. Is there anything similar for Android?
I found this link that goes over some of the reasons why HTTP streaming isn't well supported on Android. You can write your own HTTP streaming client and insert it as a proxy between the MediaPlayer and the media source, but as far as I am aware that is the only way. As for trick modes, there is no real fast-forward or rewind protocol built into HTTP streaming; you simply have to request the correct byte offset from the server (see here for a little more info). The good news is that it should be much easier to estimate the byte to request for a given time position for audio than for video (I've seen some pretty ridiculous algorithms for video).
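A sketch of that byte-offset seek, assuming a constant-bitrate stream so a time position maps linearly to a byte offset (the URL and bitrate are placeholders):

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Ask the server to start the stream at the byte corresponding to a given
// time position. Works only if the server honors Range requests (it then
// replies 206 Partial Content).
class HttpSeek {
    static InputStream openAt(String url, long seconds, long bytesPerSecond)
            throws Exception {
        long offset = seconds * bytesPerSecond; // CBR estimate: time -> byte offset
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestProperty("Range", "bytes=" + offset + "-");
        conn.connect();
        return conn.getInputStream();
    }
}
```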

VideoView RTSP delay

I'm currently working on a project which involves the rtsp streaming from an ip camera to an android device.
The phone and ip camera are connected to the same access point / router.
The problem is that the stream has a very big delay, ~5 seconds. If you watch a stream from the Internet I assume that is fine for buffering, but for my kind of application it is just unacceptable: it is not real time anymore, so it's useless... Of course, just to be sure that the camera is not the one to blame, I tested my IP cam's stream on my PC in VLC with the cache set to 0, and it works perfectly.
I didn't find any property of the VideoView class relevant to my problem, so I started looking in the OpenCORE sources hoping to find something I could modify to reduce the RTSP cache/buffer. I tried to understand how they work, but they are very complicated and I didn't manage to.
Now I'm stuck at this point in my project and can't seem to find a way out... and the application's deadline is coming up very fast. :|
Anyone who has any idea how to get this resolved, please help me, because I'm kind of desperate right now.
As stated here, the buffer size for the standard VideoView is hardcoded in the firmware, so you will have to use a custom library like Vitamio to have some control over the buffer size (more on that particular matter in their tutorial).
Set the buffer size before you start playing:

```java
mVideoView.setBufferSize(1000);
```
