RTSP Stream into OpenCV - Android

For a school project my partner and I are trying to push video from an Android tablet (Nexus 7) to a server (as an IP webcam), pull from the server into OpenCV 2.4.6, process it, send it back to the server, and have the tablet display the feed in real or near-real time.
We aren't using OpenCV for Android because the goal is for a remote user to decide how to process the video (i.e. selecting a template to match or something from the stream).
Our question is this: we have managed to get the Android webcam stream onto a server as an H.264 RTSP stream. All the documentation for how to pull an RTSP stream is either outdated, really confusing, or altogether non-existent. We tried using a VideoCapture object and then tried using cvCreateFileCapture, but neither seems to work. How do we do this?

There is an open-source project that does precisely this combination:
Android generating the content from the front or rear camera
RTSP as the streaming protocol
H.264 as the video codec
https://code.google.com/p/spydroid-ipcamera/
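On the OpenCV side, pulling the stream is usually just a matter of handing the RTSP URL to VideoCapture, provided OpenCV was built with FFmpeg support. A minimal sketch, assuming a recent build whose Java bindings expose the URL overload (in 2.4.6 the C++ cv::VideoCapture accepts the same string); the endpoint is a placeholder:

    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.videoio.VideoCapture;

    public class RtspPull {
        public static void main(String[] args) {
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
            // Placeholder endpoint: use the RTSP URL your server publishes.
            VideoCapture cap = new VideoCapture("rtsp://server:8554/live/stream");
            if (!cap.isOpened()) {
                System.err.println("Could not open stream (is OpenCV built with FFmpeg?)");
                return;
            }
            Mat frame = new Mat();
            while (cap.read(frame)) {
                // process `frame` here (template matching, etc.) and push the
                // result back to the server however you choose
            }
            cap.release();
        }
    }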

Related

Android: Recording and Streaming at the same time

This is not really a question so much as a presentation of all my attempts to solve one of the most challenging pieces of functionality I have faced.
I use the libstreaming library to stream real-time video to a Wowza server, and I need to record it at the same time to the SD card. I am presenting all my attempts below in order to collect new ideas from the community.
Copy the bytes from the libstreaming stream to an MP4 file
Development
We created an interception in the libstreaming library to copy all the sent bytes to an MP4 file. Libstreaming sends the bytes to the Wowza server through a LocalSocket. It uses MediaRecorder to access the camera and the mic of the device and sets its output to the LocalSocket. What we do is wrap the LocalSocket's input stream in a class extending InputStream and create a file output stream inside it. So, every time libstreaming reads from the LocalSocket's input stream, we copy all the data to the output stream, trying to create a valid MP4 file.
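A minimal sketch of the idea, assuming the interception point hands us the LocalSocket's InputStream (class name and path handling are illustrative):

    import java.io.FileOutputStream;
    import java.io.FilterInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    // Tee wrapper: mirrors every byte libstreaming reads from the
    // LocalSocket's InputStream into a side file on the SD card.
    public class TeeInputStream extends FilterInputStream {
        private final FileOutputStream copy;

        public TeeInputStream(InputStream source, String copyPath) throws IOException {
            super(source);
            this.copy = new FileOutputStream(copyPath);
        }

        @Override
        public int read() throws IOException {
            int b = super.read();
            if (b >= 0) copy.write(b); // duplicate single-byte reads too
            return b;
        }

        @Override
        public int read(byte[] buffer, int offset, int count) throws IOException {
            int n = super.read(buffer, offset, count);
            if (n > 0) copy.write(buffer, offset, n); // duplicate what libstreaming just consumed
            return n;
        }

        @Override
        public void close() throws IOException {
            copy.close();
            super.close();
        }
    }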
Impediment
When we tried to read the file, it was corrupted. We realized that metadata was missing from the MP4 file, specifically the moov atom. We tried delaying the closing of the stream to give it time to send this header (this was still guesswork), but it didn't work. To test the coherence of the data, we used a paid piece of software to try to recover the video, including the header. It became playable, but it was mostly green screen, so this was not a trustworthy solution. We also tried "untrunc", a free open-source command-line program, and it couldn't even start the recovery, since there was no moov atom.
Use FFmpeg compiled for Android to access the camera
Development
FFmpeg has a Gradle plugin with a Java interface for using it inside Android apps. We thought we could access the camera via the command line (it is probably at "/dev/video0") and send it to the media server.
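A rough sketch of the invocation we had in mind, shown with a plain ProcessBuilder in place of the plugin's Java interface (device node, codec, and endpoint are illustrative):

    import java.io.IOException;

    public class CameraToRtmp {
        public static void main(String[] args) throws IOException, InterruptedException {
            // Illustrative only: grab the camera via V4L2 and push it to a
            // media server. On a non-rooted device ffmpeg fails with
            // "Permission Denied" when it tries to open /dev/video0.
            Process ffmpeg = new ProcessBuilder(
                    "ffmpeg",
                    "-f", "v4l2",        // Video4Linux2 capture input
                    "-i", "/dev/video0", // the device node we could not open
                    "-c:v", "libx264",
                    "-f", "flv",
                    "rtmp://example.com/live/stream") // placeholder endpoint
                    .inheritIO()
                    .start();
            ffmpeg.waitFor();
        }
    }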
Impediment
We got the error "Permission Denied" when trying to access the camera. The workaround would be to root the device, but that would make the phones lose their warranty and could brick them.
Use FFmpeg compiled for Android combined with MediaRecorder
Development
We tried to make FFmpeg stream an MP4 file while it was still being recorded on the phone via MediaRecorder.
Impediment
FFmpeg cannot stream an MP4 file whose recording is not yet finished: the moov atom is only written when MediaRecorder finalizes the file, so until then there is no index to read from.
Use FFmpeg compiled for Android with libstreaming
Development
Libstreaming uses a LocalServerSocket as the connection between the app and the server, so we thought we could point ffmpeg at the LocalServerSocket's local address and copy the stream directly to a file on the SD card. Right after the streaming started, we ran the ffmpeg command to start recording the data to a file. We believed ffmpeg would create the MP4 file the proper way, that is, with the moov atom included.
Impediment
The "address" created is not readable via command line, as a local address inside the phone. So the copy is not possible.
Use OpenCV
Development
OpenCV is an open-source, cross-platform library that provides building blocks for computer vision experiments and applications. It offers high-level interfaces for capturing, processing, and presenting image data. It has its own APIs to connect to the device camera, so we started studying it to see if it had the necessary functionality to stream and record at the same time.
Impediment
We found out that the library is not really designed for this; it is geared toward mathematical manipulation of images. We even got the recommendation to use libstreaming (which we already do).
Use Kickflip SDK
Development
Kickflip is a media streaming service that provides its own SDK for development on Android and iOS. It also uses HLS, a newer protocol, instead of RTMP.
Impediment
Their SDK requires that we create an Activity with a camera view that occupies the entire screen of the device, breaking the usability of our app.
Use Adobe Air
Development
We started consulting other developers of apps already available on the Play Store that stream to servers.
Impediment
Getting in touch with those developers, they confirmed that it would not be possible to record and stream at the same time using this technology. What's more, we would have to redo the entire app from scratch using Adobe Air.
UPDATE
WebRTC
Development
We started using WebRTC, following this great project. We included the signaling server in our Node.js server and started doing the standard handshake via socket. We were still toggling between local recording and streaming via WebRTC.
Impediment
WebRTC does not work in every network configuration. Beyond that, the camera acquisition is all native code, which makes it much harder to copy or intercept the bytes.
If you are willing to part with libstreaming, there is a library which can easily stream and record to a local file at the same time.
https://github.com/pedroSG94/rtmp-rtsp-stream-client-java
Clone the project and run the sample app. For example, tap "Default RTSP." Type in your endpoint. Tap "Start stream" then tap "Start record." Then tap "Stop Stream" and "Stop record." I've tested this with Wowza Server and it works well. The project can also be used as a library rather than a standalone app.
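When used as a library, the flow is roughly the sketch below. Class and method names follow the project's sample code at the time of writing; treat them as approximate, and the endpoint and path are placeholders:

    import android.view.SurfaceView;
    import com.pedro.rtplibrary.rtsp.RtspCamera1;
    import com.pedro.rtsp.utils.ConnectCheckerRtsp;

    // Approximate sketch: stream and record at the same time with
    // rtmp-rtsp-stream-client-java; check the repository for the current API.
    public class StreamAndRecord {
        private RtspCamera1 rtspCamera;

        public void start(SurfaceView surfaceView, ConnectCheckerRtsp checker) {
            rtspCamera = new RtspCamera1(surfaceView, checker);
            if (rtspCamera.prepareAudio() && rtspCamera.prepareVideo()) {
                rtspCamera.startStream("rtsp://example.com:554/live/stream"); // placeholder endpoint
                try {
                    rtspCamera.startRecord("/sdcard/stream-copy.mp4"); // local copy
                } catch (java.io.IOException e) {
                    // recording failed; the stream itself keeps running
                }
            }
        }

        public void stop() {
            rtspCamera.stopRecord();
            rtspCamera.stopStream();
        }
    }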
As soon as OpenCV 3.0 is available (the RC1 can be downloaded here), we could add another option to this list:
Using OpenCV's built in Motion-JPEG encoder
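A hedged sketch of that option: write every captured frame through a VideoWriter configured with the MJPG FourCC (shown with the 3.x Java videoio bindings; camera index, frame rate, and output path are illustrative, and MJPEG is typically wrapped in an AVI container):

    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.videoio.VideoCapture;
    import org.opencv.videoio.VideoWriter;

    public class MjpegRecorder {
        public static void main(String[] args) {
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
            VideoCapture cap = new VideoCapture(0); // illustrative camera index
            Mat frame = new Mat();
            if (!cap.read(frame)) return; // grab one frame to learn the size
            VideoWriter writer = new VideoWriter(
                    "recording.avi",                        // MJPEG usually lives in AVI
                    VideoWriter.fourcc('M', 'J', 'P', 'G'),
                    30.0,                                   // assumed frame rate
                    frame.size());
            do {
                writer.write(frame);
                // ...hand `frame` to the streaming path at the same time...
            } while (cap.read(frame));
            writer.release();
            cap.release();
        }
    }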

Livestreaming video from an Android device to a server using OpenCV in Android

My basic requirement is to stream live video from an Android device to a server. While researching this, I came across OpenCV 2.4.6. My question here is: is it possible to stream the video from the Android camera to a server using OpenCV 2.4.6 in Android? If so, can you please suggest how to go about it?
I don't know if you can use OpenCV for streaming video from an Android device to a PC. To my knowledge, OpenCV is an image-processing library that can be used to process images and video for face recognition, image compression, augmented reality, etc. My idea is to create a socket between the client device and the server PC and pass the video through that socket, just like what they did with the spydroid application.
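A hedged sketch of that socket idea on the Android side: give MediaRecorder the socket's file descriptor as its output "file", so the encoded frames flow straight to the server (host, port, and formats are illustrative; the server must parse the raw stream itself, and networking belongs on a background thread in a real app):

    import java.net.Socket;
    import android.hardware.Camera;
    import android.media.MediaRecorder;
    import android.os.ParcelFileDescriptor;

    // Illustrative only: stream the camera over a plain TCP socket.
    public class SocketStreamer {
        public void stream(Camera camera) throws Exception {
            Socket socket = new Socket("192.168.2.10", 5000); // placeholder server
            ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

            camera.unlock(); // hand the camera over to MediaRecorder
            MediaRecorder recorder = new MediaRecorder();
            recorder.setCamera(camera);
            recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            recorder.setOutputFile(pfd.getFileDescriptor()); // socket instead of a file
            recorder.prepare();
            recorder.start();
        }
    }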

Capture IP Camera Stream and Publish on my Website

I am using IP Cam (an app for Android phones) to stream live video. It functions as a basic IP camera and gives me a URL where the feed is. It looks like this: http://192.168.2.32:8080/ when I'm connected to my WiFi network. The video stream is at http://192.168.2.32:8080/videofeed.
I want to capture the camera's video feed at http://192.168.2.32:8080/videofeed and embed it in an HTML5 player (the one I plan to use is Video JS) or a Flash player if HTML5 doesn't work (I prefer HTML5, though). The HTML5 player asks me for a source file (such as .mp4, .webm, .mov, etc.), but at http://192.168.2.32:8080/videofeed there is no source file, just an HTML stream.
My question is how do I embed that video stream into my HTML5 player and post it on my website.
From what I can see, the app already streams to a webpage, and that page has the video for you to see. So you want to stream something that's already streaming.
You could try to sign up for a dynamic DNS like www.no-ip.com to get a URL for your network, configure your router to accept incoming connections on port 8080, and then use an iframe on your website pointing at the dynamic DNS URL.
I used to stream a lot of live concerts to websites, but I had a camera connected to a computer and using Adobe Flash Media Streaming (free) connected to a server running Wowza streaming.
You need to get the data the camera is capturing and decode it to some common format, RGB or YUV2 or whatever. Encode it to VP8/WebM or Theora/Ogg; H.264/MP4 won't do it, as MP4 needs a special header atom, the moov (unless it is fragmented MP4).
The client's video tag makes the request to your phone's IP; where you handle the HTTP GET for that request, hold the HTTP connection open and start streaming to it. This is sort of like long polling.
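A minimal sketch of that serving side, assuming some other process is appending encoded WebM to a local file (path and port are placeholders; a real implementation would pipe the encoder output straight into the response):

    import com.sun.net.httpserver.HttpServer;
    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;

    public class VideoTagServer {
        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/videofeed", exchange -> {
                exchange.getResponseHeaders().set("Content-Type", "video/webm");
                exchange.sendResponseHeaders(200, 0); // length 0 = chunked, fine for live
                try (InputStream in = new FileInputStream("/tmp/live.webm"); // placeholder source
                     OutputStream out = exchange.getResponseBody()) {
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        out.write(buf, 0, n); // hold the connection and keep pushing bytes
                        out.flush();
                    }
                }
            });
            server.start();
        }
    }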
Most IP cameras have a way to get the raw video stream, using RTSP or RTMP.
I suggest you get the stream URI for the «camera», which would be something like rtsp://<camera-ip>:<some-port> or rtmp://<camera-ip>:<some-port>. That is a common feature of IP cameras, even those emulated on a phone, so it is probably mentioned in the docs or can be enabled/set in the app configuration.
If there is no documentation, you can do some research by using Chrome to access the feed's webpage and opening the developer tools to see the actual code of the page; the URI may be visible in the embedded player they provide.
Once you get it, open that stream with VLC and check its properties (encoding, frame rate, size, etc.); with that you can choose a compatible embedded player for your site.
Hope it helps!
///Pablo

Possible to capture video and stream it to a Flash server in real time on Android?

I'm working on a project where the client side needs to capture video and audio from the camera, use some library (probably FFmpeg) to convert from MP4 to FLV, and send it to a Flash server in real time; on the other side, the client needs to get the FLV and convert it to an Android-playable video format in real time. Is it possible to do?
Thanks
If you use Adobe Air (with or without Flex), your client can access an Android device's camera directly. You can then use a NetConnection and NetStream to publish the camera feed directly to a Flash Media Server, without having to transcode the video.
It really can be as straightforward as this example shows :)

Mandatory to use Darwin, Wowza, or VLC to stream live video in Android?

I want to know: is it mandatory to use any of the streaming servers like Darwin, Wowza, or VLC to stream RTSP live video? I am receiving an RTSP link from my client, and it tends to change every time. I can successfully play it in the VLC player, but on the phone I can't see anything. I tried playing a sample link having a .3gp extension and it worked fine, but my links don't have an extension. They look like this: rtsp://122.166.229.151:1950/1346a0cf0ef7c2. Please help me. If it's compulsory to use an extension or a server, I will continue working in that direction.
A streaming server (as you describe) isn't strictly necessary - as long as you can pull RTSP from whatever your source is, you should be able to see it. Most IP cameras have onboard RTSP servers (although I wouldn't put too many connections on one). If you can see it in VLC, the phone should be able to consume it as well, given that the codec used to encode is one supported by the Android device (in most cases, if you're doing H.264 Baseline 3.0 with AAC, you should be good to go).
A streaming server like Wowza can make that stream available to a wider audience than pulling directly from the source device, but if you're not intending to broadcast to a wide audience, it's not required for streaming to Android devices.
Newer versions of Android (Gingerbread and later) are also able to consume Apple HTTP Live Streaming.
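For playback on the device itself, the stock media stack can usually open such a link directly; a minimal sketch using the URL from the question:

    import android.app.Activity;
    import android.net.Uri;
    import android.os.Bundle;
    import android.widget.VideoView;

    // Minimal sketch: Android's VideoView can play an RTSP link directly,
    // no file extension required.
    public class RtspPlayerActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            VideoView videoView = new VideoView(this);
            setContentView(videoView);
            videoView.setVideoURI(Uri.parse("rtsp://122.166.229.151:1950/1346a0cf0ef7c2"));
            videoView.start();
        }
    }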
