Android RTSP to v4l2 device

I have created an application that takes an RTSP stream and displays it in a SurfaceView. This is straightforward enough. However, I was wondering whether it is possible to take an RTSP stream on Android and use it to create a v4l2 device, at a location such as /dev/video0, in the same way that connecting a webcam does.
Can this be done using a compiled version of FFmpeg for Android, or is there perhaps a simpler method?
Any direction on this would be greatly appreciated!
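For reference, on desktop Linux this mapping is usually done by loading the v4l2loopback kernel module and pointing FFmpeg's v4l2 output at the loopback node. A minimal sketch, assuming the module is available (stock Android kernels generally do not ship it, so on a device this would need a custom kernel and root; the RTSP URL is a placeholder):

modprobe v4l2loopback
ffmpeg -i rtsp://camera-host:554/stream -f v4l2 -pix_fmt yuv420p /dev/video0

These two steps are the standard webcam-emulation recipe the question describes.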

Related

Android and Nginx RTMP Module solution

Folks,
I have an Android app that streams video to a Wowza server. Right now, I am using libstreaming (https://github.com/fyhertz/libstreaming) in the Android app to livestream audio and video to Wowza.
It works fine, but I am building an open-source solution and I would like to stop using Wowza (since it is a paid product) and start using nginx-rtmp-module (https://github.com/arut/nginx-rtmp-module). The problem is that libstreaming does not speak the RTMP protocol, and, as much as I have researched, I still couldn't find a good solution on the Android side for livestreaming to nginx.
Does anybody know a solution for this? Has anybody already implemented it? Thanks in advance!
You can probably use FFmpeg on the server side to convert the RTP stream into RTMP,
e.g. by piping the UDP input into FFmpeg.
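A minimal sketch of such a server-side relay, assuming the phone publishes RTSP/RTP and nginx-rtmp-module listens on its default port (all URLs are placeholders):

ffmpeg -i rtsp://phone-ip:8086/stream -c copy -f flv rtmp://localhost/live/stream

If the codecs are already H.264/AAC, -c copy avoids re-encoding; FLV is the container RTMP expects.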

RTSP Stream into OpenCV

For a school project, my partner and I are trying to push video from an Android tablet (Nexus 7) to a server (as an IP webcam), pull it from the server into OpenCV 2.4.6, process it, send it back to a server, and have the tablet display the feed in real or near-real time.
We aren't using OpenCV for Android because the goal is for a remote user to decide how to process the video (i.e. selecting a template to match, or something similar, from the stream).
Our question is this: we have managed to get the Android webcam stream onto a server as an H.264 RTSP stream, but all the documentation for how to pull an RTSP stream back out is either outdated, really confusing, or altogether non-existent. We tried using a VideoCapture object and then tried cvCreateFileCapture, but neither seems to be working. How do we do this?
There is an open-source project that does precisely this combination:
Android generating the content from the front or rear camera
RTSP protocol for streaming
H.264
https://code.google.com/p/spydroid-ipcamera/
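On the OpenCV side, pulling the stream is usually just a matter of handing the RTSP URL to VideoCapture, provided the build has FFmpeg support. A minimal sketch in Java; note this assumes OpenCV 3.x+ bindings, since the 2.4 Java API did not expose a VideoCapture(String) constructor (under 2.4 the equivalent is the C++ cv::VideoCapture), and the URL is a placeholder:

import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.videoio.VideoCapture;

public class RtspGrab {
    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME); // native OpenCV must be on java.library.path
        VideoCapture cap = new VideoCapture("rtsp://server:8554/stream");
        if (!cap.isOpened()) {
            System.err.println("Failed to open stream (check FFmpeg support in the build)");
            return;
        }
        Mat frame = new Mat();
        while (cap.read(frame)) {
            // process the frame here (template matching, etc.)
        }
        cap.release();
    }
}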

Advice about streaming live video to Android/iOS/PC

I would like some advice about the best way to stream a video-only live stream from a server to:
Android (>4.0 is OK)
PC with web browser
iOS
I would like to keep latency as low as 1/2 second.
I can use:
Flash: works on PC, but not on iOS and not on Android (works only on some tablets)
HLS: not good because of latency
proprietary library: it should work, but I would have to implement it everywhere
RTSP: works only on Android
Any other way? Is a proprietary library the way to go?
I'm working on Linux, but I'm mainly interested in "use this technology" and not "use this code".
Not sure, but you can try HTTP streaming of MP4/3GP formats using a web server. Both Android and iOS support HTTP streaming, but you need to implement progressive download.
Please specify which OS you want to implement your server on.
For Windows, you can use the following binary to relocate the moov atom to the beginning of the media file, which enables progressive download:
http://notboring.org/devblog/2009/07/qt-faststartexe-binary-for-windows/
Let us know your progress.
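For reference, the same moov relocation can be done with the qt-faststart tool that ships in the FFmpeg source tree, or with FFmpeg itself; a sketch with placeholder file names:

qt-faststart input.mp4 output.mp4
ffmpeg -i input.mp4 -c copy -movflags faststart output.mp4

Either way the encoded media is untouched; only the atom order changes.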
You can run ffserver, the FFmpeg streaming server, for live broadcast. It gives you various options; enable/disable them in its configuration file, located at /etc/ffserver.conf.
Detailed documentation is available at
http://ffmpeg.org/ffserver.html
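A minimal sketch of what an ffserver.conf for a live FLV stream might look like (directive names follow the ffserver documentation of that era; the feed and stream names are placeholders):

Port 8090
BindAddress 0.0.0.0
MaxHTTPConnections 200
MaxClients 100

<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 200K
</Feed>

<Stream live.flv>
Feed feed1.ffm
Format flv
VideoFrameRate 25
VideoSize 640x480
NoAudio
</Stream>

An encoder then publishes into the feed, e.g. ffmpeg -i <source> http://localhost:8090/feed1.ffm.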
RTSP might be the way to go, but that 1/2-second latency might be hard to get.
I guess for video only, and if you don't buffer at all, this may work for iOS anyway:
https://github.com/mooncatventures-group/FFPlayer-tests
Android supports RTSP, but it's not very good.
You can compile FFmpeg for Android and write a simple player using OpenGL. I can't share the code because we did it for a client, but it's not too difficult.

Audio/Video Conferencing Application in Android

I have to develop an Android application for audio/video conferencing. What is the most efficient way of implementing this? During my research, I came across Android's SIP API. Can it be used for implementing audio as well as video conferencing? And if yes, what should I use to stream the video in real time? Should I use an RTSP library for this?
Please guide me.
Thanks,
Rupesh
OK, for my practical project I used Spydroid, which uses the RTSP protocol without SDP negotiation. You can customize it for audio-only use. I would prefer Spydroid because it is pure Java: it reads the camera packets, writes them to a Linux socket, and the RTSP server reads them back from there.
On the other hand, if I am not wrong, SIP stacks use C/C++ code too.
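A minimal sketch of the local-socket trick described above, as used by Spydroid/libstreaming: MediaRecorder writes its output into one end of a LocalSocket pair, and the streaming code reads the other end and packetizes it (the socket name is a placeholder; MP4/3GP box parsing is omitted):

import android.media.MediaRecorder;
import android.net.LocalServerSocket;
import android.net.LocalSocket;
import android.net.LocalSocketAddress;
import java.io.IOException;
import java.io.InputStream;

InputStream startRecorderToSocket() throws IOException {
    LocalServerSocket server = new LocalServerSocket("camera-pipe");
    LocalSocket receiver = new LocalSocket();
    receiver.connect(new LocalSocketAddress("camera-pipe"));
    LocalSocket sender = server.accept();

    MediaRecorder recorder = new MediaRecorder();
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.setOutputFile(sender.getFileDescriptor()); // record into the socket, not a file
    recorder.prepare();
    recorder.start();

    return receiver.getInputStream(); // read, strip the container, feed the RTSP server
}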

Streaming video from Android camera to server [closed]

I've seen plenty of info about how to stream video from the server to an Android device, but not much about the other way, à la Qik. Could someone point me in the right direction here, or give me some advice on how to approach this?
I have hosted an open-source project that enables an Android phone to act as an IP camera:
http://code.google.com/p/ipcamera-for-android
Raw video data is fetched from a LocalSocket, and the MDAT/MOOV atoms of the MP4 are checked first before streaming. The live video is packed in FLV format and can be played in a Flash video player via a built-in web server :)
Took me some time, but I finally managed to make an app that does just that. Check out the Google Code page if you're interested: http://code.google.com/p/spydroid-ipcamera/
I added loads of comments in my code (mainly, look at CameraStreamer.java), so it should be pretty self-explanatory.
The hard part was actually to understand RFC 3984 and implement a proper algorithm for the packetization process. (This algorithm turns the MPEG-4/H.264 stream produced by the MediaRecorder into a nice RTP stream, according to the RFC.)
Bye
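For the curious, the core of that RFC 3984 packetization is FU-A fragmentation: NAL units that fit in one packet are sent as-is, and larger ones are split with a two-byte FU indicator/header. A minimal sketch in Java (sendRtpPacket is a hypothetical helper; RTP sequence numbers, timestamps, and the marker bit are omitted):

static final int MAX_PAYLOAD = 1400; // stay under a typical MTU

void packetize(byte[] nalu) {
    if (nalu.length <= MAX_PAYLOAD) {
        sendRtpPacket(nalu, 0, nalu.length); // single NAL unit packet
        return;
    }
    byte fuIndicator = (byte) ((nalu[0] & 0xE0) | 28); // keep F+NRI bits, type 28 = FU-A
    byte fuHeader = (byte) (nalu[0] & 0x1F);           // original NAL type
    int offset = 1;                                    // skip the NAL header byte
    while (offset < nalu.length) {
        int chunk = Math.min(MAX_PAYLOAD - 2, nalu.length - offset);
        byte[] pkt = new byte[chunk + 2];
        pkt[0] = fuIndicator;
        pkt[1] = fuHeader;
        if (offset == 1) pkt[1] |= 0x80;                   // S bit: first fragment
        if (offset + chunk == nalu.length) pkt[1] |= 0x40; // E bit: last fragment
        System.arraycopy(nalu, offset, pkt, 2, chunk);
        sendRtpPacket(pkt, 0, pkt.length);
        offset += chunk;
    }
}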
I'm looking into this as well, and while I don't have a good solution for you I did manage to dig up SIPDroid's video code:
http://code.google.com/p/sipdroid/source/browse/trunk/src/org/sipdroid/sipua/ui/VideoCamera.java
I've built an open-source SDK called Kickflip to make streaming video from Android a painless experience.
The SDK demonstrates use of Android 4.3's MediaCodec API to direct the device hardware encoder's packets directly to FFmpeg for RTMP (via librtmp) or HLS streaming of H.264/AAC. It also demonstrates real-time OpenGL effects (titling, chroma key, fades) and background recording.
Thanks SO, and especially, fadden.
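As background for the MediaCodec approach: the hardware encoder is configured for H.264, frames are fed in via an input Surface, and the encoded packets are drained to whatever transport you use. A minimal configuration sketch (API 18+; the resolution and bitrate are placeholder values, and the drain loop / FFmpeg hand-off is omitted):

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

Surface startEncoder() throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2); // keyframe every 2 s
    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    Surface input = encoder.createInputSurface(); // render camera/GL frames here
    encoder.start();
    return input;
}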
Here is a complete article about streaming Android camera video to a web page:
Android Streaming Live Camera Video to Web Page
libstreaming is used in the Android app
On the server side, Wowza Media Engine is used to decode the video stream
Finally, JW Player is used to play the video on the web page.
I am able to send live camera video from the mobile device to my server using this link. There is a sample application in that link; you just need to set your service URL in RecordActivity.class. For example:
ffmpeg_link="rtmp://yourserveripaddress:1935/live/venkat";
You can send H.263 and H.264 video using that link.
Check out the Yasea library.
Yasea is an Android streaming client. It encodes YUV and PCM data from the camera and microphone to H.264/AAC, encapsulates it in FLV, and transmits it over RTMP.
Features:
Android min API 16.
H.264/AAC hard encoding.
H.264 soft encoding.
RTMP streaming with state callback handler.
Portrait and landscape dynamic orientation.
Front and back camera hot switch.
Recording to MP4 while streaming.
Mux (my company) has an open-source Android app that streams RTMP to a server, including setting up the camera and handling user interactions. It's built to stream to Mux's live streaming API, but it can easily stream to any RTMP entry point.
Depending on your budget, you can use a Raspberry Pi camera that sends images to a server. I add here two tutorials where you can find many more details:
This tutorial shows you how to use a Raspberry Pi camera and display the images on an Android device
This second tutorial is part of a series about real-time video streaming between a camera and an Android device
