My basic requirement is to stream live video from an Android device to a server. While researching this, I came across OpenCV 2.4.6. My question: is it possible to stream video from the Android camera to a server using OpenCV 2.4.6 on Android? If so, can you please suggest how to go about it?
I don't know if you can use OpenCV for streaming video from an Android device to a PC. To my knowledge, OpenCV is an image processing library used to process images and video for face recognition, image compression, augmented reality, etc. My idea is to create a socket between the client device and the server PC and pass the video through that socket, just like what they did with the spydroid application.
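A minimal sketch of that socket idea, assuming you compress each camera preview frame to JPEG before sending (the host, port, and JPEG quality are placeholders, and a real app would run this off the UI thread):

```java
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.net.Socket;

// Sends camera preview frames over a plain TCP socket as length-prefixed JPEGs.
public class FrameSender {
    private final DataOutputStream out;

    public FrameSender(String host, int port) throws IOException {
        out = new DataOutputStream(new Socket(host, port).getOutputStream());
    }

    // Call from Camera.PreviewCallback.onPreviewFrame with the NV21 preview buffer.
    public void sendFrame(byte[] nv21, int width, int height) throws IOException {
        YuvImage img = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
        ByteArrayOutputStream jpeg = new ByteArrayOutputStream();
        img.compressToJpeg(new Rect(0, 0, width, height), 70, jpeg); // quality 70
        out.writeInt(jpeg.size()); // length prefix so the server can frame its reads
        jpeg.writeTo(out);
        out.flush();
    }
}
```

This is motion-JPEG over TCP rather than a proper H.264/RTSP pipeline like spydroid's, so it trades bandwidth for simplicity: the server just reads an int length, then that many bytes, per frame.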
Related
I have created an application that takes an RTSP stream and displays it in a SurfaceView (a minimal sketch of that part is below). This is straightforward enough. However, I was wondering if it is possible to take an RTSP stream in Android and use it to create a v4l2 device, in the same way that connecting a webcam creates one at a location such as /dev/video0.
Can this be done using a compiled version of FFMPEG for Android, or is there perhaps a simpler method?
Any direction on this would be greatly appreciated!
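For reference, the display side mentioned above looks roughly like this: a minimal sketch using android.media.MediaPlayer, with a placeholder RTSP URL and no error handling beyond the required try/catch.

```java
import android.app.Activity;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import java.io.IOException;

// Minimal sketch: play an RTSP stream into a SurfaceView with MediaPlayer.
public class RtspPlayerActivity extends Activity implements SurfaceHolder.Callback {
    private MediaPlayer player;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SurfaceView view = new SurfaceView(this);
        setContentView(view);
        view.getHolder().addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            player = new MediaPlayer();
            player.setDisplay(holder);
            player.setDataSource("rtsp://example.com/stream"); // placeholder URL
            player.setOnPreparedListener(MediaPlayer::start);  // start once buffered
            player.prepareAsync();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        if (player != null) player.release();
    }
}
```

None of this, of course, gets the stream anywhere near /dev/video0; that is exactly the part I am asking about.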
I am a rookie at Android and have been working on making Android a streaming server. What I have learned so far is that live555 can act as a media server and can be ported to Android. Following various resources, I know how to build liblive555.so with ndk-build, and I have done that.
But the question is that I don't know how to use it. I want to stream some local video files, just as I can easily stream video files on my Windows 7 PC by running live555MediaServer.exe (and accessing it with "rtsp://ip:80/mp4_file_name"). I am looking for the same (or a similar) way on Android.
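To make the question concrete: on the Java side I imagine the glue looking something like this. The native method name is my own assumption, and the actual RTSP server logic would have to be implemented in C++ against live555 behind that call.

```java
// Sketch of the Java-side JNI glue for a live555-based server. The native
// method is hypothetical; the real work (creating a live555 RTSPServer and
// registering media sessions for local files) lives in the JNI C++ wrapper.
public class Live555Server {
    static {
        System.loadLibrary("live555"); // loads liblive555.so from the APK's native libs
    }

    // Hypothetical native entry point, implemented in JNI C++ code.
    public static native void startRtspServer(String mediaDir, int port);
}
```

The idea would then be to call Live555Server.startRtspServer(...) from a Service so the server keeps running while other devices connect. Is that the right approach, and what does the native side need to look like?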
I have a project where I need to build a PC-to-mobile chat application for Android and iOS, and I was wondering if that would be possible using PhoneGap.
I searched a little and found this plugin for capturing video
https://github.com/EddyVerbruggen/VideoCapturePlus-PhoneGap-Plugin
Will it work?
OK, to add more detail: the backend will be a .NET Azure server, the front end will be the mobile/tablet device, and a user should be able to communicate with the admin/supervisor's PC web browser.
If you want live chat, then the plugin will be of no use. I created it for recording video; when recording is done, the result can be played back or sent to a server. Live streaming is not supported.
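For the "sent to a server" part, the upload itself is just an ordinary HTTP request. A minimal Java sketch, with a placeholder endpoint URL (a real app would do this off the UI thread, and probably as multipart/form-data):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Uploads a recorded video file to a server with a raw HTTP POST.
public class VideoUploader {
    public static int upload(File video, String endpoint) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "video/mp4");
        conn.setFixedLengthStreamingMode((int) video.length());
        try (InputStream in = new FileInputStream(video);
             OutputStream out = conn.getOutputStream()) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        return conn.getResponseCode(); // e.g. 200 on success
    }
}
```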
Best regards,
Eddy
For a school project, my partner and I are trying to push video from an Android tablet (Nexus 7) to a server (as an IP webcam), pull it from the server into OpenCV 2.4.6, process it, send it back to the server, and have the tablet display the feed in real or near-real time.
We aren't using OpenCV for Android because the goal is for a remote user to decide how to process the video (i.e. selecting a template to match, or something similar, from the stream).
Our question is this: we have managed to get the Android webcam stream onto a server as an H.264 RTSP stream. The documentation on how to pull an RTSP stream is either outdated, really confusing, or altogether non-existent. We tried using a VideoCapture object and then tried cvCreateFileCapture, but neither seems to work. How do we do this?
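For what it's worth, here is how the pull looks with OpenCV's Java bindings. Note the assumptions: this uses the newer org.opencv.videoio API (OpenCV 3.x/4.x; the 2.4-era Java wrapper did not expose a URL-based constructor, which may be part of the problem), and OpenCV must have been built with FFMPEG support or RTSP will not open at all.

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.videoio.VideoCapture;

// Pulls frames from an RTSP stream with OpenCV's Java bindings.
public class RtspPull {
    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
        VideoCapture cap = new VideoCapture("rtsp://server/stream"); // placeholder URL
        if (!cap.isOpened()) {
            System.err.println("Failed to open stream: check the URL and FFMPEG support.");
            return;
        }
        Mat frame = new Mat();
        while (cap.read(frame)) {
            // process each frame here (template matching, etc.)
        }
        cap.release();
    }
}
```

The C++ equivalent is the same pattern with cv::VideoCapture; cvCreateFileCapture is the legacy C API and is best avoided.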
There is an open source project that does precisely this combination:

- Android generating the content from the front or rear camera
- RTSP protocol for streaming
- H.264 encoding

https://code.google.com/p/spydroid-ipcamera/
I'm trying to make an Android app that sends live video from the phone camera to a Wowza media server, using Eclipse and the Android SDK. I tried to use the spydroid IP camera project on code.google.com (this is the link: https://code.google.com/p/spydroid-ipcamera/), but I couldn't figure out exactly what to change in this app to make it stream to my localhost Wowza server. The tutorial that comes with spydroid is not clear (this is the link to the tutorial: https://code.google.com/p/spydroid-ipcamera/issues/detail?id=2). Can you help me please?
For all of you who are still looking for an answer: try looking at fyhertz's libstreaming library. I think it's easy to apply and stable enough.
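A rough sketch of pushing the camera to a Wowza server with libstreaming, based on the library's documented usage; the server address, port, and stream path are placeholders, so check the project's README for the exact current API.

```java
import android.content.Context;
import net.majorkernelpanic.streaming.Session;
import net.majorkernelpanic.streaming.SessionBuilder;
import net.majorkernelpanic.streaming.gl.SurfaceView;
import net.majorkernelpanic.streaming.rtsp.RtspClient;

// Builds a camera session and publishes it to a Wowza server over RTSP.
public class WowzaPublisher {
    public static RtspClient publish(Context context, SurfaceView preview) {
        Session session = SessionBuilder.getInstance()
                .setContext(context)
                .setSurfaceView(preview) // libstreaming's own SurfaceView subclass
                .setAudioEncoder(SessionBuilder.AUDIO_NONE)
                .setVideoEncoder(SessionBuilder.VIDEO_H264)
                .build();

        RtspClient client = new RtspClient();
        client.setSession(session);
        client.setServerAddress("192.168.1.10", 1935); // placeholder Wowza host and port
        client.setStreamPath("/live/myStream");        // placeholder application/stream name
        client.startStream();
        return client;
    }
}
```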