Would it be possible to obtain video from a webcam and display it on an Android device? Say the webcam feed is shown on a computer, and the Android device streams that live video feed from the computer.
Can someone guide me on how to do this?
Yes, it can be done.
You will have to code your own simple server which captures the raw bytes from the webcam and turns them into Bitmaps.
The server will then listen for phones to connect on a specific port (greater than 1024) and then send the Bitmaps to them.
On the phone, you will have to make an app that connects to that port, receives the data, decodes it and then displays it on the screen.
Basically, you are sending a lot of Bitmaps over the wire at a very fast rate.
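To make that concrete, here is a minimal sketch of the phone side, assuming a hypothetical server that prefixes each JPEG-compressed frame with a 4-byte length; the class name, framing, and host/port are my own placeholders, not anything prescribed above:

```java
import java.io.DataInputStream;
import java.net.Socket;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

// Hypothetical receiver: run this on a background thread (Android forbids
// network I/O on the main thread) and post each Bitmap to the UI to display.
public class FrameClient {

    public interface FrameListener {
        void onFrame(Bitmap frame);
    }

    public static void receiveFrames(String host, int port, FrameListener listener) throws Exception {
        try (Socket socket = new Socket(host, port);
             DataInputStream in = new DataInputStream(socket.getInputStream())) {
            while (true) {
                int length = in.readInt();       // 4-byte frame length the server sends first
                byte[] jpeg = new byte[length];
                in.readFully(jpeg);              // then the JPEG bytes of one frame
                Bitmap frame = BitmapFactory.decodeByteArray(jpeg, 0, length);
                listener.onFrame(frame);         // hand off for display (e.g. in an ImageView)
            }
        }
    }
}
```

The server side would mirror this framing; compressing each frame to JPEG rather than sending raw Bitmap pixels keeps the per-frame payload small enough to sustain a usable frame rate.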
Or you can look into YAWCam or android-eye for video streaming. The choice is yours.
I did this a couple of years ago using the motion software (on a Linux computer) and tinyCam on the Android smartphone. Motion does require a bit of configuration, but it is very simple and straightforward, although the configuration is text-based. You will also need to open and forward ports on your router.
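For reference, a minimal motion configuration along these lines might look like the excerpt below; directive names have changed between motion versions (older releases used webcam_port instead of stream_port), so treat this as a sketch for a recent release:

```
# /etc/motion/motion.conf (excerpt)
daemon on             # run motion in the background
width 640             # capture resolution
height 480
framerate 15          # frames per second to capture
stream_port 8081      # port that tinyCam (or a browser) connects to
stream_localhost off  # accept stream connections from other machines
```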
At school, we are busy with a project where we need to have 2 camera inputs connected to an Android phone running a Unity application.
At the moment our idea is to use some USB webcams and plug them into a Raspberry Pi. This has been done before. But the part we are not really sure about is how to connect the Raspberry Pi to an Android phone and transfer the video.
Here is a schematic of what I want to achieve. The idea is that we want to simulate different kinds of animal vision in a Google Cardboard experience.
The part I'm not sure about is the connection to the Android phone and Unity; getting the USB webcams connected to the Raspberry Pi shouldn't be a problem.
I hope someone can give an answer or point me in the right direction.
Many thanks in advance,
Bastien Olivier Dijkstra
This sounds like a neat project. Are you setting it up so each camera gives the view for each eye (hence the 3D aspect)?
For starters, if you only need recorded (non-live) video, you can SCP or FTP the resulting video files to your Android device.
AndFTP does a pretty good job of FTP for Android devices.
On the other hand, if you want to watch a live stream of the video from the RaspiCam, you can do that also with a myriad of other apps, but I personally use IP Cam Viewer to view the RTSP stream.
You will basically be viewing each RTSP stream independently. You may need to adjust the viewing resolution so it doesn't overload your RPi, but Unity can deal with up to eight cameras and up to eight viewing 'screens'.
I have a plan to develop an instrument app: when you shake the Android phone, it produces an "angklung" (Google it) sound.
THE PROBLEM:
How can one Android phone share the sound it produces (via the shake gesture) with the other Android phones running my application?
I want the connection to work over mobile data and Wi-Fi.
I think this person has the same problem, but I don't know how to get in touch with him: Stream android to android
But there was no help there.
I need a solution/example/suggestion for this problem. So far I have succeeded in producing the "angklung" sound when the phone is shaken.
I have no idea how to start on the sharing part. I've searched the internet but found no help :(
Thanks for your help.
I would suggest streaming the audio data to a server and having it relay the stream to the other Android devices (those registered to your app). Since what you're asking involves far more than a couple of lines of code, I'm pointing you to some good resources; dig into those, and good luck.
Live-stream video from one android phone to another over WiFi
Stream Live Android Audio to Server
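To make the relay idea concrete, here is a minimal sketch of such a server in plain Java; the class name, port, and raw-bytes-over-TCP framing are my own assumptions, not anything from the linked posts:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// Hypothetical relay: whatever audio bytes one phone sends are forwarded
// to every other connected phone.
public class AudioRelayServer {

    // Safe to iterate while other threads add or remove entries.
    private static final List<OutputStream> listeners = new CopyOnWriteArrayList<>();

    public static void main(String[] args) throws IOException {
        try (ServerSocket server = new ServerSocket(5000)) {  // port is an arbitrary choice
            while (true) {
                Socket client = server.accept();              // one thread per connected phone
                new Thread(() -> handle(client)).start();
            }
        }
    }

    private static void handle(Socket client) {
        OutputStream out = null;
        try {
            out = client.getOutputStream();
            listeners.add(out);
            InputStream in = client.getInputStream();
            byte[] buffer = new byte[4096];
            int read;
            while ((read = in.read(buffer)) != -1) {
                for (OutputStream l : listeners) {
                    if (l == out) continue;                   // don't echo back to the sender
                    try {
                        l.write(buffer, 0, read);
                    } catch (IOException ignored) {           // dead listeners are removed by their own threads
                    }
                }
            }
        } catch (IOException ignored) {
        } finally {
            if (out != null) listeners.remove(out);
            try { client.close(); } catch (IOException ignored) {}
        }
    }
}
```

Each phone would then open a plain Socket to this server: the shaken phone writes its audio bytes into the stream, and the others play back whatever they receive (for raw PCM, Android's AudioTrack can do that directly).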
I want to mirror the Android screen to a desktop web browser. I am able to capture the screen using MediaProjection, thanks to a sample app.
But the next part is the hard one: sending the captured data to the desktop! I know the technique of establishing an HTTP connection to a desktop program via ADB port forwarding, but I suspect the FPS will be very low.
How can I stream this captured screen data to the desktop? What sort of connection would I need, and what codecs would I need on the Android side to ensure speed?
Thanks
I'm working with Google Glass (it is considered a normal Android device) and the OpenCV library (C++). I need to transfer the video feed from the Android camera to Visual Studio in real time and process it on my PC. I am not processing the video directly on the Glass because that is too computationally expensive. I tried streaming over the RTSP and HTTP protocols, but the frame quality is bad and there is inconvenient latency.
Hence, I was wondering if any of you knows how to stream the video via USB and receive it in Visual Studio. I read something about using ADB, but it does not seem to have a real-time function.
Otherwise, I'm all ears for any suggestion.
Thank you in advance!
Matt
You can use adb forward to forward a certain TCP port over USB.
That should allow you to open a socket between the Android device and your host PC through USB data transfer, which should give you fast enough speeds to send frames to the PC in real-time and analyse them in OpenCV. You can just send the frames as bytes over the socket.
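As a concrete illustration, here is a minimal sketch of the device side, assuming you first run `adb forward tcp:5555 tcp:5555` on the PC; the class name, port, and length-prefixed JPEG framing are my own assumptions:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.net.ServerSocket;
import java.net.Socket;

import android.graphics.Bitmap;

// Hypothetical sender: the Glass listens on a local port; after
// `adb forward tcp:5555 tcp:5555`, the PC reaches it at localhost:5555
// and all traffic travels over the USB cable.
public class UsbFrameSender {

    private DataOutputStream out;

    public void waitForPc(int port) throws Exception {
        ServerSocket server = new ServerSocket(port); // adb forwards this port over USB
        Socket pc = server.accept();                  // blocks until the PC connects
        out = new DataOutputStream(pc.getOutputStream());
    }

    // Call once per camera frame: sends a 4-byte length, then the JPEG bytes.
    public void sendFrame(Bitmap frame) throws Exception {
        ByteArrayOutputStream jpeg = new ByteArrayOutputStream();
        frame.compress(Bitmap.CompressFormat.JPEG, 70, jpeg); // 70 trades quality for speed
        out.writeInt(jpeg.size());
        jpeg.writeTo(out);
        out.flush();
    }
}
```

On the PC side, connect to localhost:5555, read the 4-byte length, read that many JPEG bytes, and decode each frame with OpenCV's cv::imdecode before processing.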
I am using C++ on a Windows 7 computer and a 2013 Google Nexus 7. I am hoping to take a video stream from the tablet's camera, send it to the computer like an IP webcam, process it, and then send the processed video back to the tablet, all in real time or near-real time. Is this possible? How would I do it? I was thinking that if I can output OpenCV frames into VLC, I could potentially stream them to the tablet, but I have no idea how I would do that either.
Thanks in advance!
Why wouldn't you use OpenCV for Android (it is Java, though) and process the video on the tablet? Is that not an option?
If you connect them via Wi-Fi, uploading and downloading in parallel won't be smooth, so you should provide some buffering to overcome that.
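To illustrate the buffering point, here is a minimal sketch of a bounded frame buffer that decouples the network thread from the rendering thread; the class name and capacity are my own assumptions:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

import android.graphics.Bitmap;

// Hypothetical jitter buffer: the network thread fills it, the render
// thread drains it, so bursts and stalls in the download don't freeze
// the display.
public class FrameBuffer {

    // Roughly one second of video at 30 fps; tune to taste.
    private final BlockingQueue<Bitmap> queue = new ArrayBlockingQueue<>(30);

    // Network thread: when the buffer is full, drop the oldest frame so
    // playback stays close to live instead of drifting further behind.
    public void push(Bitmap frame) {
        while (!queue.offer(frame)) {
            queue.poll();
        }
    }

    // Render thread: blocks until the next frame is available.
    public Bitmap take() throws InterruptedException {
        return queue.take();
    }
}
```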