I'm working with Google Glass (which is treated as a normal Android device) and the OpenCV library (C++). I need to stream the video from the Glass camera to Visual Studio on my PC and process it there in real time. I am not processing the video directly on the Glass because it is too computationally expensive. I tried streaming over RTSP and HTTP, but the frame quality is poor and the latency is unacceptable.
Hence, I was wondering if any of you know how to stream the video via USB and receive it in Visual Studio. I read something about using ADB, but it does not seem to offer real-time transfer on its own.
Otherwise, I'm all ears for any suggestion.
Thank you in advance!
Matt
You can use adb forward to forward a certain TCP port over USB.
That should allow you to open a socket between the Android device and your host PC over the USB connection, which should give you fast enough speeds to send frames to the PC in real time and analyse them in OpenCV. You can simply send the frames as bytes over the socket.
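To make that concrete: after running `adb forward tcp:6100 tcp:6100` on the host (the port number is an arbitrary choice), the PC can connect to localhost:6100 and reach a server socket the Glass app opens on the device. Below is a minimal sketch of a length-prefixed frame format, assuming each frame is JPEG-compressed before sending; the class and method names are made up for illustration, and main() round-trips a fake frame through in-memory streams so the framing can be shown without a device. On the PC side, the received bytes would go straight into cv::imdecode.

```java
import java.io.*;

// Hypothetical length-prefixed framing for camera frames sent over the
// adb-forwarded socket: a 4-byte big-endian length header, then the payload.
public class FrameFraming {

    // Sender side (the Glass app): write one frame's header and bytes.
    public static void writeFrame(OutputStream out, byte[] jpeg) throws IOException {
        new DataOutputStream(out).writeInt(jpeg.length);
        out.write(jpeg);
        out.flush();
    }

    // Receiver side (the PC): read the length, then exactly that many bytes.
    public static byte[] readFrame(InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        int len = din.readInt();
        byte[] buf = new byte[len];
        din.readFully(buf);
        return buf;
    }

    public static void main(String[] args) throws IOException {
        // Round-trip a fake 5-byte "frame" through an in-memory stream.
        byte[] fake = new byte[]{1, 2, 3, 4, 5};
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        writeFrame(bos, fake);
        byte[] back = readFrame(new ByteArrayInputStream(bos.toByteArray()));
        System.out.println("round-tripped " + back.length + " bytes");
    }
}
```

The explicit length header matters because TCP is a byte stream: without it, the receiver has no way to tell where one JPEG ends and the next begins.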
Related
I'm using a Raspberry Pi to stream audio from a USB converter to my Android app, and the big problem is that I need as little latency as possible.
I've already tried Icecast with ices2/darkice, but the latency was too high. I can't get GStreamer working everywhere: GStreamer on the RPi streaming to a GStreamer client on a PC was fine, but I haven't managed to install it on Android.
I have now written my own server in Java using UDP to transmit the data, but I still get over a second of latency and I need it to be lower.
Does anyone have an idea of what I can use on the RPi to solve the latency problem? Thanks!
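For what it's worth, the usual first step toward sub-second UDP audio latency is to send one small chunk per datagram and play each chunk as it arrives, with little or no jitter buffer. This is only a loopback sketch of that per-chunk send/receive (not the poster's actual server); the class name is made up, and the 320-byte chunk is an arbitrary choice corresponding to roughly 10 ms of 16-bit mono audio at 16 kHz:

```java
import java.net.*;

// Minimal one-datagram-per-chunk UDP round trip over loopback.
public class UdpChunkDemo {

    // Send one chunk to a local receiver and return the received length.
    public static int roundTrip(byte[] chunk) throws Exception {
        DatagramSocket receiver = new DatagramSocket(0); // bind any free port
        DatagramSocket sender = new DatagramSocket();
        sender.send(new DatagramPacket(chunk, chunk.length,
                InetAddress.getLoopbackAddress(), receiver.getLocalPort()));

        byte[] buf = new byte[1500]; // one Ethernet MTU is plenty here
        DatagramPacket in = new DatagramPacket(buf, buf.length);
        receiver.setSoTimeout(2000); // don't hang forever if the packet is lost
        receiver.receive(in);

        sender.close();
        receiver.close();
        return in.getLength();
    }

    public static void main(String[] args) throws Exception {
        byte[] chunk = new byte[320]; // ~10 ms of 16-bit mono audio at 16 kHz
        System.out.println("received " + roundTrip(chunk) + " bytes");
    }
}
```

With 10 ms chunks, the protocol itself adds almost nothing; whatever latency remains comes from capture buffering on the RPi and playback buffering on the phone, which is where the tuning effort usually has to go.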
I am using C++ on a Windows 7 computer and a 2013 Google Nexus 7. I am hoping to take the video stream from the tablet's camera, send it to the computer like an IP webcam, process it, and then send the processed video back to the tablet, all in real time or near-real time. Is this possible? How would I do it? I was thinking that if I could output OpenCV's video into VLC, I could potentially stream that to the tablet, but I have no idea how I would do that either.
Thanks in advance!
Why wouldn't you use OpenCV for Android (it is Java, though) and process the video on the tablet? Is that not an option?
If you connect them via Wi-Fi, uploading and downloading in parallel won't be smooth, so some buffering should be provided to compensate.
Would it be possible to capture video from a webcam and display it on an Android device? Say the webcam feed is displayed on a computer, and the Android device streams that live video feed from the computer?
Can someone guide me on how do I do this?
Yes, it can be done.
You will have to code your own simple server which captures the raw bytes from the webcam and turns them into Bitmaps.
The server then listens on a specific port (greater than 1024) for phones to connect, and sends the Bitmaps to them.
On the phone, you will have to make an app that connects to that port, receives the data, decodes it, and displays it on the screen.
Basically, you are sending a lot of Bitmaps over the wire at a very fast rate.
Or, you can look into YAWCam, or android-eye for video streaming. The choice is yours.
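Since the answer above amounts to shipping a lot of Bitmaps very fast, it's worth showing why you would normally compress each frame before it goes on the wire: a raw 640x480 ARGB bitmap is about 1.2 MB, so even a few frames per second adds up. This hypothetical desktop-side sketch JPEG-encodes one synthetic frame with javax.imageio (the class name and frame size are made up); on the phone, the matching decode step would be BitmapFactory.decodeByteArray().

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.*;

// Compress a frame to JPEG before sending it over the socket.
public class FrameCompressDemo {

    // Encode one frame as JPEG and return the compressed bytes.
    public static byte[] toJpeg(BufferedImage frame) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ImageIO.write(frame, "jpg", bos);
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // A synthetic (all-black) 640x480 frame stands in for a webcam capture.
        BufferedImage frame = new BufferedImage(640, 480, BufferedImage.TYPE_INT_RGB);
        byte[] jpeg = toJpeg(frame);
        BufferedImage decoded = ImageIO.read(new ByteArrayInputStream(jpeg));
        System.out.println("raw ~" + (640 * 480 * 3) + " bytes, jpeg " + jpeg.length
                + " bytes, decoded " + decoded.getWidth() + "x" + decoded.getHeight());
    }
}
```

The compressed bytes would then be sent over the socket (ideally with a length prefix so the phone knows where each JPEG ends) rather than the raw pixel data.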
I did this a couple of years ago using the motion software (on the Linux computer) and tinyCam on the Android smartphone. Motion does require a bit of configuration, but it is very simple and straightforward, even though it is all done through text files. You will also need to open and forward ports on your router.
Gang:
I'm an embedded software engineer starting to work with Linux streaming solutions, and I have no experience with networking or smartphone OSes.
When we stream H.264 video to a PC (receiving via VLC), we see a latency of about 1.2 seconds on the PC side. That latency includes:
sensor grabs data
data is sent over the network
VLC buffers and plays
After a while we found the buffering control in VLC, but for H.264 streaming the minimum we can set is 400-500 ms. On Android phones, however, we were not able to find any player with very short (minimal) delay/buffering.
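For reference, the buffering control mentioned above is exposed on the VLC command line as `--network-caching` (in milliseconds); the stream URL below is a placeholder for your actual stream address.

```shell
# Lower VLC's network buffer (the default is on the order of a second);
# the RTSP URL is a placeholder for your actual stream.
vlc --network-caching=400 rtsp://192.168.1.20:8554/stream
```

Values much below a few hundred milliseconds tend to cause stuttering for H.264, which matches the 400-500 ms floor observed above.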
Can anyone suggest:
How is latency generally measured/profiled for video streaming to smartphones?
Do you have any network-sniffing software for Android/iOS to recommend?
I saw in Apple's documentation that HTTP Live Streaming recommends 10-second segments. Is there any way to overcome this? (Is jailbreaking required to install a sniffing tool on iOS?)
I'm trying to stream video from the camera on my android device. I've just followed the code in this link: Android Camera RTSP/RTP Stream?
It seems the user had a problem with the YUV decoding; to solve this I've used:
parameters.setPreviewFormat(ImageFormat.RGB_565);
to obtain preview frames in RGB format.
The logs show that the packets are sent without errors, so what I would like to do next is play the data stream in VLC on a local PC. I put my PC's local IP into the code, so the packets are sent to it. But how do I play them?
I'm really a newbie at this point and any advice would help me a lot.
Thanks.
I believe you want to publish a stream from your Android device; as you say, that part is working fine. From the PC you can then open it via VLC's "Open Network Stream". The missing part is finding out the address of the published stream, assuming the PC can actually reach the Android device, e.g. because both are connected to the same Wi-Fi network.
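One caveat: if the app is pushing raw RTP packets at the PC (rather than running an RTSP server), VLC has no address to "open" and cannot discover the stream on its own. The usual trick is to describe the session in a small .sdp file and open that file in VLC. The sketch below is only illustrative: the IP, port, payload type, and codec line are all placeholders that must match what your code actually sends (the c= line is the address where VLC receives the packets, and the m= port is the one your app targets).

```
v=0
o=- 0 0 IN IP4 192.168.1.10
s=Android camera stream
c=IN IP4 192.168.1.10
t=0 0
m=video 5006 RTP/AVP 96
a=rtpmap:96 H263-1998/90000
```

Save this as, say, stream.sdp, open it with VLC on the PC, and then start the app so the RTP packets arrive at the port the file declares.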