I'm trying to stream video from the camera on my Android device. I've just followed the code in this link: Android Camera RTSP/RTP Stream?
It seems the user there had a problem with YUV decoding; to work around it, I've used:
parameters.setPreviewFormat(ImageFormat.RGB_565);
to obtain the preview frames in RGB format.
The logs show that the packets are sent without errors, so what I would like to do next is play the data stream in VLC on a local PC. I set my PC's local IP address in the code, so the packets are sent to it. But how do I play them?
I'm really a newbie at this point, and any advice would help me a lot.
Thanks.
I believe you want to publish a stream from your Android device, and as you say, that part is working fine. From the PC you can then open it with VLC's network stream option. The missing part is finding the address of the published stream, assuming the PC can actually reach the Android device, e.g. both are connected to the same Wi-Fi network.
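If the app is sending raw RTP packets (as in the linked example) rather than running a full RTSP server, VLC has no way to discover the payload format on its own. One common workaround is to describe the session in a small .sdp file and open that file with VLC. A minimal sketch, assuming the phone sends H.264 RTP to the PC on port 5006 and the PC's address is 192.168.1.10 (all three are placeholders to adjust to your setup):

```
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Android camera stream
c=IN IP4 192.168.1.10
t=0 0
m=video 5006 RTP/AVP 96
a=rtpmap:96 H264/90000
```

Save it as stream.sdp and open that file in VLC. One caveat: VLC can only decode standard payloads such as H.264 or H.263; raw RGB_565 preview frames pushed straight into UDP packets are not a format VLC understands, so the frames need to be encoded before (or while) being packetized.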
I'm working with Google Glass (it can be treated as a normal Android device) and the OpenCV library (C++). I need to transfer the video from the Glass camera to Visual Studio in real time and process it on my PC. I am not processing the video directly on the Glass because it is too computationally expensive. I tried streaming over the RTSP and HTTP protocols, but the frame quality is bad and there is noticeable latency.
Hence, I was wondering if any of you knows how to stream the video over USB and receive it in Visual Studio. I read something about using adb, but it does not seem to have a real-time option.
Otherwise, I'm all ears for any suggestion.
Thank you in advance!
Matt
You can use adb forward to forward a certain TCP port over USB.
That should allow you to open a socket between the Android device and your host PC through USB data transfer, which should give you fast enough speeds to send frames to the PC in real-time and analyse them in OpenCV. You can just send the frames as bytes over the socket.
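As a rough sketch of that idea (the port number 6000 and the 4-byte length prefix are my own choices, not anything adb requires): the app opens a ServerSocket on the device, you run `adb forward tcp:6000 tcp:6000` on the PC, and the PC then connects to 127.0.0.1:6000, which adb tunnels over USB. The loopback demo below shows just the framing:

```java
// Loopback sketch of length-prefixed frame transfer. In the real setup the
// Android app runs the ServerSocket side on the device; after
// `adb forward tcp:6000 tcp:6000` the PC connects to 127.0.0.1:6000 and
// adb tunnels the connection over USB. Port 6000 and the 4-byte length
// prefix are arbitrary choices, not part of any standard.
import java.io.*;
import java.net.*;

class FrameSocketDemo {

    // Device side: write one frame as a 4-byte big-endian length, then the bytes.
    static void sendFrame(OutputStream out, byte[] frame) throws IOException {
        DataOutputStream d = new DataOutputStream(out);
        d.writeInt(frame.length);
        d.write(frame);
        d.flush();
    }

    // PC side: read one length-prefixed frame (then feed the bytes to OpenCV).
    static byte[] readFrame(InputStream in) throws IOException {
        DataInputStream d = new DataInputStream(in);
        byte[] frame = new byte[d.readInt()];
        d.readFully(frame);
        return frame;
    }

    public static void main(String[] args) throws Exception {
        byte[] fakeFrame = {1, 2, 3, 4};   // stand-in for real camera bytes
        try (ServerSocket server = new ServerSocket(6000)) {
            Thread sender = new Thread(() -> {
                try (Socket s = server.accept()) {
                    sendFrame(s.getOutputStream(), fakeFrame);
                } catch (IOException e) { throw new RuntimeException(e); }
            });
            sender.start();
            try (Socket s = new Socket("127.0.0.1", 6000)) {
                byte[] got = readFrame(s.getInputStream());
                System.out.println("received " + got.length + " bytes");
            }
            sender.join();
        }
    }
}
```

On the PC side you can decode each received byte[] with OpenCV (e.g. imdecode if you send JPEG-compressed frames, which keeps the USB bandwidth manageable).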
I need to view an MJPEG stream on an Android device. I found MjpegView and this library. When I configure the IP address and port, its async doRead method does get the stream (I can see its content while debugging), but nothing shows up in the MjpegView. I downloaded that library's APK from the Play Store, and it didn't stream either.
This app plays my stream on the same device.
Any ideas or suggestions?
For a school project my partner and I are trying to push video from an Android tablet (Nexus 7) to a server (as an IP webcam), pull it from the server into OpenCV 2.4.6, process it, send it back to the server, and have the tablet display the feed in real or near-real time.
We aren't using OpenCV for Android because the goal is for a remote user to decide how to process the video (e.g. selecting a template to match against the stream).
Our question is this: we have managed to get the Android webcam stream onto a server as an H.264 RTSP stream. All the documentation on how to pull an RTSP stream is either outdated, really confusing, or altogether non-existent. We tried using a VideoCapture object and then cvCreateFileCapture, but neither seems to work. How do we do this?
There is an open source project that does precisely this combination:
Android generating the content from the front or rear camera
RTSP protocol for streaming
H.264
https://code.google.com/p/spydroid-ipcamera/
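On the pulling side, the usual route in OpenCV 2.4.x is to hand the RTSP URL straight to cv::VideoCapture; this only works when OpenCV was built with FFmpeg support, since FFmpeg does the actual RTSP handling. A minimal sketch, where the URL and port are placeholders for whatever your server (e.g. spydroid) announces:

```cpp
// Sketch, assuming OpenCV 2.4.x built with FFmpeg support (FFmpeg provides
// the RTSP demuxing and H.264 decoding). The URL is a placeholder.
#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture cap("rtsp://192.168.1.20:8086");  // placeholder URL
    if (!cap.isOpened()) {
        // Most failures here mean OpenCV was built without FFmpeg,
        // or the URL/port is wrong.
        return 1;
    }
    cv::Mat frame;
    while (cap.read(frame)) {
        // process the frame here (template matching, etc.)
        cv::imshow("stream", frame);
        if (cv::waitKey(1) == 27) break;  // Esc to quit
    }
    return 0;
}
```

If isOpened() keeps returning false, checking the build information for FFMPEG support is usually the first thing to try.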
Recently I was digging all around but was unable to find a decent solution. My problem is that I want to send audio recorded on an Android device to a server running on a PC, which will receive the audio, pass it to Windows Speech Recognition, and try to convert it to text. I have searched a lot but couldn't find any solution. I have tried:
Android-Audio Streaming From Pc
Android: Streaming audio over TCP Sockets
Java - Broadcast voice over Java sockets
http://eurodev.blogspot.com/2009/09/raw-audio-manipulation-in-android.html
Can you help me?
Since I'm new and cannot comment on your question or ask for clarification, only answer it, I'll do my best here.
This is how I would try to solve the problem if it were mine:
record the audio on the android device
send it to the laptop via email/ftp/whatever transport is easier for me
open it in the laptop and send it to the Windows Speech Recognition Api for conversion to text
send it back via email/ftp/whatever transport is easier for me
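The transfer steps above can also be done with a plain TCP socket instead of email/FTP, which avoids any server setup. A minimal sketch, where the idea of closing the socket to mark end-of-file is my own convention, not part of any standard:

```java
// Minimal sketch of shipping a recorded audio file over TCP.
// Sender streams the file bytes and closes the connection; the receiver
// treats end-of-stream as end-of-file. That convention is an arbitrary
// choice for this sketch.
import java.io.*;
import java.net.*;
import java.nio.file.*;

class AudioFileTransfer {
    // Sender (Android side, or any JVM): stream the file and close.
    static void send(String host, int port, Path file) throws IOException {
        try (Socket s = new Socket(host, port);
             OutputStream out = s.getOutputStream()) {
            Files.copy(file, out);
        }
    }

    // Receiver (PC side): accept one connection, write the bytes to disk,
    // then hand the file to the speech-recognition step.
    static void receiveOne(ServerSocket server, Path dest) throws IOException {
        try (Socket s = server.accept();
             InputStream in = s.getInputStream()) {
            Files.copy(in, dest, StandardCopyOption.REPLACE_EXISTING);
        }
    }
}
```

The same pair of methods works in both directions, so step 4 (sending the recognized text back) can reuse the receiver on the Android side.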
Hope it helps.
Gustavo.
Let's say that I have Microsoft Media Server stream (i.e. mms://[some ip address here]). This stream contains both audio and video. Is it possible to stream this to an Android phone? How would I go about doing this? Preferably with video, but if it is an audio stream only that would also be okay.
mms:// is used as a placeholder URL scheme. Windows Media Services will stream over either the RTSP or HTTP protocol. However, in order to play back the stream on the phone you need both streaming code and a codec. Android seems to support the WMA/WMV codecs, but I don't see any information about the protocols.
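If the server can publish the same content over RTSP or HTTP instead of mms://, Android's MediaPlayer can usually be pointed at the URL directly. A minimal sketch for use inside an Activity or Service; the URL is a placeholder, and it assumes the stream uses a protocol and codec the device actually supports:

```java
// Sketch for the Android side. The URL is a placeholder and must point
// at an RTSP/HTTP stream in a protocol and codec the device supports.
import android.media.MediaPlayer;
import java.io.IOException;

class StreamPlayerSketch {
    static MediaPlayer play(String url) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(url);                        // e.g. "rtsp://host/stream" (placeholder)
        player.setOnPreparedListener(MediaPlayer::start); // start once buffering is done
        player.prepareAsync();                            // don't block the UI thread
        return player;
    }
}
```

prepareAsync() matters for network sources: a synchronous prepare() would stall the UI thread while the stream buffers.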
Sorry, Microsoft halted support for MMS in 2008. I would drop it and find a better solution.