Recently I have been digging all around but have been unable to find a decent solution. My problem: I want to send audio recorded on an Android device to a server running on a PC; the server will receive the audio, pass it to Windows Speech Recognition, and try to convert it to text. I have searched a lot but couldn't find a solution. I have tried:
Android-Audio Streaming From Pc
Android: Streaming audio over TCP Sockets
Java - Broadcast voice over Java sockets
http://eurodev.blogspot.com/2009/09/raw-audio-manipulation-in-android.html
Can you help me?
Since I'm new and cannot comment on your question or ask for clarifications, only answer it, I'll do my best here.
This is how I would try to solve the problem if it was mine:
record the audio on the android device
send it to the laptop via email/ftp/whatever transport is easier for me
open it in the laptop and send it to the Windows Speech Recognition Api for conversion to text
send it back via email/ftp/whatever transport is easier for me
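The transport step above (getting the recorded file from the phone to the PC) can be sketched as a plain stream copy; in practice the `OutputStream` would come from a TCP socket or an FTP/HTTP client. The class name and buffer size here are illustrative assumptions, not from the original post.

```java
import java.io.*;

// Sketch: push the recorded audio file to the server over any byte stream
// (e.g. socket.getOutputStream()). Works for a file of any size.
public class AudioUpload {
    public static long send(InputStream audio, OutputStream toServer) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        for (int n; (n = audio.read(buf)) != -1; ) {
            toServer.write(buf, 0, n);
            total += n;              // count bytes sent, handy for logging
        }
        toServer.flush();
        return total;
    }
}
```

On the Android side you would feed it a `FileInputStream` of the file produced by `MediaRecorder`; on the PC side the received bytes are written to disk before being handed to the speech API.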
hope it helps.
Gustavo.
Related
I'm working with Google Glass (it is treated as a normal Android device) and the OpenCV library (C++). I need to transfer the video feed from the Android camera to Visual Studio in real time and process it on my PC. I am not processing the video directly on the Glass because it is too computationally expensive. I tried streaming over RTSP and HTTP, but the frame quality is bad and the latency is inconvenient.
Hence, I was wondering if any of you know how to stream the video over USB and receive it in Visual Studio. I read something about using ADB, but it does not seem to have a real-time function.
Otherwise, I'm all ears for any suggestions.
Thank you in advance!
Matt
You can use adb forward to forward a certain TCP port over USB.
That should allow you to open a socket between the Android device and your host PC through USB data transfer, which should give you fast enough speeds to send frames to the PC in real-time and analyse them in OpenCV. You can just send the frames as bytes over the socket.
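The approach above can be sketched as a simple length-prefixed framing protocol over the forwarded socket. The port (6100) and the framing scheme are my assumptions, not part of the answer; run `adb forward tcp:6100 tcp:6100` first so the PC can connect to `localhost:6100`.

```java
import java.io.*;

// Sketch: length-prefixed frames over the adb-forwarded socket.
// Prefixing each frame with its byte count lets the receiver know
// where one camera frame ends and the next begins.
public class FramePipe {
    // Device side: write one camera frame, prefixed by its length.
    public static void writeFrame(DataOutputStream out, byte[] frame) throws IOException {
        out.writeInt(frame.length);
        out.write(frame);
        out.flush();
    }

    // PC side: read one length-prefixed frame back off the socket.
    public static byte[] readFrame(DataInputStream in) throws IOException {
        int len = in.readInt();
        byte[] frame = new byte[len];
        in.readFully(frame);   // blocks until the whole frame arrives
        return frame;
    }
}
```

On the PC the reassembled byte array can be wrapped in an OpenCV `Mat` for processing.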
How do I stream video from my Android smartphone to my laptop (Windows 7) without using any app? Any suggestions? A brief explanation would really help. And I mean "streaming", not file transfer.
You cannot do this without using any app. What you need to do is program a socket client that sends the video to a server connected over Wi-Fi. Try googling TCP sockets in Android.
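The laptop side of that socket setup can be sketched as a small TCP server that accepts the phone's connection and drains the raw video bytes. This is only a sketch of the idea; the class name is mine and port selection is left to the caller.

```java
import java.io.*;
import java.net.*;

// Sketch: laptop-side receiver. Accepts one connection from the phone
// and reads everything the phone sends until it closes the socket.
public class VideoReceiver {
    public static byte[] receiveOnce(ServerSocket server) throws IOException {
        try (Socket client = server.accept();
             InputStream in = client.getInputStream();
             ByteArrayOutputStream buf = new ByteArrayOutputStream()) {
            byte[] chunk = new byte[4096];
            for (int n; (n = in.read(chunk)) != -1; ) {
                buf.write(chunk, 0, n);
            }
            return buf.toByteArray();
        }
    }
}
```

For continuous streaming you would loop over reads and decode frames as they arrive instead of buffering the whole stream.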
Go to the following link:
https://code.google.com/p/spydroid-ipcamera/
You will get the full source code for Wi-Fi streaming, which you can view in a browser on the laptop as well as in VLC media player (over RTSP). You can also watch it on another mobile/tablet using the VLC plugin.
I'm trying to stream video from the camera on my android device. I've just followed the code in this link: Android Camera RTSP/RTP Stream?
It seems the user had a problem with the YUV decoding; to solve this, I've used:
parameters.setPreviewFormat(ImageFormat.RGB_565);
to obtain preview frames on rgb format.
The logs show that the packets are sent with no errors, so what I would like to do next is play the data stream in VLC player on a local PC. I entered my PC's local IP in the code, so the packets are sent to it. But how do I play them?
I'm really a newbie at this point and any advice would help me a lot.
Thanks.
I believe you want to publish an IP stream from your Android device, and this part, as you say, is working fine. On the PC you can open it with VLC's "Open Network Stream" option. The missing part is finding the address of the published stream, assuming the PC has access to the Android device, e.g. because both are on the same Wi-Fi network.
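When the phone pushes raw RTP to the PC (rather than publishing an RTSP URL), VLC typically needs an SDP file describing the stream before it can play it. A minimal sketch, assuming H.263 video (static payload type 34) arriving on UDP port 5004 — both values are assumptions you must match to your sender:

```
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Android camera stream
c=IN IP4 127.0.0.1
t=0 0
m=video 5004 RTP/AVP 34
a=rtpmap:34 H263/90000
```

Save this as `stream.sdp` and open it with VLC (`vlc stream.sdp`); VLC then listens on the declared port and decodes the incoming packets.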
Now I'm working on a voice-call project between Android and PC. I use the JMF library for the PC client and the normal Android API to create a voice call between them. I use JMF because it supports the RTP protocol. My problem is that the PC client can understand the packets sent from the Android one, but not vice versa.
I customized code from the SipDroid application and saw that only two codecs are used - PCMA and PCMU. I'm not good at audio/video codecs, so my question is whether the JMF library supports those codecs (PCMA and PCMU). I searched the Internet, and some people say that PCMA/PCMU is the same as ALAW/ULAW, but I'm not sure that's right.
Does anyone have experience on this?
JMF supports ulaw which is called PCMU.
See here.
And yes, PCMU/PCMA is the same as ulaw/alaw.
See here.
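To make that equivalence concrete: PCMU is just G.711 mu-law companding of 16-bit linear PCM down to 8 bits per sample. Below is a sketch of the standard encoder (the class name is mine; the constants are the usual G.711 bias and clip values):

```java
// G.711 mu-law (PCMU) encoder sketch: 16-bit linear PCM -> 8-bit mu-law.
public class Ulaw {
    private static final int BIAS = 0x84;   // 132, standard G.711 bias
    private static final int CLIP = 32635;  // clamp before biasing

    public static byte encode(short pcm) {
        int sample = pcm;
        int sign = 0;
        if (sample < 0) { sign = 0x80; sample = -sample; }
        if (sample > CLIP) sample = CLIP;
        sample += BIAS;
        // Find the segment (exponent) of the biased sample.
        int exponent = 7;
        for (int mask = 0x4000; (sample & mask) == 0 && exponent > 0; mask >>= 1) {
            exponent--;
        }
        int mantissa = (sample >> (exponent + 3)) & 0x0F;
        // G.711 transmits the one's complement of sign|exponent|mantissa.
        return (byte) ~(sign | (exponent << 4) | mantissa);
    }
}
```

Silence (sample 0) encodes to 0xFF and the loudest positive sample to 0x80, which matches the standard PCMU tables, so bytes produced this way interoperate with JMF's ULAW support.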
I am trying to develop an Asterisk Android client. My preferred codec is GSM. I have downloaded the SIPDroid source code and some other helper projects, but since I am totally new to this area, I am not sure where to start.
Here is what I am trying to do at the start:
Record sound
Convert that sound to GSM RTP packet
Play that codec sound
Stream that GSM RTP Packet
Integrate SIP Session with the App
I have one Android device (HTC Wildfire). Is it possible to test these steps from the emulator to my handset over a Wi-Fi network?
Please give me an appropriate steps/algorithm on which I can develop the client App.
It'll be great if someone gives me some tips on using the existing projects. Thanks.
I asked a friend with an android phone to install SIPDroid, and it does support the GSM codec.