I'm using a Raspberry Pi to stream audio from a USB converter to my Android app, but the big problem is that I need the latency to be as close to zero as possible.
I've already tried Icecast with ices2/darkice, but the latency was too high. I couldn't get GStreamer to work end to end: GStreamer on the RPi to a GStreamer client on a PC was OK, but I didn't manage to install it on Android.
Now I've written my own server in Java using UDP to transmit the data, but I get 1+ seconds of latency and I need it to be less.
Does anyone have an idea of what I can use on the RPi to manage the latency problem? Thanks!
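For illustration, a minimal sketch of such a UDP sender on the Pi (the address, port, chunk size, PCM format and readFromCapture are all placeholder assumptions, not my actual code). The main lever on latency is how much audio is buffered per packet: the 1764-byte chunk below is only ~10 ms of 16-bit 44.1 kHz stereo audio.

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    // Sends small PCM chunks over UDP so each packet carries only ~10 ms of audio.
    public class UdpAudioSender {
        public static void main(String[] args) throws Exception {
            DatagramSocket socket = new DatagramSocket();
            InetAddress phone = InetAddress.getByName("192.168.1.50"); // placeholder address
            byte[] chunk = new byte[1764]; // 44100 Hz * 2 ch * 2 bytes * 0.010 s
            while (readFromCapture(chunk)) {
                socket.send(new DatagramPacket(chunk, chunk.length, phone, 5004));
            }
            socket.close();
        }
        // Stub: fill buf from the USB capture device; return false at end of stream.
        private static boolean readFromCapture(byte[] buf) { return false; }
    }

The playback side matters just as much: an Android AudioTrack created with a large buffer will add hundreds of milliseconds on its own, so keep its buffer near the minimum too.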
My goal is to stream audio/video content in real time from an Android phone to a Wowza server.
1- As I understand it, the protocols for Wowza's incoming and outgoing (broadcast) streams can be different. True?
2- If that is so, then I can upload my video data through either the HLS or the RTMP protocol.
3- I'm a little familiar with these protocols after searching here and there, but I don't have enough knowledge to decide which protocol is best for streaming recorded audio and video data to the Wowza server in real time, so that it can be broadcast and watched via HLS. Help in selecting the best protocol would be appreciated.
It should be noted that the video will be recorded and streamed in real time, which means I will have to get the encoded buffers from MediaCodec and send them to the Wowza server.
Again, if point 1 above is true, then my only concern should be choosing the best protocol for Wowza's inbound stream.
Any advice regarding Android OS versions, SDK vs. NDK etc. will be highly appreciated.
Thanks.
1 - True.
2 - RTMP. HLS is a pull-based protocol over HTTP: fine for server-to-client, but not great for client-to-server.
3 - This is not a question.
Protocol choice is completely independent of OS choice.
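To make point 3 concrete: with RTMP, the Android side boils down to draining MediaCodec's output and handing each encoded access unit to your RTMP code. A rough sketch (sendToRtmpMuxer is a placeholder, not a real API; getOutputBuffer(int) requires API 21+):

    import android.media.MediaCodec;
    import java.nio.ByteBuffer;

    class EncoderDrain {
        // Pull encoded H.264 access units out of the encoder as they become ready.
        void drain(MediaCodec encoder) {
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int index = encoder.dequeueOutputBuffer(info, 10000); // 10 ms timeout
            while (index >= 0) {
                ByteBuffer buf = encoder.getOutputBuffer(index);
                buf.position(info.offset);
                byte[] accessUnit = new byte[info.size];
                buf.get(accessUnit, 0, info.size);
                sendToRtmpMuxer(accessUnit, info.presentationTimeUs);
                encoder.releaseOutputBuffer(index, false);
                index = encoder.dequeueOutputBuffer(info, 0);
            }
        }
        // Stub: wrap the access unit in an RTMP video message and send it to Wowza.
        void sendToRtmpMuxer(byte[] accessUnit, long ptsUs) { }
    }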
I'm working with Google Glass (it can be treated as a normal Android device) and the OpenCV library (C++). I need to transfer the video feed from the Android camera to Visual Studio in real time and process it on my PC. I am not processing the video directly on the Glass because that is too computationally expensive. I tried streaming with the RTSP and HTTP protocols, but the frame quality was bad and the latency was unacceptable.
Hence, I was wondering if any of you knows how to stream the video via USB and receive it in Visual Studio. I read something about using ADB, but it does not seem to have a real-time function.
Otherwise, I'm all ears for any suggestion.
Thank you in advance!
Matt
You can use adb forward to forward a certain TCP port over USB.
That should allow you to open a socket between the Android device and your host PC over the USB link, which should give you fast enough speeds to send frames to the PC in real time and analyse them in OpenCV. You can just send the frames as bytes over the socket.
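For example, after running adb forward tcp:5555 tcp:5555 on the host (5555 is an arbitrary port), the Glass app listens on that port and the PC just connects to localhost:5555. A minimal sketch of the device side (shown as a plain main() for brevity; in a real app this would run in a background thread, and grabFrame is a stand-in for the camera capture):

    import java.io.DataOutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    // Device side: serve frames on the TCP port that adb forwards over USB.
    public class FrameServer {
        public static void main(String[] args) throws Exception {
            try (ServerSocket server = new ServerSocket(5555);
                 Socket pc = server.accept();
                 DataOutputStream out = new DataOutputStream(pc.getOutputStream())) {
                while (true) {
                    byte[] frame = grabFrame(); // stand-in for the camera capture
                    out.writeInt(frame.length); // length-prefix so the PC can split frames
                    out.write(frame);
                    out.flush();
                }
            }
        }
        private static byte[] grabFrame() { return new byte[0]; } // stub
    }

On the PC side, read the 4-byte length prefix, read that many bytes, and decode them into a cv::Mat for OpenCV.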
Hi gang,
I'm an embedded software engineer starting to work with Linux streaming solutions, and I have no experience with networking or smartphone OSes.
When we stream H.264 video to a PC (received via VLC), we see a latency of 1.2 seconds. That covers the whole chain:
sensor grabs data
data is sent over the network
VLC buffers and plays
After a while we found VLC's buffering control, but for H.264 streaming the minimum we can set is 400-500 ms. On Android phones, however, we were NOT able to find any player with very short (minimal) delay/buffering.
Can anyone suggest:
How is latency generally measured/profiled for video streaming to smart phones?
Do you have any network sniffing software on Android/iOS to recommend?
I saw in Apple's documentation that HTTP Live Streaming recommends 10-second segment durations. Any way to overcome this? (Is jailbreaking required to install a sniffing tool on iOS?)
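On question 1, one crude approach for the network leg is to stamp each packet with the sender's wall-clock time and compare it with the receiver's clock on arrival; this assumes both clocks are NTP-synchronized, and the port and payload layout below are made up:

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.nio.ByteBuffer;

    // Receiver: each packet carries the sender's clock in its first 8 bytes,
    // so per-packet delay = local time - embedded time (clocks must be synced).
    public class LatencyProbe {
        public static void main(String[] args) throws Exception {
            try (DatagramSocket socket = new DatagramSocket(5004)) { // arbitrary port
                byte[] buf = new byte[1500];
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                while (true) {
                    socket.receive(packet);
                    long sentMillis = ByteBuffer.wrap(packet.getData()).getLong();
                    System.out.println("one-way delay ~"
                            + (System.currentTimeMillis() - sentMillis) + " ms");
                }
            }
        }
    }

For end-to-end (glass-to-glass) latency, the usual low-tech method is to film a running millisecond clock next to its streamed image and read off the difference.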
I am trying to develop an Asterisk Android client. My preferred codec is GSM. I have downloaded the SIPDroid source code and some other helper projects. But since I am totally new to this area, I am not sure where to start.
Here is what I am trying to do at the start:
Record sound (see the sketch after this list)
Convert that sound to GSM RTP packet
Play that codec sound
Stream that GSM RTP Packet
Integrate SIP Session with the App
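For reference, step 1 at GSM's native rate might look like the sketch below (simplified: no error handling, the RECORD_AUDIO permission is assumed, and a real app would keep the recorder running instead of starting it per chunk):

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    public class MicCapture {
        // Capture one 20 ms chunk of 8 kHz mono PCM, the unit a GSM encoder expects.
        public byte[] captureChunk() {
            int sampleRate = 8000; // GSM full rate operates on 8 kHz audio
            int minBuf = AudioRecord.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    sampleRate, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, minBuf);
            byte[] pcm = new byte[320]; // 160 samples * 2 bytes = one 20 ms GSM frame
            recorder.startRecording();
            recorder.read(pcm, 0, pcm.length);
            recorder.stop();
            recorder.release();
            return pcm; // feed this PCM to a GSM encoder, then packetize as RTP
        }
    }

GSM full rate consumes exactly 160 samples (20 ms at 8 kHz) per frame, which is why the chunk above is 320 bytes of 16-bit PCM.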
I have one Android device (an HTC Wildfire). Is it possible to test these steps from the emulator to my handset over a Wi-Fi network?
Please give me appropriate steps/an algorithm I can follow to develop the client app.
It'll be great if someone can give me some tips on using the existing projects. Thanks.
I asked a friend with an Android phone to install SIPDroid, and it does support the GSM codec.
I am working on an Android app to play a video stream over the RTSP protocol, served by Darwin Streaming Server.
My problem is that the RTSP stream cannot be played using Android's VideoView/MediaPlayer via some specific WiFi hotspots, e.g. at my workplace. I searched around and found that Darwin Streaming Server uses UDP ports 6970-6999 for media data streaming, so the firewall may be the problem; yet the same stream can be played using VLC on a PC via the same WiFi hotspot.
What's the difference between the mechanisms that VLC and Android's built-in media framework (OpenCore) use? Is it possible for me to write my own RTSP client with live555's openRTSP source on Android? Any help will be much appreciated.
Bolton
I've used Wireshark to sniff my network, and I think I now know the difference:
When I use the Android emulator, I can see the client keeps sending UDP requests on ports 6970 and 6971 but gets no response. When using VLC, the RTP data is transferred over TCP via port 554.
So I think the problem is indeed caused by the firewall.
As you stated in your answer, VLC switches to interleaved RTP over RTSP when UDP fails.
This is why VLC continues to work.
You can use my library at https://net7mma.codeplex.com/ if you can use .NET, or you can use it as a reference for your own development.
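For illustration, "interleaved" means the client asks the server, in its RTSP SETUP request, to multiplex the RTP packets over the existing TCP connection instead of opening separate UDP ports (RFC 2326). A bare-bones sketch of such a request (the host and track name are made up, and a real client would first send OPTIONS/DESCRIBE and manage the session):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStreamWriter;
    import java.io.Writer;
    import java.net.Socket;

    public class InterleavedSetup {
        public static void main(String[] args) throws Exception {
            try (Socket socket = new Socket("example.com", 554); // placeholder host
                 Writer out = new OutputStreamWriter(socket.getOutputStream());
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(socket.getInputStream()))) {
                // interleaved=0-1: RTP on channel 0, RTCP on channel 1, both
                // carried inside this TCP connection rather than over UDP.
                out.write("SETUP rtsp://example.com/stream/track1 RTSP/1.0\r\n"
                        + "CSeq: 2\r\n"
                        + "Transport: RTP/AVP/TCP;interleaved=0-1\r\n"
                        + "\r\n");
                out.flush();
                String line;
                while ((line = in.readLine()) != null && !line.isEmpty()) {
                    System.out.println(line); // server echoes the chosen transport
                }
            }
        }
    }

This is the fallback a custom client built on live555's openRTSP could use behind a UDP-blocking firewall.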