Voice call between Android client and (JMF) PC client - android

I'm working on a voice-call project between Android and a PC. I use the JMF library for the PC client and the standard Android APIs to set up a voice call between them. I chose JMF because it supports the RTP protocol. My problem is that the PC client can understand the packets sent from the Android client, but not vice versa.
I customized code from the SipDroid application and saw that only two codecs are used: PCMA and PCMU. I'm not well versed in audio/video codecs, so my question is whether the JMF library supports those codecs (PCMA and PCMU). I searched the Internet, and some people say that PCMA/PCMU is the same as ALAW/ULAW, but I'm not sure that's right.
Does anyone have experience on this?

JMF supports u-law, which RTP names PCMU.
And yes, PCMU/PCMA are the same as u-law/a-law (the two companding variants of G.711).
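Since PCMU is just G.711 µ-law, it can help to see the encoding itself when debugging what each side puts on the wire. Below is a minimal sketch of the standard µ-law companding routine in plain Java (the class and method names are my own, not part of JMF or Android):

```java
/**
 * Minimal G.711 mu-law (PCMU) encoder sketch.
 * Each 16-bit linear PCM sample maps to a single companded byte.
 */
public final class G711 {

    private static final int BIAS = 0x84;  // 132, standard mu-law bias

    /** Encodes one 16-bit linear PCM sample to one mu-law byte. */
    public static byte linearToUlaw(int pcm16) {
        int sign = (pcm16 >> 8) & 0x80;        // sign bit of the 16-bit sample
        if (sign != 0) pcm16 = -pcm16;
        if (pcm16 > 32635) pcm16 = 32635;      // clip so adding the bias cannot overflow
        pcm16 += BIAS;

        // Find the segment (exponent): position of the most significant set bit
        int exponent = 7;
        for (int mask = 0x4000; (pcm16 & mask) == 0 && exponent > 0; mask >>= 1) {
            exponent--;
        }
        int mantissa = (pcm16 >> (exponent + 3)) & 0x0F;

        // Mu-law bytes are transmitted inverted
        return (byte) ~(sign | (exponent << 4) | mantissa);
    }
}
```

One byte per sample means a 20 ms frame at 8 kHz is exactly 160 bytes, which is the typical RTP payload size both SipDroid and JMF exchange for PCMU.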

Related

Android 2.3.4 RTP Server and Client

I'm trying to implement an RTP server to send voice from the microphone to another phone, and an RTP client to play the audio from the other phone on mine.
If you know of a library, a demo, or any pointer, please let me know.
Additionally, I also need an RTP server application to test against.
UPDATE
Suppose I can get a URL like this: RTP://192.168.43.123:5678. How do I stream it on Android 2.3.4?
Thanks a lot.
Note: I must use Android 2.3.4 :(
I think you should have a look at this library: https://github.com/fyhertz/libstreaming
It provides an RTP server and several codecs, but unfortunately it requires Android 4.0 and above.
There is another library which works with API level 8, but it doesn't provide RTP: https://github.com/Teaonly/android-eye
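If you are stuck on 2.3.4 without a library, note that the RTP framing itself (RFC 3550) is simple enough to build by hand and send over a plain DatagramSocket. A hedged sketch of the 12-byte fixed header, with class and field names of my own invention:

```java
import java.nio.ByteBuffer;

/** Sketch of an RTP packetizer per the RFC 3550 fixed header (names are illustrative). */
public final class RtpPacketizer {

    private final byte payloadType;  // e.g. 0 = PCMU, 8 = PCMA
    private final int ssrc;          // random per-stream source identifier
    private short seq = 0;
    private int timestamp = 0;

    public RtpPacketizer(byte payloadType, int ssrc) {
        this.payloadType = payloadType;
        this.ssrc = ssrc;
    }

    /** Wraps one frame of encoded audio in a 12-byte RTP header (network byte order). */
    public byte[] packetize(byte[] payload, int samplesPerFrame) {
        ByteBuffer buf = ByteBuffer.allocate(12 + payload.length);
        buf.put((byte) 0x80);                  // V=2, P=0, X=0, CC=0
        buf.put((byte) (payloadType & 0x7F));  // M=0, 7-bit payload type
        buf.putShort(seq++);                   // sequence number, +1 per packet
        buf.putInt(timestamp);                 // timestamp in sampling-clock units
        buf.putInt(ssrc);
        buf.put(payload);
        timestamp += samplesPerFrame;          // e.g. 160 for a 20 ms frame at 8 kHz
        return buf.array();
    }
}
```

Each returned array can be sent as one UDP datagram; the receiver reorders by sequence number and schedules playback by timestamp.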

RTP-Server in Android 4.0 and above

Working on Android 4.0+.
I am analyzing ways to live-stream my camera video to a Windows PC using RTP, encoded as MPEG-2.
Is there a readily available "RTP server" in Android 4.0+?
Is the following true: "The Android platform lacks support for streaming protocol, which makes it difficult to stream live audio / video to Android enabled devices." (extracted from a website)
So far I have tried ffserver from the ffmpeg libraries, but the frame rate is below 5 FPS, which is far too slow. Has anyone explored another solution with a higher frame rate?
Has anybody tried using Stagefright for the same purpose? That is, capturing raw data from the camera, sending it to the Stagefright framework for encoding, and then streaming it using RTP?
Many thanks.
The answers to your questions are as below. Though the links are related to Android 4.2.2, the same is true for Android 4.0 also.
Yes, there is an RTP transmitter available. You could look at the MyTransmitter example as a starting point, or consider using the standard recorder as in startRTPRecording.
You can stream data via RTP from an Android device to an external sink, or you could have a different use case as in Miracast, a.k.a. Wi-Fi Display. However, streaming from one Android device to another through Wi-Fi Direct is still not completely enabled; the latter statement mainly applies to the Miracast scenario.
You can use the standard Android software stack, which is capable of high-resolution recording and transmission. This mainly depends on the underlying hardware, as the overhead added by the software stack is not very high.
Yes. This is already answered in Q1 above.

Advice about streaming live video to android/ios/pc

I would like some advice about the best way to stream a video-only live stream from a server to:
Android (>4.0 is ok)
PC with web-browser
iOS
I would like to keep latency as low as 1/2 second.
I can use:
Flash: works on PC, but not on iOS and not on Android (works only on some tablets)
HLS: not good because of its latency
proprietary library: it should work, but I would have to implement it everywhere
RTSP: works only on Android
Any other way? Is a proprietary library the way to go?
I'm working on Linux, but I'm mainly interested in "use this technology" rather than "use this code".
Not sure, but you can try HTTP streaming of MP4/3GP files from a web server. Both Android and iOS support HTTP streaming, but you need to enable progressive download.
Please specify which OS you want to run your server on.
For Windows, you can use the following binary to relocate the moov atom to the beginning of the media file, which enables progressive download:
http://notboring.org/devblog/2009/07/qt-faststartexe-binary-for-windows/
Let us know your progress.
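If you want to check whether a file is already progressive-download ready (moov atom before mdat) before running qt-faststart on it, you can walk the top-level MP4 boxes, since each starts with a 32-bit size and a 4-character type. A small sketch (class name is mine):

```java
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

/** Checks whether an MP4/3GP stream has 'moov' before 'mdat' (progressive-download friendly). */
public final class MoovCheck {

    public static boolean isFastStart(InputStream in) throws IOException {
        DataInputStream d = new DataInputStream(in);
        while (true) {
            int size;
            try {
                size = d.readInt();              // 32-bit box size (includes the 8-byte header)
            } catch (EOFException eof) {
                return false;                    // neither moov nor mdat found
            }
            byte[] type = new byte[4];
            d.readFully(type);
            String box = new String(type, StandardCharsets.US_ASCII);
            if (box.equals("moov")) return true;
            if (box.equals("mdat")) return false;

            long toSkip = size - 8L;
            if (size == 1) {                     // 64-bit "largesize" variant
                toSkip = d.readLong() - 16L;
            }
            while (toSkip > 0) {
                long skipped = d.skip(toSkip);
                if (skipped <= 0) return false;  // truncated file
                toSkip -= skipped;
            }
        }
    }
}
```

If this returns false for your file, relocating the moov atom (e.g. with qt-faststart) should make HTTP progressive download work.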
You can run an FFmpeg server (ffserver) for live broadcast. It gives you various options; enable or disable them in its configuration file, located at /etc/ffserver.conf.
Detailed documentation is at
http://ffmpeg.org/ffserver.html
RTSP might be the way to go, but that 1/2-second latency might be hard to reach.
I guess for video only, and if you don't buffer at all, this may work for iOS anyway:
https://github.com/mooncatventures-group/FFPlayer-tests
Android supports RTSP, but its implementation is not very good.
You can compile ffmpeg for Android and write a simple player using OpenGL. I can't share the code because we did it for a client, but it's not too difficult.

Audio/Video Conferencing Application in Android

I have to develop an Android application for audio/video conferencing. What is the most efficient way of implementing this? During my research, I came across Android's SIP API. Can it be used to implement audio as well as video conferencing? And if yes, what should I use to stream the video in real time? Should I use an RTSP library for this?
Please Guide me.
Thanks,
Rupesh
For my practical project I used Spydroid, which uses RTSP without SDP. You can customize it for audio-only use. I prefer Spydroid because it is pure Java: it reads camera packets, writes them to a local (Linux) socket, and its RTSP server reads them back from there.
On the other hand, if I'm not mistaken, SIP stacks involve C/C++ code too.

Camera streaming using RTP from Android to PC

I'd like to write an Android application that streams the camera to a PC (H.263, MPEG-4). I found some libraries: sipandroid, jlibrtp.
SIPandroid:
RTP packets are streamed (Wireshark captures them fine on the PC), but VLC can't play them.
Jlibrtp:
The API is shady, and the stream is not played correctly by VLC.
Maybe there are adaptations of these libraries that make them work for camera streaming, or other libraries with a clean API and samples?
Thanks for your answer.
VLC has built-in support for RTP, and as @Lukas said, the network interfaces are likely the problem on the VLC side. If you stream everything to one port, and listen on that port, you will at least get something. You can also look into the RTP packets to see if they are well formed.
VLC itself uses the LiveMedia library, so you may be able to use that.
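To "look into the RTP packets" as suggested above, a quick well-formedness check on each received datagram helps before blaming the player. A minimal sketch against the RFC 3550 fixed header (class and method names are mine):

```java
/** Tiny sanity checks on a received datagram against the RFC 3550 fixed header. */
public final class RtpSanity {

    /** True if the datagram could plausibly be an RTP packet. */
    public static boolean looksLikeRtp(byte[] pkt, int len) {
        if (len < 12) return false;        // the fixed RTP header is 12 bytes
        int version = (pkt[0] >> 6) & 0x03;
        return version == 2;               // RTP is always version 2
    }

    /** Payload type field (e.g. 0 = PCMU, 8 = PCMA, 96+ = dynamic). */
    public static int payloadType(byte[] pkt) {
        return pkt[1] & 0x7F;
    }
}
```

If the version or payload type is wrong here, VLC will silently drop the stream no matter what the SDP says, so this is worth checking before tweaking player settings.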
