Android: receive RTP/UDP audio stream from VLC/ffmpeg

I was searching for a good answer for half a day, but I am a beginner at this stuff and I would appreciate any help.
What I would like to achieve is to stream audio (mp3 files) with ffmpeg or VLC and receive it on an Android device via UDP/RTP.
This is what I was able to figure out myself so far:
1) There are the Android classes AudioStream and RtpStream. What I don't know is how to use them. For example, I create a stream via ffmpeg with: ffmpeg -re -i mymp3.mp3 -ar 8000 -acodec copy -f rtp rtp://192.168.0.100:5533, where 192.168.0.100 is the address of my Android device. Now I would like to receive it and play it.
I found something like this on Stack:
AudioStream audioStream;
AudioGroup audioGroup;
AudioCodec codec = AudioCodec.PCMU;
StrictMode.ThreadPolicy policy = new StrictMode.ThreadPolicy.Builder().permitNetwork().build();
StrictMode.setThreadPolicy(policy);
AudioManager audio = (AudioManager)getSystemService(AUDIO_SERVICE);
audio.setMode(AudioManager.MODE_IN_COMMUNICATION);
audioGroup = new AudioGroup();
audioGroup.setMode(AudioGroup.MODE_NORMAL);
InetAddress inetAddress;
try {
    inetAddress = InetAddress.getByName("163.11.62.208");
    audioStream = new AudioStream(inetAddress);
    audioStream.setMode(RtpStream.MODE_RECEIVE_ONLY);
    audioStream.setCodec(codec);
    InetAddress inetAddressRemote = InetAddress.getByName("163.11.169.206");
    audioStream.associate(inetAddressRemote, 5004);
    audioStream.join(audioGroup);
} catch (Exception e) {
    e.printStackTrace();
}
What is the first inetAddress, 163.11.62.208, and what is the second one, 163.11.169.206? Shouldn't I just give the address of the stream?
2) Can I only use streams in PCMU format? Can I stream mp3 files?
3) Is it even possible?

I've implemented a Cisco Jabber integration with our server and Android and had a similar setup.
audioStream = new AudioStream(inetAddress)
inetAddress (163.11.62.208) is the local network address of the Android device itself.
We get it using the following:
WifiManager wifiMgr = (WifiManager) context.getSystemService(Context.WIFI_SERVICE);
WifiInfo wifiInfo = wifiMgr.getConnectionInfo();
int ip = wifiInfo.getIpAddress();
String ipAddress = Formatter.formatIpAddress(ip);
Log.w(TAG, "ipAddress=" + ipAddress);
inetAddress = InetAddress.getByName(ipAddress);
There may be other ways; I'm not an Android developer.
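For instance, one alternative sketch (my addition, not part of the original answer) that enumerates the network interfaces instead of going through WifiManager, since Formatter.formatIpAddress is deprecated:

// Assumes java.net.NetworkInterface, java.net.Inet4Address, java.net.SocketException
// and java.util.Collections are imported.
InetAddress inetAddress = null;
try {
    // Pick the first non-loopback IPv4 address of any interface.
    outer:
    for (NetworkInterface nif : Collections.list(NetworkInterface.getNetworkInterfaces())) {
        for (InetAddress addr : Collections.list(nif.getInetAddresses())) {
            if (!addr.isLoopbackAddress() && addr instanceof Inet4Address) {
                inetAddress = addr;
                break outer;
            }
        }
    }
} catch (SocketException e) {
    e.printStackTrace();
}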
audioStream.associate(inetAddressRemote, 5004)
inetAddressRemote (163.11.169.206) is the remote address of the server from which you'll be sending audio to the Android device.
5004 is the port used to send audio to and from on both the Android and the server side.
Now there is a catch: make sure the local port you send audio from on the server side is also 5004. For example, a test audio stream:
ffmpeg -re -f lavfi -i aevalsrc="sin(400*2*PI*t)" -map 0:0 -c:a pcm_mulaw -b:a 64k -ar 8000 -f rtp rtp://163.11.62.208:5004?localrtpport=5004
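One more detail (my addition, not from the original answer): AudioStream binds to an even local port chosen at runtime, so instead of hard-coding 5004 on the Android side you can ask the stream where it is listening and point the ffmpeg command at that port:

int localRtpPort = audioStream.getLocalPort(); // even port picked by the RTP stack
Log.d("RTP", "stream audio to udp port " + localRtpPort);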

Related

Receive multiple RTP, mix, output RTSP stream

I'm currently trying to receive multiple RTP audio streams, mix them, and output an RTSP stream using ffmpeg or ffserver.
The RTP audio streams are sent by Android's AudioStream.
Here is the code on the Android side.
AudioManager audio = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
audio.setMode(AudioManager.MODE_IN_COMMUNICATION);
audioGroup = new AudioGroup();
audioGroup.setMode(AudioGroup.MODE_ECHO_SUPPRESSION);
audioStream = new AudioStream(InetAddress.getByAddress(getLocalIPAddress()));
audioStream.setCodec(AudioCodec.PCMU);
audioStream.setMode(RtpStream.MODE_NORMAL);
audioStream.associate(InetAddress.getByName(SipStackAndroid.getRemoteIp()), REMOTE_PORT);
audioStream.join(audioGroup);
Then I prepared the server side.
Here is ffserver.conf
HTTPPort 5555
HTTPBindAddress 0.0.0.0
RTSPPort 5454
RTSPBindAddress 0.0.0.0
MaxHTTPConnections 100
MaxClients 1000
MaxBandwidth 10000
CustomLog -
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 500M
</Feed>
<Stream live.wav>
Format rtp
Feed feed1.ffm
AudioBitRate 13
AudioChannels 1
AudioSampleRate 8000
AudioCodec pcm_mulaw
# AudioCodec libmp3lame
NoVideo
</Stream>
And here are the ffserver and ffmpeg commands.
ffserver -d -f ffserver.conf
ffmpeg -i "rtp://192.168.150.10:12345" -acodec auto http://127.0.0.1:5555/feed1.ffm
I can't work out how to receive multiple RTP streams or how to mix them.
Please give me some ideas and actual links where I can find an answer.
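One direction worth trying (a sketch on my part, not something I have tested against this exact setup): describe each incoming RTP stream in a small SDP file and let a single ffmpeg process mix them with the amix filter before feeding ffserver. A minimal SDP for one PCMU stream could look like this (address and port taken from the command above):

v=0
o=- 0 0 IN IP4 192.168.150.10
s=android1
c=IN IP4 192.168.150.10
t=0 0
m=audio 12345 RTP/AVP 0

With one such file per sender (android1.sdp and android2.sdp are placeholder names), the mixing command would be along the lines of:

ffmpeg -protocol_whitelist file,udp,rtp -i android1.sdp -protocol_whitelist file,udp,rtp -i android2.sdp -filter_complex amix=inputs=2 -ac 1 -ar 8000 http://127.0.0.1:5555/feed1.ffm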

Stream video from Raspberry Pi camera to Android app

I'm currently working on a project using a Raspberry Pi and a mobile (Android) device.
I have a problem sending data from the RPi camera to the Android app.
I'm using the picamera library in Python: https://picamera.readthedocs.io/en/release-1.13/recipes1.html#recording-to-a-network-stream .
My current code on the RPi looks something like this:
import socket
import time
import picamera
camera = picamera.PiCamera()
camera.resolution = (640, 480)
camera.framerate = 24
server_socket = socket.socket()
server_socket.bind(('0.0.0.0', 8000))
server_socket.listen(0)
# Accept a single connection and make a file-like object out of it
connection = server_socket.accept()[0].makefile('wb')
try:
camera.start_recording(connection, format='h264')
camera.wait_recording(60)
camera.stop_recording()
finally:
connection.close()
server_socket.close()
To receive the stream we can use tcp/h264://x.x.x.x:8000. It works on a PC when I use VLC.
On Android I tried VideoView and ExoPlayer, but the problem is the URI, because Android can't parse the tcp/h264 protocol.
When I try streaming with VLC:
raspivid -o - -t 99999 |cvlc -vvv stream:///dev/stdin --sout '#standard{access=http,mux=ts,dst=:8000}' :demux=h264
it works on Android if I pass a URL with the http:// prefix, but that stream does not come from my Python program.
It seems to me that I have two options:
1) On the Python side, use a different way to stream the video output.
2) Somehow handle the tcp/h264 protocol myself (probably using a socket and independently parsing the stream bytes into video). It is possible: https://github.com/ShawnBaker/RPiCameraViewer, but I am looking for a better (less low-level) solution.
You can stream it from Python easily; just use:
import subprocess
subprocess.Popen("raspivid -o - -t 99999 |cvlc -vvv stream:///dev/stdin --sout '#standard{access=http,mux=ts,dst=:8000}' :demux=h264", shell=True)
This launches it in a separate process, so it won't block your program.
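On the Android side, the resulting http:// URL can then be handed to a player as usual; a minimal sketch (the address is a placeholder for your Pi's IP):

VideoView videoView = (VideoView) findViewById(R.id.videoView);
videoView.setVideoURI(Uri.parse("http://192.168.1.50:8000")); // replace with the Pi's address
videoView.start();

Whether the MPEG-TS mux plays in VideoView depends on the device's built-in MediaPlayer; ExoPlayer tends to be more forgiving with TS streams.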

ffmpeg on android: playing MPEG2 TS udp multicast stream

I want to make an Android media player using ffmpeg (catch an MPEG2-TS multicast stream via the WiFi network and decode it).
I checked the following:
My iptime AP supports WiFi multicast (a wired PC sends the multicast stream, and a WiFi-connected PC can receive it).
My Android phone can receive the multicast stream via WiFi.
I wrote NDK socket code which joins the UDP multicast group and receives packets
(I added the multicast permission to AndroidManifest.xml).
The FFmpeg library is ported to Android and it can play a local media file.
But when I try to open the network stream with the FFmpeg library, avformat_open_input() fails.
gFormatCtx = avformat_alloc_context();
av_register_all();
avcodec_register_all();
avformat_network_init();
if(avformat_open_input(&gFormatCtx,"udp://#239.100.100.100:4000",NULL,NULL) != 0)
return -2;
This code always returns -2.
If I use the av_dict_set() API, which option should I use?
av_dict_set(&options, "udp_multicast", "mpegtsraw", 0);
Please let me know what I should check for the avformat_open_input() error.
Thanks.
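Two things worth checking (assumptions on my part, not confirmed answers): first, ffmpeg's udp protocol expects the bare multicast group in the URL, so udp://239.100.100.100:4000 rather than the VLC-style @ or # prefix; second, udp input options are usually passed in the URL query string rather than through av_dict_set() with a "udp_multicast" key. A sketch:

/* Bare group address; tuning options go in the query string. */
if (avformat_open_input(&gFormatCtx,
        "udp://239.100.100.100:4000?buffer_size=1048576&overrun_nonfatal=1",
        NULL, NULL) != 0)
    return -2; /* better: log the actual error code, e.g. via av_err2str() */

If the demuxer needs to be forced, av_find_input_format("mpegts") can be passed as the third argument of avformat_open_input().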

Reading Audio file in C and forwarding over bluetooth to play in Android Audio track

What I am trying to do: read a .wav file in C (Linux), forward the buffer data through a Bluetooth RFCOMM socket, receive the buffer in Android, and then hand the buffer to AudioTrack to play. (I need the Android application to play the audio stream.)
code :
1 - C code for RFCOMM socket creation: C code for rfcomm socket
2 - C code for forwarding data:
FILE *fp;
char buffer[1024];
fp = fopen("feelgood.wav","r"); //for audio track use reading .wav file
while(i=fread(buffer, sizeof(buffer),1, fp) > 0){
status=write(bluetooth_socket, buffer,strlen(buffer));
usleep(100000);
}
3 - Android code for reading from the socket is something like this:
//Audio Track initialization for Streaming
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,44100,AudioFormat.CHANNEL_OUT_MONO,AudioFormat.ENCODING_PCM_8BIT,10000, AudioTrack.MODE_STREAM);
track.play();
//Receiving data from the socket and feeding it to the track in a loop
byte[] buffer = new byte[1024];
int bytes;
while ((bytes = socket.getInputStream().read(buffer)) > 0) {
    track.write(buffer, 0, bytes);
}
Problem: I am not getting why AudioTrack is not playing properly (a hint of the music is heard, with a lot of noise). How can I get noise-free audio in the Android application with this approach? Is it an AudioTrack implementation problem or a buffer problem?
Related question: Receive audio via Bluetooth in Android, but I cannot follow the A2DP approach with Android as the sink.
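One likely culprit (my reading of the code above, not from the original post): the C side calls write(bluetooth_socket, buffer, strlen(buffer)), but strlen() stops at the first zero byte of binary audio data, and in the while condition i = fread(...) > 0 parses as i = (fread(...) > 0) because > binds tighter than =. A corrected sketch of the send loop (bluetooth_socket comes from the RFCOMM setup in step 1):

#include <stdio.h>
#include <unistd.h>

FILE *fp = fopen("feelgood.wav", "rb"); /* binary mode */
char buffer[1024];
size_t n;
while ((n = fread(buffer, 1, sizeof(buffer), fp)) > 0) {
    write(bluetooth_socket, buffer, n); /* send exactly the bytes read */
}
fclose(fp);

Note also that raw .wav bytes start with a header and typically carry 16-bit samples, while the AudioTrack above is configured for ENCODING_PCM_8BIT at 44100 Hz; the sample format has to match the file or the output will sound like noise.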

Android: Audio call using rtpstream

I want to develop an application that enables users to do real-time audio chat with each other. I am using RtpStream to implement this. Following is my code. I am using two phones to test my application. The port number of the audio stream is assigned randomly at run time, which means I have to send the port number from Phone 1 to Phone 2 at run time to establish a connection. The problem here is that the communication is only one-sided, i.e. Phone 1 can talk to Phone 2 but cannot hear Phone 2's reply. What should I do to make it two-sided? Also, is there any way to assign the port number of the AudioStream manually? Any help will be appreciated.
audioGroup = new AudioGroup();
audioGroup.setMode(AudioGroup.MODE_NORMAL);
audioStream = new AudioStream(InetAddress.getByAddress(MyIP));
PORT = audioStream.getLocalPort();
audioStream.setCodec(AudioCodec.PCMU);
audioStream.setMode(RtpStream.MODE_NORMAL);
audioStream.associate(InetAddress.getByAddress(ReceiverIP), PORT);
audioStream.join(audioGroup);
AudioManager Audio = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
Audio.setMode(AudioManager.MODE_IN_COMMUNICATION);
The right way to do this is to set up the RTP stream first, get the port number the stream is listening on, and then send that port in the SDP part of the SIP INVITE. Take a look at this example: https://github.com/Mobicents/restcomm-android-sdk/tree/master/Examples/JAIN%20SIP
I'm trying to accomplish the same thing. One possibility is for user1 to share his IP with user2. Both users create an AudioGroup and an AudioStream, and each AudioStream is associated with the other user's endpoint and then joined to the local AudioGroup.
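To make that concrete, a minimal sketch of the symmetric setup (myLocalIp, peerIp, and peerPort are placeholders; each phone must learn the other's port out of band, e.g. via the SIP/SDP exchange mentioned above):

try {
    audioGroup = new AudioGroup();
    audioGroup.setMode(AudioGroup.MODE_NORMAL);
    audioStream = new AudioStream(InetAddress.getByName(myLocalIp));
    int localPort = audioStream.getLocalPort(); // advertise this port to the peer
    audioStream.setCodec(AudioCodec.PCMU);
    audioStream.setMode(RtpStream.MODE_NORMAL);
    // Associate with the port the PEER advertised, not with our own localPort.
    audioStream.associate(InetAddress.getByName(peerIp), peerPort);
    audioStream.join(audioGroup);
} catch (Exception e) {
    e.printStackTrace();
}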
