I'm currently trying to receive multiple RTP audio streams, mix them, and output an RTSP stream using ffmpeg or ffserver.
The RTP audio streams are sent by Android's AudioStream.
Here is the Android-side code:
AudioManager audio = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
audio.setMode(AudioManager.MODE_IN_COMMUNICATION);
// Route the RTP stream through an AudioGroup with echo suppression
audioGroup = new AudioGroup();
audioGroup.setMode(AudioGroup.MODE_ECHO_SUPPRESSION);
// Bind the RTP socket to the device's local address and send G.711 mu-law
audioStream = new AudioStream(InetAddress.getByAddress(getLocalIPAddress()));
audioStream.setCodec(AudioCodec.PCMU);
audioStream.setMode(RtpStream.MODE_NORMAL);
// Associate with the server's address/port and start by joining the group
audioStream.associate(InetAddress.getByName(SipStackAndroid.getRemoteIp()), REMOTE_PORT);
audioStream.join(audioGroup);
Then I prepare the server side.
Here is my ffserver.conf:
HTTPPort 5555
HTTPBindAddress 0.0.0.0
RTSPPort 5454
RTSPBindAddress 0.0.0.0
MaxHTTPConnections 100
MaxClients 1000
MaxBandwidth 10000
CustomLog -
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 500M
</Feed>
<Stream live.wav>
Format rtp
Feed feed1.ffm
AudioBitRate 13
AudioChannels 1
AudioSampleRate 8000
AudioCodec pcm_mulaw
# AudioCodec libmp3lame
NoVideo
</Stream>
And here are the ffserver and ffmpeg commands:
ffserver -d -f ffserver.conf
ffmpeg -i "rtp://192.168.150.10:12345" -acodec auto http://127.0.0.1:5555/feed1.ffm
I can't figure out how to receive multiple RTP streams or how to mix them.
Please give me some ideas, or links to where I can find an answer.
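For the mixing part, one approach (a sketch, untested against this exact setup; the ports 12345 and 12347 for the two senders are assumptions) is to describe each incoming PCMU stream with its own m= line in an SDP file, say mix.sdp, and let ffmpeg's amix filter combine them before feeding ffserver:

v=0
o=- 0 0 IN IP4 127.0.0.1
s=audio mix
c=IN IP4 0.0.0.0
t=0 0
m=audio 12345 RTP/AVP 0
a=rtpmap:0 PCMU/8000
m=audio 12347 RTP/AVP 0
a=rtpmap:0 PCMU/8000

ffmpeg -protocol_whitelist file,udp,rtp -i mix.sdp \
  -filter_complex "[0:a:0][0:a:1]amix=inputs=2[mixed]" \
  -map "[mixed]" -acodec pcm_mulaw -ar 8000 -ac 1 \
  http://127.0.0.1:5555/feed1.ffm

(-protocol_whitelist is only required by newer ffmpeg builds; older ones accept the SDP input directly.)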
Related
I am building an application similar to a DVB broadcast TV player on Android. The data I receive is a series of MPEG-TS packets; each packet may be 188 bytes or 204 bytes. This is not HLS, and there are no m3u8 files. How can I decode these MPEG-TS input streams on Android and play them?
The received data packets conform to the MPEG-TS encapsulation standard.
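Whatever demuxer is used in the end (ExoPlayer's TsExtractor is one option), the first step is pinning down the packet framing: every MPEG-TS packet starts with the sync byte 0x47, and a 204-byte packet is a 188-byte packet followed by 16 bytes of Reed-Solomon parity that can be stripped. A sketch, assuming the raw packets arrive back-to-back in a byte array:

// Detect whether a TS buffer uses 188- or 204-byte packets by
// checking for the 0x47 sync byte at a fixed stride.
static int detectTsPacketSize(byte[] buf) {
    for (int size : new int[] {188, 204}) {
        if (buf.length < size * 3) continue;
        boolean aligned = true;
        for (int i = 0; i < 3 && aligned; i++) {
            aligned = (buf[i * size] == 0x47);
        }
        if (aligned) return size;
    }
    return -1; // not aligned to a packet boundary
}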
An Android client captures real-time voice from the microphone and sends the voice stream to a server over RTMP. I want to implement the client's sending side. My idea is to use ffmpeg to convert the captured PCM to AAC and then send it to the server, but I am finding the code difficult to get right.
Waiting for your answer. Thanks!
Use voaacenc to encode the PCM to AAC; you can refer to AAC_E_SAMPLES.c in that project.
Use librtmp (rtmpdump.mplayerhq.hu) to send the AAC packets; see this blog for reference: www.codeman.net/2014/01/439.html. (Sorry, I cannot post more than 2 links.) The flow is:
1) connect
2) send metadata
3) send the AAC spec data (sequence header)
4) send the AAC packets; the pts must increase
5) disconnect
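If building voaacenc is too heavy, Android's own MediaCodec can also do the PCM-to-AAC step (API 16+); a minimal configuration sketch, with sample rate, channel count and bitrate as assumed values:

// Configure an AAC (LC) encoder; 44100 Hz mono at 64 kbps are assumptions.
MediaFormat format = MediaFormat.createAudioFormat("audio/mp4a-latm", 44100, 1);
format.setInteger(MediaFormat.KEY_AAC_PROFILE,
        MediaCodecInfo.CodecProfileLevel.AACObjectLC);
format.setInteger(MediaFormat.KEY_BIT_RATE, 64000);

MediaCodec encoder = MediaCodec.createEncoderByType("audio/mp4a-latm");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();

// Feed AudioRecord PCM into the encoder's input buffers and drain its
// output buffers. The first output flagged BUFFER_FLAG_CODEC_CONFIG is
// the AAC spec data (step 3 above); every later buffer is an AAC frame
// to send with an increasing pts (step 4).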
I was searching for a good answer for half a day, but I am a beginner at this and would appreciate any help.
What I would like to achieve is to stream audio (mp3 files) with ffmpeg or VLC and receive it on an Android device via UDP/RTP.
This is what I have been able to figure out myself so far:
1) There are the Android classes AudioStream and RtpStream. What I don't know is how to use them. For example, I create a stream via ffmpeg with: ffmpeg -re -i mymp3.mp3 -ar 8000 -acodec copy -f rtp rtp://192.168.0.100:5533, where 192.168.0.100 is the address of my Android device. Now I would like to receive it and play it.
I found something like this on Stack Overflow:
AudioStream audioStream;
AudioGroup audioGroup;
AudioCodec codec = AudioCodec.PCMU;

// Allow network calls on the main thread (for this demo only)
StrictMode.ThreadPolicy policy = new StrictMode.ThreadPolicy.Builder().permitNetwork().build();
StrictMode.setThreadPolicy(policy);

AudioManager audio = (AudioManager) getSystemService(AUDIO_SERVICE);
audio.setMode(AudioManager.MODE_IN_COMMUNICATION);
audioGroup = new AudioGroup();
audioGroup.setMode(AudioGroup.MODE_NORMAL);

InetAddress inetAddress;
try {
    // Local address the RTP socket binds to
    inetAddress = InetAddress.getByName("163.11.62.208");
    audioStream = new AudioStream(inetAddress);
    audioStream.setMode(RtpStream.MODE_RECEIVE_ONLY);
    audioStream.setCodec(codec);
    // Remote peer the stream is associated with
    InetAddress inetAddressRemote = InetAddress.getByName("163.11.169.206");
    audioStream.associate(inetAddressRemote, 5004);
    audioStream.join(audioGroup);
} catch (Exception e) {
    e.printStackTrace();
}
What is the first inetAddress, 163.11.62.208, and what is the second one, 163.11.169.206? Shouldn't I just give the address of a stream?
2) Can I submit only streams in PCMU format? Can I stream mp3 files?
3) Is it even possible?
I've implemented a Cisco Jabber integration with our server and Android and had a similar setup.
audioStream = new AudioStream(inetAddress)
inetAddress (163.11.62.208) is the local network address of the Android device itself.
We get it using the following:
// Read the device's current Wi-Fi IPv4 address
WifiManager wifiMgr = (WifiManager) context.getSystemService(Context.WIFI_SERVICE);
WifiInfo wifiInfo = wifiMgr.getConnectionInfo();
int ip = wifiInfo.getIpAddress();
// Formatter.formatIpAddress is deprecated, but fine for IPv4
String ipAddress = Formatter.formatIpAddress(ip);
Log.w(TAG, "ipAddress=" + ipAddress);
inetAddress = InetAddress.getByName(ipAddress);
There may be other ways; I'm not an Android developer.
audioStream.associate(inetAddressRemote, 5004)
inetAddressRemote (163.11.169.206) is the address of the remote server that will be sending audio to the Android device.
5004 is the port used to send and receive audio on both the Android and the server side.
Now there is a catch: make sure the local port the server sends audio from is also 5004. For example, a test audio stream:
ffmpeg -re -f lavfi -i aevalsrc="sin(400*2*PI*t)" -map 0:0 -c:a pcm_mulaw -b:a 64k -ar 8000 -f rtp rtp://163.11.62.208:5004?localrtpport=5004
I stream webcam data from an Android device to my server.
I can see the data arriving by listening to on('data'). However, when I write it out I am not able to view it; it's probably garbage data or missing some headers. VLC cannot play it.
My next step is to make it streamable to a browser in real time.
What am I doing wrong?
const net = require('net');
const fs = require('fs');

// Start a TCP server and write whatever the phone sends into a file
net.createServer(function (socket) {
    console.log("client connected");
    const file = fs.createWriteStream("temp.mp4");
    socket.pipe(file, { end: false });
    socket.on('end', function () {
        console.log("ended");
    });
}).listen(5000);
I tested to see if it really captured the video output:
$ mediainfo temp.mp4
General
Complete name : temp.mp4
Format : H.263
Format version : H.263
File size : 126 KiB
Video
Format : H.263
Width : 0 pixels
Height : 0 pixels
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Compression mode : Lossy
And here is the Android code for setting up the MediaRecorder (assume the socket is connected; that part is not the problem):
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
mediaRecorder.setVideoSize(320, 240);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
// Write the recorder's output directly into the connected TCP socket
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
mediaRecorder.setOutputFile(pfd.getFileDescriptor());
mediaRecorder.setMaxDuration(5000);     // ms
mediaRecorder.setMaxFileSize(5000000);  // bytes
There are a few open source projects that solve this problem, such as Spydroid (browser/VLC streaming) and Android IP Camera (browser streaming). Your implementation seems similar to Spydroid's, so maybe you can adapt some of its code.
The central problem is that MediaRecorder writes raw video frames into the socket. It needs to wait until the recording is finished to write the MP4 headers, but they have to appear at the beginning of the file. Since the socket is not seekable, the headers can't be written at the correct location. The projects linked above deal with this by packetizing the stream into RTSP (Spydroid) or by "streaming" a series of still images to the browser (Android IP Camera).
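For instance, Spydroid-style code sidesteps the unseekable socket by pointing MediaRecorder at a local pipe, reading the bytes back, and packetizing them itself. A minimal sketch of that plumbing (the actual RTP packetization, which is the hard part, is omitted):

// Have MediaRecorder write into a pipe we can read back,
// instead of directly into the TCP socket.
ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
mediaRecorder.setOutputFile(pipe[1].getFileDescriptor()); // write end

final InputStream in = new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);
new Thread(new Runnable() {
    public void run() {
        byte[] buf = new byte[4096];
        try {
            int n;
            while ((n = in.read(buf)) != -1) {
                // skip the MP4 header, locate H.264 NAL units in buf[0..n)
                // and packetize them (e.g. into RTP) here
            }
        } catch (IOException e) { /* recorder stopped */ }
    }
}).start();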
Android will play both the audio and video (AAC, H.263) that I serve it from my RTP server, but when I serve an AAC/H.264 stream, I only get the audio and no video.
In the working scenario, Android issues a SETUP command for both track IDs, but with H.264, Android never issues the SETUP command for the second (video) track.
Is my SDP file correct? I believe the profile-level-id and sprops are correct, as they are copied directly from the SPS and PPS NALs produced by the H.264 encoder. The video is Baseline profile, level 2.1.
Is Android failing to respond to, or to recognise, the second track?
If I stream the video file by itself with live555 it works fine, and I have compared the SDP file it produces with my own.
Any ideas?
Thanks
H264/AAC SDP file:
v=0
o=xxx IN IP4 192.168.13.43
s=live.3gp
u=http:///
e=admin#
c=IN IP4 0.0.0.0
b=AS:187
t=0 0
a=control:rtsp://192.168.13.43:555/live.3gp
a=isma-compliance:1,1.0,1
a=range:npt=0- 2630.336000
m=audio 0 RTP/AVP 97
a=rtpmap:97 MP4A-LATM/44100/2
a=control:rtsp://192.168.13.43:555/live.3gp/trackID=1
a=fmtp:97 profile-level-id=41; cpresent=0; config=400024203fc0
m=video 0 RTP/AVP 96
a=rtpmap:96 H264/90000
a=control:rtsp://192.168.13.43:555/live.3gp/trackID=2
a=cliprect:0,0,256,432
a=framesize:96 432-256
a=fmtp:96 packetization-mode=1; profile-level-id=42C015;sprop-parameter- sets=Njc0MkMwMTVGNDBEODQzNjAyMjAwMDAwMDMwMDIwMDAwMDAzMDNDMUUzMDY1NA==,NjhDRTA0NzI=
The SDP file produced by live555 for the same video file, which does display on Android:
v=0
o=- 1303401850159891 1 IN IP4 192.168.13.58
s=H.264 Video, streamed by the LIVE555 Media Server
i=live.3g
t=0 0
a=tool:LIVE555 Streaming Media v2011.01.19
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:H.264 Video, streamed by the LIVE555 Media Server
a=x-qt-text-inf:baseCasterCap.264
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:500
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;profile-level-id=42C015;sprop-parameter-sets=Z0LAFfQNhDYCIAAAAwAgAAADA8HjBlQ=,aM4Ecg==
a=control:track1
sprop-parameter-sets shouldn't have a tab or space in it (that may be a copy/paste bug).
Android (or the player being used there) may not support packetization-mode 1. Mode 0 is required; mode 1 is optional.
a=framesize and a=cliprect aren't standard H.264 SDP attributes, but they may not be a problem.
I assume port 0 is normal for your usage (since audio works)? In offer/answer, port 0 would be a rejected stream in an answer; in an offer it means a disabled stream.
I've seen implementations (I'm looking at YOU, Grandstream!) that insist on spaces after semicolons in the H264 fmtp line (they're wrong) - you have a mixture.
Is the "C0" in the profile-level-id correct? That adds constraints; try it without the constraints and see what the response is. (You can still send a more constrained stream than the SDP indicates.)
Thanks for your help Jesup, it was very much appreciated.
The problem was the sprop parameters; I noticed it when I copied and pasted the second SDP file for you.
The encoder I used to do the Base64 calculation for my testing was meant for character data, not binary data. So "65" was being interpreted as the character '6' and the character '5' and encoded as text, rather than as the single byte value 65 ('A' in ASCII, I guess).
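To illustrate with the actual values (the hex below is recovered from the two SDP files above; java.util.Base64 assumed):

// The raw SPS bytes, written out as hex
String hex = "6742C015F40D84360220000003002000000303C1E30654";
byte[] sps = new byte[hex.length() / 2];
for (int i = 0; i < sps.length; i++) {
    sps[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
}
// Wrong: Base64 over the ASCII hex spelling of the SPS
// -> "Njc0MkMwMTVG..." (the broken sprop value)
String wrong = Base64.getEncoder().encodeToString(hex.getBytes(StandardCharsets.US_ASCII));
// Right: Base64 over the raw bytes
// -> "Z0LAFfQNhDYCIAAAAwAgAAADA8HjBlQ=" (as in the live555 SDP)
String right = Base64.getEncoder().encodeToString(sps);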
Does that make sense?
Silly mistake on my part. Thanks again
Ian