Android RTP SDP X264 problem (Have audio, but no Video) - android

Android will play both the audio and video (AAC, H263) that I serve from my RTP server, but when I serve an AAC/H264 stream, I only get the audio and no video.
In the working scenario, Android issues a SETUP command for both track IDs, but with the H264 stream Android never issues the SETUP command for the second (video) track.
Is my SDP file correct? I believe the profile-level-id and sprop parameters are correct, as they are copied directly from the SPS and PPS NALs from the H264 encoder. The video is Baseline profile, level 2.1.
Is Android failing to respond to or recognise the second track?
If I stream the video file by itself with live555 it works fine, and I have compared the SDP file it produces with my own.
Any ideas?
Thanks
H264/AAC SDP file:
v=0
o=xxx IN IP4 192.168.13.43
s=live.3gp
u=http:///
e=admin#
c=IN IP4 0.0.0.0
b=AS:187
t=0 0
a=control:rtsp://192.168.13.43:555/live.3gp
a=isma-compliance:1,1.0,1
a=range:npt=0- 2630.336000
m=audio 0 RTP/AVP 97
a=rtpmap:97 MP4A-LATM/44100/2
a=control:rtsp://192.168.13.43:555/live.3gp/trackID=1
a=fmtp:97 profile-level-id=41; cpresent=0; config=400024203fc0
m=video 0 RTP/AVP 96
a=rtpmap:96 H264/90000
a=control:rtsp://192.168.13.43:555/live.3gp/trackID=2
a=cliprect:0,0,256,432
a=framesize:96 432-256
a=fmtp:96 packetization-mode=1; profile-level-id=42C015;sprop-parameter- sets=Njc0MkMwMTVGNDBEODQzNjAyMjAwMDAwMDMwMDIwMDAwMDAzMDNDMUUzMDY1NA==,NjhDRTA0NzI=
SDP file produced by live555 for the same video file which does display on Android:
v=0
o=- 1303401850159891 1 IN IP4 192.168.13.58
s=H.264 Video, streamed by the LIVE555 Media Server
i=live.3g
t=0 0
a=tool:LIVE555 Streaming Media v2011.01.19
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:H.264 Video, streamed by the LIVE555 Media Server
a=x-qt-text-inf:baseCasterCap.264
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:500
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;profile-level-id=42C015;sprop-parameter-sets=Z0LAFfQNhDYCIAAAAwAgAAADA8HjBlQ=,aM4Ecg==
a=control:track1

sprop-parameter-sets shouldn't have a tab or space in it (yours does; it may be a copy/paste bug).
Android (or the player being used there) may not support packetization-mode 1. Mode 0 is required; mode 1 is optional.
a=framesize and a=cliprect aren't standard for H.264, but they may not be a problem.
I assume port 0 is normal for your usage (since audio works)? In offer-answer, port 0 would be a rejected stream in an answer; in an offer it means a disabled stream.
I've seen implementations (I'm looking at YOU, Grandstream!) that insist on spaces after semicolons in the H264 fmtp line (they're wrong) - you have a mixture.
Is the "C0" in the profile-level-id correct? That adds constraints; try it without the constraints and see what the response is. (You can still send a more-constrained stream than the SDP indicates.)
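For reference, the three bytes of profile-level-id can be unpacked mechanically; this is a small sketch (class and method names are mine, field layout per RFC 6184):

```java
public class ProfileLevelId {
    // Unpack the three bytes of an H.264 profile-level-id (RFC 6184):
    // profile_idc, profile-iop (constraint flags), level_idc.
    static int[] parse(String plid) {
        return new int[] {
            Integer.parseInt(plid.substring(0, 2), 16), // profile_idc
            Integer.parseInt(plid.substring(2, 4), 16), // constraint flags
            Integer.parseInt(plid.substring(4, 6), 16)  // level_idc
        };
    }

    public static void main(String[] args) {
        int[] p = parse("42C015"); // the value from the question's SDP
        System.out.printf("profile_idc=%d flags=0x%02X level=%.1f%n",
                p[0], p[1], p[2] / 10.0);
        // -> profile_idc=66 flags=0xC0 level=2.1
        // 0x42 = 66 = Baseline; 0xC0 = constraint_set0 + constraint_set1 flags.
    }
}
```

So "42C015" is Baseline level 2.1 with both constraint flags set; dropping the constraints would give "420015".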

Thanks for your help Jesup, it was very much appreciated.
The problem was the sprop parameters; I noticed it when I copied and pasted the second SDP file for you.
The encoder I used to do the base64 calculation for my testing was for character data, not binary data. So a byte like 0x65 was being encoded as the two characters '6' and '5' rather than as the single byte ('e' in ASCII).
Does that make sense?
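The character-data-versus-binary mix-up can be reproduced in a few lines. A small sketch (class and helper names are mine; the hex string is the SPS behind the broken sprop value in the first SDP above):

```java
import java.util.Base64;

public class SpropFix {
    // SPS as a hex string, as copied from the encoder
    static final String SPS_HEX = "6742C015F40D84360220000003002000000303C1E30654";

    // convert a hex string to its raw bytes
    static byte[] hexToBytes(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++)
            out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        return out;
    }

    public static void main(String[] args) {
        Base64.Encoder b64 = Base64.getEncoder();
        // Wrong: base64 of the hex *characters* - what the broken SDP contained
        System.out.println(b64.encodeToString(SPS_HEX.getBytes()));
        // -> Njc0MkMwMTVGNDBEODQzNjAyMjAwMDAwMDMwMDIwMDAwMDAzMDNDMUUzMDY1NA==
        // Right: decode the hex to raw bytes first, then base64 those bytes
        System.out.println(b64.encodeToString(hexToBytes(SPS_HEX)));
        // -> Z0LAFfQNhDYCIAAAAwAgAAADA8HjBlQ= (matches live555's SDP)
    }
}
```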
Silly mistake on my part. Thanks again
Ian

Related

How to decode and play MPEG-TS stream on Android (188 byte TS packet)

I am making an application similar to a DVB broadcast TV player on Android. The data I receive is a series of MPEG-TS packets; each packet may be 188 bytes or 204 bytes. This is not HLS, and there are no m3u8 files. How can I decode these MPEG-TS input streams on Android and play them?
The received packets conform to the MPEG-TS encapsulation standard.

Receive multiple RTP, mix, output RTSP stream

I'm currently trying to receive multiple RTP audio streams, mix them, and output an RTSP stream using ffmpeg or ffserver.
The RTP audio stream is sent by Android's AudioStream.
Here is the code on the Android side.
AudioManager audio = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
audio.setMode(AudioManager.MODE_IN_COMMUNICATION);
audioGroup = new AudioGroup();
audioGroup.setMode(AudioGroup.MODE_ECHO_SUPPRESSION);
audioStream = new AudioStream(InetAddress.getByAddress(getLocalIPAddress()));
audioStream.setCodec(AudioCodec.PCMU);
audioStream.setMode(RtpStream.MODE_NORMAL);
audioStream.associate(InetAddress.getByName(SipStackAndroid.getRemoteIp()), REMOTE_PORT);
audioStream.join(audioGroup);
Then I prepare server side.
Here is ffserver.conf
HTTPPort 5555
HTTPBindAddress 0.0.0.0
RTSPPort 5454
RTSPBindAddress 0.0.0.0
MaxHTTPConnections 100
MaxClients 1000
MaxBandwidth 10000
CustomLog -
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 500M
</Feed>
<Stream live.wav>
Format rtp
Feed feed1.ffm
AudioBitRate 13
AudioChannels 1
AudioSampleRate 8000
AudioCodec pcm_mulaw
# AudioCodec libmp3lame
NoVideo
</Stream>
And here is ffserver or ffmpeg command.
ffserver -d -f ffserver.conf
ffmpeg -i "rtp://192.168.150.10:12345" -acodec auto http://127.0.0.1:5555/feed1.ffm
I can't work out how to receive multiple RTP streams or how to mix them.
Please give me some ideas, or links where I can find an answer.

Android Audio RTMP publish

My Android client captures real-time voice through the microphone, and I want to send the voice stream to the server over RTMP. I am trying to implement the client's transmission side; my idea is to convert the captured PCM to AAC (e.g. with ffmpeg) and then send it to the server, but the code is difficult to get right.
Waiting for your answer, thanks!
Use voaacenc to encode PCM to AAC; you can refer to AAC_E_SAMPLES.c in that project.
Use librtmp (rtmpdump.mplayerhq.hu/) to send the AAC packets; refer to this blog: www.codeman.net/2014/01/439.html. (Sorry, I cannot post more than 2 links.)
connect
send meta data
send aac spec data
send aac packets, the pts must increase
disconnect
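On the "pts must increase" point: each AAC-LC frame carries 1024 PCM samples, so once the sample rate is fixed the timestamp step is fixed too. A small sketch (class name is mine; 44100 Hz is an assumed rate):

```java
public class AacTimestamps {
    // Each AAC-LC frame carries 1024 PCM samples, so the pts for frame N
    // is N * 1024 / sampleRate seconds, here rounded to milliseconds.
    static long ptsMs(int frameIndex, int sampleRate) {
        return Math.round(frameIndex * 1024 * 1000.0 / sampleRate);
    }

    public static void main(String[] args) {
        for (int i = 0; i < 4; i++) {
            System.out.println("frame " + i + " pts=" + ptsMs(i, 44100) + " ms");
        }
        // pts values: 0, 23, 46, 70 ms - strictly increasing, as required
    }
}
```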

Android: Stream Camera Data and Write it to Server

I stream webcam data to my client.
I can see the data is arriving by listening on('data'). However, when I write it to a file I am not able to view it; it's probably garbage data or missing some headers, and VLC cannot play it.
My next step is to make it streamable to a browser in real time.
What am I doing wrong?
net = require('net');
fs = require('fs');

// Start a TCP server
net.createServer(function (socket) {
    console.log("client connected");
    var file = fs.createWriteStream("temp.mp4");
    socket.pipe(file, {end: false});
    socket.on('end', function () {
        console.log("ended");
    });
}).listen(5000);
I tested to see whether it really captured video output:
$ mediainfo temp.mp4
General
Complete name : temp.mp4
Format : H.263
Format version : H.263
File size : 126 KiB
Video
Format : H.263
Width : 0 pixels
Height : 0 pixels
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Compression mode : Lossy
And this is the following Android code for setting mediaRecorder (Assume socket is connected, no problem)
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
mediaRecorder.setVideoSize(320, 240);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
mediaRecorder.setOutputFile(pfd.getFileDescriptor());
mediaRecorder.setMaxDuration(5000);
mediaRecorder.setMaxFileSize(5000000);
There are a few open source projects that solve this problem, such as Spydroid (browser/VLC streaming) and Android IP Camera (browser streaming). Your implementation seems similar to Spydroid, so maybe you can adapt some of its code.
The central problem is that MediaRecorder is writing raw video frames into the socket. It needs to wait until the video is finished to write the headers, but they need to appear at the beginning of the file. Since the socket is not seekable, the headers can't be written at the correct location. The projects linked above deal with this problem by packetizing the stream into RTSP (Spydroid) or "streaming" a series of still images to the browser (Android IP Camera).
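The workaround those projects use can be sketched roughly like this (Android-only sketch, not runnable as-is; prepare()/start(), threading and error handling omitted): instead of handing MediaRecorder the TCP socket, hand it the write end of a pipe, read the output back, and packetize it before sending.

```java
// Sketch only - assumes Android's MediaRecorder / ParcelFileDescriptor APIs
ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
mediaRecorder.setOutputFile(pipe[1].getFileDescriptor());  // write end
InputStream in = new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);
// Read from `in` on a worker thread, locate the video NAL units, and
// packetize them (e.g. into RTP, as Spydroid does) instead of dumping
// the unfinished MP4 straight onto the socket.
```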

streaming from Android to VLC

I know similar questions have been asked but I have not been able to find the answer to my SPECIFIC problem.
I am attempting to create an RTSP/RTP video stream from the Android camera to a VLC player client. I have written a small RTSP server to handle all the setup, and VLC seems to like my parameters. However, after the PLAY command is issued and my app starts sending the video stream (via DatagramPackets), the VLC player does not receive any data.
I am using the jlibrtp library and setup my stream like this
sendSoc = new DatagramSocket(1238);
recSoc = new DatagramSocket(1239);
sess = new RTPSession(sendSoc, recSoc);
FakeClass fc = new FakeClass(); //This implements the RTPAppIntf but all the functions are empty
sess.RTPSessionRegister(fc, null, null);
sess.payloadType(96);
Participant p = new Participant("localhost",1236,1237);
sess.addParticipant(p);
These are the logs I see from the VLC player:
Opening connection to 192.168.1.221, port 1234...
[0xb1003790] main art finder debug: no art finder module matching "any" could be loaded
[0xb1003790] main art finder debug: TIMER module_need() : 6.331 ms - Total 6.331 ms / 1 intvls (Avg 6.331 ms)
[0x9f653e0] main playlist debug: art not found for rtsp://192.168.1.221:1234
...remote connection opened
Sending request: OPTIONS rtsp://192.168.1.221:1234 RTSP/1.0
CSeq: 2
User-Agent: LibVLC/2.0.1 (LIVE555 Streaming Media v2011.12.23)
Received 76 new bytes of response data.
Received a complete OPTIONS response:
RTSP/1.0 200 OK
CSeq: 2
Public: DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE
Sending request: DESCRIBE rtsp://192.168.1.221:1234 RTSP/1.0
CSeq: 3
User-Agent: LibVLC/2.0.1 (LIVE555 Streaming Media v2011.12.23)
Accept: application/sdp
Received 240 new bytes of response data.
Received a complete DESCRIBE response:
RTSP/1.0 200 OK
CSeq: 3
Content-Type: application/sdp
v=0
o=- 1343306778867 1343306778867 IN IP4 192.168.1.221
s=Droid Stream
i=Live Stream from Android Camera
t=1343306778873 0
m=video 1236/2 RTP/AVP 96
a=rtpmap:96 H264/9000
[0xb0101190] live555 demux debug: RTP subsession 'video/H264'
Sending request: SETUP rtsp://192.168.1.221:1234/ RTSP/1.0
CSeq: 4
User-Agent: LibVLC/2.0.1 (LIVE555 Streaming Media v2011.12.23)
Transport: RTP/AVP;unicast;client_port=1236-1237
Received 128 new bytes of response data.
Received a complete SETUP response:
RTSP/1.0 200 OK
CSeq: 4
Session: 1343306779273
Transport: RTP/AVP/UDP;unicast;client_port=1236-1237;server_port=1238-1239
[0xb5203c18] main input debug: selecting program id=0
[0xb0101190] live555 demux debug: setup start: 0.000000 stop:0.000000
Sending request: PLAY rtsp://192.168.1.221:1234 RTSP/1.0
CSeq: 5
User-Agent: LibVLC/2.0.1 (LIVE555 Streaming Media v2011.12.23)
Session: 1343306779273
Range: npt=0.000-
Received 71 new bytes of response data.
Received a complete PLAY response:
RTSP/1.0 200 OK
CSeq: 5
Session: 1343306779273
Range: npt=0.000-
.
.
[Snip]
.
.
[0xb5203c18] main input debug: `rtsp://192.168.1.221:1234' successfully opened
[0xb0101190] live555 demux warning: no data received in 10s. Switching to TCP
Sending request: TEARDOWN rtsp://192.168.1.221:1234 RTSP/1.0
CSeq: 6
User-Agent: LibVLC/2.0.1 (LIVE555 Streaming Media v2011.12.23)
Session: 1343306779273
So I don't know what is wrong. VLC should be listening on the Android device's port 1236, but it's not seeing the packets, so I don't know where it is listening. Can anyone tell me if this looks right?
I found out the issue: I was writing my packets to port 1236 on the Android device instead of port 1236 on the client device. The Transport parameter in the SETUP response, which states
Transport: RTP/AVP/UDP;unicast;client_port=1236-1237;server_port=1238-1239
is saying that the server (the Android phone) will send RTP packets from server port 1238 to client device port 1236. Likewise, RTCP communication occurs between server port 1239 and client device port 1237.
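That Transport line can be parsed mechanically; a small sketch (class and helper names are mine):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TransportParse {
    // Pull an "rtp-rtcp" port pair out of an RTSP Transport header value.
    static int[] ports(String transport, String key) {
        Matcher m = Pattern.compile(key + "=(\\d+)-(\\d+)").matcher(transport);
        if (!m.find()) throw new IllegalArgumentException("no " + key);
        return new int[] { Integer.parseInt(m.group(1)), Integer.parseInt(m.group(2)) };
    }

    public static void main(String[] args) {
        String t = "RTP/AVP/UDP;unicast;client_port=1236-1237;server_port=1238-1239";
        int[] client = ports(t, "client_port"); // where VLC listens: RTP 1236, RTCP 1237
        int[] server = ports(t, "server_port"); // where the phone sends from: 1238/1239
        System.out.println("send RTP to client port " + client[0]); // -> 1236
    }
}
```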
Did you try inspecting the session? Does it contain packets on port 1236? Is it possible that your FakeClass needs to implement functions that actually send data to VLC?
