I want to capture real-time voice from the microphone on an Android client and send the audio stream to a server over RTMP. I want to implement the client-side transmission; my idea is to use FFmpeg to convert the PCM from AudioTrack to AAC and then send it to the server, but I'm finding the code hard to implement.
Waiting for your answer, thanks!
Use voaacenc to encode the PCM to AAC; you can refer to AAC_E_SAMPLES.c in that project.
Use librtmp (rtmpdump.mplayerhq.hu/) to send the AAC packets. See this blog for reference: www.codeman.net/2014/01/439.html. (Sorry, I cannot post more than 2 links.) The flow is:
1. connect
2. send metadata
3. send the AAC sequence header (the AudioSpecificConfig)
4. send the AAC packets; the pts must keep increasing
5. disconnect
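To make the order concrete, here is a rough Java sketch of that sequence. RtmpSender and AacEncoder are hypothetical placeholders (in practice librtmp would be driven through JNI and voaacenc would produce the frames); the 0xAF 0x00 / 0xAF 0x01 prefixes and the 0x12 0x10 AudioSpecificConfig are the standard FLV/RTMP audio-tag layout for AAC-LC, 44.1 kHz, stereo:

import java.io.IOException;

// RtmpSender and AacEncoder are hypothetical interfaces standing in for a JNI
// wrapper around librtmp and for the voaacenc output, respectively.
interface RtmpSender {
    void connect(String url) throws IOException;
    void sendMetaData(int sampleRate, int channels) throws IOException;
    void sendAudioPacket(byte[] payload, long ptsMillis) throws IOException;
    void disconnect();
}

interface AacEncoder {
    byte[] readFrame() throws IOException; // one raw AAC frame, or null when finished
}

public class AacRtmpPublisher {
    public void publish(RtmpSender rtmp, AacEncoder encoder) throws IOException {
        rtmp.connect("rtmp://example.com/live/stream");      // 1. connect (placeholder URL)

        rtmp.sendMetaData(44100, 2);                          // 2. send metadata

        // 3. AAC sequence header: 0xAF 0x00 + AudioSpecificConfig
        //    (0x12 0x10 = AAC-LC, 44.1 kHz, stereo)
        rtmp.sendAudioPacket(new byte[]{(byte) 0xAF, 0x00, 0x12, 0x10}, 0);

        // 4. raw AAC frames: 0xAF 0x01 + frame data; the pts must keep increasing
        long pts = 0;
        byte[] frame;
        while ((frame = encoder.readFrame()) != null) {
            byte[] packet = new byte[frame.length + 2];
            packet[0] = (byte) 0xAF;
            packet[1] = 0x01;
            System.arraycopy(frame, 0, packet, 2, frame.length);
            rtmp.sendAudioPacket(packet, pts);
            pts += 1024 * 1000L / 44100;                      // ~23 ms per 1024-sample frame
        }

        rtmp.disconnect();                                    // 5. disconnect
    }
}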
I am making an application similar to a DVB broadcast TV player on Android. The data I receive is a series of MPEG-TS packets; each packet may be 188 or 204 bytes. This is not HLS and there are no m3u8 files. How can I decode these MPEG-TS input streams on Android and play them?
The received packets conform to the MPEG-TS encapsulation standard, similar to this:
[image: example of a received MPEG-TS packet]
How can I get and convert the remote audio stream data on Android? Preferably in the form of byte array buffers; I would then upload the data to some third-party service for processing.
I've seen something similar done on the Web here and here with the help of the MediaStream Recording API, where a MediaRecorder was used to get the buffered data from a MediaStream into an array. How can I do something similar on Android?
I know that it's possible to obtain the local audio data by setting a listener that implements SamplesReadyCallback in the setSamplesReadyCallback() method when creating the local AudioDeviceModule used for capturing local audio. This method doesn't really involve WebRTC's PeerConnection, it's essentially just setting a callback listener on the local audio data. Worst case scenario, I can just retrieve the local audio data by this method and send them off to the remote side via DataChannel, but I would still like a more ideal solution that can avoid sending essentially the same data twice.
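For reference, this is roughly what the local-capture fallback described above looks like, assuming the org.webrtc (google-webrtc) Android library; exact class and method names may differ between versions:

import android.content.Context;
import org.webrtc.PeerConnectionFactory;
import org.webrtc.audio.AudioDeviceModule;
import org.webrtc.audio.JavaAudioDeviceModule;

public class LocalAudioTap {
    // Assumes PeerConnectionFactory.initialize(...) has already been called elsewhere.
    public PeerConnectionFactory createFactory(Context context) {
        AudioDeviceModule adm = JavaAudioDeviceModule.builder(context)
                .setSamplesReadyCallback(samples -> {
                    // PCM captured from the local microphone
                    byte[] pcm = samples.getData();
                    int sampleRate = samples.getSampleRate();
                    int channels = samples.getChannelCount();
                    // buffer/upload these bytes, or push them over a DataChannel
                })
                .createAudioDeviceModule();

        return PeerConnectionFactory.builder()
                .setAudioDeviceModule(adm)
                .createPeerConnectionFactory();
    }
}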
I need to stream audio from an external Bluetooth device, and video from the camera, to a Wowza server so that I can then access the live stream through a web app.
I've been able to successfully send other streams to Wowza using the GoCoder library, but as far as I can tell, this library only sends streams that come from the device's camera and mic.
Does anyone have a good suggestion for implementing this?
In the GoCoder Android SDK, the setAudioSource method of WZAudioSource allows you to specify an audio input source other than the default. Here's the relevant API doc for this method:
public void setAudioSource(int audioSource)
Sets the actively configured input device for capturing audio.
Parameters:
audioSource - An identifier for the active audio source. Possible values are those listed at MediaRecorder.AudioSource. The default value is MediaRecorder.AudioSource.CAMCORDER. Note that setting this while audio is actively being captured will have no effect until a new capture session is started. Setting this to an invalid value will cause an error to occur at session begin.
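A minimal sketch of how that could be wired up; getAudioSource() stands in for however you obtain the WZAudioSource from your broadcast configuration (accessor names differ between SDK versions), and the Bluetooth routing is done at the platform level rather than through the GoCoder SDK:

// Hypothetical accessor; replace with the way your code obtains the WZAudioSource.
WZAudioSource audioSource = goCoderConfig.getAudioSource();
audioSource.setAudioSource(MediaRecorder.AudioSource.MIC);

// Routing the Bluetooth mic is done by the platform, not by the SDK:
AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
am.startBluetoothSco();
am.setBluetoothScoOn(true); // MIC/CAMCORDER should now pick up the SCO headset input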
I'm creating an Android application for live video streaming between two Android phones. I've already established a socket connection between the devices. I'm capturing video on one device and sending the stream to the other; for now I just want to receive it on the receiver-side device and save it to a file. I'm recording with MediaRecorder on the first device, and to stream to the receiver I'm using a ParcelFileDescriptor backed by the socket as the recorder's output.
Client side code
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
mediaRecorder.setOutputFile(pfd.getFileDescriptor());
Receiver side code
pfd= ParcelFileDescriptor.fromSocket(s);
InputStream in = new FileInputStream(pfd.getFileDescriptor());
DataInputStream clientData = new DataInputStream(in);
OutputStream newDatabase = new FileOutputStream(file);
int available=in.available();
byte[] buffer = new byte[available];
int length;
while((length = in.read(buffer)) > 0)
{
newDatabase.write(buffer, 0, length);
}
newDatabase.close();
The video file is being created on the receiver-side device, but it never receives any bytes. Do I have to decode the incoming stream on the receiver side, since the stream was encoded while recording? If so, how can I decode the received stream? I found solutions like MediaExtractor and MediaCodec, but will they work with live video capture? Moreover, I'm testing on Android 2.3.6 (Gingerbread).
Is it possible to decode the video stream with MediaCodec on 2.3.6, or is some other method available?
The video file is being created on the receiver side mobile, but it's not able to receive any bytes.
If I understand you right, you are getting no data from the socket. That is a separate problem, which has nothing to do with the video format, decoding or encoding.
To debug your sockets, it may be helpful to use a separate application which just dumps the received data. Once the data looks fine, you can go to the next step: decoding the video.
The second part of the problem is the video format. You are using MP4, which is not usable for streaming. Here is more info about the format structure. You can use MP4 to record a video into a local file and then transfer the whole file over the socket, but true real-time streaming cannot be done because of the non-seekable nature of the socket (as described in the linked article). There is a block of metadata at the beginning of the file which acts as a "table of contents"; without it, the preceding data is just junk. The problem is that you can assemble the "table of contents" only after you have all the contents, but at that moment the data has already been sent through the socket and you cannot insert anything at its beginning.
There are a few workarounds, but these are just pointers for your future research; I haven't used them myself.
The most intuitive way would be to switch from MP4 to MPEG-TS, a container designed for streaming. Take a look at the hidden constant in MediaRecorder.OutputFormat with the value 8 (a short sketch follows after these options).
Another option is to pack the raw H.264 data into RTP/RTCP packets, which is again a protocol designed for streaming. Your application would then also be able to stream to any device that supports the protocol (for example a PC running VLC). For further research, take a look at the Spydroid IP camera project, which does exactly this.
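If you try the MPEG-TS route, the change on the sender side is essentially just the output format. A sketch, assuming the hidden constant is actually honored by your device and Android version (it is not public API on older releases):

MediaRecorder mediaRecorder = new MediaRecorder();
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setOutputFormat(8); // hidden OUTPUT_FORMAT_MPEG2TS
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
mediaRecorder.setOutputFile(pfd.getFileDescriptor()); // the same socket-backed descriptor as in your code
mediaRecorder.prepare(); // throws IOException
mediaRecorder.start();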
I am using Android 4.1.2 on a Galaxy S3. Currently the Android MediaPlayer always tries the RTSP UDP (RTP/AVP/UDP) method first when connecting to an RTSP server.
If the MediaPlayer does not receive data on its UDP ports, it times out and then tries RTSP TCP interleaved (RTP/AVP/TCP). This works, but it introduces a delay of about 10 seconds. I want to avoid this delay and force the MediaPlayer to always use RTSP TCP interleaved (RTP/AVP/TCP), either for all URLs or for specific ones.
I tried the suggestion given here to send a 461 or 400 error response code to the SETUP request, but the MediaPlayer seems to ignore the response: it sends the SETUP command for both tracks and then just hangs the connection.
How can I resolve this issue?
I'm using VLC instead of the native MediaPlayer.
Please read the Live555 source code.
You can specify the Transport: RAW/RAW/UDP field in the SETUP request to choose which protocol to use.
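For reference, a client that wants TCP-interleaved transport advertises it in the Transport header of its SETUP request, along these lines (RFC 2326 syntax; the URL and CSeq are placeholders):

SETUP rtsp://server.example.com/stream/trackID=1 RTSP/1.0
CSeq: 3
Transport: RTP/AVP/TCP;unicast;interleaved=0-1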
I might be wrong, but AFAIK the Android MediaPlayer does not support RTP over TCP.