Context
I'm creating an Android application that plays Media Source Extensions streams using Multimedia Tunneling. I'm following the API call flow as described in the documentation. The audio part is handled with an AudioTrack, and the audio session ID is shared between the video MediaCodec and the AudioTrack. The Android SDK version is 26.
Problem
Video plays correctly, but no audio can be heard.
No errors are reported by the API.
Audio output buffers are written to the AudioTrack using AudioTrack.write.
Audio works fine with non-tunneled playback.
audio_hal does not produce any errors in the logs.
Question
I've looked into the ExoPlayer implementation and I see that it writes a sync header before each buffer written to the AudioTrack in tunneled playback.
// 16-byte AV sync header: magic word, payload size, then the presentation time in nanoseconds.
ByteBuffer avSyncHeader = ByteBuffer.allocate(16);
avSyncHeader.order(ByteOrder.BIG_ENDIAN);
avSyncHeader.putInt(0x55550001);                    // sync header magic/version
avSyncHeader.putInt(4, size);                       // size of the audio payload that follows
avSyncHeader.putLong(8, presentationTimeUs * 1000); // presentation time in nanoseconds
avSyncHeader.position(0);
audioTrack.write(avSyncHeader, avSyncHeader.remaining(), WRITE_NON_BLOCKING);
I have tried adding that header as well, but audio is still not heard.
Is this sync header necessary?
Is there any other undocumented requirement for Multimedia Tunneling?
The manual AV sync header is meant for lower SDK levels. Instead, you can use another AudioTrack.write overload that takes the timestamp of every buffer; it generates the AV sync header automatically.
Try:
int write(ByteBuffer audioData, int sizeInBytes, int writeMode, long timestamp)
Writes the audio data to the audio sink for playback in streaming mode on a HW_AV_SYNC track
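For example, a minimal sketch of a tunneled audio write using that overload (API 23+), assuming audioTrack was created with AudioAttributes.FLAG_HW_AV_SYNC and the shared audio session ID, and that buffer and presentationTimeUs come from your extractor:
// The timestamp parameter is in nanoseconds; the framework generates the AV sync header itself.
long presentationTimeNs = presentationTimeUs * 1000;
int written = audioTrack.write(buffer, buffer.remaining(),
        AudioTrack.WRITE_NON_BLOCKING, presentationTimeNs);
if (written < 0) {
    // Negative return values are error codes such as AudioTrack.ERROR_INVALID_OPERATION.
    Log.e("TunneledAudio", "AudioTrack.write failed: " + written);
}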
Related
The AudioRecord class allows recording of phone calls with one of the following options as the recording source:
VOICE_UPLINK: The audio transmitted from your end to the other party. In other words, what you speak into the microphone.
VOICE_DOWNLINK: The audio transmitted from the other party to your end.
VOICE_CALL: VOICE_UPLINK + VOICE_DOWNLINK.
I'd like to build an app that records both VOICE_UPLINK and VOICE_DOWNLINK and identifies the source of the voice.
When using VOICE_CALL as the AudioSource option, the uplink and downlink streams are bundled together into the received data buffer, which makes it hard to identify the source of the voice.
Using two AudioRecords with VOICE_UPLINK and VOICE_DOWNLINK does not work: the second AudioRecord fails to start because the first AudioRecord locks the recording stream.
Is there any creative way to bypass the locking problem presented in case (2), so that the VOICE_UPLINK and VOICE_DOWNLINK streams can be recorded simultaneously while easily identifying the source?
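For reference, the failing attempt in case (2) looks roughly like this (a sketch; the sample rate, channel configuration, and bufferSize are placeholders):
// First recorder grabs the uplink...
AudioRecord uplink = new AudioRecord(MediaRecorder.AudioSource.VOICE_UPLINK,
        8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
// ...and the second one, for the downlink, fails to start because the first
// recorder already holds the call-recording stream.
AudioRecord downlink = new AudioRecord(MediaRecorder.AudioSource.VOICE_DOWNLINK,
        8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
uplink.startRecording();
downlink.startRecording();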
I need to stream audio from an external Bluetooth device and video from the camera to a Wowza server, so that I can then access the live stream through a web app.
I've been able to successfully send other streams to Wowza using the GoCoder library, but as far as I can tell, this library only sends streams that come from the device's camera and mic.
Does anyone have a good suggestion for implementing this?
In the GoCoder Android SDK, the setAudioSource method of WZAudioSource allows you to specify an audio input source other than the default. Here's the relevant API doc for this method:
public void setAudioSource(int audioSource)
Sets the actively configured input device for capturing audio.
Parameters:
audioSource - An identifier for the active audio source. Possible values are those listed at MediaRecorder.AudioSource. The default value is MediaRecorder.AudioSource.CAMCORDER. Note that setting this while audio is actively being captured will have no effect until a new capture session is started. Setting this to an invalid value will cause an error to occur at session begin.
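As a minimal sketch of overriding the default (wzAudioSource stands for the WZAudioSource instance in your broadcast configuration; which source actually picks up an external device depends on the platform's audio routing):
// Capture from the default microphone instead of the CAMCORDER source.
wzAudioSource.setAudioSource(MediaRecorder.AudioSource.MIC);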
We can distinguish between audio and video when an app uses the standard Android APIs to play music or movies, whether in libaudioflinger or in the decoder's library.
When decoding audio/video in AwesomePlayer.cpp, we can determine the type of the source data: audio or video.
We can also identify the calling app in libaudioflinger by using getCallingPid().
Question:
How can we distinguish a third-party app's data source type (audio or video) in AudioFlinger?
Yes, AudioFlinger processes the PCM data.
However, if you want to pass some parameters from the application, you can use AudioManager's setParameters API and then add handling for that parameter in AudioFlinger.
AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
am.setParameters("key_value_pair");
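setParameters takes semicolon-separated key=value pairs. As a purely hypothetical example (the key below is made up and only has an effect if your AudioFlinger/HAL code is patched to parse it):
// "stream_content=video" is an invented key=value pair for illustration only.
am.setParameters("stream_content=video");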
I'm creating an Android application for live video streaming between two Android phones. I've already established a socket connection between the devices. I'm capturing video on one device and sending the stream to the other, but for now I just want to receive it on the receiver device and save it. I'm recording with MediaRecorder on the sending device, and to stream to the receiver I'm using a ParcelFileDescriptor wrapped around the socket as the recorder's output.
Client side code
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
mediaRecorder.setOutputFile(pfd.getFileDescriptor());
Receiver side code
pfd = ParcelFileDescriptor.fromSocket(s);
InputStream in = new FileInputStream(pfd.getFileDescriptor());
DataInputStream clientData = new DataInputStream(in);
OutputStream newDatabase = new FileOutputStream(file);
int available = in.available();
byte[] buffer = new byte[available];
int length;
while ((length = in.read(buffer)) > 0) {
    newDatabase.write(buffer, 0, length);
}
newDatabase.close();
The video file is being created on the receiver-side phone, but it does not receive any bytes. Do I have to decode the incoming stream on the receiver side, since the video stream was encoded while recording? If so, how can I decode the received stream? I found some solutions like MediaExtractor and MediaCodec, but will they work with live video capture? Moreover, I'm testing on Android 2.3.6 (Gingerbread).
Is it possible to decode the video stream with MediaCodec on 2.3.6, or is some other method available?
The video file is being created on the receiver-side phone, but it does not receive any bytes.
If I understand you right, you are getting no data from the socket. That is a separate problem, which has nothing to do with the video format, decoding or encoding.
To debug your sockets, it may be helpful to use a separate application which just dumps the received data. Once the data looks fine, you can go to the next step - decoding the video.
Second part of the problem is the video format. You are using mp4, which is not usable for streaming. Here is more info about the format structure. You can use mp4 to record a video into a local file and then transfer the whole file over socket somewhere, but true realtime streaming cannot be done because of the non-seekable nature of the socket (as described in the linked article). There is a block of metadata at the beginning of the file, which acts as a "table of contents" and without it, the previous data are just junk. The problem is, you can assemble a "table of contents" only after you got all the contents. But at that moment, the data was already sent through the socket and you cannot insert anything at its beginning.
There are a few workarounds, but they are just pointers for your future research; I haven't used them myself.
The most intuitive way would be to switch from MP4 to MPEG-TS, a container designed for streaming. Take a look at a hidden constant in MediaRecorder.OutputFormat with value 8.
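In code, that switch on the sending side would look roughly like this (a sketch; the constant is hidden on older SDKs and only exposed as MediaRecorder.OutputFormat.MPEG_2_TS on newer ones, and not every device's recorder supports it):
// Instead of MediaRecorder.OutputFormat.MPEG_4:
mediaRecorder.setOutputFormat(8); // hidden constant for MPEG-2 TS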
Another option is to pack the raw H.264 data into RTP/RTCP packets, which is again a protocol designed for streaming. Your application would then also be able to stream to any device that supports this protocol (for example a PC running VLC). For further research, take a look at the Spydroid IP camera project, which does exactly that.
I'm developing an Android app that will play MP3 files. However, the MP3 files are encrypted on the SD card or in SQLite. Either way, after decryption I'll have a stream of bytes. How do I play them? MediaPlayer does not take an InputStream as a parameter, so I cannot use that.
I think you need to store the stream to the file system; then you could try using the setDataSource method of MediaPlayer:
// tempMp3File is a file you have written the decrypted bytes to (see the sketch below)
FileInputStream rawMp3File = new FileInputStream(tempMp3File);
mediaPlayer.setDataSource(rawMp3File.getFD());
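A slightly fuller sketch, assuming decryptedBytes holds the decrypted MP3 data, context is your app Context, and the app's cache directory is used for the temporary file:
// Write the decrypted bytes to a temp file, then hand its file descriptor to MediaPlayer.
File tempMp3File = File.createTempFile("decrypted", ".mp3", context.getCacheDir());
FileOutputStream out = new FileOutputStream(tempMp3File);
out.write(decryptedBytes);
out.close();

MediaPlayer mediaPlayer = new MediaPlayer();
FileInputStream rawMp3File = new FileInputStream(tempMp3File);
mediaPlayer.setDataSource(rawMp3File.getFD());
rawMp3File.close(); // safe to close once setDataSource has returned
mediaPlayer.prepare();
mediaPlayer.start();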
If you could switch to a PCM audio source, have a look at the AudioTrack class.
The AudioTrack class manages and plays a single audio resource for Java applications. It allows to stream PCM audio buffers to the audio hardware for playback. This is achieved by "pushing" the data to the AudioTrack object using one of the write(byte[], int, int) and write(short[], int, int) methods.
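A minimal sketch of that push model, assuming the byte array pcmData holds 16-bit mono PCM at 44.1 kHz (so you would still have to decode the MP3 to PCM yourself):
// Create a streaming AudioTrack and push the decoded PCM bytes to it.
int bufferSize = AudioTrack.getMinBufferSize(44100,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        bufferSize, AudioTrack.MODE_STREAM);
track.play();
track.write(pcmData, 0, pcmData.length); // blocking write of the whole buffer
track.stop();
track.release();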
This won't be easy. I've done it before on BlackBerry so I bet it's doable on Android too.
If I were you, I'd register a new provider for a custom "encryptedmp3:" protocol. Then I would specify this in a data source for the MediaPlayer:
mediaPlayer.setDataSource(this, Uri.parse("encryptedmp3://....yourfile"));
I'm not sure how to create a new protocol handler on Android.
I hope this helps a little bit.
Emmanuel
Take a look at the BASS library (free for non-commercial use).
Here is a link for Android: http://www.un4seen.com/forum/?topic=13225.0
There is a BASS_StreamCreateFile function:
HSTREAM BASS_StreamCreateFile(
    BOOL mem,
    void *file,
    QWORD offset,
    QWORD length,
    DWORD flags
);
Parameters:
mem TRUE = stream the file from memory.
file Filename (mem = FALSE) or a memory location (mem = TRUE).