I have an Android app which plays audio from a server via HTTP streaming. The server supports range headers and responds to range requests.
Can the Android MediaPlayer send a Range header with its requests, like the Chrome player does?
We can use setDataSource to add header parameters, but is there any way to get the current byte position the MediaPlayer is playing, or to convert the current playback time to a byte offset?
Thanks
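For a constant-bitrate (CBR) stream, the time-to-byte calculation asked about above can be approximated from the playback position and the stream's bitrate. This is a sketch only: it assumes the bitrate is known and constant, and it ignores container overhead, so the result is an estimate rather than an exact offset. The method name and parameters are illustrative, not part of the MediaPlayer API.

```java
// Rough time-to-byte conversion for a constant-bitrate (CBR) stream.
// Assumes a known, fixed bitrate and ignores container overhead,
// so the result is an approximation of the server-side byte offset.
public class StreamOffset {
    /** Approximate byte offset for a playback position in milliseconds. */
    static long timeToByteOffset(long positionMs, int bitrateBitsPerSec) {
        // bytes/sec = bitrate / 8; scale by position in seconds (ms / 1000)
        return positionMs * (bitrateBitsPerSec / 8) / 1000;
    }

    public static void main(String[] args) {
        // e.g. 30 seconds into a 128 kbit/s stream
        System.out.println(timeToByteOffset(30_000, 128_000));
    }
}
```

The position would come from MediaPlayer.getCurrentPosition(); for variable-bitrate streams this estimate can drift, so a seek table or container index is more reliable when available.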
Context
I'm creating an Android application that plays Media Source Extensions streams using multimedia tunneling. I'm using the API call flow provided by the documentation. The audio part is handled with an AudioTrack, and the audio session ID is shared between the video MediaCodec and the AudioTrack. The Android SDK version is 26.
Problem
Video plays correctly, but no audio can be heard.
The API does not report any errors.
The decoder's output buffers are written with AudioTrack.write.
Audio works fine in non-tunneled playback.
audio_hal does not produce any errors in the logs.
Question
I've looked into the ExoPlayer implementation and see that it writes a sync header before each buffer written to the AudioTrack in tunneled playback:
ByteBuffer avSyncHeader = ByteBuffer.allocate(16);
avSyncHeader.order(ByteOrder.BIG_ENDIAN);
avSyncHeader.putInt(0x55550001);                      // sync marker
avSyncHeader.putInt(4, size);                         // payload size in bytes
avSyncHeader.putLong(8, presentationTimeUs * 1000);   // timestamp, us -> ns
avSyncHeader.position(0);
audioTrack.write(avSyncHeader, avSyncHeader.remaining(), AudioTrack.WRITE_NON_BLOCKING);
I have tried adding that header too, but audio was still not heard.
Is this sync header necessary?
Is there any other undocumented requirement for multimedia tunneling?
The AV sync header is only needed when writing raw buffers at that lower level. Instead, you can use the AudioTrack.write overload that takes a timestamp for each buffer; it generates the AV sync header automatically.
Use the API that writes a timestamp along with the data.
Try:
int write(ByteBuffer audioData, int sizeInBytes, int writeMode, long timestamp)
Writes the audio data to the audio sink for playback in streaming mode on a HW_AV_SYNC track
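For reference, the 16-byte header that this overload generates internally has the same layout the ExoPlayer snippet in the question builds by hand: a big-endian 0x55550001 marker, the payload size in bytes, and the timestamp in nanoseconds. Here is that layout sketched in plain java.nio so it can be checked off-device; the field values are illustrative.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch of the AV sync header layout used on HW_AV_SYNC tracks:
// big-endian marker 0x55550001, payload size in bytes, timestamp in ns.
// Built in plain java.nio for off-device inspection; values are examples.
public class AvSyncHeader {
    static ByteBuffer build(int payloadSize, long presentationTimeUs) {
        ByteBuffer header = ByteBuffer.allocate(16).order(ByteOrder.BIG_ENDIAN);
        header.putInt(0, 0x55550001);                  // sync marker
        header.putInt(4, payloadSize);                 // size of audio payload
        header.putLong(8, presentationTimeUs * 1000);  // us -> ns
        return header;
    }

    public static void main(String[] args) {
        ByteBuffer h = build(4096, 1_000_000);
        System.out.println(Integer.toHexString(h.getInt(0)));
    }
}
```

Note that the timestamped write overload requires the AudioTrack to have been created as a HW_AV_SYNC track (and, for tunneling, with the shared audio session ID); otherwise the timestamp is ignored.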
I use the ExoPlayer library for my HLS live broadcast.
I started a live stream with Wirecast, and the broadcast is now in its second hour. But when I test the stream with my Android application, it always starts at 0. How do I get the real duration on the server with ExoPlayer?
I need to stream audio from an external Bluetooth device and video from the camera to a Wowza server, so that I can then access the live stream through a web app.
I've been able to successfully send other streams to Wowza using the GoCoder library, but as far as I can tell, this library only sends streams that come from the device's camera and mic.
Does anyone have a good suggestion for implementing this?
In the GoCoder Android SDK, the setAudioSource method of WZAudioSource allows you to specify an audio input source other than the default. Here's the relevant API doc for this method:
public void setAudioSource(int audioSource)
Sets the actively configured input device for capturing audio.
Parameters:
audioSource - An identifier for the active audio source. Possible values are those listed at MediaRecorder.AudioSource. The default value is MediaRecorder.AudioSource.CAMCORDER. Note that setting this while audio is actively being captured will have no effect until a new capture session is started. Setting this to an invalid value will cause an error to occur at session begin.
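A minimal usage sketch follows. The WZAudioSource construction and the choice of MediaRecorder.AudioSource value are illustrative; whether Bluetooth audio actually gets routed this way depends on the device and on Bluetooth SCO being active.

```java
// Illustrative only: selecting a non-default capture source with the
// GoCoder SDK. The default is MediaRecorder.AudioSource.CAMCORDER;
// this must be set before a capture session starts to take effect.
WZAudioSource audioSource = new WZAudioSource();
audioSource.setAudioSource(MediaRecorder.AudioSource.VOICE_COMMUNICATION);
```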
We can distinguish between audio and video when an APK uses the standard Android API to play music or movies, whether in libaudioflinger or in the decoder's library.
When decoding audio/video in AwesomePlayer.cpp, we can determine the source data's type: audio or video.
We can also distinguish which app is calling under libaudioflinger
by using getCallingPid().
Question:
How can we distinguish a third-party app's data source type (audio or video?) under AudioFlinger?
Yes, AudioFlinger processes the PCM data.
However, if you want to set some parameters from the application, you can use AudioManager's setParameters API and then add handling for that parameter in AudioFlinger:
AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
am.setParameters("key_value_pair");
I have an Android application that plays HLS (HTTP Live Streaming) videos using VideoView.
I am using a local HTTP proxy to forward HTTP requests from the VideoView to the main HLS server, since my stream (the transport segments) is encrypted.
Flow of my application:
0. Prepare the VideoView with the local proxy URL, e.g. "http://localhost:9878/index.m3u8".
1. The VideoView sends requests for the M3U8 and TS segments to my proxy.
2. The proxy forwards the M3U8 and TS requests from the VideoView to the HLS server.
3. The proxy watches for transport stream requests and, before responding to the VideoView, decrypts the transport stream and sends it on.
4. The VideoView plays the video stream.
This works properly, but sometimes I get the following error:
output buffer is smaller than decoded data size Out Length
When I get this error in logcat, my video turns to garbage (green video).
I usually see this issue when the video stream's bitrate increases. Is there any workaround for this issue?