I have an MPEG-2 transport stream file, and using the newfangled MediaCodec API in Android, I can play the video frames. I'm using MediaExtractor to load the file and calling getTrackFormat to get a MediaFormat. The reported MIME type for the format is "video/avc".
Is there a straightforward way to play the audio at the same time, and make sure the video and audio are synchronized?
I'm aware of the SimplePlayer sample code for Stagefright, but that's in C++ and uses undocumented interfaces. Can it be done in Java?
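What I imagine the audio side would look like, as an untested sketch (the class name and sync strategy are just my assumption): select the audio track with MediaExtractor, decode it with a second MediaCodec instance, and write the PCM output to an AudioTrack, comparing the presentationTimeUs values from the two decoders to keep them in sync. Error handling and end-of-stream draining are omitted:

```java
import java.nio.ByteBuffer;

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;

public class AudioPlaySketch {
    public static void playAudioTrack(String path) throws Exception {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);

        // Find and select the first audio track.
        MediaFormat format = null;
        String mime = null;
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat f = extractor.getTrackFormat(i);
            String m = f.getString(MediaFormat.KEY_MIME);
            if (m.startsWith("audio/")) {
                extractor.selectTrack(i);
                format = f;
                mime = m;
                break;
            }
        }

        // Set up an AudioTrack matching the decoded PCM.
        int sampleRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE);
        int channelConfig = format.getInteger(MediaFormat.KEY_CHANNEL_COUNT) == 1
                ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO;
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                channelConfig, AudioFormat.ENCODING_PCM_16BIT,
                AudioTrack.getMinBufferSize(sampleRate, channelConfig,
                        AudioFormat.ENCODING_PCM_16BIT),
                AudioTrack.MODE_STREAM);
        track.play();

        MediaCodec codec = MediaCodec.createDecoderByType(mime);
        codec.configure(format, null, null, 0);
        codec.start();

        ByteBuffer[] inputBuffers = codec.getInputBuffers();
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false;
        while (!inputDone) {
            int inIndex = codec.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                int size = extractor.readSampleData(inputBuffers[inIndex], 0);
                if (size < 0) {
                    codec.queueInputBuffer(inIndex, 0, 0, 0,
                            MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                    inputDone = true;
                } else {
                    codec.queueInputBuffer(inIndex, 0, size,
                            extractor.getSampleTime(), 0);
                    extractor.advance();
                }
            }
            int outIndex = codec.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                // info.presentationTimeUs is what you would compare against
                // the video decoder's timestamps to keep A/V in sync.
                ByteBuffer outBuf = codec.getOutputBuffers()[outIndex];
                byte[] pcm = new byte[info.size];
                outBuf.position(info.offset);
                outBuf.limit(info.offset + info.size);
                outBuf.get(pcm);
                outBuf.clear();
                track.write(pcm, 0, pcm.length);
                codec.releaseOutputBuffer(outIndex, false);
            }
        }
        codec.stop(); codec.release();
        track.stop(); track.release();
        extractor.release();
    }
}
```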
I need to get the video/audio tracks (or, if possible, a MediaStream object) from an Android WebView that is playing an HLS stream (".m3u8").
I load the HLS stream in the WebView using loadUrl("...m3u8"). It plays without issues, but I can't figure out how to get the live video and audio tracks out of the HLS stream. I have read a lot of articles and wasn't able to find any solution. So my question is: is it possible to get the audio and video tracks from an HLS stream running in a WebView? I need them because I have to send them via a PeerConnection (WebRTC), which accepts a MediaStream or individual audio and video tracks. Any ideas and examples will be helpful. Thank you in advance.
HLS works by defining a playlist (your .m3u8 file) which points to chunks of the video (say segment00000.ts, segment00001.ts, ...). It can contain different streams for different resolutions, but the idea (very simplified) is that your player can download a chunk, play it right away, and download the next one in the meantime.
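To make the playlist mechanics concrete, here's a minimal, untested Java sketch that just fetches an .m3u8 and collects the segment URIs a player would then download one by one (the class and method names are made up for illustration):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;

public class HlsPlaylistSketch {
    // Fetch an .m3u8 playlist and collect the segment URIs. A real player
    // downloads each segment, plays it, and fetches the next in the background.
    public static List<String> fetchSegmentUris(String playlistUrl) throws Exception {
        List<String> segments = new ArrayList<String>();
        BufferedReader in = new BufferedReader(new InputStreamReader(
                new URL(playlistUrl).openStream()));
        String line;
        while ((line = in.readLine()) != null) {
            line = line.trim();
            // Lines starting with '#' are tags (#EXTM3U, #EXTINF, ...);
            // everything else is a segment URI, e.g. segment00000.ts.
            if (!line.isEmpty() && !line.startsWith("#")) {
                segments.add(line);
            }
        }
        in.close();
        return segments;
    }
}
```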
A WebRTC video track takes RTP packets. I don't know exactly what the Android API exposes, but it feels like either you can pass it an m3u8 as a source (though that may be a bit weird for a generic API), or you can't, and you have to read the HLS stream yourself, extract RTP frames, and pass them to WebRTC.
For instance, gstreamer's hlsdemux seems to be able to read from an HLS source. From there you could extract RTP frames and feed them to your WebRTC track.
That's probably a bit lower-level than you hoped for, but if the Android API doesn't do it for you, you have to do it yourself (or find a library that does).
There is some good documentation on the Big Flake site (bigflake.com) about how to use MediaMuxer and MediaCodec to encode and then decode video as MP4, or to extract video and re-encode it, and more.
But there doesn't seem to be any documentation or sample code about encoding audio together with video at the same time, even though it doesn't seem impossible.
Question
Do you know a stable way of doing it that will work on all devices running API 18 (Android 4.3) or higher?
Why has no one implemented it? Is it hard to implement?
You have to create two MediaCodec instances, one for video and one for audio, and then use MediaMuxer to mux the video with the audio after encoding. Take a look at ExtractDecodeEditEncodeMuxTest.java, and at this project, which captures camera/mic and saves to an MP4 file using MediaMuxer and MediaCodec.
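A minimal sketch of that setup, with illustrative formats, bitrates, and output path (feeding input and draining output are omitted; this is not a complete encoder):

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;

public class DualEncoderSketch {
    public static void setUp() throws Exception {
        // Video encoder: H.264 at 720p (values are illustrative).
        MediaFormat videoFormat = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        videoFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        videoFormat.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
        videoFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        videoFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        MediaCodec videoEncoder = MediaCodec.createEncoderByType("video/avc");
        videoEncoder.configure(videoFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // With COLOR_FormatSurface, frames come in via videoEncoder.createInputSurface().

        // Audio encoder: AAC-LC stereo at 44.1 kHz.
        MediaFormat audioFormat = MediaFormat.createAudioFormat("audio/mp4a-latm", 44100, 2);
        audioFormat.setInteger(MediaFormat.KEY_AAC_PROFILE,
                MediaCodecInfo.CodecProfileLevel.AACObjectLC);
        audioFormat.setInteger(MediaFormat.KEY_BIT_RATE, 128000);
        MediaCodec audioEncoder = MediaCodec.createEncoderByType("audio/mp4a-latm");
        audioEncoder.configure(audioFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

        // Both encoders feed one muxer (API 18+).
        MediaMuxer muxer = new MediaMuxer("/sdcard/out.mp4",
                MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    }
}
```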
I am able to record (encode) video with the help of MediaCodec and MediaMuxer. Next, I need to work on the audio part and mux the audio with the video, again using MediaCodec and MediaMuxer.
I am facing two problems:
1. How do I encode audio with MediaCodec? Do I need to encode audio and video in separate threads?
2. How can I pass audio and video data to MediaMuxer, given that the writeSampleData() method takes only one type of data at a time?
I referred to MediaMuxerTest, but it uses MediaExtractor. I need to use MediaCodec, since the video encoding is done with MediaCodec. Please correct me if I am wrong.
Any suggestion or advice will be very helpful, as there is no proper documentation available for these new APIs.
Note:
My app targets API 18+ (Android 4.3+).
I have referred to Grafika for the video encoding.
No, you don't necessarily need a separate thread for audio; just use two separate MediaCodec instances.
The first parameter of writeSampleData is trackIndex, which lets you specify which track each packet belongs to. (By calling addTrack twice, once for each format, you get two separate track indices.) See the sketch below.
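The ordering detail that trips people up: every track must be added and the muxer started before the first writeSampleData call. A sketch with illustrative names (videoEncoder, audioEncoder, muxer, and the buffers/BufferInfos are assumed to come from your own encoding code):

```java
// Illustrative: videoEncoder/audioEncoder are the two MediaCodec instances,
// muxer is the shared MediaMuxer. Call addTrack when each encoder reports
// INFO_OUTPUT_FORMAT_CHANGED in its drain loop.
int videoTrackIndex = muxer.addTrack(videoEncoder.getOutputFormat());
int audioTrackIndex = muxer.addTrack(audioEncoder.getOutputFormat());

// start() may only be called once all tracks are added; calling
// writeSampleData() before start() throws IllegalStateException.
muxer.start();

// Route each encoded buffer by its track index:
muxer.writeSampleData(videoTrackIndex, encodedVideoBuffer, videoBufferInfo);
muxer.writeSampleData(audioTrackIndex, encodedAudioBuffer, audioBufferInfo);

// After both encoders have signaled end-of-stream:
muxer.stop();
muxer.release();
```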
I have already developed a streaming audio application using the MediaPlayer API. All the features work fine, except that it takes too long to start playback (the buffering time is too long).
I want to add recording of the live audio stream (saving the live stream data to disk, not recording from the mic). As MediaPlayer does not provide any API to access the raw data stream, I am planning to build a custom audio player.
I want to control the buffering time, have access to the raw audio stream, and be able to play all the audio formats that Android supports natively. Which API (libpd, OpenSL ES, or AudioTrack) would be suitable for building a custom audio player on Android?
In my experience OpenSL ES would be the choice; here is a link that explains how to do audio streaming with it that you may find useful. bufferframes determines how many samples you collect before playing, so a smaller bufferframes means a faster response time, but you have to balance that against your device's processing capabilities.
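If you end up staying in Java instead, the same trade-off shows up in AudioTrack, where the buffer size you pass in controls startup latency versus underrun safety. A rough sketch (the sample rate and buffer multiple are just examples):

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class StreamingAudioSketch {
    // The buffer size passed to AudioTrack plays the same role as
    // bufferframes: smaller = faster start, larger = safer on slow devices.
    public static AudioTrack createTrack(int sampleRate) {
        int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf * 2,              // a small multiple of the minimum
                AudioTrack.MODE_STREAM);
        track.play();
        // Feed decoded PCM as it arrives from the network/decoder:
        // track.write(pcmChunk, 0, pcmChunk.length);
        return track;
    }
}
```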
You can record with libpd (pd-for-android) too; the whole recording process is managed by libpd.
Check out the ScenePlayer project; it uses libpd and lets you record audio into a folder on the SD card:
https://github.com/libpd/pd-for-android/tree/master/ScenePlayer
I have a Vorbis stream that I can decode to PCM if necessary, and I have a raw H.264 stream, all three of which are supported by Android when in a container. I'm wondering if there is any way to manually feed video and audio samples into the Android MediaPlayer without any container. I imagine I would have to override methods within MediaPlayer. Does anyone have experience with this, or an easier way to do it? I can't imagine it's impossible...
You may be able to play the PCM audio samples (for that, I guess you may have to prepend a WAV header; see the sketch below), but you will not be able to play the H.264 elementary stream without a container. In the media framework (Stagefright), sniffer functions are registered for the various container formats; they determine what kind of parser is needed, and then the extractor (parser) is created. I don't think you will be able to play an H.264 elementary stream from an application using the built-in media framework of Android.
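On the PCM point: the WAV header is just a 44-byte RIFF preamble, so you can prepend one yourself. A sketch, assuming 16-bit little-endian PCM (class and method names are made up):

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class WavWriterSketch {
    // Prepend a standard 44-byte WAV (RIFF) header to raw 16-bit PCM
    // so that MediaPlayer can recognize and play it.
    public static void writeWav(String path, byte[] pcm,
            int sampleRate, int channels) throws IOException {
        int byteRate = sampleRate * channels * 2;   // 16-bit samples
        ByteBuffer h = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        h.put("RIFF".getBytes());
        h.putInt(36 + pcm.length);                  // RIFF chunk size
        h.put("WAVE".getBytes());
        h.put("fmt ".getBytes());
        h.putInt(16);                               // fmt chunk size
        h.putShort((short) 1);                      // format: PCM
        h.putShort((short) channels);
        h.putInt(sampleRate);
        h.putInt(byteRate);
        h.putShort((short) (channels * 2));         // block align
        h.putShort((short) 16);                     // bits per sample
        h.put("data".getBytes());
        h.putInt(pcm.length);                       // data chunk size

        FileOutputStream out = new FileOutputStream(path);
        out.write(h.array());
        out.write(pcm);
        out.close();
    }
}
```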