Video call recording in Twilio Android SDK - android

I know Twilio doesn't support video call recording on the server, but I've been trying to figure out how to do it locally on the Android side. I have studied the video-quickstart-android code while trying to work out how to extract the video stream from the LocalVideoTrack and VideoTrack classes of the Twilio Android Conversations API, but I couldn't find any method that exposes the underlying video stream so it could be recorded locally on the Android device.
Does anyone have any idea how I can get the video stream from the Twilio Conversations API for Android in order to record the video locally on the device?

You would have to write a custom video renderer that takes each frame and converts it into your preferred media format.
As an example, the VideoViewRenderer takes frames and passes them to org.webrtc.SurfaceViewRenderer, rendering them to a View. In this case you would write another renderer, perhaps named VideoRecorderRenderer, that implements the VideoRenderer interface and does the work of taking each I420Frame and converting it to a media type. You could then add the VideoRecorderRenderer to the VideoTrack. However, this alone may not be the solution you are looking for, since it covers only the video portion of the media and does not provide the audio. The AudioTrack does not expose an interface for capturing the audio output at the moment.
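A minimal sketch of that idea, assuming the VideoRenderer interface delivers frames through a single renderFrame(I420Frame) callback and that I420Frame exposes its dimensions and YUV planes (names may differ between SDK versions); the actual encoding is left as a hypothetical helper:

    // Sketch only: VideoRecorderRenderer hands each incoming frame to an
    // encoder. encodeI420() is a hypothetical placeholder for feeding the raw
    // I420 data into something like a MediaCodec encoder plus MediaMuxer.
    public class VideoRecorderRenderer implements VideoRenderer {

        @Override
        public void renderFrame(I420Frame frame) {
            // Copy what you need out of the frame here; the SDK may reuse the
            // underlying buffers once this callback returns.
            encodeI420(frame.width, frame.height, frame.yuvPlanes, frame.yuvStrides);
        }

        private void encodeI420(int width, int height,
                                java.nio.ByteBuffer[] planes, int[] strides) {
            // Hypothetical: queue the planes into a video encoder and write its
            // output packets to an MP4 with MediaMuxer.
        }
    }

You would then attach it to the track the same way a view renderer is attached, e.g. something like videoTrack.addRenderer(new VideoRecorderRenderer()). As noted above, this only covers the video; audio capture is not exposed.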

Related

WebView getVideo/Audio tracks or MediaStream and send it to the server via WebRTC

I have to get the video/audio tracks, or if possible a MediaStream object, from an Android WebView which plays an HLS stream (".m3u8").
I load the HLS stream in the WebView using loadUrl("...m3u8"). It runs without issues, but I can't figure out how to get the live video and audio tracks from the HLS stream. I have read a lot of articles and was not able to find any solution. So my question is: is it possible to get the audio and video tracks from an HLS stream playing in a WebView? I need them because I have to send them via a PeerConnection (WebRTC), which accepts a MediaStream or audio and video tracks. Any ideas and examples would be helpful. Thank you in advance.
HLS works by defining a playlist (your .m3u8 file) which points to chunks of the video (say segment00000.ts, segment00001.ts, ...). It can contain different streams for different resolutions, but the idea (very simplified) is that your player can download a chunk, play it right away, and download the next one in the meantime.
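For illustration, a minimal media playlist of that shape looks roughly like this (segment names and durations are made up; a live playlist keeps growing and omits the #EXT-X-ENDLIST tag):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    segment00000.ts
    #EXTINF:10.0,
    segment00001.ts
    #EXT-X-ENDLIST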
A WebRTC video track takes RTP packets. I don't know exactly what the Android API exposes, but it feels like either you can pass it an m3u8 as a source (though that may be a bit weird for a generic API), or you can't, and you have to read the HLS stream yourself, extract RTP frames, and pass them to WebRTC.
For instance, gstreamer's hlsdemux seems to be able to read from an HLS source. From there you could extract RTP frames and feed them to your WebRTC track.
Probably that's a bit lower-level than you hoped, but if the Android API doesn't do it for you, you have to do it yourself (or find a library that does it).

Playing audio with video using MediaDecoder

I have an MPEG2-transport file, and using the newfangled MediaCodec API in Android, I can play the video frames. I'm using MediaExtractor to load the file, and call getTrackFormat to get a MediaFormat. The reported MIME type for the format is "video/avc".
Is there a straightforward way to play the audio at the same time, and make sure the video and audio are synchronized?
I'm aware of the SimplePlayer sample code for stagefright, but that's in C and uses undocumented interfaces. Can it be done in Java?
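Not a full answer, but as a sketch of the demuxing side in Java: MediaExtractor can locate both the video and the audio track so that each one gets its own MediaCodec decoder, and synchronization is then a matter of pacing the video buffers against the audio clock using their presentationTimeUs values. The file path and surrounding structure below are placeholders.

    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import java.io.IOException;

    // Sketch: find both tracks of the transport stream and create a decoder
    // for each. Feeding samples and rendering is omitted.
    void setUpDecoders(String path) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);

        int videoIndex = -1;
        int audioIndex = -1;
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            String mime = extractor.getTrackFormat(i).getString(MediaFormat.KEY_MIME);
            if (mime.startsWith("video/") && videoIndex < 0) videoIndex = i;
            if (mime.startsWith("audio/") && audioIndex < 0) audioIndex = i;
        }

        MediaCodec videoDecoder = MediaCodec.createDecoderByType(
                extractor.getTrackFormat(videoIndex).getString(MediaFormat.KEY_MIME));
        MediaCodec audioDecoder = MediaCodec.createDecoderByType(
                extractor.getTrackFormat(audioIndex).getString(MediaFormat.KEY_MIME));

        // From here: configure each decoder with its track format, selectTrack()
        // both tracks and route samples by track index, push decoded audio to an
        // AudioTrack, and release each video buffer when its presentationTimeUs
        // catches up with the audio playback position.
    }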

Android custom audio player to replace MediaPlayer with libPd or OpenSL ES or AudioTrack

I have already developed a streaming audio application using the MediaPlayer API. All the features work fine, except that it takes a long time to start playback (the buffering time is long).
I want to add recording of the live audio stream (saving the live stream data to disk, not recording from the mic). As MediaPlayer does not provide any API to access the raw data stream, I am planning to build a custom audio player.
I want to control the buffering time, have access to the raw audio stream, and be able to play all the audio formats that Android supports natively. Which API (libpd, OpenSL ES, or AudioTrack) would be suitable for building a custom audio player on Android?
In my experience OpenSL ES would be the choice; there is a link that explains how to do audio streaming that you may find useful. bufferframes determines how many samples you collect before playing, so smaller bufferframes mean a faster response time, but you have to balance that against your device's processing capabilities.
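The same trade-off applies if you stay in Java and stream decoded PCM through AudioTrack; a rough sketch, assuming 44.1 kHz 16-bit stereo output:

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    // Sketch: streaming-mode AudioTrack. A buffer near getMinBufferSize() gives
    // a faster start and lower latency but risks underruns on slower devices;
    // a larger multiple trades responsiveness for safety.
    AudioTrack createStreamingTrack() {
        int sampleRate = 44100;
        int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);

        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf * 2, AudioTrack.MODE_STREAM);
        track.play();
        return track;
        // The playback loop then repeatedly calls track.write(pcmChunk, 0, n),
        // and the same chunks can be written to a file to cover the
        // save-to-disk requirement.
    }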
You can record with libpd (pd-for-android) too; the whole recording process is managed by libpd.
Check the ScenePlayer project, which uses libpd and lets you record audio into a folder on the SD card:
https://github.com/libpd/pd-for-android/tree/master/ScenePlayer

Playing video from custom live stream (Android)

My Android app needs to play live video from a remote RTMP server (Adobe Flash Media Server).
As Android's android.media.MediaPlayer doesn't support the RTMP protocol, I found a library which provides the streaming functionality, i.e. I can connect and receive portions of the video stream as byte arrays.
The question is how I can use this incoming video stream data to display it in a view.
It seems that the current standard API doesn't allow me to do that. MediaPlayer accepts either a file or a URL. For audio data there is AudioTrack, which makes a similar goal achievable, but only for audio; for video I don't see an option.
Any suggestions are appreciated; maybe there is a third-party media library which provides this functionality for Android.
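To make the asymmetry described in the question concrete (a sketch with the standard android.media classes; the URL and PCM buffer are placeholders): MediaPlayer only takes a data source by path or URL, while AudioTrack accepts raw buffers pushed in by the app.

    import android.media.AudioTrack;
    import android.media.MediaPlayer;
    import java.io.IOException;

    // MediaPlayer only accepts a file path or URL as its data source; there is
    // no call for pushing raw video bytes into it.
    void playFromUrl(String url) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(url);
        player.prepare();
        player.start();
    }

    // AudioTrack, by contrast, lets the app write raw PCM buffers directly.
    void pushAudio(AudioTrack track, byte[] pcmChunk) {
        track.write(pcmChunk, 0, pcmChunk.length);
    }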

How to capture output stream of audio in Android?

I am a newbie in development and I am trying to create an equalizer on the Android platform.
How can I capture the output audio stream on Android? I just need to capture the audio that goes out from my application.
(I have already searched www.developers.android.com and have not found any information.)
There's currently no functionality in Android for recording audio output (well, there's the Visualizer API that lets you grab partial, low-quality audio for audio visualization purposes).
If you only need to apply the effect to the audio from your own app then you could do the "recording" internally. I.e., in your app, send the decoded audio data to your effect to be processed, and then send the processed data to your AudioTrack or OpenSL ES buffer queue.
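A rough sketch of that internal routing, assuming the app already has its own decoded 16-bit PCM; applyEqualizer() is a hypothetical placeholder for whatever effect processing you write:

    import android.media.AudioTrack;

    // Sketch: process your own decoded PCM before it reaches the output.
    void playProcessed(AudioTrack track, short[] decodedPcm) {
        short[] processed = applyEqualizer(decodedPcm); // your effect goes here
        track.write(processed, 0, processed.length);
    }

    // Hypothetical placeholder; returns the input unchanged.
    short[] applyEqualizer(short[] input) {
        return input;
    }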
