Unable to access frames by FFmpegMediaMetadataRetriever using OPTION_CLOSEST - android

I have an mp4 video on my device whose frames I'm trying to extract using FFmpegMediaMetadataRetriever. I am able to retrieve a frame by
mmr.getFrameAtTime(time*1000000L)
I am trying to retrieve multiple frames in a time range using
for (i in 0..frames) {
    val frame = mmr.getFrameAtTime(i * 1000000L / fps!!, FFmpegMediaMetadataRetriever.OPTION_CLOSEST)
    if (frame != null)
        bitMapList.add(frame)
}
When I use FFmpegMediaMetadataRetriever.OPTION_CLOSEST, every frame comes back as null.
When I omit the option, I do get frames, but they are all the same frame.
The fps is retrieved using
fps = mmr.extractMetadata(FFmpegMediaMetadataRetriever.METADATA_KEY_FRAMERATE).toInt()
Can someone please guide me on how to do this correctly?
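For reference, here is a consolidated, hedged sketch of the extraction loop described above, with the per-frame timestamp computed in microseconds. Deriving the frame count from the duration metadata is my own assumption, not part of the original code:

val retriever = FFmpegMediaMetadataRetriever()
retriever.setDataSource(videoPath)

// Metadata comes back as strings and may be absent; 30f is an arbitrary fallback.
val fps = retriever.extractMetadata(FFmpegMediaMetadataRetriever.METADATA_KEY_FRAMERATE)?.toFloat() ?: 30f
val durationMs = retriever.extractMetadata(FFmpegMediaMetadataRetriever.METADATA_KEY_DURATION)?.toLong() ?: 0L
val frameCount = (durationMs / 1000f * fps).toInt()

val bitMapList = mutableListOf<android.graphics.Bitmap>()
for (i in 0 until frameCount) {
    // Timestamp of frame i in microseconds.
    val timeUs = (i * 1_000_000L / fps).toLong()
    retriever.getFrameAtTime(timeUs, FFmpegMediaMetadataRetriever.OPTION_CLOSEST)?.let { bitMapList.add(it) }
}
retriever.release()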

Related

WebRTC ScreenCapturerAndroid

I'm trying to create a screen capturer app for Android. I already have the WebRTC portion set up with a video capturer using the Camera2Enumerator library from here. How can I modify this to create a pre-recorded video capturer instead of a camera capturer?
Thanks!
Just wanted to give an update that I have solved this. I'm unable to share the entire code, but here's the process that might help:
Acquire one frame of your pre-recorded file and store it in a byte array (it must be in YUV format).
Replace the VideoCapturer() with the following:
fun onGetFrame(p0: ByteArray?) {
    var timestampNS = java.util.concurrent.TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime())
    var buffer: NV21Buffer = NV21Buffer(p0, 288, 352, null)
    var videoFrame: VideoFrame = VideoFrame(buffer, 0, timestampNS)
    localVideoSource.capturerObserver.onFrameCaptured(videoFrame)
    videoFrame.release()
}
where p0 is the byte array with the frame
Call this function from startLocalVideoCapture() using a timer that fires every few milliseconds (I used a 10 ms interval; see https://developer.android.com/reference/android/os/CountDownTimer). A sketch of such a timer is shown after these steps.
Remove this line in startLocalVideoCapture():
VideoCapturer.initialize(
    surfaceTextureHelper,
    localVideoOutput.context,
    localVideoSource.capturerObserver)
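A minimal sketch of that timer (the interval, the overall duration, and the frameBytes source are assumptions for illustration, not part of the original code):

// Sketch: pushes the pre-decoded NV21 frame to onGetFrame() roughly every 10 ms.
// `frameBytes` is assumed to hold the byte array acquired in the first step.
val frameTimer = object : android.os.CountDownTimer(60 * 60 * 1000L, 10L) {
    override fun onTick(millisUntilFinished: Long) {
        onGetFrame(frameBytes)
    }
    override fun onFinish() {
        // Restart (or stop capturing) when the arbitrary one-hour window elapses.
    }
}
frameTimer.start()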

The most efficient way to generate an mp4 video containing a looped boomerang-like video file?

I've developed an Android app that allows the user to create a boomerang-like mp4 video. This video consists of 10 still images played back and forth quite fast. I know that such a video (the boomerang effect) can easily be looped from a single video file at playback time, but I really need to create an mp4 that already contains the prepared boomerang video. The output video can be downloaded and played by the user in any external player (over which I obviously don't have any control).
For that purpose I currently create a video from images in a loop. The loop starts at the 1st picture and goes to the 10th with a 0.25 s delay between frames, then goes back from the 10th to the 1st, again with the delay. There are 5 of those loops, which essentially means creating a single video from 5 * 10 * 2 = 100 images. I know it's kind of ridiculous, so the time it takes to prepare this video is ridiculous as well (around 1:40 min).
What solution could you recommend, assuming that the output video really has to consist of 5 back-and-forth loops? I've thought about creating a single-loop video (20 pictures) and then producing the final output by concatenating it 5 times. But would that be any good? I'm trying to find an approach that is efficient yet understandable for a beginner Android programmer.
You can use FFmpeg to create a boomerang-like video. Below is a simple example command:
ffmpeg -i input_loop.mp4 -filter_complex "[0]reverse[r];[0][r]concat,loop=5:250,setpts=N/55/TB" output_looped_video.mp4
The input is a 1.5-second video file named input_loop.mp4.
In loop=5:250, 5 is the number of loops and 250 is the frame rate times double the length of the clip. The setpts is applied to avoid frame drops, and its value should be chosen according to the frame rate of the clip.
In setpts=N/<VALUE>/TB you can alter the value according to your need:
increase the value to speed up the boomerang effect
decrease the value to slow down the boomerang effect
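If you need to run that command from inside an Android app, one option (my assumption, not part of the answer above) is an FFmpeg wrapper such as FFmpegKit; a rough sketch:

import com.arthenica.ffmpegkit.FFmpegKit
import com.arthenica.ffmpegkit.ReturnCode

// Sketch assuming an ffmpeg-kit dependency; the paths are placeholders.
fun createBoomerang(inputPath: String, outputPath: String) {
    val command =
        "-i $inputPath -filter_complex \"[0]reverse[r];[0][r]concat,loop=5:250,setpts=N/55/TB\" $outputPath"
    val session = FFmpegKit.execute(command)
    if (!ReturnCode.isSuccess(session.returnCode)) {
        android.util.Log.e("Boomerang", "FFmpeg failed: ${session.failStackTrace}")
    }
}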
I was looking for a way to create a boomerang video and found a pretty cool example of how to do it on GitHub.
You create the video by using the FFmpeg library org.bytedeco.javacpp-presets to clone the frames.
https://github.com/trantrungduc/boomerang-android
This is the place in the code where you can customize the video loop:
for (int k = 0; k < 3; k++) {
    for (Frame frame1 : loop) {
        frecorder.record(frame1);
    }
    for (int i = loop.size() - 1; i >= 0; i--) {
        frecorder.record(loop.get(i));
    }
}
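For context, a rough sketch (in Kotlin, assuming the org.bytedeco JavaCV artifacts the linked project uses) of how the loop list and the frecorder used above could be set up:

// Sketch only; the linked GitHub project does the full setup.
fun recordBoomerang(inputPath: String, outputPath: String) {
    val grabber = org.bytedeco.javacv.FFmpegFrameGrabber(inputPath).apply { start() }
    val loop = mutableListOf<org.bytedeco.javacv.Frame>()
    var frame = grabber.grabImage()
    while (frame != null) {
        loop.add(frame.clone())   // clone, because the grabber reuses its Frame buffer
        frame = grabber.grabImage()
    }
    val frecorder = org.bytedeco.javacv.FFmpegFrameRecorder(outputPath, grabber.imageWidth, grabber.imageHeight).apply {
        format = "mp4"
        frameRate = grabber.frameRate
        start()
    }
    // ... the back-and-forth recording loop from the answer goes here ...
    frecorder.stop()
    grabber.stop()
}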

Android: ExoPlayer - Get current frame number from video

I'm developing an Android video app where I need to get the current frame number of the video being displayed while in pause mode.
I need to send my server the frame number the video is currently paused on and get back a list of items for that frame/time. Right now I'm sending the current paused time in milliseconds, but it doesn't work quite well, because the server compares the time I send to a specific frame it calculates based on that time, and sometimes the comparison is not exact.
I know I can get a bitmap of that frame with MediaMetadataRetriever, and I did, but it returns a bitmap image and what I need is an index.
I'm using ExoPlayer (I need that feature for MP4 and for HLS, too, if that matters).
Is there a way to get that info from the video?
Posting a solution to my problem: in order to get the exact frame time I simply extended the MediaCodecVideoTrackRenderer class from the ExoPlayer library and used the value of lastOutputBufferTimestamp, which is set in this function:
@Override
protected boolean processOutputBuffer(long positionUs, long elapsedRealtimeUs,
        MediaCodec codec, ByteBuffer buffer, MediaCodec.BufferInfo bufferInfo, int bufferIndex,
        boolean shouldSkip) {
    boolean processed = super.processOutputBuffer(positionUs, elapsedRealtimeUs, codec, buffer,
            bufferInfo, bufferIndex, shouldSkip);
    if (!shouldSkip && processed) {
        lastOutputBufferTimestamp = bufferInfo.presentationTimeUs;
    }
    return processed;
}
It gives me the exact time and not a rounded time from, let's say, mPlayer.getDuration() or something like that.
If you have a constant FPS in your video, you can get the frame number from that time with a simple division.
For me, simply knowing the exact frame time was enough.
I'm using ExoPlayer version r1.5.3 so I don't know if this solution will work for newer version since code has probably changed.
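For the division mentioned above, a minimal sketch (assuming a known, constant frame rate; the function name is mine):

// Converts the exact presentation time (microseconds) into a frame index.
fun frameNumberAt(presentationTimeUs: Long, fps: Double): Long =
    (presentationTimeUs * fps / 1_000_000.0).toLong()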

MediaCodec audio/video muxing issues on Android

I am transcoding videos based on the example given by Google (https://android.googlesource.com/platform/cts/+/master/tests/tests/media/src/android/media/cts/ExtractDecodeEditEncodeMuxTest.java)
Basically, transcoding of MP4 files works, but on some phones I get some weird results. If, for example, I transcode a video with audio on an HTC One, the code won't give any errors but the file cannot be played afterwards on the phone. If I have a 10-second video it jumps to almost the last second and you only hear some crackling noise. If you play the video with VLC, the audio track is completely muted.
I did not alter the code in terms of encoding/decoding and the same code gives correct results on a Nexus 5 or MotoX for example.
Does anybody have an idea why it might fail on that specific device?
Best regards and thank you,
Florian
I made it work on Android 4.4.2 devices with the following changes:
Set the AAC profile to AACObjectLC instead of AACObjectHE:
private static final int OUTPUT_AUDIO_AAC_PROFILE = MediaCodecInfo.CodecProfileLevel.AACObjectLC;
When creating the output audio format, use the sample rate and channel count of the input format instead of fixed values:
MediaFormat outputAudioFormat = MediaFormat.createAudioFormat(OUTPUT_AUDIO_MIME_TYPE,
        inputFormat.getInteger(MediaFormat.KEY_SAMPLE_RATE),
        inputFormat.getInteger(MediaFormat.KEY_CHANNEL_COUNT));
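If you are following the linked CTS example, the profile constant from the first change is applied to this output format via MediaFormat.KEY_AAC_PROFILE; a one-line sketch (in Kotlin, using the variable names from the snippets above):

// Apply the LC profile to the output audio format created above.
outputAudioFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, OUTPUT_AUDIO_AAC_PROFILE)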
Put a check just before muxing the audio track to keep presentation timestamps increasing (to avoid the "timestampUs X < lastTimestampUs X for Audio track" error):
if (audioPresentationTimeUsLast == 0) { // Defined at the beginning of the method
    audioPresentationTimeUsLast = audioEncoderOutputBufferInfo.presentationTimeUs;
} else {
    if (audioPresentationTimeUsLast > audioEncoderOutputBufferInfo.presentationTimeUs) {
        audioEncoderOutputBufferInfo.presentationTimeUs = audioPresentationTimeUsLast + 1;
    }
    audioPresentationTimeUsLast = audioEncoderOutputBufferInfo.presentationTimeUs;
}
// Write data
if (audioEncoderOutputBufferInfo.size != 0) {
    muxer.writeSampleData(outputAudioTrack, encoderOutputBuffer, audioEncoderOutputBufferInfo);
}
Hope this helps...
If the original CTS tests fail, you need to go to the device vendor and ask for fixes.

Live encode H.264 stream on Android

I am writing an Android app where I plan to encode several images into a live H.264 video stream that can be replayed in any browser. I am using the MediaCodec API for encoding and then MediaMuxer to write it to a file, as per the example here http://bigflake.com/mediacodec/.
What I am stuck with is how to tell the encoder/muxer to encode it such that it can be played back progressively. From the examples, the video file only gets the right metadata headers, etc., once the encoder/muxer stop() and release() calls are made.
Thanks
I guess you are considering the time at which each frame is shown.
You need to give MediaMuxer, along with the frame, the right "MediaCodec.BufferInfo," whose "presentationTimeUs" is set accordingly.
For example, there are 3 frames, each is shown for 1 second in the video:
sec     0---------1---------2-----------
        frame1    frame2    frame3
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int[] timestampSec = {0, 1, 2};
for (int i = 0; i < 3; i++) {
    // presentationTimeUs is in microseconds; frame[i] is the encoded sample as a ByteBuffer.
    info.set(0, frame[i].remaining(), timestampSec[i] * 1000000L, 0);
    muxer.writeSampleData(trackId, frame[i], info);
}
As for the initialization and ending of MediaMuxer (a sketch follows these steps):
addTrack: when MediaCodec.dequeueOutputBuffer() returns index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED, send the output format to MediaMuxer to initialize a new track for it ("video/avc" in this case).
mediamuxer.start()
start putting frames as above
mediamuxer.stop(), release()
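A minimal sketch of that sequence (encoder and muxer names are placeholders; this is an outline, not the full bigflake example):

import android.media.MediaCodec
import android.media.MediaMuxer

// Drain the encoder and feed the muxer until end of stream.
fun drainEncoderToMuxer(encoder: MediaCodec, muxer: MediaMuxer) {
    val bufferInfo = MediaCodec.BufferInfo()
    var trackId = -1
    var muxerStarted = false
    loop@ while (true) {
        val index = encoder.dequeueOutputBuffer(bufferInfo, 10_000L)
        when {
            index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED -> {
                // The output format ("video/avc" here) is only known at this point.
                trackId = muxer.addTrack(encoder.outputFormat)
                muxer.start()
                muxerStarted = true
            }
            index >= 0 -> {
                val encoded = encoder.getOutputBuffer(index)
                if (encoded != null && muxerStarted && bufferInfo.size > 0) {
                    // bufferInfo.presentationTimeUs carries the timestamp discussed above.
                    muxer.writeSampleData(trackId, encoded, bufferInfo)
                }
                encoder.releaseOutputBuffer(index, false)
                if ((bufferInfo.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break@loop
            }
            // index == MediaCodec.INFO_TRY_AGAIN_LATER: no output available yet, keep polling.
        }
    }
    muxer.stop()
    muxer.release()
}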
