Record I-frames only in Android

My app needs to record video that contains only I-frames (and no P/B frames).
I am using MediaRecorder to capture the video (not CameraX, because I need HEVC and CameraX records in H.264).
'MediaRecorder.MetricsConstants.VIDEO_IFRAME_INTERVAL' is 1 by default - hence I get an I-frame every second and the rest of the frames in that second are P-frames. I need this interval to be 0 in order to make all the frames I-frames. I have tried different ways to set it. Any help would be much appreciated.
I have tried setting it this way:
mRecorder.metrics.deepCopy().apply {
    this.putInt(MediaRecorder.MetricsConstants.VIDEO_IFRAME_INTERVAL, 0)
}
Even if I set it to '2', '5' or '10', it doesn't seem to have any effect.
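Note that MetricsConstants keys are for reading metrics, and deepCopy() returns a detached copy of the metrics bundle, so writing into it has no effect on the recorder; the key-frame interval is normally fixed at encoder configuration time. As a point of comparison, here is a minimal sketch (untested) of configuring a MediaCodec HEVC encoder directly so that every frame is a key frame; the resolution, bit rate, and frame rate are placeholder values, not something taken from the question:

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import java.io.IOException;

MediaCodec buildAllIFrameEncoder() throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat(
            MediaFormat.MIMETYPE_VIDEO_HEVC, 1920, 1080);    // placeholder size
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 10_000_000); // placeholder
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);       // placeholder
    // Per the MediaFormat docs, an interval of 0 requests a stream of
    // key frames only, which is the all-I-frames behavior wanted here.
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 0);

    MediaCodec encoder =
            MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_HEVC);
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    return encoder;
}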

Related

How much Hz/Time is covered by a getWaveForm or getFft call by Visualizer?

I have some trouble understanding the Android Visualizer interface: if I call getFft or getWaveForm on the Android Visualizer, how much time/Hz is covered by the returned ByteArray?
If I were to use setDataCaptureListener, I assume the returned buffer would relate to the rate parameter -> so if I set the capture rate to 20Hz, the returned ByteArray would cover 0.05s of the currently playing song, and if I set it to 1Hz, it would cover 1s. Is this correct?
But how does it behave in terms of getWaveForm and getFft?
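For what it's worth, the usual way to reason about this is that the time span of a single getWaveForm (or getFft) buffer is the capture size divided by the sampling rate, independent of how often the listener fires. A minimal sketch (untested), assuming audioSessionId comes from your player; note that getSamplingRate() returns milliHertz:

import android.media.audiofx.Visualizer;

// Sketch (untested): buffer duration = captureSize / samplingRate.
// `audioSessionId` is assumed to come from your MediaPlayer/AudioTrack.
Visualizer visualizer = new Visualizer(audioSessionId);
visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]); // e.g. 1024 samples
int samplingRateMilliHz = visualizer.getSamplingRate();         // note: milliHertz
double bufferSeconds =
        visualizer.getCaptureSize() / (samplingRateMilliHz / 1000.0);
// e.g. 1024 samples at 44100 Hz cover roughly 0.023 s per call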

Seek field of video node is not working properly

I am working on a video player using the Video node of a SceneGraph component. My issue is that when I set the seek field, e.g.
m.video.seek = 20, playback starts from 15 or 18 seconds, not from the exact position of 20 seconds. My code is...
inner = createObject("RoSGNode", "ContentNode")
inner.url = "http://-------------.m3u8"
inner.streamformat = "hls"
inner.SwitchingStrategy = "full-adaptation"
The video file format is m3u8, and I am using ui_resolutions=fhd in the manifest file of the roku app.
Is this issue related to the stream format or to something else? Please help me.
This is correct behavior: you have the "hls" streamformat, and this is format-specific. An HLS video stream is divided into chunks, so when you set seek, the video will start from the beginning of a chunk, not from the middle of that chunk.
I solved this issue by setting the Video component's seekMode attribute to "accurate":
<Video id="videoPlayer" visible="true" translation="[0, 0]" width="1920" height="1080" seekMode="accurate" enableTrickPlay="true" enableUI="false"/>

Adjust Dash stream volume with Android Exoplayer

I'm trying to set up a SeekBar to control the volume of an instance of ExoPlayer streaming DASH.
The setup I am using is a modified version of the demo project, and I am having trouble figuring out which element I should be trying to affect with the SeekBar's output, i.e. how to use MSG_SET_VOLUME properly, etc.
Any input would be massively appreciated.
The end result I am looking for is an application with two instances of ExoPlayer, both streaming DASH content, with a fader (SeekBar) controlling the mix of the two players (which, once this is figured out, I presume should be easy enough if the maths is correct).
Again, any help would be massively appreciated; I have had a bit of a time with ExoPlayer so far, being such a novice! Thanks, guys!
If I read the ExoPlayer source code correctly, you have to keep references to the audio renderers you use when preparing the ExoPlayer instance.
exoPlayer.prepare(audioRenderer);
To change volume you have to send the following message:
exoPlayer.sendMessage(audioRenderer, MediaCodecAudioTrackRenderer.MSG_SET_VOLUME, 0.1f);
First you pass the audioRenderer whose volume you want to change. Second, you specify the message you want to send to the renderer, which is MSG_SET_VOLUME, since you want to affect audio.
Finally you pass the value you want to set the volume to. In this example I chose 0.1f, but of course you can use any value that fits your needs.
You can affect two different playback volumes by sending messages to both of the MediaCodecAudioTrackRenderers you used to prepare playback. So you could, for example, send value 0.4f to audioRenderer1 and 0.6f to audioRenderer2 to blend the playbacks into one another.
I did not try this myself, but I think it should work.
This is the snippet of the original ExoPlayer code which is responsible for handling the MSG_SET_VOLUME message:
public void handleMessage(int messageType, Object message) throws ExoPlaybackException {
    if (messageType == MSG_SET_VOLUME) {
        audioTrack.setVolume((Float) message);
    } else {
        super.handleMessage(messageType, message);
    }
}
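To wire up the fader itself, here is a minimal sketch (untested), assuming exoPlayer1/exoPlayer2 and audioRenderer1/audioRenderer2 are set up as described above and seekBar is a standard android.widget.SeekBar; the 0..100 progress range is an arbitrary choice:

// Sketch (untested): one SeekBar mapped to complementary volumes on
// two players, so dragging the fader blends one playback into the other.
seekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
    @Override
    public void onProgressChanged(SeekBar bar, int progress, boolean fromUser) {
        float mix = progress / 100f;
        exoPlayer1.sendMessage(audioRenderer1,
                MediaCodecAudioTrackRenderer.MSG_SET_VOLUME, 1f - mix);
        exoPlayer2.sendMessage(audioRenderer2,
                MediaCodecAudioTrackRenderer.MSG_SET_VOLUME, mix);
    }

    @Override public void onStartTrackingTouch(SeekBar bar) {}
    @Override public void onStopTrackingTouch(SeekBar bar) {}
});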

MediaCodec encoding ignores my BUFFER_FLAG_SYNC_FRAME flag

In my Android application, I am encoding some media in WebM (VP8) format using MediaCodec. The encoding is working as expected. However, I need to ensure that I create a sync frame once in a while. Here is what I do:
encoder.queueInputBuffer(..., MediaCodec.BUFFER_FLAG_SYNC_FRAME);
Later in the code, I check for a sync frame:
encoder.dequeueOutputBuffer(bufferInfo, 0);
boolean isSyncFrame = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0;
The problem is that isSyncFrame never gets a true value.
I am wondering if I am making a mistake in my encoder configuration. Maybe there is a better way to tell the encoder to create a sync frame once in a while.
I hope it is not a bug in MediaCodec. Thank you in advance for your help.
There is no way (current as of Android 4.3) to request an on-demand sync frame using MediaCodec encoders. This is partly due to OMX, the underlying codec implementation in Android, which does not provide a way to specify which input frame should be encoded as a sync frame, although it does have a way to trigger a sync frame "in the near future".
feisal's answer is the only currently supported way to control sync frames, but you have to do it at configuration time.
==edit re: jesup
You can trigger a sync frame in the near future using MediaCodec.setParameters():
Bundle params = new Bundle();
params.putInt(MediaCodec.PARAMETER_KEY_REQUEST_SYNC_FRAME, 0);
mCodec.setParameters(params);
Unfortunately, there is no (reliable) way in MediaCodec to tell whether an encoded buffer is a sync frame, other than doing it on your own by inspecting the bytes.
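For VP8 specifically, that byte-level check is small: per RFC 6386, the least-significant bit of the first byte of the frame tag is the frame type, and 0 means key frame. A minimal sketch (untested):

import java.nio.ByteBuffer;

// Sketch (untested): checks a dequeued VP8 output buffer; bit 0 of the
// first byte of the frame tag is frame_type (0 = key frame, RFC 6386).
boolean isVp8KeyFrame(ByteBuffer encoded, MediaCodec.BufferInfo info) {
    return (encoded.get(info.offset) & 0x01) == 0;
}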
You can set the I-frame rate in the MediaFormat object of your encoder, at configure time, with setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, secsBetweenIFrames).

Pause media recorder programmatically: Camera.apk from a Samsung Galaxy has `this.mMediaRecorder.pause();`, but it does not work in my code

Now, I have made a library to concatenate two videos, using the mp4parser library.
With it I can pause and resume recording a video (after it records the second video, it appends it to the first one).
Now my boss has told me to write a wrapper and use this approach for the phones that do not have hardware support for pausing a video recording. For phones that do (the Samsung Galaxy S2 and Samsung Galaxy S1 can pause a video recording with their camera application), I need to do this natively, with no libraries, so it would be fast.
How can I implement this natively if, as seen on the MediaRecorder state diagram, http://developer.android.com/reference/android/media/MediaRecorder.html , there is no pause state?
I have decompiled the Camera.apk app from a Samsung Galaxy Ace, and in CamcorderEngine.class the code has a method like this:
public void doPauseVideoRecordingSync()
{
    Log.v("CamcorderEngine", "doPauseVideoRecordingSync");
    if (this.mMediaRecorder == null)
    {
        Log.e("CamcorderEngine", "MediaRecorder is not initialized.");
        return;
    }
    if (!this.mMediaRecorderRecording)
    {
        Log.e("CamcorderEngine", "Recording is not started yet.");
        return;
    }
    try
    {
        this.mMediaRecorder.pause();
        enableAlertSound();
        return;
    }
    catch (RuntimeException localRuntimeException)
    {
        Log.e("CamcorderEngine", "Could not pause media recorder. ", localRuntimeException);
        enableAlertSound();
    }
}
If I try this.mMediaRecorder.pause(); in my code, it does not work. How is this possible when they use the same import (android.media.MediaRecorder)? Have they rewritten the whole code at a system level?
Is it possible to take the input stream of the second video (while recording it) and directly append this data to my first video?
For my concatenate method I use two parameters (the two videos, both of which are FileInputStreams); is it possible to take the InputStream from the recording function and pass it as the second parameter?
If I try this.mMediaRecorder.pause();
The MediaRecorder class does not have a pause() function, so it is obvious that there is a custom MediaRecorder class on this specific device. This is not unusual, as the only thing required of OEMs is to pass the Android compatibility tests on the device; there is no restriction on adding functionality.
Is it possible to take the input stream of the second video (while recording it), and directly append this data to my first video?
I am not sure you can do this, because the video stream is encoded data (codec headers, key frames, and so on), and just combining 2 streams into 1 file will not produce a valid video file, in my opinion.
Basically, what you can do is the following (a sketch appears at the end of this answer):
get raw image data from the camera preview surface (see Camera.setPreviewCallback())
use an android.media.MediaCodec to encode the video
and then use a FileOutputStream to write it to the file.
This will give you the flexibility you want, since in this case you decide in your app which frames go into the encoder and which do not.
However, it may be overkill for your specific project, and some performance issues may arise.
PS: Oh, and by the way, try taking a look at MediaMuxer - maybe it can help you too: developer.android.com/reference/android/media/MediaMuxer.html
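As a minimal sketch (untested) of the frame-dropping idea above: while paused, preview frames are simply never handed to the encoder, which is what makes the recording "pause" without any MediaRecorder support. encodeFrame() is a hypothetical placeholder for the MediaCodec input-buffer handling, not a real API:

import android.hardware.Camera;

// Sketch (untested): drop preview frames while `paused` is true so
// they never reach the encoder; the resulting video has no gap.
class PausableRecorder implements Camera.PreviewCallback {
    volatile boolean paused;

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        if (paused) {
            return; // dropped frames never reach the encoder
        }
        encodeFrame(data); // hypothetical helper, see note above
    }

    private void encodeFrame(byte[] data) {
        // MediaCodec dequeueInputBuffer()/queueInputBuffer() would go here.
    }
}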
