How to find the input fps from a camera in Android

I am working with a Qualcomm Snapdragon processor with a peripheral camera. The project I am working on needs to process the frames from the camera with OpenCV as quickly as possible. To verify that things are working, I would like to know both the output fps achieved and the input fps, to see how many frames are being skipped. Is there a way to see how many frames per second the camera is providing to the processor? It should work in the general case, for any input camera.
I've looked through the Camera2 API a bit and found the parameter SENSOR_FRAME_DURATION, but I am not quite sure how to access it. Also, I am using JavaCameraView, which acts as a bridge between the camera and OpenCV. Any advice would be appreciated.

I calculate the frame rate as below: in your onCameraFrame() method, measure the time elapsed between two consecutive frames.
prevTime = time;
time = SystemClock.elapsedRealtime();
double fps = 1000.0 / (time - prevTime);  // instantaneous output frame rate
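Fleshed out, a minimal sketch inside a CvCameraViewListener2 implementation could look like this (the log tag and the smoothing factor are just illustrative):
private long prevTime = 0;
private double avgFps = 0;

@Override
public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
    long time = SystemClock.elapsedRealtime();
    if (prevTime != 0) {
        double fps = 1000.0 / (time - prevTime);
        avgFps = 0.9 * avgFps + 0.1 * fps;  // exponential moving average smooths jitter
        Log.d("FPS", String.format("inst %.1f, avg %.1f", fps, avgFps));
    }
    prevTime = time;
    return inputFrame.rgba();  // your OpenCV processing goes here
}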

Related

How to schedule a per-frame callback with any form of Android video capture

I'm trying to make an Android app that records the phone's yaw into a binary file, once per frame of video capture. I initially tried to make this work with OpenCV, but the fact that its video writer doesn't work seems to introduce a lot of complications. Subsequently I looked into the camera2 API and CameraX in the hope of using them instead, but I found no way to implement per-frame callbacks. I thought of making a service app that intercepts camera events in a non-blocking manner and then converts seconds to frames, since my frame rate is fixed at 30 for the moment, but I'm struggling to find a way here as well. Can anyone suggest a way to achieve the expected result?
I also tried to inspect the composition of Android's MP4 output (because, from my understanding, it could carry per-frame metadata), but to no avail.
Update:
I've managed to record video with CameraX while writing the yaw on a background thread from a sensor callback whose period is set to one frame's duration derived from the frame rate, but this is still not very precise. I tried adding an ImageAnalysis use case to my UseCaseGroup, since it is called per frame, but I ran into an issue that was already mentioned here. Does anyone know how to resolve this, or any workaround to achieve a per-frame callback?
Camera2 provides per-frame callbacks in several ways; the simplest is probably the onCaptureCompleted callback.
That callback receives all the frame metadata, including the start-of-exposure timestamp. You can use that timestamp to match up with the sensor yaw samples and their timestamps, and then pass the pair on to whatever recording mechanism you have.
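A sketch of that callback (matchYawSample() is a placeholder for your own timestamp lookup):
CameraCaptureSession.CaptureCallback captureCallback =
        new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
            CaptureRequest request, TotalCaptureResult result) {
        // Start-of-exposure timestamp in nanoseconds; comparable to
        // SensorEvent.timestamp when SENSOR_INFO_TIMESTAMP_SOURCE is REALTIME.
        Long ts = result.get(CaptureResult.SENSOR_TIMESTAMP);
        if (ts != null) {
            matchYawSample(ts);  // placeholder: pair with the nearest yaw sample
        }
    }
};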
CameraX also lets you hook into the same per-frame callback, if there's no existing convenient path, via:
https://developer.android.com/reference/androidx/camera/camera2/interop/Camera2Interop.Extender?hl=en#setSessionCaptureCallback(android.hardware.camera2.CameraCaptureSession.CaptureCallback)
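A rough sketch of wiring the same callback into a CameraX Preview use case (the Camera2 interop API is marked experimental, so this comes with that caveat):
Preview.Builder previewBuilder = new Preview.Builder();
new Camera2Interop.Extender<>(previewBuilder)
        .setSessionCaptureCallback(captureCallback);  // the CaptureCallback from above
Preview preview = previewBuilder.build();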

How to fix the frame rate of the camera on an Android phone

I want to fix the frame rate of the camera preview in Android at, say, 20 fps or 30 fps. However, I find that the frame rate is unstable.
The Android documentation says that the frame rate fluctuates between the minimum and maximum frame rates defined in getSupportedPreviewFpsRange():
https://developer.android.com/reference/android/hardware/Camera.Parameters.html#getSupportedPreviewFpsRange%28%29
My questions are:
1) Which factors influence the frame rate? Exposure time, white balance, frame resolution, background CPU load, etc.?
2) Is there any method to fix the frame rate by customising the above factors?
3) In my project, a higher frame rate is better. If the frame rate ends up unstable, can I increase the minimum frame rate, or fix it?
4) It seems that video recording is somewhat different from preview mode. Can I fix the frame rate, or the minimum frame rate, for video recording in Android?
Finally, we found that iOS can fix the frame rate using videoMinFrameDuration and videoMaxFrameDuration.
Thanks.
First of all, please note that the camera API you are asking about was deprecated more than three years ago. The new camera2 API provides much more control over all aspects of capture, including frame rate, especially if your goal is smooth video recording. Actually, MediaRecorder performs its job decently on older devices, but I understand that this knowledge has little practical value if for some reason you cannot use MediaRecorder.
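For completeness, with camera2 the frame rate is requested per capture request, roughly like this (assuming an open CameraDevice and a configured session; exception handling omitted):
// Pick a range listed in CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES
CaptureRequest.Builder builder =
        cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, new Range<>(30, 30));
builder.addTarget(recorderSurface);  // e.g. the MediaRecorder or MediaCodec input surface
session.setRepeatingRequest(builder.build(), null, backgroundHandler);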
Usually, the list of supported FPS ranges includes fixed ranges, e.g. (30, 30), intended precisely for video recording. Note that you are expected to choose a compliant (recommended) preview (video) resolution.
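With the old API, selecting such a fixed range could look like this (the values are fps multiplied by 1000):
Camera.Parameters params = camera.getParameters();
for (int[] range : params.getSupportedPreviewFpsRange()) {
    if (range[Camera.Parameters.PREVIEW_FPS_MIN_INDEX] == 30000
            && range[Camera.Parameters.PREVIEW_FPS_MAX_INDEX] == 30000) {
        params.setPreviewFpsRange(30000, 30000);  // fixed 30 fps
        break;
    }
}
camera.setParameters(params);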
Two major factors cause frame rate variations within the declared range: exposure adjustments and focus adjustments. To achieve a uniform rate, you should disable autofocus. If your camera supports exposure control, you should lock it, too. Refrain from using exotic "scenes" and "effects"; SCENE_MODE_BARCODE and EFFECT_MONO don't seem to cause problems with frame rate, and white balance is OK, too.
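In terms of the old API, disabling those two sources of variation might look like this (with capability checks, since not every device supports them):
Camera.Parameters params = camera.getParameters();
// A fixed focus avoids refocusing stalls mid-stream
if (params.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_FIXED)) {
    params.setFocusMode(Camera.Parameters.FOCUS_MODE_FIXED);
}
// Lock auto-exposure once the scene looks right
if (params.isAutoExposureLockSupported()) {
    params.setAutoExposureLock(true);
}
camera.setParameters(params);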
Other factors that cause frame rate fluctuations are completely under your control:
Make sure that your camera callbacks do not interfere with, and are not delayed by, the main (UI) thread. To achieve that, open the camera on a secondary HandlerThread. The new camera2 API makes thread management for camera callbacks easier.
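With camera2 the threading setup is explicit; a sketch (permission checks and exception handling omitted):
HandlerThread cameraThread = new HandlerThread("CameraBackground");
cameraThread.start();
Handler cameraHandler = new Handler(cameraThread.getLooper());
// All device and session callbacks now run on cameraThread, never on the UI thread
cameraManager.openCamera(cameraId, stateCallback, cameraHandler);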
Don't use setPreviewCallback(), which automatically allocates a pixel buffer for each frame. This is a significant burden on the garbage collector, which may lock all threads once in a while for a major cleanup. Instead, use setPreviewCallbackWithBuffer() and preallocate just enough pixel buffers to keep the pipeline always busy.
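A typical setup with preallocated buffers (NV21 is the default preview format; three buffers is a reasonable starting guess):
Camera.Size size = camera.getParameters().getPreviewSize();
int bufSize = size.width * size.height
        * ImageFormat.getBitsPerPixel(ImageFormat.NV21) / 8;
for (int i = 0; i < 3; i++) {
    camera.addCallbackBuffer(new byte[bufSize]);  // recycled instead of garbage-collected
}
camera.setPreviewCallbackWithBuffer(previewCallback);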
Don't perform heavy calculations in the context of your onPreviewFrame() callback. Pass the work to a different thread, and do your best to release the pixel buffer as early as possible.
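The callback itself can then stay trivial; here workerHandler is assumed to be a Handler bound to a separate HandlerThread:
Camera.PreviewCallback previewCallback = new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(final byte[] data, final Camera camera) {
        workerHandler.post(new Runnable() {
            @Override
            public void run() {
                processFrame(data);              // placeholder for the heavy work
                camera.addCallbackBuffer(data);  // return the buffer as soon as possible
            }
        });
    }
};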
Even with the old camera API, if the device lists a supported FPS range of (30, 30), you should be able to select this range and get a consistent, fixed frame rate for video recording.
Unfortunately, some devices disregard your frame rate request once the scene gets too dark and increase exposure times past 1/30 s. For many applications this is the preferable behavior, but such applications should simply select a wider frame rate range, like (15, 30).

Frame rate (interval) issues in recording GL rendering with a GLSurfaceView using MediaCodec

I've been developing the recording component of our app, which records GL renderings on a GLSurfaceView using MediaCodec and MediaMuxer. I found a nice set of examples by fadden (bigflake and Grafika; thanks so much, fadden) and tried them out. I built a recorder based on the game-recording model of Android Breakout, because our app uses GLSurfaceView to do the rendering.
It seemed to work well on my Nexus 7 (2013, running 4.4.2 and 5.0.2). I was able to record rendered screens, and the encoded MP4s played well on other devices. However, when slightly more complicated rendering got involved, it started dropping frames. It's not actually that complicated, as it takes only around 4 ms to render a frame, so I've been trying to identify the source of the problem.
My code is very similar to the Android Breakout recorder's; onDrawFrame() looks like the following.
@Override
public void onDrawFrame(GL10 unused) {
    GLES20.glViewport(mViewportXoff, mViewportYoff, mViewportWidth, mViewportHeight);
    drawFrame();

    GameRecorder recorder = GameRecorder.getInstance();
    if (recorder.isRecording() && recordThisFrame()) {
        saveRenderState();
        // switch to the recorder's EGL context/surface
        recorder.makeCurrent();
        recorder.getProjectionMatrix(mProjectionMatrix);
        recorder.setViewport();
        // render everything again, this time into the encoder's input surface
        drawFrame();
        recorder.swapBuffers();
        restoreRenderState();
    }
}
I tried to measure the performance of the calls line by line and found that recorder.swapBuffers() takes around 10 ms on the Nexus 7. Moving buffers could plausibly take that much time, which I think is reasonable. Another call that takes longer than expected is recorder.makeCurrent(), at around another 10 ms.
I also tried to measure the call intervals of onDrawFrame(). When recording is off (with recordThisFrame() returning false), I consistently get 16.7~17 ms, which is expected. When recordThisFrame() alternates between true and false to record every other frame, I get around 22~25 ms (when recording) and 4~10 ms otherwise. Even after changing the drawFrame() call to a simple glClear(), I got the same result. I tried this with the Android Breakout recorder (fadden's original code) and got the same result as well.
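For reference, the per-call timing was done along these lines (keeping in mind that GL is asynchronous, so blocking calls like swapBuffers() also absorb work queued earlier):
long t0 = SystemClock.elapsedRealtimeNanos();
recorder.makeCurrent();
long t1 = SystemClock.elapsedRealtimeNanos();
drawFrame();
recorder.swapBuffers();
long t2 = SystemClock.elapsedRealtimeNanos();
Log.d(TAG, String.format("makeCurrent %.2f ms, draw+swap %.2f ms",
        (t1 - t0) / 1e6, (t2 - t1) / 1e6));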
I also tried the "Record GL app with FBO" activity in Grafika, and found that it gets more consistent 16~17 ms intervals even while recording. I was tempted to replace GLSurfaceView with SurfaceView, as the Grafika recording activity does, but that change does not seem to be a viable option for now.
Is the makeCurrent() call really that costly? Has anyone experienced similar interval issues? Any thoughts would be appreciated.

Getting a stable frame rate from Android

I'm trying to get a stable frame rate from a simple camera Android application written based on the guide below (the non-intent version):
http://developer.android.com/guide/topics/media/camera.html
There is only a preview; there is no image or video capture. Every time onPreviewFrame() (implementing PreviewCallback) is called, I check timestamp differences to measure the frame rate. Though on average it meets the 15 fps rate I set (via setPreviewFpsRange(15000, 15000), verified as supported on the device using getSupportedPreviewFpsRange()), the individual frame rates vary from 5 fps to 40 fps.
Is there a way to fix this, and what are the reasons for it? One guess is application process priority: it was observed that running more applications reduced the fps, so one solution would be to raise the priority of the camera application. A second reason could be garbage collection and slow buffer copies in the preview path. A third is that the camera API (not the new camera2 API of Android L; my device is not supported yet) was not designed for streaming camera data.
The auto-exposure lock was also enabled to help fix the frame rate.
It's most likely just how you are timestamping the frames. Try saving a sequence of preview frames, pointed at a stopwatch, to disk, and view them with a YUV player (e.g. http://www.yuvtoolkit.com/).
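A quick sketch of such a dump with the old camera API (assuming this runs inside an Activity; frameIndex is just a counter field, and the disk writes themselves disturb timing, so keep the run short):
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    File f = new File(getExternalFilesDir(null), "frame_" + frameIndex++ + ".yuv");
    try (FileOutputStream out = new FileOutputStream(f)) {
        out.write(data);  // raw NV21; view with the width/height from getPreviewSize()
    } catch (IOException e) {
        Log.e("Dump", "write failed", e);
    }
}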

Taking a photo every 66 milliseconds on an Android phone for colour analysis (heart rate monitor)

I'm doing a final-year project at university that involves making a medical application for Android; as practice, I have to make a heart rate monitor app.
I have worked out that the best way to do this is to look for colour changes in your blood when holding the camera against your finger with the flash switched on.
This is where the problems come in: is it possible to take a photo every 66 milliseconds with the camera, and then compare each pair of photos for intensity changes in order to count heart beats? Or am I better off recording a video and analysing each frame, looking for a change?
Heck, is it even possible to just look at the video preview and compare each frame?
The questions I need answered are listed below:
What is the best method for this: taking photos, recording video, or looking at the live preview?
Are there any posts or pages on the internet where people have attempted similar things?
Does anyone have a basic method I should follow to get two images that I can compare within the time frame?
Lastly, if I do take the basic take-a-picture-every-66-milliseconds approach, what can I do to ensure the pictures are taken at the correct time intervals?
What is the best method for this: taking photos, recording video, or looking at the live preview?
I would think that the live preview is the right answer. Taking photos is not: photos cannot be captured anywhere near that quickly. Recording video and post-processing it would be possible, but I fail to see how that would be applicable to a real-time heart monitor.
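As a sketch of the preview approach with the old camera API: NV21 preview frames begin with the luminance (Y) plane, and with the flash on and a finger covering the lens, the average luminance alone tracks the pulse reasonably well (onSample() is a placeholder for your own peak detector):
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Size s = camera.getParameters().getPreviewSize();
    int n = s.width * s.height;  // the first width*height bytes are the Y plane
    long sum = 0;
    for (int i = 0; i < n; i++) {
        sum += data[i] & 0xFF;
    }
    onSample((double) sum / n, SystemClock.elapsedRealtime());
}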
Are there any posts or pages on the internet where people have attempted similar things?
You can examine the Barcode Scanner source code, which uses the live preview to scan for barcodes.
Lastly, if I do take the basic take-a-picture-every-66-milliseconds approach, what can I do to ensure the pictures are taken at the correct time intervals?
Android is not a hard RTOS. AFAIK, it will be impossible for you to time things to be precisely 66 milliseconds apart.
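The closest best-effort approximation is to schedule against absolute times rather than fixed delays, so the error does not accumulate; a sketch (grabFrame() is a placeholder, and there is still no hard guarantee):
private final Handler handler = new Handler(Looper.getMainLooper());
private long nextShot;

void start() {
    nextShot = SystemClock.uptimeMillis();
    handler.post(tick);
}

private final Runnable tick = new Runnable() {
    @Override
    public void run() {
        grabFrame();                         // placeholder: sample the latest preview frame
        nextShot += 66;                      // absolute target, so drift doesn't accumulate
        handler.postAtTime(this, nextShot);  // postAtTime uses the uptimeMillis() time base
    }
};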
