I'm doing some work with computer vision on Android. I have made an app that keeps the exposure low after calibrating with the exposure lock, and that works fine. The exposure is how I want it, with the LED on and off, for almost all video frames.
However, after I toggle the camera's LED flash on, the image is briefly overexposed. Since the exposure is locked, I do not understand why this happens.
I'm testing with a Moto G (2nd gen) and a Nexus 5X; only the Nexus 5X shows the problem. From what I've read there are some difficulties with setting exposure compensation (which the Nexus 5X lacks), but the exposure lock is supposed to work, although I'm beginning to suspect it has flaws. On the Moto G (2nd gen) the exposure compensation is set to minimum (and that does work there).
Which raises the question: is the briefly overexposed frame the result of the Nexus drivers, the missing exposure compensation, or a naturally occurring problem?
If the latter, can I counteract it? I've thought about tracking the overall brightness of each frame and comparing it with a non-overexposed frame a few frames back, but that seemed tricky and didn't work in practice.
Side note: it may be worth mentioning that the Nexus briefly seems to stutter the video output after setting the camera parameters (flash on or off). Would it be conceivable that during this stutter the CMOS sensor is charged too long?
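For reference, the lock-then-toggle sequence described above might look roughly like this with the (deprecated) android.hardware.Camera API. This is only a sketch of the setup, not the app's actual code; it assumes an already-opened Camera instance:

```java
// Sketch: lock auto-exposure after calibration, then toggle the torch.
// Assumes `camera` is an open android.hardware.Camera instance.
void calibrateAndToggleFlash(android.hardware.Camera camera) {
    android.hardware.Camera.Parameters p = camera.getParameters();

    // 1. Let auto-exposure settle on the calibration scene, then lock it.
    if (p.isAutoExposureLockSupported()) {
        p.setAutoExposureLock(true);
        camera.setParameters(p);
    }

    // 2. Toggle the LED torch. On some devices (the Nexus 5X in this
    //    question) a few frames after this call still come out overexposed,
    //    even though the exposure is locked.
    p = camera.getParameters();
    p.setFlashMode(android.hardware.Camera.Parameters.FLASH_MODE_TORCH);
    camera.setParameters(p);
}
```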
I've looked into this some more. Nexus support wasn't of any help, so I basically had to work around the problem. I ended up timing the flash (there's a delay in the case of the Nexus 5X) by watching the brightness of the frames. Instead of just timing from flash low to high, I add another frame at the end to account for one overexposed frame.
I'm not sure the same issue happens when turning the flash off. I would guess so, as there's a significant frame drop around powering the flash off, just as when powering it on. Either way, that doesn't seem to be a problem for me (yet).
My guess is that this is a camera driver problem and the CMOS sensor is indeed charged too long, because frames are not fetched in time due to a delay caused by enabling/disabling the flash.
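As a sketch of the brightness-watching part of this workaround: in an NV21 preview buffer the first width*height bytes are the Y (luma) plane, so mean frame brightness can be computed directly from them. The class and method names here are my own, not from the original app:

```java
public class FrameBrightness {
    /**
     * Mean luma (0-255) of an NV21 preview frame. In NV21 the Y plane
     * occupies the first width*height bytes of the buffer.
     */
    public static double meanLuma(byte[] nv21, int width, int height) {
        long sum = 0;
        int pixels = width * height;
        for (int i = 0; i < pixels; i++) {
            sum += nv21[i] & 0xFF; // bytes are signed in Java; mask to 0-255
        }
        return (double) sum / pixels;
    }
}
```

In onPreviewFrame one would compare meanLuma against the last known good value and discard the frame (plus one extra, per the workaround above) when it spikes.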
Related
Here is some context. I am working on a 3D game on Android and have been getting fluctuating FPS, which is annoying.
I profiled the game and found that the processors were not always running at their maximum clock speed. This was because the active (and default) governor was 'interactive', which tries to be smart by adjusting the clock speed according to the current load.
I then rooted my device and set the governor to 'performance', which locks the clock speed at its maximum. As a result the game ran super smoothly, giving a stable 30 FPS.
The issue is that an application has no control over which governor is used. You cannot do anything if a player has an un-rooted phone, as it is an OS-level setting anyway. Even on rooted phones, it would still be the user's judgement, not ours, which governor profile to activate.
So my question is: what can we do as developers, at the application level, to 'trick' the processors into believing they have much more work to do than they actually have? The goal is to trick whatever governor is active into setting the clock speed at its maximum, or at least at some fixed value, so that the FPS does not fluctuate and the game does not run below user expectations.
Thanks!
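One commonly suggested (and battery-hungry) workaround is to keep a low-priority spinner thread busy so the governor sees sustained load and ramps the clock up. This is only a sketch of that idea; whether a given governor actually responds this way is device-specific, and the class name is mine:

```java
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicLong;

public class GovernorSpinner {
    private final AtomicBoolean running = new AtomicBoolean(false);
    public final AtomicLong iterations = new AtomicLong();
    private Thread spinner;

    public void start() {
        running.set(true);
        spinner = new Thread(() -> {
            while (running.get()) {
                iterations.incrementAndGet(); // pointless work to keep a core loaded
            }
        });
        spinner.setPriority(Thread.MIN_PRIORITY); // don't starve the render thread
        spinner.start();
    }

    public void stop() throws InterruptedException {
        running.set(false);
        spinner.join();
    }
}
```

The low priority means the spinner yields to real work, but it still keeps the run queue non-empty, which is what load-based governors measure. The obvious cost is wasted power and heat.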
I am writing an application which has a video recording feature. During normal daylight hours with lots of light I am able to record 30fps video.
However, when there is less light, the frame rate drops to around 7.5fps (with exactly the same code). My guess would be that android is doing something behind the scenes with the exposure time to ensure that the resulting video has the best image quality.
I, however, would prefer a higher fps to a better quality image. Assuming exposure is the issue, is there any way to control the exposure time to ensure a decent frame rate (15fps+)? There are the functions setExposureCompensation() and setAutoExposureLock(), but they seem to do nothing.
Has anyone had this issue before? Is it even exposure that is causing my issue?
Any hints/suggestions would be great.
I am sorry but the accepted answer is totally wrong. In fact I have created an account just to correct this.
Noise information gets discarded depending on the bitrate anyway; I do not understand why anyone would think this puts an extra load on the CPU at all.
In fact, video frame rate on a mobile device has a lot to do with light exposure. In a low-light situation, exposure is increased automatically, which means the shutter stays open longer to let more light in. That reduces the number of frames you can capture per second and adds some motion blur on top. With a DSLR camera you could open the aperture for more light without touching the shutter speed, but on mobile devices the aperture is fixed.
You could mess with exposure compensation to get more fps, but I do not think super dark video is what you want.
More information:
https://anyline.com/news/low-end-android-devices-exposure-triangle/
There is a simple explanation here. Lower light means there is more noise in the video. With more noise, the encoding engine has to put in far more effort to reach the compression it needs. Unless the encoder has a denoiser, it has far more noise to deal with than under normal conditions.
If you want a more technical answer: more noise means the motion-estimation engine of the encoder is thrown off, and this is the part that consumes the most CPU cycles. The more the noise, the worse the compression, and hence even the other parts of the encoder are crunching more. More bits are generated, which means the encoding and entropy engines are also crunching more, hence the worse performance.
Generally in high-end cameras a lot of noise is removed by the imaging pipeline in the sensor. Don't expect that in a mobile phone sensor, however. [This is the ISO performance that you see in DSLRs.]
I had this issue on a Galaxy S III running Android 4.2. After experimenting with parameters I found one call that made it work.
Look at Camera.Parameters; if you print them out, you'll see:
preview-fps-range=15000,30000;
preview-fps-range-values=(8000,8000),(10000,10000),(15000,15000),(15000,30000),(30000,30000);
The range allows the fps to "slow down".
The call setPreviewFpsRange(30000, 30000) forces the fps to stay around 30.
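The selection step can be sketched as plain Java: scan the supported ranges (a list of {min, max} pairs, as returned by getSupportedPreviewFpsRange() and printed above) for the highest fixed range. The helper name is mine:

```java
import java.util.List;

public class FpsRangePicker {
    /**
     * Pick the fixed (min == max) range with the highest rate, e.g.
     * {30000, 30000} from the list printed above. Returns null if the
     * camera exposes no fixed range.
     */
    public static int[] highestFixedRange(List<int[]> supported) {
        int[] best = null;
        for (int[] range : supported) {
            if (range[0] == range[1] && (best == null || range[0] > best[0])) {
                best = range;
            }
        }
        return best;
    }
}
```

With the real API one would then call parameters.setPreviewFpsRange(best[0], best[1]) and set the parameters back on the camera.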
This is right: you should call setPreviewFpsRange() to get a constant fps. The frame rate you see is dropping because of the sensor; when light is low, the fps goes down so it can produce better pictures (as in still mode).
Also to achieve higher frame rate you should use:
Camera.Parameters parameters = camera.getParameters();
parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
parameters.setRecordingHint(true);
camera.setParameters(parameters);
Given a known periodic motion (e.g., walking), I'd like to take a full resolution snapshot at the same point in the motion (i.e., the same time offset within different periods). However on the Nexus S (currently running OS 4.1.1 but the same was true of previous OS versions), I'm seeing so much variability in the shutter lag that I cannot accurately plan the timing of the snapshot. Here is a histogram of the shutter lags of 50 photographs. (I measured the shutter lag with one System.nanoTime() just before Camera.takePicture() and another System.nanoTime() at the beginning of the shutter callback. The camera lens was consistently covered to remove any variability due to lighting.)
Is there anything I can do in the application to reduce this shutter lag variability? (In this application, the mean lag can be any duration but the standard deviation must be small ... much smaller than the 0.5 s standard deviation of the shutter lags shown in the above histogram.) I'm hoping someone has a clever suggestion. If I don't get any suggestions, I'll post a feature request in the Android bug tracker.
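The measurement described above boils down to collecting nanoTime() deltas and computing their spread. A minimal pure-Java version of that bookkeeping (class and method names are mine, not from the original code):

```java
public class LagStats {
    /**
     * Population standard deviation, in seconds, of shutter lags given
     * in nanoseconds. Each lag sample is nanoTime() at the start of the
     * shutter callback minus nanoTime() just before takePicture().
     */
    public static double stdDevSeconds(long[] lagsNanos) {
        double mean = 0;
        for (long lag : lagsNanos) mean += lag;
        mean /= lagsNanos.length;

        double var = 0;
        for (long lag : lagsNanos) {
            double d = lag - mean;
            var += d * d;
        }
        var /= lagsNanos.length;

        return Math.sqrt(var) / 1e9; // convert ns to seconds
    }
}
```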
UPDATE:
I turned off auto-focus (by setting focus to infinity) following RichardC's suggestion at https://groups.google.com/forum/?hl=en&fromgroups#!topic/android-developers/WaHhTicKRA0
It helped, as shown in the following histogram. Any ideas to reduce the shutter lag variability even more?
UPDATE 2:
I disabled the remaining auto parameters: white balance, scene mode, flash. The following histogram of shutter lag time variability seems to be about as good as it gets for a Nexus S running OS 4.1.1. It would be nice to have much less variability in the shutter lag time, perhaps by specifying an optional minimum shutter lag time in Camera.takePicture() which would delay the shutter if it were ready before the specified minimum. I've posted a feature request to the Android issue tracker. If you are also interested in getting the feature, "star" it at http://code.google.com/p/android/issues/detail?id=35785
What's the x axis and what's the y axis?
You can use low-level code (e.g. the NDK or RenderScript) instead of Java.
You might also be able to change the camera settings to make it faster.
I am trying to write a variable-brightness flashlight app using PWM (I might use it for communication later). For that I need fast switching of the camera LED (say 100-200 Hz), which is not possible through the Camera API's setParameters functionality (I guess the camera itself slows things down considerably).
Now, the LED is capable of switching rapidly, and there are apps doing something similar (HTC's flashlight, for example; unfortunately I couldn't find its source code), so it all comes down to controlling the LED without the camera.
Any thoughts or ideas?
I know this is 4 years later, but you'd need a lot more than 100-200 Hz for PWM to work properly without irritating the eye. You might get some control, but you won't be able to get 10% brightness without the pulses becoming noticeable, and even then the duration of those pulses is too long to fool the eye. Typically PWM is handled at the microsecond level, around 100 kHz. I would like this to be possible as well: if we could have, say, a 100 kHz carrier frequency in the flash, it would be possible to calculate the distance to a subject with dedicated pixels in the sensor, and to reject all ambient light through demodulation, if all pixels could be scanned fast enough. Sadly that's not possible.
Normally, to do that, there would be a PWM peripheral in the processor that handles the rapid switching for you, but that needs driver support; it won't be accessible to user applications. Here's a question which uses the driver to do it: Set brightness of flash in Android
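For the record, the driver route mentioned above amounts to writing to a sysfs brightness node. The exact path is device-specific and usually needs root; the one named in the comment below is purely illustrative. A rough software PWM loop, with the node path taken as a parameter:

```java
import java.io.FileWriter;
import java.io.IOException;

public class LedPwm {
    /**
     * Toggle an LED node (e.g. a hypothetical
     * /sys/class/leds/torch/brightness) for `cycles` periods at `freqHz`
     * with the given duty cycle. Needs write access to the node; on a
     * real device that typically means root.
     */
    public static void pulse(String node, int freqHz, double duty, int cycles)
            throws IOException, InterruptedException {
        long periodUs = 1_000_000L / freqHz;
        long onUs = (long) (periodUs * duty);
        long offUs = periodUs - onUs;
        for (int i = 0; i < cycles; i++) {
            write(node, "1");
            Thread.sleep(onUs / 1000, (int) ((onUs % 1000) * 1000));
            write(node, "0");
            Thread.sleep(offUs / 1000, (int) ((offUs % 1000) * 1000));
        }
    }

    private static void write(String node, String value) throws IOException {
        try (FileWriter w = new FileWriter(node)) {
            w.write(value);
        }
    }
}
```

As the answer above notes, Thread.sleep granularity (milliseconds, roughly) limits a user-space loop like this to low frequencies; microsecond-level PWM really does need the hardware peripheral.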
When running the basic sample:
http://android-developers.blogspot.com/2009/04/introducing-glsurfaceview.html
Add some basic timing using System.currentTimeMillis(), or drop to native code and use clock_gettime(CLOCK_REALTIME_HR, &res); both appear to yield the same results. I tested with each separately.
I always calculate a frame rate of ~56, a rough average of 18 ms between frames. I realize the timing of frames is not a perfect science (timer accuracy, the device doing stuff in the background). I simply let it run doing absolutely nothing, not even a glClear. After an hour these are the results I got, which are about the same as letting it run for 1 minute.
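The timing described above can be factored into a tiny helper fed with timestamps from onDrawFrame(); the names here are mine:

```java
public class FrameTimer {
    private long lastNanos = -1;
    private long totalNanos = 0;
    private int frames = 0;

    /** Call once per frame, e.g. with System.nanoTime() from onDrawFrame(). */
    public void tick(long nowNanos) {
        if (lastNanos >= 0) {
            totalNanos += nowNanos - lastNanos;
            frames++;
        }
        lastNanos = nowNanos;
    }

    /** Average frames per second over all ticks so far. */
    public double averageFps() {
        if (frames == 0) return 0;
        return frames / (totalNanos / 1e9);
    }
}
```

Feeding it the ~18 ms intervals measured above yields the ~56 fps figure, so the helper reproduces the same arithmetic as the ad-hoc timing.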
I've tried many things to achieve a true 59/60fps. I'm beginning to wonder if it is my device (HTC Incredible S). Although I have tried Samsung Galaxy S and Nexus S both with the same results. Tablets do not exhibit this behavior at all. On my EEE Transformer and Xoom I get ~16ms a frame consistently.
This leads me to believe VSYNC is the issue and that the VSYNC on that generation of Android phones is slow; really it should be ~16 ms. Achieving a true 60 fps is unlikely, but 59 fps should be common.
Has anyone else experienced this? Any tips for disabling VSYNC? I don't think that is possible. Any tips for gaining control of the "swap" step of rendering?