Android display refresh rate

I'm developing an Android app which acts like a movie clapperboard/clapboard/slate. Is there any way in which I can set the display's refresh rate?
It is very important because when you edit the movie it's necessary to "land" on specific frames. The point is that if the timer is set to 25 frames per second, I need the display to update exactly 25 times per second, when the timer changes its value.
The problem on physical devices is that, for example, my Samsung Spica GT-I5700 reports a refresh rate of 62.016 Hz, which is totally inappropriate for a 25 fps timecode: when editing you see Frame1-Frame1-Frame2-Frame2 etc., or in-between combinations, when you should see exactly Frame1-Frame2 etc.
The point is that I would need the refresh rate to be in sync with the timecode. If the user sets 25 fps, then the display should refresh exactly 25 times per second.
Any ideas, please? Thank you!

Unfortunately, there is no way to set an arbitrary refresh rate for any given device, whether it is Android or not. Video circuitry is limited to the fidelity of its timers, and like all discrete systems, cannot operate with continuous precision. However, this shouldn't be too much of a problem. If you're performing a dirty render (on demand), simply base your frame build around a 25 Hz system timer, and the video hardware will render it during the next raster pass - a 60 Hz raster pass will almost certainly occur within your 25 Hz interval. You'll also have to consider frame-dropping when the system is too busy to honour a 25 Hz interval. If you're using fixed-rate rendering, simply use the elapsed time between renders to determine the appropriate frame for a given frame rate.
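As an illustration of that elapsed-time approach, here is a minimal sketch that derives the current timecode frame from elapsed time on each vsync (assuming a device with Choreographer; the class and method names are illustrative, not from any library):

    import android.view.Choreographer;

    public class TimecodeDriver implements Choreographer.FrameCallback {
        private static final int FPS = 25;              // user-selected timecode rate
        private long startNanos = -1;
        private int lastShownFrame = -1;

        public void start() {
            Choreographer.getInstance().postFrameCallback(this);
        }

        @Override
        public void doFrame(long frameTimeNanos) {
            if (startNanos < 0) startNanos = frameTimeNanos;

            // Which 25 fps frame should be visible right now? The panel still
            // refreshes at ~60 Hz, but every raster pass shows whatever frame the
            // elapsed time dictates, so the timecode never drifts ahead or behind.
            long elapsedNanos = frameTimeNanos - startNanos;
            int frame = (int) (elapsedNanos * FPS / 1_000_000_000L);

            if (frame != lastShownFrame) {
                lastShownFrame = frame;
                renderTimecodeFrame(frame);             // dirty render: redraw only on change
            }
            Choreographer.getInstance().postFrameCallback(this);
        }

        private void renderTimecodeFrame(int frame) {
            // Draw hh:mm:ss:ff for this frame index (app-specific).
        }
    }

Individual 25 fps frames will still be shown for an uneven number of 60 Hz refreshes, but the frame on screen is always the one the timecode calls for.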

Related

Determine exact screen flip times on Android

I am attempting to determine (to within 1 ms) when particular screen flips happen on Android. Choreographer fires every time a frame flips, but gives no way of determining which frame is actually being displayed. According to https://source.android.com/devices/graphics/architecture.html, there are several layers in the process: the userland buffer, which flips to a triple-buffered queue, which flips to the surface flinger, which flips to the hardware. Each of these layers can potentially drop a frame, but at this point I have only determined how to monitor the userland buffer. Is there a way to monitor the other buffers/flips (in real time, on a non-rooted, non-custom phone)?
I have observed unexpected frame delays on the HTC M8 (about 1 every 5 minutes), but the Nexus 7 does not appear to have this problem. I measure the delays by using a Cedrus StimTracker (http://cedrus.com/stimtracker/) with a photo sensor and the Lab Streaming Layer (https://github.com/sccn/labstreaminglayer). I have tried using eglPresentationTimeANDROID to control when screens are flipped, and that has not fixed the problem.
Note that I'm using the ndk, but I can usually use the JNI to get access to non-ndk features when I need to.
The reason I care is in order to use Android for psychological and neurological experiments, where 1 ms precision is highly desirable.
As far as accessible APIs go, it sounds like you've found the relevant bits and pieces. If you haven't yet, please read through this stackoverflow item.
Using Choreographer and extrapolation, you can guess at when the next display refresh will occur. Using eglPresentationTimeANDROID() on an Android 5.0+ device, you can tell SurfaceFlinger when you want a particular frame to be sent to the display. Assuming SurfaceFlinger is properly accounting for all latency (such as additional frames added by "smart" panels), that should get you reliable timing.
(Bear in mind that the timing is based on when the display latches the next frame, not when the next frame is fully visible on the display... the latency there will depend on the panel.)
Grafika's "scheduled swap" Activity uses this feature, but it sounds like you're already familiar.
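As a rough Java-side sketch of that schedule-ahead approach (the one-refresh-period offset, the surrounding EGL setup, and the class name are assumptions, not the actual Grafika code):

    import android.opengl.EGL14;
    import android.opengl.EGLDisplay;
    import android.opengl.EGLExt;
    import android.opengl.EGLSurface;
    import android.view.Choreographer;

    public class ScheduledSwapper implements Choreographer.FrameCallback {
        private final EGLDisplay eglDisplay;
        private final EGLSurface eglSurface;
        private long refreshPeriodNanos = 16_666_667L;   // ~60 Hz; query Display for the real value
        private long lastVsyncNanos;

        public ScheduledSwapper(EGLDisplay display, EGLSurface surface) {
            this.eglDisplay = display;
            this.eglSurface = surface;
        }

        @Override
        public void doFrame(long frameTimeNanos) {
            lastVsyncNanos = frameTimeNanos;

            // ... render the frame with GLES here ...

            // Ask SurfaceFlinger to latch this frame at (approximately) the next vsync.
            long desiredPresentNanos = lastVsyncNanos + refreshPeriodNanos;
            EGLExt.eglPresentationTimeANDROID(eglDisplay, eglSurface, desiredPresentNanos);
            EGL14.eglSwapBuffers(eglDisplay, eglSurface);

            Choreographer.getInstance().postFrameCallback(this);
        }
    }

Whether one refresh period of lead time is enough depends on how long your rendering takes; systrace is the tool for verifying that.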
The only way to get signaled by the display when it does the swap would be to dup() the display-retire fence fd from the previous frame, and wait on it. Some of the code in SurfaceFlinger does this, notably DispSync watches the retire fences to see if the software "VSYNC" is drifting. There is no public API for fences, and the user-space response time could certainly be more than 1ms anyway... it usually works out better to schedule ahead than it does to react. Your requirement for non-rooted non-custom devices makes this problematic.
If you're mostly seeing correct behavior, but occasionally seeing a miss, your best bet is to use systrace to track down the cause.

Camera2 API - How to set long exposure times

I'm trying to capture images with 30 seconds exposure times in my app (I know it's possible since the stock camera allows it).
But SENSOR_INFO_EXPOSURE_TIME_RANGE (which is supposed to be in nanoseconds) gives me the range:
13272 - 869661901
in seconds it would be just
0.000013272 - 0.869661901
Which obviously is less than a second.
How can I use longer exposure times?
Thanks in advance!
The answer to your question:
You can't. You checked exactly the right information and interpreted it correctly. Any value you set for the exposure time longer than that will be clipped to that max amount.
The answer you want:
You can still get what you want, though, by faking it. You want 30 continuous seconds' worth of photons falling on the sensor, which you can't get. But you can get something (virtually) indistinguishable from it by accumulating 30 seconds' worth of photons with tiny missing intervals interspersed.
At a high level, what you need to do is create a List of CaptureRequests and pass it to CameraCaptureSession.captureBurst(...). This will take the shots with as minimal an interstitial time as possible. When each frame of image data is available, pass it to some new buffer somewhere and accumulate the information (simple point-wise addition). This is probably most properly done with an Allocation as the output Surface and some RenderScript.
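As a rough sketch of that burst setup, assuming the camera device, capture session, and output Surface are already configured (the 30 s target, the ISO value, and the class name are placeholders):

    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraCaptureSession;
    import android.hardware.camera2.CameraDevice;
    import android.hardware.camera2.CaptureRequest;
    import android.os.Handler;
    import android.view.Surface;
    import java.util.ArrayList;
    import java.util.List;

    public class LongExposureBurst {
        // maxExposureNs is the upper bound reported by SENSOR_INFO_EXPOSURE_TIME_RANGE.
        public static void capture(CameraDevice camera,
                                   CameraCaptureSession session,
                                   Surface outputSurface,
                                   long maxExposureNs,
                                   Handler handler) throws CameraAccessException {
            long totalNs = 30_000_000_000L;                                  // the 30 s we actually want
            int shots = (int) Math.ceil((double) totalNs / maxExposureNs);   // ~35 for this device

            List<CaptureRequest> burst = new ArrayList<>(shots);
            for (int i = 0; i < shots; i++) {
                CaptureRequest.Builder b = camera.createCaptureRequest(CameraDevice.TEMPLATE_MANUAL);
                b.addTarget(outputSurface);
                b.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);
                b.set(CaptureRequest.SENSOR_EXPOSURE_TIME, maxExposureNs);   // longest single exposure
                b.set(CaptureRequest.SENSOR_SENSITIVITY, 100);               // pick an ISO that suits the scene
                burst.add(b.build());
            }

            // Each completed frame arrives on outputSurface; accumulate the pixel
            // values there (point-wise addition) to build the 30 s "exposure".
            session.captureBurst(burst, /* callback */ null, handler);
        }
    }
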
Notes on data format:
The right way to do this is to use the RAW_SENSOR output format if you can. That way the accumulated output really is directly proportional to the light that was incident to the sensor over the whole 30s.
If you can't use that for some reason, I would recommend using YUV_420_888 output, and make sure you set the tone map curve to be linear (unfortunately you have to do this manually by creating a curve with two points; see the sketch after these notes). Otherwise the non-linearity introduced will ruin our scheme. (Although I'm not sure simple addition is exactly right in a linear YUV space, it's a first approach at least.) Whether you use this approach or RAW_SENSOR, you'll probably want to apply your own gamma curve/tone map after accumulation to make it "look right."
For the love of Pete don't use JPEG output, for many reasons, not the least of which is that this will most likely add a LOT of interstitial time between exposures, thereby ruining our approximation of 30s of continuous exposure.
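As a sketch, forcing a linear tone map with just two control points per channel might look like this (the helper class is illustrative; the builder is whatever CaptureRequest.Builder you are submitting):

    import android.hardware.camera2.CameraMetadata;
    import android.hardware.camera2.CaptureRequest;
    import android.hardware.camera2.params.TonemapCurve;

    public class LinearTonemap {
        public static void apply(CaptureRequest.Builder builder) {
            // Two control points per channel, (0,0) and (1,1): an identity (linear) curve.
            float[] linear = { 0.0f, 0.0f, 1.0f, 1.0f };
            builder.set(CaptureRequest.TONEMAP_MODE, CameraMetadata.TONEMAP_MODE_CONTRAST_CURVE);
            builder.set(CaptureRequest.TONEMAP_CURVE, new TonemapCurve(linear, linear, linear));
        }
    }
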
Note on exposure equivalence:
This will produce almost exactly the exposure you want, but not quite. It differs in two ways.
There will be small missing periods of photon information in the middle of this chunk of exposure time. But on the time scale you are talking about (30s), missing a few milliseconds of light here and there is trivial.
The image will be slightly noisier than if you had taken a true single exposure of 30s. This is because each time you read out the pixel values from the actual sensor, a little electronic noise gets added to the information. So in the end you'll have 35 times as much of this additive noise (from the 35 exposures for your specific problem) as a single exposure would. There's no way around this, sorry, but it might not even be noticeable; this noise is usually fairly small relative to the meaningful photographic signal. It depends on the camera sensor quality (and ISO, but I imagine for this application you need that to be high).
(Bonus!) This exposure will actually be superior in one way: Areas that might have been saturated (pure white) in a 30s exposure will still retain definition in these far shorter exposures, so you're basically guaranteed not to lose your high end details. :-)
You can't always trust SENSOR_INFO_EXPOSURE_TIME_RANGE as of May 2017. Try manually increasing the time and see what happens. I know my Pixel will actually take a 1.9 sec shot but SENSOR_INFO_EXPOSURE_TIME_RANGE has a value in the sub second range.

Unity3D Fixed TimeStep - Android FPS issue

I don't understand why, but when I increase the Fixed Timestep in the Time settings in Unity3D, I get a bad frame issue on Android only.
On iOS I get better performance, but on Android the animation is very, very bad.
Can somebody tell me why increasing the Fixed Timestep causes an FPS issue on Android but not on iOS?
A fixed timestep of 60 (Hz) means Unity guarantees that the FixedUpdate method runs that many times per second, regardless of framerate. FixedUpdate can even run multiple times per frame.
However you can't force a CPU to do ever more per frame/second. Eventually this will affect the framerate because there isn't enough time to compute and render a frame in the time necessary.
For instance, to get a constant 60 frames per second each frame must be computed and rendered within a 0.01666 second time window. If the computation and rendering takes 0.017 seconds Unity is no longer rendering 60 fps. If vertical synch is enabled (as it is on mobile devices) a constant time per frame of just over 0.01666 means the framerate will be 30 fps (not 55 or something). So on mobile you're more likely to notice the effect of constantly being over the 0.01666 time per frame.
If there are enough FixedUpdate iterations running per second the app has more to compute and thus takes longer per frame. Eventually the FixedUpdate iterations per frame (plus the time it takes to render) no longer complete within 0.01666 seconds, that's what you see as a drop in framerate.
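For illustration only, the behaviour matches the standard fixed-timestep accumulator pattern; this is a sketch of that pattern, not Unity's actual code:

    public class FixedStepLoop {
        static final double FIXED_DELTA = 1.0 / 60.0;   // the "Fixed Timestep" setting, in seconds
        double accumulator = 0.0;

        // Called once per rendered frame with the real elapsed time since the last frame.
        void frame(double frameDeltaSeconds) {
            accumulator += frameDeltaSeconds;
            while (accumulator >= FIXED_DELTA) {        // FixedUpdate may run 0..N times per frame
                fixedUpdate();
                accumulator -= FIXED_DELTA;
            }
            render();
            // If frameDeltaSeconds keeps exceeding the per-frame budget, more fixed steps
            // pile up in each frame, the frame takes longer, and with vsync the rate
            // snaps down (60 -> 30 fps) rather than degrading gradually.
        }

        void fixedUpdate() { /* physics / game logic */ }
        void render()      { /* draw the frame */ }
    }
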

Android Activity Animations Timing

I want to put some animations on my activities when they open and close. But so far, if I put them on, it just makes it feel like the phone is lagging. And if I speed them up, you barely notice them.
What is the smallest time fragment which the average human can identify? 50 ms? 100 ms?
How can I make an animation noticeable, but not take away from the responsiveness of the application? Because obviously the animations themselves probably slow down the app a lot.
Maybe I'm asking a stupid question; if so, I'm sorry. But I think it is a fairly important aspect of designing a good app.
http://en.wikipedia.org/wiki/Frame_rate says:
"The human eye and its brain interface, the human visual system, can process 10 to 12 separate images per second, perceiving them individually."
The 1896 standard movie frame rate was 16 fps. Today it is at least 24 fps.
I also remember a rate of 18 events per second, though I'm not sure whether it was MS-DOS or old TV.
A car driver's reaction time is estimated to be about 0.2 sec (in the case where the driver is prepared to react). This means that if something happens faster than that, a human has no time to move, despite seeing it.
You can base your design on these figures.

OpenGL buffer swap refresh rate, how to calculate ideal

I am timing my OpenGL frame rates, with v-sync turned on, and notice my timings aren't the precise frequency as set by the monitor. That is, on my desktop I have a 60Hz refresh, but the FPS is stable at 59.88, whereas on my tablet it's also 60Hz but the FPS can be 61/62 FPS. I'm curious as to what precisely causes these slight deviations.
These are the ideas I've had so far:
Dropped Frames: This is the obvious answer: I'm just missing some frames. This is however not the cause as I can verify I am not dropping frames and the drop in FPS would be higher if this happened. I calculate the time over 120 frames, so if 1 frame was lost the FPS would drop below 59.5 on the desktop.
Inaccurate timings: I use clock_gettime to get my timings. On Linux I know this is accurate enough (as I previously did nanosecond based timings with it, but here we could even live with +/- several hundred microseconds). On Android however I'm not sure of the accuracy of this.
API Oddity: I use glXSwapBuffers on the desktop and eglSwapBuffers on Android. There could be an oddity here, but I don't see how this could so subtley affect the frame rate.
Approximate Hz: This is my biggest guess that the video cards/monitor aren't actually running at 60Hz. This is probably tied to the exact speed of the monitor, and the video card frequency. This seems like something that could be concretely determined, but I don't know which tools can be used to do this. (Update: My current video mode in Linux shows 59.93Hz, so closer, but still not there)
If the answer is indeed #4, this is perhaps not the best exchange site for the question. But in all cases my ultimate goal is to figure out programmatically what the ideal refresh rate actually is. So I'm hoping somebody can confirm/deny my ideas, and possibly point me in the right direction to getting the information I need.
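On the Android side, one rough way to probe idea #4 is to average Choreographer vsync timestamps over the same 120-frame window and compare against Display.getRefreshRate(); a sketch (class name illustrative, and note this measures vsync callbacks rather than the actual buffer swap):

    import android.util.Log;
    import android.view.Choreographer;

    public class RefreshRateProbe implements Choreographer.FrameCallback {
        private static final int SAMPLE_FRAMES = 120;
        private long firstVsyncNanos = -1;
        private int frames = 0;

        public void start() {
            Choreographer.getInstance().postFrameCallback(this);
        }

        @Override
        public void doFrame(long frameTimeNanos) {
            if (firstVsyncNanos < 0) {
                firstVsyncNanos = frameTimeNanos;
            } else if (++frames == SAMPLE_FRAMES) {
                double periodNs = (frameTimeNanos - firstVsyncNanos) / (double) SAMPLE_FRAMES;
                Log.d("RefreshRateProbe", "measured ~" + (1e9 / periodNs) + " Hz");
                return;                     // compare against Display.getRefreshRate()
            }
            Choreographer.getInstance().postFrameCallback(this);
        }
    }
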
