Measure elapsed time between code segments in Android

Hi everyone. I'm a newbie to Android, so please forgive me for asking a question that may be a bit elementary.
I need to measure the running time (in nanoseconds) of a code segment, and I started with System.nanoTime(), something like this:
long startTime = System.nanoTime();
// some code segment
long endTime = System.nanoTime();
long elapsedTime = startTime - endTime;
But I've read online that sometimes elapsedTime comes out negative! From what I found, the problem is attributed to the CPU cores in Android devices: the cores run at different frequencies, and if startTime and endTime are read on different cores, the computed difference is not trustworthy.
I found elapsedRealtime() and elapsedRealtimeNanos() in the Android documentation and they seem to meet my needs, but I don't know whether they suffer from the same problem as System.nanoTime(), and I couldn't find an answer on Google, so I would like to ask whether these two methods can meet my needs.

It must be endTime - startTime to get a positive number.
long elapsedTime = endTime - startTime;
Or you can use SystemClock.elapsedRealtime(). (Note that it counts from boot, so it resets if the device is rebooted.)
val startTime = SystemClock.elapsedRealtime()
// your code segment
val elapsedTime = SystemClock.elapsedRealtime() - startTime
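If you need the nanosecond resolution the question asks about, SystemClock.elapsedRealtimeNanos() (API 17+) is the monotonic counterpart; a minimal Java sketch of the same pattern:
long startNanos = SystemClock.elapsedRealtimeNanos();
// ... the code segment being measured ...
long elapsedNanos = SystemClock.elapsedRealtimeNanos() - startNanos;
Both elapsedRealtime() and elapsedRealtimeNanos() count time since boot (including deep sleep), so as long as you subtract the start from the end, the result cannot be negative.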

Related

Generate TimeStamp In Android Application Project

I am working on a project that requires the actions of a user to be time-logged and represented as, for example: 1 min ago, ... 3 hrs ago ... 5 days ago. I am new to this and don't know how to proceed. Keep in mind the project is NOT REST based. How do I implement this?
Get the time in milliseconds when the user action happens, save it somewhere, and then take the difference between that saved time and the current time to find out how long ago the action happened.
You can get the time in milliseconds using:
Calendar calendar = Calendar.getInstance();
calendar.getTimeInMillis();
You can get the actual timestamp with:
long timestamp = System.currentTimeMillis();
Regarding your question you can do something like:
long eventTimestamp = System.currentTimeMillis();
..
// some other stuff happens
..
//get the passed time
long actualTimestamp = System.currentTimeMillis();
long timestampDifference = actualTimestamp - eventTimestamp;
long passedSeconds = timestampDifference / 1000; // the passed time in seconds
long passedMinutes = passedSeconds / 60;         // the passed time in minutes
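If you then want to render that difference as "1 min ago / 3 hrs ago / 5 days ago" by hand, a minimal sketch could look like this (the helper name formatRelative is just for illustration):
static String formatRelative(long diffMillis) {
    long seconds = diffMillis / 1000;
    if (seconds < 60) return seconds + " sec ago";
    long minutes = seconds / 60;
    if (minutes < 60) return minutes + " min ago";
    long hours = minutes / 60;
    if (hours < 24) return hours + " hrs ago";
    return (hours / 24) + " days ago";
}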
Since you are on Android you could try the helper class DateUtils built into Android platform. Something similar to this untested code:
String relativeTime =
    DateUtils.getRelativeTimeSpanString(
        jud.getTime(),               // jud is the java.util.Date of the event
        System.currentTimeMillis(),
        DateUtils.MINUTE_IN_MILLIS,
        DateUtils.FORMAT_ABBREV_RELATIVE);
If you don't like the API style, don't find enough features, or see other problems such as missing time-zone awareness, you can also download and try one of two external libraries:
PrettyTime (slim classic library for relative times)
// your possible input
Date jud = new Date(System.currentTimeMillis() - 3600000);
org.ocpsoft.prettytime.PrettyTime pt =
    new org.ocpsoft.prettytime.PrettyTime(Locale.ENGLISH);
String relativeTime = pt.format(jud);
System.out.println(relativeTime); // output: 1 hour ago
or my library Time4A (bigger but also more features and languages):
// your possible input
Date jud = new Date(System.currentTimeMillis() - 3600000);
Moment moment = TemporalType.JAVA_UTIL_DATE.translate(jud);
String relativeTime =
    net.time4j.PrettyTime.of(Locale.US)
        .withShortStyle()
        .printRelativeInStdTimezone(moment);
System.out.println(relativeTime); // output: 1 hr. ago
Other libraries don't support printing of relative times well, if at all.

SensorEvent.timestamp inconsistency

My application performs step counting in the background using the step detector sensor APIs introduced in Android 4.4.x.
It's essential for my app to know the exact time (with at least one-second accuracy) at which each step event occurred.
Because I perform sensor batching, the time at which onSensorChanged(SensorEvent event) is called is not the time the step event took place; I must use the event.timestamp field to get the event time.
The documentation for this field says:
The time in nanosecond at which the event happened
The problem:
On some devices (such as the Moto X 2013) this timestamp appears to be nanoseconds since boot, while on other devices (such as the Nexus 5) it actually returns wall-clock time in nanoseconds, i.e. System.currentTimeMillis() × 1,000,000.
I understand there's already an old open issue about this, but since sensor batching was introduced it has become important to use this field to know the event time, and it's no longer possible to rely on System.currentTimeMillis().
My question:
What should I do to always get the event time in system milliseconds, consistently across all devices?
Instead of your "2-day" comparison, you could just check if event.timestamp is less than e.g. 1262304000000000000 - that way you'd only have a problem if the user's clock is set in the past, or their phone has been running for 40 years...
Except that a comment on this issue indicates that sometimes it's even milliseconds instead of nanoseconds. And other comments indicate that there's an offset applied, in which case it won't be either system time or uptime-based.
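For reference, the threshold check suggested above might look like this in Java (the constant is roughly 2010-01-01 in epoch nanoseconds; as noted, it only distinguishes epoch-based from boot-based nanosecond timestamps and does not handle the milliseconds case):
// ~2010-01-01T00:00:00Z expressed as nanoseconds since the epoch
static final long EPOCH_2010_NS = 1262304000000000000L;

static boolean looksLikeEpochNanos(long sensorTimestamp) {
    return sensorTimestamp >= EPOCH_2010_NS;
}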
If you really have to be accurate, the only way I can see is to initially capture an event (or two, for comparison) with max_report_latency_ns set to 0 (i.e. non-batched) and compare the timestamp to the system time and/or elapsedRealtime. Then use that comparison to calculate an offset (and potentially decide whether you need to compensate for milliseconds vs nanoseconds) and use that offset for your batched events.
E.g. grab a couple of events, preferably a couple of seconds apart, recording the System.currentTimeMillis() each time and then do something like this:
long timestampDelta = event2.timestamp - event1.timestamp;
long sysTimeDelta = sysTimeMillis2 - sysTimeMillis1;
long divisor; // to get from timestamp to milliseconds
long offset;  // to get from event milliseconds to system milliseconds
if (timestampDelta / sysTimeDelta > 1000) { // in reality ~1 vs ~1,000,000
    // timestamps are in nanoseconds
    divisor = 1000000;
} else {
    // timestamps are in milliseconds
    divisor = 1;
}
offset = sysTimeMillis1 - (event1.timestamp / divisor);
And then for your batched events
long eventTimeMillis = (event.timestamp / divisor) + offset;
One final caveat - even if you do all that, if the system time changes during your capture, it may affect your timestamps. Good luck!
I found a workaround that solves the problem. It assumes that the timestamp can only be one of two things: the system (epoch) timestamp, or time since boot:
protected long getEventTimestampInMillis(SensorEvent event) {
    long timestamp = event.timestamp / 1000 / 1000;
    /*
     * Work around the problem that on some devices event.timestamp
     * actually returns nanoseconds since the last boot.
     */
    if (System.currentTimeMillis() - timestamp > Consts.ONE_DAY * 2) {
        /*
         * If the original event timestamp yields a value that makes no
         * sense (it is very unlikely that events would be batched for two
         * whole days), assume the event time is actually nanoseconds
         * since boot.
         */
        timestamp = System.currentTimeMillis()
                + (event.timestamp - System.nanoTime()) / 1000000L;
    }
    return timestamp;
}
According to the link in your question:
This is, in fact, "working as intended". The timestamps are not
defined as being the Unix time; they're just "a time" that's only
valid for a given sensor. This means that timestamps can only be
compared if they come from the same sensor.
So the timestamp field could be completely unrelated to the current system time.
However, if at startup you take two sensor samples without batching, you can calculate the offset between System.currentTimeMillis() and the timestamp, as well as the ratio between the respective deltas; with those you should be able to convert between the two time bases:
//receive event1:
long t1Sys = System.currentTimeMillis();
long t1Evt = event.timestamp;
//receive event2:
long t2Sys = System.currentTimeMillis();
long t2Evt = event.timestamp;
//unregister sensor
// use floating point, otherwise integer division truncates the rate to 0
double rateoffset = (double) (t2Sys - t1Sys) / (t2Evt - t1Evt);
// not exact (delivery latency), but should be off by less than a second;
// possibly use an averaged value
long startoffset = (long) (t1Sys - t1Evt * rateoffset);
Now any timestamp from that sensor can be converted:
long sensorTimeMillis = (long) (event.timestamp * rateoffset) + startoffset;
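A sketch of how the two-sample calibration might be captured in practice (sensorManager and the listener registration are assumed to exist; names are illustrative):
private long t1Sys, t1Evt, t2Sys, t2Evt;
private int samplesSeen = 0;

@Override
public void onSensorChanged(SensorEvent event) {
    if (samplesSeen == 0) {
        t1Sys = System.currentTimeMillis();
        t1Evt = event.timestamp;
    } else if (samplesSeen == 1) {
        t2Sys = System.currentTimeMillis();
        t2Evt = event.timestamp;
        sensorManager.unregisterListener(this); // calibration done
        // now compute rateoffset and startoffset as above
    }
    samplesSeen++;
}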

System.currentTimeMillis() works wrong

I need a stopwatch, so I used http://www.goldb.org/stopwatchjava.html
It did not work well, so I tried writing out the value every 1000 ms:
stopWatch.start();
HandlerScrollBar.postDelayed(TtScroll, 1000);

private Runnable TtScroll = new Runnable() {
    public void run() {
        long time = stopWatch.getElapsedTime();
        HandlerScrollBar.postDelayed(TtScroll, (long) 1000);
        Log.d(TAG, Long.toString(time));
    }
};
I can see the value of time every second in LogCat, and this is the result:
The real time is off by at most +5 ms, but the value in the log column is off by at least +3 seconds! How is that possible? It is the same with
new Date().getTime().
Is there some StopWatch class which will pass this test as expected?
Thank you.
If you are measuring elapsed time, and you want it to be correct, you must use System.nanoTime(). You cannot use System.currentTimeMillis(), unless you don't mind your result being wrong.
The purpose of nanoTime is to measure elapsed time, and the purpose of currentTimeMillis is to measure wall-clock time. You can't use the one for the other purpose. The reason is that no computer's clock is perfect; it always drifts and occasionally needs to be corrected.
Since nanoTime's purpose is to measure elapsed time, it is unaffected by any of those small corrections. I would suggest picking nanoTime(), as it has the better accuracy for such micro-measurements and is intended for extremely precise measurement of elapsed time. From its javadoc:
long startTime = System.nanoTime();
// ... the code being measured ...
long estimatedTime = System.nanoTime() - startTime;
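If you want the result in milliseconds, convert the nanosecond value explicitly instead of mixing clocks (TimeUnit is java.util.concurrent.TimeUnit):
long elapsedMillis = TimeUnit.NANOSECONDS.toMillis(estimatedTime);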
Seems impossible; I've never seen System.currentTimeMillis() act that way. Also, you're logging with Log.d(), but the logcat you show indicates Log.e(). Are you sure that's the right logcat?

How to make Android custom clock time accurate?

I made an appWidget which shows the current server time (2012-08-29 12:00:08, for example). I request the server time at a fixed interval (1 hour, for example), and when it arrives I update the appWidget display. In between, I run a Handler to update the time like this:
mTicker = new Runnable() {
    public void run() {
        mMillis += 1000;
        long now = SystemClock.uptimeMillis();
        long next = now + (1000 - now % 1000);
        mHandler.postAtTime(mTicker, next);
    }
};
mTicker.run();
My question:
After a long time (one day, for example), the time displayed in the appWidget lags behind the real server time.
I suspect that the update method above is not accurate enough.
Any suggestions about this problem?
You should not use SystemClock.uptimeMillis() because it does not include time spent in deep sleep; that's why your appWidget gets out of sync.
You should use SystemClock.elapsedRealtime() instead.
Update: sorry, I think I misunderstood the problem here. What you are trying to do is use postAtTime to post a runnable at some time in the future. Please note that postAtTime does not count time the device spends in deep sleep.
What you need is to track an accurate amount of elapsed time between redraws of your widget. You should use SystemClock.elapsedRealtime() for that.
Algorithm should be like this:
long serverTime = getServerTime();
long lastTime = SystemClock.elapsedRealtime();

// Somewhere in updateWidget() or on a timer:
serverTime = serverTime + SystemClock.elapsedRealtime() - lastTime;
lastTime = SystemClock.elapsedRealtime();
// At this moment the serverTime variable holds the "server" time adjusted by
// the time which passed on the device, including time spent in deep sleep.
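Putting that together, here is a hedged sketch of a ticking Runnable that re-derives the displayed time from SystemClock.elapsedRealtime() on every tick, so scheduling jitter cannot accumulate (mHandler, updateWidget() and the sync fields are assumed to exist as in the snippets above):
private long serverTimeAtSync; // from getServerTime()
private long elapsedAtSync;    // SystemClock.elapsedRealtime() taken at sync

private final Runnable mTicker = new Runnable() {
    public void run() {
        // current server time = server time at sync + device time since sync
        long now = serverTimeAtSync + (SystemClock.elapsedRealtime() - elapsedAtSync);
        updateWidget(now);                // hypothetical display update
        mHandler.postDelayed(this, 1000); // drift does not accumulate, because 'now'
                                          // is recomputed from elapsedRealtime() each tick
    }
};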

How to limit framerate when using Android's GLSurfaceView.RENDERMODE_CONTINUOUSLY?

I have a C++ game running through JNI in Android. The frame rate varies from about 20-45fps due to scene complexity. Anything above 30fps is silly for the game; it's just burning battery. I'd like to limit the frame rate to 30 fps.
I could switch to RENDERMODE_WHEN_DIRTY, and use a Timer or ScheduledThreadPoolExecutor to requestRender(). But that adds a whole mess of extra moving parts that might or might not work consistently and correctly.
I tried injecting Thread.sleep() when things are running quickly, but it doesn't seem to work at all for small time values. And it may just be backing events up in the queue anyway, not actually pausing.
Is there a "capFramerate()" method hiding in the API? Any reliable way to do this?
The solution from Mark is almost right, but not entirely correct. The problem is that the swap itself takes a considerable amount of time (especially if the video driver is caching instructions), so you have to take that into account or you'll end up with a lower frame rate than desired.
So the thing should be:
somewhere at the start (like the constructor):
startTime = System.currentTimeMillis();
then in the render loop:
public void onDrawFrame(GL10 gl)
{
    endTime = System.currentTimeMillis();
    dt = endTime - startTime;
    if (dt < 33) {
        try {
            Thread.sleep(33 - dt); // Thread.sleep, not Thread.Sleep; it may be interrupted
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
    startTime = System.currentTimeMillis();
    UpdateGame(dt);
    RenderGame(gl);
}
This way you will take into account the time it takes to swap the buffers and the time to draw the frame.
When using GLSurfaceView, you perform the drawing in your Renderer's onDrawFrame, which is handled in a separate thread by the GLSurfaceView. Simply make sure that each call to onDrawFrame takes (1000 / fps) milliseconds; in your case something like 33 ms.
To do this (in your onDrawFrame):
Measure the current time before you start drawing using System.currentTimeMillis() (let's call it startTime)
Perform the drawing
Measure the time again (let's call it endTime)
deltaT = endTime - startTime
if deltaT < 33, sleep (33 - deltaT)
That's it.
Fili's answer looked great to me, but sadly it limited the FPS on my Android device to 25 FPS, even though I requested 30. I figured out that Thread.sleep() is not accurate enough and sleeps longer than it should.
I found this implementation from the LWJGL project to do the job:
https://github.com/LWJGL/lwjgl/blob/master/src/java/org/lwjgl/opengl/Sync.java
Fili's solution is failing for some people, so I suspect it's sleeping until immediately after the next vsync instead of immediately before. I also feel that moving the sleep to the end of the function would give better results, because there it can pad out the current frame before the next vsync, instead of trying to compensate for the previous one. Thread.sleep() is inaccurate, but fortunately we only need it to be accurate to the nearest vsync period of 1/60 s. The LWJGL code tyrondis posted a link to seems over-complicated for this situation; it's probably designed for when vsync is disabled or unavailable, which should not be the case here.
I would try something like this:
private long lastTick = System.currentTimeMillis();

public void onDrawFrame(GL10 gl)
{
    UpdateGame(dt);
    RenderGame(gl);
    // Subtract 10 from the desired period of 33ms to make generous
    // allowance for overhead and inaccuracy; vsync will take up the slack
    long nextTick = lastTick + 23;
    long now;
    while ((now = System.currentTimeMillis()) < nextTick) {
        try {
            Thread.sleep(nextTick - now);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
    lastTick = now;
}
If you don't want to rely on Thread.sleep(), use the following:
double frameStartTime = (double) System.nanoTime() / 1000000;
// start time in milliseconds
// (using System.currentTimeMillis() is a bad idea here)
// initialize this when you first start to draw

int frameRate = 30;
double frameInterval = (double) 1000 / frameRate;
// 1 s is 1000 ms; 30 frames per second means one frame is 1s/30 = 1000ms/30

public void onDrawFrame(GL10 gl)
{
    double endTime = (double) System.nanoTime() / 1000000;
    double elapsedTime = endTime - frameStartTime;
    if (elapsedTime >= frameInterval)
    {
        // call GLES20.glClear(...) here
        UpdateGame(elapsedTime);
        RenderGame(gl);
        frameStartTime += frameInterval;
    }
}
You may also try reducing the thread priority from onSurfaceCreated():
Process.setThreadPriority(Process.THREAD_PRIORITY_LESS_FAVORABLE);
