I'm new to Android development. I want to create a stopwatch with a precision of 0.01 seconds. Here is the part of my code which I think the problem lies within:
private void runTimer()
{
    final TextView timeView = (TextView) findViewById(R.id.time_view);
    final Handler handler = new Handler();
    handler.post(new Runnable() {
        @Override
        public void run() {
            int seconds = centiseconds / 100;
            int centisecs = centiseconds % 100;
            int hours = seconds / 3600;
            int minutes = (seconds % 3600) / 60;
            int secs = seconds % 60;
            String time = String.format("%d:%02d:%02d.%02d", hours, minutes, secs, centisecs);
            timeView.setText(time);
            if (isRunning) {
                centiseconds += 1;
            }
            handler.postDelayed(this, 10);
        }
    });
}
Since the postDelayed method's delay is in milliseconds, ten of them make one centisecond, so I'm incrementing my centiseconds variable every 10 ms. So far so good.
But when I test my app on my device, the seconds seem to tick slower than they should. Is it probable that the division and modulo operations cause so much lag that they hinder the increment and reduce the accuracy?
I've rewritten the app for 0.1-second (decisecond) precision (using centiseconds / 10, % 10, and postDelayed(..., 100)) and it seems to tick correctly.
P.S.
Is this the reason my 4.3 Jelly Bean stopwatch has 0.1-second accuracy?
What is the limit of precision in Android for such an app? (Timely's has 0.01 seconds, so I think it is at least 0.01 seconds.)
This is the wrong approach: you cannot rely on the system message queue as a source of precise ticks, because it does NOT guarantee any precision in delivery. postDelayed() queues your runnable to be delivered no sooner than now + delay, but precise delivery is not guaranteed, and additional delays can happen for many reasons, which in the longer run gives you a noticeable cumulative error in your measurements.
You can, however, still use postDelayed() to update your UI, but to know how much time has passed you should use the system clock methods, not your own counters.
You also should fire your runnable at least twice per unit of precision, i.e. if you want to update the timer display once per second, you should "tick" twice per second, so that if there is any delay in message queue handling your UI still gets at least one tick per second on time.
Related
I have two handlers running within an Android Service.
handler1 runs every 30 seconds
handler2 runs every 5 seconds
The problem is that handler2 can't run at the same time as handler1.
I mean, when handler2 reaches 30, 60, 90... seconds, it will run at the same time as handler1.
So I need to find a way, at those 30, 60, 90... second marks, to run one handler after the other.
I know one solution could be the following, but it's neither elegant nor accurate:
Run handler1 at second 0
Wait 7 seconds (or any other number of seconds that is not a multiple of 5)
Since handler2 runs more frequently than handler1, you can use a counter in handler2 to track when to fire the events that you would otherwise trigger in handler1.
handler2.postDelayed(new Runnable() {
    int count = 0;

    @Override
    public void run() {
        if (count == 0) {
            // fire handler1's 30-second events
        }
        // do handler2's 5-second work here
        count = (count + 1) % 6;
        handler2.postDelayed(this, 5000); // re-schedule; 5 seconds = 5000 ms
    }
}, 5000);
How about using TimerTask and Timer to schedule repeated executions (every 5 seconds)? Depending on the interval, you run either the job currently done in handler1, or the one in handler2, or both. Perhaps you can break out these jobs into functions that are called from the TimerTask's run method. That would let the handler1 job run before the handler2 job, in a synchronized manner on the same thread.
Is there a way, using the Android Chronometer class, to set the base of the chronometer to 15 minutes and have it count down from there to 0 seconds?
I have tried setBase(60000) but this doesn't work.
Check out this thread Android: chronometer as a persistent stopwatch. How to set starting time? What is Chronometer "Base"? as well as this thread Android - Get time of chronometer widget. Neither answers your question directly, but the nuggets there should lead you to an answer.
In general the chronometer works like this (if you would like to set the base to a specific number):
mChronometer.setBase(SystemClock.elapsedRealtime() - (nr_of_min * 60000 + nr_of_sec * 1000));
What you are asking can be done through a CountDownTimer (http://developer.android.com/reference/android/os/CountDownTimer.html).
Or create your own countdown using the chronometer, like this (more work needs to be done, because I just wrote this and haven't tested it yet):
private OnChronometerTickListener countUp = new OnChronometerTickListener() {
    @Override
    public void onChronometerTick(Chronometer chronometer) {
        long elapsedTime = (SystemClock.elapsedRealtime() - mChronometerCountUp.getBase()) / 60000;
        Log.v("counting up", String.valueOf(elapsedTime));
        // you will see the time counting up
        count_down--; // an int which counts down
        if (count_down == 0) {
            mChronometerCountUp.stop();
        }
        // This is not very accurate, since it piggybacks on the chronometer's
        // update callback. You might implement CountDownTimer instead, or use
        // two chronometers (one counting up and another counting down using
        // the elapsed time).
    }
};
http://developer.android.com/reference/android/widget/Chronometer.html
To set the base time you can use elapsedRealtime(), and you can format the output with setFormat().
I need a stopwatch, and I used http://www.goldb.org/stopwatchjava.html
It did not work well, so I tried writing out the value every 1000 ms:
stopWatch.start();
HandlerScrollBar.postDelayed(TtScroll, 1000);

private Runnable TtScroll = new Runnable() {
    public void run() {
        long time = stopWatch.getElapsedTime();
        HandlerScrollBar.postDelayed(TtScroll, (long) 1000);
        Log.d(TAG, Long.toString(time));
    }
};
I can see the value of time every second in LogCat, and this is the result:
The real time is off by at most +5 ms, but in the logged column it is off by at least +3 seconds! How is that possible? It is the same with
new Date().getTime().
Is there some StopWatch class that will pass this test as expected?
Thank you.
If you are measuring elapsed time and you want it to be correct, you must use System.nanoTime(). You cannot use System.currentTimeMillis(), unless you don't mind your result being wrong.
The purpose of nanoTime is to measure elapsed time, and the purpose of currentTimeMillis is to measure wall-clock time. You can't use one for the other purpose. The reason is that no computer's clock is perfect; it always drifts and occasionally needs to be corrected.
Since nanoTime's purpose is to measure elapsed time, it is unaffected by any of these small corrections. I would suggest picking nanoTime(), as it has better accuracy for these micro-measurements.
Use System.nanoTime() for extremely precise measurements of elapsed time. From its javadoc:
long startTime = System.nanoTime();
// ... the code being measured ...
long estimatedTime = System.nanoTime() - startTime;
Seems impossible. I've never had System.currentTimeMillis() act that way. Also, you're logging with Log.d() but the logcat you show indicates Log.e(). Are you sure that's the right logcat?
I made an appWidget which shows the current time of the server (2012-08-29 12:00:08, for example). I request the server time at a fixed interval (every hour, for example). When the server time is received, the appWidget display is updated. Between requests, I run a Handler to update the time like this:
mTicker = new Runnable() {
    public void run() {
        mMillis += 1000;
        long now = SystemClock.uptimeMillis();
        long next = now + (1000 - now % 1000);
        mHandler.postAtTime(mTicker, next);
    }
};
mTicker.run();
My questions:
1. After a long time (one day, say), the time displayed in the AppWidget lags behind the real server time.
I suspect that the method used above is not accurate enough to update the time.
Any suggestions about this problem?
You should not use SystemClock.uptimeMillis() because it does not include time spent in deep sleep; that's why your app widget is out of sync.
You should use SystemClock.elapsedRealtime() instead.
Upd: sorry, I think I misunderstood the problem here. What you are trying to do is use postAtTime to post a runnable at some time in the future. Please notice that postAtTime does not include time when the device is in deep sleep.
What you need is to track an accurate amount of delta time between redraws of your widget. You should use SystemClock.elapsedRealtime() for that.
Algorithm should be like this:
long serverTime = getServerTime();
long lastTime = SystemClock.elapsedRealtime();
// Somewhere in updateWidget() or call on timer:
serverTime = serverTime + SystemClock.elapsedRealtime() - lastTime;
lastTime = SystemClock.elapsedRealtime();
// At this moment the serverTime variable holds the "server" time adjusted
// by the time which passed on the device, including time spent in deep sleep
I have a C++ game running through JNI in Android. The frame rate varies from about 20-45fps due to scene complexity. Anything above 30fps is silly for the game; it's just burning battery. I'd like to limit the frame rate to 30 fps.
I could switch to RENDERMODE_WHEN_DIRTY, and use a Timer or ScheduledThreadPoolExecutor to requestRender(). But that adds a whole mess of extra moving parts that might or might not work consistently and correctly.
I tried injecting Thread.sleep() when things are running quickly, but this doesn't seem to work at all for small time values. And it may just be backing events up in the queue anyway, not actually pausing.
Is there a "capFramerate()" method hiding in the API? Any reliable way to do this?
The solution from Mark is almost good, but not entirely correct. The problem is that the swap itself takes a considerable amount of time (especially if the video driver is caching instructions). Therefore you have to take that into account, or you'll end up with a lower frame rate than desired.
So the thing should be:
somewhere at the start (like the constructor):
startTime = System.currentTimeMillis();
then in the render loop:
public void onDrawFrame(GL10 gl)
{
    endTime = System.currentTimeMillis();
    dt = endTime - startTime;
    if (dt < 33) {
        try {
            Thread.sleep(33 - dt);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
    startTime = System.currentTimeMillis();
    UpdateGame(dt);
    RenderGame(gl);
}
This way you will take into account the time it takes to swap the buffers and the time to draw the frame.
When using GLSurfaceView, you perform the drawing in your Renderer's onDrawFrame which is handled in a separate thread by the GLSurfaceView. Simply make sure that each call to onDrawFrame takes (1000/[frames]) milliseconds, in your case something like 33ms.
To do this: (in your onDrawFrame)
Measure the current time before your start drawing using System.currentTimeMillis (Let's call it startTime)
Perform the drawing
Measure time again (Let's call it endTime)
deltaT = endTime - startTime
if deltaT < 33, sleep (33-deltaT)
That's it.
Fili's answer looked great to me, but sadly it limited the FPS on my Android device to 25 FPS, even though I requested 30. I figured out that Thread.sleep() is not accurate enough and sleeps longer than it should.
I found this implementation from the LWJGL project to do the job:
https://github.com/LWJGL/lwjgl/blob/master/src/java/org/lwjgl/opengl/Sync.java
Fili's solution is failing for some people, so I suspect it's sleeping until immediately after the next vsync instead of immediately before. I also feel that moving the sleep to the end of the function would give better results, because there it can pad out the current frame before the next vsync, instead of trying to compensate for the previous one. Thread.sleep() is inaccurate, but fortunately we only need it to be accurate to the nearest vsync period of 1/60 s. The LWJGL code tyrondis posted a link to seems over-complicated for this situation; it's probably designed for when vsync is disabled or unavailable, which should not be the case in the context of this question.
I would try something like this:
private long lastTick = System.currentTimeMillis();

public void onDrawFrame(GL10 gl)
{
    UpdateGame(dt);
    RenderGame(gl);
    // Subtract 10 from the desired period of 33ms to make generous
    // allowance for overhead and inaccuracy; vsync will take up the slack
    long nextTick = lastTick + 23;
    long now;
    while ((now = System.currentTimeMillis()) < nextTick) {
        try {
            Thread.sleep(nextTick - now);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
    lastTick = now;
}
If you don't want to rely on Thread.sleep(), use the following:
double frameStartTime = (double) System.nanoTime() / 1000000;
// start time in milliseconds
// using System.currentTimeMillis() is a bad idea
// call this when you first start to draw

int frameRate = 30;
double frameInterval = (double) 1000 / frameRate;
// 1 s is 1000 ms
// 30 frames per second means one frame is 1s/30 = 1000ms/30

public void onDrawFrame(GL10 gl)
{
    double endTime = (double) System.nanoTime() / 1000000;
    double elapsedTime = endTime - frameStartTime;
    if (elapsedTime >= frameInterval)
    {
        // call GLES20.glClear(...) here
        UpdateGame(elapsedTime);
        RenderGame(gl);
        frameStartTime += frameInterval;
    }
}
You may also try reducing the thread priority from onSurfaceCreated():
Process.setThreadPriority(Process.THREAD_PRIORITY_LESS_FAVORABLE);