I am trying to log the startup latency of my app. The way I am doing it is by setting the start time of the app in Application.onCreate and providing a public method that returns that time.
public class MyApplication extends Application {
    private Date startUpTime;
    // Declare variables

    @Override
    public void onCreate() {
        super.onCreate();
        setStartUpTime();
        // other initializations
    }

    private void setStartUpTime() {
        startUpTime = new Date();
    }

    public Date getStartUpTime() {
        return startUpTime;
    }
}
public class MyActivity extends Activity {
    // ...

    @Override
    public void onStart() {
        super.onStart();
        logStartUpLatency();
        // other onStart stuff
    }

    private void logStartUpLatency() {
        Date currentTime = new Date();
        Date startTime = ((MyApplication) getApplicationContext()).getStartUpTime();
        long latency = currentTime.getTime() - startTime.getTime();
        Log.d("Start up Latency is ", Long.toString(latency));
    }
}
This is how I am testing my startup latency:
adb install myapk
Run the app to get the first startup latency. I can see the latency logged is correct for the first start.
Run the app again to test the startup latency. The latency logged is correct for this start (and any number of subsequent starts).
Now I increase my app's version code and version name by 1. To simulate an upgrade, I use the command adb install -r myapk.
Now I run the app again to test the first-start latency after the upgrade. Even though it only takes about 3 seconds, the latency logged is off the charts.
Does anyone know why that might happen?
Update
So if I install the APK using "adb install -r myapk", the app doesn't go through MyApplication.onCreate().
I suggest using the TimingLogger class. As per the documentation, you can easily track the elapsed time and even add splits along the way. Note that TimingLogger only writes its output when the tag is loggable at the VERBOSE level (for example, enabled with adb shell setprop log.tag.TAG VERBOSE).
This
TimingLogger timings = new TimingLogger(TAG, "methodA");
// ... do some work A ...
timings.addSplit("work A");
// ... do some work B ...
timings.addSplit("work B");
// ... do some work C ...
timings.addSplit("work C");
timings.dumpToLog();
produces
D/TAG (3459): methodA: begin
D/TAG (3459): methodA: 9 ms, work A
D/TAG (3459): methodA: 1 ms, work B
D/TAG (3459): methodA: 6 ms, work C
D/TAG (3459): methodA: end, 16 ms
The latency you are calculating is in milliseconds. Date#getTime() returns the number of milliseconds since Jan. 1, 1970, midnight GMT.
The observed time of about 3 seconds is due to the overhead of uninstalling the old build and installing the new one.
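As an aside, if you want the measurement to be immune to wall-clock changes, here is a minimal sketch of the same idea using the monotonic SystemClock.elapsedRealtime() clock (the class and method names just mirror the question and are only illustrative):
public class MyApplication extends Application {
    private long startUpElapsedMs;

    @Override
    public void onCreate() {
        super.onCreate();
        // Monotonic clock: unaffected by the user or the network changing the date
        startUpElapsedMs = SystemClock.elapsedRealtime();
    }

    public long getStartUpElapsedMs() {
        return startUpElapsedMs;
    }
}

// In the Activity:
long latencyMs = SystemClock.elapsedRealtime()
        - ((MyApplication) getApplicationContext()).getStartUpElapsedMs();
Log.d("StartUpLatency", latencyMs + " ms");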
So if I install the APK using "adb install -r myapk", the app doesn't go through MyApplication.onCreate(). That answers this question. I will ask a separate question about why installing an application with "adb install -r myapk" and then starting it doesn't go through MyApplication.onCreate().
Related
Let's say I'm developing a game and there is such a thing as a respawn: the user may respawn after 15 minutes. Is there any common practice to avoid time cheating? I mean, nothing stops the user from changing the system time and setting it to the future. I know this can partially be resolved server-side, but nothing stops the user from disabling the network entirely.
PS: the game is cross-platform, so the solution is of interest for both Android and iOS. PS2: I know a couple of games that have solved this.
For something like this you could simply start your own timer, completely separate from the system time. If you start a 15-minute countdown when a player dies, they won't be able to modify your internal timer. I'm not as familiar with iOS development (NSTimer looks like a possibility), but I know in Android it's as easy as:
// create 30 second internal countdown timer
new CountDownTimer(30000, 3000) {
public void onTick(long millisUntilFinished) {
// store time remaining in database every 3 seconds in case user exits the game
dbHelper.updateDeathTimer(millisUntilFinished);
}
public void onFinish() {
// player respawns now
}
}.start();
To combat the issue of players closing the game, I would suggest you also set up your internal timer to cache its current state in the database at a regular interval, say every 30 seconds or so if you were going to stick with a 15-minute timer.
Below is some pseudocode for what it might look like when a player exits the game while they are dead and the respawn timer is still in progress.
// When game resumes check database and begin updated death timer if necessary
onGameResume() {
if(dbHelper.isUserDead()) {
// resume respawn timer
new CountDownTimer(dbHelper.getRespawnTime(), 15000) {
public void onTick(long millisUntilFinished) {
// store time remaining in database every 15 seconds in case user exits the game
dbHelper.updateDeathTimer(millisUntilFinished);
}
public void onFinish() {
// player respawns now
}
}.start();
}
}
You can store the time of the event (spawn in your case), and then run a timeIntervalSinceReferenceDate against it and see if it exceeds your limit (15 minutes).
The risk on the iOS side is that the app goes to the background and the timers stop.
It's not actually the answer, but just an idea.
What if we use the tick count? Each operating system has this property.
For instance, android.os.SystemClock.elapsedRealtime() in Android and
[NSProcessInfo processInfo].systemUptime in iOS.
They should give the required values.
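For the Android side, a minimal sketch of that idea (the field and method names here are illustrative, not from the question):
// Record the boot-relative time of death and derive the remaining respawn
// time from elapsedRealtime(), which the user cannot move by changing the
// system clock. Caveat: elapsedRealtime() restarts at zero on reboot, so
// you would still want to persist and sanity-check the stored value.
private static final long RESPAWN_MS = 15 * 60 * 1000;
private long deathAtElapsedMs = -1;

void onPlayerDied() {
    deathAtElapsedMs = android.os.SystemClock.elapsedRealtime();
    // persist deathAtElapsedMs here in case the game process is killed
}

boolean canRespawn() {
    if (deathAtElapsedMs < 0) return true; // never died
    long waited = android.os.SystemClock.elapsedRealtime() - deathAtElapsedMs;
    return waited >= RESPAWN_MS;
}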
I'm currently facing a problem.
I created a game where the user races through a level, and there's a stopwatch to count the time in which the user finishes the level.
I use this code snippet to convert the counter to a stopwatch:
void update(float dt)
{
    if (!paused)
    {
        fcheckbutton += dt;
        ftimer += dt;
        if (ftimer >= 0.01f)
        {
            this->timer();
            ftimer = 0.0f;
        }
    }
}

void timer()
{
    m_timer++;
    int milliseconds = m_timer % 60;
    int seconds = (m_timer / 60) % 60;
    int minutes = m_timer / 3600;
    CCString * P1Time = CCString::createWithFormat("%02d:%02d:%02d", minutes, seconds, milliseconds);
    m_label->setString(P1Time->getCString());
}
I called this function in my update method...
The problem is that on every device (iOS and Android) the stopwatch produces different results.
In my test case, if the user doesn't make any input, he should lose in about 32 secs.
iOS (iPhone 4) is the closest one, with the stopwatch at ~32 secs.
But on Android devices, the results vary:
Google Nexus S: 28 secs
New Google Nexus 7: 18 secs
Galaxy Note: 30 secs
It's important that the stopwatch stays in sync on every device, because I have a leaderboard based on the user's stopwatch result.
How am I supposed to do this?
EDIT: updated to show how the timer method is called
Where do you update your m_timer variable? I think the problem is that some devices have a lower FPS, so a counter that can only advance once per update() call falls behind whenever a frame takes longer than your 0.01 s tick.
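If that is the cause, one common fix is to derive the display from the accumulated elapsed time instead of counting ticks, so a low frame rate only lowers the display refresh rate rather than slowing the measured time. A rough sketch of the idea in Java (the question's code is cocos2d-x/C++, so this is only illustrative; 'paused' and 'label' stand in for the question's fields):
// Accumulate dt itself and format it, instead of incrementing a counter
// once per frame; slow frames then cost display smoothness, not time.
private float elapsedSeconds = 0f;

void update(float dt) {
    if (!paused) {
        elapsedSeconds += dt;
        int totalCentis = (int) (elapsedSeconds * 100);
        int centis  = totalCentis % 100;
        int seconds = (totalCentis / 100) % 60;
        int minutes = totalCentis / 6000;
        label.setText(String.format("%02d:%02d:%02d", minutes, seconds, centis));
    }
}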
1) Exclusive time is the time spent in the method.
2) Inclusive time is the time spent in the method plus the time spent in any called functions.
3) We refer to calling methods as "parents" and called methods as "children."
Reference: the DDMS documentation.
My question is:
What is the difference between
Incl CPU Time and Incl Real CPU Time?
Excl CPU Time and Excl Real CPU Time?
In one of my example trace files,
for Method1(): Incl CPU Time = 242 msec and Incl Real CPU Time = 5012 msec.
I cannot identify the reason behind the 5012 - 242 = 4770 msec gap between the two times.
Please help me if you have any idea.
Here's the DDMS documentation
Incl CPU time is the inclusive cpu time. It is the sum of the time spent in the function itself, as well as the sum of the times of all functions that it calls.
Excl CPU time is the exclusive cpu time. It is only the time spent in the function itself. You'll notice that it is always the same as the "incl time" of the "self" child.
The documentation doesn't clarify the difference between CPU time and real time, but I agree with Neetesh that CPU time is the time that the function is actually running (this would not include waiting on IO) and the real time is the wall clock time (which would include time spent doing IO).
CPU time is the time for which the process actually uses the CPU. Real CPU time is the total time from the start of the process to its end; it includes the time the process spends waiting to execute.
From the source code of the .trace file, you can see that the CPU time details differ from the real CPU time; this matches the description in the Android docs:
CPU time considers only the time that the thread is actively using CPU time, and real time provides absolute timing information from the moment your app enters a method to when it exits that method—regardless of whether the thread is active or sleeping.
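To see the difference concretely, here is a small plain-JVM sketch (java.lang.management is not part of the Android SDK, so this only illustrates CPU time vs. wall-clock time; it is not Traceview output):
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class CpuVsRealTime {
    public static void main(String[] args) throws InterruptedException {
        ThreadMXBean bean = ManagementFactory.getThreadMXBean();
        long cpuStart  = bean.getCurrentThreadCpuTime(); // ns of CPU actually used
        long realStart = System.nanoTime();              // wall-clock ns

        long x = 0;
        for (int i = 0; i < 50_000_000; i++) x += i;     // busy work: costs CPU and real time

        Thread.sleep(2000);                              // sleeping: costs real time, almost no CPU time

        long cpuMs  = (bean.getCurrentThreadCpuTime() - cpuStart) / 1_000_000;
        long realMs = (System.nanoTime() - realStart) / 1_000_000;
        System.out.println("checksum:  " + x);
        System.out.println("CPU time:  " + cpuMs + " ms");   // roughly just the busy loop
        System.out.println("Real time: " + realMs + " ms");  // busy loop + ~2000 ms of sleep
    }
}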
Just as Chris and David said, I did a test.
#include <unistd.h>

#define S ((long long)1000 * 1000 * 1000)

// My CPU frequency is 3 GHz
void run() {
    for (int i = 0; i < S; ++i);
}

void g() {
    run();
    run();
    run();
    for (int i = 0; i < S; ++i);
}

int main() {
    g();
    // run();
    return 0;
}
As you can see, the inclusive time of function g is 8 s and its exclusive time is 2 s.
I have a problem with this code used for Android (Java)
handler.postDelayed(new Runnable(){
public void run(){
// Your code goes here...
}
}, 500);
If the delay is about 500 ms, the program seems to repeat the task every 0.5 s, but if I change it to 100 ms or less it no longer keeps up. I tested changing the brightness: for a while it can repeat the brightness change at that rate, but then it slows down and comes back to the normal flash rate again. It seems unstable. Do you have any code that gives an exact delay regardless of the load on the phone's CPU?
Many thanks
Not from Java, no; stock Java isn't a real-time system.
Timing precision is subject to the whims of the JVM and the OS's scheduler. You may be able to get incrementally more precise, but there's no guarantee of the kind of precision you're looking for.
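For example, one way to at least keep small delays from accumulating is to schedule each run at an absolute time computed from a fixed start; a sketch (assuming a Handler on the current looper thread is acceptable; still not a real-time guarantee):
final Handler handler = new Handler();
final long intervalMs = 100;
final long startMs = SystemClock.uptimeMillis();

handler.post(new Runnable() {
    private int tick = 0;

    @Override
    public void run() {
        // ... your periodic work here ...
        tick++;
        // postAtTime() uses the SystemClock.uptimeMillis() time base, so each
        // run is anchored to the original start time rather than to "now".
        handler.postAtTime(this, startMs + tick * intervalMs);
    }
});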
You might be able to do something more precise with a CountDownTimer, which has a periodic tick. Essentially you set it to count down for a period (which can be hours if need be), and there are two methods: one is called on each tick, and the other at the end of the timer, at which point you could start another one.

You could set the tick to be very fast and then only kick off your code at the delay point by checking the actual time difference in the tick. Inside the tick you would issue a signal once the right amount of time had actually passed; that signal would either kick off the thread or release something the already-running thread was waiting on. The value of the CountDownTimer is that you can poll at a high frequency and check the elapsed time very often. Although it's not guaranteed, this could lead to smooth performance not unlike a real-time system, and it's more likely to be accurate because it is just issuing a signal rather than spinning up a thread only to issue the signal.

You might also try an IntentService to perform the tasks and just call startService(intentToIntentService) for each call; IntentService queues the work up, so see if the threading works better inside such a service.
Date startDate = new Date();
final long startTime = startDate.getTime();

// Tick called every 10th of a second; onFinish called when the countdown ends.
CountDownTimer ctDownTimer = new CountDownTimer(30000, 100) {
    long startIntervalTime = startTime;

    public void onTick(long millisUntilFinished) {
        Date now = new Date();
        long nowTime = now.getTime();
        if ((nowTime - startIntervalTime) > 100) {
            issueSignal();
            startIntervalTime = nowTime;
        }
        now = null;
    }

    public void onFinish() {
        Log.d("MyClass", "Done"); // Maybe start another one.
    }
}.start();
I'm working on an audio profile switcher for Android, and as part of the project I have a service running in the background that uses the following timer code:
timer.scheduleAtFixedRate(new TimerTask() {
    public void run() { ... }
}, 0, nextUpdateInterval);
What I'm noticing is that the timer is not honoring the dynamically generated next update interval. nextUpdateInterval is declared as a private static long which is initialized to 30000 (30 seconds) for the first run. Then, once a profile is found, I do some math and update nextUpdateInterval. I have converted the nextUpdateInterval value back out to hours/minutes for debugging purposes, and the calculation works as expected: it shows me, in hours and minutes, when the next timer execution should take place.
nextUpdateInterval calculation: long entirePeriodDiff = toTimeMiliseconds - fromTimeMiliseconds;
Then, once a profile is found, I calculate the elapsed time like so: long elapsedTime = rightNowDate.getTime() - fromDate.getTime();
And then I update the nextUpdateInterval: nextUpdateInterval = entirePeriodDiff - elapsedTime;
One example scenario: a 'Work' profile is set from 9 AM to 4:30 PM, and the service/app is started at 2:02 PM (EST). My toast message executes constantly and acts as a countdown telling me how much time is left, in this case 2:28 and decreasing. Ideally it should not display again until the 2:28 is up. Any ideas?
As per the Android docs:
With fixed-rate execution, the start time of each successive run of a task is scheduled without regard for when the previous run took place. This may result in a series of bunched-up runs (one launched immediately after another) if delays prevent the timer from starting tasks on time.
I think that could be the reason; maybe you need to consider the fixed-delay alternative (Timer.schedule) instead.
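Also note that the period passed to scheduleAtFixedRate() is fixed at the moment the task is scheduled, so updating the nextUpdateInterval variable afterwards does not affect the already-running task. One alternative is to reschedule a one-shot task with the freshly computed delay each time it runs; a sketch (computeNextUpdateInterval() is a hypothetical stand-in for the calculation described in the question):
private final Timer timer = new Timer();

private void scheduleNextCheck(long delayMs) {
    timer.schedule(new TimerTask() {
        @Override
        public void run() {
            // ... find the active profile and apply it ...
            scheduleNextCheck(computeNextUpdateInterval()); // hypothetical helper
        }
    }, delayMs);
}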