In the app I'm writing I have a bunch of stats which I want to display for the user.
The stats include when a specific module was last run, when it will run next, when the last communication with the server was made, and when the next one is going to be.
On top of that there are things like memory usage (a simple figure, not a precise measurement).
The memory usage etc. can be updated every few seconds, so that's not a problem, but the times need to be updated at least every second (for the counters).
Since running every second (or even with a 500 ms period) results in irregular updates and skipped seconds, I now run it with a 300 ms period.
I did notice however that my app began to lag when starting.
After some profiling it turns out it's the views that need to resize that are taking 70% of the time, and the string formatter (for formatting the counter) takes pretty much the rest.
Apart from the CPU usage I also see a lot of allocations; every few seconds there is a GC_CONCURRENT in logcat.
Any tips on solving this efficiently?
Can you restructure it so that the views require less resizing? E.g. set the width of your element to match_parent or a dp size that is larger than the longest string.
I solved the problem by writing my own timer that sleeps in short increments and only updates the view when a full second has passed.
This way the fire interval will be [period, period+sleepTime) which is acceptable when you choose a short sleepTime.
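A minimal sketch of that idea (in JavaScript for illustration; makeSecondTicker is my name, not the asker's code): the loop can fire as often as it likes, but the expensive formatting and view update only run when the displayed second actually changes.

```javascript
// tick() is called every sleepTime ms; the view is only touched when the
// whole-second value changes, so most ticks are cheap no-ops.
function makeSecondTicker(onSecond) {
  let lastShown = null;
  return function tick(nowMs) {
    const second = Math.floor(nowMs / 1000);
    if (second !== lastShown) {
      lastShown = second;
      onSecond(second); // expensive work (formatting, view update) goes here
    }
  };
}
```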
I've also changed it so it says "5 minutes ago", and I have two timers: one that fires every minute and one that fires every second.
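The two-timer split works because a "time ago" label only changes once a minute after the first minute. A tiny relative-time formatter (illustrative, not the asker's actual code) makes that explicit:

```javascript
// Hypothetical "N ago" formatter: under a minute the text changes every
// second; after that it only changes once a minute, so the slower timer
// is enough to keep it fresh.
function formatAgo(deltaMs) {
  const seconds = Math.floor(deltaMs / 1000);
  if (seconds < 60) return seconds + " seconds ago";
  return Math.floor(seconds / 60) + " minutes ago";
}
```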
I am developing a small game-like app. I have done this before and I'm still not sure if it was the correct way to do it. What I need is a percentage chance of some event, for example a 10% chance of gaining an item after a win.
I have been using a random number generator on the server side each time: if the number is <= 10 the user gains the reward, but that still doesn't satisfy the 10% criterion across the whole user base.
I was thinking about recording the user's turn number on the server side and rewarding an item every nth time, but I don't know if that's the right way to do it. I would like to hear your ideas and suggestions. Also, I wasn't sure whether Stack Overflow is the right place to post this or some other Stack Exchange community; if so, please guide me in a comment and I'll move the question. Thanks.
There are two completely different things that you mention in your question. You have to decide which you want because a given algorithm cannot do both.
1. Every single time you try, there is a 10% chance. You might see it hit twice in a row and then not hit for 200 tries. But, in the long run, it hits 10% of the time.
2. You are guaranteeing that out of every 10 tries, it will hit exactly once, never more, never less.
The first one above works just fine with a proper random number pick and comparison. But you will get streaks of more hits than expected and of fewer hits than expected. Over time, the average number of hits will be your 10%.
function getRandomTenPercentOutcome() {
    return Math.random() < 0.1;
}
The second one requires a more complicated implementation that combines a random generator with keeping track of recent events. The simplest way to implement a guaranteed 1 in 10 hit is to have the server create an array of 10 zeroes and then randomly select one cell in the array and change it to a 1.
Then, as you need to pick a random outcome, you .shift() off the first item in the array and use that value. When the array becomes empty, you create a new one. This forces exactly 1 in 10 outcomes (starting from the beginning) to hit. Any given 10 outcomes might have 2 or 0 as you cross boundaries of creating a new array, but can't ever be further off than that.
You can make the array be either system-wide (one array for all users) or per-user (each user has their own outcome array). If you want each user to perceive that it is very close to 1 in 10 for them personally, then you would need to make a separate outcome array for each user.
The length of the array can be adjusted to control how much variance you want to allow. For example, you could create an array 50 long and randomly select 5 cells in the array. That would allow more variability within the 50, though would still force 5 hits within the 50.
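Putting the pieces above together, a sketch of the outcome-array approach (makeOutcomeBag is an illustrative name; a real server would keep one bag per user or one system-wide, as described):

```javascript
// Build an outcome array with exactly `hits` wins in `size` slots,
// shuffle it, and consume one outcome per draw. Refill when empty, so
// every `size` draws contain exactly `hits` wins.
function makeOutcomeBag(size, hits) {
  let bag = [];
  function refill() {
    bag = new Array(size).fill(0);
    for (let i = 0; i < hits; i++) bag[i] = 1;
    // Fisher-Yates shuffle so the winning positions are random
    for (let i = bag.length - 1; i > 0; i--) {
      const j = Math.floor(Math.random() * (i + 1));
      [bag[i], bag[j]] = [bag[j], bag[i]];
    }
  }
  return function nextOutcome() {
    if (bag.length === 0) refill();
    return bag.shift() === 1;
  };
}
```

With size = 50 and hits = 5 you get the looser variant mentioned above: still exactly 5 wins per 50 draws, but with more room for streaks inside each batch.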
Both methods will average 10% in the long run. The first may have greater deviations from 10% over any given interval, but will be much more random. The second has much smaller deviations from 10% but some would say it is less random (which is not necessarily bad - depends upon your objective). Both are "fair".
I know from my son's gaming that he perceives things to be "unfair" when a streak of unlikely things happen even though that streak is well within the occasionally expected results from a series of random events. So, sometimes truly random may not seem as fair to a participant due to the occasionally sporadic nature of truly random.
This question seems to have been asked for multiple languages in other Stack Overflow posts.
For example here it is for Java: Java: do something x percent of the time
Check out the link for some ideas, but remember that it will only tend to 10% of the time given a large sample; you can't expect exactly 10% over just a couple of calls.
I do agree rewarding an item every nth time is not the way to go.
I'm trying to return time in milliseconds between both iOS and Android devices. I was hoping that since most devices sync their time with a networked server, they would be the same. However, I'm noticing they are not precisely the same.
I'm using this method for iOS:
[[NSDate date] timeIntervalSince1970]
and this for Android:
System.currentTimeMillis()
Is there a better way to try to return the same exact time across devices? I'm noticing these values can be off from each other by up to 2 seconds depending upon the Android device.
The use-case for needing this synchronized time is to display a looping animation that is also synced across the devices. So the animation would need to start at the same time, perform its animation for a set duration, and then loop again.
Thanks for any help.
You'll never get exactly the same time. The problem is that clocks aren't perfect, and that they aren't always synched to exactly the same time source at the same rate. Even if you synch to the same time source, the latency between when they process update messages will make a difference between them. 2 seconds is actually pretty good.
Time is tricky. Take 2 devices in perfect sync. Fly from the US to Europe with one of them. They're no longer in sync, and both are right: the relativistic effects of the trip mean one is now a tiny bit older than the other (nanoseconds in practice, not milliseconds, but the point stands).
Basically what you want isn't going to happen. You'll have to settle for close enough. Although if you post why you need them so synchronized maybe we can give you some ideas.
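Given that the use case is a looping animation, one way to get "close enough" is not to compare the two device clocks at all, but to have each device compute an offset against a common server, NTP-style. A minimal sketch (function names are illustrative; a real implementation would average several round trips and reject outliers):

```javascript
// Estimate the offset between the local clock and a shared server clock.
// t0 = local time when the request was sent, t1 = local time when the
// response arrived, serverTimeMs = the server's timestamp in the response.
function clockOffset(t0, serverTimeMs, t1) {
  // Assume the request and the response each took half the round trip.
  return serverTimeMs - (t0 + t1) / 2;
}

// "Shared" time on any device = local clock + that device's offset.
function syncedNow(localNowMs, offsetMs) {
  return localNowMs + offsetMs;
}

// Both devices derive the loop phase from the shared time, so the
// animations stay in step without the devices ever talking to each other.
function animationPhase(sharedNowMs, loopDurationMs) {
  return sharedNowMs % loopDurationMs;
}
```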
The thing is that I have to keep a thread running for a pretty long time (it's really an indefinite time; it could be 1 minute or even months), and it needs to update the UI about once every millisecond...
There is the Executor, AsyncTask, Handler and the native Thread class... but which one is better for this case?
The problem with AsyncTask is that it is destroyed (or detached from the Activity?) an hour or so after the Activity starts running in the background, and the user could return to the Activity at any time just to find that it's not working (and even causing memory leaks); the UI also lags when changing to another activity or even pulling down the notification panel.
Plain threads are even laggier due to the post() method being called on the TextView every time I need to update the UI (remember that I need to report progress about once every millisecond)...
tl;dr
I'm developing a long-running stopwatch that can measure from milliseconds up to days, weeks or even years. What is the best UI-intensive threading technique for this?
Could you please help me? Thanks!!
--- Edit:
Solved. It had something to do with system resources and the app moving to the background. I just had to save the initial time and pauses in a Bundle and load them when the app is started again. Thank you everyone!
I'm no Android expert, but I'm guessing that the reason why your AsyncTask gets cancelled is because the OS, being a battery powered OS, is deciding that a long running background task is a bad thing for battery consumption.
Having your program run a background thread for years is going to be a big disappointment for your users who will wonder why their mobiles run out of juice within a couple of hours. I suggest finding another way of doing your time measurement. What's wrong with using the device's real time clock?
I see little point in trying to update a GUI once every millisecond. The OS isn't refreshing the screen at that rate anyway, and no user on earth is going to notice. Plump for once every 40 ms at most (25 updates per second).
And then there's the matter of accuracy. There's no point trying to measure time with millisecond precision on a device like an Android mobile over periods of hours, never mind days or months. Left alone, the clocks and oscillators will be wrong by several seconds a day. The best one will be the real-time clock, but even that is going to be pretty poor (they always are). Android is probably doing an NTP update a couple of times per day, so there will be brief periods in a day when the local clock is close to accurate (but even then it won't be millisecond accurate).
So even if you do manage to measure time with millisecond precision over months the answers you'll be displaying to the user are going to be wrong by several seconds. You'll be lucky to get within minutes of the actual elapsed time.
If your goal is just to have a rapidly updating stopwatch display while the application is in the foreground, just loop reading the real-time clock and calculate/display the time since the stopwatch was started. Don't bother doing anything in the background; just sleep. When your app becomes foreground again, resume the loop; the device's real-time clock will have been ticking away all the time your app was asleep, allowing you to calculate and display the time difference. This will be a lot simpler than trying to have a long-running background thread, and it will also be more accurate than any other way you might choose (though still not millisecond accurate).
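The whole approach boils down to persisting only the start time and deriving everything else from the clock. A sketch (in JavaScript for illustration, with the clock injected so it's testable; the names are mine):

```javascript
// Foreground-only stopwatch: nothing runs while the app is asleep; the
// elapsed time is recomputed from the real-time clock on every redraw.
// nowFn stands in for whatever wall-clock read the platform provides.
function makeStopwatch(nowFn) {
  let startMs = null;
  return {
    start() { startMs = nowFn(); },
    elapsedMs() { return startMs === null ? 0 : nowFn() - startMs; },
  };
}
```

Persist startMs when the app goes to the background (as the asker's edit does with a Bundle) and the stopwatch survives any pause for free.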
I'm looking to create a drum machine in ActionScript 3 (as an Adobe AIR Android app), which will keep to a user defined tempo (BPM).
I am struggling to find a way to keep the project in time. At the moment I have made it so that 5 different sounds are represented in rows of 8 squares, and the user can click each square to choose when that sound plays (I hope this makes sense).
At the moment I am using Timer to keep the project in time, which is very laggy and inconsistent.
Using Timer is a bad idea for this; there, I said it...
The issue is that the timer has a drift and fires several milliseconds later.
Try a simple test where you have a timer that executes every 500 ms and then compare the getTimer() count. What I have found in my experiments is that the timer is continually off, and it doesn't look like it self-corrects. I've tried using a self-correcting timer that changes the firing time based on the getTimer() difference since the last run, but it's still not reliable; and any time your processor's load picks up, the timer will be off anyway.
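For reference, the self-correcting idea mentioned above usually means scheduling each tick relative to the original start time rather than the previous tick, so individual late firings don't accumulate. A sketch of the delay calculation (in JavaScript; nextDelay is an illustrative name — and as noted, even this isn't tight enough for audio):

```javascript
// Given the session start time, the desired period, and the current time,
// return how long to wait so the next tick lands back on the ideal grid
// (startMs, startMs + periodMs, startMs + 2 * periodMs, ...).
function nextDelay(startMs, periodMs, nowMs) {
  const ticksSoFar = Math.floor((nowMs - startMs) / periodMs) + 1;
  return startMs + ticksSoFar * periodMs - nowMs;
}
```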
The correct way of dealing with this is to use ByteArray data as the source for the sound. Based on a calculation from the sampling rate, you can populate the stream with data in advance, and the sound will play on time, pretty much guaranteed. I haven't gone as far as to create something that does this myself, but there are several libraries you can utilize that help with this.
My top two recommended libraries are SiON and tonfall.
you can see a sample of SiON here http://wonderfl.net/c/qf4b
and tonfall example at http://tonematrix.audiotool.com/
While I haven't tried them on android, I think either should work
My project is growing large, and I am facing the following problem: the compile/transform-to-dalvik/install cycle takes too long.
It takes around 2 minutes since I press F11 (or ^F11) until the dialog to choose the device is shown, and around another minute from there until the application is shown on the phone.
3 minutes may not seem like a waste of time, but it breaks the concentration (actually, I am writing this post during those 3 minutes).
Do you have any trick to avoid this? Any way to avoid the full dalvik conversion when only a couple of classes have been modified? Any way to live-copy the classes to the phone without the full .class → dalvik → .apk → copy → install cycle?
Any other hint?