I'm looking to create a drum machine in ActionScript 3 (as an Adobe AIR Android app) that will keep to a user-defined tempo (BPM).
I am struggling to find a way to keep the project in time. At the moment, five different sounds are represented as rows of eight squares, and the user can click each square to choose when that sound plays (I hope this makes sense).
Right now I am using a Timer to keep everything in time, which is very laggy and inconsistent.
Using Timer is a bad idea for this, there, I said it...
The issue is that Timer drifts and fires several milliseconds late.
Try a simple test where you have a timer that executes every 500 ms, then compare against the getTimer() count. What I have found in my experiments is that the timer is continually off, and it doesn't appear to self-correct. I've tried a self-correcting timer that changes the next firing time based on the getTimer() difference since the last run, but it's still not reliable, and any time your processor's load picks up, the timer will be off anyway.
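Just to illustrate the kind of test I mean, here is roughly the same experiment in Android Java terms (the original discussion is about AS3's Timer and getTimer(); this Handler-based version is only an analogy, not AS3 code):

    import android.os.Handler;
    import android.os.Looper;
    import android.os.SystemClock;

    // Hypothetical drift test: schedule a "tick" every 500 ms and log how far each
    // tick lands from its ideal time. With a naive reschedule the error accumulates.
    public class TimerDriftTest {
        private static final long INTERVAL_MS = 500;
        private final Handler handler = new Handler(Looper.getMainLooper());
        private long startTime;
        private int tickCount;

        public void start() {
            startTime = SystemClock.uptimeMillis();
            handler.postDelayed(tick, INTERVAL_MS);
        }

        private final Runnable tick = new Runnable() {
            @Override
            public void run() {
                tickCount++;
                long ideal = startTime + tickCount * INTERVAL_MS;
                long actual = SystemClock.uptimeMillis();
                android.util.Log.d("DriftTest", "tick " + tickCount + " drift=" + (actual - ideal) + " ms");
                handler.postDelayed(tick, INTERVAL_MS); // naive reschedule, so drift keeps growing
            }
        };
    }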
The correct way of dealing with this is to use ByteArray data as the source for the sound. Based on the sampling rate you can populate the stream with data in advance, and the sound will play on time, pretty much guaranteed. I haven't gone as far as building something like this myself, but there are several libraries you can use to help with it.
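In AS3 this means feeding a Sound object from a SampleDataEvent handler. I haven't written that myself, but as a rough, untested illustration of the "place each hit at an exact sample offset" idea, here is a hedged sketch in Android Java rather than AS3 (the class and method names are made up, and treating each of the eight squares as an eighth note is an assumption):

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    // Hedged sketch: mix one bar of an 8-step pattern into a PCM buffer up front,
    // so playback timing depends on the audio clock rather than on a Timer.
    // Assumes 44.1 kHz mono 16-bit samples.
    public class PatternRenderer {
        static final int SAMPLE_RATE = 44100;

        // pattern[step] == true means the sound plays on that step; "hit" is the
        // decoded PCM data of one drum sample (both are assumptions for the sketch).
        static short[] renderBar(boolean[] pattern, short[] hit, int bpm) {
            int samplesPerStep = (int) (SAMPLE_RATE * 60.0 / bpm / 2.0); // eighth notes
            short[] bar = new short[samplesPerStep * pattern.length];
            for (int step = 0; step < pattern.length; step++) {
                if (!pattern[step]) continue;
                int offset = step * samplesPerStep;            // exact sample position of this hit
                for (int i = 0; i < hit.length && offset + i < bar.length; i++) {
                    int mixed = bar[offset + i] + hit[i];      // naive additive mix
                    bar[offset + i] = (short) Math.max(Short.MIN_VALUE, Math.min(Short.MAX_VALUE, mixed));
                }
            }
            return bar;
        }

        static void play(short[] bar) {
            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    bar.length * 2, AudioTrack.MODE_STATIC);
            track.write(bar, 0, bar.length);
            track.play();
        }
    }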
My top two recommended libraries are SiON and tonfall.
You can see a SiON sample here: http://wonderfl.net/c/qf4b
and a tonfall example at http://tonematrix.audiotool.com/
While I haven't tried them on Android, I think either should work.
Related
Recently I have been working on an offline mobile game based on Cocos2dx-lua.
I found an app on Google Play called GameGuardian which can change the speed of time. The app probably modifies the gettimeofday() method in libc.so. I've tried many APIs such as os.time() and SystemClock.elapsedRealtime(), but they all failed.
Could somebody please suggest a way to avoid the effects of this app?
The only sure method is to have your app contact your game's server to verify the time; many Android games do that, "The Battle Cats" being one example. Say you check time validity as soon as the app starts, and if no connection is available, you only allow resources to be generated for some allowed window, not more than an hour's worth since the last confirmed time, for example.
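A hedged sketch of that check (in Java; with Cocos2dx-lua you would call into something like this from the Lua side; the URL and the plain-text epoch-millis response are placeholders):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Hedged sketch: fetch a trusted timestamp from your own server and compare it
    // with the locally stored "last confirmed time".
    public class ServerTimeCheck {
        static final long MAX_OFFLINE_MS = 60L * 60L * 1000L; // allow ~1 hour of unconfirmed play

        static long fetchServerTimeMillis() throws Exception {
            HttpURLConnection conn = (HttpURLConnection) new URL("https://example.com/api/time").openConnection();
            conn.setConnectTimeout(5000);
            try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
                return Long.parseLong(in.readLine().trim()); // assumed: server returns epoch millis as plain text
            } finally {
                conn.disconnect();
            }
        }

        // lastConfirmedMillis is persisted locally the last time the server answered.
        static boolean isLocalTimePlausible(long lastConfirmedMillis, long localNowMillis) {
            // No connection: only trust resources generated within the allowed window.
            return localNowMillis >= lastConfirmedMillis
                    && localNowMillis - lastConfirmedMillis <= MAX_OFFLINE_MS;
        }
    }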
Another idea might be checking the current FPS. The hardware can't speed up, so with VSync enabled a frame is rendered in roughly the same amount of time. If you find that rendering a single frame appears to take significantly more time than it should, and this happens for many frames, it might indicate possible cheating. But this is not a stable solution, since you'll have to be sure your FPS doesn't normally drop on weak devices, and it tells you nothing about time spent while the game wasn't running.
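A rough sketch of that heuristic (the threshold and window are arbitrary assumptions):

    // Hedged sketch: with VSync the real frame time is ~16.7 ms at 60 Hz. If the
    // (possibly manipulated) clock reports frames consistently taking far longer
    // while the rendering load is normal, time may have been sped up.
    public class FrameTimeWatcher {
        static final double EXPECTED_FRAME_MS = 1000.0 / 60.0;
        static final double SUSPICIOUS_FACTOR = 3.0;   // assumed threshold
        static final int WINDOW = 120;                 // roughly two seconds of frames

        private int suspiciousFrames;
        private int framesSeen;

        // Call once per rendered frame with the delta reported by the game clock.
        public boolean onFrame(double reportedDeltaMs) {
            framesSeen++;
            if (reportedDeltaMs > EXPECTED_FRAME_MS * SUSPICIOUS_FACTOR) {
                suspiciousFrames++;
            }
            if (framesSeen >= WINDOW) {
                boolean suspicious = suspiciousFrames > WINDOW * 0.8; // most frames look stretched
                framesSeen = 0;
                suspiciousFrames = 0;
                return suspicious;
            }
            return false;
        }
    }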
An extreme measure might be completely erasing the app's state if you find the time has jumped back by more than ~1 hour since the last saved unconfirmed time (to allow for DST adjustments, etc.).
But generally you can't protect a completely offline game from time manipulation.
I am programming an alarm clock for myself and I ran into this problem. I also want to use the solution to set a specific (chosen by me) alarm volume and ringtone. Please show the code, if possible.
A staple of programming is breaking a requirement down to its most basic components. Imagine the alarm clock app you want to build as a big cube made out of many smaller cubes. You need to break each one of them down into its atomic elements. Once you've done that, you'll have your answer.
For example, in your case, I would consider some of the following problems:
Running a background process which stays active after the user has minimized the app.
Playing a sound.
Taking priority over any other app, regardless of the state of the phone (locked/unlocked), and displaying the 'Wake up' window.
Getting the date
Getting the time
I would then start searching SO/Google for answers to my specific questions. The Android website and academy are also incredible resources for all things Android.
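For the specific "alarm volume and ringtone" part of the question above, here is a hedged sketch using the standard AudioManager and RingtoneManager APIs (choosing full volume is just an example):

    import android.content.Context;
    import android.media.AudioManager;
    import android.media.Ringtone;
    import android.media.RingtoneManager;
    import android.net.Uri;

    // Hedged sketch: set the alarm stream volume and play the default alarm ringtone on it.
    public class AlarmSound {
        public static void ring(Context context) {
            AudioManager audio = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
            int max = audio.getStreamMaxVolume(AudioManager.STREAM_ALARM);
            audio.setStreamVolume(AudioManager.STREAM_ALARM, max, 0); // full volume, no flags

            Uri alarmUri = RingtoneManager.getDefaultUri(RingtoneManager.TYPE_ALARM);
            Ringtone ringtone = RingtoneManager.getRingtone(context, alarmUri);
            ringtone.setStreamType(AudioManager.STREAM_ALARM); // play on the alarm stream
            ringtone.play();
        }
    }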
I hope this will be helpful to you.
I'm trying to return the same time in milliseconds on both iOS and Android devices. I was hoping that since most devices sync their time with a network server, they would match. However, I'm noticing they are not precisely the same.
I'm using this method for iOS:
[[NSDate date] timeIntervalSince1970]
and this for Android:
System.currentTimeMillis()
Is there a better way to try to return the same exact time across devices? I'm noticing these values can be off from each other by up to 2 seconds depending upon the Android device.
The use-case for needing this synchronized time is to display a looping animation that is also synced across the devices. So the animation would need to start at the same time, perform its animation for a set duration, and then loop again.
Thanks for any help.
You'll never get exactly the same time. The problem is that clocks aren't perfect, and they aren't always synced to exactly the same time source at the same rate. Even if you sync to the same time source, the latency between when each device processes the update messages will create a difference between them. Two seconds is actually pretty good.
Time is tricky. Take two devices in perfect synchrony and fly from the US to Europe with one of them. They're no longer in sync, and both are right: relativistic effects of traveling at high speed mean one is now measurably (if only by nanoseconds) older than the other.
Basically, what you want isn't going to happen; you'll have to settle for close enough. Although if you post why you need them so tightly synchronized, maybe we can give you some ideas.
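Since the use case turns out to be a looping animation, one rough idea (a sketch, with the time offset assumed to come from a shared server you control) is to correct each device's local clock by a measured offset and derive the loop phase from the corrected time, so the devices only need the offset to be roughly right:

    // Hedged sketch: correct the local clock by an offset obtained from a shared
    // source (e.g. your own server returning epoch millis), then derive the loop
    // phase from the corrected time on every device.
    public class SyncedLoop {
        private final long offsetMs;        // serverTime - localTime, measured once
        private final long loopDurationMs;  // length of the looping animation

        public SyncedLoop(long offsetMs, long loopDurationMs) {
            this.offsetMs = offsetMs;
            this.loopDurationMs = loopDurationMs;
        }

        /** Fraction [0,1) through the current loop, comparable across devices. */
        public double phase() {
            long corrected = System.currentTimeMillis() + offsetMs;
            return (corrected % loopDurationMs) / (double) loopDurationMs;
        }
    }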
I'm developing an engine and a game at the same time in C++, and I'm using Box2D for the physics back end. I'm testing on different Android devices, and on two out of three devices the game runs fine and so do the physics. However, on my Galaxy Tab 10.1 I'm sporadically getting a sort of "stutter". Here is a YouTube video demonstrating it:
http://www.youtube.com/watch?v=DSbd8vX9FC0
The first device the game is running on is an Xperia Play; the second is a Galaxy Tab 10.1. Needless to say, the Galaxy Tab has much better hardware than the Xperia Play, yet Box2D lags at random intervals for random lengths of time. The code is exactly the same on both machines. Also, the rest of the engine/game is not actually lagging; the whole time it runs at a solid 60 fps. So this "stuttering" seems to be some kind of delay or glitch in actually reading values from Box2D.
The sprites you see moving check at render time whether they have an attached physical body and set their positional values based on the world position of that body. So it seems to be in this specific process that Box2D gets out of sync with the rest of the application. Quite odd. I realize it's a long shot, but I figured I'd post it here anyway to see if anyone has ideas, since I'm totally stumped. Thanks for any input in advance!
Oh, P.S. I am using a fixed time step, since that seems to be the most commonly suggested solution for things like this. I moved to a fixed time step while developing this on my desktop; I ran into a similar but more severe issue, and the fixed step was the solution. Also, like I said, the game runs at a steady 60 fps controlled by a low-latency timer, so I doubt simple lag is the issue. Thanks again!
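(To clarify what I mean by a fixed time step: the usual accumulator pattern, sketched below in Java-ish form rather than my actual C++ engine code.)

    // Hedged sketch of the standard fixed-timestep accumulator loop; the real
    // engine is C++, this only shows the shape of it.
    public class FixedStep {
        static final float STEP = 1.0f / 60.0f;  // physics advances in fixed slices
        private float accumulator;

        public void onFrame(float frameDeltaSeconds) {
            accumulator += frameDeltaSeconds;
            while (accumulator >= STEP) {
                world.step(STEP, 8, 3);   // e.g. Box2D step with a fixed dt
                accumulator -= STEP;
            }
            render();                     // sprites read body positions here
        }

        // Placeholders standing in for the real engine objects.
        private final PhysicsWorld world = new PhysicsWorld();
        private void render() { }
        static class PhysicsWorld { void step(float dt, int velIterations, int posIterations) { } }
    }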
As I mentioned in the comments here, this came down to being a timer resolution issue. I was using a timer class which was supposed to access the highest resolution system timer, cross platform. Everything worked great, except when it came to Android, some versions worked and some versions it did not. The galaxy tab 10.1 was one such case.
I ended up rewriting my getSystemTime() method to use a new addition to C++11 called std::chrono::high_resolution_clock. This also worked great (everywhere but Android)... except it has yet to be implemented in any NDK for Android. It is supposed to arrive in version 5 of the CrystaX NDK R7, which at the time of this post is 80% complete.
I did some research into various methods of accessing the system time, or anything I could base a reliable timer on, on the NDK side, but what it comes down to is that these methods are not supported on all platforms. I've gone through the painful process of writing my own engine from scratch simply so that I could support every version of Android, so betting on methods that are inconsistently implemented is nonsensical.
The only sensible solution for anyone facing this problem, in my opinion, is to simply abandon the idea of implementing such code on the NDK side. I'm going to do this on the Java end instead, since so far in all my tests that has been sufficiently reliable across all the devices I've tested on. More on that here:
http://www.codeproject.com/Articles/189515/Androng-a-Pong-clone-for-Android#Gettinghigh-resolutiontimingfromAndroid7
Update
I have now implemented my proposed solution (doing the timing on the Java side), and it has worked. I also discovered that handling any relatively large number on the NDK side, regardless of data type (for example the nanosecond count returned by the monotonic clock), also results in serious lagging on some versions of Android. As such, I've optimized this as much as possible by passing around a pointer to the system time, to make sure we're not passing by copy.
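The Java half of that ends up being tiny; a hedged sketch (the native method and library names are placeholders, not my actual engine code):

    // Hedged sketch: keep the timing in Java, where System.nanoTime() has been
    // reliable on every device I tested, and hand the value to native code.
    public class GameTime {
        static {
            System.loadLibrary("game"); // assumed native library name
        }

        // Called from the render loop; forwards the current time to the engine.
        public static void tick() {
            nativeSetTimeNanos(System.nanoTime());
        }

        // Implemented on the C++ side; the engine stores the value once and passes
        // a pointer to it around internally, as described above.
        private static native void nativeSetTimeNanos(long nanos);
    }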
One last thing: my earlier statement that calling the monotonic clock from the NDK side is unreliable seems to be false. From the Android docs on System.nanoTime():
...and System.nanoTime(). This clock is guaranteed to be monotonic,
and is the recommended basis for the general purpose interval timing
of user interface events, performance measurements, and anything else
that does not need to measure elapsed time during device sleep.
So it would seem, if this can be trusted, that calling the clock is reliable, but as mentioned there are other issues that then arise, like allocating and handling the massive number that results, which alone nearly cut my frame rate in half on the Galaxy Tab 10.1 with Android 3.2. Ultimate conclusion: supporting all Android devices equally is either damn near or flat-out impossible, and using native code seems to make it worse.
I am very new to game development, and you seem a lot more experienced, so it may be silly to ask, but are you using delta time to update your world? Although you say you have a constant frame rate of 60 fps, maybe your frame counter calculates something wrong, and you should use delta time to skip some frames when the FPS is low, or your world will seem to 'fall behind'. I am pretty sure you are familiar with this, but I think a good example is here: DeltaTimeExample, although it is a C implementation. If you need, I can paste some code from my Android projects showing how I use delta time, which I developed following this book: Beginning Android Games.
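A minimal delta-time loop in Java, for what it's worth (a hedged sketch of the general pattern, not code from the book or from your engine):

    // Hedged sketch of a delta-time update loop: measure the real time elapsed
    // since the last frame and scale all movement by it, so the world advances
    // at the same speed regardless of the frame rate.
    public class GameLoop implements Runnable {
        private volatile boolean running = true;

        @Override
        public void run() {
            long lastTime = System.nanoTime();
            while (running) {
                long now = System.nanoTime();
                float deltaSeconds = (now - lastTime) / 1000000000f;
                lastTime = now;
                update(deltaSeconds);   // e.g. position += velocity * deltaSeconds
                render();
            }
        }

        private void update(float dt) { /* advance the world by dt */ }
        private void render() { /* draw the current state */ }
    }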
I have searched for this online, but am still a bit confused (as I'm sure others will be if they think of something like this). I'd like to preface by saying that this is not for homework and/or profit.
I wanted to create an app that could listen to your microwave as you prepare popcorn. It would work by sounding an alarm when there's a certain time interval between pops (say 5-6 seconds). Again, this is simply a project to keep me occupied - not for a class.
Either way, I'm having trouble figuring out how to analyze the audio input in real time. That is, I need a way to log the time when a "pop" occurs. So that you guys don't think I didn't do any research into the matter, I've checked out this SO question and have extensively searched the AudioRecord function list.
I'm thinking that I will probably have to do something with one of the versions of read() and then compare the recorded audio every 2 seconds or so to the recorded audio of a "pop" (i.e. if 70% or more of the byte[] audioData array is the same as that of a popping sound, then log the time). Can anyone with Android audio input experience let me know if I'm at least on the right track? This is not a question of me wanting you to code anything for me, but a question as to whether I'm on the correct track, and, if not, which direction I should head instead.
I think I have an easier way.
You could use MediaRecorder's getMaxAmplitude() method.
Anytime your recorder detects a big jump in amplitude, you have detected a corn pop!
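A hedged sketch of that approach (the polling interval, threshold, and output target are assumptions):

    import android.media.MediaRecorder;
    import android.os.Handler;

    // Hedged sketch: record to a throwaway target and poll getMaxAmplitude(), which
    // returns the maximum amplitude sampled since the previous call. A jump above
    // the threshold is treated as a pop.
    public class PopListener {
        static final int POLL_MS = 100;        // assumed polling interval
        static final int POP_THRESHOLD = 8000; // assumed amplitude threshold (0..32767)

        private final MediaRecorder recorder = new MediaRecorder();
        private final Handler handler = new Handler();
        private long lastPopTime;

        public void start() throws Exception {
            recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
            recorder.setOutputFile("/dev/null"); // we only care about amplitude, not the file
            recorder.prepare();
            recorder.start();
            handler.postDelayed(poll, POLL_MS);
        }

        private final Runnable poll = new Runnable() {
            @Override
            public void run() {
                if (recorder.getMaxAmplitude() > POP_THRESHOLD) {
                    long now = System.currentTimeMillis();
                    long gapMs = now - lastPopTime;  // time since the previous pop
                    lastPopTime = now;
                    onPop(gapMs);                    // e.g. sound the alarm once gaps exceed ~5000 ms
                }
                handler.postDelayed(poll, POLL_MS);
            }
        };

        private void onPop(long gapSincePreviousMs) { /* check the 5-6 second rule here */ }
    }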
Check out this code (ignore the playback part): Playing back sound coming from microphone in real-time
Basically the idea is that you will have to take the value of each 16-bit sample (which corresponds to the value of the wave at that time). Using the sampling rate, you can calculate the time between peaks in volume. I think that might accomplish what you want.
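And a hedged sketch of that sample-level approach with AudioRecord (the threshold is an assumption, and a real detector would need to debounce so one pop isn't counted many times):

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    // Hedged sketch: read raw 16-bit samples and convert the sample index of each
    // loud peak into a timestamp using the sampling rate.
    public class PopTimer {
        static final int SAMPLE_RATE = 44100;
        static final short THRESHOLD = 8000;   // assumed amplitude threshold

        public void listen() {
            int bufSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufSize);
            short[] buffer = new short[bufSize / 2];
            long samplesSeen = 0;
            double lastPopSeconds = -1;

            record.startRecording();
            while (true) {  // in a real app, run on a worker thread with a stop flag
                int read = record.read(buffer, 0, buffer.length);
                for (int i = 0; i < read; i++) {
                    if (Math.abs(buffer[i]) > THRESHOLD) {
                        double seconds = (samplesSeen + i) / (double) SAMPLE_RATE;
                        if (lastPopSeconds >= 0) {
                            double gap = seconds - lastPopSeconds; // time between pops
                            // e.g. once gap stays above ~5.0 s, fire the "popcorn done" alarm
                        }
                        lastPopSeconds = seconds;
                    }
                }
                samplesSeen += read;
            }
        }
    }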
This may be a bit of overkill, but there is a framework from the MIT Media Lab called funf: http://code.google.com/p/funf-open-sensing-framework/
They have already created classes for audio input and some analysis (FFT and the like); saving to files and uploading are also implemented as far as I've seen, and they handle most of the sensors available on the phone.
You can also take inspiration from the code they wrote, which I think is pretty good.