I'm trying to learn how to use the accelerometer by creating (what I thought would be) a simple app to mimic a crude maraca.
The objective is that when the phone is flicked downwards quickly, it emits a maraca sound at the end of that flick, and likewise a different sound is emitted at the end of an upward flick.
The strategy for implementing this is to detect when the acceleration passes a certain threshold. When this happens, shakeIsHappening is set to true, and the data from the z axis is fed into an array. A comparison is made between the first and most recent positions in the z array to see whether the phone has been moved upwards or downwards. The result is stored in a boolean called zup.
Once the acceleration goes below zero, we assume the flick movement has ended and emit a sound, chosen depending on whether the movement was up or down (zup).
Here is the code:
public class MainActivity extends Activity implements SensorEventListener {

    private TextView results;      // referenced in onCreate below
    private TextView clickresults; // referenced in onCreate below
    private float mAccelNoGrav;
    private float mAccelWithGrav;
    private float mLastAccelWithGrav;
    ArrayList<Float> z = new ArrayList<Float>();
    public static float finalZ;
    public static boolean shakeIsHappening;
    public static int beatnumber = 0;
    public static float highZ;
    public static float lowZ;
    public static boolean flick;
    public static boolean pull;
    public static SensorManager sensorManager;
    public static Sensor accelerometer;
    private SoundPool soundpool;
    private HashMap<Integer, Integer> soundsMap;
    private boolean zup;
    private boolean shakeHasHappened;
    public static int shakesound1 = 1;
    public static int shakesound2 = 2;
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        results = (TextView) findViewById(R.id.results);
        clickresults = (TextView) findViewById(R.id.clickresults);
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        mAccelNoGrav = 0.00f;
        mAccelWithGrav = SensorManager.GRAVITY_EARTH;
        mLastAccelWithGrav = SensorManager.GRAVITY_EARTH;
        soundpool = new SoundPool(4, AudioManager.STREAM_MUSIC, 100);
        soundsMap = new HashMap<Integer, Integer>();
        soundsMap.put(shakesound1, soundpool.load(this, R.raw.shake1, 1));
        soundsMap.put(shakesound2, soundpool.load(this, R.raw.shake1, 1));
    }
    public void playSound(int sound, float fSpeed) {
        AudioManager mgr = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
        float streamVolumeCurrent = mgr.getStreamVolume(AudioManager.STREAM_MUSIC);
        float streamVolumeMax = mgr.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
        float volume = streamVolumeCurrent / streamVolumeMax;
        soundpool.play(soundsMap.get(sound), volume, volume, 1, 0, fSpeed);
    }
    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0];
        float y = event.values[1];
        z.add((event.values[2]) - SensorManager.GRAVITY_EARTH);
        mLastAccelWithGrav = mAccelWithGrav;
        mAccelWithGrav = android.util.FloatMath.sqrt(x * x + y * y + z.indexOf(z.size()-1) * z.indexOf(z.size()-1));
        float delta = mAccelWithGrav - mLastAccelWithGrav;
        mAccelNoGrav = mAccelNoGrav * 0.9f + delta; // Low-cut filter
        if (mAccelNoGrav > 3) {
            shakeIsHappening = true;
            z.clear();
        }
        if (mAccelNoGrav < 0) {
            if (shakeIsHappening) {
                shakeIsHappening = false;
                shakeHasHappened = true;
            }
        }
        if (shakeIsHappening && z.size() != 0) {
            if (z.get(z.size()-1) > z.get(0)) {
                zup = true;
            } else if (z.get(0) > z.get(z.size()-1)) {
                zup = false;
            }
        }
        if (shakeHasHappened) {
            Log.d("click", "up is" + zup + "Low Z:" + z.get(0) + " high Z:" + z.get(z.size()-1));
            if (!zup) {
                shakeHasHappened = false;
                playSound(shakesound2, 1.0f);
                z.clear();
            } else if (zup) {
                shakeHasHappened = false;
                playSound(shakesound1, 1.0f);
                z.clear();
            }
        }
    }
}
Some of the problems I'm having are:
I think shakeHasHappened kicks in when deceleration starts, i.e. when acceleration first goes below zero. Perhaps it should instead fire when deceleration stops, when acceleration has gone negative and is moving back towards zero. Does that sound sensible?
The way of detecting whether the motion is up or down isn't working. Is this because the z-axis data also includes the acceleration itself, so it doesn't give me an accurate position of the phone?
I'm getting lots of double clicks, and I can't quite work out why. Sometimes it doesn't click at all.
If anyone wants to have a play around with this code and see if they can find a way of making it more accurate and more efficient, please go ahead and share your findings. And if anyone can spot why it's not working the way I want it to, again please share your thoughts.
To link sounds to this code, drop your wav files into your res\raw folder and reference them in the R.raw.shake1 bit (no extension)
Thanks
EDIT: I've done a bit of research and have stumbled across something called Dynamic Time Warping (DTW). I don't know anything about it yet, but will start to look into it. Does anyone know if DTW could be a different method of achieving a working maraca simulator app?
I can give you some pointers on this:
First of all, I noticed that you're using the same resource for both outcomes:
soundsMap.put(shakesound1, soundpool.load(this, R.raw.shake1, 1));
soundsMap.put(shakesound2, soundpool.load(this, R.raw.shake1, 1));
The resource in case of shakesound2 should be R.raw.shake2.
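That is:
soundsMap.put(shakesound2, soundpool.load(this, R.raw.shake2, 1));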
Second, the following only deals with one of the motions:
if (mAccelNoGrav > 3)
This should be changed to:
if (mAccelNoGrav > 3 || mAccelNoGrav < -3)
Currently, you are not intercepting downward motion.
Third, an acceleration value of 3 is rather low. If you want to filter out normal arm movement, this threshold should be around 6 or 7 (and -6 or -7 for the downward case).
Fourth, you do not need to store z values to check whether the motion was up or down. You can simply check whether:
mAccelNoGrav > 6 ===> implies motion was upwards
mAccelNoGrav < -6 ===> implies motion was downwards
You can use this information to set zup accordingly.
Fifth: I can only guess that you are using if (mAccelNoGrav < 0) to play the sound when the motion ends. In that case, this check should be changed to:
if (mAccelNoGrav > -epsilon && mAccelNoGrav < epsilon)
where epsilon is a small positive value defining a band around zero, such as 1 (giving the range (-1, 1)).
Sixth, you should include a lockout period in your application. This would be the period after all conditions have been met and a sound is about to be played. For the next, say 1000 ms, don't process the sensor values. Let the motion stabilize. You'll need this to avoid getting multiple playbacks.
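Putting points two through six together, a minimal sketch of the callback might look like this. THRESHOLD, EPSILON, LOCKOUT_MS and lastPlayTime are assumed names and values you would declare and tune yourself; the rest reuses the fields from your question:
private static final float THRESHOLD = 6f;   // point three: ignore normal arm movement
private static final float EPSILON = 1f;     // point five: "motion has ended" band
private static final long LOCKOUT_MS = 1000; // point six: lockout period
private long lastPlayTime = 0;

@Override
public void onSensorChanged(SensorEvent event) {
    long now = System.currentTimeMillis();
    if (now - lastPlayTime < LOCKOUT_MS) {
        return; // point six: let the motion stabilize, avoids double clicks
    }
    // ... compute mAccelNoGrav from the event exactly as before ...
    if (mAccelNoGrav > THRESHOLD) {          // point four: upward motion
        zup = true;
        shakeIsHappening = true;
    } else if (mAccelNoGrav < -THRESHOLD) {  // point two: intercept downward motion too
        zup = false;
        shakeIsHappening = true;
    } else if (shakeIsHappening
            && mAccelNoGrav > -EPSILON && mAccelNoGrav < EPSILON) {
        // point five: acceleration settled near zero, so the flick has ended
        shakeIsHappening = false;
        playSound(zup ? shakesound1 : shakesound2, 1.0f);
        lastPlayTime = now;
    }
}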
Note: Please include comments in your code. At the very least, place comments on every block of code to convey what you are trying to accomplish with it.
I tried to implement this myself a while ago, and ended up using the following solution, based on the same concept as in your question:
http://jarofgreen.co.uk/2013/02/android-shake-detection-library/
Related
Does anyone know how to get a smooth vertical orientation degree in Android?
I already tried OrientationEventListener as shown below, but it's very noisy. I've tried all the rates (Normal, Delay, Game and Fastest); all show the same result.
myOrientationEventListener = new OrientationEventListener(this, SensorManager.SENSOR_DELAY_NORMAL) {
    @Override
    public void onOrientationChanged(int arg0) {
        orientaion = arg0;
        Log.i("orientaion", "orientaion:" + orientaion);
    }
};
So there are two things going on that can affect what you need.
Sensor delay. Android provides four different sensor delay modes: SENSOR_DELAY_UI, SENSOR_DELAY_NORMAL, SENSOR_DELAY_GAME, and SENSOR_DELAY_FASTEST, where SENSOR_DELAY_UI has the longest interval between two data points and SENSOR_DELAY_FASTEST has the shortest. The shorter the interval the higher data sampling rate (number of samples per second). Higher sampling rate gives you more "responsive" data, but comes with greater noise, while lower sampling rate gives you more "laggy" data, but more smooth.
Noise filtering. With the above in mind, you need to decide which route you want to take. Does your application need fast response? If it does, you probably want to choose a higher sampling rate. Does your application need smooth data? I guess this is obviously YES given the context of the question, which means you need noise filtering. For sensor data, noise is mostly high frequency in nature (noise value oscillates very fast with time). So a low pass filter (LPF) is generally adequate.
A simple way to implement LPF is exponential smoothing. To integrate with your code:
int orientation = <init value>;
float update_rate = <value between 0 and 1>;

myOrientationEventListener = new OrientationEventListener(this, SensorManager.SENSOR_DELAY_NORMAL) {
    @Override
    public void onOrientationChanged(int arg0) {
        orientation = (int) (orientation * (1f - update_rate) + arg0 * update_rate);
        Log.i("orientation", "orientation:" + orientation);
    }
};
Larger update_rate means the resulting data is less smooth, which should be intuitive: if update_rate == 1f, it falls back to your original code. Another note about update_rate is that it depends on the time interval between updates (related to sensor delay modes). You can probably tune this value to find one that works for you, but if you want to know exactly how it works, check the alpha value definition under Electronic low-pass filters -> Discrete-time realization.
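For reference, that discrete-time alpha works out to the following (a sketch; dt is the interval between sensor updates in seconds, RC the smoothing time constant you choose):
float update_rate = dt / (RC + dt); // larger dt or smaller RC means less smoothing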
I had a similar problem showing an artificial horizon on my device. The low pass filter (LPF) solved this issue.
However, you need to consider that when you use the orientation angle in degrees and apply the LPF to it blindly, the result is faulty when the device is in portrait mode and turned from left to right or vice versa. The reason for this is the jump between 359 and 0 degrees. Therefore I recommend converting the angle to radians and applying the LPF to the sin and cos values of the orientation angle.
Further, I recommend using a dynamic alpha or update rate for the LPF. A static alpha might be perfect on your device but not on others.
The following class filters based on radians and uses a dynamic alpha as described above:
import static java.lang.Math.*;

class Filter {

    private static final float TIME_CONSTANT = .297f;
    private static final float NANOS = 1000000000.0f;
    private static final int MAX = 360;

    private double alpha;
    private float timestamp;
    private float timestampOld;
    private int count;
    private int[] values;

    Filter() {
        timestamp = System.nanoTime();
        timestampOld = System.nanoTime();
        values = new int[0];
    }

    int filter(int input) {
        // there is no need to filter if we have only one value
        if (values.length == 0) {
            values = new int[] {0, input};
            return input;
        }
        // filter based on the last element from the array and the input
        int filtered = filter(values[1], input);
        // new array based on the previous result and the filter
        values = new int[] {values[1], filtered};
        return filtered;
    }

    private int filter(int previous, int current) {
        calculateAlpha();
        // convert to radians
        double radPrev = toRadians(previous);
        double radCurrent = toRadians(current);
        // filter based on sin & cos
        double sumSin = filter(sin(radPrev), sin(radCurrent));
        double sumCos = filter(cos(radPrev), cos(radCurrent));
        // calculate the resulting angle
        double radRes = atan2(sumSin, sumCos);
        // convert radians to degrees, round and normalize (modulo 360)
        long round = round(toDegrees(radRes));
        return (int) ((MAX + round) % MAX);
    }

    // dynamic alpha
    private void calculateAlpha() {
        timestamp = System.nanoTime();
        float diff = timestamp - timestampOld;
        double dt = 1 / (count / (diff / NANOS));
        count++;
        alpha = dt / (TIME_CONSTANT + dt);
    }

    private double filter(double previous, double current) {
        return (previous + alpha * (current - previous));
    }
}
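A possible way to wire this into the listener from the question (a sketch; it assumes a Filter field on the enclosing class):
private final Filter filter = new Filter();

myOrientationEventListener = new OrientationEventListener(this, SensorManager.SENSOR_DELAY_NORMAL) {
    @Override
    public void onOrientationChanged(int arg0) {
        int smoothed = filter.filter(arg0); // wrap-around-safe, smoothed degrees
        Log.i("orientation", "orientation:" + smoothed);
    }
};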
For further readings see this discussion.
I have an app that needs to detect shake whenever the user's screen is on. I've found plenty of examples of how to detect shake. The example below is the most intriguing so far, using Google code and adding in the gravity component. My question is: can this code be improved? Shake detection is pretty solid and I'm getting no false positives. I'm mostly concerned with improving battery life.
private static final int mMinimumForce = 5;
private static final int mShakeFrequency = 500;
private static final int mMovesRequired = 4;
private float[] mGravity = { 0.0f, 0.0f, 0.0f };
private float[] mAcceleration = { 0.0f, 0.0f, 0.0f };
private static final int mXAxis = 0;
private static final int mYAxis = 1;
private static final int mZAxis = 2;
private long mCurrentTime = 0;
private long mLastTime = 0;
private int mMoveCount = 0;
private final float mAlpha = 0.8f;
public void onSensorChanged(SensorEvent event)
{
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER)
    {
        return;
    }

    // Set linear acceleration
    // Gravity components of x, y, and z acceleration
    mGravity[mXAxis] = mAlpha * mGravity[mXAxis] + (1 - mAlpha) * event.values[mXAxis];
    mGravity[mYAxis] = mAlpha * mGravity[mYAxis] + (1 - mAlpha) * event.values[mYAxis];
    mGravity[mZAxis] = mAlpha * mGravity[mZAxis] + (1 - mAlpha) * event.values[mZAxis];

    // Linear acceleration of x, y, z with gravity effect removed
    mAcceleration[mXAxis] = event.values[mXAxis] - mGravity[mXAxis];
    mAcceleration[mYAxis] = event.values[mYAxis] - mGravity[mYAxis];
    mAcceleration[mZAxis] = event.values[mZAxis] - mGravity[mZAxis];

    // Set maximum linear acceleration amongst x, y, z
    float maxAcceleration = mAcceleration[mXAxis];
    if (mAcceleration[mYAxis] > maxAcceleration)
    {
        maxAcceleration = mAcceleration[mYAxis];
    }
    if (mAcceleration[mZAxis] > maxAcceleration)
    {
        maxAcceleration = mAcceleration[mZAxis];
    }

    // Process shake
    if (maxAcceleration > mMinimumForce)
    {
        Log.d(TAG, "Shake detected");
        mCurrentTime = System.currentTimeMillis();
        if (mLastTime == 0)
        {
            mLastTime = mCurrentTime;
        }
        long elapsedTime = mCurrentTime - mLastTime;
        if (elapsedTime > mShakeFrequency)
        {
            mLastTime = 0;
            mMoveCount = 0;
        }
        else
        {
            mMoveCount++;
            if (mMoveCount > mMovesRequired)
            {
                Log.d(TAG, "Shake moves detected: " + mMovesRequired);
                // do some work here
                mLastTime = 0;
                mMoveCount = 0;
            }
        }
    }
}
Google I/O docs have great information on all of your concerns. Here's one such document.
https://dl.google.com/io/2009/pres/W_0300_CodingforLife-BatteryLifeThatIs.pdf
Your point on floating point math is correct. While your code doesn't do much as far as calculations go, calling it constantly at high frequency could tax the CPU.
Your point on the accelerometer using battery is also valid. While each device is different regarding power consumption, and the accelerometer doesn't use close to what the gyroscope does, it will show a marked difference if used non-stop over a full day of active screen use.
I agree that coding to the highest standards and efficiency is good regardless of whether you see a marked difference in battery usage. It's just being a good citizen. If I can get 30 more minutes out of my phone, give it to me!!!
My suggestions on your code, which are really just reflective of Google's recommendations and many you're already speaking about.
Register your listener with the lowest possible polling rate.
Supplement point one with a check at the beginning of the callback that immediately returns when the poll rate is not yet satisfied: (currentTime - lastTime) > POLL_RATE (see the sketch after this list). This is important because Android may not adhere to the poll rate registered in the listener.
Favor integer math over floating point if you can.
Investigate whether you can use AlarmManager and/or other sensors first that are more cost effective before engaging the accelerometer. I don't know if this is possible in your case, but worth checking into.
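As a sketch of points one and two together (POLL_RATE_MS is an assumed constant; match it to the rate you registered):
private static final long POLL_RATE_MS = 500; // assumption: tune to your registered rate
private long mLastProcessed = 0;

@Override
public void onSensorChanged(SensorEvent event) {
    long now = System.currentTimeMillis();
    if (now - mLastProcessed < POLL_RATE_MS) {
        return; // Android delivered faster than requested; bail out cheaply
    }
    mLastProcessed = now;
    // ... existing shake detection code ...
}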
:D
I'm currently creating my first ever LibGDX game for Android.
I've run into an issue regarding the movement of my little player (which is a bucket ;)).
I want to make the bucket (player) only draggable along the x-axis, and not clickable to change position.
The y-axis won't be an issue, because the bucket can only change position on the x-axis.
So I basically want to make the bucket only draggable.
Sorry for my writing, English isn't my native language.
Here's the snippet from my code where the movement happens:
if (Gdx.input.isTouched()) {
    Vector3 touchPos = new Vector3();
    touchPos.set(Gdx.input.getX(), Gdx.input.getY(), 0);

    // Check the values
    float touchPosX = Gdx.input.getX();
    float touchPosY = Gdx.input.getY();
    String touchXString = Float.toString(touchPosX);
    String touchYString = Float.toString(touchPosY);
    Gdx.app.log("Movement - X", touchXString);
    //Gdx.app.log("Movement - Y", touchYString);
    Gdx.app.log("Movement - BucketX", bucketXString);
    //Gdx.app.log("Movement - BucketY", bucketYString);

    camera.unproject(touchPos);
    bucket.x = touchPos.x - 64 / 2;
}
The code is a bit messy...
If I understand correctly, what you want is for the bucket to move only when you are dragging it; right now it moves to whatever "x" you clicked/tapped, even if that "x" is far from the bucket, thus "teleporting" it.
To do this, first have a boolean, let's say btouched:
boolean btouched = false;
Then you will have to use an InputProcessor (instead of input polling) to check in the touchDown event whether the click/tap was inside the bucket; only if it was will the bucket move in the drag event:
MyInputProcessor inputProcessor;

public class MyInputProcessor implements InputProcessor {

    private final Vector3 touchPos = new Vector3(); // shared scratch vector

    @Override
    public boolean touchDown(int screenX, int screenY, int pointer, int button) {
        camera.unproject(touchPos.set(Gdx.input.getX(pointer), Gdx.input.getY(pointer), 0));
        // Rectangle.contains() takes x/y floats rather than a Vector3
        if (bucket.contains(touchPos.x, touchPos.y)) {
            btouched = true;
        } else {
            btouched = false;
        }
        return false;
    }

    @Override
    public boolean touchDragged(int screenX, int screenY, int pointer) {
        camera.unproject(touchPos.set(Gdx.input.getX(pointer), Gdx.input.getY(pointer), 0));
        if (btouched) {
            bucket.x = touchPos.x - 64 / 2;
        }
        return false;
    }

    ...
}
Don't forget to set the input processor:
inputProcessor = new MyInputProcessor();
Gdx.input.setInputProcessor(inputProcessor);
Is it possible to slow the accelerometer update frequency to 1 Hz, and if so, how?
I've tried it on a Nexus 7 tablet: I changed SENSOR_DELAY_NORMAL to 1,000,000 but nothing changed.
Thank you!
Here is the code:
mAccelerometer.registerListener(listener, mAccelerometer.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), 1000000);
This is how I get 1Hz acceleration:
static int ACCE_FILTER_DATA_MIN_TIME = 1000; // 1000 ms
long lastSaved = System.currentTimeMillis();

@Override
public void onSensorChanged(SensorEvent event) {
    if ((System.currentTimeMillis() - lastSaved) > ACCE_FILTER_DATA_MIN_TIME) {
        lastSaved = System.currentTimeMillis();
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
    }
}
You can always disregard all measurements except one per second. Or do you want to save battery?
Okay, I have found the solution:
public static final double accFreq = 15; // 15 events = roughly 1 sec

long nowA = 0;
long timeA = 0;
int tempA = 0;

public SensorEventListener listener = new SensorEventListener() {

    long tSA;

    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            float x = event.values[0];
            float y = event.values[1];
            float z = event.values[2];
            // Get timestamp of the event
            tSA = event.timestamp;
            if (nowA != 0) {
                tempA++;
                if (tempA == accFreq) {
                    timeA = tSA - nowA;
                    tempA = 0;
                    Log.e("Accelerometer:", "" + x + " " + y + " " + z);
                }
            }
            if (tempA == 0) {
                nowA = tSA;
            }
        }
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
};
This is the only way I can get it to work at 1 Hz!
A custom value for the sensor rate doesn't work well.
If you take a look into the SensorManager class, you will find the getDelay() function in every registerListener() method:
public boolean registerListener(SensorEventListener listener, Sensor sensor, int rateUs,
        int maxBatchReportLatencyUs) {
    int delay = getDelay(rateUs);
    return registerListenerImpl(listener, sensor, delay, null, maxBatchReportLatencyUs, 0);
}
getDelay() itself is defined in the same class:
private static int getDelay(int rate) {
    int delay = -1;
    switch (rate) {
        case SENSOR_DELAY_FASTEST:
            delay = 0;
            break;
        case SENSOR_DELAY_GAME:
            delay = 20000;
            break;
        case SENSOR_DELAY_UI:
            delay = 66667;
            break;
        case SENSOR_DELAY_NORMAL:
            delay = 200000;
            break;
        default:
            delay = rate;
            break;
    }
    return delay;
}
So normally it should work to set a custom value.
But if you try to use custom values above about 100 milliseconds (100,000 microseconds), the measured time between events can jump up to 1000 milliseconds and stay there. If you go for 1000, 2000 or much higher milliseconds, the time between events still stays around 1000 milliseconds.
You can use SENSOR_DELAY_NORMAL (= 200 milliseconds) and catch the event only if enough time has passed, by checking the elapsed time every time (like David did).
If you need to go lower than that, you should not use this energy-draining approach; instead, register the sensor listener, catch the value, and unregister it each time.
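A rough sketch of that register/sample/unregister cycle (the names and the 5-second period are assumptions):
private final Handler handler = new Handler();
private final Runnable sampler = new Runnable() {
    @Override
    public void run() {
        // register for one reading; unregister again inside onSensorChanged
        sensorManager.registerListener(listener, accelerometer,
                SensorManager.SENSOR_DELAY_NORMAL);
        handler.postDelayed(this, 5000); // e.g. one sample every 5 seconds
    }
};

// kick off sampling once, e.g. in onResume():
// handler.post(sampler);

// inside onSensorChanged, after reading event.values:
// sensorManager.unregisterListener(this);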
With the current approach the application is still getting events every 60 ms. Just because it discards 14 out of 15 events does NOT reduce the battery consumption: the sensor hardware still needs to process data once every 60 ms, and the system receives it and sends it to the application, i.e. NOT much battery saving.
(nitpick alert) To be accurate with the above method, one needs to pick one sample out of every 16 or 17 events, as events arrive roughly every 60 ms on average.
The device freezing upon registering multiple SensorEventListeners at SENSOR_DELAY_NORMAL is NOT an issue with the Android framework (which can easily handle the scenario). It indicates an issue with how the application is written. The first thing to look for is CPU-intensive code in the main UI thread. Follow the developer guidelines to keep the application responsive with multiple SensorEventListeners registered at SENSOR_DELAY_NORMAL and multiple worker threads performing the CPU-intensive calculations.
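One way to keep those listeners off the main UI thread is the registerListener() overload that takes a Handler, so events are delivered on a worker thread (a sketch; the thread name is arbitrary):
HandlerThread sensorThread = new HandlerThread("SensorThread");
sensorThread.start();
Handler sensorHandler = new Handler(sensorThread.getLooper());

// events now arrive on sensorThread instead of the UI thread
sensorManager.registerListener(listener, accelerometer,
        SensorManager.SENSOR_DELAY_NORMAL, sensorHandler);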
After going through the lovely set of Java tutorials and a while spent buried in source code, I'm beginning to get a feel for it. For my next step, I will dive in with a fully featured application with graphics, sound, sensor use, touch response and a full menu.
My grand idea was to make an animated screwdriver where sliding the controls up and down modulates the frequency, and that frequency dictates the sensor data it returns.
Now I have a semi-working sound system, but it's pretty poor for what it's designed to represent, and I just wouldn't be happy producing a sub-par end product, whether it's my first or not.
The problem:
sound must begin looping when the user presses down on the control
the sound must stop when the user releases the control
when moving the control up or down the sound effect must change pitch accordingly
if the user doesn't remove their finger before backing out of the application, it must plate the casing of their device with gold (Easter egg ;P)
Now, I'm aware of how monolithic the first three look, and that's why I would really appreciate any help I can get.
Sorry for how bad this code looks, but my general plan is to create the functional components and then refine the code later; no good painting the walls if the roof's not finished.
Here's my user input; the setSlide stuff is used in the graphics for the control:
@Override
public boolean onTouchEvent(MotionEvent event)
{
    // motion event for the screwdriver view
    if (event.getAction() == MotionEvent.ACTION_DOWN)
    {
        // make sure the user's at least trying to touch the slider
        if (event.getY() > SonicSlideYTop && event.getY() < SonicSlideYBottom)
        {
            // power setup; I'm using 1.5 to help out the rate on SoundPool since it likes 0.5 to 1.5
            SonicPower = 1.5f - ((event.getY() - SonicSlideYTop) / SonicSlideLength);
            // just goes into a method which sets a private variable in my sound pool class thing
            mSonicAudio.setPower(1, SonicPower);
            // this handles the slide's graphics
            setSlideY((int) event.getY());
            // this is from my latest attempt at loop pitch change; look for this in my soundPool class
            mSonicAudio.startLoopedSound();
        }
    }
    if (event.getAction() == MotionEvent.ACTION_MOVE)
    {
        if (event.getY() > SonicSlideYTop && event.getY() < SonicSlideYBottom)
        {
            SonicPower = 1.5f - ((event.getY() - SonicSlideYTop) / SonicSlideLength);
            mSonicAudio.setPower(1, SonicPower);
            setSlideY((int) event.getY());
        }
    }
    if (event.getAction() == MotionEvent.ACTION_UP)
    {
        mSonicAudio.stopLoopedSound();
        SonicPower = 1.5f - ((event.getY() - SonicSlideYTop) / SonicSlideLength);
        mSonicAudio.setPower(1, SonicPower);
    }
    return true;
}
And here's where those methods end up in my sound pool class. It's horribly messy, but that's because I've been trying a ton of variants to get this to work. You will also notice that I begin to hard-code the index; again, I was trying to get the methods working before making them work well.
package com.mattster.sonicscrewdriver;

import java.util.HashMap;
import android.content.Context;
import android.media.AudioManager;
import android.media.SoundPool;

public class SoundManager
{
    private float mPowerLvl = 1f;
    private SoundPool mSoundPool;
    private HashMap<Integer, Integer> mSoundPoolMap;
    private AudioManager mAudioManager;
    private Context mContext;
    private int streamVolume;
    private int LoopState;
    private long mLastTime;

    public SoundManager()
    {
    }

    public void initSounds(Context theContext)
    {
        mContext = theContext;
        mSoundPool = new SoundPool(2, AudioManager.STREAM_MUSIC, 0);
        mSoundPoolMap = new HashMap<Integer, Integer>();
        mAudioManager = (AudioManager) mContext.getSystemService(Context.AUDIO_SERVICE);
        streamVolume = mAudioManager.getStreamVolume(AudioManager.STREAM_MUSIC);
    }

    public void addSound(int index, int SoundID)
    {
        mSoundPoolMap.put(1, mSoundPool.load(mContext, SoundID, 1));
    }

    public void playUpdate(int index)
    {
        if (LoopState == 1)
        {
            long now = System.currentTimeMillis();
            if (now > mLastTime)
            {
                mSoundPool.play(mSoundPoolMap.get(1), streamVolume, streamVolume, 1, 0, mPowerLvl);
                mLastTime = System.currentTimeMillis() + 250;
            }
        }
    }

    public void stopLoopedSound()
    {
        LoopState = 0;
        mSoundPool.setVolume(mSoundPoolMap.get(1), 0, 0);
        mSoundPool.stop(mSoundPoolMap.get(1));
    }

    public void startLoopedSound()
    {
        LoopState = 1;
    }

    public void setPower(int index, float mPower)
    {
        mPowerLvl = mPower;
        mSoundPool.setRate(mSoundPoolMap.get(1), mPowerLvl);
    }
}
I almost forgot: that looks pretty ineffective, but I omitted the thread which actually updates it; nothing fancy, it just calls:
mSonicAudio.playUpdate(1);
There are some confusing points in there which I think are just cut-and-paste issues from getting the source into this page, but assuming you're not having problems with your onTouchEvent handling, my random comments are:
It looks like you are calling play() every 250 ms while the touch is held. I can't see the loop argument to the play() call, but I assume it is -1. If so, then you are launching a brand new looped sound every 250 ms (play() returns a unique stream ID for every audio stream you create).
I think you wanted to modify the pitch and amplitude of a single existing stream. So I think you wanted something like this:
int mySoundStreamId = 0;
...
onDown()
    if (mySoundStreamId == 0) {
        // create the one true stream
        mySoundStreamId = mySoundPool.play( initial power and freq modifiers, loop = -1 );
    } else {
        // resume the one true stream
        mySoundPool.resume(mySoundStreamId); // note: a STREAM id is NOT a SOUND id
    }

onUp()
    if (mySoundStreamId != 0) {
        // pause the one true stream
        mySoundPool.pause(mySoundStreamId); // stop() would release all the samples you held
    }

onWiggle()
    if (mySoundStreamId != 0) {
        // modify parameters of the one true stream
        mySoundPool.setPitch(mySoundStreamId, newPitch); // too lazy to look up the real soundPool command
    }

onGameOver()
    if (mySoundStreamId != 0) {
        // shut down and release the samples of the one true stream
        mySoundPool.setLoop(mySoundStreamId, 0); // otherwise some phones will keep looping after shutdown
        mySoundPool.stop(mySoundStreamId);       // no resume possible after this; need to reload samples
        mySoundStreamId = 0;
    }
I omit the creation/destruction of the sound pool itself; it sounds like you were successfully loading the sound data into the sound pool.
Note that LOAD returns a SOUND ID, which you pass to the PLAY command,
but PLAY returns a STREAM ID, which you use in most of the other SoundPool methods.
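In concrete SoundPool terms, a sketch of that distinction (R.raw.whir and the rate values are placeholders, and the "too lazy to look up" call above is actually setRate()):
int soundId = soundPool.load(context, R.raw.whir, 1);       // SOUND id, from load()
int streamId = soundPool.play(soundId, 1f, 1f, 1, -1, 1f);  // STREAM id, loop = -1
soundPool.setRate(streamId, 1.2f); // change pitch of the running stream (0.5 to 2.0)
soundPool.pause(streamId);         // can be resumed later
soundPool.resume(streamId);
soundPool.stop(streamId);          // cannot resume after this; call play() again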
Of course, I have my own problems with 'resume' on a looped sound, so take what I say with a grain of salt :-)
Good luck! I guess I should have checked the time stamp. My apologies if you posted this 3 years ago :-)