I have an app that stores sensor data in a database every 10 ms, but the signal itself has some very sharp peaks.
So the signal doesn't look noisy as such; it's just that there aren't enough data points around each peak to smooth it out.
I know I can apply a filter to smooth the signal, but I'm wondering whether I'm doing something wrong when polling the sensors or storing into the database that causes this behaviour.
I would have thought that a short polling rate like 10 ms would give me more data around the peaks to help smooth them out, but reducing that POLL_FREQUENCY value doesn't seem to give me smoother data, and I'm not sure why.
I'm collecting the data in a service:
public void onSensorChanged(SensorEvent event) {
    sensor = event.sensor;
    int i = sensor.getType();

    if (i == MainActivity.TYPE_ACCELEROMETER) {
        accelerometerMatrix = event.values;
    } else if (i == MainActivity.TYPE_GYROSCOPE) {
        gyroscopeMatrix = event.values;
    } else if (i == MainActivity.TYPE_GRAVITY) {
        gravityMatrix = event.values;
    } else if (i == MainActivity.TYPE_MAGNETIC) {
        magneticMatrix = event.values;
    }

    long curTime = System.currentTimeMillis();
    long diffTime = curTime - lastUpdate;

    // only allow one update every POLL_FREQUENCY (10 ms)
    if (diffTime > POLL_FREQUENCY) {
        lastUpdate = curTime;

        // insert into database on a background thread;
        // the runnable reads accelerometerMatrix and inserts it as a new row
        try {
            executor.execute(insertHandler);
        } catch (SQLException e) {
            Log.e(TAG, "insertData: " + e.getMessage(), e);
        }
    }
}
So all this code does is wait for a sensor change event, detect its type (accelerometer, gyroscope, etc.), store the values into the appropriate matrix, and, if more than 10 ms has elapsed, store the matrix values in the database.
Can anyone suggest why the peaks are sharp despite the short poll frequency (10 ms)?
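For what it's worth, the effect can be reproduced offline with a synthetic signal (plain Java; the Gaussian peak and its 15 ms width are invented purely for illustration): when a peak is only a few tens of milliseconds wide, a 10 ms sampling period puts just a handful of samples on the peak itself, so it looks sharp regardless of the polling code.

```java
public class PeakSampling {
    // A hypothetical narrow peak: a Gaussian 15 ms wide (sigma), centred at t = 500 ms.
    static double signal(double tMs) {
        double d = tMs - 500.0;
        return Math.exp(-(d * d) / (2.0 * 15.0 * 15.0));
    }

    public static void main(String[] args) {
        int periodMs = 10;
        int aboveHalfMax = 0;

        // Sample one second of signal at the 10 ms poll period.
        for (int t = 0; t <= 1000; t += periodMs) {
            if (signal(t) > 0.5) {
                aboveHalfMax++;
            }
        }

        // Only a few samples land above the peak's half-maximum, so the
        // peak is rendered as a sharp spike no matter how it is stored.
        System.out.println(aboveHalfMax); // 3
    }
}
```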
Related
I am writing an app to save sensor data to a file. The goal is to log IMU data at 100 Hz.
I use an AsyncTask for the storage part. Everything seems fine, but when I look at the values in the file, many rows are written multiple times. Do you have any ideas?
@Override
public final void onSensorChanged(SensorEvent event) {
    //timestamp = (new Date()).getTime() + (event.timestamp - System.nanoTime()) / 1000000L;
    timestamp = new Date().getTime();

    // Handle an accelerometer reading
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        bufferData[0] = event.values[0];
        bufferData[1] = event.values[1];
        bufferData[2] = event.values[2];
    }
    // Handle a gyro reading
    else if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
        bufferData[3] = event.values[0];
        bufferData[4] = event.values[1];
        bufferData[5] = event.values[2];
    }

    save_IMU save_imu = new save_IMU();
    save_imu.execute();
}
An AsyncTask is not a good choice for this problem.
Your AsyncTask reads shared data from the bufferData array, which is overwritten every time a sensor value changes. The AsyncTask does not start at the moment you call execute(); it is put into a queue and waits for a free thread, so some time can pass before it runs. Writing to storage can also be slower than the sensor rate, so the queue fills up, and several tasks can run between two onSensorChanged() calls. They then write multiple identical rows, which is what you are seeing.
My suggestion:
Create a queue/buffer for measured values and add new items in onSensorChanged().
Use a Handler to periodically (e.g. every 250 ms) check the buffer and write the accumulated items to disk (several per run). If you don't want to experiment with handlers, this can also be done with an AsyncTask.
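A minimal sketch of that queue-and-flush pattern in plain Java (class and method names here are hypothetical; on Android, drainBatch() would be triggered every ~250 ms by a Handler rather than called directly):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class SensorBuffer {
    // Bounded queue shared between the sensor callback (producer)
    // and the periodic writer (consumer).
    private final BlockingQueue<float[]> queue = new ArrayBlockingQueue<>(1024);

    // Called from onSensorChanged: clone the values so that later
    // sensor events cannot overwrite what has already been queued.
    public void onSample(float[] values) {
        queue.offer(values.clone()); // drops the sample if the queue is full
    }

    // Called periodically (e.g. from a Handler every 250 ms):
    // drain everything accumulated so far into one batch.
    public List<float[]> drainBatch() {
        List<float[]> batch = new ArrayList<>();
        queue.drainTo(batch);
        return batch; // write this whole batch to the file in one go
    }
}
```

The clone() inside onSample() is the key point: it is what prevents the duplicated rows, because each queued entry is an independent snapshot rather than a reference to the shared bufferData array.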
bufferData should be a local variable of the onSensorChanged() method, and you should pass the buffer to the AsyncTask with new save_IMU(bufferData);.
I'm trying to utilize the maximum FIFO size of the accelerometer on a Nexus 6:
SensorManager sensorManager =
        (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor sensor = sensorManager.getDefaultSensor(typeAccelerometer);
Log.e("test", "Max delay: " + sensor.getMaxDelay()
        + " - Fifo count " + sensor.getFifoReservedEventCount());
// prints "Max delay: 1000000" (1 second) and "Fifo count 10000"

// Register the listener for this sensor in batch mode.
// The following code reports every 190 ms when the screen is ON,
// and every 10 seconds when the screen is OFF. I always want every 10 seconds.
final boolean batchMode = sensorManager.registerListener(
        mListener, sensor, 1000000 /* 1 second */, 10000000 /* 10 seconds */);
private final SensorEventListener mListener = new SensorEventListener() {
    long lastTimeStamp;

    @Override
    public void onSensorChanged(SensorEvent event) {
        long current = System.currentTimeMillis();
        long time = current - lastTimeStamp;
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            Log.e("test", "New ACCELERO -> " + time + "ms -> "
                    + (int) event.values[0] + " -> "
                    + (int) event.values[1] + " -> "
                    + (int) event.values[2]);
            lastTimeStamp = current;
        }
    }

    @Override
    public void onAccuracyChanged(Sensor s, int accuracy) {
    }
};
When the screen is ON, I get events every 190 ms. When the screen is OFF, it obeys the 10-second latency.
How do I utilize the full batching FIFO (with delayed delivery) while the screen is ON, for minimal CPU impact?
The code above is fine. On the Nexus 6, the accelerometer doesn't batch as long as the screen is ON. All other sensors batch fine.
sensor.getFifoReservedEventCount() returns the number of events the FIFO can store; I don't think it's a time in ms.
Note the third argument to registerListener: it is the sampling period in microseconds, so 1000000 does correspond to 1 second, as your comment says:
final boolean batchMode = sensorManager.registerListener(
        mListener, sensor, 1000000 /* microseconds */, 10000000);
Instead of a raw period, it can also be one of the following constants:
SENSOR_DELAY_NORMAL
SENSOR_DELAY_UI
SENSOR_DELAY_GAME
SENSOR_DELAY_FASTEST
Source: https://developer.android.com/reference/android/hardware/SensorManager.html#registerListener(android.hardware.SensorEventListener,%20android.hardware.Sensor,%20int,%20int)
Quick question: I am developing a top-down 2D platformer with lots of enemies on the map (at least a hundred spawn at the start of each level). Each enemy uses an AI that searches the map for objects with a specified tag, sorts them into a list by distance, then reacts to the object closest to it.
My code works, but if the machine the game runs on is slow, the game lags. I want to be able to port the game to Android and iOS on low-end specs.
To put less strain on the CPU, is there a better way to write my AI?
Here is my code:
void Start () {
    FoodTargets = new List<Transform>(); // my list
    SelectedTarget = null;               // the target the enemy reacts to
    myTransform = transform;
    AddAllFood ();
}

public void AddAllFood()
{
    GameObject[] Foods = GameObject.FindGameObjectsWithTag("Object");
    foreach (GameObject enemy in Foods)
        AddTarget (enemy.transform);
}

public void AddTarget(Transform enemy)
{
    // classrating is an attribute each enemy has that determines its identity
    // (e.g. whether it is a plant, a herbivore or a carnivore)
    if (enemy.GetComponent<ClassRatingScript>().classrating != 1) {
        FoodTargets.Add (enemy); // adds the object to the list
    }
}

// this is how I sort by distance -- is this the fastest and most efficient way?
private void SortTargetsByDistance()
{
    FoodTargets.Sort (delegate(Transform t1, Transform t2) {
        return Vector3.Distance(t1.position, myTransform.position)
            .CompareTo(Vector3.Distance(t2.position, myTransform.position));
    });
}

private void TargetEnemy() // called every few frames (see below)
{
    SortTargetsByDistance ();
    SelectedTarget = FoodTargets [1];
}

// optimizer increments every frame and resets to 0 on the third frame,
// so TargetEnemy() only runs every third frame
if (optimizer <= 2) {
    optimizer++;
} else {
    TargetEnemy ();

    // the rest are attributes the AI considers when reacting to its target
    targetmass = SelectedTarget.GetComponent<MassScript> ().mass;
    targetclass = SelectedTarget.GetComponent<ClassRatingScript> ().classrating;
    mass = this.GetComponent<MassScript> ().mass;
    classrating = this.GetComponent<ClassRatingScript> ().classrating;
    distance = Vector3.Distance (transform.position, SelectedTarget.transform.position);
    optimizer = 0;
}
Is there a more optimized way of doing this? Your help will be much appreciated. Thanks in advance!
I'm not terribly familiar with C# or Unity, but I would look carefully at what algorithm your sorting method uses. If all you want is the closest game object, sorting isn't necessary.
The fastest general-purpose sorting algorithms, such as Quicksort, are O(n log n): the time to sort n objects is bounded by some constant multiple of n log n. If you just want the k closest objects, where k << n, you can instead perform k passes of the Bubble Sort algorithm. This has time complexity O(k*n), which is much better than a full sort.
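To make the k-passes idea concrete, here is a sketch (in Java, with hypothetical names, operating on raw distances rather than Unity Transforms): each pass bubbles the next-smallest remaining value to the front, so after k passes the k smallest distances occupy positions 0..k-1.

```java
public class PartialSort {
    // Perform k passes of bubble sort so the k smallest values
    // end up in positions 0..k-1. Time complexity: O(k * n).
    public static void partialBubbleSort(double[] a, int k) {
        int n = a.length;
        for (int pass = 0; pass < k; pass++) {
            // Bubble the next-smallest remaining value into position `pass`.
            for (int i = n - 1; i > pass; i--) {
                if (a[i] < a[i - 1]) {
                    double tmp = a[i];
                    a[i] = a[i - 1];
                    a[i - 1] = tmp;
                }
            }
        }
    }

    public static void main(String[] args) {
        double[] distances = {5.0, 1.0, 4.0, 2.0, 3.0};
        partialBubbleSort(distances, 2);
        // The two closest distances are now at the front.
        System.out.println(distances[0] + ", " + distances[1]); // 1.0, 2.0
    }
}
```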
However, if you only need the single closest object, just find it with one linear scan, no sorting at all (a C# sketch, reusing the FoodTargets and myTransform names from your code):
float smallestDistance = Mathf.Infinity;
Transform closestObject = null;
foreach (Transform t in FoodTargets) {
    float d = Vector3.Distance(t.position, myTransform.position);
    if (d < smallestDistance) {
        smallestDistance = d;
        closestObject = t;
    }
}
This extremely simple scan has time complexity O(n).
I am reading data from a sensor into two static double[] arrays called gain and phase. Then, depending on which button the user pressed to start collecting data from the sensor, I save this data to other double[] arrays.
For example:
if (What_Button == 1) {
    oGain = gain;
    oPhase = phase;
    output.setText("OPEN saved");
}
if (What_Button == 2) {
    sGain = gain;
    sPhase = phase;
    output.setText("SHORT saved");
}
if (What_Button == 3) {
    lGain = gain;
    lPhase = phase;
    output.setText("LOAD saved");
}
I then wish to plot the original gain and phase data. Before doing so, I convert the gain to dB and the phase to degrees, i.e.:
for (int i = 0; i < _steps; i++) {
    phase[i] = Math.toDegrees(phase[i]);
    gain[i] = 20 * Math.log10(gain[i]);
}
The plotting works fine, but after gain and phase have been converted, my saved data (lGain, lPhase, etc.) are changed too. It is as if they were instantly reassigned to the new gain and phase values. I surrounded the code above with System.out.println calls to view lGain, lPhase, etc. before and after, and this is certainly where they change. I used Ctrl-F to find all instances of lGain, lPhase, etc., and they are not reassigned anywhere else. Any ideas how to fix this?
You are copying the gain and phase arrays by reference rather than by value.
Look at the Arrays.copyOf documentation for a static method that copies an array. Something like:
oGain = Arrays.copyOf(gain, gain.length);
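A minimal, self-contained illustration of the difference (plain Java; the variable names are hypothetical): the assignment lGain = gain makes both names refer to the same array object, so the in-place dB conversion changes the "saved" data too, while Arrays.copyOf produces an independent array.

```java
import java.util.Arrays;

public class CopyDemo {
    public static void main(String[] args) {
        double[] gain = {1.0, 2.0, 3.0};

        // Aliasing: both names refer to the same array object.
        double[] lGain = gain;

        // Defensive copy: a new, independent array.
        double[] lGainCopy = Arrays.copyOf(gain, gain.length);

        // Mutate "gain" in place, as the dB conversion loop does.
        gain[0] = 20 * Math.log10(gain[0]); // 20 * log10(1.0) == 0.0

        System.out.println(lGain[0]);     // 0.0 -- the alias changed too
        System.out.println(lGainCopy[0]); // 1.0 -- the copy is unaffected
    }
}
```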
When the Android speech-to-text functionality translates audio to text, is it possible to determine the confidence level of the recognized text? For example, if someone speaks too far from the mic and the device picks up distorted sound, would it output the recognized text along with a low confidence score to indicate it isn't sure how accurate that particular result is?
If you are implementing RecognitionListener, examine this snippet from my onResults method:
@Override
public void onResults(Bundle results) {
    String LOG = "SpeechRecognizerActivity";
    Log.d(LOG, "onResults");

    ArrayList<String> strlist = results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
    float[] confidence = results.getFloatArray(SpeechRecognizer.CONFIDENCE_SCORES);

    for (int i = 0; i < strlist.size(); i++) {
        Log.d(LOG, "result=" + strlist.get(i));
    }
    Log.d(LOG + " result", strlist.get(0));

    if (confidence != null) {
        if (confidence.length > 0) {
            Log.d(LOG + " confidence", String.valueOf(confidence[0]));
        } else {
            Log.d(LOG + " confidence", "score not available");
        }
    } else {
        Log.d(LOG, "confidence not found");
    }
}
You won't see anything unless you add this to your recognizer intent:
iSpeechIntent.putExtra(RecognizerIntent.EXTRA_CONFIDENCE_SCORES, true);
Yes. In the returned Bundle, there's a float array called CONFIDENCE_SCORES. From the docs:
Key used to retrieve a float array from the Bundle passed to the onResults(Bundle) and onPartialResults(Bundle) methods. The array should be the same size as the ArrayList provided in RESULTS_RECOGNITION, and should contain values ranging from 0.0 to 1.0, or -1 to represent an unavailable confidence score.
Confidence values close to 1.0 indicate high confidence (the speech recognizer is confident that the recognition result is correct), while values close to 0.0 indicate low confidence.
This value is optional and might not be provided.
Please note that it is not guaranteed to be there. Check for it and use it if present; gamble if not.