I want to emulate a user's walking and count their steps for automated testing.
I tried to search for a solution, but only found how to simulate the location.
It's pretty easy, since in reality this sensor returns a float describing the number of steps taken by the user since the last reboot while activated.
So the easiest implementation is a method that generates a random float within some realistic constraints (between 1 and 9999 steps):
// Returns a pseudo-random value in the range [1, 9999), mimicking the
// cumulative count reported by Sensor.TYPE_STEP_COUNTER.
public float generateStepsCount() {
    float minVal = 1.0f;
    float maxVal = 9999.0f;
    Random rand = new Random(); // java.util.Random
    return rand.nextFloat() * (maxVal - minVal) + minVal;
}
PS: TYPE_STEP_COUNTER has been available since API level 19.
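For example, in a plain JUnit test the generated value could be fed straight into whatever component consumes the raw sensor reading. A minimal sketch, where StepTracker and its methods are hypothetical stand-ins for your own class under test:
// Hypothetical test: the generated value plays the role of the cumulative
// reading that TYPE_STEP_COUNTER would deliver on a real device.
@Test
public void tracksSimulatedSteps() {
    float simulatedTotal = generateStepsCount();
    StepTracker tracker = new StepTracker();        // your own class under test
    tracker.onStepCounterReading(simulatedTotal);   // hypothetical entry point
    assertEquals(simulatedTotal, tracker.totalSteps(), 0.001f);
}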
I'm an experienced native iOS developer making my first foray into Android through Unity. I'm trying to set up a custom shader, but I'm having some trouble with the Normal maps. I've got them working perfectly in the Unity simulator on my computer, but when I build to an actual device (Samsung Galaxy S8+), the Normal maps don't work at all.
I'm using Mars as my test case. Here's the model running in the simulator on my computer:
And here's a screenshot from my device, running exactly the same code.
I've done a LOT of research, and apparently using normal maps on Android with Unity is not an easy thing. There are a lot of people asking about it, but almost every answer I've found says the trick is to override the texture import settings and force them to "Truecolor", which seems to be "RGBA 32 Bit" according to Unity's documentation. This hasn't helped me, though.
Another thread suggested reducing the Aniso Level to zero, and another suggested turning off Mip Maps. I don't know what either of those are, but neither helped.
Here's my shader code, simplified but containing all references to Normal mapping:
void surf (Input IN, inout SurfaceOutputStandard o) {
    // Sample the albedo and the normal map, each with its own UV set.
    half4 d = tex2D(_MainTex, IN.uv_MainTex);
    half4 n = tex2D(_BumpMap, IN.uv_BumpMap);
    o.Albedo = d.rgb;
    // UnpackNormal decodes the platform-specific normal map encoding.
    o.Normal = UnpackNormal(n);
    o.Metallic = 0.0;
    o.Smoothness = 0.0;
}
I've seen some threads suggesting replacements for the "UnpackNormal()" function in the shader code, indicating that it might not be the thing to do on Android or mobile in general, but none of the suggested replacements have changed anything for better or worse: the normal maps continue to work in the simulator, but not on the device.
I've even tried making my own normal maps programmatically from a grayscale heightmap, to try to circumvent any import settings I may have done wrong. Here's the code I used, and again it works in the simulator but not on the device.
// Builds a tangent-space normal map from a grayscale heightmap by taking
// horizontal and vertical differences of neighbouring pixels.
public Texture2D NormalMap(Texture2D source, float strength = 10.0f) {
    Texture2D normalTexture = new Texture2D(source.width, source.height, TextureFormat.RGBA32, false, true);
    for (int y = 0; y < source.height; y++) {
        for (int x = 0; x < source.width; x++) {
            // Sample the four neighbours (GetPixel wraps or clamps at the edges
            // according to the texture's wrap mode).
            float xLeft  = source.GetPixel(x - 1, y).grayscale * strength;
            float xRight = source.GetPixel(x + 1, y).grayscale * strength;
            float yUp    = source.GetPixel(x, y - 1).grayscale * strength;
            float yDown  = source.GetPixel(x, y + 1).grayscale * strength;
            // Remap the slopes from roughly [-1, 1] to [0, 1].
            float xDelta = ((xLeft - xRight) + 1) * 0.5f;
            float yDelta = ((yUp - yDown) + 1) * 0.5f;
            normalTexture.SetPixel(x, y, new Color(xDelta, yDelta, 1.0f, yDelta));
        }
    }
    normalTexture.Apply();
    return normalTexture;
}
Lastly, in the Build Settings, I've got the Platform set to Android and I've tried it using Texture Compression set to both "Don't Override" and "ETC (default)". The former was the original setting and the latter seemed to be Unity's suggestion both by the name and in the documentation.
I'm sure there's just some flag I haven't checked or some switch I haven't flipped, but I can't for the life of me figure out what I'm doing wrong here, or why there would be such a stubborn difference between the simulator and the device.
Can anyone help a Unity newbie out, and show me how these damn Normal maps are supposed to work on Android?
Check under:
Edit -> Project Settings -> Quality
Android is usually set to Fastest.
Recently, I encountered a float-corruption problem on certain Android devices. I was wondering whether any of you have encountered a similar problem, and whether you have a way to reproduce it with a simplified code block.
I encountered the problem on Nexus 5 devices. It doesn't happen in the Genymotion emulator.
It only happens in one particular for-loop code block, and it is extremely difficult to reproduce in other code blocks.
My situation is as follows:
float rectangleWidth2 = 0.0f;
float startX = (float) (left + xPixelsPerUnit * (xValue - minX));
float stopX = startX;
float _left = startX - rectangleWidth2;
float _right = stopX + rectangleWidth2;
// I expect "_left" and "_right" to have the same value. However, at this point
// "_right" becomes an arbitrary large value, something like 5.3482353354E20,
// even though I expect both "_left" and "_right" to be within [-1000, 1000].
If I change the code to
float _left = startX - rectangleWidth2;
float _right = startX + rectangleWidth2;
// "_left" and "_right" will then have the same value.
A "realiable" workaround for my case, is to avoid using float as suggested by reporter. I use double whenever possible, and only perform necessary float casting, when there is a need.
Same problem occur, regardless I'm using Eclipse or Android Studio. I'm going to get Nexus 4 this week, to see whether same problem occur still...
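For illustration, a minimal sketch of that workaround applied to the snippet above (the surrounding variables left, xPixelsPerUnit, xValue and minX are assumed to exist as in the original code):
double rectangleWidth2 = 0.0;
double startX = left + xPixelsPerUnit * (xValue - minX);
double stopX = startX;
// Do the arithmetic in double precision and cast to float only at the point
// where a float is actually required (e.g. when calling a drawing API).
float _left = (float) (startX - rectangleWidth2);
float _right = (float) (stopX + rectangleWidth2);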
Referenced Links
https://code.google.com/p/android/issues/detail?id=58698
http://www.gamedev.net/topic/660746-problem-with-random-float-value-on-android/
I am working on an application which requires me to handle the fling process manually rather than handing it to the framework. What I want to achieve is basically to calculate the number of pixels a ListView moves when it receives a fling action. As the scroll callback already provides the distance in the form of a delta, I have handled that easily. But is there a way to get the fling distance, given that only a velocity parameter is passed in the super method?
Note: I have to move another view in accordance with the fling distance, so I need to get it simultaneously, just like onScroll provides it.
Thanks.
Three years have passed with no answer yet, but I found a workaround to achieve this.
It is actually kind of an advanced topic, as there are a lot of nuances, but basically you can refer to the Android source code (the OverScroller class in particular) and use this method. You will need to copy it into your own class:
// Estimates the total fling distance in pixels for a given start velocity.
// DECELERATION_RATE, mFlingFriction, PHYSICAL_COEF and getSplineDeceleration()
// also need to be copied over from OverScroller.
private double getSplineFlingDistance(int velocity) {
    final double l = getSplineDeceleration(velocity);
    final double decelMinusOne = DECELERATION_RATE - 1.0;
    return mFlingFriction * PHYSICAL_COEF * Math.exp(DECELERATION_RATE / decelMinusOne * l);
}
Other methods and values can be obtained from the same class.
The link to the source code: https://android.googlesource.com/platform/frameworks/base/+/jb-release/core/java/android/widget/OverScroller.java
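For reference, the supporting pieces look roughly like this in the linked jb-release source (ppi stands for the screen density in pixels per inch; the exact constants may differ between Android versions, so treat this as a sketch rather than a verbatim copy):
private static final float INFLEXION = 0.35f; // tension lines cross at (INFLEXION, 1)
private static final float DECELERATION_RATE = (float) (Math.log(0.78) / Math.log(0.9));
private final float mFlingFriction = ViewConfiguration.getScrollFriction();
private final float PHYSICAL_COEF =
        SensorManager.GRAVITY_EARTH   // g (m/s^2)
        * 39.37f                      // inches per meter
        * ppi                         // pixels per inch of the screen
        * 0.84f;                      // look-and-feel tuning

private double getSplineDeceleration(int velocity) {
    return Math.log(INFLEXION * Math.abs(velocity) / (mFlingFriction * PHYSICAL_COEF));
}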
Keep in mind that on some devices the value can be slightly different. Some vendors change the formula depending on their requirements and hardware to make scrolling feel smoother.
It looks like the original question ended up with nothing, but it was formulated pretty well, so I landed here and started my own research. Here are my results.
My question was: what is the final value at the end of Android's standard FlingAnimation?
new FlingAnimation(new FloatValueHolder(0f))
        .addEndListener((animation, canceled, value, velocity) -> {
            // ? value  <-- this is the value I want to know in advance
        });
I needed that value before the animation starts, based on the start velocity, to make some preparations at the destination point of the FlingAnimation.
Actually I started with the OverScroller.java mentioned by @Adil Aliyev. I collected all the relevant portions of code, but the result was far smaller than what actually came out of the animation.
Then I took a look at FlingAnimation.java together with DynamicAnimation.java.
The key function in FlingAnimation.java to start the research from was:
MassState updateValueAndVelocity(float value, float velocity, long deltaT) {
After playing with some equations, I composed the final code below. The estimate is not exact to the last digit, but it is very close. I will use it for my needs; you are welcome to as well:
// Constants mirrored from FlingAnimation/DynamicAnimation.
final float DEFAULT_FRICTION = -4.2f;
final float VELOCITY_THRESHOLD_MULTIPLIER = 1000f / 16f;
// Use the friction you pass to .setFriction(...); the default is 1.0f.
float mFriction = 1.1f * DEFAULT_FRICTION;
final float THRESHOLD_MULTIPLIER = 0.75f;
float mVelocityThreshold = THRESHOLD_MULTIPLIER * VELOCITY_THRESHOLD_MULTIPLIER;
// Time (ms) until the velocity decays below the threshold, then the distance
// covered during that time; startVelocity is the fling's start velocity in px/s.
double time = Math.log(mVelocityThreshold / startVelocity) * 1000d / mFriction;
double flingDistance = startVelocity / mFriction * (Math.exp(mFriction * time / 1000d) - 1);
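If it helps, the same estimate can be wrapped into a small helper; the method name and signature below are my own, not part of the FlingAnimation API, and a positive start velocity is assumed:
// Rough estimate of the total distance a FlingAnimation will travel before
// stopping, given the start velocity (px/s) and the value passed to setFriction().
static double estimateFlingDistance(float startVelocity, float frictionMultiplier) {
    final float DEFAULT_FRICTION = -4.2f;
    final float VELOCITY_THRESHOLD_MULTIPLIER = 1000f / 16f;
    final float THRESHOLD_MULTIPLIER = 0.75f;
    float friction = frictionMultiplier * DEFAULT_FRICTION;
    float velocityThreshold = THRESHOLD_MULTIPLIER * VELOCITY_THRESHOLD_MULTIPLIER;
    double time = Math.log(velocityThreshold / startVelocity) * 1000d / friction;
    return startVelocity / friction * (Math.exp(friction * time / 1000d) - 1);
}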
I am trying to calculate the approximate position of an Android phone in a room. I tried different methods such as location (which is terrible indoors) and gyroscope + compass. I only need to know the approximate position after walking for 5-10 seconds, so I think integrating the linear acceleration could be enough. I know the error is terrible because of error propagation, but maybe it will work in my setup. I only need the approximate position to point a camera at the Android phone.
I coded the double integration but I am doing something wrong. If the phone is static on a table, the position (x, y, z) keeps increasing. What is the problem?
static final float NS2S = 1.0f / 1000000000.0f; // nanoseconds to seconds
float[] last_values = null;   // acceleration from the previous event
float[] velocity = null;
float[] position = null;
float[] acceleration = null;
long last_timestamp = 0;
SensorManager mSensorManager;
Sensor mAccelerometer;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION)
        return;
    if (last_values != null) {
        float dt = (event.timestamp - last_timestamp) * NS2S;
        // Correct each axis with a fixed offset measured while the phone was at rest.
        acceleration[0] = event.values[0] - 0.0188f;
        acceleration[1] = event.values[1] - 0.00217f;
        acceleration[2] = event.values[2] + 0.01857f;
        for (int index = 0; index < 3; ++index) {
            // Trapezoidal integration: acceleration -> velocity -> position.
            velocity[index] += (acceleration[index] + last_values[index]) / 2 * dt;
            position[index] += velocity[index] * dt;
        }
    } else {
        // First event: allocate and zero the state; integration starts with the next event.
        last_values = new float[3];
        acceleration = new float[3];
        velocity = new float[3];
        position = new float[3];
        velocity[0] = velocity[1] = velocity[2] = 0f;
        position[0] = position[1] = position[2] = 0f;
    }
    System.arraycopy(acceleration, 0, last_values, 0, 3);
    last_timestamp = event.timestamp;
}
These are the positions I get when the phone is lying on the table (no motion). The (x, y, z) values keep increasing even though the phone is still.
And these are the positions after calculating a moving average for each axis and subtracting it from each measurement. The phone is still not moving.
How can I improve the code, or is there another method to get the approximate position inside a room?
There are unavoidable measurement errors in the accelerometer, caused by tiny vibrations in the table, imperfections in manufacturing, and so on. Accumulating these errors over time results in a random walk. This is why positioning systems can only use accelerometers as a positioning aid through some filter; they still require an absolute reference such as GPS (which doesn't work well indoors).
There is a great deal of current research into indoor positioning systems. Some areas of research into systems that can take advantage of existing infrastructure are WiFi and LED-lighting positioning. There is no obvious solution yet, but I'm sure we'll need a dedicated solution for accurate, reliable indoor positioning.
You said the position always keeps increasing. Do you mean the x, y, and z components only ever become positive, even after resetting several times? Or do you mean the position keeps drifting away from zero?
If you output the raw acceleration measurements while the phone is still, you should see the measurement errors. Put a bunch of these measurements in an Excel spreadsheet and calculate the mean and the standard deviation. The mean should be zero for all axes; if not, there is a bias that you can remove in your code with a simple averaging filter (calculate a running average and subtract it from each result), as sketched below. The standard deviation shows how far you can expect to drift on each axis after N time steps, as standard_deviation * sqrt(N). This should help you mathematically determine the expected accuracy as a function of time (or N time steps).
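As an illustration, a minimal sketch of such a bias filter (the class and method names are mine, not from any library):
// Keeps a running average of the raw readings on each axis and subtracts it
// from every new sample, removing a constant bias before integration.
class BiasFilter {
    private final float[] mean = new float[3];
    private long samples = 0;

    float[] apply(float[] values) {
        samples++;
        float[] unbiased = new float[3];
        for (int i = 0; i < 3; i++) {
            mean[i] += (values[i] - mean[i]) / samples; // incremental running average
            unbiased[i] = values[i] - mean[i];          // bias-corrected reading
        }
        return unbiased;
    }
}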
Brian is right, there are already deployed indoor positioning systems that work with infrastructure that you can easily find in (almost) any room.
One of the solutions that has proven to be most reliable is WiFi fingerprinting. I recommend you take a look at indoo.rs - www.indoo.rs - they are pioneers in the industry and have a pretty developed system already.
This may not be the most elegant or reliable solution, but in my case it serves the purpose.
Note: In my case, I am grabbing a location before the user can even enter the activity that needs indoor positioning, and I am only concerned with a rough estimate of how much they have moved around.
I have a sensor manager that creates a rotation matrix based on the device orientation (using Sensor.TYPE_ROTATION_VECTOR). That obviously doesn't give me movement forward, backward, or side to side, but only the device orientation. With that orientation I have a good idea of the user's bearing in degrees (which way they are facing), and using the step detector sensor (TYPE_STEP_DETECTOR) available in KitKat 4.4, I make the assumption that a step is 1 meter in the direction the user is facing.
Again, I know this is not foolproof or very accurate, but depending on your purpose this too might be a simple solution.
Every time a step is detected, I basically call this function:
public void computeNewLocationByStep() {
    Location newLocal = new Location("");
    double vAngle = getBearingInDegrees(); // returns my user's bearing
    // One step is assumed to be 1 meter; convert it to an angular distance.
    double vDistance = 1.0 / g.kEarthRadiusInMeters; // kEarthRadiusInMeters = 6353000
    vAngle = Math.toRadians(vAngle);
    double vLat1 = Math.toRadians(_location.getLatitude());
    double vLng1 = Math.toRadians(_location.getLongitude());
    // Standard "destination point given distance and bearing" formula.
    double vNewLat = Math.asin(Math.sin(vLat1) * Math.cos(vDistance) +
            Math.cos(vLat1) * Math.sin(vDistance) * Math.cos(vAngle));
    double vNewLng = vLng1 + Math.atan2(Math.sin(vAngle) * Math.sin(vDistance) * Math.cos(vLat1),
            Math.cos(vDistance) - Math.sin(vLat1) * Math.sin(vNewLat));
    newLocal.setLatitude(Math.toDegrees(vNewLat));
    newLocal.setLongitude(Math.toDegrees(vNewLng));
    stepCount = 0;
    _location = newLocal;
}