I'm working on a project for my thesis in which I'm using an app for analyzing swipe gestures. There is a lot of information around about the ViewPager in general, and even about how to manipulate its settings, but so far I couldn't find the specific default value (the distance between finger down and finger up) baked into the pager itself. This would be very useful for me in order to see what value Google deems appropriate and to compare it to other values.
Help would be much appreciated! :)
Edit:
OK, found some clues myself. This is the code for ViewPager: searchco.de/codesearch/view/10066260 and there is a static variable named MIN_DISTANCE_FOR_FLING which has a value of 25 (dip). This value gets multiplied by the density of the current display, which is done in initViewPager(). The result is then used in determineTargetPage() to check whether the user has swiped a greater distance than the threshold. What I don't get: if I multiply 25 by e.g. 160 (as an exemplary density), the value gets way too big, so I'm obviously misinterpreting the code in some way. I would really appreciate an explanation.
To sum this up in case anyone else needs the information (my thanks to @Luksprog):
The minimum threshold for a successful swipe in a ViewPager is a non-device-specific constant of 25 dip, adjusted to the current device by multiplying it by a device-specific value, namely the scale factor of the device's display (the display's dpi divided by 160, not the raw dpi itself).
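The conversion described above can be sketched in a few lines of plain Java; the constant name mirrors the AOSP source, while the density value in the usage note is just an example:

```java
// Minimal sketch of the dip-to-pixel conversion ViewPager performs.
// The density parameter is DisplayMetrics.density, i.e. dpi / 160,
// not the raw dpi (which is why multiplying by 160 gives huge values).
public class FlingThreshold {
    static final int MIN_DISTANCE_FOR_FLING = 25; // dips, as in the AOSP source

    static int minDistanceInPixels(float density) {
        // round to the nearest whole pixel
        return (int) (MIN_DISTANCE_FOR_FLING * density + 0.5f);
    }
}
```

On a 320 dpi device, for instance, density is 320 / 160 = 2.0, so the effective fling threshold is 50 px.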
You need to import ViewPager.java into your project (File -> Import -> General -> File System -> with path-to-SDK/extras/android/support/v4/src/) and disable the mTouchSlop variable in initViewPager():
mTouchSlop = 0; // ViewConfigurationCompat.getScaledPagingTouchSlop(configuration);
Then touch drags will register immediately.
Using MPAndroidChart, I'm struggling to figure out how to set the current visible x axis values. The use case is simple and I would have thought very common, so I'm sure I must be missing some function which can do this:
Say you have a chart with x axis values 1-100
A user zooms and pans a chart so that the range 60-80 is visible. I want to store these values, so that tomorrow when the user re-launches the app, I can restore the exact viewing state (60-80)
Storing the values is really easy - you can simply call chart.lowestVisibleX and chart.highestVisibleX to get the x axis values. But how do I set them on a new instance of the chart? Unfortunately there doesn't seem to be a chart.setHighestVisibleX or chart.setLowestVisibleX.
This previous question / answer is nearly, but not quite, what I need. The suggestion is to use a combination of chart.moveViewToX(60) and chart.setVisibleXRangeMaximum(20). However as the docs for setVisibleXRangeMaximum state:
Sets the size of the area (range on the x-axis) that should be maximum
visible at once (no further zooming out allowed)
I don't want to prevent further zooming, which is what this does. There must be a way to set the zoom level without actually restricting further zooming - but I can't figure it out. Any suggestions?
Thanks
Just to answer my own question: I decided to work around this issue by resetting the X-range maximum after calling moveViewToX. This appears to work, so the solution would be:
chart.setVisibleXRangeMaximum(20)
chart.moveViewToX(60)
chart.setVisibleXRangeMaximum(100)
I'm trying to retrieve my current heading from an Android device using Delphi RAD Studio 10.1 Berlin.
There is a TrueHeading property in the OrientationSensor; however, according to the Embarcadero knowledge base, this is only enabled on Windows.
So, I think that to do it, I need to combine the following variables into one heading.
OrientationSensor1.Sensor.HeadingX
OrientationSensor1.Sensor.HeadingY
OrientationSensor1.Sensor.HeadingZ
As I only need heading (and don't care about altitude), I believe I can disregard Z.
In return I need to retrieve the current heading, which should be from 0-360.
I used a formula I found online, which is:
angle = atan2(Y, X);
This seemed to help, but was wildly inaccurate in some positions, and negative in others.
Any help or advice would be appreciated.
Some details that may help are :
It's a Multi-Device Application in Delphi.
It's only going on Android devices (and is also only being tested on them).
Thanks in advance.
Don't discard HeadingZ.
These three headings are not relative to the world's surface but relative to your device's orientation and tilt.
So in order to get the true heading, you will have to take into account the heading for all three axes, and also the tilt information for all three axes.
You can read more about calculating heading here: https://stackoverflow.com/a/1055051/3636228
Yes, the linked answer is for Objective-C, but the math behind it is the same in every programming language.
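As a starting point, here is a sketch (in Java, as the math is language-independent) of the atan2 step alone, assuming the device is held flat; it folds the signed result into the 0-360 range, which addresses the negative values you saw. Full tilt compensation, as described in the linked answer, still has to be layered on top:

```java
// Hypothetical sketch: convert raw X/Y heading components into a compass
// heading in degrees, assuming the device is flat (no tilt compensation).
public class Heading {
    static double headingDegrees(double x, double y) {
        double deg = Math.toDegrees(Math.atan2(y, x)); // -180 .. 180
        return (deg + 360.0) % 360.0;                  // fold into 0 .. 360
    }
}
```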
I am using the getPressure(index) method from the MotionEvent instance to get a value of the pressure applied to screen.
I am trying to figure out how to convert that value to at least an approximation of a standard measurement unit.
In Android the pressure value is a float ranging from 0 to 1. I need to express it in newtons in some way.
From what I understand this differs across devices, so a really precise measurement isn't possible, but I'm fine with an approximation.
For example: what amount in newtons is typical for a stylus pressing the screen at full force (the device reporting a pressure of 1.0f)?
I think you can only guess, and the results will be affected by huge uncertainty. Solutions I see:
Put an object of appropriate, known weight on the screen. I don't know whether the screen needs human skin to trigger the event; if it does, you can put your finger on the screen (applying no force yourself) and then put the object on your finger.
Take a stylus, and by debugging learn how much force you need to get a 0.5f result. Then take a weighing scale, apply the same pressure on it with the stylus, and read the result.
In both cases, you get a single calibration point (e.g. 0.5f -> 10 N), and can then assume a linear relation (knowing also that 0f -> 0 N) to fill the whole range.
With some patience you can fill in more calibration values too; I would not actually expect the relation to be linear.
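The single-point linear mapping described above can be sketched as follows; the calibration numbers in the test values are hypothetical, and on a real device they would come from the weighing experiment:

```java
// Rough sketch: given one calibration point (calPressure -> calNewtons)
// and the assumption of a linear relation through the origin (0f -> 0 N),
// estimate the force in newtons for any reported pressure value.
public class PressureCalibration {
    static double estimateNewtons(float pressure, float calPressure, double calNewtons) {
        return (pressure / calPressure) * calNewtons;
    }
}
```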
getPressure returns a value from 0 to 1 because the way pressure is calculated is device-dependent. Some devices calculate the value from how much of the area of your finger is touching the screen. So it's probably not possible to convert to newtons in a way that works across Android devices unless you write a solution for each one.
My device has only two focus modes, AUTO and FIXED (as per getSupportedFocusModes()).
I want to set my camera at a fixed focus distance of 'x' (x being whatever I like, or whatever I can get from the camera..). (I'm aware of setFocusMode(Camera.Parameters.FOCUS_MODE_FIXED), but this seems to be fixed only on the farthest possible setting..)
Can this be done? (Android version 4.2.2)
Not trying to completely answer the question here, just trying to give it some direction.
So, what you need here is driver support for that kind of operation. Then at some point you'd ask the driver, from your application, to set a requested focus distance.
Another question is: "if anyone really needs that kind of functionality?".
Android documentation says:
public static final String FOCUS_MODE_FIXED
Focus is fixed. The camera is always in this mode if the focus is not adjustable. If the camera has auto-focus, this mode can fix the focus, which is usually at hyperfocal distance. Applications should not call autoFocus(AutoFocusCallback) in this mode.
Let's see what hyperfocal distance is.
Hyperfocal distance
From Wikipedia, the free encyclopedia
In optics and photography, hyperfocal distance is a distance beyond which all objects can be brought into an "acceptable" focus. There are two commonly used definitions of hyperfocal distance, leading to values that differ only slightly:
Definition 1: The hyperfocal distance is the closest distance at which a lens can be focused while keeping objects at infinity acceptably sharp. When the lens is focused at this distance, all objects at distances from half of the hyperfocal distance out to infinity will be acceptably sharp.
Definition 2: The hyperfocal distance is the distance beyond which all objects are acceptably sharp, for a lens focused at infinity.
The distinction between the two meanings is rarely made, since they have almost identical values. The value computed according to the first definition exceeds that from the second by just one focal length.
As the hyperfocal distance is the focus distance giving the maximum depth of field, it is the most desirable distance to set the focus of a fixed-focus camera.
So the focus is not set at the farthest possible setting, but is set so that all visible objects are acceptably sharp.
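For a sense of the numbers, definition 1 above corresponds to the standard formula H = f^2 / (N * c) + f, where f is the focal length, N the f-number, and c the circle of confusion. A small sketch (the sample values in the note below are illustrative, not taken from any particular phone camera):

```java
// Hyperfocal distance per definition 1: H = f^2 / (N * c) + f.
// All inputs and the result are in millimetres.
public class Hyperfocal {
    static double hyperfocalMm(double focalMm, double fNumber, double cocMm) {
        return (focalMm * focalMm) / (fNumber * cocMm) + focalMm;
    }
}
```

For example, a hypothetical 4 mm lens at f/2 with a 0.005 mm circle of confusion gives H ≈ 1.6 m, so everything from about 0.8 m to infinity would be acceptably sharp.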
Returning to the question.
If you happen to be a developer of this particular camera's firmware, you can add any needed IOCTLs to your driver. But then you're still going to need to call them somehow. This can't be achieved without adding additional functions to the Android OS, and then recompiling Android itself and its underlying Linux kernel.
So it seems like you can't achieve this goal, not from the user space at least.
One potential approach to achieving a fixed focus distance is to call autoFocus at the start of the camera life-cycle. Keep calling autoFocus sporadically until a condition is met. Once the condition is met, instead of calling autoFocus, set a flag and call takePicture instead.
This is one solution that I have come to in order to get the desired effect that you might be looking to achieve.
So within my thread that is taking pictures continuously, the code looks something like this:
if (needsFocus)
{
    myCamera.autoFocus(autoFocusCallback);
}
else // focus is no longer needed at this point
{
    if (myCamera != null)
    {
        myCamera.startPreview();
        // takePicture expects shutter, raw and jpeg callbacks; only jpeg is used here
        myCamera.takePicture(null, null, pictureCallback);
    }
}
Once the condition is met, needsFocus is set to false. At this point the focus is fixed where I want it to be, and it won't change throughout the rest of the activity's task. The condition in my case was the appearance of a particular object, detected with the OpenCV library.
I might be wrong, but the way you phrase your question suggests you're coming at it from a classic DSLR lens perspective.
On an Android mobile camera, you don't actually have to worry that much about lens focal distance, unless your camera allows it (which does not seem to be the case, as you mention it only allows auto or fixed, rather than infinity, macro, continuous-video, etc.).
You can just set focus areas on the camera and let the SDK do its work. Whether the object touched in the camera image is far or near, it's the SDK's job to calculate accordingly and focus for you.
For an example, try this open-source camera project.
I would like to roughly understand the amount of pressure a finger applies to a capacitive screen on Android. My idea is to get the area covered by the finger when the screen is touched (maybe with some extra parameters to make it more accurate, but that's the main idea).
So, is there any way to find the area covered (for example, the number of pixels covered)?
There is only MotionEvent.getPressure() (which you probably already found). I doubt there is anything that reports how many pixels are covered by the finger.
I do not really know, but you have access to the following function:
MotionEvent e;
float press = e.getPressure(...);
press will be between 0 and 1, from 0 = no pressure to 1 = normal pressure; however, it can be more than 1...
Also, this sounds a bit like NIH (not invented here): could you use something that already exists, or does nothing cover your needs?
You can use MotionEvent.getSize() to get a normalized value (from 0 to 1) of the area of the screen being pressed. This value is correlated with the number of pixels pressed.