How to detect my device's left/right audio balance? - android

Is there a way to programmatically detect the current device's audio balance?
On Android 10, the setting I want to detect is under:
Settings -> Accessibility -> Audio & On-Screen Text -> Audio balance
Also shown here
By changing this balance, the user can make either the left or the right side of the stereo output louder.
Can I somehow retrieve this value in my app?

This setting only exists system-wide since Android Q; this is the relevant entry in Settings.java:
/**
* Master balance (float -1.f = 100% left, 0.f = dead center, 1.f = 100% right).
*
* @hide
*/
public static final String MASTER_BALANCE = "master_balance";
These global settings can only be read (not tested, but something like this):
Settings.Global.getFloat(context.getContentResolver(), Settings.Global.MASTER_BALANCE, 0f)
I could not find any public API for the system-wide setting (which means that only the DevicePolicyManager could change it), but you can use MediaPlayer.setVolume(float leftVolume, float rightVolume) to control the balance of your own application's playback.
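Also not tested, but as a rough sketch of both routes — reading the hidden global value by its string key (Settings.Global.MASTER_BALANCE is @hide, so the key name is spelled out here as an assumption) and applying a balance to your own playback via MediaPlayer.setVolume — something like this:
import android.content.Context;
import android.media.MediaPlayer;
import android.provider.Settings;

public class BalanceHelper {

    // Reads the system-wide balance (-1 = full left, 0 = center, 1 = full right).
    // The key string mirrors the @hide MASTER_BALANCE constant; it may not be
    // populated on every device/ROM.
    public static float readMasterBalance(Context context) {
        return Settings.Global.getFloat(
                context.getContentResolver(), "master_balance", 0f);
    }

    // Applies a balance value to this app's own playback only.
    public static void applyBalance(MediaPlayer player, float balance) {
        // Map balance in [-1, 1] to per-channel gains in [0, 1].
        float left = balance > 0 ? 1f - balance : 1f;
        float right = balance < 0 ? 1f + balance : 1f;
        player.setVolume(left, right);
    }
}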

How to control the z axis of individual Mapbox features inside the same layer?

I'm using the Mapbox Maps SDK for Android to display pins with custom icons on a map in my app. More specifically, I'm using the SymbolLayer API. When the user clicks on a pin, its appearance changes to show it is selected. However, that clicked pin is often behind other pins, like in this image:
All those pins are Features from the same Source, added to the same SymbolLayer.
I want to be able to make the selected pin appear above the other pins, and for that I'm trying to control its z axis. I'm playing around with the PropertyFactory.symbolZOrder(value) method and it seems that neither Property.SYMBOL_Z_ORDER_VIEWPORT_Y nor Property.SYMBOL_Z_ORDER_SOURCE will be of help. I'm hopeful that I can achieve this with an Expression, but I have no idea how to use it.
Any thoughts?
Here's an example with explanations.
You should set the symbolZOrder and symbolSortKey properties when defining the symbol layer.
symbolZOrder accepts one of the following values as its argument:
// SYMBOL_Z_ORDER: Controls the order in which overlapping symbols in the same layer are rendered
/**
* If symbol sort key is set, sort based on that. Otherwise sort symbols by their y-position relative to the viewport.
*/
public static final String SYMBOL_Z_ORDER_AUTO = "auto";
/**
* Symbols will be sorted by their y-position relative to the viewport.
*/
public static final String SYMBOL_Z_ORDER_VIEWPORT_Y = "viewport-y";
/**
* Symbols will be rendered in the same order as the source data with no sorting applied.
*/
public static final String SYMBOL_Z_ORDER_SOURCE = "source";
So if you wish to control the symbol order with your own logic, use the SYMBOL_Z_ORDER_AUTO value.
Then set symbolSortKey to a float value per feature:
Style.OnStyleLoaded onStyleLoaded = style -> {
    style.addLayer(new SymbolLayer(SYMBOLS_LAYER, SYMBOLS_SOURCE)
            .withProperties(
                    iconImage(step(zoom(),
                            literal("marker_public_base"),
                            stop(6, get("icon")))),
                    iconIgnorePlacement(true),
                    iconAllowOverlap(true),
                    iconSize(interpolate(
                            linear(), zoom(),
                            stop(0, MARKER_SYMBOL_SIZE * .13),
                            stop(5, MARKER_SYMBOL_SIZE * .33),
                            stop(9, MARKER_SYMBOL_SIZE))),
                    iconAnchor(ICON_ANCHOR_BOTTOM),
                    textField(get("title")),
                    textSize(interpolate(
                            linear(), zoom(),
                            stop(5.9, 0),
                            stop(6, SYMBOL_TEXT_SIZE * .4),
                            stop(7, SYMBOL_TEXT_SIZE * .7),
                            stop(11, SYMBOL_TEXT_SIZE))),
                    textOptional(true),
                    textHaloColor("#ffffff"),
                    textHaloWidth(1f),
                    textHaloBlur(1f),
                    textAnchor(TEXT_ANCHOR_TOP),
                    symbolZOrder(SYMBOL_Z_ORDER_AUTO),
                    symbolSortKey(get("zIndex"))
            ));
};
where the features are created like this:
points.forEach(point -> {
    Feature feature = Feature.fromGeometry(com.mapbox.geojson.Point.fromLngLat(point.lon, point.lat));
    feature.addStringProperty("id", point.id);
    feature.addNumberProperty("zIndex", point.isPublic ? 0f : point.isSearchResult ? 2f : 1f);
    feature.addStringProperty("title", point.name);
    feature.addStringProperty("icon", getIconImageID(point.category, point.isPublic, point.visited));
    symbolsFeatures.add(feature);
});
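To bring a tapped pin to the front at runtime, one option (a sketch, not part of the answer above; it assumes the features are kept in the symbolsFeatures list and the layer's source is available as a GeoJsonSource, here given the hypothetical name symbolsSource) is to bump the selected feature's zIndex property and push the features back into the source:
import com.mapbox.geojson.Feature;
import com.mapbox.geojson.FeatureCollection;
import com.mapbox.mapboxsdk.style.sources.GeoJsonSource;

// Hypothetical click handler: raise the tapped feature above its neighbours
// by giving it a higher sort key, then refresh the GeoJSON source.
private void selectFeature(GeoJsonSource symbolsSource, Feature selected) {
    String selectedId = selected.getStringProperty("id");
    for (Feature feature : symbolsFeatures) {
        if (feature.getStringProperty("id").equals(selectedId)) {
            // Any value above the 0..2 range used when building the features will do;
            // with iconAllowOverlap(true), higher sort keys render on top.
            feature.addNumberProperty("zIndex", 10f);
        }
    }
    // Re-set the source so the layer re-evaluates symbolSortKey(get("zIndex")).
    symbolsSource.setGeoJson(FeatureCollection.fromFeatures(symbolsFeatures));
}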

How to detect whether the device is moving or not using sensors - android

I am using the code below to identify movement of the device, i.e. I would like to know whether the device is moving or not. I also use the Google Activity Recognition API, which provides activity modes like WALKING, ON_FOOT, STILL, etc. without using GPS. I would like to achieve the same with sensors, but I am not able to get it accurate.
The issue with the following code is that as soon as I move the device quickly, for example picking it up from a table, I get "moving" as the result even though the device is not actually going anywhere.
// Called from onSensorChanged() with a TYPE_ACCELEROMETER event.
double speed = getAccelerometer(event.values);
// Then check the magnitude: close to 1 g means the device is at rest.
if (speed > 0.9 && speed < 1.1) {
    // device is not moving
} else {
    // device is moving
}

/**
 * @return the acceleration magnitude in units of g (9.80665 m/s^2)
 */
private double getAccelerometer(float[] values) {
    float x = values[0];
    float y = values[1];
    float z = values[2];
    // Squared magnitude of the acceleration vector, normalised by g^2.
    float normalizedSquare =
            (float) ((x * x + y * y + z * z) / (9.80665 * 9.80665));
    return Math.sqrt(normalizedSquare);
}
Can anyone guide me on how to make this logic accurate so that I can identify whether the device is moving or not?
The accelerometer returns acceleration data, and according to Newton's second law, if the acceleration is constant then the body is either not moving or moving at constant speed (the latter is practically impossible in your case).
Therefore, if you keep reading the same data on all three axes (or better, values within a fairly tight range) over time, the phone is not moving; otherwise it is.
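A sketch of that idea (the window size and tolerance below are assumptions to tune per device): compare the spread of recent readings per axis instead of a fixed magnitude band.
import java.util.ArrayDeque;
import java.util.Deque;

// Keeps the last few accelerometer readings and reports "moving" only when the
// readings differ from each other by more than a small tolerance.
public class MovementDetector {

    private static final int WINDOW_SIZE = 20;    // roughly 1s of samples at SENSOR_DELAY_UI (assumption)
    private static final float TOLERANCE = 0.5f;  // m/s^2, tune per device (assumption)

    private final Deque<float[]> window = new ArrayDeque<>();

    // Call from onSensorChanged() with event.values of a TYPE_ACCELEROMETER event.
    public boolean isMoving(float[] values) {
        window.addLast(values.clone());
        if (window.size() > WINDOW_SIZE) {
            window.removeFirst();
        }
        float[] min = {Float.MAX_VALUE, Float.MAX_VALUE, Float.MAX_VALUE};
        float[] max = {-Float.MAX_VALUE, -Float.MAX_VALUE, -Float.MAX_VALUE};
        for (float[] sample : window) {
            for (int i = 0; i < 3; i++) {
                min[i] = Math.min(min[i], sample[i]);
                max[i] = Math.max(max[i], sample[i]);
            }
        }
        // "Not moving" means every axis stayed within the tolerance over the whole window.
        return (max[0] - min[0]) > TOLERANCE
                || (max[1] - min[1]) > TOLERANCE
                || (max[2] - min[2]) > TOLERANCE;
    }
}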
For this purpose you can use the Activity Recognition API, which gives you events like moving, still, driving, etc. Activity recognition uses sensor data and also the location service when it is running. For more on how to use it and what it actually does, you can read the link below:
https://developers.google.com/location-context/activity-recognition/
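A minimal sketch of wiring that API up (assuming the play-services-location dependency and, on Android 10+, the ACTIVITY_RECOGNITION runtime permission; how you build the PendingIntent is left out):
import android.app.PendingIntent;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

import com.google.android.gms.location.ActivityRecognition;
import com.google.android.gms.location.ActivityRecognitionResult;
import com.google.android.gms.location.DetectedActivity;

public class ActivityUpdates {

    // Request activity updates delivered to the given PendingIntent.
    public static void start(Context context, PendingIntent callbackIntent) {
        ActivityRecognition.getClient(context)
                .requestActivityUpdates(5_000L, callbackIntent); // detection interval in ms
    }
}

// Receiver registered for the PendingIntent above.
class ActivityReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (ActivityRecognitionResult.hasResult(intent)) {
            DetectedActivity activity =
                    ActivityRecognitionResult.extractResult(intent).getMostProbableActivity();
            boolean still = activity.getType() == DetectedActivity.STILL;
            // "still" means the device is not moving; anything else counts as movement.
        }
    }
}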

Accessing event input nodes in Android without rooting

I want to be able to inject different events into an Android device. After some searching, I found that I can do this by accessing the event input nodes of the Android OS, which are found in /dev/input/eventX. Once these are accessed, read and write operations can take place, and hence I can inject events.
The problem is that these nodes are only accessible in a rooted device. If I try to use them without rooting, the process will fail as mentioned in this article,
http://www.pocketmagic.net/programmatically-injecting-events-on-android-part-2/
I don't want to root the device, to preserve its warranty. I've searched the web for ways of accessing these nodes, but all I found was rooting.
An alternative that I think might work is compiling the application as a system application, but I couldn't find out whether this gives it access (both read and write privileges) to the event input nodes. Will this method provide these privileges?
If not, is there any alternative way to rooting, where I can give system permissions to my application without rooting the device?
Any help is appreciated.
Thanks.
EDIT: To elaborate more, I want to inject different touch events. For example, single touch, swipe, etc.
You can inject input events on a device by executing the /system/bin/input utility that ships with Android. You can see some examples of it being used (via adb) in this question. The input utility does not appear to need any special privileges to execute.
To create a system application, you need access to the signing keys used when the Android OS for your device was built - you can't just modify an ordinary app to give it system privileges. Even if you could, it wouldn't give you root access (although you could probably make it part of the input user group, which the /dev/input/eventX devices also appear to allow access to).
If you want to inject touch events, you can either execute the /system/bin/input utility using the exec() method of the Java Runtime class or just use the injectMotionEvent() method in InputManager.
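As a sketch of the exec() route (the coordinates are arbitrary; like the InputManager approach below, from an unprivileged app this only reaches your own windows):
// Simulate a tap and a swipe by shelling out to the input utility.
try {
    Runtime.getRuntime()
            .exec(new String[]{"input", "tap", "300", "500"})
            .waitFor();
    // swipe from (300, 500) to (300, 100) over 250 ms
    Runtime.getRuntime()
            .exec(new String[]{"input", "swipe", "300", "500", "300", "100", "250"})
            .waitFor();
} catch (java.io.IOException | InterruptedException e) {
    e.printStackTrace();
}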
Below is a method taken from the Android source showing how to inject a MotionEvent - you can view the full source for more info.
/**
 * Builds a MotionEvent and injects it into the event stream.
 *
 * @param inputSource the InputDevice.SOURCE_* sending the input event
 * @param action the MotionEvent.ACTION_* for the event
 * @param when the value of SystemClock.uptimeMillis() at which the event happened
 * @param x x coordinate of event
 * @param y y coordinate of event
 * @param pressure pressure of event
 */
private void injectMotionEvent(int inputSource, int action, long when, float x, float y, float pressure) {
    final float DEFAULT_SIZE = 1.0f;
    final int DEFAULT_META_STATE = 0;
    final float DEFAULT_PRECISION_X = 1.0f;
    final float DEFAULT_PRECISION_Y = 1.0f;
    final int DEFAULT_DEVICE_ID = 0;
    final int DEFAULT_EDGE_FLAGS = 0;
    MotionEvent event = MotionEvent.obtain(when, when, action, x, y, pressure, DEFAULT_SIZE,
            DEFAULT_META_STATE, DEFAULT_PRECISION_X, DEFAULT_PRECISION_Y, DEFAULT_DEVICE_ID,
            DEFAULT_EDGE_FLAGS);
    event.setSource(inputSource);
    Log.i(TAG, "injectMotionEvent: " + event);
    InputManager.getInstance().injectInputEvent(event,
            InputManager.INJECT_INPUT_EVENT_MODE_WAIT_FOR_FINISH);
}
These methods only allow you to inject events into your own app windows.
If you want to inject events into other windows not owned by your app, you need to declare additional permissions (READ_INPUT_STATE and INJECT_EVENTS) in your app manifest and sign your app with the Android OS signing keys. In other words, the permissions needed to inject events into other apps are never granted to ordinary apps (for obvious reasons).

How to control ISO manually in camera2, android

I am new to Android and trying to figure out the new camera2 API. I have no idea how to control ISO manually in the camera preview.
Any help will be appreciated.
Thanks.
One way to determine if your device supports manual ISO control is to check if it supports the MANUAL_SENSOR capability.
If so, you can turn off auto-exposure by either disabling all automatics:
previewBuilder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_OFF);
or by just disabling auto-exposure, leaving auto-focus and auto-white-balance running:
previewBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);
Once you've disabled AE, you can manually control exposure time, sensitivity (ISO), and frame duration:
previewBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, exposureTime);
previewBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, sensitivity);
previewBuilder.set(CaptureRequest.SENSOR_FRAME_DURATION, frameDuration);
The valid ranges for these values can be found from SENSOR_INFO_EXPOSURE_TIME_RANGE and SENSOR_INFO_SENSITIVITY_RANGE for exposure and sensitivity. For frame duration, the maximum frame duration can be found from SENSOR_INFO_MAX_FRAME_DURATION, and the minimum frame duration (max frame rate) depends on your session output configuration. See StreamConfigurationMap.getOutputMinFrameDuration for more details on this.
Note that once you disable AE, you have to control all 3 parameters (there are defaults if you never set one, but they won't vary automatically). You can copy the last-good values for these from the last CaptureResult before you turn off AE, to start with.
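Putting those pieces together, a rough sketch (the builder and the exposureNs/iso values are placeholders you supply) that checks for the MANUAL_SENSOR capability and clamps the requested values to the advertised ranges:
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.util.Range;

// Sketch: enable manual exposure on an existing preview request builder.
void applyManualExposure(CameraCharacteristics characteristics,
                         CaptureRequest.Builder previewBuilder,
                         long exposureNs, int iso) {
    int[] caps = characteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
    boolean manualSensor = false;
    for (int cap : caps) {
        if (cap == CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_MANUAL_SENSOR) {
            manualSensor = true;
        }
    }
    if (!manualSensor) {
        return; // device cannot do manual exposure control
    }

    Range<Long> exposureRange =
            characteristics.get(CameraCharacteristics.SENSOR_INFO_EXPOSURE_TIME_RANGE);
    Range<Integer> isoRange =
            characteristics.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE);

    // Disable AE only; AF and AWB keep running.
    previewBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);
    previewBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, exposureRange.clamp(exposureNs));
    previewBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, isoRange.clamp(iso));
}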
You have to turn off auto-exposure on the preview builder first, like this:
mPreviewBuilder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_OFF);
and then
Range<Integer> range2 = characteristics.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE);
int max1 = range2.getUpper();//10000
int min1 = range2.getLower();//100
int iso = ((progress * (max1 - min1)) / 100 + min1);
mPreviewBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, iso);
progress is the value from the SeekBar's onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) callback (0-100 here)

Detect Fling distance android

I am working on an application which requires me to handle the fling process manually rather than handing it to the framework. What I want to achieve is basically to calculate the number of pixels a ListView moves when it receives a fling action. As the scroll callback already provides the distance as a delta, I have handled that easily. But is there a way to get the fling distance, given that only a velocity parameter is passed to the overridden fling method?
Note: I have to move another view in accordance with the fling distance, so I need to get it continuously, just like onScroll provides it.
Thanks.
Three years have passed and there is still no answer, so here is a workaround I found.
It is actually a somewhat advanced topic with a lot of nuances, but basically you can refer to the Android source code (the OverScroller class in particular) and use this method. You will need to copy it into your own class to use it.
private double getSplineFlingDistance(int velocity) {
    final double l = getSplineDeceleration(velocity);
    final double decelMinusOne = DECELERATION_RATE - 1.0;
    return mFlingFriction * PHYSICAL_COEF * Math.exp(DECELERATION_RATE / decelMinusOne * l);
}
Other methods and values can be obtained from the same class.
The link to the source code: https://android.googlesource.com/platform/frameworks/base/+/jb-release/core/java/android/widget/OverScroller.java
Keep in mind that on some devices the value can differ slightly: some vendors change the formula to suit their requirements and hardware and make scrolling feel smoother.
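For reference, here is a self-contained sketch of the supporting constants and methods, adapted from the linked OverScroller source (PHYSICAL_COEF in the snippet above corresponds to the mPhysicalCoeff field, which depends on the screen density):
import android.content.Context;
import android.hardware.SensorManager;
import android.view.ViewConfiguration;

// Companion pieces for getSplineFlingDistance(), adapted from OverScroller.java.
public class FlingDistance {

    private static final float INFLEXION = 0.35f; // tension lines cross at (INFLEXION, 1)
    private static final float DECELERATION_RATE = (float) (Math.log(0.78) / Math.log(0.9));

    private final float mFlingFriction = ViewConfiguration.getScrollFriction();
    private final float mPhysicalCoeff;

    public FlingDistance(Context context) {
        float ppi = context.getResources().getDisplayMetrics().density * 160.0f;
        mPhysicalCoeff = SensorManager.GRAVITY_EARTH   // g (m/s^2)
                * 39.37f                               // inches per meter
                * ppi
                * 0.84f;                               // look-and-feel tuning
    }

    private double getSplineDeceleration(int velocity) {
        return Math.log(INFLEXION * Math.abs(velocity) / (mFlingFriction * mPhysicalCoeff));
    }

    public double getSplineFlingDistance(int velocity) {
        final double l = getSplineDeceleration(velocity);
        final double decelMinusOne = DECELERATION_RATE - 1.0;
        return mFlingFriction * mPhysicalCoeff
                * Math.exp(DECELERATION_RATE / decelMinusOne * l);
    }
}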
It looks like the original question ended up unresolved, but it was formulated well, so I landed here and started my own research. Here are my results.
My question was: What is the final value at the end of Android standard FlingAnimation?
new FlingAnimation(new FloatValueHolder(0f))
        .addEndListener((animation, canceled, value, velocity) -> {
            // ? value
        });
I needed that value before the animation starts, based on the start velocity, to make some preparations at the destination point of the FlingAnimation.
I actually started with the OverScroller.java approach mentioned by @Adil Aliyev. I collected all the pieces of code, but the result was far smaller than the value that came from the animation.
Then I took a look into FlingAnimation.java in pair with DynamicAnimation.java.
The key function in FlingAnimation.java to start the research was:
MassState updateValueAndVelocity(float value, float velocity, long deltaT) {
After playing with some equations, I composed this final code. It does not give a perfectly exact estimation down to the last digit, but it is very close. I will use it for my needs; you are welcome to as well:
final float DEFAULT_FRICTION = -4.2f;
final float VELOCITY_THRESHOLD_MULTIPLIER = 1000f / 16f;
float mFriction = 1.1f * DEFAULT_FRICTION; // set here friction that you set in .setFriction(1.1f) or 1 by default
final float THRESHOLD_MULTIPLIER = 0.75f;
float mVelocityThreshold = THRESHOLD_MULTIPLIER * VELOCITY_THRESHOLD_MULTIPLIER;
double time = Math.log(mVelocityThreshold / startVelocity) * 1000d / mFriction;
double flingDistance = startVelocity / mFriction * (Math.exp(mFriction * time / 1000d) - 1);
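For convenience, the same computation wrapped in a helper (assuming the default setMinimumVisibleChange() of 1 pixel, which is what the threshold constants above imply); the result should roughly match the value reported in the FlingAnimation end listener:
// Estimates the end value of a FlingAnimation that starts at 0 with the given
// start velocity and friction multiplier (the value passed to setFriction(), 1 by default).
static float estimateFlingEndValue(float startVelocity, float frictionMultiplier) {
    final float friction = frictionMultiplier * -4.2f;      // DEFAULT_FRICTION
    final float velocityThreshold = 0.75f * (1000f / 16f);  // velocity at which the animation ends
    double time = Math.log(velocityThreshold / Math.abs(startVelocity)) * 1000d / friction;
    return (float) (startVelocity / friction * (Math.exp(friction * time / 1000d) - 1));
}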
