I'd like to test Android's behavior on all possible combinations of the following "inputs":
top activity's setRequestedOrientation() (15 possible values, not including SCREEN_ORIENTATION_BEHIND)
Settings.System.ACCELEROMETER_ROTATION (2 possible values)
Settings.System.USER_ROTATION (4 possible values)
device's physical orientation (queryable by OrientationEventListener) (4 possible quadrants)
Specifically, I want to see how the inputs affect the following "output":
getWindowManager().getDefaultDisplay().getRotation() (4 possible values)
So this will require testing at least all 15*2*4*4=480 possible input states.
Additionally, since rotation behavior is often dependent on the history
of the inputs (not just the current input values),
I want to test (at least) all possible transitions from one input state to an "adjacent" input state,
i.e. to an input state that differs from the given input state by one input parameter.
The number of such input state transitions is:
(number of input states) * (number of states adjacent to a given input state)
= (15*2*4*4) * ((15-1) + (2-1) + (4-1) + (4-1))
= 480 * 21
= 10080
Furthermore, sometimes output is dependent on the previous output as well as previous and current
input (e.g. SCREEN_ORIENTATION_LOCKED, SCREEN_ORIENTATION_SENSOR_LANDSCAPE).
The number of possible outputs for a given input state can be between 1 and 4,
so this multiplies the number of transitions that must be tested by up to 4:
10080 * 4 = 40320
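As a quick sanity check of the arithmetic, here is a throwaway Java snippet that recomputes those counts from the four input-domain sizes:

// Sanity check of the state/transition counts above.
public class RotationStateCount {
    public static void main(String[] args) {
        // requestedOrientation, ACCELEROMETER_ROTATION, USER_ROTATION, physical quadrant
        int[] domainSizes = {15, 2, 4, 4};
        int states = 1;
        int adjacent = 0;
        for (int size : domainSizes) {
            states *= size;
            adjacent += size - 1; // states reachable by changing this one parameter
        }
        System.out.println(states);                // 480 input states
        System.out.println(states * adjacent);     // 10080 adjacent transitions
        System.out.println(states * adjacent * 4); // 40320, worst case over previous outputs
    }
}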
That's a lot of transitions to test, so the testing would have to be programmatic/scripted.
Three out of the four input params are straightforward to control programmatically;
the one that's not straightforward to control is the device's physical orientation.
So, how would one go about scripting it? I can think of the following approaches.
Approach #1: Replace the (physical or emulated) device's accelerometer with a scriptable mock accelerometer
for the duration of the test. But, if I understand correctly, mock accelerometers do not exist for Android.
Approach #2: Use the android emulator, and script pressing of the "rotate counterclockwise" and
"rotate clockwise" buttons using an interaction automation tool on the host machine (e.g. applescript / autohotkey / xdotool).
Any other ideas?
It turns out this is actually a duplicate of the following excellent question:
How can i simulate accelerometer in android emulator?
which has this excellent answer from @user1302884:
Unfortunately, that question got no respect and was closed as off-topic (?!) so I won't mark this as a duplicate.
But here's the answer: there's no need for applescript/autohotkey/xdotool to drive the emulator's UI;
instead, telnet to the emulator and tell it which direction you want "up" to be.
telnet localhost 5554 # or whatever the port is
telnet> sensor # to get help on the sensor command
telnet> sensor get acceleration
acceleration = 0:9.81:0 # if in natural orientation
telnet> sensor get acceleration
acceleration = -9.81:0:0 # if rotated 90 degrees CW from natural orientation
telnet> sensor set acceleration -1:1:0 # to set to 45 degrees CW from natural orientation
It would be nice if the emulated display would appear rotated by the specified number of degrees in response, but you can't have everything.
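If you want to drive the console from code instead of an interactive telnet session, here is a minimal Java sketch (assumptions: console on localhost:5554; newer emulators also want an "auth <token>" line first, with the token read from ~/.emulator_console_auth_token):

import java.net.Socket;
import java.io.PrintWriter;

// Sketch: script the emulator console to step through the four quadrants.
public class EmulatorRotator {
    public static void main(String[] args) throws Exception {
        try (Socket s = new Socket("localhost", 5554);
             PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
            // Gravity vectors per quadrant: natural, 90 CW, 180, 90 CCW
            String[] gravity = {"0:9.81:0", "-9.81:0:0", "0:-9.81:0", "9.81:0:0"};
            for (String g : gravity) {
                out.println("sensor set acceleration " + g);
                Thread.sleep(2000); // let the system notice and rotate
            }
            out.println("quit");
        }
    }
}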
I am creating an In-Vehicle Infotainment (IVI) system running Android. The development board I am using is the Rock960 from 96boards.com. This board supports dual display output, which is perfect because it needs to drive both the Head Unit (HU) and the Instrument Cluster (IC) in front of the driver. This is where the issue lies. The HU has a resolution of 1920x1080 and the IC has a resolution of 1280x480. Output works on both displays, but it appears very stretched on the IC due to its unusual resolution.
The SoC is RK3399 from Rockchip. Here is a link to the AOSP page for this board: https://www.96boards.org/documentation/consumer/rock/build/aosp.md.html
I have tried changing the framebuffer resolution: persist.sys.framebuffer.main=1280x480 in device/rockchip/rk3399/rk3399_box/system.prop
When I do that, everything displays correctly on the IC but then the HU appears stretched.
Here is the system properties file:
#
# system.prop
#
#rild.libpath=/system/lib/libreference-ril.so
#rild.libargs=-d /dev/ttyUSB2
# Default ecclist
ro.ril.ecclist=112,911
wifi.interface=wlan0
persist.tegra.nvmmlite = 1
persist.sys.boot.check=false
ro.audio.monitorOrientation=true
#NFC
debug.nfc.fw_download=false
debug.nfc.se=false
#add Rockchip properties here
ro.rk.screenoff_time=2147483647
ro.rk.screenshot_enable=true
ro.rk.def_brightness=200
ro.rk.homepage_base=http://www.google.com/webhp?client={CID}&source=android-home
ro.rk.install_non_market_apps=false
sys.hwc.compose_policy=6
sys.wallpaper.rgb565=0
sf.power.control=8847360
sys.rkadb.root=0
ro.sf.fakerotation=false
ro.sf.hwrotation=0
ro.rk.MassStorage=false
ro.rk.systembar.voiceicon=true
ro.rk.systembar.tabletUI=false
ro.rk.LowBatteryBrightness=true
ro.tether.denied=false
sys.resolution.changed=false
ro.default.size=100
persist.sys.timezone=
ro.product.usbfactory=rockchip_usb
ro.support.lossless.bitstream=true
wifi.supplicant_scan_interval=15
ro.factory.tool=0
#set default lcd density for rk3399 box product
ro.sf.lcd_density=213
ro.adb.secure =0
ro.rk.statusbar=0
# set to false if not use displayd
ro.rk.displayd.enable=false
# default main framebuffer resolution
persist.sys.framebuffer.main=1920x1080
# default primary display
sys.hwc.device.primary=DP
sys.hwc.device.extend=HDMI-A
Also, any idea what ro.rk.displayd.enable is?
I expect both displays to show their content correctly according to their own resolutions. The output should not be stretched or distorted on either screen.
See HWComposer.cpp; the devices should probably be primary & external instead of primary and extend (where extend might cause the scaling), which might already answer the question. These should be defined in /kernel/drivers/video/rockchip. Adding further logging to the source code might help you understand what is actually going on when it sets up the displays.
displayd might be an OSD display daemon; anything that ends with a "d" is usually a daemon. If the kernel is adequately configured, the second screen can perhaps be driven from Android itself; see https://developer.android.com/reference/android/app/Presentation
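For reference, a minimal sketch of the Presentation approach, assuming the IC is enumerated as a secondary display (R.layout.instrument_cluster is a made-up layout name):

import android.app.Presentation;
import android.content.Context;
import android.hardware.display.DisplayManager;
import android.os.Bundle;
import android.view.Display;

// Renders separate content on the secondary display (the IC) at that
// display's own resolution, instead of mirroring the primary.
public class ClusterPresentation extends Presentation {
    public ClusterPresentation(Context ctx, Display display) {
        super(ctx, display);
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.instrument_cluster); // hypothetical layout
    }
}

// Usage, e.g. from an Activity:
// DisplayManager dm = (DisplayManager) getSystemService(Context.DISPLAY_SERVICE);
// Display[] displays = dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION);
// if (displays.length > 0) new ClusterPresentation(this, displays[0]).show();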
The easiest route might be to ask the vendor directly (at least compared to working through Chinese manuals).
In my Camera2 API project for Android, I want to set a region for my exposure calculation. Unfortunately, it doesn't work. On the other hand, the focus region works without any problems.
Device: Samsung S7 / Nexus 5
1.) Initial values for CONTROL_AF_MODE & CONTROL_AE_MODE
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
2.) Create the MeteringRectangle List
meteringFocusRectangleList = new MeteringRectangle[]{new MeteringRectangle(0, 0, 500, 500, 1000)}; // (x, y, width, height, meteringWeight)
3.) Check if it is supported by the device and set the CONTROL_AE_REGIONS (same for CONTROL_AF_REGIONS)
if (camera2SupportHandler.cameraCharacteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AE) > 0) {
camera2SupportHandler.mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_REGIONS, meteringFocusRectangleList);
}
4.) Tell the camera to start Exposure control
camera2SupportHandler.mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CameraMetadata.CONTROL_AE_PRECAPTURE_TRIGGER_START);
CONTROL_AE_STATE is always CONTROL_AE_STATE_SEARCHING, but the camera doesn't use the configured regions...
After long testing & development I've found an answer.
The coordinate system - Camera 1 API VS Camera 2 API
RED = CAM1; GREEN = CAM2. As shown in the image below, the blue rect marks the coordinates of a possible focus/exposure area for Cam1, which uses a fixed range from (-1000, -1000) to (1000, 1000). With the Camera2 API, you must first query the maximum width and height of the active sensor array, because regions are specified in sensor pixel coordinates. Please find more info here.
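A sketch of that conversion, assuming characteristics is the device's CameraCharacteristics (the helper name is made up):

import android.graphics.Rect;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.params.MeteringRectangle;

// Map a Camera1-style area (both axes run -1000..1000) onto the Camera2
// active sensor array, whose size varies per device.
static MeteringRectangle cam1AreaToCam2(Rect cam1Rect, Rect activeArray, int weight) {
    int left   = activeArray.left + (cam1Rect.left + 1000) * activeArray.width()  / 2000;
    int top    = activeArray.top  + (cam1Rect.top  + 1000) * activeArray.height() / 2000;
    int width  = cam1Rect.width()  * activeArray.width()  / 2000;
    int height = cam1Rect.height() * activeArray.height() / 2000;
    return new MeteringRectangle(left, top, width, height, weight);
}

// Usage:
// Rect active = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
// MeteringRectangle region = cam1AreaToCam2(new Rect(-50, -50, 50, 50), active, 1000);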
Initial values for CONTROL_AF_MODE & CONTROL_AE_MODE: see the question above.
Set the CONTROL_AE_REGIONS: see the question above.
Set the CONTROL_AE_PRECAPTURE_TRIGGER.
// This is how to tell the camera to start AE control:
// first start the repeating preview request with the AE regions applied...
CaptureRequest captureRequest = camera2SupportHandler.mPreviewRequestBuilder.build();
camera2SupportHandler.mCaptureSession.setRepeatingRequest(captureRequest, captureCallbackListener, camera2SupportHandler.mBackgroundHandler);
// ...then set the precapture trigger and fire it in a single capture.
// Note: build() must be called again after setting the trigger; reusing the
// previously built request would send a request without the trigger.
camera2SupportHandler.mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
camera2SupportHandler.mCaptureSession.capture(camera2SupportHandler.mPreviewRequestBuilder.build(), captureCallbackListener, camera2SupportHandler.mBackgroundHandler);
The captureCallbackListener gives feedback on the AE control (and of course on AF control as well).
So this configuration works for most Android phones. Unfortunately, it doesn't work for the Samsung S6/S7. For this reason I've tested their Camera SDK, which can be found here.
After deep investigation I found the config field SCaptureRequest.METERING_MODE. By setting it to SCaptureRequest.METERING_MODE_MANUAL, the AE area works on the Samsung phones as well.
I'll add an example to GitHub ASAP.
Recently I had the same problem and finally found a solution that helped me.
All I needed to do was to step 1 pixel in from the edges of the active sensor rectangle. In your example, instead of this rectangle:
meteringFocusRectangleList = new MeteringRectangle[]{new MeteringRectangle(0, 0, 500, 500, 1000)};
I would use this:
meteringFocusRectangleList = new MeteringRectangle[]{new MeteringRectangle(1, 1, 500, 500, 1000)};
and it started working as magic on both Samsung and Nexus 5!
(note that you should also step 1 pixel from right/bottom edges if you use maximum values there)
It seems that many vendors have poorly implemented this part of the documentation:
"If the metering region is outside the used android.scaler.cropRegion returned in capture result metadata, the camera device will ignore the sections outside the crop region and output only the intersection rectangle as the metering region in the result metadata. If the region is entirely outside the crop region, it will be ignored and not reported in the result metadata."
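A small sketch of the workaround as a reusable helper (hypothetical name), clamping a requested region at least 1 px inside the active array so such HALs don't silently drop it:

import android.graphics.Rect;
import android.hardware.camera2.params.MeteringRectangle;

// Keep the metering region at least 1 px away from every edge of the
// active array, as a workaround for vendor AE-region bugs.
static MeteringRectangle insetMeteringRegion(Rect region, Rect activeArray, int weight) {
    int left   = Math.max(region.left,   activeArray.left + 1);
    int top    = Math.max(region.top,    activeArray.top  + 1);
    int right  = Math.min(region.right,  activeArray.right  - 1);
    int bottom = Math.min(region.bottom, activeArray.bottom - 1);
    return new MeteringRectangle(left, top, right - left, bottom - top, weight);
}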
I'm attempting to measure distance based on acceleration (the phone's accelerometer), if that's even feasible. Here's the code:
Accelerometer {
    id: accel
    dataRate: 1000 / 25 // note: dataRate is in Hz, so this requests 40 Hz
    onReadingChanged: {
        console.log(reading.x, reading.y, reading.z);
    }
}
In the console:
D/libsensor.so(16533): qrc:/main.qml:20 (onReadingChanged): qml: 1.359906554222107,8.791508674621582,-0.4405331015586853
Now, when I log the readings while the phone is completely still (motionless), it shows acceleration on all axes, which is absurd! Any idea why?
That's certainly not absurd.
According to Einstein's widely accepted (but still disturbing) theories, your phone can't tell if it's sitting still on planet Earth or accelerating inside a spaceship in deep space - that's called the "equivalence principle". So it's just assuming it's in an accelerating spaceship, because why not? And that's so much cooler, don't you think?
If you're near (or on) a planet and reading a zero acceleration, that's bad news, because that means you're freefalling in the distorted spacetime around the planet, and you're probably about to hit something.
You're reading an acceleration of about 9 m/s^2, which is close to Earth's g value, so that's approximately right, depending on your phone's orientation. Maybe the accelerometer calibration is not quite right; you can test it with a dedicated application, if you haven't done so already. NB: some apps will compensate for the gravity of Earth.
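On Android you can also read a gravity-compensated stream directly; a minimal Java sketch with the platform's linear-acceleration sensor, which should report ~(0, 0, 0) when the phone is motionless (Qt's Accelerometer has a similar accelerationMode option in newer QtSensors versions, if I recall correctly):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.util.Log;

// TYPE_LINEAR_ACCELERATION = raw accelerometer minus gravity, so a
// motionless phone reads ~(0, 0, 0) instead of ~(0, 0, 9.81).
public class LinearAccelListener implements SensorEventListener {
    public void register(SensorManager sm) {
        Sensor linear = sm.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
        sm.registerListener(this, linear, SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        Log.d("LinearAccel", event.values[0] + " " + event.values[1] + " " + event.values[2]);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}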
Of course, there's also the possibility of bugs in the phone or in Qt or in your code, or hardware failure, but you have to know what to expect.
Hope that helps.
I'm developing an Android application with a camera-related feature.
First of all, I read a lot of material on SO, XDA and so on, so please don't redirect me to other posts.
I am trying to implement something like a "fixed focus mode", so that:
I start my application with FOCUS_MODE_AUTO (or something else);
bring into focus an object at an arbitrary distance;
fix the current focus;
move the camera on another object at a different distance which is out of focus.
I tried different solutions, for example:
mCamera.cancelAutoFocus() in the AutoFocusCallback to prevent the adjustment of the focus;
set a FocusArea: new Camera.Area(new Rect(-50, -50, 50, 50), 1000) to fix the focus on the current area.
I'm targeting API 20 and I'm working on a Samsung Galaxy S5. On this device, the supported focus modes are:
- auto
- infinity
- macro
- continuous-video
- continuous-picture
The suggestion I found most frequently is to recompile Android...
"AUTO" mode doesn't mean that the camera continuously focuses - just that when you call the autoFocus command the focus is done automatically with no indication on what result you expect not like "Macro" or "Infinity".
http://developer.android.com/reference/android/hardware/Camera.html#autoFocus(android.hardware.Camera.AutoFocusCallback)
So if you don't have a loop that calls the autoFocus (as many examples do or call it again in the Callback) your focus should stay after it runs once.
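A minimal sketch of that one-shot pattern (Camera1 API, matching the question's mCamera; the callback deliberately does not re-trigger):

// Focus once and leave the lens where the one-shot autofocus put it.
mCamera.autoFocus(new Camera.AutoFocusCallback() {
    @Override
    public void onAutoFocus(boolean success, Camera camera) {
        // Intentionally empty: not calling camera.autoFocus() again here
        // keeps the focus fixed at the distance just acquired.
    }
});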
If I understand correctly, you want to keep the focus locked on the first object.
Have you tried changing the camera mode to FOCUS_MODE_FIXED after you focus on the first object?
Like this:
Camera.Parameters mParam = mCamera.getParameters();
mParam.setFocusMode(Camera.Parameters.FOCUS_MODE_FIXED);
mCamera.setParameters(mParam);
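Note that FOCUS_MODE_FIXED is not in the supported-modes list quoted in the question for the S5, so it is worth guarding the call:

// FOCUS_MODE_FIXED is not supported on every device, so check first.
Camera.Parameters params = mCamera.getParameters();
if (params.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_FIXED)) {
    params.setFocusMode(Camera.Parameters.FOCUS_MODE_FIXED);
    mCamera.setParameters(params);
}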
I have built and installed the HelloSensor sample app on my Android/SmartWatch 2 devices.
After commenting out sensor.getType().getName().equals(Registration.SensorTypeValue.MAGNETIC_FIELD) to keep the magnetic field values from overwriting the accelerometer display, I was very happy with the result: I clearly saw the "SW2 acceleration - gravity" expression displayed on my SmartWatch (~(0, 0, 9.8) when the watch is laid flat on a table, and ~(0, 9.8, 0) when I hold it vertically).
My problem is that today, whatever orientation I give my SmartWatch, the values no longer change => ~(0, 0, 9.8) is always displayed, even when I hold the watch vertically.
Since it worked fine at first, I wonder whether my sensor is "broken". How can I check this?
Did you try restarting and/or resetting your SW2? You can also try unpairing and re-pairing the watch with the phone.
If it still isn't working, I'm not sure there is much else you can do to check the sensor. If your watch is still under warranty, I would suggest sending it in for repairs.