I would like to get the value of the current gains and change the value of the RGB gains.
In iOS, Apple provides setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:completionHandler.
- (void)setWhiteBalanceGains:(AVCaptureWhiteBalanceGains)gains
{
NSError *error = nil;
if ( [self.captureDevice lockForConfiguration:&error] ) {
AVCaptureWhiteBalanceGains normalizedGains = [self normalizedGains:gains];
[self.captureDevice setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:normalizedGains completionHandler:nil];
[self.captureDevice unlockForConfiguration];
}
else {
NSLog( #"Could not lock device for configuration: %#", error );
}
}
- (AVCaptureWhiteBalanceGains)normalizedGains:(AVCaptureWhiteBalanceGains) g
{
AVCaptureWhiteBalanceGains gains = g;
gains.redGain = MAX(gains.redGain, 1.0f);
gains.greenGain = MAX(gains.greenGain, 3.0f);
gains.blueGain = MAX(gains.blueGain, 18.0f);
return gains;
}
How can we achieve this in Android using CameraX?
COLOR_CORRECTION_GAINS
COLOR_CORRECTION_MODE
I have checked the documentation regarding these controls, but how can we change the color correction and rebuild the CameraX preview with the new values?
You can use Camera2Interop:
fun buildPreview() : Preview {
val builder = Preview.Builder()
val camera2InterOp = Camera2Interop.Extender(builder)
camera2InterOp.setCaptureRequestOption(CaptureRequest.COLOR_CORRECTION_MODE, CameraMetadata.COLOR_CORRECTION_MODE_FAST)
return builder.build()
}
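To actually set manual RGB gains (the equivalent of the iOS deviceWhiteBalanceGains), a rough sketch along the same lines is below. It assumes the device supports disabling auto white balance; buildManualGainsPreview and its parameters are illustrative names, not CameraX API, and the call may require opting in to ExperimentalCamera2Interop.
import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.CaptureRequest
import android.hardware.camera2.params.ColorSpaceTransform
import android.hardware.camera2.params.RggbChannelVector
import androidx.camera.camera2.interop.Camera2Interop
import androidx.camera.core.Preview

fun buildManualGainsPreview(red: Float, green: Float, blue: Float): Preview {
    val builder = Preview.Builder()
    Camera2Interop.Extender(builder).apply {
        // Disable auto white balance so the manual gains are used.
        setCaptureRequestOption(
            CaptureRequest.CONTROL_AWB_MODE,
            CameraMetadata.CONTROL_AWB_MODE_OFF
        )
        // Tell the pipeline to use the application-supplied gains/transform.
        setCaptureRequestOption(
            CaptureRequest.COLOR_CORRECTION_MODE,
            CameraMetadata.COLOR_CORRECTION_MODE_TRANSFORM_MATRIX
        )
        // R / G_even / G_odd / B gains, the Camera2 counterpart of deviceWhiteBalanceGains.
        setCaptureRequestOption(
            CaptureRequest.COLOR_CORRECTION_GAINS,
            RggbChannelVector(red, green, green, blue)
        )
        // Identity colour transform; a real app would typically reuse the value
        // last reported in the capture result.
        setCaptureRequestOption(
            CaptureRequest.COLOR_CORRECTION_TRANSFORM,
            ColorSpaceTransform(intArrayOf(1, 1, 0, 1, 0, 1, 0, 1, 1, 1, 0, 1, 0, 1, 0, 1, 1, 1))
        )
    }
    return builder.build()
}
To apply new gains you would rebuild the Preview like this and re-bind it (unbindAll() followed by bindToLifecycle()). To read the gains the pipeline is currently using, one option is to register a session capture callback through the same Camera2Interop.Extender and read CaptureResult.COLOR_CORRECTION_GAINS from the capture results.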
An older (now deprecated) option is Camera.Parameters#getWhiteBalance:
https://developer.android.com/reference/android/hardware/Camera.Parameters#getWhiteBalance()
Using the Camera.Parameters class, call getWhiteBalance().
The newer way is to use a CaptureRequest: https://developer.android.com/reference/android/hardware/camera2/CaptureRequest
Here is the full documentation of Camera2
https://developer.android.com/reference/android/hardware/camera2/package-summary
I am trying to apply a real-time FFT to sensor data received over Bluetooth LE.
There is an SDK that delivers the sensor data to the Android app through a Handler. I am using a ViewModel to send the sensor data to various parts of the app, plot it with GraphView, and perform the FFT.
I am using JTransforms to perform the FFT; before that I was using JDSP to perform an STFT.
Below is the code used to perform an FFT on the unfiltered raw sensor data using JTransforms:
var t = 0
var fs = 512 // sampling rate (Hz)
var sampleSize = 2 * fs // number of samples fed to each FFT
val windowSize = sampleSize / 2 // overlap kept between consecutive windows
var sample = DoubleArray(sampleSize) // buffer filled by the observer below
private fun getFFT(sample: DoubleArray): Array<DataPoint> {
val fft = DoubleFFT_1D(sampleSize.toLong())
fft.realForward(sample) // in-place real FFT (JTransforms)
return analysed(sample)
}
private fun analysed(sample:DoubleArray): Array<DataPoint> {
val series:Array<DataPoint> = Array(sample.size) { i -> DataPoint(0.0,0.0) }
sample.forEachIndexed { i, y ->
val x = i.toDouble()
series[i] = DataPoint(x, y)
}
return series
}
sensorViewModel.getRaw().observe(this){
if(t<sampleSize-1){
sample[t] = it.toDouble()
t++
}else{
sample = sample.takeLast(windowSize).toDoubleArray().plus(DoubleArray(2*windowSize) { i -> 0.0 })
t = windowSize
// Plot FFT
asyncTask.execute(onPreExecute = {
}, doInBackground = {
getFFT(sample)
}, onPostExecute = {
fftseries.resetData(it)
})
}
}
Although my code runs without crashing, I can see quite a few problems with the app.
Using a sliding window to build "sample" and performing the FFT on it feels really inefficient. Can anyone suggest how I can write this better, with proper control over the window size?
How can I make the FFT plot faster?
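A minimal sketch of one way to get that control, assuming JTransforms' DoubleFFT_1D as in the code above; the SlidingFft class and its names are hypothetical, not part of any library:
import org.jtransforms.fft.DoubleFFT_1D

class SlidingFft(private val windowSize: Int, private val hopSize: Int) {
    private val ring = DoubleArray(windowSize)   // fixed-size ring buffer
    private var writePos = 0
    private var totalWritten = 0L
    private var samplesSinceFft = 0
    private val fft = DoubleFFT_1D(windowSize.toLong())

    /** Push one sample; returns the in-place JTransforms spectrum when a new window is ready. */
    fun push(value: Double): DoubleArray? {
        ring[writePos] = value
        writePos = (writePos + 1) % windowSize
        totalWritten++
        samplesSinceFft++
        if (totalWritten < windowSize || samplesSinceFft < hopSize) return null
        samplesSinceFft = 0
        // Copy the ring into a linear buffer, oldest sample first.
        val window = DoubleArray(windowSize) { i -> ring[(writePos + i) % windowSize] }
        fft.realForward(window) // same call as in the question
        return window
    }
}
In the observer this becomes roughly slidingFft.push(it.toDouble())?.let { spectrum -> fftseries.resetData(analysed(spectrum)) }, so no list copies are made per sample and the window and hop sizes are explicit constructor parameters; the background-thread dispatch stays as before.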
This is a weird issue; I don't know whether it is a bug or something in the library. The issue is with the flash of CameraX.
I assign a global variable flashMode
private var flashMode: Int = ImageCapture.FLASH_MODE_OFF
then set it on the ImageCapture.Builder:
imageCapture = ImageCapture.Builder()
.setCaptureMode(ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY)
.setTargetAspectRatio(screenAspectRatio)
.setFlashMode(flashMode)
.setTargetRotation(rotation)
.build()
And before taking the picture I turn the flash On
if (camera?.cameraInfo?.hasFlashUnit() == true) {
flashMode = ImageCapture.FLASH_MODE_ON
}
But this does not turn the camera flash on; it only sets the flashMode value to 1, i.e. ImageCapture.FLASH_MODE_ON (I checked via logging). Before capturing the image I logged and checked the value of flashMode again and it was 1, but the flash still did not turn on. However, if I set the global variable as
private var flashMode: Int = ImageCapture.FLASH_MODE_ON
then it works and the flash gets turned on.
I have tried this method of turning the flash on after initially setting it to FLASH_MODE_OFF in my previous projects and it worked well, but this time I am not able to understand what is wrong.
Thanks in advance
Try changing the flash mode with
imageCapture.flashMode = ImageCapture.FLASH_MODE_ON
instead of
flashMode = ImageCapture.FLASH_MODE_ON
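In the question's own flow that would look roughly like this (a sketch; outputOptions, cameraExecutor and imageSavedCallback are placeholders, not code from the question):
if (camera?.cameraInfo?.hasFlashUnit() == true) {
    imageCapture.flashMode = ImageCapture.FLASH_MODE_ON // applied to the already-bound use case
}
imageCapture.takePicture(outputOptions, cameraExecutor, imageSavedCallback)
The builder value is only read when the use case is built, so changing the global flashMode variable afterwards has no effect; setting it on the ImageCapture instance does.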
I'm using Java:
private void ativarFlash() {
ic_flash.startAnimation(animationDown);
if (flashMode == FlashMode.ON) {
flashMode = FlashMode.OFF;
} else {
flashMode = FlashMode.ON;
}
startCamera();
}
And in startCamera():
private void startCamera() {
CameraX.unbindAll();
preview = setPreview();
imageCapture = setImageCapture();
//bind to lifecycle:
CameraX.bindToLifecycle(this, preview, imageCapture);
ivBitmap.setImageDrawable(null);
}
I am using CameraX and trying to lock AF and AE using FocusMeteringAction. AF locks fine, but AE doesn't lock. What could be the reason?
camerax_version = "1.1.0-alpha02"
val factory: MeteringPointFactory = previewView.meteringPointFactory
val point: MeteringPoint = factory.createPoint(x, y)
val builder = FocusMeteringAction.Builder(point)
builder.disableAutoCancel()
camerax?.cameraControl?.startFocusAndMetering(builder.build())
The code snippet is simple, and the ListenableFuture from startFocusAndMetering() returns a successful result, but AE is still dynamic and not locked.
The result I am expecting:
Point the app towards something bright (e.g. the sun or a bright light), then lock the exposure. Then move the camera away from the light and everything should stay super dark, showing that the camera is not automatically adjusting the exposure (because it is locked).
My actual result is:
The exposure keeps adjusting and the picture is light/normal.
Would be grateful for any ideas!
Thanks in advance!
It's been a long time, but this might be helpful to someone who ends up here.
I don't know if it is possible using only the CameraX APIs, but I've achieved the expected behaviour with the Camera2 interop APIs. With the snippet below, AF and AE are measured and updated with a tap and stay locked afterwards until another tap occurs.
private void initTapToFocus() {
previewView.setOnTouchListener((v, event) -> {
MeteringPointFactory meteringPointFactory = previewView.getMeteringPointFactory();
MeteringPoint point = meteringPointFactory.createPoint(event.getX(), event.getY());
FocusMeteringAction action = new FocusMeteringAction
.Builder(point, FocusMeteringAction.FLAG_AF)
.addPoint(point, FocusMeteringAction.FLAG_AE)
.disableAutoCancel()
.build();
//Unlock AE before the tap so it can be updated.
lockAe(false, () -> doFocusAndMetering(action));
return true;
});
}
private void doFocusAndMetering(FocusMeteringAction action) {
ListenableFuture<FocusMeteringResult> future = cameraControl.startFocusAndMetering(action);
//Lock AE again when measuring completed
future.addListener(() -> lockAe(true, () -> {}), executor);
}
private void lockAe(boolean lockAe, Runnable doWhenComplete) {
Camera2CameraControl camera2CameraControl = Camera2CameraControl.from(cameraControl);
CaptureRequestOptions options = new CaptureRequestOptions.Builder()
.setCaptureRequestOption(CaptureRequest.CONTROL_AE_LOCK, lockAe)
.build();
camera2CameraControl.setCaptureRequestOptions(options).addListener(doWhenComplete, executor);
}
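Note that Camera2CameraControl and CaptureRequestOptions come from the camera-camera2 interop artifact and are marked experimental in these versions, so the call site may need an @ExperimentalCamera2Interop opt-in.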
There is a kind of weird issue. I am using the oboe library https://github.com/google/oboe for sound playback. You can choose the sound playback output according to the Android settings:
https://developer.android.com/reference/android/media/AudioDeviceInfo
So, if I need to set an exact output channel, I need to pass it to the oboe library.
The output channel that I need is TYPE_BUILTIN_SPEAKER, but on some devices (sometimes, not constantly) I hear the sound from TYPE_BUILTIN_EARPIECE.
Here is how I do it. I have this method to get the id of the needed channel:
fun findAudioDevice(app: Application,
deviceFlag: Int,
deviceType: Int): AudioDeviceInfo?
{
var result: AudioDeviceInfo? = null
val manager = app.getSystemService(Context.AUDIO_SERVICE) as AudioManager
val adis = manager.getDevices(deviceFlag)
for (adi in adis)
{
if (adi.type == deviceType)
{
result = adi
break
}
}
return result
}
Here is how I use it:
val id = getAudioDeviceInfoId(getBuildInSpeakerInfo())
private fun getBuildInSpeakerInfo(): AudioDeviceInfo?
{
return com.tetavi.ar.basedomain.utils.Utils.findAudioDevice( //
getApplication<Application>(), //
AudioManager.GET_DEVICES_OUTPUTS, //
AudioDeviceInfo.TYPE_BUILTIN_SPEAKER //
)
}
private fun getAudioDeviceInfoId(info: AudioDeviceInfo?): Int
{
var result = -1
if (info != null)
{
result = info.id
}
return result
}
And eventually I need to pass this id to the oboe library. Oboe is a native library, so I pass the id over JNI and set it:
oboe::Result oboe_engine::createPlaybackStream()
{
oboe::AudioStreamBuilder builder;
const oboe::SharingMode sharingMode = oboe::SharingMode::Exclusive;
const int32_t sampleRate = mBackingTrack->getSampleRate();
const oboe::AudioFormat audioFormat = oboe::AudioFormat::Float;
const oboe::PerformanceMode performanceMode = oboe::PerformanceMode::PowerSaving;
builder.setSharingMode(sharingMode)
->setPerformanceMode(performanceMode)
->setFormat(audioFormat)
->setCallback(this)
->setSampleRate(sampleRate);
if (m_output_playback_chanel_id != EMPTY_NUM)
{
//set output playback channel (like internal or external speaker)
builder.setDeviceId(m_output_playback_chanel_id); <------------- THIS LINE
}
return builder.openStream(&mAudioStream);
}
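For reference, the Kotlin side of that JNI hand-off might look roughly like this; the class, method and library names are hypothetical, not the project's actual code:
class OboeEngineBridge {
    companion object {
        init { System.loadLibrary("oboe_engine") } // assumed native library name
    }

    /** Implemented in C++; stores the id so createPlaybackStream() can call builder.setDeviceId(...). */
    external fun setPlaybackDeviceId(deviceId: Int)
}
The id returned by getAudioDeviceInfoId() is passed through something like this before the stream is opened.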
So, the actual issue is that on some devices (sometimes, not constantly) I still hear the playback from the internal earpiece TYPE_BUILTIN_EARPIECE, despite directly specifying that I need TYPE_BUILTIN_SPEAKER.
I checked the flow a few times, from the moment I get this id (it is actually 3) up to the moment I set it as a parameter on the oboe builder, but I still sometimes hear sound from the earpiece.
So, the question is: am I missing something here? Is there some trick that should be implemented, or something else?
How can I close a specific Android app using the UiAutomator API?
Like when you manually press the Recents button and swipe away the app you want to close.
A better way (not device, OS version, UI or orientation specific):
Runtime.getRuntime().exec(new String[] {"am", "force-stop", "pkg.name.of.your.app"});
Tested and working on a Nexus 5X with Android 6.0.
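If you are already holding a UiDevice inside an instrumentation test, the same shell command can be issued through UiAutomator itself (assuming API 21+; the package name is a placeholder):
import androidx.test.platform.app.InstrumentationRegistry
import androidx.test.uiautomator.UiDevice

val device = UiDevice.getInstance(InstrumentationRegistry.getInstrumentation())
device.executeShellCommand("am force-stop pkg.name.of.your.app")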
The best option would be to use getUiDevice().pressRecentApps(); this will load up the recent apps for you. Then take a screenshot using the uiautomator viewer so you have a view of the XML of the screen that has been loaded. You can then use this XML to select the object you wish to swipe:
UiObject app = new UiObject(new UiSelector().resourceId("The id of the app"));
app.swipeLeft(100);
or swipeRight(100).
This should be able to close your app. The XML will depend on what style of Android you are using and the device.
This is how I kill all android apps at once with uiautomator:
public static void killApps()
{
UiDevice device = UiDevice.getInstance(InstrumentationRegistry.getInstrumentation());
try
{
device.pressHome();
device.pressRecentApps();
// Clear all isn't always visible unless you scroll all apps down
int height = device.getDisplayHeight();
int width = device.getDisplayWidth();
device.swipe(width/2,height/2, width/2, height, 50);
UiObject clear = device.findObject(new UiSelector()
.resourceId("com.android.systemui:id/button")
.text("CLEAR ALL")
);
if (clear.exists())
{
clear.click();
}
}
catch (RemoteException e)
{
e.printStackTrace();
}
catch (UiObjectNotFoundException e)
{
e.printStackTrace();
}
}
Building on the solution from @user597159, I got the following to close all applications on a Pixel 2 for Firebase Test Lab (i.e. the "walleye" device type):
private void killAllApps() throws Exception {
boolean keepSwiping = true;
int maxSwipeAttempts = 10;
uiDevice.pressRecentApps();
for (int swipeAttempt=0; swipeAttempt<maxSwipeAttempts && keepSwiping; swipeAttempt++) {
int height = uiDevice.getDisplayHeight();
int width = uiDevice.getDisplayWidth();
uiDevice.swipe(width / 2, height / 2, width, height / 2, 50);
UiObject clearAll1 = uiDevice.findObject(new UiSelector().text("Clear all"));
UiObject clearAll2 = uiDevice.findObject(new UiSelector().textStartsWith("Clear all"));
UiObject clearAll3 = uiDevice.findObject(new UiSelector().textContains("Clear all"));
UiObject clear = clearAll1.exists() ? clearAll1 :
(clearAll2.exists() ? clearAll2 : clearAll3);
if (clear.exists()) {
Logger.debug(TAG, "Attempting to close app by killAllApps and found clear=all button on swipeAttempt=" + swipeAttempt);
clear.click();
keepSwiping = false;
} else {
Logger.debug(TAG, "Attempting to close app by killAllApps but have to keep swiping swipeAttempt=" + swipeAttempt);
keepSwiping = true;
}
}
}
Note on the Pixel 2, it is spelled "Clear all" not "CLEAR ALL".
I could not get some of the other solutions to work; I got UiObjectNotFoundException for the following:
app = uiDevice.findObject(new UiSelector().textContains("SDK Test App"));
And also for:
app = uiDevice.findObject(new UiSelector().className(com.locuslabs.android.sdk.SdkTestApplication.class));
In other words, app.exists() returned false for these approaches that attempted to swipe up on the app to close it on the Pixel 2.
When there is just one app in the recent apps list, this worked for me:
if(mDevice.pressRecentApps()) {
Thread.sleep(1000);
int startX = 300; int startY = 835; int endX = 1000; int endY = 835; // coordinates run along the x-axis, from the left of the screen to the right
int steps = 8; // speed at which the app closes
mDevice.swipe(startX,startY,endX,endY,steps);
}
Here's a Kotlin answer that is similar to this answer. It adds an extension function to UiDevice to clear all tasks. I tested this on a Pixel 2 emulator.
fun UiDevice.clearAllTasks(swipeAttempts: Int = 10) {
pressRecentApps()
var currentAttempt = 1
val centerScreenX = displayWidth / 2
val centerScreenY = displayHeight / 2
while (currentAttempt <= swipeAttempts) {
Timber.d("Clear all tasks attempt $currentAttempt")
swipe(centerScreenX, centerScreenY, displayWidth, centerScreenY, 50)
val uiObject = findObject(UiSelector().text("Clear all"))
if (uiObject.exists()) {
uiObject.click()
break
} else {
currentAttempt++
}
}
}
I took a slightly more comprehensive approach to this, as the other answers seemed quite unreliable. I use a lot of custom extensions, so I will try to post most of them:
fun UiDevice.clearTasks(swipes: Int = 2) {
logInfo { message("clearTasks swipes:$swipes") }
pressHome()
wait()
pressRecentApps()
wait()
repeat(swipes) {
if (context.isPortrait)
swipe(centerScreenX, centerScreenY, centerScreenX, 50, 10)
else
swipe(centerScreenX, centerScreenY, 25, centerScreenY, 10)
wait()
}
}
fun UiDevice.wait(seconds: Int = defaultWaitTime) {
logInfo { message("wait:$seconds") }
waitForWindowUpdate(null, seconds * SecondLong)
waitForMoreIdle()
}
fun UiDevice.waitForMoreIdle(times: Int = 3) {
logInfo { message("waitForMoreIdle times:$times") }
repeat(times) { waitForIdle() }
}
val UiDevice.centerScreenX get() = displayWidth / 2
val UiDevice.centerScreenY get() = displayHeight / 2
val Context.isPortrait get() = resources.configuration.orientation == ORIENTATION_PORTRAIT
val context: Context get() = ApplicationProvider.getApplicationContext()
Note: I don't rely on specific text, as it depends on the phone's language, which can vary. I also prefer this custom swipe-count approach, since most of the time I know approximately how many tasks I need to clear. There is a side effect of showing the app launcher, but who cares. I haven't yet found a way to detect that and stop swiping.