I have developed a simple camera application for Android using Flex 4. The problem is that when I run the application it uses the rear camera, not the front one. How can I change which camera is used? I need the front camera for this application; kindly help me.
var camera:Camera = Camera.getCamera(cameraIndex.toString());
if (camera)
{
    var ui:UIComponent = new UIComponent();
    var localVideoDisplay:Video = new Video(180, 135);
    localVideoDisplay.attachCamera(camera);
    ui.addChild(localVideoDisplay);
    cameraGroup.addChild(ui);
}
This is the code I have used in my application.
Try:
function getCamera( position:String ):Camera
{
    var camera:Camera;
    var cameraCount:uint = Camera.names.length;
    for ( var i:uint = 0; i < cameraCount; ++i )
    {
        camera = Camera.getCamera( String(i) );
        if ( camera.position == position )
            return camera;
    }
    return Camera.getCamera();
}
Then use getCamera(CameraPosition.FRONT).
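For example, combined with the question's code (a sketch; it reuses localVideoDisplay from above, and note that CameraPosition lives in flash.media and requires AIR 3 or newer):

var frontCamera:Camera = getCamera( CameraPosition.FRONT );
if ( frontCamera )
    localVideoDisplay.attachCamera( frontCamera );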
I would like to read the current gain values and change the RGB gains.
In iOS, Apple provides setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:completionHandler: for this:
- (void)setWhiteBalanceGains:(AVCaptureWhiteBalanceGains)gains
{
    NSError *error = nil;
    if ( [self.captureDevice lockForConfiguration:&error] ) {
        AVCaptureWhiteBalanceGains normalizedGains = [self normalizedGains:gains];
        [self.captureDevice setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:normalizedGains completionHandler:nil];
        [self.captureDevice unlockForConfiguration];
    }
    else {
        NSLog( @"Could not lock device for configuration: %@", error );
    }
}

- (AVCaptureWhiteBalanceGains)normalizedGains:(AVCaptureWhiteBalanceGains)g
{
    AVCaptureWhiteBalanceGains gains = g;
    gains.redGain = MAX(gains.redGain, 1.0f);
    gains.greenGain = MAX(gains.greenGain, 3.0f);
    gains.blueGain = MAX(gains.blueGain, 18.0f);
    return gains;
}
How can we achieve this in Android using CameraX?
COLOR_CORRECTION_GAINS
COLOR_CORRECTION_MODE
I have checked the documentation regarding these channel controls, but how can we change the color correction and reset the CameraX preview with the new controls?
You can use Camera2Interop:
fun buildPreview(): Preview {
    val builder = Preview.Builder()
    val camera2InterOp = Camera2Interop.Extender(builder)
    camera2InterOp.setCaptureRequestOption(CaptureRequest.COLOR_CORRECTION_MODE, CameraMetadata.COLOR_CORRECTION_MODE_FAST)
    return builder.build()
}
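To change the gain values themselves (what the question asks about), here is a sketch along the same lines. It assumes auto white balance has to be switched off for manual gains to take effect; buildPreviewWithGains and the gain arguments are illustrative names, not part of the original code:

import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.CaptureRequest
import android.hardware.camera2.params.RggbChannelVector
import androidx.camera.camera2.interop.Camera2Interop
import androidx.camera.core.Preview

// Sketch: apply manual RGB gains via Camera2Interop.
// Camera2Interop is an experimental API, so the caller may need
// @OptIn(ExperimentalCamera2Interop::class).
fun buildPreviewWithGains(red: Float, blue: Float): Preview {
    val builder = Preview.Builder()
    Camera2Interop.Extender(builder).apply {
        // Manual color correction is ignored while auto white balance is on
        setCaptureRequestOption(
            CaptureRequest.CONTROL_AWB_MODE,
            CameraMetadata.CONTROL_AWB_MODE_OFF
        )
        setCaptureRequestOption(
            CaptureRequest.COLOR_CORRECTION_MODE,
            CameraMetadata.COLOR_CORRECTION_MODE_TRANSFORM_MATRIX
        )
        // RggbChannelVector(red, greenEven, greenOdd, blue)
        setCaptureRequestOption(
            CaptureRequest.COLOR_CORRECTION_GAINS,
            RggbChannelVector(red, 1.0f, 1.0f, blue)
        )
    }
    return builder.build()
}

To apply new values at runtime, build a new Preview with the updated options and rebind it (unbind the old use case, then call cameraProvider.bindToLifecycle with the new one); that is effectively how the preview gets "reset" with new controls.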
An old, but still used, approach is the Camera.Parameters#getWhiteBalance method:
https://developer.android.com/reference/android/hardware/Camera.Parameters#getWhiteBalance()
Using the Camera.Parameters class, call getWhiteBalance (and setWhiteBalance to change it).
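For illustration, a minimal sketch with this deprecated API (note that it only exposes preset white-balance modes such as "daylight", not raw per-channel RGB gains):

import android.hardware.Camera
import android.util.Log

// Minimal sketch using the deprecated android.hardware.Camera API.
fun readAndSetWhiteBalance() {
    val camera = Camera.open()
    val params = camera.parameters
    Log.d("WB", "current white balance: ${params.whiteBalance}") // e.g. "auto"
    if (params.supportedWhiteBalance?.contains(Camera.Parameters.WHITE_BALANCE_DAYLIGHT) == true) {
        params.whiteBalance = Camera.Parameters.WHITE_BALANCE_DAYLIGHT
        camera.parameters = params
    }
    camera.release()
}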
The newer way is to use a capture request:
https://developer.android.com/reference/android/hardware/camera2/CaptureRequest
Here is the full documentation of Camera2:
https://developer.android.com/reference/android/hardware/camera2/package-summary
I am developing a barcode reader with Xamarin.Forms, and I'm trying to scan images on an Android device.
First I select an image from the gallery with the Xamarin.Essentials MediaPicker, and from the path of this image I build an RGBLuminanceSource with a dependency class.
Then I try to decode this RGBLuminanceSource with the Decode() method of the ZXing BarcodeReaderGeneric class.
The application successfully decodes the barcodes in some images, but sometimes Decode() returns null. I might have made a mistake while converting the image to a Bitmap or creating the RGBLuminanceSource.
I would like to find out what a class that can decode color, black-and-white, and grayscale images should look like.
public RGBLuminanceSource GetRGBLuminanceSource(string imagePath)
{
    if (File.Exists(imagePath))
    {
        Android.Graphics.Bitmap bitmap = BitmapFactory.DecodeFile(imagePath);
        List<byte> rgbBytesList = new List<byte>();
        for (int y = 0; y < bitmap.Height; y++)
        {
            for (int x = 0; x < bitmap.Width; x++)
            {
                var c = new Android.Graphics.Color(bitmap.GetPixel(x, y));
                rgbBytesList.AddRange(new[] { c.A, c.R, c.G, c.B });
            }
        }
        byte[] rgbBytes = rgbBytesList.ToArray();
        return new RGBLuminanceSource(rgbBytes, bitmap.Width, bitmap.Height, RGBLuminanceSource.BitmapFormat.RGB32);
    }
    return null;
}
Command in the ViewModel class:
public ICommand PickCommand => new Command(PickImage);

private async void PickImage()
{
    var pickResult = await MediaPicker.PickPhotoAsync(new MediaPickerOptions
    {
        Title = "Select a barcode."
    });
    var path = pickResult.FullPath;
    var RGBLuminance = DependencyService.Get<ILuminance>().GetRGBLuminanceSource(path);
    var reader = new BarcodeReaderGeneric();
    var result = reader.Decode(RGBLuminance);
}
I am using this code in Xamarin.Android and I have never had issues with it:
var scanner = new MobileBarcodeScanner();
var result = await scanner.Scan(_context, MobileBarcodeScanningOptions.Default);
It opens the camera, the user takes a picture of the barcode, and result.Text contains the scanned barcode.
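A side note on the null results in the question's code: the pixel bytes are appended in A, R, G, B order, but the source is declared as BitmapFormat.RGB32, so ZXing may read the channels shifted. If you keep the manual conversion, declaring the format that actually matches the byte order might help (an untested guess, not a confirmed fix):

// Hypothetical fix: the bytes above are written as A,R,G,B,
// so declare ARGB32 instead of RGB32.
return new RGBLuminanceSource(rgbBytes, bitmap.Width, bitmap.Height,
    RGBLuminanceSource.BitmapFormat.ARGB32);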
I am using Ionic 2 with WebRTC to get a video stream from both the front and rear cameras.
Please see my TypeScript code below:
if (this.isFrontCam) {
    constraints = {
        mandatory: {},
        optional: [{ sourceId: this.cameras[0] }]
    };
} else {
    constraints = {
        mandatory: {},
        optional: [{ sourceId: this.cameras[1] }]
    };
}

if (this.currentVideoStream && this.currentVideoStream != null) {
    this.currentVideoStream.getTracks().forEach(function (track) {
        track.stop();
    });
    this.currentVideoStream.release();
    this.currentVideoStream = null;
}
var n = <any>navigator;
n.getUserMedia = n.getUserMedia || n.webkitGetUserMedia || n.mozGetUserMedia || n.msGetUserMedia;

// getting local video stream
n.getUserMedia({
    audio: true,
    video: constraints
}, function (myStream) {
    alert("Current video stream " + self.currentVideoStream);
    self.currentVideoStream = myStream;
    alert("New stream " + myStream);
    // displaying local video stream on the page
    (<HTMLVideoElement>document.getElementById('localVideo')).src = window.URL.createObjectURL(myStream);
});
I am getting two different camera device IDs (front and rear) in this.cameras, and if I use them individually both cameras work as expected. But when I flip between them using the code above (a button click in the UI calls this function), they stop working; the page simply shows a black screen.
I was able to solve this by removing the two lines below:
this.currentVideoStream.release();
this.currentVideoStream = null;
and adding this line:
(<HTMLVideoElement>document.getElementById('localVideo')).play();
This fits with the fact that MediaStream has no release() method in the spec, so the removed call most likely threw an error and broke the handler.
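For reference, a sketch of the same flip logic against the standards-based API (this assumes navigator.mediaDevices is available and uses srcObject, since createObjectURL(MediaStream) has since been deprecated; switchCamera is an illustrative name):

// Stop the old tracks, request the chosen camera, and show the new stream.
async function switchCamera(deviceId: string, current: MediaStream | null): Promise<MediaStream> {
    if (current) {
        current.getTracks().forEach(track => track.stop());
    }
    const stream = await navigator.mediaDevices.getUserMedia({
        audio: true,
        video: { deviceId: { exact: deviceId } }
    });
    const video = document.getElementById('localVideo') as HTMLVideoElement;
    video.srcObject = stream;
    await video.play();
    return stream;
}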
I built an application that records sound on the desktop using ActionScript 3. Now I have converted it to an Android application, but there is a problem: the SampleDataEvent.SAMPLE_DATA event doesn't receive any data to record.
Here is the code :
private var _microphone:Microphone;
private var _buffer:ByteArray = new ByteArray();
private var _difference:uint;

public function record():void
{
    if ( _microphone == null )
        _microphone = Microphone.getMicrophone();
    _difference = getTimer();
    _microphone.setSilenceLevel(_silenceLevel, _timeOut);
    _microphone.gain = _gain;
    _microphone.rate = _rate;
    _buffer.length = 0;
    _microphone.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
    _microphone.addEventListener(StatusEvent.STATUS, onStatus);
}
private function onSampleData(event:SampleDataEvent):void
{
    _recordingEvent.time = getTimer() - _difference;
    dispatchEvent( _recordingEvent );
    var sampleData:Number;
    while( event.data.bytesAvailable > 0 )
    {
        sampleData = event.data.readFloat();
        _buffer.writeFloat( sampleData );
        soundBytes.writeFloat( sampleData );
    }
}
Can anyone help here?
Thanks.
I think maybe you didn't check the AIR for Android settings. If RECORD_AUDIO is not checked, you should check it.
Refer to the image below.
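If you edit the application descriptor by hand instead of using the Flash Builder permissions UI, the same permission goes into the <android> block (a sketch of the relevant part):

<android>
    <manifestAdditions><![CDATA[
        <manifest>
            <uses-permission android:name="android.permission.RECORD_AUDIO"/>
        </manifest>
    ]]></manifestAdditions>
</android>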
I'm currently working with AS3 and Flex 4.6 to create an Android application.
I'm using the front camera and attaching it to a local Video object that I add as a child to a VideoDisplay object.
When I debug on my computer everything works perfectly, but when I build the project and run it on my Android device, my local video display becomes a gray grid.
As an example, I took a picture of the device.
I wrote this method, based on a post here on Stack Overflow, to initialize the front and back cameras:
private function InitCamera():void {
    var CamCount:int = ( Camera.isSupported ) ? Camera.names.length : 0;
    for( var i:int = 0; i < CamCount; i++ ) {
        var cam:Camera = Camera.getCamera( String( i ) );
        if( cam ) {
            if( cam.position == CameraPosition.FRONT ) {
                CamFront = cam;
                continue;
            }
            if( cam.position == CameraPosition.BACK ) {
                CamBack = cam;
                continue;
            }
            if( cam.position == CameraPosition.UNKNOWN ) {
                CamFront = cam;
                continue;
            }
        }
    }
}
And I wrote this method to create a Video object, attach the front camera as the default, and add the Video as a child to a VideoDisplay:
private function SetUpLocalVideo():void {
    Debug( "Setting up local video" );
    LocalVideo = new Video( this.LVideo.width, this.LVideo.height );
    LocalVideo.attachCamera( CamFront );
    LVideo.addChild( LocalVideo ); // LVideo is the VideoDisplay
}
I've been searching the internet for a solution, but so far I have failed to find one.
Has anyone else had this problem before? Can you share your solutions with me?
I appreciate the help.
Thanks.
Set the render mode to direct in your application.xml:
<renderMode>direct</renderMode>
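In a typical AIR application descriptor, that tag sits inside the initialWindow element (sketch):

<initialWindow>
    <renderMode>direct</renderMode>
</initialWindow>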
If it still doesn't work, change the DPI setting of your main Flex application to 240.