Recently I went through the code for accessing the camera using Flash ActionScript 3, and I have tested it on an iMac, an iPhone, and Android. Based on this, I am developing an Android application that needs to access the front camera. My problem is that I don't know how to access the front camera. Do we need different code, or should we specify which camera to access? First of all, can we access the front camera through Flash at all?
I made a simple Android app. Here is the code for the camera-selection window:
public class SelectCameraAlertAndroid extends StartAlertAndroid_design {
    public function SelectCameraAlertAndroid() {
        frontCameraButton.addEventListener(MouseEvent.CLICK, onFrontCamera);
        backCameraButton.addEventListener(MouseEvent.CLICK, onBackCamera);
    }

    private function onFrontCamera(event:MouseEvent):void {
        // On Android, camera names are the native camera indices as strings; "1" is usually the front camera.
        Model.model.camera = Camera.getCamera("1");
        Model.model.cameraSelectedSignal.dispatch();
        dispatchEvent(new Event("closeMe"));
    }

    private function onBackCamera(event:MouseEvent):void {
        // "0" is usually the rear-facing camera.
        Model.model.camera = Camera.getCamera("0");
        Model.model.cameraSelectedSignal.dispatch();
        dispatchEvent(new Event("closeMe"));
    }
}
Not true. You can access the front camera on Android.
The only limitation is that you don't get to use CameraUI (pretty sure).
var camera:Camera = Camera.getCamera("1"); // "1" is typically the front camera on Android
camera.setMode(stage.stageWidth, stage.stageHeight, 30, true);

var video:Video = new Video(stage.stageWidth, stage.stageHeight);
video.attachCamera(camera);
addChild(video);
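For background: the camera names AIR reports on Android are just the native camera indices, which is why "0" is usually the back camera and "1" the front one. A minimal native-side Java sketch (not AIR code; it uses the android.hardware.Camera enumeration available since API 9) showing how those indices map to a facing:

import android.hardware.Camera;

// Logs which native camera index faces front and which faces back.
// AIR's Camera.getCamera("0") / Camera.getCamera("1") resolve to these indices.
public final class CameraIndexDump {
    public static void dump() {
        int count = Camera.getNumberOfCameras();
        for (int i = 0; i < count; i++) {
            Camera.CameraInfo info = new Camera.CameraInfo();
            Camera.getCameraInfo(i, info);
            String facing = (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) ? "front" : "back";
            android.util.Log.d("CameraIndexDump", "camera " + i + " faces " + facing);
        }
    }
}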
Note: This answer is outdated. Please refer to the other answers for updated information.
Currently, AIR only supports access to the primary camera on an Android device.
http://forums.adobe.com/thread/849983
Official documentation: http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/media/Camera.html#getCamera()
"On Android devices, you can only access the rear-facing camera."
I'm using expo-image-picker to let the user pick and take pictures. Choosing an image from the library works as expected, but when using the camera, the app crashes after the picture is taken.
Here is my code:
const take = async () => {
  let result = await ImagePicker.launchCameraAsync({
    mediaTypes: ImagePicker.MediaTypeOptions.Images,
    quality: 1,
  });
  console.log(result.uri);
  if (!result.cancelled) {
    setImageUri(result.uri);
  }
};
I think I have all the required permissions, and the problem seems to occur only on older Android devices with limited memory. Any ideas?
There is no good and easy solution.
What is happening is that when you launch the camera, it launches as a new Activity, leaving your app's main Activity in the background.
From Android 9 (P) onward, the OS can kill your backgrounded app. There is even a website ranking which manufacturers' implementations are worst for devs: https://dontkillmyapp.com/
Possible solutions:
Be prepared for the crash: save all state and the navigation stack to local storage before calling launchCameraAsync. The app will restart; restore the screens/navigation/data, and then fetch the photo using ImagePicker.getPendingResultAsync.
Switch to expo-camera or react-native-vision-camera: they run the camera inside your main Activity, but they require tedious reimplementation and don't look as good as the manufacturers' native camera apps.
I'm using the Agora Android native SDK 3.3.0. I used this project as a reference: SwitchCameraScreenShare.
I get a black screen (no video frame updates) when I switch the SDK video source to the screen-sharing source using mRtcEngine.setVideoSource(screenShareVideoSource). When I switch back to the default source with mRtcEngine.setVideoSource(AgoraDefaultSource()), the camera comes back.
This is the video source I use for screen share: ExternalVideoInputManager. This is the actual screen-share input from Android's MediaProjectionManager: ScreenShareInput.
I think the problem is in how I start the screen share. Before I start the screen-share service (I use an Android Service), I first initialize the SDK and then join a channel, like below:
fun initializeAgoraEngine(context: Context) {
    try {
        appId = context.getString(R.string.agora_app_id)
        mRtcEngine = RtcEngine.create(context, appId, this)
    } catch (e: Exception) {
        throw RuntimeException("NEED TO check rtc sdk init fatal error\n" + Log.getStackTraceString(e))
    }
    setupVideoProfile()
    agoraEvents?.onAgoraInitialized()
}

private fun setupVideoProfile() {
    mRtcEngine!!.enableVideo()
    mRtcEngine!!.setVideoEncoderConfiguration(
        VideoEncoderConfiguration(
            VideoEncoderConfiguration.VD_640x360,
            VideoEncoderConfiguration.FRAME_RATE.FRAME_RATE_FPS_15,
            VideoEncoderConfiguration.STANDARD_BITRATE,
            ORIENTATION_MODE.ORIENTATION_MODE_FIXED_PORTRAIT
        )
    )
}
Then, after I start the screen-share service and set the screen-share video source, I remove the old video TextureView from the video FrameLayout and add a new one, like this:
if (binding.myVideoview.childCount > 0) {
    binding.myVideoview.removeAllViews()
}

val textureView = RtcEngine.CreateTextureView(this)
binding.myVideoview.addView(textureView)
AgoraHelper.setLocalVideo(textureView)
But it shows a black screen, and the video frame on the other device connected to the channel freezes. I've granted the CAMERA and RECORD_AUDIO permissions. Please help me figure out what I'm doing wrong, thanks.
This is the situation: we have an Android game that has been published for a few years now, originally developed in ActionScript 3 and built with Adobe AIR...
The problem we are facing is that, because ActionScript 3 is basically unsupported by most advertising and analytics SDKs, we need to move to something newer without losing our current player base. So we started developing a clone of the current game in Unity3D, found a way to sign it with the old .p12, and the Play Store console lets us upload it and everything.
Now to the real question: the last remaining hurdle is finding a way to preserve users' saved games when they update to the Unity version. We use a SharedObject to store all user preferences and progress.
I have searched forums, here on SO and on the Unity3D forums, and found no one asking a similar question.
We would like to know if there is a way for another app with the same signature and package name (it is basically an update to the Adobe AIR app) to read the SharedObject from the Adobe AIR app and parse it, so that it can create a local save in Unity3D (with PlayerPrefs or any other method).
The code we use to save and load progress from the SharedObject is this:
public function LoadVal(_id:String, _defaultValue:Object = null):Object
{
    if (existSO(_id))
    {
        return loadSO(_id);
    }
    else
    {
        return _defaultValue;
    }
}

private function loadSO(id:String):Object
{
    mySO = SharedObject.getLocal(nameSO);
    return mySO.data[id];
}

public function existSO(id:String):Boolean
{
    mySO = SharedObject.getLocal(nameSO);
    return mySO.data[id] != null;
}

public function SaveVal(id:String, val:Object):void
{
    mySO = SharedObject.getLocal(nameSO);
    mySO.data[id] = val;
    mySO.flush();
}

public function clearSO():void
{
    mySO = SharedObject.getLocal(nameSO);
    mySO.clear();
}
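One avenue to explore (a sketch, not a tested solution): AIR persists local SharedObjects as AMF-serialized .sol files inside the app's private data directory, and since the Unity build keeps the same package name, it inherits that sandbox and can read the old file directly. The exact subdirectory varies by AIR version (commonly something like Local Store/#SharedObjects/<name>.sol), so the sketch below searches for the file by name instead of hard-coding the path; soName is the same name your AS3 code passes to SharedObject.getLocal:

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

// Hypothetical sketch: locate the old AIR .sol file inside this app's own
// data directory (same package name means same sandbox) and read its raw bytes.
// Decoding the bytes still requires an AMF deserializer.
public final class SolLocator {

    // Recursively search the data directory for "<soName>.sol".
    public static File findSol(File dir, String soName) {
        File[] children = dir.listFiles();
        if (children == null) return null;
        for (File child : children) {
            if (child.isDirectory()) {
                File found = findSol(child, soName);
                if (found != null) return found;
            } else if (child.getName().equals(soName + ".sol")) {
                return child;
            }
        }
        return null;
    }

    // Read the raw .sol bytes, or return null if the file was not found.
    public static byte[] readSol(File dataDir, String soName) throws IOException {
        File sol = findSol(dataDir, soName);
        if (sol == null) return null;
        byte[] buf = new byte[(int) sol.length()];
        try (FileInputStream in = new FileInputStream(sol)) {
            int off = 0;
            while (off < buf.length) {
                int n = in.read(buf, off, buf.length - off);
                if (n < 0) throw new IOException("unexpected end of file");
                off += n;
            }
        }
        return buf;
    }
}

You could call this from Unity through an Android plugin, passing the app's data directory (the parent of getFilesDir()) as dataDir. The payload is AMF-encoded (AMF3 by default for AS3), so the values written by SaveVal need an AMF deserializer before they can be copied into PlayerPrefs.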
I'm trying to build a QR code reader following this tutorial:
http://code.tutsplus.com/tutorials/android-sdk-create-a-barcode-reader--mobile-17162
I managed to get everything working, except that I need the camera to be my device's front camera instead of the rear camera. I can't find anything in the tutorial that allows me to change this. I tried following this answer, but I still could not get it to work.
Mainly, my issue is with importing the library. I get the following error:
'<>' operator is not allowed for source level below 1.7
When I set my compiler settings to 1.7, I get this:
Android requires compiler compliance level 5.0 or 6.0. Found '1.7' instead
I'm not very proficient with Android, and I apologize if this is not a good question.
So, is there any way for me to use ZXing with the front camera in my app? Any links?
Thank you very much.
The source code uses Java 7. Android does not require Java 6 or earlier; you can see that the build provided in the project happily feeds Java 7 bytecode to dex and produces a working app. I am not sure what tool you are using that suggests otherwise; maybe it is old.
You should not need to copy and compile the project's code, though. Why is that necessary? Use the core.jar file.
In fact, you don't need any of this to use the front camera. Just invoke the scanner by Intent (https://github.com/zxing/zxing/wiki/Scanning-Via-Intent) and set the extra SCAN_CAMERA_ID to the ID of the camera you want, usually 1 for the front one.
Example:
intent.putExtra("SCAN_MODE", "QR_CODE_MODE");
intent.putExtra("SCAN_CAMERA_ID", 1);
If you use IntentIntegrator, you can use setCameraId() to specify the front camera:
IntentIntegrator integrator = new IntentIntegrator(yourActivity);
integrator.setCameraId(1);
integrator.initiateScan();
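To read the scan back, you can parse the result in onActivityResult; a minimal sketch using IntentIntegrator's companion parser:

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    // parseActivityResult returns null if the result did not come from a ZXing scan.
    IntentResult scan = IntentIntegrator.parseActivityResult(requestCode, resultCode, data);
    if (scan != null && scan.getContents() != null) {
        String qrText = scan.getContents(); // the decoded QR payload
        // ... use qrText ...
    }
}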
After quite a bit of searching, I found out how to use the front camera. There is this piece of code in com.google.zxing.client.android.camera.CameraManager.java:
public void openDriver(SurfaceHolder holder) throws IOException {
    Camera theCamera = camera;
    if (theCamera == null) {
        theCamera = Camera.open(); // change this to Camera.open(1) to open the front camera
        if (theCamera == null) {
            throw new IOException();
        }
        camera = theCamera;
    }
    theCamera.setPreviewDisplay(holder);
}
Just change the Camera.open() to Camera.open(1), as noted in the comment above.
It worked fine for me.
Can anyone tell me how to check whether an Android phone also has a front camera? I tried to use some help from https://docs.google.com/View?id=dhtsnvs6_57d2hpqtgr, but Camera camera = FrontFacingCamera.getFrontFacingCamera(); sometimes works and sometimes doesn't.
Any help, please?
Can anyone tell me how to check whether an Android phone also has a front camera?
There is no API for this, at least through Android 2.2. Sorry!
I tried to use some help from https://docs.google.com/View?id=dhtsnvs6_57d2hpqtgr, but Camera camera = FrontFacingCamera.getFrontFacingCamera(); sometimes works and sometimes doesn't.
That code is for two specific models of phone, not for Android devices in general. With luck, the upcoming Gingerbread release will add built-in support for front-facing cameras.
In the meantime, you need to get the instructions (like the ones you linked to) from each and every device manufacturer and attempt to follow them.
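An update for later releases: Gingerbread (API 9) did add camera enumeration, so on API 9 and above you can check for a front camera directly. A minimal sketch:

import android.hardware.Camera;

// Helper: returns true if the device reports at least one front-facing camera.
// Requires API 9+ (android.hardware.Camera enumeration).
public final class FrontCameraChecker {
    public static boolean hasFrontCamera() {
        int count = Camera.getNumberOfCameras();
        for (int i = 0; i < count; i++) {
            Camera.CameraInfo info = new Camera.CameraInfo();
            Camera.getCameraInfo(i, info);
            if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
                return true;
            }
        }
        return false;
    }
}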
Here is a related check, using the Camera2 API (API 21+), for whether the second camera (usually the front one) has a flash:
private boolean hasFlash() {
    // CameraManager is available from API 21 (Lollipop) on.
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
        try {
            if (camManager == null) {
                camManager = (CameraManager) getSystemService(CAMERA_SERVICE);
            }
            String[] ids = camManager.getCameraIdList();
            if (ids.length < 2) {
                return false; // no second camera to query
            }
            // Index 1 is usually the front camera, but this is not guaranteed;
            // check CameraCharacteristics.LENS_FACING to be certain.
            CameraCharacteristics characteristics = camManager.getCameraCharacteristics(ids[1]);
            Boolean available = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
            return available != null && available;
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    return false;
}