We are developing an Android app to be used on one of the two available Android cameras, specifically the Nikon S800C.
We are able to use digital zoom, but as yet we have not found a way of controlling the hardware (optical) zoom, which is the whole point of using a camera rather than a basic smartphone.
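For context, the digital zoom we do have working is just the standard Camera.Parameters call; a sketch, assuming `camera` is our open android.hardware.Camera instance and `requestedZoom` is whatever our UI asks for:

```java
Camera.Parameters params = camera.getParameters();
if (params.isZoomSupported()) {
    // Clamp to the reported maximum. On the S800C this appears to drive
    // digital zoom only; we have found no parameter that moves the zoom lens.
    params.setZoom(Math.min(requestedZoom, params.getMaxZoom()));
    camera.setParameters(params);
}
```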
We cannot use the built-in camera app because we need a custom camera overlay and preview screen, so we have had to write our own camera app.
From what I have been advised, it is not technically possible to control the camera's hardware zoom from within our own app.
So my question is: is this really the case? Is there absolutely no way of building a custom camera app that offers a custom capture experience and also controls the camera's HARDWARE zoom?
If it is the case, is there a way of creating a custom preview and overlay screen for the existing built-in camera app on our device? That would remove the need to build our own.
Thanks.
Related
I am an iOS developer and a newbie to Android app development. I am currently building an Android app where I have to open the camera preview and show a marker in front of the user. As the user pans the camera, the marker should stay fixed at one point in the world and should be visible only when the camera is pointed at that region.
I know this can be done very easily on iOS using ARKit (https://www.raywenderlich.com/172543/augmented-reality-and-arkit-tutorial).
I need some pointers on how to implement this on Android. Should I use ARCore, or is it possible with just the Camera API and a SurfaceView?
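To make the non-ARCore option concrete, here is the kind of sensor-only math I have in mind (a sketch: the device azimuth would come from SensorManager.getOrientation() on the rotation-vector sensor, the field of view from Camera.Parameters.getHorizontalViewAngle(), and all the names here are mine):

```java
// Places a world-fixed marker on the camera preview using only the
// device orientation, with no ARCore involved.
class MarkerProjector {
    /** Normalizes an angle in degrees to the range (-180, 180]. */
    static double normalize(double deg) {
        double a = deg % 360.0;
        if (a > 180.0) a -= 360.0;
        if (a <= -180.0) a += 360.0;
        return a;
    }

    /**
     * Returns the marker's horizontal screen position in pixels, or -1 if
     * the marker lies outside the camera's horizontal field of view.
     */
    static double screenX(double deviceAzimuthDeg, double markerBearingDeg,
                          double horizontalFovDeg, int previewWidthPx) {
        double offset = normalize(markerBearingDeg - deviceAzimuthDeg);
        double halfFov = horizontalFovDeg / 2.0;
        if (Math.abs(offset) > halfFov) return -1; // marker not in view
        // Linear mapping: dead ahead lands at the center of the preview.
        return previewWidthPx / 2.0 + (offset / halfFov) * (previewWidthPx / 2.0);
    }
}
```

The same idea extends to the vertical axis using the device pitch; ARCore would instead anchor the marker to a tracked pose, which is more robust but heavier.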
Thanks!
Is it possible to use the device's front-facing camera with ARCore? I see no references in Google's docs.
I can't find any reference in the docs. Using the front camera would also make the whole AR process much more complicated: you would have to invert the logic for moving the camera, and recognizing planes would be much harder because the user is always in the way. Right now, ARCore only recognizes planes, so you can't detect feature points in, say, the user's face.
The answer is: Yes.
With a front-facing camera (without a depth sensor) on any supported Android device, you have been able to track the user's face since the ARCore 1.7 SDK release. The API for working with the front-facing camera is called Augmented Faces. You can now create a high-quality, 468-point 3D mesh that your Android app can overlay on a user's face to add fun animated effects.
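A minimal sketch of the setup in Java, assuming an `activity` reference and the usual ARCore session lifecycle around it:

```java
// Create a session bound to the front-facing camera (ARCore 1.7+).
Session session = new Session(activity, EnumSet.of(Session.Feature.FRONT_CAMERA));

// Enable the Augmented Faces mesh.
Config config = new Config(session);
config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
session.configure(config);

// Each frame, tracked faces expose the 468-vertex mesh.
for (AugmentedFace face : session.getAllTrackables(AugmentedFace.class)) {
    if (face.getTrackingState() == TrackingState.TRACKING) {
        FloatBuffer vertices = face.getMeshVertices(); // x, y, z per vertex
    }
}
```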
I am making an area-aware Unity app on Tango, and it works fine until I add another camera. It uses an ADF for location awareness and the default AR Camera prefab, so it shows what the device's camera sees. Fine. Now I want to add a mini-map in the upper corner, but simply adding another camera makes the app crash. I have tried:
- perspective vs. orthographic projection
- rendering to a texture
- various 'clear' flags
- checking my layers
- no culling or anything fancy
It really is that simple: if I enable the camera, it crashes; if I disable it, it works.
Does anyone have any insights into this?
Thanks
I need to modify certain properties of the way this app takes and processes a picture, yet I would like to use the standard controls Android provides for taking pictures, so that I don't have to code them all myself. Is this possible when working with a SurfaceView and the Camera API? I mean things like zoom, flash, choosing the camera, and so on.
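For reference, the specific features mentioned (zoom, flash, camera choice) map onto the deprecated android.hardware.Camera API roughly like this; the on-screen buttons themselves still have to be built and wired up by hand. A sketch, where `desiredZoom` is illustrative:

```java
// "Choose camera": open by id. Id 0 is conventionally the back camera;
// enumerate Camera.getNumberOfCameras() and Camera.CameraInfo to pick properly.
Camera camera = Camera.open(0);
Camera.Parameters params = camera.getParameters();

// "Zoom": clamp a requested level to the device maximum (digital zoom).
if (params.isZoomSupported()) {
    params.setZoom(Math.min(desiredZoom, params.getMaxZoom()));
}

// "Flash": only set a mode the device actually reports as supported.
List<String> modes = params.getSupportedFlashModes();
if (modes != null && modes.contains(Camera.Parameters.FLASH_MODE_AUTO)) {
    params.setFlashMode(Camera.Parameters.FLASH_MODE_AUTO);
}
camera.setParameters(params);
```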
I have been struggling to create a camera preview with a custom layout/overlay that mimics the functionality of the native camera app.
Is there any way I can simply constrain the size of the native camera preview, and possibly overlay a grid image on top of it, without having to fully rewrite all of the camera's functionality?
No. You cannot hack the UI of another app, any more than another app can hack your UI.
Also, bear in mind that there is no single "native camera app". There are dozens, perhaps hundreds, across the thousands of Android device models.
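The usual alternatives are to build the whole preview yourself or to delegate capture entirely. A sketch of the delegation route, which hands off to whichever camera app the user has; you get a photo back but no say in that app's UI, so no grid overlay:

```java
// Inside an Activity: fire an image-capture intent and let the user's
// chosen camera app take the picture.
static final int REQUEST_IMAGE_CAPTURE = 1;

void dispatchTakePicture() {
    Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    if (intent.resolveActivity(getPackageManager()) != null) {
        startActivityForResult(intent, REQUEST_IMAGE_CAPTURE);
    }
    // A thumbnail arrives in onActivityResult() via data.getExtras().get("data").
}
```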