We are developing our own Android-based hardware and we wish to use Vuforia (developed via Unity3D) for certain applications. However, we are having problems making Vuforia work well with our current camera orientation settings.
On our hardware, when the camera is mounted horizontally (that is, parallel to the display), everything works fine. However, we need to mount the camera vertically, in other words at a 90-degree rotation relative to the display. These are all hardware settings: our kernel is configured accordingly, and every other program that uses the camera works correctly with everything, including our IMU sensors. However, apps developed with Vuforia behave oddly when the camera is mounted vertically.
We assume the problem is related to Vuforia's algorithms for processing raw camera data, but we are not sure, and we do not know how to fix it. In further detail:
- When "Enable Video Background" is on, the projected image is distorted and no video feed is available. The AR projection appears on a black background with distorted dimensions.
- When "Enable Video Background" is on and the device is rotated, the black background is replaced by flickering solid colors.
- When "Enable Video Background" is off, the AR projection has normal dimensions (no distortion), but it is tracked with the wrong axes. For example, when the target moves left in the real world, the projection moves up.
- When "Enable Video Background" is off and the device is rotated, the AR projection is larger than it appears when the device is in its default state.
I will be glad to provide any more information you need.
Thank you very much, have a nice day.
PS: We have found that applications whose main purpose is the camera (camera apps, barcode scanners, etc.) work fine, while apps for which camera usage is a secondary feature (such as some games) have the same problem as Vuforia. This makes me think that apps that access the camera directly work fine, whereas those that go through the Android API and classes fail for some reason.
First, understand that every platform deals with cameras differently, and beyond that, different Android phone manufacturers deal with them differently as well. In my testing WITHOUT Vuforia, I had to rotate the plane I cast the video feed onto by (0, -90, 90) for Android/iPhone and (-270, -90, 90) for the Windows Surface tablet. Beyond this, the iPhone rear camera was mirrored, as were the Android front camera and the Surface front camera. That is easy to account for, but a more annoying issue is that the Google Pixel and Samsung front cameras were mirrored across the y axis (as were ALL iOS back cameras), while the Nexus 6P was mirrored across the x axis. What I am getting at is that there are a LOT of devices to account for with Android, so try more than just that one device. Vuforia so far has dealt with my Pixel and four of my iOS devices just fine.
As for how to fix your problem:
Go into your Unity player settings and look at the orientation. There are a few options here; my application only uses portrait, so I force portrait, and it seems to work fine (none of the problems I had to account for in the scenario above). Vuforia previously did NOT support auto-rotation, so make sure you have the latest version, since it sounds like that is what you need. If auto-rotate is set and it is still not working right, you may have to account for that specific device (don't do this for all devices until after you test them). To account for a device, use an if statement (or a switch statement if you hit this problem on multiple devices) and then reflect or translate as needed. Cross-platform development systems (like Unity) don't always get everything perfect, since there is basically no standard. In these cases you have to account for devices directly by creating a method with a switch statement inside, so you can cleanly and modularly handle every device that needs it. It is a pain, but it beats developing for each device separately.
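To make the "reflect or translate as needed" step concrete: Android itself reports each camera's mounting angle (SENSOR_ORIENTATION in camera2, or CameraInfo.orientation in the old API), and a correctly written app combines it with the current display rotation. This is presumably why your vertically mounted sensor confuses apps that assume the usual 90-degree phone mounting. Below is a pure-arithmetic sketch of the standard correction formula from the Android Camera.setDisplayOrientation documentation; the class and method names are my own, not part of any SDK.

```java
// Sketch of Android's standard preview-orientation correction
// (formula from the Camera.setDisplayOrientation docs).
// Class/method names here are illustrative, not an SDK API.
public class CameraOrientation {
    /**
     * @param sensorOrientation clockwise angle (0/90/180/270) at which the
     *        sensor is mounted relative to the device's natural orientation;
     *        this is the value a vertically mounted camera changes
     * @param displayRotation   current UI rotation in degrees (0/90/180/270)
     * @param frontFacing       front cameras are mirrored, so the sign flips
     * @return degrees to rotate the preview clockwise so it appears upright
     */
    public static int displayOrientation(int sensorOrientation,
                                         int displayRotation,
                                         boolean frontFacing) {
        if (frontFacing) {
            int result = (sensorOrientation + displayRotation) % 360;
            return (360 - result) % 360; // compensate for the mirror
        }
        return (sensorOrientation - displayRotation + 360) % 360;
    }
}
```

If your HAL reports the true mounting angle here, apps that honor it (plain camera apps) work, while apps that hardcode the usual value (some games, and apparently Vuforia's background renderer) show exactly the axis-swap you describe.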
One more thing: make sure you check out the Vuforia configuration file, as it has settings such as camera mirroring and direction. These seem to be public settings, so you should also be able to script them from your per-device logic in the event you need "Flip Horizontally" for one phone but not another.
Assume I have a new Android phone with a wide-angle back lens whose native camera app can record in full-view mode (so we utilize as much of the sensor as possible and do not miss any visual information). Assume the phone can even do that at 60 FPS (I believe such phones exist, but if not, please correct me).
The question is: could one achieve the same footage (perhaps using the NDK or CameraX) but at a lower resolution? That is, keep the full view (because I do not want to crop the scene) but proportionally lower the X and Y resolution. Is this supported, or are the settings offered by the native app the only options available to a third-party developer?
If this can be achieved, how can one identify and use the camera's capabilities to set it up (i.e., full view, the maximum FPS available for that full view, and a resolution below 0.3 MP shaped to that full view)? A code example is more than welcome.
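Whether a low resolution with the full-sensor aspect ratio exists is device dependent: camera2 exposes the supported sizes via the StreamConfigurationMap and the full-sensor geometry via SENSOR_INFO_ACTIVE_ARRAY_SIZE, and you can only pick from what is listed. A minimal sketch of the selection logic follows; the int[] width/height pairs stand in for the android.util.Size values you would get from getOutputSizes(), so the logic is runnable anywhere.

```java
import java.util.List;

// Sketch: pick the smallest listed stream size that keeps the sensor's
// full aspect ratio (i.e., full view, just downscaled). In a real app the
// candidates come from CameraCharacteristics SCALER_STREAM_CONFIGURATION_MAP
// .getOutputSizes() and the sensor dimensions from
// SENSOR_INFO_ACTIVE_ARRAY_SIZE; int[] pairs are a stand-in here.
public class SizePicker {
    public static int[] smallestFullViewSize(List<int[]> sizes,
                                             int sensorW, int sensorH) {
        int[] best = null;
        for (int[] s : sizes) {
            // Same aspect ratio as the full sensor; cross-multiplication
            // avoids floating-point comparison issues.
            if (s[0] * sensorH != s[1] * sensorW) continue;
            if (best == null || s[0] * s[1] < best[0] * best[1]) best = s;
        }
        return best; // null if no listed size preserves the full view
    }
}
```

For the FPS side, you would similarly query CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES (or getHighSpeedVideoFpsRangesFor for a specific size) and take the highest range offered for the chosen size; 60 FPS at a small full-view size is not guaranteed on every device.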
I want to configure both the front and back cameras with the Android camera2 API to take pictures and videos from both simultaneously. I have created two TextureViews. Whenever I open one camera (front or back), my code works fine, but whenever I try to open both cameras at the same time, the code breaks while creating the capture session, and I get a CameraAccessException: configure stream: method not implemented.
I want to save the images captured by the front and back cameras as one combined image, and both videos as one video.
It would be very helpful if you could share some sample code or a link.
I am using a OnePlus 6. I recently downloaded an app, "Dual Camera Front Back Camera", and with it I am able to capture images from the front and back cameras at the same time. So if somebody wants to suggest there is no hardware support, that may be true for other phones, but in my case I think I am missing something in the code. So far, from searching, it looks like there is some problem with creating the session for the second camera; I debugged my code, and it fails during creation of the second camera's session, so if you have any idea about that, please share.
Thanks
Rakesh
The camera API is fine with it, but most Android devices do not have sufficient hardware resources to run both cameras at once, so you will generally get an error trying to open the second camera.
Both image sensors are typically connected to the same image signal processor (ISP), and that ISP can only operate one camera at a time. Some high-end devices have ISPs with multiple processing pipelines which can in theory run more than one camera at a time, but they often require using multiple pipelines to handle advanced functionality or very high resolutions for the main (back) camera.
So on those devices, multiple cameras may be usable at once, but not at maximum resolution, or with other similar restrictions.
Some manufacturers include multi-camera features in their own camera app, since they know exactly what the limitations are and can write application code to work within them. They may not make multi-camera available to normal apps, due to concerns about performance, thermal limits, or just lack of time to verify more than the exact use case they implement in their own app.
The Android camera API does not currently have a way to query if multiple cameras can be used at once, or if they can be, what the restrictions are. So the only thing you can do is try, and handle the errors in case that isn't feasible.
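The "just try, and handle the errors" approach can be sketched as follows. CameraOpener here is a hypothetical stand-in for CameraManager.openCamera(), so the control flow is runnable anywhere; on a real device the failure would surface as a CameraAccessException, or asynchronously through the StateCallback's onError/onDisconnected.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of opening as many cameras as the device allows and falling
// back gracefully. CameraOpener is a stand-in for
// CameraManager.openCamera(), not a real Android class.
public class DualCameraAttempt {
    interface CameraOpener {
        void open(String cameraId) throws Exception;
    }

    /** Tries each id in order; returns the ids that actually opened. */
    public static List<String> openAll(CameraOpener opener, String... ids) {
        List<String> opened = new ArrayList<>();
        for (String id : ids) {
            try {
                opener.open(id);
                opened.add(id);
            } catch (Exception e) {
                // Out of ISP pipelines (or similar hardware limit):
                // keep the cameras that did open and fall back to
                // single-camera mode instead of crashing.
                break;
            }
        }
        return opened;
    }
}
```

In a real app, the session-creation failure you are seeing would be caught at the equivalent of the open() call here, and your UI would then drop to showing one preview instead of two.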
I am working with Google Cardboard through Unity on a virtual reality project in which the view slowly changes from stereoscopic VR (a slightly different image in each eye) to monoscopic VR (the same image in each eye).
I can edit the cameras in the Unity Editor, and the changes work as intended in the Game window within the Editor, but when I build the project onto an Android phone, all the user can see is the scene through the default Google Cardboard camera setup.
This seems to happen as long as "Virtual Reality Supported" checkbox in the Player Settings menu is flagged. Does anyone know why this happens and if it can be overcome?
This happens because users' well-being is our most precious resource in VR, so the devs of the SDKs (GVR and OVR alike) have made some decisions to make messing with the camera intentionally convoluted, to avoid people saying that the tech is at fault when their whole audience feels sick because the developer changed the FOV or something.
I believe you can still do what you want by scaling the camera's parent.
I have an HTC One M8, and I have flashed cyanogen to it.
I am using it for testing some cameras that attach via the micro USB port.
The applications I test seem to choose between front and rear rather than from a list of cameras, and since different device models have different camera modules, features, etc., I can only assume Android provides some abstraction layer between direct camera access and the application. Furthermore, it seems safe to assume that somewhere each camera is configured as front or rear respectively.
So the question is: can that be edited? For instance, if a device's internal camera does not have the feature or resolution I need, can I plug one into the USB port and change a configuration somewhere that says "when an application requests the rear camera, give it this one"?
Essentially can you edit somewhere what CAMERA_FACING_FRONT and CAMERA_FACING_BACK refers to on a hardware level?
I have looked and not been able to find much, and I do not compile my own Android ROMs, so I am afraid this may end up being one of the reasons ROMs are device specific: this mapping is compiled into the OS and coded specifically for the camera native to that device, and therefore not configurable. However, if someone could nudge me in the right direction, that would be great.
For example, http://developer.android.com/training/camera/cameradirect.html states that a call to open() on the Camera object returns the first rear-facing camera. Something somewhere has instructed Android what that camera is and that it faces rear. It is specifically that configuration point I want to pick up and redirect. An example of why: I have an application that, given the known size of an object in the camera's field of view, can accurately measure the distance to it. It only does this through the rear camera, and I want to make it work through the thermal camera.
The third camera works fine on the device, and its native app consistently finds it (although I am not sure how, or whether it uses a specific protocol over USB, a driver, etc.).
It appears the setting I am looking for is defined in the HAL https://source.android.com/devices/camera/index.html#implementing
camera_common.h defines an important struct, camera_module, which
defines a standard structure to obtain general information about the
camera, such as its ID and properties that are common to all cameras
such as whether or not it is a front or back-facing camera.
Since this is defined in the header file and compiled into the OS image, it would appear that, without hacking the ROM, it is simply NOT configurable at any level outside a custom ROM.
Id est, I will never force another app onto the camera of my choosing unless I choose to build a ROM to do it, or resort to overly invasive hacking of the running ROM :-(
In my AS3 Flex Mobile application for Android, I am using the camera, and it is automatically rotated 90 degrees before I have done any video rotation myself. It seems to be a known bug in AIR, but I was wondering if anyone has found a solution, since this is a pretty important feature for mobile application developers.
I have tried to do some rotation manually in my code, but that only fixes the view on my display; it still sends the wrongly rotated video to the receiver.
If any code is required, I will add the snippets. Please let me know.
As you mentioned, this is a known bug with AIR, and it is not consistent, either. On some devices the orientation is correct, but on others (and all iOS devices, I believe, though I haven't fully tested that) it is rotated as you are seeing. For example, it was always oriented correctly on my Nexus 4 and Nexus 5, but a friend's Moto X is rotated incorrectly.
Unfortunately, I don't believe there is anything you can do short of having the user do a calibration (i.e. overlay a straight line and tell them to place it horizontally and click a button) and rotating the camera display and any images you take with the display.
That being said, if you are using the camera to take photos, I highly recommend using CameraUI instead, which is the native implementation.
I faced the same issue today, but I'm developing in Java, not AIR, so I don't know if it is the same problem; for me, the solution was to add this line before starting the recording:
mMediaRecorder.setOrientationHint(90);