How to make a 360° video from an Android device? - android

On iPhone there is an app which gives access to both cameras at the same time, and there is lens hardware that installs over both cameras at once... When the user taps the record button, the iPhone records a 180°+180° view from both sides and converts it into a 360° view. So is there any similar app for Android?
It would make it easy for people to shoot 360° views.

Related

Detect if a picture of a picture is taken in a mobile app

I am working on a face recognition app where a picture is taken and sent to the server for recognition.
I have to add a validation that the user captures a picture of a real person and not a picture of another picture. I have tried an eye-blink feature, in which the camera waits for an eye blink and captures as soon as the eye blinks, but that is not working out because a blink is falsely detected if the phone is shaken during capture.
I would like to ask for help here: is there any way we can detect whether the user is capturing a picture of another picture? Any ideas would help.
I am using React Native to build both the Android and iOS apps.
Thanks in advance.
Thanks for the support.
I resolved it with the eye-blink trick after all. Here is the little algorithm I used:
Open the camera and tap the capture button:
The camera detects whether any face is in view and waits for an eye blink.
If the eye-blink probability is 90% for both eyes, wait 200 milliseconds, then detect the face again with an eye-open probability > 90% to verify that the face is still there, and capture the picture at the end.
It's a cheap trick, but it has worked out so far.
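There is no code in the original answer, so here is a minimal sketch of that gating logic, written in C# purely for illustration. FaceReading and IFaceDetector are hypothetical stand-ins for whatever face detection library is actually used (one that reports per-eye open probabilities, as ML Kit does); the 0.1/0.9 thresholds mirror the 90% figures above.

using System.Threading.Tasks;

// Hypothetical reading produced by the face detector for one camera frame.
public class FaceReading
{
    public bool FaceFound;
    public float LeftEyeOpenProbability;   // 0.0 (closed) .. 1.0 (open)
    public float RightEyeOpenProbability;
}

// Hypothetical wrapper around the real face detection library.
public interface IFaceDetector
{
    Task<FaceReading> ReadCurrentFrameAsync();
}

public class BlinkGate
{
    private readonly IFaceDetector detector;
    public BlinkGate(IFaceDetector detector) { this.detector = detector; }

    // Returns true when it is safe to capture: a blink was observed (both
    // eyes closed), and 200 ms later the same face is back with both eyes
    // open again, which a static photo of a photo cannot reproduce.
    public async Task<bool> WaitForLiveBlinkAsync()
    {
        while (true)
        {
            var frame = await detector.ReadCurrentFrameAsync();
            if (!frame.FaceFound) continue;

            bool blinked = frame.LeftEyeOpenProbability < 0.1f &&
                           frame.RightEyeOpenProbability < 0.1f;   // ~90% blink probability
            if (!blinked) continue;

            await Task.Delay(200);                                  // the 200 ms wait

            var recheck = await detector.ReadCurrentFrameAsync();
            return recheck.FaceFound &&
                   recheck.LeftEyeOpenProbability > 0.9f &&         // eyes open again
                   recheck.RightEyeOpenProbability > 0.9f;
        }
    }
}

When this returns true the app triggers the actual capture; a false return means the face disappeared or the eyes never re-opened, so the capture flow simply restarts.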
Regards
On some iPhones (iOS 11.1 upwards), there's a so-called TrueDepth camera that's used for Face ID. With it (or the back-facing dual camera system) you can capture images along with depth maps. You could exploit that feature to see whether the face is flat (captured from a photo) or has normal facial contours. See here...
One would have to come up with a 3D face model to fool that.
It's limited to only a few iPhone models, though, and I don't know about Android.
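As a rough illustration of that idea (not tied to any particular depth API), a flatness check over the depth values inside the detected face rectangle could look like the sketch below; the 2 cm relief threshold is an arbitrary placeholder you would have to tune.

// Illustration only: decide whether a detected face region is "flat"
// (likely a photo of a photo) from a depth map. The depth map here is a
// plain 2D array in metres; the threshold is a placeholder.
public static class FaceDepthCheck
{
    public static bool LooksFlat(float[,] depthMetres,
                                 int faceLeft, int faceTop,
                                 int faceWidth, int faceHeight,
                                 float minReliefMetres = 0.02f)
    {
        float min = float.MaxValue, max = float.MinValue;

        for (int y = faceTop; y < faceTop + faceHeight; y++)
        {
            for (int x = faceLeft; x < faceLeft + faceWidth; x++)
            {
                float d = depthMetres[y, x];
                if (d <= 0f) continue;          // skip invalid depth samples
                if (d < min) min = d;
                if (d > max) max = d;
            }
        }

        // A real face has noticeable relief; a flat photo does not.
        return (max - min) < minReliefMetres;
    }
}

A printed or on-screen photo is essentially a plane, so the depth range across the face stays tiny, while a real face shows several centimetres of relief between the nose and the cheeks.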

How to support dual screens on Android with an interactive screen?

I am making a game running on an Android phone with an interactive station. The station looks like an interactive screen: when the phone is connected to the station, the station shows the same desktop content as the phone and can launch apps and do anything the phone can; it can also launch different apps on the phone and the station respectively. I want to launch my app and see a different view on the phone screen and the station screen, with both the phone screen and the station screen able to receive user input events.
I have tried to use,
ExtCamera.SetTargetBuffers(Display.displays[1].colorBuffer, Display.displays[1].depthBuffer);
to output the ExtCamera view to the secondary screen.
I can see different camera views on the station screen and the phone screen, but the station screen cannot receive user input. I wonder whether that is because this method only renders a texture to the screen, so it cannot receive user input, or whether there is another way to achieve this goal. I know that native Android offers the Presentation class to support multi-display, but it seems like a large amount of work to make Android and Unity interact.
Any help would be appreciated!
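For reference, here is a minimal Unity C# sketch of the multi-display setup described in the question (activating the second display and sending a separate camera to it); the camera name ExtCamera is taken from the question. Note that this path only renders output to the secondary display and does not route touch input from that display back into Unity, which appears to match the behaviour described.

using UnityEngine;

public class SecondaryDisplaySetup : MonoBehaviour
{
    // Camera that should appear on the station instead of the phone screen
    // ("ExtCamera" in the question).
    public Camera extCamera;

    void Start()
    {
        // Display.displays[0] is the phone screen; additional connected
        // displays appear after it and must be activated before use.
        if (Display.displays.Length > 1)
        {
            Display.displays[1].Activate();

            // Either point the camera at display 1 directly...
            extCamera.targetDisplay = 1;

            // ...or, as in the question, bind its render buffers explicitly:
            // extCamera.SetTargetBuffers(Display.displays[1].colorBuffer,
            //                            Display.displays[1].depthBuffer);
        }
    }
}

For input on the station, the native Presentation route mentioned in the question is the usual workaround, at the cost of bridging between Android and Unity.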

Wrong Camera Orientation with Android & Vuforia

We are developing our own Android-based hardware and we wish to use Vuforia (developed via Unity3D) for certain applications. However, we are having problems making Vuforia work well with our current camera orientation settings.
On our hardware, everything works fine when the camera is placed horizontally, that is, when the camera is parallel to the placement of the display. However, we need to place the camera vertically, in other words with a 90-degree difference relative to the placement of the display. These are all hardware settings: our kernel is programmed accordingly, and every other program that utilises the camera works compatibly with everything, including our IMU sensors. However, apps developed with Vuforia behave completely oddly when the camera is placed vertically.
We assume the problem is related to Vuforia's algorithms for processing raw camera data, but we are not sure, and we do not know how to fix the situation. Further details:
-When "Enable Video Background" is on, the projected image is distorted and no video feed is available. The AR projection appears on a black background with distorted dimensions.
-When "Enable Video Background" is on and the device is rotated, the black background is replaced by flickering solid colors.
-When "Enable Video Background" is off, the AR projection has normal dimensions (no distortion) however it is tracked with wrong axis settings. For example, when the target moves left in real world, the projection moves up.
-When "Enable Video Background" is off and the device is rotated, the AR projection is larger compared to its appearance when the device is in it's default state.
I will be glad to provide any more information you need.
Thank you very much, have a nice day.
PS: We have found that applications whose main purpose is the camera (camera apps, barcode scanners, etc.) work fine, while apps for which camera usage is a secondary feature (such as some games) have the same problem as Vuforia. This makes me think that apps which access the camera directly work fine, whereas those that go through the Android API and classes fail for some reason.
First, understand that every platform deals with cameras differently, and beyond this, different Android phone manufacturers deal with them differently as well. In my testing WITHOUT Vuforia I had to rotate the plane I cast the video feed onto to 0,-90,90 for Android/iPhone and -270,-90,90 for the Windows Surface tablet. Past this, the iPhone rear camera was mirrored, and the Android front camera was mirrored, as was the Surface front camera. That is easy to account for, but an annoying issue is that the Google Pixel and Samsung front cameras were mirrored across the y axis (as were ALL iOS devices on the back camera), while the Nexus 6P was mirrored across the x axis. What I am getting at is that there are a LOT of devices to account for with Android, so try more than just that one device. Vuforia so far has dealt with my Pixel and four of my iOS devices just fine.
As for how to fix your problem:
Go into your Player Settings in Unity and look at the orientation. There are a few options here; my application only uses portrait, so I force portrait and it seems to work fine (none of the problems I had to account for in the scenario mentioned above). Vuforia previously did NOT support auto-rotation, so make sure you have the latest version, since it sounds like that is what you need. If auto-rotate is set and it is still not working right, you may have to account for that specific device (don't do this for all devices until after you test those devices). To account for a device, use an if statement (or construct a case statement if you have multiple instances of this problem with different devices) and then reflect or translate as needed. Cross-platform development systems (like Unity) don't always get everything perfect, since there is basically no standard. In these cases you have to account for them directly by creating a method with a case statement inside it, so you can cleanly and modularly handle all the devices that need adjustment. It is a pain, but it beats developing for all devices separately.
One more thing: make sure you check out the Vuforia configuration file, as it has some settings such as camera mirror and direction settings. These seem to be public settings, so you should also be able to script against them in your case statement in the event you need to use "Flip Horizontally" for one phone but not another.
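As a sketch of that per-device case statement, applied here to the plane the camera feed is cast onto: the device model strings, angles, and mirror axes below are placeholders to be filled in per tested device, not verified values (the 0,-90,90 rotation is the one quoted above).

using UnityEngine;

// Sketch of a per-device correction table for the video feed plane.
// The device model strings and correction values are placeholders; fill
// them in as you test each device rather than applying them blindly.
public class CameraFeedCorrection : MonoBehaviour
{
    public Transform videoPlane;   // plane the camera feed is projected onto

    void Start()
    {
        switch (SystemInfo.deviceModel)
        {
            case "Nexus 6P":                   // example: mirrored across x
                Mirror(videoPlane, flipX: true, flipY: false);
                break;
            case "Google Pixel":               // example: mirrored across y
                Mirror(videoPlane, flipX: false, flipY: true);
                break;
            default:                           // base rotation for the feed plane
                videoPlane.localEulerAngles = new Vector3(0f, -90f, 90f);
                break;
        }
    }

    static void Mirror(Transform t, bool flipX, bool flipY)
    {
        Vector3 s = t.localScale;
        if (flipX) s.x = -s.x;
        if (flipY) s.y = -s.y;
        t.localScale = s;
    }
}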

How to show 4 videos at once using Google Cardboard?

I am trying to show four videos at once using Google Cardboard. These videos are normal 2D videos that were shot on a normal 16:9 camera. What I want and need is to have one video in front of you, then you turn your head 90 degrees and see another video, turn again and see another, until you hit the front video again. Please see my Pablo Picasso Microsoft Paint skills to visualize what I am talking about...
So basically what I need is four VR movie-theater screens that a person can look around between. Is there a program I could use, or do I have to do some programming to make this happen? Searching for this is not easy with all the VR articles that pop up. Any help that can point me in the right direction would be greatly appreciated!
I actually found an app that did all of this for me. The app is called 360 Virtual Reality Player (Google Play Store) and it turns any 2D video into a head-tracked VR video. Once I found this app, all I needed to do was stitch the videos together with a black bar in between them, using OpenCV, to get the desired effect.
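The answer does not show the stitching step, so here is a rough sketch of it using the OpenCvSharp C# bindings (the answer does not say which OpenCV binding was used), assuming all four clips share the same resolution and frame rate.

using System.Collections.Generic;
using OpenCvSharp;

// Rough sketch: read one frame from each of four same-sized clips, put a
// black bar between them, and write the horizontally stitched result out.
public static class PanoramaStitcher
{
    public static void Stitch(string[] inputPaths, string outputPath, int barWidth = 40)
    {
        var captures = new VideoCapture[inputPaths.Length];
        for (int i = 0; i < inputPaths.Length; i++)
            captures[i] = new VideoCapture(inputPaths[i]);

        int width = captures[0].FrameWidth;
        int height = captures[0].FrameHeight;
        double fps = captures[0].Fps;
        int outWidth = inputPaths.Length * width + (inputPaths.Length - 1) * barWidth;

        using var writer = new VideoWriter(outputPath,
                                           VideoWriter.FourCC('m', 'p', '4', 'v'),
                                           fps, new Size(outWidth, height));
        using var blackBar = new Mat(height, barWidth, MatType.CV_8UC3, new Scalar(0, 0, 0));

        while (true)
        {
            var row = new List<Mat>();
            bool endOfAnyClip = false;

            foreach (var cap in captures)
            {
                var frame = new Mat();
                if (!cap.Read(frame) || frame.Empty()) { endOfAnyClip = true; break; }
                if (row.Count > 0) row.Add(blackBar);   // separator between clips
                row.Add(frame);
            }
            if (endOfAnyClip) break;

            using var stitched = new Mat();
            Cv2.HConcat(row, stitched);
            writer.Write(stitched);

            foreach (var m in row)
                if (!ReferenceEquals(m, blackBar)) m.Dispose();
        }

        foreach (var cap in captures) cap.Dispose();
    }
}

In practice you would also check that the clips really share the same height (or resize them with Cv2.Resize) before concatenating, since horizontal concatenation requires matching row counts.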

Android KitKat allows only swipe on 360 image

I am trying to implement a 360 image for Android in my application. I was successful in viewing the 360 image in my app using this tutorial: https://developers.google.com/vr/android/get-started#start_your_own_project. After that I have two cases with phones running different Android API levels:
I ran the app on a phone running API 19 (KitKat). I can only swipe the 360 image to view the other areas of the image.
I ran the app on my other phone, which is running Lollipop; the image rotates along with the movement of my phone and there is no swipe option.
So, my query is:
Is there a way to toggle between these two options (swiping the 360 image manually and automatically rotating the 360 image along with the movement of my phone), or to have both ways of viewing the image on one screen?
It shouldn't be possible yet, based on currently available documentation. Depending on the device, the SDK allows you to swipe the image with your finger or rotate the view using the gyroscope, but not both.
ORIGINAL
Did you use the VR view? Based on this post you can supposedly only do click and drag on a computer (browser), not Android.
Did you use a WebView for your swiping functionality?
