I am developing an augmented reality app on Android using the ARCore library.
I would like to play a video on an augmented image.
I can successfully identify an object using ARCore on Android,
but I am not able to play the video exactly on the augmented image.
Any help would be appreciated.
Here are five things to check to make sure your app's pipeline is OK:
Your video asset is in a supported file format (mp4 is a robust choice).
You have an ArAnchor that your video object is attached to.
You have a geometry that your mp4 video texture is assigned to.
You are actually rendering your video object.
Your video player logic is correct.
Look at Clayton Wilkinson's brilliant answer in the Need to play Video in ARCore SO post.
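To make the checklist concrete, here is a rough sketch of how those pieces can fit together, assuming Sceneform is used for rendering; the material parameter name "videoTexture", the R.raw.my_video asset, the R.raw.video_plane quad model, and the context / arFragment / augmentedImage variables are placeholder assumptions:

    // Sketch only: attach a video-textured quad to a detected AugmentedImage.
    // Assumes Sceneform; context, arFragment and augmentedImage are in scope.
    ExternalTexture texture = new ExternalTexture();

    MediaPlayer mediaPlayer = MediaPlayer.create(context, R.raw.my_video); // mp4 asset
    mediaPlayer.setSurface(texture.getSurface()); // decoded frames go into the texture
    mediaPlayer.setLooping(true);

    ModelRenderable.builder()
            .setSource(context, R.raw.video_plane) // flat quad whose material samples the texture
            .build()
            .thenAccept(renderable -> {
                renderable.getMaterial().setExternalTexture("videoTexture", texture);
                // Anchor the quad at the center of the detected image
                AnchorNode anchorNode =
                        new AnchorNode(augmentedImage.createAnchor(augmentedImage.getCenterPose()));
                anchorNode.setRenderable(renderable);
                anchorNode.setParent(arFragment.getArSceneView().getScene());
                mediaPlayer.start();
            });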
I have been able to display the video from the Tango color camera on a TextureView using TangoTextureCameraPreview. Now, I would like to record the video stream into .mp4 files. It is relatively easy with MediaRecorder and Camera2, but I am not sure how to do it with Tango. Is there a way to create a PersistentInputSurface from Tango that is accepted by MediaRecorder? If not, are the GLSurfaceView from TangoCameraPreview and MediaCodec the right direction to look into?
Thanks!
It depends on whether you want to use depth data or not. If not, the Tango camera can act as a standard Android camera. If you don't want to write the code from scratch, you can check out the Media for Mobile API here: https://github.com/INDExOS/media-for-mobile
It will work for Tango the same way it works for any other Android device.
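If you do treat it as a standard camera, a minimal sketch of the recording side could look like this (assuming API 21+ with a Surface video source, and that you render the camera frames into the recorder's input Surface yourself; outputPath is a placeholder):

    // Sketch: record to .mp4 from a Surface you render camera frames into.
    MediaRecorder recorder = new MediaRecorder();
    recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.setVideoSize(1280, 720);
    recorder.setVideoFrameRate(30);
    recorder.setOutputFile(outputPath); // placeholder path
    recorder.prepare();                 // throws IOException

    // Everything drawn into this Surface (e.g. via OpenGL from the camera
    // texture) ends up in the .mp4 file.
    Surface inputSurface = recorder.getSurface();
    recorder.start();
    // ... render frames into inputSurface ...
    recorder.stop();
    recorder.release();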
YouTube already has such a function, "play in Cardboard", which reformats the footage to make it feel like you are watching in an IMAX theatre.
How can I do this with the Android VR SDK? I'm taking a look at VrVideoView. When playing a normal video, it generates a very strange viewpoint for the normal 2D video and plays it as a 3D video.
The VrVideoView only renders the video in a 360° way, and it gives an awful result if the video is not a 360° video.
So you have to use a GvrView and manage the rendering yourself in order to get a good result.
Here's a quick and dirty example of how to display a stereoscopic video with the Google VR SDK:
https://github.com/Zanfas/Cardboard_VideoPlayer
You have to know that the Google VR SDK only applies distortion to the rendering of a 3D scene managed with OpenGL ES.
The idea is to make a virtual screen in a 3D scene that displays the video. You render the 3D scene with the camera looking at that screen, and there you go.
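Here is a minimal skeleton of that idea with the Google VR SDK; createOesTexture() and drawQuad() are hypothetical helpers for the OpenGL ES plumbing, and R.raw.my_video is a placeholder asset:

    import android.graphics.SurfaceTexture;
    import android.media.MediaPlayer;
    import android.os.Bundle;
    import android.view.Surface;
    import com.google.vr.sdk.base.Eye;
    import com.google.vr.sdk.base.GvrActivity;
    import com.google.vr.sdk.base.GvrView;
    import com.google.vr.sdk.base.HeadTransform;
    import com.google.vr.sdk.base.Viewport;
    import javax.microedition.khronos.egl.EGLConfig;

    // Skeleton: render a video onto a virtual screen inside a GvrView.
    public class VideoActivity extends GvrActivity implements GvrView.StereoRenderer {
        private SurfaceTexture videoTexture;
        private MediaPlayer mediaPlayer;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            GvrView gvrView = new GvrView(this);
            gvrView.setRenderer(this);
            setContentView(gvrView);
            setGvrView(gvrView);
        }

        @Override
        public void onSurfaceCreated(EGLConfig config) {
            int texId = createOesTexture(); // hypothetical: GL_TEXTURE_EXTERNAL_OES texture
            videoTexture = new SurfaceTexture(texId);
            mediaPlayer = MediaPlayer.create(this, R.raw.my_video); // placeholder asset
            mediaPlayer.setSurface(new Surface(videoTexture));
            mediaPlayer.start();
        }

        @Override
        public void onNewFrame(HeadTransform headTransform) {
            videoTexture.updateTexImage(); // pull the latest decoded video frame
        }

        @Override
        public void onDrawEye(Eye eye) {
            // Draw the "screen" quad with this eye's view and projection;
            // the SDK applies the lens distortion afterwards.
            float[] perspective = eye.getPerspective(0.1f, 100f);
            drawQuad(eye.getEyeView(), perspective); // hypothetical GL helper
        }

        @Override public void onFinishFrame(Viewport viewport) {}
        @Override public void onSurfaceChanged(int width, int height) {}
        @Override public void onRendererShutdown() {}
    }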
Regards
I want to put YouTube streaming video inside an app for Cardboard (for Android and iOS). I know there are plugins that do similar things, such as "Easy Movie Texture", but they don't support YouTube streaming. I've found "Youtube Video Player" in the Unity Asset Store, but I'm not entirely sure that it will work with CardboardMain.prefab (I mean properly splitting 360° video for the two screens).
Please help with this issue.
Thanks.
You can do that by combining Easy Movie Texture with Youtube Video Player: just pass the result of Youtube Video Player to the Easy Movie Texture video path, and it will work like a charm :)
Is there a way to record square (640x640) videos and concatenate them on Android? I searched the Internet and found some solutions. The solution seems to be "ffmpeg". However, to use ffmpeg I need to dive into the NDK and build ffmpeg from its sources. Is there a solution using only the Android SDK?
My basic needs are:
Record multiple videos (square format)
Resize the captured videos (e.g. from 480x480 to 640x640)
Concatenate the captured videos
Rotate the final video (90° clockwise)
Final output in mp4 or mpg format
Is there a solution using only the Android SDK?
Not really.
Your primary video recording option is MediaRecorder, and it supports exactly nothing of what you list. For example, there is no requirement for any Android device to support taking square videos.
You are also welcome to use the camera preview stuff to assemble your own videos from individual frames. Vine does this, AFAIK. There, you could perhaps use existing Bitmap facilities to handle the cropping, resizing, and rotating. However, this will be slow, and doing this work in a way that can keep up with a reasonable frame rate will be difficult. Also, I do not know if there is a library that can stitch those frames together into a video, or blend in any sort of audio (camera previews are pure images).
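For the Bitmap part, here is a small sketch of the per-frame work (center-crop to a square, scale to 640x640, rotate 90° clockwise); the method name is illustrative:

    import android.graphics.Bitmap;
    import android.graphics.Matrix;

    // Sketch: center-crop a preview frame to a square, scale it to 640x640,
    // and rotate it 90° clockwise.
    static Bitmap cropScaleRotate(Bitmap frame) {
        int side = Math.min(frame.getWidth(), frame.getHeight());
        int x = (frame.getWidth() - side) / 2;
        int y = (frame.getHeight() - side) / 2;

        Matrix m = new Matrix();
        m.postScale(640f / side, 640f / side); // e.g. 480x480 -> 640x640
        m.postRotate(90);                      // clockwise

        // Crop and apply the scale+rotate transform in one call
        return Bitmap.createBitmap(frame, x, y, side, side, m, true);
    }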
I am developing an Android application in which a specific video is played when the poster of a specific movie is shown in front of the camera. I found many AR tutorials that just show a 3D object when a pattern is detected; I need some advice on building an application that can play video in an AR app using the Android camera and the QCAR SDK.
I don't know QCAR, but of course you can put a SurfaceView on top of an existing SurfaceView from your CameraPreview and play a video in it.
What I would do is implement your poster-recognition logic first; that will be tricky enough. Then, if it works reliably, I would add another SurfaceView in which the video is played.
You will find out that the CameraPreview surface actually has to be on top of the video surface.
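A rough sketch of that layering, assuming a FrameLayout stacking the two surfaces (camera preview setup omitted; which SurfaceView needs setZOrderMediaOverlay(true) can vary by device, per the note above; R.raw.trailer is a placeholder asset):

    // Sketch: stack the video surface and the camera preview surface.
    FrameLayout root = new FrameLayout(this);
    SurfaceView videoSurface = new SurfaceView(this);   // MediaPlayer renders here
    SurfaceView cameraSurface = new SurfaceView(this);  // camera preview
    cameraSurface.setZOrderMediaOverlay(true);          // keep the preview surface on top
    root.addView(videoSurface);
    root.addView(cameraSurface);
    setContentView(root);

    final MediaPlayer player = MediaPlayer.create(this, R.raw.trailer); // placeholder asset
    videoSurface.getHolder().addCallback(new SurfaceHolder.Callback() {
        @Override public void surfaceCreated(SurfaceHolder holder) {
            player.setDisplay(holder); // start playback once the surface exists
            player.start();
        }
        @Override public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {}
        @Override public void surfaceDestroyed(SurfaceHolder holder) {}
    });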
I assume you are trying to play the video on a GL surface. It is possible by using a SurfaceTexture as a video texture, but only ICS (4.0) and above devices support video SurfaceTextures. Vuforia has a very good sample of this here; you can handle what to do on lower devices yourself, such as playing the video fullscreen, etc.
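For reference, the core of the SurfaceTexture approach looks roughly like this (sketch only; the GLSurfaceView renderer, a fragment shader using samplerExternalOES, and the R.raw.my_video asset are assumed):

    // Sketch: decode video into a GL_TEXTURE_EXTERNAL_OES texture that a
    // GLSurfaceView renderer samples when drawing the quad.
    int[] tex = new int[1];
    GLES20.glGenTextures(1, tex, 0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

    SurfaceTexture videoTexture = new SurfaceTexture(tex[0]);
    MediaPlayer player = MediaPlayer.create(context, R.raw.my_video); // placeholder asset
    player.setSurface(new Surface(videoTexture));
    player.start();

    // In the renderer's onDrawFrame(), before drawing the textured quad:
    videoTexture.updateTexImage();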