Android: render TextureView of 2D video in side-by-side

I am trying to create a video player app capable of displaying a 2D video in a side-by-side projection for Cardboard. The input is a 2D video, so there is no need for head tracking. I checked the Google Cardboard SDKs, but they don't provide an implementation for a simple side-by-side projection of non-3D, non-360° content.
My current implementation of the video player already uses a TextureView to render the video.
I found sample code for a side-by-side camera preview: android Side By Side Camera
How do I get the output from the TextureView of my video player into a GvrView?
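One minimal way to get a side-by-side projection without head tracking is to render the video frame twice on one surface, once per half. As a sketch (this helper is illustrative, not part of any SDK), the per-eye viewport passed to `GLES20.glViewport(...)` can be computed like this:

```java
// Hypothetical helper (not from the Cardboard/GVR SDK): computes the
// glViewport rectangle for each eye when the same video frame is drawn
// twice, side by side, on a single surface.
final class SideBySideViewport {

    /** Returns {x, y, width, height} for GLES20.glViewport(...). */
    static int[] eyeViewport(int surfaceWidth, int surfaceHeight, boolean leftEye) {
        int halfWidth = surfaceWidth / 2;
        int x = leftEye ? 0 : halfWidth;   // right eye starts at the midpoint
        return new int[] { x, 0, halfWidth, surfaceHeight };
    }
}
```

In a renderer you would set the left-eye viewport, draw the video quad, then set the right-eye viewport and draw the same quad again.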

Related

How to play video in ARCore on Android with rotate, scale and zoom?

I want to play a video in ARCore on Android when a surface is detected. We are able to load a model, but we want to play a video.

Is there a way of playing a video with an alpha channel on Android?

I have a set of video animations with an alpha channel. I would like the user of the Android app to record a video and, at the same time, play an animation overlaid on the camera view. I have searched the Internet but didn't find any solution. Do you have any ideas of how this can be achieved?
For Preview:
You could try using OpenCV to read the video frame by frame, overlay each frame on the camera frame, and display the result on a TextureView.
[EDIT]: This doesn't work, since OpenCV on Android cannot extract frames from a video (there is no native FFmpeg support).
For Recording:
You cannot record video this way; your best bet is to use FFmpeg to overlay the animation video on the recorded video.
OpenCV4Android: http://docs.opencv.org/2.4/doc/tutorials/introduction/android_binary_package/dev_with_OCV_on_Android.html
Read video frame by frame: http://answers.opencv.org/question/5768/how-can-i-get-one-single-frame-from-a-video-file/
Blending frames in OpenCV: http://docs.opencv.org/2.4/doc/tutorials/core/adding_images/adding_images.html
FFmpeg on Android: https://android-arsenal.com/details/1/931
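For the post-processing route, the FFmpeg overlay can be done with the `overlay` filter. A sketch, with placeholder filenames (the animation is assumed to be in a format that keeps its alpha channel, e.g. a QuickTime file with a suitable codec):

```shell
# Composites animation.mov (with alpha) over recorded.mp4, keeping the
# recording's audio track if present. Filenames are placeholders.
ffmpeg -i recorded.mp4 -i animation.mov \
  -filter_complex "[0:v][1:v]overlay=0:0:shortest=1[out]" \
  -map "[out]" -map 0:a? -c:v libx264 -c:a copy output.mp4
```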

VrVideoView play normal video

YouTube already has such a function, "play in Cardboard", which reformats the footage to make it feel like you are watching in an IMAX theatre.
How can I do this with the Android VR SDK? I'm taking a look at VrVideoView. When playing a normal video, it generates a very strange viewpoint for the normal 2D video and plays it as a 3D video.
The VrVideoView only renders the video in a 360° way, and it gives an awful result if the video is not a 360° video.
So you have to use a GvrView and manage the rendering yourself in order to get a good result.
Here's a quick and dirty example of how to display a stereoscopic video with the Google VR SDK:
https://github.com/Zanfas/Cardboard_VideoPlayer
You have to know that the Google VR SDK only applies distortion to the rendering of a 3D scene managed with OpenGL ES.
The idea is to make a virtual screen in a 3D scene that displays the video. You render the 3D scene with the camera looking at that screen, and there you go.
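To make the "virtual screen" idea concrete: the screen is just a textured quad placed in front of the camera, sized to the video's aspect ratio, with the video frame bound as its texture. A sketch of the geometry (the class and parameter names are illustrative, not from the Google VR SDK):

```java
// Illustrative only: builds a quad centered on the view axis, 'distance'
// units in front of the camera. The quad would be drawn as a triangle
// strip with the video frame sampled as a texture.
final class VirtualScreen {

    /** 4 vertices (x, y, z) in triangle-strip order: BL, BR, TL, TR. */
    static float[] quadVertices(float height, float aspectRatio, float distance) {
        float halfH = height / 2f;
        float halfW = halfH * aspectRatio;
        return new float[] {
            -halfW, -halfH, -distance,  // bottom left
             halfW, -halfH, -distance,  // bottom right
            -halfW,  halfH, -distance,  // top left
             halfW,  halfH, -distance,  // top right
        };
    }
}
```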
Regards

Add overlay while recording video on Android

I want to record video with the camera on my Android device, and I need to add an overlay image over the recorded movie. On iOS I would use GPUImage; for Android I found Android GPUImage. I tried to use it, but I didn't find any way to add a filter while recording video; in the provided example I could add filters only when taking photos. Is there any way to record video with filters using Android GPUImage? Is there any other way to add an image overlay over the video while recording, in real time? If not, is there a way to add an image overlay over the recorded video in post-processing?
You can add an overlay image on the video using a blend filter.
About video recording: the android-gpuimage library does not support it, but you can try the android-gpuimage-videorecording library, a fork of GPUImage for Android that also provides video recording functionality:
android-gpuimage-videorecording
See the GPUImageMovieWriter class. It should point you in the right direction for developing your own video writer on top of GPUImage.
The idea is to:
draw on current screen surface
switch to encoder input surface and draw previous frame buffer again on it
switch back to screen surface
other useful links: EGL surface helper, Media encoder
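The "blend filter" mentioned above can be a two-input fragment shader that samples the camera frame and the overlay image and composites by the overlay's alpha. A sketch, with uniform and varying names in the style of GPUImage filters (they are assumptions, not the library's exact API):

```java
// Sketch of a two-input alpha blend. The uniform/varying names below
// (textureCoordinate, inputImageTexture, inputImageTexture2) are
// illustrative; check the actual GPUImage filter base class you extend.
final class OverlayBlendShader {

    static final String FRAGMENT_SHADER =
        "precision mediump float;\n" +
        "varying vec2 textureCoordinate;\n" +
        "uniform sampler2D inputImageTexture;\n" +   // camera frame
        "uniform sampler2D inputImageTexture2;\n" +  // overlay image (with alpha)
        "void main() {\n" +
        "  vec4 base = texture2D(inputImageTexture, textureCoordinate);\n" +
        "  vec4 over = texture2D(inputImageTexture2, textureCoordinate);\n" +
        "  gl_FragColor = vec4(mix(base.rgb, over.rgb, over.a), base.a);\n" +
        "}\n";
}
```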
This project (MagicCamera) has many multiple-input filters. You can write your own fragment shader to overlay an image on the camera texture (similar to MagicN1977Filter). It also includes video recording.

play video with qcar and android AR application

I am developing an Android application in which a specific video is played when the poster of a specific movie is shown in front of the camera. The AR tutorials I found just show a 3D object when a pattern is detected. I need some advice on making an application that can play video in an AR application using the Android camera and the QCAR SDK.
I don't know QCAR, but of course you can put a SurfaceView on top of an existing SurfaceView from your camera preview and play a video in it.
What I would do is implement your poster-recognition logic, which will be tricky enough, and then, if it works reliably, add another SurfaceView in which the video is played.
You will find out that the camera preview surface actually has to be on top of the video surface.
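A minimal layout for this stacking could look like the following sketch (IDs are placeholders; in a FrameLayout, later children are drawn on top, so the camera preview surface is declared last):

```xml
<!-- Sketch only: the video surface sits below, the camera preview
     surface is added last so it ends up on top, as noted above. -->
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <SurfaceView
        android:id="@+id/video_surface"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <SurfaceView
        android:id="@+id/camera_preview_surface"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />
</FrameLayout>
```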
I assume you are trying to play video on a GL surface. It is possible using a video SurfaceTexture, but only ICS (4.0) and above devices support it. Vuforia has a very good sample of this here; you can handle what to do for lower devices yourself, such as playing the video fullscreen.
