Android: How to create YouTube-like VR controls?

I am trying to develop a VR video player using the latest Google VR SDK for Android (v1.0.3), but there is no high-level API to build VR playback controls.
The YouTube VR player uses an old version of the GVR toolkit and renders controls (for example, com.google.android.libraries.youtube.common.ui.TouchImageView) in some way.
What is the best way to implement such controls using the latest VR SDK? Do I need to use a custom renderer with OpenGL or the NDK?
I would be very grateful for implementation details.

The GVR SDK does not provide a way to draw anything over VrVideoView, so we need to implement the VR video player ourselves.
The main idea of the solution is to use GvrView with a custom StereoRenderer.
First of all, we need to implement a VR video renderer (using VR shaders and MediaPlayer/ExoPlayer).
Then we need to implement custom controls in the scene using OpenGL ES and the GVR SDK (head tracking, Eye, etc.).
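To sketch the controls side: each control can be a textured quad placed in world space, and "clicking" is done by intersecting the user's gaze ray (in the GVR SDK the forward vector comes from HeadTransform.getForwardVector() inside onNewFrame()) with the quad. The class below is hypothetical and keeps only the hit-test math in plain Java; all GVR/OpenGL wiring is assumed to live elsewhere:

```java
// Hypothetical helper for gaze-based selection of a VR control quad.
// In a real app the gaze direction comes from HeadTransform.getForwardVector()
// inside GvrView.StereoRenderer.onNewFrame(); here it is just a parameter.
public class ControlQuad {
    // Quad centered at (cx, cy, cz), axis-aligned, facing the viewer at the origin.
    final float cx, cy, cz, halfW, halfH;

    ControlQuad(float cx, float cy, float cz, float width, float height) {
        this.cx = cx; this.cy = cy; this.cz = cz;
        this.halfW = width / 2f; this.halfH = height / 2f;
    }

    /** Returns true if a gaze ray from the origin along (dx, dy, dz) hits the quad. */
    boolean isLookedAt(float dx, float dy, float dz) {
        if (dz == 0f) return false;        // ray is parallel to the quad's plane
        float t = cz / dz;                 // scale so the ray reaches the quad's Z plane
        if (t <= 0f) return false;         // quad is behind the viewer
        float hitX = dx * t, hitY = dy * t;
        return Math.abs(hitX - cx) <= halfW && Math.abs(hitY - cy) <= halfH;
    }

    public static void main(String[] args) {
        // A 1 m x 0.5 m "play" button 3 m in front of the viewer.
        ControlQuad play = new ControlQuad(0f, 0f, -3f, 1f, 0.5f);
        System.out.println(play.isLookedAt(0f, 0f, -1f)); // looking straight ahead -> true
        System.out.println(play.isLookedAt(0f, 1f, -1f)); // looking 45 degrees up -> false
    }
}
```

When the quad is looked at, you can highlight it in onDrawEye() and trigger the action on the Cardboard trigger callback.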

You need to use OpenGL or an engine such as Unity3D to show the video texture: decode the video on Android and pass the decoded frames to OpenGL (or the engine) as a texture for display.

Related

How do I get Unity camera frame in Android?

I am using an Android video chat SDK in Unity, and I want to feed the Unity camera frame to the SDK.
Currently I set a RenderTexture as Camera.targetTexture and pass RenderTexture.GetNativeTexturePtr() to Android.
What should I do next? I have googled a lot; should I use the GLES20 API, or is there another solution?

VrVideoView play normal video

YouTube already has a "play in Cardboard" function that reformats the footage to make it feel like you are watching in an IMAX theatre.
How can I do this with the Android VR SDK? I am taking a look at VrVideoView: when playing a normal video, it generates a very strange viewpoint for the normal 2D video and plays it as a 3D video.
VrVideoView only renders the video in a 360° way, which gives an awful result if the video is not a 360° video.
So you have to use a GvrView and manage the rendering yourself in order to get a good result.
Here's a quick-and-dirty example of how to display a stereoscopic video with the Google VR SDK:
https://github.com/Zanfas/Cardboard_VideoPlayer
Note that the Google VR SDK only applies lens distortion to the rendering of a 3D scene managed with OpenGL ES.
The idea is to put a virtual screen in the 3D scene and display the video on it. You render the 3D scene with the camera looking at that screen, and there you go.
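For a stereoscopic over-under (top-bottom packed) video, the virtual screen quad can use the same vertex positions for both eyes but different texture coordinates, so each eye samples its own half of the frame. The helper below is a hypothetical plain-Java sketch; which half maps to which eye depends on how your footage is packed, and in the GVR SDK the eye being drawn is reported by Eye.getType() inside onDrawEye():

```java
// Hypothetical helper producing per-eye texture coordinates for a virtual
// screen quad showing an over-under stereoscopic video frame.
// Assumption: left eye samples v in [0, 0.5], right eye samples v in [0.5, 1];
// swap the halves if your footage is packed the other way around.
public class StereoUv {
    /** Quad UVs in triangle-strip order: BL, BR, TL, TR ((u, v) pairs). */
    static float[] uvsForEye(boolean leftEye) {
        float vTop = leftEye ? 0f : 0.5f;
        float vBottom = vTop + 0.5f;
        return new float[] {
            0f, vBottom,   // bottom-left
            1f, vBottom,   // bottom-right
            0f, vTop,      // top-left
            1f, vTop,      // top-right
        };
    }

    public static void main(String[] args) {
        float[] left = uvsForEye(true);
        float[] right = uvsForEye(false);
        System.out.println(left[1] + " " + left[5]);   // 0.5 0.0
        System.out.println(right[1] + " " + right[5]); // 1.0 0.5
    }
}
```

In onDrawEye() you would upload the matching UV buffer before drawing the screen quad, while the vertex positions stay identical for both eyes.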
Regards

Chromecast Android Games GLES2?

Does the Cast SDK support rendering 3D objects natively? Like can I upload content for the Chromecast device to render internally on its own hardware? The Chromecast hardware does support GLES2 natively.
Or can I stream GLES2 rendered content to the device? Does the SDK have tools for Video Game streaming in general?
Does the Chromecast hardware accelerate HTML5 canvas objects?

Best choice to represent video for iOS and Android

I am developing an app for iOS and Android to make a video call between two devices. The idea is to develop one C++ library for both platforms and to integrate the communication and display parts in the same way.
For compatibility reasons, I am using OpenGL to display the video and FFmpeg to encode the communication, but in some other questions (like here) I have read that this is not the best option for Android. However, I have found that on iOS the OpenGL approach is faster than the native approach I tested.
So the question is: for Android, what is the alternative to OpenGL that still works through JNI? Is there any?
You can use OpenGL ES, a flavor of the OpenGL specification intended for embedded devices and optimized for mobile. OpenGL ES is available on both Android and iOS.
http://developer.android.com/guide/topics/graphics/opengl.html
https://developer.apple.com/opengl-es/
The following SO questions can get you in the right direction when it comes to implementing this:
Android Video Player Using NDK, OpenGL ES, and FFmpeg
Android. How play video on Surface(OpenGL)
OpenGL in Android for video display
play video using opengles, android
opengl es(iphone) render from file
https://discussions.apple.com/message/7661759#7661759
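A common thread through the links above: on Android, video frames decoded into a SurfaceTexture arrive as a GL_TEXTURE_EXTERNAL_OES texture, so the fragment shader must declare samplerExternalOES instead of sampler2D. Below is a minimal GLSL ES 2.0 shader pair held as Java strings; the MediaPlayer/SurfaceTexture/GLSurfaceView wiring is omitted, so treat this as a sketch of the shader side only:

```java
// Minimal GLSL ES 2.0 shaders for drawing an Android video frame that was
// streamed into a SurfaceTexture (GL_TEXTURE_EXTERNAL_OES target).
public class VideoShaders {
    static final String VERTEX =
            "attribute vec4 aPosition;\n" +
            "attribute vec2 aTexCoord;\n" +
            "varying vec2 vTexCoord;\n" +
            "void main() {\n" +
            "  gl_Position = aPosition;\n" +
            "  vTexCoord = aTexCoord;\n" +
            "}\n";

    // The #extension line and samplerExternalOES are required for textures
    // backed by a SurfaceTexture; a plain sampler2D will not work here.
    static final String FRAGMENT =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "uniform samplerExternalOES uTexture;\n" +
            "varying vec2 vTexCoord;\n" +
            "void main() {\n" +
            "  gl_FragColor = texture2D(uTexture, vTexCoord);\n" +
            "}\n";

    public static void main(String[] args) {
        System.out.println(FRAGMENT.contains("samplerExternalOES")); // true
    }
}
```

Bind the texture with glBindTexture(GL_TEXTURE_EXTERNAL_OES, id) and call SurfaceTexture.updateTexImage() each frame before drawing.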

Video Texture: Augmented Reality Framework for Android

Which frameworks have a video-texture feature? I would like to implement something like this for Android. Are there existing approaches? Unfortunately, QCAR (the Qualcomm AR SDK) does not have this feature yet :(
To my knowledge, the QCAR SDK 1.5 Beta for Android only allows you to use the camera video as a texture. This differs from the posted video which shows a movie texture being loaded on a geometrical plane.
The Metaio Mobile SDK 3.0 does allow you to set movie textures on a geometry that matches the video.
http://docs.metaio.com/bin/view/Main/AdvancedContentExample#Movie_textures
The QCAR SDK 1.5 Beta for Android now provides a simple and streamlined way of accessing the video background texture. Developers can now write custom special effects using powerful shaders resulting in far more exciting application experiences. :)
Edit: Vuforia now has a Video Playback sample. Devices running API level 14 (Android 4.0) or higher support video on a texture; other devices will play the video in full-screen mode.
