Video Texture: Augmented Reality Framework for Android

Which frameworks support video textures? I would like to implement something like this for Android. Are there existing approaches? Unfortunately, QCAR (the Qualcomm AR SDK) does not have this feature yet. :(

To my knowledge, the QCAR SDK 1.5 Beta for Android only allows you to use the camera video as a texture. This differs from the posted video, which shows a movie texture being loaded onto a geometric plane.
The Metaio Mobile SDK 3.0 does allow you to set movie textures on a geometry that matches the video.
http://docs.metaio.com/bin/view/Main/AdvancedContentExample#Movie_textures

The QCAR SDK 1.5 Beta for Android now provides a simple and streamlined way of accessing the video background texture. Developers can now write custom special effects using powerful shaders, resulting in far more exciting application experiences. :)
Edit: Vuforia now ships a Video Playback sample. Devices at API level 14 (Android 4.0) and above support video on a texture; older devices fall back to playing the video in full-screen mode.
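For context, the API level 14 requirement comes from rendering MediaPlayer output into a SurfaceTexture-backed Surface. Below is a minimal sketch of that mechanism, not Vuforia's actual sample code; videoPath and the GL-thread calling convention are assumptions.

    // Sketch: playing video onto an OpenGL ES texture. Requires API 14 for
    // MediaPlayer.setSurface() with a Surface built from a SurfaceTexture.
    import android.graphics.SurfaceTexture;
    import android.media.MediaPlayer;
    import android.opengl.GLES11Ext;
    import android.opengl.GLES20;
    import android.view.Surface;

    public class VideoTexture {
        private int textureId;
        private SurfaceTexture surfaceTexture;
        private MediaPlayer player;

        // Call on the GL thread once the context exists.
        public void init(String videoPath) throws Exception {
            int[] tex = new int[1];
            GLES20.glGenTextures(1, tex, 0);
            textureId = tex[0];
            // Video frames arrive as GL_TEXTURE_EXTERNAL_OES textures.
            GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
            GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                    GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                    GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

            surfaceTexture = new SurfaceTexture(textureId);
            Surface surface = new Surface(surfaceTexture);

            player = new MediaPlayer();
            player.setDataSource(videoPath);
            player.setSurface(surface); // API 14+
            surface.release();          // the player keeps its own reference
            player.prepare();
            player.start();
        }

        // Call once per frame on the GL thread before drawing the quad.
        public void updateFrame() {
            surfaceTexture.updateTexImage();
            // Bind textureId and draw your geometry.
        }
    }

Note that the fragment shader sampling this texture must declare the GL_OES_EGL_image_external extension and use a samplerExternalOES uniform rather than a regular sampler2D.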

Related

Android: How to create YouTube-like VR-controls?

I am trying to develop a VR video player using the latest Google VR SDK for Android (v1.0.3), but there is no high-level API for building VR playback controls.
The YouTube VR player uses an old version of the GVR toolkit and renders controls (for example, com.google.android.libraries.youtube.common.ui.TouchImageView) in some way.
What is the best way to implement such controls using the latest VR SDK? Do I need a custom renderer with OpenGL or the NDK?
I would be very grateful for implementation details.
The GVR SDK does not provide a way to draw anything over VrVideoView, so we need to implement VR video playback ourselves.
The main idea of the solution is to use GvrView with a custom StereoRenderer.
First, implement a VR video renderer (using VR shaders and MediaPlayer/ExoPlayer).
Then implement custom controls in the scene using OpenGL ES and the GVR SDK (head tracking via HeadTransform, Eye, etc.), as in the sketch below.
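A minimal skeleton of such a renderer, assuming you manage the video texture and control geometry yourself; drawVideoSphere and drawControls are hypothetical helpers, not GVR SDK calls.

    // Sketch: custom GvrView.StereoRenderer hosting both the video sphere
    // and hand-rolled playback controls.
    import android.opengl.GLES20;
    import com.google.vr.sdk.base.Eye;
    import com.google.vr.sdk.base.GvrView;
    import com.google.vr.sdk.base.HeadTransform;
    import com.google.vr.sdk.base.Viewport;
    import javax.microedition.khronos.egl.EGLConfig;

    public class VrVideoRenderer implements GvrView.StereoRenderer {
        private final float[] headView = new float[16];

        @Override
        public void onNewFrame(HeadTransform headTransform) {
            headTransform.getHeadView(headView, 0);
            // Latch the newest video frame here (SurfaceTexture.updateTexImage())
            // and hit-test the gaze direction against the control quads.
        }

        @Override
        public void onDrawEye(Eye eye) {
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
            float[] view = eye.getEyeView();               // per-eye view matrix
            float[] proj = eye.getPerspective(0.1f, 100f); // near/far planes
            // drawVideoSphere(view, proj); // hypothetical: sphere textured with the video
            // drawControls(view, proj);    // hypothetical: play/pause quads, seek bar
        }

        @Override public void onFinishFrame(Viewport viewport) {}
        @Override public void onSurfaceChanged(int width, int height) {}
        @Override public void onSurfaceCreated(EGLConfig config) {
            // Compile shaders and create the external OES texture that
            // MediaPlayer/ExoPlayer renders into.
        }
        @Override public void onRendererShutdown() {}
    }

Pass an instance to GvrView.setRenderer(); GVR then calls onDrawEye() twice per frame, once per eye, so the controls are drawn in stereo automatically.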
You need to use OpenGL or an engine such as Unity3D to show the video texture: decode the video on Android and pass the frames to OpenGL or the engine as a texture.

Chromecast Android Games GLES2?

Does the Cast SDK support rendering 3D objects natively? For example, can I upload content for the Chromecast device to render internally on its own hardware? The Chromecast hardware does support GLES2 natively.
Or can I stream GLES2-rendered content to the device? Does the SDK have tools for video game streaming in general?
Does the Chromecast hardware accelerate HTML5 canvas objects?

Record GLSurfaceView on < Android 4.3

I'm developing an app for applying effects to the camera image in real time. Currently I'm using the MediaMuxer class in combination with MediaCodec. Those classes were introduced in Android 4.3.
Now I want to redesign my app and make it compatible with more devices. The only thing I found on the internet was a combination of FFmpeg and OpenCV, but I read that the frame rate is not very good at high resolutions. Is there any way to encode video in real time while capturing the camera image, without using MediaMuxer and MediaCodec?
PS: I'm using GLSurfaceView for OpenGL fragment-shader effects, so that is a must-have.
Real-time encoding of large frames at a moderate frame rate is not going to happen with software codecs.
MediaCodec was introduced in Android 4.1, so you can still take advantage of hardware-accelerated compression as long as you can deal with the various problems. You'd still need an alternative to MediaMuxer if you want a .mp4 file at the end; a sketch of this approach follows below.
Some commercial game recorders, such as Kamcord and Everyplay, claim to work on Android 4.1+, so it's technically possible, though I don't know whether they use non-public APIs to feed surfaces directly into the video encoder.
In pre-Jellybean Android it only gets harder.
(For anyone interested in recording GL in >= 4.3, see EncodeAndMuxTest or Grafika's "Record GL app".)
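For Android 4.1 and 4.2, here is a hedged sketch of the buffer-based MediaCodec path: feed YUV frames (e.g. converted from glReadPixels output) into a video/avc encoder and write the raw H.264 stream yourself, since MediaMuxer is unavailable. Color-format support varies widely across devices, so treat this as an outline, not production code.

    // Sketch: buffer-based MediaCodec encoding, available from API 16 (4.1).
    // Output is a raw .h264 elementary stream, not an .mp4 container.
    // Error handling and per-device color-format negotiation are omitted.
    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    public class LegacyEncoder {
        private MediaCodec codec;
        private FileOutputStream out;

        public void start(int width, int height, String path) throws IOException {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            codec = MediaCodec.createEncoderByType("video/avc");
            codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            codec.start();
            out = new FileOutputStream(path); // raw .h264, not .mp4
        }

        public void encodeFrame(byte[] yuv, long ptsUs) throws IOException {
            int in = codec.dequeueInputBuffer(10000);
            if (in >= 0) {
                ByteBuffer buf = codec.getInputBuffers()[in];
                buf.clear();
                buf.put(yuv);
                codec.queueInputBuffer(in, 0, yuv.length, ptsUs, 0);
            }
            // Drain whatever compressed output is ready.
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int outIx;
            while ((outIx = codec.dequeueOutputBuffer(info, 0)) >= 0) {
                ByteBuffer data = codec.getOutputBuffers()[outIx];
                byte[] chunk = new byte[info.size];
                data.position(info.offset);
                data.get(chunk);
                out.write(chunk);
                codec.releaseOutputBuffer(outIx, false);
            }
        }
    }

The raw .h264 output can be wrapped into an .mp4 container afterwards, for example off-device with FFmpeg.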

Best choice to represent video for iOS and Android

I am developing an app for iOS and Android to make a video call between two devices. The idea is to develop one C++ library for both platforms and to integrate the communication and display parts in the same way.
For compatibility reasons, I am using OpenGL to display the video and FFmpeg to encode the communication, but in some other questions (like here) I have read that this is not the best option for Android. However, I have found that on iOS the OpenGL approach is faster than the native approach I tested.
So the question is: for Android, what is the alternative to OpenGL that still works through JNI? Is there one?
You can use OpenGL ES, a flavor of the OpenGL specification intended for embedded devices and optimized for mobile hardware. OpenGL ES is available on both Android and iOS.
http://developer.android.com/guide/topics/graphics/opengl.html
https://developer.apple.com/opengl-es/
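As a rough illustration of the Android side of such a pipeline, here is a sketch that uploads decoded RGB frames (produced by the shared C++/FFmpeg code through JNI) into an OpenGL ES 2.0 texture. FrameUploader and decodeNextFrame are hypothetical names, not part of any SDK.

    // Sketch: pushing decoded video frames into a GLES 2.0 texture on Android.
    import android.opengl.GLES20;
    import java.nio.ByteBuffer;

    public class FrameUploader {
        private int textureId;

        // Call on the GL thread once the context exists.
        public void init() {
            int[] tex = new int[1];
            GLES20.glGenTextures(1, tex, 0);
            textureId = tex[0];
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                    GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                    GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
            // RGB rows are not 4-byte aligned, so relax the unpack alignment.
            GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1);
        }

        // Upload one RGB frame; call on the GL thread whenever the native
        // decoder produces a frame.
        public void upload(ByteBuffer rgbPixels, int width, int height) {
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGB,
                    width, height, 0, GLES20.GL_RGB,
                    GLES20.GL_UNSIGNED_BYTE, rgbPixels);
            // Then draw a full-screen quad sampling textureId.
        }

        // Hypothetical JNI entry point into the shared C++ decoding library.
        private static native boolean decodeNextFrame(ByteBuffer dst);
    }

Using a direct ByteBuffer here lets the native side write decoded pixels without an extra Java-side copy, which matters at video frame rates.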
The following SO questions can get you in the right direction when it comes to implementing this:
Android Video Player Using NDK, OpenGL ES, and FFmpeg
Android. How play video on Surface(OpenGL)
OpenGL in Android for video display
play video using opengles, android
opengl es(iphone) render from file
https://discussions.apple.com/message/7661759#7661759

Tracking from specific video using QCAR

I am developing an Android application that receives a video stream from a server. I want to use the QCAR SDK to track the frame markers in the video. However, it seems that QCAR can only work with video from the camera device. How can I use QCAR to do AR on a specific video instead of the camera video? Or is there any other SDK that can do this?
QCAR can only use the live camera feed in v1.5. The video goes directly to the tracker without any hooks for developer intervention or redirection from a video source.
This feature has been requested on the wish list.
You may be able to do it with the Metaio UnifeyeMobile SDK. It is more configurable in that way, but it can be quite expensive (unless you are OK with the limitations of the free version).
