Does the Cast SDK support rendering 3D objects natively? Like can I upload content for the Chromecast device to render internally on its own hardware? The Chromecast hardware does support GLES2 natively.
Or can I stream GLES2-rendered content to the device? Does the SDK have tools for video game streaming in general?
Does the Chromecast hardware accelerate HTML5 canvas objects?
I am trying to develop a VR video player using the latest Google VR SDK for Android (v1.0.3), but there is no high-level API for building VR playback controls.
The YouTube VR player uses an old version of the GVR toolkit and renders controls (for example, com.google.android.libraries.youtube.common.ui.TouchImageView) in some way.
What is the best way to implement such controls using the latest VR SDK? Do I need to use a custom renderer with OpenGL or the NDK?
I would be very grateful for implementation details.
The GVR SDK does not provide a way to draw something over VrVideoView, so we need to implement VR video ourselves.
The main idea of the solution is to use GvrView with a custom StereoRenderer.
First of all, we need to implement a VR video renderer (using VR shaders and MediaPlayer/ExoPlayer).
Then we need to implement custom controls on the scene using OpenGL ES and the GVR SDK (head tracking, Eye, etc.).
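The GVR classes themselves only run on a device, but the core of on-scene controls is head-tracking math: derive a forward (gaze) vector from the head's yaw/pitch and test whether it hits a control quad placed in front of the viewer. A minimal sketch in plain Java — the class name, quad position, and sizes are made-up example values, not GVR API:

```java
public class GazeHitTest {
    // Forward (gaze) vector for a given head yaw/pitch, in radians.
    // Uses the usual OpenGL convention: -Z is "straight ahead".
    static float[] gazeVector(float yaw, float pitch) {
        return new float[] {
            (float) (-Math.sin(yaw) * Math.cos(pitch)),
            (float) Math.sin(pitch),
            (float) (-Math.cos(yaw) * Math.cos(pitch)),
        };
    }

    // Intersect the gaze ray (from the origin) with a control quad lying in
    // the plane z = -distance, centered at (cx, cy), with half-extents hw/hh.
    static boolean hitsQuad(float yaw, float pitch,
                            float distance, float cx, float cy,
                            float hw, float hh) {
        float[] g = gazeVector(yaw, pitch);
        if (g[2] >= 0) return false;          // looking away from the quad
        float t = -distance / g[2];           // ray parameter at the plane
        float x = g[0] * t, y = g[1] * t;     // intersection point
        return Math.abs(x - cx) <= hw && Math.abs(y - cy) <= hh;
    }

    public static void main(String[] args) {
        // A hypothetical "play" button 2 m ahead, 0.5 m square, on the view axis.
        System.out.println(hitsQuad(0f, 0f, 2f, 0f, 0f, 0.25f, 0.25f)); // true
        System.out.println(hitsQuad((float) Math.toRadians(45), 0f,
                                    2f, 0f, 0f, 0.25f, 0.25f));          // false
    }
}
```

In onNewFrame you would feed the yaw/pitch from HeadTransform into a test like this, then highlight or activate the control that the gaze hits.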
You need to use OpenGL or an engine such as Unity3D to show the video texture: decode the video in Android and pass the decoded frames to OpenGL (or to the engine) as a texture.
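On Android the usual route is to decode into a SurfaceTexture and sample it in GLSL through the GL_OES_EGL_image_external extension. A sketch of the shader pair, held as Java string constants (the uniform/attribute names are illustrative):

```java
public class VideoShaders {
    // Vertex shader: pass the position through and transform the texture
    // coordinate by the matrix that SurfaceTexture.getTransformMatrix() fills in.
    static final String VERTEX =
            "uniform mat4 uTexMatrix;\n" +
            "attribute vec4 aPosition;\n" +
            "attribute vec4 aTexCoord;\n" +
            "varying vec2 vTexCoord;\n" +
            "void main() {\n" +
            "  gl_Position = aPosition;\n" +
            "  vTexCoord = (uTexMatrix * aTexCoord).xy;\n" +
            "}\n";

    // Fragment shader: samplerExternalOES (not sampler2D) is required for
    // textures backed by a SurfaceTexture, i.e. decoded video frames.
    static final String FRAGMENT =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "uniform samplerExternalOES uTexture;\n" +
            "varying vec2 vTexCoord;\n" +
            "void main() {\n" +
            "  gl_FragColor = texture2D(uTexture, vTexCoord);\n" +
            "}\n";
}
```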
I am developing an app for iOS and Android to make a video call between two devices. The idea is to develop one C++ library for both platforms, and to integrate the communication and display parts in the same way.
For compatibility reasons, I am using OpenGL to display the video and FFMPEG to encode the communication, but in some other questions (like here) I have read that this is not the best option for Android. However, I have found that on iOS the OpenGL approach is faster than the native approach I tested.
So the question is: for Android, what is the alternative to OpenGL that works through JNI? Is there any?
You can use OpenGL ES, a subset of the OpenGL specification intended for embedded and mobile devices. OpenGL ES is available on both Android and iOS.
http://developer.android.com/guide/topics/graphics/opengl.html
https://developer.apple.com/opengl-es/
The following SO questions can get you in the right direction when it comes to implementing this:
Android Video Player Using NDK, OpenGL ES, and FFmpeg
Android. How play video on Surface(OpenGL)
OpenGL in Android for video display
play video using opengles, android
opengl es(iphone) render from file
https://discussions.apple.com/message/7661759#7661759
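Most of the approaches in those links decode with FFmpeg, which typically hands you planar YUV frames, while OpenGL displays RGB. The conversion can be done in a shader or on the CPU; either way the per-pixel math is the same. A sketch of the full-range BT.601 conversion in plain Java (a float approximation, not FFmpeg's exact fixed-point path):

```java
public class YuvToRgb {
    // Full-range BT.601 YUV -> RGB, with chroma centered at 128.
    // Results are clamped to [0, 255].
    static int[] toRgb(int y, int u, int v) {
        int d = u - 128, e = v - 128;
        int r = clamp(y + (int) (1.402f * e));
        int g = clamp(y - (int) (0.344f * d) - (int) (0.714f * e));
        int b = clamp(y + (int) (1.772f * d));
        return new int[] { r, g, b };
    }

    static int clamp(int x) { return Math.max(0, Math.min(255, x)); }

    public static void main(String[] args) {
        // Neutral chroma (u = v = 128) leaves a grey pixel unchanged.
        int[] grey = toRgb(128, 128, 128);
        System.out.println(grey[0] + "," + grey[1] + "," + grey[2]); // 128,128,128
    }
}
```

In practice you would run this as a fragment shader over the Y, U, and V planes uploaded as three textures, rather than per pixel on the CPU.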
Which frameworks have this feature? (Video.) I would like to implement something like this for Android. Are there already approaches? Unfortunately, QCAR (the Qualcomm AR SDK) does not have this feature yet :(
To my knowledge, the QCAR SDK 1.5 Beta for Android only allows you to use the camera video as a texture. This differs from the posted video which shows a movie texture being loaded on a geometrical plane.
The Metaio Mobile SDK 3.0 does allow you to set movie textures on a geometry that matches the video.
http://docs.metaio.com/bin/view/Main/AdvancedContentExample#Movie_textures
The QCAR SDK 1.5 Beta for Android now provides a simple and streamlined way of accessing the video background texture. Developers can now write custom special effects using powerful shaders resulting in far more exciting application experiences. :)
Edit: Vuforia now has a Video Playback sample. Devices running API level 14 (Android 4.0) and above support video on a texture; other devices will play the video in full-screen mode.
I would like to use the native decoder for a custom video player. VideoView and MediaPlayer do not provide the functionality my requirements need.
I am using FFMPEG (software decoder) right now, but I would prefer to use native hardware decoding if possible. Is there a way to do this through the NDK?
There isn't currently a public API for accessing the native decoding hardware, and the presence of such hardware isn't guaranteed anyway. While you could dig into the DSP capabilities of the ARM processors used by some devices, that wouldn't be portable to all Android devices.
I would recommend continuing your software approach for guaranteed support on all devices.
I've been trying to build a 720p streaming video player in App Inventor, and cannot figure out how to get the video player to do anything remotely like 720p, nor can I figure out how to get it to listen to / attach to a UDP video stream over Wi-Fi.
The doc for the App Inventor video player component is here:
http://appinventor.googlelabs.com/learn/reference/components/media.html#VideoPlayer
The doc for the supported video sizes in android is here:
http://developer.android.com/guide/appendix/media-formats.html
Reading these docs, I'm left with the impression that
Android only supports 480 x 360 H.264 video (I'm fine with H.264, but not with 480 x 360) in its native widgets.
App Inventor does not support Streaming.
Has your mileage varied?
Actually, there isn't a spec for which encoding sizes are supported, or whether hardware acceleration is required of devices, so it depends on the manufacturer and model. There is, however, a minimal set of requirements set by Google in the Android Compatibility Definition Document if a manufacturer wants Android Market.
App Inventor is a really interesting Google Labs project, but realize that it is a 'Labs' project, so trying to build a streaming player with it might be a bit out of scope. I think it's more for hobbyists and education (Lego Mindstorms modules) than anything else. Since App Inventor just wraps the Android framework, you would be better off going straight to the Android SDK and doing it there.
My recommendation, if you are trying to do a streaming app, use Java and the Android SDK.
App Inventor does not support Streaming.
App Inventor supports the "Activity Starter Component" which can be used to start the Android Video Player application, which will play an RTSP stream.
I used the following properties on the Activity Starter component to start an rtsp stream:
Action: android.intent.action.VIEW
DataUri: rtsp://a.sample.domain/somestream.sdp
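For the VIEW action to be routed to the video player, the DataUri just has to be a well-formed rtsp:// URI. A quick sanity check in plain Java, using the placeholder domain from the answer above:

```java
import java.net.URI;

public class RtspUriCheck {
    public static void main(String[] args) throws Exception {
        URI uri = new URI("rtsp://a.sample.domain/somestream.sdp");
        System.out.println(uri.getScheme()); // rtsp
        System.out.println(uri.getPath());   // /somestream.sdp
    }
}
```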