I am developing an app for iOS and Android that makes a video call between two devices. The idea is to develop one C++ library for both platforms and to integrate the communication and display parts in the same way.
For compatibility reasons, I am using OpenGL to render the video and FFmpeg to encode the communication, but in some other questions (like here) I have read that this is not the best option for Android. However, I have found that on iOS the OpenGL approach is faster than the native approach I tested.
So the question is: for Android, what is the alternative to OpenGL that works through JNI? Is there one?
You can use OpenGL ES, a flavor of the OpenGL specification intended for embedded devices and optimized for mobile hardware. OpenGL ES is available on both Android and iOS.
http://developer.android.com/guide/topics/graphics/opengl.html
https://developer.apple.com/opengl-es/
The following SO questions can get you in the right direction when it comes to implementing this:
Android Video Player Using NDK, OpenGL ES, and FFmpeg
Android. How play video on Surface(OpenGL)
OpenGL in Android for video display
play video using opengles, android
opengl es(iphone) render from file
https://discussions.apple.com/message/7661759#7661759
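Whichever of those you follow, the core pattern is usually the same: decode a frame (for example with FFmpeg through JNI) and upload it into an OpenGL ES texture once per frame. A minimal Java-side sketch of the upload step, assuming tightly packed RGB pixels arrive in frameBuffer (all names here are placeholders, not from any of the linked answers):

    import android.opengl.GLES20;
    import java.nio.ByteBuffer;

    // Holds one GL texture and re-uploads each decoded frame into it.
    public class VideoFrameTexture {
        private int textureId;

        // Create the texture once, on the GL thread.
        public void create() {
            int[] ids = new int[1];
            GLES20.glGenTextures(1, ids, 0);
            textureId = ids[0];
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                    GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                    GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        }

        // Upload one decoded frame; call on the GL thread.
        public void upload(ByteBuffer frameBuffer, int width, int height) {
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
            // RGB rows are not 4-byte aligned in general.
            GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1);
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGB,
                    width, height, 0, GLES20.GL_RGB,
                    GLES20.GL_UNSIGNED_BYTE, frameBuffer);
        }
    }

In practice you would do the color conversion (YUV to RGB) in a shader or on the C++ side with libswscale before the upload.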
I am trying to develop a VR video player using latest Google VR SDK for Android (v1.0.3), but there is no high-level API to build VR playback controls.
YouTube VR player uses old version of gvr toolkit and renders controls (for example, com.google.android.libraries.youtube.common.ui.TouchImageView) in some way.
What is the best way to implement such controls using the latest VR SDK? Do I need to use a custom renderer with OpenGL or the NDK?
I would be very grateful for implementation details.
The GVR SDK does not provide a way to draw anything over VrVideoView, so we need to implement VR video playback ourselves.
The main idea of the solution is to use GvrView with a custom StereoRenderer.
First of all, we need to implement a VR video renderer (using VR shaders and MediaPlayer/ExoPlayer).
Then we need to implement custom controls in the scene using OpenGL ES and the GVR SDK (head tracking, Eye, etc.), as sketched below.
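For reference, a skeleton of such a custom renderer might look like the following. The callback set comes from the GvrView.StereoRenderer interface in the GVR SDK; the actual drawing is only indicated by comments, and the class name is a placeholder:

    import com.google.vr.sdk.base.Eye;
    import com.google.vr.sdk.base.GvrView;
    import com.google.vr.sdk.base.HeadTransform;
    import com.google.vr.sdk.base.Viewport;
    import javax.microedition.khronos.egl.EGLConfig;

    // Skeleton renderer: draws the video geometry first, then the
    // playback controls on top, using the per-eye matrices from the SDK.
    public class VideoSceneRenderer implements GvrView.StereoRenderer {
        private final float[] headView = new float[16];

        @Override
        public void onNewFrame(HeadTransform headTransform) {
            // Head pose, useful for gaze-based selection of controls.
            headTransform.getHeadView(headView, 0);
            // Also update the video SurfaceTexture here (updateTexImage()).
        }

        @Override
        public void onDrawEye(Eye eye) {
            // eye.getEyeView() and eye.getPerspective(near, far) give the
            // view and projection matrices; draw the video quad/sphere
            // first, then the control quads so they appear over the video.
        }

        @Override
        public void onFinishFrame(Viewport viewport) {}

        @Override
        public void onSurfaceChanged(int width, int height) {}

        @Override
        public void onSurfaceCreated(EGLConfig config) {}

        @Override
        public void onRendererShutdown() {}
    }

Attach it with gvrView.setRenderer(new VideoSceneRenderer()) and feed the video frames in through a SurfaceTexture, as in the MediaPlayer example further down.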
You need to use OpenGL ES or an engine such as Unity3D to show the video texture: decode the video on Android and pass the resulting texture to OpenGL (or to the engine) for display.
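A common way to do that decode-and-hand-off step is to let MediaPlayer render into a SurfaceTexture bound to an external OES texture. A minimal sketch, assuming a GL context is current on the calling thread (class and method names are placeholders):

    import android.graphics.SurfaceTexture;
    import android.media.MediaPlayer;
    import android.opengl.GLES11Ext;
    import android.opengl.GLES20;
    import android.view.Surface;

    // Routes MediaPlayer's decoded output into an external OES texture
    // that a GLSL shader (or a game engine) can sample.
    public class VideoToTexture {
        public SurfaceTexture attachPlayer(MediaPlayer player) {
            int[] tex = new int[1];
            GLES20.glGenTextures(1, tex, 0);
            GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

            SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
            Surface surface = new Surface(surfaceTexture);
            player.setSurface(surface);
            surface.release(); // MediaPlayer keeps its own reference

            // Call surfaceTexture.updateTexImage() on the GL thread for
            // each new frame before drawing with the OES texture.
            return surfaceTexture;
        }
    }

The shader that samples this texture must declare a samplerExternalOES and enable the GL_OES_EGL_image_external extension.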
Does the Cast SDK support rendering 3D objects natively? Like can I upload content for the Chromecast device to render internally on its own hardware? The Chromecast hardware does support GLES2 natively.
Or can I stream GLES2 rendered content to the device? Does the SDK have tools for Video Game streaming in general?
Does the Chromecast hardware accelerate HTML5 canvas objects?
Which frameworks have this feature? (Video) I would like to implement something like this for Android. Are there already approaches? Unfortunately, QCAR (the Qualcomm AR SDK) does not have this feature yet :(
To my knowledge, the QCAR SDK 1.5 Beta for Android only allows you to use the camera video as a texture. This differs from the posted video, which shows a movie texture being loaded onto a geometric plane.
The Metaio Mobile SDK 3.0 does allow you to set movie textures on a geometry that matches the video.
http://docs.metaio.com/bin/view/Main/AdvancedContentExample#Movie_textures
The QCAR SDK 1.5 Beta for Android now provides a simple and streamlined way of accessing the video background texture. Developers can now write custom special effects using powerful shaders, resulting in far more exciting application experiences. :)
Edit: Vuforia now has a Video Playback sample. Devices from API level 14 (Android 4.0) support video on a texture; other devices will play the video in full-screen mode.
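As a toy illustration of the kind of shader effect this enables, here is a trivial grayscale fragment shader kept as a Java string; the varying and uniform names are placeholders, not identifiers defined by the QCAR/Vuforia SDK:

    // GLSL fragment shader that samples the video background texture
    // and renders it in grayscale (BT.601 luma weights).
    public final class Shaders {
        public static final String GRAYSCALE_FRAGMENT =
                "precision mediump float;\n"
              + "varying vec2 vTexCoord;\n"
              + "uniform sampler2D uVideoTexture;\n"
              + "void main() {\n"
              + "  vec3 c = texture2D(uVideoTexture, vTexCoord).rgb;\n"
              + "  float gray = dot(c, vec3(0.299, 0.587, 0.114));\n"
              + "  gl_FragColor = vec4(vec3(gray), 1.0);\n"
              + "}\n";
    }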
I would like to use the native decoder for a custom video player. VideoView and MediaPlayer do not provide the functionality my requirements need.
I am using FFmpeg (a software decoder) right now, but I would prefer to use native hardware decoding if possible. Is there a way to do this through the NDK?
There isn't currently a public API for accessing native decoding hardware, and the presence of such hardware isn't guaranteed in any case. While you could go digging into the DSP capabilities of the ARM processors used by some devices, that wouldn't be portable across all Android devices.
I would recommend continuing with your software approach, which guarantees support on all devices.
I'm very new to Android, and I need to add my own codec to Android. Since I'm starting from the basics, can someone please explain the steps I need to take in order to add a new codec to Android?
This is virtually impossible to do in a portable way, as all audio and video codecs are compiled at the platform level (due to the fact that most of the time they require hardware-specific acceleration).
If you are only interested in this working on a specific hardware platform and have an unlocked bootloader (so you can boot a custom build of Android), you can compile the full Android platform from scratch using the AOSP as a base.
Depending on which version of Android you're targeting, you're looking at adding code to either OpenCore or Stagefright (the subsystems Android uses for A/V decoding and parsing). There you can add audio decoders, audio encoders, video encoders, video decoders, and container parsers.
Here is some discussion of adding to Stagefright:
http://freepine.blogspot.com/2010/01/overview-of-stagefrighter-player.html
http://groups.google.com/group/android-porting/msg/5d88e76845a22bbb
However, unless the encoding scheme you wish to support is very simple (what do you want to add?), it is likely to be too CPU-intensive for most Android devices to run without offloading some of the work to another system (such as the radio chipset or the GPU).
In the Android framework, MediaCodec is implemented on top of Stagefright, a media playback engine at the native level with built-in software codecs for popular media formats. Stagefright also supports integration with custom hardware codecs provided as OpenMAX components. Here is an article that summarizes the steps.
CHECK HERE
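To see which codecs (built-in software codecs as well as vendor OpenMAX components) are actually registered on a given device, you can enumerate them from the SDK side. A minimal sketch using the standard MediaCodecList API (available since API level 16; the class name is a placeholder):

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;

    // Prints every codec the platform has registered, along with the
    // MIME types each one supports.
    public class CodecDump {
        public static void dump() {
            for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
                StringBuilder types = new StringBuilder();
                for (String type : info.getSupportedTypes()) {
                    types.append(type).append(' ');
                }
                System.out.println((info.isEncoder() ? "encoder: " : "decoder: ")
                        + info.getName() + " -> " + types);
            }
        }
    }

A codec you add at the platform level should show up in this list once its OpenMAX component is registered.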