I have an Android application that displays a video using GStreamer. It's similar to the tutorial mentioned here:
http://docs.gstreamer.com/display/GstSDK/Android+tutorial+3%3A+Video
In particular, it uses the GStreamerSurfaceView, which extends SurfaceView.
I now want to perform some processing on the video with another library that uses a GLSurfaceView:
class DemoGLSurfaceView extends GLSurfaceView {
public DemoGLSurfaceView(Context context) {
super(context);
setEGLContextClientVersion(2);
mRenderer = new DemoRenderer(context);
setRenderer(mRenderer);
}
DemoRenderer mRenderer;
}
class DemoRenderer implements GLSurfaceView.Renderer {
Context act;
public DemoRenderer(Context context) {
act = context;
}
@Override
public void onDrawFrame(GL10 arg0) {
nativeRender();
}
@Override
public void onSurfaceChanged(GL10 arg0, int arg1, int arg2) {
}
@Override
public void onSurfaceCreated(GL10 arg0, EGLConfig arg1) {
}
}
private static native void nativeRender();
How can I "insert" the GLSurfaceView nativeRender process into the GStreamerSurfaceView?
One solution is to develop a GStreamer application video sink (appsink) that pulls each frame into memory and copies it into the GLSurfaceView renderer's onDrawFrame path.
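For illustration, here is a rough sketch of that approach (not the tutorial's code): it assumes the native appsink side copies each decoded RGBA frame and forwards it to a hypothetical pushFrame() callback on the renderer, which uploads it to a GLES texture before calling nativeRender().
import java.nio.ByteBuffer;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;

class DemoRenderer implements GLSurfaceView.Renderer {
    private final Object lock = new Object();
    private ByteBuffer pendingFrame;       // latest RGBA frame delivered by GStreamer
    private int frameWidth, frameHeight;
    private int textureId;

    // Called from JNI (appsink "new-sample" callback) with a copied RGBA buffer.
    public void pushFrame(byte[] rgba, int width, int height) {
        synchronized (lock) {
            pendingFrame = ByteBuffer.wrap(rgba);
            frameWidth = width;
            frameHeight = height;
        }
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        textureId = tex[0];
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        synchronized (lock) {
            if (pendingFrame != null) {
                // Upload the most recent frame into the texture the native renderer samples.
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
                GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
                        frameWidth, frameHeight, 0,
                        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pendingFrame);
                pendingFrame = null;
            }
        }
        nativeRender(); // the processing library's GL pass
    }

    private static native void nativeRender();
}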
I have written a Fragment class which is dedicated to displaying a constant stream of android.graphics.Bitmap objects. The stream of Bitmap objects is delivered by JavaCV.
Do I have to use the Canvas.drawBitmap method to display the Bitmaps in order to get a video?
The class looks like this
/**
* A simple {@link Fragment} subclass for showing the video stream.
*/
@EFragment(R.layout.fragment_video)
public class VideoFragment extends Fragment implements TextureView.SurfaceTextureListener, VideoListener {
public static final String TAG = "de.mw.talk2drone.ui.video.VideoFragment";
@ViewById
TextureView textureView;
@Override
public void onCreate(final Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
}
@Override
public void onSurfaceTextureAvailable(@NonNull final SurfaceTexture surface, final int width, final int height) {
}
@Override
public void onSurfaceTextureSizeChanged(@NonNull final SurfaceTexture surface, final int width, final int height) {
}
@Override
public boolean onSurfaceTextureDestroyed(@NonNull final SurfaceTexture surface) {
return false;
}
@Override
public void onSurfaceTextureUpdated(@NonNull final SurfaceTexture surface) {
}
@Override
public void onFrameReceived(TelloVideoFrame frame) {
Log.d(TAG, "received a frame in Fragment: " +frame);
}
}
Do you have some samples or links showing how to smoothly display a Bitmap stream in a TextureView? I have seen multiple references to Grafika, which may give a clue about what has to be done.
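One straightforward option, sketched below, is to lock the TextureView's canvas when each frame arrives and draw the Bitmap with Canvas.drawBitmap. The getBitmap() accessor on TelloVideoFrame is an assumption, not part of the code above; imports of android.graphics.Bitmap, Canvas and Rect are also needed.
@Override
public void onFrameReceived(TelloVideoFrame frame) {
    Bitmap bitmap = frame.getBitmap(); // hypothetical accessor on TelloVideoFrame
    if (bitmap == null || textureView == null || !textureView.isAvailable()) {
        return;
    }
    Canvas canvas = textureView.lockCanvas();
    if (canvas == null) {
        return;
    }
    try {
        // Scale the frame to fill the view; reuse a preallocated Rect in production code.
        canvas.drawBitmap(bitmap, null,
                new Rect(0, 0, canvas.getWidth(), canvas.getHeight()), null);
    } finally {
        textureView.unlockCanvasAndPost(canvas);
    }
}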
While creating my first app in AndEngine I am getting only a black screen instead of the background image and play button.
Here is the code:
MainActivity
public class MainActivity extends BaseGameActivity {
private BoundCamera camera;
private float WIDTH = 800;
private float HEIGHT = 480;
@Override
public Engine onCreateEngine(EngineOptions engineOptions){
return new LimitedFPSEngine(engineOptions,60);
}
@Override
public EngineOptions onCreateEngineOptions() {
camera = new BoundCamera(0,0,WIDTH,HEIGHT);
EngineOptions engineOptions = new
EngineOptions(true, ScreenOrientation.LANDSCAPE_FIXED, new FillResolutionPolicy(),camera);
engineOptions.getAudioOptions().setNeedsMusic(true).setNeedsSound(true);
return engineOptions;
}
@Override
public void onCreateResources(OnCreateResourcesCallback pOnCreateResourcesCallback) throws IOException {
}
@Override
public void onCreateScene(OnCreateSceneCallback pOnCreateSceneCallback) throws IOException {
}
@Override
public void onPopulateScene(Scene pScene, OnPopulateSceneCallback pOnPopulateSceneCallback) throws IOException {
}
}
BaseScene.java
public abstract class BaseScene extends Scene {
protected Engine engine;
protected Activity activity;
protected ResourceManager resourceManager;
protected VertexBufferObjectManager vbom;
protected Camera camera;
public BaseScene(){
this.resourceManager = ResourceManager.getInstance();
this.activity = resourceManager.activity;
this.engine = resourceManager.engine;
this.vbom = resourceManager.vbom;
this.camera = resourceManager.camera;
createScene();
}
public abstract void createScene();
public abstract void onBackKeyPressed();
public abstract SceneManager.SceneType getSceneType();
public abstract void disposeScene();
}
SceneManager.java
public class SceneManager {
private BaseScene mainMenu;
private BaseScene gameScene;
private BaseScene currentScene;
private static final SceneManager INSTANCE = new SceneManager();
private SceneType currentSceneType = SceneType.SCENE_MENU;
private Engine engine = ResourceManager.getInstance().engine;
public enum SceneType
{
SCENE_MENU,
SCENE_GAME
}
public void setScene(BaseScene scene){
engine.setScene(scene);
currentScene = scene;
currentSceneType = scene.getSceneType();
}
public static SceneManager getInstance(){
return INSTANCE;
}
public SceneType getSceneType(){
return currentSceneType;
}
}
I have 2 more classes, MainMenu and ResourceManager.
Where did I go wrong?
At a minimum, you should implement 4 callback methods from the superclass in your game activity:
onCreateEngineOptions: where you specify the main characteristics of your game engine (e.g. camera, rendering options, sound options).
onCreateResources: where you load the textures and sounds that you need right away once your game is launched.
onCreateScene: where you instantiate your game's first scene. This scene will be shown by the engine when the user starts your game.
onPopulateScene: implementing this callback is optional and depends on your design architecture. However, you should still call the given callback to let the engine go ahead.
In these methods you're given a callback object such as pOnCreateSceneCallback. You must invoke it once you're done in that method, otherwise the engine gets stuck and won't load your game.
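As a minimal sketch of what "calling the given callback" looks like in the activity above (the blue background and the empty resource loading are placeholders, just so something visible is rendered):
@Override
public void onCreateResources(OnCreateResourcesCallback pOnCreateResourcesCallback) throws IOException {
    // load textures and sounds here, then signal completion
    pOnCreateResourcesCallback.onCreateResourcesFinished();
}

@Override
public void onCreateScene(OnCreateSceneCallback pOnCreateSceneCallback) throws IOException {
    Scene scene = new Scene();
    scene.setBackground(new Background(0.1f, 0.4f, 0.8f)); // solid color so the screen is not black
    pOnCreateSceneCallback.onCreateSceneFinished(scene);
}

@Override
public void onPopulateScene(Scene pScene, OnPopulateSceneCallback pOnPopulateSceneCallback) throws IOException {
    pOnPopulateSceneCallback.onPopulateSceneFinished();
}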
Getting started with AndEngine
How to get started with andengine
http://www.matim-dev.com/tutorials.html
http://andengine.wikidot.com/getting-started-with-andengine
The black screen also appears when a texture (image) is larger than your BitmapTextureAtlas, so ensure that you create the BitmapTextureAtlas with the required size.
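For example, inside onCreateResources you might load an 800x480 background into an atlas that is at least that large; the file name, asset path and atlas size below are placeholders:
// The atlas must be at least as large as the images placed on it (here 1024x512 for an 800x480 image).
BitmapTextureAtlasTextureRegionFactory.setAssetBasePath("gfx/");
BitmapTextureAtlas atlas = new BitmapTextureAtlas(
        getTextureManager(), 1024, 512, TextureOptions.BILINEAR);
ITextureRegion backgroundRegion = BitmapTextureAtlasTextureRegionFactory
        .createFromAsset(atlas, this, "background.png", 0, 0);
atlas.load();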
I'm trying to create a video wallpaper using MediaPlayer. I also want to use my own shaders to apply visual effects to the video. I've figured out that this is possible if I use a TextureView. Here is an example which works perfectly, but only for an Activity. I need the same functionality for a WallpaperService. So I tried to replace GLSurfaceView with a GLTextureView class. I had a class which works great:
public final class GLWallpaperService extends WallpaperService {
@Override
public Engine onCreateEngine() {
return new WallpaperEngine();
}
private final class WallpaperEngine extends Engine implements
SharedPreferences.OnSharedPreferenceChangeListener {
// Slightly modified GLSurfaceView.
private WallpaperGLSurfaceView mGLSurfaceView;
private SharedPreferences mPreferences;
private EngineCore mRenderer;
@Override
public void onCreate(SurfaceHolder surfaceHolder) {
super.onCreate(surfaceHolder);
mRenderer = new EngineCore();
mRenderer.setContext(GLWallpaperService.this);
mGLSurfaceView = new WallpaperGLSurfaceView(GLWallpaperService.this);
mGLSurfaceView.setEGLContextClientVersion(2);
mGLSurfaceView.setRenderer(mRenderer);
mGLSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
mGLSurfaceView.onPause();
}
@Override
public void onDestroy() {
super.onDestroy();
mGLSurfaceView.onDestroy();
mGLSurfaceView = null;
mRenderer = null;
}
@Override
public void onVisibilityChanged(boolean visible) {
super.onVisibilityChanged(visible);
if (visible) {
mGLSurfaceView.onResume();
} else {
mGLSurfaceView.onPause();
}
}
/**
* Lazy as I am, I didn't bother using the GLWallpaperService project (found
* on GitHub) for wrapping OpenGL functionality into my wallpaper service.
* Instead I am using GLSurfaceView and trick it into hooking into the
* Engine-provided SurfaceHolder instead of the SurfaceView-provided one
* that GLSurfaceView extends.
*/
private final class WallpaperGLSurfaceView extends GLSurfaceView {
public WallpaperGLSurfaceView(Context context) {
super(context);
}
@Override
public SurfaceHolder getHolder() {
return WallpaperEngine.this.getSurfaceHolder();
}
/**
* Should be called once underlying Engine is destroyed. Calling
* onDetachedFromWindow() will stop rendering thread which is lost
* otherwise.
*/
public void onDestroy() {
super.onDetachedFromWindow();
}
}
}
}
And I got a new one by replacing GLSurfaceView with GLTextureView:
public final class GLTextureWallpaperService extends WallpaperService {
@Override
public Engine onCreateEngine() {
return new WallpaperEngine();
}
private final class WallpaperEngine extends Engine {
// Slightly modified GLTextureView.
private WallpaperGLTextureView mGLTextureView;
private EngineRenderer mRenderer;
@Override
public void onCreate(SurfaceHolder surfaceHolder) {
super.onCreate(surfaceHolder);
mRenderer = new EngineRenderer();
mRenderer.setContext(GLTextureWallpaperService.this);
mGLTextureView = new WallpaperGLTextureView(GLTextureWallpaperService.this);
mGLTextureView.setEGLContextClientVersion(2);
mGLTextureView.setRenderer(mRenderer);
mGLTextureView.setRenderMode(GLTextureView.RENDERMODE_CONTINUOUSLY);
mGLTextureView.onPause();
}
@Override
public void onDestroy() {
super.onDestroy();
mGLTextureView.onDestroy();
mGLTextureView = null;
mRenderer = null;
}
@Override
public void onVisibilityChanged(boolean visible) {
super.onVisibilityChanged(visible);
if (visible) {
mGLTextureView.onResume();
} else {
mGLTextureView.onPause();
}
}
/**
* Lazy as I am, I didn't bother using the GLWallpaperService project (found
* on GitHub) for wrapping OpenGL functionality into my wallpaper service.
* Instead I am using GLSurfaceView and trick it into hooking into the
* Engine-provided SurfaceHolder instead of the SurfaceView-provided one
* that GLSurfaceView extends.
*/
private final class WallpaperGLTextureView extends GLTextureView {
public WallpaperGLTextureView(Context context) {
super(context);
}
// THIS DOES NOT EXIST IN THE GLTEXTUREVIEW CLASS!
/*
@Override
public SurfaceHolder getHolder() {
Log.e("getHolder", "getHolder");
return WallpaperEngine.this.getSurfaceHolder();
}
*/
// TRIED TO REPLACE getHolder() WITH THIS ONE - NO RESULTS!
@Override
public SurfaceTexture getSurfaceTexture() {
Log.e("getSurfaceTexture", "getSurfaceTexture");
return (SurfaceTexture) WallpaperEngine.this.getSurfaceHolder();
}
public void onDestroy() {
super.onDetachedFromWindow();
}
}
}
}
Renderer class:
public class EngineRenderer implements GLTextureView.Renderer {
Context context;
public void setContext(Context context){
this.context=context;
}
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
Log.e("ds", "-onSurfaceCreated--");
}
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
Log.e("ds", "-onSurfaceChanged--");
}
@Override
public void onDrawFrame(GL10 gl) {
GLES20.glClearColor(1, 0, 0, 1);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
}
@Override
public void onSurfaceDestroyed(GL10 gl) {
}
}
I don't get any errors, just a black screen. The Renderer class is never called to draw! It seems to me that GLTextureView doesn't have a getHolder() function.
What should I do? Maybe there is another way to achieve my goal.
I'm trying to get a simple libgdx project running on Android. Everything is fine, but my InputProcessor does not fire its events.
I implemented everything according to this tutorial:
http://code.google.com/p/libgdx-users/wiki/IntegratingAndroidNativeUiElements3TierProjectSetup#Code_Example
The first call of "showToast" works fine and is shown on my screen => The showToast-Method does work. Unfortunately, I can't fire any of the InputProcessor events. Even the debugger does not stop there, so they are definitely not called.
Edit: Here is the complete code. I only omitted the Calculator class, since it works fine and should not be of any concern here. If anyone disagrees, I can always add it, of course.
Surface class in the libgdx main project (the main class, so to speak)
public class Surface implements ApplicationListener {
ActionResolver actionResolver;
SpriteBatch spriteBatch;
Texture texture;
Calculator calculator;
public Surface(ActionResolver actionResolver) {
this.actionResolver = actionResolver;
}
@Override
public void create() {
spriteBatch = new SpriteBatch();
texture = new Texture(Gdx.files.internal("ship.png"));
calculator = new Calculator(texture);
actionResolver.showToast("Tap screen to open Activity");
Gdx.input.setInputProcessor(new InputProcessor() {
@Override
public boolean touchDown(int x, int y, int pointer, int button) {
actionResolver.showToast("touchDown");
actionResolver.showMyList();
return true;
}
// overriding all other interface-methods the same way
});
}
@Override
public void render() {
Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
calculator.update();
spriteBatch.begin();
calculator.draw(spriteBatch);
spriteBatch.end();
}
// Overriding resize, pause, resume, dispose without functionality
}
ActionResolver interface in libgdx main project
public interface ActionResolver {
public void showMyList();
public void showToast(String toastMessage);
}
Implementation of the ActionResolver interface within my Android project
public class ActionResolverImpl implements ActionResolver {
Handler uiThread;
Context appContext;
public ActionResolverImpl(Context appContext) {
uiThread = new Handler();
this.appContext = appContext;
}
@Override
public void showMyList() {
appContext.startActivity(new Intent(this.appContext, MyListActivity.class));
}
@Override
public void showToast(final String toastMessage) {
uiThread.post(new Runnable() {
@Override
public void run() {
Toast.makeText(appContext, toastMessage, Toast.LENGTH_SHORT).show();
}
});
}
}
Android Activity for initializing the Surface class
public class AndroidActivity extends AndroidApplication {
ActionResolverImpl actionResolver;
@Override
public void onCreate(android.os.Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
actionResolver = new ActionResolverImpl(this);
initialize(new Surface(actionResolver), false);
}
}
I also implemented the InputProcessor in my Surface class, but this should not (and did not) make any difference. Any ideas what I'm missing?
I am running a frame-by-frame animation using sequential images in a SurfaceView. I am declaring it as follows:
public class myView extends SurfaceView implements SurfaceHolder.Callback
{
public myView (Context paramContext, Listener paramListener)
{
super(paramContext);
getHolder().addCallback(this);
//Some Code
}
public void doDraw(Canvas paramCanvas, int imgpos)
{
// Animation from image source using InputStream
}
public boolean onTouchEvent(MotionEvent paramMotionEvent)
{
return super.onTouchEvent(paramMotionEvent);
}
public void surfaceChanged(SurfaceHolder paramSurfaceHolder, int paramInt1, int paramInt2, int paramInt3)
{
//Some Code
}
public void surfaceCreated(SurfaceHolder paramSurfaceHolder)
{
//Some code
}
public void surfaceDestroyed(SurfaceHolder paramSurfaceHolder)
{
//Some Code
}
}
Now I am trying to implement a Listener for the animation-end event for the above using the following code:
public static abstract interface Listener
{
public abstract void onAnimationEnd();
}
I am stuck with the above code. Can anyone suggest how I can implement this interface in another activity, so that I can find out in that activity when the current animation in the SurfaceView is over?
Thanks in Advance,
Tim
Add setAnimationListener to your animation. If you assign the animation in the form of a drawable in an XML file, check here to learn how to get the AnimationDrawable.
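For reference, a minimal sketch of attaching an AnimationListener to a view Animation and reacting to its end (R.anim.my_anim and myView are placeholder names):
Animation anim = AnimationUtils.loadAnimation(this, R.anim.my_anim);
anim.setAnimationListener(new Animation.AnimationListener() {
    @Override
    public void onAnimationStart(Animation animation) { }

    @Override
    public void onAnimationRepeat(Animation animation) { }

    @Override
    public void onAnimationEnd(Animation animation) {
        // the animation has finished; notify whoever needs to know
    }
});
myView.startAnimation(anim);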