Why are my OpenGL textures painted pink? - android

I set up WebRTC on Android (peer-to-peer video chat). When I draw the texture that comes from the local camera, everything is fine, but when I try to draw the texture that comes from the remote smartphone, I get a pink image, something like this:
On the WebRTC side, I just do this to get the remote stream:
mRemoteVideoTrack = getRemoteVideoTrack();
mRemoteVideoTrack.setEnabled(true);
mRemoteVideoTrack.addSink(mRemoteProxyVideoSink);

private VideoTrack getRemoteVideoTrack() {
    for (RtpTransceiver transceiver : mPeerConnection.getTransceivers()) {
        MediaStreamTrack track = transceiver.getReceiver().track();
        if (track instanceof VideoTrack) {
            return (VideoTrack) track;
        }
    }
    return null;
}
and I get the texture ID in the mRemoteProxyVideoSink:
private class RemoteProxyVideoSink implements VideoSink {
    @Override
    public synchronized void onFrame(VideoFrame frame) {
        VideoFrame.TextureBuffer textureBuffer = (VideoFrame.TextureBuffer) frame.getBuffer();
        mTextureID = textureBuffer.getTextureId();
        // ... draw mTextureID (on the UI thread, because onFrame is not fired on the UI thread) ...
    }
}
Any idea why my textures are painted pink?
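One thing worth verifying before the draw call, as a sketch against the stock org.webrtc API: remote frames usually arrive as decoded I420 buffers rather than texture buffers, and an OES texture sampled with a plain sampler2D (or YUV data drawn as RGB) typically shows up tinted, so the unchecked cast above hides useful information:

@Override
public void onFrame(VideoFrame frame) {
    VideoFrame.Buffer buffer = frame.getBuffer();
    if (buffer instanceof VideoFrame.TextureBuffer) {
        VideoFrame.TextureBuffer tb = (VideoFrame.TextureBuffer) buffer;
        // Type.OES needs samplerExternalOES in the shader; Type.RGB needs sampler2D.
        Log.d(TAG, "texture buffer, type=" + tb.getType());
    } else {
        // Decoded remote frames are commonly I420 byte buffers, not GL textures;
        // they need a YUV upload/conversion path instead of getTextureId().
        Log.d(TAG, "non-texture buffer: " + buffer.getClass().getSimpleName());
    }
}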

Related

Draw text or image on the camera stream (GLSL)

I have a live-broadcasting app based on grafika's examples, where I send my video feed over RTMP to be broadcast live.
I now want to watermark my video by overlaying text or a logo on the video stream. I know this can be done with GLSL filtering, but I have no idea how to implement it based on the sample that I linked.
I tried alpha blending, but it seems the two texture formats are somehow incompatible (one being TEXTURE_EXTERNAL_OES and the other TEXTURE_2D), and I just get a black frame in return.
EDIT:
I based my code on the Kickflip API:
class CameraSurfaceRenderer implements GLSurfaceView.Renderer {
    private static final String TAG = "CameraSurfaceRenderer";
    private static final boolean VERBOSE = false;

    private CameraEncoder mCameraEncoder;
    private FullFrameRect mFullScreenCamera;
    private FullFrameRect mFullScreenOverlay;     // For texture overlay
    private final float[] mSTMatrix = new float[16];
    private int mOverlayTextureId;
    private int mCameraTextureId;
    private boolean mRecordingEnabled;
    private int mFrameCount;

    // Keep track of selected filters + relevant state
    private boolean mIncomingSizeUpdated;
    private int mIncomingWidth;
    private int mIncomingHeight;
    private int mCurrentFilter;
    private int mNewFilter;

    boolean showBox = false;

    /**
     * Constructs CameraSurfaceRenderer.
     * <p>
     * @param recorder video encoder object
     */
    public CameraSurfaceRenderer(CameraEncoder recorder) {
        mCameraEncoder = recorder;
        mCameraTextureId = -1;
        mFrameCount = -1;
        SessionConfig config = recorder.getConfig();
        mIncomingWidth = config.getVideoWidth();
        mIncomingHeight = config.getVideoHeight();
        mIncomingSizeUpdated = true;    // Force texture size update on next onDrawFrame
        mCurrentFilter = -1;
        mNewFilter = Filters.FILTER_NONE;
        mRecordingEnabled = false;
    }

    /**
     * Notifies the renderer that we want to stop or start recording.
     */
    public void changeRecordingState(boolean isRecording) {
        Log.d(TAG, "changeRecordingState: was " + mRecordingEnabled + " now " + isRecording);
        mRecordingEnabled = isRecording;
    }

    @Override
    public void onSurfaceCreated(GL10 unused, EGLConfig config) {
        Log.d(TAG, "onSurfaceCreated");
        // Set up the texture blitter that will be used for on-screen display. This
        // is *not* applied to the recording, because that uses a separate shader.
        mFullScreenCamera = new FullFrameRect(
                new Texture2dProgram(Texture2dProgram.ProgramType.TEXTURE_EXT));
        // For texture overlay:
        GLES20.glEnable(GLES20.GL_BLEND);
        GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
        mFullScreenOverlay = new FullFrameRect(
                new Texture2dProgram(Texture2dProgram.ProgramType.TEXTURE_2D));
        mOverlayTextureId = GlUtil.createTextureWithTextContent("hello!");
        mOverlayTextureId = GlUtil.createTextureFromImage(mCameraView.getContext(), R.drawable.red_dot);
        mCameraTextureId = mFullScreenCamera.createTextureObject();
        mCameraEncoder.onSurfaceCreated(mCameraTextureId);
        mFrameCount = 0;
    }

    @Override
    public void onSurfaceChanged(GL10 unused, int width, int height) {
        Log.d(TAG, "onSurfaceChanged " + width + "x" + height);
    }

    @Override
    public void onDrawFrame(GL10 unused) {
        if (VERBOSE) {
            if (mFrameCount % 30 == 0) {
                Log.d(TAG, "onDrawFrame tex=" + mCameraTextureId);
                mCameraEncoder.logSavedEglState();
            }
        }
        if (mCurrentFilter != mNewFilter) {
            Filters.updateFilter(mFullScreenCamera, mNewFilter);
            mCurrentFilter = mNewFilter;
            mIncomingSizeUpdated = true;
        }
        if (mIncomingSizeUpdated) {
            mFullScreenCamera.getProgram().setTexSize(mIncomingWidth, mIncomingHeight);
            mFullScreenOverlay.getProgram().setTexSize(mIncomingWidth, mIncomingHeight);
            mIncomingSizeUpdated = false;
            Log.i(TAG, "setTexSize on display Texture");
        }
        // Draw the video frame.
        if (mCameraEncoder.isSurfaceTextureReadyForDisplay()) {
            mCameraEncoder.getSurfaceTextureForDisplay().updateTexImage();
            mCameraEncoder.getSurfaceTextureForDisplay().getTransformMatrix(mSTMatrix);
            // Drawing texture overlay:
            mFullScreenOverlay.drawFrame(mOverlayTextureId, mSTMatrix);
            mFullScreenCamera.drawFrame(mCameraTextureId, mSTMatrix);
        }
        mFrameCount++;
    }

    public void signalVertialVideo(FullFrameRect.SCREEN_ROTATION isVertical) {
        if (mFullScreenCamera != null) mFullScreenCamera.adjustForVerticalVideo(isVertical, false);
    }

    /**
     * Changes the filter that we're applying to the camera preview.
     */
    public void changeFilterMode(int filter) {
        mNewFilter = filter;
    }

    public void handleTouchEvent(MotionEvent ev) {
        mFullScreenCamera.handleTouchEvent(ev);
    }
}
This is the code for rendering the image on the screen (GLSurfaceView), but the overlay is not actually composited into the video. If I am not mistaken, that happens in CameraEncoder.
The thing is, replicating the code from CameraSurfaceRenderer into CameraEncoder (they both have similar code when it comes to filters) does not produce an overlaid text/image.
The texture object uses the GL_TEXTURE_EXTERNAL_OES texture target, which is defined by the GL_OES_EGL_image_external OpenGL ES extension. This limits how the texture may be used. Each time the texture is bound it must be bound to the GL_TEXTURE_EXTERNAL_OES target rather than the GL_TEXTURE_2D target. Additionally, any OpenGL ES 2.0 shader that samples from the texture must declare its use of this extension using, for example, an "#extension GL_OES_EGL_image_external : require" directive. Such shaders must also access the texture using the samplerExternalOES GLSL sampler type.
https://developer.android.com/reference/android/graphics/SurfaceTexture.html
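Concretely, the shader such a texture requires looks like the following (this mirrors the FRAGMENT_SHADER_EXT constant in grafika's Texture2dProgram):

// Fragment shader for a GL_TEXTURE_EXTERNAL_OES texture, as a Java string
// constant in the style of grafika's Texture2dProgram.FRAGMENT_SHADER_EXT.
private static final String FRAGMENT_SHADER_EXT =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "varying vec2 vTextureCoord;\n" +
        "uniform samplerExternalOES sTexture;\n" +
        "void main() {\n" +
        "    gl_FragColor = texture2D(sTexture, vTextureCoord);\n" +
        "}\n";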
Post your code that you used to do alpha blending and I can probably fix it.
I would probably override the Texture2dProgram and pass that to the FullFrameRect renderer. It has example code for rendering using the GL_TEXTURE_EXTERNAL_OES extension. Basically, override the draw function, call the base implementation, bind your watermark, and draw.
That should sit between the camera and the video encoder.
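A minimal sketch of that approach, assuming grafika's FullFrameRect/GlUtil helpers and the field names from the renderer above (drawFrameWithWatermark is a hypothetical helper, not from the original post):

// Sketch: composite the camera frame and a TEXTURE_2D watermark in one pass.
// mFullScreenCamera uses a TEXTURE_EXT program and mFullScreenOverlay a
// TEXTURE_2D program, as in the renderer above.
private void drawFrameWithWatermark() {
    // Camera first: external OES texture with the SurfaceTexture transform.
    mFullScreenCamera.drawFrame(mCameraTextureId, mSTMatrix);
    // Watermark second, alpha-blended on top. A plain TEXTURE_2D quad wants
    // the identity matrix, not the camera's ST matrix.
    GLES20.glEnable(GLES20.GL_BLEND);
    GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
    mFullScreenOverlay.drawFrame(mOverlayTextureId, GlUtil.IDENTITY_MATRIX);
    GLES20.glDisable(GLES20.GL_BLEND);
}

To get the watermark into the recording as well, the same two draw calls would have to happen on the encoder's EGL surface, between updateTexImage() and swapBuffers().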

SurfaceTexture object is not getting the frames from a Surface class

I have a Surface subclass which, when instantiated, automatically initializes a new thread and starts grabbing frames from a streaming source via native code based on FFmpeg. Here are the main parts of the code for the aforementioned class:
public class StreamingSurface extends Surface implements Runnable {

    ...

    public StreamingSurface(SurfaceTexture surfaceTexture, int width, int height) {
        super(surfaceTexture);
        screenWidth = width;
        screenHeight = height;
        init();
    }

    public void init() {
        mDrawTop = 0;
        mDrawLeft = 0;
        mVideoCurrentFrame = 0;
        this.setVideoFile();
        this.startPlay();
    }

    public void setVideoFile() {
        // Initialise FFMPEG
        naInit("");
        // Get stream video res
        int[] res = naGetVideoRes();
        mDisplayWidth = (int) (res[0]);
        mDisplayHeight = (int) (res[1]);
        // Prepare Display
        mBitmap = Bitmap.createBitmap(mDisplayWidth, mDisplayHeight, Bitmap.Config.ARGB_8888);
        naPrepareDisplay(mBitmap, mDisplayWidth, mDisplayHeight);
    }

    public void startPlay() {
        thread = new Thread(this);
        thread.start();
    }

    @Override
    public void run() {
        while (true) {
            while (2 == mStatus) {
                // pause
                SystemClock.sleep(100);
            }
            mVideoCurrentFrame = naGetVideoFrame();
            if (0 < mVideoCurrentFrame) {
                // success, redraw
                if (isValid()) {
                    Canvas canvas = lockCanvas(null);
                    if (null != mBitmap) {
                        canvas.drawBitmap(mBitmap, mDrawLeft, mDrawTop, prFramePaint);
                    }
                    unlockCanvasAndPost(canvas);
                }
            } else {
                // failure, probably end of video, break
                naFinish(mBitmap);
                mStatus = 0;
                break;
            }
        }
    }
}
In my MainActivity class, I instantiate this class in the following way:
public void startCamera(int texture) {
    mSurface = new SurfaceTexture(texture);
    mSurface.setOnFrameAvailableListener(this);
    Surface surface = new StreamingSurface(mSurface, 640, 360);
    surface.release();
}
I read the following line on the Android developer page, regarding the Surface class constructor:
"Images drawn to the Surface will be made available to the SurfaceTexture, which can attach them to an OpenGL ES texture via updateTexImage()."
That is exactly what I want to do, and I have everything ready for the subsequent rendering. But with the above code, the frames captured in the Surface class are never transferred to its corresponding SurfaceTexture. I know this because the debugger, for instance, never calls the onFrameAvailable listener method associated with that SurfaceTexture.
Any ideas? Maybe the fact that I am using a thread to call the drawing functions is messing everything up? In that case, what alternatives do I have to grab the frames?
Thanks in advance
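For reference, the consumer side of the contract quoted above is usually wired up like this (a sketch; glHandler and drawFrame() are hypothetical placeholders for a Handler bound to the GL thread and a render call):

// Sketch of the canonical SurfaceTexture consumer pattern. The key contract:
// updateTexImage() must be called on the thread that owns the EGL context the
// texture belongs to; onFrameAvailable may fire on an arbitrary thread.
mSurface = new SurfaceTexture(texture);   // texture: a GL_TEXTURE_EXTERNAL_OES texture id
mSurface.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        // Hand off to the GL thread; glHandler is a hypothetical Handler bound to it.
        glHandler.post(new Runnable() {
            @Override
            public void run() {
                mSurface.updateTexImage();           // latch the newest frame
                mSurface.getTransformMatrix(mSTMatrix);
                drawFrame();                         // hypothetical render call
            }
        });
    }
});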

Unable to blit from External Texture to EGLSurface in android

When I try to render a texture and its transformation matrix to the EGLSurface, nothing is displayed in the view.
As a follow-up to this issue, I have slightly modified the code by following grafika/fadden's ContinuousCapture sample code.
Here is my code:
Here is the draw method, which runs on the RenderThread.
This draw method is invoked properly whenever data is produced at the producer end from native code.
public void drawFrame() {
    mOffScreenSurface.makeCurrent();
    mCameraTexture.updateTexImage();
    mCameraTexture.getTransformMatrix(mTmpMatrix);

    mSurfaceWindowUser.makeCurrent();
    mFullFrameBlit.drawFrame(mTextureId, mTmpMatrix);
    mSurfaceWindowUser.swapBuffers();
}
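For comparison, the equivalent step in grafika's ContinuousCapture sample sets the viewport before blitting; here is a sketch adapted to the field names above (getWidth()/getHeight() come from grafika's EglSurfaceBase, not from the original post):

// For comparison: grafika's ContinuousCapture draws roughly like this.
// Note the explicit glViewport before the blit; without it, the viewport may
// still match the 640x480 offscreen surface rather than the window surface.
public void drawFrame() {
    mOffScreenSurface.makeCurrent();
    mCameraTexture.updateTexImage();
    mCameraTexture.getTransformMatrix(mTmpMatrix);

    mSurfaceWindowUser.makeCurrent();
    GLES20.glViewport(0, 0, mSurfaceWindowUser.getWidth(), mSurfaceWindowUser.getHeight());
    mFullFrameBlit.drawFrame(mTextureId, mTmpMatrix);
    mSurfaceWindowUser.swapBuffers();
}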
The run method of the RenderThread:
public void run() {
    Looper.prepare();
    mHandler = new RenderHandler(this);
    mEglCore = new EglCore(null, EglCore.FLAG_RECORDABLE);
    mOffScreenSurface = new OffscreenSurface(mEglCore, 640, 480);
    mOffScreenSurface.makeCurrent();
    mFullFrameBlit = new FullFrameRect(
            new Texture2dProgram(Texture2dProgram.ProgramType.TEXTURE_EXT));
    mTextureId = mFullFrameBlit.createTextureObject();
    mCameraTexture = new SurfaceTexture(mTextureId);
    // This surface I send to native code, where I use an ANativeWindow
    // reference and copy the data using the post method. {producer}
    mCameraSurface = new Surface(mCameraTexture);
    mCameraTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
        @Override
        public void onFrameAvailable(SurfaceTexture surfaceTexture) {
            Log.d(TAG, "Long breath.. data is pumped by the native-layer producer..");
            mHandler.frameReceivedFromProducer();
        }
    });
    // mSurfaceUser is a surface received from the MainActivity TextureView.
    mSurfaceWindowUser = new WindowSurface(mEglCore, mSurfaceUser, false);
}
To confirm that the producer on the native side is producing data: if I pass the user surface directly, without any EGL configuration, the frames are rendered to the screen.
At the native level:
geometryResult = ANativeWindow_setBuffersGeometry(userNativeWindow, 640, 480, WINDOW_FORMAT_RGBA_8888);
To render the frame I use ANativeWindow_lock and ANativeWindow_unlockAndPost() to draw the frame directly into the buffer.
I cannot figure out what could be wrong and where I have to dig further.
Thanks fadden for your help.

Is it possible to render an Android View to an OpenGL FBO or texture?

Is it possible to render a View (say, a WebView) to an FBO so it can be used as a texture in an OpenGL composition?
I put together a complete demo project which renders a view to GL textures in real time in an efficient way; it can be found in this repo. As an example, it shows how to render a WebView to a GL texture in real time.
Briefly, the code can look like the following (taken from the demo project in the repo above):
public class GLWebView extends WebView {

    private ViewToGLRenderer mViewToGLRenderer;

    ...

    // drawing magic
    @Override
    public void draw(Canvas canvas) {
        // returns canvas attached to gl texture to draw on
        Canvas glAttachedCanvas = mViewToGLRenderer.onDrawViewBegin();
        if (glAttachedCanvas != null) {
            // translate canvas to reflect view scrolling
            float xScale = glAttachedCanvas.getWidth() / (float) canvas.getWidth();
            glAttachedCanvas.scale(xScale, xScale);
            glAttachedCanvas.translate(-getScrollX(), -getScrollY());
            // draw the view to provided canvas
            super.draw(glAttachedCanvas);
        }
        // notify the canvas is updated
        mViewToGLRenderer.onDrawViewEnd();
    }

    ...
}
public class ViewToGLRenderer implements GLSurfaceView.Renderer {

    private SurfaceTexture mSurfaceTexture;
    private Surface mSurface;
    private int mGlSurfaceTexture;
    private Canvas mSurfaceCanvas;

    ...

    @Override
    public void onDrawFrame(GL10 gl) {
        synchronized (this) {
            // update texture
            mSurfaceTexture.updateTexImage();
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        releaseSurface();
        mGlSurfaceTexture = createTexture();
        if (mGlSurfaceTexture > 0) {
            // attach the texture to a surface.
            // It's the key class for rendering an android view at the gl level
            mSurfaceTexture = new SurfaceTexture(mGlSurfaceTexture);
            mSurfaceTexture.setDefaultBufferSize(mTextureWidth, mTextureHeight);
            mSurface = new Surface(mSurfaceTexture);
        }
    }

    public Canvas onDrawViewBegin() {
        mSurfaceCanvas = null;
        if (mSurface != null) {
            try {
                mSurfaceCanvas = mSurface.lockCanvas(null);
            } catch (Exception e) {
                Log.e(TAG, "error while rendering view to gl: " + e);
            }
        }
        return mSurfaceCanvas;
    }

    public void onDrawViewEnd() {
        if (mSurfaceCanvas != null) {
            mSurface.unlockCanvasAndPost(mSurfaceCanvas);
        }
        mSurfaceCanvas = null;
    }
}
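The snippet elides createTexture(); an implementation consistent with the code above would look roughly like this (a sketch, since the actual helper lives in the repo):

// Sketch of the createTexture() helper elided above: SurfaceTexture requires a
// GL_TEXTURE_EXTERNAL_OES texture, so it must be generated with that target.
private int createTexture() {
    int[] textures = new int[1];
    GLES20.glGenTextures(1, textures, 0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textures[0]);
    GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    return textures[0];
}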
The demo output screenshot:
Yes, it is certainly possible; I have written up a how-to here:
http://www.felixjones.co.uk/neo%20website/Android_View/
However, for static elements that won't change, the bitmap option may be better.
At least someone managed to render text this way:
Rendering Text in OpenGL on Android
It describes the method I used for rendering high-quality dynamic text efficiently using OpenGL ES 1.0, with TrueType/OpenType font files.
[...]
The whole process is actually quite easy. We generate the bitmap (as a texture), calculate and store the size of each character, as well as its location on the texture (UV coordinates). There are some other finer details, but we'll get to that.
OpenGL ES 2.0 Version: https://github.com/d3kod/Texample2

Slow face detection android

Hi, my face detection thread is working too slowly.
I call this thread from onPreviewFrame only if the thread is not already running, otherwise I just skip the call; after the thread detects a face, I call onDraw inside the view to draw a rectangle.
public void run() {
    FaceDetector faceDetector = new FaceDetector(bitmapImg.getWidth(), bitmapImg.getHeight(), 1);
    numOfFacesDetected = faceDetector.findFaces(bitmapImg, detectedFaces);
    if (numOfFacesDetected != 0) {
        detectedFaces.getMidPoint(eyesMidPoint);
        eyesDistance = detectedFaces.eyesDistance();
        handler.post(new Runnable() {
            public void run() {
                mPrev.invalidate();
                // turn off thread lock
            }
        });
        mPrev.setEyesDistance(eyesDistance);
        mPrev.setEyesMidPoint(eyesMidPoint);
    }
    isThreadWorking = false;
}
public void onPreviewFrame(byte[] yuv, Camera camera) {
    if (isThreadWorking)
        return;
    isThreadWorking = true;
    ByteBuffer bbuffer = ByteBuffer.wrap(yuv);
    bbuffer.get(grayBuff_, 0, bufflen_);
    detectThread = new FaceDetectThread(handler);
    detectThread.setBuffer(grayBuff_);
    detectThread.start();
}
My question is: is it taking too long because I am working with a bitmap and not gray scale? How can I improve the speed?
The FaceDetector API is not really made to process frames from a live preview. It's way too slow for that.
If you are running on a fairly new device, a better option is to use the Camera.FaceDetectionListener API, available since API level 14 (Android 4.0). It is very fast and can be used to create an overlay on a preview SurfaceHolder.
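A sketch of that API, assuming the (now deprecated) android.hardware.Camera is already opened as mCamera with its preview configured:

// Sketch: hardware face detection via android.hardware.Camera (API level 14+).
mCamera.setFaceDetectionListener(new Camera.FaceDetectionListener() {
    @Override
    public void onFaceDetection(Camera.Face[] faces, Camera camera) {
        if (faces.length > 0) {
            // Face.rect uses the normalized (-1000..1000) camera coordinate
            // space; map it to view coordinates before drawing an overlay.
            Rect rect = faces[0].rect;
            Log.d(TAG, "face at " + rect + ", score " + faces[0].score);
        }
    }
});
mCamera.startPreview();
if (mCamera.getParameters().getMaxNumDetectedFaces() > 0) {
    mCamera.startFaceDetection();   // must be called after startPreview()
}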
