I am building an Android application where an ExoPlayer plays a video onto the surface of a SurfaceView, and I am investigating whether it is possible to dynamically blur the playing video.
Blurring techniques that involve first generating a bitmap of the view to blur will not work, since the surface part of a SurfaceView does not appear in bitmaps.
Surfaces and views used to have built-in blurring effects in older versions of Android (e.g. Surface.FX_SURFACE_BLUR), but these seem to have been removed from newer APIs.
Can anyone share some insight on how a surface can be dynamically blurred? Thank you.
There are lots of questions on StackOverflow with small bits and pieces of what needs to be done. I'll go over the method I used and hopefully it will be useful to somebody.
If this were a static blur of a single video frame, it would be sufficient to play the video in a TextureView, grab a frame with .getBitmap(), and blur the resulting Bitmap using a tool such as RenderScript. However, .getBitmap() runs on the main UI thread, and hence lags behind the video whose frames it is trying to copy.
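For reference, a minimal sketch of that static approach (names here are illustrative; assumes API 17+ for ScriptIntrinsicBlur):

// Static blur of one frame (names are illustrative; uses android.renderscript.*)
Bitmap frame = mTextureView.getBitmap();              // copies the current video frame (UI thread!)
Bitmap blurred = frame.copy(frame.getConfig(), true);
RenderScript rs = RenderScript.create(context);
ScriptIntrinsicBlur blur = ScriptIntrinsicBlur.create(rs, Element.U8_4(rs));
Allocation input = Allocation.createFromBitmap(rs, frame);
Allocation output = Allocation.createFromBitmap(rs, blurred);
blur.setRadius(25f);                                  // maximum supported blur radius
blur.setInput(input);
blur.forEach(output);
output.copyTo(blurred);
rs.destroy();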
To perform a blur for every frame, the best approach seems to be to use a GLSurfaceView with a custom renderer. I used the code available in VidEffects pointed to from this answer as a good starting point.
Blurs with large radii can be very computationally intensive. That is why I first tried performing the blur with two separate fragment shaders (one to blur horizontally, and one to blur the result vertically). I actually ended up using only one fragment shader applying a 7x7 Gaussian kernel. A very important thing to keep in mind, if your GLSurfaceView is large, is to call setFixedSize() on the GLSurfaceView's SurfaceHolder to make its resolution lower than that of the screen. The result does not look very pixelated, since it is blurred anyway, but the performance gain is very significant.
The blur I made managed to play at 24 fps on most devices, with setFixedSize() specifying a resolution of 100x70.
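For reference, a minimal sketch of that call (assuming a GLSurfaceView field named mGlSurfaceView; 100x70 are the figures mentioned above):

// Render at a low fixed resolution; the compositor scales the result up to the view bounds
mGlSurfaceView.getHolder().setFixedSize(100, 70);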
In case anyone wants a single-pass fragment shader to complete the circle... The following code implements the ShaderInterface defined in the code available from VidEffects. I adapted it from this example on ShaderToy.com.
public class BlurEffect2 implements ShaderInterface {

    private final int mMaskSize;
    private final int mWidth;
    private final int mHeight;

    public BlurEffect2(int maskSize, int width, int height) {
        mMaskSize = maskSize;
        mWidth = width;
        mHeight = height;
    }

    @Override
    public String getShader(GLSurfaceView mGlSurfaceView) {
        float hStep = 1.0f / mWidth;
        float vStep = 1.0f / mHeight;
        return "#extension GL_OES_EGL_image_external : require\n" +
                "precision mediump float;\n" +
                // "in" attributes from our vertex shader
                "varying vec2 vTextureCoord;\n" +
                // declare uniforms
                "uniform samplerExternalOES sTexture;\n" +
                "float normpdf(in float x, in float sigma) {\n" +
                "    return 0.39894 * exp(-0.5 * x * x / (sigma * sigma)) / sigma;\n" +
                "}\n" +
                "void main() {\n" +
                "    vec3 c = texture2D(sTexture, vTextureCoord).rgb;\n" +
                // declare stuff
                "    const int mSize = " + mMaskSize + ";\n" +
                "    const int kSize = (mSize - 1) / 2;\n" +
                "    float kernel[mSize];\n" +
                "    vec3 final_colour = vec3(0.0);\n" +
                // create the 1-D kernel
                "    float sigma = 7.0;\n" +
                "    float Z = 0.0;\n" +
                "    for (int j = 0; j <= kSize; ++j) {\n" +
                "        kernel[kSize + j] = kernel[kSize - j] = normpdf(float(j), sigma);\n" +
                "    }\n" +
                // get the normalization factor (as the gaussian has been clamped)
                "    for (int j = 0; j < mSize; ++j) {\n" +
                "        Z += kernel[j];\n" +
                "    }\n" +
                // read out the texels
                "    for (int i = -kSize; i <= kSize; ++i) {\n" +
                "        for (int j = -kSize; j <= kSize; ++j) {\n" +
                "            final_colour += kernel[kSize + j] * kernel[kSize + i] * texture2D(sTexture, (vTextureCoord.xy + vec2(float(i) * " + hStep + ", float(j) * " + vStep + "))).rgb;\n" +
                "        }\n" +
                "    }\n" +
                "    gl_FragColor = vec4(final_colour / (Z * Z), 1.0);\n" +
                "}";
    }
}
Just as Michael has pointed out above, you can increase the performance by setting the size of the SurfaceView using setFixedSize.
@BindView(R.id.video_snap)
VideoSurfaceView mVideoView;

@Override
public void showVideo(String cachedPath) {
    mImageView.setVisibility(View.GONE);
    mVideoView.setVisibility(View.VISIBLE);

    // Get the width and height of the video
    final MediaMetadataRetriever mRetriever = new MediaMetadataRetriever();
    mRetriever.setDataSource(cachedPath);
    int width = Integer.parseInt(mRetriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH));
    int height = Integer.parseInt(mRetriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT));

    // Divide the width and height by 10
    width /= 10;
    height /= 10;

    // Set the size of the surface to play on to 1/10 the width and height
    mVideoView.getHolder().setFixedSize(width, height);

    // Set up the media player
    mMediaPlayer = new MediaPlayer();
    mMediaPlayer.setLooping(true);
    try {
        mMediaPlayer.setDataSource(cachedPath);
    } catch (Exception e) {
        Timber.e(e, e.getMessage());
    }

    // Init and start the video player with the mask size set to 17
    mVideoView.init(mMediaPlayer, new BlurEffect2(17, width, height));
    mVideoView.onResume();
}
Related
My code is:
vec4 textureColor = texture2D(uTextureSampler, vTextureCoord);
if (textureColor.r * 0.299 + textureColor.g * 0.587 + textureColor.b * 0.114 < 0.1) {
    gl_FragColor = vec4(0.0, 0.0, 0.0, 0.0);
} else {
    gl_FragColor = vec4(textureColor.r, textureColor.g, textureColor.b, textureColor.w);
}
My problem is how to judge whether a pixel is black. How can I do that? Should I convert RGB to HSV?
return "precision mediump float; \n"+
" varying highp vec2 " + VARYING_TEXTURE_COORD + ";\n" +
" \n" +
" uniform sampler2D " + TEXTURE_SAMPLER_UNIFORM + ";\n" +
" \n" +
" void main()\n" +
" {\n" +
" vec3 keying_color = vec3(0.0, 0.0, 0.0);\n" +
" float thresh = 0.45; // [0, 1.732]\n" +
" float slope = 0.1; // [0, 1]\n" +
" vec3 input_color = texture2D(" + TEXTURE_SAMPLER_UNIFORM + ", " + VARYING_TEXTURE_COORD + ").rgb;\n" +
" float d = abs(length(abs(keying_color.rgb - input_color.rgb)));\n" +
" float edge0 = thresh * (1.0 - slope);\n" +
" float alpha = smoothstep(edge0, thresh, d);\n" +
" gl_FragColor = vec4(input_color,alpha);\n" +
" }";
The keying_color variable stores the color we want to replace (in the code above it is set to black, vec3(0.0, 0.0, 0.0), to match the question). It uses the classic RGB model, but intensity is expressed not as a 0-255 integer but as a float in the range 0-1. (So 0 = 0.0, 255 = 1.0, 122 ≈ 0.478…) In the original green-screen example, the green color has the value (0.647, 0.941, 0.29), but if you are using a different video, measure the color yourself.
Note: Make sure you have the right color. Some color measurement software automatically converts colors to slightly different formats, such as AdobeRGB.
So where’s the magic?
We load the current pixel's color into input_color, then calculate the difference between the input and keying colors. Based on this difference, an alpha value is calculated and used for that pixel.
You can control how strict the comparison is using the slope and thresh values. It is a bit more complicated than that, but the most basic rule is: the higher the threshold, the greater the tolerance.
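For example, with thresh = 0.45 and slope = 0.1, edge0 = 0.45 * (1.0 - 0.1) = 0.405. A pixel whose color distance d from keying_color is 0.42 then gets alpha = smoothstep(0.405, 0.45, 0.42) ≈ 0.26, so it is rendered mostly transparent, while any pixel with d above 0.45 stays fully opaque.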
So, we are done, right? Nope.
You can look at this link: http://blog.csdn.net/u012847940/article/details/47441923
I want to create a fisheye effect on Android using OpenGL ES 2.0. I can do it without OpenGL, but that is not what I want, because it is inefficient and does not support video textures. I also tried the fisheye effect in the Android Media Effects API, but the result does not look good.
I also found a fisheye shader, as follows:
private static final String FISHEYE_FRAGMENT_SHADER =
"precision mediump float;\n" +
"uniform sampler2D u_Texture;\n" +
"uniform vec2 vScale;\n" +
"const float alpha = float(4.0 * 2.0 + 0.75);\n" +
"varying vec2 v_TexCoordinate;\n" +
"void main() {\n" +
" float bound2 = 0.25 * (vScale.x * vScale.x + vScale.y * vScale.y);\n" +
" float bound = sqrt(bound2);\n" +
" float radius = 1.15 * bound;\n" +
" float radius2 = radius * radius;\n" +
" float max_radian = 0.5 * 3.14159265 - atan(alpha / bound * sqrt(radius2 - bound2));\n" +
" float factor = bound / max_radian;\n" +
" float m_pi_2 = 1.5707963;\n" +
" vec2 coord = v_TexCoordinate - vec2(0.5, 0.5);\n" +
" float dist = length(coord * vScale);\n" +
" float radian = m_pi_2 - atan(alpha * sqrt(radius2 - dist * dist), dist);\n" +
" float scalar = radian * factor / dist;\n" +
" vec2 new_coord = coord * scalar + vec2(0.5, 0.5);\n" +
" gl_FragColor = texture2D(u_Texture, new_coord);\n" +
"}\n";
This is what I want, but I don't know how to use it. Can someone give me a clue?
Android OpenGL ES does (normally) support video textures. It's not strictly part of the OpenGL ES API, but you can normally import video surfaces as EGL external images via Android's SurfaceTexture.
There are lots of similar questions on the web, but this SO question should provide a useful starting point:
Android. How play video on Surface(OpenGL)
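The core of that approach is a GL_TEXTURE_EXTERNAL_OES texture wrapped in a SurfaceTexture. A minimal sketch (inside a GLSurfaceView.Renderer, error checking omitted; mediaPlayer is assumed to exist):

// Create an external OES texture and hand its SurfaceTexture to the player
// (uses android.graphics.SurfaceTexture, android.view.Surface, android.opengl.GLES11Ext)
int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
Surface surface = new Surface(surfaceTexture);
mediaPlayer.setSurface(surface);
surface.release();

// Then, in onDrawFrame(), latch the newest video frame before drawing:
surfaceTexture.updateTexImage();

The fragment shader then samples the texture through a samplerExternalOES uniform (with "#extension GL_OES_EGL_image_external : require"), as in the blur shader earlier on this page.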
I am learning about OpenGL ES, in particular compute shaders in OpenGL ES 3.1, on Android 5.0.1.
I have 3 shaders defined (compute, vertex, and fragment) and attached to two different programs, one for the compute shader and one for the vertex and fragment shaders.
I have no problem when I only use the vertex and fragment shaders but now that I added the compute shader I get the following error:
02-11 20:02:10.375 13243-13264/com.example.daan.daggl2 I/VERSION﹕ OpenGL ES 3.1 NVIDIA 349.00
02-11 20:02:10.472 13243-13264/com.example.daan.daggl2 A/libc﹕ Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 13264 (GLThread 9847)
I'm trying to figure out what I'm missing but I'm new to this topic and I'm not sure where to begin looking for the problem.
The object's constructor is:
public Triangle() {
    buffers = new int[1];
    GLES31.glGenBuffers(1, buffers, 0);
    gVBO = buffers[0];

    // set up the vertex and fragment shaders
    int vertexShader = MyGLRenderer.loadShader(GLES31.GL_VERTEX_SHADER, vertexShaderCode);
    int fragmentShader = MyGLRenderer.loadShader(GLES31.GL_FRAGMENT_SHADER, fragmentShaderCode);
    program = GLES31.glCreateProgram();
    GLES31.glAttachShader(program, vertexShader);
    GLES31.glAttachShader(program, fragmentShader);
    GLES31.glLinkProgram(program);
    checkGLError("LinkProgram/program");

    // set up the compute shader
    int computeShader = MyGLRenderer.loadShader(GLES31.GL_COMPUTE_SHADER, computeShaderCode);
    computeProgram = GLES31.glCreateProgram();
    GLES31.glAttachShader(computeProgram, computeShader);
    GLES31.glLinkProgram(computeProgram);
    checkGLError("LinkProgram/computeProgram");
}
The draw function:
public void draw(float[] mvpMatrix, float[] color) {
    GLES31.glUseProgram(computeProgram);
    checkGLError("UseProgram/computeProgram");

    int radiusId = GLES31.glGetUniformLocation(computeProgram, "radius");
    indexBufferBinding = 0;
    GLES31.glUniform1f(radiusId, (float) radius);
    GLES31.glBindBufferBase(GLES31.GL_SHADER_STORAGE_BUFFER, indexBufferBinding, gVBO);
    checkGLError("glBindBuffer/gVBO");
    GLES31.glDispatchCompute(2, 2, 1);
    GLES31.glBindBufferBase(GLES31.GL_SHADER_STORAGE_BUFFER, indexBufferBinding, 0);

    // See note 1 below
    //GLES31.glMemoryBarrier(GLES31.GL_VERTEX_ATTRIB_ARRAY_BARRIER_BIT);
    GLES31.glMemoryBarrier(GLES31.GL_SHADER_STORAGE_BARRIER_BIT);
    checkGLError("glMemoryBarrier/1");

    GLES31.glBindBuffer(GLES31.GL_ARRAY_BUFFER, gVBO);
    checkGLError("glBindBuffer/gVBO");

    GLES31.glUseProgram(program);
    int posId = GLES31.glGetAttribLocation(program, "a_v4Position");
    int fillId = GLES31.glGetAttribLocation(program, "a_v4FillColor");
    int mvpMatrixId = GLES31.glGetUniformLocation(program, "mvp_matrix");
    GLES31.glEnableVertexAttribArray(posId);
    GLES31.glEnableVertexAttribArray(fillId);
    GLES31.glUniformMatrix4fv(mvpMatrixId, 1, false, mvpMatrix, 0);

    // See note 2 below
    GLES31.glDrawArrays(GLES31.GL_POINTS, 0, 3);
}
For reference, the shaders are at the end of the post.
Notes:
1. I can't find GL_VERTEX_ATTRIB_ARRAY_BARRIER_BIT in the Android GLES31 bindings. Shouldn't it be there? The OpenGL ES documentation mentions it.
2. The draw call crashes the application; up until glDrawArrays() everything seems fine.
Any ideas on what to look for are appreciated.
Vertex Shader:
private final String vertexShaderCode =
"uniform mat4 mvp_matrix;" +
"attribute vec4 a_v4Position;" +
"attribute vec4 a_v4FillColor;" +
"varying vec4 v_v4FillColor;" +
"void main(void) {" +
" v_v4FillColor = a_v4FillColor;" +
" gl_Position = mvp_matrix * a_v4Position;" +
"}";
Fragment Shader:
private final String fragmentShaderCode =
"precision mediump float;" +
"varying vec4 v_v4FillColor;" +
"void main(void) {" +
" gl_FragColor = v_v4FillColor;" +
"}";
Compute Shader:
private final String computeShaderCode =
        "#version 310 es\n" +
        "uniform float radius;" +
        // note: despite the name, Vector3f holds 4 floats (matches std140 vec4 alignment)
        "struct Vector3f { float x; float y; float z; float w; };" +
        "struct AttribData { Vector3f v; Vector3f c; };" +
        "layout(std140, binding = 0) buffer destBuffer { AttribData data[]; } outBuffer;" +
        "layout (local_size_x = 8, local_size_y = 8, local_size_z = 1) in;" +
        "void main() {" +
        "    ivec2 storePos = ivec2(gl_GlobalInvocationID.xy);" +
        "    uint gWidth = gl_WorkGroupSize.x * gl_NumWorkGroups.x;" +
        "    uint gHeight = gl_WorkGroupSize.y * gl_NumWorkGroups.y;" +
        "    uint gSize = uint(gWidth) * uint(gHeight);" +
        "    uint offset = uint(storePos.y) * gWidth + uint(storePos.x);" +
        "    float alpha = 2.0 * 3.14159265359 * (float(offset) / float(gSize));" +
        "    outBuffer.data[offset].v.x = float(sin(alpha)) * float(radius);" +
        "    outBuffer.data[offset].v.y = float(cos(alpha)) * float(radius);" +
        "    outBuffer.data[offset].v.z = 0.0;" +
        "    outBuffer.data[offset].v.w = 1.0;" +
        "    outBuffer.data[offset].c.x = float(storePos.x) / float(gWidth);" +
        "    outBuffer.data[offset].c.y = 0.0;" +
        "    outBuffer.data[offset].c.z = 1.0;" +
        "    outBuffer.data[offset].c.w = 1.0;" +
        "}";
Yes, GL_VERTEX_ATTRIB_ARRAY_BARRIER_BIT is missing in the Android Java bindings. There's a tradition of the Java OpenGL bindings in Android being incomplete. There are still things missing from the 3.0 bindings as well.
In this case, since it's just an enum value, you can easily work around it by defining the value itself. The C/C++ definition is:
#define GL_VERTEX_ATTRIB_ARRAY_BARRIER_BIT 0x00000001
So in Java, you can add a definition like this to your code:
static final int GL_VERTEX_ATTRIB_ARRAY_BARRIER_BIT = 0x00000001;
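With that defined, the commented-out barrier call from the question becomes usable:

GLES31.glMemoryBarrier(GL_VERTEX_ATTRIB_ARRAY_BARRIER_BIT);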
As for the crash, I'm not totally sure. I notice that there are no glVertexAttribPointer() calls in the posted code. If they are indeed missing from your code, that would certainly be a problem.
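In case it helps, here is a hedged sketch of what those calls could look like for the std140 layout in your compute shader (two vec4-sized structs per vertex, so a 32-byte stride; the offsets are assumptions based on that struct):

// After glEnableVertexAttribArray(), describe how the SSBO-filled VBO maps to the attributes
int stride = 8 * 4; // 8 floats per vertex: position (4 floats) + color (4 floats)
GLES31.glVertexAttribPointer(posId, 4, GLES31.GL_FLOAT, false, stride, 0);
GLES31.glVertexAttribPointer(fillId, 4, GLES31.GL_FLOAT, false, stride, 4 * 4);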
I'd like someone to advise me on a way to find the coordinates of a point on a sprite in libgdx.
As you can see from the image, I have set the sprite's origin at the red dot, and I cannot change it.
I would like to leave the red dot as the origin and find the coordinates of the green dot on the same sprite.
Thanks for your help.
EDIT
@Override
public void create() {
    img = new Texture("rocket.png");
    font = new BitmapFont();
    font.setColor(Color.BLUE);
    MyInputProcessor inputProcessor = new MyInputProcessor();
    Gdx.input.setInputProcessor(inputProcessor);
    float w = Gdx.graphics.getWidth();
    float h = Gdx.graphics.getHeight();
    sprite = new Sprite(img);
    spacesprite = new Sprite(new Texture(Gdx.files.internal("space.jpg")));
    spacesprite.setPosition(0, 0);
    spacesprite.setSize(Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
    point = new Sprite(new Texture(Gdx.files.internal("point.png")));
    batch = new SpriteBatch();
}

@Override
public void render() {
    sprite.setPosition(Gdx.graphics.getWidth() / 2 - sprite.getWidth() / 2, Gdx.graphics.getHeight() / 2 - sprite.getHeight() / 2);
    sprite.setOrigin(sprite.getWidth() / 2, sprite.getHeight() / 2);
    point.setPosition(sprite.getX() + sprite.getWidth() / 2 - point.getWidth() / 2, sprite.getY() + sprite.getHeight() / 2);
    point.setOrigin(point.getWidth() / 2, 0);
    if (Gdx.input.isButtonPressed(Input.Buttons.LEFT)) {
        //sprite.setPosition(Gdx.input.getX() - sprite.getWidth()/2, Gdx.graphics.getHeight() - Gdx.input.getY() - sprite.getHeight()/2);
        if (Gdx.input.getX() < Gdx.graphics.getWidth() / 2) {
            //System.out.println("x: " + Gdx.input.getX() + " - y: " + Gdx.input.getY());
            sprite.setRotation(rotation++);
            point.setRotation(rotation++);
            System.out.println("Sprite: X" + sprite.getX() + " - Y:" + sprite.getY());
        } else {
            //System.out.println("x: " + Gdx.input.getX() + " - y: " + Gdx.input.getY());
            sprite.setRotation(rotation--);
            point.setRotation(rotation--);
            System.out.println("Sprite: X" + sprite.getX() + " - Y:" + sprite.getY());
        }
    }
    batch.begin();
    spacesprite.draw(batch);
    sprite.draw(batch);
    point.draw(batch);
    batch.end();
}
Can someone adapt the code? When I rotate, I'd like to get the point's position, but I am unsure about my implementation.
If you're looking for the middle point of the rectangular sprite, try something like this:
float x = obj.getOriginX() + obj.getHeight();
float y = obj.getOriginY() + obj.getWidth() / 2;
Use either trigonometry, matrices, or quaternions to calculate the rotated coordinates.
For a few simple calculations I'd use plain trigonometry, something like:
float angle_rad = sprite.getRotation() / 180.0f * (float) Math.PI;
float rotated_x = (float) (Math.cos(angle_rad) * x - Math.sin(angle_rad) * y);
float rotated_y = (float) (Math.sin(angle_rad) * x + Math.cos(angle_rad) * y);
If you have to do this more often, look into matrices or quaternions; matrices are a little easier though: http://en.wikipedia.org/wiki/Rotation_matrix
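Applied to the sprite in the question, a minimal sketch (assuming (px, py) is the green dot's unrotated world position; MathUtils is libgdx's com.badlogic.gdx.math.MathUtils):

// Rotate the point (px, py) around the sprite's origin by the sprite's current rotation
float cx = sprite.getX() + sprite.getOriginX();
float cy = sprite.getY() + sprite.getOriginY();
float rad = sprite.getRotation() * MathUtils.degreesToRadians;
float dx = px - cx;
float dy = py - cy;
float rotatedX = cx + dx * MathUtils.cos(rad) - dy * MathUtils.sin(rad);
float rotatedY = cy + dx * MathUtils.sin(rad) + dy * MathUtils.cos(rad);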
I am trying to achieve a fisheye effect on a Bitmap image in Android. Is there an existing library or algorithm that can help?
I recommend using the Android Media Effects API. If you want more control over the effect (or need to target older Android versions), you can also use OpenGL directly to apply a fisheye effect to your photo. A tutorial on the subject: http://www.learnopengles.com/android-lesson-four-introducing-basic-texturing/ . Learning OpenGL will let you apply all kinds of effects to your photos; shader code can easily be found on the internet (e.g. https://github.com/BradLarson/GPUImage/tree/master/framework/Source).
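For the Media Effects route, a minimal sketch (assuming a current GL context and input/output texture ids already set up; the names are illustrative):

// Apply the built-in fisheye effect (API 14+) between two GL textures
// (uses android.media.effect.EffectContext, Effect, EffectFactory)
EffectContext effectContext = EffectContext.createWithCurrentGlContext();
Effect fisheye = effectContext.getFactory().createEffect(EffectFactory.EFFECT_FISHEYE);
fisheye.setParameter("scale", 0.5f); // distortion strength in [0, 1]
fisheye.apply(inputTexId, width, height, outputTexId);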
Here is shader code for a fisheye effect:
private static final String FISHEYE_FRAGMENT_SHADER =
"precision mediump float;\n" +
"uniform sampler2D u_Texture;\n" +
"uniform vec2 vScale;\n" +
"const float alpha = float(4.0 * 2.0 + 0.75);\n" +
"varying vec2 v_TexCoordinate;\n" +
"void main() {\n" +
" float bound2 = 0.25 * (vScale.x * vScale.x + vScale.y * vScale.y);\n" +
" float bound = sqrt(bound2);\n" +
" float radius = 1.15 * bound;\n" +
" float radius2 = radius * radius;\n" +
" float max_radian = 0.5 * 3.14159265 - atan(alpha / bound * sqrt(radius2 - bound2));\n" +
" float factor = bound / max_radian;\n" +
" float m_pi_2 = 1.5707963;\n" +
" vec2 coord = v_TexCoordinate - vec2(0.5, 0.5);\n" +
" float dist = length(coord * vScale);\n" +
" float radian = m_pi_2 - atan(alpha * sqrt(radius2 - dist * dist), dist);\n" +
" float scalar = radian * factor / dist;\n" +
" vec2 new_coord = coord * scalar + vec2(0.5, 0.5);\n" +
" gl_FragColor = texture2D(u_Texture, new_coord);\n" +
"}\n";
Have a look at OpenCV for Android:
http://opencv.org/platforms/android.html
And this answer:
How to simulate fisheye lens effect by openCV?
Perhaps a simpler solution would be to use the Android Media Effects API. However, it's only available on API 14 and above.