In Android, how can I draw a rotating/flipping coin, facing the user, on a SurfaceView? It is easy to show a flipping coin in an ImageView with an animation, but I need the same smoothness on a SurfaceView. Is there a way to run the flip animation in an ImageView, grab its content at each instant, and draw that onto the SurfaceView continuously?
You can use the Matrix class to perform these operations when drawing a bitmap, something like:
Paint paint = new Paint();
paint.setAntiAlias(true);

// Rotate the needle around its pivot point, then translate it into place.
Matrix needleTransMatrix = new Matrix();
needleTransMatrix.postRotate(rotationAngle, needleImage.getWidth() / 2, needleImage.getHeight());
needleTransMatrix.postTranslate(centerX + METER_OFFSET - needleImage.getWidth() / 2, centerY - needleImage.getHeight());

// Rotate the dial around its center.
Matrix dialTransMatrix = new Matrix();
dialTransMatrix.postRotate(rotationAngle, dialImage.getWidth() / 2, dialImage.getHeight() / 2);
dialTransMatrix.postTranslate(METER_OFFSET, 0);

canvas.drawBitmap(bgImage, METER_OFFSET, 0, paint);
canvas.drawBitmap(dialImage, dialTransMatrix, paint);
canvas.drawBitmap(needleImage, needleTransMatrix, paint);
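The matrices above rotate in the screen plane; for the coin flip the question asks about, a common 2D trick is to fake a rotation about the vertical axis by scaling the bitmap's width with the cosine of the flip angle. The sketch below isolates that math in a plain helper (`flipScaleX` and `showHeads` are hypothetical names, not Android APIs), so the same values can feed `Matrix.postScale(sx, 1f, cx, cy)` each frame:

```java
// Sketch: fake a Y-axis coin flip on a 2D Canvas by shrinking the
// bitmap's width with cos(angle). Pure math, so it can drive
// Matrix.postScale(sx, 1f, cx, cy) on any drawing surface.
public class CoinFlip {
    // Horizontal scale factor for the current flip angle (degrees).
    public static float flipScaleX(float angleDegrees) {
        return (float) Math.abs(Math.cos(Math.toRadians(angleDegrees)));
    }

    // The heads face is visible while cos(angle) >= 0; otherwise
    // the coin has turned past edge-on and tails should be drawn.
    public static boolean showHeads(float angleDegrees) {
        return Math.cos(Math.toRadians(angleDegrees)) >= 0;
    }
}
```

Each frame you would advance the angle, pick the face with `showHeads`, apply `flipScaleX` to the matrix, and draw the bitmap; at 90° and 270° the coin collapses to a line, which reads as edge-on.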
Below is the drawing thread I am talking about:
private class AnimationThread extends Thread {
    private boolean isRunning = true;
    private int destinationNeedlePosition = NEEDLE_POSITION_ENQURIES;

    public AnimationThread(int destinationNeedlePosition) {
        this.destinationNeedlePosition = destinationNeedlePosition;
    }

    public void cancelAnimation() {
        isRunning = false;
    }

    @Override
    public void run() {
        int destinationAngle = destinationNeedlePosition * 45;
        while (isRunning && destinationAngle != rotationAngle) {
            if (destinationAngle > rotationAngle)
                rotationAngle++;
            else
                rotationAngle--;
            postInvalidate();
            try {
                sleep(10);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}
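The thread above moves one degree per ~10 ms `sleep()`, so the animation speed depends on thread scheduling accuracy. A time-based interpolation keeps the perceived speed constant no matter how often the loop wakes up; here is a minimal sketch with illustrative names:

```java
// Sketch: compute the needle angle from elapsed time instead of
// stepping one degree per loop iteration, so animation speed does
// not depend on sleep() precision.
public class AngleInterpolator {
    private final float startAngle;
    private final float endAngle;
    private final long durationMs;

    public AngleInterpolator(float startAngle, float endAngle, long durationMs) {
        this.startAngle = startAngle;
        this.endAngle = endAngle;
        this.durationMs = durationMs;
    }

    // Linear interpolation, clamped at the destination angle.
    public float angleAt(long elapsedMs) {
        float t = Math.min(1f, elapsedMs / (float) durationMs);
        return startAngle + (endAngle - startAngle) * t;
    }
}
```

In the run loop you would record the start time once, then set `rotationAngle = interpolator.angleAt(now - start)` before each `postInvalidate()`.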
My situation is rather simple. I have a mobile app and a Wear app. The mobile app is a "drawing" app: it connects to a piece of hardware that has a magnetic pencil. Drawing on the hardware sends three events to my app: touchStart, touchMove, and touchStop. I can draw the path on my mobile app at a reasonable rate. But if I forward these three touch events to my watch via SendMessage, the image is drawn slowly. I'm trying to speed it up so the drawing happens in real time, like on the mobile device.
Here's the code to the Paint class of my watch app:
class SimpleDrawingView extends View {
// setup initial color
private final int paintColor = Color.WHITE;
// defines paint and canvas
private Paint drawPaint;
// Store circles to draw each time the user touches down
public int xwidth, xheight;
private Path bPath = new Path();
private Path path = new Path();
public SimpleDrawingView(Context context, AttributeSet attrs) {
super(context, attrs);
setFocusable(true);
setFocusableInTouchMode(true);
setupPaint();
}
public Path redraw() {
Matrix m = new Matrix();
RectF innerBounds = new RectF();
bPath = new Path(path);
bPath.computeBounds(innerBounds, true);
int width = getWidth();
int height = getHeight();
RectF intoBounds = new RectF(0, 0, width, height);
m.setRectToRect(innerBounds, intoBounds, Matrix.ScaleToFit.CENTER);
bPath.transform(m);
return bPath;
}
// Draw the current path onto the view
@Override
protected void onDraw(Canvas canvas) {
Log.d("MEx", " drawing ");
redraw();
canvas.drawPath(path, drawPaint);
}
public void actionDown(Point a)
{
path.moveTo(a.x, a.y);
invalidate();
}
public void clear()
{
path.reset();
invalidate();
}
public void actionEnd(Point a)
{
// path.lineTo(a.x, a.y);
}
public void actionMove(Point a)
{
path.lineTo(a.x, a.y);
invalidate();
}
private void setupPaint() {
// Setup paint with color and stroke styles
drawPaint = new Paint();
drawPaint.setColor(paintColor);
drawPaint.setAntiAlias(true);
drawPaint.setStrokeWidth(23);
drawPaint.setStyle(Paint.Style.STROKE);
drawPaint.setStrokeJoin(Paint.Join.ROUND);
drawPaint.setStrokeCap(Paint.Cap.ROUND);
}
}
Here's my message receiver:
public class Receiver extends BroadcastReceiver {
@Override
public void onReceive(Context context, Intent intent) {
for(String s : intent.getExtras().keySet()) {
String first = intent.getExtras().get(s).toString();
String label, x, y;
if (first.contains("charge+")) {
textLabel.setText(first.split("7")[1]);
} else if (first.contains("battery+")) {
String color = first.substring(8);
Log.d("TESTTWO", color + " <- ");
switch(color)
{
case "RED":
dotView.setImageResource(R.drawable.red);
textLabel.setText("");
break;
case "GREEN":
dotView.setImageResource(R.drawable.green);
textLabel.setTextColor(Color.GREEN);
break;
case "YELLOW":
textLabel.setTextColor(Color.YELLOW);
dotView.setImageResource(R.drawable.yellow);
break;
}
} else {
label = first.substring(0, 1);
String[] toSplit;
toSplit = first.substring(2).split(",");
Point point = new Point(Integer.parseInt(toSplit[0]), Integer.parseInt(toSplit[1]));
switch (label) {
case "S":
imageView.actionDown(point);
break;
case "M":
imageView.actionMove(point);
break;
case "E":
imageView.actionEnd(point);
break;
}
}
}
}
}
Points are sent to my watch in one of the following formats:
S-x,y
M-x,y
E-x,y
S = start, M = move, E = end.
If someone can help me optimize this functionality, it would be amazing!
I tried sending the data as an Asset as well; that's even slower.
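One likely cause of the slowness is sending a separate message per touch point, so each point pays a full round trip. A sketch of batching several move points into one message, assuming the existing S/M/E string protocol (the semicolon-separated `M-x1,y1;x2,y2` format below is an illustrative extension, not an existing API):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: batch several move points into one message instead of one
// SendMessage call per point. The watch decodes the batch and replays
// the points through actionMove() in order.
public class PointBatch {
    // Encode a list of {x, y} points as "M-x1,y1;x2,y2;...".
    public static String encode(List<int[]> points) {
        StringBuilder sb = new StringBuilder("M-");
        for (int i = 0; i < points.size(); i++) {
            if (i > 0) sb.append(';');
            sb.append(points.get(i)[0]).append(',').append(points.get(i)[1]);
        }
        return sb.toString();
    }

    // Decode the batch back into {x, y} pairs.
    public static List<int[]> decode(String message) {
        List<int[]> points = new ArrayList<>();
        for (String pair : message.substring(2).split(";")) {
            String[] xy = pair.split(",");
            points.add(new int[]{Integer.parseInt(xy[0]), Integer.parseInt(xy[1])});
        }
        return points;
    }
}
```

On the phone side you would buffer touchMove points for, say, 50 ms and flush them as one message; on the watch, loop over `decode()` and call `actionMove()` per point, then `invalidate()` once.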
I am new to OpenGL and ARCore, and I am using the Google ARCore sample as the base for my application, with OpenGL ES 2.0. I am able to do pinch zoom (two fingers) using android.view.ScaleGestureDetector.SimpleOnScaleGestureListener, and by using the RotateGestureDetector library class I can get the rotation degrees, which works well with my 3D object.
While rotating my 3D object, however, it gets scaled too. I want to stop the scaling while the user is rotating. How can I achieve this? Or how can I route scaling and rotation into different methods so each updates its own matrix? I do not want to use any third-party 3D library for this.
Please help me with this. My code is below; please point out anything I am doing wrong.
ScaleGesture
private class CustomScaleGesture extends ScaleGestureDetector.SimpleOnScaleGestureListener {
@Override
public boolean onScale(ScaleGestureDetector detector) {
DebugHelper.log("detector.getScaleFactor(): " + detector.getScaleFactor() + " scaleFactor = " + scaleFactor);
scaleFactor *= detector.getScaleFactor();
DebugHelper.log("final scaleFactor: " + scaleFactor);
return true;
}
}
RotationGesture
private class RotateListener extends RotateGestureDetector.SimpleOnRotateGestureListener {
@Override
public boolean onRotate(RotateGestureDetector detector) {
DebugHelper.log("RotateListener called..");
mRotationDegrees -= detector.getRotationDegreesDelta();
DebugHelper.log("RotateListener: " + mRotationDegrees);
return true;
}
}
MainActivity
public class MyARActivity extends BaseActivity<MyActivityArBinding> implements GLSurfaceView.Renderer {
//AR Variables
private int mWidth;
private int mHeight;
private boolean capturePicture = false;
private boolean installRequested;
private boolean moving;
float[] projmtx = new float[16];
float[] viewmtx = new float[16];
private Session session;
private Snackbar messageSnackbar;
private DisplayRotationHelper displayRotationHelper;
private final BackgroundRenderer backgroundRenderer = new BackgroundRenderer();
private ObjectRenderer virtualObject;
private ObjectRenderer virtualObjectShadow;
private final PlaneRenderer planeRenderer = new PlaneRenderer();
private PointCloudRenderer pointCloud = new PointCloudRenderer();
// Temporary matrix allocated here to reduce number of allocations for each frame.
private float[] anchorMatrix = new float[16];
// Tap handling and UI.
private ArrayBlockingQueue<MotionEvent> queuedSingleTaps = new ArrayBlockingQueue<>(16);
private ArrayList<Anchor> anchors = new ArrayList<>();
//load and manipulate obj
private SQLiteHelper sqlHelper;
private boolean isObjectChanged = false;
private String objectPath;
private List<CharacterModel> characterModelList = new ArrayList<>();
private boolean isFirstTimeLoad = true;
//Gestures
private float mRotationDegrees = 0.f;
private RotateGestureDetector mRotateDetector;
private float scaleFactor = 1.0f;
private ScaleGestureDetector scaleDetector;
private GestureDetector gestureDetector;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setHeaderVisible(false);
doDefaults();
}
private void doDefaults() {
binding.setPresenter(this);
sqlHelper = SQLiteHelper.getInstance(this);
load3DCharacters();
initAR();
}
@SuppressLint("ClickableViewAccessibility")
private void initAR() {
displayRotationHelper = new DisplayRotationHelper(this);
scaleDetector = new ScaleGestureDetector(this, new CustomScaleGesture());
mRotateDetector = new RotateGestureDetector(getApplicationContext(), new RotateListener());
// Set up tap listener.
gestureDetector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener() {
@Override
public boolean onSingleTapUp(MotionEvent e) {
if (anchors.size() <= 0) {
onSingleTap(e);
}
return true;
}
@Override
public boolean onDown(MotionEvent e) {
return true;
}
});
binding.surfaceView.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
DebugHelper.log("binding.surfaceView.setOnTouchListener called..");
mRotateDetector.onTouchEvent(event);
scaleDetector.onTouchEvent(event);
switch (event.getAction()) {
case MotionEvent.ACTION_DOWN:
moving = true;
DebugHelper.log("ACTION_DOWN");
break;
case MotionEvent.ACTION_UP:
DebugHelper.log("ACTION_UP");
moving = false;
break;
case MotionEvent.ACTION_MOVE:
DebugHelper.log("ACTION_MOVE");
if (anchors.size() > 0) {
onSecondTouch(event);
}
break;
}
return gestureDetector.onTouchEvent(event);
}
});
// Set up renderer.
binding.surfaceView.setPreserveEGLContextOnPause(true);
binding.surfaceView.setEGLContextClientVersion(2);
binding.surfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0); // Alpha used for plane blending.
binding.surfaceView.setRenderer(this);
binding.surfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
installRequested = false;
}
private void onSecondTouch(MotionEvent e) {
Log.e("Second Touch", "Executed");
if (e.getPointerCount() > 1) {
scaleDetector.onTouchEvent(e);
} else {
queuedSingleTaps.offer(e);
}
}
private void onSingleTap(MotionEvent e) {
// Queue tap if there is space. Tap is lost if queue is full.
DebugHelper.log("onSingleTap()");
queuedSingleTaps.offer(e);
}
private void load3DCharacters() {
CharacterModel model = new CharacterModel();
model.setName("Cat");
model.setObjectPath("cat/cat.obj");
model.setScaleFactor(0.25f);
model.setResourceId(R.drawable.cat);
characterModelList.add(model);
model = new CharacterModel();
model.setName("Old Man");
model.setObjectPath("man/muro.obj");
model.setScaleFactor(0.0085f);
model.setResourceId(R.drawable.old_man);
characterModelList.add(model);
model = new CharacterModel();
model.setName("Bloodwing");
model.setObjectPath("bloodwing/bloodwing.obj");
model.setScaleFactor(0.0009f);
model.setResourceId(R.drawable.bat);
characterModelList.add(model);
}
private void loadObject(CharacterModel model) {
try {
this.objectPath = model.getObjectPath();
this.scaleFactor = model.getScaleFactor();
if (virtualObject == null) {
virtualObject = new ObjectRenderer(objectPath);
virtualObject.createOnGlThread(this);
virtualObject.setMaterialProperties(0.0f, 1.0f, 1.0f, 6.0f);
} else {
// Clear screen to notify driver it should not load any pixels from previous frame.
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
isObjectChanged = true;
virtualObject.updateObjectPath(model.getObjectPath());
}
} catch (Exception ex) {
ex.printStackTrace();
}
}
@Override
public void onDrawFrame(GL10 gl) {
if (isObjectChanged) {
isObjectChanged = false;
try {
virtualObject.createOnGlThread(this);
virtualObject.setMaterialProperties(0.0f, 2.0f, 0.5f, 6.0f);
} catch (IOException e) {
e.printStackTrace();
}
return;
}
// Clear screen to notify driver it should not load any pixels from previous frame.
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
if (session == null) {
return;
}
// Notify ARCore session that the view size changed so that the perspective matrix and
// the video background can be properly adjusted.
displayRotationHelper.updateSessionIfNeeded(session);
try {
session.setCameraTextureName(backgroundRenderer.getTextureId());
// Obtain the current frame from ARSession. When the configuration is set to
// UpdateMode.BLOCKING (it is by default), this will throttle the rendering to the
// camera framerate.
Frame frame = session.update();
Camera camera = frame.getCamera();
// Handle taps. Handling only one tap per frame, as taps are usually low frequency
// compared to frame rate.
MotionEvent tap = queuedSingleTaps.poll();
if (tap != null && camera.getTrackingState() == TrackingState.TRACKING) {
for (HitResult hit : frame.hitTest(tap)) {
// Check if any plane was hit, and if it was hit inside the plane polygon
Trackable trackable = hit.getTrackable();
// Creates an anchor if a plane or an oriented point was hit.
if ((trackable instanceof Plane && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) || (trackable instanceof Point && ((Point) trackable).getOrientationMode() == Point.OrientationMode.ESTIMATED_SURFACE_NORMAL)) {
// Hits are sorted by depth. Consider only closest hit on a plane or oriented point.
// Cap the number of objects created. This avoids overloading both the
// rendering system and ARCore.
//if (!isUpdate) {
DebugHelper.log("Anchor size = " + anchors.size());
if (anchors.size() >= 1) {
anchors.get(0).detach();
anchors.remove(0);
}
// Adding an Anchor tells ARCore that it should track this position in
// space. This anchor is created on the Plane to place the 3D model
// in the correct position relative both to the world and to the plane.
if (anchors.size() > 0) {
DebugHelper.log("anchor list has data");
for (Anchor anchor : anchors) {
anchor.detach();
anchors.remove(anchor);
}
}
Anchor anchor = hit.createAnchor();
if (anchor != null)
anchors.add(anchor);
else
DebugHelper.log("anchor is null");
//}
break;
}
}
}
// Draw background.
backgroundRenderer.draw(frame);
// If not tracking, don't draw 3d objects.
if (camera.getTrackingState() == TrackingState.PAUSED) {
return;
}
// Get projection matrix.
camera.getProjectionMatrix(projmtx, 0, 0.1f, 100.0f);
// Get camera matrix and draw.
camera.getViewMatrix(viewmtx, 0);
// Compute lighting from average intensity of the image.
final float lightIntensity = frame.getLightEstimate().getPixelIntensity();
// Visualize tracked points.
PointCloud pointCloud = frame.acquirePointCloud();
this.pointCloud.update(pointCloud);
if (!capturePicture)
this.pointCloud.draw(viewmtx, projmtx);
// Application is responsible for releasing the point cloud resources after
// using it.
pointCloud.release();
// Check if we detected at least one plane. If so, hide the loading message.
if (messageSnackbar != null) {
    for (Plane plane : session.getAllTrackables(Plane.class)) {
        if (plane.getType() == Plane.Type.HORIZONTAL_UPWARD_FACING
                && plane.getTrackingState() == TrackingState.TRACKING) {
            hideLoadingMessage();
            break;
        }
    }
}
// Visualize planes.
if (!capturePicture)
planeRenderer.drawPlanes(session.getAllTrackables(Plane.class), camera.getDisplayOrientedPose(), projmtx);
// Visualize anchors created by touch.
for (Anchor anchor : anchors) {
if (anchor.getTrackingState() != TrackingState.TRACKING) {
continue;
}
// Get the current pose of an Anchor in world space. The Anchor pose is updated
// during calls to session.update() as ARCore refines its estimate of the world.
anchor.getPose().toMatrix(anchorMatrix, 0);
// Update and draw the model and its shadow.
if (virtualObject != null) {
// Pass the scale factor and rotation degrees to update the model matrix.
virtualObject.updateModelMatrix(anchorMatrix, scaleFactor, mRotationDegrees);
if (viewmtx != null && projmtx != null) {
virtualObject.draw(viewmtx, projmtx, lightIntensity);
}
}
}
if (capturePicture) {
capturePicture = false;
onSavePicture();
}
} catch (Throwable t) {
Log.e(TAG, "Exception on the OpenGL thread", t);
}
}
ObjectRenderer
public void draw(float[] cameraView, float[] cameraPerspective, float lightIntensity) {
try {
ShaderUtil.checkGLError(TAG, "Before draw");
Matrix.multiplyMM(mModelViewMatrix, 0, cameraView, 0, mModelMatrix, 0);
Matrix.multiplyMM(mModelViewProjectionMatrix, 0, cameraPerspective, 0, mModelViewMatrix, 0);
Matrix.setRotateM(mRotationMatrix, 0, MyARActivity.rotationDegrees, 0.0f, 1.0f, 0.0f); // apply the rotation degrees about the Y axis
Matrix.multiplyMM(mFinalModelViewProjectionMatrix, 0, mModelViewProjectionMatrix, 0, mRotationMatrix, 0);
GLES20.glUseProgram(mProgram);
// Set the lighting environment properties.
Matrix.multiplyMV(mViewLightDirection, 0, mModelViewMatrix, 0, LIGHT_DIRECTION, 0);
normalizeVec3(mViewLightDirection);
GLES20.glUniform4f(mLightingParametersUniform, mViewLightDirection[0], mViewLightDirection[1], mViewLightDirection[2], lightIntensity);
// Set the object material properties.
GLES20.glUniform4f(mMaterialParametersUniform, mAmbient, mDiffuse, mSpecular, mSpecularPower);
// Set the ModelViewProjection matrix in the shader.
GLES20.glUniformMatrix4fv(mModelViewUniform, 1, false, mModelViewMatrix, 0);
GLES20.glUniformMatrix4fv(mModelViewProjectionUniform, 1, false, mFinalModelViewProjectionMatrix, 0);
if (mBlendMode != null) {
GLES20.glDepthMask(false);
GLES20.glEnable(GLES20.GL_BLEND);
switch (mBlendMode) {
case Shadow:
// Multiplicative blending function for Shadow.
GLES20.glBlendFunc(GLES20.GL_ZERO, GLES20.GL_ONE_MINUS_SRC_ALPHA);
break;
case Grid:
// Grid, additive blending function.
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
break;
}
}
if (mObj != null && mObj.getNumMaterialGroups() > 0) {
//Start drawing data from each VAO
for (int i = 0; i < mObj.getNumMaterialGroups(); i++) {
// Attach the object texture.
GLES20.glUniform1i(mTextureUniform, 0);
GLES20.glBindTexture(GL_TEXTURE_2D, mTextures[i]);
GLES30.glBindVertexArray(vectorArrayObjectIds[i]);
GLES20.glDrawElements(GLES20.GL_TRIANGLES, mObj.getMaterialGroup(i).getNumFaces() * 3,
GLES20.GL_UNSIGNED_SHORT, 0);
GLES30.glBindVertexArray(0);
//Unbind texture
GLES20.glBindTexture(GL_TEXTURE_2D, 0);
}
}
if (mBlendMode != null) {
GLES20.glDisable(GLES20.GL_BLEND);
GLES20.glDepthMask(true);
}
ShaderUtil.checkGLError(TAG, "After draw");
} catch (Exception ex) {
ex.printStackTrace();
}
}
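The object scaling while it rotates is usually a sign that scale and rotation end up folded into the same per-frame update. Keeping them as independent state and composing them as separate transforms each frame (model = Rotate · Scale, with translation from the anchor on top) means a rotation gesture can never change the size. A minimal 2D sketch of that separation, with illustrative names (with android.opengl.Matrix the same idea is separate `Matrix.rotateM` / `Matrix.scaleM` calls on the model matrix):

```java
// Sketch: keep scale and rotation as independent state, each updated
// only by its own gesture callback, and apply them as separate
// transforms. Rotation preserves length, so |apply(v)| == scale * |v|
// no matter what the rotation gesture does.
public class ModelTransform {
    private float scale = 1f;           // updated only by onScale()
    private float rotationDegrees = 0f; // updated only by onRotate()

    public void setScale(float s) { scale = s; }
    public void setRotationDegrees(float deg) { rotationDegrees = deg; }

    // Rotate first, then scale uniformly.
    public float[] apply(float x, float y) {
        double r = Math.toRadians(rotationDegrees);
        float rx = (float) (x * Math.cos(r) - y * Math.sin(r));
        float ry = (float) (x * Math.sin(r) + y * Math.cos(r));
        return new float[]{rx * scale, ry * scale};
    }
}
```

The key design point is that `onScale` writes only `scale` and `onRotate` writes only `rotationDegrees`; neither callback touches the other's state, so the two gestures cannot bleed into each other even when both detectors fire on the same MotionEvent.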
I have a class that allows you to draw on an image from the filesystem; I've shared its methods below. The idea is that, as the image is touched, circles are drawn on top of the image at each of the touched points.
@Override
protected void onDraw(Canvas canvas) {
try{
if (mTextPaint == null)
{
mTextPaint = new Paint(Paint.ANTI_ALIAS_FLAG);
mTextPaint.setColor(Color.CYAN);
}
if (cachedBitmap == null)
{
Bitmap immutableBitmap = MediaStore.Images.Media.getBitmap(getContext().getContentResolver(), Uri.fromFile(CameraActivity.getFile()));
cachedBitmap = immutableBitmap.copy(Bitmap.Config.ARGB_8888, true);
}
if (cachedCanvas == null)
{
cachedCanvas = canvas;
cachedCanvas.setBitmap(cachedBitmap);
}
if (! mPoints.isEmpty())
{
for (Point point : mPoints)
{
cachedCanvas.drawCircle(point.x, point.y, 25, mTextPaint);
}
Log.i(TAG, "Drawing points...");
}
}
catch (Exception e)
{
Log.e(TAG, "Error on draw: " + e.toString());
}
}
/**
 * Populates the list of points that have been touched.
 *
 * @param event the touch event
 * @return always true; the event is consumed
 */
@Override
public boolean onTouchEvent(MotionEvent event)
{
Point point = new Point((int)event.getX(), (int)event.getY());
mPoints.add(point);
invalidate();
return true;
}
After this is done, I'd like to include a static method that returns the now drawn on bitmap, something like this:
public static Bitmap getCachedBitmap()
{
return cachedBitmap;
}
The problem is, the cachedBitmap variable is not being updated as it is being drawn on, so when I inspect the return value of cachedBitmap, I just get the Bitmap from the file URI specified in the initial immutableBitmap variable.
Any idea how I can return the altered bitmap?
You can take an image of the outer layout, which contains both the bitmap and the canvas on which the circles are drawn:
View v = yourlayout.getRootView();
v.setDrawingCacheEnabled(true);
Bitmap bitmap = v.getDrawingCache();
BitmapDrawable drawable=new BitmapDrawable(bitmap);
This way, you will get your initial image plus the edits.
I want to move text over an image so that the text is fixed at the desired place on the image.
I run this code successfully, but it's not optimal or user friendly: the text movement doesn't always match the finger's pick and drop. If anybody has better code, please share it with me, or let me know if I am missing something.
// Touch listener on the image view; records the touch position and redraws
popImgae.setOnTouchListener( new OnTouchListener(){
public boolean onTouch(View v, MotionEvent e) {
someGlobalXvariable = e.getX();
someGlobalYvariable = e.getY();
setTextPosition();
//saveImage(imgRecord.get(1),leftPos,topPos,popText.getTextSize(),popText.getText().toString());
return true;
}
});
public void setTextPosition(){
try {
redrawImage(imgRecord.get(1),Integer.parseInt(imgRecord.get(8)),imgRecord.get(6),Integer.parseInt(imgRecord.get(9)));
} catch (Exception e) {
System.out.println("Error in setTextPosition: " + e.getMessage());
}
}
// Redraw the image with the text at the last touched position
public void redrawImage(String path,float sizeValue,String textValue,int colorValue) {
//Bitmap bm = BitmapFactory.decodeResource(getResources(), R.drawable.fashion_pic);
BitmapFactory.Options options = new BitmapFactory.Options();
options.inMutable = true;
Bitmap bm = BitmapFactory.decodeFile(path,options);
//bm = imageManipulation.convertToMutable(bm);
proxy = Bitmap.createBitmap(bm.getWidth(), bm.getHeight(), Config.ARGB_8888);
Canvas c = new Canvas(proxy);
//Here, we draw the background image.
c.drawBitmap(bm, new Matrix(), null);
Paint paint = new Paint();
paint.setColor(colorValue); // Text Color
paint.setStrokeWidth(30); // Text Size
paint.setTextSize(sizeValue);
System.out.println("Values passing=========="+someGlobalXvariable+", "+someGlobalYvariable+", "
+sizeValue+", "+textValue);
//Here, we draw the text where the user last touched.
c.drawText(textValue, someGlobalXvariable, someGlobalYvariable, paint);
popImgae.setImageBitmap(proxy);
}
class MyView extends View{
public MyView(Context context) {
super(context);
//get your drawable image
setBackgroundDrawable(drawable);
}
@Override
protected void onDraw(Canvas canvas) {
super.onDraw(canvas);
canvas.drawText("text", x, y, new Paint());
}
float x,y;
@Override
public boolean onTouchEvent(MotionEvent event) {
    x = event.getX();
    y = event.getY(); // was event.getX(), which broke vertical tracking
    invalidate();
    return true;
}
}
I have to draw a circle in a live wallpaper; when it touches the boundary, the drawing direction is reversed (something like a zigzag pattern).
I am able to draw the circle in this pattern. But:
How do I remove the previously drawn circle so that only a single circle (dot) is visible at a time?
When I redraw the bitmap it starts flickering. Why does this happen?
Code is as follows:
Thread to draw circle:
animRunnable = new Runnable() {
public void run() {
if (!isRightEndReached && moveCircleX < 320) {
moveCircleX++;
moveCircleY++;
} else if (isRightEndReached) {
moveCircleX--;
moveCircleY++;
}
if (moveCircleX >= 320) {
isRightEndReached = true;
} else if (moveCircleX <= 0) {
isRightEndReached = false;
}
moveCircle(moveCircleX, moveCircleY);
if (moveCircleY == 480) {
// end of screen -re-init x and y point to move circle.
moveCircleX = intialStartX-10;
moveCircleY = intialStartY+1;
isRightEndReached = false;
// show wallpaper.
showWallpaper();
moveCircle(moveCircleX, moveCircleY);
}
}
};
/**
* Moves the circle.
*
* @param x
* @param y
*/
private void moveCircle(int x, int y) {
Log.d("x==" + x, "y==" + y);
Paint paint = new Paint();
SurfaceHolder surfaceHolder = getSurfaceHolder();
Canvas canvas = null;
try {
canvas = surfaceHolder.lockCanvas();
if (canvas != null) {
canvas.save();
paint.setColor(Color.RED);
canvas.drawCircle(x, y, 5, paint);
canvas.restore();
}
} catch (Exception e) {
e.printStackTrace();
}
finally {
if (canvas != null) {
surfaceHolder.unlockCanvasAndPost(canvas);
}
}
animHandler.removeCallbacks(animRunnable);
if (isVisible()) {
animHandler.postDelayed(animRunnable, 1000L / 500L);
}
}
//Show wallpaper method.
/**
* Method to show wallpaper.
*/
void showWallpaper() {
SurfaceHolder surfaceHolder = getSurfaceHolder();
Canvas canvas = null;
try {
canvas = surfaceHolder.lockCanvas();
if (canvas != null) {
System.out
.println("Drawing bitmap in show Wallpaper method.");
canvas.save();
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPurgeable = true;
bitmap = BitmapFactory.decodeResource(getResources(),
R.drawable.aquarium, options);
canvas.drawColor(0xff000000);
canvas.drawBitmap(bitmap, 0, 0, null);
canvas.restore();
}
} finally {
if (canvas != null) {
surfaceHolder.unlockCanvasAndPost(canvas);
}
}
}
}
SOLVED: I finally got the solution, not by trying to erase the circle, but by redrawing the bitmap every frame with the new circle position. The method is as follows:
{
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPurgeable = true;
bitmap = BitmapFactory.decodeResource(getResources(),
R.drawable.aquarium, options);
Paint paint = new Paint();
/**
* Method to move the circle, i.e. to draw the bitmap with the new circle position.
*
* @param x
* @param y
*/
private void renderBackground(int x, int y) {
Log.d("x==" + x, "y==" + y);
surfaceHolder = getSurfaceHolder();
Canvas canvas = null;
try {
canvas = surfaceHolder.lockCanvas();
if (canvas != null) {
paint.setColor(Color.RED);
canvas.save();
// set Back ground
canvas.drawBitmap(bitmap, 0, 0, null);
// write draw circle.
paint.setAntiAlias(true);
canvas.drawCircle(x, y, 15, paint);
canvas.restore();
// Note: do not recycle the bitmap here; it is decoded once and reused every frame.
}
} catch (Exception e) {
e.printStackTrace();
}
finally {
if (canvas != null) {
surfaceHolder.unlockCanvasAndPost(canvas);
// showWallpaper();
}
}
animHandler.removeCallbacks(animRunnable);
if (isVisible()) {
animHandler.postDelayed(animRunnable, 1000L / 25L);
}
}
}
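The zigzag movement and boundary reversal buried in the animation runnable above can also be isolated into a small, pure step function, which makes the bounce logic testable on its own; the hard-coded 320/480 screen bounds become parameters in this sketch:

```java
// Sketch: the zigzag movement from the animation runnable as a pure
// step function. Each call advances x in the current direction and
// reverses at either edge; y always moves down one pixel.
public class ZigZag {
    private int x, y;
    private boolean movingRight = true;
    private final int width;

    public ZigZag(int startX, int startY, int width) {
        this.x = startX;
        this.y = startY;
        this.width = width;
    }

    public void step() {
        x += movingRight ? 1 : -1;
        y += 1;
        if (x >= width) movingRight = false;
        else if (x <= 0) movingRight = true;
    }

    public int getX() { return x; }
    public int getY() { return y; }
}
```

The wallpaper engine would then just call `step()` once per frame and pass `getX()`/`getY()` to `renderBackground`, keeping animation state out of the drawing code.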