How to Convert View Coordinates to Canvas Coordinates on Android?

I'm porting an app from iOS to Android that involves dragging and dropping items from a list view into a view that contains a rectangular main area.
In Swift I used this code to validate that the item is inside the main area:
func isInsideMainArea(point: CGPoint) -> Bool {
    let nodePoint = convertPoint(fromView: point)
    return mainArea?.contains(nodePoint) ?? false
}
But on Android I cannot find the correct way to convert view coordinates to canvas coordinates, so I can evaluate whether the dropped item is inside the main area.
So far this is my validation code, but it does not work as expected, because the coordinates change whenever I scale or translate the canvas:
public Boolean isInsideMainArea(Room testRoom, Float testX, Float testY) {
    // RectF takes left, top, right, bottom
    RectF rectangle = new RectF(
            testX,
            testY,
            testX + testRoom.getWidth() * CONSTANTS.PIX_PER_METER * metrics.density,
            testY + testRoom.getHeight() * CONSTANTS.PIX_PER_METER * metrics.density
    );
    return mainArea.contains(rectangle);
}
Is there an equivalent to the convertPoint function on Android, or another way to do this?
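Android has no single convertPoint call for canvas coordinates, but if you apply the scale and translation to the canvas yourself, one common approach is to keep those transformations in a Matrix, invert it, and map the touch point through the inverse. A rough sketch only; the canvasMatrix, scaleFactor and offsetX/offsetY fields are illustrative assumptions, not part of the code above:
// Keep the same transform you apply to the canvas in onDraw()
private final Matrix canvasMatrix = new Matrix();
private final Matrix inverseMatrix = new Matrix();

@Override
protected void onDraw(Canvas canvas) {
    canvasMatrix.reset();
    canvasMatrix.postScale(scaleFactor, scaleFactor);
    canvasMatrix.postTranslate(offsetX, offsetY);
    canvas.concat(canvasMatrix);
    // ... draw mainArea, rooms, etc. ...
}

// Convert a point from view (touch) coordinates to canvas coordinates
public PointF viewToCanvas(float viewX, float viewY) {
    float[] pts = { viewX, viewY };
    if (canvasMatrix.invert(inverseMatrix)) {
        inverseMatrix.mapPoints(pts);
    }
    return new PointF(pts[0], pts[1]);
}
The mapped point can then be tested with mainArea.contains(point.x, point.y), which stays correct no matter how the canvas is scaled or translated.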

Related

Show marker view by default in MpChart

I am using MpChart's LineChart for showing my graphs. I have added multiple data set lines and everything is working fine, but I want the MarkerView to be set to some point in the middle and be visible by default. Right now the marker view is visible only when I touch it. Is there any method to achieve this?
Initial graph
MarkerView shown after the graph is touched
Does this help?
for (IDataSet set : mChart.getData().getDataSets())
    set.setDrawValues(true);
mChart.invalidate();
I've done this in the iOS version of this library, but as the documentation says they are almost identical, so I hope my answer is correctly "translated".
What I did was pick a point where your default marker will always be shown.
This is the default point where the marker should be locked; define your own point based on HighestVisibleX or something else.
Then on the first render I highlight this point:
Highlight myFirstRenderedHighlight = new Highlight(myLockedMarkerPoint.x, 0);
someChart.highlightValues(new Highlight[] { myFirstRenderedHighlight });
So if you always want to show this point even when dragging along the x-axis, you need to redraw the highlight. This can be done by listening for chart translation, i.e. by implementing the OnChartGestureListener interface and redrawing in onChartTranslate(), for example:
// same logic as picking first point with HighestVisibleX or something
Highlight movingHighlight = new Highlight(entryInLockedPoint.x,0);
someChart.highlightValues(new Highlight[] { movingHighlight });
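To make the wiring concrete, here is a minimal sketch of such a listener; lockHighlightToVisiblePoint() is a placeholder for the same point-picking and highlightValues() logic shown above:
someChart.setOnChartGestureListener(new OnChartGestureListener() {
    @Override
    public void onChartTranslate(MotionEvent me, float dX, float dY) {
        // Re-pick the locked entry and call highlightValues() again
        lockHighlightToVisiblePoint();
    }

    // The remaining callbacks are required by the interface but unused here
    @Override public void onChartGestureStart(MotionEvent me, ChartTouchListener.ChartGesture lastGesture) {}
    @Override public void onChartGestureEnd(MotionEvent me, ChartTouchListener.ChartGesture lastGesture) {}
    @Override public void onChartLongPressed(MotionEvent me) {}
    @Override public void onChartDoubleTapped(MotionEvent me) {}
    @Override public void onChartSingleTapped(MotionEvent me) {}
    @Override public void onChartFling(MotionEvent me1, MotionEvent me2, float velocityX, float velocityY) {}
    @Override public void onChartScale(MotionEvent me, float scaleX, float scaleY) {}
});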
And if you want to mark several of the graphs, then choose from the chart's data sets.
Hope this was what you were looking for :)
There is no default implementation to do this in the library.
One way to do this can be to modify the LineChartRenderer class in the library. MPAndroidChart allows you to draw circles on plotted points; you can modify this by defining a new constructor for the entry class that takes a bitmap. You can then draw your bitmap at the plotted point instead of the circle that is drawn.
ArrayList<Entry> values = new ArrayList<Entry>();
Drawable d;
Bitmap bitmap;
for (int i = 0; i < dataList.size(); i++) {
    LineChartData data = dataList.get(i);
    float val = Float.valueOf(Utils.decimalValuePrecisionTwoPlaces((float) data.getDataVolGallon()));
    if (data.getImageIndex() >= 0) {
        // Load the icon for this entry, scale it down and attach it to the entry
        d = ContextCompat.getDrawable(getContext(), resIcon[data.getImageIndex()]);
        bitmap = ((BitmapDrawable) d).getBitmap();
        bitmap = Bitmap.createScaledBitmap(bitmap, bitmap.getWidth() / 2, bitmap.getHeight() / 2, false);
        values.add(new Entry(i, val, bitmap));
    } else {
        // Plain entry without an icon
        values.add(new Entry(i, val));
    }
}
The above code is an example of how to set entries with and without a bitmap.
if (e.getBitmap() != null) {
    // Draw the entry's bitmap at the plotted point instead of the circle
    c.drawBitmap(e.getBitmap(), mCirclesBuffer[0] - circleRadius, mCirclesBuffer[1] - circleRadius, mRenderPaint);
}
This is the code to draw the image from the bitmap; just comment out the line that draws circles in drawCircles() of LineChartRenderer and use this instead.
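For reference, a hedged sketch of the bitmap-carrying entry the answer refers to; it is written here as a standalone subclass (the name BitmapEntry is only illustrative) rather than patching the library's Entry class, in which case the loop above would create BitmapEntry objects and the renderer would need an instanceof check before calling getBitmap():
// Hypothetical Entry subclass carrying a Bitmap for the renderer to draw
public class BitmapEntry extends Entry {
    private Bitmap bitmap;

    public BitmapEntry(float x, float y) {
        super(x, y);
    }

    public BitmapEntry(float x, float y, Bitmap bitmap) {
        super(x, y);
        this.bitmap = bitmap;
    }

    public Bitmap getBitmap() {
        return bitmap;
    }
}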
Leave a comment if you have any questions. Hope this helps!
To adjust the starting x,y position of your marker, just override this method in your MarkerView class. This also adjusts the marker's x position if it goes out of the bounds of your chart view.
override fun draw(canvas: Canvas, positionX: Float, positionY: Float) {
    // Check marker position and update offsets.
    var posx = positionX
    val posy: Float
    val w = width
    val h = height
    posx -= if (resources.displayMetrics.widthPixels - positionX < w) {
        w.toFloat()
    } else {
        w / 2.toFloat() // Draw marker in the middle of highlight
    }
    posy = lineChart.height - h.toFloat() * 2 // Always starts from middle of chart
    // Translate to the correct position and draw
    canvas.translate(posx, posy)
    draw(canvas)
    canvas.translate(-posx, -posy)
}

Android - How to overlay one path on top of another

I currently have the following code:
private void drawGreen(Canvas canvas) {
    greenPaint.setColor(0xFF00AA00);
    if (start) {
        greenPath = new Path();
        greenPath.reset();
        greenPath.moveTo(pathArrayX.get(0), pathArrayY.get(0));
        start = false;
    }
    if (isInsideCircle(pathArrayX.get(pathIndex), pathArrayY.get(pathIndex), curX, curY,
            TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, 25, getResources().getDisplayMetrics()))) {
        greenPath.lineTo(pathArrayX.get(pathIndex), pathArrayY.get(pathIndex));
        canvas.drawPath(greenPath, greenPaint);
        pathIndex++;
    }
}
private boolean isInsideCircle(float x, float y, float centerX, float centerY, float radius) {
    return Math.pow(x - centerX, 2) + Math.pow(y - centerY, 2) < Math.pow(radius, 2);
}
In my app, I at first draw a red path, with its coordinates stored in the ArrayLists pathArrayX and pathArrayY. I am tracking the X and Y coordinates of a circular ImageView being moved underneath a mouse, and would like to overlay the red path with a green path when the user hovers over the path from beginning to end. As the user hovers over the red path, the portion of the red path that they already completed would be overlaid by a green path along the same segment. The X and Y coordinates of the ImageView (curX and curY) are being calculated from a running thread.
However, my app doesn't appear to be drawing the green path at all. Is there anything I am doing wrong here?
Is this function even being called?
Assuming it's being called inside onDraw(Canvas), it looks like it might be missing the outer code for a loop. Seeing that you're doing pathIndex++ at the end, were you using a while loop? If you're just going to loop through the points, use a for loop: a while loop is more prone to becoming an endless loop if you forget to increment the counter, increment it incorrectly, or do it in multiple places.
Side note: if the boolean start flag is only being used to lazily initialise greenPath, you should scrap it and just use if (greenPath == null) { as a general practice. Use state that you can infer directly from objects, and avoid flags if you can help it; this makes the code cleaner.
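Put together, the two suggestions might look roughly like this; it is only a sketch and assumes the same fields as the question (pathArrayX, pathArrayY, curX, curY, pathIndex, greenPath, greenPaint):
private void drawGreen(Canvas canvas) {
    greenPaint.setColor(0xFF00AA00);
    // Lazily initialise the path with a null check instead of a boolean flag
    if (greenPath == null) {
        greenPath = new Path();
        greenPath.moveTo(pathArrayX.get(0), pathArrayY.get(0));
    }
    float radius = TypedValue.applyDimension(
            TypedValue.COMPLEX_UNIT_DIP, 25, getResources().getDisplayMetrics());
    // Bounded for loop: extend the green path over every stored point reached so far
    for (; pathIndex < pathArrayX.size(); pathIndex++) {
        if (!isInsideCircle(pathArrayX.get(pathIndex), pathArrayY.get(pathIndex), curX, curY, radius)) {
            break;
        }
        greenPath.lineTo(pathArrayX.get(pathIndex), pathArrayY.get(pathIndex));
    }
    canvas.drawPath(greenPath, greenPaint);
}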

How to get coordinates of touch screen points in ogl?

I am drawing a map using OpenGL. The map is drawn after reading XML files and setting the corresponding buffers. This map contains streets, highways and boundaries. What I want is that whenever I touch the map, the color of the specific layer should change.
The issue I am facing is that whenever I touch the screen, I only get the screen pixel of the point where I touched. I want to convert this point into OpenGL coordinates so that I can match it against the drawn map and highlight the selected point.
How do I convert this point into OpenGL coordinates?
You need to unproject the screen point into OpenGL world space:
vec3 UnProjectPoint( const vec3& Point, const mat4& Projection, const mat4& ModelView )
{
    vec4 R( Point, 1.0f );
    R.x = 2.0f * R.x - 1.0f;
    R.y = 2.0f * R.y - 1.0f;
    R.y = -R.y;
    R.z = 1.0f;
    R = Projection.GetInversed() * R;
    R = ModelView.GetInversed() * R;
    return R.ToVec3();
}
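On Android specifically, the same unprojection can be done with android.opengl.GLU.gluUnProject. A rough sketch, assuming you have kept copies of the model-view and projection matrices as 16-element float arrays (the parameter names here are illustrative):
// Map a touch point to OpenGL world coordinates (uses android.opengl.GLU)
float[] unprojectTouch(float touchX, float touchY,
                       float[] modelView, float[] projection,
                       int viewportWidth, int viewportHeight) {
    int[] viewport = { 0, 0, viewportWidth, viewportHeight };
    float[] obj = new float[4]; // gluUnProject writes x, y, z, w
    // Window Y grows upwards while touch Y grows downwards, so flip it
    float winY = viewportHeight - touchY;
    GLU.gluUnProject(touchX, winY, 0f,
            modelView, 0, projection, 0, viewport, 0, obj, 0);
    // Divide by w in case a perspective projection is in use
    if (obj[3] != 0f) {
        obj[0] /= obj[3];
        obj[1] /= obj[3];
        obj[2] /= obj[3];
    }
    return new float[] { obj[0], obj[1], obj[2] };
}
With an orthographic 2D projection this gives the map point directly; with a perspective projection you would unproject at two depths (winZ = 0 and 1) to build a pick ray.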
You can transform screen coordinates to OpenGL using a transform matrix and your camera position.
See: https://stackoverflow.com/a/11716990/1369222
Better to override onTouchEvent(MotionEvent e) of the GLSurfaceView class and use the code below in the Renderer class, inside the onSurfaceChanged(GL10 gl, int width, int height) method.
GLU.gluOrtho2D(gl,0,width,0,height);
The above code maps the screen coordinates onto the OpenGL SurfaceView, so you can place points on the screen easily. But this only works in a 2D view.

Mapping A "Touch Region" to a Bitmap

I am trying to gain some more familiarity with the Android SurfaceView class, and in doing so am attempting to create a simple application that allows a user to move a Bitmap around the screen. The troublesome part of this implementation is that I am also including the functionality that the user may drag the image again after it has been placed. In order to do this, I am mapping the bitmap to a simple set of coordinates that define the Bitmap's current location. The region I am mapping the image to, however, does not match up with the image.
The Problem
After placing an image on the SurfaceView using canvas.drawBitmap(), and recording the coordinates of the placed image, the mapping system that I have set up misinterprets the Bitmap's coordinates somehow and does not display correctly. As you can see in this image, I have simply used canvas.drawLine() to draw lines representing the space of my touch region, and the image is always off and to the right.
The Code
Here, I shall provide the relevant code excerpts to help answer my question.
CustomSurface.java
This method encapsulates the drawing of the objects onto the canvas. The comments clarify each element:
public void onDraw(Canvas c){
    //Simple black paint
    Paint paint = new Paint();
    //Draw a white background
    c.drawColor(Color.WHITE);
    //Draw the bitmap at the coordinates
    c.drawBitmap(g.getResource(), g.getCenterX(), g.getCenterY(), null);
    //Draws the actual surface that is receiving touch input
    c.drawLine(g.left, g.top, g.right, g.top, paint);
    c.drawLine(g.right, g.top, g.right, g.bottom, paint);
    c.drawLine(g.right, g.bottom, g.left, g.bottom, paint);
    c.drawLine(g.left, g.bottom, g.left, g.top, paint);
}
This method encapsulates how I capture touch events:
public boolean onTouchEvent(MotionEvent e){
    switch (e.getAction()) {
        case MotionEvent.ACTION_DOWN: {
            if (g.contains((int) e.getX(), (int) e.getY()))
                item_selected = true;
            break;
        }
        case MotionEvent.ACTION_MOVE: {
            if (item_selected)
                g.move((int) e.getX(), (int) e.getY());
            break;
        }
        case MotionEvent.ACTION_UP: {
            item_selected = false;
            break;
        }
        default: {
            //Do nothing
            break;
        }
    }
    return true;
}
Graphic.java
This method is used to construct the Graphic:
//Initializes the graphic assuming the coordinate is in the upper left corner
public Graphic(Bitmap image, int start_x, int start_y){
    resource = image;
    left = start_x;
    top = start_y;
    right = start_x + image.getWidth();
    bottom = start_y + image.getHeight();
}
This method detects if a user is clicking inside the image:
public boolean contains(int x, int y){
    if (x >= left && x <= right) {
        if (y >= top && y <= bottom) {
            return true;
        }
    }
    return false;
}
This method is used to move the graphic:
public void move(int x, int y){
    left = x;
    top = y;
    right = x + resource.getWidth();
    bottom = y + resource.getHeight();
}
I also have 2 methods that determine the center of the region (used for redrawing):
public int getCenterX(){
    return (right - left) / 2 + left;
}
public int getCenterY(){
    return (bottom - top) / 2 + top;
}
Any help would be greatly appreciated, I feel as though many other StackOverflow users could really benefit from a solution to this issue.
There's a very nice and thorough explanation of touch/multitouch/gestures on the Android Developers blog, which includes a free and open-source code example on Google Code.
Please take a look. If you don't need gestures, just skip that part and read about touch events only.
This issue ended up being much simpler than I had thought, and after some tweaking I realized that this was an issue of image width compensation.
This line in the above code is where the error stems from:
c.drawBitmap(g.getResource(), g.getCenterX(), g.getCenterY(), null);
As you can tell, I manipulated the coordinates from within the Graphic class to produce the center of the bitmap, and then called canvas.drawBitmap() assuming that it would draw from the center outward.
Obviously, this would not work, because the canvas always draws from the top left of an image downwards and to the right, so the solution was simple.
The Solution
Create the touch region around the touch location, but draw the bitmap offset by half the image width and height from the center location in the x and y directions. I basically changed the architecture of the Graphic class to implement getDrawX() and getDrawY() methods that return the modified x and y coordinates of where the image should be drawn, so that the center_x and center_y values (determined in the constructor) actually appear at the center of the region.
It all comes down to the fact that in an attempt to compensate for the way the canvas draws bitmaps, I unfortunately incorporated some bad behaviors and in the end had to handle the offset in a completely different way.
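For illustration, here is a minimal sketch of what such getDrawX()/getDrawY() methods could look like, assuming the reworked Graphic class stores center_x and center_y fields (these names are not shown in the excerpts above and are only illustrative):
public int getDrawX() {
    // drawBitmap() draws from the top-left corner, so shift left by half the width
    return center_x - resource.getWidth() / 2;
}
public int getDrawY() {
    // ...and shift up by half the height so the bitmap is visually centered
    return center_y - resource.getHeight() / 2;
}
onDraw() would then call c.drawBitmap(g.getResource(), g.getDrawX(), g.getDrawY(), null) instead of passing the center coordinates directly.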

Android Bitmap OnTouch Questions

I've drawn 5 bitmaps from .png files on a canvas - a head, a body and two arms and legs.
How can I detect which of these has been touched on an OnTouch? And, more specifically, can I detect if the OnTouch was within the actual shape of the body part touched?
What I mean is, obviously, the .pngs themselves are rectangular, but does Android know, or can I tell it, to ignore the transparency within the image?
You could get the colour of the pixel touched and compare it to the colour of the pixel on the background at those coordinates.
EDIT: OK, ignore that, you can't get the colour of a pixel on the canvas. Instead, get the x,y of the touch, check whether any of the body-part images have been touched, and if so, subtract the image's x,y from the touch x,y, then read the pixel of the image at that offset, which will be either transparent or a colour.
public boolean onTouchEvent(MotionEvent event)
{
    int x = (int) event.getX();
    int y = (int) event.getY();
    int offsetx, offsety;
    for (int i = 0; i < NUM_OF_BODY_PARTS; i++)
    {
        // Is the touch inside this body part's bounding rectangle?
        if (bodyPartRect[i].intersects(x, y, x + 1, y + 1))
        {
            // Translate the touch into the bitmap's own coordinates
            offsetx = x - bodyPartRect[i].left;
            offsety = y - bodyPartRect[i].top;
            if (bodyPartBMP[i].getPixel(offsetx, offsety) == Color.TRANSPARENT)
            {
                // whatever - the touch hit a transparent pixel of this part
            }
        }
    }
    return true;
}
