How to close a response body when a tile does not exist? - android

I overlay tiles on top of the map. In areas where there are no tiles I get a leak warning: W/OkHttpClient: A connection to http://*****.***/ was leaked. Did you forget to close a response body?. How can I fix this error? I know about this solution, but I do not know the exact boundaries of my tile coverage, so that method does not fit.
My code:
TileProvider tileProvider = new UrlTileProvider(256, 256) {
@Override
public URL getTileUrl(int x, int y, int zoom) {
String s = String.format("http://****.***/tiles/%d/%d/%d.png", zoom, x, y);
if (!checkTileExists(x, y, zoom)) {
return null;
}
try {
return new URL(s);
} catch (MalformedURLException e) {
throw new AssertionError(e);
}
}
private boolean checkTileExists(int x, int y, int zoom) {
int minZoom = 7;
int maxZoom = 17;
if ((zoom < minZoom || zoom > maxZoom)) {
return false;
}
return true;
}
};
tileOverlay = mMap.addTileOverlay(new TileOverlayOptions().tileProvider(tileProvider));
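One direction often suggested for this situation (not shown in the original post) is to skip UrlTileProvider and implement TileProvider directly, fetching each tile yourself with OkHttp so the response body is always closed; missing tiles then map to NO_TILE. A minimal sketch, assuming an OkHttp dependency and a hypothetical base URL:

import java.io.IOException;
import java.util.Locale;
import com.google.android.gms.maps.model.Tile;
import com.google.android.gms.maps.model.TileProvider;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

public class ClosingTileProvider implements TileProvider {
    private static final int TILE_SIZE = 256;
    private final OkHttpClient client = new OkHttpClient();

    @Override
    public Tile getTile(int x, int y, int zoom) {
        // Hypothetical URL pattern; substitute your own tile server.
        String url = String.format(Locale.US,
                "http://example.invalid/tiles/%d/%d/%d.png", zoom, x, y);
        Request request = new Request.Builder().url(url).build();
        // try-with-resources guarantees the response body is closed,
        // even when the server has no tile for these coordinates.
        try (Response response = client.newCall(request).execute()) {
            if (!response.isSuccessful() || response.body() == null) {
                return NO_TILE; // no tile here, and nothing leaks
            }
            byte[] image = response.body().bytes();
            return new Tile(TILE_SIZE, TILE_SIZE, image);
        } catch (IOException e) {
            return NO_TILE;
        }
    }
}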

Related

How to download an offline map for a particular country

I am developing an Android app, and I want a map of a particular area that I can bundle into my app and use for offline navigation. How is this possible?
I haven't found a solution. I'm using the link below, but it doesn't work.
I display a satellite map. I have put the tile files in the assets folder, but it does not work for me.
public class CustomMapTileProvider implements TileProvider {
private static final int TILE_WIDTH = 256;
private static final int TILE_HEIGHT = 256;
private static final int BUFFER_SIZE = 16 * 1024;
private AssetManager mAssets;
public CustomMapTileProvider(AssetManager assets) {
mAssets = assets;
}
private static final SparseArray<Rect> TILE_ZOOMS = new SparseArray<Rect>() {{
put(8, new Rect(135, 180, 135, 181 ));
put(9, new Rect(270, 361, 271, 363 ));
put(10, new Rect(541, 723, 543, 726 ));
put(11, new Rect(1082, 1447, 1086, 1452));
put(12, new Rect(2165, 2894, 2172, 2905));
put(13, new Rect(4330, 5789, 4345, 5810));
put(14, new Rect(8661, 11578, 8691, 11621));
}};
private boolean hasTile(int x, int y, int zoom) {
Rect b = TILE_ZOOMS.get(zoom);
return b == null ? false : (b.left <= x && x <= b.right && b.top <= y && y <= b.bottom);
}
@Override
public Tile getTile(int x, int y, int zoom) {
y = fixYCoordinate(y, zoom);
if (hasTile(x, y, zoom)) {
byte[] image = readTileImage(x, y, zoom);
return image == null ? null : new Tile(TILE_WIDTH, TILE_HEIGHT, image);
}
else {
return NO_TILE;
}
}
private int fixYCoordinate(int y, int zoom) {
int size = 1 << zoom; // size = 2^zoom
return size - 1 - y;
}
private byte[] readTileImage(int x, int y, int zoom) {
InputStream in = null;
ByteArrayOutputStream buffer = null;
try {
in = mAssets.open(getTileFilename(x, y, zoom));
//in = mAssets.open("world_countries.mbtiles");
buffer = new ByteArrayOutputStream();
int nRead;
byte[] data = new byte[BUFFER_SIZE];
while ((nRead = in.read(data, 0, BUFFER_SIZE)) != -1) {
buffer.write(data, 0, nRead);
}
buffer.flush();
return buffer.toByteArray();
}
catch (IOException e) {
e.printStackTrace();
return null;
}
catch (OutOfMemoryError e) {
e.printStackTrace();
return null;
}
finally {
if (in != null)
try {
in.close();
}
catch (Exception ignored) {
}
if (buffer != null)
try {
buffer.close();
}
catch (Exception ignored) {
}
}
}
private String getTileFilename(int x, int y, int zoom) {
return "map/" + zoom + '/' + x + '/' + y + ".png";
}
}
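One thing the snippet above never shows is attaching the provider to the map. The answer further down in this thread wires it up roughly like this (repeated here for context; LAT, LON and ZOOM are placeholders for your map's starting position):

private void setUpMap() {
    mMap.setMapType(GoogleMap.MAP_TYPE_NONE); // hide Google's own tiles
    mMap.addTileOverlay(new TileOverlayOptions()
            .tileProvider(new CustomMapTileProvider(getResources().getAssets())));
    mMap.moveCamera(CameraUpdateFactory.newLatLngZoom(new LatLng(LAT, LON), ZOOM));
}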

Canvas touch not working on an S4 Android device

The wheel view (CircleView) works fine on most devices, but this error occurs on S4 and Note 3 devices.
The touch is being detected, but it does not fall inside the wedge region.
false - 1 has to be true - 1
The region log is:
My CircleView code is:
public class CircleView extends View implements OnTouchListener{
boolean firstTime = false;
private List<CircleViewBean> mMenuEntries = new ArrayList<CircleViewBean>();
private OnCellTouchListener mOnCellTouchListener = null;
public interface OnCellTouchListener {
public void onTouch(Wedge cell);
}
private Shader mShader;
private Paint mPaint = new Paint(Paint.ANTI_ALIAS_FLAG | Paint.FILTER_BITMAP_FLAG);
private float screen_density = getContext().getResources().getDisplayMetrics().density;
//Radius of inner ring size
private int mMinSize = scalePX(60);
//Radius of outer ring size
private int mMaxSize = scalePX(170);
private int mWedgeQty = 6;
//Center X location of Radial Menu
private int xPosition = scalePX(120);
//Center Y location of Radial Menu
private int yPosition = scalePX(120);
int touchIndex = -1;
private Wedge[] mWedges;
private RectF mViewRect = new RectF();
private int scalePX( int dp_size )
{
return (int) (dp_size * screen_density + 0.5f);
}
public CircleView(Context context) {
this(context, null);
}
public CircleView(Context context, AttributeSet attrs) {
super(context, attrs);
HashMap<String, String> device = Constants.getDeviceDetails(getResources().getDisplayMetrics().heightPixels, getResources().getDisplayMetrics().widthPixels);
mMinSize = Integer.parseInt(device.get("in_arc"));
mMaxSize = Integer.parseInt(device.get("out_arc"));
setBackgroundResource(R.drawable.centre_wheel);
}
private void determineWedges() {
int entriesQty = mMenuEntries.size();
if ( entriesQty > 0) {
mWedgeQty = entriesQty;
float degSlice = 360 / mWedgeQty;
float start_degSlice = 270 - (degSlice/2);
//calculates where to put the images
this.mWedges = new Wedge[mWedgeQty];
double mid = 0, min = 0, max = 0;
for(int i = 0; i < this.mWedges.length; i++) {
this.mWedges[i] = new Wedge(xPosition, yPosition, mMinSize, mMaxSize, (i
* degSlice)+start_degSlice, degSlice, mMenuEntries.get(i).getIndex());
mid = this.mWedges[i].midValue = normalizeAngle( ((i * degSlice) + start_degSlice + degSlice) / 2 );
min = normalizeAngle( (i * degSlice) + start_degSlice );
max = normalizeAngle( (i * degSlice) + start_degSlice + degSlice);
this.mWedges[i].minValue = min;
this.mWedges[i].midValue = mid;
this.mWedges[i].maxValue = max;
mViewRect.union( new RectF( mWedges[i].getWedgeRegion().getBounds() ) );
}
mShader = new RadialGradient(xPosition, yPosition, mMaxSize, new int[] { 0xff595756, 0xffCCC5C3, 0xff878280}, null, Shader.TileMode.MIRROR);
invalidate();
}
}
@Override
public void onDraw(Canvas canvas) {
if(!firstTime){
firstTime = true;
this.xPosition = (int) (getWidth()/2f);
this.yPosition = (int) (getHeight()/2f);
determineWedges();
}
canvas.scale(getWidth() / mViewRect.width(), getHeight() / mViewRect.width(), xPosition, yPosition);
//Saving the canvas and later restoring it so only this image will be rotated.
canvas.save(Canvas.MATRIX_SAVE_FLAG);
canvas.restore();
canvas.save();
canvas.restore();
mPaint.setShader( mShader );
}
private double normalizeAngle(double angle) {
if(angle >= 0) {
while( angle > 360 ) {
angle -= 360;
}
}
else {
while( angle < -360) {
angle += 360;
}
}
return angle;
}
protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
// TODO Auto-generated method stub
int wmode = MeasureSpec.getMode(widthMeasureSpec);
int hmode = MeasureSpec.getMode(heightMeasureSpec);
int wsize = MeasureSpec.getSize(widthMeasureSpec);
int hsize = MeasureSpec.getSize(heightMeasureSpec);
int width = (int)mViewRect.width();
int height = (int) mViewRect.height();
if (wmode == MeasureSpec.EXACTLY) {
width = wsize;
}
if (hmode == MeasureSpec.EXACTLY) {
height = hsize;
}
this.setMeasuredDimension(width, height);
invalidate();
}
public class Wedge extends Path {
private int x, y;
private int InnerSize, OuterSize;
private float StartArc;
private float ArcWidth;
private Region mWedgeRegion;
private int index=0;
public double minValue;
public double midValue;
public double maxValue;
private Wedge(int x, int y, int InnerSize, int OuterSize, float StartArc, float ArcWidth, int category) {
super();
this.index = category;
if (StartArc >= 360) {
StartArc = StartArc-360;
}
minValue = midValue = maxValue = 0;
mWedgeRegion = new Region();
this.x = x; this.y = y;
this.InnerSize = InnerSize;
this.OuterSize = OuterSize;
this.StartArc = StartArc;
this.ArcWidth = ArcWidth;
this.buildPath();
}
public int getCategoryIndex(){
return this.index;
}
public String toString() {
return minValue + " " + midValue + " " + maxValue;
}
/**
*
* @return the bottom rect that will be used for intersection
*/
public Region getWedgeRegion() {
return mWedgeRegion;
}
private void buildPath() {
final RectF rect = new RectF();
final RectF rect2 = new RectF();
//Rectangles values
rect.set(this.x-this.InnerSize, this.y-this.InnerSize, this.x+this.InnerSize, this.y+this.InnerSize);
rect2.set(this.x-this.OuterSize, this.y-this.OuterSize, this.x+this.OuterSize, this.y+this.OuterSize);
this.reset();
//this.moveTo(100, 100);
this.arcTo(rect2, StartArc, ArcWidth);
this.arcTo(rect, StartArc+ArcWidth, -ArcWidth);
this.close();
mWedgeRegion.setPath( this, new Region(0,0,480,800) );
}
}
public boolean addMenuEntry(CircleViewBean menuEntry) {
mMenuEntries.add(menuEntry);
return true;
}
@Override
public boolean onTouchEvent(MotionEvent event) {
if(event.getAction() == MotionEvent.ACTION_UP){
if(mOnCellTouchListener!=null && touchIndex >-1){
int i=0;
for(Wedge day : mWedges) {
if(day.getWedgeRegion().getBounds().contains((int)event.getX(), (int)event.getY()) && touchIndex==i) {
mOnCellTouchListener.onTouch(mWedges[touchIndex]);
break;
}
i++;
}
}
}else if(event.getAction() == MotionEvent.ACTION_DOWN || event.getAction() == MotionEvent.ACTION_MOVE){
int i=0;
for(Wedge day : mWedges) {
if(day.getWedgeRegion().getBounds().contains((int)event.getX(), (int)event.getY())) {
touchIndex = i;
setBackgroundResource(mMenuEntries.get(touchIndex).getIcon());
}
i++;
}
}
return true;
}
public void setOnCellTouchListener(OnCellTouchListener p) {
mOnCellTouchListener = p;
}
public boolean onTouch(View v, MotionEvent event) {
return false;
}
}
First of all, look at line 307. Learn how to read crash logs, because the log says exactly on which line the crash is, and then it shouldn't be too hard to determine what is wrong.
Not knowing what line it is, I guess that mWedges might be null. In onTouch you do for(Wedge day : mWedges), but it is not guaranteed that it isn't null there. You should check whether it is null before doing that.
You set it to a non-null value in determineWedges, but only when there is at least one entry in mMenuEntries. So if there are no entries when onTouch runs, it will crash.
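A minimal guard along those lines (illustrative only):

@Override
public boolean onTouchEvent(MotionEvent event) {
    if (mWedges == null) {
        // determineWedges() has not run yet (no menu entries), so ignore the touch
        return false;
    }
    // ... existing handling ...
}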
At last I found my mistake: this code works on mdpi and hdpi but not on xxhdpi. The reason is
mWedgeRegion.setPath( this, new Region(0,0,480,800) );
The circle exceeds those hardcoded bounds on xxhdpi, which draws a circle of roughly 1000x1000 pixels, so I enlarged the region to the maximum values:
mWedgeRegion.setPath( this, new Region(0,0,720,1200) );
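A hypothetical way to avoid hardcoding the region size at all is to derive it from the display metrics inside buildPath(), so the clip region is large enough on any density (my sketch, not the poster's fix):

// Inside Wedge.buildPath(), instead of new Region(0, 0, 480, 800):
DisplayMetrics dm = getResources().getDisplayMetrics(); // resolves to the outer CircleView
mWedgeRegion.setPath(this, new Region(0, 0, dm.widthPixels, dm.heightPixels));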

TileProvider using local tiles

I would like to use the new TileProvider functionality of the latest Android Maps API (v2) to overlay some custom tiles on the GoogleMap. However as my users will not have internet a lot of the time, I want to keep the tiles stored in a zipfile/folder structure on the device. I will be generating my tiles using Maptiler with geotiffs. My questions are:
What would be the best way to store the tiles on the device?
How would I go about creating a TileProvider that returns local tiles?
You can put the tiles into the assets folder (if the app size stays acceptable) or download them all on first start and put them into device storage (SD card).
You can implement TileProvider like this:
public class CustomMapTileProvider implements TileProvider {
private static final int TILE_WIDTH = 256;
private static final int TILE_HEIGHT = 256;
private static final int BUFFER_SIZE = 16 * 1024;
private AssetManager mAssets;
public CustomMapTileProvider(AssetManager assets) {
mAssets = assets;
}
@Override
public Tile getTile(int x, int y, int zoom) {
byte[] image = readTileImage(x, y, zoom);
return image == null ? null : new Tile(TILE_WIDTH, TILE_HEIGHT, image);
}
private byte[] readTileImage(int x, int y, int zoom) {
InputStream in = null;
ByteArrayOutputStream buffer = null;
try {
in = mAssets.open(getTileFilename(x, y, zoom));
buffer = new ByteArrayOutputStream();
int nRead;
byte[] data = new byte[BUFFER_SIZE];
while ((nRead = in.read(data, 0, BUFFER_SIZE)) != -1) {
buffer.write(data, 0, nRead);
}
buffer.flush();
return buffer.toByteArray();
} catch (IOException e) {
e.printStackTrace();
return null;
} catch (OutOfMemoryError e) {
e.printStackTrace();
return null;
} finally {
if (in != null) try { in.close(); } catch (Exception ignored) {}
if (buffer != null) try { buffer.close(); } catch (Exception ignored) {}
}
}
private String getTileFilename(int x, int y, int zoom) {
return "map/" + zoom + '/' + x + '/' + y + ".png";
}
}
And now you can use it with your GoogleMap instance:
private void setUpMap() {
mMap.setMapType(GoogleMap.MAP_TYPE_NONE);
mMap.addTileOverlay(new TileOverlayOptions().tileProvider(new CustomMapTileProvider(getResources().getAssets())));
CameraUpdate upd = CameraUpdateFactory.newLatLngZoom(new LatLng(LAT, LON), ZOOM);
mMap.moveCamera(upd);
}
In my case I also had a problem with the y coordinate of tiles generated by MapTiler, but I handled it by adding this method to CustomMapTileProvider:
/**
* Fixing tile's y index (reversing order)
*/
private int fixYCoordinate(int y, int zoom) {
int size = 1 << zoom; // size = 2^zoom
return size - 1 - y;
}
and calling it from the getTile() method like this:
@Override
public Tile getTile(int x, int y, int zoom) {
y = fixYCoordinate(y, zoom);
...
}
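As a quick sanity check of the reversal: at zoom 3 the map is 2^3 = 8 tiles tall, so a tile with y = 2 in the generator's numbering ends up at 8 - 1 - 2 = 5 after fixYCoordinate(). This matches the reversed row order mentioned in the comment above, where the tile generator counts rows from the bottom while the Maps API counts them from the top.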
[Upd]
If you know the exact area of your custom map, you should return NO_TILE for missing tiles from the getTile(...) method.
This is how I did it:
private static final SparseArray<Rect> TILE_ZOOMS = new SparseArray<Rect>() {{
put(8, new Rect(135, 180, 135, 181 ));
put(9, new Rect(270, 361, 271, 363 ));
put(10, new Rect(541, 723, 543, 726 ));
put(11, new Rect(1082, 1447, 1086, 1452));
put(12, new Rect(2165, 2894, 2172, 2905));
put(13, new Rect(4330, 5789, 4345, 5810));
put(14, new Rect(8661, 11578, 8691, 11621));
}};
@Override
public Tile getTile(int x, int y, int zoom) {
y = fixYCoordinate(y, zoom);
if (hasTile(x, y, zoom)) {
byte[] image = readTileImage(x, y, zoom);
return image == null ? null : new Tile(TILE_WIDTH, TILE_HEIGHT, image);
} else {
return NO_TILE;
}
}
private boolean hasTile(int x, int y, int zoom) {
Rect b = TILE_ZOOMS.get(zoom);
return b == null ? false : (b.left <= x && x <= b.right && b.top <= y && y <= b.bottom);
}
The possibility of adding custom tile providers in the new API (v2) is great; however, you mention that your users are mostly offline. If a user is offline when first launching the application, you cannot use the new API, as it requires the user to be online at least once (apparently to build a cache); otherwise it will only display a black screen.
EDIT 2/22-14:
I recently came across the same issue again: custom tiles for an app that had to work offline. I solved it by adding an invisible (width/height 0/0) MapView to an initial view where the client had to download some content. This seems to work, and it allows me to use a MapView in offline mode later on.
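A rough sketch of that workaround, creating the zero-size MapView programmatically (the original was presumably declared in XML; the exact lifecycle wiring depends on your Activity):

// Zero-size MapView added only so the Maps runtime can initialise while online.
MapView warmUpMap = new MapView(this);
((ViewGroup) findViewById(android.R.id.content))
        .addView(warmUpMap, new ViewGroup.LayoutParams(0, 0));
warmUpMap.onCreate(null);
warmUpMap.onResume(); // forward onPause()/onDestroy() from the Activity as well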
This is how I implemented this in Kotlin:
class LocalTileProvider : TileProvider
{
override fun getTile(x: Int, y: Int, zoom: Int): Tile?
{
// This is for my case only
if (zoom > 11)
return TileProvider.NO_TILE
val path = "${getImagesFolder()}/tiles/$zoom/$x/$y/filled.png"
val file = File(path)
if (!file.exists())
return TileProvider.NO_TILE
return try {
Tile(TILE_SIZE, TILE_SIZE, file.readBytes())
}
catch (e: Exception)
{
TileProvider.NO_TILE
}
}
companion object {
const val TILE_SIZE = 512
}
}

How to bound a set of points (custom shape) and handle touch events on it in Android

I am able to get my bitmap's set of points (as an array) using this link.
Now my question is: how can I bound these points as a shape/region? That is, when the user touches the area inside my bounded points, I want to move the object (shape) accordingly. The link above returns the points of the colored bitmap (it removes the transparent part); only the colored points are returned as an array.
This is my code:
1) CustomShape.java
public class CustomShape {
private final Context context;
Bitmap bitmap;
int width, height;
int[] pixels;
private final ArrayList<Point> points = new ArrayList<Point>();
public CustomShape(Context context) {
// TODO Auto-generated constructor stub
// super(context);
this.context = context;
bitmap = BitmapFactory.decodeResource(context.getResources(),
R.drawable.ic_menu_balloon);
width = bitmap.getWidth();
height = bitmap.getHeight();
pixels = new int[width * height];
bitmap.getPixels(pixels, 0, width, 0, 0, width, height);
getActualBitmap();
}
public ArrayList<Point> getPoints(){
return points;
}
public void getActualBitmap() {
for (int x = 0; x < width; x+=2) {
int firstY = -1, lastY = -1;
for (int y = 0; y < height; y+=2) {
boolean transparent = (pixels[y * width + x] == Color.TRANSPARENT);
if (!transparent) {
if (firstY == -1) {
firstY = y;
}
lastY = y;
}
}
if (firstY != -1) {
points.add(new Point(x, firstY));
points.add(new Point(x, lastY));
}
}
}
}
2) MyShape.java
class MyShape{
CustomShape customShape ;
Point points[];
private int x, y;
Path path = new Path();
public MyShape(Context context) {
customShape = new CustomShape(ScaleTestActivity.this);
points = new Point[customShape.getPoints().size()];
for(int i=0;i<customShape.getPoints().size();i++){
points[i] = new Point();
points[i] = customShape.getPoints().get(i);
}
}
public Path getPath(){
return path;
}
public void onDraw(Canvas canvas) {
// TODO Auto-generated method stub
Paint paint = new Paint();
paint.setColor(Color.WHITE);
for(int i =0 ;i<points.length;i++){
Point point = new Point(points[i].x + getX(), points[i].y + getY());
path.lineTo(points[i].x, points[i].y);
canvas.drawPoint(point.x,point.y,paint);
}
}
public void setX(int x) {
this.x = x;
}
public int getX() {
return x;
}
public void setY(int y) {
this.y = y;
}
public int getY() {
return y;
}
}
}
3) MainPanel.java
class MainPanel extends View{
Context context;
MyShape myShape;
boolean flag = false;
public MainPanel(Context context) {
super(context);
this.context = context;
myShape = new MyShape(context);
}
@Override
protected void onDraw(Canvas canvas) {
super.onDraw(canvas);
canvas.drawColor(Color.RED);
myShape.onDraw(canvas);
}
@Override
public boolean onTouchEvent(MotionEvent event) {
// TODO Auto-generated method stub
int x,y;
x = (int)event.getX();
y = (int)event.getY();
Point point = new Point(x, y);
switch (event.getAction()) {
case MotionEvent.ACTION_DOWN:
myShape.setX(x);
myShape.setY(y);
RectF rectF = new RectF();
Path path = myShape.getPath();
path.computeBounds(rectF, true);
Region region = new Region();
region.setPath(path, new Region((int) rectF.left, (int) rectF.top, (int) rectF.right, (int) rectF.bottom));
if(region.contains(x,y)){
flag = true;
Log.i("System out","onDown");
}
break;
case MotionEvent.ACTION_MOVE:
Log.i("System out","onMove : "+flag);
if(flag){
myShape.setX(x);
myShape.setY(y);
Log.i("System out","onMove");
}
break;
case MotionEvent.ACTION_UP:
// myShape.setX(x);
// myShape.setY(y);
flag = false;
Log.i("System out","onUp");
break;
default:
break;
}
invalidate();
return true;
}
}
4) ScaleTestActivity.java
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(new MainPanel(this));
}
I use a Polygon class to detect touches on rotated bitmaps. It's based mostly on information and code from this site http://alienryderflex.com/polygon/. This should work with your code.
public class Polygon {
// Polygon coodinates.
private final int[] polyY, polyX;
// Number of sides in the polygon.
private final int polySides;
/**
* Default constructor.
* @param px Polygon x coords.
* @param py Polygon y coords.
* @param ps Polygon sides count.
*/
public Polygon( final int[] px, final int[] py, final int ps ) {
polyX = px;
polyY = py;
polySides = ps;
}
/**
* Checks if the Polygon contains a point.
* @see "http://alienryderflex.com/polygon/"
* @param x Point horizontal pos.
* @param y Point vertical pos.
* @return Point is in Poly flag.
*/
public boolean contains( final float x, final float y ) {
boolean oddTransitions = false;
for( int i = 0, j = polySides -1; i < polySides; j = i++ ) {
if( ( polyY[ i ] < y && polyY[ j ] >= y ) || ( polyY[ j ] < y && polyY[ i ] >= y ) ) {
if( polyX[ i ] + ( y - polyY[ i ] ) / ( polyY[ j ] - polyY[ i ] ) * ( polyX[ j ] - polyX[ i ] ) < x ) {
oddTransitions = !oddTransitions;
}
}
}
return oddTransitions;
}
}
You could add this constructor to help you convert a Point array to a Polygon object.
public Polygon(Point[] points){
polySides = points.length;
polyY = new int[polySides];
polyX = new int[polySides];
for(int i = 0; i < polySides; i++){
polyY[i] = points[i].y;
polyX[i] = points[i].x;
}
}
You might be able to use it in your MyShape class with this method.
public boolean isTouched(final float X, final float Y){
final Polygon p = new Polygon(points);
return p.contains(X, Y);
}
Now if you have an odd shape, you should be able to detect exactly whether the user touches it. I have used this method many times.
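For instance, MainPanel.onTouchEvent() could use it in place of the bounding-rect Region test (a sketch; it assumes the polygon points are in the same coordinate space as the touch event):

case MotionEvent.ACTION_DOWN:
    if (myShape.isTouched(event.getX(), event.getY())) {
        flag = true; // start dragging only when the touch hits the actual shape
    }
    break;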
Are you looking for a way to tell whether a touch event falls on the non-transparent portion of your drawn bitmap? If so, why don't you just map the touch coordinate to the proper pixel on the bitmap and test the color?
And if that's the case, then you can skip all the path clipping stuff, since the link you posted was only doing that to overcome emulator inefficiencies.
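A minimal sketch of that pixel-test idea (names are illustrative): translate the touch point into the bitmap's coordinate space and check whether the pixel there is opaque.

// Returns true when the touch lands on a non-transparent pixel of the bitmap.
boolean touchesOpaquePixel(Bitmap bmp, int shapeLeft, int shapeTop, float touchX, float touchY) {
    int bx = (int) touchX - shapeLeft;
    int by = (int) touchY - shapeTop;
    if (bx < 0 || by < 0 || bx >= bmp.getWidth() || by >= bmp.getHeight()) {
        return false; // outside the bitmap entirely
    }
    return Color.alpha(bmp.getPixel(bx, by)) != 0;
}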
It's a bit complicated, so I am not going to provide the full source, but I will give you the idea.
You need to convert your shape into a collection of triangles; then, on touch, find the nearest point of your shape and check whether you are inside that point's triangle.
For searching and sorting points you can use a red-black tree structure.
The search should eventually be O(log N), and creating the shape structure should be O(N log N).
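The per-triangle test at the end of that approach can be as simple as the classic same-side (cross-product sign) check; a self-contained sketch:

// True when point (px, py) lies inside or on an edge of triangle ABC.
static boolean pointInTriangle(float px, float py,
                               float ax, float ay, float bx, float by, float cx, float cy) {
    float d1 = cross(px, py, ax, ay, bx, by);
    float d2 = cross(px, py, bx, by, cx, cy);
    float d3 = cross(px, py, cx, cy, ax, ay);
    boolean hasNeg = d1 < 0 || d2 < 0 || d3 < 0;
    boolean hasPos = d1 > 0 || d2 > 0 || d3 > 0;
    return !(hasNeg && hasPos); // all signs agree => inside
}

// Signed cross product telling which side of edge AB the point P lies on.
static float cross(float px, float py, float ax, float ay, float bx, float by) {
    return (px - bx) * (ay - by) - (ax - bx) * (py - by);
}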

Full bitmap is not being populated with pixel data after parallelization.

I have an app that processes a bitmap with a spherize distortion. You can touch the screen and set the radius of a circle that will contain the distortion. Once the distort button is pressed, a subset bitmap is created the same size as the radius, and this subset bitmap is sent for processing. Once the subset is distorted, it is put back on the original bitmap as an overlay using the x,y coords from the original touch event.
Everything works fine except that the last row of pixels (across the bottom) of the subset bitmap is not populated with pixel data. It looks like there is a black line at the bottom of the subset bitmap. The distortion class uses parallel programming: it checks the hardware at runtime to find out how many processors are available and then splits the bitmap up over the processors accordingly. I've had help with the parallelization and am not sure how to find out why the black line is present. The looping seems to be in order; any ideas? Thanks in advance, Matt.
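As an aside on the splitting step described above, here is a minimal sketch of ceiling-division sectioning (illustrative only, not the poster's code). Note that in the posted createBitmapSections() the expression jMax/processors is integer division, so the Math.ceil has no effect and the final rows of the bitmap may never be assigned to any section, which is worth checking when hunting for the missing bottom row.

// Split `height` rows into `processors` contiguous sections; the last section
// absorbs any remainder so every row is covered exactly once.
static int[] sectionBounds(int height, int processors) {
    int sectionSize = (height + processors - 1) / processors; // ceiling division
    int[] sections = new int[processors + 1];
    for (int i = 0; i <= processors; i++) {
        sections[i] = Math.min(i * sectionSize, height);
    }
    return sections;
}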
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.FutureTask;
import android.graphics.Bitmap;
import android.os.Debug;
import android.util.Log;
public class MultiRuntimeProcessorFilter {
private static final String TAG = "mrpf";
private int x = 0;
private Bitmap input = null;
private int radius;
public void createBitmapSections(int nOp, int[] sections){
int processors = nOp;
int jMax = input.getHeight();
int aSectionSize = (int) Math.ceil(jMax/processors);
Log.e(TAG, "++++++++++ sections size = "+aSectionSize);
int k = 0;
for(int h=0; h<processors+1; h++){
sections[h] = k;
k+= aSectionSize;
}
}// end of createBitmapSections()
@SuppressWarnings("unchecked")
public Bitmap barrel (Bitmap input, float k, int r){
this.radius = r;
this.input = input;
int []arr = new int[input.getWidth()*input.getHeight()];
int nrOfProcessors = Runtime.getRuntime().availableProcessors();
int[] sections = new int[nrOfProcessors+1];
createBitmapSections(nrOfProcessors,sections);
ExecutorService threadPool = Executors.newFixedThreadPool(nrOfProcessors);
for(int g=0; g<sections.length;g++){
Log.e(TAG, "++++++++++ sections= "+sections[g]);
}
// ExecutorService threadPool = Executors.newFixedThreadPool(nrOfProcessors);
Object[] task = new Object[nrOfProcessors];
for(int z = 0; z < nrOfProcessors; z++){
task[z] = (FutureTask<PartialResult>) threadPool.submit(new PartialProcessing(sections[z], sections[z+1] - 1, input, k));
Log.e(TAG, "++++++++++ task"+z+"= "+task[z].toString());
}
PartialResult[] results = new PartialResult[nrOfProcessors];
try{
for(int t = 0; t < nrOfProcessors; t++){
results[t] = ((FutureTask<PartialResult>) task[t]).get();
results[t].fill(arr);
}
}catch(Exception e){
e.printStackTrace();
}
Bitmap dst2 = Bitmap.createBitmap(arr,input.getWidth(),input.getHeight(),input.getConfig());
return dst2;
}//end of barrel()
public class PartialResult {
int startP;
int endP;
int[] storedValues;
public PartialResult(int startp, int endp, Bitmap input){
this.startP = startp;
this.endP = endp;
this.storedValues = new int[input.getWidth()*input.getHeight()];
}
public void addValue(int p, int result) {
storedValues[p] = result;
}
public void fill(int[] arr) {
for (int p = startP; p < endP; p++){
for(int b=0;b<radius;b++,x++)
arr[x] = storedValues[x];
}
Log.e(TAG, "++++++++++ x ="+x);
}
}//end of partialResult
public class PartialProcessing implements Callable<PartialResult> {
int startJ;
int endJ;
private int[] scalar;
private float xscale;
private float yscale;
private float xshift;
private float yshift;
private float thresh = 1;
private int [] s1;
private int [] s2;
private int [] s3;
private int [] s4;
private int [] s;
private Bitmap input;
private float k;
public PartialProcessing(int startj, int endj, Bitmap input, float k) {
this.startJ = startj;
this.endJ = endj;
this.input = input;
this.k = k;
s = new int[4];
scalar = new int[4];
s1 = new int[4];
s2 = new int[4];
s3 = new int[4];
s4 = new int[4];
}
int [] getARGB(Bitmap buf,int x, int y){
int rgb = buf.getPixel(y, x); // Returns by default ARGB.
// int [] scalar = new int[4];
// scalar[0] = (rgb >>> 24) & 0xFF;
scalar[1] = (rgb >>> 16) & 0xFF;
scalar[2] = (rgb >>> 8) & 0xFF;
scalar[3] = (rgb >>> 0) & 0xFF;
return scalar;
}
float getRadialX(float x,float y,float cx,float cy,float k){
x = (x*xscale+xshift);
y = (y*yscale+yshift);
float res = x+((x-cx)*k*((x-cx)*(x-cx)+(y-cy)*(y-cy)));
return res;
}
float getRadialY(float x,float y,float cx,float cy,float k){
x = (x*xscale+xshift);
y = (y*yscale+yshift);
float res = y+((y-cy)*k*((x-cx)*(x-cx)+(y-cy)*(y-cy)));
return res;
}
float calc_shift(float x1,float x2,float cx,float k){
float x3 = (float)(x1+(x2-x1)*0.5);
float res1 = x1+((x1-cx)*k*((x1-cx)*(x1-cx)));
float res3 = x3+((x3-cx)*k*((x3-cx)*(x3-cx)));
if(res1>-thresh && res1 < thresh)
return x1;
if(res3<0){
return calc_shift(x3,x2,cx,k);
}
else{
return calc_shift(x1,x3,cx,k);
}
}
void sampleImage(Bitmap arr, float idx0, float idx1)
{
// s = new int [4];
if(idx0<0 || idx1<0 || idx0>(arr.getHeight()-1) || idx1>(arr.getWidth()-1)){
s[0]=0;
s[1]=0;
s[2]=0;
s[3]=0;
return;
}
float idx0_fl=(float) Math.floor(idx0);
float idx0_cl=(float) Math.ceil(idx0);
float idx1_fl=(float) Math.floor(idx1);
float idx1_cl=(float) Math.ceil(idx1);
s1 = getARGB(arr,(int)idx0_fl,(int)idx1_fl);
s2 = getARGB(arr,(int)idx0_fl,(int)idx1_cl);
s3 = getARGB(arr,(int)idx0_cl,(int)idx1_cl);
s4 = getARGB(arr,(int)idx0_cl,(int)idx1_fl);
float x = idx0 - idx0_fl;
float y = idx1 - idx1_fl;
// s[0]= (int) (s1[0]*(1-x)*(1-y) + s2[0]*(1-x)*y + s3[0]*x*y + s4[0]*x*(1-y));
s[1]= (int) (s1[1]*(1-x)*(1-y) + s2[1]*(1-x)*y + s3[1]*x*y + s4[1]*x*(1-y));
s[2]= (int) (s1[2]*(1-x)*(1-y) + s2[2]*(1-x)*y + s3[2]*x*y + s4[2]*x*(1-y));
s[3]= (int) (s1[3]*(1-x)*(1-y) + s2[3]*(1-x)*y + s3[3]*x*y + s4[3]*x*(1-y));
}
@Override public PartialResult call() {
PartialResult partialResult = new PartialResult(startJ, endJ,input);
float centerX=input.getWidth()/2; //center of distortion
float centerY=input.getHeight()/2;
int width = input.getWidth(); //image bounds
int height = input.getHeight();
xshift = calc_shift(0,centerX-1,centerX,k);
float newcenterX = width-centerX;
float xshift_2 = calc_shift(0,newcenterX-1,newcenterX,k);
yshift = calc_shift(0,centerY-1,centerY,k);
float newcenterY = height-centerY;
float yshift_2 = calc_shift(0,newcenterY-1,newcenterY,k);
xscale = (width-xshift-xshift_2)/width;
yscale = (height-yshift-yshift_2)/height;
int p = startJ*radius;
int origPixel = 0;
int color = 0;
int i;
for (int j = startJ; j < endJ; j++){
for ( i = 0; i < width; i++, p++){
origPixel = input.getPixel(i,j);
float x = getRadialX((float)j,(float)i,centerX,centerY,k);
float y = getRadialY((float)j,(float)i,centerX,centerY,k);
sampleImage(input,x,y);
color = ((s[1]&0x0ff)<<16)|((s[2]&0x0ff)<<8)|(s[3]&0x0ff);
//Log.e(TAG, "radius = "+radius);
if(((i-centerX)*(i-centerX) + (j-centerY)*(j-centerY)) <= radius*(radius/4)){
partialResult.addValue(p, color);
}else{
partialResult.addValue(p, origPixel);
}
}//end of inner for
}//end of outer for
return partialResult;
}//end of call
}// end of partialprocessing
}//end of MultiProcesorFilter
[Update] I'll post the view class that calls the barrel method. This class gets the touch events and sets the radius of the distortion prior to processing, so you can see how everything is set up before the distortion is applied.
public class TouchView extends View{
private File tempFile;
private byte[] imageArray;
private Bitmap bgr;
private Bitmap crop;
private Bitmap crop2;
private Bitmap overLay;
private Bitmap overLay2;
private Paint pTouch;
private float centreX;
private float centreY;
private float centreA;
private float centreB;
private Boolean xyFound = false;
private Boolean abFound = false;
private int Progress = 1;
private static final String TAG = "*********TouchView";
private Filters f = null;
private Filters f2 = null;
private boolean bothCirclesInPlace = false;
private MultiProcessorFilter mpf;
private MultiProcessorFilter mpf2;
private MultiRuntimeProcessorFilter mrpf;
private MultiRuntimeProcessorFilter mrpf2;
private int radius = 50;
protected boolean isLocked = false;
protected boolean isSaved = false;
protected byte [] data;
private float distance1;
private float distance2;
public TouchView(Context context) {
super(context);
}
public TouchView(Context context, AttributeSet attr) {
super(context,attr);
Log.e(TAG, "++++++++++ inside touchview constructor");
tempFile = new File(Environment.getExternalStorageDirectory().
getAbsolutePath() + "/"+"image.jpeg");
imageArray = new byte[(int)tempFile.length()];
// new Thread(new Runnable() {
// public void run() {
try{
InputStream is = new FileInputStream(tempFile);
BufferedInputStream bis = new BufferedInputStream(is);
DataInputStream dis = new DataInputStream(bis);
int i = 0;
while (dis.available() > 0 ) {
imageArray[i] = dis.readByte();
i++;
}
dis.close();
} catch (Exception e) {
e.printStackTrace();
}
// }
// }).start();
Bitmap bm = BitmapFactory.decodeByteArray(imageArray, 0, imageArray.length);
if(bm == null){
Log.e(TAG, "bm = null");
}else{
Log.e(TAG, "bm = not null");
}
bgr = bm.copy(bm.getConfig(), true);
overLay = null;
overLay2 = null;
bm.recycle();
pTouch = new Paint(Paint.ANTI_ALIAS_FLAG);
// pTouch.setXfermode(new PorterDuffXfermode(Mode.SRC_OUT));
pTouch.setColor(Color.RED);
pTouch.setStyle(Paint.Style.STROKE);
}// end of touchView constructor
public void findCirclePixels(){
//f = new Filters();
// f2 = new Filters();
//mpf = new MultiProcessorFilter();
//mpf2 = new MultiProcessorFilter();
mrpf = new MultiRuntimeProcessorFilter();
mrpf2 = new MultiRuntimeProcessorFilter();
crop = Bitmap.createBitmap(bgr,Math.max((int)centreX-radius,0),Math.max((int)centreY-radius,0),radius*2,radius*2);
crop2 = Bitmap.createBitmap(bgr,Math.max((int)centreA-radius,0),Math.max((int)centreB-radius,0),radius*2,radius*2);
new Thread(new Runnable() {
public void run() {
float prog = (float)Progress/150001;
// final Bitmap bgr3 = f.barrel(crop,prog);
// final Bitmap bgr4 = f2.barrel(crop2,prog);
//final Bitmap bgr3 = mpf.barrel(crop,prog);
// final Bitmap bgr4 = mpf2.barrel(crop2,prog);
final Bitmap bgr3 = mrpf.barrel(crop,prog,radius*2);
final Bitmap bgr4 = mrpf2.barrel(crop2,prog, radius*2);
TouchView.this.post(new Runnable() {
public void run() {
TouchView.this.overLay = bgr3;
TouchView.this.overLay2 = bgr4;
TouchView.this.invalidate();
}
});
}
}).start();
}// end of findCirclePixels()
@Override
public boolean onTouchEvent(MotionEvent ev) {
switch (ev.getAction()) {
case MotionEvent.ACTION_DOWN: {
int w = getResources().getDisplayMetrics().widthPixels;
int h = getResources().getDisplayMetrics().heightPixels;
if(ev.getX() <radius || ev.getX() > w - radius ){
// Log.e(TAG, "touch event is too near width edge!!!!!!!!!!");
showToastMessage("You touched too near the screen edge");
break;
}
if(ev.getY() <radius || ev.getY() > h - radius ){
// Log.e(TAG, "touch event is too near height edge!!!!!!!!!!");
showToastMessage("You touched too near the screen edge");
break;
}
distance1 = (float) Math.sqrt(Math.pow(ev.getX() - centreX, 2.0) + Math.pow(ev.getY() - centreY, 2.0));
distance2 = (float) Math.sqrt(Math.pow(ev.getX() - centreA, 2.0) + Math.pow(ev.getY() - centreB, 2.0));
Log.e(TAG, "dist1 = "+distance1 +" distance2 = " + distance2);
if(isLocked == false){
if(abFound == false){
centreA = (int) ev.getX();
centreB = (int) ev.getY();
abFound = true;
invalidate();
}
if(xyFound == false){
centreX = (int) ev.getX();
centreY = (int) ev.getY();
xyFound = true;
invalidate();
}
if(abFound == true && xyFound == true){
bothCirclesInPlace = true;
}
break;
}
}
case MotionEvent.ACTION_MOVE: {
if(isLocked == false){
/*if(xyFound == false){
centreX = (int) ev.getX()-70;
centreY = (int) ev.getY()-70;
xyFound = true;
}else{
centreA = (int) ev.getX()-70;
centreB = (int) ev.getY()-70;
bothCirclesInPlace = true;
invalidate();
}
*/
if(distance1 < distance2){
centreX = (int) ev.getX();
centreY = (int) ev.getY();
xyFound = true;
invalidate();
}else{
centreA = (int) ev.getX();
centreB = (int) ev.getY();
bothCirclesInPlace = true;
invalidate();
}
break;
}
}
case MotionEvent.ACTION_UP:
break;
}
return true;
}//end of onTouchEvent
public void initSlider(final HorizontalSlider slider)
{
slider.setOnProgressChangeListener(changeListener);
}
private OnProgressChangeListener changeListener = new OnProgressChangeListener() {
@Override
public void onProgressChanged(View v, int progress) {
if(isLocked == true){
setProgress(progress);
}else{
Toast.makeText(TouchView.this.getContext(), "press lock before applying distortion ", Toast.LENGTH_SHORT).show();
}
}
};
@Override
public void onDraw(Canvas canvas){
super.onDraw(canvas);
Log.e(TAG, "******about to draw bgr ");
canvas.drawBitmap(bgr, 0, 0, null);
if(isSaved == false){
if (isLocked == true && bothCirclesInPlace == true){
if(overLay != null)
canvas.drawBitmap(overLay, centreX-radius, centreY-radius, null);
if(overLay2 != null)
canvas.drawBitmap(overLay2, centreA-radius, centreB-radius, null);
}
if(bothCirclesInPlace == true && isLocked == false){
canvas.drawCircle(centreX, centreY, radius,pTouch);
canvas.drawCircle(centreA, centreB, radius,pTouch);
}
}else{
// String mFilePath : Absolute Path of the file to be saved
// Bitmap mBitmap1 : First bitmap. This goes as background.
// Bitmap mCBitmap : Bitmap associated with the Canvas. All draws on the canvas are drawn into this bitmap.
// Bitmap mBitmap2 : Second bitmap. This goes on top of first (in this example serves as foreground.
// Paint mPaint1 : Paint to draw first bitmap
// Paint mPaint2 : Paint to draw second bitmap on top of first bitmap
isSaved = false;
Bitmap mCBitmap = Bitmap.createBitmap(bgr.getWidth(), bgr.getHeight(), bgr.getConfig());
Canvas tCanvas = new Canvas(mCBitmap);
tCanvas.drawBitmap(bgr, 0, 0, null);
if(overLay != null)
tCanvas.drawBitmap(overLay, centreX-radius, centreY-radius, null);
if(overLay2 != null)
tCanvas.drawBitmap(overLay2, centreA-radius, centreB-radius, null);
canvas.drawBitmap(mCBitmap, 0, 0, null);
ByteArrayOutputStream bos = new ByteArrayOutputStream();
mCBitmap.compress(CompressFormat.JPEG, 100 /*ignored for PNG*/, bos);
data = bos.toByteArray();
try {
bos.flush();
bos.close();
} catch (IOException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
}
try {
bos.flush();
bos.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
if ( data == null){
Log.e(TAG, "data in touchview before save clicked is null");
}else{
Log.e(TAG, "data in touchview before saved clicked is not null");
}
}
}//end of onDraw
protected void setProgress(int progress2) {
Log.e(TAG, "***********in SETPROGRESS");
this.Progress = progress2;
findCirclePixels();
}
public int getRadius() {
return radius;
}
public void setRadius(int r) {
radius = r;
invalidate();
}
public void showToastMessage(String mess){
Toast.makeText(TouchView.this.getContext(), mess.toString(), Toast.LENGTH_SHORT).show();
}
}
My guess would be that when the bottom of the image is processed, the operation operates partially on the input image, and partially outside of the image, due to the radius in your barrel method. Edges can often cause issues when operating outside the bounds of an actual image, giving 0 as a result, which can cause a black line...
I suggest trying to increase the size of your image:
@SuppressWarnings("unchecked")
public Bitmap barrel (Bitmap input, float k, int r){
this.radius = r;
this.input = input;
// Add an offset to the width and height equal to the radius
// To avoid performing processing outside the bounds of the input image
int []arr = new int[(input.getWidth() + this.radius) * (input.getHeight() + this.radius)];
// Continue...
Again, this is my first guess and I have no time to check right now, but investigating the edge first would be my recommendation.
Just a guess: what happens if you put this?
BitmapDrawable bmpd = new BitmapDrawable(input);
int []arr = new int[(bmpd.getIntrinsicWidth() + this.radius) * (bmpd.getIntrinsicHeight() + this.radius)];
Your problem most likely has to do with your assumed coordinate system of the image and of the spherize algorithm.
See MathWorks Image Coordinate Systems
I expect that you are treating your input/output images according to the Pixel Indices method, but the spherize algorithm is processing your data using the Spatial Coordinate System. This often causes the outermost border of a processed image to be blank because the algorithm has translated your image up and to the left by 0.5 pixels. Coordinate 3 in the original system is now 3.5 in the new system and has fallen outside the bounds of computation.
This is actually a huge problem in 2D to 3D image processing algorithms as the projection between the two spaces is not exactly trivial and tiny implementation differences cause noticeable problems. Notice how the Pixel Indices coordinate system is 3x3, but the Spatial Coordinate system is essentially 4x4.
Try setting your spherize barrel to be width+1/height+1 instead of width/height and see if that fills out your missing row.
