I want to connect two images in Android. For example,
I want them to be like this,
I have already done it by hard-coding. First I find the upper-left point of the green image and then the lowest-left point of the green side. I do this with the touch event's event.getRawX() and event.getRawY() values, so I know the distance in x and y between those two points. I do the same for the red one. Now, when the green piece is moved close to the red one, I just check whether the upper-left point of the red piece is near the lower-left point of the green piece; if so, I translate the green piece/red piece onto the other one. But this hard-coded calculation will fail on a tablet of the same size or a phone with a different resolution. I just want to know how to generalize the solution. Thanks.
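For what it's worth, one way to make such a check resolution-independent is to store each connector point as a fraction of the piece's displayed size and to express the snap threshold in dp rather than raw pixels. A minimal sketch (the names and fraction values below are hypothetical, not from my code):
// Hypothetical sketch: connector points as fractions of the piece's size,
// so the same numbers work at any screen density or resolution.
float topOffsetXFraction = 0.12f;   // measured once from the source artwork
float topOffsetYFraction = 0.0f;
int topConnectorX(ImageView piece) {
    return piece.getLeft() + Math.round(piece.getWidth() * topOffsetXFraction);
}
int topConnectorY(ImageView piece) {
    return piece.getTop() + Math.round(piece.getHeight() * topOffsetYFraction);
}
// Snap threshold expressed in dp (~12dp here) rather than pixels.
int snapThresholdPx(Context context) {
    return Math.round(12 * context.getResources().getDisplayMetrics().density);
}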
Edit: Below are my GameActivity and ImageInfo classes, where I try to connect the images. I actually have 6 images like this, and I want to connect them: image 1 connects to image 2, image 2 to image 3, and so on.
GameActivity class
package com.example.JigSawPuzzle;
import com.example.test.R;
import android.annotation.SuppressLint;
import android.app.Activity;
import android.app.AlertDialog;
import android.app.Dialog;
import android.content.Context;
import android.content.DialogInterface;
import android.content.Intent;
import android.content.pm.ActivityInfo;
import android.content.res.Configuration;
import android.graphics.Point;
import android.media.AudioManager;
import android.media.SoundPool;
import android.os.Bundle;
import android.util.DisplayMetrics;
import android.util.Log;
import android.view.Display;
import android.view.MotionEvent;
import android.view.Surface;
import android.view.View;
import android.view.View.OnClickListener;
import android.view.View.OnTouchListener;
import android.view.ViewGroup.LayoutParams;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.RelativeLayout;
import android.widget.Toast;
@SuppressLint("NewApi")
public class GameActivity extends Activity implements OnTouchListener
{
static double screenInches;
int touchStartX = 0;
int touchStartY = 0;
int diffX = 0;
int diffY = 0;
int attachedPieces=0;
int imageX = 0;
int imageY = 0;
int height,width;
boolean [] flag = new boolean[7];
boolean isPortait;
RelativeLayout relataivelayoutLayout;
RelativeLayout.LayoutParams paramsA,paramsB,paramsC,paramsD,paramsE,paramsF;
ImageView imageA,imageB,imageC,imageD,imageE,imageF;
ImageInfo [] imageInfoArray = new ImageInfo[7];
// added for sound effect
private SoundPool soundPool;
private int correctPieceAttachSoundId,gameFinishSoundId;
private boolean loaded = false;
//for 10inch landscape height = 752, width = 1280; portrait height = 1232 width = 800
@Override
protected void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
isPortait = (getResources().getConfiguration().orientation == Configuration.ORIENTATION_LANDSCAPE) ? false : true;
Display display = getWindowManager().getDefaultDisplay();
height = display.getHeight();
width = display.getWidth();
imageA = new ImageView(this);
imageB = new ImageView(this);
imageC = new ImageView(this);
imageD = new ImageView(this);
imageE = new ImageView(this);
imageF = new ImageView(this);
DisplayMetrics dm = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getMetrics(dm);
double x = Math.pow(dm.widthPixels/dm.xdpi,2);
double y = Math.pow(dm.heightPixels/dm.ydpi,2);
screenInches = Math.sqrt(x+y);
if(screenInches>9.0)
{
imageA.setBackgroundResource(R.drawable.a);
imageB.setBackgroundResource(R.drawable.b);
imageC.setBackgroundResource(R.drawable.c);
imageD.setBackgroundResource(R.drawable.d);
imageE.setBackgroundResource(R.drawable.e);
imageF.setBackgroundResource(R.drawable.f);
}
else
{
imageA.setBackgroundResource(R.drawable.aa);
imageB.setBackgroundResource(R.drawable.bb);
imageC.setBackgroundResource(R.drawable.cc);
imageD.setBackgroundResource(R.drawable.dd);
imageE.setBackgroundResource(R.drawable.ee);
imageF.setBackgroundResource(R.drawable.ff);
}
imageA.setId(1);
imageA.setTag("a");
paramsA = new RelativeLayout.LayoutParams(LayoutParams.WRAP_CONTENT,LayoutParams.WRAP_CONTENT); //The WRAP_CONTENT parameters can be replaced by an absolute width and height or the FILL_PARENT option)
imageA.setLayoutParams(paramsA);
imageA.setOnTouchListener(this);
//Log.d("GameActivity", "")
imageB.setId(2);
imageB.setTag("b");
paramsB = new RelativeLayout.LayoutParams(LayoutParams.WRAP_CONTENT,LayoutParams.WRAP_CONTENT); //The WRAP_CONTENT parameters can be replaced by an absolute width and height or the FILL_PARENT option)
imageB.setLayoutParams(paramsB);
imageB.setOnTouchListener(this);
imageC.setId(3);
imageC.setTag("c");
paramsC = new RelativeLayout.LayoutParams(LayoutParams.WRAP_CONTENT,LayoutParams.WRAP_CONTENT); //The WRAP_CONTENT parameters can be replaced by an absolute width and height or the FILL_PARENT option)
imageC.setLayoutParams(paramsC);
imageC.setOnTouchListener(this);
imageD.setId(4);
imageD.setTag("d");
paramsD = new RelativeLayout.LayoutParams(LayoutParams.WRAP_CONTENT,LayoutParams.WRAP_CONTENT); //The WRAP_CONTENT parameters can be replaced by an absolute width and height or the FILL_PARENT option)
imageD.setLayoutParams(paramsD);
imageD.setOnTouchListener(this);
imageE.setId(5);
imageE.setTag("e");
paramsE = new RelativeLayout.LayoutParams(LayoutParams.WRAP_CONTENT,LayoutParams.WRAP_CONTENT); //The WRAP_CONTENT parameters can be replaced by an absolute width and height or the FILL_PARENT option)
imageE.setLayoutParams(paramsE);
imageE.setOnTouchListener(this);
imageF.setId(6);
imageF.setTag("f");
paramsF = new RelativeLayout.LayoutParams(LayoutParams.WRAP_CONTENT,LayoutParams.WRAP_CONTENT); //The WRAP_CONTENT parameters can be replaced by an absolute width and height or the FILL_PARENT option)
imageF.setLayoutParams(paramsF);
imageF.setOnTouchListener(this);
setupPieces();
// Set the hardware buttons to control the music
this.setVolumeControlStream(AudioManager.STREAM_MUSIC);
// Load the sound
soundPool = new SoundPool(10, AudioManager.STREAM_MUSIC, 0);
soundPool.setOnLoadCompleteListener(new SoundPool.OnLoadCompleteListener()
{
@Override
public void onLoadComplete(SoundPool soundPool, int sampleId, int status) {
loaded = true;
}
});
gameFinishSoundId = soundPool.load(this, R.raw.bells, 1);
correctPieceAttachSoundId= soundPool.load(this, R.raw.bell, 1);
}
public void playSound(int soundId)
{
AudioManager audioManager = (AudioManager) getSystemService(AUDIO_SERVICE);
float actualVolume = (float) audioManager.getStreamVolume(AudioManager.STREAM_MUSIC);
float maxVolume = (float) audioManager.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
float volume = actualVolume / maxVolume;
// Is the sound loaded already?
if (loaded)
{
soundPool.play(soundId, volume, volume, 1, 0, 1.0f);
}
}
public void onConfigurationChanged(Configuration newConfig)
{
super.onConfigurationChanged(newConfig);
// Checks the orientation of the screen
if (newConfig.orientation == Configuration.ORIENTATION_LANDSCAPE)
{
// Toast.makeText(this, "landscape", Toast.LENGTH_SHORT).show();
}
else if (newConfig.orientation == Configuration.ORIENTATION_PORTRAIT)
{
// Toast.makeText(this, "portrait", Toast.LENGTH_SHORT).show();
}
// Toast.makeText(this, "in OnConfiguaration Changed", Toast.LENGTH_SHORT).show();
isPortait = (getResources().getConfiguration().orientation == Configuration.ORIENTATION_LANDSCAPE) ? false : true;
((RelativeLayout)imageA.getParent()).removeAllViews();
Log.d("GameActivity", "in onconfigurationChanged");
setupPieces();
}
public void setupPieces()
{
RelativeLayout relataivelayoutLayout = new RelativeLayout(this);
Display display = getWindowManager().getDefaultDisplay();
height = display.getHeight();
width = display.getWidth();
if(!isPortait)
{
paramsA.leftMargin = 10; //Your X coordinate
paramsA.topMargin = 30; //Your Y coordinate
}
else
{
paramsA.leftMargin = 30; //Your X coordinate
paramsA.topMargin = 10; //Your Y coordinate
}
imageA.setLayoutParams(paramsA);
imageA.setOnTouchListener(this);
//Log.d("GameActivity", "")
if(!isPortait)
{
paramsB.leftMargin = 650; //Your X coordinate
paramsB.topMargin = 300; //Your Y coordinate
}
else
{
paramsB.leftMargin = 300; //Your X coordinate
paramsB.topMargin = 750; //Your Y coordinate
}
imageB.setLayoutParams(paramsB);
imageB.setOnTouchListener(this);
if(!isPortait)
{
paramsC.leftMargin = 400; //Your X coordinate
paramsC.topMargin = 380; //Your Y coordinate
}
else
{
paramsC.leftMargin = 400; //Your X coordinate
paramsC.topMargin = 350; //Your Y coordinate
}
imageC.setLayoutParams(paramsC);
imageC.setOnTouchListener(this);
if(!isPortait)
{
paramsD.leftMargin = 750; //Your X coordinate
paramsD.topMargin = 20; //Your Y coordinate
}
else
{
paramsD.leftMargin = 20; //Your X coordinate
paramsD.topMargin = 750; //Your Y coordinate
}
imageD.setLayoutParams(paramsD);
imageD.setOnTouchListener(this);
if(!isPortait)
{
paramsE.leftMargin = 900; //Your X coordinate
paramsE.topMargin = 400; //Your Y coordinate
}
else
{
paramsE.leftMargin = 475; //Your X coordinate
paramsE.topMargin = 700; //Your Y coordinate
}
imageE.setLayoutParams(paramsE);
imageE.setOnTouchListener(this);
if(!isPortait)
{
paramsF.leftMargin = 90; //Your X coordinate
paramsF.topMargin = 300; //Your Y coordinate
}
else
{
paramsF.leftMargin = 90; //Your X coordinate
paramsF.topMargin = 300; //Your Y coordinate
}
imageF.setLayoutParams(paramsF);
imageF.setOnTouchListener(this);
ImageInfo imageAinfo = new ImageInfo(imageA.getId(),imageA,1,2);
imageInfoArray[0] = imageAinfo;
ImageInfo imageBinfo = new ImageInfo(imageB.getId(),imageB,1,3);
imageInfoArray[1] = imageBinfo;
ImageInfo imageCinfo = new ImageInfo(imageC.getId(),imageC,2,4);
imageInfoArray[2] = imageCinfo;
ImageInfo imageDinfo = new ImageInfo(imageD.getId(),imageD,3,5);
imageInfoArray[3] = imageDinfo;
ImageInfo imageEinfo = new ImageInfo(imageE.getId(),imageE,4,6);
imageInfoArray[4] = imageEinfo;
ImageInfo imageFinfo = new ImageInfo(imageF.getId(),imageF,5,6);
imageInfoArray[5] = imageFinfo;
relataivelayoutLayout.addView(imageA);
relataivelayoutLayout.addView(imageB);
relataivelayoutLayout.addView(imageC);
relataivelayoutLayout.addView(imageD);
relataivelayoutLayout.addView(imageE);
relataivelayoutLayout.addView(imageF);
setContentView(relataivelayoutLayout);
}
private void updatePosition(int id)
{
if(flag[imageInfoArray[id-1].id])
return;
flag[imageInfoArray[id-1].id] = true;
RelativeLayout.LayoutParams param = (RelativeLayout.LayoutParams)imageInfoArray[id-1].imageView.getLayoutParams();
param.leftMargin = imageInfoArray[id-1].imageView.getLeft() + diffX;
param.topMargin = imageInfoArray[id-1].imageView.getTop() + diffY;
imageInfoArray[id-1].imageView.setLayoutParams(param);
if(imageInfoArray[id-1].isTopConnected)
updatePosition(imageInfoArray[id-1].topPieceId);
if(imageInfoArray[id-1].isBottomConnected)
updatePosition(imageInfoArray[id-1].bottomPieceId);
return;
}
@Override
public boolean onTouch(View v, MotionEvent event)
{
if(v instanceof ImageView)
{
ImageView imageView = (ImageView) v;
ImageInfo imageInfo = imageInfoArray[imageView.getId()-1];
switch (event.getAction() & MotionEvent.ACTION_MASK)
{
case MotionEvent.ACTION_DOWN:
touchStartX = (int) event.getRawX();
touchStartY = (int) event.getRawY();
imageX = imageView.getLeft();
imageY = imageView.getTop();
//Toast.makeText(this, "x = "+event.getRawX()+" y = "+event.getRawY(), Toast.LENGTH_SHORT).show();
break;
case MotionEvent.ACTION_UP:
touchStartX = (int) event.getRawX();
touchStartY = (int) event.getRawY();
imageX = imageView.getLeft();
imageY = imageView.getTop();
int id = imageInfo.id;
while(imageInfo.isTopConnected)
{
if(imageInfo.id == imageInfo.topPieceId)
break;
imageInfo = imageInfoArray[imageInfo.topPieceId-1];
}
if(!imageInfo.isTopConnected)
{
imageView = imageInfo.imageView;
int topConnectingPieceId = imageInfo.topPieceId;
int topConnectingX=0,topConnectingY=0;
topConnectingX = imageInfo.calculateTopX(imageView.getLeft(), imageView.getId());
topConnectingY = imageInfo.calculateTopY(imageView.getTop(), imageView.getId());
ImageInfo topImageInfo = imageInfoArray[topConnectingPieceId-1];
int bottomConnectingX = topImageInfo.calculateBottomX(topImageInfo.imageView.getLeft(),topConnectingPieceId);
int bottomConnectingY = topImageInfo.calculateBottomY(topImageInfo.imageView.getTop(),topConnectingPieceId);
diffX = (bottomConnectingX-topConnectingX);
diffY = (bottomConnectingY-topConnectingY);
if(Math.abs(diffX)<=20 && Math.abs(diffY)<=20)
{
for(int i=0;i<7;i++)
flag[i]=false;
updatePosition(imageInfo.id);
imageInfo.setIsTopConnected(true);
topImageInfo.setIsBottomConnected(true);
attachedPieces++;
if(attachedPieces==5)
{
setupFinishDialogue();
playSound(gameFinishSoundId);
}
else
playSound(correctPieceAttachSoundId);
break;
}
}
imageInfo = imageInfoArray[id-1];
while(imageInfo.isBottomConnected)
{
if(imageInfo.id == imageInfoArray[imageInfo.bottomPieceId-1].id)
break;
imageInfo = imageInfoArray[imageInfo.bottomPieceId-1];
}
imageView = imageInfo.imageView;
if(!imageInfo.isBottomConnected)
{
int topConnectingX=0,topConnectingY=0;
int bottomConnectingX = imageInfo.calculateBottomX(imageView.getLeft(), imageView.getId());
int bottomConnectingY = imageInfo.calculateBottomY(imageView.getTop(), imageView.getId());
int bottomConnectingPieceId = imageInfo.bottomPieceId;
ImageInfo bottomImageInfo = imageInfoArray[bottomConnectingPieceId-1];
topConnectingX = bottomImageInfo.calculateTopX(bottomImageInfo.imageView.getLeft(),bottomConnectingPieceId);
topConnectingY = bottomImageInfo.calculateTopY(bottomImageInfo.imageView.getTop(), bottomConnectingPieceId);
diffX = (topConnectingX-bottomConnectingX);
diffY = (topConnectingY-bottomConnectingY);
if(Math.abs(diffX)<=20 && Math.abs(diffY)<=20)
{
for(int i=0;i<7;i++)
flag[i]=false;
updatePosition(imageInfo.id);
imageInfo.setIsBottomConnected(true);
bottomImageInfo.setIsTopConnected(true);
attachedPieces++;
if(attachedPieces==5)
{
setupFinishDialogue();
playSound(gameFinishSoundId);
}
else
playSound(correctPieceAttachSoundId);
}
}
break;
case MotionEvent.ACTION_MOVE:
diffX = (int) (event.getRawX() - touchStartX);
diffY = (int) (event.getRawY() - touchStartY);
touchStartX = (int)event.getRawX();
touchStartY = (int)event.getRawY();
for(int i=0;i<7;i++)
flag[i]=false;
updatePosition(imageInfo.id);
break;
default:
break;
}
}
return true;
}
void lockOrientation()
{
switch (getResources().getConfiguration().orientation)
{
case Configuration.ORIENTATION_PORTRAIT:
if(android.os.Build.VERSION.SDK_INT < android.os.Build.VERSION_CODES.FROYO)
{
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
}
else
{
int rotation = getWindowManager().getDefaultDisplay().getRotation();
if(rotation == android.view.Surface.ROTATION_90|| rotation == android.view.Surface.ROTATION_180)
{
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_REVERSE_PORTRAIT);
}
else
{
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
}
}
break;
case Configuration.ORIENTATION_LANDSCAPE:
if(android.os.Build.VERSION.SDK_INT < android.os.Build.VERSION_CODES.FROYO)
{
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
}
else
{
int rotation = getWindowManager().getDefaultDisplay().getRotation();
if(rotation == android.view.Surface.ROTATION_0 || rotation == android.view.Surface.ROTATION_90){
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
}
else
{
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_REVERSE_LANDSCAPE);
}
}
break;
}
}
void setupFinishDialogue()
{
/*if(getWindowManager().getDefaultDisplay().getRotation()==3)
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
else if(getWindowManager().getDefaultDisplay().getRotation()==1)
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_REVERSE_PORTRAIT);
else if(getWindowManager().getDefaultDisplay().getRotation()==2)
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_REVERSE_LANDSCAPE);
else setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_NOSENSOR);*/
lockOrientation();
AlertDialog.Builder builderForAlertBox = new AlertDialog.Builder(this);
builderForAlertBox.setCancelable(false).setMessage("Good Job!").setPositiveButton("Restart", dialogClickListner).setNegativeButton("Quit", dialogClickListner).
setCancelable(true).show();
}
DialogInterface.OnClickListener dialogClickListner = new DialogInterface.OnClickListener()
{
public void onClick(DialogInterface dialog, int which)
{
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_SENSOR);
switch (which) {
case DialogInterface.BUTTON_POSITIVE:
finish();
Intent intent = new Intent(getApplicationContext(),GameActivity.class);
startActivity(intent);
break;
case DialogInterface.BUTTON_NEGATIVE:
finish();
default:
break;
}
}
};
}
ImageInfo Class
package com.example.JigSawPuzzle;
import android.widget.ImageView;
import android.widget.Toast;
public class ImageInfo
{
ImageView imageView;
int imageATopLeftDifferenceX = -1;
int imageATopLeftDifferenceY = -1;
int imageABottomRightDifferenceX = 113;
int imageABottomRightDifferenceY = 140;
int imageBTopLeftDifferenceX = 0;
int imageBTopLeftDifferenceY = 0;
int imageBBottomRightDifferenceX = 0;
int imageBBottomRightDifferenceY = 111;
int imageCTopLeftDifferenceX = 14;
int imageCTopLeftDifferenceY = 0;
int imageCBottomRightDifferenceX = 0;
int imageCBottomRightDifferenceY = 88;
int imageDTopLeftDifferenceX = 92;
int imageDTopLeftDifferenceY = 2;
int imageDBottomRightDifferenceX = 0;
int imageDBottomRightDifferenceY = 70;
/*int imageETopLeftDifferenceX = 0;
int imageETopLeftDifferenceY = 0;
int imageEBottomRightDifferenceX = 55;
int imageEBottomRightDifferenceY = 112;*/
int imageETopLeftDifferenceX = 55;
int imageETopLeftDifferenceY = 112;
int imageEBottomRightDifferenceX = 0;
int imageEBottomRightDifferenceY = 0;
/*int imageFTopLeftDifferenceX = 0;
int imageFTopLeftDifferenceY = 26;
int imageFBottomRightDifferenceX = 0;
int imageFBottomRightDifferenceY = 109;
int id,topPieceId,bottomPieceId;*/
int imageFTopLeftDifferenceX = 0;
int imageFTopLeftDifferenceY = 109;
int imageFBottomRightDifferenceX = 0;
int imageFBottomRightDifferenceY = 26;
int id,topPieceId,bottomPieceId;
boolean isTopConnected = false;
boolean isBottomConnected = false;
public ImageInfo(int id,ImageView imageView,int topPieceId,int bottomPieceId)
{
this.topPieceId = topPieceId;
this.bottomPieceId = bottomPieceId;
this.imageView = imageView;
this.id = id;
if(id==1)
isTopConnected = true;
else if(id==6)
isBottomConnected = true;
if(GameActivity.screenInches>9.0)
initializePiecesInfo();
}
private void initializePiecesInfo()
{
imageATopLeftDifferenceX = 0;
imageATopLeftDifferenceY = 0;
imageABottomRightDifferenceX = 150;
imageABottomRightDifferenceY = 184;
imageBTopLeftDifferenceX = 0;
imageBTopLeftDifferenceY = 0;
imageBBottomRightDifferenceX = 0;
imageBBottomRightDifferenceY = 148;
imageCTopLeftDifferenceX = 23;
imageCTopLeftDifferenceY = 0;
imageCBottomRightDifferenceX = 0;
imageCBottomRightDifferenceY = 115;
imageDTopLeftDifferenceX = 121;
imageDTopLeftDifferenceY = 0;
imageDBottomRightDifferenceX = 0;
imageDBottomRightDifferenceY = 91;
/*int imageETopLeftDifferenceX = 0;
int imageETopLeftDifferenceY = 0;
int imageEBottomRightDifferenceX = 55;
int imageEBottomRightDifferenceY = 112;*/
imageETopLeftDifferenceX = 74;
imageETopLeftDifferenceY = 147;
imageEBottomRightDifferenceX = 0;
imageEBottomRightDifferenceY = 0;
/*int imageFTopLeftDifferenceX = 0;
int imageFTopLeftDifferenceY = 26;
int imageFBottomRightDifferenceX = 0;
int imageFBottomRightDifferenceY = 109;
int id,topPieceId,bottomPieceId;*/
imageFTopLeftDifferenceX = 0;
imageFTopLeftDifferenceY = 144;
imageFBottomRightDifferenceX = 0;
imageFBottomRightDifferenceY = 26;
}
int calculateTopX(int realX,int id)
{
if(id==2)
return realX+imageBTopLeftDifferenceX;
if(id==3)
return realX+imageCTopLeftDifferenceX;
if(id==4)
return realX+imageDTopLeftDifferenceX;
if(id==5)
return realX+imageETopLeftDifferenceX;
if(id==6)
return realX+imageFTopLeftDifferenceX;
return realX;
}
int calculateTopY(int realY,int id)
{
if(id==2)
return realY+imageBTopLeftDifferenceY;
if(id==3)
return realY+imageCTopLeftDifferenceY;
if(id==4)
return realY+imageDTopLeftDifferenceY;
if(id==5)
return realY+imageETopLeftDifferenceY;
if(id==6)
return realY+imageFTopLeftDifferenceY;
return realY;
}
int calculateBottomX(int realX,int id)
{
if(id==1)
return realX+imageABottomRightDifferenceX;
if(id==2)
return realX+imageBBottomRightDifferenceX;
if(id==3)
return realX+imageCBottomRightDifferenceX;
if(id==4)
return realX+imageDBottomRightDifferenceX;
if(id==5)
return realX+imageEBottomRightDifferenceX;
return realX+imageFBottomRightDifferenceX;
}
int calculateBottomY(int realY,int id)
{
if(id==1)
return realY+imageABottomRightDifferenceY;
if(id==2)
return realY+imageBBottomRightDifferenceY;
if(id==3)
return realY+imageCBottomRightDifferenceY;
if(id==4)
return realY+imageDBottomRightDifferenceY;
if(id==5)
return realY+imageEBottomRightDifferenceY;
return realY+imageFBottomRightDifferenceY;
}
void setIsTopConnected(boolean isTopConnected)
{
this.isTopConnected = isTopConnected;
}
void setIsBottomConnected(boolean isBottomConnected)
{
this.isBottomConnected = isBottomConnected;
}
}
Given the images you provided, it is straightforward to connect them in a generic way, because the images have the same width and a transparent background. Here is the idea:
1. You need the overlap distance between the two images, which you can calculate from either the bottom or the top image.
The green line in the bottom picture should be equal to the red line in the top picture.
Given that your images have a transparent background, you can calculate this distance easily; I will use the bottom image here.
The idea is to check every pixel in the last column of the bottom bitmap (i.e. width - 1) and stop once you hit a non-transparent pixel.
private int getOverlapDistance(Bitmap bottomBitmap) {
int height = bottomBitmap.getHeight();
int width = bottomBitmap.getWidth();
int distance = 0;
for (int i = 0; i < height; i++) {
if (Color.alpha(bottomBitmap.getPixel(width - 1, i)) != 0) {
distance = (i + 1);
break;
}
}
return distance;
}
To connect them you can do something like this (assuming you have a separate ImageView for the top and bottom image):
Bitmap bitmap = ((BitmapDrawable) bottomImage.getDrawable()).getBitmap();
int overlapDistance = getOverlapDistance(bitmap);
bottomImage.setTop(topImage.getBottom() - overlapDistance);
Actually, I tried this with a simple activity and it is working. Here is how it looks before and after connecting the two images:
I just execute the above code when the Connect button is clicked.
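One caveat about the snippet above: setTop() is typically recomputed on the next layout pass, so a variant that survives re-layout (just a sketch, assuming both ImageViews share a RelativeLayout parent as in the question's code) adjusts the layout params instead:
// Sketch: persist the new position via layout params rather than setTop().
RelativeLayout.LayoutParams lp =
        (RelativeLayout.LayoutParams) bottomImage.getLayoutParams();
lp.topMargin = topImage.getBottom() - overlapDistance;
bottomImage.setLayoutParams(lp);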
I'm using mrmaffen's VLC-ANDROID-SDK to develop an RTSP streaming app.
https://github.com/mrmaffen/vlc-android-sdk
I've had a lot of success getting it working and running quite well, but the problem I can't seem to shake is getting it to display the video feed fullscreen on the SurfaceView, or even just centered in the SurfaceView.
This is what I get:
http://s1378.photobucket.com/user/Jo_Han_Solo/media/Screenshot_20171214-125504_zps437k1kw2.png.html?filters[user]=146993343&filters[recent]=1&sort=1&o=1
The black window is the total size of the screen. I want the video to fill the screen, ideally always filling from the center, but I can't figure out how to do it.
Does anyone have experience with anything like this and know how to fix it?
I kind of solved the problem, but in a bit of a dodgy way. It's far from complete, but considering the lack of knowledge and information on the topic, I thought this might help someone for the time being.
1. Find the size of your screen.
2. Set up your final IVLCVout to incorporate the screen size.
3. Adjust setScale to "fullscreen" the video stream.
To explain each task:
1. Set up your globals:
public class SingleStreamView extends AppCompatActivity implements
IVLCVout.Callback {
public int mHeight;
public int mWidth;
Then, in onCreate, find your device's screen size:
DisplayMetrics displayMetrics = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getMetrics(displayMetrics);
mHeight = displayMetrics.heightPixels;
mWidth = displayMetrics.widthPixels;
2. Then go down to your "CreatePlayer" event, where you set up your video output:
// Set up video output
final IVLCVout vout = mMediaPlayer.getVLCVout();
vout.setVideoView(mSurface);
vout.setWindowSize(mWidth,mHeight);
vout.addCallback(this);
vout.attachViews();
The winning line that made it center in my surface was vout.setWindowSize(mWidth, mHeight).
3. Then I simply used the setScale option to "fullscreen" the video. That said, it's a bit of a hacky way of doing it, and I would like to figure out a way to grab the codec information so the scale can be set dynamically; that way any video stream would automatically fill any screen size. For now, this works for known video stream resolutions and will automatically adjust to your phone's screen size.
Either way, I found that on a Samsung Galaxy S8 a good scaling factor for a 640x480 RTSP stream was 1.8, coded like so:
Media m = new Media(libvlc, Uri.parse(RTSP_ADDRESS));
m.setHWDecoderEnabled(true,false);
m.addOption(":network-caching=100");
m.addOption(":clock-jitter=0");
m.addOption(":clock-synchro=0");
m.addOption(":fullscreen");
mMediaPlayer.setMedia(m);
mMediaPlayer.setAspectRatio("16:9");
mMediaPlayer.setScale(1.8f);
mMediaPlayer.play();
The key line here is mMediaPlayer.setScale(1.8f).
Hope this helps someone!
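As a rough sketch of the dynamic-scale idea mentioned above (assuming the stream's resolution can be read from the current video track once playback has started; the helper name is mine, not part of the SDK):
// Sketch: compute a fill-screen scale instead of the hard-coded 1.8f.
// mWidth/mHeight are the screen size found in onCreate above.
private float computeFullscreenScale() {
    Media.VideoTrack track = mMediaPlayer.getCurrentVideoTrack();
    if (track == null || track.width == 0 || track.height == 0) {
        return 0f; // 0 lets libVLC fall back to its default "best fit"
    }
    float scaleX = (float) mWidth / track.width;
    float scaleY = (float) mHeight / track.height;
    return Math.max(scaleX, scaleY); // fill the screen, cropping any overflow
}
Then mMediaPlayer.setScale(computeFullscreenScale()); would replace the fixed 1.8f.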
Your solution seems interesting; however, I'm facing the same issues, which I can't seem to solve (yet) with your approach.
Screenshots of what I have so far can be seen at:
https://photos.app.goo.gl/9nKo22Mkc2SZq4SK9
I also want to (vertically) center an RTSP video stream in both landscape and portrait mode on a Samsung XCover4 (720x1280 pixels) and on a device with a minimum resolution of 320x480. The minimum Android SDK version I would like to support is API 22 (Android 5.1.1).
The libvlc code with which I got the (embedded) VLC player working is based on 'de.mrmaffen:libvlc-android:2.1.12@aar'.
Given the above 'requirements', you can see the following behavior in the screenshots. The first two screenshots are from a Samsung XCover4 (720x1280), where landscape orientation clips the video instead of scaling it, whereas the 3rd and 4th screenshots show that the same video stream does not follow the SURFACE_BEST_FIT method (see the code below for an explanation) on a device with a small resolution.
I would love to have updateVideoSurfaces handle the change in device orientation, or at least show the entire video on startup.
The layout for my VLC video player (part of a vertical LinearLayout) is as follows:
<LinearLayout
android:layout_width="match_parent"
android:layout_height="0dp"
android:layout_weight="0.3"
android:layout_marginBottom="8dp"
android:layout_marginEnd="8dp"
android:layout_marginStart="8dp"
android:layout_marginTop="8dp"
android:orientation="vertical">
<FrameLayout
android:id="@+id/video_surface_frame"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_gravity="center"
android:foregroundGravity="clip_horizontal|clip_vertical"
tools:ignore="true">
<ViewStub
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout="@layout/surface_view"
android:id="@+id/surface_stub" />
<ViewStub
android:layout_width="1dp"
android:layout_height="1dp"
android:layout="@layout/surface_view"
android:id="@+id/subtitles_surface_stub" />
<ViewStub
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout="@layout/texture_view"
android:id="@+id/texture_stub" />
</FrameLayout>
</LinearLayout>
The example code I got from de.mrmaffen uses updateVideoSurfaces (see the Java code below), which supports a number of SURFACE_XX modes that, to me, seem to cover all combinations of device orientation and resolution.
For some reason I can't figure out why this doesn't work, and I suspect that the layout I'm using for the player (the FrameLayout/ViewStubs) may be causing the issues.
I was wondering if you could shed some light on how to make sure the video stream will auto-scale and center for any device orientation and resolution.
The player-code I'm using is as follows:
package com.testing.vlc2player;
import ...
public class VLC2PlayerActivity extends AppCompatActivity implements IVLCVout.OnNewVideoLayoutListener,
IVLCVout.Callback {
private static final Logger log = LoggerFactory.getLogger(VLC2PlayerActivity.class);
private static final boolean USE_SURFACE_VIEW = true;
private static final boolean ENABLE_SUBTITLES = false;
private static final int SURFACE_BEST_FIT = 0;
private static final int SURFACE_FIT_SCREEN = 1;
private static final int SURFACE_FILL = 2;
private static final int SURFACE_16_9 = 3;
private static final int SURFACE_4_3 = 4;
private static final int SURFACE_ORIGINAL = 5;
private static final int CURRENT_SIZE = SURFACE_BEST_FIT;
private FrameLayout mVideoSurfaceFrame = null;
private SurfaceView mVideoSurface = null;
private SurfaceView mSubtitlesSurface = null;
private TextureView mVideoTexture = null;
private View mVideoView = null;
private final Handler mHandler = new Handler();
private View.OnLayoutChangeListener mOnLayoutChangeListener = null;
private LibVLC mLibVLC = null;
private MediaPlayer mMediaPlayer = null;
private int mVideoHeight = 0;
private int mVideoWidth = 0;
private int mVideoVisibleHeight = 0;
private int mVideoVisibleWidth = 0;
private int mVideoSarNum = 0;
private int mVideoSarDen = 0;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_video_player);
setupVLCLayout();
}
private void setupVLCLayout() {
log.debug("...");
final ArrayList<String> args = new ArrayList<>();
args.add("-vvv");
mLibVLC = new LibVLC(this, args);
mMediaPlayer = new MediaPlayer(mLibVLC);
mVideoSurfaceFrame = findViewById(R.id.video_surface_frame);
if (USE_SURFACE_VIEW) {
ViewStub stub = findViewById(R.id.surface_stub);
mVideoSurface = (SurfaceView) stub.inflate();
if (ENABLE_SUBTITLES) {
stub = findViewById(R.id.subtitles_surface_stub);
mSubtitlesSurface = (SurfaceView) stub.inflate();
mSubtitlesSurface.setZOrderMediaOverlay(true);
mSubtitlesSurface.getHolder().setFormat(PixelFormat.TRANSLUCENT);
}
mVideoView = mVideoSurface;
} else {
ViewStub stub = findViewById(R.id.texture_stub);
mVideoTexture = (TextureView) stub.inflate();
mVideoView = mVideoTexture;
}
}
@Override
protected void onDestroy() {
super.onDestroy();
mMediaPlayer.release();
mLibVLC.release();
}
@Override
protected void onStart() {
super.onStart();
final IVLCVout vlcVout = mMediaPlayer.getVLCVout();
if (mVideoSurface != null) {
vlcVout.setVideoView(mVideoSurface);
if (mSubtitlesSurface != null) {
vlcVout.setSubtitlesView(mSubtitlesSurface);
}
} else {
vlcVout.setVideoView(mVideoTexture);
}
vlcVout.attachViews(this);
String url = getString(R.string.videoURL);
Uri uri = Uri.parse(url);
final Media media = new Media(mLibVLC, uri);
mMediaPlayer.setMedia(media);
media.release();
mMediaPlayer.play();
if (mOnLayoutChangeListener == null) {
mOnLayoutChangeListener = new View.OnLayoutChangeListener() {
private final Runnable mRunnable = new Runnable() {
@Override
public void run() {
updateVideoSurfaces();
}
};
@Override
public void onLayoutChange(View v, int left, int top, int right,
int bottom, int oldLeft, int oldTop, int oldRight, int oldBottom) {
if (left != oldLeft || top != oldTop || right != oldRight || bottom != oldBottom) {
mHandler.removeCallbacks(mRunnable);
mHandler.post(mRunnable);
}
}
};
}
mVideoSurfaceFrame.addOnLayoutChangeListener(mOnLayoutChangeListener);
}
@Override
protected void onStop() {
super.onStop();
if (mOnLayoutChangeListener != null) {
mVideoSurfaceFrame.removeOnLayoutChangeListener(mOnLayoutChangeListener);
mOnLayoutChangeListener = null;
}
mMediaPlayer.stop();
mMediaPlayer.getVLCVout().detachViews();
}
private void changeMediaPlayerLayout(int displayW, int displayH) {
log.debug("displayW={}, displayH={}", displayW, displayH);
/* Change the video placement using the MediaPlayer API */
int dispWd = displayW;
int dispHt = displayH;
dispWd = mVideoSurface.getWidth(); //Note: we do NOT want to use the entire display!
dispHt = mVideoSurface.getHeight();
switch (CURRENT_SIZE) {
case SURFACE_BEST_FIT:
mMediaPlayer.setAspectRatio(null);
mMediaPlayer.setScale(0);
break;
case SURFACE_FIT_SCREEN:
case SURFACE_FILL: {
Media.VideoTrack vtrack = mMediaPlayer.getCurrentVideoTrack();
if (vtrack == null) {
return;
}
final boolean videoSwapped = vtrack.orientation == Media.VideoTrack.Orientation.LeftBottom
|| vtrack.orientation == Media.VideoTrack.Orientation.RightTop;
if (CURRENT_SIZE == SURFACE_FIT_SCREEN) {
int videoW = vtrack.width;
int videoH = vtrack.height;
if (videoSwapped) {
int swap = videoW;
videoW = videoH;
videoH = swap;
}
if (vtrack.sarNum != vtrack.sarDen) {
videoW = videoW * vtrack.sarNum / vtrack.sarDen;
}
float ar = videoW / (float) videoH;
float dar = dispWd / (float) dispHt;
//noinspection unused
float scale;
if (dar >= ar) {
scale = dispWd / (float) videoW; /* horizontal */
} else {
scale = dispHt / (float) videoH; /* vertical */
}
log.debug("scale={}", scale);
mMediaPlayer.setScale(scale);
mMediaPlayer.setAspectRatio(null);
} else {
mMediaPlayer.setScale(0);
mMediaPlayer.setAspectRatio(!videoSwapped ? ""+dispWd+":"+dispHt
: ""+dispHt+":"+dispWd);
}
break;
}
case SURFACE_16_9:
mMediaPlayer.setAspectRatio("16:9");
mMediaPlayer.setScale(0);
break;
case SURFACE_4_3:
mMediaPlayer.setAspectRatio("4:3");
mMediaPlayer.setScale(0);
break;
case SURFACE_ORIGINAL:
mMediaPlayer.setAspectRatio(null);
mMediaPlayer.setScale(1);
break;
}
}
private void updateVideoSurfaces() {
log.debug("...");
int sw = getWindow().getDecorView().getWidth();
int sh = getWindow().getDecorView().getHeight();
// sanity check
if (sw * sh == 0) {
log.error("Invalid surface size");
return;
}
mMediaPlayer.getVLCVout().setWindowSize(sw, sh);
ViewGroup.LayoutParams lp = mVideoView.getLayoutParams();
if (mVideoWidth * mVideoHeight == 0) {
/* Case of OpenGL vouts: handles the placement of the video using MediaPlayer API */
lp.width = ViewGroup.LayoutParams.MATCH_PARENT;
lp.height = ViewGroup.LayoutParams.MATCH_PARENT;
mVideoView.setLayoutParams(lp);
lp = mVideoSurfaceFrame.getLayoutParams();
lp.width = ViewGroup.LayoutParams.MATCH_PARENT;
lp.height = ViewGroup.LayoutParams.MATCH_PARENT;
mVideoSurfaceFrame.setLayoutParams(lp);
changeMediaPlayerLayout(sw, sh);
return;
}
if (lp.width == lp.height && lp.width == ViewGroup.LayoutParams.MATCH_PARENT) {
/* We handle the placement of the video using Android View LayoutParams */
mMediaPlayer.setAspectRatio(null);
mMediaPlayer.setScale(0);
}
double dw = sw, dh = sh;
final boolean isPortrait = getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT;
if (sw > sh && isPortrait || sw < sh && !isPortrait) {
dw = sh;
dh = sw;
}
// compute the aspect ratio
double ar, vw;
if (mVideoSarDen == mVideoSarNum) {
/* No indication about the density, assuming 1:1 */
vw = mVideoVisibleWidth;
ar = (double)mVideoVisibleWidth / (double)mVideoVisibleHeight;
} else {
/* Use the specified aspect ratio */
vw = mVideoVisibleWidth * (double)mVideoSarNum / mVideoSarDen;
ar = vw / mVideoVisibleHeight;
}
// compute the display aspect ratio
double dar = dw / dh;
switch (CURRENT_SIZE) {
case SURFACE_BEST_FIT:
if (dar < ar) {
dh = dw / ar;
} else {
dw = dh * ar;
}
break;
case SURFACE_FIT_SCREEN:
if (dar >= ar) {
dh = dw / ar; /* horizontal */
} else {
dw = dh * ar; /* vertical */
}
break;
case SURFACE_FILL:
break;
case SURFACE_16_9:
ar = 16.0 / 9.0;
if (dar < ar) {
dh = dw / ar;
} else {
dw = dh * ar;
}
break;
case SURFACE_4_3:
ar = 4.0 / 3.0;
if (dar < ar) {
dh = dw / ar;
} else {
dw = dh * ar;
}
break;
case SURFACE_ORIGINAL:
dh = mVideoVisibleHeight;
dw = vw;
break;
}
// set display size
lp.width = (int) Math.ceil(dw * mVideoWidth / mVideoVisibleWidth);
lp.height = (int) Math.ceil(dh * mVideoHeight / mVideoVisibleHeight);
mVideoView.setLayoutParams(lp);
if (mSubtitlesSurface != null) {
mSubtitlesSurface.setLayoutParams(lp);
}
// set frame size (crop if necessary)
lp = mVideoSurfaceFrame.getLayoutParams();
lp.width = (int) Math.floor(dw);
lp.height = (int) Math.floor(dh);
mVideoSurfaceFrame.setLayoutParams(lp);
mVideoView.invalidate();
if (mSubtitlesSurface != null) {
mSubtitlesSurface.invalidate();
}
}
@TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR1)
@Override
public void onNewVideoLayout(IVLCVout vlcVout, int width, int height,
int visibleWidth, int visibleHeight,
int sarNum, int sarDen) {
log.debug("...");
mVideoWidth = width;
mVideoHeight = height;
mVideoVisibleWidth = visibleWidth;
mVideoVisibleHeight = visibleHeight;
mVideoSarNum = sarNum;
mVideoSarDen = sarDen;
updateVideoSurfaces();
}
@Override
public void onSurfacesCreated(IVLCVout vlcVout) {
log.debug("vlcVout={}", vlcVout);
}
/**
* This callback is called when surfaces are destroyed.
*/
public void onSurfacesDestroyed(IVLCVout vlcVout) {
log.debug("vlcVout={}", vlcVout);
}
public void onStopClientMonitoring(View view) {
// log.info("UI -> Stop monitoring clientId= ...");
// onBackPressed();
String androidSDKRelease = Build.VERSION.RELEASE;
int androidSDKInt = Build.VERSION.SDK_INT;
String androidInfo = String.format(Locale.getDefault(), "Android %s (Version %d)", androidSDKRelease, androidSDKInt);
String appVersionName = BuildConfig.VERSION_NAME;
String appName = getString(R.string.app_name);
String appInfoTitle = String.format(getString(R.string.app_info_title), appName);
String infoMsg = String.format(getString(R.string.app_info_message), appVersionName, androidInfo);
new AlertDialog.Builder(this).setTitle(appInfoTitle)
.setMessage(infoMsg)
.setPositiveButton(getString(R.string.button_ok), new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
// Dismiss dialog
dialog.dismiss();
}
})
.create()
.show();
}
}
Okay so I'm using LibVLC for Android (with Android Studio) to receive the live RTSP stream of an IP Camera since VideoView doesn't quite support LIVE streams. I'm using a sample code from the VideoLAN people which can be found here:
https://code.videolan.org/videolan/libvlc-android-samples
And I've done a lot of investigation on the code to achieve a 16:9 aspect ratio out of the 4:3 that my camera outputs. The reason I'm trying to break the aspect ratio is that this IP camera records 1280x720 pixels but outputs 640x480 through its second stream. The problem is that the width isn't cropped but squeezed in from the sides, so the picture looks horizontally compressed.
I've tried setting the four alignParent options to true on the SurfaceView, but no results. I also tried multiplying some of the width variables in the JavaActivity class code by 1.33333, which should theoretically stretch the width, but nothing happened at all, not even an error or an exception. I also tried making a new class extending SurfaceView and tweaking the onMeasure method, but no dice. This is the JavaActivity code as-is from the example (of course I've adapted mine to work with my project, with minor changes):
package org.videolan.javasample;
import android.annotation.TargetApi;
import android.content.res.Configuration;
import android.graphics.PixelFormat;
import android.net.Uri;
import android.os.Build;
import android.os.Handler;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceView;
import android.view.TextureView;
import android.view.View;
import android.view.ViewGroup;
import android.view.ViewStub;
import android.widget.FrameLayout;
import org.videolan.libvlc.IVLCVout;
import org.videolan.libvlc.LibVLC;
import org.videolan.libvlc.Media;
import org.videolan.libvlc.MediaPlayer;
import java.util.ArrayList;
public class JavaActivity extends AppCompatActivity implements IVLCVout.OnNewVideoLayoutListener {
private static final boolean USE_SURFACE_VIEW = true;
private static final boolean ENABLE_SUBTITLES = true;
private static final String TAG = "JavaActivity";
private static final String SAMPLE_URL = "http://download.blender.org/peach/bigbuckbunny_movies/BigBuckBunny_640x360.m4v";
// Not the actual RTSP Live Stream link but you know...
private static final int SURFACE_BEST_FIT = 0;
private static final int SURFACE_FIT_SCREEN = 1;
private static final int SURFACE_FILL = 2;
private static final int SURFACE_16_9 = 3;
private static final int SURFACE_4_3 = 4;
private static final int SURFACE_ORIGINAL = 5;
private static int CURRENT_SIZE = SURFACE_BEST_FIT;
private FrameLayout mVideoSurfaceFrame = null;
private SurfaceView mVideoSurface = null;
private SurfaceView mSubtitlesSurface = null;
private TextureView mVideoTexture = null;
private View mVideoView = null;
private final Handler mHandler = new Handler();
private View.OnLayoutChangeListener mOnLayoutChangeListener = null;
private LibVLC mLibVLC = null;
private MediaPlayer mMediaPlayer = null;
private int mVideoHeight = 0;
private int mVideoWidth = 0;
private int mVideoVisibleHeight = 0;
private int mVideoVisibleWidth = 0;
private int mVideoSarNum = 0;
private int mVideoSarDen = 0;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
final ArrayList<String> args = new ArrayList<>();
args.add("-vvv");
mLibVLC = new LibVLC(this, args);
mMediaPlayer = new MediaPlayer(mLibVLC);
mVideoSurfaceFrame = (FrameLayout) findViewById(R.id.video_surface_frame);
if (USE_SURFACE_VIEW) {
ViewStub stub = (ViewStub) findViewById(R.id.surface_stub);
mVideoSurface = (SurfaceView) stub.inflate();
if (ENABLE_SUBTITLES) {
stub = (ViewStub) findViewById(R.id.subtitles_surface_stub);
mSubtitlesSurface = (SurfaceView) stub.inflate();
mSubtitlesSurface.setZOrderMediaOverlay(true);
mSubtitlesSurface.getHolder().setFormat(PixelFormat.TRANSLUCENT);
}
mVideoView = mVideoSurface;
}
else
{
ViewStub stub = (ViewStub) findViewById(R.id.texture_stub);
mVideoTexture = (TextureView) stub.inflate();
mVideoView = mVideoTexture;
}
}
@Override
protected void onDestroy() {
super.onDestroy();
mMediaPlayer.release();
mLibVLC.release();
}
@Override
protected void onStart() {
super.onStart();
final IVLCVout vlcVout = mMediaPlayer.getVLCVout();
if (mVideoSurface != null) {
vlcVout.setVideoView(mVideoSurface);
if (mSubtitlesSurface != null)
vlcVout.setSubtitlesView(mSubtitlesSurface);
}
else
vlcVout.setVideoView(mVideoTexture);
vlcVout.attachViews(this);
Media media = new Media(mLibVLC, Uri.parse(SAMPLE_URL));
mMediaPlayer.setMedia(media);
media.release();
mMediaPlayer.play();
if (mOnLayoutChangeListener == null) {
mOnLayoutChangeListener = new View.OnLayoutChangeListener() {
private final Runnable mRunnable = new Runnable() {
@Override
public void run() {
updateVideoSurfaces();
}
};
@Override
public void onLayoutChange(View v, int left, int top, int right,
int bottom, int oldLeft, int oldTop, int oldRight, int oldBottom) {
if (left != oldLeft || top != oldTop || right != oldRight || bottom != oldBottom) {
mHandler.removeCallbacks(mRunnable);
mHandler.post(mRunnable);
}
}
};
}
mVideoSurfaceFrame.addOnLayoutChangeListener(mOnLayoutChangeListener);
}
@Override
protected void onStop() {
super.onStop();
if (mOnLayoutChangeListener != null) {
mVideoSurfaceFrame.removeOnLayoutChangeListener(mOnLayoutChangeListener);
mOnLayoutChangeListener = null;
}
mMediaPlayer.stop();
mMediaPlayer.getVLCVout().detachViews();
}
private void changeMediaPlayerLayout(int displayW, int displayH) {
/* Change the video placement using the MediaPlayer API */
switch (CURRENT_SIZE) {
case SURFACE_BEST_FIT:
mMediaPlayer.setAspectRatio(null);
mMediaPlayer.setScale(0);
break;
case SURFACE_FIT_SCREEN:
case SURFACE_FILL: {
Media.VideoTrack vtrack = mMediaPlayer.getCurrentVideoTrack();
if (vtrack == null)
return;
final boolean videoSwapped = vtrack.orientation == Media.VideoTrack.Orientation.LeftBottom
|| vtrack.orientation == Media.VideoTrack.Orientation.RightTop;
if (CURRENT_SIZE == SURFACE_FIT_SCREEN) {
int videoW = vtrack.width;
int videoH = vtrack.height;
if (videoSwapped) {
int swap = videoW;
videoW = videoH;
videoH = swap;
}
if (vtrack.sarNum != vtrack.sarDen)
videoW = videoW * vtrack.sarNum / vtrack.sarDen;
float ar = videoW / (float) videoH;
float dar = displayW / (float) displayH;
float scale;
if (dar >= ar)
scale = displayW / (float) videoW; /* horizontal */
else
scale = displayH / (float) videoH; /* vertical */
mMediaPlayer.setScale(scale);
mMediaPlayer.setAspectRatio(null);
} else {
mMediaPlayer.setScale(0);
mMediaPlayer.setAspectRatio(!videoSwapped ? ""+displayW+":"+displayH
: ""+displayH+":"+displayW);
}
break;
}
case SURFACE_16_9:
mMediaPlayer.setAspectRatio("16:9");
mMediaPlayer.setScale(0);
break;
case SURFACE_4_3:
mMediaPlayer.setAspectRatio("4:3");
mMediaPlayer.setScale(0);
break;
case SURFACE_ORIGINAL:
mMediaPlayer.setAspectRatio(null);
mMediaPlayer.setScale(1);
break;
}
}
private void updateVideoSurfaces() {
int sw = getWindow().getDecorView().getWidth();
int sh = getWindow().getDecorView().getHeight();
// sanity check
if (sw * sh == 0) {
Log.e(TAG, "Invalid surface size");
return;
}
mMediaPlayer.getVLCVout().setWindowSize(sw, sh);
ViewGroup.LayoutParams lp = mVideoView.getLayoutParams();
if (mVideoWidth * mVideoHeight == 0) {
/* Case of OpenGL vouts: handles the placement of the video using MediaPlayer API */
lp.width = ViewGroup.LayoutParams.MATCH_PARENT;
lp.height = ViewGroup.LayoutParams.MATCH_PARENT;
mVideoView.setLayoutParams(lp);
lp = mVideoSurfaceFrame.getLayoutParams();
lp.width = ViewGroup.LayoutParams.MATCH_PARENT;
lp.height = ViewGroup.LayoutParams.MATCH_PARENT;
mVideoSurfaceFrame.setLayoutParams(lp);
changeMediaPlayerLayout(sw, sh);
return;
}
if (lp.width == lp.height && lp.width == ViewGroup.LayoutParams.MATCH_PARENT) {
/* We handle the placement of the video using Android View LayoutParams */
mMediaPlayer.setAspectRatio(null);
mMediaPlayer.setScale(0);
}
double dw = sw, dh = sh;
final boolean isPortrait = getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT;
if (sw > sh && isPortrait || sw < sh && !isPortrait) {
dw = sh;
dh = sw;
}
// compute the aspect ratio
double ar, vw;
if (mVideoSarDen == mVideoSarNum) {
/* No indication about the density, assuming 1:1 */
vw = mVideoVisibleWidth;
ar = (double)mVideoVisibleWidth / (double)mVideoVisibleHeight;
} else {
/* Use the specified aspect ratio */
vw = mVideoVisibleWidth * (double)mVideoSarNum / mVideoSarDen;
ar = vw / mVideoVisibleHeight;
}
// compute the display aspect ratio
double dar = dw / dh;
switch (CURRENT_SIZE) {
case SURFACE_BEST_FIT:
if (dar < ar)
dh = dw / ar;
else
dw = dh * ar;
break;
case SURFACE_FIT_SCREEN:
if (dar >= ar)
dh = dw / ar; /* horizontal */
else
dw = dh * ar; /* vertical */
break;
case SURFACE_FILL:
break;
case SURFACE_16_9:
ar = 16.0 / 9.0;
if (dar < ar)
dh = dw / ar;
else
dw = dh * ar;
break;
case SURFACE_4_3:
ar = 4.0 / 3.0;
if (dar < ar)
dh = dw / ar;
else
dw = dh * ar;
break;
case SURFACE_ORIGINAL:
dh = mVideoVisibleHeight;
dw = vw;
break;
}
// set display size
lp.width = (int) Math.ceil(dw * mVideoWidth / mVideoVisibleWidth);
lp.height = (int) Math.ceil(dh * mVideoHeight / mVideoVisibleHeight);
mVideoView.setLayoutParams(lp);
if (mSubtitlesSurface != null)
mSubtitlesSurface.setLayoutParams(lp);
// set frame size (crop if necessary)
lp = mVideoSurfaceFrame.getLayoutParams();
lp.width = (int) Math.floor(dw);
lp.height = (int) Math.floor(dh);
mVideoSurfaceFrame.setLayoutParams(lp);
mVideoView.invalidate();
if (mSubtitlesSurface != null)
mSubtitlesSurface.invalidate();
}
@TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR1)
@Override
public void onNewVideoLayout(IVLCVout vlcVout, int width, int height, int visibleWidth, int visibleHeight, int sarNum, int sarDen) {
mVideoWidth = width;
mVideoHeight = height;
mVideoVisibleWidth = visibleWidth;
mVideoVisibleHeight = visibleHeight;
mVideoSarNum = sarNum;
mVideoSarDen = sarDen;
updateVideoSurfaces();
}
}
This is how it's looking right now:
And this is how I'd like it to show:
I've photoshopped the second one btw.
Any help is appreciated. If you need any more data, just let me know.
I followed along with the same sample and ran into the same issues. It turns out you need to add an argument when initializing VLC.
final ArrayList<String> args = new ArrayList<>();
args.add("--vout=android-display"); // Add this line!
args.add("-vvv");
mLibVLC = new LibVLC(this, args);
mMediaPlayer = new MediaPlayer(mLibVLC);
Credit goes to Alexander Ukhov in the videolan forums for pointing this out.
Here ( https://drive.google.com/file/d/1lK_aOOYaKwMxvtpyyEoDXEFWcrDjGMyy/view?usp=sharing ) is demo source code for both RTMP and RTSP. I have personally checked it and it works; I used it for live video uploading to the server, and it uploads video as you shoot it. Downstreaming needs to be done by the back-end developer: they will just provide you a link, and you use that for downstreaming.
I've been trying to start an activity from a surface view via onTouch. However, with the code I've constructed (and I've tried all possible trial and error; I'm a novice programmer, FYI), I keep tapping the surface view and nothing happens.
I badly need some help guys, this is a school project :3
Thanks!
package com.projtimesequencesrc;
import android.content.Context;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.Rect;
import android.os.SystemClock;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.view.View;
import android.view.View.OnTouchListener;
public class SurfaceViewer extends View implements OnTouchListener {
private Context svc;
float x, y;
Bitmap bgd, main;
Bitmap dialog;
Rect dialogsrc, dialogdst;
//dog
Bitmap dog;
Rect dogsrc, dogdst;
Paint p;
//Molly
Bitmap molly;
Rect mollysrc, mollydst;
//Molly2
Bitmap youngstar;
Rect youngstarsrc, youngstardst;
//Bear
Bitmap bear;
Rect bearsrc, beardst;
Thread tbear, tdbear;
Runnable rbear, dbear;
//man1
Bitmap man1;
Rect mansrc, mandst;
// Main Character
Rect bgdst, charsrc, chardst;
Thread tstory, tdstory;
Runnable rstory, dstory;
protected void onDraw (Canvas c) {
super.onDraw(c);
bgdst.right=getWidth();
bgdst.bottom=getHeight();
c.drawBitmap(bgd, null, bgdst, p);
c.drawBitmap(main, charsrc,chardst, p);
c.drawBitmap(dog, dogsrc, dogdst, p);
c.drawBitmap(bear, bearsrc, beardst, p);
c.drawBitmap(man1, mansrc, mandst, p);
c.drawBitmap(dialog, dialogsrc, dialogdst, p);
c.drawBitmap(molly, mollysrc,mollydst, p);
c.drawBitmap(youngstar, youngstarsrc, youngstardst, p);
invalidate();
}
void sqyoungstar2() {
int count = 0;
while (true) {
youngstarsrc.left += 64;
youngstarsrc.right += 64;
if (youngstarsrc.left > (64*6)) {
youngstarsrc.left = 0;
youngstarsrc.right = 64;
}
SystemClock.sleep(200);
count++;
if (count > 38) {
break;
}
}
}
void sqyoungstar1() {
youngstarsrc.left = 0;
youngstarsrc.right = 64;
youngstarsrc.top = 64*2;
youngstarsrc.bottom = 64*3;
youngstardst.left = 280;
youngstardst.right = 330;
youngstardst.top = 200;
youngstardst.bottom = 250;
sqyoungstar2();
}
void seq4() {
// Main Character
charsrc.left = 0;
charsrc.right = 64;
charsrc.top = (64*9);
charsrc.bottom = (64*10);
mansrc.left = 0;
mansrc.right = 64;
mansrc.top = (64*9);
mansrc.bottom = (64*10);
int count = 0;
while (true) {
charsrc.left += 64;
charsrc.right += 64;
if (charsrc.left > (64*8)) {
charsrc.left = 0;
charsrc.right = 64;
}
chardst.offset(-5, 0);
SystemClock.sleep(100);
count++;
if (count > 38) {
break;
}
}
SystemClock.sleep(1000);
}
void seq3() {
// Main Character
charsrc.left = 0;
charsrc.top = (64*11);
charsrc.right= 64;
charsrc.bottom = (64*12);
SystemClock.sleep(3000);
seq4();
}
void seq2() {
// Main Character
int count = 0;
while (true) {
charsrc.left += 64;
charsrc.right += 64;
if (charsrc.left > (64 * 6)) {
charsrc.left = 0;
charsrc.right = 64;
}
chardst.offset(5, 0);
SystemClock.sleep(100);
count++;
if (count > 38) {
break;
}
}
seq3();
}
void sequence1() {
// main character
dialogsrc.left = 0;
dialogsrc.right = 536;
dialogsrc.top = 0;
dialogsrc.bottom = 120;
dialogdst.left = 100;
dialogdst.right = 736;
dialogdst.top = 50;
dialogdst.bottom = 170;
charsrc.left = 0;
charsrc.top = (64*11);
charsrc.right= 64;
charsrc.bottom = (64*12);
chardst.left = 192;
chardst.top = 255;
chardst.right= 256;
chardst.bottom = 315;
mansrc.left = 0;
mansrc.right = 64;
mansrc.top = (64*3);
mansrc.bottom = (64*4);
mandst.left = 150;
mandst.right = 214;
mandst.top = 255;
mandst.bottom = 315;
SystemClock.sleep(2000);
seq2();
}
void sqdog4() {
dogsrc.left = 0;
dogsrc.right = 80;
dogsrc.top = 0;
dogsrc.bottom = 80;
SystemClock.sleep(2000);
}
void sqdog3() {
dogsrc.left = 80;
dogsrc.right = 80*2;
dogsrc.top = 80;
dogsrc.bottom = 80*2;
SystemClock.sleep(2000);
sqdog4();
}
void sqdog2() {
int count = 0;
while (true) {
dogsrc.left +=80;
dogsrc.right +=80;
if (dogsrc.left > (80*2)) {
dogsrc.left = 0;
dogsrc.right = 80;
}
SystemClock.sleep(100);
count++;
if (count > 28) {
break;
}
}
SystemClock.sleep(5000);
// insert sequence
sqdog3();
}
void sqdog1() {
dogsrc.left = 0;
dogsrc.right = 80;
dogsrc.top = 0;
dogsrc.bottom = 80;
dogdst.left = 470;
dogdst.right = 535;
dogdst.top = 280;
dogdst.bottom = 315;
sqdog2();
}
void sqbear2() {
int count = 0;
while (true) {
bearsrc.left+= 56;
bearsrc.right+=56;
if (bearsrc.left > 56) {
bearsrc.left = 0;
bearsrc.right = 56;
}
SystemClock.sleep(100);
count++;
if (count > 14) {
break;
}
}
SystemClock.sleep(2000);
}
void sqbear1() {
bearsrc.left = 0;
bearsrc.right = 56;
bearsrc.top = (56*4);
bearsrc.bottom = (56*5);
beardst.left = 600;
beardst.right = 656;
beardst.top = 260;
beardst.bottom = 316;
sqbear2();
}
public SurfaceViewer(Context context, AttributeSet attrs) {
super(context, attrs);
bgdst = new Rect();
charsrc = new Rect();
chardst = new Rect();
dogsrc = new Rect();
dogdst = new Rect();
bearsrc = new Rect();
beardst = new Rect();
dialogsrc = new Rect();
dialogdst = new Rect();
mansrc = new Rect();
mandst = new Rect();
mollysrc = new Rect();
mollydst = new Rect();
youngstarsrc = new Rect();
youngstardst = new Rect();
// TODO Auto-generated constructor stub
main = BitmapFactory.decodeResource(getResources(),
R.drawable.moviemysterycharacter);
bgd = BitmapFactory.decodeResource(getResources(),
R.drawable.background);
dog = BitmapFactory.decodeResource(getResources(),
R.drawable.dog);
bear = BitmapFactory.decodeResource(getResources(),
R.drawable.differentbears);
man1 = BitmapFactory.decodeResource(getResources(), R.drawable.man1);
dialog = BitmapFactory.decodeResource(getResources(), R.drawable.dialog1);
molly = BitmapFactory.decodeResource(getResources(), R.drawable.auntmollu);
youngstar = BitmapFactory.decodeResource(getResources(), R.drawable.auntmollu);
p = new Paint();
rstory = new Runnable() {
public void run() {
while(true) {
sequence1();
}
}
};
tstory = new Thread(rstory);
tstory.start();
dstory = new Runnable() {
public void run() {
while(true) {
sqdog1();
}
}
};
tdstory = new Thread(dstory);
tdstory.start();
dbear = new Runnable() {
public void run() {
while(true) {
sqbear1();
}
}
};
tbear = new Thread(dbear);
tbear.start();
dbear = new Runnable() {
public void run() {
while (true) {
sqyoungstar1();
}
}
};
tbear = new Thread(dbear);
tbear.start();
}
@Override
public boolean onTouch(View v, MotionEvent me) {
// TODO Auto-generated method stub
Intent i = new Intent(svc, SecondAct.class);
switch (me.getAction()) {
case MotionEvent.ACTION_DOWN:
svc.startActivity(i);
break;
case MotionEvent.ACTION_UP:
svc.startActivity(i);
break;
}
return true;
}
}
Try to use getContext() instead of svc.
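A minimal sketch of that change (assuming SecondAct is declared in the manifest). Note also that onTouch() will only fire if the view is actually registered as a touch listener, e.g. setOnTouchListener(this) in the constructor, which the posted code doesn't show:
@Override
public boolean onTouch(View v, MotionEvent me) {
    if (me.getAction() == MotionEvent.ACTION_DOWN) {
        // Use the View's own context instead of the uninitialized svc field.
        Intent i = new Intent(getContext(), SecondAct.class);
        getContext().startActivity(i);
    }
    return true;
}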
I checked the reference to obtain the screen height and width. My mobile phone's height and width are as below:
Point p = new Point();
getWindowManager().getDefaultDisplay().getSize(p);
p.x //width 540
p.y //height 960
I wrote a demo, but the values obtained are not right.
The demo is like this:
import android.graphics.Point;
public class Strike {
private StrikeView view;
private int left = 0;
private int top = 0;
private Point win;
private boolean isDown = true;
private boolean isRight = true;
public Strike(StrikeView view, Point win) {
this.view = view;
this.win = win;
this.left = view.getLeft();
this.top = view.getTop();
}
/**
* Set direction and refresh
*/
public void setPostion() {
view.setLeft(left);
view.setTop(top);
view.invalidate();
}
/**
* calculate view direction
*/
public void postion() {
if (isRight) {
if (left + 30 < win.x)
left++;
else {
isRight = false;
left--;
}
} else {
if (left > 0)
left--;
else {
isRight = true;
left++;
}
}
if (isDown) {
if (top + 30 < win.y)
top++;
else {
isDown = false;
top--;
}
} else {
if (top > 0)
top--;
else {
isDown = true;
top++;
}
}
}
}
Here is what I used (tested on Android 2.1+):
private void size() {
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.HONEYCOMB_MR2) {
size_old(getWindowManager().getDefaultDisplay());
}
else {
size_new(getWindowManager().getDefaultDisplay());
}
}
@TargetApi(Build.VERSION_CODES.HONEYCOMB_MR2)
private void size_new(Display display) {
Point point = new Point();
display.getSize(point);
// point.x and point.y
}
@SuppressWarnings("deprecation")
private void size_old(Display display) {
// display.getWidth() and display.getHeight()
}
To get the size of any phone, it is worth using a method like this:
DisplayMetrics displaymetrics = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getMetrics(displaymetrics);
int height = displaymetrics.heightPixels;
int width = displaymetrics.widthPixels;
Use this:
DisplayMetrics metrics = context.getResources().getDisplayMetrics();
int width = metrics.widthPixels;
int height = metrics.heightPixels;
I have an app that processes a bitmap with a spherize distortion. You can touch the screen and set the radius of a circle that will contain the distortion. Once the distort button is pressed, a subset bitmap is created the same size as the radius and sent for processing. Once the subset is distorted it is put back on the original bitmap as an overlay, using the x,y coords from the original touch event.
Everything works fine except that the last row of pixels (across the bottom) of the subset bitmap is not populated with pixel data, so it looks like there is a black line at the bottom of the subset bitmap. The distortion class uses parallel processing: it checks the hardware at runtime to find out how many processors are available and then splits the bitmap up across the processors accordingly. I've had help with the parallelization, and I'm not sure how to find out why the black line is present. The looping seems to be in order. Any ideas? Thanks in advance, Matt.
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.FutureTask;
import android.graphics.Bitmap;
import android.os.Debug;
import android.util.Log;
public class MultiRuntimeProcessorFilter {
private static final String TAG = "mrpf";
private int x = 0;
private Bitmap input = null;
private int radius;
public void createBitmapSections(int nOp, int[] sections){
int processors = nOp;
int jMax = input.getHeight();
int aSectionSize = (int) Math.ceil(jMax/processors);
Log.e(TAG, "++++++++++ sections size = "+aSectionSize);
int k = 0;
for(int h=0; h<processors+1; h++){
sections[h] = k;
k+= aSectionSize;
}
}// end of createBitmapSections()
@SuppressWarnings("unchecked")
public Bitmap barrel (Bitmap input, float k, int r){
this.radius = r;
this.input = input;
int []arr = new int[input.getWidth()*input.getHeight()];
int nrOfProcessors = Runtime.getRuntime().availableProcessors();
int[] sections = new int[nrOfProcessors+1];
createBitmapSections(nrOfProcessors,sections);
ExecutorService threadPool = Executors.newFixedThreadPool(nrOfProcessors);
for(int g=0; g<sections.length;g++){
Log.e(TAG, "++++++++++ sections= "+sections[g]);
}
// ExecutorService threadPool = Executors.newFixedThreadPool(nrOfProcessors);
Object[] task = new Object[nrOfProcessors];
for(int z = 0; z < nrOfProcessors; z++){
task[z] = (FutureTask<PartialResult>) threadPool.submit(new PartialProcessing(sections[z], sections[z+1] - 1, input, k));
Log.e(TAG, "++++++++++ task"+z+"= "+task[z].toString());
}
PartialResult[] results = new PartialResult[nrOfProcessors];
try{
for(int t = 0; t < nrOfProcessors; t++){
results[t] = ((FutureTask<PartialResult>) task[t]).get();
results[t].fill(arr);
}
}catch(Exception e){
e.printStackTrace();
}
Bitmap dst2 = Bitmap.createBitmap(arr,input.getWidth(),input.getHeight(),input.getConfig());
return dst2;
}//end of barrel()
public class PartialResult {
int startP;
int endP;
int[] storedValues;
public PartialResult(int startp, int endp, Bitmap input){
this.startP = startp;
this.endP = endp;
this.storedValues = new int[input.getWidth()*input.getHeight()];
}
public void addValue(int p, int result) {
storedValues[p] = result;
}
public void fill(int[] arr) {
for (int p = startP; p < endP; p++){
for(int b=0;b<radius;b++,x++)
arr[x] = storedValues[x];
}
Log.e(TAG, "++++++++++ x ="+x);
}
}//end of partialResult
public class PartialProcessing implements Callable<PartialResult> {
int startJ;
int endJ;
private int[] scalar;
private float xscale;
private float yscale;
private float xshift;
private float yshift;
private float thresh = 1;
private int [] s1;
private int [] s2;
private int [] s3;
private int [] s4;
private int [] s;
private Bitmap input;
private float k;
public PartialProcessing(int startj, int endj, Bitmap input, float k) {
this.startJ = startj;
this.endJ = endj;
this.input = input;
this.k = k;
s = new int[4];
scalar = new int[4];
s1 = new int[4];
s2 = new int[4];
s3 = new int[4];
s4 = new int[4];
}
int [] getARGB(Bitmap buf,int x, int y){
int rgb = buf.getPixel(y, x); // Returns by default ARGB.
// int [] scalar = new int[4];
// scalar[0] = (rgb >>> 24) & 0xFF;
scalar[1] = (rgb >>> 16) & 0xFF;
scalar[2] = (rgb >>> 8) & 0xFF;
scalar[3] = (rgb >>> 0) & 0xFF;
return scalar;
}
float getRadialX(float x,float y,float cx,float cy,float k){
x = (x*xscale+xshift);
y = (y*yscale+yshift);
float res = x+((x-cx)*k*((x-cx)*(x-cx)+(y-cy)*(y-cy)));
return res;
}
float getRadialY(float x,float y,float cx,float cy,float k){
x = (x*xscale+xshift);
y = (y*yscale+yshift);
float res = y+((y-cy)*k*((x-cx)*(x-cx)+(y-cy)*(y-cy)));
return res;
}
float calc_shift(float x1,float x2,float cx,float k){
float x3 = (float)(x1+(x2-x1)*0.5);
float res1 = x1+((x1-cx)*k*((x1-cx)*(x1-cx)));
float res3 = x3+((x3-cx)*k*((x3-cx)*(x3-cx)));
if(res1>-thresh && res1 < thresh)
return x1;
if(res3<0){
return calc_shift(x3,x2,cx,k);
}
else{
return calc_shift(x1,x3,cx,k);
}
}
void sampleImage(Bitmap arr, float idx0, float idx1)
{
// s = new int [4];
if(idx0<0 || idx1<0 || idx0>(arr.getHeight()-1) || idx1>(arr.getWidth()-1)){
s[0]=0;
s[1]=0;
s[2]=0;
s[3]=0;
return;
}
float idx0_fl=(float) Math.floor(idx0);
float idx0_cl=(float) Math.ceil(idx0);
float idx1_fl=(float) Math.floor(idx1);
float idx1_cl=(float) Math.ceil(idx1);
s1 = getARGB(arr,(int)idx0_fl,(int)idx1_fl);
s2 = getARGB(arr,(int)idx0_fl,(int)idx1_cl);
s3 = getARGB(arr,(int)idx0_cl,(int)idx1_cl);
s4 = getARGB(arr,(int)idx0_cl,(int)idx1_fl);
float x = idx0 - idx0_fl;
float y = idx1 - idx1_fl;
// s[0]= (int) (s1[0]*(1-x)*(1-y) + s2[0]*(1-x)*y + s3[0]*x*y + s4[0]*x*(1-y));
s[1]= (int) (s1[1]*(1-x)*(1-y) + s2[1]*(1-x)*y + s3[1]*x*y + s4[1]*x*(1-y));
s[2]= (int) (s1[2]*(1-x)*(1-y) + s2[2]*(1-x)*y + s3[2]*x*y + s4[2]*x*(1-y));
s[3]= (int) (s1[3]*(1-x)*(1-y) + s2[3]*(1-x)*y + s3[3]*x*y + s4[3]*x*(1-y));
}
@Override public PartialResult call() {
PartialResult partialResult = new PartialResult(startJ, endJ,input);
float centerX=input.getWidth()/2; //center of distortion
float centerY=input.getHeight()/2;
int width = input.getWidth(); //image bounds
int height = input.getHeight();
xshift = calc_shift(0,centerX-1,centerX,k);
float newcenterX = width-centerX;
float xshift_2 = calc_shift(0,newcenterX-1,newcenterX,k);
yshift = calc_shift(0,centerY-1,centerY,k);
float newcenterY = height-centerY;
float yshift_2 = calc_shift(0,newcenterY-1,newcenterY,k);
xscale = (width-xshift-xshift_2)/width;
yscale = (height-yshift-yshift_2)/height;
int p = startJ*radius;
int origPixel = 0;
int color = 0;
int i;
for (int j = startJ; j < endJ; j++){
for ( i = 0; i < width; i++, p++){
origPixel = input.getPixel(i,j);
float x = getRadialX((float)j,(float)i,centerX,centerY,k);
float y = getRadialY((float)j,(float)i,centerX,centerY,k);
sampleImage(input,x,y);
color = ((s[1]&0x0ff)<<16)|((s[2]&0x0ff)<<8)|(s[3]&0x0ff);
//Log.e(TAG, "radius = "+radius);
if(((i-centerX)*(i-centerX) + (j-centerY)*(j-centerY)) <= radius*(radius/4)){
partialResult.addValue(p, color);
}else{
partialResult.addValue(p, origPixel);
}
}//end of inner for
}//end of outer for
return partialResult;
}//end of call
}// end of partialprocessing
}//end of MultiProcesorFilter
[update] I'll post the view class that calls the barrel method. This class gets the touch events and sets the radius of the distortion prior to processing, so you can see how everything is set up before the distortion is applied.
public class TouchView extends View{
private File tempFile;
private byte[] imageArray;
private Bitmap bgr;
private Bitmap crop;
private Bitmap crop2;
private Bitmap overLay;
private Bitmap overLay2;
private Paint pTouch;
private float centreX;
private float centreY;
private float centreA;
private float centreB;
private Boolean xyFound = false;
private Boolean abFound = false;
private int Progress = 1;
private static final String TAG = "*********TouchView";
private Filters f = null;
private Filters f2 = null;
private boolean bothCirclesInPlace = false;
private MultiProcessorFilter mpf;
private MultiProcessorFilter mpf2;
private MultiRuntimeProcessorFilter mrpf;
private MultiRuntimeProcessorFilter mrpf2;
private int radius = 50;
protected boolean isLocked = false;
protected boolean isSaved = false;
protected byte [] data;
private float distance1;
private float distance2;
public TouchView(Context context) {
super(context);
}
public TouchView(Context context, AttributeSet attr) {
super(context,attr);
Log.e(TAG, "++++++++++ inside touchview constructor");
tempFile = new File(Environment.getExternalStorageDirectory().
getAbsolutePath() + "/"+"image.jpeg");
imageArray = new byte[(int)tempFile.length()];
// new Thread(new Runnable() {
// public void run() {
try{
InputStream is = new FileInputStream(tempFile);
BufferedInputStream bis = new BufferedInputStream(is);
DataInputStream dis = new DataInputStream(bis);
int i = 0;
while (dis.available() > 0 ) {
imageArray[i] = dis.readByte();
i++;
}
dis.close();
} catch (Exception e) {
e.printStackTrace();
}
// }
// }).start();
Bitmap bm = BitmapFactory.decodeByteArray(imageArray, 0, imageArray.length);
if(bm == null){
Log.e(TAG, "bm = null");
}else{
Log.e(TAG, "bm = not null");
}
bgr = bm.copy(bm.getConfig(), true);
overLay = null;
overLay2 = null;
bm.recycle();
pTouch = new Paint(Paint.ANTI_ALIAS_FLAG);
// pTouch.setXfermode(new PorterDuffXfermode(Mode.SRC_OUT));
pTouch.setColor(Color.RED);
pTouch.setStyle(Paint.Style.STROKE);
}// end of touchView constructor
public void findCirclePixels(){
//f = new Filters();
// f2 = new Filters();
//mpf = new MultiProcessorFilter();
//mpf2 = new MultiProcessorFilter();
mrpf = new MultiRuntimeProcessorFilter();
mrpf2 = new MultiRuntimeProcessorFilter();
crop = Bitmap.createBitmap(bgr,Math.max((int)centreX-radius,0),Math.max((int)centreY-radius,0),radius*2,radius*2);
crop2 = Bitmap.createBitmap(bgr,Math.max((int)centreA-radius,0),Math.max((int)centreB-radius,0),radius*2,radius*2);
new Thread(new Runnable() {
public void run() {
float prog = (float)Progress/150001;
// final Bitmap bgr3 = f.barrel(crop,prog);
// final Bitmap bgr4 = f2.barrel(crop2,prog);
//final Bitmap bgr3 = mpf.barrel(crop,prog);
// final Bitmap bgr4 = mpf2.barrel(crop2,prog);
final Bitmap bgr3 = mrpf.barrel(crop,prog,radius*2);
final Bitmap bgr4 = mrpf2.barrel(crop2,prog, radius*2);
TouchView.this.post(new Runnable() {
public void run() {
TouchView.this.overLay = bgr3;
TouchView.this.overLay2 = bgr4;
TouchView.this.invalidate();
}
});
}
}).start();
}// end of findCirclePixels()
@Override
public boolean onTouchEvent(MotionEvent ev) {
switch (ev.getAction()) {
case MotionEvent.ACTION_DOWN: {
int w = getResources().getDisplayMetrics().widthPixels;
int h = getResources().getDisplayMetrics().heightPixels;
if(ev.getX() <radius || ev.getX() > w - radius ){
// Log.e(TAG, "touch event is too near width edge!!!!!!!!!!");
showToastMessage("You touched too near the screen edge");
break;
}
if(ev.getY() <radius || ev.getY() > h - radius ){
// Log.e(TAG, "touch event is too near height edge!!!!!!!!!!");
showToastMessage("You touched too near the screen edge");
break;
}
distance1 = (float) Math.sqrt(Math.pow(ev.getX() - centreX, 2.0) + Math.pow(ev.getY() - centreY, 2.0));
distance2 = (float) Math.sqrt(Math.pow(ev.getX() - centreA, 2.0) + Math.pow(ev.getY() - centreB, 2.0));
Log.e(TAG, "dist1 = "+distance1 +" distance2 = " + distance2);
if(isLocked == false){
if(abFound == false){
centreA = (int) ev.getX();
centreB = (int) ev.getY();
abFound = true;
invalidate();
}
if(xyFound == false){
centreX = (int) ev.getX();
centreY = (int) ev.getY();
xyFound = true;
invalidate();
}
if(abFound == true && xyFound == true){
bothCirclesInPlace = true;
}
break;
}
}
case MotionEvent.ACTION_MOVE: {
if(isLocked == false){
/*if(xyFound == false){
centreX = (int) ev.getX()-70;
centreY = (int) ev.getY()-70;
xyFound = true;
}else{
centreA = (int) ev.getX()-70;
centreB = (int) ev.getY()-70;
bothCirclesInPlace = true;
invalidate();
}
*/
if(distance1 < distance2){
centreX = (int) ev.getX();
centreY = (int) ev.getY();
xyFound = true;
invalidate();
}else{
centreA = (int) ev.getX();
centreB = (int) ev.getY();
bothCirclesInPlace = true;
invalidate();
}
break;
}
}
case MotionEvent.ACTION_UP:
break;
}
return true;
}//end of onTouchEvent
public void initSlider(final HorizontalSlider slider)
{
slider.setOnProgressChangeListener(changeListener);
}
private OnProgressChangeListener changeListener = new OnProgressChangeListener() {
@Override
public void onProgressChanged(View v, int progress) {
if(isLocked == true){
setProgress(progress);
}else{
Toast.makeText(TouchView.this.getContext(), "press lock before applying distortion ", Toast.LENGTH_SHORT).show();
}
}
};
@Override
public void onDraw(Canvas canvas){
super.onDraw(canvas);
Log.e(TAG, "******about to draw bgr ");
canvas.drawBitmap(bgr, 0, 0, null);
if(isSaved == false){
if (isLocked == true && bothCirclesInPlace == true){
if(overLay != null)
canvas.drawBitmap(overLay, centreX-radius, centreY-radius, null);
if(overLay2 != null)
canvas.drawBitmap(overLay2, centreA-radius, centreB-radius, null);
}
if(bothCirclesInPlace == true && isLocked == false){
canvas.drawCircle(centreX, centreY, radius,pTouch);
canvas.drawCircle(centreA, centreB, radius,pTouch);
}
}else{
// String mFilePath : Absolute Path of the file to be saved
// Bitmap mBitmap1 : First bitmap. This goes as background.
// Bitmap mCBitmap : Bitmap associated with the Canvas. All draws on the canvas are drawn into this bitmap.
// Bitmap mBitmap2 : Second bitmap. This goes on top of first (in this example serves as foreground.
// Paint mPaint1 : Paint to draw first bitmap
// Paint mPaint2 : Paint to draw second bitmap on top of first bitmap
isSaved = false;
Bitmap mCBitmap = Bitmap.createBitmap(bgr.getWidth(), bgr.getHeight(), bgr.getConfig());
Canvas tCanvas = new Canvas(mCBitmap);
tCanvas.drawBitmap(bgr, 0, 0, null);
if(overLay != null)
tCanvas.drawBitmap(overLay, centreX-radius, centreY-radius, null);
if(overLay2 != null)
tCanvas.drawBitmap(overLay2, centreA-radius, centreB-radius, null);
canvas.drawBitmap(mCBitmap, 0, 0, null);
ByteArrayOutputStream bos = new ByteArrayOutputStream();
mCBitmap.compress(CompressFormat.JPEG, 100 /*ignored for PNG*/, bos);
data = bos.toByteArray();
try {
bos.flush();
bos.close();
} catch (IOException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
}
try {
bos.flush();
bos.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
if ( data == null){
Log.e(TAG, "data in touchview before save clicked is null");
}else{
Log.e(TAG, "data in touchview before saved clicked is not null");
}
}
}//end of onDraw
protected void setProgress(int progress2) {
Log.e(TAG, "***********in SETPROGRESS");
this.Progress = progress2;
findCirclePixels();
}
public int getRadius() {
return radius;
}
public void setRadius(int r) {
radius = r;
invalidate();
}
public void showToastMessage(String mess){
Toast.makeText(TouchView.this.getContext(), mess.toString(), Toast.LENGTH_SHORT).show();
}
}
My guess would be that when the bottom of the image is processed, the operation acts partially on the input image and partially outside of it, because of the radius used in your barrel method. Edges often cause issues when an operation reads outside the bounds of the actual image, returning 0, which can produce a black line...
I suggest trying to increase the size of your image:
@SuppressWarnings("unchecked")
public Bitmap barrel (Bitmap input, float k, int r){
this.radius = r;
this.input = input;
// Add an offset to the width and height equal to the radius
// To avoid performing processing outside the bounds of the input image
int []arr = new int[(input.getWidth() + this.radius) * (input.getHeight() + this.radius)];
// Continue...
Again, this is only my first guess and I have no time to check right now, but investigating the edges first would be my recommendation.
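One way to investigate (a hypothetical diagnostic of my own, not part of your original code) is to make the out-of-bounds branch of sampleImage stand out visually, so you can see whether the black line really comes from samples taken outside the bitmap:
void sampleImage(Bitmap arr, float idx0, float idx1) {
    if (idx0 < 0 || idx1 < 0 || idx0 > (arr.getHeight() - 1) || idx1 > (arr.getWidth() - 1)) {
        // Diagnostic only: out-of-bounds samples become bright magenta instead of black,
        // so they are easy to spot in the distorted overlay.
        s[1] = 255; // red
        s[2] = 0;   // green
        s[3] = 255; // blue
        return;
    }
    // ... keep the rest of the original bilinear interpolation unchanged ...
}
If the bottom row of the overlay turns magenta, then the last row of the subset really is being sampled from outside the image.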
Just a guess: what happens if you put this?
BitmapDrawable bmpd = new BitmapDrawable(input);
int []arr = new int[(bmpd.getIntrinsicWidth() + this.radius) * (bmpd.getIntrinsicHeight() + this.radius)];
Your problem most likely has to do with your assumed coordinate system of the image and of the spherize algorithm.
See MathWorks Image Coordinate Systems
I expect that you are treating your input/output images according to the Pixel Indices method, but the spherize algorithm is processing your data using the Spatial Coordinate System. This often causes the outermost border of a processed image to be blank because the algorithm has translated your image up and to the left by 0.5 pixels. Coordinate 3 in the original system is now 3.5 in the new system and has fallen outside the bounds of computation.
This is actually a huge problem in 2D to 3D image processing algorithms as the projection between the two spaces is not exactly trivial and tiny implementation differences cause noticeable problems. Notice how the Pixel Indices coordinate system is 3x3, but the Spatial Coordinate system is essentially 4x4.
Try setting your spherize barrel to be width+1/height+1 instead of width/height and see if that fills out your missing row.
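For example, a minimal sketch of where that +1 might go inside call() (my own guess, untested; only the distortion math is widened, the loops still stay within the real bitmap):
@Override public PartialResult call() {
    PartialResult partialResult = new PartialResult(startJ, endJ, input);
    // Hypothetical tweak: run the distortion math on a (width+1) x (height+1) grid
    // to compensate for the half-pixel shift between the two coordinate systems.
    int width = input.getWidth() + 1;
    int height = input.getHeight() + 1;
    float centerX = width / 2f;   // centre of distortion
    float centerY = height / 2f;
    xshift = calc_shift(0, centerX - 1, centerX, k);
    xscale = (width - xshift - calc_shift(0, (width - centerX) - 1, width - centerX, k)) / width;
    yshift = calc_shift(0, centerY - 1, centerY, k);
    yscale = (height - yshift - calc_shift(0, (height - centerY) - 1, height - centerY, k)) / height;
    // ... the rest of the original loops are unchanged and still iterate over
    // i < input.getWidth() and j in [startJ, endJ), so no extra pixels are read ...
    return partialResult;
}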