Control DJI Mavic Enterprise via virtual sticks (Android)

I am trying to control the drone with the sendVirtualStickFlightControlData command.
To move the drone left I use the following code.
sendVirtualStickDataTask = new SendVirtualStickDataTask(0, -5, 0, 0);
sendVirtualStickDataTimer = new Timer();
sendVirtualStickDataTimer.schedule(sendVirtualStickDataTask, 100, 200);
private class SendVirtualStickDataTask extends TimerTask {
    private float pitch;
    private float roll;
    private float yaw;
    private float throttle;
    private long startTime = System.currentTimeMillis();

    public SendVirtualStickDataTask(float inputPitch, float inputRoll, float inputYaw, float inputThrottle) {
        pitch = inputPitch;
        roll = inputRoll;
        yaw = inputYaw;
        throttle = inputThrottle;
    }

    @Override
    public void run() {
        if (System.currentTimeMillis() - startTime > 300) {
            mFlightController.sendVirtualStickFlightControlData(new FlightControlData(0, 0, 0, 0),
                    new CommonCallbacks.CompletionCallback() {
                        @Override
                        public void onResult(final DJIError djiError) {
                        }
                    });
            cancel();
        } else {
            mFlightController.sendVirtualStickFlightControlData(
                    new FlightControlData(pitch, roll, yaw, throttle),
                    new CommonCallbacks.CompletionCallback() {
                        @Override
                        public void onResult(final DJIError djiError) {
                        }
                    });
        }
    }
}
However, the drone moves to the left and then drops down sharply.
What is the reason for this behavior?

I have no clue why it is dropping sharply downwards, since you are not changing the throttle value. Make sure to enable virtual sticks before use; it might also have to do with needing to set the maximum values. This code works fine for me:
/* Controls virtual stick commands and triggers drone movement. */
private void sendVirtualStickCommands(final float pX, final float pY, final float pZ, final float pYaw) {
    // maximum amounts
    float verticalJoyControlMaxSpeed = 2;
    float yawJoyControlMaxSpeed = 30;
    float pitchJoyControlMaxSpeed = 10;
    float rollJoyControlMaxSpeed = 10;

    // set yaw, pitch, throttle, roll
    float mYaw = (float) (yawJoyControlMaxSpeed * pYaw);
    float mThrottle = (float) (verticalJoyControlMaxSpeed * pZ);
    float mPitch = (float) (pitchJoyControlMaxSpeed * pX);
    float mRoll = (float) (rollJoyControlMaxSpeed * pY);

    if (mFlightController != null) {
        // if virtual sticks are enabled, send the command; otherwise turn them on
        if (virtualSticksEnabled) {
            mFlightController.sendVirtualStickFlightControlData(
                    new FlightControlData(mPitch, mRoll, mYaw, mThrottle),
                    new CommonCallbacks.CompletionCallback() {
                        @Override
                        public void onResult(DJIError djiError) {
                            if (djiError != null) {
                                setResultToToast(djiError.getDescription());
                            }
                        }
                    });
        } else {
            setResultToToast("flight controller virtual mode off");
            // if not enabled, enable
            mFlightController.setVirtualStickModeEnabled(true, new CommonCallbacks.CompletionCallback() {
                @Override
                public void onResult(DJIError djiError) {
                    if (djiError != null) {
                        setResultToToast(djiError.getDescription());
                    } else {
                        setResultToToast("Enable Virtual Stick Success");
                        virtualSticksEnabled = true;
                        sendVirtualStickCommands(pX, pY, pZ, pYaw);
                    }
                }
            });
        }
    } else {
        setResultToToast("Flight Controller Null");
    }
}
Then, to move in a direction:
sendVirtualStickCommands(0.1f, 0, 0, 0);//move right
sendVirtualStickCommands(-0.1f, 0, 0, 0);//move left
sendVirtualStickCommands(0, 0.1f, 0, 0);//move forward
sendVirtualStickCommands(0, -0.1f, 0, 0);//move backwards
sendVirtualStickCommands(0, 0, 0.1f, 0);//move upwards
sendVirtualStickCommands(0, 0, -0.1f, 0);//move downwards
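For reference, the scaling step inside sendVirtualStickCommands can be pulled out as a plain function. This is only a sketch with illustrative names; a float array stands in for the SDK's FlightControlData:

```java
class StickScaling {
    /**
     * Mirrors the scaling in sendVirtualStickCommands: normalized joystick
     * input in [-1, 1] multiplied by the per-axis maximums used above.
     * Returns {pitch, roll, yaw, throttle}.
     */
    static float[] scaleSticks(float pX, float pY, float pZ, float pYaw) {
        final float verticalJoyControlMaxSpeed = 2f;  // m/s
        final float yawJoyControlMaxSpeed = 30f;      // deg/s
        final float pitchJoyControlMaxSpeed = 10f;    // m/s
        final float rollJoyControlMaxSpeed = 10f;     // m/s
        return new float[] {
                pitchJoyControlMaxSpeed * pX,
                rollJoyControlMaxSpeed * pY,
                yawJoyControlMaxSpeed * pYaw,
                verticalJoyControlMaxSpeed * pZ
        };
    }
}
```

So sendVirtualStickCommands(0.1f, 0, 0, 0) ends up sending a pitch value of 1 m/s.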

Try setting the control modes explicitly:
mFlightController.setRollPitchControlMode(RollPitchControlMode.VELOCITY);
mFlightController.setYawControlMode(YawControlMode.ANGULAR_VELOCITY);
mFlightController.setVerticalControlMode(VerticalControlMode.POSITION);
mFlightController.setRollPitchCoordinateSystem(FlightCoordinateSystem.BODY);
and set mThrottle >= 0.0f.
In VerticalControlMode.POSITION, mThrottle is the flight height you want, in meters above the ground.
For example, mThrottle = 2.0f means move the drone to 2 meters above the ground. Set the height you want and the drone will rise to it, or "drop down" if mThrottle = 0f.
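In other words, in POSITION mode the throttle field should be clamped at ground level before sending. A tiny hypothetical helper (not an SDK call) makes the rule explicit:

```java
class VerticalPosition {
    /**
     * In VerticalControlMode.POSITION the throttle field is an absolute
     * target height in meters above the ground, so it must never be
     * negative; 0f means "descend to ground level".
     */
    static float positionModeThrottle(float targetHeightMeters) {
        return Math.max(0.0f, targetHeightMeters);
    }
}
```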

Related

Align a rotated Sprite with a DynamicBody (libGDX)

I have a rectangular sprite that rotates around its central point. I used:
setOriginCenter();
setRotation(angleDegrees);
That works fine.
I defined a DynamicBody (box2d) using a CircleShape.
In the sprite's update method, I want to align the center of the (rotated) sprite with the center of the DynamicBody.
I tried several methods but I can't find a solution:
This doesn't work
Rectangle rect = myRotatedSprite.getBoundingRectangle();
myRotatedSprite.setPosition(b2body.getPosition().x - rect.getWidth() / 2, b2body.getPosition().y - rect.getHeight() / 2);
This doesn't work
myRotatedSprite.setPosition(b2body.getPosition().x - myRotatedSprite.getWidth() / 2, b2body.getPosition().y - myRotatedSprite.getHeight() / 2);
This doesn't work
float rot = myRotatedSprite.getRotation();
myRotatedSprite.setRotation(0);
Rectangle rect = getBoundingRectangle();
myRotatedSprite.setPosition(b2body.getPosition().x - rect.width / 2,
b2body.getPosition().y - rect.height / 2);
myRotatedSprite.setRotation(rot);
myRotatedSprite.getBoundingRectangle();
What am I doing wrong? It seems to me that getBoundingRectangle() is not correct when the sprite is rotated...
Here is my full code:
public class HeroBullet extends Weapon {
private static final String TAG = HeroBullet.class.getName();
private float stateTimer;
private Animation heroBulletAnimation;
private Vector2 tmp; // Temp GC friendly vector
public HeroBullet(PlayScreen screen, float x, float y, float width, float height, float circleShapeRadius, float angle, Animation animation) {
super(screen, x, y, circleShapeRadius > 0 ? circleShapeRadius : Constants.HEROBULLET_CIRCLESHAPE_RADIUS_METERS);
setOriginCenter();
width = width > 0 ? width : Constants.HEROBULLET_WIDTH_METERS;
height = height > 0 ? height : Constants.HEROBULLET_HEIGHT_METERS;
setBounds(getX(), getY(), width, height);
velocity = new Vector2(Constants.HEROBULLET_VELOCITY_X, Constants.HEROBULLET_VELOCITY_Y);
if (angle > 0) {
velocity.rotate(angle);
setRotation(angle);
}
if (animation != null) {
heroBulletAnimation = animation;
} else {
heroBulletAnimation = Assets.instance.heroBullet.heroBulletAnimation;
}
stateTimer = 0;
currentState = State.SHOT;
AudioManager.instance.play(Assets.instance.sounds.heroShoot, 0.2f, MathUtils.random(1.0f, 1.1f));
tmp = new Vector2();
}
@Override
protected void defineWeapon() {
BodyDef bdef = new BodyDef();
bdef.position.set(getX(), getY()); // In b2box the origin is at the center of the body
bdef.type = BodyDef.BodyType.DynamicBody;
b2body = world.createBody(bdef);
FixtureDef fdef = new FixtureDef();
CircleShape shape = new CircleShape();
shape.setRadius(circleShapeRadius);
fdef.filter.categoryBits = Constants.HERO_WEAPON_BIT; // Depicts what this fixture is
fdef.filter.maskBits = Constants.BORDERS_BIT |
Constants.OBSTACLE_BIT |
Constants.POWERBOX_BIT |
Constants.FINAL_ENEMY_LEVEL_ONE_BIT |
Constants.ENEMY_BIT; // Depicts what this Fixture can collide with
fdef.shape = shape;
b2body.createFixture(fdef).setUserData(this);
}
@Override
public void update(float dt) {
switch (currentState) {
case SHOT:
stateShot(dt);
break;
case ONTARGET:
stateOnTarget();
break;
case FINISHED:
break;
default:
break;
}
super.checkBoundaries();
}
private void stateShot(float dt) {
b2body.setLinearVelocity(velocity);
/* ***********
Here I want to center the sprite (rotated sprite) with the center of the DynamicBody, but I don't know how
******** */
setRegion((TextureRegion) heroBulletAnimation.getKeyFrame(stateTimer, true));
stateTimer += dt;
}
private void stateOnTarget() {
world.destroyBody(b2body);
currentState = State.FINISHED;
}
@Override
public void renderDebug(ShapeRenderer shapeRenderer) {
shapeRenderer.rect(getBoundingRectangle().x, getBoundingRectangle().y, getBoundingRectangle().width, getBoundingRectangle().height);
}
@Override
public void onTarget() {
currentState = State.ONTARGET;
}
public void draw(Batch batch) {
if (currentState == State.SHOT) {
super.draw(batch);
}
}
}
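For reference, the centering math itself is rotation-independent when the origin is the sprite's center, while the rotated bounding rectangle is not. A minimal sketch of both facts, with no libGDX types (the names are mine):

```java
class SpriteCentering {
    /**
     * Position to pass to setPosition() so a w x h sprite's center lands on
     * (cx, cy). With setOriginCenter(), libGDX rotates around the center
     * after positioning, so the result is independent of the rotation angle.
     */
    static float[] centeredPosition(float cx, float cy, float w, float h) {
        return new float[] {cx - w / 2f, cy - h / 2f};
    }

    /**
     * Width of the axis-aligned bounding box of a w x h rectangle rotated by
     * angleDeg -- what getBoundingRectangle() reports for a rotated sprite,
     * and why centering on that rectangle over-shifts the sprite.
     */
    static float rotatedBoundsWidth(float w, float h, float angleDeg) {
        double r = Math.toRadians(angleDeg);
        return (float) (w * Math.abs(Math.cos(r)) + h * Math.abs(Math.sin(r)));
    }
}
```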

Project Tango: load ADF and render object in a Rajawali3D surface

I am trying to load the latest ADF and localize against it. If the device is localized, that is shown in a TextView.
Once localized, clicking the Save button makes an EditText appear and changes the button text to "save Object"; clicking it again toasts the entered text.
I am using a Rajawali 3D surface view to render the video.
But on running it my screen is black; only the TextView and button are shown.
The localization TextView is also blank.
Here is my MainActivity code:
public class MainActivity extends Activity {
private static final String TAG = MainActivity.class.getSimpleName();
private static final double UPDATE_INTERVAL_MS = 100.0;
private final Object mSharedLock = new Object();
// private static final String sTimestampFormat = "Timestamp: %f";
private static final int SECS_TO_MILLISECS = 1000;
private double mPreviousPoseTimeStamp;
private double mTimeToNextUpdate = UPDATE_INTERVAL_MS;
private TextView mUuidTextView;
private TextView mRelocalizationTextView;
private boolean mIsRelocalized;
private EditText mObjectValueEditText;
private Button mSaveButton;
private boolean mIsLocationSave = false;
private boolean mIsSavePoseData = false;
private static final int INVALID_TEXTURE_ID = 0;
private RajawaliSurfaceView mSurfaceView;
private VideoRenderer mRenderer;
// For all current Tango devices, color camera is in the camera id 0.
private static final int COLOR_CAMERA_ID = 0;
private TangoCameraIntrinsics mIntrinsics;
private Tango mTango;
private TangoConfig mConfig;
private boolean mIsConnected = false;
private double mCameraPoseTimestamp = 0;
// NOTE: Naming indicates which thread is in charge of updating this variable
private int mConnectedTextureIdGlThread = INVALID_TEXTURE_ID;
private AtomicBoolean mIsFrameAvailableTangoThread = new AtomicBoolean(false);
private double mRgbTimestampGlThread;
private int mColorCameraToDisplayAndroidRotation = 0;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
startActivityForResult(
Tango.getRequestPermissionIntent(Tango.PERMISSIONTYPE_ADF_LOAD_SAVE),
Tango.TANGO_INTENT_ACTIVITYCODE);
mSurfaceView = (RajawaliSurfaceView) findViewById(R.id.surfaceview);
mObjectValueEditText = (EditText) findViewById(R.id.location_edittext);
mSaveButton = (Button) findViewById(R.id.save_button);
mRenderer = new VideoRenderer(this);
onClickSaveButton();
// Set-up a dummy OpenGL renderer associated with this surface view
DisplayManager displayManager = (DisplayManager) getSystemService(DISPLAY_SERVICE);
if (displayManager != null) {
displayManager.registerDisplayListener(new DisplayManager.DisplayListener() {
@Override
public void onDisplayAdded(int displayId) {}
@Override
public void onDisplayChanged(int displayId) {
synchronized (this) {
Display display = getWindowManager().getDefaultDisplay();
Camera.CameraInfo colorCameraInfo = new Camera.CameraInfo();
Camera.getCameraInfo(COLOR_CAMERA_ID, colorCameraInfo);
mColorCameraToDisplayAndroidRotation =
getColorCameraToDisplayAndroidRotation(display.getRotation(),
colorCameraInfo.orientation);
mRenderer.updateColorCameraTextureUv(mColorCameraToDisplayAndroidRotation);
}
}
@Override
public void onDisplayRemoved(int displayId) {}
}, null);
}
setupRenderer();
}
private void onClickSaveButton() {
mSaveButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if (mIsLocationSave == false) {
mIsLocationSave = true;
mObjectValueEditText.setVisibility(View.VISIBLE);
mSaveButton.setText(getResources().getString(R.string.save_object));
} else {
mIsLocationSave = false;
String text=mObjectValueEditText.getText().toString();
Toast.makeText(getApplicationContext(),text,Toast.LENGTH_SHORT).show();
mObjectValueEditText.setVisibility(View.INVISIBLE);
//TODO:Add helper logic
mIsSavePoseData=true;
mSaveButton.setText(getResources().getString(R.string.save));
}
}
});
}
@Override
protected void onResume() {
super.onResume();
mSurfaceView.onResume();
// Set render mode to RENDERMODE_CONTINUOUSLY to force getting onDraw callbacks until the
// Tango service is properly set-up and we start getting onFrameAvailable callbacks.
mSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
// Initialize Tango Service as a normal Android Service, since we call mTango.disconnect()
// in onPause, this will unbind Tango Service, so every time when onResume gets called, we
// should create a new Tango object.
mTango = new Tango(MainActivity.this, new Runnable() {
// Pass in a Runnable to be called from UI thread when Tango is ready, this Runnable
// will be running on a new thread.
// When Tango is ready, we can call Tango functions safely here only when there is no UI
// thread changes involved.
@Override
public void run() {
// Synchronize against disconnecting while the service is being used in
// the OpenGL thread or in the UI thread.
synchronized (MainActivity.this) {
try {
TangoSupport.initialize();
mConfig = setupTangoConfig(mTango);
mTango.connect(mConfig);
startupTango();
mIsConnected = true;
} catch (TangoOutOfDateException e) {
Log.e(TAG, getString(R.string.exception_out_of_date), e);
} catch (TangoErrorException e) {
Log.e(TAG, getString(R.string.exception_tango_error), e);
} catch (TangoInvalidException e) {
Log.e(TAG, getString(R.string.exception_tango_invalid), e);
}
runOnUiThread(new Runnable() {
@Override
public void run() {
synchronized (MainActivity.this) {
setupTextViewsAndButtons(mTango, true,
true);
}
}
});
}
}
});
}
@Override
protected void onPause() {
super.onPause();
mSurfaceView.onPause();
mIsRelocalized = false;
// Synchronize against disconnecting while the service is being used in the OpenGL
// thread or in the UI thread.
// NOTE: DO NOT lock against this same object in the Tango callback thread.
// Tango.disconnect will block here until all Tango callback calls are finished.
// If you lock against this object in a Tango callback thread it will cause a deadlock.
synchronized (this) {
try {
mIsConnected = false;
mTango.disconnectCamera(TangoCameraIntrinsics.TANGO_CAMERA_COLOR);
// We need to invalidate the connected texture ID so that we cause a
// re-connection in the OpenGL thread after resume
mConnectedTextureIdGlThread = INVALID_TEXTURE_ID;
mTango.disconnect();
mIsConnected = false;
} catch (TangoErrorException e) {
Log.e(TAG, getString(R.string.exception_tango_error), e);
}
}
}
/**
* Sets up the tango configuration object. Make sure mTango object is initialized before
* making this call.
*/
private TangoConfig setupTangoConfig(Tango tango) {
// Create a new Tango Configuration and enable the Camera API
TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
config.putBoolean(TangoConfig.KEY_BOOLEAN_COLORCAMERA, true);
config.putBoolean(TangoConfig.KEY_BOOLEAN_DRIFT_CORRECTION, true);
config.putBoolean(TangoConfig.KEY_BOOLEAN_LOWLATENCYIMUINTEGRATION, true);
config.putBoolean(TangoConfig.KEY_BOOLEAN_LEARNINGMODE, false);
config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
// Tango service should automatically attempt to recover when it enters an invalid state.
config.putBoolean(TangoConfig.KEY_BOOLEAN_AUTORECOVERY, true);
// Check for Load ADF/Constant Space relocalization mode.
ArrayList<String> fullUuidList;
// Returns a list of ADFs with their UUIDs.
fullUuidList = tango.listAreaDescriptions();
// Load the latest ADF if ADFs are found.
if (fullUuidList.size() > 0) {
config.putString(TangoConfig.KEY_STRING_AREADESCRIPTION,
fullUuidList.get(fullUuidList.size() - 1));
}
return config;
}
/**
* Set up the callback listeners for the Tango service and obtain other parameters required
* after Tango connection.
* Listen to updates from the RGB camera.
*/
private void startupTango() {
// Lock configuration and connect to Tango
// Select coordinate frame pair
ArrayList<TangoCoordinateFramePair> framePairs = new ArrayList<TangoCoordinateFramePair>();
framePairs.add(new TangoCoordinateFramePair(
TangoPoseData.COORDINATE_FRAME_START_OF_SERVICE,
TangoPoseData.COORDINATE_FRAME_DEVICE));
framePairs.add(new TangoCoordinateFramePair(
TangoPoseData.COORDINATE_FRAME_AREA_DESCRIPTION,
TangoPoseData.COORDINATE_FRAME_DEVICE));
framePairs.add(new TangoCoordinateFramePair(
TangoPoseData.COORDINATE_FRAME_AREA_DESCRIPTION,
TangoPoseData.COORDINATE_FRAME_START_OF_SERVICE));
// Listen for new Tango data
mTango.connectListener(framePairs, new Tango.OnTangoUpdateListener() {
@Override
public void onPoseAvailable(final TangoPoseData pose) {
// Make sure to have atomic access to Tango Data so that UI loop doesn't interfere
// while Pose call back is updating the data.
synchronized (mSharedLock) {
// Check for Device wrt ADF pose, Device wrt Start of Service pose, Start of
// Service wrt ADF pose (This pose determines if the device is relocalized or
// not).
if (pose.baseFrame == TangoPoseData.COORDINATE_FRAME_AREA_DESCRIPTION
&& pose.targetFrame == TangoPoseData
.COORDINATE_FRAME_START_OF_SERVICE) {
if (pose.statusCode == TangoPoseData.POSE_VALID) {
mIsRelocalized = true;
} else {
mIsRelocalized = false;
}
}
if (mIsSavePoseData && mIsRelocalized) {
mIsSavePoseData = false;
float[] translations = pose.getTranslationAsFloats();
Log.i("translation", translations[0] + " " + translations[1] + " " + translations[2]);
}
}
final double deltaTime = (pose.timestamp - mPreviousPoseTimeStamp) *
SECS_TO_MILLISECS;
mPreviousPoseTimeStamp = pose.timestamp;
mTimeToNextUpdate -= deltaTime;
if (mTimeToNextUpdate < 0.0) {
mTimeToNextUpdate = UPDATE_INTERVAL_MS;
runOnUiThread(new Runnable() {
@Override
public void run() {
synchronized (mSharedLock) {
mRelocalizationTextView.setText(mIsRelocalized ?
getString(R.string.localized) :
getString(R.string.not_localized));
}
}
});
}
}
//Deprecated method
@Override
public void onXyzIjAvailable(TangoXyzIjData xyzIj) {
// We are not using onXyzIjAvailable for this app.
}
@Override
public void onPointCloudAvailable(final TangoPointCloudData pointCloudData) {
}
@Override
public void onTangoEvent(final TangoEvent event) {
}
@Override
public void onFrameAvailable(int cameraId) {
Log.d(TAG, "onFrameAvailable");
if (cameraId == TangoCameraIntrinsics.TANGO_CAMERA_COLOR) {
// Now that we are receiving onFrameAvailable callbacks, we can switch
// to RENDERMODE_WHEN_DIRTY to drive the render loop from this callback.
// This will result on a frame rate of approximately 30FPS, in synchrony with
// the RGB camera driver.
// If you need to render at a higher rate (i.e.: if you want to render complex
// animations smoothly) you can use RENDERMODE_CONTINUOUSLY throughout the
// application lifecycle.
if (mSurfaceView.getRenderMode() != GLSurfaceView.RENDERMODE_WHEN_DIRTY) {
mSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
}
mSurfaceView.requestRender();
}
}
});
}
/**
* Sets Texts views to display statistics of Poses being received. This also sets the buttons
* used in the UI. Please note that this needs to be called after TangoService and Config
* objects are initialized since we use them for the SDK related stuff like version number
* etc.
*/
private void setupTextViewsAndButtons(Tango tango, boolean isLearningMode, boolean isLoadAdf) {
mRelocalizationTextView = (TextView) findViewById(R.id.relocalization_textview);
mUuidTextView = (TextView) findViewById(R.id.adf_uuid_textview);
if (isLoadAdf) {
ArrayList<String> fullUuidList;
// Returns a list of ADFs with their UUIDs
fullUuidList = tango.listAreaDescriptions();
if (fullUuidList.size() == 0) {
mUuidTextView.setText(R.string.no_uuid);
} else {
mUuidTextView.setText(getString(R.string.number_of_adfs) + fullUuidList.size()
+ getString(R.string.latest_adf_is)
+ fullUuidList.get(fullUuidList.size() - 1));
}
}
}
/**
* Here is where you would set-up your rendering logic. We're replacing it with a minimalistic,
* dummy example using a standard GLSurfaceView and a basic renderer, for illustration purposes
* only.
*/
private void setupRenderer() {
mRenderer.getCurrentScene().registerFrameCallback(new ASceneFrameCallback() {
@Override
public void onPreFrame(long sceneTime, double deltaTime) {
// Prevent concurrent access to {@code mIsFrameAvailableTangoThread} from the Tango
// callback thread and service disconnection from an onPause event.
try {
synchronized (MainActivity.this) {
// Don't execute any tango API actions if we're not connected to the service
if (!mIsConnected) {
return;
}
// Set-up scene camera projection to match RGB camera intrinsics.
if (!mRenderer.isSceneCameraConfigured()) {
mRenderer.setProjectionMatrix(
projectionMatrixFromCameraIntrinsics(mIntrinsics,
mColorCameraToDisplayAndroidRotation));
}
// Connect the camera texture to the OpenGL Texture if necessary
// NOTE: When the OpenGL context is recycled, Rajawali may re-generate the
// texture with a different ID.
if (mConnectedTextureIdGlThread != mRenderer.getTextureId()) {
mTango.connectTextureId(TangoCameraIntrinsics.TANGO_CAMERA_COLOR,
mRenderer.getTextureId());
mConnectedTextureIdGlThread = mRenderer.getTextureId();
Log.d(TAG, "connected to texture id: " + mRenderer.getTextureId());
}
if (mIsFrameAvailableTangoThread.compareAndSet(true, false)) {
mRgbTimestampGlThread =
mTango.updateTexture(TangoCameraIntrinsics.TANGO_CAMERA_COLOR);
}
// If a new RGB frame has been rendered, update the camera pose to match.
if (mRgbTimestampGlThread > mCameraPoseTimestamp) {
// Calculate the camera color pose at the camera frame update time in
// OpenGL engine.
//
// When drift correction mode is enabled in config file, we must query
// the device with respect to Area Description pose in order to use the
// drift corrected pose.
//
// Note that if you don't want to use the drift corrected pose, the
// normal device with respect to start of service pose is available.
TangoPoseData lastFramePose = TangoSupport.getPoseAtTime(
mRgbTimestampGlThread,
TangoPoseData.COORDINATE_FRAME_AREA_DESCRIPTION,
TangoPoseData.COORDINATE_FRAME_CAMERA_COLOR,
TangoSupport.TANGO_SUPPORT_ENGINE_OPENGL,
mColorCameraToDisplayAndroidRotation);
if (lastFramePose.statusCode == TangoPoseData.POSE_VALID) {
// Update the camera pose from the renderer
mRenderer.updateRenderCameraPose(lastFramePose);
mCameraPoseTimestamp = lastFramePose.timestamp;
} else {
// When the pose status is not valid, it indicates the tracking has
// been lost. In this case, we simply stop rendering.
//
// This is also the place to display UI to suggest the user walk
// to recover tracking.
Log.w(TAG, "Can't get device pose at time: " +
mRgbTimestampGlThread);
}
}
}
// Avoid crashing the application due to unhandled exceptions
} catch (TangoErrorException e) {
Log.e(TAG, "Tango API call error within the OpenGL render thread", e);
} catch (Throwable t) {
Log.e(TAG, "Exception on the OpenGL thread", t);
}
}
@Override
public void onPreDraw(long sceneTime, double deltaTime) {
}
@Override
public void onPostFrame(long sceneTime, double deltaTime) {
}
@Override
public boolean callPreFrame() {
return true;
}
});
mSurfaceView.setSurfaceRenderer(mRenderer);
}
private static int getColorCameraToDisplayAndroidRotation(int displayRotation,
int cameraRotation) {
int cameraRotationNormalized = 0;
switch (cameraRotation) {
case 90:
cameraRotationNormalized = 1;
break;
case 180:
cameraRotationNormalized = 2;
break;
case 270:
cameraRotationNormalized = 3;
break;
default:
cameraRotationNormalized = 0;
break;
}
int ret = displayRotation - cameraRotationNormalized;
if (ret < 0) {
ret += 4;
}
return ret;
}
/**
* Use Tango camera intrinsics to calculate the projection Matrix for the Rajawali scene.
*/
private static float[] projectionMatrixFromCameraIntrinsics(TangoCameraIntrinsics intrinsics,
int rotation) {
// Adjust camera intrinsics according to rotation
float cx = (float) intrinsics.cx;
float cy = (float) intrinsics.cy;
float width = (float) intrinsics.width;
float height = (float) intrinsics.height;
float fx = (float) intrinsics.fx;
float fy = (float) intrinsics.fy;
switch (rotation) {
case Surface.ROTATION_90:
cx = (float) intrinsics.cy;
cy = (float) intrinsics.width - (float) intrinsics.cx;
width = (float) intrinsics.height;
height = (float) intrinsics.width;
fx = (float) intrinsics.fy;
fy = (float) intrinsics.fx;
break;
case Surface.ROTATION_180:
cx = (float) intrinsics.width - cx;
cy = (float) intrinsics.height - cy;
break;
case Surface.ROTATION_270:
cx = (float) intrinsics.height - (float) intrinsics.cy;
cy = (float) intrinsics.cx;
width = (float) intrinsics.height;
height = (float) intrinsics.width;
fx = (float) intrinsics.fy;
fy = (float) intrinsics.fx;
break;
default:
break;
}
// Uses frustumM to create a projection matrix taking into account calibrated camera
// intrinsic parameter.
// Reference: http://ksimek.github.io/2013/06/03/calibrated_cameras_in_opengl/
float near = 0.1f;
float far = 100;
float xScale = near / fx;
float yScale = near / fy;
float xOffset = (cx - (width / 2.0f)) * xScale;
// Color camera's coordinates has y pointing downwards so we negate this term.
float yOffset = -(cy - (height / 2.0f)) * yScale;
float m[] = new float[16];
Matrix.frustumM(m, 0,
xScale * (float) -width / 2.0f - xOffset,
xScale * (float) width / 2.0f - xOffset,
yScale * (float) -height / 2.0f - yOffset,
yScale * (float) height / 2.0f - yOffset,
near, far);
return m;
}
}
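As an aside, the pose callback above updates the localization TextView at most once per UPDATE_INTERVAL_MS. That accumulator logic, isolated into a plain class (the names here are illustrative):

```java
class UpdateThrottle {
    private final double intervalMs;
    private double timeToNextUpdate;

    UpdateThrottle(double intervalMs) {
        this.intervalMs = intervalMs;
        this.timeToNextUpdate = intervalMs;
    }

    /** Feed the time elapsed since the last pose; true when a UI update is due. */
    boolean onElapsed(double deltaMs) {
        timeToNextUpdate -= deltaMs;
        if (timeToNextUpdate < 0.0) {
            timeToNextUpdate = intervalMs; // reset, as onPoseAvailable does
            return true;
        }
        return false;
    }
}
```

This keeps runOnUiThread calls to roughly ten per second regardless of how fast poses arrive.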
VideoRenderer.java code
// Extends RajawaliRenderer
/**
* Renderer that implements a basic augmented reality scene using Rajawali.
* It creates a scene with a background quad taking the whole screen, where the color camera contents are rendered.
*/
public class VideoRenderer extends RajawaliRenderer {
private static final String TAG = VideoRenderer.class.getSimpleName();
private float[] textureCoords0 = new float[]{0.0F, 0.0F, 1.0F, 0.0F, 1.0F, 1.0F, 0.0F, 1.0F};
private float[] textureCoords270 = new float[]{0.0F, 1.0F, 0.0F, 0.0F, 1.0F, 0.0F, 1.0F, 1.0F};
private float[] textureCoords180 = new float[]{1.0F, 1.0F, 0.0F, 1.0F, 0.0F, 0.0F, 1.0F, 0.0F};
private float[] textureCoords90 = new float[]{1.0F, 0.0F, 1.0F, 1.0F, 0.0F, 1.0F, 0.0F, 0.0F};
// Rajawali texture used to render the Tango color camera.
private ATexture mTangoCameraTexture;
// Keeps track of whether the scene camera has been configured.
private boolean mSceneCameraConfigured;
private ScreenQuad mBackgroundQuad = new ScreenQuad();
public VideoRenderer(Context context) {
super(context);
}
@Override
protected void initScene() {
// Create a quad covering the whole background and assign a texture to it where the
// Tango color camera contents will be rendered.
Material tangoCameraMaterial = new Material();
tangoCameraMaterial.setColorInfluence(0);
mBackgroundQuad.getGeometry().setTextureCoords(textureCoords0);
// We need to use Rajawali's {@code StreamingTexture} since it sets up the texture
// for GL_TEXTURE_EXTERNAL_OES rendering.
mTangoCameraTexture =
new StreamingTexture("camera", (StreamingTexture.ISurfaceListener) null);
try {
tangoCameraMaterial.addTexture(mTangoCameraTexture);
mBackgroundQuad.setMaterial(tangoCameraMaterial);
} catch (ATexture.TextureException e) {
Log.e(TAG, "Exception creating texture for RGB camera contents", e);
}
getCurrentScene().addChildAt(mBackgroundQuad, 0);
// Add a directional light in an arbitrary direction.
DirectionalLight light = new DirectionalLight(1, 0.2, -1);
light.setColor(1, 1, 1);
light.setPower(0.8f);
light.setPosition(3, 2, 4);
getCurrentScene().addLight(light);
}
/**
* Update background texture's UV coordinates when device orientation is changed. i.e change
* between landscape and portrait mode.
*/
public void updateColorCameraTextureUv(int rotation) {
switch (rotation) {
case Surface.ROTATION_90:
mBackgroundQuad.getGeometry().setTextureCoords(textureCoords90);
break;
case Surface.ROTATION_180:
mBackgroundQuad.getGeometry().setTextureCoords(textureCoords180);
break;
case Surface.ROTATION_270:
mBackgroundQuad.getGeometry().setTextureCoords(textureCoords270);
break;
default:
mBackgroundQuad.getGeometry().setTextureCoords(textureCoords0);
break;
}
}
public void updateRenderCameraPose(TangoPoseData cameraPose) {
float[] rotation = cameraPose.getRotationAsFloats();
float[] translation = cameraPose.getTranslationAsFloats();
Quaternion quaternion = new Quaternion(rotation[3], rotation[0], rotation[1], rotation[2]);
getCurrentCamera().setRotation(quaternion.conjugate());
getCurrentCamera().setPosition(translation[0], translation[1], translation[2]);
}
/**
* It returns the ID currently assigned to the texture where the Tango color camera contents
* should be rendered.
*/
public int getTextureId() {
return mTangoCameraTexture == null ? -1 : mTangoCameraTexture.getTextureId();
}
/**
* We need to override this method to mark the camera for re-configuration (set proper
* projection matrix) since it will be reset by Rajawali on surface changes.
*/
@Override
public void onRenderSurfaceSizeChanged(GL10 gl, int width, int height) {
super.onRenderSurfaceSizeChanged(gl, width, height);
mSceneCameraConfigured = false;
}
public boolean isSceneCameraConfigured() {
return mSceneCameraConfigured;
}
/**
* Sets the projection matrix for the scene camera to match the parameters of the color camera.
*/
public void setProjectionMatrix(float[] matrixFloats) {
getCurrentCamera().setProjectionMatrix(new Matrix4(matrixFloats));
}
@Override
public void onOffsetsChanged(float xOffset, float yOffset,
float xOffsetStep, float yOffsetStep,
int xPixelOffset, int yPixelOffset) {
}
@Override
public void onTouchEvent(MotionEvent event) {
}
}

How to use custom drawable images for an Android watch face

It's been a week since I started trying to create a watch face for Android Wear. As a kick-start I followed the official Google documentation and found the official Android watch face app tutorial with source code.
My current issue is: in the Google documentation they use a canvas to create analog watch faces, and the watch hands are drawn using Paint.
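Whichever way the hands are drawn (Paint strokes, as in the tutorial, or a bitmap drawable), the rotation angle per hand is the same arithmetic. Isolated here as a sketch with my own names:

```java
class HandAngles {
    /** Second or minute hand: a full circle per 60 units. */
    static float sixtyUnitsToDegrees(float units) {
        return units / 60f * 360f;
    }

    /** Hour hand: a full circle per 12 hours, advanced smoothly by the minutes. */
    static float hoursToDegrees(float hours, float minutes) {
        return (hours + minutes / 60f) / 12f * 360f;
    }
}
```

A drawable-based hand would then typically be drawn by rotating the canvas: canvas.save(), canvas.rotate(angleDegrees, centerX, centerY), canvas.drawBitmap(handBitmap, ...), canvas.restore().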
Here is a sample of the code for creating the dial hands:
public class AnalogWatchFaceService extends CanvasWatchFaceService {
private static final String TAG = "AnalogWatchFaceService";
/**
* Update rate in milliseconds for interactive mode. We update once a second to advance the
* second hand.
*/
private static final long INTERACTIVE_UPDATE_RATE_MS = TimeUnit.SECONDS.toMillis(1);
@Override
public Engine onCreateEngine() {
return new Engine();
}
private class Engine extends CanvasWatchFaceService.Engine {
static final int MSG_UPDATE_TIME = 0;
static final float TWO_PI = (float) Math.PI * 2f;
Paint mHourPaint;
Paint mMinutePaint;
Paint mSecondPaint;
Paint mTickPaint;
boolean mMute;
Calendar mCalendar;
/** Handler to update the time once a second in interactive mode. */
final Handler mUpdateTimeHandler = new Handler() {
@Override
public void handleMessage(Message message) {
switch (message.what) {
case MSG_UPDATE_TIME:
if (Log.isLoggable(TAG, Log.VERBOSE)) {
Log.v(TAG, "updating time");
}
invalidate();
if (shouldTimerBeRunning()) {
long timeMs = System.currentTimeMillis();
long delayMs = INTERACTIVE_UPDATE_RATE_MS
- (timeMs % INTERACTIVE_UPDATE_RATE_MS);
mUpdateTimeHandler.sendEmptyMessageDelayed(MSG_UPDATE_TIME, delayMs);
}
break;
}
}
};
final BroadcastReceiver mTimeZoneReceiver = new BroadcastReceiver() {
@Override
public void onReceive(Context context, Intent intent) {
mCalendar.setTimeZone(TimeZone.getDefault());
invalidate();
}
};
boolean mRegisteredTimeZoneReceiver = false;
/**
* Whether the display supports fewer bits for each color in ambient mode. When true, we
* disable anti-aliasing in ambient mode.
*/
boolean mLowBitAmbient;
Bitmap mBackgroundBitmap;
Bitmap mBackgroundScaledBitmap;
@Override
public void onCreate(SurfaceHolder holder) {
if (Log.isLoggable(TAG, Log.DEBUG)) {
Log.d(TAG, "onCreate");
}
super.onCreate(holder);
setWatchFaceStyle(new WatchFaceStyle.Builder(AnalogWatchFaceService.this)
.setCardPeekMode(WatchFaceStyle.PEEK_MODE_SHORT)
.setBackgroundVisibility(WatchFaceStyle.BACKGROUND_VISIBILITY_INTERRUPTIVE)
.setShowSystemUiTime(false)
.build());
Resources resources = AnalogWatchFaceService.this.getResources();
Drawable backgroundDrawable = resources.getDrawable(R.drawable.bg, null /* theme */);
mBackgroundBitmap = ((BitmapDrawable) backgroundDrawable).getBitmap();
mHourPaint = new Paint();
mHourPaint.setARGB(255, 200, 200, 200);
mHourPaint.setStrokeWidth(5.f);
mHourPaint.setAntiAlias(true);
mHourPaint.setStrokeCap(Paint.Cap.ROUND);
mMinutePaint = new Paint();
mMinutePaint.setARGB(255, 200, 200, 200);
mMinutePaint.setStrokeWidth(3.f);
mMinutePaint.setAntiAlias(true);
mMinutePaint.setStrokeCap(Paint.Cap.ROUND);
mSecondPaint = new Paint();
mSecondPaint.setARGB(255, 255, 0, 0);
mSecondPaint.setStrokeWidth(2.f);
mSecondPaint.setAntiAlias(true);
mSecondPaint.setStrokeCap(Paint.Cap.ROUND);
mTickPaint = new Paint();
mTickPaint.setARGB(100, 255, 255, 255);
mTickPaint.setStrokeWidth(2.f);
mTickPaint.setAntiAlias(true);
mCalendar = Calendar.getInstance();
}
@Override
public void onDestroy() {
mUpdateTimeHandler.removeMessages(MSG_UPDATE_TIME);
super.onDestroy();
}
@Override
public void onPropertiesChanged(Bundle properties) {
super.onPropertiesChanged(properties);
mLowBitAmbient = properties.getBoolean(PROPERTY_LOW_BIT_AMBIENT, false);
if (Log.isLoggable(TAG, Log.DEBUG)) {
Log.d(TAG, "onPropertiesChanged: low-bit ambient = " + mLowBitAmbient);
}
}
@Override
public void onTimeTick() {
super.onTimeTick();
if (Log.isLoggable(TAG, Log.DEBUG)) {
Log.d(TAG, "onTimeTick: ambient = " + isInAmbientMode());
}
invalidate();
}
@Override
public void onAmbientModeChanged(boolean inAmbientMode) {
super.onAmbientModeChanged(inAmbientMode);
if (Log.isLoggable(TAG, Log.DEBUG)) {
Log.d(TAG, "onAmbientModeChanged: " + inAmbientMode);
}
if (mLowBitAmbient) {
boolean antiAlias = !inAmbientMode;
mHourPaint.setAntiAlias(antiAlias);
mMinutePaint.setAntiAlias(antiAlias);
mSecondPaint.setAntiAlias(antiAlias);
mTickPaint.setAntiAlias(antiAlias);
}
invalidate();
// Whether the timer should be running depends on whether we're in ambient mode (as well
// as whether we're visible), so we may need to start or stop the timer.
updateTimer();
}
@Override
public void onInterruptionFilterChanged(int interruptionFilter) {
super.onInterruptionFilterChanged(interruptionFilter);
boolean inMuteMode = (interruptionFilter == WatchFaceService.INTERRUPTION_FILTER_NONE);
if (mMute != inMuteMode) {
mMute = inMuteMode;
mHourPaint.setAlpha(inMuteMode ? 100 : 255);
mMinutePaint.setAlpha(inMuteMode ? 100 : 255);
mSecondPaint.setAlpha(inMuteMode ? 80 : 255);
invalidate();
}
}
@Override
public void onSurfaceChanged(SurfaceHolder holder, int format, int width, int height) {
if (mBackgroundScaledBitmap == null
|| mBackgroundScaledBitmap.getWidth() != width
|| mBackgroundScaledBitmap.getHeight() != height) {
mBackgroundScaledBitmap = Bitmap.createScaledBitmap(mBackgroundBitmap,
width, height, true /* filter */);
}
super.onSurfaceChanged(holder, format, width, height);
}
@Override
public void onDraw(Canvas canvas, Rect bounds) {
mCalendar.setTimeInMillis(System.currentTimeMillis());
int width = bounds.width();
int height = bounds.height();
// Draw the background, scaled to fit.
canvas.drawBitmap(mBackgroundScaledBitmap, 0, 0, null);
// Find the center. Ignore the window insets so that, on round watches with a
// "chin", the watch face is centered on the entire screen, not just the usable
// portion.
float centerX = width / 2f;
float centerY = height / 2f;
// Draw the ticks.
float innerTickRadius = centerX - 10;
float outerTickRadius = centerX;
for (int tickIndex = 0; tickIndex < 12; tickIndex++) {
float tickRot = tickIndex * TWO_PI / 12;
float innerX = (float) Math.sin(tickRot) * innerTickRadius;
float innerY = (float) -Math.cos(tickRot) * innerTickRadius;
float outerX = (float) Math.sin(tickRot) * outerTickRadius;
float outerY = (float) -Math.cos(tickRot) * outerTickRadius;
canvas.drawLine(centerX + innerX, centerY + innerY,
centerX + outerX, centerY + outerY, mTickPaint);
}
float seconds =
mCalendar.get(Calendar.SECOND) + mCalendar.get(Calendar.MILLISECOND) / 1000f;
float secRot = seconds / 60f * TWO_PI;
float minutes = mCalendar.get(Calendar.MINUTE) + seconds / 60f;
float minRot = minutes / 60f * TWO_PI;
float hours = mCalendar.get(Calendar.HOUR) + minutes / 60f;
float hrRot = hours / 12f * TWO_PI;
float secLength = centerX - 20;
float minLength = centerX - 40;
float hrLength = centerX - 80;
if (!isInAmbientMode()) {
float secX = (float) Math.sin(secRot) * secLength;
float secY = (float) -Math.cos(secRot) * secLength;
canvas.drawLine(centerX, centerY, centerX + secX, centerY + secY, mSecondPaint);
}
float minX = (float) Math.sin(minRot) * minLength;
float minY = (float) -Math.cos(minRot) * minLength;
canvas.drawLine(centerX, centerY, centerX + minX, centerY + minY, mMinutePaint);
float hrX = (float) Math.sin(hrRot) * hrLength;
float hrY = (float) -Math.cos(hrRot) * hrLength;
canvas.drawLine(centerX, centerY, centerX + hrX, centerY + hrY, mHourPaint);
}
}
The entire code can be found in the official sample app. Below is a screenshot of the application I made following Google's official tutorial.
Does anyone have any idea how to replace the clock hands with drawable images? Any help would be appreciated.
Create a Bitmap of your drawable resource:
Bitmap hourHand = BitmapFactory.decodeResource(context.getResources(), R.drawable.hour_hand);
Do whatever transformations you need to your canvas and draw the bitmap:
canvas.save();
canvas.rotate(degrees, px, py);
canvas.translate(dx, dy);
canvas.drawBitmap(hourHand, centerX, centerY, null); // Or use a Paint if you need it
canvas.restore();
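To get the `degrees` value for `canvas.rotate()`, you can derive each hand's angle from the current time, the same way the line-based watch face computes `secRot`/`minRot`/`hrRot` (just in degrees instead of radians). This is a minimal sketch with hypothetical helper names, independent of the Android canvas:

```java
import java.util.Calendar;
import java.util.TimeZone;

public class HandAngles {
    // Rotation in degrees for each hand, measured clockwise from 12 o'clock.
    static float secondsDegrees(Calendar c) {
        return c.get(Calendar.SECOND) / 60f * 360f;
    }

    static float minutesDegrees(Calendar c) {
        // Include the seconds fraction so the minute hand moves smoothly.
        return (c.get(Calendar.MINUTE) + c.get(Calendar.SECOND) / 60f) / 60f * 360f;
    }

    static float hoursDegrees(Calendar c) {
        // Calendar.HOUR is the 12-hour clock value; add the minute fraction.
        return (c.get(Calendar.HOUR) + c.get(Calendar.MINUTE) / 60f) / 12f * 360f;
    }

    public static void main(String[] args) {
        Calendar c = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
        c.set(2020, Calendar.JANUARY, 1, 3, 0, 0); // 3:00:00
        System.out.println(hoursDegrees(c));   // 90.0 (hour hand at 3 o'clock)
        System.out.println(minutesDegrees(c)); // 0.0
        System.out.println(secondsDegrees(c)); // 0.0
    }
}
```

You would then pass the result straight into `canvas.rotate(degrees, centerX, centerY)` before drawing the hand bitmap.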
Use following method to rotate bitmap from canvas,
/**
 * Rotates a hand bitmap on the canvas.
 *
 * @param canvas      canvas on which you are drawing
 * @param handBitmap  bitmap of the hand
 * @param centerPoint center of rotation
 * @param rotation    rotation angle in degrees
 * @param offset      offset of the bitmap from the center point (keep it 0 if not needed)
 */
public void rotateBitmap(Canvas canvas, Bitmap handBitmap, PointF centerPoint, float rotation, float offset) {
    canvas.save();
    canvas.rotate(rotation - 90, centerPoint.x, centerPoint.y);
    canvas.drawBitmap(handBitmap, centerPoint.x - offset, centerPoint.y - handBitmap.getHeight() / 2f, new Paint(Paint.FILTER_BITMAP_FLAG));
    canvas.restore();
}
I am a little bit late with the answer, but maybe it will be helpful for others:
canvas.save()
val antialias = Paint()
antialias.isAntiAlias = true
antialias.isFilterBitmap = true
antialias.isDither = true
canvas.rotate(secondsRotation - minutesRotation, centerX, centerY)
canvas.drawBitmap(
secondsHandBitmap,
centerX - 10,
centerY - 160,
antialias
)
canvas.restore()
Here is my public Git repo where you can check the source code.

My Box2D body drops way too slowly

I tried almost everything but nothing works.
If you run it you will see a square falling down very slowly.
I'm a beginner, so please keep the explanation simple.
Any questions about this code: dylan.missu@gmail.com
Here is my code:
@Override
public void show() {
camera = new OrthographicCamera();
debugRenderer = new Box2DDebugRenderer();
BodyDef bodyDef = new BodyDef();
bodyDef.type = BodyDef.BodyType.DynamicBody;
bodyDef.position.set(100,100);
body = world.createBody(bodyDef);
PolygonShape shape = new PolygonShape();
shape.setAsBox(Gdx.graphics.getHeight()/6,Gdx.graphics.getHeight()/6);
FixtureDef fixtureDef = new FixtureDef();
fixtureDef.shape = shape;
fixtureDef.density = 1f;
fixtureDef.restitution = 2;
Fixture fixture = body.createFixture(fixtureDef);
}
private void line(float X, float Y, float w, float h)
{
BodyDef bodyDef = new BodyDef();
bodyDef.type=BodyDef.BodyType.StaticBody;
bodyDef.position.set(X,Y);
PolygonShape polygonShape=new PolygonShape();
polygonShape.setAsBox(w,h);
FixtureDef fixtureDef = new FixtureDef();
fixtureDef.shape=polygonShape;
fixtureDef.restitution=0.4f;
fixtureDef.friction=0.5f;
Body theFloor=world.createBody(bodyDef);
theFloor.createFixture(fixtureDef);
}
private void wall(float A)
{
// is there a better way of doing this?
line(0,Gdx.graphics.getHeight()/2-A,Gdx.graphics.getWidth()/2-A,0);
line(Gdx.graphics.getWidth()/2-A,0,0,Gdx.graphics.getHeight()/2-A);
line(0,-Gdx.graphics.getHeight()/2+A,Gdx.graphics.getWidth()/2-A,0);
line(-Gdx.graphics.getWidth()/2+A,0,0,Gdx.graphics.getHeight()/2-A);
}
@Override
public void render(float delta) {
world.step(1 / 5f, 6, 2);
OrthographicCamera camera = new OrthographicCamera(Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
processAccelerometer();
wall(1);
Gdx.gl.glClearColor(0, 0, 0, 1);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
debugRenderer.render(world, camera.combined);
}
@Override
public void dispose() {
world.dispose();
debugRenderer.dispose();
}
private void processAccelerometer() {
float y = Gdx.input.getAccelerometerY();
float x = Gdx.input.getAccelerometerX();
if (prevAccelX != x || prevAccelY != y) {
world.setGravity(new Vector2(y, -x));
prevAccelX = x;
prevAccelY = y;
}
}
@Override
public void hide()
{
// TODO: Implement this method
}
@Override
public void resize(int p1, int p2)
{
// TODO: Implement this method
}
@Override
public void resume()
{
// TODO: Implement this method
}
@Override
public void pause()
{
// TODO: Implement this method
}
@Override
public void render()
{
// TODO: Implement this method
}
}
In Box2D, 1 world unit = 1 meter. So if you feed it raw pixel coordinates, everything you simulate will be HUGE by default, and Box2D simulations are not accurate for huge distances, huge masses, huge speeds...
So all you have to do is scale your viewport with a conversion factor, so that you only deal with small entities, which are easier to simulate.
For example, say you want 100 pixels = 1 meter; put this code where you create your game screen:
WORLD_TO_BOX = 1/100f;
BOX_TO_WORLD = 1/WORLD_TO_BOX;
//Creation of the camera with your factor 100
camera = new OrthographicCamera();
camera.viewportHeight = Gdx.graphics.getHeight() * WORLD_TO_BOX;
camera.viewportWidth = Gdx.graphics.getWidth() * WORLD_TO_BOX;
camera.position.set(camera.viewportWidth/2, camera.viewportHeight/2, 0f);
camera.update();
Then you'll create your boxes in terms of camera.viewportWidth and camera.viewportHeight, instead of Gdx.graphics.getWidth() and Gdx.graphics.getHeight().
In your case you'll have:
PolygonShape shape = new PolygonShape();
shape.setAsBox(camera.viewportHeight/6,camera.viewportHeight/6);
You can see in my code there is also a BOX_TO_WORLD conversion factor. It will be used when you want to render graphics over your Box2D bodies.
For that, look at the answer to this question.
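The two conversion factors above can be sketched as a pair of plain helper methods (independent of libGDX; the factor 100 is just the example value from this answer):

```java
public class Units {
    // Example conversion factor from the answer above: 100 pixels = 1 meter.
    static final float WORLD_TO_BOX = 1 / 100f;
    static final float BOX_TO_WORLD = 1 / WORLD_TO_BOX;

    // Pixel coordinates -> Box2D meters (use when creating bodies and shapes).
    static float toBox(float pixels) {
        return pixels * WORLD_TO_BOX;
    }

    // Box2D meters -> pixel coordinates (use when drawing sprites over bodies).
    static float toWorld(float meters) {
        return meters * BOX_TO_WORLD;
    }

    public static void main(String[] args) {
        // A 480x800-pixel screen becomes a 4.8 x 8.0 meter world,
        // a size Box2D simulates accurately.
        System.out.println(toBox(800f));  // 8.0
        System.out.println(toWorld(8f));  // 800.0
    }
}
```

With this in place, the falling square would be created with `shape.setAsBox(toBox(...), toBox(...))` instead of raw pixel dimensions.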

How to zoom properly in OpenGL ES 2

I have a 3D model in OpenGL ES on Android. I've already implemented swipe gestures for translating, rotating, and zooming the model. Everything but the zooming works fine; I'm not sure what I'm missing or what I have to change, but I'm not able to zoom into my model.
The model is a building. What I'd like to do is zoom into the different floors of the building. But no matter how I change my implementation, I'm not able to do this.
Either the building disappears when I zoom in, or the zoom hits a limit so that I can't zoom in any further.
First of all, I decreased the field of view by modifying the projection matrix:
frustumM(matrix, 0, -ratio/zoom, ratio/zoom, -1/zoom, 1/zoom, nearPlane, farPlane).
Someone told me that this is not the correct approach and that I should instead modify the eyeZ value, like:
eyeZ = -1.0/zoom
The first approach works, but I'd like to know what my mistake with the second approach is, because it has the issues I mentioned at the beginning.
My renderer-class is the following:
public class MyGLRenderer implements GLSurfaceView.Renderer {
private float[] mModelMatrix = new float[16];
private final float[] mMVMatrix = new float[16];
private final float[] mProjectionMatrix = new float[16];
private final float[] mViewMatrix = new float[16];
private float nearPlaneDistance = 1f;
private float farPlaneDistance = 200f;
private float modelRatio = 1.0f;
private int offset = 0;
private float eyeX = 0;
private float eyeY = 0;
private float eyeZ = -1;
private float centerX = 0f;
private float centerY = 0f;
private float centerZ = 0f;
private float upX = 0f;
private float upY = 1.0f;
private float upZ = 0.0f;
private float mZoomLevel = 1f;
private float defaultRotationX = 100.0f; //building otherwise on the wrong side
private float defaultRotationZ = 180.0f; //building otherwise on the wrong side
private float rotationX = defaultRotationX;
private float rotationY = 0.0f;
private float rotationZ = defaultRotationZ;
private float translateX = 0.0f;
private float translateY = 0.0f;
private float translateZ = 0.0f;
private float scaleFactor = 20.0f; //no matter what scale factor -> it's not possible to zoom into the building...
private float ratio;
private float width;
private float height;
private List<IDrawableObject> drawableObjects;
public Model3D model3d;
public MyGLRenderer(Model3D model3d) {
this.model3d = model3d;
getModelScale();
}
private void getModelScale() {
float highestValue = (model3d.width > model3d.height) ? model3d.width
: model3d.height;
modelRatio = 2f / highestValue;
}
@Override
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
// Set the background frame color
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
GLES20.glDisable(GLES20.GL_CULL_FACE);
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
drawableObjects = ... ; // too much detail, basically getting triangles
}
@Override
public void onDrawFrame(GL10 unused) {
float[] mMVPMatrix = new float[16];
// Draw background color
Matrix.setIdentityM(mModelMatrix, 0);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
// model is in origin-solution too big
Matrix.scaleM(mModelMatrix, 0, modelRatio * scaleFactor, modelRatio
* scaleFactor, modelRatio * scaleFactor);
Matrix.translateM(mModelMatrix, 0, translateX, translateY, translateZ);
rotateModel(mModelMatrix, rotationX, rotationY, rotationZ, true);
// Set the camera position (View matrix)
Matrix.setLookAtM(mViewMatrix, offset, eyeX, eyeY, eyeZ / mZoomLevel,
centerX, centerY, centerZ, upX, upY, upZ);
// combine the model with the view matrix
Matrix.multiplyMM(mMVMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
// this projection matrix is applied to object coordinates
// in the onDrawFrame() method
Matrix.frustumM(mProjectionMatrix, 0, -ratio, ratio, 1, -1,
nearPlaneDistance, farPlaneDistance);
// Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVMatrix, 0);
for (IDrawableObject d : drawableObjects) {
d.draw(mMVPMatrix);
}
}
private void rotateModel(float[] mModelMatrix, Float x, Float y, Float z,
boolean rotateAroundCenter) {
// translation for rotating the model around its center
if (rotateAroundCenter) {
Matrix.translateM(mModelMatrix, 0, (model3d.width / 2f), 0,
(model3d.height / 2f));
}
if (x != null) {
Matrix.rotateM(mModelMatrix, 0, x, 1.0f, 0.0f, 0.0f);
}
if (y != null) {
Matrix.rotateM(mModelMatrix, 0, y, 0.0f, 1.0f, 0.0f);
}
if (z != null) {
Matrix.rotateM(mModelMatrix, 0, z, 0.0f, 0.0f, 1.0f);
}
// translation back to the origin
if (rotateAroundCenter) {
Matrix.translateM(mModelMatrix, 0, -(model3d.width / 2f), 0,
-(model3d.height / 2f));
}
}
@Override
public void onSurfaceChanged(GL10 unused, int width, int height) {
// Adjust the viewport based on geometry changes,
// such as screen rotation
GLES20.glViewport(0, 0, width, height);
this.width = width;
this.height = height;
ratio = (float) width / height;
}
public static int loadShader(int type, String shaderCode) {
// create a vertex shader type (GLES20.GL_VERTEX_SHADER)
// or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
int shader = GLES20.glCreateShader(type);
// add the source code to the shader and compile it
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
public int getFPS() {
return lastMFPS;
}
public static void checkGlError(String glOperation) {
int error;
while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
Log.e(TAG, glOperation + ": glError " + error);
throw new RuntimeException(glOperation + ": glError " + error);
}
}
public void setZoom(float zoom) {
this.mZoomLevel = zoom;
}
public void setDistance(float distance) {
eyeZ = distance;
}
public float getDistance() {
return eyeZ;
}
public float getRotationX() {
return rotationX;
}
public void setRotationX(float rotationX) {
this.rotationX = defaultRotationX + rotationX;
}
public float getRotationY() {
return rotationY;
}
public void setRotationY(float rotationY) {
this.rotationY = rotationY;
}
public float getRotationZ() {
return rotationZ;
}
public void setRotationZ(float rotationZ) {
this.rotationZ = defaultRotationZ + rotationZ;
}
public float getFarPlane() {
return farPlaneDistance;
}
public float getNearPlane() {
return nearPlaneDistance;
}
public void addTranslation(float mPosX, float mPosY) {
this.translateX = mPosX;
this.translateY = mPosY;
}
public void downPressed() {
translateX -= 10;
}
public void upPressed() {
translateX += 10;
}
public void actionMoved(float mPosX, float mPosY) {
float translationX = (mPosX / width);
float translationY = -(mPosY / height);
addTranslation(translationX, translationY);
}
public float getmZoomLevel() {
return mZoomLevel;
}
public void setmZoomLevel(float mZoomLevel) {
this.mZoomLevel = mZoomLevel;
}
public float getWidth() {
return width;
}
public float getHeight() {
return height;
}
public void setTranslation(Float x, Float y, Float z) {
if (x != null) {
this.translateX = -x;
}
if (y != null) {
this.translateY = y;
}
if (z != null) {
this.translateZ = -z;
}
}
public void setRotation(Float x, Float y, Float z) {
if (x != null) {
this.rotationX = defaultRotationX + x;
}
if (y != null) {
this.rotationY = y;
}
if (z != null) {
this.rotationZ = defaultRotationZ + z;
}
}
public void setScale(float scale) {
this.mZoomLevel = scale;
}
public float getDefaultRotationX() {
return defaultRotationX;
}
}
Do you see any mistake I'm making? You can also have a look at the GitHub repository: https://github.com/Dalanie/OpenGL-ES/tree/master/buildingGL
First you must define what you mean by "zoom". In the scenario of a perspective projection, there are a few possibilities:
You change the field of view. This is analogous to the zoom of cameras, where zooming changes the focal length.
You just scale the model before the projection.
You change the distance of the model (nearer to the camera to zoom in, farther away to zoom out).
Variant 1 is what you did by changing the frustum. In my opinion, that is the most intuitive effect, at least to someone who is used to cameras. Also note that this has the same effect as upscaling some sub-rectangle of the 2D projected image to fill the whole screen.
Changing eyeZ is approach 3. But now you must be careful not to move the object out of the viewing volume (which seems to be the issue you are describing). Ideally you would modify the frustum here too, keeping the field of view while moving the near/far planes so that the object always stays in between. Note that this requires changing all six values of the frustum; what stays the same should be the ratios left/near, right/near, top/near, and bottom/near, to keep the FOV/aspect you had before.
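Variant 1 reduces to simple arithmetic on the frustum extents: dividing all four lateral bounds by the zoom factor narrows the field of view while leaving the aspect ratio untouched. A minimal sketch of that math (a hypothetical helper, independent of android.opengl.Matrix):

```java
public class FrustumZoom {
    // Frustum extents at the near plane: {left, right, bottom, top}.
    // Dividing all four by the zoom factor narrows the field of view
    // (zoom > 1 zooms in) while the width/height ratio is preserved.
    static float[] zoomedFrustum(float ratio, float zoom) {
        return new float[] { -ratio / zoom, ratio / zoom, -1f / zoom, 1f / zoom };
    }

    public static void main(String[] args) {
        float[] f = zoomedFrustum(1.5f, 2f);
        float width = f[1] - f[0];   // 1.5 (was 3.0 at zoom = 1)
        float height = f[3] - f[2];  // 1.0 (was 2.0 at zoom = 1)
        System.out.println(width / height); // aspect ratio stays 1.5
    }
}
```

The four values would then be passed to `Matrix.frustumM(matrix, 0, left, right, bottom, top, nearPlane, farPlane)`, exactly as the question's first approach does.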
