Scrolling a child scene in AndEngine - Android

I have a problem scrolling my child scene. I have created a CameraScene which I am trying to scroll with a touch event. My child scene is not scrolling; however, if I scroll the camera attached to the engine, the parent scene scrolls fine.
So how do I get my child scene to scroll without the objects attached to my parent scene scrolling along with it?
public StatsScene(Context context, VertexBufferObjectManager vbo) {
    super(new SmoothCamera(0, 0, WITDH, HEIGHT, 0, SPEEDY, 0));
    this.setOnSceneTouchListener(new IOnSceneTouchListener() {
        @Override
        public boolean onSceneTouchEvent(Scene pScene, TouchEvent pSceneTouchEvent) {
            if (pSceneTouchEvent.getAction() == MotionEvent.ACTION_DOWN) {
                mTouchY = pSceneTouchEvent.getMotionEvent().getY();
            } else if (pSceneTouchEvent.getAction() == MotionEvent.ACTION_MOVE) {
                float newY = pSceneTouchEvent.getMotionEvent().getY();
                mTouchOffsetY = (newY - mTouchY);
                float newScrollX = getCamera().getCenterX();
                float newScrollY = getCamera().getCenterY() - mTouchOffsetY;
                getCamera().setCenter(newScrollX, newScrollY);
                mTouchY = newY;
            }
            return true;
        }
    });
}

I'm not really into AndEngine and I'm not sure I understand your problem correctly (nothing in your code refers to "myparents" or "childscene"), but when something is attached to your scene, it will move with it. You could scroll your children in the opposite direction to maintain their position, but that could get you into trouble in the long term. If possible, separate your scrolling scene and your objects, meaning they shouldn't be children of each other. If you want to keep them related, give them a common parent instead; if you move one object now, its siblings won't move. Hope that helps.
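In AndEngine terms, the common-parent idea could look roughly like this (a sketch assuming the GLES2 Entity API; scene, scrollingSprite and hudSprite are placeholder names, and mTouchOffsetY is the offset from the question):
// Put the scrollable content on its own Entity layer; moving that layer
// leaves its sibling layer (and everything attached to it) untouched.
Entity scrollLayer = new Entity();
Entity staticLayer = new Entity();
scene.attachChild(scrollLayer);
scene.attachChild(staticLayer);

scrollLayer.attachChild(scrollingSprite); // moves when the layer moves
staticLayer.attachChild(hudSprite);       // stays put

// Scroll by offsetting the layer instead of the camera:
scrollLayer.setPosition(scrollLayer.getX(), scrollLayer.getY() - mTouchOffsetY);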

From your description I would think that your parent scene is the one receiving the input, so I'm guessing (please correct me if I'm wrong) that you are attaching your child scene something like this:
mMainScene.attachChild(mChildScene);
In this case you will have to deal with diverting the input to the child instead of the parent. However, you have a few options here:
If your child scene occupies the full screen and you don't need to worry about updating/drawing your parent scene, simply swap scenes with
mEngine.setScene(mChildScene);
If you do need to keep drawing and updating your parent scene, check the pre-made MenuScene class and the Scene.setChildScene() method; I think there is an example of how to use this in the AndEngineExamples project. Using this will let you take the input on the child scene while still drawing and updating your main scene, and it even lets you set your child in a modal way.
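A rough sketch of that second option (AndEngine GLES2 API assumed; statsScene, context and vbo are placeholder names, and the modal-flag semantics are my reading of the API):
StatsScene statsScene = new StatsScene(context, vbo);

// modalDraw = false, modalUpdate = false, modalTouch = true:
// the parent keeps drawing and updating, but touch input goes to the child only.
mMainScene.setChildScene(statsScene, false, false, true);

// Later, to hand control back to the parent:
mMainScene.clearChildScene();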

Related

Android View.requestRectangleOnScreen: scroll error in API-23! How do I work around it?

I am developing my first Android app and, after a good start, I have spent days of deep debugging on a problem which by now seems to be an error in the implementation of View.requestRectangleOnScreen in API-23 (and probably many levels before that). Just now, I have discovered that the implementation of this routine changed significantly in API-25.
The problem is that a request for focus on an EditText placed inside a HorizontalScrollView may cause the HorizontalScrollView to scroll away from the field requesting the focus.
In my case it is an EditText with centered text, which is placed in the center of 1048576 pixels and scrolled roughly half a million pixels to the right, making the text centered and visible (this part is perfectly fine!). But then this offset of half a million pixels is propagated up the parent chain and makes the HorizontalScrollView move to its far right, far away from the input field.
I have tracked it down to the View.requestRectangleOnScreen routine, which in the API-23 sources is as follows:
public boolean requestRectangleOnScreen(Rect rectangle, boolean immediate) {
    if (mParent == null) {
        return false;
    }
    View child = this;
    RectF position = (mAttachInfo != null) ? mAttachInfo.mTmpTransformRect : new RectF();
    position.set(rectangle);
    ViewParent parent = mParent;
    boolean scrolled = false;
    while (parent != null) {
        rectangle.set((int) position.left, (int) position.top,
                (int) position.right, (int) position.bottom);
        scrolled |= parent.requestChildRectangleOnScreen(child,
                rectangle, immediate);
        if (!child.hasIdentityMatrix()) {
            child.getMatrix().mapRect(position);
        }
        position.offset(child.mLeft, child.mTop);
        if (!(parent instanceof View)) {
            break;
        }
        View parentView = (View) parent;
        position.offset(-parentView.getScrollX(), -parentView.getScrollY());
        child = parentView;
        parent = child.getParent();
    }
    return scrolled;
}
The idea is to make the rectangle visible by scrolling it onto the screen in every containing View, starting at the leaf level and passing the request up the chain of parents. The initial rectangle is given in child coordinates, which of course have to be adjusted as we work our way up the chain of parents. This is done with the statement
position.offset(-parentView.getScrollX(), -parentView.getScrollY());
close to the end of the code above.
What I have found is that this is wrong, because we are transforming a position given in child coordinates using the scroll X/Y values that pertain to the parent's coordinates. Using the scroll X/Y of the child instead solved my problem, but it was not possible to make a perfect override of this routine because it relies on private member variables. Specifically, I found no way of mimicking the mAttachInfo.
Now, digging a bit further, I found that the code for this routine in API-25 has changed significantly and (IMHO) correctly to the following:
public boolean requestRectangleOnScreen(Rect rectangle, boolean immediate) {
    if (mParent == null) {
        return false;
    }
    View child = this;
    RectF position = (mAttachInfo != null) ? mAttachInfo.mTmpTransformRect : new RectF();
    position.set(rectangle);
    ViewParent parent = mParent;
    boolean scrolled = false;
    while (parent != null) {
        rectangle.set((int) position.left, (int) position.top,
                (int) position.right, (int) position.bottom);
        scrolled |= parent.requestChildRectangleOnScreen(child, rectangle, immediate);
        if (!(parent instanceof View)) {
            break;
        }
        // move it from child's content coordinate space to parent's content coordinate space
        position.offset(child.mLeft - child.getScrollX(), child.mTop - child.getScrollY());
        child = (View) parent;
        parent = child.getParent();
    }
    return scrolled;
}
The most important change is the line
position.offset(child.mLeft - child.getScrollX(), child.mTop - child.getScrollY());
where the scroll X/Y adjustment is now made with child values.
Now, I have two questions.
First, do you agree with my observations above?
Second, how do I implement an App that can be used on both API-23 and API-25 under the given circumstances?
My current thought is to subclass the EditText and override the requestRectangleOnScreen method such that when the API is 25 and above it just calls the superclass method, and when the API is below 25 I basically do a full override using code along the lines of the API-25 version, but missing out on the mAttachInfo part.
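A sketch of what that subclass could look like (FixedEditText is a name I made up; the loop follows the quoted API-25 code but uses only public accessors, since mAttachInfo, mLeft and mTop are not reachable from a subclass):
import android.content.Context;
import android.graphics.Rect;
import android.graphics.RectF;
import android.os.Build;
import android.util.AttributeSet;
import android.view.View;
import android.view.ViewParent;
import android.widget.EditText;

public class FixedEditText extends EditText {

    public FixedEditText(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    public boolean requestRectangleOnScreen(Rect rectangle, boolean immediate) {
        if (Build.VERSION.SDK_INT >= 25) {
            // The platform code is already fixed from API-25 on.
            return super.requestRectangleOnScreen(rectangle, immediate);
        }
        if (getParent() == null) {
            return false;
        }
        View child = this;
        RectF position = new RectF(); // no access to mAttachInfo.mTmpTransformRect
        position.set(rectangle);
        ViewParent parent = getParent();
        boolean scrolled = false;
        while (parent != null) {
            rectangle.set((int) position.left, (int) position.top,
                    (int) position.right, (int) position.bottom);
            scrolled |= parent.requestChildRectangleOnScreen(child, rectangle, immediate);
            if (!(parent instanceof View)) {
                break;
            }
            // API-25 behaviour: offset by the child's own scroll, not the parent's.
            position.offset(child.getLeft() - child.getScrollX(),
                    child.getTop() - child.getScrollY());
            child = (View) parent;
            parent = child.getParent();
        }
        return scrolled;
    }
}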

Change a value depending on speed of drag on Android

I am trying to change the value of a number based on sliding my finger. I am currently using ACTION_MOVE to change the value as I drag across my view, but if I drag too fast the number barely changes. If I drag slowly I can get the correct number.
Is there a way to make the change quicker depending on the speed of the motion? I am looking into VelocityTracker, but that only returns the speed of the move and I need to make the change while dragging my finger.
Is there an optimization needed to detect ACTION_MOVE in real-time?
It's more like pseudocode right now, but here is what I would do:
Extend the image view to check whether the event's x value is between the view's starting x value and its starting x plus its width, as such:
public void checkFingerPosition(int eventXPosition) {
    // getX() is this view's left edge within its parent
    if (eventXPosition > this.getX() && eventXPosition < this.getX() + this.getWidth()) {
        imageInterface.showImage(this.getDrawable());
    }
}
I would create an interface that your activity would have to implement:
public interface ImageInterface {
    void showImage(Drawable drawable);
}
In your activity, implement the ImageInterface like so:
implements ImageInterface

@Override
public void showImage(Drawable drawable) {
    // TODO - show drawable here
}
Then, in your touch event's ACTION_MOVE, report the x value to all your image views like so:
case MotionEvent.ACTION_MOVE:
    for (ExtendedImageView extendedImageView : extendedImageViewArray) {
        extendedImageView.checkFingerPosition((int) event.getX());
    }
    break;
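For context, the enclosing touch handler could look roughly like this (a sketch; ExtendedImageView is the subclass described above and extendedImageViewArray is a hypothetical field holding those views):
@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getActionMasked()) {
        case MotionEvent.ACTION_MOVE:
            for (ExtendedImageView extendedImageView : extendedImageViewArray) {
                extendedImageView.checkFingerPosition((int) event.getX());
            }
            break;
    }
    return true; // keep receiving ACTION_MOVE events
}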

How to detect when any child view receives a click

Given an arbitrary ViewGroup G with an arbitrary collection of child views, how can I detect when the user clicks on any of the child views? In this case, I want to draw a highlight for G.
I could add an onClick listener for each child, but I'm trying to avoid that so that the code doesn't have to be changed when the layouts change.
Alternatively, I could add onTouch handlers to G and set the highlight during ACTION_DOWN. However, this would trigger for actions that don't actually result in clicks, such as a swipe (the swipe could be handled by ViewPager, for example, and ultimately be irrelevant to G).
My layout for G has the focusable attributes:
android:focusable="true"
android:focusableInTouchMode="true"
Thanks.
Here is how I do it:
// in the onTouch method of the parent, get the coordinates of the click
int x = (int) motionEvent.getX();
int y = (int) motionEvent.getY();

// obtain the clickable area of the child as a hit Rect
Rect rect = new Rect();
imageView.getHitRect(rect);

// check whether that area contains the coordinates of the click
if (rect.contains(x, y)) {
    // do some work, as if an OnClickListener on the child was called
    return false; // you clicked here, no need to check the other children
}
// check the other children in the same way...
Now you can treat the parent as the delegate of all clicks made inside it, even if a click lands on a child.
EDIT:
To ignore touch events that are not clicks, you can check how far the user moved the finger:
case MotionEvent.ACTION_MOVE:
case MotionEvent.ACTION_CANCEL:
    if (Math.abs(motionEvent.getRawX() - initialTouchX) > 5
            || Math.abs(motionEvent.getRawY() - initialTouchY) > 5) {
        return true; // the user moved the finger too much; ignore the touch
    }
    return false; // finger is still there, waiting for a click
I allow a square of 10 pixels to permit a comfortable click; if you move outside it, I ignore the touch.
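One way to put these pieces together is the self-contained sketch below (parentLayout, imageView and the 5-pixel slop are assumptions; initialTouchX/initialTouchY match the snippet above):
parentLayout.setOnTouchListener(new View.OnTouchListener() {
    private static final int CLICK_SLOP = 5; // pixels of movement still counted as a click
    private float initialTouchX, initialTouchY;

    @Override
    public boolean onTouch(View v, MotionEvent motionEvent) {
        switch (motionEvent.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                // remember where the finger went down
                initialTouchX = motionEvent.getRawX();
                initialTouchY = motionEvent.getRawY();
                return true;
            case MotionEvent.ACTION_UP:
                // treat a small movement as a click, then hit-test the child
                if (Math.abs(motionEvent.getRawX() - initialTouchX) <= CLICK_SLOP
                        && Math.abs(motionEvent.getRawY() - initialTouchY) <= CLICK_SLOP) {
                    int x = (int) motionEvent.getX();
                    int y = (int) motionEvent.getY();
                    Rect rect = new Rect();
                    imageView.getHitRect(rect);
                    if (rect.contains(x, y)) {
                        // the child was "clicked": highlight the parent (G) here
                    }
                }
                return true;
            default:
                return true;
        }
    }
});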
EXTRA:
Here is the complete code for click and long click with onTouchListener.
You could use ViewGroup.getChildCount() to loop through all child views and see if the touch intersects with a child view.
This involves getting the x and y positions and checking whether they fall within the child view; use ViewGroup.getChildAt(position) to get a reference to each child view.
So it would be something like this:
int childNr = theView.getChildCount();
Rect hitRect = new Rect();
for (int i = 0; i < childNr; i++) {
    YourView tmp = (YourView) theView.getChildAt(i);
    tmp.getHitRect(hitRect);
    if (hitRect.contains(x, y)) {
        // do some work
    }
}
Here you would put your own view variable instead of theView and the class of your child views instead of YourView; x and y are the coordinates of the pressed spot.
In your XML, you could point all the children to the same onClick method. Inside that method you could draw the highlight on G and then do something (or nothing) for the individual child view.

Overlapping Fragments and Touch Events

I have a ViewPager with a custom PagerAdapter that displays a set of fragments.
These fragments are (purposely) positioned on top of each other so that I can use a PageTransformer that makes it look as if the user is sliding the fragments from a stack (almost like a deck of cards).
The issue is that each fragment has its own views/widgets (e.g. a SeekBar) which, due to the overlapping, occupy the same coordinates, and sometimes the touch event is caught by the fragment below the current one (e.g. the user adjusts a SeekBar's position, but instead of updating the currently shown SeekBar, it's the SeekBar in the next fragment that gets its progress updated).
I've come across this answer but it's not the same exact problem.
Has anyone ever found a similar issue? What's the smartest way (except for the lazy solution: change the PageTransformer to one that doesn't overlap the fragments) of dealing with this issue?
EDIT:
In my Fragment class I have:
public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
    View rootView = inflater.inflate(R.layout.fragment, container, false);
    rootView.setOnTouchListener(new OnTouchListener() {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            return true;
        }
    });
    return rootView;
}
as suggested by Zsombor Erdődy-Nagy, but this doesn't help: it is still possible for a widget below the current fragment to receive the event instead of the current fragment's widget.
I've also looked at this open issue, with no success.
If you are still looking for the solution, then you should look at this:
Issue 58918.
Here you can find the answer to your problem. Quoting from the link:
If I remember right it was after 4.1 that the framework respects a custom child drawing order as implied Z-ordering for dispatching touch events. If your views overlap after this page transformation they may not receive touch events in the expected order on older platform versions. Check which view is receiving the touch events to be certain.
If this is what you are seeing you have a few options:
Enforce the desired ordering as you add/remove child views in your PagerAdapter
Remove the X translation applied by the PageTransformer when a page is no longer fully visible - i.e. the "position" parameter reports a full -1 or 1.
Example:
this.viewPager.setPageTransformer(true, new PageTransformer() {
    @Override
    public void transformPage(View page, float position) {
        float translationX;
        float scale;
        float alpha;
        if (position >= 1 || position <= -1) {
            // Fix for https://code.google.com/p/android/issues/detail?id=58918
            translationX = 0;
            scale = 1;
            alpha = 1;
        } else if (position >= 0) {
            translationX = -page.getWidth() * position;
            scale = -0.2f * position + 1;
            alpha = Math.max(1 - position, 0);
        } else {
            translationX = 0.5f * page.getWidth() * position;
            scale = 1.0f;
            alpha = Math.max(0.1f * position + 1, 0);
        }
        ViewHelper.setTranslationX(page, translationX);
        ViewHelper.setScaleX(page, scale);
        ViewHelper.setScaleY(page, scale);
        ViewHelper.setAlpha(page, alpha);
    }
});
In this case all your fragments must have a background view that stops touch events from propagating to the fragments in the back.
I'm guessing that you already have opaque backgrounds for these fragments, or else the fragments in the back would show through the fragment currently on top of the stack. So try setting a ClickListener on each fragment's root ViewGroup that does nothing; it just catches the touch events.
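A minimal sketch of that suggestion, reusing the rootView from the question's onCreateView():
// A no-op click listener makes the (opaque) root view consume clicks
// so they never fall through to the fragment behind it.
rootView.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        // intentionally empty: just swallow the click
    }
});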
In case anyone runs into this issue (which can happen not only with ViewPager), it is just a matter of adding this attribute to the layout at the top of your hierarchy:
android:clickable="true"
This way it will handle the clicks, not really doing anything with them, but at least preventing them from reaching the fragment in the background.

android: move to touch location after release with libgdx/universal-tween-engine

I'm learning to use libgdx with the universal-tween-engine and haven't been able to figure out how to touch (or click, in the desktop app) a point on the screen and have a texture move all the way to the touched location without keeping the touch or click active until the end point is reached.
When the touch event is initiated, the animation begins and the graphic moves towards the location. The graphic will follow the finger/mouse pointer if a touch-and-drag is initiated. If I touch a point, the graphic will move towards that point until the touch is released; then it stops where it was when the touch was released.
I'm looking to touch and release and have the graphic move to the touched point, and am probably not understanding something about the tween engine implementation. I've pasted the tweening code below.
public void render() {
    camera.update();
    batch.setProjectionMatrix(camera.combined);
    batch.begin();
    batch.draw(texture.getTexture(), texture.getBoundingBox().x, texture.getBoundingBox().y);
    batch.end();

    Tween.registerAccessor(Plane.class, new TextureAccessor());
    TweenManager planeManager = new TweenManager();

    float newX = 0;
    float newY = 0;
    boolean animateOn = false;

    if (Gdx.input.isTouched()) {
        newX = Gdx.input.getX();
        newY = Gdx.input.getY();
        animateOn = true;
    }

    if (animateOn == true && (texture.getX() != newX || texture.getY() != newY)) {
        Tween.to(texture, TextureAccessor.POSITION_XY, 10)
                .target(newX, newY)
                .ease(TweenEquations.easeNone)
                .start(planeManager);
        planeManager.update(1);

        if (texture.getX() == newX && texture.getY() == newY) {
            animateOn = false;
        }
    }
}
Originally, I had the tweening code inside the conditional for isTouched() and didn't use the newX, newY or animateOn variables. I thought using isTouched() to only set the new coordinates and animation state would then make the loop trigger the tween. The older code looked like this:
if (Gdx.input.isTouched()) {
    newX = Gdx.input.getX();
    newY = Gdx.input.getY();
    Tween.to(texture, TextureAccessor.POSITION_XY, 10)
            .target(newX, newY)
            .ease(TweenEquations.easeNone)
            .start(planeManager);
    planeManager.update(1);
}
I've also tried using justTouched(), but the graphic would only move very slightly toward the touched point.
I've been struggling with this for a few hours, I'd really appreciate it if anyone could point me in the right direction.
Thanks.
Tween.registerAccessor(Plane.class, new TextureAccessor());
TweenManager planeManager = new TweenManager();
These two lines should go in the create() method, not the render() one! Here, you're instantiating a new manager on every frame, you only need one manager, that's all, not an army of them!
Also, you need to update the manager on every frame, not just when animateOn is true, else you'll need to keep your finger pressed...
The correct code is as follows; learn from it and you'll get a better understanding of how the Tween Engine works :)
// Only one manager is needed, like a SpriteBatch
private TweenManager planeManager;

public void create() {
    Tween.registerAccessor(Plane.class, new TextureAccessor());
    planeManager = new TweenManager();
}

public void render() {
    // The manager needs to be updated on every frame.
    planeManager.update(Gdx.graphics.getDeltaTime());

    camera.update();
    batch.setProjectionMatrix(camera.combined);
    batch.begin();
    batch.draw(texture.getTexture(), texture.getBoundingBox().x, texture.getBoundingBox().y);
    batch.end();

    // When the user touches the screen, we start an animation.
    // The animation is managed by the TweenManager, so there is
    // no need to use an "animateOn" boolean.
    if (Gdx.input.justTouched()) {
        // Bonus: if there is already an animation running,
        // we kill it to prevent conflicts with the new animation.
        planeManager.killTarget(texture);

        // Fire the animation! :D
        Tween.to(texture, TextureAccessor.POSITION_XY, 10)
                .target(Gdx.input.getX(), Gdx.input.getY())
                .ease(TweenEquations.easeNone)
                .start(planeManager);
    }
}
It turned out I was trying to implement this behavior in the wrong way. Instead of using isTouched() or justTouched(), I needed to use touchDown() from GestureListener.
I created a class implementing GestureDetector.GestureListener (call it touchListener()) inside my main class (the one that implements ApplicationListener) in the main libgdx project and put the x and y capturing code inside touchDown() (I noticed tap() was also being triggered). I moved the tween functions (the actual tweening, the call to registerAccessor(), and the creation of the new TweenManager) into the update() method of touchListener().
I added a call to touchListener()'s update function inside the render() loop of the main libgdx class.
I doubt I did this the best way, but I hope it's helpful to someone else in the future.
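For anyone following along, a rough sketch of that GestureListener route (reusing the texture, planeManager and TextureAccessor fields from the answer above; the exact wiring here is an assumption, and depending on your camera you may need to unproject the screen coordinates):
// In create(), after planeManager has been set up:
GestureDetector.GestureListener listener = new GestureDetector.GestureAdapter() {
    @Override
    public boolean touchDown(float x, float y, int pointer, int button) {
        planeManager.killTarget(texture); // cancel any move already in progress
        Tween.to(texture, TextureAccessor.POSITION_XY, 10)
                .target(x, y) // screen coordinates; unproject through the camera if needed
                .ease(TweenEquations.easeNone)
                .start(planeManager);
        return true;
    }
};
Gdx.input.setInputProcessor(new GestureDetector(listener));

// render() still has to call planeManager.update(Gdx.graphics.getDeltaTime()) every frame.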
