I am using Marmalade to create a project with Cocos2d-X, but I am not able to get multitouch working in my project. The goal is to detect pinch gestures to zoom in and out.
I've enabled touches in my GameLayer.cpp file using:
this->setTouchMode(kCCTouchesAllAtOnce);
this->setTouchEnabled(true);
I've also added the configuration below to my application's ICF file:
[S3E]
AndroidPointMultiEnable = 1
I have tested my application both on the simulator (with multitouch enabled there) and on an Android tablet. On the simulator the touches are not even registered by my application, and on the Android tablet I receive each touch as a separate event rather than both at the same time.
Could you help me?
Thanks
UPDATE
Here is my code for ccTouchesBegan:
void GameLayer::ccTouchesBegan(CCSet* pTouches, CCEvent* event)
{
    CCSetIterator it;
    CCTouch *touch;
    CCPoint touchA;
    CCPoint touchB;

    IwTrace(APPLICATION, ("Touches Began - touch count: %d", pTouches->count()));

    it = pTouches->begin();
    if (usePinchGesture && pTouches->count() == 2)
    {
        touch = (CCTouch*) (*it);
        touchA = touch->getLocation();
        it++;
        touch = (CCTouch*) (*it);
        touchB = touch->getLocation();
        pinchDistance = GeomHelper::getDistanceSq(touchA, touchB);
        IwTrace(APPLICATION, ("Pinch gesture detected. Starting distance between points: %f", pinchDistance));
    }
}
The problem is that the touch count, pTouches->count(), is always 1, so each touch event gets treated separately.
Thanks
Yep, pTouches->count() is always 1 on Android!
cocos2d-x v2.2.3
In ..\cocos2dx\platform\android\jni\TouchesJni.cpp:

...

JNIEXPORT void JNICALL Java_org_cocos2dx_lib_Cocos2dxRenderer_nativeTouchesBegin(JNIEnv * env, jobject thiz, jint id, jfloat x, jfloat y) {
    cocos2d::CCDirector::sharedDirector()->getOpenGLView()->handleTouchesBegin(1, &id, &x, &y);
}

The first parameter is always hard-coded to 1, so each pointer reaches the engine as a separate single-touch event.
On Android, multi-touch is enabled by default. You don't need to enable anything before getting the touch coordinates in MyLayer::ccTouchesBegan/Moved/Ended.
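Given the diagnosis above (each pointer arrives in its own ccTouchesBegan call with count() == 1), a possible workaround is to cache touches by ID across events and test the size of that cache instead of pTouches->count(). A minimal sketch, assuming a hypothetical activeTouches member declared as std::map<int, CCPoint> (requires <map>) and erased again in ccTouchesEnded/Cancelled:

// Workaround sketch: accumulate pointers across the separate single-touch
// callbacks, then detect the pinch once two distinct touch IDs are active.
void GameLayer::ccTouchesBegan(CCSet* pTouches, CCEvent* event)
{
    for (CCSetIterator it = pTouches->begin(); it != pTouches->end(); ++it)
    {
        CCTouch* touch = (CCTouch*)(*it);
        activeTouches[touch->getID()] = touch->getLocation();
    }
    if (usePinchGesture && activeTouches.size() == 2)
    {
        std::map<int, CCPoint>::iterator a = activeTouches.begin();
        CCPoint touchA = a->second;
        CCPoint touchB = (++a)->second;
        pinchDistance = GeomHelper::getDistanceSq(touchA, touchB);
    }
}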
I'm an experienced native iOS developer making my first foray into Android through Unity. I'm trying to set up a custom shader, but I'm having some trouble with the Normal maps. I've got them working perfectly in the Unity simulator on my computer, but when I build to an actual device (Samsung Galaxy S8+), the Normal maps don't work at all.
I'm using Mars as my test case. Here's the model running in the simulator on my computer:
And here's a screenshot from my device, running exactly the same code.
I've done a LOT of research, and apparently using Normal maps on Android with Unity is not an easy thing. There are a lot of people asking about it, but almost every answer I've found has said the trick is to override the texture import settings, and force it to be "Truecolor" which seems to be "RGBA 32 Bit" according to Unity's documentation. This hasn't helped me, though.
Another thread suggested reducing the Aniso Level to zero, and another suggested turning off Mip Maps. I don't know what either of those are, but neither helped.
Here's my shader code, simplified but containing all references to Normal mapping:
void surf (Input IN, inout SurfaceOutputStandard o) {
    half4 d = tex2D (_MainTex, IN.uv_MainTex);
    half4 n = tex2D (_BumpMap, IN.uv_BumpMap);
    o.Albedo = d.rgb;
    o.Normal = UnpackNormal(n);
    o.Metallic = 0.0;
    o.Smoothness = 0.0;
}
I've seen some threads suggesting replacements for the "UnpackNormal()" function in the shader code, indicating that it might not be the thing to do on Android or mobile in general, but none of the suggested replacements have changed anything for better or worse: the normal maps continue to work in the simulator, but not on the device.
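For reference, the replacement that usually gets suggested simply unpacks the raw channels by hand; a hedged sketch, assuming the texture is imported as a plain RGBA32 texture (not marked as a normal map) so its channels reach the shader unchanged:

half4 n = tex2D (_BumpMap, IN.uv_BumpMap);
// Manual unpack of an RGB-encoded normal; bypasses UnpackNormal's
// platform-dependent DXT5nm path.
o.Normal = normalize(n.rgb * 2.0 - 1.0);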
I've even tried making my own normal maps programmatically from a grayscale heightmap, to try to circumvent any import settings I may have done wrong. Here's the code I used, and again it works in the simulator but not on the device.
public Texture2D NormalMap(Texture2D source, float strength = 10.0f) {
    Texture2D normalTexture;
    float xLeft;
    float xRight;
    float yUp;
    float yDown;
    float yDelta;
    float xDelta;

    normalTexture = new Texture2D (source.width, source.height, TextureFormat.RGBA32, false, true);
    for (int y = 0; y < source.height; y++) {
        for (int x = 0; x < source.width; x++) {
            xLeft = source.GetPixel (x - 1, y).grayscale * strength;
            xRight = source.GetPixel (x + 1, y).grayscale * strength;
            yUp = source.GetPixel (x, y - 1).grayscale * strength;
            yDown = source.GetPixel (x, y + 1).grayscale * strength;
            xDelta = ((xLeft - xRight) + 1) * 0.5f;
            yDelta = ((yUp - yDown) + 1) * 0.5f;
            normalTexture.SetPixel (x, y, new Color (xDelta, yDelta, 1.0f, yDelta));
        }
    }
    normalTexture.Apply ();
    return normalTexture;
}
Lastly, in the Build Settings, I've got the Platform set to Android and I've tried it using Texture Compression set to both "Don't Override" and "ETC (default)". The former was the original setting and the latter seemed to be Unity's suggestion both by the name and in the documentation.
I'm sure there's just some flag I haven't checked or some switch I haven't flipped, but I can't for the life of me figure out what I'm doing wrong here, or why there would be such a stubborn difference between the simulator and the device.
Can anyone help a Unity newbie out, and show me how these damn Normal maps are supposed to work on Android?
Check under:
Edit -> Project Settings -> Quality
Android is usually set to Fastest; the lower quality levels reduce settings such as texture quality, which can make effects that look fine in the editor fail on the device.
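If you want to verify this from code, here is a hedged sketch (QualityProbe is a hypothetical name) that logs which quality level the device build actually runs and forces full-resolution textures at runtime:

using UnityEngine;

// Hypothetical diagnostic component: attach to any object in the scene.
public class QualityProbe : MonoBehaviour
{
    void Start()
    {
        // Which quality level is this build actually running?
        Debug.Log("Quality level: " + QualitySettings.names[QualitySettings.GetQualityLevel()]);
        // 0 = full-resolution textures (higher values drop mip levels).
        QualitySettings.masterTextureLimit = 0;
    }
}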
I'm having a bit of a problem with OnTriggerEnter when I'm using my mobile as a test device.
I have some touch code that successfully lets me drag objects around the screen.
I am then having the objects collide with other objects on the screen.
This was working perfectly until I turned the objects into prefabs. (I need to do this because the objects are randomly generated at runtime.)
Now I can still move the objects around the screen, but they no longer collide with the other objects, which are also prefabs. It does, however, still work fine when running on my laptop in the Unity editor.
All my objects have colliders on them with trigger checked, and the moving objects have rigidbodies.
OnTriggerEnter code:
public void OnTriggerEnter(Collider other)
{
    Debug.Log ("here");
    Debug.Log (this.gameObject.tag + " is this");
    Debug.Log (other.gameObject.tag + " is other");

    if (this.gameObject.tag == other.gameObject.tag)
    {
        Debug.Log ("here2");
        Reftomanager.miniGameScore++;
        Reftomanager.updateScore ();
        Destroy (this.gameObject);
    }
}
Touch code:
if (Input.touchCount > 0)
{
    Touch touch = Input.GetTouch(0);
    switch (touch.phase)
    {
        case TouchPhase.Began:
            Ray ray = Camera.main.ScreenPointToRay (touch.position);
            if (Physics.Raycast (ray, out hit))
            {
                thisObject = hit.collider.gameObject;
                touchPos = Camera.main.ScreenToWorldPoint (touch.position);
                if (thisObject.name != "circle")
                {
                    draggingMode = true;
                }
            }
            break;

        case TouchPhase.Moved:
            if (draggingMode)
            {
                touchPos = Camera.main.ScreenToWorldPoint (touch.position);
                newCentre = touchPos;
                thisObject.transform.position = touchPos;
            }
            break;

        case TouchPhase.Ended:
            draggingMode = false;
            break;
    }
}
I'm completely stumped so any help would be amazing.
Thanks
I just got this same error recently. I suggest using:

if (other.gameObject.CompareTag ("YourTagName"))

Also, if you recently added or edited any tags, I found that Unity has a bug where your tags will not register in your Android build unless you restart Unity.
GL.
Since you're using 3D colliders, is it possible that the position you are assigning them is different? Touch.position is a Vector2, which means ScreenToWorldPoint would use 0 for z. If you are using a Vector3 with a non-zero z to get the world point in the editor (standalone input), it could give you a different value even if x and y are the same.
Another possibility is that a platform-specific error is happening somewhere else in the code, upon object instantiation. Your movement code would still work fine if it isn't in the same MonoBehaviour.
If you have an Android device, you can use Android Monitor with the Unity tag to check for error messages.
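To test the z theory against the dragging code above, a small sketch (dragDepth is a hypothetical field holding the camera-to-playfield distance):

// ScreenToWorldPoint interprets z as distance from the camera; with a
// raw Vector2 that distance is 0, so the object lands on the camera
// plane instead of at the colliders' depth.
Vector3 screenPos = touch.position;   // implicit Vector2 -> Vector3, z == 0
screenPos.z = dragDepth;              // hypothetical: distance from camera to the play area
touchPos = Camera.main.ScreenToWorldPoint (screenPos);
thisObject.transform.position = touchPos;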
I'm new to Unity, and I am trying to build a solar system exploration app. I have the environment set up, and now all I need is the ability to look around smoothly by tilting and moving the phone itself (which is Android). I can look around, but if I do a complete 180 the physical orientation of the phone gets inverted relative to the visual movement in-game: after turning 180 degrees, tilting the phone down shifts my vision in-game to the right, and tilting up shifts it to the left. Here is the code I have thus far:
#pragma strict

private var quatMult : Quaternion;
private var quatMap : Quaternion;

function Start () {
    Input.gyro.enabled = true;
}

function Update () {
    #if UNITY_ANDROID
    quatMap = Input.gyro.attitude;
    #endif
    transform.localRotation = Quaternion.Euler(90, 0, 0) * quatMap * Quaternion(0, 0, 1, 0) /*quatMult*/;
}
Any help is greatly appreciated. Thanks.
This should be what you're looking for: https://gist.github.com/chanibal/baf46307c4fee3c699d5. Just drag it onto the camera and it should work.
You might want to remove the reset-on-touch part (the Input.touchCount > 0 check in Update) and the debug information (the OnGUI method).
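If you would rather keep your own script, a commonly used remapping (not necessarily what the gist does) is to negate z and w to flip the gyro's right-handed attitude into Unity's left-handed space, then tilt 90 degrees around X. A minimal C# sketch of that idea:

using UnityEngine;

public class GyroCamera : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true;
    }

    void Update()
    {
        // Flip handedness (negate z and w), then pitch 90 degrees so
        // holding the phone upright looks at the horizon, not the floor.
        Quaternion att = Input.gyro.attitude;
        transform.localRotation =
            Quaternion.Euler(90f, 0f, 0f) * new Quaternion(att.x, att.y, -att.z, -att.w);
    }
}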
I wrote a very simple Android application that lets me draw on the pad. Touch the screen with a finger and you will see a green ball; move your finger and you will see a red line.
But I found a very strange thing: if I touch the screen with two fingers one after the other very fast, it draws a line between them! (Imagine pressing two keys on the keyboard in fast alternation: jkjkjkjkjkjk.)
The key code is pretty simple:
public boolean onTouch(View v, MotionEvent event) {
    int action = event.getAction();
    switch (action & MotionEvent.ACTION_MASK) {
        case MotionEvent.ACTION_DOWN:
            multiTouch = false;
            id = event.getPointerId(0);
            PointF p = getPoint(event, 0);
            path = new Path();
            path.moveTo(p.x, p.y);
            paths.add(path);
            points.add(copy(p));
            break;
        case MotionEvent.ACTION_POINTER_DOWN:
            multiTouch = true;
            for (int i = 0; i < event.getPointerCount(); i++) {
                int tId = event.getPointerId(i);
                if (tId != id) {
                    points.add(getPoint(event, i));
                }
            }
            break;
        case MotionEvent.ACTION_MOVE:
            if (!multiTouch) {
                p = getPoint(event, 0);
                path.lineTo(p.x, p.y);
            }
            break;
    }
    invalidate();
    return true;
}
The full source is here: https://github.com/freewind/TouchTest/blob/master/src/com/example/MyImageView.java
And it's a working demo: https://github.com/freewind/TouchTest
Or you can just download the signed apk on your android device, and test it yourself: https://github.com/freewind/TouchTest/blob/master/TouchTest.apk?raw=true
You can see in my code that I check for multi-touch and disable drawing in that case.
My Android version is 4.0, and my build target is 2.3.3.
Here is a picture from my Android pad:
You can see there are some lines, but there should not be: there should be a green ball to the left of the red line instead.
I'm not sure why Android treats a fast single touch as a move. I considered three possible reasons:
My code has something wrong
The Android SDK has something wrong
My Android pad has something wrong, e.g. it misses an ACTION_DOWN event
How can I find out the real reason?
UPDATE
One of my friends used his Android mobile (Android 2.1) to test this app and found there were no red lines; another used Android 2.3.5 and found there were red lines.
Please review my code: I check for multi-touch via ACTION_POINTER_DOWN, and do nothing on ACTION_MOVE if there is more than one point, so the id of the point is not needed. (Actually, in my first version of this code I used the id, but it had the same issue.)
And I don't think this is expected behaviour, because it makes developing touch programs hard. I found this issue because in another application of mine (where the user can drag/zoom/rotate an image with their fingers), the image sometimes "jumps" on screen.
I even tried a popular game (Fruit Ninja) on my Android pad and on an iPod touch, and found that the Android version has the issue but the iPod touch doesn't.
Now I'm sure something is wrong (an ACTION_UP event goes missing when the first finger lifts), but I still don't know what causes it. My Android pad? Or the Android SDK?
That is the way multitouch works. When you press quickly, Android handles it as a gesture, and you will have two pressed pointers. To avoid it, try handling ACTION_UP, or use ACTION_POINTER_DOWN instead.
You can check the id of the touch, so that you handle only the first touch alone.
Alternatively, you can monitor all touches and handle them together, as in the sketch below.
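A minimal sketch combining both suggestions, assuming a hypothetical int field activeId (-1 while no stroke is in progress): track the first pointer's id, and end the stroke on any up event, so a second fast tap cannot be mistaken for a move of the first finger.

public boolean onTouch(View v, MotionEvent event) {
    switch (event.getActionMasked()) {
        case MotionEvent.ACTION_DOWN:
            activeId = event.getPointerId(0);
            // start a new path at event.getX(0), event.getY(0)
            break;
        case MotionEvent.ACTION_MOVE:
            int index = event.findPointerIndex(activeId);
            if (index >= 0) {
                // extend the path with event.getX(index), event.getY(index)
            }
            break;
        case MotionEvent.ACTION_POINTER_UP:
        case MotionEvent.ACTION_UP:
            if (event.getPointerId(event.getActionIndex()) == activeId) {
                activeId = -1; // stroke finished; ignore later moves
            }
            break;
    }
    v.invalidate();
    return true;
}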
I'm working on a custom view for an android application, similar to the Analog Gauge sample code available from Mind the Robot.
Running the code from the listed site, I see this on my screen:
(Motorola Droid, 2.2.3), (Emulator, 4.0.3)
(Xoom, 4.0.3) (Other phone, 4.0.3)
The hand is missing!
The drawing calls are being made (I can see them in logcat), but the canvas elements the calls draw are invisible.
It's not API level dependent, though; if I import it the right way into a project, the hand will show up when I run it on the Xoom.
But when I move the files to a different project folder (same source code, same layouts), it goes back to missing the dial.
What's going on? How could the same code be producing such different outcomes on different devices?
So, the key clue in my mystery seemed to be that it worked on the emulator, but not on the hardware devices.
Hardware Rendering
I did peruse the hardware rendering page on the Android Developer's website, but apparently not closely enough.
http://developer.android.com/guide/topics/graphics/hardware-accel.html
While it does mention that the APIs are available beginning with version 11, it does not say that hardware rendering is turned on for all applications by default starting with API level 14 (ICS).
What does this mean for us?
Almost everything is faster, except for the few things that don't work.
I managed to violate two of those unsupported operations without realizing it:
Canvas.drawTextOnPath()
Paint.setShadowLayer()
It's not mentioned in the API reference (or anywhere else I can find, and it's certainly not checked by Lint), but using any of the listed operations can do weird things.
In my case, Canvas.drawTextOnPath() seemed to work just fine.
But when Android noticed that the paint I used on the hand had a shadow layer set, it silently ignored it.
How do I know if my View is hardware accelerated?
From the documentation link above:
There are two different ways to check whether the application is hardware accelerated:
View.isHardwareAccelerated() returns true if the View is attached to a hardware accelerated window.
Canvas.isHardwareAccelerated() returns true if the Canvas is hardware accelerated.
If you must do this check in your drawing code, use Canvas.isHardwareAccelerated() instead of View.isHardwareAccelerated() when possible. When a view is attached to a hardware accelerated window, it can still be drawn using a non-hardware accelerated Canvas. This happens, for instance, when drawing a view into a bitmap for caching purposes.
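For a custom view, a quick way to see both answers at once is to log them from onDraw(); a minimal sketch (the log tag is arbitrary):

@Override
protected void onDraw(Canvas canvas) {
    // The two checks can disagree, as described below.
    Log.d("GaugeView", "view accelerated: " + isHardwareAccelerated()
            + ", canvas accelerated: " + canvas.isHardwareAccelerated());
    // ... actual drawing calls ...
}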
In my case, the opposite appears to have occurred.
The custom view logs that it is not Hardware-accelerated; however, the canvas reports that it is hardware-accelerated.
Workarounds and Fixes
The simplest fix is forcing the custom view to do software rendering. Per the documentation, this can be accomplished by:
myView.setLayerType(View.LAYER_TYPE_SOFTWARE, null);
Alternatively, you could remove the offending operations, and keep hardware rendering turned on.
Learn from my misfortune. Good luck, all.
I put it into init() and it worked fine after that.
private void init() {
    setLayerType(View.LAYER_TYPE_SOFTWARE, null);
    ....
}
With the myView.setLayerType(View.LAYER_TYPE_SOFTWARE, null) suggestion I can see the hand. But I still have a problem: I see the scale with only 0 written! As in the picture, and two strange zeros outside the schema (Galaxy Nexus, 4.2.1).
My drawScale() method is as in the example:
private void drawScale(Canvas canvas) {
    canvas.drawOval(scaleRect, scalePaint);
    canvas.save(Canvas.MATRIX_SAVE_FLAG);
    for (int i = 0; i < totalNicks; ++i) {
        float y1 = scaleRect.top;
        float y2 = y1 - 0.020f;

        canvas.drawLine(0.5f, y1, 0.5f, y2, scalePaint);

        if ((i % 5) == 0) {
            int value = nickToDegree(i);
            if ((value >= minDegrees) && (value <= maxDegrees)) {
                String valueString = Integer.toString(value);
                canvas.drawText(valueString, 0.5f, y2 - 0.015f, scalePaint);
            }
        }

        canvas.rotate(degreesPerNick, 0.5f, 0.5f);
    }
    canvas.restore();
}
In my case I did this:
AnalogView bar = (AnalogView) findViewById(R.id.AnalogBar);
bar.setLayerType(View.LAYER_TYPE_SOFTWARE, null);
if (value_list.size() > 0)
    bar.SetData(Double.parseDouble(value_list.get(value_list.size() - 1)));
where SetData in AnalogView is:
public void SetData(double data) {
    setHandTarget((float) data);
    invalidate();
}
On a Galaxy S4 with Android 4.4.2, TYPE_TEMPERATURE is deprecated; use TYPE_AMBIENT_TEMPERATURE instead.
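A minimal registration sketch for the replacement sensor, e.g. in onResume(); listener here is a hypothetical SensorEventListener that forwards event.values[0] to the gauge (for example via SetData above):

SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor thermometer = sm.getDefaultSensor(Sensor.TYPE_AMBIENT_TEMPERATURE);
if (thermometer != null) { // not every device has an ambient thermometer
    sm.registerListener(listener, thermometer, SensorManager.SENSOR_DELAY_NORMAL);
}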
For anyone having problems with the text drawn on the scale, do this in the initialisation:
scalePaint.setLinearText(true);