I am using an InputProcessor for touch input in a Flappy Bird-like game. This works fine on my Droid Turbo and a couple of other newer phones, but on my two older tablets, a Xoom and a Verizon tablet, touchDown occasionally doesn't fire. I should mention that the FPS is 60 throughout gameplay. Also, I use an InputMultiplexer which adds both the player input processor and the HUD/play stages. Could this just be a problem with older Android? Is there any fix? I'm fairly sure it isn't my code, since it works on the newer phones.
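For context, the wiring is roughly like this (a simplified sketch, not my exact code; stage stands in for my HUD/play stages and the anonymous InputAdapter for my player input):
InputMultiplexer multiplexer = new InputMultiplexer();
multiplexer.addProcessor(stage);              // HUD / play stage
multiplexer.addProcessor(new InputAdapter() { // player input
    @Override
    public boolean touchDown(int screenX, int screenY, int pointer, int button) {
        jump();
        return true; // consume the event
    }
});
Gdx.input.setInputProcessor(multiplexer);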
EDIT
I tried using Gdx.input.isTouched() like so:
if (Gdx.input.isTouched()) {
    if (!touched) {
        jump();
    }
    touched = true;
} else {
    touched = false;
}
But it gives me the same results as the InputProcessor :\
This is not a problem with the jump method; as of right now it just prints "touched" to the console.
It is an issue with the viewport: different phones have different screen sizes. Please check the link below, which should help you:
http://stackoverflow.com/questions/39810169/libgdx-text-not-rendering-properly-on-larger-screens/39946652#39946652
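As a minimal sketch of what that looks like in libGDX (the FitViewport choice and the 800x480 virtual size are just examples, adjust to your game):
// Give the stage a viewport so touch coordinates are unprojected
// consistently on every screen size.
OrthographicCamera camera = new OrthographicCamera();
FitViewport viewport = new FitViewport(800, 480, camera); // virtual size
Stage stage = new Stage(viewport);

// In resize(), keep the viewport in sync with the physical screen:
// viewport.update(width, height, true);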
I'm attempting to extract the white balance parameters from the auto white balance algorithm on the S9. On every other device I've tested, it gives meaningful parameters back (the numbers have a floating-point precision of about 6 digits and are constantly changing), but the S9 appears to round its result parameters to the nearest whole number, which ends up giving some very poor results in terms of color balance. Here's the code I am using to do this:
if (result.get(CaptureResult.COLOR_CORRECTION_GAINS) != null) {
    channelVector = result.get(CaptureResult.COLOR_CORRECTION_GAINS);
}
Has anybody else run into this issue, and if so, are there any solutions out there?
Consider working with the custom Samsung Camera SDK. These days, it is based on camera2.
Specifically, they provide their own COLOR_CORRECTION_GAINS. They also explain that
… the camera device may do additional processing but android.colorCorrection.gains and android.colorCorrection.transform will still be provided by the camera device (in the results) and be roughly correct.
(the emphasis is mine)
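If you stay on plain camera2, one workaround you can try is switching to manual color correction once you have a usable reading. A rough sketch (builder is your CaptureRequest.Builder; the gain values are placeholders, not recommendations):
// Disable AWB so the pipeline honours manually supplied correction values.
builder.set(CaptureRequest.CONTROL_AWB_MODE, CameraMetadata.CONTROL_AWB_MODE_OFF);
builder.set(CaptureRequest.COLOR_CORRECTION_MODE,
        CameraMetadata.COLOR_CORRECTION_MODE_TRANSFORM_MATRIX);

// Re-apply gains previously read from CaptureResult.COLOR_CORRECTION_GAINS
// (channel order: red, green-even, green-odd, blue). Placeholder values here.
builder.set(CaptureRequest.COLOR_CORRECTION_GAINS,
        new RggbChannelVector(2.0f, 1.0f, 1.0f, 1.8f));

// A full implementation would also set COLOR_CORRECTION_TRANSFORM from a
// previously captured ColorSpaceTransform.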
Using OpenGL ES 1.1 (I don't have a choice at this time). Target OS is Android.
I'm having some inconsistency when rendering to the main framebuffer, and when rendering to a texture.
When I render to the normal screen, everything is fine. When I render to a texture, I get a dark rim around my graphics wherever alpha is translucent.
Here are my helper functions:
void RenderNormal()
{
    if (!gIsRenderToTexture)
    {
        // Drawing to the default framebuffer: standard alpha blending.
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        glBlendEquationOES(GL_FUNC_ADD_OES);
    }
    else
    {
        // Drawing into a texture: blend the alpha channel separately so the
        // destination alpha ends up correct.
        glBlendFuncSeparateOES(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
                               GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
        glBlendEquationSeparateOES(GL_FUNC_ADD_OES, GL_FUNC_ADD_OES);
    }
}

void RenderAdditive()
{
    glBlendFunc(GL_SRC_ALPHA, GL_ONE);
}

void RenderMultiply()
{
    glBlendFunc(GL_ZERO, GL_SRC_COLOR);
}
So, some data:
On newer systems, this works just fine (also on iOS, OSX, and Linux)
On Kindle Fire, I still get the dark rims.
On an older Android device running KitKat, my additive/multiply functions don't turn off (I assume because of the juggling between glBlendFunc and glBlendFuncSeparate... I'm not turning something off, but whatever I try to do to fix it makes it worse)
I'm looking for a way to square these three functions so that they can operate both with render to texture, and with rendering to the normal ol' screen. Can you assist?
Okay, after a day of working and research I finally figured out that on the target device the OES blend extensions are not supported. So for anyone having this problem: the Kindle Fire and a lot of older tablets just outright don't support glBlendFuncSeparateOES or glBlendEquationOES, and they will fail SILENTLY.
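A rough sketch of how you could detect this at runtime and fall back (as far as I can tell, GL_OES_blend_func_separate is the extension that provides glBlendFuncSeparateOES; adapt the helper to your own setup):
#include <string.h>
#include <GLES/gl.h>
#include <GLES/glext.h>

extern bool gIsRenderToTexture; // flag from the helpers above

// Returns true if the given extension name appears in the GL extension string.
static bool HasExtension(const char* name)
{
    const char* exts = (const char*)glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, name) != NULL;
}

void RenderNormalSafe()
{
    if (gIsRenderToTexture && HasExtension("GL_OES_blend_func_separate"))
    {
        glBlendFuncSeparateOES(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
                               GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
    }
    else
    {
        // Fallback for devices (Kindle Fire, older tablets) without the extension.
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    }
}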
I use Unity 3D with the Samsung Gear VR. I have a working OVRPlayerController in my scene, but I am having difficulties mapping the Oculus tap, swipe and back button.
I have tried with something like:
if (Input.GetMouseButtonDown(0))
{
    Debug.Log("input detected");
}
And with this I detect the tap encircled in red
I have also tried something like:
if (OVRInput.Get(OVRInput.Button.PrimaryThumbstick))
{
    Debug.Log("Input detected");
}
Or:
if (OVRInput.Get(OVRInput.Button.One))
{
    Debug.Log("Input detected");
}
But nothing seems to work. Is there any documentation that explains how the input I encircled in yellow is mapped on the Samsung Gear VR? Does anyone have experience with this, or can you maybe guide me to some useful documentation on this matter?
Cheers
My project settings for input:
For swipes, you can get the current Vector2 on the touchpad (either the HMD and/or the incoming remote controller touchpad) by using:
CurrentVector = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);
As for the back button, it is accessible as OVRInput.Button.Back with either Get, GetDown or GetUp. But know that a two-second-long back press is reserved for the Universal Menu and should not have any implications inside your game or app.
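Putting the two together, a small sketch of how you might poll these in Update() (assumes the Oculus Utilities OVRInput is initialized, e.g. via an OVRManager in the scene; the 0.5f swipe threshold is just an illustrative value):
using UnityEngine;

public class GearVrInputProbe : MonoBehaviour
{
    private Vector2 lastTouch = Vector2.zero;

    void Update()
    {
        // Tap still comes through as a mouse click on the Gear VR touchpad.
        if (Input.GetMouseButtonDown(0))
            Debug.Log("tap");

        // Current finger position on the touchpad; compare frames to detect swipes.
        Vector2 touch = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);
        if ((touch - lastTouch).x > 0.5f)
            Debug.Log("swipe-like movement to the right");
        lastTouch = touch;

        // Short back press (a ~2 s press is reserved for the Universal Menu).
        if (OVRInput.GetDown(OVRInput.Button.Back))
            Debug.Log("back button");
    }
}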
I instantiate the following gameObject, which contains an Animator with the culling mode "Always Animate" on; the animation runs for 340 ms, and after that time I destroy the gameObject.
The gameObject Inspector properties:
I instantiate it using the following code:
instancia = (Instantiate(cardAnimation, new Vector3(0, 0, 0), Quaternion.identity) as GameObject).GetComponent<Image>();
instancia.rectTransform.SetParent(transform);
StartCoroutine(KillOnAnimationEnd());
Here is the Coroutine:
private IEnumerator KillOnAnimationEnd()
{
    yield return new WaitForSeconds(0.34f);
    DestroyImmediate(instancia);
}
Here is how the animation looks when simulating in Unity (PC/Windows):
But on Android, after I open the chest it waits 340 ms with nothing happening and then shows the information above. Does this have something to do with the platform, or is it some Unity or perhaps code-related issue?
NOTE: I also have another animation in another scene that is just an already-instantiated gameObject in the Hierarchy with Always Animate on, and it works on Android.
--EDIT--
So I ran the newest version of the app in an emulator at roughly 1080x480 and the animation showed just as on the PC; running on a 720p smartphone also did the job. The only problem I'm still having is with the QuadHD resolution of my Galaxy S6: everything else shows except the animation. I have even tried making the animation run without any script, so it runs in a loop, but it still doesn't show up on the Galaxy's screen.
Given this news about the issue, I think it might change the perspective of the answers a little and perhaps help someone else solve the same problem in the future.
Okay, I figured out the problem: it's something to do with "rotation" in animations using Unity3D in 2D mode. I'm going to report it to Unity so it can be fixed.
The solution: animate your UI using only scale/position; if rotation is used, the animation will not show on high-resolution displays.
I am pretty sure your WaitForSeconds(0.34f) is not working properly because there is no such thing as the yield keyword in Java. I recommend you use an Invoke call instead to run the method that destroys your GameObject.
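For what it's worth, a sketch of the Invoke-based variant this answer suggests (KillInstance is a hypothetical method name inside the same MonoBehaviour; whether it behaves differently from the coroutine is something you would have to test on the device):
// Instead of StartCoroutine(KillOnAnimationEnd()), schedule a plain method:
Invoke("KillInstance", 0.34f);

// ...elsewhere in the same MonoBehaviour:
private void KillInstance()
{
    DestroyImmediate(instancia);
}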
Kinetic scrolling is very important for developing mobile applications with Qt, and I noticed that it's not smooth on devices (tried with an Android device and an iPhone). It looks a bit choppy, jumping from one position to the next and following the finger movement with a delay. This is visible especially when trying to scroll slowly. The other applications on the devices scroll a lot more smoothly, especially on the iPhone.
I made a simple test project with a QFrame (ui->frame) containing only buttons. The buttons are added to a QVBoxLayout. This frame is added to a QScrollArea object which reacts to touch events, and this scroll area is added to a QGridLayout. So the scrolling is only in the vertical direction.
I have this code in the constructor of my class, which is based on QFrame:
ui->frame->setSizePolicy( QSizePolicy::Expanding, QSizePolicy::Fixed );
ui->frame->setMinimumHeight( 1000 );
ui->frame->setMaximumHeight( 1000 );
m_scrollArea = new QScrollArea();
m_scrollArea->setWidget( ui->frame );
m_scrollArea->setWidgetResizable( true );
m_scrollArea->setHorizontalScrollBarPolicy( Qt::ScrollBarAlwaysOff );
m_scrollArea->setVerticalScrollBarPolicy( Qt::ScrollBarAlwaysOff );
m_layout = new QGridLayout();
m_layout->addWidget( m_scrollArea );
m_layout->setContentsMargins( 0, 0, 0, 0 );
setLayout( m_layout );
QScroller::grabGesture( m_scrollArea, QScroller::LeftMouseButtonGesture );
Am I doing something wrong in my code, and what can I do to fix this? Is someone else experiencing the same thing? I want my application to look as native as possible, and this choppy scrolling is really not something normal.
If you need more information I will try to provide it. I may try to upload my test project somewhere and add a screen capture of the device if needed.
The Qt version that I'm using is 5.1.1 for Android and a self-built Qt 5.1.0 RC1 for iOS.
I added
QScrollerProperties sp;
sp.setScrollMetric( QScrollerProperties::DragStartDistance, 0.001 );
sp.setScrollMetric( QScrollerProperties::ScrollingCurve, QEasingCurve::Linear );
QScroller* qs = QScroller::scroller( m_scrollArea );
qs->setScrollerProperties( sp );
The DragStartDistance setting makes the scrolling more responsive. What other properties can I fine-tune to make the scrolling look better?
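For reference, these are the other metrics from QScrollerProperties that look relevant and that I plan to experiment with (the values below are just starting points, not recommended settings):
QScrollerProperties sp;
// How quickly flicks decelerate (smaller keeps the view gliding longer).
sp.setScrollMetric( QScrollerProperties::DecelerationFactor, 0.1 );
// Smoothing applied to the drag velocity before a flick is started.
sp.setScrollMetric( QScrollerProperties::DragVelocitySmoothingFactor, 0.02 );
// Resistance when dragging past the content boundaries.
sp.setScrollMetric( QScrollerProperties::OvershootDragResistanceFactor, 0.3 );
// Target frame rate of the scrolling animation.
sp.setScrollMetric( QScrollerProperties::FrameRate,
                    QVariant::fromValue<QScrollerProperties::FrameRates>( QScrollerProperties::Fps60 ) );
QScroller::scroller( m_scrollArea )->setScrollerProperties( sp );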
I also noticed that the GUI's paintEvent() is not called every time QEvent::Scroll is received from the QScrollArea, which I guess may lead to choppy scrolling. So I added code to repaint the GUI every time I receive QEvent::Scroll, and the scrolling looks a little smoother, but it's still not perfect on the Android device.
What else can I try?
I'll appreciate any help that you can give me. Thanks!
The solution for your problem is here: qtcentre.org. Take a look at my post there. My QScroller is working fine now!
Here is the Qt code in case the link breaks:
Try setting the verticalScrollMode of your view to ScrollPerPixel:
setVerticalScrollMode(QAbstractItemView::ScrollPerPixel);