Strange "stutter" in Box2D on different Android devices - android

I'm developing an engine and a game at the same time in C++, and I'm using Box2D for the physics back end. I'm testing on different Android devices, and on 2 out of 3 devices the game runs fine and so do the physics. However, on my Galaxy Tab 10.1 I'm sporadically getting a sort of "stutter". Here is a YouTube video demonstrating it:
http://www.youtube.com/watch?v=DSbd8vX9FC0
The first device the game is running on is an Xperia Play... the second is a Galaxy Tab 10.1. Needless to say, the Galaxy Tab has much better hardware than the Xperia Play, yet Box2D is lagging at random intervals for random lengths of time. The code for both machines is exactly the same. Also, the rest of the engine/game is not actually lagging: the entire time, it runs at a solid 60 fps. So this "stuttering" seems to be some kind of delay or glitch in actually reading values from Box2D.
The sprites you see moving check at render time whether they have an attached physical body, and set their positional values based on the world position of that body. So it seems to be in this specific process that Box2D gets out of sync with the rest of the application. Quite odd. I realize it's a long shot, but I figured I'd post it here anyway to see if anyone has ideas... since I'm totally stumped. Thanks for any input in advance!
Oh, P.S. I am using a fixed time step, since that seems to be the most commonly suggested solution for things like this. I moved to a fixed time step while developing this on my desktop, where I ran into a similar but more severe issue, and the fixed step was the solution. Also, like I said, the game is running steady at 60 fps, which is controlled by a low-latency timer, so I doubt simple lag is the issue. Thanks again!
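For reference, this is roughly the shape of my loop, boiled down to a minimal sketch (the Sprite type and the numbers are stand-ins, not my actual engine classes). The accumulator is fed by the timer, so if the timer misbehaves, the physics stutters even though rendering holds 60 fps:

    #include <Box2D/Box2D.h>
    #include <chrono>

    // Stand-in for the engine's render object.
    struct Sprite {
        b2Body* body = nullptr;
        float x = 0.0f, y = 0.0f;
    };

    int main() {
        // Note: older Box2D versions take an extra doSleep flag here.
        b2World world(b2Vec2(0.0f, -9.8f));
        Sprite sprite; // sprite.body gets attached to a b2Body elsewhere

        const float step = 1.0f / 60.0f; // fixed physics step
        float accumulator = 0.0f;
        auto previous = std::chrono::steady_clock::now();

        for (;;) {
            auto now = std::chrono::steady_clock::now();
            accumulator += std::chrono::duration<float>(now - previous).count();
            previous = now;

            // Consume real elapsed time in fixed slices.
            while (accumulator >= step) {
                world.Step(step, 8, 3); // velocity/position iterations
                accumulator -= step;
            }

            // Render time: copy the body's world position to the sprite.
            if (sprite.body) {
                const b2Vec2& p = sprite.body->GetPosition();
                sprite.x = p.x;
                sprite.y = p.y;
            }
            // ... draw sprites, swap buffers ...
        }
    }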

As I mentioned in the comments here, this came down to being a timer resolution issue. I was using a timer class which was supposed to access the highest-resolution system timer, cross-platform. Everything worked great, except when it came to Android: some versions worked and some did not. The Galaxy Tab 10.1 was one such case.
I ended up rewriting my getSystemTime() method to use a new addition to C++11 called std::chrono::high_resolution_clock. This also worked great everywhere but Android, since it has yet to be implemented in any NDK for Android. It is supposed to be implemented in version 5 of the CrystaX NDK R7, which at the time of this post is 80% complete.
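For the curious, the rewritten method boils down to something like this (a minimal sketch, not my exact engine code):

    #include <chrono>
    #include <cstdint>

    // Current time in nanoseconds from the highest-resolution clock
    // that std::chrono exposes (C++11).
    int64_t getSystemTime() {
        using namespace std::chrono;
        return duration_cast<nanoseconds>(
            high_resolution_clock::now().time_since_epoch()).count();
    }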
I did some research into various methods of accessing the system time, or anything on which I could base a reliable timer on the NDK side, but what it comes down to is that these methods are not supported on all platforms. I went through the painful process of writing my own engine from scratch simply so that I could support every version of Android, so betting on methods that are inconsistently implemented is nonsensical.
The only sensible solution for anyone facing this problem, in my opinion, is to simply abandon the idea of implementing such code on the NDK side. I'm going to do this on the Java end instead, since thus far that has been sufficiently reliable on every device I've tested. More on that here:
http://www.codeproject.com/Articles/189515/Androng-a-Pong-clone-for-Android#Gettinghigh-resolutiontimingfromAndroid7
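In practice this means grabbing System.nanoTime() in Java once per frame and handing it down to native code; something like the following (the class and method names are made up for illustration):

    #include <jni.h>
    #include <cstdint>

    // Latest frame timestamp, written once per frame from Java.
    static int64_t g_frameTimeNs = 0;

    // The Java side declares a matching native method and calls it once
    // per frame:  nativeSetTime(System.nanoTime());
    extern "C" JNIEXPORT void JNICALL
    Java_com_example_engine_GameView_nativeSetTime(JNIEnv*, jobject,
                                                   jlong timeNs) {
        g_frameTimeNs = static_cast<int64_t>(timeNs);
    }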
Update
I have now implemented my proposed solution of doing the timing on the Java side, and it has worked. I also discovered that handling any relatively large number, regardless of data type (such as the nanoseconds returned by the monotonic clock), on the NDK side also results in serious lagging on some versions of Android. As such, I've optimized this as much as possible by passing around a pointer to the system time, to ensure we're not passing by copy.
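Concretely, by "passing around a pointer" I mean something like this (the names are made up):

    #include <cstdint>

    // The frame timestamp lives in one place and is shared by pointer,
    // so the 64-bit value is never copied through every call in the
    // update path.
    struct FrameClock {
        int64_t nowNs; // written once per frame from the Java side
    };

    void updatePhysics(const FrameClock* clock) {
        // ... step the world using clock->nowNs ...
        (void)clock;
    }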
One last thing too: my statement that calling the monotonic clock from the NDK side is unreliable is, it would seem, false. From the Android docs on System.nanoTime():
...and System.nanoTime(). This clock is guaranteed to be monotonic,
and is the recommended basis for the general purpose interval timing
of user interface events, performance measurements, and anything else
that does not need to measure elapsed time during device sleep.
So it would seem, if this can be trusted, that calling the clock is reliable, but as mentioned there are other issues that then arise, like handling and allocating the massive number that results, which alone nearly cut my framerate in half on the Galaxy Tab 10.1 with Android 3.2. Ultimate conclusion: supporting all Android devices equally is either damn near or flat-out impossible, and using native code seems to make it worse.
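For completeness, the NDK-side monotonic clock the docs refer to is reachable through POSIX clock_gettime(), roughly like so:

    #include <time.h>
    #include <cstdint>

    // CLOCK_MONOTONIC is the native counterpart of Java's
    // System.nanoTime(): it never jumps backwards with wall-clock changes.
    int64_t monotonicNowNs() {
        timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return static_cast<int64_t>(ts.tv_sec) * 1000000000LL + ts.tv_nsec;
    }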

I am very new to game development, and you seem a lot more experienced, so it may be silly to ask, but are you using delta time to update your world? Although you say you have a constant frame rate of 60 fps, maybe your frame counter calculates something wrong, and you should use delta time to skip some frames when the FPS is low, or your world will seem to "stay behind". I am pretty sure that you are familiar with this, but I think a good example is here: DeltaTimeExample, although it is a C implementation. If you need, I can paste some code from my Android projects showing how I use delta time, which I developed following this book: Beginning Android Games.
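The basic idea, sketched in C++ (updateWorld and render are placeholders for your own functions):

    #include <chrono>

    void updateWorld(float dt); // placeholder: scales all movement by dt
    void render();              // placeholder

    // Each frame advances the world by the real time elapsed since the
    // previous frame, so a slow frame makes the world jump ahead instead
    // of falling behind.
    void runLoop() {
        auto previous = std::chrono::steady_clock::now();
        for (;;) {
            auto now = std::chrono::steady_clock::now();
            float dt = std::chrono::duration<float>(now - previous).count();
            previous = now;
            updateWorld(dt);
            render();
        }
    }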

Related

Unity Game suddenly has poor performance on Samsung S8, but still perfect in editor

For the last few days I have been building a simple platformer to be deployed on Android phones. However, after the last deploy to my phone, the framerate was barely at 25 fps on my quite fast Samsung S8. The game in the editor is somewhere around 100 fps and works fine. The game also worked fine on my S8 just a day before. I have not really changed anything, so what could be going on here? This is absolutely unacceptable. If you need further info, tell me!
Thank you!
This is from the profiler:
I cannot tell you why this happened, but this was all I needed to do:
In any script that starts (like player movement or whatever), add this line:
Application.targetFrameRate = 60;
Et voilà.
I didn't have this line before and it worked, but then it stopped working. Adding this line fixed it entirely. Thank you all.
First of all, try reducing the number of Tris and Vertices in your Meshes. Use baked lighting to improve performance, try object pooling wherever possible, and generate an occlusion map to optimize your game for mobile devices. You can set quality settings in Unity for mobile devices. You will need to maintain LOD levels if you want more complex meshes to be rendered on your phone, and optimize the game as much as possible to render it on mobile devices without any glitches. I would also recommend using this toolset, which will help you with optimization.
UPDATE
If you have recently changed anything which made your app's performance bad, then check that you haven't added any coroutine or IEnumerator-related code in your recent changes that may keep iterating over time. Try using Collab to maintain versioning, as you can restore your previous code with the help of Collab anytime in case you lose something; it is totally free and very useful.

Good FPS in Editor but not on devices (Google Cardboard), how to increase FPS?

I'm working on a Google Cardboard game. I have 5-10 sprites, a terrain with 15-30 pieces of bush on it (single bushes, like grass), 15 trees (low poly with <64 vertices), and a Camera (provided by the GVR SDK).
In the editor the framerate is good. But when I test it on my Galaxy S6, the FPS is as low as 10-20 (I've attached a script that writes the framerate to a Text).
Any optimization tip?
Here are the stats details from the Unity editor:
CPU: main 6.0ms
FPS: 160 (average)
Batches: 117
Tris: 6.1k
Verts: 3.3k
I believe you have used the methods GameObject.Find() and gameObject.GetComponent() a lot at runtime.
This is the most common issue among Unity developers.
Better not to use them at runtime at all! Only inside Awake() or Start() methods.
Make all the needed GetComponent<>() calls inside Awake() or Start() and assign the results to member variables. Afterwards, just use those variables at runtime.
This will really increase your game's speed on mobile devices.
Make sure your device is not overheating; it might seem obvious, but you can't always tell when it is on your face (I also own an S6).
Also, make sure you are not in energy-saver mode; sounds dumb, but I fell for it already ;)
Among the huge number of things that can ruin the performance of a game on a smartphone are:
Make sure you don't have a script doing too much work in Update() (especially Instantiate()/Destroy())
Don't move static objects, just don't
Make sure you don't use high-resolution textures (in my small experience, anything above 512x512 is too much), that they are square, and that their resolution is a power of two
As a side note, GetComponent calls can be an issue; the alternative was already posted by @Andrew: just use GetComponent in the Start()/Awake() method and store the results to use later on.

How to get real FPS with createjs

I created a game with createjs, but it runs very slowly on mobile.
I added a function to show the FPS obtained with createjs.Ticker.getMeasuredFPS(). However, the FPS shown by the function is quite normal: I set the FPS to 60, and the result of getMeasuredFPS() is about 55-60, while the animation is laggy and the real FPS can't be that high (it might be 5-10).
How can I get the real FPS on the device?
How can I profile it on a mobile?
getMeasuredFPS() is well tested, and should be accurate. That said, there's always a chance your mobile browser is doing something tricky (running the code every tick, but not rendering). There's also a chance your code may not be updating properly.
Either way, it's worth looking at a couple of alternatives. You can check getMeasuredTickTime(), which will tell you the average time spent in each tick (see the docs for more info).
You should also take a look at profiling using devtools.
http://google.com?q=profile+mobile+chrome

Raycasting vs OpenGL ES 2.0 - is there a noticeable difference for a Doom-like game on Android?

It's my first question. I searched on Google as always; round two was searching directly on SO, but I still couldn't get an exact answer.
I'm going to write a 3D graphics engine for games I want to make in the future for the Android platform. I played Doom recently on my mobile, on a break in high school, and when I lifted my head after being killed by a Baron of Hell I saw more than twenty people struggling to see me playing, followed by a loud "AAAAAAAW!" when they saw me dead. My jaw went down to the floor. None of them suspected that it was a game from 1993.
But let's get back to the point. If you wanted to write "find it out by yourself", you can stop typing now. I can't check this on my own, for different reasons. First, I own only one mid-cost device (an HTC Wildfire) I can test my engine on. The second and more important reason is time. I don't have time to write a whole OpenGL ES 2.0 graphics engine only to realize that my Wildfire can't even run it, or that it gets around 1 FPS when walking.
For raycasting to be more realistic it needs a few extra calculations, which don't matter in OpenGL because there I just set vertices & indices and it goes. I love the way Doom levels are designed, but I wanted to add the ability to move your head up and down (rotation on the X-axis) to look around and shoot more precisely, and that grows the complexity of the raycasting calculations (not for me, for the CPU). I know the Wildfire doesn't have any GPU, and anything based on OpenGL (even 2D) lags like hell even if I overclock my CPU to #748 with the PERFORMANCE governor (it turns off at higher values, and I have to take it out of my Casemate to stop it from overheating). But Doom runs perfectly without any lag even if I underclock to #448. Is this only because of the lo-res textures and non-complex levels, or because of raycasting?
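To give an idea of the calculations involved, here is roughly the per-column work of a classic DDA raycaster, as a minimal sketch (the map, player values, and screen size are made up):

    #include <cmath>

    const int MAP_W = 8, MAP_H = 8, SCREEN_W = 320, SCREEN_H = 240;
    const int worldMap[MAP_H][MAP_W] = {
        {1,1,1,1,1,1,1,1}, {1,0,0,0,0,0,0,1}, {1,0,0,0,0,0,0,1},
        {1,0,0,1,1,0,0,1}, {1,0,0,1,1,0,0,1}, {1,0,0,0,0,0,0,1},
        {1,0,0,0,0,0,0,1}, {1,1,1,1,1,1,1,1}};

    int main() {
        double posX = 2.5, posY = 2.5;      // player position
        double dirX = 1.0, dirY = 0.0;      // facing direction
        double planeX = 0.0, planeY = 0.66; // camera plane (sets the FOV)

        for (int x = 0; x < SCREEN_W; ++x) {
            double cameraX = 2.0 * x / SCREEN_W - 1.0;
            double rayX = dirX + planeX * cameraX;
            double rayY = dirY + planeY * cameraX;

            int mapX = (int)posX, mapY = (int)posY;
            double deltaX = std::fabs(1.0 / rayX);
            double deltaY = std::fabs(1.0 / rayY);
            int stepX = rayX < 0 ? -1 : 1, stepY = rayY < 0 ? -1 : 1;
            double sideX = (rayX < 0 ? posX - mapX : mapX + 1.0 - posX) * deltaX;
            double sideY = (rayY < 0 ? posY - mapY : mapY + 1.0 - posY) * deltaY;

            // March through the grid until the ray hits a wall cell.
            int side = 0;
            while (worldMap[mapY][mapX] == 0) {
                if (sideX < sideY) { sideX += deltaX; mapX += stepX; side = 0; }
                else               { sideY += deltaY; mapY += stepY; side = 1; }
            }
            double dist = (side == 0) ? sideX - deltaX : sideY - deltaY;
            int lineHeight = (int)(SCREEN_H / dist);
            // ... draw a vertical wall slice of lineHeight pixels at column x ...
            (void)lineHeight;
        }
        return 0;
    }

Looking up and down can then be faked cheaply by shifting the horizon line when drawing the slices, rather than by any real X-axis rotation.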
And please don't flame me for going back to such an old thing as a raycasting engine. Today's mobile devices, or smartphones - call them what you want - that don't cost $1k are at the stage computers were at 18 years ago (Doom's release). There will always be a group of such devices. And even if I don't make my billion on this, I will have something to play while travelling.
I've got some pretty nice ideas for games like this, and I want to target low-cost devices, since they don't have anything more spectacular than simple logic games. As this is my first question, please correct anything I wrote wrongly, poorly, or just badly - I'll rewrite it.

Scheduling android graphics events

I asked this question over in the Android developers' user group last week. Nobody responded, so I thought I'd ask it over here.
Does anyone have any suggestions about how to schedule video events to happen at an exact clock time? I've been thinking about an application that would require two adjacent phones to display the same thing at exactly the same time. I'm wondering what the granularity of "exactly" is going to be.
I've done some testing on a couple of devices, and it seems that the delay between an invalidate and the subsequent redraw can be as much as 16ms. Perhaps I can do better with OpenGL?
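One primitive that might help is POSIX's absolute-time sleep, which the NDK exposes; a minimal sketch (keeping the two phones' wall clocks in sync, e.g. via NTP, is a separate problem):

    #include <time.h>

    // Sleep until an absolute wall-clock time, then fire the event.
    // With both phones' clocks synced, each would wake at the "same"
    // moment, within clock-sync and scheduler error.
    void fireAtRealtime(time_t sec, long nsec) {
        timespec target = {sec, nsec};
        clock_nanosleep(CLOCK_REALTIME, TIMER_ABSTIME, &target, nullptr);
        // ... trigger the redraw / post the frame here ...
    }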
Ideas? Anyone?
OpenGL itself is capable of very high framerates (unless I am mistaken). What I can tell you is that plenty of games have been written to run at and maintain 30 frames per second. That's one frame every 33.3ms. At that speed, the change between frames should be imperceptible to the human eye, or so I've heard.
However, there is a major difference between what OpenGL can do and what the device running OpenGL can do. Again, unless I am mistaken, you should be able to instruct OpenGL to run at 200 frames per second. The caveat is that if the machine you are running the animation on can't handle that framerate, it will either frame-skip or lag, and in either case will hog the processor and GPU like no other.
Again, as I don't know the specifics, I can only guess, but I would think that this is less of an issue with OpenGL vs the other leading brand, and more of an issue of the devices you are trying to sync. With the right code, a proven framework, two powerful machines, and high-speed data transfer capability (read: LAN at the least), there is no reason why you shouldn't be able to sync up the video. If any of these things are not the case, all bets are off.
-Cody
