I am currently developing a game for Android, and I would like your expertise on an issue that I have been having.
Background:
My game incorporates frame-rate-independent motion: it takes the delta-time value into account before performing the necessary velocity calculations.
The game is a traditional 2D platformer.
The Issue:
Here's my issue (simplified). Let's pretend that my character is a square standing on top of a platform (with "gravity" being a constant downward velocity of characterVelocityDown).
I have defined the collision detection as follows (assuming Y axis points downwards):
Given characterFootY is the y-coordinate of the base of my square character, platformSurfaceY is the upper y-coordinate of my platform, and platformBaseY is the lower y-coordinate of my platform:
if (characterFootY + characterVelocityDown > platformSurfaceY
        && characterFootY + characterVelocityDown < platformBaseY) {
    // Collision is true: snap the character onto the platform
    characterFootY = platformSurfaceY;
    characterVelocityDown = 0;
} else {
    characterVelocityDown = deltaTime * 6;
}
This approach works perfectly fine when the game is running at regular speed. However, if the game slows down, deltaTime (the elapsed time between the previous frame and the current frame) becomes large, characterFootY + characterVelocityDown overshoots the boundaries that define the collision, and the character falls straight through the platform (as if teleporting).
How should I approach this issue to prevent this?
Thanks in advance for your help and I am looking forward to learning from you!
What you need to do is run your physics loop with a constant delta time, iterating it as many times as the current frame's elapsed time requires.
const float PHYSICS_TICK = 1 / 60.f; // fixed timestep: 60 physics updates per second
float m_dt = 0.f;                    // accumulates frame time that has not been simulated yet

void Update( float dt )
{
    m_dt += dt;
    while( m_dt > PHYSICS_TICK )
    {
        UpdatePhysics( PHYSICS_TICK ); // always step physics by the same constant amount
        m_dt -= PHYSICS_TICK;
    }
}
There are various techniques for handling the leftover time (m_dt), such as carrying it into the next frame (as above) or interpolating the rendered state between the last two physics steps.
Caps on the minimum and maximum tick are also a must; see the sketch below.
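A minimal sketch of such a capped fixed-timestep loop, written in Java to match the Android context (MAX_DELTA, MAX_STEPS and their values are illustrative assumptions, not taken from any framework):

// A capped fixed-timestep accumulator; update() is assumed to be called once per rendered frame.
public class PhysicsLoop {
    private static final float PHYSICS_TICK = 1f / 60f; // fixed step: 60 physics updates per second
    private static final float MAX_DELTA = 0.25f;       // cap huge frame times (e.g. after a long pause)
    private static final int MAX_STEPS = 5;             // avoid the "spiral of death" on slow devices

    private float accumulator = 0f;

    public void update(float dt) {
        accumulator += Math.min(dt, MAX_DELTA);
        int steps = 0;
        while (accumulator >= PHYSICS_TICK && steps < MAX_STEPS) {
            updatePhysics(PHYSICS_TICK); // constant step, so collision checks stay reliable
            accumulator -= PHYSICS_TICK;
            steps++;
        }
        // Any time still left in the accumulator simply carries into the next frame.
    }

    private void updatePhysics(float tick) {
        // integrate velocities and run collision detection here
    }
}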
I guess the issue here is that slowdowns are inevitable. You can try to optimize the code, but you'll always have users with slow devices or busy sections of your game where processing takes a little longer than usual. Instead of assuming a consistent delta, assume the opposite: code under the realization that someone could try installing it on an abacus.
So basically, as SeveN says, make your game loop handle slowdowns. The only real way to do this (in my admittedly limited experience) would be to place a cap on how large delta can be. This will result in your clock not running exactly on time, but when you think about it, that's how most games handle slowdown. You don't fire up StarCraft on your Pentium 66 and have it run at 5 FPS but full speed; it slows down and processes everything as normal, albeit as a slideshow.
If you did such a thing, during periods of slowdown your game would visibly slow down... but the calculations should still all be spot on (a minimal version of the cap is sketched below).
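In terms of the question's own variables, the cap could be as small as this (the 0.05 s ceiling is an assumed value, roughly three frames at 60 FPS; tune it for your game):

// Clamp the frame delta before it feeds any velocity, so a single long frame
// can never push the character past the platform's collision bounds.
private static final float MAX_DELTA = 0.05f; // assumed cap; tune for your game

private float clampDelta(float deltaTime) {
    return Math.min(deltaTime, MAX_DELTA);
}

// In the movement code from the question:
// characterVelocityDown = clampDelta(deltaTime) * 6;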
edit: just realised you're SeveN. Well done.
We're using a d3.layout.force on a web app, and I've been investigating a bug report that it is sluggish on Android: it feels like the nodes are in oil, compared to how it works on desktop browsers, or iOS.
(By the way, we only ever have between 4 and 9 nodes, and the sluggishness does not feel different between 4 and 9.)
We set size(), linkDistance() and charge(); so we're using the defaults for friction, theta, alpha, gravity, etc. I experimented with these to try and reproduce the effect on desktop, but couldn't. (friction(0.67), instead of default of 0.9, was closest, but still felt different, somehow.)
I then set up an FPS meter (based on calls to the tick() function). We get 60 fps on desktop, and it seems to be in the 40s and 50s on an iPad. But on Android Chrome (on a Nexus 7) it seems capped at 30 fps, and is often half that. Android Firefox was in the 20s normally, but sometimes got into the 30s.
So, is it a reasonable hypothesis that Android devices are just slower? Could there be a 30 fps cap in Android Chrome?
Then how can I fix this? I believe d3.js uses requestAnimationFrame(). Often animation libraries take the time between calls to requestAnimationFrame() to decide how far to move objects (so when the CPU gets overloaded the animation becomes jerkier, but takes the same amount of time to complete). But it appears that d3.js does not do this, and moves everything the same amount by tick, not by elapsed time. What can I do about this?
(Ideally I'd like a solution based on how slow/fast the machine is, rather than having to sniff the browser.)
Curiously, adding more calls to force.tick() in my own requestAnimationFrame() handler (see https://stackoverflow.com/a/26189110/841830), does increase the FPS. That suggests it is not CPU bound, but instead a limit that Android is enforcing (perhaps to save battery?).
Here is the code I'm using, which tries to adapt dynamically to the current fps; it ain't beautiful, but it seems to be getting the job done on my test Android devices, without changing the behaviour on iOS or desktop.
First, where you set up the force layout:
var ticksPerRender = 0;
var animStartTime, animFrameCount;

force.on('start', function start() {
    // Reset the FPS counter whenever the layout (re)starts
    animStartTime = new Date();
    animFrameCount = 0;
});

requestAnimationFrame(function render() {
    // Run extra simulation ticks per rendered frame (zero by default)
    for (var i = 0; i < ticksPerRender; i++) force.tick();
    if (force.alpha() > 0) requestAnimationFrame(render);
});
The above does two things:
sets up the fps counter
sets up our own animation callback, which does nothing by default (ticksPerRender starts off as zero).
Then at the end of your tick handler:
++animFrameCount;
if (animFrameCount >= 15) { // wait for 15 frames, to get an accurate estimate
    var now = new Date();
    var fps = (animFrameCount / (now - animStartTime)) * 1000;
    if (fps < 30) { // too slow: squeeze an extra tick into each rendered frame
        ticksPerRender++;
        animStartTime = now; animFrameCount = 0; // reset the fps counter
    }
    if (fps > 60 && ticksPerRender >= 1) { // comfortably fast: back off again
        ticksPerRender--;
        animStartTime = now; animFrameCount = 0; // reset the fps counter
    }
}
This says that if the FPS is low (below 30), do an extra call to tick() on each animation frame. And if it goes high (over 60), remove that extra call.
Each time ticksPerRender is changed, we measure the FPS from scratch.
I'm trying to detect patterns of knocking on the surface that the device is located on.
For example, if a user knocks one time on the surface, run method A. If the user knocks two times, run method B.
I have everything I need to make this happen except for the logic in the onSensorChanged method.
This is my code right now:
@Override
public void onSensorChanged(SensorEvent event) {
    float x = event.values[0];
    float y = event.values[1];
    float z = event.values[2];

    // Check the time interval; not sure if this is correct though.
    long actualTime = System.currentTimeMillis();
    if ((actualTime - lastUpdate) > 100) {
        long diffTime = (actualTime - lastUpdate);
        lastUpdate = actualTime;
        // This is where the magic should happen
    }
}
I guess the main question is: how do I detect vibrations? Almost all other examples on the net are about detecting shakes and movement.
Real answer: you're going to need to study DSP to get good results. This is a non-trivial problem.
Quick overview: when a knock occurs, you're going to see a sinusoidal, attenuating wave pattern (the attenuating signal after the main vibration is called "ringing", and it's a bad thing for us: it means we need to separate the ringing from real results). A vibration can be detected by looking for rapid changes in amplitude on the downward vector (whichever axis has gravity on it at the moment). The relative heights of the wave peaks correspond to the relative strengths of the knocks.
So detecting one knock is fairly easy. Things that aren't easy:
* Telling the difference between a knock and footsteps across the room: both cause vibrations, and they'll look the same. You may be able to filter footsteps out via frequency analysis and filters.
* Telling two knocks from one knock in a short time frame. The second knock tends to be weaker and will be difficult to tell apart from the ringing of the first knock. It may also interfere destructively with the first wave.
* Telling exactly when a knock occurred. There will be a time delay that may not be constant, and figuring it out means trying to find an exact peak, which is difficult to do with noise.
* Telling a knock in a noisy environment (vibrationally noisy, not sound). Again, you'll need filtering.
I've actually done this, somewhat, and mostly failed, I think. We were able to detect knocks well, but not to filter out noise at all. Of course, we were looking for extremely small (one-finger) knocks; if you're looking for sharp raps you'll have fewer problems, as the spike will be much larger than the noise level. If you're expecting a single sharp knock, the basics of looking for large spikes and ignoring secondary spikes for N milliseconds afterwards (sketched below) may be enough for you. If it isn't, you're going to spend a lot of time on this.
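A minimal sketch of that spike-plus-dead-time idea, filling in the "magic" from the question's code (the threshold, the 500 ms dead time, and the onKnock() hook are all assumptions you would tune or replace):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class KnockDetector implements SensorEventListener {
    private static final float KNOCK_THRESHOLD = 3.0f; // m/s^2 jump; assumed, tune per device
    private static final long DEAD_TIME_MS = 500;      // ignore ringing/secondary spikes this long

    private float lastMagnitude = SensorManager.GRAVITY_EARTH;
    private long lastKnockTime = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        float magnitude = (float) Math.sqrt(x * x + y * y + z * z);

        long now = System.currentTimeMillis();
        float jump = Math.abs(magnitude - lastMagnitude);
        lastMagnitude = magnitude;

        // A sharp spike outside the dead time counts as one knock.
        if (jump > KNOCK_THRESHOLD && (now - lastKnockTime) > DEAD_TIME_MS) {
            lastKnockTime = now;
            onKnock();
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    private void onKnock() {
        // Hypothetical hook: count knocks within a window, then run method A or B.
    }
}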
I am using the following code to calculate the frame rate in Unity3D 4.0. It's applied to an NGUI label.
public UILabel FPSLabel;              // NGUI label that displays the frame rate
public float updateInterval = 0.5f;   // how often the label refreshes, in seconds

private float timeleft;               // time left in the current interval
private float accum;                  // FPS accumulated over the interval
private int frames;                   // frames drawn over the interval

void Update () {
    timeleft -= Time.deltaTime;
    accum += Time.timeScale / Time.deltaTime;
    ++frames;

    // Interval ended - update GUI text and start a new interval
    if (timeleft <= 0.0) {
        // display two fractional digits (F2 format)
        float fps = accum / frames;
        FPSLabel.text = System.String.Format("{0:F2} FPS", fps);
        timeleft = updateInterval;
        accum = 0.0f;
        frames = 0;
    }
}
It was working previously, or at least seemed to be. Then I had some problems with physics, so I changed the Fixed Timestep to 0.005 and the Maximum Allowed Timestep to 0.017. Yeah, I know that's very low, but my game works fine with those values.
Now the problem is that the FPS code above returns 58.82 all the time. I've checked on separate (Android) devices. It just doesn't budge. I thought it might be correct, but in the profiler I can clearly see ups and downs. So obviously something is fishy.
Am I doing something wrong? I copied the code from somewhere (it must have been the script wiki). Is there any other way to get the correct FPS?
Taking cues from this question, I've tried all the methods in its first answer. Even the following code returns a constant 58.82 FPS. It happens on the Android device only; in the editor I can see the FPS vary.
float fps = 1.0f/Time.deltaTime;
So I checked the value of Time.deltaTime, and it's a constant 0.017 on the device. How can this be possible? :-/
It seems to me that the fps counter is correct and the FPS of 58.82 is caused by the changes in your physics time settings. The physics engine probably cannot finish its computation in the available timestep (0.005, which is very low), and that means it will keep computing until it reaches the maximum timestep, which in your case is 0.017. That means all frames will take 0.017 plus any other overhead from rendering / scripts you may have. And 1 / 0.017 equals exactly 58.82.
Maybe you can fix any problems you have with the physics in other ways, without lowering the fixed timestep so much.
I'm trying to create a game for Android, and I have a problem with high-speed objects: they don't want to collide.
I have a Sphere with a Sphere Collider and a Bouncy material, and a Rigidbody with these parameters (Gravity = false, Interpolate = Interpolate, Collision Detection = Continuous Dynamic).
I also have 3 walls with Box Colliders and a Bouncy material.
This is my code for the Sphere:
function IncreaseBallVelocity() {
    rigidbody.velocity *= 1.05;
}

function Awake () {
    rigidbody.AddForce(4, 4, 0, ForceMode.Impulse);
    InvokeRepeating("IncreaseBallVelocity", 2, 2);
}
In Project Settings I set "Min Penetration For Penalty Force" = 0.001 and "Solver Iteration Count" = 50.
When I play, it works fine at the start (the ball bounces), but when the speed gets too high, the Sphere just passes through the wall.
Can anyone help me?
Thanks.
Edited
var hit : RaycastHit;
var mainGameScript : MainGame;
var particles_splash : GameObject;

function Awake () {
    rigidbody.AddForce(4, 4, 0, ForceMode.Impulse);
    InvokeRepeating("IncreaseBallVelocity", 2, 2);
}

function Update() {
    if (rigidbody.SweepTest(transform.forward, hit, 0.5))
        Debug.Log(hit.distance + "mts distance to obstacle");

    if (transform.position.y < -3) {
        mainGameScript.GameOver();
        //Application.LoadLevel("Menu");
    }
}

function IncreaseBallVelocity() {
    rigidbody.velocity *= 1.05;
}

function OnCollisionEnter(collision : Collision) {
    Instantiate(particles_splash, transform.position, transform.rotation);
}
EDITED: added more info
Fixed Timestep = 0.02, Maximum Allowed Timestep = 0.333
There is no difference between running the game in the editor player and on Android.
No. It looks OK when I set 0.01.
My paddle is a Box Collider without a Rigidbody; the walls are the same.
They are all in the same layer (when the speed is normal, it all works); the values in PhysicsManager are the defaults (same as in the image), except "Solver Iteration Count" = 50.
No. When I change the speed, it passes through another wall.
I am using a standard cube, but I expand/shrink it to fit my screen and the other objects; when I expand the wall more, it's OK, it bounces.
No. It's a simple project, a simple example from this video: http://www.youtube.com/watch?v=edfd1HJmKPY
I don't use gravity.
See:
1. Similar SO Question
2. A community script that uses ray tracing to help manage fast objects
3. UnityAnswers post leading to the script in (2)
You could also try changing the fixed time step for physics. The smaller this value, the more often Unity calculates the physics of the scene. But be warned: making this value too small, say <= 0.005, will likely result in an unstable game, especially on a portable device.
The script above is best for bullets or small objects. You can also manually force rigid body collision tests:
public class example : MonoBehaviour {
    public RaycastHit hit;

    void Update() {
        if (rigidbody.SweepTest(transform.forward, out hit, 10))
            Debug.Log(hit.distance + "mts distance to obstacle");
    }
}
I think the main problem is the manipulation of the Rigidbody's velocity. I would try the following to solve the problem.
Redesign your code to ensure that IncreaseBallVelocity and every other manipulation of the Rigidbody is called within FixedUpdate. Check that there are no other manipulations of Transform.position.
Try to replace setting the velocity directly with AddForce or similar methods, so the physics engine has a better chance to calculate all dependencies.
If more items (the main player character, ...) are involved in the physics calculation, ensure that their code runs in FixedUpdate too.
Another point I stumbled upon was meshes that are scaled a lot. A GameObject with scale <= 0.01 or >= 100 definitely has a negative impact on the physics calculation. According to the docs and this Unity forum entry from one of the gurus, you should avoid Transform.scale values != 1.
Still not happy? OK, then the next test is to start with high velocities but no acceleration. At this phase we want to know whether the high velocity itself or the acceleration is to blame for the problem. It would be interesting to know the velocity values at which the physics engine starts to fail; please post them so that we can compare.
EDIT: Some more things to investigate
6.7 m/sec does not sound like much, so I guess there is a special reason, or a combination of reasons, why things go wrong.
Is your Maximum Allowed Timestep high enough? For testing, I suggest 5 to 10x the Fixed Timestep. Note that this might kill the frame rate, but that can be fixed later.
Is there any difference between running the game in the editor player and on Android?
Did you notice any drops in frame rate because of the 0.01 FixedTimestep? This would indicate that the physics engine might be in trouble.
Could it be that there are static colliders (objects having a collider but no Rigidbody) that are moved around or manipulated otherwise? This would cause heavy recalculations within PhysX.
What about the layers: are all walls on the same layer, and are the involved layers configured appropriately in the collision detection matrix?
Does the no-bounce effect always happen at the same wall? If so, can you copy the first wall and put it in place of the second one, to see if there is something wrong with that specific wall?
If it's not too much effort, I would try to set up some standard cubes as walls, just to be sure that transform.scale is not to blame (I've had really bad experiences with this).
Do you manipulate gravity or TimeManager.timeScale from within a script?
BTW: are you using gravity? (Should be no problem just
I'll try to explain what I mean.
I'm developing a 2D game. When I run the code below on a small screen, it works more quickly than the same code on a big screen. I think this is because an iteration of the game loop takes more time on the big screen than on the small one. How can I introduce a time unit (or something else) so that movement doesn't depend on how fast the game loop iterates?
private void createDebris() {
    if (dx <= 0) return;
    if (stepDebris == 2) {
        Debris debris = new Debris(gameActivity, dx -= 1280 * coefX / 77,
                800 * coefY - 50 * coefY, coefX, coefY);
        synchronized (necessaryObjects) {
            necessaryObjects.add(debris);
        }
        stepDebris = -1;
        Log.e("COUNT", (count++) + "");
    }
    stepDebris++;
}
P.S. Debris is a visual object which is drawn on a canvas. I'll appreciate your answers. Thanks.
If stepDebris is the unit by which you move objects on the screen, then incrementing it per draw call is wrong, because it ties the rate of movement to the frame rate.
What you want is something like this
stepDebris += elapsedMilliseconds * speedFactor
where elapsedMilliseconds is the time elapsed since the previous frame (in ms). Once you find the correct speedFactor for stepDebris, it will move at the same speed on different machines/resolutions, irrespective of frame rate. A minimal sketch of the idea follows.
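For instance (a sketch only; updateDebris(), SPEED_FACTOR and the field names are illustrative assumptions):

// Advance debris by elapsed wall-clock time instead of by draw calls, so the
// movement speed is the same regardless of frame rate or screen size.
private long lastFrameTime = System.currentTimeMillis();
private float stepDebris = 0f;
private static final float SPEED_FACTOR = 0.05f; // assumed units per millisecond; tune to taste

private void updateDebris() {
    long now = System.currentTimeMillis();
    long elapsedMilliseconds = now - lastFrameTime; // time since the previous frame
    lastFrameTime = now;

    stepDebris += elapsedMilliseconds * SPEED_FACTOR;
}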
Hope this helps!
You can make an 'iteration' take x milliseconds by measuring the time it takes to do the actual iteration (= y), and afterwards sleeping for (x - y) milliseconds (see the sketch below the link).
See also step 3 of this tutorial
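A rough sketch of that fixed-length iteration (the 16 ms budget is an assumption, roughly 60 FPS; running and updateAndDraw() are illustrative names):

// Give every game-loop iteration a fixed time budget by sleeping off the remainder.
private static final long FRAME_MS = 16; // x: the target iteration length, ~60 FPS

public void run() {
    while (running) {
        long start = System.currentTimeMillis();
        updateAndDraw();                                   // the actual iteration (takes y ms)
        long elapsed = System.currentTimeMillis() - start; // y
        long sleepTime = FRAME_MS - elapsed;               // x - y
        if (sleepTime > 0) {
            try {
                Thread.sleep(sleepTime);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }
}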
Android provides the Handler API, which implements timed event loops, to control the timing of computation. This article is pretty good.
You will save battery life by implementing a frame rate controller with this interface rather than redrawing as fast as you can. A minimal sketch follows.
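Something along these lines (a sketch, assuming a 16 ms frame interval and a hypothetical updateAndDraw() method):

import android.os.Handler;
import android.os.Looper;

public class FrameRateController {
    private static final long FRAME_INTERVAL_MS = 16; // assumed target: roughly 60 FPS

    private final Handler handler = new Handler(Looper.getMainLooper());
    private final Runnable frame = new Runnable() {
        @Override
        public void run() {
            updateAndDraw();                              // advance the game state and redraw
            handler.postDelayed(this, FRAME_INTERVAL_MS); // schedule the next frame
        }
    };

    public void start() { handler.post(frame); }
    public void stop()  { handler.removeCallbacks(frame); }

    private void updateAndDraw() {
        // game update and view invalidation go here
    }
}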