We've just started using Firebase Analytics and have exported all the events to BigQuery. While processing the "app_remove" event we noticed something odd: sometimes our servers record activity from the app after the timestamp of the app_remove event (i.e. event_dim.timestamp_micros).
While running, the app periodically contacts our servers, and we record the UTC time of each request. Sometimes the latest activity time is later than the timestamp of the app_remove event. The largest time difference we've noticed is 12h 23m 17s.
Each app instance gets a unique certificate that it uses when authenticating with our servers, so once an uninstall completes it is impossible to record any new calls for that app instance, even after a reinstall: a new certificate is issued, and the activity time is recorded in a different row in the database.
How could this occur? Is the timestamp of app_remove set by the client, so that an incorrect clock on the user's phone could cause this? How else could it happen?
The event_dim.timestamp_micros is the UTC time at which the event was logged on the client, based on the device clock, so it is indeed subject to errant clock settings on the device.
Related
I am developing an app using Android and the Firebase Realtime Database, where users join a room; when the host presses "start game", all clients launch the main game Activity (via a ValueEventListener on a "Started" child node in the room). The main game has a 60-second countdown in which users compose a sentence; at the end of the 60 seconds all sentences are collected and displayed.
I am having a hard time collecting all of the sentences at the end because the 60-second timers drift so far apart on different clients. I need a way to ensure all games end at the same time so the collection process is smooth and nothing gets missed.
I know that Firebase offers both /.info/serverTimeOffset and ServerValue.TIMESTAMP, but I'm struggling with how to use them to sync the timers.
I have tried using System.currentTimeMillis() + serverTimeOffset to estimate the server time and having all clients count down endTime - (System.currentTimeMillis() + serverTimeOffset), where endTime is a time written to the database by the host that all clients read, but the timers are still way off.
What is the best way to handle this situation?
I would suggest:
Instead of running a timer that updates every second on the server, simply store the start/stop times of the event and let the clients manage their own timers.
The countdown should not be driven by subtracting one every second (setInterval and other client-side timers are not very precise) but by comparing the current timestamp to the end time and displaying the difference.
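A minimal sketch of that approach with the Firebase Android SDK, assuming the host writes an absolute endTime (estimated server time + 60,000 ms) under the room node when pressing start; the node names and UI hooks are illustrative:

    import android.os.Handler;
    import android.os.Looper;
    import androidx.annotation.NonNull;
    import com.google.firebase.database.DataSnapshot;
    import com.google.firebase.database.DatabaseError;
    import com.google.firebase.database.DatabaseReference;
    import com.google.firebase.database.FirebaseDatabase;
    import com.google.firebase.database.ValueEventListener;

    // Counts down to a shared absolute end time instead of running N
    // independent "subtract one per second" timers.
    public class SyncedCountdown {
        private double serverTimeOffsetMs = 0; // from /.info/serverTimeOffset
        private final Handler handler = new Handler(Looper.getMainLooper());

        public void start(DatabaseReference roomRef) {
            FirebaseDatabase.getInstance().getReference(".info/serverTimeOffset")
                .addValueEventListener(new ValueEventListener() {
                    @Override public void onDataChange(@NonNull DataSnapshot s) {
                        Double v = s.getValue(Double.class);
                        if (v != null) serverTimeOffsetMs = v;
                    }
                    @Override public void onCancelled(@NonNull DatabaseError e) { }
                });

            // The host writes this once when pressing "start game":
            // roomRef.child("endTime").setValue(estimatedServerTime() + 60_000L);

            roomRef.child("endTime").addValueEventListener(new ValueEventListener() {
                @Override public void onDataChange(@NonNull DataSnapshot s) {
                    Long endTime = s.getValue(Long.class);
                    if (endTime != null) tickUntil(endTime);
                }
                @Override public void onCancelled(@NonNull DatabaseError e) { }
            });
        }

        private long estimatedServerTime() {
            return System.currentTimeMillis() + (long) serverTimeOffsetMs;
        }

        private void tickUntil(final long endTime) {
            handler.post(new Runnable() {
                @Override public void run() {
                    long remainingMs = endTime - estimatedServerTime();
                    if (remainingMs <= 0) {
                        onCountdownFinished();          // collect sentences here
                    } else {
                        onCountdownTick(remainingMs);   // update the UI
                        handler.postDelayed(this, 250); // redraw interval only;
                    }                                   // timing comes from the clock
                }
            });
        }

        // Hypothetical hooks; wire these to the game/UI code.
        protected void onCountdownTick(long remainingMs) { }
        protected void onCountdownFinished() { }
    }

Clients will still finish some tens of milliseconds apart (the offset estimate carries network jitter), so allow a small grace window when collecting the sentences.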
I have been running some tests to try to send events from my application to Pub/Sub or BigQuery in real time via the Firebase SDK and Google Tag Manager.
The function is triggering in real time in Logcat.
But even though Logcat seems to show real-time triggers, the data appears in Stackdriver sometimes in real time and sometimes 5 to 10 minutes later.
I am wondering whether GTM is holding events back from being sent in real time when many events arrive at the same time, and whether there is a way to remove that delay.
A more general question would be: can we send data from Firebase to a real-time DB using GTM, or is that a myth? (I know there are Analytics Cloud Functions for Firebase.)
The GTM SDKs send traffic in batches to avoid the excessive power drain of repeatedly turning on the cell radio or wifi.
I have an application that relies heavily on current timestamps. Currently, when a user submits a request, I get the current UTC timestamp using System.currentTimeMillis(). While this works fine, it becomes a problem when the user manipulates the Date/Time on their device, which results in inaccurate timestamps.
Why do it on the client then? Why not just handle it on the server? Well, my application needs to work offline. All my requests get pushed into a job queue when internet connectivity is unavailable. In these cases I must have the original time at which the user performed the action: if I submit a request at 4:02pm but, due to network problems, the server only receives it around 7:30pm, the server MUST know that I sent the request at 4:02pm.
Now what options have I considered?
1. Upon user login, sync the device time with the server time and store that time locally. If any manipulation occurs while the user is logged in, a BroadcastReceiver listening for Date/Time-change intents stores the offset, so whenever the user submits a request I can combine the synced time with the offset to keep the timestamp accurate.
2. Build a server time-sync API in the backend and set up a service within my application that continuously syncs with the server time, watching for drift while also listening for user manipulation.
3. Use push notifications and listen downstream for time-synchronization adjustments, again while listening for user manipulation.
4. Use NTP servers to synchronize the device's time.
I'm not entirely sure which would be the most optimal (assuming I have listed all the possible solutions). If there are other solutions I haven't thought of, please let me know.
P.S. If I do use a BroadcastReceiver to listen for date/time manipulation on the device, how would I even calculate the offset?
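One way to compute that offset is to anchor a trusted, server-provided time to SystemClock.elapsedRealtime(), which user clock changes do not affect (it does reset on reboot). A minimal sketch, with illustrative names:

    import android.content.BroadcastReceiver;
    import android.content.Context;
    import android.content.Intent;
    import android.content.IntentFilter;
    import android.os.SystemClock;

    // Anchors a trusted (server-provided) time to elapsedRealtime() and
    // recomputes the wall-clock offset whenever the system reports that the
    // user changed the date/time. Resets on reboot, so re-sync after boot.
    public class TrustedClock {
        private long anchorServerTimeMs;  // server UTC at the last sync
        private long anchorElapsedMs;     // elapsedRealtime() at that sync
        private long wallClockOffsetMs;   // trustedNow - System.currentTimeMillis()

        // Call once after fetching the current time from the backend.
        public void onServerSync(long serverUtcMs) {
            anchorServerTimeMs = serverUtcMs;
            anchorElapsedMs = SystemClock.elapsedRealtime();
            recomputeOffset();
        }

        // Trusted "now", independent of the device's wall clock.
        public long trustedNowMs() {
            return anchorServerTimeMs
                    + (SystemClock.elapsedRealtime() - anchorElapsedMs);
        }

        private void recomputeOffset() {
            wallClockOffsetMs = trustedNowMs() - System.currentTimeMillis();
        }

        public void register(Context context) {
            IntentFilter filter = new IntentFilter();
            filter.addAction(Intent.ACTION_TIME_CHANGED);     // clock was set
            filter.addAction(Intent.ACTION_TIMEZONE_CHANGED); // zone was set
            context.registerReceiver(new BroadcastReceiver() {
                @Override public void onReceive(Context c, Intent i) {
                    // The wall clock just jumped; elapsedRealtime() did not,
                    // so the trusted time stays valid and the offset updates.
                    recomputeOffset();
                }
            }, filter);
        }
    }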
It has been some time since I asked this question and there haven't been any elegant answers to the problem, so after some research and some trial and error I decided to take the NTP route. After some digging around, I found a nice library that does the entire thing for you.
It can be found here:
NTP TRUE TIME
Credits to these guys, who have made life a lot easier.
You sync with the NTP servers just once; from then on the library calculates the delta for you, giving accurate UTC regardless of the SystemClock time.
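For reference, a minimal sketch of the library's documented usage (assuming the instacart truetime-android artifact; initialization must run off the main thread, and the NTP host is just an example):

    import com.instacart.library.truetime.TrueTime;
    import java.io.IOException;
    import java.util.Date;

    // Initialize TrueTime once in the background, then read NTP-backed
    // time anywhere, regardless of what the device clock says.
    public class TrueTimeSetup {
        public static void init() {
            new Thread(new Runnable() {
                @Override public void run() {
                    try {
                        TrueTime.build()
                                .withNtpHost("time.google.com") // example host
                                .initialize();
                    } catch (IOException e) {
                        // No network yet; retry before trusting TrueTime.now()
                    }
                }
            }).start();
        }

        public static Date nowOrFallback() {
            // Guard against calling now() before the first successful sync.
            return TrueTime.isInitialized() ? TrueTime.now() : new Date();
        }
    }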
For time synchronization you could also fetch the time zone from the server; once you have the server's time zone you can derive the server's current time.
Disclaimer: I am a backend developer, and I have no idea of what happens inside an Android or iOS app, so please bear with me.
In a specific use case of our platform we need all client applications (Android and iOS) to keep an internal timer accurate and in sync with an external time source, so that they can accurately timestamp messages sent between one another. Is this possible at all (using, for example, an NTP client or a different / better technique)?
When the client connects to the server, it can fetch the server's reference time. Once it has obtained this, it can calculate and store the difference between the server's time and the device's time.
Then, when the client needs to do something based on time* on the handset, it takes that stored offset into account when scheduling a timer or doing whatever else it needs to do.
*On iOS you can't really do anything based on time unless the app is in the foreground; an exception is posting a local notification. It's not possible, for example, to schedule a timer while the app is in the background.
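A minimal sketch of that offset calculation, assuming a hypothetical endpoint that returns the server's epoch time in milliseconds as plain text; halving the round trip is a crude latency correction:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Fetch the server's time once and store its difference from the
    // device clock; apply the offset to every local reading afterwards.
    public final class ServerOffset {
        private ServerOffset() { }

        public static long fetchOffsetMs(String timeUrl) throws Exception {
            long before = System.currentTimeMillis();
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(timeUrl).openConnection();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                long serverMs = Long.parseLong(in.readLine().trim());
                long after = System.currentTimeMillis();
                long latency = (after - before) / 2; // assume a symmetric path
                return (serverMs + latency) - after; // server minus device
            } finally {
                conn.disconnect();
            }
        }
    }

    // Usage (hypothetical URL):
    //   long offset = ServerOffset.fetchOffsetMs("https://example.com/api/time");
    //   long serverNow = System.currentTimeMillis() + offset;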
As per Martin H, you could use an offset from the device's internal time. The client device will probably be within a second of the current time if the user has not manually set the clock (which does happen - I just read about a user who changed her date/time to tomorrow to get a reward in a game).
I have dealt with this in a time-based app by using the device time, with the client and server validating against server time whenever they communicate.
For example, when the client and server communicate, the server can validate the client's time against its own. If the client time is off by more than 'x' minutes, the server sends back an error message.
It will be difficult to keep all clients within a few milliseconds of each other. One idea is to use the server to coordinate messages (messages are not sent between devices directly, but to the server, which then forwards them). Then you can use the arrival time at the server as the basis for the message time.
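A server-side sketch of that validation, with an illustrative five-minute tolerance:

    import java.time.Duration;
    import java.time.Instant;

    // Rejects requests whose claimed client timestamp drifts more than a
    // tolerance from server time; the tolerance value is an arbitrary choice.
    public final class ClientTimeValidator {
        private static final Duration MAX_DRIFT = Duration.ofMinutes(5);

        private ClientTimeValidator() { }

        public static void validate(long clientEpochMs) {
            Duration drift = Duration.between(
                    Instant.ofEpochMilli(clientEpochMs), Instant.now()).abs();
            if (drift.compareTo(MAX_DRIFT) > 0) {
                throw new IllegalArgumentException("Client clock is off by "
                        + drift.toMinutes() + " minutes; please fix the device time.");
            }
        }
    }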
We have an Android (or iPhone) client we are developing. The client allows the user to send entries to a server, which we also develop. If the client has no data service (GPRS) at the moment the user sends an entry, it also supports saving the entry to an offline database and sending it to the server later.
One important aspect of the whole process is the accuracy of the timestamp at which the user sent the entry to the server (whether the entry is made in real time or sent later from the offline database).
When available, the client gets a GPS location and we can send the GPS timestamp to the server (or save it in the offline DB and send it later). However, if the user has turned off the GPS (and all other location services), the device will not have a GPS fix, and the server cannot accurately determine when an entry was made.
We cannot use the local device clock, as the user may change it to make entries appear at different times than they actually occurred (these entries are part of the user's salary, so he might have an interest in "fixing" them).
So basically I am searching for a way to determine, as best I can, the time an entry was made when I cannot trust the mobile's internal clock. The algorithm should support both entries sent in real time and entries sent from the offline DB. It should also handle cases where the user changes the device time, turns the device on/off, turns the GPS on/off while the application is running, and so on.
A few ideas I have thought of:
Although I cannot trust the mobile's clock, it can still serve as a stopwatch:
Have a class that loops until the application exits; the loop sleeps one second and increments an internal clock variable by one second. On every GPS location we get, the code resets the internal clock variable to the GPS time. This way I have an absolute clock that came from outside the device (from the GPS), and when the client sends an entry to the server we can use the internal clock as absolute time.
PROS: the user can not modify this clock as it is only updated when we get a location from the GPS
CONS: the application needs at least one GPS fix before the user can make any reliable entries
I can also take advantage of the fact that the server has an accurate clock. If the client told the server that an entry is 10 minutes old, the server could use its own time to compute the exact time the entry was made.
The biggest problem is how to know the entry's age. I thought about saving entries to the offline DB with an age of 0 and then increasing each entry's age every second, but if the app is closed and/or the device is off, this will not happen.
This is where I am currently stuck. Any ideas on how to solve this are more than welcome.
Thanks
Here's how I handle this issue on iPhone. When the app starts, I call my server and ask for the current GMT time (you could also call a public NTP server if you preferred). I then compare it to the system time. If it differs by more than X, I pop up a message saying: sorry, your system time is wrong, so you can't use the app until you fix it. I then monitor for the user changing the system time while the app is running; if they do, I run the comparison again (and pop up the error message if the time is off by more than X). This ensures that their system time is always correct (within some reasonable allowance), and you can trust [NSDate date]. However, this solution does require a valid network connection. If this solution works for you, I can post the sample code.
I think I am going to combine Jules's and Joel's answers into one solution that best meets my needs:
Since the user might change the clock when the mobile does not have GPRS, just detecting the time-change event will not help us, as we cannot validate the new time at that moment.
As Joel recommended, I will pull the time from my server when my application starts (at that point I must still have communication with the server, or else my application will not start). The time pulled from the server is saved along with the device's current upTime.
When the user wants to make an entry, I will calculate the current time as (Server Base Time + Current UpTime - Base UpTime). This way I have an independent source of time, regardless of the device's current clock.
This will definitely work on Android.
On iPhone we will try to use something from http://www.cocoadev.com/index.pl?FindingUptime to get the upTime.
Jules & Joel, thanks for your answers!
Look into android.os.SystemClock. Specifically, elapsedRealtime() returns a time since the phone was switched on, which is not affected if the clock is changed by the user.
You can correlate times across the phone being switched off by having code that runs when it is switched on and checks the real-time clock. As the clock can't be changed while the phone is off, I suspect you could use this to put together a system that will catch any simple attempts at cheating. (If the user roots the phone, all bets are off -- they could modify the behaviour of the APIs from under you.)
Running code every second will kill the phone's battery life. Most phones would be unlikely to last a day if you did this.
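Putting these pieces together, a minimal sketch of the (Server Base Time + Current UpTime - Base UpTime) scheme described above, using elapsedRealtime(); it needs no per-second loop, and an elapsedRealtime() reading smaller than the stored base reveals a reboot, forcing a re-sync:

    import android.os.SystemClock;

    // Anchors a server timestamp to elapsedRealtime(), which the user
    // cannot change; a reboot resets it, which we detect and treat as
    // "not trusted until the next server sync".
    public class UptimeAnchoredClock {
        private long serverBaseTimeMs = -1; // server UTC at the last sync
        private long baseUptimeMs = -1;     // elapsedRealtime() at that sync

        // Call whenever a server (or GPS) timestamp is available.
        public void sync(long trustedUtcMs) {
            serverBaseTimeMs = trustedUtcMs;
            baseUptimeMs = SystemClock.elapsedRealtime();
        }

        // Trusted entry timestamp, or -1 if a re-sync is required first.
        public long trustedTimeMs() {
            long uptime = SystemClock.elapsedRealtime();
            if (serverBaseTimeMs < 0 || uptime < baseUptimeMs) {
                return -1; // never synced, or the device rebooted
            }
            return serverBaseTimeMs + (uptime - baseUptimeMs);
        }
    }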