How do we refresh /.info/serverTimeOffset on device time change? - android

I am using /.info/serverTimeOffset to get the approximate server time from firebase.
Long clockSkew = null;
...
void registerForClockSkew() {
    DatabaseReference offsetRef =
            FirebaseDatabase.getInstance().getReference(".info/serverTimeOffset");
    offsetRef.addValueEventListener(new ValueEventListener() {
        @Override
        public void onDataChange(DataSnapshot snapshot) {
            double offset = snapshot.getValue(Double.class);
            clockSkew = (long) offset;
        }

        @Override
        public void onCancelled(DatabaseError error) {
            clockSkew = null;
        }
    });
}
However, as mentioned here - https://github.com/firebase/firebase-ios-sdk/issues/363 - the offset does not refresh when the device time changes. I have tried closing and reconnecting the Firebase instance to refresh the skew value, but it does not seem to work. In my code I simply run the following when the device time changes:
FirebaseDatabase.getInstance().goOffline();
FirebaseDatabase.getInstance().goOnline();
but this does not trigger the value-event-listener callback. What am I missing here? Is it because goOffline and goOnline are called back to back that they have no effect?
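One thing I have not ruled out (just a guess on my part, not a confirmed fix) is whether the back-to-back calls give the SDK no time to actually drop the connection. Something like the following, with an arbitrary delay, would test that:
FirebaseDatabase.getInstance().goOffline();
new Handler(Looper.getMainLooper()).postDelayed(new Runnable() {
    @Override
    public void run() {
        // Guess: reconnect only after the SDK has had time to drop the connection.
        // The 2-second delay is arbitrary.
        FirebaseDatabase.getInstance().goOnline();
    }
}, 2000);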

The only way I have found to handle the situation is to keep the skew relative to elapsedRealtime, which stays constant when the device time changes.
long clockSkewWRTElapsed;
....
@Override
public void onDataChange(DataSnapshot snapshot) {
    double offset = snapshot.getValue(Double.class);
    clockSkewWRTElapsed = ((long) offset) + System.currentTimeMillis() - SystemClock.elapsedRealtime();
}
Whenever needed, one can get the server time by adding SystemClock.elapsedRealtime() to clockSkewWRTElapsed.
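For reference, here is a minimal sketch of turning that skew back into an approximate server time (the method name is mine):
// Approximate server time = elapsedRealtime() + skew captured in onDataChange.
// SystemClock.elapsedRealtime() is unaffected by manual changes to the wall clock.
private long approximateServerTimeMillis() {
    return SystemClock.elapsedRealtime() + clockSkewWRTElapsed;
}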

Related

Fetching data for the second time increments my long variable by 1, making it impossible to reload data more than once

I have a few data fetching methods that call each other in order to fetch data according to the previous data fetched.
My issue is that I am holding a long variable called 'allContestsCounterIndex' which is incremented in only two places. For some reason, after fetching the data for the first time, clicking again to fetch a second time starts the counter at 1 instead of 0, which leaves the logic stuck.
Here are the functions I am using:
private void updateContestCallCounter(long childrenCount) {
    allContestsCounterIndex++;
    if (allContestsCounterIndex == (childrenCount - 1)) { // on the second run this variable starts at 1 instead of 0
        fetchWinnersFromOurValidContests();
    }
}

private void fetchAllEndedStatusContests(DataSnapshot dataSnapshot) {
    if (!dataSnapshot.exists()) {
        // For some reason, we've reached this point with no dataSnapshot
        Timber.d("dataSnapshot doesn't exist");
        return;
    }
    long timeInSeconds = (System.currentTimeMillis() / 1000);
    mEndedContests = new ArrayList<>();
    long childrenCount = dataSnapshot.getChildrenCount();
    for (DataSnapshot status : dataSnapshot.getChildren()) {
        // fetch the status key for deeper queries inside the db
        final String key = status.getKey();
        checkIfContestEnded(timeInSeconds, key, new OnContestStateReceived() {
            @Override
            public void contestHasEnded() {
                mEndedContests.add(key);
                updateContestCallCounter(childrenCount);
            }

            @Override
            public void contestStillScheduled() {
                updateContestCallCounter(childrenCount);
            }
        });
    }
}
You should set all of your counters to 0 once you have finished re-fetching the data.
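A minimal sketch of that, assuming the re-fetch is kicked off from a method like the hypothetical startFetchingContests() below:
private void startFetchingContests() {
    // Reset the counter (and any per-run state) before every re-fetch;
    // otherwise the previous run's value leaks into the next one.
    allContestsCounterIndex = 0;
    mEndedContests = new ArrayList<>();
    // ...then trigger the query whose callback calls fetchAllEndedStatusContests(dataSnapshot)
}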

How to implement pagination for a Firestore query on Android [duplicate]

This question already has answers here:
How to paginate Firestore with Android?
(3 answers)
Closed 4 years ago.
While working through the Firebase Firestore example from the GitHub FriendlyEats app,
I thought I would implement pagination, limiting the query to 10 documents:
private static final int LIMIT = 10;
In the Firestore example app, mAdapter loads the data as below:
mFirestore = FirebaseFirestore.getInstance();
// Get ${LIMIT} restaurants
mQuery = mFirestore.collection("restaurants")
        .orderBy("avgRating", Query.Direction.DESCENDING)
        .limit(LIMIT);
// RecyclerView
mAdapter = new RestaurantAdapter(mQuery, this) {
    @Override
    protected void onDataChanged() {
        // Show/hide content if the query returns empty.
        if (getItemCount() == 0) {
            mRestaurantsRecycler.setVisibility(View.GONE);
            mEmptyView.setVisibility(View.VISIBLE);
        } else {
            mRestaurantsRecycler.setVisibility(View.VISIBLE);
            mEmptyView.setVisibility(View.GONE);
        }
    }

    @Override
    protected void onError(FirebaseFirestoreException e) {
        // Show a snackbar on errors
        Snackbar.make(findViewById(android.R.id.content),
                "Error: check logs for info.", Snackbar.LENGTH_LONG).show();
    }
};
mRestaurantsRecycler.setLayoutManager(new LinearLayoutManager(this));
mRestaurantsRecycler.setAdapter(mAdapter);
// Filter Dialog
mFilterDialog = new FilterDialogFragment();
}
and
@Override
public void onStart() {
    super.onStart();
    // Start sign in if necessary
    if (shouldStartSignIn()) {
        startSignIn();
        return;
    }
    // Apply filters
    onFilter(mViewModel.getFilters());
    // Start listening for Firestore updates
    if (mAdapter != null) {
        mAdapter.startListening();
    }
}
The Firestore documentation says this about pagination:
// Construct query for first 25 cities, ordered by population
Query first = db.collection("cities")
        .orderBy("population")
        .limit(25);
first.get()
        .addOnSuccessListener(new OnSuccessListener<QuerySnapshot>() {
            @Override
            public void onSuccess(QuerySnapshot documentSnapshots) {
                // ...
                // Get the last visible document
                DocumentSnapshot lastVisible =
                        documentSnapshots.getDocuments()
                                .get(documentSnapshots.size() - 1);
                // Construct a new query starting at this document,
                // get the next 25 cities.
                Query next = db.collection("cities")
                        .orderBy("population")
                        .startAfter(lastVisible)
                        .limit(25);
                // Use the query for pagination
                // ...
            }
        });
Combining the code above, how should I implement pagination so that more documents (beyond the first 10) are loaded when I scroll to the bottom of the RecyclerView? I get stuck at the placeholder in the docs snippet:
// Use the query for pagination
// ...
Update: I am working from the Firestore docs on paginating queries, and after looking at a possible duplicate of another question I still could not get it working.
Thank you
Here is the workaround I found; it is not a great one, so if there is a better way please post it. The idea is to save the RecyclerView state before loading more items and restore it after increasing the limit.
private static final int LIMIT = 10;
Changed to
private int LIMIT = 10;
When the RecyclerView is scrolled to the bottom:
mRestaurantsRecycler.addOnScrollListener(new RecyclerView.OnScrollListener() {
    @Override
    public void onScrolled(RecyclerView recyclerView, int dx, int dy) {
        super.onScrolled(recyclerView, dx, dy);
        final int mLastVisibleItemPosition = mManager.findLastVisibleItemPosition();
        if (mLastVisibleItemPosition == (LIMIT - 1)) {
            LIMIT = LIMIT * 2;
            showSpotDialog();
            // save RecyclerView state
            mBundleRecyclerViewState = new Bundle();
            Parcelable listState = mRestaurantsRecycler.getLayoutManager().onSaveInstanceState();
            mBundleRecyclerViewState.putParcelable(KEY_RECYCLER_STATE, listState);
            loadMore(query);
            new Handler().postDelayed(new Runnable() {
                @Override
                public void run() {
                    // restore RecyclerView state
                    if (mBundleRecyclerViewState != null) {
                        Parcelable listState = mBundleRecyclerViewState.getParcelable(KEY_RECYCLER_STATE);
                        mRestaurantsRecycler.getLayoutManager().onRestoreInstanceState(listState);
                    }
                    hideSpotDialog();
                }
            }, 500);
        }
    }
});
It looks a bit odd when items are loaded after the limit increases, but it will have to do for now...
And yes, I am still looking for a way to load more items without these flaws.
I have extended the FirestoreAdapter as follows:
Keep track of DocumentSnapshot identifiers.
private Set<String> mIdentifier = new HashSet<>();
Add a new public method for pagination. Since I expect the rest of the query to remain the same, the given query does not need to be changed:
/**
 * Extends the query to load even more data rows. This method will do nothing if the query has
 * not yet been set.
 * @param limit the new limit
 */
public void paginate(long limit) {
    if (mQuery != null) {
        if (mRegistration != null) {
            mRegistration.remove();
            mRegistration = null;
        }
        // Expect the query to stay the same, only the limit will change
        mQuery = mQuery.limit(limit);
        startListening();
    }
}
Clear the identifiers in the setQuery(Query) method by calling mIdentifier.clear().
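A rough sketch of that change, assuming a setQuery(Query) along the lines of the sample adapter's; only the mIdentifier.clear() line is the addition:
public void setQuery(Query query) {
    // Stop listening to the old query
    if (mRegistration != null) {
        mRegistration.remove();
        mRegistration = null;
    }
    // Clear existing data and the pagination identifiers
    mSnapshots.clear();
    mIdentifier.clear();
    notifyDataSetChanged();
    // Listen to the new query
    mQuery = query;
    startListening();
}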
Adapt onDocumentAdded(DocumentChange) and onDocumentRemoved(DocumentChange) as follows:
protected void onDocumentAdded(DocumentChange change) {
    if (!mIdentifier.contains(change.getDocument().getId())) {
        mSnapshots.add(change.getNewIndex(), change.getDocument());
        mIdentifier.add(change.getDocument().getId());
        notifyItemInserted(change.getNewIndex());
    }
}

protected void onDocumentRemoved(DocumentChange change) {
    mSnapshots.remove(change.getOldIndex());
    mIdentifier.remove(change.getDocument().getId());
    notifyItemRemoved(change.getOldIndex());
}
For the scroll listener I stick to this guide: Endless Scrolling with AdapterViews and RecyclerView.
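For illustration, here is a rough sketch of how the scroll listener from the question could drive paginate() instead of rebuilding the query (mManager, mAdapter, and LIMIT are the question's own fields; the exact trigger condition is an assumption):
mRestaurantsRecycler.addOnScrollListener(new RecyclerView.OnScrollListener() {
    @Override
    public void onScrolled(RecyclerView recyclerView, int dx, int dy) {
        super.onScrolled(recyclerView, dx, dy);
        int lastVisible = mManager.findLastVisibleItemPosition();
        // When the last loaded item becomes visible, double the limit and
        // let the adapter re-attach its listener with the larger query.
        if (mAdapter.getItemCount() > 0 && lastVisible >= mAdapter.getItemCount() - 1) {
            LIMIT = LIMIT * 2;
            mAdapter.paginate(LIMIT);
        }
    }
});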

Paging library - populate from cache while requesting from network

I have to load a cached version of the data from the database and simultaneously request fresh data from the server, and I want to do this on a per-page basis.
So, for example, for the first page I want to show the cached first-page data from the database while requesting fresh data for that page only.
I want to achieve this using Paging Library.
I tried creating a custom data source that lets me intercept the page-load request, which I then use to make a network call with the required page number and limit while returning the cached version from the db in the meantime. The problem is that after getting fresh data from the network I update the database, but those updates are not reflected.
(I believe the whole table is observed for modifications using an InvalidationTracker, and the data source is invalidated whenever the table is invalidated. I added that tracker to my data source too, but it still isn't working; I worked out the InvalidationTracker detail by temporarily declaring LivePagedListProvider getJobs() in JobDao and checking the generated implementation.)
Code:
public class JobListDataSource<T> extends TiledDataSource<T> {
    private final JobsRepository mJobsRepository;
    private final InvalidationTracker.Observer mObserver;
    String query = "";

    public JobListDataSource(JobsRepository jobsRepository) {
        mJobsRepository = jobsRepository;
        mObserver = new InvalidationTracker.Observer(JobEntity.TABLE_NAME) {
            @Override
            public void onInvalidated(@NonNull Set<String> tables) {
                invalidate();
            }
        };
        jobsRepository.addInvalidationTracker(mObserver);
    }

    @Override
    public int countItems() {
        return DataSource.COUNT_UNDEFINED;
    }

    @Override
    public List<T> loadRange(int startPosition, int count) {
        return (List<T>) mJobsRepository.getJobs(query, startPosition, count);
    }

    public void setQuery(String query) {
        this.query = query;
    }
}
Jobs repository functions:
public List<JobEntity> getJobs(String query, int startPosition, int count) {
    if (!isJobListInit) {
        JobList jobList = mApiService.getOpenJobList(
                mRequestJobList.setPageNo(startPosition / count + 1)
                        .setMaxResults(count)
                        .setSearchKeyword(query)
        ).blockingSingle();
        mJobDao.insert(jobList.getJobsData());
    }
    return mJobDao.getJobs(startPosition, count);
}

public void addInvalidationTracker(InvalidationTracker.Observer observer) {
    mAppDatabase.getInvalidationTracker().addObserver(observer);
}
I figured out why it wasn't working; the mistake was on my end: I was passing the wrong parameters to the getJobs method of JobDao in JobsRepository.
The getJobs method of JobDao goes as follows:
@Query("SELECT * FROM jobs ORDER BY jobID ASC LIMIT :limit OFFSET :offset")
List<JobEntity> getJobs(int limit, int offset);
And the call getJobs() in JobsRepository goes as follows:
return mJobDao.getJobs(startPosition, count);
So the first parameter is the limit and the second is the offset, but I was passing them the other way around.
Now it works like a charm!
Furthermore, I made a change to getJobs() in JobsRepository:
First get the data from the db; if it is available, return it and make an async network request if the data is stale.
If no data is available in the db, make a synchronous network call, parse the response, save it to the db, and then read the latest data back from the db and return it.
So the function goes like this:
// you can even refactor this code so that all the network related stuff is in one class and just call that method
public List<JobListItemEntity> getJobs(String query, int startPosition, int count) {
    Observable<JobList> jobListObservable = mApiService.getOpenJobList(
            mRequestJobList.setPageNo(startPosition / count + 1)
                    .setMaxResults(count)
                    .setSearchKeyword(query));
    List<JobListItemEntity> jobs = mJobDao.getJobsLimitOffset(count, startPosition);
    // no data in db, make a synchronous call to network to get the data
    if (jobs.size() == 0) {
        JobList jobList = jobListObservable.blockingSingle();
        updateJobList(jobList, startPosition, false);
    } else if (shouldFetchJobList(jobs)) {
        // data available in db, so show a cached version and make async network call to update data as this data is no longer fresh
        jobListObservable.subscribe(new Observer<JobList>() {
            @Override
            public void onSubscribe(Disposable d) {
            }

            @Override
            public void onNext(JobList jobList) {
                updateJobList(jobList, startPosition, true);
            }

            @Override
            public void onError(Throwable e) {
                Timber.e(e);
            }

            @Override
            public void onComplete() {
            }
        });
    }
    return mJobDao.getJobsLimitOffset(count, startPosition);
}
updateJobList() code:
private void updateJobList(JobList jobList, int startPosition, boolean performInvalidation) {
    JobListItemEntity[] jobs = jobList.getJobsData();
    Date currentDate = Calendar.getInstance().getTime();
    // tracks when this item was inserted in db, used in calculating whether data is stale
    for (int i = 0; i < jobs.length; i++) {
        jobs[i].insertedAt = currentDate;
    }
    mJobDao.insert(jobs);
    if (performInvalidation) {
        mJobListDataSource.invalidate();
    }
}
(I also renamed getJobs() in JobDao to getJobsLimitOffset(), as it is more readable and matches the way the Paging library generates method names.)
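For completeness, the renamed DAO method looks like this (same SQL as before; only the name and the entity type differ):
@Query("SELECT * FROM jobs ORDER BY jobID ASC LIMIT :limit OFFSET :offset")
List<JobListItemEntity> getJobsLimitOffset(int limit, int offset);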

Running Firebase callback on another thread is blocking UI

I know this question might seem like a duplicate; I've read about ten other threads on the same thing, but I cannot find the problem.
I have this method in my activity:
public void saveResponse(final Response studentResponse, final Content content) {
    fb.getReference("...").addListenerForSingleValueEvent(new ValueEventListener() {
        @Override
        public void onDataChange(final DataSnapshot dataSnapshot) {
            new Thread(new Runnable() {
                @Override
                public void run() {
                    Map<String, Object> responseMap = new HashMap<>();
                    responseMap.put("end_date", studentResponse.end_date);
                    responseMap.put("start_date", studentResponse.start_date);
                    responseMap.put("time", studentResponse.time);
                    responseMap.put("points", studentResponse.points);
                    responseMap.put("max_points", studentResponse.max_points);
                    responseMap.put("selected_options", studentResponse.selected_options);
                    if (!TextUtils.isEmpty(studentResponse.free_text))
                        responseMap.put("free_text", studentResponse.free_text);
                    DataSnapshot contentRef = dataSnapshot.child("/sections/" + currentSection + "/sections/" + currentSubsection + "/contents/" + content.id);
                    final int oldPoints = contentRef.hasChild("points") ? contentRef.child("points").getValue(int.class) : 0;
                    contentRef.getRef().setValue(responseMap);
                    contentRef.getRef().setPriority(ServerValue.TIMESTAMP);
                    DataSnapshot subSectionRef = dataSnapshot.child("/sections/" + currentSection + "/sections/" + currentSubsection);
                    long subSectionPoints = (subSectionRef.hasChild("points") ? subSectionRef.child("points").getValue(long.class) : 0) + studentResponse.points - oldPoints;
                    subSectionRef.child("points").getRef().setValue(subSectionPoints);
                    int indexOf = currentContents.indexOf(content) + 1;
                    if (indexOf > 0 && indexOf < currentContents.size()) {
                        CourseContent content = currentContents.get(indexOf);
                        subSectionRef.child("currentPosition").getRef().setValue(content.order);
                    }
                    DataSnapshot sectionRef = dataSnapshot.child("/sections/" + currentSection);
                    long sectionPoints = (sectionRef.hasChild("points") ? sectionRef.child("points").getValue(long.class) : 0) + studentResponse.points - oldPoints;
                    sectionRef.child("points").getRef().setValue(sectionPoints);
                    long coursePoints = (dataSnapshot.hasChild("points") ? dataSnapshot.child("points").getValue(long.class) : 0) + studentResponse.points - oldPoints;
                    dataSnapshot.child("points").getRef().setValue(coursePoints);
                    dataSnapshot.getRef().setPriority(MAX_SAFE_INTEGER - coursePoints);
                    int completed = 0;
                    for (DataSnapshot sect : dataSnapshot.child("sections").getChildren()) {
                        for (DataSnapshot subSect : sect.child("sections").getChildren()) {
                            int currPos = subSect.hasChild("currentPosition") ? subSect.child("currentPosition").getValue(int.class) : 0;
                            completed += currPos;
                        }
                    }
                    double progress = totalContents > 0 ? (double) completed / (double) totalContents : 0;
                    dataSnapshot.child("progress").getRef().setValue(progress);
                }
            }).start();
        }
        ...
    });
}
In a click handler I call this method and then change the fragment (with custom animations).
The thing is, the fragment transition is not smooth; it freezes a little. If I comment out everything inside the Runnable, it runs smoothly. I've also tried an AsyncTask and the same thing happens.
Inside the Runnable, I'm just querying the dataSnapshot and its children and setting some values (dataSnapshot.child("item").getRef().setValue(x)).
Another strange thing is that if I put a breakpoint inside run(), it also runs smoothly.
Every call to onDataChange creates a new thread, and given the strange detail that "if I put a breakpoint inside run(), it also works smooth",
maybe you should check whether too many threads are being created.
I think the problem is with the logic of your method.
The onDataChange() listener is active and responds to any change of the data. If you change data inside onDataChange(), it will be called again each time you set values with dataSnapshot.child("item").getRef().setValue(x), so it is similar to a "recursive" call with no exit.
To fix this, obtain the key of what you want to change in the click event and just use:
mDatabase = FirebaseDatabase.getInstance().getReference();
mDatabase.child("key").child("item").setValue(x);
Check https://firebase.google.com/docs/database/android/read-and-write for more info
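If several locations have to change together, one option (not from the code above, just a common Realtime Database pattern) is a single multi-path update via updateChildren(), which writes all the values in one call instead of a setValue() per node. A sketch, where root is the same reference the question attaches its listener to and the paths/values are placeholders:
DatabaseReference root = FirebaseDatabase.getInstance().getReference("...");
Map<String, Object> updates = new HashMap<>();
// Placeholder paths mirroring the question's structure; values computed beforehand
updates.put("sections/" + currentSection + "/points", sectionPoints);
updates.put("sections/" + currentSection + "/sections/" + currentSubsection + "/points", subSectionPoints);
updates.put("points", coursePoints);
root.updateChildren(updates);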
A spawned thread inherits the priority of the thread that created it. Try lowering the priority of your worker thread to prevent it from competing with the UI thread:
@Override
public void onDataChange(final DataSnapshot dataSnapshot) {
    new Thread(new Runnable() {
        @Override
        public void run() {
            Process.setThreadPriority(Process.THREAD_PRIORITY_BACKGROUND);
            ...
        }
    }).start();

TransferUtility does not trigger onProgressChanged

I am working on a project where I want to upload files after developer authentication is complete. I am using AWS Cognito for authentication. The problem is that sometimes TransferUtility does not trigger onProgressChanged, although the file still gets uploaded. I want to show a progress bar in the UI for every upload. Sometimes it works and sometimes it does not.
Here is how I am uploading files:
public void upload() {
    ClientConfiguration configuration = new ClientConfiguration();
    configuration.setProtocol(Protocol.HTTP);
    configuration.setSocketTimeout(5 * 10000);
    configuration.setConnectionTimeout(5 * 10000);
    configuration.setMaxErrorRetry(3);
    if (sS3Client == null) {
        sS3Client = new AmazonS3Client(credentials, configuration);
    }
    sTransferUtility = new TransferUtility(sS3Client, this.ctx);
    observer = sTransferUtility.upload("bucketer", "Filename", "file");
    observer.setTransferListener(new UploadListener(progress));
}

private class UploadListener implements TransferListener {
    ProgressBar progressBar;

    public UploadListener(ProgressBar progress) {
        this.progressBar = progress;
    }

    @Override
    public void onStateChanged(int i, TransferState transferState) {
        Log.d("STATUS CHANGED:".concat(String.valueOf(i)), transferState.toString());
        switch (transferState.toString()) {
            case "IN_PROGRESS": {
                Log.d("IN_PROGRESS", "IN_PROGRESS");
            }
            break;
            case "COMPLETED": {
                Log.d("COMPLETED COMPLETED", "COMPLETED");
            }
            break;
        }
    }

    @Override
    public void onProgressChanged(int i, long l, long l1) {
        updator();
        this.progressBar.setProgress(transferprogres);
    }

    @Override
    public void onError(int i, Exception e) {
        Log.d("UPLOADING ERROR:", String.valueOf(e));
    }
}

public void updator() {
    transferprogres = (int) ((double) observer.getBytesTransferred() * 100 / observer.getBytesTotal());
}
The code above is part of a larger project; comment if you need more details.
Why is the behavior so inconsistent?
See Aws S3 TransferService Upload failing without errors and https://github.com/aws/aws-sdk-android/issues/101. In short, v2.2.12 requires the user to manage the lifecycle of transfer listeners, because TransferUtility keeps only weak references to them. You can make the listener or the observer a class field to prevent it from being garbage collected. We are tweaking the listeners in future releases. Sorry for the trouble and please stay tuned.
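In practice that means keeping the observer and the listener in fields that live as long as the upload, for example (a sketch; the field names are mine):
// Keep strong references so the weakly-referenced listener is not
// garbage collected mid-upload (as described for SDK v2.2.12 above).
private TransferObserver mObserver;
private TransferListener mListener;

public void upload(File file) {
    // ...client and TransferUtility setup as in the question...
    mListener = new UploadListener(progress);
    mObserver = sTransferUtility.upload("bucketer", "Filename", file);
    mObserver.setTransferListener(mListener);
}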
