How many times will the onUpdate function trigger on a batch update? - android

I want to update two fields in a document in a collection called recipes, and I used a batch update to do it:
let recipeRef = db.collection('recipes').doc(`${recipeId}`);
let batch = db.batch();
batch.update(recipeRef, {ratingsCount : admin.firestore.FieldValue.increment(1)});
batch.update(recipeRef, {totalRating : admin.firestore.FieldValue.increment(snap.data().rating)})
return batch.commit();
And I have a trigger function on the recipes collection like this:
exports.RecipeUpdated = functions.firestore
.document('recipes/{recipeId}')
.onUpdate((change, context) =>
{
//code
});
My question is: since there are two updates in the above batch, will this trigger the onUpdate function twice, or, because the batch commits atomically, will the trigger be called only once? I want the trigger to be called only one time.

As @DougStevenson mentioned in his comment, you could simply run the code and observe the behaviour. But note that even though you are using the same recipeRef, you are creating two different update operations in your batch. In this case, the result you'll get is that onUpdate() will fire twice.
If you want the trigger to be called only once, then don't create two separate updates; create a single one. The update() function allows you to pass multiple properties to be updated in one call. If you are working with objects, simply get a reference to the document, read it, make the changes, and write it back.
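For example, here is a minimal sketch of combining both increments into a single update() call (reusing the recipeRef and snap from the question), so only one write is committed:
let batch = db.batch();
// A single update() that touches both fields produces a single document write
batch.update(recipeRef, {
    ratingsCount: admin.firestore.FieldValue.increment(1),
    totalRating: admin.firestore.FieldValue.increment(snap.data().rating)
});
return batch.commit();
Since only one document is involved here, you could also drop the batch entirely and call recipeRef.update(...) directly with the same object.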

Related

How to execute firebase cloud function one minute after an event occurred?

I am using the Firebase Realtime Database for my Android app. I want to execute a Firebase function one minute after a database write occurs (not every minute like a cron job). Is it possible? If so, how can I do that?
Cloud Functions does not offer delayed invocations, but you can integrate with Cloud Tasks to arrange for a callback to another function after some delay.
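A rough sketch of that approach, assuming the callback target is an HTTP-triggered function and a Cloud Tasks queue already exists (the project, region, queue name, and URL below are placeholders):
const {CloudTasksClient} = require('@google-cloud/tasks');
const tasksClient = new CloudTasksClient();

// Enqueue an HTTP task that Cloud Tasks will dispatch roughly one minute from now
async function scheduleDelayedCallback(payload) {
    const parent = tasksClient.queuePath('my-project', 'us-central1', 'my-queue');
    await tasksClient.createTask({
        parent,
        task: {
            httpRequest: {
                httpMethod: 'POST',
                url: 'https://us-central1-my-project.cloudfunctions.net/delayedHandler',
                headers: {'Content-Type': 'application/json'},
                body: Buffer.from(JSON.stringify(payload)).toString('base64'),
            },
            scheduleTime: {seconds: Math.floor(Date.now() / 1000) + 60},
        },
    });
}
You would call scheduleDelayedCallback from the database trigger, and delayedHandler would be a separate HTTP function that performs the delayed work.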
This is a workaround. If you want the delay to be a fairly long amount of time, pay attention to the Cloud Functions timeout option and increase it if needed; by default a function times out 60 seconds after it first begins executing.
export const yourFunctionName = functions
    .runWith({ timeoutSeconds: 120 }) // set the timeout in seconds (the type is number)
    .https.onRequest(async (req, res) => {
        // ...
    });
You can then wrap everything you want to be delayed inside the function in a setTimeout call (note that the delay is given in milliseconds):
setTimeout(() => {
    // the code inside the function that you want to be delayed
}, delayInMilliseconds);
One simple way would be to calculate the time one minute from now and add it to a database collection.
Then have Cloud Scheduler trigger a query for all items due in that minute, process those items with your function, and delete or update each item in the database.
You could also use this collection to make sure your process completed, by updating the item. If an item didn't get marked as completed, you could have another scheduled query rerun the process.
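A minimal sketch of that idea, assuming the bookkeeping lives in a Cloud Firestore collection named scheduledTasks whose documents store a runAt timestamp and a done flag (all of these names are made up for illustration):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Runs every minute and processes every task whose runAt time has passed
exports.processDueTasks = functions.pubsub.schedule('every 1 minutes').onRun(async () => {
    const now = admin.firestore.Timestamp.now();
    const due = await admin.firestore().collection('scheduledTasks')
        .where('done', '==', false)
        .where('runAt', '<=', now)
        .get();

    for (const doc of due.docs) {
        // ... perform the delayed work for this item here ...
        await doc.ref.update({ done: true }); // mark as completed so it is not processed again
    }
});
Note that a Firestore query combining an equality filter and a range filter like this needs a composite index.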

Why is addSnapshotListener called twice on a Firestore CollectionReference? [duplicate]

My firestore onSnapshot() function is being called twice.
let user = firebase.firestore().collection('users').doc(userID).onSnapshot
({
next: (documentSnapshot: firebase.firestore.DocumentSnapshot) =>
{
this.userArray.push(documentSnapshot as User);
console.log(documentSnapshot);
//here
},
error: (firestoreError: firebase.firestore.FirestoreError) =>
{
console.log(firestoreError);
//here
}
});
I have also tried detaching the listener as described in https://firebase.google.com/docs/firestore/query-data/listen#detach_a_listener by calling user() at the //here comment, but to no avail.
How can I modify this so that the callback only executes one time, i.e. pushes only one user object each time instead of two?
I don't know if this is related to your question, but if one is using
firebase.firestore.FieldValue.serverTimestamp()
to give a document a timestamp, then onSnapshot will fire twice. This seems to be because when you add a new document to your database, onSnapshot fires immediately, but the serverTimestamp has not been applied yet. After a few milliseconds serverTimestamp is applied and updates your document, so onSnapshot fires again.
I would like to add a small delay before onSnapshot fires (say 0.5 s or so), but I couldn't find a way to do this.
You could also create a server-side function for the onCreate event; I believe that would solve your problem. Maybe your userArray.push action would be more suitable to execute on the server side.
Update: To learn more about the behavior of serverTimestamp() and why it triggers the listener twice, read this article: The secrets of Firestore’s FieldValue.serverTimestamp() — REVEALED!. Also, the official documentation states:
When you perform a write, your listeners will be notified with the new data before the data is sent to the backend.
In the article there are a couple of suggested solutions, one of which is to use the metadata property of the snapshot to find whether the Boolean value of metadata.hasPendingWrites is true (which tells you that the snapshot you’re looking at hasn’t been written to the server yet) or false.
For example, in your case you can check whether hasPendingWrites is false and then push the object:
if ( !documentSnapshot.metadata.hasPendingWrites ){
// This code will only execute once the data has been written to the server
this.userArray.push(documentSnapshot as User);
console.log(documentSnapshot);
}
In a more generic example, the code will look like this:
firestore.collection("MyCollection")
.onSnapshot( snapshot => {
if ( snapshot.metadata.hasPendingWrites ){
// Local changes have not yet been written to the backend
} else {
// Changes have been written to the backend
}
});
Another useful approach, found in the documentation is the following:
If you just want to know when your write has completed, you can listen to the completion callback rather than using hasPendingWrites. In JavaScript, use the Promise returned from your write operation by attaching a .then() callback.
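For instance, here is a minimal sketch of relying on the returned Promise instead of the snapshot metadata (the document fields are just placeholders):
firebase.firestore().collection('users').doc(userID)
    .set({ name: 'Ada', createdAt: firebase.firestore.FieldValue.serverTimestamp() })
    .then(() => {
        // At this point the write has been committed on the backend
        console.log('Write completed');
    })
    .catch(error => console.error('Write failed', error));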
I hope these resources and the various approaches will help anyone trying to figure out a solution.
REFERENCES:
Events for local changes
The hasPendingWrites metadata property
Snapshot Listen Options
If you only need a one-time response, use the .get() method, which returns a promise.
firebase.firestore().collection('users').doc(userID).get().then(snap => {
    this.userArray = [...this.userArray, snap.data() as User];
});
However, I suggest using AngularFire (totally biased since I maintain the library). It makes handling common Angular + Firebase tasks much easier.

List of One-to-Many Entities shows temporary entity from another Activity

I have two activities. The first activity shows a list of notes. Notes themselves are lists.
I use Android Architecture Components: ViewModel, LiveData; with Repository, Room, Dao, etc.
So I made a getAllNotes() method in the Dao, Repository and ViewModel, like in the Google sample apps. In the onCreate method of the first activity I call observe and set the content of the RecyclerView adapter. And it works fine: it shows the list with Note titles.
Like that:
override fun onCreate(savedInstanceState: Bundle?) {
    // some code
    viewModel = obtainViewModel()
    viewModel.getAllNotes().observe(this, Observer<List<Notes>> { notes ->
        recView.setNote(notes)
    })
}
Then I have a button that starts a new Activity to create a new Note. That note contains a list of Lines, which for now contain only a string and a foreign key.
data class Line(
    var id: Long? = null,
    var note_id: Long? = null,
    var payload: String? = null
)
Note and Line have a one-to-many relation; they are connected by the id of the Note and the foreign key note_id in the Line.
(I'm not including all of the code here; it works, trust me.)
The problem is that to insert Lines into the database I first need to insert the parent Note, and I do that, and it works almost OK too. But the LiveData from getAllNotes() in the first Activity gets notified by this insertion. So if the user then decides to delete all the lines and go back to the first activity, even though I delete the temporary Note entity from the database, the list in the first Activity shows it for a moment, because the deletion happens in the background with a small delay.
What I see as possible solutions:
1) Unsubscribe the observers from the LiveData. I tried to do this in the onStop method, but it is called after the onCreate method of the second activity, where the entity is created, so the LiveData has already been notified and the observers are removed only after the temporary Note has been passed into the list.
2) Not use Room/SQLite as a cache, since this Note and its Lines are not guaranteed to stay and shouldn't be shown or inserted into a table. I could keep it all in properties of the ViewModel (i.e. in memory), but I see a lot of overhead in carrying these entities through screen rotation, minimizing the app, and all the rest of the state saving and restoring.
3) Create two additional entities like CachedNote and CachedLine with corresponding tables, work with them until I decide to persist the work, then insert it into the original tables and show it.
4) Add a property like "visible" to the Note entity and add this parameter to the query, so the Note is only shown once I decide to persist the work. But then there could be a lot of "updateNoteWithLines" calls everywhere.
What should I do? I couldn't google anything useful.
I know this is a "what's the best way" kind of question, forgive me.
You can try to call observe in onResume and then call removeObserver in onPause; that way the Activity will not be updated while it is not in the foreground. Please look at the example here.

What triggers LiveData onChanged()?

I'm using Room and in the Dao I have this method:
LiveData<List<Books>> getAllBooks();
In MainActivity I have subscribed to that method from the ViewModel. Changes to the data trigger the onChanged() callback:
viewModel.getAllBooks()
.observe(this, books -> {
Log.d(TAG, "onChanged()");
booksListAdapter.setData(new ArrayList<>(books));
});
What I would like to know is: what constitutes an update? When the app first starts I do 100 insertions; each of them changes the database, but onChanged() is not invoked 100 times. Last time I checked, it called onChanged() once at startup (which I think it always does) and then two more times.
Can I have control over this? For example if I know I will be doing 100 insertions perhaps it would be better if I only got the callback at the end of the insertions.
You don't have control over that. What you can do is use MediatorLiveData and post the value after all insertions. Whenever you update, delete or insert, Room knows that there has been a change but doesn't know exactly what has changed, so it just re-queries and sends the results to the observing LiveData.
Check this blog, mainly section 7, "Avoid false positive notifications for observable queries". The author gives a pretty good example of MediatorLiveData which is similar to what you are looking for.

Combine 2 observables and get output from the one which completes first

I have a subscription that waits for a push notification and another one that polls the server for a response. I want to start both observables together and return the data from the one which finishes first. What would be the right operator to use here?
Since you want the data of the first one to finish, you have to hold the data somewhere until the terminal event: collect each source into its own list and use amb, which picks the source that signals an event (the collected list) first. Then you can unroll the list back into individual items.
Observable<A> source1 = ...
Observable<A> source2 = ...
Observable.amb(source1.toList(), source2.toList())
.flatMapIterable(list -> list)
.subscribe(...);
The operator you are looking for is first. Of course, you'll have to merge the Observables first (by using merge, or probably better mergeDelayError, so that if only one of them fails you'll still get the first which finishes with a valid result).
Should look like:
Observable.mergeDelayError(pushObservable, pullObservable)
.first()
.subscribe(data->...);
