I have been using Parse to retrieve data for a list view. Unfortunately, queries return 100 results by default and are capped at a maximum of 1000, and I have well over that 1000 maximum in my class. I found a link on the web which shows a way to do it on iOS, but how would you do it on Android? Web Link
I am currently adding all the data into an ArrayList in a loop until all items are retrieved (100 at a time) and then adding them to the list.
I have figured out how to achieve my goal:
Declare Global Variable
private static List<ParseObject> allObjects = new ArrayList<ParseObject>();
Create Query
final ParseQuery parseQuery = new ParseQuery("Objects");
parseQuery.setLimit(1000);
parseQuery.findInBackground(getAllObjects());
Callback for Query
int skip=0;
FindCallback getAllObjects(){
return new FindCallback(){
public void done(List<ParseObject> objects, ParseException e) {
if (e == null) {
allObjects.addAll(objects);
int limit =1000;
if (objects.size() == limit){
skip = skip + limit;
ParseQuery query = new ParseQuery("Objects");
query.setSkip(skip);
query.setLimit(limit);
query.findInBackground(getAllObjects());
}
//We have a full PokeDex
else {
//USE FULL DATA AS INTENDED
}
}
}
};
}
Here is a JavaScript version without promises.
These are the global variables (collections are not required, just a bad habit of mine):
///create a collection of cool things and instantiate it (globally)
var CoolCollection = Parse.Collection.extend({
model: CoolThing
}), coolCollection = new CoolCollection();
This is the "looping" function that gets your results:
//recursive call, initial loopCount is 0 (we haven't looped yet)
function getAllRecords(loopCount){
///set your record limit
var limit = 1000;
///create your eggstra-special query
new Parse.Query(CoolThings)
.limit(limit)
.skip(limit * loopCount) //<-important
.find({
success: function (results) {
if(results.length > 0){
//we do stuff in here like "add items to a collection of cool things"
for(var j=0; j < results.length; j++){
coolCollection.add(results[j]);
}
loopCount++; //<--increment our loop because we are not done
getAllRecords(loopCount); //<--recurse
}
else
{
//our query has run out of steam, this else{} will be called one time only
coolCollection.each(function(coolThing){
//do something awesome with each of your cool things
});
}
},
error: function (error) {
//badness with the find
}
});
}
This is how you call it (or you could do it other ways):
getAllRecords(0);
IMPORTANT: None of the answers here are needed if you are using the open source Parse Server. It does limit results to 100 rows by default, but you can pass any value to query.limit(), e.g. query.limit(100000) // WORKS. There is no need for recursive calls; just set the limit to the number of rows you want.
https://github.com/parse-community/parse-server/issues/5383
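For the Android SDK the equivalent call is setLimit(). A minimal sketch, assuming you are on an open source Parse Server (so the 1000-row cap does not apply) and reusing the "Objects" class name from the answers above:
// Sketch for the Android Parse SDK against an open source Parse Server,
// where the limit may exceed 1000.
ParseQuery<ParseObject> query = ParseQuery.getQuery("Objects");
query.setLimit(100000); // any value larger than the number of rows you expect
query.findInBackground(new FindCallback<ParseObject>() {
    @Override
    public void done(List<ParseObject> objects, ParseException e) {
        if (e == null) {
            // all rows arrive in this single callback
        }
    }
});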
JAVA
So, 5 years and 4 months later, @SquiresSquire's answer above needed some changes to make it work for me, and I would like to share them with you.
private static List<ParseObject> allObjects = new ArrayList<ParseObject>();
private static int skip = 0; // tracks how many rows have already been fetched
ParseQuery<ParseObject> parseQuery = new ParseQuery<ParseObject>("CLASSNAME");
parseQuery.setLimit(1000);
parseQuery.findInBackground(getAllObjects());
FindCallback<ParseObject> getAllObjects() {
return new FindCallback<ParseObject>() {
@Override
public void done(List<ParseObject> objects, ParseException e) {
if (e == null) {
allObjects.addAll(objects);
int limit = 1000;
if (objects.size() == limit) {
skip = skip + limit;
ParseQuery<ParseObject> query = new ParseQuery<ParseObject>("CLASSNAME");
query.setSkip(skip);
query.setLimit(limit);
query.findInBackground(getAllObjects());
}
//We have a full PokeDex
else {
//USE FULL DATA AS INTENDED
}
}
}
};
}
In C# I use this recursion:
// `list` is assumed to be a static field that accumulates the results across calls:
private static List<ParseObject> list = new List<ParseObject>();
private static async Task GetAll(int count = 0, int limit = 1000)
{
// stop when the previous page came back with fewer than `limit` rows
if (count * limit != list.Count) return;
var res = await ParseObject.GetQuery("Row").Limit(limit).Skip(list.Count).FindAsync();
res.ToList().ForEach(x => list.Add(x));
await GetAll(++count);
}
JS version:
function getAll(list) {
new Parse.Query(Row).limit(1000).skip(list.length).find().then(function (result) {
list = list.concat(result);
if (result.length != 1000) {
//do here something with the list...
return;
}
getAll(list);
});
}
Usage: GetAll() in C#, and getAll([]) in JS.
I store all rows from the class Row in the list. In each request I get 1000 rows and skip the current size of the list. The recursion stops when the number of rows fetched so far differs from the expected count, i.e. when the last page was not full.
EDIT: The answer below is redundant because the open source Parse Server doesn't put any limit on the maximum number of rows that can be fetched:
//instead of var result = await query.find();
query.limit(99999999999); // any value greater than the max rows you want
var result = await query.find();
Original answer:
Javascript / Cloud Code
Here's a clean way that works for all queries:
async function fetchAllIgnoringLimit(query,result) {
const limit = 1000;
query.limit(limit);
query.skip(result.length);
const results = await query.find();
result = result.concat(results)
if(results.length === limit) {
return await fetchAllIgnoringLimit(query,result );
} else {
return result;
}
}
And here's how to use it
var GameScore = Parse.Object.extend("GameScore");
var query = new Parse.Query(GameScore);
//instead of var result = await query.find();
var result = await fetchAllIgnoringLimit(query,new Array());
console.log("got "+result.length+" rows")
YAS (Yet Another Solution!) using async/await in JavaScript.
async parseFetchAll(collected = []) {
let query = new Parse.Query(GameScore);
const limit = 1000;
query.limit(limit);
query.skip(collected.length);
const results = await query.find();
if(results.length === limit) {
return await parseFetchAll([ ...collected, ...results ]);
} else {
return collected.concat(results);
}
}
A Swift 3 Example:
var users = [String] ()
var payments = [String] ()
///set your record limit
let limit = 29
//recursive call, initial loopCount is 0 (we haven't looped yet)
func loadAllPaymentDetails(_ loopCount: Int){
///create your NEW eggstra-special query
let paymentsQuery = Payments.query()
paymentsQuery?.limit = limit
paymentsQuery?.skip = limit*loopCount
paymentsQuery?.findObjectsInBackground(block: { (objects, error) in
if let objects = objects {
//print(#file.getClass()," ",#function," loopcount: ",loopCount," #ReturnedObjects: ", objects.count)
if objects.count > 0 {
//print(#function, " no. of objects :", objects.count)
for paymentsObject in objects {
let user = paymentsObject[Utils.name] as! String
let amount = paymentsObject[Utils.amount] as! String
self.users.append(user)
self.payments.append(amount)
}
//recurse our loop with increment because we are not done
self.loadAllPaymentDetails(loopCount + 1); //<--recurse
}else {
//our query has run out of steam, this else{} will be called one time only
//if the Table had been initially empty, lets inform the user:
if self.users.count == 1 {
Utils.createAlert(self, title: "No Payment has been made yet", message: "Please Encourage Users to make some Payments", buttonTitle: "Ok")
}else {
self.tableView.reloadData()
}
}
}else if error != nil {
print(error!)
}else {
print("Unknown Error")
}
})
}
Adapted from @deLux_247's example above.
You could achieve this using Cloud Code: make a custom function you can call that enumerates the entire collection and builds a response from that. A wiser choice, however, would be to paginate your requests and fetch the records 1000 (or even fewer) at a time, adding them to your list dynamically as required.
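If you go the pagination route, a rough Android sketch could look like the following; the loadNextPage() helper, the "Objects" class name and the page size are illustrative assumptions, and you would call the helper whenever the list nears its end:
// Hypothetical on-demand pagination helper: fetch one page at a time and
// append it to the adapter instead of loading the whole class up front.
private int loaded = 0;              // rows fetched so far
private static final int PAGE = 100; // page size, tune as needed

void loadNextPage() {
    ParseQuery<ParseObject> query = ParseQuery.getQuery("Objects");
    query.setLimit(PAGE);
    query.setSkip(loaded);
    query.findInBackground(new FindCallback<ParseObject>() {
        @Override
        public void done(List<ParseObject> objects, ParseException e) {
            if (e == null) {
                loaded += objects.size();
                // append `objects` to the adapter's data set and notify it
            }
        }
    });
}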
GENERIC VERSION For SWIFT 4:
Warning: this is not tested!
An attempt to adapt nyxee's answer to be usable for any query:
func getAllRecords(for query: PFQuery<PFObject>, then doThis: @escaping (_ objects: [PFObject]?, _ error: Error?)->Void) {
let limit = 1000
var objectArray : [PFObject] = []
query.limit = limit
func recursiveQuery(_ loopCount: Int = 0){
query.skip = limit * loopCount
query.findObjectsInBackground(block: { (objects, error) in
if let objects = objects {
objectArray.append(contentsOf: objects)
if objects.count == limit {
recursiveQuery(loopCount + 1)
} else {
doThis(objectArray, error)
}
} else {
doThis(objects, error)
}
})
}
recursiveQuery()
}
Here's my solution for C# .NET
List<ParseObject> allObjects = new List<ParseObject>();
ParseQuery<ParseObject> query1 = ParseObject.GetQuery("Class");
int totalctr = await query1.CountAsync();
for (int i = 0; i <= totalctr / 1000; i++)
{
ParseQuery<ParseObject> query2 = ParseObject.GetQuery("Class").Skip(i * 1000).Limit(1000);
IEnumerable<ParseObject> ibatch = await query2.FindAsync();
allObjects.AddRange(ibatch);
}
Related
I am making an application with Android and Firestore. When I try to upload the information to Firestore using a batch, I receive the following message:
com.google.firebase.firestore.FirebaseFirestoreException:
INVALID_ARGUMENT: maximum 500 writes allowed per request
I understand that this is a standard limitation for everyone (see Usage and limits), but how can the batch be divided into multiple batches to avoid this problem?
WriteBatch batch = mFirestore.batch();
batch.set(personRef, personData); // This is done 1 time
batch.set(productRef, myProduct, SetOptions.merge()); // This is done multiple times
batch.set(inventoryRef, inventoryData); // This is done multiple times
batch.set(clientRef, clientData); // This is done multiple times
batch.commit().addOnCompleteListener(new OnCompleteListener<Void>() {
@Override
public void onComplete(@NonNull Task<Void> task) {
if (task.isSuccessful()) {
Log.d(TAG, "Batch successfully completed!");
} else {
Log.d(TAG, "Error batch: ", task.getException());
}
}
});
I have searched for information about it, but I only find solutions for the web using async/task, nothing that helps on Android.
I have tried this, but without luck:
WriteBatch batch = mFirestore.batch();
int operationCounter = 0;
// This is just 1 time
DocumentReference personRef = mFirestore.collection..................
batch.set(personRef, personData);
// Multiple times
for (Product product : myProductList) {
DocumentReference productRef = mFirestore.collection...............
batch.set(productRef, product, SetOptions.merge());
operationCounter++;
if (operationCounter == 500) {
batch.commit();
// Start a new one
batch = mFirestore.batch();
// Reset counter
operationCounter = 0;
}
// This is just 1 time
DocumentReference inventoryRef = mFirestore.collection..................
batch.set(inventoryRef, inventory);
My goal is to be able to generate multiple batches to avoid the mentioned error and to be able to execute them one after another.
Actually you can do it, in the following way:
const BATCH_SIZE = 5;
let batch = db.batch();
let j = 0;
for (let i = 0; i < products.length; i++) {
// Do here the stuff you need
// Add the stuff you have done to the batch
const ref = db.collection(collection).doc(document);
batch.set(ref, data);
j++;
// Push the batch once it holds BATCH_SIZE operations
if (j === BATCH_SIZE) {
await batch.commit();
batch = db.batch();
j = 0; // Reset
}
}
// Push the remaining data into Firestore
await batch.commit();
Keep in mind that the remaining data committed after the loop will be fewer than BATCH_SIZE items (in this case 5). The constant BATCH_SIZE indicates how many writes go into each batch.
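Since the question asks for Android, here is a rough Java sketch of the same chunking idea. The 500-writes-per-batch limit and the names mFirestore, personRef, personData and myProductList come from the question; the "products" collection path and the sequential commit helper are assumptions.
// Split the writes into chunks of at most 500 operations and commit the
// batches one after another.
private void writeInChunks() {
    List<WriteBatch> batches = new ArrayList<>();
    WriteBatch current = mFirestore.batch();
    int ops = 0;

    current.set(personRef, personData); // done once
    ops++;

    for (Product product : myProductList) {
        DocumentReference productRef = mFirestore.collection("products").document();
        current.set(productRef, product, SetOptions.merge());
        ops++;
        if (ops == 500) {           // Firestore allows at most 500 writes per batch
            batches.add(current);
            current = mFirestore.batch();
            ops = 0;
        }
    }
    if (ops > 0) {
        batches.add(current);       // the last, partially filled batch
    }
    commitNext(batches, 0);
}

// Commit sequentially: start the next commit only after the previous one succeeds.
private void commitNext(List<WriteBatch> batches, int index) {
    if (index >= batches.size()) {
        Log.d(TAG, "All batches committed");
        return;
    }
    batches.get(index).commit()
            .addOnSuccessListener(aVoid -> commitNext(batches, index + 1))
            .addOnFailureListener(e -> Log.d(TAG, "Error committing batch", e));
}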
I have 1000 documents in a single collection in Cloud Firestore; is it possible to fetch random documents?
Say, for example, Students is a collection in Firestore and I have 1000 students in that collection; my requirement is to pick 10 students randomly on each call.
As per Alex's answer, I took the hint for getting random records from the Firebase Firestore database (especially for a small amount of data), but I found some problems with that approach:
It gives all the same records, as randomNumber is not updated.
The final list may contain duplicate records even if we update randomNumber every time.
It may contain records which we are already displaying.
I have updated the answer as follows:
FirebaseFirestore database = FirebaseFirestore.getInstance();
CollectionReference collection = database.collection(VIDEO_PATH);
collection.get().addOnCompleteListener(new OnCompleteListener<QuerySnapshot>() {
@Override
public void onComplete(@NonNull Task<QuerySnapshot> task) {
if (task.isSuccessful()) {
List<VideoModel> videoModelList = new ArrayList<>();
for (DocumentSnapshot document : Objects.requireNonNull(task.getResult())) {
VideoModel student = document.toObject(VideoModel.class);
videoModelList.add(student);
}
/* Get Size of Total Items */
int size = videoModelList.size();
/* Random Array List */
ArrayList<VideoModel> randomVideoModels = new ArrayList<>();
/* for-loop: It will loop all the data if you want
* RANDOM + UNIQUE data.
* */
for (int i = 0; i < size; i++) {
// Getting random number (inside loop just because every time we'll generate new number)
int randomNumber = new Random().nextInt(size);
VideoModel model = videoModelList.get(randomNumber);
// Check with current items whether its same or not
// It will helpful when you want to show related items excepting current item
if (!model.getTitle().equals(mTitle)) {
// Check whether current list is contains same item.
// May random number get similar again then its happens
if (!randomVideoModels.contains(model))
randomVideoModels.add(model);
// How many random items you want
// I want 6 items so It will break loop if size will be 6.
if (randomVideoModels.size() == 6) break;
}
}
// Bind adapter
if (randomVideoModels.size() > 0) {
adapter = new RelatedVideoAdapter(VideoPlayerActivity.this, randomVideoModels, VideoPlayerActivity.this);
binding.recyclerView.setAdapter(adapter);
}
} else {
Log.d("TAG", "Error getting documents: ", task.getException());
}
}
});
Hope this logic helps anyone who has a small amount of data; I don't think it will cause any problems for 1,000 to 5,000 records.
Thank you.
Yes it is, and to achieve this please use the following code:
FirebaseFirestore rootRef = FirebaseFirestore.getInstance();
CollectionReference studentsCollectionReference = rootRef.collection("students");
studentsCollectionReference.get().addOnCompleteListener(new OnCompleteListener<QuerySnapshot>() {
@Override
public void onComplete(@NonNull Task<QuerySnapshot> task) {
if (task.isSuccessful()) {
List<Student> studentList = new ArrayList<>();
for (DocumentSnapshot document : task.getResult()) {
Student student = document.toObject(Student.class);
studentList.add(student);
}
int studentListSize = studentList.size();
List<Student> randomStudentList = new ArrayList<>();
for(int i = 0; i < studentListSize; i++) {
Student randomStudent = studentList.get(new Random().nextInt(studentListSize));
if(!randomStudentList.contains(randomStudent)) {
randomStudentList.add(randomStudent);
if(randomStudentList.size() == 10) {
break;
}
}
}
} else {
Log.d(TAG, "Error getting documents: ", task.getException());
}
}
});
This is called the classic solution, and you can use it for collections that contain only a few records. If you are worried about a huge number of reads, though, I recommend this second approach. It involves a small change to your database: you add a new document that holds an array with all the student ids. To get the 10 random students you then only need to make a single get() call, which implies a single read operation. Once you have that array, you can use the same algorithm to pick 10 random ids. With those ids you fetch the corresponding documents and add them to a list, which costs only 10 more reads for the actual random students. In total, there are only 11 document reads.
This practice is called denormalization (duplicating data) and is a common practice when it comes to Firebase. If you're new to NoSQL databases, I recommend watching the video Denormalization is normal with the Firebase Database for a better understanding. It's about the Firebase Realtime Database, but the same principles apply to Cloud Firestore.
But remember, just as you add the ids to this newly created document, you also need to remove them when they are no longer needed.
To add a student id to the array field, simply use:
FieldValue.arrayUnion(studentId)
And to remove a student id, please use:
FieldValue.arrayRemove(studentId)
To get all 10 random students at once, you can use List<Task<DocumentSnapshot>> and then call Tasks.whenAllSuccess(tasks), as explained in my answer from this post:
Android Firestore convert array of document references to List<Pojo>
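A minimal sketch of that second approach. The "students_meta/allIds" document and its "studentIds" array field are illustrative assumptions; only the Tasks.whenAllSuccess() pattern comes from the linked answer.
// 1 read for the id array, then 10 reads for the random students.
FirebaseFirestore rootRef = FirebaseFirestore.getInstance();
rootRef.document("students_meta/allIds").get().addOnSuccessListener(snapshot -> {
    List<String> ids = (List<String>) snapshot.get("studentIds");
    List<String> randomIds = new ArrayList<>();
    Random random = new Random();
    while (randomIds.size() < 10 && randomIds.size() < ids.size()) {
        String id = ids.get(random.nextInt(ids.size()));
        if (!randomIds.contains(id)) {
            randomIds.add(id);
        }
    }
    // Fetch the corresponding documents and wait until all of them succeed.
    List<Task<DocumentSnapshot>> tasks = new ArrayList<>();
    for (String id : randomIds) {
        tasks.add(rootRef.collection("students").document(id).get());
    }
    Tasks.whenAllSuccess(tasks).addOnSuccessListener(results -> {
        List<Student> randomStudents = new ArrayList<>();
        for (Object result : results) {
            randomStudents.add(((DocumentSnapshot) result).toObject(Student.class));
        }
        // use randomStudents here
    });
});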
I faced a similar problem (I only needed to get one random document every 24 hours, or when the user refreshes the page manually, but you can apply this solution to your case as well), and what worked for me was the following:
Technique
Read a small list of documents for the first time, let's say from 1 to 10 documents (10 to 30 or 50 in your case).
Select random document(s) based on a randomly generated number(s) within the range of the list of documents.
Save the last id of the document you selected locally on the client device (maybe in shared preferences like I did).
If you want new random document(s), use the saved document id to start the process again (steps 1 to 3), querying after the saved document id, which excludes all documents that appeared before.
Repeat the process until there are no more documents after the saved document id, then start over from the beginning as if this were the first run of the algorithm (set the saved document id to null and start the process again, steps 1 to 4).
Technique Pros and Cons
Pros:
You can determine the jump size each time you get a new random document(s).
No need to modify the original model class of your object.
No need to modify the database that you already have or designed.
No need to add a document to the collection and maintain a random id for each document whenever a new document is added to the collection, like the solution mentioned here.
No need to load a big list of documents just to get one document or a small-sized list of documents.
Works well if you are using the auto-generated id by firestore (because the documents inside the collection are already slightly randomized)
Works well if you want one random document or a small-sized random list of documents.
Works on all platforms (including iOS, Android, Web).
Cons
You have to handle saving the id of the document to use in the next request for random document(s) (which is still better than maintaining a new field in each document, or adding the id of every document in the collection to a separate document in the main collection).
You may get some documents more than once if the collection is not large enough (in my case this wasn't a problem), and I didn't find any solution that avoids this case completely.
Implementation (kotlin on android):
var documentId = //get document id from shared preference (will be null if not set before)
getRandomDocument(documentId)
fun getRandomDocument(documentId: String?) {
if (documentId == null) {
val query = FirebaseFirestore.getInstance()
.collection(COLLECTION_NAME)
.limit(getLimitSize())
loadDataWithQuery(query)
} else {
val docRef = FirebaseFirestore.getInstance()
.collection(COLLECTION_NAME).document(documentId)
docRef.get().addOnSuccessListener { documentSnapshot ->
val query = FirebaseFirestore.getInstance()
.collection(COLLECTION_NAME)
.startAfter(documentSnapshot)
.limit(getLimitSize())
loadDataWithQuery(query)
}.addOnFailureListener { e ->
// handle on failure
}
}
}
fun loadDataWithQuery(query: Query) {
query.get().addOnSuccessListener { queryDocumentSnapshots ->
val documents = queryDocumentSnapshots.documents
if (documents.isNotEmpty() && documents[documents.size - 1].exists()) {
//select one document from the loaded list (I selected the last document in the list)
val snapshot = documents[documents.size - 1]
var documentId = snapshot.id
//SAVE the document id in shared preferences here
//handle the random document here
} else {
//handle in case you reach to the end of the list of documents
//so we start over again as this is the first time we get a random document
//by calling getRandomDocument() with a null as a documentId
getRandomDocument(null)
}
}
}
fun getLimitSize(): Long {
val random = Random()
val listLimit = 10
return (random.nextInt(listLimit) + 1).toLong()
}
Based on @ajzbc's answer I wrote this for Unity3D, and it's working for me.
FirebaseFirestore db;
void Start()
{
db = FirebaseFirestore.DefaultInstance;
}
public void GetRandomDocument()
{
Query query1 = db.Collection("Sports").WhereGreaterThanOrEqualTo(FieldPath.DocumentId, db.Collection("Sports").Document().Id).Limit(1);
Query query2 = db.Collection("Sports").WhereLessThan(FieldPath.DocumentId, db.Collection("Sports").Document().Id).Limit(1);
query1.GetSnapshotAsync().ContinueWithOnMainThread((querySnapshotTask1) =>
{
if(querySnapshotTask1.Result.Count > 0)
{
foreach (DocumentSnapshot documentSnapshot in querySnapshotTask1.Result.Documents)
{
Debug.Log("Random ID: "+documentSnapshot.Id);
}
} else
{
query2.GetSnapshotAsync().ContinueWithOnMainThread((querySnapshotTask2) =>
{
foreach (DocumentSnapshot documentSnapshot in querySnapshotTask2.Result.Documents)
{
Debug.Log("Random ID: " + documentSnapshot.Id);
}
});
}
});
}
A second approach as described by Alex Mamo would look similar to this:
Get the array list with the stored document ids
Get a number of strings (I stored the doc ids as string) from that list
In the code below you get 3 random and unique strings from the array and store them in a list, from which you can access the strings and make a query. I am using this code in a fragment:
@Nullable
@Override
public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) {
View view = inflater.inflate(R.layout.fragment_category_selection, container, false);
btnNavFragCat1 = view.findViewById(R.id.btn_category_1);
btnNavFragCat1.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
questionKeyRef.document(tvCat1).get().addOnCompleteListener(new OnCompleteListener<DocumentSnapshot>() {
@Override
public void onComplete(@NonNull Task<DocumentSnapshot> task) {
if (task.isSuccessful()) {
DocumentSnapshot document = task.getResult();
List<String> questions = (List<String>) document.get("questions"); // This gets the array list from Firestore
List<String> randomList = getRandomElement(questions, 0);
removeDuplicates(randomList);
...
}
}
});
}
});
...
return view;
}
private List<String> getRandomElement(List<String> list, int totalItems) {
int PICK_RANDOM_STRING = 3;
Random rand = new Random();
List<String> newList = new ArrayList<>();
int count = 0;
while (count < PICK_RANDOM_STRING) {
int randomIndex = rand.nextInt(list.size());
String currentValue = list.get(randomIndex);
if (!newList.contains(currentValue)) {
newList.add(currentValue);
count++;
}
}
return newList;
}
private void removeDuplicates(List<String> list) {
try {
Log.e("One", list.get(0));
Log.e("Two", list.get(1));
Log.e("Three", list.get(2));
query1 = list.get(0); // The strings are stored in these vars; with them you can make a normal Firestore query to get the actual documents
query2 = list.get(1);
query3 = list.get(2);
} catch (Exception e) {
e.printStackTrace();
}
}
Here is the array that I get from Firestore:
Is it possible to store multiple documents in Firestore with only one request?
With this loop it's possible, but it would cause one save operation per item in the list:
for (counter in counters) {
val counterDocRef = FirebaseFirestore.getInstance()
.document("users/${auth.currentUser!!.uid}/lists/${listId}/counters/${counter.id}")
val counterData = mapOf(
"name" to counter.name,
"score" to counter.score,
)
counterDocRef.set(counterData)
}
From the Firebase documentation:
You can also execute multiple operations as a single batch, with any combination of the set(), update(), or delete() methods. You can batch writes across multiple documents, and all operations in the batch complete atomically.
// Get a new write batch
WriteBatch batch = db.batch();
// Set the value of 'NYC'
DocumentReference nycRef = db.collection("cities").document("NYC");
batch.set(nycRef, new City());
// Update the population of 'SF'
DocumentReference sfRef = db.collection("cities").document("SF");
batch.update(sfRef, "population", 1000000L);
// Delete the city 'LA'
DocumentReference laRef = db.collection("cities").document("LA");
batch.delete(laRef);
// Commit the batch
batch.commit().addOnCompleteListener(new OnCompleteListener<Void>() {
@Override
public void onComplete(@NonNull Task<Void> task) {
// ...
}
});
Firestore multiple write operations
Hope it helps.
Update some properties on all documents in a collection:
resetScore(): Promise<void> {
return this.usersCollectionRef.ref.get().then(resp => {
console.log(resp.docs)
let batch = this.afs.firestore.batch();
resp.docs.forEach(userDocRef => {
batch.update(userDocRef.ref, {'score': 0, 'leadsWithSalesWin': 0, 'leadsReported': 0});
})
batch.commit().catch(err => console.error(err));
}).catch(error => console.error(error))
}
void createServiceGroups() {
List<String> serviceGroups = [];
serviceGroups.addAll([
'Select your Service Group',
'Cleaning, Laundry & Maid Services',
'Movers / Relocators',
'Electronics & Gadget',
'Home Improvement & Maintenance',
'Beauty, Wellness & Nutrition',
'Weddings',
'Food & Beverage',
'Style & Apparel',
'Events & Entertainment',
'Photographer & Videographers',
'Health & Fitness',
'Car Repairs & Maintenance',
'Professional & Business Services',
'Language Lessons',
'Professional & Hobby Lessons',
'Academic Lessons',
]);
Firestore db = Firestore.instance;
// DocumentReference ref = db
// .collection("service_groups")
// .document(Random().nextInt(10000).toString());
// print(ref.documentID);
// Get a new write batch
for (var serviceGroup in serviceGroups) {
createDocument(db, "name", serviceGroup);
}
print("length ${serviceGroups.length}");
}
createDocument(Firestore db, String k, String v) {
WriteBatch batch = db.batch();
batch.setData(db.collection("service_groups").document(), {k: v});
batch.commit();
}
This may help you:
for (var serviceGroup in serviceGroups) {
createDocument(db, "name", serviceGroup );
}
If you need to use add() instead of set(), please follow the code below:
public void createMany(List<T> datas) throws CustomException {
Firestore firestore = connection.firestore();
CollectionReference colRef = firestore.collection("groups");
WriteBatch batch = firestore.batch();
for (T data : datas) {
batch.create(colRef.document(), data);
}
ApiFuture<List<WriteResult>> futureList = batch.commit();
try {
for (WriteResult result : futureList.get()) {
logger.debug("Batch output: {}", result.getUpdateTime());
}
} catch (InterruptedException | ExecutionException e) {
throw new CustomException(500, e.getMessage());
}
}
This is useful when you need Firestore to generate the document id for you.
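A possible usage sketch; the Group POJO and the repository object holding createMany() are illustrative assumptions:
// Hypothetical usage: each call to colRef.document() inside createMany()
// lets Firestore generate the document id, so the POJOs carry no id field.
List<Group> groups = Arrays.asList(
        new Group("Cleaning"),
        new Group("Movers"),
        new Group("Electronics"));
groupRepository.createMany(groups);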
I have a list of data in which each item contains 2 string URLs; for each item I should place 2 API calls to fetch the results, add that data to the main list, and then display it in the ListView.
I am able to obtain the list with the right data, but the order gets mismatched. Is it that the first completed result gets added to my ArrayList, or something like that? Please help.
Here is my code:
void handleOrdersResponse(#NonNull List<OrderRowViewModel> response) {
for (int i = 0; i < response.size(); i++) {
String fulfillmentUrl = response.get(i).mFulFillmentMethodUrl.get();
String paymentUrl = response.get(i).mPaymentMethodUrl.get();
OrderRowViewModel response1 = response.get(i);
subscribe(Observable.zip(getFulFillment(fulfillmentUrl), getPayment(paymentUrl), Pair::new)
.compose(ObservableTransformers.getInstance().networkOperation())
.doOnSubscribe(() -> {
})
.subscribe(pair -> {
response1.setOrderFulFilemntname(pair.first.state(), pair.first.state());
response1.setPaymnetType(response1.mPaymentTitle.get(), pair.second.methodData().methodName());
mModels.add(response1);
mFilteredConversations.add(response1);
if (mFilteredConversations.size() == response.size() && mModels.size() == response.size()) {
mAdapter.refresh();
}
},
throwable -> mEventBus.post(new BaseActivity.ShowSnackbarEvent(R.string.failure_updating_store))));
}
}
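The mismatch happens because each zip() completes on its own schedule, so whichever pair finishes first is added to the lists first. A rough sketch of one way to keep the original order, assuming RxJava 1.x, the same names as above, and that networkOperation() is a generic transformer (concatMap processes the items strictly in sequence and toList() emits once when everything is done):
// Build one ordered stream instead of subscribing per item.
subscribe(Observable.from(response)
        .concatMap(order -> Observable.zip(
                        getFulFillment(order.mFulFillmentMethodUrl.get()),
                        getPayment(order.mPaymentMethodUrl.get()),
                        Pair::new)
                .map(pair -> {
                    order.setOrderFulFilemntname(pair.first.state(), pair.first.state());
                    order.setPaymnetType(order.mPaymentTitle.get(), pair.second.methodData().methodName());
                    return order;
                }))
        .toList()                              // emits a single ordered list when all items are done
        .compose(ObservableTransformers.getInstance().networkOperation())
        .subscribe(orders -> {
                    mModels.addAll(orders);
                    mFilteredConversations.addAll(orders);
                    mAdapter.refresh();        // refresh the adapter exactly once
                },
                throwable -> mEventBus.post(new BaseActivity.ShowSnackbarEvent(R.string.failure_updating_store))));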
I am looking for a way to chain multiple instances of the same API request with different parameters. So far my method looks like this:
@Override
public Observable<List<Entity>> getResult(Integer from, Integer to, Integer limit) {
MyService myService = restClient.getMyService();
if (null != from && null != to) {
Observable<List<Response>> responseObservable = myService.get(from, limit);
for (int i = from + 1; i <= to; i++) {
responseObservable = Observable.concat(responseObservable, myService.get(i, limit));
}
return responseObservable.map(mapResponseToEntity);
} else {
int fromParameter = null == from ? DEFAULT_FROM : from;
return myService.get(fromParameter, limit).map(mapResponseToEntity);
}
}
I expected that the concat method combines the Observables' data into one stream and returns a combined Observable, but I am getting only the last call's result. However, in Logcat I can see that the correct number of API calls was made.
Try using Observable.merge() and Observable.toList() as follows:
List<Observable<Response>> observables = new ArrayList<>();
// add observables to the list here...
Subscription subscription = Observable.merge(observables)
.toList()
.single()
.subscribe(...); // subscribe to List<Response>
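Applied to the question above, the list would be built from the same myService.get() calls. A rough sketch under that assumption (RxJava 1.x, reusing mapResponseToEntity from the question):
// Collect every page request, merge them, flatten the pages and emit one
// combined list once all calls have completed.
List<Observable<List<Response>>> pages = new ArrayList<>();
for (int i = from; i <= to; i++) {
    pages.add(myService.get(i, limit));
}
return Observable.merge(pages)          // runs the calls concurrently
        // use Observable.concat(pages) instead if page order must be preserved
        .flatMapIterable(page -> page)  // flatten each List<Response> into single items
        .toList()                       // collect everything into one List<Response>
        .map(mapResponseToEntity);      // reuse the existing mapper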