Android Parse SDK issue

I'm using Parse to store my data and I run a lot of queries while my program is in use.
The issue is that after roughly 20 similar queries, Parse's findInBackground() or getFirstInBackground() never invokes its callback and the app gets stuck at that point.
My query code:
ParseQuery<OptionCodeDTO> mQuery;
mQuery = ParseQuery.getQuery(OptionCodeDTO.class);
mQuery.whereEqualTo("code", prCode);
mQuery.getFirstInBackground(new GetCallback<OptionCodeDTO>() {
    @Override
    public void done(OptionCodeDTO optionCodeDTO, ParseException e) {
        if (isVisible()) {
            if (e == null) {
                OptionCode opCode = new OptionCode(optionCodeDTO);
                mCodes.push(opCode);
                printCodes();
                prDescrLayout.setVisibility(View.VISIBLE);
                prDescProgress.setVisibility(View.GONE);
                mPRLable.setVisibility(View.GONE);
            } else {
                if (e.getCode() == ParseException.CONNECTION_FAILED) {
                    mPrDescr.setText(R.string.dtc_lookup_check_network);
                } else if (e.getCode() == ParseException.OBJECT_NOT_FOUND) {
                    mPrDescr.setText(R.string.pr_lookup_code_not_found);
                } else {
                    mPrDescr.setText(R.string.dtc_lookup_other_problems);
                }
                prDescrLayout.setVisibility(View.VISIBLE);
                prDescProgress.setVisibility(View.GONE);
            }
        }
    }
});

First of all, if your app ANRs (Application Not Responding) because something on the UI thread is waiting on background work, that is an architecture problem in itself.
You probably need to optimize your app's interaction with Parse. In general it is bad practice to issue lots of saveInBackground() calls, for example from inside a loop. Instead, add the objects that need saving to a list and call ParseObject.saveAllInBackground(objectList), as sketched below.
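A minimal sketch of that batching, assuming a placeholder ScoreEntry type and an entries list that would come from your own code:

// Collect the objects in a list instead of calling saveInBackground() per object.
List<ParseObject> toSave = new ArrayList<>();
for (ScoreEntry entry : entries) {            // ScoreEntry / entries are placeholders
    ParseObject row = new ParseObject("Score");
    row.put("points", entry.getPoints());
    toSave.add(row);
}
ParseObject.saveAllInBackground(toSave, new SaveCallback() {
    @Override
    public void done(ParseException e) {
        if (e == null) {
            // all objects saved in one background batch
        } else {
            // handle the error once, instead of once per object
        }
    }
});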
Another idea for optimization is to use local storage - Android's built-in SQLite. For example, if your app relies on something being saved to Parse, the logic is like this:
When saving an object, first save it to the local DB and then run a saveInBackground() call.
When fetching objects, first fetch from your local DB and then run a getInBackground() call whose callback persists the result to your local DB.
This way your app stays usable without an internet connection; a sketch of the fetch side is below.
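A rough sketch of the fetch side under that pattern; LocalCodeStore, findByCode(), save() and showCode() are hypothetical helpers around SQLite, not part of the Parse SDK:

// 1. Show whatever is already cached locally, so the UI never waits on the network.
OptionCodeDTO cached = localCodeStore.findByCode(prCode);
if (cached != null) {
    showCode(cached);
}
// 2. Refresh from Parse in the background and write the result back to SQLite.
ParseQuery<OptionCodeDTO> query = ParseQuery.getQuery(OptionCodeDTO.class);
query.whereEqualTo("code", prCode);
query.getFirstInBackground(new GetCallback<OptionCodeDTO>() {
    @Override
    public void done(OptionCodeDTO fresh, ParseException e) {
        if (e == null && fresh != null) {
            localCodeStore.save(fresh);   // hypothetical write-through to SQLite
            showCode(fresh);
        }
        // on error the UI keeps showing the cached value, if there was one
    }
});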

Room Update the data only if it changed from the server

I have a function that runs every 30 seconds; it fetches the latest data from the server and stores it locally.
Currently I delete all the existing rows and re-insert all the data, but I don't think that is efficient - I should update the data only when the local data and the server data differ.
So how can I do that?
Here's what I'm doing:
DatabaseDao:
@Dao
public interface GeneralDatabaseDao {
    @Query("DELETE FROM table_tables")
    int deleteTables();

    @Insert(onConflict = OnConflictStrategy.REPLACE)
    void insertRestaurantTablesData(TablesModel... TablesModels);
}
Repository:
public LiveData<List<TablesModel>> getTablesData(int mLocationID) {
    mTablesList = new MutableLiveData<>();
    LiveData<TablesModel> mTablesData = mTableDataSource.getTablesData();
    Observer<TablesModel> mObserver = tableModels -> {
        mExecutors.diskIO().execute(() -> {
            // Completed: delete old table data if there are conflicts.
            if (tableModels != null) {
                mDatabaseDao.deleteTables();
                mDatabaseDao.insertTablesFromServer(tableModels.getTables());
            } else {
                Log.e(LOG_TAG, "Nothing: ");
            }
        });
        Log.e("Handlers", "repository getTablesData");
    };
    if (!mTablesData.hasObservers()) {
        mTablesData.observeForever(mObserver);
    }
    return mDatabaseDao.getTablesData(mLocationID);
}
I know I need to compare all the local rows with the data I got from the server and then update only the rows that changed. But for that I would have to query the local data, check it row by row in a loop and then update. I'm fine with doing that, but is there a more efficient way?
No, there's no real alternative. Moreover, if you delete everything and insert it again, you trigger a UI refresh every time. Since you are using LiveData precisely so that the UI updates only when needed, you should touch the database only when it is strictly necessary.
With the code as written, the UI is refreshed every time you poll the web service. Better to check each time which rows have actually changed and modify only those; a sketch of that idea follows.
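A minimal sketch of that diff-and-update idea; getAllTablesSync(), insertTables() and an equals() implementation on TablesModel are assumptions, not part of the posted code:

// DAO additions (assumed): a synchronous read used only for the comparison.
@Query("SELECT * FROM table_tables")
List<TablesModel> getAllTablesSync();

@Insert(onConflict = OnConflictStrategy.REPLACE)
void insertTables(List<TablesModel> tables);

// In the repository, on the diskIO executor, instead of deleteTables() + reinsert:
List<TablesModel> local = mDatabaseDao.getAllTablesSync();
List<TablesModel> changed = new ArrayList<>();
for (TablesModel serverRow : tableModels.getTables()) {
    // relies on TablesModel implementing equals() over its fields
    if (!local.contains(serverRow)) {
        changed.add(serverRow);
    }
}
if (!changed.isEmpty()) {
    // REPLACE updates existing rows (matched by primary key) and inserts new ones,
    // so LiveData observers fire only when something actually changed.
    mDatabaseDao.insertTables(changed);
}
// Note: rows deleted on the server would still need separate handling.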

Android Azure Offline Sync - Not completing sync

I have an Android app with Azure Mobile Services and have implemented offline sync. The app works well, but syncing the data never seems to complete, so there are always a few rows in the tables that have not synced.
Does anyone have an idea what the problem might be? I believe that on the next try it should pick up where it left off, or am I wrong?
Thanks in advance
The app works well, but syncing the data never seems to complete, so there are always a few rows in the tables that have not synced.
I would recommend using Fiddler to capture the network traces while the sync operations run.
For incremental sync, the request looks like this:
Get https://{your-app-name}.azurewebsites.net/tables/TodoItem?$filter=(updatedAt%20ge%20datetimeoffset'2017-11-03T06%3A56%3A44.4590000%2B00%3A00')&$orderby=updatedAt&$skip=0&$top=50&__includeDeleted=true
When opting out of incremental sync, all records are retrieved without the updatedAt filter:
Get https://{your-app-name}.azurewebsites.net/tables/TodoItem?$skip=0&$top=50&__includeDeleted=true
Note: if there are too many items, the SDK sends multiple requests to pull all items that match your query from the associated remote table. Also make sure you specify includeDeleted() in your query.
In summary, you need to make sure that all of those items can be retrieved via the requests above. Additionally, if there are pending local updates, a pull operation first executes a push, so I assume you could catch the exception thrown by the pull operation to handle conflict resolution; a sketch of that follows.
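A minimal sketch of a sync call that surfaces those errors, following the usual Azure Mobile Apps Android SDK pattern (the TodoItem table class, the "todoItems" query ID and the exact method signatures are assumptions to verify against your SDK version; run this off the main thread):

MobileServiceSyncTable<TodoItem> todoTable = mClient.getSyncTable(TodoItem.class);
try {
    // Push pending local changes first; a pull would trigger this implicitly anyway.
    mClient.getSyncContext().push().get();
    // Passing a query ID enables incremental sync (the updatedAt filter shown above);
    // pull(null) without an ID would re-fetch everything.
    todoTable.pull(null, "todoItems").get();
} catch (Exception e) {
    // Conflicts and failed rows surface here - log them to see which
    // operation never completes.
    Log.e("Sync", "Sync failed", e);
}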
Bruce's answer is fine, but I used a slightly different method that does not require Fiddler.
I changed my connection from this:
mClient = new MobileServiceClient("[AZUREWEBSITE]", cntxall);
mClient.setAndroidHttpClientFactory(new MyOkHttpClientFactory());
To this:
mClient = new MobileServiceClient("[AZUREWEBSITE]", cntxall).withFilter(
    new ServiceFilter() {
        @Override
        public ListenableFuture<ServiceFilterResponse> handleRequest(ServiceFilterRequest request, NextServiceFilterCallback nextServiceFilter) {
            // Get the request contents
            String url = request.getUrl();
            String content = request.getContent();
            if (url != null) {
                Log.d("Request URL:", url);
            }
            if (content != null) {
                Log.d("Request Content:", content);
            }
            // Execute the next service filter in the chain
            ListenableFuture<ServiceFilterResponse> responseFuture = nextServiceFilter.onNext(request);
            Futures.addCallback(responseFuture, new FutureCallback<ServiceFilterResponse>() {
                @Override
                public void onFailure(Throwable e) {
                    Log.d("Exception:", e.getMessage());
                }

                @Override
                public void onSuccess(ServiceFilterResponse response) {
                    if (response != null && response.getContent() != null) {
                        Log.d("Response Content:", response.getContent());
                    }
                }
            });
            return responseFuture;
        }
    }
);
This filter logs every request (and response) the client sends to Azure, so you can see in the log exactly which sync operation fails.

Managing different Exceptions from different rx.Observables

Following the topic discussed here: I'm writing an Android app using Clean Architecture. I have an Interactor that takes care of retrieving the user's feed data. The flow is like this:
I must fetch the feed data from a Repository, which calls a Retrofit service to do the API call.
If something goes wrong, I have to fetch the feed data from a FeedCache that internally works with SQLite.
I have to merge this feed collection with another bunch of feeds from a second cache called PendingPostCache. This cache contains all the articles that the user couldn't post (because something went wrong, there was no internet connection, etc.).
My FeedCache and PendingPostCache both work with SQLite, and both can throw a DBException if something goes wrong. My FeedRepository, the one that makes the requests against the server side, can also throw an exception if something goes wrong (ServerSideException).
Here's the whole code from my Interactor:
mFeedRepository.getFeed(offset, pageSize) // Get items from the server side
    .onErrorResumeNext(mFeedCache.getFeed(userSipid)) // If something goes wrong, take it from the cache
    .mergeWith(mPendingPostCache.getAllPendingPostsAsFeedItems(user)) // Merge the response with the pending posts
    .subscribe(new DefaultSubscriber<List<BaseFeedItem>>() {
        @Override
        public void onNext(List<BaseFeedItem> baseFeedItems) {
            callback.onFeedFetched(baseFeedItems);
        }

        @Override
        public void onError(Throwable e) {
            if (e instanceof ServerSideException) {
                // Handle the HTTP error
            } else if (e instanceof DBException) {
                // Handle the database cache error
            } else {
                // Handle generic error
            }
        }
    });
I don't like having those instanceof checks. I'm thinking of creating a custom subscriber, something called MyAppSubscriber, which implements onError(), does those instanceof comparisons there and dispatches to methods such as onServerSideError() and onDBError(). That way the calling code would be a lot cleaner and I could spare myself the instanceof boilerplate; a rough sketch of what I mean is below. Does someone have a better idea of how to approach this? Some way to avoid the custom Subscriber?
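Roughly what I have in mind (MyAppSubscriber and its hook methods are hypothetical, not existing code):

// Hypothetical base subscriber that centralizes the instanceof dispatch.
public abstract class MyAppSubscriber<T> extends DefaultSubscriber<T> {

    @Override
    public void onError(Throwable e) {
        if (e instanceof ServerSideException) {
            onServerSideError((ServerSideException) e);
        } else if (e instanceof DBException) {
            onDBError((DBException) e);
        } else {
            onGenericError(e);
        }
    }

    protected void onServerSideError(ServerSideException e) { /* override where needed */ }

    protected void onDBError(DBException e) { /* override where needed */ }

    protected void onGenericError(Throwable e) { /* override where needed */ }
}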
Just use composition:
public <T, E> Function<Throwable, Observable<T>> whenExceptionIs(Class<E> what, Function<E, Observable<T>> handler) {
    return t -> {
        return what.isInstance(t) ? handler.apply(what.cast(t)) : Observable.error(t);
    };
}
Then you use it normally:
Observable.from(...).flatMap(...)
    .onErrorResumeNext(whenExceptionIs(ServerSideException.class, e -> Observable.empty()))
    .onErrorResumeNext(whenExceptionIs(DBException.class, e -> ...))
You can even abstract all that in one method:
public <T> Transformer<T, T> errorHandling() {
    return src -> src
        .onErrorResumeNext(whenExceptionIs(ServerSideException.class, e -> Observable.empty()))
        .onErrorResumeNext(whenExceptionIs(DBException.class, e -> ...));
}
Observable.from(...).flatMap(...)
    .compose(errorHandling())
    .subscribe();

Parse findAllInBackground & fetchAllInBackground

I'm having an issue that is slowly driving me crazy.
I have a database table, let's call it A. Table A has a field that determines whether a row has been processed or not. I update that field myself from within the Parse browser to either true or false, and then call query.findInBackground() to check the Boolean value; however, the returned list always shows false when it is true and vice versa. Enough talking, let me show you what I'm doing.
public static void getMyRequests(ParseUser user, final FindCallback<ServicesModel> callback) {
    ParseQuery<ServicesModel> query = new ParseQuery<>(ServicesModel.class);
    if (!user.getBoolean(ParseHelper.CAN_UPLOAD)) {
        query.whereEqualTo("user", user);
    }
    query.findInBackground(new FindCallback<ServicesModel>() {
        @Override
        public void done(final List<ServicesModel> objects, ParseException e) {
            if (e == null) {
                if (objects != null && !objects.isEmpty()) {
                    for (ServicesModel object : objects) {
                        object.setHandlerUser(object.getParseUser("handlerUser"));
                        object.setProcessedTime(object.getLong("processedTime"));
                        object.setCategoryType(object.getString("categoryType"));
                        object.setUser(object.getParseUser("user"));
                        object.setUserRequest(object.getString("userRequest"));
                        object.setImageUrl(object.getString("imageUrl"));
                        object.setProcessed(object.getBoolean("isProcessed"));
                        Logger.e(object.getBoolean("isProcessed") + "");
                    }
                    callback.done(objects, null);
                } else {
                    callback.done(null, new ParseException(1001, "No Services"));
                }
            } else {
                callback.done(null, e);
            }
        }
    });
}
The code above is supposed to refresh my data, but my log always shows that isProcessed is false even though it is set to true in the Parse browser.
What have I tried besides this? fetchAllInBackground and fetch(), you name it. The objects keep returning false until I re-run the application from Android Studio. What am I doing wrong here? By the way, here is how I initialize Parse:
Parse.setLogLevel(BuildConfig.DEBUG ? DEBUG_LEVEL : Parse.LOG_LEVEL_NONE);
ParseObject.registerSubclass(ProductsModel.class);
ParseObject.registerSubclass(ProductRentalModel.class);
ParseObject.registerSubclass(ServicesModel.class);
Parse.enableLocalDatastore(context);
Parse.initialize(context, context.getString(R.string.app_id), context.getString(R.string.client_id));
The answer was to remove
Parse.enableLocalDatastore(context);
which is bad anyway. Without the local datastore the data refreshes properly; with it enabled, the data will not refresh unless I kill the app and/or reinstall it. That's bad, but it did the trick. A workaround that keeps the datastore is sketched below.
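For completeness: if you do want to keep the local datastore, the documented pin/unpin pattern is to drop the stale pinned objects and pin the fresh results inside the find callback. This is only a sketch (the "services" pin label is a placeholder) and has not been verified against this particular bug:

query.findInBackground(new FindCallback<ServicesModel>() {
    @Override
    public void done(final List<ServicesModel> objects, ParseException e) {
        if (e == null && objects != null) {
            // Unpin the previously cached rows, then pin the fresh ones,
            // so later local-datastore queries see current values.
            ParseObject.unpinAllInBackground("services", new DeleteCallback() {
                @Override
                public void done(ParseException unpinError) {
                    ParseObject.pinAllInBackground("services", objects);
                }
            });
        }
    }
});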

Network query returns local datastore pointer

I have recently managed to switch to Local Datastore, moving all the needed parts of the database to the client. Furthermore I send a push notification whenever data changes so that the client data stays up to date.
Now my problem is that one of the pointers in my data keeps returning the old pointer after updating.
Here is the code that I use:
public ParseQuery<CircuitUnit> circuitUnits() {
    ParseQuery<CircuitUnit> query = CircuitUnit.getQuery();
    query.include(CircuitUnit.circuits);
    query.include(CircuitUnit.guard);
    query.setLimit(1000);
    return query;
}

circuitUnits().findInBackground(
    new FindCallback<CircuitUnit>() {
        @Override
        public void done(List<CircuitUnit> objects, ParseException e) {
            Log.d(TAG, "Pinning circuitUnits " + objects.size());
            // Debugging loop over the results
            for (CircuitUnit circuitUnit : objects) {
                if (circuitUnit.getObjectId().equals("TXEZDch6wK")) {
                    Guard guard = circuitUnit.getGuard();
                    Log.d(TAG, "Guard: " + guard.getObjectId());
                }
            }
            // This call updates the Local Datastore with the newly fetched objects:
            // updateServerDataPin(objects, PinsGlobal.CIRCUIT_UNITS, e, callback);
        }
    });
The Logged output is: "Guard: L440gHXKTY" where it should be SDDg23rR4h
I have added a screenshot of the CircuitUnit table showing that SDDg23rR4h is indeed the expected objectId of the pointer.
I tried to create a sample project showing the issue as part of a bug report for parse.com, but that project returned the correct pointer.
My theory is that the problem lies in having data in the Local Datastore which somehow interferes with the result - not that I understand why, because I am clearly querying the network and not the local store.
It seems as if the include statements are ignored and simply filled with the known data from the Local Datastore.
Has anyone experienced something similar or have a possible explanation to this behaviour?
