How to synchronise two async threads / tasks - Android

For my app I have to run two operations, both asynchronous:

read from a file (I use this file to simulate reading from a data bus) - an async operation because I don't know "when" a new message/character arrives on the bus. I search for a specific character sequence, e.g. frame start_bytes = "xx", and the 4 following bytes are "the data" I wait for.
read / update data in Firebase, depending on the "data" read from the file - an async operation due to the use of addValueEventListener.
I'm thinking of a semaphore/mutex mechanism, or a simple boolean flag with which one task signals to the other that new data must be saved/updated to Firebase.
How can I synchronize these two operations (by embedding them in a Task / AsyncTask / Thread)?
I ran a search on these topics but I only found examples related to UI, ProgressBars and so on... not really suited/useful to my situation.
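The semaphore/flag idea described above can be sketched with a plain-Java BlockingQueue (a minimal stand-in, not the Firebase code itself; all names here are illustrative): the bus-reader thread hands each decoded frame to a second thread, which blocks until data arrives and would then perform the Firebase update.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BusToFirebaseSketch {
    public static void main(String[] args) {
        // the queue replaces the boolean flag: the reader hands each
        // decoded frame to the writer, which blocks until data arrives
        BlockingQueue<String> frames = new ArrayBlockingQueue<>(16);

        Thread busReader = new Thread(() -> {
            // in the real app: scan the stream for "xx" and take the next 4 bytes
            try {
                frames.put("DATA"); // a decoded frame
                frames.put("STOP"); // sentinel: no more frames
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread firebaseWriter = new Thread(() -> {
            try {
                String frame;
                while (!(frame = frames.take()).equals("STOP")) {
                    // in the real app: the addValueEventListener / setValue logic
                    System.out.println("would write to Firebase: " + frame);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        busReader.start();
        firebaseWriter.start();
        try {
            busReader.join();
            firebaseWriter.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```

The queue gives you the hand-off and the wait in one primitive, so no explicit semaphore or shared flag is needed.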
read / update data in Firebase
myRefDevices.addValueEventListener(new ValueEventListener() {
    // addValueEventListener:
    // this method is called once with the initial value and again
    // whenever data at this location is updated.
    @Override
    public void onDataChange(DataSnapshot dataSnapshot) {
        boolean bChildFound = false;
        DatabaseReference dbrefChildFound;
        final CDeviceStatus obj_new = new CDeviceStatus();
        for (DataSnapshot val : dataSnapshot.getChildren()) {
            if (val.getKey().contentEquals(MAC_ADDRESS[iIterator])) {
                bChildFound = true;
                dbrefChildFound = val.getRef();
                obj_new.setiAvailable_A(val.getValue(CDeviceStatus.class).getiAvailable_A() + 1);
                obj_new.setsID(val.getValue(CDeviceStatus.class).getsID());
                dbrefChildFound.setValue(obj_new);
            }
        }
        if (!bChildFound) {
            Log.d("child=" + MAC_ADDRESS[iIterator], "not found");
        }
        if (++iIterator == 16) {
            iIterator = 0;
        }
    }

    @Override
    public void onCancelled(DatabaseError databaseError) {
    }
});
read from file:
try {
    // open input stream text file for reading
    Resources res = getResources();
    InputStream instream = res.openRawResource(R.raw.simulated_bus);
    // wrap it in a buffered input stream
    BufferedInputStream bistreamSimulatedBus = new BufferedInputStream(instream);
    try {
        // if we want to stop reading from the file / simulated bus for whatever reason...
        boolean bStayInLoop = true;
        while ((bistreamSimulatedBus.available() > 0) && bStayInLoop) {
            try {
                // throw new InterruptedException();
                char c = (char) bistreamSimulatedBus.read();
                if (COUNT_CHARACTERS_NEWLINE) {
                    if ('\n' == c) {
                        // we can count how many NewLine characters we have
                        //iNL_Counter++;
                    }
                }
                ...
            } catch (InterruptedException e) {
                throw new RuntimeException(e);
            }
        }
    } catch (IOException e) {
        throw new RuntimeException(e);
    } finally {
        // release any resources associated with the streams
        if (null != instream) {
            instream.close();
        }
        if (null != bistreamSimulatedBus) {
            bistreamSimulatedBus.close();
        }
    }
} catch (Exception e) {
    throw new RuntimeException(e);
}
Thank you.

Let us break down the solution like this:
The basics
You have two operations: o1 and o2. You want the second operation to execute as soon as the first one has completed.
It clearly appears to me that you need an event-driven solution here.
Approach
Using the Publisher/Subscriber design pattern, you can make the initiator of o1 the Publisher of an event. Then, when operation o1 completes, let that class (activity, fragment, service) notify the other class, which we will call the Subscriber.
Code
Add the following line to your build.gradle (app level):
implementation 'org.greenrobot:eventbus:3.0.0' // 'compile' on Gradle plugin versions before 3.0
Then, simply create a Plain Old Java Object (POJO) that represents your event:
public class RequestCompletedEvent { /* add a constructor and any fields you want */ }
Next, to publish the event, simply call post(POJO instance) like this:
EventBus.getDefault().post(new RequestCompletedEvent(true));
Then, finally, in the Subscriber class, simply listen for notifications by adding the following lines of code:
@Override
public void onStart() {
    super.onStart();
    EventBus.getDefault().register(this);
}

@Override
public void onStop() {
    super.onStop();
    EventBus.getDefault().unregister(this);
}
Then, still within the same class, use the @Subscribe annotation to catch any signals:
@Subscribe
public void onEvent(RequestCompletedEvent event) {
    /* Do something */
    // trigger the second operation here
    startOperationTwo();
}
Summary
It helps to note here that the easiest way to pull this off is to use an AsyncTask subclass to read your files; then, when it is successfully done, inside onPostExecute() you can notify the Subscriber to initiate the next operation.
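The publish/notify flow above can be sketched with the standard library alone (a minimal stand-in for EventBus and AsyncTask; the event and subscriber names are illustrative): a background reader decodes a frame, posts an event, and the subscriber reacts only after that signal.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;

public class PubSubSketch {
    // the "event" POJO, analogous to RequestCompletedEvent
    static class FrameReadEvent {
        final String data;
        FrameReadEvent(String data) { this.data = data; }
    }

    // the role EventBus plays: deliver events to whoever registered
    interface Subscriber {
        void onEvent(FrameReadEvent event);
    }

    // parse "the data": the 4 characters following the "xx" frame start
    static String extractFrame(String bus) {
        int start = bus.indexOf("xx");
        return (start >= 0 && bus.length() >= start + 6)
                ? bus.substring(start + 2, start + 6)
                : null;
    }

    public static void main(String[] args) throws InterruptedException {
        List<String> written = new ArrayList<>();
        CountDownLatch done = new CountDownLatch(1);

        // Subscriber: in the real app this is where the Firebase update happens
        Subscriber subscriber = event -> written.add(event.data);

        // Publisher: in the real app this is the file/bus reading task;
        // it posts an event as soon as a complete frame is decoded
        Thread reader = new Thread(() -> {
            String frame = extractFrame("..xxDATA..");
            if (frame != null) {
                subscriber.onEvent(new FrameReadEvent(frame));
            }
            done.countDown();
        });
        reader.start();
        done.await(); // the second operation proceeds only after the first signalled
        System.out.println(written); // [DATA]
    }
}
```

With EventBus, `subscriber.onEvent(...)` becomes `EventBus.getDefault().post(...)` and the registration shown earlier wires the delivery.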
I hope this helps; and good luck! Let me know if you need further assistance!

Related

OnPartialResult (Speechrecognition capacitor)

I am having a problem understanding how to change the onPartialResults function inside the Android code (in SpeechRecognition) so that it only returns the new word every time a word is detected, instead of the whole array of words.
For example, if I say "test", the result returned while the session remains active is [test]; but if I then say "test" again, the returned (partial) result now includes the word found earlier, [test, test], and I only need it to return the newly found word.
Current code
@Override
public void onPartialResults(Bundle partialResults) {
    ArrayList<String> matches = partialResults.getStringArrayList(
        SpeechRecognizer.RESULTS_RECOGNITION
    );
    JSArray matchesJSON = new JSArray(matches);
    try {
        if (
            matches != null &&
            matches.size() > 0 &&
            !previousPartialResults.equals(matchesJSON)
        ) {
            previousPartialResults = matchesJSON;
        }
    } catch (Exception ex) {}
}
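One way to report only the newly recognized words is to keep the previous list and return just the appended suffix. The following is a stdlib-only sketch of that idea (not Capacitor-specific; class and method names are illustrative, and it assumes the recognizer only appends words rather than rewriting earlier ones):

```java
import java.util.ArrayList;
import java.util.List;

public class PartialResultsDelta {
    private List<String> previous = new ArrayList<>();

    // returns only the words appended since the last call
    public List<String> newWords(List<String> matches) {
        if (matches == null) {
            matches = new ArrayList<>();
        }
        List<String> delta = new ArrayList<>();
        if (matches.size() > previous.size()) {
            // take only the tail that was not present last time
            delta.addAll(matches.subList(previous.size(), matches.size()));
        }
        previous = new ArrayList<>(matches);
        return delta;
    }

    public static void main(String[] args) {
        PartialResultsDelta d = new PartialResultsDelta();
        System.out.println(d.newWords(List.of("test")));         // [test]
        System.out.println(d.newWords(List.of("test", "test"))); // [test]
    }
}
```

In onPartialResults you would feed `matches` through such a helper before building the JSArray, so the plugin only forwards the delta.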

The Cloud function that returns a batch commit does not wait for that commit to make changes to the database before it completes

In Cloud Functions, I have defined a function that makes some updates using a batch that I commit. This commit is the return value of the function. The function simply computes the number of likes of every post (likes and posts are two distinct collections in my Firestore database). Since the whole code is short and very simple to understand, I show it below.
Liking or unliking a post (adding or removing a like document from the likes collection) is done client-side by the app.
Computing some statistics (the number of likes per post, for example) is done server-side in the following Cloud Function (because if it were client-side it would be hackable, i.e. bad statistics could be generated and saved; moreover, dealing with statistics doesn't concern the Android app directly, so it should definitely be computed server-side).
The important thing to note is: return batch.commit() is the return of this Cloud Function.
exports.sortPostsByUsersPostsLikes = functions.https.onCall((data, context) => {
    if (!context.auth) {
        throw new functions.https.HttpsError('failed-precondition', 'The function must be called while authenticated.');
    }
    const batch = admin.firestore().batch();
    const posts = admin_firestore.collection('list_of_users_posts');
    const likes = admin_firestore.collection('likes_of_users_posts');
    const map_posts_id_with_number_of_likes = [];
    likes.get().then(function(likes_docs) {
        likes_docs.forEach(like_doc => {
            if (!(like_doc.data().post in map_posts_id_with_number_of_likes)) {
                map_posts_id_with_number_of_likes[like_doc.data().post] = 0;
            }
            map_posts_id_with_number_of_likes[like_doc.data().post] += 1;
        });
        return posts.get();
    }).then(function(posts_docs) {
        posts_docs.forEach(post_doc => {
            if (post_doc.id in map_posts_id_with_number_of_likes) {
                batch.update(post_doc.ref, "number_of_likes", map_posts_id_with_number_of_likes[post_doc.id]);
            } else {
                batch.update(post_doc.ref, "number_of_likes", 0);
            }
        });
        return batch.commit();
    }).catch(function(error) {
        console.log("UNABLE TO SORT THE POSTS");
        console.log(error);
        throw new functions.https.HttpsError('unknown', 'An error occurred when trying to sort the posts.');
    });
});
In my Android app, when the user, in the list of posts, likes a post:
1. First, I add a like in the likes collection.
2. When the like is successfully added to the likes collection in the database, I refresh the list of posts.
3. When the list of posts is shown (or refreshed), I call the above Cloud Function in order to re-compute the number of likes of the posts (soon, "of the shown posts").
4. When the number of likes of the posts is successfully recomputed, I show the posts (so the number of likes of each shown post should be correct).
Question
The problem is: at step 4, the number of likes of each shown post is NOT correct (sometimes it is, sometimes it is not), as if the Cloud Function didn't wait until the batch commit ends. Is this normal behavior? Is there any way to force the Cloud Function to wait for the batch commit's success?
The code I use in the Android app to call the above Cloud Function and then, if it succeeds, to show the posts (normally with the right number of likes, which is not the case in practice) is:
FirebaseFunctions.getInstance()
    .getHttpsCallable("sortPostsByUsersPostsLikes")
    .call()
    .continueWith(new Continuation<HttpsCallableResult, Void>() {
        @Override
        public Void then(@NonNull final Task<HttpsCallableResult> task) {
            if (requireActivity().isDestroyed() || requireActivity().isFinishing()) {
                return null;
            }
            if (!task.isSuccessful()) {
                Exception e = task.getException();
                if (e instanceof FirebaseFunctionsException) {
                    FirebaseFunctionsException ffe = (FirebaseFunctionsException) e;
                    if (ffe.getCode() == FirebaseFunctionsException.Code.UNKNOWN) {
                        miscellaneous.showErrorPopIn(requireActivity(), R.string.error_sortPostsByUsersPostsLikes);
                    }
                }
                return null;
            }
            postsDatabaseModel.getListOfPostsOfUser(the_posts_owner).get().addOnCompleteListener(new OnCompleteListener<QuerySnapshot>() {
Returning the batch commit and adding a then doesn't work
I have tried the following but it doesn't work:
.then(function(posts_docs) {
    posts_docs.forEach(post_doc => {
        if (post_doc.id in map_posts_id_with_number_of_likes) {
            batch.update(post_doc.ref, "number_of_likes", map_posts_id_with_number_of_likes[post_doc.id]);
        } else {
            batch.update(post_doc.ref, "number_of_likes", 0);
        }
    });
    return batch.commit();
}).then(function() {
    return true;
}).catch(function(error) {
You are correctly chaining the promises returned by the asynchronous methods, but you don't return this entire chain. You should do as follows:
exports.sortPostsByUsersPostsLikes = functions.https.onCall((data, context) => {
    if (!context.auth) {
        throw new functions.https.HttpsError('failed-precondition', 'The function must be called while authenticated.');
    }
    const batch = admin.firestore().batch();
    const posts = admin_firestore.collection('list_of_users_posts');
    const likes = admin_firestore.collection('likes_of_users_posts');
    const map_posts_id_with_number_of_likes = [];
    // SEE THE ADDITION OF RETURN BELOW
    return likes.get().then(function(likes_docs) {
        likes_docs.forEach(like_doc => {
            if (!(like_doc.data().post in map_posts_id_with_number_of_likes)) {
                map_posts_id_with_number_of_likes[like_doc.data().post] = 0;
            }
            map_posts_id_with_number_of_likes[like_doc.data().post] += 1;
        });
        return posts.get();
    }).then(function(posts_docs) {
        posts_docs.forEach(post_doc => {
            if (post_doc.id in map_posts_id_with_number_of_likes) {
                batch.update(post_doc.ref, "number_of_likes", map_posts_id_with_number_of_likes[post_doc.id]);
            } else {
                batch.update(post_doc.ref, "number_of_likes", 0);
            }
        });
        return batch.commit();
    }).catch(function(error) {
        console.log("UNABLE TO SORT THE POSTS");
        console.log(error);
        throw new functions.https.HttpsError('unknown', 'An error occurred when trying to sort the posts.');
    });
});
I would suggest you watch the 3 videos about "JavaScript Promises" from the Firebase video series (https://firebase.google.com/docs/functions/video-series/), which emphasize how important it is to return a Promise. Without that, the Cloud Function may terminate at any time before all the asynchronous operations are completed.
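The same "return the whole chain" principle can be seen on the Android side with a plain-Java stand-in, using CompletableFuture in the role of the Promise (all names here are illustrative, not the actual Firestore API): only because the entire chain is returned can the caller wait for the final commit stage.

```java
import java.util.concurrent.CompletableFuture;

public class ReturnTheChain {
    // stand-ins for likes.get(), posts.get() and batch.commit()
    static CompletableFuture<String> getLikes() { return CompletableFuture.supplyAsync(() -> "likes"); }
    static CompletableFuture<String> getPosts() { return CompletableFuture.supplyAsync(() -> "posts"); }
    static CompletableFuture<String> commit()   { return CompletableFuture.supplyAsync(() -> "committed"); }

    // the whole chain is returned, so the caller can await the commit;
    // dropping the leading 'return' would detach the caller from the chain
    static CompletableFuture<String> sortPosts() {
        return getLikes()
                .thenCompose(likes -> getPosts())
                .thenCompose(posts -> commit());
    }

    public static void main(String[] args) {
        // join() blocks until the final stage (the commit) is done
        System.out.println(sortPosts().join()); // committed
    }
}
```

In the Cloud Function the runtime plays the role of `join()`: it only keeps the function alive while it holds the returned Promise, which is exactly why the missing `return` made the commit race with termination.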
UPDATE FOLLOWING YOUR COMMENTS
If you want to log the fact that the Cloud Function was successful, you could do as follows:
exports.sortPostsByUsersPostsLikes = functions.https.onCall((data, context) => {
    if (!context.auth) {
        throw new functions.https.HttpsError('failed-precondition', 'The function must be called while authenticated.');
    }
    //...
    return likes.get().then(function(likes_docs) {
        //...
        return posts.get();
    }).then(function(posts_docs) {
        //...
        return batch.commit();
    }).then(function() {
        console.log("SUCCESS");
        return null;
    }).catch(function(error) {
        //...
    });
});

Microblink recognizer set up RegexParserSettings

I am trying to scan an image taken from resources using a Recognizer with RegexParserSettings inside a fragment. The problem is that the BaseRecognitionResult obtained through the onScanningDone callback is always null. I have tried to set up RecognitionSettings with MRTDRecognizer and it worked fine, so I think the library is properly integrated. This is the source code I am using:
@Override
public void onAttach(Context context) {
    ...
    try {
        mRecognizer = Recognizer.getSingletonInstance();
        mRecognizer.setLicenseKey(context, LICENSE_KEY);
    } catch (FeatureNotSupportedException | InvalidLicenceKeyException e) {
        Log.d(TAG, e.getMessage());
    }
    buildRecognitionSettings();
    mRecognizer.initialize(context, mRecognitionSettings, new DirectApiErrorListener() {
        @Override
        public void onRecognizerError(Throwable t) {
            // Handle exception
        }
    });
}

private void buildRecognitionSettings() {
    mRecognitionSettings = new RecognitionSettings();
    mRecognitionSettings.setRecognizerSettingsArray(setupSettingsArray());
}

private RecognizerSettings[] setupSettingsArray() {
    RegexParserSettings regexParserSettings = new RegexParserSettings("[A-Z0-9]{17}");
    BlinkOCRRecognizerSettings sett = new BlinkOCRRecognizerSettings();
    sett.addParser("myRegexParser", regexParserSettings);
    return new RecognizerSettings[] { sett };
}
I scan the image like:
mRecognizer.recognizeBitmap(bitmap, Orientation.ORIENTATION_PORTRAIT, FragMicoblink.this);
And this is the callback handled in the fragment
@Override
public void onScanningDone(RecognitionResults results) {
    BaseRecognitionResult[] dataArray = results.getRecognitionResults();
    // dataArray is null
    for (BaseRecognitionResult baseResult : dataArray) {
        if (baseResult instanceof BlinkOCRRecognitionResult) {
            BlinkOCRRecognitionResult result = (BlinkOCRRecognitionResult) baseResult;
            if (result.isValid() && !result.isEmpty()) {
                String parsedAmount = result.getParsedResult("myRegexParser");
                if (parsedAmount != null && !parsedAmount.isEmpty()) {
                    Log.d(TAG, "Result: " + parsedAmount);
                }
            }
        }
    }
}
Thanks in advance!
Hello Spirrow.
The difference between your code and SegmentScanActivity is that your code uses the DirectAPI, which can process only the single bitmap image you send for processing, while SegmentScanActivity processes camera frames as they arrive from the camera. While doing so, it can use time-redundant information to improve the OCR quality, i.e. it combines consecutive OCR results from multiple video frames to obtain a better-quality OCR result.
This feature is not available via the DirectAPI - you need to use either SegmentScanActivity or a custom scan activity with our camera management.
You can also find out more here:
https://github.com/BlinkID/blinkid-android/issues/54
Regards

Using Observable Zip misbehaving

I have two observables (A and B), and I want the first to finish running before the second runs. But that's not even the real problem. The problem is that when A is added before B, B doesn't run at all; it is only when I place B before A that both run. The scenario I'm in is like this:
A - Pickup
B - Delivery
There are three types of orders: Pickup Only, Delivery Only, and Pickup And Delivery. Pickups need to run before Deliveries in every situation. A Delivery Only order already has Pickup marked as true. A Pickup Only order needs to be both picked up and delivered when it is closed. That is why I need Pickup to send all locally saved pickups first, before sending deliveries. So I did this:
Pickup
private Observable<UpdateMainResponse> getDeliveredOrders() {
    String token = PrefUtil.getToken(context);
    BehaviorSubject<Integer> pageControl = BehaviorSubject.create(1);
    Observable<UpdateMainResponse> ret = pageControl.asObservable().concatMap(integer -> {
        if (integer - 1 != deliveryUpdate.size()) {
            Log.e(TAG, "DeliveredOrders: " + deliveryUpdate.size());
            RealmOrderUpdate theDel = deliveryUpdate.get(integer - 1);
            Log.e(TAG, "DeliveryUpdate: " + theDel.toString());
            DeliverOrder pickupOrder = new DeliverOrder();
            pickupOrder.setUuid(theDel.getUuid());
            pickupOrder.setCode(theDel.getDest_code());
            pickupOrder.setDelivered_lat(theDel.getLoc_lat());
            pickupOrder.setDelivered_long(theDel.getLoc_long());
            return apiService.deliverOrder(theDel.getOrderId(), token, pickupOrder)
                    .subscribeOn(Schedulers.immediate())
                    .doOnNext(updateMainResponse -> {
                        try {
                            Log.e(TAG, updateMainResponse.toString());
                            realm.executeTransaction(realm1 -> theDel.deleteFromRealm());
                        } catch (Exception e) {
                            e.printStackTrace();
                        } finally {
                            pageControl.onNext(integer + 1);
                        }
                    });
        } else {
            return Observable.<UpdateMainResponse>empty().doOnCompleted(pageControl::onCompleted);
        }
    });
    return Observable.defer(() -> ret);
}
Delivery
private Observable<UpdateMainResponse> getPickedOrders() {
    Log.e(TAG, "PickedOrders: " + pickUpdate.size());
    String token = PrefUtil.getToken(context);
    BehaviorSubject<Integer> pageControl = BehaviorSubject.create(1);
    Observable<UpdateMainResponse> ret = pageControl.asObservable().concatMap(integer -> {
        Log.e(TAG, "MainPickedInteger: " + integer);
        if (integer - 1 != pickUpdate.size()) {
            RealmOrderUpdate thePick = pickUpdate.get(integer - 1);
            Log.e(TAG, "PickedUpdate: " + thePick.toString());
            PickupOrder pickupOrder = new PickupOrder();
            pickupOrder.setUuid(thePick.getUuid());
            pickupOrder.setCode(thePick.getSource_code());
            pickupOrder.setPicked_lat(thePick.getLoc_lat());
            pickupOrder.setPicked_long(thePick.getLoc_long());
            return apiService.pickupOrder(thePick.getOrderId(), token, pickupOrder)
                    .subscribeOn(Schedulers.immediate())
                    .doOnNext(updateMainResponse -> {
                        try {
                            Log.e(TAG, updateMainResponse.toString());
                            realm.executeTransaction(realm1 -> thePick.deleteFromRealm());
                        } catch (Exception e) {
                            e.printStackTrace();
                        } finally {
                            pageControl.onNext(integer + 1);
                        }
                    });
        } else {
            return Observable.<UpdateMainResponse>empty().doOnCompleted(pageControl::onCompleted);
        }
    });
    return Observable.defer(() -> ret);
}
Zipper
private Observable<ZipperResponse> batchedZip() {
    return Observable.zip(getPickedOrders(), getDeliveredOrders(), (updateMainResponse, updateMainResponse2) -> {
        List<UpdateMainResponse> orders = new ArrayList<>();
        orders.add(updateMainResponse);
        orders.add(updateMainResponse2);
        return new ZipperResponse(orders);
    });
}
Utilizing Zipper
public void generalUpload(APIRequestListener listener) {
    batchedZip().subscribe(new Subscriber<ZipperResponse>() {
        @Override
        public void onCompleted() {
            listener.didComplete();
            unsubscribe();
        }

        @Override
        public void onError(Throwable e) {
            listener.handleDefaultError(e);
            unsubscribe();
        }

        @Override
        public void onNext(ZipperResponse zipperResponse) {
            Log.e(TAG, String.valueOf(zipperResponse.size()));
        }
    });
}
Problem
I don't know why getDeliveredOrders() doesn't get called unless I move it in front of getPickedOrders().
Reading through the Rx documentation for zip, I can see that it's not going to work as I expected, where all of getPickedOrders() runs first before getDeliveredOrders() runs. It will have to do it one by one, e.g. one Pickup and then one Delivery.
Any help understanding what's going on would be appreciated. Thanks.
Ok, so if I got that right:
Pickup only: needs to run through the Pickup process, then it completes.
Delivery only: needs to run through the Delivery process, then it completes.
Pickup and Delivery: needs to run through Pickup first, then through Delivery.
On a very high level, almost pseudo-code, why does this process not work?
Observable<Item> performPickup(Item item);
Observable<Item> performDelivery(Item item);

Observable<Item> items = ...;
items
    .flatMap(item -> item.needsPickup() ? performPickup(item) : Observable.just(item))
    .flatMap(item -> item.needsDelivery() ? performDelivery(item) : Observable.just(item))
    .doOnNext(completedItem -> ...)
If you have different sources for the three types:
Observable<Item> items = Observable.merge(
pickupSource(),
deliverySource(),
pickupAndDeliverySource());
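The per-item sequencing described above (pickup always before delivery, each step skipped when not needed) can be sketched in plain Java without Rx; this is a stdlib stand-in for the flatMap chain, and `Item`, `needsPickup`, etc. are illustrative names:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

public class PickupDeliverySketch {
    static class Item {
        final String name;
        final boolean needsPickup;
        final boolean needsDelivery;
        final List<String> log = new ArrayList<>();
        Item(String name, boolean needsPickup, boolean needsDelivery) {
            this.name = name;
            this.needsPickup = needsPickup;
            this.needsDelivery = needsDelivery;
        }
    }

    // each "stage" transforms the item and records what it did,
    // like one flatMap step chained after another
    static Item performPickup(Item i)   { i.log.add("picked");    return i; }
    static Item performDelivery(Item i) { i.log.add("delivered"); return i; }

    static Item process(Item i) {
        Function<Item, Item> pickup   = it -> it.needsPickup   ? performPickup(it)   : it;
        Function<Item, Item> delivery = it -> it.needsDelivery ? performDelivery(it) : it;
        // composing the stages guarantees pickup runs before delivery per item
        return pickup.andThen(delivery).apply(i);
    }

    public static void main(String[] args) {
        Item both = process(new Item("both", true, true));
        System.out.println(both.log); // [picked, delivered]
    }
}
```

In Rx each stage would return an Observable and the composition would be the two `flatMap` calls, but the ordering argument is the same: sequencing per item, rather than zipping two independent streams.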

Games.RealtimeMultiplayer.getWaitingRoomIntent null pointer exception

I create a room and it gets successfully made. And my onRoomCreated method gets called...
@Override
public void onRoomCreated(int statusCode, Room room) {
    mRoomId = room.getRoomId();
    Intent i = Games.RealTimeMultiplayer.getWaitingRoomIntent(gApiClient, room, 2);
    startActivityForResult(i, RC_WAITING_ROOM);
}
Then in my onActivityResult...
Room r = data.getExtras().getParcelable(Multiplayer.EXTRA_ROOM);
ArrayList<String> invitees = new ArrayList<String>();
for (Participant p : r.getParticipants()) {
    invitees.add(p.getPlayer().getPlayerId()); // <--- NULL POINTER!
}
I get that null pointer. Why?
EDIT: The Android docs say this about the getPlayer() method:
"Returns the Player that this participant represents. Note that this may be null if the identity of the player is unknown. This occurs in automatching scenarios where some players are not permitted to see the real identity of others."
That is why I am getting null: my room is filled through auto-matching.
Now the question is: how can I create a turn-based game using only participant IDs, not player IDs?
Now that I see what you are asking more clearly (my fault, not yours), here is how I do it.
(For clarification, I use libGDX, so there may be some interface stuff you don't need, and I am still using GamesClient rather than the new API methods, but it is for all intents the same.)
First, the final call I look for to start my game is onRoomConnected:
@Override
public void onRoomConnected(int statusCode, Room room) {
    //dLog("onRoomConnected");
    mRoomCurrent = room;
    mParticipants = room.getParticipants();
    mMyID = room.getParticipantId(aHelper.getGamesClient().getCurrentPlayerId());
    //dLog("The id is " + mMyID);
    try {
        bWaitRoomDismissedFromCode = true;
        finishActivity(RC_WAITING_ROOM);
    } catch (Exception e) {
        //dLog("would have errored out in waiting room");
    }
    // tell the Game the room is connected
    if (statusCode == GamesClient.STATUS_OK) {
        theGameInterface.onRoomConnected(room.getParticipantIds(), mMyID, room.getCreationTimestamp());
    } else {
        leaveRoom();
    }
}
So now I have all the participant IDs. In my Game code (where I sent that list of IDs), I sort the list so that the player-order methodology is the same for all players. First I build my opponents.
private void buildOpponents() {
    // this creates a new opponent with a View on the Stage()
    // sort the participants the same for all players
    sortParticipantIDs();
    for (String s : mParticipantIds) {
        if (s.contains(mMyID) || mMyID.contains(s)) continue;
        newOpponentWindow ow = new newOpponentWindow(s, MyAssetManager.getMySkin(), getStage());
        Opponent o = new Opponent(this, s);
        mapOpponents.put(s, o);
        o.setWindow(ow);
        getStage().addActor(ow);
    }
    setOpponentWindowPositions();
}
Then, after some more setup, I start play. The first time through, I have chosen that whoever has the top ID gets the honor of starting (I find this randomizes play enough without needing another method, but you can have the top ID run some other method and send the result out to the other players). Note that this also checks over my opponents to determine the starting player in case someone leaves the room later in the game.
private boolean determineIfStartingBidder() {
    Collections.sort(mParticipantIds);
    // now look through the list:
    // if the id is mine, then return true
    // if the id is not mine, and the opponent is not Out of Game or Disconnected, then return false
    for (String s : mParticipantIds) {
        if (s.contains(mMyID) || mMyID.contains(s)) {
            return true;
        }
        if (mapOpponents.get(s).getCurrentState() == currentState.DISCONNECTED ||
            mapOpponents.get(s).getCurrentState() == currentState.OUTOFGAME ||
            mapOpponents.get(s).getCurrentState() == currentState.LOSTGAME) {
            continue;
        }
        return false;
    }
    return false;
}
Then, in your game logic, just go through your participant ID list in whatever manner makes sense to pass the baton around! This works well, since all the calls for passing messages require the participant ID, so it is there for easy grab-and-go.
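The key idea here, that every client sorts the shared participant IDs and so independently agrees on who starts, can be sketched with the standard library alone (the IDs and the state enum are illustrative, not the Play Games API):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;

public class StartingPlayerSketch {
    enum State { ACTIVE, DISCONNECTED, OUTOFGAME }

    // every client runs this with the same inputs and reaches the same answer:
    // the first still-active id in sorted order is the starter
    static boolean amIStarting(String myId, List<String> participantIds, Map<String, State> states) {
        List<String> sorted = new ArrayList<>(participantIds);
        Collections.sort(sorted); // identical order on every device
        for (String id : sorted) {
            if (id.equals(myId)) return true;             // I am the first eligible id
            if (states.get(id) != State.ACTIVE) continue; // skip players who dropped out
            return false;                                 // someone ahead of me starts
        }
        return false;
    }

    public static void main(String[] args) {
        Map<String, State> states = Map.of("p_a", State.ACTIVE, "p_b", State.ACTIVE);
        System.out.println(amIStarting("p_a", List.of("p_b", "p_a"), states)); // true
        System.out.println(amIStarting("p_b", List.of("p_b", "p_a"), states)); // false
    }
}
```

Because no messages are exchanged to elect the starter, this stays consistent even when a participant drops: each remaining client re-runs the same deterministic scan.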
Prior Answer Below ------------------------------------------------
try
data.getParcelableExtra(Multiplayer.EXTRA_ROOM);
There is no need for getExtras().
