Improve ConcurrentLinkedQueue to Channel Queue in Kotlin - android

I am building an application in which the user can read BPM data on a mobile device. I read in some posts that I need a queue for this: it runs one command at a time and holds the next command until the first one finishes. I used some code from the library, and I want to understand why my existing queue is so slow. If there is anything more efficient than ConcurrentLinkedQueue, I'll definitely try it. I also read some articles saying that a Channel is a type of queue that behaves first in, first out, but to be honest I don't know whether it will work. Can you help me with this?
This is the setupQueuePolling function:
private fun setupQueuePolling() {
    viewModelScope.launch(Dispatchers.IO) {
        Log.e(TAG, "Starting Polling")
        while (true) {
            synchronized(commandQueue) {
                if (!commandQueue.isEmpty()) {
                    commandQueue.poll()?.let { qItem ->
                        qItem("This is input")
                    }
                }
            }
        }
    }
}
Items are added to the queue through this addItemToQueue function:
fun addItemToQueue(item: (input: String) -> Unit) {
    Log.e(TAG, "Added Item ->> $item")
    commandQueue.add(item)
}
I call addItemToQueue from MainActivity.kt in onConnectionStateChange, onServicesDiscovered and onCharacteristicChanged, using GlobalScope, as well as from startScan.
I don't understand why my queue is so slow to respond; the library itself responds very fast. My whole project is in here.
Thanks

At first glance, it's quite hard to say why it is slow. What I do see is that synchronized(commandQueue) is used even though ConcurrentLinkedQueue is already a thread-safe queue, so the synchronized(commandQueue) block can be omitted. Also, the while (true) loop keeps spinning even when the queue is empty, which burns CPU on the IO dispatcher.
Using the features of Kotlin coroutines, I would use a Flow in this case, particularly MutableSharedFlow. It is thread-safe and follows queue (FIFO) principles. In this case it would look like the following:
private val commandFlow = MutableSharedFlow<(input: String) -> Unit>()

suspend fun addItemToQueue(item: (input: String) -> Unit) {
    commandFlow.emit(item) // emit the item to commandFlow
}
private fun setupQueuePolling() {
    viewModelScope.launch {
        // handle commands emitted in the addItemToQueue() method
        commandFlow.collect { item ->
            item("This is input")
        }
    }
}
If this doesn't improve the speed, further investigation is needed; perhaps the BLE device itself executes commands slowly. Additional logs around each operation could help.
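Since the question title asks about Channel: a Channel (kotlinx.coroutines.channels.Channel) would also work as a FIFO queue here, and with an unlimited buffer it keeps items even before the consumer starts collecting. A minimal sketch, reusing the names from the question (this is not from the original answer):
private val commandChannel = Channel<(input: String) -> Unit>(Channel.UNLIMITED)

fun addItemToQueue(item: (input: String) -> Unit) {
    // trySend never suspends; with an UNLIMITED buffer it only fails if the channel is closed
    commandChannel.trySend(item)
}

private fun setupQueuePolling() {
    viewModelScope.launch {
        // items are received first in, first out, one at a time
        for (item in commandChannel) {
            item("This is input")
        }
    }
}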

Related

Kotlin Flow not collected anymore after working initially

Basically I want to make a network request when initiated by the user, collect the Flow returned by the repository and run some code depending on the result. My current setup looks like this:
Viewmodel
private val _requestResult = MutableSharedFlow<Result<Data>>()
val requestResult = _requestResult.filterNotNull().shareIn(
    scope = viewModelScope,
    started = SharingStarted.WhileViewSubscribed,
    replay = 0
)

fun makeRequest() {
    viewModelScope.launch {
        repository.makeRequest().collect { _requestResult.emit(it) }
    }
}
Fragment
buttonLayout.listener = object : BottomButtonLayout.Listener {
    override fun onButtonClick() {
        viewModel.makeRequest()
    }
}

lifecycleScope.launchWhenCreated {
    viewModel.requestResult.collect { result ->
        when (result) {
            Result.Loading -> {
                doStuff()
            }
            is Result.Success -> {
                doDifferentStuff(result.data)
            }
            is Result.Failure -> {
                handleError()
            }
        }
    }
}
The first time the request is made everything seems to work. But starting with the second time the collect block in the fragment does not run anymore. The request is still made, the repository returns the flow as expected, the collect block in the viewmodel runs and emit() also seems to be executed successfully.
So what could be the problem here? Something about the coroutine scopes? Admittedly I lack any sort of deeper understanding of the matter at hand.
Also, is there a more efficient way of accomplishing what I'm attempting with Kotlin Flows in general? Collecting a flow and then emitting its values into another flow seems a bit counterintuitive.
Thanks in advance :)
According to the documentation there are two recommended alternatives:
viewLifecycleOwner.lifecycleScope.launch {
    viewLifecycleOwner.repeatOnLifecycle(Lifecycle.State.STARTED) {
        // your thing
    }
}
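Applied to the collect block from the question, that would look roughly like this (a sketch, assuming the same requestResult flow and Result handling as above):
viewLifecycleOwner.lifecycleScope.launch {
    viewLifecycleOwner.repeatOnLifecycle(Lifecycle.State.STARTED) {
        // collection is restarted each time the view reaches STARTED
        viewModel.requestResult.collect { result ->
            // handle Result.Loading / Result.Success / Result.Failure as before
        }
    }
}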
I prefer the other alternative:
viewLifecycleOwner.lifecycleScope.launch {
    viewModel.requestResult
        .flowWithLifecycle(viewLifecycleOwner.lifecycle, Lifecycle.State.STARTED)
        .collect {
            // Process the value.
        }
}
I like flowWithLifecycle for its shorter syntax and less boilerplate. Be careful though: collect suspends indefinitely, so you can't put anything after it in the same coroutine.
The official docs:
https://developer.android.com/topic/libraries/architecture/coroutines
Please be aware that you need the lifecycle-aware components library.

Coroutine Thread Safety with Retrofit

I still have a little trouble putting together all the information about the thread-safety of launching network requests from coroutines.
Let's say we have the following use case: there is a list of users, and for each of those users I have to run a specific check that requires a network request to the API, which gives me back some information about that user.
The userCheck happens inside a library which doesn't expose suspend functions but still uses callbacks.
Inside this library, I have seen code like this for launching each of the network requests:
internal suspend fun <T> doNetworkRequest(request: suspend () -> Response<T>): NetworkResult<T> {
    return withContext(Dispatchers.IO) {
        try {
            val response = request.invoke()
            ...
According to the documentation, Dispatchers.IO can use multiple threads to execute the code; also, the request function is simply a function from a Retrofit API.
So what I did is launch the request for each user and use a single resultHandler object, which adds each result to a list and checks whether the size of the result list equals the size of the user list. If so, all userChecks are done and I know I can do something with the results, which need to be returned all together.
val userList: List<String>? = getUsers()
val userCheckResultList = mutableListOf<UserCheckResult>()

val handler = object : UserCheckResultHandler {
    override fun onResult(userCheckResult: UserCheckResult?) {
        userCheckResult?.let {
            userCheckResultList.add(it)
        }
        if (userCheckResultList.size == userList?.size) {
            doSomethingWithResultList()
            print("SUCCESS")
        }
    }
}

userList?.forEach {
    checkUser(it, handler)
}
My question is: Is this implementation thread-safe? As far as I know, Kotlin objects should be thread safe, but I have gotten feedback that this is possibly not the best implementation :D
But in theory, even if the requests are launched asynchronously and several run at the same time, only one at a time can hold the lock of the thread the result handler runs on, so there should be no race condition or problems with adding items to the list and comparing the sizes.
Am I wrong about this?
Is there any way to handle this scenario in a better way?
If you are executing multiple requests in parallel, it's not: MutableList is not thread-safe. But there's a simple fix: guard every access to the list with a lock. Since onResult is a regular (non-suspend) callback, a plain synchronized block on a shared lock object works here (a coroutine Mutex.withLock would require a suspend context), like this:
val lock = Any()
val userList: List<String>? = getUsers()
val userCheckResultList = mutableListOf<UserCheckResult>()

val handler = object : UserCheckResultHandler {
    override fun onResult(userCheckResult: UserCheckResult?) {
        synchronized(lock) {
            userCheckResult?.let {
                userCheckResultList.add(it)
            }
            if (userCheckResultList.size == userList?.size) {
                doSomethingWithResultList()
                print("SUCCESS")
            }
        }
    }
}

userList?.forEach {
    checkUser(it, handler)
}
I have to add that this whole solution seems very hacky. I would go a completely different route: run each request wrapped in async { /* network request */ }, which returns a Deferred, add each Deferred to a list, and then wait for all of them with awaitAll(). Like this:
val jobs = mutableListOf<Deferred<UserCheckResult>>()
userList?.forEach {
    // I assume checkUser is a suspend function here that returns the result directly
    jobs += async { checkUser(it) }
}
// wait for all requests and collect the results
val results = jobs.awaitAll()
// After that you can access each result like this:
val resultOfJob0 = results[0]
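A slightly more compact variant of the same idea (a sketch, assuming this runs inside a coroutine and that checkUser is rewritten as a suspend function returning a UserCheckResult):
// coroutineScope cancels the remaining requests if any of them throws
val results: List<UserCheckResult> = coroutineScope {
    userList.orEmpty()
        .map { user -> async { checkUser(user) } } // start all requests in parallel
        .awaitAll()                                // suspend until every request has finished
}
// results now holds one UserCheckResult per user, in the original order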

How do I pause execution in Kotlin whilst APIs complete

I am trying to write a simple app in Android Studio using Kotlin. It is a very steep learning curve for me, but I am almost there. My final problem is getting the app to wait for the API calls to complete before moving to the next Intent.
I have three calls, each uploading data via my API. They are triggered from a button, and only when all three uploads are done should the button send the user to the next intent/screen.
My API calls are working and I can see the data in the database. However, since enqueue is asynchronous, the calls fire and the code moves on to start the next intent before the data is present.
The code below is executed three times (once for each upload). I realise this is probably not the best way to do it, but I'm trying to get it working before I finesse the code.
I thought that perhaps I could have a variable, UploadedReadCount, that I increment in onResponse, but this doesn't seem to be working properly.
Could someone offer some advice on how I should pause the code until the APIs complete? For example, is there an enqueue method that isn't async?
ReadInterface.create().AddRead("new", rFuel, rRegister, rReadDate, rRead)
    .enqueue(object : Callback<UploadedRead> {
        override fun onFailure(call: Call<UploadedRead>, t: Throwable) {
            Log.d("Err: ", t.localizedMessage!!)
            t.printStackTrace()
        }

        override fun onResponse(call: Call<UploadedRead>, response: Response<UploadedRead>) {
            Log.d("Response: ", response.body().toString())
            val p = response.body()?.APIResult!![0]
            msgShow("Gas read " + rRead.toString() + " uploaded")
            UploadedReadCount += 1
        }
    })

while (UploadedReadCount < 3) {
    Log.d("Waiting ", UploadedReadCount.toString() + " reads uploaded...")
}

val intent = Intent(this, Billing::class.java).apply {
    putExtra("ReadDate", txtReadDate.text.toString())
}
startActivity(intent)
In most cases you don't want to pause execution while an API call returns. Instead you want to follow the reactive model: when you call the API you specify callbacks (onResponse, onFailure), and once these callbacks are invoked, you react.
the code is moving on to start the next intent before the data is present.
Move all the code that depends on the data received from the API into the onResponse or onFailure callbacks. When the API comes back with a response, one of those callbacks is invoked, and depending on the data you receive you can continue your work.
is there an enqueue method that isn't async?
There are options for calling an API in a blocking manner, but I don't think that is a good idea. Instead of a blocking API call, stick with reactive programming: only continue once one of the callbacks (onResponse, onFailure) has been called.
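A sketch of that callback-driven approach, reusing the UploadedReadCount idea from the question (Retrofit delivers enqueue callbacks on the main thread, so a plain counter is fine here; the activity class name is a placeholder):
override fun onResponse(call: Call<UploadedRead>, response: Response<UploadedRead>) {
    Log.d("Response: ", response.body().toString())
    UploadedReadCount += 1
    // Navigate only once all three uploads have reported back
    if (UploadedReadCount == 3) {
        val intent = Intent(this@YourActivity, Billing::class.java).apply {
            putExtra("ReadDate", txtReadDate.text.toString())
        }
        startActivity(intent)
    }
}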
There is an alternative to enqueue that is suspending instead of async, so you can call your code sequentially in a coroutine without blocking the main thread. The function is await(), and it returns the successful result or throws an HttpException on failure.
But to run three requests in parallel, you need the async coroutine builder. This can be done by mapping a list of Calls to async blocks that await the individual results, and then using awaitAll() on the list of Deferreds to wait for all three. So it's more complicated than just running sequential code in a coroutine, but I think it's still easier than trying to run and wait for three parallel calls using callbacks.
I'm not exactly sure what your other two calls are, so I'll just make some up and assume this function already has all the data it needs to make the calls. I also don't know how you want to handle failure, so I'm just making it stop early if any of the three calls fails.
lifecycleScope.launch {
    val requests: List<Call<UploadedRead>> = listOf(
        ReadInterface.create().AddRead("new", rFuel, rRegister, rReadDate, rRead),
        ReadInterface.create().AddRead("new2", rFuel, rRegister, rReadDate, rRead),
        ReadInterface.create().AddRead("new3", rFuel, rRegister, rReadDate, rRead)
    )
    val responses: List<UploadedRead> = try {
        coroutineScope { // any failure in this block cancels them all
            requests.map { async { it.await() } } // run them simultaneously with async
                .awaitAll()
        }
    } catch (e: HttpException) {
        Log.d("Err: ", e.localizedMessage.toString())
        e.printStackTrace()
        return@launch
    }
    // Do something with the list of three UploadedReads here.
}
I just duplicated the functionality of your code above, but it doesn't look like you're using the response for anything, and you have an unused variable p.
Edit: If this is a pattern you use frequently, this helper function might be useful. I didn't check it thoroughly or test it.
/**
 * Await the results of all the Calls in parallel. Any exception thrown by any item
 * in the list will cancel all unfinished calls and be rethrown.
 */
suspend fun <T : Any> Iterable<Call<T>>.awaitAll(): List<T> =
    coroutineScope { map { async { it.await() } }.awaitAll() }
//...

lifecycleScope.launch {
    val requests: List<Call<UploadedRead>> = listOf(
        //...
    )
    val responses: List<UploadedRead> = try {
        requests.awaitAll()
    } catch (e: HttpException) {
        //...
        return@launch
    }
    //...
}

Using Coroutines in Intent service

I have a problem with my application freezing.
So:
I have an IntentService (meaning everything already runs on another thread).
I have a list of users.
I should download photos for each user and push them to another cloud service (with face recognition).
We are currently using a trial version of this service, so it can only send 10 requests per minute. I want sequential execution of the program (a simple version).
But the application freezes while it is downloading a user's photos. Only after that finishes does the application start working again. I hope my explanation is clear.
Here is the simplified code (I'm using coroutines for it):
private val ioScope = CoroutineScope(Dispatchers.IO + Job())
The onCreate method of the IntentService:
override fun onCreate() {
    super.onCreate()
    ioScope.launch {
        Repository.getUserList().observeOnce(Observer { users ->
            users?.forEach { user ->
                addedNewUser(user)
            }
        })
    }
}
addedNewUser
private suspend fun addedNewUser(user: User) = withTimeoutOrNull(TWO_MINUTES) {
    user.mail ?: return@withTimeoutOrNull
    launch {
        try {
            // withContext is the freeze place
            val file = withContext(ioScope.coroutineContext) { getUserAvatar(applicationContext, user.mail) }
            // do something.......
            file.delete()
        } catch (e: ClientException) { // in the free pricing mode the Face API allows only 20 requests per minute
            delay(ONE_MINUTE)
        }
    }.join()
}
Do you have any ideas? Why does withContext(Dispatchers.Default) freeze?
I have also tried withContext(this.coroutineContext), but then it doesn't wait until the file has been downloaded.
Thanks for your time!
UPDATE (answer)
Thanks to everybody who tried to help! I think we found the problem: Repository.getUserList() returns a LiveData, so when a suspend or runBlocking function is started inside the observer, I see the freeze. If I wrap it in a coroutine or a new thread, it works correctly:
Repository.getUserList().observeOnce(Observer { users ->
    ioScope.launch {
        users?.forEach { user ->
            addedNewUser(user)
        }
    }.start()
})
Unfortunately I don't know all the details of how it works under the hood, but it seems that the LiveData observer is invoked on the main thread, so the blocking/suspending work was effectively stopping the main thread.

Kotlin Coroutine to escape callback hell

I'm trying to use Kotlin's coroutines to avoid callback hell, but it doesn't look like I can in this specific situation, and I would like some thoughts about it.
I have a SyncService class which calls a series of different methods to send data to the server, like the following:
SyncService calls Sync Student, which calls Student Repository, which calls the DataSource that makes a server request, sending the data through Apollo's GraphQL client.
The same pattern is followed in each of my features:
SyncService -> Sync Feature -> Feature Repository -> DataSource
So every one of the methods that I call has this signature:
fun save(onSuccess: () -> Unit, onError: () -> Unit) {
    // Do stuff here
}
The problem is:
When I sync and successfully save the Student on the server, I need to sync his enrollment, and if I successfully save the enrollment, I need to sync another object, and so on.
Everything depends on the previous step and I need to do it sequentially; that's why I was using callbacks.
But as you can imagine, the resulting code is not very friendly, so my team and I started searching for alternatives to make it better. We ended up with this extension function:
suspend fun <T> ApolloCall<T>.execute() = suspendCoroutine<Response<T>> { cont ->
    enqueue(object : ApolloCall.Callback<T>() {
        override fun onResponse(response: Response<T>) {
            cont.resume(response)
        }

        override fun onFailure(e: ApolloException) {
            cont.resumeWithException(e)
        }
    })
}
But the function in the DataSource still takes onSuccess() and onError() callbacks that need to be passed in by whoever calls it:
fun saveStudents(
    students: List<StudentInput>,
    onSuccess: () -> Unit,
    onError: (errorMessage: String) -> Unit
) {
    runBlocking {
        try {
            val response = GraphQLClient.apolloInstance
                .mutate(
                    CreateStudentsMutation
                        .builder()
                        .students(students)
                        .build()
                )
                .execute()
            if (!response.hasErrors())
                onSuccess()
            else
                onError("Response has errors!")
        } catch (e: ApolloException) {
            e.printStackTrace()
            onError("Server error occurred!")
        }
    }
}
The SyncService class code changed to look like this:
private fun runSync(onComplete: () -> Unit) = async(CommonPool) {
    val syncStudentProcess = async(coroutineContext, start = CoroutineStart.LAZY) {
        syncStudents()
    }
    val syncEnrollmentProcess = async(coroutineContext, start = CoroutineStart.LAZY) {
        syncEnrollments()
    }

    syncStudentProcess.await()
    syncEnrollmentProcess.await()

    onComplete()
}
It does execute sequentially, but I need a way to stop every other coroutine if any of them fails, with errors that can only come from Apollo's client.
I've been trying hard to find a way to simplify this code but haven't gotten any good result. I don't even know whether this chain of callbacks can be simplified at all; that's why I came here to get some thoughts on it.
TLDR: I want a way to execute all of my functions sequentially and still be able to stop all coroutines if any of them throws an exception, without a lot of chained callbacks.
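A minimal sketch of what the TLDR asks for, assuming the callback-based DataSource functions are rewritten as suspend functions that throw on failure (for example by returning the result of the ApolloCall<T>.execute() extension above instead of taking onSuccess/onError callbacks):
// Structured concurrency: an exception thrown by any step cancels coroutineScope
// (including any child coroutines started in it) and propagates to the caller of runSync().
private suspend fun runSync() = coroutineScope {
    syncStudents()     // suspends; throws ApolloException on failure
    syncEnrollments()  // only starts after syncStudents() succeeded
    // further sync steps here, in order
}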
