kotlin, how to run sequential background threads [duplicate] - android

I have an instance of CoroutineScope and a log() function which look like the following:
private val scope = CoroutineScope(Dispatchers.IO)

fun log(message: String) = scope.launch { // launching a coroutine
    println("$message")
    TimeUnit.MILLISECONDS.sleep(100) // some blocking operation
}
And I use this test code to launch coroutines:
repeat(5) { item ->
    log("Log $item")
}
The log() function can be called from anywhere, on any thread, but not from a coroutine.
After a couple of tests I can see a non-sequential result like the following:
Log 0
Log 2
Log 4
Log 1
Log 3
The order of the printed logs can differ between runs. If I understand correctly, the execution of coroutines is not guaranteed to be sequential, which means the coroutine for item 2 can run before the coroutine for item 0.
I want the coroutines to be launched sequentially for each item and the "some blocking operation" to execute sequentially, so I always get the following logs:
Log 0
Log 1
Log 2
Log 3
Log 4
Is there a way to make launching coroutines sequential? Or maybe there are other ways to achieve what I want?
Thanks in advance for any help!

One possible strategy is to use a Channel to join the launched jobs in order. The jobs need to be launched lazily so they don't start until join is called on them. trySend is used because it can be called from outside a coroutine, and it always succeeds when the Channel has unlimited capacity.
private val lazyJobChannel = Channel<Job>(capacity = Channel.UNLIMITED).apply {
    scope.launch {
        consumeEach { it.join() }
    }
}

fun log(message: String) {
    lazyJobChannel.trySend(
        scope.launch(start = CoroutineStart.LAZY) {
            println("$message")
            TimeUnit.MILLISECONDS.sleep(100) // some blocking operation
        }
    )
}

Since Flows are sequential, we can use a MutableSharedFlow to collect and handle the data sequentially:
class Info {
    // Make sure replay (in case some items are emitted before sharedFlow is collected
    // and would otherwise be lost) and extraBufferCapacity are large enough to handle
    // all the items. If some items are lost, try increasing either of the values.
    private val sharedFlow = MutableSharedFlow<String>(replay = 10, extraBufferCapacity = 10)
    private val scope = CoroutineScope(Dispatchers.IO)

    init {
        sharedFlow.onEach { message ->
            println("$message")
            TimeUnit.MILLISECONDS.sleep(100) // some blocking or suspend operation
        }.launchIn(scope)
    }

    fun log(message: String) {
        sharedFlow.tryEmit(message)
    }
}

fun test() {
    val info = Info()
    repeat(10) { item ->
        info.log("Log $item")
    }
}
It always prints the logs in the correct order:
Log 0
Log 1
Log 2
...
Log 9
It works for all cases, but you need to make sure the replay and extraBufferCapacity parameters of MutableSharedFlow are large enough to handle all items.
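If you want to notice dropped items rather than losing them silently, one option (a minimal sketch, not part of the original answer) is to check the Boolean result of tryEmit:
fun log(message: String) {
    // tryEmit returns false when the value cannot be buffered (replay and
    // extraBufferCapacity are exhausted with the default BufferOverflow.SUSPEND
    // strategy), so at least make the loss visible instead of ignoring it.
    if (!sharedFlow.tryEmit(message)) {
        println("Dropped log message: $message")
    }
}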
Another approach is to use Dispatchers.IO.limitedParallelism(1) as the context of the CoroutineScope. It makes coroutines run sequentially as long as they don't call suspend functions and are launched from the same thread, e.g. the Main thread. So this solution works only with blocking (not suspend) operations inside the launch coroutine builder:
private val scope = CoroutineScope(Dispatchers.IO.limitedParallelism(1))

fun log(message: String) = scope.launch { // launched from the same thread, e.g. the Main thread
    println("$message")
    TimeUnit.MILLISECONDS.sleep(100) // only a blocking operation, not a `suspend` call
}
It turns out that a single-thread dispatcher is a FIFO executor, so limiting the CoroutineScope execution to one thread solves the problem.
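The same FIFO behaviour can also be obtained from a dedicated single-thread dispatcher. A minimal sketch (an alternative not shown in the original answer; the same caveat about suspend calls applies):
// A single-threaded executor runs submitted tasks in FIFO order, so coroutines
// launched onto it from one thread execute one after another.
private val singleThreadDispatcher = Executors.newSingleThreadExecutor().asCoroutineDispatcher()
private val scope = CoroutineScope(singleThreadDispatcher)

fun log(message: String) = scope.launch {
    println(message)
    TimeUnit.MILLISECONDS.sleep(100) // blocking work stays on the dedicated thread
}
// Call singleThreadDispatcher.close() when the scope is no longer needed.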

Related

Thread safe LiveData updates

I have the following code which has a race condition. I try to find an item in a list and set its loaded property. But if onLoaded("A") and onLoaded("B") are called from different threads, I always lose the data of the first call if it doesn't complete before the second one starts.
How can I make this work? Would using a Mutex be the correct approach?
val list = MutableLiveData<List<Model>>() // assume this is initialized with [Model(false, "A"), Model(false, "B")]

data class Model(
    val loaded: Boolean,
    val item: String,
)

fun onLoaded(item: String) = viewModelScope.launch {
    val currList = list.value ?: return@launch
    withContext(Dispatchers.Default) {
        val updated = currList.find { it.item == item }?.copy(loaded = true)
        val mutable = currList.toMutableList()
        updated?.let { new ->
            val index = mutable.indexOfFirst { it.item == item }
            mutable[index] = new
        }
        list.postValue(mutable.toList())
    }
}
onLoaded("A")
onLoaded("B")
expected: [Model(true, "A"), Model(true, "B")]
actual: [Model(false, "A"), Model(true, "B")]
In onLoaded() a new coroutine is launched using viewModelScope. viewModelScope uses the Dispatchers.Main.immediate context, so the code inside it is executed on the Main thread, i.e. execution is limited to a single thread. The reason you have a race condition is that calling onLoaded() consecutively doesn't guarantee the order in which the coroutines execute.
If you call onLoaded() consecutively from one thread, I suggest removing the viewModelScope.launch coroutine from it. Then the calling order will be preserved. Use list.postValue() in this case.
If you call onLoaded() from different threads and still want to launch a coroutine, you can refer to the answers to this question.
Try to use the @Synchronized annotation without launching a coroutine:
@Synchronized
fun onLoaded(item: String) { ... }
The method will be protected from concurrent execution by multiple threads by the monitor of the instance on which it is defined. Use list.postValue() in this case.
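A minimal sketch of that idea (the backing field models is hypothetical, it is not in the original code): the list is only read and written under the lock, and postValue() publishes immutable snapshots, which is safe from any thread.
// Hypothetical backing field guarded by the instance monitor.
private var models = listOf(Model(loaded = false, item = "A"), Model(loaded = false, item = "B"))

@Synchronized
fun onLoaded(item: String) {
    // Concurrent callers cannot interleave here, so no update is lost.
    models = models.map { if (it.item == item) it.copy(loaded = true) else it }
    list.postValue(models)
}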

Flow.collect blocking the main thread

I have the following code that seems to be blocking the main thread even though the flow is collected in an IO coroutine. I'm a Kotlin and Flow noob. What am I doing wrong here that's blocking the main thread?
Repository:
fun observeData(): Flow<Data> {
    return flow {
        // third party api is getting data from a ContentProvider
        ThirdPartyApi.getData().map { convertFromExternalModelToDataModel(it) }
            .collect {
                emit(it)
            }
    }
}
ViewModel:
fun updateUI() {
    scope.launch(Dispatchers.IO) {
        repository.observeData().collect { data ->
            withContext(Dispatchers.Main) {
                textView.text = data.name
            }
        }
    }
}
Upon running the above code I see logs from the Android Choreographer: "Skipped 200 frames! The application may be doing too much work on its main thread."
To collect values from a Kotlin Flow as they're emitted, use collect. Since collect is a suspending function, it needs to be executed within a coroutine. It takes a lambda as a parameter that is called on every new value, and the coroutine that calls collect suspends until the flow completes.
And you shouldn't be updating your UI inside a ViewModel.
In this case we collect the flow inside the activity's lifecycleScope, which runs on the main thread and is aware of the activity's lifecycle.
And to make the service or repository execute in a different CoroutineContext, use the intermediate operator flowOn.
flowOn changes the CoroutineContext of the upstream flow, meaning the producer and any intermediate operators applied before (above) flowOn.
The downstream flow (the intermediate operators after flowOn along with the consumer) is not affected and executes on the CoroutineContext used to collect from the flow.
ViewModel:
fun getData(): Flow<Data> = repository.observeData() // Executes on the IO dispatcher
    // flowOn affects the upstream flow ↑
    .flowOn(Dispatchers.IO)
    // the downstream flow ↓ is not affected
    .catch { exception -> // Executes in the consumer's context
        emit(Data())
    }
Activity:
override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    lifecycleScope.launch { // Consumer's context
        viewModel.getData().collect { data -> // Suspended
            textView.text = data.name // Collected in the consumer's context
        }
    }
}


How do you make a subscriber to a Kotlin SharedFlow run operations in parallel?

I have a connection to a Bluetooth device that emits data every 250 ms.
In my ViewModel I wish to subscribe to that data, run some suspending code (which takes approximately 1000 ms to run) and then present the result.
The following is a simple example of what I'm trying to do:
Repository:
class Repo() : CoroutineScope {
    private val supervisor = SupervisorJob()
    override val coroutineContext: CoroutineContext = supervisor + Dispatchers.Default

    private val _dataFlow = MutableSharedFlow<Int>()
    private var dataJob: Job? = null
    val dataFlow: Flow<Int> = _dataFlow

    init {
        launch {
            var counter = 0
            while (true) {
                counter++
                Log.d("Repo", "emitting $counter")
                _dataFlow.emit(counter)
                delay(250)
            }
        }
    }
}
The ViewModel:
class VM(app: Application) : AndroidViewModel(app) {
    private val _reading = MutableLiveData<String>()
    val latestReading: LiveData<String> = _reading

    init {
        viewModelScope.launch(Dispatchers.Main) {
            repo.dataFlow
                .map {
                    validateData(it) // this is where some validation happens, it is very fast
                }
                .flowOn(Dispatchers.Default)
                .onEach {
                    delay(1000) // this is to simulate the work that is done
                }
                .flowOn(Dispatchers.IO)
                .map {
                    transformData(it) // this will transform the data to be human readable
                }
                .flowOn(Dispatchers.Default)
                .collect {
                    _reading.postValue(it)
                }
        }
    }
}
As you can see, when data comes in, I first validate it to make sure it is not corrupt (on the Default dispatcher), then I perform some operation on it (saving and running a long algorithm on the IO dispatcher), then I transform it so the user of the application can understand it (switching back to the Default dispatcher), and finally I post it to the MutableLiveData so any subscriber from the UI layer can see the current data (on the Main dispatcher).
I have two questions:
a) If validateData fails, how can I cancel the current emission and move on to the next one?
b) Is there a way for the dataFlow subscriber in the ViewModel to spawn new threads so the delayed parts can run in parallel?
The timeline right now looks like the first diagram, but I want it to run like the second one.
Is there a way to do this?
I've tried using buffer(), which as the documentation states "Buffers flow emissions via channel of a specified capacity and runs collector in a separate coroutine.", but when I set it to BufferOverflow.SUSPEND I get the behaviour of the first diagram, and when I set it to BufferOverflow.DROP_OLDEST or BufferOverflow.DROP_LATEST I lose emissions.
I have also tried using .conflate() like so:
repo.dataFlow
    .conflate()
    .map { ....
And even though the emissions start one after the other, the part with the delay still waits for the previous one to finish before starting the next one.
When I use .flowOn(Dispatchers.Default) for that part, I lose emissions, and when I use .flowOn(Dispatchers.IO) or something like Executors.newFixedThreadPool(4).asCoroutineDispatcher(), they always wait for the previous one to finish before starting a new one.
Edit 2:
After about 3 hours of experiments, this seems to work:
viewModelScope.launch(Dispatchers.Default) {
    repo.dataFlow
        .map {
            validateData(it)
        }
        .flowOn(Dispatchers.Default)
        .map {
            async {
                delay(1000)
                it
            }
        }
        .flowOn(Dispatchers.IO) // NOTE (A)
        .map {
            val result = it.await()
            transformData(result)
        }
        .flowOn(Dispatchers.Default)
        .collect {
            _readings.postValue(it)
        }
}
However, I still haven't figured out how to cancel the emission if validateData fails.
And for some reason it only works if I use Dispatchers.IO, Executors.newFixedThreadPool(20).asCoroutineDispatcher() or Dispatchers.Unconfined where I put note (A). Dispatchers.Main does not seem to work (which I expected), but Dispatchers.Default also does not work and I don't know why.
First question: You cannot recover from an exception in the sense of continuing the collection of the flow. As per the docs, "Flow collection can complete with an exception when an emitter or code inside the operators throw an exception." Therefore, once an exception has been thrown, the collection is completed (exceptionally). You can, however, handle the exception by either wrapping the collection in a try/catch block or using the catch() operator.
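For example, a minimal sketch using the catch() operator (assuming the dataFlow and validateData from the question); note that catch() only handles exceptions thrown upstream of it, and the flow still completes once it fires:
viewModelScope.launch {
    repo.dataFlow
        .map { validateData(it) }                     // may throw
        .catch { e -> Log.e("VM", "flow failed", e) } // handles upstream exceptions
        .collect { /* handle validated values */ }
}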
Second question: You cannot. While the producer (emitting side) can be made concurrent by using the buffer() operator, collection is always sequential.
As per your diagram, you need fan-out (one producer, many consumers); you cannot achieve that with flows. Flows are cold: each time you collect from them, they start emitting from the beginning.
Fan-out can be achieved using channels, where you can have one coroutine producing values and many coroutines consuming those values.
Edit: Oh, you meant the validation failed, not the function itself; in that case you can use the filter() operator.
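A minimal sketch, assuming validateData is changed to return a Boolean instead of throwing; items that fail validation are simply skipped and the rest of the pipeline never sees them:
repo.dataFlow
    .filter { validateData(it) }  // drop invalid emissions, keep collecting
    .map { transformData(it) }
    .collect { _readings.postValue(it) }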
BroadcastChannel and ConflatedBroadcastChannel are being deprecated. SharedFlow cannot help you in your use case, as it emits values in a broadcast fashion, meaning the producer waits until all consumers consume each value before producing the next one. That is still sequential; you need parallelism. You can achieve it using the produce() channel builder.
A simple example:
val scope = CoroutineScope(Job() + Dispatchers.IO)

val producer: ReceiveChannel<Int> = scope.produce {
    var counter = 0
    val startTime = System.currentTimeMillis()
    while (isActive) {
        counter++
        send(counter)
        println("producer produced $counter at ${System.currentTimeMillis() - startTime} ms from the beginning")
        delay(250)
    }
}

val consumerOne = scope.launch {
    val startTime = System.currentTimeMillis()
    for (x in producer) {
        println("consumerOne consumed $x at ${System.currentTimeMillis() - startTime} ms from the beginning.")
        delay(1000)
    }
}

val consumerTwo = scope.launch {
    val startTime = System.currentTimeMillis()
    for (x in producer) {
        println("consumerTwo consumed $x at ${System.currentTimeMillis() - startTime} ms from the beginning.")
        delay(1000)
    }
}

val consumerThree = scope.launch {
    val startTime = System.currentTimeMillis()
    for (x in producer) {
        println("consumerThree consumed $x at ${System.currentTimeMillis() - startTime} ms from the beginning.")
        delay(1000)
    }
}
Observe production and consumption times.

doAsync Kotlin-android doesn't work well

I am using a callback function when the async work ends, but it doesn't work well :(
My case:
fun function1(callback: (obj1: List<ObjT1>, obj2: List<ObjT1>) -> Unit?) {
    doAsync {
        // long task
        uiThread { callback(result1, result2) }
    }
}
The callback is called, but result1 and result2 (lists) are empty. I checked the content of the lists beforehand.
EDIT:
PROBLEM: My callback is a function that receives two objects, result1 and result2. The problem is that the callback sometimes receives the results empty, even though I check their content beforehand and they are not empty.
It may be because you've declared return type as Unit? but are returning two values. A quick fix would be to put result1 and result2 in an array.
Now this question is about a deprecated Kotlin library.
I recommend using coroutines instead.
Consider using Kotlin's coroutines. Coroutines are a newer feature in Kotlin. They are still technically in their experimental phase, but JetBrains has told us that they are very stable.
Read more here: https://kotlinlang.org/docs/reference/coroutines.html
Here is some sample code:
fun main(args: Array<String>) = runBlocking { // runBlocking is only needed here because I am calling join below
    val job = launch(UI) { // The launch function allows you to execute suspended functions
        val async1 = doSomethingAsync(250)
        val async2 = doSomethingAsync(50)
        println(async1.await() + async2.await()) // The code within launch will
        // be paused here until both async1 and async2 have finished
    }
    job.join() // Just wait for the coroutines to finish before stopping the program
}

// Note: this is a suspended function (which means it can be "paused")
suspend fun doSomethingAsync(param1: Long) = async {
    delay(param1) // pause this "thread" (not really a thread)
    println(param1)
    return@async param1 * 2 // return twice the input... just for fun
}
