I have a project written in Kotlin that uses RecognitionListener. The speech-to-text function always worked and never presented any problems.
Since last week, its onResults function has started being called twice. No changes were made to the project. I tested old versions of the project (from months ago) and they had the same problem.
There are three different cases:
Short text (1 to 8 words) and SpeechRecognizer stopped automatically -> onResults() called twice;
Long text (9 words or more) and SpeechRecognizer stopped automatically -> normal behavior (onResults() called once);
Any text size and the SpeechRecognizer stopListening() function called manually (from code) -> normal behavior.
Here is the VoiceRecognition speech-to-text class code:
class VoiceRecognition(private val activity: Activity, language: String = "pt_BR") : RecognitionListener {

    private val AudioLogTag = "AudioInput"

    var voiceRecognitionIntentHandler: VoiceRecognitionIntentHandler? = null
    var voiceRecognitionOnResultListener: VoiceRecognitionOnResultListener? = null // Must have this
    var voiceRecognitionLayoutChanger: VoiceRecognitionLayoutChanger? = null

    var isListening = false

    private val intent: Intent
    private var speech: SpeechRecognizer = SpeechRecognizer.createSpeechRecognizer(activity)

    init {
        speech.setRecognitionListener(this)
        intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH)
        intent.putExtra(
            RecognizerIntent.EXTRA_LANGUAGE_MODEL,
            RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
        )
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, language)
    }

    // It is important to call this function inside a clickListener
    fun listen(): Boolean {
        if (ContextCompat.checkSelfPermission(activity, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(activity, arrayOf(Manifest.permission.RECORD_AUDIO), 1)
            return false
        }
        speech.startListening(intent)
        Log.i(AudioLogTag, "startListening")
        return true
    }

    // Use this if you want to stop listening but still get recognition results
    fun endListening() {
        Log.i(AudioLogTag, "stopListening")
        speech.stopListening()
        isListening = false
    }

    fun cancelListening() {
        Log.i(AudioLogTag, "cancelListening")
        speech.cancel()
        voiceRecognitionLayoutChanger?.endListeningChangeLayout()
        isListening = false
    }

    override fun onReadyForSpeech(p0: Bundle?) {
        Log.i(AudioLogTag, "onReadyForSpeech")
        voiceRecognitionLayoutChanger?.startListeningChangeLayout()
        isListening = true
    }

    override fun onRmsChanged(p0: Float) {
        // Log.i(AudioLogTag, "onRmsChanged: $p0")
        // progressBar.setProgress((Int) p0)
    }

    override fun onBufferReceived(p0: ByteArray?) {
        Log.i(AudioLogTag, "onBufferReceived: $p0")
    }

    override fun onPartialResults(p0: Bundle?) {
        Log.i(AudioLogTag, "onPartialResults")
    }

    override fun onEvent(p0: Int, p1: Bundle?) {
        Log.i(AudioLogTag, "onEvent")
    }

    override fun onBeginningOfSpeech() {
        Log.i(AudioLogTag, "onBeginningOfSpeech")
    }

    override fun onEndOfSpeech() {
        Log.i(AudioLogTag, "onEndOfSpeech")
        voiceRecognitionLayoutChanger?.endListeningChangeLayout()
        isListening = false
    }

    override fun onError(p0: Int) {
        speech.cancel()
        val errorMessage = getErrorText(p0)
        Log.d(AudioLogTag, "FAILED: $errorMessage")
        voiceRecognitionLayoutChanger?.endListeningChangeLayout()
        isListening = false
    }

    override fun onResults(p0: Bundle?) {
        val results: ArrayList<String> = p0?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION) as ArrayList<String>
        Log.i(AudioLogTag, "onResults -> ${results.size}")
        val voiceIntent: Int? = voiceRecognitionIntentHandler?.getIntent(results[0])
        if (voiceIntent != null && voiceIntent != 0) {
            voiceRecognitionIntentHandler?.handle(voiceIntent)
            return
        }
        voiceRecognitionOnResultListener!!.onResult(results[0])
    }

    private fun getErrorText(errorCode: Int): String {
        val message: String
        when (errorCode) {
            SpeechRecognizer.ERROR_AUDIO -> message = "Audio recording error"
            SpeechRecognizer.ERROR_CLIENT -> message = "Client side error"
            SpeechRecognizer.ERROR_INSUFFICIENT_PERMISSIONS -> message = "Insufficient permissions"
            SpeechRecognizer.ERROR_NETWORK -> message = "Network error"
            SpeechRecognizer.ERROR_NETWORK_TIMEOUT -> message = "Network timeout"
            SpeechRecognizer.ERROR_NO_MATCH -> message = "No match"
            SpeechRecognizer.ERROR_RECOGNIZER_BUSY -> message = "RecognitionService busy"
            SpeechRecognizer.ERROR_SERVER -> message = "Error from server"
            SpeechRecognizer.ERROR_SPEECH_TIMEOUT -> message = "No speech input"
            else -> message = "Didn't understand, please try again."
        }
        return message
    }

    // Use it in your overridden onPause function.
    fun onPause() {
        voiceRecognitionLayoutChanger?.endListeningChangeLayout()
        isListening = false
        speech.cancel()
        Log.i(AudioLogTag, "pause")
    }

    // Use it in your overridden onDestroy function.
    fun onDestroy() {
        speech.destroy()
    }
}
listen(), endListening() and cancelListening() are all called from a button.
I found this open issue: https://issuetracker.google.com/issues/152628934
As I commented there, I assume it is an issue with the speech recognition service itself and not with the Android RecognitionListener interface.
This is my temporary workaround:

boolean singleResult = true;

@Override
public void onResults(Bundle results) {
    Log.d(TAG, "onResults"); //$NON-NLS-1$
    if (singleResult) {
        ArrayList<String> matches = results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        if (matches != null && matches.size() > 0) {
            Log.d("single Result", "" + matches.get(0));
        }
        singleResult = false;
    }
    // Re-arm the flag shortly afterwards so the next recognition still works.
    getHandler().postDelayed(new Runnable() {
        @Override
        public void run() {
            singleResult = true;
        }
    }, 100);
}
I had the same problem and just added a boolean flag in my code, but of course it's a temporary solution and I don't know the source of this problem.
val recognizer = SpeechRecognizer.createSpeechRecognizer(context)
recognizer.setRecognitionListener(
    object : RecognitionListener {
        var singleResult = true

        override fun onResults(results: Bundle?) {
            if (singleResult) {
                results?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)?.let {
                    // do something with the result
                }
                // the next (duplicate) result will be ignored
                singleResult = false
            }
        }

        // ... the remaining RecognitionListener callbacks ...
    }
)
This just started happening in one of my apps yesterday. I added a boolean to allow the code to execute only once, but I'd love an explanation as to why it suddenly started doing this. Any updates?
I use the following code based on time differences, which should continue to work if Google ever gets around to fixing this bug.

long mStartTime = System.currentTimeMillis(); // Global var

@Override
public void onResults(Bundle results)
{
    long difference = System.currentTimeMillis() - mStartTime;
    if (difference < 100)
    {
        return; // ignore the duplicate callback
    }
    mStartTime = System.currentTimeMillis();
    ArrayList<String> textMatchList =
            results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
    Event_Handler(VOICE_DATA, textMatchList.get(0));
    // process event
}
I faced the same issue in my app and fixed it with custom logic: a flag variable, say temp, that defaults to false.
Set temp to true wherever you start listening for voice input.
Then, in your handler, wrap the result handling in an if condition based on that flag:

if (temp) {
    // do something
    temp = false
}

Your handler will still be called twice as usual, but you will only handle the data once.
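The flag-based and time-difference answers above share one idea: accept only the first onResults() delivery inside a short window. A framework-free sketch of that guard follows; the class name SingleResultGuard, the windowMs parameter, and the injectable clock are illustrative, not part of any SDK.

```kotlin
// Accepts the first result and rejects duplicates that arrive within windowMs.
// The clock is injectable so the guard can be unit tested without sleeping.
class SingleResultGuard(
    private val windowMs: Long = 100,
    private val clock: () -> Long = System::currentTimeMillis
) {
    private var lastDelivery: Long? = null

    // True for the first result; false for any duplicate inside the window.
    fun shouldHandle(): Boolean {
        val now = clock()
        val last = lastDelivery
        if (last != null && now - last < windowMs) return false
        lastDelivery = now
        return true
    }
}
```

Inside onResults() the first line would then be `if (!guard.shouldHandle()) return`, which keeps the workaround in one place instead of scattering flags and postDelayed calls through the listener.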
I want to call an API multiple times using WorkManager, where idsArrayList is a list of ids.
I send each id in the API as a Path to get a response, and likewise for the other ids.
I want the WorkManager to return success after it has called the API for all ids.
But the problem is that WorkManager only returns SUCCESS for one id from the list. This is the first time I'm using WorkManager. I also tried starting a WorkManager for every id, iterating over idsList one by one and creating a WorkManager instance for each id in the for loop. But I thought sending the idsList as data to the WorkManager and then iterating over the ids inside doWork() would be better; it's not working the way I want, and I don't understand why. Here's my code:
class MyWorkManager(appContext: Context, workerParams: WorkerParameters) :
    Worker(appContext, workerParams) {

    private lateinit var callGrabShifts: Call<ConfirmStatus>

    override fun doWork(): Result {
        val idsList = inputData.getStringArray("IDS_LIST")
        val idsArrayList = idsList?.toCollection(ArrayList())
        var response = ""
        if (idsArrayList != null) {
            try {
                response = callConfirmShiftApi(idsArrayList)
                if (response.contains("CONFIRM")) {
                    return Result.success()
                }
            } catch (e: Exception) {
                e.printStackTrace()
                return Result.failure()
            }
        }
        return Result.retry()
    }

    private fun callConfirmShiftApi(idsArrayList: ArrayList<String>): String {
        var response = ""
        for ((index, id) in idsArrayList.withIndex()) {
            response = callApiForId(id)
            if (index == idsArrayList.lastIndex) {
                response = "CONFIRM"
            }
        }
        return response
    }

    private fun callApiForId(id: String): String {
        var shiftGrabStatus = ""
        callGrabShifts = BaseApp.apiInterface.confirmGrabAllShifts(BaseApp.userId, id)
        callGrabShifts.enqueue(object : Callback<ConfirmStatus> {
            override fun onResponse(call: Call<ConfirmStatus>, response: Response<ConfirmStatus>) {
                if (response.body() != null) {
                    shiftGrabStatus = response.body()!!.status
                    if (shiftGrabStatus != null) {
                        if (shiftGrabStatus.contains("CONFIRM")) {
                            val shiftNumber = ++BaseApp.noOfShiftsGrabbed
                            sendNotification(applicationContext)
                            shiftGrabStatus = "CONFIRM"
                            return
                        } else {
                            shiftGrabStatus = "NOT CONFIRM"
                            return
                        }
                    } else {
                        shiftGrabStatus = "NULL"
                        return
                    }
                } else {
                    shiftGrabStatus = "NULL"
                    return
                }
            }

            override fun onFailure(call: Call<ConfirmStatus>, t: Throwable) {
                shiftGrabStatus = "FAILURE"
                return
            }
        })
        return shiftGrabStatus
    }
}
And this is the code where I'm starting the WorkManager:
private fun confirmShiftApi(availableShiftsIdList: ArrayList<String>) {
    val data = Data.Builder()
    data.putStringArray("IDS_LIST", availableShiftsIdList.toArray(arrayOfNulls<String>(availableShiftsIdList.size)))
    val oneTimeWorkRequest = OneTimeWorkRequestBuilder<MyWorkManager>().setInputData(data.build())
        .build()
    WorkManager.getInstance(applicationContext).enqueue(oneTimeWorkRequest)
    WorkManager.getInstance(this).getWorkInfoByIdLiveData(oneTimeWorkRequest.id)
        .observe(this, Observer { workInfo: WorkInfo? ->
            if (workInfo != null && workInfo.state.isFinished) {
                val progress = workInfo.progress
            }
            Log.d("TESTING", "(MainActivity) : observing work manager - workInfo?.state - ${workInfo?.state}")
        })
}
Any suggestions on what I might be doing wrong, or any alternative way to perform the same task? I chose WorkManager mainly so the task keeps running even when the app is closed, and for learning purposes, as I hadn't used WorkManager before. But I would switch to other options if this doesn't work.
I tried the following things:
- Removed the 'var response' line in every method I was using to set the response; I had added it temporarily just for debugging, but it was causing an issue.
- Removed the check for "CONFIRM" in the doWork() method and just made the API calls, removing the extra return lines.
- Added a manual delay between the API calls for each id.
- Removed the code that sends the ids from my activity before starting the WorkManager, fetched those ids inside the WorkManager instead, and added more delay between the calls running in the background, so one round (calling the API for all the ids fetched earlier) completes before the API is called again to check for more ids.
- Removed the extra API calls from onRestart() and from the other conditions that required calling the API again.
- Tested only one round of API calls for all ids with a delay, removing the repeat logic just to test first. Didn't work.
None of the above worked; it just removed extra lines of code.
This is my final, tested code, and it cleared up my doubt. It didn't fix the issue, though, because the problem was on the backend server: when calls are made repeatedly in a for loop for each id, the APIs return failure in the onResponse callback for most ids. Only the first id, and sometimes (with a delay) the last id, returned the CONFIRM status message through WorkManager. Adding a delay didn't make much difference.
Here's my WorkManager code:
class MyWorkManager(appContext: Context, workerParams: WorkerParameters) :
    Worker(appContext, workerParams) {

    private lateinit var callGrabShifts: Call<ConfirmStatus>
    private var response = ""          // this version reuses fields instead of locals
    private var shiftGrabStatus = ""

    override fun doWork(): Result {
        val idsList = inputData.getStringArray("IDS_LIST")
        val idsArrayList = idsList?.toCollection(ArrayList())
        if (idsArrayList != null) {
            try {
                response = callConfirmShiftApi(idsArrayList)
                if (response.contains("CONFIRM")) {
                    return Result.success()
                }
            } catch (e: Exception) {
                e.printStackTrace()
                return Result.failure()
            }
        }
        return Result.success()
    }

    private fun callConfirmShiftApi(idsArrayList: ArrayList<String>): String {
        for ((index, id) in idsArrayList.withIndex()) {
            response = callApiForId(id)
            Thread.sleep(800)
            if (index == idsArrayList.lastIndex) {
                response = "CONFIRM"
            }
        }
        return response
    }

    private fun callApiForId(id: String): String {
        callGrabShifts = BaseApp.apiInterface.confirmGrabAllShifts(BaseApp.userId, id)
        callGrabShifts.enqueue(object : Callback<ConfirmStatus> {
            override fun onResponse(call: Call<ConfirmStatus>, response: Response<ConfirmStatus>) {
                if (response.body() != null) {
                    shiftGrabStatus = response.body()!!.status
                    if (shiftGrabStatus != null) {
                        if (shiftGrabStatus.contains("CONFIRM")) {
                            return
                        } else {
                            return
                        }
                    } else {
                        return
                    }
                } else {
                    return
                }
            }

            override fun onFailure(call: Call<ConfirmStatus>, t: Throwable) {
                return
            }
        })
        return shiftGrabStatus
    }
}
Eventually this problem (an individual call for an id always returns success, but calling the API for every id in a loop only returns success for the first call and failure for the others) was worked around using a Service. That didn't have a complete success rate from the APIs either, but the API returned success for 6 of 11 ids (with a 400 ms delay between each call), so it served the purpose for now.
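One likely contributor to the behavior above is that Retrofit's enqueue() is asynchronous: callApiForId() returns before onResponse() ever runs, so the loop never actually waits for an answer. Inside a Worker's doWork() you are already on a background thread, so a blocking call per id would be acceptable. Below is a framework-free sketch of that aggregation logic, with the blocking network call abstracted as a lambda; callBlocking is a stand-in assumption for something like Retrofit's synchronous call.execute(), and the "CONFIRM" strings follow the code above.

```kotlin
// Calls a blocking API once per id and reports an overall status,
// instead of firing asynchronous calls and returning immediately.
// `callBlocking` is a stand-in for a synchronous network call.
fun confirmAllShifts(ids: List<String>, callBlocking: (String) -> String): String {
    val failedIds = ids.filterNot { id -> callBlocking(id).contains("CONFIRM") }
    return if (failedIds.isEmpty()) "CONFIRM" else "NOT CONFIRM (${failedIds.size} failed)"
}
```

With this shape, doWork() can return Result.success() or Result.retry() based on the aggregated value, because the function only returns after every call has actually completed.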
I want to detect how long a specific work has already been in the ENQUEUED state. I need this information in order to inform the user about its state (e.g., when the work has been enqueued for longer than 10 seconds -> cancel the work -> inform the user that he needs to do X in order to achieve Y). Something like this:
Pseudo code

workInfo.observe(viewLifecycleOwner) {
    when (it.state) {
        WorkInfo.State.ENQUEUED -> if (state.enqueue.time > 10) cancelWork()
    }
}
I didn't find anything about this anywhere. Is this possible?
I appreciate any help.
I have managed to create a somewhat robust "WorkManager watcher". My intention was the following: when the work is not finished within 7 seconds, tell the user that an error occurred. The work itself is never cancelled; furthermore, my function does not even interact with the WorkManager itself. This works in 99% of all cases:
WorkerHelper

object WorkerHelper {

    private var timeStamp by Delegates.notNull<Long>()
    private var running = false
    private var manuallyStopped = false
    private var finished = false
    private const val maxTime: Long = 7_000_000_000L // 7 seconds in nanoseconds

    // Push the current timestamp, set running to true
    fun start() {
        timeStamp = System.nanoTime()
        running = true
        manuallyStopped = false
        finished = false
        Timber.d("Mediator started")
    }

    // Manually stop the WorkerHelper (e.g. when Status is Status.Success)
    fun finishWorkManually() {
        if (!running) return else {
            running = false
            manuallyStopped = true
            finished = true
            Timber.d("Mediator stopped")
        }
    }

    fun observeMaxTimeReached(): Flow<Boolean> = flow {
        try {
            coroutineScope {
                // Check that maxTime has not passed: (System.nanoTime() - timeStamp) <= maxTime
                while (running && !finished && !manuallyStopped && (System.nanoTime() - timeStamp) <= maxTime) {
                    emit(false)
                }
                // This is executed only when the worker has been running longer than maxTime
                if (!manuallyStopped || !finished) {
                    emit(true)
                    running = false
                    finished = true
                    this@coroutineScope.cancel()
                } else if (finished) {
                    this@coroutineScope.cancel()
                }
            }
        } catch (e: CancellationException) {
        }
    }.flowOn(Dispatchers.IO)
}
Then in my WorkManager enqueue function:

fun startDownloadDocumentWork() {
    WorkManager.getInstance(context)
        .enqueueUniqueWork("Download Document List", ExistingWorkPolicy.REPLACE, downloadDocumentListWork)
    pushNotification()
}

private fun pushNotification() {
    WorkerHelper.start()
}
And finally in my ViewModel:

private fun observeDocumentList() = viewModelScope.launch {
    observerWorkerState(documentListWorkInfo).collect {
        when (it) {
            is Status.Loading -> {
                _documentDataState.postValue(Status.loading())
                // Launch another coroutine, otherwise the current viewModelScope will be blocked
                CoroutineScope(Dispatchers.IO).launch {
                    WorkerHelper.observeMaxTimeReached().collect { lostConnection ->
                        if (lostConnection) {
                            _documentDataState.postValue(Status.failed("Internet verbindung nicht da"))
                        }
                    }
                }
            }
            is Status.Success -> {
                WorkerHelper.finishWorkManually()
                _documentDataState.postValue(Status.success(getDocumentList()))
            }
            is Status.Failure -> {
                WorkerHelper.finishWorkManually()
                _documentDataState.postValue(Status.failed(it.message.toString()))
            }
        }
    }
}
I've also created a function that converts the state of my work to my custom Status class:
Status
sealed class Status<out T> {
    data class Success<out T>(val data: T) : Status<T>()
    class Loading<T> : Status<T>()
    data class Failure<out T>(val message: String?) : Status<T>()

    companion object {
        fun <T> success(data: T) = Success<T>(data)
        fun <T> loading() = Loading<T>()
        fun <T> failed(message: String?) = Failure<T>(message)
    }
}
Function
suspend inline fun observerWorkerState(workInfoFlow: Flow<WorkInfo>): Flow<Status<Unit>> = flow {
    workInfoFlow.collect {
        when (it.state) {
            WorkInfo.State.ENQUEUED -> emit(Status.loading<Unit>())
            WorkInfo.State.RUNNING -> emit(Status.loading<Unit>())
            WorkInfo.State.SUCCEEDED -> emit(Status.success(Unit))
            WorkInfo.State.BLOCKED -> emit(Status.failed<Unit>("Workmanager blocked"))
            WorkInfo.State.FAILED -> emit(Status.failed<Unit>("Workmanager failed"))
            WorkInfo.State.CANCELLED -> emit(Status.failed<Unit>("Workmanager cancelled"))
        }
    }
}
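The watcher's busy-wait loop can also be replaced by a plain deadline check: record System.nanoTime() when the work is enqueued and compare on demand. A minimal sketch follows, with an injectable clock so it can be unit tested; EnqueueDeadline is an illustrative name and not a WorkManager API.

```kotlin
// Tracks how long something has been waiting and answers "has the limit passed?"
// The clock is injectable so the deadline can be tested without sleeping.
class EnqueueDeadline(
    private val maxNanos: Long,
    private val clock: () -> Long = System::nanoTime
) {
    private var startedAt: Long? = null

    fun start() { startedAt = clock() }      // call when the work is enqueued
    fun finish() { startedAt = null }        // call on success/failure

    // True only while started and past the limit.
    fun expired(): Boolean = startedAt?.let { clock() - it > maxNanos } ?: false
}
```

An observer of the WorkInfo LiveData/Flow could then call expired() whenever the state is still ENQUEUED and show the error message once it returns true, without any loop emitting values continuously.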
How can I test a Delegates.observable that is inside a BroadcastReceiver? I need to get the battery level of the device, check whether it just went below or above a pre-defined critical level, and upload that to the server using a UseCase (clean architecture). I used an observable to observe only changing states.
private fun handleIntent(context: Context, intent: Intent) {
    when (intent.action) {
        Intent.ACTION_BATTERY_CHANGED -> {
            try {
                val batteryStatus =
                    context.registerReceiver(null, IntentFilter(Intent.ACTION_BATTERY_CHANGED))
                val level = batteryStatus?.getIntExtra(BatteryManager.EXTRA_LEVEL, -1) ?: -1
                val scale = batteryStatus?.getIntExtra(BatteryManager.EXTRA_SCALE, -1) ?: -1
                batteryPct = (level / scale.toFloat() * 100).toInt()
                isBatteryBelowCritical = batteryPct > CRITICAL_BATTERY
            } catch (e: Exception) {
            }
        }
    }
}
And the observable:

private var isBatteryBelowCritical by Delegates.observable(false) { _, old, new ->
    if (old && !new) {
        // has gone back above the critical battery value
        sendAlarmUseCase.sendBatteryAlarm(batteryPct)
    } else if (!old && new) {
        // has gone below the critical battery value
        sendAlarmUseCase.sendBatteryAlarm(batteryPct)
    }
}
Do I have to use parameters, or assume an old value to test the current value? How is state like this tested? Should I use a parameterized test or assume the previous value?
You could use a kind of dependency injection and refactor out the logic that checks for the state change:
fun notifyOnlyOnChange(initialValue: Boolean, notify: () -> Unit): ReadWriteProperty<Any?, Boolean> =
    Delegates.observable(initialValue) { _, old, new ->
        if (old != new) // your logic can be simplified to this
            notify()
    }
Then you can use it in your BroadcastReceiver like this:
private var isBatteryBelowCritical by notifyOnlyOnChange(false) {
    sendAlarmUseCase.sendBatteryAlarm(batteryPct)
}
And unit test it like this:
@Test
fun `test observers are not notified when value is not changed`() {
    var observable1 by notifyOnlyOnChange(false) { fail() }
    observable1 = false
    var observable2 by notifyOnlyOnChange(true) { fail() }
    observable2 = true
}

@Test
fun `test observers are notified when value is changed`() {
    var notified1 = false
    var observable1 by notifyOnlyOnChange(false) { notified1 = true }
    observable1 = true
    assertTrue(notified1)

    var notified2 = false
    var observable2 by notifyOnlyOnChange(true) { notified2 = true }
    observable2 = false
    assertTrue(notified2)
}
I am using interactive video broadcasting in my app, and I am attaching the class in which I use live streaming.
I am getting an audio issue: when I go back from the live streaming screen to the previous screen, I can still hear the audio of the host.
Previously I was using the leaveChannel method and destroying the RTC client object, but with that implementation, going back from the streaming class closed the screens of all users of the app because of the leaveChannel call. After that, I removed it from my onDestroy method.
Now I am using the disableAudio method, which disables the audio, but when I reopen the live streaming class it doesn't enable audio again. The enableAudio method is not working. I also tried the muteLocalAudioStream method and the onUserMuteAudio RTC handler.
I am getting this error:

"LiveStreamingActivity has leaked IntentReceiver io.agora.rtc.internal.AudioRoutingController$HeadsetBroadcastReceiver@101a7a7 that was originally registered here. Are you missing a call to unregisterReceiver()?
android.app.IntentReceiverLeaked: Activity com.allin.activities.home.homeActivities.LiveStreamingActivity has leaked IntentReceiver io.agora.rtc.internal.AudioRoutingController$HeadsetBroadcastReceiver@101a7a7 that was originally registered here. Are you missing a call to unregisterReceiver()?"
The receiver is registered inside the SDK, and the exception is also thrown inside the SDK, which is a jar file I can't edit.
Please help me resolve this issue, as I have to release the app on the Play Store.
// First I tried this, but it automatically stops the other devices' streaming.
override fun onDestroy() {
    /* if (mRtcEngine != null) {
        leaveChannel()
        RtcEngine.destroy(mRtcEngine)
        mRtcEngine = null
    } */

    // Second, I tried disabling the audio so that the user will not hear the host's voice
    if (mRtcEngine != null) {
        mRtcEngine!!.disableAudio()
    }
    super.onDestroy()
}
// Then, when I come back to the live streaming activity from the previous screen,
// everything is initialized again, but the audio is not audible.
override fun onResume() {
    super.onResume()
    Log.e("resume", "resume")
    if (mRtcEngine != null) {
        mRtcEngine!!.enableAudio()
        // mRtcEngine!!.resumeAudio()
    }
}
The code I am using:

// Agora RTC engine and handler initialization -----------------
private var mRtcEngine: RtcEngine? = null
private var mRtcEventHandler = object : IRtcEngineEventHandler() {

    @SuppressLint("LongLogTag")
    override fun onFirstRemoteVideoDecoded(uid: Int, width: Int, height: Int, elapsed: Int) {
    }

    override fun onUserOffline(uid: Int, reason: Int) {
        runOnUiThread {
            val a = reason // if login = 0 the user is offline
            try {
                if (mUid == uid) {
                    if (surfaceView?.parent != null)
                        (surfaceView?.parent as ViewGroup).removeAllViews()
                    if (mRtcEngine != null) {
                        leaveChannel()
                        RtcEngine.destroy(mRtcEngine)
                        mRtcEngine = null
                    }
                    setResult(IntentConstants.REQUEST_CODE_LIVE_STREAMING)
                    finish()
                }
            } catch (e: Exception) {
                e.printStackTrace()
            }
        }
    }

    override fun onUserMuteVideo(uid: Int, muted: Boolean) {
        runOnUiThread {
            // onRemoteUserVideoMuted(uid, muted);
            Log.e("video", "muted")
        }
    }

    override fun onAudioQuality(uid: Int, quality: Int, delay: Short, lost: Short) {
        super.onAudioQuality(uid, quality, delay, lost)
        Log.e("", "")
    }

    override fun onUserJoined(uid: Int, elapsed: Int) {
        // super.onUserJoined(uid, elapsed)
        mUid = uid
        runOnUiThread {
            try {
                setupRemoteVideo(mUid!!)
            } catch (e: Exception) {
                e.printStackTrace()
            }
        }
        Log.e("differnt_uid----", mUid.toString())
    }
}
private fun initAgoraEngineAndJoinChannel() {
    if (mRtcEngine == null) {
        initializeAgoraEngine()
        setupVideoProfile()
    }
}

// Initializing the RTC engine
@Throws(Exception::class)
private fun initializeAgoraEngine() {
    try {
        var s = RtcEngine.getSdkVersion()
        mRtcEngine = RtcEngine.create(baseContext, AgoraConstants.APPLICATION_ID, mRtcEventHandler)
    } catch (e: Exception) {
        // Log.e(LOG_TAG, Log.getStackTraceString(e));
        throw RuntimeException("NEED TO check rtc sdk init fatal error\n" + Log.getStackTraceString(e))
    }
}

@Throws(Exception::class)
private fun setupVideoProfile() {
    // mRtcEngine?.muteAllRemoteAudioStreams(true)
    // mLogger.log("channelName account = " + channelName + ",uid = " + 0);
    mRtcEngine?.enableVideo()
    // mRtcEngine.clearVideoCompositingLayout();
    mRtcEngine?.enableLocalVideo(false)
    mRtcEngine?.setEnableSpeakerphone(false)
    mRtcEngine?.muteLocalAudioStream(true)
    joinChannel()
    mRtcEngine?.setVideoProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING, true)
    mRtcEngine?.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING)
    mRtcEngine?.setClientRole(Constants.CLIENT_ROLE_AUDIENCE, "")
    val speaker = mRtcEngine?.isSpeakerphoneEnabled
    val camerafocus = mRtcEngine?.isCameraAutoFocusFaceModeSupported
    Log.e("", "")
}

@Throws(Exception::class)
private fun setupRemoteVideo(uid: Int) {
    val container = findViewById<FrameLayout>(R.id.fl_video_container)
    if (container.childCount >= 1) {
        return
    }
    surfaceView = RtcEngine.CreateRendererView(baseContext)
    container.addView(surfaceView)
    mRtcEngine?.setupRemoteVideo(VideoCanvas(surfaceView, VideoCanvas.RENDER_MODE_HIDDEN, uid))
    mRtcEngine?.setRemoteVideoStreamType(uid, 1)
    mRtcEngine?.setCameraAutoFocusFaceModeEnabled(false)
    mRtcEngine?.muteRemoteAudioStream(uid, false)
    mRtcEngine?.adjustPlaybackSignalVolume(0)
    // mRtcEngine.setVideoProfile(Constants.VIDEO_PROFILE_180P, false); // Earlier than 2.3.0
    surfaceView?.tag = uid // for mark purpose
    val audioManager: AudioManager =
        this@LiveStreamingActivity.getSystemService(Context.AUDIO_SERVICE) as AudioManager
    // audioManager.mode = AudioManager.MODE_IN_CALL
    val isConnected: Boolean = audioManager.isWiredHeadsetOn
    if (isConnected) {
        /* audioManager.isSpeakerphoneOn = false
           audioManager.isWiredHeadsetOn = true */
        mRtcEngine?.setEnableSpeakerphone(false)
        mRtcEngine?.setDefaultAudioRoutetoSpeakerphone(false)
        mRtcEngine?.setSpeakerphoneVolume(0)
        mRtcEngine?.enableInEarMonitoring(true)
        // Sets the in-ear monitoring volume.
        mRtcEngine?.setInEarMonitoringVolume(200)
        mRtcEngine?.adjustPlaybackSignalVolume(200)
    } else {
        /* audioManager.isSpeakerphoneOn = true
           audioManager.isWiredHeadsetOn = false */
        mRtcEngine?.setEnableSpeakerphone(true)
        mRtcEngine?.setDefaultAudioRoutetoSpeakerphone(true)
        mRtcEngine?.setSpeakerphoneVolume(50)
        mRtcEngine?.adjustPlaybackSignalVolume(50)
        mRtcEngine?.enableInEarMonitoring(false)
        // Sets the in-ear monitoring volume to zero.
        mRtcEngine?.setInEarMonitoringVolume(0)
    }
    Log.e("", "")
}

@Throws(Exception::class)
private fun joinChannel() {
    mRtcEngine?.joinChannel(
        null,
        AgoraConstants.CHANNEL_NAME,
        "Extra Optional Data",
        0
    ) // if you do not specify the uid, we will generate the uid for you
}

@Throws(Exception::class)
private fun leaveChannel() {
    mRtcEngine!!.leaveChannel()
}
I think you should first move setupRemoteVideo into the onFirstRemoteVideoDecoded callback instead of the onUserJoined callback. Also, in the onDestroy callback, you should call RtcEngine.destroy() instead of RtcEngine.destroy(mRtcEngine).
I have the following code for music recognition. I am using an IntentService to do all the music recognition in a service. I have done all the basic steps, like adding all the required permissions and adding the ACRCloud Android SDK to the project.
class SongIdentifyService(discoverPresenter: DiscoverPresenter? = null) : IACRCloudListener, IntentService("SongIdentifyService") {

    private val callback: SongIdentificationCallback? = discoverPresenter
    private val mClient: ACRCloudClient by lazy { ACRCloudClient() }
    private val mConfig: ACRCloudConfig by lazy { ACRCloudConfig() }
    private var initState: Boolean = false
    private var mProcessing: Boolean = false

    override fun onHandleIntent(intent: Intent?) {
        Log.d("SongIdentifyService", "onHandleIntent called")
        setUpConfig()
        addConfigToClient()
        if (callback != null) {
            startIdentification(callback)
        }
    }

    public fun setUpConfig() {
        Log.d("SongIdentifyService", "setUpConfig called")
        this.mConfig.acrcloudListener = this@SongIdentifyService
        this.mConfig.host = "some-host"
        this.mConfig.accessKey = "some-accesskey"
        this.mConfig.accessSecret = "some-secret"
        this.mConfig.protocol = ACRCloudConfig.ACRCloudNetworkProtocol.PROTOCOL_HTTP // PROTOCOL_HTTPS
        this.mConfig.reqMode = ACRCloudConfig.ACRCloudRecMode.REC_MODE_REMOTE
    }

    // Called to start identifying/discovering the song that is currently playing
    fun startIdentification(callback: SongIdentificationCallback) {
        Log.d("SongIdentifyService", "startIdentification called")
        if (!initState) {
            Log.d("AcrCloudImplementation", "init error")
        }
        if (!mProcessing) {
            mProcessing = true
            if (!mClient.startRecognize()) {
                mProcessing = false
                Log.d("AcrCloudImplementation", "start error")
            }
        }
    }

    // Called to stop identifying/discovering the song
    fun stopIdentification() {
        Log.d("SongIdentifyService", "stopIdentification called")
        if (mProcessing) {
            mClient.stopRecordToRecognize()
        }
        mProcessing = false
    }

    fun cancelListeningToIdentifySong() {
        if (mProcessing) {
            mProcessing = false
            mClient.cancel()
        }
    }

    fun addConfigToClient() {
        Log.d("SongIdentifyService", "addConfigToClient called")
        this.initState = this.mClient.initWithConfig(this.mConfig)
        if (this.initState) {
            this.mClient.startPreRecord(3000)
        }
    }

    override fun onResult(result: String?) {
        Log.d("SongIdentifyService", "onResult called")
        Log.d("SongIdentifyService", result ?: "null result")
        mClient.cancel()
        mProcessing = false
        // Parse into a new val instead of shadowing the nullable parameter
        val parsedResult = Gson().fromJson(result, SongIdentificationResult::class.java)
        if (parsedResult.status.code == 3000) {
            callback!!.onOfflineError()
        } else if (parsedResult.status.code == 1001) {
            callback!!.onSongNotFound()
        } else if (parsedResult.status.code == 0) {
            callback!!.onSongFound(MusicDataMapper().convertFromDataModel(parsedResult))
            // callback!!.onSongFound(Song("", "", ""))
        } else {
            callback!!.onGenericError()
        }
    }

    override fun onVolumeChanged(p0: Double) {
        TODO("not implemented") // To change body of created functions use File | Settings | File Templates.
    }

    interface SongIdentificationCallback {
        // Called when the user is offline and music identification failed
        fun onOfflineError()

        // Called when a generic error occurs and music identification failed
        fun onGenericError()

        // Called when music identification completed but couldn't identify the song
        fun onSongNotFound()

        // Called when identification completed and a matching song was found
        fun onSongFound(song: Song)
    }
}
Now when I am starting the service I am getting the following error:
I checked the implementation of ACRCloudClient, and it extends Android's Activity class. ACRCloudClient also uses SharedPreferences (that's why I am getting a NullPointerException).
Since keeping a reference to an activity in a service is not a good idea, it's best to implement the above code in the activity. All the recognition work is done on a separate thread anyway inside the ACRCloudClient class, so there is no point in creating another service for it.