Java Google Cloud SpeechClient - How to stop streaming audio? - Android

Following the sample docs, the code below starts recording audio, streams it to Google Cloud, and receives responses. Everything works, but I want to close the stream once a certain condition is met.
if (mPermissionToRecord) {
    val isFirstRequest = AtomicBoolean(true)
    mAudioEmitter = AudioEmitter()
    textView.setText("starting listener.")

    // start streaming the data to the server and collect responses
    val requestStream = mSpeechClient.streamingRecognizeCallable()
        .bidiStreamingCall(object : ApiStreamObserver<StreamingRecognizeResponse> {
            override fun onNext(value: StreamingRecognizeResponse) {
                runOnUiThread {
                    when {
                        value.resultsCount > 0 -> mTextView.setText(
                            value.getResults(0).getAlternatives(0).transcript
                        )
                        else -> mTextView.setText(getString(R.string.api_error))
                    }
                }
            }

            override fun onError(t: Throwable) {
                // Log.e(TAG, "an error occurred", t)
                textView.setText("an error occurred $t")
            }

            override fun onCompleted() {
                // Log.d(TAG, "stream closed")
                textView.setText("stream closed")
            }
        })

    // monitor the input stream and send requests as audio data becomes available
    mAudioEmitter!!.start { bytes ->
        val builder = StreamingRecognizeRequest.newBuilder()
            .setAudioContent(bytes)

        // if this is the first request, include the config
        if (isFirstRequest.getAndSet(false)) {
            builder.streamingConfig = StreamingRecognitionConfig.newBuilder()
                .setConfig(
                    RecognitionConfig.newBuilder()
                        .setLanguageCode("en-US")
                        .setEncoding(RecognitionConfig.AudioEncoding.LINEAR16)
                        .setSampleRateHertz(16000)
                        .build()
                )
                .setInterimResults(true)
                .setSingleUtterance(false)
                .build()
        }

        // send the next request
        requestStream.onNext(builder.build())
    }
} else {
    Log.e(TAG, "No permission to record! Please allow.")
}
AudioEmitter() is an audio recorder class. I tried calling:
mAudioEmitter?.stop()
mAudioEmitter = null
but that only stops the audio recording; I want to stop the audio streaming as well.
Calling mSpeechClient.shutdown() crashes the app.
How do I stop a SpeechClient bidiStreamingCall?
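One way to stop the stream without shutting the whole client down is to half-close the request side. A minimal sketch, assuming the gax ApiStreamObserver contract (bidiStreamingCall returns an ApiStreamObserver for requests, and calling onCompleted() on it half-closes the stream) and assuming requestStream is kept as a field rather than a local:

fun stopStreaming() {
    // stop the recorder first so no further onNext() races the close
    mAudioEmitter?.stop()
    mAudioEmitter = null
    // half-close: tells the server no more audio is coming; the response
    // observer's onCompleted() should then fire with "stream closed"
    requestStream.onCompleted()
    // keep mSpeechClient alive for later streams; only call
    // mSpeechClient.shutdown() once no stream is active
}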

Related

Android, BLE continuously writing a characteristic value disconnects the GATT server

I notice that if I write a characteristic value quickly and continuously, the GATT server disconnects.
I know that I have to wait for the onCharacteristicWrite callback, so I don't think that's the problem.
This is my queue implementation; I'm using a Kotlin Channel to synchronize writes and reads.
private var continuation: CancellableContinuation<BluetoothGattCharacteristic>? = null
private val channel = Channel<WriteOp>(1)

private suspend fun processBluetoothWrite() {
    do {
        val writeOp = channel.receiveOrNull()
        writeOp?.apply {
            try {
                suspendCancellableCoroutine<BluetoothGattCharacteristic> { cont ->
                    continuation = cont
                    characteristic.value = writeOp?.value
                    Log.d(TAG, "Write to ${characteristic?.uuid} value ${writeOp?.value?.toHexString()}...")
                    if (gatt?.writeCharacteristic(characteristic) == false) {
                        cont.resumeWithException(Exception("Write to ${characteristic?.uuid} fails."))
                    }
                }
            } catch (ex: Exception) {
                Log.e(TAG, ex.message, ex)
            }
        }
    } while (writeOp != null)
}
override fun onCharacteristicWrite(
    gatt: BluetoothGatt?,
    characteristic: BluetoothGattCharacteristic?,
    status: Int
) {
    Log.d(TAG, "Write to ${characteristic?.uuid} value ${characteristic?.value?.toHexString()} | $status")
    characteristic?.apply {
        if (status == BluetoothGatt.GATT_SUCCESS) {
            continuation?.resume(this)
        } else {
            continuation?.resumeWithException(Exception("Write to ${characteristic?.uuid} value ${characteristic?.value?.toHexString()} | $status"))
        }
    }
}
I need to add a delay of about 100 ms in the queue processing to avoid the disconnection (see the sketch below).
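For illustration, here is the write step from the loop above with that throttle added; the only change is the kotlinx.coroutines delay call after each confirmed write.

writeOp?.apply {
    try {
        suspendCancellableCoroutine<BluetoothGattCharacteristic> { cont ->
            continuation = cont
            characteristic.value = writeOp?.value
            if (gatt?.writeCharacteristic(characteristic) == false) {
                cont.resumeWithException(Exception("Write to ${characteristic?.uuid} fails."))
            }
        }
        delay(100) // ~100 ms pause gives the peripheral time to drain its buffer
    } catch (ex: Exception) {
        Log.e(TAG, ex.message, ex)
    }
}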
UPDATE
After setting writeType to the default, onCharacteristicWrite seems more realistic (I used to get GATT_SUCCESS even when the device had stopped communicating, so I guess it was a "virtual" state). Now, when the device stops communicating, the onCharacteristicWrite callback doesn't arrive, though after a while it fires with status = 133.
characteristic.writeType = BluetoothGattCharacteristic.WRITE_TYPE_DEFAULT
What does it mean?
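Not an official constant, but status 133 is widely reported as the generic GATT_ERROR (0x85) that Android's Bluetooth stack returns for many low-level failures, including a link that silently died. A hedged sketch of treating it as a dead connection rather than a write result:

override fun onCharacteristicWrite(
    gatt: BluetoothGatt?,
    characteristic: BluetoothGattCharacteristic?,
    status: Int
) {
    if (status == 133) { // 0x85: generic GATT_ERROR, usually a lost link
        continuation?.resumeWithException(Exception("Link lost (status 133)"))
        gatt?.close() // release the client; reconnect from scratch
        return
    }
    // ... handle GATT_SUCCESS and other statuses as above ...
}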

Transmit audio to Doorbird device

I'm trying to create an Android app that connects to the Doorbird device. I know about the company's official app, but I need more features tailored to my needs.
For anyone who doesn't know the Doorbird device: Doorbird is a smart intercom made by the Doorbird company. The device can transmit audio and video to any consumer, such as an Android device, over HTTP and RTSP, and it can also receive an audio stream and play it, so you can, for example, record audio on an Android device and transmit it to Doorbird. The audio format is G.711 u-law.
I was able to receive the video and audio streams from Doorbird, and that works perfectly, but I don't succeed in transmitting audio, in the u-law format of course, to Doorbird.
The error I get is:
HTTP FAILED: java.net.ProtocolException: Unexpected status line:
I tried to transmit the same bytes I get from Doorbird back to Doorbird, but I still get the same error.
Of course, I work according to the API they published, but there is not much information about the expected protocol for transmitting audio.
Official Doorbird API
Is there an example of an Android project that integrates with Doorbird?
Can anyone help with trying to broadcast audio to Doorbird?
Which protocol should it be?
I'd even appreciate hearing from someone who knows how to transmit audio to Doorbird with any other tool on any system, not just Android.
This is what I tried: I received the data from Doorbird (and as I said, that works), waited 3 seconds, and transmitted it back to Doorbird with the Retrofit library.
const val AUDIO_PATH =
    "http://192.168.1.187/bha-api/audio-receive.cgi?http-user=XXXXXX0001&http-password=XXXXXXXXXX"

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_main)
    //InputStream inputStream = getResources().openRawResource(R.raw.piano12);
    val thread = Thread { this.playUrl() }
    thread.start()
    //val inStr = assets.open("doorbird_record")
}

private fun playUrl() {
    val inStr = URL(AUDIO_PATH).openStream()
    val buffer = ByteArray(1000)
    var i = 0
    //while (inStr.read(buffer).also { i = it } != -1) {
    Handler(Looper.getMainLooper()).postDelayed({
        //inStr.close()
        inStr.read(buffer)
        Log.d("DoorbirdLog", inStr.toString())
        val part = MultipartBody.Part.createFormData(
            "doorbirdStream", "doorbird", buffer.toRequestBody(
                ("audio/basic").toMediaType()
            )
        )
        //val rb = file.asRequestBody(("audio/*").toMediaType())
        val call = NetworkManager.instanceServiceApi.upload(part)
        call.enqueue(object : Callback<ResponseBody> {
            override fun onResponse(
                call: Call<ResponseBody>,
                response: Response<ResponseBody>
            ) {
                val i = response.body()
                Log.d("success", i.toString())
            }

            override fun onFailure(call: Call<ResponseBody>, t: Throwable) {
                Log.d("failed", t.message.toString())
            }
        })
    }, 3000)
}
And the Retrofit instance:
@Multipart
@Headers(
    "Content-Type: audio/basic",
    "Content-Length: 9999999",
    "Connection: Keep-Alive",
    "Cache-Control: no-cache"
)
@POST("audio-transmit.cgi?http-user=XXXXXX0001&http-password=XXXXXXXXXX")
fun upload(@Part part: MultipartBody.Part): Call<ResponseBody>
I'd appreciate your assistance.
Eventually I was able to find a solution. I'll briefly present it here for anyone who attempts to integrate with Doorbird.
private const val FREQUENCY_SAMPLE_RATE_TRANSMIT = 8000
private const val RECORD_STATE_STOPPED = 0

override suspend fun recordAndTransmitAudio(audioTransmitUrl: String) =
    withContext(Dispatchers.IO) {
        val minBufferSize = AudioRecord.getMinBufferSize(
            FREQUENCY_SAMPLE_RATE_TRANSMIT, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT
        )
        mRecorder = AudioRecord(
            MediaRecorder.AudioSource.VOICE_COMMUNICATION,
            FREQUENCY_SAMPLE_RATE_TRANSMIT, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT, minBufferSize
        )
        mRecorder?.let { enableAcousticEchoCanceler(it.audioSessionId) }
        mRecorder?.startRecording()

        val bufferShort = ShortArray(minBufferSize)
        val buffer = ByteArray(minBufferSize)

        val urlConnection = URL(audioTransmitUrl).openConnection() as HttpURLConnection
        urlConnection.apply {
            doOutput = true
            setChunkedStreamingMode(minBufferSize)
        }
        val output = DataOutputStream(urlConnection.outputStream)
        output.flush()

        try {
            mRecorder?.let { recorder ->
                while (recorder.read(bufferShort, 0, bufferShort.size) != RECORD_STATE_STOPPED) {
                    G711UCodecManager.encode(bufferShort, minBufferSize, buffer, 0)
                    output.write(buffer)
                }
            }
        } catch (e: Exception) {
            Log.d(TAG, e.message.toString())
        }
        output.close()
        urlConnection.disconnect()
    }
First, we prepare the necessary parameters for recording and transmission:
- Get the minimum buffer size for recording.
- Define the object with which we will record.
- Activate acoustic echo cancellation.
- Start recording.
- Open a connection to the transmit URL.
- Loop while the recording has not stopped.
- Encode the recorded data from 16-bit PCM to G.711 μ-law (see the encoder sketch below).
- And of course, after the recording finishes, clean up the resources.
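The G711UCodecManager used above isn't shown in the question. For reference, here is a minimal sketch of the standard G.711 μ-law encoding it presumably performs; the object name and the encode signature are assumptions chosen to match the call site.

object G711UCodecManager {
    private const val BIAS = 0x84   // standard G.711 bias (132)
    private const val CLIP = 32635  // clip before biasing to avoid overflow

    // Encode one 16-bit linear PCM sample to an 8-bit mu-law byte.
    fun encodeSample(pcm: Short): Byte {
        var sample = pcm.toInt()
        val sign: Int
        if (sample < 0) {
            sign = 0x80
            sample = -sample
        } else {
            sign = 0
        }
        if (sample > CLIP) sample = CLIP
        sample += BIAS
        // Segment (exponent) = position of the highest set bit.
        var exponent = 7
        var mask = 0x4000
        while (exponent > 0 && (sample and mask) == 0) {
            exponent--
            mask = mask shr 1
        }
        val mantissa = (sample shr (exponent + 3)) and 0x0F
        // mu-law bytes are transmitted inverted
        return (sign or (exponent shl 4) or mantissa).inv().toByte()
    }

    // Matches the call site above: encode `count` samples from `src`
    // into `dst` starting at `offset`.
    fun encode(src: ShortArray, count: Int, dst: ByteArray, offset: Int) {
        for (i in 0 until count) dst[offset + i] = encodeSample(src[i])
    }
}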

Twitter Streaming API HTTP 420

I want to consume the Twitter streaming API in Android.
I've used Kotlin coroutines and Retrofit.
Somehow, on the third request I get an HTTP 420 error ("Enhance your calm").
I cannot understand why this happens.
Here's my code:
fun getStreamData(str: String) {
    Log.d("debug", "Fetching data..")
    coroutineScope.launch {
        withContext(Dispatchers.Main) {
            // Display loading animation in UI
            _status.value = DataApiStatus.LOADING
        }
        try {
            val listResult = ApiService().api!!.getTweetList(str).await()
            while (!listResult.source().exhausted()) {
                val reader = JsonReader(InputStreamReader(listResult.byteStream()))
                // https://stackoverflow.com/questions/11484353/gson-throws-malformedjsonexception
                reader.setLenient(true)
                val gson = GsonBuilder().create()
                val j = gson.fromJson<JsonObject>(reader, JsonObject::class.java)
                Log.d("debug", "JSON: $j")
                if (j.get("text") != null
                    && j.getAsJsonObject("user").get("profile_image_url_https") != null
                    && j.getAsJsonObject("user").get("name") != null
                ) {
                    val t = gson.fromJson<Tweet>(j, Tweet::class.java)
                    withContext(Dispatchers.Main) {
                        _status.value = DataApiStatus.DONE
                        // https://stackoverflow.com/questions/47941537/notify-observer-when-item-is-added-to-list-of-livedata
                        tweetsList.add(t)
                        _tweetsList.value = tweetsList
                    }
                }
            }
        } catch (e: JsonSyntaxException) {
            Log.e("error", "JsonSyntaxException ${e.message}")
        } catch (e: Exception) {
            Log.e("error", "ERROR ${e.message}")
        }
    }
}
This function searches the stream according to the str parameter.
Also, when the search parameter changes, I cancel the current job and relaunch a new one with the new search parameter.
fun cancelJob() {
    Log.d("debug", "Cancelling current Job!")
    coroutineScope.coroutineContext.cancelChildren()
}
What am I doing wrong? On the third request I get an HTTP 420 error.
Here's the full code:
https://github.com/maiamiguel/RHO-Challenge
The 420 Enhance Your Calm status code is an unofficial extension used by Twitter to tell HTTP clients that they are being rate limited. Rate limiting means restricting the total number of requests a client may make within a time period.
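Twitter's streaming guidelines ask clients that receive a 420 to back off exponentially before reconnecting, and cancelling and immediately relaunching the job on every search change multiplies the connection rate, which is likely what triggers the 420 by the third request. A minimal sketch of such a retry wrapper; the helper name and starting delay are assumptions, and it relies on Retrofit's HttpException carrying the status code:

import kotlinx.coroutines.delay
import retrofit2.HttpException

// Retries `block`, doubling the pause after every HTTP 420;
// anything else is rethrown immediately.
suspend fun <T> withEnhanceYourCalmBackoff(
    initialDelayMs: Long = 60_000, // start at one minute
    maxRetries: Int = 5,
    block: suspend () -> T
): T {
    var delayMs = initialDelayMs
    repeat(maxRetries - 1) {
        try {
            return block()
        } catch (e: HttpException) {
            if (e.code() != 420) throw e
            delay(delayMs) // wait before reconnecting
            delayMs *= 2   // exponential backoff
        }
    }
    return block() // final attempt; let any error propagate
}

The stream-opening call (for example ApiService().api!!.getTweetList(str).await()) would be wrapped in this helper.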

enableAudio method issue in Android Agora RTC SDK

I am using interactive video broadcasting in my app.
I am attaching the class in which I am using live streaming.
I get an audio issue when I go back from the live streaming screen to the previous screen: I can still hear the host's audio.
Previously I was using the leaveChannel method and destroying the RtcEngine object, but with that, when I went back from the streaming class it closed the stream for all users of the app, so I removed that from my onDestroy method.
Now I am using the disableAudio method, which disables the audio, but when I reopen the live streaming class, enableAudio does not re-enable it. I also tried the mute local audio stream method and the onUserMuteAudio handler.
I am getting this error:

"Activity com.allin.activities.home.homeActivities.LiveStreamingActivity has leaked IntentReceiver io.agora.rtc.internal.AudioRoutingController$HeadsetBroadcastReceiver@101a7a7 that was originally registered here. Are you missing a call to unregisterReceiver()? android.app.IntentReceiverLeaked: Activity com.allin.activities.home.homeActivities.LiveStreamingActivity has leaked IntentReceiver io.agora.rtc.internal.AudioRoutingController$HeadsetBroadcastReceiver@101a7a7 that was originally registered here. Are you missing a call to unregisterReceiver()?"

The receiver is registered inside the SDK, and the exception is raised inside the SDK jar, which I can't edit.
Please help me resolve this issue, as I have to publish the app on the Play Store.
// First I tried this, but it automatically stops the other devices' streams.
override fun onDestroy() {
    /* if (mRtcEngine != null) {
        leaveChannel()
        RtcEngine.destroy(mRtcEngine)
        mRtcEngine = null
    } */

    // Second, I tried disabling the audio so that the user will not hear
    // the host's voice.
    if (mRtcEngine != null) {
        mRtcEngine!!.disableAudio()
    }
    super.onDestroy()
}
// Then, when I come back from the previous screen to the live streaming activity, everything is initialized again but the audio is not audible.
override fun onResume() {
    super.onResume()
    Log.e("resume", "resume")
    if (mRtcEngine != null) {
        mRtcEngine!!.enableAudio()
        // mRtcEngine!!.resumeAudio()
    }
}
The code I am using:
// agora rtc engine and handler initialization -----------------
private var mRtcEngine: RtcEngine? = null
private var mRtcEventHandler = object : IRtcEngineEventHandler() {

    @SuppressLint("LongLogTag")
    override fun onFirstRemoteVideoDecoded(uid: Int, width: Int, height: Int, elapsed: Int) {
    }

    override fun onUserOffline(uid: Int, reason: Int) {
        runOnUiThread {
            val a = reason // if reason == 0 the user is offline
            try {
                if (mUid == uid) {
                    if (surfaceView?.parent != null)
                        (surfaceView?.parent as ViewGroup).removeAllViews()
                    if (mRtcEngine != null) {
                        leaveChannel()
                        RtcEngine.destroy(mRtcEngine)
                        mRtcEngine = null
                    }
                    setResult(IntentConstants.REQUEST_CODE_LIVE_STREAMING)
                    finish()
                }
            } catch (e: Exception) {
                e.printStackTrace()
            }
        }
    }

    override fun onUserMuteVideo(uid: Int, muted: Boolean) {
        runOnUiThread {
            // onRemoteUserVideoMuted(uid, muted);
            Log.e("video", "muted")
        }
    }

    override fun onAudioQuality(uid: Int, quality: Int, delay: Short, lost: Short) {
        super.onAudioQuality(uid, quality, delay, lost)
        Log.e("", "")
    }

    override fun onUserJoined(uid: Int, elapsed: Int) {
        // super.onUserJoined(uid, elapsed)
        mUid = uid
        runOnUiThread {
            try {
                setupRemoteVideo(mUid!!)
            } catch (e: Exception) {
                e.printStackTrace()
            }
        }
        Log.e("differnt_uid----", mUid.toString())
    }
}
private fun initAgoraEngineAndJoinChannel() {
    if (mRtcEngine == null) {
        initializeAgoraEngine()
        setupVideoProfile()
    }
}
// initializing the rtc engine class
@Throws(Exception::class)
private fun initializeAgoraEngine() {
    try {
        var s = RtcEngine.getSdkVersion()
        mRtcEngine = RtcEngine.create(baseContext, AgoraConstants.APPLICATION_ID, mRtcEventHandler)
    } catch (e: Exception) {
        // Log.e(LOG_TAG, Log.getStackTraceString(e));
        throw RuntimeException("NEED TO check rtc sdk init fatal error\n" + Log.getStackTraceString(e))
    }
}
@Throws(Exception::class)
private fun setupVideoProfile() {
    //mRtcEngine?.muteAllRemoteAudioStreams(true)
    // mLogger.log("channelName account = " + channelName + ",uid = " + 0);
    mRtcEngine?.enableVideo()
    //mRtcEngine.clearVideoCompositingLayout();
    mRtcEngine?.enableLocalVideo(false)
    mRtcEngine?.setEnableSpeakerphone(false)
    mRtcEngine?.muteLocalAudioStream(true)
    joinChannel()
    mRtcEngine?.setVideoProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING, true)
    mRtcEngine?.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING)
    mRtcEngine?.setClientRole(Constants.CLIENT_ROLE_AUDIENCE, "")
    val speaker = mRtcEngine?.isSpeakerphoneEnabled
    val camerafocus = mRtcEngine?.isCameraAutoFocusFaceModeSupported
    Log.e("", "")
}
@Throws(Exception::class)
private fun setupRemoteVideo(uid: Int) {
    val container = findViewById<FrameLayout>(R.id.fl_video_container)
    if (container.childCount >= 1) {
        return
    }
    surfaceView = RtcEngine.CreateRendererView(baseContext)
    container.addView(surfaceView)
    mRtcEngine?.setupRemoteVideo(VideoCanvas(surfaceView, VideoCanvas.RENDER_MODE_HIDDEN, uid))
    mRtcEngine?.setRemoteVideoStreamType(uid, 1)
    mRtcEngine?.setCameraAutoFocusFaceModeEnabled(false)
    mRtcEngine?.muteRemoteAudioStream(uid, false)
    mRtcEngine?.adjustPlaybackSignalVolume(0)
    // mRtcEngine.setVideoProfile(Constants.VIDEO_PROFILE_180P, false); // Earlier than 2.3.0
    surfaceView?.tag = uid // for mark purpose
    val audioManager: AudioManager =
        this@LiveStreamingActivity.getSystemService(Context.AUDIO_SERVICE) as AudioManager
    //audioManager.mode = AudioManager.MODE_IN_CALL
    val isConnected: Boolean = audioManager.isWiredHeadsetOn
    if (isConnected) {
        /* audioManager.isSpeakerphoneOn = false
           audioManager.isWiredHeadsetOn = true */
        mRtcEngine?.setEnableSpeakerphone(false)
        mRtcEngine?.setDefaultAudioRoutetoSpeakerphone(false)
        mRtcEngine?.setSpeakerphoneVolume(0)
        mRtcEngine?.enableInEarMonitoring(true)
        // Sets the in-ear monitoring volume to 50% of original volume.
        mRtcEngine?.setInEarMonitoringVolume(200)
        mRtcEngine?.adjustPlaybackSignalVolume(200)
    } else {
        /* audioManager.isSpeakerphoneOn = true
           audioManager.isWiredHeadsetOn = false */
        mRtcEngine?.setEnableSpeakerphone(true)
        mRtcEngine?.setDefaultAudioRoutetoSpeakerphone(true)
        mRtcEngine?.setSpeakerphoneVolume(50)
        mRtcEngine?.adjustPlaybackSignalVolume(50)
        mRtcEngine?.enableInEarMonitoring(false)
        // Sets the in-ear monitoring volume to 50% of original volume.
        mRtcEngine?.setInEarMonitoringVolume(0)
    }
    Log.e("", "")
}
@Throws(Exception::class)
private fun joinChannel() {
    mRtcEngine?.joinChannel(
        null,
        AgoraConstants.CHANNEL_NAME,
        "Extra Optional Data",
        0
    ) // if you do not specify the uid, we will generate the uid for you
}
@Throws(Exception::class)
private fun leaveChannel() {
    mRtcEngine!!.leaveChannel()
}
I think you first want to put setupRemoteVideo in the onFirstRemoteVideoDecoded callback instead of the onUserJoined callback. Also, in the onDestroy callback, you should call RtcEngine.destroy() instead of RtcEngine.destroy(mRtcEngine).
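A minimal sketch of that suggested onDestroy, assuming Agora's static, parameterless RtcEngine.destroy(), which should also release the leaked HeadsetBroadcastReceiver:

override fun onDestroy() {
    // leave only this client's channel session, then release the engine
    mRtcEngine?.leaveChannel()
    RtcEngine.destroy() // static call that releases the engine's resources
    mRtcEngine = null
    super.onDestroy()
}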

Android service can't connect to server after recreation only in battery saving mode

I'm writing a chat app with the server and Android client written in Kotlin. I create a background service that constantly reads from the socket connected to the server and posts notifications when a message arrives. Everything works fine until the user taps the 'x' button and closes the app. The connection with the server fails while executing the cleanup code posted below: the server gets EOF before the service manages to send the EXIT request and close the streams. The service is then recreated, but when it tries to connect to the server it gets a ConnectException (connection refused). This happens only when battery saving mode is on. When it's off, or the phone is connected to my laptop over USB and charging, there's no problem.
The ss command shows that someone is listening on the specified port, so it's not that. I've tried connecting in a loop, i.e. trying 5 times every 10 seconds, but the connection got refused every time. I've tried listening on two different ports, but both failed, even though one of them hadn't been used before. The docs say the default backlog is 50, so I guess it's not that either. I tried setting the SO_REUSEADDR flag on the server socket, but still nothing. The strange thing is that when the service is started from the app, when I launch it for the second time, it can connect again. So I've created a broadcast receiver that starts the service the same way as the app in case it crashes, but that's not helping either.
I really have been googling this for over a week, but it's my first attempt at using both Kotlin and sockets and I'm running out of ideas. If someone has a clue about what might be going on, I'd really appreciate some help.
Here is the service onStartCommand:
override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
    activeConversation = intent?.getStringExtra(CONV_NAME) ?: ""
    login = intent?.getStringExtra(LOGIN) ?: login
    if (thread?.isAlive != true) {
        thread = thread(start = true) {
            synchronized(lock) {
                try {
                    socket = Socket(SERVER_IP, SERVICE_PORT)
                    output = ObjectOutputStream(socket?.getOutputStream())
                    input = ObjectInputStream(socket?.getInputStream())
                    output?.writeObject(Request(START_SERVICE, mutableMapOf(LOGIN to login)))
                } catch (e: IOException) {
                    e.printStackTrace()
                    return@thread
                }
            }
            handleMessages() // contains input?.readObject() in an infinite loop
        }
    }
    return START_STICKY
}
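The refusal only under battery saver is consistent with Android's background restrictions: with battery saver on, apps in the background can lose network access, so the recreated service may be trying to connect with no usable network. Not from the question, but a common mitigation is to run the service in the foreground; a sketch assuming AndroidX NotificationCompat (the channel id, texts, and icon are placeholders):

import android.app.NotificationChannel
import android.app.NotificationManager
import android.os.Build
import androidx.core.app.NotificationCompat

// Call from onStartCommand(); a visible notification keeps the service
// in the foreground, where battery saver does not cut network access.
private fun promoteToForeground() {
    val channelId = "messenger_service" // hypothetical channel id
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
        val channel = NotificationChannel(
            channelId, "Messenger connection", NotificationManager.IMPORTANCE_LOW
        )
        getSystemService(NotificationManager::class.java)
            .createNotificationChannel(channel)
    }
    val notification = NotificationCompat.Builder(this, channelId)
        .setContentTitle("Connected to chat server")
        .setSmallIcon(android.R.drawable.stat_notify_sync) // placeholder icon
        .build()
    startForeground(1, notification)
}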
In onDestroy() and onTaskRemoved() I call this function:
private fun cleanUp() {
    synchronized(lock) {
        thread(start = true) {
            try {
                output?.writeObject(Request(EXIT, mutableMapOf(LOGIN to login)))
                output?.close()
                input?.close()
                socket?.close()
                nullStreams()
                thread?.join()
                println("SERVICE: thread joined")
            } catch (e: IOException) {
                e.printStackTrace()
                return@thread
            } finally {
                println("Service sends broadcast to ask for recreation")
                val restartIntent = Intent(this, ServiceRestarter::class.java)
                restartIntent.putExtra(LOGIN, login)
                sendBroadcast(restartIntent)
            }
        }.join()
    }
}
ServiceRestarter:
class ServiceRestarter : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent?) {
        val login = intent?.getStringExtra(LOGIN)
        println("SERVICE RESTARTER: receiving restart request from $login")
        val serviceIntent = Intent(context, MessengerService::class.java)
        serviceIntent.putExtra(LOGIN, login)
        context.startService(serviceIntent)
    }
}
Part of my server responsible for listening:
val clientsSocket = ServerSocket(CLIENTS_PORT)
val serviceSocket = ServerSocket(SERVICE_PORT)
serviceSocket.setReuseAddress(true)
println("Server socket ready!")
println("Service socket port: ${serviceSocket.localPort}")

thread(start = true) {
    while (true) ClientThread(clientsSocket.accept(), loggedInUsers, pendingRequests).start()
}
thread(start = true) {
    while (true) ServiceThread(serviceSocket.accept(), loggedInUsers).start()
}
And ServiceThread:
class ServiceThread(
    val socket: Socket,
    val loggedInUsers: HashMap<String, UserConnection>
) : Thread() {

    private var login = ""
    private val input = ObjectInputStream(socket.getInputStream())
    private val output = ObjectOutputStream(socket.getOutputStream())

    override fun run() {
        var request = input.readObject() as Request
        login = request.content[LOGIN] as String
        var userConn: UserConnection?
        synchronized(loggedInUsers) {
            userConn = loggedInUsers[login]
            if (request.action == START_SERVICE) {
                println("SERVICE THREAD: New socket conn from $login")
                userConn?.run {
                    println("SERVICE THREAD: putting $login output to logged in users")
                    serviceStream = output
                    if (pendingMessage != null) {
                        output.writeObject(Request(SEND,
                            mutableMapOf(RESULT to SUCCESS, DATA to pendingMessage)))
                        pendingMessage = null
                    }
                }
            }
        }
        try {
            request = input.readObject() as Request
        } catch (e: IOException) {
            e.printStackTrace()
            cleanUp()
            return
        }
        if (request.action == EXIT) {
            println("SERVICE THREAD: Service of user $login is terminating")
            cleanUp()
        }
    }

    private fun cleanUp() {
        synchronized(loggedInUsers) {
            output.close()
            input.close()
            socket.close()
            loggedInUsers[login]?.serviceStream = null
        }
    }
}
