Long-running service with a large amount of data transfer to an Activity - Android

I am using USB to communicate audio and video from some external hardware.
Video is displayed on a Surface as long as the app is in the foreground, but audio should keep playing even in the background.
Using threads doesn't seem to work: when the app goes to the background, the audio starts to stutter and sounds very bad. I believe I need to use a service, am I wrong?
But from all the online documentation I can't figure out which type of service (if any) is best for such a long-running task, or what the best way is to communicate the large amount of data between it and its Activity & Surface.
I looked into plain services, WorkManager, foreground services, ServiceSocket and many others.
Here is the current code I am using to play the streamed audio from the USB. It uses a coroutine and, as said, the audio starts to stutter when the app is in the background.
class AudioTrackPlayer {

    private val channel = Channel<Boolean>(capacity = 15)
    private var dataList = mutableListOf<ByteArray>()
    private var readIndex = 0
    private var writeIndex = 0
    private val scope = CoroutineScope(Job() + Dispatchers.IO)
    private var audio: AudioTrack? = null

    var lastSize = 0
    var isPlaying = false

    companion object {
        private const val DATA_LIST_SIZE = 11
    }

    init {
        scope.launch {
            startThread()
        }
    }

    fun config(channels: Int, sampleRate: Int): Boolean {
        //......
    }

    private suspend fun startThread() = withContext(Dispatchers.IO) {
        var data: ByteArray
        var isPlayingLast = false
        var finalByteArray: ByteArray
        channel.consumeEach {
            if (audio != null) {
                if (isPlaying) {
                    if (readIndex != writeIndex) {
                        isPlayingLast = true
                        data = dataList[readIndex]
                        lastSize = data.size - 8
                        if (audio?.playState != AudioTrack.PLAYSTATE_PLAYING) {
                            audio?.play()
                        }
                        audio?.write(data, 8, lastSize)
                        readIndex++
                        if (readIndex == DATA_LIST_SIZE) {
                            readIndex = 0
                        }
                    }
                } else {
                    if (isPlayingLast) {
                        isPlayingLast = false
                        finalByteArray = ByteArray(lastSize * 2)
                        if (audio?.playState != AudioTrack.PLAYSTATE_PLAYING) {
                            audio!!.play()
                        }
                        audio!!.write(finalByteArray, 0, finalByteArray.size)
                        audio!!.stop()
                        audio!!.flush()
                        writeIndex = readIndex
                    }
                }
            }
        }
    }

    fun write(data: ByteArray) {
        scope.launch {
            if (isPlaying) {
                dataList.add(writeIndex, data)
                writeIndex++
                if (writeIndex == DATA_LIST_SIZE) {
                    writeIndex = 0
                }
                channel.send(true)
            }
        }
    }

    fun play() {
        //...
    }

    fun stop() {
        //....
    }
}
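A common pattern for exactly this situation is a foreground service that owns the audio pipeline, with the Activity binding to it so it can keep pushing USB data and keep the video Surface on screen while playback survives the app going to the background. Below is a minimal sketch, not a drop-in answer: the service name, notification channel and icon are placeholders, and it assumes the AudioTrackPlayer class above is used unchanged.

class AudioForegroundService : Service() {

    // The service, not the Activity, owns the player, so playback is not tied to the UI lifecycle.
    val player = AudioTrackPlayer()
    private val binder = LocalBinder()

    inner class LocalBinder : Binder() {
        fun getService(): AudioForegroundService = this@AudioForegroundService
    }

    override fun onBind(intent: Intent): IBinder = binder

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        val channelId = "usb_audio"   // placeholder channel id
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            val channel = NotificationChannel(channelId, "USB audio", NotificationManager.IMPORTANCE_LOW)
            getSystemService(NotificationManager::class.java).createNotificationChannel(channel)
        }
        val notification = NotificationCompat.Builder(this, channelId)
            .setContentTitle("Playing USB audio")
            .setSmallIcon(android.R.drawable.ic_media_play)   // placeholder icon
            .build()
        // Promoting the service to the foreground is what keeps playback from being
        // throttled when the Activity is no longer visible.
        startForeground(1, notification)
        return START_STICKY
    }

    override fun onDestroy() {
        player.stop()
        super.onDestroy()
    }
}

The Activity would start it with ContextCompat.startForegroundService(...), bind to it to obtain the same AudioTrackPlayer instance, and keep calling write() with the USB packets; the Surface and video handling can stay in the Activity. On API 29+ the service can also be declared with android:foregroundServiceType="mediaPlayback" in the manifest.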

Related

Google Speech to Text does not return results

I want the Google Speech-to-Text API to recognize a short phrase after I press a button, so I came up with the following code. But it keeps returning no results. I'm quite confused: there is data in there (the buffer etc.), the mic works well and is enabled in the emulator, and the Google console doesn't show any errors.
Here's my code.
Click listener that starts the recording:
val clicker: View.OnClickListener = View.OnClickListener {
    Log.d(TAG, "Starting record thread")
    mAudioRecorder.record(LISTEN_TIME_MILLIS)
}
mReadButton.setOnClickListener(clicker)
Here's a broadcast receiver that processes the results and tries to send them to Google:
private val broadCastReceiver = object : BroadcastReceiver() {
    override fun onReceive(contxt: Context?, intent: Intent?) {
        if (intent!!.getBooleanExtra(RECORDING_SUCCESS, false)) {
            val byteArrayExtra = intent.getByteArrayExtra(RECORDING_AUDIO)
            val audioResultByteString: ByteString = ByteString.copyFrom(byteArrayExtra)
            if (audioResultByteString.size() > 0) {
                val audio: RecognitionAudio = RecognitionAudio.newBuilder()
                    .setContent(audioResultByteString).build()
                val resultsList = mSpeechClient.recognize(config, audio).resultsList
                if (resultsList.size > 0) {
                    for (result in resultsList) {
                        val resultText = result.alternativesList[0].transcript
                    }
                }
                Log.d(TAG, "- Done recognition. Result Qty: ${resultsList.size}")
            }
        }
    }
}
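The config object passed to mSpeechClient.recognize(config, audio) is not shown in the question. For reference, with the Google Cloud Speech client it would typically look like the sketch below; the sample rate, encoding and language code here are assumptions and must match what the AudioRecord actually produces.

// Sketch of a RecognitionConfig for raw 16-bit PCM input; values are assumptions, not from the question.
val config: RecognitionConfig = RecognitionConfig.newBuilder()
    .setEncoding(RecognitionConfig.AudioEncoding.LINEAR16) // raw PCM as delivered by AudioRecord
    .setSampleRateHertz(16000)                             // must equal SAMPLE_RATE_HZ used for recording
    .setLanguageCode("en-US")
    .build()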
Here is the AudioRecorder class function, which does the recording:
fun record(listenTimeMillis: Long) {
    val byteString: ByteString = ByteString.EMPTY
    mAudioRecorder = initAudioRecorder()
    val mBuffer = ByteArray(4 * AudioRecord.getMinBufferSize(SAMPLE_RATE_HZ, CHANNEL, ENCODING))
    mAudioRecorder!!.startRecording()
    Thread {
        Process.setThreadPriority(Process.THREAD_PRIORITY_BACKGROUND)
        Thread.sleep(listenTimeMillis)
        val read = mAudioRecorder!!.read(mBuffer, 0, mBuffer.size, AudioRecord.READ_NON_BLOCKING)
        val intent = Intent(RECORDING_COMPLETED_INTENT)
        try {
            if (read > 0) {
                intent.putExtra(RECORDING_AUDIO, mBuffer)
                intent.putExtra(RECORDING_SUCCESS, true)
            }
            LocalBroadcastManager.getInstance(context).sendBroadcast(intent)
        } catch (e: Exception) {
            Log.e(TAG, e.stackTrace.toString())
        }
        releaseAudioRecorder()
    }.start()
}
I solved this. The culprit was a buffer size that was too small: the recognition server was actually getting only about half a second of audio, which it obviously couldn't recognize.
val mBuffer = ByteArray(4 * AudioRecord.getMinBufferSize(SAMPLE_RATE_HZ, CHANNEL, ENCODING))
Instead of 4 I put 200, and instead of AudioRecord.READ_NON_BLOCKING I used AudioRecord.READ_BLOCKING, reading into the buffer in a loop and increasing the offset on each iteration. Then it started working.
val startTime = System.currentTimeMillis()
var deltaTime = 0L
var offset = 0
val intent = Intent(RECORDING_COMPLETED_INTENT)
val readChunk = 512
while (deltaTime < listenTimeMillis && offset < mBuffer.size) {
    val read = mAudioRecord!!.read(mBuffer, offset, readChunk, AudioRecord.READ_BLOCKING)
    if (read < 0) {
        intent.putExtra(RECORDING_SUCCESS, false)
        break // if the read fails, stop here
    }
    deltaTime = System.currentTimeMillis() - startTime // startTime bounds the loop so it only listens for the specified amount of time
    offset += readChunk
}
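One way to see why the original buffer was too small: the buffer has to hold the whole listening window, so its size should follow from the sample rate and the listening duration rather than from a fixed multiple of getMinBufferSize. A rough sizing sketch, assuming 16-bit PCM mono (adjust for your CHANNEL and ENCODING):

// Rough sizing sketch; assumes 2 bytes per sample and one channel.
val bytesPerSecond = SAMPLE_RATE_HZ * 2
val neededBytes = (bytesPerSecond * listenTimeMillis / 1000).toInt()
// Round up to a whole number of minimal buffers so the chunked reads fit cleanly.
val minBuf = AudioRecord.getMinBufferSize(SAMPLE_RATE_HZ, CHANNEL, ENCODING)
val mBuffer = ByteArray(((neededBytes + minBuf - 1) / minBuf) * minBuf)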

Enable Audio method issue in android agora rtc sdk

I am using interactive video broadcasting in my app.
I am attaching the class in which I am using live streaming.
I am getting an audio issue when I go back from the live streaming screen to the previous screen: I can still hear the host's audio.
Previously I was calling the leaveChannel method and destroying the RTC client object, but with that in place, going back from the streaming class closed the screen for all users of the app because of the leaveChannel call. After that, I removed this from my onDestroy method.
Now I am using the disableAudio method, which does disable the audio, but when I open the live streaming class again it doesn't enable audio. The enableAudio method is not working. I have also tried the muteLocalAudioStream method and the onUserMuteAudio handler.
I am getting this error:
"LiveStreamingActivity has leaked IntentReceiver io.agora.rtc.internal.AudioRoutingController$HeadsetBroadcastReceiver@101a7a7 that was originally registered here. Are you missing a call to unregisterReceiver()? android.app.IntentReceiverLeaked: Activity com.allin.activities.home.homeActivities.LiveStreamingActivity has leaked IntentReceiver io.agora.rtc.internal.AudioRoutingController$HeadsetBroadcastReceiver@101a7a7 that was originally registered here. Are you missing a call to unregisterReceiver()?"
The receiver is registered inside the SDK, and the exception comes from inside the SDK, which is a jar file I can't edit.
Please help me resolve this issue, as I have to release the app on the Play Store.
// First I tried this, but it automatically stops the streams of the other devices.
override fun onDestroy() {
    /* if (mRtcEngine != null) {
        leaveChannel()
        RtcEngine.destroy(mRtcEngine)
        mRtcEngine = null
    }*/
    // Second, I tried disabling the audio so that the user will not hear the host's voice.
    if (mRtcEngine != null) {
        mRtcEngine!!.disableAudio()
    }
    super.onDestroy()
}

// Then, when I come back from the previous screen to the live streaming activity, everything is initialized again but the audio is not audible.
override fun onResume() {
    super.onResume()
    Log.e("resume", "resume")
    if (mRtcEngine != null) {
        mRtcEngine!!.enableAudio()
        // mRtcEngine!!.resumeAudio()
    }
}
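If keeping the engine alive is the goal, an alternative worth trying instead of disableAudio()/enableAudio() is to mute all remote audio streams while this screen is not in front (muteAllRemoteAudioStreams already appears, commented out, in setupVideoProfile further down). A sketch, assuming the same mRtcEngine field; the onResume here would take the place of the enableAudio() call above:

override fun onPause() {
    super.onPause()
    // Silence the host while this screen is not visible, without disabling the audio module.
    mRtcEngine?.muteAllRemoteAudioStreams(true)
}

override fun onResume() {
    super.onResume()
    mRtcEngine?.muteAllRemoteAudioStreams(false)
}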
The code I am using:
//agora rtc engine and handler initialization-----------------
private var mRtcEngine: RtcEngine? = null
private var mRtcEventHandler = object : IRtcEngineEventHandler() {
@SuppressLint("LongLogTag")
override fun onFirstRemoteVideoDecoded(uid: Int, width: Int, height: Int, elapsed: Int) {
}
override fun onUserOffline(uid: Int, reason: Int) {
runOnUiThread {
val a = reason //if login =0 user is offline
try {
if (mUid == uid) {
if (surfaceView?.parent != null)
(surfaceView?.parent as ViewGroup).removeAllViews()
if (mRtcEngine != null) {
leaveChannel()
RtcEngine.destroy(mRtcEngine)
mRtcEngine = null
}
setResult(IntentConstants.REQUEST_CODE_LIVE_STREAMING)
finish()
}
} catch (e: Exception) {
e.printStackTrace()
}
}
}
override fun onUserMuteVideo(uid: Int, muted: Boolean) {
runOnUiThread {
// onRemoteUserVideoMuted(uid, muted);
Log.e("video","muted")
}
}
override fun onAudioQuality(uid: Int, quality: Int, delay: Short, lost: Short) {
super.onAudioQuality(uid, quality, delay, lost)
Log.e("", "")
}
override fun onUserJoined(uid: Int, elapsed: Int) {
// super.onUserJoined(uid, elapsed)
mUid = uid
runOnUiThread {
try {
setupRemoteVideo(mUid!!)
} catch (e: Exception) {
e.printStackTrace()
}
}
Log.e("differnt_uid----", mUid.toString())
}
}
private fun initAgoraEngineAndJoinChannel() {
if(mRtcEngine==null)
{
initializeAgoraEngine()
setupVideoProfile()
}
}
//initializing rtc engine class
@Throws(Exception::class)
private fun initializeAgoraEngine() {
try {
var s = RtcEngine.getSdkVersion()
mRtcEngine = RtcEngine.create(baseContext, AgoraConstants.APPLICATION_ID, mRtcEventHandler)
} catch (e: Exception) {
// Log.e(LOG_TAG, Log.getStackTraceString(e));
throw RuntimeException("NEED TO check rtc sdk init fatal error\n" + Log.getStackTraceString(e))
}
}
@Throws(Exception::class)
private fun setupVideoProfile() {
//mRtcEngine?.muteAllRemoteAudioStreams(true)
// mLogger.log("channelName account = " + channelName + ",uid = " + 0);
mRtcEngine?.enableVideo()
//mRtcEngine.clearVideoCompositingLayout();
mRtcEngine?.enableLocalVideo(false)
mRtcEngine?.setEnableSpeakerphone(false)
mRtcEngine?.muteLocalAudioStream(true)
joinChannel()
mRtcEngine?.setVideoProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING, true)
mRtcEngine?.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING)
mRtcEngine?.setClientRole(Constants.CLIENT_ROLE_AUDIENCE,"")
val speaker = mRtcEngine?.isSpeakerphoneEnabled
val camerafocus = mRtcEngine?.isCameraAutoFocusFaceModeSupported
Log.e("", "")
}
@Throws(Exception::class)
private fun setupRemoteVideo(uid: Int) {
val container = findViewById<FrameLayout>(R.id.fl_video_container)
if (container.childCount >= 1) {
return
}
surfaceView = RtcEngine.CreateRendererView(baseContext)
container.addView(surfaceView)
mRtcEngine?.setupRemoteVideo(VideoCanvas(surfaceView, VideoCanvas.RENDER_MODE_HIDDEN, uid))
mRtcEngine?.setRemoteVideoStreamType(uid, 1)
mRtcEngine?.setCameraAutoFocusFaceModeEnabled(false)
mRtcEngine?.muteRemoteAudioStream(uid, false)
mRtcEngine?.adjustPlaybackSignalVolume(0)
// mRtcEngine.setVideoProfile(Constants.VIDEO_PROFILE_180P, false); // Earlier than 2.3.0
surfaceView?.tag = uid // for mark purpose
val audioManager: AudioManager = this@LiveStreamingActivity.getSystemService(Context.AUDIO_SERVICE) as AudioManager
//audioManager.mode = AudioManager.MODE_IN_CALL
val isConnected: Boolean = audioManager.isWiredHeadsetOn
if (isConnected) {
/* audioManager.isSpeakerphoneOn = false
audioManager.isWiredHeadsetOn = true*/
mRtcEngine?.setEnableSpeakerphone(false)
mRtcEngine?.setDefaultAudioRoutetoSpeakerphone(false)
mRtcEngine?.setSpeakerphoneVolume(0)
mRtcEngine?.enableInEarMonitoring(true)
// Sets the in-ear monitoring volume to 50% of original volume.
mRtcEngine?.setInEarMonitoringVolume(200)
mRtcEngine?.adjustPlaybackSignalVolume(200)
} else {
/* audioManager.isSpeakerphoneOn = true
audioManager.isWiredHeadsetOn = false*/
mRtcEngine?.setEnableSpeakerphone(true)
mRtcEngine?.setDefaultAudioRoutetoSpeakerphone(true)
mRtcEngine?.setSpeakerphoneVolume(50)
mRtcEngine?.adjustPlaybackSignalVolume(50)
mRtcEngine?.enableInEarMonitoring(false)
// Sets the in-ear monitoring volume to 50% of original volume.
mRtcEngine?.setInEarMonitoringVolume(0)
}
Log.e("", "")
}
@Throws(Exception::class)
private fun joinChannel() {
mRtcEngine?.joinChannel(
null,
AgoraConstants.CHANNEL_NAME,
"Extra Optional Data",
0
) // if you do not specify the uid, we will generate the uid for you
}
@Throws(Exception::class)
private fun leaveChannel() {
mRtcEngine!!.leaveChannel()
}
I think you first want to put setupRemoteVideo in the onFirstRemoteVideoDecoded callback instead of the onUserJoined callback. Also, in the onDestroy callback you should call RtcEngine.destroy() instead of RtcEngine.destroy(mRtcEngine).
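A sketch of what that suggestion could look like, using only calls that already appear in the question:

override fun onFirstRemoteVideoDecoded(uid: Int, width: Int, height: Int, elapsed: Int) {
    // Attach the remote video once a frame has actually been decoded.
    runOnUiThread {
        mUid = uid
        try {
            setupRemoteVideo(uid)
        } catch (e: Exception) {
            e.printStackTrace()
        }
    }
}

override fun onDestroy() {
    if (mRtcEngine != null) {
        leaveChannel()       // disconnects only this client from the channel
        RtcEngine.destroy()  // static destroy, as suggested above
        mRtcEngine = null
    }
    super.onDestroy()
}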

Android WebRTC doesn't work on Different network - No Video

I am trying to stream video from a Raspberry Pi to an Android device via WebRTC, using Firebase (Firestore) for signalling. The setup works while both ends are connected to the same Wi-Fi, but it fails when different networks are used.
Device - RPI
Client
1) Web client (hosted on firebase)
2) Android App
On the same network (Wi-Fi), both clients are able to play video and audio from the device.
But when the device and the clients are on different networks, the web client can still show video while the Android app cannot.
Signalling works correctly: the camera and microphone start on the device and ICE candidates are exchanged successfully. I also get the remote stream added on Android (onAddStream is called), but no video or audio plays.
Android PeerConnectionClient
class PeerConnectionClient(private val activity: MainActivity, private val fSignalling: FSignalling) {
internal var isVideoRunning = false
private val rootEglBase by lazy {
EglBase.create()
}
private val peerConnectionFactory: PeerConnectionFactory by lazy {
val initializationOptions = PeerConnectionFactory.InitializationOptions.builder(activity).createInitializationOptions()
PeerConnectionFactory.initialize(initializationOptions)
val options = PeerConnectionFactory.Options()
val defaultVideoEncoderFactory = DefaultVideoEncoderFactory(rootEglBase.eglBaseContext, true, true)
val defaultVideoDecoderFactory = DefaultVideoDecoderFactory(rootEglBase.eglBaseContext)
PeerConnectionFactory.builder()
.setOptions(options)
.setVideoEncoderFactory(defaultVideoEncoderFactory)
.setVideoDecoderFactory(defaultVideoDecoderFactory)
.createPeerConnectionFactory()
}
private val iceServersList = mutableListOf("stun:stun.l.google.com:19302")
private var sdpConstraints: MediaConstraints? = null
private var localAudioTrack: AudioTrack? = null
private var localPeer: PeerConnection? = null
private var gotUserMedia: Boolean = false
private var peerIceServers: MutableList<PeerConnection.IceServer> = ArrayList()
init {
peerIceServers.add(PeerConnection.IceServer.builder(iceServersList).createIceServer())
// activity.surface_view.release()
activity.surface_view.init(rootEglBase.eglBaseContext, null)
activity.surface_view.setZOrderMediaOverlay(true)
createPeer()
}
private fun createPeer() {
sdpConstraints = MediaConstraints()
val audioconstraints = MediaConstraints()
val audioSource = peerConnectionFactory.createAudioSource(audioconstraints)
localAudioTrack = peerConnectionFactory.createAudioTrack("101", audioSource)
gotUserMedia = true
activity.runOnUiThread {
if (localAudioTrack != null) {
createPeerConnection()
// doCall()
}
}
}
/**
* Creating the local peerconnection instance
*/
private fun createPeerConnection() {
val constraints = MediaConstraints()
constraints.mandatory.add(MediaConstraints.KeyValuePair("offerToReceiveAudio", "true"))
constraints.mandatory.add(MediaConstraints.KeyValuePair("offerToReceiveVideo", "true"))
constraints.optional.add(MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"))
val rtcConfig = PeerConnection.RTCConfiguration(peerIceServers)
// TCP candidates are only useful when connecting to a server that supports
// ICE-TCP.
rtcConfig.enableDtlsSrtp = true
rtcConfig.enableRtpDataChannel = true
// rtcConfig.tcpCandidatePolicy = PeerConnection.TcpCandidatePolicy.DISABLED
// rtcConfig.bundlePolicy = PeerConnection.BundlePolicy.MAXBUNDLE
// rtcConfig.rtcpMuxPolicy = PeerConnection.RtcpMuxPolicy.REQUIRE
// rtcConfig.continualGatheringPolicy = PeerConnection.ContinualGatheringPolicy.GATHER_CONTINUALLY
// Use ECDSA encryption.
// rtcConfig.keyType = PeerConnection.KeyType.ECDSA
localPeer = peerConnectionFactory.createPeerConnection(rtcConfig, constraints, object : PeerObserver {
override fun onIceCandidate(p0: IceCandidate) {
super.onIceCandidate(p0)
onIceCandidateReceived(p0)
}
override fun onAddStream(p0: MediaStream) {
activity.showToast("Received Remote stream")
super.onAddStream(p0)
gotRemoteStream(p0)
}
})
addStreamToLocalPeer()
}
/**
* Adding the stream to the localpeer
*/
private fun addStreamToLocalPeer() {
//creating local mediastream
val stream = peerConnectionFactory.createLocalMediaStream("102")
stream.addTrack(localAudioTrack)
localPeer!!.addStream(stream)
}
/**
* This method is called when the app is initiator - We generate the offer and send it over through socket
* to remote peer
*/
/*private fun doCall() {
localPeer!!.createOffer(object : mySdpObserver {
override fun onCreateSuccess(p0: SessionDescription) {
super.onCreateSuccess(p0)
localPeer!!.setLocalDescription(object: mySdpObserver {}, p0)
Log.d("onCreateSuccess", "SignallingClient emit ")
}
}, sdpConstraints)
}*/
private fun onIceCandidateReceived(iceCandidate: IceCandidate) {
//we have received ice candidate. We can set it to the other peer.
if (localPeer == null) {
return
}
val message = JSONObject()
message.put("type", "candidate")
message.put("label", iceCandidate.sdpMLineIndex)
message.put("id", iceCandidate.sdpMid)
message.put("candidate", iceCandidate.serverUrl)
fSignalling.doSignalingSend(message.toString())
}
private fun gotRemoteStream(stream: MediaStream) {
isVideoRunning = true
//we have remote video stream. add to the renderer.
val videoTrack = stream.videoTracks[0]
videoTrack.setEnabled(true)
activity.runOnUiThread {
try {
// val remoteRenderer = VideoRenderer(surface_view)
activity.surface_view.visibility = View.VISIBLE
// videoTrack.addRenderer(remoteRenderer)
videoTrack.addSink(activity.surface_view)
} catch (e: Exception) {
e.printStackTrace()
}
}
}
fun onReceivePeerMessage(data: JSONObject) {
if (data.getString("type") == "offer") {
// val sdpReturned = SdpUtils.forceChosenVideoCodec(data.getString("sdp"), "H264")
val sdpReturned = data.getString("sdp")
// data.remove("sdp")
// data.put("sdp", sdpReturned)
val sessionDescription = SessionDescription(SessionDescription.Type.OFFER, sdpReturned)
localPeer?.setRemoteDescription(object: mySdpObserver { }, sessionDescription)
localPeer?.createAnswer(object : mySdpObserver {
override fun onCreateSuccess(p0: SessionDescription) {
super.onCreateSuccess(p0)
localPeer!!.setLocalDescription( object : mySdpObserver {}, p0)
val description = JSONObject()
description.put("type", p0.type.canonicalForm())
description.put("sdp", p0.description)
this@PeerConnectionClient.fSignalling.doSignalingSend(description.toString())
}
override fun onCreateFailure(p0: String) {
super.onCreateFailure(p0)
activity.showToast("Failed to create answer")
}
}, MediaConstraints())
} else if (data.getString("type") == "candidate") {
val iceCandidates = IceCandidate(data.getString("id"), data.getInt("label"), data.getString("candidate"))
localPeer?.addIceCandidate(iceCandidates)
}
}
internal fun close() {
isVideoRunning = false
localPeer?.close()
localPeer = null
}
}
I am under the impression that if the web client can display video on a different network (mobile hotspot), the Android client on the same network as that web client should be able to display video as well. Is that wrong?
Why won't Android display video, even though onAddStream is called?
Is a TURN server required? Again, my assumption is that if the web client works, so should Android. The service I am using on the RPi does not have TURN server support.
Additional info:
The device is behind a double-NATed ISP (I guess), but since the web client can connect, I assume that is not the issue.
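On the TURN question: when one or both ends sit behind restrictive or double NAT, a STUN-only configuration can fail for one peer even though another peer connects fine, because each peer-to-peer connection negotiates its own candidate pairs. If a TURN server is available, adding it to peerIceServers is a small change; the host and credentials below are placeholders, not real values:

// Placeholder TURN entry; URL, username and password are assumptions.
peerIceServers.add(
    PeerConnection.IceServer.builder("turn:turn.example.com:3478")
        .setUsername("user")
        .setPassword("pass")
        .createIceServer()
)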
I have found a solution to the issue. I was using:
private fun onIceCandidateReceived(iceCandidate: IceCandidate) {
//we have received ice candidate. We can set it to the other peer.
if (localPeer == null) {
return
}
val message = JSONObject()
message.put("type", "candidate")
message.put("label", iceCandidate.sdpMLineIndex)
message.put("id", iceCandidate.sdpMid)
message.put("candidate", iceCandidate.serverUrl)
fSignalling.doSignalingSend(message.toString())
}
Instead, I was required to use:
message.put("candidate", iceCandidate.sdp) // iceCandidate.serverUrl)

ACRCloud integration to android app

I have the following code for music recognition. I am using an IntentService to do all the music recognition in the service. I have done all the basic steps, such as adding the required permissions and adding the ACRCloud Android SDK to the project.
class SongIdentifyService(discoverPresenter : DiscoverPresenter? = null) : IACRCloudListener , IntentService("SongIdentifyService") {
private val callback : SongIdentificationCallback? = discoverPresenter
private val mClient : ACRCloudClient by lazy { ACRCloudClient() }
private val mConfig : ACRCloudConfig by lazy { ACRCloudConfig() }
private var initState : Boolean = false
private var mProcessing : Boolean = false
override fun onHandleIntent(intent: Intent?) {
Log.d("SongIdentifyService", "onHandeIntent called" )
setUpConfig()
addConfigToClient()
if (callback != null) {
startIdentification(callback)
}
}
public fun setUpConfig(){
Log.d("SongIdentifyService", "setupConfig called")
this.mConfig.acrcloudListener = this#SongIdentifyService
this.mConfig.host = "some-host"
this.mConfig.accessKey = "some-accesskey"
this.mConfig.accessSecret = "some-secret"
this.mConfig.protocol = ACRCloudConfig.ACRCloudNetworkProtocol.PROTOCOL_HTTP // PROTOCOL_HTTPS
this.mConfig.reqMode = ACRCloudConfig.ACRCloudRecMode.REC_MODE_REMOTE
}
// Called to start identifying/discovering the song that is currently playing
fun startIdentification(callback: SongIdentificationCallback)
{
Log.d("SongIdentifyService", "startIdentification called")
if(!initState)
{
Log.d("AcrCloudImplementation", "init error")
}
if(!mProcessing) {
mProcessing = true
if (!mClient.startRecognize()) {
mProcessing = false
Log.d("AcrCloudImplementation" , "start error")
}
}
}
// Called to stop identifying/discovering song
fun stopIdentification()
{
Log.d("SongIdentifyService", "stopIdentification called")
if(mProcessing)
{
mClient.stopRecordToRecognize()
}
mProcessing = false
}
fun cancelListeningToIdentifySong()
{
if(mProcessing)
{
mProcessing = false
mClient.cancel()
}
}
fun addConfigToClient(){
Log.d("SongIdentifyService", "addConfigToClient called")
this.initState = this.mClient.initWithConfig(this.mConfig)
if(this.initState)
{
this.mClient.startPreRecord(3000)
}
}
override fun onResult(result: String?) {
Log.d("SongIdentifyService", "onResult called")
Log.d("SongIdentifyService",result)
mClient.cancel()
mProcessing = false
val result = Gson().fromJson(result, SongIdentificationResult :: class.java)
if(result.status.code == 3000)
{
callback!!.onOfflineError()
}
else if(result.status.code == 1001)
{
callback!!.onSongNotFound()
}
else if(result.status.code == 0 )
{
callback!!.onSongFound(MusicDataMapper().convertFromDataModel(result))
//callback!!.onSongFound(Song("", "", ""))
}
else
{
callback!!.onGenericError()
}
}
override fun onVolumeChanged(p0: Double) {
TODO("not implemented") //To change body of created functions use File | Settings | File Templates.
}
interface SongIdentificationCallback {
// Called when the user is offline and music identification failed
fun onOfflineError()
// Called when a generic error occurs and music identification failed
fun onGenericError()
// Called when music identification completed but couldn't identify the song
fun onSongNotFound()
// Called when identification completed and a matching song was found
fun onSongFound(song: Song)
}
}
Now when I start the service I get the following error:
I checked the implementation of ACRCloudClient and it extends the Android Activity class. ACRCloudClient also uses shared preferences, which is why I am getting a NullPointerException.
Since keeping a reference to an Activity in a Service is not a good idea, it is best to implement the above code in the Activity itself. All the recognition work is done on a separate thread inside the ACRCloudClient class anyway, so there is no point in creating another service for it.
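A sketch of what moving it into the Activity could look like, reusing the calls from the class above; the Activity name is a placeholder and the config values are the same dummies as in the question:

class DiscoverActivity : AppCompatActivity(), IACRCloudListener {

    private val mClient = ACRCloudClient()
    private val mConfig = ACRCloudConfig()
    private var initState = false

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        mConfig.acrcloudListener = this
        mConfig.host = "some-host"
        mConfig.accessKey = "some-accesskey"
        mConfig.accessSecret = "some-secret"
        mConfig.reqMode = ACRCloudConfig.ACRCloudRecMode.REC_MODE_REMOTE
        // initWithConfig is called from a real Activity here, in line with the reasoning above.
        initState = mClient.initWithConfig(mConfig)
    }

    fun startIdentification() {
        if (initState && mClient.startRecognize()) {
            // Recognition runs on the SDK's own worker thread; the result arrives in onResult().
        }
    }

    override fun onResult(result: String?) {
        // Parse the JSON result as in the service code above.
    }

    override fun onVolumeChanged(p0: Double) { }

    override fun onDestroy() {
        mClient.cancel()
        super.onDestroy()
    }
}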

ExoPlayer not working with Ads

I have implemented the ExoPlayer in my application using the example from the codelab https://codelabs.developers.google.com/codelabs/exoplayer-intro/#3, along with the example from https://medium.com/google-exoplayer/playing-ads-with-exoplayer-and-ima-868dfd767ea. The only difference is that I use AdsMediaSource instead of the deprecated ImaAdsMediaSource.
My Implementation is this:
class HostVideoFullFragment : Fragment(), AdsMediaSource.MediaSourceFactory {
override fun getSupportedTypes() = intArrayOf(C.TYPE_DASH, C.TYPE_HLS, C.TYPE_OTHER)
override fun createMediaSource(uri: Uri?, handler: Handler?, listener: MediaSourceEventListener?): MediaSource {
@C.ContentType val type = Util.inferContentType(uri)
return when (type) {
C.TYPE_DASH -> {
DashMediaSource.Factory(
DefaultDashChunkSource.Factory(mediaDataSourceFactory),
manifestDataSourceFactory)
.createMediaSource(uri, handler, listener)
}
C.TYPE_HLS -> {
HlsMediaSource.Factory(mediaDataSourceFactory)
.createMediaSource(uri, handler, listener)
}
C.TYPE_OTHER -> {
ExtractorMediaSource.Factory(mediaDataSourceFactory)
.createMediaSource(uri, handler, listener)
}
else -> throw IllegalStateException("Unsupported type for createMediaSource: $type")
}
}
private var player: SimpleExoPlayer? = null
private lateinit var playerView: SimpleExoPlayerView
private lateinit var binding: FragmentHostVideoFullBinding
private var playbackPosition: Long = 0
private var currentWindow: Int = 0
private var playWhenReady = true
private var inErrorState: Boolean = false
private lateinit var adsLoader: ImaAdsLoader
private lateinit var manifestDataSourceFactory: DataSource.Factory
private lateinit var mediaDataSourceFactory: DataSource.Factory
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
//Initialize the adsLoader
adsLoader = ImaAdsLoader(activity as Context, Uri.parse("https://pubads.g.doubleclick.net/gampad/ads?sz=640x480&iu=/124319096/external/ad_rule_samples&ciu_szs=300x250&ad_rule=1&impl=s&gdfp_req=1&env=vp&output=vmap&unviewed_position_start=1&cust_params=deployment%3Ddevsite%26sample_ar%3Dpremidpost&cmsid=496&vid=short_onecue&correlator="))
manifestDataSourceFactory = DefaultDataSourceFactory(
context, Util.getUserAgent(context, "BUO-APP"))//TODO change the applicationName with the right application name
//
mediaDataSourceFactory = DefaultDataSourceFactory(
context,
Util.getUserAgent(context, "BUO-APP"),//TODO change the applicationName with the right application name
DefaultBandwidthMeter())
}
private fun initializePlayer() {
/*
* Since the player can change from null (when we release resources) to not null we have to check if it's null.
* If it is then reset again
* */
if (player == null) {
//Initialize the Exo Player
player = ExoPlayerFactory.newSimpleInstance(DefaultRenderersFactory(activity as Context),
DefaultTrackSelector())
}
val uri = Uri.parse(videoURl)
val mediaSourceWithAds = buildMediaSourceWithAds(uri)
//Bind the view from the xml to the SimpleExoPlayer instance
playerView.player = player
//Add the listener that listens for errors
player?.addListener(PlayerEventListener())
player?.seekTo(currentWindow, playbackPosition)
player?.prepare(mediaSourceWithAds, true, false)
//In case we could not set the exo player
player?.playWhenReady = playWhenReady
//We got here without an error, therefore set the inErrorState as false
inErrorState = false
//Re update the retry button since, this method could have come from a retry click
updateRetryButton()
}
private inner class PlayerEventListener : Player.DefaultEventListener() {
fun updateResumePosition() {
player?.let {
currentWindow = player!!.currentWindowIndex
playbackPosition = Math.max(0, player!!.contentPosition)
}
}
override fun onPlayerStateChanged(playWhenReady: Boolean, playbackState: Int) {
//The player state has ended
//TODO check if there is going to be a UI change here
// if (playbackState == Player.STATE_ENDED) {
// showControls()
// }
// updateButtonVisibilities()
}
override fun onPositionDiscontinuity(@Player.DiscontinuityReason reason: Int) {
if (inErrorState) {
// This will only occur if the user has performed a seek whilst in the error state. Update
// the resume position so that if the user then retries, playback will resume from the
// position to which they seek.
updateResumePosition()
}
}
override fun onPlayerError(e: ExoPlaybackException?) {
var errorString: String? = null
//Check what was the error so that we can show the user what was the correspond problem
if (e?.type == ExoPlaybackException.TYPE_RENDERER) {
val cause = e.rendererException
if (cause is MediaCodecRenderer.DecoderInitializationException) {
// Special case for decoder initialization failures.
errorString = if (cause.decoderName == null) {
when {
cause.cause is MediaCodecUtil.DecoderQueryException -> getString(R.string.error_querying_decoders)
cause.secureDecoderRequired -> getString(R.string.error_no_secure_decoder,
cause.mimeType)
else -> getString(R.string.error_no_decoder,
cause.mimeType)
}
} else {
getString(R.string.error_instantiating_decoder,
cause.decoderName)
}
}
}
if (errorString != null) {
//Show the toast with the proper error
Toast.makeText(activity as Context, errorString, Toast.LENGTH_LONG).show()
}
inErrorState = true
if (isBehindLiveWindow(e)) {
clearResumePosition()
initializePlayer()
} else {
updateResumePosition()
updateRetryButton()
}
}
}
private fun clearResumePosition() {
//Clear the current resume position, since there was an error
currentWindow = C.INDEX_UNSET
playbackPosition = C.TIME_UNSET
}
/*
* This is for the multi window support
* */
private fun isBehindLiveWindow(e: ExoPlaybackException?): Boolean {
if (e?.type != ExoPlaybackException.TYPE_SOURCE) {
return false
}
var cause: Throwable? = e.sourceException
while (cause != null) {
if (cause is BehindLiveWindowException) {
return true
}
cause = cause.cause
}
return false
}
private fun buildMediaSourceWithAds(uri: Uri): MediaSource {
/*
* This content media source is the video itself without the ads
* */
val contentMediaSource = ExtractorMediaSource.Factory(
DefaultHttpDataSourceFactory("BUO-APP")).createMediaSource(uri) //TODO change the user agent
/*
* The method constructs and returns a ExtractorMediaSource for the given uri.
* We simply use a new DefaultHttpDataSourceFactory which only needs a user agent string.
* By default the factory will use a DefaultExtractorFactory for the media source.
* This supports almost all non-adaptive audio and video formats supported on Android. It will recognize our mp3 file and play it nicely.
* */
return AdsMediaSource(
contentMediaSource,
/* adMediaSourceFactory= */ this,
adsLoader,
playerView.overlayFrameLayout,
/* eventListener= */ null, null)
}
override fun onStart() {
super.onStart()
if (Util.SDK_INT > 23) {
initializePlayer()
}
}
override fun onResume() {
super.onResume()
hideSystemUi()
/*
* Starting with API level 24 Android supports multiple windows.
* As our app can be visible but not active in split window mode, we need to initialize the player in onStart.
* Before API level 24 we wait as long as possible until we grab resources, so we wait until onResume before initializing the player.
* */
if ((Util.SDK_INT <= 23 || player == null)) {
initializePlayer()
}
}
}
The ad never shows, and when it does show it produces a rendering error (E/ExoPlayerImplInternal: Renderer error) that never allows the video to play. I've also run the example code from the IMA ads documentation (https://developers.google.com/interactive-media-ads/docs/sdks/android/) and it doesn't work either. Has anyone implemented ExoPlayer successfully with ads using the latest ExoPlayer library version?
Please help. Thanks!
When running on an emulator, be sure to enable GPU rendering on the virtual device.
The problem is that the emulator cannot render video, which is why neither the ads nor the video were showing. Run the app on a phone and it will work.
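Separately, the fragment above initializes the player in onStart/onResume but never releases it in the code shown; the codelab it follows pairs that with a releasePlayer() in onPause/onStop. A sketch, assuming the same fields as in the fragment:

private fun releasePlayer() {
    player?.let {
        // Remember the position so initializePlayer() can resume from the same spot.
        playbackPosition = it.currentPosition
        currentWindow = it.currentWindowIndex
        playWhenReady = it.playWhenReady
        it.release()
    }
    player = null
}

override fun onPause() {
    super.onPause()
    if (Util.SDK_INT <= 23) {
        releasePlayer()
    }
}

override fun onStop() {
    super.onStop()
    if (Util.SDK_INT > 23) {
        releasePlayer()
    }
}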
