I recently added the ability to cast a radio stream (audio/aac) from my Android radio app to a Chromecast. The initial cast works as expected: the stream starts playing on the selected Chromecast device, and the MediaMetadata I provide is displayed in the Cast dialog and, when casting to a TV or monitor, on screen via the receiver app.
When I first cast the stream I provide the MediaMetadata of the currently playing track. Within my Android app I have a service that notifies me when the track changes and sends out updated metadata. The issue I'm having is that I can't find a way to pass this update on to the Chromecast device so that the TV updates the track metadata and shows the correct title, artist and artwork.
The only partial, hacky way I've found of doing it so far is to call RemoteMediaClient.load again with the same stream info and the updated MediaMetadata. This approach isn't acceptable, as it causes the stream to reload and briefly interrupts playback.
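For reference, the reload workaround looks roughly like this (a sketch only, using the Cast v3 sender classes; STREAM_URL and the track object are placeholders from my app):

    // Build the updated metadata for the current track.
    MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MUSIC_TRACK);
    metadata.putString(MediaMetadata.KEY_TITLE, track.getTitle());
    metadata.putString(MediaMetadata.KEY_ARTIST, track.getArtist());
    metadata.addImage(new WebImage(Uri.parse(track.getArtworkUrl())));

    // Same live stream as before, just with the new metadata attached.
    MediaInfo mediaInfo = new MediaInfo.Builder(STREAM_URL)
            .setStreamType(MediaInfo.STREAM_TYPE_LIVE)
            .setContentType("audio/aac")
            .setMetadata(metadata)
            .build();

    // Re-loading makes the receiver show the new metadata, but it restarts the stream.
    remoteMediaClient.load(mediaInfo, new MediaLoadOptions.Builder()
            .setAutoplay(true)
            .build());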
I do not see any methods in the RemoteMediaClient that allow for simply updating the MediaMetadata.
The only other post I can find on here that somewhat relates to this is this one; however, it is now four years old and refers to an older version of the SDK that has since been deprecated.
I'm pretty sure the functionality I'm after is possible, as I've seen this behaviour in other radio apps that include the Chromecast feature.
Related
I'm trying to use a media session in my app so that background playback, notifications and all that work. The problem is that the documentation isn't very good and some features seem to be missing. The main feature causing me issues is not being able to set media items on the media controller. I found another post asking about the same problem, Android Media3 Session & Controller - Playback not starting, and the solution there does work. However, that only covers a single URI, and I need to be able to play both audio and video somehow. Can I make it work, or am I just doomed to have a media app that can't play audio?
I would've asked in a comment on the solution, but there's a requirement for 50 rep... which is stupid.
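To make the intent concrete, this is roughly what I'm trying to do, assuming the pattern from the linked answer extends from a single item to a list (the URIs and media IDs below are placeholders, and I may well be misusing the API):

    // Controller side (androidx.media3.session.MediaController): queue one audio
    // and one video item.
    MediaItem audio = new MediaItem.Builder()
            .setMediaId("audio-1")
            .setRequestMetadata(new MediaItem.RequestMetadata.Builder()
                    .setMediaUri(Uri.parse("https://example.com/track.mp3"))
                    .build())
            .build();
    MediaItem video = new MediaItem.Builder()
            .setMediaId("video-1")
            .setRequestMetadata(new MediaItem.RequestMetadata.Builder()
                    .setMediaUri(Uri.parse("https://example.com/clip.mp4"))
                    .build())
            .build();
    mediaController.setMediaItems(Arrays.asList(audio, video));
    mediaController.prepare();
    mediaController.play();

    // Session side: the callback from the linked answer, extended so that every
    // item in the list gets its URI restored, not just the first one.
    @Override
    public ListenableFuture<List<MediaItem>> onAddMediaItems(MediaSession mediaSession,
            MediaSession.ControllerInfo controller, List<MediaItem> mediaItems) {
        List<MediaItem> resolved = new ArrayList<>();
        for (MediaItem item : mediaItems) {
            resolved.add(item.buildUpon()
                    .setUri(item.requestMetadata.mediaUri)
                    .build());
        }
        return Futures.immediateFuture(resolved);
    }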
This question might seem like a repetition of questions such as the following:
How to play an audio file on a voice call in android
Background Audio for a Call in Progress - Possible?
The answers to these questions suggest that it is not possible to play a pre-recorded audio file into a voice call in Android. I want to know why it is not possible. What is the limitation (hardware or software)? Is it really a limitation, or is it restricted on purpose? Can we alter the Android source code to make it possible?
I think this is a limitation imposed for security reasons and enforced at the OS level.
Let's analyze the security threat first. If you were able to play custom audio files to the callee, a whole world of abuse opens up: you could trick customer support, pretend to be someone else, give unauthorized purchase confirmations, and so on. For this reason, neither Android nor iOS allows this functionality.
On Android you won't be able to do this programmatically, simply because the current APIs won't allow it. This is stated in the official documentation as well, as pointed out here. If you dig into the source code, you could probably enable this feature by accessing the microphone output during a phone call, but that would require running your own custom build of Android. A good starting point would be the AudioTrack source, available here.
EDIT: a good example of an audio mod is enabling the Nexus 5 earpiece as a second loudspeaker (requires root). It can be found here.
After thorough research, what I have come to know is that there is more than one limitation/hurdle standing in the way. These limitations/hurdles sit at three different levels.
The first limitation is at the API level: there is no high-level API for playing sound files into the conversation audio during a call, as mentioned in the official Android documentation.
The second limitation is at the Radio Interface Layer (RIL). The RIL hands complete control of the call to the radio daemon (rild) in the Linux layer, which in turn passes control to the vendor RIL. That means we cannot manipulate the voice call from the Android source code.
Even if we were able to remove these two limitations, we might still not be able to play an audio file into an ongoing voice call, because there is a third limitation: every vendor has its own RIL library that communicates with the radio daemon (rild). Changing this would require the vendor RIL to be open source, which it is not; hardware vendors do not usually make their device driver code available.
A detailed discussion of this topic can be found at this link.
This is software-related, due to the prioritization of audio routing in Android.
Take a look at CallManager, where you can dig into the method setAudioMode(). After the audio mode has been set to MODE_IN_COMMUNICATION, the following code is called:
    audioManager.requestAudioFocusForCall(AudioManager.STREAM_VOICE_CALL,
            AudioManager.AUDIOFOCUS_GAIN_TRANSIENT);
From this point on the telephony service has the highest priority and won't let any other audio play in parallel.
Note: You can play back the audio data only to the standard output device. Currently, that is the mobile device speaker or a Bluetooth headset. You cannot play sound files in the conversation audio during a call.
See the official documentation:
http://developer.android.com/guide/topics/media/mediaplayer.html
By implementing AudioManager.OnAudioFocusChangeListener you can track the audio focus state. If music is playing in the background, you can observe the AudioManager state changes; whether to play or pause in response is entirely in the developer's hands.
Some of the native music players on Android devices handle this by pausing the music while the call state is TelephonyManager.EXTRA_STATE_OFFHOOK. This is also entirely in the developer's hands (whether to handle it or not); if it isn't handled, both will play in parallel.
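A minimal sketch of that phone-state approach, assuming the app holds READ_PHONE_STATE, registers this receiver for TelephonyManager.ACTION_PHONE_STATE_CHANGED, and passes in its own MediaPlayer:

    import android.content.BroadcastReceiver;
    import android.content.Context;
    import android.content.Intent;
    import android.media.MediaPlayer;
    import android.telephony.TelephonyManager;

    // Pauses the app's own MediaPlayer while a call is ringing or off-hook and
    // resumes it when the phone returns to idle.
    public class CallStateReceiver extends BroadcastReceiver {
        private final MediaPlayer player;

        public CallStateReceiver(MediaPlayer player) {
            this.player = player;
        }

        @Override
        public void onReceive(Context context, Intent intent) {
            String state = intent.getStringExtra(TelephonyManager.EXTRA_STATE);
            if (TelephonyManager.EXTRA_STATE_RINGING.equals(state)
                    || TelephonyManager.EXTRA_STATE_OFFHOOK.equals(state)) {
                if (player.isPlaying()) {
                    player.pause();   // a call is starting or in progress
                }
            } else if (TelephonyManager.EXTRA_STATE_IDLE.equals(state)) {
                player.start();       // the call has ended
            }
        }
    }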
I created a sample app to validate filters for playing video on Chromecast.
When my device connects to the Chromecast, it shows that it is connected and I'm able to play a video on the Chromecast device. If I want to switch to another video, I'm able to switch and the video plays fine as well.
But in this case I want to update the UI to reflect whether the first or the second video is playing. Is there a specific callback for learning the media player status based on the user's selection of different videos?
Thanks in advance!
There are a lot of callbacks/listeners that report the various changes that happen on the Cast side, such as OnMetadataUpdatedListener or OnStatusUpdatedListener (on the Android sender side; similar ones exist on the other platforms). I suggest you read our documentation and look through our sample or reference apps, then come back here with any issues you run into.
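For example, with the current (CAF / Cast v3) sender SDK these updates surface through RemoteMediaClient.Callback; a rough sketch, assuming castSession is the current CastSession (the null check on getRemoteMediaClient() is omitted for brevity):

    final RemoteMediaClient remoteMediaClient = castSession.getRemoteMediaClient();
    remoteMediaClient.registerCallback(new RemoteMediaClient.Callback() {
        @Override
        public void onMetadataUpdated() {
            MediaInfo info = remoteMediaClient.getMediaInfo();
            // e.g. update title/artwork in the sender UI from info.getMetadata()
        }

        @Override
        public void onStatusUpdated() {
            MediaStatus status = remoteMediaClient.getMediaStatus();
            // e.g. update play/pause/buffering UI from status.getPlayerState()
        }
    });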
You can see the player live here:
www.stateofpsychosis.com/media/
The viewable player is a custom player. The default SoundCloud player has been shrunk down to a 1x1px iframe, so it can't be seen without using the developer tools to change its style.
This is only a problem on Chrome for Android.
For some reason I can't get the API to actually play the music. The Previous/Next buttons work in the sense that they query and change the track info, but they won't play the song either. They do change the default SoundCloud widget's play button to a pause button and they update the track info, but no sound actually comes out. If I make the widget visible and press anything in the default player (as opposed to the custom one), it does seem to work.
I, too, am facing the same issue.
It seems that Chrome for Android restricts the ability to trigger HTML5 audio playback: starting the audio from a script is not allowed. This is intended to protect users from extra data usage on their phones.
You can read more about the issue here.
Hopefully they lift this restriction in the near future.
My music application constantly plays music in the background; however, I'd like to be able to detect when another application starts playing audio (such as the YouTube app) so I can pause/mute/stop the audio in my own application.
This will allow a user to continue browsing the web whilst listening to music, but then if they wish to watch a video at any point, they can do so without audio conflict.
One solution might be to listen for a broadcast which states when an application begins using the AudioManager. Does such an Intent Action exist?
Edit: as in the answer provided below, there appears to be a way to detect the loss of audio focus on 2.2 with AudioManager.OnAudioFocusChangeListener.
Great, but is there a solution for the more common versions of Android, ideally 1.5+?
http://developer.android.com/reference/android/media/AudioManager.OnAudioFocusChangeListener.html
This thread also has additional information that might point you in the right direction:
http://groups.google.com/group/android-developers/browse_thread/thread/db6822d84feaac6/219d8cba07795c61?hl=en&lnk=gst&q=OnAudioFocusChangeListener#219d8cba07795c61
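A rough sketch of the 2.2+ approach (player here stands for your own MediaPlayer; as far as I know there is no equivalent callback on 1.5):

    // Request audio focus for music playback; losing it is the signal that another
    // app (e.g. YouTube) has started playing audio. Requires API level 8 (2.2).
    AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);

    AudioManager.OnAudioFocusChangeListener focusListener =
            new AudioManager.OnAudioFocusChangeListener() {
                @Override
                public void onAudioFocusChange(int focusChange) {
                    if (focusChange == AudioManager.AUDIOFOCUS_LOSS
                            || focusChange == AudioManager.AUDIOFOCUS_LOSS_TRANSIENT) {
                        if (player.isPlaying()) {
                            player.pause();   // another app took the audio focus
                        }
                    } else if (focusChange == AudioManager.AUDIOFOCUS_GAIN) {
                        player.start();       // focus regained, resume playback
                    }
                }
            };

    int result = am.requestAudioFocus(focusListener,
            AudioManager.STREAM_MUSIC, AudioManager.AUDIOFOCUS_GAIN);
    if (result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
        player.start();
    }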