I wrote a DMR for Android with the open-source project CyberLink4Java and tested it with the tools the DLNA group released (Golden DMC & Golden DMS). I created a device with a proper description file, and it can now accept pushed playback.
But it doesn't support being controlled by the DMC.
On the DMC UI, the control buttons (play, pause and stop) are greyed out.
There must be some data exchange during DMR service start-up that tells the DMC which controls are supported, but I can't find it in the spec. Any idea what's wrong?
The way to expose the transport-controlling actions that are valid at a given time is the CurrentTransportActions state variable and the corresponding GetCurrentTransportActions action. These features are optional but conditionally required: if you implement one, you need to implement both. CurrentTransportActions is like most variables in AVTransport: it's not evented on its own, but value changes will be included in LastChange events. This is all documented in the AVTransport service definition.
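A minimal sketch of answering that action with CyberLink4Java (org.cybergarage.upnp), which you're already using; the playing flag and the exact action list are placeholders for your renderer's real state:

import org.cybergarage.upnp.Action;
import org.cybergarage.upnp.control.ActionListener;

public class AVTransportActionHandler implements ActionListener {
    private boolean playing; // track this from your real transport state

    @Override
    public boolean actionControlReceived(Action action) {
        if ("GetCurrentTransportActions".equals(action.getName())) {
            // Report a comma-separated list of the currently valid actions.
            String actions = playing ? "Pause,Stop,Seek" : "Play,Stop";
            action.getArgument("Actions").setValue(actions);
            return true;  // success response to the control point
        }
        return false;     // unhandled action -> error response
    }
}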
That said, a DMC can mostly figure out the valid actions from the DMR state even if the above features aren't supported: e.g. if AVTransportURI is empty, showing playback controls doesn't make sense.
How can I specify a custom wakeword name (e.g. "stack overflow" or "party time") in the spokestack-android configuration? I'm looking for something like:
SpeechPipeline pipeline = new SpeechPipeline.Builder()
        .setProperty("wakeword", "stack overflow")
        // ...
        .build();
Update: You can train your own wakeword (without writing code, just by providing audio samples) with a Maker subscription. When training is finished, you can download and configure the custom wakeword the same way you set up the default one.
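A hedged sketch of that configuration, assuming the property names of Spokestack's TensorFlow wakeword trigger; the model paths are placeholders for the files you download:

import io.spokestack.spokestack.SpeechPipeline;

// Property names assume Spokestack's TensorFlow wakeword trigger.
SpeechPipeline pipeline = new SpeechPipeline.Builder()
        .setProperty("wake-filter-path", "/path/to/filter.tflite")
        .setProperty("wake-encode-path", "/path/to/encode.tflite")
        .setProperty("wake-detect-path", "/path/to/detect.tflite")
        .build();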
Currently, Spokestack Android only supports wakeword detection via a binary classifier, so we only recognize "Spokestack". In theory, this could be done via Android's platform ASR, with the caveat that the user would constantly be interrupted by Google Assistant-style audible dings as the ASR request times out and gets restarted, so it'd only be useful for informal demos, not real apps.
That said, it's theoretically possible, so feel free to open an issue, and it might show up in a future version if we get enough demand for it.
I have a remote playback device that is not a Cast device (let's call him Johnny 5 for now).
From a client app, I want to be able to cast content to a Chromecast or to my Johnny 5 device. The app is based on the Cast SDK v3.
In order to integrate my non-cast device, I built a MediaRouteProvider and extended the Session & SessionProvider classes. The SessionProvider is added in the getAdditionalSessionProviders method of my OptionsProvider class.
I managed to make things work: my device appears in the list of detected devices alongside the Chromecast (thanks to the MediaRouteProvider), and when I select it, a session is started and I can cast content to it.
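The registration looks roughly like this sketch (Johnny5SessionProvider stands in for my custom provider; the receiver app id is a placeholder):

import android.content.Context;
import com.google.android.gms.cast.framework.CastOptions;
import com.google.android.gms.cast.framework.OptionsProvider;
import com.google.android.gms.cast.framework.SessionProvider;
import java.util.Collections;
import java.util.List;

public class MyOptionsProvider implements OptionsProvider {
    @Override
    public CastOptions getCastOptions(Context context) {
        return new CastOptions.Builder()
                .setReceiverApplicationId("PLACEHOLDER_APP_ID")
                .build();
    }

    @Override
    public List<SessionProvider> getAdditionalSessionProviders(Context context) {
        // Register the non-cast device's sessions next to Cast's own.
        return Collections.singletonList(new Johnny5SessionProvider(context));
    }
}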
However, it seems the RemoteMediaClient object is Google Cast specific (it cannot be used with non-cast devices), like many Cast SDK features (mini controller, expanded controller...).
My question concerns the Cast dialog, the remote control notification and the lock screen: is it possible to use these with my non-cast device, or do I have to code the whole bunch myself to mimic the Cast SDK features?
Regarding the Cast dialog, I would like to customize it to behave the same way as for the Chromecast, without coding my own device picker or overriding the default button behavior.
The majority of the work would be on your side: coding almost everything for non-cast devices. Cast APIs, as you've noticed, do not interact with Sessions that are not CastSessions. So your option is to write an interface that is implemented by the Cast SDK for cast devices and by you for non-cast devices, and to use that common interface as much as possible. Getting things like the lock screen amounts to creating a MediaStyle notification; the Cast SDK handles that when a CastSession is involved, and you need to create a similar notification when a different type of Session is involved. This is true for almost all UI elements that the Cast SDK provides out of the box.
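A minimal sketch of such a notification for the non-cast session; the channel id, icons, pending intent and media session variable are assumptions:

import androidx.core.app.NotificationCompat;
import androidx.core.app.NotificationManagerCompat;

NotificationCompat.Builder builder =
        new NotificationCompat.Builder(context, "playback_channel")
                .setSmallIcon(R.drawable.ic_cast_notification)
                .setContentTitle(currentTitle)
                .setContentText(currentSubtitle)
                .addAction(R.drawable.ic_pause, "Pause", pausePendingIntent)
                // MediaStyle hooks the notification up to the lock screen.
                .setStyle(new androidx.media.app.NotificationCompat.MediaStyle()
                        .setMediaSession(mediaSession.getSessionToken())
                        .setShowActionsInCompactView(0));
NotificationManagerCompat.from(context).notify(1, builder.build());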
I can't seem to find anything related to finding out which application got audio focus. I can correctly determine from my application what type of focus change occurred, but not which other application caused it. Is there any way to determine which application received focus?
What am I trying to do?
I have managed to record internal sound, whether it be music or voice. If I am recording audio, no matter the source, I want to determine which application took over the focus, to decide what my application needs to do next.
Currently I am using AudioManager.OnAudioFocusChangeListener to stop recording internal sounds once the focus changes, but I want the name of the application that gained the focus.
Short Answer: There's no good solution... and Android probably intended it this way.
Explanation:
Looking at the source code, AudioManager has no APIs (even hidden ones) for checking who has audio focus. AudioManager wraps calls to AudioService, which holds the real audio state. The API that AudioService exposes through its Stub when AudioManager binds to it also has no method for querying the current audio focus. Thus, even through reflection or system-level permissions, you won't be able to get the information you want.
If you're curious how the focus changes are kept track of, you can look at MediaFocusControl whose instance is a member variable of AudioService here.
Untested Hacky Heuristic:
You might be able to get some useful information by looking at UsageStats timestamps (see the sketch below). Once you have the apps that were used within, say, ~500 ms of your losing audio focus, you can cross-check them against apps with audio permissions. You can follow this post to get the permissions of any installed app.
This is clearly a heuristic and could require some tuning. It also requires the user to grant your app access to usage stats. Mileage may vary.
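A hedged sketch of that heuristic; the 500 ms window is a guess to tune, and the user must grant usage access (PACKAGE_USAGE_STATS) in Settings:

import android.app.usage.UsageStats;
import android.app.usage.UsageStatsManager;
import android.content.Context;
import android.util.Log;
import java.util.List;

// Record this timestamp inside onAudioFocusChange when focus is lost.
long focusLostAt = System.currentTimeMillis();
UsageStatsManager usm =
        (UsageStatsManager) context.getSystemService(Context.USAGE_STATS_SERVICE);
List<UsageStats> stats = usm.queryUsageStats(
        UsageStatsManager.INTERVAL_DAILY, focusLostAt - 500, focusLostAt);
for (UsageStats s : stats) {
    if (focusLostAt - s.getLastTimeUsed() <= 500) {
        // Cross-check this package against apps holding audio permissions.
        Log.d("FocusHeuristic", "candidate: " + s.getPackageName());
    }
}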
Look at the MediaController class (new in Lollipop, available in the compatibility library for older versions).
There are these two methods that look interesting:
https://developer.android.com/reference/android/media/session/MediaController.html#getPackageName()
https://developer.android.com/reference/android/media/session/MediaController.html#getSessionActivity()
getPackageName supposedly returns the current session's package name:
http://androidxref.com/5.1.1_r6/xref/frameworks/base/media/java/android/media/session/MediaController.java#397
getSessionActivity gives you a PendingIntent with an activity to start (if one is supplied), from which you could get the package as well.
Used together with your audio listener and a broadcast receiver for phone state (to detect whether the phone is currently ringing), this might give you more fine-grained detection than you currently have. As Trevor Carothers pointed out above, there is no way to get the general app with audio focus.
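If your app has been granted notification-listener access, a sketch of listing the active sessions and their packages; MyListenerService is an assumed NotificationListenerService declared in the manifest:

import android.content.ComponentName;
import android.content.Context;
import android.media.session.MediaController;
import android.media.session.MediaSessionManager;
import android.util.Log;
import java.util.List;

MediaSessionManager msm =
        (MediaSessionManager) context.getSystemService(Context.MEDIA_SESSION_SERVICE);
// Throws SecurityException unless the listener component has been enabled.
List<MediaController> controllers = msm.getActiveSessions(
        new ComponentName(context, MyListenerService.class));
for (MediaController controller : controllers) {
    Log.d("Sessions", "active session owned by: " + controller.getPackageName());
}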
You can use dumpsys audio to find out who is using audio focus. You can also look at the output of dumpsys media_session.
And if you want to find out who is playing music, try dumpsys media.audio_flinger; that's the command I ended up using myself.
According to the Release Notes (of July 8), the docs for the Sender and the updated answer to this question, the Styled Media Receiver of Google Cast now supports closed captioning and subtitle tracks.
However, when I tell the Default or the Styled Media Receiver to show a text track, nothing happens; it does not even load the .vtt from the server, as I can see in the logs.
I can tell the receiver app got the text tracks just fine, but even with the Android example app, the subtitles never show up. According to all the logs, they are being sent and the receiver app is told to show them, yet they never appear and are never even loaded.
The MediaTrack is being created as follows:
new MediaTrack.Builder(2, MediaTrack.TYPE_TEXT)
        .setName("Deutsch")
        .setSubtype(MediaTrack.SUBTYPE_CAPTIONS)
        .setContentId("https://example.com/video/caption_de.vtt")
        .setContentType("text/vtt")
        .setLanguage("de")
        .build();
I have checked thrice that the file exists and is served with the type text/vtt. That does not matter, though, since the file is never even requested by the player. I have tried both MediaTrack.SUBTYPE_CAPTIONS and MediaTrack.SUBTYPE_SUBTITLES.
So I need to know, is this claimed support of CC in the Styled Media Receiver simply a lie? Or is there some undocumented trick required to make it possible?
If a custom receiver is still required, I would like to know how to convert the example player to support subtitles, as it doesn't seem to support them either.
First, I suggest you change your wording in future posts (re: "...is simply a lie..."); that is not appropriate at all. Secondly, it works, and you can test that with the CastVideos-android app (or the iOS variation, for that matter); the first three videos have CC. Lastly, we have documentation on that subject on our documentation site (https://developers.google.com/cast/docs/android_sender, under "Using the Tracks API").
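For reference, a hedged sketch of the documented Tracks API flow with the Cast SDK v3 RemoteMediaClient: attach the track to the MediaInfo, load it, then activate the track by id. The URLs and the remoteMediaClient instance are assumptions:

import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.MediaLoadOptions;
import com.google.android.gms.cast.MediaTrack;
import java.util.Collections;

MediaTrack germanSubs = new MediaTrack.Builder(2, MediaTrack.TYPE_TEXT)
        .setSubtype(MediaTrack.SUBTYPE_SUBTITLES)
        .setContentId("https://example.com/video/caption_de.vtt")
        .setContentType("text/vtt")
        .setLanguage("de")
        .build();
MediaInfo mediaInfo = new MediaInfo.Builder("https://example.com/video/video.mp4")
        .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
        .setContentType("video/mp4")
        .setMediaTracks(Collections.singletonList(germanSubs))
        .build();
remoteMediaClient.load(mediaInfo, new MediaLoadOptions.Builder().build());
// Activate the text track by the id given to the MediaTrack.Builder.
remoteMediaClient.setActiveMediaTracks(new long[] {2});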
I'm using a sample app for the RemotePlaybackClient from #commonsware to play a video from a URL on a Google Chromecast dongle. The app works like a charm, but I would like to implement a playlist. Any idea how to send a playlist to the Chromecast from an Android device?
As usual, I don't need code, just links, tutorials, etc... Tks.
Are you using a custom receiver?
If so, you can pass a JSON payload with your playlist to that receiver and manage the list with a playback state.
You might try looking at the "mediaList" object here; that's your playlist object.
This is a totally different project (not the MediaRouter API but CCL instead) that I used because I wanted to implement a playlist and did NOT want to take on my own receiver app. I wanted to see whether the default receiver could collaborate with an existing GitHub sender sample, altered slightly to manage a playlist implemented in "mediaList" AND to send appropriate, successive PLAY instructions to the default receiver app whenever that app's state, as relayed in the normal "consumer" message traffic, indicated state=ready.
D/ccl_VideoCastManager(31057): onApplicationStatusChanged() reached: Ready To Cast
So, when the default receiver fires the "ready" message, the sender app can just call getNext to return an entry from "mediaList" and then send a "play(mediaInfo.entry)" to the default receiver.
onApplicationStatusChanged() is the interface the CCL uses to communicate and sync player state between the local and remote players. When the default remote state changes to "ready to cast", you can use VideoCastManager and its base class to select the next MediaInfo entry and format a message for the remote to play it...
this.startCastControllerActivity(this.mContext, nextMediaInfo, 0, true);
The code above, from the sender/CCL base, tells the receiver to play the item that the sender has determined is next in the list.
Note: I was advised to implement the playlist in a custom receiver app that I would write. I'm not that ambitious, and I found a very simple hack on the sender/CCL classes that was reliable enough for me.
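A hedged sketch of that hack against CCL; "mediaList" and getNext() are the playlist helpers described above, and the exact method names can vary between CCL versions:

import com.google.android.gms.cast.MediaInfo;
import com.google.android.libraries.cast.companionlibrary.cast.VideoCastManager;
import com.google.android.libraries.cast.companionlibrary.cast.callbacks.VideoCastConsumerImpl;

VideoCastManager castManager = VideoCastManager.getInstance();
castManager.addVideoCastConsumer(new VideoCastConsumerImpl() {
    @Override
    public void onApplicationStatusChanged(String appStatus) {
        // Fired when the receiver relays a new status, e.g. "Ready To Cast".
        if ("Ready To Cast".equals(appStatus)) {
            MediaInfo next = mediaList.getNext();
            if (next != null) {
                // Equivalent to the startCastControllerActivity call above.
                castManager.startVideoCastControllerActivity(context, next, 0, true);
            }
        }
    }
});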