I've set up casting with the notification controls. The issue I'm having is that I need to differentiate between when a user clicks on the notification (which spawns the activity) and any other time the activity is created.
I would think this can be done by adding an intent-filter to the receiver entry in the manifest:
<receiver android:name=".services.CastIntentReceiver">
    <intent-filter>
        <!-- something goes here? -->
    </intent-filter>
</receiver>
I basically need this so I can rebuild the view that houses the Cast controller after the activity is re-launched from the notification. Without any differentiation, the implementation interferes with the view-rebuilding logic I built for orientation changes (since both paths go through onResume()).
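For context, what I'm trying to achieve is roughly this kind of differentiation (a minimal sketch with a made-up EXTRA_FROM_NOTIFICATION key on the notification's PendingIntent):

import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;

// Where the cast notification's content intent is built (EXTRA_FROM_NOTIFICATION is a made-up key):
Intent launch = new Intent(context, CastControlsActivity.class)
        .putExtra("EXTRA_FROM_NOTIFICATION", true);
PendingIntent contentIntent = PendingIntent.getActivity(
        context, 0, launch, PendingIntent.FLAG_UPDATE_CURRENT);

// In the activity hosting the Cast controller view:
@Override
protected void onResume() {
    super.onResume();
    if (getIntent().getBooleanExtra("EXTRA_FROM_NOTIFICATION", false)) {
        rebuildCastControllerView();   // hypothetical helper that recreates the view
    }
    // otherwise onResume() came from an orientation change or a normal resume
}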
Thanks in advance for any help.
Take a look at the Media Route Provider guide. Media routes let Android users play media content from their devices, instantly showing a picture, playing a song, or sharing a video on a receiver device.
The Android media router framework allows manufacturers to enable playback on their devices through a standardized interface called a MediaRouteProvider. A route provider defines a common interface for playing media on a receiver device, making it possible to play media on your equipment from any Android application that supports media routes.
A media route provider is distributed as part of an Android app. Your route provider can be made available to other apps by extending MediaRouteProviderService or wrapping your implementation of MediaRouteProvider with your own service and declaring an intent filter for the media route provider. These steps allow other apps to discover and make use of your media route.
There are two main types of playback supported by the media router framework. A media route provider can support one or both types of playback, depending on the capabilities of your playback equipment and the functionality you want to support:
Remote Playback — This approach uses the receiver device to handle the content data retrieval, decoding, and playback, while an Android device in the user's hand is used as a remote control. This approach is used by Android apps that support Google Cast.
Secondary Output — With this approach, the Android media application retrieves, renders and streams video or music directly to the receiver device. This approach is used to support Wireless Display output on Android.
<service android:name=".provider.SampleMediaRouteProviderService"
    android:label="@string/sample_media_route_provider_service"
    android:process=":mrp">
    <intent-filter>
        <action android:name="android.media.MediaRouteProviderService" />
    </intent-filter>
</service>
public class SampleMediaRouteProviderService extends MediaRouteProviderService {
    @Override
    public MediaRouteProvider onCreateMediaRouteProvider() {
        return new SampleMediaRouteProvider(this);
    }
}
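For completeness, the SampleMediaRouteProvider referenced by the service isn't shown in that excerpt; a skeletal sketch of what such a provider might look like (route id and name are placeholders, using the androidx.mediarouter classes) is:

import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import androidx.mediarouter.media.MediaControlIntent;
import androidx.mediarouter.media.MediaRouteDescriptor;
import androidx.mediarouter.media.MediaRouteProvider;
import androidx.mediarouter.media.MediaRouteProviderDescriptor;
import androidx.mediarouter.media.MediaRouter.ControlRequestCallback;

public class SampleMediaRouteProvider extends MediaRouteProvider {
    private static final String ROUTE_ID = "sample_route";   // placeholder route id

    public SampleMediaRouteProvider(Context context) {
        super(context);
        // Advertise one remote-playback route so MediaRouter-aware apps can discover it.
        IntentFilter controls = new IntentFilter();
        controls.addCategory(MediaControlIntent.CATEGORY_REMOTE_PLAYBACK);
        MediaRouteDescriptor route = new MediaRouteDescriptor.Builder(ROUTE_ID, "Sample Route")
                .addControlFilter(controls)
                .build();
        setDescriptor(new MediaRouteProviderDescriptor.Builder()
                .addRoute(route)
                .build());
    }

    @Override
    public RouteController onCreateRouteController(String routeId) {
        return new RouteController() {
            @Override
            public boolean onControlRequest(Intent intent, ControlRequestCallback callback) {
                // Forward play/pause/seek/etc. requests to your receiver hardware here.
                return false;
            }
        };
    }
}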
I'm trying to add Android Auto capability to my existing app. It's a messaging app, but some messages have an audio attachment. So far I've been able to create a CarAppService to display the messages themselves, but I can't seem to get the audio playback to connect to Android Auto (so that it's tied to a media session and shows playback controls in the Android Auto dashboard).
I followed the instructions here, and I also tried using the new Media3 library (using sample code here), but neither one seems to activate the MediaBrowserService (or MediaLibraryService in case of Media3).
I have the foreground service permission and the browser service declared in the manifest
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />

<meta-data android:name="com.google.android.gms.car.notification.SmallIcon"
    android:resource="@drawable/ic_auto_icon" />

<service
    android:name=".auto.MediaService"
    android:foregroundServiceType="mediaPlayback"
    android:exported="true">
    <intent-filter>
        <action android:name="androidx.media3.session.MediaSessionService"/>
        <action android:name="android.media.browse.MediaBrowserService"/>
    </intent-filter>
</service>
and I have the MediaLibraryService all set up: it creates a media session, uses my existing ExoPlayer instance, and is initialized with the proper MediaLibrarySession.Callback. But the onCreate() method of the MediaLibraryService (or MediaBrowserService) never gets called.
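Trimmed down, the service looks roughly like this (in the real code the player is my existing ExoPlayer instance and the callback implements the browsing methods):

import androidx.media3.exoplayer.ExoPlayer;
import androidx.media3.session.MediaLibraryService;
import androidx.media3.session.MediaSession;

public class MediaService extends MediaLibraryService {
    private MediaLibrarySession session;

    @Override
    public void onCreate() {
        super.onCreate();    // never reached, which is the problem
        ExoPlayer player = new ExoPlayer.Builder(this).build();
        session = new MediaLibrarySession.Builder(this, player, new MediaLibrarySession.Callback() {
            // onGetLibraryRoot / onGetChildren overrides omitted here
        }).build();
    }

    @Override
    public MediaLibrarySession onGetSession(MediaSession.ControllerInfo controllerInfo) {
        return session;
    }

    @Override
    public void onDestroy() {
        if (session != null) {
            session.getPlayer().release();
            session.release();
        }
        super.onDestroy();
    }
}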
What am I missing? Is there something that I need to do in the car app itself to make it bind to the media browser service?
I figured out what was happening, and the problem basically comes down to the almost complete lack of documentation for Android Auto. As I said in my OP, I have two different services set up: a CarAppService and a MediaBrowserService. Because of that, my app gets shown twice in AA (two icons that launch two completely different versions of the app), so each service gets treated like a completely separate app. Once I realized that, I created different icons and labels for each service in the manifest, so at least now I'm able to differentiate them.
Launching the MediaBrowserService entry works as expected - Android Auto automatically binds to the service and lets the user browse through the media files (as if it were just a simple media player app).
But when launching the other one (the CarAppService entry, which is the one I was originally posting about), I had to manually create a media browser client and bind it to the media browser service using MediaBrowserCompat.
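A rough sketch of that manual binding from the car-app side (MediaService is the browser service declared in the manifest; error handling trimmed):

import android.content.ComponentName;
import android.content.Context;
import android.support.v4.media.MediaBrowserCompat;
import android.support.v4.media.session.MediaControllerCompat;

private MediaBrowserCompat mediaBrowser;

private void connectToMediaService(final Context context) {
    mediaBrowser = new MediaBrowserCompat(
            context,
            new ComponentName(context, MediaService.class),
            new MediaBrowserCompat.ConnectionCallback() {
                @Override
                public void onConnected() {
                    try {
                        MediaControllerCompat controller = new MediaControllerCompat(
                                context, mediaBrowser.getSessionToken());
                        controller.getTransportControls().play();   // e.g. start the audio attachment
                    } catch (Exception e) {
                        // older support versions declare a checked RemoteException here
                    }
                }
            },
            null /* rootHints */);
    mediaBrowser.connect();
}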
I also figured out a way to hide the MediaBrowserService icon while still keeping the ability to use it for playback in the background. That can be done by removing the <intent-filter> element of that service from the manifest. The only problem with doing it that way is that we also lose Android Auto's playback controls, so the media plays without a problem, but it has to be controlled from the CarAppService, which eliminates the option of controlling it in the background.
I want to change the text "Default Media Receiver" that is displayed on the audio playback page of the Chromecast Default Media Receiver (but not on the video playback page) to anything but that. I'm developing an Android app.
I'm having difficulty getting my website hosting service to enable SSL. They won't do it unless I upgrade to a dedicated IP address (even for a self-signed cert, and at considerable expense). So I'm stuck with a choice between a styled receiver with no stylesheet and the Default Media Receiver. (Yes, a change of hosting services is in the wind. Don't ask.)
I rather prefer the appearance of the Default Media Receiver. It starts up faster, and showing the Cast icon on the startup page instead of my app name is nicer.
Except for one small, irritating detail: when playing audio tracks, the title of the application ("Default Media Receiver") is displayed on the otherwise very beautiful page on the Chromecast device.
Is there any way to change this without resorting to a styled media receiver?
(Eyeroll directed at the response in the comments... Here's the code. I already described what I tried.)
@Override
public CastOptions getCastOptions(Context context) {
    return new CastOptions.Builder()
            // Use this line for the styled/no-stylesheet receiver.
            //.setReceiverApplicationId(context.getString(R.string.cast_app_id))
            // Use this line for the default receiver.
            .setReceiverApplicationId(
                    CastMediaControlIntent.DEFAULT_MEDIA_RECEIVER_APPLICATION_ID)
            .build();
}
There is now an option to use a "Styled Media Receiver" without having to serve content from your own web servers.
Go to your application registration on the Google Cast SDK Developer Console. Edit your application details to be a "Styled Media Receiver" and leave the "Skin URL" blank. Edit the rest of the details accordingly. The name you supplied in the app registration details will be shown while the Chromecast device connects to your app, and your app logo will appear in the place in the Chromecast UI that used to say "Default Media Receiver".
You must also make sure you have the following code:
@Override
public CastOptions getCastOptions(Context context) {
    return new CastOptions.Builder()
            // Use the application ID assigned in the Cast Developer Console.
            .setReceiverApplicationId("YOUR APPLICATION ID")
            .build();
}
And you must (of course) PUBLISH the changes you made to the Application Registration (or add your phone as a test device).
This is not currently possible. I have filed a feature request for this here: https://issuetracker.google.com/issues/156888250
I'm using a sample app for RemotePlaybackClient from @commonsware to play a video from a URL on a Google Chromecast dongle. The app works like a charm, but I would like to implement a playlist. Any idea how to send a playlist to the Chromecast from an Android device?
As usual, I don't need code, just links, tutorials, etc... Thanks.
Are you using a custom receiver?
If so, you can pass a JSON payload with your playlist to that receiver and manage the list there along with the playback state.
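For example, the sender side could push the list over a custom message channel roughly like this (this sketch uses the Cast sender framework's CastSession rather than RemotePlaybackClient; the namespace and JSON shape are made up and must match whatever your custom receiver registers):

import android.content.Context;
import com.google.android.gms.cast.framework.CastContext;
import com.google.android.gms.cast.framework.CastSession;
import org.json.JSONArray;
import org.json.JSONObject;

private static final String PLAYLIST_NAMESPACE = "urn:x-cast:com.example.playlist";  // made-up namespace

private void sendPlaylist(Context context, JSONArray itemUrls) throws Exception {
    CastSession session = CastContext.getSharedInstance(context)
            .getSessionManager()
            .getCurrentCastSession();
    if (session != null) {
        JSONObject message = new JSONObject()
                .put("command", "setPlaylist")
                .put("items", itemUrls);
        // The custom receiver keeps this list and advances through it as playback finishes.
        session.sendMessage(PLAYLIST_NAMESPACE, message.toString());
    }
}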
You might try looking at the "mediaList" object here. That's your playlist object.
This is a totally different project (not the MediaRouter API but CCL instead) that I used because I wanted to implement a playlist and did NOT want to take on my own receiver app. I wanted to see whether the default receiver could collaborate with an existing GitHub sender sample - altered slightly to manipulate both a playlist implemented in the "mediaList" AND to send appropriate, successive PLAY instructions to the default receiver app when that app's state, as relayed in the normal "consumer" message traffic, indicated state=ready.
D/ccl_VideoCastManager(31057): onApplicationStatusChanged() reached: Ready To Cast
So, when the default receiver fires the "ready" message, the sender app can just call getNext to return an entry from "mediaList" and then send a "play(mediaInfo.entry)" to the default receiver.
onApplicationStatusChanged() is the interface the CCL uses to communicate/sync player state between the local and remote players. When the default remote state changes to "ready to cast", you can use "VideoCastManager" and its base class to select the next MediaInfo entry and format a message for the remote to play it...
this.startCastControllerActivity(this.mContext, nextMediaInfo, 0, true);
The code above, from the sender/CCL base, tells the receiver to play the item that the sender has determined is next in the list.
Note: I was advised to implement the playlist in a custom receiver app that I would write. I'm not that ambitious and found a very simple hack on the sender/CCL classes that was reliable enough for me.
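Roughly, that hook looks like this (a sketch in CCL terms; getNextMediaInfo() and context are placeholders for however you track your "mediaList" and your current Context, and CCL package names vary by version):

import com.google.android.gms.cast.MediaInfo;
// VideoCastManager / VideoCastConsumerImpl come from the CastCompanionLibrary

// Register once, e.g. in onResume():
//   VideoCastManager.getInstance().addVideoCastConsumer(castConsumer);
VideoCastConsumerImpl castConsumer = new VideoCastConsumerImpl() {
    @Override
    public void onApplicationStatusChanged(String appStatus) {
        if ("Ready To Cast".equals(appStatus)) {
            MediaInfo next = getNextMediaInfo();   // placeholder: next entry from "mediaList"
            if (next != null) {
                VideoCastManager.getInstance()
                        .startVideoCastControllerActivity(context, next, 0, true);
            }
        }
    }
};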
I tried using an Android phone to cast to a Chromecast device via the YouTube app. I added some videos to the queue, then I used another phone to cast to the same Chromecast device. The second phone automatically knows about the videos added to the queue on the first one.
I don't know how the YouTube app does this.
EDIT: I guess the YouTube app uses a custom data channel besides the media channel. When a video is added to the queue, the sender app sends something (e.g. the videoId) to the receiver. The receiver saves it in an array of video IDs. When another phone connects to the Chromecast device, it receives the array of video IDs from the receiver. Can anyone offer other solutions? Thanks
I guess what you are asking is how you can create a playlist, potentially shared by multiple devices. If that is the case, you have a couple of choices:
Keep the playlist on the receiver: this is the simplest option. It is a simple array on the receiver, kept in memory, which goes away when the application ends. A custom receiver is required, and it can implement methods such as "append, insert, get, clear, ..." to provide what the senders need. When each sender connects, it can ask (calling 'get', for example) for the current "queue" and can then modify the queue with the other methods such as 'clear', 'append', 'insert', .... Note that there is no long-term persistence on the receiver (local storage is available but will be cleared as soon as the app is gone).
Keep the playlist in the cloud: you need to do most of the things from the previous option, but you also persist the playlist to the cloud; the advantage is that the playlist lasts beyond the life of a session (this may or may not be desired). In addition, sender apps can potentially get the playlist from the cloud directly, if needed.
The important thing is that the main storage for your playlist is not your sender devices; they don't know (and shouldn't know) about the presence of other senders in the ecosystem.
On the receiver side, we recently published a simple sample that shows how the notion of a (local) playlist can be implemented; it is a simplified example, but it is enough to show that with minimal work you can take advantage of the Media Channel. For more sophisticated handling of a shared queue, you definitely need an out-of-band channel/namespace to handle all the additional APIs I mentioned above.
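For example, the sender side of that "ask for the queue when connecting" step could look roughly like this (a sketch with the Cast sender framework; the namespace and message format are made up and have to match what your custom receiver implements):

import com.google.android.gms.cast.Cast;
import com.google.android.gms.cast.CastDevice;
import com.google.android.gms.cast.framework.CastSession;
import org.json.JSONObject;

private static final String QUEUE_NAMESPACE = "urn:x-cast:com.example.queue";  // made-up namespace

private void syncQueue(CastSession session) throws Exception {
    // Listen for the receiver's reply (and for later queue updates pushed to all senders).
    session.setMessageReceivedCallbacks(QUEUE_NAMESPACE, new Cast.MessageReceivedCallback() {
        @Override
        public void onMessageReceived(CastDevice device, String namespace, String message) {
            // message is e.g. {"queue":[ ... ]} as defined by your receiver
            updateLocalQueueView(message);   // placeholder
        }
    });
    // Ask the receiver for its in-memory queue.
    session.sendMessage(QUEUE_NAMESPACE, new JSONObject().put("command", "get").toString());
}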
I want to create an app that stores a timestamp in a database when I scan my work batch, which contains an NFC tag. This will be done via an IntentService without starting an activity. After a second scan another timestamp is stored in the database via the IntentService. No activity has to be started; a notification will be enough. The activity can be started manually by the user to see the info.
I have read that there are a lot of different tag technologies, but I'd like to make my app a bit more universal, so I don't know which kinds of NFC tags my clients are going to use. I could listen for all the different tags and let the user pair a tag with a certain task.
This is fine unless there is more than one NFC app on the phone. I have another app that uses NFC, and when I scan a tag, Android shows me a selection dialog asking which app should handle the tag. But I don't want this every time I scan a tag; I want to use both apps, so I don't select a default for the tags.
So the question is: how can I scan a tag and route it to the right app, so that tag A is handled by app A and tag B by app B, without getting the selection box every time?
I was thinking about what the best option would be, or maybe somebody has a great idea for how to solve this.
I have thought of a couple of different solutions:
Use only writable NDEF tags and add an Android Application Record (AAR) to them, so the right application is launched after scanning (if there is no NFC app active in the foreground). This means the user is restricted to one tag technology and needs to write the tag before using it. (A quick sketch of such a record follows this list.)
Let the application listen for all NFC tags and, if a tag is not paired to a task, forward it to the system again so that other apps can handle it. (I don't know if this is possible.)
Write an app that listens for all NFC tags and lets the user decide which tag is sent to which application. When a new tag is received by the app, it asks the user which app should handle the tag and stores the default for that specific tag (by ID or something) in a database. The next time, it routes the intent to the default application for that tag. (Or does something like this already exist?)
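For what it's worth, option 1 essentially means writing an NDEF message whose last record is an AAR, something like this sketch (payload URI and package name are placeholders):

import android.nfc.NdefMessage;
import android.nfc.NdefRecord;

// An example payload record plus an Android Application Record (AAR); the AAR makes
// Android route the tag to the named package instead of showing the chooser.
NdefRecord payload = NdefRecord.createUri("myapp://task/work-batch-42");           // placeholder
NdefRecord aar = NdefRecord.createApplicationRecord("com.example.timestamps");     // placeholder
NdefMessage message = new NdefMessage(new NdefRecord[] { payload, aar });
// Write `message` to a writable tag via android.nfc.tech.Ndef#writeNdefMessage.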
Hopefully this question is clear enough; if not, I'll try to clarify it a bit more ;-)
I'd really like to hear what you think about this, or maybe you have some good suggestions? Please let me know.
Thanks in advance.
I've successfully used an application-specific URL scheme for this. Let's assume your URL scheme is "kuiperstimestamp".
The tag then should contain a URL like:
kuiperstimestamp://ts/20130506T034536.293
You then create an intent filter for your service that includes a data element:
<intent-filter>
    <action android:name="android.nfc.action.NDEF_DISCOVERED" />
    <category android:name="android.intent.category.DEFAULT" />
    <data android:scheme="kuiperstimestamp" />
</intent-filter>
Since the intent filter is rather specific, you don't get the app selection dialog (unless another app or service has the same specific intent filter which is unlikely).
This approach is less Android specific than using an AAR.
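On the receiving end, pulling the timestamp back out of the dispatched intent is then straightforward (a sketch, assuming the URL layout above):

import android.content.Intent;
import android.net.Uri;

void handleNfcIntent(Intent intent) {
    Uri uri = intent.getData();                          // kuiperstimestamp://ts/20130506T034536.293
    if (uri != null && "kuiperstimestamp".equals(uri.getScheme())) {
        String timestamp = uri.getLastPathSegment();     // "20130506T034536.293"
        // store the timestamp in the database, show the notification, etc.
    }
}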