I'm using a MediaRouteButton to connect to a ChromeCast device. Once a user is connected they can click on the MediaRouteButton to disconnect, but I would like to place another specific disconnect button in the UI. I've been searching for a way to programmatically disconnect from the selected route, but I can't seem to find anything.
If you are using the MediaRouteHelper, you don't have access to the piece that you are looking for. You can extend MediaRouter, do all the discovery-related work yourself, and then you'll have access to what you want. That said, I strongly recommend not providing a second mechanism for deselecting a route; we strongly urge developers to use the standard way that we have built. People may not yet be fully familiar with this relatively new approach of selecting a device and casting content to it, but as more and more apps use it, it will become very familiar. Since popular Google apps (e.g. YouTube, Play Movies, and Play Music) also use it, a lot of people will learn it quickly.
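For completeness, if you do decide to offer a programmatic disconnect anyway, a common approach is to select the default route, which deselects the current Cast route. This is a minimal sketch, assuming the v7 mediarouter support library rather than MediaRouteHelper:

```java
import android.content.Context;
import android.support.v7.media.MediaRouter;

public final class CastDisconnect {
    private CastDisconnect() {}

    // Selecting the default (local) route deselects whatever route is
    // currently active, which ends the Cast connection. Must be called
    // on the main thread.
    public static void disconnect(Context context) {
        MediaRouter router = MediaRouter.getInstance(context);
        router.selectRoute(router.getDefaultRoute());
    }
}
```

You could wire this to your extra disconnect button's click listener, but as noted above, the built-in MediaRouteButton dialog is the pattern users will come to expect.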
I would like to display some information from a Firebase database on the screen of my Nest Hub.
Let's say I want to scroll through 10 items and display their details every 5s.
If I'm not mistaken, I'm limited to only two possibilities:
Assistant Actions
I read the docs and made some POCs, but it seems overcomplicated: I have to create a project that I cannot deploy publicly, and create a Firebase function to provide the webhook, yet I don't really have complex Assistant commands to send. So it seems overcomplicated at first sight.
=> Maybe I missed a way to make this simple?
Cast SDK
As an Android dev familiar with Java and now Kotlin, this seems easier, but as far as I understand, I can only cast media to the Nest. Should I then create a layout with all my info, turn it into image files, and finally create a slideshow for the Nest Hub?
Thanks for any advice.
Part of this answer depends on your use-case. Creating an Action via Actions Builder would give you a fair amount of flexibility over the interaction model, as it will support voice commands and you'd be able to trigger it directly from the device ("Talk to X").
Creating something via Cast may be easier, as you're just projecting content (a web view). While easier, you'd need to start it from another device and can't control it with voice; control happens through the casting device instead.
Based on the use-case you've given, as a passive display, I would suggest going with the Cast SDK. You can look at implementing a custom web receiver to manage your web app.
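On the Android sender side, pointing the Cast framework at a custom receiver mostly comes down to supplying an OptionsProvider. A minimal sketch, where the receiver application ID is a placeholder you'd get from the Cast Developer Console:

```java
import android.content.Context;
import com.google.android.gms.cast.framework.CastOptions;
import com.google.android.gms.cast.framework.OptionsProvider;
import com.google.android.gms.cast.framework.SessionProvider;
import java.util.List;

public class CastOptionsProvider implements OptionsProvider {
    @Override
    public CastOptions getCastOptions(Context context) {
        return new CastOptions.Builder()
                // Placeholder: the app ID of your registered custom web receiver.
                .setReceiverApplicationId("YOUR_RECEIVER_APP_ID")
                .build();
    }

    @Override
    public List<SessionProvider> getAdditionalSessionProviders(Context context) {
        return null; // no extra session providers needed here
    }
}
```

You'd register this class in the manifest under the com.google.android.gms.cast.framework.OPTIONS_PROVIDER_CLASS_NAME meta-data key. The receiver itself is just a web app, so the "10 items, rotate every 5 seconds" logic could live entirely in the receiver's own page, fed from Firebase.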
I'm just looking for a little guidance, or a point in the right direction here.
I'm creating a battery monitoring application. I want to be able to link multiple Android devices to the same user's account, so that they can see the battery level of all their devices.
What is the simplest way to create a user profile, and share that data?
I was wanting to use "Sign in with Google" and possibly a database storage solution, but I'm not sure if that is the easiest way to go about it.
I haven't used the Google sign-in, but from what little I've read about it, it sounds like a good way to go. Parse.com is also a pretty easy service to use for data storage, and it supports access control lists to restrict data to a particular user.
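For the sign-in half, here is a minimal sketch using the play-services-auth library. Note this API is newer than this answer, so treat it as one possible approach rather than the canonical one:

```java
import android.app.Activity;
import android.os.Bundle;
import com.google.android.gms.auth.api.signin.GoogleSignIn;
import com.google.android.gms.auth.api.signin.GoogleSignInClient;
import com.google.android.gms.auth.api.signin.GoogleSignInOptions;

public class SignInActivity extends Activity {
    private static final int RC_SIGN_IN = 9001; // arbitrary request code

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        GoogleSignInOptions gso =
                new GoogleSignInOptions.Builder(GoogleSignInOptions.DEFAULT_SIGN_IN)
                        .requestEmail()
                        .build();
        GoogleSignInClient client = GoogleSignIn.getClient(this, gso);
        // The signed-in account comes back in onActivityResult; its ID is a
        // stable key for grouping each device's battery readings per user.
        startActivityForResult(client.getSignInIntent(), RC_SIGN_IN);
    }
}
```

Whatever datastore you pick, the account ID from the sign-in result gives you the per-user key you need to show all of a user's devices together.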
I have tried to code this with Android's included android.speech.SpeechRecognizer class with no success.
Basically, what I am trying to do is make my app constantly listen for one keyword that will fire an intent whenever it is recognized. I know that this will use a lot of battery.
For example: you are talking with a person, having a normal conversation. The phone is actively listening, recognizing every single word said, and watching for the keyword.
Let's say the keyword is "cheese" in this instance.
Whenever you say "cheese," the application fires an intent that starts up another part of the app.
I have tried to use speech recognition as a service, but things didn't really go as planned. Maybe I made a mistake, I don't know.
I've been trying to accomplish this for 2 days in a row now, more than 24 hours of work time combined. If I am being too broad or infringing any of SO's rules, I sincerely apologize and ask for my question to be deleted.
My question is: how would this be possible? Of course, the SpeechRecognizer that is included with Android itself would be preferable, but it will definitely be a hassle because it is not even designed to work for extended periods.
From my research, there is no way to do this using the standard Google voice recognition service. The way it works is that once a sound/word is recognized, the recognizer returns a list of what it thinks it heard, each with an associated confidence score.
To do what you are asking, you would:
have to keep re-activating the recognition service every time it fires a recognition event, until it matches the word you want (a sketch of this follows below);
have to 'keep awake' the recognition service; you could do this by creating a service that periodically wakes up your handset and resumes the recognition service/activity.
I would not recommend either of these options, considering how much battery life is drained by having the voice recognition service constantly on.
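A minimal sketch of that first option, assuming the stock android.speech.SpeechRecognizer. The class name and the keyword "cheese" are just for illustration, and the RECORD_AUDIO permission is required:

```java
import android.content.Context;
import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import java.util.ArrayList;

public class KeywordListener implements RecognitionListener {
    private static final String KEYWORD = "cheese"; // the trigger word

    private final SpeechRecognizer recognizer;
    private final Intent intent;

    public KeywordListener(Context context) {
        recognizer = SpeechRecognizer.createSpeechRecognizer(context);
        recognizer.setRecognitionListener(this);
        intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
    }

    public void start() {
        recognizer.startListening(intent); // main thread only
    }

    @Override
    public void onResults(Bundle results) {
        ArrayList<String> matches =
                results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        if (matches != null) {
            for (String phrase : matches) {
                if (phrase.toLowerCase().contains(KEYWORD)) {
                    // Keyword heard: fire your intent here.
                    return;
                }
            }
        }
        start(); // keyword not heard: re-activate and keep listening
    }

    @Override
    public void onError(int error) {
        start(); // the recognizer gave up or timed out: restart it
    }

    // The remaining callbacks are not needed for simple keyword spotting.
    @Override public void onReadyForSpeech(Bundle params) {}
    @Override public void onBeginningOfSpeech() {}
    @Override public void onRmsChanged(float rmsdB) {}
    @Override public void onBufferReceived(byte[] buffer) {}
    @Override public void onEndOfSpeech() {}
    @Override public void onPartialResults(Bundle partialResults) {}
    @Override public void onEvent(int eventType, Bundle params) {}
}
```

Even with this working, expect the constant restart cycle (plus a network round-trip per recognition) to drain battery quickly, which is exactly the caveat above.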
Unfortunately, I do not think there are any native Android APIs that will fully suit your needs. I would recommend checking out pocketsphinx.
It is a pretty robust speaker-independent speech recognition API from CMU that is more intended for tasks such as this. You can also check out a tutorial for getting started here.
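For a flavor of what that looks like, here is a sketch based on the pocketsphinx-android bindings; the model file names follow CMU's demo project and may differ in your setup, so treat the details as assumptions:

```java
import java.io.File;

import android.content.Context;

import edu.cmu.pocketsphinx.Assets;
import edu.cmu.pocketsphinx.SpeechRecognizer;
import edu.cmu.pocketsphinx.SpeechRecognizerSetup;

public final class SphinxSpotter {
    private SphinxSpotter() {}

    // Builds an offline recognizer with a dedicated keyphrase search.
    // Unlike the cloud recognizer, it is designed to listen continuously.
    public static SpeechRecognizer buildKeywordSpotter(Context context)
            throws Exception {
        File assetDir = new Assets(context).syncAssets(); // copies bundled models
        SpeechRecognizer recognizer = SpeechRecognizerSetup.defaultSetup()
                .setAcousticModel(new File(assetDir, "en-us-ptm"))
                .setDictionary(new File(assetDir, "cmudict-en-us.dict"))
                .getRecognizer();
        recognizer.addKeyphraseSearch("wakeup", "cheese"); // search name, keyphrase
        recognizer.startListening("wakeup");
        // Matches arrive via edu.cmu.pocketsphinx.RecognitionListener's
        // onPartialResult callback, where you would fire your intent.
        return recognizer;
    }
}
```

Because the matching runs on-device, this avoids both the restart loop and the network traffic of the Google recognizer, which is why it's the usual recommendation for always-on keyword spotting.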
Google has not made API support for "OK GOOGLE" public, and has left it to vendors to modify or pass the support on to consumers.
I think the best bet at this time would be to build the source code yourself and then call the APIs. As an example, the Google library below has the low-level details of implementing a recognizer. I'm not sure why Google has not made it public.
I don't see an easy way to implement and test it.
http://grepcode.com/file/repository.grepcode.com/java/ext/com.google.android/android/4.3_r2.1/android/speech/srec/Recognizer.java
Google provides a variety of 'cards' for Google Now (http://www.google.com/landing/now/). Is it possible to create your own cards? The system looks pretty modular, but I haven't found any documentation or instructions to do so. (I believe you need to supply the content of the card, and some way of signaling when it is supposed to be shown. There is probably just some interface that you have to implement.)
If there is no documented solution, a hackish/undocumented way would be ok, too. I'm mostly curious how it works.
Edit: Specifically, does somebody have knowledge about the internals of Google Now, e.g. by decompiling the .apk? What I've seen suggests it is pretty modular, and it should be fairly easy to drop another class into the .apk, or to maybe inject code using Cydia Substrate. I know that there is (as of Nov. 2013) no official way to add new cards.
There is currently no way to do that. Google makes its own cards, and custom applications cannot register any cards of their own. But I hope it will be possible in the future.
Actually Google announced last week that developers can now develop custom Google Now cards:
http://www.google.com/landing/now/integrations.html
However, a developer guide does not seem to be available yet.
Edit:
At the end of the page they point out that:
We'll let you know when we are able to onboard more partners
There is a work-around that will soon allow you to place cards in Google Now's stream at a particular time or a particular location: Use Google Keep (https://drive.google.com/keep/)
You can create a new note in Google Keep with a time-based or location-based reminder; depending on which you choose, the relevant card will show up in Google Now.
Since Google Keep is now in Drive, the API is expected to be available soon (keep a lookout for it at http://discovery-check.appspot.com/ )
There is no way to do this on your own at the moment. If you really want to do it, you can fill in this form: https://services.google.com/fb/forms/nowintegrations/. You can ask Google if they want to cooperate with you to create a Google Now card.
Not quite an answer, as it is still not possible to create Google Now cards, but you can now hook into the Google Now search function (basically Android's answer to Siri) and provide custom search results. For example, you can say "show me the lyrics to..." and it opens a lyrics app.
Here is a link to the project which is based on the Xposed framework.
Just guessing from my impression of the Google Search apk (which includes all the Google Now functionality, and even the home screen on KitKat), it should be possible to use a similar technique to inject cards into the app; however, since the app is huge and very complicated, it will be a lot of work. I'd keep my eyes open on the xda-developers forums; I wouldn't be surprised if someone there solves this in the future.
It appears that there is developer documentation on how to push Google Now info via email, e.g. flight details, restaurant reservations, etc.
https://developers.google.com/schemas/now/cards
I have yet to dig into this, but may update this answer if I discover anything significant.
I am trying to develop an app/widget for which I need to display the currently playing information (metadata) of an audio track.
This would be trivial if I were also writing the media player myself, as I could simply access the MediaStore and bring up the info; however, I do not wish to compete with the plethora of existing apps on this front. I want to be able to pull this information from the built-in audio player or another app such as SongBird or PowerAMP.
I should be able to do this with PowerAMP using their [API][1], but I really want a solution that works for the stock Android player and others too.
I was hoping to be able to grab the information from the AudioManager, but that seems only to allow me to query the current state (music is playing, etc.) and set my intent to play music, etc. There is no access to metadata from someone else's app.
So my conclusion is that this cannot be done easily. My thought is that I could maybe access this info from the notification bar at the top, as the now-playing info is shown up there. It might be an ugly hack though...
For a moment I got excited about the RemoteControlClient.MetadataEditor from 4.0, but then I figured out that it was for writing that information to a stream that can be sent to the physical remote, rather than allowing you to create a software remote. Damn!
Does anyone have any ideas?
[1]: http://forum.powerampapp.com/index.php?/topic/1034-updated-for-20-poweramp-api-lib-and-sample-applications/ Power AMP
I've written a guide for implementing this.
Basically, you need access to the hidden classes of the android.jar library. Then you have to extend the IRemoteControlDisplay$Stub class and implement its methods.
After that, you register your RemoteControlDisplay with the hidden method AudioManager#registerRemoteControlDisplay.
There is just way too much to explain in one answer, so read my guide on XDA-Developers.
Here is the link:
http://forum.xda-developers.com/showpost.php?p=44513199
Also, I'm currently working on a library which will simplify the process of implementing your own remote media controls.
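To give a rough idea of the registration step, here is a reflection-based sketch. It leans on hidden framework APIs, so the exact signature varies across Android versions and can break at any time; treat every name here as an assumption drawn from the 4.x sources:

```java
import java.lang.reflect.Method;

import android.content.Context;
import android.media.AudioManager;

public final class RemoteDisplayRegistrar {
    private RemoteDisplayRegistrar() {}

    // displayStub must be an instance of your IRemoteControlDisplay$Stub
    // subclass, built against the hidden classes in android.jar.
    public static void register(Context context, Object displayStub)
            throws Exception {
        AudioManager am =
                (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        Class<?> displayInterface =
                Class.forName("android.media.IRemoteControlDisplay");
        // Hidden on 4.x: AudioManager#registerRemoteControlDisplay(IRemoteControlDisplay)
        Method register = AudioManager.class.getDeclaredMethod(
                "registerRemoteControlDisplay", displayInterface);
        register.setAccessible(true);
        register.invoke(am, displayStub);
    }
}
```

The guide linked above covers building the stub itself; the reflection here only handles hooking it up to the AudioManager.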
I should be able to do this with PowerAMP using their [API][1], but I really want a solution that works for the stock Android player and others too.
There is no documented and supported API for the AOSP Music app or the Google Play Music app, AFAIK. They certainly are not in the Android SDK.
I am not aware of an Android ecosystem standard for media players exposing this information, let alone a roster of apps that support such a standard. You are welcome to work with the developers of such apps and encourage them to create and adopt a standard.
My thought is that I could maybe access this info from the notification bar at the top, as the now-playing info is shown up there.
It is not possible to spy on other applications' Notifications, for obvious privacy and security reasons.
For a moment I got excited about the RemoteControlClient.MetadataEditor from 4.0, but then I figured out that it was for writing that information to a stream that can be sent to the physical remote, rather than allowing you to create a software remote. Damn!
Surely there's a way to access the Remote Control Client metadata on Android 4.0, because the lock screen is able to access it when media is playing.
I'm not a developer at all, but I've done a bit of poking around in the AOKP sources, and this is my limited understanding of how it works. At least in AOKP (and presumably AOSP as well), the lockscreen appears to use core/java/com/android/internal/widget/TransportControlView.java to draw the music control widget, which in turn uses media/java/android/media/IRemoteControlDisplay.aidl for data retrieval. At the very least, it may be useful to poke around in TransportControlView.java to see if you can figure out how the lockscreen widget works.