In my application, I need to use Google Assistant to ask custom questions to a user who has already logged in. Based on the user's input, I need to make an API call and create an entry for that user. Any Google Assistant-supported device should be able to trigger my API.
Google Assistant allows me to create custom actions, and from the fulfillment there is an option to call an external API.
My question is: how can I create an entry for the particular user in the database via API calls? What is the correct way to accomplish this? Can this be done with Google Assistant, or will I need Google Voice Actions?
Google Assistant seems to provide everything my requirement needs. However, I am unsure how to create an entry for the particular user logged into my Android application via an API call from my device.
Any insight to this is helpful, thank you!
Related
Is it possible to customize the Google Assistant experience in an Android Auto media app? I'm building an app that plays news (radio broadcasts, podcasts, etc.), but Google Assistant is optimized to provide responses focused on "Music". For example, when I ask for "Guns and Roses", the search query is forwarded to my app and I can return search results to the user. My app is more "news" oriented, though, and when I ask it to search for "sports news", Google Assistant does not let my app respond to the request.
My question is: is it possible to customize Google Assistant's behaviour here? Or, even better, can I force it to forward all "not recognized" queries to my app?
No, you cannot customize Google Assistant, but I don't think that is the question you want to be asking here.
I propose an alternative solution that Google already provides.
It is not exactly what you ask, but you may be able to say "Play Sports News on {your app name}".
If you handle ACTION_PLAY_FROM_SEARCH by implementing the onPlayFromSearch() callback, the play command string may be routed to your app.
Please refer to the following Google document: https://developer.android.com/guide/topics/media-apps/interacting-with-assistant#playback_actions
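The routing logic such a callback might implement can be sketched as below. Note this uses a plain stand-in class instead of the real `MediaSessionCompat.Callback` so it runs outside an Android project, and the feed names and fallback behaviour are assumptions for illustration, not part of any Google API:

```kotlin
// Stand-in for MediaSessionCompat.Callback; in a real app you would
// override onPlayFromSearch on your media session's callback instead.
open class CallbackStub {
    open fun onPlayFromSearch(query: String?, extras: Map<String, Any?>) {}
}

// Hypothetical callback that routes Assistant play queries to app content.
class NewsPlaybackCallback(private val onPlay: (String) -> Unit) : CallbackStub() {
    override fun onPlayFromSearch(query: String?, extras: Map<String, Any?>) {
        val q = query.orEmpty().trim()
        when {
            // "Play Sports News on <app>" with no usable query text:
            // fall back to a sensible default.
            q.isEmpty() -> onPlay("default-news-feed")
            // Route recognizable topics to the app's own content.
            q.contains("sport", ignoreCase = true) -> onPlay("sports-news-feed")
            // Otherwise run the raw string through the app's search.
            else -> onPlay("search:$q")
        }
    }
}

fun main() {
    val played = mutableListOf<String>()
    val cb = NewsPlaybackCallback { played += it }
    cb.onPlayFromSearch("sports news", emptyMap())
    cb.onPlayFromSearch(null, emptyMap())
    println(played)  // [sports-news-feed, default-news-feed]
}
```

The key point is that once the query string reaches your app, the interpretation (news vs. music) is entirely yours.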
I have an app that keeps track of payments, and I want to be able to add a payment by using Google Assistant. Which intent should my AndroidManifest.xml declare, and what should I say to Google Assistant? I would really appreciate any help. Note that I can't use Dialogflow to create my own custom intents, since that would only work on my device, whereas I want it to work for anyone who has the app. Alternatively, I wouldn't mind having Google Assistant call the API that adds a payment to the database directly.
Look into App Actions.
With App Actions, you declare support for built-in intents, which model common ways that users express tasks or find information, such as ordering a meal, booking a ride, or checking an account balance.
The Assistant can then launch your Android app or display a slice to your users. App Actions is supported on devices running Android 5 (API level 21) and higher.
With the Finance built-in intents, users can send money, pay a bill, and check their account, provided your Android app supports deep links and has an actions.xml file in the APK.
You can review a sample App Actions project on Github.
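For the payments use case, the wiring looks roughly like the sketch below. The intent name follows the Finance built-in intents mentioned above, but the deep-link URL template and parameter mappings are assumptions for illustration, not values from any particular app:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/xml/actions.xml, referenced from the manifest.
     urlTemplate and the parameter names are hypothetical. -->
<actions>
    <action intentName="actions.intent.CREATE_MONEY_TRANSFER">
        <fulfillment urlTemplate="myapp://payments/add{?amount,currency}">
            <parameter-mapping
                intentParameter="moneyTransfer.amount.value"
                urlParameter="amount" />
            <parameter-mapping
                intentParameter="moneyTransfer.amount.currency"
                urlParameter="currency" />
        </fulfillment>
    </action>
</actions>
```

Your deep-link activity then parses the `amount` and `currency` query parameters and calls the same code path that adds a payment from the UI.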
My use case is the following:
the user is driving or is otherwise unable to use the smartphone with their hands. The only actions available are the following:
"Hey Google,"
play playlist $playlistName
play $radioStation
play $podcastName of $podcastDate
pause
next radio station
previous radio station
I've seen a Google I/O '18 video where they presented App Actions (here is a readable version of the presentation). One basically creates an actions.xml file that maps a semantic intent to an Android intent. When the user says the "magic words" (the semantic intent), the corresponding Android intent is invoked to fulfill the request.
My questions are:
How do I create a semantic intent using the Actions Console/Dialogflow console? All I have seen is how to create a conversational app, which is not what I need.
Since the developer guide states "Note: Developer preview coming soon!", am I missing something? Is there a way to do what I need using the Actions on Google console?
Note:
To get the Radio contents I use a third-party library
Unfortunately, direct invocation of the Google Assistant ("Hey Google, play Africa by Toto") is not currently available to third-party developers.
However, you can use explicit invocation to trigger a Google Assistant action that then sends an HTTP request to a REST API, which in turn communicates with your Android app.
The user can include an action phrase at the end of their invocation to go directly to the Actions on Google > Dialogflow > Firebase function they're requesting, for example: "Hey Google, ask <your app name> to play sports news".
I am trying to build a simple app which can be triggered by google assistant.
For example, the user says "Hey Google, open TestApp" or "Hey Google, perform xyz from TestApp".
What would be the best approach? Dialogflow?
Saying "Okay, Google - do [something] with [your app]" in order to open your app to fulfill a query is called a Google Voice Action and you can add these to your app quite easily. See the documentation here for full details. Keep in mind - for these to work, your app does need to be in the Play Store, uploaded as a beta at the very least.
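As a concrete case, the documented "search using your app" system voice action is wired up with a manifest intent filter like the following (the activity name is hypothetical); saying "Ok Google, search for pasta on TestApp" then launches the activity with the query available in the `SearchManager.QUERY` extra:

```xml
<!-- AndroidManifest.xml; .SearchableActivity is a placeholder name. -->
<activity android:name=".SearchableActivity">
    <intent-filter>
        <action android:name="com.google.android.gms.actions.SEARCH_ACTION" />
        <category android:name="android.intent.category.DEFAULT" />
    </intent-filter>
</activity>
```

Inside the activity you read the query with `intent.getStringExtra(SearchManager.QUERY)` and route it to your own search or "perform xyz" logic.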
The Google Assistant is a little bit different - it's more conversational and doesn't take the user directly to your app. Instead, it takes the user's input, looks for the appropriate app to handle the query, feeds the query to your app, and then returns the response, within Google Assistant itself. This is all about having a conversation with an app, from the Google Assistant, without actually opening up your app.
I've seen the Voice Actions API, and it describes how you can create your own voice interactions that are started by Google Assistant using keywords. This lets the user say "Ok, Google" followed by your keywords; Google Assistant then forwards the interaction to your Android app's activity.
I'd like to go the other direction. I want to use a built-in interaction that Google Assistant already handles, but begin the interaction from inside my app (meaning I want to pass some data to Google Assistant, or at least launch it already listening). Does anyone know if that is possible? Each time my app detects a certain trigger, I'd like to prompt Google Assistant to begin a specific interaction with the user without them first having to say "Ok, Google".
I could do something similar by starting my own interaction using text-to-speech and the speech recognizer service, but the voice and workflow in Google Assistant are so much cleaner, so why reinvent the wheel?
Alternatively, if there was a way to use the same voice and voice recognition used by Google Assistant I could work with that.
Any ideas?
Since you asked your question, Google has launched a developer preview of the Google Assistant SDK that sounds like it does what you want (or can). Although intended for embedded devices, it will let your users open a voice channel to the Assistant without having to say "OK Google" and get a voice response back.
There are still some issues with triggering something inside your app itself, but it is possible now and is expected to improve over time. Currently you'd have to issue the command to your app out of channel - either by having your server-side Action send your app a GCM message or by taking other actions that your app could pick up.
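The out-of-channel path can be sketched as below: a command payload arrives in the app (for example as a GCM/FCM data message) and gets dispatched to an in-app handler. The payload keys ("command", "interaction") and handler names are assumptions for illustration, not part of any Google API:

```kotlin
// Minimal sketch: dispatch a command delivered out of channel (e.g. in a
// GCM/FCM data payload) to a handler registered inside the app.
// Returns false if the payload carries no known command.
fun dispatchAssistantCommand(
    payload: Map<String, String>,
    handlers: Map<String, (Map<String, String>) -> Unit>
): Boolean {
    val command = payload["command"] ?: return false
    val handler = handlers[command] ?: return false
    handler(payload)
    return true
}

fun main() {
    var started: String? = null
    val handlers = mapOf(
        // Hypothetical command name sent by the server-side Action.
        "start_interaction" to { p: Map<String, String> ->
            started = p["interaction"] ?: "unknown"
        }
    )
    val ok = dispatchAssistantCommand(
        mapOf("command" to "start_interaction", "interaction" to "daily_briefing"),
        handlers
    )
    println("$ok $started")  // true daily_briefing
}
```

In a real app the payload would come from your messaging service callback rather than being constructed locally; the dispatch logic stays the same.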