The requirement is to launch my Android app when the user asks the Google Assistant (GA) to find nearby restaurants of McDonald's, where McDonald's is the app name. Since the user does not mention that they want to search within the McDonald's app, this is not an explicit intent that the app can handle with a defined intent filter. I want to understand which component I should use to get this done.
There will be another use case where the user wants to order a food item whose name contains McDonald's. This suggests that, ideally, I have to define action sets (sets of templates), although the app does not converse with the user, so Dialogflow is not usable.
I contacted the Actions on Google team after going through this question. Their response was that invoking an Android application from an AoG app is not an available feature at the moment, and that this could be achieved using the Google Assistant.
What option is left for this use case?
My findings
Open app with Explicit intent
Perhaps the same use-case
Voice action
You will want to sign up for the App Actions developer preview, which will let you expose intents in your app so they can be invoked from the Assistant.
At Google I/O 2019, some updates were posted about App Actions. However, the current developer launch does not include the ability to implicitly call through to an app; the user will need to explicitly name the app they want to use. A minimal actions.xml sketch follows the links below.
Official documentation
Google I/O 2019 session on App Actions (video)
Sample application on github
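For illustration, a minimal actions.xml in the spirit of the developer-preview schema might look roughly like this; the deep-link URL and parameter names are assumptions for a food-ordering app, so check them against the official documentation linked above:

    <?xml version="1.0" encoding="utf-8"?>
    <!-- res/xml/actions.xml, registered from the manifest via a
         com.google.android.actions <meta-data> entry -->
    <actions>
        <!-- Built-in intent for ordering a menu item; the Assistant fills
             the URL template and fires an ACTION_VIEW deep link into the app. -->
        <action intentName="actions.intent.ORDER_MENU_ITEM">
            <fulfillment urlTemplate="https://example.com/order{?itemName}">
                <parameter-mapping
                    intentParameter="menuItem.name"
                    urlParameter="itemName" />
            </fulfillment>
        </action>
    </actions>

The app still has to declare and handle that deep link itself; the Assistant only matches the user's phrasing to the built-in intent and opens the resulting URL.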
Related
I am trying to use the Google Assistant to launch an activity of my Android app with a specific parameter.
I have read this:
https://codelabs.developers.google.com/codelabs/actions-1/#0
https://developers.google.com/assistant/app/action-schema
https://developers.google.com/assistant/app/get-started
But I am confused and completely lost.
From my point of view, everything looks messy and confusing.
In the Actions Console I have created one Action Project called "Actions Project for my app".
Then I went to the Dialogflow console, modified the Default Welcome Intent, and created another intent. Both were tested using the testing option "See how it works in Google Assistant."
Those tests worked well.
Then, on another site, I read that I have to create an actions.xml file in my app project in Android Studio. But how do I do that? Do I have to "export" my project "Actions Project for my app" to some XML file? I think the documentation is not enough for clumsy people like me. I don't know what the next step is. I am really struggling with this.
Basically, I want the Assistant to let the user say something, then pass that information to my app, do some work, and then launch an activity to do something.
For example, if we are talking about MyCookingApp I want this:
User: Ok Google, talk to MyCookingApp
Assistant: Welcome to MyCookingApp, what do you want to cook?
User: Pizza
The Assistant should open my app with the parameter "Pizza". My app should display an activity with a pizza image and the recipe for it.
Is all this possible?
Connecting your server-side Action to your Android app is not directly feasible. Actions on Google is available on a variety of surfaces beyond Android, so you can't necessarily expect the Action to immediately launch your app. However, there are a few ways to get the behavior you want:
Make your response include a BasicCard with a button that deep links into your app (see the sketch after this list)
Look at using App Actions, which are designed to let the Assistant link into Android apps
From your Action's webhook, send a push notification to your app on the user's phone (assuming you have account linking set up) so the user can open it
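As a rough sketch of the app side of option 1 (and of an App Action fulfillment), assume the app already declares a BROWSABLE/VIEW deep-link intent filter for URLs like https://example.com/recipe?dish=Pizza; all names here (RecipeActivity, the dish parameter, showRecipeFor) are hypothetical:

    // RecipeActivity.kt -- hypothetical activity that receives the deep link
    import android.os.Bundle
    import androidx.appcompat.app.AppCompatActivity

    class RecipeActivity : AppCompatActivity() {
        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)

            // Deep link opened from the BasicCard button or App Action,
            // e.g. https://example.com/recipe?dish=Pizza
            val dish = intent?.data?.getQueryParameter("dish")
            if (dish != null) {
                showRecipeFor(dish)
            }
        }

        private fun showRecipeFor(dish: String) {
            // Load the image and recipe for the requested dish here.
        }
    }

The Assistant never hands the raw "Pizza" utterance to an activity directly; it only opens a URL your app already knows how to handle, so the parameter has to travel inside that URL.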
I have an app that keeps track of payments, and I want to be able to add a payment by using the Google Assistant. Which intent should my AndroidManifest.xml declare, and what should I say to the Google Assistant? I would really appreciate any help. Also, note that I can't seem to use Dialogflow to create my own custom intents, since that would only work on my device, whereas I want it to work for anyone who has the app. Alternatively, I wouldn't mind having the Google Assistant call the API that adds a payment to the database directly.
Look into App Actions.
With App Actions, you declare support for built-in intents, which model common ways that users express tasks or find information, such as ordering a meal, booking a ride, or checking an account balance.
The Assistant can then launch your Android app or display a slice to your users. App Actions is supported on devices running Android 5 (API level 21) and higher.
With the Finance built-in intents, users can send money, pay a bill, and check their account, provided your Android app supports deep links and has an actions.xml file in the APK.
You can review a sample App Actions project on GitHub.
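If it helps, the "supports deep links and has an actions.xml file in the APK" requirement usually amounts to two manifest pieces; the activity name and host below are assumptions, not anything from your project:

    <!-- AndroidManifest.xml, inside <application>: point the Assistant at
         the actions.xml resource -->
    <meta-data
        android:name="com.google.android.actions"
        android:resource="@xml/actions" />

    <!-- Activity that fulfills the deep links referenced from actions.xml -->
    <activity android:name=".AddPaymentActivity">
        <intent-filter>
            <action android:name="android.intent.action.VIEW" />
            <category android:name="android.intent.category.DEFAULT" />
            <category android:name="android.intent.category.BROWSABLE" />
            <data android:scheme="https" android:host="example.com" />
        </intent-filter>
    </activity>

The actions.xml itself would then declare one of the Finance built-in intents (for example, the send-money one) and map its parameters onto that deep link; take the exact intent and parameter names from the built-in intent reference rather than from this sketch.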
My use case is the following:
The user is driving or is otherwise unable to use their smartphone with their hands. The only actions they can perform are the following:
"Hey Google,"
play playlist $playlistName
play $radioStation
play $podcastName of $podcastDate
pause
next radio station
previous radio station
I've seen a Google I/O '18 video where they presented App Actions (here is a readable version of the presentation). One should basically create an actions.xml file where the mapping between the semantic intent and the Android intent is defined. So, when the user says the "magic words" (semantic intent), the right (Android) intent is invoked to fulfill the request.
My questions are:
How do I create a semantic intent using the Actions Console / Dialogflow console? All I have seen is how to create a conversational app, which is not what I need.
Since the developer guide states "Note: Developer preview coming soon!", am I missing something? Is there a way to do what I need using the Actions on Google console?
Note:
To get the radio content, I use a third-party library.
Unfortunately, direct invocation of the Google Assistant ("Hey Google, play Africa by Toto") is not currently available to third-party developers.
However, you can use explicit invocation to trigger a Google Assistant Action, which could then send an HTTP request to a REST API that communicates with your Android app.
The user can include an invocation phrase at the end of their invocation that takes them directly to the Actions on Google > Dialogflow > Firebase function they are requesting.
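For example (an illustrative phrasing; "My Radio App" stands in for whatever invocation name you register):

    Hey Google, ask My Radio App to play jazz

This differs from the plain "Hey Google, talk to My Radio App", which only drops the user at the Action's welcome intent.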
I am trying to build a simple app that can be triggered by the Google Assistant.
For example, the user might say, "Hey Google, open TestApp" or "Hey Google, perform xyz from TestApp".
What would be the best approach? Dialogflow?
Saying "Okay, Google - do [something] with [your app]" in order to open your app to fulfill a query is called a Google Voice Action and you can add these to your app quite easily. See the documentation here for full details. Keep in mind - for these to work, your app does need to be in the Play Store, uploaded as a beta at the very least.
The Google Assistant is a little bit different - it's more conversational and doesn't take the user directly to your app. Instead, it takes the user's input, looks for the appropriate app to handle the query, feeds the query to your app, and then returns the response, within Google Assistant itself. This is all about having a conversation with an app, from the Google Assistant, without actually opening up your app.
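Returning to the voice-action approach: one documented example is the search voice action ("Ok Google, search for xyz on TestApp"), which is delivered to an activity that declares the com.google.android.gms.actions.SEARCH_ACTION filter. The activity name below is hypothetical:

    <!-- AndroidManifest.xml: receive "search for <query> on <app name>" -->
    <activity android:name=".SearchActivity">
        <intent-filter>
            <action android:name="com.google.android.gms.actions.SEARCH_ACTION" />
            <category android:name="android.intent.category.DEFAULT" />
        </intent-filter>
    </activity>

The spoken query then arrives in the launched activity's intent as the SearchManager.QUERY string extra.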
I want to integrate the Google Assistant into my app. I don't just want to launch the app; I also want to operate it, like below:
App: Hi! What are you looking for?
User: T-shirts
App: What is the size?
User: 40
App: What is the color?
User: Any / Red and blue
App: Here are the T-shirts for you
User: Show order by Popularity
App: Here it is.
Is it possible to do this with the Google Assistant? Is it possible to start any activity of my app with the proper intent extras?
With Actions on Google, you will need to build server-side components to handle the voice/chat interaction. For an example of the voice interaction, see: https://developers.google.com/actions/extending-the-assistant
As for actually starting an app from the Google Assistant, I don't believe there is a direct way.
Alternatively, it does seem to be possible for a select set of voice commands to reach your app through Google Search Voice Actions.
Note that Google isn't taking any more requests for custom Voice Actions:
https://developers.google.com/voice-actions/custom-actions
That page also doesn't seem to have been updated since 2015, so I'm unclear about what is still supported.
From what I can tell, the only part of Google Search Voice Actions that may still be around in 2017 is the ability to incorporate app data into Google Search via Firebase App Indexing.