I want to start Google Assistant with a query (or without any) when the user clicks a button, could not find any reference to doing this task.
I tried calling the following function, as I found in the documentation of Google Assistant but I do not understand how to use it for my Application:
@Override
public void onProvideAssistContent(AssistContent assistContent) {
    super.onProvideAssistContent(assistContent);
    try {
        String structuredJson = new JSONObject()
                .put("@type", "MusicRecording")
                .put("@id", "https://example.com/music/recording")
                .put("name", "Album Title")
                .toString();
        assistContent.setStructuredData(structuredJson);
    } catch (JSONException e) {
        // JSONObject.put() can throw; log or ignore as appropriate
    }
}
It does nothing: even when I long-press the home button to open the Google Assistant and select the "What's on my screen?" option, it shows nothing.
If this is not possible, can I attach the same action as a long press on Home, i.e. opening the Google Assistant, to my button?
This method is for the on-screen reading feature "What's on my screen?"; it helps the Assistant parse your content by giving it additional context.
As of now there is no programmatic way to activate the Assistant in the normal manner, and there doesn't need to be, since the process is straightforward for users.
You can implement an Assistant-like intent receiver in your app, so it is called when users long-press Home. However, that would override the Assistant for all apps, which would not be a great experience.
There is a Google Assistant SDK for embedding it into apps and devices. You can use the gRPC calls to send audio or text requests to the Google Assistant and get back a response. It is just a network API, so there's no UI included.
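That said, Android does expose a generic voice-command intent; firing it from a button usually brings up the default assistant, though the exact behavior varies by device and configured assistant app. A minimal sketch (an assumption to verify on your target devices, not an official Assistant API):

```java
import android.content.ActivityNotFoundException;
import android.content.Intent;

// Inside your Activity's button click handler:
try {
    // ACTION_VOICE_COMMAND starts the device's voice command handler,
    // which on most devices is the Google Assistant.
    Intent intent = new Intent(Intent.ACTION_VOICE_COMMAND);
    intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
    startActivity(intent);
} catch (ActivityNotFoundException e) {
    // No voice handler / assistant is available on this device.
}
```

Note that this only opens the assistant; there is no supported way to pre-fill a query with it.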
Related
I am trying to develop a feature to let user interact with my Android application using google assistant.
Because my action is not available in the built-in intents (starting/stopping a VPN service), I'm struggling to implement it using a custom Dialogflow intent. (BTW, I have created a feature request to support it.)
I have found the Deeplink helper:
const { SimpleResponse, DeepLink } = require('actions-on-google');

function startVpnHandler(agent) {
  let conv = agent.conv();
  conv.ask(new SimpleResponse({
    speech: `speech`,
    text: `text`,
  }));
  conv.ask(new DeepLink({
    destination: 'App',
    url: 'deeplink.url',
    package: 'app package',
    reason: 'start vpn',
  }));
  agent.add(conv);
}
But unfortunately the assistant answer for this action is
"App isn't responding right now. Please try again later."
Firebase console outputs:
DeepLink is DEPRECATED: Access will be by request only
I saw that DeepLink is deprecated, but what is the replacement for it?
I know I can create a card response with a button that contains a deep link, but
how can I invoke a deep link directly from the Dialogflow custom action without additional user interaction?
I have researched a lot; some suggest using FCM and calling it from the webhook directly. But maybe you know another, cleaner solution? I'm also curious whether such an action, with an implicit FCM call, would pass Google review.
Dialogflow isn't supported in App Actions.
It might not support all your use cases, but you could leverage the OPEN_APP_FEATURE BII for some of your test queries.
Stay tuned for future updates to build customized intents.
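For reference, a rough sketch of declaring the OPEN_APP_FEATURE capability in shortcuts.xml; the package, class, and parameter key below are placeholders, so check the App Actions documentation for the exact schema:

```xml
<shortcuts xmlns:android="http://schemas.android.com/apk/res/android">
    <!-- OPEN_APP_FEATURE built-in intent, e.g. "Open vpn on ExampleApp" -->
    <capability android:name="actions.intent.OPEN_APP_FEATURE">
        <intent
            android:action="android.intent.action.VIEW"
            android:targetPackage="com.example.vpnapp"
            android:targetClass="com.example.vpnapp.MainActivity">
            <!-- The spoken feature name arrives as an intent extra -->
            <parameter
                android:name="feature"
                android:key="featureName" />
        </intent>
    </capability>
</shortcuts>
```

Your activity would then read the `featureName` extra and decide whether to start or stop the VPN service.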
I am working on a project to integrate the Google Assistant with an existing Android app. The requirements are simple. Assuming my app is named TestApp and I want it to just Create a meeting, I want to be able to say, "Hey Google, Create a meeting on TestApp".
The closest API I found that does what I need is the Voice Interactions. This is very similar to what I need done but it only allows preset voice triggers such as "call this person..." or "set alarm at...", but cannot do "create a task or something else..." like in my example. According to this link, custom voice actions aren't allowed.
So my question is, is it possible to directly interact with my app and have it do a very simple task?
Short answer: No, not directly. But your Action can be made to prompt the user for permission to "deep-link" into an Android activity. To do that you first need to associate the Action and the Activity in the console. Sample code is here:
https://actions-on-google.github.io/actions-on-google-nodejs/classes/conversation_helper.deeplink.html
Longer answer: Before there were Actions on Google, there were Google Now Actions. (Yeah, I know, it's confusing.) With Google Now Actions, the Google App on a phone could be made to launch an Android activity with an intent carrying search terms in a bundle of "string extras". Your user would say something like
"OK Google, search for the meeting creator on TestApp". You would then get the text "the meeting creator" as a query string extra in a specially crafted search intent that launches TestApp into its meeting-creating activity. Details are here:
https://www.seroundtable.com/google-now-actions-19436.html
Unlike Actions on Google, Google Now Actions forced your user to pose the request as a search.
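To make the search-intent mechanism concrete, here is a sketch of the receiving side (the activity and routing code are assumptions; the action string and query extra are the ones documented for Google's search intents):

```java
import android.app.Activity;
import android.app.SearchManager;
import android.content.Intent;
import android.os.Bundle;

public class CreateMeetingActivity extends Activity {
    // Action string defined by the Google search-intent contract.
    private static final String SEARCH_ACTION =
            "com.google.android.gms.actions.SEARCH_ACTION";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        Intent intent = getIntent();
        if (SEARCH_ACTION.equals(intent.getAction())) {
            // e.g. "the meeting creator" from the user's spoken request
            String query = intent.getStringExtra(SearchManager.QUERY);
            // route to your meeting-creation flow using the query here
        }
    }
}
```

The activity must also declare an intent filter for that action in the manifest for the Google App to be able to target it.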
I'm integrating AppsFlyer with a native Android application, and I want to use deferred deep linking: when a user clicks a landing-page ad, downloads the app, and opens it for the first time, they should land directly on the activity I want.
Link docs: https://support.appsflyer.com/hc/en-us/articles/207032096-Deferred-Deep-Linking-Getting-the-Conversion-Data
But I have not found a way to verify that my code runs correctly.
Please help me with this problem.
What worked for me:
1. Add your physical device as a test device in AppsFlyer (here's how to do it).
2. Enable debug mode in AppDelegate.swift in didFinishLaunchingWithOptions:
   AppsFlyerTracker.shared().isDebug = true
3. Add the AppsFlyer methods to your AppDelegate.swift (as per the article).
4. Remove the app (or test build) from the physical device.
5. Open the deep link from the physical device; you will be redirected to the App Store. Don't install the app from the App Store! (Just close it.)
6. Install the app via Xcode.
After that, on the first install, it will call the onConversionDataReceived method and the rest of the flow.
You're going to have to implement the onInstallConversionDataLoaded listener:
public interface AppsFlyerConversionListener {
void onInstallConversionDataLoaded(Map<String,String> conversionData);
void onInstallConversionFailure(String errorMessage);
}
This will return a map of all the parameters on the link that you clicked.
The parameter you need to pay attention to is the af_dp parameter.
This parameter should contain the URI scheme of the activity you want to route your users to. Make sure that you have set up this URI scheme properly in the manifest.
To create a tracking link you can use Link Management. It doesn't matter if it's a single platform link or a OneLink, as long as you have the af_dp parameter on the link, that parameter (along with all other parameters on the link) will be part of the response.
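Putting the pieces together, here is a minimal sketch of a listener implementation; the interface is the one shown above, while the class name and the routing comment are assumptions for illustration:

```java
import java.util.Map;

// Interface as declared by the AppsFlyer SDK (shown above).
interface AppsFlyerConversionListener {
    void onInstallConversionDataLoaded(Map<String, String> conversionData);
    void onInstallConversionFailure(String errorMessage);
}

// Hypothetical implementation that extracts the af_dp deep-link URI.
class DeferredDeepLinkListener implements AppsFlyerConversionListener {
    private String deepLinkUri;

    @Override
    public void onInstallConversionDataLoaded(Map<String, String> conversionData) {
        // af_dp carries the URI scheme of the activity to route to.
        deepLinkUri = conversionData.get("af_dp");
        // In a real app you would now fire an ACTION_VIEW intent
        // with Uri.parse(deepLinkUri) to land the user on that activity.
    }

    @Override
    public void onInstallConversionFailure(String errorMessage) {
        deepLinkUri = null;
    }

    public String getDeepLinkUri() {
        return deepLinkUri;
    }
}
```

On first launch the SDK invokes onInstallConversionDataLoaded with all of the link's parameters, so af_dp (if present on the tracking link) is available for routing.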
If you're still facing issues, feel free to reach out to support@appsflyer.com.
I am writing an Espresso test for an Activity which contains a button used for authenticating using a Google account and a button used for authenticating using a Facebook account. In my test I would like to simulate clicks on each of these buttons and ensure that it launches the correct component provided by these third parties.
It seems that there are two possible options:
Simulate a click on the appropriate button and ensure the UI contains the expected views.
Simulate a click on the appropriate button and ensure the correct Intent was fired.
I've initially opted for option #2, as I have found that it is difficult to test the UI with components outside of the Activity under test.
To start, I created a test for the Google button. The code is below:
@Test
public void testGoogleAuthButton() {
// Start the Activity under test.
activityRule.launchActivity(new Intent());
// Click the Google button.
onView(withId(R.id.googleSignUpButton)).perform(click());
// Ensure the `SignInHubActivity`, an `Activity` provided by the Google APIs, was launched.
intended(hasComponent(SignInHubActivity.class.getName()), times(1));
// Simulate a press on the back button to close the account chooser.
UiDevice.getInstance(getInstrumentation()).pressBack();
}
This test passes, however I am unsure as to whether or not this is the correct way to perform this test. Next, I tried a similar technique for the Facebook button:
@Test
public void testFacebookAuthButton() {
// Start the Activity under test.
activityRule.launchActivity(new Intent());
// Simulate a click on the Facebook button.
onView(withId(R.id.facebookSignUpButton)).perform(click());
// Ensure the `FacebookActivity`, an `Activity` provided by the Facebook APIs, was launched.
intended(hasComponent(FacebookActivity.class.getName()), times(1));
// Simulate a press on the back button to close the current Activity.
UiDevice.getInstance(getInstrumentation()).pressBack();
}
Strangely, this test fails, reporting that no Intents were matched and none were recorded.
My questions:
Does the code shown above represent the correct way to test these flows?
Why does the Facebook version of the test shown above fail, while the Google version does not?
Is there any reliable way to check if my activity was started from the Google Assistant (with the Start/Open command) or from the launcher icon?
The documentation about the Open command states
(Works by default; no specific intent.)
so I'm not totally optimistic.
I found the following difference when debugging an app started by the Google Assistant voice-interaction Open command:
when it is started that way, getIntent().getExtras() contains the key "android.intent.extra.REFERRER_NAME", so you can check:
if (getIntent().getExtras() != null
        && getIntent().getExtras().containsKey("android.intent.extra.REFERRER_NAME")) {
    Log.e(TAG, "onCreate: From voice assistant");
} else {
    Log.e(TAG, "onCreate: Not from voice assistant");
}
but if it is started from the launcher, you will not get that key.
Maybe this can solve your problem.
The short answer: actually, no.
The default voice command Open XYZ, where XYZ is the name of an application, doesn't attach any information to let you know that the request was processed by the Google Assistant.
As I already said, this is probably done to prevent developers from attaching behaviours that users would not expect from the Open command, which should simply open the app and nothing else.
Note: if the user specifies additional commands, for example Open XYZ and play ABC song, this results in a custom intent whose action you can check with getAction() in your onCreate() method, as described here, but this is not your specific case.
References: https://developers.google.com/voice-actions/system/#open_actions
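Building on the note above, a minimal sketch of the getAction() check; the search action string is the documented Google search-intent action, but verify it matches what your device actually delivers:

```java
// Inside onCreate(), after super.onCreate(savedInstanceState):
String action = getIntent().getAction();
if ("com.google.android.gms.actions.SEARCH_ACTION".equals(action)) {
    // Launched via an Assistant voice command carrying a query,
    // e.g. "Open XYZ and play ABC song".
} else if (Intent.ACTION_MAIN.equals(action)) {
    // Plain launcher start; the bare Open command also arrives as
    // ACTION_MAIN, which is why the action alone can't distinguish them.
}
```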