Is there any reliable way to check if my activity was started from the Google Assistant (with the Start/Open command) or from the launcher icon?
The documentation about the Open command states
(Works by default; no specific intent.)
so I'm not totally optimistic.
I found the following difference while debugging: when the app is started by the Google Assistant voice command "Open <app>", the launch intent's extras contain the key "android.intent.extra.REFERRER_NAME". You can test for it in onCreate():

if (getIntent().getExtras() != null && getIntent().getExtras().containsKey("android.intent.extra.REFERRER_NAME")) {
    Log.e(TAG, "onCreate: from the voice assistant");
} else {
    Log.e(TAG, "onCreate: not from the voice assistant");
}

If the app is started from the launcher, that key is not present.
Maybe this can solve your problem.
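If you rely on this, it may help to isolate the check in a plain helper so it can be unit-tested off-device. A minimal sketch, assuming the observation above holds (the key string matches Intent.EXTRA_REFERRER_NAME; the helper name is my own):

```java
import java.util.Set;

// Sketch: in an Activity you would pass
//   getIntent().getExtras() == null ? null : getIntent().getExtras().keySet()
class LaunchSourceCheck {
    static final String EXTRA_REFERRER_NAME = "android.intent.extra.REFERRER_NAME";

    // Returns true when the referrer key is present in the launch extras.
    static boolean launchedFromAssistant(Set<String> extraKeys) {
        return extraKeys != null && extraKeys.contains(EXTRA_REFERRER_NAME);
    }

    public static void main(String[] args) {
        System.out.println(launchedFromAssistant(Set.of(EXTRA_REFERRER_NAME))); // true
        System.out.println(launchedFromAssistant(null));                        // false
    }
}
```

Keep in mind this relies on undocumented behaviour, so it may break in future Assistant versions.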
The short answer: actually, no.
The default voice command Open XYZ, where XYZ is the name of an application, does not carry any information that would let you know the request was processed by the Google Assistant.
As mentioned above, this is probably done to prevent developers from attaching behaviours that users would not expect from the Open command, which should simply open the app and nothing else.
Note: if the user specifies additional commands, for example Open XYZ and play ABC song, this results in a distinct intent whose action you can read with getAction() in your onCreate() method, as described here; but that is not your specific case.
References: https://developers.google.com/voice-actions/system/#open_actions
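To make the note above concrete, the action-based dispatch can be sketched with plain strings. This is a sketch: to my knowledge the media-search action string corresponds to MediaStore.INTENT_ACTION_MEDIA_PLAY_FROM_SEARCH and the "query" extra key to SearchManager.QUERY, but verify both against the current documentation:

```java
import java.util.Map;

// Sketch: decide what to do based on the launch intent's action.
// In an Activity you would pass getIntent().getAction() and the
// intent's string extras.
class VoiceActionDispatch {
    static final String PLAY_FROM_SEARCH = "android.media.action.MEDIA_PLAY_FROM_SEARCH";

    static String describeLaunch(String action, Map<String, String> extras) {
        if (PLAY_FROM_SEARCH.equals(action)) {
            // "Open XYZ and play ABC song" style commands carry the song text
            // in the "query" string extra.
            return "play: " + extras.getOrDefault("query", "");
        }
        // Plain "Open XYZ": a normal MAIN/LAUNCHER intent with no special extras.
        return "plain open";
    }
}
```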
Related
I am working on a project to integrate the Google Assistant with an existing Android app. The requirements are simple. Assuming my app is named TestApp and I want it to just Create a meeting, I want to be able to say, "Hey Google, Create a meeting on TestApp".
The closest API I found that does what I need is the Voice Interactions. This is very similar to what I need done but it only allows preset voice triggers such as "call this person..." or "set alarm at...", but cannot do "create a task or something else..." like in my example. According to this link, custom voice actions aren't allowed.
So my question is, is it possible to directly interact with my app and have it do a very simple task?
Short answer: no, not directly. But your Action can be made to prompt the user for permission to "deep-link" into an Android activity. To do that, you first need to associate the Action and the Activity in the console. Sample code is here:
https://actions-on-google.github.io/actions-on-google-nodejs/classes/conversation_helper.deeplink.html
Longer answer: before there were Actions on Google, there were Google Now Actions. (Yeah, I know it's confusing.) With Google Now Actions, the Google app on a phone can be made to launch an Android activity with an intent that carries search terms in its bundle of string extras. Your user would then say something like
"OK Google, search for the meeting creator on TestApp". You would then get the text "the meeting creator" as a query string extra in a specially crafted search intent that launches TestApp into its meeting-creating activity. Details are here:
https://www.seroundtable.com/google-now-actions-19436.html
Unlike Actions on Google, Google Now Actions forces your user to phrase the request as a search.
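For reference, that "specially crafted search intent" was declared with an intent filter along these lines. This is a sketch: com.google.android.gms.actions.SEARCH_ACTION is the action string the Google Now Actions docs used, CreateMeetingActivity is a hypothetical activity name, and the spoken search text arrives as the SearchManager.QUERY string extra:

```xml
<activity android:name=".CreateMeetingActivity">
    <intent-filter>
        <action android:name="com.google.android.gms.actions.SEARCH_ACTION" />
        <category android:name="android.intent.category.DEFAULT" />
    </intent-filter>
</activity>
```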
I'm trying to develop an interactive, entirely speech-based application, and I've run into some trouble.
I want to be able to perform every task in my app without touch input, using only voice.
So my idea was to use the Google Assistant to open my app and then execute tasks in speech-driven Activities. My only problem is opening my app with a custom phrase like "Hello myApp".
I've noticed that Google only lets developers declare predefined actions in the manifest, such as SEARCH_ACTION (so I have to say "Search cats in myApp..."), and not custom actions.
Then I had an idea: I can define a new routine in my Google Assistant whose command is "Hello myApp" and whose action is "Open myApp", and it works!
But I don't want to force my users to add a new routine to their Assistant; I want my app to create it automatically on first launch (through an Intent and startActivity, or something similar).
My question is: how can I do this? Which intent do I have to invoke? With which extras? Is it possible?
Thank you.
There's no intent that can directly create a new routine; routines cannot be set up programmatically from your app.
I want to start the Google Assistant with a query (or without one) when the user clicks a button, but I could not find any reference for doing this.
I tried overriding the following method, which I found in the Google Assistant documentation, but I do not understand how to use it in my application:
@Override
public void onProvideAssistContent(AssistContent assistContent) {
    super.onProvideAssistContent(assistContent);
    try {
        String structuredJson = new JSONObject()
                .put("@type", "MusicRecording")
                .put("@id", "https://example.com/music/recording")
                .put("name", "Album Title")
                .toString();
        assistContent.setStructuredData(structuredJson);
    } catch (JSONException e) {
        Log.e(TAG, "onProvideAssistContent: ", e);
    }
}
It does nothing; even when I long-press the home button to open the Google Assistant and select the "What's on my screen?" option, it shows nothing.
If this is not possible, can I instead give my button the same action as a long press on Home, i.e. opening the Google Assistant?
That method is for the on-screen reading feature "What's on my screen?"; it helps the Assistant parse your content by giving it additional context.
As of now there is no programmatic way to activate the Assistant in the normal manner. Arguably there doesn't need to be, since the process is already straightforward for users.
You can implement an Assistant-like intent receiver for your app, so it will be called when users long-press Home. However, that would override the Assistant for all apps, which would not be a great experience.
There is a Google Assistant SDK for embedding it into apps and devices. You can use the gRPC calls to send audio or text requests to the Google Assistant and get back a response. It is just a network API, so there's no UI included.
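If you just need a button that brings up the device's default assist app (not necessarily the Google Assistant), firing Intent.ACTION_VOICE_COMMAND is a common workaround. Treat this as a sketch: it cannot carry a query, and behaviour varies by device and assist app:

```java
// Sketch: open the default assist app from a button click.
class AssistantButton {
    // Stable framework string; equals Intent.ACTION_VOICE_COMMAND on Android.
    static final String ACTION_VOICE_COMMAND = "android.intent.action.VOICE_COMMAND";

    /* In an Activity's OnClickListener:
       try {
           startActivity(new Intent(Intent.ACTION_VOICE_COMMAND));
       } catch (ActivityNotFoundException e) {
           // No assist app is available on this device.
       }
    */
}
```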
Friends, I am new to the MirrorLink Common API. I don't know how to enable the MirrorLink service in an Android application. Can anyone please tell me the steps, or point me to a tutorial?
I have already done the following:
1. I got a developer account from mirrorlink.com.
2. I attached the certificate to my app (obtained with the software available from https://causeway.carconnectivity.org).
3. I saved my device's IMEI number in my ACMS account (https://acms.carconnectivity.org).
4. Now I know I am missing the MirrorLink code for launch and terminate, along with the required permissions in the manifest file.
Actually, I don't know how to write the MirrorLink code in the application. Please help me with step 4.
The launch and terminate UPnP code should be placed in the MainActivity. For certification purposes, your app needs to handle the Terminate intent without coming to the foreground (if it is in the background). Even if you don't respond to the intents, the app should be usable in MirrorLink.
You don't need to do anything extra to enable framebuffer streaming, or audio streaming. (Though providing context information via IContextManager.setFramebufferContextInformation and IContextManager.setAudioContextInformation is needed to make sure that the head unit knows what is being provided to it.)
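As a manifest-side sketch, the launch and terminate intents are declared roughly as follows. The action strings com.mirrorlink.android.app.LAUNCH and com.mirrorlink.android.app.TERMINATE are my reading of the Common API spec, and TerminateReceiver is a hypothetical receiver name; check both against the official MirrorLink documentation before relying on them:

```xml
<activity android:name=".MainActivity">
    <intent-filter>
        <action android:name="com.mirrorlink.android.app.LAUNCH" />
        <category android:name="android.intent.category.DEFAULT" />
    </intent-filter>
</activity>
<!-- Terminate must be handled without bringing the app to the foreground,
     so a receiver (rather than an activity) is a natural fit. -->
<receiver android:name=".TerminateReceiver" android:exported="true">
    <intent-filter>
        <action android:name="com.mirrorlink.android.app.TERMINATE" />
    </intent-filter>
</receiver>
```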
I have an application that contains voice commands. I would like to access these voice commands without pressing any button, using only a voice command, as in Chrome ("Ok Google").
The Google developers documentation, in the section "Adding voice capabilities" under "Declare app-provided voice actions", explains how to define a label to start the app, but it does not work for me.
The app is for a mobile phone.
Thank you.
You can use Google's speech APIs:
http://developer.android.com/reference/android/speech/package-summary.html
http://www.androidhive.info/2014/07/android-speech-to-text-tutorial/
and then convert the returned text into commands.
Hope that helps.
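A minimal sketch of that second half, mapping recognized text to commands. The Android side would use RecognizerIntent.ACTION_RECOGNIZE_SPEECH from the android.speech package (shown in the comments), while the mapping itself is plain Java with command names of my own choosing:

```java
import java.util.Locale;

// Sketch: turn text returned by the speech recognizer into app commands.
// On Android you would launch the recognizer with
//   startActivityForResult(new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), REQ);
// and read the best match from RecognizerIntent.EXTRA_RESULTS in onActivityResult().
class VoiceCommandMapper {
    enum Command { OPEN_SETTINGS, PLAY, UNKNOWN }

    static Command fromSpeech(String recognizedText) {
        String t = recognizedText == null ? "" : recognizedText.toLowerCase(Locale.ROOT);
        if (t.contains("settings")) return Command.OPEN_SETTINGS;
        if (t.contains("play")) return Command.PLAY;
        return Command.UNKNOWN;
    }
}
```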