My question is regarding custom voice commands for Android.
Is there a way to develop an app, not necessarily a UI app, that adds custom voice commands to any Android mobile?
For example, after installing AccuWeather on Android, every time I ask Google Now for the weather, it displays information fetched from that app.
Did AccuWeather add new voice commands to my phone?
Can any app developed by anyone add custom voice commands to an
Android phone?
Can I write and install an app that, triggered by the voice command
"How is my portfolio?", fetches data from the stock market and displays it
in the same view, the Google Now view?
I Googled all of these concerns but found more questions than answers.
You can write an app that registers itself as a choice for the available system commands.
As of this writing, you cannot register a custom command. Google used to accept requests to register custom voice actions, but stopped. I don't know why they stopped, or if/when they'll allow customization again.
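For example, one of the documented system voice actions is "Search for <query> on <app name>". A minimal Kotlin sketch of the receiving side is below; the activity name is made up, and it assumes you have declared the matching intent filter (action `com.google.android.gms.actions.SEARCH_ACTION`) in your manifest:

```kotlin
import android.app.SearchManager
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

class VoiceSearchActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Fired when the user says "Search for <query> on <your app name>",
        // provided the matching intent filter is declared in the manifest.
        if (intent?.action == "com.google.android.gms.actions.SEARCH_ACTION") {
            val query = intent.getStringExtra(SearchManager.QUERY)
            // Hand the spoken query to your own search/display logic.
        }
    }
}
```

Note that this only lets you handle one of the phrases Google already supports; it does not add a new phrase of your own.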
BTW, there used to be a way to add custom commands to Google Search with an Xposed module (which also let you use voice recognition even with a smartwatch connected), but it stopped working when Google updated the Google Search app.
If you are just doing this for personal use, you can still get it working with an old version of the Google Search app, the Xposed framework module below, and the correct hooks (the correct hooks for some later versions of the Google app were posted later in the thread; just use the latest ones).
Here you go:
https://forum.xda-developers.com/xposed/modules/mod-google-search-api-t2554173
I really hope Google brings out a better voice API than the current one.
Related
I cannot find any information anywhere on the net, not even in Google's documentation, about controlling our own Android app by voice without first having to tap a GUI element.
We successfully implemented voice control in our app, but to make the app start listening, I have to tap a microphone icon in the GUI.
I want our app to listen all the time, and if I say a specific command like "Hey MyApp!" (like "Hey Google"), the app should know that it needs to listen for my command.
Is it possible?
I found this:
https://developer.android.com/training/wearables/user-input/voice
And this:
https://developer.android.com/guide/app-actions/overview
And this:
https://support.google.com/accessibility/android/answer/6151848?hl=en
But none of them explains things in a way that lets me understand whether it's possible or not. English is my third language, and I am not a programmer but a designer.
Our programmers are also not experienced in this field.
Thanks for answers in advance.
Is it possible?
That depends.
If you are building your own custom hardware with its own custom build of Android on it, then yes.
If not, and you literally want what you are asking ("I want our app to listen all the time, and if I say a specific command like "Hey MyApp!" (like "Hey Google"), the app should know that it needs to listen for my command"), then no. You cannot even have your app running all the time, let alone with the microphone active. You would also need to have your plans reviewed by qualified legal counsel, as continuously monitoring everything said near the microphone will have legal ramifications.
If you are planning on distributing your app via the Play Store, and you are willing to be more flexible in your requirements, you could integrate with Google Assistant, per one of the documents that you linked to. Google Assistant can then launch your app if/when needed based on an app action triggered by user speech.
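To give a sense of the shape of that integration: with App Actions, the Assistant matches the user's phrase to a capability you declare in shortcuts.xml and then launches your activity with a deep link. A rough Kotlin sketch of the receiving side; the activity name, URL, and parameter are made up for illustration:

```kotlin
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

class WorkoutActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Assistant fulfills the matched app action by launching this activity
        // with the deep link declared in shortcuts.xml, e.g.
        // https://example.com/workout?type=run (hypothetical URL and parameter).
        val type = intent?.data?.getQueryParameter("type")
        // Open the requested screen/feature based on `type`.
    }
}
```

The trade-off is that the user still talks to the Assistant ("Hey Google, start a run in MyApp") rather than to your app directly.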
I am planning to develop a Flutter app which will be installed on a dedicated device that I want to sell to customers (B2C). This device will be used only with the developed app and thus represents a kind of customized end-user device, which doesn't have to provide any other functions. For example, the user should see a customized boot screen and then my app should open. The user should not be able to access the Android system apps (Config, System Bars, Home Screen, Home Button/Navigation).
In doing so, I have the following problems:
What options do I have to update the app automatically?
I have read that there should be something like the Managed Google Play Store, but is this even the right use case for it, or should I rather write my own service that is responsible for updating my app?
How can I make sure that the user only uses my app and not other apps or the system menu? The following link (https://developer.android.com/work/dpc/dedicated-devices) describes how I put certain apps into "Lock Task Mode", but that is mentioned more in the context of Enterprise Mobility Management, not for customized consumer devices. Is this even the right way to go for my requirement, or should I rather look into the AOSP project (https://source.android.com/)? That would certainly add an additional hurdle. My preference would be to be able to use any Android tablet for my App.
In order to sell my dedicated device, I will need to customize/install a variety of devices, which raises the question of how to solve this effectively. Do I need to flash the devices to do this? Is there possibly another effective method, such as customized installation scripts via adb? This question is certainly closely related to the previous one.
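For reference on the Lock Task Mode mentioned above, the core of it is only a few calls. A minimal Kotlin sketch, assuming the app has already been provisioned as device owner; the class names are placeholders:

```kotlin
import android.app.admin.DeviceAdminReceiver
import android.app.admin.DevicePolicyManager
import android.content.ComponentName
import android.content.Context
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

// Placeholder admin receiver; must also be declared in the manifest.
class MyDeviceAdminReceiver : DeviceAdminReceiver()

class KioskActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        val dpm = getSystemService(Context.DEVICE_POLICY_SERVICE) as DevicePolicyManager
        val admin = ComponentName(this, MyDeviceAdminReceiver::class.java)
        if (dpm.isDeviceOwnerApp(packageName)) {
            // Whitelist this package for lock task ("kiosk") mode, then pin it.
            dpm.setLockTaskPackages(admin, arrayOf(packageName))
            startLockTask()
        }
    }
}
```

The hard part is less the code than the provisioning: the app has to become device owner (e.g. via adb, a QR code, or zero-touch enrollment) before `setLockTaskPackages` will work.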
I know this is a broad question, but I've spent countless hours searching for the right solution. Google provides at least 15 different types of Google Assistant integrations, all of them different, and I don't know which one to use for my project. The project is pretty simple: I would like to call a dynamic URL with parameters (a webhook to my home server) from the mobile Google Assistant. Example: I would say to my Android phone
> Ok, Google set the TV volume to 50
and the assistant would call
GET "https://192.168.1.12/tv/volume/50
or
GET "https://192.168.1.12/?device=tv&action=volume&value=50
where 'tv', 'volume' and '50' are dynamic parameters (not static), so I could also say
> Ok, Google set the TV channel to 132
I just want a link or the name of the Google dev console that I can use. I don't want to waste several more hours just to find out that another Google package is not suitable for my project. Has anyone done something similar?
PS
I know that I could achieve something similar with "Ok, Google let's talk to ...." but that is not my case.
Also, I CAN'T USE IFTTT.
There's no one-click mechanism to do this directly in Assistant. The smart home platform allows you to configure a service that would capture commands such as Channel and Volume and let you handle those commands in the way you want, using a cloud backend and optionally the Local Home SDK.
This may be a bit more work than you want, so you may want to consider existing smart home platforms that handle some of the backend work, such as https://homeassistant.io, which already has an Assistant integration, though I'm less sure whether it would work in your use case.
It's hard to answer your question without knowing your constraints. Are you a developer? Are you looking to make a commercial app? For non-commercial use you can use the Google Assistant Service to create custom commands that do whatever you want (including calling your API). As a benefit, you don't have to say "Hey Google" before your command.
Also, it would help to know why you can't use IFTTT, given that it seems to do exactly what you're asking for.
Can I build custom actions that leverage the Google Assistant SDK without needing to be online? I'd like the natural language parser to help me navigate my Android app, even when offline. That is, the user speaks to it and says special words to navigate the screens and so on.
This seems entirely possible using the Assistant SDK with an internet connection, but I need the app to run on my Android device even without connectivity.
Google Assistant uses Google's servers, so I don't think it is possible, but there are some other options; check out this link:
Android: Speech Recognition without using google server
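If plain on-device speech-to-text is enough (that is, you map the recognized words to your own navigation rather than relying on the Assistant's language understanding), the platform SpeechRecognizer can be asked to prefer the offline model. A rough Kotlin sketch, assuming the RECORD_AUDIO permission has been granted and an offline language pack is installed:

```kotlin
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

fun startOfflineListening(context: Context, onCommand: (String) -> Unit) {
    val recognizer = SpeechRecognizer.createSpeechRecognizer(context)
    recognizer.setRecognitionListener(object : RecognitionListener {
        override fun onResults(results: Bundle?) {
            // Take the best transcription and map it to your own navigation commands.
            results?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                ?.firstOrNull()
                ?.let(onCommand)
        }
        // Remaining callbacks are not needed for this sketch.
        override fun onReadyForSpeech(params: Bundle?) {}
        override fun onBeginningOfSpeech() {}
        override fun onRmsChanged(rmsdB: Float) {}
        override fun onBufferReceived(buffer: ByteArray?) {}
        override fun onEndOfSpeech() {}
        override fun onError(error: Int) {}
        override fun onPartialResults(partialResults: Bundle?) {}
        override fun onEvent(eventType: Int, params: Bundle?) {}
    })
    val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
        putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
        // Hint only: whether recognition actually happens offline depends on the device.
        putExtra(RecognizerIntent.EXTRA_PREFER_OFFLINE, true)
    }
    recognizer.startListening(intent)
}
```

This gives you raw text, not intent parsing, so the command matching ("go to settings", "open profile", etc.) is up to your own code.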
I am researching ways to implement form filling via voice commands given by the user inside my application. I have looked into two options, but neither seems useful, and I am a bit confused here.
First, I tried integrating Android's speech-to-text library. It gives me text but isn't smart enough to converse with, the way Google Assistant does.
Second, I tried to integrate Google Assistant with api.ai. It provides the user conversation, but it is like adding a command to Google Assistant. It doesn't give me the voice-to-text data, so I can't fill the form and do further operations.
Please suggest ways to implement this.
You can use the SDK provided by Slang Labs, which allows you to add a custom voice experience inside your app. You can create a "buddy" in their console and configure the kinds of intents/utterances you want to handle. Then integrate its SDK into your app, which takes care of all the voice-related functionality, and you can register callbacks for the intents you have configured in the console to handle the app-specific actions.
(disclaimer: I am a co-founder of Slang Labs :-))
You wouldn't use Actions on Google through Dialogflow for your implementation, but rather the Google Assistant SDK, which is meant for devices.
However, in your case it may make sense to use Dialogflow's Android client. You would not need to pull all of the Google Assistant's capabilities and the voice interaction would be limited to your own application.
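As a rough illustration of that direction, here is a sketch of one detectIntent round trip using the google-cloud-dialogflow V2 Java client (rather than the older api.ai Android SDK). The project ID, session ID, and parameter handling are placeholders, credential setup is omitted, and in practice you would usually proxy this call through your own backend rather than ship service credentials in the app:

```kotlin
import com.google.cloud.dialogflow.v2.QueryInput
import com.google.cloud.dialogflow.v2.SessionName
import com.google.cloud.dialogflow.v2.SessionsClient
import com.google.cloud.dialogflow.v2.TextInput

// Sends one recognized utterance to a Dialogflow agent and returns the
// extracted parameters (e.g. name, date), which you can copy into your form.
fun detectFormFields(spokenText: String): Map<String, String> =
    SessionsClient.create().use { sessions ->
        val session = SessionName.of("my-project-id", "user-session-1")
        val queryInput = QueryInput.newBuilder()
            .setText(TextInput.newBuilder().setText(spokenText).setLanguageCode("en-US"))
            .build()
        val result = sessions.detectIntent(session, queryInput).queryResult
        result.parameters.fieldsMap.mapValues { it.value.stringValue }
    }
```

You would feed `spokenText` from whatever speech-to-text you already have (e.g. the platform SpeechRecognizer), and the Dialogflow agent's intents and entities do the "which field does this value belong to" work.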