As I understand it, this should be possible, based on this question:
Detecting toast messages
But I am unable to catch any event with the code snippet from that link.
MyAccessibilityService.java
package com.test.toasts2;
import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.AccessibilityServiceInfo;
import android.app.Notification;
import android.os.Parcelable;
import android.view.accessibility.AccessibilityEvent;
import android.widget.Toast;
public class MyAccessibilityService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        System.out.println("event caught");
        Toast.makeText(this, "caught!", Toast.LENGTH_SHORT).show();

        if (event.getEventType() != AccessibilityEvent.TYPE_NOTIFICATION_STATE_CHANGED)
            return; // event is not a notification

        String sourcePackageName = (String) event.getPackageName();
        Parcelable parcelable = event.getParcelableData();
        if (parcelable instanceof Notification) {
            // status bar Notification
        } else {
            // something else, e.g. a Toast message
            String log = "Message: " + event.getText().get(0) + " [Source: " + sourcePackageName + "]";
            System.out.println(log);
            // write `log` to file...
        }
    }

    @Override
    public void onInterrupt() {
        // TODO Auto-generated method stub
    }

    @Override
    protected void onServiceConnected() {
        super.onServiceConnected();
        AccessibilityServiceInfo info = new AccessibilityServiceInfo();
        info.feedbackType = AccessibilityServiceInfo.DEFAULT;
        setServiceInfo(info);
    }
}
AndroidManifest.xml
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.test.toasts2"
    android:versionCode="1"
    android:versionName="1.0" >

    <uses-sdk android:minSdkVersion="15" />

    <application>
        <service
            android:name=".MyAccessibilityService"
            android:label="label">
            <intent-filter>
                <action android:name="android.accessibilityservice.AccessibilityService" />
            </intent-filter>
        </service>
    </application>
</manifest>
It seems like this service is simply not started. What am I doing wrong?
Why I am doing this:
I am installing many shortcuts on the stock launcher from my app. The problem is that these shortcuts are placed one over another in a single cell (even a 500 ms sleep did not help), so I am looking for a way to install them one after another. But how do I know when a shortcut was successfully installed? The only indicator I have found is the message that the ICS launcher shows to the user.
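The shortcuts are installed with the usual launcher broadcast, roughly like this (a sketch; the target activity name is hypothetical):

Intent shortcutIntent = new Intent(Intent.ACTION_MAIN);
shortcutIntent.setClassName(context, "com.test.toasts2.TargetActivity"); // hypothetical target
Intent installIntent = new Intent("com.android.launcher.action.INSTALL_SHORTCUT");
installIntent.putExtra(Intent.EXTRA_SHORTCUT_INTENT, shortcutIntent);
installIntent.putExtra(Intent.EXTRA_SHORTCUT_NAME, "My Shortcut");
// requires com.android.launcher.permission.INSTALL_SHORTCUT in the manifest
context.sendBroadcast(installIntent);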
TYPE_NOTIFICATION_STATE_CHANGED generally refers to the NotificationManager and icons placed in the status bar. Nonetheless, the code below should help shed some light on the origin of a Toast message. On Android 4.0.4 ICS, a Toast has a class of android.widget.Toast, so getClassName should do the trick.
For what it's worth, the change seems to have been made in Android 4.0.3, adding and using the following method in Toast.TN:
private void trySendAccessibilityEvent() {
    AccessibilityManager accessibilityManager =
            AccessibilityManager.getInstance(mView.getContext());
    if (!accessibilityManager.isEnabled()) {
        return;
    }
    // treat toasts as notifications since they are used to
    // announce a transient piece of information to the user
    AccessibilityEvent event = AccessibilityEvent.obtain(
            AccessibilityEvent.TYPE_NOTIFICATION_STATE_CHANGED);
    event.setClassName(getClass().getName());
    event.setPackageName(mView.getContext().getPackageName());
    mView.dispatchPopulateAccessibilityEvent(event);
    accessibilityManager.sendAccessibilityEvent(event);
}
You can see the Toast class in all versions of Android here.
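Putting that together, a minimal check in onAccessibilityEvent could look like this (a sketch; the class name comparison relies on the 4.0.x behavior described above):

@Override
public void onAccessibilityEvent(AccessibilityEvent event) {
    if (event.getEventType() == AccessibilityEvent.TYPE_NOTIFICATION_STATE_CHANGED
            && event.getClassName() != null
            && "android.widget.Toast".contentEquals(event.getClassName())) {
        // Most likely a Toast rather than a status bar Notification
        CharSequence text = event.getText().isEmpty() ? "" : event.getText().get(0);
        Log.v("ToastWatcher", "Toast from " + event.getPackageName() + ": " + text);
    }
}

The fuller example below logs the same details for every event type: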
private final String getEventType(AccessibilityEvent event) {
    switch (event.getEventType()) {
        case AccessibilityEvent.TYPE_NOTIFICATION_STATE_CHANGED:
            return "TYPE_NOTIFICATION_STATE_CHANGED";
        case AccessibilityEvent.TYPE_VIEW_CLICKED:
            return "TYPE_VIEW_CLICKED";
        case AccessibilityEvent.TYPE_VIEW_FOCUSED:
            return "TYPE_VIEW_FOCUSED";
        case AccessibilityEvent.TYPE_VIEW_LONG_CLICKED:
            return "TYPE_VIEW_LONG_CLICKED";
        case AccessibilityEvent.TYPE_VIEW_SELECTED:
            return "TYPE_VIEW_SELECTED";
        case AccessibilityEvent.TYPE_VIEW_SCROLLED:
            return "TYPE_VIEW_SCROLLED";
        case AccessibilityEvent.TYPE_VIEW_HOVER_EXIT:
            return "TYPE_VIEW_HOVER_EXIT";
        case AccessibilityEvent.TYPE_VIEW_HOVER_ENTER:
            return "TYPE_VIEW_HOVER_ENTER";
        case AccessibilityEvent.TYPE_TOUCH_EXPLORATION_GESTURE_START:
            return "TYPE_TOUCH_EXPLORATION_GESTURE_START";
        case AccessibilityEvent.TYPE_TOUCH_EXPLORATION_GESTURE_END:
            return "TYPE_TOUCH_EXPLORATION_GESTURE_END";
        case AccessibilityEvent.TYPE_WINDOW_STATE_CHANGED:
            return "TYPE_WINDOW_STATE_CHANGED";
        case AccessibilityEvent.TYPE_WINDOW_CONTENT_CHANGED:
            return "TYPE_WINDOW_CONTENT_CHANGED";
        case AccessibilityEvent.TYPE_VIEW_TEXT_SELECTION_CHANGED:
            return "TYPE_VIEW_TEXT_SELECTION_CHANGED";
        case AccessibilityEvent.TYPE_VIEW_TEXT_CHANGED:
            return "TYPE_VIEW_TEXT_CHANGED";
    }
    return "default";
}

private final String getEventText(AccessibilityEvent event) {
    StringBuilder sb = new StringBuilder();
    for (CharSequence s : event.getText()) {
        sb.append(s);
        sb.append('\n');
    }
    return sb.toString();
}
@Override
public void onAccessibilityEvent(AccessibilityEvent event) {
    Log.v(TAG, String.format(
            "onAccessibilityEvent: [type] %s [class] %s [package] %s [time] %s [fullscreen] %s [text] %s",
            getEventType(event), event.getClassName(), event.getPackageName(),
            event.getEventTime(), Boolean.toString(event.isFullScreen()),
            getEventText(event)));

    if (android.os.Build.VERSION.SDK_INT >= 14)
        Log.v(TAG, "Window ID: " + Integer.toString(event.getWindowId()) + ".");
}
private void setServiceInfo(int feedbackType) {
    final AccessibilityServiceInfo info = new AccessibilityServiceInfo();
    // We are interested in all types of accessibility events.
    info.eventTypes = AccessibilityEvent.TYPES_ALL_MASK;
    // We want to provide a specific type of feedback.
    info.feedbackType = feedbackType;
    // We want to receive events in a certain interval.
    // info.notificationTimeout = EVENT_NOTIFICATION_TIMEOUT_MILLIS;
    // We want to receive accessibility events only from certain packages.
    // info.packageNames = PACKAGE_NAMES;
    setServiceInfo(info);
}
private boolean isInfrastructureInitialized = false;

@Override
public void onServiceConnected() {
    if (isInfrastructureInitialized) return;
    // Claim the events we want to listen to.
    setServiceInfo(AccessibilityServiceInfo.FEEDBACK_ALL_MASK);
    // We are in an initialized state now.
    isInfrastructureInitialized = true;
}
Source: personal experience.
Create an Activity that builds an Intent and uses it to start your service.
The code inside the activity will look something like this:
Intent i = new Intent(YourActivity.this, MyAccessibilityService.class);
startService(i);
In your manifest, give the activity an intent filter containing MAIN and LAUNCHER, so that it is the Activity that runs when the user (or adb) starts your app. The activity will then start your service for you. For example:
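A minimal sketch of that manifest entry (the activity name is hypothetical):

<activity android:name=".YourActivity">
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.LAUNCHER" />
    </intent-filter>
</activity>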
EDIT: I assume you saw this note in the post that you linked, and that you aren't trying this on 2.2?
Note: This didn't work for me on Android 2.2 (it doesn't seem to catch Toasts), but it worked on Android 4.0.
First of all, you shouldn't be trying to catch the Toast, as this is an asynchronous call that brings up a Toast on the screen; it stays on the screen for a set duration, and it is possible to leave the application and still have the Toast appear. You should NOT care when the Toast is done, as this is irrelevant. All a Toast should be used for is giving the user information about a particular process that is underway/done. It's not meant for what you're trying to do.
Why don't you just send an internal broadcast within your application, catch it via an IntentFilter, and then start the service upon receiving that broadcast? For example:
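A sketch of that idea using LocalBroadcastManager from the support library (the action string is hypothetical):

// Sender side: fire an app-internal broadcast when the work is done.
LocalBroadcastManager.getInstance(context)
        .sendBroadcast(new Intent("com.example.ACTION_WORK_DONE")); // hypothetical action

// Receiver side: start the service when the broadcast arrives.
BroadcastReceiver receiver = new BroadcastReceiver() {
    @Override
    public void onReceive(Context ctx, Intent intent) {
        ctx.startService(new Intent(ctx, MyAccessibilityService.class));
    }
};
LocalBroadcastManager.getInstance(context)
        .registerReceiver(receiver, new IntentFilter("com.example.ACTION_WORK_DONE"));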
I'm trying to send messages via Whatsapp programmatically. The code works, except that the user needs to click the send button. I need the app to do everything (all user interactions). One way to do it is as follows:
Go to Menu Button > Settings > Chats and check the "Enter is send" option.
Here's the code I'm using:
protected void sendwts() {
    String smsNumber = "2126123456789"; // E164 format without the '+' sign
    Intent sendIntent = new Intent(Intent.ACTION_SEND);
    // Intent sendIntent = new Intent(Intent.ACTION_SENDTO);
    sendIntent.setType("text/plain");
    sendIntent.putExtra(Intent.EXTRA_TEXT, "test \n");
    sendIntent.putExtra("jid", smsNumber + "@s.whatsapp.net"); // phone number without the "+" prefix
    sendIntent.setPackage("com.whatsapp");
    startActivity(sendIntent);
}
Thank you
You can do that only using the Accessibility API of Android.
The idea is quite simple, you'll actually make Android perform the click on Whatsapp's send button.
So the flow will be:
Send a regular message (with the intent you're currently using) with a suffix at the end of your message content, such as "Sent by MY_APP".
Once the text is there, your accessibility service will be notified that the EditText of Whatsapp is filled.
If the suffix is present in the EditText of Whatsapp, your accessibility service will click on the send button (this is to avoid performing actions as the user naturally types a regular message).
Here's an example (which you'll have to tweak if you want to make it more restrictive):
public class WhatsappAccessibilityService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        if (getRootInActiveWindow() == null) {
            return;
        }
        AccessibilityNodeInfoCompat rootInActiveWindow =
                AccessibilityNodeInfoCompat.wrap(getRootInActiveWindow());

        // Whatsapp message EditText id
        List<AccessibilityNodeInfoCompat> messageNodeList =
                rootInActiveWindow.findAccessibilityNodeInfosByViewId("com.whatsapp:id/entry");
        if (messageNodeList == null || messageNodeList.isEmpty()) {
            return;
        }

        // Check if the Whatsapp message EditText is filled with text ending with your suffix (explanation above),
        // so your service doesn't process any messages but the ones ending with your app's suffix
        AccessibilityNodeInfoCompat messageField = messageNodeList.get(0);
        if (messageField.getText() == null || messageField.getText().length() == 0
                || !messageField.getText().toString().endsWith(
                        getApplicationContext().getString(R.string.whatsapp_suffix))) {
            return;
        }

        // Whatsapp send button id
        List<AccessibilityNodeInfoCompat> sendMessageNodeInfoList =
                rootInActiveWindow.findAccessibilityNodeInfosByViewId("com.whatsapp:id/send");
        if (sendMessageNodeInfoList == null || sendMessageNodeInfoList.isEmpty()) {
            return;
        }
        AccessibilityNodeInfoCompat sendMessageButton = sendMessageNodeInfoList.get(0);
        if (!sendMessageButton.isVisibleToUser()) {
            return;
        }

        // Now fire a click on the send button
        sendMessageButton.performAction(AccessibilityNodeInfo.ACTION_CLICK);

        // Now go back to your app by clicking the Android back button twice:
        // first to leave the conversation screen, second to leave Whatsapp
        try {
            Thread.sleep(500); // hack for certain devices on which an immediate back click is too fast to handle
            performGlobalAction(GLOBAL_ACTION_BACK);
            Thread.sleep(500); // same hack as above
        } catch (InterruptedException ignored) {}
        performGlobalAction(GLOBAL_ACTION_BACK);
    }

    @Override
    public void onInterrupt() {
        // Required override of AccessibilityService; nothing to do here
    }
}
Then create its definition in res/xml/whatsapp_service.xml:
<?xml version="1.0" encoding="utf-8"?>
<accessibility-service xmlns:android="http://schemas.android.com/apk/res/android"
    android:accessibilityEventTypes="typeWindowContentChanged"
    android:packageNames="com.whatsapp"
    android:accessibilityFeedbackType="feedbackSpoken"
    android:notificationTimeout="100"
    android:canRetrieveWindowContent="true" />
Then declare it in your manifest:
<service
    android:name=".services.WhatsappAccessibilityService"
    android:label="Accessibility Service"
    android:permission="android.permission.BIND_ACCESSIBILITY_SERVICE">
    <meta-data
        android:name="android.accessibilityservice"
        android:resource="@xml/whatsapp_service" />
    <intent-filter>
        <action android:name="android.accessibilityservice.AccessibilityService" />
    </intent-filter>
</service>
And last thing, is to check if the accessibility services are enabled for your app or not, and redirect the user to the settings if not:
private boolean isAccessibilityOn(Context context, Class<? extends AccessibilityService> clazz) {
    int accessibilityEnabled = 0;
    final String service = context.getPackageName() + "/" + clazz.getCanonicalName();
    try {
        accessibilityEnabled = Settings.Secure.getInt(
                context.getApplicationContext().getContentResolver(),
                Settings.Secure.ACCESSIBILITY_ENABLED);
    } catch (Settings.SettingNotFoundException ignored) { }

    // the splitter takes a char delimiter
    TextUtils.SimpleStringSplitter colonSplitter = new TextUtils.SimpleStringSplitter(':');
    if (accessibilityEnabled == 1) {
        String settingValue = Settings.Secure.getString(
                context.getApplicationContext().getContentResolver(),
                Settings.Secure.ENABLED_ACCESSIBILITY_SERVICES);
        if (settingValue != null) {
            colonSplitter.setString(settingValue);
            while (colonSplitter.hasNext()) {
                String accessibilityService = colonSplitter.next();
                if (accessibilityService.equalsIgnoreCase(service)) {
                    return true;
                }
            }
        }
    }
    return false;
}
which you'll call with:
if (!isAccessibilityOn(context, WhatsappAccessibilityService.class)) {
    Intent intent = new Intent(Settings.ACTION_ACCESSIBILITY_SETTINGS);
    context.startActivity(intent);
}
This is purely on the technical aspect of the solution.
Now, as to the ethical question of "should you do that?", I believe the answer is quite clear:
Unless you are targeting people with disabilities (which is the very purpose of the Accessibility API), you should probably NOT do that.
I have written a bound service and I would like this service to be only called from particular app. I do not want other apps to be able to make calls to this service.
The options I know so far are:
Use a permission. There seem to be three secure permission levels: dangerous, signature, and signatureOrSystem. Unfortunately, none of these will work for me: I don't want users to have to accept this permission, the two apps are not signed with the same signature, and they are not system apps.
Get the app name on service bind or when making a call to the service. I looked up a way to do this on Stack Overflow here. Unfortunately, this does not work for me, as it always returns the app ID of the app in which the service resides.
Is there any other option for me or I can use the above mentioned options with some change to achieve the desired requirement.
Bound Service Code
public class SampleCommsService extends Service {
    private static Messenger messenger;

    @Override
    public IBinder onBind(Intent intent) {
        Log.e("TEST", "package intent: " + intent.getPackage());
        String callingApp = MyApplication.getAppContext().getPackageManager()
                .getNameForUid(Binder.getCallingUid());
        Log.e("TEST", "onBind - package name: " + callingApp);
        return getMyBinder();
    }

    private synchronized IBinder getMyBinder() {
        if (messenger == null) {
            messenger = new Messenger(new SettingsProcessor());
        }
        return messenger.getBinder();
    }

    class SettingsProcessor extends Handler {
        private static final int GET_SETTINGS_REQUEST = 1;
        private static final int UPDATE_SETTINGS_RESPONSE = 2;
        private static final String SETTINGS = "settings";

        @Override
        public void handleMessage(Message msg) {
            String callingApp = MyApplication.getAppContext().getPackageManager()
                    .getNameForUid(Binder.getCallingUid());
            Log.e("TEST", "handle message - package name: " + callingApp);
            switch (msg.what) {
                case GET_SETTINGS_REQUEST:
                    sendSettingsValue(msg);
                    break;
                default:
                    super.handleMessage(msg);
            }
        }

        private void sendSettingsValue(Message msg) {
            try {
                Message resp = Message.obtain(null, UPDATE_SETTINGS_RESPONSE);
                Bundle bundle = new Bundle();
                bundle.putBoolean(SETTINGS, MyApplication.isSettingsEnabled());
                resp.setData(bundle);
                msg.replyTo.send(resp);
            } catch (RemoteException e) {
                // ignore
            }
        }
    }
}
Output on calling the API:
02-01 15:21:03.138 7704-7704/my.service.package E/TEST: package intent: null
02-01 15:21:03.139 7704-7704/my.service.package E/TEST: onBind - package name: my.service.package
02-01 15:21:12.429 7704-7704/my.service.package E/TEST: handle message - package name: my.service.package
OK, I was able to solve this problem based on an answer given here. The answer given in the link obviously does not work, but you can get the app ID from the Handler used for the bound service.
class SettingsProcessor extends Handler {
    @Override
    public void handleMessage(Message msg) {
        String callingApp = MyApplication.getAppContext().getPackageManager()
                .getNameForUid(msg.sendingUid);
        Log.e("TEST", "handle message - package name: " + callingApp);
    }
}
Instead of Binder.getCallingUid(), I am using msg.sendingUid and it works fine for me.
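If you then want to reject other callers outright, a simple allowlist check on top of that works (a sketch; the allowed package name is hypothetical):

class SettingsProcessor extends Handler {
    private static final String ALLOWED_PACKAGE = "com.example.trustedapp"; // hypothetical

    @Override
    public void handleMessage(Message msg) {
        String callingApp = MyApplication.getAppContext().getPackageManager()
                .getNameForUid(msg.sendingUid);
        if (!ALLOWED_PACKAGE.equals(callingApp)) {
            return; // silently drop messages from untrusted callers
        }
        // ... handle the message as usual
    }
}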
I'm making an auto-fill type accessibility service for Android. I read the documentation and did some googling, and it's still confusing to me. I got only one EditText to fill with text; whenever I try to move to the next EditText, it doesn't work. Is there a way to access all the EditTexts at once? Does anyone have a solution for this?
@Override
public void onAccessibilityEvent(AccessibilityEvent event) {
    final int eventType = event.getEventType();
    String eventText = null;
    switch (eventType) {
        case AccessibilityEvent.TYPE_VIEW_CLICKED:
            eventText = "Clicked: ";
            break;
        case AccessibilityEvent.TYPE_VIEW_FOCUSED:
            eventText = "Focused " + event.getItemCount() + ":";
            break;
    }
    eventText = eventText + event.getContentDescription();
    System.out.println("Accessibility On Accessibility Event");
    Log.i("Test", "Custom Accessibility On Accessibility Event");

    // Do something nifty with this text, like speak the composed string
    // back to the user.
    Toast.makeText(getApplicationContext(), eventText, Toast.LENGTH_LONG).show();

    // let's try this
    AccessibilityNodeInfo nodeInfo = event.getSource();
    if (event.getClassName().equals("android.widget.EditText")) {
        Bundle arguments = new Bundle();
        arguments.putCharSequence(AccessibilityNodeInfo.ACTION_ARGUMENT_SET_TEXT_CHARSEQUENCE, "Howdy");
        nodeInfo.performAction(AccessibilityNodeInfo.ACTION_SET_TEXT, arguments);
        nodeInfo.performAction(AccessibilityNodeInfo.ACTION_NEXT_AT_MOVEMENT_GRANULARITY);
        // event.setAction(AccessibilityEvent.TYPE_TOUCH_INTERACTION_START);
    }
}

@Override
public void onInterrupt() {
    System.out.println("CustomAccessibility On Interrupt");
    Log.i("Custom", "CustomAccessibility On Interrupt");
}

@Override
protected void onServiceConnected() {
    //super.onServiceConnected();
    Toast.makeText(getApplication(), "onServiceConnected", Toast.LENGTH_SHORT).show();
    System.out.println("CustomAccessibility On Service Connected");
    Log.i("Custom", "CustomAccessibility On Service Connected");

    AccessibilityServiceInfo info = new AccessibilityServiceInfo();
    // Set the type of events that this service wants to listen to. Others
    // won't be passed to this service.
    info.eventTypes = AccessibilityEvent.TYPE_VIEW_CLICKED |
            AccessibilityEvent.TYPE_VIEW_FOCUSED;
    // If you only want this service to work with specific applications, set their
    // package names here. Otherwise, when the service is activated, it will listen
    // to events from all applications.
    // Set the type of feedback your service will provide.
    info.feedbackType = AccessibilityServiceInfo.FEEDBACK_VISUAL;
    // Default services are invoked only if no package-specific ones are present
    // for the type of AccessibilityEvent generated. This service *is*
    // application-specific, so the flag isn't necessary. If this was a
    // general-purpose service, it would be worth considering setting the
    // DEFAULT flag.
    // info.flags = AccessibilityServiceInfo.DEFAULT;
    info.notificationTimeout = 100;
    this.setServiceInfo(info);
}
Thanks for the help!
I am making an application to call multiple numbers.
In that app:
When I call one person and the call is answered by the user, the loop should stop.
But if the call is rejected, the next number should be called and the loop should continue.
My problem is that I can't detect whether the call was answered or rejected. When I searched the net, some people said it is not possible to detect whether a call was answered or rejected.
Is it really not possible to detect this in Android? If it is possible, how can I do it?
I think you can check the outgoing call time of the last call in the PhoneStateListener class's onCallStateChanged method. Fetch the data when the state is idle, that is, TelephonyManager.CALL_STATE_IDLE.
Something like this:
Cursor mCallCursor = context.getContentResolver().query(
        android.provider.CallLog.Calls.CONTENT_URI, null, null, null, null);
int duration = mCallCursor.getColumnIndex(CallLog.Calls.DURATION);
if (mCallCursor.moveToFirst()) { // use if, not while: while(moveToFirst()) would loop forever on the same row
    Toast.makeText(context, mCallCursor.getString(duration), Toast.LENGTH_LONG).show();
}
You can find more about that here. I haven't tested the above code, but something like that should work.
You can check whether the duration is 00:00: if it is, call the next number in the loop; otherwise stop calling.
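For the listener part, registration might look like this (a sketch, untested like the above; on newer Android versions reading the call log also requires the READ_CALL_LOG permission):

TelephonyManager tm = (TelephonyManager) context.getSystemService(Context.TELEPHONY_SERVICE);
tm.listen(new PhoneStateListener() {
    @Override
    public void onCallStateChanged(int state, String incomingNumber) {
        if (state == TelephonyManager.CALL_STATE_IDLE) {
            // Call just ended (answered-then-ended or rejected): query the
            // CallLog as shown above and decide whether to dial the next number.
        }
    }
}, PhoneStateListener.LISTEN_CALL_STATE);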
Hope this helps you.
Below is code for detecting an outgoing call via accessibility events.
Add a class that extends AccessibilityService to your project:
public class CallDetection extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        acquireLock(this);
        Log.d("myaccess", "after lock");
        if (event.getEventType() == AccessibilityEvent.TYPE_WINDOW_CONTENT_CHANGED) {
            Log.d("myaccess", "in window changed");
            AccessibilityNodeInfo info = event.getSource();
            if (info != null && info.getText() != null) {
                String duration = info.getText().toString();
                String zeroSeconds = String.format("%02d:%02d", new Object[]{Integer.valueOf(0), Integer.valueOf(0)});
                String firstSecond = String.format("%02d:%02d", new Object[]{Integer.valueOf(0), Integer.valueOf(1)});
                Log.d("myaccess", "after calculation - " + zeroSeconds + " --- " + firstSecond + " --- " + duration);
                if (zeroSeconds.equals(duration) || firstSecond.equals(duration)) {
                    Toast.makeText(getApplicationContext(), "Call answered", Toast.LENGTH_SHORT).show();
                    // Your code goes here
                }
                info.recycle();
            }
        }
    }

    @Override
    protected void onServiceConnected() {
        super.onServiceConnected();
        Toast.makeText(this, "Service connected", Toast.LENGTH_SHORT).show();
        AccessibilityServiceInfo info = new AccessibilityServiceInfo();
        info.eventTypes = AccessibilityEvent.TYPE_WINDOW_CONTENT_CHANGED;
        info.feedbackType = AccessibilityServiceInfo.FEEDBACK_GENERIC;
        info.notificationTimeout = 0;
        info.packageNames = null;
        setServiceInfo(info);
    }

    @Override
    public void onInterrupt() {
    }
}
But to get event.getSource() working, you have to specify some of your service configuration through XML. Create an xml folder in your project and add an XML file called serviceconfig.xml (you can give it any name you want).
The content of serviceconfig.xml is below:
<accessibility-service xmlns:android="http://schemas.android.com/apk/res/android"
    android:description="@string/callDetection"
    android:accessibilityEventTypes="typeWindowContentChanged"
    android:notificationTimeout="100"
    android:canRetrieveWindowContent="true" />
You can find more about the service config file here.
Now add your service to your manifest file like this:
<service
    android:name=".CallDetection"
    android:permission="android.permission.BIND_ACCESSIBILITY_SERVICE"
    android:label="@string/callDetection">
    <intent-filter>
        <action android:name="android.accessibilityservice.AccessibilityService" />
    </intent-filter>
    <meta-data
        android:name="android.accessibilityservice"
        android:resource="@xml/serviceconfig" />
</service>
And you're done. Just run the app and go to Accessibility settings on your phone; you will find an option named Detection (or whatever name you gave as your service description). Switch it on to grant accessibility permission to your app.
Now you will see a toast when a call is answered.
You can put any code you want in there; you can also invoke a callback in your activity.
Most important: don't bring up your call window (the Android dialer window) until the call is answered, otherwise this will not work.
Note: as Android doesn't provide any solution for detecting whether a call was answered, this is the best alternative I have come up with; I hope it works for you.
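If you'd rather not make the user navigate to the Accessibility settings manually, you can open that screen directly (a small sketch):

// Open the system Accessibility settings so the user can enable the service.
Intent intent = new Intent(Settings.ACTION_ACCESSIBILITY_SETTINGS);
intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK); // needed when starting from a non-Activity context
context.startActivity(intent);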
How would I go about coding a voice trigger to navigate Google Glass Cards?
This is how I see it happening:
1) "Ok Glass, Start My Program"
2) Application begins and shows the first card
3) User can say "Next Card" to move to the next card
(somewhat the equivalent of swiping forward when in the timeline)
4) User can say "Previous Card" to go back
The cards that I need to display are simple text and images, I'm wondering if I can setup a listener of some type to listen for voice commands while the card is being shown.
I've researched Glass voice command nearest match from given list but wasn't able to run the code, although I do have all the libraries.
Side note: it's important that the user still sees the card when using the voice command. Also, his hands are busy, so tap/swipe isn't an option.
Any ideas on how to control the timeline within my Immersion app using only voice control would be greatly appreciated!
I am tracking https://code.google.com/p/google-glass-api/issues/detail?id=273 as well.
My ongoing research made me look back at Google Glass Developer to use Google's suggested way of listening to gestures: https://developers.google.com/glass/develop/gdk/input/touch#detecting_gestures_with_a_gesture_detector
How can we activate these gestures with voice commands?
Android just beta-released the wearable devices upgrade for Android (http://developer.android.com/wear/notifications/remote-input.html). Is there a way we can use this to answer my question? It still feels like we are one step away, since we can call on the service but not have it "sleep" and "wake up" as a background service when we talk.
Define this in your onCreate method:
mAudioManager = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
// mAudioManager.setStreamSolo(AudioManager.STREAM_VOICE_CALL, true);
sr = SpeechRecognizer.createSpeechRecognizer(context);
sr.setRecognitionListener(new listener(context));
// intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, "en-US");
intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
intent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE,context.getPackageName());
sr.startListening(intent);
Log.i("111111","11111111"+"in");
Then simply add this listener class inside your class:
class listener implements RecognitionListener {
    Context context1;

    public listener(Context context) {
        //Log.i("onError startListening", "enter" + "nam");
        context1 = context;
    }

    public void onReadyForSpeech(Bundle params) {
        //Log.d(TAG, "onReadyForSpeech");
    }

    public void onBeginningOfSpeech() {
        //Log.d(TAG, "onBeginningOfSpeech");
    }

    public void onRmsChanged(float rmsdB) {
        //Log.d(TAG, "onRmsChanged");
    }

    public void onBufferReceived(byte[] buffer) {
        //Log.d(TAG, "onBufferReceived");
    }

    public void onEndOfSpeech() {
        //Log.d(TAG, "onEndOfSpeech");
        sr.startListening(intent);
    }

    public void onError(int error) {
        //Log.d(TAG, "error " + error);
        // 7 - No recognition result matched
        // 9 - Insufficient permissions
        // 6 - No speech input
        // 8 - RecognitionService busy
        // 5 - Other client side errors
        // 3 - Audio recording error
        // mText.setText("error " + error);
        if (error == 6 || error == 7 || error == 4 || error == 1 || error == 2
                || error == 5 || error == 3 || error == 8 || error == 9) {
            sr.startListening(intent);
            //Log.i("onError startListening", "onError startListening" + error);
        }
    }

    public void onResults(Bundle results) {
        //Log.v(TAG, "onResults" + results);
        ArrayList data = results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        for (int i = 0; i < data.size(); i++) {
            //Log.d(TAG, "result " + data.get(i));
            //str += data.get(i);
            //Toast.makeText(context1, "results: " + data.get(0).toString(), Toast.LENGTH_LONG).show();
            //Log.v("my", "output" + "results: " + data.get(0).toString());
            //sr.startListening(intent);
        }
    }

    public void onPartialResults(Bundle partialResults) {
        //Log.d(TAG, "onPartialResults");
    }

    public void onEvent(int eventType, Bundle params) {
        //Log.d(TAG, "onEvent " + eventType);
    }
}
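One assumption worth stating explicitly: SpeechRecognizer requires the RECORD_AUDIO permission in your manifest, otherwise recognition fails with the insufficient-permissions error (9) handled above:

<uses-permission android:name="android.permission.RECORD_AUDIO" />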
I'm writing out the entire code in detail since it took me such a long time to get this working; perhaps it'll save someone else valuable time.
This code is the implementation of Google Contextual Voice Commands as described on Google Developers here: Contextual voice commands
ContextualMenuActivity.java
package com.drace.contextualvoicecommands;
import android.app.Activity;
import android.os.Bundle;
import android.view.Menu;
import android.view.MenuItem;
import com.drace.contextualvoicecommands.R;
import com.google.android.glass.view.WindowUtils;
public class ContextualMenuActivity extends Activity {

    @Override
    protected void onCreate(Bundle bundle) {
        super.onCreate(bundle);
        // Requests a voice menu on this activity. As for any other
        // window feature, be sure to request this before
        // setContentView() is called.
        getWindow().requestFeature(WindowUtils.FEATURE_VOICE_COMMANDS);
        setContentView(R.layout.activity_main);
    }

    @Override
    public boolean onCreatePanelMenu(int featureId, Menu menu) {
        if (featureId == WindowUtils.FEATURE_VOICE_COMMANDS) {
            getMenuInflater().inflate(R.menu.main, menu);
            return true;
        }
        // Pass through to super to set up the touch menu.
        return super.onCreatePanelMenu(featureId, menu);
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }

    @Override
    public boolean onMenuItemSelected(int featureId, MenuItem item) {
        if (featureId == WindowUtils.FEATURE_VOICE_COMMANDS) {
            switch (item.getItemId()) {
                case R.id.dogs_menu_item:
                    // handle top-level dogs menu item
                    break;
                case R.id.cats_menu_item:
                    // handle top-level cats menu item
                    break;
                case R.id.lab_menu_item:
                    // handle second-level labrador menu item
                    break;
                case R.id.golden_menu_item:
                    // handle second-level golden menu item
                    break;
                case R.id.calico_menu_item:
                    // handle second-level calico menu item
                    break;
                case R.id.cheshire_menu_item:
                    // handle second-level cheshire menu item
                    break;
                default:
                    return true;
            }
            return true;
        }
        // Good practice to pass through to super if not handled.
        return super.onMenuItemSelected(featureId, item);
    }
}
activity_main.xml (layout)
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent" >

    <TextView
        android:id="@+id/coming_soon"
        android:layout_alignParentTop="true"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="@string/voice_command_test"
        android:textSize="22sp"
        android:layout_marginRight="40px"
        android:layout_marginTop="30px"
        android:layout_marginLeft="210px" />
</RelativeLayout>
strings.xml
<resources>
    <string name="app_name">Contextual voice commands</string>
    <string name="voice_start_command">Voice commands</string>
    <string name="voice_command_test">Say "Okay, Glass"</string>
    <string name="show_me_dogs">Dogs</string>
    <string name="labrador">labrador</string>
    <string name="golden">golden</string>
    <string name="show_me_cats">Cats</string>
    <string name="cheshire">cheshire</string>
    <string name="calico">calico</string>
</resources>
AndroidManifest.xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.drace.contextualvoicecommands"
    android:versionCode="1"
    android:versionName="1.0" >

    <uses-sdk
        android:minSdkVersion="19"
        android:targetSdkVersion="19" />

    <uses-permission android:name="com.google.android.glass.permission.DEVELOPMENT"/>

    <application
        android:allowBackup="true"
        android:icon="@drawable/ic_launcher"
        android:label="@string/app_name" >
        <activity
            android:name="com.drace.contextualvoicecommands.ContextualMenuActivity"
            android:label="@string/app_name" >
            <intent-filter>
                <action android:name="com.google.android.glass.action.VOICE_TRIGGER" />
            </intent-filter>
            <meta-data
                android:name="com.google.android.glass.VoiceTrigger"
                android:resource="@xml/voice_trigger_start" />
        </activity>
    </application>
</manifest>
It's been tested and works great under Google Glass XE22!
You can try the snippet here: https://github.com/pscholl/glass_snippets/tree/master/hotword_detection.
You may want to try the contextual voice commands available in the GDK. While it does temporarily cover the screen with a menu, it allows voice-only input.
https://developers.google.com/glass/develop/gdk/voice
I did something very similar for one of my applications. It doesn't require the "ok glass" screen at all, but the user does need to know the commands ahead of time. I explained a bit of it and provided links in my answer to this question: Glass GDK: Contextual voice commands without the "Ok Glass"
I hope this helps!