Glass voice command nearest match from given list

With Glass you can launch an app via the 'OK, Glass' menu, and it seems to pick the nearest match unless a command is miles off, and you can obviously see the list of commands.
Is there any way, from within the app or from a voice prompt (after the initial app trigger), to present a similar list and have the nearest match returned?
Random (non-real-world) example: an app that shows you a colour, "OK Glass, show the colour red".
'Show the colour' could be your voice trigger, and it seems to be matched by Glass on a 'nearest neighbour' basis. However, 'red' is just read in as free text and could easily be misheard as 'dread' or 'head', or even 'read', as there is no way of differentiating 'read' from 'red'.
Is there a way to pass a list of pre-approved options (red, green, blue, orange*, etc.) to this stage, or to another voice prompt within the app, so the user can see the list and get more accurate results when there is a finite set of expected responses (like the main OK Glass screen)?
*OK, well, nothing rhymes with orange; we're probably safe there.

The Google GDK doesn't support this feature yet. However, the necessary features are already available in some libraries, and you can use them for as long as the GDK doesn't support this natively.
What you have to do:
Pull the GlassVoice.apk from your Glass: adb pull /system/app/GlassVoice.apk
Use dex2jar to convert this apk into a jar file.
Add the jar file to your build path
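For reference, the conversion step looks roughly like this (a sketch; the d2j-dex2jar script ships with the dex2jar distribution, and by default it writes the output next to the input as GlassVoice-dex2jar.jar, which you then drop into your project's libs folder):
d2j-dex2jar.sh GlassVoice.apk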
Now you can use this library like this:
public class VoiceActivity extends Activity {

    // Imports assumed: android.app.Activity, android.os.Bundle, android.util.Log,
    // plus VoiceConfig, VoiceInputHelper, VoiceListener, VoiceCommand,
    // FormattingLogger and FormattingLoggers from the pulled GlassVoice jar
    // (their package names depend on the jar you extracted).

    private VoiceInputHelper mVoiceInputHelper;
    private VoiceConfig mVoiceConfig;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.voice_activity);

        // The finite set of commands the recognizer should match against
        String[] items = {"red", "green", "blue", "orange"};
        mVoiceConfig = new VoiceConfig("MyVoiceConfig", items);
        mVoiceInputHelper = new VoiceInputHelper(this, new MyVoiceListener(mVoiceConfig),
                VoiceInputHelper.newUserActivityObserver(this));
    }

    @Override
    protected void onResume() {
        super.onResume();
        mVoiceInputHelper.addVoiceServiceListener();
    }

    @Override
    protected void onPause() {
        super.onPause();
        mVoiceInputHelper.removeVoiceServiceListener();
    }

    public class MyVoiceListener implements VoiceListener {
        protected final VoiceConfig voiceConfig;

        public MyVoiceListener(VoiceConfig voiceConfig) {
            this.voiceConfig = voiceConfig;
        }

        @Override
        public void onVoiceServiceConnected() {
            mVoiceInputHelper.setVoiceConfig(mVoiceConfig, false);
        }

        @Override
        public void onVoiceServiceDisconnected() {
        }

        @Override
        public VoiceConfig onVoiceCommand(VoiceCommand vc) {
            // Called with the recognized phrase from the configured list
            String recognizedStr = vc.getLiteral();
            Log.i("VoiceActivity", "Recognized text: " + recognizedStr);
            return voiceConfig;
        }

        @Override
        public FormattingLogger getLogger() {
            return FormattingLoggers.getContextLogger();
        }

        @Override
        public boolean isRunning() {
            return true;
        }

        @Override
        public boolean onResampledAudioData(byte[] arg0, int arg1, int arg2) {
            return false;
        }

        @Override
        public boolean onVoiceAmplitudeChanged(double arg0) {
            return false;
        }

        @Override
        public void onVoiceConfigChanged(VoiceConfig arg0, boolean arg1) {
        }
    }
}

You can take advantage of the disambiguation step that occurs when multiple Activities or Services support the same Voice Trigger: simply have multiple Activities or Services in your application support "show me the color" as the voice trigger and label them with the color options.
Your manifest would look something like:
<application
    android:allowBackup="true"
    android:label="@string/app_name"
    android:icon="@drawable/icon_50">

    <activity
        android:name="com.mycompany.RedActivity"
        android:label="@string/red"
        android:icon="@drawable/icon_red">
        <intent-filter>
            <action android:name="com.google.android.glass.action.VOICE_TRIGGER"/>
        </intent-filter>
        <meta-data
            android:name="com.google.android.glass.VoiceTrigger"
            android:resource="@xml/activity_start"/>
    </activity>

    <activity
        android:name="com.mycompany.BlueActivity"
        android:label="@string/blue"
        android:icon="@drawable/icon_blue">
        <intent-filter>
            <action android:name="com.google.android.glass.action.VOICE_TRIGGER"/>
        </intent-filter>
        <meta-data
            android:name="com.google.android.glass.VoiceTrigger"
            android:resource="@xml/activity_start"/>
    </activity>

    <!-- ... -->
</application>
Those Activities or Services would only be used as a "trampoline" to launch the main logic of your app with the color selection.
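The manifest above points each activity at the same @xml/activity_start voice-trigger resource, which isn't shown; in the GDK it is a single trigger element. A minimal sketch (the string resource name is an assumption):
<?xml version="1.0" encoding="utf-8"?>
<trigger keyword="@string/show_me_the_color" />
Each trampoline activity can then simply forward the selection and finish. A hypothetical RedActivity, assuming a ColorActivity that hosts the real logic:
public class RedActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Forward the color implied by this trampoline, then get out of the way
        Intent intent = new Intent(this, ColorActivity.class);
        intent.putExtra("color", "red");
        startActivity(intent);
        finish();
    }
}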

If you haven't already, take a look at the contextual voice menus that were added to the GDK just a few weeks ago. I ran into your exact problem the day before they were released, and finding them the next day helped me a lot! :)


How to use Accessibility Services for "Taking Action for Users"?

Background
A few years ago, I asked how TeamViewer allows the user to control the device without normal interaction with it. I was told it uses a special "backdoor" that manufacturers allow specifically for this app, and that this is only possible with root privileges for other apps.
Seeing that an app like "Airplane Mode Shortcut" can toggle airplane mode by automatically navigating to its settings screen and flipping the switch, I realized this situation has changed.
The problem
The docs say:
Starting with Android 4.0 (API Level 14), accessibility services can act on behalf of users, including changing the input focus and selecting (activating) user interface elements. In Android 4.1 (API Level 16) the range of actions has been expanded to include scrolling lists and interacting with text fields. Accessibility services can also take global actions, such as navigating to the Home screen, pressing the Back button, opening the notifications screen and recent applications list. Android 4.1 also includes a new type of focus, Accessibility Focus, which makes all visible elements selectable by an accessibility service.
These new capabilities make it possible for developers of accessibility services to create alternative navigation modes such as gesture navigation, and give users with disabilities improved control of their Android devices.
But there is no more information about how to use these capabilities.
The only samples I've found are at the bottom, but those are very old and part of the ApiDemos bundle.
The question
How do I make a service that can query, focus, click, enter text, and perform other UI related operations?
By implementing an AccessibilityService (https://developer.android.com/training/accessibility/service.html) you get access to these features.
You can either inspect or act on the element the user last interacted with, or inspect the whole application that is currently active.
Intercept user events by implementing onAccessibilityEvent(AccessibilityEvent event); there you can retrieve the virtual view (representing the original view) with event.getSource() and then inspect it with getClassName(), getText(), or anything else you find in the documentation.
Inspect the whole application by calling getRootInActiveWindow() and iterating through the tree of virtual views with getRootInActiveWindow().getChild(index).
Both getRootInActiveWindow() and event.getSource() return an AccessibilityNodeInfo, on which you can invoke performAction(action) to do things like click or set text.
Example: Play Store
Once the Play Store app is open, this searches for the 'facebook' app and opens its page.
@Override
public void onAccessibilityEvent(final AccessibilityEvent event) {
    AccessibilityNodeInfo rootInActiveWindow = getRootInActiveWindow();
    // Inspect app elements if ready
    if (rootInActiveWindow != null) {
        // The search bar is covered by a text view which needs to be clicked first
        List<AccessibilityNodeInfo> searchBarIdle = rootInActiveWindow.findAccessibilityNodeInfosByViewId("com.android.vending:id/search_box_idle_text");
        if (searchBarIdle.size() > 0) {
            AccessibilityNodeInfo searchBar = searchBarIdle.get(0);
            searchBar.performAction(AccessibilityNodeInfo.ACTION_CLICK);
        }
        // Check whether the search bar is visible
        List<AccessibilityNodeInfo> searchBars = rootInActiveWindow.findAccessibilityNodeInfosByViewId("com.android.vending:id/search_box_text_input");
        if (searchBars.size() > 0) {
            AccessibilityNodeInfo searchBar = searchBars.get(0);
            // If the search bar doesn't already hold the required text, set it
            if (searchBar.getText() == null || !searchBar.getText().toString().equalsIgnoreCase("facebook")) {
                Bundle args = new Bundle();
                args.putString(AccessibilityNodeInfo.ACTION_ARGUMENT_SET_TEXT_CHARSEQUENCE, "facebook");
                searchBar.performAction(AccessibilityNodeInfo.ACTION_SET_TEXT, args);
            } else {
                // There is no way to press Enter to perform the search, so find the corresponding suggestion and click it
                List<AccessibilityNodeInfo> searchSuggestions = rootInActiveWindow.findAccessibilityNodeInfosByViewId("com.android.vending:id/suggest_text");
                for (AccessibilityNodeInfo suggestion : searchSuggestions) {
                    if (suggestion.getText().toString().equals("Facebook")) {
                        // We found the text view, but it's not clickable, so perform the click on its parent
                        AccessibilityNodeInfo clickableParent = suggestion.getParent();
                        clickableParent.performAction(AccessibilityNodeInfo.ACTION_CLICK);
                    }
                }
            }
        }
    }
}
EDIT: full code below:
MyAccessibilityService
public class MyAccessibilityService extends AccessibilityService {

    @Override
    public void onCreate() {
        super.onCreate();
        Log.d("MyAccessibilityService", "onCreate");
    }

    @Override
    public void onAccessibilityEvent(final AccessibilityEvent event) {
        Log.d("MyAccessibilityService", "onAccessibilityEvent");
        AccessibilityNodeInfo rootInActiveWindow = getRootInActiveWindow();
        // Inspect app elements if ready
        if (rootInActiveWindow != null) {
            // The search bar is covered by a text view which needs to be clicked first
            List<AccessibilityNodeInfo> searchBarIdle = rootInActiveWindow.findAccessibilityNodeInfosByViewId("com.android.vending:id/search_box_idle_text");
            if (searchBarIdle.size() > 0) {
                AccessibilityNodeInfo searchBar = searchBarIdle.get(0);
                searchBar.performAction(AccessibilityNodeInfo.ACTION_CLICK);
            }
            // Check whether the search bar is visible
            List<AccessibilityNodeInfo> searchBars = rootInActiveWindow.findAccessibilityNodeInfosByViewId("com.android.vending:id/search_box_text_input");
            if (searchBars.size() > 0) {
                AccessibilityNodeInfo searchBar = searchBars.get(0);
                // If the search bar doesn't already hold the required text, set it
                if (searchBar.getText() == null || !searchBar.getText().toString().equalsIgnoreCase("facebook")) {
                    Bundle args = new Bundle();
                    args.putString(AccessibilityNodeInfo.ACTION_ARGUMENT_SET_TEXT_CHARSEQUENCE, "facebook");
                    searchBar.performAction(AccessibilityNodeInfo.ACTION_SET_TEXT, args);
                } else {
                    // There is no way to press Enter to perform the search, so find the corresponding suggestion and click it
                    List<AccessibilityNodeInfo> searchSuggestions = rootInActiveWindow.findAccessibilityNodeInfosByViewId("com.android.vending:id/suggest_text");
                    for (AccessibilityNodeInfo suggestion : searchSuggestions) {
                        if (suggestion.getText().toString().equals("Facebook")) {
                            // We found the text view, but it's not clickable, so perform the click on its parent
                            AccessibilityNodeInfo clickableParent = suggestion.getParent();
                            clickableParent.performAction(AccessibilityNodeInfo.ACTION_CLICK);
                        }
                    }
                }
            }
        }
    }

    @Override
    public void onInterrupt() {
    }
}
AndroidManifest.xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.findfacebookapp">

    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:supportsRtl="true"
        android:theme="@style/AppTheme">

        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN"/>
                <category android:name="android.intent.category.LAUNCHER"/>
            </intent-filter>
        </activity>

        <service
            android:name=".MyAccessibilityService"
            android:label="@string/accessibility_service_label"
            android:permission="android.permission.BIND_ACCESSIBILITY_SERVICE">
            <intent-filter>
                <action android:name="android.accessibilityservice.AccessibilityService"/>
            </intent-filter>
            <meta-data
                android:name="android.accessibilityservice"
                android:resource="@xml/accessibility_service_config"/>
        </service>
    </application>
</manifest>
res/xml/accessibility_service_config.xml
<?xml version="1.0" encoding="utf-8"?>
<accessibility-service
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:accessibilityEventTypes="typeAllMask"
    android:accessibilityFeedbackType="feedbackAllMask"
    android:accessibilityFlags="flagDefault"
    android:canRequestEnhancedWebAccessibility="true"
    android:canRetrieveWindowContent="true"
    android:description="@string/app_name"
    android:notificationTimeout="100"/>
MainActivity
public class MainActivity extends AppCompatActivity {
    // Wired to a button via android:onClick="onEnableAccClick" in the layout;
    // opens the system accessibility settings so the user can enable the service
    public void onEnableAccClick(View view) {
        startActivityForResult(new Intent(Settings.ACTION_ACCESSIBILITY_SETTINGS), 1);
    }
}
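The layout for MainActivity isn't part of the answer. A minimal sketch of one that wires up the button (the file name, layout contents, and button label are assumptions):
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <!-- Calls MainActivity.onEnableAccClick(View) when tapped -->
    <Button
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:onClick="onEnableAccClick"
        android:text="Enable accessibility service"/>

</LinearLayout>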

Android N: not getting blocked call numbers

I am trying to get the blocked call numbers in Android N; I want to know whether a given number is blocked or not (e.g. 5554, an emulator number).
The Contacts, SMS, and phone-state permissions have been granted to allow access to the blocked numbers, and I followed the Android developer site: https://developer.android.com/reference/android/provider/BlockedNumberContract.html
But I am unable to get the blocked numbers. I am using the latest Android Studio 2.2.2 and checked the functionality in an Android N emulator; I don't have a device.
Here is my code.
public class MainActivity extends AppCompatActivity {
    private static final String TAG = "MainActivity"; // TAG was not defined in the original snippet

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
    }

    // Button onClick method to show the logs
    public void displayBlockCursorCount(View view) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N) {
            String number = "5552";
            if (BlockedNumberContract.canCurrentUserBlockNumbers(MainActivity.this)) {
                if (BlockedNumberContract.isBlocked(MainActivity.this, number)) {
                    Log.e(TAG, "given number is blocked >>>>>> " + number);
                }
            }
        }
    }
}
I am getting:
java.lang.SecurityException: Caller must be system, default dialer or default SMS app.
Please comment if you downvote. Thanks in advance.
To access blocked numbers, your app must be the default calling (dialer) app or the default messaging app; otherwise it throws a SecurityException.
Add an additional check:
private boolean isAppAsDefaultDialer() {
    // Assumes this method lives in a Context subclass (e.g. an Activity);
    // the original mixed an undefined mContext with getApplicationContext()
    TelecomManager telecom = getSystemService(TelecomManager.class);
    return getPackageName().equals(telecom.getDefaultDialerPackage());
}
or check the sources: https://android.googlesource.com/platform/packages/providers/BlockedNumberProvider/+/android-7.0.0_r1/src/com/android/providers/blockednumber/BlockedNumberProvider.java
And make your app the default dialer:
<intent-filter>
    <action android:name="android.intent.action.DIAL"/>
    <category android:name="android.intent.category.DEFAULT"/>
    <data android:scheme="tel"/>
</intent-filter>
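You can also prompt the user to switch the default dialer at runtime. A minimal sketch using the standard TelecomManager intent (API 23+); this snippet is an addition, not part of the original answer:
// Ask the user to make this app the default dialer
Intent intent = new Intent(TelecomManager.ACTION_CHANGE_DEFAULT_DIALER);
intent.putExtra(TelecomManager.EXTRA_CHANGE_DEFAULT_DIALER_PACKAGE_NAME, getPackageName());
startActivity(intent);
The system then shows a confirmation dialog; once the user accepts, the BlockedNumberContract calls above should no longer throw the SecurityException.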

How to interact with USSD dialog programmatically in android

I want to use the USSD dialog that appears after dialing a USSD code, say *123#, which asks the user to enter an option number to perform specific tasks depending on the SIM card vendor. I need to interact with that dialog programmatically to provide input in its text box.
So far I am able to read the USSD response that comes in an AlertDialog after dialing a USSD code, using an AccessibilityService, and I show the response in a Toast as in the code below. But I haven't found any way to interact with the USSD dialog yet.
public class UssdService extends AccessibilityService {
    public static String TAG = "USSD";

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        Log.d(TAG, "onAccessibilityEvent");
        String text = event.getText().toString();
        if (event.getClassName().equals("android.app.AlertDialog")) {
            Log.d(TAG, text);
            Toast.makeText(this, text, Toast.LENGTH_LONG).show();
        }
    }

    @Override
    public void onInterrupt() {
    }

    @Override
    protected void onServiceConnected() {
        super.onServiceConnected();
        Log.d(TAG, "onServiceConnected");
        AccessibilityServiceInfo info = new AccessibilityServiceInfo();
        info.flags = AccessibilityServiceInfo.DEFAULT;
        info.packageNames = new String[]{"com.android.phone"};
        info.eventTypes = AccessibilityEvent.TYPE_WINDOW_STATE_CHANGED;
        info.feedbackType = AccessibilityServiceInfo.FEEDBACK_GENERIC;
        setServiceInfo(info);
    }
}
Here is the service declaration in the manifest:
<service
    android:name=".UssdService"
    android:permission="android.permission.BIND_ACCESSIBILITY_SERVICE">
    <intent-filter>
        <action android:name="android.accessibilityservice.AccessibilityService" />
    </intent-filter>
    <meta-data
        android:name="android.accessibilityservice"
        android:resource="@xml/config_service" />
</service>
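The @xml/config_service resource referenced above isn't shown in the question; it would look much like the accessibility_service_config.xml in the previous answer. A minimal sketch matching the values set in onServiceConnected() (the exact file contents are an assumption):
<?xml version="1.0" encoding="utf-8"?>
<accessibility-service
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:accessibilityEventTypes="typeWindowStateChanged"
    android:accessibilityFeedbackType="feedbackGeneric"
    android:canRetrieveWindowContent="true"
    android:packageNames="com.android.phone"
    android:description="@string/app_name"
    android:notificationTimeout="100"/>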
To interact with the USSD dialog, I used the code below.
For the click event:
// nodeInfo is the dialog's AccessibilityNodeInfo (e.g. obtained from event.getSource())
List<AccessibilityNodeInfo> list = nodeInfo.findAccessibilityNodeInfosByText("Send");
for (AccessibilityNodeInfo node : list) {
    node.performAction(AccessibilityNodeInfo.ACTION_CLICK);
}
And to set text in the EditText (this sets the text wherever the current input focus is):
AccessibilityNodeInfo nodeInput = nodeInfo.findFocus(AccessibilityNodeInfo.FOCUS_INPUT);
Bundle bundle = new Bundle();
bundle.putCharSequence(AccessibilityNodeInfo.ACTION_ARGUMENT_SET_TEXT_CHARSEQUENCE, pMPIN);
nodeInput.performAction(AccessibilityNodeInfo.ACTION_SET_TEXT, bundle);
nodeInput.refresh();
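For completeness, a minimal sketch of kicking off the session the service then reads: dial the USSD code with an ACTION_CALL intent. This snippet is an addition, not part of the original question; it requires the CALL_PHONE permission, and the '#' must be URL-encoded:
String ussdCode = "*123#";
Intent call = new Intent(Intent.ACTION_CALL, Uri.parse("tel:" + Uri.encode(ussdCode)));
startActivity(call);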
My company Hover has developed an Android SDK which uses accessibility services to run multi-step USSD sessions and make them appear to happen inside your app. The underlying method is similar to what is outlined in the accepted answer, but there is more of an abstraction layer, support for lots of devices that behave differently, and the session is hidden from the user.
You create configurations for USSD services, trigger the session to run from your app and pass in any runtime variables you need. When the response is returned your app is notified and you can parse it as you need. It works on Android 4.3 and above.
The SDK is free to integrate and use until you hit large scale. Please see our docs to get started.
(Disclosure: I am the CTO of Hover)

How to Navigate a Google Glass GDK Immersion Application using Voice Command only?

How would I go about coding a voice trigger to navigate Google Glass Cards?
This is how I see it happening:
1) "Ok Glass, Start My Program"
2) Application begins and shows the first card
3) User can say "Next Card" to move to the next card
(somewhat the equivalent of swiping forward when in the timeline)
4) User can say "Previous Card" to go back
The cards I need to display are simple text and images; I'm wondering if I can set up a listener of some type to listen for voice commands while a card is being shown.
I've researched "Glass voice command nearest match from given list" but wasn't able to run the code, although I do have all the libraries.
Side note: it's important that the user still sees the card when using the voice command. Also, his hands are busy, so tap/swipe isn't an option.
Any ideas on how to control the timeline within my Immersion app using voice control only would be greatly appreciated!
I am tracking https://code.google.com/p/google-glass-api/issues/detail?id=273 as well.
My ongoing research led me back to the Google Glass developer docs for Google's suggested way of listening for gestures: https://developers.google.com/glass/develop/gdk/input/touch#detecting_gestures_with_a_gesture_detector
How can we activate these gestures with voice commands?
Android has also just beta-released a wearable-devices update (http://developer.android.com/wear/notifications/remote-input.html). Is there a way to use this to answer my question? It still feels like we are one step away, since we can call on the service but can't have it "sleep" and "wake up" as a background service when we talk.
Define this in your onCreate method:
// Assumes fields on the enclosing class: SpeechRecognizer sr; Intent intent;
// AudioManager mAudioManager; and a Context reference named context
mAudioManager = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
// mAudioManager.setStreamSolo(AudioManager.STREAM_VOICE_CALL, true);
sr = SpeechRecognizer.createSpeechRecognizer(context);
sr.setRecognitionListener(new listener(context));

// intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, "en-US");
intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
intent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE, context.getPackageName());
sr.startListening(intent);
Log.i("111111", "11111111" + "in");
Then simply add this listener class inside your class:
class listener implements RecognitionListener {
    Context context1;

    public listener(Context context) {
        //Log.i("onError startListening", "enter" + "nam");
        context1 = context;
    }

    public void onReadyForSpeech(Bundle params) {
        //Log.d(TAG, "onReadyForSpeech");
    }

    public void onBeginningOfSpeech() {
        //Log.d(TAG, "onBeginningOfSpeech");
    }

    public void onRmsChanged(float rmsdB) {
        //Log.d(TAG, "onRmsChanged");
    }

    public void onBufferReceived(byte[] buffer) {
        //Log.d(TAG, "onBufferReceived");
    }

    public void onEndOfSpeech() {
        //Log.d(TAG, "onEndOfSpeech");
        // Restart listening so recognition runs continuously
        sr.startListening(intent);
    }

    public void onError(int error) {
        //Log.d(TAG, "error " + error);
        // SpeechRecognizer error codes:
        // 1 - Network timeout, 2 - Network error, 3 - Audio recording error,
        // 4 - Server error, 5 - Other client side errors, 6 - No speech input,
        // 7 - No recognition result matched, 8 - RecognitionService busy,
        // 9 - Insufficient permissions
        // mText.setText("error " + error);
        if (error == 6 || error == 7 || error == 4 || error == 1 || error == 2
                || error == 5 || error == 3 || error == 8 || error == 9) {
            // Restart listening after any error
            sr.startListening(intent);
            //Log.i("onError startListening", "onError startListening" + error);
        }
    }

    public void onResults(Bundle results) {
        //Log.v(TAG, "onResults" + results);
        ArrayList<String> data = results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        for (int i = 0; i < data.size(); i++) {
            // Match each candidate against your own command list here
            //Log.d(TAG, "result " + data.get(i));
            //str += data.get(i);
            //Toast.makeText(context1, "results: " + data.get(0).toString(), Toast.LENGTH_LONG).show();
            //Log.v("my", "output" + "results: " + data.get(0).toString());
            //sr.startListening(intent);
        }
    }

    public void onPartialResults(Bundle partialResults) {
        //Log.d(TAG, "onPartialResults");
    }

    public void onEvent(int eventType, Bundle params) {
        //Log.d(TAG, "onEvent " + eventType);
    }
}
I'm writing out the entire code in detail since it took me such a long time to get this working; perhaps it'll save someone else valuable time.
This code is an implementation of Google's contextual voice commands, as described in the Google developer docs: Contextual voice commands.
ContextualMenuActivity.java
package com.drace.contextualvoicecommands;

import android.app.Activity;
import android.os.Bundle;
import android.view.Menu;
import android.view.MenuItem;
import com.drace.contextualvoicecommands.R;
import com.google.android.glass.view.WindowUtils;

public class ContextualMenuActivity extends Activity {

    @Override
    protected void onCreate(Bundle bundle) {
        super.onCreate(bundle);
        // Requests a voice menu on this activity. As for any other
        // window feature, be sure to request this before
        // setContentView() is called
        getWindow().requestFeature(WindowUtils.FEATURE_VOICE_COMMANDS);
        setContentView(R.layout.activity_main);
    }

    @Override
    public boolean onCreatePanelMenu(int featureId, Menu menu) {
        if (featureId == WindowUtils.FEATURE_VOICE_COMMANDS) {
            getMenuInflater().inflate(R.menu.main, menu);
            return true;
        }
        // Pass through to super to set up the touch menu.
        return super.onCreatePanelMenu(featureId, menu);
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }

    @Override
    public boolean onMenuItemSelected(int featureId, MenuItem item) {
        if (featureId == WindowUtils.FEATURE_VOICE_COMMANDS) {
            switch (item.getItemId()) {
                case R.id.dogs_menu_item:
                    // handle top-level dogs menu item
                    break;
                case R.id.cats_menu_item:
                    // handle top-level cats menu item
                    break;
                case R.id.lab_menu_item:
                    // handle second-level labrador menu item
                    break;
                case R.id.golden_menu_item:
                    // handle second-level golden menu item
                    break;
                case R.id.calico_menu_item:
                    // handle second-level calico menu item
                    break;
                case R.id.cheshire_menu_item:
                    // handle second-level cheshire menu item
                    break;
                default:
                    return true;
            }
            return true;
        }
        // Good practice to pass through to super if not handled
        return super.onMenuItemSelected(featureId, item);
    }
}
activity_main.xml (layout)
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <TextView
        android:id="@+id/coming_soon"
        android:layout_alignParentTop="true"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="@string/voice_command_test"
        android:textSize="22sp"
        android:layout_marginRight="40px"
        android:layout_marginTop="30px"
        android:layout_marginLeft="210px" />

</RelativeLayout>
strings.xml
<resources>
    <string name="app_name">Contextual voice commands</string>
    <string name="voice_start_command">Voice commands</string>
    <string name="voice_command_test">Say "Okay, Glass"</string>
    <string name="show_me_dogs">Dogs</string>
    <string name="labrador">labrador</string>
    <string name="golden">golden</string>
    <string name="show_me_cats">Cats</string>
    <string name="cheshire">cheshire</string>
    <string name="calico">calico</string>
</resources>
AndroidManifest.xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.drace.contextualvoicecommands"
    android:versionCode="1"
    android:versionName="1.0">

    <uses-sdk
        android:minSdkVersion="19"
        android:targetSdkVersion="19" />

    <uses-permission android:name="com.google.android.glass.permission.DEVELOPMENT"/>

    <application
        android:allowBackup="true"
        android:icon="@drawable/ic_launcher"
        android:label="@string/app_name">

        <activity
            android:name="com.drace.contextualvoicecommands.ContextualMenuActivity"
            android:label="@string/app_name">
            <intent-filter>
                <action android:name="com.google.android.glass.action.VOICE_TRIGGER" />
            </intent-filter>
            <meta-data
                android:name="com.google.android.glass.VoiceTrigger"
                android:resource="@xml/voice_trigger_start" />
        </activity>
    </application>
</manifest>
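The manifest references @xml/voice_trigger_start, which isn't included in the listings above. A minimal sketch of it, reusing the voice_start_command string from strings.xml (the exact contents of this file are an assumption):
<?xml version="1.0" encoding="utf-8"?>
<trigger keyword="@string/voice_start_command" />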
It's been tested and works great under Google Glass XE22!
You can try the snippet here: https://github.com/pscholl/glass_snippets/tree/master/hotword_detection.
You may want to try the contextual voice commands available in the GDK. While it does temporarily cover the screen with a menu, it allows voice-only input.
https://developers.google.com/glass/develop/gdk/voice
I did something very similar for one of my applications. It doesn't require the "OK Glass" screen at all, but the user does need to know the commands ahead of time. I explained a bit of it and provided links in my answer here: Glass GDK: Contextual voice commands without the "Ok Glass".
I hope this helps!

How do I get the Sony Small App SDK Sample Project to work?

I have tried several times to get the Sony Small Apps SDK sample project to work, but I can't.
Each time I run it I get the error:
06-02 10:40:13.358: E/AndroidRuntime(5903): java.lang.IllegalAccessError: Class ref in pre-verified class resolved to unexpected implementation
I have tried re-downloading it from their website as well as from the SDK manager, and it has the same problem each time. Has anyone been able to get it working?
Sample Project is here: https://dl.dropboxusercontent.com/u/2690965/SmallAppSample.zip
Here is the main application class:
public class MainApplication extends SmallApplication {

    @Override
    public void onCreate() {
        super.onCreate();

        /* Set the content of the application */
        setContentView(R.layout.main);

        /*
         * Set the content displayed when the application is minimized.
         * Calling this method is optional. If not called, the application icon is displayed.
         */
        setMinimizedView(R.layout.minimized);

        /* Set the title of the application to be displayed in the titlebar */
        setTitle(R.string.app_name);

        SmallAppWindow.Attributes attr = getWindow().getAttributes();

        /* Set the requested width of the application */
        attr.width = getResources().getDimensionPixelSize(R.dimen.width);
        /* Set the requested height of the application */
        attr.height = getResources().getDimensionPixelSize(R.dimen.height);

        /*
         * Set the minimum width of the application, if it's resizable.
         *
         * If you don't have a strong intention to specify a minimum window size,
         * it is preferable not to set one. If you still want to specify the
         * minimum size, set as small a value as possible to make your
         * application work properly on devices with small screens.
         */
        // attr.minWidth = getResources().getDimensionPixelSize(R.dimen.min_width);
        /* Set the minimum height of the application, if it's resizable */
        // attr.minHeight = getResources().getDimensionPixelSize(R.dimen.min_height);

        /* Use this flag to make the application window resizable */
        attr.flags |= SmallAppWindow.Attributes.FLAG_RESIZABLE;
        /* Use this flag to remove the titlebar from the window */
        // attr.flags |= SmallAppWindow.Attributes.FLAG_NO_TITLEBAR;
        /* Use this flag to enable hardware accelerated rendering */
        // attr.flags |= SmallAppWindow.Attributes.FLAG_HARDWARE_ACCELERATED;

        /* Set the window attributes to apply the changes above */
        getWindow().setAttributes(attr);

        setupOptionMenu();
    }

    @Override
    public void onStart() {
        super.onStart();
    }

    @Override
    public void onStop() {
        super.onStop();
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
    }

    private void setupOptionMenu() {
        View header = LayoutInflater.from(this).inflate(R.layout.header, null);
        final View optionMenu = header.findViewById(R.id.option_menu);
        optionMenu.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                PopupMenu popup = new PopupMenu(MainApplication.this, optionMenu);
                popup.getMenuInflater().inflate(R.menu.menus, popup.getMenu());
                popup.setOnMenuItemClickListener(new PopupMenu.OnMenuItemClickListener() {
                    @Override
                    public boolean onMenuItemClick(MenuItem item) {
                        Toast.makeText(MainApplication.this,
                                R.string.menu_clicked, Toast.LENGTH_SHORT).show();
                        return true;
                    }
                });
                popup.show();
            }
        });

        /* Deploy the option menu in the header area of the titlebar */
        getWindow().setHeaderView(header);
    }
}
Here is the Android Manifest
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android">

    <uses-sdk android:minSdkVersion="15" />
    <uses-permission android:name="com.sony.smallapp.permission.SMALLAPP" />

    <application android:label="@string/app_name">
        <uses-library android:name="com.sony.smallapp.framework" />
        <service
            android:name="MainApplication"
            android:exported="true">
            <intent-filter>
                <action android:name="com.sony.smallapp.intent.action.MAIN" />
                <category android:name="com.sony.smallapp.intent.category.LAUNCHER" />
            </intent-filter>
        </service>
    </application>
</manifest>
Any help is really appreciated.
I tried your code and it works fine, so I'd guess there's something wrong with the project references; this particular IllegalAccessError typically means the same classes are reachable from two different sources (for example, a library both compiled into the apk and loaded from the device), so check how the com.sony.smallapp.framework library is referenced. Hope this link helps: http://juristr.com/blog/2010/06/android-instrumentation-test/
