I am using the voicerecognition.java activity provided by Google, which allows the user to choose a language, talk into the handset, and view what they have just said. It works when I take out all the language constants, but when I put the language options back in, I get errors.
For example:
The method getVoiceDetailsIntent(VoiceRecognition) is undefined for the type
EXTRA_SUPPORTED_LANGUAGES cannot be resolved or is not a field
EXTRA_LANGUAGE_PREFERENCE cannot be resolved or is not a field
The method run() of type new Runnable(){} must override a superclass method
I thought I might just need to update the SDK or something, but everything is up to date. I have no idea what is wrong, since I copied the activity exactly as it appears at the link provided.
package com.sample.voicerecog;
import android.app.Activity;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.os.Bundle;
import android.os.Handler;
import android.speech.RecognizerIntent;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.ListView;
import android.widget.Spinner;
import android.widget.SpinnerAdapter;
import android.widget.TextView;
import android.widget.Toast;
import java.util.ArrayList;
import java.util.List;
public class VoiceRecognition extends Activity implements OnClickListener {
private static final String TAG = "VoiceRecognition";
private static final int VOICE_RECOGNITION_REQUEST_CODE = 1234;
private ListView mList;
private Handler mHandler;
private Spinner mSupportedLanguageView;
/**
* Called when the activity is first created.
*/
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
mHandler = new Handler();
// Inflate our UI from its XML layout description.
setContentView(R.layout.voice_recognition);
// Get display items for later interaction
Button speakButton = (Button) findViewById(R.id.btn_speak);
mList = (ListView) findViewById(R.id.list);
mSupportedLanguageView = (Spinner) findViewById(R.id.supported_languages);
// Check to see if a recognition activity is present
PackageManager pm = getPackageManager();
List<ResolveInfo> activities = pm.queryIntentActivities(
new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
if (activities.size() != 0) {
speakButton.setOnClickListener(this);
} else {
speakButton.setEnabled(false);
speakButton.setText("Recognizer not present");
}
// Most applications do not have to handle the voice settings. If the application
// does not require recognition in a specific language (i.e., one different from the
// system locale), it does not need to read the voice settings.
refreshVoiceSettings();
}
/**
* Handle the click on the start recognition button.
*/
public void onClick(View v) {
if (v.getId() == R.id.btn_speak) {
startVoiceRecognitionActivity();
}
}
/**
* Fire an intent to start the speech recognition activity.
*/
private void startVoiceRecognitionActivity() {
Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
// Display a hint to the user about what to say.
intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speech recognition demo");
// Give the recognizer a hint about what the user is going to say
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
// Specify how many results you want to receive. The results will be sorted
// where the first result is the one with the highest confidence.
intent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 5);
// Specify the recognition language. This parameter only needs to be specified if
// recognition must be done in a specific language rather than the default one (i.e.,
// the system locale). Most applications do not need to set this parameter.
if (!mSupportedLanguageView.getSelectedItem().toString().equals("Default")) {
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE,
mSupportedLanguageView.getSelectedItem().toString());
}
startActivityForResult(intent, VOICE_RECOGNITION_REQUEST_CODE);
}
/**
* Handle the results from the recognition activity.
*/
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == VOICE_RECOGNITION_REQUEST_CODE && resultCode == RESULT_OK) {
// Fill the list view with the strings the recognizer thought it could have heard
ArrayList<String> matches = data.getStringArrayListExtra(
RecognizerIntent.EXTRA_RESULTS);
mList.setAdapter(new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1,
matches));
}
super.onActivityResult(requestCode, resultCode, data);
}
private void refreshVoiceSettings() {
Log.i(TAG, "Sending broadcast");
sendOrderedBroadcast(RecognizerIntent.getVoiceDetailsIntent(this), null,
new SupportedLanguageBroadcastReceiver(), null, Activity.RESULT_OK, null, null);
}
private void updateSupportedLanguages(List<String> languages) {
// We add "Default" at the beginning of the list to simulate default language.
languages.add(0, "Default");
SpinnerAdapter adapter = new ArrayAdapter<CharSequence>(this,
android.R.layout.simple_spinner_item, languages.toArray(
new String[languages.size()]));
mSupportedLanguageView.setAdapter(adapter);
}
private void updateLanguagePreference(String language) {
TextView textView = (TextView) findViewById(R.id.language_preference);
textView.setText(language);
}
/**
* Handles the response of the broadcast request about the recognizer supported languages.
*
* The receiver is required only if the application wants to do recognition in a specific language.
*/
private class SupportedLanguageBroadcastReceiver extends BroadcastReceiver {
@Override
public void onReceive(Context context, final Intent intent) {
Log.i(TAG, "Receiving broadcast " + intent);
final Bundle extra = getResultExtras(false);
if (getResultCode() != Activity.RESULT_OK) {
mHandler.post(new Runnable() {
@Override
public void run() {
showToast("Error code:" + getResultCode());
}
});
}
if (extra == null) {
mHandler.post(new Runnable() {
@Override
public void run() {
showToast("No extra");
}
});
// Without returning here, the containsKey() calls below would throw a NullPointerException.
return;
}
if (extra.containsKey(RecognizerIntent.EXTRA_SUPPORTED_LANGUAGES)) {
mHandler.post(new Runnable() {
@Override
public void run() {
updateSupportedLanguages(extra.getStringArrayList(
RecognizerIntent.EXTRA_SUPPORTED_LANGUAGES));
}
});
}
if (extra.containsKey(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE)) {
mHandler.post(new Runnable() {
@Override
public void run() {
updateLanguagePreference(
extra.getString(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE));
}
});
}
}
private void showToast(String text) {
Toast.makeText(VoiceRecognition.this, text, Toast.LENGTH_SHORT).show();
}
}
}
Any help is appreciated
It appears that your project is built against an API level lower than 8. To fix this, if you are using Eclipse, right-click on your project and choose Properties. Select Android from the menu on the left and pick a different project build target. If you don't see one at level 8 or higher, go to Window -> Android SDK Manager and install one. You can see which RecognizerIntent features are available at each API level here.
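Note that these are compile-time errors, so raising the build target is the actual fix. Once the project builds against level 8 or higher, you can still keep the app usable on older devices by guarding the newer calls at runtime; a minimal sketch:
// getVoiceDetailsIntent(), EXTRA_SUPPORTED_LANGUAGES and EXTRA_LANGUAGE_PREFERENCE
// were all added in API level 8 (Froyo), so only touch them on Froyo or later.
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.FROYO) {
    refreshVoiceSettings();
}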
Related
I have the following piece of code:
package audiovisuales.aventuresengulpiyuri;
import android.content.Intent;
import android.speech.tts.TextToSpeech;
import android.speech.tts.TextToSpeech.OnInitListener;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.view.View;
import util.Utilidades;
public class PaginaPrimera extends AppCompatActivity implements OnInitListener{
private TextToSpeech tts;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_pagina_primera);
if (!Utilidades.verificaConexion(this)){
Utilidades.mostrarVentanaErrorDeConexion(this);
}
else if(Portada.getLecturaAutomatica()){
tts= new TextToSpeech(PaginaPrimera.this, this);
tts.speak(getResources().getString(R.string.primeraPagina), TextToSpeech.QUEUE_FLUSH, null);
}
}
The minimum API I have set is API 19. I have checked that it enters the else block and detects the text to speak, but when I run it, it doesn't say anything.
You need to check if TTS is available on the device.
First, in your onCreate method have this:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_pagina_primera);
// REQUEST_DATA_CHECK_CODE is just an int request code you define yourself
Intent checkIntent = new Intent();
checkIntent.setAction(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA);
startActivityForResult(checkIntent, REQUEST_DATA_CHECK_CODE);
}
Then, in your onActivityResult, check whether it is OK to use TTS.
protected void onActivityResult(
int requestCode, int resultCode, Intent data) {
if (requestCode == REQUEST_DATA_CHECK_CODE) {
if (resultCode == TextToSpeech.Engine.CHECK_VOICE_DATA_PASS) {
// success, create the TTS instance
mTts = new TextToSpeech(this, this);
} else {
// missing data, install it
Intent installIntent = new Intent();
installIntent.setAction(
TextToSpeech.Engine.ACTION_INSTALL_TTS_DATA);
startActivity(installIntent);
}
}
}
Then check if the language of the device is available.
if (mTts.isLanguageAvailable(Locale.getDefault()) >= 0) {
mTts.setLanguage(Locale.getDefault());
mTts.speak(textToBeSpoken, TextToSpeech.QUEUE_FLUSH, null);
}
If the language is not available, you may try to install the data for that language or you may change the language.
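Putting the pieces together, the onInit callback is a natural place for that language check, along these lines (just a sketch, reusing the mTts field and textToBeSpoken string from above; Locale is java.util.Locale):
@Override
public void onInit(int status) {
    if (status == TextToSpeech.SUCCESS) {
        // isLanguageAvailable() returns a negative value when the locale is unsupported
        if (mTts.isLanguageAvailable(Locale.getDefault()) >= TextToSpeech.LANG_AVAILABLE) {
            mTts.setLanguage(Locale.getDefault());
            mTts.speak(textToBeSpoken, TextToSpeech.QUEUE_FLUSH, null);
        }
    }
}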
I am new to Android development. I have to make an application which detects sound and plays music. I have already taken care of the music-playing part, but the problem is that I don't understand how to put in a condition to check whether there is even a sound or not.
I'm using Eclipse, Android API 20, and the voice recognition intent. Kindly look at the following code.
import android.app.Activity;
import android.os.Bundle;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.speech.RecognizerIntent;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.ListView;
import java.util.ArrayList;
import java.util.List;
/**
* A very simple application to handle Voice Recognition intents
* and display the results
*/
public class MainActivity extends Activity
{
private static final int REQUEST_CODE = 1234;
private ListView wordsList;
/**
* Called when the activity is first created.
*/
@Override
public void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
setContentView(R.layout.voice_recog);
Button speakButton = (Button) findViewById(R.id.speakButton);
wordsList = (ListView) findViewById(R.id.list);
// Disable button if no recognition service is present
PackageManager pm = getPackageManager();
List<ResolveInfo> activities = pm.queryIntentActivities(
new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
if (activities.size() == 0)
{
speakButton.setEnabled(false);
speakButton.setText("Recognizer not present");
}
else
{startVoiceRecognitionActivity();}
}
/**
* Handle the action of the button being clicked
*/
/**
* Fire an intent to start the voice recognition activity.
*/
private void startVoiceRecognitionActivity()
{
Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Voice recognition Demo...");
startActivityForResult(intent, REQUEST_CODE);
}
/**
* Handle the results from the voice recognition activity.
*/
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data)
{
if (requestCode == REQUEST_CODE && resultCode == RESULT_OK)
{
// Populate the wordsList with the String values the recognition engine thought it heard
ArrayList<String> matches = data.getStringArrayListExtra(
RecognizerIntent.EXTRA_RESULTS);
wordsList.setAdapter(new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1,
matches));
}
super.onActivityResult(requestCode, resultCode, data);
}
}
I'm working with the voice recognition API, using the code that is already provided in the API examples. The features that I want in the same activity example are:
A) That it works even when my phone is not in use, i.e., when the screen is locked.
B) If the Google API doesn't find the word, it shows a dialog saying cancel/speak again; how can the code select "speak again" by itself?
Here's the code:
package com.wwwww.and;
import android.app.Activity;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.os.Bundle;
import android.speech.RecognizerIntent;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.ListView;
import java.util.ArrayList;
import java.util.List;
/**
* Sample code that invokes the speech recognition intent API.
*/
public class VoiceRecognition extends Activity implements OnClickListener {
private static final int VOICE_RECOGNITION_REQUEST_CODE = 1234;
private ListView mList;
/**
* Called when the activity is first created.
*/
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
// Inflate our UI from its XML layout description.
setContentView(R.layout.voice_recognition);
// Get display items for later interaction
Button speakButton = (Button) findViewById(R.id.btn_speak);
mList = (ListView) findViewById(R.id.list);
// Check to see if a recognition activity is present
PackageManager pm = getPackageManager();
List<ResolveInfo> activities = pm.queryIntentActivities(
new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
if (activities.size() != 0) {
speakButton.setOnClickListener(this);
} else {
speakButton.setEnabled(false);
speakButton.setText("Recognizer not present");
}
startVoiceRecognitionActivity();
}
/**
* Handle the click on the start recognition button.
*/
public void onClick(View v) {
if (v.getId() == R.id.btn_speak) {
//startVoiceRecognitionActivity();
}
}
/**
* Fire an intent to start the speech recognition activity.
*/
private void startVoiceRecognitionActivity() {
Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
intent.putExtra(RecognizerIntent.EXTRA_PROMPT, " VoiceRecognition Service");
startActivityForResult(intent, VOICE_RECOGNITION_REQUEST_CODE);
}
/**
* Handle the results from the recognition activity.
*/
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == VOICE_RECOGNITION_REQUEST_CODE && resultCode == RESULT_OK) {
// Fill the list view with the strings the recognizer thought it could have heard
ArrayList<String> matches = data.getStringArrayListExtra(
RecognizerIntent.EXTRA_RESULTS);
mList.setAdapter(new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1,
matches));
}
super.onActivityResult(requestCode, resultCode, data);
}
}
A) That it works even when my phone is not in use, i.e., when the screen is locked.
You need to implement the speech recognition in a service.
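A rough sketch of what that could look like, using SpeechRecognizer with a RecognitionListener instead of the dialog-based intent (the class name ListenService is just illustrative, and the RECORD_AUDIO permission is assumed to be declared in the manifest):
import android.app.Service;
import android.content.Intent;
import android.os.Bundle;
import android.os.IBinder;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import java.util.ArrayList;

public class ListenService extends Service implements RecognitionListener {
    private SpeechRecognizer mRecognizer;

    @Override
    public void onCreate() {
        super.onCreate();
        // SpeechRecognizer does not show the "Speak now" dialog, so it can keep
        // running from a service even while the screen is off (subject to the
        // device's power management).
        mRecognizer = SpeechRecognizer.createSpeechRecognizer(this);
        mRecognizer.setRecognitionListener(this);
        startListening();
    }

    private void startListening() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        mRecognizer.startListening(intent);
    }

    @Override
    public void onResults(Bundle results) {
        ArrayList<String> matches =
                results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        // ... handle the matches, then listen again for continuous recognition
        startListening();
    }

    @Override
    public void onError(int error) {
        // e.g. ERROR_NO_MATCH: simply start listening again
        startListening();
    }

    // Remaining RecognitionListener callbacks left empty for brevity
    @Override public void onReadyForSpeech(Bundle params) {}
    @Override public void onBeginningOfSpeech() {}
    @Override public void onRmsChanged(float rmsdB) {}
    @Override public void onBufferReceived(byte[] buffer) {}
    @Override public void onEndOfSpeech() {}
    @Override public void onPartialResults(Bundle partialResults) {}
    @Override public void onEvent(int eventType, Bundle params) {}

    @Override
    public IBinder onBind(Intent intent) { return null; }

    @Override
    public void onDestroy() {
        mRecognizer.destroy();
        super.onDestroy();
    }
}
You would start it from the activity with startService(new Intent(this, ListenService.class)); how reliably it keeps listening with the screen locked still depends on the device.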
B) If the Google API doesn't find the word, it shows a dialog saying cancel/speak again; how can the code select "speak again" by itself?
Change your onActivityResult to the one below
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data)
{
if (requestCode == VOICE_RECOGNITION_REQUEST_CODE)
{
if (resultCode == RESULT_OK)
{
// Fill the list view with the strings the recognizer thought it could have heard
ArrayList<String> matches = null;
if (data != null)
{
matches = data.getStringArrayListExtra(
RecognizerIntent.EXTRA_RESULTS);
}
if (matches == null || matches.size() == 0)
{
startVoiceRecognitionActivity();
}
else
{
mList.setAdapter(new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1,
matches));
}
}
else
{
startVoiceRecognitionActivity();
}
}
super.onActivityResult(requestCode, resultCode, data);
}
I am working on voice input in android. I used the sample from
http://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/app/VoiceRecognition.html
While testing on an Xperia X10, I got the "Speak now" dialog, but it closes before I can input any voice. I am trying to implement voice search, e.g. if the voice input is "James Bond", then I want to put "James" into the first-name EditText and "Bond" into the last-name EditText, which will then search the database for the name. But when trying to use the API Demos sample, it's not working. Maybe I am missing something. Would anybody post a sample for voice input other than the ApiDemos sample?
Thanks in advance.
You can use the following code for voice recognition. For a complete voice recognition tutorial, click here.
import android.app.Activity;
import android.os.Bundle;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.speech.RecognizerIntent;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.ListView;
import java.util.ArrayList;
import java.util.List;
/**
* A very simple application to handle Voice Recognition intents
* and display the results
*/
public class VoiceRecognitionDemo extends Activity
{
private static final int REQUEST_CODE = 1234;
private ListView wordsList;
/**
* Called when the activity is first created.
*/
@Override
public void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
setContentView(R.layout.voice_recog);
Button speakButton = (Button) findViewById(R.id.speakButton);
wordsList = (ListView) findViewById(R.id.list);
// Disable button if no recognition service is present
PackageManager pm = getPackageManager();
List<ResolveInfo> activities = pm.queryIntentActivities(
new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
if (activities.size() == 0)
{
speakButton.setEnabled(false);
speakButton.setText("Recognizer not present");
}
}
/**
* Handle the action of the button being clicked
*/
public void speakButtonClicked(View v)
{
startVoiceRecognitionActivity();
}
/**
* Fire an intent to start the voice recognition activity.
*/
private void startVoiceRecognitionActivity()
{
Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Voice recognition Demo...");
startActivityForResult(intent, REQUEST_CODE);
}
/**
* Handle the results from the voice recognition activity.
*/
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data)
{
if (requestCode == REQUEST_CODE && resultCode == RESULT_OK)
{
// Populate the wordsList with the String values the recognition engine thought it heard
ArrayList<String> matches = data.getStringArrayListExtra(
RecognizerIntent.EXTRA_RESULTS);
wordsList.setAdapter(new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1,
matches));
}
super.onActivityResult(requestCode, resultCode, data);
}
}
I want to develop an application which implements voice recognition and then text to speech using the text-to-speech engine. I posted the code below. I use two buttons and a list view: one button is used for voice recognition, the other for text to speech, and the list view is used for both (first the result of voice recognition is posted in the list view, and then the application should read the words back from the list view). When I touch the button for voice recognition, the words are posted in my list view, but when I press the button for text to speech, the application doesn't read back the words from the list view, and I don't see anything about it in my logcat when I press this button.
Here is my program:
package rtv.rtv.rtv;
import android.app.Activity;
import android.os.Bundle;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.speech.RecognizerIntent;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.ListView;
import java.util.ArrayList;
import java.util.List;
import android.speech.tts.TextToSpeech;
import android.speech.tts.TextToSpeech.OnInitListener;
import android.util.Log;
public class VoiceRecTextSpeech extends Activity implements OnClickListener,OnInitListener {
private static final int VOICE_RECOGNITION_REQUEST_CODE = 1234;
private ListView mList;
private Button speakBtn = null;
private static final int REQ_TTS_STATUS_CHECK = 0;
private static final String TAG = "TTS Demo";
private TextToSpeech mTts;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
// Button and ListView for Voice Recognition
Button speakButton = (Button) findViewById(R.id.btn_speak);
mList = (ListView) findViewById(R.id.list);
//Button for Text to Speech
speakBtn = (Button)findViewById(R.id.speak);
// Check to see if a recognition activity is present
PackageManager pm = getPackageManager();
List<ResolveInfo> activities = pm.queryIntentActivities(
new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
if (activities.size() != 0) {
speakButton.setOnClickListener(this);
} else {
speakButton.setEnabled(false);
speakButton.setText("Recognizer not present");
}
// Check to be sure that TTS exists and is okay to use
Intent checkIntent = new Intent();
checkIntent.setAction(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA);
startActivityForResult(checkIntent, REQ_TTS_STATUS_CHECK);
}
//Handle the click on the start recognition button and on text to speech button
public void onClick(View v) {
switch (v.getId()) {
case R.id.btn_speak:
startVoiceRecognitionActivity();
break;
case R.id.speak:
mTts.speak(mList.toString(), TextToSpeech.QUEUE_ADD, null);
break;
}
}
//Fire an intent to start the speech recognition activity
private void startVoiceRecognitionActivity() {
Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speech recognition demo");
startActivityForResult(intent, VOICE_RECOGNITION_REQUEST_CODE);
}
// Handle the results
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == VOICE_RECOGNITION_REQUEST_CODE && resultCode == RESULT_OK) {
// Fill the list view with the strings the recognizer thought it could have heard
ArrayList<String> matches = data.getStringArrayListExtra(
RecognizerIntent.EXTRA_RESULTS);
mList.setAdapter(new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1,
matches));
}
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == REQ_TTS_STATUS_CHECK) {
switch (resultCode) {
case TextToSpeech.Engine.CHECK_VOICE_DATA_PASS:
// TTS is up and running
mTts = new TextToSpeech(this, this);
Log.v(TAG, "Pico is installed okay");
break;
case TextToSpeech.Engine.CHECK_VOICE_DATA_BAD_DATA:
case TextToSpeech.Engine.CHECK_VOICE_DATA_MISSING_DATA:
case TextToSpeech.Engine.CHECK_VOICE_DATA_MISSING_VOLUME:
// missing data, install it
Log.v(TAG, "Need language stuff: " + resultCode);
Intent installIntent = new Intent();
installIntent.setAction(
TextToSpeech.Engine.ACTION_INSTALL_TTS_DATA);
startActivity(installIntent);
break;
case TextToSpeech.Engine.CHECK_VOICE_DATA_FAIL:
default:
Log.e(TAG, "Got a failure. TTS apparently not available");
}
}
else {
// Got something else
}
}
@Override
public void onInit(int status) {
// Now that the TTS engine is ready, we enable the button
if( status == TextToSpeech.SUCCESS) {
speakBtn.setEnabled(true);
}
}
@Override
public void onPause()
{
super.onPause();
// if we're losing focus, stop talking
if( mTts != null)
mTts.stop();
}
@Override
public void onDestroy()
{
super.onDestroy();
mTts.shutdown();
}
}
Here is the main.xml :
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="vertical"
android:layout_width="fill_parent"
android:layout_height="fill_parent">
<Button android:id="#+id/speak"
android:layout_height="wrap_content"
android:enabled="false" android:text="Text To Speech" android:layout_width="match_parent"/>
<Button android:id="#+id/btn_speak"
android:layout_width="match_parent"
android:layout_height="wrap_content" android:text="Speak for Voice Recognition"/>
<ListView android:id="#+id/list"
android:layout_width="match_parent"
android:layout_height="0dip"
android:layout_weight="1" />
</LinearLayout>
Thanks !
Your TTS button has no OnClickListener registered, so your code that starts TTS is never called. Also, are you sure you want to convert the ListView itself to a String and pass it to the TTS engine? More likely you want to convert the data in the ListView's adapter to a String.
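In other words, register the listener in onCreate and build the spoken text from the adapter's items rather than from mList.toString(); roughly (a sketch only):
// In onCreate(), after looking up the button:
speakBtn.setOnClickListener(this);

// In onClick(), read the recognition results out of the list's adapter:
case R.id.speak:
    ArrayAdapter<String> adapter = (ArrayAdapter<String>) mList.getAdapter();
    if (adapter != null) {
        StringBuilder toSpeak = new StringBuilder();
        for (int i = 0; i < adapter.getCount(); i++) {
            toSpeak.append(adapter.getItem(i)).append(". ");
        }
        mTts.speak(toSpeak.toString(), TextToSpeech.QUEUE_ADD, null);
    }
    break;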
Even if status == TextToSpeech.SUCCESS passes, it could be that the language data is not supported.
Sometime after onInit() you need to execute some code that looks like this:
public void onInit(Context context, TextToSpeech tts, Locale forLocale)
{
Locale defaultOrPassedIn = forLocale;
if (forLocale == null)
{
defaultOrPassedIn = Locale.getDefault();
}
// check if language is available
switch (tts.isLanguageAvailable(defaultOrPassedIn))
{
case TextToSpeech.LANG_AVAILABLE:
case TextToSpeech.LANG_COUNTRY_AVAILABLE:
case TextToSpeech.LANG_COUNTRY_VAR_AVAILABLE:
tts.setLanguage(defaultOrPassedIn);
break;
case TextToSpeech.LANG_MISSING_DATA:
Log.d(TAG, "MISSING_DATA");
// check if waiting....
Intent installIntent = new Intent();
installIntent
.setAction(TextToSpeech.Engine.ACTION_INSTALL_TTS_DATA);
context.startActivity(installIntent);
break;
case TextToSpeech.LANG_NOT_SUPPORTED:
Log.d(TAG, "NOT SUPPORTED");
// report failure to the user
break;
}
}
Did you ensure that it is actually receiving results? As in, does your adapter actually have elements in it? Always debug from the starting point.
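For example, a quick log line inside the RESULT_OK branch of onActivityResult (just a sketch, using the existing TAG) would show whether the recognizer returned anything at all:
ArrayList<String> matches = data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
Log.d(TAG, "Recognizer returned: " + matches);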