I want to set a lock screen password programmatically and then remove it when the loop has executed. I have added Device Administration successfully; can someone help me SET and UNSET the lock screen password from within my application itself? Below is my working code for Device Administration:
public class DevicePolicyDemoActivity extends Activity {
static final String TAG = "DevicePolicyDemoActivity";
static final int ACTIVATION_REQUEST = 47; // identifies our request id
DevicePolicyManager devicePolicyManager;
ComponentName demoDeviceAdmin;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
// Initialize Device Policy Manager service and our receiver class
devicePolicyManager = (DevicePolicyManager) getSystemService(Context.DEVICE_POLICY_SERVICE);
demoDeviceAdmin = new ComponentName(this, DemoDeviceAdminReceiver.class);
Intent intent = new Intent(
DevicePolicyManager.ACTION_ADD_DEVICE_ADMIN);
intent.putExtra(DevicePolicyManager.EXTRA_DEVICE_ADMIN,
demoDeviceAdmin);
intent.putExtra(DevicePolicyManager.EXTRA_ADD_EXPLANATION,
"Your boss told you to do this");
startActivityForResult(intent, ACTIVATION_REQUEST);
}
}
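For this to work, the DemoDeviceAdminReceiver referenced above must exist and be declared in the manifest with the android.permission.BIND_DEVICE_ADMIN permission and a device-admin meta-data XML listing the policies you use (for example reset-password and force-lock). A minimal sketch of the receiver class itself (the body can stay empty; the callback override is optional):
public class DemoDeviceAdminReceiver extends DeviceAdminReceiver {
    @Override
    public void onEnabled(Context context, Intent intent) {
        // Called once the user confirms activation in the system dialog.
        Log.d("DemoDeviceAdmin", "Device admin enabled");
    }
}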
For setting the password you can use the code below:
devicePolicyManager.setPasswordQuality(
        demoDeviceAdmin,
        DevicePolicyManager.PASSWORD_QUALITY_UNSPECIFIED);
devicePolicyManager.setPasswordMinimumLength(demoDeviceAdmin, 5);
boolean result = devicePolicyManager.resetPassword("123456",
        DevicePolicyManager.RESET_PASSWORD_REQUIRE_ENTRY);
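To unset the password again, the usual approach is to drop the quality requirement and reset the password to an empty string. A sketch, with the caveat that resetPassword() has been restricted since Android 7.0 and is no longer usable this way by a plain device admin on newer versions:
// Clear the lock screen password (older API levels; may fail on Android 7.0+).
devicePolicyManager.setPasswordQuality(
        demoDeviceAdmin,
        DevicePolicyManager.PASSWORD_QUALITY_UNSPECIFIED);
devicePolicyManager.setPasswordMinimumLength(demoDeviceAdmin, 0);
boolean cleared = devicePolicyManager.resetPassword("", 0);
// Optionally lock the device immediately so the change takes effect:
devicePolicyManager.lockNow();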
Try this example:
public class Main extends Activity implements TextToSpeech.OnInitListener
{
private TextToSpeech mTts;
// This code can be any value you want; it's just a request code used to identify our check.
private static final int MY_DATA_CHECK_CODE = 1234;
/**
* Called when the activity is first created.
*/
@Override
public void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
// Fire off an intent to check if a TTS engine is installed
Intent checkIntent = new Intent();
checkIntent.setAction(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA);
startActivityForResult(checkIntent, MY_DATA_CHECK_CODE);
}
/**
* Executed when a new TTS is instantiated. Some static text is spoken via TTS here.
* @param i
*/
public void onInit(int i)
{
mTts.speak("Hello folks, welcome to my little demo on Text To Speech.",
TextToSpeech.QUEUE_FLUSH, // Drop all pending entries in the playback queue.
null);
}
/**
* This is the callback from the TTS engine check, if a TTS is installed we
* create a new TTS instance (which in turn calls onInit), if not then we will
* create an intent to go off and install a TTS engine
* @param requestCode int Request code returned from the check for TTS engine.
* @param resultCode int Result code returned from the check for TTS engine.
* @param data Intent Intent returned from the TTS check.
*/
public void onActivityResult(int requestCode, int resultCode, Intent data)
{
if (requestCode == MY_DATA_CHECK_CODE)
{
if (resultCode == TextToSpeech.Engine.CHECK_VOICE_DATA_PASS)
{
// success, create the TTS instance
mTts = new TextToSpeech(this, this);
}
else
{
// missing data, install it
Intent installIntent = new Intent();
installIntent.setAction(
TextToSpeech.Engine.ACTION_INSTALL_TTS_DATA);
startActivity(installIntent);
}
}
}
/**
* Be kind, once you've finished with the TTS engine, shut it down so other
* applications can use it without us interfering with it :)
*/
@Override
public void onDestroy()
{
// Don't forget to shutdown!
if (mTts != null)
{
mTts.stop();
mTts.shutdown();
}
super.onDestroy();
}
}
I'm trying to figure out how to let my VPN application toggle the Always-On flag from within the application itself, using a toggle button.
I'm aware of
DevicePolicyManager#setAlwaysOnVpnPackage
However, it's not very clear how to use this function. I have tried the following:
Admin.java
public class Admin extends DeviceAdminReceiver {
@Override
public void onEnabled(@NonNull Context context, @NonNull Intent intent) {
super.onEnabled(context, intent);
}
}
AdvancedSettings.java
public class AdvancedSettings extends AppCompatActivity
implements View.OnClickListener {
private ComponentName componentName;
private DevicePolicyManager devicePolicyManager;
private boolean alwaysOnConfiguredValue;
private static final int ALWAYS_ON_REQUEST_CODE = 11;
@Override
public void onCreate(Bundle savedInstanceState){
super.onCreate(savedInstanceState);
setContentView(R.layout.settings_advanced);
Button button = findViewById(R.id.toggleAlwaysOnButton);
button.setOnClickListener(this);
devicePolicyManager = (DevicePolicyManager) this
.getSystemService(Context.DEVICE_POLICY_SERVICE);
componentName = new ComponentName(
this.getApplicationContext(), Admin.class);
}
@Override
public void onClick(View v) {
if (v.getId() == R.id.toggleAlwaysOnButton) {
this.setAlwaysOn(true);
}
}
/**
* Handle the Activity Result.
*/
@Override
protected void onActivityResult(
int requestCode, int resultCode, @Nullable Intent data
) {
if (requestCode == ALWAYS_ON_REQUEST_CODE) {
if (resultCode == Activity.RESULT_OK) {
finalizeAlwaysOnToggle();
} else {
Log.w("AdvancedSettings", "Invalid result code " + resultCode);
}
}
super.onActivityResult(requestCode, resultCode, data);
}
/**
* Start the process of enabling "Always On" for the VPN.
*
* @param value true to enable Always-On, false to disable it
*/
private void setAlwaysOn(boolean value) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N) {
alwaysOnConfiguredValue = value;
if (devicePolicyManager.isAdminActive(componentName)) {
finalizeAlwaysOnToggle();
return;
}
requestAdminAccess();
} else {
Toast.makeText(this, "Not supported", Toast.LENGTH_LONG).show();
}
}
/**
* Request Admin Access for this application
* if it has not already been done.
*/
private void requestAdminAccess() {
Intent intent = new Intent(DevicePolicyManager.ACTION_ADD_DEVICE_ADMIN);
intent.putExtra(DevicePolicyManager.EXTRA_DEVICE_ADMIN, componentName);
intent.putExtra(
DevicePolicyManager.EXTRA_ADD_EXPLANATION,
"This is required to modify the Always-On Feature from within the Test Application."
);
this.startActivityForResult(intent, ALWAYS_ON_REQUEST_CODE);
}
/**
* Finalize setting the always on toggle after the Admin Access
* has been granted for this application.
*/
private void finalizeAlwaysOnToggle() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N) {
try {
if (devicePolicyManager.isAdminActive(componentName)) {
devicePolicyManager.setAlwaysOnVpnPackage(
componentName, (alwaysOnConfiguredValue) ? "com.myapp" : null, true
);
} else {
Log.e("AdvancedSettings",
"Device Policy Manager Admin is not yet active while " +
"trying to finalize changes to AlwaysOnToggle.");
}
} catch (PackageManager.NameNotFoundException e) {
Log.e("Unable to set always on vpn due to NameNotFound Exception.", e);
}
}
}
}
It processes the request for adding the Device Admin just fine, however after that has completed, when it runs finalizeAlwaysOnToggle(), during the call to devicePolicyManager.setAlwaysOnVpnPackage I receive the following error:
E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.myapp, PID: 30778
java.lang.RuntimeException: Failure delivering result ResultInfo{who=null, request=11, result=-1, data=null} to activity {com.myapp/com.myapp.ui.settings.AdvancedSettings}: java.lang.SecurityException: Admin ComponentInfo{com.myapp/com.myapp.provider.Admin} does not own the profile
You have to differentiate between "Device Admin", "Device Owner" and "Profile Owner".
As stated in the docs, you need to be one of the latter two to be able to call setAlwaysOnVpnPackage:
Called by a device or profile owner to configure an always-on VPN
connection through a specific application for the current user. This
connection is automatically granted and persisted after a reboot.
(https://developer.android.com/reference/android/app/admin/DevicePolicyManager.html#setAlwaysOnVpnPackage(android.content.ComponentName,%2520java.lang.String,%2520boolean))
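An active device admin is therefore not enough. As a sketch (reusing componentName and devicePolicyManager from your code, and assuming your VPN app's package is com.myapp, as in the stack trace), you could guard the call by checking for owner status first; that status is granted through provisioning (for development typically adb shell dpm set-device-owner), not through ACTION_ADD_DEVICE_ADMIN:
String pkg = getPackageName();  // "com.myapp" in your case
if (devicePolicyManager.isDeviceOwnerApp(pkg)
        || devicePolicyManager.isProfileOwnerApp(pkg)) {
    try {
        // Only a device owner or profile owner may call this.
        devicePolicyManager.setAlwaysOnVpnPackage(componentName, pkg, true);
    } catch (PackageManager.NameNotFoundException e) {
        Log.e("AdvancedSettings", "VPN package not found", e);
    }
} else {
    Log.w("AdvancedSettings", "Not a device or profile owner; cannot set Always-On VPN");
}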
I want to use TTS in an Android application. I followed introduction-to-text-to-speech-in, and this is the code of the Activity which creates the TTS instance:
public class MainActivity extends Activity implements TextToSpeech.OnInitListener {
private int MY_DATA_CHECK_CODE = 0;
private TextToSpeech myTTS;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
Intent checkTTSIntent = new Intent();
checkTTSIntent.setAction(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA);
startActivityForResult(checkTTSIntent, MY_DATA_CHECK_CODE);
}
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == MY_DATA_CHECK_CODE) {
if (resultCode == TextToSpeech.Engine.CHECK_VOICE_DATA_PASS) {
myTTS = new TextToSpeech(this, this);
}
else {
Intent installTTSIntent = new Intent();
installTTSIntent.setAction(TextToSpeech.Engine.ACTION_INSTALL_TTS_DATA);
startActivity(installTTSIntent);
}
}
}
@Override
public void onInit(int status) {
}
}
As you can see it is straightforward and simple, and it works up to Android 8.1 (API 27); but on Android 9.0 I get an ActivityNotFoundException:
in onCreate method: No Activity found to handle Intent { act=android.speech.tts.engine.CHECK_TTS_DATA }
in onActivityResult method: No Activity found to handle Intent { act=android.speech.tts.engine.INSTALL_TTS_DATA }
However, according to the documentation for ACTION_CHECK_TTS_DATA and ACTION_INSTALL_TTS_DATA, neither of them is deprecated. How can I solve the above errors?
Sounds like it's probably a janky beta version of the Android 9.0 emulator.
I don't think it's necessary these days to use the CHECK_TTS_DATA intent, because:
1) almost all devices (commercial phones, at least) have at least one TTS engine installed,
2) the myTTS object won't initialize unless that is true (or, at the least, it will report an error if you attempt a speak()), and
3) devices can have multiple engines installed, which forces the user to choose which engine the intent (from the example you used) should be sent to.
Instead, I would decide what exact engine/s you want to support, check for it/them specifically, and prompt to install it/them.
For example, if you decided to use/support the Google engine:
private boolean isGoogleTTSInstalled() {
Intent ttsIntent = new Intent();
ttsIntent.setAction(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA);
PackageManager pm = this.getPackageManager();
List<ResolveInfo> listOfInstalledTTSInfo = pm.queryIntentActivities(ttsIntent, PackageManager.GET_META_DATA);
for (ResolveInfo r : listOfInstalledTTSInfo) {
String engineName = r.activityInfo.applicationInfo.packageName;
if (engineName.equals("com.google.android.tts")) {
return true;
}
}
return false;
}
private void installGoogleTTS() {
Intent goToMarket = new Intent(Intent.ACTION_VIEW)
.setData(Uri.parse("market://details?id=com.google.android.tts"));
startActivity(goToMarket);
}
And, if you intend to support a specific language, check for it using myTTS.isLanguageAvailable(Locale loc), and if not:
private void openTTSSettingsToInstallUnsupportedLanguage() {
Intent intent = new Intent();
intent.setAction("com.android.settings.TTS_SETTINGS");
intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
startActivity(intent);
}
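As a sketch (assuming myTTS has already finished initializing via onInit, and picking German purely as a hypothetical target language), the check that decides whether to open the settings screen above could look like this:
int availability = myTTS.isLanguageAvailable(Locale.GERMANY);
if (availability == TextToSpeech.LANG_MISSING_DATA
        || availability == TextToSpeech.LANG_NOT_SUPPORTED) {
    // Voice data is missing or the language is unsupported: let the user install it.
    openTTSSettingsToInstallUnsupportedLanguage();
} else {
    myTTS.setLanguage(Locale.GERMANY);
}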
I want to develop an application that navigates from one page to another using voice commands.
Here is my code:
public class mainActivity extends Activity implements OnClickListener {
/** Called when the activity is first created. */
ArrayList<String> StoredCommand = new ArrayList<String>();
private static final String TAG = "VoiceRecognition";
private static final int VOICE_RECOGNITION_REQUEST_CODE = 1234;
private static final Context View = null;
private ListView mList;
private Handler mHandler;
private Spinner mSupportedLanguageView;
/**
* Called when the activity is first created.
*/
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
mHandler = new Handler();
StoredCommand.add("Path Recoder");
StoredCommand.add("Path Selector");
StoredCommand.add("Stop");
StoredCommand.add("Pause");
// Inflate our UI from its XML layout description.
setContentView(R.layout.main);
// Get display items for later interaction
Button speakButton = (Button) findViewById(R.id.btn_speak);
mList = (ListView) findViewById(R.id.list);
mSupportedLanguageView = (Spinner) findViewById(R.id.supported_languages);
// Check to see if a recognition activity is present
PackageManager pm = getPackageManager();
List<ResolveInfo> activities = pm.queryIntentActivities(
new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
if (activities.size() != 0) {
speakButton.setOnClickListener(this);
} else {
speakButton.setEnabled(false);
speakButton.setText("Recognizer not present");
}
// Most of the applications do not have to handle the voice settings. If the application
// does not require a recognition in a specific language (i.e., different from the system
// locale), the application does not need to read the voice settings.
refreshVoiceSettings();
}
/**
* Handle the click on the start recognition button.
*/
public void onClick(View v) {
if (v.getId() == R.id.btn_speak) {
startVoiceRecognitionActivity();
}
}
/**
* Fire an intent to start the speech recognition activity.
*/
private void startVoiceRecognitionActivity() {
Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
// Specify the calling package to identify your application
intent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE, getClass().getPackage().getName());
// Display a hint to the user about what they should say.
intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speech recognition demo");
// Give a hint to the recognizer about what the user is going to say
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
// Specify how many results you want to receive. The results will be sorted
// where the first result is the one with higher confidence.
intent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 5);
// Specify the recognition language. This parameter has to be specified only if the
// recognition has to be done in a specific language and not the default one (i.e., the
// system locale). Most of the applications do not have to set this parameter.
if (!mSupportedLanguageView.getSelectedItem().toString().equals("Default")) {
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE,
mSupportedLanguageView.getSelectedItem().toString());
}
startActivityForResult(intent, VOICE_RECOGNITION_REQUEST_CODE);
}
/**
* Handle the results from the recognition activity.
*/
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == VOICE_RECOGNITION_REQUEST_CODE && resultCode == RESULT_OK) {
// Fill the list view with the strings the recognizer thought it could have heard
ArrayList<String> matches = data.getStringArrayListExtra(
RecognizerIntent.EXTRA_RESULTS);
mList.setAdapter(new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1,
matches));
StringBuilder sb=new StringBuilder() ;
for (String match:matches){
switch(resultCode) {
case RESULT_OK:
Log.i(TAG, "RESULT_OK");
if(StoredCommand==matches)
{
//Button next=(Button)findViewById(R.id.btn_speak);
if (matches.contains("Path Recoder"))
{
Intent myIntent = new Intent(View, PahtRecoder.class);
startActivityForResult(myIntent, 0);
}
else if(matches.contains("path selector"))
{
Intent myIntent = new Intent(View, Pahtselector.class);
startActivityForResult(myIntent, 0);
}
else if(matches.contains("stop"))
{
Intent myIntent = new Intent(View, Pahtselector.class);
startActivityForResult(myIntent, 0);
}
else if(matches.contains("start"))
{
Intent myIntent = new Intent(View, Pahtselector.class);
startActivityForResult(myIntent, 0);
}
}
else
{
Log.i(TAG, "COMMAND_NOT_MATCHING");
}
break;
case RESULT_CANCELED:
Log.i(TAG, "RESULT_CANCELED");
break;
case RecognizerIntent.RESULT_AUDIO_ERROR:
Log.i(TAG, "RESULT_AUDIO_ERROR");
break;
case RecognizerIntent.RESULT_CLIENT_ERROR:
Log.i(TAG, "RESULT_CLIENT_ERROR");
break;
case RecognizerIntent.RESULT_NETWORK_ERROR:
Log.i(TAG, "RESULT_NETWORK_ERROR");
break;
case RecognizerIntent.RESULT_NO_MATCH:
Log.i(TAG, "RESULT_NO_MATCH");
break;
case RecognizerIntent.RESULT_SERVER_ERROR:
Log.i(TAG, "RESULT_SERVER_ERROR");
break;
default:
Log.i(TAG, "RESULT_UNKNOWN");
break;
}
}
}
else{
Log.e("TAG", "Recognition is Failed");
}
super.onActivityResult(requestCode, resultCode, data);
}
private void refreshVoiceSettings() {
Log.i(TAG, "Sending broadcast");
sendOrderedBroadcast(RecognizerIntent.getVoiceDetailsIntent(this), null,
new SupportedLanguageBroadcastReceiver(), null, Activity.RESULT_OK, null, null);
}
private void updateSupportedLanguages(List<String> languages) {
// We add "Default" at the beginning of the list to simulate default language.
languages.add(0, "Default");
SpinnerAdapter adapter = new ArrayAdapter<CharSequence>(this,
android.R.layout.simple_spinner_item, languages.toArray(
new String[languages.size()]));
mSupportedLanguageView.setAdapter(adapter);
}
private void updateLanguagePreference(String language) {
TextView textView = (TextView) findViewById(R.id.language_preference);
textView.setText(language);
}
/**
* Handles the response of the broadcast request about the recognizer supported languages.
*
* The receiver is required only if the application wants to do recognition in a specific
* language.
*/
private class SupportedLanguageBroadcastReceiver extends BroadcastReceiver {
@Override
public void onReceive(Context context, final Intent intent) {
Log.i(TAG, "Receiving broadcast " + intent);
final Bundle extra = getResultExtras(false);
if (getResultCode() != Activity.RESULT_OK) {
mHandler.post(new Runnable() {
@Override
public void run() {
showToast("Error code:" + getResultCode());
}
});
}
if (extra == null) {
mHandler.post(new Runnable() {
@Override
public void run() {
showToast("No extra");
}
});
}
if (extra.containsKey(RecognizerIntent.EXTRA_SUPPORTED_LANGUAGES)) {
mHandler.post(new Runnable() {
@Override
public void run() {
updateSupportedLanguages(extra.getStringArrayList(
RecognizerIntent.EXTRA_SUPPORTED_LANGUAGES));
}
});
}
if (extra.containsKey(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE)) {
mHandler.post(new Runnable() {
#Override
public void run() {
updateLanguagePreference(
extra.getString(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE));
}
});
}
}
private void showToast(String text) {
Toast.makeText(mainActivity.this, text, 1000).show();
}
}
}
According to the stored commands I need to navigate to those pages, but the result is that the Google voice prompt works while the commands do not. Is there anything wrong with my pattern matching? Please give me a solution for that.
Thank you.
Try this out as a minimal implementation of onActivityResult():
if (requestCode == VOICE_RECOGNITION_REQUEST_CODE && resultCode == RESULT_OK)
{
ArrayList<String> matches = data.getStringArrayListExtra(
RecognizerIntent.EXTRA_RESULTS);
mList.setAdapter(new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1,
matches));
for (String bestMatch:matches)
{
if (bestMatch.equalsIgnoreCase("Path Recoder"))
{
Intent myIntent = new Intent(this, PahtRecoder.class);
startActivityForResult(myIntent, 0);
}
else if(bestMatch.equalsIgnoreCase("Path Selector"))
{
Intent myIntent = new Intent(this, Pahtselector.class);
startActivityForResult(myIntent, 0);
}
else if(bestMatch.equalsIgnoreCase("Stop"))
{
Intent myIntent = new Intent(this, Pahtselector.class);
startActivityForResult(myIntent, 0);
}
else if(bestMatch.equalsIgnoreCase("Pause"))
{
Intent myIntent = new Intent(this, Pahtselector.class);
startActivityForResult(myIntent, 0);
}
else
{
Log.i(TAG, "COMMAND_NOT_MATCHING");
}
}
}
Further Update: My initial post was wrong - StoredCommand should not be used; on older speech recognition platforms they would be provided with a list of possible utterances and the engine would try to match what you say against the possibilities. However, the default engine on Android does not need this. By the way, what does your mList display?
Note also that I have not tested any of the code above...
Hi all, this is the code to navigate from one page to another using voice commands. It may be useful to anyone:
public class mainActivity extends Activity implements OnClickListener {
/** Called when the activity is first created. */
private static final String TAG = "VoiceRecognition";
private static final int VOICE_RECOGNITION_REQUEST_CODE = 1234;
private static final Context View = null;
private ListView mList;
private Handler mHandler;
private Spinner mSupportedLanguageView;
/**
* Called when the activity is first created.
*/
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
mHandler = new Handler();
// Inflate our UI from its XML layout description.
setContentView(R.layout.main);
// Get display items for later interaction
Button speakButton = (Button) findViewById(R.id.btn_speak);
mList = (ListView) findViewById(R.id.list);
mSupportedLanguageView = (Spinner) findViewById(R.id.supported_languages);
// Check to see if a recognition activity is present
PackageManager pm = getPackageManager();
List<ResolveInfo> activities = pm.queryIntentActivities(
new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
if (activities.size() != 0) {
speakButton.setOnClickListener(this);
} else {
speakButton.setEnabled(false);
speakButton.setText("Recognizer not present");
}
// Most of the applications do not have to handle the voice settings. If the application
// does not require a recognition in a specific language (i.e., different from the system
// locale), the application does not need to read the voice settings.
refreshVoiceSettings();
}
/**
* Handle the click on the start recognition button.
*/
public void onClick(View v) {
if (v.getId() == R.id.btn_speak) {
startVoiceRecognitionActivity();
}
}
/**
* Fire an intent to start the speech recognition activity.
*/
private void startVoiceRecognitionActivity() {
Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
// Specify the calling package to identify your application
intent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE, getClass().getPackage().getName());
// Display a hint to the user about what they should say.
intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speech recognition demo");
// Give a hint to the recognizer about what the user is going to say
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
// Specify how many results you want to receive. The results will be sorted
// where the first result is the one with higher confidence.
intent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 5);
// Specify the recognition language. This parameter has to be specified only if the
// recognition has to be done in a specific language and not the default one (i.e., the
// system locale). Most of the applications do not have to set this parameter.
if (!mSupportedLanguageView.getSelectedItem().toString().equals("Default")) {
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE,
mSupportedLanguageView.getSelectedItem().toString());
}
startActivityForResult(intent, VOICE_RECOGNITION_REQUEST_CODE);
}
/**
* Handle the results from the recognition activity.
*/
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == VOICE_RECOGNITION_REQUEST_CODE && resultCode == RESULT_OK) {
// Fill the list view with the strings the recognizer thought it could have heard
ArrayList<String> matches = data.getStringArrayListExtra(
RecognizerIntent.EXTRA_RESULTS);
mList.setAdapter(new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1,
matches));
for (String bestMatch : matches) {
if (bestMatch.contains("record") || bestMatch.contains("cod") || bestMatch.contains("ed")) {
// Intent myIntent = new Intent(View, PahtRecoder.class);
// startActivityForResult(myIntent, 0);
Intent my = new Intent(getApplicationContext(),
PathRecorderStart.class);
startActivityForResult(my, 0);
}
else if (bestMatch.contains("select") || bestMatch.contains("elect") || bestMatch.contains("ct")) {
// Intent myIntent = new Intent(View, PahtRecoder.class);
// startActivityForResult(myIntent, 0);
Intent my = new Intent(getApplicationContext(),
PathSelectorOptions.class);
startActivityForResult(my, 0);
}
else {
Log.i(TAG, "COMMAND_NOT_MATCHING");
}
}
}
super.onActivityResult(requestCode, resultCode, data);
}
private void refreshVoiceSettings() {
Log.i(TAG, "Sending broadcast");
sendOrderedBroadcast(RecognizerIntent.getVoiceDetailsIntent(this), null,
new SupportedLanguageBroadcastReceiver(), null, Activity.RESULT_OK, null, null);
}
private void updateSupportedLanguages(List<String> languages) {
// We add "Default" at the beginning of the list to simulate default language.
languages.add(0, "Default");
SpinnerAdapter adapter = new ArrayAdapter<CharSequence>(this,
android.R.layout.simple_spinner_item, languages.toArray(
new String[languages.size()]));
mSupportedLanguageView.setAdapter(adapter);
}
private void updateLanguagePreference(String language) {
TextView textView = (TextView) findViewById(R.id.language_preference);
textView.setText(language);
}
/**
* Handles the response of the broadcast request about the recognizer supported languages.
*
* The receiver is required only if the application wants to do recognition in a specific
* language.
*/
private class SupportedLanguageBroadcastReceiver extends BroadcastReceiver {
@Override
public void onReceive(Context context, final Intent intent) {
Log.i(TAG, "Receiving broadcast " + intent);
final Bundle extra = getResultExtras(false);
if (getResultCode() != Activity.RESULT_OK) {
mHandler.post(new Runnable() {
@Override
public void run() {
showToast("Error code:" + getResultCode());
}
});
}
if (extra == null) {
mHandler.post(new Runnable() {
@Override
public void run() {
showToast("No extra");
}
});
}
if (extra.containsKey(RecognizerIntent.EXTRA_SUPPORTED_LANGUAGES)) {
mHandler.post(new Runnable() {
@Override
public void run() {
updateSupportedLanguages(extra.getStringArrayList(
RecognizerIntent.EXTRA_SUPPORTED_LANGUAGES));
}
});
}
if (extra.containsKey(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE)) {
mHandler.post(new Runnable() {
#Override
public void run() {
updateLanguagePreference(
extra.getString(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE));
}
});
}
}
private void showToast(String text) {
Toast.makeText(mainActivity.this, text, 1000).show();
}
}
}
Hey, I am creating an app which uses TextToSpeech functionality. I wrote the code and ran it, but no speech is generated. Some errors are displayed in LogCat; here is the LogCat output:
04-11 20:21:30.099: VERBOSE/TtsService(481): TtsService.setLanguage(eng, USA, )
04-11 20:21:30.109: INFO/TextToSpeech.java - speak(849): speak text of length 41
04-11 20:21:30.109: ERROR/TextToSpeech.java - speak(849): service isn't started
I don't understand how to solve this. Here is my full code:
public class ExamAppearingActivity extends Activity implements OnInitListener
{
private int MY_DATA_CHECK_CODE = 0;
private TextToSpeech tts;
@Override
public void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
setContentView(R.layout.examquestionscreen);
if (isVoiceEnabled==1)
{
tts = new TextToSpeech(this, this);
final List<ObjectiveWiseQuestion> QuestionWiseProfile1= db.getOneQuestion(examId);
for (final ObjectiveWiseQuestion cn : QuestionWiseProfile1)
{
Intent checkIntent = new Intent();
checkIntent.setAction(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA);
startActivityForResult(checkIntent, MY_DATA_CHECK_CODE);
db=new MySQLiteHelper(getBaseContext());
db.getWritableDatabase();
counter=cn.getCounter();
String question="Question is "+cn.getQuestion();
String option1="Option A is "+cn.getOptionA();
String option2="Option B is "+cn.getOptionB();
String option3="Option C is "+cn.getOptionC();
String option4="Option D is "+cn.getOptionD();
tts.speak(question, TextToSpeech.QUEUE_ADD, null);
tts.speak(option1, TextToSpeech.QUEUE_ADD, null);
tts.speak(option2, TextToSpeech.QUEUE_ADD, null);
tts.speak(option3, TextToSpeech.QUEUE_ADD, null);
tts.speak(option4, TextToSpeech.QUEUE_ADD, null);
}
}
}
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == MY_DATA_CHECK_CODE) {
if (resultCode == TextToSpeech.Engine.CHECK_VOICE_DATA_PASS) {
// success, create the TTS instance
tts = new TextToSpeech(this, this);
}
else {
// missing data, install it
Intent installIntent = new Intent();
installIntent.setAction(TextToSpeech.Engine.ACTION_INSTALL_TTS_DATA);
//tts.isLanguageAvailable(Locale.INDIA_HINDI);
startActivity(installIntent);
}
}
}
@Override
public void onInit(int status)
{
if (status == TextToSpeech.SUCCESS)
{
// tts.setLanguage(Locale.US);
Locale loc = new Locale ("hi_IN");
tts.setLanguage(loc);
Toast.makeText(ExamAppearingActivity.this,"Text-To-Speech engine is initialized", Toast.LENGTH_LONG).show();
}
else if (status == TextToSpeech.ERROR)
{
Toast.makeText(ExamAppearingActivity.this, "Error occurred while initializing Text-To-Speech engine", Toast.LENGTH_LONG).show();
}
}
}
This code runs only when I trigger it from a button click, but I need to start it from the onCreate() method.
Any help is appreciated.
You can't use tts until onInit has been called.
At the moment, you create it and try to use it within the onCreate method, but it won't have finished being initialised by then.
You're also creating tts twice. The one in onActivityResult makes most sense because you're checking it exists first. I'd get rid of the creation in onCreate, and put all of the actual speaking into onInit.
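A rough sketch of that restructuring (the buildUtterances() helper is hypothetical, standing in for your database loop that assembles the question and option strings), so that speak() is only called once onInit reports success:
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.examquestionscreen);
    // Only check for TTS data here; do NOT create the engine or call speak() yet.
    Intent checkIntent = new Intent(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA);
    startActivityForResult(checkIntent, MY_DATA_CHECK_CODE);
}

@Override
public void onInit(int status) {
    if (status == TextToSpeech.SUCCESS) {
        tts.setLanguage(new Locale("hi", "IN"));  // language and country as separate arguments
        // The engine is ready now, so queue the question and options here.
        for (String utterance : buildUtterances()) {  // hypothetical helper
            tts.speak(utterance, TextToSpeech.QUEUE_ADD, null);
        }
    }
}
The tts instance itself is still created in onActivityResult (as in your existing code), which is what eventually triggers onInit.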
It seems to me that the speak() method of the TextToSpeech class only works inside onInit or onUtteranceCompleted. However, onInit and onUtteranceCompleted don't have any parameter for passing strings.
In the following code, I tried to define a global string ArrayList outside the methods and used that ArrayList for the string input. For some reason it didn't work out, but the engine did speak "Did you sleep well?". Any help is appreciated.
public class TTS extends Activity implements OnInitListener,OnUtteranceCompletedListener,Runnable {
ArrayList<String> content=new ArrayList<String>();
int MY_DATA_CHECK_CODE=50;
private TextToSpeech mTts;
public void onCreate(Bundle savedInstanceState) {
content.add("test");
content.add("another test");
super.onCreate(savedInstanceState);
setContentView(R.layout.splash);
Intent checkIntent = new Intent();
checkIntent.setAction(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA);
startActivityForResult(checkIntent, MY_DATA_CHECK_CODE);
}
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == MY_DATA_CHECK_CODE) {
if (resultCode == TextToSpeech.Engine.CHECK_VOICE_DATA_PASS) {
// success, create the TTS instance
mTts = new TextToSpeech(this,this);
} else {
// missing data, install it
Intent installIntent = new Intent();
installIntent.setAction(
TextToSpeech.Engine.ACTION_INSTALL_TTS_DATA);
startActivity(installIntent);
}
}
}
public void onInit(int status){
if(status==TextToSpeech.SUCCESS){
mTts.setLanguage(Locale.US);
mTts.setOnUtteranceCompletedListener(this);
String myText1 = "Did you sleep well?";
mTts.speak(myText1, TextToSpeech.QUEUE_FLUSH, null);
for(int i=0;i<content.size();i++){
mTts.speak(content.get(i),TextToSpeech.QUEUE_ADD,null);
}
if(status==TextToSpeech.ERROR){
mTts.shutdown();
}
}
}
I believe some of your code is missing, but FYI it is possible to assign an ID to an utterance via the parameters map, e.g.:
HashMap<String, String> myHashAlarm = new HashMap();
myHashAlarm.put(TextToSpeech.Engine.KEY_PARAM_UTTERANCE_ID, "ID of First Utterance");
mTts.speak("It was a clear black night", TextToSpeech.QUEUE_ADD, myHashAlarm);
"ID of First Utterance" will be passed to onUtteranceCompleted(String utteranceId)
Please see Using Text-to-Speech.
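Building on that, one way to walk through your content list (a sketch that assumes you keep the content field and the setOnUtteranceCompletedListener(this) call from your onInit, and adds a hypothetical nextIndex field) is to give each speak() call an utterance ID and queue the next item from onUtteranceCompleted():
private int nextIndex = 0;  // position in the content list (assumed new field)

@Override
public void onUtteranceCompleted(String utteranceId) {
    // Called on a background thread when the previous utterance finishes.
    if (nextIndex < content.size()) {
        HashMap<String, String> params = new HashMap<String, String>();
        params.put(TextToSpeech.Engine.KEY_PARAM_UTTERANCE_ID, "content-" + nextIndex);
        mTts.speak(content.get(nextIndex), TextToSpeech.QUEUE_ADD, params);
        nextIndex++;
    }
}
Note that the very first utterance (the one spoken from onInit) also needs an utterance ID in its params map, otherwise onUtteranceCompleted will not be called for it.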