I am new to Xamarin, and I was wondering whether it is possible to implement this kind of speech recognition:
the user says "Hello", but the text output is "Hi"?
I have found this link: Android speech recognition pass data back to Xamarin Forms
but it only outputs the speech "Hello" as the text "Hello".
Speech recognition usually just converts what you say into text. If you need to change the content, you can check the recognized result right after the conversion and replace it according to your requirements, for example by modifying the result from the link above.
In MainActivity's OnActivityResult:
const int VOICE = 10;

protected override void OnActivityResult(int requestCode, Result resultCode, Intent data)
{
    base.OnActivityResult(requestCode, resultCode, data);
    if (requestCode == VOICE)
    {
        if (resultCode == Result.Ok)
        {
            var matches = data.GetStringArrayListExtra(RecognizerIntent.ExtraResults);
            if (matches.Count != 0)
            {
                var textInput = matches[0];
                if (textInput.Length > 500)
                    textInput = textInput.Substring(0, 500);

                // Make a judgment and change the value (case-insensitive comparison).
                if (textInput.Equals("hello", StringComparison.OrdinalIgnoreCase))
                {
                    textInput = "Hi";
                }
                SpeechToText_Android.SpeechText = textInput;
            }
        }
        SpeechToText_Android.autoEvent.Set();
    }
}
Google Speech-to-Text API supports Marathi as per their documentation here. However, I have not been able to get it working on my Android phone. I have already added Marathi to the languages on my Android device (Moto G6, running Android 7.1.1), but I am still not able to get a simple SMS converted from speech to text with this. Marathi typing works fine, though.
Do I need to modify any other setting? What else is required? Any pointers for this would be highly appreciated.
Have you configured your intent to recognize the Marathi language, as below?
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "mr-IN");
You can check out the other language codes here.
You can use the following code for reference:
@Override
public void onClick(View v) {
    switch (v.getId()) {
        case R.id.RecordBtn:
            Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
            // Do not force the offline recognition engine (set to true to prefer offline recognition).
            intent.putExtra(RecognizerIntent.EXTRA_PREFER_OFFLINE, false);
            // Use the Marathi speech recognition model.
            intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "mr-IN");
            try {
                startActivityForResult(intent, SST_REQUEST_CODE);
            } catch (ActivityNotFoundException a) {
                Toast.makeText(getApplicationContext(),
                        getString(R.string.error),
                        Toast.LENGTH_SHORT).show();
            }
            break;
    }
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    switch (requestCode) {
        case SST_REQUEST_CODE:
            if (resultCode == RESULT_OK && null != data) {
                ArrayList<String> getResult = data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
                Original.setText(getResult.get(0));
                conversionTable = new ConversionTable();
                String Transformed_String = conversionTable.transform(getResult.get(0));
                result.setText(Transformed_String);
            }
            break;
    }
}
I have two separate applications written using Xamarin.Android; for the sake of discussion, let's call them "Tristan" and "Isolde". Tristan has some state information that Isolde sometimes needs to know. Complication: Tristan may or may not be running at the moment Isolde develops the need to know his state.
I've got a kludge working now where Isolde sends a special launch intent to Tristan, who then uses a broadcast intent to send information back to Isolde. (See my earlier question for details.)
"But wait!" I hear you cry, "Is this not a perfect use case for StartActivityForResult()?" Indeed it is! The code is a whole lot simpler, and everything I've read implies that this is how Android wants you to do stuff like this.
Unfortunately, I can't get it to work (despite trying many variations and reading the dozen-or-so related questions on this very site). My specific problem is that in Isolde's OnActivityResult() callback, the resultCode is always Result.Canceled and the data is always null.
Here is the code for Tristan (where commented-out bits represent variations I've tried):
using Android.App;
using Android.Content;

namespace com.example.Tristan.Android
{
    [Activity(Name = "com.example.Tristan.Android.IsoldeQueryActivity")]
    public class IsoldeQueryActivity : Activity
    {
        protected override void OnStart()
        {
            // base.OnStart();
            var rtn = new Intent();
            rtn.PutExtra("Test", "test");
            //rtn.SetAction("TestAction");
            SetResult(Result.Ok, rtn);
            Finish();
            //FinishActivity(1234);
        }
    }
}
And here is the relevant code from the Activity where Isolde needs to ask for Tristan's state:
private TaskCompletionSource<bool> TristanStateCompletion;

public async Task GetTristanState()
{
    TristanStateCompletion = new TaskCompletionSource<bool>();
    var req = new Intent("com.example.Tristan.Android.IsoldeQueryActivity");
    //req.PutExtra(Intent.ExtraReturnResult, true);
    StartActivityForResult(req, 1234);
    var rtn = await TristanStateCompletion.Task;
    if (!rtn) bomb("can't get state");
    TristanStateCompletion = null;
}

protected override void OnActivityResult(int requestCode, [GeneratedEnum] Result resultCode, Intent data)
{
    base.OnActivityResult(requestCode, resultCode, data);
    if (requestCode == 1234) {
        DoStuffWith(data);
        TristanStateCompletion?.TrySetResult(true);
    }
}
Diagnostics -- or rather, a specific lack of them -- leads me to believe that Tristan's IsoldeQueryActivity.OnStart() is never actually being called.
Ideas, requests for additional information and/or useful experiments to try are all welcome. (If your idea is "Put <thing> in the manifest", remember this is Xamarin.Android and I have to do that by putting <relatedThing> in the attribute decorating the Activity.)
Edited to add: In Isolde's code, DoStuffWith(data) was crashing because data was null. When I changed that method to avoid that, I found that I got a (slightly later) exception thrown in StartActivityForResult():
Android.Content.ActivityNotFoundException No Activity found to handle Intent { act=com.example.Tristan.Android.IsoldeQueryActivity }
This leads me to believe I'm not creating the Intent properly in Isolde. Do I need to be using one of the other Intent constructors? If so, how specifically?
Okay, I think I have this figured out. The code in my original question had three major problems:
I was building the Intent incorrectly in Isolde.
I didn't export the IsoldeQueryActivity in Tristan.
The call to base.OnStart() in Tristan's OnStart override is mandatory.
Here is the working version of Tristan:
using Android.App;
using Android.Content;

namespace com.example.Tristan.Android
{
    [Activity(Name = "com.example.Tristan.Android.IsoldeQueryActivity", Exported = true)]
    public class IsoldeQueryActivity : Activity
    {
        protected override void OnStart()
        {
            base.OnStart();
            var rtn = new Intent();
            rtn.PutExtra("Test", "test");
            SetResult(Result.Ok, rtn);
            Finish();
        }
    }
}
And here is the fixed code from Isolde:
private TaskCompletionSource<bool> TristanStateCompletion;

public async Task GetTristanState()
{
    TristanStateCompletion = new TaskCompletionSource<bool>();
    var req = new Intent();
    req.SetComponent(new ComponentName("com.example.Tristan.Android", "com.example.Tristan.Android.IsoldeQueryActivity"));
    StartActivityForResult(req, 1234);
    var rtn = await TristanStateCompletion.Task;
    if (!rtn) bomb("can't get state");
    TristanStateCompletion = null;
}

protected override void OnActivityResult(int requestCode, [GeneratedEnum] Result resultCode, Intent data)
{
    base.OnActivityResult(requestCode, resultCode, data);
    if (requestCode == 1234) {
        if (resultCode != Result.Ok) bomb("bad resultCode {0}", resultCode);
        if (data == null) bomb("null data from Tristan");
        DoStuffWith(data);
        TristanStateCompletion?.TrySetResult(true);
    }
}
So I am trying to make a simple to-do list app that has only a mic button and the list. I am very new to Android app development. Through a mix of tutorials, I have managed to figure out how to add typed text to the list, and how to do speech recognition and put the spoken text in a text field. I can't seem to figure out how to bring the two together.
Any tips?
Here I leave you a great tutorial, considering that you didn't post any code.
The tutorial shows you how to do speech recognition with a button and then build a list from the possible spoken text. It works perfectly; I tried it once.
Try this code to open the mic in the button's OnClickListener:
    Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
    intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
    intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "en-US");
    startActivityForResult(intent, nRESULT_SPEECH);
nRESULT_SPEECH is your request code; it can be any value, such as 0, 1, 2, etc.
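For example (a minimal sketch; the nRESULT_SPEECH name is simply reused from the snippet above), it could be declared as a field of the activity:
    private static final int nRESULT_SPEECH = 1;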
You will get the spoken words in the onActivityResult callback method:
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    switch (requestCode) {
        case nRESULT_SPEECH:
            if (null != data) {
                ArrayList<String> text = data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
                String textCapturedFromVoice = text.get(0);
            }
            break;
    }
}
Once you get the text in textCapturedFromVoice, you can add it to your list.
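For example, assuming the list is backed by an ArrayAdapter (the listView, todoItems, and todoAdapter names below are hypothetical, not from the question), it could look roughly like this:
    // Somewhere in onCreate (names are assumptions for illustration):
    ArrayList<String> todoItems = new ArrayList<>();
    ArrayAdapter<String> todoAdapter = new ArrayAdapter<>(this, android.R.layout.simple_list_item_1, todoItems);
    listView.setAdapter(todoAdapter);

    // Inside onActivityResult, after reading the recognized text:
    todoItems.add(textCapturedFromVoice);
    todoAdapter.notifyDataSetChanged();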
I am having difficulty framing this question, but anyway, here it goes: I have made two applications, a TextToSpeech app and a SpeechToText app. They are working fine, and now I am trying to merge them. Basically, I want the text-to-speech part to say the text entered in a text field, but only if the user says the word "speak"; that check would be done using the speech-to-text part. The problem I am having is that Android gives back a list of results when the user says something, but I want only one output, namely "speak". How can I get only one result instead of a list?
Initially, the app becomes active on a button click.
The code I am trying is as follows:
@Override
public void onClick(View v) {
    Intent i = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
    i.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
            RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
    i.putExtra(RecognizerIntent.EXTRA_PROMPT, "system Activated");
    startActivityForResult(i, check);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == check && resultCode == RESULT_OK) {
        String command = data
                .getStringExtra(RecognizerIntent.EXTRA_RESULTS).toString();
        if (command.equals(command_verify)) {
            speak();
        } else
            finish();
    }
    super.onActivityResult(requestCode, resultCode, data);
}

public void speak() {
    String text_to_speak = message_field.getText().toString();
    talk.speak(text_to_speak, TextToSpeech.QUEUE_FLUSH, null);
}
The voice recognizer returns a list of guesses, where the first guess is the most accurate. You can use it by calling list.get(0). Hope this helps.
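A minimal sketch of how the question's onActivityResult could read just the first guess (assuming the same check and command_verify fields as the code above; note that EXTRA_RESULTS is an ArrayList of strings, so it is read with getStringArrayListExtra rather than getStringExtra):
    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == check && resultCode == RESULT_OK && data != null) {
            // The recognizer returns a list of guesses; the first one is the most accurate.
            ArrayList<String> guesses = data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            String command = guesses.get(0);
            if (command.equalsIgnoreCase(command_verify)) {
                speak();
            } else {
                finish();
            }
        }
        super.onActivityResult(requestCode, resultCode, data);
    }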
I have this app which consists of three tabs. The first tab has a button to scan a QR code. I didn't use an intent to call the barcode scanner here; I integrated all the com.google... code into my src, and it works smoothly. The issue arises when I scan a QR code that contains a website: the result I get back is just the URL as plain text in the TextView. How do I make this URL clickable so it redirects to the browser, or simply show the content of the website in my app? Here is the result activity:
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (resultCode != RESULT_OK) {
        return;
    }
    if (requestCode == ACTIVITY_REQUEST_CODE_QRCODE) {
        if (txtQRcodeResult == null) {
            txtQRcodeResult = (TextView) findViewById(R.id.textView1);
        }
        txtQRcodeResult.setText(data.getStringExtra("SCAN_RESULT"));
    }
}
If you want the user to click the link themselves, you can use Linkify:
Linkify.addLinks(txtQRcodeResult, Linkify.WEB_URLS);
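For example, in the onActivityResult from the question, the Linkify call can go right after the text is set (a small sketch; the order matters because setText discards any previously created link spans):
    txtQRcodeResult.setText(data.getStringExtra("SCAN_RESULT"));
    Linkify.addLinks(txtQRcodeResult, Linkify.WEB_URLS);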
If you want to just jump into the browser and load the page, you can fire an intent:
    Intent browserIntent = new Intent(Intent.ACTION_VIEW, Uri.parse(data.getStringExtra("SCAN_RESULT")));
    startActivity(browserIntent);