Where to use 11-digit generated hash key in SMS Retriever API? - android

I have everything set up on both the server side and the Android side, and my flow is working fine. I also generated the hash key, but now I don't know where to use it.
Note: I removed the RECEIVE_SMS permission and now the listener is not being called.
Code:
private void startSMSListener() {
try {
smsReceiver = new SMSReceiver();
smsReceiver.setOTPListener(this);
IntentFilter intentFilter = new IntentFilter();
intentFilter.addAction(SmsRetriever.SMS_RETRIEVED_ACTION);
this.registerReceiver(smsReceiver, intentFilter);
SmsRetrieverClient client = SmsRetriever.getClient(this);
Task<Void> task = client.startSmsRetriever();
task.addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void aVoid) {
// API successfully started
}
});
task.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(@NonNull Exception e) {
// Fail to start API
}
});
} catch (Exception e) {
e.printStackTrace();
}
}
Update: I'm not receiving the OTP.
Server-side code:
//Prepare Url
OkHttpClient client=new OkHttpClient();
Request request = new Request.Builder()
.url("http://msg.msgclub.net/rest/services/sendSMS/sendGroupSms?AUTH_KEY="+authKey+"&message=<#>You OTP is:"+randomPIN+"TthI79n9NvR&senderId=REMIND&routeId=1&mobileNos="+mobNo+"&smsContentType=english")
.get()
.addHeader("Cache-Control", "no-cache")
.build();
Which logic should I try inside the onReceive method? Can anyone please guide me?
Code:
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.os.Bundle;
import android.util.Log;
import com.google.android.gms.auth.api.phone.SmsRetriever;
import com.google.android.gms.common.api.CommonStatusCodes;
import com.google.android.gms.common.api.Status;
import com.tekitsolution.remindly.Dialog.PaymentListDialog;
import com.tekitsolution.remindly.Listener.SMSListener;
public class SMSReceiver extends BroadcastReceiver {
private static final String TAG = SMSReceiver.class.getSimpleName();
private SMSListener mListener;
/**
* @param mListener
*/
public void setOTPListener(SMSListener mListener) {
this.mListener = mListener;
}
/**
* @param context
* @param intent
*/
@Override
public void onReceive(Context context, Intent intent) {
showLog("Inside SMS Receiver");
if (SmsRetriever.SMS_RETRIEVED_ACTION.equals(intent.getAction())) {
Bundle extras = intent.getExtras();
Status status = (Status) extras.get(SmsRetriever.EXTRA_STATUS);
switch (status.getStatusCode()) {
case CommonStatusCodes.SUCCESS:
//This is the full message
String message = (String) extras.get(SmsRetriever.EXTRA_SMS_MESSAGE);
showLog("message: "+ message);
/*<#> Your ExampleApp code is: 123ABC78
FA+9qCX9VSu*/
//Extract the OTP code and send to the listener
if (mListener != null) {
mListener.onOTPReceived(message);
}
break;
case CommonStatusCodes.API_NOT_CONNECTED:
if (mListener != null) {
mListener.onOTPReceivedError("API NOT CONNECTED");
}
break;
case CommonStatusCodes.NETWORK_ERROR:
if (mListener != null) {
mListener.onOTPReceivedError("NETWORK ERROR");
}
break;
case CommonStatusCodes.ERROR:
if (mListener != null) {
mListener.onOTPReceivedError("SOME THING WENT WRONG");
}
break;
}
}
}
private void showLog(String msg) {
Log.d(TAG, msg);
}
}
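Regarding "which logic should I try inside onReceive": a common approach is to pull the numeric code out of the retrieved message with a regular expression before handing it to the listener. A minimal sketch, assuming a 4-6 digit OTP (the pattern is illustrative and not part of the original code):
// Inside the SUCCESS branch of onReceive(), instead of forwarding the raw message:
String message = (String) extras.get(SmsRetriever.EXTRA_SMS_MESSAGE);
if (message != null && mListener != null) {
    // Assumes the OTP is the first 4-6 digit number in the message body
    java.util.regex.Matcher matcher =
            java.util.regex.Pattern.compile("\\d{4,6}").matcher(message);
    if (matcher.find()) {
        mListener.onOTPReceived(matcher.group()); // pass only the code, e.g. "1234"
    } else {
        mListener.onOTPReceivedError("OTP not found in message");
    }
}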
OTP activity: this method is called inside onCreate:
private void startSMSListener() {
try {
smsReceiver = new SMSReceiver();
smsReceiver.setOTPListener(this);
IntentFilter intentFilter = new IntentFilter();
intentFilter.addAction(SmsRetriever.SMS_RETRIEVED_ACTION);
this.registerReceiver(smsReceiver, intentFilter);
SmsRetrieverClient client = SmsRetriever.getClient(this);
Task<Void> task = client.startSmsRetriever();
task.addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void aVoid) {
showLog("success");
}
});
task.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(@NonNull Exception e) {
showLog("failure");
}
});
} catch (Exception e) {
e.printStackTrace();
}
}

Your SMS format should be:
<#> SampleApp: Your verification code is 143567
QbwSot12oP
1) The message should start with <#>, which indicates to the system that this is an OTP message.
2) The message should end with the hash code, e.g. QbwSot12oP.
So you have to pass your generated hash to the server and append it at the end of this SMS.
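As a minimal sketch, the server-side message could be assembled like this (variable names follow the question's server code; the hash value shown is the one from the question and must match your app's generated hash):
// SMS body accepted by the SMS Retriever API: starts with <#>, ends with the 11-character app hash.
String appHash = "TthI79n9NvR"; // the hash generated for your app
String smsBody = "<#> Your OTP is: " + randomPIN + "\n" + appHash;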
The following is a complete example of the Retriever API. I used it and it is working in my case:
https://medium.com/android-dev-hacks/autofill-otp-verification-with-latest-sms-retriever-api-73c788636783
Generate the hash key programmatically via the AppSignatureHelper class in the given example, pass it to your server, and format the message as shown above.
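A rough sketch of reading that hash at runtime with the AppSignatureHelper class from the linked article (the helper is not part of the SDK; run this once during development, log the value, and configure it on the server rather than shipping the helper in production):
// AppSignatureHelper comes from the linked example, not from Google Play services.
AppSignatureHelper signatureHelper = new AppSignatureHelper(this);
for (String hash : signatureHelper.getAppSignatures()) {
    Log.d("AppSignature", "SMS Retriever app hash: " + hash); // 11-character hash
}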

Related

TTS onUtteranceCompleted not working in Android

I am working on a SpeechRecognizer module in Android integrated with Dialogflow. I have almost completed the integration and it's working fine, but I am facing one issue: while TTS is speaking the response, the mic gets enabled before the speech finishes, so it captures the response as well, instead of only the user's utterance.
So I want to know how I can tell that TTS has finished speaking the response. I have tried Google's solution using the onUtteranceCompleted() method, and I have also tried the setOnUtteranceProgressListener() method, but neither seems to be working. I want to implement these methods in an AsyncTask so that this runs in the background. How can I do that?
Here is the code that I have tried:
import android.Manifest;
import android.app.ProgressDialog;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import android.speech.tts.TextToSpeech;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.support.v7.app.AppCompatActivity;
import android.text.TextUtils;
import android.util.Log;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.EditText;
import android.widget.ListView;
import android.widget.TextView;
import android.widget.Toast;
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;
import ai.api.PartialResultsListener;
import ai.api.android.AIConfiguration;
import ai.api.android.GsonFactory;
import ai.api.model.AIError;
import ai.api.model.AIResponse;
import ai.api.ui.AIDialog;
import android.os.AsyncTask;
import com.google.gson.Gson;
public class VoiceActivity extends AppCompatActivity implements View.OnClickListener, TextToSpeech.OnInitListener, RecognitionListener {
private AIDialog.AIDialogListener resultsListener;
private static final int REQUEST_AUDIO_PERMISSIONS_ID = 33;
private Gson gson = GsonFactory.getGson();
private ListView wordList;
TextView txtmain;
private static final int VR_REQUEST = 999;
//Log tag for output information
private final String LOG_TAG = "SpeechRepeatActivity";//***enter your own tag here***
//variable for checking TTS engine data on user device
private int MY_DATA_CHECK_CODE = 0;
//Text To Speech instance
private TextToSpeech tts;
AIButton aiButton;
EditText time;
TextView matchcall;
// private SpeechRecognizer speech = null;
private Intent recognizerIntent;
TextView tvOutput ;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_voice);
aiButton = (AIButton) findViewById(R.id.speech_btn);
txtmain =findViewById(R.id.txtmain);
wordList = findViewById(R.id.word_list);
tvOutput = findViewById(R.id.tvOutput);
time = (EditText) findViewById(R.id.in_time);
matchcall = (TextView) findViewById(R.id.matchcall);
final AsyncTaskRunner runner = new AsyncTaskRunner();
initService();
PackageManager packManager = getPackageManager();
List<ResolveInfo> intActivities = packManager.queryIntentActivities(new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
if (intActivities.size() != 0) {
//speech recognition is supported - detect user button clicks
aiButton.setOnClickListener(this);
//prepare the TTS to repeat chosen words
Intent checkTTSIntent = new Intent();
//check TTS data
checkTTSIntent.setAction(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA);
//start the checking Intent - will retrieve result in onActivityResult
startActivityForResult(checkTTSIntent, MY_DATA_CHECK_CODE);
} else {
//speech recognition not supported, disable button and output message
aiButton.setEnabled(false);
Toast.makeText(this, "Oops - Speech recognition not supported!", Toast.LENGTH_LONG).show();
}
}
private void setAIButtonCallback(final AIButton aiButton) {
aiButton.setResultsListener(new AIButton.AIButtonListener() {
@Override
public void onResult(final AIResponse result) {
if (resultsListener != null) {
resultsListener.onResult(result);
Log.e(LOG_TAG,"onResult=="+result.getResult().getResolvedQuery());
final String speech = result.getResult().getFulfillment().getSpeech();
tvOutput.setText(speech);
}
}
@Override
public void onError(final AIError error) {
if (resultsListener != null) {
resultsListener.onError(error);
Log.e(LOG_TAG,"onError=="+error);
}
}
@Override
public void onCancelled() {
if (resultsListener != null) {
resultsListener.onCancelled();
Log.e(LOG_TAG,"onCancelled==");
}
}
});
aiButton.setPartialResultsListener(new PartialResultsListener() {
@Override
public void onPartialResults(final List<String> partialResults) {
final String result = partialResults.get(0);
if (!TextUtils.isEmpty(result)) {
new Handler(Looper.getMainLooper()).post(new Runnable() {
@Override
public void run() {
txtmain.setText(result);
}
});
}
}
});
}
@Override
public void onClick(View v) {
if (v.getId() == R.id.speech_btn) {
//listen for results
listenToSpeech();
}
}
@Override
public void onInit(int status) {
//if successful, set locale
if (status == TextToSpeech.SUCCESS)
tts.setLanguage(Locale.UK);//***choose your own locale here***
}
private void listenToSpeech() {
SpeechRecognizer speech = SpeechRecognizer.createSpeechRecognizer(this);
speech.setRecognitionListener(this);
recognizerIntent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
recognizerIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE,"en");
recognizerIntent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE,this.getPackageName());
recognizerIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,RecognizerIntent.LANGUAGE_MODEL_WEB_SEARCH);
recognizerIntent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE,this.getPackageName());
recognizerIntent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 3);
//start listening
speech.startListening(recognizerIntent);
}
@Override
public void onResults(Bundle results) {
Log.i(LOG_TAG, "onResults");
ArrayList<String> matches = results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
/* wordList.setAdapter(new ArrayAdapter<String>(this, R.layout.word, matches));
new Handler().post(new Runnable() {
@Override
public void run() {
wordList.performItemClick(
wordList.getChildAt(0),
0,
wordList.getAdapter().getItemId(0));
wordList.getChildAt(0).setBackgroundColor(0xFFD3D3D3);
}
});*/
String wordChosen = matches.get(0);
tvOutput.setText(wordChosen);
try {
AsyncTaskRunner runner = new AsyncTaskRunner();
runner.execute(wordChosen);
// aiButton.textRequest(text);
//runner.doInBackground(text);
} catch (Exception e) {
e.printStackTrace();
}
}
private class AsyncTaskRunner extends AsyncTask<String, AIResponse, AIResponse> {
private AIResponse resp;
ProgressDialog progressDialog;
@Override
protected AIResponse doInBackground(String... params) {
Log.e(LOG_TAG,"doInBackground=="+params[0]);
try {
resp = aiButton.textRequest(String.valueOf(params[0]));
}
catch (Exception e) {
e.printStackTrace();
}
return resp;
}
protected void onPostExecute(AIResponse result) {
Log.e(LOG_TAG,"onPostExecute== result=="+result.getResult().getFulfillment().getSpeech());
// execution of result of Long time consuming operation
findViewById(R.id.progBar).setVisibility(View.GONE);
txtmain.setText(result.getResult().getFulfillment().getSpeech());
String speech = result.getResult().getFulfillment().getSpeech();
tts.speak( speech, TextToSpeech.QUEUE_FLUSH, null);
Toast.makeText(VoiceActivity.this, speech, Toast.LENGTH_SHORT).show();
listenToSpeech();
}
@Override
protected void onPreExecute() {
Log.e(LOG_TAG,"onPreExecute==");
findViewById(R.id.progBar).setVisibility(View.VISIBLE);
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
//check speech recognition result
if (requestCode == VR_REQUEST && resultCode == RESULT_OK)
{
//store the returned word list as an ArrayList
ArrayList<String> suggestedWords = data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
//set the retrieved list to display in the ListView using an ArrayAdapter
wordList.setAdapter(new ArrayAdapter<String>(this, R.layout.word, suggestedWords));
/*new Handler().post(new Runnable() {
@Override
public void run() {
wordList.performItemClick(
wordList.getChildAt(0),
0,
wordList.getAdapter().getItemId(0));
wordList.getChildAt(0).setBackgroundColor(0xFFD3D3D3);
}
});*/
tvOutput.setText(suggestedWords.get(0));
}
if (requestCode == MY_DATA_CHECK_CODE)
{
//we have the data - create a TTS instance
if (resultCode == TextToSpeech.Engine.CHECK_VOICE_DATA_PASS)
tts = new TextToSpeech(this, this);
//data not installed, prompt the user to install it
else
{
//intent will take user to TTS download page in Google Play
Intent installTTSIntent = new Intent();
installTTSIntent.setAction(TextToSpeech.Engine.ACTION_INSTALL_TTS_DATA);
startActivity(installTTSIntent);
}
}
//call superclass method
super.onActivityResult(requestCode, resultCode, data);
}
private void initService() {
final AIConfiguration config = new AIConfiguration("a73d5e88477e4926ae84af46f24e0aaa",
AIConfiguration.SupportedLanguages.English,
AIConfiguration.RecognitionEngine.Google);
aiButton.initialize(config);
setAIButtonCallback(aiButton);
Log.i(LOG_TAG, "initService:::::: ");
}
@Override
protected void onPause() {
super.onPause();
// speech.stopListening();
// use this method to disconnect from speech recognition service
new java.util.Timer().schedule(
new java.util.TimerTask() {
@Override
public void run() {
// your code here
}
},
5000
);
// Not destroying the SpeechRecognition object in onPause method would block other apps from using SpeechRecognition service
}
@Override
protected void onResume() {
super.onResume();
}
@Override
protected void onStop() {
Log.i(LOG_TAG, "stop");
super.onStop();
/* if (speech != null) {
speech.destroy();*/
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
//}
@Override
protected void onStart() {
super.onStart();
checkAudioRecordPermission();
}
protected void checkAudioRecordPermission() {
if (ContextCompat.checkSelfPermission(this,
Manifest.permission.RECORD_AUDIO)
!= PackageManager.PERMISSION_GRANTED) {
// Should we show an explanation?
if (ActivityCompat.shouldShowRequestPermissionRationale(this,
Manifest.permission.RECORD_AUDIO)) {
} else {
// No explanation needed, we can request the permission.
ActivityCompat.requestPermissions(this,
new String[]{Manifest.permission.RECORD_AUDIO},
REQUEST_AUDIO_PERMISSIONS_ID);
}
}
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
switch (requestCode) {
case REQUEST_AUDIO_PERMISSIONS_ID: {
// If request is cancelled, the result arrays are empty.
if (grantResults.length > 0
&& grantResults[0] == PackageManager.PERMISSION_GRANTED) {
} else {
}
return;
}
}
}
@Override
public void onReadyForSpeech(Bundle bundle) {
}
@Override
public void onBeginningOfSpeech() {
}
@Override
public void onRmsChanged(float v) {
}
@Override
public void onBufferReceived(byte[] bytes) {
}
@Override
public void onEndOfSpeech() {
//speech.stopListening();
}
@Override
public void onError(int errorcode) {
String errorMessage = getErrorText(errorcode);
Log.i(LOG_TAG, "FAILED " + errorMessage);
// speech.stopListening();
// listenToSpeech();
}
public String getErrorText(int errorCode) {
String message;
switch (errorCode) {
case SpeechRecognizer.ERROR_AUDIO:
message = "Audio recording error";
break;
case SpeechRecognizer.ERROR_CLIENT:
message = "Client side error";
break;
case SpeechRecognizer.ERROR_INSUFFICIENT_PERMISSIONS:
message = "Insufficient permissions";
break;
case SpeechRecognizer.ERROR_NETWORK:
message = "Network error";
break;
case SpeechRecognizer.ERROR_NETWORK_TIMEOUT:
message = "Network timeout";
break;
case SpeechRecognizer.ERROR_NO_MATCH:
message = "No match";
break;
case SpeechRecognizer.ERROR_RECOGNIZER_BUSY:
message = "RecognitionService busy";
break;
case SpeechRecognizer.ERROR_SERVER:
message = "error from server";
break;
case SpeechRecognizer.ERROR_SPEECH_TIMEOUT:
message = "No speech input";
//speech.stopListening();
break;
default:
message = "Didn't understand, please try again.";
break;
}
return message;
}
@Override
public void onPartialResults(Bundle bundle) {
}
@Override
public void onEvent(int i, Bundle bundle) {
}
}
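As a hedged sketch of the listener the question mentions: setOnUtteranceProgressListener() only fires its callbacks when speak() is given an utterance ID, so it is typically wired up roughly like this (the ID string and the restart call are illustrative; requires android.speech.tts.UtteranceProgressListener and java.util.HashMap):
// Sketch: restart the recognizer only after TTS has finished speaking.
tts.setOnUtteranceProgressListener(new UtteranceProgressListener() {
    @Override
    public void onStart(String utteranceId) { }

    @Override
    public void onDone(String utteranceId) {
        // Called on a background thread; hop back to the main thread before
        // touching views or starting SpeechRecognizer again.
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                listenToSpeech(); // re-enable the mic only after speech has completed
            }
        });
    }

    @Override
    public void onError(String utteranceId) { }
});

// The utterance ID is required, otherwise onDone() is never invoked.
HashMap<String, String> params = new HashMap<>();
params.put(TextToSpeech.Engine.KEY_PARAM_UTTERANCE_ID, "RESPONSE_UTTERANCE");
tts.speak(speech, TextToSpeech.QUEUE_FLUSH, params);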

Android bluetooth printer app works fine in debug mode but doesn't work in release mode

I am writing an Android app to print text on a Bluetooth thermal printer.
Here is the complete code
The app works fine in debug mode, but when I generate a signed APK and install it on the device, it does not respond at all.
I have tried different solutions suggested on Stack Overflow, but none of them worked.
This is my main activity:
import android.app.Activity;
import android.bluetooth.BluetoothAdapter;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.os.AsyncTask;
import android.os.Handler;
import android.os.Message;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.Toast;
import com.lvrenyang.io.IOCallBack;
import java.lang.ref.WeakReference;
public class MainActivity extends AppCompatActivity {
private Handler mHandler; // Our main handler that will receive callback notifications
// #defines for identifying shared types between calling functions
private final static int REQUEST_ENABLE_BT = 1; // used to identify adding bluetooth names
private static String TAG = "MAIN_ACTIVITY";
private Activity activity;
private Button btnConnect;
private String name = "MTP-II";
private String mac_address = "02:15:44:31:49:05";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
//Get the activity
this.activity = this;
//Button from the XML view
btnConnect = findViewById(R.id.btnConnect);
//Start the Init Work Service Async task
new InitWorkService().execute();
//Set onClickListener for test print button
btnConnect.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
try {
//Check if name and address are set
if (name != "null" && mac_address != "null" && mac_address.contains(":")) {
if (!WorkService.workThread.isConnected()) {
WorkService.workThread.connectBt(mac_address);
//Sleep for 3 seconds
try {
Thread.sleep(3000);
} catch (Exception e) {
}
}
//Check if connected
if (WorkService.workThread.isConnected()) {
//Collect data in background Thread
new PrintData().execute();
} else {
Toast.makeText(activity, Global.toast_notconnect, Toast.LENGTH_SHORT).show();
}
} else {
Toast.makeText(activity, "Please setup printer first!", Toast.LENGTH_LONG).show();
}
}
catch(Exception e){
Log.e(TAG, e.getMessage(), e.fillInStackTrace());}
}
});
}
/**
* Background Async Task
* */
private class InitWorkService extends AsyncTask<String, String, String> {
@Override
protected void onPreExecute() {
super.onPreExecute();
}
protected String doInBackground(String... args){
try{
WorkService.cb = new IOCallBack() {
public void OnOpen() {
if (null != mHandler) {
Message msg = mHandler.obtainMessage(Global.MSG_IO_ONOPEN);
mHandler.sendMessage(msg);
}
}
public void OnClose() {
if (null != mHandler) {
Message msg = mHandler.obtainMessage(Global.MSG_IO_ONCLOSE);
mHandler.sendMessage(msg);
}
}
};
}
catch(Exception e){
Log.e(TAG, e.getMessage(), e.fillInStackTrace());
}
return null;
}
protected void onPostExecute(String file_url){
try {
mHandler = new MHandler(MainActivity.this);
WorkService.addHandler(mHandler);
if (null == WorkService.workThread) {
Intent intent = new Intent(activity, WorkService.class);
startService(intent);
}
}
catch (Exception e) {
Log.e(TAG, e.getMessage(), e.fillInStackTrace());
Toast.makeText(activity, "Unable to initiate the WorkService!", Toast.LENGTH_LONG).show();
}
}
}
/**
* Background Async Task
* */
class PrintData extends AsyncTask<String, String, String> {
@Override
protected void onPreExecute() {
super.onPreExecute();
}
protected String doInBackground(String... args){
try{
int nTextAlign=1;
String text = "Test message!\r\n\r\n\r\n";
String encoding = "UTF-8";
byte[] hdrBytes = {0x1c, 0x26, 0x1b, 0x39, 0x01};
Bundle dataAlign = new Bundle();
Bundle dataTextOut = new Bundle();
Bundle dataHdr = new Bundle();
dataHdr.putByteArray(Global.BYTESPARA1, hdrBytes);
dataHdr.putInt(Global.INTPARA1, 0);
dataHdr.putInt(Global.INTPARA2, hdrBytes.length);
dataAlign.putInt(Global.INTPARA1, nTextAlign);
dataTextOut.putString(Global.STRPARA1, text);
dataTextOut.putString(Global.STRPARA2, encoding);
WorkService.workThread.handleCmd(Global.CMD_POS_WRITE,dataHdr);
WorkService.workThread.handleCmd(Global.CMD_POS_SALIGN,dataAlign);
WorkService.workThread.handleCmd(Global.CMD_POS_STEXTOUT,dataTextOut);
}catch(Exception e){
Log.e(TAG, e.getMessage(), e.fillInStackTrace());
}
return null;
}
protected void onPostExecute(String file_url){}
}
@Override
protected void onUserLeaveHint()
{
Log.d("onUserLeaveHint","Home button pressed");
super.onUserLeaveHint();
//Unregister bluetooth receiver
try{unregisterReceiver(bluetoothReceiver);}catch(Exception e){}
//Disconnect bt connection
try{WorkService.workThread.disconnectBt();}catch(Exception e){}
// remove the handler
try{WorkService.delHandler(mHandler);}catch(Exception e){}
mHandler = null;
}
/**
* Broadcast receiver for bluetooth state changes
*/
private final BroadcastReceiver bluetoothReceiver = new BroadcastReceiver() {
@Override
public void onReceive(Context context, Intent intent)
{
final String action = intent.getAction();
if (action.equals(BluetoothAdapter.ACTION_STATE_CHANGED))
{
final int state = intent.getIntExtra(BluetoothAdapter.EXTRA_STATE,BluetoothAdapter.ERROR);
switch (state)
{
case BluetoothAdapter.STATE_OFF:
// closeConnection();//Close on going connection and disable button
Intent enableBtIntent = new Intent(BluetoothAdapter.ACTION_REQUEST_ENABLE);
startActivityForResult(enableBtIntent, REQUEST_ENABLE_BT);
break;
case BluetoothAdapter.STATE_ON:
break;
}
}
}
};
private static class MHandler extends Handler {
WeakReference<MainActivity> mActivity;
MHandler(MainActivity activity) {
mActivity = new WeakReference<>(activity);
}
@Override
public void handleMessage(Message msg) {
MainActivity theActivity = mActivity.get();
switch (msg.what) {
case Global.CMD_POS_STEXTOUTRESULT:
case Global.CMD_POS_WRITERESULT: {
int result = msg.arg1;
Toast.makeText(
theActivity,
(result == 1) ? Global.toast_success
: Global.toast_fail, Toast.LENGTH_SHORT).show();
Log.v(TAG, "Result: " + result);
break;
}
}
}
}
}
Does your app manifest declare the permissions needed to use Bluetooth?
https://developer.android.com/guide/topics/connectivity/bluetooth#Permissions
In order to use Bluetooth features in your application, you must declare two permissions. The first of these is BLUETOOTH. You need this permission to perform any Bluetooth communication, such as requesting a connection, accepting a connection, and transferring data.
The other permission that you must declare is either ACCESS_COARSE_LOCATION or ACCESS_FINE_LOCATION. A location permission is required because Bluetooth scans can be used to gather information about the location of the user. This information may come from the user's own devices, as well as Bluetooth beacons in use at locations such as shops and transit facilities.
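In addition to the manifest entries, on Android 6.0+ the location permission is a "dangerous" permission and must also be granted at runtime; a minimal sketch (the request code is arbitrary):
private static final int REQUEST_LOCATION_FOR_BT = 42; // arbitrary request code

private void ensureBluetoothScanPermission() {
    // BLUETOOTH is granted at install time, but ACCESS_COARSE_LOCATION needs a runtime prompt.
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.ACCESS_COARSE_LOCATION)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.ACCESS_COARSE_LOCATION},
                REQUEST_LOCATION_FOR_BT);
    }
}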
If it only happens when you sign the APK, it looks like you have to update your ProGuard rules to exclude the printer library classes or similar.
One reason for the application not responding is that you stop the main thread for 3 seconds on line 60 in the click listener of the button.
Replace the onCreate() method with the code below
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
//Get the activity
this.activity = this;
//Button from the XML view
btnConnect = findViewById(R.id.btnConnect);
//Start the Init Work Service Async task
new InitWorkService().execute();
final ExecutorService es = Executors.newFixedThreadPool(1);
//Set onClickListener for test print button
btnConnect.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
btnConnect.setEnabled(false);
es.submit(new Runnable() {
@Override
public void run() {
connect();
}
});
}
});
}
private void connect() {
try {
//Check if name and address are set
if (name != null && mac_address != null && mac_address.contains(":")) {
if (!WorkService.workThread.isConnected()) {
WorkService.workThread.connectBt(mac_address);
//Sleep for 3 seconds
try {
Thread.sleep(3000);
} catch (Exception e) {
}
}
//Check if connected
if (WorkService.workThread.isConnected()) {
//Collect data in background Thread
new PrintData().execute();
} else {
new Handler(Looper.getMainLooper()).post(new Runnable() {
@Override
public void run() {
Toast.makeText(activity, Global.toast_notconnect, Toast.LENGTH_SHORT).show();
}
});
}
} else {
new Handler(Looper.getMainLooper()).post(new Runnable() {
@Override
public void run() {
Toast.makeText(activity, "Please setup printer first!", Toast.LENGTH_LONG).show();
}
});
}
} catch (Exception e) {
Log.e(TAG, e.getMessage(), e.fillInStackTrace());
}
new Handler(Looper.getMainLooper()).post(new Runnable() {
@Override
public void run() {
btnConnect.setEnabled(true);
}
});
}
Now the connection part executes in a new thread and only the UI operations go on the main one.
Please note that this code is not the optimal solution because it does not take into account the activity lifecycle. If the activity is re-created while the thread sleeps, there is still a reference kept to the old activity. But it should be a starting point for you.

Publish a share intent on an MQTT server

I am trying to create an Android application that receives some text from another application (via a share intent) and sends it to an MQTT server.
I'm using the Eclipse Paho library and the Android example that publishes text on a button press and logs text from a subscribed topic. The example is working fine.
But when I try to add some code to handle the share intent, the application crashes when I try to publish text.
This is the Java code:
package net.example.testingmqtt;
import android.content.Intent;
import android.os.Bundle;
import android.support.design.widget.FloatingActionButton;
import android.support.v7.app.AppCompatActivity;
import android.support.v7.widget.Toolbar;
import android.util.Log;
import android.view.View;
import android.view.Menu;
import android.view.MenuItem;
import android.widget.Toast;
import org.eclipse.paho.android.service.MqttAndroidClient;
import org.eclipse.paho.client.mqttv3.IMqttActionListener;
import org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;
import org.eclipse.paho.client.mqttv3.IMqttToken;
import org.eclipse.paho.client.mqttv3.MqttCallbackExtended;
import org.eclipse.paho.client.mqttv3.MqttConnectOptions;
import org.eclipse.paho.client.mqttv3.MqttException;
import org.eclipse.paho.client.mqttv3.MqttMessage;
public class MainActivity extends AppCompatActivity {
MqttAndroidClient mqttAndroidClient;
final String serverUri = "tcp://iot.eclipse.org:1883";
String clientId = "AndroidClient";
final String subscriptionTopic = "topic";
final String publishTopic = "pubtopic";
final String logTag = "MqttLog";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
Toolbar toolbar = findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
FloatingActionButton fab = findViewById(R.id.fab);
fab.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
publishMessage("Hello from Android");
}
});
// Setup MQTT
setupMqtt();
// Intent
Intent intent = getIntent();
String action = intent.getAction();
String type = intent.getType();
if(Intent.ACTION_SEND.equals(action) && type != null) {
if("text/plain".equals(type)) {
String sharedText = intent.getStringExtra(Intent.EXTRA_TEXT);
Log.d(logTag, "Get text/plain intent: " + sharedText);
publishMessage("Oh"); // -- This is the problem!
}
}
}
public void setupMqtt() {
clientId = clientId + System.currentTimeMillis();
mqttAndroidClient = new MqttAndroidClient(getApplicationContext(), serverUri, clientId);
mqttAndroidClient.setCallback(new MqttCallbackExtended() {
@Override
public void connectComplete(boolean reconnect, String serverURI) {
if (reconnect) {
Log.d(logTag, "Reconnected to " + serverURI);
subscribeToTopic();
}
else {
Log.d(logTag, "Connected to " + serverURI);
}
}
@Override
public void connectionLost(Throwable cause) {
Log.d(logTag, "The connection was lost");
}
@Override
public void messageArrived(String topic, MqttMessage message) throws Exception {
String msg = new String(message.getPayload());
Log.d(logTag, "Incoming message: " + msg);
Toast.makeText(MainActivity.this, msg, Toast.LENGTH_SHORT).show();
}
@Override
public void deliveryComplete(IMqttDeliveryToken token) {
}
});
MqttConnectOptions mqttConnectOptions = new MqttConnectOptions();
mqttConnectOptions.setAutomaticReconnect(true);
mqttConnectOptions.setCleanSession(false);
Log.d(logTag, "Trying to connect to MQTT server...");
try {
mqttAndroidClient.connect(mqttConnectOptions, null, new IMqttActionListener() {
@Override
public void onSuccess(IMqttToken asyncActionToken) {
subscribeToTopic();
}
@Override
public void onFailure(IMqttToken asyncActionToken, Throwable exception) {
Log.w(logTag, "Failed to connect to " + serverUri);
}
});
} catch (MqttException ex) {
ex.printStackTrace();
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.menu_main, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
//noinspection SimplifiableIfStatement
if (id == R.id.action_settings) {
return true;
}
return super.onOptionsItemSelected(item);
}
public void subscribeToTopic() {
try {
mqttAndroidClient.subscribe(subscriptionTopic, 0, null, new IMqttActionListener() {
@Override
public void onSuccess(IMqttToken asyncActionToken) {
Log.d(logTag, "Subscribed");
}
@Override
public void onFailure(IMqttToken asyncActionToken, Throwable exception) {
Log.w(logTag, "Failed to subscribe");
}
});
} catch (MqttException ex) {
ex.printStackTrace();
}
}
public void publishMessage(String text) {
try {
MqttMessage message = new MqttMessage();
message.setPayload(text.getBytes());
mqttAndroidClient.publish(publishTopic, message);
Log.d(logTag, "Message published");
} catch (MqttException ex) {
ex.printStackTrace();
}
}
}
The publishMessage("Oh") call throws an exception:
2019-03-14 22:15:44.579 19507-19507/net.e.testingmqtt E/AndroidRuntime: FATAL EXCEPTION: main
Process: net.example.testingmqtt, PID: 19507
java.lang.NullPointerException: Attempt to invoke virtual method 'org.eclipse.paho.client.mqttv3.IMqttDeliveryToken org.eclipse.paho.android.service.MqttService.publish(java.lang.String, java.lang.String, org.eclipse.paho.client.mqttv3.MqttMessage, java.lang.String, java.lang.String)' on a null object reference
at org.eclipse.paho.android.service.MqttAndroidClient.publish(MqttAndroidClient.java:812)
at org.eclipse.paho.android.service.MqttAndroidClient.publish(MqttAndroidClient.java:668)
at net.example.testingmqtt.MainActivity.publishMessage(MainActivity.java:177)
at net.example.testingmqtt.MainActivity$1.onClick(MainActivity.java:48)
at android.view.View.performClick(View.java:6891)
at android.view.View$PerformClick.run(View.java:26083)
at android.os.Handler.handleCallback(Handler.java:789)
at android.os.Handler.dispatchMessage(Handler.java:98)
at android.os.Looper.loop(Looper.java:164)
at android.app.ActivityThread.main(ActivityThread.java:6938)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.Zygote$MethodAndArgsCaller.run(Zygote.java:327)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1374)
I understand that the MQTT connection is not established when the application is launched (after using the share intent). But I don't understand why, in that case, the connection is not set up by the previous statement (setupMqtt).
Do you have any advice to fix the code?
That is a basic issue - NPE aka NullPointerException :-)
You are trying to call mqttAndroidClient.publish(publishTopic, message) before the client has finished connecting: setupMqtt() only starts the connection asynchronously, so when you publish straight away from the share-intent path, the client's internal service reference is still null, hence the NPE inside MqttAndroidClient.publish().
You could add a guard (for example, check mqttAndroidClient.isConnected() before publishing) or just catch and ignore the NPE:
public void publishMessage(String text) {
try {
MqttMessage message = new MqttMessage();
message.setPayload(text.getBytes());
mqttAndroidClient.publish(publishTopic, message);
Log.d(logTag, "Message published");
} catch (NullPointerException | MqttException ex) {
ex.printStackTrace();
}
}
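Alternatively, a sketch of the guard-clause variant (isConnected() is part of the Paho client API; the message is simply dropped here if the client is not ready yet):
public void publishMessage(String text) {
    // Skip publishing until setupMqtt() has created the client and the connection is up.
    if (mqttAndroidClient == null || !mqttAndroidClient.isConnected()) {
        Log.w(logTag, "MQTT client not connected yet, message dropped: " + text);
        return;
    }
    try {
        MqttMessage message = new MqttMessage();
        message.setPayload(text.getBytes());
        mqttAndroidClient.publish(publishTopic, message);
        Log.d(logTag, "Message published");
    } catch (MqttException ex) {
        ex.printStackTrace();
    }
}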
Well, I found a solution (maybe not the best) by using singleTask launch mode (a new android:launchMode="singleTask" attribute on the activity tag in AndroidManifest.xml) and the onNewIntent method:
@Override
protected void onNewIntent(Intent intent) {
super.onNewIntent(intent);
Log.d(logTag, "Get new intent...");
String action = intent.getAction();
String type = intent.getType();
if(Intent.ACTION_SEND.equals(action) && type != null) {
if("text/plain".equals(type)) {
String sharedText = intent.getStringExtra(Intent.EXTRA_TEXT);
Log.d(logTag, "Get text/plain intent: " + sharedText);
publishMessage("Oh, a new intent!");
}
}
}
There is no more error when I publish the message after receiving the intent.
Thanks to The Cheese Factory Blog, which helped me understand Android activity launch modes.

SMS retriever API SMS Broadcaster problem

I am using the SMS Retriever API to get the OTP, but the problem I am facing is that it does not receive the SMS every time. Sometimes the SMS content is retrieved and sometimes nothing happens.
I have used a Toast ("Broadcaster started") to check whether the receiver starts every time, but the Toast is also not displayed every time. I am unable to diagnose the problem.
Broadcast Receiver code:
public class OTPBroadcastReceiver extends BroadcastReceiver {
private String otp;
private static OTPSMSReceiveListner otpsmsReceiveListner = null;
private final Pattern p = Pattern.compile("(|^)\\d{4}");
public static void injectListner(OTPSMSReceiveListner listner){
otpsmsReceiveListner = listner;
}
@Override
public void onReceive(Context context, Intent intent) {
try {
Toast.makeText(context,"Broadcaster started",Toast.LENGTH_LONG).show();
if (SmsRetriever.SMS_RETRIEVED_ACTION.equals(intent.getAction())) {
Bundle extras = intent.getExtras();
Status status = (Status) extras.get(SmsRetriever.EXTRA_STATUS);
switch (status.getStatusCode()) {
case CommonStatusCodes.SUCCESS:
//Toast.makeText(context,"success",Toast.LENGTH_LONG).show();
// Get SMS message contents
String message = (String) extras.get(SmsRetriever.EXTRA_SMS_MESSAGE);
if (message != null) {
Matcher m = p.matcher(message);
if (m.find()) {
otp = m.group(0);
}
String token;
try {
token = CommonMethods.getSecurePref("OTP", context);
} catch (Exception ex) {
token = null;
}
if (token == null) {
//Pass on the text to our listener.
otpsmsReceiveListner.onOTPReceived(otp);
}
}
break;
case CommonStatusCodes.TIMEOUT:
Log.d("onReceive", "timed out (5 minutes)");
//Toast.makeText(context,"Timeout",Toast.LENGTH_LONG).show();
otpsmsReceiveListner.onOTPTimeout();
break;
}
}
}
catch (Exception ex){
Toast.makeText(context,ex.getLocalizedMessage(),Toast.LENGTH_LONG).show();
}
}
public interface OTPSMSReceiveListner{
void onOTPReceived(String otp);
void onOTPTimeout();
}
}
OTP class:
SmsRetrieverClient client = SmsRetriever.getClient(mContext);
Task<Void> task = client.startSmsRetriever();
task.addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void aVoid) {
try
{
Log.e("onSuccess","Successfully started retriever");
}
catch (Exception ex)
{
Log.e("onSuccess",ex.getMessage());
}
}
});
task.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(@NonNull Exception e) {
Log.e("onFailure", "Failed to start retriever");
}
});
OTPBroadcastReceiver.injectListner(new OTPBroadcastReceiver.OTPSMSReceiveListner() {
@Override
public void onOTPReceived(String otp) {
if(otp.length() == 4) {
otpField.setText(otp);
btnVerify.performClick();
}
}
@Override
public void onOTPTimeout() {
Log.e("onOTPTimeout","onOTPTimeout");
}
});
Manifest:
<receiver
android:name=".helpers.OTPBroadcastReceiver"
android:exported="true">
<intent-filter>
<action android:name="com.google.android.gms.auth.api.phone.SMS_RETRIEVED" />
</intent-filter>
</receiver>
SMS:
<#> your App OTP is:8149 585dyDy8cbh
See this answer: https://stackoverflow.com/a/55374780/10449332. Please register the BroadcastReceiver inside the SmsRetriever addOnSuccessListener callback.
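A rough sketch of that suggestion, reusing the names from the question (the point is to register the receiver dynamically once startSmsRetriever() reports success, instead of relying only on the manifest entry):
final OTPBroadcastReceiver otpReceiver = new OTPBroadcastReceiver();

Task<Void> task = SmsRetriever.getClient(mContext).startSmsRetriever();
task.addOnSuccessListener(new OnSuccessListener<Void>() {
    @Override
    public void onSuccess(Void aVoid) {
        // Register only after the retriever has actually started listening.
        IntentFilter filter = new IntentFilter(SmsRetriever.SMS_RETRIEVED_ACTION);
        mContext.registerReceiver(otpReceiver, filter);
    }
});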

Examples using CastCompanionLibrary to simply display an image

I am looking for an example of casting an image to a Chromecast in Android. Oddly enough, this doesn't seem to be covered in the googlecast sample repositories. Does anyone have a simple implementation of this? I basically would like to click on an image in my app's photo gallery on my Android device and have it cast to the screen.
One side question: does the image need to be at a URL, or is it possible to stream the image to the device? I appreciate the help in advance.
I've solved this without the CastCompanionLibrary, but based on Google's CastHelloText-android sample. Basically, what I did was:
encode an image into a base64 string and send it as a message to a custom receiver
modify the sample's receiver to receive a base64 string and set it as the image source.
upload and register my receiver and have the application use the generated application id
This is the code for the receiver:
<!DOCTYPE html>
<html>
<head>
<style>
img#androidImage {
height:auto;
width:100%;
}
</style>
<title>Cast Hello Text</title>
</head>
<body>
<img id="androidImage" src="" />
<script type="text/javascript" src="//www.gstatic.com/cast/sdk/libs/receiver/2.0.0/cast_receiver.js"></script>
<script type="text/javascript">
window.onload = function() {
cast.receiver.logger.setLevelValue(0);
window.castReceiverManager = cast.receiver.CastReceiverManager.getInstance();
console.log('Starting Receiver Manager');
// handler for the 'ready' event
castReceiverManager.onReady = function(event) {
console.log('Received Ready event: ' + JSON.stringify(event.data));
window.castReceiverManager.setApplicationState("Application status is ready...");
};
// handler for 'senderconnected' event
castReceiverManager.onSenderConnected = function(event) {
console.log('Received Sender Connected event: ' + event.data);
console.log(window.castReceiverManager.getSender(event.data).userAgent);
};
// handler for 'senderdisconnected' event
castReceiverManager.onSenderDisconnected = function(event) {
console.log('Received Sender Disconnected event: ' + event.data);
if (window.castReceiverManager.getSenders().length == 0) {
window.close();
}
};
// handler for 'systemvolumechanged' event
castReceiverManager.onSystemVolumeChanged = function(event) {
console.log('Received System Volume Changed event: ' + event.data['level'] + ' ' +
event.data['muted']);
};
// create a CastMessageBus to handle messages for a custom namespace
window.messageBus =
window.castReceiverManager.getCastMessageBus(
'urn:x-cast:com.google.cast.sample.helloworld');
// handler for the CastMessageBus message event
window.messageBus.onMessage = function(event) {
console.log('Message recieved');
var obj = JSON.parse(event.data)
console.log('Message type: ' + obj.type);
if (obj.type == "text") {
console.log('Skipping message: ' + obj.data);
}
if (obj.type == "image") {
var source = 'data:image/png;base64,'.concat(obj.data)
displayImage(source);
}
// inform all senders on the CastMessageBus of the incoming message event
// sender message listener will be invoked
window.messageBus.send(event.senderId, event.data);
}
// initialize the CastReceiverManager with an application status message
window.castReceiverManager.start({statusText: "Application is starting"});
console.log('Receiver Manager started');
};
function displayImage(source) {
console.log('received image');
document.getElementById("androidImage").src=source;
window.castReceiverManager.setApplicationState('image source changed');
};
</script>
</body>
</html>
Below is the modified MainActivity.java code. Don't forget to modify the app_id in strings.xml once your receiver application is registered.
Two notes:
The sent messages are wrapped in a JSON object so I can filter out the text messages.
The ENCODED_IMAGE_STRING variable isn't defined in this example; you'll have to find an image and convert it to a base64 string yourself.
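For the second note, a small sketch of producing that base64 string from a Bitmap (for example one returned by the getBitmapFromView() helper below; uses android.util.Base64 and java.io.ByteArrayOutputStream):
// Convert a Bitmap into the base64 payload the receiver above expects.
private static String encodeBitmapToBase64(Bitmap bitmap) {
    ByteArrayOutputStream stream = new ByteArrayOutputStream();
    bitmap.compress(Bitmap.CompressFormat.PNG, 100, stream); // receiver builds a data:image/png URI
    return Base64.encodeToString(stream.toByteArray(), Base64.NO_WRAP);
}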
MainActivity.java:
package com.example.casthelloworld;
import java.io.IOException;
import java.util.ArrayList;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.drawable.ColorDrawable;
import android.graphics.drawable.Drawable;
import android.os.Bundle;
import android.speech.RecognizerIntent;
import android.support.v4.view.MenuItemCompat;
import android.support.v7.app.ActionBar;
import android.support.v7.app.ActionBarActivity;
import android.support.v7.app.MediaRouteActionProvider;
import android.support.v7.media.MediaRouteSelector;
import android.support.v7.media.MediaRouter;
import android.support.v7.media.MediaRouter.RouteInfo;
import android.util.Log;
import android.view.Menu;
import android.view.MenuItem;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.Toast;
import com.google.android.gms.cast.ApplicationMetadata;
import com.google.android.gms.cast.Cast;
import com.google.android.gms.cast.Cast.ApplicationConnectionResult;
import com.google.android.gms.cast.Cast.MessageReceivedCallback;
import com.google.android.gms.cast.CastDevice;
import com.google.android.gms.cast.CastMediaControlIntent;
import com.google.android.gms.common.ConnectionResult;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.common.api.ResultCallback;
import com.google.android.gms.common.api.Status;
/**
* Main activity to send messages to the receiver.
*/
public class MainActivity extends ActionBarActivity {
private static final String TAG = MainActivity.class.getSimpleName();
private static final int REQUEST_CODE = 1;
private MediaRouter mMediaRouter;
private MediaRouteSelector mMediaRouteSelector;
private MediaRouter.Callback mMediaRouterCallback;
private CastDevice mSelectedDevice;
private GoogleApiClient mApiClient;
private Cast.Listener mCastListener;
private ConnectionCallbacks mConnectionCallbacks;
private ConnectionFailedListener mConnectionFailedListener;
private HelloWorldChannel mHelloWorldChannel;
private boolean mApplicationStarted;
private boolean mWaitingForReconnect;
private String mSessionId;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
ActionBar actionBar = getSupportActionBar();
actionBar.setBackgroundDrawable(new ColorDrawable(
android.R.color.transparent));
// When the user clicks on the button, use Android voice recognition to
// get text
Button voiceButton = (Button) findViewById(R.id.voiceButton);
voiceButton.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
startVoiceRecognitionActivity();
}
});
// When the user clicks on the button, use Android voice recognition to
// get text
Button yarrButton = (Button) findViewById(R.id.tmpButton);
yarrButton.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
castImage();
}
});
// Configure Cast device discovery
mMediaRouter = MediaRouter.getInstance(getApplicationContext());
mMediaRouteSelector = new MediaRouteSelector.Builder()
.addControlCategory(
CastMediaControlIntent.categoryForCast(getResources()
.getString(R.string.app_id))).build();
mMediaRouterCallback = new MyMediaRouterCallback();
}
private void castImage()
{
Log.d(TAG, "castImage()");
String image_string = createJsonMessage(MessageType.image, ENCODED_IMAGE_STRING);
sendMessage(image_string);
}
/**
* Android voice recognition
*/
private void startVoiceRecognitionActivity() {
Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
intent.putExtra(RecognizerIntent.EXTRA_PROMPT,
getString(R.string.message_to_cast));
startActivityForResult(intent, REQUEST_CODE);
}
/*
* Handle the voice recognition response
*
* @see android.support.v4.app.FragmentActivity#onActivityResult(int, int,
* android.content.Intent)
*/
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == REQUEST_CODE && resultCode == RESULT_OK) {
ArrayList<String> matches = data
.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
if (matches.size() > 0) {
Log.d(TAG, matches.get(0));
String message = createJsonMessage(MessageType.text, matches.get(0));
sendMessage(message);
}
}
super.onActivityResult(requestCode, resultCode, data);
}
@Override
protected void onResume() {
super.onResume();
// Start media router discovery
mMediaRouter.addCallback(mMediaRouteSelector, mMediaRouterCallback,
MediaRouter.CALLBACK_FLAG_REQUEST_DISCOVERY);
}
@Override
protected void onPause() {
if (isFinishing()) {
// End media router discovery
mMediaRouter.removeCallback(mMediaRouterCallback);
}
super.onPause();
}
@Override
public void onDestroy() {
teardown();
super.onDestroy();
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
super.onCreateOptionsMenu(menu);
getMenuInflater().inflate(R.menu.main, menu);
MenuItem mediaRouteMenuItem = menu.findItem(R.id.media_route_menu_item);
MediaRouteActionProvider mediaRouteActionProvider = (MediaRouteActionProvider) MenuItemCompat
.getActionProvider(mediaRouteMenuItem);
// Set the MediaRouteActionProvider selector for device discovery.
mediaRouteActionProvider.setRouteSelector(mMediaRouteSelector);
return true;
}
/**
* Callback for MediaRouter events
*/
private class MyMediaRouterCallback extends MediaRouter.Callback {
@Override
public void onRouteSelected(MediaRouter router, RouteInfo info) {
Log.d(TAG, "onRouteSelected");
// Handle the user route selection.
mSelectedDevice = CastDevice.getFromBundle(info.getExtras());
launchReceiver();
}
@Override
public void onRouteUnselected(MediaRouter router, RouteInfo info) {
Log.d(TAG, "onRouteUnselected: info=" + info);
teardown();
mSelectedDevice = null;
}
}
/**
* Start the receiver app
*/
private void launchReceiver() {
try {
mCastListener = new Cast.Listener() {
@Override
public void onApplicationDisconnected(int errorCode) {
Log.d(TAG, "application has stopped");
teardown();
}
};
// Connect to Google Play services
mConnectionCallbacks = new ConnectionCallbacks();
mConnectionFailedListener = new ConnectionFailedListener();
Cast.CastOptions.Builder apiOptionsBuilder = Cast.CastOptions
.builder(mSelectedDevice, mCastListener);
mApiClient = new GoogleApiClient.Builder(this)
.addApi(Cast.API, apiOptionsBuilder.build())
.addConnectionCallbacks(mConnectionCallbacks)
.addOnConnectionFailedListener(mConnectionFailedListener)
.build();
mApiClient.connect();
} catch (Exception e) {
Log.e(TAG, "Failed launchReceiver", e);
}
}
/**
* Google Play services callbacks
*/
private class ConnectionCallbacks implements
GoogleApiClient.ConnectionCallbacks {
@Override
public void onConnected(Bundle connectionHint) {
Log.d(TAG, "onConnected");
if (mApiClient == null) {
// We got disconnected while this runnable was pending
// execution.
return;
}
try {
if (mWaitingForReconnect) {
mWaitingForReconnect = false;
// Check if the receiver app is still running
if ((connectionHint != null)
&& connectionHint
.getBoolean(Cast.EXTRA_APP_NO_LONGER_RUNNING)) {
Log.d(TAG, "App is no longer running");
teardown();
} else {
// Re-create the custom message channel
try {
Cast.CastApi.setMessageReceivedCallbacks(
mApiClient,
mHelloWorldChannel.getNamespace(),
mHelloWorldChannel);
} catch (IOException e) {
Log.e(TAG, "Exception while creating channel", e);
}
}
} else {
// Launch the receiver app
Cast.CastApi
.launchApplication(mApiClient,
getString(R.string.app_id), false)
.setResultCallback(
new ResultCallback<Cast.ApplicationConnectionResult>() {
@Override
public void onResult(
ApplicationConnectionResult result) {
Status status = result.getStatus();
Log.d(TAG,
"ApplicationConnectionResultCallback.onResult: statusCode "
+ status.getStatusCode());
if (status.isSuccess()) {
ApplicationMetadata applicationMetadata = result
.getApplicationMetadata();
mSessionId = result
.getSessionId();
String applicationStatus = result
.getApplicationStatus();
boolean wasLaunched = result
.getWasLaunched();
Log.d(TAG,
"application name: "
+ applicationMetadata
.getName()
+ ", status: "
+ applicationStatus
+ ", sessionId: "
+ mSessionId
+ ", wasLaunched: "
+ wasLaunched);
mApplicationStarted = true;
// Create the custom message
// channel
mHelloWorldChannel = new HelloWorldChannel();
try {
Cast.CastApi
.setMessageReceivedCallbacks(
mApiClient,
mHelloWorldChannel
.getNamespace(),
mHelloWorldChannel);
} catch (IOException e) {
Log.e(TAG,
"Exception while creating channel",
e);
}
// set the initial instructions
// on the receiver
String message = createJsonMessage(MessageType.text, getString(R.string.instructions));
sendMessage(message);
} else {
Log.e(TAG,
"application could not launch");
teardown();
}
}
});
}
} catch (Exception e) {
Log.e(TAG, "Failed to launch application", e);
}
}
@Override
public void onConnectionSuspended(int cause) {
Log.d(TAG, "onConnectionSuspended");
mWaitingForReconnect = true;
}
}
/**
* Google Play services callbacks
*/
private class ConnectionFailedListener implements
GoogleApiClient.OnConnectionFailedListener {
@Override
public void onConnectionFailed(ConnectionResult result) {
Log.e(TAG, "onConnectionFailed ");
teardown();
}
}
/**
* Tear down the connection to the receiver
*/
private void teardown() {
Log.d(TAG, "teardown");
if (mApiClient != null) {
if (mApplicationStarted) {
if (mApiClient.isConnected() || mApiClient.isConnecting()) {
try {
Cast.CastApi.stopApplication(mApiClient, mSessionId);
if (mHelloWorldChannel != null) {
Cast.CastApi.removeMessageReceivedCallbacks(
mApiClient,
mHelloWorldChannel.getNamespace());
mHelloWorldChannel = null;
}
} catch (IOException e) {
Log.e(TAG, "Exception while removing channel", e);
}
mApiClient.disconnect();
}
mApplicationStarted = false;
}
mApiClient = null;
}
mSelectedDevice = null;
mWaitingForReconnect = false;
mSessionId = null;
}
/**
* Send a text message to the receiver
*
* @param message
*/
private void sendMessage(String message) {
if (mApiClient != null && mHelloWorldChannel != null) {
try {
Cast.CastApi.sendMessage(mApiClient,
mHelloWorldChannel.getNamespace(), message)
.setResultCallback(new ResultCallback<Status>() {
@Override
public void onResult(Status result) {
if (!result.isSuccess()) {
Log.e(TAG, "Sending message failed");
}
}
});
} catch (Exception e) {
Log.e(TAG, "Exception while sending message", e);
}
} else {
Toast.makeText(MainActivity.this, message, Toast.LENGTH_SHORT)
.show();
}
}
/**
* Custom message channel
*/
class HelloWorldChannel implements MessageReceivedCallback {
/**
* #return custom namespace
*/
public String getNamespace() {
return getString(R.string.namespace);
}
/*
* Receive message from the receiver app
*/
@Override
public void onMessageReceived(CastDevice castDevice, String namespace,
String message) {
Log.d(TAG, "onMessageReceived: " + message);
}
}
enum MessageType {
text,
image,
}
public static Bitmap getBitmapFromView(View view) {
//Define a bitmap with the same size as the view
Bitmap returnedBitmap = Bitmap.createBitmap(view.getWidth(), view.getHeight(),Bitmap.Config.ARGB_8888);
//Bind a canvas to it
Canvas canvas = new Canvas(returnedBitmap);
//Get the view's background
Drawable bgDrawable =view.getBackground();
if (bgDrawable!=null)
//has background drawable, then draw it on the canvas
bgDrawable.draw(canvas);
else
//does not have background drawable, then draw white background on the canvas
canvas.drawColor(Color.WHITE);
// draw the view on the canvas
view.draw(canvas);
//return the bitmap
return returnedBitmap;
}
private static String createJsonMessage(MessageType type, String message)
{
return String.format("{\"type\":\"%s\", \"data\":\"%s\"}", type.toString(), message);
}
}
Since on Chromecast your application is running inside a web browser, you need an <img/> tag to show the image. The src attribute of that tag should point to the image you want to see, and it has to be a URL. So if your image resides on your phone's local storage, you need to start a small web server in your mobile application to serve that image, and tell the receiver which URL it should point at (the URL at which your server is serving that image). These are all doable, and you can use the CastCompanionLibrary, if you want, to communicate with your custom receiver; simply use the DataCastManager class instead of VideoCastManager.
Maybe my answer will be helpful for other developers, because I also didn't find a good solution and ended up doing it myself.
To show an image on the cast device's screen from your app via Google Cast, you can create and start a simple web server inside your app that serves HTTP requests containing the selected image name or ID in the URL.
Example:
public class MyWebServer {
private Activity activity;
private static ServerSocket httpServerSocket;
private static boolean isWebServerSunning;
public static final String drawableDelimiter = "pic-";
public MyWebServer(Activity activity) {
this.activity = activity;
}
public void stopWebServer() {
isWebServerSunning = false;
try {
if (httpServerSocket != null) {
httpServerSocket.close();
}
} catch (IOException e) {
e.printStackTrace();
}
}
public void startWebServer() {
isWebServerSunning = true;
Thread webServerThread = new Thread(() -> {
Socket socket;
HttpResponseThread httpResponseThread;
try {
httpServerSocket = new ServerSocket(5050);
while (isWebServerSunning) {
socket = httpServerSocket.accept();
httpResponseThread = new HttpResponseThread(socket);
httpResponseThread.start();
}
} catch (Exception e) {
e.printStackTrace();
}
});
webServerThread.start();
}
private class HttpResponseThread extends Thread {
Socket clientSocket;
HttpResponseThread(Socket socket) {
this.clientSocket = socket;
}
@Override
public void run() {
try (BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()));
OutputStream outputStream = clientSocket.getOutputStream();
) {
String input = bufferedReader.readLine();
if (input != null && !input.isEmpty() && input.contains("/") && input.contains(" ")) {
if (input.contains(drawableDelimiter)) {
String imageId = input.substring(input.indexOf("/") + 1, input.lastIndexOf(" ")).trim().split(drawableDelimiter)[1];
Bitmap bitmap = BitmapFactory.decodeResource(activity.getResources(), Integer.parseInt(imageId));
if (bitmap != null) {
ByteArrayOutputStream bitmapBytes = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.PNG, 100, bitmapBytes);
outputStream.write("HTTP/1.0 200 OK\r\n".getBytes());
outputStream.write("Server: Apache/0.8.4\r\n".getBytes());
outputStream.write(("Content-Length: " + bitmapBytes.toByteArray().length + "\r\n").getBytes());
outputStream.write("\r\n".getBytes());
outputStream.write(bitmapBytes.toByteArray());
}
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
}
}
And just start or stop your web server when Google Cast becomes usable or is stopped.
MyWebServer myWebServer = new MyWebServer(this); // pass your activity here
myWebServer.startWebServer();
myWebServer.stopWebServer();
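On the sender side, the URL handed to the Cast receiver can then be built from the phone's Wi-Fi IP address, the 5050 port and the "pic-" prefix used above; a sketch (uses android.net.wifi.WifiManager and android.text.format.Formatter; R.drawable.my_photo is a placeholder resource, and both devices must be on the same network):
// Produces something like "http://192.168.1.23:5050/pic-2131230875" for the receiver to load.
WifiManager wifiManager = (WifiManager) getApplicationContext().getSystemService(Context.WIFI_SERVICE);
String deviceIp = Formatter.formatIpAddress(wifiManager.getConnectionInfo().getIpAddress());
String imageUrl = "http://" + deviceIp + ":5050/" + MyWebServer.drawableDelimiter + R.drawable.my_photo;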
