I want to check whether a headset has a microphone. Currently I'm using this broadcast receiver code, but I'm not sure whether it is correct.
public class HeadSetMicrophoneStateReceiver extends BroadcastReceiver {
private String pluggedState = null;
@Override
public void onReceive(Context context, Intent intent) {
if (intent.getAction().equals(Intent.ACTION_HEADSET_PLUG)) {
int state = intent.getIntExtra("microphone", -1);
switch (state) {
case 0:
//Headset does not have a Microphone
pluggedState = "0";
break;
case 1:
//Headset has a Microphone
pluggedState = "1";
break;
default:
pluggedState = "I have no idea what the headset state is";
}
EventBus.getDefault().post(new HeadSetMicrophoneEvent(pluggedState));
}
}
}
Please help me out.
This method returns whether the microphone is available.
If it is not available, prepare() throws an exception, which is caught and the method returns false.
public static boolean getMicrophoneAvailable(Context context) {
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
recorder.setOutputFile(new File(context.getCacheDir(), "MediaUtil#micAvailTestFile").getAbsolutePath());
boolean available = true;
try {
recorder.prepare();
}
catch (IOException exception) {
available = false;
}
recorder.release();
return available;
}
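For example, a minimal usage sketch from inside an Activity, where startRecordingFlow() is just a placeholder for your own recording code:
// Hypothetical usage: check the microphone before starting a recording session.
if (getMicrophoneAvailable(this)) {
    startRecordingFlow(); // placeholder for whatever starts your recording
} else {
    Toast.makeText(this, "No microphone available", Toast.LENGTH_SHORT).show();
}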
Try the following code:
package com.example.headsetplugtest;
import android.app.Activity;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.os.Bundle;
import android.util.Log;
public class MyActivity extends Activity {
private final BroadcastReceiver mReceiver = new BroadcastReceiver() {
@Override
public void onReceive(Context context, Intent intent) {
final String action = intent.getAction();
if (Intent.ACTION_HEADSET_PLUG.equals(action)) {
Log.d("MyActivity ", "state: " + intent.getIntExtra("state", -1));
Log.d("MyActivity ", "microphone: " + intent.getIntExtra("microphone", -1));
}
}
};
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
}
@Override
protected void onResume() {
super.onResume();
IntentFilter filter = new IntentFilter(Intent.ACTION_HEADSET_PLUG);
getApplicationContext().registerReceiver(mReceiver, filter);
}
@Override
protected void onStop() {
super.onStop();
getApplicationContext().unregisterReceiver(mReceiver);
}
}
Related
I am working on the SpeechRecognizer module in Android, integrated with Dialogflow. I have almost completed the integration and it works fine, but I am facing one issue: while TTS is speaking the response, the mic gets enabled before the speech has finished, so it captures the response as well instead of only the user's utterance.
So I want to know how I can tell that TTS has finished speaking the response. I have tried Google's suggested solution using the onUtteranceCompleted() method, and I have also tried setOnUtteranceProgressListener(), but neither seems to work. I want to implement these methods in an AsyncTask so that this can be done in the background. How can I do that?
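This is the kind of completion callback I am trying to get working; a rough sketch (not my actual code) would be something like the following, where restartListening() is just a placeholder for whatever re-enables the SpeechRecognizer, and as far as I understand onDone() is only reported when speak() is given an utterance ID:
// Rough sketch of a TTS completion callback (import android.speech.tts.UtteranceProgressListener)
tts.setOnUtteranceProgressListener(new UtteranceProgressListener() {
    @Override
    public void onStart(String utteranceId) { }

    @Override
    public void onDone(String utteranceId) {
        // Called on a background thread, so hop back to the main thread before touching the recognizer/UI
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                restartListening(); // placeholder: re-enable the mic / SpeechRecognizer here
            }
        });
    }

    @Override
    public void onError(String utteranceId) { }
});

// speak() needs an utterance ID for the listener callbacks to be reported (API 21+ overload)
tts.speak(speech, TextToSpeech.QUEUE_FLUSH, null, "RESPONSE_UTTERANCE");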
Here is the code that I have tried:
import android.Manifest;
import android.app.ProgressDialog;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import android.speech.tts.TextToSpeech;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.support.v7.app.AppCompatActivity;
import android.text.TextUtils;
import android.util.Log;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.EditText;
import android.widget.ListView;
import android.widget.TextView;
import android.widget.Toast;
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;
import ai.api.PartialResultsListener;
import ai.api.android.AIConfiguration;
import ai.api.android.GsonFactory;
import ai.api.model.AIError;
import ai.api.model.AIResponse;
import ai.api.ui.AIDialog;
import android.os.AsyncTask;
import com.google.gson.Gson;
public class VoiceActivity extends AppCompatActivity implements View.OnClickListener, TextToSpeech.OnInitListener, RecognitionListener {
private AIDialog.AIDialogListener resultsListener;
private static final int REQUEST_AUDIO_PERMISSIONS_ID = 33;
private Gson gson = GsonFactory.getGson();
private ListView wordList;
TextView txtmain;
private static final int VR_REQUEST = 999;
//Log tag for output information
private final String LOG_TAG = "SpeechRepeatActivity";//***enter your own tag here***
//variable for checking TTS engine data on user device
private int MY_DATA_CHECK_CODE = 0;
//Text To Speech instance
private TextToSpeech tts;
AIButton aiButton;
EditText time;
TextView matchcall;
// private SpeechRecognizer speech = null;
private Intent recognizerIntent;
TextView tvOutput ;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_voice);
aiButton = (AIButton) findViewById(R.id.speech_btn);
txtmain =findViewById(R.id.txtmain);
wordList = findViewById(R.id.word_list);
tvOutput = findViewById(R.id.tvOutput);
time = (EditText) findViewById(R.id.in_time);
matchcall = (TextView) findViewById(R.id.matchcall);
final AsyncTaskRunner runner = new AsyncTaskRunner();
initService();
PackageManager packManager = getPackageManager();
List<ResolveInfo> intActivities = packManager.queryIntentActivities(new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
if (intActivities.size() != 0) {
//speech recognition is supported - detect user button clicks
aiButton.setOnClickListener(this);
//prepare the TTS to repeat chosen words
Intent checkTTSIntent = new Intent();
//check TTS data
checkTTSIntent.setAction(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA);
//start the checking Intent - will retrieve result in onActivityResult
startActivityForResult(checkTTSIntent, MY_DATA_CHECK_CODE);
} else {
//speech recognition not supported, disable button and output message
aiButton.setEnabled(false);
Toast.makeText(this, "Oops - Speech recognition not supported!", Toast.LENGTH_LONG).show();
}
}
private void setAIButtonCallback(final AIButton aiButton) {
aiButton.setResultsListener(new AIButton.AIButtonListener() {
@Override
public void onResult(final AIResponse result) {
if (resultsListener != null) {
resultsListener.onResult(result);
Log.e(LOG_TAG,"onResult=="+result.getResult().getResolvedQuery());
final String speech = result.getResult().getFulfillment().getSpeech();
tvOutput.setText(speech);
}
}
@Override
public void onError(final AIError error) {
if (resultsListener != null) {
resultsListener.onError(error);
Log.e(LOG_TAG,"onError=="+error);
}
}
@Override
public void onCancelled() {
if (resultsListener != null) {
resultsListener.onCancelled();
Log.e(LOG_TAG,"onCancelled==");
}
}
});
aiButton.setPartialResultsListener(new PartialResultsListener() {
@Override
public void onPartialResults(final List<String> partialResults) {
final String result = partialResults.get(0);
if (!TextUtils.isEmpty(result)) {
new Handler(Looper.getMainLooper()).post(new Runnable() {
@Override
public void run() {
txtmain.setText(result);
}
});
}
}
});
}
@Override
public void onClick(View v) {
if (v.getId() == R.id.speech_btn) {
//listen for results
listenToSpeech();
}
}
@Override
public void onInit(int status) {
//if successful, set locale
if (status == TextToSpeech.SUCCESS)
tts.setLanguage(Locale.UK);//***choose your own locale here***
}
private void listenToSpeech() {
SpeechRecognizer speech = SpeechRecognizer.createSpeechRecognizer(this);
speech.setRecognitionListener(this);
recognizerIntent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
recognizerIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE,"en");
recognizerIntent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE,this.getPackageName());
recognizerIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,RecognizerIntent.LANGUAGE_MODEL_WEB_SEARCH);
recognizerIntent.putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE,this.getPackageName());
recognizerIntent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 3);
//start listening
speech.startListening(recognizerIntent);
}
@Override
public void onResults(Bundle results) {
Log.i(LOG_TAG, "onResults");
ArrayList<String> matches = results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
/* wordList.setAdapter(new ArrayAdapter<String>(this, R.layout.word, matches));
new Handler().post(new Runnable() {
@Override
public void run() {
wordList.performItemClick(
wordList.getChildAt(0),
0,
wordList.getAdapter().getItemId(0));
wordList.getChildAt(0).setBackgroundColor(0xFFD3D3D3);
}
});*/
String wordChosen = matches.get(0);
tvOutput.setText(wordChosen);
try {
AsyncTaskRunner runner = new AsyncTaskRunner();
runner.execute(wordChosen);
// aiButton.textRequest(text);
//runner.doInBackground(text);
} catch (Exception e) {
e.printStackTrace();
}
}
private class AsyncTaskRunner extends AsyncTask<String, AIResponse, AIResponse> {
private AIResponse resp;
ProgressDialog progressDialog;
@Override
protected AIResponse doInBackground(String... params) {
Log.e(LOG_TAG,"doInBackground=="+params[0]);
try {
resp = aiButton.textRequest(String.valueOf(params[0]));
}
catch (Exception e) {
e.printStackTrace();
}
return resp;
}
protected void onPostExecute(AIResponse result) {
Log.e(LOG_TAG,"onPostExecute== result=="+result.getResult().getFulfillment().getSpeech());
// execution of result of Long time consuming operation
findViewById(R.id.progBar).setVisibility(View.GONE);
txtmain.setText(result.getResult().getFulfillment().getSpeech());
String speech = result.getResult().getFulfillment().getSpeech();
tts.speak( speech, TextToSpeech.QUEUE_FLUSH, null);
Toast.makeText(VoiceActivity.this, speech, Toast.LENGTH_SHORT).show();
listenToSpeech();
}
@Override
protected void onPreExecute() {
Log.e(LOG_TAG,"onPreExecute==");
findViewById(R.id.progBar).setVisibility(View.VISIBLE);
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
//check speech recognition result
if (requestCode == VR_REQUEST && resultCode == RESULT_OK)
{
//store the returned word list as an ArrayList
ArrayList<String> suggestedWords = data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
//set the retrieved list to display in the ListView using an ArrayAdapter
wordList.setAdapter(new ArrayAdapter<String>(this, R.layout.word, suggestedWords));
/*new Handler().post(new Runnable() {
@Override
public void run() {
wordList.performItemClick(
wordList.getChildAt(0),
0,
wordList.getAdapter().getItemId(0));
wordList.getChildAt(0).setBackgroundColor(0xFFD3D3D3);
}
});*/
tvOutput.setText(suggestedWords.get(0));
}
if (requestCode == MY_DATA_CHECK_CODE)
{
//we have the data - create a TTS instance
if (resultCode == TextToSpeech.Engine.CHECK_VOICE_DATA_PASS)
tts = new TextToSpeech(this, this);
//data not installed, prompt the user to install it
else
{
//intent will take user to TTS download page in Google Play
Intent installTTSIntent = new Intent();
installTTSIntent.setAction(TextToSpeech.Engine.ACTION_INSTALL_TTS_DATA);
startActivity(installTTSIntent);
}
}
//call superclass method
super.onActivityResult(requestCode, resultCode, data);
}
private void initService() {
final AIConfiguration config = new AIConfiguration("a73d5e88477e4926ae84af46f24e0aaa",
AIConfiguration.SupportedLanguages.English,
AIConfiguration.RecognitionEngine.Google);
aiButton.initialize(config);
setAIButtonCallback(aiButton);
Log.i(LOG_TAG, "initService:::::: ");
}
@Override
protected void onPause() {
super.onPause();
// speech.stopListening();
// use this method to disconnect from speech recognition service
new java.util.Timer().schedule(
new java.util.TimerTask() {
@Override
public void run() {
// your code here
}
},
5000
);
// Not destroying the SpeechRecognition object in onPause method would block other apps from using SpeechRecognition service
}
@Override
protected void onResume() {
super.onResume();
}
@Override
protected void onStop() {
Log.i(LOG_TAG, "stop");
super.onStop();
/* if (speech != null) {
speech.destroy();*/
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
//}
@Override
protected void onStart() {
super.onStart();
checkAudioRecordPermission();
}
protected void checkAudioRecordPermission() {
if (ContextCompat.checkSelfPermission(this,
Manifest.permission.RECORD_AUDIO)
!= PackageManager.PERMISSION_GRANTED) {
// Should we show an explanation?
if (ActivityCompat.shouldShowRequestPermissionRationale(this,
Manifest.permission.RECORD_AUDIO)) {
} else {
// No explanation needed, we can request the permission.
ActivityCompat.requestPermissions(this,
new String[]{Manifest.permission.RECORD_AUDIO},
REQUEST_AUDIO_PERMISSIONS_ID);
}
}
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
switch (requestCode) {
case REQUEST_AUDIO_PERMISSIONS_ID: {
// If request is cancelled, the result arrays are empty.
if (grantResults.length > 0
&& grantResults[0] == PackageManager.PERMISSION_GRANTED) {
} else {
}
return;
}
}
}
@Override
public void onReadyForSpeech(Bundle bundle) {
}
@Override
public void onBeginningOfSpeech() {
}
@Override
public void onRmsChanged(float v) {
}
@Override
public void onBufferReceived(byte[] bytes) {
}
@Override
public void onEndOfSpeech() {
//speech.stopListening();
}
@Override
public void onError(int errorcode) {
String errorMessage = getErrorText(errorcode);
Log.i(LOG_TAG, "FAILED " + errorMessage);
// speech.stopListening();
// listenToSpeech();
}
public String getErrorText(int errorCode) {
String message;
switch (errorCode) {
case SpeechRecognizer.ERROR_AUDIO:
message = "Audio recording error";
break;
case SpeechRecognizer.ERROR_CLIENT:
message = "Client side error";
break;
case SpeechRecognizer.ERROR_INSUFFICIENT_PERMISSIONS:
message = "Insufficient permissions";
break;
case SpeechRecognizer.ERROR_NETWORK:
message = "Network error";
break;
case SpeechRecognizer.ERROR_NETWORK_TIMEOUT:
message = "Network timeout";
break;
case SpeechRecognizer.ERROR_NO_MATCH:
message = "No match";
break;
case SpeechRecognizer.ERROR_RECOGNIZER_BUSY:
message = "RecognitionService busy";
break;
case SpeechRecognizer.ERROR_SERVER:
message = "error from server";
break;
case SpeechRecognizer.ERROR_SPEECH_TIMEOUT:
message = "No speech input";
//speech.stopListening();
break;
default:
message = "Didn't understand, please try again.";
break;
}
return message;
}
@Override
public void onPartialResults(Bundle bundle) {
}
@Override
public void onEvent(int i, Bundle bundle) {
}
}
I am writing an Android app to print text on a Bluetooth thermal printer.
Here is the complete code.
The app works fine in debug mode, but when I generate a signed APK and install it on the device, it does not respond at all.
I have tried different solutions suggested on Stack Overflow, but none of them worked.
This is my main activity
import android.app.Activity;
import android.bluetooth.BluetoothAdapter;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.os.AsyncTask;
import android.os.Handler;
import android.os.Message;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.Toast;
import com.lvrenyang.io.IOCallBack;
import java.lang.ref.WeakReference;
public class MainActivity extends AppCompatActivity {
private Handler mHandler; // Our main handler that will receive callback notifications
// #defines for identifying shared types between calling functions
private final static int REQUEST_ENABLE_BT = 1; // used to identify adding bluetooth names
private static String TAG = "MAIN_ACTIVITY";
private Activity activity;
private Button btnConnect;
private String name = "MTP-II";
private String mac_address = "02:15:44:31:49:05";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
//Get the activity
this.activity = this;
//Button from the XML view
btnConnect = findViewById(R.id.btnConnect);
//Start the Init Work Service Async task
new InitWorkService().execute();
//Set onClickListener for test print button
btnConnect.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
try {
//Check if name and address are set
if (name != "null" && mac_address != "null" && mac_address.contains(":")) {
if (!WorkService.workThread.isConnected()) {
WorkService.workThread.connectBt(mac_address);
//Sleep for 3 seconds
try {
Thread.sleep(3000);
} catch (Exception e) {
}
}
//Check if connected
if (WorkService.workThread.isConnected()) {
//Collect data in background Thread
new PrintData().execute();
} else {
Toast.makeText(activity, Global.toast_notconnect, Toast.LENGTH_SHORT).show();
}
} else {
Toast.makeText(activity, "Please setup printer first!", Toast.LENGTH_LONG).show();
}
}
catch(Exception e){
Log.e(TAG, e.getMessage(), e.fillInStackTrace());}
}
});
}
/**
* Background Async Task
* */
private class InitWorkService extends AsyncTask<String, String, String> {
@Override
protected void onPreExecute() {
super.onPreExecute();
}
protected String doInBackground(String... args){
try{
WorkService.cb = new IOCallBack() {
public void OnOpen() {
if (null != mHandler) {
Message msg = mHandler.obtainMessage(Global.MSG_IO_ONOPEN);
mHandler.sendMessage(msg);
}
}
public void OnClose() {
if (null != mHandler) {
Message msg = mHandler.obtainMessage(Global.MSG_IO_ONCLOSE);
mHandler.sendMessage(msg);
}
}
};
}
catch(Exception e){
Log.e(TAG, e.getMessage(), e.fillInStackTrace());
}
return null;
}
protected void onPostExecute(String file_url){
try {
mHandler = new MHandler(MainActivity.this);
WorkService.addHandler(mHandler);
if (null == WorkService.workThread) {
Intent intent = new Intent(activity, WorkService.class);
startService(intent);
}
}
catch (Exception e) {
Log.e(TAG, e.getMessage(), e.fillInStackTrace());
Toast.makeText(activity, "Unable to initiate the WorkService!", Toast.LENGTH_LONG).show();
}
}
}
/**
* Background Async Task
* */
class PrintData extends AsyncTask<String, String, String> {
@Override
protected void onPreExecute() {
super.onPreExecute();
}
protected String doInBackground(String... args){
try{
int nTextAlign=1;
String text = "Test message!\r\n\r\n\r\n";
String encoding = "UTF-8";
byte[] hdrBytes = {0x1c, 0x26, 0x1b, 0x39, 0x01};
Bundle dataAlign = new Bundle();
Bundle dataTextOut = new Bundle();
Bundle dataHdr = new Bundle();
dataHdr.putByteArray(Global.BYTESPARA1, hdrBytes);
dataHdr.putInt(Global.INTPARA1, 0);
dataHdr.putInt(Global.INTPARA2, hdrBytes.length);
dataAlign.putInt(Global.INTPARA1, nTextAlign);
dataTextOut.putString(Global.STRPARA1, text);
dataTextOut.putString(Global.STRPARA2, encoding);
WorkService.workThread.handleCmd(Global.CMD_POS_WRITE,dataHdr);
WorkService.workThread.handleCmd(Global.CMD_POS_SALIGN,dataAlign);
WorkService.workThread.handleCmd(Global.CMD_POS_STEXTOUT,dataTextOut);
}catch(Exception e){
Log.e(TAG, e.getMessage(), e.fillInStackTrace());
}
return null;
}
protected void onPostExecute(String file_url){}
}
@Override
protected void onUserLeaveHint()
{
Log.d("onUserLeaveHint","Home button pressed");
super.onUserLeaveHint();
//Unregister bluetooth receiver
try{unregisterReceiver(bluetoothReceiver);}catch(Exception e){}
//Disconnect bt connection
try{WorkService.workThread.disconnectBt();}catch(Exception e){}
// remove the handler
try{WorkService.delHandler(mHandler);}catch(Exception e){}
mHandler = null;
}
/**
* Broadcast receiver for bluetooth state changes
*/
private final BroadcastReceiver bluetoothReceiver = new BroadcastReceiver() {
@Override
public void onReceive(Context context, Intent intent)
{
final String action = intent.getAction();
if (action.equals(BluetoothAdapter.ACTION_STATE_CHANGED))
{
final int state = intent.getIntExtra(BluetoothAdapter.EXTRA_STATE,BluetoothAdapter.ERROR);
switch (state)
{
case BluetoothAdapter.STATE_OFF:
// closeConnection();//Close on going connection and disable button
Intent enableBtIntent = new Intent(BluetoothAdapter.ACTION_REQUEST_ENABLE);
startActivityForResult(enableBtIntent, REQUEST_ENABLE_BT);
break;
case BluetoothAdapter.STATE_ON:
break;
}
}
}
};
private static class MHandler extends Handler {
WeakReference<MainActivity> mActivity;
MHandler(MainActivity activity) {
mActivity = new WeakReference<>(activity);
}
@Override
public void handleMessage(Message msg) {
MainActivity theActivity = mActivity.get();
switch (msg.what) {
case Global.CMD_POS_STEXTOUTRESULT:
case Global.CMD_POS_WRITERESULT: {
int result = msg.arg1;
Toast.makeText(
theActivity,
(result == 1) ? Global.toast_success
: Global.toast_fail, Toast.LENGTH_SHORT).show();
Log.v(TAG, "Result: " + result);
break;
}
}
}
}
}
Does your app manifest declare permissions for bluetooth to be used?
https://developer.android.com/guide/topics/connectivity/bluetooth#Permissions
In order to use Bluetooth features in your application, you must declare two permissions. The first of these is BLUETOOTH. You need this permission to perform any Bluetooth communication, such as requesting a connection, accepting a connection, and transferring data.
The other permission that you must declare is either ACCESS_COARSE_LOCATION or ACCESS_FINE_LOCATION. A location permission is required because Bluetooth scans can be used to gather information about the location of the user. This information may come from the user's own devices, as well as Bluetooth beacons in use at locations such as shops and transit facilities.
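As a sketch, the corresponding declarations in AndroidManifest.xml would be roughly:
<!-- Sketch of the manifest entries described above -->
<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />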
If it only happens when you sign the APK, it looks like you have to update your ProGuard rules to exclude the printer library classes or similar.
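For example, something along these lines in proguard-rules.pro; the com.lvrenyang package is assumed from the import in your activity:
# Sketch: keep the printer SDK classes so the release build does not strip or rename them
-keep class com.lvrenyang.** { *; }
-dontwarn com.lvrenyang.**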
One reason for the application not responding is that you block the main thread for 3 seconds with the Thread.sleep(3000) call in the button's click listener.
Replace the onCreate() method with the code below
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
//Get the activity
this.activity = this;
//Button from the XML view
btnConnect = findViewById(R.id.btnConnect);
//Start the Init Work Service Async task
new InitWorkService().execute();
final ExecutorService es = Executors.newFixedThreadPool(1);
//Set onClickListener for test print button
btnConnect.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
btnConnect.setEnabled(false);
es.submit(new Runnable() {
@Override
public void run() {
connect();
}
});
}
});
}
private void connect() {
try {
//Check if name and address are set
if (name != null && mac_address != null && mac_address.contains(":")) {
if (!WorkService.workThread.isConnected()) {
WorkService.workThread.connectBt(mac_address);
//Sleep for 3 seconds
try {
Thread.sleep(3000);
} catch (Exception e) {
}
}
//Check if connected
if (WorkService.workThread.isConnected()) {
//Collect data in background Thread
new PrintData().execute();
} else {
new Handler(Looper.getMainLooper()).post(new Runnable() {
@Override
public void run() {
Toast.makeText(activity, Global.toast_notconnect, Toast.LENGTH_SHORT).show();
}
});
}
} else {
new Handler(Looper.getMainLooper()).post(new Runnable() {
@Override
public void run() {
Toast.makeText(activity, "Please setup printer first!", Toast.LENGTH_LONG).show();
}
});
}
} catch (Exception e) {
Log.e(TAG, e.getMessage(), e.fillInStackTrace());
}
new Handler(Looper.getMainLooper()).post(new Runnable() {
@Override
public void run() {
btnConnect.setEnabled(true);
}
});
}
Now the connection part executes on a new thread and only the UI operations run on the main one.
Please note that this code is not the optimal solution because it does not take the activity lifecycle into account. If the activity is re-created while the thread sleeps, a reference to the old activity is still kept. But it should be a starting point for you.
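One simple way to soften that, as a sketch: let the posted UI runnables skip their work when the activity is finishing or already destroyed (isDestroyed() needs API 17+), for example:
// Sketch: do not touch views of an activity that is going away
new Handler(Looper.getMainLooper()).post(new Runnable() {
    @Override
    public void run() {
        if (!isFinishing() && !isDestroyed()) {
            btnConnect.setEnabled(true);
        }
    }
});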
I want to get the audio from a Bluetooth headset and play it back on the Bluetooth headset itself. I am able to do that on Lollipop (5.1.1) (Samsung Note 3 Neo), but it is not working on Android 7.0 (Redmi Note 4).
I first create an AudioTrack and then start a new thread that reads audio from the mic. Initially it reads audio from the phone mic; after the Bluetooth button is clicked, it starts Bluetooth SCO.
Can anyone help?
package surya.com.audiorecord;
import android.Manifest;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.content.pm.PackageManager;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;
import android.os.Build;
import android.os.Bundle;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.Toast;
import java.nio.ByteBuffer;
import java.util.concurrent.atomic.AtomicBoolean;
/**
* Sample that demonstrates how to record from a Bluetooth HFP microphone using {@link AudioRecord}.
*/
public class BluetoothRecordActivity extends AppCompatActivity {
private static final String TAG = BluetoothRecordActivity.class.getCanonicalName();
private static final int SAMPLING_RATE_IN_HZ = 16000;
private static final int CHANNEL_CONFIG = AudioFormat.CHANNEL_IN_MONO;
private static final int AUDIO_FORMAT = AudioFormat.ENCODING_PCM_16BIT;
/**
* Factor by which the minimum buffer size is multiplied. The bigger the factor is, the less
* likely it is that samples will be dropped, but more memory will be used. The minimum buffer
* size is determined by {@link AudioRecord#getMinBufferSize(int, int, int)} and depends on the
* recording settings.
*/
private static final int BUFFER_SIZE_FACTOR = 2;
/**
* Size of the buffer where the audio data is stored by Android
*/
private static final int BUFFER_SIZE = AudioRecord.getMinBufferSize(SAMPLING_RATE_IN_HZ,
CHANNEL_CONFIG, AUDIO_FORMAT) * BUFFER_SIZE_FACTOR;
/**
* Signals whether a recording is in progress (true) or not (false).
*/
private final AtomicBoolean recordingInProgress = new AtomicBoolean(false);
private AudioRecord recorder = null;
private AudioManager audioManager;
private Thread recordingThread = null;
private Button startButton;
private Button stopButton;
private Button bluetoothButton;
AudioTrack mAudioTrack;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.bluetooth);
audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
try {
outputBufferSize = AudioTrack.getMinBufferSize(16000,
AudioFormat.CHANNEL_IN_STEREO,
AudioFormat.ENCODING_PCM_16BIT);
mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 16000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, outputBufferSize, AudioTrack.MODE_STREAM);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
mAudioTrack.setVolume(100);
}
mAudioTrack.play();
} catch (Exception e) {
e.printStackTrace();
}
startButton = (Button) findViewById(R.id.btnStart);
startButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
startRecording();
}
});
stopButton = (Button) findViewById(R.id.btnStop);
stopButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
stopRecording();
}
});
bluetoothButton = (Button) findViewById(R.id.btnBluetooth);
bluetoothButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
activateBluetoothSco();
}
});
requestAudioPermissions();
}
int outputBufferSize;
@Override
protected void onResume() {
super.onResume();
ButtonEnableSetters();
registerReceiver(bluetoothStateReceiver, new IntentFilter(
AudioManager.ACTION_SCO_AUDIO_STATE_UPDATED));
}
private void ButtonEnableSetters() {
runOnUiThread(new Runnable() {
@Override
public void run() {
bluetoothButton.setEnabled(calculateBluetoothButtonState());
startButton.setEnabled(calculateStartRecordButtonState());
stopButton.setEnabled(calculateStopRecordButtonState());
}
});
}
@Override
protected void onPause() {
super.onPause();
stopRecording();
unregisterReceiver(bluetoothStateReceiver);
}
private void startRecording() {
// Depending on the device one might has to change the AudioSource, e.g. to DEFAULT
// or VOICE_COMMUNICATION
recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
SAMPLING_RATE_IN_HZ, CHANNEL_CONFIG, AUDIO_FORMAT, BUFFER_SIZE);
recorder.startRecording();
recordingInProgress.set(true);
try {
recordingThread = new Thread(new RecordingRunnable(), "Recording Thread");
recordingThread.start();
} catch (Exception e) {
e.printStackTrace();
}
ButtonEnableSetters();
}
private void stopRecording() {
if (null == recorder) {
return;
}
recordingInProgress.set(false);
recorder.stop();
recorder.release();
recorder = null;
recordingThread = null;
ButtonEnableSetters();
}
private void activateBluetoothSco() {
if (!audioManager.isBluetoothScoAvailableOffCall()) {
Log.e(TAG, "SCO ist not available, recording is not possible");
return;
}
if (!audioManager.isBluetoothScoOn()) {
audioManager.startBluetoothSco();
audioManager.setBluetoothScoOn(true);
}
}
private void bluetoothStateChanged(BluetoothState state) {
Log.i(TAG, "Bluetooth state changed to:" + state);
if (BluetoothState.UNAVAILABLE == state && recordingInProgress.get()) {
stopRecording();
}
ButtonEnableSetters();
}
private boolean calculateBluetoothButtonState() {
return !audioManager.isBluetoothScoOn();
}
private boolean calculateStartRecordButtonState() {
return audioManager.isBluetoothScoOn() && !recordingInProgress.get();
}
private boolean calculateStopRecordButtonState() {
return audioManager.isBluetoothScoOn() && recordingInProgress.get();
}
private class RecordingRunnable implements Runnable {
@Override
public void run() {
if (mAudioTrack != null) {
if (mAudioTrack.getPlayState() != AudioTrack.PLAYSTATE_PLAYING) {
mAudioTrack.play();
} else {
mAudioTrack.stop();
mAudioTrack.flush();
mAudioTrack.play();
}
}
// final File file = new File(Environment.getExternalStorageDirectory(), "recording.pcm");
final ByteBuffer buffer = ByteBuffer.allocateDirect(BUFFER_SIZE);
while (recordingInProgress.get()) {
int result = recorder.read(buffer, BUFFER_SIZE);
if (result < 0) {
throw new RuntimeException("Reading of audio buffer failed: " + result);
}
// Play what was just read back out on the current audio route
mAudioTrack.write(buffer, result, AudioTrack.WRITE_BLOCKING);
buffer.clear();
}
}
}
private static final int REQUEST_RECORD_AUDIO_PERMISSION = 1;
private void requestAudioPermissions() {
if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
!= PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this,
new String[]{Manifest.permission.RECORD_AUDIO},
REQUEST_RECORD_AUDIO_PERMISSION);
}
}
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
switch (requestCode) {
case REQUEST_RECORD_AUDIO_PERMISSION: {
// If request is cancelled, the result arrays are empty.
if (grantResults.length > 0
&& grantResults[0] == PackageManager.PERMISSION_GRANTED) {
// permission was granted, yay!
// recordAudio();
activateBluetoothSco();
startRecording();
} else {
// permission denied, boo! Disable the
// functionality that depends on this permission.
Toast.makeText(this, "Permissions Denied to record audio", Toast.LENGTH_LONG).show();
}
return;
}
}
}
private final BroadcastReceiver bluetoothStateReceiver = new BroadcastReceiver() {
private BluetoothState bluetoothState = BluetoothState.UNAVAILABLE;
@Override
public void onReceive(Context context, Intent intent) {
int state = intent.getIntExtra(AudioManager.EXTRA_SCO_AUDIO_STATE, -1);
switch (state) {
case AudioManager.SCO_AUDIO_STATE_CONNECTED:
Log.i(TAG, "Bluetooth HFP Headset is connected");
handleBluetoothStateChange(BluetoothState.AVAILABLE);
break;
case AudioManager.SCO_AUDIO_STATE_CONNECTING:
Log.i(TAG, "Bluetooth HFP Headset is connecting");
handleBluetoothStateChange(BluetoothState.UNAVAILABLE);
case AudioManager.SCO_AUDIO_STATE_DISCONNECTED:
Log.i(TAG, "Bluetooth HFP Headset is disconnected");
handleBluetoothStateChange(BluetoothState.UNAVAILABLE);
break;
case AudioManager.SCO_AUDIO_STATE_ERROR:
Log.i(TAG, "Bluetooth HFP Headset is in error state");
handleBluetoothStateChange(BluetoothState.UNAVAILABLE);
break;
}
}
private void handleBluetoothStateChange(BluetoothState state) {
if (bluetoothState == state) {
return;
}
bluetoothState = state;
bluetoothStateChanged(state);
}
};
}
This is the project source code
https://bitbucket.org/surya945/audiorecord
Welcome to Stack Overflow.
I think your issue is related to the targetSdkVersion in your build.gradle (Module: app).
Check this.
I am trying to save an ArrayList into an XML file in internal storage using XmlSerializer. The app scans for BLE devices, and when a Stop button is pressed it should save some parameters from the list of BLE devices it has detected. It runs fine, and when I press the Stop button it stops the scan and shows a toast saying the XML file has been created successfully. However, when I search for the XML file, it doesn't appear anywhere, as if it hadn't been created. I don't know the reason for this problem. Here is the code I have implemented:
package com.example.newblescan;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import org.xmlpull.v1.XmlSerializer;
import com.example.newblescan.R;
import android.app.Activity;
import android.app.ListActivity;
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.bluetooth.BluetoothManager;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.os.Bundle;
import android.text.format.DateFormat;
import android.util.Xml;
import android.view.Menu;
import android.view.MenuItem;
import android.view.View;
import android.widget.ListView;
import android.widget.Toast;
import com.example.newblescan.adapter.BleDevicesAdapter;
/**
* Activity for scanning and displaying available Bluetooth LE devices.
*/
public class DeviceScanActivity extends ListActivity {
private static final int REQUEST_ENABLE_BT = 1;
private static final long SCAN_PERIOD = 100;
private BleDevicesAdapter leDeviceListAdapter;
private BluetoothAdapter bluetoothAdapter;
private Scanner scanner;
private Save save;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getActionBar().setTitle(R.string.title_devices);
// Use this check to determine whether BLE is supported on the device. Then you can
// selectively disable BLE-related features.
if (!getPackageManager().hasSystemFeature(PackageManager.FEATURE_BLUETOOTH_LE)) {
Toast.makeText(this, R.string.ble_not_supported, Toast.LENGTH_SHORT).show();
finish();
return;
}
// Initializes a Bluetooth adapter. For API level 18 and above, get a reference to
// BluetoothAdapter through BluetoothManager.
final BluetoothManager bluetoothManager =
(BluetoothManager) getSystemService(Context.BLUETOOTH_SERVICE);
bluetoothAdapter = bluetoothManager.getAdapter();
// Checks if Bluetooth is supported on the device.
if (bluetoothAdapter == null) {
Toast.makeText(this, R.string.error_bluetooth_not_supported, Toast.LENGTH_SHORT).show();
finish();
return;
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
getMenuInflater().inflate(R.menu.gatt_scan, menu);
if (scanner == null || !scanner.isScanning()) {
menu.findItem(R.id.menu_stop).setVisible(false);
menu.findItem(R.id.menu_scan).setVisible(true);
menu.findItem(R.id.menu_refresh).setActionView(null);
} else {
menu.findItem(R.id.menu_stop).setVisible(true);
menu.findItem(R.id.menu_scan).setVisible(false);
menu.findItem(R.id.menu_refresh).setActionView(
R.layout.actionbar_indeterminate_progress);
}
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
switch (item.getItemId()) {
case R.id.menu_scan:
leDeviceListAdapter.clear();
if (scanner == null) {
scanner = new Scanner(bluetoothAdapter, mLeScanCallback);
scanner.startScanning();
invalidateOptionsMenu();
}
break;
case R.id.menu_stop:
if (scanner != null) {
save = new Save(leDeviceListAdapter);
scanner.stopScanning();
try {
save.savedata();
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
scanner = null;
invalidateOptionsMenu();
}
break;
}
return true;
}
@Override
protected void onResume() {
super.onResume();
// Ensures Bluetooth is enabled on the device. If Bluetooth is not currently enabled,
// fire an intent to display a dialog asking the user to grant permission to enable it.
if (!bluetoothAdapter.isEnabled()) {
final Intent enableBtIntent = new Intent(BluetoothAdapter.ACTION_REQUEST_ENABLE);
startActivityForResult(enableBtIntent, REQUEST_ENABLE_BT);
return;
}
init();
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
// User chose not to enable Bluetooth.
if (requestCode == REQUEST_ENABLE_BT) {
if (resultCode == Activity.RESULT_CANCELED) {
finish();
} else {
init();
}
}
super.onActivityResult(requestCode, resultCode, data);
}
@Override
protected void onPause() {
super.onPause();
if (scanner != null) {
scanner.stopScanning();
scanner = null;
}
}
@Override
protected void onListItemClick(ListView l, View v, int position, long id) {
final BluetoothDevice device = leDeviceListAdapter.getDevice(position);
if (device == null)
return;
//final Intent intent = new Intent(this, DeviceServicesActivity.class);
//intent.putExtra(DeviceServicesActivity.EXTRAS_DEVICE_NAME, device.getName());
//intent.putExtra(DeviceServicesActivity.EXTRAS_DEVICE_ADDRESS, device.getAddress());
//startActivity(intent);
}
private void init() {
if (leDeviceListAdapter == null) {
leDeviceListAdapter = new BleDevicesAdapter(getBaseContext());
setListAdapter(leDeviceListAdapter);
}
if (scanner == null) {
scanner = new Scanner(bluetoothAdapter, mLeScanCallback);
scanner.startScanning();
}
invalidateOptionsMenu();
}
// Device scan callback.
private BluetoothAdapter.LeScanCallback mLeScanCallback =
new BluetoothAdapter.LeScanCallback() {
@Override
public void onLeScan(final BluetoothDevice device, final int rssi, byte[] scanRecord) {
runOnUiThread(new Runnable() {
@Override
public void run() {
leDeviceListAdapter.addDevice(device, rssi);
leDeviceListAdapter.notifyDataSetChanged();
}
});
}
};
private static class Scanner extends Thread {
private final BluetoothAdapter bluetoothAdapter;
private final BluetoothAdapter.LeScanCallback mLeScanCallback;
private volatile boolean isScanning = false;
Scanner(BluetoothAdapter adapter, BluetoothAdapter.LeScanCallback callback) {
bluetoothAdapter = adapter;
mLeScanCallback = callback;
}
public boolean isScanning() {
return isScanning;
}
public void startScanning() {
synchronized (this) {
isScanning = true;
start();
}
}
public void stopScanning() {
synchronized (this) {
isScanning = false;
bluetoothAdapter.stopLeScan(mLeScanCallback);
}
}
@Override
public void run() {
try {
while (true) {
synchronized (this) {
if (!isScanning)
break;
bluetoothAdapter.startLeScan(mLeScanCallback);
}
sleep(SCAN_PERIOD);
synchronized (this) {
bluetoothAdapter.stopLeScan(mLeScanCallback);
}
}
} catch (InterruptedException ignore) {
} finally {
bluetoothAdapter.stopLeScan(mLeScanCallback);
}
}
}
public class Save {
/**
*
*/
private BleDevicesAdapter leDeviceListAdapter;
Save(BleDevicesAdapter BLEList) {
leDeviceListAdapter = BLEList;
}
public void savedata() throws FileNotFoundException{
String filename = "Dragon.txt";
//String date = (DateFormat.format("dd-MM-yyyy hh:mm:ss", new java.util.Date()).toString());
FileOutputStream fos = null;
int size = leDeviceListAdapter.getCount();
//Bundle extras = getIntent().getExtras();
//long timestamp = extras.getLong("currentTime");
try {
fos= openFileOutput(filename, Context.MODE_PRIVATE);
XmlSerializer serializer = Xml.newSerializer();
serializer.setOutput(fos, "UTF-8");
serializer.startDocument(null, Boolean.valueOf(true));
serializer.setFeature("http://xmlpull.org/v1/doc/features.html#indent-output", true);
serializer.startTag("", "root");
//serializer.startTag("", "timestamp");
//serializer.text(date);
//serializer.endTag("", "timestamp");
for(int j = 0 ; j < size ; j++)
{
BluetoothDevice devices = leDeviceListAdapter.getDevice(j);
//Integer signal = leDeviceListAdapter.getRSSI(j);
serializer.startTag("", "name");
serializer.text(devices.getName());
serializer.endTag("", "name");
serializer.startTag("", "address");
serializer.text(devices.getAddress());
serializer.endTag("", "address");
//serializer.startTag(null, "rssi");
//serializer.setProperty(null, signal);
//serializer.endTag(null, "rssi");
}
//ObjectOutputStream out = new ObjectOutputStream(fos);
//out.write((int) timestamp);
//out.writeObject(leDeviceListAdapter);
//out.close();
serializer.endDocument();
serializer.flush();
fos.close();
Toast.makeText(DeviceScanActivity.this, R.string.list_saved, Toast.LENGTH_SHORT).show();
} catch (FileNotFoundException e){
e.printStackTrace();
} catch (IOException e){
e.printStackTrace();
}
}
}
}
In the Save class, I implement the savedata() method to save the list into an XML file.
Does anyone know what the problem is? Please help!
openFileOutput puts the file in the /data/data/your_packagename/files directory. That directory can only be read by root and by your own app, so you will not be able to find the file with a file explorer or any other program unless your device is rooted. If you want to be able to see the file, you need to write it somewhere world accessible, such as the SD card.
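For example, a minimal sketch of writing the same document to app-specific external storage instead, which a normal file explorer can see; run it from your activity inside a try/catch for IOException, and "Dragon.xml" is just an example file name:
// Sketch: getExternalFilesDir(null) maps to .../Android/data/<your package>/files on shared storage
File outFile = new File(getExternalFilesDir(null), "Dragon.xml");
FileOutputStream fos = new FileOutputStream(outFile);
XmlSerializer serializer = Xml.newSerializer();
serializer.setOutput(fos, "UTF-8");
serializer.startDocument("UTF-8", true);
serializer.startTag("", "root");
// ... write the same name/address tags as in savedata() ...
serializer.endTag("", "root");
serializer.endDocument();
fos.close();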
I am trying to record an outgoing call using the mic. I have written this code, but it is not working. I tested the code for a simple audio recording and it works fine. I am not sure when to start the MediaRecorder; I put start() in the broadcast receiver, so maybe the problem is there.
Here, AudioRecorder is another class I have created where I have implemented MediaRecorder.
public void onReceive(Context context, Intent intent) {
// TODO Auto-generated method stub
audrec = new AudioRecorder("newcall");
this.context = context;
if (intent.getAction().equalsIgnoreCase(Intent.ACTION_DIAL))
{
try {
audrec.start();
recordstarted = 1;
telManager= (TelephonyManager)context.getSystemService(Context.TELEPHONY_SERVICE);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
final PhoneStateListener phoneListener = new PhoneStateListener()
{
@Override
public void onCallStateChanged(final int state, final String incomingNumber)
{
getTelephonyOverview(telManager);
}
};
telManager.listen(phoneListener, PhoneStateListener.LISTEN_CALL_STATE);
}
public void getTelephonyOverview(final TelephonyManager telManager)
{
int callState = telManager.getCallState();
switch (callState)
{
case TelephonyManager.CALL_STATE_IDLE:
{
if (recordstarted==1)
{
try {
audrec.stop();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
recordstarted =0;
}
break;
}
case TelephonyManager.CALL_STATE_OFFHOOK:
{
try {
audrec.start();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
recordstarted =1;
break;
}
case TelephonyManager.CALL_STATE_RINGING:
{
try {
audrec.start();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
recordstarted =1;
break;
}
}
}
Here is another piece of code that I am trying out; it creates a .3gp file, but the file does not play.
import android.app.Activity;
import android.os.Bundle;
import android.view.View;
import android.widget.Button;
public class androidrec extends Activity
{
Button btn_start;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState); setContentView(R.layout.main);
btn_start = (Button) findViewById(R.id.btn_start);
UpdateRecorderState();
btn_start.setOnClickListener(new View.OnClickListener()
{
public void onClick(View v)
{
// Toast.makeText(getBaseContext(),"Please enter both phone number and message.",
// Toast.LENGTH_SHORT).show();
if(!SharedData._Started) { StartServicesAtStartUp.Start_CallRec(getBaseContext()); }
else { StartServicesAtStartUp.Stop_CallRec(getBaseContext()); }
UpdateRecorderState();
}
});
}
private void UpdateRecorderState()
{
if(SharedData._Started)
{btn_start.setText("Stop Recording");}
else
{btn_start.setText("Start Recording");}
}
}
import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
import android.media.MediaRecorder;
import android.telephony.PhoneStateListener;
import android.telephony.TelephonyManager;
import android.util.Log;
//import com.lumitrend.netlogger.Logger;
public class CallStateListener extends PhoneStateListener {
public void onCallStateChanged(int state, String incomingNumber)
{
super.onCallStateChanged(state, incomingNumber);
switch(state)
{
case TelephonyManager.CALL_STATE_IDLE:
if(SharedData._Recording)
{ Recorders_Stop(); }
break;
case TelephonyManager.CALL_STATE_RINGING:
break;
case TelephonyManager.CALL_STATE_OFFHOOK:
String CallDate = SanityDate();
String CallNum = SanityNum(incomingNumber);
String RootDir = SharedData._Path ;
String CallDir = SharedData._Path + CallNum + "/" ;
String CallFile = SharedData._Path + CallNum + "/" + CallNum + CallDate ;
if(!SharedData._Recording)
{
SharedData._Recording = true;
String med_state = android.os.Environment.getExternalStorageState();
if(!med_state.equals(android.os.Environment.MEDIA_MOUNTED))
{ break; }
File directory = null;
directory = new File(RootDir + "text.txt" ).getParentFile();
if (!directory.exists() && !directory.mkdirs())
{ break; }
directory = new File(CallDir + "text.txt" ).getParentFile();
if (!directory.exists() && !directory.mkdirs())
{ break; }
Recoders_Init(CallFile);
Recorder_Prepare();
}
Log.v("DEBUG", TelephonyManager.CALL_STATE_OFFHOOK + " ITS.CallRecorder - Recording Started " + state);
break;
}
}
private String SanityDate() {
SimpleDateFormat formatter = new SimpleDateFormat("yyMMdd-HHmmss");
Date currentTime_1 = new Date();
return formatter.format(currentTime_1);
}
private void Recorders_Stop() {
try {
SharedData._recorder.stop(); SharedData._recorder.reset();
//SharedData._recorder_down.stop(); SharedData._recorder_down.reset();
//SharedData._recorder_up.stop(); SharedData._recorder_up.reset();
}
catch (IllegalStateException e) {}
SharedData._Recording = false;
}
private void Recorder_Prepare() {
try {
SharedData._recorder.prepare(); SharedData._recorder.start();
//SharedData._recorder_down.prepare(); SharedData._recorder_down.start();
//SharedData._recorder_up.prepare(); SharedData._recorder_up.start();
}
catch (IllegalStateException e) { e.printStackTrace(); }
catch (IOException e) { e.printStackTrace(); }
}
private void Recoders_Init(String path) {
String _ext = ".3gp";
int out_format = MediaRecorder.OutputFormat.THREE_GPP;
SharedData._recorder.setAudioSource(SharedData._Rec_Type);
SharedData._recorder.setOutputFormat(out_format);
SharedData._recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
SharedData._recorder.setOutputFile(path + "both" + _ext);
/*
SharedData._recorder_down.setAudioSource(MediaRecorder.AudioSource.VOICE_DOWNLINK);
SharedData._recorder_down.setOutputFormat(out_format);
SharedData._recorder_down.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
SharedData._recorder_down.setOutputFile(path + "-down" + _ext);
SharedData._recorder_up.setAudioSource(MediaRecorder.AudioSource.VOICE_UPLINK);
SharedData._recorder_up.setOutputFormat(out_format);
SharedData._recorder_up.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
SharedData._recorder_up.setOutputFile(path + "-up" + _ext);
*/
}
private String SanityNum(String numstr)
{
String out = "";
for(char ch : numstr.toCharArray())
{
switch(ch)
{
case ' ':
break;
case '~':
break;
case '!':
break;
case '@':
break;
case '#':
break;
case '$':
break;
case '%':
break;
case '^':
break;
case '&':
break;
case '*':
break;
case '(':
break;
case ')':
break;
case '-':
break;
case '_':
break;
case '=':
break;
case '|':
break;
default:
out = out + ch;
}
}
return out;
}
}
import android.media.MediaRecorder;
final public class SharedData
{
static int _Rec_Type = android.media.MediaRecorder.AudioSource.VOICE_CALL;
static String _Path = android.os.Environment.getExternalStorageDirectory().getAbsolutePath() + "/ITS-CallRecorder/";
static boolean _Started = false;
static boolean _AutoStart = true;
static boolean _Recording = false;
static MediaRecorder _recorder = new MediaRecorder();
static MediaRecorder _recorder_down = new MediaRecorder();
static MediaRecorder _recorder_up = new MediaRecorder();
SharedData() { }
}
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.telephony.PhoneStateListener;
import android.telephony.TelephonyManager;
import android.util.Log;
import android.widget.Toast;
public class StartServicesAtStartUp extends BroadcastReceiver
{
public static Intent phoneStateListener;
public void onReceive(Context context, Intent intent)
{
Log.d("DEBUG", "com.its.CallRecorder Initiated ...");
Start_CallRec(context);
}
public static void Start_CallRec(Context context)
{
if(!SharedData._Started )
{
if(SharedData._AutoStart)
{
phoneStateListener = new Intent(context, CallStateListener.class);
phoneStateListener.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
context.startService(phoneStateListener);
Log.d("DEBUG", "com.its.CallRecorder Call Recorder Started ...");
TelephonyManager tManager = (TelephonyManager) context.getSystemService(Context.TELEPHONY_SERVICE);
CallStateListener callStateListener = new CallStateListener();
tManager.listen(callStateListener,PhoneStateListener.LISTEN_CALL_STATE);
SharedData._Started = true;
Toast.makeText(context," Call Recording Started ... ", Toast.LENGTH_SHORT).show();
}
}
else
{
Toast.makeText(context," Call Recording Already Active.. ", Toast.LENGTH_SHORT).show();
}
}
public static void Stop_CallRec(Context context)
{
if(SharedData._Started )
{
context.stopService(phoneStateListener);
Toast.makeText(context," Call Recording Stopped ... ", Toast.LENGTH_SHORT).show();
SharedData._Started = false;
}
else
{
Toast.makeText(context," Call Recording Already Stopped ... ", Toast.LENGTH_SHORT).show();
}
}
}
You cannot record calls because the firmware doesn't support it. There is a better answer at xda-developers, which I got from Android's open issues list:
The voice streams are handled by the baseband processor; it's that the baseband firmware isn't set up to offer the streams to the application processor that limits the ability to truly record a call. The Android system has long had the API implemented, but it can do nothing in this case. Since the baseband firmware is closed source and available in binary only, I doubt the brilliant hackers here can do anything about this.
Someone found a "cure" for the HD2, just by editing the registry -
xda-developers.com/windows-mobile/two-way-in-call-recording-on-hd2-fixed/wo-way-in-call-recording-on-hd2-fixed/
Call recording is not yet possible on Android. See this feature request.
You can record your own voice from the microphone, but you cannot record the sound of the other party. If you only want to record your voice, use android.media.MediaRecorder.AudioSource.MIC.
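For example, a minimal sketch of recording only your own side from the microphone, assuming a Context named context and with error handling omitted:
// Sketch: record the local microphone (your own voice) to a .3gp file
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(new File(context.getExternalFilesDir(null), "mic_only.3gp").getAbsolutePath()); // placeholder location
recorder.prepare(); // throws IOException
recorder.start();
// ... later, when the call ends ...
recorder.stop();
recorder.release();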