Is it possible to control playback of the Spotify app from within another Android app? I'm only looking for track skipping functionality (forward and backward).
I'm aware of the Spotify Android SDK, but it seems to only allow skipping of tracks played by the SDK:
com.spotify.sdk.android.playback.NativeSpotifyException: Failed SpPlaybackSkipToPrev with error code 14 (The operation is not supported if the device is not the active playback device)
To clarify: both the actual Spotify app and my own app are running on the same device.
Here's how to do it:
This will try to play/pause Spotify. If it's not running, it will be started and told to start playing.
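The snippets below assume these members exist in the enclosing class (the tag value is just a placeholder):
private static final String DEBUG_TAG = "SpotifyControl";  // any log tag
private AudioManager mAudioManager;   // from getSystemService(Context.AUDIO_SERVICE)
private Timer mMusicPlayerStartTimer;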
public void nextSong() {
int keyCode = KeyEvent.KEYCODE_MEDIA_NEXT;
if (!isSpotifyRunning()) {
startMusicPlayer();
}
Intent intent = new Intent(Intent.ACTION_MEDIA_BUTTON);
intent.setPackage("com.spotify.music");
synchronized (this) {
intent.putExtra(Intent.EXTRA_KEY_EVENT, new KeyEvent(KeyEvent.ACTION_DOWN, keyCode));
getContext().sendOrderedBroadcast(intent, null);
intent.putExtra(Intent.EXTRA_KEY_EVENT, new KeyEvent(KeyEvent.ACTION_UP, keyCode));
getContext().sendOrderedBroadcast(intent, null);
}
}
public void playPauseMusic() {
int keyCode = KeyEvent.KEYCODE_MEDIA_PLAY_PAUSE;
if (!mAudioManager.isMusicActive() && !isSpotifyRunning()) {
startMusicPlayer();
}
Intent i = new Intent(Intent.ACTION_MEDIA_BUTTON);
i.setPackage("com.spotify.music");
synchronized (this) {
i.putExtra(Intent.EXTRA_KEY_EVENT, new KeyEvent(KeyEvent.ACTION_DOWN, keyCode));
getContext().sendOrderedBroadcast(i, null);
i.putExtra(Intent.EXTRA_KEY_EVENT, new KeyEvent(KeyEvent.ACTION_UP, keyCode));
getContext().sendOrderedBroadcast(i, null);
}
}
private void startMusicPlayer() {
Intent startPlayer = new Intent(Intent.ACTION_MAIN);
startPlayer.setPackage("com.spotify.music");
startPlayer.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
getContext().startActivity(startPlayer);
if (mMusicPlayerStartTimer != null) {
mMusicPlayerStartTimer.cancel();
}
mMusicPlayerStartTimer = new Timer("MusicPlayerStartTimer", true);
mMusicPlayerStartTimer.schedule(new MusicPlayerStartTimerTask(), DateUtils.SECOND_IN_MILLIS, DateUtils.SECOND_IN_MILLIS);
}
private boolean isSpotifyRunning() {
Process ps = null;
try {
String[] cmd = {
"sh",
"-c",
"ps | grep com.spotify.music"
};
ps = Runtime.getRuntime().exec(cmd);
ps.waitFor();
return ps.exitValue() == 0;
} catch (IOException e) {
Log.e(DEBUG_TAG, "Could not execute ps", e);
} catch (InterruptedException e) {
Log.e(DEBUG_TAG, "Could not execute ps", e);
} finally {
if (ps != null) {
ps.destroy();
}
}
return false;
}
private class MusicPlayerStartTimerTask extends TimerTask {
@Override
public void run() {
if (isSpotifyRunning()) {
playPauseMusic();
cancel();
}
}
}
EDIT: Added full example code
Yes, you can control playback using the RemoteController classes or, on Lollipop and later, the MediaController classes; if you also need to support pre-Lollipop devices, use the MediaControllerCompat classes.
Then call dispatchMediaButtonEvent() with KEYCODE_MEDIA_NEXT.
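For reference, a minimal sketch of the MediaController path (it assumes the app declares a NotificationListenerService, here hypothetically called MediaNotificationListener, and that the user has granted it notification access; otherwise getActiveSessions() throws a SecurityException):
MediaSessionManager msm =
        (MediaSessionManager) context.getSystemService(Context.MEDIA_SESSION_SERVICE);
List<MediaController> controllers = msm.getActiveSessions(
        new ComponentName(context, MediaNotificationListener.class));
for (MediaController controller : controllers) {
    if ("com.spotify.music".equals(controller.getPackageName())) {
        // Send a down/up pair for the "next track" media key to Spotify's session.
        controller.dispatchMediaButtonEvent(
                new KeyEvent(KeyEvent.ACTION_DOWN, KeyEvent.KEYCODE_MEDIA_NEXT));
        controller.dispatchMediaButtonEvent(
                new KeyEvent(KeyEvent.ACTION_UP, KeyEvent.KEYCODE_MEDIA_NEXT));
        break;
    }
}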
Quick answer - No, this isn't possible.
Related
I have a MIDI file, and I play it using MediaPlayer on Android with the following code:
val mMediaPlayer = MediaPlayer.create(context, R.raw.test_ring_1)
mMediaPlayer?.start()
By default it plays with a single instrument (piano). Now I want to load a SoundFont (sf2/sf3) file so that the MIDI notes are played with different instruments and with reverberation effects.
Please suggest a way to achieve the expected result.
Two libraries can be used together to play a MIDI file with a SoundFont.
Midi Driver
A synthesizer for playing MIDI notes on Android. You can use it together with a USB/Bluetooth-MIDI library to create your MIDI application.
SoundFont2 files are supported.
Android MIDI Library
This library provides an interface to read, manipulate, and write MIDI files. "Playback" is supported as a real-time event dispatch system. This library does NOT include actual audio playback or device interfacing.
To initialize the SF2 soundbank:
SF2Soundbank sf = new SF2Soundbank(getAssets().open("test.sf2"));
synth = new SoftSynthesizer();
synth.open();
synth.loadAllInstruments(sf);
synth.getChannels()[0].programChange(0);
synth.getChannels()[1].programChange(1);
recv = synth.getReceiver();
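The snippets use two fields of the enclosing class (the types come from the javax.sound.midi port bundled with SherlockMidi):
private SoftSynthesizer synth;
private Receiver recv;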
To play the MIDI notes from a MIDI file:
MidiFile midiFile = new MidiFile(getAssets().open("test.mid"));
// Create a new MidiProcessor:
MidiProcessor processor = new MidiProcessor(midiFile);
// listen for all midi events:
processor.registerEventListener(new MidiEventListener() {
@Override
public void onStart(boolean fromBeginning) {
}
@Override
public void onEvent(MidiEvent event, long ms) {
if (event.getClass() == NoteOn.class) {
NoteOn noteOn = ((NoteOn) event);
try {
ShortMessage msg = new ShortMessage();
msg.setMessage(ShortMessage.NOTE_ON, channel, noteOn.getNoteValue(), noteOn.getVelocity());
recv.send(msg, ms);
} catch (InvalidMidiDataException e) {
e.printStackTrace();
}
} else if (event.getClass() == NoteOff.class) {
NoteOff noteOff = ((NoteOff) event);
try {
ShortMessage msg = new ShortMessage();
msg.setMessage(ShortMessage.NOTE_OFF, channel, noteOff.getNoteValue(), noteOff.getVelocity());
recv.send(msg, ms);
} catch (InvalidMidiDataException e) {
e.printStackTrace();
}
}
}
@Override
public void onStop(boolean finished) {
}
}, MidiEvent.class);
// Start the processor:
processor.start();
Variable used to select the SoundFont channel:
private int channel = 0;
I have tested this and it is working:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
try {
SF2Soundbank sf = new SF2Soundbank(getAssets().open("SmallTimGM6mb.sf2"));
synth = new SoftSynthesizer();
synth.open();
synth.loadAllInstruments(sf);
synth.getChannels()[0].programChange(0);
synth.getChannels()[1].programChange(1);
recv = synth.getReceiver();
} catch (IOException e) {
e.printStackTrace();
} catch (MidiUnavailableException e) {
e.printStackTrace();
}
this.findViewById(R.id.piano).setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
int action = MotionEventCompat.getActionMasked(event);
if (action == MotionEvent.ACTION_DOWN) {
try {
ShortMessage msg = new ShortMessage();
msg.setMessage(ShortMessage.NOTE_ON, 0, 60, 127);
recv.send(msg, -1);
} catch (InvalidMidiDataException e) {
e.printStackTrace();
}
} else if (action == MotionEvent.ACTION_UP || action == MotionEvent.ACTION_CANCEL) {
try {
ShortMessage msg = new ShortMessage();
msg.setMessage(ShortMessage.NOTE_OFF, 0, 60, 127);
recv.send(msg, -1);
} catch (InvalidMidiDataException e) {
e.printStackTrace();
}
}
return true;
}
});
this.findViewById(R.id.woodblock).setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
int action = MotionEventCompat.getActionMasked(event);
if (action == MotionEvent.ACTION_DOWN) {
try {
ShortMessage msg = new ShortMessage();
msg.setMessage(ShortMessage.NOTE_ON, 1, 60, 127);
recv.send(msg, -1);
} catch (InvalidMidiDataException e) {
e.printStackTrace();
}
} else if (action == MotionEvent.ACTION_UP || action == MotionEvent.ACTION_CANCEL) {
try {
ShortMessage msg = new ShortMessage();
msg.setMessage(ShortMessage.NOTE_OFF, 1, 60, 127);
recv.send(msg, -1);
} catch (InvalidMidiDataException e) {
e.printStackTrace();
}
}
return true;
}
});
}
Don't forget to include the SherlockMidi library from the repository below; a sample is also available there.
https://github.com/agangzz/SherlockMidi
I am using the example code from IBM's GitHub for Speech to Text, but this line is giving me problems. Android Studio shows an error saying that the "capture" argument is not needed, but when I remove it, I get a runtime error saying the audio cannot be null.
speechService.recognizeUsingWebSocket(capture, getRecognizeOptions(), new MicrophoneRecognizeDelegate());
It is used in this part:
private void recordMessage() {
//mic.setEnabled(false);
speechService = new SpeechToText();
speechService.setUsernameAndPassword(STT_username, STT_password);
speechService.setEndPoint("https://stream.watsonplatform.net/speech-to-text/api");
if(listening != true) {
capture = microphoneHelper.getInputStream(true);
InputStream myInputStream = new MicrophoneInputStream(true);
new Thread(new Runnable() {
@Override public void run() {
try {
speechService.recognizeUsingWebSocket(capture, getRecognizeOptions(), new MicrophoneRecognizeDelegate());
} catch (Exception e) {
showError(e);
}
}
}).start();
listening = true;
Toast.makeText(MainActivity.this,"Listening....Click to Stop", Toast.LENGTH_LONG).show();
} else {
try {
microphoneHelper.closeInputStream();
listening = false;
Toast.makeText(MainActivity.this,"Stopped Listening....Click to Start", Toast.LENGTH_LONG).show();
} catch (Exception e) {
e.printStackTrace();
}
}
}
This is a very late answer, but in case anyone needs it, update your call to:
speechService.recognizeUsingWebSocket(getRecognizeOptions(capture), new MicrophoneRecognizeDelegate());
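For completeness: with newer versions of the Watson SDK the audio stream is passed through the options object rather than as a separate argument. A hedged sketch of what getRecognizeOptions(capture) might look like (the content type matches microphoneHelper.getInputStream(true), i.e. Opus-encoded audio; adjust for your setup):
private RecognizeOptions getRecognizeOptions(InputStream audio) {
    return new RecognizeOptions.Builder()
            .audio(audio)                               // the microphone stream
            .contentType(ContentType.OPUS.toString())   // assumption: Opus-encoded input
            .interimResults(true)
            .build();
}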
I want to somehow imitate the recent-apps hardware button press, which opens the native Android recent apps screen.
Currently I do the following:
boolean success = showRecents1(c);
if (!success) {
showRecents2(c);
}
This works on a lot of devices, but it does not work on the Android Oreo emulator. Does anyone know a solution that also works on Android Oreo?
private boolean showRecents1(Context c) {
try {
Intent intent = new Intent("com.android.systemui.recent.action.TOGGLE_RECENTS");
intent.setComponent(new ComponentName("com.android.systemui", "com.android.systemui.recent.RecentsActivity"));
c.startActivity(intent);
return true;
} catch (Exception e) {
L.e(e);
}
return false;
}
private boolean showRecents2(Context c) {
try {
Class serviceManagerClass = Class.forName("android.os.ServiceManager");
Method getService = serviceManagerClass.getMethod("getService", String.class);
IBinder retbinder = (IBinder) getService.invoke(serviceManagerClass, "statusbar");
Class statusBarClass = Class.forName(retbinder.getInterfaceDescriptor());
Object statusBarObject = statusBarClass.getClasses()[0].getMethod("asInterface", IBinder.class).invoke(null, new Object[]{retbinder});
Method clearAll = statusBarClass.getMethod("toggleRecentApps");
clearAll.setAccessible(true);
clearAll.invoke(statusBarObject);
return true;
} catch (Exception e) {
L.e(e);
}
return false;
}
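One approach that still works on Oreo is an AccessibilityService that calls performGlobalAction(GLOBAL_ACTION_RECENTS); the trade-off is that the user must explicitly enable the service under accessibility settings. A minimal sketch (the service name and its manifest/XML configuration are assumptions):
public class RecentsToggleService extends AccessibilityService {
    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Not needed for this use case.
    }

    @Override
    public void onInterrupt() {
    }

    // Call this (e.g. through a static reference to the running service) to open recents.
    public void showRecents() {
        performGlobalAction(GLOBAL_ACTION_RECENTS);
    }
}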
I am developing an app in which we need to use the headphone jack as a button only.
Requirement: play the default audio (calls) through the earpiece when a headset is connected (no audio through the headphones).
There are many examples of routing audio through the speaker, headphones, and Bluetooth headsets, but nothing about routing audio through the device's earpiece while a headset is connected.
I have tried a lot; some of the links are:
Android : Force audio routing (not working in my scenario)
I have also checked the SoundAbout app (https://play.google.com/store/apps/details?id=com.woodslink.android.wiredheadphoneroutingfix&hl=en), and it routes audio to the various outputs such as headset, speaker, and earpiece.
I have managed to get audio to the speaker while a headset is connected. Here is my code:
if (Build.VERSION.SDK_INT >= 21) {
ForegroundService.audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
ForegroundService.audioManager.setSpeakerphoneOn(true);
SplashScreen.preferences.edit().putBoolean("isKey", true).commit();
} else {
Class audioSystemClass = null;
try {
audioSystemClass = Class.forName("android.media.AudioSystem");
Method setForceUse = audioSystemClass.getMethod("setForceUse", int.class, int.class);
setForceUse.invoke(null, FOR_MEDIA, FORCE_SPEAKER);
} catch (ClassNotFoundException e) {
e.printStackTrace();
} catch (NoSuchMethodException e) {
e.printStackTrace();
} catch (IllegalAccessException e) {
e.printStackTrace();
} catch (InvocationTargetException e) {
e.printStackTrace();
}
SplashScreen.preferences.edit().putBoolean("isKey", true).commit();
ForegroundService.audioManager.setSpeakerphoneOn(true);
}
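(For reference, FOR_MEDIA and FORCE_SPEAKER in the reflection branch are hidden android.media.AudioSystem constants; in AOSP sources both happen to be 1, so they are typically redeclared locally, e.g.:)
// Values copied from the hidden android.media.AudioSystem class (AOSP); not public API.
private static final int FOR_MEDIA = 1;
private static final int FORCE_SPEAKER = 1;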
The earpiece is never used for media in Android, and it can only be used if the phone is in "call" or "communication" (VoIP) state.
I guess you have noticed that there is no "FORCE_EARPIECE" constant, so it can't be specified in a call to setForceUse.
Also, the earpiece has the lowest priority in output device selection for calls, so if the phone has anything connected to it (and in your case there is a fake headset), that device will be selected (see https://android.googlesource.com/platform/frameworks/av/+/322b4d2/services/audiopolicy/enginedefault/src/Engine.cpp#381).
Sorry, it doesn't seem to be possible to achieve what you intend.
UPDATE
After examining media.audio_policy state while SoundAbout is enforcing the use of earpiece for media, I have discovered the following tricks that this app uses:
It calls AudioSystem.setPhoneState(MODE_IN_COMMUNICATION) for enforcing "communication" phone state (usually used for VoIP calls).
If a headset (or headphones) is connected, in order to prevent the sound to be routed to it due to higher priority, the app calls AudioSystem.setDeviceConnectionState(DEVICE_OUT_WIRED_HEADSET, DEVICE_STATE_UNAVAILABLE, ...) to trick Audio Manager to believe that there is no headset.
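For illustration only, a hedged reflection sketch of those two hidden calls (android.media.AudioSystem is not a public API; the signatures and constant values below are assumptions based on pre-Marshmallow AOSP sources and may differ per release or device):
try {
    Class<?> audioSystem = Class.forName("android.media.AudioSystem");

    // Force the "communication" phone state (AudioManager.MODE_IN_COMMUNICATION == 3).
    Method setPhoneState = audioSystem.getMethod("setPhoneState", int.class);
    setPhoneState.invoke(null, AudioManager.MODE_IN_COMMUNICATION);

    // Report the wired headset as unavailable so it loses the routing priority.
    // DEVICE_OUT_WIRED_HEADSET == 0x4, DEVICE_STATE_UNAVAILABLE == 0 in AOSP sources.
    Method setDeviceConnectionState = audioSystem.getMethod(
            "setDeviceConnectionState", int.class, int.class, String.class);
    setDeviceConnectionState.invoke(null, 0x4, 0, "");
} catch (Exception e) {
    e.printStackTrace();
}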
These are all hacks; they require the app to monitor the phone state closely, and they don't work all the time.
Another drawback is that using the earpiece disables on-chip audio decompression and thus increases battery use.
In general, I wouldn't recommend using these techniques.
After researching a lot, I found that there is no way to achieve this functionality without using reflection.
First the headset jack needs to be plugged in; then call setWiredDeviceConnectionState() with suitable parameters, after which the system behaves as if the headphones were disconnected while the headset button still works.
So it is a hack; it's not a foolproof solution, but it works for my requirement for now.
Here is my code:
private void sendIntent(Intent i) {
Method m;
Log.i(TAG, "Device sdk = " + Build.VERSION.SDK_INT);
try {
if (Build.VERSION.SDK_INT < 16) {
Class<?> clazz = Class.forName("android.app.ActivityManagerNative");
m = clazz.getMethod("broadcastStickyIntent", Intent.class, String.class);
m.setAccessible(true);
m.invoke(clazz, i, null);
return;
} else if (Build.VERSION.SDK_INT < 23) {
//int type, int state, String address, String name
m = am.getClass().getMethod("setWiredDeviceConnectionState", Integer.TYPE, Integer.TYPE, String.class);
m.setAccessible(true);
Object[] objArr = new Object[3];
objArr[0] = (i.getIntExtra("microphone", 0) == 0) ? 8 : 4;
objArr[1] = i.getIntExtra("state", 0);
objArr[2] = i.getStringExtra("name");
m.invoke(am, objArr);
} else {
//int type, int state, String address, String name
m = am.getClass().getMethod("setWiredDeviceConnectionState", Integer.TYPE, Integer.TYPE, String.class, String.class);
m.setAccessible(true);
Object[] objArr = new Object[4];
objArr[0] = (i.getIntExtra("microphone", 0) == 0) ? 8 : 4;
objArr[1] = i.getIntExtra("state", 0);
objArr[2] = i.getStringExtra("address");
objArr[3] = i.getStringExtra("name");
m.invoke(am, objArr);
}
} catch (ClassNotFoundException e) {
e.printStackTrace();
} catch (NoSuchMethodException e) {
e.printStackTrace();
} catch (InvocationTargetException e) {
e.printStackTrace();
} catch (IllegalAccessException e) {
e.printStackTrace();
}
}
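For reference, a hypothetical sketch of building the intent that sendIntent() above consumes (the extra names mirror ACTION_HEADSET_PLUG; headsetName and headsetAddress come from the detection code shown next):
// Hypothetical: report the connected headset as unplugged so media routing ignores it.
Intent fakeUnplug = new Intent(Intent.ACTION_HEADSET_PLUG);
fakeUnplug.putExtra("state", 0);                 // 0 = disconnected, 1 = connected
fakeUnplug.putExtra("microphone", 1);            // 1 = headset with a microphone
fakeUnplug.putExtra("name", headsetName);
fakeUnplug.putExtra("address", headsetAddress);  // used on Marshmallow and above
sendIntent(fakeUnplug);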
The code that gathers the extras for the intent to send (on Marshmallow and above):
@TargetApi(Build.VERSION_CODES.M)
public class HeadSetJackReciever extends AudioDeviceCallback {
public static boolean isAudioChecked;
public void onAudioDevicesAdded(AudioDeviceInfo[] addedDevices) {
if (addedDevices.length != 0) {
for (int i = 0; i < addedDevices.length; i++) {
if (addedDevices[i].getType() == AudioDeviceInfo.TYPE_WIRED_HEADSET) {
AudioDeviceInfo audioDeviceInfo = addedDevices[i];
int microphone = audioDeviceInfo.getType();
String headsetName = "DCS";
String headsetAddress = "";
try {
Method method = audioDeviceInfo.getClass().getMethod("getAddress");
method.setAccessible(true);
headsetAddress = (String) method.invoke(audioDeviceInfo);
} catch (NoSuchMethodException e) {
e.printStackTrace();
} catch (InvocationTargetException e) {
e.printStackTrace();
} catch (IllegalAccessException e) {
e.printStackTrace();
}
Log.e("TEST", "microphone:"+microphone);
Log.e("TEST", "headsetName:"+headsetName);
Log.e("TEST", "headsetAddress:"+headsetAddress );
Intent intent = new Intent(ForegroundService.context, SelectAudioOutput.class);
intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
intent.putExtra("microphone",microphone);
intent.putExtra("headsetName",headsetName);
intent.putExtra("headsetAddress",headsetAddress);
ForegroundService.context.startActivity(intent);
}
}
}
}
public void onAudioDevicesRemoved(AudioDeviceInfo[] removedDevices) {
if (removedDevices.length != 0) {
Log.e("TEST", "Audio deinserted");
if (SplashScreen.preferences.getBoolean("isKey", false)) {
Intent startIntent = new Intent(ForegroundService.context, ForegroundService.class);
startIntent.setAction(Constants.ACTION.STARTNOTIFICATION_ACTION);
ForegroundService.context.startService(startIntent);
} else {
Intent startIntent = new Intent(ForegroundService.context, ForegroundService.class);
startIntent.setAction(Constants.ACTION.STOPNOTIFICATION_ACTION);
ForegroundService.context.startService(startIntent);
}
ForegroundService.audioManager.setMode(AudioManager.MODE_IN_CALL);
ForegroundService.audioManager.setSpeakerphoneOn(false);
}
}
}
For Lollipop and lower versions:
if (intent.getAction().equals(Intent.ACTION_HEADSET_PLUG)) {
headsetName = intent.getStringExtra("name");
microphone = intent.getIntExtra("microphone", 0);
int state = intent.getIntExtra("state", -1);
switch (state) {
case 0:
Log.d("onReceive", "Headset unplugged");
Log.e("TEST", "Audio deinserted");
if (SplashScreen.preferences.getBoolean("isKey", false)) {
Intent startIntent = new Intent(ForegroundService.context, ForegroundService.class);
startIntent.setAction(Constants.ACTION.STARTNOTIFICATION_ACTION);
context.startService(startIntent);
} else {
Intent startIntent = new Intent(ForegroundService.context, ForegroundService.class);
startIntent.setAction(Constants.ACTION.STOPNOTIFICATION_ACTION);
context.startService(startIntent);
}
ForegroundService.audioManager.setMode(AudioManager.MODE_IN_CALL);
ForegroundService.audioManager.setSpeakerphoneOn(false);
break;
case 1:
Log.d("onReceive", "Headset plugged");
Log.e("TEST", "microphone:"+microphone);
Log.e("TEST", "headsetName:"+headsetName);
Log.e("TEST", "headsetAddress:"+headsetAddress );
Intent intentone = new Intent(ForegroundService.context, SelectAudioOutput.class);
intentone.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
intentone.putExtra("microphone",microphone);
intentone.putExtra("headsetName",headsetName);
intentone.putExtra("headsetAddress",headsetAddress);
context.startActivity(intentone);
break;
}
}
Let me know if I missed something.
Thanks.
How can I implement the SIP protocol in Android?
Is there any SDK or library available to implement it easily on Android?
Here is a third-party library with sample code. You can use this; I have used it and it works fine.
Android 2.3 and higher provides an API for SIP.
Refer to this link for SIP in Android.
You can also see the demo project for SIP from the SDK samples.
Update:
Android SDK samples on GitHub:
SipDemo1, SipDemo2
Search for the SipDemo project in the samples for the Android 4.0.3 SDK (API level 15).
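As a quick orientation, here is a minimal sketch of the built-in android.net.sip API used by SipDemo (it requires the android.permission.USE_SIP permission and a device whose platform supports SIP; the account details are placeholders):
try {
    SipManager sipManager = SipManager.newInstance(context);

    SipProfile.Builder builder = new SipProfile.Builder("username", "sip.example.com");
    builder.setPassword("password");
    SipProfile profile = builder.build();

    // Register the profile; incoming calls arrive via this broadcast action.
    Intent incoming = new Intent("android.SipDemo.INCOMING_CALL");
    PendingIntent pendingIntent =
            PendingIntent.getBroadcast(context, 0, incoming, Intent.FILL_IN_DATA);
    sipManager.open(profile, pendingIntent, null);

    // Place an outgoing audio call and start audio once it is established.
    sipManager.makeAudioCall(profile.getUriString(), "sip:callee@sip.example.com",
            new SipAudioCall.Listener() {
                @Override
                public void onCallEstablished(SipAudioCall call) {
                    call.startAudio();
                }
            }, 30);
} catch (Exception e) {
    e.printStackTrace();
}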
I investigated this sort of problem for a long time and found that SipManager and SipProfile are unfortunately poor and extremely buggy.
So I found the Linphone library. There is a link to their wiki. I added it to my project using Maven:
repositories {
...
maven { url "https://linphone.org/maven_repository/" }
}
There is also a sample of using it on GitLab: link here; it's pretty fresh, for now :)
In case the link breaks, here is a copy of the most important part of how to use Linphone's Core:
public class LinphoneService extends Service {
private static final String START_LINPHONE_LOGS = " ==== Device information dump ====";
// Keep a static reference to the Service so we can access it from anywhere in the app
private static LinphoneService sInstance;
private Handler mHandler;
private Timer mTimer;
private Core mCore;
private CoreListenerStub mCoreListener;
public static boolean isReady() {
return sInstance != null;
}
public static LinphoneService getInstance() {
return sInstance;
}
public static Core getCore() {
return sInstance.mCore;
}
@Nullable
@Override
public IBinder onBind(Intent intent) {
return null;
}
@Override
public void onCreate() {
super.onCreate();
// The first call to liblinphone SDK MUST BE to a Factory method
// So let's enable the library debug logs & log collection
String basePath = getFilesDir().getAbsolutePath();
Factory.instance().setLogCollectionPath(basePath);
Factory.instance().enableLogCollection(LogCollectionState.Enabled);
Factory.instance().setDebugMode(true, getString(R.string.app_name));
// Dump some useful information about the device we're running on
Log.i(START_LINPHONE_LOGS);
dumpDeviceInformation();
dumpInstalledLinphoneInformation();
mHandler = new Handler();
// This will be our main Core listener, it will change activities depending on events
mCoreListener = new CoreListenerStub() {
@Override
public void onCallStateChanged(Core core, Call call, Call.State state, String message) {
Toast.makeText(LinphoneService.this, message, Toast.LENGTH_SHORT).show();
if (state == Call.State.IncomingReceived) {
Toast.makeText(LinphoneService.this, "Incoming call received, answering it automatically", Toast.LENGTH_LONG).show();
// For this sample we will automatically answer incoming calls
CallParams params = getCore().createCallParams(call);
params.enableVideo(true);
call.acceptWithParams(params);
} else if (state == Call.State.Connected) {
// This state means the call has been established, let's start the call activity
Intent intent = new Intent(LinphoneService.this, CallActivity.class);
// As it is the Service that is starting the activity, we have to give this flag
intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
startActivity(intent);
}
}
};
try {
// Let's copy some RAW resources to the device
// The default config file must only be installed once (the first time)
copyIfNotExist(R.raw.linphonerc_default, basePath + "/.linphonerc");
// The factory config is used to override any other setting, let's copy it each time
copyFromPackage(R.raw.linphonerc_factory, "linphonerc");
} catch (IOException ioe) {
Log.e(ioe);
}
// Create the Core and add our listener
mCore = Factory.instance()
.createCore(basePath + "/.linphonerc", basePath + "/linphonerc", this);
mCore.addListener(mCoreListener);
// Core is ready to be configured
configureCore();
}
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
super.onStartCommand(intent, flags, startId);
// If our Service is already running, no need to continue
if (sInstance != null) {
return START_STICKY;
}
// Our Service has been started, we can keep our reference on it
// From now on the Launcher will be able to call onServiceReady()
sInstance = this;
// Core must be started after being created and configured
mCore.start();
// We also MUST call the iterate() method of the Core on a regular basis
TimerTask lTask =
new TimerTask() {
@Override
public void run() {
mHandler.post(
new Runnable() {
@Override
public void run() {
if (mCore != null) {
mCore.iterate();
}
}
});
}
};
mTimer = new Timer("Linphone scheduler");
mTimer.schedule(lTask, 0, 20);
return START_STICKY;
}
@Override
public void onDestroy() {
mCore.removeListener(mCoreListener);
mTimer.cancel();
mCore.stop();
// A stopped Core can be started again
// To ensure resources are freed, we must ensure it will be garbage collected
mCore = null;
// Don't forget to free the singleton as well
sInstance = null;
super.onDestroy();
}
@Override
public void onTaskRemoved(Intent rootIntent) {
// For this sample we will kill the Service at the same time we kill the app
stopSelf();
super.onTaskRemoved(rootIntent);
}
private void configureCore() {
// We will create a directory for user signed certificates if needed
String basePath = getFilesDir().getAbsolutePath();
String userCerts = basePath + "/user-certs";
File f = new File(userCerts);
if (!f.exists()) {
if (!f.mkdir()) {
Log.e(userCerts + " can't be created.");
}
}
mCore.setUserCertificatesPath(userCerts);
}
private void dumpDeviceInformation() {
StringBuilder sb = new StringBuilder();
sb.append("DEVICE=").append(Build.DEVICE).append("\n");
sb.append("MODEL=").append(Build.MODEL).append("\n");
sb.append("MANUFACTURER=").append(Build.MANUFACTURER).append("\n");
sb.append("SDK=").append(Build.VERSION.SDK_INT).append("\n");
sb.append("Supported ABIs=");
for (String abi : Version.getCpuAbis()) {
sb.append(abi).append(", ");
}
sb.append("\n");
Log.i(sb.toString());
}
private void dumpInstalledLinphoneInformation() {
PackageInfo info = null;
try {
info = getPackageManager().getPackageInfo(getPackageName(), 0);
} catch (PackageManager.NameNotFoundException nnfe) {
Log.e(nnfe);
}
if (info != null) {
Log.i(
"[Service] Linphone version is ",
info.versionName + " (" + info.versionCode + ")");
} else {
Log.i("[Service] Linphone version is unknown");
}
}
private void copyIfNotExist(int ressourceId, String target) throws IOException {
File lFileToCopy = new File(target);
if (!lFileToCopy.exists()) {
copyFromPackage(ressourceId, lFileToCopy.getName());
}
}
private void copyFromPackage(int ressourceId, String target) throws IOException {
FileOutputStream lOutputStream = openFileOutput(target, 0);
InputStream lInputStream = getResources().openRawResource(ressourceId);
int readByte;
byte[] buff = new byte[8048];
while ((readByte = lInputStream.read(buff)) != -1) {
lOutputStream.write(buff, 0, readByte);
}
lOutputStream.flush();
lOutputStream.close();
lInputStream.close();
}
}
I hope that helps somebody, because I spent a lot of time trying to find it!
I used this library:
https://www.mizu-voip.com/Software/SIPSDK/AndroidSIPSDK.aspx
It is very easy.
I also added a button to answer the call:
mysipclient.Accept(mysipclient.GetLine());