Hi, I'm currently trying to get AudioRecord to work because I need it in a bigger project, but it seems to misbehave a lot. I have tried a lot of things, so I went back to basics when tracing this bug.
I am using my Samsung Galaxy S as my debug device.
My problem: the first time after a reboot of my device, I can initialize the AudioRecord object I create without problems. But the second time I run the app, it won't initialize the AudioRecord object.
I have tried several frequencies (sample rates), FYI.
Here is my code:
package android.audiorecordtest;
import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.TextView;
public class AudioRecordTest extends Activity {
    int frequency;
    AudioRecord audRec;
    TextView txtVw;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        txtVw = (TextView) findViewById(R.id.txtVw);
        frequency = 8000;
        int bufferSize = (AudioRecord.getMinBufferSize(frequency, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT)) * 2;
        if (bufferSize > 0) {
            audRec = new AudioRecord(MediaRecorder.AudioSource.MIC, frequency, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
            int status = audRec.getState();
            if (status == AudioRecord.STATE_INITIALIZED) {
                txtVw.setText("Initialized " + frequency);
            } else {
                txtVw.setText("Not initialized, i=" + frequency);
            }
        }
    }
}
After a few hours of looking through logcat output, I found this event:
02-28 10:46:37.048: DEBUG/dalvikvm(4477): GC_EXPLICIT freed 1801 objects / 98944 bytes in 97ms
02-28 10:46:37.048: VERBOSE/AudioRecord(4477): stop
which seems to release the native hold on the AudioRecord.
So I tried overriding finalize() to call release() on my AudioRecord object. That didn't work, though. Does anyone have any idea?
I was able to reproduce your problem (on a Samsung phone). I added an onDestroy() method that releases the recorder:
@Override
public void onDestroy() {
super.onDestroy();
System.out.println("OnDestroy");
audRec.release();
}
After adding this, the AudioRecord seems to initialize correctly every time the activity is started.
I had the same problem. Usually audRec.release() does indeed help, but if you need to stop and start several times, the following code is more robust. I also had an issue where the recording ran in a separate thread, and Android sometimes kills threads that run for a long time. So take a look at this code: it makes sure the AudioRecord is kept even when the other thread is dead, and on the following start() it stops and releases the old instance before re-initializing.
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
public class RecorderSingleton {
private static final int FREQUENCY = 16000;
public static RecorderSingleton instance = new RecorderSingleton();
private AudioRecord recordInstance = null;
private int bufferSize;
private RecorderSingleton() {
bufferSize = AudioRecord.getMinBufferSize(FREQUENCY, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
}
public boolean init() {
recordInstance = new AudioRecord(MediaRecorder.AudioSource.MIC, FREQUENCY, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
if (recordInstance.getState() == AudioRecord.STATE_UNINITIALIZED) {
return false;
}
return true;
}
public int getBufferSize() {
return bufferSize;
}
public boolean start() {
if (recordInstance != null && recordInstance.getState() != AudioRecord.STATE_UNINITIALIZED) {
if (recordInstance.getRecordingState() != AudioRecord.RECORDSTATE_STOPPED) {
recordInstance.stop();
}
recordInstance.release();
}
if (!init()) {
return false;
}
recordInstance.startRecording();
return true;
}
public int read(short[] tempBuffer) {
if (recordInstance == null) {
return AudioRecord.ERROR_INVALID_OPERATION;
}
int ret = recordInstance.read(tempBuffer, 0, bufferSize);
return ret;
}
public void stop() {
if (recordInstance == null) {
return;
}
recordInstance.stop();
recordInstance.release();
}
}
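Since AudioRecord itself needs a real device, the defensive restart logic in start() can be illustrated with a plain-Java stand-in (all class names here are hypothetical, for illustration only): any previous instance is stopped and released before a new one is created, so repeated starts never leak the native resource.

```java
// Plain-Java stand-in for the AudioRecord lifecycle (hypothetical classes):
// FakeRecord mimics initialized/recording state; RobustStarter mirrors
// RecorderSingleton.start() above.
class FakeRecord {
    boolean initialized = true;
    boolean recording = false;
    void startRecording() { recording = true; }
    void stop() { recording = false; }
    void release() { initialized = false; }
}

class RobustStarter {
    FakeRecord record;

    boolean start() {
        // Stop and release the old instance first, so the "native hold"
        // is never leaked across restarts.
        if (record != null && record.initialized) {
            if (record.recording) record.stop();
            record.release();
        }
        record = new FakeRecord();
        record.startRecording();
        return record.recording;
    }
}
```

Calling start() twice in a row is safe: the second call releases the first instance before creating a new one, which is exactly the behavior that avoids the "works once per reboot" symptom.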
Then if you have a recorder thread you can use it as follows:
import android.media.AudioRecord;
import android.util.Log;
public class Recorder implements Runnable {
private int requiredSamples;
private int takenSamples = 0;
private boolean cancelled = false;
public void run() {
// We're important...
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
int bufferRead = 0;
int bufferSize = RecorderSingleton.instance.getBufferSize();
short[] tempBuffer = new short[bufferSize];
if (!RecorderSingleton.instance.start()) {
return;
}
try {
Log.d("Recorder", "Recorder started");
while (takenSamples < requiredSamples && !cancelled) {
bufferRead = RecorderSingleton.instance.read(tempBuffer);
if (bufferRead == AudioRecord.ERROR_INVALID_OPERATION) {
throw new IllegalStateException("read() returned AudioRecord.ERROR_INVALID_OPERATION");
} else if (bufferRead == AudioRecord.ERROR_BAD_VALUE) {
throw new IllegalStateException("read() returned AudioRecord.ERROR_BAD_VALUE");
}
takenSamples += bufferRead;
// do something with the samples ...
// ...
// ...
}
} finally {
// Close resources...
stop();
}
}
public void stop() {
RecorderSingleton.instance.stop();
}
public void cancel() {
cancelled = true;
}
}
To answer my own question: the only way I found to make AudioRecord usable is to never hold it in a global variable. I don't know why, but it seems it won't let you release the instance's resources correctly if you do.
You should try to call audRec.stop() to release the resource.
My AudioRecord didn't initialize because it was static.
Related
I'm working on a very small Android Project that uses this exact code from github.
However, when I (or you) intermittently button mash the start/stop button... the app eventually crashes. Unfortunately this can take a little while to reproduce... but it will happen!
Oh, I forgot the desired result!!
The desired result is that this crash does not occur. :)
Does anyone know why this crash occurs? The author of this code has had an open bug/issue for this on GitHub since March of 2013... so I'm pretty sure it's not a particularly stupid question... and if you do know the answer, you would no doubt be hailed as a boss.
I have been dissecting the code, print-debugging, and researching AsyncTask, Handlers, and AudioTrack for a couple of days now, but I can't figure it out... I will, though, if nobody else beats me to it.
This is the stack trace:
E/AndroidRuntime: FATAL EXCEPTION: AsyncTask #4
Process: com.example.boober.beatkeeper, PID: 15664
java.lang.RuntimeException: An error occurred while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:309)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:354)
at java.util.concurrent.FutureTask.setException(FutureTask.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:242)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
Caused by: java.lang.IllegalStateException: Unable to retrieve AudioTrack pointer for write()
at android.media.AudioTrack.native_write_byte(Native Method)
at android.media.AudioTrack.write(AudioTrack.java:1761)
at android.media.AudioTrack.write(AudioTrack.java:1704)
at com.example.boober.beatkeeper.AudioGenerator.writeSound(AudioGenerator.java:55)
at com.example.boober.beatkeeper.Metronome.play(Metronome.java:60)
at com.example.boober.beatkeeper.MainActivity$MetronomeAsyncTask.doInBackground(MainActivity.java:298)
at com.example.boober.beatkeeper.MainActivity$MetronomeAsyncTask.doInBackground(MainActivity.java:283)
at android.os.AsyncTask$2.call(AsyncTask.java:295)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
You could just go to GitHub and download the original code, but in order to satisfy Stack Overflow requirements, I have also provided an even-more-concise "minimal working example" which you can cut and paste into Android Studio if you prefer.
MainActivity:
import android.graphics.Color;
import android.os.AsyncTask;
import android.os.Handler;
import android.os.Message;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.TextView;
public class MainActivity extends AppCompatActivity {
String TAG = "AAA";
Button playStopButton;
TextView currentBeat;
// important objects
MetronomeAsyncTask aSync;
Handler mHandler;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
currentBeat = findViewById(R.id.currentBeatTextView);
playStopButton = findViewById(R.id.playStopButton);
// important objects
aSync = new MetronomeAsyncTask();
}
// only called from within playStopPressed()
private void stopPressed() {
aSync.stop();
aSync = new MetronomeAsyncTask();
}
// only called from within playStopPressed()
private void playPressed() {
//aSync.execute();
aSync.executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR, (Void[])null);
}
public synchronized void playStopButtonPressed(View v) {
boolean wasPlayingWhenPressed = playStopButton.isSelected();
playStopButton.setSelected(!playStopButton.isSelected());
if (wasPlayingWhenPressed) {
stopPressed();
} else {
playPressed();
}
}
// METRONOME BRAIN STUFF ------------------------------------------
private Handler getHandler() {
return new Handler() {
@Override
public void handleMessage(Message msg) {
String message = (String) msg.obj;
if (message.equals("1")) {
currentBeat.setTextColor(Color.GREEN);
}
else {
currentBeat.setTextColor(Color.BLUE);
}
currentBeat.setText(message);
}
};
}
private class MetronomeAsyncTask extends AsyncTask<Void, Void, String> {
MetronomeBrain metronome;
MetronomeAsyncTask() {
mHandler = getHandler();
metronome = new MetronomeBrain(mHandler);
Runtime.getRuntime().gc(); // <---- don't know if this line is necessary or not.
}
protected String doInBackground(Void... params) {
metronome.setBeat(4);
metronome.setNoteValue(4);
metronome.setBpm(100);
metronome.setBeatSound(2440);
metronome.setSound(6440);
metronome.play();
return null;
}
public void stop() {
metronome.stop();
metronome = null;
}
public void setBpm(short bpm) {
metronome.setBpm(bpm);
metronome.calcSilence();
}
public void setBeat(short beat) {
if (metronome != null)
metronome.setBeat(beat);
}
}
}
MetronomeBrain:
import android.os.Handler;
import android.os.Message;
public class MetronomeBrain {
private double bpm;
private int beat;
private int noteValue;
private int silence;
private double beatSound;
private double sound;
private final int tick = 1000; // samples of tick
private boolean play = true;
private AudioGenerator audioGenerator = new AudioGenerator(8000);
private Handler mHandler;
private double[] soundTickArray;
private double[] soundTockArray;
private double[] silenceSoundArray;
private Message msg;
private int currentBeat = 1;
public MetronomeBrain(Handler handler) {
audioGenerator.createPlayer();
this.mHandler = handler;
}
public void calcSilence() {
silence = (int) (((60 / bpm) * 8000) - tick);
soundTickArray = new double[this.tick];
soundTockArray = new double[this.tick];
silenceSoundArray = new double[this.silence];
msg = new Message();
msg.obj = "" + currentBeat;
double[] tick = audioGenerator.getSineWave(this.tick, 8000, beatSound);
double[] tock = audioGenerator.getSineWave(this.tick, 8000, sound);
for (int i = 0; i < this.tick; i++) {
soundTickArray[i] = tick[i];
soundTockArray[i] = tock[i];
}
for (int i = 0; i < silence; i++)
silenceSoundArray[i] = 0;
}
public void play() {
calcSilence();
do {
msg = new Message();
msg.obj = "" + currentBeat;
if (currentBeat == 1)
audioGenerator.writeSound(soundTockArray);
else
audioGenerator.writeSound(soundTickArray);
if (bpm <= 120)
mHandler.sendMessage(msg);
audioGenerator.writeSound(silenceSoundArray);
if (bpm > 120)
mHandler.sendMessage(msg);
currentBeat++;
if (currentBeat > beat)
currentBeat = 1;
} while (play);
}
public void stop() {
play = false;
audioGenerator.destroyAudioTrack();
}
public double getBpm() {
return bpm;
}
public void setBpm(int bpm) {
this.bpm = bpm;
}
public int getNoteValue() {
return noteValue;
}
public void setNoteValue(int bpmetre) {
this.noteValue = bpmetre;
}
public int getBeat() {
return beat;
}
public void setBeat(int beat) {
this.beat = beat;
}
public double getBeatSound() {
return beatSound;
}
public void setBeatSound(double sound1) {
this.beatSound = sound1;
}
public double getSound() {
return sound;
}
public void setSound(double sound2) {
this.sound = sound2;
}
}
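The timing math in calcSilence() can be checked standalone: at 8000 Hz, one beat lasts 60/bpm seconds, of which 1000 samples are the tick tone and the remainder is silence. A minimal sketch (the helper class name is hypothetical):

```java
// Hypothetical helper mirroring MetronomeBrain.calcSilence():
// silence = (60 / bpm) * sampleRate - tickSamples
class SilenceMath {
    static final int SAMPLE_RATE = 8000;  // matches new AudioGenerator(8000)
    static final int TICK_SAMPLES = 1000; // matches the `tick` field

    // Samples of silence between the end of one tick and the next beat.
    static int silenceSamples(double bpm) {
        return (int) (((60 / bpm) * SAMPLE_RATE) - TICK_SAMPLES);
    }
}
```

At 100 bpm a beat is 4800 samples, so 3800 samples of silence follow each 1000-sample tick; at 120 bpm that drops to 3000.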
AudioGenerator:
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
public class AudioGenerator {
private int sampleRate;
private AudioTrack audioTrack;
public AudioGenerator(int sampleRate) {
this.sampleRate = sampleRate;
}
public double[] getSineWave(int samples,int sampleRate,double frequencyOfTone){
double[] sample = new double[samples];
for (int i = 0; i < samples; i++) {
sample[i] = Math.sin(2 * Math.PI * i / (sampleRate/frequencyOfTone));
}
return sample;
}
public byte[] get16BitPcm(double[] samples) {
byte[] generatedSound = new byte[2 * samples.length];
int index = 0;
for (double sample : samples) {
// scale to maximum amplitude
short maxSample = (short) ((sample * Short.MAX_VALUE));
// in 16 bit wav PCM, first byte is the low order byte
generatedSound[index++] = (byte) (maxSample & 0x00ff);
generatedSound[index++] = (byte) ((maxSample & 0xff00) >>> 8);
}
return generatedSound;
}
public void createPlayer(){
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
sampleRate, AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT, sampleRate,
AudioTrack.MODE_STREAM);
audioTrack.play();
}
public void writeSound(double[] samples) {
byte[] generatedSnd = get16BitPcm(samples);
audioTrack.write(generatedSnd, 0, generatedSnd.length);
}
public void destroyAudioTrack() {
audioTrack.stop();
// This line seems to be a most likely culprit of the start/stop crash.
// Is this line even necessary?
audioTrack.release();
}
}
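The two pure functions in AudioGenerator (sine synthesis and 16-bit little-endian PCM packing) have no Android dependencies, so they can be verified in plain Java. This sketch (hypothetical class name) copies their logic verbatim:

```java
// Hypothetical plain-Java copies of AudioGenerator's pure functions.
class PcmMath {
    // Mirrors getSineWave(): one sine cycle every sampleRate/frequency samples.
    static double[] sineWave(int samples, int sampleRate, double frequency) {
        double[] out = new double[samples];
        for (int i = 0; i < samples; i++) {
            out[i] = Math.sin(2 * Math.PI * i / (sampleRate / frequency));
        }
        return out;
    }

    // Mirrors get16BitPcm(): scale [-1, 1] doubles to 16-bit PCM,
    // low-order byte first (little-endian).
    static byte[] toPcm16(double[] samples) {
        byte[] out = new byte[2 * samples.length];
        int index = 0;
        for (double sample : samples) {
            short s = (short) (sample * Short.MAX_VALUE);
            out[index++] = (byte) (s & 0x00ff);          // low byte first
            out[index++] = (byte) ((s & 0xff00) >>> 8);  // then high byte
        }
        return out;
    }
}
```

A full-scale sample of 1.0 becomes 32767 (0x7FFF), stored as the byte pair 0xFF, 0x7F.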
Layout:
<?xml version="1.0" encoding="utf-8"?>
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context="com.example.boober.android_metronome.MainActivity">
<Button
android:id="@+id/playStopButton"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginBottom="8dp"
android:layout_marginEnd="8dp"
android:layout_marginStart="8dp"
android:layout_marginTop="8dp"
android:onClick="playStopButtonPressed"
android:text="Play"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<TextView
android:id="@+id/currentBeatTextView"
android:layout_width="100dp"
android:layout_height="50dp"
android:layout_marginEnd="8dp"
android:layout_marginStart="8dp"
android:layout_marginTop="32dp"
android:text="TextView"
android:gravity="center_vertical"
android:textAlignment="center"
android:textSize="30sp"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toBottomOf="@+id/playStopButton" />
</android.support.constraint.ConstraintLayout>
After thinking about dmarin's comment and reading the code, I arrive at the conclusion that dmarin answered your question. It's a race condition, and also an access to an object that is not initialized. So the short solution is: the code needs to check whether the accessed data is initialized. The AudioTrack object can be checked for null, or for whether getState() equals AudioTrack.STATE_INITIALIZED. Unfortunately, that check alone did not make the problem disappear with my setup (Android Studio 3.1.2, Android SDK Build-Tools 28-rc2).
private boolean isInitialized() {
return audioTrack.getState() == AudioTrack.STATE_INITIALIZED;
}
A code analysis shows repeated creation of AsyncTasks and AudioTracks. To minimize those, create the AsyncTask only once, in onCreate(), and make the AudioTrack object static.
MainActivity
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
currentBeat = findViewById(R.id.currentBeatTextView);
playStopButton = findViewById(R.id.playStopButton);
// important objects
aSync = new MetronomeAsyncTask();
aSync.executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR, (Void[])null);
}
AudioGenerator
public class AudioGenerator {
/*changed to static*/
private static AudioTrack audioTrack;
...
}
I admit just changing it to static is not a beautiful solution. But since I only need one pipe to the AudioService, this will do.
Creating the audio pipe, stopping playback, and freeing the resource will then look like this:
public void createPlayer(){
if (audioTrack == null || ! isInitialized())
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
sampleRate, AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT, sampleRate,
AudioTrack.MODE_STREAM);
if (isInitialized()){
audioTrack.play();
}
}
public void destroyAudioTrack() {
if (isInitialized()) {
audioTrack.stop();
}
}
public void stopRelease() {
if (isInitialized()) {
audioTrack.stop();
audioTrack.release();
}
}
I repurposed the boolean play. Also, the beat counter currentBeat is reset when the play button is pressed. To access these from the MainActivity, they must be changed from private to public, which is admittedly not the best solution.
// only called from within playStopPressed()
private void stopPressed() {
aSync.metronome.play = false;
}
// only called from within playStopPressed()
private void playPressed() {
aSync.metronome.play = true;
aSync.metronome.currentBeat = 1;
}
In play() of MetronomeBrain, the loop becomes an endless loop; it is only terminated externally, at the end of the activity lifecycle. That is why the play boolean can be repurposed: whether a tone is actually written now depends on play.
public void play() {
calcSilence();
/*a change for the do-while loop: It runs forever and needs
to be killed externally of the loop.
Also the play decides, if audio is being played.*/
do {
msg = new Message();
msg.obj = "" + currentBeat;
if (currentBeat == 1 && play)
audioGenerator.writeSound(soundTockArray);
else if (play)
audioGenerator.writeSound(soundTickArray);
if (bpm <= 120)
mHandler.sendMessage(msg);
audioGenerator.writeSound(silenceSoundArray);
if (bpm > 120)
mHandler.sendMessage(msg);
currentBeat++;
if (currentBeat > beat)
currentBeat = 1;
} while (true);
}
Now the loop runs forever, but it only plays audio while play is set to true. If cleanup is necessary, it can be done at the end of the activity lifecycle, like this in the MainActivity:
@Override
protected void onDestroy() {
aSync.metronome.stopReleaseAudio(); //calls the stopRelease()
aSync.cancel(true);
super.onDestroy();
}
As I stated, the code could be further improved, but it gives a fair hint and enough material to think and learn about AsyncTasks, services such as the audio service, and activity lifecycles.
References
- https://developer.android.com/reference/android/os/AsyncTask
- https://developer.android.com/reference/android/media/AudioManager
- https://developer.android.com/reference/android/media/AudioTrack
- https://developer.android.com/reference/android/app/Activity#activity-lifecycle
TL;DR: Make sure the objects are initialized before accessing them, create everything only once, and destroy objects when you no longer need them, e.g. at the end of the activity.
I want to run my service in the background forever, but it stops after some time. I checked every solution on YouTube and the Internet but didn't find the answer. I tried everything, like using START_STICKY in onStartCommand() or using the onTaskRemoved() method, but it did not work. Any help would be appreciated.
This is my TheService class code:
package apphub.secretapp;
import android.app.Service;
import android.content.Intent;
import android.media.MediaPlayer;
import android.media.MediaRecorder;
import android.os.Build;
import android.os.Environment;
import android.os.IBinder;
import android.os.SystemClock;
import android.widget.Toast;
import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.Random;
/**
* Created by as on 12/24/2017.
*/
public class TheService extends Service implements
MediaRecorder.OnInfoListener {
String AudioSavePathInDevice = null;
MediaRecorder mediaRecorder ;
Random random ;
String RandomAudioFileName = "ABCDEFGHIJKLMNOP";
public static final int RequestPermissionCode = 1;
MediaPlayer mediaPlayer ;
private MediaRecorder mRecorder;
private long mStartTime;
//setting maximum file size to be recorded
private long Audio_MAX_FILE_SIZE = 1000000;//1Mb
private int[] amplitudes = new int[100];
private int i = 0;
private File mOutputFile;
@Override
public void onCreate() {
super.onCreate();
}
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
super.onStartCommand(intent,flags,startId);
Toast.makeText(this, "Service Started", Toast.LENGTH_SHORT).show();
return START_STICKY;
}
private void startRecording() {
mRecorder = new MediaRecorder();
mRecorder.setOnInfoListener(this);
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setMaxFileSize(Audio_MAX_FILE_SIZE);
mRecorder.setOutputFormat
(MediaRecorder.OutputFormat.MPEG_4);
Toast.makeText(this, "Recording started", Toast.LENGTH_SHORT).show();
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN)
{
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.HE_AAC);
mRecorder.setAudioEncodingBitRate(48000);
} else {
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
mRecorder.setAudioEncodingBitRate(64000);
}
mRecorder.setAudioSamplingRate(16000);
mOutputFile = getOutputFile();
mOutputFile.getParentFile().mkdirs();
mRecorder.setOutputFile(mOutputFile.getAbsolutePath());
try {
mRecorder.prepare();
mRecorder.start();
mStartTime = SystemClock.elapsedRealtime();
} catch (IOException e) {
// prepare()/start() failed; the error is currently swallowed silently
}
}
protected void stopRecording(boolean saveFile) {
mRecorder.stop();
mRecorder.release();
mRecorder = null;
mStartTime = 0;
if (!saveFile && mOutputFile != null) {
mOutputFile.delete();
}
// to stop the service by itself
}
private File getOutputFile() {
SimpleDateFormat dateFormat = new SimpleDateFormat
("yyyyMMdd_HHmmssSSS", Locale.US);
return new File(Environment.getExternalStorageDirectory().getAbsolutePath().toString()
+ "/Voice Recorder/RECORDING_"
+ dateFormat.format(new Date())
+ ".m4a");
}
@Override
public IBinder onBind(Intent intent) {
return null;
}
@Override
public void onTaskRemoved(Intent rootIntent) {
Intent i =new Intent(getApplicationContext(),this.getClass());
i.setPackage(getPackageName());
startService(i);
super.onTaskRemoved(rootIntent);
}
@Override
public void onInfo(MediaRecorder mr, int what, int extra) {
if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_FILESIZE_REACHED) {
getOutputFile();
startRecording();
}
}
@Override
public void onStart(Intent intent, int startId) {
super.onStart(intent, startId);
startRecording();
}
}
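The timestamped filename built by getOutputFile() can be previewed without a device. This sketch (hypothetical class and method names) reproduces only the name part, leaving the external-storage directory aside:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.TimeZone;

// Hypothetical preview of the filename from getOutputFile():
// "RECORDING_" + yyyyMMdd_HHmmssSSS + ".m4a"
class FileNamePreview {
    static String recordingName(Date when, TimeZone tz) {
        SimpleDateFormat dateFormat =
                new SimpleDateFormat("yyyyMMdd_HHmmssSSS", Locale.US);
        dateFormat.setTimeZone(tz); // pinned here only to make output deterministic
        return "RECORDING_" + dateFormat.format(when) + ".m4a";
    }
}
```

For the Unix epoch in UTC this produces RECORDING_19700101_000000000.m4a, so each recording segment gets a millisecond-unique name.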
The simple answer is: you can't! Android is an OS created for mobile devices, which are small, battery-operated computers with constrained memory. With that in mind, the OS will kill your service whenever it needs memory.
Furthermore, on recent versions of the OS (especially Nougat and Oreo), these limitations are imposed more heavily to give extra battery life to users.
Any tricks, hacks, and workarounds you find online are just that: tricks and hacks. They might work under certain conditions or on certain devices for a little while, but you still won't have your service running forever, especially not on the latest Android versions.
The best you can do to have your Service run for as long as possible is two things:
Return START_STICKY (like you're already doing). This indicates to the OS that you would like your Service to run for as long as possible, but there are zero guarantees that it will.
Use a foreground service. Call startForeground(int, Notification) with a notification to show in the device's notification panel. This brings your process to a foreground state and allows it to stay alive a bit longer, but again, no guarantees. P.S.: Remember to remove the notification in your service's onDestroy().
My understanding is that the existing screen-off and screen-on intents do not exactly mean that the device is asleep or awake, respectively. If any application on the device holds a partial wake lock, the device will not go into deep sleep, even though the screen may be off or on.
Are there any intents for listening to CPU "wake up" and "sleep" events?
Is there any way to know the CPU has woken up from deep sleep?
I needed a tool to do exactly this when troubleshooting some timing behavior on my app in the background. So I made my own class to do it. See code below. Here's how you use it:
CpuSleepDetector.getInstance().setSleepEndNotifier(new CpuSleepDetector.SleepEndNotifier() {
@Override
public void cpuSleepEnded(long sleepDurationMillis) {
Log.d(TAG, "The CPU just exited sleep. It was sleeping for "+sleepDurationMillis+" ms.");
}
});
CpuSleepDetector.getInstance().logDump();
The logDump method will dump the last 100 sleep events to LogCat. This is useful in troubleshooting because, to get the CPU to sleep, I had to not only disconnect the USB cable from my phone but also turn off my adb connection over Wi-Fi. This way, you can reconnect adb later and use logDump to get recent detections.
I know this is an old question, but hopefully this will be useful to somebody else.
Here's the code for the detector class:
import android.os.Handler;
import android.os.HandlerThread;
import android.os.Looper;
import android.os.SystemClock;
import android.util.Log;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Date;
import java.util.HashMap;
public class CpuSleepDetector {
private static final String TAG = CpuSleepDetector.class.getSimpleName();
private static CpuSleepDetector instance = null;
private HandlerThread thread;
private Handler handler;
private SleepEndNotifier notifier;
public static CpuSleepDetector getInstance() {
if (instance == null) {
instance = new CpuSleepDetector();
}
return instance;
}
private CpuSleepDetector() {
thread = new HandlerThread("cpuSleepDetectorThread");
thread.start();
handler = new Handler(thread.getLooper());
watchForSleep();
}
private void watchForSleep(){
// uptimeMillis() stops counting while the CPU is in deep sleep
final long uptimeAtStart = SystemClock.uptimeMillis();
final long realtimeAtStart = SystemClock.elapsedRealtime();
handler.postDelayed(new Runnable() {
@Override
public void run() {
long uptimeAtEnd = SystemClock.uptimeMillis();
long realtimeAtEnd = SystemClock.elapsedRealtime();
long realtimeDelta = realtimeAtEnd - realtimeAtStart;
long uptimeDelta = uptimeAtEnd - uptimeAtStart;
final long sleepTime = realtimeDelta - uptimeDelta;
if (sleepTime > 1) {
detectedStalls.put(new Date(), sleepTime);
prune();
if (notifier != null) {
new Handler(Looper.getMainLooper()).post(new Runnable() {
@Override
public void run() {
notifier.cpuSleepEnded(sleepTime);
}
});
}
}
watchForSleep();
}
}, 1000);
}
private HashMap<Date,Long> detectedStalls = new HashMap<Date,Long>();
private HashMap<Date,Long> getDetectedStalls() {
return detectedStalls;
}
private void prune() {
int numberToPrune = detectedStalls.size() - 100;
if (numberToPrune > 0) {
HashMap<Date,Long> newDetectedStalls = new HashMap<Date,Long>();
ArrayList<Date> dates = new ArrayList<>(getDetectedStalls().keySet());
Collections.sort(dates);
for (int i = numberToPrune; i < detectedStalls.size(); i++) {
newDetectedStalls.put(dates.get(i), detectedStalls.get(dates.get(i)));
}
detectedStalls = newDetectedStalls;
}
}
public void logDump() {
Log.d(TAG, "Last 100 known CPU sleep incidents:");
ArrayList<Date> dates = new ArrayList<>(getDetectedStalls().keySet());
Collections.sort(dates);
for (Date date: dates) {
Log.d(TAG, ""+date+": "+getDetectedStalls().get(date));
}
}
public void setSleepEndNotifier(SleepEndNotifier notifier) {
this.notifier = notifier;
}
public interface SleepEndNotifier {
public void cpuSleepEnded(long sleepDurationMillis);
}
}
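The detection trick reduces to simple arithmetic: SystemClock.elapsedRealtime() keeps counting through deep sleep while SystemClock.uptimeMillis() does not, so over any interval the difference of the two deltas is the time spent asleep. A minimal plain-Java distillation (hypothetical names):

```java
// Hypothetical distillation of the detector's core arithmetic from
// watchForSleep() above: sleep = realtime delta - uptime delta.
class SleepDelta {
    static long sleepMillis(long uptimeStart, long uptimeEnd,
                            long realtimeStart, long realtimeEnd) {
        return (realtimeEnd - realtimeStart) - (uptimeEnd - uptimeStart);
    }
}
```

If the 1000 ms postDelayed callback fires after 1250 ms of wall-clock time, the CPU spent roughly 250 ms asleep; when the deltas match, no sleep occurred.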
I've been having a fairly annoying problem with a video chat app I'm developing, and that's the issue of audio echoing.
I am at best a rank amateur at this, but the project I'm working on requires at least fully functional audio communication. Video turned out to be a lot easier than I originally anticipated.
The intended structure is eventually a thread taking input and another playing output on the same phone. While developing this, I've made two small apps that take mic input on one phone and send it via a DatagramSocket to the other. The phones in question are an LG Optimus L7-2 running Android 4.1.2 and an Alcatel Idol Mini (I think it's also advertised as Onetouch or some such) running Android 4.2.2.
The code that transfers audio works perfectly, with minimal background noise (I'm guessing thanks to my choice of input as well as the post processing), however, as long as the two phones are close enough, I get a rather alarming echo, which is only made worse if I dare attempt to put input/output in the same app at the same time.
After my initial attempts at somehow filtering it out failed (AcousticEchoCanceler seems to help less than NoiseSupressor, and AutomaticGainControl seems to do more damage than good), I've done a bit of reading but found nothing that could help.
I am at this point rather confused as I can't seem to shake the feeling that I'm missing something obvious, and that it shouldn't be THAT complicated to set up.
In addition, I'm including the base code I'm using for audio recording/playing.
The recorder segment
package com.example.audiotest;
import java.io.IOException;
import java.io.InputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.UnknownHostException;
import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;
import android.media.audiofx.AcousticEchoCanceler;
import android.media.audiofx.AutomaticGainControl;
import android.media.audiofx.NoiseSuppressor;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
public class MainActivity extends Activity {
private Button startButton,stopButton;
public byte[] buffer;
public static DatagramSocket socket;
private int port=50005;
AudioRecord recorder;
private int sampleRate = 22050;
private int channelConfig = AudioFormat.CHANNEL_IN_MONO;
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
private int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
private boolean status = true;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
startButton = (Button) findViewById (R.id.start_button);
stopButton = (Button) findViewById (R.id.stop_button);
startButton.setOnClickListener (startListener);
stopButton.setOnClickListener (stopListener);
Log.v("AudioPlayerApp","minBufSize: " + minBufSize);
//minBufSize += 2048;
minBufSize = 4096;
System.out.println("minBufSize: " + minBufSize);
}
private final OnClickListener stopListener = new OnClickListener() {
@Override
public void onClick(View arg0) {
status = false;
recorder.release();
Log.d("VS","Recorder released");
}
};
private final OnClickListener startListener = new OnClickListener() {
@Override
public void onClick(View arg0) {
status = true;
startStreaming();
}
};
public void startStreaming() {
Thread streamThread = new Thread(new Runnable() {
@Override
public void run() {
try {
DatagramSocket socket = new DatagramSocket();
Log.d("AudioPlayerApp", "Socket Created");
minBufSize = 4096;
byte[] buffer = new byte[minBufSize];
Log.d("AudioPlayerApp","Buffer created of size " + minBufSize);
DatagramPacket packet;
final InetAddress destination = InetAddress.getByName("192.168.0.13");
recorder = new AudioRecord(MediaRecorder.AudioSource.VOICE_RECOGNITION,sampleRate,channelConfig,audioFormat,minBufSize);
AcousticEchoCanceler canceler = AcousticEchoCanceler.create(recorder.getAudioSessionId());
NoiseSuppressor ns = NoiseSuppressor.create(recorder.getAudioSessionId());
AutomaticGainControl agc = AutomaticGainControl.create(recorder.getAudioSessionId());
canceler.setEnabled(true);
ns.setEnabled(true);
//agc.setEnabled(true);
recorder.startRecording();
while(status == true) {
//reading data from MIC into buffer
minBufSize = recorder.read(buffer, 0, buffer.length);
//putting buffer in the packet
packet = new DatagramPacket (buffer,buffer.length,destination,port);
socket.send(packet);
}
} catch(UnknownHostException e) {
Log.e("AudioPlayerApp", "UnknownHostException");
} catch (IOException e) {
e.printStackTrace();
Log.e("AudioPlayerApp", "IOException");
}
}
});
streamThread.start();
}
}
And the player segment.
package com.test.playsound;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.ServerSocket;
import java.net.Socket;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.os.Bundle;
import android.app.Activity;
import android.util.Log;
import android.view.Menu;
public class MainActivity extends Activity {
static int port = 50005;
static String address = "";
static int sampleRate = 22050;
private boolean running = true;
private AudioTrack audioTrack;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
Log.v("Player", "Init complete");
openPlaySocket();
}
private void openPlaySocket() {
// TODO Auto-generated method stub
Thread t = new Thread(new Runnable() {
@Override
public void run() {
// TODO Auto-generated method stub
try {
Log.v("AudioPlayerApp", "Opening socket");
DatagramSocket sSock = new DatagramSocket(port);
byte[] output = new byte[4096];
Log.v("AudioPlayerApp", "Generating AudioTrack");
int minBufferSize = AudioTrack.getMinBufferSize(sampleRate,
AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
sampleRate, AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT, minBufferSize,
AudioTrack.MODE_STREAM);
DatagramPacket receivePacket = new DatagramPacket(output,
output.length);
//Log.v("AudioPlayerApp", "Playing AudioTrack");
audioTrack.play();
while (running) {
//Log.v("AudioPlayerApp", "Waiting Packet");
sSock.receive(receivePacket);
Log.v("AudioPlayerApp", "Received packet");
try {
//Log.v("AudioPlayerApp", "writing data to audioTrack");
audioTrack.write(receivePacket.getData(), 0,
receivePacket.getData().length);
} catch (Exception e) {
Log.v("AudioPlayerApp",
"Failed to write audio: " + e.getMessage());
}
}
/*Log.v("AudioPlayerApp","Opening socket");
ServerSocket sSock = new ServerSocket(port);
Socket sock = sSock.accept();
Log.v("AudioPlayerApp","Socket opened "+port);
*/
} catch (Exception e) {
// TODO: handle exception
Log.v("AudioPlayerApp", "Error: " + e.getMessage());
}
}
});
Log.v("Player", "Starting thread");
t.start();
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.activity_main, menu);
return true;
}
}
I'm aware it contains bad practices (such as not checking whether the device supports certain features, or not releasing resources), but this was in an effort to start testing and fixing the echo as quickly as possible. I've confirmed that both phones have access to AcousticEchoCanceler and NoiseSuppressor as well as recording and internet permissions, and aside from the fact that MediaRecorder.AudioSource.VOICE_COMMUNICATION causes my AudioRecord to crash, I've had no other problems.
I'm looking for any ideas or advice on the subject, as I'm quite frankly stumped. What can be done to fix the issue of echoing while recording and playing voice?
The AcousticEchoCanceler class is for canceling audio that is played by the speaker and captured by the microphone of the same device, where there is only a small, fixed delay between playback and capture.
It cannot remove the echo caused by placing two phones near each other, because that echo path has a long and variable delay.
You need to leverage the built-in echo cancellation at the hardware level: first check whether AcousticEchoCanceler.isAvailable() returns true.
Then you can try the combinations suggested here on SO.
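A more defensive version of the effect setup from the question might look like this. This is only a sketch: create() can return null even on devices where isAvailable() is true, and setEnabled() returns a status code rather than throwing.

```java
// Sketch: attach platform effects to an existing AudioRecord session,
// guarding against devices where they are unavailable.
if (AcousticEchoCanceler.isAvailable()) {
    AcousticEchoCanceler aec = AcousticEchoCanceler.create(recorder.getAudioSessionId());
    // create() may still return null; setEnabled() reports success via its return code.
    if (aec == null || aec.setEnabled(true) != AudioEffect.SUCCESS) {
        Log.w("AudioPlayerApp", "Could not enable AcousticEchoCanceler");
    }
}
if (NoiseSuppressor.isAvailable()) {
    NoiseSuppressor ns = NoiseSuppressor.create(recorder.getAudioSessionId());
    if (ns != null) ns.setEnabled(true);
}
```

Even with these checks, a software AEC attached this way only targets same-device echo, so it won't help with the two-phones-side-by-side case described above.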
I'm having a problem with Android's MediaPlayer in that it is too slow when calling the prepare method. I've tried to simply keep a Vector of the few MediaPlayer objects (with their preloaded data sources), but calling .start() multiple times results in weird issues.
The first issue is that it will skip every other play, and sometimes playback is half (or less) as loud.
The tones played are very very short but need to be played as quickly as possible. My source code is posted below.
Any help is greatly appreciated.
Kevin
package com.atClass.lemon;
import java.util.Vector;
import com.atClass.cardShoe.SettingTools.SETTING_PREF;
import com.atClass.cardShoe.SettingTools.SETTING_STUB;
import com.atClass.cardShoe.SettingTools.SETTING_VALUE;
import android.content.res.AssetFileDescriptor;
import android.media.MediaPlayer;
import android.media.MediaPlayer.OnCompletionListener;
import android.net.Uri;
import android.util.Config;
import android.util.Log;
public class MediaHandler {
public static int cRepeat;
public static float cVolume = Integer.valueOf(Prefs.cPrefsGet.getString(SETTING_PREF.annunciator_volume.name()+SETTING_STUB._int.name(), PrefDefaults.getDefault(SETTING_PREF.annunciator_volume,SETTING_STUB._int)));
public static boolean cVolumeEnabled = !(Prefs.cPrefsGet.getString(SETTING_PREF.annunciator_volume.name()+SETTING_STUB._value.name(),PrefDefaults.getDefault(SETTING_PREF.annunciator_volume)).equals(SETTING_VALUE.disabled.name()));
static Vector <MediaPlayer> cQuickMediaPlayerList = new Vector<MediaPlayer>();
public static enum AUDIO_CLIP {
app_boot_sound(R.raw.windows_hardware_insert),
app_results_sound(R.raw.windows_exclamation),
app_warning_sound(R.raw.windows_hardware_fail),
app_card_draw_sound(R.raw.fs_beep5),
app_lid_open_sound(R.raw.windows_hardware_fail),
app_top_tigger_overdraw_sound(R.raw.fs_beep6),
test(R.raw.fs_beep4);
private int enumResourceId;
AUDIO_CLIP(int input){ enumResourceId = input;}
int getItem(){return enumResourceId;}
}
public static int getAudioClipIndex(AUDIO_CLIP iAudioClip){
for (int i=0; i<AUDIO_CLIP.values().length; i++){
if (AUDIO_CLIP.values()[i] == iAudioClip){
return i;
}
}
return 0;
}
public static void setupQuickMediaPlayer(){
cQuickMediaPlayerList.clear();
for (int i=0; i<AUDIO_CLIP.values().length; i++){
MediaPlayer lMediaPlayer = new MediaPlayer();
final AssetFileDescriptor afd = Global.gContext.getResources().openRawResourceFd(AUDIO_CLIP.values()[i].getItem());
try{
lMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
afd.close();
lMediaPlayer.prepare();
}catch(Exception e){}
lMediaPlayer.setVolume(cVolume,cVolume);
lMediaPlayer.setLooping(false);
lMediaPlayer.setOnCompletionListener(new OnCompletionListener(){
@Override
public void onCompletion(MediaPlayer lMediaPlayer) {
lMediaPlayer.release();
try{lMediaPlayer.prepare();}catch(Exception e){e.printStackTrace();}
}});
cQuickMediaPlayerList.add(lMediaPlayer);
}
}
public static void playAudio(AUDIO_CLIP iAudioClip){
float volume = cVolume;
volume++;
volume /= 10;
playAudio(iAudioClip,volume);
}
public static void playAudio(final AUDIO_CLIP iAudioClip, final float iVolume){
Thread lThread = new Thread(new Runnable(){
public void run() {
//int resourceId = iAudioClip.getItem();
Log.d(Global.TAG,"--> Playing audio clip: " + iAudioClip.name() + "," + iAudioClip.getItem() + "," + getAudioClipIndex(iAudioClip));
if (cVolumeEnabled == true){
//Log.d(Global.TAG,"--> Supplying volume: " + iVolume);
//Works but is too slow
// try {
// final MediaPlayer lMediaPlayer = new MediaPlayer();
// AssetFileDescriptor afd = Global.gContext.getResources().openRawResourceFd(iAudioClip.getItem());
// lMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
// afd.close();
// lMediaPlayer.prepare();
// lMediaPlayer.setVolume(iVolume,iVolume);
// lMediaPlayer.setLooping(false);
// lMediaPlayer.setOnCompletionListener(new OnCompletionListener(){
// @Override
// public void onCompletion(MediaPlayer arg0) {
// lMediaPlayer.release();
// }});
// lMediaPlayer.start();
// }catch(Exception e){}
try{
//Works half the time
cQuickMediaPlayerList.get(getAudioClipIndex(iAudioClip)).start();
}catch(Exception e){}
}
}
});
lThread.setPriority(Thread.MAX_PRIORITY);
lThread.start();
}
}
You should use SoundPool instead: http://developer.android.com/reference/android/media/SoundPool.html
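A minimal SoundPool sketch along those lines, assuming the same R.raw resources as in the question (the constructor shown is the pre-API-21 form):

```java
// Sketch: preload short clips once, then trigger them with low latency.
SoundPool pool = new SoundPool(4, AudioManager.STREAM_MUSIC, 0);
final int beepId = pool.load(context, R.raw.fs_beep5, 1); // loads asynchronously
pool.setOnLoadCompleteListener(new SoundPool.OnLoadCompleteListener() {
    @Override
    public void onLoadComplete(SoundPool sp, int sampleId, int status) {
        if (status == 0) {
            // left/right volume 0..1, priority, no loop, normal playback rate
            sp.play(beepId, 1f, 1f, 1, 0, 1f);
        }
    }
});
```

Once a clip is loaded, play() can be called repeatedly with no prepare() step, which is exactly the latency profile short UI tones need.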
In your onCompletionListener, you call release() followed by prepare(). This is illegal, and is probably why you're having problems starting the players multiple times. release() frees all of the MediaPlayer's resources and should only be called when you are completely done with it; if you want to play the clip again, use stop() instead.
However, this still won't speed up prepare(). You might want to try seekTo(0) instead, but even then it might not be as fast as you want. It really depends on how fast you're talking about.
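Applied to the listener in the question, that advice would look roughly like this (a sketch: rewind instead of releasing, so the player stays in the prepared state and the next start() is immediate):

```java
lMediaPlayer.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer mp) {
        // Don't call release() here: that frees the player entirely.
        // After completion the player is still prepared, so rewinding
        // is enough for the next start() to play from the beginning.
        mp.seekTo(0);
    }
});
```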