Trigger an audio file when call is answered - android

Is there a way to play an audio file when a call is answered, NOT into the call itself (so the other side could hear it), but only through the in-call earpiece (so only our side can hear it)?
Sounds strange, I know, but it is part of a much larger app.

First of all, you'll need to set up a BroadcastReceiver (let's call it "CallReceiver") and request permission to read the phone state (intuitively, the permission to add is android.permission.READ_PHONE_STATE).
Register your CallReceiver like this:
<receiver android:name=".CallReceiver" android:enabled="true">
    <intent-filter>
        <action android:name="android.intent.action.PHONE_STATE" />
    </intent-filter>
</receiver>
In your CallReceiver, you can decide on which events your audio should play back (incoming call, outgoing call, phone ringing, ...): just read the EXTRA_STATE extra, or call getCallState() (check out the TelephonyManager docs).
As for the audio, you will need the AudioManager: set the "in call" playback mode and turn the speakerphone off before playing the sound.
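As a sketch, the dispatch inside the receiver boils down to comparing the EXTRA_STATE string against the TelephonyManager constants. The string literals below match the documented values of TelephonyManager.EXTRA_STATE_RINGING/OFFHOOK/IDLE; the class and the describeState helper are hypothetical names, kept Android-free so the logic is easy to test in isolation.

```java
public class CallStateDispatcher {
    // These literals match TelephonyManager.EXTRA_STATE_RINGING,
    // EXTRA_STATE_OFFHOOK and EXTRA_STATE_IDLE respectively.
    static final String RINGING = "RINGING";
    static final String OFFHOOK = "OFFHOOK";
    static final String IDLE = "IDLE";

    // Decide what to do for a given EXTRA_STATE value; in the real
    // receiver you would start playback on OFFHOOK (call answered).
    public static String describeState(String state) {
        if (RINGING.equals(state)) return "phone is ringing";
        if (OFFHOOK.equals(state)) return "call answered - start playback";
        if (IDLE.equals(state)) return "call ended - stop playback";
        return "unknown state";
    }
}
```

Inside the actual onReceive you would obtain the state with intent.getStringExtra(TelephonyManager.EXTRA_STATE) and feed it to a dispatcher like this one.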
AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
am.setMode(AudioManager.MODE_IN_CALL);
am.setSpeakerphoneOn(false);
I hope this helps!

private void playShortAudioFileViaAudioTrack(String filePath) throws IOException {
    // We keep filePath global temporarily as we have only two sample sounds for now.
    if (filePath == null)
        return;
    // Read the whole file into memory,
    // e.g. path = "/sdcard/samplesound.pcm" or "/sdcard/samplesound.wav"
    File file = new File(filePath);
    byte[] byteData = new byte[(int) file.length()];
    FileInputStream in = new FileInputStream(file);
    try {
        in.read(byteData);
    } finally {
        in.close();
    }
    // Set up the track and push the data to it
    int intSize = AudioTrack.getMinBufferSize(8000, AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_8BIT);
    AudioTrack at = new AudioTrack(AudioManager.STREAM_VOICE_CALL, 8000, AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_8BIT, intSize, AudioTrack.MODE_STREAM);
    if (at.getState() == AudioTrack.STATE_INITIALIZED) {
        at.play();
        // Write the byte array to the track
        at.write(byteData, 0, byteData.length);
        at.stop();
        at.release();
    } else {
        Log.d("TCAudio", "audio track is not initialised");
    }
}

Related

How can I use AudioTrack to play an mp3 file and also seek to a position

I am trying to develop an application where I can set the playback speed of a music file (mp3) to 1x, 1.5x, 2x, 2.5x, and so on, but MediaPlayer does not support this feature below API 23. How can I use AudioTrack to play this mp3 file and also seek to a position? The code below gives me a "zzzzzzz" sound.
public void playAudio() {
    int minBufferSize = AudioTrack.getMinBufferSize(8000, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
    int bufferSize = 512;
    audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 8000, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, minBufferSize, AudioTrack.MODE_STREAM);
    String filepath = Environment.getExternalStorageDirectory().getAbsolutePath();
    int i = 0;
    byte[] s = new byte[bufferSize];
    try {
        final String path = Environment.getExternalStorageDirectory().getAbsolutePath() + "/folioreader/audio" + ".mp3";
        FileInputStream fin = new FileInputStream(path);
        DataInputStream dis = new DataInputStream(fin);
        audioTrack.play();
        while ((i = dis.read(s, 0, bufferSize)) > -1) {
            audioTrack.write(s, 0, i);
        }
        audioTrack.stop();
        audioTrack.release();
        dis.close();
        fin.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
AudioTrack can play uncompressed audio (WAV), not compressed formats (mp3, AAC, etc.). You need to use MediaCodec to decode the file and then use AudioTrack to play the audio. Refer to these links: https://developer.android.com/reference/android/media/AudioTrack.html and https://developer.android.com/reference/android/media/MediaCodec.html.
For faster playback, set a sample rate proportional to the speed you require. For example, to play 8 kHz audio at 2x, give 16 kHz to AudioTrack, and so on. This is a crude approach.
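The crude rate trick above can be sketched as a tiny helper. PlaybackSpeed and targetRate are hypothetical names; the real call to apply the result at runtime would be AudioTrack.setPlaybackRate(int), which avoids recreating the track.

```java
public class PlaybackSpeed {
    // Scale the source sample rate by the desired speed factor,
    // e.g. 8000 Hz at 2.0x becomes 16000 Hz. Playing the same PCM
    // data at the higher rate makes it sound sped up (and pitched up).
    public static int targetRate(int sourceRateHz, double speed) {
        return (int) Math.round(sourceRateHz * speed);
    }
}
```

Usage would look something like at.setPlaybackRate(PlaybackSpeed.targetRate(8000, 2.0)); note this shifts pitch as well as tempo, which is why it is a crude approach.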

Playing two sounds simultaneously - Android

Is it possible to play two sound (mp3) files at the same time? I have tried using two different MediaPlayer objects:
MediaPlayer mediaPlayer;
MediaPlayer mediaPlayer2;
to play the sounds, but that does not work. I cannot use SoundPool either, as the sound files in use are around 10 MB each (SoundPool doesn't work well with sound files larger than about 3 MB).
Here is some code to get familiar with my situation:
@Override
public void onResume() {
    super.onResume();
    if (mediaPlayer == null) {
        mediaPlayer = MediaPlayer.create(getActivity().getApplicationContext(), R.raw.song1);
    }
    if (mediaPlayer2 == null) {
        mediaPlayer2 = MediaPlayer.create(getActivity().getApplicationContext(), R.raw.song2);
    }
}

private void startPlaying() {
    mediaPlayer.setLooping(true);
    mediaPlayer.start();
    mediaPlayer2.start();
}
Any suggestions? Is there some way to make this two-MediaPlayer approach work? If not, what other options are there? Code would be helpful!
Since playing two audio files simultaneously is definitely an issue, I thought another way to look at the problem would be to combine the two audio files programmatically into one and play that. That turned out to be simple to implement with WAV files (MP3 and other compressed formats would need to be decompressed first). Anyway, here's how I did it:
InputStream is = getResources().openRawResource(R.raw.emokylotheme); // Name of file 1
byte[] bytesTemp2 = fullyReadFileToBytes(new File(
        Environment.getExternalStorageDirectory().getAbsolutePath() +
        "/Kylo Ren/" + filename + "_morphed.wav")); // Name of file 2
byte[] sample2 = convertInputStreamToByteArray(is);
byte[] temp2 = bytesTemp2.clone();
RandomAccessFile randomAccessFile2 = new RandomAccessFile(new File(
        Environment.getExternalStorageDirectory().getAbsolutePath() +
        "/Kylo Ren/" + filename + "_morphed.wav"), "rw");
// Seek past the 44-byte header of the WAV format
randomAccessFile2.seek(44);
for (int n = 0; n < bytesTemp2.length; n++) {
    bytesTemp2[n] = (byte) (temp2[n] + sample2[n]);
}
randomAccessFile2.write(bytesTemp2);
randomAccessFile2.close();
And here are the support functions:
public byte[] convertInputStreamToByteArray(InputStream inputStream) {
    byte[] bytes = null;
    try {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        byte[] data = new byte[1024];
        int count;
        while ((count = inputStream.read(data)) != -1) {
            bos.write(data, 0, count);
        }
        bos.flush();
        bos.close();
        inputStream.close();
        bytes = bos.toByteArray();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return bytes;
}
byte[] fullyReadFileToBytes(File f) throws IOException {
    int size = (int) f.length();
    byte[] bytes = new byte[size];
    byte[] tmpBuff = new byte[size];
    FileInputStream fis = new FileInputStream(f);
    try {
        int read = fis.read(bytes, 0, size);
        if (read < size) {
            int remain = size - read;
            while (remain > 0) {
                read = fis.read(tmpBuff, 0, remain);
                System.arraycopy(tmpBuff, 0, bytes, size - remain, read);
                remain -= read;
            }
        }
    } finally {
        fis.close();
    }
    return bytes;
}
In essence, what the code does is: it gets the bytes of the two audio files (one inside the app in R.raw, the other in the external storage directory), sums them, and writes the resulting bytes to another file.
The issue with this code is that it generates some amount of background noise. It isn't a lot, but I believe the summing of the bytes at certain points (maybe at the extremes?) causes the noise. If someone knows how to fix this, please edit the answer and let me know.
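The noise comes from adding raw bytes: in 16-bit PCM each sample spans two little-endian bytes, so byte-wise addition corrupts samples and overflows wrap around as loud clicks. A sketch of a fix, assuming both files are 16-bit little-endian PCM at the same sample rate (PcmMixer and mix are hypothetical names), is to reassemble each sample, add, and clamp:

```java
public class PcmMixer {
    // Mix two 16-bit little-endian PCM buffers sample by sample,
    // clamping the sum to the signed 16-bit range to avoid the
    // wrap-around distortion you get from adding raw bytes.
    public static byte[] mix(byte[] a, byte[] b) {
        int len = Math.min(a.length, b.length) & ~1; // whole samples only
        byte[] out = new byte[len];
        for (int i = 0; i < len; i += 2) {
            // Reassemble each little-endian 16-bit sample before adding
            int sa = (short) ((a[i] & 0xFF) | (a[i + 1] << 8));
            int sb = (short) ((b[i] & 0xFF) | (b[i + 1] << 8));
            int sum = Math.max(-32768, Math.min(32767, sa + sb));
            out[i] = (byte) (sum & 0xFF);
            out[i + 1] = (byte) ((sum >> 8) & 0xFF);
        }
        return out;
    }
}
```

Clamping still distorts slightly when both signals peak together; attenuating each input by half before summing trades that for a quieter mix.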
P.S. this was for an open source voice changer app with dramatic background noise effects called Kylo Ren Voice Changer (https://github.com/advaitsaravade/Kylo-Ren-Voice-Changer)
I have used two instances of MediaPlayer in a Service. My code is similar to yours. I had some problems but finally solved them. Please check my answer here: Unable to play two MediaPlayer at same time in Nexus 5.
If you still have problems, please post your full code and the logcat error.
You can also try creating two fragments and having each one play a sound.
I have another problem: if I play an mp3 file, then click the back button and start the activity again, I can play the same file again at the same time. You can read Android Mediaplayer multiple instances when activity resumes.

Accessing the output video while recording

In short, I'm looking for a way to get the byte stream from the camera while recording video.
The aim is to continuously record while saving certain portions of the current recording without stopping the actual recording process to access the output file. Is this even possible, or will I need to actually stop the recording and save it for it be playable?
I've seen projects and open source libraries that allow live streaming from the camera to a server via a local socket and the ParcelFileDescriptor class, so I assume (maybe incorrectly) that the recorder's byte stream must be accessible somehow.
Any suggestions or help would be greatly appreciated.
Set the output file to a FileDescriptor:
mRecorder.setOutputFile(getStreamFd());
Then use this function:
private FileDescriptor getStreamFd() {
    ParcelFileDescriptor[] pipe = null;
    try {
        pipe = ParcelFileDescriptor.createPipe();
        new TransferThread(new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]),
                new FileOutputStream(getOutputFile())).start();
    } catch (IOException e) {
        Log.e(getClass().getSimpleName(), "Exception opening pipe", e);
    }
    return pipe[1].getFileDescriptor();
}

private File getOutputFile() {
    return new File(Environment.getExternalStorageDirectory().getPath() + "/YourDirectory/filename");
}
New thread code:
static class TransferThread extends Thread {
    InputStream in;
    FileOutputStream out;

    TransferThread(InputStream in, FileOutputStream out) {
        this.in = in;
        this.out = out;
    }

    @Override
    public void run() {
        byte[] buf = new byte[8192];
        int len;
        try {
            while ((len = in.read(buf)) > 0) {
                out.write(buf, 0, len);
            }
            in.close();
            out.flush();
            out.getFD().sync();
            out.close();
        } catch (IOException e) {
            Log.e(getClass().getSimpleName(),
                    "Exception transferring file", e);
        }
    }
}
Don't forget to add permissions to your manifest file:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
I had a similar problem and wanted to record H264 into an MP4 file while accessing the H264 NAL units from the camera byte stream (to redirect them to libRTMP). The following examples helped a lot (they require at least Android 4.3):
http://bigflake.com/mediacodec/ <- the CameraToMpegTest.java and "Android Breakout game recorder patch" examples
Basically, Android's MediaCodec class provides low-level access to the device encoders/decoders. Take a look at the drainEncoder() function of the examples above:
the video data is sent to MediaMuxer to create an output file
you can easily access the H264 NAL units from the encodedData ByteBuffer and process them any way you want
Example:
int old_pos = encodedData.position();
encodedData.position(0);
byte[] encoded_array = new byte[encodedData.remaining()];
encodedData.get(encoded_array);
encodedData.position(old_pos);

Play audio file during a call, however the person on the other end is not able to hear it

I want to play an audio file during a call, and I am able to play it, but when I make a call it plays the file only on the device on which I have installed the application.
Here is my sample code with which I played the audio file:
public void play() {
    bufferSize = AudioTrack.getMinBufferSize(16000,
            AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
    AudioTrack at = new AudioTrack(AudioManager.STREAM_VOICE_CALL,
            16000, AudioFormat.CHANNEL_CONFIGURATION_MONO,
            AudioFormat.ENCODING_PCM_16BIT, bufferSize, AudioTrack.MODE_STREAM);
    String filePath = Environment.getExternalStorageDirectory().getAbsolutePath();
    String file = filePath + "/" + "myfile.wav";
    int i = 0;
    byte[] s = new byte[bufferSize];
    try {
        FileInputStream fin = new FileInputStream(file);
        BufferedInputStream bis = new BufferedInputStream(fin, 44000);
        DataInputStream dis = new DataInputStream(bis);
        at.play();
        while ((i = dis.read(s, 0, bufferSize)) > -1) {
            at.write(s, 0, i);
        }
        at.stop();
        at.release();
        dis.close();
        fin.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
This code works fine for me.
But I want the file to be heard by the person on the other end of the call. How could I do that?
Any ideas? I know the Android API doesn't allow it, but has anyone managed to do it?
Please give only genuine and accurate answers, not assumptions.

Android: Playing AudioTrack multiple times produces a crash

I am trying to play a buffered sound (.wav) using AudioTrack; please see the code below. I call this function inside a Thread to support simultaneous playback, and that works fine: the sound plays normally. But if I play sounds one after another continuously (i.e. starting the second play before the first one completes), the app crashes (force close / "stopped unexpectedly" error).
Does anyone come across such problems and resolve it in a way?
private void PlayAudioTrack(String filePath) throws IOException {
    if (filePath == null)
        return;
    File file = new File(filePath); // sdcard path
    byte[] byteData = new byte[(int) file.length()];
    FileInputStream in = null;
    try {
        in = new FileInputStream(file);
        in.read(byteData);
        in.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
    int intSize = android.media.AudioTrack.getMinBufferSize(8000, AudioFormat.CHANNEL_CONFIGURATION_MONO,
            AudioFormat.ENCODING_PCM_8BIT);
    at = new AudioTrack(AudioManager.STREAM_MUSIC, 8000, AudioFormat.CHANNEL_CONFIGURATION_MONO,
            AudioFormat.ENCODING_PCM_8BIT, intSize, AudioTrack.MODE_STREAM);
    at.play();
    at.write(byteData, 0, byteData.length);
    at.stop();
}
Appreciate your response.
Thanks.
You have to release the AudioTrack's resources as well as stopping it:
at.release();
