I'm trying to send audio via AudioStream from an Android device to VLC media player. VLC catches the audio stream but drops all the blocks.
The following is the relevant block of code. Is there a problem with the codec? VLC screenshot
AudioManager audiomanager =(AudioManager) getSystemService(Context.AUDIO_SERVICE);
audiomanager.setMode(AudioManager.MODE_IN_COMMUNICATION);
audioGroup = new AudioGroup();
audioGroup.setMode(AudioGroup.MODE_NORMAL);
localip= getLocalAddress().toString();
audioStream = new AudioStream(getLocalAddress());
locolport.append(String.valueOf(audioStream.getLocalPort()));
audioStream.setCodec(AudioCodec.PCMU);
audioStream.setMode(RtpStream.MODE_NORMAL);
audioStream.associate(remoteIP, 22222);
audioStream.join(audioGroup);
Actually, the problem was with permissions on the Android device. I was declaring the permissions in the manifest file, but on Android Marshmallow (6.0) or higher I also had to request the dangerous permissions at runtime at least once (until granted).
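For reference, a minimal runtime-permission sketch for the microphone, assuming this runs in an Activity (the request code 1 is arbitrary, and startAudioStream() stands in for the setup code above):

```java
// On Android 6.0+ RECORD_AUDIO must be granted at runtime, even when it
// is already declared in the manifest.
if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this,
            new String[] { Manifest.permission.RECORD_AUDIO }, 1);
} else {
    startAudioStream(); // hypothetical helper wrapping the AudioStream setup
}
```

The result of the dialog arrives in onRequestPermissionsResult, where you can retry the stream setup once the permission is granted.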
Related
I'm looking to pair A2DP stereo sound with the interrupt functionality of AudioManager's MODE_IN_COMMUNICATION. The purpose of this pairing is to insert snippets of custom music into AM/FM broadcasts by using the music as something that "dials" the Android device and stops the AM/FM broadcast.
I have this working decently with SCO using the following code to start the "phone call".
Here is the AudioManager code:
AudioManager localAudioManager = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
localAudioManager.setMode(AudioManager.MODE_NORMAL); // clearer than the raw 0
localAudioManager.setBluetoothScoOn(true);
localAudioManager.startBluetoothSco();
localAudioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
Here is the MediaPlayer I'm trying to play:
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioAttributes(new AudioAttributes.Builder()
.setUsage(AudioAttributes.USAGE_MEDIA)
.setContentType(AudioAttributes.CONTENT_TYPE_SPEECH)
.build());
mediaPlayer.setDataSource("https://wmbr.org/WMBR_Archive_128.m3u");
mediaPlayer.prepare();
mediaPlayer.start();
The audio produced by this code is low-quality and mono as opposed to stereo. I would like to change that.
The issue is that the Android Dev site for startBluetoothSco says:
Even if a SCO connection is established, the following restrictions apply on audio output streams so that they can be routed to the SCO headset:
- the stream type must be STREAM_VOICE_CALL
- the format must be mono
- the sampling rate must be 16 kHz or 8 kHz
Is there any existing way to combine stereo sound and the interrupt functionality?
Additional context: in this answer it seems that MODE_IN_COMMUNICATION and MODE_IN_CALL use the PHONE routing strategy. The author also says:
If your BT accessory supports the Hands-free profile it will use a SCO link for the voice audio, and the ACL channel used for A2DP should be closed to avoid interference between the two.
I'm assuming this means my only option is a custom routing strategy, and I'm not sure what that entails.
I am connecting a mobile device running Android 4.1 to a Bluetooth device (device class = 1792), using Bluetooth SCO to route voice audio. I've set up a BluetoothSocket using createRfcommSocketToServiceRecord successfully.
My settings:
Using AudioRecord and AudioTrack with a frequency of 8000 Hz, MediaRecorder.AudioSource.MIC as the source for the AudioRecord, AudioManager.STREAM_VOICE_CALL for the AudioTrack, and trying both MODE_IN_COMMUNICATION and MODE_IN_CALL for the AudioManager mode.
All without success: I don't get audio on my device.
My questions:
Should I use MODE_IN_COMMUNICATION or MODE_IN_CALL?
Do I need to switch to MODE_NORMAL or another mode in order to play on the device?
Can you suggest a code flow to make SCO audio play on a device?
Can you point out some working code to review?
Notes:
The "Media audio" profile (A2DP) is disabled on the device - only "Call audio" profile (HFP) is enabled.
I'll gladly share some code, but given the existing SO Q&As it would probably look much the same.
Regards.
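As a starting point, here is a commonly suggested SCO code flow (a sketch, not verified against your device class): start SCO, wait for the ACTION_SCO_AUDIO_STATE_UPDATED broadcast, and only start playback on STREAM_VOICE_CALL once the link is connected. playOnVoiceCallStream() is a hypothetical helper that would create an 8 kHz mono AudioTrack on STREAM_VOICE_CALL.

```java
final AudioManager am =
        (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
am.setMode(AudioManager.MODE_IN_COMMUNICATION);
am.startBluetoothSco();

// Wait for the SCO link before playing; starting the AudioTrack before
// the link is up is a common reason for silence.
context.registerReceiver(new BroadcastReceiver() {
    @Override public void onReceive(Context c, Intent intent) {
        int state = intent.getIntExtra(AudioManager.EXTRA_SCO_AUDIO_STATE, -1);
        if (state == AudioManager.SCO_AUDIO_STATE_CONNECTED) {
            am.setBluetoothScoOn(true);
            playOnVoiceCallStream(); // hypothetical: 8 kHz mono playback
        }
    }
}, new IntentFilter(AudioManager.ACTION_SCO_AUDIO_STATE_UPDATED));
```

Remember to call stopBluetoothSco() and restore MODE_NORMAL when playback ends.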
What I am trying to do: read a .wav file in C (Linux), forward the buffer data through a Bluetooth RFCOMM socket, receive the buffer in Android, and hand it to an AudioTrack to play. (I need the Android application to play the audio stream.)
Code:
1 - C code for RFCOMM socket creation: C code for rfcomm socket
2 - C code for forwarding data:
FILE *fp;
char buffer[1024];
size_t n;
fp = fopen("feelgood.wav", "rb"); /* binary mode: a .wav is binary data */
/* fread returns the byte count when the element size is 1 */
while ((n = fread(buffer, 1, sizeof(buffer), fp)) > 0) {
    /* send exactly the bytes read; strlen() truncates at the first
       zero byte, which is common in audio data */
    status = write(bluetooth_socket, buffer, n);
    usleep(100000);
}
fclose(fp);
3 - Android code for reading from the socket is something like this:
// AudioTrack initialization for streaming; the parameters must match the
// .wav data (a typical .wav is 44100 Hz, 16-bit), otherwise you hear noise
int bufSize = AudioTrack.getMinBufferSize(44100,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        bufSize, AudioTrack.MODE_STREAM);
track.play();
// Keep reading from the socket until the stream ends (the original code
// read only once)
byte[] buffer = new byte[1024];
int bytes;
InputStream in = socket.getInputStream();
while ((bytes = in.read(buffer)) > 0) {
    track.write(buffer, 0, bytes);
}
Problem: I don't understand why the AudioTrack is not playing properly (a hint of the music can be heard, but with a lot of noise). How can I get noise-free audio on the Android side with this approach? Is the problem in the AudioTrack setup or in the buffering?
Related question (Receive audio via Bluetooth in Android), but I cannot follow the A2DP approach with Android as the sink.
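One way to rule out a parameter mismatch is to read the .wav header and configure the AudioTrack from it instead of hard-coding 8-bit mono. A minimal sketch of parsing the canonical 44-byte PCM header (assumes a plain RIFF/WAVE file with no extra chunks before the data; WavHeader is an illustrative class, not part of any API):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Reads channel count, sample rate, and bit depth from the canonical
// 44-byte PCM WAV header so the AudioTrack can be configured to match.
final class WavHeader {
    final int channels;
    final int sampleRate;
    final int bitsPerSample;

    WavHeader(byte[] header) {
        ByteBuffer b = ByteBuffer.wrap(header).order(ByteOrder.LITTLE_ENDIAN);
        if (b.getInt(0) != 0x46464952) {        // "RIFF" read little-endian
            throw new IllegalArgumentException("not a RIFF/WAVE file");
        }
        channels = b.getShort(22);              // 1 = mono, 2 = stereo
        sampleRate = b.getInt(24);              // e.g. 44100
        bitsPerSample = b.getShort(34);         // e.g. 16
    }
}
```

With those values you would pick ENCODING_PCM_16BIT when bitsPerSample is 16 and CHANNEL_OUT_STEREO when channels is 2, and also skip the 44 header bytes before writing to the AudioTrack, since playing the header as samples adds a click of noise.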
I'm programming an app that gives vocal feedback through the internal speaker while the user performs some exercises.
Now I want to add a function to record the user's heartbeat and breathing through a stethoscope plugged into the phone's 3.5mm jack. The problem is that when I plug in the jack, the speaker won't play any sound, because the phone thinks earphones are connected and routes the sound to them.
What I want to achieve is to record sound from the stethoscope plugged into the phone while playing sound through the internal speaker.
Basically I use two ways to play sound, MediaPlayer and TextToSpeech.
I searched on the internet, found these articles:
Forcing sound output through speaker in Android
want to play TTS on Bluetooth headset
What they told me to do is add the MODIFY_AUDIO_SETTINGS permission to the manifest,
declare variables:
MediaPlayer mediaPlayer;
AudioManager mAudioManager;
HashMap<String, String> mHashAlarm = new HashMap<String, String>();
in OnCreate method:
mAudioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
mAudioManager.setSpeakerphoneOn(true);
mediaPlayer = MediaPlayer.create(context, R.raw.ding);
mHashAlarm.put(TextToSpeech.Engine.KEY_PARAM_STREAM, String.valueOf(AudioManager.STREAM_MUSIC));
...
tts.speak(text, TextToSpeech.QUEUE_FLUSH, mHashAlarm);
But neither of them works.
Does anyone know how to achieve this function? Any ideas would be a lot of help. Thanks.
Simply purchase an audio splitter. After that you get two ports: plug your external mic into one and plug a speaker into the other.
Say I make a call to someone with my android phone.
Is it possible to play some audio file into the call?
I just tried this:
if(state==TelephonyManager.CALL_STATE_OFFHOOK){
AudioManager am = (AudioManager) pccontext.getSystemService(Context.AUDIO_SERVICE);
am.setMode(AudioManager.MODE_NORMAL);
am.setSpeakerphoneOn(true);
MediaPlayer mp = MediaPlayer.create(pccontext, R.raw.beep);
mp.start();
}
But it's not working in MODE_NORMAL. When I tried MODE_IN_CALL, the phone plays the audio, but the caller on the other side cannot hear the audio played by the receiving phone.
Unfortunately, it is not possible; it is an Android security limitation.