I have a college assignment to build an Android app that communicates with Ubuntu (or any other Linux distribution) and streams audio between the microphone and speakers of both the PC and the phone. Switching the direction of communication should be done on Android, and the script that listens on the Bluetooth port on the PC should be written in Python or some other lightweight language. It does not have to be full-duplex; half-duplex (one direction at a time) is enough.
Is the answer in the BluetoothA2dp Android profile or is there something else?
I'm familiar with making simple Android apps.
Thanks a lot!
Not sure if you still need the answer, but I am working on something similar.
Basically, I am working with Python on the Windows platform to record streaming audio from the laptop's microphone, process the sound for ANC [automatic noise cancellation], pass it through a band-pass filter, and then output the audio stream to a Bluetooth device.
I would like to ultimately port this to a smartphone, but for now I am prototyping with Python as that's a lot easier.
While I am still at an early stage of the project, here are two pieces that may be helpful:
1) Stream audio from the microphone to the speakers using sounddevice
Record external audio and play it back.
Refer to the sounddevice module installation details here:
http://python-sounddevice.readthedocs.org/en/0.3.1/
import sounddevice as sd

fs = 44100  # sampling rate in Hz (not defined in the original snippet)
duration = 5  # seconds

myrecording = sd.rec(int(duration * fs), samplerate=fs, channels=2, dtype='float64')
print("Recording Audio")
sd.wait()
print("Audio recording complete, play audio")
sd.play(myrecording, fs)
sd.wait()
print("Play audio complete")
2) Communicate over Bluetooth
Refer to details from here:
https://people.csail.mit.edu/albert/bluez-intro/c212.html
import bluetooth

target_name = "My Phone"
target_address = None

nearby_devices = bluetooth.discover_devices()

for bdaddr in nearby_devices:
    if target_name == bluetooth.lookup_name(bdaddr):
        target_address = bdaddr
        break

if target_address is not None:
    print("found target bluetooth device with address", target_address)
else:
    print("could not find target bluetooth device nearby")
I know I am simply quoting examples from these sites; you may refer to them to gain more insight.
Once I have a working prototype I will try to post it here in the future.
I am new to Android Studio and I have the task of developing an app which transfers data (acceleration sensor data; I have already created an app which shows the data) to MATLAB on the PC.
I don't really know how I should do this. I've experimented a bit with Bluetooth apps, but I don't have a clue how to connect to MATLAB.
I would be grateful for your help.
Thanks in advance,
Annika
Unfortunately I cannot speak to the Android side of things, but MATLAB can connect to generic devices over the UART interface, which is fairly low level.
The process with some microprocessors that I am using is to connect the device to the PC and then note the outgoing COM port.
(In Windows 10, these can be found in Bluetooth settings -> More Bluetooth options.)
Then you can use
s = serial('COM<what you found in settings>');
s.BaudRate = 115200;
s.InputBufferSize = 100;
fopen(s);
serials = instrfindall;
to open a connection. The critical command is serial; the other parameters depend on your device/configuration. Sometimes there can be issues, in which case one option is to build a loop that tries again until it works.
You then collect the data sent over UART via
flushinput(serials);
temp = fscanf(serials,'%s');
and then split the string. If the data is sent continuously, you wrap this in a while loop.
After you are done, you can clean up via
fclose(s);
delete(instrfind)
instrreset
It should be noted that establishing a connection takes longer the more enabled COM ports there are, so it might be worth disabling the ones you don't need.
For more specific things MATLAB can do, check out "What Is the MATLAB Serial Port Interface".
My new surveillance camera just arrived, so I'm trying to write an app to live stream the video from it.
Since it came with basically no documentation, I installed the 'onvifer' Android app, which allows you to browse the camera's capabilities. This app works fine - gets the video and allows PTZ controls, etc. It reports the streaming URL as:
rtsp://192.1.0.193:554/mpeg4
I tested the stream in the VLC Windows client, and it's able to stream video from that URL as well. This makes me comfortable that the network is working OK.
The camera states the feed will be 1920x1080; VLC confirms this.
The basic code in my activity:
VideoView videoView = (VideoView)this.findViewById(R.id.VideoView);
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4"));
videoView.requestFocus();
videoView.start();
I've also given the app INTERNET permissions in AndroidManifest.xml, disabled authentication on the camera, and am running on a real device (not the emulator).
When I run the app, LogCat shows this immediately:
setDataSource IOException happend :
java.io.FileNotFoundException: No content provider: rtsp://192.1.0.193:554/mpeg4
at android.content.ContentResolver.openTypedAssetFileDescriptor (ContentResolver.java).
About 15 seconds later, the app shows a "Can't play this video" modal dialog box and this is added to LogCat:
MediaPlayer error (100, 0)
AudioSystem AudioFlinger server died!
MediaPlayer error (100, 0)
VideoView Error: 100,0
I've googled everything I can think of, but haven't found anything useful.
Any thoughts?
Wild-ass guess based on your logcat and the RC=100: there is no SDP file, or no RTSP equivalent of the 'moov atom' block required to negotiate details of the stream/container/codec/format. You can get the AOSP code for MediaPlayer/VideoView and grep for the RC value in the source.
RTSP is gnarly to debug (note the tools links below) and is not assured to run inside a NAT'd network due to UDP issues. So, to get a better result, you may have to look into forcing your config to do the data channel over TCP and not UDP. Or it could be other issues, of which there are many.
If you really want to investigate, some possible tools below:
Use command line and CURL client to request your stream:
Android - Java RTSP Session Mgmt package on Git
Protocol dumps for CLI RTSP sessions to Youtube RTSP/SDP streams
To pursue the issue, you may need to get into the weeds with debug tools that track details of the protocol negotiation that precedes the MediaPlayer actually starting to play the stream. That would include learning the RFC and the protocol details.
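If you just want to see the raw error codes instead of the generic "Can't play this video" dialog, here is a minimal sketch (assuming the same videoView as in the question; the "RtspTest" log tag is only a placeholder):
// needs: import android.media.MediaPlayer; import android.util.Log;
videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // what == 100 is MediaPlayer.MEDIA_ERROR_SERVER_DIED, matching the logcat above
        Log.e("RtspTest", "MediaPlayer error what=" + what + " extra=" + extra);
        return true; // returning true suppresses the default error dialog
    }
});
videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        Log.i("RtspTest", "stream prepared, playback can start");
    }
});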
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4"));
Try your app on another phone.
You may find the problem is with the mobile device.
Try this path:
rtsp://218.204.223.237:554/mobile/1/4C024DFE77DC717D/onnuvesj43xj7t26.sdp
and see whether the code has something wrong.
Does anyone know if there is an equivalent to MPVolumeView available in Android?
Basically, it's a built-in component in iOS that can present users with a system volume slider and/or (what I'm really after in Android) a list of available Bluetooth / AirPlay audio output options (i.e. Bluetooth speakers).
Is there any easy option for listing Bluetooth etc. audio routing options in Android, or do you have to write all the scanning, connecting, and audio routing code yourself?
From the lack of response, and the endless googling around the subject of connecting to A2DP Bluetooth sinks that I've been doing lately, the sad answer to this question seems to be no: there is nothing quite like MPVolumeView in Android.
The nearest thing would be to either write it all yourself, as I feared, or to simply pop open the system Bluetooth options window from your app (ensuring you've got the BLUETOOTH_ADMIN permission set up in your manifest first):
Intent intentOpenBluetoothSettings = new Intent();
intentOpenBluetoothSettings.setAction(android.provider.Settings.ACTION_BLUETOOTH_SETTINGS);
startActivity(intentOpenBluetoothSettings);
I want to create an Android application that is capable of receiving an audio stream. I thought of using the A2DP profile, but it seems as if Android doesn't support the A2DP sink role. It looks like there are a lot of people searching for a solution to this problem. But what about receiving an ordinary bit stream, and then converting the data into audio in the application? I was thinking of receiving a PCM or MP3 data stream via RFCOMM (the SPP Bluetooth profile) and then playing it using AudioTrack.
First, how do I receive a bit stream on my Android phone via RFCOMM? And is it possible to receive a bit stream via RFCOMM as a PCM or MP3 stream?
Second, if it isn't possible to receive a bit stream via RFCOMM as a PCM or MP3 stream, how do I convert the received bit stream into audio?
Third, how do I convert the received data into audio AND play the audio simultaneously, in "real time"? Can I just use onDataReceived?
To be clear, I'm not interested in using the A2DP profile! I want to stream the data via RFCOMM (the SPP Bluetooth profile). The received data stream will be PCM or MP3. I thought of writing my own app, but if anyone knows of an app that solves this, I'd be glad to hear about it! I'm using Android 2.3 Gingerbread.
/Johnny
No. Trying to write an Android application that handles this will not be the solution, at least not if you want to use the A2DP sink role.
The fact is that Android, as you mentioned, does not implement the API calls to BlueZ (the Bluetooth stack Android used up to Jelly Bean 4.1) regarding A2DP sink capabilities. You have to implement them yourself. I will try to guide you, as I was also interested in doing this myself in the recent past.
Your Bluetooth-enabled Android device advertises itself as an A2DP source device by default. You have to change this first, so nearby devices can recognize your device as a sink. To do this, you must modify the audio.conf file (usually located in /etc/bluetooth/) and make sure the Enable key exists and the value Source is attached to it, so you get something like:
Enable=Source
Reboot; nearby devices should now recognize your device as an A2DP sink.
Now you will have to interact with BlueZ to react appropriately when an A2DP source device starts to stream audio to your phone.
Android and BlueZ talk to each other via D-Bus. In fact, Android connects to the DBUS_SYSTEM channel and listens to every BlueZ advertisement, such as events, file descriptors, and so on.
I remember having successfully bound a native application to this D-Bus channel and getting access to the various events BlueZ was posting. This is relatively easy to achieve using the BlueZ API available here as a reference. If you go this way, you will have to build a native application (C/C++) and compile it for your platform. You should be able to do this using the Android NDK.
If you find it difficult to use D-Bus, you can try this Java library I just found that handles the communication to D-Bus for you: http://jbluez.sourceforge.net/. I have never used it, but it is worth a try in my opinion.
What you really have to do is find out when an A2DP source device is paired to your phone and when it starts to stream music. You can retrieve these events through D-Bus. Once somebody tries to stream music, you need to tell BlueZ that your native application is going to handle it. There is a pretty good document that explains the flow of events that you should handle to do this; it is accessible here. The part you're interested in starts on page 7. The sink application in the given example is PulseAudio, but it could be your application as well.
BlueZ will hand you a UNIX socket when you call the org.bluez.MediaTransport.Acquire method. Reading from this socket gives you the data that is currently being streamed by the remote device. But I remember being told by someone working on the BlueZ stack that the data read from this socket is not pure PCM audio, but encoded audio content instead. The data is generally encoded in a format called SBC (Low Complexity Subband Coding).
Decoding SBC is not very difficult; you can find a decoder right here.
The ultimate step would be to forward the decoded PCM audio to your speakers.
To prevent you from getting stuck, and in order to test your application more easily, you can use the dbus-send binary that should be available on your Android system. It is located in /system/bin.
Quick tests you can make before doing any of the above might be:
Get the adapter list:
dbus-send --system --dest=org.bluez --print-reply / org.bluez.Manager.GetProperties
This returns an array of adapters with their paths. Once you have these path(s) you can retrieve the list of all the bluetooth devices paired with your adapter(s).
Get paired devices:
dbus-send --system --print-reply --dest=org.bluez /org/bluez/{pid}/hci0 org.bluez.Adapter.GetProperties
This gives you the list of paired devices within the Devices array field.
Once you have the list of devices paired to your Bluetooth Adapter, you can know if it is connected to the AudioSource interface.
Get the devices connected to the AudioSource interface:
dbus-send --system --print-reply --dest=org.bluez /org/bluez/{pid}/hci0/dev_XX_XX_XX_XX_XX_XX org.bluez.AudioSource.GetProperties
Hope this helps.
Another workaround is using the Hands-Free Profile.
In Android, BluetoothHeadset covers that.
Wait until the status changes to BluetoothHeadset.STATE_AUDIO_CONNECTED,
then you can record audio from the Bluetooth headset.
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mMediaRecorder.setOutputFile(mFilename);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
try {
mMediaRecorder.prepare();
} catch (IllegalStateException e) {
// prepare() was called before the recorder was fully configured
e.printStackTrace();
} catch (IOException e) {
// the output file could not be opened or written
e.printStackTrace();
}
mMediaRecorder.start();
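A minimal sketch of the "wait for STATE_AUDIO_CONNECTED" part, assuming you trigger the recording from a BroadcastReceiver (startRecording() is a hypothetical helper wrapping the MediaRecorder code above):
// needs: import android.bluetooth.BluetoothHeadset; import android.bluetooth.BluetoothProfile; import android.content.*;
IntentFilter filter = new IntentFilter(BluetoothHeadset.ACTION_AUDIO_STATE_CHANGED);
registerReceiver(new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent intent) {
        int state = intent.getIntExtra(BluetoothProfile.EXTRA_STATE,
                BluetoothHeadset.STATE_AUDIO_DISCONNECTED);
        if (state == BluetoothHeadset.STATE_AUDIO_CONNECTED) {
            startRecording(); // hypothetical helper that runs the MediaRecorder code above
        }
    }
}, filter);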
[Irrelevant but works] This hack only serves MP3 streaming via a WiFi hotspot (I use it in my car, which has only an AUX input):
Install the app AirSong,
Turn on wifi hotspot,
Connect the other device to that hotspot,
Access 192.168.43.1:8088 from the device's browser and you are on.
(Wondering why "192.168.43.1" only? Because that's the default gateway of any device connected to an Android hotspot.)
audio.conf seems to be missing in Android 4.2.2?
To receive a PCM audio stream via RFCOMM, you can use the code flow explained in "Reading Audio file in C and forwarding over bluetooth to play in Android Audio track" as a hint, with one change: change the frequency used while initializing from 44100 to 22050.
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 22050, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT, 10000, AudioTrack.MODE_STREAM);
Note: this streaming still contains some noise, but your "receiving a PCM data stream via RFCOMM (SPP Bluetooth profile), and then playing it using AudioTrack" will work.
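A rough sketch of the receive-and-play loop, assuming socket is an already connected android.bluetooth.BluetoothSocket opened with the SPP UUID (the socket variable and the buffer size are assumptions, not taken from the linked post):
// needs: import android.media.*; import java.io.InputStream; import java.io.IOException;
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 22050,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT,
        10000, AudioTrack.MODE_STREAM);
track.play();
try {
    InputStream in = socket.getInputStream();
    byte[] buffer = new byte[1024];
    int read;
    while ((read = in.read(buffer)) > 0) {
        // write() blocks until the bytes have been queued for playback
        track.write(buffer, 0, read);
    }
} catch (IOException e) {
    e.printStackTrace();
} finally {
    track.stop();
    track.release();
}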
I'm trying to achieve this goal: I have a Bluetooth device connected to my Android phone. Given that this device can vibrate, is it possible to send a vibrate command to the device?
I have researched the Bluetooth APIs and the vibrate API on the Android developers' site, but nothing clearly answers my question.
In the Bluetooth HFP (Hands-Free) spec, there is no command (AT command) to make the peer device vibrate. As per the technical specification of the bracelet ("Vibration prompt for incoming phone"), once an AT command is sent from the phone to the bracelet indicating an incoming call, the bracelet generates the vibration locally; the phone does not ask/request it to vibrate.
If you want the peer device to vibrate, you may have to use your own defined commands (AT commands), with commands recognized by the peer device.
I'm very glad to update this thread: I was able to programmatically vibrate the BT bracelet, and in the right way, using AT commands. My application is based on the Bluetooth Chat application in the Android samples. Here are the steps I followed:
Connect the device to the phone.
The device follows the BT Hands-Free profile specification, which requires "handshaking" by means of AT commands.
So I mimic this exchange of commands (with hard-coded responses to the bracelet's commands).
Once the handshake is complete, I send the RING and +CLIP: AT commands based on the BT Hands-Free profile specification.
On receiving the last set of responses from the application, the bracelet vibrates.
It seems simple, but without knowledge of the BT Hands-Free profile specification and sample AT commands, this was nearly impossible.
#bt_user: Thanks for your pointer, that put me on the right track of R&D.
I was fiddling with exactly the same problem. After days of trial and error, I finally got it to work. I think it depends on the speed at which you answer the HF's commands, as well as on the correct line endings ([13][10]"COMMAND"[13][10]).
Here is my DroidScript which works. It's not cleaned up, but it works.
https://gist.github.com/t-oster/68a568ac4c4e133f67ac
The exact sequence, which works for my bracelet is:
CR is ASCII Code 13 and LF is ASCII Code 10
> AT+BRSF=0<cr>
< <cr><lf>+BRSF:0<cr><lf>
< <cr><lf>OK<cr><lf>
> AT+CIND=?<cr>
< <cr><lf>+CIND: ("service",(0,1)),("call",(0,1))<cr><lf>
< <cr><lf>OK<cr><lf>
> AT+CIND?<cr>
< <cr><lf>+CIND: 1,0<cr><lf>
< <cr><lf>OK<cr><lf>
> AT+CMER=3,0,0,1<cr>
< <cr><lf>OK<cr><lf>
from then on I can just send
<cr><lf>RING<cr><lf>
to make it vibrate.
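For reference, here is a rough Java sketch of sending those responses over the RFCOMM connection (socket is assumed to be a connected android.bluetooth.BluetoothSocket; a real implementation should read and match each incoming AT command from the InputStream before replying instead of writing blindly):
// needs: import java.io.OutputStream; import java.io.IOException;
String[] responses = {
    "+BRSF:0", "OK",                                      // answers AT+BRSF=0
    "+CIND: (\"service\",(0,1)),(\"call\",(0,1))", "OK",  // answers AT+CIND=?
    "+CIND: 1,0", "OK",                                   // answers AT+CIND?
    "OK",                                                 // answers AT+CMER=3,0,0,1
    "RING"                                                // from here on, RING triggers the vibration
};
try {
    OutputStream out = socket.getOutputStream();
    for (String r : responses) {
        // every frame is <cr><lf>payload<cr><lf>, i.e. "\r\n" + payload + "\r\n"
        out.write(("\r\n" + r + "\r\n").getBytes("US-ASCII"));
        out.flush();
    }
} catch (IOException e) {
    e.printStackTrace();
}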
I have developed a Java Android version, using parts from Thomas Oster's code and an example I found online:
https://gist.github.com/shimondoodkin/a582d910045ab06ab68c