I am currently working on an Android audio visualizer that uses connected smart home devices (such as Philips Hue) to visualize music. I am having trouble compensating for the delay caused by Bluetooth speakers.
I have noticed that the YouTube app, unlike many other video streaming apps, has almost no delay between audio and video when using Bluetooth speakers.
I figure that they must compensate for the Bluetooth latency by adjusting (delaying) the video signal.
Is there a reasonable way of detecting or approximating the latency/delay of Bluetooth speakers?
Thank you very much for your help!
Best,
Stefan
This would not work for Bluetooth speakers. However, I have been able to measure the audio latency of a Bluetooth dongle using Google's Dr. Rick O'Rang loopback dongle and Glenn Kasten's test app.
The AVDTP 1.3 protocol supports delay reporting. Try looking to see whether Android exposes this data.
It's in there somewhere: AVDT_PSC_DELAY_RPT is present in https://android.googlesource.com/platform/system/bt/+/master/stack/include/avdt_api.h#153.
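If the delay report isn't exposed on your device, one way to approximate the output latency from the app side is to compare an AudioTrack's presentation timestamp against the frames you have written so far. This is only a rough sketch, assuming API 19+ and that the A2DP route fills in meaningful timestamps (which varies by device); the class and method names are mine, not a platform API:

```java
// Rough sketch: estimate output latency from an active AudioTrack using
// AudioTrack#getTimestamp (API 19+). framePosition/nanoTime report when a
// given frame was (or will be) presented, so the gap between the last frame
// written and that presentation time approximates the pipeline delay.
import android.media.AudioTimestamp;
import android.media.AudioTrack;

public final class LatencyEstimator {

    /**
     * Returns an estimated output latency in milliseconds, or -1 if no
     * timestamp is available yet (e.g. right after playback starts).
     *
     * @param track         a playing AudioTrack
     * @param framesWritten total frames written to the track so far (the app
     *                      has to count these itself around write())
     */
    public static long estimateLatencyMs(AudioTrack track, long framesWritten) {
        AudioTimestamp ts = new AudioTimestamp();
        if (!track.getTimestamp(ts)) {
            return -1; // timestamps are not available on every route/device
        }
        long pendingFrames = framesWritten - ts.framePosition;
        // Time at which the most recently written frame should be heard.
        long playbackHeadNs = ts.nanoTime
                + (pendingFrames * 1_000_000_000L) / track.getSampleRate();
        return (playbackHeadNs - System.nanoTime()) / 1_000_000L;
    }
}
```

How useful the estimate is depends entirely on how honestly the Bluetooth stack reports the presentation timestamp; on some devices it only reflects the point where audio leaves the Android side.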
Is it possible to play 2 different sounds simultaneously from 2 built-in speakers (1 = main speaker, 2 = earpiece speaker) on Android, preferably using the Oboe C++ library?
A similar question was asked in this thread, but for 2 different audio devices. In my case, I just want to play on the same audio device but through 2 different speakers.
Recording data from the built-in microphones simultaneously using stereo channels was possible; a similar approach for the speakers didn't work.
Any help much appreciated, thank you.
Note:
The term Audio Device refers to a device capable of receiving or sending audio. An audio device can have multiple microphones and/or speakers attached to it, and these are represented as different channels.
Debugging device is Google Pixel XL running Android 9
If the audio device has 2 speakers then you should be able to play different sounds through each speaker by supplying different data to each channel.
However, I believe the speakers you're referring to:
2 built-in speakers (1=main speaker, 2=earpiece speaker)
are actually 2 separate audio devices, each with a single speaker. In that case you won't be able to use them at the same time, although hacks might exist to make the speakers part of the same audio device (I haven't tried this).
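For reference, if the two speakers ever are exposed as channels of one stereo output device, the per-channel approach could look like the following sketch. It uses the Java AudioTrack API rather than Oboe, purely for illustration; the class name and tone parameters are mine:

```java
// Rough sketch: play two different tones through the left and right channels
// of a single stereo output device. This only demonstrates "different data per
// channel"; it does not make two separate audio devices play at once.
import android.media.AudioAttributes;
import android.media.AudioFormat;
import android.media.AudioTrack;

public final class TwoChannelTonePlayer {

    public static void playTones(int sampleRate, double leftHz, double rightHz, int seconds) {
        int frames = sampleRate * seconds;
        short[] interleaved = new short[frames * 2]; // L,R,L,R,...
        for (int i = 0; i < frames; i++) {
            interleaved[2 * i]     = (short) (Math.sin(2 * Math.PI * leftHz  * i / sampleRate) * 12000);
            interleaved[2 * i + 1] = (short) (Math.sin(2 * Math.PI * rightHz * i / sampleRate) * 12000);
        }

        AudioTrack track = new AudioTrack.Builder()
                .setAudioAttributes(new AudioAttributes.Builder()
                        .setUsage(AudioAttributes.USAGE_MEDIA)
                        .build())
                .setAudioFormat(new AudioFormat.Builder()
                        .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                        .setSampleRate(sampleRate)
                        .setChannelMask(AudioFormat.CHANNEL_OUT_STEREO)
                        .build())
                .setBufferSizeInBytes(interleaved.length * 2)
                .setTransferMode(AudioTrack.MODE_STATIC)
                .build();

        track.write(interleaved, 0, interleaved.length); // MODE_STATIC: write before play
        track.play();
    }
}
```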
I am developing an app that makes use of an Android device's microphone and a Bluetooth headset. The audio is transmitted from the microphone to the headset immediately. My use case might greatly benefit from so-called "audio beamforming", where the audio signal coming from a specific physical direction is amplified while sound from all other directions is attenuated. As far as I understand, this technique requires at least 2 audio inputs running synchronously, which in my case would be the microphone of the device itself and the microphone on the Bluetooth headset.
Is this feasible with the Android SDK or NDK? Thanks in advance!
I want to make an app that makes it possible to connect an iPod or MP3 player to my Android device and let the Android speakers function as external speakers.
The ideal situation would be to actually read from the headset output, so I could connect a stereo mini-jack cable.
Is it possible to read from the headset output with the Android SDK?
A second option would be to use a mono mini-jack instead. I could maybe read directly from the microphone input and output the audio like a music player. However, having to use a mono mini-jack would be a huge disadvantage, because most people don't own such a cable.
UPDATE
For my second option I found this link describing a special adapter that goes onto a stereo cable so the iPod output can go into the mic input. It's a TRRS adapter. This works, but it still isn't the ideal solution for me. http://www.techlife.net/2012/12/add-an-audio-input-to-android.html
ANOTHER UPDATE
I did a test with only a mono cable, but it seems that the mic is not recognized, so I really need the TRRS adapter to make sure that the mic input is active. I found some apps that can help me measure the input volume. I think I can achieve my goal for myself with the adapter, but reading from the headset output would be nicer and could actually result in an app worth building.
You need to understand some basic things...
The audio output lets you "take audio out of your device".
It is not an audio input that would let you "feed an audio signal into your device".
So the concept you've presented cannot work, because this socket is not able to receive an audio signal through a normal stereo jack cable (and connector).
You could try to make it work with a device that supports a headphones/mic headset (it's a different kind of 3.5 mm jack connector), the so-called TRRS (four-conductor) connector. But to use it in your project you would probably need some cable/socket soldering and maybe even some sort of microprocessor to help process the signals.
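For completeness, once the signal does arrive through the mic contact (your TRRS-adapter option), the SDK just sees ordinary microphone audio, so a plain record-and-playback loop is enough to route it to the device speaker. A rough sketch, assuming the RECORD_AUDIO permission is already granted, error handling omitted, and run off the main thread:

```java
// Rough sketch: route whatever arrives on the microphone input (e.g. line-level
// audio fed in through a TRRS adapter) straight to the device speaker.
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;

public final class MicToSpeakerLoop {

    public static void run() {
        final int sampleRate = 44100;
        final int encoding = AudioFormat.ENCODING_PCM_16BIT;

        int bufferSize = Math.max(
                AudioRecord.getMinBufferSize(sampleRate, AudioFormat.CHANNEL_IN_MONO, encoding),
                AudioTrack.getMinBufferSize(sampleRate, AudioFormat.CHANNEL_OUT_MONO, encoding));

        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                sampleRate, AudioFormat.CHANNEL_IN_MONO, encoding, bufferSize);
        AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC,
                sampleRate, AudioFormat.CHANNEL_OUT_MONO, encoding, bufferSize,
                AudioTrack.MODE_STREAM);

        short[] buffer = new short[bufferSize / 2];
        recorder.startRecording();
        player.play();
        // Blocking read/write loop; stop by interrupting the thread.
        while (!Thread.currentThread().isInterrupted()) {
            int read = recorder.read(buffer, 0, buffer.length);
            if (read > 0) {
                player.write(buffer, 0, read);
            }
        }
        player.stop();
        recorder.stop();
    }
}
```

Note the loop adds audible latency (roughly the combined buffer sizes), which is exactly the trade-off you would avoid if the headset output could be read directly.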
I am trying to programmatically access the data received from 2 microphones on Android devices.
This raises several questions:
Are there shipping Android devices with 2 microphones (e.g. for stereo recording)? I know there are devices with 2 microphones for echo cancellation / noise reduction, but as far as I could find they can only be accessed as a single microphone for any programmatic purpose.
Are there devices with a microphone / headphone socket supporting stereo external microphones?
Assuming any of the above is positive, is there a way to determine the currently operating microphone setup?
I would appreciate any response!
Thanks,
Yoav
I only found out that, for example, once you plug in a wired headset with a microphone, it doesn't matter what AudioSource you specify in your code - it always gives you the audio stream from the headset mic. I tried to get access to the internal mic using AudioSource.CAMCORDER but without luck. I haven't tried with a wireless (BT) headset though. However, if I plug in headphones (without a mic) it uses the internal microphone. At least this is the outcome on my SGS2 with ICS 4.0. If somebody finds a workaround I would be happy to hear about it as well.
I haven't tried it yet, but maybe the Native Development Kit (NDK) can allow you to access whichever microphone you want at a low level.
If you want to make things a bit simpler, you could consider using OpenSL ES for Android, although I have no idea whether it provides low-level microphone control.
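On newer Android versions than the ones discussed above, you can at least enumerate the input devices the platform exposes (API 23+), which partly answers the "what is the current microphone setup" question. A minimal sketch; whether two physical microphones show up as separate devices or as one multi-channel device remains vendor-specific:

```java
// Rough sketch: list the audio input devices currently exposed by the system
// (API 23+), including their types and supported channel counts.
import android.content.Context;
import android.media.AudioDeviceInfo;
import android.media.AudioManager;
import android.util.Log;
import java.util.Arrays;

public final class MicEnumerator {

    public static void logInputDevices(Context context) {
        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        for (AudioDeviceInfo device : am.getDevices(AudioManager.GET_DEVICES_INPUTS)) {
            Log.i("MicEnumerator", "type=" + device.getType()
                    + " name=" + device.getProductName()
                    + " channels=" + Arrays.toString(device.getChannelCounts()));
        }
    }
}
```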
I would like to know whether it is possible to do A2DP streaming and RFCOMM to the same Bluetooth device concurrently. Would opening a socket for RFCOMM communication cause the A2DP stream to drop? Are there any known issues with this usage, assuming it is possible? Thanks.
Yes, it is possible.
Opening the socket will not cause any issue with A2DP, but depending on the implementation, pushing a lot of data over RFCOMM while A2DP is also streaming might cause some gaps in the stream due to bandwidth constraints.
In case anyone else finds this question and is looking for an answer: I'm doing this, and on my Galaxy Nexus the audio streaming performs consistently, but I see a drop in the data rate on my RFCOMM socket. I don't have exact numbers with me, but performance definitely dips when playing audio over A2DP and then recovers after stopping A2DP. This is for serial communication at 115200 bps.
I should also add that this was done with Android 4.3, which uses a new Bluetooth stack that is not BlueZ.
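For reference, opening the RFCOMM channel alongside A2DP is just the normal SPP socket flow; nothing special is required because A2DP streams over AVDTP/L2CAP separately. A rough sketch, assuming the device is already bonded and the Bluetooth permissions (BLUETOOTH_CONNECT on API 31+) are granted; the class name and payload handling are mine:

```java
// Rough sketch: open an RFCOMM (SPP) socket to a Bluetooth device that may also
// be acting as an A2DP sink, and push some bytes over it.
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.bluetooth.BluetoothSocket;
import java.io.IOException;
import java.io.OutputStream;
import java.util.UUID;

public final class RfcommAlongsideA2dp {

    // Standard Serial Port Profile UUID.
    private static final UUID SPP_UUID =
            UUID.fromString("00001101-0000-1000-8000-00805F9B34FB");

    public static void sendWhileStreaming(String deviceAddress, byte[] payload) throws IOException {
        BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
        BluetoothDevice device = adapter.getRemoteDevice(deviceAddress);

        // A2DP keeps streaming on its own channel; this opens a separate RFCOMM link.
        BluetoothSocket socket = device.createRfcommSocketToServiceRecord(SPP_UUID);
        socket.connect();
        try {
            OutputStream out = socket.getOutputStream();
            out.write(payload);
            out.flush();
        } finally {
            socket.close();
        }
    }
}
```

As noted above, expect the achievable RFCOMM throughput to drop while A2DP is active, since both share the same radio bandwidth.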