Where to find Bluedroid config files on Android 5.0

I want to use a Lenovo Tab2 A7 (Android 5.0, rooted) as an A2DP sink. To do this, I have to edit the files:
/external/bluetooth/bluedroid/include/bt_target.h
and
/packages/apps/Bluetooth/res/values/config.xml
But I can't find them. I searched every folder on the tablet. I also searched the source provided by Lenovo:
http://support.lenovo.com/de/de/products/tablets/a-series/a7-10-2/downloads/ds105762
Can somebody give me a hint?

The files bt_target.h and config.xml do not actually exist on the Android device; they are part of the AOSP source tree. To enable the A2DP sink you have to dig into the code, go to the file locations you mentioned, make the changes, and then compile either a complete image or just bluedroid.
Also, your device needs to be rooted if you only want to replace the bluedroid library. If you build a complete image, you have to flash the whole image anyway.
Unfortunately, this is not yet finished for the A2DP sink. With the above steps you will be able to enable the sink profile, connect to it, and start streaming from a remote device to your device, but the streamed data cannot be played on your device. The reason is that AOSP currently does not contain the code that would play the streamed audio through your phone's speaker.
Refer to the link below for more info:
A2DP sink discussion on android
But wait, there is one piece of good news, with a limitation though: Samsung Galaxy S7 and later phones do support the sink role. Right now, however, the sink functionality is limited: Galaxy S7 and later phones can currently only connect to Tizen Smart TVs, through the built-in Quick Connect app.

Related

Unable to record from USB microphone on Chrome on Android (Lollipop)

Based on a previous thread, getUserMedia is now supported in Chrome on Android (version 40).
I got this working using the usual getUserMedia idiom (see, for example, http://www.html5rocks.com/en/tutorials/getusermedia/intro/).
This works on desktop (any mic) and on mobile (default mic).
However, when a USB mic is plugged in, this does not seem to work. The usual permission dialog is shown, but the input audio actually comes from the phone's default mic (i.e., scratching/tapping the USB mic produces no noise), even though the USB mic is clearly connected (i.e., it lit up).
On desktop, you can fix this by selecting the audio input source, but I'm not sure how to do that on Android.
The really funny thing is that typical audio recording apps can correctly get the audio input from the USB mic on the same device.
Has anyone experienced this problem?
I can reproduce this issue; it appears to be specific to some of the Android phones and tablets that I have tested, including a Samsung Tab A7 and the Amazon Fire 10".
Currently testing a Samsung Tab A7 with Chrome 103.0.5060.129 on Android 11; SM-T220 Build/RP1A.200720.012.
I do not have this issue in Chrome on my OnePlus 8 running Android 11.
More information for reproducing:
The expected behavior does not occur in Chrome, but it works as expected in Firefox.
Plug in an external USB microphone device
Open a site such as https://vdo.ninja/ or a https://www.webrtc-experiment.com/
Capture audio.
The default audio will be captured from the device's internal microphone instead of the USB microphone.
On sites such as vdo.ninja, where you are able to select the audio source, the USB input may be listed (e.g. "Wired Headset"), but changing the input still results in the internal microphone being used.

Android Capture USB Mic Audio without root

My Android application is targeting non-rooted Samsung Galaxy Note 2 phones running Android 4.4.2.
When my USB microphone is plugged into the phone, it appears as /dev/snd/pcmC1D0c, so I know it is being recognized as an audio capture device.
Is there a way to capture the Audio from this device into my app without rooting the Galaxy Note 2?
I tried AudioRecord with all the MediaRecorder.AudioSource options and they all use the phone's internal mic.
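For reference, this is roughly the AudioRecord setup I have been testing with (a minimal sketch; the AudioSource constant is the part I kept swapping out, and none of the values routed audio from the USB device):

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    public class UsbMicTest {
        // Minimal capture loop; only the AudioSource constant was varied
        // (MIC, CAMCORDER, VOICE_RECOGNITION, VOICE_COMMUNICATION, DEFAULT).
        // Requires the RECORD_AUDIO permission.
        public static void record() {
            int sampleRate = 44100;
            int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);

            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    sampleRate, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, bufferSize);

            short[] buffer = new short[bufferSize / 2];
            recorder.startRecording();
            for (int i = 0; i < 100; i++) {
                // Audio always arrives from the built-in mic, whatever the source
                recorder.read(buffer, 0, buffer.length);
            }
            recorder.stop();
            recorder.release();
        }
    }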
I did not find any option in Settings/Sound to change the input device.
I investigated Samsung's Professional Audio SDK, and it looked very promising until I found that the Galaxy Note 2 device I am targeting is not supported by this SDK.
If there is no way to access it from Java, is it possible to use native C code (JNI) to access the existing tinyALSA driver and capture the audio without rooting the phone?
I know I can use libusb to capture the audio packets directly, but I was hoping for a cleaner/easier way since the phone already recognizes my audio input device.

Audio is not coming through properly on Android 4.0.3

I am developing a SIP-based application to make and receive calls. Recently I tested the application on different devices. First, I tested with a Samsung Galaxy 5 (2.2) and a Samsung Galaxy Y (2.3.6); during a call the audio was fine on both sides.
But when I tried with the Galaxy Y (2.3.6) and a Sony Xperia (4.0.3), the voice was one-sided. The application worked fine with 2.2 and 2.3.6, but while testing with 2.3.6 and 4.0.3 there was an audio issue on ICS.
Does anyone have an idea about this issue, or has anyone had this kind of problem? Please give me some pointers.
Thanks
EDIT:
I am using codecs in the application and I have created the .so file for those codecs, but only for the armeabi ABI. I think Android 4.0.3 devices use armeabi-v7a, so do I need to create the .so file for the armeabi-v7a CPU as well?
armeabi also runs on armeabi-v7a.
Anyway, the processor type is device-dependent, not OS-dependent.
You can have armeabi-v7a even on Gingerbread...
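If you want to confirm which ABIs a given device reports, here is a quick sketch using the standard Build fields (on releases of that era only CPU_ABI and CPU_ABI2 exist; Build.SUPPORTED_ABIS came later):

    import android.os.Build;
    import android.util.Log;

    public class AbiCheck {
        private static final String TAG = "AbiCheck";

        // Logs the ABIs the device claims to support; a plain armeabi .so
        // will still load on a device whose primary ABI is armeabi-v7a.
        public static void logAbis() {
            Log.d(TAG, "Primary ABI: " + Build.CPU_ABI);
            Log.d(TAG, "Secondary ABI: " + Build.CPU_ABI2);
        }
    }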
You need to find the exact point where the packets/audio are lost.
In SIP applications, the chain of possible failure points is quite long.
Can you verify with Shark whether RTP packets are received on both devices even when you get no audio? (You need root to run Shark/tcpdump on your device.)
Otherwise, try forcing an RTP proxy and run Wireshark on the proxy.
If packets are transmitted in both directions, check where they get lost and whether they reach the encoder.
Put logs and checks everywhere in the audio-transmission pipeline until you know where the packets go missing, and give us more information.
As sebweisgerber states, there are several possible failure points. If the call is properly established but no audio is heard, the problem should be in audio capture/playback, audio encoding/decoding, or audio transmission. Taking into account that this works on 2.3.6 but not on 4.0.3, I would bet on audio capture/playback or the codecs.
Are you working with third-party libraries for encoding/decoding? If these libraries are trying to access the system's codec libraries, as the csipsimple project does, note that most of the decoding library names changed from Gingerbread to ICS, which would explain the "one-sided" behaviour.
Hope this helps.

Communicate with external MIDI device from Android device

What classes are available on the Android platform to communicate (in/out) with an external MIDI device? I have an HTC Desire smartphone; it has a USB port, so I'd assume it is possible to connect it to a MIDI synthesizer using a standard USB cable plus a [Type A -> Mini A] converter. I'd like to write a MIDI sequencer app that would be able to record a MIDI stream from the synthesizer and then play it back later.
Short answer: none. Slightly longer answer: on the HTC Desire there is no built-in support for USB host mode (which you need, since the USB-MIDI adapter would be the USB client).
(Android 3.1 does have some support for USB Host mode, but that's not available for the HTC Desire)
If you're not afraid of a soldering iron, you could go the midi-over-bluetooth route: http://nettoyeur.noisepages.com/2011/01/midi-over-bluetooth-part-iii-new-hardware/
Much has changed on this front. The following library allows MIDI I/O with a USB OTG adapter on API 12+:
https://github.com/kshoji/USB-MIDI-Driver
It's far from perfect, and in my testing, pretty crashy, but it should at least be a good starting point for someone looking for the relevant classes.
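As a later addendum: on devices running Android 6.0 (API 23) or newer, the platform itself ships the android.media.midi package, so no third-party driver is needed there. A minimal, untested sketch of sending a note-on message to the first attached device, just to point at the relevant classes:

    import android.content.Context;
    import android.media.midi.MidiDevice;
    import android.media.midi.MidiDeviceInfo;
    import android.media.midi.MidiInputPort;
    import android.media.midi.MidiManager;
    import android.os.Handler;
    import android.os.Looper;

    import java.io.IOException;

    public class MidiNoteSender {
        // Sends a single note-on to the first MIDI device the system reports.
        // Requires API 23+ and a device with the android.software.midi feature.
        public static void sendNoteOn(Context context) {
            MidiManager midiManager = (MidiManager) context.getSystemService(Context.MIDI_SERVICE);
            MidiDeviceInfo[] devices = midiManager.getDevices();
            if (devices.length == 0) {
                return; // no MIDI device attached
            }
            midiManager.openDevice(devices[0], new MidiManager.OnDeviceOpenedListener() {
                @Override
                public void onDeviceOpened(MidiDevice device) {
                    if (device == null) {
                        return; // device could not be opened
                    }
                    // "Input" port is named from the device's point of view: we write to it.
                    MidiInputPort port = device.openInputPort(0);
                    if (port == null) {
                        return;
                    }
                    byte[] noteOn = {(byte) 0x90, (byte) 60, (byte) 100}; // note-on, middle C, velocity 100
                    try {
                        port.send(noteOn, 0, noteOn.length);
                    } catch (IOException ignored) {
                        // ignored for this sketch
                    }
                }
            }, new Handler(Looper.getMainLooper()));
        }
    }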
As #edovino said, you need USB host mode, and then drivers.
If you don't mind rooting around your phone, and the hardware supports it, you do have some options. Check out this link: http://sven.killig.de/android/N1/2.2/usb_host/
This guy was able to get audio, video, keyboard, and some other stuff working. He includes audio/MIDI drivers.
Yes, as pointed out above, it is not feasible using the phone because it does not provide USB host capability. I have been working religiously over the past few months to make an XY controller for my synthesizer so that I can transmit controller values to use in performances. I reckon that is what you want to do, with additional functionality.
Bluetooth is definitely an option, and if you look up libpd, Peter Brinkmann himself has addressed this issue and acknowledged that Bluetooth dongles for MIDI are not far away.
Wi-Fi is also something one may be willing to look at. Using rtpMIDI, we can create sessions on the PC side of things and use just about any Wi-Fi enabled MIDI device to transmit/receive data. If you are looking to control software synthesizers using a phone, this seems ideal. TouchDAW and TouchOSC, applications on Android, make use of this feature.
With the USB Android driver, the only problem I see is the lack of support for isochronous transfers in USB host mode, so latency cannot be guaranteed to be deterministic. But, with no other accessories attached to the bus, the performance seems pretty decent in my tests.

Redirecting/duplicating the UI to an external output

Is it possible in the Android framework to duplicate what is displayed on the main display (UI)?
I have a situation where I need to demonstrate my app to many people, and it would be easier to do if I can duplicate the screen contents to an external monitor/TV. I am not married to the idea of using the HDMI port, I would be happy doing this through Wi-Fi or Bluetooth or USB if need be. What I am looking for is to see if I can do something similar to what Windows does by default when a second monitor is connected.
I have been through the developer's documentation and haven't been able to find anything that would allow me to do this, but it would not be the first time I've missed something. Specifically I need to do this with an HTC Evo.
Your options are limited, mostly by your choice of device. The HTC EVO's HDMI port will only play back apps via the built-in Gallery application (videos and still photos).
You will need to use a "software projector" like Droid#Screen -- attach your EVO to an Android SDK-equipped notebook that is connected to a projector. Droid#Screen will display the EVO's screen on the notebook (and, from there, on the projector). However, the frame rate is limited to about 5-6 fps, due to limitations in the SDK tools that Droid#Screen leverages.
Or, get your hands on an HTC Droid Incredible, which supports composite output to TVs of anything on the main display via a special cable. The Samsung Galaxy Tab also supports this for anything that does not involve a SurfaceView, based on my experimentation to date. Some versions of the Samsung Galaxy S also support this, at least to some extent.
Or, use a webcam.
Or, use an ELMO (basically a webcam designed for document or device projection).
You can write a UiCloningService in JNI that exposes a JNI method to clone the display. Since Android is based on Linux, it usually uses the Linux framebuffer to represent display devices as device nodes under /dev/fb* or /dev/graphics/fb*, where '*' can be 0, 1, 2, ... depending on the number of displays connected.
As your device already has an HDMI port, it would be exposed via /dev/graphics/fb1, with fb0 being your default LCD display.
In the cloning service, you can then write to the device attribute files created for the HDMI port under sysfs; if the display driver of your device has implemented those features (which it most probably has, otherwise what would be the point of having an external HDMI port), those driver functions will be responsible for cloning the UI from your primary display to the secondary display.
But you would have to write the UI cloning service in JNI (device manufacturers usually provide such methods, if they provide an Android SDK at all for development on that particular device).
For example, I have attached below a UiCloningService.cpp that has a cloning JNI function for Android Gingerbread on an OMAP3 platform:
UiCloningService.cpp
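For context, the Java side of such a service is only a thin wrapper around the native method; everything below (library name, method name, framebuffer indices) is hypothetical and merely illustrates the JNI boundary that a file like UiCloningService.cpp would implement:

    public class UiCloningService {
        static {
            // Hypothetical name of the native library built from UiCloningService.cpp
            System.loadLibrary("uicloning");
        }

        // Hypothetical native entry point: copies frames from the source
        // framebuffer node (e.g. /dev/graphics/fb0, the LCD) to the destination
        // (e.g. /dev/graphics/fb1, the HDMI output). Returns 0 on success.
        private static native int nativeCloneDisplay(int srcFbIndex, int dstFbIndex);

        public static boolean cloneLcdToHdmi() {
            return nativeCloneDisplay(0, 1) == 0;
        }
    }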
