How to detect if device has 3.5 mm audio jack? - android

I've gone through many posts and forum articles but haven't found anything related to my requirement. Lots of new Android devices are shipping without a built-in headphone jack, and I don't need to support those devices.
Through PackageManager I could only detect the FEATURE_AUDIO_OUTPUT feature.
Is there any effective way to check whether a 3.5 mm audio jack exists on the device?

I don't think it is possible for an app to determine whether a device has a 3.5 mm audio jack.
From: https://source.android.com/devices/audio/usb#hostApplications
the line:
There are no APIs specific to USB digital audio.
doesn't give me much hope, especially since earlier in that same document, when comparing the TRS mini (audio jack) and USB audio, it notes:
Note: This is an artificial comparison, since a real Android device would probably have both options available.
That may have been true when the document was first written, but as you've noted, it is now less guaranteed.
A scan of the Android Compatibility Definition Document shows that Section 7.8.2 Audio makes no distinction between a 3.5 mm audio jack and a USB audio connection.
One suggestion is to submit a feature request to the Android Issue Tracker with as detailed a use case as possible for why the legacy headphone jack should be a feature flag.
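For reference, here is a minimal sketch of the checks that are currently available. Neither proves that a physical 3.5 mm jack exists: FEATURE_AUDIO_OUTPUT covers any audio output (speaker, USB, Bluetooth), and the wired-headset device types (API 23+) only show up while something is actually plugged in.

    import android.content.Context;
    import android.content.pm.PackageManager;
    import android.media.AudioDeviceInfo;
    import android.media.AudioManager;

    public final class AudioJackChecks {

        // True if the device can output audio at all (speaker, USB, jack, ...).
        public static boolean hasAnyAudioOutput(Context context) {
            return context.getPackageManager()
                    .hasSystemFeature(PackageManager.FEATURE_AUDIO_OUTPUT);
        }

        // True only while wired headphones/headset are actually plugged in (API 23+);
        // it cannot tell you that an unused jack is present on the device.
        public static boolean isWiredHeadsetConnected(Context context) {
            AudioManager audioManager =
                    (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
            for (AudioDeviceInfo device :
                    audioManager.getDevices(AudioManager.GET_DEVICES_OUTPUTS)) {
                int type = device.getType();
                if (type == AudioDeviceInfo.TYPE_WIRED_HEADSET
                        || type == AudioDeviceInfo.TYPE_WIRED_HEADPHONES) {
                    return true;
                }
            }
            return false;
        }
    }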

Related

How to enable second SPI channel of Raspberry Pi 3 on Android Things?

I'm trying to get the second SPI channel, spidev1.x, by adding the lines below to config.txt on a Raspberry Pi 3.
dtparam=spi=on
dtoverlay=spi0-hw-cs
dtoverlay=spi1-1cs
So far I get spidev0.x under /dev, but not spidev1.x. The configuration above works on Raspbian. Is there a way to enable the second SPI channel on Android Things?
The issue here divides into two parts:
1. Enabling the spidev driver in the kernel.
2. Accessing SPI1 using the SpiDevice APIs provided by Peripheral I/O.
Since you cannot see /dev/spidev1.x, you are running into both problems.
For issue #1, there seems to be a link between the usage of UART0 and SPI1 as noted in this RPi forum post. In the latest preview of Android Things (DP2), UART0 is currently shared with the console and routed to the GPIO header pins for use by apps. It's possible that you might have to disable UART0 in order to get SPI1 to work at the kernel level.
However, regarding issue #2, the Peripheral I/O APIs do not currently expose SPI1 in Android Things DP2. So even with the kernel issue resolved there would not be a direct way to access the port from an app. We are working on ways to enable additional ports outside of those pre-defined at build time, but do not have a solution at this time.
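For completeness, here is a minimal sketch of how the SPI bus that is exposed on the Raspberry Pi 3 image is opened through Peripheral I/O. The DP2-era PeripheralManagerService API and the "SPI0.0" bus name are assumptions based on the preview documentation; SPI1 simply does not show up in getSpiBusList().

    import com.google.android.things.pio.PeripheralManagerService;
    import com.google.android.things.pio.SpiDevice;
    import java.io.IOException;

    public class SpiExample {
        public void writeOnSpi0() throws IOException {
            PeripheralManagerService service = new PeripheralManagerService();

            // On the Raspberry Pi 3 preview image this lists only SPI0.0 and SPI0.1;
            // SPI1 is not exposed in DP2.
            System.out.println("Available SPI buses: " + service.getSpiBusList());

            SpiDevice device = service.openSpiDevice("SPI0.0");
            try {
                device.setMode(SpiDevice.MODE0);
                device.setFrequency(1000000);   // 1 MHz clock
                device.setBitsPerWord(8);
                byte[] buffer = {0x01, 0x02, 0x03};
                device.write(buffer, buffer.length);
            } finally {
                device.close();
            }
        }
    }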

Is it possible to make the Gear 2 smartwatch (running tizen) the bluetooth microphone for an android phone that it's paired with?

Basically I want to turn the Tizen smartwatch into a Bluetooth headset for a period of time. We have a Tizen developer and an Android developer handy, and we're willing to build anything necessary to make this work.
This kind of process seems to work with built-in Android applications like the standard phone app, but there doesn't seem to be any documentation online on how an app developer would leverage streaming the mic.
It should be noted that we do need to get the audio into the microphone input on the phone for our third party software to work. It's not as simple as just getting the audio to the phone.
Any help, even someone telling us what isn't possible, will be greatly appreciated.
It is possible to play sound with the HTML audio tag: http://developer.samsung.com/forum/board/thread/view.do?boardName=SDK&messageId=269002&startId=zzzzz~&searchSubId=0000000032&searchType=ALL&searchText=sound
It is possible to capture the sound in a Host Android application
It is possible to exchange data bytes by bluetooth with the accessory SDK: http://developer.samsung.com/samsung-mobile#accessory
The data transfer is quick and efficient, so low-quality sound may work with little delay.
So it certainly is possible, but you'll have to write all the streaming code yourself (or find compatible JavaScript and Android libraries), which is quite a lot of work.
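On the Android host side, once raw PCM bytes arrive from the watch, playing them back is straightforward with AudioTrack. A minimal sketch follows; the 16 kHz mono 16-bit format and the transport callback that delivers the byte chunks are assumptions. Note that this only plays the audio on the phone; there is no public API for routing it into the phone's microphone input for a third-party app.

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    public class PcmPlayer {
        private static final int SAMPLE_RATE = 16000; // assumed capture rate on the watch

        private final AudioTrack track;

        public PcmPlayer() {
            int minBuffer = AudioTrack.getMinBufferSize(
                    SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT);
            track = new AudioTrack(
                    AudioManager.STREAM_MUSIC,
                    SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    minBuffer * 2,
                    AudioTrack.MODE_STREAM);
            track.play();
        }

        // Call this from the accessory SDK's data-received callback.
        public void onChunkReceived(byte[] pcm) {
            track.write(pcm, 0, pcm.length);
        }

        public void release() {
            track.stop();
            track.release();
        }
    }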

Call Audio Stream Modification in Android 4.0 ICS

I've been working on a project that would greatly benefit from call-stream modification. This has been repeatedly said/assumed to be unachievable, as most people believe that the hardware loop for the in-call audio is completely disconnected from the main MCU of the device.
Questions like Stream audio to a phone call Android have received answers stating that it is impossible to access the audio. I agree that this is definitely impossible from the Android API, but it is completely unclear whether the hardware ACTUALLY is disconnected completely.
The Stack Overflow user 'artsylar' said that they were able to modify the 'framework layer' of the Android OS to inject recorded audio into call streams, which would be a huge step forward (see Play an audio clip onto an ongoing call, artsylar's comment on the selected answer). Assuming artsylar's success is valid, there definitely is a way to control the call stream audio by modifying the framework (I assume the telephony base framework in the Android source).
Basically, I completely agree that modifying or controlling the call stream is impossible from the application layer. However, I am interested in customizing the Android OS in the framework or Radio Interface Layer; artsylar seems to have had success, but there is no explanation in the open literature of how. Given the current state of Android technology, could anyone clarify the above and establish whether controlling call audio is possible by modifying the core Android OS, and what a good path to accomplish this goal would be?
I believe that a final clarification on this issue would be of great value to the open-source community.
Thanks!
It's technically possible to inject audio into the voice call uplink on some platforms (Qualcomm's MSM8960 and APQ8064, for example). Support exists at the hardware level and at the device driver level. But to make that functionality available to normal applications you'd have to create a custom Android ROM where you've added all the necessary user-space parts in both the Java layers and native layers of Android.
So the short answer is: no, there's no standard way of doing this as an app developer (doesn't matter if you use the SDK or NDK).
If you're working for an OEM or by some other means are able to build and flash your own Android ROMs you can probably get the information you need by asking your platform vendor.
It is very difficult to do, because it involves working with the Linux kernel inside the Android OS.
Not only is there no API support, but the security model does not allow it either.
As professionals in software engineering, we should not assume that anyone's claimed success with such a project is valid until the project has been tested.
Streaming audio during a call also raises privacy and security concerns for both smartphone users and the telephony service provider.

Communicate with external MIDI device from Android device

What classes are available on the Android platform to communicate (in/out) with an external MIDI device? I have an HTC Desire smartphone; it has a USB port, so I'd assume it is possible to connect it to a MIDI synthesizer using a standard USB cable + [Type A -> Mini A] converter. I'd like to write a MIDI sequencer app that would be able to record a MIDI stream from the synthesizer and then play it back later.
Short answer: None. Slightly longer answer: On the HTC Desire there is no built-in support for USB host mode (which you need, since the USB-MIDI adapter would be the USB client).
(Android 3.1 does have some support for USB Host mode, but that's not available for the HTC Desire)
If you're not afraid of a soldering iron, you could go the midi-over-bluetooth route: http://nettoyeur.noisepages.com/2011/01/midi-over-bluetooth-part-iii-new-hardware/
Much has changed on this front. The following library allows for MIDI I/O with a USB OTG adapter on API 12+:
https://github.com/kshoji/USB-MIDI-Driver
It's far from perfect, and in my testing, pretty crashy, but it should at least be a good starting point for someone looking for the relevant classes.
As @edovino said, you need USB host mode, and then drivers.
If you don't mind rooting around in your phone, and the hardware supports it, you do have some options. Check out this link: http://sven.killig.de/android/N1/2.2/usb_host/
This guy was able to get audio, video, keyboard, and some other stuff working. He includes audio/MIDI drivers.
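As a starting point on API 12+ devices that do support USB host mode, you can at least enumerate attached USB devices and look for a MIDI-streaming interface before handing the device off to a driver library. A minimal sketch, assuming the standard USB audio class numbering (audio class, MIDIStreaming subclass 3):

    import android.content.Context;
    import android.hardware.usb.UsbConstants;
    import android.hardware.usb.UsbDevice;
    import android.hardware.usb.UsbInterface;
    import android.hardware.usb.UsbManager;

    public class MidiDeviceFinder {
        private static final int SUBCLASS_MIDI_STREAMING = 3; // per the USB audio class spec

        // Returns the first attached USB device exposing a MIDI-streaming interface, or null.
        public static UsbDevice findMidiDevice(Context context) {
            UsbManager usbManager = (UsbManager) context.getSystemService(Context.USB_SERVICE);
            for (UsbDevice device : usbManager.getDeviceList().values()) {
                for (int i = 0; i < device.getInterfaceCount(); i++) {
                    UsbInterface iface = device.getInterface(i);
                    if (iface.getInterfaceClass() == UsbConstants.USB_CLASS_AUDIO
                            && iface.getInterfaceSubclass() == SUBCLASS_MIDI_STREAMING) {
                        return device;
                    }
                }
            }
            return null;
        }
    }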
Yes, as pointed out above, it is not feasible on this phone because it does not provide USB host capability. I have been working religiously over the past few months to make an XY controller for my synthesizer so that I can transmit controller values to use in performances. I reckon that is what you want to do, with some additional functionality.
Bluetooth is definitely an option, and if you look up libpd, Peter Brinkmann himself has addressed this issue and acknowledged that Bluetooth dongles for MIDI are not far away.
WiFi is also something one may be willing to look at. Using rtpMIDI, we can create sessions on the PC side of things and use just about any WiFi-enabled MIDI device to transmit/receive data. If you are looking to control software synthesizers from a phone, this seems ideal. TouchDAW and TouchOSC, applications on Android, make use of this feature.
With the USB Android driver, the only problem I see is no support for isochronous transfers using USB host, so we cannot guarantee deterministic latency. But, considering no other bus accessories are attached, the performance seems pretty decent in my tests.

How to Stream Audio from PC through Bluetooth

Here's what I am trying to do: capture any audio currently being played on the PC, stream it over Bluetooth, and then play it on the Android device paired with the PC. I have worked with Bluetooth a little, but only very basic stuff, and I have very little idea of how to go about this. My target device is Android 2.2 (and above). I guess I have to use Bluetooth profiles, but I'm not too sure. Also, I am not aware of any other caveats that I may have to face.
Would anyone like to point me in the right direction? Any tips or links would help. Thank you.
It depends on the capabilities/profiles that your Android device supports. For streaming you would normally use the A2DP profile, and the Android device would need to support the A2DP sink role. Typically this role is supported by stereo headsets, speakers, etc. Android phones do not support the sink role; a phone acts as the A2DP source (the initiator of the stream).
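For reference, the only A2DP API exposed to apps on the phone side is the source-role profile proxy (API 11+), which lets you inspect the sinks the phone is streaming to, but does not let the phone act as a sink. A minimal sketch:

    import android.bluetooth.BluetoothAdapter;
    import android.bluetooth.BluetoothDevice;
    import android.bluetooth.BluetoothProfile;
    import android.content.Context;

    public class A2dpInspector {
        // Requires the BLUETOOTH permission.
        public static void listConnectedSinks(Context context) {
            final BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
            if (adapter == null) {
                return; // no Bluetooth on this device
            }
            adapter.getProfileProxy(context, new BluetoothProfile.ServiceListener() {
                @Override
                public void onServiceConnected(int profile, BluetoothProfile proxy) {
                    // The phone is the A2DP source; these are the sinks it streams to.
                    for (BluetoothDevice device : proxy.getConnectedDevices()) {
                        System.out.println("Connected A2DP sink: " + device.getName());
                    }
                    adapter.closeProfileProxy(BluetoothProfile.A2DP, proxy);
                }

                @Override
                public void onServiceDisconnected(int profile) {
                }
            }, BluetoothProfile.A2DP);
        }
    }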
