ALSA - unmuting devices? - android

I have been trying to capture audio within a native Linux program running on an Android device via adb shell.
Since I seemed to be getting only (very quiet) noise, i.e. no actual signal (interestingly, an Android/Java program doing something similar did show there was a signal on that input),
I ran alsa_amixer, which had one entry that looked like the right one:
Simple mixer control 'Capture',0
Capabilities: cvolume cswitch penum
Capture channels: Front Left - Front Right
Limits: Capture 0 - 63
Front Left: Capture 31 [49%] [0.00dB] [off]
Front Right: Capture 31 [49%] [0.00dB] [off]
"off". That would explain the noise.
So I looked for examples of how to use alsa_amixer to unmute the channels. I found different suggestions for parameters like "49% on", "49% unmute", or just "unmute", none of which works. (If the volume percentage is left out, it says "Invalid command!"; otherwise the volume is set, but the on/unmute part is ignored.)
I also searched for how to do this programmatically (which I'll ultimately need to do, although the manual approach would be helpful for now), but wasn't too lucky there either.
The only ALSA lib function I found that sounds like it could do something like that is snd_mixer_selem_set_capture_switch_all, but the docs don't say what the parameter does (1/0 is not on/off, I tried that ;) ).

The manual approach to setting these things via alsa_amixer does work, but only if Android is built with 'BoardConfigCommon.mk' modified at the entry BOARD_USES_ALSA_AUDIO := false, instead of true.
Yeah, this will probably disable ALSA for Android, which is why it no longer meddles with the mixer settings.
To the Android programmers out there: this is of course a very niche use case, as my original post already suggested.
This is not what most people would want to do.
I just happen to be tinkering with an Android device here in unusual ways ;-)

Just posting the code inline as the question's author suggested; I also don't like external links.
#include <alsa/asoundlib.h>

int main()
{
    snd_mixer_t *handle;
    snd_mixer_selem_id_t *sid;

    /* open the mixer and attach it to the default card */
    snd_mixer_open(&handle, 0);
    snd_mixer_attach(handle, "default");
    snd_mixer_selem_register(handle, NULL, NULL);
    snd_mixer_load(handle);

    /* look up the simple element named "Capture", index 0 */
    snd_mixer_selem_id_alloca(&sid);
    snd_mixer_selem_id_set_index(sid, 0);
    snd_mixer_selem_id_set_name(sid, "Capture");
    snd_mixer_elem_t *elem = snd_mixer_find_selem(handle, sid);

    snd_mixer_selem_set_capture_switch_all(elem, 0);
    snd_mixer_selem_set_capture_dB_all(elem, 0, 0);

    snd_mixer_close(handle);
}

Related

Who will call "Visualizer_process" in Android

I want to capture the audio waveform from the audio buffer. I found that android.media.audiofx.Visualizer can do such a thing, but it only returns partial and low-quality audio content.
I found that android.media.audiofx.Visualizer calls the function Visualizer_command(VISUALIZER_CMD_CAPTURE) at android4.0\frameworks\base\media\libeffects\visualizer.
I found that the function Visualizer_process reduces the audio content to low quality. I want to rewrite Visualizer_process, and I want to find out who calls Visualizer_process, but I cannot find the caller in the Android source code. Can anyone help me?
Thanks very much!
The AudioFlinger::PlaybackThread::threadLoop calls AudioFlinger::EffectChain::process_l, which calls AudioFlinger::EffectModule::process, which finally calls the actual effect's process function.
As you can see in AudioFlinger::EffectModule::process, there's the call
int ret = (*mEffectInterface)->process(mEffectInterface,
                                       &mConfig.inputCfg.buffer,
                                       &mConfig.outputCfg.buffer);
mEffectInterface is an effect_handle_t, which is an effect_interface_s**. The effect_interface_s struct (defined here) contains a number of function pointers (process, command, ...). These are filled out with pointers to the actual effect's functions when the effect is loaded. The effects provide these pointers through a struct (in EffectVisualizer it's gVisualizerInterface).
Note that the exact location of these functions may differ between different Android releases. So if you're looking at Android 4.0 you might find some of them in AudioFlinger.cpp (or somewhere else).

Audio Codec For Android

Following is some code I came across while making changes to the audio files. Can you please tell me what exactly this code does, and what "RX" in it specifies? Any leads would be great.
SectionDevice
    Name "OutputLime"
    Comment "Rx Lime jack output"

    EnableSequence
        'SLIM_0_RX Channels':0:Two
        'RX3 MIX1 INP1':0:RX1
        'RX5 MIX1 INP1':0:RX2
        'RX4 DSM MUX':0:CIC_OUT
        'RX6 DSM MUX':0:CIC_OUT
        'LINEOUT1 Volume':1:66
        'LINEOUT2 Volume':1:66
        'LINEOUT3 Volume':1:66
        'LINEOUT4 Volume':1:66
    EndSequence

    DisableSequence
        'RX3 MIX1 INP1':0:ZERO
        'RX5 MIX1 INP1':0:ZERO
        'RX4 DSM MUX':0:DSM_INV
        'RX6 DSM MUX':0:DSM_INV
        'LINEOUT1 Volume':1:0
        'LINEOUT2 Volume':1:0
        'LINEOUT3 Volume':1:0
        'LINEOUT4 Volume':1:0
    EndSequence
what does "RX" in the following code specify
Output devices or paths are typically labeled RX; conversely, input devices/paths are labeled TX. You can remember that by thinking of an RX device as something that Receives audio data from the system (e.g. a speaker), and a TX device as something that Transmits audio data to the system (e.g. a microphone).
What this code does is define an audio output device named "OutputLime" (is that a typo of "OutputLine", by the way?), and the actions that the ALSA Use Case Manager should take when that device is enabled or disabled.
Each line in the enable/disable sequences specifies an ALSA control (on the ALSA card corresponding to your codec, which typically would be card 0), and what value to write to the control.
SLIM_0_RX refers to a channel on the SLIMBus connecting the DSP and the codec. Typically you'll see a corresponding 'SLIMBUS_0_RX Audio Mixer MultiMedia1':1:1 in the verbs in your UCM file that refer to playback that should be routed through the codec, which basically says that anything written to MultiMedia1 (pcmC0D0p) should go to SLIM_0_RX.
So the code is setting this up as a stereo output device. Looks a lot like the loudspeaker device actually.
I don't remember exactly what all those other controls represent. Some are volumes, obviously, and it's not a wild guess that the others specify which channel on the physical stereo device should get the left output and which should get the right output. Perhaps you can look it up in the codec data sheet if you've got one. Otherwise you can check whether the driver source code for your codec is available and look there for clues (or perhaps in the msm-pcm-routing code, assuming this is a Qualcomm platform).

Scheduling latency of Android sensors handlers

Rather than an answer, I'm looking for an idea here.
I'd like to measure the scheduling latency of sensor sampling in Android. In particular, I want to measure the time from the sensor interrupt request to when the bottom half, which is in charge of the data read, is executed.
The bottom half already has, besides the data read, a timestamping instruction. Indeed, samples are collected by applications (Java or native, no difference) as a tuple [measurement, timestamp].
The timestamp follows the clock source clock_gettime(CLOCK_MONOTONIC, &t);
So, assuming that the bottom half is not preempted, this timestamp somehow gives an indication of the task scheduling instant. What is missing is a direct or indirect way to find out the corresponding IRQ instant.
You can safely assume that we can request any sampling rate from the sensor. The driver skeleton is the following (the Galaxy S3's gyroscope):
err = request_threaded_irq(data->client->irq, NULL,
                           lsm330dlc_gyro_interrupt_thread,
                           IRQF_TRIGGER_RISING | IRQF_ONESHOT,
                           "lsm330dlc_gyro", data);

static irqreturn_t lsm330dlc_gyro_interrupt_thread(int irq,
                                                   void *lsm330dlc_gyro_data_p)
{
    ...
    struct lsm330dlc_gyro_data *data = lsm330dlc_gyro_data_p;
    ...
    res = lsm330dlc_gyro_read_values(data->client,
                                     &data->xyz_data, data->entries);
    ...
    input_report_rel(data->input_dev, REL_RX, gyro_adjusted[0]);
    input_report_rel(data->input_dev, REL_RY, gyro_adjusted[1]);
    input_report_rel(data->input_dev, REL_RZ, gyro_adjusted[2]);
    input_sync(data->input_dev);
    ...
}
The key constraint is that I need to (well, I only have enough resources to) perform this measurement from user space, on a commercial device, without touching and recompiling the kernel, and hopefully with a limited impact on the experiment's accuracy. I don't know whether such an experiment is possible under this constraint, and so far I couldn't figure out any reasonable method.
I might also consider recompiling the kernel if that makes the experiment straightforward.
Thanks.
First, it's not possible to perform this measurement without touching the kernel.
Second, I didn't see any bottom half configured in your ISR code.
Third, if a bottom half is scheduled and the kernel can be recompiled, you can sample the jiffies value in the ISR and sample it again in the bottom half. Take the difference between the two samples and subtract that offset from the timestamp that is exported to user space.

How to Control DTMF volume programmatically android

I want to make a SeekBar that controls the DTMF volume (e.g. 0 to 100). I have searched a lot but could not find anything. I am doing this, but it's not working:
int seekbarValue=seekBar.getProgress();
AudioManager audioManager=(AudioManager) getSystemService(Context.AUDIO_SERVICE);
audioManager.setStreamVolume(AudioManager.STREAM_DTMF, seekbarValue, 0);
Can anyone please tell me a solution for controlling the DTMF volume?
None of the (AudioManager.STREAM_*) volumes go to 100 (int).
A valid stream volume for setStreamVolume(...) is between 0 and getStreamMaxVolume(int streamType).
Each stream can have a different max volume int, like 8, 10, or 16 from what I remember. Might even be different on different devices.
I hope that is enough to point you and future visitors in the right direction.
I wanted to add a little more information to @Anonsage's answer.
As @Anonsage mentioned, each stream has a different max volume. As of the KitKat 4.4.2 implementation, these are the max values:
STREAM_DTMF: 15
STREAM_MUSIC: 15
STREAM_VOICE_CALL: 5
STREAM_RING: 7
If you actually get into the lower levels and look at AudioService.java in the AOSP code, there is a rescaling operation that helps show a similar UI while updating the streams' actual values.

Android - turn off hardware key lights

Inside my app, I need a way to turn off the lights on the standard Android phone keys (Home, Menu, Back, and Search) - how can I do this programmatically?
According to this page, the hardware key backlights can be controlled by writing to a specific file in the filesystem with superuser privileges (i.e. phone must be "rooted"):
Q: How can I control the keyboard backlight?
A: The keyboard backlight can be controlled via /sys/class/leds/keyboard-backlight/brightness. It appears that it's a simple on-off control (echoing '0' turns it off, echoing '1' or higher turns it on). For some reason, the default system backlight control stuff seems to set this to "83", but I don't know why. I can't seem to see any difference between 83 and any other number. The file is readable by anyone, but only writable by root, so you'll need root access to the phone to manipulate it this way.
So to turn off the backlight programmatically, you could invoke exec() on the Runtime. Note that exec() does not run a shell, so the output redirection must be wrapped in sh -c (and the writable file lives under /sys, not /system):
Runtime r = Runtime.getRuntime();
r.exec(new String[]{"sh", "-c", "echo 0 > /sys/class/leds/keyboard-backlight/brightness"});
Depending on what you are doing, it would probably be wise to check the result of exec() afterwards to see whether a write error occurred.
Note: I tested this on my own phone and it seems to work without acting as root. However, this may not be the case on every phone, so you may have different results.
This is applicable only to Samsung devices.
To get the backlight state:
int backLight = Settings.System.getInt(getContentResolver(), "button_key_light");
// if it returns -1, the light is on
// if it returns 0, the light is off
// sometimes it will return values like 600 (1.5 sec)
If you want to turn the backlight off, you can do this:
Settings.System.putInt(getApplicationContext().getContentResolver(), "button_key_light", 0);

Categories

Resources