Android light sensor: detect a specified flash code

I want to know how to recognize the flash code of a blinking LED.
If I set the correct code in the app, e.g. 0.5+1; 0.5+3 (0.5 s LIGHT, 1 s DARK, 0.5 s LIGHT, 3 s DARK), and then detect the LED flashing with the light sensor,
how do I recognize the first flash (0.5 s) if the flashing is continuous? And how do I compare the detected values with the specified ones?

Assuming you are getting the signal without noise, you will see a repeating sequence: 0.5 LIGHT, 1 DARK, 0.5 LIGHT, 3 DARK, 0.5 LIGHT, 1 DARK, 0.5 LIGHT, 3 DARK, ...
So I think you are not matching a single event, but matching over a time window (0.5 + 1 + 0.5 + 3 = 5 seconds). By sliding the time window along the detected signal, you will find your events, and can then identify the specific ones.
It's important to check the sampling rate you can get out of the light sensor. Say you are sampling at 10 Hz; then you will get an array of values:
[0, 10, 200, 230, 209, 198, 201, 10, 7, 20, 17, 18, 10, 11, 10, 12, 13, ... ]
Then, by setting a threshold, you can see where each light and dark period starts and ends.
With a time window of 5 seconds, the array you keep will have a length of 50. You may want to connect the head and tail of the buffer first, so that a run split across the wrap point is rebuilt before you match the sequence you want.
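A minimal sketch of this approach, assuming a hypothetical 10 Hz sample array (the class and method names below are mine, not part of the Android SDK): threshold the samples, run-length encode them, join the head and tail of the cyclic buffer, then slide the expected pattern over it.

```java
import java.util.ArrayList;
import java.util.List;

public class FlashMatcher {

    /**
     * Run-length encode thresholded samples.
     * Each entry is a run length in samples: positive = LIGHT, negative = DARK.
     */
    static List<Integer> encode(double[] samples, double threshold) {
        List<Integer> runs = new ArrayList<>();
        for (double s : samples) {
            boolean light = s > threshold;
            int last = runs.isEmpty() ? 0 : runs.get(runs.size() - 1);
            if (!runs.isEmpty() && (last > 0) == light) {
                runs.set(runs.size() - 1, last + (light ? 1 : -1));
            } else {
                runs.add(light ? 1 : -1);
            }
        }
        return runs;
    }

    /**
     * Look for the expected pattern (signed run lengths, in samples) anywhere
     * in the cyclic buffer, allowing +/- tol samples of jitter per run.
     */
    static boolean matches(List<Integer> runs, int[] pattern, int tol) {
        List<Integer> r = new ArrayList<>(runs);
        // Connect head and tail: a run split across the wrap point is rebuilt.
        if (r.size() > 1 && Integer.signum(r.get(0)) == Integer.signum(r.get(r.size() - 1))) {
            r.set(0, r.get(0) + r.remove(r.size() - 1));
        }
        int n = r.size();
        for (int start = 0; start < n; start++) {
            boolean ok = true;
            for (int k = 0; k < pattern.length; k++) {
                int run = r.get((start + k) % n);
                if (Integer.signum(run) != Integer.signum(pattern[k])
                        || Math.abs(run - pattern[k]) > tol) {
                    ok = false;
                    break;
                }
            }
            if (ok) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        // One 5 s window at 10 Hz, starting mid-pattern: 1 s dark, 0.5 s light,
        // 1 s dark, 0.5 s light, 2 s dark (the tail of the 3 s dark phase).
        double[] samples = new double[50];
        int[] lens = {10, 5, 10, 5, 20};
        boolean[] lit = {false, true, false, true, false};
        int i = 0;
        for (int seg = 0; seg < lens.length; seg++)
            for (int k = 0; k < lens[seg]; k++)
                samples[i++] = lit[seg] ? 200 : 10;

        List<Integer> runs = encode(samples, 100);
        System.out.println(runs);                                        // [-10, 5, -10, 5, -20]
        // 0.5 s light, 1 s dark, 0.5 s light, 3 s dark at 10 Hz:
        System.out.println(matches(runs, new int[]{5, -10, 5, -30}, 1)); // true
    }
}
```

The head/tail join is what lets the match succeed even though the window started in the middle of the 3-second dark phase.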
Hope this helps!


BLE Heart Rate Sensor Value Interpretation

I have an Android App where I get Heart Rate Measurements from a Polar H10 Device.
I'm totally lost on how to interpret the heart rate. The various links to the bluetooth.com site unfortunately result in 404 errors.
The characteristic's value is, for example:
[16, 59, 83, 4]
From what I understood, the second byte (59) is the heart rate in BPM. But this does not seem to be decimal, as the value goes up to 127 and then continues at -127, -126, -125, ... It is not hex either.
I tried (in kotlin)
characteristic.value[1].toUInt()
characteristic.value[1].toInt()
characteristic.value[1].toShort()
characteristic.value[1].toULong()
characteristic.value[1].toDouble()
All values freak out as soon as the -127 appears.
Do I have to convert the 59 to binary (59 = 111011) and read it from there? Please give me some insight.
### Edit (12th April 2021) ###
What I do to get those values is a BluetoothDevice.connectGatt().
Then hold the GATT.
In order to get heart rate values I look for
Service 0x180d and its
characteristic 0x2a37 and its only
descriptor 0x2902.
Then I enable notifications by setting 0x01 on the descriptor. I then get ongoing events in the GattClientCallback.onCharacteristicChanged() callback. I will add a screenshot below with all data.
From what I understood the response should be 6 bytes long instead of 4, right? What am I doing wrong?
In the picture you see the characteristic at the very top. It is linked to service 0x180d, and the characteristic holds the 4-byte value at the bottom.
See Heart Rate Value in BLE for the links to the documents. As in that answer, here's the decode:
Byte 0 - Flags: 16 (0001 0000)
Bits are numbered from LSB (0) to MSB (7).
Bit 0 - Heart Rate Value Format: 0 => UINT8 beats per minute
Bit 1-2 - Sensor Contact Status: 00 => Not supported or detected
Bit 3 - Energy Expended Status: 0 => Not present
Bit 4 - RR-Interval: 1 => One or more values are present
So the byte after the flags is the heart rate in UINT8 format, and the next two bytes are an RR-interval (a UINT16, in units of 1/1024 s).
To read this in Kotlin:
characteristic.getIntValue(FORMAT_UINT8, 1)
This returns a heart rate of 59 bpm.
And ignore the other two bytes unless you want the RR.
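For completeness, the flags-driven decoding above can be done by hand on the raw byte array. The sketch below is a plain-JVM illustration (class and field names are mine, not Android APIs); note the `& 0xFF` masking, which is exactly what fixes the negative values the question describes, since Java bytes are signed.

```java
public class HeartRateParser {

    /** Parsed fields of a Heart Rate Measurement characteristic (0x2A37). */
    static class Measurement {
        int bpm;
        double[] rrSeconds; // RR-intervals, if present
    }

    static Measurement parse(byte[] value) {
        Measurement m = new Measurement();
        int flags = value[0] & 0xFF;
        boolean hrUint16 = (flags & 0x01) != 0; // bit 0: heart-rate value format
        boolean hasRr = (flags & 0x10) != 0;    // bit 4: RR-intervals present
        int offset;
        if (hrUint16) {
            m.bpm = (value[1] & 0xFF) | ((value[2] & 0xFF) << 8); // little-endian UINT16
            offset = 3;
        } else {
            m.bpm = value[1] & 0xFF; // mask, since Java bytes are signed
            offset = 2;
        }
        if (hasRr) {
            int count = (value.length - offset) / 2;
            m.rrSeconds = new double[count];
            for (int i = 0; i < count; i++) {
                int raw = (value[offset + 2 * i] & 0xFF)
                        | ((value[offset + 2 * i + 1] & 0xFF) << 8);
                m.rrSeconds[i] = raw / 1024.0; // RR-intervals are in 1/1024 s units
            }
        } else {
            m.rrSeconds = new double[0];
        }
        return m;
    }

    public static void main(String[] args) {
        Measurement m = parse(new byte[]{16, 59, 83, 4});
        System.out.println(m.bpm);          // 59
        System.out.println(m.rrSeconds[0]); // 1107 / 1024, roughly 1.081 s
    }
}
```

With the question's value [16, 59, 83, 4], the flags byte 16 selects the UINT8 heart-rate format and announces one RR-interval, which is why the packet is 4 bytes rather than 6.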
It seems I found a way by retrieving the value as follows
val hearRateDecimal = characteristic.getIntValue(BluetoothGattCharacteristic.FORMAT_UINT8, 1)
Two things are important:
first, the format UINT8 (although I don't know when to use UINT8 and when UINT16; I actually thought I needed UINT16, since the first byte is 16, see the question above);
second, the offset parameter of 1.
What I now get is an integer even beyond 127: 127, 128, 129, 130, ...

Explanation of how this MIDI lib for Android works

I'm using the library by @LeffelMania: https://github.com/LeffelMania/android-midi-lib
I'm a musician, but I've always made studio recordings, not MIDI, so I don't understand some things.
The thing I want to understand is this piece of code:
// 2. Add events to the tracks
// Track 0 is the tempo map
TimeSignature ts = new TimeSignature();
ts.setTimeSignature(4, 4, TimeSignature.DEFAULT_METER, TimeSignature.DEFAULT_DIVISION);

Tempo tempo = new Tempo();
tempo.setBpm(228);

tempoTrack.insertEvent(ts);
tempoTrack.insertEvent(tempo);

// Track 1 will have some notes in it
final int NOTE_COUNT = 80;
for (int i = 0; i < NOTE_COUNT; i++)
{
    int channel = 0;
    int pitch = 1 + i;
    int velocity = 100;
    long tick = i * 480;
    long duration = 120;

    noteTrack.insertNote(channel, pitch, velocity, tick, duration);
}
OK, I have 228 beats per minute, and I know that I have to insert each note after the previous one. What I don't understand is the duration: is it in milliseconds? That doesn't make sense if I keep duration = 120 and set my BPM to 60, for example. I don't understand the velocity either.
MY GOAL
I want to insert notes of pitch X with duration Y.
Could anyone give me a clue?
The way MIDI files are designed, notes are in terms of musical length, not time. So when you insert a note, its duration is a number of ticks, not a number of seconds. By default, there are 480 ticks per quarter note. So that code snippet is inserting 80 sixteenth notes since there are four sixteenths per quarter and 480 / 4 = 120. If you change the tempo, they will still be sixteenth notes, just played at a different speed.
If you think of playing a key on a piano, the velocity parameter is the speed at which the key is struck. The valid values are 1 to 127. A velocity of 0 means to stop playing the note. Typically a higher velocity means a louder note, but really it can control any parameter the MIDI instrument allows it to control.
A note in a MIDI file consists of two events: a Note On and a Note Off. If you look at the insertNote code you'll see that it is inserting two events into the track. The first is a Note On command at time tick with the specified velocity. The second is a Note On command at time tick + duration with a velocity of 0.
Pitch values also run from 0 to 127. If you do a Google search for "MIDI pitch numbers" you'll get dozens of hits showing you how pitch number relates to note and frequency.
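To make the tick arithmetic concrete, here is a small sketch (the helper names are mine, not part of the library) that converts a musical note length into ticks, and ticks into milliseconds at a given tempo:

```java
public class MidiTiming {

    /**
     * Ticks for a note of the given fraction of a whole note,
     * e.g. noteTicks(480, 1, 4) is a quarter, noteTicks(480, 1, 16) a sixteenth.
     */
    static long noteTicks(int ppq, int numerator, int denominator) {
        // ppq ticks per quarter note means ppq * 4 ticks per whole note
        return (long) ppq * 4 * numerator / denominator;
    }

    /** Milliseconds that one tick lasts at the given tempo. */
    static double msPerTick(int ppq, double bpm) {
        return 60_000.0 / (bpm * ppq);
    }

    public static void main(String[] args) {
        System.out.println(noteTicks(480, 1, 16));     // 120: the snippet's sixteenth notes
        System.out.println(noteTicks(480, 3, 8));      // 720: a dotted quarter
        // How long one of those 120-tick notes sounds at the snippet's 228 bpm:
        System.out.println(msPerTick(480, 228) * 120);
    }
}
```

So for the goal above, you would pick the duration in ticks from the musical length you want, and let the tempo track decide how long that is in real time.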
There is a nice description of timing in MIDI files here. Here's an excerpt in case the link dies:
In a standard MIDI file, there’s information in the file header about “ticks per quarter note”, a.k.a. “parts per quarter” (or “PPQ”). For the purpose of this discussion, we’ll consider “beat” and “quarter note” to be synonymous, so you can think of a “tick” as a fraction of a beat. The PPQ is stated in the last word of information (the last two bytes) of the header chunk that appears at the beginning of the file. The PPQ could be a low number such as 24 or 96, which is often sufficient resolution for simple music, or it could be a larger number such as 480 for higher resolution, or even something like 500 or 1000 if one prefers to refer to time in milliseconds.
What the PPQ means in terms of absolute time depends on the designated tempo. By default, the time signature is 4/4 and the tempo is 120 beats per minute. That can be changed, however, by a “meta event” that specifies a different tempo. (You can read about the Set Tempo meta event message in the file format description document.) The tempo is expressed as a 24-bit number that designates microseconds per quarter-note. That’s kind of upside-down from the way we normally express tempo, but it has some advantages. So, for example, a tempo of 100 bpm would be 600000 microseconds per quarter note, so the MIDI meta event for expressing that would be FF 51 03 09 27 C0 (the last three bytes are the Hex for 600000). The meta event would be preceded by a delta time, just like any other MIDI message in the file, so a change of tempo can occur anywhere in the music.
Delta times are always expressed as a variable-length quantity, the format of which is explained in the document. For example, if the PPQ is 480 (standard in most MIDI sequencing software), a delta time of a dotted quarter note (720 ticks) would be expressed by the two bytes 85 50 (hexadecimal).
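The encodings described in the excerpt are easy to generate mechanically. The sketch below (helper names are mine) builds a variable-length quantity and the three payload bytes of a Set Tempo meta event (FF 51 03 tt tt tt):

```java
public class MidiBytes {

    /** Encode a tick count as a MIDI variable-length quantity: 7 bits per byte, MSB first. */
    static byte[] vlq(long value) {
        int len = 1;
        for (long v = value >> 7; v > 0; v >>= 7) len++;
        byte[] out = new byte[len];
        long v = value;
        for (int i = len - 1; i >= 0; i--) {
            out[i] = (byte) (v & 0x7F);
            v >>= 7;
        }
        for (int i = 0; i < len - 1; i++) out[i] |= (byte) 0x80; // continuation bit
        return out;
    }

    /** The three tempo bytes of a Set Tempo meta event: microseconds per quarter note. */
    static byte[] setTempoPayload(double bpm) {
        int usPerQuarter = (int) Math.round(60_000_000.0 / bpm);
        return new byte[]{
            (byte) (usPerQuarter >> 16),
            (byte) (usPerQuarter >> 8),
            (byte) usPerQuarter
        };
    }

    public static void main(String[] args) {
        for (byte b : vlq(720)) System.out.printf("%02X ", b);             // 85 50
        System.out.println();
        for (byte b : setTempoPayload(100)) System.out.printf("%02X ", b); // 09 27 C0
        System.out.println();
    }
}
```

Running it reproduces the excerpt's 100 bpm example: 600000 microseconds per quarter note is hex 09 27 C0, and 720 ticks encodes as the VLQ bytes 85 50.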

Compute the power consumption (mW) of an Android/iPhone app

I want to calculate, and show in a plot, the power consumption of my app over time. The x axis is the time (hours) and the y axis the power consumption in mW.
I have the battery-level readings for my application (100, 93, 82, 78, 71, 64, 59, 49, 41), corresponding to the initial charge, 1 h, 2 h, ... The battery of the smartphone is 3.7 V and 1850 mAh. I calculated the power consumption this way:
cons(W) = voltage (V) * discharge amount (%) * capacity (Ah) / discharge time (h)
cons(W) = 3.7 V * 1.85 Ah * [100, 93, 82, 78, 71, 64, 59, 49, 41] / [0.1, 1, 2, 3, 4, 5, 6, 7, 8]
Is that correct? I know there is a way to obtain the values I need directly, but I want to compare several apps and I don't have time to compute the values again. So, based on the previous calculation, what am I doing wrong? The values I obtain are too large. Any suggestions?
Android and iOS can show power consumption on a per-app basis.
At least Android should support API calls to access these values.
(These calculations are more valid than just using battery drain, though still not perfect; e.g. they account for processor time, readouts of sensor values, ...)
Possible duplicate: https://stackoverflow.com/questions/23428675/android-to-check-battery-stats-per-application
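For what it's worth, the formula in the question multiplies by the remaining percentage at each point, which is why the values come out too large. One plausible correction is to use the drop in percentage over each interval and divide by 100. A standalone sketch with the question's own numbers (the class and method names are mine):

```java
public class PowerEstimate {

    /** Average power in mW over each interval, from battery-level percentages. */
    static double[] powerMw(double voltage, double capacityAh, int[] pct, double[] hours) {
        double[] out = new double[pct.length - 1];
        for (int i = 1; i < pct.length; i++) {
            double deltaAh = capacityAh * (pct[i - 1] - pct[i]) / 100.0; // charge used
            double deltaH = hours[i] - hours[i - 1];
            out[i - 1] = voltage * deltaAh / deltaH * 1000.0;            // W to mW
        }
        return out;
    }

    public static void main(String[] args) {
        int[] pct = {100, 93, 82, 78, 71, 64, 59, 49, 41};
        double[] hours = {0, 1, 2, 3, 4, 5, 6, 7, 8};
        double[] mw = powerMw(3.7, 1.85, pct, hours);
        // First interval: 7% of 1.85 Ah at 3.7 V over 1 h, roughly 479 mW
        System.out.printf("%.1f mW in hour 1%n", mw[0]);
    }
}
```

This assumes the battery voltage stays near 3.7 V over the discharge, which is an approximation; the per-app accounting mentioned in the answer is more accurate when available.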

Audio that overlaps with the acoustic echo is also cancelled when using WebRtc_Aecm on Android

As an example, PCM captured by the microphone:
1, {2,3}, {4,5}, {6,7}, 8, 9
{A,B} means A is the audio data I really want to capture and B is the echo arriving at the same time; both are captured by the mic simultaneously.
The issue I encounter: samples 2, 4 and 6 are also cancelled while cancelling 3, 5 and 7.
This is my code:
WebRtcAecm_Init( &aecm, 8000 );
while ( aecProcessing )
{
    WebRtcAecm_BufferFarend( speakerBuffer );
    WebRtcAecm_Process( aecm, micBuffer, NULL, aecBuffer, 160, 200 );
}
If you run a loopback test, normal voice can be partially cancelled.
Don't use a constant delay like 200 ms, because this delay changes all the time; you should re-estimate it every second or more often.
EDIT
Please make clear what is echo and what is normal voice.
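One hedged way to feed WebRtcAecm_Process a live delay estimate instead of the constant 200: track how many frames you have written to the speaker versus how many have actually been played out (on Android, AudioTrack.getPlaybackHeadPosition() gives the latter, and counting your own write() calls gives the former), and recompute periodically. The arithmetic, as a standalone sketch with names of my choosing:

```java
public class DelayEstimator {

    /**
     * Rough render-path delay in ms: frames handed to the speaker that have
     * not yet been played out, converted to time at the given sample rate.
     */
    static int renderDelayMs(long framesWritten, long framesPlayed, int sampleRate) {
        long pending = Math.max(0, framesWritten - framesPlayed);
        return (int) (pending * 1000 / sampleRate);
    }

    public static void main(String[] args) {
        // 1600 frames queued but not yet played at 8 kHz is 200 ms in the render path
        System.out.println(renderDelayMs(1600, 0, 8000)); // 200
    }
}
```

The capture path adds its own latency on top of this, so in practice you would add a device-dependent offset and, as the answer says, refresh the estimate at least every second rather than hard-coding it.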

In an Android app, setting the current Thread priority to less than 1 causes a force close

I am trying to increase the priority of a thread in my application by putting:
Thread.currentThread().setPriority(N);
The normal priority of the current thread is 5. I can change N to as low as 1, but if I set it to 0 or -1 I get a force-close message on my phone.
Is there a reason I cannot increase the priority of this thread?
If you look at the documentation for Thread.setPriority(), it says the priority must be in the range defined by MIN_PRIORITY (1) and MAX_PRIORITY (10). Since 0 and -1 are outside that range, you are seeing an IllegalArgumentException. Note that in Java a higher number means a higher priority, so to increase the priority of the thread you would pass a value above the default of 5 (NORM_PRIORITY), not below it.
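A small standalone demonstration of the range check (plain JVM, no Android needed):

```java
public class PriorityDemo {
    public static void main(String[] args) {
        Thread t = Thread.currentThread();
        System.out.println(Thread.MIN_PRIORITY + " "
                + Thread.NORM_PRIORITY + " " + Thread.MAX_PRIORITY); // 1 5 10

        t.setPriority(Thread.MAX_PRIORITY); // raising priority: legal
        System.out.println(t.getPriority()); // 10

        try {
            t.setPriority(0); // below MIN_PRIORITY: throws, which Android surfaces as a force close
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getClass().getSimpleName());
        }
    }
}
```

If negative numbers were expected because of Linux niceness, that may be a mix-up with android.os.Process.setThreadPriority(), a different API whose valid values run from -20 to 19 with lower meaning higher priority.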
