Read RSSI value of all BLE devices in range - Android

I am building an application that needs to measure the signal strength of all BLE beacons (Estimote) in range and feed it through a neural network. I want to store the RSSI values in an array that will be sent to a server implemented in C#. I have already implemented this with Wi-Fi and want to obtain a similar result. Here is my code for Wi-Fi:
ListaWifi = ObjWifi.getScanResults();
rssi = new int[2];
for (int i = 0; i < 2; i++) {
    rssi[i] = ListaWifi.get(i).level;
}
I also want the data to always arrive in the same order:
int[] ret = new int[3];
for (int i = 0; i < tam; i++) {
    switch (BSSID[i]) {
        case "9c:d6:43:94:ee:3f":
            ret[0] = rssi[i];
            break;
        case "9c:d6:43:94:f1:63":
            ret[1] = rssi[i];
            break;
        case "9c:d6:43:94:f1:99":
            ret[2] = rssi[i];
            break;
        default:
            break;
    }
}
Is there a way to continuously monitor this data for BLE beacons and store it in the same way as I did with Wi-Fi?

I think the RSSI value is included in the advertising packets. I don't know any C#, so I can't help you with the specifics of getting that value from the packet (but it should be possible).
However, your second stipulation that they always come in the same order would require some messing around. Those advertising packets are broadcast over the air in no particular order and multiple times (the beacons don't communicate with each other to coordinate their packets). One solution would be to listen for a fixed period of time, collect all the advertising packets together, then sort them by address (you'll probably get several from the same address, so perhaps average the RSSI values). I have no idea why you require them to be in the same order, so this may not be the best solution to the problem you haven't stated.
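On Android, a rough sketch of that approach: collect advertisements for a fixed window using the pre-Lollipop BluetoothAdapter.startLeScan() API (the same LeScanCallback style used later on this page), average the RSSI per beacon address, then place the averages into a fixed-order array keyed by MAC, mirroring the Wi-Fi switch above. The MAC addresses and the scan window are placeholders, not values from the question:

import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.os.Handler;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BeaconRssiCollector {
    // Placeholder addresses -- substitute your own beacons' MACs, in the order you want.
    private static final String[] BEACON_ADDRESSES = {
            "aa:bb:cc:dd:ee:01", "aa:bb:cc:dd:ee:02", "aa:bb:cc:dd:ee:03"
    };

    private final Map<String, List<Integer>> samples = new HashMap<String, List<Integer>>();

    private final BluetoothAdapter.LeScanCallback callback = new BluetoothAdapter.LeScanCallback() {
        @Override
        public void onLeScan(BluetoothDevice device, int rssi, byte[] scanRecord) {
            List<Integer> list = samples.get(device.getAddress());
            if (list == null) {
                list = new ArrayList<Integer>();
                samples.put(device.getAddress(), list);
            }
            list.add(rssi); // keep every advertisement heard during the window
        }
    };

    // Scans for windowMs, then builds one averaged RSSI per beacon, always in the same order.
    public void scanForWindow(final BluetoothAdapter adapter, Handler handler, long windowMs) {
        samples.clear();
        adapter.startLeScan(callback);
        handler.postDelayed(new Runnable() {
            @Override
            public void run() {
                adapter.stopLeScan(callback);
                int[] ordered = new int[BEACON_ADDRESSES.length];
                for (int i = 0; i < BEACON_ADDRESSES.length; i++) {
                    List<Integer> list = samples.get(BEACON_ADDRESSES[i]);
                    if (list != null && !list.isEmpty()) {
                        int sum = 0;
                        for (int r : list) {
                            sum += r;
                        }
                        ordered[i] = sum / list.size(); // average RSSI for this beacon
                    }
                }
                // ordered[] can now be sent to the server, just like the Wi-Fi ret[] array
            }
        }, windowMs);
    }
}

Repeating scanForWindow() on a timer gives you the continuous monitoring the question asks about.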

Related

PIC16F688 has no stable readings via Bluetooth

I have spent a lot of time trying to find out where my mistake is while flashing the PIC16F688. The PIC was flashed successfully using a PICkit 2. I am using the PIC to convert an analogue pressure sensor reading to a digital output and to send the data via Bluetooth, but the Bluetooth side is not receiving stable data. The data consists of a 4-character decimal number between 0 and 1023.
The problem is that the Bluetooth side can't settle on a specific number and keep reading it; instead, it reads the 4 digits as seemingly random values.
I think my mistake is in the configuration of the internal oscillator.
I'm attaching my code. The code is written for the FlexiForce sensor circuit, which outputs an analogue voltage of up to 5 V, and the PIC's job is to convert it to digital as mentioned above.
It might also be that my wiring is not correct, so please help me out with this one if you can.
Also, what configuration under "Edit Project" do I need to choose in the mikroC PRO software?
I used a "Bluetooth terminal" app to view the data received asynchronously over Bluetooth.
Thank you.
char *temp = "0000";
unsigned int adc_value;
char uart_rd; int i;
void main()
{
OSCCON = 0x77;
ANSEL = 0b00000100;
CMCON0 = 0X07;
TRISA = 0b00001100;
UART1_Init(9600);
Delay_ms(100);
while (1)
{
adc_value = ADC_Read(2);
temp[0] = adc_value/1000+48;
temp[1] = (adc_value/100)%10+48;
temp[2] = (adc_value/10)%10+48;
temp[3] = adc_value%10+48;
for (i=0;i<4; i++)
UART1_Write(temp[i]);
UART1_Write(13);
Delay_ms(1000);
}
}
You can use the itoa() function to convert the ADC integer value to characters for sending over the UART. If there is an error in the calculation then you won't get an appropriate value. Below is a code snippet for your reference:
while (1)
{
    adc_value = ADC_Read(2);
    itoa(adc_value, temp, 10);          // convert to a decimal string
    for (i = 0; temp[i] != '\0'; i++)   // the string length varies, so stop at the terminator
        UART1_Write(temp[i]);
    UART1_Write(13);
    Delay_ms(1000);
}
Also check that the baud rate you have configured is the same at both ends. If the baud rates don't match, you will see random values in the Bluetooth terminal where you are reading the data.
What I would suggest: if you have a logic analyser, hook it up. If you don't, recalculate your oscillator speed against the datasheet; it could simply be that the internal oscillator is not accurate enough. What also works is to write a function in assembly that waits a known time (by copy-pasting a lot of NOPs) and to use it to blink an LED; then start a stopwatch and count, say, 100 blinks. This is what I used to do before I had a logic analyser. (They are quite cheap on eBay.)

Android multicast high packet loss

I am trying to do multicast in Android. I have three phones (all Galaxy S5). One phone has Wi-Fi tethering turned on and is acting as the AP (mobile data is turned off). Of the other two phones, one is transmitting to a multicast address and the other is receiving, with both connected to the AP of the first phone. The transmitting and receiving phones are kept side by side.
I found that of the 1000 packets I am sending, I am receiving only about 200. Below is an outline of how I am sending:
MulticastSocket mMulticastSocket = new MulticastSocket(port);
InetAddress multicastGrp = InetAddress.getByName("239.255.255.250");
mMulticastSocket.joinGroup(multicastGrp);

String sendStr = "";
int packetSize = 1400;
for (int i = 0; i < packetSize; i++)
    sendStr = sendStr + "a";

for (int i = 0; i < 1000; i++)
    mMulticastSocket.send(new DatagramPacket(sendStr.getBytes(),
            sendStr.getBytes().length, multicastGrp, port));
Receive is something like,
// acquire multicast lock
byte[] buffer = new byte[2048];
DatagramPacket rPack = new DatagramPacket(buffer, buffer.length);
mMulticastSocket.setReceiveBufferSize(1024 * 64);
int recvCount = 0;
while (true) {
    mMulticastSocket.receive(rPack);
    recvCount++;
}
// release multicast lock
Both send and receive are done in worker threads. I also found that as the value of 'packetSize' is reduced, the number of received packets increases. I guess it is due to CPU load or the receive buffer, but in any case, I want to multicast packets of 1400 bytes and receive as many as possible (I know that the number of received packets is a function of the channel between the two phones, but I think keeping them side by side is about the best channel that can be had). Also, when I do a UDP unicast, I am able to receive about 900 of the 1000 packets sent.
I am not able to understand why the number of received packets is so low for multicasting. What is it that I am missing?
Your server is sending packets as fast as possible. There is nothing in your sender loop that would throttle back the data rate of the packets being sent. So each packet is being sent microseconds apart. Most likely something along the network stack will cause some buffers to fill up. UDP has no built in flow control or congestion control. If you can have some small delay (~1ms to 5ms) between packets, you will probably see much less packet loss.
for (int i = 0; i < 1000; i++)
{
    mMulticastSocket.send(new DatagramPacket(sendStr.getBytes(),
            sendStr.getBytes().length, multicastGrp, port));
    Thread.sleep(5, 0); // quick hack to throttle back the rate (handle InterruptedException as needed)
}

Arduino Mega receiving correct data through Serial 0, but not Serial 1-3

I currently have a HC-06 Bluetooth device connected to my Arduino Mega 2560 in order to receive strings sent from an Android device. With the HC-06 on Serial 0, I am receiving the data without error with the following code:
String inString = "";
int index = 0;
boolean stringComplete = false;
void setup() {
Serial.begin(9600);
pinMode(pwmPin, OUTPUT);
}
void loop() {
if(stringComplete) {
ParseSerialData(); // Parse the received data
inString = ""; // Reset inString to empty
stringComplete = false; // Reset the system for further input of data
}
}
void serialEvent() {
while(Serial.available() && stringComplete == false) {
char inChar = Serial.read();
inData[index] = inChar; // Store it in char array
index++;
if (inChar == '\n') { // Check for termination character
index = 0; // Reset the index
stringComplete = true; // Set completion of read to true
} else {
inString += inChar; // Also store as string
}
}
}
When I try to replace "Serial" with "Serial1" and "serialEvent()" with "serialEvent1()" and move the Bluetooth device to the TX1 and RX1, this program no longer works.
I have read that some people had similar problems when using AVR-GCC 4.4.x and solved the issue by downgrading to 4.3.x, but I have 4.3.2 (on Windows 8.1, same problem has arisen with Arduino IDE 1.0.3, 1.0.5-r2, and 1.5.6-r2).
I added the following print statements (with Serial 0 to print to the monitor on my PC) to the code with the Bluetooth device still on Serial 1:
String inString = "";
int index = 0;
boolean stringComplete = false;
void setup() {
Serial1.begin(9600);
Serial.begin(9600);
pinMode(pwmPin, OUTPUT);
Serial.println("Setting up...");
}
void loop() {
if(stringComplete) {
ParseSerialData();
inString = "";
stringComplete = false;
}
}
void serialEvent1() {
Serial.println("In serialEvent1...");
while(Serial1.available() && stringComplete == false) {
Serial.println("Char in...");
char inChar = Serial1.read();
Serial.println("WTF");
Serial.println(inChar);
Serial.println("WTF2");
inData[index] = inChar;
index++;
if (inChar == '\n'){
Serial.println("Termination char read...");
index = 0;
stringComplete = true;
}else{
inString += inChar;
}
}
}
Doing this, on the monitor I get:
Setting up...
In serialEvent1...
Char in...
WTF
WTF2
inChar typically prints as nothing, but during one test it was printing as an '#' character. The string sent is "s,1\n" from the Android device.
Based on the print out, the serial event is triggered by availability of serial data, but Serial1.available() remains true for only the first iteration, 's' is not read in (nor any of the other characters that do when Serial is used), and a termination character (newline char) is never read in so that I can begin parsing.
I also tried various baud rates with no difference in behavior. Based on reading Arduino documentation, serial port 1 should work just like serial port 0, and I did not miss substituting Serial for Serial1 in any part of the code.
What could be causing errors in communicating over Serial1 in the same way that has worked flawlessly on Serial0?
I also found a related Stack Overflow question, which was solved with something similar to my code (which works perfectly with Serial0 and is based on the Arduino documentation) using an expected termination character (the difference being that his code implements serial reading in the main loop, whereas mine is in a serialEvent). For some reason, it seems that both of us were having issues with Serial1 data not showing as available at the start of the next iteration. For some reason, serialEvent1 is not being called again for me. And I still don't understand why the first/only character read is not 's.' At first I was thinking that the stream was getting flushed before getting to the serial event again, but that still doesn't account for reading in an incorrect first character.
Also, I added the following Serial1 print statement to run multiple times in the Arduino setup and the Android device receives it each time with no errors, so sending data is working just fine:
Serial1.print("b,1\n");
Or even
Serial1.print("A long test message. A long test message.\n");
I'm fairly close to answering my own question now with further testing/debugging. I actually think the answer may end up being hardware-related rather than software. I wanted to find out if the problem was with the data sent from the HC-06 to port 1, or with the reading function of port 1. I basically had serial port 0 read in data, then send it serially to port 1, which would read that data, and send feedback over Bluetooth to the Android device. Serial port 1 read the data fine coming from port 0, so the issue is reading in data specifically from the HC-06. It may simply be a voltage level issue, so the answer may not belong with Stack Overflow. I will leave the question unanswered though until I definitively have found the root cause (allowing for the possibility that I might need some define statement for the HC-06 or serial port 1 for data to be read correctly, though I'm guessing a voltage level conversion may do the trick. I'm not sure why there would be such a difference between Serial0 and Serial1 though).
I solved the problem enabling the pull-up resistor of the RX1 pin:
Serial1.begin(9600);
pinMode(19, INPUT);
digitalWrite(19, HIGH);
Therefore the 3 V is "overridden" by Arduino's 5 V for logical HIGH and zero is pulled down by Bluetooth's TX for logical LOW.
I did it slightly differently by using the INPUT_PULLUP feature to pull the hardware Serial3 pin HIGH:
Serial3.begin(19200);
pinMode(15, INPUT_PULLUP); // Serial3 receive pin
It works a treat. Previously my serial communications between two Arduino Mega 2560s had been mainly corruption with the occasional correct transmission. Now it is mainly the correct transmission. Previously, the time taken to receive a valid serial value was up to 2.5 seconds (due to having to repeatedly retry to get a valid transmit). Now the time taken to get a valid transmit is 20 ms. (I use a twisted pair of single core wires about 35 cm length.)
After checking the serial data lines on my oscilloscope, I was able to come up with a solution to my issue. With the setup described above (Bluetooth TX → RX0, TX0 → RX1, TX1 → Bluetooth RX), I checked the signals sent by the Bluetooth device and by the Arduino's serial port 0.
The Bluetooth device's signal was low at 400 mV and high at 3.68 V, while the Arduino's port 0 sent low at 0V and high at 5 V. I knew the Bluetooth device was a 3.3V level device, but the Arduino should read anything above about 3V as high, so this should have not been an issue (and it obviously wasn't on Serial 0).
Anyway, I replaced the HC-06 Bluetooth device with a backup one I had purchased and everything works fine with the code using Serial1 that I posted in my question. This Bluetooth device was transmitting low at about 160 mV and high at 3.3 V.
It seems that my problem was rooted in the hardware (voltage levels) as expected, and that for some reason Serial1 is more sensitive to changes in digital levels than is Serial0.

Estimating beacon proximity/distance based on RSSI - Bluetooth LE

I've got a simple iOS app which displays the proximity of the Bluetooth LE beacons it detects using such expressions as "immediate", "near" etc. and I need to write something similar on Android.
I've followed the tutorial at Android developer and I'm able to list detected devices and now want to estimate the distance/proximity - this is where it's become a problem. According to this SO thread it's just a handful of mathematical calculations. However, they require me to provide a txPower value.
According to this tutorial by Dave Smith (and cross-referencing with this Bluetooth SIG statement), it should be broadcast by the beacon devices as an "AD structure" of type 0x0A. So what I do is parse the AD structures and look for the payload of the one that matches the type.
Problem: I've got 4 beacons - 2 estimotes and 2 appflares. The estimotes don't broadcast the txPower at all and the appflares broadcast theirs as 0.
Is there anything I'm missing here? The iOS app seems to be handling it all without any problem, but using the iOS SDK it does it behind the scenes so I'm not sure how to produce the exact same or similar behaviour. Is there any other way I could solve my problem?
In case you'd like to take a look at the code I'm using to parse the AD structures, it's taken from the aforementioned Dave Smith's GitHub and can be found here. The only change I made to that class was to add the following method:
public byte[] getData() {
    return mData;
}
And this is how I handle the callback from the scans:
// Prepare the callback for BLE device scan
this.leScanCallback = new BluetoothAdapter.LeScanCallback() {
    @Override
    public void onLeScan(final BluetoothDevice device, int rssi, byte[] scanRecord) {
        if (!deviceList.contains(device)) {
            MyService.this.deviceList.add(device);
            Log.e("Test", "Device: " + device.getName());
            List<AdRecord> adRecords = AdRecord.parseScanRecord(scanRecord);
            for (AdRecord adRecord : adRecords) {
                if (adRecord.getType() == AdRecord.TYPE_TRANSMITPOWER) {
                    Log.e("Test", "size of payload: " + adRecord.getData().length);
                    Log.e("Test", "payload: " + Byte.toString(adRecord.getData()[0]));
                }
            }
        }
    }
};
And what I see in the console is:
04-01 11:33:35.864: E/Test(15061): Device: estimote
04-01 11:33:36.304: E/Test(15061): Device: estimote
04-01 11:33:36.475: E/Test(15061): Device: n86
04-01 11:33:36.475: E/Test(15061): size of payload: 1
04-01 11:33:36.475: E/Test(15061): payload: 0
04-01 11:33:36.525: E/Test(15061): Device: f79
04-01 11:33:36.525: E/Test(15061): size of payload: 1
04-01 11:33:36.525: E/Test(15061): payload: 0
The txPower mentioned by @davidgyoung is given by the formula:
RSSI = -10 n log d + A
where
d = distance
A = txPower
n = signal propagation constant
RSSI = dBm
In free space n = 2, but it will vary based on local geometry – for example, a wall will reduce RSSI by ~3 dB and will affect n accordingly.
If you want the highest possible accuracy, it may be worthwhile to experimentally determine these values for your particular system.
Reference: see the paper Evaluation of the Reliability of RSSI for Indoor Localization by Qian Dong and Waltenegus Dargie for a more detailed explanation of the derivation and calibration.
double getDistance(int rssi, int txPower) {
    /*
     * RSSI = TxPower - 10 * n * lg(d)
     * n = 2 (in free space)
     *
     * d = 10 ^ ((TxPower - RSSI) / (10 * n))
     */
    return Math.pow(10d, ((double) txPower - rssi) / (10 * 2));
}
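If you do want to calibrate n for your environment rather than assuming the free-space value of 2, one rough approach is to solve the same formula for n using A (the RSSI at 1 m) and the RSSI measured at some other known distance. A minimal sketch; the helper name and the numbers in the example are illustrative, not measured values:

// From RSSI = -10 * n * log10(d) + A  =>  n = (A - RSSI) / (10 * log10(d))
double estimatePathLossExponent(double rssiAtKnownDistance, double knownDistanceMeters,
                                double rssiAtOneMeter /* A */) {
    return (rssiAtOneMeter - rssiAtKnownDistance) / (10 * Math.log10(knownDistanceMeters));
}

// Example (made-up numbers): A = -60 dBm at 1 m, -75 dBm measured at 3 m  =>  n of about 3.1
double n = estimatePathLossExponent(-75, 3.0, -60);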
It is unclear whether your inability to read the "txPower" or "measuredPower" calibration constant is due to the AdRecord class or due to the information being missing from the advertisements you are trying to parse. It doesn't look to me like that class will parse a standard iBeacon advertisement. Either way, there is a solution:
SOLUTION 1: If your beacons send a standard iBeacon advertisement that includes the calibration constant, you can parse it out using code in the open source Android iBeacon Library's IBeacon class here.
SOLUTION 2: If your beacons DO NOT send a standard iBeacon advertisement or do not include a calibration constant:
You must hard-code a calibration constant in your app for each device type you might use. All you really need from the advertisement to estimate distance is the RSSI measurement. The whole point of embedding a calibration constant in the transmission is to allow a wide variety of beacons with quite different transmitter output power to work with the same distance estimating algorithm.
The calibration constant, as defined by Apple, basically says what the RSSI should be if your device is exactly one meter away from the beacon. If the signal is stronger (less negative RSSI), then the device is less than one meter away. If the signal is weaker (more negative RSSI), then the device is over one meter away. You can use a formula to make a numerical estimate of distance. See here.
If you aren't dealing with advertisements that contain a "txPower" or "measuredPower" calibration constant, then you can hard-code a lookup table in your app that stores the known calibration constants for various transmitters. You will first need to measure the average RSSI of each transmitter at one meter away. You'll then need some kind of key to look up these calibration constants in the table. (Perhaps you can use some part of the string from the AD structure, or the MAC address?) So your table might look like this:
HashMap<String,Integer> txPowerLookupTable = new HashMap<String,Integer>();
txPowerLookupTable.put("a5:09:37:78:c3:22", new Integer(-65));
txPowerLookupTable.put("d2:32:33:5c:87:09", new Integer(-78));
Then after parsing an advertisement, you can look up the calibration constant in your onLeScan method like this:
String macAddress = device.getAddress();
Integer txPower = txPowerLookupTable.get(macAddress);
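From there you can combine the looked-up constant with the RSSI delivered by the scan to get a distance estimate. A minimal sketch, assuming a helper along the lines of the getDistance() method shown in the other answer:

// Inside onLeScan(device, rssi, scanRecord), after the lookup above:
if (txPower != null) {
    double meters = getDistance(rssi, txPower); // free-space estimate (n = 2)
    Log.d("Beacon", device.getAddress() + " is roughly " + meters + " m away");
} else {
    Log.w("Beacon", "No calibration constant stored for " + device.getAddress());
}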
Use the getAccuracy() method in the library; it gives you the estimated distance to the beacon.

Sending data from arduino through bluetooth serial

I’m currently trying to design a controller that would communicate through Bluetooth to an android phone (Galaxy Nexus). I’m facing a few challenges. Also, I don’t have much practical programming experience.
At the core of the controller resides an Arduino microcontroller who scans the state of 8 digital pins and six analogue pins (10 bits) and sends the data through serial, to an HC-05 Bluetooth chip.
The Android phone should then read the serial information sent through Bluetooth and either transmit the packet to another phone (that will take me a while to implement, as I know very little of how the internet works) or analyse and interpret it so further action can be taken.
What I'm asking for is insight as to the best way to go about doing this. Now what does best mean?
Well, I want it to be real time, so when I press a button or a combination of buttons, boom, the Android phone takes action fast enough for a human not to perceive a delay.
Then I want to be able to associate the information the phone reads from the serial buffer with the corresponding button or analogue pin, in case there is an error, and to keep them from falling out of sync.
Here is what I have so far. I have not tested this code yet, partly because I’m still learning how to program the android part of this project, partly because I want some feedback as to whether I’m being silly or this is actually an effective way to go about doing this:
//Initialization
/*
This Program has three functions:
1) Scan 8 digital pins and compile their state in a single byte ( 8 bits)
2) Scans 6 Analogue inputs corresponding to the joysticks and triggers
3) Send this info as packets through seril --> Bluetooth
*/
#include <SoftwareSerial.h>// import the serial software library
#define A 2
#define B 3
#define X 4
#define Y 5
#define LB 6
#define RB 7
#define LJ 8
#define RJ 9
#define RX 10
#define TX 11
//Pin 12 and 13 are unused for now
SoftwareSerial Bluetooth(RX,TX); //Setup software serial using the defined constants
// declare other variables
byte state = B00000000;
void setup() {
    //Setup code here, to run once:
    //Setup all digital pin inputs
    pinMode(A, INPUT);
    pinMode(B, INPUT);
    pinMode(X, INPUT);
    pinMode(Y, INPUT);
    pinMode(LB, INPUT);
    pinMode(RB, INPUT);
    pinMode(LJ, INPUT);
    pinMode(RJ, INPUT);
    //Setup all analogue pin inputs
    //Setup serial bus and send validation message
    Bluetooth.begin(9600);
    Bluetooth.println("The controller has successfully connected to the phone");
}
void loop() {
//Main code here, to run repeatedly:
//Loop to sum digital inputs in a byte, left shift the byte every time by 1 and add that if the pin is high
for(byte pin = 2, b = B00000001; pin < 10; pin++, b = b << 1){ if (digitalRead(pin)== HIGH) state += b; }
//Send digital state byte to serial
Bluetooth.write(state);
//Loop to read analgue pin 0 to 5 and send it to serial
for( int pin = 0, testByte = 0x8000; pin < 6 ; pin++, testByte = testByte >> 1) { Bluetooth.write(analogRead(pin)+testByte); }
}
//Adding some validation would be wise. How would I go about doing that?
// I could add a binary value of 1000_0000_0000_0000, 0100_0000_0000_0000, 0010_0000_0000_0000 ... and so on, and then use a simple AND at the other end to verify which analogue reading the info came from.
// So, say the analogue value is 1023 and came from analogue pin 1: I would have 0100_0011_1111_1111. Using a bitwise AND I could check it against 0100_0000_0000_0000; if the result is 0100_0000_0000_0000 then I know this is the analogue reading from pin 1.
// I could easily implement a test loop with a shifting bit on the Android side.
Is it pointless for me to be doing bit shifts like this? Ultimately, if I'm not mistaken, all data is sent as bytes (8-bit packets), so will Bluetooth.write(analogRead(pin)+testByte) actually send two bytes, or will it truncate the int data? How will it be broken down, and how can I recover it on the Android end?
How would you go about implementing this? Any insights or words of advice?
It's great that you are learning this! Some suggestions:
Your code will be easier to read, understand and maintain if you space it out a bit, particularly by adding newlines...
void loop() {
    //Main code here, to run repeatedly:

    //Loop to sum digital inputs in a byte,
    //left shift the byte every time by 1 and add that if the pin is high
    for (byte pin = 2, b = B00000001; pin < 10; pin++, b = b << 1) {
        if (digitalRead(pin) == HIGH)
            state += b;
    }

    //Send digital state byte to serial
    Bluetooth.write(state);

    //Loop to read analgue pin 0 to 5 and send it to serial
    for (int pin = 0, testByte = 0x8000; pin < 6; pin++, testByte = testByte >> 1) {
        Bluetooth.write(analogRead(pin) + testByte);
    }
}
Also, you can use the shift operators with values other than one, so instead of keeping a shift mask that you then add in, you can just use a simple variable. A more classic way to do what you are expressing in the first loop would be something like this:
#define PIN_OFFSET 2

for (int i = 0; i < 8; i++) {
    if (digitalRead(i + PIN_OFFSET) == HIGH)
        state += (1 << i);
}
The second loop could be done similarly:
for (int pin = 0; pin < 6; pin++) {
    short testByte = 0x8000 >> pin;
    short result = analogRead(pin) + testByte;
    Bluetooth.write(result >> 8);
    Bluetooth.write(result & 0xFF);
}
Note that the Bluetooth.write() call sends a byte, so this code sends first the most significant byte, then the least.
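On the Android side of that link, a minimal sketch of putting those two bytes back together; the stream handling is an assumption on my part (you would read from the InputStream of an already-connected android.bluetooth.BluetoothSocket), not part of the original answer:

// Reads one MSB-first 16-bit value, as sent by the Arduino loop above.
int readValue(java.io.InputStream in) throws java.io.IOException {
    int msb = in.read();                     // most significant byte, sent first
    int lsb = in.read();                     // least significant byte, sent second
    if (msb < 0 || lsb < 0) {
        throw new java.io.IOException("stream closed");
    }
    return ((msb & 0xFF) << 8) | (lsb & 0xFF);
}

With that value, the low 10 bits (value & 0x03FF) are the analogue reading and the remaining high bits carry the per-pin marker the question proposes.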
Lastly, you probably want to zero your state variable at the beginning of loop() -- otherwise, once a bit is set it will never get cleared.
You may want to think about what you are sending to the phone. It will be binary data, and that can be difficult to deal with: how do you know where a packet starts and ends, and a value can easily be misinterpreted as a control character, messing you up big time. Consider changing it into a formatted, human-readable string with a newline at the end.
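If you do go the human-readable route, the Android side can then just read whole lines. A small sketch, again assuming a connected BluetoothSocket named socket and a worker thread (both are assumptions, not from the original answer):

BufferedReader reader = new BufferedReader(
        new InputStreamReader(socket.getInputStream(), "US-ASCII"));
String line;
while ((line = reader.readLine()) != null) {    // blocks until a '\n'-terminated line arrives
    String[] fields = line.split(",");          // e.g. one comma-separated field per input
    // parse fields[i] with Integer.parseInt(...) as needed
}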
I hope that helps.
The Arduino end of this is more than fast enough to do what you want to do and you need not worry in the slightest about minimising the number of bytes you send out from the micro to the phone when you're talking about such a tiny amount of data. I'm not quite clear what you're sending to the phone; are you wanting a different value for each button so when a button is pressed it sends out something like:
Button Was Pressed, Button Number (Two bytes)
And if it's an analogue value:
Analogue Value Read, Analogue Value MSB, Analogue Value LSB (assuming the analogue value is wider than eight bits)
I don't know the Arduino code you're using, but my suspicion is that this line:
Bluetooth.write(analogRead(pin)+testByte)
Would send one byte which is the addition of the analogue value and whatever testByte is.
Would you be able to post a link to the documentation for this API?
Any delay with all this will come at the phone end. Please don't concern yourself in the slightest about trying to save a byte or two at the Arduino end! (I'm currently doing a project with a Cortex M3 sending data via a Bluetooth module to Android phones and we're shipping several k of data around. I know from this experience the difference between two and twenty bytes is not of consequence in terms of delay.)
It may be good for you to opt for readily available Bluetooth terminal apps for your Android handset. On Google Play, a few such apps are:
1) https://play.google.com/store/apps/details?id=arduino.bluetooth.terminal&hl=en
2) https://play.google.com/store/apps/details?id=com.sena.bterm&hl=en
(and a few more are available on Google Play). This will help you to focus only on controller-side development.
