I need to know exactly how many packets per connection interval my BLE link can handle. The peripheral, according to its datasheet, handles 6 packets per interval, but I have been unable to find out how many packets the central can handle. The central device is a Motorola Moto G (generation 2) running Android 5.0.2.
By examining the btsnoop_hci.log file I have been able to identify several connection parameters, such as the connection interval (7.5 ms in my case). My question is whether it is possible to determine how many packets can be exchanged in a single connection interval by examining the negotiation packets in Wireshark.
In the spec, there is no negotiation about the maximum number of packets in a connection event. A connection event can simply last for at most (ConnectionInterval - 150 µs); see Core spec Vol 6, Part B, Section 4.5.1.
Limitations, if any, are in the PHYs on either side. Most HCI firmwares limit themselves to 4-5 packets per connection event, per direction.
It is up to the controller on the master side to decide how long the connection event stays open (as long as there are more packets from either side). The slave has no say in this.
For a central host, when creating a connection as well as when updating connection parameters, there are two HCI parameters, Minimum_CE_Length and Maximum_CE_Length. These are informational parameters that tell the controller how long it should keep the connection event open. If they are set to high values, the controller will keep the connection event open as long as possible.
Sadly, Android sets both of these parameters to 0, which means most controllers will restrict the connection event to only 3 or 4 packets.
Related
My BLE server permanently measures a sensor value and sends a notification with 20 bytes of user data after each measurement. The goal is to generate as much throughput as possible.
On the client side, the value sent by the server is received and processed.
rxBleConnection.setupNotification(setDescriptorEnableNotification(characteristic)) // helper that supplies the notify characteristic
        .flatMap(notificationObservable -> notificationObservable)                 // unwrap the actual notification stream
        .observeOn(Schedulers.newThread())
        .buffer(1)                                                                 // each emission is a single-element list of notifications
        .subscribe(
                bytes -> onNotificationReceived(bytes, buffer),                    // `buffer` is a field of the surrounding class
                throwable -> {
                    // Handle an error here.
                    onNotificationSetupFailure(throwable);
                });
If I set the connection interval to 11.25 ms, I receive all values. However, if I set the connection interval to 30 ms, I receive a few values and then the connection is closed.
In the Android log I see the following message:
BleGattException status=8 (0x8),
bleGattOperationType=BleGattOperation{description='CONNECTION_STATE'
Why is the connection interrupted and what is the trigger?
A BLE sniffer does not reveal the cause: the configured connection parameters are accepted and the transfer begins, then the transmission suddenly ends and the error message appears.
Update: a BLE sniffer screenshot has been added.
30 ms: is that the connection interval you set on the server side or on Android?
By the way, on Android you can request a faster connection:
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.LOLLIPOP) {
mBluetoothGatt.requestConnectionPriority(BluetoothGatt.CONNECTION_PRIORITY_HIGH);
}
Error 8 means the connection timed out. There is nothing wrong on the Android side. The problem is with the communication between the two Bluetooth controllers. If you have a sniffer then you should be able to see who is the one that fails to send packets.
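For reference, here is a minimal sketch of how that timeout surfaces in a plain BluetoothGattCallback; the status value 8 is the HCI "connection timeout" error code (0x08), and everything apart from the standard Android callback is illustrative.

private final BluetoothGattCallback gattCallback = new BluetoothGattCallback() {
    @Override
    public void onConnectionStateChange(BluetoothGatt gatt, int status, int newState) {
        if (newState == BluetoothProfile.STATE_DISCONNECTED && status == 8) {
            // 0x08 = HCI connection timeout: the supervision timer expired because
            // the two controllers stopped hearing each other.
            Log.w("BLE", "Link lost (supervision timeout), status=" + status);
        }
    }
};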
Here is an image from the BLE sniffer: [BLE sniffer screenshot]
I have a similar problem. On Android 7.0 there were two ways to keep the connection alive:
1) If the devices are bonded and there is a characteristic-read callback with a constant stream of packets. If no packets arrive for some time, the connection fails.
2) If the devices are not bonded, but I do a TXCharacteristic read every few seconds. If I don't do that for a while, the connection fails.
But in Android 7.1.2 this approach no longer works.
Maybe the first way will work for you.
On your Android device you should create a bond, and on your kit you should handle this bonding.
On the Nexus 6P and Samsung S7 it doesn't work anymore, but I didn't try it on other devices.
I suppose you have a minimum connection interval of 7.5 ms on your BLE kit, but that is now deprecated on Android.
Try setting the minimum connection interval to 11.25 ms on your BLE kit and setting the connection priority
gatt.requestConnectionPriority(CONNECTION_PRIORITY_HIGH);
where CONNECTION_PRIORITY_HIGH = 1,
in onConnectionStateChange (see the sketch below).
It worked for me when I changed the minimum connection interval on my Nordic chip from 7.5 to 11.25 ms.
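A minimal sketch of where that call would go; the gatt/callback wiring is assumed, and CONNECTION_PRIORITY_HIGH is the standard BluetoothGatt constant (value 1), available since API 21.

@Override
public void onConnectionStateChange(BluetoothGatt gatt, int status, int newState) {
    if (newState == BluetoothProfile.STATE_CONNECTED) {
        // Ask the stack for the "high priority" (short interval) parameter set
        // before starting service discovery.
        gatt.requestConnectionPriority(BluetoothGatt.CONNECTION_PRIORITY_HIGH);
        gatt.discoverServices();
    }
}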
If I want to transfer a lot of data (e.g. 1 MB file) over BLE, what's the best way to do it?
I control both sides of the connection, but the client side is iOS/Android so only has access to GATT. I can't do anything with L2CAP.
I also can't wait for Bluetooth 4.1, 6LoWPAN, Connection-Oriented-Channels or anything like that.
I would assume the answer is to have one "request" characteristic that you write a data request to ("Give me 3000 bytes starting at byte 0"), and a "data out" characteristic that sends lots of 20-byte notifications (the maximum notification payload with the default MTU) containing the data.
Is there a better way?
Yes, we are using the approach you mentioned:
1. Request data with the last index number (the first time, the index is 0).
2. The server sends you data with an index number; store the index for the subsequent request.
3. Continue steps 1 and 2 until the server signals the end of the data, probably with index -1 or something similar.
Make sure you transfer the data in the most space-efficient format; see if you can zip the files before transferring them. A rough client-side sketch of the loop follows.
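This is only a sketch of the request/notify loop under several assumptions: SERVICE_UUID, REQUEST_CHAR_UUID, the 4-byte index encoding, the empty-chunk end marker and onTransferComplete(...) are all hypothetical, and notifications on the data characteristic are assumed to be enabled already.

private int nextIndex = 0;                              // index of the next chunk to request
private final ByteArrayOutputStream file = new ByteArrayOutputStream();

private void requestChunk(BluetoothGatt gatt, int index) {
    BluetoothGattCharacteristic req = gatt
            .getService(SERVICE_UUID)                   // hypothetical service UUID
            .getCharacteristic(REQUEST_CHAR_UUID);      // hypothetical "request" characteristic
    req.setValue(ByteBuffer.allocate(4).putInt(index).array());
    gatt.writeCharacteristic(req);
}

@Override
public void onCharacteristicChanged(BluetoothGatt gatt, BluetoothGattCharacteristic ch) {
    byte[] chunk = ch.getValue();
    if (chunk.length == 0) {                            // illustrative end-of-data marker
        onTransferComplete(file.toByteArray());         // hypothetical completion handler
        return;
    }
    file.write(chunk, 0, chunk.length);
    requestChunk(gatt, ++nextIndex);                    // ask for the next chunk
}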
You can update the connection interval to a small value on the remote BLE device; the smallest allowed is 6 × 1.25 ms = 7.5 ms.
Actually, BLE is designed for low energy, small packets and low data rates.
L2CAP data is transmitted on the data channels with frequency hopping. Packet TX/RX happens within each connection interval, and the maximum number of packets exchanged per event is restricted by the specification and ultimately by the manufacturer's implementation. So we can make the connection interval as small as possible to increase the data rate.
Refer to the BT 4.0 spec, Vol 2, Part E, Section 7.8.18, LE Connection Update Command.
Try to negotiate a larger MTU than the default.
Then each notification can be larger. Even though it will be fragmented by the L2CAP layer, you will get slightly higher throughput, since the header overhead per byte of payload is smaller.
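For example, on API 21+ (the value 158 is only an illustration; the peer may negotiate it down):

// After connecting: ask for a larger ATT MTU. The result arrives in onMtuChanged().
private void negotiateMtu(BluetoothGatt gatt) {
    gatt.requestMtu(158);
}

// In the BluetoothGattCallback:
@Override
public void onMtuChanged(BluetoothGatt gatt, int mtu, int status) {
    if (status == BluetoothGatt.GATT_SUCCESS) {
        // Each notification can now carry up to (mtu - 3) bytes of user data.
        Log.i("BLE", "Negotiated MTU: " + mtu);
    }
}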
I'm writing a BLE application where I need to track whether a peripheral device is advertising or has stopped.
I followed "getting peripherals without duplications" and "BLE Filtering behaviour of startLeScan()", and I completely agree with what is said there.
To make it feasible I keep a timer which re-scans for peripherals after a certain time (3 s). But with new devices available on the market (with the 5.0 update), the re-scan sometimes takes quite a while to find peripherals.
Any suggestions, or has anyone achieved this?
Sounds like you're interested in scanning for advertisements rather than connecting to devices. This is the "observer" role in Bluetooth Low Energy, and corresponds to the "broadcaster" role more commonly known as a beacon (Bluetooth Core 4.1 Vol 1 Part A Section 6.2).
Typically you enable passive scanning, looking for ADV_IND packets broadcast by beacons. These may or may not contain a UUID. Alternatively, you can scan actively by transmitting SCAN_REQ, to which you may receive a SCAN_RSP. Many devices use different advertising content in ADV_IND and SCAN_RSP to increase the amount of information that can be broadcast; you could, for instance, fit a UUID128 into the ADV_IND followed by the device name in the SCAN_RSP (Bluetooth Core 4.1 Vol 2 Part E Section 7.8.10).
Now you need to define "go away": are you expecting the advertisements to stop, or to fade away? You get a Received Signal Strength Indication (RSSI) with each advertisement (Bluetooth Core 4.1 Vol 2 Part E Section 7.7.65.2); this is how iBeacon positioning works, and there is plenty of support for beacon receivers on Android.
Alternatively, you wait N seconds for an advertisement that should be transmitted every T seconds, where N > 2T. The downside of the timed approach is that probably not having received a beacon is not the same as definitely having received a weak beacon; to be sure you need N to be large, and that increases the latency between the broadcaster being switched off (or moving out of range) and your app detecting it.
One more thing: watch out that advertising stops if something connects to a peripheral (if you really are scanning for peripherals), which is another good reason to monitor RSSI. A minimal sketch of the timed approach follows.
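This sketch uses the API 21 scanner; the 10-second window (N) and the handler plumbing are illustrative, and a real app would track the timeout per device address.

private static final long TIMEOUT_MS = 10_000;      // N: assume the beacon is gone after this much silence
private final Handler handler = new Handler(Looper.getMainLooper());
private final Runnable onGone = () -> Log.i("BLE", "Beacon stopped advertising (or faded out)");

private final ScanCallback scanCallback = new ScanCallback() {
    @Override
    public void onScanResult(int callbackType, ScanResult result) {
        Log.d("BLE", result.getDevice().getAddress() + " RSSI=" + result.getRssi());
        // Each advertisement re-arms the timeout; if none arrive for TIMEOUT_MS, onGone fires.
        handler.removeCallbacks(onGone);
        handler.postDelayed(onGone, TIMEOUT_MS);
    }
};

void startObserving(BluetoothLeScanner scanner) {
    scanner.startScan(scanCallback);                 // duty cycle can be tuned via ScanSettings
}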
First scenario: Bonded Devices
We know that if a bond is made, most commercially available devices send directed advertisements during re-connection. In situations like this, according to the BLE 4.0 specification, you cannot scan these devices on any BLE sniffer.
Second scenario: Connectable Devices
Peripheral devices are usually in this mode when they are initially in the reset phase. The central sends a connect initiator in response to an advertisement packet. This scenario offers you a lot of flexibility, since you can play with two predominant configuration options to alter the connection timing: the slave latency on the peripheral and the connection interval on the central. Now, I don't know how much effort it will take to get this working on the Android platform, but if you use the BlueZ BLE stack and a configurable peripheral such as a TI SensorTag, then you can play around with these values.
Third scenario: Beacon devices
Since this is what your question revolves around: according to the BLE architecture, there are no parameters to play with here. In this scenario the central is just a dumb device, left at the mercy of when a peripheral chooses to send its beaconing signal.
Reference:
http://www.amazon.com/Inside-Bluetooth-Communications-Sensing-Library/dp/1608075796/ref=pd_bxgy_14_img_z
http://www.amazon.com/Bluetooth-Low-Energy-Developers-Handbook/dp/013288836X/ref=pd_bxgy_14_img_y
Edit: I forgot to mention, have you tried setting the advertiser to non-connectable? That way you should be able to get duplicate scan results.
I am dealing with a similar issue, that is, reliably tracking the RSSI values of multiple advertising devices over time.
Sadly, the most reliable way I found is not nice; it is rather dirty and battery-consuming. Given the number of Android devices that handle BLE differently, it nevertheless seems to be the most reliable approach.
I start an LE scan, and as soon as I get a callback I set a flag to stop and restart the scan. That way you work around the DUPLICATE_PACKET filter issue, since the filter resets whenever you start a fresh scan.
The ScanResults I dump into an SQLite DB, which I shrink and evaluate once every X seconds.
It should be easy to adapt the shrinking to your use case, i.e. removing entries that are older than X and then querying for the existence of a device to find out whether you received a ScanResult in the last X seconds. However, don't set X too low, as you have to take into account that you still lose a lot of advertisement packets with an Android LE scan compared to a BLE scan on, for example, BlueZ. A sketch of the restart trick is shown below.
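Here it is with the API 21 scanner; scanner is an already-obtained BluetoothLeScanner, storeResult(...) is a hypothetical persistence helper, and a real implementation would throttle how often the restart happens.

private final ScanCallback restartingCallback = new ScanCallback() {
    @Override
    public void onScanResult(int callbackType, ScanResult result) {
        storeResult(result);        // hypothetical: dump the result into the SQLite DB
        // Stop and immediately restart the scan: the duplicate filter is reset on every
        // fresh scan, so already-seen devices keep being reported with new RSSI values.
        scanner.stopScan(this);
        scanner.startScan(this);
    }
};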
Edit:
I can add some information I found for speeding up advertisement discovery. It involves modifying and compiling the bluedroid sources, and root access to the device. The easiest route is to build a full Android image yourself, e.g. CyanogenMod.
When an LE scan is running, the Bluetooth module sends the scan responses via HCI to the bluedroid stack. There, various checks are done until the result is finally handed to the Java onScanResult(...), which is reached via JNI.
By comparing the log of the HCI data sent from the Bluetooth module (it can be enabled in /etc/bluetooth/bt_stack.conf) with debug output in the bluedroid stack as well as on the Java side, I noticed that a lot of advertisement packets are discarded, especially in one check I don't really understand, other than that it has something to do with the bluedroid inquiry database.
From the documentation of ScanResult we see that the ScanRecord includes the advertisement data plus the scan response data. So it might be that Android holds back the report until it has received the scan response data, or until it is clear that there is no scan response data. I could not verify this, but it is a possibility.
As I am only interested in rapid updates of the RSSI of those packets, I simply commented that check out. It seems that this way every single packet I get from the Bluetooth module via HCI is handed through to the Java side.
In the file btm_ble_gap.c, in the function BOOLEAN btm_ble_update_inq_result(tINQ_DB_ENT *p_i, UINT8 addr_type, UINT8 evt_type, UINT8 *p),
comment out to_report = FALSE; in the following check, which starts on line 2265:
/* active scan, always wait until get scan_rsp to report the result */
if ((btm_cb.ble_ctr_cb.inq_var.scan_type == BTM_BLE_SCAN_MODE_ACTI &&
     (evt_type == BTM_BLE_CONNECT_EVT || evt_type == BTM_BLE_DISCOVER_EVT)))
{
    BTM_TRACE_DEBUG("btm_ble_update_inq_result scan_rsp=false, to_report=false,\
                    scan_type_active=%d", btm_cb.ble_ctr_cb.inq_var.scan_type);
    p_i->scan_rsp = FALSE;
    // to_report = FALSE;  /* to_report is initialized to TRUE, so we simply leave it set and report the packet anyway */
}
else
    p_i->scan_rsp = TRUE;
I have two Android phones, both connected to the same Wi-Fi and both with Bluetooth.
I want some method that somehow syncs the phones and starts a function at the same time on both of them.
For example, playing a song at the same time.
I already tried Bluetooth, but it lags, sometimes by 0.5 s. I want something within +/- 0.01 s if possible.
Someone suggested playing it 2-3 seconds in the future and sending the timestamp, but how do you sync the internal clocks of the devices then?
Before calling that particular method, try to measure the latency between the two devices:
1. First device says Hi (and stores the current time).
2. Second device receives the Hi.
3. Second device says Hi back.
4. First device receives the Hi; the latency is roughly (currentTime - storedTime) / 2.
Now that you have the latency, send your request to the second device to start the particular method, and start it on the first device after the latency has elapsed.
Measure the latency 5 to 10 times to be more accurate. A rough sketch of the measurement is shown below.
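This sketch does the ping/pong over a plain TCP socket; the host/port and the single-round simplification are assumptions, and the peer is assumed to simply echo the line back. Averaging 5-10 rounds would wrap around this method.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

public class LatencyProbe {
    /** Returns an estimate of the one-way latency to the peer, in milliseconds. */
    public static long measureOneWayLatencyMs(String peerHost, int peerPort) throws Exception {
        try (Socket socket = new Socket(peerHost, peerPort);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
             BufferedReader in = new BufferedReader(new InputStreamReader(socket.getInputStream()))) {
            long start = System.nanoTime();           // step 1: store the current time and say Hi
            out.println("Hi");
            in.readLine();                            // steps 2-4: wait for the echoed Hi
            long roundTripNanos = System.nanoTime() - start;
            return (roundTripNanos / 2) / 1_000_000;  // half the round trip is roughly the one-way latency
        }
    }
}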
You have a way to transfer data between the devices, right?
If so, you can send a timestamp that lies in the future.
For example, if the current timestamp is 1421242326, you send 1421242329 or so, and start the function at that time on both devices.
Basically use #Dula's suggestion (device 1 sends command to device 2 and gives a "start time" which lies in the future). Both devices then start the action at the same time (in the future).
To make sure that the devices are synchronized, you can use a server-based time sync (assuming that both devices have Internet access). To do this, each device contacts the same server (using NTP, or HTTP-based NTP, or a known HTTP server such as www.google.com, and uses the value in the "Date" header of the HTTP response). The server date is compared to the system clock on the device, and the difference is that device's offset from server time. The offsets can then be used to synchronize on server time, which is used as the time base for the actual action (playing the media, etc.). A minimal sketch of the HTTP variant follows.
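This is only a sketch of the HTTP-Date variant; the URL is illustrative, and since the Date header has one-second resolution, NTP is the better option if you really need roughly 10 ms accuracy. Each device computes its own offset against the same server, and the agreed start time is then expressed in server time.

import java.net.HttpURLConnection;
import java.net.URL;

public class ServerClock {
    /** Returns (server time - local time) in milliseconds, based on the HTTP Date header. */
    public static long offsetFromServerMs(String url) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("HEAD");                          // only the headers are needed
        long localTime = System.currentTimeMillis();
        long serverTime = conn.getHeaderFieldDate("Date", localTime);
        conn.disconnect();
        return serverTime - localTime;
    }
}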
If your WiFi router allows clients to talk to each other (many public hotspots disable this), you could implement a simple socket listener on one (or each) device and have the initiating device broadcast a message.
For more complicated things and network flexibility, I've had good success with connected sessions using AllJoyn. There is a bit of a learning curve to do interesting things, but the simple stuff is pretty easy once you understand the architecture.
Use a server to provide a synchronous event to just the two clients that have declared their mutual affinity (a random value as a parameter plus a pair serializer, Partner-1 or Partner-2, which they share prior to their respective calls for the sync event).
Assume both clients are on the same subnet: packets from the two events, serialized on the server, arrive across the network at the two clients simultaneously on the client side. This provides synchronous plays by the two bound clients.
The event delivered by the server is either a confirmation to play the queued, selected track, or a broadcast (decoupled, more formal).
The only tricky thing is the server-side algorithm implementing this:
Queue a pair of requests, or report an error.
Partner-1 and Partner-2 requests with the same random value constitute a valid pair if both are received before either times out.
On a valid pair, schedule both to the same future event in their respective, committed responses.
On schedule, do the actual I/O for the two paired requests. The respective packets will arrive back at the respective clients at the same time, each response having been subject to equal network latency.
This is no good if two different carrier 4G or LTE networks are involved.
This is possible via sockets: you send an event over a socket and the other device receives that event. To learn how, look at a Socket.IO chat example.
Maybe it's not the answer you are looking for, but given the high precision you want, I think you should look at a push technology. I advise you to take a look at SignalR. It's a real-time technology that gives you an abstraction over the sending methods; it has built-in methods like Clients.All.Broadcast that fit your needs.
You can try using an MQTT framework to send messages between the two devices, or among a larger set of devices.
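For example, with the Eclipse Paho Java client (the broker URL, client id and topic are placeholders):

import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttMessage;

public class SyncPublisher {
    public static void main(String[] args) throws Exception {
        MqttClient client = new MqttClient("tcp://broker.example.com:1883", "device-1");
        client.connect();
        // Both devices subscribe to the same topic; the initiator publishes a
        // "start at <future timestamp>" message that each device then acts on locally.
        long startAt = System.currentTimeMillis() + 3000;
        client.publish("sync/start", new MqttMessage(Long.toString(startAt).getBytes()));
        client.disconnect();
    }
}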
My BLE application requires computation on the server side (the BLE chip) which takes time and results in a disconnection.
The flow is like this:
1- Android phone writes the characteristic value to the BLE chip.
2- The chip evaluates this value and starts computation.
3- The connection is lost soon after the computation has started.
What solution can I apply to prevent the disconnection? I have two in mind:
1- Changing the connection interval: currently Android uses 7.5 ms as the connection interval. Since the computation on the BLE chip takes time, packets are not sent or received during the computation. Increasing the connection interval will decrease the number of lost packets. However, there is no guarantee that the Android phone will accept the new connection parameters.
2- Running the computation in a separate thread: I don't think BLE chip SDKs support multi-threading such that, while a computation is going on, the chip keeps receiving and sending packets and so prevents the disconnection. I use a CSR chip and I don't think it supports this.
Please correct me if I am wrong on any of these points.
Do you have any other suggestions to solve the issue?
Thanks in advance.
Thank you for the answers. I found out what the problem was after spending hours on it.
First of all, when Android gives error 133 or 129, it is most probably because of the remote device.
At the beginning I thought the problem occurred because of the supervision timeout. Then I re-configured the connection parameters of the CSR chip, but it didn't help.
There is a problem with CSR app development in xIDE (CSR's IDE): when a run-time error occurs due to an index overrun or access to an invalid pointer, you do not get any error in xIDE. I finally found the array problem and fixed it. Now it works perfectly.
Thanks a lot!
I don't know whether what I am going to explain is feasible on Android, because I have only used BLE in low-level applications. Anyway, if your problem is the connection parameters, you can try changing the Slave_Latency.
It should be useful: by playing with this parameter you can change the number of connection events the peripheral is allowed to skip, which effectively extends how long it can stay silent before the central considers the connection lost.
The following equation is useful for deriving the connection parameters:
Effective_Connection_Interval = (Connection_Interval)*(1+(Slave_Latency))
Remember that there is also a Supervision_Timeout, which can conflict with your Effective_Connection_Interval.
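For example, with a 30 ms connection interval and a slave latency of 4, the effective connection interval becomes 30 ms × (1 + 4) = 150 ms; the supervision timeout has to stay comfortably above that value, or the link will be declared lost.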