Setting BLE connection parameters in a Movesense application - Android

I am wondering if there is a way to set the BLE connection parameters in my custom firmware, and also for the Logbook service, as I would like to transfer data from the device to a mobile application as fast as possible.
From what I can see in the BLE spec, and from reading documentation, the connection interval, the number of packets per interval, and the data length extension (DLE) can be set to increase the transfer rate. But these cannot be set from either the Android device or the iOS device, and the recommendation is to set them from the peripheral device.
And from what I have read in the Movesense documentation, I could not find any way of setting the preferred parameters through the Movesense API.

Update: BLE parameter customization is coming in version 2.2.
Original answer:
For now there is no way of setting the BLE connection parameters from the Movesense device. The default settings (connection interval 7.5ms-1000ms, MTU 158, DLE enabled) allow the phone to choose the best-performing settings. I've typically seen a connection interval of 45ms when connecting with an Android phone that supports large MTU & DLE (BLE 4.2), which provides the maximum possible transfer rate (around 10-12kB/s).
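For context, the 10-12kB/s figure can be sanity-checked with some back-of-the-envelope arithmetic. A sketch in plain Java (the packets-per-event count is my own assumption, not a Movesense figure):

```java
public class BleThroughputEstimate {
    // Rough throughput estimate: each notification carries (ATT_MTU - 3)
    // bytes of payload, and a few packets fit into each connection event.
    // The packets-per-event count is an assumption; real values depend on
    // the phone's BLE stack and radio conditions.
    static double bytesPerSecond(int attMtu, int packetsPerEvent, double connIntervalMs) {
        int payloadPerPacket = attMtu - 3; // 3 bytes of ATT notification header
        return payloadPerPacket * packetsPerEvent / (connIntervalMs / 1000.0);
    }

    public static void main(String[] args) {
        // MTU 158, 45ms interval, ~3 packets per event -> roughly 10kB/s,
        // in the same ballpark as the 10-12kB/s quoted above.
        System.out.println(Math.round(bytesPerSecond(158, 3, 45.0))); // ~10333 bytes/s
    }
}
```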
We have planned to add a way to give the developer more control over the BLE parameters, as well as to optimize power consumption in typical use cases (when this will be available to developers is not yet clear).
Full disclosure: I work for Movesense team

Related

How can I enforce a minimum Bluetooth MTU for an Android app?

I'm working with a piece of hardware that sends and receives packets of data over particular BLE characteristics. Currently, each packet has to be an individual read or write.
When talking to the device from iOS, with the MTU negotiated up to 157 bytes, it's easy to observe this limitation. It's also easy to confirm that this MTU negotiation will work on the entire range of Apple devices the app needs to support. Without MTU negotiation, we would be limited to the default 23 byte ATT MTU, meaning we could only send and receive 20 byte packets (due to GATT overhead). The data format used for certain features will not fit in a 20 byte packet.
In the near future we'll be porting the app to Android and eventually releasing it for public consumption. I know that the Android API provides the requestMtu() method for attempting MTU negotiation, but if a user's Android device does not support an ATT MTU higher than the default 23 bytes, we'd like to be able to tell the user as soon as possible that the app will be unable to perform certain functions. Ideally, we'd like to stop the user from installing the app on such a device.
Is there a way to detect (directly or indirectly) the BLE MTU capabilities of an Android device without first requiring the user to connect to a BLE peripheral?
(Note: I've encountered a few anecdotes claiming that assorted Android devices only support the 23 byte default, but these mostly seem to come from developers who don't know about MTU negotiation. It's possible that all extant Android devices support higher MTUs. I don't know whether this is true.)
Can you give an example of an Android device that does not support anything higher than the default MTU?
From what I have heard, all Android devices should support a higher MTU than the default. The only problem is Android 4.3 and 4.4, which do not have the requestMtu API, but there you can simply initiate the MTU change from the peripheral instead.
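To illustrate the overhead the question mentions: a single ATT write or notification carries MTU minus 3 bytes of payload, so an app stuck at the default 23-byte MTU has to frame its data into 20-byte chunks. A minimal sketch in plain Java (no Android classes; the helper names are made up):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class GattChunker {
    // Usable payload of a single ATT write/notification:
    // MTU minus the 3-byte ATT header (opcode + handle).
    static int maxPayload(int attMtu) {
        return attMtu - 3;
    }

    // Split a message into MTU-sized chunks, as the question's packet
    // protocol would have to do on a device stuck at the default MTU.
    static List<byte[]> chunk(byte[] data, int attMtu) {
        int payload = maxPayload(attMtu);
        List<byte[]> chunks = new ArrayList<>();
        for (int off = 0; off < data.length; off += payload) {
            chunks.add(Arrays.copyOfRange(data, off, Math.min(off + payload, data.length)));
        }
        return chunks;
    }

    public static void main(String[] args) {
        System.out.println(maxPayload(23));   // 20 (default MTU)
        System.out.println(maxPayload(157));  // 154 (the negotiated iOS MTU above)
        System.out.println(chunk(new byte[50], 23).size()); // 3 chunks (20 + 20 + 10)
    }
}
```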

RLC protocol in cellular networks. Is there a way to control the mode used from within an app?

I'm developing a VoIP application that has to perform well on mobile networks. It is tolerant to packet loss, but here's the bad part: I found out that on mobile networks, on all standards from GSM to LTE, there's that RLC protocol used between the device and the base station. RLC can operate in two modes: acknowledged and unacknowledged. Acknowledged mode, which I observed being used during my experiments, means that if there are any bit errors during packet transmission, it will be retransmitted until there are none, thus holding up the send queue the whole time it gets retransmitted. In the unacknowledged mode, a packet with bit errors is just dropped, and that's what I need.
So... Is there any way I can control the RLC mode used for my application's packets, or is said mode configured by the network? I already tried the "service type" field in the IP header, but it didn't seem to do the trick.
I've put the "android" tag here, but, ideally, I'm interested in a solution that works across all major mobile operating systems.
There is no way to do this from the device for two reasons:
1 - This is a network-configured attribute used to enforce the QoS the operator wants. All data services run on RLC AM sharing the same channel. A channel with dedicated QoS (such as higher priority and RLC UM) would typically be reserved for the operator's own phone service (VoLTE).
2 - Even if the device could pick its preferred mode, that is a low-level configuration used by the modem/chipset. Thus, it is not exposed to Android developers (RLC sits under PDCP, which is under the IP layer).
Still, I don't think your VoIP performance is being affected only by this property. Latency depends on other factors as well, such as network load and radio conditions. The higher the number of users in an LTE network, the higher the latency; the worse the cell coverage, the worse the latency.
Give speedtest.net a try to check latency. It should be OK up to around 30ms or so.
Cheers.

Burst notifications with Bluetooth Low Energy on Android

I'm developing a Bluetooth low energy application to connect with a device which will be sending 20 byte long transmissions in notification mode in intervals of 6 milliseconds or more.
So far the application is working fine. It can scan, discover, and then subscribe to the characteristic to receive data notifications. The issue is that for the first 2-4 seconds the data is read nicely in sequential order, but after that the notification data starts to arrive in bursts, i.e. in chunks of data without consistent intervals between transmissions.
This doesn't happen when I check the data transmission with the Texas Instruments BLE evaluation kit; there my reader shows a perfect transfer with no bursts appearing. It only becomes visible on Android.
Could this be something that can be fixed with configuration on the Android side?
Could this be a problem with the high transmission rate (~millisecond intervals)?
Thank you..
So it sums up to this: optimal throughput can be achieved with the proper configuration of the connection parameters for the BLE connection. This is usually done at the peripheral's end and may have to differ depending on the platform being connected to (i.e. iOS and Android may have different connection requirements).
P.S.: Since I was looking at Android, I found this method documented here: https://developer.android.com/reference/android/bluetooth/BluetoothGatt.html#requestConnectionPriority(int) which requests a connection priority (CONNECTION_PRIORITY_BALANCED, CONNECTION_PRIORITY_HIGH or CONNECTION_PRIORITY_LOW_POWER), but I didn't test it.
You could try enabling the Bluetooth HCI Snoop Log in Developer Options and then viewing the log file in Wireshark. Look for connection update commands; these can be issued by either side of the connection. Such a command changes the transmission settings and can slow down the transfer. Also look for GAPROLE_PARAM_UPDATE_ENABLE in your TI BLE app.
Yes Michael, we use the CC2650, and for our requirement BLE is sufficient, but I'm not sure if it really supports Bluetooth Classic (http://www.ti.com/product/CC2650/description).
You can try playing around with the BLE connection parameters to get the setup tuned; that's what we did, apart from trying to build the app giving priority to BLE operations. Take a look at this for more information on connection parameters:
https://devzone.nordicsemi.com/question/60/what-is-connection-parameters/
You can't configure the connection parameters on the phone, only on the peripheral (i.e. the SensorTag), and even so there's no guarantee that the requested parameters will actually be accepted; the connection will settle on a set of parameters accepted by the central device. (Android and iOS have different policies in this regard.)
In our case we are transmitting in intervals of 15ms and it seems quite stable. But all these high-frequency transfers come at the cost of the low power consumption that BLE is really intended for. We could go even lower, close to 7.5ms, which is the minimum connection interval supported by Android. Our initial tests were stable, but the reliability of such a low latency is questionable.

Android Bluetooth Low Energy sequential write performance

I have a Bluetooth LE remote-controlled car, so I need to write periodically to a drive characteristic on the car. My microcontroller (Atmel XMega128A1 @ 32MHz + nRF8001) should be able to handle up to 122 connections per second @ 7.5ms connection interval.
My Android App is based on cordova and a bluetooth low energy plugin: https://github.com/randdusing/BluetoothLE
I am running this on a Nexus 5 with Android v4.4.4.
I have a timer which sends values for steering and acceleration to the car every 175ms. I would like to send every 50ms, but that does not work. I cannot tell where the problem is, but I guess it is the Android implementation of GATT (I get a pending-command error at some level).
If I write more than it can handle, the car executes all commands in a row but time-shifted. Obviously some queue hiccup, and it is not the microcontroller, as it operates much faster.
I tried a timing change, which seems to be successful. I also tried turning WiFi off, hoping it would help, but nothing changed.
Is there any experience on periodical writings to a GATT characteristics on Android? Examples would be great.
First of all, you should make a robust design. Data should be driven by a callback from the Android BT stack telling you when it's ready to accept more data (i.e. when the previous transmission is done). Do not use a timer. Retransmissions will always happen at the lower stack level, so you cannot rely on an exact transfer interval or throughput.
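A minimal sketch of the callback-driven queue described above. The BleWriter interface is a made-up stand-in for the real GATT write call; in an actual Android app, onWriteCompleted() would be invoked from BluetoothGattCallback.onCharacteristicWrite():

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class WriteQueue {
    // Hypothetical stand-in for the real write call; on Android this
    // would be BluetoothGatt.writeCharacteristic().
    interface BleWriter { void write(byte[] packet); }

    private final Queue<byte[]> pending = new ArrayDeque<>();
    private final BleWriter writer;
    private boolean writeInFlight = false;

    WriteQueue(BleWriter writer) { this.writer = writer; }

    // Enqueue a packet; only start a write if none is outstanding.
    synchronized void send(byte[] packet) {
        pending.add(packet);
        maybeWriteNext();
    }

    // Call this from the stack's write-completed callback
    // (BluetoothGattCallback.onCharacteristicWrite() on Android).
    synchronized void onWriteCompleted() {
        writeInFlight = false;
        maybeWriteNext();
    }

    private void maybeWriteNext() {
        if (!writeInFlight && !pending.isEmpty()) {
            writeInFlight = true;
            writer.write(pending.poll());
        }
    }

    public static void main(String[] args) {
        final int[] written = {0};
        WriteQueue q = new WriteQueue(p -> written[0]++);
        q.send(new byte[2]); // starts immediately
        q.send(new byte[2]); // queued until the first write completes
        System.out.println(written[0]); // 1
        q.onWriteCompleted();
        System.out.println(written[0]); // 2
    }
}
```

The point is that the send rate is paced by the stack itself instead of a fixed timer, so a slow connection never piles up a time-shifted backlog like the one described in the question.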
7.5ms is the shortest possible connection interval; however, the default is usually much slower (48.75ms on my Nexus 5 with Android L). So from your peripheral you should try to request a faster connection interval once connected. This will speed up your throughput and responsiveness.
Some Android BT stacks refuse if you try to force a very fast connection interval, and you should handle that intelligently, e.g. by trying 7.5ms (parameter value 6) and increasing it if that failed. The iOS design guidelines say you must not use a lower value than 20 (×1.25ms), and the requested upper value should be at least 20 higher than the lower. You will get a faster connection interval, though, if you request values min=10, max=20, and end up around 18ms or so.
Android seems to accept 7.5ms (value 6) in most cases, but again you should not force it, because the stack might cancel the connection.
I made experiments on Android L, requesting connection intervals from the peripheral side when connected. Android rounded off requests so only every 3rd step gave a difference.
6=7.5ms, 9=11.25ms, 12=15.0ms, ..., 39=48.75ms which seems to be the default value on Nexus 5 running Android L.
Bluetooth is a shared resource in the Broadcom chipset most smartphones use. WiFi, BT Classic, BT Low Energy and sometimes GPS share bandwidth, so you will see hiccups and must be tolerant of them. Make a robust design.
Something else you can try is to renegotiate the MTU size if you need larger data packages than the default. By specification this is an optional BLE feature; however, Apple broke it completely in iOS 7, where they use it as a mandatory thing to boost throughput. This broke all BLE devices which didn't implement the response handler, so they crashed and could not be used with iOS devices until a SW update was made. Baaaaaad. For Android this is not a problem, though.

How to enable high speed Bluetooth (3.0+HS or 4.0) in Android?

So I have a BT client and a server application on two Bluetooth 4.0 Android phones. The server waits for a connection via
BluetoothServerSocket serverSocket = mBluetoothAdapter.listenUsingRfcommWithServiceRecord(SDP_NAME, UUID.fromString(SDP_UUID));
and the client connects to it via
socket = device.createRfcommSocketToServiceRecord(UUID.fromString(SDP_UUID));
Then, using an AsyncTask, I am sending data in an endless loop from the client to the server:
byte[] buffer = new byte[4096];
while (true) {
    outputStream.write(buffer);
}
I calculated the speed and only got around 230KByte/s, which corresponds to the 2.1MBit/s that Bluetooth EDR offers. How do I send the data via Bluetooth HS (24MBit/s)?
BT 3.0+HS is a scheme where the high rates are achieved by actually using the WiFi physical layer. So it only works if you have the right kind of BT/WiFi combo chip that supports it, which isn't very common. Having a 4.0 device does not mean it does 3.0+HS; it just means it can do BT Low Energy, which is low data rate.
I understand that Google has not opened up the APIs required to drive this function built into the 4.0 chips. Since the functionality works on laptops and various Windows OSes, maybe the mobile Windows OS is closer to, or capable of, supporting it with a software patch. I think the priority for Google is to work on low battery consumption before HS.
Also, I think that wireless operators were not keen on allowing high-speed tethering for free, which has killed the software development efforts.
