Bluetooth Android audio skips when transferring data over RFCOMM

Hey guys, I'm dealing with an annoying issue.
While I'm sending larger amounts of data over an RFCOMM channel with A2DP connected, the audio will skip. I've tried a lot of different things; the only surefire fix is to space out the data being sent with delays. I'm pretty sure this is a low-level Android issue, as it mostly happens on 2.3.x but still happens on 4.0.
Has anyone seen a similar issue?

An A2DP connection can consume the majority of the available Bluetooth bandwidth. Once you start adding RFCOMM packets, you are taking up space that could otherwise be used for A2DP retransmissions, so your ability to hide lost packets is reduced. Further bandwidth can be lost if your device is doing periodic page or inquiry scans, so you may want to ensure that is not happening. Basically, I wouldn't expect to run A2DP and RFCOMM at the same time unless your RFCOMM traffic is extremely low.
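The workaround mentioned in the question (spacing out writes with delays) amounts to pacing the RFCOMM traffic to a byte budget so A2DP keeps its retransmission headroom. A minimal plain-Java sketch of the chunking and delay arithmetic; all names here are hypothetical, and in a real app the chunks would be written to the BluetoothSocket's output stream with the computed pause between writes:

```java
import java.util.ArrayList;
import java.util.List;

class PacedSender {
    // Splits a payload into fixed-size chunks so each RFCOMM write stays small.
    static List<byte[]> chunk(byte[] payload, int chunkSize) {
        List<byte[]> chunks = new ArrayList<>();
        for (int off = 0; off < payload.length; off += chunkSize) {
            int len = Math.min(chunkSize, payload.length - off);
            byte[] c = new byte[len];
            System.arraycopy(payload, off, c, 0, len);
            chunks.add(c);
        }
        return chunks;
    }

    // Pause (in ms) between chunk writes so the average rate stays at or
    // below bytesPerSecond, leaving bandwidth free for A2DP.
    static long interChunkDelayMs(int chunkSize, int bytesPerSecond) {
        return (1000L * chunkSize) / bytesPerSecond;
    }
}
```

For example, 128-byte chunks capped at 1 KiB/s would sleep 125 ms between writes; the right budget has to be found empirically per device.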

Related

How to increase the speed of data transfer from Android to Arduino over Bluetooth?

I'm trying to use an Android app to do the path-finding computation for a robot over Bluetooth. Currently it takes 1 or 2 seconds for the transfer to complete before there is any output on the Arduino. Is there a way to minimise this and make the transfer effectively instant?
This kind of delay causes problems, such as the robot failing to stop instantly when an obstacle is detected. Is there any better way of doing this?
Thanks in advance!
You didn't mention which device you are using. I assume you connected the Bluetooth chipset to a UART port (as on an Arduino Uno). In that case, the slowest part of the whole communication chain is the serial interface between the Arduino and the Bluetooth chipset. Check what baud rate you are using and whether it can be increased further. The default is typically 9600 baud, which is only around 960 bytes per second. Set the maximum baud rate supported by your device and the Bluetooth chip.
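The 960 bytes-per-second figure above comes from UART framing: with the common 8N1 configuration, each byte costs 10 bits on the wire (8 data bits plus one start and one stop bit). A small sketch of that arithmetic, assuming 8N1 framing:

```java
class UartThroughput {
    // Effective payload throughput of an 8N1 UART link: each byte is
    // framed as 10 bits (start + 8 data + stop), so bytes/s = baud / 10.
    static int bytesPerSecond(int baud) {
        return baud / 10;
    }
}
```

So 9600 baud gives roughly 960 B/s, while 115200 baud gives roughly 11520 B/s, which is why raising the baud rate is the first thing to try.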
Simple answer: you can't; Bluetooth is laggy like that. If you instead ran the path-finding algorithm on the Arduino board itself, you could avoid the issue. You can also try adding a delay to your Arduino code, because it is possible the Arduino is sending messages repeatedly without accounting for the lag that Bluetooth has.
Two simple solutions worked for me:
Increase the delay to 50-100 milliseconds.
Add this after Serial.begin(9600) in setup():
Serial.setTimeout(50);
Step two is the most important; it worked for me only after I added the line above. This is not mentioned very often in the other forums I looked at when I had the exact same problem.

Measuring packet loss through Android N

I am working on a project that is meant for testing network diagnostics. The section I'm working on now involves testing TCP and UDP connections. One of the points of measurement is packet loss for both TCP and UDP.
In the past I used '/statistics/rx_dropped' (I'm leaving off the beginning portion of that file path); suffice it to say it points to the total number of packets dropped on a specified network interface.
However, in Android N this file requires root to read and I can't assume that the phone this app will be on is rooted.
Does anyone have a decent way of measuring packet loss on the client side for at least TCP that doesn't require rooting?
I am mostly aware of networking jargon so don't be shy, the reason I ask is because the way I measure has to be fairly elegant (either using some existing java/android library or finding a legal way of reading packet loss from the device).
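Since the kernel counter needs root on Android N, one root-free option is to measure loss at the application layer: have the sender stamp each UDP probe with a sequence number and count the gaps on the receiving side. A minimal plain-Java sketch of the receiving side; all names are hypothetical:

```java
import java.util.Set;
import java.util.TreeSet;

class LossCounter {
    private final Set<Integer> seen = new TreeSet<>();
    private int highest = -1;

    // Record each probe's sequence number as it arrives.
    void onPacket(int seq) {
        seen.add(seq);
        if (seq > highest) highest = seq;
    }

    // Fraction of sequence numbers 0..highest that never arrived.
    double lossRatio() {
        if (highest < 0) return 0.0;
        int expected = highest + 1;
        return (expected - seen.size()) / (double) expected;
    }
}
```

Note this only works directly for UDP; TCP retransmits lost segments transparently, so at the application layer you would instead have to infer loss indirectly, e.g. from throughput stalls or round-trip-time spikes.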

Working behind the NATs - a scheme for device communication

I'm trying to come up with a solution enabling data exchange between an embedded device (xMega128(C)-based) and an Android app. The catch is that the data exchange must be conducted via the Internet, and both the embedded device and the mobile device running the app can be behind different NATs, connecting via different ISPs, 3G, LTE, etc.
I tried UDP hole punching, but it does not work with symmetric NATs. Multi-hole punching with port prediction also does not guarantee 100% reliability. I also considered ICE, but the ICE C libraries (pjnath, libnice) are incompatible with the chosen hardware (the libraries require an OS). Right now I'm considering implementing or using (if one exists) a traffic relay server, but that just seems like a hack to me.
Are there any other options I haven't considered? Any help will be appreciated.
Ideally, the communication scheme would be:
100% reliable
relatively low-latency (3 seconds absolute max)
scalable (say up to 500k devices in the future)
initializable by both the app and the device
multi-user – one device would connect to many Android apps
Also, if this helps, the data exchange between the device and the app is not very high-intensity – roughly 1 session per hour, ~50 messages per session with 10-20 seconds between them, each message weighing around 100 bytes.
What you're describing is effectively peer-to-peer, or a subset thereof, and getting that working reliably is a lot of work. Where peer-to-peer fails you normally fall back to a relay server. It can be done, but the amount of work involved is quite large. Your list of requirements is also quite steep...
100% reliable
There's no such thing as a reliable connection. You need to build fault tolerance into the app to make it reliable.
relatively low-latency (3 seconds absolute max)
Quite often you will be limited by physics, i.e. the speed of light. Low latency is hard.
scalable (say up to 500k devices in the future)
It isn't clear what this means: is this concurrent connections?
From Wikipedia on NAT traversal:
Many techniques exist, but no single method works in every situation
since NAT behavior is not standardized. Many NAT traversal techniques
require assistance from a server at a publicly routable IP address.
Some methods use the server only when establishing the connection,
while others are based on relaying all data through it, which adds
bandwidth costs and increases latency, detrimental to real-time voice
and video communications.
In other words, it will work sometimes, i.e. it will be unreliable, so you need to use several methods to make it reliable.
As long as both endpoints are behind different NATs you don't control, it won't work reliably. No way. You need a relay.
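The answers above describe a fallback chain: try the cheap traversal methods first and fall back to a relay, which always works but costs server bandwidth. That ordering can be sketched as trying connection strategies in sequence; everything here is a hypothetical placeholder, not a real traversal library:

```java
import java.util.List;
import java.util.Optional;
import java.util.function.Supplier;

class ConnectStrategy {
    // Try each strategy in order (e.g. direct connect, UDP hole punch,
    // relay) and return the first one that yields a connection handle.
    static Optional<String> connect(List<Supplier<Optional<String>>> strategies) {
        for (Supplier<Optional<String>> s : strategies) {
            Optional<String> conn = s.get();
            if (conn.isPresent()) {
                return conn;
            }
        }
        return Optional.empty();
    }
}
```

Ordering the relay last keeps server costs down while still guaranteeing that every device pair can eventually connect.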

How to get data coming from gsm network

I'm trying to see how GSM affects voice data on a phone call. Here is what I'm trying to do: one person will be talking on a phone and I will record his voice from the phone's mic while he speaks, and on the other phone I will capture the data coming over GSM and compare the two. I want to write an Android application to get that data. Is that possible on Android, or can you suggest another way to achieve this?
Some background (you may know this already)...
When you make a GSM call, the analogue signal in the phone microphone corresponding to your speech is converted into a series of digital values and then encoded with a voice codec. This is basically a clever algorithm to capture as much of the speech as possible, in as little data as possible.
The idea is to maintain very good speech quality while saving on the amount of bandwidth needed for a call. Techniques include not transmitting quiet periods (when you are not speaking) plus various compression and predictive encoding algorithms. There have been, and still are, a number of codecs in use in GSM, but the latest and generally preferred codec is AMR-Narrowband.
Nearly all GSM deployments encrypt speech between the phone and the base station - while there are publicised weaknesses in the various encryption algorithms, I am assuming (hoping...) that decrypting is not what you are looking for.
Your question: 'I want to see if there will be data loss or corruption when voice travels over GSM.'
Firstly, it is worth noting that speech is relatively tolerant of small amounts of loss and corruption, at least compared to data traffic. It is quite common to have bursts of packet loss in VoIP networks, and it may cause a temporary degradation in voice quality. Secondly, packet loss in a VoIP network includes delayed packets, which can be confusing: if a packet arrives too late to be included in the sound being played from the receiver's speaker, then it is effectively lost from the VoIP point of view, even though other measures may show that it simply arrived late.
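The "late equals lost" point can be made concrete: with a fixed playout delay, a packet is usable only if it arrives before its playout deadline. A minimal sketch of that classification; the class name and the fixed-delay model are assumptions for illustration, as real jitter buffers adapt the delay:

```java
class JitterBuffer {
    private final long playoutDelayMs;

    JitterBuffer(long playoutDelayMs) {
        this.playoutDelayMs = playoutDelayMs;
    }

    // A packet sent at sentMs and received at recvMs makes it into the
    // audio only if it beats its deadline (send time + playout delay);
    // otherwise it counts as lost, even though it was delivered.
    boolean usable(long sentMs, long recvMs) {
        return recvMs <= sentMs + playoutDelayMs;
    }
}
```

With a 100 ms playout delay, a packet delayed 120 ms in transit is discarded exactly as if the network had dropped it.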
To actually measure the loss between the GSM phone and the basestation you would need access to the data received at the basestation, which you will not usually have unless you are the operator.
If you do an end to end test, from one GSM to another, your speech path will traverse other network nodes also, so you will not know if any loss or corruption is happening over the GSM air interface or in one or more of the other nodes.
You would also need to be aware of handover from one cell to another and from 2G to 3G (GSM to UMTS) which may affect your tests (even stationary phones can handover in certain circumstances).
If your interest is purely academic, then the easiest approach might be to create your own GSM base station and test on that. There exist several open-source GSM 'network in a box' projects which should allow you to do this. I have not used it myself, but this one looks the most actively supported at this time; check out the mailing list under the community tab as a good place to follow up your investigations:
http://openbts.org

What's the best Bluetooth mode to use for streaming data?

I have a microcontroller collecting sensor data at a rate of about 4 Hz. I would like to send this data over Bluetooth to a remote entity. I'm thinking the serial port profile (SPP) would be best, since it emulates a physical cable. Under normal usage, it would not be unreasonable to expect the connection to be held open for 10 to 12 hours at a time.
Has anyone done any work in this field? Does anyone know the best profile to use, or any resources to use as reference? Thanks in advance.
Yup, I'd use SPP/RFCOMM. It is supported in all Bluetooth versions and virtually all implementations (except the very most basic, which might not support RFCOMM/SPP). There shouldn't be any problem with keeping the connection open for a long time.
Bluetooth 2.1 supports a streaming mode over a basic RFCOMM channel, if your API will let you configure it.
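As a sanity check on whether SPP has the headroom, the required bandwidth at 4 Hz is tiny compared with what an RFCOMM channel sustains. A sketch of the budget arithmetic, assuming a hypothetical 32-byte sample size (the question does not state one):

```java
class LinkBudget {
    // Steady-state bandwidth a sensor stream needs: samples per second
    // times bytes per sample. At 4 Hz with 32-byte samples this is a
    // mere 128 B/s, orders of magnitude below RFCOMM throughput.
    static int bytesPerSecond(int samplesPerSecond, int bytesPerSample) {
        return samplesPerSecond * bytesPerSample;
    }
}
```

Even with generous framing overhead the stream stays far under the channel's capacity, so SPP's simplicity is the deciding factor rather than throughput.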
