I am trying to do multicast on Android. I have three phones (all Galaxy S5). One phone has Wi-Fi tethering turned on and is acting as the AP (mobile data is turned off). Of the other two phones, one is transmitting to a multicast address and the other is receiving, with both connected to the first phone's AP. The transmitting and receiving phones are kept side by side.
I found that of the 1000 packets I am sending, I am receiving only about 200. Below is an outline of how I am sending:
MulticastSocket mMulticastSocket = new MulticastSocket(port);
InetAddress multicastGrp = InetAddress.getByName("239.255.255.250");
mMulticastSocket.joinGroup(multicastGrp);
String sendStr = "";
int packetSize = 1400;
for(int i = 0; i < packetSize; i++)
sendStr = sendStr+"a";
for(int i = 0; i < 1000; i++)
    mMulticastSocket.send(new DatagramPacket(sendStr.getBytes(), sendStr.getBytes().length, multicastGrp, port));
Receive is something like,
//acquire multicast lock
byte[] buffer = new byte[2048];
DatagramPacket rPack = new DatagramPacket(buffer, buffer.length);
mMulticastSocket.setReceiveBufferSize(1024*64);
int recvCount = 0;
while(true) {
mMulticastSocket.receive(rPack);
recvCount++;
}
//release multicast lock
Both send and receive are done in worker threads. I also found that as the value of 'packetSize' is reduced, the number of received packets increases. I guess it is due to CPU load or the receive buffer, but in any case I want to multicast packets of 1400 bytes and receive as many as possible (I know the number of received packets is a function of the channel between the two phones, but keeping them side by side should be close to the best channel available). Also, when I do a UDP unicast, I am able to receive about 900 of the 1000 packets sent.
I am not able to understand why the number of received packets is so low for multicasting. What is it that I am missing?
Your sender is transmitting packets as fast as possible; there is nothing in the send loop to throttle the data rate, so each packet goes out microseconds after the previous one. Most likely buffers somewhere along the network stack fill up and packets are dropped. UDP has no built-in flow control or congestion control. If you add a small delay (~1 ms to 5 ms) between packets, you will probably see much less packet loss.
for (int i = 0; i < 1000; i++)
{
    mMulticastSocket.send(new DatagramPacket(sendStr.getBytes(),
            sendStr.getBytes().length, multicastGrp, port));
    Thread.sleep(5, 0); // quick hack to throttle back the rate (enclosing method must declare InterruptedException)
}
When Android communicates with a PC via USB accessory mode, the Android side cannot receive data if the PC sends exactly 512 bytes.
However, there is no problem if more (or fewer) than 512 bytes are transferred.
Also, if Android receives more bytes after the 512-byte transfer, the previously missing data and the new data arrive together (512 bytes + other data).
My read code, run on a thread, is below.
@Override
public void run() {
byte[] readbuffer = new byte[16384];
int bytesRead = 0;
try {
while ((bytesRead = mInputStream.read(readbuffer, 0, readbuffer.length)) != -1) {
// my code here after read.
....
mHandler.sendMessage(msg);
}
} catch (IOException e) {
e.printStackTrace();
}
}
This happens not only with 512 bytes but also with some other specific lengths (512 bytes, 1024 bytes, 2048 bytes, ...).
Is this an Android accessory mode bug?
Does anybody know about this issue?
It is not a bug in AOA but your sender not finishing the USB transaction. Unlike USB control transfers, a bulk transfer does not transmit the data size, so for a bulk transfer to finish, one of these conditions must be satisfied:
The amount of data received is the amount of data requested.
The size of the data is less than the maximum packet size.
A zero-length packet is received.
For high-speed mode, the maximum packet size is 512 bytes, so if you send 0-511 bytes, condition 2 is satisfied. If the data is 513-1023 bytes long, it will be split into two packets (512 bytes + 1-511 bytes), so again the last packet satisfies the second condition.
If you send exactly 512 bytes, the receiver does not know whether you have finished the transaction or whether there is remaining data (in an additional packet), so it keeps waiting and freezes. So, for lengths that are a multiple of the packet size (512 in high speed and 64 in full speed) you need to send an additional zero-length packet to finish the USB transfer.
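The rule above can be sketched as a small helper; this is only an illustration (the class and method names here are made up, not part of any Android or USB API):

```java
public class UsbZlp {
    // A bulk transfer ends early when its last packet is shorter than the
    // endpoint's maximum packet size. If the payload length is an exact
    // multiple of that size, the sender must append a zero-length packet
    // (ZLP) so the receiver knows the transfer is complete.
    static boolean needsZeroLengthPacket(int payloadLength, int maxPacketSize) {
        return payloadLength > 0 && payloadLength % maxPacketSize == 0;
    }

    public static void main(String[] args) {
        int hs = 512; // high-speed bulk maximum packet size
        System.out.println(needsZeroLengthPacket(511, hs));  // false: short last packet ends the transfer
        System.out.println(needsZeroLengthPacket(512, hs));  // true: exact multiple, ZLP required
        System.out.println(needsZeroLengthPacket(1024, hs)); // true
        System.out.println(needsZeroLengthPacket(1023, hs)); // false
    }
}
```

The same check with a max packet size of 64 covers the full-speed case mentioned above.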
I would like to monitor a Wi-Fi Direct network (bandwidth, latency, etc.). How can I measure the time it takes for X bytes to be received over the network (Wi-Fi Direct)? I mean TX + time over the network + RX time.
In DDMS (Android Studio) I found the Network Statistics option, but it only shows transmission and reception time (and it is not very accurate because it appears on a graph).
I had thought about using System.currentTimeMillis(), but I have not found how to synchronize the clocks of both devices.
TX:
socket.bind(null);
socket.connect((new InetSocketAddress(host, port)), SOCKET_TIMEOUT);
int tamMensaje = 1024 * 1024; // 1 MB
byte[] bitAleatorio = new byte[tamMensaje]; // 1 byte [-128 to 127]
for (int x = 0; x < bitAleatorio.length; x++) {
    bitAleatorio[x] = (byte) Math.round(Math.random());
}
DataOutputStream DOS = new DataOutputStream(socket.getOutputStream());
for(int i=0;i<1024;i++){
DOS.write(bitAleatorio,(i*1024),1024);
}
RX:
ServerSocket serverSocket = new ServerSocket(SERVERPORT);
Socket client = serverSocket.accept();
DataInputStream DIS = new DataInputStream(client.getInputStream());
int tamMensaje= 1024*1024;
byte[] msg_received2= new byte [tamMensaje];
for (int i = 0; i < 1024; i++) {
    DIS.readFully(msg_received2, (i * 1024), 1024); // read() may return fewer bytes; readFully blocks until the full chunk arrives
}
client.close();
serverSocket.close();
Thanks
There are two approaches that can solve this problem with reasonable accuracy:
Sync time on both devices.
You can use NTP for that: either install a separate NTP app or implement it in your code using an NTP client library.
After that you can rely on System.currentTimeMillis() to compare the message send/receive times on both devices.
Use relative time measured on a single device.
You can implement something like ICMP echo using UDP datagrams (they are faster than TCP). The algorithm should be the following (assuming we have devices A and B):
A sends some packet to B and saves timeSent somewhere;
B receives the packet and immediately sends an ACK packet back to A;
A receives the ACK and saves timeRecv;
Finally, long latency = (timeRecv - timeSent) / 2;.
This will work for small payloads, like ICMP echo. Measuring the transmission time of a large transfer can be done by sending separate ACK responses for both the start and the end of receiving it.
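The echo algorithm above can be sketched in plain Java; this is a loopback sketch only (nothing Wi-Fi Direct specific, and the class and variable names are made up):

```java
import java.net.*;

public class UdpEcho {
    public static void main(String[] args) throws Exception {
        // Device "B": an echo server that immediately ACKs each datagram.
        DatagramSocket server = new DatagramSocket(0);
        Thread echo = new Thread(() -> {
            try {
                DatagramPacket p = new DatagramPacket(new byte[64], 64);
                server.receive(p);
                server.send(new DatagramPacket(p.getData(), p.getLength(),
                        p.getAddress(), p.getPort())); // immediate ACK
            } catch (Exception ignored) {}
        });
        echo.start();

        // Device "A": send, wait for the ACK, halve the round trip.
        DatagramSocket client = new DatagramSocket();
        client.setSoTimeout(2000);
        byte[] ping = "ping".getBytes();
        long timeSent = System.nanoTime();
        client.send(new DatagramPacket(ping, ping.length,
                InetAddress.getLoopbackAddress(), server.getLocalPort()));
        DatagramPacket ack = new DatagramPacket(new byte[64], 64);
        client.receive(ack);
        long timeRecv = System.nanoTime();
        long latencyNs = (timeRecv - timeSent) / 2; // one-way latency estimate
        System.out.println("estimated one-way latency (ns): " + latencyNs);
        client.close();
        server.close();
    }
}
```

On real devices, A and B would be two phones on the Wi-Fi Direct group; only the addresses change, the timing logic stays the same.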
I am building an application that needs to measure the signal strength of all BLE beacons (Estimote) to process through a neural network. I want to store the RSSI values in an array that will be sent to a server implemented in C#. I have already implemented this with Wi-Fi and want to obtain a similar result. Here is my code for Wi-Fi:
ListaWifi = ObjWifi.getScanResults();
rssi = new int[2];
for (int i = 0; i < 2; i++) {
rssi[i] = ListaWifi.get(i).level;
}
Also, I want the data to always be in the same order:
int[] ret = new int[3];
for (int i = 0; i < tam; i++) {
switch ((BSSID[i])) {
case "9c:d6:43:94:ee:3f":
ret[0] = rssi[i];
break;
case "9c:d6:43:94:f1:63":
ret[1] = rssi[i];
break;
case "9c:d6:43:94:f1:99":
ret[2] = rssi[i];
break;
default:
break;
}
}
Is there a way to continuously monitor this data from the BLE beacons and store it the same way as done with Wi-Fi?
I think the RSSI value is included in the advertising packets. I don't know any C#, so I can't help you with the specifics of getting that value from the packet (but it should be possible).
However, your second stipulation, that they always come in the same order, would require some messing around. Those advertising packets are broadcast over the air in no particular order and multiple times (the beacons don't communicate with each other to coordinate their packets). One solution is to listen for a fixed period of time, collect all the advertising packets together, then sort them by address (you'll probably get multiple packets from the same address, so perhaps average the RSSI values). I have no idea why you require them to be in the same order, so this may not be the best solution to the problem you haven't stated.
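The collect-then-sort idea could look something like this in plain Java (a sketch; the addresses are taken from the question, while the class and method names are made up, and the scan-callback plumbing is omitted):

```java
import java.util.*;

public class BeaconOrder {
    // Fixed output order, matching the switch in the question.
    static final String[] ORDER = {
        "9c:d6:43:94:ee:3f", "9c:d6:43:94:f1:63", "9c:d6:43:94:f1:99"
    };

    // Average the RSSI readings collected per address during a scan window,
    // then emit them in a fixed order so the server always sees the same layout.
    static int[] fixedOrderRssi(List<Map.Entry<String, Integer>> readings) {
        Map<String, List<Integer>> byAddr = new HashMap<>();
        for (Map.Entry<String, Integer> r : readings) {
            byAddr.computeIfAbsent(r.getKey(), k -> new ArrayList<>()).add(r.getValue());
        }
        int[] ret = new int[ORDER.length];
        for (int i = 0; i < ORDER.length; i++) {
            List<Integer> vals = byAddr.getOrDefault(ORDER[i], Collections.emptyList());
            int sum = 0;
            for (int v : vals) sum += v;
            ret[i] = vals.isEmpty() ? 0 : sum / vals.size(); // 0 = beacon not seen this window
        }
        return ret;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> readings = Arrays.asList(
            new AbstractMap.SimpleEntry<>("9c:d6:43:94:f1:63", -70),
            new AbstractMap.SimpleEntry<>("9c:d6:43:94:ee:3f", -60),
            new AbstractMap.SimpleEntry<>("9c:d6:43:94:ee:3f", -62)
        );
        System.out.println(Arrays.toString(fixedOrderRssi(readings)));
        // → [-61, -70, 0]
    }
}
```

On Android, the readings list would be filled from the scan callback's device address and RSSI during each window.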
I currently have a HC-06 Bluetooth device connected to my Arduino Mega 2560 in order to receive strings sent from an Android device. With the HC-06 on Serial 0, I am receiving the data without error with the following code:
String inString = "";
int index = 0;
boolean stringComplete = false;
char inData[64];      // declaration not shown in the original snippet
const int pwmPin = 9; // pin number assumed; not shown in the original snippet
void setup() {
Serial.begin(9600);
pinMode(pwmPin, OUTPUT);
}
void loop() {
if(stringComplete) {
ParseSerialData(); // Parse the received data
inString = ""; // Reset inString to empty
stringComplete = false; // Reset the system for further input of data
}
}
void serialEvent() {
while(Serial.available() && stringComplete == false) {
char inChar = Serial.read();
inData[index] = inChar; // Store it in char array
index++;
if (inChar == '\n') { // Check for termination character
index = 0; // Reset the index
stringComplete = true; // Set completion of read to true
} else {
inString += inChar; // Also store as string
}
}
}
When I try to replace "Serial" with "Serial1" and "serialEvent()" with "serialEvent1()" and move the Bluetooth device to the TX1 and RX1, this program no longer works.
I have read that some people had similar problems when using AVR-GCC 4.4.x and solved the issue by downgrading to 4.3.x, but I have 4.3.2 (on Windows 8.1, same problem has arisen with Arduino IDE 1.0.3, 1.0.5-r2, and 1.5.6-r2).
I added the following print statements (with Serial 0 to print to the monitor on my PC) to the code with the Bluetooth device still on Serial 1:
String inString = "";
int index = 0;
boolean stringComplete = false;
char inData[64];      // declaration not shown in the original snippet
const int pwmPin = 9; // pin number assumed; not shown in the original snippet
void setup() {
Serial1.begin(9600);
Serial.begin(9600);
pinMode(pwmPin, OUTPUT);
Serial.println("Setting up...");
}
void loop() {
if(stringComplete) {
ParseSerialData();
inString = "";
stringComplete = false;
}
}
void serialEvent1() {
Serial.println("In serialEvent1...");
while(Serial1.available() && stringComplete == false) {
Serial.println("Char in...");
char inChar = Serial1.read();
Serial.println("WTF");
Serial.println(inChar);
Serial.println("WTF2");
inData[index] = inChar;
index++;
if (inChar == '\n'){
Serial.println("Termination char read...");
index = 0;
stringComplete = true;
}else{
inString += inChar;
}
}
}
Doing this, on the monitor I get:
Setting up...
In serialEvent1...
Char in...
WTF
WTF2
inChar typically prints as nothing, but during one test it printed as a '#' character. The string sent from the Android device is "s,1\n".
Based on the printout, the serial event is triggered by the availability of serial data, but Serial1.available() remains true for only the first iteration, 's' is never read in (nor any of the other characters that are read correctly when Serial is used), and a termination character (newline) is never read, so parsing never begins.
I also tried various baud rates with no difference in behavior. Based on reading Arduino documentation, serial port 1 should work just like serial port 0, and I did not miss substituting Serial for Serial1 in any part of the code.
What could be causing errors in communicating over Serial1 in the same way that has worked flawlessly on Serial0?
I also found a related Stack Overflow question, which was solved with something similar to my code (which works perfectly with Serial0 and is based on the Arduino documentation) using an expected termination character; the difference is that his code implements serial reading in the main loop, whereas mine is in a serialEvent. For some reason, both of us were having issues with Serial1 data not showing as available at the start of the next iteration, and serialEvent1 is never called again for me. I still don't understand why the first (and only) character read is not 's'. At first I thought the stream was getting flushed before reaching the serial event again, but that still doesn't account for reading an incorrect first character.
Also, I added the following Serial1 print statement to run multiple times in the Arduino setup and the Android device receives it each time with no errors, so sending data is working just fine:
Serial1.print("b,1\n");
Or even
Serial1.print("A long test message. A long test message.\n");
I'm fairly close to answering my own question now with further testing/debugging. I actually think the answer may be hardware-related rather than software. I wanted to find out whether the problem was with the data sent from the HC-06 to port 1, or with the reading function of port 1. I basically had serial port 0 read in data, then send it serially to port 1, which would read that data and send feedback over Bluetooth to the Android device. Serial port 1 read the data coming from port 0 just fine, so the issue is reading data specifically from the HC-06. It may simply be a voltage-level issue, so the answer may not belong on Stack Overflow. I will leave the question unanswered, though, until I have definitively found the root cause (allowing for the possibility that I might need some define statement for the HC-06 or serial port 1 for data to be read correctly, though I'm guessing a voltage-level conversion may do the trick; I'm not sure why there would be such a difference between Serial0 and Serial1, though).
I solved the problem enabling the pull-up resistor of the RX1 pin:
Serial1.begin(9600);
pinMode(19, INPUT);
digitalWrite(19, HIGH);
Therefore the 3 V is "overridden" by Arduino's 5 V for logical HIGH and zero is pulled down by Bluetooth's TX for logical LOW.
I did it slightly differently by using the INPUT_PULLUP feature to pull the hardware Serial3 pin HIGH:
Serial3.begin(19200);
pinMode(15, INPUT_PULLUP); // Serial3 receive pin
It works a treat. Previously my serial communications between two Arduino Mega 2560s had been mainly corruption with the occasional correct transmission. Now it is mainly the correct transmission. Previously, the time taken to receive a valid serial value was up to 2.5 seconds (due to having to repeatedly retry to get a valid transmit). Now the time taken to get a valid transmit is 20 ms. (I use a twisted pair of single core wires about 35 cm length.)
After checking the serial data lines on my oscilloscope, I was able to come up with a solution. With the setup described in the related Stack Overflow question (Bluetooth TX → RX0, TX0 → RX1, TX1 → Bluetooth RX), I checked the signals sent by the Bluetooth device and by the Arduino's serial port 0.
The Bluetooth device's signal was low at 400 mV and high at 3.68 V, while the Arduino's port 0 sent low at 0V and high at 5 V. I knew the Bluetooth device was a 3.3V level device, but the Arduino should read anything above about 3V as high, so this should have not been an issue (and it obviously wasn't on Serial 0).
Anyway, I replaced the HC-06 Bluetooth device with a backup one I had purchased and everything works fine with the code using Serial1 that I posted in my question. This Bluetooth device was transmitting low at about 160 mV and high at 3.3 V.
It seems that my problem was rooted in the hardware (voltage levels) as expected, and that for some reason Serial1 is more sensitive to changes in digital levels than is Serial0.
I have a problem with socket send (or write) function on android.
There is my network lib that I use on Linux and Android. Code is written in C.
On Android, the application creates a service, which loads the native code and establishes the connection with the help of my network lib. The connection is a TCP socket. When I call send (or write; no difference), the code hangs in that call in most cases. Sometimes it unhangs after 10-120 seconds; sometimes it waits longer (until I kill the application). The data size being sent is about 40-50 bytes. The first send (a 5-byte handshake) never hangs (or I am just lucky). The send that hangs is usually the one right after the handshake packet, about 10-20 seconds later.
The socket is also used on another thread (I use pthreads), where recv is called. But no data is being sent to the Android side at that time, so recv is just waiting while I call send.
I am sure the other side is waiting for the data: I see that recv on the other side returns with EAGAIN every 3 seconds (I set a timeout) and immediately calls recv again. recv always waits for 10 bytes (the minimal packet size).
I am unable to reproduce this behavior with Linux-to-Android or Linux-to-Linux transfers, only Android-to-Linux. I can reproduce it with two different Android devices available to me, so I don't think the problem is broken hardware on one particular device.
I tried to set SO_KEEPALIVE and TCP_NODELAY options with no success.
What can cause send/write calls to hang, and how can I resolve this?
Socket created with this code:
int sockfd, n;
struct addrinfo hints, *res, *ressave;
memset(&hints, 0, sizeof(struct addrinfo));
hints.ai_family = AF_INET;
hints.ai_socktype = SOCK_STREAM;
if ((n = getaddrinfo(host, serv, &hints, &res)) != 0)
{ /* stripped error handling*/ }
ressave = res;
do
{
sockfd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
if (sockfd < 0) continue;
if (connect(sockfd, res->ai_addr, res->ai_addrlen) == 0)
{
break; /* success */
}
close(sockfd); /* ignore this one */
} while ((res = res->ai_next) != NULL);
freeaddrinfo(ressave);
Hanging send operation is:
mWriteMutex.lock();
mSocketMutex.lockRead();
ssize_t n = send(mSocket, pArray, size, 0);
mSocketMutex.unlock();
mWriteMutex.unlock();
The problem was solved with the help of Nikolai N Fetissov in the comments; his pointed question unblocked my thinking and I found a bug in my RWMutex implementation.