I am using a TCP socket connection in an action game. We use lockstep synchronization, and the client receives 20-40 packets per second. It works fine when the game runs on a PC, but on Android devices the socket stalls every 20 seconds.
In the simplified test below, the server sends 5 packets per second.
I have tried Unity3D's C# socket, an Android Java socket, and an Android native C socket; blocking and non-blocking mode; small and large payloads (1 byte / 100 bytes per packet); fewer and more packets per second (5 / 50); receiving on a dedicated thread and on the main thread; and multiple Android devices. All of them show the same issue.
PS: The 20-second period seems tied to the device, not to my app or the connection; if the last stall happened at 1:00:00, the next will happen at 1:00:20, even if we reconnect or restart the app.
Android native C code:
extern "C" JNIEXPORT int JNICALL
Java_com_example_ymoon_sockettest_MainActivity_connect(JNIEnv *env, jobject /* this */)
{
__android_log_print(ANDROID_LOG_INFO, "CSocket", "Connecting...");
int sock = socket(AF_INET, SOCK_STREAM, 0);
if (sock < 0) return -1;
struct sockaddr_in server;
memset(&server, 0, sizeof(server));
server.sin_family = AF_INET;
server.sin_port = htons(12350);
server.sin_addr.s_addr = inet_addr("192.168.3.66");
if (connect(sock, (struct sockaddr*)&server, sizeof(server)) < 0)
{
close(sock);
return -2;
}
__android_log_print(ANDROID_LOG_INFO, "CSocket", "Connected");
char buf[100];
int t = -1;
int received = 0;
while (true)
{
int count = recv(sock, &buf, 100, 0);
if (count < 1) break;
received += count;
struct timeval tv;
gettimeofday(&tv, NULL);
int m = tv.tv_sec * 1000 + tv.tv_usec / 1000;
int diff = m - t;
t = m;
std::string p = t < 0 ? "" : diff < 50 ? "-" : diff < 100 ? "-|" : diff < 150 ? "--|" :
diff < 250 ? "---|" : diff < 500 ? "----|" : diff < 1000 ? "-----|" : "------|";
__android_log_print(diff > 500 ? ANDROID_LOG_ERROR : ANDROID_LOG_INFO,
"CSocket", "%i | %s %i", received, p.c_str(), diff);
}
close(sock);
return 0;
}
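For reference, the non-blocking mode mentioned above was measured the same way; a minimal sketch of what that receive loop could look like with poll() (my reconstruction under the same assumptions, not the exact code that was tested):

// Assumes `sock` is already connected as in the function above.
// Needs <fcntl.h>, <poll.h> and <errno.h> in addition to the headers used there.
fcntl(sock, F_SETFL, fcntl(sock, F_GETFL, 0) | O_NONBLOCK);

struct pollfd pfd;
pfd.fd = sock;
pfd.events = POLLIN;
char buf[100];
while (true)
{
    int ready = poll(&pfd, 1, 1000);            // wait up to 1 s for data
    if (ready < 0) break;                       // poll error
    if (ready == 0) continue;                   // nothing arrived within 1 s
    int count = recv(sock, buf, sizeof(buf), 0);
    if (count == 0) break;                      // peer closed the connection
    if (count < 0 && errno != EAGAIN && errno != EWOULDBLOCK) break;
    // ...timestamp and log the gap exactly as in the blocking version...
}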
Am I doing something wrong?
This is my first time asking a question on Stack Overflow, so sorry for my bad English. Any help or suggestion is appreciated. Thanks.
Edit: added server code. I wrote a simple TCP server to test with (Windows platform):
int main()
{
    // NOTE: WSAStartup()/WSACleanup() are omitted here; Winsock needs them before any socket call.
    addrinfo conf, *add = nullptr;
    memset(&conf, 0, sizeof(conf));
    conf.ai_flags = AI_PASSIVE;
    conf.ai_socktype = SOCK_STREAM;
    conf.ai_family = AF_INET;
    if (getaddrinfo(nullptr, "12350", &conf, &add)) return 0;

    SOCKET serverSock = socket(AF_INET, SOCK_STREAM, 0);
    char opt = 1;
    if (setsockopt(serverSock, SOL_SOCKET, SO_REUSEADDR, &opt, sizeof(opt)) == -1 ||
        setsockopt(serverSock, IPPROTO_TCP, TCP_NODELAY, &opt, sizeof(opt)) == -1 ||
        bind(serverSock, add->ai_addr, add->ai_addrlen) == -1 ||
        listen(serverSock, 0) == -1)
    {
        closesocket(serverSock);
        return 0;
    }
    freeaddrinfo(add);
    printf("Listening....\n");

    sockaddr_storage incoming_addr;
    int size = sizeof(incoming_addr);
    SOCKET clientSock = accept(serverSock, (sockaddr*)&incoming_addr, &size);
    printf("Client connected\n");

    char buf[1] = { 0 };
    int sendCount = 0;
    while (true)
    {
        time_t t = time(nullptr);
        tm *lt = localtime(&t);
        printf("%02d:%02d:%02d Send to client %i\n", lt->tm_hour, lt->tm_min, lt->tm_sec, ++sendCount);
        if (send(clientSock, buf, 1, 0) < 1) break;  // 1-byte payload every 200 ms = 5 packets per second
        Sleep(200);
    }
    closesocket(clientSock);
    closesocket(serverSock);
    return 0;
}
Edit: added a Wireshark capture:
Wireshark capture at the moment the stall happens
I have tested it (the code posted here) at home and on another Wi-Fi network, and it worked fine there. The only difference is the Wi-Fi environment, so I think our company's Wi-Fi network settings may be causing this issue.
I have been trying to check from C code in Android JNI whether a Node.js server is up, but my check never returns true, even when the server is up and I can connect to it from the app.
Here's my approach in C:
const char *myIpAddress = "192.168.0.13";
int count = 10;
struct in_addr ip_bin;

int sock = socket(AF_INET, SOCK_STREAM, 0);
if (sock < 0) {
    return my_call_back_func("0");
}

struct sockaddr_in server_addr;
memset(&server_addr, 0, sizeof(server_addr));
server_addr.sin_family = AF_INET;
server_addr.sin_port = htons(PORT_NO);
server_addr.sin_addr.s_addr = inet_pton(AF_INET, myIpAddress, &ip_bin);
__android_log_print(ANDROID_LOG_WARN, "MyApp", "Binary representation of %s: %x\n", myIpAddress, ntohl(ip_bin.s_addr));

int i;
for (i = 0; i < count; i++) {
    __android_log_print(ANDROID_LOG_WARN, "MyApp", "\n>>> %d PING <<<", i);
    if (connect(sock, (struct sockaddr *) &server_addr, sizeof(server_addr)) >= 0) {
        close(sock);
        __android_log_print(ANDROID_LOG_WARN, "MyApp", "\n>>> IP connection successful !!! <<<");
        my_call_back_func("1");
        return;
    }
    close(sock);
    usleep(PING_SLEEP_RATE);
}
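For comparison (not the original code), a sketch of how such a reachability probe is commonly written: a fresh socket for every attempt, with inet_pton() writing into sin_addr directly and its return value used only as a success flag. PORT_NO, PING_SLEEP_RATE, myIpAddress, count and my_call_back_func are reused from the snippet above.

/* Probe the server once per attempt, each time with a fresh socket. */
struct sockaddr_in server_addr;
memset(&server_addr, 0, sizeof(server_addr));
server_addr.sin_family = AF_INET;
server_addr.sin_port = htons(PORT_NO);
if (inet_pton(AF_INET, myIpAddress, &server_addr.sin_addr) != 1) {
    my_call_back_func("0");                       /* bad address string */
    return;
}

int i;
for (i = 0; i < count; i++) {
    int sock = socket(AF_INET, SOCK_STREAM, 0);   /* new socket for every attempt */
    if (sock < 0) continue;
    if (connect(sock, (struct sockaddr *) &server_addr, sizeof(server_addr)) == 0) {
        close(sock);
        my_call_back_func("1");                   /* server reachable */
        return;
    }
    close(sock);
    usleep(PING_SLEEP_RATE);
}
my_call_back_func("0");                           /* never reached the server */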
Node.js server:
const MY_PORT = process.env.PORT || 4000;
const MY_HOST = "192.168.0.13";

const server = http.createServer(app);

server.on("listening", function () {
    console.log(`>>> R.D Server is running ${JSON.stringify(server.address())}`);
});

server.listen(MY_PORT, MY_HOST, () => {
    console.log(`Listening on port ${MY_PORT} . . . .`);
    console.log(`CORS-enabled web server listening on port : ${MY_PORT}`);
});
What is it that I'm doing wrong?
Thank you all in advance.
I am debugging an RTC video stutter issue on Android, and I have tried several different devices. To simplify the question, I just keep sending UDP packets at roughly 10 ms intervals from a Mac and receive them on Android over a good Wi-Fi connection. I see big jitter spikes (over 200 ms) almost every minute, sometimes over 600 ms, especially when I open and close a task manager. It does not reproduce when testing against localhost. Can this be fixed?
while (1) {
    int s = recvfrom(socket_fd, buffer, sizeof(buffer), 0, (struct sockaddr *)&recv_addr, (socklen_t *)&addr_len);
    if (s > 0) {
        struct timeval tv_ioctl;
        tv_ioctl.tv_sec = 0;
        tv_ioctl.tv_usec = 0;
        // SIOCGSTAMP returns the kernel timestamp of when the last packet was received.
        int error = ioctl(socket_fd, SIOCGSTAMP, &tv_ioctl);
        if (error == 0) {
            int64_t ms = tv_ioctl.tv_sec * 1000LL + tv_ioctl.tv_usec / 1000;
            if (pre_rev_ms == 0) {
                pre_rev_ms = ms;
            }
            if (ms - pre_rev_ms > 200) {
                LOGV("Udp glitches\n");
            }
            pre_rev_ms = ms;
        }
    }
}
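One way to narrow down where the delay is introduced (a diagnostic sketch under the assumption that the loop above is used, not a fix): compare the kernel receive timestamp from SIOCGSTAMP with a user-space timestamp taken right after recvfrom() returns. If the two drift apart, the packets sat in the socket buffer while the receiving thread wasn't scheduled; if they stay close, the jitter already exists on the Wi-Fi/network path.

#include <time.h>

/* Inside the receive loop, after a successful recvfrom()/SIOCGSTAMP: */
struct timespec now;
clock_gettime(CLOCK_REALTIME, &now);           /* same clock base as SIOCGSTAMP */
int64_t user_ms   = now.tv_sec * 1000LL + now.tv_nsec / 1000000;
int64_t kernel_ms = tv_ioctl.tv_sec * 1000LL + tv_ioctl.tv_usec / 1000;
int64_t queue_ms  = user_ms - kernel_ms;       /* time the packet waited for the app */
if (queue_ms > 50) {
    LOGV("Packet waited %lld ms in the socket buffer\n", (long long)queue_ms);
}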
I have a Tire Pressure Management System (TPMS) adapter that plugs into USB (http://store.mp3car.com/USB_TPMS_Version_2_20_4_Sensor_Kit_p/com-090.htm). I have it working with the original Windows software, as well as with Linux C code that reads the tire pressures and temperatures. I'm now trying to use this adapter on Android and am having some difficulty. I can detect the device fine, but my reads all return -1 bytes read, whatever I try. Here's the C code I'm trying to convert:
int TpmsPlugin::readUsbSensor(int sid, unsigned char *buf)
{
    int r, transferred;

    buf[0] = 0x20 + sid;
    r = libusb_interrupt_transfer(mDeviceHandle, ENDPOINT_OUT, buf, 1, &transferred, INTR_TIMEOUT);
    if (r < 0) {
        DebugOut() << "TPMS: USB write interrupt failed, code " << r << endl;
    }

    r = libusb_interrupt_transfer(mDeviceHandle, ENDPOINT_IN, buf, 4, &transferred, INTR_TIMEOUT);
    if (r < 0) {
        DebugOut() << "TPMS: USB read interrupt failed, code " << r << endl;
    }

    return r;
}
The value of sid is 1, 2, 3 or 4 depending on the wheel. The values are then extracted with:
lfPressure = ((float)buf[0]-40) * PRESSURE_SCALE * KPA_MULTIPLIER;
lfTemperature = (float)buf[1]-40;
You can see full implementation of this driver here as well: https://github.com/otcshare/automotive-message-broker/blob/master/plugins/tpms/tpmsplugin.cpp
My Android version is able to find the USB device, get permission to use it, connect to it, and get the UsbEndpoints (it lists two), but whether I use bulkTransfer() or controlTransfer(), it fails. In particular, I've tried a lot of different controlTransfer values based on all the docs I could find. Here is some code that I've tried:
UsbInterface intf = TpmsSectionFragment.device.getInterface(0);
UsbEndpoint endpoint_in = null, endpoint_out = null;
for (int i = 0; i < intf.getEndpointCount(); i++) {
    UsbEndpoint ep = intf.getEndpoint(i);
    if (ep.getDirection() == UsbConstants.USB_DIR_IN)
        endpoint_in = ep;
    else if (ep.getDirection() == UsbConstants.USB_DIR_OUT)
        endpoint_out = ep;
}

UsbDeviceConnection connection = gUsbManager.openDevice(TpmsSectionFragment.device);
connection.claimInterface(intf, false);

int timeout = 1000;
int length = 4;
while (true) {
    for (int sensorId = 1; sensorId <= 4 && mReadThreadActive; sensorId++) {
        byte[] tpmsRaw = new byte[length];
        tpmsRaw[0] = (byte) (0x20 + sensorId);
        int out_len = connection.bulkTransfer(endpoint_out, tpmsRaw, 1, timeout);
        int in_len = connection.bulkTransfer(endpoint_in, tpmsRaw, 4, timeout);
        //int out_len = connection.controlTransfer(0x42, 0x0, 0x100, 0, tpmsRaw, tpmsRaw.length, timeout);
        //int in_len = connection.controlTransfer(0x41, 0x0, 0x100, 0, tpmsRaw, tpmsRaw.length, timeout);
Any thoughts on what I could be doing wrong are greatly appreciated. I'm happy to try a few different things to debug further if you have any suggestions.
Thanks!
Here's the solution based on the help from Chris. I converted the calls to queue / requestWait:
ByteBuffer buf = ByteBuffer.allocate(4);
buf.put(0, (byte) (0x20 + sensorId));

// Write the 1-byte request to the OUT endpoint...
UsbRequest send = new UsbRequest();
send.initialize(connection, endpoint_out);
boolean sent = send.queue(buf, 1);
UsbRequest r1 = connection.requestWait();

// ...then read the 4-byte response from the IN endpoint.
send.initialize(connection, endpoint_in);
send.queue(buf, 4);
UsbRequest r2 = connection.requestWait();
The other thing I needed to tweak was this call, setting the second parameter to true:
connection.claimInterface(intf, true);
That's it. Done. Thanks for the help!
Here's a simple socket program (server/client) I made.
The server runs on Windows, and the client is an Android app containing a shared library built from C socket code.
On the client side, to avoid freezing, I switch the socket to non-blocking mode and roll it back to blocking mode after connect() passes. After that, I check whether the connection is available using getpeername().
As below...
flags = fcntl(sock, F_GETFL, 0);
fcntl(sock, F_SETFL, flags | O_NONBLOCK);

nRet = connect(sock, (struct sockaddr*)&addr, sizeof(addr));
if (nRet < 0)
    if (errno != EINPROGRESS && errno != EWOULDBLOCK)
        return ERR_CONN;
...
...
nRet = select(sock+1, &readset, &writeset, NULL, &tv);
if (nRet == 0)
    if (errno != EINPROGRESS && errno != EWOULDBLOCK)
        return ERR_SELECT;

nRet = getpeername(sock, (struct sockaddr*)&addr, &len);
if (nRet != 0)
    return ERR_GETPEER;

fcntl(sock, F_SETFL, flags);
Everything works well, except on 3G.
Sometimes, even though the server has received the connection, the client side gets an error from getpeername().
The error code is ENOTCONN.
What should I implement to avoid this? Any suggestion would be appreciated.
Thanks in advance.
I found what was wrong.
First, I should roll back to a blocking socket right after connect() passes, not just before returning.
Second, it seems that select() doesn't guarantee to wait for the full timeval, so I made a GetTickCount()-like function and counted elapsed time to break out of the loop.
Here's my solution.
fcntl(O_NONBLOCK);   /* pseudocode: switch the socket to non-blocking mode */
connect();
fcntl(BLOCK);        /* pseudocode: switch back to blocking mode */

while (true)
{
    fd_set rdfs, wdfs;
    FD_ZERO(&rdfs); FD_ZERO(&wdfs);
    FD_SET(sock, &rdfs); FD_SET(sock, &wdfs);
    tv.tv_sec = 0; tv.tv_usec = 100;

    nRet = select(sock + 1, &rdfs, &wdfs, NULL, &tv);
    if (nRet == -1) return -1;
    else if (nRet)
    {
        nRet = getpeername();   /* pseudocode: same getpeername() check as above */
        if (!nRet)
            break;
    }

    if (get_tick_count() - delay > timeout)
        return -2;
}
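The get_tick_count() helper isn't shown in the post; a minimal sketch of one possible implementation using CLOCK_MONOTONIC (an assumption, since the original isn't included):

#include <stdint.h>
#include <time.h>

/* Milliseconds from a monotonic clock, similar in spirit to Windows' GetTickCount(). */
static uint64_t get_tick_count(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000u + (uint64_t)(ts.tv_nsec / 1000000);
}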
Thanks, sarnold and caf! :)
To determine if the socket is connected, it is more usual to use getsockopt() rather than getpeername():
int so_error;
socklen_t len = sizeof so_error;

if (getsockopt(sock, SOL_SOCKET, SO_ERROR, &so_error, &len) == 0 && so_error == 0) {
    /* socket is connected */
}
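For context, a minimal sketch of the full sequence this answer implies: non-blocking connect(), select() on writability, then the getsockopt(SO_ERROR) check. The function name and timeout handling here are illustrative, not from the original posts.

#include <errno.h>
#include <fcntl.h>
#include <sys/select.h>
#include <sys/socket.h>

/* Returns 0 if the non-blocking connect completed successfully, -1 otherwise. */
static int connect_with_timeout(int sock, const struct sockaddr *addr, socklen_t addrlen,
                                int timeout_sec)
{
    int flags = fcntl(sock, F_GETFL, 0);
    fcntl(sock, F_SETFL, flags | O_NONBLOCK);

    if (connect(sock, addr, addrlen) < 0 && errno != EINPROGRESS)
        return -1;                          /* immediate failure */

    fd_set wfds;
    FD_ZERO(&wfds);
    FD_SET(sock, &wfds);
    struct timeval tv = { timeout_sec, 0 };

    if (select(sock + 1, NULL, &wfds, NULL, &tv) <= 0)
        return -1;                          /* timeout or select error */

    int so_error = 0;
    socklen_t len = sizeof so_error;
    if (getsockopt(sock, SOL_SOCKET, SO_ERROR, &so_error, &len) < 0 || so_error != 0)
        return -1;                          /* connect failed asynchronously */

    fcntl(sock, F_SETFL, flags);            /* restore blocking mode */
    return 0;
}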
I'm trying to produce a simple server that will allow me to test Android's security features. I need to develop an application that will open a socket.
I've produced something similar in C, but I am having no luck with Java. Here's the application in C:
// simpleserver3.c
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <strings.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

#define MY_PORT 9999
#define MAXBUF 99

void indata(int clientfd, struct sockaddr_in client_addr)
{
    char buffer[12];
    printf("%s:%d connected\n", inet_ntoa(client_addr.sin_addr), ntohs(client_addr.sin_port));
    recv(clientfd, buffer, MAXBUF, 0); // deliberately reads more than the buffer holds (overflow test)
    printf("%p \n", (void *)buffer);
}

int main(int Count, char *Strings[])
{
    struct sockaddr_in self, client_addr;
    int sockfd, clientfd;

    /*---Create streaming socket---*/
    if ((sockfd = socket(AF_INET, SOCK_STREAM, 0)) < 0) // sockfd = handle for the socket
    {
        perror("Socket");
        exit(errno);
    }

    /*---Initialize address/port structure---*/
    bzero(&self, sizeof(self));
    self.sin_family = AF_INET;
    self.sin_port = htons(MY_PORT);
    self.sin_addr.s_addr = INADDR_ANY;

    /*---Bind the structure to the socket handle---*/
    if (bind(sockfd, (struct sockaddr*)&self, sizeof(self)) != 0)
    {
        perror("socket--bind");
        exit(errno);
    }

    /*---Make it a "listening socket"---*/
    if (listen(sockfd, 20) != 0)
    {
        perror("socket--listen");
        exit(errno);
    }

    // set socklen_t to the length of the client address
    socklen_t addrlen = sizeof(client_addr);

    /*---Accept a connection (creating a data pipe)---*/
    clientfd = accept(sockfd, (struct sockaddr*)&client_addr, &addrlen); // handle for communicating
    indata(clientfd, client_addr);

    close(clientfd);
    close(sockfd);
    return 0;
}
Any suggestion would be great, Aneel
It's been a while since I used C, so I can't comment on your C code, but you should probably take a look at the Android documentation for the Socket class:
http://developer.android.com/reference/java/net/Socket.html
Check out this example: http://thinkandroid.wordpress.com/2010/03/27/incorporating-socket-programming-into-your-applications/