I'm facing a weird problem that I can't explain ;)
I'm developing a client program that runs on an Android phone.
The app connects to a remote server and does some work.
The core library, which is written in C++ (NDK), and the Android UI work fine in Wi-Fi mode,
but the system freezes in 3G data mode.
I tracked down where this freeze comes from: it is in the connect() function.
The weird thing is that the socket is already set to non-blocking mode before the connect() line.
m_nSock = socket(AF_INET, SOCK_STREAM, 0);
if (m_nSock < 0)
{
    return -1;
}

/* Put the socket into non-blocking mode before connect(). */
flags = fcntl(m_nSock, F_GETFL, 0);
fcntl(m_nSock, F_SETFL, flags | O_NONBLOCK);

struct sockaddr_in AddrClient;
memset(&AddrClient, 0x00, sizeof(AddrClient));
AddrClient.sin_family      = AF_INET;
AddrClient.sin_addr.s_addr = inet_addr(szIP);
AddrClient.sin_port        = htons(nPort);

nRet = connect(m_nSock, (struct sockaddr*)&AddrClient, sizeof(AddrClient));
The blocking always takes about 21 seconds. (That may mean a default timeout is used somewhere in the kernel, I think.) How can I fix this? What should I search for?
Any suggestion is welcome.
Thanks in advance.
Try these changes.
Put the socket into non-blocking mode:

int mode = 1;
ioctl(m_nSock, FIONBIO, &mode);

And back to blocking mode:

mode = 0;
ioctl(m_nSock, FIONBIO, &mode);

This is how setting the blocking mode works for me.
Your blocking code doesn't look right - you should be using F_SETFL as the command to set the flags. So:
int flags = fcntl(sock, F_GETFL);
fcntl(sock, F_SETFL, flags | O_NONBLOCK);
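With O_NONBLOCK set, connect() should return immediately with EINPROGRESS, and you then wait for the connection to complete yourself, which also lets you pick your own timeout instead of the kernel's roughly 21-second default. A minimal sketch of that pattern (the helper name and the 5-second timeout are my own choices, not from the posts above):

#include <errno.h>
#include <poll.h>
#include <sys/socket.h>

/* Returns 0 on success, -1 on error or timeout. Assumes sockfd already has O_NONBLOCK set. */
static int connect_with_timeout(int sockfd, const struct sockaddr *addr,
                                socklen_t addrlen, int timeout_ms)
{
    if (connect(sockfd, addr, addrlen) == 0)
        return 0;                       /* connected immediately */
    if (errno != EINPROGRESS)
        return -1;                      /* real error */

    struct pollfd pfd;
    pfd.fd = sockfd;
    pfd.events = POLLOUT;

    int rc = poll(&pfd, 1, timeout_ms); /* wait for the handshake to finish */
    if (rc <= 0)
        return -1;                      /* timeout (rc == 0) or poll error */

    int err = 0;
    socklen_t len = sizeof(err);
    if (getsockopt(sockfd, SOL_SOCKET, SO_ERROR, &err, &len) < 0 || err != 0)
        return -1;                      /* connect failed asynchronously */
    return 0;
}

In the question's code it would be called as connect_with_timeout(m_nSock, (struct sockaddr*)&AddrClient, sizeof(AddrClient), 5000) in place of the bare connect() call.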
I have Android on my device.
I'm drawing pictures before Android finishes loading.
I have an issue with high DDR usage (the average frequency is too high), checked via
cat /sys/kernel/debug/clk/measure_only_mccc_clk/clk_measure
I found a temporary solution: release the DRM resources before Android starts. But that is not good for me, because I get a black-screen gap between my pictures and Android's display.
If I move the release of the resources to after Android starts, I get the high-DDR-frequency problem again.
I checked the state of /sys/kernel/debug/dri/0/state and found the difference between the case where DDR works correctly and the case where it does not.
So the difference is:
connector[168]: shared-disp-1
crtc=(null)
and
connector[168]: shared-disp-1
crtc=crtc-6
and for the CRTCs:
crtc[170]: crtc-6
enable=0
active=0
planes_changed=1
mode_changed=1
active_changed=1
and
crtc[170]: crtc-6
enable=1
active=1
planes_changed=1
mode_changed=0
active_changed=0
So my questions are:
Where can I read about working with DRM from userspace?
How do I disable a connector and a CRTC?
So I found a way to disable the CRTC in my case.
I just need to call this at the end of my program:

/* Disable the CRTC: no framebuffer, no connectors, no mode. */
drmModeSetCrtc(fd, crtcId,
               0,        /* bufferId: none */
               0, 0,     /* x, y */
               nullptr,  /* array of connectors */
               0,        /* number of connectors */
               nullptr); /* mode */
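For reference, here is a standalone sketch of the same idea built only on public libdrm calls; the device path and the choice to walk every CRTC are my own assumptions, not from the answer above. The xf86drm.h / xf86drmMode.h headers from libdrm, together with the kernel's DRM userspace documentation, are the usual places to read up on this.

#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int main(void)
{
    /* Open the primary DRM node (path assumed; it may differ per device). */
    int fd = open("/dev/dri/card0", O_RDWR | O_CLOEXEC);
    if (fd < 0) {
        perror("open /dev/dri/card0");
        return 1;
    }

    drmModeRes *res = drmModeGetResources(fd);
    if (!res) {
        perror("drmModeGetResources");
        close(fd);
        return 1;
    }

    /* Disable every CRTC: no framebuffer, no connectors, no mode. */
    for (int i = 0; i < res->count_crtcs; i++) {
        if (drmModeSetCrtc(fd, res->crtcs[i], 0, 0, 0, NULL, 0, NULL) != 0)
            perror("drmModeSetCrtc");
    }

    drmModeFreeResources(res);
    close(fd);
    return 0;
}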
I currently have an HC-06 Bluetooth device connected to my Arduino Mega 2560 in order to receive strings sent from an Android device. With the HC-06 on Serial 0, I receive the data without error using the following code:
char inData[80];       // Buffer for incoming characters (declared elsewhere in the full sketch; size assumed here)
const int pwmPin = 9;  // PWM output pin (pin number not shown in the original snippet; assumed)
String inString = "";
int index = 0;
boolean stringComplete = false;
void setup() {
Serial.begin(9600);
pinMode(pwmPin, OUTPUT);
}
void loop() {
if(stringComplete) {
ParseSerialData(); // Parse the received data
inString = ""; // Reset inString to empty
stringComplete = false; // Reset the system for further input of data
}
}
void serialEvent() {
while(Serial.available() && stringComplete == false) {
char inChar = Serial.read();
inData[index] = inChar; // Store it in char array
index++;
if (inChar == '\n') { // Check for termination character
index = 0; // Reset the index
stringComplete = true; // Set completion of read to true
} else {
inString += inChar; // Also store as string
}
}
}
When I replace "Serial" with "Serial1" and "serialEvent()" with "serialEvent1()" and move the Bluetooth device to the TX1 and RX1 pins, this program no longer works.
I have read that some people had similar problems when using AVR-GCC 4.4.x and solved the issue by downgrading to 4.3.x, but I have 4.3.2 (on Windows 8.1, same problem has arisen with Arduino IDE 1.0.3, 1.0.5-r2, and 1.5.6-r2).
I added the following print statements (with Serial 0 to print to the monitor on my PC) to the code with the Bluetooth device still on Serial 1:
String inString = "";
int index = 0;
boolean stringComplete = false;
void setup() {
Serial1.begin(9600);
Serial.begin(9600);
pinMode(pwmPin, OUTPUT);
Serial.println("Setting up...");
}
void loop() {
if(stringComplete) {
ParseSerialData();
inString = "";
stringComplete = false;
}
}
void serialEvent1() {
Serial.println("In serialEvent1...");
while(Serial1.available() && stringComplete == false) {
Serial.println("Char in...");
char inChar = Serial1.read();
Serial.println("WTF");
Serial.println(inChar);
Serial.println("WTF2");
inData[index] = inChar;
index++;
if (inChar == '\n'){
Serial.println("Termination char read...");
index = 0;
stringComplete = true;
}else{
inString += inChar;
}
}
}
Doing this, on the monitor I get:
Setting up...
In serialEvent1...
Char in...
WTF
WTF2
inChar typically prints as nothing, but during one test it was printing as an '#' character. The string sent is "s,1\n" from the Android device.
Based on the printout, the serial event is triggered by the availability of serial data, but Serial1.available() remains true only for the first iteration, 's' is never read in (nor are any of the other characters that are read when Serial is used), and the termination character (newline) is never read, so I can never begin parsing.
I also tried various baud rates with no difference in behavior. Based on reading Arduino documentation, serial port 1 should work just like serial port 0, and I did not miss substituting Serial for Serial1 in any part of the code.
What could be causing errors in communicating over Serial1 in the same way that has worked flawlessly on Serial0?
I also found a related Stack Overflow question, which was solved with something similar to my code (which works perfectly with Serial0 and is based on the Arduino documentation), using an expected termination character; the difference is that his code reads the serial data in the main loop, whereas mine does it in a serialEvent. It seems both of us had issues with Serial1 data not showing as available at the start of the next iteration; in my case, serialEvent1 is simply never called again. And I still don't understand why the first (and only) character read is not 's'. At first I thought the stream was getting flushed before reaching the serial event again, but that still doesn't account for reading an incorrect first character.
Also, I added the following Serial1 print statement to run multiple times in the Arduino setup and the Android device receives it each time with no errors, so sending data is working just fine:
Serial1.print("b,1\n");
Or even
Serial1.print("A long test message. A long test message.\n");
I'm fairly close to answering my own question now with further testing and debugging. I actually think the answer may be hardware-related rather than software-related. I wanted to find out whether the problem was with the data sent from the HC-06 to port 1, or with the reading function of port 1. So I had serial port 0 read in data, then send it serially to port 1, which would read that data and send feedback over Bluetooth to the Android device. Serial port 1 read the data coming from port 0 just fine, so the issue is reading data specifically from the HC-06. It may simply be a voltage-level issue, in which case the answer may not belong on Stack Overflow. I will leave the question unanswered until I have definitively found the root cause, allowing for the possibility that I might need some define statement for the HC-06 or serial port 1 for data to be read correctly, though I'm guessing a voltage-level conversion may do the trick. I'm not sure why there would be such a difference between Serial0 and Serial1, though.
I solved the problem by enabling the pull-up resistor on the RX1 pin:
Serial1.begin(9600);
pinMode(19, INPUT);
digitalWrite(19, HIGH);
Therefore the 3 V is "overridden" by Arduino's 5 V for logical HIGH and zero is pulled down by Bluetooth's TX for logical LOW.
I did it slightly differently by using the INPUT_PULLUP feature to pull the hardware Serial3 pin HIGH:
Serial3.begin(19200);
pinMode(15, INPUT_PULLUP); // Serial3 receive pin
It works a treat. Previously my serial communications between two Arduino Mega 2560s were mostly corrupted, with the occasional correct transmission. Now they are mostly correct transmissions. Previously, the time taken to receive a valid serial value was up to 2.5 seconds (due to repeated retries to get a valid transmission); now it is 20 ms. (I use a twisted pair of single-core wires about 35 cm in length.)
After checking the serial data lines on my oscilloscope, I was able to come up with a solution to my issue. With the setup described in the related Stack Overflow question (Bluetooth TX → RX0, TX0 → RX1, TX1 → Bluetooth RX), I checked the signals sent by the Bluetooth device and by the Arduino's serial port 0.
The Bluetooth device's signal was low at 400 mV and high at 3.68 V, while the Arduino's port 0 sent low at 0 V and high at 5 V. I knew the Bluetooth device was a 3.3 V level device, but the Arduino should read anything above about 3 V as high, so this should not have been an issue (and it obviously wasn't on Serial 0).
Anyway, I replaced the HC-06 Bluetooth device with a backup one I had purchased and everything works fine with the code using Serial1 that I posted in my question. This Bluetooth device was transmitting low at about 160 mV and high at 3.3 V.
It seems that my problem was rooted in the hardware (voltage levels) as expected, and that for some reason Serial1 is more sensitive to changes in digital levels than is Serial0.
I am trying to capture video on Android using V4L2 under JNI. I found a guide and followed these steps:
fd = open("/dev/video0", O_RDWR);
/* init part */
ioctl(fd, VIDIOC_QUERYCAP, &caps);
ioctl(fd, VIDIOC_ENUM_FMT, &fmtdesc);
ioctl(fd, VIDIOC_S_FMT, &fmt);
ioctl(fd, VIDIOC_REQBUFS, &req);
ioctl(fd, VIDIOC_QUERYBUF, &buf);
ioctl(fd, VIDIOC_QBUF, &buf);
/* capture part */
FILE *fp = fopen("/sdcard/img.yuv", "wb");
for (i = 0; i < 20; i++)
{
ioctl(fd, VIDIOC_DQBUF, &buf);
fwrite(buffers[buf.index].start, 1, buf.bytesused, fp);
ioctl(fd, VIDIOC_QBUF, &buf);
}
fclose(fp);
This is the main structure of my code. All the functions run correctly and return 0. However, when I open the output file with a binary viewer, I find that all the data is 0.
Is there any problem with my code? I am confused because all the functions returned 0.
Thanks!!
You are using an array called buffers[], but I can't see where it's declared or what it stands for. If there is no code missing above, you will always get zeros, because you are writing buffers[] to the file and not the data you get from V4L2.
Furthermore, the initial values of caps, fmtdesc, fmt, req and buf prior to the ioctl calls would be interesting too. Depending on their initial values you get different I/O methods (read/write, memory mapping, user pointers), and issues could be hidden in these parts.
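If the memory-mapped I/O method is what you intended, the usual pattern for filling buffers[] looks roughly like this. This is a sketch of the common setup from the V4L2 examples, not your missing code; the buffer count of 4 and the struct name are assumptions:

#include <string.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <linux/videodev2.h>

struct buffer {
    void   *start;
    size_t  length;
};

static struct buffer buffers[4];   /* one entry per driver buffer (count assumed) */

static int map_buffers(int fd)
{
    struct v4l2_requestbuffers req;
    memset(&req, 0, sizeof(req));
    req.count  = 4;
    req.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0)
        return -1;

    for (unsigned i = 0; i < req.count; i++) {
        struct v4l2_buffer buf;
        memset(&buf, 0, sizeof(buf));
        buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index  = i;
        if (ioctl(fd, VIDIOC_QUERYBUF, &buf) < 0)
            return -1;

        /* Map the driver's buffer into our address space; this is what
           buffers[buf.index].start must point to for the fwrite() to work. */
        buffers[i].length = buf.length;
        buffers[i].start  = mmap(NULL, buf.length, PROT_READ | PROT_WRITE,
                                 MAP_SHARED, fd, buf.m.offset);
        if (buffers[i].start == MAP_FAILED)
            return -1;

        if (ioctl(fd, VIDIOC_QBUF, &buf) < 0)
            return -1;
    }

    /* Streaming must be started before buffers can be dequeued. */
    enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (ioctl(fd, VIDIOC_STREAMON, &type) < 0)
        return -1;

    return 0;
}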
As you wrote in your question, all the ioctl commands return 0, so there should be no error if everything behaves as expected. Another way to check for issues is to call
perror("<your comment or hint to the line above>");
after each ioctl command. This prints more information about errors to stderr. (More details about perror can be found in this thread: When should I use perror("...") and fprintf(stderr, "...")?)
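A small sketch of that pattern, wrapping ioctl so every failure gets reported (the wrapper name xioctl is borrowed from the common V4L2 capture examples):

#include <errno.h>
#include <stdio.h>
#include <sys/ioctl.h>

/* Wrap ioctl so every failure is reported immediately (and retried on EINTR). */
static int xioctl(int fd, unsigned long request, void *arg, const char *name)
{
    int r;
    do {
        r = ioctl(fd, request, arg);
    } while (r == -1 && errno == EINTR);

    if (r == -1)
        perror(name);   /* e.g. prints "VIDIOC_DQBUF: Invalid argument" */
    return r;
}

/* Usage: xioctl(fd, VIDIOC_DQBUF, &buf, "VIDIOC_DQBUF"); */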
Are you trying to get the images from the camera? (On some phones, the video0 device you used above is the back camera.) On some Android devices the camera has to be started by a complex procedure using other device drivers besides videoXY, and trying to get images from video0 while the official camera app is running might be difficult. The official V4L2 API documentation says:
V4L2 drivers should not support multiple applications reading or writing the same data stream on a device by copying buffers, time multiplexing or similar means. This is better handled by a proxy application in user space.
From: http://linuxtv.org/downloads/v4l-dvb-apis/common.html#idp18553208
Can you post more (detailed) code? I might be able to help, as I'm doing very similar stuff.
To be able to reproduce it, it would also be very interesting to know which Android device you are working with (type / model number / Android version).
I have a problem with the socket send (or write) function on Android.
I have a network lib that I use on Linux and Android. The code is written in C.
On Android, the application creates a service, which loads the native code and creates the connection with the help of my network lib. The connection is a TCP socket. When I call send (or write, it makes no difference), the code hangs in that call in most cases. Sometimes it unhangs after 10-120 seconds; sometimes it waits longer (until I kill the application). The data size being sent is about 40-50 bytes. The first send (a 5-byte handshake) never hangs (or I am just lucky); the send that hangs is usually the one right after the handshake packet. The time between the handshake send and the hanging send is about 10-20 seconds.
The socket is also used on another thread (I use pthreads), where recv is called. But no data is being sent to the Android side at that time, so recv is just waiting while I call send.
I am sure that the other side is waiting for the data: I can see that recv on the other side returns with EAGAIN every 3 seconds (I set a timeout) and immediately calls recv again. The recv is always waiting for 10 bytes (the minimal packet size).
I am unable to reproduce this behavior with Linux-to-Android or Linux-to-Linux transfers, only Android-to-Linux. I can reproduce it with the two different Android devices available to me, so I don't think the problem is broken hardware on one particular device.
I tried to set the SO_KEEPALIVE and TCP_NODELAY options, with no success.
What can cause the send/write call to hang, and how can I resolve this?
The socket is created with this code:
int sockfd, n;
struct addrinfo hints, *res, *ressave;
bzero(&hints, sizeof(struct addrinfo));
hints.ai_family = AF_INET;
hints.ai_socktype = SOCK_STREAM;
if ((n = getaddrinfo(host, serv, &hints, &res)) != 0)
{ /* stripped error handling*/ }
ressave = res;
do
{
sockfd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
if (sockfd < 0) continue;
if (connect(sockfd, res->ai_addr, res->ai_addrlen) == 0)
{
break; /* success */
}
close(sockfd); /* ignore this one */
} while ((res = res->ai_next) != NULL);
The hanging send operation is:
mWriteMutex.lock();
mSocketMutex.lockRead();
ssize_t n = send(mSocket, pArray, size, 0);
mSocketMutex.unlock();
mWriteMutex.unlock();
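One way to keep a stuck send() from blocking forever while the real cause is being tracked down would be a send timeout on the socket. A minimal sketch, assuming a 10-second bound is acceptable (the value is arbitrary):

#include <sys/socket.h>
#include <sys/time.h>

/* Bound every send() on this socket to 10 seconds; after that it returns -1
   with EAGAIN/EWOULDBLOCK (or a partial count) instead of blocking indefinitely. */
struct timeval tv;
tv.tv_sec  = 10;
tv.tv_usec = 0;
setsockopt(mSocket, SOL_SOCKET, SO_SNDTIMEO, &tv, sizeof(tv));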
The problem was solved with the help of Nikolai N Fetissov in the comments: his pointed question unblocked my mind, and I found a problem in my RWMutex.
I am coding a native application on Android and I need to get the default gateway of the device in my application. Here is my current code to get the default gateway:
static int get_default_gateway(char *def_gateway, int buf_size)
{
FILE* pipe;
char buffer[128];
char result[2049];
char cmd[] = "netstat -r | grep ^default | awk '{print $2}'";
pipe = popen(cmd, "r");
if (!pipe) return 1;
memset(result, 0, sizeof(result));
while(!feof(pipe)) {
memset(buffer, 0, sizeof(buffer));
if(fgets(buffer, 128, pipe) != NULL)
{
strcat(result, buffer);
}
}
pclose(pipe);
memset(def_gateway, 0, buf_size);
strncpy(def_gateway, result, buf_size - 1); /* leave room for the terminating NUL */
return 0;
}
It works on my LG P500, but on some devices it doesn't return anything.
My question is: does popen() work on Android? I read somewhere that it is not included in Bionic.
And is there any other method to get the default gateway? I need it to be written in C, not Java.
Thank you
Yeah, popen() should probably work on any Android. But unfortunately grep and awk will not always be there. Take a look at /proc/net/route: the line where Destination equals 00000000 is your default gateway. Also, perhaps you can use a NETLINK_ROUTE socket, though I have never used one and can't say more.
See also this related question.
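A minimal sketch of the /proc/net/route approach described above; the helper name is mine, and the parsing assumes the standard /proc/net/route layout:

#include <arpa/inet.h>
#include <stdio.h>
#include <string.h>

/* Reads /proc/net/route and writes the default gateway as a dotted quad
   into def_gateway. Returns 0 on success, -1 if no default route is found. */
static int get_default_gateway_proc(char *def_gateway, size_t buf_size)
{
    FILE *fp = fopen("/proc/net/route", "r");
    if (!fp)
        return -1;

    char line[256];
    /* Skip the header line. */
    if (!fgets(line, sizeof(line), fp)) {
        fclose(fp);
        return -1;
    }

    while (fgets(line, sizeof(line), fp)) {
        char iface[32];
        unsigned long dest, gateway;
        /* Fields: Iface, Destination, Gateway, ... (hex, in the byte order the
           kernel stores them, so the value can go straight into s_addr). */
        if (sscanf(line, "%31s %lx %lx", iface, &dest, &gateway) != 3)
            continue;
        if (dest == 0) {                 /* Destination 00000000 = default route */
            struct in_addr addr;
            addr.s_addr = (in_addr_t)gateway;
            strncpy(def_gateway, inet_ntoa(addr), buf_size - 1);
            def_gateway[buf_size - 1] = '\0';
            fclose(fp);
            return 0;
        }
    }

    fclose(fp);
    return -1;
}

This avoids relying on netstat, grep and awk being present on the device.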