Bluetooth: program stuck at InputStream reading

Hello.
I'm developing an application that transfers data over Bluetooth (with a flight recorder device). When I am receiving a lot of data (3,000 to 40,000 lines of text, depending on the file size), my application seems to stop receiving. I receive the data with InputStream.read(buffer). For example: I send a command to the flight recorder, it starts sending me a file (line by line), on my phone I receive 120 lines and then the app gets stuck.
Interestingly, on my HTC Desire the app only gets stuck sometimes, while on the Samsung Galaxy S the application gets stuck every single time I try to receive more than 50 lines.
The code is based on the BluetoothChat example. This is the part of the code where I am listening to the BluetoothSocket:
byte[] buffer = new byte[1024];
int bytes = 0;
while (true)
{
    bytes = mmInStream.read(buffer);
    readMessage = new String(buffer, 0, bytes);
    Log.e("read", readMessage);
    String read2 = readMessage;
    // searching for the end of line to count the lines (the star marks the start of the checksum)
    int currentHits = read2.replaceAll("[^*]", "").length();
    nmbrOfTransferedFligts += currentHits;
    ...
    // parsing and saving the received data
I must say that I am running this in a while(true) loop, in a Thread that is implemented in an Android Service. The app seems to get stuck at "bytes = mmInStream.read(buffer);".
I have tried to do this with a BufferedReader, but with no success.
Thanks.

The app seems to get stuck at "bytes = mmInStream.read(buffer);"
But that is normal behavior: InputStream.read(byte[]) blocks when no data is currently available.
This suggests to me that the problem is on the other end or in the communication between the devices. Is it possible that you have a communication problem (which behaves a bit differently on the Galaxy vs. the Desire) that is preventing more data from being received?
Also, I would suggest wrapping a try/catch around the read statement to be sure that you catch any possible IOExceptions. Though I guess you would have seen it in logcat if that were happening.
Speaking of logcat, I would suggest that you look at the logcat statements that Android itself is generating. It generates a lot of output for Bluetooth, and this might help you figure out whether there really is any more data to be read().
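As a rough sketch of that suggestion, the read loop could look like the following (assuming the same mmInStream field as the BluetoothChat example; the log tags are illustrative):
byte[] buffer = new byte[1024];
while (true) {
    int bytes;
    try {
        // read() blocks until at least one byte is available or the stream fails
        bytes = mmInStream.read(buffer);
    } catch (IOException e) {
        // the remote side closed the link or the socket broke
        Log.e("read", "InputStream.read() failed", e);
        break;
    }
    if (bytes == -1) {
        // end of stream reached
        break;
    }
    String readMessage = new String(buffer, 0, bytes);
    Log.d("read", readMessage);
}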

Related

InputStream reads available data in more than 1 try

I have two applications connected to each other over WiFi. I am using an InputStream to read data sent from the server app.
The code is the standard one:
try {
    bytesRead = mmInStream.read(buffer, 0, 6300); //read(buffer);
    Logger.d(TAG, "Bytes read from inStream : " + bytesRead);
    if (-1 != bytesRead) {
        handler.obtainMessage(12, bytesRead, -1, buffer).sendToTarget();
    } else {
        connectionLost();
    }
} catch (Exception e) {
    e.printStackTrace();
    connectionLost();
}
I kill and reset the threads in the connectionLost() method.
I am sending close to 6 KB of data from the server app, in a JSON string.
This works 3 out of 5 times.
Sometimes the read will return, say, a 1.5 KB buffer and on a second run it will give the rest of the data. But meanwhile the first 1.5 KB has already been sent to the JSON parser, and that gives me an error.
I printed the bytes written to the output buffer on the server side; it writes 6 KB every time. I want to know why the read() method sometimes reads only half of the stream and the rest of it on a second try.
How do I know if it has read all the data or only half of it?
I don't know beforehand how many bytes the server will send. (I came up with that number because I am debugging the code; the 6 KB may change later.)
Thank you in advance. I've been stuck at this issue for two days. :(
It works as designed. When you read from a stream, you are not guaranteed to get all available bytes in one go. Most likely they are not even available when you do the first read.
You need some programmatic way to find out whether a message is complete. For example, if it is a JSON object or array, you can tell whether it is complete by analysing what you have received so far. Another way would be to transmit the length of the message first.
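As an illustration of the length-prefix idea, here is a minimal sketch using java.io.DataInputStream. The 4-byte length header is an assumed change to the server side, not something the question's protocol already does:
DataInputStream in = new DataInputStream(mmInStream);
try {
    int length = in.readInt();               // 4-byte length header written by the server
    byte[] payload = new byte[length];
    in.readFully(payload);                   // loops internally until all bytes have arrived
    String json = new String(payload, Charset.forName("UTF-8"));
    handler.obtainMessage(12, length, -1, payload).sendToTarget();
} catch (IOException e) {
    connectionLost();
}
With this framing, the JSON parser only ever sees complete messages, regardless of how read() splits the incoming bytes.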

Sending image via bluetooth is split into packages

I modified the sample Bluetooth Chat application with a feature for sending images. I ran into a problem I can't figure out. The sample app needs a string to send and receive data. I am using Base64 to convert the bitmaps to bytes and then to a string, so I can send text along with the selected image. If I send a small image (e.g. a fully black image of 4 KB) it is sent, received and displayed in the ListView. But when I send larger images, they are split into many packages. This means that instead of one image, I get numerous texts, and the size of the array I use to populate the ListView will be anything more than 1, depending on the number of packages, so my ListView is populated with a lot of text.
As it turns out, it depends on the size of the image. I experimented with many images. In the case of a photo whose length (as a string) is 50096, I get 51 packages on the receiver side. In the case of another image whose length is 77896, I get 81 packages. So I decided to try with a really small image. I created a black image in Paint, and it was successfully sent and received. I measured its length, which was 280. Then I experimented with a white image with a little yellow, which had a length of 980. That was also successfully sent. Then I added some black to the image, which resulted in a length of 1554. This time I received a couple of texts instead of the image.
So my conclusion is that the length of the string is restricted when sending data via Bluetooth, which may be obvious to someone who is familiar with the technology.
What can I do about this? If I knew the maximum number of characters that I can send in one package, I could calculate the number of packages on the sender side and concatenate all the packages on the receiver side.
I show two images of the receiver side to demonstrate this behaviour. The first one shows the correct behaviour, the second one the wrong behaviour:
Your problem is in the receiver portion, as can be seen in this function from the Bluetooth chat source code:
public void run() {
    Log.i(TAG, "BEGIN mConnectedThread");
    byte[] buffer = new byte[1024];
    int bytes;

    // Keep listening to the InputStream while connected
    while (true) {
        try {
            // Read from the InputStream
            bytes = mmInStream.read(buffer);

            // Send the obtained bytes to the UI Activity
            mHandler.obtainMessage(BluetoothChat.MESSAGE_READ, bytes, -1, buffer)
                    .sendToTarget();
        } catch (IOException e) {
            Log.e(TAG, "disconnected", e);
            connectionLost();
            // Start the service over to restart listening mode
            BluetoothChatService.this.start();
            break;
        }
    }
}
Specifically, note the byte[] buffer = new byte[1024];. You are limited to only 1 KB per read. If you want to send more, you need to increase the size of the buffer, or build a method that detects when a file exceeds this buffer size and reassembles it.
A few things of note, should you design a protocol to handle this:
You shouldn't need to send this in Base64; the stream accepts binary just fine.
Put a header on each packet that includes the file type, packet number, and total number of packets (see the sketch after this list). If you want to be really fancy, include a checksum of some sort to verify that the packet is intact.
You should include a mechanism to re-transmit missing packets.
You might want to send out an initial header explaining what is coming.
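For illustration, here is a minimal sketch of such a packet header and the receiver-side reassembly. The 5-byte header layout, the chunk size, and all method names are assumptions, not part of the BluetoothChat sample; it also assumes each read() on the receiver returns exactly one packet, which in practice requires stricter framing because RFCOMM may split or merge writes:
// Sender: split the raw image bytes into chunks that fit the 1 KB read buffer,
// prefixing each chunk with [type][packetNo hi][packetNo lo][total hi][total lo].
private static final int CHUNK = 1019; // 1024 - 5 header bytes

void sendImage(OutputStream out, byte type, byte[] image) throws IOException {
    int total = (image.length + CHUNK - 1) / CHUNK;
    for (int i = 0; i < total; i++) {
        int len = Math.min(CHUNK, image.length - i * CHUNK);
        byte[] packet = new byte[5 + len];
        packet[0] = type;
        packet[1] = (byte) (i >> 8);
        packet[2] = (byte) i;
        packet[3] = (byte) (total >> 8);
        packet[4] = (byte) total;
        System.arraycopy(image, i * CHUNK, packet, 5, len);
        out.write(packet);
    }
    out.flush();
}

// Receiver: append each payload until all packets have arrived, then decode the image.
private final ByteArrayOutputStream assembled = new ByteArrayOutputStream();
private int receivedPackets = 0;

void onPacket(byte[] packet, int length) {
    int total = ((packet[3] & 0xFF) << 8) | (packet[4] & 0xFF);
    assembled.write(packet, 5, length - 5);
    if (++receivedPackets == total) {
        byte[] imageBytes = assembled.toByteArray();
        Bitmap bmp = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
        // hand bmp to the ListView adapter here
    }
}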

Mono for Android - Activity crash upon service call

My application has a UI (implemented with an Activity) and a service (implemented with an IntentService). The service is used to send data (synchronously, using NetworkStream.Write) to a remote server, as well as to report the transmission status back to the UI (implemented using the broadcast receiver method).
Here is my problem:
The application works properly if the size of the buffer used for NetworkStream.Write is 11 KB or less.
However, if the size of the buffer is larger than 11 KB, say 20 KB (this size is needed in order to send JPG images), then the service keeps working properly (verified with the log file), but the UI is gone (similar to when the device's back button is pushed) and I can't find a way to bring it back. It's important to point out that in this case the Activity does not go into the OnStop() or OnDestroy() states.
At first I thought this would be some ApplicationNotResponding-related issue due to a server delay, yet the UI crashes after about 5 seconds.
Moreover, this only happens on hardware; the emulator version works fine.
// SEND STREAM:
Byte[] outStream = new Byte[20000];
// -- Set up TCP connection: --
TcpClient ClientSock = new TcpClient();
ClientSock.Connect("myserver.com", 5555);
NetworkStream serverStream = ClientSock.GetStream();
serverStream.Write(outStream, 0, outStream.Length);
serverStream.Flush();
// . . .
// RECEIVE STREAM:
inStream.Initialize(); // Clears any previous value.
int nBytesRead = 0;
nBytesRead = serverStream.Read(inStream, 0, 1024);
// -- Closing communications socket: --
ClientSock.Close();
One thing first: I would have commented on the question to clarify one thing before giving an answer, but unfortunately I don't have enough reputation yet.
The thing I would have asked is: why do you need a buffer greater than 11 KB to send a JPG image?
I do nearly the same thing in an (async) task with an image of 260 KB, but with a buffer of 10240 bytes. It works without difficulties.
byte[] buffer = new byte[10240];
for (int length = 0; (length = in.read(buffer)) > 0;) {
    outputStream.write(buffer, 0, length);
    outputStream.flush();
    bytesWritten += length;
    progress = (int) ((double) bytesWritten * 100 / totalBytes);
    publishProgress();
}
outputStream.flush();
I use this code to read a JPG image from resources or the SD card and post it to my server.
Well, you may want to change your application to use an AsyncTask; take a look at this guide:
http://developer.android.com/training/basics/network-ops/connecting.html
Network operations can involve unpredictable delays. To prevent this from causing a poor user experience, always perform network operations on a separate thread from the UI.
Since Android 4.0 it's impossible to perform network-related tasks on the same thread as the UI. Also, just to be clear: http://developer.android.com/guide/components/services.html
Caution: A service runs in the main thread of its hosting process—the service does not create its own thread and does not run in a separate process
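For completeness, the equivalent pattern in Java (the question uses Mono/C#, but the idea is identical): move the blocking socket write onto a worker thread and notify the Activity when it finishes. The host, port, payload, and message constants below are made up for illustration:
// Hypothetical sketch: perform the blocking write off the UI thread
// and report the result back through a Handler owned by the Activity.
new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            Socket socket = new Socket("myserver.com", 5555);
            OutputStream out = socket.getOutputStream();
            out.write(payload);                       // a 20 KB buffer is fine here; nothing blocks the UI
            out.flush();
            socket.close();
            uiHandler.sendEmptyMessage(MSG_SENT);     // tell the Activity the transfer succeeded
        } catch (IOException e) {
            uiHandler.sendEmptyMessage(MSG_FAILED);
        }
    }
}).start();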

get SensorValues from Arduino ADK

I'm working with Arduino and the ADK, and I have problems getting information from Arduino sensors. At the moment I can send information from the phone to control LEDs or servos. But it is impossible for me to send data via acc.write(msg, 3) to the smartphone. I am using the example Dev-O-Rama and am trying to extend it to get sensor data (without success). To get sensor data I am using code from the ADK example. Every time I try to send data from the Arduino with acc.write(msg, 3), everything seems to freeze. Does someone have a small working example, or a hint on what can cause this behaviour?
Best regards, Marcel
I figure it's the Android application that's freezing?
Try using the USBControl library in my project
The Arduino write code is very simple:
// Batt update, 0.1 Hz loop
if (acc.isConnected() && millis() - timer_batt >= 1000 / BATT_FREQ) {
    timer_batt = millis();
    //Serial.print("b");
    msg[0] = SYNC;
    msg[1] = BATTERY_LEVEL;
    msg[2] = getBatt();
    acc.write(msg, 3);
}
Make sure you're limiting the writing frequency, and reading quickly enough on both sides. There's a strange behaviour where the connection will appear to hang if there's too much data being buffered.
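For reference, here is a rough sketch of the Android-side read loop for the accessory stream, run on a background thread so it keeps up with the Arduino. The UsbManager/ParcelFileDescriptor setup follows the standard accessory API; the SYNC and BATTERY_LEVEL constants mirror the Arduino snippet above and are assumptions:
// Open the accessory and read 3-byte messages in a background thread.
ParcelFileDescriptor fd = usbManager.openAccessory(accessory);
final FileInputStream in = new FileInputStream(fd.getFileDescriptor());

new Thread(new Runnable() {
    @Override
    public void run() {
        byte[] msg = new byte[3];
        try {
            while (true) {
                int read = in.read(msg);       // blocks until data arrives
                if (read < 0) break;           // accessory detached
                // (a robust version would loop until all 3 bytes are read)
                if (msg[0] == SYNC && msg[1] == BATTERY_LEVEL) {
                    int level = msg[2] & 0xFF;  // unsigned battery value
                    // post 'level' to the UI thread here
                }
            }
        } catch (IOException e) {
            // connection closed
        }
    }
}).start();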
It's hard to say what to do without knowing what kind of data you are sending. Basically you have to make sure you call the acc.write() method with the correct parameters. The first one is the byte array itself and the second one is the number of bytes in the array. So if your byte array is 5 bytes in size, you need to call it like this: acc.write(msg, 5).
It is even better to make it more dynamic by using the Arduino sizeof function:
acc.write(msg, sizeof(msg));

AudioTrack: Playing sound coming in over WiFi

I've got an AudioTrack in my application which is set to stream mode. I want to write audio to it that I receive over a wireless connection. The AudioTrack is declared like this:
mPlayer = new AudioTrack(STREAM_TYPE,
                         FREQUENCY,
                         CHANNEL_CONFIG_OUT,
                         AUDIO_ENCODING,
                         PLAYER_CAPACITY,
                         PLAY_MODE);
Where the parameters are defined as:
private static final int FREQUENCY = 8000,
        CHANNEL_CONFIG_OUT = AudioFormat.CHANNEL_OUT_MONO,
        AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT,
        PLAYER_CAPACITY = 2048,
        STREAM_TYPE = AudioManager.STREAM_MUSIC,
        PLAY_MODE = AudioTrack.MODE_STREAM;
However, when I write data to the AudioTrack with write(), the playback is choppy... The call
byte[] audio = packet.getData();
mPlayer.write(audio, 0, audio.length);
is made whenever a packet is received over the network connection. Does anybody have an idea why it sounds choppy? Maybe it has something to do with the WiFi connection itself? I don't think so, because the sound doesn't sound horrible the other way around, when I send data from the Android phone to another source over UDP. The sound is then complete and not choppy at all... So does anybody have an idea why this is happening?
Do you know how many bytes per second you are receiving, how the average time between packets compares, and the maximum time between packets? If not, can you add code to calculate it?
You need to be averaging 8000 samples/second * 2 bytes/sample = 16,000 bytes per second in order to keep the stream filled.
A gap of more than 2048 bytes / (16000 bytes/second) = 128 milliseconds between incoming packets will cause your stream to run dry and the audio to stutter.
One way to prevent this is to increase the buffer size (PLAYER_CAPACITY). A larger buffer will be better able to handle variation in the incoming packet size and rate. The cost of the extra stability is a larger delay in starting playback while you wait for the buffer to fill initially.
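For illustration, a minimal sketch of deriving the buffer size from AudioTrack.getMinBufferSize() and scaling it up to absorb network jitter (the factor of 4 and the 8,000-byte floor are arbitrary assumptions):
int minBuf = AudioTrack.getMinBufferSize(FREQUENCY,
                                         CHANNEL_CONFIG_OUT,
                                         AUDIO_ENCODING);
// Use several times the minimum so roughly half a second of jitter can be
// absorbed before the track runs dry (16,000 bytes/s at 8 kHz mono 16-bit PCM).
int playerCapacity = Math.max(minBuf * 4, 8000);

mPlayer = new AudioTrack(STREAM_TYPE,
                         FREQUENCY,
                         CHANNEL_CONFIG_OUT,
                         AUDIO_ENCODING,
                         playerCapacity,
                         PLAY_MODE);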
I have partially solved it by placing the mPlayer.write(audio, 0, audio.length); call in its own Thread. This takes away some of the choppiness (due to the fact that write is a blocking call), but it still sounds choppy after a good second or two. It still has a significant delay of 2-3 seconds.
new Thread() {
    public void run() {
        byte[] audio = packet.getData();
        mPlayer.write(audio, 0, audio.length);
    }
}.start();
Just a little anonymous Thread that does the writing now...
Anybody have an idea on how to solve this issue?
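As an aside, a common alternative to spawning a new Thread per packet is a single long-lived writer thread fed by a queue; a rough sketch (the queue and names are not from the question):
// One long-lived writer thread; packets are queued as they arrive
// instead of starting a new Thread per packet.
final BlockingQueue<byte[]> audioQueue = new LinkedBlockingQueue<byte[]>();

new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            while (true) {
                byte[] audio = audioQueue.take();       // waits for the next packet
                mPlayer.write(audio, 0, audio.length);  // blocking write, off the UI thread
            }
        } catch (InterruptedException e) {
            // shutting down
        }
    }
}).start();

// In the network receive callback:
audioQueue.offer(packet.getData());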
Edit:
After some further checking and debugging, I've noticed that this is an issue with obtainBuffer.
I've looked at the Java code of AudioTrack and the C++ code of AudioTrack, and I've noticed that it can only come from the C++ code.
if (__builtin_expect(result != NO_ERROR, false)) {
    LOGW("obtainBuffer timed out (is the CPU pegged?) "
         "user=%08x, server=%08x", u, s);
    mAudioTrack->start(); // FIXME: Wake up audioflinger
    timeout = 1;
}
I've noticed that there is a FIXME in this piece of code. :< But anyway, could anybody explain how this C++ code works? I've had some experience with C++, but it was never as complicated as this...
Edit 2:
I've tried something a bit different now: I buffer the data I receive, and once the buffer is filled with some data, it is written to the player. However, the player keeps up with consuming it for a few cycles, then the obtainBuffer timed out (is the CPU pegged?) warning kicks in, and no data at all is written to the player until it is kick-started back to life... After that, it continually gets data written to it until the buffer is emptied.
Another slight difference is that I now stream a file to the player. That is, I read it in chunks, then write those chunks to the buffer. This simulates the packets being received over WiFi...
I am beginning to wonder if this is just an OS issue that Android has, and isn't something I can solve on my own... Anybody got any ideas on that?
Edit 3:
I've done more testing, but it doesn't help me any further. This test shows me that I only get lag when I write to the AudioTrack for the first time. This takes somewhere between 1 and 3 seconds to complete. I did this by using the following bit of code:
long beforeTime = Utilities.getCurrentTimeMillis(), afterTime = 0;
mPlayer.write(data, 0, data.length);
afterTime = Utilities.getCurrentTimeMillis();
Log.e("WriteToPlayerThread", "Writing a package took " + (afterTime - beforeTime) + " milliseconds");
However, I get the following results:
(Logcat screenshot: http://img810.imageshack.us/img810/3453/logcatimage.png)
These results show that the lag only occurs at the beginning, after which the AudioTrack keeps getting data continuously... I really need to get this one fixed...
