I have two applications connected to each other over Wi-Fi. I am using an InputStream to read data sent from the server app.
The code is the standard pattern:
try {
    bytesRead = mmInStream.read(buffer, 0, 6300); // read(buffer);
    Logger.d(TAG, "Bytes read from inStream : " + bytesRead);
    if (-1 != bytesRead) {
        handler.obtainMessage(12, bytesRead, -1, buffer).sendToTarget();
    } else {
        connectionLost();
    }
} catch (Exception e) {
    e.printStackTrace();
    connectionLost();
}
I kill and reset the threads in the connectionLost() method.
I am sending close to 6 KB of data from the server app as a JSON string.
This works 3 out of 5 times.
Sometimes read() returns, say, a 1.5 KB buffer, and a second call returns the rest of the data. But by then the first 1.5 KB has already been sent to the JSON parser, which gives me an error.
I printed the bytes written to the output buffer on the server side; it writes 6 KB every time. I want to know why read() sometimes reads only part of the stream and the rest on a second call.
How do I know whether it has read all of the data or only part of it?
I don't know beforehand how many bytes the server will send. (I only came up with that number because I am debugging the code; the 6 KB may change later.)
Thank you in advance. I've been stuck on this issue for two days. :(
It works as designed. When you read from a stream, you are not guaranteed to get all available bytes in one call. Most likely they have not even all arrived when you do the first read.
You need a programmatic way to find out whether a message is complete. For example, if it is a JSON object or array, you can tell whether it is complete by analysing what you got. Another way is to transmit the length of the message first.
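As a sketch of the length-prefix approach (the class and method names here are mine, not part of your code): the sender writes a 4-byte length header before the payload, and the receiver uses DataInputStream.readFully(), which loops internally until exactly that many bytes have arrived, so a partial read() result never reaches the JSON parser.

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class Framing {
    // Sender side: prefix the payload with a 4-byte big-endian length.
    public static void sendMessage(OutputStream out, byte[] payload) throws IOException {
        DataOutputStream dout = new DataOutputStream(out);
        dout.writeInt(payload.length);
        dout.write(payload);
        dout.flush();
    }

    // Receiver side: readFully() blocks until exactly 'length' bytes have
    // been read, so partial TCP reads are invisible to the caller.
    public static byte[] readMessage(InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        int length = din.readInt();
        byte[] payload = new byte[length];
        din.readFully(payload);
        return payload;
    }
}
```

Only once readMessage() returns do you hand the complete byte array to the JSON parser.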
Related
When Android communicates with a PC via USB accessory mode, the Android side cannot receive data if the PC sends exactly 512 bytes.
There is no problem if more (or less) than 512 bytes are transferred.
And if Android receives further bytes after those 512 bytes, both the missing data and the new data arrive together (512 bytes + other data).
My read code, running on a thread, is below.
@Override
public void run() {
    byte[] readbuffer = new byte[16384];
    int bytesRead = 0;
    try {
        while ((bytesRead = mInputStream.read(readbuffer, 0, readbuffer.length)) != -1) {
            // my code here, after the read
            ....
            mHandler.sendMessage(msg);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
This happens not only with 512 bytes but with some other specific lengths as well (512 bytes, 1024 bytes, 2048 bytes, ...).
Is this an Android accessory mode bug?
Does anybody know about this issue?
It is not a bug in AOA but your sender not finishing the USB transaction. Unlike USB control transfers, bulk transfers do not transmit the data size, so for a bulk transfer to finish, one of these conditions must be satisfied:
The amount of data received equals the amount of data requested.
The size of the data is less than the maximum packet size.
A zero-length packet is received.
For high-speed mode, the maximum packet size is 512 bytes, so if you send 0-511 bytes, condition 2 is satisfied. If the data is 513-1023 bytes long, it is split into two packets (512 bytes + 1-511 bytes), so again the last packet satisfies condition 2.
If you send exactly 512 bytes, the receiver cannot tell whether you have finished the transaction or there is remaining data (in an additional packet), so it keeps waiting and freezes. So, for lengths that are a multiple of the packet size (512 in high speed, 64 in full speed), you need to send an additional zero-length packet to finish the USB transfer.
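On the sender side, the check for whether a trailing ZLP is needed is just an arithmetic test; a minimal sketch (the helper name is mine):

```java
public class UsbBulk {
    // Returns true when the transfer length is a non-zero multiple of the
    // endpoint's maximum packet size, i.e. when a trailing zero-length
    // packet (ZLP) must be sent to terminate the bulk transfer.
    public static boolean needsZeroLengthPacket(int transferLength, int maxPacketSize) {
        return transferLength > 0 && transferLength % maxPacketSize == 0;
    }
}
```

After a bulk write whose length satisfies this test, issue one additional zero-length write on the same endpoint.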
I'm having a lot of trouble with Android NFC transacting with a chip's FIFO cache area. There are two sides: an external controller (B) wired to the chip, which writes data into the chip's FIFO cache area in C, and an Android app (A). The FIFO cache area does not retain data after power loss, and once the chip has sent out all its data, the FIFO cache area is cleared.
The situation is: A comes close to the chip and sends an APDU command; the chip receives the command and raises a signal. B detects the chip's signal, grabs the command, takes the command's first byte (fb), and writes [fb+data+9000] into the chip's FIFO cache area. Finally, the chip handles sending the data back to A by itself; we do not know how the chip manages the send-back.
The problem is: when B writes fewer than 15 bytes of [fb+data+9000] (meaning the data part is only 12 bytes), A receives [fb+data+9000] from the chip. But when [fb+data+9000] is more than 15 bytes, A throws a TagLostException.
The chip uses the ISO 14443-4 protocol.
The command:
The transact code:
try {
    isoDep.close();
    isoDep.connect();
} catch (IOException e) {
    errorfound = true;
    onMessageReceived.onError(e);
}
if (!errorfound) {
    if (isoDep.isConnected()) {
        try {
            isoDep.setTimeout(1200);
            response = isoDep.transceive(newtest1_apdu);
            int status = ((0xff & response[response.length - 2]) << 8)
                    | (0xff & response[response.length - 1]);
            if (status != 0x9000) {
                log.error("retrieve data, read failure");
            } else {
                log.info("retrieve data, result=" + numeralParse.toReversedHex(response));
            }
            onMessageReceived.onMessage(response);
        } catch (TagLostException e) {
            log.info("catch tag lost exception, tag isConnected=" + isoDep.isConnected());
            onMessageReceived.onError(e);
        } catch (IOException e) {
            log.info("catch IOException, isoDep isConnected=" + isoDep.isConnected());
            onMessageReceived.onError(e);
        }
    } else {
        log.error("isoDep not connected");
    }
}
The Android app (A) tried a variety of commands, including this format: .
And the other side (B) only takes the first byte of the command and writes [fb+data+9000] to the chip's FIFO cache area. This is not a timeout issue: besides setTimeout(1200), we also tried setTimeout(5000) and no setTimeout at all. Also, A and B did not agree on any specific meaning for the APDU commands. Also, with different APDU commands A works fine reading a public transportation card (that may read from a block area, whereas here we work with the cache area, so the two access patterns differ). Also, the chip configuration is basically the default. Also, when tested with a different card reader, the chip sends its data out successfully.
I searched Google, Bing, Baidu, the official Android issue tracker, Stack Overflow and so on, but could not find an answer. This problem has been bothering us a lot. Apologies for my poor English. Please help; thank you very much.
(The chip is the FM11NC08.)
New progress: we found that, giving up on APDU commands, if A sends 1 byte, A can receive at most 16 bytes; if A sends 2 bytes, at most 15 bytes; and if A sends 15 bytes, at most 2 bytes. The chip's FIFO cache area has 32 bytes of space. After B receives A's data, B clears the FIFO cache area and then writes its data into it.
Thanks in advance.
Today B changed the chip's communication rate (from 1M to 2M) and part of the code, and now A works well with the chip! So we found that the communication rate has an impact on NFC communication. If you have the same trouble with NFC communication, you might try our approach!
Thanks to everyone who considered this problem while it was unsolved.
I modified the sample Bluetooth Chat application to add a feature for sending images. I ran into a problem I can't figure out. The sample app sends and receives data as strings, so I am using Base64 to convert the bitmap to bytes and then to a string, so I can send the text with the selected image. If I send a small image (e.g. a fully black image of 4 KB), it is sent, received and displayed in the ListView. But when I send larger images, they are split into many packets. This means that instead of one image, I get numerous texts, and the size of the array I use to populate the ListView will be anything more than 1, depending on the number of packets, so my ListView is populated with a lot of text.
As it turns out, this depends on the size of the image. I experimented with many images. For a photo whose length (as a string) is 50096, I get 51 packets on the receiver side. For another image, whose length is 77896, I get 81 packets. So I decided to try with a really small image. I created a black image in Paint, and it was successfully sent and received; I measured its length, which was 280. Then I experimented with a white image with a little yellow, which had a length of 980. That was also successfully sent. Then I added some black to the image, which resulted in a length of 1554. This time I received a couple of texts instead of the image.
So my conclusion is that the length of the string is restricted when sending data via Bluetooth, which may be obvious to someone familiar with the technology.
What can I do about this? If I knew the maximum number of characters that I can send in one packet, I could calculate the number of packets on the sender side and concatenate all the packets on the receiver side.
Below are two images of the receiver side to demonstrate this behaviour. The first shows the correct behaviour, the second the wrong one:
Your problem is in the receiver portion, as can be seen in this function from the Bluetooth Chat source code:
public void run() {
    Log.i(TAG, "BEGIN mConnectedThread");
    byte[] buffer = new byte[1024];
    int bytes;
    // Keep listening to the InputStream while connected
    while (true) {
        try {
            // Read from the InputStream
            bytes = mmInStream.read(buffer);
            // Send the obtained bytes to the UI Activity
            mHandler.obtainMessage(BluetoothChat.MESSAGE_READ, bytes, -1, buffer)
                    .sendToTarget();
        } catch (IOException e) {
            Log.e(TAG, "disconnected", e);
            connectionLost();
            // Start the service over to restart listening mode
            BluetoothChatService.this.start();
            break;
        }
    }
}
Specifically, note the line byte[] buffer = new byte[1024];. You are limited to only 1 KB per read. If you want to handle more, you need to increase the size of the buffer, or build a mechanism that detects when files exceed this buffer size.
A few things of note, should you design a protocol to handle this:
You shouldn't need to send the data in Base64; the stream accepts binary just fine.
Put a header on each packet that includes the file type, packet number, and total number of packets. If you want to be really fancy, include a checksum of some sort to verify the packet is intact.
You should include a mechanism to re-transmit missing packets.
You might want to send an initial header explaining what is coming.
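A minimal sketch of such a protocol, assuming in-order, lossless delivery and leaving out the file type, checksum and retransmission fields mentioned above (the class and constants are mine):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class Chunker {
    public static final int CHUNK_SIZE = 1024;

    // Sender: split the payload into numbered chunks, each carrying a
    // small header (chunk index, total chunks, body length).
    public static List<byte[]> split(byte[] payload) throws IOException {
        int total = (payload.length + CHUNK_SIZE - 1) / CHUNK_SIZE;
        List<byte[]> chunks = new ArrayList<>();
        for (int i = 0; i < total; i++) {
            int from = i * CHUNK_SIZE;
            int len = Math.min(CHUNK_SIZE, payload.length - from);
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            DataOutputStream dout = new DataOutputStream(out);
            dout.writeInt(i);      // chunk index
            dout.writeInt(total);  // total number of chunks
            dout.writeInt(len);    // payload bytes in this chunk
            dout.write(payload, from, len);
            chunks.add(out.toByteArray());
        }
        return chunks;
    }

    // Receiver: strip each header and concatenate the bodies
    // (chunks assumed to arrive in order here).
    public static byte[] join(List<byte[]> chunks) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (byte[] chunk : chunks) {
            DataInputStream din = new DataInputStream(new ByteArrayInputStream(chunk));
            din.readInt(); // index
            din.readInt(); // total
            int len = din.readInt();
            byte[] body = new byte[len];
            din.readFully(body);
            out.write(body);
        }
        return out.toByteArray();
    }
}
```

On the receiver, the index/total fields tell you when the last chunk has arrived, so the ListView is only updated once the image is complete.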
I have an application that plays MP3 files available at a public URL. Unfortunately the server does not support streaming, but Android makes the user experience quite acceptable anyway.
It all works fine on all platforms except Jelly Bean. When requesting the MP3, JB sends a request with a Range header 10 times; only after the 10th attempt does it seem to revert to the old behavior. This looks like this already-reported issue.
I found another SO thread where the recommended solution is to use a Transfer-Encoding: chunked header, but a comment just below it says this doesn't work.
For the moment I have no control whatsoever over those response headers, but until I do, I thought I would look for an alternative on the client side. (Even then, I can only return a Content-Range that spans indexes 0 to Content-Length - 1, e.g. Content-Range: bytes 0-3123456/3123457.)
What I tried to do is to implement pseudo-streaming on the client side by:
Opening an input stream to the MP3.
Decoding the incoming bytes using JLayer. I found the decoding code at this link.
Sending the decoded byte arrays to an AudioTrack that is already playing in stream mode.
The piece of code that does the decoding is below; I have only modified it so that it receives an InputStream:
public byte[] decode(InputStream inputStream, int startMs, int maxMs) throws IOException {
    ByteArrayOutputStream outStream = new ByteArrayOutputStream(1024);
    float totalMs = 0;
    boolean seeking = true;
    try {
        Bitstream bitstream = new Bitstream(inputStream);
        Decoder decoder = new Decoder();
        boolean done = false;
        while (!done) {
            Header frameHeader = bitstream.readFrame();
            if (frameHeader == null) {
                done = true;
            } else {
                totalMs += frameHeader.ms_per_frame();
                if (totalMs >= startMs) {
                    seeking = false;
                }
                if (!seeking) {
                    // logger.debug("Handling header: " + frameHeader.layer_string());
                    SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
                    if (output.getSampleFrequency() != 44100 || output.getChannelCount() != 2) {
                        throw new IllegalArgumentException("mono or non-44100 MP3 not supported");
                    }
                    short[] pcm = output.getBuffer();
                    for (short s : pcm) {
                        outStream.write(s & 0xff);
                        outStream.write((s >> 8) & 0xff);
                    }
                }
                if (totalMs >= (startMs + maxMs)) {
                    done = true;
                }
            }
            bitstream.closeFrame();
        }
        return outStream.toByteArray();
    } catch (BitstreamException e) {
        throw new IOException("Bitstream error: " + e);
    } catch (DecoderException e) {
        throw new IOException("Decoder error: " + e);
    }
}
I am requesting the decoded bytes in time chunks: starting with (0, 5000) so I have a bigger array to play at first, then requesting subsequent arrays that each span one second: (5000, 1000), (6000, 1000), (7000, 1000), etc.
The decoding is fast enough and is done on another thread; once a decoded byte array is available, I use a blocking queue to hand it to the AudioTrack, which plays on yet another thread.
The problem is that playback is not smooth, because the chunks are not continuous as a track (each chunk is continuous in itself, but written to the AudioTrack one after another they result in choppy playback).
To wrap up:
If you have bumped into this JellyBean issue, how did you solve it?
If any of you tried my approach, what am I doing wrong in above code? If this is the solution you used, I can publish the rest of the code.
Thanks!
It looks like you are trying to develop your own streaming scheme. This can produce choppy or interrupted playback, because you have to keep the data flowing continuously without running out of bytes to read.
Basically, you will have to account for all the situations that a normal streaming client takes care of. For instance, some blocks may be dropped or lost in transmission; the audio playback may catch up with the download; the CPU may start lagging, which affects playback; and so on.
Something to research if you want to continue down this path is the sliding window technique; it is essentially an abstract way of keeping the network connection active and fluid. You should be able to find several examples through Google; here is a place to start: http://en.wikipedia.org/wiki/Sliding_window_protocol
Edit: One workaround that may help you until this is fixed would be to include the source code for MediaPlayer.java and AudioManager.java from SDK <16 in your project and see if that resolves the problem. If you do not have the source code, you can download it with the SDK Manager.
AudioTrack is blocking by nature, per the docs ("Will block until all data has been written to the audio mixer."). I'm not sure if you're reading from the file and writing to the AudioTrack on the same thread; if so, I'd suggest you spin up a separate thread for the AudioTrack.
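A rough sketch of that producer/consumer split, with a ByteArrayOutputStream standing in for AudioTrack.write() so it compiles off-device (all names here are mine): the decoder thread calls submit() while a dedicated playback thread drains the queue, so the blocking write never stalls decoding.

```java
import java.io.ByteArrayOutputStream;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class PlaybackPipeline {
    private final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>();
    private static final byte[] EOS = new byte[0]; // end-of-stream marker

    // Called from the decoder thread with each decoded PCM chunk.
    public void submit(byte[] pcmChunk) throws InterruptedException {
        queue.put(pcmChunk);
    }

    public void finish() throws InterruptedException {
        queue.put(EOS);
    }

    // Drains chunks into 'sink' until the end-of-stream marker arrives.
    // On Android, the sink.write() call below is where AudioTrack.write()
    // would go.
    public Thread startPlayback(final ByteArrayOutputStream sink) {
        Thread t = new Thread(new Runnable() {
            @Override
            public void run() {
                try {
                    while (true) {
                        byte[] chunk = queue.take(); // blocks until data is ready
                        if (chunk == EOS) break;
                        sink.write(chunk, 0, chunk.length);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });
        t.start();
        return t;
    }
}
```

This only decouples the threads; it does not by itself fix gaps caused by decoding the MP3 in disjoint time windows.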
Hello.
I'm developing an application that transfers data over Bluetooth (with a flight recorder device). When I am receiving a lot of data (3,000-40,000 lines of text, depending on the file size), my application seems to stop receiving. I receive the data with InputStream.read(buffer). For example: I send a command to the flight recorder, it starts sending me a file (line by line), and on my phone I receive 120 lines, then the app gets stuck.
Interestingly, on my HTC Desire the app only gets stuck sometimes, while on the Samsung Galaxy S the application gets stuck every single time I try to receive more than 50 lines.
The code is based on the BluetoothChat example. This is the part of the code where I am listening on the BluetoothSocket:
byte[] buffer = new byte[1024];
int bytes = 0;
while (true) {
    bytes = mmInStream.read(buffer);
    readMessage = new String(buffer, 0, bytes);
    Log.e("read", readMessage);
    String read2 = readMessage;
    // searching for the end of line to count the lines
    // (the star marks the start of the checksum)
    int currentHits = read2.replaceAll("[^*]", "").length();
    nmbrOfTransferedFligts += currentHits;
    .
    .
    .
    // parsing and saving the received data
I should mention that I am running this in a while(true) loop, on a Thread implemented in an Android Service. The app seems to get stuck at "bytes = mmInStream.read(buffer);".
I have tried to do this with a BufferedReader, but with no success.
Thanks.
The app seems to get stuck at "bytes = mmInStream.read(buffer);"
But that is normal behavior: InputStream.read(byte[]) blocks when there is no more data available.
This suggests to me that the problem is on the other end or in the communication between the devices. Is it possible that you have a communication problem (which plays out a bit differently on the Galaxy vs. the Desire) that is preventing more data from being received?
Also, I would suggest wrapping a try/catch around the read statement to be sure that you catch any possible IOExceptions. Though I guess you would have seen them in logcat if they were happening.
Speaking of logcat, I would suggest looking at the log statements that Android itself generates. It generates a lot for Bluetooth, and this might help you figure out whether there really is any more data to be read().
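One more note on the read loop in the question: read() may return a fragment that ends in the middle of a record, so counting '*' characters per read() call works, but parsing each read() result on its own does not. A sketch of a loop that carries the unterminated tail over between reads (the class and method names are mine):

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;

public class RecordReader {
    // Reads from 'in' and returns the complete records, where each record
    // is terminated by '*'. Bytes after the last '*' are kept in 'pending'
    // so a record split across read() calls is reassembled; an incomplete
    // tail left at end of stream is simply dropped in this sketch.
    public static List<String> readRecords(InputStream in) throws IOException {
        List<String> records = new ArrayList<>();
        StringBuilder pending = new StringBuilder();
        byte[] buffer = new byte[1024];
        int n;
        while ((n = in.read(buffer)) != -1) {
            pending.append(new String(buffer, 0, n, "US-ASCII"));
            int star;
            while ((star = pending.indexOf("*")) != -1) {
                records.add(pending.substring(0, star));
                pending.delete(0, star + 1);
            }
        }
        return records;
    }
}
```

The same accumulate-then-split pattern applies whatever the record delimiter is; the point is that the parser only ever sees complete records.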