Android InputStreamReader read() reads exactly the given buffer size?

I am reading a file with:
char[] buffer = new char[300];
FileInputStream istream = new FileInputStream(path);
InputStreamReader file = new InputStreamReader(istream);
int size = file.read(buffer);
file.close();
After a few tries, it turns out that file.read(buffer) reads exactly the number of chars allocated for buffer (in this case 300, even though the file has many more characters in it).
Can I rely on read() always reading as much as it can, without generating any exception?
Or is this an undocumented feature?
The read method description says:
Reads characters from this reader and stores them in the character
array buf starting at offset 0. Returns the number of characters
actually read or -1 if the end of the reader has been reached.
There is no mention of the buffer allocation issue.
This is very important, and it is a good thing that it works this way, because it lets you define the buffer size as you want/need; there is no need to guess and no need to code for exceptions. In effect, read(char[] buffer) behaves like read(char[] buffer, 0, buffer.length).

Yes, you can rely on this call, unless an I/O error occurs, which is already mentioned in the API.
If you look at the code of read(char cbuf[]) you'll notice it calls the method public int read (char[] buffer, int offset, int length).
From Android source code:
public int read(char cbuf[]) throws IOException { return read(cbuf, 0, cbuf.length); }
In your implementation, you need to keep calling file.read(buffer) to obtain the remaining characters. The content of buffer then needs to be appended to another buffer that grows as you read, depending on the size of the file you're reading.
You could also allocate that buffer up front with the size of the file, obtained via File.length().
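For illustration, here is a minimal sketch of that read-and-append loop (the helper method name and the use of StringBuilder are my own choices for the example, not code from the question):
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;

// Keep calling read() until it returns -1; append only the chars actually read.
static String readWholeFile(String path) throws IOException {
    char[] buffer = new char[300];
    StringBuilder contents = new StringBuilder();
    InputStreamReader reader = new InputStreamReader(new FileInputStream(path));
    try {
        int count;
        while ((count = reader.read(buffer)) != -1) {
            contents.append(buffer, 0, count);
        }
    } finally {
        reader.close();
    }
    return contents.toString();
}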


Why is InputStream.available() always returning 0?

I am making an Android app and sending an XML message to an IP address. I should get back an XML response, but the InputStream buffer is always empty. I am using the following code:
String sMessage = "<Server><CONNECT><IP>192.168.1.14</IP><Client_ID>123</Client_ID></CONNECT></Server>";
Socket clientSocket = null;
clientSocket = new Socket("192.168.252.148",34543);
PrintWriter pw = new PrintWriter(clientSocket.getOutputStream(),true);
pw.write(sMessage);
InputStream in = clientSocket.getInputStream();
byte[] buffer = new byte[in.available()];
System.out.println("buffer size: "+buffer.length);
pw.close();
in.close();
clientSocket.close();
Any idea why I am not getting bytes in my InputStream? Thanks in advance.
http://docs.oracle.com/javase/6/docs/api/java/io/InputStream.html#available()
The available method for class InputStream always returns 0.
This method should be overridden by subclasses.
Try wrapping the stream in a BufferedInputStream.
BufferedInputStream in = new BufferedInputStream(clientSocket.getInputStream());
I should get back an xml as response but bytes in input stream buffer
Maybe so, but not instantaneously, which is what your code assumes. There are few if any correct uses of available(), and this isn't one of them. Just block in the read.
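For example, a minimal sketch of blocking in the read (reusing clientSocket from the question; the buffer size and variable names are illustrative):
import java.io.ByteArrayOutputStream;
import java.io.InputStream;

// Block in read() until the server actually sends something, and keep reading
// until the peer closes the connection (read() returns -1).
InputStream in = clientSocket.getInputStream();
ByteArrayOutputStream response = new ByteArrayOutputStream();
byte[] buffer = new byte[4096];
int n;
while ((n = in.read(buffer)) != -1) {
    response.write(buffer, 0, n);
}
byte[] responseBytes = response.toByteArray(); // decode as XML/text as needed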
.available() cannot be used for inter-process communication (serial included), since it only checks whether data is already available (in the input buffers) in the current process.
In serial communication, when you send a message and then immediately call available(), you will mostly get 0 because the serial port has not yet replied with any data.
The solution is to use a blocking read() in a separate thread (with interrupt() to end it); a rough sketch of such a reader thread follows.
See also: Thread interrupt not ending blocking call on input stream read
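A rough sketch of that pattern (the class name is made up here; note that closing the stream, not interrupt() alone, is usually what unblocks a pending read()):
import java.io.IOException;
import java.io.InputStream;

// Background thread that blocks in read(); call shutdown() to stop it.
class SerialReaderThread extends Thread {
    private final InputStream in;
    private volatile boolean running = true;

    SerialReaderThread(InputStream in) {
        this.in = in;
    }

    @Override
    public void run() {
        byte[] buffer = new byte[1024];
        try {
            int n;
            while (running && (n = in.read(buffer)) != -1) {
                // hand the n bytes in buffer off to the rest of the app here
            }
        } catch (IOException e) {
            // stream was closed or the read failed; just exit the thread
        }
    }

    void shutdown() {
        running = false;
        interrupt();
        try {
            in.close(); // closing the stream is what unblocks the pending read()
        } catch (IOException ignored) {
        }
    }
}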
On some streams (such as BufferedInputStream, which has an internal buffer), some bytes are read and kept in memory, so you can read them without blocking the program flow. In this case, the available() method tells you how many bytes are kept in that buffer.
BufferedOutputStream out = new BufferedOutputStream(clientSocket.getOutputStream());
BufferedInputStream in = new BufferedInputStream(clientSocket.getInputStream());

Program crashes on trying to open a file

My Android program crashes on this line when the file size is very large. Is there any way I can prevent the program from crashing?
byte[] myByteArray = new byte[(int)mFile.length()];
Additional details:
I am trying to send a file to a server.
Error log:
E/dalvikvm-heap(29811): Out of memory on a 136309996-byte allocation.
You should use a stream when reading the file. Since you've mentioned sending to a server, you should stream that file to the server.
As others have mentioned, you should consider your data size (1GB seems excessive). I haven't tested this, but the basic approach in code would look something like:
// open a stream to the file
FileInputStream fileInputStream = new FileInputStream(filePath);
// open a stream to the server
HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
connection.setDoOutput(true); // we intend to send a request body
DataOutputStream outputStream = new DataOutputStream(connection.getOutputStream());
byte[] buffer = new byte[BUFFER_SIZE]; // pick some buffer size
int bytesRead = 0;
// continually read from the file into the buffer and immediately write that to the output stream
while ((bytesRead = fileInputStream.read(buffer)) != -1) {
    outputStream.write(buffer, 0, bytesRead); // write only the bytes actually read
}
outputStream.flush();
Hope that is clear enough for you to fit to your needs.
Yep. Don't try to read the whole file into memory at once...
If you really need the whole file in memory, you might have more luck allocating memory for each line and storing the lines in a list (you may be able to get a bunch of smaller chunks of memory, but not one big piece); a rough sketch of that approach follows below.
Without knowing the context we can't tell, but normally you would parse the file into data structures rather than just storing the whole file in memory.
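A minimal sketch of that line-by-line approach (the method name is mine; it assumes a plain text file):
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Collect the file line by line; each line is a separate small allocation.
static List<String> readLines(String filePath) throws IOException {
    List<String> lines = new ArrayList<String>();
    BufferedReader reader = new BufferedReader(new FileReader(filePath));
    try {
        String line;
        while ((line = reader.readLine()) != null) {
            lines.add(line);
        }
    } finally {
        reader.close();
    }
    return lines;
}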
In JDK 7 you can use Files.readAllBytes(Path).
Example:
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.Path;
Path path = Paths.get("path/to/file");
byte[] myByteArray = Files.readAllBytes(path);
Don't try reading the complete file into memory. Instead, open a stream and process the file line by line (if it's a text file) or in parts. How that has to be done depends on the problem you are trying to solve.
EDIT: You say you want to upload a file, so please check this question. You don't need to have the complete file in memory.

BufferedInputStream.read() not reliably returning data to my app - data bunches up

My app receives data through a serial port... it typically arrives in small chunks. For example, sometimes it's 40 bytes, sometimes 60 bytes. The chunks of data are separated by a second, or possibly even a minute.
I read that using BufferedInputStream is good for reading chunks of data so that the app doesn't create a lot of CPU overhead by reading data byte by byte.
So that's what I did - just like this example: http://www.roseindia.net/java/example/java/io/ReadFilterFile.shtml
When it works, it works great!
My app gets a complete chunk of data - I was worried that I would receive incomplete chunks, but no, to my amazement it's complete chunks.
However, sometimes it doesn't work so well.
What seems to happen is that a small chunk of data doesn't cause the read() method to complete. Only when a slightly larger chunk comes along later does read() finally return. This is undesirable!
I do not want my app to be denied a chunk of data until another chunk arrives.
Question:
How do I ensure that BufferedInputStream.read() returns shortly after a small chunk of data is received? Is reading byte by byte the only way?
I have it solved - the solution was to use smaller buffers... taking the roseindia.net sample, the following fixes the read() to always return after a chunk of data:
final int BUFFER_SIZE = 128;
byte[] byBuffer = new byte[BUFFER_SIZE];
try {
    FileInputStream fin = new FileInputStream("Filterfile.txt");
    BufferedInputStream bis = new BufferedInputStream(fin, BUFFER_SIZE * 2);
    // Now read the buffered stream.
    while (bis.available() > 0) {
        int iBytesRead = bis.read(byBuffer, 0, BUFFER_SIZE);
        // handle the iBytesRead bytes in byBuffer here
    }
} catch (Exception e) {
    System.err.println("Error reading file: " + e);
}
Maybe the 8K default buffer was too large and the small chunks of data didn't reliably pass some threshold?

Problem with audio

In my Android app I can record audio and save it on the phone/SD card. I checked that it is audible and clear when I play it back on the phone. The size of the audio file created is 5.9 KB (.amr format).
Next I upload the file to the server, which stores the audio in a SQL database. The upload is successful. When the uploaded audio is played back, it is all garbled...
In the database I store the audio in a column with datatype image and length 16.
My question is: why is the audio garbled after upload, and how do I verify that the audio is saved correctly without any noise added?
Code for file upload
DataInputStream inputStream = new DataInputStream(new FileInputStream(FileName));
byte[] responseData = new byte[10000];
int length = 0;
StringBuffer rawResponse = new StringBuffer();
while (-1 != (length = inputStream.read(responseData)))
    rawResponse.append(new String(responseData, 0, length));
String finalstring = rawResponse.toString();
voicedataArray = finalstring.getBytes();
Your problem is very likely due to the use of StringBuffer to buffer the response. A char in Java is a two-byte UTF-16 code unit, not a raw byte. The documentation for String#getBytes() says:
Returns a new byte array containing the characters of this string
encoded using the system's default charset.
So there is no guarantee that the bytes you pass in, once converted to characters and then back to bytes, come out as the same stream you started with.
I think you would need to code your solution using a dynamically expanding byte buffer in place of the StringBuffer.
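For illustration, a minimal sketch using ByteArrayOutputStream as that expanding byte buffer (FileName is taken from the question; the other names are mine):
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.FileInputStream;

// Accumulate the raw bytes directly; no char/String conversion takes place.
DataInputStream inputStream = new DataInputStream(new FileInputStream(FileName));
ByteArrayOutputStream byteBuffer = new ByteArrayOutputStream();
byte[] chunk = new byte[10000];
int length;
try {
    while (-1 != (length = inputStream.read(chunk))) {
        byteBuffer.write(chunk, 0, length);
    }
} finally {
    inputStream.close();
}
byte[] voicedataArray = byteBuffer.toByteArray();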
Also, two notes about the usage of StringBuffer:
1) All accesses to the StringBuffer are synchronized, so you're paying a performance penalty. StringBuilder is a modern-day replacement that doesn't do synchronization under the hood.
2) Each time you append to the StringBuffer:
rawResponse.append(new String(responseData, 0, length));
you are allocating a new string and throwing it away. That's really abusive to the garbage collector. StringBuffer actually has a form of append() that will directly take a char array, so there is no need to use an intermediate String. (But you probably don't want to use a StringBuffer in the first place).
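As a small, hypothetical illustration of that char-array overload (shown with StringBuilder, which has the same append(char[], int, int) method; reader here is an assumed java.io.Reader, not something from the question):
// Hypothetical text-reading loop that appends the char[] directly,
// avoiding an intermediate String allocation on every pass.
char[] chars = new char[1024];
StringBuilder text = new StringBuilder();
int n;
while ((n = reader.read(chars)) != -1) {
    text.append(chars, 0, n);
}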

Android file copy

I am finding that reading one line at a time from a text file on the SD card is rather slow. I imagine that it might be quicker if the file is in internal memory, so I want to copy files from the SD card to internal storage.
The file copy examples I can find on the web seem to involve copying one byte at a time from an InputStream to an OutputStream or from a FileReader to a FileWriter. Is this really the quickest and most efficient method?
If you are pulling the file in for use in your application, what I suggest is to read in the data and then wrap the in-memory data you have collected in some kind of reader (a BufferedReader, perhaps) so that you can read the lines from there.
Here is an example of what I typically do:
// Assumption: I already have the file object I want to read
// Note: I'm not doing any error handling.
InputStream input = new FileInputStream(file);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] buffer = new byte[1024];
int bytesRead = 0;
while ((bytesRead = input.read(buffer)) > 0) {
    baos.write(buffer, 0, bytesRead);
}
input.close();
StringReader stringReader = new StringReader(new String(baos.toByteArray()));
BufferedReader bufferedReader = new BufferedReader(stringReader);
String line;
while ((line = bufferedReader.readLine()) != null) {
    // TODO: Handle each line appropriately or something
    Log.d("Reading Data Example", line);
}
One of the truisms of CS that only becomes more true with time as CPUs get faster is: I/O is slow.
If you want speed, generally your best bet is to do as few I/Os as possible. Ideally, find out how big the file is, allocate that much memory, and then read the entire thing in one big I/O. Then you can just access the data from program memory. If you might not have enough RAM for every conceivable file size, you may have to do a bit more work, but this is what you should strive for.
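A minimal sketch of that single-read approach (assuming the file comfortably fits in memory; the method name is mine):
import java.io.DataInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

// Size the buffer from the file length and read the whole file in one go.
static byte[] readAllAtOnce(File file) throws IOException {
    byte[] data = new byte[(int) file.length()];
    DataInputStream in = new DataInputStream(new FileInputStream(file));
    try {
        in.readFully(data); // readFully() keeps reading until the array is full
    } finally {
        in.close();
    }
    return data;
}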
