Problem with audio - Android

In my Android app I can record audio and save it on the phone/SD card. I have checked that it is audible and clear when I play it back on the phone. The audio file it creates is 5.9 KB (.amr format).
Next I upload the file to the server, which stores the audio in a SQL database. The upload is successful, but when the uploaded audio is played back it is all garbled...
In the database I store the audio in a column of datatype image, with length 16.
My question is: why is the audio garbled after upload? How do I verify that the audio is saved correctly, without any noise added?
Code for the file upload:
DataInputStream inputStream = new DataInputStream(new FileInputStream(FileName));
byte[] responseData = new byte[10000];
int length = 0;
StringBuffer rawResponse = new StringBuffer();
while (-1 != (length = inputStream.read(responseData))) {
    rawResponse.append(new String(responseData, 0, length)); // bytes -> chars
}
String finalstring = rawResponse.toString();
voicedataArray = finalstring.getBytes(); // chars -> bytes

Your problem is very likely the use of StringBuffer to buffer the response. A char in Java is a two-byte UTF-16 code unit, not a raw byte. The documentation for String#getBytes() says:
Returns a new byte array containing the characters of this string
encoded using the system's default charset.
So there is no guarantee that the bytes you pass in, once converted to characters and back to bytes, are the same stream you started with.
I think you would need to code your solution using a dynamically expanding byte buffer in place of the StringBuffer.
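For instance, a minimal sketch of that approach with ByteArrayOutputStream as the dynamically growing byte buffer (untested, reusing the names from your code):
import java.io.ByteArrayOutputStream;
import java.io.FileInputStream;
import java.io.InputStream;
InputStream in = new FileInputStream(FileName);
ByteArrayOutputStream bytes = new ByteArrayOutputStream();
byte[] chunk = new byte[10000];
int length;
while ((length = in.read(chunk)) != -1) {
    bytes.write(chunk, 0, length); // copy raw bytes; no charset conversion anywhere
}
in.close();
voicedataArray = bytes.toByteArray(); // byte-for-byte identical to the file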
Also, two notes about the usage of StringBuffer:
1) Every access to a StringBuffer is synchronized, so you pay a performance penalty. StringBuilder is the modern replacement that does no synchronization under the hood.
2) Each time you append to the StringBuffer:
rawResponse.append(new String(responseData, 0, length));
you allocate a new String and immediately throw it away. That is really hard on the garbage collector. StringBuffer does have a form of append() that takes a char array directly, so there is no need for an intermediate String. (But you probably don't want to use a StringBuffer in the first place.)
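For completeness, a hypothetical sketch of that char-array overload, using a Reader and a StringBuilder (only appropriate for genuine text, never for audio bytes; the file name is made up):
import java.io.FileInputStream;
import java.io.InputStreamReader;
import java.io.Reader;
Reader reader = new InputStreamReader(new FileInputStream("notes.txt"));
StringBuilder sb = new StringBuilder(); // unsynchronized replacement for StringBuffer
char[] chars = new char[4096];
int n;
while ((n = reader.read(chars)) != -1) {
    sb.append(chars, 0, n); // appends the char array directly; no intermediate String
}
reader.close();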

Related

Program crashes on trying to open a file

My Android program crashes on this line when the file size is very large. Is there any way I can prevent the crash?
byte[] myByteArray = new byte[(int)mFile.length()];
Additional details:
I am trying to send a file to a server.
Error log:
E/dalvikvm-heap(29811): Out of memory on a 136309996-byte allocation.
You should use a stream when reading the file. Since you mention sending it to a server, you should stream the file to the server as well.
As others have mentioned, consider your data size (an allocation of over 130 MB, as in your log, is excessive for a single buffer). I haven't tested this, but the basic approach in code would look something like:
// open a stream to the file
FileInputStream fileInputStream = new FileInputStream(filePath);
// open a stream to the server
HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
connection.setDoOutput(true); // we intend to write a request body
DataOutputStream outputStream = new DataOutputStream(connection.getOutputStream());
byte[] buffer = new byte[BUFFER_SIZE]; // pick some buffer size
int bytesRead;
// repeatedly read from the file into the buffer and immediately write to the output stream
while ((bytesRead = fileInputStream.read(buffer)) != -1) {
    outputStream.write(buffer, 0, bytesRead); // write only the bytes actually read
}
Hope that is clear enough for you to adapt to your needs.
Yep. Don't try to read the whole file into memory at once...
If you really need the whole file in memory, you might have more luck allocating memory for each line and storing the lines in a list, as sketched below (you may be able to get many smaller chunks of memory where one big contiguous piece is unavailable).
Without knowing the context we can't tell, but normally you would parse the file into data structures rather than keep the whole raw file in memory.
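A rough sketch of that lines-into-a-list idea (the path is illustrative; note this still holds all of the text in memory, just in smaller pieces):
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;
List<String> lines = new ArrayList<String>();
BufferedReader reader = new BufferedReader(new FileReader("/sdcard/bigfile.txt"));
String line;
while ((line = reader.readLine()) != null) {
    lines.add(line); // many small allocations instead of one huge array
}
reader.close();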
In JDK 7 you can use Files.readAllBytes(Path).
Example:
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

Path path = Paths.get("path/to/file");
byte[] myByteArray = Files.readAllBytes(path);
Don't try reading the complete file into memory. Instead, open a stream and process the file line by line (if it's a text file) or in parts. How that has to be done depends on the problem you are trying to solve.
EDIT: You say you want to upload a file, so please check this question. You don't need to have the complete file in memory.

How do I go about setting TextView to display UTF-8 when the String is not an embedded Resource?

I'm encountering an odd situation whereby strings that I load from my resource XML file that have Spanish characters in them display correctly in my TextViews, but strings that I fetch from a JSON file loaded via HTTP at runtime display the missing-character boxes.
ESPAÑOL, for example, works fine when embedded in my XML strings, but when pulled from my JSON it is rendered as ESPAÃ[]OL, so the Ñ is transformed into a Ã plus a missing character!
I'm not sure at what point I need to intercept these strings and set the correct encoding on them. The JSON text file itself is generated on the server via Node, so I'm not entirely sure if that's the point at which I should be encoding it, or if I should be encoding the file reader on the Android side, or perhaps setting the TextView itself to some special encoding type (I'm unaware that this is an option, just sort of throwing my hands in the air, really).
[EDIT]
As per ianhanniballake's suggestion, I am logging and can see that the mangled characters show up in the log as well. However, when I look at the JSON file with a text viewer on the Android file system (it's sitting on the SD card) it appears correct.
So, it turned out that the text file was, indeed, encoded correctly, and the issue was that I wasn't setting UTF-8 as the encoding on the InputStreamReader wrapping my FileInputStream...
The solution is to read the file like this:
static String readInput() {
    StringBuffer buffer = new StringBuffer();
    try {
        FileInputStream fis = new FileInputStream("myfile.json");
        InputStreamReader isr = new InputStreamReader(fis, "UTF-8"); // the crucial part
        Reader in = new BufferedReader(isr);
        int ch;
        while ((ch = in.read()) > -1) {
            buffer.append((char) ch);
        }
        in.close();
        return buffer.toString();
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    }
}

Android InputStreamReader reads exactly the given buffer size?

I am reading a file with:
char[] buffer = new char[300];
FileInputStream istream = new FileInputStream(path);
InputStreamReader file = new InputStreamReader(istream);
int size = file.read(buffer);
file.close();
After a few tries, it turns out that file.read(buffer) reads exactly the number of chars allocated for buffer (in this case 300, even though the file has many more characters in it).
Can I rely on read() always reading as much as it can, without throwing an exception?
Or is this an undocumented feature?
The read() method's documentation says:
Reads characters from this reader and stores them in the character
array buf starting at offset 0. Returns the number of characters
actually read or -1 if the end of the reader has been reached.
There is no mention of any issue with the buffer's size.
This matters, and it is good that it works this way: you can size the buffer however you want or need, with no guessing and no special-case handling, because read() will never store more than buffer.length characters. (It may return fewer before the end of the stream, so loop until it returns -1 if you need everything.) Effectively, read(char[] buffer) works as read(buffer, 0, buffer.length).
Yes, you can rely on this call, unless an I/O error occurs, which is already mentioned in the API.
If you look at the code of read(char[] cbuf) you'll notice it calls the method public int read(char[] buffer, int offset, int length).
From the Android source code:
public int read(char[] cbuf) throws IOException {
    return read(cbuf, 0, cbuf.length);
}
In your implementation, you need to keep calling file.read(buffer) to obtain the remaining characters. The contents of buffer need to be appended to another buffer that grows with the size of the file you're reading, as sketched below.
You could also allocate that buffer up front using the size of the file, from File.length().
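A rough sketch of that loop, using CharArrayWriter as the growing buffer (untested; note that File.length() counts bytes, which only approximates the character count for multibyte encodings):
import java.io.CharArrayWriter;
import java.io.FileInputStream;
import java.io.InputStreamReader;
InputStreamReader file = new InputStreamReader(new FileInputStream(path));
CharArrayWriter contents = new CharArrayWriter();
char[] buffer = new char[300];
int size;
while ((size = file.read(buffer)) != -1) {
    contents.write(buffer, 0, size); // append only what was actually read
}
file.close();
char[] allChars = contents.toCharArray(); // the complete file contents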

BufferedInputStream.read() not reliably returning data to my app - data bunches up

My app receives data through a serial port... typically in small chunks. For example, sometimes it's 40 bytes, sometimes 60. The chunks of data are separated by a second, or possibly even a minute.
I read that using BufferedInputStream is good for reading chunks of data, so that the app doesn't create a lot of CPU overhead by reading data byte by byte.
So that's what I did - just like this example: http://www.roseindia.net/java/example/java/io/ReadFilterFile.shtml
When it works, it works great!
My app gets a complete chunk of data - I was worried that I would receive incomplete chunks, but no, to my amazement they arrive complete.
However, sometimes it doesn't work so well.
What seems to happen is that a small chunk of data doesn't cause the read() call to return. Only when a somewhat larger chunk comes along later does the read() finally return. This is undesirable!
I do not want my app to be denied a chunk of data until another chunk arrives.
Question:
How do I ensure that BufferedInputStream.read() returns shortly after a small chunk of data is received? Is reading byte by byte the only way?
I have it solved - the solution was to use smaller buffers. Taking the roseindia.net sample, the following change makes read() always return after a chunk of data:
final int BUFFER_SIZE = 128;
byte[] byBuffer = new byte[BUFFER_SIZE];
try {
    FileInputStream fin = new FileInputStream("Filterfile.txt");
    BufferedInputStream bis = new BufferedInputStream(fin, BUFFER_SIZE * 2);
    // Now read the buffered stream.
    while (bis.available() > 0) {
        int iBytesRead = bis.read(byBuffer, 0, BUFFER_SIZE);
        // process the iBytesRead bytes in byBuffer here
    }
} catch (Exception e) {
    System.err.println("Error reading file: " + e);
}
Maybe the 8 KB default buffer was too large and the small chunks of data didn't reliably pass some threshold?

Android file copy

I am finding that reading one line at a time from a text file on the SD card is rather slow. I imagine it might be quicker if the file were in internal memory, so I want to copy files from the SD card to internal storage.
The file-copy examples I can find on the web seem to involve copying one byte at a time from an InputStream to an OutputStream, or from a FileReader to a FileWriter. Is this really the quickest and most efficient method?
If you are pulling the file in for use in your application, what I suggest is to read in the data and then wrap the in-memory data in some kind of reader (a BufferedReader, perhaps) so that you can read the lines from there.
Here is an example of what I typically do:
// Assumption: I already have the file object I want to read
// Note: I'm not doing any error handling.
InputStream input = new FileInputStream(file);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] buffer = new byte[1024];
int bytesRead = 0;
while ((bytesRead = input.read(buffer)) > 0) {
    baos.write(buffer, 0, bytesRead);
}
StringReader stringReader = new StringReader(new String(baos.toByteArray()));
BufferedReader bufferedReader = new BufferedReader(stringReader);
String line;
while ((line = bufferedReader.readLine()) != null) {
    // TODO: Handle each line appropriately or something
    Log.d("Reading Data Example", line);
}
One of the truisms of CS that only becomes more true with time, as CPUs get faster, is: I/O is slow.
If you want speed, your best bet is generally to do as few I/Os as possible. Ideally, find out how big the file is, allocate that much memory, and read the entire thing in one big I/O, as sketched below. Then you can just access the data from program memory. If you might not have enough RAM for every conceivable file size you may have to do a bit more work, but this is what you should strive for.
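A minimal sketch of that single-read idea using DataInputStream.readFully (error handling omitted; assumes the file fits in memory and its length fits in an int):
import java.io.DataInputStream;
import java.io.File;
import java.io.FileInputStream;
File f = new File(path); // path is illustrative
byte[] data = new byte[(int) f.length()]; // one buffer sized to the whole file
DataInputStream in = new DataInputStream(new FileInputStream(f));
in.readFully(data); // reads until the buffer is completely filled
in.close();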
