The problem is simply that I can't send more than 1000 bytes (correctly) from Windows (via Qt) to Android. Here is the full information:
Code in Qt Creator (Windows side):
QFile inputFile(fileInfo->absoluteFilePath());
QByteArray read;
inputFile.open(QIODevice::ReadOnly);
int size = 0;
while (1) {
    read.clear();
    read = inputFile.read(1000);
    qDebug() << "Read : " << read.size();
    size += read.size();
    if (read.size() == 0) {
        break;
    }
    QByteArray toWrite(read);
    newSocket->write(toWrite);
    newSocket->flush();
    newSocket->waitForBytesWritten();
    this->sleep(1); // QThread::sleep() takes seconds, so this pauses one full second per chunk
}
inputFile.close();
qDebug() << "Transfer Done! " << size << " bytes";
}
Java code (Android side):
DataOutputStream dos;
DataInputStream dis;
Socket s;

s = new Socket("192.168.137.1", 8080);
dos = new DataOutputStream(s.getOutputStream());
dis = new DataInputStream(s.getInputStream());

while (true) {
    if (dis.available() > 0) {
        int chunkSize = 1000;
        byte[] b = new byte[chunkSize];
        dis.read(b);
        writeToExternalStoragePublic("test.png", b);
    }
}
The code works pretty well. It even works when I set the chunk size on both sides to 1,000,000, and the data is written on Android, but a lot of the bytes are empty.
Check these screenshots from the Hex Workshop data visualizer, using 1000 as the chunk size on the left and 2000 on the right.
Here are the files:
1000ChunkSize
2000ChunkSize
The problem is that a chunk size of 1000 is too slow; even small files take a long time, although the transfer does work. What do you suggest?
A possible problem was the timing on the server side, even though it already calls:
waitForBytesWritten();
I also added
this->sleep(1);
but it didn't help.
Update: The problem must be on the Android side, not Qt.
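For what it's worth, here is a minimal sketch (not code from the post) of a receive loop that avoids the empty bytes: InputStream.read(byte[]) may return fewer bytes than the buffer length, and available() only reports what has already been buffered, so filling a fixed-size array and writing all of it pads the file with zeros. The output path below is illustrative, and the loop assumes the Qt side closes the socket when the file has been sent; otherwise the receiver needs some other signal, such as a length prefix, to know where the file ends.
InputStream is = s.getInputStream();
FileOutputStream fos = new FileOutputStream(outputPath); // outputPath: hypothetical destination, e.g. for test.png
byte[] b = new byte[1000];
int n;
while ((n = is.read(b)) != -1) { // read() blocks until some data arrives; -1 means the sender closed the socket
    fos.write(b, 0, n);          // write only the n bytes actually received
}
fos.close();
is.close();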
Related
I am sending files between two devices, so I established a socket connection. Right now I am just trying to send one file, but in the future I want to send multiple files (selected by the user from a GridView).
The problem is that when I send one file, on the server side (which receives the file) socket.getInputStream().read(buffer) does not detect the end of the file; it just waits for "more" data to be sent.
After searching on this issue, I found some topics that gave me a few options, but I am still not satisfied with them because I don't know whether those options would be efficient for sending multiple files. This is an example: How to identify end of InputStream in java
I could close the socket or the stream objects after sending a file, but if I want to send a lot of files, it wouldn't be efficient to keep closing and reopening the sockets.
Code on the receiver:
File apkReceived = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS) + "/testeReceiveServerComm.apk");
byte[] buffer = new byte[8192];
FileOutputStream fos = new FileOutputStream(apkReceived);
int count = 0;
int total = 0; // just to keep track of how much has been received
while ((count = in.read(buffer)) != -1) {
    fos.write(buffer, 0, count);
    total += count;
    System.out.println("Server Comm receive thread - already received this amount : " + total);
}
Code on the client (sender):
File apkToSend = new File(filePath);
byte[] buffer = new byte[8192];
BufferedInputStream bis = new BufferedInputStream(new FileInputStream(apkToSend));
int count;
int total = 0;
while ((count = bis.read(buffer)) != -1) {
    out.write(buffer, 0, count);
    total += count;
    out.reset();
    System.out.println("send thread - already sent this amount : " + total);
}
out.flush();
bis.close();
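One common way to handle this without closing the socket after every file (a sketch, not code from the post) is to prefix each file with its size, so the receiver knows exactly how many bytes belong to the current file and can keep the connection open for the next one. The DataOutputStream/DataInputStream below are assumed to wrap the raw socket streams named 'out' and 'in' in the snippets above.
// Sender sketch: announce the file length, then stream exactly that many bytes.
DataOutputStream dout = new DataOutputStream(out);   // assumes 'out' is the socket's output stream
File apkToSend = new File(filePath);
dout.writeLong(apkToSend.length());                  // length prefix for this file
BufferedInputStream bis = new BufferedInputStream(new FileInputStream(apkToSend));
byte[] sendBuf = new byte[8192];
int count;
while ((count = bis.read(sendBuf)) != -1) {
    dout.write(sendBuf, 0, count);
}
dout.flush();
bis.close();

// Receiver sketch: read the announced length, then stop after exactly that many bytes.
DataInputStream din = new DataInputStream(in);       // assumes 'in' is the socket's input stream
long remaining = din.readLong();
FileOutputStream fos = new FileOutputStream(apkReceived);
byte[] recvBuf = new byte[8192];
while (remaining > 0) {
    int n = din.read(recvBuf, 0, (int) Math.min(recvBuf.length, remaining));
    if (n == -1) break;                              // connection closed unexpectedly
    fos.write(recvBuf, 0, n);
    remaining -= n;
}
fos.close();
Repeating the same prefix-then-payload pattern for each file lets you send any number of files over a single connection.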
I followed this tutorial to get started using the Android USB host API with an Arduino board. I am using the Arduino Uno. I am able to transmit data and turn on an LED on the Arduino, and I can receive feedback from it. I am trying to write to my Android device over the USB connection from the Arduino like so:
Serial.print("Test");
I am receiving the Arduino data on the Android side like this:
byte[] buffer = new byte[10];
int bytes;
//try-catch statements omitted for simplicity
bytes = mUsbConnection.bulkTransfer(mUsbEndpointIn, buffer, buffer.length, 0);
Every once in a while the data will be intact, but more often than not what I receive from the Arduino is a garbled mix of the letters from my original message (t, e, s, and t). Many times only 1 or 2 letters are displayed. If anyone could point me in the right direction or share a similar experience, I would appreciate it. Thanks.
Edit
When I print the data to Logcat, there are multiple copies of it. For example, if I receive "ste" from the Arduino, it will be printed 2-5 times in Logcat.
I think I found something that works at least temporarily:
public void run() {
    int i = 0;
    byte[] buffer = new byte[4];
    byte[] finalBuffer = new byte[8];
    byte[] sendBuffer = new byte[8];
    int bytes = 0;
    while (true) {
        try {
            bytes = mUsbConnection.bulkTransfer(mUsbEndpointIn, buffer, buffer.length, 0);
            if (bytes == EXIT_CMD) {
                return;
            }
            if (bytes > 0) {
                byte[] temporaryBuffer = new byte[bytes];
                System.arraycopy(buffer, 0, temporaryBuffer, 0, bytes);
                System.arraycopy(temporaryBuffer, 0, finalBuffer, i, bytes);
                i += bytes;
                java.util.Arrays.fill(buffer, (byte) 0);
            }
            // Dollar sign terminates the string to indicate end of line
            if (finalBuffer[7] == 36) {
                i = 0;
                System.arraycopy(finalBuffer, 0, sendBuffer, 0, sendBuffer.length);
                messageHandler.obtainMessage(UsbHostTestActivity.ARDUINO_MESSAGE,
                        sendBuffer.length, -1, sendBuffer).sendToTarget();
                java.util.Arrays.fill(finalBuffer, (byte) 0);
            }
        } catch (Exception e) {
            // catch body omitted in the original snippet
            return;
        }
    }
}
I had to send strings that were exactly 8 characters long from the Arduino, and they had to end with a dollar sign ($) to indicate the end of the line, but the data passed to my message handler always seemed to be correct. It's not the most robust solution, but maybe someone can modify it to make it better or take another approach? Please let me know!
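A slightly more flexible variant of the same idea (a sketch only, not tested against the original hardware) is to treat '$' purely as a message delimiter and accumulate incoming bytes until it appears; that removes the fixed 8-character requirement while reusing the same handler call from the snippet above.
// Accumulate bulkTransfer output until the '$' delimiter, then hand off one complete message.
java.io.ByteArrayOutputStream message = new java.io.ByteArrayOutputStream();
byte[] buffer = new byte[64];
while (true) {
    int bytes = mUsbConnection.bulkTransfer(mUsbEndpointIn, buffer, buffer.length, 0);
    if (bytes <= 0) continue;                 // nothing received this round
    for (int k = 0; k < bytes; k++) {
        if (buffer[k] == '$') {               // end of one message
            byte[] complete = message.toByteArray();
            messageHandler.obtainMessage(UsbHostTestActivity.ARDUINO_MESSAGE,
                    complete.length, -1, complete).sendToTarget();
            message.reset();
        } else {
            message.write(buffer[k]);         // still inside the current message
        }
    }
}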
I'm trying to write a very large file to another very large file. I'm receiving this error on the FileChannel writing line and I'm unsure why. I thought it was because I was going outside the limits of the data type long, but long can go up to 9,223,372,036,854,775,807 and I'm only going up to 5,372,896,745 at most. Any ideas why this is occurring? Is there some limit that MappedByteBuffer has? This doesn't occur for smaller files, and I haven't run into any issues using the same code in a Java desktop application. (It only happens on Android.)
File f1 = new File(filename1);
FileChannel fic, foc;
long fsize;
MappedByteBuffer mBUf;

FileOutputStream out = new FileOutputStream(f1, true);
foc = out.getChannel();
File f2 = new File(filename2);
FileInputStream in = new FileInputStream(f2);
fic = in.getChannel();
fsize = fic.size();

for (long b = 0; b < fsize; b += 65536)
{
    if (fsize - b < Resource.MEMORY_ALLOC_SIZE)
        mBUf = fic.map(FileChannel.MapMode.READ_ONLY, b, fsize - b);
    else
        mBUf = fic.map(FileChannel.MapMode.READ_ONLY, b, Resource.MEMORY_ALLOC_SIZE);
    foc.write(mBUf); // ERROR HERE!
}

fic.close();
in.close();
foc.close();
out.close();
Any ideas/feedback is appreciated!
Is there some limit that MappedByteBuffer has?
Of course there is. It is limited by the available virtual memory for a start, and after that by the virtual address space.
You should be using transferTo() for this task rather than MappedByteBuffers, as there is no agreed-upon way of disposing of the virtual address space occupied by the latter.
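For reference, a transferTo() version of the copy might look like the sketch below (it reuses filename1/filename2 from the question; transferTo() may transfer fewer bytes than requested, so it is called in a loop).
// Copy one channel to another without mapping the file into memory.
FileChannel fic = new FileInputStream(filename2).getChannel();
FileChannel foc = new FileOutputStream(filename1, true).getChannel();
long fsize = fic.size();
long position = 0;
while (position < fsize) {
    // transferTo() returns how many bytes were actually transferred by this call
    position += fic.transferTo(position, fsize - position, foc);
}
fic.close();
foc.close();
Where the platform supports it, this keeps the copy inside the kernel and avoids mapping large regions of the file into the process's address space.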
Unfortunately a long does not go that high on a 32-bit system (which I believe Android is, since it doesn't have over 4 GB of RAM). Therefore the maximum value of an unsigned long on Android is 4,294,967,295, which means you are exceeding its limit.
I am having some issues with bytes being dropped over a Bluetooth connection between an Android device (Gingerbread 2.3.1) and a PC. I receive the data into a 2-byte buffer. The values are streamed from the PC over a few minutes (they represent a waveform). Here are just a few snippets of code so you can get the idea. My code is based on the Android Bluetooth chat sample code.
BluetoothSocket socket;
...
mmInStream = socket.getInputStream();
...
byte[] buffer = new byte[2];
...
bytes = mmInStream.read(buffer);
Has anyone had issues with this kind of thing? The dropped bytes seem to happen at random times, while at other times the values received are as expected. I am using a 2-byte buffer because the values I am receiving are 16-bit signed integers. On the PC side I am using RealTerm to send the binary data files.
Is it possible that my buffer is too small and that is causing the dropped bytes?
Thanks
Following up on your answer: you could just use a counter to remember how many bytes have already been read, compare it to the number wanted, and use it as the index at which to write the next byte(s). See a C# version at http://www.yoda.arachsys.com/csharp/readbinary.html
public static void ReadWholeArray(Stream stream, byte[] data)
{
    int offset = 0;
    int remaining = data.Length;
    while (remaining > 0)
    {
        int read = stream.Read(data, offset, remaining);
        if (read <= 0)
            throw new EndOfStreamException(
                String.Format("End of stream reached with {0} bytes left to read", remaining));
        remaining -= read;
        offset += read;
    }
}
I have found what the issue is. I want to thank alanjmcf for pointing me in the right direction.
I wasn't checking my bytes variable to see how many bytes were returned by mmInStream.read(buffer); I was simply expecting every read to return 2 bytes. I solved the issue with the following code, run after getting the buffer back from the InputStream:
// In the case where the buffer comes back with only 1 byte
if (lagging == true) {
    if (bytes == 1) {
        lagging = false;
        newBuf = new byte[] {laggingBuf, buffer[0]};
        ringBuffer.store(newBuf);
    } else if (bytes == 2) {
        newBuf = new byte[] {laggingBuf, buffer[0]};
        laggingBuf = buffer[1];
        ringBuffer.store(newBuf);
    }
} else if (lagging == false) {
    if (bytes == 2) {
        newBuf = buffer.clone();
        ringBuffer.store(newBuf);
    } else if (bytes == 1) {
        lagging = true;
        laggingBuf = buffer[0];
    }
}
This fixed my problem. Any suggestions on a better methodology?
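One simpler alternative (a sketch, assuming the PC always sends whole 2-byte samples and the stream only ends when the connection closes) is to let DataInputStream block until both bytes of a sample have arrived:
// Wrap the Bluetooth input stream and read exactly 2 bytes per sample.
DataInputStream din = new DataInputStream(mmInStream);
byte[] sample = new byte[2];
while (true) {
    din.readFully(sample);            // blocks until exactly 2 bytes have been read; throws EOFException at end of stream
    ringBuffer.store(sample.clone()); // clone so the ring buffer does not keep a reference to the reused array
}
If you later convert each pair to a short, note that DataInputStream.readShort() assumes big-endian byte order, so the byte order used on the PC side matters.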
I have encryption and decryption code that I use to encrypt and decrypt video files (MP4). I'm trying to speed up the decryption process, as the encryption side is not that relevant for my case. This is the code that I have for the decryption process:
private static void decryptFile() throws IOException, ShortBufferException, IllegalBlockSizeException, BadPaddingException
{
    int blockSize = cipher.getBlockSize();
    int outputSize = cipher.getOutputSize(blockSize);
    System.out.println("outputsize: " + outputSize);
    byte[] inBytes = new byte[blockSize];
    byte[] outBytes = new byte[outputSize];
    in = new FileInputStream(inputFile);
    out = new FileOutputStream(outputFile);
    BufferedInputStream inStream = new BufferedInputStream(in);
    int inLength = 0;
    boolean more = true;
    while (more)
    {
        inLength = inStream.read(inBytes);
        if (inLength == blockSize)
        {
            int outLength = cipher.update(inBytes, 0, blockSize, outBytes);
            out.write(outBytes, 0, outLength);
        }
        else more = false;
    }
    if (inLength > 0)
        outBytes = cipher.doFinal(inBytes, 0, inLength);
    else
        outBytes = cipher.doFinal();
    out.write(outBytes);
}
My question is how to speed up the decryption process in this code. I've tried decrypting a 10 MB MP4 file and it decrypts in 6-7 seconds, but I'm aiming for under 1 second. I would also like to know whether my writing to the FileOutputStream out is actually what slows the process down, rather than the decryption itself. Any suggestions on how to go about speeding things up here?
I'm using AES for encryption/decryption.
Until I find a solution, I will be using a ProgressDialog that tells the user to wait until the video has been decrypted (obviously, I'm not going to use the word "decrypted").
Why are you decrypting data only in blockSize increments? You do not show what type of object cipher is, but I am guessing it is a javax.crypto.Cipher instance. It can handle update() calls over arrays of arbitrary length, and you will have much less overhead if you use longer arrays. You should process data in blocks of, say, 8192 bytes (that's the traditional length for a buffer, and it interacts reasonably well with CPU caches).
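To make that concrete, a buffered version of the loop might look like this (a sketch; it assumes cipher is an initialized javax.crypto.Cipher and reuses inStream and out from the question's method):
// Decrypt in 8192-byte reads instead of one cipher block at a time.
byte[] inBytes = new byte[8192];
byte[] outBytes = new byte[cipher.getOutputSize(inBytes.length)];
int inLength;
while ((inLength = inStream.read(inBytes)) != -1) {
    int outLength = cipher.update(inBytes, 0, inLength, outBytes); // decrypt whatever was read
    out.write(outBytes, 0, outLength);
}
out.write(cipher.doFinal()); // flush any remaining buffered data and padding
Wrapping the input in a javax.crypto.CipherInputStream and copying with a large buffer achieves the same effect with slightly less code.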
bytebiscuit, your question gave me the solution I had been looking for over the past 6 days. I just modified your code a little bit, and my 52 MB video file now decrypts in just 4 seconds. The previous decryption technique, which used different logic (not yours), took 45 seconds. That's a massive difference: 45 seconds down to 4 seconds. Wherever I made a modification I have added a //modified comment. I am sure that if your video is a 10 MB file, it will decrypt in 1 second. Try applying this; it should work.
private static void decryptFile() throws IOException, ShortBufferException, IllegalBlockSizeException, BadPaddingException
{
    int blockSize = cipher.getBlockSize();
    int outputSize = cipher.getOutputSize(blockSize);
    System.out.println("outputsize: " + outputSize);
    byte[] inBytes = new byte[blockSize * 1024]; //modified
    byte[] outBytes = new byte[outputSize * 1024]; //modified
    in = new FileInputStream(inputFile);
    out = new FileOutputStream(outputFile);
    BufferedInputStream inStream = new BufferedInputStream(in);
    int inLength = 0;
    boolean more = true;
    while (more)
    {
        inLength = inStream.read(inBytes);
        if (inLength / 1024 == blockSize) //modified
        {
            int outLength = cipher.update(inBytes, 0, blockSize * 1024, outBytes); //modified
            out.write(outBytes, 0, outLength);
        }
        else more = false;
    }
    if (inLength > 0)
        outBytes = cipher.doFinal(inBytes, 0, inLength);
    else
        outBytes = cipher.doFinal();
    out.write(outBytes);
}
I suggest you use the profiling tool provided in the Android SDK. It will tell you where you spend the most time (i.e., file writing or decoding).
See http://developer.android.com/guide/developing/debugging/debugging-tracing.html
This works on the emulator as well as on an actual device.
Consider using the NDK. On devices before Froyo (and even Froyo itself), it would be really slow due to the lack of JIT (or a very simple one in Froyo). Even with the JIT, native architecture-optimized crypto code will always outrun Dalvik.
See also this question.
As an aside, if you're using AES directly, you're probably doing something wrong. If this is part of an effort to do DRM, make sure you realize the full extent of the fact that decompiling an Android app is trivial. Your key will not be secure, which by definition defeats the encryption.
Instead of spending effort improving an inadequate architecture, you should consider a streaming solution: it has the great advantage of spreading out the computation time for the decryption so that it is no longer noticeable. I mean: do not produce another file from your video source, but rather a stream, served by a local HTTP server. Unfortunately there is no such component in the SDK; you have to write your own implementation or search for an existing one.
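As a very rough illustration of that idea (a sketch only, with illustrative names such as plainLength, inputFile, and cipher; it ignores the player's request headers and HTTP range requests, which a real implementation would need to handle), a loopback server could decrypt on the fly and hand the bytes straight to the media player:
// Hypothetical local streaming server: decrypt while serving, no temporary decrypted file.
ServerSocket server = new ServerSocket(0, 1, InetAddress.getByName("127.0.0.1"));
int port = server.getLocalPort();                 // point the player at "http://127.0.0.1:" + port + "/"
Socket client = server.accept();                  // the player connects; its request is ignored in this sketch
OutputStream os = client.getOutputStream();
os.write(("HTTP/1.1 200 OK\r\n"
        + "Content-Type: video/mp4\r\n"
        + "Content-Length: " + plainLength + "\r\n\r\n").getBytes()); // plainLength: size of the decrypted video
CipherInputStream cis = new CipherInputStream(new FileInputStream(inputFile), cipher); // cipher initialized for decryption
byte[] buf = new byte[8192];
int n;
while ((n = cis.read(buf)) != -1) {
    os.write(buf, 0, n);                          // decrypted bytes go straight to the player
}
cis.close();
os.close();
client.close();
server.close();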