Working on a project where an Android client communicates with a .Net server via sockets.
It can pass text messages without issue.
It now needs to be expanded to pass a JPEG image.
The server side code:
Dim fs As FileStream = New FileStream(imagePath, FileMode.Open)
Dim br As BinaryReader = New BinaryReader(fs)
sendBytes = br.ReadBytes(CInt(fs.Length)) ' fs.Length is a Long; ReadBytes expects an Integer count
logger.Debug("sending " & sendBytes.Length & " bytes")
clientStream.Write(sendBytes, 0, sendBytes.Length)
clientStream.Flush()
clientStream.Close()
The Android client code:
message send / receive
socket = new Socket(dstAddress, dstPort);
DataOutputStream writer = new DataOutputStream(socket.getOutputStream());
BufferedReader reader = new BufferedReader(new InputStreamReader(socket.getInputStream()));
byte[] outputBytes = requestString.getBytes();
writer.write(outputBytes);
Log.d(method, "Message sent: " + requestString);
while ((responseString = reader.readLine()) != null) {
response += responseString + "\n";
}
reader.close();
writer.close();
socket.close();
then trying to reconstruct the image from the response:
byte[] imageBytes = response.getBytes();
Log.d(method, "imageBytes.length: " + imageBytes.length);
ByteArrayInputStream is = new ByteArrayInputStream(imageBytes);
ImageView imageV = new ImageView(activity);
imageV.setImageBitmap(BitmapFactory.decodeStream(is));
LogCat error message is: SkImageDecoder::Factory returned null
PLUS the server log says it sent 14548 bytes,
BUT the client log says it received 25294 bytes.
An encoding issue?
I tried adding encoding to the server BinaryReader, no luck.
I also tried on the client side:
imageV.setImageBitmap(BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length));
I have spent hours looking through dozens of posts and tried other changes I can't even remember,
but the result is always "Factory returned null".
What am I doing wrong?
Edit----
Tried changing to
byte[] imageBytes = Base64.decode(response, Base64.DEFAULT);
That generated: IllegalArgumentException: bad base-64
You cannot use readLine() to read the bytes of an image.
Declare a buffer and in a loop read() bytes in the buffer and save them.
You cannot use intermediate Strings either.
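For example, a minimal sketch of buffering the raw image bytes (assuming the server sends only the image and then closes the connection; imageV is the ImageView from your snippet):
InputStream in = socket.getInputStream();
ByteArrayOutputStream imageBuffer = new ByteArrayOutputStream();
byte[] chunk = new byte[8192];
int n;
// read raw bytes until the server closes the connection
while ((n = in.read(chunk)) != -1) {
    imageBuffer.write(chunk, 0, n);
}
byte[] imageBytes = imageBuffer.toByteArray();
imageV.setImageBitmap(BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length));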
If the server only sends an image, you could even use
imageV.setImageBitmap(BitmapFactory.decodeStream(socket.getInputStream()));
The read position in the InputStream may already have advanced (e.g. to 1024) after the first decode, so call inputStream.reset() before the second decode. Hope that works.
Related
There is a string variable and a photo taken from the camera intent. The directory location of the photo is known. I want to make an HTTP POST of the string variable and the photo to a webserver at the same time. Is that possible? If so, how do I do it?
From what I understand, you need to send an image and a string to your webserver within a single POST request. Here's how you'd proceed.
You first need to Base64 encode your image.
Start by converting your image into a byte array:
InputStream inputStream = new FileInputStream(<path_to_image>);
byte[] buff = new byte[8192];
int readBytes;
ByteArrayOutputStream byteArrOS = new ByteArrayOutputStream();
try {
    while ((readBytes = inputStream.read(buff)) != -1) {
        byteArrOS.write(buff, 0, readBytes);
    }
} catch (IOException e) {
    e.printStackTrace();
}
byte[] b = byteArrOS.toByteArray();
Then convert it to Base64:
String bsfEncodedImage = Base64.encodeToString(b, Base64.DEFAULT);
Then build the query with the string and the resulting Base64, both encoded with URLEncoder and "utf-8":
String strImgQuery = "str=" + URLEncoder.encode(<string_data>, "utf-8") + "&image=" + URLEncoder.encode(bsfEncodedImage, "utf-8");
Declare a new URL:
URL postUrl = new URL("http://<IP>/postreq");
Open the connection:
HttpURLConnection conn = (HttpURLConnection)postUrl.openConnection();
Enable output on the connection (needed for a POST request but not for GET):
conn.setDoOutput(true);
Set the request method to POST:
conn.setRequestMethod("POST");
Set the read timeout in milliseconds:
conn.setReadTimeout(10000); // setReadTimeout expects an int, so 1e4 would not compile
Write the query to the connection's output stream, then flush and close:
Writer buffWriter = new OutputStreamWriter(conn.getOutputStream());
buffWriter.write(strImgQuery);
buffWriter.flush();
buffWriter.close();
On the server side you'll receive the str and image POST parameters; how you read them depends on your server implementation.
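As an illustration only, not your actual server code: if the server happened to be a Java servlet and the request body were sent as application/x-www-form-urlencoded (you may need conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded") on the client), reading and decoding could look roughly like this:
// hypothetical servlet-side sketch; adapt to whatever server stack you actually use
String str = request.getParameter("str");
String encodedImage = request.getParameter("image");
// a MIME decoder tolerates the line breaks that Android's Base64.DEFAULT inserts
byte[] imageBytes = java.util.Base64.getMimeDecoder().decode(encodedImage);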
Note that your URL must follow the URL specification, otherwise you'll get a MalformedURLException. If that happens, check what exactly the issue is. For example, if you use a non-existent "ttp" protocol instead of http, your exception will look something like this:
java.net.MalformedURLException: unknown protocol: ttp
at java.net.URL.<init>(URL.java:592)
at java.net.URL.<init>(URL.java:482)
at java.net.URL.<init>(URL.java:431)
at com.pheromix.core.lang.NumberFormatExceptionExample.MalformedURLExceptionExample.sendGetRequest(MalformedURLExceptionExample.java:28)
at com.pheromix.core.lang.NumberFormatExceptionExample.MalformedURLExceptionExample.main(MalformedURLExceptionExample.java:17)
Also, this is a synchronous operation and runs on the UI thread. It might be costly or it might not, depending on the other operations you're already running and the size of the POST data. If it becomes a problem, run the job on another thread.
You can use URLEncoder:
String strUrl = "http://192.168.1.9/impots/" +URLEncoder.encode("outil.php?action=OutilImporterDonneesMobile", "utf-8");
URL url = new URL(strUrl);
In my Android application, the user can upload a 300 KB image.
I'm going to use this (Android Asynchronous Http Client), which I think is great; WhatsApp is reportedly one of its users.
In this library, I can use RequestParams (which I think is provided by Apache) and add either a file or a string to it (and lots of other types too).
Here it is:
1- Adding a file, which is my image (I think as multipart/form-data):
RequestParams params = new RequestParams();
String contentType = RequestParams.APPLICATION_OCTET_STREAM;
params.put("my_image", new File(image_file_path), contentType); // here I added my Imagefile direcyly without base64ing it.
.
.
.
client.post(url, params, responseHandler);
2- Sending as a string (so it would be Base64-encoded):
File fileName = new File(image_file_path);
InputStream inputStream = new FileInputStream(fileName);
byte[] bytes;
byte[] buffer = new byte[8192];
int bytesRead;
ByteArrayOutputStream output = new ByteArrayOutputStream();
try {
while ((bytesRead = inputStream.read(buffer)) != -1) {
output.write(buffer, 0, bytesRead);
}
} catch (IOException e) {
e.printStackTrace();
}
bytes = output.toByteArray();
String encoded_image = Base64.encodeToString(bytes, Base64.DEFAULT);
// then add it to params :
params.add("my_image",encoded_image);
// And the rest is the same as above
So my question is:
Which one is better in terms of speed and quality?
What are the differences?
NOTE:
I've read many answers to similar questions, but none of them actually answers this question, for example this one.
I don't know whether params.put() and params.add() would cause a change in the multipart encoding.
The Base64-encoded data transfers roughly 33% slower because there are roughly 33% more bytes to transfer: Base64 turns every 3 bytes into 4, so a 300 KB image becomes about 400 KB on the wire.
As for quality, the two are equal: in both cases the uploaded image is byte-for-byte identical to the original.
I've got several strings I want to send to a wearable app via the MessageApi.sendMessage method, which just takes a byte[] parameter to carry the payload.
I could concatenate all the strings together and introduce some arbitrary separating character sequence then convert the result to bytes and do the reverse to unpack it on the watch.
But I was wondering if there's a ready-made solution for such a thing in Android (or Java)?
As commented, DataOutputStream/DataInputStream are made for exactly that:
ByteArrayOutputStream byteStream = new ByteArrayOutputStream();
DataOutputStream data = new DataOutputStream(byteStream);
data.writeUTF("Whatever string you have here");
data.writeUTF("Write any number of strings");
byte[] result = byteStream.toByteArray();
On the other side:
byte[] myBytes = ...; // Retrieve the bytes through MessageApi
ByteArrayInputStream byteStream = new ByteArrayInputStream(myBytes);
DataInputStream data = new DataInputStream(byteStream);
String str1 = data.readUTF();
// That would be "Whatever string you have here"
String str2 = data.readUTF();
// And "Write any number of strings"
It's that simple. Since those are byte[]-backed streams, there is no need to close them.
I'm trying to send some commands to Android (client) from VB.NET (server) using sockets. I can connect the client to the server, but I don't know how to receive the commands sent by the server.
Here's a part of my Android code:
public void connect(View v){ //called on button click
SERVER_IP = ip.getText().toString(); //it gets the server's ip from an editText
SERVER_PORT = Integer.parseInt(port.getText().toString()); //and the port too
Toast.makeText(this, "Trying to connect to\n" + SERVER_IP + ":" + SERVER_PORT + "...", Toast.LENGTH_SHORT).show();
new Thread(new Runnable() {
public void run() {
InetAddress serverAddr;
try {
serverAddr = InetAddress.getByName(SERVER_IP);
socket = new Socket(serverAddr, SERVER_PORT); //It establishes the connection with the server
if(socket != null && socket.isConnected()){ //not sure if it is correct
BufferedReader input = new BufferedReader(new InputStreamReader(socket.getInputStream()));
//Here comes the problem, I don't know what to add...
}
} catch (Exception e) {
}
}
}).start();
}
And here's a part of my VB.NET send code:
Private Sub Button1_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button1.Click
send(TextBox1.text)
End Sub
Private Sub Send(ByVal command As String)
Dim temp() As Byte = System.Text.Encoding.UTF8.GetBytes(command) 'Is UTF8 right to use for that?
stream.Write(temp, 0, temp.Length)
stream.Flush()
End Sub
Question 1: is it right to use UTF8 instead of, for example, ASCII encoding?
Question 2: what would I need to change in the Android code if the server used a timer to send a command every second?
Thanks.
To read input from a BufferedReader you need to do something similar to this:
BufferedReader input = new BufferedReader(new InputStreamReader(socket.getInputStream()));
String line;
while((line = input.readLine()) != null){
// do something with the input here
}
A nice tutorial on sockets is available from Oracle: http://docs.oracle.com/javase/tutorial/networking/sockets/readingWriting.html
The default charset on Android is UTF-8 (http://developer.android.com/reference/java/nio/charset/Charset.html), so no worries there, but you can always send a byte stream from the server to the client and decode it however you want.
To receive a byte stream you need to do this:
BufferedInputStream input = new BufferedInputStream(socket.getInputStream());
byte[] buffer = new byte[byteCount]; // byteCount = whatever read-buffer size you choose
int bytesRead;
while ((bytesRead = input.read(buffer, 0, byteCount)) != -1) {
    // do something with the bytes
    // for example decode only the bytes actually read into a string
    String decoded = new String(buffer, 0, bytesRead, Charset.forName("UTF-8"));
    // keep in mind this string might not be a complete command, it's just a decoded chunk of bytes
}
As you can see, it's much easier if you send strings instead of raw bytes.
If you want to receive input from the server periodically, one solution would be to create a loop which opens a socket, receives input, processes it, closes the socket, and then repeats; or you could just keep the loop running until some command like "STOP" is received.
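A rough sketch of that second approach (handleCommand() is just a placeholder for whatever you do with each command):
BufferedReader input = new BufferedReader(new InputStreamReader(socket.getInputStream()));
String command;
// keep reading commands until the server sends STOP or closes the socket
while ((command = input.readLine()) != null) {
    if ("STOP".equals(command)) {
        break;
    }
    handleCommand(command); // placeholder for your own processing
}
socket.close();
Note that readLine() only returns once a line terminator arrives, so the VB.NET side should append vbLf (or vbCrLf) to each command it sends.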
I've got an Android device acting as a client; the PC is a Bluetooth server using the BlueCove library.
Code snippet from the client:
btSocket = serverBt.createRfcommSocketToServiceRecord(myUuid);
btAdapter.cancelDiscovery();
btSocket.connect();
InputStream in = btSocket.getInputStream();
OutputStream out = btSocket.getOutputStream();
OutputStreamWriter osw = new OutputStreamWriter(out);
InputStreamReader isr = new InputStreamReader(in);
osw.write(55);
osw.flush();
out.flush();
//osw.close();
logTheEvent("Stuff got written, now waiting for the response.");
int dummy = isr.read();
logTheEvent("Servers response: "+ new Integer(dummy).toString());
And the server:
StreamConnectionNotifier streamConnNotifier = (StreamConnectionNotifier)Connector.open( connectionString, Connector.READ_WRITE );
StreamConnection incomingConnection=streamConnNotifier.acceptAndOpen();
InputStream in = incomingConnection.openInputStream();
OutputStream out = incomingConnection.openOutputStream();
OutputStreamWriter osw = new OutputStreamWriter(out);
InputStreamReader isr = new InputStreamReader(in);
int fromClient = isr.read();
System.out.println("Got from client " + new Integer(fromClient).toString());
osw.write(999);
When the osw.close(); at the client is uncommented, the message gets transferred to the server; however, the client is then unable to receive the response, and an IOException with the message "socket already closed" is thrown.
However, when osw.close(); is commented out, both client and server freeze:
A. Client hangs on reading server's response of course
B. Server hangs on streamConnNotifier.acceptAndOpen();
What should be done to enable two-way communication?
Is my code, the PC Bluetooth stack, or BlueCove to blame?
Bluetooth uses buffered output. That means there is a small memory buffer that holds the data you write to the stream. When this buffer gets full, its contents are written to the socket as a packet. When you prematurely close the socket, that buffer gets wiped and the data is gone.
In order to force the stream to write, try calling flush().
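For instance, on the server side (a sketch reusing the names from the snippet above; the client already flushes after its write), flushing right after the write pushes the reply out:
osw.write(999);
osw.flush(); // force the buffered character out of the OutputStreamWriter
out.flush(); // and out of the underlying output stream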
Something else you could do is set the buffer size to be very small, so data always gets written. The performance won't be very good if you do this, though.
Unfortunately, I don't have all of the code I wrote, but there's a base project here