For my application I need the latest data from a web page hosted on a server on my local network.
So I request the latest page with an HTTP GET, and as soon as the data is received I send the next request.
With my current implementation each request takes around 100-120 ms. Is there a way to make this quicker, given that it is the same URL that is requested every time?
For example, could I keep the connection to the page open and grab the latest data without setting up a new connection?
The page is around 900-1100 bytes.
HTTP GET code:
public static String makeHttpGetRequest(String stringUrl) {
    try {
        URL url = new URL(stringUrl);
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setReadTimeout(300);
        con.setConnectTimeout(300);
        con.setDoOutput(false);
        con.setDoInput(true);
        con.setChunkedStreamingMode(0);
        con.setRequestMethod("GET");
        return readStream(con.getInputStream());
    } catch (IOException e) {
        Log.e(TAG, "IOException when setting up connection: " + e.getMessage());
    }
    return null;
}
Reading the InputStream:
private static String readStream(InputStream in) {
    BufferedReader reader = null;
    StringBuilder total = new StringBuilder();
    try {
        String line = "";
        reader = new BufferedReader(new InputStreamReader(in));
        while ((line = reader.readLine()) != null) {
            total.append(line);
        }
    } catch (IOException e) {
        Log.e(TAG, "IOException when reading InputStream: " + e.getMessage());
    } finally {
        if (reader != null) {
            try {
                reader.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
    return total.toString();
}
As far as I know there isn't an implementation like the one you are asking for. I've dealt a lot with HTTP requests, and the best you can do is what your code already does. One other thing needs attention: your connection may be slow, and depending on it the time per request can be higher; in some cases I've dealt with, the connection timeout isn't big enough, but that's a server problem.
In my opinion you should stick with what you have now.
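One thing worth adding: HttpURLConnection already keeps persistent (keep-alive) connections in an internal pool, so the underlying socket can be reused for the next request to the same host, provided the response body is read completely, the stream is closed, and disconnect() is not called (disconnect() may close the socket). Below is a minimal sketch of a polling method that relies on this; the pollLatest name and the LOCAL_URL constant are made up for illustration, and whether it shaves off much of the 100-120 ms depends on whether that time is spent on connection setup or on the server generating the page.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class Poller {
    // Hypothetical address of the page on the local network.
    private static final String LOCAL_URL = "http://192.168.1.10/data";

    public static String pollLatest() throws IOException {
        HttpURLConnection con = (HttpURLConnection) new URL(LOCAL_URL).openConnection();
        con.setConnectTimeout(300);
        con.setReadTimeout(300);
        con.setRequestMethod("GET");

        StringBuilder body = new StringBuilder();
        BufferedReader reader = new BufferedReader(new InputStreamReader(con.getInputStream()));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                body.append(line);
            }
        } finally {
            // Close the stream but do not call con.disconnect(): a fully read,
            // closed response lets the socket go back to the keep-alive pool,
            // so the next pollLatest() call can skip the TCP handshake.
            reader.close();
        }
        return body.toString();
    }
}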
Related
I'm following an example of using the Reddit API in an Android app. I'm using Android Studio and Java. I have a link which returns a JSON object on a GET request (let's say http://www.reddit.com/r/dragonforce/.json), and the tutorial has this piece of code:
public static HttpURLConnection getConnection(String url) {
    System.out.println("URL: " + url);
    HttpURLConnection hcon = null;
    try {
        hcon = (HttpURLConnection) new URL(url).openConnection();
        hcon.setReadTimeout(30000); // Timeout at 30 seconds
        hcon.setRequestProperty("User-Agent", "Alien V1.0");
    } catch (MalformedURLException e) {
        Log.e("getConnection()", "Invalid URL: " + e.toString());
    } catch (IOException e) {
        Log.e("getConnection()", "Could not connect: " + e.toString());
    }
    return hcon;
}
and
public static String readContents(String url) {
    HttpURLConnection hcon = getConnection(url);
    if (hcon == null) return null;
    try {
        StringBuffer sb = new StringBuffer(8192);
        String tmp = "";
        BufferedReader br = new BufferedReader(
                new InputStreamReader(hcon.getInputStream()));
        tmp = br.readLine();
        while (tmp != null) {
            sb.append(tmp).append("\n");
            tmp = br.readLine();
        }
        br.close();
        return sb.toString();
    } catch (IOException e) {
        Log.d("READ FAILED", e.toString());
        return null;
    }
}
I separated the tmp assignment for debugging purposes. The problem is that nothing is read from the InputStream, and it returns an empty buffer to the JSONObject parser, resulting in JSONException: End of input at character 0 of. I have the INTERNET user-permission in the manifest, and the syntax for reading from the URL seems to be backed up by other sources on the internet, but it still seems something is amiss. Any help would be appreciated.
For anyone reading this down the line: the problem was that the URL in the tutorial used HTTP instead of HTTPS, so the server answered with a redirect response code and the body came back empty.
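For context, HttpURLConnection does not automatically follow a redirect that switches protocol (HTTP to HTTPS), which is why nothing was read. The simplest fix is to use the https:// URL directly; alternatively, the redirect can be handled by hand. A rough sketch, reusing the tutorial's getConnection() helper (the getConnectionFollowingRedirect name is made up):

import java.io.IOException;
import java.net.HttpURLConnection;

public static HttpURLConnection getConnectionFollowingRedirect(String url) throws IOException {
    HttpURLConnection hcon = getConnection(url); // the tutorial's helper above
    if (hcon == null) return null;
    int code = hcon.getResponseCode();
    // HttpURLConnection won't follow an HTTP -> HTTPS redirect on its own,
    // so detect a 3xx status and retry once against the Location header.
    if (code == HttpURLConnection.HTTP_MOVED_PERM
            || code == HttpURLConnection.HTTP_MOVED_TEMP
            || code == HttpURLConnection.HTTP_SEE_OTHER) {
        String location = hcon.getHeaderField("Location"); // typically the https:// address
        hcon.disconnect();
        hcon = getConnection(location);
    }
    return hcon;
}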
I am trying to connect to my Django backend server from my app. While in local/dev with an HTTP connection the Android app connects to the server fine, against the production server it is returning an HTTP 401 error for all API calls made via the app (except the login call). However, the funny thing is that using Postman I am able to reach the prod server.
Following is one of the code snippets (Android):
try {
    URL targetUrl = new URL(targetURL);
    httpConnection = (HttpURLConnection) targetUrl.openConnection();
    httpConnection.setRequestMethod("GET");
    httpConnection.setRequestProperty("Authorization", "jwt " + mToken);
    httpConnection.setConnectTimeout(10000); // 10 secs
    httpConnection.connect();

    Log.i(TAG, "response code:" + httpConnection.getResponseCode());
    if (httpConnection.getResponseCode() != 200) {
        Log.e(TAG, "Failed : HTTP error code : " + httpConnection.getResponseCode());
        return Constants.Status.ERR_INVALID;
    }

    // Received response
    InputStream is = httpConnection.getInputStream();
    BufferedReader rd = new BufferedReader(new InputStreamReader(is));
    String line;
    while ((line = rd.readLine()) != null) {
        response.append(line);
        //response.append('\r');
    }
    rd.close();
    Log.i(TAG, response.toString());

    // Save the tenant details
    return parseTenantInfo(response.toString());
} catch (MalformedURLException e) {
    e.printStackTrace();
    return Constants.Status.ERR_NETWORK;
} catch (SocketTimeoutException e) {
    e.printStackTrace();
    return Constants.Status.ERR_NETWORK;
} catch (IOException e) {
    e.printStackTrace();
    return Constants.Status.ERR_UNKNOWN;
} finally {
    if (httpConnection != null) {
        httpConnection.disconnect();
    }
}
Following is the target url:
private static final String targetURL = Constants.SERVER_ADDR + APIs.tenant_get;
Here, SERVER_ADDR is https://www.example.com/ and tenant_get is apitogettenantinfo/
I am always getting a 401 error. Kindly help me out! Thanks.
The most irritating thing is that Postman works and the Android login works, so it is seemingly not an issue with the server (else how would Postman work?), and I can't understand what the Android-side issue is.
EDIT:
Following is the screenshot of my postman. A few things are blacked out for security & privacy:
http://imageshack.com/a/img923/231/wUrOuS.png
401 indicates an unauthorised request, so make sure you are sending the correct token.
Also remove httpConnection.connect(); it is redundant here, since getResponseCode() opens the connection itself.
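If the token itself is correct, the header prefix is also worth checking: the snippet above sends a lowercase "jwt ", while (as an assumption about the backend configuration) djangorestframework-jwt defaults to the prefix "JWT" and simplejwt expects "Bearer". A small hedged sketch; the addAuthHeader helper is made up, and the right prefix is whatever your working Postman request shows:

import java.net.HttpURLConnection;

// Hypothetical helper: mirror the exact Authorization prefix the backend expects.
public static void addAuthHeader(HttpURLConnection con, String token, String prefix) {
    con.setRequestProperty("Authorization", prefix + " " + token);
}

In the snippet above you would then call addAuthHeader(httpConnection, mToken, "JWT") (or "Bearer") instead of hard-coding "jwt " + mToken.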
I have a problem with my HttpsURLConnection on Android.
First of all, no, it is not a duplicate. I tried almost all the solutions on SO, like changing the keep-alive option or the timeout (and some of them did optimize a part of my code a little bit), but it is still 5 to 10 times (probably more) slower on Android than on iOS.
Sending a request to my server takes several seconds on Android while it is almost instant on iOS and from a browser. I am sure the server is not the cause. But it seems that getting the InputStream is terribly slow!
This line:
in = conn.getInputStream();
is the slowest one, taking several seconds by itself.
My aim is to get a JSON from my server. My code is supposed to be technically as optimized as possible (and it may help other people dealing with HttpsURLConnection at the same time):
protected String getContentUrl(String apiURL) {
    StringBuilder builder = new StringBuilder();
    String line = null;
    String result = "";
    HttpsURLConnection conn = null;
    InputStream in = null;
    try {
        URL url;
        // get URL content
        url = new URL(apiURL);
        System.setProperty("http.keepAlive", "false");
        trustAllHosts();
        conn = (HttpsURLConnection) url.openConnection();
        conn.setHostnameVerifier(DO_NOT_VERIFY);
        conn.setRequestMethod("GET");
        conn.setRequestProperty(MainActivity.API_TOKEN, MainActivity.ENCRYPTED_TOKEN);
        conn.setRequestProperty("Connection", "close");
        conn.setConnectTimeout(1000);
        in = conn.getInputStream();
        // open the stream and put it into BufferedReader
        BufferedReader br = new BufferedReader(new InputStreamReader(in));
        while ((line = br.readLine()) != null) {
            builder.append(line);
        }
        result = builder.toString();
        //System.out.print(result);
        br.close();
    } catch (MalformedURLException e) {
        result = null;
    } catch (IOException e) {
        result = null;
    } catch (Exception e) {
        result = null;
    } finally {
        try {
            in.close();
        } catch (Exception e) {}
        try {
            conn.disconnect();
        } catch (Exception e) {}
        return result;
    }
}
However, it keeps taking several seconds.
So I would like to know: is there a way to improve the speed of this API call? The problem is not the server or the JSON parsing but for sure the function above. Thanks a lot.
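One thing that stands out in the code above is that it explicitly disables connection reuse (System.setProperty("http.keepAlive", "false") plus the Connection: close header), which forces a fresh TCP and TLS handshake for every call; on a mobile link that alone can cost noticeable time. Whether it explains the whole gap is uncertain, but a hedged sketch of the same GET with reuse left enabled looks like this (getContentUrlKeepAlive is a made-up name, and the certificate/hostname checks are left at their defaults):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URL;
import javax.net.ssl.HttpsURLConnection;

public class ApiClient {
    // Hypothetical variant of getContentUrl() that keeps keep-alive enabled so the
    // socket and TLS session can be reused across calls to the same host.
    protected String getContentUrlKeepAlive(String apiURL) throws IOException {
        HttpsURLConnection conn = (HttpsURLConnection) new URL(apiURL).openConnection();
        conn.setRequestMethod("GET");
        conn.setConnectTimeout(5000);
        conn.setReadTimeout(5000);
        // Request headers (e.g. the API token) would be set here as before.

        StringBuilder builder = new StringBuilder();
        InputStream in = conn.getInputStream();
        BufferedReader br = new BufferedReader(new InputStreamReader(in));
        try {
            String line;
            while ((line = br.readLine()) != null) {
                builder.append(line);
            }
        } finally {
            // Close the stream but do not call disconnect(), so the connection
            // can go back to the pool instead of being torn down every time.
            br.close();
        }
        return builder.toString();
    }
}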
I am using Eclipse ADT for my Android development. Let me explain my problem. I can receive the response from my server API; the problem is that the data is very large and I am unable to display the entire response in my logcat. I used AsyncTask for getting the response.
doInBackground method:
getBookingResults = ServerConnection.getbookings(BookingsActivity.this,
        Utils.URL + "users/" + "123145/" + "subscribed");
This is my GET method in a separate class:
public static String getData(Context ctx, String uri) {
    BufferedReader reader = null;
    StringBuilder sb = null;
    try {
        Log.d("Serverconnection URL ", uri);
        URL url = new URL(uri);
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setConnectTimeout(200000);

        // save status code
        Utils.statusCode = con.getResponseCode();
        // String responseBody = EntityUtils.toString(response.getEntity());

        sb = new StringBuilder();
        reader = new BufferedReader(new InputStreamReader(con.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            sb.append(line + "\n");
        }
        Log.d("server connection getData", "" + sb.toString());
        return sb.toString();
    } catch (SocketTimeoutException e) {
        Log.d("server connection getData Error ", "" + e);
    } catch (IOException e) {
        e.printStackTrace();
        return " ";
    } finally {
        if (reader != null) {
            try {
                reader.close();
            } catch (IOException e) {
                e.printStackTrace();
                return " ";
            }
        }
    }
    return sb.toString();
}
When I check the response string in my logcat, it shows a string length of 11743, but logcat does not display the entire response.
Help me out with handling this huge response.
Thanks in advance.
The thing is that you cannot blindly allocate all the data from the server, otherwise the risk of OOM is very high. You should use a technique similar to what Android suggests for lists: keep in memory only the elements visible to the user. In other words, first figure out what the size is, or assume that it may be huge. Then load the data chunk by chunk into some UI element and implement some kind of "load by scroll". If you cannot load from the net as you scroll, perhaps due to the nature of the connection, then load chunk by chunk and save the data to local storage, and display it chunk by chunk as described above (see the sketch just below). This is how I would do it. Sorry, not exactly the answer you are looking for.
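As a minimal sketch of the "save to local storage chunk by chunk" idea, assuming a plain HttpURLConnection and a cache file (the downloadToFile name and the 8 KB buffer size are arbitrary choices, not something from the question):

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ChunkedDownloader {
    // Hypothetical helper: stream the response to a file in fixed-size chunks
    // instead of building one huge String in memory.
    public static void downloadToFile(String uri, File target) throws IOException {
        HttpURLConnection con = (HttpURLConnection) new URL(uri).openConnection();
        InputStream in = con.getInputStream();
        FileOutputStream out = new FileOutputStream(target);
        try {
            byte[] buffer = new byte[8192]; // read 8 KB per chunk
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        } finally {
            out.close();
            in.close();
            con.disconnect();
        }
    }
}

The UI can then read and display slices of the file as the user scrolls. Also note that logcat itself truncates very long messages (roughly 4 KB per line), so even a correctly received 11743-character response will look cut off in the log.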
The following code basically works as expected. However, to be paranoid about resource leakage, I was wondering:
Do I need to call HttpURLConnection.disconnect() after I finish using it?
Do I need to call InputStream.close()?
Do I need to call InputStreamReader.close()?
Do I need the following two lines of code, httpUrlConnection.setDoInput(true) and httpUrlConnection.setDoOutput(false), just after the construction of httpUrlConnection?
The reason I ask is that most of the examples I saw do not do such cleanup, e.g. http://www.exampledepot.com/egs/java.net/post.html and http://www.vogella.com/articles/AndroidNetworking/article.html. I just want to make sure those examples are correct as well.
public static String getResponseBodyAsString(String request) {
    BufferedReader bufferedReader = null;
    try {
        URL url = new URL(request);
        HttpURLConnection httpUrlConnection = (HttpURLConnection) url.openConnection();
        InputStream inputStream = httpUrlConnection.getInputStream();
        bufferedReader = new BufferedReader(new InputStreamReader(inputStream));
        int charRead = 0;
        char[] buffer = new char[1024];
        StringBuffer stringBuffer = new StringBuffer();
        while ((charRead = bufferedReader.read(buffer)) > 0) {
            stringBuffer.append(buffer, 0, charRead);
        }
        return stringBuffer.toString();
    } catch (MalformedURLException e) {
        Log.e(TAG, "", e);
    } catch (IOException e) {
        Log.e(TAG, "", e);
    } finally {
        close(bufferedReader);
    }
    return null;
}

private static void close(Reader reader) {
    if (reader != null) {
        try {
            reader.close();
        } catch (IOException exp) {
            Log.e(TAG, "", exp);
        }
    }
}
Yes, you need to close the InputStream first and close the HttpURLConnection next. As per the javadoc:
Each HttpURLConnection instance is used to make a single request but the underlying network connection to the HTTP server may be transparently shared by other instances. Calling the close() methods on the InputStream or OutputStream of an HttpURLConnection after a request may free network resources associated with this instance but has no effect on any shared persistent connection. Calling the disconnect() method may close the underlying socket if a persistent connection is otherwise idle at that time.
The answer to the next two questions depends on the purpose of your connection. Read this link for more details.
I believe the requirement for calling setDoInput() or setDoOutput() is to make sure they are called before anything is written to or read from a stream on the connection. Beyond that, I'm not sure it matters when those methods are called.
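Putting the quoted advice together, here is a minimal sketch of the cleanup order (the fetch method name is made up): close the reader/stream in a finally block first, then call disconnect() if you really want the connection released.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class CleanupExample {
    public static String fetch(String request) throws IOException {
        HttpURLConnection con = (HttpURLConnection) new URL(request).openConnection();
        BufferedReader reader = null;
        try {
            reader = new BufferedReader(new InputStreamReader(con.getInputStream()));
            StringBuilder sb = new StringBuilder();
            char[] buffer = new char[1024];
            int read;
            while ((read = reader.read(buffer)) != -1) {
                sb.append(buffer, 0, read);
            }
            return sb.toString();
        } finally {
            if (reader != null) {
                reader.close();   // 1. close the stream first (frees this request's resources)
            }
            con.disconnect();     // 2. then disconnect (may close the socket if it is idle)
        }
    }
}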