I have the following code:
urlString = "..."
URL url = new URL(urlString);
HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
How can I make urlString secure while keeping it working? By secure I mean protected from someone decompiling the APK and reading the URL itself. Or is it a lost cause, because any simple sniffer will see the URL anyway? If so, what can be done about it?
I wrote code to establish an HttpsURLConnection and setRequestMethod to PUT. While debugging, I see the method as GET; my setRequestMethod call seems to have no effect. I don't know why the HttpsURLConnection method defaults to GET.
My code looks like this
DisbleSSL disble = new DisbleSSL();
disble.disableSSLVerification();
URL url = new URL(url_string);
httpsUrlConnection = (HttpsURLConnection) url.openConnection();
httpsUrlConnection.setDoOutput(true);
httpsUrlConnection.setRequestMethod("PUT");
httpsUrlConnection.connect();
HttpsURLConnection definitely supports setRequestMethod; the value may simply be mislabeled in the debug console. HttpsURLConnection delegates to an internal object, so the debugger may show the delegate's state rather than the wrapper's:
protected DelegateHttpsURLConnection delegate
Try setRequestMethod before setDoOutput
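A minimal sketch of that ordering (the endpoint URL below is a placeholder, not from the question):

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class PutExample {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint; substitute your own URL.
        URL url = new URL("https://example.com/resource");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();

        // Set the request method before enabling output; the method must be
        // configured before the connection is actually made.
        conn.setRequestMethod("PUT");
        conn.setDoOutput(true);

        // The configured method is now PUT, regardless of what the debugger
        // may display for the internal delegate object.
        System.out.println(conn.getRequestMethod()); // PUT

        // Writing the request body and calling getResponseCode() would
        // perform the actual network request (omitted here).
    }
}
```

Note that openConnection() does not touch the network, so you can inspect getRequestMethod() before connecting.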
So I have a web page that I am trying to access via an HttpsURLConnection. The certificate was issued by DigiCert CA-3, but I always get a 400 response. The Android developer website and the Wikipedia example work fine, and this website also opens in Chrome on the device. I want to know whether this could be a problem with how I am invoking the call, or what else the issue could be.
URL url = new URL(strUrl);
HttpsURLConnection urlConnection = (HttpsURLConnection) url.openConnection();
SSLContext sslcontext = SSLContext.getDefault();
CookieSyncManager.createInstance(context);
CookieManager cookieManager = CookieManager.getInstance();
cookieManager.removeAllCookie();
urlConnection.setDoOutput(false);
urlConnection.setDoInput(true);
urlConnection.setSSLSocketFactory(sslcontext.getSocketFactory());
int code = urlConnection.getResponseCode(); //always 400 :(
You should probably double-check the URL and request you are sending; a 400 status means the server rejected the client's request as a Bad Request.
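One way to see why the server rejected the request is to read the error body: for error statuses the response body is on getErrorStream(), not getInputStream(). A sketch (the URL is a placeholder):

```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class DiagnoseBadRequest {
    public static void main(String[] args) throws Exception {
        // Placeholder URL; substitute the one that returns 400 for you.
        URL url = new URL("https://example.com/page");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        int code = conn.getResponseCode();
        if (code >= HttpURLConnection.HTTP_BAD_REQUEST) { // 400 and above
            // The server's explanation, if any, is on the error stream.
            InputStream err = conn.getErrorStream();
            if (err != null) {
                try (BufferedReader r = new BufferedReader(new InputStreamReader(err))) {
                    String line;
                    while ((line = r.readLine()) != null) {
                        System.out.println(line);
                    }
                }
            }
        }
    }
}
```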
I don't know if the following is correct for the connection; I get an IOException.
sb=ftp://IDUSER:password#ftp.fercode.com/manolo;type=i
URL url = new URL( sb.toString() );
URLConnection urlc = url.openConnection();
urlc.getOutputStream();// this line throws a IOException
You need to use an FTP client, such as Apache Commons Net's FTPClient, to properly upload and download files.
Take look here:
http://commons.apache.org/proper/commons-net/apidocs/org/apache/commons/net/ftp/FTPClient.html
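A sketch of an upload with Commons Net's FTPClient (the host, credentials, and file names below are placeholders mirroring the question; the ";type=i" part of the URL is replaced by explicitly setting binary mode):

```java
import java.io.FileInputStream;
import java.io.InputStream;

import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class FtpUploadExample {
    public static void main(String[] args) throws Exception {
        FTPClient ftp = new FTPClient();
        try {
            ftp.connect("ftp.fercode.com");          // placeholder host
            if (!ftp.login("IDUSER", "password")) {  // placeholder credentials
                throw new IllegalStateException("FTP login failed");
            }
            ftp.enterLocalPassiveMode();
            ftp.setFileType(FTP.BINARY_FILE_TYPE);   // the equivalent of ";type=i"
            try (InputStream in = new FileInputStream("manolo")) {
                if (!ftp.storeFile("manolo", in)) {
                    throw new IllegalStateException("Upload failed: " + ftp.getReplyString());
                }
            }
            ftp.logout();
        } finally {
            if (ftp.isConnected()) {
                ftp.disconnect();
            }
        }
    }
}
```

This needs the commons-net dependency on the classpath.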
I'm trying to download an epub from a host, but the link I have is HTTP, and I just download the HTML page that redirects me. How can I download the original file?
This is my Download Code:
URL url = new URL("http://st10.file.karelia.ru/25nv8s/9789d242bd127ce31991dd68fa434caa/7958c0b30a9fa22938770ac65e9f2544/principito.epub");
URLConnection conexion = url.openConnection();
conexion.connect();
I just download an HTML file, but when I use, for example, a link from Dropbox beginning with https://, everything works.
Try this
URL url = new URL("http://st12.file.karelia.ru/25nv8s/598577e8b5897f89c7e44edb0b2f09aa/6d2b030e8c28b800e957af40960b8d46/20mil_leguas.epub?force");
URLConnection conexion = url.openConnection();
conexion.connect();
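For completeness, a minimal sketch of actually saving the response body to a file once the connection returns the epub rather than an HTML page (the URL and output file name here are my own placeholders):

```java
import java.io.BufferedInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URL;
import java.net.URLConnection;

public class DownloadFile {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/book.epub"); // placeholder URL
        URLConnection conexion = url.openConnection();
        conexion.connect();
        // Copy the stream to disk in chunks.
        try (InputStream in = new BufferedInputStream(conexion.getInputStream());
             OutputStream out = new FileOutputStream("book.epub")) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
    }
}
```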
I'm having a strange problem here. Here's the code I'm using to fetch a URL's content:
URL u = new URL(url);
InputStream is = new BufferedInputStream(u.openStream());
I've got two URLs I want to fetch with this code. Both contain XML data. To be specific, the first one is http://www.berlingske.dk/unwire/latest/news_article/2/10, the second one is http://www.bt.dk/mecommobile/latest/news_article/1368/10?output_type=xml. The first one gets fetched correctly, the second one does not. I added some logging and found out that for the second URL a weird HTML page gets fetched instead of the expected XML. How can that even be possible?
I think you're talking about URL redirects, which was a problem I was having. Try the following code:
URL url = new URL(urlString); // urlString is the original address as a String
HttpURLConnection ucon = (HttpURLConnection) url.openConnection();
ucon.setInstanceFollowRedirects(false);
URL secondURL = new URL(ucon.getHeaderField("Location"));
URLConnection conn = secondURL.openConnection();
InputStream is = new BufferedInputStream(conn.getInputStream());
The "magic" here happens in these 2 steps:
ucon.setInstanceFollowRedirects(false);
URL secondURL = new URL(ucon.getHeaderField("Location"));
By default instanceFollowRedirects is set to true, but you want to set it to false to capture the second URL. That second URL does not come from the "weird HTML page" itself; it is carried in the redirect response's header field called "Location".
Unless I misunderstood your problem, I hope this helps!
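One caveat worth adding (my own note, not part of the answer above): the Location header may be a relative path, in which case it should be resolved against the original URL before opening the second connection. The two-argument URL constructor does this; the URLs below are placeholders:

```java
import java.net.URL;

public class ResolveRedirect {
    public static void main(String[] args) throws Exception {
        // Placeholder original URL and a relative Location header value.
        URL base = new URL("http://www.example.com/dir/page");
        String location = "/files/book.epub"; // e.g. from getHeaderField("Location")

        // new URL(base, spec) resolves a relative spec against the base URL.
        URL resolved = new URL(base, location);
        System.out.println(resolved); // http://www.example.com/files/book.epub
    }
}
```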