I'm having some trouble reading two BBC feeds on my Android app, they both seem to time out. This is especially strange because all other feeds work fine using the exact same system. I guess if someone could test them in their Eclipse it might help me determine if my work's firewall/proxy is restricting access to this specific website.
The feeds are http://newsrss.bbc.co.uk/weather/forecast/2159/Next3DaysRSS.xml and http://feeds.bbci.co.uk/news/england/kent/rss.xml.
I have other feeds that are being read fine, e.g. http://www.kent.ac.uk/news/rss.html
The other strange thing is the chap next to me working on an iPhone under the same network restrictions is not having a problem.
Any help would be greatly appreciated!
For info here is the code I've been using to pull in feeds:
HttpParams params = new BasicHttpParams();
HttpConnectionParams.setConnectionTimeout(params, 10000);

// proxy settings
String proxyHost = android.net.Proxy.getDefaultHost();
int proxyPort = android.net.Proxy.getDefaultPort();
if (proxyPort != -1) {
    params.setParameter(ConnRoutePNames.DEFAULT_PROXY, new HttpHost(proxyHost, proxyPort));
}

URL url = null;
try {
    SAXParserFactory spf = SAXParserFactory.newInstance();
    SAXParser sp = spf.newSAXParser();
    XMLReader xr = sp.getXMLReader();
    url = new URL(feedUrl);
    URLConnection conn = url.openConnection();
    // setting these timeouts ensures the client does not deadlock indefinitely
    // when the server has problems.
    conn.setConnectTimeout(2000);
    conn.setReadTimeout(2000);
    xr.setContentHandler(this);
    /* This is where it lurches indefinitely VVV */
    xr.parse(new InputSource(url.openStream()));
} catch (IOException e) {
    Log.e("RSS Handler IO", e.getMessage() + " >> " + e.toString());
} catch (SAXException e) {
    Log.e("RSS Handler SAX", e.toString());
} catch (ParserConfigurationException e) {
    Log.e("RSS Handler Parser Config", e.toString());
}
I'm going to guess you're having the same problem as this question. Does it work if you take out the proxy settings? If that's the problem, I'd suggest writing the response stream to memory before feeding it to your XMLReader.
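For what it's worth, here is a rough sketch of what I mean by buffering the response into memory first. It reuses the 2-second timeouts and the feedUrl/handler from your snippet, and it reads from the URLConnection you configured rather than calling url.openStream() again (openStream() opens a separate connection, so the timeouts you set would not apply to it):

// Sketch only -- uses java.io.ByteArrayInputStream, java.io.ByteArrayOutputStream
// and java.io.InputStream on top of what you already import.
URL url = new URL(feedUrl);
URLConnection conn = url.openConnection();
conn.setConnectTimeout(2000);
conn.setReadTimeout(2000);

ByteArrayOutputStream buffer = new ByteArrayOutputStream();
InputStream in = conn.getInputStream();      // the configured connection, not url.openStream()
byte[] chunk = new byte[4096];
int read;
while ((read = in.read(chunk)) != -1) {      // a network stall now trips the read timeout above
    buffer.write(chunk, 0, read);
}
in.close();

// The parser now works entirely from memory, so it cannot hang on the network.
xr.setContentHandler(this);
xr.parse(new InputSource(new ByteArrayInputStream(buffer.toByteArray())));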
I'm building an application that reads XML from a URL and displays it on screen, but when I try to read the response I get a FileNotFoundException.
This is the code:
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    ListView rss = (ListView) findViewById(R.id.list);

    try {
        Authenticator.setDefault(new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                return new PasswordAuthentication("acceso!a!backend", "nu3v0!1nf0b43!2013".toCharArray());
            }
        });
        URL rssUrl = new URL("http://ec2-54-224-94-185.compute-1.amazonaws.com/adjuntos/162/rss/home_mobile.xmlx");
        SAXParserFactory factory = SAXParserFactory.newInstance();
        SAXParser saxParser = factory.newSAXParser();
        XMLReader xmlReader = saxParser.getXMLReader();
        RSSHandler rssHandler = new RSSHandler();
        xmlReader.setContentHandler(rssHandler);
        //////// here I am getting the FileNotFoundException
        InputSource inputSource = new InputSource(rssUrl.openStream());
        xmlReader.parse(inputSource);
        NoticiasAdapter na = new NoticiasAdapter(this, rssHandler.getChannel());
        rss.setAdapter(na);
    } catch (IOException e) {
        e.printStackTrace();
    } catch (SAXException e) {
        e.printStackTrace();
    } catch (ParserConfigurationException e) {
        e.printStackTrace();
    }
}
It's likely that URL does not exist, then. :)
Have you tried visiting that URL in your browser to confirm that it exists?
Also, URLConnection throws the IOException when you start doing I/O, because that's when it actually makes the physical connection. If you want the connection made earlier (to make sure the URL exists, for example), you can call URLConnection's connect() method -- just make sure you've done any necessary configuration, such as setting headers, beforehand. There's nothing wrong with waiting until you do I/O for the connection to be made; calling connect() first is simply an option if you want the failure to surface sooner.
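For example, something along these lines (just a sketch, reusing the URL from your code plus java.net.HttpURLConnection) would surface a 404 before the parser ever touches the stream:

// Sketch: connect up front and inspect the status code; a 404 here is the same
// condition that later shows up as a FileNotFoundException from openStream().
URL rssUrl = new URL("http://ec2-54-224-94-185.compute-1.amazonaws.com/adjuntos/162/rss/home_mobile.xmlx");
HttpURLConnection conn = (HttpURLConnection) rssUrl.openConnection();
conn.connect();                              // make the physical connection now
int code = conn.getResponseCode();
if (code == HttpURLConnection.HTTP_OK) {
    // safe to hand conn.getInputStream() to the SAX parser
} else {
    Log.e("RSS", "Server answered HTTP " + code + " for " + rssUrl);
}
conn.disconnect();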
I tried to open that URL and it gives me a 404 Not Found error.
It seems likely the URL does not exist.
You could split rssUrl.openStream() into two lines:
URLConnection connection = rssUrl.openConnection();
InputSource inputSource = new InputSource(connection.getInputStream());
This way you can debug to figure out whether you are able to connect, instead of doing it all in one step. After all, openStream() is just shorthand for openConnection().getInputStream(): http://docs.oracle.com/javase/7/docs/api/java/net/URL.html#openStream()
Once you have figured out that you have a valid URL and are able to connect, you can change back to your original implementation if desired.
I have this issue that has caused me to pound my head against the wall. I am writing a newspaper app that parses JSON data from a database and displays it. The app works fine and passes data on WiFi and 4G, but chokes on 3G. Most of the time it takes between 30 seconds and 1 minute to grab data on 3G, while only taking one to two seconds on WiFi. I often receive a warning message stating: HttpHostConnectException: Connection refused.

I know the site works perfectly fine and is not causing issues, because I can query it fine on WiFi and 4G, and can browse it from a desktop with no problems. As another test, I borrowed my coworker's MiFi, which is only on 3G in our area, connected my device to it, and it passes data just fine even though it is only 3G back to the Internet.

So after looking at this and trying to find a solution, I have come to the conclusion that maybe I am not doing something right on my end. To the best of my knowledge everything is fine, but I am no expert. Any insight on this would be greatly appreciated.
Summary:
4G = Works
WiFi = Works
3G = Extremely slow
3G via WiFi (MiFi on 3G) = Works
public JSONObject makeHttpRequest(String url, String method, List<NameValuePair> params) {
    // Making HTTP request
    try {
        if ("GET".equals(method)) {
            // request method is GET
            DefaultHttpClient httpClient = new DefaultHttpClient();
            String paramString = URLEncodedUtils.format(params, "utf-8");
            url += "?" + paramString;
            HttpGet httpGet = new HttpGet(url);
            HttpResponse httpResponse = httpClient.execute(httpGet);
            HttpEntity httpEntity = httpResponse.getEntity();
            is = httpEntity.getContent();
            System.out.println("---GET--- Now grabbing GET DATA");
        }
    } catch (UnsupportedEncodingException e) {
        e.printStackTrace();
    } catch (ClientProtocolException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }

    try {
        BufferedReader reader = new BufferedReader(new InputStreamReader(
                is, "iso-8859-1"), 8);
        StringBuilder sb = new StringBuilder();
        String line = null;
        while ((line = reader.readLine()) != null) {
            sb.append(line + "\n");
        }
        is.close();
        json = sb.toString();
    } catch (Exception e) {
        Log.e("Buffer Error", "Error converting result " + e.toString());
    }

    // try parse the string to a JSON object
    try {
        jObj = new JSONObject(json);
    } catch (JSONException e) {
        Log.e("JSON Parser", "Error parsing data " + e.toString());
    }

    // return JSON String
    return jObj;
}
Is the 3G on the MiFi equally slow? Because otherwise it sounds like you are saying that your process fails when the connection is slow.
You mention that 3G takes > 30s. Are you running on app engine? GAE has a hard limit on how long transactions can take - I believe that limit is 30s.
What if you added a delay on your server so that even a WiFi request takes as long as the 3G tests are taking now, to verify that it is the time taken that is causing the failure?
Also, I think those 3G results sound rather poor. I don't know how much data you are retrieving but it really doesn't sound like it should take that long. So perhaps your 3G connection is simply a poor quality connection (and the MiFi perhaps is a better 3G connection).
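If it helps while you test, here is a minimal sketch of giving the DefaultHttpClient in your makeHttpRequest() explicit timeouts, so a bad 3G connection fails fast with a clear exception instead of sitting there for 30+ seconds. The 10/15-second values are arbitrary placeholders, not recommendations:

// Sketch only: tune the timeout values for your app.
HttpParams httpParams = new BasicHttpParams();
HttpConnectionParams.setConnectionTimeout(httpParams, 10000); // max time to establish the connection
HttpConnectionParams.setSoTimeout(httpParams, 15000);         // max time to wait for data once connected
DefaultHttpClient httpClient = new DefaultHttpClient(httpParams);
// ...then build and execute the HttpGet exactly as you do now.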
I want to parse the XML from a weather API URL in my sample Android app, but an exception is raised while parsing the URL.
If I comment the last statement of the try block:
xr.parse(new InputSource(url.openStream()));
then my program runs successfully. Please review the part of my code where I parse the URL.
try {
    URL url;
    String queryString = "http://free.worldweatheronline.com/feed/weather.ashx?q=34.01,71.54&format=xml&num_of_days=5&key=ccad66928f081759132201";
    url = new URL(queryString.replace(" ", "%20"));
    SAXParserFactory spf = SAXParserFactory.newInstance();
    SAXParser sp = spf.newSAXParser();
    XMLReader xr = sp.getXMLReader();
    WeatherHandler myWeatherHandler = new WeatherHandler();
    xr.setContentHandler(myWeatherHandler);
    xr.parse(new InputSource(url.openStream()));
    Log.d(TAG, "it's all right");
} catch (Exception e) {
    System.out.println(e);
    Log.d(TAG, "it's Wrong");
}
Here is the screenshot of the LogCat output when the exception occurs.
Are you sure the line:
new InputSource(url.openStream());
from
xr.parse(new InputSource(url.openStream()));
is returning something? If you have to connect to the Internet, don't forget to add the INTERNET permission to the manifest. And you should also test whether you actually get something from that URL before trying to parse it.
At least that's my interpretation of your log and the exception.
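To make the last point concrete, a quick way to check is to dump the raw response into LogCat before parsing. This is only a sketch: it reuses queryString and TAG from your snippet, assumes UTF-8, and needs java.io.BufferedReader, java.io.InputStreamReader and java.io.StringReader on top of your existing imports:

// Sketch: read the feed into a String first so you can log it and confirm
// the server actually returned XML.
URL url = new URL(queryString.replace(" ", "%20"));
BufferedReader reader = new BufferedReader(new InputStreamReader(url.openStream(), "UTF-8"));
StringBuilder raw = new StringBuilder();
String line;
while ((line = reader.readLine()) != null) {
    raw.append(line).append('\n');
}
reader.close();
Log.d(TAG, "raw response: " + raw);   // empty? then the request (or the missing permission) is the problem
// If the dump looks fine, parse the buffered copy instead of opening the stream again:
xr.parse(new InputSource(new StringReader(raw.toString())));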
I've implemented a SAX parser in my application which has worked fine previously, but I'm having problems with a new XML document.
This is my parser:
public List<Article> getLatestArticles(String feedUrl) {
    URL url = null;
    try {
        SAXParserFactory spf = SAXParserFactory.newInstance();
        SAXParser sp = spf.newSAXParser();
        XMLReader xr = sp.getXMLReader();
        url = new URL(feedUrl);
        xr.setContentHandler(this);
        xr.parse(new InputSource(url.openStream()));
    } catch (IOException e) {
        Log.e("RSS Handler IO", e.getMessage() + " >> " + e.toString());
    } catch (SAXException e) {
        Log.e("RSS Handler SAX", e.toString());
    } catch (ParserConfigurationException e) {
        Log.e("RSS Handler Parser Config", e.toString());
    } catch (java.lang.IllegalArgumentException e) {
        Log.e("RSS Handler lang", e.getMessage() + " >> " + e.toString());
    }
    return articleList;
}
The parser starts off OK but then I get a java.lang.IllegalArgumentException. I believe this may be due to an element with no value in my XML feed; it looks like this: <Description/>.
Any suggestion on how to fix this would be much appreciated.
If <Description/> is a self-closing tag (i.e. an element with no text value) then this syntax is perfectly valid.
It is hard to tell exactly what is causing the error without seeing the callback methods (e.g. the startElement method), but there is one major gotcha that you can check:
The SAX parse method throws an IllegalArgumentException if the InputStream is null. It might be worth checking the value coming into the SAX parser.
You can use something like the following to check whether the stream actually has data before you parse it:
InputStreamReader reader = new InputStreamReader(url.openStream());
if (!reader.ready()) {
    System.out.println("error");
}
The connection could close before all the data downloads.
Try using a BufferedInputStream and see if that helps:
BufferedInputStream in = new BufferedInputStream(new URL(feedUrl).openStream());
...
BufferedReader isr = new BufferedReader(new InputStreamReader(in));
InputSource is = new InputSource(isr);
xr.parse(is);
<Description/> is a self-closing tag, which is perfectly valid XML, so the construction itself is not a problem for SAXParser.
For a self-closing tag the parser fires startElement and then endElement immediately afterwards, with no characters() call in between.
So make sure your callbacks can cope with an element whose text buffer is empty.
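To make that concrete, here is a minimal handler sketch (the class name and the text buffer are just illustrative) showing how to cope with empty elements such as <Description/>:

import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class SafeRssHandler extends DefaultHandler {
    private final StringBuilder text = new StringBuilder();

    @Override
    public void startElement(String uri, String localName, String qName, Attributes attrs) {
        text.setLength(0);                   // reset for every element, including empty ones
    }

    @Override
    public void characters(char[] ch, int start, int length) {
        text.append(ch, start, length);      // never called for <Description/>
    }

    @Override
    public void endElement(String uri, String localName, String qName) {
        String value = text.toString().trim();   // "" for an empty element, never null
        if ("Description".equalsIgnoreCase(localName) || "Description".equalsIgnoreCase(qName)) {
            // store value, guarding against the empty case if your model requires a description
        }
    }
}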
I need to parse the XML data returned from accessing a REST-based service to display only one single tag. For example, parse the XML data shown below to display firstname tag and value John only.
<company>
    <staff>
        <firstname>John</firstname>
        <lastname>Doe</lastname>
    </staff>
</company>
I am struggling to figure out how to interface with the SAX parsing code once XML data is returned by RESTClient. I tried different approaches after learning from different example codes but still cannot figure it out partly because they do not have the same exact purpose. So please kindly teach me how to call/pass the data to the parsing code and what to return from the parsing code, whether the parsing code should be in a separate class, etc. I am basically clueless without some guidance. Relevant RESTClient code is presented below. Thanks!
public class RESTClient {

    public static String callRESTService(String url) {
        String result = null;
        HttpClient httpclient = new DefaultHttpClient();
        // Prepare a request object
        HttpGet httpget = new HttpGet(url);
        // Execute the request
        HttpResponse response;
        try {
            response = httpclient.execute(httpget);
            // Get hold of the response entity
            HttpEntity entity = response.getEntity();
            // If the response does not enclose an entity, there is no need
            // to worry about connection release
            if (entity != null) {
                InputStream instream = entity.getContent();
                SAXParserFactory spf = SAXParserFactory.newInstance();
                SAXParser sp;
                try {
                    sp = spf.newSAXParser();
                    XMLReader xr = sp.getXMLReader();
                    DefaultHandler handler = new DefaultHandler();
                    xr.setContentHandler(handler);
                    InputSource is = new InputSource(instream);
                    xr.parse(is);
                    // what should/can be returned here from the parsing code:
                    // String, InputSource, InputStream? Convert data type?
                } catch (SAXException e) {
                    e.printStackTrace();
                } catch (ParserConfigurationException e) {
                    e.printStackTrace();
                }
            }
        } catch (ClientProtocolException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        return result;
    }
}
http://as400samplecode.blogspot.com/2011/11/android-parse-xml-file-example-using.html
You have to extend DefaultHandler and make your own XML handler in which you parse your XML.
You can see an example of extending DefaultHandler in the above link.
I would also strongly suggest that you keep your parser code in a separate file, since RESTClient should only be responsible for sending and receiving data from the remote location. This also gives you loose coupling between the two components.
As for what to return from the parsing code: the output depends on your needs (personally, I return a List of HashMaps).
I suggest that you return a POJO here.
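To give you a starting point, here is a rough sketch of such a handler, trimmed down to the single <firstname> tag from your sample XML. The class and field names are placeholders:

import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class FirstNameHandler extends DefaultHandler {
    private final StringBuilder text = new StringBuilder();
    private boolean inFirstName = false;
    private String firstName;                // the single value you want to display

    @Override
    public void startElement(String uri, String localName, String qName, Attributes attrs) {
        if ("firstname".equalsIgnoreCase(localName) || "firstname".equalsIgnoreCase(qName)) {
            inFirstName = true;
            text.setLength(0);
        }
    }

    @Override
    public void characters(char[] ch, int start, int length) {
        if (inFirstName) {
            text.append(ch, start, length);
        }
    }

    @Override
    public void endElement(String uri, String localName, String qName) {
        if ("firstname".equalsIgnoreCase(localName) || "firstname".equalsIgnoreCase(qName)) {
            firstName = text.toString().trim();   // "John" for your sample
            inFirstName = false;
        }
    }

    public String getFirstName() {
        return firstName;
    }
}

In callRESTService() you would then create a FirstNameHandler, pass it to xr.setContentHandler(...) in place of the bare DefaultHandler, and after xr.parse(is) returns, call getFirstName() and return that value (or wrap it in a small POJO) instead of the unused result variable.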