I have been trying to implement a video download feature in an application, but I cannot get it right. Given a video link, how do I get video info such as the video size, video title, etc.? My code for scraping this information hangs while fetching the URL until the application runs out of memory.
import 'package:http/http.dart' as http;
import 'package:html/parser.dart' show parse;

String url; // e.g. https://www.example.com/video-480p.mp4
Map<String, dynamic>? videoInfo = {};
String? videoTitle;
String? videoImageLink;

var linkInfo = await http.get(Uri.parse(url)); // hangs here
if (linkInfo.statusCode != 200) {
  customToastWidget('Cannot Parse this link'); // Toast for showing messages
  return null;
}
final document = parse(linkInfo.body); // parse() is from the html package; it parses the page's HTML
videoInfo["videoSize"] = linkInfo.contentLength;
videoInfo["videoTitle"] = document.getElementsByTagName("title")[0].text;
Below is the HTML tag for the audio link URL,
And below is the logic I used to get the URL:
String url = "https://www.dictionary.com/browse/happy?s=t";
Document doc = Jsoup.connect(url).get();
Elements wordBlock = doc.getElementsByClass("e16867sm0");
Element e = wordBlock.get(0);
Elements audioSection = e.getElementsByClass("e1rg2mtf7");
String audioUrl = audioSection.get(0).attr("audio");
But I am still unable to get the URL.
How can I get the URL of the audio by using the class name?
You can use either doc.select("source[type=audio/ogg]") for the first file or doc.select("source[type=audio/mpeg]") for the second one.
You can also use source[type^=audio], which will match both.
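For example, here is a minimal Jsoup sketch along those lines. The page URL comes from the question; that the file URL sits in the src attribute of the <source> tag is an assumption.

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public class AudioUrlExample {
    public static void main(String[] args) throws Exception {
        Document doc = Jsoup.connect("https://www.dictionary.com/browse/happy?s=t").get();

        // source[type^=audio] matches both the audio/ogg and audio/mpeg <source> tags
        Element source = doc.select("source[type^=audio]").first();
        if (source != null) {
            // absUrl resolves the src attribute against the page's base URL
            String audioUrl = source.absUrl("src");
            System.out.println(audioUrl);
        }
    }
}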
I made an app that is just a WebView showing a website. At a certain point in my website I have a button that prints a div via JavaScript. As it is, it doesn't work; how can I make it possible?
My JS code:
function printDiv(divName) {
  // Replace the page body with the div's contents, print, then restore the original body
  var printContents = document.getElementById(divName).innerHTML;
  var originalContents = document.body.innerHTML;
  document.body.innerHTML = printContents;
  window.print();
  document.body.innerHTML = originalContents;
}
I am working on a Xamarin app that has images stored in an S3 bucket. Querying works correctly in Xamarin when using the correctly constructed URL:
https:// + BucketName + path + ".jpg?AWSAccessKeyId=keycode&Expires=expireNumber&Signature=signatureCode"
When using Image.Source = urlAddress (in the above format), the image is loaded fine.
Some of the app's pages have custom renderers with Images that need to be rendered via a URL address. We are updating the images via URL at each OS level. iOS works correctly using the following code:
using (var url = new NSUrl(uri))
using (var data = NSData.FromUrl(url))
{
    if (data != null)
        return UIImage.LoadFromData(data);
}
This successfully gets the image from the URL and updates it. However, I am having major issues getting it to work on Android. I have tried the following:
Making a basic Android Uri and setting the ImageView with the following code, which has been explained not to work here: https://forums.xamarin.com/discussion/4323/image-from-url-in-imageview
Android.Net.Uri imageUri = Android.Net.Uri.Parse(url);
imageView.SetImageURI(imageUri);
On that same link, user 'rmacias' suggested using WebClient to download the data from the URL and decode the bytes into an Android Bitmap.
private Bitmap GetImageBitmapFromUrl(string url)
{
    Bitmap imageBitmap = null;
    using (var webClient = new WebClient())
    {
        var imageBytes = webClient.DownloadData(url);
        if (imageBytes != null && imageBytes.Length > 0)
        {
            imageBitmap = BitmapFactory.DecodeByteArray(imageBytes, 0, imageBytes.Length);
        }
    }
    return imageBitmap;
}
This returns a 403 Forbidden error at the line var imageBytes = webClient.DownloadData(url).
However, the same process works on iOS: the string is already authenticated, and I have set the authentication timeout to several minutes in case of a slow load. I have also tried the same URL request with the System.Net.Http library.
It crashes at res = (HttpWebResponse)request.GetResponse(); with the same 403 Forbidden error.
I have tried multiple things with authentication headers for the WebClient and HttpClient. It feels like it's something specific about Android requesting URL data, because the authentication in the URL string works for the Xamarin images and in the iOS code.
I'm thinking there is something specific to Android that I am missing? Help is much appreciated!
How about using HttpClient, which can leverage the platform-specific HttpClientHandlers that Xamarin provides?
So something like:
// Make sure to reuse your HttpClient instance; it is a shared resource.
// Wrapping it in a using() and disposing it all the time will leave
// sockets open and bog down the connection!
private static HttpClient _httpClient;

public async Task<byte[]> GetImageDataAsync(string url)
{
    if (_httpClient == null)
    {
        // you could inject AndroidHttpClientHandler or NSUrlSessionHandler here...
        _httpClient = new HttpClient();
        // set headers etc...
    }

    var response = await _httpClient.GetAsync(url).ConfigureAwait(false);
    if (!response.IsSuccessStatusCode)
        return null;

    var result = await response.Content.ReadAsByteArrayAsync().ConfigureAwait(false);
    return result;
}
Then you can use this platform-agnostically. On Android:
var data = await GetImageDataAsync(url);
imageBitmap = BitmapFactory.DecodeByteArray(data, 0, data.Length);
and on iOS:
var data = await GetImageDataAsync(url);
var imageData = NSData.FromArray(data);
imageBitmap = UIImage.LoadFromData(imageData);
There are also nice libraries, such as FFImageLoading, which support this out of the box, with effects, loading of images in TableViews, etc.; you can consider one of those as an alternative.
I want to save a whole web page, including its .css and .js files, on Android programmatically.
So far I have tried an HTTP GET, Jsoup, and the WebView content, but with all of them I could not save the full page with its CSS and JS. These methods only save the HTML part of the web page. Once the whole page is saved, I want to open it offline.
Thanks in advance
You have to take the HTML, parse it, get the URLs of the resources, and then make requests for those URLs too.
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;
import org.jsoup.select.Elements;

public class Stack {

    private static final String USER_AGENT = "";
    private static final String INITIAL_URL = "";

    public static void main(String args[]) throws Exception {
        Document doc = Jsoup
                .connect(INITIAL_URL)
                .userAgent(USER_AGENT)
                .get();

        Elements scripts = doc.getElementsByTag("script");
        Elements css = doc.getElementsByTag("link");

        // Fetch every external script referenced by the page
        for (Element s : scripts) {
            String url = s.absUrl("src");
            if (!url.isEmpty()) {
                System.out.println(url);
                Document docScript = Jsoup
                        .connect(url)
                        .userAgent(USER_AGENT)
                        .ignoreContentType(true)
                        .get();
                System.out.println(docScript);
                System.out.println("--------------------------------------------");
            }
        }

        // Fetch every stylesheet referenced via <link rel="stylesheet">
        for (Element c : css) {
            String url = c.absUrl("href");
            String rel = c.attr("rel") == null ? "" : c.attr("rel");
            if (!url.isEmpty() && rel.equals("stylesheet")) {
                System.out.println(url);
                Document docScript = Jsoup
                        .connect(url)
                        .userAgent(USER_AGENT)
                        .ignoreContentType(true)
                        .get();
                System.out.println(docScript);
                System.out.println("--------------------------------------------");
            }
        }
    }
}
I have a similar problem...
Using this code we can get the images, .css, and .js files. However, some HTML content is still missing.
For instance, when we save a web page via Chrome, there are 2 options:
Complete HTML
HTML only
Besides the .css, .js, .php and other resources, "Complete HTML" contains more elements than "HTML only". The requirement is to download the HTML as completely as Chrome does with the first option.
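As a rough extension of the code above (not part of the original answer), one could also pull down the images the page references; binary resources have to be fetched as raw bytes rather than parsed as documents. The page URL and output file names here are placeholders.

import java.io.FileOutputStream;
import org.jsoup.Connection;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public class ImageDownloader {
    public static void main(String[] args) throws Exception {
        Document doc = Jsoup.connect("https://example.com").get(); // placeholder URL

        int i = 0;
        for (Element img : doc.getElementsByTag("img")) {
            String url = img.absUrl("src");
            if (url.isEmpty()) continue;

            // execute() exposes the raw response so the bytes are not mangled by parsing
            Connection.Response res = Jsoup.connect(url).ignoreContentType(true).execute();
            try (FileOutputStream out = new FileOutputStream("image" + (i++) + ".bin")) {
                out.write(res.bodyAsBytes());
            }
        }
    }
}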
I am using the following code to fetch a division of a web page using HtmlUnit.
String url = "url";
String divisionName = "dv";
WebView w = (WebView) v.findViewById(R.id.webView1);
WebClient webClient = new WebClient();   // HtmlUnit's WebClient
HtmlPage currentPage = webClient.getPage(url);
HtmlElement imgElement = currentPage.getHtmlElementById(divisionName);
Now, I don't know how to load this fetched data into the WebView. I only know the command w.loadUrl(url);
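A minimal sketch of one way this could work, assuming the goal is to display the fetched element's markup (this is not from the question; serializing with asXml() and feeding the string to the WebView are assumptions):

// Serialize the element fetched by HtmlUnit to an HTML string
String divHtml = imgElement.asXml();

// Hand the string to the WebView; using the original page URL as the base URL
// lets relative links and resources inside the fragment resolve correctly
w.loadDataWithBaseURL(url, divHtml, "text/html", "UTF-8", null);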