I have a list of HTTP links to streaming videos (HLS). I was wondering if there is a simple and straightforward way to pick from a list of HTTP links in an Android app and have them stream to my TV via Chromecast.
I've looked at some of the sample apps on the Google Cast GitHub and couldn't find any examples.
Thanks
HLS is supported, so to accommodate an HLS/m3u8 playlist with the Android Chromecast API you might consider mapping the m3u8 entries into a media list of MediaInfo objects and then using the Cast Companion Library (CCL) calls to play a MediaInfo entry from the new list.
Details of building a new list entry and adding it to the list:
mediaList = new ArrayList<MediaInfo>();
JSONObject jsonObj = new VideoProvider().parseUrl(url);
JSONArray categories = jsonObj.getJSONArray(TAG_RESULTS);
if (null != categories) {
    for (int i = 0; i < categories.length(); i++) {
        JSONObject category = categories.getJSONObject(i);
        String title = category.getString(TAG_MSG);
        if (title.length() > 25) title = title.substring(0, 24);
        String subTitle = category.getString(TAG_MSG);
        JSONObject media3 = category.getJSONObject(TAG_MEDIA3);
        String videoUrl = media3.getString(TAG_URL);          // the HLS/m3u8 stream URL
        JSONObject media1 = category.getJSONObject(TAG_MEDIA1);
        String bigImageurl = media1.getString(TAG_URL);
        JSONObject media4 = category.getJSONObject(TAG_MEDIA4);
        String imageurl = media4.getString(TAG_URL);
        String studio = category.getJSONObject(TAG_CREATEDBY).getString(TAG_USERNAME);
        mediaList.add(buildMediaInfo(title, studio, subTitle, videoUrl, imageurl,
                bigImageurl));
    }
}
The above produces a kind of 'media bundle' that can be provided to one of the several overloaded startCastControllerActivity(...) calls in the CCL class VideoCastManager.
Take a good look at the /*== VideoCastControllerActivity management ==*/ section in that class; it may help you out.
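For reference, the buildMediaInfo(...) helper used in the loop above isn't shown; a minimal sketch using the Cast SDK's MediaInfo/MediaMetadata builders might look like the following (the method body and the HLS content type are assumptions modeled on the CastVideos sample, not the original answer's code):

private MediaInfo buildMediaInfo(String title, String studio, String subTitle,
        String url, String imgUrl, String bigImageUrl) {
    MediaMetadata movieMetadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
    movieMetadata.putString(MediaMetadata.KEY_TITLE, title);
    movieMetadata.putString(MediaMetadata.KEY_SUBTITLE, subTitle);
    movieMetadata.putString(MediaMetadata.KEY_STUDIO, studio);
    movieMetadata.addImage(new WebImage(Uri.parse(imgUrl)));
    movieMetadata.addImage(new WebImage(Uri.parse(bigImageUrl)));
    return new MediaInfo.Builder(url)
            .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)     // use STREAM_TYPE_LIVE for live HLS
            .setContentType("application/x-mpegurl")           // assumed HLS content type
            .setMetadata(movieMetadata)
            .build();
}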
How to play multiple videos one after another in Xamarin cross-platform (C#)?
I have tried using a list and an array, but the problem is that only the last video gets played; the rest of the videos just don't play.
MediaQueue mq = new MediaQueue();
MediaFile mf = new MediaFile();

if (PlayStopButtonText.Text == "Play")
{
    string videoUrl1 = "https://archive.org/download/BigBuckBunny_328/BigBuckBunny_512kb.mp4";
    string videoUrl = "https://sec.ch9.ms/ch9/e68c/690eebb1-797a-40ef-a841-c63dded4e68c/Cognitive-Services-Emotion_high.mp4";
    //CrossMediaManager.Current.Play(videoUrl, MediaFileType.Video, ResourceAvailability.Remote);
    //CrossMediaManager.Current.PlayNext();

    // Note: the same MediaFile instance is reused for both queue entries.
    mf.Url = videoUrl1;
    mf.Type = MediaFileType.Video;
    mq.Insert(0, mf);

    mf.Url = videoUrl;
    mf.Type = MediaFileType.Video;
    mq.Insert(1, mf);

    foreach (var item in mq)
    {
        // Each iteration immediately starts a new playback, so only the last Play call "wins".
        CrossMediaManager.Current.Play(item.Url, MediaFileType.Video);
    }
}
We have a native Android app that uses WebRTC, and we need to find out what video codecs are supported by the host device. (VP8 is always supported but H.264 is subject to the device having a compatible chipset.)
The idea is to create an offer and get the supported video codecs from the SDP. We can do this in a web app as follows:
const pc = new RTCPeerConnection();
if (pc.addTransceiver) {
    pc.addTransceiver('video');
    pc.addTransceiver('audio');
}
pc.createOffer(...);
Is there a way to do something similar on Android? It's important that we don't need to request camera access to create the offer.
Create a VideoEncoderFactory object and call getSupportedCodecs(). This will return a list of codecs that can be used. Be sure to initialize the PeerConnectionFactory first.
// Initialize the PeerConnectionFactory before querying codec support.
PeerConnectionFactory.InitializationOptions initializationOptions =
        PeerConnectionFactory.InitializationOptions.builder(this)
                .setEnableVideoHwAcceleration(true)
                .createInitializationOptions();
PeerConnectionFactory.initialize(initializationOptions);

// An EglBase context is needed for the hardware-accelerated encoder factory.
EglBase eglBase = EglBase.create();
VideoEncoderFactory videoEncoderFactory =
        new DefaultVideoEncoderFactory(eglBase.getEglBaseContext(),
                true /* enableIntelVp8Encoder */, true /* enableH264HighProfile */);

for (VideoCodecInfo codec : videoEncoderFactory.getSupportedCodecs()) {
    Log.d("Codecs", "Supported codec: " + codec.name);
}
I think this is what you are looking for:
// Lists every codec available on the device (not only those usable by WebRTC).
private static void codecs() {
    MediaCodecInfo[] codecInfos = new MediaCodecList(MediaCodecList.ALL_CODECS).getCodecInfos();
    for (MediaCodecInfo codecInfo : codecInfos) {
        Log.i("Codec", codecInfo.getName());
        for (String supportedType : codecInfo.getSupportedTypes()) {
            Log.i("Codec", supportedType);
        }
    }
}
You can check the examples at https://developer.android.com/reference/android/media/MediaCodecInfo.html
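Since the question is specifically about H.264 availability, a small follow-up sketch (my addition, not part of either answer; the helper name is arbitrary) narrows that codec list to H.264 encoders:

private static boolean hasH264Encoder() {
    MediaCodecInfo[] codecInfos = new MediaCodecList(MediaCodecList.ALL_CODECS).getCodecInfos();
    for (MediaCodecInfo codecInfo : codecInfos) {
        if (!codecInfo.isEncoder()) {
            continue;                              // only encoders matter for sending H.264
        }
        for (String type : codecInfo.getSupportedTypes()) {
            if (type.equalsIgnoreCase("video/avc")) {
                return true;                       // at least one H.264 encoder is present
            }
        }
    }
    return false;
}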
I am writing an Android app to access Google Photos through Google Drive via the CloudRail service. I am able to authenticate to the Google account in question and see all my files/folders in Google Drive, but I can't access photos from Google Photos.
While browsing through the Drive API documentation, I noticed that it makes a reference to spaces; specifically, three spaces are defined: drive, photos and appDataFolder.
Where do I specify the space I am interested in? By default, the drive space is being accessed, even though I specifically requested the photos scope:
https://www.googleapis.com/auth/drive.photos.readonly
When the Google authentication page opens in the mobile browser, it states that my app wants to gain access to the user's Google Photos, and I grant this access. But when calling the CloudRail service to get the children, no photos are visible:
googledriveChildren = mGoogledriveService.getChildren("/");        // returns Google Drive top-level files/folders
googledriveChildren = mGoogledriveService.getChildren("/photos");  // generates a NotFoundException
I have already been down this path and achieved the integration, with help and guidance from the folks at CloudRail. You should note that my integration is limited to reading/downloading from Google Photos. I have not found any way to write/upload, nor have I found any way of reading the album structure that can be set up in Google Photos.
First, you need to include the scope for Google Photos. I did this as follows:
public static final String GOOGLE_PHOTOS_SCOPE = "https://www.googleapis.com/auth/drive.photos.readonly";

private final AtomicReference<CloudStorage> googlephotos = new AtomicReference<>();

List<String> scope = new ArrayList<>();
scope.add(My_Constants.GOOGLE_PHOTOS_SCOPE);
googlephotos.set(new GoogleDrive(context, google_client_id, "", Get.GetString(R.string.google_redirect_uri),
        Get.GetString(R.string.google_authentication_state), scope));
((GoogleDrive) googlephotos.get()).useAdvancedAuthentication();
You then need to build a CloudRail advancedRequest to download whatever data you want. I download the metadata I require as follows:
// The service passed into the search method:
CloudStorage service = googlephotos.get();

private void searchForGooglePhotos(final CloudStorage service) throws Throwable {
    GoogleDrive google_drive = (GoogleDrive) service;
    boolean more = true;
    String pageToken = null;
    while (more) {
        StringBuilder builder = new StringBuilder();
        String query = URLEncoder.encode("mimeType='image/jpeg' and trashed = false", "utf-8");
        builder.append("/files?spaces=photos");
        if (pageToken != null) {
            builder.append("&pageToken=");
            builder.append(pageToken);
        }
        builder.append("&q=");
        builder.append(query);
        builder.append("&fields=nextPageToken,files(id,name,modifiedTime,description,size," +
                "imageMediaMetadata(height,rotation,width,time))");
        AdvancedRequestSpecification specification = new AdvancedRequestSpecification(builder.toString());
        AdvancedRequestResponse response = google_drive.advancedRequest(specification);
        @SuppressWarnings("unchecked")
        Map<String, Object> resultObjectMap = (Map<String, Object>) response.getBodyJsonParsed();
        pageToken = (String) resultObjectMap.get("nextPageToken");
        @SuppressWarnings("unchecked")
        ArrayList<Map<String, Object>> filesObjectMap = ((ArrayList<Map<String, Object>>) resultObjectMap.get("files"));
        for (Map<String, Object> fileObjectMap : filesObjectMap) {
            // process downloaded files
        }
        more = (pageToken != null);
    }
}
Subsequently in my app I use Glide to download the photos themselves when required. In the Glide DataFetcher I obtain the input stream using:
if (model.getSourceRecord().isTypeGooglePhotos()) {
    AdvancedRequestSpecification specification;
    AdvancedRequestResponse response;
    if (model.getIsThumbnail()) {
        // Ask Drive for the thumbnailLink of this file, then fetch that link directly.
        specification = new AdvancedRequestSpecification("/files" + model.getSourceId() +
                "?spaces=photos&fields=thumbnailLink");
        response = ((GoogleDrive) service).advancedRequest(specification);
        @SuppressWarnings("unchecked")
        Map<String, Object> parsed = (Map<String, Object>) response.getBodyJsonParsed();
        String link = (String) parsed.get("thumbnailLink");
        specification = new AdvancedRequestSpecification(link);
        specification.disableBaseUrl();
    } else {
        // Full-size photo: download the file content itself.
        specification = new AdvancedRequestSpecification("/files" + model.getSourceId() + "?spaces=photos&alt=media");
    }
    response = ((GoogleDrive) service).advancedRequest(specification);
    input_stream = response.getBodyAsStream();
} else {
    if (model.getIsThumbnail()) {
        input_stream = service.getThumbnail(model.getSourceId());
    } else {
        input_stream = service.download(model.getSourceId());
    }
}
Here, "model" contains various info associated with each photo. The sourceId comes from the "id" downloaded:
String source_id = java.io.File.separator + fileObjectMap.get("id");
I hope this helps.
Would anyone arriving at this question/response please note that, as of mid-January 2018, Google have "sunset" (sic) the photos space (spaces=photos above). This means that the above solution no longer works.
From the Google REST API documentation: "The photos space will sunset in early January 2018. Your users can continue to access Google Photos via the drive space by enabling the Google Photos folder in My Drive in the Drive client settings."
Ugh!
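If a user has enabled the Google Photos folder in their Drive settings, the same advancedRequest approach may still be worth trying against the drive space instead. This is only a sketch based on the quoted documentation, reusing the names from the code above; I have not verified it:

// Untested sketch: query the drive space now that the photos space has been sunset.
StringBuilder builder = new StringBuilder();
builder.append("/files?spaces=drive");
builder.append("&q=").append(URLEncoder.encode("mimeType='image/jpeg' and trashed = false", "utf-8"));
builder.append("&fields=nextPageToken,files(id,name,modifiedTime)");
AdvancedRequestSpecification specification = new AdvancedRequestSpecification(builder.toString());
AdvancedRequestResponse response = google_drive.advancedRequest(specification);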
In my application, YouTube videos were playing perfectly in-app. But about two days ago I started getting the alert message "Sorry, this video cannot be played" and the videos stopped playing. I have tried different YouTube video links, but with no luck. If I use this code:
Intent browserIntent = new Intent(Intent.ACTION_VIEW,
Uri.parse("http://www.youtube.com/embed/Ai47z6qh8S0"));
startActivity(browserIntent);
the videos play in the browser. But I need the video to be played inside the application.
Previously I used the following code to build the YouTube stream URL:
public static String calculateYouTubeUrl(String pYouTubeFmtQuality, boolean pFallback,
        String pYouTubeVideoId) throws IOException,
        ClientProtocolException, UnsupportedEncodingException {

    String lUriStr = null;
    HttpClient lClient = new DefaultHttpClient();
    HttpGet lGetMethod = new HttpGet(OpenYouTubePlayerActivity.YOUTUBE_VIDEO_INFORMATION_URL +
            pYouTubeVideoId);
    HttpResponse lResp = null;
    lResp = lClient.execute(lGetMethod);
    ByteArrayOutputStream lBOS = new ByteArrayOutputStream();
    String lInfoStr = null;
    lResp.getEntity().writeTo(lBOS);
    lInfoStr = new String(lBOS.toString("UTF-8"));
    String[] lArgs = lInfoStr.split("&");
    Map<String, String> lArgMap = new HashMap<String, String>();
    for (int i = 0; i < lArgs.length; i++) {
        String[] lArgValStrArr = lArgs[i].split("=");
        if (lArgValStrArr != null) {
            if (lArgValStrArr.length >= 2) {
                lArgMap.put(lArgValStrArr[0], URLDecoder.decode(lArgValStrArr[1]));
            }
        }
    }

    // Find out the URI string from the parameters

    // Populate the list of formats for the video
    String lFmtList = URLDecoder.decode(lArgMap.get("fmt_list"));
    ArrayList<Format> lFormats = new ArrayList<Format>();
    if (null != lFmtList) {
        String lFormatStrs[] = lFmtList.split(",");
        for (String lFormatStr : lFormatStrs) {
            Format lFormat = new Format(lFormatStr);
            lFormats.add(lFormat);
        }
    }

    // Populate the list of streams for the video
    String lStreamList = lArgMap.get("url_encoded_fmt_stream_map");
    if (null != lStreamList) {
        String lStreamStrs[] = lStreamList.split(",");
        ArrayList<VideoStream> lStreams = new ArrayList<VideoStream>();
        for (String lStreamStr : lStreamStrs) {
            VideoStream lStream = new VideoStream(lStreamStr);
            lStreams.add(lStream);
        }

        // Search for the given format in the list of video formats;
        // if it is there, select the corresponding stream,
        // otherwise if fallback is requested, check for the next lower format.
        int lFormatId = Integer.parseInt(pYouTubeFmtQuality);
        Format lSearchFormat = new Format(lFormatId);
        while (!lFormats.contains(lSearchFormat) && pFallback) {
            int lOldId = lSearchFormat.getId();
            int lNewId = getSupportedFallbackId(lOldId);
            if (lOldId == lNewId) {
                break;
            }
            lSearchFormat = new Format(lNewId);
        }
        int lIndex = lFormats.indexOf(lSearchFormat);
        if (lIndex >= 0) {
            VideoStream lSearchStream = lStreams.get(lIndex);
            lUriStr = lSearchStream.getUrl();
        }
    }

    // Return the URI string. It may be null if the format (or a fallback format if enabled)
    // is not found in the list of formats for the video.
    return lUriStr;
}
Please help me to figure out this issue.
Thanks Santanu
This kind of YouTube link won't work in an Android application, although it works fine in browsers. To make it work in an application you need to get the RTSP link of the video first.
Refer to this thread and see if you can find a solution there.
Example of a YouTube link working on Android (I have tested this one):
rtsp://v2.cache6.c.youtube.com/CjgLENy73wIaLwn_aij76iwMRRMYESARFEIJbXYtZ29vZ2xlSARSB3JlbGF0ZWRgnfqw4Jajx8xPDA==/0/0/0/video.3gp
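Once you have an RTSP (or other direct) stream URL, a minimal sketch for playing it inside the app with a VideoView might look like this; the view id and the rtspUrl variable are assumptions for illustration, not from the original answer:

final VideoView videoView = (VideoView) findViewById(R.id.video_view); // hypothetical view id
videoView.setMediaController(new MediaController(this));
videoView.setVideoURI(Uri.parse(rtspUrl));  // the RTSP link obtained as described above
videoView.requestFocus();
videoView.start();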
<iframe width="637" height="358" src="http://www.youtube.com/embed/Ai47z6qh8S0?fs=1&feature=oembed" frameborder="0" allowfullscreen=""></iframe>
Try this, or make sure you have hardware acceleration turned on.
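For reference, hardware acceleration can be enabled application-wide in the manifest; a minimal sketch (the exact placement in your project is an assumption):

<application
    android:hardwareAccelerated="true"
    android:label="@string/app_name">
    <!-- your activities go here -->
</application>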
Intent i = new Intent(Intent.ACTION_VIEW);
i.setData(Uri.parse("Video url"));
VideoActivity.this.startActivity(i);

// And add the permissions below in your AndroidManifest.xml file:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
There are some PDF files in my D: drive. In my Android application I have to return the list of files from a servlet to my activity class, but my code returns the names of all the files as one string. I need each name separately. How do I do that?
You are using HTTP to pass data from your servlet to your Android application. HTTP doesn't care about the content of the response, so there is no standard way of defining the structure of the data you are passing. You are pretty much free to do as you please.
One option therefore would be to put your filenames into a comma-separated list in the servlet. In your Android application you then chop the String apart into an array of Strings, each representing one file name:
String[] filenames;
filenames = responseString.split(",");
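For completeness, a minimal sketch of the servlet side that produces such a comma-separated response could look like the following; the directory path and the PDF filter are assumptions, not taken from the question:

// Sketch only: list PDF file names from a directory and write them comma-separated.
protected void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException {
    File dir = new File("D:/pdfs");  // assumed location of the PDF files
    String[] names = dir.list((d, name) -> name.toLowerCase().endsWith(".pdf"));
    response.setContentType("text/plain");
    response.getWriter().write(names == null ? "" : String.join(",", names));
}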
That's the easy way. You could also create XML from your filenames in your servlet, and parse the XML in your Android application. That would look nicer, but it is more work.
Use JSONObject
JSONObject jsEmployeeObj = new JSONObject();
JSONArray empArray = new JSONArray();
while (condition) {                               // e.g. while (resultSet.next())
    JSONObject jsObject = new JSONObject();
    jsObject.accumulate("name", res.getString("name"));
    empArray.put(jsObject);                       // org.json uses put(), not add()
}
if (empArray.length() > 0) {
    jsEmployeeObj.put("employeeList", empArray);  // put() avoids accumulate() wrapping the array again
} else {
    jsEmployeeObj.put("employeeList", "empty");
}
String output = jsEmployeeObj.toString();
Return that string from the servlet; you can then parse the JSON in Android.
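On the Android side, a minimal sketch for parsing that response with org.json, assuming the body has already been read into responseString, might be:

try {
    JSONObject employeeObj = new JSONObject(responseString);
    Object list = employeeObj.get("employeeList");
    if (list instanceof JSONArray) {
        JSONArray empArray = (JSONArray) list;
        for (int i = 0; i < empArray.length(); i++) {
            String name = empArray.getJSONObject(i).getString("name"); // one file name per entry
        }
    }
    // If the servlet sent "empty", list is just the String "empty" rather than an array.
} catch (JSONException e) {
    Log.e("Files", "Could not parse server response", e);
}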