How does the Twitch application stream to Android?

I'm currently working on a project to develop an e-sport streaming calendar for a company. The app works fine, but the problem is that Twitch only streams Flash, and on Android that no longer works after Google's decision to remove Flash support. HTTP Live Streaming (HLS) isn't very well supported either, so the group is currently at a dead end.
My question is therefore: How does the Twitch application stream to Android?
It works on Android devices that don't support Flash or HLS, so there must be another way to do it.

My guess is that Twitch probably already uses HLS or RTSP inside its Flash client (RTMP and RTSP are the most common scenarios); the Android app is merely another stream client implementation.
As for HLS, it doesn't need any kind of native support to work on Android - it's just plain HTTP, and you can even write your own implementation if you want. Android's native MediaPlayer API already provides one. The same goes for RTMP and RTSP.
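To illustrate how simple the transport is, here is a rough sketch of pulling the segment list out of an HLS media playlist in JavaScript. The playlist layout is a made-up minimal example; a real client would also need to handle master playlists, encryption, and live playlist reloads:

```javascript
// Minimal sketch of parsing an HLS media playlist (.m3u8). Assumes a
// simple playlist with #EXTINF entries; illustrative only.
function parseM3u8(text, baseUrl) {
  const lines = text.split(/\r?\n/).map((l) => l.trim());
  if (lines[0] !== '#EXTM3U') throw new Error('not an M3U8 playlist');
  const segments = [];
  let duration = null;
  for (const line of lines.slice(1)) {
    if (line.startsWith('#EXTINF:')) {
      // "#EXTINF:<duration>,<title>" - grab the duration in seconds
      duration = parseFloat(line.slice(8).split(',')[0]);
    } else if (line && !line.startsWith('#')) {
      // The non-comment line after #EXTINF is the segment URI
      segments.push({ url: new URL(line, baseUrl).href, duration });
      duration = null;
    }
  }
  return segments;
}
```

A client would then fetch each segment URL over plain HTTP and feed the data to a decoder.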
As for your problem, there are two ways I can think of to solve it:
Get a router that supports packet sniffing (for example, one flashed with OpenWrt and with tcpdump installed), reverse-engineer the URL and protocol the Twitch Android client uses, and then use them in your app.
pros: no dependency on the Twitch app itself
cons: harder to pull off; may break if Twitch changes its internal protocol
Reverse-engineer the Intent the Twitch app passes to its video player Activity, and construct one of your own to let users open that player and watch the stream.
tools you may find useful: https://play.google.com/store/apps/details?id=uk.co.ashtonbrsc.android.intentintercept
pros: more reliable and more consistent
cons: may not work if the Intent is private; depends on the user having the Twitch app installed
UPDATE:
I just found out that the Twitch website also works in the Android native browser. It seems to use the HTML5 <video> tag, so the simplest solution could be to wrap that stream page in a WebView, but that's not great for the user experience.
Alternatively, you could write server-side code that accepts a stream page URL as a parameter and returns the video tag as output, using a regular expression, XPath, or an HTML parser library to extract the <video> tag for the client. The client app can then set up a WebView containing just that <video> tag. This approach keeps your app from breaking if Twitch changes its page structure, since only the server-side code needs updating.
Also, if you would rather not use a WebView, you can extract the src attribute of that <video> tag and play it with Android's native MediaPlayer API.
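A minimal sketch of that server-side extraction, assuming the stream page inlines a plain <video src="..."> tag. The real markup may differ, and a proper HTML parser would be more robust than a regex:

```javascript
// Hedged sketch: pull the src of the first <video> tag out of fetched
// HTML. The markup shape is an assumption; adjust the pattern (or use a
// real HTML parser) for the actual page structure.
function extractVideoSrc(html) {
  // Match the opening <video ...> tag, allowing other attributes
  const tag = html.match(/<video\b[^>]*>/i);
  if (!tag) return null;
  // Pull the src="..." attribute out of that tag, if present
  const src = tag[0].match(/\bsrc\s*=\s*["']([^"']+)["']/i);
  return src ? src[1] : null;
}
```

A small endpoint could fetch the page, run this, and hand the resulting URL (or a tiny HTML page containing just the <video> tag) to the app.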

Related

Play Domain Restricted Video in Android/iOS App

I am building an Android & iOS app that has a video player, and I am using a video hosting site (Wistia) for my videos. All the videos are domain-restricted, which means they will only play on an allowed domain. The videos play properly inside the web app (as we have allowed the video to be played on that domain), but I am not able to play them in my Android/iOS app.
Note: When I remove the domain restriction from a video, I am able to play it in my app.
Can someone help me find the domain of my Android app? Where should I define it in the code?
Below is the Wistia embedded code:
<script src="https://fast.wistia.com/embed/medias/j4q2kxdfd4.jsonp" async></script><script src="https://fast.wistia.com/assets/external/E-v1.js" async></script><span class="wistia_embed wistia_async_j4q2kxdfd4 popover=true popoverAnimateThumbnail=true" style="display:inline-block;height:84px;position:relative;width:150px"> </span>
Thank you.
Wistia is targeted at websites - they did have an iOS mobile app in the past, but this was aimed more at content owners, I believe, and is no longer supported either way.
They highlight this in their documentation (at time of writing):
Mobile OS Support
Most mobile devices only support HTML5 playback, which is Wistia’s default for mobile. This includes Android phones and tablets (4.1 and up), and iOS devices like iPhones and iPads.
To include Wistia in an app, the most recent approach I have seen recommended by Wistia is to use a WebView and the standard embed code. This allows you to keep the usual domain restrictions you have set.
The domain-checking feature most likely uses the 'Origin' or 'Referer' field of the HTTPS request to determine the site the embed code is being used on. It is possible it uses a more complex mechanism than this, but in that case I think you will have to contact Wistia directly and ask for support.
Assuming it is this mechanism, you can look at the request headers in a browser inspector. For example, taking a site that uses Wistia and looking at the requests, you will see something like this:
I've hidden the exact site name, but both the Origin and the Referer carry the same top-level domain name as the site hosting the videos.
The website on a mobile device will work the same way, but if you are using a WebView in an Android app you will need to set the fields yourself. You may need to experiment, as there seem to be different approaches, but this is a good starting point: https://stackoverflow.com/a/5342527/334402
If you set these headers to a domain that is included in your set of allowed domains and the video still will not play, then I think you will need to contact Wistia support directly.
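To make the mechanism concrete, here is a sketch of the kind of Origin/Referer check a hosting service could be doing, which also shows why a WebView that sends no Referer fails the restriction. This is an illustration of the idea only, not Wistia's actual implementation:

```javascript
// Hedged sketch of a server-side domain restriction based on the
// Origin/Referer request headers. Not Wistia's real logic.
function isAllowedRequest(headers, allowedDomains) {
  const source = headers.origin || headers.referer;
  // No Origin/Referer at all - e.g. a bare WebView request - is rejected
  if (!source) return false;
  let host;
  try {
    host = new URL(source).hostname;
  } catch (err) {
    return false; // malformed header value
  }
  // Allow the listed domain itself and any subdomain of it
  return allowedDomains.some((d) => host === d || host.endsWith('.' + d));
}
```

Under this model, setting the WebView's Referer to an allowed domain is exactly what makes the request pass.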

WebRTC onaddstream not called in native APIs

I am working with a video call plugin for an app. Everything works fine on the web, but on mobile, using React Native, there is only support for onaddstream and not ontrack. The call sets up fine, but during renegotiation, while converting an audio call to a video call, video from mobile to web is added successfully, while video from web to mobile is not, because the onaddstream event is not fired when tracks are added to an existing stream.
A workaround has been given in this link and this one, which is to return a new stream every time, with new track labels. If I do this and use addStream instead of addTrack, will it raise compatibility issues with other browsers? Also, since the peer connection is between the browser and Janus, or between mobile and Janus, does the workaround need to be implemented in Janus or in the other peer? Is there any other workaround I can use, or has someone implemented renegotiation with the older APIs, i.e. onaddstream?
If this is the only option, can someone guide me on how to change the labels for tracks and streams, either directly or by editing the SDP? The same solution is mentioned in this answer, but I am unable to find any relevant code.
Looking forward to your answers.
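For what the "new track labels" workaround might look like in code, here is a hedged sketch that rewrites the a=msid lines of an SDP before it is applied, so that a legacy onaddstream-only peer sees a brand-new stream on renegotiation. The IDs are made up, SDP munging is fragile, and real SDPs may also carry msid values inside a=ssrc lines that would need the same treatment:

```javascript
// Hedged sketch: give every track in an SDP a fresh stream id so a
// legacy onaddstream-only peer fires onaddstream on renegotiation.
// Illustrative only - SDP munging breaks easily across browser versions.
function renameMsid(sdp, newStreamId) {
  let trackIndex = 0;
  return sdp
    .split('\r\n')
    .map((line) => {
      if (line.startsWith('a=msid:')) {
        // "a=msid:<stream-id> <track-id>" - replace both ids
        trackIndex += 1;
        return `a=msid:${newStreamId} ${newStreamId}-track${trackIndex}`;
      }
      return line; // all other SDP lines pass through untouched
    })
    .join('\r\n');
}
```

The rewritten SDP would be fed to setRemoteDescription (or sent to the peer) in place of the original.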

What is the correct way developing website applications in Cordova?

I'm trying to build a simple website application that displays my website and adds some specific functionality to it.
My idea is to do something like the Facebook app for mobile. Simply put, I need to display a website and replace its file input: users should be able to capture a picture with the camera or pick pictures from the gallery (multiple select) and attach them to a post.
TL;DR:
Check the images at the bottom.
What I have tried:
Using Cordova with the Camera and Image Picker plugins and displaying the webpage in InAppBrowser.
Taking pictures with the camera, picking pictures from the gallery, and then uploading them to the server - there are a lot of examples of this.
What troubles I found:
InAppBrowser is forced fullscreen, so I cannot resize it and place buttons for picking pictures under it.
What I need:
I just need to somehow attach images (from the gallery or camera) to a form file input, or upload them to some kind of API instead - the API would process the images on the server and return some IDs which I could use instead of the file input in the form to attach the images to the post: some hidden input where I would just insert the IDs of the uploaded images (I'd write some if conditions into my PHP script).
I need my application to be multi-platform (Android, iOS, WP), which is why I'm using Apache Cordova. I've tried a lot of solutions and I've searched for something like 5 hours, but I wasn't able to find anything useful.
Does somebody have experience with this? Has somebody made that kind of application?
If you can suggest any solution (it doesn't have to be Cordova, but it must be multi-platform) I'd be glad!
Thanks for your time!
Images
Here is a screenshot of the desktop version with a normal file input:
Here is my vision of the mobile application version, with camera and image picker options right under the web browser:
I guess I was not clear. The technical answer is that Cordova/PhoneGap is not for creating website applications. This means technically there is no "correct way" to do what you are asking.
For a website application, all the pages are rendered by the website and controlled from the webpage/web browser.
For a mobile application, all the pages that the application can directly control are rendered on the mobile device. However, pages can be rendered (and/or created) by either the server or the mobile application, but control of a page stays with the side that rendered (or created) it. There is a clear line between the two sides that can be moved, but at the *peril* of the programmer. (There are no points for being clever here, only added security issues.)
However, Cordova and PhoneGap do have plugins. Their entire purpose is to make certain tasks easier. Still, there is a clear line between the phone and the website. To be clear on this last part, this means that all of the "plugin services" on the phone (accelerometer, contact list, etc.) are directly available to the application, and not to the website. However, some of the "services" are also available as HTML5 APIs, such as 'camera' and 'geolocation'; mixing the two is dangerous. The HTML5 APIs should remain on the webserver side, if used. The UX is different for HTML5. (I will not discuss the HTML5 APIs any further, as they are beyond the scope of this discussion.)
To make your idea work, you will need the following "core" (or equivalent third-party) plugins:
file-transfer
camera (or equivalent)
inappbrowser
With file-transfer and camera, you can do everything from the webserver if you want. Then the only task for the end user is to select the appropriate folder and image. If you do this from the server side, then you CANNOT use the plugins.
If you want to use the plugins, then you cannot use a server-side generated webpage. You must create the form on the mobile device. This means the page and the form reside on the mobile device. However, if you write your webpage correctly, you can dynamically add or delete elements, which means that on the mobile side you have control over every step of the user experience and can enhance it.
For the inappbrowser, a common trick is to put the website in an iframe. However, you have no direct control over the iframe. Another common trick is to submit to the server via an API, then have the visible webpage update separately. Another is to have a webpage with a WebSocket that handles the webpage update; this could also be done with a push to the webpage, or by having the webpage poll the server. Again, the app has NO direct control over the webpage.
This entire thread makes the following assumptions:
There is no "correct way" to do this task.
The images (photos) are stored on a website and are publicly available for viewing.
No HTML5 APIs will be used.
If I interpreted your problem statement correctly, I believe what you are looking for is access to native device services - camera & gallery - from your mobile website.
A solution that fits your design requirements would be for the browser to provide such services. Unfortunately, WebKit and other browsers limit such support to things like geoposition.
The way for Cordova to help you here is if your mobile website is a standalone HTML5/CSS/JS application that can use CORS XHR or WebSockets to communicate with independent web services.
If you can bottle your website into a set of static HTML/JS/CSS files that display content from dynamic web services, then you are set. That same JavaScript can then call navigator.camera.getPicture(success, fail, options) and use file-transfer to send the result to a waiting web service.
That camera API is not available to the InAppBrowser, just as it is not available to WebKit Chrome/Safari/Edge. Trying to control the mobile app via the InAppBrowser is most likely to fail due to security constraints.
What you might get away with is re-imagining your browser application as a series of discrete services that return raw HTML snippets suited to a new mobile app. Then write your Cordova app as the top-level container that manages navigation amongst the HTML snippets. This server-side rendering would be most useful if the work is challenging enough to overwhelm the mobile platform / web services pattern (think custom video server or expert system).
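As a rough sketch of the plugin route (camera, then file-transfer to a waiting web service): the upload URL and the server's JSON response below are assumptions, and the code must run after the 'deviceready' event with the cordova-plugin-camera and cordova-plugin-file-transfer plugins installed:

```javascript
// Hedged sketch of the camera + file-transfer flow from a Cordova app.
// The endpoint and response shape are hypothetical.

// Pure helper: derive an upload file name from a device file URI
function fileNameFromUri(uri) {
  return uri.substr(uri.lastIndexOf('/') + 1);
}

function captureAndUpload(uploadUrl, onDone, onError) {
  navigator.camera.getPicture(function (fileUri) {
    var options = new FileUploadOptions();
    options.fileKey = 'file';
    options.fileName = fileNameFromUri(fileUri);
    options.mimeType = 'image/jpeg';
    var ft = new FileTransfer();
    // The server is assumed to answer with JSON like {"id": 42}; the app
    // would then write that id into the hidden input of the web form.
    ft.upload(fileUri, encodeURI(uploadUrl), function (result) {
      onDone(JSON.parse(result.response));
    }, onError, options);
  }, onError, {
    quality: 80,
    destinationType: Camera.DestinationType.FILE_URI,
    sourceType: Camera.PictureSourceType.PHOTOLIBRARY // or .CAMERA
  });
}
```

This keeps the form logic on the mobile side, as the answers above require, while the website only exposes the upload service.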
#Jakub,
Cedric has essentially stated it plainly. I will restate: your understanding of Cordova/PhoneGap is not correct.
From: Top Mistakes by Developers new to Cordova/Phonegap
You have hit issue #5.
I QUOTE:
From Phonegap FAQ
A PhoneGap application may only use HTML, CSS, and JavaScript. However, you can make use of network protocols (XmlHTTPRequest, Web Sockets, etc) to easily communicate with backend services written in any language. This allows your PhoneGap app to remotely access existing business processes while the device is connected to the Internet.
In addition, Apple frowns on using apps as wrappers for websites.
Quote Apple iTunes Guidelines - 2.12
Apps that are not very useful, unique, are simply web sites bundled as Apps, or do not provide any lasting entertainment value may be rejected
To be clear, your idea may be valid, but you will likely need to rethink your internal workflow. You likely want to keep the same UI and UX.

Cordova Plugin to display audio information and controls on iOS and Android Lockscreen

I am looking for a plugin or plugins that would allow me to show audio metadata (title, artwork, etc.) on the lock screen of both iOS and Android.
Along with that, I would like to control the audio from the lock screen (play, pause, etc.; it already works on iOS, I need it to work on Android).
Lastly, the plugin should provide audio information in Control Center on iOS and in the notification area on Android, of course with the ability to play, pause, etc.
If it is easier to do with some native code tweaks, can someone please point me to a tutorial on how that can be done.
Below are some examples of what I am talking about:
This plugin does what you're looking for: https://github.com/homerours/cordova-music-controls-plugin
As of writing it has a few issues with iOS HTML5 audio (see https://github.com/homerours/cordova-music-controls-plugin/issues/97 and https://github.com/homerours/cordova-music-controls-plugin/issues/66) but works across iOS, Android, and Windows Phone.
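For reference, a minimal sketch of wiring that plugin up. The metadata values are placeholders, and the API and event names are taken from the plugin's README at time of writing - check it for the full option list:

```javascript
// Hedged sketch of cordova-music-controls-plugin usage; run after the
// 'deviceready' event with the plugin installed. Metadata is placeholder.
function showLockscreenControls(audio) {
  MusicControls.create({
    track: 'Track title',   // shown on the lock screen / notification
    artist: 'Artist name',
    cover: 'albums/cover.jpg',
    isPlaying: true,
    dismissable: true,
    hasPrev: false,
    hasNext: false
  }, null, null);
  MusicControls.subscribe(function (payload) {
    handleControlsEvent(audio, payload);
  });
  MusicControls.listen();
}

// Pure helper: map the plugin's JSON event payload to a player action
function controlsAction(payload) {
  const message = JSON.parse(payload).message;
  if (message === 'music-controls-play') return 'play';
  if (message === 'music-controls-pause') return 'pause';
  return null; // other events (next, previous, destroy, ...) ignored here
}

function handleControlsEvent(audio, payload) {
  const action = controlsAction(payload);
  if (action === 'play') { audio.play(); MusicControls.updateIsPlaying(true); }
  if (action === 'pause') { audio.pause(); MusicControls.updateIsPlaying(false); }
}
```

The same callback would also be the place to handle next/previous buttons if you enable them.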
You could also try the following, though I have never tried it myself. It uses the native players for iOS and Android, and will not work in Chrome or on your dev machine.
https://github.com/wnyc/cordova-plugin-playerhater
On my current project I have much the same need as you (I am using HTML5 audio) and have had to put together a number of different plugins.
iOS background audio:
https://github.com/AubreyHewes/cordova-background-audio
iOS metadata:
https://github.com/ChoiZ/NowPlaying
https://github.com/shi11/RemoteControls
I am working on a solution for modular Android audio notification controls using the webintent plugin, but I am far from finishing it. I'll post back here when it's done.

Android/Adobe AIR - is it possible to embed an AIR activity into an Android application?

I am working on a live-streaming (HLS) video player. The built-in Android VideoView does not have good support for live-streaming video below Jelly Bean 4.1/4.2, so I was looking into using Adobe AIR to take care of the streaming video. However, I have not found a way to embed an Adobe AIR activity inside my Android application - I am only able to create a separate AIR mobile application that I launch from my original application. This isn't really ideal, so I was wondering if there is a way to put them both into one application?
Also, on a side note, is there any way to pass data from an Intent to Adobe AIR? The only workaround I've seen for passing Intent data is to create a URL scheme intent filter and pass the extra data in the URL parameters, as outlined here (again, not ideal).
It is not possible, because Adobe AIR is not an activity. Adobe AIR creates applications.
