Does anyone know a method to communicate commands and receive events from flash objects using Android?
Flash media objects are quite handy for handling online media streams, and I am wondering whether we can control those objects to play media from an Android app.
A typical case would be: a Flash object embedded in a WebView, with the WebView loaded by the Android application. The application sends "play" commands to the object to start playing the stream and receives "finish" events when the streamed media reaches its end.
Create a WebView in your main activity class (or anywhere):
WebView mainW = new WebView(this);
Enable the features your WebView needs:
mainW.getSettings().setPluginsEnabled(true);   // deprecated on later API levels
mainW.getSettings().setAllowFileAccess(true);
mainW.getSettings().setPluginState(WebSettings.PluginState.ON);
Place your HTML file (the one that embeds the SWF) somewhere and load its location in the WebView:
mainW.loadUrl(url);
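To actually send commands into the SWF and get events back, one common route is a JavaScript bridge: the page exposes a small JS function that forwards calls to the Flash movie via ExternalInterface, and the movie reports events to a JS callback, which in turn calls an object you exposed with addJavascriptInterface. A rough Java sketch only; the bridge name "AndroidBridge" and the page-side playStream() helper are assumptions, not part of any standard API, and mainW/url are the variables from above:

import android.webkit.JavascriptInterface;
import android.webkit.WebView;

public class FlashBridge {
    private final WebView webView;

    public FlashBridge(WebView webView) {
        this.webView = webView;
    }

    // called from the page's JavaScript, which the SWF reaches via ExternalInterface
    @JavascriptInterface   // required on API 17+ for methods exposed to JS
    public void onFinish() {
        // the SWF reached the end of the stream; update your UI on the main thread
    }

    // ask the page to forward a "play" command to the SWF
    public void play() {
        webView.loadUrl("javascript:playStream()");
    }
}

// wiring, after the WebView settings above:
// mainW.getSettings().setJavaScriptEnabled(true);
// mainW.addJavascriptInterface(new FlashBridge(mainW), "AndroidBridge");
// mainW.loadUrl(url);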
Related
--First off I am aware this is invasive, I'm trying to make an alarm--
I'm looking to create an Android app that sets an alarm which plays Twitch audio without any interaction from the user. For the alarm part I'm using AlarmManager.
However, I am struggling to get Twitch to autoplay on the locked phone screen when the alarm triggers.
My two leading ideas for how to do this (on Android) are:
Idea #1: Embed the Twitch player in a webpage that is shown on the lock screen when the alarm triggers:
Twitch Embed
Fullscreen intent
However, the player is muted and I haven't been able to find a way around this (it doesn't seem to be supported):
Similar issue #1: https://discuss.dev.twitch.tv/t/i-am-unable-to-mute-unmute-video-clips-in-twitch-preview/31585
Similar issue #2: https://discuss.dev.twitch.tv/t/how-to-unmute-clips-on-load/26054/2
Idea #2: Just launch a stream using the Twitch app while the screen is locked. This is how I currently start a stream for YouTube audio, and it launches fine in the background while the phone is locked. I know the Twitch Android app has some functionality for this, since you can keep a stream playing without the app open by changing settings in a stream. However, if I try to launch a stream using Twitch while the screen is locked, as far as I can tell the video doesn't load until the user unlocks the phone.
So: is there a way to play Twitch audio without interaction from the user while the phone is asleep?
Thanks
Edit: added code as text.
I managed to solve this using idea #1 (full-screen intent with an Android WebView). However, instead of embedding the native Twitch player, I used the WebView to open the Twitch website directly. The important part to make the video not muted was to set the mediaPlaybackRequiresUserGesture flag to false. Code is below:
// extras passed in from the alarm; RingtoneOption is my own class
val rO = intent.getSerializableExtra("RingtoneOption") as RingtoneOption
val url = rO.liveContentURL
setContentView(R.layout.activity_full_screen_alarm)
turnScreenOnAndKeyguardOff()
val myWebView = findViewById<WebView>(R.id.webview)
val webSettings = myWebView.settings
webSettings.javaScriptEnabled = true
// the key line: allow media to start without a user gesture
webSettings.mediaPlaybackRequiresUserGesture = false
myWebView.webViewClient = WebViewClient()
myWebView.webChromeClient = WebChromeClient()
myWebView.loadUrl(url)
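For context, the activity above is shown over the lock screen via a full-screen intent notification together with the poster's own turnScreenOnAndKeyguardOff() helper. A rough Java sketch of that notification side; the channel id, icon, request code and FullScreenAlarmActivity name are placeholders, not the poster's actual code:

// context and ringtoneOption are assumed to be available here.
// On API 29+ the app also needs android.permission.USE_FULL_SCREEN_INTENT.
Intent alarmIntent = new Intent(context, FullScreenAlarmActivity.class);
alarmIntent.putExtra("RingtoneOption", ringtoneOption);

PendingIntent fullScreenPendingIntent = PendingIntent.getActivity(
        context, 0, alarmIntent,
        PendingIntent.FLAG_UPDATE_CURRENT | PendingIntent.FLAG_IMMUTABLE);

Notification notification = new NotificationCompat.Builder(context, "alarm_channel")
        .setSmallIcon(R.drawable.ic_alarm)                    // any small icon in the app
        .setContentTitle("Alarm")
        .setCategory(NotificationCompat.CATEGORY_ALARM)
        .setPriority(NotificationCompat.PRIORITY_HIGH)
        .setFullScreenIntent(fullScreenPendingIntent, true)   // launches the activity over the lock screen
        .build();

NotificationManagerCompat.from(context).notify(1001, notification);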
I have an Android app that plays some video content. The video is MP4 with some simple custom encryption.
On Android, the player (ExoPlayer) decrypts the video in real time while playing.
It uses code like this:
// overriding the function that reads the video file to insert the decryption
@Override
public int read(byte[] buffer, int offset, int readLength) throws FileDataSourceException {
    // ...
    // buffer[] holds the video bytes; decrypt them here
    buffer[offset] = (byte) (buffer[offset] ^ 1234);
    // ...
}
I now want to add support for Chromecast, so I can stream the video from a mobile phone to a TV.
However, looking at the API, I cannot see a way to plug in my decryption algorithm.
From what I can see, it supports either unencrypted videos or videos protected with one of the standard DRM schemes.
Is it possible to implement a custom encryption, similar to the code above?
The new CAF framework provides three different options:
Styled Media Receiver
Custom Receiver
Default Media Receiver
The only one that supports DRM is the Custom Receiver, and as you say it is designed for the standard DRMs.
However, it should support CENC clear key, which may be enough protection for your needs and will let you avoid using a DRM service.
CENC clear key carries the key in the clear, as the name suggests. It is not very secure, but it may be enough of a 'hurdle' (which is essentially what most security systems are) for your case.
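If you go the Custom Receiver route, the Android sender side stays fairly standard; anything the receiver needs (a key id or licence URL, say) can ride along in customData. A rough sender-side sketch in Java, where the JSON key and URLs are invented and the CENC clear key handling itself would live in the custom receiver's JavaScript, not here:

// classes come from the Cast SDK (com.google.android.gms.cast / cast.framework.media)
JSONObject customData = new JSONObject();
try {
    customData.put("licenseUrl", "https://example.com/clearkey");   // placeholder key name and URL
} catch (JSONException e) {
    // not expected for a simple string put
}

MediaInfo mediaInfo = new MediaInfo.Builder("https://example.com/stream.mpd")
        .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
        .setContentType("application/dash+xml")
        .setCustomData(customData)
        .build();

// castSession comes from the CastContext's SessionManager; getRemoteMediaClient() can return null
RemoteMediaClient remoteMediaClient = castSession.getRemoteMediaClient();
remoteMediaClient.load(new MediaLoadRequestData.Builder()
        .setMediaInfo(mediaInfo)
        .build());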
I have a small app that plays sequential sounds (a teaching app playing the syllables of a word).
This could be accomplished by firing an event right after each sound stops playing. Something like:
var sounds = new Array(new Audio("1.mp3"), new Audio("2.mp3"));
var i = -1;
playSnd();

function playSnd() {
    i++;
    if (i == sounds.length) return;
    sounds[i].addEventListener('ended', playSnd);
    sounds[i].play();
}
(source)
However, Android Chrome has now implemented new restrictions on how sound can be played: playback must be triggered by a user action.
So, when I run code very similar to the above, the first sound plays, and then I get
Uncaught (in promise) DOMException: play() can only be initiated by a user gesture.
How can a sequence of sounds, determined at run time, be played on Android's Chrome?
To start with, Google Chrome on Android has long had the limitation of not allowing pages to play HTML audio without an explicit action by the user. This differs from how the stock browser handles it in most cases.
The reason, as the Chromium project puts it, is that autoplay is not honoured on Android because it costs the user data.
You may find more details on that here.
Apart from the wasted bandwidth, this also makes some sense, since mobile devices are used in public and at home, where unsolicited sound from random websites can be a nuisance.
However, in later versions this decision was reversed and Chrome on Android started allowing autoplay of HTML audio and video. After another round of reviews and discussions, the change was reverted, again making a user action mandatory to start HTML audio and video in Chrome for Android.
Here is something more that I found on the topic. As it says, the stated reason was that "We're going to gather some data about how users react to autoplaying videos in order to decide whether to keep this restriction". And hence playback without a user action was reverted.
You can also find more about the blocking of autoplay of audio and video on Forbes and The Verge.
However, here is something you can try that should achieve what you intend. Copy this URL and paste it into Chrome for Android's address bar; it takes you to the flag that, by default, prevents HTML audio and video from playing without user interaction:
chrome://flags/#disable-gesture-requirement-for-media-playback
OR
about:flags/#disable-gesture-requirement-for-media-playback
If the above doesn't work for you, you can do this instead:
Go to chrome://flags (or about:flags, which redirects there) and enable the "Disable gesture requirement for media playback" option, which is the same flag as the URLs above.
I have a problem: when I play an HTML5 <video> element in my WebView, LogCat shows several error messages like this one:
E/MediaPlayer(17981): mOnBufferingUpdateListener is null.
Failed to send MEDIA_BUFFERING_UPDATE message.
I understand that several listeners are not implemented yet, but I am not able to get hold of the player instance.
EDIT
I am on ICS (4.0.3) and I have already tried this solution:
WebView and HTML5 <video>
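For anyone landing here, the usual ingredient for HTML5 <video> inside a WebView on those API levels is a WebChromeClient that handles onShowCustomView/onHideCustomView (together with plugin and hardware-acceleration settings). A rough Java sketch, not necessarily the linked solution; webView and videoContainer are assumed to be fields pointing at views in the layout:

webView.getSettings().setJavaScriptEnabled(true);
webView.setWebChromeClient(new WebChromeClient() {
    private View customView;

    @Override
    public void onShowCustomView(View view, CustomViewCallback callback) {
        // the <video> element goes "full screen": swap the WebView for the video view
        customView = view;
        videoContainer.addView(view);
        webView.setVisibility(View.GONE);
        videoContainer.setVisibility(View.VISIBLE);
    }

    @Override
    public void onHideCustomView() {
        // playback ended or the user backed out: restore the WebView
        videoContainer.removeView(customView);
        customView = null;
        videoContainer.setVisibility(View.GONE);
        webView.setVisibility(View.VISIBLE);
    }
});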
I have an Android app that opens a file using an external application (audio, video files launch the default media player, etc.)
I need to be able to catch any errors that occur while launching the external application. For example, when I launch a .au (audio) file from my app and the media player app says "sorry, the player does not support this type of file", I would like to catch that error somehow.
Is there a way for my app to detect that error? I tried using startActivityForResult and overriding onActivityResult, but external apps don't seem to return a data Intent or result code; both are null.
It looks like there is no way to detect errors from an external app. My app will have to rely on user input as to whether or not the file launched successfully.
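The one failure you can still catch up front is "no installed app handles this type at all". A small hedged Java sketch; the MIME type and file variable are placeholders, and on API 24+ the Uri would normally come from a FileProvider rather than Uri.fromFile():

Intent viewIntent = new Intent(Intent.ACTION_VIEW);
viewIntent.setDataAndType(Uri.fromFile(file), "audio/basic");   // ".au" audio, for example

if (viewIntent.resolveActivity(getPackageManager()) == null) {
    // no installed app claims to handle this type
} else {
    try {
        startActivity(viewIntent);
    } catch (ActivityNotFoundException e) {
        // the handler disappeared between the check and the launch
    }
}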