ActiveSessions from MediaController - android

How can I restore all (or most) of the session data of a user's activity through a media controller (android.media.session.MediaController), at least for YouTube and Google Play Music?
I can't get a pending intent from the media controller for Google Play Music (it is null). The intent from YouTube only restores the home screen (I want to restore the video and its position).
// mgr is a MediaSessionManager; TestService is a NotificationListenerService
List<MediaController> list = mgr.getActiveSessions(new ComponentName(getApplicationContext().getPackageName(), TestService.class.getName()));
...
PendingIntent pIntent = currentController.getSessionActivity(); // may be null
pIntent.send(); // throws PendingIntent.CanceledException if the intent has been canceled

This is discussed on the Android TV Developers community. Check out this post: https://plus.google.com/u/1/110913444113360071708/posts/HDs8UsyQwxL
From the MediaController you can get the session activity to launch into the app. If you are not seeing it, the app is not using MediaSession correctly; in that case you can just launch the app by its package name, which you can also get from the controller: mediaSessionController.getPackageName()
If you implemented OnActiveSessionsChangedListener correctly (which it sounds like you have if you made it this far), then the rest depends on the apps properly maintaining and updating their MediaSessions. Sadly, a lot of apps do not maintain their MediaSession 100% correctly: they may not update their metadata or playback state, or may fail to unregister their session properly. Each app may behave differently depending on how much it invested in MediaSession.
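As a minimal sketch of that fallback, assuming you already hold a MediaController (called controller here, an illustrative name) obtained from getActiveSessions() inside your notification listener or activity:
PendingIntent sessionActivity = controller.getSessionActivity();
try {
    if (sessionActivity != null) {
        // The app exposed an activity for its session; this is the preferred path.
        sessionActivity.send();
    } else {
        // Fall back to plainly launching the app by its package name.
        Intent launch = getPackageManager()
                .getLaunchIntentForPackage(controller.getPackageName());
        if (launch != null) {
            launch.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            startActivity(launch);
        }
    }
} catch (PendingIntent.CanceledException e) {
    // The session activity has been canceled; fall back or report as appropriate.
}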

Related

Playing sounds sequentially on Android's Google Chrome (given the new restrictions on playing sound)

I have a small app that plays sequential sounds (a teaching app playing the syllables of a word).
This could be accomplished by firing an event right after each sound stops playing. Something like:
var sounds = new Array(new Audio("1.mp3"), new Audio("2.mp3"));
var i = -1;
playSnd();
function playSnd() {
    i++;
    if (i == sounds.length) return;               // stop after the last sound
    sounds[i].addEventListener('ended', playSnd); // chain the next sound when this one ends
    sounds[i].play();
}
(source)
However, Chrome on Android has now implemented new restrictions on how sound can be played: all playback must be initiated by a user action.
So, when I run code very similar to the above, the first sound plays, and then I get
Uncaught (in promise) DOMException: play() can only be initiated by a user gesture.
How can a sequence of sounds, determined at run time, be played on Android's Chrome?
To start with, Google Chrome on Android does not allow pages to play HTML audio without an explicit action by the user. This differs from how most stock browsers handle it.
The reason, as the Chromium project puts it, is that autoplay is not honored on Android because it costs data.
You may find more details on that here.
Apart from saving bandwidth, this also makes some sense, since mobile devices are used in public and at home, where unsolicited sound from random websites could be a nuisance.
However, in later versions this decision was overruled and Chrome on Android started allowing autoplay of HTML audio and video. After another round of reviews and discussions, the change was reverted, again making a user action mandatory to start HTML audio and video in Chrome for Android.
Here is more that I found on the topic. As it says, the stated reason was that "We're going to gather some data about how users react to autoplaying videos in order to decide whether to keep this restriction", and hence playback without a user action was reverted.
You can also find more about the blocking of autoplay of audio and video on Forbes and The Verge.
However, here is something you can try that may help you achieve what you intend. Copy this URL and paste it into the address bar of Chrome for Android; it takes you to the flag that, by default, blocks HTML audio and video from playing without user interaction:
chrome://flags/#disable-gesture-requirement-for-media-playback
OR
about:flags/#disable-gesture-requirement-for-media-playback
If the above procedure doesn't help/work for you, you can do this:
Go into chrome://flags or about:flags (which redirects to chrome://flags) and enable the "Disable gesture requirement for media playback" option (which is the same flag as the URL above).

MediaStore.Audio.Media.RECORD_SOUND_ACTION not working in Nougat

I am using MediaStore.Audio.Media.RECORD_SOUND_ACTION to open a sound recorder application. I am not able to open one, as no default application is present. I then installed two voice recorder applications, but I still cannot see them in my chooser intent. I am using the following code:
Intent soundRecorderIntent = new Intent(); // create intent
soundRecorderIntent.setAction(MediaStore.Audio.Media.RECORD_SOUND_ACTION); // set action
startActivityForResult(soundRecorderIntent, ACTIVITY_RECORD_SOUND); // start activity
It works well on Marshmallow.
The code is correct, and you've probably found an app that supports the intent by now. But to keep others from wasting their time like I did, here's a quick note:
Most of the currently top rated voice recording apps in the Play Store do not provide the necessary intent filter.
That seemed so unrealistic to me that I doubted my code when it didn't work after having installed the five most popular voice recording apps. But there's a handy manifest viewer app that reveals that their manifests simply do not declare the intent filter. So, as said, the code is correct.
To save you the time from searching for a suitable app, here are two that are invocable:
Audio Recorder from Sony
Samsung Voice Recorder from Samsung
There is no requirement for a voice recorder app to support this Intent action, and apparently your device does not ship with an app that supports this Intent action either. Your code itself seems fine.
If recording is optional in your app, catch the ActivityNotFoundException, tell the user that they do not have a recorder, and perhaps suggest one that you have tested and know works, if you can find one.
Otherwise, record the audio yourself, using MediaRecorder.
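A minimal sketch of that defensive call, assuming the same intent and request code as in the question:
Intent recordIntent = new Intent(MediaStore.Audio.Media.RECORD_SOUND_ACTION);
try {
    startActivityForResult(recordIntent, ACTIVITY_RECORD_SOUND);
} catch (ActivityNotFoundException e) {
    // No installed app declares an intent filter for RECORD_SOUND_ACTION
    Toast.makeText(this, "No voice recorder app found", Toast.LENGTH_SHORT).show();
    // ...or record the audio yourself with MediaRecorder
}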

Why does an "Intent Selector" exists, what is it used for?

Android's Intent class provides an API called setSelector. I am trying to understand it from the example given in the documentation.
Why did Android need to add this API? What was broken in Intent before this API existed?
My understanding from reading the references is that this API is intended to solve the case where you want to send a launcher intent to an app that meets some general criteria. Say you want to match all apps that open .mp3 files, but you don't actually want to open an mp3 file; you just want to launch an app that supports that. In that case, you could create a generic ACTION_MAIN, CATEGORY_LAUNCHER intent and set the selector to an intent with an mp3 MIME type or data URI.
Before this API there would be no way to do that - if you wanted to target an app that supports opening mp3s, you would have to send an intent for an mp3, which could either cause music to start playing, or cause the music player to throw an error. Also, depending on the music player's launch mode, the launcher intent may return to an existing instance of the music player, while the mp3 intent might create a new one.
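A minimal sketch of that approach (the audio/mpeg MIME type is only an illustrative choice, and setSelector requires API 15+):
// The intent that is actually delivered: a plain launcher launch.
Intent launch = new Intent(Intent.ACTION_MAIN);
launch.addCategory(Intent.CATEGORY_LAUNCHER);

// The selector is what the system resolves against: apps that can view mp3 audio.
Intent selector = new Intent(Intent.ACTION_VIEW);
selector.setType("audio/mpeg");
launch.setSelector(selector);

launch.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
startActivity(launch);

// The framework also ships a helper for the common cases, e.g. launching "the music app":
// Intent music = Intent.makeMainSelectorActivity(Intent.ACTION_MAIN, Intent.CATEGORY_APP_MUSIC);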
According to my understanding, it lets you choose which intent is used for selection. The documentation describes it as choosing between intents, such as whether to open the app's own main activity or to launch a different app/activity. That is what I understood from the documentation. Check these links for reference: https://code.google.com/p/android/issues/detail?id=67162 and http://grepcode.com/file/repository.grepcode.com/java/ext/com.google.android/android/4.4_r1/android/content/Intent.java#Intent.setSelector%28android.content.Intent%29

Google ChromeCast Playlist from Android Device

I'm using a sample app for RemotePlaybackClient from #commonsware to play a video from a URL on a Google Chromecast dongle. The app works like a charm, but I would like to implement a playlist. Any idea how to send a playlist to the Chromecast from an Android device?
As usual, I don't need code, just links, tutorials, etc. Thanks.
Are you using a custom receiver?
If so, you can pass a JSON payload with your playlist to that receiver and manage the list alongside the playback state.
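For example, a minimal sender-side sketch; the namespace and JSON shape are made up for illustration, and apiClient is assumed to be a GoogleApiClient already connected to the Cast device:
private static final String PLAYLIST_NAMESPACE = "urn:x-cast:com.example.playlist"; // hypothetical

private void sendPlaylist(GoogleApiClient apiClient, List<String> urls) throws JSONException {
    JSONArray items = new JSONArray();
    for (String url : urls) {
        items.put(new JSONObject().put("url", url));
    }
    JSONObject message = new JSONObject()
            .put("command", "setPlaylist")
            .put("items", items);
    // The custom receiver must listen on the same namespace and parse this message.
    Cast.CastApi.sendMessage(apiClient, PLAYLIST_NAMESPACE, message.toString());
}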
You might try looking at the "mediaList" object here. That's your playlist object.
This is a totally different project (not the MediaRouter API but CCL instead) that I used because I wanted to implement a playlist and did NOT want to take on my own receiver app. I wanted to see whether the default receiver could collaborate with an existing GitHub sender sample, altered slightly to maintain a playlist in "mediaList" and to send successive PLAY instructions to the default receiver app whenever that app's state, as relayed in the normal "consumer" message traffic, indicated state=ready.
D/ccl_VideoCastManager(31057): onApplicationStatusChanged() reached: Ready To Cast
So, when the default receiver fires the "ready" message, the sender app can just call getNext to return an entry from "mediaList" and then send a "play(mediaInfo.entry)" to the default receiver.
onApplicationStatusChanged() is the callback the CCL uses to communicate/sync player state between the local and remote players. When the remote state changes to "ready to cast", you can use VideoCastManager and its base class to select the next MediaInfo entry and format a message for the remote to play it...
this.startCastControllerActivity(this.mContext, nextMediaInfo, 0, true);
The code above, from the sender/CCL base, tells the receiver to play the item that the sender has determined is next in the list.
Note: I was advised to implement the playlist in a custom receiver app that I would write. I'm not that ambitious, and found a very simple hack in the sender/CCL classes that was reliable enough for me.
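If you would rather not pull in CCL, the same idea can be sketched with the plain Cast v2 RemoteMediaPlayer; here apiClient, remoteMediaPlayer, playlist (a List<MediaInfo>) and currentIndex are assumed fields of the sender:
remoteMediaPlayer.setOnStatusUpdatedListener(new RemoteMediaPlayer.OnStatusUpdatedListener() {
    @Override
    public void onStatusUpdated() {
        MediaStatus status = remoteMediaPlayer.getMediaStatus();
        if (status != null
                && status.getPlayerState() == MediaStatus.PLAYER_STATE_IDLE
                && status.getIdleReason() == MediaStatus.IDLE_REASON_FINISHED
                && currentIndex + 1 < playlist.size()) {
            // The current item finished on the receiver; load the next one from the local list.
            currentIndex++;
            remoteMediaPlayer.load(apiClient, playlist.get(currentIndex), true);
        }
    }
});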

Casting video to Chromecast from the YouTube app

I tried using an Android phone to cast to a Chromecast device with the YouTube app. I added some videos to the queue, then used another phone to cast to the same Chromecast device. The second phone automatically knows the videos added to the queue by the first one.
I don't know how the YouTube app can do this.
EDIT: I guess the YouTube app uses a custom data channel besides the media channel. When a video is added to the queue, the sender app sends something (e.g. the videoId) to the receiver, which saves it in an array of video IDs. When another phone connects to the Chromecast device, it receives the array of video IDs from the receiver. Can anyone suggest other solutions? Thanks.
I guess what you are asking is how you can create a playlist, potentially shared by multiple devices. If that is the case, you have a couple of choices:
Keep the playlist in the receiver: this is the simplest option. It is a simple array on the receiver, kept in memory, which goes away when the application ends. A custom receiver is required, and it can implement methods such as 'append', 'insert', 'get', 'clear', ... to provide what the senders need. When a sender connects, it can ask for the current queue (by calling 'get', for example) and then modify it through the other methods ('clear', 'append', 'insert', ...); a sender-side sketch follows this answer. Note that there is no long-term persistence on the receiver (local storage is available but will be cleared as soon as the app is gone).
Keep the playlist in the cloud: you do most of the things from the previous option but also persist the playlist to the cloud; the advantage is that the playlist lasts beyond the life of a session (which may or may not be desired). In addition, sender apps can potentially get the playlist from the cloud directly, if needed.
The important thing is that the main storage for your playlist is not your sender devices; they don't know (and shouldn't know) about the presence of other senders in the ecosystem.
On the receiver side, we recently published a simple sample that shows how the notion of a (local) playlist can be implemented; it is a simplified example, but enough to show that with minimal work you can take advantage of the media channel. For more sophisticated handling of a shared queue, you definitely need an out-of-band channel/namespace to handle the additional methods I mentioned above.
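A minimal sender-side sketch of talking to such a receiver-held queue; the namespace and the 'get'/'append' commands are hypothetical and would have to be implemented by the custom receiver:
private static final String QUEUE_NAMESPACE = "urn:x-cast:com.example.queue"; // hypothetical

private void attachQueueChannel(GoogleApiClient apiClient) throws IOException {
    // Listen for the receiver's replies (e.g. the current queue) on the custom namespace.
    Cast.CastApi.setMessageReceivedCallbacks(apiClient, QUEUE_NAMESPACE,
            new Cast.MessageReceivedCallback() {
                @Override
                public void onMessageReceived(CastDevice device, String namespace, String message) {
                    // Parse the JSON queue the receiver sent back and update the sender UI.
                }
            });
    // Ask for the current queue, then append a new item to it.
    Cast.CastApi.sendMessage(apiClient, QUEUE_NAMESPACE, "{\"command\":\"get\"}");
    Cast.CastApi.sendMessage(apiClient, QUEUE_NAMESPACE,
            "{\"command\":\"append\",\"videoId\":\"abc123\"}");
}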
