I tried using an Android phone to cast to a Chromecast device via the YouTube app. I added some videos to the queue, then used another phone to cast to the same Chromecast device. The second phone automatically knew about the videos the first one had added to the queue.
How does the YouTube app do this?
EDIT: I guess the YouTube app uses a custom data channel besides the Media channel. When a video is added to the queue, the sender app sends something (e.g. the videoId) to the receiver, which saves it in an array of video IDs. When another phone connects to the Chromecast device, it receives the array of video IDs from the receiver. Can anyone suggest other solutions? Thanks.
I guess what you are asking is how you can create a playlist, potentially shared by multiple devices. If that is the case, you have a couple of choices:
keep the playlist on the receiver: this is the simplest option. This will be a simple array on the receiver, kept in memory, which will go away when the application ends. A custom receiver is required, and it can implement methods such as 'append', 'insert', 'get', 'clear', ... to provide what the senders need. When each sender connects, it can ask for the current queue (by calling 'get', for example) and then modify it through the other methods ('clear', 'append', 'insert', ...); a sender-side sketch of this follows after this answer. Note that there is no long-term persistence on the receiver (local storage is available, but it will be cleared as soon as the app is gone).
keep the playlist in the cloud: you need to do most of the things from the previous option, but you also persist the playlist to the cloud; the advantage is that the playlist lasts beyond the life of a session (which may or may not be desired). In addition, sender apps can potentially get the playlist from the cloud directly, if needed.
The important thing is that the main storage for your playlist is not your sender devices; they don't know (and shouldn't know) about the presence of other senders in the ecosystem.
On the receiver side, we recently published a simple sample that shows how the notion of a (local) playlist can be implemented; it is a simplified example, but it is enough to show that with minimal work you can take advantage of the Media Channel. For more sophisticated handling of a shared queue, you definitely need an out-of-band channel/namespace to handle all the additional APIs mentioned above.
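To make the first option concrete, here is a minimal sender-side sketch in the Cast SDK v2 style. The urn:x-cast:com.example.queue namespace and the command names are made up, and the matching custom receiver (which keeps the array in memory and answers these commands) is not shown:

import java.io.IOException;

import org.json.JSONException;
import org.json.JSONObject;

import com.google.android.gms.cast.Cast;
import com.google.android.gms.cast.CastDevice;
import com.google.android.gms.common.api.GoogleApiClient;

public class QueueChannel implements Cast.MessageReceivedCallback {
    // Hypothetical namespace; the custom receiver must register the same string.
    public static final String NAMESPACE = "urn:x-cast:com.example.queue";

    private final GoogleApiClient apiClient;

    public QueueChannel(GoogleApiClient apiClient) {
        this.apiClient = apiClient;
    }

    // Route messages on our namespace to onMessageReceived below.
    public void register() throws IOException {
        Cast.CastApi.setMessageReceivedCallbacks(apiClient, NAMESPACE, this);
    }

    // Ask the receiver for the current queue, e.g. right after connecting.
    public void get() throws JSONException {
        send(new JSONObject().put("command", "get"));
    }

    // Append a video to the shared queue kept in memory on the receiver.
    public void append(String videoId) throws JSONException {
        send(new JSONObject().put("command", "append").put("videoId", videoId));
    }

    private void send(JSONObject message) {
        Cast.CastApi.sendMessage(apiClient, NAMESPACE, message.toString());
    }

    // The receiver's reply (e.g. the full queue after a "get") arrives here.
    @Override
    public void onMessageReceived(CastDevice device, String namespace, String message) {
        // Parse the JSON queue and update the local UI.
    }
}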
According to the Release Notes (of July 8), the docs for the Sender, and the updated answer to this question, the Styled Media Receiver of Google Cast now supports Closed Captioning / Subtitle tracks.
However, when I tell the Default or the Styled Media Receiver to show a text track, nothing happens. It does not even load the .vtt from the server, as I can see in the logs.
I can tell the receiver app got the text tracks just fine, but even using the Android example app, the subtitles never show up. According to all the logs, they are being sent and the receiver app is told to show them, but they never appear; they are never even loaded.
The MediaTrack is created as follows:

// Text track with ID 2, pointing at a WebVTT file on the server.
MediaTrack germanSubtitles = new MediaTrack.Builder(2, MediaTrack.TYPE_TEXT)
        .setName("Deutsch")
        .setSubtype(MediaTrack.SUBTYPE_CAPTIONS)
        .setContentId("https://example.com/video/caption_de.vtt")
        .setContentType("text/vtt")
        .setLanguage("de")
        .build();
I have checked three times that the file exists and is served with the MIME type text/vtt. But that does not matter, as the file is never even requested by the player. I have tried both MediaTrack.SUBTYPE_CAPTIONS and MediaTrack.SUBTYPE_SUBTITLES.
So I need to know, is this claimed support of CC in the Styled Media Receiver simply a lie? Or is there some undocumented trick required to make it possible?
If a custom receiver is still required, I would like to know how to convert the example player to support subtitles, as it doesn't seem to support them either.
First, I suggest you change your wording in future posts (re: "..is simply a lie.."); that is not appropriate at all. Secondly, it works, and you can test that with the CastVideos-android app (or its iOS variation, for that matter); the first three videos have CC. Lastly, we have documentation on that subject on our documentation site (https://developers.google.com/cast/docs/android_sender, under "Using the Tracks API").
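For reference, a minimal sketch of that Tracks API flow, reusing the track built in the question; it assumes mApiClient and mRemoteMediaPlayer are already set up as in the Cast sample apps, and the video URL is a placeholder:

List<MediaTrack> tracks = new ArrayList<>();
tracks.add(germanSubtitles); // the MediaTrack built in the question

MediaInfo mediaInfo = new MediaInfo.Builder("https://example.com/video/video.mp4")
        .setContentType("video/mp4")
        .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
        .setMediaTracks(tracks)
        .build();

mRemoteMediaPlayer.load(mApiClient, mediaInfo, true).setResultCallback(
        new ResultCallback<RemoteMediaPlayer.MediaChannelResult>() {
            @Override
            public void onResult(RemoteMediaPlayer.MediaChannelResult result) {
                if (result.getStatus().isSuccess()) {
                    // Activate the text track via the ID given to MediaTrack.Builder (2).
                    mRemoteMediaPlayer.setActiveMediaTracks(mApiClient, new long[] {2});
                }
            }
        });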
I'm using a sample app for the RemotePlaybackClient from @commonsware to play a video from a URL on a Google Chromecast dongle. The app works like a charm, but I would like to implement a playlist. Any idea how to send a playlist to the Chromecast from an Android device?
As usual, I don't need code, just links, tutorials, etc. Thanks.
Are you using a custom receiver?
If so, you can pass JSON to that receiver with your playlist and manage the list together with a playback state.
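For example, the sender could serialize the whole list plus a cursor into one JSON message. The payload shape and the namespace below are made up; the custom receiver has to parse them and drive playback itself:

// Illustrative payload; the custom receiver defines the actual contract
// (JSONException handling omitted for brevity).
JSONObject playlist = new JSONObject()
        .put("command", "setPlaylist")
        .put("index", 0) // playback state: which entry to start with
        .put("items", new JSONArray()
                .put("https://example.com/video1.mp4")
                .put("https://example.com/video2.mp4"));

// "urn:x-cast:com.example.playlist" is a made-up namespace; the receiver
// must register the same one.
Cast.CastApi.sendMessage(apiClient, "urn:x-cast:com.example.playlist", playlist.toString());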
You might try looking at the "mediaList" object here; that's your playlist object.
This is a totally different project (not the MediaRouter API, but CCL instead) that I used because I wanted to implement a playlist and did NOT want to take on my own receiver app. I wanted to see whether the default receiver could collaborate with an existing GitHub sender sample, altered slightly to manipulate both a playlist implemented in the "mediaList" AND to send appropriate, successive PLAY instructions to the default receiver app whenever that app's state, as relayed in the normal "consumer" message traffic, indicated state=ready.
D/ccl_VideoCastManager(31057): onApplicationStatusChanged() reached: Ready To Cast
So, when the default receiver fires the "ready" message, the sender app can just call getNext to return an entry from "mediaList" and then send a "play(mediaInfo.entry)" to the default receiver.
onApplicationStatusChanged() is the interface used by CCL to communicate and sync player state between the local and remote players. When the default receiver's state changes to "Ready To Cast", you can use "VideoCastManager" and its base class to select the next MediaInfo entry and format a message for the remote to play it:
this.startCastControllerActivity(this.mContext, nextMediaInfo, 0, true);
The code above, from the sender/CCL base, tells the receiver to play the item that the sender has determined is next in the list.
Note: I was advised to implement the playlist in a custom receiver app that I would write. I'm not that ambitious, and found a very simple hack on the sender/CCL classes that was reliable enough for me.
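A rough sketch of that hack using CCL's consumer callbacks; it assumes mediaList, index, and context are fields of the surrounding class, the status string matches the log line shown earlier, and it reuses the startCastControllerActivity call from the snippet above:

final VideoCastManager castManager = VideoCastManager.getInstance();
castManager.addVideoCastConsumer(new VideoCastConsumerImpl() {
    @Override
    public void onApplicationStatusChanged(String appStatus) {
        // Advance the sender-side queue whenever the default receiver is idle.
        if ("Ready To Cast".equals(appStatus) && index < mediaList.size()) {
            MediaInfo next = mediaList.get(index++);
            // Same call as above: hand the next entry to the default receiver.
            castManager.startCastControllerActivity(context, next, 0, true);
        }
    }
});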
In Chromecast,
I want to send different kinds of URLs (mp4/mp3/png, ...) to the receiver, but how can the receiver show them dynamically?
That is: how does the receiver recognize what kind of remote media it has received?
In the current version of the SDK, there is nothing on the framework side to help you with that directly. You can include the MIME type in the metadata, retrieve it on your receiver, and do as you see fit. That said, if your media is only audio or video, things are easier, since the video element can handle both and you can treat them the same; for images, however, you have to do some extra work. Another approach is to look at the file extension and guess the type, but that is not fully reliable.
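For instance, the sender could carry the MIME type alongside each item via MediaInfo's custom data. The "mimeType" key below is made up, and the receiver-side JavaScript that branches on it is not shown:

// Tag each item with its MIME type so the receiver can branch on it
// (JSONException handling omitted for brevity).
JSONObject customData = new JSONObject().put("mimeType", "image/png");

MediaInfo imageInfo = new MediaInfo.Builder("https://example.com/picture.png")
        .setContentType("image/png")
        .setStreamType(MediaInfo.STREAM_TYPE_NONE)
        .setCustomData(customData)
        .build();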
I am building an application where I would like a heart rate over a certain threshold to trigger an event. I'm wondering if there is any way to do this using data retrieved by another app on the phone (a heart rate app) in my own application. I would rather not have to build a heart rate sensor from scratch!
For example, using data from an app like this:
https://play.google.com/store/apps/details?id=com.macropinch.hydra.android&hl=en
Can I even do this? Do I need the developer's permission, or is the data written to files on the phone that I can just read?
This has already been discussed in another thread: Data sharing between two applications. There are mainly two methods: one is a ContentProvider and the other is SharedPreferences. But the app that provides the data has to implement one of those.
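For illustration, reading from such a ContentProvider could look like the sketch below. The content URI and column names are invented, since the heart-rate app would have to expose this provider itself:

// Query the newest reading from a hypothetical provider (inside an Activity).
Uri uri = Uri.parse("content://com.example.heartrate.provider/readings");
Cursor cursor = getContentResolver().query(uri, null, null, null, "timestamp DESC");
if (cursor != null) {
    try {
        if (cursor.moveToFirst()) {
            int bpm = cursor.getInt(cursor.getColumnIndexOrThrow("bpm"));
            if (bpm > 150) { // your threshold
                // trigger the event here
            }
        }
    } finally {
        cursor.close();
    }
}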
I'm wondering if there is any way to do this using data retrieved by another app on the phone (a heart rate app) in my own application
If the other application provides a remote service (see AIDL), then your application can bind to that service to achieve this. But AFAIK, Cardiograph doesn't provide such a service.
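If such a service existed, binding to it from an Activity would look roughly like this; every name below (the AIDL interface, action, and package) is hypothetical:

// Hypothetical AIDL file the heart-rate app would have to ship:
//   interface IHeartRateService { int getCurrentBpm(); }

private IHeartRateService heartRateService;

private final ServiceConnection connection = new ServiceConnection() {
    @Override
    public void onServiceConnected(ComponentName name, IBinder binder) {
        heartRateService = IHeartRateService.Stub.asInterface(binder);
    }

    @Override
    public void onServiceDisconnected(ComponentName name) {
        heartRateService = null;
    }
};

// Bind with an explicit intent aimed at the (imaginary) exported service.
Intent intent = new Intent("com.example.heartrate.BIND_HEART_RATE");
intent.setPackage("com.example.heartrate");
bindService(intent, connection, Context.BIND_AUTO_CREATE);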
Do I need the developer's permission, or is the data written to files on the phone that I can just read?
The only option I see there is to obtain the source code; for that, you obviously need to contact the developer. :D
If the application writes its data to the SD card, then you can read it from there.
I can start my application by simply putting the phone on an NFC tag. But I would like to take the idea one step further. Imagine a simple time-tracking application with two NFC tags: the first will start (and download) the application and register a start time; the other will also start (and download) the application, but register a stop time.
The problem I'd like to solve is that I don't want my phone to know about these tags in advance. The application should not need a list of tag IDs programmed in, knowing which action is connected to each ID. The tag itself should carry the information needed to start the action on the phone with the correct parameters.
Is there any information about how to accomplish this scenario? I have installed the "nfc-eclipse-plugin", but I don't understand how to use it to achieve my goal, and even less how to get my application to read the extra data.
Thanks in advance
Roland
Your tags should be capable of storing NDEF messages. Such messages are read out automatically by Android and passed to your app in an Intent. Automatically installing and/or starting your app can be accomplished by putting an Android Application Record (AAR) in your tag. Any additional information (the "start" or "stop" indication) can be stored in a proprietary record.
You probably want to put the AAR as the last record of the NDEF message: it is detected and acted upon by Android automatically, but only supported since ICS. To make automatic installation also work on Gingerbread, you can put an additional URI record or Smart Poster record with a Google Play Store link as the first record of the message. Your app should then filter (ACTION_NDEF_DISCOVERED) for this URI, so it will also start automatically on Gingerbread.
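As a sketch, composing and reading such a message could look like this (using the API 14+/16+ NdefRecord factory helpers; the package name, external type, and payload are placeholders):

// Writing: URI record first (handled on Gingerbread), AAR last (ICS+).
NdefRecord storeLink = NdefRecord.createUri(
        "https://play.google.com/store/apps/details?id=com.example.timetracker");
NdefRecord action = NdefRecord.createExternal(
        "com.example", "timetracker", "start".getBytes()); // or "stop"
NdefRecord aar = NdefRecord.createApplicationRecord("com.example.timetracker");
NdefMessage message = new NdefMessage(new NdefRecord[] { storeLink, action, aar });

// Reading, inside the activity launched by ACTION_NDEF_DISCOVERED:
Parcelable[] raw = getIntent().getParcelableArrayExtra(NfcAdapter.EXTRA_NDEF_MESSAGES);
if (raw != null) {
    NdefMessage received = (NdefMessage) raw[0];
    String startOrStop = new String(received.getRecords()[1].getPayload());
    // startOrStop is "start" or "stop"
}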