Let's assume I have two Android devices: an Android TV and a smartphone.
On both devices I have installed my app, which can play back video streamed from a server.
Similar to Chromecast, I want to delegate playback from my smartphone to the Android TV via my app.
How would I do that?
My starting point is this: Media Router - there I would go for Remote Playback.
If I understand this correctly,
the app on my Android TV device would be a MediaRouteProvider, and
the app on my phone would use a MediaRoute to launch a CATEGORY_REMOTE_PLAYBACK intent on that MediaRouteProvider?
This also means that the callbacks handle all communication, so I could differentiate via callbacks between passing only the URL and letting the receiver app fetch title and images itself, or having them passed from the sender app?
Or am I on the wrong track?
the app on my Android TV device would be a MediaRouteProvider
No. MediaRouteProvider goes on the device that is the user's controller, which in this case would be the phone.
the app on my phone would use a MediaRoute to launch a CATEGORY_REMOTE_PLAYBACK intent on that MediaRouteProvider?
You can do that. The advantage of using MediaRouteProvider is that any app on that phone that supports RemotePlaybackClient can now support your setup. If you only want your app to do this, you could skip MediaRouteProvider.
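For reference, a sender built on the support library's RemotePlaybackClient could look roughly like this - a minimal sketch, where the route is whatever RouteInfo the user picked in your route chooser and the video URL is a placeholder:

    import android.content.Context;
    import android.net.Uri;
    import android.os.Bundle;
    import android.support.v7.media.MediaItemStatus;
    import android.support.v7.media.MediaRouter;
    import android.support.v7.media.MediaSessionStatus;
    import android.support.v7.media.RemotePlaybackClient;

    public class Sender {
        // route: a RouteInfo, chosen by the user, that supports
        // CATEGORY_REMOTE_PLAYBACK; the URL below is a placeholder
        void playOnRoute(Context context, MediaRouter.RouteInfo route) {
            RemotePlaybackClient client = new RemotePlaybackClient(context, route);
            client.play(Uri.parse("http://example.com/video.mp4"),
                    "video/mp4", // MIME type of the stream
                    null,        // optional metadata (title, artwork, ...)
                    0,           // start position in milliseconds
                    null,        // optional extras
                    new RemotePlaybackClient.ItemActionCallback() {
                        @Override
                        public void onResult(Bundle data, String sessionId,
                                             MediaSessionStatus sessionStatus,
                                             String itemId, MediaItemStatus itemStatus) {
                            // the route accepted the playback request
                        }
                    });
        }
    }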
Then, either your app or your MediaRouteProvider is responsible for getting the request over to the Android TV device, and your app on it, by some means (WiFiDirect, Bluetooth, GCM over the Internet, whatever). There is nothing in the Android SDK that specifically addresses this -- you're on your own for rolling whatever protocol and connectivity you want. On the plus side, there's no built-in assumption in MediaRouteProvider about any particular way to deliver that information, meaning that you could be connecting to something that is not running Android at all.
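Inside the provider, the hand-off point is the route controller's onControlRequest(). A minimal sketch, where sendToTv() is a hypothetical stand-in for whatever transport you rolled:

    import android.content.Intent;
    import android.net.Uri;
    import android.support.v7.media.MediaControlIntent;
    import android.support.v7.media.MediaRouteProvider;
    import android.support.v7.media.MediaRouter;

    public class TvRouteController extends MediaRouteProvider.RouteController {
        @Override
        public boolean onControlRequest(Intent intent,
                                        MediaRouter.ControlRequestCallback callback) {
            if (MediaControlIntent.ACTION_PLAY.equals(intent.getAction())) {
                // Forward the content URL to the TV app over your own protocol.
                sendToTv(intent.getData());
                if (callback != null) {
                    // A real provider would return session/item status here.
                    callback.onResult(null);
                }
                return true; // request handled
            }
            return false; // action not supported
        }

        // Hypothetical: socket, Bluetooth, GCM, whatever you chose.
        private void sendToTv(Uri video) {
        }
    }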
Related
Let's imagine I want to build a videoconferencing/video-meeting app that is Chromecast-compatible.
So I can use my Android device to join a meeting, then click something like "cast to device" and get the audio and video on my TV.
I also noticed that Skype, Zoom, Discord, and similar apps have no built-in Chromecast support. So maybe it is impossible?
I tried to find something about this but found nothing useful. I found the Zoom SDK and the Chromecast SDK, but did not see what I should actually cast. As I understand it, Chromecast can only send media content to the receiver, but the Zoom SDK does not actually provide media-like links for a conversation (videoconference).
So I want to know: what are my steps to build my own Chromecast-compatible meeting app, and what might I use to develop an app like that?
Or maybe someone knows how I can use the Zoom SDK to achieve what I want?
When a sender device connects to a Chromecast, it has a limited set of "commands" it can send to the device.
There is no immediate way to send data other than those messages from the sender to the device - there is a 'custom' message, but even that only carries stringified JSON.
The way this would have to work is to set up a stream of the screen/app you want to display and send a LOAD message to the Chromecast, which will then connect back to the sender (or to a third-party server the video can be streamed from) and play it.
I'm not too confident this will work as intended, though - there will be a significant delay between the sender and the Chromecast, and since the Chromecast has no mic, you will also have to use the sender's mic and deal with acoustic feedback.
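For illustration, the LOAD step on the sender side could look like this with the Cast v3 sender framework - a sketch only, where the stream URL is a placeholder for wherever the captured audio/video is served from:

    import com.google.android.gms.cast.MediaInfo;
    import com.google.android.gms.cast.MediaLoadOptions;
    import com.google.android.gms.cast.framework.CastSession;
    import com.google.android.gms.cast.framework.media.RemoteMediaClient;

    public class CastSenderSketch {
        // session: the active CastSession from the CastContext's SessionManager.
        void castStream(CastSession session) {
            // Placeholder URL: wherever the sender (or a third-party server)
            // serves the captured audio/video stream from.
            MediaInfo info = new MediaInfo.Builder("http://192.168.0.10:8080/stream")
                    .setStreamType(MediaInfo.STREAM_TYPE_LIVE) // live capture, not VOD
                    .setContentType("video/mp4")
                    .build();

            RemoteMediaClient client = session.getRemoteMediaClient();
            if (client != null) {
                client.load(info, new MediaLoadOptions.Builder()
                        .setAutoplay(true)
                        .build());
            }
        }
    }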
I created a SampleMediaRouteProvider as given in https://github.com/googlesamples/android-MediaRouter. In this sample there is also a MediaRouter that discovers all MediaRouteProviders in the network.
This SampleMediaRouteProvider is visible to other apps on the same device, but it is not visible to apps on other devices on the same network.
Can you please help me make the MediaRouteProvider visible across the Wi-Fi network?
I am looking to create a MediaRenderer on an Android device that I could use to cast audio from other Android devices.
MediaRouteProvider doesn't really do what you're asking.
The MediaRouteProvider you've written has the role of allowing apps on your phone - device A - to find other devices on the network (device B or C). Devices B and C can play the media you have on device A. Device A might stream/mirror directly (say, to device B) or 'cast' (say, to device C).
When mirroring, you send the audio content from device A to device B. When casting, you only send a URL to device C - for example, the link to a video on youtube.com - and device C goes directly to the source to get the content.
I think you will need to:
* Put your MediaRouteProvider on every device.
* Implement a discovery mechanism, for example based on SSDP, mDNS, etc., and have your MediaRouter use it (see the sketch after this list).
* Implement a web server on the device(s) holding content, or use something like NanoHTTPD.
* Write your MediaRenderer.
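For the discovery bullet, one option that needs no third-party libraries is Android's built-in NSD (DNS-SD/mDNS) API. A minimal sketch of advertising the renderer, where the service type name is an arbitrary choice for this example:

    import android.content.Context;
    import android.net.nsd.NsdManager;
    import android.net.nsd.NsdServiceInfo;

    public class RendererAdvertiser {
        // Arbitrary service type for this sketch; your MediaRouter side would
        // call NsdManager.discoverServices() with the same type to find peers.
        private static final String SERVICE_TYPE = "_myrenderer._tcp.";

        void advertise(Context context, int port) {
            NsdServiceInfo info = new NsdServiceInfo();
            info.setServiceName("MediaRenderer");
            info.setServiceType(SERVICE_TYPE);
            info.setPort(port); // port your web server / renderer listens on

            NsdManager nsd = (NsdManager) context.getSystemService(Context.NSD_SERVICE);
            nsd.registerService(info, NsdManager.PROTOCOL_DNS_SD,
                    new NsdManager.RegistrationListener() {
                        @Override public void onServiceRegistered(NsdServiceInfo i) { }
                        @Override public void onRegistrationFailed(NsdServiceInfo i, int err) { }
                        @Override public void onServiceUnregistered(NsdServiceInfo i) { }
                        @Override public void onUnregistrationFailed(NsdServiceInfo i, int err) { }
                    });
        }
    }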
I am creating an Android application, similar to Device Policy Administration, that can remotely clear the data on an Android device. I have followed the sample in the Android docs here: http://developer.android.com/guide/topics/admin/device-admin.html
I need some assistance on how this is implemented. I have come across some apps that perform such tasks of remotely accessing the device (like clearing data on the device, or ringing the device at full volume). My doubts are:
1. How did they implement this functionality in Android? What concept did they use to send a request to the app to ring the device? Is it via push notification?
2. Even if I deactivate the admin via Settings -> Security -> Device administrators -> Android Device Manager (ADM) and then ring the device from the console (https://www.google.com/android/devicemanager?u=0),
the device rings. Shouldn't it stay silent unless I activate ADM?
3. Even if I do not launch (start) the app, I am still able to ring my device. Does this mean that whenever I start my device, my device administration app gets launched automatically?
Any help will be highly appreciated.
How did they implement this functionality in Android? What concept did they use to send a request to the app to ring the device? Is it via push notification?
The Android Device Manager is a proprietary service and piece of software. You will need to get a job at Google, join the ADM team, and then learn how it is done.
That being said, a GCM-style push notification is a likely solution.
Does this mean that whenever I start my device, my device administration app gets launched automatically?
No. Device administration != always running. Device administration status simply gives you access to other APIs that normal apps cannot use.
Note that a GCM-style push notification can be delivered to an app that is not already running.
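As an illustration of how the pieces could fit together when such a push arrives - the receiver class and the command strings here are hypothetical, while the device-admin calls are the standard ones from the linked guide:

    import android.app.admin.DeviceAdminReceiver;
    import android.app.admin.DevicePolicyManager;
    import android.content.ComponentName;
    import android.content.Context;
    import android.media.AudioManager;
    import android.media.Ringtone;
    import android.media.RingtoneManager;

    // Hypothetical receiver; it must also be declared in the manifest with the
    // BIND_DEVICE_ADMIN permission, as shown in the device-admin guide.
    class MyAdminReceiver extends DeviceAdminReceiver { }

    class RemoteCommandHandler {
        // Called from wherever your push message lands (e.g., a GCM intent
        // service); the "wipe"/"ring" command strings are made up for this sketch.
        void handle(Context ctx, String command) {
            if ("wipe".equals(command)) {
                DevicePolicyManager dpm = (DevicePolicyManager)
                        ctx.getSystemService(Context.DEVICE_POLICY_SERVICE);
                ComponentName admin = new ComponentName(ctx, MyAdminReceiver.class);
                if (dpm.isAdminActive(admin)) {
                    dpm.wipeData(0); // needs an active admin; factory-resets the device
                }
            } else if ("ring".equals(command)) {
                // Ringing needs no admin privileges at all, which fits the point
                // above: admin status only unlocks extra APIs such as wipeData().
                AudioManager am = (AudioManager) ctx.getSystemService(Context.AUDIO_SERVICE);
                am.setStreamVolume(AudioManager.STREAM_RING,
                        am.getStreamMaxVolume(AudioManager.STREAM_RING), 0);
                Ringtone tone = RingtoneManager.getRingtone(ctx,
                        RingtoneManager.getDefaultUri(RingtoneManager.TYPE_RINGTONE));
                tone.play();
            }
        }
    }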
The rest of your question has nothing to do with programming and everything to do with the proprietary implementation of ADM, which makes it off-topic for StackOverflow.
I started working on Android (ICS) NFC applications. Now I have a scenario involving an NFC smartphone and another NFC device, which should communicate with each other.
Not a real problem as such, but I need a kind of request/response scenario: the NFC smartphone is held in front of the NFC device and sends a key to it via NFC. The device takes the key, starts communicating with a backend system, and then sends a response back to the NFC smartphone. The smartphone receives and handles the response (it must not be removed from the device in the meantime), and we're done.
But I really need the response, and I did not get this done with the NFC capabilities Android ICS (Beam) offers.
Hence my question: is this somehow possible?
And if yes, could someone explain how? For example, by using an external library or creating my own?
I think you probably need to use NFC to bootstrap the phone's Wi-Fi Direct connection and then do the two-way data transfer over that.
Alternatively, you may need to put the phone close to the device a second time and do another beam for the response.
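To see why the response leg is the hard part: Beam only pushes a single NDEF message one way while the devices are touching. A minimal sketch of the push side (the MIME type is made up for this example, and NdefRecord.createMime() needs API 16+):

    import android.app.Activity;
    import android.nfc.NdefMessage;
    import android.nfc.NdefRecord;
    import android.nfc.NfcAdapter;
    import android.os.Bundle;

    public class KeyPushActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            NfcAdapter adapter = NfcAdapter.getDefaultAdapter(this);
            if (adapter != null) {
                // One-way push of the key while the devices touch; there is no
                // built-in way for the reader to answer over the same tap.
                // "application/vnd.example.key" is a made-up MIME type.
                NdefMessage msg = new NdefMessage(
                        NdefRecord.createMime("application/vnd.example.key",
                                "my-key".getBytes()));
                adapter.setNdefPushMessage(msg, this);
            }
        }
    }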
I'm building a kiosk using a Mac Mini and an Elo touch display. It will run a CoreAnimation-based app that plays multimedia content following the user's touch-based choices.
I'm at an early stage of the project, so I can change the architecture/technology if needed.
I need my kiosk to also distribute MP3 content wirelessly to smartphones close to it.
For now I would like to support iOS and Android phones. I don't have any control over the smartphone side. The kiosk is coin-operated (with time-based session expiration) and connected to the web through a Wi-Fi network managed by me.
Can you tell me a common, safe, and simple way to accomplish this?
I thought of WebDAV, but I would like to explore alternatives - the simpler for the user, the better.
If you have one-time/session-based URLs, displaying a QR code on your kiosk screen would be one way to get the download URL onto the device (and it can be invalidated after a successful download or session expiration). This requires a QR code reader, which neither iOS nor Android has built in, though many users have one installed.
Additionally, display the same URL that's encoded in the QR code through a URL shortener service (bit.ly, goo.gl, etc.) for the user to type in.
This way there's no setup for the user, no funny business with pushing data to the user (privacy/security concerns), and every smartphone does have a web browser.
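If you go this route, generating the QR image is easy with a library such as ZXing. A minimal Java sketch, where the one-time URL is assumed to come from your session backend:

    import com.google.zxing.BarcodeFormat;
    import com.google.zxing.WriterException;
    import com.google.zxing.common.BitMatrix;
    import com.google.zxing.qrcode.QRCodeWriter;

    public class SessionQr {
        // Encodes a one-time session URL as a QR bit matrix; rendering the
        // matrix to the screen is left to whatever UI toolkit the kiosk uses.
        static BitMatrix encode(String oneTimeUrl) throws WriterException {
            return new QRCodeWriter().encode(oneTimeUrl,
                    BarcodeFormat.QR_CODE, 400, 400); // output size in pixels
        }
    }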
The best way to accomplish this is by using OBEX Push over Bluetooth.
There are plenty of command-line tools to list all Bluetooth devices nearby and to do a file transfer to one of them.
The user would just need to activate Bluetooth discovery on his cell phone, search for the phone on the kiosk, and select his phone.
Another alternative is mail. WebDAV is a bad idea, because the user would have to type in the address (cumbersome!).
A lot of photo kiosks already use OBEX Push to receive photos from phones.
The easiest option is to provide different types of USB cables (micro-USB, mini-USB, ...); almost all cell phones can be mounted as a USB disk nowadays.
Summary:
* OBEX Push
* USB connect
* Mail