I have an Android app, and I would like to add the ability for the user to "cast" what the app is displaying to a Chromecast. It could be just a local JPG, but I would prefer that the user see the actual "live" content of the app. Does anyone know if this is possible? I know there are apps like AllCast, but I wasn't sure whether they use supported features of the SDK or whether it's a hack. I found some mention of the Default Media Receiver but could not find any documentation on how to use it with local content. Any advice or direction would be appreciated.
There is no Cast API to do that directly; you can look into WebRTC or something of that nature.
The way I do it is to use the Presentation class. The only problem is that you do need to use the Chromecast app to start screen mirroring before you start your app.
I have not yet found a way to start mirroring my app (or, more precisely, to show the contents of your Presentation class) from a Cast button within my app. I have been able to get that Cast button working and connecting, just not to start app mirroring when using only the in-app button.
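For reference, here is a minimal sketch of the Presentation approach described above, assuming screen mirroring has already been started from the Chromecast app so that a presentation display is available. The activity name and the R.layout.cast_content layout are made up for illustration.

```java
import android.app.Activity;
import android.app.Presentation;
import android.content.Context;
import android.media.MediaRouter;
import android.os.Bundle;
import android.view.Display;

public class CastPresentationActivity extends Activity {

    private Presentation presentation;

    @Override
    protected void onResume() {
        super.onResume();
        // Ask MediaRouter for the currently selected live-video route.
        // Mirroring must already be running for a presentation display to exist.
        MediaRouter router = (MediaRouter) getSystemService(Context.MEDIA_ROUTER_SERVICE);
        MediaRouter.RouteInfo route = router.getSelectedRoute(MediaRouter.ROUTE_TYPE_LIVE_VIDEO);
        Display presentationDisplay = (route != null) ? route.getPresentationDisplay() : null;

        if (presentationDisplay != null) {
            presentation = new Presentation(this, presentationDisplay) {
                @Override
                protected void onCreate(Bundle savedInstanceState) {
                    super.onCreate(savedInstanceState);
                    // R.layout.cast_content is a hypothetical layout shown on the TV.
                    setContentView(R.layout.cast_content);
                }
            };
            presentation.show();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (presentation != null) {
            presentation.dismiss();
            presentation = null;
        }
    }
}
```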
I implemented screen sharing (TokBox) in my application.
It works fine inside my app, but I cannot share the screen outside my app.
Can anyone please help?
https://tokbox.com/developer/guides/screen-sharing/android/
I also faced the same issue and emailed TokBox support.
This was their response:
The way our screen capture code works is that it recursively traverses the view hierarchy, copies those images to a buffer, and then sends that buffer over the WebRTC data pipe. Hence, once the app is pushed to the background, we cannot traverse the view hierarchy and copy the image, so screen sharing only works while we are in the application (Android or iOS native app). If you want to share the screen view of the OpenTok app only, it will work, but outside the app it won't. This is to take care of the privacy and security aspects of mobile app users.
So according to them, you cannot share the screen outside the application; it only works while the app is in the foreground.
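To make their explanation concrete, here is a rough sketch of the kind of in-app capture they describe: drawing the foreground activity's view hierarchy into a Bitmap. This is only an illustration of the technique, not TokBox's actual capturer code.

```java
import android.app.Activity;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.view.View;

public final class InAppScreenGrabber {

    // Draws the activity's current view hierarchy into a Bitmap.
    // This only works while the app is in the foreground, because a
    // backgrounded activity's views are no longer laid out on screen.
    public static Bitmap capture(Activity activity) {
        View root = activity.getWindow().getDecorView().getRootView();
        Bitmap bitmap = Bitmap.createBitmap(
                root.getWidth(), root.getHeight(), Bitmap.Config.ARGB_8888);
        Canvas canvas = new Canvas(bitmap);
        root.draw(canvas);
        // A real capturer would copy these pixels into a video frame
        // and hand them to the WebRTC pipeline.
        return bitmap;
    }
}
```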
Update
After repeatedly asking the TokBox support team, I got the following reply from them:
Screen sharing content from outside of your application can be achieved on both Android and iOS.
For Android, you need to use the Media Projection API together with Vonage/Tokbox Custom Capturer.
For iOS, you need to use the iOS ReplayKit together with Vonage/Tokbox Custom Capturer.
Basically, the implementation is to get a frame from the Media Projection API or ReplayKit and then pass it through a custom capturer.
Following their response, I found the Accelerator Core Android repo, which shows how to integrate the Media Projection API with TokBox.
More specifically, these two files: ScreenSharingFragment.java and ScreenSharingCapturer.java.
Using these two files, I am now able to share the screen outside my application.
Note:
Apps that target Android 9 (API level 28) or higher must run the screen capture from a foreground service, or the app will crash for security reasons.
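For anyone looking for a starting point, here is a rough sketch of the Media Projection side only: requesting the capture permission and creating a virtual display backed by an ImageReader. Feeding the resulting frames into the TokBox/Vonage custom capturer is done as in ScreenSharingCapturer.java from the Accelerator Core repo and is not shown here; the activity name and the chosen width/height/density are illustrative. Also remember the foreground-service note above.

```java
import android.app.Activity;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.ImageReader;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.os.Bundle;

public class ScreenShareActivity extends Activity {

    private static final int REQUEST_SCREEN_CAPTURE = 1001;

    private MediaProjectionManager projectionManager;
    private MediaProjection mediaProjection;
    private VirtualDisplay virtualDisplay;
    private ImageReader imageReader;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        projectionManager =
                (MediaProjectionManager) getSystemService(MEDIA_PROJECTION_SERVICE);
        // Ask the user for permission to capture the screen.
        startActivityForResult(
                projectionManager.createScreenCaptureIntent(), REQUEST_SCREEN_CAPTURE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_SCREEN_CAPTURE && resultCode == RESULT_OK) {
            mediaProjection = projectionManager.getMediaProjection(resultCode, data);

            int width = 1280, height = 720, densityDpi = 320; // pick sizes to suit your stream
            imageReader = ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 2);

            // Mirror the screen into the ImageReader's surface; each Image it
            // produces can be converted into a frame for the custom video capturer.
            virtualDisplay = mediaProjection.createVirtualDisplay(
                    "screen-share", width, height, densityDpi,
                    DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                    imageReader.getSurface(), null, null);
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        if (virtualDisplay != null) virtualDisplay.release();
        if (mediaProjection != null) mediaProjection.stop();
        if (imageReader != null) imageReader.close();
    }
}
```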
According to Tokbox, we can't share the screen outside the application.
Manik here from the Video API team.
To screenshare the contents outside of your application on the Android platform, you need to use the Media Projection API. In combination with the Media Projection API, you need to use a Custom Capturer.
We're working on a sample application that will allow you to accomplish this - please stay tuned!
What I need to know is: is it even possible to view an app within another app?
I have tried to find out how, but I only found topics about launching one app from another.
For example: an application that shows the file manager, a specific music player, and Google Chrome, and lets you switch between them with something like tabs?
You will have to use IPC for what you want to achieve. Have a look at this link, though I am not sure you can run it the way you want.
http://developer.android.com/guide/components/aidl.html
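For completeness, the basic shape of the AIDL approach that page describes looks roughly like this. IMyAidlInterface, its getStatus() method, and the other app's package/action are all hypothetical; this gives you a way to call into another app's service, not to embed its UI.

```java
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.os.IBinder;
import android.os.RemoteException;

public class RemoteAppClient {

    // IMyAidlInterface is a hypothetical interface defined in an .aidl file shared
    // by both apps; the Android tooling generates the Stub class from it.
    private IMyAidlInterface remoteService;

    private final ServiceConnection connection = new ServiceConnection() {
        @Override
        public void onServiceConnected(ComponentName name, IBinder binder) {
            remoteService = IMyAidlInterface.Stub.asInterface(binder);
        }

        @Override
        public void onServiceDisconnected(ComponentName name) {
            remoteService = null;
        }
    };

    public void connect(Context context) {
        // The other app must export a Service that handles this (hypothetical) action.
        Intent intent = new Intent("com.example.otherapp.BIND_MY_SERVICE");
        intent.setPackage("com.example.otherapp");
        context.bindService(intent, connection, Context.BIND_AUTO_CREATE);
    }

    public String queryRemote() throws RemoteException {
        return remoteService != null ? remoteService.getStatus() : null;
    }
}
```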
I am a student programmer, and the topic of my degree work is to refine one of the input methods on touchscreen devices for visually impaired people (including the blind).
I want to make my application work correctly with TalkBack, but I don't know how to do it. I've found the accessibility package, but it's not clear to me how it integrates with TalkBack.
You can start with a simple layout containing an ImageView and add android:contentDescription="your string" as an attribute in the XML. Then turn on TalkBack and tap that image to see what happens.
Use android:contentDescription="Generic Image" on any View with custom content.
Note: when using a ViewGroup, be careful about clicks passing through to the views inside it.
Here is an example: https://github.com/dotrinhdev/AndroidTalkback
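If you prefer to set the description from code rather than XML, the same thing can be done programmatically; the layout and view IDs below are made up for illustration.

```java
import android.app.Activity;
import android.os.Bundle;
import android.view.View;
import android.widget.ImageView;

public class AccessibleActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main); // hypothetical layout

        // Equivalent of android:contentDescription in XML:
        // TalkBack reads this text when the image gains accessibility focus.
        ImageView photo = (ImageView) findViewById(R.id.photo); // hypothetical ID
        photo.setContentDescription("Profile photo of the current user");

        // Purely decorative views should be hidden from accessibility services
        // so TalkBack does not stop on them.
        View divider = findViewById(R.id.divider); // hypothetical ID
        divider.setImportantForAccessibility(View.IMPORTANT_FOR_ACCESSIBILITY_NO);
    }
}
```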
As an application developer, you don't need to specifically integrate your app with TalkBack. Instead, you should focus on providing correct data to the accessibility framework. This will ensure that your application works not only with TalkBack, but also with Braille and switch-based accessibility services.
See the Android Developer guide on Making Applications Accessible for an overview of what steps you need to take to ensure your application works correctly with accessibility services.
You may also want to watch the Google I/O 2012 talk Making Android Apps Accessible, which covers basic application accessibility.
I'm developing a web app for Android phones, and I'm curious whether it is possible to initiate a call to a user from the web interface. So basically, I want to call someone when clicking on their phone number on the web page.
While testing on a Samsung GT-I9103, I noticed that it has this functionality: on a web page, when the user clicks on a number, the screen for initiating a phone call is shown. So there is a way to do what I want. But this functionality doesn't exist on a Sony Ericsson, which makes me believe it really depends on the manufacturer. Am I right?
Also, I've checked the PhoneGap documentation (http://docs.phonegap.com/en/2.2.0/index.html) but can't find what I need.
So, is it possible to do this?
Thanks.
Yes, from what I have read, this can be done to some extent through JavaScript.
Refer to:
http://developer.android.com/guide/webapps/webview.html
Section: Binding JavaScript code to Android code
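Here is a rough sketch of that approach, assuming your page runs inside a WebView that you control: a small JavaScript interface whose method opens the dialer with ACTION_DIAL. The interface name ("Android"), the activity, and the loaded URL are all illustrative; your page would call something like Android.dial("+15551234567").

```java
import android.annotation.SuppressLint;
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.net.Uri;
import android.os.Bundle;
import android.webkit.JavascriptInterface;
import android.webkit.WebView;

public class WebCallActivity extends Activity {

    /** Exposed to the page; ACTION_DIAL opens the dialer without extra permissions. */
    public static class DialerBridge {
        private final Context context;

        DialerBridge(Context context) {
            this.context = context;
        }

        @JavascriptInterface
        public void dial(String phoneNumber) {
            Intent intent = new Intent(Intent.ACTION_DIAL, Uri.parse("tel:" + phoneNumber));
            context.startActivity(intent);
        }
    }

    @SuppressLint("SetJavaScriptEnabled")
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        WebView webView = new WebView(this);
        setContentView(webView);
        webView.getSettings().setJavaScriptEnabled(true);
        // The page can now call: Android.dial("+15551234567");
        webView.addJavascriptInterface(new DialerBridge(this), "Android");
        webView.loadUrl("https://example.com/contacts"); // hypothetical URL
    }
}
```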
I'm working on an application that requires support for forward locking of media files on Android (1.6 and above). Of course, there appears to be no documentation in the APIs on how this might work.
The two questions I have are firstly whether forward locking is supported on Android (and in which versions of the platform) and secondly how to implement it in a program that, for example, downloads DRMed ringtones and wallpapers.
Android isn't that big on DRM, because of the open source heritage of the product. I think you'll have to implement any DRM solution yourself.
Forward lock means that your application offers no way to use protected content on other devices. Literally, this means that your application must not have functionality for forwarding (sending) protected content to other devices, or for writing it to a file system, or anything else like that. For an open platform like Android, that may also require encrypting/obfuscating your application's content store to prevent access from other apps or from a USB-connected host computer.
A forward lock is identified by a flag in the media metadata (format is media-dependent IMHO). The content is not encrypted. Thus a forward lock is a simple check per content item to disable forwarding functionality as needed.
The whole concept came from and worked with closed embedded devices (like typical mobile phones 5 years ago) and sounds strange with open platforms like Android.
You have to ensure that applications which may want to transfer the content to another device (via Bluetooth, etc.) don't get permission to transfer it.
OMA has defined how forward lock works, so that tells you exactly what has to be done.
You have to check the content's header information to determine whether it has to be forward locked or not.
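Since the platform gives you little help here (especially on Android 1.6), the per-item check described above ends up being something you implement yourself. This is only an illustrative sketch with a made-up forwardLocked flag (which your downloader would parse from the content's header/metadata), not a standard Android API.

```java
public final class ForwardLockPolicy {

    /** Minimal model of a downloaded item; the forwardLocked flag is hypothetical
     *  and would be parsed from the content's header/metadata by your downloader. */
    public static class ContentItem {
        public final String fileName;
        public final boolean forwardLocked;

        public ContentItem(String fileName, boolean forwardLocked) {
            this.fileName = fileName;
            this.forwardLocked = forwardLocked;
        }
    }

    /** Forwarding (share, Bluetooth send, copy to SD card, etc.) must be disabled
     *  for forward-locked items; everything else may be shared normally. */
    public static boolean mayForward(ContentItem item) {
        return !item.forwardLocked;
    }
}
```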