I'd like to code my PhoneGap app so that, at a minimum, the audio that is streaming/playing can stay active across pages. Even better would be a way to have it interact with the OS and stop when a call comes in, when the user plays their own music, or in similar situations.
Has anyone accomplished either of these things yet?
Thanks!
The PhoneGap Media API plays just fine in the background on both iOS and Android.
In iOS you have to make sure to set the required background modes to include playback of audio:
"Support for some types of background execution must be declared in advance by the app that uses them. An app declares support for a service using its Info.plist file. Add the UIBackgroundModes key to your Info.plist file and set its value to an array containing one or more of the following strings:
audio—The app plays audible content to the user while in the background. (This content includes streaming audio or video content using AirPlay.)"
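For reference, the Info.plist entry it describes looks roughly like this (a minimal sketch; where it sits among your other keys depends on the project):
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>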
I don't think anything extra is required for Android.
NOTE: If you have multiple "pages" in your app and music must continue to play across them, be sure that the page changes are not actually "file" changes (i.e., you are still really displaying index.html and just changing content via JS/Ajax/etc.). The common mobile JS frameworks handle this just fine (jQuery Mobile, jQTouch, etc.).
Background
Some teams in my company's facilities are using Android tablets for multiple operations. We would like to allow them to access video demos from their tablets, when they are at specific places in the facility, without having to search through the tablet for the right video. So we thought of QR codes (printed on paper, stuck to the wall). Problem is: for security reasons, we can't allow the devices to access any network.
Question
So the idea is to store the videos locally, and have the QR code route to the local file. I thought this would be easy but it doesn't appear to be.
I have generated QR codes like file:///path/to/my/file.mp4, and have also tried playing with intents, like this:
file:///path/to/my/file.mp4#Intent;scheme=file;action=android.intent.action.VIEW;type=video/mp4;end.
(+ variants including intent://, using scheme=file, targeting images instead of videos...)
In the best cases, the browser opens, closes, and I receive "Cannot display file. Invalid PDF", and in the worst cases, "Sorry, the application could not be launched. The bar code content may be invalid.".
I also tried variants (images instead of videos, URL vs plain text QR codes, several code scanner applications), all with the same result. I have searched through SO, but most questions are about doing this programmatically in an application, while I would like to avoid designing an app just for this.
What am I doing wrong? Is there simply no way to do it like this?
Note: I had absolutely no knowledge about Android and intents before trying to solve this case, there may be smarter workarounds. Any hint is appreciated.
Given the network restriction within your company, it seems that building a simple application is the easiest solution (maybe because I am an app developer).
So your app would only need two screens: a QR code scanner and a view that shows the video (once you tap Done, go back to scanning).
The app can have all the videos you need bundled in the app itself, and the QR codes can simply contain the names of those videos.
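A rough sketch of the video screen under those assumptions (the layout, extra key, and resource names here are hypothetical; the QR content is just the name of a bundled raw resource):
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.Toast;
import android.widget.VideoView;

public class VideoPlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_video_player); // layout containing a single VideoView

        // The scanner screen passes the decoded QR text, e.g. "demo_pump_startup".
        String qrContent = getIntent().getStringExtra("qr_content");
        int resId = getResources().getIdentifier(qrContent, "raw", getPackageName());
        if (resId == 0) {
            Toast.makeText(this, "Unknown video: " + qrContent, Toast.LENGTH_LONG).show();
            finish();
            return;
        }

        // Play the bundled video; when it finishes, return to the scanner screen.
        VideoView videoView = findViewById(R.id.video_view);
        videoView.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + resId));
        videoView.setMediaController(new MediaController(this));
        videoView.setOnCompletionListener(mp -> finish());
        videoView.start();
    }
}
The scanner screen can be built on any of the common barcode libraries; all it has to do is start this activity with the decoded text.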
Okay, I don't know if you are using a web page or an Android app for this feature. For a web page, you need to create your own scanner or integrate a ready-made solution (I found one here); you will need setInterval so that it scans the barcode once. Once a barcode is found, call your Ajax method to fetch the complete URL of that video, and once an item is returned, display it in an iframe.
Hope it works for you.
I implemented screen sharing (TokBox) for my application.
It works fine inside my app.
But I cannot share the screen outside my app...
Can anyone please help?
https://tokbox.com/developer/guides/screen-sharing/android/
I had also faced the same issue and had emailed TokBox support.
This was their response:
The way our screen capture code works is that it recursively traverses the view hierarchy, copies those images to a buffer, and then sends that buffer over the WebRTC data pipe. Hence, once the app is pushed to the background, we cannot traverse the view hierarchy and copy the image, so screen sharing only works while we are in the application (Android or iOS native app). If you want to share the screen view of the OpenTok app only, it will work, but outside the app it won't. It's just to take care of the privacy and security aspects of the mobile app users.
So, according to them, you cannot share the screen outside the application; it will only work when the app is in the foreground.
Update
After constantly asking the tokbox support team I got the following reply from them:
Screensharing content outside of your application on Android and iOS can be achieved.
For Android, you need to use the Media Projection API together with Vonage/Tokbox Custom Capturer.
For iOS, you need to use the iOS ReplayKit together with Vonage/Tokbox Custom Capturer.
Basically, the implementation is to get a frame from the Media Projection API or ReplayKit and then pass it via a custom capturer.
Following their response, I found the Accelerator Core Android repo, which shows how to integrate the Media Projection API with TokBox.
More specifically these two files: ScreenSharingFragment.java and ScreenSharingCapturer.java
Using these two files I am now able to share screen outside my application.
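For anyone looking for a starting point, the projection side of that looks roughly like this (a sketch only; the capturer hand-off is summarized in a comment, because the details live in ScreenSharingCapturer from that sample):
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.media.Image;
import android.media.ImageReader;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.os.Handler;
import android.os.Looper;

public class ScreenShareActivity extends Activity {
    private static final int REQUEST_MEDIA_PROJECTION = 1001;
    private MediaProjectionManager projectionManager;

    void startScreenShare() {
        projectionManager = (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        // Shows the system "start capturing" consent dialog.
        startActivityForResult(projectionManager.createScreenCaptureIntent(), REQUEST_MEDIA_PROJECTION);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode != REQUEST_MEDIA_PROJECTION || resultCode != RESULT_OK) return;

        MediaProjection projection = projectionManager.getMediaProjection(resultCode, data);
        int width = 720, height = 1280, dpi = 320; // pick values that match your stream settings

        // The virtual display mirrors the whole screen (including other apps) into an ImageReader.
        ImageReader reader = ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 2);
        projection.createVirtualDisplay("screen-share", width, height, dpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR, reader.getSurface(), null, null);

        reader.setOnImageAvailableListener(r -> {
            Image image = r.acquireLatestImage();
            if (image == null) return;
            // Copy the pixel buffer out of 'image' and feed it to the OpenTok custom capturer
            // (this is essentially what ScreenSharingCapturer in the Accelerator Core sample does).
            image.close();
        }, new Handler(Looper.getMainLooper()));
    }
}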
Note:
Apps that target Android 9 (API level 28) or higher should use a foreground service, or else your app will crash for security reasons.
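To illustrate that note, the manifest entries for such a foreground service look roughly like this (the service class name is hypothetical; the foregroundServiceType attribute exists from API 29 onward):
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />

<service
    android:name=".ScreenShareService"
    android:foregroundServiceType="mediaProjection" />
Inside the service, call startForeground() with a notification before the capture begins.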
According to Tokbox, we can't share the screen outside the application.
Manik here from the Video API team.
To screenshare the contents outside of your application on the Android platform, you need to use the Media Projection API. In combination with the Media Projection API, you need to use a Custom Capturer.
We're working on a sample application that will allow you to accomplish this - please stay tuned!
I am trying to create a music player using Flutter, and I want it to show up on "Open with" screens and share screens whenever an audio track is selected to be opened or shared, or when the system lists the default apps to choose from when setting a default for audio tracks.
I am using multiple Android/iOS plugins in an interconnected way, and would like not to rely on more plugins, as they really hurt performance and create a certain lag that damages the UX.
Is there a way to do this in Dart directly?
You could use this plugin: https://pub.dev/packages/receive_sharing_intent
The only thing you need to do is add extra configuration in your AndroidManifest.xml and Info.plist. See the example page of the plugin:
https://pub.dev/packages/receive_sharing_intent#-example-tab-
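The configuration it asks for is, in essence, an intent filter on your main activity so the app shows up in the share and "Open with" lists for audio. The exact entries to use are on the plugin page; the general Android shape of it is something like:
<activity android:name=".MainActivity">
    <intent-filter>
        <action android:name="android.intent.action.SEND" />
        <category android:name="android.intent.category.DEFAULT" />
        <data android:mimeType="audio/*" />
    </intent-filter>
    <intent-filter>
        <action android:name="android.intent.action.VIEW" />
        <category android:name="android.intent.category.DEFAULT" />
        <data android:mimeType="audio/*" />
    </intent-filter>
</activity>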
Edit: I see you didn't want to rely on extra plugins. Sadly there isn't a way to do this without adding native code to your project.
I'm using Delphi XE5 for an app that creates an .avi video file for the user.
After successfully creating it (or even just because the user wants to watch the video again) it tries to open it using the app of choice for videos.
I understand that Android picks the right app for the file type, or you can direct it by providing the MIME type of the file.
So, using intents from Delphi, I'm doing:
intent := TJIntent.Create;
intent.setAction(TJIntent.JavaClass.ACTION_VIEW);
intent.setData(TJnet_Uri.JavaClass.parse(StringToJString('file://' + videoFile)));
intent.setType(StringToJString('video/avi'));
When I do this, a very basic video player appears (the Running Apps section of Settings shows an app called Android Media running) with a play button, but it does not show the video.
But if you use any file manager, or even go through the Gallery and tap the file, it plays nicely using the View Video app.
I can't make my app call View Video directly, or even show a list of video apps for the user to choose from.
I tried different MIME types like
intent.setType(StringToJString('video/*'));
and even
intent.setType(StringToJString('*/*'));
which lets the user choose from any app on the device (nonsense, but just for testing) so they can pick View Video, but even then the app shows only a play button and displays no content when pressed.
So, it looks like View Video can't play the video I created when called from my app (I do not have the file open or anything like that; I checked), but it can when called from other apps.
Does anybody know of bugs or limitations when using intents from Delphi XE5, or maybe I'm not doing it right?
intent.setType in Delphi XE5 was wiping out the contents of the data property (Android's Intent.setType clears any previously set data URI).
Using the combined setter setDataAndType works well. Use that one!
Not sure if they fixed it in XE6 or XE7.
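For anyone hitting the same thing, the working variant looks roughly like this (a sketch of the fix only; setType is dropped because it would clear the data URI):
intent := TJIntent.Create;
intent.setAction(TJIntent.JavaClass.ACTION_VIEW);
// Set the URI and MIME type in one call so neither resets the other.
intent.setDataAndType(TJnet_Uri.JavaClass.parse(StringToJString('file://' + videoFile)),
  StringToJString('video/avi'));
// then launch it the same way as before (e.g. SharedActivityContext.startActivity(intent))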
I would like to create an emergency call application: if triggered, it calls a given number and plays an audio file, giving the information the caller couldn't give themselves.
For that I need to place a call but ensure that I can replace any sound from the speaker with a played audio file. Can I do that in Android? What's the way?
You can't do this with the G1 at this time, because two different processors handle the call and the apps, and there is no path between them. AFAIK this isn't in the SDK yet, but assuming hardware comes out that can do it, it will be added to the SDK.
http://groups.google.com/group/android-developers/browse_thread/thread/d04c307973345fef/a628e578900b3dce?lnk=gst&q=dave+sparks+play+audio#a628e578900b3dce