How to implement on-demand Play Asset Delivery in Flutter through method channels?
I am actually trying to make a dashboard for an app that accesses assets using a ContentProvider, so I thought Play Asset Delivery might work here.
I know the same thing can be achieved using deferred components, which Flutter provides, and I have already tried them.
You can find it here.
Currently, there is an issue with Flutter that causes assets not to load when deferred. You can find a link to the issue here.
I have no knowledge of the native language, and implementing a method channel is the only option I have right now, so any help would be appreciated.
I don't think you can achieve the same thing using Play Asset Delivery, but you can try a workaround:
Disable the provider by default by adding android:enabled="false" to its entry in your manifest, and then use the method channel below to enable it later.
package dev.blah.blah;

import androidx.annotation.NonNull;
import io.flutter.embedding.android.FlutterActivity;
import io.flutter.embedding.engine.FlutterEngine;
import io.flutter.plugin.common.MethodChannel;
import android.content.ComponentName;
import android.content.pm.PackageManager;
import android.widget.Toast;

public class MainActivity extends FlutterActivity {
    private static final String CHANNEL = "dev.dhanraj.kwgt.test.dashboard";

    @Override
    public void configureFlutterEngine(@NonNull FlutterEngine flutterEngine) {
        super.configureFlutterEngine(flutterEngine);
        new MethodChannel(flutterEngine.getDartExecutor().getBinaryMessenger(), CHANNEL)
            .setMethodCallHandler(
                (call, result) -> {
                    // Note: this method is invoked on the main thread.
                    if (call.method.equals("enable")) {
                        // Flip the provider that shipped with android:enabled="false"
                        // to the enabled state; DONT_KILL_APP keeps the app running.
                        getApplicationContext().getPackageManager().setComponentEnabledSetting(
                            new ComponentName(getApplicationContext(), "org.kustom.api.Provider"),
                            PackageManager.COMPONENT_ENABLED_STATE_ENABLED,
                            PackageManager.DONT_KILL_APP);
                        result.success(null);
                        Toast.makeText(this, "Done", Toast.LENGTH_SHORT).show();
                    } else {
                        result.notImplemented();
                    }
                }
            );
    }
}
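As an optional extra (not part of the original answer), you could check the component's current state before toggling it; this is a minimal sketch of a hypothetical helper you could drop into MainActivity, assuming the same "org.kustom.api.Provider" component name as above:

import android.content.ComponentName;
import android.content.Context;
import android.content.pm.PackageManager;

// Hypothetical helper: reports whether the provider is currently enabled.
// COMPONENT_ENABLED_STATE_DEFAULT falls back to the manifest value, which is
// "false" under this workaround, so only the explicit ENABLED state counts.
public static boolean isProviderEnabled(Context context) {
    ComponentName component =
        new ComponentName(context, "org.kustom.api.Provider");
    int state = context.getPackageManager().getComponentEnabledSetting(component);
    return state == PackageManager.COMPONENT_ENABLED_STATE_ENABLED;
}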
Related
I am working with Firebase Messaging and followed the steps given in the plugin's README, but my Application.java is giving an error.
Application.java
package com.app.demoapp;

import com.transistorsoft.flutter.backgroundfetch.BackgroundFetchPlugin;
import io.flutter.plugin.common.PluginRegistry.PluginRegistrantCallback;
import io.flutter.app.FlutterApplication;
import io.flutter.plugin.common.PluginRegistry;
import io.flutter.plugins.GeneratedPluginRegistrant;
import io.flutter.plugins.firebasemessaging.FlutterFirebaseMessagingService;

public class Application extends FlutterApplication implements PluginRegistry.PluginRegistrantCallback {
    public void onCreate() {
        super.onCreate();
        FlutterFirebaseMessagingService.setPluginRegistrant(this);
        BackgroundFetchPlugin.setPluginRegistrant(this);
    }

    @Override
    public void registerWith(PluginRegistry registry) {
        GeneratedPluginRegistrant.registerWith(registry);
    }
}
Error:
error: cannot find symbol
FlutterFirebaseMessagingService.setPluginRegistrant(this);
^
symbol: method setPluginRegistrant(Application)
location: class FlutterFirebaseMessagingService
1 error
I have faced the same problem and so far I have not found any solution.
But if you just want to show the notification, without handling it in the background, and simply launch the app when it is tapped,
remove FlutterFirebaseMessagingService.setPluginRegistrant(this); and notifications will work fine as the Notification message type. The trimmed class is sketched below.
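For illustration, that leaves the Application class from the question looking like this (nothing changed except the removed line):

package com.app.demoapp;

import com.transistorsoft.flutter.backgroundfetch.BackgroundFetchPlugin;
import io.flutter.app.FlutterApplication;
import io.flutter.plugin.common.PluginRegistry;
import io.flutter.plugins.GeneratedPluginRegistrant;

public class Application extends FlutterApplication implements PluginRegistry.PluginRegistrantCallback {
    @Override
    public void onCreate() {
        super.onCreate();
        // FlutterFirebaseMessagingService.setPluginRegistrant(this); removed:
        // background handling is gone, but Notification messages still display.
        BackgroundFetchPlugin.setPluginRegistrant(this);
    }

    @Override
    public void registerWith(PluginRegistry registry) {
        GeneratedPluginRegistrant.registerWith(registry);
    }
}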
If you don't know about the Notification type in FCM,
refer to Message types:
With FCM, you can send two types of messages to clients:
1- Notification messages, sometimes thought of as "display messages." These are handled by the FCM SDK automatically.
2- Data messages, which are handled by the client app.
So we use Notification messages here until a solution is found for handling Data messages.
Looks like the instructions file is outdated; it is missing a very important step that you can check in the GitHub repository README:
Add the com.google.firebase:firebase-messaging dependency to your app-level build.gradle file, typically located at /android/app/build.gradle.
dependencies {
    // ...
    implementation 'com.google.firebase:firebase-messaging:20.1.0'
}
Just set it up with:
import io.flutter.app.FlutterApplication;
import io.flutter.plugin.common.PluginRegistry;
import io.flutter.plugin.common.PluginRegistry.PluginRegistrantCallback;
import io.flutter.plugins.firebasemessaging.FlutterFirebaseMessagingService;

public class Application extends FlutterApplication implements PluginRegistrantCallback {
    @Override
    public void onCreate() {
        super.onCreate();
        FlutterFirebaseMessagingService.setPluginRegistrant(this);
    }

    @Override
    public void registerWith(PluginRegistry registry) {
        FirebaseCloudMessagingPluginRegistrant.registerWith(registry);
    }
}
And create FirebaseCloudMessagingPluginRegistrant.java, as sketched below.
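A minimal sketch of that class, following the pattern the firebase_messaging plugin's README described at the time (the registrar key string is the plugin's own convention; verify it against your plugin version):

package com.app.demoapp;

import io.flutter.plugin.common.PluginRegistry;
import io.flutter.plugins.firebasemessaging.FirebaseMessagingPlugin;

public class FirebaseCloudMessagingPluginRegistrant {
    public static void registerWith(PluginRegistry registry) {
        if (alreadyRegisteredWith(registry)) {
            return;
        }
        // Register only the messaging plugin for the background isolate.
        FirebaseMessagingPlugin.registerWith(
            registry.registrarFor("io.flutter.plugins.firebasemessaging.FirebaseMessagingPlugin"));
    }

    private static boolean alreadyRegisteredWith(PluginRegistry registry) {
        final String key = FirebaseCloudMessagingPluginRegistrant.class.getCanonicalName();
        if (registry.hasPlugin(key)) {
            return true;
        }
        // Mark this registrant as used so re-entry is a no-op.
        registry.registrarFor(key);
        return false;
    }
}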
I'm attempting to update a plugin for a React Native project that was originally developed in, and worked with, 0.40.0.
Our current project is now on:
react-native-cli: 2.0.1
react-native: 0.60.0
In the code we have this method:
private void sendEvent(ReactContext reactContext,
                       String eventName,
                       @Nullable Object params) {
    reactContext
        .getJSModule(RCTNativeAppEventEmitter.class)
        .emit(eventName, params);
}
Which uses this import to get the Nullable class:
import android.support.annotation.Nullable;
The problem is that when we run react-native run-android, we get an error:
error: package android.support.annotation does not exist
import android.support.annotation.Nullable;
^
Any idea how I can get access to this Nullable class again, or re-create the functionality, so that I can pass in either null or something like an int?
Examples of how I'd call it:
public void UpdatePoints(int points) {
    sendEvent(this.reactContext, "UpdatePoints", points);
}

@Override
public void onPointsEarned() {
    sendEvent(this.reactContext, "onPointsEarned", null);
}
I found another SO article that recommended this:
// build.gradle
implementation "androidx.annotation:annotation:1.1.0"

// where you use it
import androidx.annotation.Nullable;
But that didn't work. I've found a few things mentioning a transition from support to AndroidX, and it seems like this may be related.
Looks like all you need to do is handle AndroidX by adding import androidx.annotation.Nullable; instead of import android.support.annotation.Nullable;, as in the sketch below.
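For illustration, here is the question's sendEvent fragment with only that import swapped (this assumes the androidx.annotation:annotation dependency already mentioned in the question is on the classpath, which React Native 0.60+ projects normally provide):

import androidx.annotation.Nullable;

import com.facebook.react.bridge.ReactContext;
import com.facebook.react.modules.core.RCTNativeAppEventEmitter;

// Same method as in the question; only the source of @Nullable changed.
private void sendEvent(ReactContext reactContext,
                       String eventName,
                       @Nullable Object params) {
    reactContext
        .getJSModule(RCTNativeAppEventEmitter.class)
        .emit(eventName, params);
}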
My task is as follows: using IBM MobileFirst, create a hybrid app that implements a JS calculator and shows a date retrieved from native Java APIs on the web page.
My attempts:
I followed the documentation here and implemented the whole native code in the onCreate method.
I found this answer (the first one) illustrating that I should use it in onInitWebFrameworkComplete.
The solution provided didn't work.
I am working with MobileFirst version 7.
Full sample code is provided.
Suggestion: should I create the whole action bar in native code and then merge it into the cross-platform UI? Is that available? I only need to send a short date string.
I am not clear on your attempts, so here is a quick demonstration of how to click a button in HTML, trigger the Send Action API to get the current date in Java, return it to JavaScript, and then display it.
index.html
<button onclick="getDateFromJava();">show current date from Java</button>
main.js
function wlCommonInit() {
    WL.App.addActionReceiver("returnedDateFromJava", returnedDateFromJava);
}

function getDateFromJava() {
    WL.App.sendActionToNative("retrieveDate");
}

function returnedDateFromJava(received) {
    if (received.action === "returnedDateFromJava") {
        alert(JSON.stringify(received));
    }
}
Main Java class file:
Find onInitWebFrameworkComplete and add an ActionReceiver after the else block:
import com.worklight.androidgap.api.WLActionReceiver;
...
...
public void onInitWebFrameworkComplete(WLInitWebFrameworkResult result) {
    if (result.getStatusCode() == WLInitWebFrameworkResult.SUCCESS) {
        super.loadUrl(WL.getInstance().getMainHtmlFilePath());
    } else {
        handleWebFrameworkInitFailure(result);
    }
    ActionReceiver actionReceiver = new ActionReceiver();
    WL.getInstance().addActionReceiver(actionReceiver);
}
ActionReceiver class
package com.getDateApp;
import java.util.Date;
import org.json.JSONException;
import org.json.JSONObject;
import com.worklight.androidgap.api.WL;
import com.worklight.androidgap.api.WLActionReceiver;
public class ActionReceiver implements WLActionReceiver{
public void onActionReceived(String action, JSONObject data){
if (action.equals("retrieveDate")){
Date date = new Date();
JSONObject returnedDate = new JSONObject();
try {
returnedDate.put("dateFromJava", date);
} catch (JSONException e) {
e.printStackTrace();
}
WL.getInstance().sendActionToJS("returnedDateFromJava", returnedDate);
}
}
}
I have been coding in Flash for a while, though I haven't worked a whole lot on AIR Mobile yet. That being said, when debugging on an Android device via USB, Video objects sometimes work just fine when Camera or NetStream objects are attached, and sometimes they refuse to do anything. In the case of a NetStream, the audio is also missing. Also in the case of a NetStream object, if I go to the device's desktop, then come back to the app without ever closing it, then the Video object will suddenly start working.
I have two basic methods of reproducing this:
1: For a Camera:
package
{
    import flash.display.Sprite;
    import flash.display.StageAlign;
    import flash.display.StageScaleMode;
    import flash.events.Event;
    import flash.events.TimerEvent;
    import flash.media.Camera;
    import flash.media.Video;
    import flash.utils.Timer;

    public class CameraExample extends Sprite
    {
        private var m_cam:Camera;
        private var m_vid:Video = new Video();
        private var m_tmr:Timer = new Timer(1000);

        public function CameraExample()
        {
            stage ? init() : addEventListener(Event.ADDED_TO_STAGE, init);
            function init(pEvent:Event = null):void {
                // support autoOrients
                stage.align = StageAlign.TOP_LEFT;
                stage.scaleMode = StageScaleMode.NO_SCALE;
                m_tmr.addEventListener(TimerEvent.TIMER, onTimer);
                m_tmr.start();
            }
        }

        private function onTimer(pEvent:TimerEvent):void
        {
            if (m_cam = Camera.getCamera())
            {
                m_tmr.removeEventListener(TimerEvent.TIMER, onTimer);
                m_tmr.stop();
                m_vid.attachCamera(m_cam);
                addChild(m_vid);
                trace("here");
            }
        }
    }
}
2: For a NetStream (this is two programs):
package
{
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.events.NetStatusEvent;
    import flash.events.TimerEvent;
    import flash.media.Camera;
    import flash.media.Microphone;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.utils.Timer;

    public class NetStreamExample1 extends Sprite
    {
        private var m_nc:NetConnection = new NetConnection();
        private var m_ns:NetStream;
        private var m_cam:Camera;
        private var m_mic:Microphone;
        private var m_tmr:Timer = new Timer(1000);

        public function NetStreamExample1()
        {
            m_tmr.addEventListener(TimerEvent.TIMER, onTimer);
            m_tmr.start();
        }

        private function onTimer(pEvent:TimerEvent):void
        {
            if (m_cam = Camera.getCamera())
            {
                m_tmr.removeEventListener(TimerEvent.TIMER, onTimer);
                m_tmr.stop();
                m_mic = Microphone.getMicrophone();
                m_nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatusNC);
                m_nc.connect("rtmp://SomeIPAddress/SomeApp/0");
            }
        }

        private function onNetStatusNC(pEvent:NetStatusEvent):void
        {
            if (pEvent.info.code == "NetConnection.Connect.Success")
            {
                m_ns = new NetStream(m_nc);
                m_ns.attachCamera(m_cam);
                m_ns.attachAudio(m_mic);
                m_ns.publish("0", "live");
            }
        }
    }
}
And:
package
{
    import flash.display.Sprite;
    import flash.display.StageAlign;
    import flash.display.StageScaleMode;
    import flash.events.Event;
    import flash.events.NetStatusEvent;
    import flash.events.TimerEvent;
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.utils.Timer;

    public class NetStreamExample2 extends Sprite
    {
        private var m_nc:NetConnection = new NetConnection();
        private var m_ns:NetStream;
        private var m_vid:Video = new Video();
        private var m_tmr:Timer = new Timer(5000, 1);

        public function NetStreamExample2()
        {
            stage ? init() : addEventListener(Event.ADDED_TO_STAGE, init);
            function init(pEvent:Event = null):void {
                // support autoOrients
                stage.align = StageAlign.TOP_LEFT;
                stage.scaleMode = StageScaleMode.NO_SCALE;
                addChild(m_vid);
                m_nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatusNC);
                m_nc.connect("rtmp://SomeIPAddress/SomeApp/0");
            }
        }

        private function onNetStatusNC(pEvent:NetStatusEvent):void
        {
            trace("m_nc: " + pEvent.info.code);
            if (pEvent.info.code == "NetConnection.Connect.Success")
            {
                m_ns = new NetStream(m_nc);
                m_ns.addEventListener(NetStatusEvent.NET_STATUS, onNetStatusNS);
                m_ns.play("0", "live");
                m_vid.attachNetStream(m_ns);
                m_tmr.addEventListener(TimerEvent.TIMER, onTimer);
                m_tmr.start();
            }
        }

        private function onNetStatusNS(pEvent:NetStatusEvent):void
        {
            trace("m_ns: " + pEvent.info.code);
        }

        private function onTimer(pEvent:TimerEvent):void
        {
            trace(m_ns.info.videoBytesPerSecond);
        }
    }
}
For the NetStream example, run the first part directly out of the IDE as a desktop AIR app, and run the second part on the Android device from the IDE over USB. For the FMS, just add a blank app called "SomeApp" without any code.
In the NetStream example, I can see comparable amounts being traced for m_ns.info.videoBytesPerSecond. The output when it's working is:
m_nc: NetConnection.Connect.Success
m_ns: NetStream.Play.Reset
m_ns: NetStream.Play.Start
m_ns: NetStream.Video.DimensionChange
~15-20k
And when it's not working, it's:
m_nc: NetConnection.Connect.Success
m_ns: NetStream.Play.Reset
m_ns: NetStream.Play.Start
~15-20k
The output is identical in both cases in the Camera example. Either example will either succeed or fail at random.
As stated earlier, in the NetStream example, if I run back to the Android device's desktop, then go back to the app without ever closing it, it'll suddenly start working. In that case, a NetStream.Video.DimensionChange event will be logged when I come back to the app. However this doesn't work for the Camera example.
I have tried doing a few things, like recreating the NetStream or the Video, but nothing like that seems to work. In general, when this problem is encountered, Video objects seem to be useless throughout the program, as the whole Video class appears to fail internally.
I have tried this on a Galaxy Tab Pro and on my personal LG Access phone. The phone didn't have any issues with the Camera part, but it was definitely vulnerable to the NetConnection issue.
Does anyone recognize this or know what is happening? Does anyone know a good way to fix it? This does not happen when running it out of the mobile emulators in the IDE. I'm using Flash Builder 4.7. Thanks!
This will apparently only happen when running in debug mode. When I switched over to release mode, the problem went away, so something is wrong with Adobe's debugger. That being said, the AIR SDK I was using at the time I wrote this question was 3.4 (the default for Flash Builder 4.7), so I'm not sure whether this has been fixed in later releases.
I am going to post this answer for now and accept it (just because Stack Overflow really encourages having accepted answers), but if someone posts something better and more informative, I'll change which answer I'm accepting.
I am trying to get a list of files in a folder on Google Drive from my Android app, but have been unsuccessful so far. I'm using google-api-drive-v1-rev4-java-1.6.0-beta and google-api-client-1.9.0. I'm also building my code similarly to the calendar-android-sample and tasks-android-sample from the samples at http://code.google.com/p/google-api-java-client/wiki/Android.
I can't seem to find how to use files() to get a list of folders, or even the ID of the folder I want. The tasks-android-sample uses '@default' in the get() method to get a list of tasks. What would I use in the get method to first get a list of folders, search for my folder, get its ID, and then get a list of files in that folder?
AsyncLoadDocs.java: (Note: I'm using getFields() just to see whether the Get object contains any metadata, which at this point it doesn't.)
package com.mysite.myapp.docs;

import com.google.api.services.drive.Drive;
import com.google.api.services.drive.Drive.Files;
import com.google.api.services.drive.Drive.Files.Get;
import com.google.api.services.drive.model.File;
import android.app.ProgressDialog;
import android.os.AsyncTask;
import android.util.Log;
import android.widget.ArrayAdapter;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

/**
 * Asynchronously load the docs with a progress dialog.
 *
 * @author ms
 */
class AsyncLoadDocs extends AsyncTask<Void, Void, List<String>> {
    private static final String TAG = "AsyncLoadDocs";
    private final GDocsSync gDocsSync;
    private final ProgressDialog dialog;
    private final Drive entry = null;
    private com.google.api.services.drive.Drive service;

    AsyncLoadDocs(GDocsSync gDocsSync) {
        this.gDocsSync = gDocsSync;
        service = gDocsSync.driveClient;
        dialog = new ProgressDialog(gDocsSync);
    }

    @Override
    protected void onPreExecute() {
        dialog.setMessage("Loading docs...");
        dialog.show();
    }

    @Override
    protected List<String> doInBackground(Void... arg0) {
        try {
            List<String> folderNames = new ArrayList<String>();
            Get get = service.files().get("@default").setProjection("FULL");
            String fields = get.getFields();
            Log.d(TAG, "Fields: " + fields);
            return folderNames;
        } catch (IOException e) {
            gDocsSync.handleGoogleException(e);
            return Collections.singletonList(e.getMessage());
        } finally {
            gDocsSync.onRequestCompleted();
        }
    }

    @Override
    protected void onPostExecute(List<String> result) {
        dialog.dismiss();
    }
}
Any help would be appreciated. Both the Calendar and Tasks samples successfully retrieve data from Google using my API key, so why doesn't this Drive code work?
The Drive API (v1) grants access only to two classes of files:
Files that a user has created with a given Drive app
Files that a user opens with a given Drive app
For security reasons, there is no method to list all of the files in a user's Drive account:
https://developers.google.com/drive/apps_overview#granting_file-level_access
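In practice this means v1 code can only fetch a file whose ID it already has (for example, a file handed to your app when the user opens it with your app). A minimal sketch under that assumption, reusing the question's Drive service object (fetchKnownFile is a hypothetical helper, not part of the API):

import com.google.api.services.drive.Drive;
import com.google.api.services.drive.model.File;
import java.io.IOException;

// fileId must belong to a file this app created or the user opened with
// this app; v1 has no call to enumerate a folder's contents.
static File fetchKnownFile(Drive service, String fileId) throws IOException {
    return service.files().get(fileId).execute();
}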
For more options in the Android environment, check out these other answers:
Android API for Google Drive?
Google Drive\Docs API for Android