Issue loading a TFLite-based Estimator model on an Android device

I trained a Boosted Trees Estimator on structured data and tried to use that estimator for prediction on a mobile device, but the app produces the following error in Android Studio when I start it:
2020-09-30 12:26:34.892 8966-8966/com.example.tflitetest W/System.err: java.lang.IllegalStateException: Internal error: Unexpected failure when preparing tensor allocations: Encountered unresolved custom op: ParseExampleV2.
2020-09-30 12:26:34.892 8966-8966/com.example.tflitetest W/System.err: Node number 0 (ParseExampleV2) failed to prepare.
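For reference, the failure shows up as soon as the interpreter is created and tensor allocations are prepared. A minimal Kotlin sketch of that loading path (the model file name and the use of the support library's FileUtil are assumptions):

import android.content.Context
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.support.common.FileUtil

fun loadInterpreter(context: Context): Interpreter {
    // The IllegalStateException about ParseExampleV2 is raised here, while the
    // interpreter prepares its tensor allocations for the graph's first node.
    val modelBuffer = FileUtil.loadMappedFile(context, "model.tflite")
    return Interpreter(modelBuffer)
}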
I followed this guide for estimator training; the relevant export code is highlighted below:
# Serving function that parses serialized tf.Example protos -- this parsing step
# is what introduces the ParseExampleV2 op into the exported graph.
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
    tf.feature_column.make_parse_example_spec([input_column]))
estimator_base_path = os.path.join(tmpdir, 'from_estimator')
estimator_path = estimator.export_saved_model(estimator_base_path, serving_input_fn)
(If I am doing something wrong, kindly let me know about it.)
All data files (training/testing data, the sample Android app, and the .tflite models) are available at the following Google Drive link:
https://drive.google.com/drive/folders/1-qDhCJGNZyqu_-XoMyW0kvimyiDdMndO?usp=sharing
Edit 1: The usage of ParseExample is now included in the question.

Related

Unknown directive "model" when setting up AWS Amplify GraphQL in Android

I've been trying to set up GraphQL for my Android Kotlin project with AWS Amplify, but I am getting red warnings for #model as well as AuthRule and allow in:
This is the example Todo GraphQL schema, but even when I edit the schema the issue still occurs. I'm wondering if I am missing something that I didn't see in the documentation.
When I hover over #model, I see an Unknown directive "model" error message.
I tried removing and re-adding the API with amplify add api. The error persists, and after searching online I haven't found any recent solutions that help with this.

Authentication in an Android app without user login

I am trying to integrate a dialog system (Google Dialogflow CX) into an Android app and am trying to connect via REST. Apparently, CX doesn't support API keys, and the commonly used client library is not supported on Android either, so I am running into problems.
With the credentials available in the raw resources (the implementation I previously used with Dialogflow ES), I get the following error message:
07-09 12:10:27.841 2600-2600/com.softbankrobotics.jokeswithdialogflow E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.softbankrobotics.jokeswithdialogflow, PID: 2600
java.lang.NoClassDefFoundError: Failed resolution of: Ljava/time/Duration;
at com.google.auth.oauth2.OAuth2Credentials.<clinit>(OAuth2Credentials.java:70)
at com.google.auth.oauth2.ServiceAccountCredentials.fromStream(ServiceAccountCredentials.java:475)
at com.softbankrobotics.jokeswithdialogflow.data.DialogflowDataSource.<init>(DialogflowDataSource.kt:17)
at com.softbankrobotics.jokeswithdialogflow.MainActivity.onCreate(MainActivity.kt:52)
at android.app.Activity.performCreate(Activity.java:6257)
...
It seems the first error comes from the minimum SDK having to be at least 26, since java.time (and therefore java.time.Duration) is only available natively from API level 26. Unfortunately, I need minSdk 23 for my app to run on the Pepper robot, so I can't raise it.
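One possible way around that constraint is core library desugaring, which backports java.time to devices below API 26. A minimal sketch in the Gradle Kotlin DSL (the desugar_jdk_libs version is an assumption; use a current release):

// build.gradle.kts (app module)
android {
    compileOptions {
        // Backports java.time so java.time.Duration resolves even with minSdk 23.
        isCoreLibraryDesugaringEnabled = true
        sourceCompatibility = JavaVersion.VERSION_1_8
        targetCompatibility = JavaVersion.VERSION_1_8
    }
}

dependencies {
    coreLibraryDesugaring("com.android.tools:desugar_jdk_libs:1.1.5")
}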
I would like to authenticate once (e.g. with my own Google account), valid for all test users. Security aspects can be disregarded here, as the implementation is only intended for testing in a scientific study.
Thank you in advance - any help is appreciated!

Failed to find code-generated model provider - AWS Amplify

I'm getting the below error:
Failed to find a code-generated model provider.
The AWS Amplify code that throws this error:
Amplify.addPlugin(new AWSApiPlugin());
Amplify.addPlugin(new AWSDataStorePlugin());
Amplify.configure(context);
I am following these tutorials:
https://docs.amplify.aws/start/getting-started/generate-model/q/integration/android
https://docs.amplify.aws/cli/graphql-transformer/overview
I have tried generating the models, and they are generated successfully, but I still get the above exception when running the app.
When you generate models, you should expect to find various code-generated files in your application project. One of them will be app/src/main/java/com/amplifyframework/datastore/generated/model/AmplifyModelProvider.java.
When you build your app, Android Studio compiles that Java file into a class file and includes it in your binary.
At runtime, on the phone, the AWSDataStorePlugin() constructor will attempt to find that same AmplifyModelProvider by means of reflection.
I would verify that:
You do actually have the code-generated AmplifyModelProvider;
It is being built successfully;
It is not being stripped out by ProGuard/R8 minification.
If you're still not able to get it working, just use the one-argument version of the AWSDataStorePlugin(...) constructor, instead. That version allows you to explicitly specify the model provider, and does not use runtime reflection.
Amplify.addPlugin(AWSDataStorePlugin(AmplifyModelProvider.getInstance()))
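Put together, a minimal Kotlin sketch of that setup (the try/catch and logging are illustrative, not required):

import android.util.Log
import com.amplifyframework.AmplifyException
import com.amplifyframework.api.aws.AWSApiPlugin
import com.amplifyframework.core.Amplify
import com.amplifyframework.datastore.AWSDataStorePlugin
import com.amplifyframework.datastore.generated.model.AmplifyModelProvider

// e.g. in Application.onCreate()
try {
    Amplify.addPlugin(AWSApiPlugin())
    // Pass the generated model provider explicitly so no reflection lookup is needed.
    Amplify.addPlugin(AWSDataStorePlugin(AmplifyModelProvider.getInstance()))
    Amplify.configure(applicationContext)
} catch (error: AmplifyException) {
    Log.e("Amplify", "Failed to initialize Amplify", error)
}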
Your datasource needs to be updated:
Try running modelGen, then amplifyPush tasks:
Category: Api
Resource name: amplifyDatasource
Operation: Update
Provider plugin: awscloudformation

TensorFlow Lite GPU delegate failure

I am trying to use TensorFlow Lite with the GPU delegate on Android. I am using the library (.so files) built from source from the master branch of the repo. The problem is that the ModifyGraphWithDelegate function always returns an error, and the following error message appears in the logs:
2019-04-22 15:21:16.212 688-688/com.my.app E/tflite: TfLiteGpuDelegate Prepare: Shader compilation failed: ERROR: 0:6: 'unknown' : not a legal layout qualifier id
ERROR: 0:6: 'unknown' : Syntax error: syntax error
INTERNAL ERROR: no main() function!
ERROR: 2 compilation errors. No code generated.
2019-04-22 15:21:16.212 688-688/com.my.app E/tflite: Node number 54 (TfLiteGpuDelegate) failed to prepare.
If I use the prebuilt Java/JNI library ('org.tensorflow:tensorflow-lite:0.0.0-gpu-experimental'), as in the official example project, there are no such errors. But I really need to use the C++ interface for my cross-platform code.
Any thoughts / suggestions appreciated.
If you're building a native shared library, you might need to load the .so library manually.
See https://groups.google.com/a/tensorflow.org/forum/#!topic/tflite/5YhFsCFtKi4
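A minimal Kotlin sketch of what that manual load could look like (the library names are placeholders for whatever .so files your build actually produces):

object NativeLibs {
    init {
        // Load the native libraries before any JNI call reaches the C++ TFLite/GPU delegate code.
        // Both names below are placeholders; substitute the actual .so names from your build.
        System.loadLibrary("tensorflowlite_gpu_gl")
        System.loadLibrary("my_native_inference")
    }
}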
I finally made it work. The internal reason for the error is still unknown to me, but the point is:
The master-branch version of the TFLite GPU delegate for Android fails to properly prepare the standard output-node combination for a regression task, flatten + dense, for running on the GPU.
If you replace it with reshape + pointwise convolution + squeeze, it works fine.

Xamarin Forms UI project (multiplatform)

I am building the UI for a multiplatform app using Xamarin on my MacBook.
I have set FormsUI as the startup project.
Xamarin gives errors when I try to run the project on the iOS simulator.
/Library/Frameworks/Mono.framework/External/xbuild/Xamarin/iOS/Xamarin.iOS.Common.targets:
Error: Failed to load output manifest for ibtool: Unrecognized property list format. (XPlatformFormsUI.iOS)
/Library/Frameworks/Mono.framework/External/xbuild/Xamarin/iOS/Xamarin.iOS.Common.targets:
Error: Output manifest contents: (XPlatformFormsUI.iOS)
This particular error has been mentioned on the Xamarin Forms forum; basically, something in one of the iOS assets (e.g. a storyboard) is the cause. Unfortunately, the error doesn't tell you much about which asset it was. Try creating a brand-new solution and gradually introducing your changes until you hit the error.
