Is it possible to use a .mlmodel trained with an Xcode playground in Android?

There is very little material about Android application examples.
Could someone tell me whether it is possible to use a .mlmodel trained with an Xcode playground in an Android project?
Official sources refer to ML Kit, TensorFlow Lite and AutoML.
Moreover, there is a detailed example of use for Android SDK level 16.
But the documentation expects a TensorFlow Lite model file:
(usually ending in .tflite or .lite)
Could you give me any constructive advice, or point out the knowledge I need, to complete an Android project that uses a trained machine learning model?
I believe this information would also be useful for every beginner interested in Android development.

From the question "Can I convert mlmodel files back to tflite?" the answer appears to be no.
From what I can tell, .mlmodel is a client-side inference format similar to .tflite, where .tflite is a minimized format for on-device deployment.
I suspect that the conversion from the original, full machine-learning model involves trade-offs that may or may not have equivalents between the two formats.

Related

Android Tensorflow lite Recognizing keywords

How can I use the model from https://www.tensorflow.org/tutorials/audio/simple_audio in my Android app? How do I provide inputs correctly, and how do I interpret the outputs?
TensorFlow Lite's Task Library has an Audio Classification example for Android, which is what you might be looking for. The guide explains how the Java AudioClassifier API works.
The Task Library uses YAMNet for audio analysis, which has a pre-trained version on TFHub. If you want to train with your own dataset, please refer to the notebooks mentioned here.
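To make the input and output sides concrete, here is a minimal, library-free sketch of the two steps the AudioClassifier ultimately performs around the model: normalizing 16-bit PCM samples (as captured by AudioRecord) into floats, and mapping the model's per-class score array back to a label. The four-class label list here is a made-up stand-in; a real YAMNet deployment emits 521 scores whose class names ship with the model's metadata.

```java
import java.util.Arrays;

public class AudioPostprocessing {
    // Convert raw 16-bit PCM samples into floats in [-1, 1],
    // the input range audio models typically expect.
    static float[] pcmToFloat(short[] pcm) {
        float[] out = new float[pcm.length];
        for (int i = 0; i < pcm.length; i++) {
            out[i] = pcm[i] / 32768.0f;
        }
        return out;
    }

    // Pick the label with the highest score from the model's output tensor.
    static String topLabel(float[] scores, String[] labels) {
        int best = 0;
        for (int i = 1; i < scores.length; i++) {
            if (scores[i] > scores[best]) best = i;
        }
        return labels[best];
    }

    public static void main(String[] args) {
        short[] pcm = {0, 16384, -16384, 32767};
        System.out.println(Arrays.toString(pcmToFloat(pcm)));

        // Hypothetical 4-class output; real YAMNet emits 521 scores.
        float[] scores = {0.05f, 0.80f, 0.10f, 0.05f};
        String[] labels = {"Silence", "Speech", "Dog", "Music"};
        System.out.println(topLabel(scores, labels)); // prints "Speech"
    }
}
```

The Task Library does both steps for you, but seeing them spelled out clarifies what "providing inputs" and "interpreting outputs" mean at the tensor level.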

Saving/Transmitting Model - TensorFlow Lite Transfer Learning on Android

I am trying to create a pair of Android apps: one which trains an image classification transfer-learning model and one which simply uses the trained model for inference. These apps would run on separate devices, and the usefulness would lie in training the model on a more-powerful device and being able to perform inference with that model on a less-powerful wearable device. Transfer learning is being implemented as explained in the post here: https://blog.tensorflow.org/2019/12/example-on-device-model-personalization.html.
The problem is I cannot find a good way to save and transmit the trained model from the first device to the second. I have tried implementing serialization for Bluetooth transmission, but the Android TFL library is not easy to make serializable. How difficult would it be to somehow save a .tflite file on Android? Does this feature already exist and I have missed it? Any help or ideas would be greatly appreciated. Thank you!
For transferring the model, send it as a binary blob instead of trying to explicitly serialize/deserialize the library's objects. There are a number of different transfer libraries available on Android, so it shouldn't be too difficult to find something that works for your app.
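As a sketch of the "treat it as a binary" suggestion: a .tflite file is just bytes, so it can be pushed through whatever OutputStream the transport provides (a BluetoothSocket's streams on Android, a plain socket elsewhere). The copy loop below is transport-agnostic; the Bluetooth pairing and socket setup are deliberately omitted, and the in-memory streams stand in for them.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ModelTransfer {
    // Copy a model byte-for-byte from any InputStream to any OutputStream.
    // On Android the streams would come from BluetoothSocket.getInputStream()
    // and getOutputStream(); here they are in-memory stand-ins.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        out.flush();
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] fakeModel = {'T', 'F', 'L', '3', 0, 1, 2, 3};
        ByteArrayOutputStream received = new ByteArrayOutputStream();
        long sent = copy(new ByteArrayInputStream(fakeModel), received);
        System.out.println(sent + " bytes transferred");
    }
}
```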
As for loading the TFLite model itself and running inference, it's possible to do this on-device using the TFLite Interpreter class and simply pointing it at your on-device file. You can find an example of this here: https://www.tensorflow.org/lite/inference_with_metadata/lite_support
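The Interpreter constructor accepts a MappedByteBuffer, so the usual pattern is to memory-map the saved file rather than read it onto the heap. The mapping step is plain java.nio and is sketched below; the final Interpreter construction (commented out) additionally needs the org.tensorflow:tensorflow-lite dependency on the classpath, and the temp file here stands in for the transferred .tflite.

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;

public class ModelLoader {
    // Memory-map a model file into the read-only buffer form that
    // TFLite's Interpreter constructor accepts.
    static MappedByteBuffer mapModel(Path modelPath) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(modelPath.toFile(), "r");
             FileChannel channel = raf.getChannel()) {
            return channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
        }
    }

    public static void main(String[] args) throws IOException {
        // Stand-in file; on device this would be the received .tflite.
        Path path = Files.createTempFile("model", ".tflite");
        Files.write(path, new byte[]{1, 2, 3, 4});

        MappedByteBuffer buffer = mapModel(path);
        System.out.println("mapped " + buffer.capacity() + " bytes");
        // With the TFLite dependency available:
        // Interpreter interpreter = new Interpreter(buffer);
    }
}
```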

Creating a simple neural network on Tensorflow by means of Android

I want to create a simple neural network based on the example https://github.com/googlesamples/android-ndk/tree/master/nn_sample. Is it possible to create this with TensorFlow using only Android tools, in Java?
Take a look at this folder https://github.com/googlesamples/android-ndk/tree/master/nn_sample/app/src/main/cpp
simple_model.h is the model, trained in TensorFlow before the Android project was created. The model now acts like a black box: it takes input and predicts output, nothing more. If you want to build your own model, try this tutorial (it covers all steps from training, evaluation and prediction to deployment on Android):
https://medium.com/@elye.project/applying-tensorflow-in-android-in-4-steps-to-recognize-superhero-f224597eb055
Affirmative. You can use TensorFlow Lite on Android: it's an open-source deep learning framework that helps compress and deploy models to mobile or embedded applications. It takes a model as input, then deploys and interprets it, performing resource-conserving optimizations for mobile use. The Android NDK's NNAPI can interface with TFLite easily too. This link contains gesture, image, object and speech detection and classification examples implemented on Android in Java using TFLite.
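One concrete detail those examples share on the input side: TFLite image models usually take a direct float ByteBuffer in native byte order, filled with normalized RGB values unpacked from Android's ARGB pixel ints. The sketch below builds such a buffer from a plain int[] of pixels (standing in for Bitmap.getPixels); the mean/std constants of 127.5 are a common choice but are model-specific assumptions, not universal.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ImagePreprocessing {
    // Pack ARGB pixels (as returned by Bitmap.getPixels) into a direct
    // float buffer of normalized RGB values, the layout many TFLite
    // image models expect. Mean/std of 127.5 maps each byte into [-1, 1].
    static ByteBuffer toInputBuffer(int[] pixels) {
        ByteBuffer buf = ByteBuffer.allocateDirect(pixels.length * 3 * 4)
                                   .order(ByteOrder.nativeOrder());
        for (int p : pixels) {
            buf.putFloat((((p >> 16) & 0xFF) - 127.5f) / 127.5f); // R
            buf.putFloat((((p >> 8) & 0xFF) - 127.5f) / 127.5f);  // G
            buf.putFloat(((p & 0xFF) - 127.5f) / 127.5f);         // B
        }
        buf.rewind();
        return buf;
    }

    public static void main(String[] args) {
        int[] pixels = {0xFFFFFFFF, 0xFF000000}; // one white, one black pixel
        ByteBuffer buf = toInputBuffer(pixels);
        System.out.println("first channel = " + buf.getFloat(0)); // prints 1.0
    }
}
```

The resulting buffer is what you would pass to Interpreter.run() alongside a pre-allocated output array.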

using Tensorflow Estimator exported model on Android app

I have a TensorFlow model trained using the Estimator & Dataset APIs, and I would like to use it locally in an Android app.
Can someone point me to a good reference and/or tutorial? I looked at TensorflowInferenceInterface, but my understanding is that it needs you to specify which operator you want to feed the input to, while the Estimator/Dataset abstraction sits at another level. So I am somewhat lost here.
Thanks.
Here is official documentation for this question: https://www.tensorflow.org/mobile/prepare_models

Use Tensorflow model in c++ app

Apologies in advance for this relatively newbie question. Using TensorFlow, I've trained a neural net in Python, and I'd like to use it to classify images in a C++ application, which I want to later integrate into an Android app.
Today I spent all day working through the Tensorflow Serving tutorial: https://tensorflow.github.io/serving/serving_basic.
I have everything installed and working, but it's still not obvious to me what to do next. Is TF Serving the right thing to use? Can it be used to integrate a trained TF model into an application? Or is it just something for building models that can be run from the terminal?
If TF Serving is not what I need, then where should I look instead? Is there a simpler way to use trained models cross-platform?
Thanks!
Please take a look at this section in the tutorial for how to load a SessionBundle in C++.
Once you have a SessionBundle, you can use the utilities in signature.h to get a signature (e.g., GetClassificationSignature) and then run inference (e.g., RunClassification).
