How to use a Keras LSTM model on Android?

I used Keras to create an LSTM model and I've already trained it; the trained models are stored as .h5 files.
I want to know how to run predictions with the model in Android Studio.
How can I solve this problem?

You can do this using deeplearning4j. You can set up deeplearning4j in Android Studio by following the instructions here.
Trained Keras models can be imported directly into deeplearning4j using the Keras model import functionality described here. You can then run predictions with the trained model in Java.

Related

How to deploy a tensorflow model on Android without TF Lite and without using a server

This question is about finding a way to run a trained model on an Android device without converting it to TF Lite and without using an external service.
I don't own the model and cannot modify it. I just have the trained saved model files.
The device is offline and must embed the trained model. No connection to an external server is possible.
TensorFlow Lite is not an option since TF Lite doesn't support 5D tensors: https://github.com/tensorflow/tensorflow/issues/56946
For my tests I will use the basic model I provided in the TensorFlow issue above.
I have found this blog article, but haven't managed to make it work yet: https://medium.com/@vladislavsd/undocumented-tensorflow-c-api-b527c0b4ef6
Do you know any up-to-date solution for loading the model inside a Java or C++ lib on Android?
No example is provided by TensorFlow on their GitHub: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/android
If TFLite doesn't work for your model due to limited op support, you can use the Select TensorFlow ops feature: https://www.tensorflow.org/lite/guide/ops_select
It allows you to run TF ops in TFLite, so you can work around TFLite's limited 5D support, but it increases your binary size.
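A minimal sketch of what enabling Select TF ops looks like during conversion. The tiny Sequential model here is a stand-in for the real trained model (this toy model happens to convert with builtins alone; the flags simply allow the converter to fall back to full TF ops where builtins are missing):

```python
import tensorflow as tf

# Stand-in for the real trained model (any tf.keras model converts the same way).
model = tf.keras.Sequential([
    tf.keras.layers.Reshape((1, 2, 2, 1), input_shape=(4,)),  # produces a 5D tensor
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(2),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # prefer builtin TFLite ops
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to full TF ops where needed
]
tflite_bytes = converter.convert()
open("model_select_ops.tflite", "wb").write(tflite_bytes)
```

On the Android side you then add the Select TF ops runtime dependency (org.tensorflow:tensorflow-lite-select-tf-ops) alongside the standard interpreter, which is where the binary size impact comes from.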
I succeeded in deploying my trained model with 5D tensors on the Android emulator.
To do that, I converted my model from TensorFlow to ONNX using this converter: https://github.com/onnx/tensorflow-onnx
python -m tf2onnx.convert --saved-model tensorflow-model-path --output model.onnx
Then I created a C++ lib that loads the ONNX model from the converted file and calls it.
To copy the asset to the phone storage, I followed this topic: https://stackoverflow.com/a/69941051/12851157
You can find ONNX samples here: https://github.com/microsoft/onnxruntime-inference-examples/tree/main/c_cxx
Finally, I integrated the C++ lib into Android like this: https://github.com/android/ndk-samples/tree/master/hello-libs
If I have enough time, I will try to use the TF API.

How to use a .tflite model in Android Studio for image classification

I am new to Machine Learning and this is my first time making an Android application, for image classification of two species. I trained my Keras model and then converted it to .tflite. I now want to use this file in Android Studio to detect the two species I trained it on. I converted the model from Keras to .tflite with this code:
import tensorflow as tf

keras_model = tf.keras.models.load_model('my_model.h5')
converter = tf.lite.TFLiteConverter.from_keras_model(keras_model)
tflite_bytes = converter.convert()
open("my_model.tflite", "wb").write(tflite_bytes)
I now want to use the my_model.tflite file in Android Studio. I searched the internet but didn't find anything. Can someone help me with this problem? Thank you
Short term: you can create a plain text file or a string array to store the two category labels.
Long term: describe your model with TFLite metadata, which means it can be consumed by supporting tools such as Android Studio ML Model Binding and ML Kit.
How to add TensorFlow Lite (TFLite) metadata? Article / sample code: image classifier
Note: you get the metadata for free if you train your model with TensorFlow Lite Model Maker
How to consume TensorFlow Lite (TFLite) Model with metadata?
Android Studio ML Model Binding - supported from Android Studio 4.1 Beta 1 and up (Codelab - step 5 onwards) / (Screencast - around 6 mins in onwards)
ML Kit Image Labelling with custom models - documentation, sample
Advanced topic
If the object you are trying to classify does not take up the whole image - think of scanning a leaf on a forest floor covered in leaves, or a single product on a supermarket shelf - it may be better to use ML Kit Object Detection and Tracking with a custom model. It helps you crop a section of the image for processing, which increases accuracy. Here's a screencast of how this works.
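Whichever consumption path you pick, it's worth sanity-checking the converted file first with tf.lite.Interpreter, whose allocate / set input / invoke / read output steps match what the Android interpreter does. A sketch with a tiny stand-in two-class model (swap in your my_model.tflite via model_path):

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in for the trained two-species classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(2, activation="softmax", input_shape=(4,)),
])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Same steps the Android-side interpreter performs.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=np.float32))
interpreter.invoke()
probs = interpreter.get_tensor(out["index"])  # shape (1, 2): one score per species
```

The index of the larger score is then looked up in your labels file or string array to get the species name.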

How to convert a tiny YOLOv2 model to a TensorFlow Lite model?

I've trained a custom tiny YOLOv2 model and want to use it in a Xamarin Android app. I converted the YOLOv2 weights file to .pb using darkflow. The converted model doesn't work in the Xamarin Android app and an error occurs. This blog post says the .pb model should be converted to a TensorFlow Lite version to run on Android; I've converted the model to a TensorFlow Lite model as well, but it still gives the error:
Failed to load model.lite, Not a valid TensorFlow Graph serialization:
Invalid GraphDef
How can I convert the tiny YOLOv2 model to a TensorFlow Lite model so that it works with the Android app?
Just follow the steps in the TensorFlow Lite guide: https://www.tensorflow.org/lite/guide
Include the TensorFlow Lite interpreter in your app and convert your model to TensorFlow Lite as shown in the link.
Are you coding for real-time object detection on Android?
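For a darkflow-generated frozen .pb, conversion goes through the TF1-style converter, which needs the graph's input and output tensor names spelled out. A sketch with a tiny hand-built frozen graph standing in for the darkflow one (the tensor names in your .pb may differ; inspect the graph to find them):

```python
import tensorflow as tf

tf1 = tf.compat.v1

# Tiny stand-in frozen graph (output = input * 2); darkflow's .pb is used the same way.
graph = tf1.Graph()
with graph.as_default():
    x = tf1.placeholder(tf.float32, [1, 4], name="input")
    tf1.identity(x * 2.0, name="output")
tf.io.write_graph(graph.as_graph_def(), ".", "frozen.pb", as_text=False)

# from_frozen_graph takes tensor names without the ':0' suffix.
converter = tf1.lite.TFLiteConverter.from_frozen_graph(
    "frozen.pb", input_arrays=["input"], output_arrays=["output"])
tflite_bytes = converter.convert()
open("model.tflite", "wb").write(tflite_bytes)
```

A file produced this way is a TFLite FlatBuffer, which is what the Android interpreter expects; the "Invalid GraphDef" error above suggests the app was trying to load the .lite file as a frozen GraphDef rather than through the TFLite interpreter.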

How do I deploy a scikit-learn model to Android?

I have a model that uses sklearn's RandomForestClassifier. I have saved the model in a pickle file and I want to deploy it to an Android application.
I've seen that this can be accomplished using a TensorFlow Lite file, and that an .hdf5 file can be converted to one. However, I have no idea how to convert my pickle file to .hdf5 and then to .tflite.
Do I re-implement the random forest using TensorFlow? What else can I do to deploy it to Android?
I do not want to train my model on Android, but to use my custom trained model there.
You can use m2cgen (Model 2 Code Generator), a lightweight library that provides an easy way to transpile trained statistical models into native code (Python, C, Java, Go, JavaScript, Visual Basic, C#, PowerShell, R, PHP, Dart, Haskell, Ruby, F#).

What is the difference between the .lite and the .tflite formats

What is the difference between the .lite and the .tflite formats (TensorFlow formats)? And if there is no difference, why are there two of them?
In addition, it seems that I can't upload a model file with the .lite extension to Firebase ML Kit. What might be the reason for that?
ML developers first train a TensorFlow model and then use TOCO to convert it to a TensorFlow Lite model. When running the TOCO command, you can specify any output name for the converted Lite model. All TensorFlow Lite TOCO samples use the .tflite extension, but .lite seems to be another popular extension people like to choose.
So as long as it's a model in the TensorFlow Lite FlatBuffer format, TensorFlow Lite will be able to load and run it regardless of the extension.
Unfortunately, the ML Kit console at this moment only accepts files with the .tflite extension. We can consider removing that enforcement. In the meantime, if you are sure it's a TensorFlow Lite model, simply rename the extension and upload it.
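The extension-doesn't-matter point is easy to verify: the interpreter loads the same FlatBuffer under either name, so only the console's file-name check cares. A sketch with a tiny stand-in model:

```python
import os
import tensorflow as tf

# Convert a tiny stand-in model and save it with the .lite extension.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(2,))])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()
open("model.lite", "wb").write(tflite_bytes)

# TFLite loads it fine under .lite...
tf.lite.Interpreter(model_path="model.lite").allocate_tensors()

# ...and renaming to .tflite changes nothing but the file name,
# which is all the ML Kit console upload check looks at.
os.rename("model.lite", "model.tflite")
tf.lite.Interpreter(model_path="model.tflite").allocate_tensors()
```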
