I have trained a TensorFlow model on desktop. During training I used tf.image.decode_jpeg to decode the images and feed them to the model.
I want to deploy the model on Android. Since DecodeJpeg is not available in the prebuilt Android binaries, I tried using Android's Bitmap class to read in the input image. For the same image, the pixel values produced by DecodeJpeg and by Android's Bitmap differ, which causes the model's prediction to differ for the same image.
I'd like to have the same behavior in the app. How can I build a binary that includes DecodeJpeg, along with its dependencies such as libjpeg, using Bazel?
Related
This question is about finding a way to run a trained model on an Android device without converting it to TF Lite and without using an external service.
I don't own the model and cannot modify it; I only have the trained SavedModel files.
The device is offline and must embed the trained model; no connection to an external server is possible.
TensorFlow Lite is not an option, since TF Lite doesn't support 5D tensors: https://github.com/tensorflow/tensorflow/issues/56946
For my tests I will use the basic model I provided in the TensorFlow issue above.
I found this blog article, but haven't managed to make it work yet: https://medium.com/#vladislavsd/undocumented-tensorflow-c-api-b527c0b4ef6
Do you know of any up-to-date solution for loading the model in a Java or C++ lib on Android?
TensorFlow doesn't provide an example for this on their GitHub: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/android
If TFLite doesn't work for your model due to its limited op support, you can use the Select TensorFlow ops feature: https://www.tensorflow.org/lite/guide/ops_select
It allows you to use TF ops in TFLite, so you can work around TFLite's limited 5D support, but it increases your binary size.
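A minimal converter sketch of that approach (the SavedModel path and output file name are placeholders). Per the linked guide, the Android app then also needs the Select TF ops runtime dependency (org.tensorflow:tensorflow-lite-select-tf-ops) alongside the normal TFLite runtime.
import tensorflow as tf

# Placeholder path to the trained SavedModel
converter = tf.lite.TFLiteConverter.from_saved_model("tensorflow-model-path")

# Allow TFLite builtins plus selected TensorFlow ops for anything
# (e.g. ops on 5D tensors) that the builtins don't cover.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # regular TFLite ops
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to full TF ops
]

tflite_model = converter.convert()
with open("model_select_ops.tflite", "wb") as f:
    f.write(tflite_model)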
I succeeded in deploying my trained model with 5D tensors on the Android emulator.
To do that, I converted my model from TensorFlow to ONNX using this converter: https://github.com/onnx/tensorflow-onnx
python -m tf2onnx.convert --saved-model tensorflow-model-path --output model.onnx
Then I created a C++ lib that loads the converted ONNX model and runs it.
To copy the asset to the phone's storage, I followed this topic: https://stackoverflow.com/a/69941051/12851157
You can find ONNX samples here: https://github.com/microsoft/onnxruntime-inference-examples/tree/main/c_cxx
Finally, I integrated the C++ lib into Android like this: https://github.com/android/ndk-samples/tree/master/hello-libs
If I have enough time, I will try to use the TF API.
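Before the C++ integration, the converted ONNX file can be sanity-checked on the desktop with the onnxruntime Python package; a minimal sketch, where the input shape and dtype are placeholders for whatever the actual model expects.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx")

# Placeholder: a dummy 5D input matching the model's expected shape
input_name = session.get_inputs()[0].name
dummy = np.zeros((1, 2, 3, 4, 5), dtype=np.float32)

outputs = session.run(None, {input_name: dummy})
print([o.shape for o in outputs])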
I am new to machine learning and this is my first time making an Android application for image classification of two species. I trained my Keras model and then converted it to .tflite. I now want to use this file in Android Studio to detect the two species I trained it on. I converted the model from Keras to .tflite with this code:
import tensorflow as tf

keras_model = tf.keras.models.load_model('my_model.h5')   # load the trained Keras model
converter = tf.lite.TFLiteConverter.from_keras_model(keras_model)
tflite_model = converter.convert()                         # serialized TFLite flatbuffer
open("my_model.tflite", "wb").write(tflite_model)
I now want to use the my_model.tflite file in Android Studio. I searched on the internet but didn't find anything. Can someone help me with this problem? Thank you.
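One quick way to check what the converted model expects before wiring it into Android Studio is the Python TFLite interpreter; a minimal sketch, assuming the my_model.tflite file written above.
import numpy as np
import tensorflow as tf

# Inspect and run the converted model on the desktop
interpreter = tf.lite.Interpreter(model_path="my_model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
print(input_details[0]["shape"], output_details[0]["shape"])

# Dummy input with the shape/dtype the model reports
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))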
Short term: you can create a plain text file or a string array to store the two categories.
Long term: describe your model using TFLite metadata, which means it can be consumed by supporting tools such as Android Studio ML Model Binding and ML Kit.
How to add TensorFlow Lite (TFLite) metadata? See the article and the image classifier sample code.
Note: you get the metadata for free if you train your model with TensorFlow Lite Model Maker.
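A rough sketch of attaching metadata with the TFLite Support metadata writers, along the lines of the image classifier sample; the file paths and normalization values below are placeholders and must match your own model and training preprocessing.
from tflite_support.metadata_writers import image_classifier
from tflite_support.metadata_writers import writer_utils

MODEL_PATH = "my_model.tflite"             # placeholder paths
LABEL_FILE = "labels.txt"                  # one category name per line
SAVE_TO_PATH = "my_model_metadata.tflite"

# Arguments: model buffer, input normalization mean/std, label file(s);
# the mean/std must match how the training images were normalized.
writer = image_classifier.MetadataWriter.create_for_inference(
    writer_utils.load_file(MODEL_PATH), [127.5], [127.5], [LABEL_FILE])
writer_utils.save_file(writer.populate(), SAVE_TO_PATH)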
How to consume a TensorFlow Lite (TFLite) model with metadata?
Android Studio ML Model Binding - supported from Android Studio 4.1 Beta 1 and up (Codelab - step 5 onwards) / (Screencast - from around the 6-minute mark)
ML Kit Image Labelling with custom models - documentation, sample
Advanced topic
If the object you are trying to classify does not take up the whole image - think scanning a leaf on a forest floor full of leaves, or a single product on a supermarket shelf - it may be better to use ML Kit Object Detection and Tracking with a custom model. It helps you crop a section of the image for processing, increasing the accuracy. Here's a screencast of how this works.
I have a use-case where the APK size of my Android app is a very important parameter. Adding a TFLite model obviously increases this APK size, which is undesirable for me.
I have already quantized the model so that the APK size increase is minimal. However, I would like to make this model downloadable, rather than including it as an asset file.
Downloading the model after the app has been installed is not much of a problem for me.
I've trained a custom Tiny YOLOv2 model and want to use it in a Xamarin Android app. I converted the YOLOv2 weights file to .pb using darkflow. The converted model doesn't work in the Xamarin Android app and throws an error. This blog post says the .pb model should be converted to a TensorFlow Lite version to run on Android, and I've converted the model to a TensorFlow Lite model as well, but it still gives this error:
Failed to load model.lite, Not a valid TensorFlow Graph serialization:
Invalid GraphDef
How can I convert the Tiny YOLOv2 model to a TensorFlow Lite model so that it works with the Android app?
Just follow the steps in https://www.tensorflow.org/lite/guide.
Include the TensorFlow Lite interpreter and convert your model to TensorFlow Lite as shown in the link.
Are you coding for real-time object detection on Android?
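Since darkflow already produced a frozen .pb file, the conversion step the answer refers to can be sketched with the TF 1.x-style converter; the input/output array names and input shape below are placeholders and have to match the actual graph (Netron can help find them).
import tensorflow as tf

# Placeholder node names: check the real ones in the darkflow-generated graph
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file="tiny-yolov2.pb",
    input_arrays=["input"],
    output_arrays=["output"],
    input_shapes={"input": [1, 416, 416, 3]},  # typical Tiny YOLOv2 input size
)
tflite_model = converter.convert()
with open("tiny-yolov2.tflite", "wb") as f:
    f.write(tflite_model)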
I'm able to run the TensorFlow Lite image classification example on my mobile device. However, I want to swap the image classification model for a pose recognition model. In my case, the output should consist of a list of (x, y) coordinates.
The respective line in the code looks like this:
@Override
protected void runInference() {
    tflite.run(imgData, labelProbArray);
}
However, the implementation of the tflite.run function is not visible (it's only available as a binary), so I don't know how it works or how to manipulate its return values.
I have worked with TensorFlow before; however, I don't know how to create a TensorFlow model that is compatible with the input and output expected by TensorFlow Lite.
Can anyone help or point me to some more detailed tutorial than the official documentation?
The change has to be made to the TF model before converting it to TFLite. A pre-existing TFLite model can be inspected using the tool "Netron".
When using a self-trained model (.ckpt files), you have to go through the following procedure (sketched below):
creating a graph definition file for evaluation
using freeze_graph to freeze the previously created graph definition file with the latest .ckpt file from your training, so the trained weights are baked in
using tflite_convert (e.g. from the command line) to convert the frozen graph to a .tflite file, which you can push to your Android application
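A rough sketch of these steps with TF 1.x-style APIs; the answer mentions the tflite_convert command-line tool, but the same conversion can be done from Python, which is shown here. The checkpoint path and node names are placeholders for your own model.
import tensorflow.compat.v1 as tf1
tf1.disable_eager_execution()

CKPT = "model.ckpt"               # placeholder: latest checkpoint from your training
OUTPUT_NODES = ["output_node"]    # placeholder: your model's output op name(s)

# Steps 1+2: rebuild the evaluation graph from the meta file and freeze it,
# baking the checkpoint weights into constants.
saver = tf1.train.import_meta_graph(CKPT + ".meta")
with tf1.Session() as sess:
    saver.restore(sess, CKPT)
    frozen_graph_def = tf1.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, OUTPUT_NODES)
    tf1.train.write_graph(frozen_graph_def, ".", "frozen.pb", as_text=False)

# Step 3: convert the frozen graph to a .tflite file for the Android app.
converter = tf1.lite.TFLiteConverter.from_frozen_graph(
    "frozen.pb", input_arrays=["input_node"], output_arrays=OUTPUT_NODES)
with open("model.tflite", "wb") as f:
    f.write(converter.convert())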