I have retrained the model with some custom images, as explained in the TensorFlow for Poets tutorial.
When I run the model on my computer with the command below:
python -m scripts.label_image --graph=tf_files/retrained_graph.pb --image=tf_files/test_photos/apple.jpg
I expect it to be classified as an apple, and I get the correct classification result, which is:
apple 1.0
orange 1.40016e-08
lemon 2.19029e-09
When I copy the retrained model and the label file to Android Studio's assets folder and build the APK, I get different classification results, even though I am providing the same image to the model.
See the image of the classification result from the app built with the same model.
I thought the problem was caused by different TensorFlow versions between Anaconda and Android Studio. Therefore, I upgraded TensorFlow to 1.7.0 to build the model, and in Android Studio the TensorFlow dependency is
compile 'org.tensorflow:tensorflow-android:+'
I also tried the dependency
compile 'org.tensorflow:tensorflow-android:1.7.0'
But I received another error and the app crashed immediately after launching, so I had to revert to
'org.tensorflow:tensorflow-android:+'
In the end, even though optimization and quantization were done, I couldn't make it work with TensorFlow Mobile. The same image on mobile and on the computer produced completely different results.
Therefore, I switched to TensorFlow Lite, and with TensorFlow Lite my problem is solved. One short note: TensorFlow Lite is not supported on Windows (especially toco), so I had to use Ubuntu.
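For reference, here is a minimal conversion sketch, a rough Python equivalent of the toco step (this assumes a TensorFlow 1.x version recent enough to ship tf.lite.TFLiteConverter, plus the usual TensorFlow for Poets tensor names 'input' and 'final_result'; verify these against your own graph):

import tensorflow as tf

# Convert the frozen retrained graph to a .tflite file.
# The tensor names below are the usual TensorFlow for Poets defaults;
# check them against your own retrained_graph.pb before converting.
converter = tf.lite.TFLiteConverter.from_frozen_graph(
    'tf_files/retrained_graph.pb',
    input_arrays=['input'],
    output_arrays=['final_result'],
    input_shapes={'input': [1, 224, 224, 3]})
tflite_model = converter.convert()

with open('tf_files/retrained_graph.tflite', 'wb') as f:
    f.write(tflite_model)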
Related
I am currently working on an Android project where I want to add a TFLite model. I searched for how to do this and found that the common way is to add it via File --> New --> Other --> TensorFlow Lite Model. The problem is that the TensorFlow Lite Model option is greyed out in my Android Studio, as shown in image 1. I tried upgrading my Gradle plugin version, but that didn't seem to solve the problem.
I'm trying to import a TensorFlow Lite model into Android Studio. I've followed the documentation all along. I have exported a correct TensorFlow Lite model with a .tflite extension, but in Android Studio it says that it is an invalid file:
This is not a valid TensorFlow Lite Model file.
This appears in the import window. I'm really confused about why this is happening. I've searched through forums everywhere; people say it is because my Android Studio is not version 4.2. I've updated it, but it still doesn't work. I'm really stuck and don't know what to do. Can anyone who has experienced this problem share their solution? Or is there any other way to deploy my model into my Android project? I've tried Firebase ML Kit, but it says the maximum size is 40 MB while my model is 160 MB. It then suggests bundling the model locally with the app, but that documentation is deprecated, so basically I can't use it.
I would really appreciate anyone's answer here.
UPDATE:
Here's the public link to the tflite file that you can download https://drive.google.com/drive/folders/1aGeg0SJHCsGT7yvkny-8ijPKGYZJqODE?usp=sharing
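As a first sanity check, a minimal sketch like the following (assuming a current TensorFlow install; 'model.tflite' is a placeholder for the downloaded file) shows whether the .tflite file itself is loadable, independently of Android Studio's importer:

import tensorflow as tf

# A corrupt or non-TFLite file raises a clear error here,
# which separates file problems from Android Studio problems.
interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()
print(interpreter.get_input_details())
print(interpreter.get_output_details())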
I see this sample in Google Codelabs; it requires the Android TensorFlow support dependency:
dependencies {
    implementation 'org.tensorflow:tensorflow-android:1.2.0-preview'
}
I know that TensorFlow Lite is meant to help developers use models on mobile devices.
What's the difference between these two?
The code snippet you provided corresponds to TensorFlow Mobile.
TensorFlow Mobile is a runtime for executing protocol buffer (.pb) files on Android, iOS, and other IoT devices. It can only run inference on a TensorFlow model that has been converted to a .pb file, and it only works on specific platforms.
TensorFlow Lite is the successor of TensorFlow Mobile. Lite runs inference on models that have been converted to a .tflite file. The Lite runtime also lets the developer work with graphs and tensors from Java on Android, and it can use the Android Neural Networks API. It works on Android and iOS devices, with Firebase ML Kit, alongside TensorFlow.js, and through the TensorFlow C++ API.
Google itself recommends using TensorFlow Lite instead of TensorFlow Mobile.
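To make the difference concrete, here is a minimal Lite inference sketch (in Python for brevity; the Java Interpreter on Android follows the same load-and-invoke pattern, and 'model.tflite' is a placeholder):

import numpy as np
import tensorflow as tf

# Load a converted .tflite model and run a single inference.
interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input matching the model's declared shape and dtype.
dummy = np.zeros(input_details[0]['shape'], dtype=input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], dummy)
interpreter.invoke()

print(interpreter.get_tensor(output_details[0]['index']))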
I have a question about the TensorFlow Android Camera Demo. First of all, to start working with this demo, should I download (clone) the whole TensorFlow repository to my laptop? Or is it possible to download just the Android camera example (and if so, how)?
If you're looking for just the camera demo, it looks like somebody did make a standalone version:
https://github.com/miyosuda/TensorFlowAndroidDemo
The main difference is that the official demo uses Bazel (Google's build tool) and has shared dependencies with other TensorFlow targets in the full repo, whereas this one uses the standard Android toolchain (including for the native code).
If you want to use the official repo, it should be possible to open Android Studio from the demo app directory, as Gradle calls out to Bazel in its build targets.
The TensorFlow Android Camera Demo is now built every night on Jenkins:
http://ci.tensorflow.org/view/Nightly/job/nightly-android/
The "Last Successful Artifacts" section has a "+" sign that you can expand a couple of times to find the tensorflow_demo.apk, which you can then install.
Follow the installation instructions in the README file to install the APK.
You have to first download the source from our repo. Then follow the additional instructions for the Android image demo to make it work.
I've installed Debian + Python on an Android tablet with GNURoot. Now I'm trying to install the TensorFlow Python API so that I can "import tensorflow" in my Python code. My tablet's CPU is 32-bit ARM, so I cannot install with pip, because TensorFlow only supports 64-bit.
I therefore tried to compile TensorFlow from source. TensorFlow's build system is Bazel, and I haven't found any Bazel executable for 32-bit ARM Linux.
So I tried to compile Bazel from source, but "./compile.sh" fails with the error:
"Protobuf compiler not found in third_party/protobuf/protoc-linux-arm32.exe"
I'm a bit reluctant to try to compile protobuf from source, because I've read somewhere that protobuf master may not work with Bazel.
Is there an alternative way to install TensorFlow on 32-bit ARM?
Unfortunately, building the TensorFlow Python package requires Bazel (for C++-only inference you can use the instructions at tensorflow/contrib/makefile), and this is quite an involved and buggy process. The best place to start is this post on setting things up on the Jetson board:
http://cudamusing.blogspot.com/2015/11/building-tensorflow-for-jetson-tk1.html
If you want to play with Keras, it is possible on GNURoot: instead of TensorFlow you can use the Theano backend. I got it working.
At this moment my Telefunken Outdoor WT4 phone is running it.
It is very slow, but it works.
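For anyone trying the same route, here is a minimal sketch of selecting the Theano backend (assuming standalone Keras, which reads the KERAS_BACKEND environment variable at import time):

import os

# Must be set before Keras is imported for the first time.
os.environ['KERAS_BACKEND'] = 'theano'

import keras  # prints "Using Theano backend." on import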