I have a TensorFlow model file that I need to integrate into an Android application and build a sample app around. The model takes tokenised input (a number array instead of sentences), while the application takes a sentence from the user. What should be done to implement the tokeniser in the Java Android application if the TensorFlow model was built in Python?
I think you can find some reusable code here: https://github.com/tensorflow/examples/tree/master/lite/examples
Maybe in the bert_qa example? https://github.com/tensorflow/examples/tree/master/lite/examples/bert_qa/android/app/src/main/java/org/tensorflow/lite/examples/bertqa/tokenization
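That tokenization folder contains a BERT-style tokenizer written in Java, which is a good starting point. As a minimal sketch of the general idea (not the bert_qa code itself), here is a whitespace tokenizer that loads a vocabulary file from the app's assets and maps words to ids; the file name "vocab.txt" and the "<UNK>" token are assumptions, and the preprocessing must mirror exactly what your Python tokenizer does.

import android.content.Context;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.HashMap;
import java.util.Map;

// Minimal sketch: whitespace tokenizer backed by a vocabulary file in assets.
// "vocab.txt" (one token per line, line index = token id) and "<UNK>" are
// assumptions -- use the same vocabulary your Python tokenizer was built with.
public class SimpleTokenizer {
    private final Map<String, Integer> vocab = new HashMap<>();

    public SimpleTokenizer(Context context) throws Exception {
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(context.getAssets().open("vocab.txt")))) {
            String line;
            int index = 0;
            while ((line = reader.readLine()) != null) {
                vocab.put(line.trim(), index++);
            }
        }
    }

    // Turns a sentence into the int array the model expects as input.
    public int[] tokenize(String sentence) {
        String[] words = sentence.toLowerCase().trim().split("\\s+");
        int[] ids = new int[words.length];
        for (int i = 0; i < words.length; i++) {
            Integer id = vocab.get(words[i]);
            ids[i] = (id != null) ? id : vocab.getOrDefault("<UNK>", 0);
        }
        return ids;
    }
}

If your Python side uses WordPiece or another subword scheme, port that logic instead; the ids have to match token for token or the model's predictions will be meaningless.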
Related
I have written an ML model using Keras in Google Colab which detects whether an animal is a cat or a dog based on an input image.
Now, I want to create an app in Android Studio that takes an image as input, uses the algorithm I have designed to detect whether the image is a cat or a dog and outputs the result.
My question is: how can I incorporate this algorithm into Android Studio? My first thought was to rewrite the code in Java and copy it into Android Studio, but rewriting it from Google Colab in Java would lead to multiple complications. Hence, is there a way to download the algorithm I have created and load it into Android Studio so that it works? If not, what other approach can I take?
My desired outcome is something where I can add the algorithm into Android Studio and write a code like:
if (algorithm == true)
//output dog detected
else
//output cat detected
Android Studio is just an IDE. It doesn't run the actual code. And no, it doesn't run Python.
You should be able to export the Keras model into an offline format that Android can use via TensorFlow; see Keras deep learning model to android
Alternatively, to deploy an "online" model, you'd run a hosted web server that exposes the model over HTTP; your Android code would send requests to it and parse the responses.
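For the offline route, once the Keras model has been converted to a .tflite file (using the TFLite converter in Python), you load it with the TensorFlow Lite Interpreter in your Android code. A minimal sketch, assuming the file is named model.tflite, the model expects a 224x224 RGB input scaled to [0, 1], and it outputs a single dog-vs-cat score (adjust all of these to your own model):

import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.graphics.Bitmap;

import org.tensorflow.lite.Interpreter;

import java.io.FileInputStream;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

// Sketch of running the converted Keras model on-device with TensorFlow Lite.
// "model.tflite", the 224x224 input size and the single sigmoid output are
// assumptions. Remember to keep .tflite files uncompressed in the APK.
public class CatDogClassifier {
    private final Interpreter interpreter;

    public CatDogClassifier(Context context) throws Exception {
        AssetFileDescriptor fd = context.getAssets().openFd("model.tflite");
        FileInputStream stream = new FileInputStream(fd.getFileDescriptor());
        MappedByteBuffer model = stream.getChannel().map(
                FileChannel.MapMode.READ_ONLY, fd.getStartOffset(), fd.getDeclaredLength());
        interpreter = new Interpreter(model);
    }

    public boolean isDog(Bitmap image) {
        Bitmap resized = Bitmap.createScaledBitmap(image, 224, 224, true);
        float[][][][] input = new float[1][224][224][3];
        for (int y = 0; y < 224; y++) {
            for (int x = 0; x < 224; x++) {
                int pixel = resized.getPixel(x, y);
                input[0][y][x][0] = ((pixel >> 16) & 0xFF) / 255.0f;
                input[0][y][x][1] = ((pixel >> 8) & 0xFF) / 255.0f;
                input[0][y][x][2] = (pixel & 0xFF) / 255.0f;
            }
        }
        float[][] output = new float[1][1];
        interpreter.run(input, output);
        // This is where the "if (algorithm == true)" idea from the question lands:
        // you threshold the model's score rather than getting a boolean back.
        return output[0][0] > 0.5f;
    }
}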
I want to create a simple neural network based on the example https://github.com/googlesamples/android-ndk/tree/master/nn_sample. Is it possible to create this with the help of TensorFlow, using only Android tools and Java?
Take a look at this folder https://github.com/googlesamples/android-ndk/tree/master/nn_sample/app/src/main/cpp
simple_model.h is the model trained in TensorFlow before creating the Android project. The model now acts like a black box: it takes input and predicts output only. If you want to build your own model, try this tutorial (all steps from training, evaluation and prediction through to deployment on Android):
https://medium.com/#elye.project/applying-tensorflow-in-android-in-4-steps-to-recognize-superhero-f224597eb055
Affirmative. You can use TensorFlow Lite on Android; it's an open-source deep learning framework that helps compress and deploy models to mobile or embedded applications. It takes a model as input, then deploys and interprets it, performing resource-conserving optimizations for mobile use. The NNAPI of the Android NDK can interface with TFLite easily too. This link contains gesture, image, object and speech detection and classification example implementations on Android in Java using TFLite.
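As a sketch of the NNAPI point: assuming you already have a .tflite model memory-mapped as a ByteBuffer (as in the examples linked above), the Java Interpreter can be handed an NNAPI delegate so supported ops run on the device's accelerator.

import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.nnapi.NnApiDelegate;

import java.nio.MappedByteBuffer;

// Sketch: attach an NNAPI delegate to the TFLite Interpreter. Assumes a recent
// tensorflow-lite dependency; on older versions, options.setUseNNAPI(true) is
// the equivalent switch. Unsupported ops fall back to the CPU.
public class NnapiInterpreterFactory {
    public static Interpreter create(MappedByteBuffer model) {
        Interpreter.Options options = new Interpreter.Options();
        options.addDelegate(new NnApiDelegate());
        return new Interpreter(model, options);
    }
}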
I've recently delved into the Computer Vision and Deep Learning world. I developed a 3D CNN model for action recognition in Keras and now I'm interested in running it on Android (Java). The layers I'm using are Conv3D and MaxPool3D. The total size of the model is 40 MB.
I've been looking for solutions in the tensorflow-lite space but it seems that they don't have the operations implemented yet.
I got the following error when using the converter.convert() function to get the tflite model
ConverterError: TOCO failed. See console for info.
2019-05-05 14:39:07.006669: I tensorflow/lite/toco/import_tensorflow.cc:1336] Converting unsupported operation: Conv3D
So what can I do to be able to run it in Java? Should I:
run .pb file directly? I don't even know if this is possible now (after tflite). If so, how much time would a new-gen smartphone take to run a 40MB file?
implement ops by myself? If so, how to?
try different approach outside tensorflow?
implement a new action recognition architecture that uses only tflite supported ops
other
I haven't found any Conv3D implementation for Android on the web so far...
Thank you so much for your attention!
If you want to execute standalone Java code using TensorFlow, please have a look at this. But if you want to implement something for Android using Java, the only way is using TensorFlow Lite.
For inference time, you can compare your model with some of the state-of-the-art architectures in the performance benchmarks. You can find the benchmark values here; they show a comparison on Pixel 2 and Pixel XL devices.
For your Conv3D implementation, if you want to implement the ops yourself, you can have a look at custom operators.
I would prefer your suggestion of 'implement a new action recognition architecture that uses only tflite supported ops'. Here you can find the list of operations supported by TF Lite.
I'd like to swap out the multibox_model.pb being used by TensorFlowMultiBoxDetector.java in Google's Tensorflow Detect Sample App with the mobilenet frozen_inference_graph.pb included in the object detection API's model zoo.
I've run the optimize_for_inference script on it, but TensorFlowInferenceInterface can't parse the optimized model. It can, however, parse the original frozen_inference_graph.pb. I still think I need to modify the graph somehow to take a square input image, such as the 224x224 one that multibox_model.pb uses.
I'm one of the developers --- just FYI, we'll be releasing an update to the Android Detection demo in the next few weeks to make it compatible with the Tensorflow Object Detection API, so please stay tuned.
The sample app given by Google for TensorFlow on Android is written in C++.
I have a TensorFlow application written in Python. This application currently runs on desktop. I want to move the application to the Android platform. Can I use Bazel to build the Python application directly for Android? Thanks.
Also, a sample TensorFlow app in Python on Android would be much appreciated.
Currently, there is no simple way to run TensorFlow on Android. Typically, on the device you would only need inference (the runtime), not training.
Another way is to use TensorFlow Serving to host models in the cloud and issue RPC calls from an Android client.
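As an illustration of the hosted approach: TensorFlow Serving also exposes a REST API alongside gRPC, which is the easiest thing to call from plain Android code. A rough sketch, where the host, port and model name "my_model" are placeholders for your own deployment:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Sketch of calling a model hosted with TensorFlow Serving over its REST API.
// Run this off the main thread (e.g. in a background executor).
public class RemotePredictor {
    public String predict(float[] features) throws Exception {
        URL url = new URL("http://your-server:8501/v1/models/my_model:predict");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        // TensorFlow Serving expects {"instances": [[...]]} for a single example.
        StringBuilder body = new StringBuilder("{\"instances\": [[");
        for (int i = 0; i < features.length; i++) {
            body.append(features[i]);
            if (i < features.length - 1) body.append(", ");
        }
        body.append("]]}");

        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.toString().getBytes(StandardCharsets.UTF_8));
        }

        // Return the raw JSON response; parse it with your preferred JSON library.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            StringBuilder response = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) response.append(line);
            return response.toString();
        }
    }
}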
I tried to use Python in my Android application with some third-party terminals like SL4A and QPython. These let you run Python files directly inside an Android application: you install the SL4A APK and call its intent. But they only support this to some level, I guess.
I tried to import tensorflow in that terminal and it reported that the module was not found, so I concluded that TensorFlow will not work in these terminals.
So now I am trying to create a .pb file from the Python files that work on the Unix platform. We then need to include that .pb file in our Android application and change the C++ code accordingly. That is my current thinking; let's see whether it works or not. I will update soon if it does.
You can create your TensorFlow model on your desktop and save it as a .pb file. Then you can add this model to your Android project and use it to make predictions on the Android device.
It's like doing the training (which involves heavy computation) on a desktop machine (which is more powerful) and using the model to make predictions (which involves less computation) on a mobile device (which is comparatively less powerful).
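A rough sketch of that flow on the Android side, using the older (pre-TFLite) TensorFlowInferenceInterface from the TensorFlow Android AAR. The asset name "frozen_model.pb" and the input/output node names below are placeholders; they have to match the names used when you froze the graph in Python.

import android.content.res.AssetManager;

import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

// Sketch: bundle the frozen .pb graph in assets and run it with the legacy
// TensorFlow Android inference interface. File and node names are assumptions.
public class PbModel {
    private final TensorFlowInferenceInterface inference;

    public PbModel(AssetManager assets) {
        inference = new TensorFlowInferenceInterface(assets, "file:///android_asset/frozen_model.pb");
    }

    public float[] predict(float[] input) {
        inference.feed("input_node", input, 1, input.length); // name and shape of the input tensor
        inference.run(new String[] {"output_node"});
        float[] output = new float[2];                        // size of the output tensor
        inference.fetch("output_node", output);
        return output;
    }
}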
This is a link to a great video by Siraj Raval
https://www.youtube.com/watch?v=kFWKdLOxykE