I have a TensorFlow model trained using the Estimator & Dataset APIs, and I would like to use it locally in an Android app.
Can someone point me to a good reference and/or tutorial? I looked at TensorFlowInferenceInterface, but my understanding is that it needs you to specify which operator to feed the input to, whereas the Estimator/Dataset APIs operate at a higher level of abstraction. So I am somewhat lost here.
Thanks.
Here is the official documentation for this question: https://www.tensorflow.org/mobile/prepare_models
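Once the Estimator's graph has been frozen as described there, the Android side only needs the names of the frozen graph's input and output ops. Below is a minimal sketch of the old TensorFlowInferenceInterface flow; the op names, asset file name, and tensor shapes are assumptions you would replace with the ones from your own frozen graph (you can inspect them with the summarize_graph tool or TensorBoard):

```java
import android.content.res.AssetManager;
import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

public class FrozenModelRunner {
    // Hypothetical op names; use the ones from your own frozen graph.
    private static final String INPUT_NODE = "input";
    private static final String OUTPUT_NODE = "output";

    private final TensorFlowInferenceInterface inference;

    public FrozenModelRunner(AssetManager assets) {
        // Assumes the frozen graph is bundled under src/main/assets.
        inference = new TensorFlowInferenceInterface(
                assets, "file:///android_asset/frozen_model.pb");
    }

    public float[] run(float[] features, int numOutputs) {
        // Feed a batch of one, run the graph, and fetch the result.
        inference.feed(INPUT_NODE, features, 1, features.length);
        inference.run(new String[] { OUTPUT_NODE });
        float[] result = new float[numOutputs];
        inference.fetch(OUTPUT_NODE, result);
        return result;
    }
}
```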
I'm a self-taught Android developer. I want to work as an Android developer, but I have some imposter syndrome. I think there are coding conventions we should follow when we code, such as:
Following an architecture pattern (MVP or MVVM)
Coding best practices (https://kotlinlang.org/docs/coding-conventions.html)
Using Architecture Components like Lifecycle, LiveData, Room, etc.
Designing layouts with XML? (For now, I just export images and use them as the design.)
etc.
I searched the internet for coding conventions and only found this: https://kotlinlang.org/docs/coding-conventions.html (Are those the only standards I need to follow?)
I also tried to find a professional Android project I could read, but the only one I found on GitHub is this one: https://github.com/Yazan98/Wintrop (and I don't know whether I can take it as a reference).
Can I still be hired without this knowledge? If I need to learn these things, what else should I learn to match industry standards? What should a self-taught Android developer know about industry standards?
Thank you for your help!
No, those aren't the only ones; there are several standards. For example, I highly recommend Modern Android App Architecture, but you can also find others like the Guide to Android app modularization, which is just as important when building modern Android applications.
For projects, you can look at nowinandroid, which is very well done. It's a sample that Google's engineers built for learners: all the standards are respected, and it uses modern technology such as Jetpack Compose instead of old XML layouts, Room for local data storage, DataStore for preferences, Firebase, and more.
I highly recommend these links
I am trying to create a pair of Android apps: one which trains an image classification transfer-learning model and one which simply uses the trained model for inference. These apps would run on separate devices, and the usefulness would lie in training the model on a more-powerful device and being able to perform inference with that model on a less-powerful wearable device. Transfer learning is being implemented as explained in the post here: https://blog.tensorflow.org/2019/12/example-on-device-model-personalization.html.
The problem is I cannot find a good way to save and transmit the trained model from the first device to the second. I have tried implementing serialization for Bluetooth transmission, but the Android TFL library is not easy to make serializable. How difficult would it be to somehow save a .tflite file on Android? Does this feature already exist and I have missed it? Any help or ideas would be greatly appreciated. Thank you!
For transferring the model, send it as binary data instead of trying to explicitly serialize/deserialize it. There are a number of different libraries available for this on Android, so it shouldn't be too difficult to find something that works for your app.
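Since a .tflite model is just bytes on disk, the transfer can be an ordinary file copy over whatever stream your transport gives you. Here is a minimal sketch, assuming a hypothetical helper class and a BluetoothSocket-style OutputStream:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;

public class ModelSender {

    /**
     * Streams a saved .tflite file over any OutputStream
     * (e.g. the output stream of a BluetoothSocket).
     * No custom serialization is needed: the model is just bytes.
     */
    public static void send(File modelFile, OutputStream out) throws Exception {
        try (InputStream in = new FileInputStream(modelFile)) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            out.flush();
        }
    }
}
```

The receiving device does the mirror image: read from its InputStream and write the bytes to a local file, which then becomes the model file the interpreter loads.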
As for loading the TFLite model itself and running inference, it's possible to do this entirely on-device using the TFLite Interpreter class and simply pointing it at your on-device file. You can find an example of this here: https://www.tensorflow.org/lite/inference_with_metadata/lite_support
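As a rough sketch of that idea, loading a model file from app storage and running a single inference could look like the following; the file location, tensor shapes, and class count are placeholders for whatever your model actually expects:

```java
import java.io.File;
import org.tensorflow.lite.Interpreter;

public class LocalClassifier {

    /** Loads a .tflite file from local storage and runs one inference. */
    public static float[] classify(File modelFile, float[] features, int numClasses) {
        Interpreter interpreter = new Interpreter(modelFile);
        float[][] input = new float[][] { features };  // batch of one
        float[][] output = new float[1][numClasses];   // shape depends on your model
        interpreter.run(input, output);
        interpreter.close();
        return output[0];
    }
}
```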
There is very little material with Android application examples.
Could someone tell me whether it is possible to use a .mlmodel trained in a playground in an Android project?
Official sources refer to ML Kit, TensorFlow Lite, and AutoML.
Moreover, there is a detailed usage example for Android SDK level 16.
But the documentation expects a TensorFlow Lite model file (usually ending in .tflite or .lite), which a .mlmodel is not.
Could you give me any constructive advice, or point me to the knowledge I need, to complete an Android project that uses a trained machine learning model?
I believe this information would also be useful for any beginner interested in Android development.
From "Can I convert mlmodel files back to tflite?" the answer appears to be no.
From what I can tell, the .mlmodel format is a client-side inference format similar to .tflite, where .tflite is a minimized format for deployment on device.
I suspect that in the process of conversion from the original full machine learning model, trade-offs are made which may or may not have equivalents between the two formats.
I want to implement a TFLite classifier based on YOLOv3 for Android. I'm a bit of a beginner with TensorFlow Lite object detection code...
I want to start from this TFLite object detection implementation. I tried to merge that code with this other YOLO classifier implementation, but I had a lot of problems adapting the non-lite code to the lite version.
My question is: can I implement a classifier based on YOLOv3 starting from the TFLite examples?
I think TFLiteObjectDetectionAPIModel is the class I have to modify. Is this correct? Or can this API be used to call a YoloClassifier implementation written by myself?
I want to understand in detail how I can use the API to generate and apply my own YOLO-based classifier. Do I have to implement a new class YoloClassifier.java that interfaces with the API.java file, or can I just work on the API to adapt the new classifier?
Thanks to all in advance and I hope I was clear :)
Unfortunately, you can't convert the complete YOLOv3 model to a TensorFlow Lite model at the moment. This is because YOLOv3 extends the original Darknet backbone used by YOLO and YOLOv2 with some extra layers (also referred to as the YOLOv3 head), which don't seem to be handled correctly (at least in Keras) when preparing the model for TFLite conversion.
You can convert YOLOv3 to .tflite without the model's 'head' portion (see here: https://github.com/benjamintanweihao/YOLOv3), but then you will have to implement the missing parts in your Java code (as suggested here: https://github.com/wics1224/yolov3-android-tflite); there is a rough sketch of that decoding after this answer. Make sure you use the correct anchor box sizes if you do so. The second link should answer the second part of your question.
If you want to keep things simple, your other options would be using SSD-MobileNet or yolov2-tiny for your application. They will give you a more real-time experience.
I am currently working on a similar project involving object detection in Flutter/TFLite, so I'll keep you updated if I find anything new.
Edit:
In https://github.com/benjamintanweihao/YOLOv3, you'll need to change how you import the libraries, because the lite library was moved out of contrib from TensorFlow 1.14 onwards.
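To give an idea of what implementing the missing 'head' in Java involves, here is a minimal sketch of decoding one YOLOv3 output grid. The tensor layout [1][grid][grid][anchors * (5 + classes)], the anchor sizes, and the class count are assumptions; substitute the actual values from your converted model:

```java
import java.util.ArrayList;
import java.util.List;

public class YoloDecoder {

    /** One decoded detection: center-format box in input pixels plus best class. */
    public static class Detection {
        public float x, y, w, h, confidence;
        public int classId;
    }

    static float sigmoid(float x) { return (float) (1.0 / (1.0 + Math.exp(-x))); }

    /**
     * Decodes a single YOLOv3 output grid. anchors holds (width, height)
     * pairs in input pixels; they must match the ones used in training.
     */
    public static List<Detection> decode(float[][][][] out, float[][] anchors,
                                         int numClasses, int inputSize, float threshold) {
        List<Detection> results = new ArrayList<>();
        int grid = out[0].length;
        int stride = 5 + numClasses;  // tx, ty, tw, th, objectness, class scores
        for (int cy = 0; cy < grid; cy++) {
            for (int cx = 0; cx < grid; cx++) {
                float[] cell = out[0][cy][cx];
                for (int a = 0; a < anchors.length; a++) {
                    int off = a * stride;
                    float objectness = sigmoid(cell[off + 4]);
                    if (objectness < threshold) continue;
                    Detection d = new Detection();
                    // Center is a sigmoid offset within the cell; size is anchor-scaled.
                    d.x = (cx + sigmoid(cell[off])) / grid * inputSize;
                    d.y = (cy + sigmoid(cell[off + 1])) / grid * inputSize;
                    d.w = (float) Math.exp(cell[off + 2]) * anchors[a][0];
                    d.h = (float) Math.exp(cell[off + 3]) * anchors[a][1];
                    // Pick the best class for this box.
                    int best = 0;
                    float bestScore = 0;
                    for (int c = 0; c < numClasses; c++) {
                        float score = objectness * sigmoid(cell[off + 5 + c]);
                        if (score > bestScore) { bestScore = score; best = c; }
                    }
                    d.confidence = bestScore;
                    d.classId = best;
                    results.add(d);
                }
            }
        }
        return results;
    }
}
```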
Try https://github.com/zldrobit/onnx_tflite_yolov3, but note that the NMS (non-maximum suppression) step is not in the TensorFlow compute graph, so you have to implement your own NMS in your Java code (see the sketch after this answer).
Another issue with this repo is that it requires ONNX and PyTorch. If you are not familiar with them, it may cost you some time.
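For reference, a greedy NMS in plain Java might look like the sketch below; the [x1, y1, x2, y2, score] box layout and the IoU threshold are assumptions, not part of the repo above:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class Nms {

    /** Greedy non-maximum suppression over [x1, y1, x2, y2, score] boxes. */
    public static List<float[]> nms(List<float[]> boxes, float iouThreshold) {
        List<float[]> sorted = new ArrayList<>(boxes);
        // Visit boxes in descending score order.
        sorted.sort(Comparator.comparing((float[] b) -> b[4]).reversed());
        List<float[]> kept = new ArrayList<>();
        for (float[] candidate : sorted) {
            boolean keep = true;
            for (float[] k : kept) {
                if (iou(candidate, k) > iouThreshold) { keep = false; break; }
            }
            if (keep) kept.add(candidate);
        }
        return kept;
    }

    /** Intersection-over-union of two corner-format boxes. */
    static float iou(float[] a, float[] b) {
        float ix = Math.max(0, Math.min(a[2], b[2]) - Math.max(a[0], b[0]));
        float iy = Math.max(0, Math.min(a[3], b[3]) - Math.max(a[1], b[1]));
        float inter = ix * iy;
        float areaA = (a[2] - a[0]) * (a[3] - a[1]);
        float areaB = (b[2] - b[0]) * (b[3] - b[1]);
        return inter / (areaA + areaB - inter);
    }
}
```

Each box is kept only if it doesn't overlap an already-kept, higher-scoring box beyond the threshold.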
I'd like to swap out the multibox_model.pb used by TensorFlowMultiBoxDetector.java in Google's TensorFlow Detect sample app for the MobileNet frozen_inference_graph.pb included in the Object Detection API's model zoo.
I've run the optimize_for_inference script on it, but TensorFlowInferenceInterface can't parse the optimized model. It can, however, parse the original frozen_inference_graph.pb. I still think I need to modify the graph somehow to take a square input image, such as the 224x224 one that multibox_model.pb uses.
I'm one of the developers. Just FYI, we'll be releasing an update to the Android detection demo in the next few weeks to make it compatible with the TensorFlow Object Detection API, so please stay tuned.