I searched for a technique to keep a trained TensorFlow model (.pb file) safe in an Android app but didn't find anything useful. I am releasing an app containing a TensorFlow model that I built on my own training set. When I release the app, anyone can access the model and use it in their own app. Is there a way to protect a TensorFlow model that I put in the asset folder of my Android application?
This is the way that I load my model in Android:
TensorFlowInferenceInterface tf = new TensorFlowInferenceInterface();
tf.initializeTensorFlow(context.getAssets(), "file:///android_asset/model.pb");
I was thinking of embedding the model encrypted in the app and decrypting it at runtime, but if someone debugs the app, they can get the key and decrypt it. Moreover, there is just one implementation of the initializeTensorFlow method in the TensorFlowInferenceInterface class, and it only accepts (AssetManager assetManager, String model). It would be possible to write one that accepts an encrypted model, but that requires modifying the TensorFlow C++ library. Is there a more reliable solution? Any suggestions, please?
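For reference, this is roughly the decryption side I had in mind (assuming the model is AES/CBC-encrypted offline and shipped as an asset; the asset name and the key/IV handling are only placeholders). It works, but it still leaves the key extractable by anyone debugging the app:

import android.content.Context;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import javax.crypto.Cipher;
import javax.crypto.CipherInputStream;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class ModelDecryptor {
    // Decrypts an AES/CBC-encrypted asset into memory. Whoever recovers the key
    // and IV from the APK or a debugger can do exactly the same thing.
    public static byte[] decryptAsset(Context context, String assetName,
                                      byte[] key, byte[] iv) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE,
                new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));

        try (InputStream in = new CipherInputStream(context.getAssets().open(assetName), cipher);
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            return out.toByteArray();
        }
    }
}

The decrypted bytes would then still need a load path that accepts them, e.g. writing them to app-private storage or modifying the native loader as described above.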
As mentioned in the comments, there is no truly safe way to protect your model when you run it locally. That being said, you can hide your model and make things a tad more difficult than having a plain .pb around.
Apart from the name obfuscation provided by freeze_graph, a good solution is to compile the model to a binary using XLA AOT compilation with tfcompile. It generates a binary library containing your model as well as a header file to use it. Somebody who wants to peek at your network would then have to go through compiled code, which for most people is a higher bar to clear than reading a .pb file.
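On Android you would then typically hide the tfcompile-generated C++ class behind a small JNI wrapper and call it from Java. A rough sketch of the Java side only; the library and method names are made up and the JNI/C++ glue is not shown:

public class CompiledModel {
    static {
        // Hypothetical .so built from the tfcompile-generated library plus a thin JNI wrapper.
        System.loadLibrary("compiled_model_jni");
    }

    // Implemented in the JNI wrapper, which feeds the input to the compiled graph
    // and returns the output tensor as a flat float array.
    public static native float[] run(float[] input);
}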
Related
I am currently creating an Android library that uses JSON commands to communicate with another library. I would like to extract these commands from the code and save them in separate files for better readability.
Where do I save those files when there is no asset directory and how do I read them?
Edit: I have found an answer to my question:
how to access resources in a android library project
Have a look at Internal Storage. Files saved there are only available to your app. It uses the Java File API to read and write, via FileInputStream and FileOutputStream.
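For example, a minimal sketch using openFileOutput/openFileInput (the file name is just an example):

import android.content.Context;
import java.io.ByteArrayOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.nio.charset.StandardCharsets;

public class CommandStore {
    // Writes the JSON commands to app-private internal storage.
    public static void save(Context context, String json) throws Exception {
        try (FileOutputStream out = context.openFileOutput("commands.json", Context.MODE_PRIVATE)) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }
    }

    // Reads the JSON back; no other app can see this file.
    public static String load(Context context) throws Exception {
        try (FileInputStream in = context.openFileInput("commands.json");
             ByteArrayOutputStream buf = new ByteArrayOutputStream()) {
            byte[] chunk = new byte[4096];
            int read;
            while ((read = in.read(chunk)) != -1) {
                buf.write(chunk, 0, read);
            }
            return new String(buf.toByteArray(), StandardCharsets.UTF_8);
        }
    }
}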
UPDATE:
As per the discussion in the comments, OP was looking for a method to ship a JSON file with the library. In light of that:
I'm not sure if library modules support raw resources. If they do, you might want to use that, although it will increase the library's size significantly. You could also fetch the file from a server the first time the library is used, keeping track of that with a SharedPreferences entry.
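If you go the download-once route, a rough sketch of what that could look like (the URL, preference keys and file name are placeholders, and the network call must run off the main thread):

import android.content.Context;
import android.content.SharedPreferences;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class CommandFetcher {
    // Downloads commands.json once and remembers that via a SharedPreferences flag.
    // Call this from a background thread.
    public static void fetchIfNeeded(Context context) throws Exception {
        SharedPreferences prefs = context.getSharedPreferences("mylib_prefs", Context.MODE_PRIVATE);
        if (prefs.getBoolean("commands_downloaded", false)) {
            return; // already fetched on a previous run
        }
        HttpURLConnection conn = (HttpURLConnection)
                new URL("https://example.com/commands.json").openConnection();
        try (InputStream in = conn.getInputStream();
             FileOutputStream out = context.openFileOutput("commands.json", Context.MODE_PRIVATE)) {
            byte[] buffer = new byte[4096];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        } finally {
            conn.disconnect();
        }
        prefs.edit().putBoolean("commands_downloaded", true).apply();
    }
}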
The TensorFlow Android Camera Demo uses the Inception5h model for live image recognition, and it delivers excellent performance. Since I haven't had success retraining Inception5h, I've gone with the InceptionV3 model, but it's not quite as snappy at image recognition. So I'm back at the beginning, trying to retrain (or transfer-learn) the Inception5h model. I've tried modifying retrain.py, but it's clearly written just for the v3 model; the 5h model doesn't contain the "pool_3/_reshape:0", "DecodeJpeg/contents:0" or "ResizeBilinear:0" tensors to begin with, and there are other differences as well.
I'm a bit of a newbie at machine learning and TensorFlow so I'd greatly appreciate clear steps as to what I have to do.
Thank you!
It looks like the retrain.py script and tutorial were just updated to work with the MobileNet architecture.
That solves the first part of your problem: it's not actually inception5h, but MobileNet runs well on mobile with much better accuracy than inception5h.
To actually get it to run in the Android example you'll still need to update the model settings in the demo's classifier code (the input size, image mean/std, and input/output tensor names).
I think you should be able to just copy the settings the retrain script determines for the MobileNet you choose, and you should be okay.
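For example, assuming you retrained a 224-input MobileNet, the constants in the demo's ClassifierActivity.java would end up looking roughly like this (the exact file names depend on how you exported the graph and labels, so treat these values as a starting point):

private static final int INPUT_SIZE = 224;
private static final int IMAGE_MEAN = 128;
private static final float IMAGE_STD = 128f;
private static final String INPUT_NAME = "input";
private static final String OUTPUT_NAME = "final_result";
private static final String MODEL_FILE = "file:///android_asset/retrained_graph.pb";
private static final String LABEL_FILE = "file:///android_asset/retrained_labels.txt";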
If you wanted to use a different network that doesn't have its settings in retrain.py, the easiest way I can think of to determine them would be to explore the graph with TensorBoard.
So if you really wanted to use inception 5h, you could download and unzip it:
curl -O https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip
unzip -d inception5h inception5h.zip
Then grab this simple script from the TensorFlow for Poets 2 codelab repo to convert the graph's .pb file into something TensorBoard can read:
curl -O https://raw.githubusercontent.com/googlecodelabs/tensorflow-for-poets-2/master/scripts/graph_pb2tb.py
And run it on your graph.pb:
mkdir tb_graph
python graph_pb2tb.py tb_graph/inception5h inception5h/tensorflow_inception_graph.pb
And open it in tensorboard:
tensorboard --logdir tb_graph
Then it should be relatively simple to poke around in the graph and find the names of the nodes you need to fill in your own model_info dict. In particular, look for the node just before the final classification layers; that is the one you'd want to set as your bottleneck_tensor.
At the end of the retrain.py script you will notice these lines:
# Freeze the trained variables into constants and write the graph as a binary .pb file.
output_graph_def = graph_util.convert_variables_to_constants(
    sess, graph.as_graph_def(), [FLAGS.final_tensor_name])
with gfile.FastGFile(FLAGS.output_graph, 'wb') as f:
    f.write(output_graph_def.SerializeToString())
Here all the variables are saved as constants in a protocol buffer (.pb) file, which is written in binary mode ('wb'). You should also save the names of the model's classes in a text file. Then, as the Android documentation mentions, put these two files in the "assets" folder of the TensorFlow Android project. Finally, there are some modifications needed to load the Inception-v3 model, which you can see here: https://github.com/tensorflow/tensorflow/issues/1269
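On the Android side, reading the labels file back is straightforward. A minimal sketch, assuming the class names were saved one per line as labels.txt in assets (the file name is only an example); the frozen graph itself is then loaded by passing its "file:///android_asset/" path to the TensorFlow inference interface:

import android.content.res.AssetManager;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

public class LabelLoader {
    // Reads the class names saved next to the frozen graph, one label per line.
    public static List<String> loadLabels(AssetManager assets) throws Exception {
        List<String> labels = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(assets.open("labels.txt")))) {
            String line;
            while ((line = reader.readLine()) != null) {
                labels.add(line.trim());
            }
        }
        return labels;
    }
}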
I hope this will help!
I'm currently developing an algorithm for texture classification based on machine learning, primarily Support Vector Machines (SVM). I was able to get some very good results on my test data and now want to use the SVM in a production environment.
Production in my case means it is going to run on multiple desktop and mobile platforms (i.e. Android, iOS), always somewhere deep down in native threads. For reasons of software structure and the platforms' access policies, I'm not able to access the file system from where I use the SVM. However, my framework supports reading files in an environment where file-system access is granted and channeling the file's content as a std::string to the SVM part of my application.
The standard way to configure an SVM is with a filename, and OpenCV reads directly from the file:
cv::SVM _svm;
_svm.load("/home/<usrname>/DEV/TrainSoftware/trained.cfg", "<trainSetName>");
I want this (basically reading from the file somewhere else and passing the file's content as a string to the SVM):
cv::SVM _svm;
std::string trainedCfgContentStr="<get the content here>";
_svm.loadFromString(trainedCfgContentStr, "<trainSetName>"); // This method is desired
I couldn't find anything in OpenCV's docs or source saying this is possible, but it wouldn't be the first OpenCV feature that exists without being documented or widely known. Of course, I could hack the OpenCV source and cross-compile it for each of my target platforms, but I'd like to avoid that since it is a lot of work, and besides, I'm pretty sure I'm not the first one with this problem.
All ideas (also unconventional) and/or hints are highly appreciated!
As long as you stick with the C++ API it's quite easy; FileStorage can read from memory:
string data_string; //containing xml/yml data
FileStorage fs( data_string, FileStorage::READ | FileStorage::MEMORY);
svm.read(fs.getFirstTopLevelNode()); // or the node with your trainset
(unfortunately this is not exposed to Java)
I am developing an application for Android/iOS/Windows using C++ code for the core logic. The application uses the free fuzzy logic library, and it works perfectly on Windows Mobile, iOS and my local Ubuntu machine, but it doesn't quite work on Android.
The application reads a .fcl file from the SD card and then parses it with the free fuzzy logic library parser. The problem is that the parser gets stuck at random stages of parsing.
Some notes on my project settings:
I enabled the Android read/write permissions for the SD card in the AndroidManifest.xml.
The code I am trying to run is the basic example from the free fuzzy logic library website.
I am using the stlport_static library for stl support and the -frtti compiler flag.
My question is: am I missing something Android-specific, like file encoding or some permission I didn't set?
Some notes on things I already considered:
File compression should not be an issue because, to my knowledge, files on the SD card are not compressed, and I can parse the file partially.
Using other fuzzy logic libraries is out of the question, because I can't use GPL-licensed libraries. The only other library I found didn't have a manual or how-to and couldn't parse the FCL standard.
The free fuzzy logic library uses a lot of wchar_t's, which could be an issue.
Thank you for your time and hopefully for some help ;)
OK, after plowing through some Android manuals and a lot of googling, I found the problem. Android currently doesn't properly support the wchar_t type: you can use it, but the results will not be the same as on other operating systems.
By changing all the wchar_t and wstring types in the free fuzzy logic library to the corresponding char and string types, I was able to make the parser work. Well, sort of; there are still some slight inconsistencies, but nothing I can't handle ;).
Conclusion: don't use wide characters in Android C++ programs.
Thank you for your time & help
I want to prevent my Android application's code from being regenerated through reverse engineering of my .apk file, so that my application code stays secure, but I don't know how to do this. Please help me protect my Android .apk file against reverse engineering.
Thanks,
Android Developer.
The best you can do as far as I am aware is to obfuscate your code before deploying it.
Obfuscating, minifying, etc. will make the original code unreadable even if it is decompiled. By unreadable I mean people will not easily be able to tell what variables are used for, since they will no longer have meaningful names; the same goes for methods, classes, and so on.
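To make that concrete, here is a made-up before/after illustration of roughly what a decompiler shows once names have been obfuscated (the class and logic are invented for the example):

// What you write:
class LicenseChecker {
    boolean isLicenseValid(String licenseKey) {
        return licenseKey != null && licenseKey.startsWith("ABC-");
    }
}

// Roughly what a decompiler shows after obfuscation; the logic survives,
// but every meaningful name is gone:
class a {
    boolean a(String s) {
        return s != null && s.startsWith("ABC-");
    }
}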
"You cannot completely restrict Android apk from decompilation.
Because it uses dex formats any one can easily convert these dex files into jar file using publicly available tools like dex2jar.
But you can Obfuscate code to reduce code readability, you can also use native codes to prevent easy decompilation of code.
You can store some part your code in server and download them at runtime call function in library using Reflection concept,
which will help you to prevent your code from decompilation."
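A rough sketch of the "download code and call it via reflection" idea using DexClassLoader (the file, class and method names are placeholders, and the download step itself is not shown):

import android.content.Context;
import dalvik.system.DexClassLoader;
import java.io.File;
import java.lang.reflect.Method;

public class RemoteCodeRunner {
    // Assumes "plugin.dex" was already downloaded into the app's private files dir.
    public static Object runSecretLogic(Context context) throws Exception {
        File dexFile = new File(context.getFilesDir(), "plugin.dex");
        File optimizedDir = context.getDir("dex_opt", Context.MODE_PRIVATE);

        DexClassLoader loader = new DexClassLoader(
                dexFile.getAbsolutePath(),       // downloaded dex/jar with the hidden code
                optimizedDir.getAbsolutePath(),  // app-private dir for the optimized dex
                null,                            // no extra native library path
                context.getClassLoader());       // parent class loader

        // Load a class that never ships inside the APK and call one of its methods.
        Class<?> secretClass = loader.loadClass("com.example.secret.SecretLogic");
        Method compute = secretClass.getMethod("compute", int.class);
        return compute.invoke(secretClass.getDeclaredConstructor().newInstance(), 42);
    }
}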