TFLite Conversion changing model weights - android

I have a custom TensorFlow graph implementing MobileNetV2-SSDLite, which I built myself. It works fine on the PC.
However, when I convert the model to TFLite (all float, no quantization), the model weights change drastically.
To give an example, a filter that was initially
0.13172674179077148,
2.3185202252437188e-32,
-0.003990101162344217
becomes
4.165565013885498,
-2.3981268405914307,
-1.1919032335281372
The large weight values are completely throwing off my on-device inferences. Need help! :(

What command are you using to convert to TFLite? For instance, are you using toco, and if so, what parameters are you using? While I haven't been looking at the filters, here are my default instructions for fine-tuning MobileNetV2-SSD and SSDLite graphs, and the models have been performing well.

Tensorflow.lite model produces wrong (different) results in Android app?

I've made an image classification model and converted it to tflite format.
Then I verified the tflite model in Python using tf.lite.Interpreter — it produces the same results for my test image as the original model. Here's a colab link to verify.
Then I embedded it into a sample Android app, using Android Studio ML Model Binding and the exact example code from Android Studio.
Here's the main activity code; you can also use this link to navigate to the full Android project.
val assetManager = this.assets
val istr = assetManager.open("test_image.JPG") //The same image
val b = BitmapFactory.decodeStream(istr)
val model = Model2.newInstance(this) //Model definition generated by Android Studio
// Creates inputs for reference.
val image = TensorImage.fromBitmap(b)
// Runs model inference and gets result.
val outputs = model.process(image)
val probability = outputs.probabilityAsCategoryList
probability.sortByDescending { it.score }
val top9 = probability.take(9)
this.findViewById<TextView>(R.id.results_text).text = top9.toString()
And then I'm getting completely different results on Android for the same model and the same input image.
Here are results matching my initial model in Python:
Here are wrong results I'm getting in Android app:
Links to the model and the test image are there in both examples, but I'll post them into the question once again:
tflite model
test image
I guess it has something to do with the input/output formats of the model, or the image is interpreted differently in Python and in Android, or the metadata I added to the model is somehow wrong. Anyway, I've tried everything to localize the issue and now I'm stuck.
How do I fix my model or Android code so it produces the same results as my python code?
I've managed to find and fix the issue:
My model from this tutorial included a built-in image normalization layer. Image normalization transforms standard 0-255 image color values into 0.0-1.0 float values suitable for machine learning.
But the metadata I used for the tflite model included 2 parameters for external normalization: mean and std.
The formula for each value is: normalized_value = (value - mean) / std
Since my model handles its own normalization, I need to turn off external normalization by setting mean = 0 and std = 1.
This way I'll get normalized_value = value.
So, setting the tflite metadata parameters to these:
image_min=0,
image_max=255.0,
mean=[0.0],
std=[1.0]
fixed the double normalization issue and my model now produces correct results in Android app.
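For completeness, the same formula is what the TensorFlow Lite Support Library applies on the app side. Below is a minimal Java sketch of the identity normalization, assuming a bitmap decoded as in the question; the 224x224 input size is just a placeholder and not taken from the original model.

import org.tensorflow.lite.support.common.ops.NormalizeOp;
import org.tensorflow.lite.support.image.ImageProcessor;
import org.tensorflow.lite.support.image.TensorImage;
import org.tensorflow.lite.support.image.ops.ResizeOp;

// (value - 0) / 1 == value, so only the model's built-in normalization layer
// ends up transforming the pixels.
ImageProcessor imageProcessor = new ImageProcessor.Builder()
        .add(new ResizeOp(224, 224, ResizeOp.ResizeMethod.BILINEAR)) // placeholder input size
        .add(new NormalizeOp(0.0f, 1.0f)) // mean = 0, std = 1 -> identity
        .build();

TensorImage tensorImage = TensorImage.fromBitmap(bitmap); // bitmap decoded from assets, as in the question
tensorImage = imageProcessor.process(tensorImage);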

Tensorflow Lite Object Detection with Custom AutoML Model

I'd like to test the Object Detection example of TFLite.
https://github.com/tensorflow/examples/tree/master/lite/examples/object_detection/android
The example works great with the default model. But I want to test a custom model generated from AutoML. When I replace the "detect.tflite" and labelmap file in the "\src\main\assets" directory and build, the app crashes after launch.
My model is very simple - it can detect only 2 objects (Tiger and Lion). My labelmap file contains the following:
???
Tiger
Lion
I also commented out the line ("//apply from:'download_model.gradle'") in "build.gradle" to stop the download of the default model and use my custom model from the assets.
I'm new to both Android and the ML space. I'd be glad if anyone can advise on why the app crashes after launch with the custom AutoML model.
Thanks in advance.
Regards.
Two probable errors in the log might be:
Cannot convert between a TensorFlowLite buffer with 1080000 bytes and a ByteBuffer with 270000 bytes. Modify TF_OD_API_INPUT_SIZE accordingly.
Cannot copy between a TensorFlowLite tensor with shape [1, 20, 4] and a Java object with shape [1, 10, 4]. Modify NUM_DETECTIONS according to the custom model.
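For reference, the constants those messages refer to live in the example app's source. A sketch of the relevant lines follows; the exact defaults may differ in your checkout, so treat the values as assumptions to verify:

// DetectorActivity.java -- 270000 bytes corresponds to 300 x 300 x 3 bytes
// (a quantized uint8 model), while 1080000 bytes corresponds to
// 300 x 300 x 3 x 4 bytes (a float model), so both the input size and the
// quantized flag must match what AutoML exported.
private static final int TF_OD_API_INPUT_SIZE = 300;
private static final boolean TF_OD_API_IS_QUANTIZED = true; // set to false for a float model

// TFLiteObjectDetectionAPIModel.java -- the model's output shape [1, 20, 4]
// means it emits 20 boxes per image, so the default of 10 has to be raised.
private static final int NUM_DETECTIONS = 20;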

Android ArCore Sceneform API. How to change textures in runtime?

The server has more than 3000 models, and each of them has several material colors. I need to load the models and textures separately and set textures depending on the user's choice. How do I change baseColorMap, normalMap, metallicMap, and roughnessMap at runtime?
After
modelRenderable.getMaterial().setTexture("normalMap", normalMap.get());
nothing happens.
I must be doing something wrong, and there is no information in the documentation about this.
Thank you for posting this question.
setTexture() appears not to work: unfortunately this part of our API is still a little rough; it works but is very easy to get wrong. We're working on a sample to illustrate how to modify material parameters (including textures) at runtime and will improve our error reporting in the next release.
Thousands of models with multiple permutations: the plan here has two parts:
The binaries used by the Android Studio plugin will be made available for use in build scripts on server platforms. This will allow you to do a server-side conversion of your assets to .sfb. We'll be releasing a blog post soon on how to do this.
The .sfa will get the ability to contain loose textures and materials not explicitly associated with geometry, and .sfa's will be able to declare data dependencies on other .sfa's. This will mean that you can author (and deliver) .sfb's that contain textures/materials (but no geometry) and .sfb's that contain geometry (but no textures/materials), and if they're both available at instantiation time it will just work.
Use this code:
CompletableFuture<Texture> futureTexture = Texture.builder()
        .setSource(this, R.drawable.shoes)
        .build();
and in the ModelRenderable.builder() chain, replace
.thenAccept(renderable -> andyRenderable = renderable)
with
.thenAcceptBoth(futureTexture, (renderable, texture) -> {
    andyRenderable = renderable;
    andyRenderable.getMaterial().setTexture("baseColor", texture);
})
and it should work.
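Putting the two pieces together, here is a minimal sketch based on the HelloSceneform sample; R.raw.andy, R.drawable.shoes, andyRenderable, and TAG are just the sample's placeholders, and the parameter name must match the material defined in your .sfa:

CompletableFuture<Texture> futureTexture = Texture.builder()
        .setSource(this, R.drawable.shoes)
        .build();

ModelRenderable.builder()
        .setSource(this, R.raw.andy)
        .build()
        .thenAcceptBoth(futureTexture, (renderable, texture) -> {
            andyRenderable = renderable;
            // Swap "baseColor" for "normalMap", "metallicMap" or "roughnessMap"
            // to replace those maps instead.
            andyRenderable.getMaterial().setTexture("baseColor", texture);
        })
        .exceptionally(throwable -> {
            Log.e(TAG, "Unable to load renderable or texture", throwable);
            return null;
        });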

Can't load 3ds Model with multiple texture files using Rajawali for android

I am loading a 3ds model using Rajawali for Android, and it displays correctly.
parser = new Loader3DSMax(this, R.raw.riale_3ds);
But the object comes with multiple texture JPEGs, and I can't figure out how to use those textures together in Rajawali. The object opens correctly in other 3D model viewers on my PC.
material.addTexture(new Texture("dcmap1", R.drawable.dcmap1));
material.addTexture(new Texture("dcmap2", R.drawable.dcmap2));
material.addTexture(new Texture("dcmap3", R.drawable.dcmap3));
I am applying this material to the object. If I apply only one texture it looks OK, but when I add all of them as above, the result is messed up.
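One approach worth trying is to give each child mesh of the parsed model its own material instead of stacking all three textures onto a single material. A rough sketch, untested against this particular model, assuming the loader exposes the sub-meshes as children and that the child order matches the texture order:

// Inside your Rajawali Renderer's initScene()
Loader3DSMax parser = new Loader3DSMax(this, R.raw.riale_3ds);
try {
    parser.parse();
    Object3D model = parser.getParsedObject();

    int[] textureIds = { R.drawable.dcmap1, R.drawable.dcmap2, R.drawable.dcmap3 };
    for (int i = 0; i < model.getNumChildren(); i++) {
        Material material = new Material();
        material.enableLighting(true);
        material.setDiffuseMethod(new DiffuseMethod.Lambert());
        material.setColorInfluence(0); // let the texture, not the vertex color, drive the surface
        // Assumes child order matches the texture order -- adjust the mapping as needed.
        material.addTexture(new Texture("dcmap" + (i + 1), textureIds[i % textureIds.length]));
        model.getChildAt(i).setMaterial(material);
    }
    getCurrentScene().addChild(model);
} catch (ParsingException | ATexture.TextureException e) {
    e.printStackTrace();
}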

Is there a way to import a 3D model into Android?

Is it possible to create a simple 3D model (for example, in 3ds Max) and then import it into Android?
Here's where I've got to:
I've used Google's APIDemos as a starting point - there are rotating cubes in there, each specified by two arrays: vertices and indices.
I've built my model using Blender and exported it as an OFF file - a text file that lists all the vertices and then the faces in terms of those vertices (indexed geometry).
Then I created a simple C++ app that takes that OFF file and writes it out as two XMLs containing arrays (one for vertices and one for indices).
These XML files are then copied to res/values, and this way I can assign the data they contain to arrays like this:
int vertices[] = context.getResources().getIntArray(R.array.vertices);
I also need to manually change the number of faces to be drawn here: gl.glDrawElements(GL10.GL_TRIANGLES, 212*6, GL10.GL_UNSIGNED_SHORT, mIndexBuffer); - you can find that number (212 in this case) at the top of the OFF file.
Here you can find my project page, which uses this solution: Github project > vsiogap3d
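To round this out, here is a rough Java sketch of how those resource arrays can end up in buffers for drawing. The class and the R.array.indices resource name are only illustrative, and it mirrors the APIDemos cube, which uses GL_FIXED vertex data:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.IntBuffer;
import java.nio.ShortBuffer;
import javax.microedition.khronos.opengles.GL10;
import android.content.Context;

class OffModel {
    private final IntBuffer vertexBuffer;
    private final ShortBuffer indexBuffer;
    private final int indexCount;

    OffModel(Context context) {
        int[] vertices = context.getResources().getIntArray(R.array.vertices);
        int[] indices = context.getResources().getIntArray(R.array.indices);

        // Vertex coordinates as 16.16 fixed-point ints (GL_FIXED), like the APIDemos cube.
        ByteBuffer vbb = ByteBuffer.allocateDirect(vertices.length * 4);
        vbb.order(ByteOrder.nativeOrder());
        vertexBuffer = vbb.asIntBuffer();
        vertexBuffer.put(vertices).position(0);

        // Indices as unsigned shorts for glDrawElements.
        ByteBuffer ibb = ByteBuffer.allocateDirect(indices.length * 2);
        ibb.order(ByteOrder.nativeOrder());
        indexBuffer = ibb.asShortBuffer();
        for (int index : indices) {
            indexBuffer.put((short) index);
        }
        indexBuffer.position(0);
        indexCount = indices.length;
    }

    void draw(GL10 gl) {
        gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glVertexPointer(3, GL10.GL_FIXED, 0, vertexBuffer);
        // The count comes from the index array itself, so there is no
        // hard-coded 212*6 to keep in sync with the OFF file.
        gl.glDrawElements(GL10.GL_TRIANGLES, indexCount, GL10.GL_UNSIGNED_SHORT, indexBuffer);
    }
}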
You may export it to ASE format.
From ASE, you can convert it to your code manually or programmatically.
You will need the vertices for the vertex array and the faces for the indices in Android.
Don't forget that you have to set
gl.glFrontFace(GL10.GL_CCW);
because the 3ds Max default is counter-clockwise.
It should be possible. You can ship the file as a data file with your program (it will be pushed onto the emulator and packaged for installation onto an actual device). Then you can write a model loader and viewer in Java using the Android and GLES libraries to display the model.
Specific resources on this are probably limited, though. 3ds is a proprietary format, so third-party loaders are in short supply and mostly reverse-engineered. Other formats (such as Blender or MilkShape) are more open, and you should be able to find details on writing a loader for them in Java fairly easily.
Have you tried min3d for Android? It supports 3ds Max, obj and md2 models.
Not sure about Android specifically, but generally speaking you need a script in 3DS Max that manually writes out the formatting you need from the model.
As to whether one exists for Android or not, I do not know.
You can also convert a 3DS MAX model with the 3D Object Converter:
http://web.t-online.hu/karpo/
This tool can convert a 3ds object to text/XML format or 'C' code.
Please note that the tool is not free. You can try it for a 30-day trial period. 'C' code and XML converters are available.
'c' OpenGL output example:
glDisable(GL_TEXTURE_2D);
glEnable(GL_LIGHTING);
glEnable(GL_NORMALIZE);
GLfloat Material_1[] = { 0.498039f, 0.498039f, 0.498039f, 1.000000f };
glBegin(GL_TRIANGLES);
glMaterialfv(GL_FRONT,GL_DIFFUSE,Material_1);
glNormal3d(0.452267,0.000000,0.891883);
glVertex3d(5.108326,1.737655,2.650969);
glVertex3d(9.124107,-0.002484,0.614596);
glVertex3d(9.124107,4.039649,0.614596);
glEnd();
Or direct 'c' output:
Point3 Object1_vertex[] = {
{5.108326,1.737655,2.650969},
{9.124107,-0.002484,0.614596},
{9.124107,4.039649,0.614596}};
long Object1_face[] = {
3,0,1,2,
3,3,4,5,
3,6,3,5};
You can then migrate those collections of objects into your Java code.
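For example, the excerpt above could be transcribed into Java roughly like this (field names are only illustrative):

// Same data as the 'c' output above.
float[] object1Vertices = {
    5.108326f, 1.737655f, 2.650969f,
    9.124107f, -0.002484f, 0.614596f,
    9.124107f, 4.039649f, 0.614596f,
};
// Each face entry starts with its vertex count (3), followed by vertex indices.
short[] object1Faces = {
    3, 0, 1, 2,
    3, 3, 4, 5,
    3, 6, 3, 5,
};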
