I have been trying to graph data in Android using matplotlib through Chaquopy. So far, creating the plots themselves in Chaquopy has been very easy. However, I am unclear on exactly how to save the plot and load it into an ImageView. I looked at this example (How to display python matplotlib graphs (png) with Chaquopy in Android Studio), which seems like a promising method, but it involves plotting in Python, storing the image as a bytes object, and then loading the bytes object into the ImageView in Java. Ideally, I would like to be able to do all of this, including updating the ImageView, from the Python script. Is this possible?
Thanks for any help and advice!
Edit:
I found a solution that avoids the bytes object entirely:
from android.os import Environment  # Chaquopy imports of Android classes
from android.graphics import BitmapFactory

root = Environment.getExternalStorageDirectory()
path = root.getAbsolutePath() + "/fig1.png"
plt.savefig(path)
bitmap = BitmapFactory.decodeFile(path)
self.findViewById(R.id.imageView).setImageBitmap(bitmap)
If anyone has a solution using the bytes object though I would welcome any suggestions.
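To sketch what I had in mind for the bytes route (untested; I'm assuming Chaquopy converts a Python bytes object to a Java byte[] when it is passed to decodeByteArray, and that plt and self are available as in the code above):

```python
import io
from android.graphics import BitmapFactory  # Chaquopy import of the Android class

# Render the current figure into an in-memory PNG instead of a file
buf = io.BytesIO()
plt.savefig(buf, format="png")
data = buf.getvalue()

# decodeByteArray(byte[] data, int offset, int length)
bitmap = BitmapFactory.decodeByteArray(data, 0, len(data))
self.findViewById(R.id.imageView).setImageBitmap(bitmap)
```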
A little bit of backstory: I am currently an iOS developer transitioning into working with Android, more specifically Kotlin, and the way that Android development handles images confuses the hell out of me!
Can someone please explain how/why images are saved as type Int, and how I can set an image as a property on one of my objects? I understand Drawables have IDs of type Int and that I should save them that way, but I am currently trying to get an image from the user's gallery, and the data from the Intent in the "onActivityResult" function comes back as an Intent, which I then convert to a bitmap.
SUMMARY: How do I properly work with Image properties in Kotlin (Int) and convert a bitmap to type Int to set my property?
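For context, here's roughly how I've been trying to model it (a sketch with my own names; R.drawable.placeholder stands in for any drawable in the project):

```kotlin
import android.graphics.Bitmap

// A drawable "image" is only a generated resource ID (an Int), while a
// gallery pick ultimately yields an actual Bitmap, so I'm unsure whether
// the property should be an Int, a Bitmap, or both:
data class ProfileItem(
    var iconResId: Int = R.drawable.placeholder, // resource ID, type Int
    var photo: Bitmap? = null                    // decoded from the gallery Intent
)
```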
I have a custom built tensorflow graph implementing MobileNetV2-SSDLite which I implemented myself. It is working fine on the PC.
However, when I convert the model to TFLite (all float, no quantization), the model weights are changed drastically.
To give an example, a filter which was initially:
0.13172674179077148,
2.3185202252437188e-32,
-0.003990101162344217
becomes:
4.165565013885498,
-2.3981268405914307,
-1.1919032335281372
The large weight values are completely throwing off my on-device inferences. Need help! :(
What command are you using to convert to TFLite? For instance, are you using toco, and if so, what parameters are you passing? While I haven't been looking at the filters, here are my default instructions for finetuning MobileNetV2-SSD and SSDLite graphs, and the models have been performing well.
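A conversion pipeline along these lines (a sketch of the usual TF Object Detection API route; the paths, checkpoint number, and 300x300 input shape are placeholders for your setup):

```shell
# Export an inference graph suitable for TFLite
# (export_tflite_ssd_graph.py ships with the TF Object Detection API)
python object_detection/export_tflite_ssd_graph.py \
  --pipeline_config_path=pipeline.config \
  --trained_checkpoint_prefix=training/model.ckpt-50000 \
  --output_directory=tflite_export \
  --add_postprocessing_op=true

# Convert the exported graph with toco, keeping everything in float
toco \
  --graph_def_file=tflite_export/tflite_graph.pb \
  --output_file=detect.tflite \
  --input_shapes=1,300,300,3 \
  --input_arrays=normalized_input_image_tensor \
  --output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
  --inference_type=FLOAT \
  --allow_custom_ops
```

If you are converting with different flags (for example quantization-related ones, or a different frozen graph), that would be the first place I'd look for the weight discrepancy.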
I'm trying out the java_arcore_hello_ar sample app but replacing the andy.obj with my own model created in Blender. I have exported the blender object using settings from this tutorial
The .obj and .mtl files were placed in the assets folder but when I tap on the screen I get nothing. It doesn't show an error so I'm thinking it does place the object on the screen but is not drawing it for whatever reason.
Google searches generally bring up tutorials where you have to create your own parser for the object, but as far as I can see, the ObjectRenderer class in the arcore package does this heavy lifting for you.
Has anyone tried this with any success? Do I have to do further work with the .mtl file?
I did get this working by extending the code to read OBJ and MTL files.
You can take a look at my code at https://github.com/JohnLXiang/arcore-sandbox .
I'm also new to OpenGL; my code is not perfect, but it works at least.
If it doesn't give any error information, I think the possible reasons are:
1. The OBJ model has been placed at another position, for example very far away. You should check that the model's position is at the origin in Blender during modeling.
2. The OBJ model is different from the one in the java_arcore_hello_ar sample, so the sample's OBJ parsing library may have parsed it incorrectly.
In that case, you can parse the OBJ model yourself.
I cannot find any way to retrieve a Path object representing a string. Does it exist? A list of the necessary points would be enough, but I guess, internally, a path is used.
For example, in GDI+ there is:
GraphicsPath p = new GraphicsPath();
p.AddString("string", new FontFamily("Arial"), (int)FontStyle.Regular, 32f, new PointF(0f, 0f), StringFormat.GenericDefault);
From there any point of the "drawn string" can be accessed and modified.
PS: I do not mean drawing a text along a path.
I've spent quite a long time solving this problem (to draw vectorized text in OpenGL) and had to dig deep into libSkia sources. And it turned out pretty simple:
Indeed, the canvas uses paths internally, and it converts text to vector paths using SkPaint's getTextPath() method. Luckily, this is exposed on the Java side in the public API as android.graphics.Paint.getTextPath(). After that, you can read the Path back with android.graphics.PathMeasure.
Unfortunately, you can't read exact drawing commands, but you can sample the curve pretty closely using something like bisection.
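A minimal sketch of that approach (untested; getTextPath, PathMeasure, getPosTan, and nextContour are the real Android APIs, but the class, method name, and fixed sampling step here are mine):

```java
import android.graphics.Paint;
import android.graphics.Path;
import android.graphics.PathMeasure;

import java.util.ArrayList;
import java.util.List;

// Sketch: turn a string into a vector Path, then sample points along it.
public class TextOutlineSampler {
    // Returns sampled {x, y} points along every contour of the text outline.
    public static List<float[]> samplePoints(String text, float textSize, float step) {
        Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
        paint.setTextSize(textSize);

        Path textPath = new Path();
        // (text, start index, end index, x, y, destination path)
        paint.getTextPath(text, 0, text.length(), 0f, textSize, textPath);

        List<float[]> points = new ArrayList<>();
        PathMeasure measure = new PathMeasure(textPath, false);
        float[] pos = new float[2];
        float[] tan = new float[2];
        do {
            float length = measure.getLength();
            // A smaller step approximates the curves more closely
            for (float d = 0f; d < length; d += step) {
                measure.getPosTan(d, pos, tan);
                points.add(new float[] { pos[0], pos[1] });
            }
        } while (measure.nextContour()); // glyphs usually have several contours
        return points;
    }
}
```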
I'm working with Android, and I really need a fast way to take bitmap data in BGRA format and fill it into a bitmap as ARGB.
Also, note that the data comes in byte[] format, and I have to convert it to int[] format as well.
Can anyone tell me how to do this?
Thanks in advance!
If you want to load a Bitmap from a byte stream, you can use BitmapFactory.decodeStream. You could then use getPixels to get the int array. This is one way I know to do this, though probably not the fastest. A faster way would be to convert the bytes to ints directly; if your byte array is nothing but pixel data, this won't be too hard.
BGRA to ARGB can be done with bitshifting quite fast.
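A minimal sketch of that conversion in plain Java (assuming 4 bytes per pixel in B, G, R, A order; the class and method names are mine):

```java
public class BgraToArgb {
    // Convert a BGRA byte buffer (4 bytes per pixel) into an ARGB int array,
    // the packing Android's Bitmap.setPixels expects.
    public static int[] convert(byte[] bgra) {
        int[] argb = new int[bgra.length / 4];
        for (int i = 0, p = 0; p < argb.length; i += 4, p++) {
            int b = bgra[i] & 0xFF;       // mask to undo Java's sign extension
            int g = bgra[i + 1] & 0xFF;
            int r = bgra[i + 2] & 0xFF;
            int a = bgra[i + 3] & 0xFF;
            argb[p] = (a << 24) | (r << 16) | (g << 8) | b;
        }
        return argb;
    }

    public static void main(String[] args) {
        byte[] px = {(byte) 0x10, (byte) 0x20, (byte) 0x30, (byte) 0xFF}; // B,G,R,A
        System.out.printf("%08X%n", convert(px)[0]); // prints FF302010
    }
}
```

The `& 0xFF` masks matter: without them a byte like 0xFF would sign-extend to 0xFFFFFFFF and corrupt the other channels.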
A nice source you would probably like:
https://web.archive.org/web/20141229164101/http://bobpowell.net/lockingbits.aspx
The fastest way, I think, would be to do it in native code using the NDK. I've been considering using it for image processing for some time but haven't had the chance yet, so I don't know much about it (i.e. how you would access your byte buffer), but you could start from the Plasma sample for bitmap processing in JNI.