Tensorflow on Android via Kivy - android

I found this answer, which gave me the idea that, instead of using a compiled TensorFlow graph, you might be able to use Kivy on your Android phone. That way you could talk to the TensorFlow graph directly through python-for-android.
A possible advantage would be the ability to train/adapt the model on the fly. As far as I know, otherwise you can only use the final trained model (but this is currently unanswered on Stack Overflow). Also, cross compiling to Windows Phone might become possible, which it currently isn't (see here).
I don't know the technical limitations. Can anyone confirm that this is possible, and maybe what would be necessary?

Although Windows Phone could be a possibility in the future, it's basically a situation where almost no one cares, as there isn't much interest in porting to it. However, there is work in progress on using ANGLE to translate OpenGL to DirectX, so it could become possible later. The app packaging is still awkward, though, so it would eat a lot of time.
Yet I think it might be possible to use those unofficial APK -> WinPhone app converters. Re TensorFlow: to me it looks like only a python-for-android recipe is missing, so try writing one. :P
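For reference, this is roughly the shape a python-for-android recipe takes. Everything here is a placeholder sketch (the version, URL and dependencies are made up, and a real TensorFlow build would need far more work, e.g. Bazel and native NDK compilation); it only shows the skeleton a recipe author starts from.

```python
# Hypothetical skeleton of a python-for-android recipe; version, url and
# depends are placeholders, not a working TensorFlow build.
from pythonforandroid.recipe import CompiledComponentsPythonRecipe


class TensorFlowRecipe(CompiledComponentsPythonRecipe):
    version = '0.0.0'                                        # placeholder version
    url = 'https://example.com/tensorflow-{version}.tar.gz'  # placeholder URL
    depends = ['numpy', 'setuptools']                        # assumed dependencies
    site_packages_name = 'tensorflow'


recipe = TensorFlowRecipe()
```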

Related

Is it possible to use a TensorFlow.js custom model from a local file in React Native?

I am trying to use a custom Tensorflow.js model in react native using React Native CLI.
It seems that @tensorflow/tfjs-react-native is not maintained anymore. When I try to install it based on the instructions on the official GitHub page, I run into dependency conflicts that are impossible to resolve, which effectively renders the package useless.
tflite is not an option either, given the nature of my model (the details are irrelevant, though I would gladly explain if it would help in solving my problem). But even if I wanted to use that format, I doubt it would work, because the available package also looks deprecated, considering it was last updated around three years ago.
For the same reason (the model's nature), ONNX is not an option either, despite its package being fully up to date.
So that leaves me with Tensorflow.js. And I should mention that I know the hardware limitations involved in using this too.
So my question is: considering the above, is there a way to use TensorFlow.js (meaning this package) in React Native and load a custom model to make predictions?
Or should I just give up? If there is a way, please do share a simple working piece of code.
Also, my model is a Keras model saved in TensorFlow.js format. Finally, I also tried loading a model from a URL, but no luck there either (note that I am not trying to imply that it is impossible, only that I was not able to do it; it would be awesome if you could tell me a way to do that too, if possible).
Thanks in advance for your time. If anything is unclear, tell me and I will try to elaborate.

Python to Android conversion with Kivy

I have a Python 3 desktop application which I want to convert to an Android APK. I saw that the Kivy module exists and might be able to pull this off, but I am concerned about its ability to make the APK work just like the Python code. I use many different modules (PIL, opencv, pyserial, threading, watchdog, file_read_backwards, etc.).
Is this possible, or am I asking for too much? And if it is, how can I choose/handle which Android version the APK will target?
threading is in the stdlib, so you have it
file_read_backwards is pure Python and doesn't seem to require anything (as per setup.py), although the requirements_dev.txt lists a lot more things, so I'm not sure.
PIL has a recipe (https://github.com/kivy/python-for-android/blob/master/pythonforandroid/recipes/pil/__init__.py), and there is also now a Pillow recipe, which is probably better to use (https://github.com/kivy/python-for-android/blob/master/pythonforandroid/recipes/Pillow/__init__.py), so that seems fine.
pyserial is pure Python, and I think I remember people having success with serial over USB on Android, though I didn't try it myself. It might require a device able to act as a USB host and not just a client, but I don't know much about that.
opencv has a recipe, but it might be a touchier one, as it hasn't been touched in years (aside from some cosmetic fixes), and I think I've seen people having issues with it; still, I'd say it's worth a shot.
Overall, it's certainly worth a try. Before converting any of your code, build a hello-world application with Kivy for Android, then redo the build, adding your dependencies to the requirements one by one, and see whether you can solve (or find help with) whatever breaks. If all goes well, then look into porting your code, which is the part where success will depend more on you than on what kivy/python-for-android can do.
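To illustrate the suggested first step, here is about the smallest Kivy app you can build for Android (the app and label names are arbitrary):

```python
# Minimal Kivy "hello world" to sanity-check an Android build
# before porting any real application code.
from kivy.app import App
from kivy.uix.label import Label


class HelloApp(App):
    def build(self):
        return Label(text='Hello from Kivy on Android')


if __name__ == '__main__':
    HelloApp().run()
```

Once that builds and runs on the device, the requirements line in buildozer.spec (for example requirements = python3,kivy, then later adding pillow, pyserial, and so on) is where you add the extra modules one at a time.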

Pros and Cons for Haxe and Kivy

I'm looking to develop an application for iOS, Android, Windows Phone and for Desktop no matter if it's web or standalone. Does anyone have experience with Haxe + NME or Kivy that they can share in detail?
I've been looking for something that can deploy to all platforms, and these are the two best options I seem to have found. I'm not looking to make a game, though. It's more like an app with a lot of touch listeners on images: touch an image, then hide this, create that, and do a lot of math behind the scenes. I do however need a pathfinding library, but pretty much every engine I've worked in has had an A* pathfinding library. I also need a slide-view library so users can swipe between pages like they do on their smartphone home screen. Any information you can share on this topic is greatly appreciated. Thanks in advance for reading and for any help provided. Sorry for the trouble.
No experience with Haxe here, but I can answer for Kivy:
First, Windows Phone is currently unsupported. To my knowledge, no one has attempted a port; it's probably doable, but it doesn't exist yet, and no core contributor has a Windows Phone device, so until that changes, or someone with that motivation comes along, there is a low probability it will happen.
For your interactive needs, Kivy would fit the bill pretty easily, being really focused on making per-widget touch handling easy to define. We don't have much information about your math needs; if they are heavy, you'll probably want something like numpy to be usable behind the scenes, and/or use threads to do the heavy lifting without blocking the application. This can totally be done with Kivy, so I see no particular issue there. For A*, there isn't any implementation directly inside Kivy, but you should be able to use a pure-Python implementation (there are dozens out there); if your needs on this side require more performance, you can cythonize it, or use a C implementation compiled for each target.
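For a sense of scale, a plain-Python A* over a grid is only a few dozen lines; the grid format and function names below are just assumptions for illustration, not anything Kivy provides.

```python
# Minimal grid-based A* sketch in plain Python; the grid format and
# helper names are illustrative assumptions, not a Kivy API.
import heapq


def astar(grid, start, goal):
    """grid: 2D list where 0 = walkable, 1 = blocked; start/goal: (row, col)."""
    def h(a, b):  # Manhattan distance heuristic
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    open_heap = [(h(start, goal), 0, start, None)]
    came_from = {}
    best_g = {start: 0}
    while open_heap:
        _, g, node, parent = heapq.heappop(open_heap)
        if node in came_from:          # already expanded with a better cost
            continue
        came_from[node] = parent
        if node == goal:               # reconstruct the path back to start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float('inf')):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc), goal), ng, (nr, nc), node))
    return None  # no path found
```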
Hope this helps.
Long-time Haxe user here, though personally I mostly use Haxe for web-app projects, not NME as much. Until recently, NME's main focus has been (far and away) gaming. There have been a few recent efforts to create nice UI toolkits building on NME's cross-platform strengths:
https://github.com/RealyUniqueName/StablexUI - Demo (works on HTML5, flash, native desktop&mobile)
https://github.com/ianharrigan/haxeui
but they are very recent additions, so if you're looking for a tried and tested solution Kivy (never heard of it before, but looks cool!) looks like it has a bit more maturity and a bit more polish going for it.
In terms of performance and overall reliability, Haxe/NME is great, but it's getting those native-feeling UI widgets that will be your pain point. Other than that though, it's an amazing language to work with :) Python's pretty good as well though... each to their own!
At the time of writing, people are experimenting with using native UI (there is a talk at the upcoming conference about an Objective-C target, and the Java and C# targets are becoming more mature, so there are your three main mobile platforms covered), so that could be an option if you want native UI components. It's not ready yet, though; this is just me hoping it might become reality over the next year or so :)
Good luck with your project either way! If you do choose to go with Haxe/NME, be sure to ask questions (either here, the NME forums, or the Haxe mailing list) so that people can help you on your way.

Cross Mobile Options

I created an Android app. While creating one specific app was an interesting challenge, I'm now looking into creating a group of similar apps.
I'd like to create a group of similar Android apps and then move on to creating the same on tablets and iOS... (anything mobile).
I've considered doing so with a product called PhoneGap or doing a web based mobile app. Both of these options seem less than ideal. Doing the Android app I've been frustrated by Java's lack of control and low level constructs. Moving to something like a web based app seems like the exact wrong direction.
C++ is my language of choice. It has the ability to work at a low level, is highly portable across platforms, and has significant support for generic coding, which would be useful for generating a group of similar apps. However, the Android documentation suggests not using C++ unless your goal is porting existing code or handling computationally heavy tasks.
I'm leaning towards using C++ anyway, but are there other options I've not considered?
Thanks
You could in theory write your logic in C++ and then have UI layers on top that make use of it. If you are really comfortable with C++ that might be the way to go.
Almost all the other parts (networking, UI, animation, etc.) are better off done in the native language of the platform. Using cross-platform solutions always limits you in some way, and usually leads to an application that is not as good as it could be on any platform.
Well, Google's recommendation not to use C++ is based on the following, I believe. C++ is low level, so you can get extra performance out of it if you know what you are doing. Google makes the reasonable assumption that many programmers do not. It is easier for an inexperienced programmer to do harm in C++ than to get a performance boost.
But, if you know what you are doing, it can help you. UI elements on both iOS and Android are implemented in their main language (obj-c, and Java respectively) so there is not a great way around that, but you can write core logic and other functions in C++ and it will be portable between them (iOS can use C++ directly and Android can use it via the Native Development Kit).
There are a few other options available. The one I ended up using is Appcelerator Titanium but please stay away from it. If your project gets complicated or large at all you will hate yourself for choosing it, as I did. Another interesting one that uses C++ instead of Javascript is Marmalade. I haven't used it though, so I can't comment on it.
A non-free solution that I hear good things about is Xamarin, who have ported both environments to C# and .NET using Mono. However, you still have to write two versions of your code for the UI, as far as I can tell.

How do I convert an iGoogle gadget into an Android widget?

I'm trying to recreate this progress bar clock gadget I built in iGoogle as a widget for Android devices.
It seems like it should be pretty straightforward, especially considering the code is only 75 lines, but I have very little experience with Android development, even more so when it comes to building a widget.
Hopefully Google will develop (if they somehow aren't already) a translation tool to accomplish this task, but until then, I'm out of ideas.
So here are some questions:
Are there any conversion tools for this yet? Something that would let you use a program and/or a web service to point at the XML file used for the iGoogle gadget and have the program/service return the project files needed for an Android app?
I'm not sure if this approach mentioned above is at all possible, but I'm sure that an Android app can be developed to perform the same way as it does on iGoogle - it's pretty basic Javascript + CSS syntax.
In any case, where should I start, and what tutorials (if any) exist with regard to this specific task of translating iGoogle gadgets into Android apps (preferably avoiding the "iframe" type of app that just points to the mobile version of a webpage)?
Are my assumptions and/or intentions out of scope here? I feel like this is an easily doable project via the traditional means of using the Android SDK with Eclipse, for example. I tried messing around with the online GUI that Google had for developing Android apps, but the programming interface was like Visual Basic for 3rd graders; it just wasn't very intuitive either.
Also, any other suggestions on what steps I could take to execute this task would be greatly appreciated. I'm just guessing at how this could potentially be done, so if anyone has already done something like this, or has insight into this conversion process that's more valuable than my pure speculation above, please answer back with some suggestions on how to accomplish this iGoogle gadget -> Android application conversion.
I found another somewhat similar question on SO, but it doesn't have the same end result that I'm looking for: iGoogle Gadget on Android Phone as APP or Widget
Thanks a bunch for any help!
So far there is no conversion tool that turns a preexisting widget written for another system into an Android widget. You have to rewrite it.
I know some code generators exist, but I'm not familiar with them. They won't take the gadget you are referring to as a source, but maybe they can help you redesign it for Android. That would require checking.
