Android Image Rectangle Detection with NO OpenCV

I'm baffled that on Android we have to import a 30 MB OpenCV library to detect rectangles in images / video frames. On iOS that is pretty easy using CIDetector.
Has anyone found a solution that isn't OpenCV based? Maybe using RenderScript? I've found this one (explained here), which implements some kind of edge detection, but I'm not sure whether it's the right basis to build on. Any vision / graphics expert out there who could evaluate this and maybe point me in the right direction?
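Not an answer to the full rectangle-detection problem, but for anyone exploring the RenderScript route mentioned above: the framework ships a convolution intrinsic, so a basic edge filter can run without any external library (note that RenderScript has since been deprecated on recent Android versions). This is only a sketch of the edge-detection stage; finding rectangle contours on top of the edge map is left as an exercise, and the class/method names are just illustrative.

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.renderscript.Allocation;
import android.renderscript.Element;
import android.renderscript.RenderScript;
import android.renderscript.ScriptIntrinsicConvolve3x3;

public class EdgeMap {

    // Runs a simple 3x3 Laplacian-style edge filter over the bitmap using
    // RenderScript's built-in convolution intrinsic. No OpenCV involved.
    // This only produces an edge image; rectangle/contour extraction on top
    // of it still has to be written by hand.
    public static Bitmap detectEdges(Context context, Bitmap src) {
        Bitmap out = Bitmap.createBitmap(src.getWidth(), src.getHeight(), src.getConfig());

        RenderScript rs = RenderScript.create(context);
        Allocation in = Allocation.createFromBitmap(rs, src);
        Allocation result = Allocation.createFromBitmap(rs, out);

        ScriptIntrinsicConvolve3x3 convolve =
                ScriptIntrinsicConvolve3x3.create(rs, Element.U8_4(rs));
        // Laplacian kernel: strong response at edges, ~0 in flat regions.
        convolve.setCoefficients(new float[] {
                0, -1,  0,
               -1,  4, -1,
                0, -1,  0
        });
        convolve.setInput(in);
        convolve.forEach(result);

        result.copyTo(out);
        rs.destroy();
        return out;
    }
}
```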

Related

Detect & measure shape (circle) sizes in an image on Android

The requirement is to create an Android application running on one specific mobile device that records video of a human eye pupil dilating in response to a bright light (which is physically attached to the mobile device). The video is then post-processed frame by frame on the device to detect & measure the diameter of the pupil AND the iris in each frame. Note the image processing does NOT need doing in real-time. The end result will be a dataset describing the changes in pupil (& iris) size over time. It's expected that the iris size can be used to enhance confidence in the pupil diameter data (eg removing pupil size data that's wildly wrong), but also as a relative measure for how dilated the eye is at any point.
I am familiar with developing Android mobile apps, but my experience with image processing is very limited. I've researched solutions and it seems that the answer may lie with the OpenCV/JavaCV libraries, which should provide shape detection (e.g. http://opencvlover.blogspot.co.uk/2012/07/hough-circle-in-javacv.html), but can anyone provide guidance on these specific questions:
Am I right to think it can detect the two circle shapes within a bitmap, one inside the other? i.e. shapes inside each other are not a problem.
Is it true that JavaCV can detect a circle and return a position & radius/diameter? i.e. it doesn't return a set of vertices that then require further processing to compare with a circle? It seems to have a HoughCircles method, so I think yes.
What processing of each frame is typically used before doing shape detection? For example, an algorithm to enhance edges, smooth, or remove colour?
Can I use it not just to detect the presence of the circles, but also to measure their diameter? (In pixels, which can then easily be converted to real-world measurements because known hardware is being used.) I think yes, but it would be great to hear confirmation from those more familiar.
This project is a non-commercial charitable project, so any help especially appreciated.
I would really suggest using the NDK, as it is a bit richer in features. It also allows you to run and test your algorithms on a laptop with images before pushing them to a device, which speeds up development.
Pre-processing steps:
Typically one would use thresholding or Canny edge detection, followed by morphological operations like erode and dilate.
For detecting the iris/pupil, HoughCircles is not a very good method; feature detection methods like MSER work better for not-so-well-defined circles. Here is another answer I wrote on the same topic which has code that could help.
If you are looking to measure the regions, I would suggest going through this blog. It has a clear explanation of the steps involved in getting a reasonably accurate measurement.
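To make those steps concrete, here is a rough sketch using the OpenCV Java bindings (3.x/4.x; the JavaCV calls are equivalent): the usual blur / Canny / morphology chain, plus a HoughCircles call, which does return centre and radius directly, even though, as noted above, MSER tends to behave better on poorly defined pupils. All thresholds and radii below are placeholder values you would tune for your footage.

```java
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;

public class PupilPreprocessing {

    // Typical pre-processing before circle detection: grayscale, smooth,
    // edge-detect, then clean up with morphological dilate/erode.
    public static Mat preprocess(Mat frameBgr) {
        Mat gray = new Mat();
        Imgproc.cvtColor(frameBgr, gray, Imgproc.COLOR_BGR2GRAY);
        Imgproc.GaussianBlur(gray, gray, new Size(9, 9), 2);

        Mat edges = new Mat();
        Imgproc.Canny(gray, edges, 40, 120);

        Mat kernel = Imgproc.getStructuringElement(Imgproc.MORPH_ELLIPSE, new Size(3, 3));
        Imgproc.dilate(edges, edges, kernel);
        Imgproc.erode(edges, edges, kernel);
        return edges;
    }

    // HoughCircles returns (x, y, radius) per detection, so the diameter in
    // pixels comes straight out of the result -- no vertex post-processing.
    public static void findCircles(Mat gray) {
        Mat circles = new Mat();
        Imgproc.HoughCircles(gray, circles, Imgproc.HOUGH_GRADIENT,
                1.0,                // accumulator resolution
                gray.rows() / 4.0,  // minimum distance between centres
                100, 30,            // Canny high threshold, accumulator threshold
                10, 200);           // min / max radius in pixels
        for (int i = 0; i < circles.cols(); i++) {
            double[] c = circles.get(0, i);   // c[0] = x, c[1] = y, c[2] = radius
            double diameterPx = 2 * c[2];     // use (c[0], c[1]) as centre, diameterPx as measurement
        }
    }
}
```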

Android : AndEngine pixel perfect collision GLES2?

I am using AndEngine to implement a 2D game and I am now wondering if there is actually any already implemented pixel-perfect collision detection library/extension/... for AndEngine GLES2. I've been searching for hours now and found nothing. Please do not refer me to this, as I've already tried it and it only works for GLES1. If there really isn't any existing work, I'm open to any ideas for implementing it myself (although I chose AndEngine precisely for pixel-perfect collisions, so perhaps that was a bad idea).
Thank you.
AndEngine Collision Extension:
https://github.com/MakersF/AndEngineCollisionsExtension
This is an extension that aims to bring different collision methods (perfect or approximations) to AndEngine GLES2
Supported Collision Methods:
Pixel-Perfect Collision (supports: translation, scale, rotation, skew)
It also supports pixel-perfect collision between pixel-perfect shapes and rectangular shapes, without the latter needing to be a pixel-perfect shape.
Alpha values different from 0 (you set the threshold that identifies whether a pixel is solid or not)
You can use the Utils methods to check the performance in your app or to output the collision mask to check that it is what you need.
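For anyone wondering what "pixel perfect" means under the hood: the usual technique (and, as far as I understand, roughly what the extension does internally) is a cheap bounding-box test first, then a per-pixel alpha-threshold test over the overlapping region. The sketch below is a generic illustration of that idea in plain Android code, not the extension's actual API, and it ignores rotation/scale for brevity.

```java
import android.graphics.Bitmap;
import android.graphics.Rect;

public class PixelCollision {

    // Returns true if any overlapping pixel of both bitmaps is "solid",
    // i.e. its alpha is above the given threshold (0-255).
    // Positions are the top-left screen coordinates of each bitmap; this
    // sketch ignores rotation and scale, which a real extension must handle.
    public static boolean collide(Bitmap a, int ax, int ay,
                                  Bitmap b, int bx, int by,
                                  int alphaThreshold) {
        Rect ra = new Rect(ax, ay, ax + a.getWidth(), ay + a.getHeight());
        Rect rb = new Rect(bx, by, bx + b.getWidth(), by + b.getHeight());
        Rect overlap = new Rect();
        if (!overlap.setIntersect(ra, rb)) {
            return false;                       // bounding boxes don't even touch
        }
        for (int y = overlap.top; y < overlap.bottom; y++) {
            for (int x = overlap.left; x < overlap.right; x++) {
                int alphaA = a.getPixel(x - ax, y - ay) >>> 24;
                int alphaB = b.getPixel(x - bx, y - by) >>> 24;
                if (alphaA > alphaThreshold && alphaB > alphaThreshold) {
                    return true;                // two solid pixels overlap
                }
            }
        }
        return false;
    }
}
```

Real implementations precompute bit masks instead of calling getPixel per pixel, but the logic is the same.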

Live Wallpaper Water Ripple Effect

I'm working on a live wallpaper that incorporates some water ripple effects on touching the screen but I'm a little stuck.
Would it be better to create multiple images and loop through them to create a ripple animation or would it be better to distort the bitmap a bit before I place it on the canvas?
This is a video of a very nice ripple effect done through OpenGL.
I don't have any experience with OpenGL yet, and was wondering whether it is still possible to create a 2D water effect for the live wallpaper.
I wanted to implement a realistic ripple effect on Android too, so I'll share my experience:
As a reference implementation I took Sergey Chikuyonok's JavaScript port of Neil Wallis's Java algorithm. Here's a playground where you can experiment with the original JS code: http://jsfiddle.net/esteewhy/5Ht3b/6/
At first, I ported the JS code to Java, only to realize that there was no way to squeeze more than 1 fps out of my Huawei U8100 hardware. (There are several similar attempts on the net, all with the same conclusion: they're ridiculously slow.)
BTW, this SO answer was quite useful for getting a basic understanding of how to code interactive graphics on Android: https://stackoverflow.com/a/4946893/35438. I borrowed the fps counter from there.
Then I decided to try the Android NDK and reimplement the original algorithm in pure C (my first encounter with C in 10+ years!). Despite the NDK docs being somewhat confusing (especially as to requirements and prerequisites), it all worked like a charm, and I was able to achieve up to 30 fps. That might not be too impressive, but it's still a radical improvement over the Java code.
Finally, I've put all my work online: https://github.com/esteewhy/whater, so feel free to play with it. It contains:
The interactive bouncing ball code mentioned above (just for reference).
The water ripple Java port (slow as hell!)
The water ripple C implementation (needs the NDK to compile and the JDK to create the .h file).
(The project is not "clean", i.e. all the binaries are included, so you can try to run it "as is" even without the NDK.)
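In case the links above ever go stale: the heart of Neil Wallis's algorithm (the part all of these ports implement) is the classic two-buffer height-field update. Below is a rough Java sketch of the general technique, not the exact code from the repo; array sizes, damping, and the mapping of heights back to pixels are simplified.

```java
// A sketch of the general two-buffer ripple technique these ports are based on.
public class RippleField {
    private final int width, height;
    private int[] current, previous;   // wave height per pixel

    public RippleField(int width, int height) {
        this.width = width;
        this.height = height;
        current = new int[width * height];
        previous = new int[width * height];
    }

    // Call on touch: push the surface down at (x, y).
    public void disturb(int x, int y, int strength) {
        previous[y * width + x] = -strength;
    }

    // One simulation step: every cell takes the average of its neighbours
    // from the previous frame, minus its own old value, then is damped.
    public void step() {
        for (int y = 1; y < height - 1; y++) {
            for (int x = 1; x < width - 1; x++) {
                int i = y * width + x;
                int v = ((previous[i - 1] + previous[i + 1]
                        + previous[i - width] + previous[i + width]) >> 1)
                        - current[i];
                current[i] = v - (v >> 5);      // damping
            }
        }
        int[] tmp = previous;                    // swap buffers
        previous = current;
        current = tmp;
    }
}
```

This is exactly the kind of per-pixel inner loop that crawls in Java/Dalvik and becomes reasonable once moved to C via the NDK.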
You can find an example of a touch ripple effect here:
https://github.com/MasDennis/RajawaliExamples
It utilizes the Rajawali OpenGL ES framework/library. You can download the Rajawali Examples app from the market to see how it looks. Browse through the "src" folder and you will find the TouchRippleEffect activity and renderer. Hope that helps.
I'm no expert in this, but I believe the typical way to do water effects in OpenGL is with a fragment shader. With a static image as a texture, your shader can vary the texture coordinates used for sampling that image, to distort it in arbitrary ways.
Calculate each pixel's direction and distance from the center of the ripple, and adjust the texture coordinate toward or away from that center based on a sinusoidal function of the distance, and you should get a nice ripple effect.
Judging by the description of that YouTube video you linked, it sounds like that's done by using a grid of triangles and adjusting the texture coordinates only at the vertices. That should work too, but it won't look as good unless you use a rather fine grid. Doing it per-pixel with a fragment shader is the ideal, but I don't know whether that would cause performance problems on a phone's GPU.
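To make the fragment-shader idea concrete, here is a minimal sketch of such a shader, written as a GLES2 source string the way it would typically be embedded in Android code. The uniform names (u_Center, u_Time, u_Amplitude) and the constants are made up; a real effect would animate and decay them from the renderer.

```java
public class RippleShader {
    // Per-pixel ripple: offset the texture lookup toward/away from the touch
    // point with a sine wave that decays with distance. Uniform names here
    // are just placeholders for whatever your renderer binds.
    public static final String FRAGMENT =
            "precision mediump float;\n" +
            "uniform sampler2D u_Texture;\n" +
            "uniform vec2 u_Center;\n" +      // ripple origin in texture coordinates
            "uniform float u_Time;\n" +
            "uniform float u_Amplitude;\n" +
            "varying vec2 v_TexCoord;\n" +
            "void main() {\n" +
            "    vec2 dir = v_TexCoord - u_Center;\n" +
            "    float dist = length(dir);\n" +
            "    float ripple = sin(40.0 * dist - 6.0 * u_Time)\n" +
            "                   * u_Amplitude / (1.0 + 30.0 * dist);\n" +
            "    vec2 offset = dir * (ripple / max(dist, 0.001));\n" +
            "    gl_FragColor = texture2D(u_Texture, v_TexCoord + offset);\n" +
            "}\n";
}
```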

AndEngine VS Android's Canvas VS OpenGLES - For rendering a 2D indoor vector map

This is a big issue that I've been trying to figure out for a long time.
I'm working on an application that should include a 2D vector indoor map in it.
The map will be drawn from an .svg file that specifies all the lines, curved lines (paths), and rectangles that should be drawn.
My main requirements for the map are:
Support touch events to detect where exactly the finger is touching.
Great image quality, especially when drawing curved and diagonal lines (anti-aliasing).
Optional but very nice to have: a built-in ability to zoom, pan, and rotate.
So far I tried AndEngine and Android's canvas.
With AndEngine I had trouble implementing anti-aliasing for rendering smooth diagonal lines or drawing curved lines, and as far as I understand, this is not an easy thing to do in AndEngine.
Though I have to mention that AndEngine's ability to zoom in and pan with the camera instead of modifying the objects on the screen was really nice to have.
I also have a little experience with Android's built-in Canvas, mainly for viewing simple bitmaps, but I'm not sure whether it supports all of these things, and especially whether it would provide smooth results.
Last but not least, there's the option of plain OpenGL ES 1 or 2, which, as far as I understand, should with enough work be able to support all the features I require. However, it seems like it would be hard to implement, and I've never programmed in OpenGL or anything like it, although I'm very willing to learn.
To sum it up, I need a platform that provides the three things I mentioned before, but also, very importantly, allows me to implement this feature as fast as possible.
Any kind of answer or suggestion would be very much welcomed as I'm very eager to solve this problem!
Thanks!
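For what it's worth, the plain Canvas API does cover anti-aliased diagonal and curved lines out of the box via Paint.ANTI_ALIAS_FLAG and Path, and zoom/pan/rotate can be expressed as canvas transforms; whether the performance is good enough for a full map is a separate question. A quick sketch inside a custom View, with placeholder coordinates standing in for values parsed from the .svg:

```java
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Path;
import android.view.View;

public class MapView extends View {
    private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG); // smooth diagonals/curves
    private final Path wall = new Path();

    public MapView(Context context) {
        super(context);
        paint.setStyle(Paint.Style.STROKE);
        paint.setStrokeWidth(3f);
        paint.setColor(Color.DKGRAY);

        // Values parsed from the .svg would go here; these are placeholders.
        wall.moveTo(50, 50);
        wall.lineTo(300, 80);                       // diagonal line
        wall.quadTo(400, 200, 300, 350);            // curved segment
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        // Zoom/pan/rotate can be done by transforming the canvas before
        // drawing, e.g. canvas.scale(...), canvas.translate(...), canvas.rotate(...).
        canvas.drawPath(wall, paint);
        canvas.drawRect(500, 100, 650, 250, paint); // a room as a rectangle
    }
}
```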

Android/iOS OpenCV Eye Dilation Detection

Looking for opinions on whether OpenCV could be, or has been, used to detect eye dilation on Android or iOS. I haven't found much other than eye tracking and blink detection with the EyePhone app, which uses OpenCV. Under perfect conditions I'm sure it's possible; I'm more interested in seeing a proof of concept that it can be, and has been, done.
Thank you for your opinion.
Try template matching; it gives me the best results so far. You can see my sample app:
Example app. Or use a Haar detector as shown at the start of the video, but the Haar detector is slow and drops the fps.
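For those landing here later: "template matching" above refers to OpenCV's matchTemplate. A minimal sketch with the OpenCV Java bindings, assuming you already have a grayscale frame and a cropped grayscale eye/pupil template (both parameter names are placeholders):

```java
import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.Point;
import org.opencv.imgproc.Imgproc;

public class TemplateMatch {

    // Slides the template over the frame and returns the best-matching
    // location. Works well when scale and rotation stay roughly constant,
    // which is the usual situation for a fixed-distance eye recording.
    public static Point findBestMatch(Mat frameGray, Mat templateGray) {
        int resultCols = frameGray.cols() - templateGray.cols() + 1;
        int resultRows = frameGray.rows() - templateGray.rows() + 1;
        Mat result = new Mat(resultRows, resultCols, CvType.CV_32FC1);

        Imgproc.matchTemplate(frameGray, templateGray, result, Imgproc.TM_CCOEFF_NORMED);
        Core.MinMaxLocResult mmr = Core.minMaxLoc(result);

        // For TM_CCOEFF_NORMED the best match is the maximum response.
        return mmr.maxLoc;
    }
}
```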
