Facial Expression Recognition on Android [closed] - android

My first Android app is going to involve a front-facing camera and facial expression recognition. I did a lot of research, yet I couldn't find any Android libraries that handle facial expression recognition. I basically want to measure reactions.
I'm thinking it MUST have been done somewhere in some app; can anyone point my research in the right direction? If not on Android, perhaps somebody knows of a library that I can port over?

I'd suggest getting the Android OpenCV port working as a good first step.
Due to the real-time requirements of image processing, most of the face detection/recognition code you're going to see is likely to be in C++. Many systems use OpenCV as a base, and/or you can cobble together a reasonable solution from OpenCV's many low-level functions.
The CVCamera sample included with the port is also good for showing how JNI/NDK interop works on Android, which can be helpful for interfacing with other native code.
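As a rough starting point, a minimal Java-side face-detection sketch using the OpenCV Android bindings might look like the following. The class name and cascadePath are placeholders (haarcascade_frontalface_alt.xml ships with OpenCV), and OpenCV must already be initialized, e.g. via OpenCVLoader.initDebug(). Note this only finds face regions; classifying the expression would need an extra step on top of the detected faces.

    // Minimal sketch: detect faces in a camera frame with OpenCV's Java API.
    // Assumes the OpenCV Android SDK is set up and that a Haar cascade file
    // (e.g. haarcascade_frontalface_alt.xml, shipped with OpenCV) has been
    // copied to a readable path on the device -- cascadePath is a placeholder.
    import org.opencv.core.Mat;
    import org.opencv.core.MatOfRect;
    import org.opencv.core.Rect;
    import org.opencv.imgproc.Imgproc;
    import org.opencv.objdetect.CascadeClassifier;

    public class FaceFinder {
        private final CascadeClassifier faceCascade;

        public FaceFinder(String cascadePath) {
            faceCascade = new CascadeClassifier(cascadePath);
        }

        /** Returns bounding boxes of detected faces in an RGBA camera frame. */
        public Rect[] detect(Mat rgbaFrame) {
            Mat gray = new Mat();
            Imgproc.cvtColor(rgbaFrame, gray, Imgproc.COLOR_RGBA2GRAY);
            Imgproc.equalizeHist(gray, gray);   // helps detection under uneven lighting

            MatOfRect faces = new MatOfRect();
            faceCascade.detectMultiScale(gray, faces);
            return faces.toArray();
        }
    }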

What about face.com? It's web-service based, but supposedly pretty good!

Related

Generating Unity AR without target [closed]

I've been looking into AR recently because I decided I should use it for my Android app, and I found out about Unity and Vuforia, which is probably the simplest way to implement AR. I've never used Unity before, so I'm a complete beginner with it.
I've followed this video: https://www.youtube.com/watch?v=YvSrZqP0elQ and it's working on my phone.
Vuforia uses markers, which I don't want. I would like it to render the AR image on the ground. More precisely, I need it rendered so that it is vertical relative to your body and horizontal on the floor.
Is there a way to do this in Unity with Vuforia? It would be great to get any kind of help, for example a link to a useful resource or documentation, an explanation, or even an example. Thank you for your help.
Vuforia does not provide markerless tracking. They have Smart Terrain, but it seems to be oriented toward games and has not been updated for years.
There are some SDKs available, like Wikitude, but they provide markerless tracking only as a beta.
http://www.wikitude.com/blog-wikitude-3d-tracking-beta/
I think ARToolKit does something similar, called Natural Feature Tracking (NFT).
http://artoolkit.org/documentation/doku.php?id=3_Marker_Training%3Amarker_nft_fiducial_markers
What you may be looking for is SLAM, but beware that the technology is not yet mature.
There is Kudan, but it also has limitations, as it seems to rely on hardware: it uses the gyroscope for rotation, so some Android devices won't work.

Beginning Android game programming [closed]

Okay, so I want to get into programming games for Android. (I am not an absolute beginner, as I already have an app on the Amazon Appstore.) I have searched the Internet and found nothing of particular use. Can somebody point me in the right direction?
By the way, this is my first question on Stack Exchange, so if I'm going about this wrong, please help me.
Thanks in advance! :)
EDIT: What I mean when I say 'the right direction' is a resource of sorts to get me started with sprites and canvases and the like.
ANOTHER EDIT: I have decided that I will use Unity for my game, which will be 2D. A tutorial for that would be helpful. :)
The simplest way for a beginner to get into game development is to start using a framework/engine (Unity3D, libGDX, etc.). You will understand that making a game is pretty hard even with high-level tools (and drawing even simple frame-by-frame animation on a canvas with correct memory management is not a simple task).
If that is exactly what you need, try a Java game tutorial (like this one: http://zetcode.com/tutorials/javagamestutorial/) and port it to an Android version. The approach looks very similar.
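As a rough illustration of the canvas-and-game-loop approach mentioned above, a minimal Android sketch might look like this. GameView and the moving dot are purely illustrative; a real game would draw sprite bitmaps and keep proper game state instead.

    // Minimal sketch of an Android game loop drawing onto a SurfaceView canvas.
    import android.content.Context;
    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.graphics.Paint;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;

    public class GameView extends SurfaceView implements SurfaceHolder.Callback, Runnable {
        private Thread loopThread;
        private volatile boolean running;
        private final Paint paint = new Paint();
        private float x = 0f;                       // toy game state: a moving dot

        public GameView(Context context) {
            super(context);
            getHolder().addCallback(this);
        }

        @Override public void surfaceCreated(SurfaceHolder holder) {
            running = true;
            loopThread = new Thread(this);
            loopThread.start();
        }

        @Override public void surfaceDestroyed(SurfaceHolder holder) {
            running = false;                        // signal the loop thread to stop
            try { loopThread.join(); } catch (InterruptedException ignored) { }
        }

        @Override public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) { }

        @Override public void run() {
            while (running) {
                x = (x + 4f) % getWidth();          // update: advance the dot
                Canvas canvas = getHolder().lockCanvas();
                if (canvas == null) continue;
                canvas.drawColor(Color.BLACK);      // clear the frame
                paint.setColor(Color.WHITE);
                canvas.drawCircle(x, getHeight() / 2f, 20f, paint);
                getHolder().unlockCanvasAndPost(canvas);
            }
        }
    }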

What is the use of the OpenCV library in Android [closed]

I have tried to read about the OpenCV library; it has something to do with reading images the way our eyes do, but I am not able to work out what exactly its true capabilities are.
All the tutorials I found on the net start by implementing this library, but I couldn't find any explaining why it is used. So could anyone please explain it to me?
OpenCV is a computer vision library that has many real-world uses, such as: human-computer interaction, object identification, segmentation and recognition, face recognition, gesture recognition, motion tracking, motion understanding, stereo and multi-camera calibration, depth computation, and mobile robotics.
(All of this is written in the description of the opencv tag you used in your question, by the way.)
I think you probably could have gotten all this information from a quick Google search, however. Perhaps next time, just do that?
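To make one of those uses concrete, a minimal sketch with the OpenCV Java bindings (assuming OpenCV 3.x or later; the file paths are placeholders) could load an image, convert it to grayscale, and extract edges:

    // Minimal sketch: a typical OpenCV pipeline -- load, convert, filter, save.
    // Requires the OpenCV Java bindings; input.jpg / edges.png are placeholder paths.
    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.imgcodecs.Imgcodecs;
    import org.opencv.imgproc.Imgproc;

    public class EdgeDemo {
        public static void main(String[] args) {
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME);   // load the native OpenCV library

            Mat image = Imgcodecs.imread("input.jpg");      // read the source image
            Mat gray = new Mat();
            Imgproc.cvtColor(image, gray, Imgproc.COLOR_BGR2GRAY);

            Mat edges = new Mat();
            Imgproc.Canny(gray, edges, 50, 150);            // detect edges (low/high thresholds)

            Imgcodecs.imwrite("edges.png", edges);          // write the result to disk
        }
    }

On Android you would typically initialize OpenCV with OpenCVLoader and convert camera frames or Bitmaps to Mat instead of reading files from disk, but the pipeline is the same.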

Human body shape detection on image from Android device camera [closed]

I must implement an application that draws a heart on a person seen through an Android device camera (augmented reality stuff).
So, I must detect the shape/contour of the upper part of the human body (head, neck, shoulders) in order to know where the heart should be situated. This must work in real time.
I've searched for this but haven't found anything useful. Does anybody know of any tutorials or examples related to this? Thanks.
I would recommend using OpenCV if memory isn't a problem (your app would be around 50 MB). There are plenty of tutorials for learning how to use OpenCV. Try a cascade classifier.
For face detection, as far as I know, Android has a built-in API for this: Camera.Face.
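A rough sketch of how the Camera.Face callback is wired up might look like this. It assumes the camera has already been opened and its preview started, and it only yields face bounding boxes, not the full upper-body contour the question asks about.

    // Rough sketch: hardware face detection with the legacy android.hardware.Camera API.
    // Assumes `camera` was obtained via Camera.open() and that preview is running.
    import android.hardware.Camera;

    public class FaceOverlayHelper {
        public void enableFaceDetection(Camera camera) {
            if (camera.getParameters().getMaxNumDetectedFaces() <= 0) {
                return;                                   // device does not support face detection
            }
            camera.setFaceDetectionListener(new Camera.FaceDetectionListener() {
                @Override
                public void onFaceDetection(Camera.Face[] faces, Camera cam) {
                    for (Camera.Face face : faces) {
                        // face.rect is in the camera's (-1000..1000) coordinate space;
                        // map it to view coordinates before drawing an overlay on it.
                        android.graphics.Rect bounds = face.rect;
                        // ... position the heart overlay relative to `bounds` here ...
                    }
                }
            });
            camera.startFaceDetection();                  // must be called after startPreview()
        }
    }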

Eye tracking in Android [closed]

I'm looking to do basic eye tracking using an Android tablet, hopefully tracking the user's eyes to allow moving a cursor around the screen. I've been doing some searching; I've read a little about OpenCV and FaceL, and I have seen an example in another SO question here showing it is possible to track eye movement on Android.
I was wondering if anyone knows of any good tutorials or sample code that would be good to refer to or work from? I'm looking for anything that can help me figure out how to get this working, even in its most basic form.
I found your question via a comment.
FaceL uses OpenCV too, but with Python wrappers.
For basic ideas I recommend these:
eye tracking
eye detect
but they are in the native C API (OpenCV 1.x); the OpenCV 2.3 Java wrappers use the C++ API (OpenCV 2.x) syntax, so Mat instead of IplImage, etc.
You can see the new syntax in:
template matching
On Android you can choose between two ways to access OpenCV: the Java API (about 70% of the functions) or C++ (with the Android NDK).
I'm using the Java side and think all the functions you would need are accessible via the Java side.
Hope that helped you a little ;)
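A very rough Java-side sketch of the approach described above (detect a face, then search for eyes inside the face region, using Mat rather than IplImage) might look like this. The cascade paths are placeholders; both XML files ship with OpenCV.

    // Rough sketch: find eye regions by running an eye cascade inside each detected
    // face ROI, using the OpenCV Java API.
    import org.opencv.core.Mat;
    import org.opencv.core.MatOfRect;
    import org.opencv.core.Rect;
    import org.opencv.objdetect.CascadeClassifier;

    public class EyeFinder {
        private final CascadeClassifier faceCascade;
        private final CascadeClassifier eyeCascade;

        public EyeFinder(String faceCascadePath, String eyeCascadePath) {
            faceCascade = new CascadeClassifier(faceCascadePath); // e.g. haarcascade_frontalface_alt.xml
            eyeCascade = new CascadeClassifier(eyeCascadePath);   // e.g. haarcascade_eye.xml
        }

        /** Returns eye rectangles in frame coordinates for the first detected face. */
        public Rect[] findEyes(Mat grayFrame) {
            MatOfRect faces = new MatOfRect();
            faceCascade.detectMultiScale(grayFrame, faces);

            for (Rect face : faces.toArray()) {
                Mat faceRoi = grayFrame.submat(face);             // restrict the eye search to the face
                MatOfRect eyes = new MatOfRect();
                eyeCascade.detectMultiScale(faceRoi, eyes);

                Rect[] result = eyes.toArray();
                for (Rect eye : result) {                         // shift back to full-frame coordinates
                    eye.x += face.x;
                    eye.y += face.y;
                }
                return result;
            }
            return new Rect[0];                                   // no face found
        }
    }

Turning detected eye regions into an actual gaze estimate for cursor control takes more work (pupil localization and calibration), but this is the usual first step.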
Try Opengazer. It is actually used on PC (Linux/Mac), but you can get an idea of where to start.
