I have landmarks for the lips (which I got from MediaPipe).
Currently, I create the triangle coordinates and then try GL_TRIANGLE_STRIP and GL_TRIANGLE_FAN to fill the connected lines I got from MediaPipe.
Is that approach correct, or is there a more proper approach to fill all the connected lines?
I'm using the MediaPipe Solutions API example - source here -> https://github.com/google/mediapipe/blob/e0a254789a1ec05f3c09411b45a6c59d0ed3075e/mediapipe/examples/android/solutions/facemesh/src/main/java/com/google/mediapipe/examples/facemesh/FaceMeshResultGlRenderer.java#L154
Related
How can I touch-select a 3D object in a scene drawn with OpenGL ES 2.0 on Android 6.0? I am using Min3D to parse the .3DS file, loaded into an Android view using OpenGL ES 2.0 and the GLSurfaceView class. I am able to load the 3D model, but I have no idea how to select each part of the 3D object using a mouse click/touch.
For example: I have a 3D model of a car, and I need to touch-select each part of the car, say the doors, and change its colours/texture.
Please suggest if there is an easier way to achieve the same functionality.
There are two ways to do this.
Method 1: Draw all objects using different colours. In our case, we encoded the object id in the RGB values and rendered the image to a texture. Then you can read the pixel colour where the user is touching; the colour of that pixel gives you the object id.
Method 2: Use a ray hit test against the bounding boxes of all objects.
Method 1 is more reliable if you are looking for accuracy. Method 2 is faster but will not work in all cases, especially when objects have overlapping bounding boxes.
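A minimal sketch of the id-to-colour encoding that Method 1 relies on. The class and method names are my own, and the GL read-back is shown only as a comment; the testable part is the pure encode/decode maths:

```java
public final class ColorPicking {

    // Encode a 24-bit object id into RGB components (8 bits per channel).
    // Rendering each object with this colour in a separate picking pass
    // makes the framebuffer itself the lookup table.
    static int[] idToRgb(int id) {
        return new int[] { (id >> 16) & 0xFF, (id >> 8) & 0xFF, id & 0xFF };
    }

    // Decode the RGB value read back from the picking buffer into the id.
    static int rgbToId(int r, int g, int b) {
        return (r << 16) | (g << 8) | b;
    }

    // After the picking pass you would read the pixel under the touch
    // point, e.g. with OpenGL ES 2.0:
    //   GLES20.glReadPixels(x, y, 1, 1, GLES20.GL_RGBA,
    //                       GLES20.GL_UNSIGNED_BYTE, pixelBuffer);
    // and feed the returned bytes into rgbToId().
}
```

Disabling lighting, blending and anti-aliasing during the picking pass matters, since any colour interpolation would corrupt the encoded ids.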
I am using the MuPDF Android library for displaying PDFs on Android.
MuPDF returns the PDF as a bitmap, and I have used a Canvas layer over it to render annotations such as highlight, underline, and strikeout. I am trying to sync the annotations between iPad and Android, but it is not working because of coordinate differences between the two devices. I am using the concept of quad points described by Adobe for rendering the annotations.
Suppose I make a highlight annotation on a single line: I get the top (left and right) and bottom (left and right) x, y coordinates. When I try to render the annotation on an iPad with the same coordinates, nothing shows on screen, and vice versa. But when I sync coordinates from one Android device to another Android device, it renders correctly; the same holds for one iPad compared to another iPad.
Below are the points for the same annotation in the same file.
iPad:
"quadPoints": [
  [
    "72.06",
    "626.51",
    "390.07",
    "626.51",
    "72.06",
    "610.91",
    "390.07",
    "610.91"
  ]
]
Android:
"quadPoints": [
  [
    "57.6",
    "330.96786",
    "312.15",
    "330.96786",
    "57.6",
    "362.2044",
    "312.57",
    "362.2044"
  ]
]
Core Graphics on iOS uses the lower-left corner as the origin (0, 0), while Android's Canvas uses the upper-left corner, so the y axis is flipped between the two platforms.
For conversion, see this answer: How to compensate the flipped coordinate system of core graphics for easy drawing?
You can read more in iOS Drawing Concepts: https://developer.apple.com/library/content/documentation/2DDrawing/Conceptual/DrawingPrintingiOS/GraphicsDrawingOverview/GraphicsDrawingOverview.html
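A sketch of the y-axis flip, assuming you know the page height in the same units (PDF points); the class and method names are my own. Note that your two quad-point sets also differ in scale, not just origin, so both sides should first be normalised to PDF user-space units before flipping:

```java
public final class PdfCoords {

    // Convert a y coordinate from a bottom-left origin (Core Graphics)
    // to a top-left origin (Android Canvas), or back again - the flip
    // is its own inverse. x coordinates are unchanged.
    static double flipY(double y, double pageHeight) {
        return pageHeight - y;
    }
}
```

Applying the flip twice returns the original value, so the same helper works in both sync directions.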
I am working on an Android app that displays a plane flying over a map; its trajectory is based on data I read from a file (latitude, longitude, altitude, roll, pitch, heading). I use the glob3 mobile SDK and started from their "3D model" example. The problem is that when I try to position the plane using setPitch(), setRoll() and setHeading(), the setRoll() and setHeading() calls rotate around the same axis, only in opposite directions, so I cannot control one motion of the plane.
If you could look over the code and give me a few ideas on how to change the axes, that would be great; I'm not really good with Android. Here is the link to the 3D model example on GitHub: https://github.com/glob3mobile/g3m/blob/purgatory/Android/G3MAndroidDemo/src/org/glob3/mobile/demo/ThreeDModelActivity.java , and here is my code from Android Studio: https://drive.google.com/file/d/0BzO0p5utgoipaU9LR25BeWdYcmc/view?usp=sharing
In my app I have three buttons to start, pause/resume and stop the animation. I set a camera looking from 270 degrees, and I have to use setPitch(90), otherwise the plane will be on its side, so I am actually controlling the roll with that call.
These are VEMap references. If you are actually using Google Maps, please go through the documentation and use valid Google Maps API references.
There are many curious things in Android.
What I want to ask about is drawing with Canvas.
When I override the onDraw function, Android gives me a Canvas to draw something on the device.
I would like to understand how Canvas draws images or figures internally.
When something needs to be drawn, does it use SurfaceFlinger, OpenGL ES, core graphics, or anything else to swap the image from the back buffer to the front buffer on the device display?
I found some similar answers in this link: Android GUI architecture - relation between Surface/view/window/canvas
But it's not really enough to understand, because I can't see how the device display is initialized and how the Canvas is drawn onto it.
I do understand the relationship between ViewRoot, View and Surface.
Please point me to the key pieces needed to understand drawing at the native layer.
I strongly suggest you load the API Demos app into Eclipse and look through the Graphics demos. There are a large number of them and the code should give you insights into solutions to your problem.
To load the API Demos app, in Eclipse click File > New > Other > Android Sample Project.
Then select your build target such as Android 4.2.
Then from the list of sample apps select the API Demos app.
And finally click Finish.
There are over 300 demos in the app, so it can be difficult to navigate. You want to look in the Graphics section.
It's a little unclear from your description exactly what your issue is, but the API Demos cover a lot of territory and you should be able to find some code to help you out.
Here is a list of some of the Graphics API Demo Java source files:
AlphaBitmap
AnimateDrawables
Arcs
BitmapDecode
BitmapMesh
BitmapPixels
CameraPreview
Clipping
ColorFilters
ColorMatrixSample
Compass
CreateBitmap
DensityActivity
FingerPaint
Layers
MeasureText
PathEffects
PathFillTypes
Patterns
Pictures
DrawPoints
PolyToPoly
Regions
RoundRects
ScaleToFit
SensorTest
SurfaceViewOverlay
WindowSurface
Sweep
TextAlign
TouchPaint
Typefaces
UnicodeChart
Vertices
Xfermodes
I have tried to texture a 3D cube in Android using OpenGL ES, following an example in C++, but after several attempts the result is disappointing!
So I want to know: has anyone done this before? Could you give me some suggestions?
Thanks in advance!
Lesson 6 on this page has a well described Android example of showing a textured cube:
http://insanitydesign.com/wp/projects/nehe-android-ports/
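A minimal sketch of the part that most often goes wrong when porting from C++: the per-face (u, v) coordinates and the v-axis orientation. The class and method names are my own; the Android upload call is shown only as a comment:

```java
public final class CubeTexCoords {

    // (u, v) coordinates mapping a full texture onto one quad face of a
    // cube, listed to match a vertex order of bottom-left, bottom-right,
    // top-left, top-right (triangle-strip order). v is flipped (1 at the
    // bottom) because Android Bitmaps store row 0 at the top, while
    // OpenGL texture coordinates put v = 0 at the bottom.
    static float[] faceTexCoords() {
        return new float[] {
            0f, 1f,  // bottom-left
            1f, 1f,  // bottom-right
            0f, 0f,  // top-left
            1f, 0f   // top-right
        };
    }

    // After decoding a Bitmap, upload it with Android's GLUtils, e.g.:
    //   GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
}
```

If the texture appears upside down or mirrored on some faces, the coordinate list above is the place to adjust, one face at a time.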