Ocean wave effect using OpenGL in Android

I am trying to develop a game in Android, and for that I need an ocean wave effect. I am using OpenGL ES 1.0 for the development. When I searched for implementation examples and the underlying logic, I came across a lot of mathematical material, such as calculating a sum of sine functions with specific amplitudes and frequencies (http://http.developer.nvidia.com/GPUGems/gpugems_ch01.html)
Sine wave equation:
W1(x, y, t) = A1 * sin(D1 · (x, y) * w1 + t * q1)
where
A1 = amplitude of the 1st wave
D1 = direction of the 1st wave (a 2D vector dotted with the horizontal position (x, y))
w1 = frequency of the 1st wave
t = time
q1 = phase constant of the 1st wave, q = speed * 2π / wavelength
Summing one such term per wave gives the surface height:
H(x, y, t) = W1 + W2 + W3 + ...
and there are also equations for calculating the normals.
So, how can I use these equations to fill the vertex array and normal array needed to render the ocean surface in my application? I would be most grateful for a code example or any other help.
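As a rough sketch of how the sum-of-sines height function could be turned into vertex and normal arrays (the grid size, the Wave helper class and all parameter values below are illustrative assumptions, not part of the GPU Gems chapter): evaluate H(x, y, t) at every grid point for the vertices, and use the analytic partial derivatives of H for the normals, i.e. (-dH/dx, -dH/dy, 1) normalized.

// Sketch: evaluate H(x, y, t) = sum of A_i * sin(D_i · (x, y) * w_i + t * q_i)
// on a regular grid and fill the vertex and normal arrays used by OpenGL ES 1.0.
static class Wave {
    float amplitude, frequency, phase; // A, w, q
    float dirX, dirY;                  // normalized direction D
    Wave(float a, float w, float q, float dx, float dy) {
        amplitude = a; frequency = w; phase = q; dirX = dx; dirY = dy;
    }
}

static final int GRID = 64;    // grid resolution (illustrative)
static final float SIZE = 10f; // world-space extent of the water patch

static void fillArrays(Wave[] waves, float t, float[] vertices, float[] normals) {
    int idx = 0;
    for (int j = 0; j < GRID; j++) {
        for (int i = 0; i < GRID; i++) {
            float x = (i / (float) (GRID - 1) - 0.5f) * SIZE;
            float y = (j / (float) (GRID - 1) - 0.5f) * SIZE;
            float h = 0f, dHdx = 0f, dHdy = 0f;
            for (Wave w : waves) {
                float arg = (w.dirX * x + w.dirY * y) * w.frequency + t * w.phase;
                h += w.amplitude * (float) Math.sin(arg);
                float c = w.amplitude * w.frequency * (float) Math.cos(arg);
                dHdx += w.dirX * c;
                dHdy += w.dirY * c;
            }
            vertices[idx]     = x;
            vertices[idx + 1] = y;
            vertices[idx + 2] = h;
            // normal of the height field z = H(x, y) is (-dH/dx, -dH/dy, 1), normalized
            float len = (float) Math.sqrt(dHdx * dHdx + dHdy * dHdy + 1f);
            normals[idx]     = -dHdx / len;
            normals[idx + 1] = -dHdy / len;
            normals[idx + 2] = 1f / len;
            idx += 3;
        }
    }
}

Each frame you would call fillArrays with the current time (both arrays sized GRID * GRID * 3), wrap the results in FloatBuffers for glVertexPointer and glNormalPointer, and draw the grid as indexed triangle strips.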

Related

How to show audio frequency in a waveform?

I want to render a wave that shows the frequency data of an audio track.
I have 150 data points per second.
I have rendered it using Canvas, drawing one line per data value, so I draw 150 lines for one second of the song. It displays correctly, but when I scroll the view it lags.
Is there a library that can render the data points using OpenGL, Canvas or some other method that stays smooth while scrolling?
These are two waves. Each line represents one data point, with a minimum value of zero and a maximum equal to the highest value in the data set.
How can I render this wave in OpenGL or with another library? It lags while scrolling when rendered with Canvas.
Maybe you could show an example of how it looks. How do you create the lines? Are the points scattered? Do you have to connect them, or do you have a fixed point?
Usually in OpenGL ES the process would look like this:
- read in your audio data
- sort it so that OpenGL knows how to connect the points
- upload it to your vertex shader
I would really recommend this tutorial. I don't know your OpenGL background, so this is a perfect place to start.
Actually, your application shouldn't be too complicated, and the tutorial should offer you enough information for the case where you want to visualize each second with 150 points.
Just a small overview
Learn how to set up a window with OpenGL
You described a 2D application:
- define x values as e.g. -75 to 75
- define y values as your data
- define lines as (x, y) data sets
Use something like this to draw (desktop-style immediate mode, shown only to illustrate the idea):
glBegin(GL_LINES);
glVertex2f(x, yLow);   // bottom of the line for this data point
glVertex2f(x, yHigh);  // top of the line for this data point
glEnd();
Note that on mobile you cannot use glBegin/glEnd: OpenGL ES drops immediate mode, so you need vertex buffers and GLSL shaders instead (a rough ES 2.0 sketch follows after this list).
define your OpenGL camera!
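As a rough OpenGL ES 2.0 sketch of the same idea (the class name, shader sources and one-unit x spacing are assumptions, not taken from any particular tutorial): pack one vertical line per data point into a FloatBuffer once, then draw everything with a single glDrawArrays(GL_LINES, ...) call; scrolling then only changes the MVP matrix instead of redrawing Canvas lines.

import android.opengl.GLES20;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class WaveformLines {
    private static final String VERTEX_SHADER =
            "uniform mat4 uMvp;\n" +
            "attribute vec2 aPosition;\n" +
            "void main() { gl_Position = uMvp * vec4(aPosition, 0.0, 1.0); }\n";
    private static final String FRAGMENT_SHADER =
            "precision mediump float;\n" +
            "void main() { gl_FragColor = vec4(1.0); }\n";

    private final FloatBuffer vertexBuffer;
    private final int vertexCount;
    private final int program;

    public WaveformLines(float[] data) {
        // Two vertices per data point: (x, 0) and (x, value).
        float[] verts = new float[data.length * 4];
        for (int i = 0; i < data.length; i++) {
            float x = i;               // one unit per point; scale with the MVP matrix
            verts[i * 4]     = x;
            verts[i * 4 + 1] = 0f;
            verts[i * 4 + 2] = x;
            verts[i * 4 + 3] = data[i];
        }
        vertexCount = data.length * 2;
        vertexBuffer = ByteBuffer.allocateDirect(verts.length * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        vertexBuffer.put(verts).position(0);
        program = buildProgram(VERTEX_SHADER, FRAGMENT_SHADER);
    }

    public void draw(float[] mvpMatrix) {
        GLES20.glUseProgram(program);
        int aPosition = GLES20.glGetAttribLocation(program, "aPosition");
        int uMvp = GLES20.glGetUniformLocation(program, "uMvp");
        GLES20.glUniformMatrix4fv(uMvp, 1, false, mvpMatrix, 0);
        GLES20.glEnableVertexAttribArray(aPosition);
        GLES20.glVertexAttribPointer(aPosition, 2, GLES20.GL_FLOAT, false, 0, vertexBuffer);
        GLES20.glDrawArrays(GLES20.GL_LINES, 0, vertexCount);
        GLES20.glDisableVertexAttribArray(aPosition);
    }

    private static int buildProgram(String vs, String fs) {
        int v = GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
        GLES20.glShaderSource(v, vs);
        GLES20.glCompileShader(v);
        int f = GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER);
        GLES20.glShaderSource(f, fs);
        GLES20.glCompileShader(f);
        int p = GLES20.glCreateProgram();
        GLES20.glAttachShader(p, v);
        GLES20.glAttachShader(p, f);
        GLES20.glLinkProgram(p);
        return p;
    }
}

The MVP matrix (for example built with android.opengl.Matrix.orthoM plus a translation) is what changes while scrolling, so the vertex data never has to be rebuilt.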

How to use the numbers from Game Rotation Vector in Android?

I am working on an AR app that needs to move an image depending on device's position and orientation.
It seems that Game Rotation Vector should provide the necessary data to achieve this.
However, I can't seem to understand what the values I get from the GRV sensor represent. For instance, in order to reach the same value on the Z axis I have to rotate the device 720 degrees, which seems odd.
If I could somehow convert these numbers to angles between the device's reference frame and the x, y, z coordinates, my problem would be solved.
I have googled this issue for days and didn't find any sensible information on the meaning of the GRV coordinates and how to use them.
TL;DR: What do the numbers from the GRV sensor represent, and how do I convert them to angles?
As the docs state, the GRV sensor gives back a 3D rotation vector. This is represented as three component numbers which make this up, given by:
x axis (x * sin(θ/2))
y axis (y * sin(θ/2))
z axis (z * sin(θ/2))
This can be confusing at first: (x, y, z) is the unit vector along the rotation axis and θ (theta) is the single angle of rotation around that axis, so the three components together encode one axis-angle rotation rather than three independent angles.
Note also that when working with angles, especially in 3D, we generally use radians, not degrees, so theta is in radians. This looks like a good introductory explanation.
But the reason it's given to us in this format is that it can easily be used in matrix rotations, and in particular as a quaternion. In fact, these are the x, y and z components of a unit quaternion; the fourth component is cos(θ/2), which completes the rotation. Together they turn a general rotation into a compact representation that is cheap to compose and to convert into a matrix.
These are directly usable in OpenGL which is the Android (and the rest of the world's) 3D library of choice. Check this tutorial out for some OpenGL rotations info, this one for some general quaternion theory as applied to 3D programming in general, and this example by Google for Android which shows exactly how to use this information directly.
If you read the articles, you can see why you get it in this form and why it's called Game Rotation Vector - it's what's been used by 3D programmers for games for decades at this point.
TL;DR: This example is excellent.
Edit - How to use this to show a 2D image which is rotated by this vector in 3D space.
In the example above, SensorManager.getRotationMatrixFromVector converts the Game Rotation Vector into a rotation matrix which can be applied to rotate anything in 3D. To apply this rotation to a 2D image, you have to think of the image in 3D, so it's actually a segment of a plane, like a sheet of paper. So you'd map your image, which in the jargon is called a texture, onto this plane segment.
Here is a tutorial on texturing cubes in OpenGL for Android with example code and an in depth discussion. From cubes it's a short step to a plane segment - it's just one face of a cube! In fact that's a good resource for getting to grips with OpenGL on Android, I'd recommend reading the previous and subsequent tutorial steps too.
You mentioned translation as well. Look at the onDrawFrame method in the Google code example: there is a translation using gl.glTranslatef and then a rotation using gl.glMultMatrixf. This is how you translate and rotate.
The order in which these operations are applied matters. Here's a fun way to experiment with that: check out Livecodelab, a live 3D sketch-coding environment which runs inside your browser. In particular, this tutorial encourages reflection on the ordering of operations. Obviously the move command is a translation.
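A minimal sketch of that flow, assuming an OpenGL ES 1.x renderer in the style of the Google sample (the class name RotationListener and the -3 translation are placeholders):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import javax.microedition.khronos.opengles.GL10;

// Sketch: feed the Game Rotation Vector into a rotation matrix and apply it when drawing.
public class RotationListener implements SensorEventListener {
    // 16-element matrix, directly usable by glMultMatrixf.
    private final float[] rotationMatrix = new float[16];

    public RotationListener() {
        android.opengl.Matrix.setIdentityM(rotationMatrix, 0);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_GAME_ROTATION_VECTOR) {
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    // Called from the renderer's onDrawFrame.
    public void apply(GL10 gl) {
        gl.glTranslatef(0f, 0f, -3f);          // move the textured quad away from the camera first
        gl.glMultMatrixf(rotationMatrix, 0);   // then rotate it by the device orientation
    }
}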

Is this Fourier Analysis of Luminance Signals Correct? (Android)

I'm writing an Android app that measures the luminance of camera frames over a period of time and calculates a heart beat using Fourier Analysis to find the wave's frequency. The problem is that my spectral analysis looks like this:
which is pretty much the inverse of what a spectral analysis should look like (like a normal distribution). Can I accurately assess this to find the index of the maximum magnitude, or does this spectrum reveal that my data is too noisy?
EDIT:
Here's what my camera data looks like (I'm performing FFT on this):
It looks like you have two problems going on here:
1) The FFT output often places the value for negative frequencies to the right of the positive frequencies, which seems to be the case here. Therefore, you need to move the right half of the FFT to the left, and put freq=0 in the middle.
2) In the comments you say that you're plotting the magnitude, but that's clearly not the case (the magnitude should be non-negative and symmetric for real input). You're probably just plotting the real part. Instead, take the magnitude, sqrt(Re*Re + Im*Im), where Re and Im are the real and imaginary parts respectively. (Depending on the form of your numbers, something like Math.sqrt(Math.pow(a.re, 2) + Math.pow(a.im, 2)).)
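A small sketch of both fixes, assuming the FFT output is available as separate real and imaginary arrays (the method and array names are placeholders):

// Compute the magnitude spectrum from real/imaginary FFT output
// and "fftshift" it so that frequency 0 ends up in the middle of the plot.
static double[] magnitudeSpectrum(double[] re, double[] im) {
    int n = re.length;
    double[] mag = new double[n];
    for (int i = 0; i < n; i++) {
        mag[i] = Math.sqrt(re[i] * re[i] + im[i] * im[i]);
    }
    return mag;
}

static double[] fftShift(double[] spectrum) {
    int n = spectrum.length;
    double[] shifted = new double[n];
    int half = (n + 1) / 2;
    // second half (negative frequencies) moves to the front, first half to the back
    System.arraycopy(spectrum, half, shifted, 0, n - half);
    System.arraycopy(spectrum, 0, shifted, n - half, half);
    return shifted;
}

For the heart rate, the peak of the magnitude spectrum (ignoring the DC bin) gives the dominant frequency; bin k of an N-point FFT corresponds to k * (sampling rate) / N Hz, where the sampling rate here is the camera frame rate.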

Android OpenGL ES 2.0 : Translation, collision and control over framerate

I'm trying to design my own game engine on Android with OpenGL ES 2.0.
So far, I have a sphere moving along the x axis with the equation x(t) = v*(t-t0) + x0. Let's say the radius of the sphere is r. I use the function SystemClock.uptimeMillis() to compute the time t.
There are two walls, at x=1 and x=-1. When the sphere touches one of these two walls (i.e. when the distance between the sphere and one wall is less than r), then it bounces back.
My calculations are done in the onDrawFrame() method of my renderer.
Therefore, the calculations are only done when a frame is rendered. So, the frequency of the "collision check" depends on the framerate of the application. Alas, sometimes the next frame takes too long to be rendered and the translation takes the sphere behind the wall, like some kind of quantum particle :).
Is it possible to have some kind of control over the framerate of my application ? Do you know a method to avoid such discontinuities in the trajectory of my sphere ?
Thank you very much for your help!
"Tunneling" through walls is a consistent problem in video games. Much time is spent combating such behavior. When checking your collisions, you really need to know the previous position and the new position. Then you check to see if there is a wall in between those two positions, and if so, you re-orient your object as though it had bounced off of the wall.
A collision check that looks for distance between sphere and wall is not enough.
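A minimal sketch of that check for the 1D setup in the question (walls at x = ±1, sphere radius r); the variable names are placeholders, and it uses a per-frame delta time instead of the absolute x(t) = v*(t-t0) + x0 form:

// Swept collision check along x. Instead of testing only the new position,
// check whether the segment [previousX, newX] crossed a wall plane, and if so
// reflect the overshoot back inside and flip the velocity.
static final float WALL = 1f;

float x;        // current sphere centre
float v;        // current velocity along x
float radius;   // sphere radius r

void step(float dtSeconds) {
    float previousX = x;
    float newX = previousX + v * dtSeconds;

    float maxX = WALL - radius;    // the centre may not pass this on the right
    float minX = -WALL + radius;   // ... or this on the left

    if (newX > maxX) {
        // crossed the right wall somewhere between previousX and newX
        newX = maxX - (newX - maxX);
        v = -v;
    } else if (newX < minX) {
        newX = minX + (minX - newX);
        v = -v;
    }
    x = newX;
}

For framerate independence, compute dtSeconds from successive SystemClock.uptimeMillis() readings, or run the physics at a fixed timestep and let onDrawFrame only render the latest state; onDrawFrame is tied to the display refresh, so it is usually easier to make the simulation framerate-independent than to control the framerate itself.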

FFT on EEG signal in Android understanding the code

I've been trying to find a library that would enable me to perform an FFT (Fast Fourier Transform) on EEG signals in Android.
With the help of Geobits, I've finally found code that might let me do an FFT on an EEG signal. But I'm having a hard time figuring out how the code actually works. I want to know what the float arrays x and y are for, and an example would help me a little more.
An FFT should return a series of complex numbers (either in rectangular coordinates, or in polar form: magnitude and phase) for a specific range of frequencies...
I'm still working through the expressions, but I'll bet dollars to donuts that the x and y arrays are the real (x) and imaginary (y) components of the complex numbers that are the result of the transformation.
The square root of the sum of the squares of these two components gives the magnitude of the harmonic component at each frequency (the conversion to polar form).
If the phase is important for your application, keep in mind that the FFT (as with any phasor) can be either sine-referenced or cosine-referenced. I believe sine is the standard, however.
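As a small illustration, assuming the x and y arrays really are the real and imaginary parts as guessed above, the conversion to polar form would look like this:

// Convert rectangular FFT output (x = real, y = imaginary) to polar form.
static void toPolar(float[] x, float[] y, float[] magnitude, float[] phase) {
    for (int i = 0; i < x.length; i++) {
        magnitude[i] = (float) Math.sqrt(x[i] * x[i] + y[i] * y[i]);
        phase[i]     = (float) Math.atan2(y[i], x[i]);  // phase angle in radians
    }
}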
see:
http://www.mathworks.com/help/matlab/ref/fft.html
http://mathworld.wolfram.com/FastFourierTransform.html
Since the FFT gives a truncated approximation to the infinite series created by the harmonic decomposition of a periodic waveform, any periodic waveform can be used to test the functionality of your code.
For example, a square wave should be easy to replicate, and it has very well-known harmonic coefficients. The resolution of the data set determines the number of harmonics that you can calculate (most FFT algorithms do best with a data set whose length is a power of two and spans an integral number of wavelengths of the lowest frequency you want to resolve).
The square wave's coefficients should appear at odd multiples of the fundamental frequency, with magnitudes that vary inversely with the order of the harmonic.
http://en.wikipedia.org/wiki/Square_wave
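A sketch of that test input, assuming a plain two-level square wave; the expected relative magnitudes 4/(π·n) for odd harmonics n come from the standard Fourier series of a square wave:

// Generate one period of a square wave sampled at n points, suitable as FFT test input.
// The magnitude of harmonic k (odd) should come out proportional to 4 / (PI * k);
// even harmonics should be (near) zero.
static float[] squareWave(int n) {
    float[] samples = new float[n];
    for (int i = 0; i < n; i++) {
        samples[i] = (i < n / 2) ? 1f : -1f;
    }
    return samples;
}

// Expected relative magnitude of the k-th harmonic of a unit square wave.
static double expectedMagnitude(int k) {
    return (k % 2 == 1) ? 4.0 / (Math.PI * k) : 0.0;
}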
