My project is leaf shape recognition. I use invariant moments for feature extraction and the City Block distance to compute the distance between the test image and the images in the database. But the results I get are very bad: the recognition matches correctly less than 50% of the time.
For example:
This is the test image:
But it gets matched with this image:
I convert the image into a binary image using Otsu thresholding, so the shape looks good in the binary image.
My question: is this normal, or do I have an error in my code?
This is my code using the City Block distance:
// City Block (L1) distance: sum of absolute differences over the seven invariant moments
CityBlock[j] = Math.abs(bMom1 - DB.GetBentukMoment1(j)) + Math.abs(bMom2 - DB.GetBentukMoment2(j)) +
               Math.abs(bMom3 - DB.GetBentukMoment3(j)) + Math.abs(bMom4 - DB.GetBentukMoment4(j)) +
               Math.abs(bMom5 - DB.GetBentukMoment5(j)) + Math.abs(bMom6 - DB.GetBentukMoment6(j)) +
               Math.abs(bMom7 - DB.GetBentukMoment7(j));
If I run that code with the same image on both sides, the result is not 0. Why? Is it because of the double data type?
I finally found the problem in my application: the double value I saved to the database in a double column had changed. I now store the value in a Text column instead, parse it back to double, and then I do get zero for the same image.
But I think invariant moments are not well suited for matching binary images; they seem better suited to Canny edge images, because for some matches on binary images I still get the problem described in my question.
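One refinement that is often suggested for Hu invariant moments (this is only a sketch, not the code above) is to compare log-scaled magnitudes, because the seven raw moments differ by several orders of magnitude and the first one or two dominate a plain city-block sum:
// Sketch: city-block distance on log-scaled invariant moments. The names
// (bMom1..bMom7, DB.GetBentukMomentN, CityBlock) are the ones from the code above.
double[] test = { bMom1, bMom2, bMom3, bMom4, bMom5, bMom6, bMom7 };
double[] ref  = { DB.GetBentukMoment1(j), DB.GetBentukMoment2(j), DB.GetBentukMoment3(j),
                  DB.GetBentukMoment4(j), DB.GetBentukMoment5(j), DB.GetBentukMoment6(j),
                  DB.GetBentukMoment7(j) };
double distance = 0;
for (int k = 0; k < 7; k++) {
    // log10 of the magnitude with the sign preserved; the small epsilon avoids log(0)
    double a = Math.signum(test[k]) * Math.log10(Math.abs(test[k]) + 1e-30);
    double b = Math.signum(ref[k])  * Math.log10(Math.abs(ref[k])  + 1e-30);
    distance += Math.abs(a - b);
}
CityBlock[j] = distance;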
I'm trying to get the camera shutter speed when a photo is taken with the Android camera, using the following call on the image file the app creates for the photo:
double vel = exif.getAttributeDouble(ExifInterface.TAG_SHUTTER_SPEED_VALUE, 0);
This gives values that change with the level of luminosity. For example, right now, if I let natural light come fully through my window it gives 6.906, and if I block it as much as possible it gives 3.882.
But I'm also using this app to check the correctness of the values, and for the same two cases it shows 1/120 and 1/12, which seem to be in the standard format for representing shutter speed, as seen here.
I can't tell whether ExifInterface.TAG_SHUTTER_SPEED_VALUE is measuring shutter speed correctly but on a different scale that I don't know how to convert, or whether it is simply wrong and not useful.
Could anyone tell me how to convert the value it gives to the 1/x format, or whether it is measuring something else entirely?
The TAG_SHUTTER_SPEED_VALUE unit is the APEX value.
I'm not sure about this source, but it is the only answer I found about the APEX value calculation: https://www.dpreview.com/forums/post/54376235
ShutterSpeed = -log2(ExposureTime)
And it matches your values:
-log2(1/120) = 6.907
-log2(1/12) = 3.585
Anyway, if you are looking for the exposure time in seconds, you can read TAG_EXPOSURE_TIME directly instead.
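For illustration, a minimal sketch assuming android.media.ExifInterface (the androidx ExifInterface has the same calls) and that photoPath points at the saved photo:
import android.media.ExifInterface;
import java.io.IOException;

// Sketch: read the APEX shutter-speed value and convert it back to an exposure
// time in seconds using ExposureTime = 2^(-ShutterSpeedValue).
static double readExposureTimeSeconds(String photoPath) throws IOException {
    ExifInterface exif = new ExifInterface(photoPath);
    double apex = exif.getAttributeDouble(ExifInterface.TAG_SHUTTER_SPEED_VALUE, 0);
    double fromApex = Math.pow(2, -apex);        // e.g. apex 6.906 -> about 1/120 s
    // Or, when the tag is present, read the exposure time directly:
    return exif.getAttributeDouble(ExifInterface.TAG_EXPOSURE_TIME, fromApex);
}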
I'm trying to develop an app that calculates the reverberation time of a room and displays it on the screen.
What I've done so far is:
record the audio in a WAV file
extract the bytes from the WAV file and convert them to double
plot the data obtained using the equation SPL = 20·log10(sample / 20 μPa) (see the sketch just after this list)
then from the figure that I've plotted, I can obtain the RT60 easily
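Here is a minimal sketch of that SPL step, assuming the samples have already been normalized to [-1, 1] as described further below. Without a calibrated microphone this gives a relative level (dB relative to full scale) rather than an absolute SPL referenced to 20 μPa, which is still enough to read a decay slope:
// Sketch: convert normalized samples to a level in dB. Without calibration this
// is dB relative to full scale, not absolute SPL, but the decay slope (and so
// the RT60 estimate) is the same either way.
static double[] toDb(double[] samples) {
    double[] db = new double[samples.length];
    for (int i = 0; i < samples.length; i++) {
        double magnitude = Math.max(Math.abs(samples[i]), 1e-9); // avoid log10(0)
        db[i] = 20.0 * Math.log10(magnitude);
    }
    return db;
}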
The point is that I'm not really sure whether what I'm doing makes any sense. Wherever I look for information, the RT is obtained per octave (or third-octave) band, and in my case I'm not doing anything with frequency; I'm just plotting the graph against time and getting something like this:
So, is there anything I'm missing?
Should the "samples" value in the SPL formula be something else? What Im doing to obtain them is:
double audioSample = (double) (array[i+1] << 8 | array[i] & 0xff)/ 32767.0;
and then I place the [-1,+1] values that I obtain directly in the formula
For what frequency am I theoretically plotting the RT?
Thanks
You have to use a frequency band that is common for voice: you can use 500 Hz, 1000 Hz or 2000 Hz, or an average of the three, as in the usual RT60 calculation. Oliver.
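To make that concrete, one possible way (a sketch only, not part of the answer above) to isolate a single band such as 1000 Hz before computing the decay curve is a standard biquad band-pass filter (RBJ Audio EQ Cookbook design); sampleRate, centerFreq and q are values you would choose for your recording:
// Sketch: simple biquad band-pass filter, one common way to isolate e.g. the
// 1000 Hz band before measuring the decay curve of that band.
static double[] bandPass(double[] in, double sampleRate, double centerFreq, double q) {
    double w0 = 2 * Math.PI * centerFreq / sampleRate;
    double alpha = Math.sin(w0) / (2 * q);
    double b0 = alpha, b1 = 0, b2 = -alpha;            // constant 0 dB peak-gain band-pass
    double a0 = 1 + alpha, a1 = -2 * Math.cos(w0), a2 = 1 - alpha;

    double[] out = new double[in.length];
    double x1 = 0, x2 = 0, y1 = 0, y2 = 0;             // filter state (previous inputs/outputs)
    for (int n = 0; n < in.length; n++) {
        double x = in[n];
        double y = (b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2) / a0;
        x2 = x1; x1 = x;
        y2 = y1; y1 = y;
        out[n] = y;
    }
    return out;
}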
I need to set the direction value in a tag inside the image when I capture an image from the camera. I tried, for example:
exif.setAttribute("GPSImgDirectionRef","T");
exif.setAttribute("GPSImgDirection","142.2");
with no success.
Any ideas?
Thanks.
I recently came across a solution for this issue, in case someone needs it:
According to the ExifInterface documentation, the attribute behind TAG_GPS_IMG_DIRECTION expects a "rational" value. As far as I could understand from the source code, it validates the supplied value by first checking for a "/" character in the string, then takes the numbers before and after the "/" to build a double value; only when the string can be converted this way is the attribute actually added to the image file.
So basically to make it work, instead of sending a double value as your attribute, you need to send a fraction.
As a suggestion, turning a double into a fraction can easily be done with the Apache Commons Math library's Fraction class. It would be something like this:
Fraction azimuthAsFraction = new Fraction(azimuthAsDouble);
exif.setAttribute(TAG_GPS_IMG_DIRECTION, String.valueOf(azimuthAsFraction));
This way your azimuth value should be added to the image file metadata.
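If you would rather avoid the extra dependency, the rational string can also be built by hand (a sketch; the /100 denominator is an arbitrary choice that keeps two decimal places):
// Sketch: write the azimuth as a "numerator/denominator" string without Apache Commons.
static void writeDirection(ExifInterface exif, double azimuthAsDouble) throws IOException {
    long numerator = Math.round(azimuthAsDouble * 100);    // e.g. 142.2 -> 14220
    exif.setAttribute(ExifInterface.TAG_GPS_IMG_DIRECTION, numerator + "/100");
    exif.setAttribute(ExifInterface.TAG_GPS_IMG_DIRECTION_REF, "T");
    exif.saveAttributes();                                 // persist the new tags to the image file
}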
I am working on an Android application in which I have to get android.graphics.Path values and compare them.
Consider the following images:
and
The first image shows a straight line and the generated Path value. Similarly, the second image also shows a similar straight line, with a different Path value.
I'm unable to understand the value that is generated. Can anyone explain as to what exactly the generated values mean? Can I approximately take a wild guess about a path value from the screen coordinates?
Also, in my application, I would like to compare path values. The lines shown in the above figure are similar. And in my application, I would like to compare them and render them as same lines. And I'm not just going to compare lines, there'll be curves and all such drawable shapes. For comparison do I first have to normalize my path values (maybe calling getMatrix for my current canvas?), so as to have the same effect for different screen sizes?
There is one other way of comparison that would be much simpler: finding the centroids of the figures' paths. Obviously a line will have its centroid at a different position than a curve, etc. But this sort of comparison won't be very accurate. I want to store some value and then compare the generated path value to the stored value, along with comparing the centroids, to get better accuracy. But for that, I need to understand the generated path values!
Please help or guide! Thanks! :-)
Edit:
The code that I'm using to convert my path values to String. My path values are stored in an ArrayList (called pointsToDraw). Here's the code:
@Override
public void onClick(View v) {
    // convert each Path in pointsToDraw to its String representation
    synchronized (pointsToDraw) {
        for (Path path : pointsToDraw) {
            stringPoints.add(String.valueOf(path));   // String.valueOf(path) just calls path.toString()
        }
    }
    TextView b1Text = (TextView) findViewById(R.id.GText);
    for (String s : stringPoints) {
        b1Text.setText(s);                            // each call overwrites the previous text, so only the last value is shown
    }
}
A Path object encapsulates a series of geometric paths. If you want to programmatically compare one path to another, then I think the place to start is to use PathMeasure on the Path object in order to pull out all of its co-ordinates. Using PathMeasure you can obtain a series of co-ordinates that the path follows by supplying a distance argument.
PathMeasure
Then, in order to determine whether one given path is similar to another in terms of the size and its path along the screen, I would perhaps suggest using PathMeasure on them both and comparing the co-ordinates they produce given incremental distance arguments. Then use some comparison algorithm, which may be as simple as determining whether each set of compared co-ordinates are within a distance from each other (with relative starting co-ordinates taken into account).
So I can't help with the algorithm you would use, but as a starting point, I think it's PathMeasure that you have to use in order to inspect and analyse the data within the Path to begin with. Or, you might want to render them to bitmap and use some kind of image recognition library to compare those bitmaps, perhaps?
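As a rough illustration of that starting point (a sketch only; the step size is an arbitrary choice, and the imports for android.graphics.Path, android.graphics.PathMeasure and java.util are assumed):
// Sketch: sample the points a Path passes through, at fixed distance steps.
static List<float[]> samplePath(Path path, float step) {
    PathMeasure measure = new PathMeasure(path, false);     // false = do not force-close the path
    float length = measure.getLength();
    List<float[]> points = new ArrayList<>();
    float[] position = new float[2];
    for (float distance = 0f; distance <= length; distance += step) {
        measure.getPosTan(distance, position, null);        // fills position[0] = x, position[1] = y
        points.add(new float[] { position[0], position[1] });
    }
    return points;                                          // compare these against the other path's samples
}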
I don't think you can really produce a meaningful string for that object, so you get the default value. If you check the documentation you will see that it is actually the same as
getClass().getName() + '#' + Integer.toHexString(hashCode())
or, in their words,
The unsigned hexadecimal representation of the hash code of the object.
It's just a hashed value of the object, and it has no direct, recognisable connection to any features of the Path (location, etc.). If you really want to know how it is made, you can look at the object's hashCode() function, but I suspect you won't find anything interesting for this question.
To be clear, when you say you want to compare "path values" you seem to imply that you want to compare the printed values above. I don't see why you would need that. You probably want to check whether two separately drawn/created lines are the same. You cannot use this hash for that purpose; you need to use the actual values, like start/stop/angle or something similar. (I'm not sure which members a Path exposes, but you can look that up.)
In Android (Eclipse), a calculation result for both double and float, when displayed as a string, sometimes uses a decimal point (desired) but sometimes uses an exponent (bad - confusing to the user). Is there any way to avoid the exponent?
See String.format documentation.
Just set the desired format for your numbers. You probably want String.format("%f", number).
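For example (the six decimal places are just %f's default; the BigDecimal line is one alternative when you want every digit without an exponent):
double number = 0.0000123;                                          // Double.toString(number) gives "1.23E-5"
String fixed = String.format("%f", number);                         // "0.000012"  (fixed-point, 6 decimals by default)
String morePrecise = String.format("%.8f", number);                 // "0.00001230"
String plain = new java.math.BigDecimal(number).toPlainString();    // all digits of the stored binary value, never an exponent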