How to create a DigitalPersona FMD from byte[] in Android?

I am able to save an FMD to SQLite in Android. However, when I try to recreate the FMD it does not work properly.
Saving the FMD to the database:
try {
    Fmd m_enrollmant_fmd = m_engine.CreateEnrollmentFmd(Fmd.Format.ANSI_378_2004, enrollThread);
    if (m_success = (m_enrollmant_fmd != null)) {
        byte[] fingerprintData = m_enrollmant_fmd.getData();
        int fingerprintImageHeight = m_enrollmant_fmd.getHeight();
        int fingerprintImageWidth = m_enrollmant_fmd.getWidth();
        int fingerprintResolution = m_enrollmant_fmd.getResolution();
        int fingerPosition = m_enrollmant_fmd.getViews()[0].getFingerPosition();
        int cbeffId = m_enrollmant_fmd.getCbeffId();
        int m_templateSize = m_enrollmant_fmd.getData().length;
        m_current_fmds_count = 0; // reset count on success
        // Save the above values in the database; this part works fine.
    }
} catch (Exception e) {
    m_current_fmds_count = 0;
}
Code to recreate the FMD:
Fmd temp_fmd = m_engine.CreateFmd(
        user.fingerPrint,            // byte[]
        user.width,                  // width
        user.height,                 // height
        user.resolution,             // resolution
        user.fingerPrint,            // finger position
        user.cbeffId,                // CBEFF id
        Fmd.Format.ANSI_378_2004);   // format algorithm
if (temp_fmd != null) {
    // m_fmd is the FMD generated from the freshly captured fingerprint
    int m_score = m_engine.Compare(m_fmd, 0, temp_fmd, 0);
    if (m_score < (0x7FFFFFFF / 100000)) {
        userFound = user;
        break;
    }
}
This gives me a UareUException saying the FID is invalid. However, when I pass fixed values to m_engine.CreateFmd(...) for everything except the byte[] parameter, it runs, but the dissimilarity is 100%, meaning it doesn't match at all. Any help will be appreciated. I am using the UareU SDK 3.1 from Crossmatch (previously known as DigitalPersona).
Note that the values read back from the database are correct; there is no issue there. The only problem is that I am unable to create a correct FMD from them.

Try using the following to recreate the Fmd from a byte array:
Fmd print = UareUGlobal.GetImporter().ImportFmd(prints, Fmd.Format.ANSI_378_2004, Fmd.Format.ANSI_378_2004);
where prints is your byte array.
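A minimal sketch of how that fits the question's flow (reusing m_engine, m_fmd, user, userFound and the score threshold from the question; only the importer call is new). The ANSI 378 template bytes already encode the width, height, resolution, finger position and CBEFF id, so the raw bytes are all that needs to be stored:
try {
    // Recreate the enrolled FMD from the bytes stored in SQLite.
    Fmd storedFmd = UareUGlobal.GetImporter().ImportFmd(
            user.fingerPrint,             // byte[] loaded from the database
            Fmd.Format.ANSI_378_2004,     // format the stored bytes are in
            Fmd.Format.ANSI_378_2004);    // format to produce

    // Compare against the freshly captured FMD, as in the question.
    int score = m_engine.Compare(m_fmd, 0, storedFmd, 0);
    if (score < (0x7FFFFFFF / 100000)) {
        userFound = user; // dissimilarity low enough to call it a match
    }
} catch (UareUException e) {
    e.printStackTrace(); // the import fails here if the stored bytes are corrupt
}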

Related

Passing FFMPEG AvFrame data from c++ to JAVA

I need to pass the FFMPEG 'raw' data back to my Java code in order to display it on the screen.
I have a native method that deals with FFMPEG and afterwards calls a method in Java that takes byte[] (so far) as an argument.
The byte array that is passed is readable from Java, but calling BitmapFactory.decodeByteArray(bitmap, 0, bitmap.length); returns null. I have printed out the array and I get 200k elements (which is expected), but it cannot be decoded. So far I am taking the data from AvFrame->data, casting it to unsigned char *, and then casting that to jbyteArray. After all the casting, I pass the jbyteArray as an argument to my Java method. Is there something I'm missing here? Why won't BitmapFactory decode the array into an image for displaying?
Edit 1.0
Currently I am trying to obtain my image via:
public void setImage(ByteBuffer bmp) {
    bmp.rewind();
    Bitmap bitmap = Bitmap.createBitmap(1920, 1080, Bitmap.Config.ARGB_8888);
    bitmap.copyPixelsFromBuffer(bmp);
    runOnUiThread(() -> {
        ImageView imgViewer = findViewById(R.id.mSurfaceView);
        imgViewer.setImageBitmap(bitmap);
    });
}
But I keep getting an exception:
JNI DETECTED ERROR IN APPLICATION: JNI NewDirectByteBuffer called with pending exception java.lang.RuntimeException: Buffer not large enough for pixels
at void android.graphics.Bitmap.copyPixelsFromBuffer(java.nio.Buffer) (Bitmap.java:657)
at void com.example.asmcpp.MainActivity.setSurfaceImage(java.nio.ByteBuffer)
Edit 1.1
So, here is the full code that executes every time a frame comes in. Note that the ByteBuffer is created and passed from within this method:
void VideoClientInterface::onEncodedFrame(video::encoded_frame_t &encodedFrame) {
    AVFrame *filt_frame = av_frame_alloc();
    auto frame = std::shared_ptr<video::encoded_frame_t>(new video::encoded_frame_t,
            [](video::encoded_frame_t *p) { if (p) delete p; });
    if (frame) {
        frame->size = encodedFrame.size;
        frame->ssrc = encodedFrame.ssrc;
        frame->width = encodedFrame.width;
        frame->height = encodedFrame.height;
        frame->dataType = encodedFrame.dataType;
        frame->timestamp = encodedFrame.timestamp;
        frame->frameIndex = encodedFrame.frameIndex;
        frame->isKeyFrame = encodedFrame.isKeyFrame;
        frame->isDroppable = encodedFrame.isDroppable;
        frame->data = new char[frame->size];
        if (frame->data) {
            memcpy(frame->data, encodedFrame.data, frame->size);
            AVPacket packet;
            av_init_packet(&packet);
            packet.dts = AV_NOPTS_VALUE;
            packet.pts = encodedFrame.timestamp;
            packet.data = (uint8_t *) encodedFrame.data;
            packet.size = encodedFrame.size;
            int ret = avcodec_send_packet(m_avCodecContext, &packet);
            if (ret == 0) {
                ret = avcodec_receive_frame(m_avCodecContext, m_avFrame);
                if (ret == 0) {
                    m_transform = sws_getCachedContext(
                            m_transform,                                             // previous context ptr
                            m_avFrame->width, m_avFrame->height, AV_PIX_FMT_YUV420P, // src
                            m_avFrame->width, m_avFrame->height, AV_PIX_FMT_RGB24,   // dst
                            SWS_BILINEAR, nullptr, nullptr, nullptr                  // options
                    );
                    auto decodedFrame = std::make_shared<video::decoded_frame_t>();
                    decodedFrame->width = m_avFrame->width;
                    decodedFrame->height = m_avFrame->height;
                    decodedFrame->size = m_avFrame->width * m_avFrame->height * 3;
                    decodedFrame->timeStamp = m_avFrame->pts;
                    decodedFrame->data = new unsigned char[decodedFrame->size];
                    if (decodedFrame->data) {
                        uint8_t *dstSlice[] = {decodedFrame->data, 0, 0};
                        const int dstStride[] = {decodedFrame->width * 3, 0, 0};
                        sws_scale(m_transform, m_avFrame->data, m_avFrame->linesize,
                                  0, m_avFrame->height, dstSlice, dstStride);
                        auto m_rawData = decodedFrame->data;
                        auto len = strlen(reinterpret_cast<char *>(m_rawData));
                        if (frameCounter == 10) {
                            jobject newArray = GetJniEnv()->NewDirectByteBuffer(m_rawData, len);
                            GetJniEnv()->CallVoidMethod(m_obj, setSurfaceImage, newArray);
                            frameCounter = 0;
                        }
                        frameCounter++;
                    }
                } else {
                    av_packet_unref(&packet);
                }
            } else {
                av_packet_unref(&packet);
            }
        }
    }
}
I am not entirely sure I am even doing that part correctly. If you see any errors in this, feel free to point them out.
You cannot cast native byte arrays to jbyteArray and expect it to work. A byte[] is an actual object with a length field, a reference count, and so on.
Use NewDirectByteBuffer instead to wrap your native buffer in a Java ByteBuffer. Note that a direct buffer has no backing array, so on the Java side you copy the bytes out with ByteBuffer.get(byte[]) rather than .array().
Note that this JNI operation is relatively expensive, so if you expect to do this on a per-frame basis, you might want to pre-allocate some ByteBuffers and tell FFmpeg to write directly into those buffers.
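Two other things in the posted native code are worth checking: strlen on raw pixel data stops at the first zero byte, so pass decodedFrame->size to NewDirectByteBuffer instead; and copyPixelsFromBuffer with an ARGB_8888 bitmap expects width * height * 4 bytes, so an RGB24 frame (3 bytes per pixel) is too small, which is what "Buffer not large enough for pixels" is complaining about. Using AV_PIX_FMT_RGBA as the sws_scale destination sidesteps that. A minimal sketch of the receiving method on the Java side (assumed variant of the question's setImage, not the asker's exact code):
public void setImage(ByteBuffer bmp) {
    bmp.rewind();
    Bitmap bitmap = Bitmap.createBitmap(1920, 1080, Bitmap.Config.ARGB_8888);
    // Verify the size up front instead of letting copyPixelsFromBuffer throw.
    if (bmp.remaining() < bitmap.getByteCount()) {
        throw new IllegalArgumentException("need " + bitmap.getByteCount()
                + " bytes, got " + bmp.remaining());
    }
    bitmap.copyPixelsFromBuffer(bmp);
    runOnUiThread(() -> {
        ImageView imgViewer = findViewById(R.id.mSurfaceView);
        imgViewer.setImageBitmap(bitmap);
    });
}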

GLES30.glBufferData crash

When I program a project on Android with GLES 3.0 VBOs, the application sometimes crashes when invoking GLES30.glBufferData. There is no crash if I use simple data, but it crashes when the data comes from a file.
int[] vboIDs = new int[1];
GLES30.glGenBuffers(1, vboIDs, 0);
GLES30.glBindBuffer(GLES30.GL_ELEMENT_ARRAY_BUFFER, vboIDs[0]);
GLES30.glBufferData(GLES30.GL_ELEMENT_ARRAY_BUFFER, size, buffer, GLES30.GL_STATIC_DRAW);//size=1296 buffer.capacity()=1296
GLES30.glBindBuffer(GLES30.GL_ELEMENT_ARRAY_BUFFER, 0);
It just crashes with no exception log. Is the format of the buffer wrong? Below is how I get the buffer instance; the parameter byteBuffer comes from a binary file:
public static ByteBuffer createSlice(ByteBuffer byteBuffer, int position, int length)
{
    if (byteBuffer == null)
    {
        return null;
    }
    int oldPosition = byteBuffer.position();
    int oldLimit = byteBuffer.limit();
    try
    {
        int newLimit = position + length;
        if (newLimit > byteBuffer.capacity())
        {
            throw new IllegalArgumentException(
                "The new limit is " + newLimit + ", but the capacity is "
                + byteBuffer.capacity());
        }
        byteBuffer.limit(newLimit);
        byteBuffer.position(position);
        ByteBuffer slice = byteBuffer.slice();
        slice.order(byteBuffer.order());
        return slice;
    }
    finally
    {
        byteBuffer.limit(oldLimit);
        byteBuffer.position(oldPosition);
    }
}
A crash in glBufferData normally means you've run off the end of the buffer and accessed an unmapped page, so check that size is correct for the buffer you are uploading.
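A hedged sketch of the checks worth doing before the upload (the names byteBuffer and size are from the question; offset is a hypothetical placeholder). glBufferData reads size bytes starting at the buffer's current position, and giving the GL bindings a direct buffer in native order avoids surprises with heap-backed slices:
ByteBuffer indices = createSlice(byteBuffer, offset, size);
if (!indices.isDirect()) {
    // Copy a heap-backed slice into a direct buffer before uploading.
    ByteBuffer direct = ByteBuffer.allocateDirect(indices.remaining())
                                  .order(ByteOrder.nativeOrder());
    direct.put(indices);
    direct.rewind();
    indices = direct;
}
if (size > indices.remaining()) {
    throw new IllegalStateException("glBufferData would read past the end: size="
            + size + ", remaining=" + indices.remaining());
}
GLES30.glBufferData(GLES30.GL_ELEMENT_ARRAY_BUFFER, size, indices, GLES30.GL_STATIC_DRAW);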

Encryption on open Source VoIP Android

This is with reference to sipdroid data encrypt failed.
I tried using an XOR operation instead of reversing the byte code for sent and received packets in SipdroidSocket.class.
I experienced the same issue (too much noise).
Please guide me in encrypting and decrypting packets in SipdroidSocket.class.
Sorry for the late reply. I am posting the snippets of the code I tried. Please refer to the original RtpSocket.java and SipdroidSocket.java classes for the complete view; I am just putting the snippets here.
In RtpSocket.java I took a static value and collected the packet's header length, then used this header length in SipdroidSocket.java to remove the header part before tweaking the payload.
In SipdroidSocket.java, the following edits were made to the send and receive functions:
public void receive(DatagramPacket pack) throws IOException {
    if (loaded) {
        impl.receive(pack);
    } else {
        super.receive(pack);
    }
    byte[] b = pack.getData(); // fetch data from receiver
    int len = RtpSocket.header;
    pack.setData(do_something(b, len)); // XOR to retrieve the original data
}

public void send(DatagramPacket pack) throws IOException {
    byte[] b = pack.getData(); // fetch original data
    int len = RtpSocket.header;
    pack.setData(do_something(b, len)); // replace with tweaked data
    if (loaded)
        impl.send(pack);
    else
        super.send(pack);
}

private byte[] do_something(byte[] b, int len) {
    int new_buff_len = b.length - len;
    byte[] new_buff = new byte[new_buff_len];
    // XOR the payload (everything after the header) so that applying the same
    // operation on the other side restores the original data.
    for (int i = len; i < b.length; i++) {
        new_buff[i - len] = (byte) (b[i] ^ 0x43); // i - len: the original code
                                                  // indexed with i and ran past
                                                  // the end of new_buff
    }
    return new_buff;
}
Kindly try it and advise me, please.
Finally it worked! I had to meddle with the other parts of the code. The XOR operation now works fine and the objective has been attained.
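For anyone trying the same approach: XOR with a repeating key is its own inverse, which is why the identical routine can serve both send and receive. A tiny standalone round-trip check (hypothetical snippet, not from the Sipdroid code):
import java.util.Arrays;

public class XorCheck {
    // XOR each byte with a repeating key; applying it twice restores the input.
    static byte[] xor(byte[] data, byte[] key) {
        byte[] out = new byte[data.length];
        for (int i = 0; i < data.length; i++) {
            out[i] = (byte) (data[i] ^ key[i % key.length]);
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] payload = {0x10, 0x20, 0x30, 0x40};
        byte[] key = "your_key".getBytes();
        System.out.println(Arrays.equals(payload, xor(xor(payload, key), key))); // true
    }
}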

Difference between Android and Matlab (reading a wav file into an array)

I am trying to read a wav file into an array using Android. In order to validate the results I read the same wav file using Matlab. The problem is that the values are different. Your help is highly appreciated in solving this problem.
Kindly find below the Matlab and Android code with the associated results:
Matlab Code:
fName = 'C:\Users\me\Desktop\audioText.txt';
fid = fopen(fName,'w');
dlmwrite(fName,y_sub,'-append','delimiter','\t','newline','pc');
Matlab Results:
0.00097656
0.00045776
0.0010681
0.00073242
0.00054932
-0.00064087
0.0010376
-0.00027466
-0.00036621
-9.1553e-05
0.00015259
0.0021362
-0.00024414
-3.0518e-05
-0.00021362
Android Code:
String filePath;
private static DataOutputStream fout;
ByteArrayOutputStream out;
BufferedInputStream in;

filePath = "mnt/sdcard/audio.wav";
out = new ByteArrayOutputStream();
try {
    in = new BufferedInputStream(new FileInputStream(filePath));
} catch (FileNotFoundException e1) {
    e1.printStackTrace();
}
int read;
byte[] buff = new byte[2000000];
try {
    while ((read = in.read(buff)) > 0) {
        out.write(buff, 0, read);
    }
} catch (IOException e1) {
    e1.printStackTrace();
}
try {
    out.flush();
} catch (IOException e1) {
    e1.printStackTrace();
}
byte[] audioBytes = out.toByteArray();
Android Results:
82, 73, 70, 70, 92, 108, 40, 0, 87, 65, 86, 69, 102, 109
Thanks,
In Android you're reading the file header, not the actual values of the sound samples. Your Android values are the ASCII codes for
RIFF\l( WAVEfm
In Matlab I'm not sure what you're doing... it looks like you're writing a file, not reading one.
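To get sample values in Android comparable to Matlab's, the header has to be skipped and the PCM bytes decoded. A minimal sketch, assuming a canonical 44-byte header and 16-bit little-endian mono PCM (a robust reader would parse the RIFF chunk layout instead of hard-coding the offset):
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

byte[] audioBytes = out.toByteArray(); // from the question's code
ByteBuffer bb = ByteBuffer.wrap(audioBytes, 44, audioBytes.length - 44)
                          .order(ByteOrder.LITTLE_ENDIAN);
float[] samples = new float[bb.remaining() / 2];
for (int i = 0; i < samples.length; i++) {
    // Normalize int16 to [-1, 1), matching Matlab's scaling:
    // e.g. a raw value of 32 becomes 32 / 32768 = 0.00097656.
    samples[i] = bb.getShort() / 32768f;
}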
The dir command is quite helpful here. Called without arguments it lists the whole content of a directory, but you can also specify a glob to return only a subset of files, e.g. dir('*.wav'). It returns a struct array containing file information such as name, date, bytes, isdir, and so on.
To get started, try the following:
filelist = dir('*.wav');
for file = filelist
fprintf('Processing %s\n', file.name);
fid = fopen(file.name);
% Do something here with your file.
fclose(fid);
end
If a processing result has to be stored per file, I often use the following pattern. I usually pre-allocate an array, a struct array or a cell array of the same size as the filelist. Then I use an integer index to iterate over the file list, which I can also use to write the output. If the information to be stored is homogeneous (e.g. one scalar per file), use an array or a struct array. However, if the information differs from file to file (e.g. vectors or matrices of different size), use a cell array instead.
An example using an ordinary array:
filelist = dir('*.wav');
% Pre-allocate an array to store some per-file information.
result = zeros(size(filelist));
for index = 1 : length(filelist)
fprintf('Processing %s\n', filelist(index).name);
% Read the sample rate Fs and store it.
[y, Fs] = wavread(filelist(index).name);
result(index) = Fs;
end
% result(1) .. result(N) contain the sample rates of each file.
An example using a cell array:
filelist = dir('*.wav');
% Pre-allocate a cell array to store some per-file information.
result = cell(size(filelist));
for index = 1 : length(filelist)
fprintf('Processing %s\n', filelist(index).name);
% Read the data of the WAV file and store it.
y = wavread(filelist(index).name);
result{index} = y;
end
% result{1} .. result{N} contain the data of the WAV files.
I am not sure what exactly the problem was, but I got the correct readings when I used the following code:
File filein = new File(filePath, "audio.wav");
try
{
    // Open the wav file specified as the first argument
    WavFile wavFile = WavFile.openWavFile(filein);
    // Display information about the wav file
    wavFile.display();
    // Get the number of audio channels in the wav file
    int numChannels = wavFile.getNumChannels();
    // Create a buffer of 20000 frames
    double[] buffer = new double[20000 * numChannels];
    int framesRead;
    double min = Double.MAX_VALUE;
    double max = Double.MIN_VALUE;
    do
    {
        // Read frames into buffer
        framesRead = wavFile.readFrames(buffer, 20000);
        // Loop through frames and look for minimum and maximum value
        for (int s = 0; s < framesRead * numChannels; s++)
        {
            if (buffer[s] > max) max = buffer[s];
            if (buffer[s] < min) min = buffer[s];
        }
    }
    while (framesRead != 0);
    // Close the wavFile
    wavFile.close();
    // Output the minimum and maximum value
    System.out.printf("Min: %f, Max: %f\n", min, max);
}
catch (Exception e)
{
    System.err.println(e);
}

Android byte array to string to byte array

All I need is to convert a byte[] to a String, do something with that string, and convert it back to a byte[]. But in this test I just convert the byte[] to a String and back, and the result is different.
To convert the byte[] to a String I use this:
byte[] byteEntity = EntityUtils.toByteArray(entity);
String s = new String(byteEntity, "UTF-8");
Then I tried:
byte[] byteTest = s.getBytes("UTF-8");
Then I compared them:
if (byteEntity.equals(byteTest)) Log.i("test", "equal");
else Log.i("test", "diff");
The result is different.
I searched Stack Overflow about this but nothing matches my case. The point is that my data is a .png picture, so the converted string is unreadable. Thanks in advance.
Solved
I used something like this:
byte[] mByteEntity = EntityUtils.toByteArray(entity);
byte[] mByteDecrypted = clip_xor(mByteEntity, "your_key".getBytes());
ByteArrayOutputStream baos = new ByteArrayOutputStream(); // declared here; missing from the original snippet
baos.write(mByteDecrypted);
InputStream in = new ByteArrayInputStream(baos.toByteArray());
and this is the clip_xor function:
protected byte[] clip_xor(byte[] data, byte[] key) {
    int num_key = key.length;
    int num_data = data.length;
    try {
        if (num_key > 0) {
            // XOR each data byte with the key, repeating the key cyclically.
            for (int i = 0, j = 0; i < num_data; i++, j = (j + 1) % num_key) {
                data[i] ^= key[j];
            }
        }
    } catch (Exception ex) {
        Log.i("error", ex.toString());
    }
    return data;
}
Hope this will be useful for someone facing the same problem. Thank you all for helping me solve this.
Special thanks to P'krit_s.
Primitive arrays are actually Objects (that's why they have an .equals method) but they do not implement the contract of equality (hashCode and equals) needed for content comparison. You cannot use == either, since according to the docs .getBytes returns a new byte[] instance. You should use Arrays.equals(byteEntity, byteTest) to test equality.
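A small standalone illustration of both points, plus why the UTF-8 round trip broke the PNG bytes in the first place (hypothetical snippet, not from the question):
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

byte[] a = {1, 2, 3};
byte[] b = a.clone();
System.out.println(a.equals(b));         // false: Object identity, not content
System.out.println(Arrays.equals(a, b)); // true: element-by-element comparison

// Arbitrary binary data is not valid UTF-8; invalid sequences are replaced
// with U+FFFD during decoding, so the round trip is lossy.
byte[] png = {(byte) 0x89, 'P', 'N', 'G', (byte) 0xFF};
byte[] round = new String(png, StandardCharsets.UTF_8)
        .getBytes(StandardCharsets.UTF_8);
System.out.println(Arrays.equals(png, round)); // false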
Have a look at the answer here.
In that case my target was to transform a PNG image into a byte stream to display it in an embedded browser (a particular case where the browser did not show the PNG directly).
You may use the logic of that solution to convert the PNG to bytes and then to a String.
Then reverse the order of operations to get back to the original file.
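If the goal is a binary-safe String representation of the PNG (rather than interpreting the bytes as text), Base64 is the standard route; a minimal sketch using android.util.Base64 (a suggested alternative, not from the answers above):
import android.util.Base64;

byte[] byteEntity = EntityUtils.toByteArray(entity);
// Encode to a pure-ASCII String that survives any text handling...
String s = Base64.encodeToString(byteEntity, Base64.DEFAULT);
// ...and decode it back to the identical byte[].
byte[] byteTest = Base64.decode(s, Base64.DEFAULT);
// Arrays.equals(byteEntity, byteTest) is now true.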
