I have a JNI C/C++ method that takes a jstring and returns a jstring, something like the following:
NATIVE_CALL(jstring, method)(JNIEnv *env, jobject obj, jstring filename)
{
    // Get jstring into C string format.
    const char *cs = env->GetStringUTFChars(filename, NULL);
    char *file_path = new char[strlen(cs) + 1]; // +1 for null terminator
    sprintf(file_path, "%s", cs);
    env->ReleaseStringUTFChars(filename, cs);

    reason_code = INTERNAL_FAILURE;
    char *info = start_module(file_path);

    jstring jinfo;
    if (info == NULL)
    {
        jinfo = env->NewStringUTF(NULL);
    }
    else
    {
        jinfo = env->NewStringUTF(info);
    }
    delete info;
    info = NULL;
    return jinfo;
}
The code works perfectly on Android versions prior to 4.0, such as 2.2 and 2.3. On ICS 4.0, CheckJNI is on by default, and because of it the app crashes with the following error:
08-25 22:16:35.480: W/dalvikvm(24027): **JNI WARNING: input is not valid Modified UTF-8: illegal continuation byte 0x40**
08-25 22:16:35.480: W/dalvikvm(24027):
08-25 22:16:35.480: W/dalvikvm(24027): ==========
08-25 22:16:35.480: W/dalvikvm(24027): /tmp/create
08-25 22:16:35.480: W/dalvikvm(24027): ==========
08-25 22:16:35.480: W/dalvikvm(24027): databytes,indoorgames,drop
08-25 22:16:35.480: W/dalvikvm(24027): ==========���c_ag����ϋ#�ډ#�����#'
08-25 22:16:35.480: W/dalvikvm(24027): in Lincom/inter /ndk/comNDK;.rootNDK:(Ljava/lang/String;)Ljava/lang/String; **(NewStringUTF)**
08-25 22:16:35.480: I/dalvikvm(24027): "main" prio=5 tid=1 NATIVE
08-25 22:16:35.480: I/dalvikvm(24027): | group="main" sCount=0 dsCount=0 obj=0x40a4b460 self=0x1be1850
08-25 22:16:35.480: I/dalvikvm(24027): | sysTid=24027 nice=0 sched=0/0 cgrp=default handle=1074255080
08-25 22:16:35.490: I/dalvikvm(24027): | schedstat=( 49658000 26700000 48 ) utm=1 stm=3 core=1
08-25 22:16:35.490: I/dalvikvm(24027): at comrootNDK(Native Method)
I am clueless as to where I am wrong. As you can see above, NewStringUTF is appending some garbage bytes to the C char* data.
Any idea why this is happening?
Any alternative solution to achieve the above is welcome.
I'd really appreciate it if one of you could help me with this. Thanks in advance.
The cause of this problem is directly related to a known UTF-8 bug in the NDK/JNI GetStringUTFChars() function (and probably related functions like NewStringUTF). These NDK functions do not convert supplementary Unicode characters (i.e., Unicode characters with a value of U+10000 and above) correctly. This leads to incorrect UTF-8 and subsequent crashes.
I encountered the crash when handling user input text that contained emoticon characters (see the corresponding Unicode chart). Emoticon characters lie in the Supplementary Unicode character range.
Analysis of the Problem
1. The Java client passes a string containing a supplementary Unicode character to JNI/NDK.
2. JNI uses the NDK function GetStringUTFChars() to extract the contents of the Java string.
3. GetStringUTFChars() returns the string data as incorrect and invalid UTF-8.
4. There is a known NDK bug whereby GetStringUTFChars() incorrectly converts supplementary Unicode characters, producing an incorrect and invalid UTF-8 sequence.
In my case, the resulting string was a JSON buffer. When the buffer was passed to the JSON parser, the parser promptly failed because one of the characters in the extracted data had an invalid UTF-8 prefix byte.
Possible Workaround
The solution I've used can be summarized as follows:
- The goal is to prevent GetStringUTFChars() from performing the incorrect UTF-8 encoding of the supplementary Unicode character.
- This is done by the Java client encoding the request string as Base64.
- The Base64-encoded request is passed to JNI.
- JNI calls GetStringUTFChars(), which extracts the Base64-encoded string without performing any UTF-8 encoding.
- The JNI code then decodes the Base64 data, producing the original UTF-16 (wide char) request string, including the supplementary Unicode character.
In this way we circumvent the problem of extracting supplementary Unicode characters from the Java string. Instead, we convert the data to Base64 ASCII before calling GetStringUTFChars(), extract the Base64 ASCII characters using GetStringUTFChars(), and convert the Base64 data back to wide characters.
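As an illustration only (not the poster's code), here is a minimal sketch of the JNI side of this workaround. It assumes the Java caller sends something like Base64.encodeToString(request.getBytes("UTF-8"), Base64.NO_WRAP); the class and method names are made up, and the decoder below is a generic Base64 routine standing in for whatever library you already use:

#include <jni.h>
#include <string>
#include <vector>

// Generic Base64 decoder (stand-in for any Base64 library).
static std::vector<unsigned char> base64Decode(const std::string &in) {
    static const std::string alphabet =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    std::vector<unsigned char> out;
    int val = 0, bits = -8;
    for (char c : in) {
        std::size_t pos = alphabet.find(c);
        if (pos == std::string::npos) break;   // '=' padding (or any other char) ends the data
        val = (val << 6) | static_cast<int>(pos);
        bits += 6;
        if (bits >= 0) {
            out.push_back(static_cast<unsigned char>((val >> bits) & 0xFF));
            bits -= 8;
        }
    }
    return out;
}

extern "C" JNIEXPORT void JNICALL
Java_com_example_Bridge_sendRequest(JNIEnv *env, jobject /*thiz*/, jstring base64Request) {
    // Safe: the jstring contains only 7-bit ASCII (Base64), so GetStringUTFChars()
    // never sees a supplementary character and cannot mangle the data.
    const char *b64 = env->GetStringUTFChars(base64Request, nullptr);
    std::vector<unsigned char> utf8 = base64Decode(b64);   // the original UTF-8 bytes
    env->ReleaseStringUTFChars(base64Request, b64);
    std::string request(utf8.begin(), utf8.end());
    // ... hand 'request' to the native module (e.g. the JSON parser) ...
}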
This is how I did it:
1- Char array to jbyteArray.
2- jbyteArray to jstring.
3- Return the jstring to the Java side.
JNI code (.c format):
jstring Java_com_x_y_z_methodName(JNIEnv *env, jobject thiz) {
    int size = 16;
    char r[] = {'P', 'K', 'd', 'h', 't', 'X', 'M', 'm', 'r', '1', '8', 'n', '2', 'L', '9', 'K'};
    jbyteArray array = (*env)->NewByteArray(env, size);
    (*env)->SetByteArrayRegion(env, array, 0, size, (const jbyte *) r);
    jstring strEncode = (*env)->NewStringUTF(env, "UTF-8");
    jclass cls = (*env)->FindClass(env, "java/lang/String");
    jmethodID ctor = (*env)->GetMethodID(env, cls, "<init>", "([BLjava/lang/String;)V");
    jstring object = (jstring) (*env)->NewObject(env, cls, ctor, array, strEncode);
    return object;
}
Java code:
native String methodName();
Another approach that did not work for me:
I also tried return (*env)->NewStringUTF(env, r), but it returns some characters at the end of the string that are not in the char array, along with the warning JNI WARNING: input is not valid Modified UTF-8: illegal continuation byte 0x40.
Example: PKdhtXMmr18n2L9K�ؾ�����-DL
Edit:
C++ version
jstring clientStringFromStdString(JNIEnv *env, const std::string &str) {
    // return env->NewStringUTF(str.c_str());
    jbyteArray array = env->NewByteArray(str.size());
    env->SetByteArrayRegion(array, 0, str.size(), (const jbyte *) str.c_str());
    jstring strEncode = env->NewStringUTF("UTF-8");
    jclass cls = env->FindClass("java/lang/String");
    jmethodID ctor = env->GetMethodID(cls, "<init>", "([BLjava/lang/String;)V");
    jstring object = (jstring) env->NewObject(cls, ctor, array, strEncode);
    return object;
}
I resolved this issue by returning a byte array instead of a String. On the Java side I now convert the byte array to a String. Works fine! Stay away from NewStringUTF() on Android 4.0 and above, as there is already a bug reported against the Google Android NDK.
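For reference, a minimal sketch of that approach applied to the method from the original question; the exported name is made up, how start_module() allocates its result is not stated in the question, and the Java side simply does new String(bytes, "UTF-8"):

#include <jni.h>
#include <cstring>

extern char *start_module(const char *path);   // from the question; signature assumed

extern "C" JNIEXPORT jbyteArray JNICALL
Java_com_example_NativeBridge_startModule(JNIEnv *env, jobject /*obj*/, jstring filename) {
    const char *path = env->GetStringUTFChars(filename, nullptr);
    char *info = start_module(path);
    env->ReleaseStringUTFChars(filename, path);
    if (info == nullptr)
        return nullptr;

    jsize len = (jsize) strlen(info);
    jbyteArray out = env->NewByteArray(len);
    env->SetByteArrayRegion(out, 0, len, (const jbyte *) info);
    // free/delete 'info' here, depending on how start_module() allocates it
    return out;   // the bytes pass through JNI untouched; no Modified UTF-8 check applies
}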
I hit this problem when I changed Application.mk from this line:
APP_STL := stlport_static
to:
APP_STL := gnustl_static
Once I changed it back, the issue was fixed.
Strings that you pass to NewStringUTF() need to be valid Modified UTF-8. It looks like the string returned by your start_module() function is in some other encoding, or is simply invalid. You need to convert the string to UTF-8 before passing it to JNI functions, or you could use one of the charset-aware String constructors to build the String object instead.
In my opinion, it's not a bug.
NewStringUTF constructs a new java.lang.String object from an array of characters in modified UTF-8 encoding.
Modified UTF-8 is not standard UTF-8. See Modified UTF-8.
In most cases a UTF-8 encoded string is valid Modified UTF-8, because the two encodings are quite similar. However, for Unicode strings beyond the Basic Multilingual Plane they are not compatible.
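For example, the emoji U+1F600 is encoded as F0 9F 98 80 in standard UTF-8, but Modified UTF-8 encodes it through its UTF-16 surrogate pair (D83D DE00) as ED A0 BD ED B8 80, a sequence that standard UTF-8 decoders reject as invalid.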
Solution:
Pass the UTF-8 bytes to the Java layer, build the string there with new String(bytes, "UTF-8"), and then pass the resulting jstring to JNI.
For me, the solution was to place the content in a const char* first:
const char* string = name_sin.c_str();
jstring utf8 = env_r->NewStringUTF(string);
and the function:
jclass cls_Env = env_r->FindClass(CLASS_ACTIVITY_NAME);
jmethodID mid = env_r->GetMethodID(cls_Env, "Delegate", "(Ljava/lang/String;)V");
// Important: do it this way; passing c_str() directly sometimes gives a non-UTF-8 character error
const char* string = name_sin.c_str();
jstring utf8 = env_r->NewStringUTF(string);
env_r->CallVoidMethod(*object_r, mid, utf8);
env_r->DeleteLocalRef(utf8);
I also struggled with the same problem since yesterday, and finally figured out a solution after a day. I hope this reply saves someone's day.
The problem was that I was calling another function within the native function and using the returned string directly, which caused a crash on older Android versions.
So I first saved the string returned from the other function into a variable and then used it, and the problem was gone :D
The example below should make the idea clear.
//older code with error
//here key_ is the string from java code
const char *key = env->GetStringUTFChars(key_, 0);
const char *keyx = getkey(key).c_str();
return env->NewStringUTF(keyx);
And here is how I solved this error
//newer code which is working
//here key_ is the string from java code
const char *key = env->GetStringUTFChars(key_, 0);
string k = getkey(key);
const char *keyx = k.c_str();
return env->NewStringUTF(keyx);
Happy coding :D
This works for me in C++:
extern "C" JNIEXPORT
jstring Java_com_example_ndktest_MainActivity_TalkToJNI(JNIEnv* env, jobject javaThis, jstring strFromJava)
{
jboolean isCopy;
const char* szHTML = env->GetStringUTFChars(strFromJava, &isCopy);
std::string strMine;
strMine = szHTML;
strMine += " --- Hello from the JNI!!";
env->ReleaseStringUTFChars(strFromJava, szHTML);
return env->NewStringUTF(strMine.c_str());
}
The C version for the Android NDK works as follows:
JNIEXPORT jstring JNICALL
Java_com_example_hellojni_HelloJni_stringFromJNI(JNIEnv* env, jobject thiz, jstring str)
{
    jboolean isCopy;
    const char* szHTML = (*env)->GetStringUTFChars(env, str, &isCopy);
    jstring result = (*env)->NewStringUTF(env, szHTML);
    (*env)->ReleaseStringUTFChars(env, str, szHTML); // release the chars obtained above
    return result;
}
Related
Description of the intended goal
I'm trying to implement OpenSSL-generated public/private key pairs in Android/Kotlin using JNI, in order to encrypt user information stored on the cloud server. I've compiled all the OpenSSL source code and generated all the .so files correctly.
The code (C++)
The C++ code that uses OpenSSL is shown below. The CMakeLists.txt and NDK configuration are working fine.
extern "C"
JNIEXPORT jobjectArray JNICALL
Java_eu_joober_ui_entry_SplashFragment_generateRSAKeyPair(JNIEnv *env, jobject thiz) {
int ret = 0;
RSA *r = nullptr;
BIGNUM *bne = nullptr;
BIO *bp_public = nullptr, *bp_private = nullptr;
int bits = 2048;
unsigned long e = RSA_F4;
jstring public_key_text;
jstring private_key_text;
jobjectArray returnPair = env->NewObjectArray(2, env->FindClass("java/lang/String"),nullptr);
// 1. generate rsa key
bne = BN_new();
ret = BN_set_word(bne,e);
if(ret != 1){
goto free_all;
}
r = RSA_new();
ret = RSA_generate_key_ex(r, bits, bne, nullptr);
if(ret != 1){
goto free_all;
}
// 2. get public key
bp_public = BIO_new(BIO_s_mem());
ret = PEM_write_bio_RSAPublicKey(bp_public, r);
BIO_get_mem_data(bp_public, &public_key_text);
if(ret != 1){
goto free_all;
}
// 3. get private key
bp_private = BIO_new(BIO_s_mem());
ret = PEM_write_bio_RSAPrivateKey(bp_private, r, nullptr, nullptr, 0, nullptr, nullptr);
BIO_get_mem_data(bp_private, &private_key_text);
// Check public and private keys were generated correctly
__android_log_print(ANDROID_LOG_DEBUG, LOG_TAG, "Public key is: \n%s",public_key_text);
__android_log_print(ANDROID_LOG_DEBUG, LOG_TAG, "Private key is: \n%s",private_key_text);
// 4. free
free_all:
BIO_free_all(bp_public);
BIO_free_all(bp_private);
RSA_free(r);
BN_free(bne);
// 5. Return strings using jobjectArray
if (ret == 1) {
env->SetObjectArrayElement(returnPair, 0, public_key_text);
env->SetObjectArrayElement(returnPair, 1, private_key_text);
return returnPair;
}
else {
return nullptr;
}
}
The error
If I check the Android Logcat, both the public and private keys seem to be generated correctly (as per the __android_log_print output), but the app crashes with the following error when env->SetObjectArrayElement(returnPair, 0, public_key_text); is called:
JNI DETECTED ERROR IN APPLICATION: use of invalid jobject
The IDE (Android Studio) does not complain about any error, and the log suggests the key pair is being generated correctly, but I don't know why the keys are not being stored in the jobjectArray. In fact, if I simply put:
env->SetObjectArrayElement(returnPair, 0, env->NewStringUTF("Hello"));
env->SetObjectArrayElement(returnPair, 1, env->NewStringUTF("World"));
the code works completely fine, my Kotlin code gets the Strings correctly ("Hello" and "World"), and the app does not crash, which makes me think that the problem is only on the C++ side.
The question
What am I doing wrong? I have checked other SO questions such as JNI converting jstring to char * and jstring return in JNI program, with slight modifications and combinations, with no luck.
SIDE NOTE: I'm using an OpenSSL implementation in C++ because Android/Kotlin does not provide the private key generated by KeyPairGenerator.getInstance() and generatePair() (only the public key can be retrieved from the Keystore). I need the private key stored in a different place so that user information can be recovered even if the app is uninstalled, as every subsequent call to generatePair() produces a different key pair. If you know a different approach to this problem, I'm more than open to any suggestions you may provide.
You never created a Java string out of public_key_text.
Try:
char * public_key_text;
...
BIO_get_mem_data(bp_public, &public_key_text);
...
env->SetObjectArrayElement(returnPair, 0, env->NewStringUTF(public_key_text));
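One more caveat, as a hedged side note: BIO_get_mem_data() hands back a pointer into the BIO's internal buffer, which is not guaranteed to be NUL-terminated and which is freed by BIO_free_all(), so it is safer to copy it with an explicit length and build the Java string before the free_all label runs. A minimal sketch (variable names are illustrative):

char *pub_buf = nullptr;
long pub_len = BIO_get_mem_data(bp_public, &pub_buf);
std::string pub_pem(pub_buf, pub_len);                          // copy while the BIO is still alive
jstring public_key_text = env->NewStringUTF(pub_pem.c_str());   // PEM output is plain ASCII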
Does anyone know how to solve the error?
JNIEXPORT jstring JNICALL JAVA_com_pfc_camera_ndkmain_MainActivity_compresion(JNIEnv* env, jobject obj, jobjectArray jargv){
    //jargv is a Java array of Java strings
    int argc = env->GetArrayLength(jargv);
    typedef char *pchar;
    pchar *argv = new pchar[argc];
    int i;
    for(i=0; i<argc; i++)
    {
        jstring js = env->GetObjectArrayElement(jargv, i); //A Java string
        const char *pjc = env->GetStringUTFChars(js); //A pointer to a Java-managed char buffer
        size_t jslen = strlen(pjc);
        argv[i] = new char[jslen+1]; //Extra char for the terminating null
        strcpy(argv[i], pjc); //Copy to *our* buffer. We could omit that, but IMHO this is cleaner. Also, const correctness.
        env->ReleaseStringUTFChars(js, pjc);
    }
    //Call main
    Principal *pa = Principal::CreateInstance(argc, argv);
    pa->Run();
    pa->FreeInstance();
    //Now free the array
    for(i=0; i<argc; i++)
        delete [] argv[i];
    delete [] argv;
}
I understand that the error may come from a missing cast, but I'm not entirely clear on it.
(Screenshot of the error: https://i.stack.imgur.com/bOWKZ.png)
It seems that that problem has been solved; now I get another error, but I don't understand whether I'm passing the two arguments, js and pjc:
(Screenshot of the second error: https://i.stack.imgur.com/UHCAR.png)
In C++ you have to use explicit conversion to your desired type.
jstring js = (jstring)env->GetObjectArrayElement(jargv, i);
You can learn about JNI programming here.
Regarding your other question, do the following:
const char *pjc = env->GetStringUTFChars(js, NULL);
I have a problem with conversion in JNI.
In C++ I'm creating a cipher using AES (the Crypto++ library). I'm converting the result to a string and returning it. This is what the code that gets the string looks like:
JNIEXPORT jbyteArray JNICALL Java_com_example_androidake_MutualAuthenticateChip_prepareEncryptionCPP
  (JNIEnv *env, jobject thisObj, jboolean hmm, jboolean jinit) {
    string encryption = mac->EncryptCertKey();
    jbyteArray returns = env->NewByteArray(encryption.size());
    env->SetByteArrayRegion(returns, 0, encryption.length(), (jbyte*) encryption.c_str());
    return returns;
};
The string above is converted to a jbyteArray, which is returned. Originally I wanted to just return the string using
env->NewStringUTF(encryption.c_str());
but the application kept crashing. I think it is caused by the content of the 'encryption' variable. I use env->NewStringUTF(encryption.c_str()); in other functions, where the returned string is, for example, just a number or something like that.
Then in Java I convert the bytes to a String:
byte[] cipher = mac_A.prepareEncryptionCPP(true, true);
String cipher_str = new String(cipher);
Then I pass that string back to the C++ object and compare the old cipher with the cipher sent from Java:
//Java
boolean result = mac_A.compareEncryption(true, cipher);

//JNI
JNIEXPORT jboolean JNICALL Java_com_example_androidake_MutualAuthenticateChip_compareEncryption
  (JNIEnv * env, jobject thisObj, jboolean jinit, jstring cipher){
    bool init = jinit;
    bool result;
    jsize length = env->GetStringUTFLength(cipher);
    const char *inCStr_ek = env->GetStringUTFChars(cipher, 0);
    string s(inCStr_ek, length);
    result = mac->CompareCipher(s);
    env->ReleaseStringUTFChars(cipher, inCStr_ek);
    return result;
};
Comparing in C++:
bool MyClass::CompareCipher(std::string cipher_2){
    if(cipher == cipher_2){
        return true;
    }else{
        return false;
    }
}
And it always returns false. I don't know what I'm doing wrong. I've even sent the cipher from Java to C++ and taken it back to Java, and the strings are equal; but on the C++ side they are not.
On your Java side you have
byte[] cipher = mac_A.prepareEncryptionCPP(true, true);
boolean result = mac_A.compareEncryption(true, cipher);
but the compareEncryption JNI function is declared with a jstring, not a jbyteArray.
So from the JNI side you send a byte array to Java, and from the Java side you send the same byte array back to the native side (but the call treats it as a jstring); you then use env->GetStringUTFChars(cipher, 0), which converts that data into a modified UTF-8 string, so it is technically no longer the same byte array.
If you need strings, do the conversions on the Java side, and just pass the same plain byte arrays between JNI and Java. See this for string encoding issues in Android JNI.
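For illustration, a minimal sketch of what the native side could look like with the parameter switched to a byte array. This is not the poster's code; it assumes the same mac object and CompareCipher() method from the question, and that the Java declaration becomes native boolean compareEncryption(boolean init, byte[] cipher):

#include <jni.h>
#include <string>

extern "C" JNIEXPORT jboolean JNICALL
Java_com_example_androidake_MutualAuthenticateChip_compareEncryption(
        JNIEnv *env, jobject thisObj, jboolean jinit, jbyteArray cipher) {
    jsize len = env->GetArrayLength(cipher);
    std::string s((size_t) len, '\0');
    // Copy the raw bytes; no UTF-8 / Modified UTF-8 conversion happens anywhere.
    env->GetByteArrayRegion(cipher, 0, len, reinterpret_cast<jbyte *>(&s[0]));
    return mac->CompareCipher(s) ? JNI_TRUE : JNI_FALSE;
}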
I have implemented Jansson in Android with C and wrote a function that calculates values from JSON; it works in plain C. I tried to use that code in the NDK with JNI. It builds with no errors, but as I arranged the code to work with JNI it gives me the pointer warning return from incompatible pointer type. I have read that I need to use jlong for pointers, but I can't figure out how that works; it is my first time working with this.
This is my code in plain C (it compiles and gives no errors):
char *doCalc (char *invoice_str) {
    json_error_t error;
    json_t *invoice = json_loads (invoice_str, JSON_DISABLE_EOF_CHECK, &error);
    ...
    char *result = json_dumps (json_data, JSON_PRESERVE_ORDER);
    return result;
}
The C code arranged to work with JNI (gives me the warning return from incompatible pointer type, which, if I'm correct, is because of jchar):
JNIEXPORT jchar JNICALL *Java_com_example_test_doCalc (JNIEnv* env, jobject obj, char const *invoice_str) {
    json_error_t error;
    json_t *invoice = json_loads (invoice_str, JSON_DISABLE_EOF_CHECK, &error);
    ...
    char *result = json_dumps (json_data, JSON_PRESERVE_ORDER);
    return result;
}
Then in my Activity I would like to run doCalc(charJ);, where charJ has JSON in it. That would then give me a dump of the calculated values.
I might also be looking at this completely wrong; any help is appreciated.
Try to use jstring instead of char*
JNIEXPORT jchar JNICALL * Java_com_example_test_doCalc(JNIEnv * env, jobject obj, jstring invoice_jstring) {
    //convert invoice_jstring to char* (see the link below)
    json_error_t error;
    json_t * invoice = json_loads(invoice_str, JSON_DISABLE_EOF_CHECK, &error);
    ...
    char * result = json_dumps(json_data, JSON_PRESERVE_ORDER);
    return result;
}
For converting a jstring to char* you can use this answer:
JNI converting jstring to char *
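To tie the two together, here is a rough sketch of what the whole function could look like once the return type is also changed to jstring. It is written with the C++ JNI convention (env->...); in plain C the same calls take env as the first argument via (*env)->. The calculation itself is elided in the question, so the sketch simply dumps the parsed JSON back out:

#include <jni.h>
#include <stdlib.h>
#include <jansson.h>

extern "C" JNIEXPORT jstring JNICALL
Java_com_example_test_doCalc(JNIEnv *env, jobject obj, jstring invoice_jstring) {
    const char *invoice_str = env->GetStringUTFChars(invoice_jstring, nullptr);
    json_error_t error;
    json_t *invoice = json_loads(invoice_str, JSON_DISABLE_EOF_CHECK, &error);
    env->ReleaseStringUTFChars(invoice_jstring, invoice_str);
    if (invoice == nullptr)
        return nullptr;                           // parse failed; error.text has details

    // ... the real calculation producing the output JSON would go here ...
    char *result = json_dumps(invoice, JSON_PRESERVE_ORDER);
    jstring jresult = env->NewStringUTF(result);  // copy into a Java String
    free(result);                                 // json_dumps() allocates with malloc
    json_decref(invoice);
    return jresult;
}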
In my Android JNI code, I need to convert jstring to wchar_t. The closest reference I found was How do I convert jstring to wchar_t *.
One can obtain jchar* and the length using the following code:
const jchar *raw = env->GetStringChars(string, 0);
jsize len = env->GetStringLength(string);
wchar_t* wStr = new wchar_t[len+1];
It seems I cannot use wcsncpy to copy "raw" into "wStr". Although jchar is 2 bytes long, wchar_t is 4 bytes long on all modern versions of the Android OS.
One option is to copy one character at a time in a for loop:
for(int i=0;i<len;i++) {
wStr[i] = raw[i];
}
wStr[len] = 0;
The other option would be to call env->GetStringUTFChars() and use iconv_* routines to convert to wchar_t type.
Can someone please confirm whether option 1 is valid? I hope I don't have to resort to option 2. Is there a better option?
wchar_t specifies an element size but not a character set or encoding. Since you are asking about a 32-bit element, can we assume you want to use Unicode/UTF-32? Regardless, once you decide which encoding you want, standard Java libraries are up to the task.
Use a String.getBytes() overload to get an array of bytes. (It is easier to do this in Java rather than JNI, if you have a choice.) Once you have a jbyteArray, you can copy it to a C buffer and cast to wchar_t *.
On Android, you might want Unicode/UTF-8. But that has an 8-bit code-unit so you probably wouldn't be asking about wchar_t. (BTW-a character in UTF-8 can need 1 or more bytes.)
One way would be to use String.getBytes("UTF-32LE"). Note this is making the ASSUMPTION that wchar_t is 4 bytes and little-endian, but this should be a fairly safe assumption to make.
Here's an example that passes a String from Java to C++, where it is converted to std::wstring, reversed, and passed back to Java:
class MyClass {
private native byte[] reverseString(byte[] arr);
String reverseString(String s) {
try {
return new String(reverseString(s.getBytes("UTF-32")), "UTF-32");
} catch (UnsupportedEncodingException e) {
// TODO Auto-generated catch block
e.printStackTrace();
return "";
}
}
}
On the C++ side you have:
std::wstring toWStr(JNIEnv *env, jbyteArray s)
{
const wchar_t *buf = (wchar_t*) env->GetByteArrayElements(s, NULL);
int n = env->GetArrayLength(s) / sizeof(wchar_t);
// First byte is BOM (0xfeff), so we skip it, hence the "buf + 1".
// There IS NO null-terminator.
std::wstring ret(buf + 1, buf + n);
env->ReleaseByteArrayElements(s, (jbyte*) buf, 0);
return ret;
}
jbyteArray fromWStr(JNIEnv *env, const std::wstring &s)
{
jbyteArray ret = env->NewByteArray((s.size()+1)*sizeof(wchar_t));
// Add the BOM in front.
wchar_t bom = 0xfeff;
env->SetByteArrayRegion(ret, 0, sizeof(wchar_t), (const jbyte*) &bom);
env->SetByteArrayRegion(ret, sizeof(wchar_t), s.size()*sizeof(wchar_t), (const jbyte*) s.c_str());
return ret;
}
extern "C" JNIEXPORT jbyteArray JNICALL Java_MyClass_reverseString(JNIEnv *env, jobject thiz, jbyteArray arr)
{
std::wstring s= toWStr(env, arr);
std::reverse(s.begin(), s.end());
return fromWStr(env, s);
}
I tested it both on my phone, which has Android 4.1.2 and ARM CPU, and on the Android Emulator - Android 4.4.2 and x86 CPU, and this code:
MyClass obj = new MyClass();
Log.d("test", obj.reverseString("hello, здравствуйте, 您好, こんにちは"));
Gave this output:
06-04 17:18:20.605: D/test(8285): はちにんこ ,好您 ,етйувтсвардз ,olleh
As long as all your data is UCS2, you can use option 1. Please see wchar_t for UTF-16 on Linux? for a similar discussion. Note that C++11 provides std::codecvt_utf16 to deal with the situation.
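For completeness, a rough sketch of that codecvt route (hedged: it assumes a standard library that actually ships <codecvt>, such as libc++ in recent NDKs, and std::codecvt_utf16 is deprecated as of C++17):

#include <jni.h>
#include <codecvt>
#include <locale>
#include <string>

// Convert a jstring (UTF-16 code units) into a 4-byte wchar_t string,
// handling surrogate pairs, which the simple per-element copy loop does not.
std::wstring jstringToWide(JNIEnv *env, jstring s) {
    const jchar *raw = env->GetStringChars(s, nullptr);
    jsize len = env->GetStringLength(s);
    // jchar is stored in native byte order; Android devices are little-endian.
    std::wstring_convert<std::codecvt_utf16<wchar_t, 0x10ffff, std::little_endian>, wchar_t> conv;
    std::wstring result = conv.from_bytes(reinterpret_cast<const char *>(raw),
                                          reinterpret_cast<const char *>(raw + len));
    env->ReleaseStringChars(s, raw);
    return result;
}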
No need to convert. Cast the const jchar* to (wchar_t *). jni.h defines jchar as typedef uint16_t jchar; /* unsigned 16 bits */, which is eventually wchar_t.
You can try this; it worked for me in an old project.