Situation:
A string is encrypted in a Java environment (javax.crypto.Cipher / PBEWithMD5AndDES) and Base64 encoded.
The string is transferred to Android.
The same crypto decoder is used, only a different Base64 library.
=> we could not decrypt it, getting
java.lang.SecurityException: Could not decrypt: pad block corrupted
During analysis we compared the byte arrays passed to the decode() method, in order to rule out any Base64 issues, and the arrays are identical.
Again: two identical byte arrays, passed to the same Java module, produce different results (OK on Java, exception on Android).
The parameters passed to the Cipher module are hard-coded and identical on both platforms.
Where is the difference?
We have finally found the difference between the Java and Android code. It turned out the Cipher component is just a container and does not implement anything by itself. The actual algorithm implementation is supplied by a provider, and each platform has a different list of providers configured. In our case it was a Sun implementation on Java and Bouncy Castle on Android. So it turned out that we had accidentally used an encryption algorithm whose implementation differed between providers.
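One way to avoid this class of problem is to pin both sides to the same provider. The following is a minimal sketch, assuming the Bouncy Castle provider (bcprov) is also available on the plain JVM side (it ships with Android); the class and method names are invented for the example:

    import java.security.Security;
    import javax.crypto.Cipher;
    import org.bouncycastle.jce.provider.BouncyCastleProvider;

    // Sketch: pin the cipher to a single provider on both platforms so that the
    // PBEWithMD5AndDES implementation is identical everywhere.
    public class ProviderPinning {
        public static Cipher getPbeCipher() throws Exception {
            if (Security.getProvider("BC") == null) {
                Security.addProvider(new BouncyCastleProvider());
            }
            // Naming the provider explicitly avoids picking up whatever
            // implementation happens to be first in the platform's provider list.
            return Cipher.getInstance("PBEWithMD5AndDES", "BC");
        }
    }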
Related
I am implementing the AES encryption algorithm in GCM mode in an Android application.
My IDE (IntelliJ IDEA) tells me that to use javax.crypto.spec.GCMParameterSpec the condition android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.KITKAT is required.
When the condition is not met, I tried to use a javax.crypto.spec.GCMParameterSpec whose source file I downloaded and included in my project, but with it the encryption operations do not work correctly (the decrypted data does not match the original data, or java.security.InvalidAlgorithmParameterException: IV must be specified in GCM mode).
Do you have any ideas on how I can also support versions of Android before KITKAT?
Thanks in advance.
Initial versions of Android, based on Java 6, did not give you GCMParameterSpec; they used IvParameterSpec instead. Besides the (usually 12-byte) IV, GCMParameterSpec gives you two additional parameters: support for additional authenticated data and the tag size.
Now the tag size is not too much of a problem: first of all, usually the full 128 bits / 16 bytes are used. Furthermore, you can simply remove bytes from the end of the ciphertext until you reach the required tag size, e.g. remove 4 bytes / 32 bits to get a tag size of 96 bits.
The additional data is a problem: as far as I know there is no way to specify it, at least not if you require a Cipher instance. You could of course use GCMBlockCipher instead, but then you would not be using Cipher or any acceleration the platform might provide (Bouncy Castle is software only).
And yes, as indicated, it is perfectly possible to implement GCM mode yourself on Android, as you don't need to sign providers there. Of course, you would have to use a different GCMParameterSpec implementation, and it would be a good idea to use that provider only on the older platforms, so some runtime switching seems to be required.
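A minimal sketch of such runtime switching, assuming a 12-byte IV and the default 128-bit tag; the pre-KitKat branch relies on the installed (Bouncy Castle) provider accepting IvParameterSpec for AES/GCM, as described above, and the class and method names are invented for the example:

    import java.security.spec.AlgorithmParameterSpec;
    import javax.crypto.Cipher;
    import javax.crypto.SecretKey;
    import javax.crypto.spec.GCMParameterSpec;
    import javax.crypto.spec.IvParameterSpec;
    import android.os.Build;

    public final class GcmCompat {
        // Sketch: choose the parameter spec at runtime. On KitKat and later the
        // platform GCMParameterSpec is used (tag size + AAD support); on older
        // releases we fall back to IvParameterSpec, which leaves the provider's
        // default 128-bit tag and offers no way to pass additional data.
        public static Cipher initEncrypt(SecretKey key, byte[] iv) throws Exception {
            Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
            AlgorithmParameterSpec spec;
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
                spec = new GCMParameterSpec(128, iv); // 128-bit tag, 12-byte IV
            } else {
                spec = new IvParameterSpec(iv);       // tag size fixed by the provider
            }
            cipher.init(Cipher.ENCRYPT_MODE, key, spec);
            return cipher;
        }
    }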
We have an Android app that decrypts and encrypts large (up to 100 MB) files over HTTP streams.
For this we use CipherInputStream and CipherOutputStream, which works fine for AES/CBC/PKCS7Padding. We recently switched to AES/GCM/NoPadding. Now encryption and decryption are unacceptably slow for files over roughly 50 MB.
Debugging into the Android source code reveals the issue: https://android.googlesource.com/platform/libcore/+/master/ojluni/src/main/java/javax/crypto/CipherInputStream.java#112
This method has a byte buffer, oBuffer, which is reallocated and increased by 512 bits until it can hold the whole message (see line: https://android.googlesource.com/platform/libcore/+/master/ojluni/src/main/java/javax/crypto/CipherInputStream.java#121).
I am aware of the note above this method, which states that for AEAD ciphers the whole message has to be buffered. That is one issue, because we cannot hold the whole message in a memory buffer. Another issue is that oBuffer is constantly reallocated.
Is there any solution for using GCM with a streaming API?
Splitting the file into parts and chaining them is a solution for you.
Assume that you divide the file into n parts. Encrypt each of them with AES-GCM, with the following additions. Prefix each part before encryption as follows:
    tag_0 = ''
    for i from 1 to n:
        ciphertextBlock_i, tag_i = AES-GCM( i:n || tag_{i-1} || plaintextBlock_i )
Prefix each part with the part number, as i:n.
Prefix each part except the first one with the authentication tag of the previous part.
With these, you now have a chain that can be verified after decryption. You can detect additions and deletions. The order is under your control; you can even send the parts out of order, however you then need to check the prefix. (A code sketch of this chaining follows the list below.)
You can also
add the part size, and
add the time of encryption, if you fear a replay attack.
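As a rough illustration of the chaining above, the following sketch encrypts a list of parts, prefixing each one with i:n and the previous part's tag before encryption. The fresh random 12-byte IV per part, storing that IV in front of each encrypted part, and the class and method names are assumptions of this sketch, not part of the answer:

    import java.io.ByteArrayOutputStream;
    import java.nio.ByteBuffer;
    import java.security.SecureRandom;
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;
    import javax.crypto.Cipher;
    import javax.crypto.SecretKey;
    import javax.crypto.spec.GCMParameterSpec;

    public final class ChunkedGcm {
        private static final int TAG_BYTES = 16; // 128-bit tag
        private static final SecureRandom RANDOM = new SecureRandom();

        public static List<byte[]> encryptParts(SecretKey key, List<byte[]> parts) throws Exception {
            List<byte[]> out = new ArrayList<>();
            byte[] previousTag = new byte[0];                 // tag_0 = ''
            int n = parts.size();
            for (int i = 1; i <= n; i++) {
                byte[] iv = new byte[12];
                RANDOM.nextBytes(iv);                         // fresh IV per part
                Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
                cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));

                // Plaintext for this part: i:n || tag_{i-1} || plaintextBlock_i
                byte[] block = parts.get(i - 1);
                ByteBuffer plain = ByteBuffer.allocate(8 + previousTag.length + block.length);
                plain.putInt(i).putInt(n).put(previousTag).put(block);
                byte[] ciphertext = cipher.doFinal(plain.array()); // GCM appends the tag

                // Remember this part's tag (the last 16 bytes) for the next part.
                previousTag = Arrays.copyOfRange(ciphertext,
                        ciphertext.length - TAG_BYTES, ciphertext.length);

                ByteArrayOutputStream record = new ByteArrayOutputStream();
                record.write(iv);                             // store the IV in front of the part
                record.write(ciphertext);
                out.add(record.toByteArray());
            }
            return out;
        }
    }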
I found a link describing how to rebuild proto files from a C++ executable: http://www.sysdream.com/reverse-engineering-protobuf-apps
Is there a similar method for APKs or decrypted Objective-C apps?
The article you linked describes extracting the FileDescriptorProtos embedded as string literals in a C++ application that uses Protobufs. The Java code generator embeds similar string literals into the generated Java code. If you run the Java classes through a decompiler, you should be able to recover the descriptor strings and decode them.
However, note that this only works if the application uses the standard, Google-authored Java protobuf implementation and does not use "lite mode". In lite mode, descriptors are not included in the generated code. Implementations other than the Google-authored ones may or may not include the descriptor. I would guess that most Android developers prefer to use lite mode or some alternative lightweight implementation that doesn't include descriptors, so you might have trouble extracting from APKs. (I don't know about Objective-C.)
That said, note that you can actually decode a lot of the information in a protobuf message without having the schema at all. If you use protoc with the --decode_raw command-line option and feed it a protobuf message on stdin, it will decode it into tag/value pairs. You'll only get field numbers (not names) and some type information is lost, but you'll find it much easier to reverse-engineer the format from there than you would with just the raw bytes.
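For completeness, a similar raw dump can be produced programmatically with the full (non-lite) protobuf-java runtime. This is only a sketch and assumes the serialized message bytes are in a file whose path is passed as the first argument:

    import com.google.protobuf.UnknownFieldSet;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    // Sketch: parse a message without its schema. Field numbers are shown,
    // names are not, and some type information (e.g. string vs. bytes) is
    // lost, much as with protoc --decode_raw.
    public class RawDecode {
        public static void main(String[] args) throws Exception {
            byte[] message = Files.readAllBytes(Paths.get(args[0]));
            UnknownFieldSet fields = UnknownFieldSet.parseFrom(message);
            System.out.println(fields);   // prints "fieldNumber: value" pairs
        }
    }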
I am using a 128-bit AES cipher algorithm. But the program takes a long time, since the files to encrypt are big.
I was wondering if there is a lighter cipher algorithm to use on Android. I can't find a list of the ciphers supported on Android.
Have you tried using shorter keys with AES instead? You could try an OpenSSL build as native code, but I guess Dalvik already uses optimized libraries, so I don't think it will help. There are good reasons AES takes some time; by choosing something faster, you will have to lower real security.
If you need speed, I suggest you do not encrypt the whole file. Instead, encrypt only the header or the parts of the file without which the rest is not useful. However, this depends on what data you are encrypting and will not work for generic data files.
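A minimal sketch of that "encrypt only the header" idea, assuming AES/CBC and an illustrative 4 KB header size; the class name, buffer sizes and on-disk format are made up for the example, and whether this protects anything depends entirely on the file format being encrypted:

    import java.io.InputStream;
    import java.io.OutputStream;
    import javax.crypto.Cipher;
    import javax.crypto.SecretKey;
    import javax.crypto.spec.IvParameterSpec;

    public final class HeaderOnlyEncryption {
        private static final int HEADER_SIZE = 4096; // illustrative value

        public static void encrypt(InputStream in, OutputStream out,
                                   SecretKey key, byte[] iv) throws Exception {
            byte[] header = new byte[HEADER_SIZE];
            int headerLen = in.read(header);
            if (headerLen > 0) {
                // Only the header is encrypted; note that CBC padding changes its
                // length, so the decryptor must know how many bytes to read back.
                Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
                cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
                out.write(cipher.doFinal(header, 0, headerLen));
            }
            byte[] buffer = new byte[8192];           // the rest passes through unchanged
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }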
When following the Application Licensing document on the Android developer site to use ServerManagedPolicy for licensing, the section Implementing an Obfuscator says to declare a private static final array of 20 random bytes called SALT. This is passed to the constructor of AESObfuscator, and the description says it's "an array of random bytes to use for each (un)obfuscation". I am new to this, but I guess that is for obfuscating preference values.
When I later obfuscate the code itself using the ProGuard option delivered with the Android SDK for Eclipse by exporting the APK, I get the final APK. But using a reverse-engineering application like apktool on my APK reveals the SALT array in plain bytes. Now, like I said, I am new to this and my question might seem a bit naive... but isn't that a bad thing? Shouldn't the byte array be a bit more hidden?
A salt value is not a secret, so it's not really a problem if it is disclosed. That said, the obfuscator mangles code (mostly variable and method names), not values. So anything you have stored as is (strings, byte arrays, your obfuscation key) will be recoverable by decompilation.
Obfuscation makes it a bit harder to find, but if you are purposefully looking for a random-looking 16-byte array or a 128-bit key, it is not too hard to find.
BTW, that example doesn't really promote best practices -- you should use a randomly generated new salt value for every encryption operation, and store it along with the encrypted data, not hard-code it in your encryption code and use it every time. Then again, that example assumes you will be encrypting (for obfuscation purposes) a single preference only.
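A minimal sketch of that "fresh salt per encryption" layout; generating the salt and where it is stored are the point here, while the actual encryption routine (e.g. the AESObfuscator from the licensing sample) is outside the snippet, and the class and method names are invented for the example:

    import java.security.SecureRandom;
    import java.util.Arrays;

    public final class RandomSaltExample {
        private static final int SALT_BYTES = 20;
        private static final SecureRandom RANDOM = new SecureRandom();

        // Generate a fresh salt for every encryption operation instead of
        // hard-coding one in the source.
        public static byte[] newSalt() {
            byte[] salt = new byte[SALT_BYTES];
            RANDOM.nextBytes(salt);
            return salt;
        }

        // Store the salt in front of the data that was encrypted with it...
        public static byte[] store(byte[] salt, byte[] encrypted) {
            byte[] out = new byte[salt.length + encrypted.length];
            System.arraycopy(salt, 0, out, 0, salt.length);
            System.arraycopy(encrypted, 0, out, salt.length, encrypted.length);
            return out;
        }

        // ...and read it back before decrypting.
        public static byte[] readSalt(byte[] stored) {
            return Arrays.copyOfRange(stored, 0, SALT_BYTES);
        }

        public static byte[] readEncrypted(byte[] stored) {
            return Arrays.copyOfRange(stored, SALT_BYTES, stored.length);
        }
    }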