ByteString: what do the parameters exactly do? - android

I want to upload some files, 30 MB max, to my server with an OkHttp WebSocket.
The WebSocket transfer only allows String or ByteString.
So I want to convert my file to a ByteString and then upload it to my server via the WebSocket (Node.js).
I use ByteString.of() to convert the byte array like this:
val file = "../tmp/file.jpg"
try {
    val encoded: ByteArray = Files.readAllBytes(Paths.get(file))
    val byteString = ByteString.of(encoded, 0, 1024)
    // ..send data
    Log.d("log1", "DATA DONE")
} catch (e: IOException) {
    Log.d("log1", "ERROR: $e")
}
But what confuses me is that the ByteString.of() function takes 3 parameters:
First: ByteArray
Second: Offset
Third: ByteCount
My question is: what do the last 2 parameters do, and what is the reason behind them? I can't find any clear documentation about this, just the roadmap entry that it was added.
If you have any links or suggestions, please let me know.

- Offset is where you want to start reading your bytes from.
Assume a text file with the following data:
Computer-science World
Quantum Computing
Now the offset for the first line is 0 <0, Computer-science World>, and for the second line the offset will be 23 <23, Quantum Computing>.
- ByteCount is the number of bytes you want to include, counted from the offset.
Let's help you with a piece of simple code:
byte[] bytes1 = "Hello, World!".getBytes(StandardCharsets.UTF_8);
ByteString byteString = ByteString.of(bytes1, 2, 9);
// Verify that the bytes were copied out.
System.out.print(byteString.utf8());
The answer is: llo, Worl
So basically, the method can be used like a substring of the byte array. But since you want to send all the bytes, you can simply use
fun of(vararg data: Byte): ByteString
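For reference, a minimal Kotlin sketch (assuming Okio 2.x and an already-open okhttp3.WebSocket named webSocket; both names are placeholders) of wrapping the whole file and sending it as one binary frame:
import okhttp3.WebSocket
import okio.ByteString
import okio.ByteString.Companion.toByteString
import java.io.File

// Sketch: read the whole file and send it as a single binary WebSocket frame.
fun sendFile(webSocket: WebSocket, path: String): Boolean {
    val bytes: ByteArray = File(path).readBytes()                 // whole file in memory
    val payload: ByteString = bytes.toByteString(0, bytes.size)   // equivalent to ByteString.of(bytes, 0, bytes.size)
    return webSocket.send(payload)                                // false if the frame could not be enqueued
}
Note that send() only enqueues the frame and returns false when it cannot (for example while the connection is closing), so the return value is worth checking for payloads this large.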

Related

Transfer Images using IPC app to app communication on the same device

I want to know if there is an efficient way to do app-to-app communication using IPC. I went through the guide for services that use AIDL in the docs, but what I really want is to transfer an image between the two apps (the images are high quality). The only issue is that Android only lets us pass about 1 MB, or even less, through a Bundle; if you send more than that you'll end up with a TransactionTooLargeException. Right now I'm compressing the image and passing a Base64 string between the two apps, and it works fine. But sometimes some images cannot be compressed enough. How can I make something like that work? I'm compressing the image using
bitmap.compress(Bitmap.CompressFormat.WEBP, 90, outputStream) // then Base64 with NO_WRAP
and the quality is reduced by 10 if I get a TransactionTooLargeException. Any ideas on how to make something like that work on Android?
The thing is, I don't want to open any other app. The image that I'm receiving will be processed, and a status string will be sent back to the app that sent the image, like 'very cool image' or 'very bad image'.
Aidl link
Thanks.
var quality = 100

private fun sendImageToDevice(quality: Int, icon: Bitmap) {
    Log.e(TAG, "quality of image is $quality ")
    runOnUiThread {
        Toast.makeText(
            this@MainActivity,
            "Image Quality $quality",
            Toast.LENGTH_SHORT
        ).show()
    }
    if (quality > 0) {
        try {
            mProcessImageService?.sendImageFormat(icon.toBase64(quality))
        } catch (e: TransactionTooLargeException) {
            // Transaction too large: lower the quality and retry.
            this.quality = quality - 20
            sendImageToDevice(this.quality, icon)
        }
    } else {
        // Quality exhausted; nothing more to try.
    }
}
fun Bitmap.toBase64(quality: Int): String {
    val outputStream = ByteArrayOutputStream()
    this.compress(Bitmap.CompressFormat.JPEG, quality, outputStream)
    val base64String: String = Base64.encodeToString(outputStream.toByteArray(), Base64.NO_WRAP)
    Log.d("MainActivity", "outputstream size is ${outputStream.size()}")
    return base64String
}
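One hedged alternative, since AIDL is already in play: write the bitmap to a cache file and pass a ParcelFileDescriptor across the service interface instead of the Base64 string, so the Binder transaction stays small regardless of image size. The sendImage() name below is hypothetical; it would be a method you add to your own AIDL interface.
import android.graphics.Bitmap
import android.os.ParcelFileDescriptor
import java.io.File
import java.io.FileOutputStream

// Sketch: only the file descriptor crosses the Binder, not the image bytes.
fun Bitmap.toParcelFileDescriptor(cacheDir: File): ParcelFileDescriptor {
    val file = File(cacheDir, "shared_image.webp")
    FileOutputStream(file).use { out ->
        compress(Bitmap.CompressFormat.WEBP, 100, out)   // full quality, no Base64 needed
    }
    return ParcelFileDescriptor.open(file, ParcelFileDescriptor.MODE_READ_ONLY)
}

// Hypothetical usage: mProcessImageService?.sendImage(icon.toParcelFileDescriptor(cacheDir))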

SHA Encryption in iOS

I have this code on Android:
val digest = MessageDigest.getInstance("SHA-512")
digest.update("secretotpkey".toByteArray())
val sb = StringBuilder()
val bytes = digest.digest(value.toByteArray())
bytes.forEach {
    sb.append(((it and 0xF) + 0x100).toString(16).substring(1))
}
val encryptedValue = sb.toString()
makeLog("Encrypted value is $encryptedValue")
return encryptedValue
I am trying to convert this to iOS using CryptoSwift. However, I am getting different results. Any ideas how to fix this?
var digest = Digest.sha512("secretotpkey".bytes)
print(digest)
let bytes = "54181474".bytes
print(bytes)
digest.append(contentsOf: bytes)
var blah = String()
for item in digest {
    let a = Int(item & 0xF) + Int(0x100)
    let b = (String(format:"%02X", a)).substring(range: NSRange(location: 1, length: 2))
    print(b)
    blah.append(b)
}
Two encoding issues:
You're not indicating the character set when converting the key to bytes (a key should consist of bytes in the first place; strings are not keys).
Your hex encoding is clearly not correct in either the Kotlin or the Swift version; please use a pre-made library call instead, or look up correct code here on Stack Overflow.
That should fix it, because otherwise there is nothing there but a call to a standardized algorithm, SHA-512.
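For reference, a minimal sketch of the usual idiom on the Kotlin side, with an explicit charset and a full-byte hex encoding (masking with 0xFF rather than 0xF):
import java.security.MessageDigest

// Sketch: SHA-512 over key bytes followed by value bytes, hex-encoded.
fun sha512Hex(key: String, value: String): String {
    val digest = MessageDigest.getInstance("SHA-512")
    digest.update(key.toByteArray(Charsets.UTF_8))            // explicit character set
    val bytes = digest.digest(value.toByteArray(Charsets.UTF_8))
    return bytes.joinToString("") { "%02x".format(it) }       // standard hex encoding
}
On the Swift side, the value bytes need to be fed into the hash itself (not appended to the digest output) for the two platforms to produce the same string.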

How to convert a protobuf object to a ByteArray and then encode it with Base64 URL_SAFE in Swift?

In Android, I can convert an object to a ByteArray and then encode it to Base64 (URL_SAFE) as per the code below:
val myByteArrayObject = protobufObject.toByteArray()
val meEncodedObject = android.util.Base64.encodeToString(
    myByteArrayObject, android.util.Base64.DEFAULT).trim()
How could I achieve that in Swift?
Found the answer.
do {
    let protobufSerialized = try protobufObject.serializedData()
    let protobufEncoded = protobufSerialized.base64EncodedString()
    // Do whatever needs to be done with protobufEncoded
} catch { }
The main hidden function that is hard to find is serializedData(), which exists on SwiftProtobuf.Message.
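As a side note on the URL_SAFE part of the title, here is a minimal Kotlin sketch of matching encode/decode helpers on the Android side (the question's own snippet uses Base64.DEFAULT; whichever flag is used to encode must also be used to decode):
import android.util.Base64

// Sketch: URL-safe Base64 helpers on the Android side.
fun encodeUrlSafe(bytes: ByteArray): String =
    Base64.encodeToString(bytes, Base64.URL_SAFE or Base64.NO_WRAP)

fun decodeUrlSafe(encoded: String): ByteArray =
    Base64.decode(encoded, Base64.URL_SAFE)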

Flutter's resource load is faster than native Android's

I'm trying to convert an image taken from resources to a ByteArray which will later be sent through a Socket. I've been measuring the time of each conversion.
I've done it on both Flutter and native Android (Kotlin). All of the tests were done on the same image, which was about 1-2 MB.
Flutter code:
sendMessage() async {
  if (socket != null) {
    Stopwatch start = Stopwatch()..start();
    final imageBytes = await rootBundle.load('assets/images/stars.jpg');
    final image = base64Encode(imageBytes.buffer.asUint8List(imageBytes.offsetInBytes, imageBytes.lengthInBytes));
    print('Converting took ${start.elapsedMilliseconds}');
    socket.emit("message", [image]);
  }
}
Kotlin code:
private fun sendMessage() {
    var message = ""
    val thread = Thread(Runnable {
        val start = SystemClock.elapsedRealtime()
        val bitmap = BitmapFactory.decodeResource(resources, R.drawable.stars)
        message = Base64.encodeToString(getBytesFromBitmap(bitmap), Base64.DEFAULT)
        Log.d("Tag", "Converting time was : ${SystemClock.elapsedRealtime() - start}")
    })
    thread.start()
    thread.join()
    socket.emit("message", message)
}

private fun getBytesFromBitmap(bitmap: Bitmap): ByteArray? {
    val stream = ByteArrayOutputStream()
    bitmap.compress(Bitmap.CompressFormat.JPEG, 100, stream)
    return stream.toByteArray()
}
I was actually expecting the native code to be much, much faster than Flutter's, but that's not the case: the conversion takes about 50 ms in Flutter and around 2000-3000 ms natively.
I thought that threading might be the cause, so I tried to run the conversion on a background thread in the native code, but it didn't help.
Can you please tell me why there is such a difference in time, and how I can implement it better in native code? Is there a way to omit decoding to a Bitmap, etc.? Maybe that is what makes it take so long.
EDIT: Added the getBytesFromBitmap function.
The difference you see is that in the Flutter code you just read your data without any image decoding, while in Kotlin you first decode it to a Bitmap and then compress() it back. If you want to speed it up, simply get an InputStream by calling Resources#openRawResource and read your image resource without any decoding, for example as in the sketch below.
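A minimal sketch of that suggestion (the helper name is hypothetical), reading the drawable's raw bytes with no Bitmap round trip:
// Read the resource's raw bytes directly; no BitmapFactory, no re-compression.
private fun getRawResourceBytes(resId: Int): ByteArray =
    resources.openRawResource(resId).use { it.readBytes() }

// Usage, mirroring the question's flow:
// val message = Base64.encodeToString(getRawResourceBytes(R.drawable.stars), Base64.DEFAULT)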
It has something to do with the way you convert it to bytes... Can you please post your getBytesFromBitmap function? Also, the conversion in native code really should be done on a background thread; please post your results for that case as well.

Android String Parsing

I am writing an Android app that receives data over Bluetooth. The incoming bytes can be of any size, for example: 00023>024935928598235>9284>
As you can see, each set is separated by ">". The data comes in extremely fast. I would like some ideas for an implementation. My problem is that I need to read the data into a byte array, then convert it to a string and split it according to the delimiter ">".
So in the above example:
00023
024935928598235
9284
If I set byte[] data = new byte[8], then when reading the incoming data it might get 00023>02, which is not what I want. I'm not sure how to implement something like this. Any ideas?
Here's one approach. You'll have to implement readDataFromBluetooth() and somehow set dataAvailable, but this should get you on the right track.
byte[] data = new byte[1024];
List<String> chunks = new LinkedList<String>();
StringBuilder chunk = new StringBuilder();
while (dataAvailable) {
    data = readDataFromBluetooth();
    for (byte b : data) {
        if (b == '>') {                // delimiter: finish the current chunk
            chunks.add(chunk.toString());
            chunk.setLength(0);
        } else {
            chunk.append((char) b);    // append the character, not the numeric byte value
        }
    }
}
if (chunk.length() > 0)
    chunks.add(chunk.toString());
I would recommend using a buffered stream, maybe a bit bigger than the 8 bytes you suggest, and then reading one character at a time from the beginning of the stream, accumulating the string. When you encounter a ">", send the value you have accumulated off to a queue for processing on a background thread. Use standard producer/consumer implementation techniques (e.g. the Monitor pattern) to communicate via the queue.
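A minimal Kotlin sketch of that producer/consumer idea, assuming input is the Bluetooth socket's InputStream (buffer size and names are illustrative):
import java.io.InputStream
import java.util.concurrent.LinkedBlockingQueue

val queue = LinkedBlockingQueue<String>()    // producer/consumer hand-off

// Producer: read raw bytes, split on '>' and enqueue complete chunks.
fun produce(input: InputStream) {
    val chunk = StringBuilder()
    val buffer = ByteArray(1024)
    while (true) {
        val read = input.read(buffer)
        if (read == -1) break
        for (i in 0 until read) {
            val c = buffer[i].toInt().toChar()
            if (c == '>') {
                queue.put(chunk.toString())  // hand the finished value to the consumer
                chunk.setLength(0)
            } else {
                chunk.append(c)
            }
        }
    }
}

// Consumer thread: Thread { while (true) { handle(queue.take()) } }.start()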
