I am trying to pass an image as a Uint8List from Flutter to native Android/iOS code, but I'm getting an error on the iOS side. I am modifying a plugin, and I have never developed for iOS before.
Here is the Flutter code that captures an image from a widget and sends the Uint8List to native code:
Future<void> _capturePng() async {
  RenderRepaintBoundary boundary =
      globalKey.currentContext.findRenderObject();
  ui.Image image = await boundary.toImage(pixelRatio: 1.50);
  ByteData byteData = await image.toByteData(format: ui.ImageByteFormat.png);
  Uint8List pngBytes = byteData.buffer.asUint8List();
  printer.printImage(pngBytes);
}
In Android, I am using Kotlin:
private fun printImage(call: MethodCall, result: Result) {
  val image = call.arguments as ByteArray
  val res = mPrinterPlugin.printImage(image)
  result.success(res)
}
In iOS, the plugin is written in Objective-C. I added this check for my image:
else if ([@"imagePrint" isEqualToString:call.method]) {
  NSData *imgBytes = call.arguments;
  UIImage *label = [UIImage imageWithData:imgBytes];
}
I saw that a Uint8List comes through on iOS as FlutterStandardTypedData (typedDataWithBytes:), but when I set the type of imgBytes to that I get an error.
You should take the first argument as typed data, probably:
NSArray *args = call.arguments;
FlutterStandardTypedData *list = args[0];
Kotlin:
val args: List<Any> = call.arguments as List<Any>
val list = args[0] as ByteArray
And you invoke it like this:
Uint8List uint8List;
_channel.invokeMethod("your_function_name", [uint8List]);
I just dealt with this. Letting Xcode do its magical conversion was burning me. For instance, in your iOS plugin implementation, the line:
NSData *imgBytes = call.arguments;
When I was doing this, the Xcode debugger would show the data coming through as a FlutterStandardTypedData (as I passed in a list of ints), but every time I tried to access an element/object it would give me memory errors. It was as if Xcode was honoring my declared parameter type (NSArray) at runtime while showing the type as FlutterStandardTypedData in the debugger. Seemed like a bug in the IDE, to be honest.
Just as the upvoted answer shows, I resolved it by using the Flutter data type:
FlutterStandardTypedData *bytesList = call.arguments[@"listOfBytes"];
I want to take a screenshot of any widget and then share it to WhatsApp or any other app. The screenshot widget gives me the image as a Uint8List, but to share it with the Flutter_Share plugin I need to convert it into an image file. How do I do that?
controller
    .capture(delay: Duration(milliseconds: 10))
    .then((capturedImage) async {
  final imagePath = await File('/image.png').create();
  await imagePath.writeAsBytes(capturedImage!);
  await Share.shareFiles([imagePath.path]);
}).catchError((onError) {
  print(onError);
});
Here I am trying to write it to an image path, but I get a "FormatException: Unexpected extension byte (at offset 0)" error. I have also tried:
File img = File.fromRawPath(imageFile!);
but I still got an error.
You can use Image.memory:
Image.memory(listHere);
https://api.flutter.dev/flutter/widgets/Image/Image.memory.html
Is it possible to run ADB inside an Android Flutter application without root?
There is an application that does this, and it works as expected:
https://github.com/nightmare-space/adb_tool
It uses an ADB binary compiled for Android, as we can see here:
https://github.com/nightmare-space/adb_tool/tree/main/assets/android
However, I can't figure out how it works, even with the full source.
ADB is installed to an unusual location.
I tried to replicate this, but it doesn't work at all:
final deposit = (await getApplicationDocumentsDirectory()).path;
const program = '/data/data/com.example.my_app/files/usr/bin/adb';
ByteData data = await rootBundle.load('assets/android/adb');
List<int> bytes =
    data.buffer.asUint8List(data.offsetInBytes, data.lengthInBytes);
var file = File(program);
file = await file.create(recursive: true);
await file.writeAsBytes(bytes);
await Process.run('chmod', ['+x', program]);
var results = await Process.run(program, ['--version']); // Permission error.
setState(() {
  _counter += 1;
  textarea.text = results.stdout;
});
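For reference, the extract / chmod / execute sequence attempted above can be reproduced on a plain JVM. The sketch below is hypothetical (the class name ExtractAndExec is made up, and a shell script stands in for the adb binary), but it shows the same write / mark-executable / run steps. Note that on Android 10 and later, apps targeting API 29+ are generally no longer allowed to exec() files in their writable data directory at all, which would explain the permission error; tools like adb_tool typically work around this by shipping the binary so that it ends up in the app's native library directory, which remains executable.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ExtractAndExec {
    // Writes an embedded "binary" (here: a shell script standing in for adb)
    // to a private directory, marks it executable, and runs it.
    public static String run() throws IOException, InterruptedException {
        Path dir = Files.createTempDirectory("bin");
        Path program = dir.resolve("fake-adb");
        byte[] payload = "#!/bin/sh\necho 1.0.41\n".getBytes();
        Files.write(program, payload);
        // File.setExecutable replaces shelling out to chmod; it has no
        // effect on filesystems mounted noexec, which is exactly the
        // restriction newer Android versions impose on app data dirs.
        program.toFile().setExecutable(true);
        Process p = new ProcessBuilder(program.toString()).start();
        String out = new String(p.getInputStream().readAllBytes()).trim();
        p.waitFor();
        return out;
    }
}
```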
I'm trying to convert an image taken from resources to a ByteArray, which will later be sent through a socket. I've been measuring the time each conversion takes.
I've done it on both Flutter and native Android (Kotlin). All of the tests were done on the same image, which was about 1-2 MB.
Flutter code:
sendMessage() async {
  if (socket != null) {
    Stopwatch start = Stopwatch()..start();
    final imageBytes = await rootBundle.load('assets/images/stars.jpg');
    final image = base64Encode(imageBytes.buffer
        .asUint8List(imageBytes.offsetInBytes, imageBytes.lengthInBytes));
    print('Converting took ${start.elapsedMilliseconds}');
    socket.emit("message", [image]);
  }
}
Kotlin code:
private fun sendMessage() {
  var message = ""
  val thread = Thread(Runnable {
    val start = SystemClock.elapsedRealtime()
    val bitmap = BitmapFactory.decodeResource(resources, R.drawable.stars)
    message = Base64.encodeToString(getBytesFromBitmap(bitmap), Base64.DEFAULT)
    Log.d("Tag", "Converting time was : ${SystemClock.elapsedRealtime() - start}")
  })
  thread.start()
  thread.join()
  socket.emit("message", message)
}
private fun getBytesFromBitmap(bitmap: Bitmap): ByteArray? {
  val stream = ByteArrayOutputStream()
  bitmap.compress(Bitmap.CompressFormat.JPEG, 100, stream)
  return stream.toByteArray()
}
I was actually expecting the native code to be much, much faster than Flutter's, but that's not the case: the conversion takes about 50 ms in Flutter and around 2000-3000 ms natively.
I thought threading might be the cause, so I tried running the conversion on a background thread in the native code, but it didn't help.
Can you please tell me why there is such a difference in time, and how I can implement it better in native code? Is there a way to skip the conversion to Bitmap etc.? Maybe that is what takes so long.
EDIT: added the getBytesFromBitmap function.
The difference you see is that in the Flutter code you just read your data without any image decoding, while in Kotlin you first decode the resource to a Bitmap and then compress() it back. If you want to speed it up, simply get an InputStream by calling Resources#openRawResource and read your image resource without any decoding.
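To make the comparison concrete, here is the raw-bytes path sketched in plain Java (the RawEncode name is made up for illustration): Base64-encoding the stream directly touches no image codec at all, which is effectively what the Flutter version does with rootBundle.load(), and what Resources#openRawResource gives you on Android.

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Base64;

public class RawEncode {
    // Encodes a resource stream to Base64 without ever decoding it as an
    // image -- no BitmapFactory.decodeResource(), no Bitmap.compress().
    public static String encode(InputStream in) throws IOException {
        byte[] raw = in.readAllBytes();
        return Base64.getEncoder().encodeToString(raw);
    }
}
```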
It has something to do with the way you convert it to bytes... Can you please post your getBytesFromBitmap function? Also, the conversion in native code really should be done on a background thread; please post your results for that case.
I am working on a Xamarin app that has images stored in an S3 bucket. The querying works correctly in Xamarin when using a correctly constructed URL:
https:// + BucketName + path + ".jpg?AWSAccessKeyId=keycode&Expires=expireNumber&Signature=signatureCode"
When using
Image.Source = urlAddress (in the above format)
the image loads fine.
Some of the app's pages have custom renderers with images that need to be rendered from a URL address. We are updating the images via URL at each OS level. iOS works correctly using the following code:
using (var url = new NSUrl(uri))
using (var data = NSData.FromUrl(url))
    if (data != null)
        return UIImage.LoadFromData(data);
which successfully gets the image from the URL and updates it. However, I am having major issues getting it to work on Android. I have tried the following approaches:
Making a basic Android URL and setting the ImageView with the following code, which has been explained not to work here: https://forums.xamarin.com/discussion/4323/image-from-url-in-imageview
Android.Net.Uri uri = Android.Net.Uri.Parse(url);
imageView.SetImageURI(uri);
On that same link, user 'rmacias' suggested using WebClient to download the data from the URL and parse the bytes into an Android Bitmap:
private Bitmap GetImageBitmapFromUrl(string url)
{
    Bitmap imageBitmap = null;
    using (var webClient = new WebClient())
    {
        var imageBytes = webClient.DownloadData(url);
        if (imageBytes != null && imageBytes.Length > 0)
        {
            imageBitmap = BitmapFactory.DecodeByteArray(imageBytes, 0, imageBytes.Length);
        }
    }
    return imageBitmap;
}
This returns a 403 Forbidden error at the line var imageBytes = webClient.DownloadData(url).
However, the same process works on iOS: the string is already authenticated, and I have set the authentication timeout to several minutes in case of a slow load. I have also tried the same URL-requesting method with the .Net.Http library.
It crashes at res = (HttpWebResponse)request.GetResponse(); with the same 403 Forbidden error.
I have tried multiple things with header authentication for the WebClient and HttpClient. It feels like something specific about how Android requests URL data, because the authentication in the URL string works for the Xamarin images and in the iOS code.
I'm thinking there is something specific to Android that I am missing. Help is much appreciated!
How about using HttpClient, which can leverage the platform specific HttpClientHandler's which Xamarin provides?
So something like:
// Make sure to reuse your HttpClient instance; it is a shared resource.
// Using it in a using() and disposing it all the time will leave
// sockets open and bog down the connection!
private static HttpClient _httpClient;

public async Task<byte[]> GetImageDataAsync(string url)
{
    if (_httpClient == null)
    {
        // You could inject AndroidHttpClientHandler or NSUrlSessionHandler here...
        _httpClient = new HttpClient();
        // Set headers etc...
    }

    var response = await _httpClient.GetAsync(url).ConfigureAwait(false);
    if (!response.IsSuccessStatusCode)
        return null;

    var result = await response.Content.ReadAsByteArrayAsync().ConfigureAwait(false);
    return result;
}
Then you can use this platform-agnostically, e.g. on Android:
var data = await GetImageDataAsync(url);
imageBitmap = BitmapFactory.DecodeByteArray(data, 0, data.Length);
And on iOS:
var data = await GetImageDataAsync(url);
var imageData = NSData.FromArray(data);
imageBitmap = UIImage.LoadFromData(imageData);
There are also nice libraries, such as FFImageLoading, which support this out of the box, with effects, loading of images in TableViews etc., which you can consider as an alternative.
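For comparison, here is a minimal sketch of the same shared-client pattern in plain Java (java.net.http, Java 11+; the ImageFetcher name is made up): one client reused across requests, a byte-array body handler, and null on a non-success status, mirroring the C# answer above.

```java
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ImageFetcher {
    // One shared client, as in the C# answer: creating a new client per
    // request leaks sockets under load.
    private static final HttpClient CLIENT = HttpClient.newHttpClient();

    public static byte[] getImageData(String url)
            throws IOException, InterruptedException {
        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
        HttpResponse<byte[]> response =
                CLIENT.send(request, HttpResponse.BodyHandlers.ofByteArray());
        if (response.statusCode() != 200) {
            return null; // mirror the C# answer's behaviour on failure
        }
        return response.body();
    }
}
```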
I am using react-native for an Android app, and axios as the HTTP library. When I try to send a Blob object through an HTTP POST, I get the error below:
HTTP Failure in Axios TypeError: One of the sources for assign has an enumerable key on the prototype chain. Are you trying to assign a prototype property? We don't allow it, as this is an edge case that we do not support. This error is a performance optimization and not spec compliant.
Below is the code I use to append the blob object to the form data:
let data = new FormData()
data.append('image', decodeBase64Image(image));
Below is the code that decodes the base64 image. This code works fine in one of my website applications.
export const decodeBase64Image = (dataURI) => {
  let byteString;
  if (dataURI === undefined) {
    return undefined
  }
  if (dataURI.split(',')[0].indexOf('base64') >= 0)
    byteString = atob(dataURI.split(',')[1]);
  else
    byteString = unescape(dataURI.split(',')[1]);

  // separate out the MIME component
  let mimeString = ''
  if (dataURI.split(',')[0] != undefined && dataURI.split(',')[0].split(':')[1] != undefined) {
    mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0]
  }

  // write the bytes of the string to a typed array
  let ia = new Uint8Array(byteString.length);
  for (let i = 0; i < byteString.length; i++) {
    ia[i] = byteString.charCodeAt(i);
  }
  return new Blob([ia], {type: mimeString});
}
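For comparison, the header-split and base64-decode steps of the helper above can be sketched in plain Java (the DataUri name is made up); nothing here is React Native-specific, which underlines that the failure is in how the resulting Blob is handled, not in the decoding itself.

```java
import java.util.Base64;

public class DataUri {
    // Minimal data-URI parsing doing the same job as the JS helper:
    // split off the "data:<mime>;base64" header, keep the MIME type,
    // and base64-decode the payload after the comma.
    public static String mime(String dataUri) {
        String header = dataUri.split(",")[0]; // e.g. "data:image/png;base64"
        return header.substring(header.indexOf(':') + 1).split(";")[0];
    }

    public static byte[] bytes(String dataUri) {
        return Base64.getDecoder().decode(dataUri.split(",")[1]);
    }
}
```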
The root of the problem is that the React Native devs made a performance optimization that is not spec-compliant (which is why the code works on your website, but not your React Native app). For more details, see the issue I opened here: https://github.com/facebook/react-native/issues/16814
As a workaround, you can use react-native-fetch-blob. I ran into the same error you did, and react-native-fetch-blob solved it for me.