[React Native Android] Resize image and upload to backend server

The photos captured with the device are big. I want to upload them to my backend server after resizing (scaling) them down to more reasonable dimensions (less than 800x800). I hoped to use the ImageEditor module's cropImage() function, but running it on a large image results in an OutOfMemoryError. I assume the app crashes because the module decodes the full image and holds it in memory.
What I need is the following:
Input
{
  width: 3100,
  height: 2500,
  uri: content://android/1 (some location on the Android device)
}
Output
{
  width: 800,
  height: 650,
  uri: content://android/1/resized (some location on the Android device)
}
Then I can grab this uri to send the picture to my backend server, and delete the resized photo from the device.
I assume that I will have to write a NativeModule so I can resize an image without loading the full decoded image into memory. React Native's Image component uses Fresco to handle resizing before rendering, but I don't think it provides a way to resize an image and temporarily save it to the filesystem.
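For reference, the 800x650 output above is just the input scaled to fit inside an 800x800 box. A small plain-JS sketch of that arithmetic (the helper name is mine, not from any module):

```javascript
// Sketch: compute target dimensions that fit inside maxW x maxH while
// preserving the source aspect ratio (never upscales).
function fitWithin(width, height, maxW, maxH) {
  const scale = Math.min(maxW / width, maxH / height, 1);
  return {
    width: Math.round(width * scale),
    height: Math.round(height * scale),
  };
}
```

For the 3100x2500 input this gives 800x645, which the question rounds to 650.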
Any help would be appreciated.
References:
https://developer.android.com/training/displaying-bitmaps/load-bitmap.html
http://frescolib.org/docs/resizing-rotating.html
https://facebook.github.io/react-native/docs/images.html
Memory efficient image resize in Android

The Expo library has an image manipulator that can resize and more:
import { ImageManipulator } from 'expo';
...
const manipResult = await ImageManipulator.manipulate(
  imageUri,
  [{ resize: { width: 640, height: 480 } }],
  { format: 'jpg' }
);
manipResult is an object with the new uri, width and height.
Find it here:
https://docs.expo.io/versions/latest/sdk/imagemanipulator.html

In my app I use react-native-image-picker which works really well.
If you don't want to use it, have a look at this function's source to see how resizing is done.
Cheers.

Did you try react-native-image-resizer? It works pretty well for me; I'm using it with react-native-camera. It looks like this:
fromCamera() {
  const newWidth = 800;
  const newHeight = 650;
  const rotation = 0; // rotation was undefined in the original snippet
  this.refs.camera.capture()
    .then((data) => {
      ImageResizer.createResizedImage(data.path, newWidth, newHeight, 'JPEG', 100, rotation)
        .then((uri) => {
          // send to backend
        });
    });
}
The original image is saved on the device at data.path, so there are no memory problems.

Let's check this out. It gives a detailed description of resizing and uploading to a backend server for your issue.

In case you want to send a Base64 string, you can check the code below.
For how to install and link it on Android, please check this link:
https://github.com/TBouder/react-native-asset-resize-to-base64
// this code is for resizing and pushing to an array only
let images = [];
NativeModules.RNAssetResizeToBase64.assetToResizedBase64(
  response.uri,
  newWidth,
  newHeight,
  (err, base64) => {
    if (err) {
      console.log(err, "errors");
      return;
    }
    if (base64 != null) {
      const imgSource = { uri: base64 };
      this.setState({
        image: this.state.image.concat(imgSource)
      });
      this.state.image.forEach((item) => {
        images.push(item);
      });
    }
  }
);

Related

React Native image stream from a server - fps and flickering

In my app I want to stream images from a live server and show them in the UI as an ImageBackground component, so that it feels like a video even though it is not - just a fast-changing image received as base64 over a WebSocket.
The problem is that I'm not sure about the solution I've made, because:
I'm pretty sure that a higher frame rate will make the app lag (it's 4-5 fps now; what about 30?)
When the image updates there's a flickering effect, which doesn't feel good (example on video)
Check how it currently works on video: video here
Code (View):
const liveFrameRef = useRef<Image | null>(null);
return (
  <ImageBackground
    fadeDuration={0}
    style={styles.stream}
    source={require("../../../../assets/images/camera_preview.jpg")}
    imageRef={(image) => {
      liveFrameRef.current = image;
    }}
  >
  </ImageBackground>
)
Code (Stream in useEffect):
// data - my current image received from the socket as binary
if (liveFrameRef.current) {
  let bytes = new Uint8Array(data);
  let binary = '';
  let len = bytes.byteLength;
  for (let i = 0; i < len; i++) {
    binary += String.fromCharCode(bytes[i]);
  }
  const image = "data:image/jpg;base64," + btoa(binary); // b64-encoded JPG
  liveFrameRef.current.setNativeProps({
    src: [{ uri: image }],
  });
}
So basically I get the ref of my Image and change the source on each frame using setNativeProps. That's how I did it in plain React with HTML, and it worked fine; here it doesn't. Any better ideas for working around this?
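One per-frame cost is the byte-by-byte string building above. A chunked conversion (plain-JS sketch; the function name is hypothetical) does the same work with far fewer string concatenations:

```javascript
// Sketch: convert a binary frame (Uint8Array) to a base64 data URI,
// processing the bytes in chunks instead of one character at a time.
function bytesToDataUri(bytes, mime = "image/jpeg") {
  const CHUNK = 0x8000; // stay below the engine's max argument count
  let binary = "";
  for (let i = 0; i < bytes.length; i += CHUNK) {
    binary += String.fromCharCode.apply(null, bytes.subarray(i, i + CHUNK));
  }
  return "data:" + mime + ";base64," + btoa(binary);
}
```

This only reduces CPU work per frame; it does not address the flicker itself.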
Thanks.

Android GPUImage setImage and getBitmapWithFilterApplied cause screen to flicker

I have been using this GitHub repo as the starting point for my code: https://github.com/xizhang/camerax-gpuimage
The code is a way to show the camera view with GPUImage filters on it.
I want to be able to also analyze the bitmap with the filter applied on it to get some analytics (percent red/green/blue in the image).
I have been successful in showing the default camera view to the user as well as the filter I created.
By commenting out the setImage line, I have been able to get the analytics of the filtered image, but when I try to do both at the same time the screen flickers. I changed the startCameraIfReady function to get the filtered image as follows:
@SuppressLint("UnsafeExperimentalUsageError")
private fun startCameraIfReady() {
    if (!isPermissionsGranted() || cameraProvider == null) {
        return
    }
    val imageAnalysis = ImageAnalysis.Builder()
        .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
        .build()
    imageAnalysis.setAnalyzer(executor, ImageAnalysis.Analyzer {
        var bitmap = allocateBitmapIfNecessary(it.width, it.height)
        converter.yuvToRgb(it.image!!, bitmap)
        it.close()
        gpuImageView.post {
            // These two lines conflict, causing the screen to flicker:
            // I can comment out one or the other and it works great,
            // but running both at the same time causes issues
            gpuImageView.gpuImage.setImage(bitmap)
            val filtered = gpuImageView.gpuImage.getBitmapWithFilterApplied(bitmap)
            /*
            Analyze the filtered image...
            Print details about the image here
            */
        }
    })
    cameraProvider!!.unbindAll()
    cameraProvider!!.bindToLifecycle(this, CameraSelector.DEFAULT_BACK_CAMERA, imageAnalysis)
}
When I try to get the filtered bitmap, it seems to conflict with the setImage line of code and cause the screen to flicker as shown in the video below. I can either show the preview to the user, or I can analyze the image, but not both at the same time. I have tried running them synchronized as well as each on their own background thread. I have also tried adding another Image Analyzer and binding it to the camera lifecycle (one for the preview and the other to get the filtered bitmap), the screen still flickers but less often.
https://imgur.com/a/mXeuEhe
If you are going to get the Bitmap with the filter applied, you don't even need the GPUImageView. You can just get the Bitmap and then set it on a regular ImageView. This is how your analyzer should look:
ImageAnalysis.Analyzer {
    var bitmap = allocateBitmapIfNecessary(it.width, it.height)
    converter.yuvToRgb(it.image!!, bitmap)
    it.close()
    val filteredBitmap = gpuImage.getBitmapWithFilterApplied(bitmap)
    regularImageView.post {
        regularImageView.setImageBitmap(filteredBitmap)
    }
}
Please be aware that the original GitHub sample is inefficient, and the sample above is even worse, because they convert the output to a Bitmap before feeding it back to the GPU. For the best performance when augmenting the preview stream, please see CameraX's core test app for how to access the preview Surface via OpenGL.

Google Maps custom marker icon size not preserved on Android

I'm using the nativescript-google-maps-sdk plugin to create a Google map.
Everything works fine, but I've got a problem with my custom marker icons: if you look at these pictures you can see that the icon size is not preserved on Android, making them so small you can barely even see them. This happens both in the emulators and on a real phone.
On iOS, however, the size is fine, as you can see in the 2nd image. The icon images are 16x16 pixels and in .png format.
I haven't been able to find any solution to this so this is my last resort, does anyone know why this might be happening?
This is the code I use to create the markers:
getImage(this.getWarningIcon(warning.status)).then((result) => {
  const icon = new Image();
  icon.imageSource = result;
  const marker = new Marker();
  marker.position = warning.centerOfPolygon;
  marker.icon = icon;
  marker.flat = true;
  marker.anchor = [0.5, 0.5];
  marker.visible = warning.isVisible;
  marker.zIndex = zIndexOffset;
  marker.infoWindowTemplate = 'markerTemplate';
  marker.userData = {
    description: warning.description,
    startTime: warning.startTime,
    completionTime: warning.completionTime,
    freeText: warning.freeText
  };
  this.layers.push(marker);
  this.map.addMarker(marker);
});
In that case, 16px sounds too small for a high-density device. Increase the size of the image sent from the server, or locally resize the image before passing it to the marker.
You may also consider generating a scaled bitmap natively if you are familiar with the Android APIs. Image processing is always complicated on Android; using drawables is recommended, at least when your images are static.
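For intuition: Google Maps on Android sizes marker icons in raw pixels, so a 16x16 image covers only 16 physical pixels regardless of screen density. A plain-JS sketch of the scaling involved (helper name is hypothetical):

```javascript
// Sketch: a 16dp icon needs roughly density * 16 physical pixels, so a
// 16x16 px bitmap looks tiny on a density-3 (xxhdpi) screen.
function iconPixelSize(dp, density) {
  return Math.round(dp * density);
}
```

For example, iconPixelSize(16, 3) gives 48, which is why a 48x48 source image renders at the intended size on an xxhdpi device.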

Fastest Way To Display 100 or more images in Flutter

The purpose of this question is to find the best way to display a lot of images (20 or more) from the gallery.
I need to display all the photos on the device. As you can see, there could be a lot of images to display, and when I try to display them all with separate Image.file() widgets, my phone (LG G2) gets slow and then the app completely crashes.
I think the problem is that I am loading a lot of 4K images (more than 100) on a 5-year-old device. Only around 15 images are actually displayed on the screen at the same time, but I need something like a gallery, so all the images go in a GridView.
So I don't think that's the best option; maybe there is a function to downscale the image to something like 100x100 px, but looking online I couldn't find anything. Is there a better way?
I get all the Image Paths and their folder using a Platform Channel to Android.
I get them all in a list like this:
[["Camera", "storage/emulated/0/downloads/1.png"], etc]
This is the UI I've got (sorry for the awful design).
There are three main components on the screen: the IconButton with the two arrows, the DropDownMenu, and the GridView.
The IconButton re-queries all the paths from Android, in the list format stated above.
The DropDownMenu's onChanged callback creates all the Cards in the list used by the GridView.
This is the function:
void _onChanged(String value) {
  setState(() {
    _currFolder = value;
    _photoCards = calculatePhotoCards();
  });
}
And the calculatePhotoCards is this:
List<Widget> calculatePhotoCards() {
  List<Widget> list = new List();
  for (int v = 0; v < _foldersAndPaths.length; v++) {
    if (_foldersAndPaths[v][0] == _currFolder) {
      list.add(Card(
        color: Colors.white,
        shape: RoundedRectangleBorder(
            borderRadius: BorderRadius.all(Radius.circular(20.0))),
        child: Padding(
          padding: const EdgeInsets.all(8.0),
          child: new ClipRRect(
            borderRadius: new BorderRadius.circular(8.0),
            child: Image.file(
              File(_foldersAndPaths[v][1]),
              fit: BoxFit.cover,
            ),
          ),
        ),
      ));
    }
  }
  print(list);
  return list;
}
Where _foldersAndPaths is the list containing all the paths with their respective folders (as stated above), and _currFolder is the folder selected in the DropDownMenu.
You need to add cacheWidth and cacheHeight
example:
Image.network(
  shop.shopPic,
  width: 46,
  height: 46,
  cacheWidth: 46,
  cacheHeight: 46,
)
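cacheWidth/cacheHeight control the size the image is decoded at. They are in physical pixels, so for crisp results you would normally multiply the logical size by the device pixel ratio (in Flutter, MediaQuery.of(context).devicePixelRatio). The arithmetic, sketched in plain JS with a hypothetical helper:

```javascript
// Sketch: decode size in physical pixels for a given logical widget size.
function cacheDims(logicalW, logicalH, devicePixelRatio) {
  return {
    cacheWidth: Math.ceil(logicalW * devicePixelRatio),
    cacheHeight: Math.ceil(logicalH * devicePixelRatio),
  };
}
```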
I decided to implement this to see if it was something to do with image sizing etc, as there have been a few questions about that in the past for remote images.
When I implemented it, I ran into the exact same problem as you. I thought that Flutter cached images scaled to a reasonable size, but that doesn't seem to be the case. It's actually loading each image into memory at full size, which is why you're getting a crash. It doesn't take many images to run out of memory, because by the time they're decoded into pixels they're pretty huge.
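To put a number on "pretty huge": a decoded bitmap takes about width * height * 4 bytes in RGBA. A quick sketch of the arithmetic:

```javascript
// Rough decoded-bitmap memory in megabytes (4 bytes per RGBA pixel).
function decodedMB(width, height) {
  return (width * height * 4) / (1024 * 1024);
}
```

A single 4000x3000 photo decodes to roughly 46 MB, so a grid of a few dozen full-size images easily exhausts an older phone's memory.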
Here and here are bugs about this (I raised one myself) against flutter.
So unfortunately, I don't think there's a pure Flutter way of doing this at the moment. There is an image package on pub that can scale down images, but it does so directly in Dart code, so it isn't all that performant.
Until those bugs are fixed, you may have to look for an Android-specific (and iOS-specific, if you're interested in that) solution. That would entail using the Android thumbnail creator to create a bitmap, then either saving that bitmap and passing its URL back to Flutter, or passing the bitmap's bytes directly.
Sorry I don't have a better answer than that!
I'm using CachedNetworkImage, and I had an issue where loading multiple images pushed the app to over 3GB of RAM. The solution was:
CachedNetworkImage(
  height: height,
  width: width,
  memCacheHeight: height, // add this line
  memCacheWidth: width, // add this line
  imageUrl: imageUrl,
),
It reduced the memory usage from 3GB to 60MB.

Crop Image in Titanium in the Preview Screen

I am creating an application which uses the camera/gallery. When I take a photo with the camera on iOS, the device automatically displays a preview screen that lets me move and scale the image as required. On Android, I manually created a preview window.
But I want to crop the image to a resolution of 610x320 pixels.
Here is the code for taking image
Ti.Media.showCamera({
  success: function(event) {
    if (event.mediaType == Ti.Media.MEDIA_TYPE_PHOTO) {
      var image = event.media;
      var ImageFactory = require('ti.imagefactory');
      var newBlob = ImageFactory.imageAsCropped(image, { width: 610, height: 320 });
      imgvwCapturedImage.image = newBlob; // imgvwCapturedImage is an image view
    }
  },
  cancel: function() {},
  error: function(error) {
    alert("Sorry, unable to process now. Please retry later.");
  },
  saveToPhotoGallery: true,
  allowEditing: true,
  mediaTypes: [Ti.Media.MEDIA_TYPE_PHOTO]
});
I was able to crop the image using the imagefactory module, but only after selecting the photo from the preview screen. Is there any way to do the same on the preview screen itself, so that the user can see which area is getting cropped?
Any help will be appreciated.
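Not a full answer, but if you end up cropping manually: a centered 610x320 crop rectangle is easy to compute, and (if I read the ti.imagefactory docs correctly) imageAsCropped also accepts x and y offsets. A plain-JS sketch:

```javascript
// Sketch: compute a crop rectangle centered in the source image,
// clamped so it never exceeds the source bounds.
function centeredCrop(srcW, srcH, cropW, cropH) {
  return {
    x: Math.max(0, Math.round((srcW - cropW) / 2)),
    y: Math.max(0, Math.round((srcH - cropH) / 2)),
    width: Math.min(cropW, srcW),
    height: Math.min(cropH, srcH),
  };
}
```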
Did you try an overlay? Just create a resizable view that the user can manipulate (to select a portion of the image) and add it to CameraOptionsType.
http://docs.appcelerator.com/titanium/latest/#!/api/CameraOptionsType-property-overlay
I created my own preview screen for iOS and cropped the image with the help of a scrollView and the imagefactory module. Now it works perfectly. You may find sample code here. However, this will not work on Android devices.
