Pick multiple images in React Native Expo with ImagePicker - Android

It seems like Expo's ImagePicker only supports selecting one image rather than multiple images. Is there any way to pick multiple images without ejecting from Expo or starting a new react-native init project?

Did you get the link from Oleg working?
I found that library as well and updated the Expo/SDK version because it was too old, but now I get the following message when trying to open the gallery:
Invalid filter option: '(null)'. Expected one of photos, videos or all. Unfortunately, it is unclear to me where exactly this filter should be placed.
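For what it's worth, newer Expo SDK versions added multi-select support directly to expo-image-picker, so ejecting may no longer be necessary. The sketch below is illustrative only and assumes a recent SDK where launchImageLibraryAsync accepts allowsMultipleSelection and returns an assets array (older SDKs use a different result shape); passing an explicit mediaTypes value also avoids any ambiguity about the photos/videos/all filter:

import * as ImagePicker from 'expo-image-picker';

// Sketch: multi-select image picking with a recent expo-image-picker (check your SDK's docs).
async function pickImages() {
  const result = await ImagePicker.launchImageLibraryAsync({
    mediaTypes: ImagePicker.MediaTypeOptions.Images, // explicit filter: images only
    allowsMultipleSelection: true,                   // supported in newer SDKs
    quality: 1,
  });
  if (!result.canceled) {
    // With multiple selection enabled, the picked files arrive as an array of assets
    console.log(result.assets.map(asset => asset.uri));
  }
}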

Related

How to display a TIFF image in a Flutter app on an Android device?

Our application allows users to set up a library of various types of resources such as PDFs, spreadsheets, etc. I.e. just about any MIME type of document, which we store on S3.
When a user clicks to view any of these resources we basically determine if we are on an iOS device or Android. On iOS we use url_launcher to basically show just about anything.
Android is a bit more complex, but if the MIME type is an image we also just use url_launcher. If not, we download the file locally and invoke the OpenFile package to show the result (if we can).
This generally works pretty well, except for TIFF image types, which don't display natively in the browser...
Is there an easy way to show TIFF images in a full Flutter screen (similar to showing them in a browser) on the Android platform? Actually, for image/ MIME types in general, is there an easier way to show them than url_launcher?
The other Stack Overflow question about TIFF on Android does not address this issue: that one is for a native Android app, whereas this is for Flutter, and that solution doesn't lend itself (easily, that I can see) to a Flutter application.
The question is quite old, but I still want to answer it :)
What you can do with the image library is decode the image (whatever format it is in), encode it as a PNG, and then pass that to the Image widget.
Why PNG? Because widgets/image.dart says:
/// This only accepts compressed image formats (e.g. PNG). Uncompressed
/// formats like rawRgba (the default format of [dart:ui.Image.toByteData])
/// will lead to exceptions.
To do this, I had to import the image library as imgLib, otherwise it collided with the Image widget from Flutter:
import 'package:image/image.dart' as imgLib;
....
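// Find a decoder matching the raw bytes, decode, then re-encode as PNG for Image.memory: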
imgLib.Decoder dec = imgLib.findDecoderForData(response.data);
Image.memory(imgLib.encodePng(dec.decodeImage(response.data)))
Then you have an Image widget for displaying it in the UI.

Using Android native image from a NativeScript plugin

I'm writing a NativeScript plugin for the Stripe library. On Android, I can't figure out how to turn a resource ID from the native library into a drawable image.
One of the calls I make returns an ID obtained from R.drawable.xxx. The image is in the native library's resources. I want to turn that into an image I can draw using the <Image> tag.
I have tried this code:
import { android as androidApp } from "application";
let res = ...; // obtained from native library
let image = android.graphics.BitmapFactory.decodeResource(
    androidApp.foregroundActivity.getResources(),
    res);
but decodeResource() returns null. The native library is able to draw the same image, so somehow the resource is getting into the app. I just don't know how to access it from my code.
Note: On iOS this same call returns a UIImage, which is working correctly. I wish Stripe had been more consistent between their iOS and Android APIs!
I figured out the problem. Turns out my diagnosis was incorrect. The resource was being found, but it was a VectorDrawable and decodeResource() is unable to properly decode it.
The technique for converting a VectorDrawable to a Bitmap posted in this question works.
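For reference, here is a minimal sketch of that VectorDrawable-to-Bitmap technique written as NativeScript TypeScript against the Android APIs. The helper name drawableToBitmap is mine; on older API levels you may need ContextCompat.getDrawable() instead of Context.getDrawable():

import { android as androidApp } from "application";

// Sketch: render any Drawable (including a VectorDrawable) into a Bitmap.
function drawableToBitmap(resId: number) {
    const context = androidApp.foregroundActivity;
    const drawable = context.getDrawable(resId); // resolves VectorDrawables too
    const bitmap = android.graphics.Bitmap.createBitmap(
        drawable.getIntrinsicWidth(),
        drawable.getIntrinsicHeight(),
        android.graphics.Bitmap.Config.ARGB_8888);
    const canvas = new android.graphics.Canvas(bitmap);
    drawable.setBounds(0, 0, canvas.getWidth(), canvas.getHeight());
    drawable.draw(canvas);
    return bitmap;
}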

Put image over video and save video to sd card Android

I'm working on a feature in which I want to add a picture over a video and save the result to the SD card.
In general, the user selects an image with a semi-transparent background and places that image over the video; after pressing the save button, the user gets a new video with the image rendered on top of it.
I have heard about ffmpeg and have seen some of the commands it provides, but I don't know where to start. Can anyone provide me with an example of this?
Thank you.
One common approach is to use an ffmpeg wrapper to access ffmpeg functionality from your Android app.
There are several fairly well-used wrappers available on GitHub; the ones below are particularly well featured and documented. (Note: I have not used these myself, as they were not so mature when I was looking at this previously, but if I were doing something like this again now I would definitely build on one of them.)
http://writingminds.github.io/ffmpeg-android-java/
https://github.com/guardianproject/android-ffmpeg
Using one of the well supported and used libraries will take care of some common issues that you might otherwise encounter - having to load different binaries for different processor types, and some tricky issues with native library reloading to avoid crashes on subsequent invocations of the wrapper.
Because this approach uses the standard ffmpeg command-line syntax, you should be able to search for and find help easily on many different operations (anyone using ffmpeg in the 'normal' way will use the same syntax for the ffmpeg command itself).
For example, for your add-an-image case, here are some results from a quick search (ffmpeg syntax can change over time, so it is worth checking against current documentation):
https://stackoverflow.com/a/32250369/334402
https://superuser.com/a/678171
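As an illustration only (the file names are placeholders, and the exact filter syntax is worth verifying against your ffmpeg build), overlaying a PNG on a video typically looks something like this; the same command string is then handed to whichever wrapper you choose:

ffmpeg -i input.mp4 -i overlay.png -filter_complex "overlay=10:10" -codec:a copy output.mp4

Here overlay=10:10 positions the image 10 pixels from the top-left corner of the video, and -codec:a copy keeps the original audio stream untouched.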

Flash Builder embedding files into app like flash IDE?

I'll try to make this simple:
If I create an AIR app from the Flash IDE, I can choose to embed a folder in my package. Then I can load the files using 'app:/'+filename. Everything is OK.
I have to move to Flash Builder because I can't test Workers in the IDE (thanks Adobe). My issue is that if I test/debug from Flash Builder, I get a stream error when calling 'app:/'+filename. If I launch the test in the IDE from FB, it works, but the Workers don't. I should mention that the reason I'm using this method is that I have so many graphical assets it's just easier to maintain/update them this way instead of using [Embed.. ] for every item, and it just works in the IDE...
I've added my folder to my source locations in Flash Builder, but it still seems I cannot use the 'app:/' scheme.
How can I make this work without changing my code, so I can still use 'app:/'? FB is such a confusing program...
Edit: I tested the Workers again in the IDE build launched by FB (the 'test in Flash IDE' icon). I can trace their state with:
worker.start();
worker.addEventListener(Event.WORKER_STATE, this._handleWorkerState);

private function _handleWorkerState(__e:Event):void {
    trace(__e.currentTarget.state);
}
This traces 'new' and then 'running', but for some reason it doesn't send or receive any data on any message channel. That part, again, works in FB 4.7 when I run a debug, but then it doesn't find my files...
Error #2044: Unhandled ioError:. text=Error #2032: Stream Error. URL: app:/foldername..
So basically, I'm looking for a solution to at least one of my problems :)
EDIT :
So, OK. One issue turned out to be the wrong debugger version being installed (that was the Workers part), so I can now work and compile in the IDE again. I haven't found an answer to why 'app:/' doesn't work from FB 4.7, so that is the remaining question.
One option, since you have the Flash IDE, is to create a library with all of your images. Drop all your images into the library in Flash and export them for ActionScript, then publish and create a SWC. You can then use the SWC, which is kind of like a zip file for display objects, in Flash Builder and access the images like:
var mc:MovieClip = new imageExportedForAS3_1();
Create a top-level folder in your Flex project called, for example, images, copy all of your images into that folder, and then every time you need to load an image just use the source attribute with the path, for example:
<mx:Image source="@Embed(source='../images/pic.png')" />
I have never used the app:/ scheme before! Good luck!

Using the GPUImageFilter library in Android

I'm new to Android. After a long search, I tried some Java code to implement filter effects like Instagram's; those Java filters work, but they are too slow, so a week ago I stopped using them. After more searching I found the GPUImageFilter library (on GitHub) and started working with it. The library's filter effects work very well, produce some good results, and I was able to integrate it into my project. The next problem arises after saving the filtered image to the SD card: if I try to fetch that file from the SD card again, it shows nothing but a blank screen, even though the file is saved on the device's SD card and opens in the gallery. Can anyone tell me how to overcome this problem? Here is the link where you can find the lib:
link:-GPUImageFilter library
Any help will be very helpful. Thanks, and waiting for a reply.
