I'm writing an Android and iOS app using Appcelerator Titanium, and I can't find a way to pixelate an image. The app needs to pixelate a given image using a parameter supplied by the user (the greater the number, the larger the pixels). I have found a way to do it with Xcode for iOS and with the Android SDK for Android, but if possible I would like to do it in Titanium to avoid writing the whole app twice, once for Android and once for iOS.
Is there a way to do it?
Thank you.
If you have a native way to do it on both iOS and Android, you should wrap each implementation as a native module and then include them in the project.
Follow this guide on the Community wiki:
https://wiki.appcelerator.org/display/guides2/Creating+a+New+Titanium+Module
Then you can write a function that wraps the modules and returns the processed object, e.g.:
var processImage = function() {
    if (Titanium.Platform.name === 'android') {
        // Android implementation
        var imageProcess = require('ti.imageProcess');
        return imageProcess.doImage('/voo/bar' /*, more options */);
    } else {
        // iOS implementation
    }
};
Instead of writing and maintaining two modules, you could use a WebView with a JS library or the canvas object to pixelate the image.
A JS canvas solution to this can be found here:
https://stackoverflow.com/a/19129822/2132015
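The core of that canvas approach can be sketched as a plain pixel loop, independent of the DOM. Below is a minimal, hypothetical sketch (the function name and the nearest-neighbour block-sampling choice are mine, not from the linked answer); `pixels` stands for the flat RGBA array you would get from `getImageData().data`:

```javascript
// Pixelate a flat RGBA pixel array by nearest-neighbour block sampling:
// every pixel in a `size` x `size` block takes the colour of the block's
// top-left pixel. The larger `size` is, the larger the resulting "pixels".
function pixelate(pixels, width, height, size) {
  const out = pixels.slice(); // copy; the input stays untouched
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      // top-left corner of the block this pixel belongs to
      const sx = Math.floor(x / size) * size;
      const sy = Math.floor(y / size) * size;
      const src = (sy * width + sx) * 4; // 4 bytes per pixel (RGBA)
      const dst = (y * width + x) * 4;
      for (let c = 0; c < 4; c++) {
        out[dst + c] = pixels[src + c];
      }
    }
  }
  return out;
}
```

In a WebView you would run this over the canvas image data and write the result back with putImageData().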
Seems like basic stuff:
1) there's an image defined within an RCC file
<RCC>
    <qresource prefix="/">
        <file>main.qml</file>
        <file alias="image.png">images/image.png</file>
    </qresource>
</RCC>
2) the image is referenced within a QML file
Image {
source: "images/image.png"
}
The image is recognized and displayed fine within Qt Creator's GUI environment. BUT, when deployed to Android there's a runtime error from the .so:
armeabi-v7a.so: qrc:/main.qml:76:9: QML Image: Cannot open: qrc:/images/image.png
Now, considering that I want to do this the 'proper' way, i.e. rely on Qt's multi-platform capabilities and resource management and not do anything hacky by tinkering with the android folder, what is the proper way?
Also, if Qt introduces a multi-platform resource-abstraction layer over each platform, how does it handle various resolutions? For instance, on Android there is typically a separate folder for each DPI range. Does it do any kind of automatic scaling/conversion? (Obviously not, but then how do I provide these various bitmaps to Qt?)
This issue is caused by how resource URLs are resolved in the device environment, so you should use the full qrc URL in your code.
Try the following:
Image { source: "qrc:/images/image.png" }
I am currently converting my apps, written in Swift and Java, to Flutter. Right now I have 5 apps for Android and iOS, all sharing the same code (5 in Swift, 5 in Java), and each one has different assets: images, strings, API URLs, etc. For the iOS apps I currently create different targets in Xcode, with different user-defined variables that I use in the code, and then I choose which target to build and send to the corresponding iTunes Connect app. On Android I do more or less the same, using Android flavors.
My question is: how can I do this in Flutter without being forced to create a different Flutter project for each app I want to build?
Any ideas on what approach should I use?
I use a custom build script that creates a symlink depending on the flavor name.
From my Grinder build script:
Future<void> _setTenant(Tenant tenant) async {
  const symlinkPath = 'assets/tenant';
  final link = Link(symlinkPath);
  if (link.existsSync() &&
      link.targetSync() == '../assets/${tenant.identifier}') {
    return;
  }
  if (link.existsSync()) {
    link.updateSync('../assets/${tenant.identifier}');
  } else {
    Link(symlinkPath).createSync('../assets/${tenant.identifier}');
  }
}
Tenant is a custom class and Tenant.identifier returns a string that is valid as a directory/symlink name.
I created https://github.com/flutter/flutter/issues/21682 to get direct support for that in Flutter.
I'm writing a NativeScript plugin for the Stripe library. On Android, I can't figure out how to turn a resource ID from the native library into a drawable image.
One of the calls I make returns an ID obtained from R.drawable.xxx. The image is in the native library's resources. I want to turn that into an image I can draw using the <Image> tag.
I have tried this code:
import { android as androidApp } from "application";

let res = ...; // obtained from native library
let image = android.graphics.BitmapFactory.decodeResource(
    androidApp.foregroundActivity.getResources(),
    res);
but decodeResource() returns null. The native library is able to draw the same image, so somehow the resource is getting into the app. I just don't know how to access it from my code.
Note: On iOS this same call returns a UIImage, which is working correctly. I wish Stripe had been more consistent between their iOS and Android APIs!
I figured out the problem. Turns out my diagnosis was incorrect. The resource was being found, but it was a VectorDrawable and decodeResource() is unable to properly decode it.
The technique for converting a VectorDrawable to a Bitmap posted in this question works.
I am currently using the react-native-safari-view module in my React Native project for showing web views on iOS.
As the module is not yet implemented for Android, when I try to build the project for Android, it gives me an error at this line:
import SafariView from 'react-native-safari-view'
I am going to use the Linking library for Android, but I don't know how to use the same code for two platforms.
I tried:
if (Platform.OS == 'ios') {
  import SafariView from 'react-native-safari-view'
}
And it gives me this error:
'import' and 'export' may only appear at the top level
How do I get around this?
To get around this I have been using require instead (but mainly for modules rather than components):
var SafariView;
if (Platform.OS == 'ios') {
  SafariView = require('react-native-safari-view');
}
For this particular situation I would definitely go for Konstantin Kuznetsov's approach - Just sticking this here as it might help someone else where making a wrapper component with separate files may be overkill :)
If your platform-specific code is more complex, you should consider splitting the code out into separate files. React Native will detect when a file has a .ios. or .android. extension and load the relevant platform file when it is required from other components.
For example, say you have the following files in your project:
BigButton.ios.js
BigButton.android.js
You can then require the component as follows:
import BigButton from './BigButton'
Reference:
https://facebook.github.io/react-native/docs/platform-specific-code.html#platform-specific-extensions
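The resolution rule can be illustrated with a tiny stand-in function. This is purely illustrative: the real selection happens inside the Metro bundler at build time, not in your app code.

```javascript
// Hypothetical sketch of how the bundler picks a platform file: it prefers
// basename.<platform>.js and falls back to the plain basename.js.
function resolvePlatformFile(basename, platform, availableFiles) {
  const candidates = [
    basename + '.' + platform + '.js', // e.g. BigButton.android.js
    basename + '.js',                  // platform-agnostic fallback
  ];
  for (const name of candidates) {
    if (availableFiles.includes(name)) {
      return name;
    }
  }
  return null; // nothing matched
}
```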
You can separate platform code by creating two different files, your_file_name.android.js and your_file_name.ios.js. So you can either create two versions of the file where you want to use SafariView, or create a wrapper around SafariView that exports the real module on iOS and a dummy object on Android, and then use this wrapper so the Platform.OS check lives in one place.
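A minimal sketch of that wrapper idea. The file name and the stubbed methods are illustrative (check react-native-safari-view's actual API before relying on them), and `Platform` here is a stand-in for the import from 'react-native':

```javascript
// SafariViewWrapper.js (hypothetical): export the real module on iOS and
// a no-op dummy with the same surface on Android.
const Platform = { OS: 'android' }; // stand-in for: import { Platform } from 'react-native';

let SafariView;
if (Platform.OS === 'ios') {
  // real native module, only required on iOS so Android builds don't break
  SafariView = require('react-native-safari-view');
} else {
  SafariView = {
    isAvailable: function () { return Promise.resolve(false); },
    show: function () { /* no-op on Android */ },
    dismiss: function () { /* no-op on Android */ },
  };
}

module.exports = SafariView;
```

Call sites then just require the wrapper and never need their own Platform.OS checks.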
A few reasons why I might do this:
To create some webviews and inject javascript loaded from files
To separate large chunks of text into separate files rather than forcing them into views
To include raw data of an arbitrary format (eg, CSV) to be used in the app
In React Native you can use require to import an image file, but as far as I've seen, this only works for image files. And it (strangely) also works for JSON files (see Importing Text from local json file in React native). However, I haven't seen anywhere talking about importing plain old text files.
After looking and asking around, the best I can come up with is to use a fork of the react-native-fs library to access Android "assets". The fork exists as an open pull request; as soon as it gets merged you can use the main library.
Note that in Android development, "assets" specifically refers to accessing the raw contents of a file. To do this sort of thing on the React Native side, you need a native module to interface with React, hence the library above. See here (and search for "assets").
In your React Native project, make a file called something like android/app/src/main/assets/test.txt. Using the version of react-native-fs mentioned above, you can then do:
RNFS.readFileAssets('test.txt').then((res) => {
  console.log('read file res: ', res);
});
Update: if you want the pull request that would enable this ability to go through, you should let the author know by giving it a thumbs up on GitHub.
There is a library that solves this exact problem: React-Native-Local-Resource. The library allows you to asynchronously load any type of text file in Android and iOS at runtime.
Here is a simple React Native project example that implements FS and reads files on both iOS and Android:
https://github.com/baselka/kindleapp
(It implements a simple application that renders 2 book files in an Android and iOS application on a tablet. The app emulates the functionality of reading a book file, as in the Kindle app.)
Here's how I did it synchronously in Swift and JS for iOS/tvOS/macOS, based on the React Native docs: Exporting Constants.
Disadvantage: Note that the file will be loaded into memory once upon startup, and won't be dynamically re-loadable.
Advantage: It's synchronous, simple, and works at run-time whether in native or JS.
MyJSFile.js
import { NativeModules } from "react-native";
console.log(NativeModules.MyNativeModule.MyFileContents);
We import our native module and access the MyFileContents constant that we expose on it. It works synchronously with no bridge-crossing (as far as I understand, it's injected into the React Native JSContext via JavaScriptCore).
In Build Phases, ensure that this file is added to Copy Bundle Resources. Otherwise your app will quickly crash upon trying to read it.
MyNativeModule.swift
import Foundation

@objc(MyNativeModule)
class MyNativeModule: RCTEventEmitter {
  @objc override func constantsToExport() -> [AnyHashable : Any]! {
    let contents: String = try! String(contentsOfFile: Bundle.main.path(forResource: "MyFile.min", ofType: "js")!)
    return [
      "MyFileContents": contents
    ]
  }

  @objc override func supportedEvents() -> [String]! {
    return []
  }

  @objc override static func requiresMainQueueSetup() -> Bool {
    return false
  }
}
One can likely make a simpler/slimmer native module than this one (by subclassing something with less functionality than RCTEventEmitter), but this is the file I had lying around to work with, so here it is.
MyProject-Bridging-Header.h
#import <React/RCTBridge.h>
#import <React/RCTBridgeModule.h>
#import <React/RCTUIManager.h>
#import <React/RCTEventEmitter.h>
#import <React/RCTBundleURLProvider.h>
#import <React/RCTJavaScriptLoader.h>
#import <React/RCTLinkingManager.h>
#import <React/RCTRootView.h>
#import <React/RCTEventDispatcher.h>
Here's the bridging header I'm using. It exposes a lot more headers than are strictly necessary, but you may need them for other native modules later anyway.
As this approach uses Swift, make sure to enter your Build Settings and set Always Embed Swift Standard Libraries to Yes if you haven't already. And if this is the first time building your app with Swift embedded, you may want to clear DerivedData for luck before building.
... Or simply rewrite the same Swift code in Obj-C, as the documentation shows.