PhoneGap 2.9.1 - Camera crashing due to background process limit - Android

So after a lot of investigation, I finally made my app able to resume state (I simply modified the onCreate method in MainActivity.java to load a "restore" page on activity kill if there is a saved bundle instance).
I believe once I fix this final problem, all will be good and I can finally sleep.
HOW do I get the Activity/Intent result from the camera in PhoneGap once the app has been killed off due to the background process limit, or due to "Do not keep activities" being checked? (I have a surprising number of users who have these restrictions enabled.)
I thought it might be possible to make the camera save the file to a temp directory and then just pass the URI through JavaScript as a hash URL (so it'd be something like file:///android_www/index-restore.html#URI_TO_IMAGE)
But my only issue is: how do I even begin this in PhoneGap? I know what to do for everything bar the temp storage of the image and retrieving its location in onCreate.

OK, I finally managed it. I wrote a hacky solution, but it works.
I modified CordovaInterface and CordovaActivity by adding a function called "getSharedPref", which returns a shared preference that can be accessed throughout the app.
I modified CameraLauncher to force the stored name to be temp.jpg or temp.png depending on the input, then store it within the preference.
In MainActivity.java, I use this.getSharedPref() if the bundle instance is not null, and then check for the key. Unfortunately, the only way I could assign the variable in Cordova was by doing
super.loadUrl("javascript: var global_image = '" + file + "'");
then I did the usual routines to add the file in my program, and it all works so far! Happy days.
Down the track I will probably write a proper plugin for this; using super.loadUrl("file:///blablabla#" + file); didn't seem to work.
The only issues so far are that images aren't resized and rotated correctly, and I still need to implement this for the photo gallery, but so far so good.
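For reference, the restore path in MainActivity.onCreate ended up looking roughly like this (a minimal sketch only; the page paths and the preference key are illustrative, and getSharedPref is the accessor described above):
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    super.init();
    if (savedInstanceState != null) {
        // The activity was killed while the camera was open: load the restore page
        // and hand the stored file location to the webview as a global variable.
        String file = this.getSharedPref().getString("temp_photo", null); // key name is illustrative
        super.loadUrl("file:///android_asset/www/index-restore.html");
        if (file != null) {
            super.loadUrl("javascript: var global_image = '" + file + "'");
        }
    } else {
        super.loadUrl("file:///android_asset/www/index.html");
    }
}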
Edit:
I've managed to get the photos resized and re-orientated :) The next issue is implementing it for the photo gallery - intent.putExtra(android.provider.MediaStore.EXTRA_OUTPUT, file); doesn't seem to apply for some reason.
Edit:
Latest update - it turns out that AFTER the app has seemingly "crashed", you still get the Intent back regardless. So, after modifying the source code some more, it now auto-calls a global JavaScript function called "customRestore", which passes over the information gathered from the Intent and then goes through the normal routine of adding a photo.
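The onActivityResult side is then just a matter of forwarding the Intent's result into that function (again only a sketch; the guard conditions and the argument passed are illustrative):
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent intent) {
    super.onActivityResult(requestCode, resultCode, intent);
    // This still fires after the process was killed and recreated,
    // so push the result straight into the webview.
    if (resultCode == RESULT_OK && intent != null && intent.getData() != null) {
        super.loadUrl("javascript: customRestore('" + intent.getData().toString() + "');");
    }
}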

Related

Ionic 3 File Permissions

Alright, here goes. I'm working on an Ionic project; in this specific scenario we're testing the Android version of the app. I can get images from the file system just fine; they come back as a string URL that looks something like this:
content://com.android.providers.media.documents/document/image%3A5744
The processor that is then supposed to blob the file and pass it up the line looks like this:
return this.file.readAsArrayBuffer(urlData.url, urlData.fileName)
  .then((item) => {
    // Wrap the raw buffer in a typed array so the Blob receives binary data.
    return new Blob([new Uint8Array(item)]);
  })
  .catch((err) => {
    // Log and re-throw instead of swallowing the error, so callers see the failure.
    console.log(err.message);
    throw err;
  });
But then I get the error SECURITY_ERR, which the documentation doesn't really talk about.
This works just fine for the pictures I take with the camera, which all have URLs that look like this:
file:///storage/emulated/0/Android/data/<appname>/cache/1502211622334.jpg
The issue is, as far as I can find, there is no documentation on what causes this error. I have no idea what to change to make my code work. I have verified that the URI is valid using the checkFile method.
So it turns out you can't use content:// URLs with file.readAsArrayBuffer; instead, you have to first resolve the URL into something readAsArrayBuffer can understand. To do this, I used the Ionic Native FilePath plugin.
Once it was installed and included in the page where I needed it, I used the
filePath.resolveNativePath(url)
method on my URL. Since this returns a promise, I chained my readAsArrayBuffer onto a then statement. I did have to use an if statement to create branching paths for content:// URLs (which required resolveNativePath) and non-content URLs, which were already working.
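Roughly, the branching looked like this (a sketch only, assuming this.filePath and this.file are the injected Ionic Native FilePath and File instances; the method name and path splitting are mine):
// Resolve content:// URLs first; file:// URLs can be read directly.
readAsBlob(url: string): Promise<Blob> {
  const resolved: Promise<string> = url.startsWith('content://')
    ? this.filePath.resolveNativePath(url)
    : Promise.resolve(url);
  return resolved
    .then((nativeUrl: string) => {
      const dir = nativeUrl.substring(0, nativeUrl.lastIndexOf('/'));
      const name = nativeUrl.substring(nativeUrl.lastIndexOf('/') + 1);
      return this.file.readAsArrayBuffer(dir, name);
    })
    .then((buffer: ArrayBuffer) => new Blob([new Uint8Array(buffer)]));
}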
This solution works as far as I can tell.

Scanner Android application

We are developing an Android application for document scanning.
The application has features to edit the image, such as applying magic color, grey mode, black & white, etc., and it offers the option to scan "N" number of pages and convert them into a PDF at the end.
The flow of the application: the first activity takes a photo of the document, the second activity crops the image, and the third activity edits the image (applying magic color, grey mode, or black/white conversion). The third activity also has an add button; clicking it goes back to the first activity and the same process continues. Once all the images are scanned, the third activity has a done button; clicking it creates the PDF and closes the application.
Now the problem: after scanning some 35 pages, it throws an OutOfMemoryError, because we always keep the original and modified bitmaps in a List. We do this since it's possible for the user to go back to previous images and edit them, at which point I need the original version of the image as well.
Could you please help me out with the items below:
1) Where should I keep the bitmaps in this scenario?
2) Is there any way to store the images on the external card and read them back on a need basis?
Thanks in advance.
Store the bitmaps as cache files:
FileOutputStream out = new FileOutputStream(filename);
bmp.compress(Bitmap.CompressFormat.PNG, 100, out);
out.close();
It is preferable to use the cache directory to store such files rather than storing them persistently; getCacheDir() will return the path to the directory.
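A minimal sketch of the round trip (assuming a Context is in scope; the file naming is illustrative):
// Write each page's bitmap to the cache directory instead of holding it in memory.
File cacheFile = new File(context.getCacheDir(), "page_" + pageIndex + ".png");
FileOutputStream out = new FileOutputStream(cacheFile);
bmp.compress(Bitmap.CompressFormat.PNG, 100, out);
out.close();
bmp.recycle(); // release the in-memory copy once it is safely on disk

// Later, when the user navigates back to that page:
Bitmap reloaded = BitmapFactory.decodeFile(cacheFile.getAbsolutePath());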

Summary: Take a picture utilizing Camera Intent and display the photo with correct orientation (works on hopefully all devices)

It seems to be the simplest thing in the world: taking a picture within your Android app using the default camera activity. However, there are many pitfalls, covered in several posts across Stack Overflow and the web, such as null Intents being passed back, the orientation of the picture not being correct, or OutOfMemoryErrors.
I'm looking for a solution that allows me to
start the camera activity via the camera intent,
retrieve the Uri of the photo, and
retrieve the correct orientation of the photo.
Moreover, I would like to avoid a device configuration (manufacturer, model, os version) specific implementation as far as possible. So I'm wondering: what is the best way to achieve this?
UPDATE: January 2nd, 2014:
I tried really hard to avoid implementing different strategies based on the device manufacturer. Unfortunately, I did not get around it. Going through hundreds of posts and talking to several developers, nobody found a solution that works on all devices without implementing device manufacturer specific code.
After I posted my solution here on StackOverflow, some developers asked me to publish my code on github. So here it is now: AndroidCameraUtil on github
The code was successfully tested on a wide variety of devices with Android API-Level >= 8. For a complete list, please see the Readme file on github.
The CameraIntentHelperActivity provides the main functionality, which is also described in more detail in the following.
Calling the default camera activity:
for Samsung and Sony devices: I call the camera activity with the method call to startActivityForResult. I only set the constant CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE. I do NOT set any other intent extras.
for all other devices: I call the camera activity with the method call to startActivityForResult as previously. This time, however, I additionally set the intent extra MediaStore.EXTRA_OUTPUT and provide a URI where I want the image to be stored.
In both cases I remember the time the camera activity was started.
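Put together, the launch step looks roughly like this (a sketch of the described logic, not the exact library code; the field name cameraStartedAt is illustrative):
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
String manufacturer = android.os.Build.MANUFACTURER.toLowerCase(Locale.ENGLISH);
if (!manufacturer.contains("samsung") && !manufacturer.contains("sony")) {
    // For all other devices, pre-define where the image should be stored.
    ContentValues values = new ContentValues();
    values.put(MediaStore.Images.Media.TITLE, System.currentTimeMillis() + ".jpg");
    cameraPicUri = getContentResolver().insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
    intent.putExtra(MediaStore.EXTRA_OUTPUT, cameraPicUri);
}
cameraStartedAt = System.currentTimeMillis(); // remember when the camera was started
startActivityForResult(intent, CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE);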
On camera activity result:
MediaStore: First, I try to read the photo being captured from the MediaStore. Using a managedQuery on the MediaStore content, I retrieve the latest image taken, as well as its orientation property and its timestamp. If I find an image and it was not taken before the camera intent was called, it is the image I was looking for. Otherwise, I dismiss the result and try one of the following approaches.
Intent extra: Second, I try to get an image Uri from intent.getData() of the returning intent. If this is not successful either, I continue with step 3.
Default photo Uri: If all of the above mentioned steps did not work, I use the image Uri I passed to the camera activity.
At this point, I have retrieved the photo Uri and its orientation, which I pass to my UploadPhotoActivity.
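Step 1, the MediaStore lookup, boils down to a query along these lines (a rough sketch of the idea, not the exact AndroidCameraUtil code; cameraStartedAt is the timestamp remembered at launch):
String[] projection = {
        MediaStore.Images.ImageColumns._ID,
        MediaStore.Images.ImageColumns.ORIENTATION,
        MediaStore.Images.ImageColumns.DATE_TAKEN };
Cursor cursor = getContentResolver().query(
        MediaStore.Images.Media.EXTERNAL_CONTENT_URI, projection,
        null, null, MediaStore.Images.ImageColumns.DATE_TAKEN + " DESC");
if (cursor != null) {
    // Only accept the newest image if it was taken after the camera intent was fired.
    if (cursor.moveToFirst() && cursor.getLong(2) >= cameraStartedAt) {
        rotateXDegrees = cursor.getInt(1);
        photoUri = ContentUris.withAppendedId(
                MediaStore.Images.Media.EXTERNAL_CONTENT_URI, cursor.getLong(0));
    }
    cursor.close();
}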
Image processing
Please take a close look at my BitmapHelper class. It is based on the code described in detail in that tutorial.
Moreover, the shrinkBitmap method also rotates the image if required based on the orientation information extracted earlier.
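In spirit, the scale-and-rotate part of shrinkBitmap works like this (a rough sketch under my own parameter names, not the exact BitmapHelper code):
public static Bitmap shrinkBitmap(Bitmap bitmap, int maxSize, int rotateXDegrees) {
    // Scale down so the larger dimension fits maxSize, then apply the
    // orientation read from the MediaStore earlier.
    float scale = Math.min(1f, (float) maxSize / Math.max(bitmap.getWidth(), bitmap.getHeight()));
    Matrix matrix = new Matrix();
    matrix.postScale(scale, scale);
    matrix.postRotate(rotateXDegrees);
    return Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), matrix, true);
}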
I hope this is helpful to some of you.
I have tested this code with a Sony Xperia Go, Samsung Galaxy SII, Samsung Galaxy SIII mini and a Samsung Galaxy Y; it worked on all devices!
But on the LG E400 (2.3.6) it didn't work, and you get double pictures in the gallery. So I added manufacturer.contains("lge") to the check in startCameraIntent(), and that fixed the problem:
if (!(manufacturer.contains("samsung")) && !(manufacturer.contains("sony")) && !(manufacturer.contains("lge"))) {
    String filename = System.currentTimeMillis() + ".jpg";
    ContentValues values = new ContentValues();
    values.put(MediaStore.Images.Media.TITLE, filename);
    cameraPicUri = getContentResolver().insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
    intent.putExtra(MediaStore.EXTRA_OUTPUT, cameraPicUri);
}
On a Galaxy S3 with CM 10.1 I get a NullPointerException in BitmapHelper at:
bm = BitmapFactory.decodeFileDescriptor(fileDescriptor.getFileDescriptor(), null, options);
subsequently my UploadPhotoActivity fails at:
try {
    photo = BitmapHelper.readBitmap(this, cameraPicUri);
    if (photo != null) {
        photo = BitmapHelper.shrinkBitmap(photo, 600, rotateXDegrees);
        thumbnail = BitmapHelper.shrinkBitmap(photo, 100);
        ImageView imageView = (ImageView) findViewById(R.id.sustainable_action_photo);
        imageView.setImageBitmap(photo);
    } else {
        Log.e(TAG, "IMAGE ERROR 1");
    }
} catch (Exception e) {
    Log.e(TAG, "IMAGE ERROR 2");
    e.printStackTrace();
}
at the second log (IMAGE ERROR 2).
After a couple of tries my camera broke and I got a "Could not connect to camera" error.
Tested it on a Nexus 7 and it works perfectly.
Edit: Narrowed it down to this:
fileDescriptor = context.getContentResolver().openAssetFileDescriptor(selectedImage, "r");
Although selectedImage contains this:
file:///storage/emulated/0/DCIM/Camera/IMG_20131023_183343.jpg
The openAssetFileDescriptor call throws a FileNotFoundException. I checked the file system, and the image is not saved at this location. The cameraPicUri in TakePhotoActivity points to a non-existent image. I am currently checking where it all goes wrong.
Edit2: I figured out the error: since the device is a Samsung and tells the app that it is a Samsung device, your Samsung-specific fixes are applied. CyanogenMod does not need those fixes though, and in the end the code breaks. Once you remove
(manufacturer.contains("samsung")) &&
It works. Since this is a custom ROM, you could not have planned for that, of course. I am trying to figure out a way to detect whether the device is running CyanogenMod and then include this in your code.
Thanks for a nice camera fix!
Edit3: I fixed it to run on CyanogenMod on the Galaxy S3 by changing your code to this:
if (getPackageManager().hasSystemFeature("com.cyanogenmod.android") || (!(manufacturer.contains("samsung")) && !(manufacturer.contains("sony")) && !(manufacturer.contains("lge"))))
Well, now it sometimes works, sometimes it does not. Strange.
I experience some problems when using this with Sony Xperia Z5.
I added this and it got a lot better.
if (buildType.contains("sony") && buildDevice.contains("e5823")) {
    setPreDefinedCameraUri = true;
}
But 4 times out of 22 it restarted the camera, and once it restarted two times. I restarted the app for every test.
Is there some way to get around this, or do I have to accept this result?
The thing is that if the camera restarts, I can press the back button twice and boom, the image is there in my ImageView and saved.

Android Image doesn't save using native camera app on Nexus S

So this is kind of weird, because out of the three devices I'm testing on, I only get this issue on the Google Nexus S with 4.0.3. I'm starting the native camera app to take a picture, and I don't care where the image is saved to, so I don't specify a location, hoping it will get saved to the default location, but the image isn't saved at all!
It does work fine on the Galaxy S 2 and Samsung Skyrocket (both with 2.3.something).
Code I'm using to start the camera app:
Intent camIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(camIntent, TAKE_PHOTO);
return true;
I'm keeping it really simple, that's why I'm confused! Does anyone know of any issues specific to the Nexus S's camera?
EDIT: It would seem that maybe the Nexus S's camera app saves the file in some onActivityResult, and since I start up the native camera app and don't save the image upon return, it doesn't get saved. Does anyone know this to be true? Seen this behavior?
EDIT: No one has seen this? I find it hard to believe I'm the first person to run into this...
EDIT: Alright, well after working on it some more, I tried adding a URI into the EXTRA_OUTPUT of the intent like so:
camIntent.putExtra(MediaStore.EXTRA_OUTPUT,
Uri.parse(folderPath + String.format("%d.jpg", System.currentTimeMillis())));
And now I see the behavior described here: Android ACTION_IMAGE_CAPTURE Intent, where the camera app doesn't do anything when I hit OK. Creating the file beforehand doesn't work either, as I tried like this:
File f = new File(folderPath, filename);
f.createNewFile();
camIntent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.parse(folderPath + filename));
Alright, figured it out. Oddly, and I have NO idea why, but changing this line:
camIntent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.parse(folderPath + filename));
to
camIntent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(file));
fixed it. Don't know if I should understand why... but whatever, it works. (Presumably Uri.parse on a bare file-system path yields a Uri with no file:// scheme, which the camera app can't write to, while Uri.fromFile produces a proper file:// Uri.) I did try a billion other things...
not specifying an EXTRA_OUTPUT at all -> camera would act like everything was good... except it wouldn't save the image ANYWHERE
getting the bitmap taken by intent.getExtras().get("data") -> did return a bitmap, but it was not the full size image as documented extensively in many bug reports
using the above Uri.parse method -> causes the native camera app checkmark button to not do anything when clicked
Hope this helps someone.
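Putting the working version together (a sketch; folderPath, filename and TAKE_PHOTO are the same names as above):
File file = new File(folderPath, filename);
Intent camIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
// Uri.fromFile produces a proper file:// Uri; Uri.parse(folderPath + filename)
// yields one without a scheme, which the camera app can't write to.
camIntent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(file));
startActivityForResult(camIntent, TAKE_PHOTO);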

Android: MediaStore.Images.Media.EXTERNAL_CONTENT_URI ... show pictures in full size?

I guess this question has been asked before, but I can't seem to find a proper answer/solution.
I have a note-taking app which allows taking pictures. For that I start an intent that starts up the built-in camera app. So far so good.
But when I show that image in my app, it's in a much smaller format :(
The funny/weird thing is that the camera app did take a full-resolution picture! But for some reason I can't get the full version to show in my app.
So, when I use the standard Android Gallery app and go to that picture, it is very obvious that it's full size (I can zoom in and see details I really can't see when I zoom in within my own app). Also, the dimensions are really those of the original picture, taken with the 5MP camera.
In my app, they are very small. My phone has Android 2.2, but the same happens on my emulator (Android 2.1).
How should I retrieve the pictures in my app? I tried a couple of ways, but none works :( I don't need a complete example (although that's very welcome); just a few clues are enough for me to search for myself.
Tx in advance!!
Greetingz,
Koen
Very weird - I found the solution/answer by looking at the _ID values that were being inserted into my own database. First I noticed that when I selected an existing image (via the built-in Android Gallery), I did get the full-size image.
When I first took a picture, I got a scaled image. So where was the difference? Apparently at the point where the _ID of the picture got stored in my database. In my case, and probably most cases, this happens in the onActivityResult procedure.
First take a look at what I initially had:
if (requestCode == REQUEST_CAMERA && resultCode == RESULT_OK) {
    String timestamp = Long.toString(System.currentTimeMillis());
    // get the picture
    mPicture = (Bitmap) result.getExtras().get("data");
    // save image to gallery
    String pictureUrl = MediaStore.Images.Media.insertImage(getContentResolver(), mPicture, getResources().getString(R.string.app_name_short), timestamp);
    insertPictureAttachment(mRowId.intValue(), Integer.parseInt(Uri.parse(pictureUrl).getLastPathSegment()));
}
The "insertPictureAttachment"-method does the actual inserting into the database.
Looking back, this was a bit weird anyway... take a picture so I could make a URI of it, and then get the last path segment (which is the _ID) so I could insert that into my database.
Eventually, it turns out that I can replace the above code with just one line:
insertPictureAttachment(mRowId.intValue(), Integer.parseInt(result.getData().getLastPathSegment()));
Much shorter, and actually makes more sense ... rather than getting the info from result.getExtras().get("data"), I get my info from result.getData(), which gives the _ID of the original, full-size image.
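For completeness, a minimal sketch of then loading the full-size image from that Uri (error handling omitted; unlike the "data" extra, this gives the original resolution):
Uri imageUri = result.getData();
InputStream in = getContentResolver().openInputStream(imageUri);
Bitmap fullSize = BitmapFactory.decodeStream(in); // full resolution, not the thumbnail
in.close();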
I will do some further research on this though, because it's not clear to me yet why I actually don't have to call MediaStore.Images.Media.insertImage(...) ... maybe I will have to if I want specific features (like a custom file location or something like that).
Greetingz,
Koen
