I'm trying to use TensorImage.load() to load a bitmap of a picture the user took with the camera app. When I pass in the bitmap, I get this error:
java.lang.IllegalArgumentException: Only supports loading ARGB_8888 bitmaps
This is my code leading up to the load call; it starts with onActivityResult:
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    if (requestCode == CAMERA_REQUEST_CODE) {
        if (Build.VERSION.SDK_INT < 28) {
            val foodBitmap = MediaStore.Images.Media.getBitmap(this.contentResolver, Uri.fromFile(photoFile))
            // pass this bitmap to the classifier
            val predictions = foodClassifier.recognizeImage(foodBitmap, 0)
        } else {
            val source = ImageDecoder.createSource(this.contentResolver, Uri.fromFile(photoFile))
            val foodBitmap = ImageDecoder.decodeBitmap(source)
            // pass this bitmap to the classifier
            val predictions = foodClassifier.recognizeImage(foodBitmap, 0)
        }
    }
}
Inside recognizeImage I have a variable named inputImageBuffer of type TensorImage. I call its load function and pass in the bitmap, and that is where the application crashes. Can someone tell me how to fix this?
I solved the issue in the simplest way, by changing the bitmap configuration:
Bitmap bmp = imageBitmap.copy(Bitmap.Config.ARGB_8888, true);
Here, the original Bitmap is immutable, so I make a copy with the Bitmap.Config.ARGB_8888 configuration, which gives back a new Bitmap reference.
For further reference:
https://developer.android.com/reference/android/graphics/Bitmap.Config
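In the Kotlin code from the question, a minimal sketch of applying that copy right before the classifier call (foodBitmap and foodClassifier.recognizeImage are the names used in the question) would be:
// Sketch only: make an ARGB_8888 copy before the bitmap ever reaches TensorImage.load().
val argbBitmap = foodBitmap.copy(Bitmap.Config.ARGB_8888, true)
val predictions = foodClassifier.recognizeImage(argbBitmap, 0)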
For anyone else: this is how I solved the issue, by changing the bitmap configuration.
// Convert the image to a Bitmap
var bitmap: Bitmap? = null
try {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
        val source = ImageDecoder.createSource(requireContext().contentResolver, uri!!)
        bitmap = ImageDecoder.decodeBitmap(source)
        bitmap = bitmap.copy(Bitmap.Config.ARGB_8888, true)
    } else {
        bitmap = MediaStore.Images.Media.getBitmap(requireContext().contentResolver, uri!!)
    }
} catch (e: Exception) {
    println("Could not convert image to Bitmap")
    e.printStackTrace()
}
Related
I'm trying to return the bitmap value from a lambda, but I get the error: lateinit property bitmap has not been initialized ... Is there a way to check that the ImageRequest is complete before returning the bitmap?
fun getBitmap(context: Context, imageUrl: String): Bitmap {
    lateinit var bitmap: Bitmap
    val imageRequest = ImageRequest.Builder(context)
        .data(imageUrl)
        .target { drawable ->
            bitmap = drawable.toBitmap() // This is the bitmap 🚨
        }
        .build()
    ImageLoader(context).enqueue(imageRequest)
    return bitmap
}
In short: enqueue() is asynchronous, so the function returns before the target callback has run. You have to use execute() instead of enqueue() and make the function suspending. See:
private suspend fun getBitmap(context: Context, url: String): Bitmap? {
    var bitmap: Bitmap? = null
    val request = ImageRequest.Builder(context)
        .data(url)
        .target(
            onStart = {
                Log.d(TAG, "Coil loader started.")
            },
            onSuccess = { result ->
                Log.d(TAG, "Coil loader success.")
                bitmap = result.toBitmapOrNull() // Or (result as BitmapDrawable).bitmap
            },
            onError = {
                Log.e(TAG, "Coil loading error")
            }
        )
        .build()
    context.imageLoader.execute(request)
    return bitmap
}
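Since execute() is a suspend call that returns an ImageResult, a slightly shorter variant (a sketch, assuming Coil 2.x) skips the target callbacks and reads the drawable off the result; allowHardware(false) is included because a hardware-backed bitmap cannot have its pixels read later:
// Sketch, assuming Coil 2.x: let execute() hand back the result directly.
// SuccessResult is coil.request.SuccessResult; toBitmap() is the
// androidx.core.graphics.drawable.toBitmap extension.
private suspend fun getBitmapFromResult(context: Context, url: String): Bitmap? {
    val request = ImageRequest.Builder(context)
        .data(url)
        .allowHardware(false) // only needed if you plan to read or copy the pixels
        .build()
    val result = context.imageLoader.execute(request)
    return (result as? SuccessResult)?.drawable?.toBitmap()
}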
I am opening my phone's camera, taking a photo, showing a 50px snippet, and then saving the image to Firebase Storage.
The image is very pixelated when I download it from Firebase Storage. Can someone have a look at my code to see where I am going wrong, please?
Perhaps I am saving the 50px image, whereas I would like to save the full image that was taken with the camera.
Variables
val REQUEST_CODECAM = 200
var bitmapPhoto: Bitmap? = null
Open the Camera
binding!!.galleryContainer.setOnClickListener { v: View? ->
    val cameraIntent = Intent(MediaStore.ACTION_IMAGE_CAPTURE)
    startActivityForResult(cameraIntent, REQUEST_CODECAM)
}
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    if (resultCode == Activity.RESULT_OK && requestCode == REQUEST_CODECAM && data != null) {
        bitmapPhoto = data.extras?.get("data") as Bitmap
        // Image is shown in 50px ListView cell
        binding!!.issueImage.setImageBitmap(data.extras?.get("data") as Bitmap)
        //imageUri = data.data
        //Picasso.get().load(imageUri).into(binding!!.issueImage)
    }
}
Save Button Pressed...
fun saveImage(action: GenericAction?) {
    val imageName = UUID.randomUUID().toString()
    val imageReference = FirebaseStorage.getInstance().reference.child("partInfoImagesFolder").child(imageName)
    val baos = ByteArrayOutputStream()
    bitmapPhoto?.compress(Bitmap.CompressFormat.JPEG, 100, baos)
    val data = baos.toByteArray()
    val uploadTask = imageReference.putBytes(data)
    uploadTask.addOnFailureListener {
        // Handle unsuccessful uploads
    }.addOnSuccessListener { taskSnapshot ->
        // taskSnapshot.metadata contains file metadata such as size, content-type, etc.
        // ...
        imageReference.downloadUrl.addOnCompleteListener { task1: Task<Uri?>? ->
            if (task1 != null && task1.isSuccessful) {
                saveCollOfURLString = task1.result.toString()
                action?.onCallback()
            }
        }
    }
}
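The "data" extra returned by ACTION_IMAGE_CAPTURE is only a small thumbnail, which is most likely why the uploaded image looks pixelated. A sketch of the usual full-size approach, assuming this lives in a Fragment and that a FileProvider authority such as "com.example.app.fileprovider" is configured in your manifest (both are placeholders):
// Sketch only: ask the camera app to write the full-resolution photo to a file
// via EXTRA_OUTPUT, then decode that file instead of the "data" thumbnail.
private lateinit var photoFile: File
private lateinit var photoUri: Uri

private fun openCamera() {
    photoFile = File.createTempFile("issue_photo", ".jpg", requireContext().cacheDir)
    // "com.example.app.fileprovider" is a placeholder authority.
    photoUri = FileProvider.getUriForFile(requireContext(), "com.example.app.fileprovider", photoFile)
    val cameraIntent = Intent(MediaStore.ACTION_IMAGE_CAPTURE)
        .putExtra(MediaStore.EXTRA_OUTPUT, photoUri)
    startActivityForResult(cameraIntent, REQUEST_CODECAM)
}

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    if (resultCode == Activity.RESULT_OK && requestCode == REQUEST_CODECAM) {
        // Decode the full-size file rather than the thumbnail from data.extras.
        bitmapPhoto = BitmapFactory.decodeFile(photoFile.absolutePath)
        binding!!.issueImage.setImageBitmap(bitmapPhoto)
    }
}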
My old implementation uploads the image to Firebase Storage in JPEG format without any compression:
private fun sendToFirebase() {
    if (imgUri != null) {
        val fileRef = storageRef!!.child(username + ".jpg")
        ....
        // code to upload and read image url
    }
}
I decided to write an image compression routine to compress the image and then upload it to Firebase Storage.
Result: the compression itself works, see below.
Newly added code to compress the image.
URI to Bitmap:
val bitmap = MediaStore.Images.Media.getBitmap(activity?.contentResolver, imgUri)
Method to compress Bitmap
private fun compressBitmap(bitmap: Bitmap, quality: Int): Bitmap {
    val stream = ByteArrayOutputStream()
    bitmap.compress(Bitmap.CompressFormat.WEBP, quality, stream)
    val byteArray = stream.toByteArray()
    return BitmapFactory.decodeByteArray(byteArray, 0, byteArray.size)
}
Bitmap compression Implementation
compressBitmap(bitmap, 80)
Query: how do I upload this same compressed image to Firebase Storage?
private fun sendToFirebase() {
    if (imgUri != null) {
        // code to convert uri to bitmap <start>
        val bitmap = MediaStore.Images.Media.getBitmap(activity?.contentResolver, imgUri)
        compressBitmap(bitmap, 80)
        // code to convert uri to bitmap <end>
        // old implementation
        .....
    }
}
You don't seem to be passing anything into your sendToFirebase function. I am posting code that I have used to upload successfully.
Since you are looking to compress first, you would need this:
private fun compressBitmap(bitmap: Bitmap, quality: Int): Bitmap {
    val stream = ByteArrayOutputStream()
    bitmap.compress(Bitmap.CompressFormat.WEBP, quality, stream)
    val byteArray = stream.toByteArray()
    arrayByte = byteArray
    uploadFile(arrayByte)
    return BitmapFactory.decodeByteArray(byteArray, 0, byteArray.size)
}
In the above, uploadFile is the call that performs the Firebase upload; I am passing the compressed bytes into it. The upload function looks as follows.
In the code below, mImageUri (declared elsewhere in the class) holds the Uri that was passed in for compression; you can remove the if check if you don't want it.
private fun uploadFile(data: ByteArray) {
    if (mImageUri != null) {
        val storageref = imageref.child("put your image id here")
        storageref.putBytes(data).addOnSuccessListener {
            Handler().postDelayed({
                progressbar.setProgress(0)
                Toast.makeText(activity, "Upload Successful", Toast.LENGTH_LONG).show()
            }, 1000)
        }.addOnFailureListener { e ->
            Toast.makeText(activity, e.message, Toast.LENGTH_LONG).show()
        }.addOnProgressListener { taskSnapshot ->
            val progress = (100.0 * taskSnapshot.bytesTransferred / taskSnapshot.totalByteCount)
            progressbar.setProgress(progress.toInt())
        }
    } else {
        Toast.makeText(activity, "No File Selected", Toast.LENGTH_LONG).show()
    }
}
You do not need the progress bar above; it's just a nice visual for the user to see the progress of the upload if the file is large.
You really only need to make sure you are passing your data into putBytes().
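In other words, the ByteArray produced by the compression step is exactly what goes into putBytes(); a minimal sketch, reusing the storageRef and username names from the question (the listener bodies are placeholders):
// Minimal sketch: compress to a ByteArray and pass that array straight to putBytes().
private fun compressToBytes(bitmap: Bitmap, quality: Int): ByteArray {
    val stream = ByteArrayOutputStream()
    bitmap.compress(Bitmap.CompressFormat.WEBP, quality, stream)
    return stream.toByteArray()
}

private fun sendToFirebase(bitmap: Bitmap) {
    val fileRef = storageRef!!.child("$username.jpg")
    fileRef.putBytes(compressToBytes(bitmap, 80))
        .addOnSuccessListener { /* read the download URL here */ }
        .addOnFailureListener { e -> e.printStackTrace() }
}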
Edit: if your onActivityResult is similar to mine, then use:
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    if (requestCode == PICK_IMAGE_REQUEST && resultCode == RESULT_OK
        && data != null && data.data != null) {
        mImageUri = data.data!!
        image1.setImageURI(data.data)
    }
}
In the above, image1 is an ImageView on the current page that shows the selected image.
Hope this helps.
I have a problem using Glide to load an image on Android. I will present it simply as follows:
First I load the image using Glide.
Then I want to choose an image using Intent.ACTION_PICK, but when I call image.setImageBitmap in onActivityResult it does not work.
Glide.with(context!!).load(url).centerCrop().error(R.drawable.avata_boy).into(imgAvata)
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    if (requestCode == PICK_IMAGE_REQUEST && resultCode == Activity.RESULT_OK
        && data != null && data.data != null
    ) {
        filePath = data.data
        try {
            val bitmap = BitmapFactory.decodeStream(activity!!.contentResolver.openInputStream(filePath!!))
            imgAvata.setImageBitmap(bitmap)
            Glide.with(context!!)
                .load(bitmap)
                .placeholder(R.drawable.avata_boy)
                .diskCacheStrategy(DiskCacheStrategy.NONE)
                .skipMemoryCache(true)
                .into(imgAvata)
        } catch (e: IOException) {
            e.printStackTrace()
        }
    }
}
Try this:
val myBitmap = BitmapFactory.decodeFile(filePath!!.path)
imgview.setImageBitmap(myBitmap)
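If the goal is just to display the picked image, it may be simpler to hand the Uri straight to Glide rather than decoding a Bitmap yourself; a sketch using the filePath and imgAvata names from the question:
// Sketch: Glide can load a content Uri directly, so no manual BitmapFactory step is needed.
Glide.with(context!!)
    .load(filePath) // the Uri returned by ACTION_PICK
    .placeholder(R.drawable.avata_boy)
    .into(imgAvata)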
I want to get some info from a barcode using my camera.
It works when I use a PNG image downloaded from a site, but when I try to make it work with a photo I took, I get an empty array back. It seems like I need to do some preparation on the image to make it work.
Here is my code:
fun getTheBarCode(bitmap: Bitmap) {
    val options = FirebaseVisionBarcodeDetectorOptions.Builder()
        .setBarcodeFormats(FirebaseVisionBarcode.FORMAT_AZTEC)
        .build()
    val detector = FirebaseVision.getInstance().getVisionBarcodeDetector(options)
    val bm = BitmapFactory.decodeResource(getResources(), R.drawable.barcode) // this is the place where I can load my downloaded barcode to make everything work!
    val newBitmap = Bitmap.createScaledBitmap(bitmap, 300, 500, false)
    val image = FirebaseVisionImage.fromBitmap(newBitmap)
    photoImage.setImageBitmap(newBitmap)
    detector.detectInImage(image)
        .addOnSuccessListener {
            Log.d("Success", "Success")
            // empty array here, when I take a picture.
        }
        .addOnFailureListener {
            Log.d("Failed", it.message)
        }
}
This is how I get the image from the camera
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent) {
    super.onActivityResult(requestCode, resultCode, data)
    if (requestCode == CAMERA_REQUEST_CODE && resultCode == Activity.RESULT_OK) {
        val photo = data.extras.get("data") as Bitmap
        getTheBarCode(photo)
    }
}
Edit:
I took a picture with my phone, scaled it down to 1500x1000px, put it inside my app directory, and then loaded it as a bitmap.
It is still not working.
The approach you're using will only give you back a thumbnail of the photo (as per https://developer.android.com/training/camera/photobasics), and that may not be sufficient for what you're trying to do. That link also contains info on how to get access to the full-size photo.
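Once the full-size photo has been written to a file (for example via EXTRA_OUTPUT, as that guide describes), the detector can be fed from its Uri; a sketch, assuming the legacy Firebase ML Vision API from the question and a photoUri placeholder pointing at the saved file:
// Sketch: build the FirebaseVisionImage from the saved full-size photo instead of
// the thumbnail Bitmap. photoUri is assumed to point at the captured file.
val image = FirebaseVisionImage.fromFilePath(context, photoUri)
detector.detectInImage(image)
    .addOnSuccessListener { barcodes ->
        Log.d("Success", "Found ${barcodes.size} barcode(s)")
    }
    .addOnFailureListener { e ->
        Log.d("Failed", e.message ?: "unknown error")
    }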