I am using the Person object to build chat-app notifications like Gmail's, and I have created the Person object. But I want to set its icon from an image URL coming from the server, not from a drawable resource. I am using the Coil library for loading images. The code below works fine.
By default, Android generates the icon from the first letter of the name passed to the builder.
So how can I show the image coming from the server as a URL in the icon, with best practice for memory and resource usage? Below is my Person object.
Here is the official documentation link for Person.
And this is the Notification MessagingStyle tutorial I referred to.
val senderPerson: Person = Person.Builder().also { person ->
    person.setKey(message.getSenderKey(prefs))
    person.setName(message.getNotificationTitle())
    person.setImportant(true)
    //****HERE I WANT TO SET IMAGE FROM URL******
    // person.setIcon(IconCompat.createWithResource(this, R.drawable.placeholder_transaparent))
}.build()
You'd load the image URL asynchronously with a Coil ImageRequest and return the fetched icon through a callback.
Coil delivers a Drawable, and you can turn that Drawable into an icon via its Bitmap using IconCompat.createWithBitmap((drawable as BitmapDrawable).bitmap):
private fun asyncLoadIcon(imageUrl: String?, setIcon: (IconCompat?) -> Unit) {
    if (imageUrl.isNullOrEmpty())
        setIcon(null)
    else {
        // Use Coil to load the image
        val request = ImageRequest.Builder(this)
            .data(imageUrl)
            .target { drawable ->
                // Return the icon fetched from the URL
                setIcon(IconCompat.createWithBitmap((drawable as BitmapDrawable).bitmap))
            }
            .listener(object : ImageRequest.Listener { // Return a null icon if the URL is wrong
                override fun onError(request: ImageRequest, result: ErrorResult) {
                    setIcon(null)
                }
            })
            .build()
        // "imageLoader" is your Coil ImageLoader (e.g. the context.imageLoader extension)
        imageLoader.enqueue(request)
    }
}
This code returns a null icon if the URL is wrong or if it's empty/null.
Then build the notification message with that function:
asyncLoadIcon("https://my_icon_url.png") { // set the icon url
val person = Person.Builder().apply {
setName("John Doe")
setIcon(it)
}.build()
// Build the notification with the person
.....
}
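Assuming the usual androidx notification APIs (NotificationCompat, NotificationManagerCompat), here is a rough sketch of how that person could feed into a MessagingStyle notification. The channel id, notification id, small-icon resource, and message fields are placeholders I've made up, not something from the original code:

asyncLoadIcon(message.senderAvatarUrl) { icon -> // hypothetical URL field on your message model
    val person = Person.Builder()
        .setName(message.getNotificationTitle())
        .setIcon(icon) // may be null; Android then falls back to the letter avatar
        .build()

    val style = NotificationCompat.MessagingStyle(person)
        .addMessage(message.getNotificationText(), System.currentTimeMillis(), person) // hypothetical text getter

    val notification = NotificationCompat.Builder(this, CHANNEL_ID) // CHANNEL_ID: an assumed, already-created channel
        .setSmallIcon(R.drawable.ic_notification) // assumed small-icon resource
        .setStyle(style)
        .build()

    NotificationManagerCompat.from(this).notify(NOTIFICATION_ID, notification) // NOTIFICATION_ID: any stable int
}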
As further enhancements, you can enable caching and disable hardware bitmaps by adding these options to the ImageRequest.Builder; I'd also recommend looking at other libraries such as Glide and Picasso:
.memoryCachePolicy(CachePolicy.ENABLED)
.diskCachePolicy(CachePolicy.ENABLED)
.allowHardware(false) // Disable hardware bitmaps
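Putting it together, here is a minimal sketch of the same request with those options applied (same calls as above, nothing new; allowHardware(false) keeps the bitmap in software memory, which can avoid "Software rendering doesn't support hardware bitmaps" errors when the icon is drawn):

val request = ImageRequest.Builder(this)
    .data(imageUrl)
    .memoryCachePolicy(CachePolicy.ENABLED)
    .diskCachePolicy(CachePolicy.ENABLED)
    .allowHardware(false) // software bitmap, so it can be wrapped in IconCompat safely
    .target { drawable ->
        setIcon(IconCompat.createWithBitmap((drawable as BitmapDrawable).bitmap))
    }
    .build()
imageLoader.enqueue(request)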
Related
I'm just trying to resize an image after the user launches the Image Picker from my app and chooses an image file on the local device (handling a remote image from Dropbox or the like will be another battle). This has worked for me previously, but now I'm getting this exception:
java.lang.RuntimeException: Failure delivering result ResultInfo{who=null, request=1105296364, result=-1, data=Intent { dat=content://com.android.externalstorage.documents/document/primary:Download/20170307_223207_cropped.jpg flg=0x1 }} to activity {my.app/MainActivity}: java.io.FileNotFoundException: No content provider: /document/primary:Download/20170307_223207_cropped.jpg
This occurs after the image is chosen in the Picker, because I'm running my "processing" code to locate the image, resize it, and copy it to a subfolder in the app's folder.
Like I said, this worked, but I'm not sure what's wrong now. I've tried this on the emulator as well as on my Galaxy S10 via USB debugging and it's the same result. The image is in the local storage "Download" folder on the emulator as well as my own device.
The URI looks weird (I mean the picture is just in the local storage "Download" folder) but I'm no URI expert so I assume it's fine, because that's what the Image Picker returns.
Here's the immediate code that's throwing the exception (specifically, the ImageDecoder.decodeBitmap call):
private fun copyFileToAppDataFolder(
    context: Context,
    imageTempPath: String
): String {
    // ensure we are sent at least a non-empty path
    if (imageTempPath.isEmpty()) {
        return ""
    }

    val appDataFolder = "${context.dataDir.absolutePath}/images/firearms"

    var filename = imageTempPath.substringAfterLast("/", "")
    if (filename.isNullOrBlank()) {
        filename = imageTempPath.substringAfterLast("%2F", "")
    }
    // couldn't parse filename from Uri; exit
    if (filename.isNullOrBlank()) {
        return ""
    }

    // get a bitmap of the selected image so it can be saved in an output stream
    val selectedImage: Bitmap? = if (Build.VERSION.SDK_INT <= 28) {
        MediaStore.Images.Media.getBitmap(context.contentResolver, Uri.parse(imageTempPath))
    } else {
        ImageDecoder.decodeBitmap(ImageDecoder.createSource(context.contentResolver, Uri.parse(imageTempPath)))
    }
    if (selectedImage == null) {
        return ""
    }

    val destinationImagePath: String = "$appDataFolder/$filename"
    val destinationStream = FileOutputStream(destinationImagePath)
    selectedImage.compress(Bitmap.CompressFormat.JPEG, 100, destinationStream)
    destinationStream.close()

    return destinationImagePath
}
The function above is called from my ViewModel (the processFirearmImage function just calls the one above), where I pass in the result URI from the Image Picker as well as the application context:
// this event is fired when the Image Picker returns
is AddEditFirearmEvent.AssignedPicture -> {
    val resizedImagePath = ShotTrackerUtility.processFirearmImage(
        event.applicationContext, // this is from LocalContext.current in Composable
        event.value // result uri from image picker
    )
    _firearmImageUrl.value = resizedImagePath
}
I don't know, lol. I can't believe this is such a difficult thing, but information on it sure seems sparse (for Compose especially, but even in general), and I don't really consider launching an Image Picker and resizing the resulting image to be that unusual. Any help from you smart people would be great.
Taking a step away from programming problems and coming back seems about the best bet sometimes, lol.
I came back tonight and within a couple of minutes noticed that I was sending an improper Uri to the ImageDecoder.createSource method, which was causing the exception. Basically, this was happening:
val imageTempPath = theUriReturnedFromImagePicker.path ?: ""
ImageDecoder.decodeBitmap(ImageDecoder.createSource(context.contentResolver, Uri.parse(imageTempPath)))
And it should've been:
val imageUri = theUriReturnedFromImagePicker
ImageDecoder.decodeBitmap(ImageDecoder.createSource(context.contentResolver, imageUri))
As I mentioned in the OP, this originally worked, but I must've changed code around a bit (mostly the arguments I'm sending to various methods/classes). I'm also using that Uri.path part to get the filename of the chosen image, so I overlooked and/or got confused about what I was sending to ImageDecoder.createSource.
Doh. Maybe someone else will do something dumb like me and this can help.
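Side note, not part of the original fix: since a content:// Uri's path isn't a real file path, a sturdier way to get the chosen file's display name is to query the content resolver for OpenableColumns.DISPLAY_NAME. A minimal sketch (the helper name is mine):

import android.content.Context
import android.net.Uri
import android.provider.OpenableColumns

// Hypothetical helper: resolves the user-visible file name of a content:// Uri.
fun getDisplayName(context: Context, uri: Uri): String? =
    context.contentResolver.query(uri, arrayOf(OpenableColumns.DISPLAY_NAME), null, null, null)
        ?.use { cursor -> if (cursor.moveToFirst()) cursor.getString(0) else null }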
I'm trying to use Glide to display thumbnails from the Google Photos library in a RecyclerView. In order to fetch images from this library, I must make two HTTP requests: first I get the MediaItem from the id (I've already obtained a list of ids in a previous step), and second I request the actual image from thumbnailUrl. This is the recommended process, since baseUrls expire after one hour and you aren't supposed to store them:
val googlePhotosThumbnailUrl =
    App.googlePhotosService.getMediaItem(asset.googlePhotosId) // First HTTP request fetches MediaItem
        .run {
            val baseUrl = this.baseUrl
            val thumbnailUrl = "$baseUrl=w400-h400" // Appends the requested dimensions to the Url.
            thumbnailUrl // Second HTTP request fetches this URL
        }
The problem is that Glide's load() method doesn't appear to support chaining HTTP requests like what's shown above:
GlideApp.with(itemView.context)
    .asBitmap()
    .load(googlePhotosThumbnailUrl)
    .diskCacheStrategy(DiskCacheStrategy.ALL)
    .into(binding.imageViewLargeThumbnail)
The code above executes synchronously, so loading is incredibly slow. I've managed to fix this by using coroutines, as shown below. But the problem with this is that Glide doesn't cache any of the images, so if I scroll down and back up, Glide refetches every image:
override fun bindAsset(asset: GooglePhotosAsset, position: Int) {
    this.asset = asset
    this.index = position

    // We set the loading animation here for Google Photos assets, since for those we need to fetch a mediaItem and then a baseUrl.
    // This forces us to perform the process in a coroutine, and Glide can't set the loading animation until the baseUrl is fetched.
    binding.imageViewLargeThumbnail.setImageResource(R.drawable.loading_animation)

    fragment.lifecycleScope.launch(Dispatchers.Default) {
        val googlePhotosThumbnailUrl = App.googlePhotosService.getMediaItem(asset.googlePhotosId) // First HTTP request fetches MediaItem
            .run {
                val baseUrl = this.baseUrl
                val thumbnailUrl = "$baseUrl=w400-h400" // Appends the requested dimensions to the Url.
                thumbnailUrl // Second HTTP request fetches this URL
            }

        withContext(Dispatchers.Main) {
            GlideApp.with(itemView.context)
                .asBitmap()
                .load(googlePhotosThumbnailUrl)
                .diskCacheStrategy(DiskCacheStrategy.ALL)
                .fitCenter()
                .into(binding.imageViewLargeThumbnail)
        }
    }
}
The only potentially relevant answer I've found is this one, but it seems super complicated and outdated. Are there any better solutions?
I am using MapBox 8.4.0, and I have the following snippet to load the map in a fragment, pinning the user's current location with a marker. I need to customize the marker by dynamically setting foregroundDrawable with an image loaded from a network URL, but foregroundDrawable only accepts a resource ID as a parameter.
val customOptions = LocationComponentOptions.builder(context!!)
    .elevation(5f)
    .foregroundDrawable(R.drawable.icon_profile) // set image dynamically
    .backgroundDrawable(R.drawable.icon_current_location)
    .build()

val activationOptions = LocationComponentActivationOptions.builder(context!!, style)
    .locationComponentOptions(customOptions)
    .build()

mapboxMap.locationComponent.apply {
    activateLocationComponent(activationOptions)
    isLocationComponentEnabled = true
    cameraMode = CameraMode.TRACKING
    renderMode = RenderMode.NORMAL
}
It should look like this, with the profile icon replaced by the loaded image at run time:
https://i.stack.imgur.com/eoXuG.jpg
Is there any way I could achieve this?
We can use foregroundName() to set a dynamic icon for the marker.
mapboxMap.getStyle { loadedStyle ->
    loadedStyle.addImage("marker-icon", bitmapIcon) // create a Bitmap icon; you may use Glide to load image from URL

    val locationComponentOptions: LocationComponentOptions =
        LocationComponentOptions.builder(context!!)
            .foregroundName("marker-icon") // set icon for the marker
            .build()

    val activationOptions =
        LocationComponentActivationOptions.builder(context!!, loadedStyle)
            .locationComponentOptions(locationComponentOptions)
            .build()

    mapboxMap.locationComponent.apply {
        activateLocationComponent(activationOptions)
        ...
    }
}
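The answer only mentions it in a comment, so here is a rough sketch of one way bitmapIcon could come from a URL using Glide's CustomTarget. The URL is a placeholder, and this assumes it runs somewhere with access to a Context and to the loaded style from getStyle:

Glide.with(context!!)
    .asBitmap()
    .load("https://example.com/profile_icon.png") // placeholder URL, not from the question
    .into(object : CustomTarget<Bitmap>() {
        override fun onResourceReady(resource: Bitmap, transition: Transition<in Bitmap>?) {
            // add the downloaded bitmap to the already-loaded style under the name used above
            loadedStyle.addImage("marker-icon", resource)
        }

        override fun onLoadCleared(placeholder: Drawable?) {
            // no-op
        }
    })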
Fly-by comment here to say that if you're using Picasso instead of Glide, https://stackoverflow.com/a/20181629/6358488 shows how to use a Picasso Target to get the Bitmap from the network URL call.
I'm new to Wowza and am working on a project to live-stream video captured from an Android device. I need to attach an image (a dynamic one) to the video stream so that users watching the stream can see it. The code I have tried is given below (from the Wowza example source code):
// Read in a PNG file from the app resources as a bitmap
Bitmap overlayBitmap = BitmapFactory.decodeResource(getResources(), R.drawable.overlay_logo);
// Initialize a bitmap renderer with the bitmap
mWZBitmap = new WZBitmap(overlayBitmap);
// Place the bitmap at top left of the display
mWZBitmap.setPosition(WZBitmap.LEFT, WZBitmap.TOP);
// Scale the bitmap initially to 75% of the display surface width
mWZBitmap.setScale(0.75f, WZBitmap.SURFACE_WIDTH);
// Register the bitmap renderer with the GoCoder camera preview view as a frame listener
mWZCameraView.registerFrameRenderer(mWZBitmap);
This works fine, but I don't want to show the image at the broadcasting end; the image should be visible only at the receiving end. Is there any way to get this done?
I managed to get this done by registering a frame renderer and setting the bitmap inside onWZVideoFrameRendererDraw.
The code snippet is given below (Kotlin):
private fun attachImageToBroadcast(scoreValue: ScoreUpdate) {
    bitmap = getBitMap(scoreValue)
    // Initialize a bitmap renderer with the bitmap
    mWZBitmap = WZBitmap(bitmap)
    // Position the bitmap in the display
    mWZBitmap!!.setPosition(WZBitmap.LEFT, WZBitmap.TOP)
    // Scale the bitmap initially
    mWZBitmap!!.setScale(0.37f, WZBitmap.FRAME_WIDTH)
    mWZBitmap!!.isVisible = false // as I don't want to show it initially
    mWZCameraView!!.registerFrameRenderer(mWZBitmap)
    mWZCameraView!!.registerFrameRenderer(VideoFrameRenderer())
}
private inner class VideoFrameRenderer : WZRenderAPI.VideoFrameRenderer {
    override fun onWZVideoFrameRendererRelease(p0: WZGLES.EglEnv?) {
    }

    override fun onWZVideoFrameRendererDraw(p0: WZGLES.EglEnv?, frameSize: WZSize?, p2: Int) {
        // The bitmap value changes once new values arrive.
        // I use flags to check whether a new value has been obtained and only then call setBitmap;
        // otherwise, calling it on every frame can cause flickering on the screen.
        mWZBitmap!!.setBitmap(bitmap)
    }

    override fun isWZVideoFrameRendererActive(): Boolean {
        return true
    }

    override fun onWZVideoFrameRendererInit(p0: WZGLES.EglEnv?) {
    }
}
In iOS, we can implement the WZVideoSink protocol to achieve this.
First, we need to update the scoreView with the latest score and then convert the view to an image.
Then we can embed this image into the captured frame using the WZVideoSink protocol method.
A sample code snippet is given below.
// MARK: - WZVideoSink Protocol
func videoFrameWasCaptured(_ imageBuffer: CVImageBuffer, framePresentationTime: CMTime, frameDuration: CMTime) {
    if self.goCoder != nil && self.goCoder!.isStreaming {
        let frameImage = CIImage(cvImageBuffer: imageBuffer)
        var addCIImage: CIImage = CIImage()
        if let scoreImage = self.getViewAsImage() {
            // scoreImage is the image you want to embed.
            addCIImage = CIImage(cgImage: scoreImage.cgImage!)
        }
        let filter = CIFilter(name: "CISourceOverCompositing")
        filter?.setDefaults()
        filter?.setValue(addCIImage, forKey: kCIInputImageKey)
        filter?.setValue(frameImage, forKey: kCIInputBackgroundImageKey)
        if let outputImage: CIImage = filter?.value(forKey: kCIOutputImageKey) as? CIImage {
            let context = CIContext(options: nil)
            context.render(outputImage, to: imageBuffer)
        } else {
            let context = CIContext(options: nil)
            context.render(frameImage, to: imageBuffer)
        }
    }
}

func getViewAsImage() -> UIImage? {
    // convert scoreView to an image; return an optional so the caller's if-let works
    UIGraphicsBeginImageContextWithOptions(self.scoreView.bounds.size, false, 0.0)
    self.scoreView.layer.render(in: UIGraphicsGetCurrentContext()!)
    let scoreImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return scoreImage
}
I can't seem to get the URL from the PlacePhotoMetadata object. The debugger shows that there is a URL there, but I can't seem to access it.
How do you access the URL in the object?
val placeId = "ChIJa147K9HX3IAR-lwiGIQv9i4"
val photoMetadataResponse = mGeoDataClient.getPlacePhotos(placeId)
photoMetadataResponse.addOnCompleteListener { task ->
    // Get the list of photos
    val photos = task.result
    // Get the PlacePhotoMetadataBuffer (metadata for all of the photos)
    val photoMetadataBuffer = photos.photoMetadata
    // Iterate over the photos in the list
    for (photo in photoMetadataBuffer) {
        // Get the attribution text
        val attribution = photo.attributions
    }
}
You can't. Take a look at the documentation for PlacePhotoMetadata. There are methods to download a bitmap of the image, but no methods that return the URL.
To get the photo you should do something like this:
// this is your for-loop:
photoMetadataBuffer.forEach { photo ->
    photo.getPhoto(client).setResultCallback { result ->
        // do whatever you want here:
        showPhotoWithAttribution(photo.attributions, result.getBitmap())
    }
}
Note that replacing a for-loop with a forEach call has no real advantage; it just makes your code look cleaner.