How to read pixel values for Raw Image on Android

I have managed to capture a RAW image on Android and save it to disk as a DNG. What I want to do is read the raw sensor data of each "pixel" for the image. Specifically, I just want to read the green sensor values before any processing. What is the best way to do this?
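If you still have access to the RAW_SENSOR android.media.Image that you fed to the DngCreator (reading back the saved DNG would instead need a DNG/TIFF parser), a sketch along these lines reads the green samples straight from the raw buffer. The RGGB mosaic layout and little-endian byte order below are assumptions; check SENSOR_INFO_COLOR_FILTER_ARRANGEMENT in your camera characteristics for the actual pattern.

    import android.media.Image;
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    public class RawGreenReader {
        // Mean green level of a RAW_SENSOR frame, assuming an RGGB Bayer layout
        // (green samples sit where x + y is odd). Verify the pattern via
        // SENSOR_INFO_COLOR_FILTER_ARRANGEMENT before trusting the result.
        public static double meanGreen(Image rawImage) {
            Image.Plane plane = rawImage.getPlanes()[0];      // RAW_SENSOR is single-plane
            ByteBuffer buf = plane.getBuffer().order(ByteOrder.LITTLE_ENDIAN);
            int width = rawImage.getWidth();
            int height = rawImage.getHeight();
            int rowStride = plane.getRowStride();             // bytes per row, >= width * 2
            long sum = 0;
            long count = 0;
            for (int y = 0; y < height; y++) {
                for (int x = 0; x < width; x++) {
                    if (((x + y) & 1) == 1) {                 // green site in an RGGB mosaic
                        sum += buf.getShort(y * rowStride + x * 2) & 0xFFFF;  // unsigned 16-bit sample
                        count++;
                    }
                }
            }
            return count == 0 ? 0 : (double) sum / count;
        }
    }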

Related

Android camera2 API: save raw camera byte stream to local storage

I am required to develop an app that saves raw byte-array streams to local storage so they can be processed offline (together with the camera metadata).
I am using the camera2 API to get the RAW_SENSOR format and I am able to save it (each frame is around 25 MB), but at a very low FPS (1-4).
Is there any way to optimize this process?
Can I select part of the active sensor area to be saved instead of the full active area (when saving in a raw format I cannot choose a resolution)? Is cropping possible when working with raw formats?
Is there a good way to save a big file without disturbing the camera FPS?
Thanks
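One thing that is known to help with the FPS: do not block the ImageReader callback with the ~25 MB disk write. Also note that the crop region generally only applies to processed outputs, so cropping a RAW_SENSOR stream is usually not an option. A minimal sketch of the threading idea, assuming an ImageReader already configured for ImageFormat.RAW_SENSOR (names like outputDir are placeholders):

    import android.media.Image;
    import android.media.ImageReader;
    import android.os.Handler;
    import android.os.HandlerThread;
    import android.util.Log;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class RawFrameSaver {
        public static void attach(ImageReader rawReader, File outputDir) {
            HandlerThread readerThread = new HandlerThread("raw-reader");
            readerThread.start();
            Handler readerHandler = new Handler(readerThread.getLooper());
            ExecutorService saveExecutor = Executors.newSingleThreadExecutor();

            rawReader.setOnImageAvailableListener(reader -> {
                Image image = reader.acquireNextImage();
                if (image == null) return;
                // Copy the single RAW plane so the Image can be closed immediately,
                // returning its buffer slot to the camera pipeline.
                ByteBuffer src = image.getPlanes()[0].getBuffer();
                byte[] copy = new byte[src.remaining()];
                src.get(copy);
                long timestamp = image.getTimestamp();
                image.close();
                // The slow disk write happens on its own thread so it never
                // stalls the listener (stalling here is what drags the FPS down).
                saveExecutor.execute(() -> {
                    File out = new File(outputDir, "raw_" + timestamp + ".bin");
                    try (FileOutputStream fos = new FileOutputStream(out)) {
                        fos.write(copy);
                    } catch (IOException e) {
                        Log.e("RawFrameSaver", "write failed", e);
                    }
                });
            }, readerHandler);
        }
    }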

Bitmap.compress PNG 8-bit palette

I'm playing with sending some images over a Bluetooth link to an embedded device. The device understands and decodes PNGs, but it seems some of the messages are getting too large to transfer over the wire at a reasonable speed / data size.
I'm currently using Bitmap.compress(Bitmap.CompressFormat.PNG, 0, <ByteArrayOutputStream>) to get the byte array for the compressed data.
Is there any way I can make Bitmap.compress() be lossy with the colors? And for that matter, would Bitmap.compress() ever use a 256-colour mode on its own, given an image with an appropriate number of colors?
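For what it's worth, Bitmap.compress() has no palette/indexed-PNG option, and the quality argument is ignored for PNG (it is lossless). A workaround sketch, if a lossy colour reduction is acceptable in your pipeline: shrink the colour depth of the bitmap yourself before compressing (or switch to a lossy format such as JPEG or WEBP if the device can decode those). This may or may not shrink the file much in practice; reduced colour precision usually compresses better, but it is not a true 8-bit palette PNG.

    import android.graphics.Bitmap;
    import java.io.ByteArrayOutputStream;

    public class BitmapShrinker {
        public static byte[] compressReduced(Bitmap source) {
            // Drop from ARGB_8888 to RGB_565: 16 bits per pixel instead of 32
            // (alpha is discarded). The colour reduction is lossy, but it happens
            // before the lossless PNG encode, so the output is still a valid PNG.
            Bitmap reduced = source.copy(Bitmap.Config.RGB_565, false);
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            // PNG ignores the quality value; any saving comes from the
            // smaller colour depth of the input bitmap.
            reduced.compress(Bitmap.CompressFormat.PNG, 100, out);
            return out.toByteArray();
        }
    }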

What format does the Android camera use for the raw pictureCallback?

I am trying to use the picture data from the Android camera. I do not want the JPEG format, since I will eventually work with grayscale data. A YUV format is fine with me, since its first plane (Y) is the grayscale image.
From the Android documentation for Camera.takePicture():
public final void takePicture (Camera.ShutterCallback shutter,
Camera.PictureCallback raw, Camera.PictureCallback postview,
Camera.PictureCallback jpeg)
Added in API level 5
Triggers an asynchronous image capture. The camera service will
initiate a series of callbacks to the application as the image capture
progresses. The shutter callback occurs after the image is captured.
This can be used to trigger a sound to let the user know that image
has been captured. The raw callback occurs when the raw image data is
available (NOTE: the data will be null if there is no raw image
callback buffer available or the raw image callback buffer is not
large enough to hold the raw image). The postview callback occurs when
a scaled, fully processed postview image is available (NOTE: not all
hardware supports this). The jpeg callback occurs when the compressed
image is available. If the application does not need a particular
callback, a null can be passed instead of a callback method.
It talks about "the raw image data". However, I can find no information anywhere about the format of this raw image data.
Do you have any idea about that?
I want to get the grayscale data of the picture taken by the camera, with the data kept in the phone's memory, so that no time is spent writing/reading image files or converting between image formats. Or maybe I have to sacrifice some performance to get it?
After some search, I think I found the answer:
From the Android tutorial:
"The raw callback occurs when the raw image data is available (NOTE:
the data will be null if there is no raw image callback buffer
available or the raw image callback buffer is not large enough to hold
the raw image)."
See this link (2011/05/10)
Android: Raw image callback supported devices
Not all devices support raw pictureCallback.
https://groups.google.com/forum/?fromgroups=#!topic/android-developers/ZRkeoCD2uyc (2009)
Dave Sparks, an employee at Google, said:
"The original intent was to return an uncompressed RGB565 frame, but
this proved to be impractical. " "I am inclined to deprecate that API
entirely and replace it with hooks for native signal processing. "
Many people report a similar problem. See:
http://code.google.com/p/android/issues/detail?id=10910
Since many image-processing algorithms are based on grayscale images, I am hoping to get grayscale raw data in memory for each picture taken on Android.
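For reference, if you can get a YUV buffer at all (for example the default NV21 preview format), the grayscale data is already sitting at the front of it: the first width * height bytes are the Y (luminance) plane. A minimal sketch, assuming the NV21 layout:

    public class YuvGray {
        // Extracts the luminance (grayscale) plane from an NV21 buffer.
        public static int[] luminance(byte[] nv21, int width, int height) {
            int[] gray = new int[width * height];
            for (int i = 0; i < gray.length; i++) {
                gray[i] = nv21[i] & 0xFF;   // Y values are unsigned bytes, 0..255
            }
            return gray;
        }
    }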
You may have some luck with getSupportedPictureFormats(). If it lists some YUV format, you can use setPictureFormat() together with the desired resolution, and, counterintuitively, you will get the uncompressed high-quality image in the jpeg callback, from which grayscale (a.k.a. luminance) can easily be extracted.
Most devices will only list JPEG as a valid choice. That's because they perform compression in hardware, on the camera side. Note that the data transfer from camera to application RAM is often the bottleneck; if you can use the stagefright hardware JPEG decoder, you will actually get the result faster.
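A minimal sketch of that suggestion, using the legacy android.hardware.Camera API this question is about; whether any YUV picture format actually shows up in the list is entirely device-dependent:

    import android.graphics.ImageFormat;
    import android.hardware.Camera;
    import java.util.List;

    public class PictureFormatProbe {
        public static void tryYuvPictureFormat(Camera camera) {
            Camera.Parameters params = camera.getParameters();
            List<Integer> formats = params.getSupportedPictureFormats();
            if (formats.contains(ImageFormat.NV21)) {
                params.setPictureFormat(ImageFormat.NV21);  // uncompressed YUV, if offered
                camera.setParameters(params);
            }
            // On most devices the list only contains ImageFormat.JPEG.
        }
    }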
The biggest problem with using the raw callback is that many developers have trouble getting anything returned on many phones.
If you are satisfied with just the YUV array, your camera preview SurfaceView can implement PreviewCallback and you can add the onPreviewFrame method to your class. This method gives you direct access to the YUV array for every frame; you can fetch it whenever you choose.
EDIT: I should specify that I was assuming you were building a custom camera application in which you extend SurfaceView for a custom camera preview surface. To follow my advice you will need to build a custom camera. If you are trying to do things quickly, though, I suggest building a new bitmap out of the JPEG data and computing the greyscale yourself.
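A minimal sketch of that PreviewCallback route, again with the legacy Camera API; the default preview format is NV21, so the first width * height bytes of the callback data are the grayscale plane:

    import android.hardware.Camera;

    public class GrayPreview implements Camera.PreviewCallback {
        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            Camera.Size size = camera.getParameters().getPreviewSize();
            int yLength = size.width * size.height;
            // data[0 .. yLength - 1] is the luminance (grayscale) plane;
            // process it here, e.g. int firstPixel = data[0] & 0xFF;
        }
    }
    // Usage, inside your custom camera code:
    // camera.setPreviewCallback(new GrayPreview());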

How to compare an image stored in the device with the scanned frame value using camera?

I have an image stored on the Android device. I need to compare it with the video stream captured using the device's camera. If a match is found, we need to display a message.
How can I do this?
This is not such a trivial question. You definitely need to research the image processing libraries available online; otherwise, extract the raw pixel data of the stored image (its binary data, as RGB or ARGB) and compare it against the streamed frames, allowing for an approximate match percentage.
This is just an idea, not a real answer.
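To make the "compare with a match percentage" idea a bit more concrete, here is a deliberately naive sketch: scale both images to the same small size and count pixels whose colours are close. Serious matching against a live camera stream would normally use a computer-vision library such as OpenCV (feature matching); the 64x64 size and the tolerance of 30 below are arbitrary placeholders.

    import android.graphics.Bitmap;

    public class NaiveMatcher {
        public static double matchPercent(Bitmap stored, Bitmap frame) {
            Bitmap a = Bitmap.createScaledBitmap(stored, 64, 64, true);
            Bitmap b = Bitmap.createScaledBitmap(frame, 64, 64, true);
            int matching = 0;
            for (int y = 0; y < 64; y++) {
                for (int x = 0; x < 64; x++) {
                    int pa = a.getPixel(x, y);
                    int pb = b.getPixel(x, y);
                    // Per-channel absolute differences.
                    int dr = Math.abs(((pa >> 16) & 0xFF) - ((pb >> 16) & 0xFF));
                    int dg = Math.abs(((pa >> 8) & 0xFF) - ((pb >> 8) & 0xFF));
                    int db = Math.abs((pa & 0xFF) - (pb & 0xFF));
                    if (dr + dg + db < 30) matching++;   // arbitrary tolerance
                }
            }
            return 100.0 * matching / (64 * 64);
        }
    }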

is there a way to access the pixels of a video?

Given a video file, is there any way to access single pixel values?
I have two cases where I would like to access the pixels:
From the video camera
From a video file
What I need is to get the pixel information for a certain position, with something like getPixel(posX, posY) returning the RGB information.
I have an algorithm that detects blobs (homogeneous parts) of an image, and I would like to implement it in real time using the Android video camera and offline by analyzing a video file.
Yes, but you'll need to do some work.
Extract a video frame from the source file with a tool such as FFmpeg. The result will be a JPEG or other such image file.
Use an image processing tool, like ImageMagick, to extract the information for a pixel.
Presto!
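For the video-file case there is also an on-device alternative to the FFmpeg/ImageMagick route: pull a frame with MediaMetadataRetriever and read pixels from the resulting Bitmap (for the live-camera case, a preview callback gives per-frame access, as in the earlier question). A rough sketch; the path, timestamp, and coordinates are placeholders:

    import android.graphics.Bitmap;
    import android.graphics.Color;
    import android.media.MediaMetadataRetriever;

    public class VideoPixelReader {
        public static int[] rgbAt(String videoPath, long timeUs, int posX, int posY) throws Exception {
            // 'throws Exception' keeps the sketch simple across SDK-level
            // differences in these method signatures.
            MediaMetadataRetriever retriever = new MediaMetadataRetriever();
            retriever.setDataSource(videoPath);
            // Frame nearest to timeUs; may be null if nothing could be decoded.
            Bitmap frame = retriever.getFrameAtTime(timeUs);
            retriever.release();   // production code would release in a finally block
            if (frame == null) {
                return null;
            }
            int pixel = frame.getPixel(posX, posY);
            return new int[] { Color.red(pixel), Color.green(pixel), Color.blue(pixel) };
        }
    }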
