How do I use
onPreviewFrame(byte[] data, Camera camera)
in order to call getPixel(int x, int y)? I want to convert the frame into a Bitmap; is that possible? I am using onPreviewFrame because I want to read pixel data every second, and taking a full picture each time would be too slow.
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    byte[] jpeg = convertYuvToJpeg(data, camera);
    if (jpeg != null) {
        Bitmap bitmap = Tool.loadBitmap(jpeg);
    }
}
public byte[] convertYuvToJpeg(byte[] data, Camera camera) {
    try {
        // The preview frame arrives as NV21; wrap it so it can be compressed.
        Camera.Size previewSize = camera.getParameters().getPreviewSize();
        YuvImage image = new YuvImage(data, ImageFormat.NV21,
                previewSize.width, previewSize.height, null);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        Rect rect = new Rect(0, 0, previewSize.width, previewSize.height);
        int quality = 100; // JPEG quality, 0-100
        image.compressToJpeg(rect, quality, baos);
        return baos.toByteArray();
    } catch (Exception e) {
        e.printStackTrace(); // don't swallow conversion failures silently
    }
    return null;
}
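To read individual pixels, the JPEG bytes can be decoded into a Bitmap and queried with getPixel(). A minimal sketch assuming the convertYuvToJpeg() above; the once-per-second throttle and the x/y coordinates are placeholders, not part of the original code:

private long lastSampleMs = 0;

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    // Only sample one frame per second, as the question asks.
    long now = System.currentTimeMillis();
    if (now - lastSampleMs < 1000) return;
    lastSampleMs = now;

    byte[] jpeg = convertYuvToJpeg(data, camera);
    if (jpeg != null) {
        Bitmap bitmap = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
        int color = bitmap.getPixel(x, y); // packed ARGB int
        int red = Color.red(color), green = Color.green(color), blue = Color.blue(color);
    }
}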
I am trying to achieve video streaming from one Android device to another. To do this I want to grab frames from the camera, send them through a socket, and show them in an ImageView on the other phone. The problem is that I have to use an ImageReader with a specific format, YUV_420_888. When I try to create a bitmap from the bytes of such an image, ALPHA_8 is the only config that works (the others throw exceptions saying the buffer is not big enough for the pixels), and the result is black-and-white and wrongly oriented. Is there a way to obtain a correct version directly, or to convert it somehow, so I can build a proper bitmap ready to show in an ImageView? Here's some code:
public class MainActivity extends AppCompatActivity
{
    private ImageReader mImageReader;
    // These fields are referenced below but were not declared in the post:
    private ImageView imageView;
    private Handler handler;
    private Size mPreviewSize;

    @Override
    protected void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        this.imageView = (ImageView) findViewById(R.id.image);
        this.handler = new Handler()
        {
            @Override
            public void handleMessage(Message msg)
            {
                if (msg.what == Worker.PROCESSED_IMAGE)
                {
                    imageView.setImageBitmap((Bitmap) msg.obj);
                }
            }
        };
    }

    private void openCamera()
    {
        CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
        try
        {
            String cameraId = manager.getCameraIdList()[0];
            CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
            StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            mPreviewSize = map.getOutputSizes(SurfaceTexture.class)[0];
            mImageReader = ImageReader.newInstance(mPreviewSize.getWidth(), mPreviewSize.getHeight(),
                    ImageFormat.YUV_420_888, 50);
            mImageReader.setOnImageAvailableListener(
                    new Worker(this, this.handler, mPreviewSize.getWidth(), mPreviewSize.getHeight()),
                    new Handler());
            manager.openCamera(cameraId, mStateCallback, null);
        }
        catch (CameraAccessException e)
        {
            e.printStackTrace();
        }
    }
}
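The snippet references an mStateCallback that is not shown. A hypothetical sketch of what it might look like, creating a capture session that streams into the ImageReader once the camera opens (everything beyond the camera2 API calls is an assumption):

private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
    @Override
    public void onOpened(CameraDevice camera) {
        try {
            final CaptureRequest.Builder builder =
                    camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            builder.addTarget(mImageReader.getSurface());
            camera.createCaptureSession(Arrays.asList(mImageReader.getSurface()),
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(CameraCaptureSession session) {
                            try {
                                // Feed every preview frame into the ImageReader.
                                session.setRepeatingRequest(builder.build(), null, null);
                            } catch (CameraAccessException e) {
                                e.printStackTrace();
                            }
                        }

                        @Override
                        public void onConfigureFailed(CameraCaptureSession session) {
                        }
                    }, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onDisconnected(CameraDevice camera) {
        camera.close();
    }

    @Override
    public void onError(CameraDevice camera, int error) {
        camera.close();
    }
};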
public class Worker implements ImageReader.OnImageAvailableListener
{
    public static final int PROCESSED_IMAGE = 1; // referenced from the Activity
    private final Handler handler;
    private final int width, height;

    public Worker(Context context, Handler handler, int width, int height)
    {
        this.handler = handler;
        this.width = width;
        this.height = height;
    }

    @Override
    public void onImageAvailable(ImageReader imageReader)
    {
        Image image = imageReader.acquireLatestImage();
        ByteBuffer buffer = image.getPlanes()[0].getBuffer();
        // Plane 0 is only the Y (luma) plane, hence the black-and-white ALPHA_8 result.
        Bitmap bm = Bitmap.createBitmap(width, height, Bitmap.Config.ALPHA_8);
        bm.copyPixelsFromBuffer(buffer);
        image.close();
        handler.obtainMessage(PROCESSED_IMAGE, bm).sendToTarget();
    }
}
Almost correct; try this code instead. Note that BitmapFactory can only decode compressed data, so this works when the ImageReader is created with ImageFormat.JPEG rather than YUV_420_888:
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
// Log.d(TAG, String.format(Locale.getDefault(), "image w = %d; h = %d", image.getWidth(), image.getHeight()));
Image.Plane[] planes = image.getPlanes();
ByteBuffer buffer = planes[0].getBuffer();
buffer.rewind();
byte[] data = new byte[buffer.capacity()];
buffer.get(data);
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
//use your bitmap
} catch (Exception e) {
e.printStackTrace();
}
if (image != null) {
image.close();
}
}
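If JPEG is what the reader delivers, the ImageReader in the question's openCamera() would be created with that format instead of YUV_420_888. A one-line sketch of the assumed change:

mImageReader = ImageReader.newInstance(mPreviewSize.getWidth(), mPreviewSize.getHeight(),
        ImageFormat.JPEG, 2); // compressed frames that BitmapFactory can decode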
I used a helper function to convert the image into a bitmap:
Preview2.kt#image -> bitmap
Preview.kt#yuvToBitmap
protected fun yuvToBitmapFaster(bytes: ByteArray, w: Int, h: Int): Bitmap {
    // Lazily build the RenderScript allocations on the first frame.
    if (yuvType == null) {
        yuvType = Type.Builder(rs, Element.U8(rs)).setX(bytes.size)
        `in` = Allocation.createTyped(rs, yuvType!!.create(), Allocation.USAGE_SCRIPT)
        rgbaType = Type.Builder(rs, Element.RGBA_8888(rs)).setX(w).setY(h)
        out = Allocation.createTyped(rs, rgbaType?.create(), Allocation.USAGE_SCRIPT)
    }
    // Run the YUV -> RGBA intrinsic.
    `in`?.copyFrom(bytes)
    yuvToRgbIntrinsic!!.setInput(`in`)
    yuvToRgbIntrinsic!!.forEach(out)
    // Copy the RGBA allocation into a regular Bitmap.
    val bmp = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888)
    out?.copyTo(bmp)
    return bmp
}
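The function relies on fields initialized elsewhere in the class. The one-time setup presumably looks roughly like this, shown here in Java against the RenderScript API (the variable names match the snippet; the rest is an assumption):

RenderScript rs = RenderScript.create(context);
// Intrinsic that performs the actual YUV -> RGBA conversion.
ScriptIntrinsicYuvToRGB yuvToRgbIntrinsic =
        ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));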
By the way, Google has an ML Kit example that grabs camera images and processes them: see VisionProcessorBase.kt and the accompanying tutorial.
I am working on an Android app that takes frames from the camera and displays them on a SurfaceView. I use the camera callbacks to get raw images, convert them to a byte stream, and pass them to a server for processing; the server then returns the same frame. The problem is that the ImageView is very slow at drawing the bitmaps, around 15-20 fps. Is there another way to draw bitmaps quickly? In the current code I process the bitmaps on a separate thread and set them on the ImageView from the UI thread.
The code in the camera callback is:
Camera.Size pSize = camera.getParameters().getPreviewSize();
YuvImage yuv = new YuvImage(data, ImageFormat.NV21,
        pSize.width, pSize.height, null);
yuv.compressToJpeg(new Rect(0, 0, pSize.width, pSize.height), 100, baos);
rawImage = baos.toByteArray();
Bitmap bitmap = BitmapFactory.decodeByteArray(rawImage, 0, rawImage.length);
bitmap = Util.getResizedBitmap(bitmap, frameResolution);
final ByteArrayOutputStream rotatedStream = new ByteArrayOutputStream();
bitmap = Util.RotateBitmap(bitmap, 90);
bitmap.compress(Bitmap.CompressFormat.WEBP, 100, rotatedStream);
baos.close();
rawImage = rotatedStream.toByteArray();
if (isStreamingStart) {
    beforeTime = (new Date()).getTime();
    if (!client.isConnected()) {
        client.connect();
    }
    client.send(rawImage);
    rotatedStream.flush();
}
and the code that handles the returned bitmap is:
decodedString = Base64.decode((String) data, Base64.DEFAULT);
// Alternative decompression path, left over from earlier experiments:
// byte[] dataString = ((String) data).getBytes();
// String stringDecompressed = compressor.decompressToString(dataString);
// byte[] imageAsBytes = stringDecompressed.getBytes();
final Bitmap bitmap = BitmapFactory.decodeByteArray(decodedString, 0, decodedString.length);
final Bitmap mutableBitmap = bitmap.copy(Bitmap.Config.ARGB_8888, true);
// final Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length); // duplicate declaration, removed
HomeActivity.this.runOnUiThread(new Runnable() {
    public void run() {
        try {
            remoteViewImageView.setImageBitmap(bitmap);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
});
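One common way to draw frames faster than an ImageView is to render them straight onto a SurfaceView's canvas from a background thread. A minimal sketch, assuming a surfaceHolder obtained from your SurfaceView (not part of the code above):

Canvas canvas = surfaceHolder.lockCanvas();
if (canvas != null) {
    try {
        // Draw the decoded frame directly; no View invalidation involved.
        canvas.drawBitmap(bitmap, 0, 0, null);
    } finally {
        surfaceHolder.unlockCanvasAndPost(canvas);
    }
}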
I am trying to capture the preview of the camera on a SurfaceView, to save it as a JPEG in internal memory. I found some code here on this site that does mostly what I want, but saves the image to the SD card. I changed that and came up with the following code.
Camera.PreviewCallback mPrevCallback = new Camera.PreviewCallback()
{
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        //Log.d(TAG, "FRAME");
        Camera.Parameters parameters = cam.getParameters();
        int format = parameters.getPreviewFormat();
        //Log.d(TAG, "FORMAT:" + format);
        // YUV formats require more conversion
        if (format == ImageFormat.NV21 || format == ImageFormat.YUY2 || format == ImageFormat.NV16) {
            int w = parameters.getPreviewSize().width;
            int h = parameters.getPreviewSize().height;
            // Get the YUV image
            YuvImage yuv_image = new YuvImage(data, format, w, h, null);
            // Convert YUV to JPEG
            Rect rect = new Rect(0, 0, w, h);
            ByteArrayOutputStream output_stream = new ByteArrayOutputStream();
            yuv_image.compressToJpeg(rect, 100, output_stream);
            byte[] byt = output_stream.toByteArray();
            FileOutputStream outStream = null;
            try {
                outStream = new FileOutputStream("/data/data/com.example.max.camtest/files/test" + System.currentTimeMillis() + ".jpg");
                outStream.write(byt);
                outStream.close();
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
};
The preview is shown on the SurfaceView and mPrevCallback is triggered. It successfully saves pictures of varying sizes (250~500 KB), but they are all black. When I try to capture a picture with the camera.takePicture function, it is also black.
What am I doing wrong? How can I debug this?
Thanks!
Use this intent to take the picture:
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
File f = new File(android.os.Environment.getExternalStorageDirectory(), AppInfo.getInstance().getCurrentLoginUserInfo().getId()+".jpg");
intent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(f));
intent.putExtra("return-data", true);
startActivityForResult(intent, 1);
and handle it in your activity result as below. Note the call Bitmap bitmap = getScaledBitmap(uri.getPath(), 200, true); here 200 is your maximum image size.
if (requestCode == 1)
{
    String base = Environment.getExternalStorageDirectory().getAbsolutePath().toString();
    final String imgPath = base + "/" + AppInfo.getInstance().getCurrentLoginUserInfo().getId() + ".jpg";
    File file = new File(imgPath);
    if (file.exists())
    {
        Uri uri = Uri.fromFile(file);
        Log.d(TAG, "Image Uri path: " + uri.getPath());
        Bitmap bitmap = getScaledBitmap(uri.getPath(), 200, true);
    }
}
This method will return the image bitmap after resizing it:
private Bitmap getScaledBitmap(String imagePath, float maxImageSize, boolean filter) {
    try {
        FileInputStream in = new FileInputStream(imagePath);
        BufferedInputStream buf = new BufferedInputStream(in);
        Bitmap realImage = BitmapFactory.decodeStream(buf);
        buf.close(); // close the stream once the bitmap is decoded
        // Scale uniformly so the longer edge becomes maxImageSize.
        float ratio = Math.min(
                maxImageSize / realImage.getWidth(),
                maxImageSize / realImage.getHeight());
        int width = Math.round(ratio * realImage.getWidth());
        int height = Math.round(ratio * realImage.getHeight());
        return Bitmap.createScaledBitmap(realImage, width, height, filter);
    } catch (IOException e) {
        e.printStackTrace();
    }
    return null;
}
Now you have the scaled bitmap image.
Hope this helps.
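A side note: Bitmap.createScaledBitmap still decodes the full-size image first. If memory is a concern, BitmapFactory's inSampleSize can downsample during the decode itself. A minimal sketch, not part of the answer above:

BitmapFactory.Options options = new BitmapFactory.Options();
// Only read the dimensions on the first pass.
options.inJustDecodeBounds = true;
BitmapFactory.decodeFile(imagePath, options);
// Halve the image until it fits within maxImageSize.
int sampleSize = 1;
while (options.outWidth / (sampleSize * 2) >= maxImageSize
        && options.outHeight / (sampleSize * 2) >= maxImageSize) {
    sampleSize *= 2;
}
options.inJustDecodeBounds = false;
options.inSampleSize = sampleSize;
Bitmap scaled = BitmapFactory.decodeFile(imagePath, options);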
I am creating a program that calls the takePicture function twice, like this:
mCamera.takePicture(null, null, mPicture);
But it only enters the PictureCallback once:
private PictureCallback mPicture = new PictureCallback(){
    // Bitmap readyToGo;
    @Override
    public void onPictureTaken(byte[] data, Camera camera){
        Log.d(TAG, "entered picture callback");
        //-----OUT OF MEMORY ERROR
        pictureFile = getOutputMediaFile(MEDIA_TYPE_IMAGE);
        mPreview.setDrawingCacheEnabled(true);
        mPreview.setDrawingCacheQuality(View.DRAWING_CACHE_QUALITY_AUTO);
        BitmapFactory.Options options = new BitmapFactory.Options();
        //options.inPurgeable = true;
        //options.inInputShareable = true;
        options.inJustDecodeBounds = true;
        options.inSampleSize = 5;
        options.inJustDecodeBounds = false;
        Bitmap bitmap = mPreview.getDrawingCache();
        //---------------------------------------------------
        bmp = BitmapFactory.decodeByteArray(data, 0, data.length);
        // combine the two bitmaps
        Log.d("main", "before overlay");
        combination = overlay(bmp, bitmap);
        Log.d("main", "after overlay");
        //------------------ROTATION---------------------
        if(pictureFile == null)
        {
            Log.d(TAG, "Error creating media file, check storage permissions.");
            return;
        }
        try
        {
            ExifInterface exif = new ExifInterface(pictureFile.getAbsolutePath()); // since API level 5
            String exifOrientation = exif.getAttribute(ExifInterface.TAG_ORIENTATION);
            Log.d("main", "exif orientation = " + exifOrientation);
            FileOutputStream fos = new FileOutputStream(pictureFile);
            Log.d(TAG, "HELLO!!!");
            combination.compress(CompressFormat.JPEG, 100, fos); // swapped bitmap for combination
            fos.flush();
            fos.close();
            //------------------------------
            clearBitmap(combination);
            clearBitmap(bmp);
            //------------------------------
        }
        catch(FileNotFoundException e)
        {
            Log.d(TAG, "File not found: " + e.getMessage());
        }
        catch(IOException e)
        {
            Log.d(TAG, "Error accessing file: " + e.getMessage());
        }
    }
};
Does anybody know what happens? Can I call it twice?
Make the thread pause for about two seconds and then call takePicture again (a sketch follows the code below),
or use your code above with this preview callback:
public void takeSnapPhoto() {
    camera.setOneShotPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            Camera.Parameters parameters = camera.getParameters();
            int format = parameters.getPreviewFormat();
            // YUV formats require more conversion
            if (format == ImageFormat.NV21 || format == ImageFormat.YUY2 || format == ImageFormat.NV16) {
                int w = parameters.getPreviewSize().width;
                int h = parameters.getPreviewSize().height;
                // Get the YUV image
                YuvImage yuv_image = new YuvImage(data, format, w, h, null);
                // Convert YUV to JPEG
                Rect rect = new Rect(0, 0, w, h);
                ByteArrayOutputStream output_stream = new ByteArrayOutputStream();
                yuv_image.compressToJpeg(rect, 100, output_stream);
                byte[] byt = output_stream.toByteArray();
                FileOutputStream outStream = null;
                try {
                    // Write to SD card
                    File file = createFileInSDCard(FOLDER_PATH, "Image_" + System.currentTimeMillis() + ".jpg");
                    //Uri uriSavedImage = Uri.fromFile(file);
                    outStream = new FileOutputStream(file);
                    outStream.write(byt);
                    outStream.close();
                } catch (FileNotFoundException e) {
                    e.printStackTrace();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    });
}
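For the first suggestion, a minimal sketch of pausing before the second shot (the Handler usage is an assumption; with the old Camera API the preview stops after takePicture(), so it has to be restarted first):

// Inside onPictureTaken(), after saving the first image:
new Handler().postDelayed(new Runnable() {
    @Override
    public void run() {
        mCamera.startPreview(); // preview must be running again before the next shot
        mCamera.takePicture(null, null, mPicture);
    }
}, 2000); // wait ~2 seconds after the first onPictureTaken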
I want to use a barcode scanner in my application and I am using the ZBar library.
I can scan barcodes, but I also want to save the scanned image to the SD card.
So far I am able to capture the image and save it to the SD card, but when I try to open it I get a broken-image error and cannot display it.
What I am using is:
private final Camera.PreviewCallback saveImage = new Camera.PreviewCallback()
{
    @Override
    public void onPreviewFrame(byte[] data, Camera camera)
    {
        mCamera.setPreviewCallback(null);
        String path = Environment.getExternalStorageDirectory() + "/DCIM/mcs_" + timeStamp + ".jpg";
        FileOutputStream fos = null;
        try
        {
            fos = new FileOutputStream(path);
            // NOTE: data holds raw NV21 preview bytes, not JPEG data,
            // which is why the saved file cannot be opened as an image.
            fos.write(data);
            fos.close();
        }
        catch (Exception e)
        {
            e.printStackTrace();
        }
    }
};
PreviewCallback previewCb = new PreviewCallback()
{
public void onPreviewFrame(byte[] data, Camera camera)
{
Camera.Parameters parameters = camera.getParameters();
Size size = parameters.getPreviewSize();
Image barcode = new Image(size.width, size.height);
barcode.setData(data);
barcode = barcode.convert("Y800");
int result = scanner.scanImage(barcode);
if (result != 0)
{
mCamera.setPreviewCallback(saveImage);
mCamera.stopPreview();
SymbolSet syms = scanner.getResults();
for (Symbol sym : syms)
{
Intent intent = new Intent(getApplicationContext(), ScanCodeResult.class);
intent.putExtra("timeStamp", timeStamp);
intent.putExtra("result", sym.getData().toString());
//startActivity(intent);
break;
}
}
}
};
@Override
public void onPreviewFrame(byte[] data, Camera camera)
{
    Size size = camera.getParameters().getPreviewSize(); // get the preview size
    final int w = size.width;  // width
    final int h = size.height; // height
    final YuvImage image = new YuvImage(data, ImageFormat.NV21, w, h, null);
    ByteArrayOutputStream os = new ByteArrayOutputStream(data.length);
    if (!image.compressToJpeg(new Rect(0, 0, w, h), 100, os)) {
        return;
    }
    byte[] tmp = os.toByteArray();
    Bitmap bmp = BitmapFactory.decodeByteArray(tmp, 0, tmp.length);
    FileHelper fileHelper = new FileHelper();
    fileHelper.storeInSD(bmp);
}