I am working on an Android app that takes frames from the camera and displays them on a SurfaceView. I use the camera callbacks to get the raw images, convert each one to a byte stream, and send it to a server for processing; the server then returns the same frame. The problem is that the ImageView is very slow at drawing the returned images (bitmaps), around 15-20 fps. Is there another approach I can use to draw the bitmaps more quickly? In the current code I process the bitmaps on a separate thread and set them on the ImageView from the UI thread.
The code in the camera callback is:
Camera.Size pSize = camera.getParameters().getPreviewSize();
// Compress the NV21 preview frame to JPEG so it can be decoded into a Bitmap
YuvImage yuv = new YuvImage(data, ImageFormat.NV21, pSize.width, pSize.height, null);
yuv.compressToJpeg(new Rect(0, 0, pSize.width, pSize.height), 100, baos);
rawImage = baos.toByteArray();
Bitmap bitmap = BitmapFactory.decodeByteArray(rawImage, 0, rawImage.length);
bitmap = Util.getResizedBitmap(bitmap, frameResolution);
// Rotate the frame and re-compress it as WebP before sending it to the server
final ByteArrayOutputStream rotatedStream = new ByteArrayOutputStream();
bitmap = Util.RotateBitmap(bitmap, 90);
bitmap.compress(Bitmap.CompressFormat.WEBP, 100, rotatedStream);
baos.close();
rawImage = rotatedStream.toByteArray();
if (isStreamingStart) {
    beforeTime = new Date().getTime();
    if (!client.isConnected()) {
        client.connect();
    }
    client.send(rawImage);
    rotatedStream.flush();
}
The code that receives the frame back from the server and displays it is:
// Decode the Base64 payload returned by the server
decodedString = Base64.decode((String) data, Base64.DEFAULT);
byte[] dataString = ((String) data).getBytes();
String stringDecompressed = compressor.decompressToString(dataString);
byte[] imageAsBytes = stringDecompressed.getBytes();
final Bitmap bitmap = BitmapFactory.decodeByteArray(decodedString, 0, decodedString.length);
final Bitmap mutableBitmap = bitmap.copy(Bitmap.Config.ARGB_8888, true);
HomeActivity.this.runOnUiThread(new Runnable() {
    public void run() {
        try {
            remoteViewImageView.setImageBitmap(bitmap);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
});
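One option that is usually faster than calling setImageBitmap on an ImageView for every frame is to draw the decoded bitmap straight onto the SurfaceView's canvas, which can also be done from a background thread. This is only a minimal sketch, assuming a SurfaceView field named remoteSurfaceView (that name is illustrative, not taken from the code above):
private void drawFrame(Bitmap bitmap) {
    SurfaceHolder holder = remoteSurfaceView.getHolder();
    Canvas canvas = holder.lockCanvas(); // may return null if the surface is not ready
    if (canvas == null) {
        return;
    }
    try {
        canvas.drawColor(Color.BLACK); // clear the previous frame
        canvas.drawBitmap(bitmap, null,
                new Rect(0, 0, canvas.getWidth(), canvas.getHeight()), null);
    } finally {
        holder.unlockCanvasAndPost(canvas); // push the frame to the screen
    }
}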
Related
I want to convert my captured image into a byte[]. When I capture an image with the camera it is captured, the preview is shown, and the image is saved to my external storage successfully. But when I try to convert the preview image, nothing gets stored in the byte array.
The following method is called when I press the preview-image button on my phone:
public static void previewCapturedImage() {
    try {
        imgPreview.setVisibility(View.VISIBLE);
        // Downsample the saved file so the bitmap fits in memory
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inSampleSize = 8;
        final Bitmap bitmap = BitmapFactory.decodeFile(fileUri.getPath(), options);
        imgPreview.setImageBitmap(bitmap);
        // Compress the bitmap into a byte array
        ByteArrayOutputStream stream = new ByteArrayOutputStream();
        bitmap.compress(CompressFormat.JPEG, 100, stream);
        byte[] byteArray = stream.toByteArray();
    } catch (NullPointerException e) {
        e.printStackTrace();
    }
}
To convert an image to a String and a byte array, use the following piece of code:
ByteArrayOutputStream baos = new ByteArrayOutputStream();
yourbitmap.compress(Bitmap.CompressFormat.JPEG, 100, baos);
//this will convert image to byte[]
byte[] byteArrayImage = baos.toByteArray();
// this will convert byte[] to string
String encodedImage = Base64.encodeToString(byteArrayImage, Base64.DEFAULT);
Check the working code below. It also includes quality adjustment:
/**
 * @param bitmap  the bitmap to compress
 * @param quality 1 ~ 100
 * @return the compressed bytes, or null on failure
 */
public static byte[] compressBitmap(Bitmap bitmap, int quality)
{
    try
    {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        // JPEG honours the quality parameter; PNG is lossless and ignores it
        bitmap.compress(Bitmap.CompressFormat.JPEG, quality, baos);
        return baos.toByteArray();
    } catch (Exception e)
    {
        PrintLog.print(TAG, e.toString(), e);
    }
    return null;
}
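For example, a captured bitmap can be compressed and then Base64-encoded before sending it over the network (a usage sketch; yourBitmap is a placeholder name):
// Compress at quality 80 and encode for transport
byte[] jpegBytes = compressBitmap(yourBitmap, 80);
if (jpegBytes != null) {
    String encoded = Base64.encodeToString(jpegBytes, Base64.DEFAULT);
}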
I'm developing an app that combines two bitmaps, where one bitmap comes from a drawable and the other is taken from a camera snapshot. However, the pictures always end up incomplete: half the picture is fine, but the other half is gray. Is there a way to make sure that the file is complete before the app moves on with the code? Below is the code that writes and saves the file. Thanks
Combine.java
protected void createPostcard(byte[] data, File pictureFile, CameraActivity app, Button shareButton,
        Button newButton) {
    try {
        // Decode the camera snapshot and overlay the drawable on top of it
        Bitmap photo = BitmapFactory.decodeByteArray(data, 0, data.length);
        Bitmap splash = Bitmap.createScaledBitmap(BitmapFactory.decodeResource(app.getResources(),
                R.drawable.wishsplash), photo.getWidth(), photo.getHeight(), false);
        Bitmap postcard = Bitmap.createBitmap(photo.getWidth(), photo.getHeight(), photo.getConfig());
        Canvas canvas = new Canvas(postcard);
        canvas.drawBitmap(photo, new Matrix(), null);
        canvas.drawBitmap(splash, 0, 0, null);
        savePostcard(postcard, pictureFile, app, shareButton, newButton);
    } catch (Exception e) {
        e.printStackTrace();
    }//end catch
}//end createPostcard
/**
* Saves the postcard
*/
private void savePostcard(Bitmap postcard, File pictureFile, CameraActivity app, Button shareButton,
        Button newButton) {
    BitmapDrawable mBitmapDrawable = new BitmapDrawable(postcard);
    Bitmap mNewSaving = mBitmapDrawable.getBitmap();
    ByteArrayOutputStream stream = new ByteArrayOutputStream();
    mNewSaving.compress(CompressFormat.JPEG, 100, stream);
    byte[] byteArray = stream.toByteArray();
    save(byteArray, pictureFile, app);
    shareButton.setBackgroundResource(R.drawable.sharebutton);
    newButton.setBackgroundResource(R.drawable.newbutton);
    shareButton.setEnabled(true);
    newButton.setEnabled(true);
}//end savePostcard
/**
 * Check if external storage is available. If not, the postcard will be saved to internal storage.
 */
private void save(byte[] data, File pictureFile, CameraActivity app) {
    try {
        if (Environment.getExternalStorageState().equals(Environment.MEDIA_MOUNTED)) {
            FileOutputStream fos = new FileOutputStream(pictureFile);
            imageUri = Uri.fromFile(pictureFile);
            fos.write(data);
            imageFile = pictureFile;
            fos.close();
            app.sendBroadcast(new Intent(Intent.ACTION_MEDIA_MOUNTED,
                    Uri.parse("file://" + Environment.getExternalStorageDirectory())));
        } else {
            File cache = app.getCacheDir();
            File internalPic = new File(cache, pictureFile.getName());
            FileOutputStream fos = new FileOutputStream(internalPic);
            imageUri = Uri.fromFile(internalPic);
            imageFile = internalPic;
            fos.write(data);
            fos.close();
        }//end else
    } catch (FileNotFoundException e) {
        System.out.println("FILENOTFOUND");
    } catch (IOException e) {
        System.out.println("IOEXCEPTION");
    }//end catch
}//end save
Try this code:
public Bitmap PutoverBmp(Bitmap bmp, Bitmap border) {
    Paint paint = new Paint();
    final int width = bmp.getWidth(); // bmp is your main bitmap
    final int height = bmp.getHeight();
    Bitmap patt = Bitmap.createScaledBitmap(bmp, width, height, true);
    Bitmap mutableBitmap = patt.copy(Bitmap.Config.ARGB_8888, true);
    Canvas canvas = new Canvas(mutableBitmap);
    Bitmap scaledBorder = Bitmap.createScaledBitmap(border, width, height, true);
    paint.setAlpha(100);
    // Draw the semi-transparent overlay on top of the main bitmap
    canvas.drawBitmap(scaledBorder, 0, 0, paint);
    return mutableBitmap;
}
Then simply call it: Bitmap combined = PutoverBmp(bmp, yourOtherBitmap);
Hi guys, I wanted to ask one thing. I have a chat that transfers strings, and I can also attach JPEG images: before sending an image I convert it into a string, and later decode it back into a Bitmap. The problem is that the app crashes when I decode it. I wanted to know whether this is the right code for decoding it.
NOME = (TextView) row.findViewById(R.id.comment);
NOME.setText(coment.comment);
String a = NOME.getText().toString();
if (a.length() > 1024) {
    byte[] image = Base64.decode(a, 0);
    int lung = a.length();
    Bitmap bitmap = BitmapFactory.decodeByteArray(image, 0, lung);
    Image = (ImageView) row.findViewById(R.id.image);
    Image.setImageBitmap(bitmap);
}
The code looks fine. If I had to guess, I would say you're getting an Out of Memory error, which is very common when loading images. Check out
http://developer.android.com/training/displaying-bitmaps/load-bitmap.html
for some best practices when loading images.
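As a rough illustration of what that page recommends, the image can be downsampled while it is decoded so the resulting bitmap stays small. This is only a sketch, reusing the Base64-decoded byte array from the question and an arbitrary target size of about 200 px:
// Read only the dimensions first
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeByteArray(image, 0, image.length, options);

// Pick a power-of-two sample size that brings the image down to roughly 200 px
int inSampleSize = 1;
while (options.outWidth / (inSampleSize * 2) >= 200
        && options.outHeight / (inSampleSize * 2) >= 200) {
    inSampleSize *= 2;
}

// Decode again, this time producing the downsampled bitmap
options.inJustDecodeBounds = false;
options.inSampleSize = inSampleSize;
Bitmap bitmap = BitmapFactory.decodeByteArray(image, 0, image.length, options);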
The method for encoding an image to a Base64 string:
// Assumes this method lives in an Activity, so getResources() is available
public String encodeToString() {
    String imageString = null;
    try {
        Bitmap bm = BitmapFactory.decodeResource(getResources(), R.drawable.ic_launcher);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        bm.compress(Bitmap.CompressFormat.PNG, 100, baos); // bm is the bitmap object
        byte[] b = baos.toByteArray();
        imageString = Base64.encodeToString(b, Base64.DEFAULT);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return imageString;
}
The method for decoding a Base64 string back to an image:
// Assumes image_view is an ImageView field of the same Activity
public void decodeToImage(String imageString) {
    try {
        byte[] imageByte = Base64.decode(imageString, Base64.DEFAULT);
        Bitmap bm = BitmapFactory.decodeByteArray(imageByte, 0, imageByte.length);
        image_view.setImageBitmap(bm);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
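Used together, for example when the image is passed through a chat message, the round trip is simply:
// Encode the launcher icon, then decode it straight back into the ImageView
String imageString = encodeToString();
decodeToImage(imageString);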
I'm trying to capture the picture I get from webView.capturePicture() and save it to an SQLite database. To do that I need to convert the image to a byte[] so I can save it as a BLOB in my table, and then be able to retrieve that byte[] and convert it back to a bitmap.
Here is what I'm doing:
Picture p = webView.capturePicture();
ByteArrayOutputStream bos = new ByteArrayOutputStream();
p.writeToStream(bos);
byte[] ba = bos.toByteArray();
I then retrieve the image by:
byte[] image = cursor.getBlob(imageColumnIndex);
Bitmap bm = BitmapFactory.decodeByteArray(image, 0, image.length);
I'm able to retrieve the byte[] just fine, but I always get a null bitmap back from BitmapFactory.
I also notice that if I log.d(TAG, "" + bos) I get a long sequence of bytes as expected, but if I do the same to ba right after bos.toByteArray() I just get something short like this: [B@2b0a7c60
I'm guessing I'm having trouble converting my OutputStream to a byte array. Could this be because the capturePicture() method returns an OutputStream instead of a ByteArrayOutputStream?
Any help would be appreciated.
Use the two functions below to convert:
public String convertBitmapToString(Bitmap src) {
    String str = null;
    if (src != null) {
        ByteArrayOutputStream os = new ByteArrayOutputStream();
        src.compress(android.graphics.Bitmap.CompressFormat.PNG, 100, (OutputStream) os);
        byte[] byteArray = os.toByteArray();
        str = Base64.encodeToString(byteArray, Base64.DEFAULT);
    }
    return str;
}
public static Bitmap getBitMapFromString(String src) {
    Bitmap bitmap = null;
    if (src != null) {
        byte[] decodedString = Base64.decode(src.getBytes(), Base64.DEFAULT);
        bitmap = BitmapFactory.decodeByteArray(decodedString, 0, decodedString.length);
    }
    return bitmap;
}
Updated:
// Convert a Picture to a Bitmap
private static Bitmap pictureDrawable2Bitmap(Picture picture) {
    PictureDrawable pictureDrawable = new PictureDrawable(picture);
    Bitmap bitmap = Bitmap.createBitmap(pictureDrawable.getIntrinsicWidth(),
            pictureDrawable.getIntrinsicHeight(), Config.ARGB_8888);
    Canvas canvas = new Canvas(bitmap);
    canvas.drawPicture(pictureDrawable.getPicture());
    return bitmap;
}
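Putting the pieces together for the WebView capture, a sketch might look like this (it assumes the Picture returned by capturePicture() has a non-zero width and height):
// Turn the WebView's Picture into a real Bitmap first, then into a Base64 string for storage
Picture p = webView.capturePicture();
Bitmap bitmap = pictureDrawable2Bitmap(p);
String encoded = convertBitmapToString(bitmap);
// 'encoded' can now be stored in the database and decoded again later
Bitmap restored = getBitMapFromString(encoded);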
How do I use
onPreviewFrame(byte[] data, Camera camera)
in order to call getPixel(int x, int y)? I want to convert the frame into a Bitmap; is that possible? I am using onPreviewFrame because I want to read pixel data every second, so taking a full picture each time would take too long.
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    // Convert the raw NV21 frame to JPEG bytes, then decode them into a Bitmap
    byte[] baos = convertYuvToJpeg(data, camera);
    if (baos != null) {
        Bitmap bitmap = Tool.loadBitmap(baos);
    }
}
public byte[] convertYuvToJpeg(byte[] data, Camera camera) {
    try {
        Camera.Size size = camera.getParameters().getPreviewSize();
        YuvImage image = new YuvImage(data, ImageFormat.NV21,
                size.width, size.height, null);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        Rect rect = new Rect(0, 0, size.width, size.height);
        // Set quality (1-100)
        int quality = 100;
        image.compressToJpeg(rect, quality, baos);
        return baos.toByteArray();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
}
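Once a frame has been decoded to a Bitmap, individual pixels can be read with getPixel. A minimal sketch (decoding with BitmapFactory directly instead of the Tool.loadBitmap helper used above):
byte[] jpeg = convertYuvToJpeg(data, camera);
if (jpeg != null) {
    Bitmap bitmap = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
    // Coordinates must be inside the bitmap bounds
    int color = bitmap.getPixel(10, 10);
    int red = Color.red(color);
    int green = Color.green(color);
    int blue = Color.blue(color);
}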