I'm trying to save frames from a live video stream. I can show the live stream in my Android application and save frames to local storage. Now I want to add some sort of delay in my saveImage function so that images are saved only after a specific interval. I have tried both Handler and TimerTask. The image does get saved in my local directory after the delay, but the byte length of the saved image is sometimes very small and other times normal. I want the saved image to be a complete frame of the video stream that my application is receiving.
I hope the question is easy to understand; I am a beginner with both Android and Stack Overflow.
P.S. I used an MJPEG library to show the video. The stream comes from an IP camera.
Code for saving the images
public void saveImage() {
    try {
        photo = new File(Environment.getExternalStorageDirectory(),
                "Download/photos/photo" + Instant.now().getEpochSecond() + ".jpg");
        if (photo.exists()) {
            photo.delete();
        }
        Log.d("PictureDemo", "Photo " + photo);
        FileOutputStream fos = new FileOutputStream(photo);
        // 'image' is the shared frame buffer; its contents (and length) change
        // every time a new frame arrives from the stream.
        Log.d("PictureDemo", "Image_length " + image.length);
        fos.write(image);
        fos.close();
    } catch (IOException e) {
        Log.e("PictureDemo", "Exception in photoCallback", e);
    }
}
Code where saveImage is called
final Bitmap outputImg = BitmapFactory.decodeByteArray(image, 0, image.length);
if (outputImg != null) {
    if (run) {
        newFrame(outputImg);
        // Save the frame to storage after a 5 s delay.
        // Note: a new Timer is created for every frame, and by the time a task
        // fires, the shared 'image' buffer may already hold a different
        // (possibly partial) frame, which is likely why the saved sizes vary.
        new Timer().schedule(
                new TimerTask() {
                    @Override
                    public void run() {
                        saveImage();
                    }
                }, 5000
        );
    }
} else {
    Log.e(tag, "Read image error");
}
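One way to make the delayed save independent of later frames, assuming image is the byte[] holding the most recent JPEG frame: copy the bytes at the moment the delay starts, so the scheduled task writes a stable snapshot rather than whatever the shared buffer holds five seconds later. This is only a sketch; saveSnapshot() is a hypothetical variant of saveImage(), and it needs java.util.Arrays, android.os.Handler and android.os.Looper imports.
// Sketch only: copy the current frame before scheduling the delayed save,
// so later frames cannot overwrite the data that will be written to disk.
final byte[] snapshot = Arrays.copyOf(image, image.length);
new Handler(Looper.getMainLooper()).postDelayed(new Runnable() {
    @Override
    public void run() {
        saveSnapshot(snapshot); // hypothetical saveImage() variant that takes a byte[]
    }
}, 5000);

// The variant writes the explicit byte[] instead of the shared 'image' field:
private void saveSnapshot(byte[] data) {
    File photo = new File(Environment.getExternalStorageDirectory(),
            "Download/photos/photo" + Instant.now().getEpochSecond() + ".jpg");
    try (FileOutputStream fos = new FileOutputStream(photo)) {
        fos.write(data);
    } catch (IOException e) {
        Log.e("PictureDemo", "Exception while saving snapshot", e);
    }
}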
My app can download an image from a Raspberry Pi, and it works fine. This is the code:
public void downloadFile() {
    FTPClient ftpClient = new FTPClient();
    try {
        ftpClient.connect("******");
        ftpClient.login("****", "*****");
        ftpClient.enterLocalPassiveMode();
        ftpClient.setFileType(FTP.BINARY_FILE_TYPE);
        String remoteFile1; // remote file name (value elided in the original post)
        File downloadFile1 = new File(filePath);
        OutputStream outputStream1 = new BufferedOutputStream(new FileOutputStream(downloadFile1));
        boolean success = ftpClient.retrieveFile(remoteFile1, outputStream1);
        outputStream1.close();
        if (success) {
            System.out.println("File #1 has been downloaded successfully.");
        } else {
            System.out.println("Error in downloading file!");
        }
        boolean logout = ftpClient.logout();
        if (logout) {
            System.out.println("Connection closed...");
        }
    } catch (IOException ex) {
        System.out.println("Error: " + ex.getMessage());
        ex.printStackTrace();
    } finally {
        try {
            ftpClient.disconnect();
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }
}
Then I can display it so the user of my app can see it. For loading the image I'm using this code, and it works too:
private void loadImage(String imagePath) {
    Uri imageUri = Uri.parse(imagePath);
    String fullImagePath = imageUri.getPath();
    Drawable image = Drawable.createFromPath(fullImagePath);
    ImageView imageDisplay = (ImageView) findViewById(R.id.imageDisplay);
    imageDisplay.setImageDrawable(image);
}
Now I want to display the image without downloading it to my gallery, but I can't figure out how to do this.
Can someone help me, please?
You cannot show an image without downloading it; whenever you view something "remotely", you are still downloading it.
If you mean that the image is too large and you don't want to download it fully but still want the user to be able to view it, one possible solution is to generate a thumbnail (reduced image) on the server side and show that preview to the user. Then, if the user wants to save it to the gallery, you can fetch the original image.
If you want to display an image without the FTP download, it has to be uploaded to an image hosting site or similar, so you can just use the link instead of the whole FTP client.
Basically, the code you are using is intended for saving an image, and the code you are using for loading reads its data from a Drawable created from a file path, so you are on the wrong path.
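If the goal is simply to keep the picture out of the gallery, the bytes can be decoded straight into memory instead of into a file. A minimal sketch using Commons Net's retrieveFileStream(); fetchImageInMemory() is a hypothetical helper, the host and credentials are the same placeholders as in the question, and it must be called off the main thread (for example from an AsyncTask):
private Bitmap fetchImageInMemory(String remoteFile) throws IOException {
    FTPClient ftpClient = new FTPClient();
    try {
        ftpClient.connect("******");
        ftpClient.login("****", "*****");
        ftpClient.enterLocalPassiveMode();
        ftpClient.setFileType(FTP.BINARY_FILE_TYPE);
        // Stream the remote file and decode it directly; nothing is written to storage.
        InputStream in = ftpClient.retrieveFileStream(remoteFile);
        Bitmap bitmap = BitmapFactory.decodeStream(in);
        in.close();
        ftpClient.completePendingCommand(); // required after retrieveFileStream()
        ftpClient.logout();
        return bitmap;
    } finally {
        ftpClient.disconnect();
    }
}
Back on the UI thread (for example in onPostExecute()), set the result on the existing view with imageDisplay.setImageBitmap(bitmap);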
I'm currently developing an Android application. So far I have built a custom Android camera by following this tutorial (http://courses.oreillyschool.com/android2/CameraAdvanced.html). The tutorial saves the picture into the gallery, but I want to attach a name, description, and other information to the image, because I'm going to save the image along with the details the user enters into my database.
Here are examples from my interface:
a) Successfully taken image:
b) The image that needs to be passed to another ImageView (red circle):
I want the Save button to pass the image that has been taken to the ImageView (red circle) instead of storing it in the gallery. Here is the code for the Save button:
private View.OnClickListener mSaveImageButtonClickListener = new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        File saveFile = openFileForImage();
        if (saveFile != null) {
            saveImageToFile(saveFile);
        } else {
            Toast.makeText(Capturingimage.this, "Unable to open file to save the image.", Toast.LENGTH_LONG).show();
        }
    }
};
This is the saveImageToFile() method:
private void saveImageToFile(File file) {
    if (mCameraBitmap != null) {
        FileOutputStream outStream = null;
        try {
            outStream = new FileOutputStream(file);
            if (!mCameraBitmap.compress(Bitmap.CompressFormat.PNG, 100, outStream)) {
                Toast.makeText(Capturingimage.this, "Unable to save image to file.",
                        Toast.LENGTH_LONG).show();
            } else {
                Toast.makeText(Capturingimage.this, "Saved image to: " + file.getPath(),
                        Toast.LENGTH_LONG).show();
            }
            outStream.close();
        } catch (Exception e) {
            Toast.makeText(Capturingimage.this, "Unable to save image to file.",
                    Toast.LENGTH_LONG).show();
        }
    }
}
My ImageView (red circle) id is:
@+id/image_view_after_capture
If the code isn't clear, here's the link to the full source code (http://courses.oreillyschool.com/android2/CameraAdvanced.html) for MainActivity.java. I'm sorry if my question is a bit messy and light on explanation of the code; I'm new to Android programming and would appreciate any guidance.
Thank you in advance!
When you save the image, hold on to the captured image path and pass it to the activity where you want to show it:
String mCaptureImagePath = "";

private void saveImageToFile(File file) {
    if (mCameraBitmap != null) {
        FileOutputStream outStream = null;
        try {
            outStream = new FileOutputStream(file);
            if (!mCameraBitmap.compress(Bitmap.CompressFormat.PNG, 100, outStream)) {
                Toast.makeText(Capturingimage.this, "Unable to save image to file.",
                        Toast.LENGTH_LONG).show();
            } else {
                mCaptureImagePath = file.getPath(); // **** this is your saved image path
                Toast.makeText(Capturingimage.this, "Saved image to: " + file.getPath(),
                        Toast.LENGTH_LONG).show();
            }
            outStream.close();
        } catch (Exception e) {
            Toast.makeText(Capturingimage.this, "Unable to save image to file.",
                    Toast.LENGTH_LONG).show();
        }
    }
}
Then show the image wherever you want, like this:
BitmapFactory.Options bmOptions = new BitmapFactory.Options();
Bitmap bitmap = BitmapFactory.decodeFile(mCaptureImagePath,bmOptions);
yourImageView.setImageBitmap(bitmap);
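If the ImageView lives in a different activity, here is a minimal sketch of handing the path over with an Intent; the "image_path" extra key and the DisplayActivity name are illustrative, not from the original code:
// In the capturing activity, after the file has been written:
Intent intent = new Intent(Capturingimage.this, DisplayActivity.class);
intent.putExtra("image_path", mCaptureImagePath);
startActivity(intent);

// In the receiving activity's onCreate():
String path = getIntent().getStringExtra("image_path");
Bitmap bitmap = BitmapFactory.decodeFile(path);
((ImageView) findViewById(R.id.image_view_after_capture)).setImageBitmap(bitmap);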
My application captures video footage and saves it as an .mp4 file. I would like to extract one frame from this video and save it to a file as well. Since I haven't found anything better, I've decided to use MediaMetadataRetriever.getFrameAtTime() for that. It happens inside a class that inherits from AsyncTask. Here is what my code looks like in doInBackground():
Bitmap bitmap1 = null;
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
try {
    retriever.setDataSource(src);
    bitmap1 = retriever.getFrameAtTime(timeUs, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
    if (Utils.saveBitmap(bitmap1, dst)) {
        Log.d(TAG, "doInBackground Image export OK");
    } else {
        Log.d(TAG, "doInBackground Image export FAILED");
    }
} catch (RuntimeException ex) {
    Log.d(TAG, "doInBackground Image export FAILED");
} finally {
    retriever.release();
    if (bitmap1 != null) {
        bitmap1.recycle();
    }
}
And the saveBitmap() method:
File file = new File(filepath);
boolean result;
try {
    result = file.createNewFile();
    if (!result) {
        return false;
    }
    FileOutputStream ostream = new FileOutputStream(file);
    bitmap.compress(Bitmap.CompressFormat.PNG, 100, ostream);
    ostream.close();
    return true;
} catch (Exception e) {
    e.printStackTrace();
    return false;
}
Now, the problem is that the quality of the exported image is noticeably worse than the video quality, which I would not expect with a PNG output format. You can see the difference below:
The first image was extracted from the video using ffmpeg on my desktop; the second was extracted with the code above on a Samsung Galaxy S6. The result looks much the same on every Android device I have tried.
Can someone tell me how I can improve the quality of the exported picture?
I found another solution to the issue. You can use bigflake's example to build a mechanism for extracting a video frame; the one thing you have to add is a seeking mechanism (see the sketch below). This works well, keeps the exact quality, and does not require any third-party libraries. The only downside I've noticed so far is that it takes longer to execute than the original idea.
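A rough sketch of the seeking part only, assuming src and timeUs from the question; the actual decoding loop is the MediaCodec part of bigflake's ExtractMpegFramesTest and is not repeated here (exception handling also omitted):
MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(src);

// Find and select the video track.
for (int i = 0; i < extractor.getTrackCount(); i++) {
    MediaFormat format = extractor.getTrackFormat(i);
    String mime = format.getString(MediaFormat.KEY_MIME);
    if (mime != null && mime.startsWith("video/")) {
        extractor.selectTrack(i);
        break;
    }
}

// Jump to the key frame at or before the requested time...
extractor.seekTo(timeUs, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);

// ...then feed samples into the MediaCodec decoder (as in bigflake's example),
// discarding decoded frames until one with a presentation time >= timeUs comes out.
// Only that frame is saved, so the decoder's full output quality is preserved.

extractor.release(); // once decoding is finished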
I'm writing an application where one of the key features is taking a photo, writing it to a file, and then reading that photo into a Base64 array (all in one button click). The problem is that when I initiate the onClick to take a photo, it returns from the function before onPictureTaken() has received the image and written it to the specified storage directory.
I have added log outputs at several stages in the code, and it is clear that the onClick takePhoto() function exits before the onPictureTaken() callback that it triggers has finished.
The Android documentation states that you need to wait for the JPEG callback to finish before you can restart the preview, but I am having trouble making it wait for the write to finish.
code:
public static void takePhoto() {
    // takePicture() only starts the capture; the JPEG data arrives later in jpegCallback.
    fCamera.takePicture(null, null, jpegCallback);
    Log.d("capture", "photo was captured");
    // Set the image callback
    Log.d("this one is", "being called");
}
static PictureCallback jpegCallback = new Camera.PictureCallback() {
    // Function for handling the picture once the camera delivers it
    public void onPictureTaken(byte[] data, Camera fCamera) {
        //fCamera.stopPreview();
        Log.d("is this", "not being called ??? probably");
        File imagePath;
        FileOutputStream out = null;
        // Create the filename with extension (the data delivered here is JPEG bytes)
        String fileName = "IMAGE_1.bmp";
        // Create / find the storage directory for our pictures
        //storageDir = context.getDir("imageDir", Context.MODE_PRIVATE);
        // Create the directory if it doesn't exist (mkdirs() must be called on the directory, not on the image file)
        if (!storageDir.exists() && !storageDir.mkdirs()) {
            Log.d("@string/app_name", "Failed to create directory");
            return;
        }
        // Create the image file
        imagePath = new File(storageDir, fileName);
        String finalPath = imagePath.getAbsolutePath();
        Log.d("location", finalPath);
        try {
            out = new FileOutputStream(imagePath);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
            return;
        }
        //finalImage.compress(CompressFormat.PNG, 100, out);
        try {
            out.write(data);
            Log.d("write", "photo was written");
            out.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
};
Log cat:
10-13 12:55:39.185: D/capture(7126): photo was captured
10-13 12:55:39.185: D/this one is(7126): being called
These are the only log outputs that occur.
I had the same problem: taking a photo and then trying to create a thumbnail and store it in the media gallery.
As defined here (developer.android.com/reference/android/hardware/Camera.html#takePicture), takePicture() is an asynchronous process, so you have to structure your program logic around it and enforce the sequence yourself.
This can be implemented with an AsyncTask (http://developer.android.com/reference/android/os/AsyncTask.html).
In the sketch below, the code inside onPostExecute() runs only after the JPEG callback has finished, so you can be sure onPictureTaken() from the PictureCallback is done.
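A minimal sketch of that idea; fCamera comes from the question, while TakePhotoTask, writeJpegToFile() and the 10-second timeout are illustrative assumptions (imports needed: android.os.AsyncTask, android.hardware.Camera, java.util.concurrent.CountDownLatch, java.util.concurrent.TimeUnit):
public class TakePhotoTask extends AsyncTask<Void, Void, Boolean> {
    private final CountDownLatch photoWritten = new CountDownLatch(1);

    @Override
    protected void onPreExecute() {
        // Runs on the UI thread: start the capture; the callback fires later.
        fCamera.takePicture(null, null, new Camera.PictureCallback() {
            @Override
            public void onPictureTaken(byte[] data, Camera camera) {
                writeJpegToFile(data);      // hypothetical helper: the same file writing as in the question
                photoWritten.countDown();   // signal that the file is on disk
            }
        });
    }

    @Override
    protected Boolean doInBackground(Void... ignored) {
        try {
            // Wait on the worker thread (never the UI thread) until the callback has run.
            return photoWritten.await(10, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }

    @Override
    protected void onPostExecute(Boolean saved) {
        if (saved) {
            // The JPEG file exists now, so it is safe to read it back and convert it to Base64.
        }
    }
}
Calling new TakePhotoTask().execute(); from the button's onClick would then replace the direct takePhoto() call.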
I need to capture ListView data, convert it into JPG or PNG image format, and save it to the SD card. I can capture the data that is visible on the screen, but I am unable to capture the rows that are scrolled out of view.
Please guide me on how to implement this.
I am using the following code to capture the visible data:
View v1 = btnCapture.getRootView();

public void gettingRootView(View v1) {
    if (v1 != null) {
        v1.setDrawingCacheEnabled(true);
        v1.buildDrawingCache();
        Bitmap bm = v1.getDrawingCache();
        try {
            if (bm != null) {
                Log.e("file", "filepath");
                savePhoto(bm);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

public void savePhoto(Bitmap bmp) {
    Log.e("save photo", "save photo");
    File fileFolder = new File(Environment.getExternalStorageDirectory(), "SMSREADING");
    fileFolder.mkdir();
    Calendar c = Calendar.getInstance();
    try {
        // The bitmap is compressed as PNG, so use a matching extension.
        File fileName = new File(fileFolder, c.getTimeInMillis() + ".png");
        FileOutputStream output = new FileOutputStream(fileName);
        bmp.compress(Bitmap.CompressFormat.PNG, 100, output);
        // Close the stream so the file is flushed and fully written.
        output.close();
    } catch (Exception ex) {
        ex.printStackTrace();
    }
}
You can create HTML from your data using one of the many templating libraries out there; if you have a list of strings, Apache Velocity might work well. After you create the HTML, you can use java-html2image to convert it to an image.
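If you would rather stay on-device instead of going through HTML, a commonly used alternative (not part of the answer above, and only a sketch) is to render every adapter row off-screen and stitch the results into one tall bitmap; listView here is assumed to be the populated ListView:
// Sketch: draw each row of the adapter into its own bitmap,
// then stack them vertically so off-screen rows are captured too.
public Bitmap captureWholeListView(ListView listView) {
    ListAdapter adapter = listView.getAdapter();
    List<Bitmap> rows = new ArrayList<Bitmap>();
    int totalHeight = 0;

    for (int i = 0; i < adapter.getCount(); i++) {
        View item = adapter.getView(i, null, listView);
        // Measure and lay out the row at the ListView's width.
        item.measure(
                View.MeasureSpec.makeMeasureSpec(listView.getWidth(), View.MeasureSpec.EXACTLY),
                View.MeasureSpec.makeMeasureSpec(0, View.MeasureSpec.UNSPECIFIED));
        item.layout(0, 0, item.getMeasuredWidth(), item.getMeasuredHeight());

        Bitmap row = Bitmap.createBitmap(item.getMeasuredWidth(), item.getMeasuredHeight(),
                Bitmap.Config.ARGB_8888);
        item.draw(new Canvas(row));
        rows.add(row);
        totalHeight += item.getMeasuredHeight();
    }

    // Stitch the row bitmaps into a single tall bitmap.
    Bitmap result = Bitmap.createBitmap(listView.getWidth(), totalHeight, Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(result);
    int y = 0;
    for (Bitmap row : rows) {
        canvas.drawBitmap(row, 0, y, null);
        y += row.getHeight();
        row.recycle();
    }
    return result;
}
The returned bitmap can then be handed to the existing savePhoto() method.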