Programmatically take a screenshot from service - android

It is easy to do this with the following code:
Bitmap bitmap;
View v1 = MyView.getRootView();
v1.setDrawingCacheEnabled(true);
bitmap = Bitmap.createBitmap(v1.getDrawingCache());
v1.setDrawingCacheEnabled(false);
It works great, but only when there is an Activity.
How can I take a screenshot from a Service?
My goal is to take a screenshot once an hour, i.e. at 12, then at 1, then at 2, and so on.

To capture a screenshot of your Activity you need one of its Views, and none is available in your Service. You could make a TimerTask that calls into your Activity every hour; the Activity responds with its currently visible View, and you capture the screenshot from that. (I think this is the only solution for your problem.)
Or, if you want a screenshot of the whole device screen (any application), you need root permission and must read the framebuffer, which gives you the raw pixel data of the current screen; you can then convert it to a Bitmap or an image file from your Service.
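If you go the TimerTask route, the hour-boundary arithmetic is plain Java and can be sketched independently of the capture mechanism; captureScreenshot below is a hypothetical placeholder for whatever view or framebuffer approach you end up using:

```java
import java.util.Calendar;
import java.util.Timer;
import java.util.TimerTask;

public class HourlyShots {
    // Milliseconds from 'now' until the next full-hour boundary.
    public static long delayUntilNextHour(Calendar now) {
        Calendar next = (Calendar) now.clone();
        next.set(Calendar.MINUTE, 0);
        next.set(Calendar.SECOND, 0);
        next.set(Calendar.MILLISECOND, 0);
        next.add(Calendar.HOUR_OF_DAY, 1);
        return next.getTimeInMillis() - now.getTimeInMillis();
    }

    // Fire the (hypothetical) capture callback on every hour boundary.
    public static void schedule(Timer timer, final Runnable captureScreenshot) {
        long hour = 60L * 60L * 1000L;
        timer.scheduleAtFixedRate(new TimerTask() {
            @Override
            public void run() {
                captureScreenshot.run();
            }
        }, delayUntilNextHour(Calendar.getInstance()), hour);
    }
}
```

On a real device, AlarmManager is usually the better fit for hour-scale schedules, since a java.util.Timer only lives as long as your process.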

Android Screenshot Library (ASL) provides means for taking snapshots of the phone's screen without the need to sign your application or have privileged (root) access to the Android system.
Click here for ASL

Related

Print a bitmap image without resizing through Wi-Fi on Android App?

I need to make an app for label printing like this.
I am checking this tutorial.
PrintHelper has very simple and limited features: I can only use two scale modes, SCALE_MODE_FILL and SCALE_MODE_FIT.
My bitmap image is 512px x 512px, and I might need to adjust its size to match the label sticker size. Or I need to choose the paper size (e.g. 100mm x 100mm); either way, the result is the same.
When I try this code, it opens the print settings activity.
private void doPhotoPrint(Bitmap bitmap) {
    PrintHelper printHelper = new PrintHelper(this);
    printHelper.setColorMode(PrintHelper.COLOR_MODE_MONOCHROME);
    // Bitmap bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.droids);
    printHelper.printBitmap("droids.jpg - test print", bitmap);
}
However, I want to implement the print function without opening the settings screen: when I tap 'print' in my application, it should immediately print one or more bitmap images in sequence with the default settings I configured (image size, black & white/color, the connected printer, paper size).
Is there any way to make a function like the video above?
With PrintHelper you get the system print dialog; there is no way to print silently, since the user has to pick the printer and the print attributes from the dialog. To work like the video, you'd need to implement the discovery and printing functionality by talking to the printer directly.
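For the "talk to the printer directly" route, many network label printers accept raw jobs on TCP port 9100 (the JetDirect/raw port). This is a sketch under that assumption; whether your printer accepts a plain image payload, or instead wants its own command language (ZPL, ESC/POS, etc.), depends on the model, so check its programming manual:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;

public class RawPrinter {
    // Write one pre-rendered job payload to an already-open stream.
    // Factored out so it can be exercised without a physical printer.
    public static void writeJob(OutputStream out, byte[] payload) throws IOException {
        out.write(payload);
        out.flush();
    }

    // Assumption: the printer listens on the conventional raw port 9100.
    public static void print(String printerIp, byte[] payload) throws IOException {
        try (Socket socket = new Socket(printerIp, 9100)) {
            writeJob(socket.getOutputStream(), payload);
        }
    }
}
```

On Android this must run off the main thread, or you'll get a NetworkOnMainThreadException.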

Possible to make android watch face a picture from my gallery?

Is it possible to make an app that will take a picture from a user's phone's gallery and convert it to an Android wearable watch face?
I've been reading up on these Android articles
https://developer.android.com/training/wearables/watch-faces/index.html
https://developer.android.com/training/wearables/watch-faces/drawing.html
and it seems that if I can get a user to select a picture from the gallery and convert it to a bitmap, it would be plausible to set that as the watch face. I'm definitely a beginner programmer when it comes to Android and APKs.
Confirmation from a more advanced Android developer would be great.
Now where I'm getting confused is whether picking the picture would happen on the user's phone, which then sends it to the wearable app, or whether the wearable app can access the phone's gallery and select it directly. Does anyone know if wearable apps can access the gallery of a user's phone?
Assuming I already have a reference to the selected image, would it be something like this? Correct me if I'm wrong. (Taken from the second article under "Initialize watch face elements".)
@Override
public void onCreate(SurfaceHolder holder) {
    super.onCreate(holder);
    // configure the system UI (see next section)
    ...
    // load the background image
    Resources resources = AnalogWatchFaceService.this.getResources();
    // at this point the user should have already picked the picture they want,
    // so set "backgroundDrawable" to the image the user selected
    int idOfUserSelectPicture = GetIdOfUserSelectedPictureSomehow();
    Drawable backgroundDrawable = resources.getDrawable(idOfUserSelectPicture, null);
    // original implementation from the article:
    // Drawable backgroundDrawable = resources.getDrawable(R.drawable.bg, null);
    mBackgroundBitmap = ((BitmapDrawable) backgroundDrawable).getBitmap();
    // create graphic styles
    mHourPaint = new Paint();
    mHourPaint.setARGB(255, 200, 200, 200);
    mHourPaint.setStrokeWidth(5.0f);
    mHourPaint.setAntiAlias(true);
    mHourPaint.setStrokeCap(Paint.Cap.ROUND);
    ...
    // allocate a Calendar to calculate local time using the UTC time and time zone
    mCalendar = Calendar.getInstance();
}
Thank you for any and all help.
The way to implement this is to create a configuration Activity that runs on the phone and picks an image on your device. You can then send this image as an Asset via the Data Layer (http://developer.android.com/training/wearables/data-layer/index.html); it will be received on the watch side, where you can make it the background of the watch face.
It is not possible for an Android Wear device to see the photo collection on your phone; they are totally separate devices, and nothing is shared by default unless you write an application that does so.
The Data Layer sample shows how to take a photo on the phone, and then send it to the wearable: https://github.com/googlesamples/android-DataLayer

Android cwac-camera to take multiple photos?

The title may be unclear, but I'm using this awesome library by CommonsWare (nice meeting you at DroidCon, btw) to deal with the notorious issues with Android's fragmented camera API.
I want to take 5 photos, or frames, but not simultaneously: each frame should be captured a few milliseconds apart, or presumably after the previous photo has been successfully captured. Can this be done?
I'm following the standalone implementation in the demos, and simply taking a photo using
mCapture.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        try {
            takePicture(true, false);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
});
I pass true to takePicture() because I will need the resulting Bitmap. I also disabled single-shot mode, since I will want to take another photo right after the previous one has been snapped and the preview has resumed:
By default, the result of taking a picture is to return the
CameraFragment to preview mode, ready to take the next picture.
If, instead, you only need the one picture, or you want to send the
user to some other bit of UI first and do not want preview to start up
again right away, override useSingleShotMode() in your CameraHost to
return true. Or, call useSingleShotMode() on your
SimpleCameraHost.Builder, passing in a boolean to use by default. Or,
call useSingleShotMode() on your PictureTransaction, to control this
for an individual picture.
I was looking for a callback like onPictureTaken() or something similar inside CameraHost that would let me snap another photo right away, before the camera is released, but I don't see anything like that. Has anyone done something like this with this library? Can the illustrious CommonsWare please shed some light on this (if you see this)?
Thank you!
Read past the quoted paragraph to the next one, which begins with:
You will then probably want to use your own saveImage() implementation in your CameraHost to do whatever you want instead of restarting the preview. For example, you could start another activity to do something with the image.
If what you want is possible, you would call takePicture() again in saveImage() of your CameraHost, in addition to doing something with the image you received.
However:
Even with large heap enabled, you may not have enough heap space for what you are trying to do. You may need to explicitly choose a lower resolution image for the pictures.
This isn't exactly within the scope of the library. It may work, and I don't have a problem with it working, but being able to take N pictures in M seconds isn't part of the library's itch that I am (very very slowly) scratching. In particular, I don't think I have tested taking a picture with the preview already off, and there may be some issues in my code in that area.
Long-term, you may be better served with preview frame processing, rather than actually taking pictures.
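The retake-from-saveImage() pattern described above boils down to a small counter that is independent of the camera API. Here it is as plain Java, with the real takePicture() call stubbed out as a Runnable:

```java
public class BurstController {
    private int remaining;
    private final Runnable takePicture; // stand-in for the real takePicture() call

    public BurstController(int frames, Runnable takePicture) {
        this.remaining = frames;
        this.takePicture = takePicture;
    }

    // Kick off the first shot.
    public void start() {
        if (remaining > 0) takePicture.run();
    }

    // Call this from saveImage() after handling the received image;
    // it triggers the next shot until the requested count is reached.
    public void onImageSaved() {
        remaining--;
        if (remaining > 0) takePicture.run();
    }

    public boolean done() {
        return remaining <= 0;
    }
}
```

In your CameraHost's saveImage() you would process or store the received image first, then call onImageSaved() to request the next frame.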

How to get pixel data from screencap.cpp directly

I'm a newbie in Android.
I'm using a Nexus 7 as a reference device, and I've downloaded the full source code from source.android.com.
I have an engineering system image and I can build a system application.
The /system/bin/screencap utility works well for capturing the screen.
I want to get the pixel data directly, the way screencap.cpp does, in my application.
When I use the screencap utility, the process looks like this:
1. Capture the screen and save an image.
2. Open the image file.
3. Decode the file into a bitmap.
4. Get the pixel data (an int array) from the bitmap.
I want to remove steps 1, 2, and 3 and just call an API that returns the pixel data of the screen directly.
How can I do that?
If you're running with system privileges, you can just ask SurfaceComposerClient for the pixel data rather than launching a separate process to do it for you.
Looking at the screencap source code, all you really need is the Binder initialization:
ProcessState::self()->startThreadPool();
and the SurfaceComposerClient IPC call:
ScreenshotClient screenshot;
sp<IBinder> display = SurfaceComposerClient::getBuiltInDisplay(displayId);
if (display != NULL && screenshot.update(display, Rect(), false) == NO_ERROR) {
    base = screenshot.getPixels();
    w = screenshot.getWidth();
    h = screenshot.getHeight();
    s = screenshot.getStride();
    f = screenshot.getFormat();
    size = screenshot.getSize();
}
You can safely ignore all the /dev/graphics/fb0 stuff below it -- it's an older approach that no longer works. The rest of the code is just needed for the PNG compression.
If you're not running with system privileges, you can't capture the entire screen. You can capture your app though.
If you are writing a Java app, just call /system/bin/screencap from your application (using java.lang.Process) and read the result into memory as a binary stream. You can see the binary structure in screencap.cpp, but it's just the width, height, and format as four-byte integers, followed by the data.
Note to other readers: this is only possible if you are a system app.
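Parsing the stream that screencap writes is straightforward. The sketch below assumes the three header integers are little-endian (the native byte order on ARM and x86 Android devices) and that the format is 4 bytes per pixel (RGBA_8888); verify both against screencap.cpp for your build:

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ScreencapStream {
    public final int width, height, format;
    public final byte[] pixels;

    // Header layout per screencap.cpp: width, height, format as
    // 32-bit ints (assumed little-endian), then the raw pixel data.
    public ScreencapStream(InputStream in) throws IOException {
        DataInputStream d = new DataInputStream(in);
        width = readLeInt(d);
        height = readLeInt(d);
        format = readLeInt(d);
        // Assumes 4 bytes per pixel (RGBA_8888); other formats differ.
        pixels = new byte[width * height * 4];
        d.readFully(pixels);
    }

    private static int readLeInt(DataInputStream d) throws IOException {
        int b0 = d.readUnsignedByte(), b1 = d.readUnsignedByte();
        int b2 = d.readUnsignedByte(), b3 = d.readUnsignedByte();
        return b0 | (b1 << 8) | (b2 << 16) | (b3 << 24);
    }
}
```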
1) You can transfer data from your screencap utility to your app over the network using sockets.
2) The Android NDK can be used to call your utility's functions directly from your app.

Android - Picture callback data returns an image that is black or jumbled

So I am using the Android camera to take pictures within an Android app. About 90% of my users have no issues, but the other 10% get a picture that returns pure black or a weird jumbling of pixels.
Has anyone else seen this behavior, or have any ideas why it happens?
Examples:
Black:
Jumbled:
I've had similar problems.
The problem, in short, is missing data. It happens to a bitmap/stream when the data stream is interrupted for too long or suddenly becomes unavailable.
Another place it can occur is downloading and uploading images: if the user suddenly disables Wi-Fi/mobile data, no more data can be transmitted, and you end up with a splattered image.
The image will still be viewable (as black or splattered pixels, but viewable) while being invalid internally (missing or corrupted information).
If it's not too critical, you can try to load the data into a Bitmap object (BitmapFactory.decode*) and test whether the returned Bitmap is null; if it is, the data is possibly corrupted.
That only treats the symptom, though. The better way is to tackle the problem at its root:
Ensure a good connection to your data source (a large enough, robust buffer).
Avoid unnecessary casts (e.g. from char to int).
Use the correct type of buffer (Reader/Writer for character streams, InputStream/OutputStream for byte streams).
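As a complement to the decode-and-check-for-null test, a cheap pre-check is to validate the magic bytes before decoding; a truncated download or capture often fails this immediately. The JPEG and PNG signatures below are the standard ones:

```java
import java.util.Arrays;

public class ImageSignature {
    // Standard 8-byte PNG file signature.
    private static final byte[] PNG = {(byte) 0x89, 'P', 'N', 'G', '\r', '\n', 0x1A, '\n'};

    public static boolean looksLikePng(byte[] data) {
        return data.length >= PNG.length
                && Arrays.equals(Arrays.copyOf(data, PNG.length), PNG);
    }

    public static boolean looksLikeJpeg(byte[] data) {
        // JPEG starts with FF D8 and ends with FF D9 (the EOI marker);
        // a missing EOI is a common sign of a truncated capture.
        return data.length >= 4
                && (data[0] & 0xFF) == 0xFF && (data[1] & 0xFF) == 0xD8
                && (data[data.length - 2] & 0xFF) == 0xFF
                && (data[data.length - 1] & 0xFF) == 0xD9;
    }
}
```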
From Android 4.0, hardwareAccelerated is set to true by default in the manifest. A hardware-accelerated Canvas does not support Picture objects, and you will get a black screen...
Please also check whether you use a BitmapFactory.Options object when generating the bitmap, because some of its options can also leave the bitmap corrupted.
