I want a single video playing in the background of my application, so that whenever a new activity is pushed onto the stack, that activity has a transparent background and the background video stays visible at all times.
How would I accomplish this without any risk of the activity containing the video being destroyed at any stage?
What you want is not possible.
You will have to implement this as a single activity, possibly using fragments to handle your changing content. Even then, the user can always exit this activity.
Bear in mind that this will not work well except on devices offering 2D graphics hardware acceleration. That is available on Android 3.0+ devices when you request hardware acceleration in the manifest. On Android 1.x and 2.x devices, this will likely perform very poorly.
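To make the idea concrete, here is a minimal sketch of the single-activity approach, assuming a FrameLayout that stacks a looping VideoView behind a fragment container. The layout and resource names (R.layout.main, R.id.background_video, R.id.content_frame, R.raw.background) and FirstScreenFragment are placeholders for illustration, and hardware acceleration is requested with android:hardwareAccelerated="true" on the <application> element of the manifest:

    import android.app.Activity;
    import android.media.MediaPlayer;
    import android.net.Uri;
    import android.os.Bundle;
    import android.widget.VideoView;

    // Single activity: a full-screen, looping VideoView sits behind a fragment
    // container, and each "screen" is a fragment with a transparent background,
    // so the video stays visible while the content changes.
    public class VideoBackgroundActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main); // FrameLayout: VideoView below, fragment container above

            VideoView video = (VideoView) findViewById(R.id.background_video);
            video.setVideoURI(Uri.parse("android.resource://" + getPackageName()
                    + "/" + R.raw.background));
            video.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer mp) {
                    mp.setLooping(true);
                }
            });
            video.start();

            if (savedInstanceState == null) {
                getFragmentManager().beginTransaction()
                        .add(R.id.content_frame, new FirstScreenFragment())
                        .commit();
            }
        }
    }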
Related
In Android, is it possible to display one application (rendering video) as a floating screen in one half of the screen while, at the same time, interacting with another application (e.g. a chat application or any other application)? The floating-screen application would be my own application, so that the user can multitask.
The idea is to keep the SurfaceView of the application that is rendering the video on top, while still being able to interact with other applications, e.g. the gallery.
If by "two active applications" then you mean real applications (i.e. with activities, back stack, &c) active at the same time, then no (except in some specialized devices, with custom APIs).
However, there is a trick you can use to achieve a similar effect. Applications holding the android.permission.SYSTEM_ALERT_WINDOW permission (displayed as "draw over other apps" in the Play Store) can create windows from a service and show them. So you could probably get the effect you want with this method.
There is an open source library called StandOut which provides this behavior in an easy-to-use manner. You might want to take a look at it.
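For illustration, here is a stripped-down sketch of that trick: a Service adds its own view through the WindowManager, and that view stays on top of whatever application is in the foreground. The window size and the TextView are placeholders (you would use a SurfaceView or VideoView for video), and the SYSTEM_ALERT_WINDOW permission must be declared in the manifest:

    import android.app.Service;
    import android.content.Intent;
    import android.graphics.PixelFormat;
    import android.os.IBinder;
    import android.view.Gravity;
    import android.view.WindowManager;
    import android.widget.TextView;

    // Requires <uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW"/>
    public class FloatingWindowService extends Service {
        private TextView overlay; // stand-in for a SurfaceView/VideoView rendering video

        @Override
        public void onCreate() {
            super.onCreate();
            overlay = new TextView(this);
            overlay.setText("Floating window");

            WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                    400, 300,                                        // placeholder size in pixels
                    WindowManager.LayoutParams.TYPE_SYSTEM_ALERT,    // overlay window type
                    WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,   // touches outside the overlay reach the app behind it
                    PixelFormat.TRANSLUCENT);
            params.gravity = Gravity.TOP | Gravity.LEFT;

            WindowManager wm = (WindowManager) getSystemService(WINDOW_SERVICE);
            wm.addView(overlay, params);
        }

        @Override
        public void onDestroy() {
            ((WindowManager) getSystemService(WINDOW_SERVICE)).removeView(overlay);
            super.onDestroy();
        }

        @Override
        public IBinder onBind(Intent intent) {
            return null;
        }
    }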
In short, the answer is no. There is no way currently for multiple apps to be visible on the screen at the same time.
You could theoretically reuse code across multiple applications, so you could create a video window that plays video while simultaneously showing a text-editor fragment for taking notes, and you can send data between applications using an Intent. But unlike on modern desktop computers, only one application can currently have the focus of the screen at a time in Android.
I am doing basic object detection on the camera preview screen in Android (greater than 3.2). For devices that do not support processing on the preview screen, I am buffering the preview frames, processing them and clearing the buffer. This part is working as desired.
What I now want is for this app to run in the background while any other app is running in the foreground. I am using an Android service and am able to run a small test app in the background. However, my concern is with the camera preview app.
I don't want to display the preview screen but use the preview screen information for processing. This might be too much to ask, but I wanted to know if this is even possible. I came across this link which shows some hope. Basically I want to process the video (preview) stream without displaying it on the screen. If this is doable, then I can think of putting this app in the background and some other app in the foreground.
Unfortunately I won't be able to share the code; however, it is the standard logic of creating a SurfaceView and starting the preview.
I would really appreciate any insight into this.
Check the comments here. Basically he opens the camera hardware, sets a preview callback and calls startPreview() without setting the preview display (this might not work on every device). You can try this from your background service. All of this will only work as long as the foreground app doesn't access the camera. Please update this if it works; I am interested to know.
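A rough sketch of what that looks like with the android.hardware.Camera API (processFrame() is a placeholder for your own detection code; as noted above, some devices will not deliver frames without a real preview surface):

    Camera camera = Camera.open();
    Camera.Parameters params = camera.getParameters();
    final Camera.Size previewSize = params.getPreviewSize();

    camera.setPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera cam) {
            // data holds the raw frame (NV21 by default); hand it to the detection code
            processFrame(data, previewSize.width, previewSize.height);
        }
    });
    // No setPreviewDisplay() call, so nothing is drawn on screen
    camera.startPreview();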
I am developing an application that takes images from the background, say using an Android service. I don't want any user interaction to take the images; it should work completely from the background.
I have already tried:
Calling the camera from a service - it does not always work; it sometimes works if there is a dummy SurfaceView, but if Android closes the related activity, the service stops taking pictures. I don't want to run an activity for user interaction.
With some changes it works, but all images come out black.
Widget: I can't figure out how to call the camera from a widget, as widgets don't support SurfaceView.
Live wallpaper: I haven't been able to make it work so far; does it support the camera?
In my experience so far, the Android camera service is not designed to take pictures when no real preview is available.
Has anyone developed something like this? Any help will be really appreciated.
If you can ignore older versions, it may be easier to use SurfaceTexture (API 11 and higher). As vikky.rk notes in an answer to a recent question, an introduction to this technique can be found in the PanoramaActivity code of the default Android Camera app.
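The core of the technique, roughly sketched below: the preview is attached to an off-screen SurfaceTexture instead of a SurfaceView, so no visible surface is needed before calling takePicture(). saveJpeg() is a placeholder for whatever you do with the result:

    private void takeBackgroundPicture() throws IOException {
        Camera camera = Camera.open();
        SurfaceTexture dummy = new SurfaceTexture(10); // arbitrary texture name; never rendered
        camera.setPreviewTexture(dummy);               // off-screen preview target
        camera.startPreview();

        camera.takePicture(null, null, new Camera.PictureCallback() {
            @Override
            public void onPictureTaken(byte[] jpeg, Camera cam) {
                saveJpeg(jpeg); // placeholder for your own storage code
                cam.stopPreview();
                cam.release();
            }
        });
    }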
I want to write an activity that:
Shows the camera preview (viewfinder), and has a "capture" button.
When the "capture" button is pressed, takes a picture and returns it to the calling activity (setResult() & finish()).
Are there any complete examples out there that work on every device? A link to a simple open-source application that takes pictures would be the ideal answer.
My research so far:
This is a common scenario, and there are many questions and tutorials on this.
There are two main approaches:
Use the android.provider.MediaStore.ACTION_IMAGE_CAPTURE intent. See this question
Use the Camera API directly. See this example or this question (with lots of references).
Approach 1 would have been perfect, but the issue is that the intent is implemented differently on each device. On some devices it works well. However, on some devices you can take a picture but it is never returned to your app. On some devices nothing happens when you launch the intent. Typically it also saves the picture to the SD card, and requires the SD card to be present. The user interaction is also different on every device.
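For reference, approach 1 in its simplest form looks roughly like the following sketch. The output Uri is a placeholder you have to create yourself; EXTRA_OUTPUT tells the camera app where to store the full-size image, and without it many devices return only a small thumbnail:

    private static final int REQUEST_IMAGE_CAPTURE = 1;
    private Uri outputUri; // e.g. Uri.fromFile(new File(getExternalFilesDir(null), "capture.jpg"))

    private void launchCamera() {
        Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        intent.putExtra(MediaStore.EXTRA_OUTPUT, outputUri);
        startActivityForResult(intent, REQUEST_IMAGE_CAPTURE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_IMAGE_CAPTURE && resultCode == RESULT_OK) {
            // The full-size image is at outputUri; some devices also return a
            // thumbnail Bitmap in data.getExtras().get("data").
        }
    }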
With approach 2 the issue is stability. I tried some examples, but I've managed to stop the camera from working (until a restart) on some devices and to completely freeze another device. On yet another device the capture worked, but the preview stayed black.
I would have used ZXing as an example application (I work with it a lot), but it only uses the preview (viewfinder), and doesn't take any pictures. I also found that on some devices, ZXing did not automatically adjust the white balance when the lighting conditions changed, while the native camera app did it properly (not sure if this can be fixed).
Update:
For a while I used the Camera API directly. This gives more control (custom UI, etc.), but I would not recommend it to anyone. It would work on 90% of devices, but every now and again a new device would be released with a different problem.
Some of the problems I've encountered:
Handling autofocus
Handling flash
Supporting devices with a front camera, back camera or both
Each device has a different combination of screen resolution, preview resolutions (which don't always match the screen resolution) and picture resolutions.
So in general, I'd not recommend going this route at all unless there is no other way. After two years I dumped my custom code and switched back to the Intent-based approach. Since then I've had much less trouble. The issues I had with the Intent-based approach in the past were probably just my own incompetence.
If you really need to go this route, I've heard it's much easier if you only support devices with Android 4.0+.
Regarding the stability problems with approach 2 (the camera stopping until a restart, frozen devices, black previews): either there is a bug in those examples or there is a compatibility issue with the devices.
The example that CommonsWare gave works well when used as-is, but here are the issues I ran into when modifying it for my use case:
1. Never take a second picture before the first one has completed, i.e. before PictureCallback.onPictureTaken() has been called. The CommonsWare example uses the inPreview flag for this purpose.
2. Make sure that your SurfaceView is full-screen. If you want a smaller preview you might need to change the preview size selection logic (a rough sketch of such logic follows after this list), otherwise the preview might not fit into the SurfaceView on some devices. Some devices only support a full-screen preview size, so keeping it full-screen is the simplest solution.
3. To add more components to the preview screen, a FrameLayout works well in my experience. I started by using a LinearLayout to add text above the preview, but that broke rule #2. When using a FrameLayout to add components on top of the preview, you don't have any issues with the preview resolution.
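One common way (certainly not the only one) to do that preview size selection: among the sizes the device supports, pick the one whose aspect ratio is closest to that of the view it will be drawn into.

    private Camera.Size choosePreviewSize(Camera.Parameters params, int viewWidth, int viewHeight) {
        double targetRatio = (double) viewWidth / viewHeight;
        Camera.Size best = null;
        double bestDiff = Double.MAX_VALUE;
        for (Camera.Size size : params.getSupportedPreviewSizes()) {
            double diff = Math.abs((double) size.width / size.height - targetRatio);
            // Prefer the closest aspect ratio; on a tie, prefer the larger size
            if (diff < bestDiff || (diff == bestDiff && best != null && size.width > best.width)) {
                best = size;
                bestDiff = diff;
            }
        }
        return best;
    }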
I also posted a minor issue relating to Camera.open() on GitHub.
"the recommended way to access the camera is to open Camera on a separate thread". Otherwise, Camera.open() can take a while and might bog down the UI thread.
"Callbacks will be invoked on the event thread open(int) was called from". That's why to achieve best performance with camera preview callbacks (e.g. to encode them in a low-latency video for live communication), I recommend to open camera in a new HandlerThread, as shown here.