APIs for Android 4.4 screen recording?

One of the features of Android 4.4 (KitKat) is that it provides a way for developers to capture an MP4 video of the screen using adb shell screenrecord. Does Android 4.4 provide any new APIs for applications to capture and encode video, or does it just provide the screenrecord utility/binary?
I ask because I would like to do some screen capture work in an application I'm writing. Before anyone asks, yes, the application would have framebuffer access. However, the only Android-provided capturing/encoding API that I've seen (MediaRecorder) seems to be limited to recording video from the device's camera.
The only screen capture solutions I've seen mentioned on Stack Overflow seem to revolve around taking screenshots at a regular interval or using JNI to encode the framebuffer with a ported version of ffmpeg. Are there more elegant, native solutions?

The screenrecord utility uses private APIs, so you can't do exactly what it does.
The way it works is to create a virtual display, route the virtual display to a video encoder, and then save the output to a file. You can do essentially the same thing, but because you're not running as the "shell" user you'd only be able to see the layers you created. The relevant APIs are designed around creating a Presentation, which may not be exactly what you want.
See the source code for a CTS test with a trivial example (just uses an ImageView).
Of course, if you happen to be a GLES application, you can just record the output directly (see e.g. EncodeAndMuxTest and the "Record GL app" activity in Grafika).
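For reference, the public half of this path (a virtual display routed into a Surface-input video encoder) is available from API 19. Here's a minimal, untested sketch assuming you only need your own content; resolution, bitrate, and density values are placeholders:

    import android.content.Context;
    import android.hardware.display.DisplayManager;
    import android.hardware.display.VirtualDisplay;
    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.io.IOException;

    // Sketch: route a virtual display into a Surface-input H.264 encoder.
    VirtualDisplay startCapture(Context context) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface inputSurface = encoder.createInputSurface();
        encoder.start();

        // Without the "shell" user's privileges you can only see your own
        // layers, hence VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY.
        DisplayManager dm =
                (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        // Drain the encoder's output on another thread and feed it to a
        // MediaMuxer to produce the MP4 file.
        return dm.createVirtualDisplay("capture", 1280, 720, /* densityDpi */ 240,
                inputSurface, DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY);
    }

The output-draining side is the same pattern shown in the EncodeAndMuxTest mentioned above.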

Well, AFAIK, I don't see any API support equivalent to capturing what's going on on the screen.

Related

Using Android's MediaRouter to cast the device screen to a Fire TV Stick or a client app?

I am trying to mirror-cast from my own app to a Fire TV Stick that is connected to the television. It has an option to mirror the display. My phone can connect to the Fire TV Stick this way, but I would like to mirror something with a smaller resolution, and even if I change my phone's resolution using adb, I think it sends the native resolution anyway.
I looked into MediaRouter and MediaRouteProvider. I also downloaded the MediaRouter sample whose snippets are used in the documentation. The sample ran but didn't work, and this API is super complex and has so many things in it. I am not sure how to build a simple app that casts video (and later the phone's screen) to another device, either the Amazon Fire TV Stick mirror display or at least a client app I will also write.
I couldn't find compact enough samples to do what I want. Do you have any idea where there is a sample that works and is not a massive amount of code?
I couldn't make it work following the documentation.
Instead of finding something in the API to do the mirroring for me, I was able to just read pixel data from the MediaProjection and VirtualDisplay and send it over sockets.
It wasn't easy: I had to take the GLES11Ext.GL_TEXTURE_EXTERNAL_OES texture from the SurfaceTexture, render it into my own offscreen GL_TEXTURE_2D, and then read that back using glReadPixels with the attached framebuffer.
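A rough sketch of that readback path, assuming an EGL context is current; the shader pass that draws the external texture is omitted, and the sizes are placeholders:

    import android.opengl.GLES20;
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    int width = 1280, height = 720;

    // Create an offscreen GL_TEXTURE_2D and attach it to a framebuffer object.
    int[] tex = new int[1];
    GLES20.glGenTextures(1, tex, 0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height,
            0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
            GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
            GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

    int[] fbo = new int[1];
    GLES20.glGenFramebuffers(1, fbo, 0);
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER,
            GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, tex[0], 0);

    // ... draw the GL_TEXTURE_EXTERNAL_OES texture into the FBO here,
    // using a shader that samples samplerExternalOES ...

    // Read the rendered pixels back; this buffer can then go out a socket.
    ByteBuffer pixels = ByteBuffer.allocateDirect(width * height * 4)
            .order(ByteOrder.nativeOrder());
    GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA,
            GLES20.GL_UNSIGNED_BYTE, pixels);
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);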

Access both back and front cameras simultaneously

What I'm trying to achieve: access both front and back cameras at the same time.
What I've researched: I know the Android camera API doesn't support using multiple instances of Camera, and that you have to release a camera before using the other one. I've read tens of questions about this, and I know that on some devices it's possible (like the Samsung S4 or other new devices from them).
I've also found out that it's possible to have access to both of them in Android KitKat on SOME devices.
I also know that on API >= 21, using the camera2 API, it's possible to access both of them at the same time because it's thread-safe.
What I've got so far: an implementation that accesses the cameras one at a time in order to provide picture-in-picture.
I know it's not possible to implement simultaneous dual-camera capture on every device; I just want a way to make it available on the devices that can do it.
How can I test to see if the device is capable of accessing both of them?
I've also searched for a library that allows such a thing, but I didn't find anything. Is there such a library?
I would like to make this feature available for as many devices as possible, and for the others, I'll leave the current state (one by one) of the feature.
Can anyone please help me, at least with some advice?
Thanks!
The Android camera APIs generally allow multiple cameras to be used at the same time, but most devices do not have enough hardware resources to support that in practice - for example, there's often only one camera image processor shared by both cameras.
There's no query that's included in the Android APIs that'll tell you up front if you can use multiple cameras at the same time.
The only way to tell is to try to open a second camera when you already have one open. If you can open the second camera, then you can do picture-in-picture, etc. If you get an exception trying to open the second camera, then that particular device doesn't support having both cameras open.
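In code, that probe amounts to something like the following sketch using the old android.hardware.Camera API; camera indices 0 and 1 are the usual back/front assignment, but that isn't guaranteed:

    import android.hardware.Camera;

    // Probe for simultaneous dual-camera support by attempting to open a
    // second camera while the first one is still held open.
    boolean supportsDualCamera() {
        Camera back = null, front = null;
        try {
            back = Camera.open(0);   // usually the back camera
            front = Camera.open(1);  // usually the front camera
            return true;             // both opened without an exception
        } catch (RuntimeException e) {
            // Camera.open() throws if the second camera can't be acquired.
            return false;
        } finally {
            if (front != null) front.release();
            if (back != null) back.release();
        }
    }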
It is possible using the Android camera2 API, but as indicated above most devices don't have hardware support. If you have a Nexus 5X, Nexus 6, or Nexus 6P it will work, and you can test with this BothCameras app. I've implemented blitting to allow video recording as well (in addition to still pictures) using the hardware H.264 encoder.
You cannot access both cameras at the same time on all Android phones, due to hardware limitations. The best alternative is to use the cameras one by one: keep a single Camera object and switch the camera facing to take the other photo.
I have done this in one of my applications:
https://play.google.com/store/apps/details?id=com.ushaapps.bothie
I've decided to mention that in some cases just opening two cameras with the camera2 API is not enough to know about support.
Some devices don't throw an error during opening: the second camera opens correctly, but the first one invokes the onCaptureFailed callback.
So the most accurate way is to start both cameras, wait for frames from each of them, and check that there are no capture failure errors.
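A sketch of that stricter check with camera2, assuming the CAMERA permission is granted and each camera has its own preview Surface; FrameProbe and its Listener are hypothetical names invented for this example:

    import android.content.Context;
    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraCaptureSession;
    import android.hardware.camera2.CameraDevice;
    import android.hardware.camera2.CameraManager;
    import android.hardware.camera2.CaptureFailure;
    import android.hardware.camera2.CaptureRequest;
    import android.hardware.camera2.TotalCaptureResult;
    import android.os.Handler;
    import android.view.Surface;
    import java.util.Collections;

    // Hypothetical helper: opens one camera, starts a repeating preview
    // request, and reports whether frames arrive or captures silently fail.
    class FrameProbe {
        interface Listener {
            void onFrame(String cameraId);    // a real frame was captured
            void onFailure(String cameraId);  // open/configure/capture failed
        }

        static void probe(Context ctx, final String id, final Surface surface,
                          final Listener listener, final Handler handler)
                throws CameraAccessException {
            CameraManager cm =
                    (CameraManager) ctx.getSystemService(Context.CAMERA_SERVICE);
            cm.openCamera(id, new CameraDevice.StateCallback() {
                @Override public void onOpened(CameraDevice device) {
                    try {
                        final CaptureRequest.Builder req =
                                device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
                        req.addTarget(surface);
                        device.createCaptureSession(Collections.singletonList(surface),
                                new CameraCaptureSession.StateCallback() {
                            @Override public void onConfigured(CameraCaptureSession session) {
                                try {
                                    session.setRepeatingRequest(req.build(),
                                            new CameraCaptureSession.CaptureCallback() {
                                        @Override public void onCaptureCompleted(
                                                CameraCaptureSession s, CaptureRequest r,
                                                TotalCaptureResult result) {
                                            listener.onFrame(id);
                                        }
                                        @Override public void onCaptureFailed(
                                                CameraCaptureSession s, CaptureRequest r,
                                                CaptureFailure failure) {
                                            listener.onFailure(id);
                                        }
                                    }, handler);
                                } catch (CameraAccessException e) {
                                    listener.onFailure(id);
                                }
                            }
                            @Override public void onConfigureFailed(CameraCaptureSession s) {
                                listener.onFailure(id);
                            }
                        }, handler);
                    } catch (CameraAccessException e) {
                        listener.onFailure(id);
                    }
                }
                @Override public void onDisconnected(CameraDevice device) { device.close(); }
                @Override public void onError(CameraDevice device, int error) {
                    listener.onFailure(id);
                    device.close();
                }
            }, handler);
        }
    }

Run probe() for both camera ids and only treat dual-camera capture as supported if both report frames and neither reports a failure.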

get screenshots from SurfaceComposerClient

Is it possible to use SurfaceComposerClient to get screenshots, the way MediaCodec does with createInputSurface()?
I can't use MediaCodec for that because I need raw video, not encoded data.
Since 4.3 it seems that ScreenshotClient can't take multiple screenshots.
Yes, assuming you're running as shell or root, and you don't mind using non-public native APIs (i.e. you don't care if your app breaks every time a new version of the OS rolls out).
The canonical example is screenrecord, introduced in Android 4.4. It creates a virtual display and directs the output to a Surface. For normal operation a MediaCodec input surface receives the output. For the "bugreport" mode introduced in screenrecord v1.1, the output goes to a GLConsumer (roughly equivalent to a SurfaceTexture), which is rendered to a Surface with overlaid text.
There's a bug in Android 4.3 (see issues 59649 or 60638 on the Android Open Source Project Issue Tracker) which means ScreenshotClient can't be used to take more than one screenshot.

Does Android support a better method than glReadPixels?

I'm making an Android game (using AndEngine).
I need to record the gameplay screen.
This is not for making a promotional video; it is for game players to review their play.
My app should record the video by itself, so I can't solve this problem with an existing recording app from the market.
I already checked the code below:
http://code.google.com/p/andengineexamples/source/browse/src/org/anddev/andengine/examples/ScreenCaptureExample.java?spec=svn66d3057f672175a8a21f9d06e7a045942331f65c&r=66d3057f672175a8a21f9d06e7a045942331f65c
It works very well, but I want to record a gameplay video, not a single screenshot.
I need at least 24 fps for a smooth replay, but with glReadPixels I only get 5 fps on my Xoom device.
I searched various websites for this optimization problem, and most people say glReadPixels is too slow to record video:
http://www.gamedev.net/topic/473794-glreadpixel-takes-tooooo-much-time/
They recommend glCopyTexImage2D instead of glReadPixels, because it is much faster. But I can't find out how to use glCopyTexImage2D in AndEngine, and some even say that Android's OpenGL ES doesn't support it.
Maybe another method exists to record smooth video: reading the framebuffer of the Android device. Most recording apps in the market use this method, but those apps need root permission to grab the framebuffer. I've read some news that Android would support screen capture from SurfaceFlinger after Gingerbread, but I can't find out how to use the framebuffer without root permission. T_T
These are the solutions I'm guessing at:
1. Use another OpenGL API that is faster than glReadPixels.
2. Find some Android API that can get the framebuffer without root permission (maybe I can access Android's SurfaceFlinger?).
3. Draw to another offscreen texture to record the video.
But I don't know how to implement these methods.
Which approach is correct?
Do you have any example code for recording video on Android?
Please help me solve this problem. If you know any other method, that would be helpful too; any help will be appreciated.
Does the GPU vendor of your device support ES 3.0? If it does, you can try to use PBOs (pixel buffer objects).
Here is a topic you can refer to: Low readback performance with PBO, help!!!
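The idea, as a rough sketch: with ES 3.0 you bind a pixel-pack PBO so that glReadPixels returns immediately and the copy happens asynchronously, then map the buffer a frame or two later. This assumes android.opengl.GLES30 is available (note the int-offset overload of glReadPixels only exists on recent platform API levels); the sizes and single-buffer layout are placeholders, and in practice you'd alternate between two PBOs:

    import android.opengl.GLES30;
    import java.nio.Buffer;

    int width = 1280, height = 720;
    int size = width * height * 4;

    // One-time setup: a pixel-pack PBO to receive asynchronous readbacks.
    int[] pbo = new int[1];
    GLES30.glGenBuffers(1, pbo, 0);
    GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pbo[0]);
    GLES30.glBufferData(GLES30.GL_PIXEL_PACK_BUFFER, size, null,
            GLES30.GL_STREAM_READ);

    // Per frame: with a PBO bound, glReadPixels takes a byte offset and
    // returns without stalling; the copy happens in the background.
    GLES30.glReadPixels(0, 0, width, height, GLES30.GL_RGBA,
            GLES30.GL_UNSIGNED_BYTE, 0);

    // A frame or two later, map the buffer to get at the pixel data
    // without blocking on the GPU.
    Buffer mapped = GLES30.glMapBufferRange(GLES30.GL_PIXEL_PACK_BUFFER,
            0, size, GLES30.GL_MAP_READ_BIT);
    // ... copy 'mapped' into your video encoder's input ...
    GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER);
    GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);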

Record video in background, preferably with no file or surface, preferably also on 2.2

So I read here that it's not possible to capture preview frames without a valid Surface. However, I saw that the IP Webcam app can do this, and I wanna know how.
Can that app do it on versions below v2.3? If so, how?
Furthermore, the bug isn't marked as fixed, so I'm wondering if the restriction was ever lifted.
Also, if I don't wanna save the video stream from the Preview, but rather stream it over the network, is that possible with the MediaRecorder? All the examples I see use a file for saving, but I reckon the IP Webcam app uses the Preview. Or maybe it writes to a pipe?
When using Android, you must have a valid Surface object to take pictures or record video, and the preview requires a Surface as well. I would guess that IP Webcam uses native calls (C or C++) to the layers below Dalvik, bypassing the Java layer(s), so it can access the hardware more directly. If you have the skills, you should be able to do this using the Android NDK.
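On the "writes to a pipe" guess: you can hand MediaRecorder a file descriptor instead of a path, for example the write end of a ParcelFileDescriptor pipe, and read the bytes from the other end to push over the network. A sketch under those assumptions (createPipe() is API 9+, so this won't help on 2.2, and MP4 output is not stream-friendly because its index is written at the end):

    import android.media.MediaRecorder;
    import android.os.ParcelFileDescriptor;
    import java.io.IOException;

    void recordToPipe() throws IOException {
        // pipe[0] is the read end, pipe[1] the write end.
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();

        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        // MPEG_4 writes its moov atom at the end and needs a seekable
        // output; for live streaming pick a streamable container instead.
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile(pipe[1].getFileDescriptor()); // write end
        // As noted above, a valid preview Surface is still required:
        // recorder.setPreviewDisplay(someSurface);
        recorder.prepare();
        recorder.start();

        // Meanwhile, on another thread, read from the other end and send
        // the bytes over your socket:
        // InputStream in = new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);
    }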
