I'm developing a whiteboard-like app.
I'm able to mirror the screen through Chromecast, but that way the whole screen is streamed.
I want only part of the screen to be streamed, since the layout contains the whiteboard itself plus a toolbar that shouldn't appear in the stream.
Is it possible to choose what Views should be streamed on Android?
No. Screen mirroring works like the video-recording apps: it grabs the entire screen's frame data, and at that level the system doesn't even know that views exist.
Yes, it is possible. You should use the Remote Display APIs from Cast SDK v2, which let you specify which view/surface in your app should be shown on the Cast device. It is a great match for what you want to do.
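In outline, the Remote Display pattern looks something like this (a minimal sketch, assuming Cast SDK v2's CastRemoteDisplayLocalService; the class name and layout resource here are hypothetical):

```java
// Sketch only: subclass CastRemoteDisplayLocalService and render just the
// whiteboard into a Presentation shown on the Cast display. The toolbar
// stays on the phone because it is never drawn into the Presentation.
// "WhiteboardCastService" and R.layout.cast_whiteboard are made-up names.
public class WhiteboardCastService extends CastRemoteDisplayLocalService {
    private Presentation presentation;

    @Override
    public void onCreatePresentation(Display display) {
        presentation = new Presentation(this, display) {
            @Override
            protected void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                // Only the whiteboard layout goes to the remote screen.
                setContentView(R.layout.cast_whiteboard);
            }
        };
        presentation.show();
    }

    @Override
    public void onDismissPresentation() {
        if (presentation != null) {
            presentation.dismiss();
            presentation = null;
        }
    }
}
```

You would start the service with CastRemoteDisplayLocalService.startService(...) once the user selects a route; the phone's own activity can then show the toolbar plus a local copy of the whiteboard.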
Related
I am trying to mirror-cast from my own app to a Fire TV Stick connected to the television. The stick has an option to mirror the display, and my phone can connect to it that way, but I would like to mirror at a smaller resolution; even if I change my phone's resolution using adb, it seems to send the native resolution anyway.
I looked into MediaRouter and MediaRouteProvider, and downloaded the MediaRouter sample whose snippets are used in the documentation. The sample ran but didn't work, and the API is very complex, with many parts to it. I am not sure how to build a simple app that casts video (and later the phone's screen) to another device, either to the Fire TV Stick's mirror display or at least to a client app I would also write.
I couldn't find a compact enough sample that does what I want. Do you have any idea where there is a working sample that isn't a massive amount of code?
I couldn't make it work by following the documentation.
In the end, instead of finding something in the API to do the mirroring for me, I was able to simply read pixel data from the MediaProjection's VirtualDisplay and send it over sockets.
It wasn't easy: I had to bind the SurfaceTexture to a GLES11Ext.GL_TEXTURE_EXTERNAL_OES texture, render that into my own offscreen GL_TEXTURE_2D, and then read it back with glReadPixels through an attached framebuffer.
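Condensed, the readback step looks roughly like this (a sketch only: GL context creation, the shader program, and the drawExternalTexture helper are assumed to exist elsewhere, and width/height match the VirtualDisplay):

```java
// Draw the external (OES) texture into an offscreen GL_TEXTURE_2D attached
// to a framebuffer object, then read the pixels back with glReadPixels.
int[] ids = new int[2];
GLES20.glGenFramebuffers(1, ids, 0);
int fbo = ids[0];
GLES20.glGenTextures(1, ids, 1);
int offscreenTex = ids[1];

// Allocate the offscreen RGBA texture that will receive the frame.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, offscreenTex);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
        width, height, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);

// Attach it to the FBO so rendering lands in the texture.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, offscreenTex, 0);

surfaceTexture.updateTexImage();   // latch the newest VirtualDisplay frame
drawExternalTexture(oesTextureId); // hypothetical helper: samplerExternalOES pass

// Read the rendered frame back to CPU memory.
ByteBuffer pixels = ByteBuffer.allocateDirect(width * height * 4)
        .order(ByteOrder.nativeOrder());
GLES20.glReadPixels(0, 0, width, height,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels);
// "pixels" now holds one RGBA frame, ready to be sent over the socket.
```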
In one of my applications I need to record a video of my own screen.
I need to let users take a video of the app while they use it. How is that possible?
First question: is it possible at all? If yes, then how? Any useful links or other help would be appreciated.
Thanks
Pragna Bhatt
Yes, it is possible, but with certain limitations. I have done it in one of my projects.
Android 4.4 adds support for screen recording. See here
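For reference, on KitKat and later the built-in screenrecord utility can be driven from adb (the time limit and file path below are just examples):

```shell
# Record the device screen to a file on the device, then pull it off.
adb shell screenrecord --time-limit 30 /sdcard/demo.mp4
adb pull /sdcard/demo.mp4
```

Note this requires a connected device or shell access; it is not an in-app API at that level.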
If you are targeting lower versions you can still achieve it, but it will be slow; there is no direct or easy way to do it. What you would do is create a drawable from your view/layout, convert that drawable to YUV format, and feed it to a recording library that accepts custom YUV images in place of camera frames (some camera-recording libraries allow this); the library will encode the frames like a movie, which you can save to storage. Use threads to increase the frame rate (a newer multi-core device will achieve a higher frame rate).
Only create the drawable (from the view) when there is a change in the view or its children; you can use a GlobalLayoutListener for that. Otherwise, send the same YUV image again.
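The RGB-to-YUV conversion step can be done in plain Java. A minimal sketch, assuming the pixels come from Bitmap.getPixels() on a bitmap drawn from your view, converting to the common NV21 (YUV420SP) layout with the usual BT.601 integer formulas (the class name is made up):

```java
public class ViewToYuv {
    // Convert ARGB_8888 pixels (as returned by Bitmap.getPixels) to NV21:
    // a full-resolution Y plane followed by interleaved V/U at quarter resolution.
    public static byte[] argbToNv21(int[] argb, int width, int height) {
        byte[] yuv = new byte[width * height * 3 / 2];
        int yIndex = 0;
        int uvIndex = width * height;
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                int p = argb[j * width + i];
                int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
                // BT.601 studio-range luma.
                int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
                yuv[yIndex++] = (byte) Math.max(16, Math.min(235, y));
                // One U/V pair per 2x2 block (4:2:0 subsampling).
                if (j % 2 == 0 && i % 2 == 0) {
                    int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                    int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                    yuv[uvIndex++] = (byte) Math.max(0, Math.min(255, v)); // NV21: V first
                    yuv[uvIndex++] = (byte) Math.max(0, Math.min(255, u));
                }
            }
        }
        return yuv;
    }
}
```

The output buffer can then be handed to whatever library you found that accepts raw YUV frames; doing the conversion on a worker thread keeps the UI responsive.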
Limitations:
You cannot record more than one activity (at a time), or the transitions between them, because you are creating the image from a view. (Work on it yourself; maybe you'll find a way.)
You cannot raise the frame rate beyond a certain point, because that depends on your device's hardware.
I wrote a small game that has a front-facing camera preview in it. I want to be able to record the entire screen to an MP4. How would I do that? Does anyone know a good tutorial for recording the entire screen to MP4 in code (not just screenshots; I want to let the user make a recording while playing the game)?
Indeed, as @Michael stated, directly accessing the screenshot and framebuffer mechanisms is not possible. That doesn't mean you're completely out of luck, though, if you want to record gaming sessions. It's just more work.
If you generate a perfect record of the various actions in your game, save that record. You can then build a service (web service, local, whatever) that accepts the record as input; in effect, it would replay the game. You would then reconstruct the game in such a manner that you could generate a video.
You could also replay in the Android emulator and create a video from that. Not as easily deployable, but it depends on what you want to do with it.
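The "record the actions, replay them later" idea can be sketched with a tiny event log (the class and the string event format here are hypothetical, just to show the shape):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Minimal sketch: store each user action with its timestamp, then feed the
// log back through the same handler to reproduce the session. A real
// replayer would also restore timing and the RNG seed so the game state
// evolves identically.
public class GameRecorder {
    public static final class Event {
        public final long timeMs;   // time since session start
        public final String action; // e.g. "tap:120,340"
        Event(long timeMs, String action) {
            this.timeMs = timeMs;
            this.action = action;
        }
    }

    private final List<Event> log = new ArrayList<>();

    public void record(long timeMs, String action) {
        log.add(new Event(timeMs, action));
    }

    // Replay every recorded action, in order, through the supplied handler.
    public void replay(Consumer<Event> handler) {
        for (Event e : log) {
            handler.accept(e);
        }
    }
}
```

During replay the handler would drive the game engine off-device, where each frame can be rendered into a video encoder at leisure instead of in real time.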
Apps aren't allowed to grab the screen contents (see this discussion).
[L]etting applications grab the screen's content liberally is a serious security risk, which is why the platform prevents it (taking a snapshot requires either direct physical manipulation from the user, or using a debugging tool).
And as @CommonsWare mentioned in his comment, you can't use the getDrawingCache method to grab a Bitmap of the view either, if you're using a SurfaceView or one of its subclasses (e.g. VideoView). More details about that here.
Since ICS it has become really easy to take a screenshot from within Android. In its presentations Google goes further than that, streaming/saving the whole content shown on the screen, including sound, directly from the device. How do they achieve that? Do I need some extra hardware, like the HDMI-certified screen-grabbing cards that consoles require?
Does Android have the software capability, if a phone has video-out, to open or push content solely to the video out?
So, for example, if the user clicks on a YouTube link, the app, instead of opening the content on the main screen over the app, would push it to the video out, so the YouTube video would display on the connected display while they continue to browse.
I know Motorola has its WebTop software, and this idea is similar to what I am trying to accomplish, but on a much more basic level. It's also similar to Apple's AirPlay, but again much less complex (no network/external player, just video out).
Or, if even that is too complex, an even simpler solution: having the video-out keep outputting even when the phone is locked. Currently, video-out mirroring on both my HTC Incredible and Galaxy Nexus stops when the phone is locked.
EDIT:
I've noticed while using my phone that when playing a video through the Google Videos app, the controls (play, pause, the seek bar, and the soft buttons) overlay the phone's screen, but the video-out display (the television) plays the video continuously/seamlessly without any of the controls overlaid. This is a very primitive example of what I'm ultimately alluding to, but it does show a real-world case of an Android device (with no third-party manufacturer software) doing video out that isn't exactly mirroring.
Well... I hate to furnish this as an answer, but it really looks like there's simply nothing in the API for it at all.
http://groups.google.com/group/android-developers/browse_thread/thread/9e3bcd1eea2c379
which just redirects to:
https://groups.google.com/forum/#!topic/android-developers/Jxp_9ZtzL60
I'm definitely going to favorite this question and hope someone chimes in with something more useful than "doesn't look like it's possible", though that does appear to be the correct answer at this time.
Much like Dr.Dredel has mentioned, there is currently nothing for multiple displays in terms of display 1 showing 'A' and display 2 showing 'B'.
There is support for multiple screen sizes, per the following:
http://developer.android.com/guide/practices/screens_support.html#support
This will be the case for a little while longer, until someone creates support for it.