I need to develop two applications, "Sender" and "Receiver". These two will perform screen mirroring from Sender to Receiver.
How can I do this? Are there any built-in APIs / libraries available for the same?
Can I use Miracast to achieve this? If yes, please guide me.
Assumption: Both devices will remain on the same Wi-Fi network.
To collect the UI from Sender, you can try creating something that looks like my MirroringFrameLayout from the CWAC-Layouts library. That is designed to update a separate Mirror View on the same device that has the MirroringFrameLayout, such as having the MirroringFrameLayout on the touchscreen and the Mirror shown on an external display via a Presentation.
The problem you will encounter is performance, as my current approach draws the entire MirroringFrameLayout contents to a Bitmap, which is then shown by the Mirror. That would require you to ship new bitmaps across the network connection on every UI change, which is likely to be slow. So, while my approach is easy, you may need to be much more aware of what your UI is doing, so you can ship over smaller updates.
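For reference, drawing a View hierarchy into a Bitmap boils down to something like this sketch (the helper name is mine, not from CWAC-Layouts):

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.view.View;

// Minimal sketch: render a laid-out View (and its children) into a Bitmap.
// This is the kind of full-surface capture that becomes expensive when you
// must re-send it over the network on every UI change.
class ViewCapture {
    static Bitmap capture(View root) {
        Bitmap bmp = Bitmap.createBitmap(root.getWidth(), root.getHeight(),
                Bitmap.Config.ARGB_8888);
        root.draw(new Canvas(bmp));  // draws the whole hierarchy into bmp
        return bmp;
    }
}
```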
The best approach may be to stop thinking of "screen mirroring" entirely, and instead focus on "operation mirroring". For example, suppose Sender is a drawing app, and Receiver is supposed to see the drawings. Rather than sending over the screens, send over the drawing operations that the user performs, and apply those same operations on the Receiver.
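As a hedged illustration, assuming a plain TCP socket between the two devices and a tiny one-byte opcode protocol invented here (opcode 1 = draw a line), the Sender side of operation mirroring for a drawing app could look like:

```java
import java.io.DataOutputStream;
import java.io.IOException;
import java.net.Socket;

// Sender side of a made-up "operation mirroring" protocol: each drawing
// operation is serialized as a one-byte opcode plus its arguments.
class OperationSender {
    private final DataOutputStream out;

    OperationSender(Socket socket) throws IOException {
        out = new DataOutputStream(socket.getOutputStream());
    }

    // Call this from the Sender's drawing view whenever the user draws a line.
    void sendLine(float x1, float y1, float x2, float y2) throws IOException {
        out.writeByte(1);   // opcode 1: draw a line (our own convention)
        out.writeFloat(x1);
        out.writeFloat(y1);
        out.writeFloat(x2);
        out.writeFloat(y2);
        out.flush();        // keep latency low for interactive drawing
    }
}
```

The Receiver reads the same stream, switches on the opcode, and replays each operation on its own Canvas, so only a few bytes cross the network per gesture instead of a full bitmap.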
Related
Well, I have read a lot of answers to similar questions (even old ones from 2013-2014), and I understood that it is not possible to know it exactly, since Android doesn't count hardware usage as usage of the app, and there are other possible problems like services, etc.
At the moment I'm trying to compare the performance of an app using one protocol to reach a goal against the performance of the same app using another protocol (not well known by everyone) to reach the same goal. The default Android battery analyzer is good enough for me, since both cases are about 90% the same and I know how the protocols work.
My problem is that I'm not sure which tool is best for measuring the mAh consumed by my app. I know there are external apps that show it, but I would prefer to use the default one. I believe this is important not only for me but also for other people who might have to compare different protocols.
I know that I can measure it programmatically, and I've done that too: I save the battery percentage when the app is opened and how much has been consumed by the time it gets closed. But that isn't an exact measure, since while the app is open other apps can do heavy work and add noise to what I'm measuring, so I would prefer to use Android's battery analyzer.
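For what it's worth, the programmatic sampling described above typically reads the sticky ACTION_BATTERY_CHANGED broadcast, roughly like this:

```java
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.os.BatteryManager;

// Sample the battery percentage when the app opens and again when it
// closes, then take the difference (with all the noise caveats above).
class BatterySampler {
    static float levelPercent(Context ctx) {
        IntentFilter filter = new IntentFilter(Intent.ACTION_BATTERY_CHANGED);
        Intent status = ctx.registerReceiver(null, filter); // sticky broadcast
        int level = status.getIntExtra(BatteryManager.EXTRA_LEVEL, -1);
        int scale = status.getIntExtra(BatteryManager.EXTRA_SCALE, -1);
        return 100f * level / scale;
    }
}
```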
Get a spare device. Charge it fully, then run the protocol until shutdown without other interaction (no YouTube or anything) and note how long it lasted. Repeat with the other protocol. IMHO that is a fair way to compare. Note that every device behaves differently, and it may or may not be possible to transfer this result to other devices, e.g. with different network chips, processors, or even firmware versions.
For a fairer comparison, I think you should compare how the protocols work, i.e. number of interactions, payload size, etc., because the power consumption can only ever be an estimate.
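If you compare the protocols themselves, one rough way to quantify payload size is to count your own app's bytes with TrafficStats while each protocol runs; note this measures traffic, not power, and returns -1 on devices that don't support per-UID accounting:

```java
import android.net.TrafficStats;
import android.os.Process;

// Count bytes received/sent by this app's UID between start() and stop(),
// as a proxy for "how chatty" each protocol is.
class TrafficProbe {
    private long rxStart, txStart;

    void start() {
        int uid = Process.myUid();
        rxStart = TrafficStats.getUidRxBytes(uid);
        txStart = TrafficStats.getUidTxBytes(uid);
    }

    long[] stop() {
        int uid = Process.myUid();
        return new long[] {
                TrafficStats.getUidRxBytes(uid) - rxStart,   // bytes received
                TrafficStats.getUidTxBytes(uid) - txStart    // bytes sent
        };
    }
}
```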
I am trying to create a TV backlight controller that runs on Android TV. To do this, I want to use what is being displayed and adjust the backlight colors accordingly. I am having trouble determining a good way to regularly grab a screenshot from a service. Does anyone have any ideas?
On Android 5.0+, you can use the media projection APIs for this (a sketch of the authorization flow follows the list below). However:
The user has to authorize this, by launching one of your activities which then displays a system-supplied authorization dialog
That authorization does not last forever, but only for however long your process is around (and possibly less than that, though so far in my testing it seems to last as long as the process does)
AFAIK, DRM-protected materials will be blocked from the screenshots, and so using those screenshots to try to detect foreground UI colors may prove unreliable
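As promised above, here is a sketch of that authorization flow; CaptureActivity and REQUEST_CAPTURE are names made up for this example:

```java
import android.app.Activity;
import android.content.Intent;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.os.Bundle;

// The user must approve a system-supplied dialog before you can capture
// the screen; the resulting MediaProjection lasts at most as long as your
// process does.
public class CaptureActivity extends Activity {
    private static final int REQUEST_CAPTURE = 59706; // arbitrary request code
    private MediaProjectionManager mgr;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mgr = (MediaProjectionManager)
                getSystemService(MEDIA_PROJECTION_SERVICE);
        startActivityForResult(mgr.createScreenCaptureIntent(), REQUEST_CAPTURE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_CAPTURE && resultCode == RESULT_OK) {
            MediaProjection projection =
                    mgr.getMediaProjection(resultCode, data);
            // Use projection.createVirtualDisplay(...) with an ImageReader's
            // surface to pull frames for your backlight calculations.
        }
    }
}
```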
Is it possible to determine if a device (non-rooted) is in use at the moment, even if my app is not in the foreground? Precisely, "in use" means the user made touch events in the last 5 seconds or the display is on.
If so, what specific rights are required?
Thanks
AFAIK, Android's security model will not allow you to record touches if your app is not in the foreground.
There are some crude workarounds, like overlaying a transparent screen to record touches. Not sure if these still work, though.
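For illustration only, that overlay trick usually looks roughly like the sketch below. It assumes the SYSTEM_ALERT_WINDOW permission; TYPE_PHONE applies to older Android versions (8.0+ would need TYPE_APPLICATION_OVERLAY), and since Android 4.2 the ACTION_OUTSIDE event no longer carries touch coordinates:

```java
import android.content.Context;
import android.graphics.PixelFormat;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;

// A 1x1 overlay window with FLAG_WATCH_OUTSIDE_TOUCH gets ACTION_OUTSIDE
// whenever the user touches anywhere else on the screen.
class TouchWatcher {
    void install(Context ctx) {
        WindowManager wm =
                (WindowManager) ctx.getSystemService(Context.WINDOW_SERVICE);
        View probe = new View(ctx) {
            @Override
            public boolean onTouchEvent(MotionEvent event) {
                if (event.getAction() == MotionEvent.ACTION_OUTSIDE) {
                    // a touch landed somewhere else; record the timestamp
                }
                return super.onTouchEvent(event);
            }
        };
        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                1, 1,                                  // practically invisible
                WindowManager.LayoutParams.TYPE_PHONE, // pre-8.0 overlay type
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                        | WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
                PixelFormat.TRANSLUCENT);
        wm.addView(probe, lp);
    }
}
```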
"in use" means the user made touch events in the last 5 seconds
In Android, that's not practical, short of writing your own custom ROM.
or display is on
In Android, you can find out if the device is in an "interactive" mode or not. This does not strictly align with screen-on/screen-off, as the whole notion of screen-on/screen-off has pretty much fallen by the wayside.
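A minimal sketch of that check, assuming you have a Context handy:

```java
import android.content.Context;
import android.os.Build;
import android.os.PowerManager;

// "Interactive" roughly means the device is awake and usable; it replaced
// the older screen-on/screen-off notion.
class InteractiveCheck {
    static boolean isInteractive(Context ctx) {
        PowerManager pm =
                (PowerManager) ctx.getSystemService(Context.POWER_SERVICE);
        return Build.VERSION.SDK_INT >= 20
                ? pm.isInteractive()   // API 20+
                : pm.isScreenOn();     // deprecated, older devices
    }
}
```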
Is it possible to do some postprocessing on the video data that gets sent to the display driver in Android?
For context, what I would like to do is to be able to apply effects such as blurring, sharpening, and increasing or decreasing contrast on the entire screen output, regardless of what is running.
I would like to know if there is some way to grab the actual video data before it gets shown on screen, process it, and then send it to the screen (a fairly low-level operation, which I don't believe is provided by the Android API; however, I am only a beginner and do not know how hard it would actually be), or if there is any way I could simulate this kind of behavior.
In theory, it may work in software. However, performance would be the big issue, and it cannot be done in the Java app layer. Normally it's done by hardware (on Qualcomm platforms, there is a specific block called the MDP, which mostly handles video postprocessing).
I want to use an Android-based tablet - not a phone; I need a large screen and I don't need 3G.
The guy with the tablet will attach a webcam to it, and a software application on the Android tablet will stream the camera's feed to a web page (there may later be a need to stream video back to the Android tablet - TBD).
Additionally, I need two-way Voice over IP.
I may (TBD) need to use a TCP interface to a device, which might or might not be achieved through the Android.
With so much open: is there any open source that can handle that, either as a group or individually, or should I code my own? Since I don't normally do this kind of stuff, what's the best approach in terms of protocols, etc.?
I'd like to demo something in a month or so. Sorry that this is vague - but so is the person asking for it (which might make me lean towards rolling my own simply because of shifting requirements. But I might roll my own around off-the-shelf building blocks, for instance if I can find off-the-shelf open source VoIP, etc.).
is there any open source that can handle that, either as a group or individually, or should I code my own?
AFAIK, there is virtually no "open source that can handle that" for Android. In fact, you will need hardware modifications and drivers to support webcams, let alone anything else on your to-do list.
There are a lot of mobile streaming services. Maybe they can help you with one half of your problem:
http://www.ustream.tv/
http://www.qik.com/
http://bambuser.com/
Instead of the webcam, you can use the integrated camera on the device itself to capture and stream. And yes, you'll have to develop something on your own, especially with changing requirements.
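To give a feel for the capture half, here is a hedged sketch using the android.hardware.Camera API (deprecated today, but it matches the era of this question); encoding and actually pushing frames to a streaming service is left out:

```java
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import java.io.IOException;

// Grab NV21 preview frames from the device's own camera; hand them to
// whatever encoder/uploader your chosen streaming service requires.
class PreviewGrabber implements Camera.PreviewCallback {
    private Camera camera;

    void start() throws IOException {
        camera = Camera.open();
        // Most devices will not deliver frames without a preview target:
        camera.setPreviewTexture(new SurfaceTexture(10)); // dummy texture id
        camera.setPreviewCallback(this);
        camera.startPreview();
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // 'data' is one NV21 frame; encode and send it from here
    }

    void stop() {
        camera.setPreviewCallback(null);
        camera.stopPreview();
        camera.release();
    }
}
```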