I'm building an Android application on an SoC board running Android Nougat. I have a primary HDMI display connected via the HDMI port, and a secondary HDMI display connected via a USB-C-to-HDMI converter. By default, the secondary display mirrors the primary one.
I've loaded the Presentation API's DisplayManager sample, and I'm able to select my secondary HDMI display and open a photo on it.
However, I've noticed that my mouse is confined to the primary display and can't travel to the secondary one. This isn't ideal, because I want the secondary display to be interactive as well, eventually through a touch-screen display. I don't want to invest in one yet, as it can cost quite a lot.
My question is: can the secondary display in the Presentation API also receive input, acting as if it were an entirely separate app from the one shown on the primary display?
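For reference, this is roughly how I'm locating the secondary display in the DisplayManager sample (a minimal sketch; the class name is mine, and actual display names and IDs will vary by device):

```java
import android.content.Context;
import android.hardware.display.DisplayManager;
import android.view.Display;

public class SecondaryDisplayPicker {
    // Returns the first display suitable for a Presentation, or null if
    // only the built-in display is connected.
    public static Display findPresentationDisplay(Context context) {
        DisplayManager dm =
                (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        Display[] displays =
                dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION);
        return displays.length > 0 ? displays[0] : null;
    }
}
```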
Related
I am developing an Android application and I'm trying to implement a button which initiates screen mirroring to a selected device.
I want it to be as straightforward as possible, but the best I've been able to do is open the cast settings in the Settings app, which isn't a proper solution. It would be ideal to start screen mirroring with a single click.
The problem is that I've tried using Google's Cast SDK, but that only allows me to build custom HTML cast receivers, which isn't what I want; I just need screen mirroring from the Android device to a screen/TV.
Is there any Android module/API that would allow me to do this with a single click, even if I have to use a Chromecast/Miracast device connected to the screen?
No, sorry, this is not an option AFAIK.
At least part of this is privacy/security. You seem to want to be able to start screen mirroring purely from app code ("I'm trying to implement a button", "allow me to do this with a single click"). Your button is the "single click", so really you want to do this without any user involvement at all (since nothing forces that button to exist). Showing the contents of the screen on another screen that might be visible to lots of people is the sort of thing users need to approve, which is why additional clicks are required.
In addition, there may be more than one target for screen mirroring within range, and the user needs to be able to choose which one to use, if any.
Combine all that with limited support for wireless displays across the various Android device manufacturers, and there really isn't anything here for what you want.
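To illustrate the point: what the framework does expose is the ability to query which remote display route the user has already selected (for example, through the system's own picker dialog), not the ability to select one silently. A sketch, assuming an ordinary `Context`; the helper class name is mine:

```java
import android.content.Context;
import android.media.MediaRouter;
import android.view.Display;

public class RouteHelper {
    // Returns the presentation Display for the user-selected remote route,
    // or null if the user has not chosen one. There is no public API to
    // make this selection on the user's behalf.
    public static Display selectedPresentationDisplay(Context context) {
        MediaRouter router =
                (MediaRouter) context.getSystemService(Context.MEDIA_ROUTER_SERVICE);
        MediaRouter.RouteInfo route =
                router.getSelectedRoute(MediaRouter.ROUTE_TYPE_LIVE_VIDEO);
        return route != null ? route.getPresentationDisplay() : null;
    }
}
```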
I am working with the Presentation class to show customers a different screen while still being able to interact with both displays.
But I have no idea how to test this out because my devices and emulators only show one screen.
Is there any way to accomplish this?
For hardware, you may be able to connect an external display. For example, many modern Samsung devices support USB-C to HDMI adapters.
For the emulator and hardware that does not support external displays, you can simulate an external display. In Developer Options, scroll down to "Simulate secondary displays", and from there choose a resolution for the secondary display. This will appear as a floating layer over the main device UI, and it will work with Presentation.
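A simulated secondary display behaves like a real one from the app's point of view: it triggers the same hot-plug callbacks as an HDMI or wireless display. A minimal sketch for observing this (the class name is mine):

```java
import android.content.Context;
import android.hardware.display.DisplayManager;
import android.os.Handler;
import android.util.Log;

public class DisplayWatcher {
    // Logs display hot-plug events; toggling "Simulate secondary displays"
    // in Developer Options fires onDisplayAdded()/onDisplayRemoved().
    public static void watch(Context context, Handler handler) {
        DisplayManager dm =
                (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        dm.registerDisplayListener(new DisplayManager.DisplayListener() {
            @Override public void onDisplayAdded(int displayId) {
                Log.d("DisplayWatcher", "display added: " + displayId);
            }
            @Override public void onDisplayRemoved(int displayId) {
                Log.d("DisplayWatcher", "display removed: " + displayId);
            }
            @Override public void onDisplayChanged(int displayId) { }
        }, handler);
    }
}
```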
I have a client asking for an app that can play videos from a tablet to connected HDMI screens. I'd like to know if it's possible to show different output on the HDMI screen than on the tablet itself. This is because I want to add a menu (a layer on top of the tablet screen) that is accessible to the client but not visible to the people watching the TV screen.
Thanks for the help.
That has been possible since Android 4.2; see the release notes, which also contain code samples:
Android now allows your app to display unique content on additional
screens that are connected to the user’s device over either a wired
connection or Wi-Fi. To create unique content for a secondary display,
extend the Presentation class and implement the onCreate() callback.
Within onCreate(), specify your UI for the secondary display by
calling setContentView(). As an extension of the Dialog class, the
Presentation class provides the region in which your app can display a
unique UI on the secondary display.
The Presentation class documentation in the SDK contains a code sample showing how to play a video and display some information on a second screen at the same time.
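The pattern from the release notes boils down to a small subclass. A minimal sketch (the class name and the TextView content are mine; in practice you'd set a video view or an inflated layout):

```java
import android.app.Presentation;
import android.content.Context;
import android.os.Bundle;
import android.view.Display;
import android.widget.TextView;

// Shows its own content on the secondary display while the Activity
// keeps a separate UI (e.g. the control menu) on the tablet.
public class VideoPresentation extends Presentation {
    public VideoPresentation(Context outerContext, Display display) {
        super(outerContext, display);
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Any View or layout resource works here, exactly as in a Dialog.
        TextView tv = new TextView(getContext());
        tv.setText("Secondary display content");
        setContentView(tv);
    }
}
```

Once you have a `Display` for the external screen, showing it is one line: `new VideoPresentation(activity, display).show();`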
I would like to create an Android Accessibility Application/Service.
This Accessibility app would be able to magnify any screen image produced by any application resident on the android device.
For example, I would like to be able to magnify:
the home screen
the Settings menu and sub-menus
I would like to magnify text, images, icons, etc.
I've googled and searched the android dev docs for hints/tips/ideas.
Sadly I've hit a dead end.
Is this type of Accessibility application impossible to develop on Android?
Jelly Bean (Android 4.2) apparently has this functionality built in; see this release article detailing new features in Jelly Bean: "Accessibility: Enable screen magnification to easily zoom or pan the entire screen to get a closer look. Visually impaired users can now enter full-screen magnification with a triple-tap on the screen"
Typically on mobile operating systems these features are built into the OS and are not something a third party can write. This is partly for security reasons: a magnifier would have access to the graphical output of other apps, so it could in theory send screenshots containing sensitive information back to base on the sly. It is also partly because magnification is complex: it involves interfering with normal video output and also with touch input, which has to be scaled inversely to the magnified graphics so that touching a magnified button goes to the right place.
There may be a way of doing this if you are prepared to root your device and poke around at the OS/driver level, but that's not going to help much if you want an app you can put in the store.
I'm reworking a project where I control LEDs on a miniature model, and display a presentation at the same time on an LCD Monitor above.
I previously had to work with a clunky touch-screen laptop/tablet hybrid hooked up to an external monitor, which showed a presentation slide while the model lit up the LEDs.
I've now developed an android application for the Galaxy Tab 10.1 which controls LEDs on a miniature model via bluetooth, and is hooked up to the monitor via the Samsung HDMI adapter.
My question is: is there any way to output two separate activities, one for the tablet and one for the screen? The current setup just mirrors the tablet, but when I open a slide it automatically displays the video in full resolution on the screen, so it can be done; I'm just not sure if there's an open API (I can't find one!).
Cheers!
Response from Samsung
Unfortunately Android itself does not support multiple displays, as
stated here:
https://groups.google.com/forum/#!topic/android-developers/Jxp_9ZtzL60
and Samsung also does not extend this functionality.
In that thread someone points out a useful Motorola API to control the HDMI output; shame I've got a load of Samsung tablets!
http://developer.motorola.com/docs/motorola-hdmi-status-api/