I want to place a hardware keyboard on part of the screen of a specific Android smartphone (just like the Galaxy S7 edge Keyboard Cover). So I need the Android system to always use only the part of the screen which is not occupied by the keyboard, even during full-screen video playback etc. But a service needs to be able to handle the touch events of the area occupied by the keyboard. This does not need to work while booting.
Solutions may use stock Android (with root access) or a modified LineageOS.
I tend to believe that there is no solution for stock Android.
But the Android Open Source Project is too complex for me to find a place to start modifying. The window manager service, SurfaceFlinger, or something else?
My impression is that modifying SurfaceFlinger would be the most general solution. Is modifying SurfaceFlinger even possible, or is it statically linked with the HAL and part of the binary blob? I expect SurfaceFlinger not to handle touch events in any way, right?
A vague idea that does not touch SurfaceFlinger is to modify the window manager to ask SurfaceFlinger for a virtual display smaller than the native one, use it for everything instead of the native display, and blit it onto the native one.
Another idea is to modify the rectangles of the windows in the window manager. But I don't know whether this is possible (especially for full-screen video playback).
The touch events would need to be remapped as well. By the way, does the window manager route the touch events?
Is there any component of Android which uses SurfaceFlinger directly, bypassing the window manager and my possible modifications? For example, can apps ask SurfaceFlinger for the screen size/resolution, or is this information dispatched by the window manager?
Is there a simpler way?
Any hints?
I found an even simpler solution on my own. Android's window manager maintains overscan settings for each display, which can be changed with the wm command in a root shell, for example, in order to restrict the area of the display used by the system. This works like a charm.
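To make the approach concrete, here is a minimal sketch of how one might compute the overscan insets for a keyboard strip along the bottom edge and print the corresponding wm shell command. The panel size and keyboard height are made-up example values, and the wm overscan syntax shown in the comment is the one used by the wm tool on older Android releases.

```java
// Sketch: compute the overscan insets needed to reserve a strip at the
// bottom of the screen for a hardware keyboard, and print the matching
// `wm` shell command. 2560 px panel height and 800 px keyboard height
// are example values, not measurements of a real device.
public class OverscanCalc {
    static String overscanCommand(int left, int top, int right, int bottom) {
        // Syntax used by the `wm` tool: wm overscan LEFT,TOP,RIGHT,BOTTOM
        return String.format("wm overscan %d,%d,%d,%d", left, top, right, bottom);
    }

    public static void main(String[] args) {
        int panelHeight = 2560;    // native panel height in px (example)
        int keyboardHeight = 800;  // strip covered by the keyboard (example)
        // Only the bottom inset is non-zero: reserve the bottom strip.
        System.out.println(overscanCommand(0, 0, 0, keyboardHeight));
        System.out.println("usable height: " + (panelHeight - keyboardHeight));
    }
}
```

Running the printed command in a root shell restricts the system to the remaining area; `wm overscan reset` undoes it.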
And yes, the window manager routes the input to the top window. But I do not know how and where, because the WindowManagerService class is a huge mess I do not want to fiddle with anymore.
The documentation even states that a touchscreen driver exposing a specific file in /proc (or /sys, I can't remember) containing information about where fixed soft buttons are located and how they should be mapped to Linux key codes will make the system use them automatically. So a custom kernel module which creates such an entry in the filesystem should eventually do the trick. But it is untested.
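For reference, the AOSP touch-device documentation describes such virtual-key map files as exposing one colon-separated entry per key, of the form 0x01:&lt;linuxKeyCode&gt;:&lt;centerX&gt;:&lt;centerY&gt;:&lt;width&gt;:&lt;height&gt;. The sketch below just assembles such a map line; the key codes and coordinates are made-up example values.

```java
// Sketch of the virtual-key map entry format described in the AOSP
// touch-device documentation. Each entry is
// 0x01:<linuxKeyCode>:<centerX>:<centerY>:<width>:<height>,
// with multiple entries joined by ':'. Coordinates below are examples.
public class VirtualKeyMap {
    static String entry(int keyCode, int cx, int cy, int w, int h) {
        return String.format("0x01:%d:%d:%d:%d:%d", keyCode, cx, cy, w, h);
    }

    public static void main(String[] args) {
        // Two example keys on the strip covered by the hardware keyboard:
        // KEY_A = 30, KEY_B = 48 (Linux input key codes).
        String map = entry(30, 100, 2300, 180, 160) + ":"
                   + entry(48, 300, 2300, 180, 160);
        System.out.println(map);
    }
}
```

A kernel module would expose a string like this via the per-device virtual-keys file so the input system maps touches in those rectangles to key events.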
The button presses of the hardware keyboard are handled by a dedicated service only, so I will simply read /dev/input/event directly in this service.
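Reading the event node means decoding raw Linux struct input_event records. A minimal decoding sketch, assuming a 32-bit ABI (16-byte events: two 32-bit timeval fields, u16 type, u16 code, s32 value; on 64-bit kernels the timeval fields are 64 bits each, giving 24-byte events):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch: decode one Linux `struct input_event` as read from a
// /dev/input/eventN node, assuming a 32-bit ABI (16-byte records).
public class InputEventReader {
    static final int EV_KEY = 0x01; // event type for key presses/releases

    static int[] decode(byte[] raw) {
        ByteBuffer b = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
        b.getInt();                 // tv_sec  (ignored here)
        b.getInt();                 // tv_usec (ignored here)
        int type  = b.getShort() & 0xFFFF;
        int code  = b.getShort() & 0xFFFF;
        int value = b.getInt();     // 1 = press, 0 = release, 2 = auto-repeat
        return new int[] { type, code, value };
    }

    public static void main(String[] args) {
        // Synthetic event: EV_KEY, code 30 (KEY_A), value 1 (press).
        ByteBuffer b = ByteBuffer.allocate(16).order(ByteOrder.LITTLE_ENDIAN);
        b.putInt(0).putInt(0).putShort((short) EV_KEY).putShort((short) 30).putInt(1);
        int[] ev = InputEventReader.decode(b.array());
        System.out.println("type=" + ev[0] + " code=" + ev[1] + " value=" + ev[2]);
    }
}
```

The service would loop over 16-byte chunks read from the node and react to EV_KEY events only.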
I want to write a simple convenience app that allows me to have two locking modes on my phone. By default the phone would just go to the slide lock after a minute or two, but after a longer time or if I activate my app, it should engage a more stringent lock, such as the face unlocking. Basically when I have the phone on me, anything but the slide lock is overkill.
To do this I would have to read/write the preferences for the screen lock or find a list of available locking/unlocking mechanisms so I can select and invoke one of them. Does anybody know where this information can be found/is stored?
I expected the preference keys to be found in the Settings.Secure class, but could not immediately find anything related except the LOCK_PATTERN_ENABLED setting, which would not be enough.
I tried searching for references to the FaceDetector class, but none are returned in my Eclipse.
The Device Administration API sample looks like it might give some leads if I could look at the source code. Unfortunately, as far as I can see, the page omits which of the several folders of the sample directories for the approx. 10 supported API levels I need to download and look in.
You can change lock modes in your app only if it is a device admin. These APIs live in the DevicePolicyManager class, in methods such as setMaximumTimeToLock() and lockNow(). If you are interested in creating a custom lock screen app, you can try this.
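For lockNow() and setMaximumTimeToLock() to work, your DeviceAdminReceiver must be activated as a device admin and must declare the force-lock policy. A minimal policy declaration, following the Device Administration documentation (file name and receiver wiring are up to you), might look like:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- e.g. res/xml/device_admin.xml, referenced via android.app.device_admin
     meta-data on your DeviceAdminReceiver entry in AndroidManifest.xml -->
<device-admin xmlns:android="http://schemas.android.com/apk/res/android">
    <uses-policies>
        <!-- Required for DevicePolicyManager.lockNow()
             and setMaximumTimeToLock() -->
        <force-lock />
    </uses-policies>
</device-admin>
```

The user still has to confirm activation of the admin component before these calls succeed.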
I've been using FLAG_WATCH_OUTSIDE_TOUCH to get touches from a system overlay in Honeycomb, but this no longer works in ICS. I need the application underneath to receive touches too, so TYPE_SYSTEM_ALERT didn't work. The application isn't going to be put on the Play Store, so it's OK if it needs root or has to be placed in the system directory to work. Any ideas?
It seems this is indeed the case, as this post highlights. This is generally a bad idea; see the documentation:
These windows must not take input focus, or they will interfere with the keyguard.
I haven't actually tested those overlays on Android 4.0 (ICS), but other apps like SwipePad seem to do this just fine. The only concern I am aware of is performance: such overlays often take a hefty toll on the device. If you want to accurately detect touch input and the overlay does not have to sit above the lock screen, try TYPE_SYSTEM_ALERT. Another post on SO seems to have chosen that type as well.
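As a sketch (Android framework code, not runnable off-device), a TYPE_SYSTEM_ALERT overlay that stays unfocusable, lets touches outside its bounds fall through to the app underneath, and still gets notified of them could be set up roughly like this; it requires the SYSTEM_ALERT_WINDOW permission, and overlayView stands in for whatever View you want to show:

```java
// Sketch: a pass-through TYPE_SYSTEM_ALERT overlay. Touches outside the
// overlay reach the app below; the overlay receives them only as
// MotionEvent.ACTION_OUTSIDE via FLAG_WATCH_OUTSIDE_TOUCH.
WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.TYPE_SYSTEM_ALERT,
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
            | WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL
            | WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
        PixelFormat.TRANSLUCENT);
WindowManager wm = (WindowManager) getSystemService(Context.WINDOW_SERVICE);
wm.addView(overlayView, lp);
```

Note that ACTION_OUTSIDE events only report that a touch happened outside, not a full stream of coordinates, which is exactly the ICS behaviour change the question runs into.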
I'm developing a very simple cross-platform window class in C++, just so I have a surface to render to. I've gotten it working on Linux and Windows so far. After I get it working on OS X, I want to try to get it working on my Android phone.
However, I need to know whether all "windows" created with SurfaceFlinger are full screen, or whether they can take up only a section of the desktop like a window on Linux or Windows. I ask because I know I can place widgets on my phone's desktop, but I've never seen an app do anything like a popup or a frame that hovers over the desktop.
How does creating a "window" that is smaller than the resolution of the phone work? Does it just center the drawable surface and leave black borders? Also can an application have multiple "windows?"
SurfaceFlinger, as its name suggests, deals with surfaces, not windows.
Each window actually holds one surface it can draw onto, but these are different types of classes.
Whenever the ViewRootImpl (the top view of a window) of an application window is created or changed in some way, a call is made to the WindowManagerService's relayout function.
Now, skipping some boring details, the WindowManagerService creates a surface.
A surface can be of any size, and if you are using multiple displays, it can even be attached to a certain display (though this is not yet implemented).
This brings us back to your question(s):
- A surface (a window, if you like) can be of any size. The black borders you mentioned actually come from the window placed below the current one (which is painted black).
- Yes, an application can have multiple windows (a window, for example, can be a dialog).
As for widgets, I know how the Launcher (the desktop app) supports them and supports their drag and drop behaviours, but I never asked myself whether they were in fact new windows - so I can't really answer that.
Also can an application have multiple "windows?"
Yes, an application can have multiple windows.
1. Status bar window
2. Activity screen window
3. Navigation window
4. Dialog and so on.
I'm quite new to Android programming, but familiar enough with C/C++ and Linux to program a few things. In my next project, I have to run some native (Linux) applications under Android.
The executable I'm calling is using the framebuffer device to produce screen output. When I just call the program, its output is lost because Android redraws the screen, overwriting the framebuffer.
In order to prevent this, I made a "stub" Android program which actually has no window design (just a black screen) but calls my program. This works well to some extent; however, whenever I rotate the screen or some Tattoo notification comes in, Android redraws the screen and we're back to square one...
Thus, I wondered if there is a "low-level" API or something which can prevent Android from using the screen at all until I release it. In short, like a WakeLock preventing the phone from sleeping, I want a 'lock' which locks the screen (framebuffer device) completely. Does anyone know how I can achieve such functionality?
PS: This might have something to do with SurfaceFlinger, but I've read somewhere that it doesn't expose any user-level API.
On Android the screen belongs to SurfaceFlinger.
The executable I'm calling is using the framebuffer device to produce screen output.
That's not supported.
Thus, I wondered if there is a "low-level" API or something which can prevent Android from using the screen at all until I release it.
Nothing that is supported at the SDK or NDK levels. You are welcome to create your own custom firmware that does whatever you want, and load that custom firmware on your own device.
Otherwise, please use a supported means for interacting with the screen (e.g., OpenGL natively in C/C++).
I was hoping to make an app which dimmed the soft buttons to dots on the Galaxy Nexus, using the code from this answer. For completeness, this is the code:
getWindow().getDecorView().setSystemUiVisibility(View.SYSTEM_UI_FLAG_LOW_PROFILE);
Here's the question: is it possible for the app to make this setting system-wide rather than just while the app has focus? So ideally the app would run in the background and would keep View.SYSTEM_UI_FLAG_LOW_PROFILE as the default as long as it's running, for any and every app that you open (unless that app specifically overrides it, I suppose). Is this possible, or does this fall outside the realm of what an Android app has permission to do?
A sample use case is this: I use the "Screen Filter" app to reduce brightness a lot for nighttime ebook reading or misc app usage, but the soft buttons are still very bright and distracting, so I wanted to make an app that would dim the soft buttons system-wide while running (like how "Screen Brightness" reduces screen brightness system-wide while running) so this wouldn't be a problem.
As CommonsWare states, it's not possible for an application to change this setting. It's an Activity-based setting that must be set for every single Activity if you want to make it fullscreen, hide the soft keys, etc.
It's also worth pointing out that you should probably refrain from using it in all your application activities. It's really only meant for games, video players and other applications that need to enter this "low profile" state.
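Within a single Activity, the flag also gets cleared by the system on certain interactions, so the usual pattern is to re-apply it from a visibility-change listener. A sketch (Android framework code, not runnable off-device, placed in e.g. onCreate() of each Activity that wants the low-profile state):

```java
// Sketch: dim the navigation keys and re-apply the flag whenever the
// system clears it (e.g. after the user touches the navigation bar).
final View decor = getWindow().getDecorView();
decor.setSystemUiVisibility(View.SYSTEM_UI_FLAG_LOW_PROFILE);
decor.setOnSystemUiVisibilityChangeListener(
        new View.OnSystemUiVisibilityChangeListener() {
    @Override
    public void onSystemUiVisibilityChange(int visibility) {
        if ((visibility & View.SYSTEM_UI_FLAG_LOW_PROFILE) == 0) {
            decor.setSystemUiVisibility(View.SYSTEM_UI_FLAG_LOW_PROFILE);
        }
    }
});
```

This keeps the buttons dimmed inside your own app, but per the accepted answer it cannot affect other apps' windows.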
Feel free to accept CommonsWare's answer - I just thought I'd give you a bit of background info on the subject.
is it possible for the app to make this setting system-wide rather than just while the app has focus?
No.