I'm quite new to Android programming but familiar enough with C/C++ and Linux to program a few things. In my next project, I have to run some native (Linux) applications under Android.
The executable I'm calling is using the framebuffer device to produce screen output. When I simply run the program, its output is lost because Android redraws the screen, thus overwriting the framebuffer.
In order to prevent this, I made a "stub" Android program, which has no window design (just a black screen) but calls my program. This works well to some extent; however, whenever I rotate the screen, or some Toast notification appears, Android redraws the screen and we're back to square one...
Thus, I wondered if there is a "low-level" API or something that can prevent Android from using the screen at all until I release it. In short, like a WakeLock that prevents the phone from sleeping, I want a 'Lock' that locks the screen (framebuffer device) completely. Does anyone know how I can achieve such functionality?
PS: This might have something to do with SurfaceFlinger, but I've read somewhere that it doesn't expose any user-level API.
On Android the screen belongs to SurfaceFlinger.
The executable I'm calling is using the framebuffer device to produce screen output.
That's not supported.
Thus, I wondered if there is a "low-level" API or something that can prevent Android from using the screen at all until I release it.
Nothing that is supported at the SDK or NDK levels. You are welcome to create your own custom firmware that does whatever you want, and load that custom firmware on your own device.
Otherwise, please use a supported means for interacting with the screen (e.g., OpenGL natively in C/C++).
I want to place a hardware keyboard over part of the screen of a specific Android smartphone (just like the Galaxy S7 edge Keyboard Cover). So I need the Android system to always use only the part of the screen that is not occupied by the keyboard, even during full-screen video playback etc. But a service needs to be able to handle the touch events of the area occupied by the keyboard. This does not need to work while booting.
Solutions may use stock Android (with root access) or a modified LineageOS.
I tend to believe that there is no solution for stock Android.
But the Android Open Source Project is too complex for me to find a place to start modifying. The window manager service, SurfaceFlinger, or something else?
My intuition is that modifying SurfaceFlinger would be the most general solution. Is modifying SurfaceFlinger even possible, or is it linked statically with the HAL and part of the binary blob? I expect SurfaceFlinger not to handle touch events in any way, right?
A vague idea that does not touch SurfaceFlinger is to modify the window manager to ask SurfaceFlinger to create a virtual display smaller than the native one, use it for everything instead of the native display, and blit it to the native one.
Another idea is to modify the rectangles of the windows in the window manager. But I don't know whether this is possible (especially for full-screen video playback).
The touch events would need to be modified as well. By the way, does the window manager route the touch events?
Is there any component of Android that uses SurfaceFlinger directly, bypassing the window manager and my possible modifications? For example, may apps ask SurfaceFlinger for the screen size/resolution, or is this information dispatched by the window manager?
Is there a simpler way?
Any hints?
I found an even simpler solution on my own. Android's window manager maintains overscan settings for each display, which can be changed with the wm command in a root shell, for example, in order to restrict the area of the display used by the system. This works like a charm.
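For illustration, the wm overscan subcommand takes insets in the order left,top,right,bottom (note this subcommand existed roughly from KitKat up to Android 10 and was removed later; the 300 px value below is just an example for a bottom-anchored keyboard):

```shell
# Reserve the bottom 300 px of the display for the hardware keyboard
adb shell wm overscan 0,0,0,300

# Undo it and give the system the whole display back
adb shell wm overscan reset
```

These are device-side commands; on a rooted device they can equally be run from a local root shell without adb.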
And yes, the window manager routes the input to the top window. But I do not know how and where, because the WindowManagerService class is a huge mess that I do not want to fiddle with anymore.
The documentation even states that a touchscreen driver exposing a specific file in /proc (or /sys, I can't remember) containing information about where fixed soft buttons are located, and how they should be mapped to Linux key codes, would make the system use them automatically. So a custom kernel module that creates such an entry in the filesystem should eventually do the trick. But it is untested.
The button presses of the hardware keyboard are handled by a dedicated service only, so I will simply read /dev/input/event directly in that service.
I'm looking for a way to take a screenshot/screen capture of an OpenGL application that is behind another window. Namely, I want to screen-capture the Android Emulator running an OpenGL app.
Since the Android Emulator is behind another window, it's not visible on the PC screen, so taking a normal screenshot or using one of the several screenshot tools I found does not work. Making the Android Emulator the active window to capture it is not an option, because I don't want to distract the computer user.
Of the numerous screenshot tools I tried, one was able to capture background windows, but it couldn't handle OpenGL and returned only a black picture. I didn't find a tool that could handle both background windows and OpenGL.
Note: Although the user should not be distracted, using a graphical screenshot tool will probably work. I'd do this by leaving the tool in the background and sending it mouse/keyboard input, e.g. via AutoHotkey.
I'd also be comfortable with writing a program for this task, but I haven't found a way to screen-capture an OpenGL window. It's possible to capture Direct3D windows, and it is also possible to draw the content of an OpenGL window somewhere else, but neither of these seems helpful here.
Using the Android Debug Bridge to take a screenshot returns only black pictures. People often suggest turning off the GPU acceleration of the emulator, but when I do this, the Android app I'm trying to capture doesn't start anymore.
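For reference, the two approaches mentioned above are typically invoked like this (the AVD name is a placeholder; exact flags depend on SDK version):

```shell
# Capture a screenshot through ADB -- tends to come back black when the
# emulator renders with host GPU acceleration
adb shell screencap -p /sdcard/screen.png
adb pull /sdcard/screen.png

# Start the emulator with GPU emulation disabled (the workaround that,
# as noted above, prevents this particular app from starting)
emulator -avd MyAvdName -gpu off
```
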
I'm using Windows 7 (32-bit) with the latest Android SDK and Android 4.4.2. When the Android Emulator is visible, a normal screenshot shows its actual content, not just a black rectangle.
So is there any way to do what I want?
I'm working on UI automation tests for an Android app. I need to save off a screenshot of the app as it appears during various steps of the test for later analysis by test engineers.
The usual way to do this in Android is to get the Window, then get the DecorView, then call onDraw with a Canvas backed by a Bitmap and save the Bitmap. This doesn't work when a Dialog is showing on top of an Activity, however. The Dialog and Activity each have their own separate DecorView.
Is there any way to programmatically take a screenshot of the entire app with all windows composited? Unfortunately, the device is not rooted and the app does not have signature permissions, so this answer in another thread does not apply:
https://stackoverflow.com/a/13837910/244521
For phone devices: this article shares some tips that might help. However, on ICS and above, you may need to use Home + Power rather than the Volume Down + Power combination the article mentions.
For automated tests, perhaps AutomatedScreenshots will help.
I have a problem and very little to go on. I'm about to release an app (created with AIR for Android / AS3) on the Samsung App Store, and after the app was tested by Samsung staff, I got a list of issues that have to be resolved before the app can be released.
I did manage to solve almost all of the issues, but one very important one is beyond me. They say the screen turns/stays black when returning to the app after the device alarm interrupted it. This issue practically happened on all their devices, including phones I own (e.g. the Galaxy S3).
I do have "OnDeActivate" and "OnActivate" listeners in place that pause the app, disable sound etc. when it loses focus, gets minimized and so on, yet I checked on my devices and I can't reproduce this error. Meaning: if the app gets interrupted on my device by the alarm, I can resume it without any problems. No black screens.
So the question is: is there any way for me to fix this at all? I have to work within AIR for Android / AS3, so I guess the possibilities are limited. Any clues where I can look? Any listeners to set, or is there a way to "force" the app to reinitialize or refresh the display? Or to listen for the system alarm? Help would be much appreciated. Thanks in advance!
I am trying to overcome the same issue. I read somewhere that setting the stage quality to something else on both the activate and deactivate events might solve it.
So just set your stage quality to medium (or anything different) in the deactivate handler and set it back to what it needs to be in the activate handler.
This should make AIR snap out of that black screen for the alarm (I hope)
An app of mine with this fix is currently undergoing testing on the Samsung App Store.
I hope it fixes it.
Good news: the dirty fix of toggling the stage quality seems to have worked for Samsung; the issue has not shown up in the latest certification report of my app.
By the way, this is not for a Stage3D app (that's different); it's for a GPU-rendered app.
When the app loses focus on Android (goes into the background) it will lose the context, which among other things means that you lose all the created graphics, cached objects and the like.
You didn't specify what kind of app it is. If you're using Stage3D, you'll have to recreate all your textures; if you're on the plain old display list, you'll have to recreate any bitmaps that were created at runtime, and redraw your screen at least once (so the vector graphics get redrawn too).
Now, if you're using Starling, for example, it can take care of recreating the context for you (there's a flag to enable that), although you'll still have to recreate dynamically created bitmaps.
I am trying to write an application with Mono for Android. In an attempt to do this, I'm using the default template in MonoDevelop. I can successfully compile and run the application. When I run it, it looks similar to the one shown here: http://docs.xamarin.com/android/getting_started/hello_world
There are two oddities in my version though:
The button is red
I can't seem to actually click the button. When I use my computer's mouse, it acts like it won't click the button. This is not limited to the application, either: if I try to click the home or search button in the emulator itself, nothing happens. It's like the emulator is not responding to my mouse.
As someone new to working with Android, can someone please tell me what I'm doing wrong? I'm using Mac OS X with Lion installed. I'm assuming that I have the Android SDK and Java SDK installed properly, considering the app compiles and, when I press "play", I can load the app in the emulator. I just can't figure out why I can't actually click the button. So bizarre.
Any ideas?
The title of this question is pretty misleading, since you're saying that the emulator is not very responsive even outside the Mono for Android application. The problem here is with the emulator itself. The one thing I would recommend trying with respect to Mono for Android is to try starting the application without debugging, as debugging will add extra overhead to running the app.
The Android emulator is notoriously slow, since it is fully emulating the ARM instruction set in software. That said, there are certain things you can do in order to squeeze some more speed out of it. One thing that I've seen make a big difference is to decrease the screen size of the emulator image. Setting this to a small screen size (such as QVGA) can make a big difference. You can manage these settings through Android's AVD Manager.