dumpsys SurfaceFlinger output interpretation - android

Recently I have started using dumpsys SurfaceFlinger to gather information about Android graphics. I currently work on a development board called the Odroid-XU3. The display is a Dell monitor connected to the board through an HDMI cable.
In the very last few lines of the output of the above command, I see two displays, while I expect only one. One of them is Display[0] and the other is Display[1]. The type column of each display can be HWC or GLES. Sometimes both are HWC or both GLES, and at other times one is HWC and the other is GLES.
What is the difference between Display[0] and Display[1]?
I have tried to find documentation explaining how to interpret the output of the aforementioned command, but I have not found anything useful.

It would help to have the dumpsys output in your question, but I can make a couple of general observations.
display[0] is the device's built-in display. display[1] is an "external" display, in your case HDMI. These two indices are hard-wired. (Well, they were as of KitKat; I don't know if they've since un-hard-wired things.) Virtual displays start at index 2.
The chunk of text below each display is the hardware composer dump. It lists the layers on the screen, with information about the position, rotation, and pixel format of each layer.
The closest thing to documentation can be found in the HWC section of the graphics architecture doc. Given the level of the system you're working with, I would recommend you read the entire document. Beyond that, you can try to discern the meaning from the source code.
The arch doc does explain the difference between "HWC" and "GLES" in the output: "HWC" means the surface lives on a display overlay, while "GLES" means the surface is being composed with the GPU instead.

Related

Android: draw using fb0 on eink device

I've been looking around a lot and can't find much of anything clear. The context: I have a stylus-enabled e-ink tablet that I program on for fun, and I'd love to use a native library to read the pen events and draw directly on the framebuffer. The vendor provides one, secret (a .so JNI lib that you can call with just a size parameter).
I suppose this is intended to activate a direct draw on the framebuffer, eventually with a refresh, but I can't grasp how it's supposed to compose with SurfaceFlinger and Android.
Does anyone have experience with generic e-ink tricks for displaying from JNI via ioctl who could explain why I don't see the pixels changing unless I draw in Java? (I can change the update mode and draw quite fast, but I want the fastest.)
How could I verify writes on the FB? Can an Android app be "overlaid" by pixels written directly to the framebuffer?
I figured it out in the end. Using strace to catch calls to fb0 via ioctl, plus lsof and remote debugging of the official drawing application, showed me that I was wrong.
This particular tablet's vendor software, unlike the reMarkable's, does NOT draw on the framebuffer directly via magic ioctl commands. All it does is register an area of the screen as non-refreshable by Android's normal display primitives. That lets them avoid configuring each and every view in the hierarchy to be in Direct Update e-ink mode, while using the Android framework with as little custom code as possible.
I could see the command passed that way:
[pid 3997] openat(AT_FDCWD, "/dev/graphics/fb0", O_RDWR|O_LARGEFILE) = 30
[pid 3997] ioctl(30, _IOC(0, 0x6d, 0x25, 0x00), 0x1) = 0
[pid 3997] close(30)
This simply puts the screen on lockdown and stops all Android refreshes. This means their updates could be much faster.
I suggest you write a basic pixel-plotting program in C against the framebuffer. Execute it normally, and then create a JNI wrapper that calls the same code. The speed difference is too small to notice.

New to Micro-controllers and Lower level programming, Is this possible to do?

Hey all, I'm a recent BS graduate in Mechanical Engineering working on a project that is getting into the field of CS, and I am looking to remake a treadmill after its 1990s motherboard finally quit.
I have the following assets:
Treadmill with broken motherboard (all other components tested and functional)
Touch screen monitor similar to this
A polar heart rate monitor.
Multiple hard drives, joysticks and other USB accessories.
NI LabVIEW full subscription suite
2 functioning (2000's era) laptops with no OS.
Solidworks
Local maker's space
I have a few main goals and stretch goals, and I'd like some advice as to which should be easy enough to implement and which will take me a research team and 5 years.
This should be easy... right?
Get a PID controller set up with a microcontroller to spin the treadmill belt at [n] mph and adjust incline to [n2] degrees based on a hardware dial, knob, or push-button physical input:
* get microcontroller to read motor encoders for speed/incline
* get microcontroller to recognize input from a physical button
* get microcontroller to compare current speed/incline values with target values and increase/decrease current to the motors appropriately
* have microcontroller display info on LCD screen
Change from physical input to touchscreen input:
* figure out what they're doing [in link 1 in comments below] and adjust for what I currently have (or buy fresh if absolutely necessary)
* change input from hardware buttons to software <up> <down> arrows
* add hardware E-stop
It looks like there are plenty of libraries and devices online that are doing elements of these two steps, combining them may be difficult due to my inexperience, but not hard for the hardware and software.
Medium Difficulty (I saw a guy do this once)
Upload some kind of Linux distribution or other OS onto my microcontroller and turn my program into an application:
* learn how to install Linux/other OS
* compile program as an application
* section off the bottom of the LCD screen as a treadmill-specific taskbar
* (bonus round) make the treadmill-specific taskbar able to be moved and snapped (similar to the Windows taskbar)
Add feedback from a heart rate monitor to the treadmill for heart rate PID control:
* SparkFun has a Single Lead Heart Rate Monitor - AD8232 [Link 2]; write an application to read the monitor and control the treadmill program accordingly.
I feel like this is theoretically possible, but I don't really know how I would go about it. I also see how either of these tasks could be infinitely more complex than I think.
Hard mode (Is this even possible?)
Put on smartphone style functionality.
* Install Android OS onto the microcontroller
* Install the Google Play store
* Dedicate a set of pixels to the "treadmill OS" and the rest to the "smartphone"
* Add some sort of hook for the "treadmill OS" into the Android OS, and maybe write a few apps to control the treadmill based on [arbitrary value in app]
If I can do this, why are all the super expensive and advanced treadmills on the market so crappy in terms of their software?
For my skill set, I'm pretty good on how to physically put everything together (but I will need to make a few posts to the Electronics Stack Exchange about how to get something the size of a smartphone to regulate 120 V 60 Hz power correctly).
My main question is how much of this is actually conceivable to do, and if I am to do it in a way that satisfies all my desires, should I:
A) look to buy a particular type of microcontroller to do all of this (recommendations would be appreciated)
B) start with one of my two laptops and write an interface for a microcontroller that just does the easy stuff
C) install the Android OS on one of my laptops and begin writing a [treadmill app]
D) do something I haven't thought of, because this is not my field.
ps: Although this is a DIY project, when it comes to the coding, I really don't want to be reinventing the wheel so please let me know about any libraries or resources that may exist which could be helpful
Wow, what a project!
Getting the treadmill working
If your goal is to "get the treadmill working," then don't bother with any of this; instead focus on debugging the motherboard. There's probably just 1 component that went bad, and it will be easier and faster to fix that than to build everything you mentioned up through easy/medium/hard modes. But I know your goal is learning and fun, not simply to get it working :)
Control loops and data collection
As you've already identified, you need something for low-level access to the hardware (controlling the treadmill and reading heart rate back). This type of work is perfect for a micro, so you're on the right path there. Android or Linux are needlessly complex for these tasks, and implementing them will be a lot more work for you, with not much advantage.
User interaction
At a bare minimum, the existing physical buttons and knobs will directly control the micro. Once you hit that checkpoint, congratulations, your treadmill works again.
But you don't want "working", you want "cool". You mentioned a few different ways for users to interact with your system: displays, touch screens, phones, etc. Already this is going to be a huge project, so don't waste time reinventing the wheel by trying to manually implement those things. Find a working system (your laptop, daily cellphone, or even a cheap tablet online), and use that to talk with your low-level micro over something like Bluetooth or WiFi.
Choosing the right tools
If you pick something obscure, expect to spend tons of time simply trying to get basic functionality out of it. So in general, you want to pick hardware & software that:
is robust (many people use it with minimal issue)
has a large community (for support from other experts/hobbyists)
has a large ecosystem (with lots of libraries that you can leverage)
The Arduino might be a good micro for you. Look into that.
For the "cool" display, your personal phone is probably the best option. The app development for your phone is robust and will have tons of support when you need it.
Other thoughts
You mentioned LabVIEW: stop doing that. It's the wrong tool for almost every goal you have.
You asked how to regulate mains power down to a small board: buy/find any old wall-wart adapter from old electronics around your home. Cut off the tip. Connect the wires to your board. Done. (All the magic is inside the brick.)
You asked which approach is best: B. Get the treadmill working with a basic micro. Then add wireless to the micro. Then write an app to give you a sweet display and control of the treadmill (via the micro).
E-stop. Smart.

adb screencap output is different than on the device

I have a graphical glitch related to blending in my OpenGL application using Android NDK.
The strange thing is that when I take a screenshot through adb screencap command, the problem completely disappears and the result looks okay.
My question is:
Is there a way to know what is happening behind the scenes of making screenshots? Is there eglChooseConfig called with some specific values for the entire frame for example? Or maybe is there some specific initial GL state forced?
Some background:
My device is using Qualcomm Adreno 320.
The glitch occurs when I call glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) for some of the geometry.
I have also found that setting glColorMask(1, 1, 1, 0) results in a black screen on my device (and only on this device), whereas taking a screenshot results in a complete, correct game frame.
The application outputs no glitches on several other Android devices, and other applications work well, even ones that use blending extensively.
Generally speaking, devices don't have a framebuffer full of pixels that you can copy out of when capturing the screen. The "screen capture" functions are actually "redraw the screen into a buffer" functions. It's possible for the screen capture to be slightly different, deliberately so if a "secure" layer or DRM content is on screen.
Is this a single, fully opaque Surface? Or is it being blended with another layer above or below?
The most common reason for differences is bugs in the Hardware Composer, but it sounds like you're seeing issues rendering on a single surface, so that is less likely. If you have a rooted device, you can turn HWC composition on and off using the commands shown here: adb shell service call SurfaceFlinger 1008 i32 1 will disable overlays and force GLES composition. (If none of that made any sense, read through the graphics architecture doc.)
Are you able to post images of the correct and glitched images? (One via screenshot, one by taking a picture of the device with a second device.)
Do you see similar issues if you record the screen with adb shell screenrecord?
The problem disappeared once I commented out EGL_ALPHA_SIZE setting:
const EGLint attribs[] = {
    EGL_BLUE_SIZE, 8,
    EGL_GREEN_SIZE, 8,
    EGL_RED_SIZE, 8,
    // EGL_ALPHA_SIZE, 8,
    EGL_NONE
};
It looks like with alpha set to 8 bits, eglChooseConfig returned a problematic configuration object.
Funnily enough, the "correct" EGLConfig specifies 0 bits for EGL_ALPHA_SIZE, so at first I would have expected it not to work at all. Other devices don't really care about the value and do fine when given only the RGB channel depths.
I have learned a lesson: if there are graphical glitches on your device, always check all possible EGL configurations!
So my conclusion is: yes, adb screencap probably uses its own EGLConfig internally.

Measuring Frame Render Time of Android App

I'd like to know the performance of my app, particularly the rendering time of video frames. I'm aware of DDMS in Eclipse->DDMS Perspective->System Information->Frame Render Time. However, as has been asked in this other question, the rendering information doesn't seem to show up even when you 'Update from Device' while the app is running.
After some research, I came across this command from this blog and this other one:
adb shell dumpsys gfxinfo <package-name>
However, when I run that command, it completes running right away and I get an output that looks like this, in part:
That is, I do not get more than one line of profile data (often the result is empty). What can I do to obtain more than one data point to graph? Or what other alternatives should I consider for assessing the application's rendering performance?
The correct tool is systrace (basic docs, verbose docs). You will need a device that supports it, and will need root for some of the more useful bits.
You can see what every thread is doing, and much of the graphics stuff is instrumented. I don't think the video codecs themselves are typically instrumented, so you won't be able to see hardware codec decode times, but you can see the buffers move through the system.
Most of the interesting stuff will happen in mediaserver or SurfaceFlinger, so you won't see decoding / rendering activity appear in method profiling for your app.

Flashing black screen in Android under OpenGL ES

On some Android test devices, when rendering in OpenGL ES 2.0, the screen flashes.
I was able to track the problem down to the GLSurfaceView class at the point where eglSwapBuffers is called: the flashing is produced on each iteration, so on one frame the screen becomes black and on the next it has the image I've drawn. It seems that eglSwapBuffers is not preserving the back buffer on each call, producing this flashing behaviour.
Is there any way to preserve the back buffer? I've found that I could maybe use the EGL_SWAP_BEHAVIOR_PRESERVED_BIT flag, but I can't figure out how to set it on Android, nor how to use it on old APIs such as Gingerbread.
Thanks
You should not need to modify GLSurfaceView. It's much more likely that your problem is caused by the drivers or configuration of your system. I would try a different test device with different graphics drivers. What happens when you run it on an AVD?
It could be that your test device is not making enough memory available to the underlying Linux framebuffer device to get normal triple buffering. Most systems fall back to single buffering in that case. I recommend that you check these fb device parameters; the virtual_size should be large enough for 2 or 3 buffers for the display mode you are using:
cat /sys/class/graphics/fb0/mode
U:1024x768p-60
cat /sys/class/graphics/fb0/virtual_size
800,1440
