Programmatically get GPU information on Android without using GLSurfaceView - android

I need to get GPU information (vendor, renderer, version, available extensions, etc.) programmatically in my Android SDK. So far, I know of these ways to do it:
Run the dumpsys | grep GLES command (does not work, because dumpsys is a system-level command available only to some system apps or the ADB shell)
Create a GLSurfaceView, attach a GLSurfaceView.Renderer, and read the GPU information in its callback (works fine, but I want to know whether there is a way to do this without creating a GLSurfaceView, since I do not want to tinker with the host app's view hierarchy from my SDK; see the sketch below)
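A minimal sketch of one possible GLSurfaceView-free route, assuming API level 17+ (for EGL14) and an ES 2.0-capable device: create a throwaway 1x1 pbuffer context off-screen, read the GL strings, and tear everything down again. The class name is illustrative and error checking is omitted.

import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLSurface;
import android.opengl.GLES20;

public final class GpuInfo {
    // Queries GL_VENDOR / GL_RENDERER / GL_VERSION / GL_EXTENSIONS from an
    // off-screen EGL context, so no View is attached to the hierarchy.
    public static String[] query() {
        EGLDisplay display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
        int[] version = new int[2];
        EGL14.eglInitialize(display, version, 0, version, 1);

        int[] configAttribs = {
                EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                EGL14.EGL_SURFACE_TYPE, EGL14.EGL_PBUFFER_BIT,
                EGL14.EGL_NONE };
        EGLConfig[] configs = new EGLConfig[1];
        int[] numConfigs = new int[1];
        EGL14.eglChooseConfig(display, configAttribs, 0, configs, 0, 1, numConfigs, 0);

        int[] contextAttribs = { EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE };
        EGLContext context = EGL14.eglCreateContext(
                display, configs[0], EGL14.EGL_NO_CONTEXT, contextAttribs, 0);

        int[] surfaceAttribs = { EGL14.EGL_WIDTH, 1, EGL14.EGL_HEIGHT, 1, EGL14.EGL_NONE };
        EGLSurface surface = EGL14.eglCreatePbufferSurface(display, configs[0], surfaceAttribs, 0);
        EGL14.eglMakeCurrent(display, surface, surface, context);

        String[] info = {
                GLES20.glGetString(GLES20.GL_VENDOR),
                GLES20.glGetString(GLES20.GL_RENDERER),
                GLES20.glGetString(GLES20.GL_VERSION),
                GLES20.glGetString(GLES20.GL_EXTENSIONS) };

        // Tear down so the host app's EGL state is left untouched.
        EGL14.eglMakeCurrent(display, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE,
                EGL14.EGL_NO_CONTEXT);
        EGL14.eglDestroySurface(display, surface);
        EGL14.eglDestroyContext(display, context);
        EGL14.eglTerminate(display);
        return info;
    }
}

Running this on a background thread keeps it from displacing any EGL context the host app already has current on the calling thread.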

Related

Can't get Samsung Galaxy Note 10.1 (N8010) framebuffer, fb0 only shows top bar [duplicate]

Connect to RS232 from Android - No permission dialog or shell command

Objective
I'm developing a custom app for internal use on a rooted android mini-pc.
The goal (among others... so...many...others...) is to be able to turn a TV on and off using the serial port embedded in the TV.
I'm using an FTDI UART RS232 serial USB cable for it.
Status
The application is working right now: using an Android library (serial-driver) I can communicate with the TV. The problem is that the device asks for the USB permission on every install (and sometimes, weirdly, again on the same device), so this needs to be improved.
Issue
Since the device has no mouse or keyboard by default, someone has to click through the dialog when this happens, and since the device is normally hidden behind the screen, that can be really annoying.
My two bits
I feel this problem could be solved in one of two ways, but I still haven't been able to make either of them work.
Since the device is rooted, I might be able to modify some (to me unknown) setting that lets me bypass the permission request. For this I have tried creating an intent filter for the USB device and rewriting the interface that controls this behaviour, both without success. Is there a way to make this Android version more lenient about permissions?
I already use SuperSU inside the app for other reasons, so I can use the full power of a root shell. Using this, I have been trying to send commands manually to the device (/dev/bus/usb/00X/00Y), but it hasn't worked. My theory is that it's because of the permissions on the device path, but even after an unhealthy chmod 777 I cannot get it working (see the sketch below).
So, that's my problem right now. I hope someone here can help me.
Additional data
Running: Custom Android 4.4.2 (Cannot be changed)
Needs to be doable solely from within the APK (but it can use shell commands)
We don't have the manufacturer signature to install it as a system app
We can use only one app, so I cannot have another one move this app to /sys/apps, and I don't know whether an app can do that to itself
Using BusyBox, stty -F /dev/.../ returns "Operation not permitted"
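A minimal sketch of the second approach mentioned above (driving the SuperSU-provided su binary from Java to run commands as root before the serial library opens the device). The helper name and the chmod example are purely illustrative, and it assumes su is installed and grants the app root:

import java.io.DataOutputStream;
import java.io.IOException;

public final class RootShell {
    // Runs the given shell commands in a single root shell and returns the
    // shell's exit status (0 on success).
    public static int run(String... commands) throws IOException, InterruptedException {
        Process su = Runtime.getRuntime().exec("su");
        DataOutputStream stdin = new DataOutputStream(su.getOutputStream());
        for (String command : commands) {
            stdin.writeBytes(command + "\n");
        }
        stdin.writeBytes("exit\n");
        stdin.flush();
        return su.waitFor();
    }
}

// Illustrative call, keeping the placeholder bus/device numbers from the question:
// RootShell.run("chmod 666 /dev/bus/usb/00X/00Y");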

Android programmatically automated touch events

I was wondering if there is any way to program automated touch events. For example, I have my Android app and I want to make a program that automatically runs 100 tests, where every test issues touch events depending on what appears in the app. I would like to do this on emulators, if possible all 100 tests at the same time.
For exercising your app with many events (more than 100), use monkey (full name: UI/Application Exerciser Monkey) and/or monkeyrunner.
The Monkey is a command-line tool that you can run on any emulator instance or on a device. It sends a pseudo-random stream of user events into the system, which acts as a stress test on the application software you are developing.
The Monkey includes a number of options, but they break down into four primary categories:
Basic configuration options, such as setting the number of events to attempt.
Operational constraints, such as restricting the test to a single package.
Event types and frequencies.
Debugging options.
Site: http://developer.android.com/intl/es/tools/help/monkey.html
Basic usage:
$ adb shell monkey [options] <event-count>
Example:
adb shell monkey -p your.package.name -v 500
If you want to take control over the events sent to the device, and you're familiar with Python and writing test scripts, then you can use monkeyrunner.
The monkeyrunner tool provides an API for writing programs that control an Android device or emulator from outside of Android code.
With monkeyrunner, you can write a Python program that installs an Android application or test package, runs it, sends keystrokes to it, takes screenshots of its user interface, and stores screenshots on the workstation.
The monkeyrunner tool is primarily designed to test applications and devices at the functional/framework level and for running unit test suites, but you are free to use it for other purposes.
Documentation: http://developer.android.com/intl/es/tools/help/monkeyrunner_concepts.html
NOTE: The monkeyrunner tool is not related to the UI/Application Exerciser Monkey, also known as the monkey tool. The monkey tool runs in an adb shell directly on the device or emulator and generates pseudo-random streams of user and system events. In comparison, the monkeyrunner tool controls devices and emulators from a workstation by sending specific commands and events from an API.
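If the touch events have to be generated from Java inside the app's own instrumentation tests instead of from a workstation script, a different option from the monkey/monkeyrunner tools above is android.app.Instrumentation, which can inject MotionEvents directly. A minimal sketch with a hypothetical helper class; it only works for windows belonging to the instrumented app unless the INJECT_EVENTS permission is held:

import android.app.Instrumentation;
import android.os.SystemClock;
import android.view.MotionEvent;

public final class TouchInjector {
    // Injects a single tap at screen coordinates (x, y) through the
    // instrumentation framework.
    public static void tap(Instrumentation instrumentation, float x, float y) {
        long downTime = SystemClock.uptimeMillis();
        MotionEvent down = MotionEvent.obtain(
                downTime, downTime, MotionEvent.ACTION_DOWN, x, y, 0);
        MotionEvent up = MotionEvent.obtain(
                downTime, SystemClock.uptimeMillis(), MotionEvent.ACTION_UP, x, y, 0);
        instrumentation.sendPointerSync(down);
        instrumentation.sendPointerSync(up);
        down.recycle();
        up.recycle();
    }
}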

Android, Native OpenGL/OpenMAX, Screen capture

Use-case
Mirror Android Screen to PC using USB
Potential (Native) Implementation Approaches
Using the Android Open Source Project, modify screenrecord for your needs and reinstall it on your Android device using ADB
Use a well-known native API such as OpenGL/OpenMAX to capture the screen
Discussion
Approach #1 will certainly work (under the shell account); however, each time the Android OS is updated, the custom code will need to be updated to keep up with OS changes. With approach #2 the API stays fixed and there is no need to worry about OS changes. The question is whether it is possible to implement mirroring solely using OpenGL/OpenMAX.
Questions
Having said the above, what would be the best approach to mirroring the Android screen via USB?
The screenrecord that ships with Android 5.0 "Lollipop" can send raw H.264 over ADB. The command line looks like:
adb shell screenrecord --output-format=h264 - | <player>
A few details are on the bigflake page. I've used it to mirror the screen onto a Linux workstation, but unfortunately I didn't save the VLC/mplayer command lines. Some player suggestions are here.
You can try to do uncompressed frames (--output-format=raw-frames), but at decent frame rates that easily overwhelms the ADB connection, even if the screen is tiny.
Source code is here.
As suggested by fadden, I ended up patching screenrecord, disabling the time limitation and adding some code of my own (enabling ADB-over-USB routing). It works, BUT it requires maintenance each time the OS is updated. I wish there were a way to use the Android Java framework as an ADB shell tool, as this would considerably reduce the number of undocumented buttons I am pressing...
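A related option since Android 5.0 is the MediaProjection API in the Java framework: after a user-consent prompt, an ordinary app can mirror the display into a Surface (for example an ImageReader or a MediaCodec input surface) and push the frames over a forwarded ADB/USB socket. A rough sketch; the activity name, resolution, and density below are placeholder values:

import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.ImageReader;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.os.Bundle;

public class MirrorActivity extends Activity {
    private static final int REQUEST_CAPTURE = 1;
    private MediaProjectionManager projectionManager;
    private VirtualDisplay virtualDisplay;
    private ImageReader reader;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        projectionManager =
                (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        // Shows the system consent dialog; capture only starts if the user accepts.
        startActivityForResult(projectionManager.createScreenCaptureIntent(), REQUEST_CAPTURE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode != REQUEST_CAPTURE || resultCode != RESULT_OK) return;
        MediaProjection projection = projectionManager.getMediaProjection(resultCode, data);
        reader = ImageReader.newInstance(1280, 720, PixelFormat.RGBA_8888, 2);
        virtualDisplay = projection.createVirtualDisplay(
                "mirror", 1280, 720, 320,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                reader.getSurface(), null, null);
        // reader.acquireLatestImage() now yields RGBA frames of the mirrored
        // screen, ready to be encoded and sent out.
    }
}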

Android read fb0 always give me blackscreen

My device is a Nexus 4 running Jelly Bean 4.2. I'm trying to record the screen and send it out. Most code examples on the internet capture the screen by reading /dev/graphics/fb0. That works on some devices and on older systems, but when I try it on my device it fails: it only gives me a black screen and all zeros in the raw data. I have run "adb root" to get root permission and tried "chmod 777 fb0" and "cat fb0 > /sdcard/fb0". I have also tried code using "mmap" and "memcpy" to read the data, but all of it fails. I have searched the internet and there seems to be no solution; some threads say the kernel may forbid reading fb0. Does anyone have an idea about this?
As the hardware advances, you're less and less likely to find an actual framebuffer with the screen contents in it.
As noted in the comments, adb uses the "screencap" command, which contacts surfaceflinger through a Binder call, falling back on /dev/graphics/fb0 if that fails. The latter path is rarely used now.
If you really want to dig into how this works, you need to examine how surfaceflinger does composition, especially the hwcomposer HAL. The hardware composer takes multiple gralloc surfaces and composites them during scan-out. How this works differs from device to device; the hwcomposer implementation is usually done by the graphics chip manufacturer.
Screenshot generation, used by the app framework to generate the thumbnails for the "recent apps" feature, applies the same composition steps the HWC does, but exclusively with GLES; as of Android 4.2, the hardware composer isn't able to composite into an off-screen memory buffer.
Sometimes the hardware composer "punts", e.g. because there are more layers to composite than the hardware can handle. In that case, surfaceflinger reverts to GLES composition, and there will be a buffer somewhere that has the complete image; whether or not you'll find it by opening /dev/graphics/fb0 is a different matter.
Some starting points:
surfaceflinger
hwcomposer HAL interface
adb shell dumpsys SurfaceFlinger
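Since screencap goes through SurfaceFlinger as described above, one pragmatic workaround on a rooted device is to invoke it from the app instead of reading fb0. A minimal sketch; the output path is just an illustration, and it assumes the su binary grants the app root:

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import java.io.IOException;

public final class RootScreenshot {
    // Asks the screencap binary (which composites via SurfaceFlinger) to write
    // a PNG, then decodes it into a Bitmap.
    public static Bitmap capture() throws IOException, InterruptedException {
        String path = "/sdcard/frame.png";  // illustrative output location
        Process su = Runtime.getRuntime().exec(
                new String[] { "su", "-c", "screencap -p " + path });
        if (su.waitFor() != 0) {
            throw new IOException("screencap failed");
        }
        return BitmapFactory.decodeFile(path);
    }
}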
