Android KitKat screen processing

As far as I know, Google added the ability to capture the screen via adb at 30 fps in Android 4.4. Is there any way to transfer the screen from an Android 4.4 device at that speed? I used the framebuffer, but it's too slow.
Thanks.

There is no such way. Neither USB nor WLAN is fast enough to transmit uncompressed high-resolution video at 30 fps. You could try to do real-time compression on the device and transmit compressed video, but I suspect that would starve the app you're recording of CPU and thereby make the recording unrealistic.
Sorry.

Related

Device benchmarking: How many videos can a device play at the same time?

My device can display 3 videos simultaneously without a problem. But I suppose not all the devices my app will be used on (API 21+) have the CPU cores and RAM to do that.
What would be a good way to determine at runtime how many videos the device can handle?
The best I can come up with at the moment is to always allow 3 videos and adjust the video resolution I request from the server based on the screen width: high-resolution devices tend to have better hardware.
"High-resolution devices tend to have better hardware" - correct, but not always the case. I would not depend on this alone.
"But I suppose not all the devices my app will be used on (API 21+) have the CPU cores and RAM to do that." - You could retrieve the hardware specs of the device and decide based on those, along with your resolution check, if you're comfortable that this is enough to determine whether the device can run 3+ videos.
Alternatively, benchmark how playing one video affects the hardware (e.g. it uses 0.7 GB of RAM, CPU usage sits at 47%, etc.), and from that you can make a rough estimate of how many the device can handle.
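For the hardware-spec approach, a minimal sketch (the class name and the thresholds are made up for illustration, not tested cut-offs) that looks at core count and total RAM before deciding how many players to allow:

    import android.app.ActivityManager;
    import android.content.Context;

    public final class VideoCapacityEstimator {

        // Rough guess at how many simultaneous players to allow; thresholds are arbitrary examples.
        public static int maxSimultaneousVideos(Context context) {
            ActivityManager am = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
            ActivityManager.MemoryInfo memInfo = new ActivityManager.MemoryInfo();
            am.getMemoryInfo(memInfo);

            int cores = Runtime.getRuntime().availableProcessors();
            long totalMemMb = memInfo.totalMem / (1024 * 1024); // totalMem is available from API 16

            if (cores >= 8 && totalMemMb >= 3000) {
                return 3;
            } else if (cores >= 4 && totalMemMb >= 2000) {
                return 2;
            } else {
                return 1;
            }
        }
    }

Combine this with the resolution check, or with a one-time measurement of a single playing video, to refine the estimate per device.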

Need help saving HD video stream to file

I'm using libstreaming.
I would like to initiate two MediaCodecs with different settings and bitrates (one low quality, which will be transmitted via HTTP, and one high quality, to be saved to the SD card).
The problem appears to be that I can't grab two separate MediaCodec objects with differing settings.
The high-bitrate version is saved as a video containing nothing but a green background unless the dimensions are set to less than 352x288; the low-bitrate version, however, is successfully (and correctly) being streamed to the web.
I am really hoping that I'm doing something obviously wrong and that there's a simple way to save the HD version of the stream to disk.
In general, this should work on most devices - I do it without a problem on a number of devices.
But there are some devices where the encoder driver has restrictions here - in particular, some Intel devices refuse to create a second encoder instance while one is active. (The Samsung Galaxy Note 3 10.1 comes to mind - I'm not sure if all other Intel-based ones have the same issue, or only some of them.)
Unfortunately, even though the Android CTS has tests to ensure that the hardware encoder works, there's no test that guarantees you can have more than one encoder active at the same time.
Does your case fail only if you have differing settings on the second encoder, or also if they have the same settings?
If one stream is of a low resolution, you could try using a SW encoder for that instead, while using the HW encoder for the high resolution version. On Android 6.0, the SW encoder OMX.google.h264.encoder should be quite decent, while on older versions, it's close to unusable.
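A rough sketch of that split (the resolutions, bitrates and class name are illustrative, and the software encoder name is only meaningful on devices that ship OMX.google.h264.encoder):

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import java.io.IOException;

    public final class DualEncoderSketch {

        // High-resolution stream on the default (usually hardware) AVC encoder,
        // fed via createInputSurface() (API 18+).
        static MediaCodec createHighQualityEncoder() throws IOException {
            MediaFormat hd = MediaFormat.createVideoFormat("video/avc", 1280, 720);
            hd.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);
            hd.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            hd.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            hd.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            MediaCodec codec = MediaCodec.createEncoderByType("video/avc");
            codec.configure(hd, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            return codec;
        }

        // Low-resolution stream on the platform software encoder, so no second
        // hardware instance is needed; raw YUV buffers are queued manually.
        static MediaCodec createLowQualityEncoder() throws IOException {
            MediaFormat sd = MediaFormat.createVideoFormat("video/avc", 320, 240);
            sd.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
            sd.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            sd.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            sd.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
            MediaCodec codec = MediaCodec.createByCodecName("OMX.google.h264.encoder");
            codec.configure(sd, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            return codec;
        }
    }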

Android is scaling CPU too low on screen OFF even with locks

I have an application which records audio via the microphone and directly encodes the raw PCM data to MP3 via LAME (configured for maximum performance), before sending that stream via HTTP.
On my Galaxy S5 it works flawlessly while the screen is ON, but seconds after turning the screen OFF, the process starts struggling due to lack of CPU.
I'm using all the known options to prevent the device from sleeping, and in theory they work - the CPU does not sleep, it is just scaled too low:
service is running in foreground-state
I have a WIFI_MODE_FULL_HIGH_PERF-lock
and a PARTIAL_WAKE_LOCK to prevent the CPU from sleeping
the priority of all affected threads is set via android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
The application generally works very well and is meanwhile used on hundreds of thousands of devices.
But if the user is using the MP3 encoder with the screen off, there is a chance that the CPU no longer delivers enough power to encode and stream the data smoothly.
The CPU-governor of my S5 is "interactive" - if I set it to "performance", the problem is gone.
Does anybody have an idea how to keep Android from scaling the CPU down like this, without using root to change the governor whenever the app is in use?
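For reference, a minimal sketch of the lock setup described above (the lock tags and class name are placeholders; this reproduces the question's configuration rather than solving the governor issue):

    import android.content.Context;
    import android.net.wifi.WifiManager;
    import android.os.PowerManager;
    import android.os.Process;

    public final class RecordingLocks {

        private PowerManager.WakeLock wakeLock;
        private WifiManager.WifiLock wifiLock;

        // Acquire the CPU and Wi-Fi locks before starting the encoder thread.
        public void acquire(Context context) {
            PowerManager pm = (PowerManager) context.getSystemService(Context.POWER_SERVICE);
            wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "app:recording"); // placeholder tag
            wakeLock.acquire();

            WifiManager wm = (WifiManager) context.getApplicationContext()
                    .getSystemService(Context.WIFI_SERVICE);
            wifiLock = wm.createWifiLock(WifiManager.WIFI_MODE_FULL_HIGH_PERF, "app:streaming"); // placeholder tag
            wifiLock.acquire();
        }

        // Call from the encoder/streaming thread itself.
        public static void raiseThreadPriority() {
            Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO);
        }

        public void release() {
            if (wifiLock != null && wifiLock.isHeld()) wifiLock.release();
            if (wakeLock != null && wakeLock.isHeld()) wakeLock.release();
        }
    }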

Capturing a stream of vector data and visualizing it on the Nexus tablet

Desperately need help!
The problem is as follows: there is a bunch of medical diagnostic devices packed in a box. They are all fed from the same battery, and their data is supposed to be visualized on a Nexus tablet, also enclosed in the box. Only one device at a time is connected to a tablet. Connection is via USB port, processing off-line only. Data is streaming in real-time (some devices may have recording capability, some don't) and needs to be visualized in real-time also.
The devices are "dumb" and there are no SDKs. Seemingly, the devices were never intended to be connected to any external visualizer or other device. All we have to work with is the raw stream of data - the output of a device is not even a file but a stream of 256 vectors. This stream needs to be captured, written to a predefined buffer or series of buffers (how do we determine the size of such a buffer so that it is generic enough to cover every device in the box?), and then translated into some format that the Android tablet can visualize.
Is my understanding of the required architecture correct? What language should this software be written in? Can it be done in something truly cross-platform like Python? Is there any open-source functionality for capturing such a stream (if so, please recommend it)? Is it possible to make the software generic enough that changing a device/tablet/OS could be accommodated without excruciating pain?
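If the devices enumerate as standard USB bulk endpoints (an assumption; each vendor's transport may differ), the Android-side capture would typically be written in Java against the platform USB host API rather than Python. A minimal sketch (class name and interface/endpoint choices are illustrative) that pulls raw bytes into a growable buffer for later parsing:

    import android.hardware.usb.UsbConstants;
    import android.hardware.usb.UsbDevice;
    import android.hardware.usb.UsbDeviceConnection;
    import android.hardware.usb.UsbEndpoint;
    import android.hardware.usb.UsbInterface;
    import android.hardware.usb.UsbManager;
    import java.io.ByteArrayOutputStream;

    public final class RawUsbCapture {

        // Read raw bytes from the first bulk IN endpoint for a fixed period;
        // the caller decides how to parse the vectors.
        public static byte[] capture(UsbManager manager, UsbDevice device, int milliseconds) {
            UsbInterface intf = device.getInterface(0); // assumption: data on interface 0
            UsbEndpoint bulkIn = null;
            for (int i = 0; i < intf.getEndpointCount(); i++) {
                UsbEndpoint ep = intf.getEndpoint(i);
                if (ep.getType() == UsbConstants.USB_ENDPOINT_XFER_BULK
                        && ep.getDirection() == UsbConstants.USB_DIR_IN) {
                    bulkIn = ep;
                    break;
                }
            }
            if (bulkIn == null) return new byte[0];

            UsbDeviceConnection connection = manager.openDevice(device); // USB permission must be granted first
            if (connection == null) return new byte[0];
            connection.claimInterface(intf, true);

            ByteArrayOutputStream captured = new ByteArrayOutputStream();
            byte[] chunk = new byte[bulkIn.getMaxPacketSize()];
            long deadline = System.currentTimeMillis() + milliseconds;
            while (System.currentTimeMillis() < deadline) {
                int read = connection.bulkTransfer(bulkIn, chunk, chunk.length, 100);
                if (read > 0) captured.write(chunk, 0, read);
            }

            connection.releaseInterface(intf);
            connection.close();
            return captured.toByteArray();
        }
    }

Letting the buffer grow as data arrives sidesteps having to know each device's output size in advance; sizing a fixed buffer instead would require knowing the worst-case data rate of the fastest device in the box.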

How to view OpenGL renders generated from a PC's C++ code on an Android device connected via WiFi?

I'm working on an Augmented Reality (AR) demo in which high-quality OpenGL renders (in C++) will be generated on a PC and then streamed to an Android display device (running Android 2.2 or later). What is the easiest way to achieve this in real time (30 FPS on the Android device)?
I've looked into existing Android applications and have not found anything suitable so far. The best available were remote desktop applications (such as TeamViewer), but the frame rates were far too low and unreliable.
Possible solution A:
1) Encode openGL window as H.264 Video (natively supported by Android)
2) Stream the H.264 Video via RTSP using a server
3) View the content from an Android Browser (android and pc connected via WiFi)
Possible solution B:
1) Encode openGL window as IP Camera in c++ (is this possible?)
2) Use an IPCamViewer on Android device to view (again connected via WiFi)
I'm not entirely sure if either or both of these approaches are viable and would like some reassurance before moving forward.
What is the resolution of the image (is it equal to the current screen resolution, larger, or smaller)? It is possible and efficient to transport an H.264 stream, but it also depends on the machine used to do the encoding. Hardware encoders or GPU-accelerated encoders are your best bet.
Remember - if you choose to go with encoding, you will have latency due to buffering (on the encode and the decode side). It will be a constant time offset so if that's not a problem you should be ok.
The total system delay, as proposed by this paper, is composed of the individual capture, encoding, transmission, decoding and display delays.
Note that none of these delays can be fully measured directly in frame time. Some of them depend on the frame data and/or the encoding/processing performed. But as a rough estimate with fast GPU encoding and hardware decoding, I'd say a lag of around 5-20 frames. You'll have to measure the final latency per scenario. I did this by sending frames containing text (ticks) and, once the frames were steady, comparing them side by side. In fact, I even allowed the user to enter a "test mode" at any time to compensate for network traffic peaks, or to change the "quality" settings to tweak this delay.
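On the Android side of solution A, a minimal sketch of playing such an H.264/RTSP stream with the platform MediaPlayer (the URL and class name are placeholders for whatever the PC-side server actually exposes):

    import android.media.MediaPlayer;
    import android.view.SurfaceHolder;
    import java.io.IOException;

    public final class RemoteRenderViewer {

        // Play an RTSP stream from the PC on the given surface; hardware decoding is used where available.
        public static MediaPlayer play(SurfaceHolder holder) throws IOException {
            MediaPlayer player = new MediaPlayer();
            player.setDataSource("rtsp://192.168.0.10:8554/render"); // hypothetical PC-side stream URL
            player.setDisplay(holder);
            player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer mp) {
                    mp.start();
                }
            });
            player.prepareAsync();
            return player;
        }
    }

Whatever end-to-end delay this adds is exactly the buffering latency discussed above, so it is worth measuring with the tick-frame method before committing to this route.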
