Hardware encoding video on Android phone

I'm looking for information about encoding video on an Android phone using hardware acceleration. I know some phones support hardware encoding for the camera (does anyone have a list?), and I was hoping I could access that chip to encode a live feed supplied through, say, Wi-Fi or USB.
I'm also interested in the latency any such chip would introduce.
EDIT: Apparently Android uses PacketVideo; however, there is not much documentation to be found on encoding.
EDIT: The Android documentation shows a video encoder: http://developer.android.com/reference/android/media/MediaRecorder.VideoEncoder.html. However, it says nothing about hardware acceleration.

MediaCodec should fit your needs. The documentation says nothing about hardware acceleration, but judging from logs and benchmarks, it relies on the OpenMAX library, which is how Android integrates vendor hardware codecs.
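For illustration, here is a minimal sketch (API 18+; the resolution and bitrate are placeholder values) that configures an H.264 encoder through MediaCodec and logs which implementation the platform picked. Vendor codec names such as OMX.qcom.* generally indicate hardware, while OMX.google.* codecs are software fallbacks.

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.util.Log;
    import android.view.Surface;

    public class HwEncoderProbe {
        // Sketch only: set up an H.264 encoder fed from a Surface.
        // 1280x720 / 4 Mbit/s / 30 fps are placeholder values.
        public static MediaCodec createEncoder() throws Exception {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

            MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            Surface input = encoder.createInputSurface(); // feed camera/GL frames here
            Log.i("HwEncoderProbe", "Selected codec: " + encoder.getName());
            encoder.start();
            return encoder;
        }
    }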

Related

Can I use an Android phone with IR to control a light strip?

I have an LED strip with a classic IR remote controller, and I asked myself whether I could control it with my Samsung Galaxy S21 Ultra, which can emit infrared light for its ToF camera. There are two issues:
1. Does Samsung provide API access to control this IR emitter? (The remote-control apps I downloaded all say that the phone doesn't have the necessary IR hardware.)
2. Is it physically possible? Would the IR beam even be strong enough for the LED strip's controller to receive it?
This is unlikely. The ToF IR pulse has to be very specific in length and shape, and is likely controlled at a very low level (possibly by the sensor hardware itself), so there is probably no way to customize the pulse shape or duration.
It might be possible to turn the IR emitter on and off via some API (perhaps the standard camera2 API or a Samsung-specific API), but IR remote-control protocols require quite fast, precisely timed switching, which would be difficult to achieve that way.
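For context: the remote-control apps mentioned in the question are almost certainly probing Android's standard IR-blaster API, ConsumerIrManager, which drives a dedicated IR transmitter rather than the ToF emitter, so on a phone like the S21 Ultra it reports no hardware. A minimal sketch of that API on phones that do have a blaster (the carrier frequency and pattern are illustrative placeholders, not a real remote code):

    import android.content.Context;
    import android.hardware.ConsumerIrManager;

    public class IrBlasterSketch {
        // Requires android.permission.TRANSMIT_IR in the manifest (API 19+).
        public static void sendFrame(Context context) {
            ConsumerIrManager ir = (ConsumerIrManager)
                    context.getSystemService(Context.CONSUMER_IR_SERVICE);
            if (ir == null || !ir.hasIrEmitter()) {
                // The path taken on phones with a ToF emitter but no IR blaster.
                return;
            }
            // Alternating on/off durations in microseconds; the values here
            // are placeholders, not an actual remote-control code.
            int[] pattern = {9000, 4500, 560, 560, 560, 1690, 560, 560};
            ir.transmit(38_000, pattern); // 38 kHz carrier, typical for consumer IR
        }
    }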
There is fairly detailed information about IR remote-control protocols here: Data Formats for IR Remote Control from Vishay Semiconductors.
I have seen projects that connect an IR LED to the headphone socket of older phones and emit remote-control commands by outputting the signal as audio; pretty neat.

How to turn display to HDR mode

We are working on live SDR-to-HDR conversion on Android phones. Can anyone tell me how to trigger the HDR display mode on an Android phone when decoding an SDR video signal?
To clarify: HDR here means the PQ transfer function; neither 10-bit depth nor BT.2020 by itself has anything to do with HDR.
https://developer.android.com/training/wide-color-gamut
This talks only about wide color gamut (WCG), not HDR.
To trigger HDR, this NDK struct for SMPTE ST 2086 static metadata should be used: https://developer.android.com/ndk/reference/struct/a-hdr-metadata-smpte2086
For Java, this can be used to check whether HDR is supported:
https://developer.android.com/reference/android/view/Display.HdrCapabilities?hl=en
See this question for further links: What is the difference between Display.HdrCapabilities and configuration.isScreenHdr
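On the Java side, here is a minimal sketch (API 24+) of the capability check via Display.HdrCapabilities. Note that this only reports what the display supports; actually engaging HDR output still requires HDR content with matching metadata, such as the SMPTE 2086 struct above.

    import android.view.Display;

    public class HdrCheck {
        // Sketch: report whether the display advertises HDR10 support.
        public static boolean supportsHdr10(Display display) {
            Display.HdrCapabilities caps = display.getHdrCapabilities();
            if (caps == null) return false;
            for (int type : caps.getSupportedHdrTypes()) {
                // Other constants: HDR_TYPE_HLG, HDR_TYPE_DOLBY_VISION.
                if (type == Display.HdrCapabilities.HDR_TYPE_HDR10) return true;
            }
            return false;
        }
    }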

How to use hardware accelerated video encoding of GStreamer on Android?

I am trying to build a hardware-accelerated video encoder based on Android. I have been researching this for some time but have not found much that is useful.
I came across GStreamer (http://gstreamer.freedesktop.org/), which is said to provide hardware video encoding. However, after reading the manual, I found nothing about encoders.
Does anyone know about this? Thank you!
It's going to depend on your hardware. What device are you running on?
If your processor contains an IP core that implements video encoding/decoding, the manufacturer needs to either offer a driver so you can call this hardware, or ideally go a step further and offer a specific plugin for GStreamer that does it.
For example, the Freescale i.MX6 processor (used in the Wandboard and CuBox) has a driver maintained by Freescale: https://github.com/Freescale/gstreamer-imx
TI OMAP processors have support: http://processors.wiki.ti.com/index.php/GStreamer, also see TI Distributed Codec Engine.
Broadcom processors have support: https://packages.debian.org/wheezy/gstreamer0.10-crystalhd
There are also several standard interfaces to video accelerator hardware, including VDPAU, VAAPI, and OpenMax IL. If your processor is not one of the above, someone may have written a driver that maps one of these standard interfaces to your hardware.
The Raspberry Pi is apparently supported by the OpenMax IL plugin: http://gstreamer.freedesktop.org/releases/gst-omx/1.0.0.html
If you don't know whether your processor is supported, I'd search for the name and various combinations of "VDPAU", "VAAPI", etc.
There are a wide variety of encoding options in Gstreamer to take a raw stream and encode it. Pretty much any element ending in "enc" can be used to do the encoding. Here is a good example of a few encoding pipelines:
https://developer.ridgerun.com/wiki/index.php/TVP5146_GStreamer_example_pipelines
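On Android specifically, GStreamer's androidmedia plugin wraps the platform's MediaCodec codecs, so hardware encoders show up as amc* elements. Below is a minimal sketch following the structure of the official GStreamer Android tutorials, where Java loads the GStreamer libraries and hands a textual pipeline description to native code. nativeRunPipeline is a hypothetical JNI wrapper (typically a thin call to gst_parse_launch()), and the amcvidenc element name is illustrative, since these names are generated per device; check gst-inspect-1.0 on your target.

    public class GstEncodeActivity extends android.app.Activity {
        static {
            System.loadLibrary("gstreamer_android"); // GStreamer runtime for Android
            System.loadLibrary("encoder_demo");      // hypothetical JNI wrapper library
        }

        // Hypothetical native method; in the official tutorials the native
        // side builds the pipeline, e.g. with gst_parse_launch().
        private native void nativeRunPipeline(String description);

        void startEncoding() {
            // ahcsrc = Android hardware camera source; the amcvidenc-* element
            // name below is device-specific and only illustrative.
            nativeRunPipeline("ahcsrc ! videoconvert ! "
                    + "amcvidenc-omxqcomvideoencoderavc ! h264parse ! "
                    + "mp4mux ! filesink location=/sdcard/out.mp4");
        }
    }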
With that said, I'd caution that video encoding is extremely hardware-intensive. If your stream is of any substantial size, I would look at getting a special-purpose hardware encoder rather than doing software encoding via GStreamer.

Controlling camera hardware in Android phone

I want to control the aperture, shutter speed, and ISO on my Android phone. Is there a way I can access these hardware features?
I won't say it's impossible to do this, but it IS effectively impossible to do it in a way that's generalizable to all -- or even many -- Android phones. If you stray from the official path defined by the Android API, you're pretty much on your own, and this is basically an embedded hardware development project.
Let's start with the basics: you need a schematic of the camera subsystem and datasheets for everything in the image pipeline. For every phone you intend to support. In some cases, you might find a few phones with more or less identical camera subsystems (particularly when you're talking about slightly-different carrier-specific models sold in the US), and occasionally you might get lucky enough to have a lot of similarity between the phone you care about and a Nexus phone.
This is no small feat. As far as I know, not even Nexus phones have official schematics released. Popular phones (especially Samsung and HTC) usually get teardowns published, so everyone knows the broad details (camera module, video-encoding chipset, etc.), but there's still a lot of guesswork involved in figuring out how it's all wired together.
Make no mistake -- this isn't casual hacking territory. If terms like I2C, SPI, MMC, and iDCT mean nothing to you, you aren't likely to get very far. If you don't literally understand how CMOS image sensors are read serially, and how Bayer arrays are used to produce RGB images, you're almost certainly in over your head.
That doesn't mean you should throw in the towel and give up... but it DOES mean that trying to hack the camera on a commercial Android phone probably isn't the best place to start. There's a lot of background knowledge you're going to need in order to pull off a project like this, and you really need to acquire that knowledge from a hardware platform that YOU control & have proper documentation for. Make no mistake... on the hierarchy of "hard" Android software projects, this ranks pretty close to the top of the list.
My suggestion (simplified and condensed a bit): buy a Raspberry Pi, and learn how to light up an LED from a GPIO pin. Then learn how to selectively light up 8 LEDs through a 74HC595 shift register. Then buy a SPI-addressed flash chip on a breakout board, and learn how to write to it. At some point, buy a video image sensor with "serial" (fyi, "serial" != "rs232") interface from somebody like Sparkfun.com and learn how to read it one frame at a time, and dump the raw RGB data to flash. Learn how to use I2C to read and write the camera's control registers. At this point, you MIGHT be ready to tackle the camera in an Android phone for single photos.
If you're determined to start with an Android phone, at least stick to "Nexus" devices for now, and don't buy the phone (if you don't already own it) until you have the schematics, datasheets, and sourcecode in your possession. Don't buy the phone thinking you'll be able to trace the schematic yourself. You won't. At least, not unless you're a grad student and have one hell of a graduate-level electronics lab (with X-Ray capabilities) at your disposal. Most of these chips and modules are micro-BGA. You aren't going to trace them with a multimeter, and every Android camera I'm aware of has most of its low-level driver logic hidden in loadable kernel modules whose source isn't available.
That said, I'd dearly love to see somebody pull a project like this off. :-)
Android has published online training that contains the information you need:
You can find it here - Media APIs
However, there are limitations: not all hardware supports all of these parameters.
And if I recall correctly, you can't control the shutter speed or ISO with the legacy Camera API.
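For what it's worth, the newer camera2 API (API 21, introduced after this answer was written) does expose manual exposure time and ISO on devices that declare the MANUAL_SENSOR capability; aperture remains fixed on nearly all phone lenses. A minimal sketch, assuming such a device (capture-session setup omitted):

    import android.hardware.camera2.CameraMetadata;
    import android.hardware.camera2.CaptureRequest;

    public class ManualExposureSketch {
        // Sketch: apply a manual shutter speed and ISO to a request builder.
        // Values must lie within SENSOR_INFO_EXPOSURE_TIME_RANGE and
        // SENSOR_INFO_SENSITIVITY_RANGE from CameraCharacteristics.
        public static void applyManual(CaptureRequest.Builder builder) {
            // Turn off auto-exposure so the manual values take effect.
            builder.set(CaptureRequest.CONTROL_AE_MODE,
                    CameraMetadata.CONTROL_AE_MODE_OFF);
            builder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 10_000_000L); // 10 ms, in ns
            builder.set(CaptureRequest.SENSOR_SENSITIVITY, 400);           // ISO 400
        }
    }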

How does noise cancellation work in android?

I came across this relatively old post describing how impressively the Nexus One's noise cancellation works, and I was wondering where I can find more information about its implementation in the OS software.
In particular:
1. How much of it is done in software and how much of it is done in hardware?
2. Which modules in the Android source code are responsible for noise cancellation?
3. Can I control its behavior via Android's API? (If so, which ones?)
4. Does it also work with the microphone in the headset that comes with the Nexus One (4-pin 3.5 mm jack), or does it work with the built-in microphone only?
I only know the answer for the Nexus One, but:
1. It's done in hardware.
2. Not sure.
3. Nope.
4. Maybe?
For the N1, it works by using a second microphone on the back and comparing the two signals. I don't know exactly how this processing is done (in hardware or software), but I know there isn't an API for it. Also, it probably doesn't work with the external headset, since there's no second sound source to compare the first one to (unless the headset has two mics too, but I don't think it does).
About the Nexus One:
1. All in hardware; only the configuration is done in software.
2. The sound drivers and the sound system, but only the configuration.
3. No API; possibly some proprietary configuration, but I haven't been able to get that to work.
4. No; a longer reply follows.
I haven't found any indication that it uses the other microphone to do noise reduction for the headset. That wouldn't make much sense either, as it would most likely just try to cancel the speech against noise from your pocket.
For most other Android phones, and for the headset on the Nexus One, I'm pretty sure there is only some sort of filter that reduces input of sound that is not speech.
I have done some research on this and tried to get help with it on the android-porting and android-dev lists. There is a little further info here:
http://groups.google.com/group/android-porting/browse_thread/thread/fe1b92065b75c6da?pli=1
With the reservation that I haven't looked at the latest and greatest versions of Android.
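For reference, Android later added a platform NoiseSuppressor audio effect (API 16) that an app can attach to its own recording session; whether it is backed by a hardware DSP is vendor-dependent, and it is separate from the in-call dual-microphone processing discussed above. A minimal sketch:

    import android.media.audiofx.NoiseSuppressor;

    public class NoiseSuppressorSketch {
        // Sketch: attach the platform noise-suppression effect to an
        // AudioRecord session (pass AudioRecord.getAudioSessionId()).
        public static NoiseSuppressor attach(int audioSessionId) {
            if (!NoiseSuppressor.isAvailable()) {
                return null; // no implementation on this device
            }
            NoiseSuppressor ns = NoiseSuppressor.create(audioSessionId);
            if (ns != null) {
                ns.setEnabled(true);
            }
            return ns;
        }
    }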
