How Lollipop can play compressed Audio/Video

From Android Developers - Enhanced camera & video:
"Android 5.0 also adds support for multimedia tunneling to provide the best experience for ultra-high definition (4K) content and the ability to play compressed audio and video data together."
Are there any details or specifics of how Lollipop plays compressed audio/video? Also, what has changed in this regard compared with earlier versions?

Some understanding of the internals is captured in this post on Google Groups: https://groups.google.com/forum/#!topic/android-platform/isNabAHLLks
To summarize, video tunneling is a vendor/device-specific implementation.
EDIT: The information above concerns the new video tunneling feature introduced in Lollipop. Normal playback is handled through NuPlayer, unless a system property is set to specifically select StagefrightPlayer, as shown here.

You can get a clear picture of the concept from
https://medium.com/google-exoplayer/tunneled-video-playback-in-exoplayer-84f084a8094d
If you want more detail, I suggest downloading the AOSP source and tracing the tunnel-mode APIs.
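For a quick on-device check, here's a minimal sketch (standard android.media APIs, API 21+) that lists which decoders advertise tunneled playback support:

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;
    import android.util.Log;

    // Minimal sketch (API 21+): log decoders that advertise tunneled playback.
    public static void logTunnelingDecoders() {
        MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : codecList.getCodecInfos()) {
            if (info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                if (caps.isFeatureSupported(
                        MediaCodecInfo.CodecCapabilities.FEATURE_TunneledPlayback)) {
                    Log.i("Tunneling", info.getName() + " supports tunneled " + type);
                }
            }
        }
    }

A player opts in by enabling FEATURE_TunneledPlayback and setting an audio session id on the MediaFormat before configuring the codec, which is essentially what ExoPlayer's tunneling mode does for you.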

Related

Capturing unprocessed / raw microphone data on Android devices

This is a question in regards with capturing audio content on recent Android devices. I'm working at developing an Android app that must record unprocessed microphone audio. In the process of trying to figure out how this could be achieved, I came across the Android Compatibility Definition Documents (CDDs) which, since version 5.0 (API 21), state that “device implementations that declare android.hardware.microphone MUST allow capture of raw audio content [...]” (see any CDD from 5.0 and higher, under the Audio Recording section: https://source.android.com/compatibility/cdd).
On the other hand, I have found that there are several audio sources that can be selected under the MediaRecorder APIs and that in order “To record raw audio select UNPROCESSED. Some devices do not support unprocessed input. Call AudioManager.getProperty(AudioManager.PROPERTY_SUPPORT_AUDIO_SOURCE_UNPROCESSED) first to verify it's available” (source: https://developer.android.com/guide/topics/media/mediarecorder). The PROPERTY_SUPPORT_AUDIO_SOURCE_UNPROCESSED property was added in API 24.
Here's where I would need help: I am confused by the fact that the CDDs indicate that capturing raw data MUST be allowed, while MediaRecorder does not always allow access to it. Perhaps "raw" is not the same as "unprocessed"? Or perhaps raw/unprocessed audio data is for system-level functions only, and not available to third-party developers.
Please note that when I tested whether the unprocessed property was available on three different Android devices (including a Galaxy S8 on API 26), I never got a positive result.
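For reference, the check I ran was essentially the documented one (called from an Activity):

    import android.content.Context;
    import android.media.AudioManager;

    // Documented to return "true" when the UNPROCESSED source is supported;
    // on the devices I tested this never returned "true".
    AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
    String s = am.getProperty(AudioManager.PROPERTY_SUPPORT_AUDIO_SOURCE_UNPROCESSED);
    boolean unprocessedAvailable = "true".equals(s);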
This being said, can someone clarify why the CDDs indicate that capture of raw audio content MUST be available, when the unprocessed AudioManager property rarely seems to be? Should I be accessing the raw/unprocessed audio content differently?
Thanks in advance!
The Android CDDs are for developers intending to port the Android platform to other devices/platforms and maintain cross compatibility for app developers. This means that the standard, lower level Android libraries will be able to access the functionality defined in the CDD, but you won't necessarily be able to.
I managed to find an example here from androidcookbook.info that goes through getting raw, uncompressed audio via AudioRecord.
If that doesn't suit your needs, then I would 100% recommend checking out OpenSL ES if you haven't looked at it yet. It uses the NDK (in C/C++) to provide a native interface, so latency is much lower and you have access to more audio-processing functionality. Here's Google's audio-echo project on GitHub, which shows real-time recording and playback via the native route with OpenSL ES.
I am confident you get access to the uncompressed audio this way, and you can pass the data between Java and C++ as you wish.
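To make the Java-level route concrete, here's a minimal sketch of raw PCM capture with AudioRecord; the UNPROCESSED source (API 24+) is used when available, with VOICE_RECOGNITION as a common fallback. The sample rate and buffer sizes are illustrative, and error handling is omitted:

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.os.Build;

    // Sketch: capture raw 16-bit PCM. Prefer UNPROCESSED (API 24+) where the
    // device reports support; VOICE_RECOGNITION is usually lightly processed.
    int sampleRate = 48000; // illustrative
    int minBuf = AudioRecord.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    int source = Build.VERSION.SDK_INT >= 24
            ? MediaRecorder.AudioSource.UNPROCESSED
            : MediaRecorder.AudioSource.VOICE_RECOGNITION;
    AudioRecord recorder = new AudioRecord(source, sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, minBuf * 2);
    short[] buffer = new short[minBuf];
    recorder.startRecording();
    int read = recorder.read(buffer, 0, buffer.length); // raw PCM samples
    recorder.stop();
    recorder.release();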

Decoding h264 raw stream on Android 2.3.3

I'm trying to decode a raw h264 stream on "older" Android versions.
I've tried the MediaPlayer class, but it does not seem to support the stream format.
I can see the stream on other Cam Viewer apps from the market, so I figure there must be a way to do it, probably using the NDK.
I've read about OpenMAX and Stagefright, but couldn't find a working example about streaming.
Could someone please point me in the right direction?
Also, I'm reading in several places about "frameworks/av/include/media/stagefright/MediaSource.h" and other sources, but they don't seem to be either in the regular SDK or the NDK.
Where is this source located? Is there another SDK?
Thanks in advance.
Update: I'm receiving an RTSP connection.
If you only want a simple experiment to verify certain functionality, you can consider the command-line stagefright utility. Do keep in mind that your streaming input format may not be supported.
If you wish to build a more comprehensive player pipeline, you can consider the handling for rtsp as in here or http as in here. Please note that the NuCachedSource2 implementation is essential for streaming input, as it provides a page cache which acts as a jitter buffer for the streaming data.
Please do note one critical point: the command-line stagefright utility doesn't render to the screen. Hence, if you wish to render, you will have to implement the complete playback pipeline, including rendering.
On a related note, if your input is streaming input, the standard player implementation does have support for streaming inputs as can be observed here. Did you face any issues with the same?
As fadden has already pointed out, your work becomes far simpler with the introduction of MediaCodec in Android 4.x.
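For anyone landing here on a 4.1+ device, a minimal sketch of the MediaCodec route looks like this. The surface, dimensions, timestamps and NAL units are assumptions you supply from your own RTSP depacketizer, and error handling is omitted:

    import java.nio.ByteBuffer;
    import android.media.MediaCodec;
    import android.media.MediaFormat;

    // Sketch (API 16+): decode H.264 (Annex-B NAL units) to a Surface.
    MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
    decoder.configure(format, outputSurface, null, 0); // outputSurface from your SurfaceView
    decoder.start();

    // Feed one NAL unit from the stream into an input buffer.
    int inIndex = decoder.dequeueInputBuffer(10000);
    if (inIndex >= 0) {
        ByteBuffer in = decoder.getInputBuffers()[inIndex];
        in.clear();
        in.put(nalUnit);
        decoder.queueInputBuffer(inIndex, 0, nalUnit.length, presentationTimeUs, 0);
    }

    // Drain decoded frames; passing 'true' renders straight to the Surface.
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outIndex = decoder.dequeueOutputBuffer(bufferInfo, 10000);
    if (outIndex >= 0) {
        decoder.releaseOutputBuffer(outIndex, true);
    }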
You could use third-party libs like android-h264-decoder, which uses JNI to improve performance. Also look at this lib from Intel.
Update: MediaCodec wasn't exposed until API 16 (Android 4.1), so that won't work for a 2.3.3 device.
Stagefright and OpenMAX IL were (and still are) internal components of Android. You can find the code (including MediaSource.h) at https://android.googlesource.com/platform/frameworks/av/+/master. Note that the media framework has only recently moved to the separate frameworks/av tree; before that it was part of frameworks/base, e.g. https://android.googlesource.com/platform/frameworks/base/+/gingerbread/media/

How to know which android devices support video playback?

I've made an app to view Vine videos on Android devices. These are basically .mp4 videos being loaded into a VideoView. According to the following documentation (http://developer.android.com/guide/appendix/media-formats.html), MP4 video playback is supported on Android 3.0+ devices.
I've already added an android:minSdkVersion="11" to the manifest file to filter out older Android versions, but I'm still getting feedback from users running newer versions of Android (e.g. 4.1, 4.2) who complain about a "Video cannot be played" error message.
Since there's no way (that I know of, please correct me if I'm wrong) to test video playback using the emulator, I can't really know what's going on.
Is there any way to check for a device's ability to do video playback or at least get the emulator to play videos, so I can correctly fix this issue?
P.S. For those interested, here's a link to my app on Google Play. As you can see, I'm being crushed by negative reviews: https://play.google.com/store/apps/details?id=com.thirtymatches.vineflow
If you look at the Google compatibility matrix closely, you'll see that support for MP4/H264 encoding started with the Android 3.0 release. Playback of MP4/H.264 has been supported by all Android devices back to Android 1.0, so there's no need for you to limit availability to newer releases of Android (unless you have other API compatibility needs).
Via VideoView, video playback on all these devices is done using the hardware decoder provided by the phone's chipset. So to guarantee compatibility, the video has to be encoded to the lowest common denominator. Google provides "Video Encoding Recommendations" at the bottom of the page you linked; I also wrote an answer describing how we transcoded to a form of MP4/H.264 that plays across all Android devices.
Not knowing what Vine is doing with their video clips, it might be that the videos aren't all encoded with uniform encoder parameters. They might be taking the videos straight off handsets and streaming them without any additional server-side processing to ensure wide compatibility; I don't know. If that's the case, you might find it challenging to develop an Android app that can show the content without a) implementing your own software codec (as apps like RockPlayer, MX Player or VLC do) or b) transcoding the videos on a server (which probably will run afoul of Vine's terms of service).
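If you want to diagnose this on-device, on API 16+ you can enumerate the decoders and log which H.264 profiles and levels the hardware claims to handle; content encoded beyond those limits is a common cause of the "Video cannot be played" dialog. A minimal sketch:

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;
    import android.util.Log;

    // Sketch (API 16+): log the advertised H.264 decoder profiles/levels.
    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (!type.equalsIgnoreCase("video/avc")) continue;
            MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
            for (MediaCodecInfo.CodecProfileLevel pl : caps.profileLevels) {
                Log.i("AvcCaps", info.getName()
                        + " profile=" + pl.profile + " level=" + pl.level);
            }
        }
    }

Pairing this with VideoView.setOnErrorListener will at least surface the MediaPlayer error codes your users are hitting.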

Adaptive (multibitrate) streaming for Android

Is it possible to perform adaptive (multibitrate) streaming onto an Android device? If yes, how to do that?
If you have 4.0 or 3.2, you just access the adaptive stream as you would any other video. Literally.
It's plain HTTP access.
So if you would use //mywebsite/video1.mp4 as a data source, you would instead use the equivalent //mywebsite/video1.m3u8. (I'm not covering how you create the streaming files, only how you access them.)
All the magic happens within the client (e.g. MediaPlayer, VideoView) supported on 4.0 and 3.2. For the record, you may be able to access and run streaming segments (.m3u8 files) on earlier versions of Android because the manufacturers have sometimes played around with the code. But I haven't found any that actually adapt: they usually stick with the first segment they load, or default to the lowest-bitrate segment in the bunch and stay there regardless of bandwidth.
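Concretely, the switch is just the URI; here's a minimal sketch (the URL is a placeholder for your own HLS playlist):

    import android.media.MediaPlayer;
    import android.net.Uri;
    import android.widget.VideoView;

    // Sketch: point the stock player at an HLS playlist instead of an .mp4.
    final VideoView videoView = (VideoView) findViewById(R.id.video_view);
    videoView.setVideoURI(Uri.parse("http://mywebsite/video1.m3u8")); // placeholder
    videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            videoView.start(); // bitrate switching happens inside the platform player
        }
    });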

add a new codec to Android?

I'm very new to Android, and I need to add my own codec to it. Can someone please explain, starting from the basics, the steps I need to take in order to add a new codec to Android?
This is virtually impossible to do in a portable way, as all audio and video codecs are compiled at the platform level (most of the time they require hardware-specific acceleration).
If you are only interested on this working on a specific hardware platform and have an unlocked bootloader (So you can boot a custom build of Android) you can compile the full Android platform from scratch using the AOSP as a base.
Depending on which version of Android you're targeting you're looking at adding code to either Opencore or Stagefright (The subsystems that Android uses for A/V decoding and parsing) here you can add audio decoders, audio encoders, video encoders, video decoders and container parsers.
Here is some discussion of adding to Stagefright:
http://freepine.blogspot.com/2010/01/overview-of-stagefrighter-player.html
http://groups.google.com/group/android-porting/msg/5d88e76845a22bbb
However unless the encoding scheme you wish to support is very simple (What are you wanting to add?) it is likely to be too CPU intensive for most Android devices to run without being able to offload some of the work to another system (like the radio chipset or the GPU).
In the Android framework, MediaCodec is implemented on top of Stagefright, a native-level media playback engine with built-in software codecs for popular media formats. Stagefright also supports integration with custom hardware codecs provided as OpenMAX components. Here is the article which summarizes the steps: CHECK HERE
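Once your component is built into the platform and declared in the codec configuration (media_codecs.xml), you can verify from application code that it registered. A minimal sketch; the "OMX.myvendor." prefix is hypothetical:

    import java.util.Arrays;
    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;
    import android.util.Log;

    // Sketch (API 16+): confirm a newly added codec shows up in the codec list.
    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (info.getName().startsWith("OMX.myvendor.")) { // hypothetical prefix
            Log.i("Codecs", info.getName() + " types="
                    + Arrays.toString(info.getSupportedTypes()));
        }
    }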
