Is it possible to perform adaptive (multibitrate) streaming onto an Android device? If yes, how to do that?
If you have Android 4.0 or 3.2, you just access the adaptive stream as you would any other video. Literally.
It's plain HTTP access.
So where you would use //mywebsite/video1.mp4 as a data source, you would use the equivalent //mywebsite/video1.m3u8 instead. Note that I'm not covering how you create the streaming files here, only how you would access them.
All the magic happens within the client (e.g. MediaPlayer, VideoView) on 4.0 and 3.2. For the record, you may be able to access and run streaming playlists (.m3u8 files) on earlier versions of Android, because manufacturers have sometimes played around with the code. But I haven't found any that actually adapt. They usually stick with the first variant they start on, or default to the lowest-bitrate variant in the bunch and stay there regardless of available bandwidth.
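For concreteness, here's a minimal sketch of what that looks like with a VideoView (the URL and the R.id.video_view layout id are placeholders):

    import android.net.Uri;
    import android.widget.MediaController;
    import android.widget.VideoView;

    // Inside an Activity's onCreate(), after setContentView(...).
    // Exactly the same calls you'd use for an .mp4 -- only the URI differs;
    // on 3.2+/4.0+ the platform player handles the bitrate switching itself.
    final VideoView videoView = (VideoView) findViewById(R.id.video_view);
    videoView.setMediaController(new MediaController(this));
    videoView.setVideoURI(Uri.parse("http://mywebsite/video1.m3u8"));
    videoView.start();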
=== BACKGROUND SUMMARY ===
At the moment, we are using the Android VideoView to perform video playback. Everything seemed to be working great until we encountered live streaming.
VideoView tends to lag 10-15 seconds behind the live stream, even within a local network (LAN).
While attempting to solve this issue, we came across the VLC embed for Android. After searching the Internet, it seems there isn't any article comparing the pros and cons of the Android VLC embed vs. the Android VideoView.
=== QUESTION ===
What are the advantages (pros) and disadvantages (cons) of using the Android VLC embed vs. the Android VideoView?
Is VLC Embed stable?
Anything I should be careful about when switching the existing VideoView to VLC?
Thank you all in advance!
My view may not be very professional, but it reflects what I've experienced so far.
First, the Android VideoView is good since it comes with the Android SDK, so it doesn't require an external library. But it has some limits: for example, as far as I know, it doesn't support the MMS and MMSH protocols, among others I won't list. That is not the case for the Android VLC SDK; that library is complete and supports almost all the media formats I know of.
It just increases your APK size; on my side, that's the only disadvantage.
Is the Android VLC SDK stable? Yes, it's stable and maintained by a huge community.
Anything I should be careful about when switching the existing VideoView to VLC?
You should keep your sources the same and take care with the aspect ratio.
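For a rough idea of what the switch looks like (a sketch assuming the LibVLC 3.x API; the stream URL and the R.id.video_layout id are placeholders), the VideoView calls become something like:

    import android.net.Uri;
    import org.videolan.libvlc.LibVLC;
    import org.videolan.libvlc.Media;
    import org.videolan.libvlc.MediaPlayer;
    import org.videolan.libvlc.util.VLCVideoLayout;

    // Inside an Activity; a VLCVideoLayout replaces the VideoView in the layout.
    LibVLC libVlc = new LibVLC(this);
    MediaPlayer player = new MediaPlayer(libVlc);
    player.attachViews((VLCVideoLayout) findViewById(R.id.video_layout),
            null, false, false);

    Media media = new Media(libVlc, Uri.parse("rtsp://example.com/live"));
    player.setMedia(media);
    media.release();
    player.setAspectRatio("16:9"); // the point above: you control this explicitly
    player.play();

Remember to release the player and the LibVLC instance in onDestroy().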
What are the advantages (pros) and disadvantages (cons) of using the Android VLC embed vs. the Android VideoView?
Advantage:
More features. VLC supports almost all media formats, plus hardware decoding; audio tracks, subtitles, and chapters are also supported.
More integrated, simpler logic. You can easily get media information and cache it. The playback engine proactively notifies you of state changes and events; just register a player event listener.
Disadvantage:
APK file size increases. If both arm64-v8a and armeabi-v7a are supported, it adds more than 30 MB.
Multiple instances are not perfect. For example, playing 2 videos at the same time is a hassle.
Is VLC Embed stable?
Stable. I have been using the VLC library in my Android app since VLC 2.0.x (now 3.0.x). It runs steadily from Android 5.1 to Android 8.0. A small number of 4K H.265 videos don't play properly, but that can be handled by displaying a "Cannot play" message.
Anything I should be careful about when switching the existing VideoView to VLC?
To use LibVLC on Android, the Medialibrary (org.videolan.medialibrary) is also required. You also need to take note of the licenses:
VLC for Android is licensed under GPLv3
This may be a concern for you if your project uses a different license.
This is a question regarding capturing audio content on recent Android devices. I'm working on an Android app that must record unprocessed microphone audio. While trying to figure out how this could be achieved, I came across the Android Compatibility Definition Documents (CDDs), which, since version 5.0 (API 21), state that "device implementations that declare android.hardware.microphone MUST allow capture of raw audio content [...]" (see any CDD from 5.0 and higher, under the Audio Recording section: https://source.android.com/compatibility/cdd).
On the other hand, I have found that there are several audio sources that can be selected under the MediaRecorder APIs, and that in order "To record raw audio select UNPROCESSED. Some devices do not support unprocessed input. Call AudioManager.getProperty(AudioManager.PROPERTY_SUPPORT_AUDIO_SOURCE_UNPROCESSED) first to verify it's available" (source: https://developer.android.com/guide/topics/media/mediarecorder). The PROPERTY_SUPPORT_AUDIO_SOURCE_UNPROCESSED property was added in API 24.
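For reference, the availability check is a one-liner from any Context (e.g. an Activity):

    import android.content.Context;
    import android.media.AudioManager;

    AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
    // Documented to return the string "true" when the UNPROCESSED source
    // is supported (API 24+); anything else means it isn't.
    String value = am.getProperty(AudioManager.PROPERTY_SUPPORT_AUDIO_SOURCE_UNPROCESSED);
    boolean unprocessedSupported = "true".equals(value);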
Here's where I need help: I am confused by the fact that the CDDs indicate that capturing raw data MUST be allowed, while MediaRecorder does not always allow access to it. Perhaps "raw" is not the same as "unprocessed"? Or perhaps raw/unprocessed audio data is for system-level functions only, and not available to third-party developers.
Please note that when I tested to verify whether the unprocessed property was available on 3 different Android devices (including the Galaxy S8 with API 26), I never got a positive result.
This being said, can someone clarify why the CDDs indicate that capture of raw audio content MUST be available when the UNPROCESSED AudioManager property rarely seems to be? Should I be accessing the raw/unprocessed audio content differently?
Thanks in advance!
The Android CDDs are aimed at developers intending to port the Android platform to other devices/platforms while maintaining cross-compatibility for app developers. This means that the standard, lower-level Android libraries will be able to access the functionality defined in the CDD, but you won't necessarily be able to.
I managed to find an example on androidcookbook.info that walks through capturing raw, uncompressed audio via AudioRecord.
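In case that link goes stale, the gist is something like this (a minimal sketch; the sample rate and buffer sizing are illustrative, and the RECORD_AUDIO permission is required):

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    int sampleRate = 44100;
    int minBuf = AudioRecord.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);

    // UNPROCESSED needs API 24+ and device support; VOICE_RECOGNITION is a
    // common fallback, since the CDD requires AGC and noise suppression to
    // be disabled on that source.
    AudioRecord recorder = new AudioRecord(
            MediaRecorder.AudioSource.UNPROCESSED,
            sampleRate, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT, minBuf * 2);

    short[] pcm = new short[minBuf];
    recorder.startRecording();
    int read = recorder.read(pcm, 0, pcm.length); // raw PCM samples, no container
    recorder.stop();
    recorder.release();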
If that doesn't suit your needs, then I would 100% recommend checking out OpenSL ES if you haven't looked at or used it yet. It uses the NDK (in C/C++) to provide a native interface, so latency is much lower and you have access to more audio-processing functionality. Here's Google's audio-echo project on GitHub, which shows real-time recording and playback via the native route with OpenSL ES.
I am sure you can access the uncompressed audio that way, and you can pass the data to and from Java/C++ as you wish.
What is the best (performance wise) way to get and stream a video from an android device's camera to a PC?
I have seen this question asked here before, and a few open-source programs exist that do just that, but there are multiple approaches and I don't know which one is best!
For example:
Should the Android part be written in C++ or Java (performance/API-wise)?
Which API should I use to get the video from the camera?
What is the best way to stream the video?
I don't intend to support old android versions (<4.x), so if the best way/api is relatively new it's fine by me.
I'm not familiar with Android development but I'll try to answer.
I suppose that the actual encoding of the raw image data is done on a hardware chip (otherwise software encoding would probably kill your battery), and it looks like the MediaCodec class is exactly what you need. I suppose you want to implement some kind of live streaming service where latency matters. If so, you should stick to UDP-based transmission methods; using the RTP protocol or the MPEG-TS container format would be the best choice for this purpose. Of course, you can also use TCP-based methods for streaming, like HLS or DASH (both of them use HTTP).
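To make that concrete, a hardware H.264 encoder session via MediaCodec might be set up roughly like this (a sketch for API 18+; the resolution and bitrate are placeholder values, and routing camera frames into the input surface is left out):

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.view.Surface;

    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc"); // throws IOException
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    // Point the camera preview at this surface; the encoder's output buffers
    // then carry H.264 NAL units you can packetize into RTP or MPEG-TS.
    Surface inputSurface = encoder.createInputSurface();
    encoder.start();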
You should also take a look at Table 1, "Core media format and codec support", in the Android media formats documentation. It tells us, for example, that the H.264 AVC encoder supports the MPEG-TS container, and that HLS version 3 is supported on Android 4.0 and above.
From Android Developers - Enhanced camera & video:
"Android 5.0 also adds support for multimedia tunneling to provide the best experience for ultra-high definition (4K) content and the ability to play compressed audio and video data together."
Any details or specifics on how Lollipop can play compressed audio and video together? Also, what changed here compared with earlier versions?
From the sources, some understanding can be gleaned, as captured in this post on Google Groups: https://groups.google.com/forum/#!topic/android-platform/isNabAHLLks
To summarize, video tunneling is a vendor/device-specific implementation.
EDIT: The aforementioned information concerns the new video tunneling feature introduced in Lollipop. Normal playback is handled through NuPlayer, unless a system property is employed to specifically use StagefrightPlayer, as shown here.
You can get a clear picture of the concept from https://medium.com/google-exoplayer/tunneled-video-playback-in-exoplayer-84f084a8094d
If you want to know more detail, I think you could download the AOSP project to trace the tunnel-mode APIs.
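For what it's worth, on API 21+ you can probe whether a decoder advertises tunneled playback along these lines (a sketch; "video/avc" is just an example MIME type):

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;

    boolean tunneled = false;
    MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
    for (MediaCodecInfo info : list.getCodecInfos()) {
        if (info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (type.equals("video/avc") && info.getCapabilitiesForType(type)
                    .isFeatureSupported(
                            MediaCodecInfo.CodecCapabilities.FEATURE_TunneledPlayback)) {
                tunneled = true; // this decoder supports tunnel mode
            }
        }
    }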
I've made an app to view Vine videos on Android devices. These are basically .mp4 videos loaded into a VideoView. According to the following documentation (http://developer.android.com/guide/appendix/media-formats.html), MP4 video playback is supported on Android 3.0+ devices.
I've already added android:minSdkVersion="11" to the manifest file to keep older Android versions from downloading the app, but I'm still getting feedback from users running newer versions of Android (e.g. 4.1, 4.2) who complain about getting a "Video cannot be played" error message.
Since there's no way (that I know of; please correct me if I'm wrong) to test video playback using the emulator, I can't really know what's going on.
Is there any way to check for a device's ability to do video playback or at least get the emulator to play videos, so I can correctly fix this issue?
P.S. For those interested, here's a link to my app on Google Play. As you can see, I'm being crushed by negative reviews: https://play.google.com/store/apps/details?id=com.thirtymatches.vineflow
If you look at the Google compatibility matrix closely, you'll see that support for MP4/H.264 encoding started with the Android 3.0 release. Playback of MP4/H.264 has been supported by all Android devices back to Android 1.0, so there's no need for you to limit availability to newer releases of Android (unless you have other API compatibility needs).
Via VideoView, video playback on all these devices is done using the hardware decoder provided by the phone's chipset. So to guarantee compatibility, the video has to be encoded to the lowest common denominator. Google provides "Video Encoding Recommendations" at the bottom of the page you linked; I also wrote an answer describing how we transcoded to a form of MP4/H.264 that plays across all Android devices.
Not knowing what Vine is doing with its video clips, it might be that the videos aren't all encoded with uniform encoder parameters. They might be taking the videos straight off handsets and streaming them without any additional server-side processing to ensure wide compatibility; I don't know. If that's the case, you might find it challenging to develop an Android app that can show the content without a) implementing your own software codec (as apps like RockPlayer, MX Player, or VLC do) or b) transcoding the videos on a server (which would probably run afoul of Vine's terms of service).
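As for the "is there any way to check" part of your question: on API 16+ you can at least enumerate the device's codecs to confirm an H.264 decoder is present, along these lines (a sketch; note it won't catch a profile/level mismatch in one specific file):

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;

    boolean hasAvcDecoder = false;
    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (type.equalsIgnoreCase("video/avc")) {
                hasAvcDecoder = true; // a hardware or software H.264 decoder exists
            }
        }
    }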