Android Video Streaming - Device supported?

OK, so there are a bazillion different Android devices. I have a video streaming service that works wonderfully for iOS. My app has a live video feature and a saved-clip playback feature (which also streams to the device). I've run tests on different Android devices and get a whole bunch of different playback results. I am using 640x480 h.264 baseline profile video. Streaming that video works only on some devices. That same stream can also be served at a lower resolution, which works on some additional devices, but still not all of them. The high-profile stream goes through Wowza (http://www.wowzamedia.com/) over RTSP and doesn't work on any Android device (but works on iPhone). The lowest and worst option is Motion JPEG, which works on all devices tested so far.
So my question is: how can I figure out (without having to test every device on the market) whether a device will play 640x480 h.264 baseline profile video; if that won't work, play the low-resolution video; and if that doesn't work, default to Motion JPEG? A rough sketch of that fallback chain is below.
Also, any idea why my RTSP stream transcoded through Wowza works on the iPhone but not on any Android device (not even the Motorola Atrix)?
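To make the fallback logic concrete, here is a rough sketch of what I have in mind with VideoView: try the 640x480 h.264 stream, drop to the low-resolution stream on error, and hand off to the Motion JPEG path as the last resort. The URLs and the switchToMjpeg() hook are placeholders, not working code.

    import android.media.MediaPlayer;
    import android.net.Uri;
    import android.widget.VideoView;

    public class FallbackPlayer {
        // Placeholder URLs: high-quality h.264 first, then the low-res variant.
        private static final String[] H264_SOURCES = {
            "rtsp://example.com/live/high.sdp",
            "rtsp://example.com/live/low.sdp"
        };

        private int current = 0;

        public void play(final VideoView videoView) {
            if (current >= H264_SOURCES.length) {
                switchToMjpeg();               // last resort: existing MJPEG path
                return;
            }
            videoView.setVideoURI(Uri.parse(H264_SOURCES[current]));
            videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
                @Override
                public boolean onError(MediaPlayer mp, int what, int extra) {
                    current++;                 // this source failed, try the next one
                    play(videoView);
                    return true;               // suppress the default error dialog
                }
            });
            videoView.start();
        }

        private void switchToMjpeg() {
            // Hypothetical hook: hand off to whatever Motion JPEG viewer is
            // already known to work on every device tested.
        }
    }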

Streaming on Android is an absolute mess. Most devices don't support anything higher than Baseline 3.0. If you encode for the iPhone 3, it should generally work via RTSP. Newer versions of Android support HLS, but it's hit or miss and largely dependent on the specific device.

I resolved this problem. Check the RTP implementation in your streaming service and the x264 profile. My RTSP server works fine on 90% of devices.
P.S. The video frameworks in different Android versions implement the RTP and RTSP protocols with some differences.

These are some of the links/issues I have come across while trying to make streaming work on various devices.
MediaPlayer seekTo doesn't work for streams
MediaPlayer resets position to 0 when started after seek to a different position
MediaPlayer seekTo inconsistently plays songs from beginning
Basic streaming audio works in 2.1 but not in 2.2
MediaPlayer.seekTo() does not work for unbuffered position
Streaming video when seek back buffering start again in videoView/Mediaplayer
Even the big shots on Stack Overflow are wondering about this.
If you just want streaming without seeking (which is lame), that can be achieved. But then if you receive a call while you are watching, you will end up back at the start.
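For completeness, the usual workaround for the phone-call problem is to save the position in onPause() and seek back in onResume(), roughly like the sketch below (it assumes the Activity holds a VideoView in a "videoView" field). As the links above show, though, on many streamed sources seekTo() is simply ignored and playback restarts from zero anyway.

    // Assumes the Activity holds a VideoView in the field "videoView".
    private int savedPosition = 0;

    @Override
    protected void onPause() {
        super.onPause();
        savedPosition = videoView.getCurrentPosition();
        videoView.pause();
    }

    @Override
    protected void onResume() {
        super.onResume();
        if (savedPosition > 0) {
            videoView.seekTo(savedPosition);   // may be silently ignored on streams
        }
        videoView.start();
    }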

Related

StageVideo fallback in AIR mobile doesn't work; Video component won't play .mp4

So with StageVideo you can play an h.264 .mp4 file, and every example I have found says you need a fallback Video component.
The problem is that I was unable to play the .mp4 video files with the Video component on a mobile device, Android or iOS.
.flv works fine, but I can't keep backup copies of the videos as they take up too much space.
Is it really necessary to have the fallback to the Video component? What are the chances it will fail?
Thanks.
From my experience (I've created three separate AIR VOD apps for both iOS and Android), the following is true:
1. StageVideo works on Android 4.0+. I was unable to get it to work on 3.x, but I have been told it works there. I can confirm for sure that it does not work on 2.x.
2. StageVideo works on iOS 5+. On iOS 5, you will need to play a silent sound at startup to make sure sound works, but you should do that regardless since the iPad 2 rarely plays sound without it. It is a known bug in AIR that, as far as I know, has never been fixed.
3. iOS can only play h.264 MP4s through StageVideo and StageWebView. They will not work in Flash video players (including VideoDisplay, the base for Video and all OSMF-based players). I do not recall the exact reason for this, but I believe it has something to do with the MP4 requirement for hardware-accelerated playback.
4. iOS can play FLV and, maybe, F4V through the Flash video players described in #3. This will lack hardware acceleration, however. That means your video and your UI will run on the same thread and share the same process; basically, lower frame rates while video is playing. Additionally, CPU decoding is a battery drain.
5. Android is a little more wild. You cannot use StageWebView for any playback as of Android 4.3 (I have not tested on 4.4 yet). You can use Flash video players for h.264 MP4s... on some devices. I've found that they seem to work fine on Android 3.0+ on all devices I have tested. Keep in mind that is only a couple dozen out of over a thousand possibilities, though. On 2.x, it is extremely hit-or-miss. It seems to work fine on the HTC and Motorola devices I've tested, but I have had reports from users who cannot play back video on Samsung and Sony devices.
As you mentioned, a fallback player is definitely recommended. Without multiple sources/encode types, the fallback is useless on iOS, however. I currently have an app in the Play Store (All About Trikes) that was originally released without a fallback player and just used a StageVideo implementation. A day after release, we started getting reports that users on 2.x couldn't play videos. We had to scramble. We first released a version that couldn't be installed on 2.x and then another version that uses Flex's VideoDisplay as a fallback, which seems to have fixed the problem for those users, but I know there will be others that cannot play back video.
Long story short, there is no foolproof way of playing back h.264 MP4s on mobile using AIR. You do want to include a fallback player, regardless of platform. Ideally, if you are streaming the video, you should have both h.264 MP4s and FLVs available, with the fallback using the FLVs instead of the MP4s.
Hopefully that helps.

HTML5 video tag currentTime not working on Android

I am working with the HTML5 video tag, using an .m3u8 file as the source. I am not able to seek the video using the currentTime property on Android 4+, whereas if I use an .mp4 file I am able to seek.
On iPad it works properly, but it does not work on Android. Please guide me on how to rectify this issue.
Thanks,
Thavaprakash S.
HLS and Android are not the best of friends.
Some problems I know of are:
No adaptive bitrate switching: the first quality is picked, and no switching will occur.
When pausing, the video restarts from the beginning (for example with VOD).
When going fullscreen, the video restarts from the beginning.
When offering a livestream with DVR, the stream starts at the beginning of the DVR instead of at the "live" moment.
You cannot seek. <-- this is the one for you
Aspect ratios are not detected properly (though this should be fixed in 4.1).
On top of these problems, there is no support for HLS on Android versions before 2.3, and on 3.0 it actually makes your tablet crash.
Basically: only use HLS on Android for live video without DVR, and set the correct aspect ratio yourself. Oh, and try to pick a "suitable" quality, because it won't switch.
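If you have both an HLS playlist and a plain MP4 rendition available, a blunt but workable way to apply that advice is to gate HLS on the OS version. A minimal sketch, with placeholder URLs:

    import android.os.Build;

    public final class StreamPicker {
        // Only hand an HLS playlist to Android 4.0+; older versions (and 3.0,
        // which can crash on HLS) get a progressive MP4 instead.
        public static String pickStreamUrl() {
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.ICE_CREAM_SANDWICH) {
                return "http://example.com/live/playlist.m3u8";  // HLS
            }
            return "http://example.com/vod/fallback.mp4";        // progressive MP4
        }
    }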

How to know which Android devices support video playback?

I've made an app to view Vine videos on Android devices. These are basically .mp4 videos being loaded into a VideoView. From the following documentation (http://developer.android.com/guide/appendix/media-formats.html), MP4 video playback is supported on Android 3.0+ devices.
I've already added android:minSdkVersion="11" to the manifest file to filter out older Android versions from downloading the app, but I'm still getting feedback from users running newer versions of Android (e.g. 4.1, 4.2) who complain about getting a "Video cannot be played" error message.
Since there's no way (that I know of; please correct me if I'm wrong) to test video playback using the emulator, I can't really know what's going on.
Is there any way to check a device's ability to do video playback, or at least get the emulator to play videos, so I can correctly fix this issue?
P.S. For those interested, here's a link to my app on Google Play. As you can see, I'm being crushed by negative reviews: https://play.google.com/store/apps/details?id=com.thirtymatches.vineflow
If you look at the Google compatibility matrix closely, you'll see that support for MP4/H.264 encoding started with the Android 3.0 release. Playback of MP4/H.264 has been supported by all Android devices back to Android 1.0, so there's no need for you to limit availability to newer releases of Android (unless you have other API compatibility needs).
Via VideoView, video playback on all these devices is done using the hardware decoder provided by the phone's chipset. So to guarantee compatibility, the video has to be encoded to the lowest common denominator. Google provides "Video Encoding Recommendations" at the bottom of the page you linked; I also wrote an answer describing how we transcoded to a form of MP4/H.264 that plays across all Android devices.
Not knowing what Vine is doing with their video clips, it might be that the videos aren't all encoded with uniform encoder parameters. They might be taking the videos straight off handsets and streaming them without any additional server-side processing to ensure wide compatibility; I don't know. If that's the case, you might find it challenging to develop an Android app that can show the content without a) implementing your own software codec (as apps like RockPlayer, MX Player or VLC do) or b) transcoding the videos on a server (which probably will run afoul of Vine's terms of service).
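One more note: on Android 4.1+ (API 16) you can at least ask at runtime whether the device advertises an H.264 decoder at all. It will not tell you which profiles, levels or resolutions actually play, so it is only a coarse check, but it is cheap:

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;

    public final class CodecCheck {
        // API 16+: true if the device reports any H.264 (video/avc) decoder.
        // This says nothing about supported profiles or resolutions, so you
        // still need an error listener around the actual playback.
        public static boolean hasAvcDecoder() {
            for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
                if (info.isEncoder()) {
                    continue;
                }
                for (String type : info.getSupportedTypes()) {
                    if ("video/avc".equalsIgnoreCase(type)) {
                        return true;
                    }
                }
            }
            return false;
        }
    }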

How to modify the frame rate on Android

I'm developing an Android app to send a live video and audio stream to a PC. The app captures the camera and mic and sends the live stream, and I use VLC player to play it. It works very well on my HTC S710e (Android 4.0.4), but on other phones it's very choppy. I have tested many phones, some with higher-spec hardware than the S710e and some lower, but all of them are very choppy.
I have debugged it for a long time and found that I cannot modify the frame rate of the video: although I set the frame rate to 10 or 15, the live video in VLC shows a frame rate of 30 (H.263).
So how can I modify the frame rate? I hope someone can help me. Thank you.
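The usual places to request a lower rate are Camera.Parameters.setPreviewFpsRange() on the capture side and MediaRecorder.setVideoFrameRate() on the encoder side, though many devices' hardware encoders ignore the hint and produce roughly 30 fps regardless. A minimal sketch, assuming a Camera plus MediaRecorder pipeline (your streaming code may use a different capture path):

    import java.util.List;
    import android.hardware.Camera;
    import android.media.MediaRecorder;

    public final class FrameRateHelper {
        // Sketch of the two places a frame rate is usually requested. Many
        // devices' hardware encoders ignore these hints entirely.
        public static void requestFrameRate(Camera camera, MediaRecorder recorder, int fps) {
            Camera.Parameters params = camera.getParameters();

            // The requested range must be one the camera actually reports,
            // otherwise setParameters() may throw. Values are fps * 1000.
            List<int[]> ranges = params.getSupportedPreviewFpsRange();
            for (int[] range : ranges) {
                if (range[Camera.Parameters.PREVIEW_FPS_MAX_INDEX] <= fps * 1000) {
                    params.setPreviewFpsRange(
                            range[Camera.Parameters.PREVIEW_FPS_MIN_INDEX],
                            range[Camera.Parameters.PREVIEW_FPS_MAX_INDEX]);
                    camera.setParameters(params);
                    break;
                }
            }

            // Encoder-side hint; must be called before prepare().
            recorder.setVideoFrameRate(fps);
        }
    }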

HLS streaming on Galaxy S3

We are developing a music streaming application for Android devices with adaptive bitrate support. We are using Wowza as the streaming server, which streams the songs using HLS. We have converted each song into four bitrates and created SMIL files which are referenced in the HLS URLs. So the final URL will look something like this:
http://streaming.server.name:1935/vod/smil:audiofile.smil/playlist.m3u8.
We have tested the app on multiple Android devices (Android 4.0 or later), including the Galaxy S2, Galaxy Note I, Sony Xperia, Google Nexus, etc. On all these devices the songs stream fine, except on the Galaxy S3. On the Galaxy S3 the song starts and plays up to the 6-second mark. After that the song loops, playing the same 6-second part again and again.
The same behavior is observed when the above URL is accessed through a browser.
We have checked the Wowza logs, but no errors were reported.
Has anyone tried such a thing on S3 devices? Any guidance on how to debug this issue will be greatly appreciated.
The Android documentation states (though not clearly enough) that the TS container format only supports AAC audio. You are trying to play HLS with TS segments containing MP3 audio data (which is supported by iOS).
I can share the observation that doing so works on most Android devices, but not on all.
The S III is an example of that.
Please refer to the Android Supported Media Formats section in the documentation.
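If you want to confirm what the segments actually contain, you can download a single .ts chunk and inspect it with MediaExtractor (API 16+). A quick sketch; the file path is a placeholder, and if the extractor cannot even find an audio track in an MP3-in-TS segment, that is a hint in itself:

    import java.io.IOException;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;

    public final class SegmentInspector {
        // Returns the audio MIME type found in a downloaded HLS segment.
        // "audio/mp4a-latm" means AAC; "audio/mpeg" means MP3 (the problem case).
        public static String audioMimeOfSegment(String path) throws IOException {
            MediaExtractor extractor = new MediaExtractor();
            try {
                extractor.setDataSource(path);   // e.g. "/sdcard/media_1.ts" (placeholder)
                for (int i = 0; i < extractor.getTrackCount(); i++) {
                    MediaFormat format = extractor.getTrackFormat(i);
                    String mime = format.getString(MediaFormat.KEY_MIME);
                    if (mime != null && mime.startsWith("audio/")) {
                        return mime;
                    }
                }
                return null;                     // no audio track recognized
            } finally {
                extractor.release();
            }
        }
    }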
