So we've been fighting this for weeks. We are building a Unity (5.6 and 2017.2) app for Fire TV / Fire Stick devices (among others). It's primarily a media app serving our own multi-bitrate (MBR) content over HLS from a Wowza server. Every player we have tried exhibits the same behavior: every two seconds or so there is a skip in the A/V playback of a few milliseconds or more. Some videos show this more than others. The audio and video remain in sync; playback just skips frames regularly, and the result is, in some cases, nearly unwatchable.
We've tried several media player plugins for Unity (UMP, NexPlayer, AVPro), and they all do the same thing. They play HLS content from external sources perfectly fine, but content served from our own Wowza instance stutters, even files we didn't encode ourselves. This only affects Unity/Android clients, though; Roku and Apple TV play the content fine, as do players on Windows. It is just the combination of Unity/Android (Fire OS, but also others) and our served content.
It seems like a Wowza configuration problem, but again, other clients play just fine from the same hosts. Has anyone run across this issue, and do you have recommendations for setting up the plugins or tweaking Wowza? Is there a specific plugin that you've successfully used as a video player in Unity, for Fire TV?
We think we're doing all the Unity-side things correctly (multithreaded GL rendering, etc.).
Related
I have an app that was developed natively for both iOS and Android. Both apps use the same API to download content such as images, documents, and videos. The videos are MP4. Of course, the videos play fine on iOS, but some Android tablets have trouble playing them reliably: sometimes they play, but usually they don't.
Obviously this is a codec issue. I've suggested we have the user upload two videos, one for Android and one for iOS, but that doesn't seem to be an option at this point.
Is there a bitrate/FPS setting that can be used to make video playback more reliable across both platforms?
I'm working on applications that have audio and video playback as core elements of the user experience. What I'm noticing is that, across the wide array of Android devices on the market, playback works or fails in various ways. The system will often behave as though playback completed successfully (calling callbacks, not erroring) even though the user only heard a portion of the audio, or certain parts didn't play correctly.
I've got a good emulation environment set up for running unit tests on various devices; however, I don't have any good way to verify the actual output/input to the system during those tests.
Are there any techniques for unit testing and verifying that audio or video actually played (i.e., produced audible sound) on these devices?
So with StageVideo you can play an H.264 .mp4 file, and every example I have found says you need a fallback Video component.
The problem is that I was unable to play the .mp4 files with the Video component on a mobile device, Android or iOS.
.flv works fine, but I can't keep backup video files, as they take up too much space.
Is it really necessary to have the fallback to the Video component? What are the chances it will fail?
Thanks.
From my experience (I've created three separate AIR VOD apps for both iOS and Android), the following is true:
1. StageVideo works on Android 4.0+. I was unable to get it to work on 3.x, but I have been told it works there. I can confirm for sure that it does not work on 2.x.
2. StageVideo works on iOS 5+. On iOS 5, you will need to play a silent sound at startup to make sure sound works, but you should do that regardless, since the iPad 2 rarely plays sound without it. It is a known bug in AIR that, as far as I know, no one has ever attempted to fix.
3. iOS can only play H.264 MP4s through StageVideo and StageWebView. They will not work in Flash video players (including VideoDisplay, the base for Video and all OSMF-based players). I do not recall the exact reason, but I believe it has something to do with the requirement that MP4 playback be hardware accelerated.
4. iOS can play FLV and, maybe, F4V through the Flash video players described in #3. This will lack hardware acceleration, however, which means the video and your UI run on the same thread and share the same process: basically, lower framerates while video is playing. Additionally, CPU decoding is a battery drain.
5. Android is a little more wild. You cannot use StageWebView for any playback as of Android 4.3 (I have not tested 4.4 yet). You can use Flash video players for H.264 MP4s... on some devices. I've found that they seem to work fine on Android 3.0+ on all devices I have tested, but keep in mind that is only a couple dozen out of over a thousand possibilities. On 2.x it is extremely hit-or-miss: it seems to work fine on the HTC and Motorola devices I've tested, but I have had reports from users who cannot play back video on Samsung and Sony devices.
As you mentioned, a fallback player is definitely recommended. Without multiple sources/encode types, however, the fallback is useless on iOS. I currently have an app in the Play Store (All About Trikes) that was originally released without a fallback player and just used a StageVideo implementation. A day after release, we started getting reports that users on 2.x couldn't play videos, and we had to scramble. We first released a version that couldn't be installed on 2.x, and then another version that uses Flex's VideoDisplay as a fallback, which seems to have fixed the problem for those users, but I know there will be others that cannot play back video.
Long story short, there is no foolproof way of playing back H.264 MP4s on mobile using AIR. You do want to include a fallback player, regardless of platform. Ideally, if you are streaming the video, you should have both H.264 MP4s and FLVs available, with the fallback using FLVs instead of MP4s.
Hopefully that helps.
I am (at long last) at the very end of a VOD project. It works perfectly, except on Android. Basically, on Android the video will not play until the entire file has downloaded. A media server was well out of scope, so we are just serving the videos from AWS S3. It works fantastically on iOS: both streaming and downloading the video work exactly as you would expect. On Android, it just doesn't seem to want to play before the download finishes. It works well against a server on the local network (I even see the occasional buffer, so I know it's not just downloading quickly), but not against anything remote.
My only guess is that it has to do with the differences in the way iOS and Android stream video. On iOS, video streams via byte-range requests: every few seconds, the player times out the connection and requests another range of bytes from the file. On Android, it sends a single request for the entire file. I'm not sure how that could be fixed, however.
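To illustrate the difference I mean, here is roughly what a single byte-range request looks like at the HTTP level (a plain-Java sketch for illustration only; the S3 URL is a placeholder, and NetStream does not expose this layer directly):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Illustration only: the URL is a placeholder. It just shows what a single
// byte-range request looks like, as opposed to one GET for the whole file.
public class RangeRequestDemo {
    public static void main(String[] args) throws IOException {
        URL url = new URL("https://example-bucket.s3.amazonaws.com/video.mp4"); // placeholder
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Range", "bytes=0-1048575"); // ask for only the first 1 MB
        System.out.println("Status: " + conn.getResponseCode()   // 206 Partial Content if ranges are honored
                + ", Content-Range: " + conn.getHeaderField("Content-Range"));
        long total = 0;
        try (InputStream in = conn.getInputStream()) {
            byte[] buf = new byte[8192];
            for (int n; (n = in.read(buf)) != -1; ) {
                total += n;
            }
        }
        System.out.println("Bytes received: " + total); // roughly 1 MB, not the whole file
        conn.disconnect();
    }
}
```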
Does anyone have any tips or pointers? Any help would be greatly appreciated.
Happens on Android 4.4 and 4.3.
Using both a remote prod server we own and AWS S3.
AIR 3.9 with Flex 4.11
Utilizing StageVideo and NetStream
Test devices are a Nexus 5 and a Nexus 4
The issue was with the videos themselves. AIR for Android uses the standard approach to streaming, where the entire file is requested and read bit by bit (as opposed to iOS, which repeatedly requests specific byte ranges).
The problem is that the player cannot begin playback until the video's metadata has been read. A standard H.264 encode places the metadata (the moov atom) at the very end of the file, so playback does not begin until the entire video has been downloaded.
The easiest way I have found to fix this is re-encoding the videos through Handbrake with the "Web Optimized" option selected. This ensures the metadata is located at the very beginning of the file (byte 24, I believe), so the video should begin playing instantly.
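If you want to sanity-check files after re-encoding, here is a minimal sketch (plain Java, no Android or AIR dependencies; the class and method names are made up) that walks the top-level MP4 boxes and reports whether moov comes before mdat:

```java
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

// Hypothetical helper: walks the top-level boxes of an MP4 file and reports
// whether 'moov' (the metadata) comes before 'mdat' (the media data).
// moov-first is what Handbrake's "Web Optimized" option produces.
public class MoovChecker {

    public static boolean isFastStart(String path) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(path))) {
            byte[] header = new byte[8];
            while (true) {
                try {
                    in.readFully(header);              // 4-byte size + 4-byte box type
                } catch (EOFException eof) {
                    return false;                      // ran out of boxes without seeing moov first
                }
                long size = ((header[0] & 0xFFL) << 24) | ((header[1] & 0xFFL) << 16)
                          | ((header[2] & 0xFFL) << 8) | (header[3] & 0xFFL);
                String type = new String(header, 4, 4, StandardCharsets.US_ASCII);

                if ("moov".equals(type)) return true;   // metadata first: progressive playback possible
                if ("mdat".equals(type)) return false;  // media data first: moov must be at the end

                long headerLen = 8;
                if (size == 1) {                        // 64-bit "largesize" follows the type field
                    size = in.readLong();
                    headerLen = 16;
                } else if (size == 0) {
                    return false;                       // box extends to end of file; nothing after it
                }
                skipFully(in, size - headerLen);        // jump to the next top-level box
            }
        }
    }

    private static void skipFully(DataInputStream in, long n) throws IOException {
        while (n > 0) {
            long skipped = in.skip(n);
            if (skipped <= 0) throw new EOFException("Truncated MP4 box");
            n -= skipped;
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(args[0] + " is web optimized: " + isFastStart(args[0]));
    }
}
```

Running it against a file before and after the Handbrake pass should show moov moving from the end of the file to the front.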
Explanation from Adobe
Thread that gave me the idea to use the "Web Optimized" option
I've made an app to view Vine videos on Android devices. These are basically .mp4 videos loaded into a VideoView. According to the following documentation (http://developer.android.com/guide/appendix/media-formats.html), MP4 video playback is supported on Android 3.0+ devices.
I've already added android:minSdkVersion="11" to the manifest file to keep older Android versions from downloading the app, but I'm still getting feedback from users running newer versions of Android (e.g. 4.1, 4.2) who complain about getting a "Video cannot be played" error message.
Since there's no way (that I know of, please correct me if I'm wrong) to test video playback using the emulator, I can't really know what's going on.
Is there any way to check for a device's ability to do video playback or at least get the emulator to play videos, so I can correctly fix this issue?
P.S. For those interested, here's a link to my app on Google Play. As you can see, I'm being crushed by negative reviews: https://play.google.com/store/apps/details?id=com.thirtymatches.vineflow
If you look at the Google compatibility matrix closely, you'll see that support for MP4/H.264 encoding started with the Android 3.0 release. Playback of MP4/H.264 has been supported by all Android devices back to Android 1.0, so there's no need for you to limit availability to newer releases of Android (unless you have other API compatibility needs).
Via VideoView, video playback on all these devices is done using the hardware decoder provided by the phone's chipset. So to guarantee compatibility, the video has to be encoded to the lowest common denominator. Google provides "Video Encoding Recommendations" at the bottom of the page you linked; I also wrote an answer describing how we transcoded to a form of MP4/H.264 that plays across all Android devices.
Not knowing what Vine is doing with their video clips, it might be that the videos aren't all encoded with uniform encoder parameters. They might be taking the videos straight off handsets and streaming them without any additional server-side processing to ensure wide compatibility; I don't know. If that's the case, you might find it challenging to develop an Android app that can show the content without a) implementing your own software codec (as apps like RockPlayer, MX Player or VLC do) or b) transcoding the videos on a server (which probably will run afoul of Vine's terms of service).
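On the "check for a device's ability" part of the question: one thing you can do at runtime (a sketch, assuming API 16+ for MediaCodecList; the class name is made up) is ask whether the device advertises a video/avc decoder at all:

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

// Sketch only: asks the device whether any installed decoder advertises
// H.264 ("video/avc"). Requires API 16+ for MediaCodecList.
public final class AvcSupportCheck {

    public static boolean hasAvcDecoder() {
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (info.isEncoder()) {
                continue; // only decoders matter for playback
            }
            for (String type : info.getSupportedTypes()) {
                if ("video/avc".equalsIgnoreCase(type)) {
                    return true;
                }
            }
        }
        return false;
    }
}
```

Keep in mind this only tells you that an H.264 decoder exists; it says nothing about whether a given profile, level, resolution, or bitrate will actually play, so encoding to the recommendations above still matters.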