Android: Unit Testing Media Playback

I'm working on applications in which audio and video playback are core elements of the user experience. What I'm noticing is that, across the wide array of Android devices on the market, playback succeeds or fails in different ways. The system will often behave as though playback completed successfully (calling callbacks, not raising errors) even though the user only heard a portion of the audio, or certain parts didn't play correctly.
I've got a good emulation environment set up for running unit tests on various devices; however, I don't have a good way to verify the system's actual output or input during those tests.
Are there any techniques for unit testing and verifying that audio or video actually played (i.e., produced audible sound) on these devices?
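One technique worth sketching: in an instrumentation test, attach a Visualizer to the player's audio session and assert that the captured waveform deviates from silence. This is a minimal sketch under stated assumptions, not a proven methodology: R.raw.test_tone is a hypothetical test fixture, the sleep duration and silence threshold are guesses, and Visualizer requires the RECORD_AUDIO permission.

    import android.content.Context
    import android.media.MediaPlayer
    import android.media.audiofx.Visualizer
    import org.junit.Assert.assertTrue

    // Sketch: assert that a playing MediaPlayer session produces a
    // non-silent waveform. Requires the RECORD_AUDIO permission.
    fun assertAudiblePlayback(context: Context) {
        val player = MediaPlayer.create(context, R.raw.test_tone) // assumed fixture
        val captureSize = Visualizer.getCaptureSizeRange()[1]
        val visualizer = Visualizer(player.audioSessionId).apply {
            setCaptureSize(captureSize) // must be set before enabling
            setEnabled(true)
        }
        player.start()
        Thread.sleep(500) // let playback settle (assumed sufficient)

        val waveform = ByteArray(captureSize)
        visualizer.getWaveForm(waveform)
        // The 8-bit unsigned waveform idles near 128 when silent; real
        // deviation suggests the session is producing audible samples.
        val nonSilent = waveform.any { (it.toInt() and 0xFF) !in 120..136 }

        visualizer.setEnabled(false)
        visualizer.release()
        player.release()
        assertTrue("Expected non-silent output from the audio session", nonSilent)
    }

Note that this only verifies the mix passing through the audio session, not what physically left the speaker; end-to-end verification would require a microphone or loopback capture.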

Related

How to disable system settings automatically while playing music in Android

This question is for anyone who knows about audio on Android devices.
We need an Android app for auditory training.
In this app, individualised music is downloaded for each user and then listened to for a while for training purposes.
It is essential that this music is played unchanged, i.e. without the influence of equalisers, room sound, bass amplification, etc.
Is there a way to bypass the system audio settings in the app or to deactivate them automatically?
Different manufacturers seem to handle the system audio settings somewhat differently. We have noticed differences between Samsung, Google, Sony, Motorola, OnePlus, Redmi.
Therefore, it might make the most sense to bypass the system audio within the app.
We have experimented with the AudioAttributes CONTENT_TYPE and USAGE_ values because we hoped that USAGE_VOICE_COMMUNICATION would bypass the equalisers. But that doesn't work, and some manufacturers seem to activate compressors or noise reduction, which degrades the quality of the music.
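For reference, the kind of attributes experiment described above looks roughly like this; a minimal sketch, where context and trackUri are placeholders for the app's context and the downloaded track:

    import android.media.AudioAttributes
    import android.media.MediaPlayer

    // Steering the output path via AudioAttributes. USAGE_MEDIA keeps the
    // normal media path (subject to system equalisers); swapping in
    // USAGE_VOICE_COMMUNICATION routes through the call path, which on
    // some devices enables compressors/noise reduction instead.
    val attributes = AudioAttributes.Builder()
        .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
        .setUsage(AudioAttributes.USAGE_MEDIA)
        .build()

    val player = MediaPlayer().apply {
        setAudioAttributes(attributes) // must precede prepare()
        setDataSource(context, trackUri)
        prepare()
        start()
    }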
So far we are working with the integrated player.
Does a self-programmed player make a difference?
Does anyone know a solution?
Thank you!

Sharing audio (mic) input with multiple apps in Android

As per the official documentation:
Android 10 (API level 29) and higher imposes a priority scheme that can switch the input audio stream between apps while they are running. In most cases, if a new app acquires the audio input, the previously capturing app continues to run but receives silence. In some cases, the system can continue to deliver audio to both apps. The various sharing scenarios are explained below.
Other than some special cases, audio is not shared between apps.
But I have seen many apps share the audio input without falling under the special cases above.
For example, when I'm on a Zoom call and start an audio recorder, both apps receive audio, though Zoom's audio decreases in intensity.
Similarly, Omlet Arcade is able to record mic audio even when mic access has been given to other apps.
How is it possible? And as per the documentation, this shouldn't be allowed.
Update:
I was able to achieve this using Oboe, but it is not consistent across devices. It also causes a sync issue in my live-streaming app: the audio is only audible after a delay.
This is not possible on Android 5+; you need a rooted phone to do it. In Omlet Arcade, whenever you play a game and switch on the in-game mic, Omlet Arcade stops receiving any audio input. It will still function, but you have to restart it to get voice input back.
That said, in a recent MIUI bug, people were able to listen in on calls in Zoom and in-game mic apps. In your case, it might not be stock Android but a modified build such as MIUI or OxygenOS.
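For reference, a plain capture path of the kind governed by the priority scheme quoted above might look like the following; a minimal sketch, where the sample rate and format are assumptions and RECORD_AUDIO must already be granted:

    import android.media.AudioFormat
    import android.media.AudioRecord
    import android.media.MediaRecorder

    // On Android 10+, if another app holds the input, this recorder keeps
    // running but read() delivers silence (all-zero samples).
    val sampleRate = 44_100 // assumed
    val bufferSize = AudioRecord.getMinBufferSize(
        sampleRate,
        AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT
    )
    val recorder = AudioRecord(
        MediaRecorder.AudioSource.MIC,
        sampleRate,
        AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT,
        bufferSize
    )
    recorder.startRecording()
    val buffer = ShortArray(bufferSize)
    // 'read' holds the number of samples delivered; the buffer may
    // legitimately be all zeros if another app owns the mic.
    val read = recorder.read(buffer, 0, buffer.size)
    recorder.stop()
    recorder.release()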

Unity video playback on Android devices skips badly

So we've been fighting this for weeks. We are building a Unity (5.6 and 2017.2) app for Fire TV / Fire Stick devices (among others). It's primarily a media app sharing our own multi-bitrate (MBR) content over HLS, served from a Wowza server. Every player we have tried results in the following behavior: every two seconds or so there is a skip in the A/V playback, just a few ms or more. Some videos display this more than others, it seems. The audio and video remain in sync, just skipping frames regularly. The result is, in some cases, nearly unwatchable.
We've tried several media player plugins for Unity (UMP, NexPlayer, AvPro), and they all do the same. They play HLS content from external sources perfectly fine, but our own served content is nasty, even things we didn't encode ourselves. This is only for Unity/Android clients though; Roku and Apple TV play the content fine, as do players on Windows. It is just the confluence of Unity/Android (Fire OS, but also others) and our served content.
It seems like a Wowza settings problem, but again, other clients play just fine from the same hosts. Has anyone run across this issue, and do you have recommendations for setting up the plugins or tweaking Wowza? Is there a specific plugin that you've successfully used as a video player in Unity for Fire TV?
We think we're doing all the Unity-side things correctly (multithreaded GL rendering, etc.).

Android: play multiple mp3s simultaneously, with precise sync and independent volume control

I want to create an Android app that plays multiple mp3s simultaneously, with precise sync (less than 1/10 of a second off) and independent volume control. Each mp3 could be over 1 MB, with a run time of up to several minutes. My understanding is that MediaPlayer will not do the precise sync, and SoundPool can't handle files over 1 MB or about 5 seconds of run time. I am experimenting with Superpowered and may end up using that, but I'm wondering if there's anything simpler, given that I don't need any processing (reverb, flange, etc.), which is Superpowered's focus.
I also ran across the YouTube video on Android high-performance audio from Google I/O 2016 and am wondering if anyone has experience with this.
https://www.youtube.com/watch?v=F2ZDp-eNrh4
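For context, the naive approach the question rules out looks like this; a sketch where the raw resources are placeholders. Per-player volume works, but each start() incurs its own device-dependent latency, so the offset between the two tracks can easily exceed the 1/10-second budget:

    import android.media.MediaPlayer

    // Two MediaPlayers started back-to-back: volumes are independent, but
    // the start offset between them is unpredictable and often audible.
    val playerA = MediaPlayer.create(context, R.raw.track_a) // assumed resources
    val playerB = MediaPlayer.create(context, R.raw.track_b)
    playerA.setVolume(1.0f, 1.0f) // left, right gain
    playerB.setVolume(0.5f, 0.5f)
    playerA.start()
    playerB.start() // begins some unknown interval after playerA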
Superpowered was originally made for my DJ app (DJ Player in the App Store), where precisely syncing multiple tracks is a requirement.
Therefore, syncing multiple mp3s and independent volume control is definitely possible and core to Superpowered. All you need is the SuperpoweredAdvancedAudioPlayer class for this.
The CrossExample project in the SDK has two players playing in sync.
The built-in audio features in Android are highly device- and/or build-dependent. You can't get a consistent feature set with those. In general, the audio features of Android are not stable. That's why you need a specialized audio library that does everything "inside" your application (so it is not a "wrapper" around Android's audio features).
When you are playing compressed files (AAC, MP3, etc.) on Android, in most situations they are decoded in hardware to save power, except when the output goes to a USB audio interface. The hardware codec accepts data in big chunks (again, to save power). Since it's not possible to issue a single command to start playing multiple streams at once, what often happens is that one stream has already sent a chunk of compressed audio to the hardware codec and started playing while the others haven't yet sent their data.
You really need to decode these files in your app and mix the output into a single audio stream; then you can guarantee the desired synchronization. The built-in mixing facilities are mostly intended to let multiple apps share the same sound output; they are not designed for multitrack mixing.
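To make the decode-and-mix point concrete, here is a minimal sketch. It assumes the files have already been decoded (e.g. via MediaExtractor/MediaCodec) into 16-bit PCM arrays at a shared sample rate; the per-track gains provide the independent volume control:

    // Mix two decoded 16-bit PCM buffers with independent gains into one
    // stream. Combining the tracks sample-by-sample keeps them perfectly
    // aligned; the result is then written to a single AudioTrack.
    fun mix(a: ShortArray, b: ShortArray, gainA: Float, gainB: Float): ShortArray {
        val out = ShortArray(minOf(a.size, b.size))
        for (i in out.indices) {
            val sample = a[i] * gainA + b[i] * gainB
            // Clamp to the 16-bit range to avoid wraparound distortion.
            out[i] = sample.coerceIn(-32768f, 32767f).toInt().toShort()
        }
        return out
    }

Writing the mixed array to one AudioTrack in streaming mode yields a single hardware stream, sidestepping the per-stream codec start-up described above.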

StageVideo fallback in AIR mobile doesn't work. Video component won't play .mp4

So with StageVideo you can play an h.264 .mp4 file, and in every example I found you need to have a fallback Video component.
The problem is that I was unable to play the .mp4 video files with the Video component on a mobile device, Android or iOS.
.flv works fine, but I can't keep backup video files, as they take too much space.
Is it really necessary to have the fallback Video component? What are the chances it will fail?
Thanks.
From my experience (I've created 3 separate AIR VOD apps for both iOS and Android), the following is true:
StageVideo works on Android 4.0+. I was unable to get it to work with 3.x, but I have been told it works. I can, for sure, confirm that it does not work on 2.x.
StageVideo works on iOS 5+. On iOS 5, you will need to play a silent sound at startup to make sure sound works, but you should do that regardless, since the iPad 2 rarely plays sound without it. It is a known bug in AIR that, as far as I know, has never been fixed.
iOS can only play h.264 MP4s through StageVideo and StageWebView. They will not work in Flash video players (including VideoDisplay, the base for Video and all OSMF-based players). I do not recall the exact reason for this, but I believe it has something to do with the MP4 requirement for hardware accelerated playback.
iOS can play FLV and, maybe, F4V through the Flash video players described in #3. This will lack hardware acceleration, however. That means your video and your UI will run on the same thread and share the same process. Basically, lower framerates while video is playing. Additionally, CPU decoding is a battery drain.
Android is a little more wild. You cannot use StageWebView for any playback as of Android 4.3 (have not tested on 4.4 yet). You can use Flash video players for h.264 MP4s... on some devices. I've found that they seem to work fine on Android 3.0+ on all devices I have tested. Keep in mind that is only a couple dozen out of over a thousand possibilities, though. On 2.x, it is extremely hit-or-miss. It seems to work fine on HTC and Motorola devices (which I've tested on), but I have had reports from users who cannot playback on Samsung and Sony devices.
As you mentioned, a fallback player is definitely recommended. Without having multiple sources/encode types, the fallback is useless on iOS, however. I currently have an app in the Play Store (All About Trikes) that was originally released without a fallback player and just used a StageVideo implementation. A day after release, we started getting reports that users on 2.x couldn't play videos. We had to scramble. We first released a version that couldn't be installed on 2.x and then another version that uses Flex's VideoDisplay as a fallback, which seems to have fixed the problem for those users, but I know there will be others that cannot play back video.
Long story short, there is no fool-proof way of playing back h.264 MP4s on mobile using AIR. You do want to include a fallback player, regardless of platform. Ideally, if you are streaming the video, you should have both h.264 MP4s and FLVs available with the fallback using FLVs instead of MP4s.
Hopefully that helps.
