I developed an Android game a few years ago. I have a newer device now (a Galaxy S20 FE, Android version 12), and I'm noticing problems with MIDI music and sound effects that I didn't experience on my old device (a Galaxy ON5, Android version 6.0.1).
For both music and sound effects, the new device doesn't seem to play higher notes. I'm not sure where the cutoff is, but I have one particular pair of sound effects that are identical except that one of them is shifted a few notes higher. I can hear the lower one without any problems, but I don't hear the higher one at all on the new device. On the older device, I can hear both sounds, plus other sounds that are much higher.
If anyone wants to test it, this is the game:
https://play.google.com/store/apps/details?id=org.pandcorps.botsnbolts
The nearly identical sound effects are the player shooting (which I can't hear) and enemies shooting (which I can hear). There are some cannons near the beginning of the Rockslide Bot level which are a quick place to test the enemy shooting sound. The Hail Bot level's music has high notes that I can't hear at all on the new device (plus a lower bassline that's easy to hear).
Has anyone else experienced this? Has anyone found a solution?
I use SoundPool for MIDI sound effects and JetPlayer for MIDI music. I'm not sure if something changed in Android or if there's something different about the new phone. I expected small changes in the MIDI audio between them, but nothing this drastic.
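For reference, the sound effects are loaded and played through SoundPool roughly like this (a simplified sketch, not the actual game code; the resource name is a placeholder):

```java
import android.content.Context;
import android.media.AudioAttributes;
import android.media.SoundPool;

// Simplified sketch of the SoundPool setup; R.raw.player_shot is a placeholder name.
public class SfxPlayer {
    private final SoundPool soundPool;
    private int playerShotId;

    public SfxPlayer(Context context) {
        AudioAttributes attrs = new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_GAME)
                .setContentType(AudioAttributes.CONTENT_TYPE_SONIFICATION)
                .build();
        soundPool = new SoundPool.Builder()
                .setMaxStreams(8)
                .setAudioAttributes(attrs)
                .build();
        // load() is asynchronous; the real code waits for OnLoadCompleteListener before playing.
        playerShotId = soundPool.load(context, R.raw.player_shot, 1);
    }

    public void playPlayerShot() {
        // left volume, right volume, priority, loop count, playback rate
        soundPool.play(playerShotId, 1f, 1f, 1, 0, 1f);
    }
}
```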
Related
Whenever I go to sleep, I always have a playlist of YouTube videos playing in the background (on my phone, not a computer or TV). There are several videos or series that I would love to add to my "background noise" playlist; unfortunately, the volume of the audio wavers far too much, to the point that whatever volume I set on my phone is either so high that it risks waking me up shortly after I fall asleep, or so low that I cannot hear half of the video.
I have tried searching through various apps on the Google Play Store, but it seems most of them try to equalize volume based on frequency (such as treble and bass), and not intensity (in decibels). I am looking for an app (or if it exists, a setting in my phone) where I can set a limit of __ to __ decibels, and all audio coming out of my phone will be within that range (similar to noise gating but with output instead of input).
If anyone has a solution or app out there that would fit this description, please do let me know. Thank you in advance!
I have tried browsing around in the audio settings on my Galaxy S22+ but have not seen anything that would help. I have also downloaded about 5 or 6 different equalizer apps, but again, they all were based on frequency and not intensity/volume itself.
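For what it's worth, if anyone wanted to build such an app: since Android 9 (API 28) there is a DynamicsProcessing audio effect whose limiter stage works on level rather than frequency. Below is a minimal sketch with illustrative values only; note that an app can normally only attach effects to its own audio sessions, not to audio produced by other apps such as YouTube, which is probably why apps like this are rare.

```java
import android.media.audiofx.DynamicsProcessing;

// Sketch: cap the output level of a single audio session with the built-in limiter (API 28+).
// The sessionId must come from your own MediaPlayer/AudioTrack; the limiter settings below
// are illustrative, not tuned values.
public class OutputLimiter {
    public static DynamicsProcessing attachLimiter(int sessionId) {
        DynamicsProcessing.Config config = new DynamicsProcessing.Config.Builder(
                DynamicsProcessing.VARIANT_FAVOR_FREQUENCY_RESOLUTION,
                2,        // stereo
                false, 0, // no pre-EQ
                false, 0, // no multi-band compressor
                false, 0, // no post-EQ
                true)     // limiter in use
                .build();

        DynamicsProcessing dp = new DynamicsProcessing(0, sessionId, config);

        // inUse, enabled, linkGroup, attack (ms), release (ms), ratio, threshold (dB), post gain (dB)
        DynamicsProcessing.Limiter limiter = new DynamicsProcessing.Limiter(
                true, true, 0, 1f, 60f, 10f, -20f, 0f);
        dp.setLimiterAllChannelsTo(limiter);
        dp.setEnabled(true);
        return dp;
    }
}
```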
So with StageVideo you can play an H.264 .mp4 file, and in every example I found you need to have a fallback Video component.
The problem is that I was unable to play the .mp4 video files with the Video component on a mobile device, Android or iOS.
.flv works fine, but I can't include backup video files as they take too much space.
Is it really necessary to have the fallback to the Video component? What are the chances it will fail?
Thanks.
From my experiences (I've created 3 separate AIR VOD apps for both iOS and Android), the following is true:
StageVideo works on Android 4.0+. I was unable to get it to work with 3.x, but I have been told it works. I can, for sure, confirm that it does not work on 2.x.
StageVideo works on iOS 5+. On iOS 5, you will need to play a silent sound at startup to make sure sound works, but you should do that regardless, since the iPad 2 rarely plays sound without it. It is a known bug in AIR that, as far as I know, no one has ever attempted to fix.
iOS can only play h.264 MP4s through StageVideo and StageWebView. They will not work in Flash video players (including VideoDisplay, the base for Video and all OSMF-based players). I do not recall the exact reason for this, but I believe it has something to do with the MP4 requirement for hardware accelerated playback.
iOS can play FLV and, maybe, F4V through the Flash video players described in #3. This will lack hardware acceleration, however. That means your video and your UI will run on the same thread and share the same process. Basically, lower framerates while video is playing. Additionally, CPU decoding is a battery drain.
Android is a little more wild. You cannot use StageWebView for any playback as of Android 4.3 (I have not tested on 4.4 yet). You can use Flash video players for h.264 MP4s... on some devices. I've found that they seem to work fine on Android 3.0+ on all devices I have tested. Keep in mind that is only a couple dozen out of over a thousand possibilities, though. On 2.x, it is extremely hit-or-miss. It seems to work fine on the HTC and Motorola devices I've tested on, but I have had reports from users who cannot play back video on Samsung and Sony devices.
As you mentioned, a fallback player is definitely recommended. Without multiple sources/encode types, the fallback is useless on iOS, however. I currently have an app in the Play Store (All About Trikes) that was originally released without a fallback player and just used a StageVideo implementation. A day after release, we started getting reports that users on 2.x couldn't play videos. We had to scramble. We first released a version that couldn't be installed on 2.x and then another version that uses Flex's VideoDisplay as a fallback, which seems to have fixed the problem for those users, but I know there will be others that cannot play back video.
Long story short, there is no fool-proof way of playing back h.264 MP4s on mobile using AIR. You do want to include a fallback player, regardless of platform. Ideally, if you are streaming the video, you should have both h.264 MP4s and FLVs available with the fallback using FLVs instead of MP4s.
Hopefully that helps.
I'm new to the android platform, and I wanted to develop an app that runs in the background and reads the microphone input, applies a transformation to it, and outputs the resulting audio to the speaker.
I'm wondering if there is any lag perceived by the user in this process, or if it's possible to do it in near-realtime so that the user can hear the transformed audio in sync with the ambient audio. Thanks!
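To make it concrete, the kind of loop I have in mind looks roughly like the sketch below, built on AudioRecord and AudioTrack; the "transformation" step is just a pass-through placeholder here, and the RECORD_AUDIO permission would be required.

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;

// Mic -> transform -> speaker loop, meant to run on a background thread.
public class PassthroughLoop implements Runnable {
    private static final int SAMPLE_RATE = 44100;
    private volatile boolean running = true;

    @Override
    public void run() {
        int recBuf = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        int playBuf = AudioTrack.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, recBuf);
        AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT, playBuf, AudioTrack.MODE_STREAM);

        short[] buffer = new short[recBuf / 2];
        recorder.startRecording();
        player.play();
        while (running) {
            int read = recorder.read(buffer, 0, buffer.length);
            // Apply the transformation to buffer[0..read) here; this sketch passes it through unchanged.
            if (read > 0) {
                player.write(buffer, 0, read);
            }
        }
        recorder.stop();
        recorder.release();
        player.stop();
        player.release();
    }

    public void stopLoop() {
        running = false;
    }
}
```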
Yes, users will perceive a severe lag or echo with attempts at real-time audio on current unmodified Android devices using the provided APIs.
The summary is that Android devices are configured with fairly long audio buffers, reported to be somewhere in the range of 100 to 400 milliseconds, depending on the particular device and the Android OS version it is running. (Shorter buffers might be possible on Android devices for which one can build and install a modified custom build of the OS with your own custom audio drivers.)
(Humans perceive echoes at delays of roughly 25 ms or more. Audio buffers on iOS can be as short as 5.8 ms, so you may have better luck trying to develop your near-real-time audio processing on a different device platform.)
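If you want to see what a particular device reports, newer Android versions (API 17+) expose the native output buffer size and sample rate through AudioManager; one hardware buffer is only part of the total round-trip latency, but it gives a rough lower bound:

```java
import android.content.Context;
import android.media.AudioManager;

// Estimate the duration of one native output buffer in milliseconds (API 17+).
public class LatencyProbe {
    public static double outputBufferMillis(Context context) {
        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        String framesPerBuffer = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
        String sampleRate = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
        if (framesPerBuffer == null || sampleRate == null) {
            return -1; // property not reported on this device/OS version
        }
        double frames = Double.parseDouble(framesPerBuffer);
        double rate = Double.parseDouble(sampleRate);
        // Total round-trip (mic -> processing -> speaker) latency will be several times this value.
        return 1000.0 * frames / rate;
    }
}
```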
Audio processing on Android isn't all that great; in fact, to be honest, it sucks. The out-of-the-box latency on Android devices for this kind of thing is pretty awful. You can, however, tinker with the NDK and try to put together something based on OpenSL ES, which will have significantly lower latency.
There is a similar StackOverflow question: Playing back sound coming from microphone in real-time
Some other helpful links:
http://arunraghavan.net/2012/01/pulseaudio-vs-audioflinger-fight/
http://www.musiquetactile.fr/android-is-far-behind-ios/
http://www.geardiary.com/2012/02/21/the-dismal-state-of-android-as-a-music-production-solution/
On the other side of the coin, Android mic quality is way better than iOS quality. I have a Galaxy S4 and a very low-end Huawei phone, and both have wonderful mic quality when recording.
I have made a music visualizer app for Android. While developing on my HTC Legend (running Android 2.2), I noticed that setting the "media volume" of the phone had no effect on the output of the Visualizer class, i.e. I always got the full-volume amplitude data of the playing music, regardless of the volume setting, which was great because that's precisely what I want.
I have recently purchased an Asus Eee Pad Transformer tablet, running Android 3.2, and now the user-set volume DOES affect the data I get back from the Visualizer class.
Does anyone know what the official behaviour should be? I'd hope for volume independence, but the evidence I've seen points to inconsistent behaviour across different devices...
Is this a driver issue, or has the behaviour changed in 3.2?
Thanks!
Nils
Refer to this link here. I think you have not enabled the equalizer and the visualizer for the same session ID. As far as I know, audio effects need the equalizer engine enabled for the audio modification settings to apply; otherwise the Visualizer gives higher values and is also affected by the media volume.
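If I understand that suggestion correctly, it amounts to attaching an Equalizer and the Visualizer to the same audio session ID, roughly as in the sketch below (R.raw.song is a placeholder); whether the captured data then ignores the media volume still seems to vary by device.

```java
import android.content.Context;
import android.media.MediaPlayer;
import android.media.audiofx.Equalizer;
import android.media.audiofx.Visualizer;

// Attach an Equalizer and a Visualizer to the same audio session, as the answer suggests.
// Visualizer requires the RECORD_AUDIO permission.
public class VisualizerSetup {
    public static void start(Context context) {
        MediaPlayer player = MediaPlayer.create(context, R.raw.song); // placeholder resource
        int sessionId = player.getAudioSessionId();

        Equalizer equalizer = new Equalizer(0, sessionId);
        equalizer.setEnabled(true);

        Visualizer visualizer = new Visualizer(sessionId);
        visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
        visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
            @Override
            public void onWaveFormDataCapture(Visualizer v, byte[] waveform, int samplingRate) {
                // waveform holds the captured amplitude data
            }

            @Override
            public void onFftDataCapture(Visualizer v, byte[] fft, int samplingRate) {
                // fft holds the captured frequency data
            }
        }, Visualizer.getMaxCaptureRate() / 2, true, true);
        visualizer.setEnabled(true);

        player.start();
    }
}
```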
OK, so there are a bazillion different Android devices. I have a video streaming service that works wonderfully for iOS. My app has a live video feature and a saved video clip playback feature (which also streams to the device). I've run some tests on different Android devices and get a whole bunch of different playback results. I am using a 640x480 h.264 baseline profile video. Streaming that video works only on some devices. For other devices, that same video can be streamed at low resolution, which works on some devices, but still not others. The high profile streaming goes through http://www.wowzamedia.com/ (RTSP) and doesn't work on any Android device (but works on iPhone). The lowest and worst option is Motion JPEG, which works on all devices tested so far.
So my question is: how can I figure out (without having to test every device on the market) whether the device will play 640x480 h.264 baseline profile video; if that won't work, play the low-resolution video; and if that doesn't work, default to Motion JPEG?
Also, any idea why my RTSP stream transcoded through Wowza works on the iPhone but not on any Android device (not even the Motorola Atrix)?
Streaming on Android is an absolute mess. Most devices don't support anything higher than Baseline 3.0. If you encode for the iPhone 3, it should generally work via RTSP. Newer versions of Android support HLS, but it's hit or miss and largely dependent on the specific device.
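On newer devices (Android 4.1+, so it won't help with the 2.x crowd), you can at least query the codec list for an AVC decoder and check which profile/level pairs it advertises before deciding which stream to request; a rough sketch:

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

// Returns true if the device advertises an H.264 (AVC) decoder that claims
// at least Baseline profile at the given level. API 16+ only.
public class AvcSupport {
    public static boolean hasBaselineDecoder(int minLevel) {
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (info.isEncoder()) {
                continue;
            }
            for (String type : info.getSupportedTypes()) {
                if (!type.equalsIgnoreCase("video/avc")) {
                    continue;
                }
                MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                for (MediaCodecInfo.CodecProfileLevel pl : caps.profileLevels) {
                    if (pl.profile == MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline
                            && pl.level >= minLevel) {
                        return true;
                    }
                }
            }
        }
        return false;
    }
}
```

For example, hasBaselineDecoder(MediaCodecInfo.CodecProfileLevel.AVCLevel3) checks for Baseline 3.0 support. Keep in mind that a decoder showing up in this list only means the device claims it can decode that format; it says nothing about whether the RTSP/HLS stack will actually negotiate the stream, so keeping the Motion JPEG fallback is still wise.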
I resolved this problem. Check the RTP implementation in your streaming service and the x264 profile. My RTSP server works fine on 90% of devices.
P.S.
Some video frameworks in different Android versions implement the RTP and RTSP protocols with slight differences.
These are some of the links/issues I have come across while trying to make streaming work on a variety of devices.
MediaPlayer seekTo doesn't work for streams
MediaPlayer resets position to 0 when started after seek to a different position
MediaPlayer seekTo inconsistently plays songs from beginning
Basic streaming audio works in 2.1 but not in 2.2
MediaPlayer.seekTo() does not work for unbuffered position
Streaming video when seek back buffering start again in videoView/Mediaplayer
Even the big shots on Stack Overflow are wondering about this
If you just want streaming without seeking (which is lame), it can be achieved. But then if you receive a call while you are watching, playback will start over from the beginning.