Kivy Play Audio Not Working - android

I'm attempting to run Kivy's audio example found here. The specific example is this:
from kivy.core.audio import SoundLoader

sound = SoundLoader.load(filename='test.wav')
if not sound:
    # unable to load this sound?
    pass
else:
    # sound loaded, let's play!
    sound.play()
However, when I attempt to play any .wav file in my directory, I just hear a click or pop. I've tried different .wav files and get the same problem. I've also tried printing the sound's length, but it returns 0 for some reason. Any feedback or input from you guys? Thanks again.

I can't say, just looking at the snippet above, how you've implemented it in your application or what's wrong.
There is a more elaborate example in the examples directory under your Kivy install directory that should clear up how to use SoundLoader.
Hope that helps.
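In the meantime, here is a minimal sketch (not taken from the Kivy examples, just an illustration) of wiring SoundLoader into an App, using the placeholder filename test.wav from your snippet:

from kivy.app import App
from kivy.core.audio import SoundLoader
from kivy.uix.button import Button


class AudioTestApp(App):
    def build(self):
        # Load the sound once at startup; SoundLoader picks an audio
        # provider based on what is available on the platform.
        self.sound = SoundLoader.load('test.wav')
        return Button(text='Play sound', on_press=self.play_sound)

    def play_sound(self, *args):
        if self.sound:
            print('Length: %s seconds' % self.sound.length)
            self.sound.play()
        else:
            print('Sound could not be loaded')


if __name__ == '__main__':
    AudioTestApp().run()

If the length still comes back as 0 here, the problem likely lies with the audio provider on your platform rather than with the code itself.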

Related

Record audio and play it simultaneously

I'm not sure if this is the right place to ask this... sorry in advance if it isn't.
Currently I have an app that can record audio with the mic and afterwards plays the file.
What I am wondering now is whether it is possible to play the recorded audio directly. So like a phone call, except that the audio is always on one device and is recorded and played back from that same device.
Is that possible? Maybe with streaming or something? Unfortunately I am a noob.
I would be very happy about an answer. Thanks in advance.
It is possible to play the audio directly with a streaming approach; the data is saved in the background at the same time. To get the basic knowledge, I suggest taking a look at the examples explained in the Android Studio documentation. This is a tough project for a beginner, so go step by step: first try to record audio, then stream it with MediaPlayer while saving it, and the final step is putting it all together. Cheers.

Libgdx audio wav file not playing soundpool not ready?

I am trying to add sound to my Libgdx game using a wav file. It is supposed to play in the background of the main menu and loop, but for some reason it does not play. Does anyone have any insight into why this may be? I also noticed in the logcat that SoundPool is not ready. I have added the following code to the constructor of my screen, and the file is located in my assets folder. Thanks.
Sound wavSound = Gdx.audio.newSound(Gdx.files.internal("sound.wav"));
wavSound.play();
Make sure the file extension is named correctly and that the file is encoded in the proper format.
Renaming sound.mp3 to sound.wav will trigger a SoundPool: Not Ready error.
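If the file is actually an MP3 with a renamed extension, re-encoding it into a real WAV (for example with ffmpeg, if you have it installed) avoids the problem:

ffmpeg -i sound.mp3 sound.wav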

Gstreamer streaming over udp

I have ported GStreamer to Android and I am using Eclipse (Juno). I am able to receive an audio stream over UDP, but when I try to give any video clip as input, I get the error amcaudiodec-omxgoogleacdecoder - Gstreamer encountered an internal library error. I don't know how to solve this decoder problem. Any idea what the error could be?
Also, I wanted to know the difference between the playbin and playbin2 plugins. Can anyone please explain?
You would have to add the "androidmedia" plugin to the Android.mk file, as per the plugins.mk file.
I suspect that, since you have not included this in your Android.mk, playbin is trying to invoke this element internally, hence the error in the log.
I'm not sure about this, but your problem may be that you cannot use a pipeline meant for audio files to play a video file.
Video files are usually in a container format such as .avi or .mkv, which needs to be demultiplexed into separate audio and video streams. Once you have the audio stream, you can apply your audio pipeline to play it.
Here's an example pipeline that plays the audio while ignoring the video (try it on the command line):
gst-launch filesrc location=test.mp4 ! qtdemux ! faad ! audioconvert ! audioresample ! autoaudiosink
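If you want the video rendered as well, you can let decodebin do the demuxing and decoding and branch out to both sinks. Something along these lines should work on GStreamer 0.10 (the exact element names may differ on your build):

gst-launch filesrc location=test.mp4 ! decodebin name=dec dec. ! queue ! audioconvert ! audioresample ! autoaudiosink dec. ! queue ! ffmpegcolorspace ! autovideosink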

Android combine audio and video file with ffmpeg programmatically

As the title mentions, I want to combine an audio file with a video file programmatically on the Android platform. If you guys have any ideas about it, thanks for your reply!
First, build ffmpeg for Android: http://trac.ffmpeg.org/wiki/How%20to%20compile%20FFmpeg%20for%20Android
Then you can invoke it with arguments by calling the main function in ffmpeg.c and it will be like running it normally, except it will actually run in your app's process.
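For reference, the arguments you pass to that main function are the same ones you would use on the command line. A typical mux that copies both streams without re-encoding looks something like this (the file names are placeholders, and it assumes the audio is already in an MP4-compatible codec such as AAC):

ffmpeg -i video.mp4 -i audio.aac -map 0:v:0 -map 1:a:0 -c copy -shortest output.mp4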

playing video using monkeyrunner

I am playing an mp4 file using monkeyrunner's startActivity function, and it is working. Now, I want to know whether the video is still playing or has ended, but monkeyrunner doesn't wait for the video (or any other activity) to finish or "return". How can I know whether the video has finished or not?
You'll need to look at the Android debug log in a sub-process. Hopefully the app that you are using to play the mp4 will have some type of log output when it finishes playing the file.
If you play the video file with the Gallery app, when you start playing the file, the logcat will show:
Displayed com.android.gallery3d/.app.MovieActivity
After the app finishes playing the file, the logcat will show:
Displayed com.android.gallery3d/.app.Gallery
You can look for these strings in the logcat.
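Here is a minimal sketch of one way to do that from the monkeyrunner (Jython) script, assuming adb is on your PATH: clear the log, start the video as usual, then poll the dumped log for the second string:

import subprocess
import time

# Clear the log before starting the video with device.startActivity(...).
subprocess.call(['adb', 'logcat', '-c'])

# ... start the MovieActivity here ...

# Poll the dumped log until the Gallery app returns from the player.
while True:
    log = subprocess.Popen(['adb', 'logcat', '-d'],
                           stdout=subprocess.PIPE).communicate()[0]
    if 'Displayed com.android.gallery3d/.app.Gallery' in log:
        print('video finished')
        break
    time.sleep(1)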
As a side note, there are some system properties that you can check using monkeyrunner:
device.getProperty('am.current.package')
device.getProperty('am.current.action')
These show the package name of the currently running package and the current activity's action. Unfortunately, when I used them, they returned a blank string, not the names I wanted.
I hope this is useful.
