Is there a good way to pan stereo audio in an Android app? I have been looking for decent audio libraries but have yet to come up with anything solid.
MediaPlayer.setVolume allows you to set the volume of the left and right channels independently.
For more details, have a look at this SO question.
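If you want an actual pan control, you can wrap that call yourself. A minimal sketch, assuming an already prepared MediaPlayer; the convention of -1.0 for hard left to +1.0 for hard right is just a choice made here, not part of the API:

import android.media.MediaPlayer;

public class StereoPanner {
    // Maps a pan value in [-1, 1] (-1 = hard left, 0 = center, +1 = hard right)
    // onto MediaPlayer.setVolume(leftVolume, rightVolume).
    public static void setPan(MediaPlayer player, float pan) {
        float clamped = Math.max(-1f, Math.min(1f, pan));
        // Constant-power pan law so the overall loudness stays roughly even
        // as the sound moves across the stereo field.
        double angle = (clamped + 1f) * Math.PI / 4.0; // 0 .. pi/2
        float left = (float) Math.cos(angle);
        float right = (float) Math.sin(angle);
        player.setVolume(left, right);
    }
}

Calling StereoPanner.setPan(player, -1f) should give you audio only in the left channel, and setPan(player, 0f) puts it back in the center.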
I want to control the zoom or focus of my Android phone's camera using the audio jack. It is common to use the audio jack to take a picture, essentially as a remote shutter. It is done by connecting the mic port and the ground as shown in picture 2 (I took it from here; it is not Android, but the same idea is probably possible with Android). Is it possible to do zoom or focus using the audio jack, say by manipulating or rearranging it somewhat as explained in picture 1? Picture 1 is taken from here.
The Android specification for the audio jack gives you the input/key events that can be generated by the headset buttons. Your camera application should listen for these events and take the appropriate action.
How to create hardware that generates those events is described here.
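On the app side, those button presses arrive as ordinary key events, so the camera Activity can intercept them. A rough sketch, assuming the headset buttons on your device map to the volume keys and the headset-hook key; zoomIn(), zoomOut() and triggerAutofocus() are hypothetical stand-ins for your own camera code:

import android.app.Activity;
import android.view.KeyEvent;

public class CameraControlActivity extends Activity {

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        switch (keyCode) {
            case KeyEvent.KEYCODE_VOLUME_UP:    // headset "volume up" button
                zoomIn();
                return true;                    // consume so the system volume UI stays out of the way
            case KeyEvent.KEYCODE_VOLUME_DOWN:  // headset "volume down" button
                zoomOut();
                return true;
            case KeyEvent.KEYCODE_HEADSETHOOK:  // single play/pause-style button
                triggerAutofocus();
                return true;
            default:
                return super.onKeyDown(keyCode, event);
        }
    }

    private void zoomIn() { /* drive your camera's zoom here */ }
    private void zoomOut() { /* drive your camera's zoom here */ }
    private void triggerAutofocus() { /* call autofocus on your camera object here */ }
}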
I have 5 audio files, each 5 seconds long. I want to play them one by one, but with the condition that the next file starts 4 seconds after the current one begins, i.e. adjacent audio files should overlap for 1 second. How can I implement this? Which audio player would you suggest?
The amount of fine-grained work you can do with audio playback on the Java side is pretty limited on Android.
It sounds like you will need to mix your sounds at some point during playback so they can overlap.
The best way to do this, in my head, is through a C++ library called Oboe (I am currently working with it). This is a library created by Google for audio playback. Now hold on, let me explain! I know implementing C++ (especially if you're only on the Java stack right now) can add a bit of time to your project.
The reason this came to mind is that with this way of playing audio (through Oboe/C++), you move the raw samples of the audio through a buffer stream yourself. The C++ side also has a Mixer class that you can feed two different audio clips into (up to 100, actually) to mix, and then eventually render through the buffer stream.
Using this approach, you can add logic to control when each clip starts playing (4 seconds after the previous one, per your condition), at which point you mix the first second of the next clip with the last second of the currently playing clip.
Now the exciting bit is that you may be able to replicate this process in Java! I found this post, which may be of help to you:
Android: How to mix 2 audio files and reproduce them with soundPool
Now I do warn you, rendering audio this way (through buffer streams) is a complicated process, and some extra research may be needed to fully understand it. I can't say I know all of the functionality of the Java audio libraries, but I'm willing to bet they don't have much support for mixing sound in the way you need. So most likely you will have to mix it yourself (a rough sketch of that idea is below), or as a last resort use the NDK (C++).
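If you do end up mixing on the Java side, the core of it is just summing samples at an offset. A rough sketch, assuming you have already decoded your clips to mono 16-bit PCM at 44.1 kHz (for example with MediaExtractor/MediaCodec, which is not shown here):

import android.media.AudioAttributes;
import android.media.AudioFormat;
import android.media.AudioTrack;

public class OverlapMixer {

    private static final int SAMPLE_RATE = 44100; // assumed format: mono, 16-bit, 44.1 kHz

    // Lays clipB on top of clipA starting at startSec seconds into clipA
    // (e.g. 4.0f for a 1-second overlap between two 5-second clips).
    public static short[] mixWithOverlap(short[] clipA, short[] clipB, float startSec) {
        int offset = (int) (startSec * SAMPLE_RATE);
        short[] out = new short[Math.max(clipA.length, offset + clipB.length)];
        System.arraycopy(clipA, 0, out, 0, clipA.length);
        for (int i = 0; i < clipB.length; i++) {
            int sum = out[offset + i] + clipB[i];
            // Clamp to the 16-bit range to avoid wrap-around distortion where the clips overlap.
            out[offset + i] = (short) Math.max(Short.MIN_VALUE, Math.min(Short.MAX_VALUE, sum));
        }
        return out;
    }

    // Plays the mixed buffer in one shot through a static AudioTrack.
    public static void play(short[] pcm) {
        AudioTrack track = new AudioTrack.Builder()
                .setAudioAttributes(new AudioAttributes.Builder()
                        .setUsage(AudioAttributes.USAGE_MEDIA)
                        .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                        .build())
                .setAudioFormat(new AudioFormat.Builder()
                        .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                        .setSampleRate(SAMPLE_RATE)
                        .setChannelMask(AudioFormat.CHANNEL_OUT_MONO)
                        .build())
                .setTransferMode(AudioTrack.MODE_STATIC)
                .setBufferSizeInBytes(pcm.length * 2) // 2 bytes per 16-bit sample
                .build();
        track.write(pcm, 0, pcm.length); // MODE_STATIC: write the whole buffer before play()
        track.play();
    }
}

For your 5 clips you would fold them in one after another, each offset 4 seconds after the previous one, and then play the single mixed buffer.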
Hopefully this answer helps. Best wishes in your research! Hopefully you will find a simple way that works. (If you do, don't forget to share your findings on this question!)
I am developing a sample app in Unity (as a beginner) and I am stuck at a point: I need to change the recorded sound (like in the Talking Tom app).
I am done with recording the audio, but when I increase the pitch of the sound, the playback speed also changes. I need the playback speed to stay normal; only the pitch should change.
Can anyone help me with this issue?
Thanks in advance
After a bit of researching, I found that what you're trying to do is called "pitch shifting". It apparently involves a lot of math and mucking about with the audio samples, because changing the pitch of a sound automatically changes its playback speed. Getting it back to the speed you want while still keeping the audio sounding "normal" is no walk in the park.
In any case, since Unity3D uses C#, you might (and I stress the word might) be able to use this open source library to get the sound effect you need. It's based on NAudio (also open source, and C#) so you should theoretically be able to use it, or parts of it in your project.
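One more option worth mentioning, though it sits outside Unity itself: if your app ultimately runs on Android and you can drop to the native layer, MediaPlayer (API 23 and up) exposes pitch and speed as independent playback parameters. A minimal sketch of that alternative, not the NAudio route above; R.raw.recorded_voice is a placeholder for your recorded clip:

import android.content.Context;
import android.media.MediaPlayer;
import android.media.PlaybackParams;

public class PitchShiftDemo {
    // Plays a recorded clip at a higher pitch without changing its speed.
    // R.raw.recorded_voice refers to your app's own resource (placeholder name).
    public static void playShifted(Context context) {
        MediaPlayer player = MediaPlayer.create(context, R.raw.recorded_voice);
        PlaybackParams params = player.getPlaybackParams();
        params.setPitch(1.3f); // > 1.0 raises the pitch, < 1.0 lowers it
        params.setSpeed(1.0f); // playback speed stays normal
        player.setPlaybackParams(params);
        player.start();
    }
}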
Hi, I'm using the game engine AndEngine, and I want to be able to stream live video from a webcam on a robot to my Android app. The reason I'm using AndEngine is that I need game controls to drive my robot. However, I have no idea how to stream video when using AndEngine (or even when not using it, for that matter). The controls and video feed need to be on the same screen (unless there's absolutely no other way). My question is: how would one put a video stream on top of an AndEngine scene, and/or how would one format that feed so that it doesn't obscure the controls? (They're oriented in the bottom left and top right of the screen, which is a pain, I know, but I don't think I can change that due to some multi-touch problems on my device.)
Thanks.
Look at the Augmented Reality example on GitHub.
https://github.com/nicolasgramlich/AndEngineExamples
It could be of use to you. However, I know that this example was problematic and didn't work when I tried it, but maybe you'll have more luck.
I've been asking around in developer circles about this, but so far no one has been able to confirm if it is possible. The default behavior for recording video in Android is to not use the external mic. I'd like to create an app that uses an external mic if it is available, but it seems like this might be tricky for some reason. Anyone have insight into this?
It seems like it would just be a matter of selecting it at this point in the recording setup:
recorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
but it seems like there may be some oddness in doing that.
Thanks,
Jon
Default behavior in Android 12 (and, as I understand it, starting at 8 or 9) is that an external microphone becomes the primary and default source for input.
It probably depends on the app, but at least video in the default camera app uses my external mic by default.
Take a look at the documentation here (in the section titled "Performing Audio Capture"); it covers setting up audio capture.
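A minimal sketch of that capture setup; the output path is just a placeholder, and which physical microphone MIC (or CAMCORDER) actually routes to is decided by the device's audio policy, so it's worth testing on your hardware:

import android.media.MediaRecorder;
import java.io.IOException;

public class AudioCaptureSketch {
    // Basic recorder setup in the order the "Performing Audio Capture" docs describe.
    public static MediaRecorder startCapture() throws IOException {
        MediaRecorder recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC); // try MIC or CAMCORDER instead of DEFAULT
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setOutputFile("/sdcard/capture.3gp"); // placeholder path
        recorder.prepare();
        recorder.start();
        return recorder;
    }
}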