I tried to encode MP3 audio on an Android device using the LAME encoder. It worked, but there are several settings and parameters for encoding, such as sample rate and bitrate. I searched for what they mean and what values are possible, but I couldn't find anything useful. Can anyone help me with them?
Here they are:
public static final int NUM_CHANNELS = 1;    // number of input channels (1 = mono, 2 = stereo)
public static final int SAMPLE_RATE = 16000; // input sample rate in Hz
public static final int BITRATE = 128;       // output bitrate in kbps
public static final int MODE = 1;            // channel mode (in lame.h: 0 = stereo, 1 = joint stereo, 2 = dual channel, 3 = mono)
public static final int QUALITY = 2;         // algorithm quality: 0 = best/slowest ... 9 = worst/fastest
To find out more about these parameters, take a look at http://en.wikipedia.org/wiki/MP3 first. In short: the sample rate is how many PCM samples per second the input contains; the bitrate is how many kilobits each second of encoded audio occupies (higher means better quality but bigger files); the mode selects the channel layout (stereo, joint stereo, dual channel, or mono); and the quality setting trades encoding speed against accuracy. You may need to check more specific details in the MP3 codec documentation (for LAME, the comments in lame.h), too.
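As a rough illustration of how the bitrate setting relates to output size (a made-up helper, not part of the LAME API): for a constant-bitrate MP3, the encoded size depends only on bitrate and duration, not on the input sample rate.

```java
// Back-of-the-envelope sizing for constant-bitrate MP3 output.
// Illustrative helper only; not part of any encoder API.
public class Mp3Size {
    /** Approximate encoded size in bytes: kilobits/sec -> bytes/sec x duration. */
    static long encodedBytes(int bitrateKbps, int seconds) {
        return (long) bitrateKbps * 1000 / 8 * seconds;
    }

    public static void main(String[] args) {
        // One minute at 128 kbps is roughly 960,000 bytes (~0.96 MB).
        System.out.println(encodedBytes(128, 60));
    }
}
```

This is also why raising SAMPLE_RATE alone does not grow the file; only BITRATE and duration do.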
Related
I'm working on a recorder app on iOS, and a friend of mine is doing the same on Android. When he plays the sound URL I upload to the server (which plays fine when I play it), he hears a lot of noise instead of my actual recording.
func setupRecorder() {
    let myID = userDefaults.value(forKey: "user_id") as! Int
    let message_token = generateUniqueToken()
    self.curentFileName = message_token
    let currentFileName = "\(self.curentFileName!).\(myID).m4a"
    let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    self.soundFileURL = documentsDirectory.appendingPathComponent(currentFileName)
    print("writing to soundfile url: '\(soundFileURL!)'")

    // Use .path, not .absoluteString, when checking the file system.
    if FileManager.default.fileExists(atPath: soundFileURL.path) {
        print("soundfile \(soundFileURL.path) exists")
    }

    let recordSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatAppleLossless,
        AVEncoderAudioQualityKey: AVAudioQuality.max.rawValue,
        AVEncoderBitRateKey: 320000,
        AVNumberOfChannelsKey: 2,
        AVSampleRateKey: 44100.0
    ]

    do {
        soundRecorder = try AVAudioRecorder(url: soundFileURL, settings: recordSettings)
        soundRecorder.delegate = self
        soundRecorder.isMeteringEnabled = true
        soundRecorder.prepareToRecord()
    } catch {
        soundRecorder = nil
        print(error.localizedDescription)
    }
}
These are my recorder settings...
All I have from him is:
private static final int SAMPLE_PER_SEC = 8000;                            // 8 kHz sample rate
private static final int CHANNEL_TYPE = AudioFormat.CHANNEL_IN_MONO;       // mono input
private static final int AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;  // 16-bit PCM samples
private static int BUFFER_ELEMENT_TO_REC = 1024; // 1024 shorts per read = 2048 (2K) bytes, since each element is 2 bytes
private static int BYTES_PER_ELEMENT = 2;
He says I'm encoding my sound somehow while he's doing a raw recording. I have no idea where I'm messing it up, since this is my first time with AVFoundation.
Instead of using kAudioFormatAppleLossless, change it to kAudioFormatLinearPCM. kAudioFormatAppleLossless is a format only for Apple devices. It is better to use a format common to both Android and iOS devices, so that if there is any processing of the audio on the server, the server doesn't have to handle the audio in two different ways.
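One thing worth checking regardless of codec: the two sides above don't even agree on PCM parameters (44100 Hz stereo 16-bit on iOS vs 8000 Hz mono 16-bit on Android), and interpreting raw samples with the wrong rate or channel count is a classic source of static. A quick byte-rate sanity check (an illustrative helper, not part of any Android or iOS API):

```java
// Byte-rate sanity check: raw PCM read back with the wrong sample rate
// or channel count plays as noise or at the wrong speed.
// Illustrative helper only.
public class PcmRate {
    /** Bytes of raw PCM produced per second of audio. */
    static int bytesPerSecond(int sampleRate, int channels, int bytesPerSample) {
        return sampleRate * channels * bytesPerSample;
    }

    public static void main(String[] args) {
        int ios = bytesPerSecond(44100, 2, 2);     // the iOS settings above
        int android = bytesPerSecond(8000, 1, 2);  // the Android settings above
        System.out.println(ios + " vs " + android); // 176400 vs 16000
    }
}
```

If the two byte rates differ like this, a shared compressed container (or at least agreed-upon PCM parameters) is needed on both ends.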
I noticed in my crash reporter I had several crashes on some devices with stack trace:
Fatal Exception: java.lang.RuntimeException: takePicture failed, error=-38
at android.hardware.Camera.native_takePicture(Camera.java)
at android.hardware.Camera.takePicture(Camera.java:1728)
at android.hardware.Camera.takePicture(Camera.java:1661)
I know this is a common error and it can have many causes, but it's one of the first times I've gotten an error number. Where can I find a list of these error numbers and their meanings?
There can be many reasons for this. In my case I was trying to take a photo without a preview (a hidden photo) and I was using a SurfaceView, so I replaced it with:
SurfaceTexture surfaceTexture = new SurfaceTexture(10);
camera.setPreviewTexture(surfaceTexture);
and the problem was solved...
P.S. I was getting this error only on devices above Android 6.0.
I just found a list of errors in the file Camera.java:
private static final int NO_ERROR = 0;
private static final int EACCESS = -13;    // permission denied
private static final int ENODEV = -19;     // no such device
private static final int EBUSY = -16;      // device or resource busy
private static final int EINVAL = -22;     // invalid argument
private static final int ENOSYS = -38;     // function not implemented
private static final int EUSERS = -87;     // too many users
private static final int EOPNOTSUPP = -95; // operation not supported
This post is also related to my question: MediaRecorder start error codes
Not very useful, though...
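Since those constants are just negated Linux errno values, a small hypothetical helper (names taken straight from the Camera.java list above, not from any official API) can translate a takePicture failure code into something readable for logging:

```java
import java.util.Map;

// Hypothetical helper: map Camera.java's negated-errno error codes to names.
public class CameraErrors {
    private static final Map<Integer, String> NAMES = Map.of(
            0, "NO_ERROR",
            -13, "EACCESS (permission denied)",
            -16, "EBUSY (device or resource busy)",
            -19, "ENODEV (no such device)",
            -22, "EINVAL (invalid argument)",
            -38, "ENOSYS (function not implemented)",
            -87, "EUSERS (too many users)",
            -95, "EOPNOTSUPP (operation not supported)");

    static String describe(int code) {
        return NAMES.getOrDefault(code, "unknown error " + code);
    }

    public static void main(String[] args) {
        // The code from the stack trace above:
        System.out.println(describe(-38)); // ENOSYS (function not implemented)
    }
}
```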
I am trying to develop an application which continuously plays one of several predefined arrays of sounds and also lets me change the tempo and pitch individually, using Android's SoundPool class.
I know how to change the pitch of a sound.
But I do not know how to play the whole array, nor how to change the tempo of the whole array.
Please help!
Thank you in advance!
EDIT: I think I will have to use a Handler in order to play the array of sounds, but I don't know how!
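The Handler idea works: play one sound from the array with SoundPool.play(), then schedule the next one with postDelayed(). The only non-obvious part is turning a tempo into a delay, and that logic is plain Java; the SoundPool/Handler wiring is Android-only and appears only in the comment below as a sketch.

```java
// Sketch of the scheduling math behind a Handler-driven SoundPool loop.
// On Android the loop would look roughly like:
//   handler.postDelayed(this::playNext, delayMillisPerBeat(bpm));
// where playNext() calls soundPool.play(soundIds[i++ % soundIds.length], ...)
// and then re-posts itself. Changing bpm changes the tempo of the whole array.
public class TempoClock {
    /** Milliseconds between successive sounds for a tempo in beats per minute. */
    static long delayMillisPerBeat(double bpm) {
        return Math.round(60_000.0 / bpm);
    }

    public static void main(String[] args) {
        System.out.println(delayMillisPerBeat(120)); // 500 ms between sounds
        System.out.println(delayMillisPerBeat(60));  // 1000 ms
    }
}
```

Pitch can still be changed per sound via SoundPool's rate parameter; the Handler delay controls the tempo of the sequence independently.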
public class Sound {
private static int length = 22050 * 10; //10 seconds long
private static byte[] data = new byte[length];
static void fillRandom() {
new Random().nextBytes(data); //Create some random noise to listen to.
}
static void play() {
fillRandom();
final int TEST_SR = 22050; //This is from an example I found online.
final int TEST_CONF = AudioFormat.CHANNEL_OUT_MONO;
final int TEST_FORMAT = AudioFormat.ENCODING_PCM_16BIT;
final int TEST_MODE = AudioTrack.MODE_STATIC; //I need static mode.
final int TEST_STREAM_TYPE = AudioManager.STREAM_ALARM;
AudioTrack track = new AudioTrack(TEST_STREAM_TYPE, TEST_SR, TEST_CONF, TEST_FORMAT, length, TEST_MODE);
track.write(data, 0, length);
track.play();
}
}
This code plays some random noise. All you have to do is fill the byte[] data array with the actual audio data you intend to play...
Solved: I forgot the track.play(); at the end...
I want to play a sound on my Android Smartphone (4.0.4 Api level 15).
I tried to generate some random noise to listen to, but it's not working:
public class Sound {
private static int length = 22050 * 10; //10 seconds long
private static byte[] data = new byte[length];
static void fillRandom() {
new Random().nextBytes(data); //Create some random noise to listen to.
}
static void play() {
fillRandom();
final int TEST_SR = 22050; //This is from an example I found online.
final int TEST_CONF = AudioFormat.CHANNEL_OUT_MONO;
final int TEST_FORMAT = AudioFormat.ENCODING_PCM_16BIT;
final int TEST_MODE = AudioTrack.MODE_STATIC; //I need static mode.
final int TEST_STREAM_TYPE = AudioManager.STREAM_ALARM;
AudioTrack track = new AudioTrack(TEST_STREAM_TYPE, TEST_SR, TEST_CONF, TEST_FORMAT, length, TEST_MODE);
track.write(data, 0, length);
}
}
I have played around a little with the variables, but could not get it to work.
All you have left to do is play it. Add this line to the end of your play() function:
track.play();
I am trying to develop an app which detects a whistle sound. I have used the musicg library, which works fine, but the problem is that it does not work on all devices. The way the library works is that a recorder thread continuously records audio in the background while a detector thread matches the recorded buffer via the whistleApi.isWhistle(buffer) method.
The problem is that this method always returns false on some specific devices, while it correctly returns true on Samsung and Google Nexus devices.
Can anybody tell me what I should do to resolve this issue? It is very important for my project. Help me out!
In the RecorderThread class, change AudioFormat.CHANNEL_CONFIGURATION_MONO to AudioFormat.CHANNEL_IN_MONO and set frameByteSize = 4096:
private AudioRecord audioRecord;
private boolean isRecording;
private int channelConfiguration = AudioFormat.CHANNEL_IN_MONO;//CHANNEL_CONFIGURATION_MONO
private int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
private int sampleRate = 44100;//44100
private int frameByteSize = 4096; // for 1024 fft size (16bit sample size)//4096
byte[] buffer;
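A quick way to sanity-check these numbers on any device is to compute how much audio each recorded frame actually covers: at 44100 Hz, 16-bit mono, a 4096-byte frame holds 2048 samples, about 46 ms of audio. A sketch of that arithmetic (an illustrative helper, not part of musicg):

```java
// How much audio does one recorded frame cover?
// Illustrative helper; the parameters mirror the constants above.
public class FrameMath {
    /** Number of 16-bit samples in a frame of the given byte size. */
    static int samplesPerFrame(int frameByteSize, int bytesPerSample) {
        return frameByteSize / bytesPerSample;
    }

    /** Frame duration in milliseconds for mono 16-bit PCM. */
    static double frameMillis(int frameByteSize, int sampleRate, int bytesPerSample) {
        return 1000.0 * samplesPerFrame(frameByteSize, bytesPerSample) / sampleRate;
    }

    public static void main(String[] args) {
        System.out.println(samplesPerFrame(4096, 2));    // 2048 samples
        System.out.println(frameMillis(4096, 44100, 2)); // ~46.4 ms
    }
}
```

If a device forces a different sample rate (via AudioRecord.getMinBufferSize returning an error for 44100 Hz), the frame duration the detector sees changes, which is one plausible reason detection behaves differently across devices.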