How to calculate FrameLength without using AudioSystem? - android

I'm porting a Java project to Android.
As you know, Android does not have the javax.sound package.
I need to calculate the frame length.
My sound file is 283 KB, the frame size is 4 bytes, the frame rate is 44100 Hz, and the sample size is 16 bits.
The frame length was 69632 when I used pure Java (AudioSystem).
Do you know an equation to get this?
Thank you.

For raw PCM data you basically have 4 bytes per frame (2 channels × 2 bytes per 16-bit sample):
((283 KB) * (1024 bytes per KB)) / (4 bytes per frame) ==> 72448
But a WAV file can include a lot besides the raw PCM, e.g. song title or artist info, which is why this estimate comes out higher than 69632.
Here's some more info about the WAV format. You might have to load the file as raw bytes to parse the header, but the header stores the frame size (block align) and the size of the data chunk at predictable locations, and the frame length is simply the data chunk size divided by the frame size.
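If you do end up parsing the header yourself, here is a minimal sketch of the idea in plain Java (no javax.sound). It assumes a canonical RIFF/WAVE chunk layout; the class and method names are placeholders. It walks the chunks, reads the block align (bytes per frame) from the "fmt " chunk, and divides the "data" chunk size by it:

import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class WavFrameLength {

    /** Returns the number of PCM frames in a canonical WAV file, or -1 if no data chunk is found. */
    public static long frameLength(String path) throws IOException {
        DataInputStream in = new DataInputStream(new FileInputStream(path));
        try {
            in.skipBytes(12);                                    // "RIFF" + riff size + "WAVE"
            int blockAlign = -1;                                 // bytes per frame, read from the fmt chunk
            while (true) {
                byte[] id = new byte[4];
                in.readFully(id);                                // chunk id, e.g. "fmt " or "data"
                int size = Integer.reverseBytes(in.readInt());   // chunk sizes are little-endian
                String chunkId = new String(id, "US-ASCII");
                if (chunkId.equals("fmt ")) {
                    in.skipBytes(12);                            // format, channels, sample rate, byte rate
                    blockAlign = Short.reverseBytes(in.readShort()) & 0xFFFF;
                    in.skipBytes(size - 14);                     // rest of the fmt chunk
                } else if (chunkId.equals("data")) {
                    return blockAlign > 0 ? (size & 0xFFFFFFFFL) / blockAlign : -1;
                } else {
                    in.skipBytes(size);                          // skip LIST/INFO chunks (title, artist, ...)
                }
            }
        } catch (java.io.EOFException eof) {
            return -1;                                           // no data chunk found
        } finally {
            in.close();
        }
    }
}

Because only the data chunk is counted, this should land on the 69632 figure that AudioSystem reported rather than the 72448 estimate derived from the total file size.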
Maybe someone else with Android experience has already concocted a method.
Maybe Google should treat Java as an intact entity and properly license and implement it.

Related

Android - MP3 JLayer missing data

I have a server that encodes real-time voice into mono or stereo MP3 using libmp3lame and sends it chunk by chunk over a WebSocket.
I'm trying to make an Android app that receives those MP3 chunks and plays them with the most appropriate audio player Android has. I went with AudioTrack since it seems pretty easy to feed chunks to the player and it is "stream" oriented (what I'm sending to the track is a byte array, not a full song stored locally on the phone).
Since AudioTrack does not support compressed audio formats (such as MP3), I have to decode those chunks into PCM to play them. I'm using the well-known JLayer library to do this real-time decoding. With that, I can write each decoded sample to my AudioTrack and hear what the server is sending.
My problem is that the received/played audio is badly chopped up. (I can understand whatever the speaker is saying perfectly, but the quality is bad, as if the speaker had a "robotic voice".)
Here is the code I'm using to receive/decode/play those byte[]:
public void addSample(byte[] data) throws BitstreamException, DecoderException, IOException {
    // JLayer decoder
    Decoder decoder = new Decoder();
    // Input stream wrapping the byte[] voice data
    InputStream bis = new ByteArrayInputStream(data);
    ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
    Bitstream bits = new Bitstream(bis);
    // Decode the MP3 data into a PCM buffer
    SampleBuffer pcmBuffer = (SampleBuffer) decoder.decodeFrame(bits.readFrame(), bits);
    // Write the PCM buffer to the AudioTrack to play it
    mTrack.write(pcmBuffer.getBuffer(), 0, pcmBuffer.getBufferLength());
    bits.closeFrame();
}
And here is my AudioTrack initialization:
mTrack = new AudioTrack.Builder()
        .setAudioAttributes(new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_SPEECH)
                .build())
        .setAudioFormat(new AudioFormat.Builder()
                .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                .setSampleRate(48000)
                .setChannelMask(AudioFormat.CHANNEL_OUT_STEREO)
                .build())
        .setBufferSizeInBytes(AudioTrack.getMinBufferSize(48000, AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT))
        .build();
mTrack.play();
To understand what was happening I tried to log each value contained in the pcmBuffer. It turns out that a huge part of that data was 0 at the very beginning of the buffer (I'd say 1/5 of the buffer is 0, all of it located at the beginning). So I took an oscilloscope and looked at the signal my Android phone was producing. Here is the result:
As you can see, each frame is present, but it has some "blank" (zero) values. Those zeros at the beginning of each frame make the signal choppy and pretty annoying to listen to.
I have no idea whether this comes from the MP3 signal itself, the way I'm playing it, AudioTrack, JLayer, or the way I'm decoding it. So if anyone has an idea it would be really awesome.
EDIT:
I found out something interesting. By decoding each frame header I have access to a lot of information, such as the duration in ms of each frame. I logged it:
System.out.println(bits.readFrame().ms_per_frame());
I found out that each of my frames is 24 ms long. When I look back at the oscilloscope, I can see that each frame does indeed take 24 ms, but the beginning/end of each frame is filled with 0. So first of all, is it a decoding problem? If it is not, how can I get a clean signal without a small break in each frame?
I've been printing all the data that each frame gives me, and each frame starts with a lot of zeros. How am I supposed to get a clean signal if each frame has some kind of audio void?
If I print the MP3 data that I'm receiving for each frame (96 bytes), the first four bytes (probably the header?) always have the same value:
"-1, -5, 20, -60"
Then I have a fifth byte that is always equal to 0, and sometimes a sixth byte that is also equal to 0. Should I be removing those?
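One way to narrow this down is to log what JLayer itself reports for each chunk and compare it with the 48 kHz stereo AudioTrack configuration. Below is a rough sketch, assuming JLayer's Header/SampleBuffer accessors (frequency(), mode(), getChannelCount(), getSampleFrequency()) and that each chunk holds at least one complete frame; adjust names if your JLayer version differs:

import android.util.Log;
import java.io.ByteArrayInputStream;
import javazoom.jl.decoder.Bitstream;
import javazoom.jl.decoder.Decoder;
import javazoom.jl.decoder.Header;
import javazoom.jl.decoder.SampleBuffer;

// Sketch: decode one chunk and log the parameters JLayer reports, so they can be
// compared against the AudioTrack settings (48000 Hz, CHANNEL_OUT_STEREO, 16-bit).
// A mismatch (e.g. mono frames played as stereo, or a different sample rate)
// would produce exactly this kind of choppy/"robotic" playback.
void logChunkFormat(byte[] data) throws Exception {
    Decoder decoder = new Decoder();
    Bitstream bits = new Bitstream(new ByteArrayInputStream(data));
    Header header = bits.readFrame();                 // assumes at least one full frame per chunk
    SampleBuffer pcm = (SampleBuffer) decoder.decodeFrame(header, bits);
    Log.d("MP3", "frameMs=" + header.ms_per_frame()
            + " headerRate=" + header.frequency()
            + " headerMode=" + header.mode()          // e.g. single channel vs. stereo
            + " decodedChannels=" + pcm.getChannelCount()
            + " decodedRate=" + pcm.getSampleFrequency());
    bits.closeFrame();
}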

OpenSL ES decode 24bit FLAC

I am trying to decode a FLAC file with 24-bit samples using OpenSL ES on Android. Originally, I had my SLDataFormat_PCM for the SLDataSink set up like this:
_pcm.formatType = SL_DATAFORMAT_PCM;
_pcm.numChannels = 2;
_pcm.samplesPerSec = SL_SAMPLINGRATE_44_1;
_pcm.bitsPerSample = SL_PCMSAMPLEFORMAT_FIXED_16;
_pcm.containerSize = SL_PCMSAMPLEFORMAT_FIXED_16;
_pcm.channelMask = SL_SPEAKER_FRONT_LEFT | SL_SPEAKER_FRONT_RIGHT;
_pcm.endianness = SL_BYTEORDER_LITTLEENDIAN;
This is working well for basically any data format. Luckily the samplesPerSec is not respected (I don't want resampling).
Now I want to support the full bit depth of a FLAC file with 24-bit samples. When using this format, it apparently performs a bit-depth conversion, because once I load the file and then check the ANDROID_KEY_PCMFORMAT_BITSPERSAMPLE info, it is 16.
When I set bitsPerSample = SL_PCMSAMPLEFORMAT_FIXED_24; or SL_PCMSAMPLEFORMAT_FIXED_32, OpenSL ES rejects it:
E/libOpenSLES(22706): pAudioSnk: bitsPerSample=32
W/libOpenSLES(22706): Leaving Engine::CreateAudioPlayer (SL_RESULT_CONTENT_UNSUPPORTED)
Any idea how this is meant to work? Is Android currently restricted to 16 bit int only?
I would also accept 32bit float, but I don't suppose that will work either.
Currently it only supports 8-bit and 16-bit PCM.
Sources:
Android source code (line 60)
Article (PCM data format section)

Extract a sample from RAW audio file on Android

I have an Android application that records audio in raw format.
How can I extract a sample of the recording?
For example, if the raw file has 3 minutes of audio recorded, I would like to extract 20 seconds of the contents from an arbitrary start position.
Is this possible?
If the file contains interleaved PCM data with no header and you know the properties of the audio data (sample rate, number of channels, etc.), the problem can be solved with basic math:
The number of bytes of audio data per second is sampleRate * bytesPerSample * numChannels.
The starting offset in bytes would then be bytesPerSecond * offsetInSeconds, and the size of the chunk to read (in bytes) would be bytesPerSecond * lengthInSeconds.
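As a sketch of that math in Java (the file paths and audio parameters are placeholders; the example usage below assumes a 44.1 kHz mono 16-bit recording):

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class PcmExtractor {

    /** Copies lengthSeconds of raw PCM starting at offsetSeconds into a new file. */
    public static void extract(String inPath, String outPath,
                               int sampleRate, int numChannels, int bytesPerSample,
                               double offsetSeconds, double lengthSeconds) throws IOException {
        int bytesPerSecond = sampleRate * bytesPerSample * numChannels;
        int bytesPerFrame = bytesPerSample * numChannels;

        // Align both values to a frame boundary so channels stay correctly interleaved.
        long startByte = ((long) (bytesPerSecond * offsetSeconds) / bytesPerFrame) * bytesPerFrame;
        long bytesToCopy = ((long) (bytesPerSecond * lengthSeconds) / bytesPerFrame) * bytesPerFrame;

        FileInputStream in = new FileInputStream(inPath);
        FileOutputStream out = new FileOutputStream(outPath);
        try {
            in.skip(startByte);                       // for a local file skip() is normally sufficient
            byte[] buffer = new byte[8192];
            long remaining = bytesToCopy;
            int read;
            while (remaining > 0
                    && (read = in.read(buffer, 0, (int) Math.min(buffer.length, remaining))) != -1) {
                out.write(buffer, 0, read);
                remaining -= read;
            }
        } finally {
            in.close();
            out.close();
        }
    }
}

A 20-second clip starting at one minute would then be something like PcmExtractor.extract("/sdcard/rec.raw", "/sdcard/clip.raw", 44100, 1, 2, 60.0, 20.0);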

Is it possible to get duration of the remote audio file without downloading it?

I have a URL of a remote audio file. I need to build data for an adapter list with track details. Here is that part of the code:
Log.d("audioURL", audio.getUrl());
MediaPlayer tmpMedia;
tmpMedia = MediaPlayer.create(getContext(), Uri.parse(audio.getUrl()));
holder.txtDuration.setDuration(tmpMedia.getDuration()/1000);
tmpMedia.release();
But it works too slowly. LogCat writes something like this:
15:05:51.783: D/audioURL(776): http://cs4859.vk.me/u14195999/audios/0cbd695ddf50.mp3
15:05:51.783: D/MediaPlayer(776): Couldn't open file on client side, trying server side
15:05:53.813: D/audioURL(776): http://cs4859.vk.me/u14195999/audios/0cbd695ddf50.mp3
15:05:53.823: D/MediaPlayer(776): Couldn't open file on client side, trying server side
15:05:55.373: D/audioURL(776): http://cs4859.vk.me/u14195999/audios/0cbd695ddf50.mp3
15:05:55.383: D/MediaPlayer(776): Couldn't open file on client side, trying server side
15:05:58.143: D/audioURL(776): http://cs1626.vk.me/u149968/audios/04298447cd3c.mp3
15:05:58.153: D/MediaPlayer(776): Couldn't open file on client side, trying server side
...and so on. So my playlist of about 30 tracks takes about 7 minutes to initialize.
I guess the MediaPlayer method getDuration() sequentially downloads these tracks (or some parts of them) to get their durations.
Is there a way to get these durations quickly, without downloading tracks?
Halim Qarroum, that seems to be the correct way, but I'm having some trouble with the MediaMetadataRetriever class.
Here is my code now:
if (android.os.Build.VERSION.SDK_INT < 10) {
    holder.txtDuration.setDuration(audio.getTrackDuration());
} else {
    MediaMetadataRetriever mRetriever = new MediaMetadataRetriever();
    Log.d("URI", Uri.parse(audio.getUrl()).toString());
    mRetriever.setDataSource(getContext(), Uri.parse(audio.getUrl()));
    String s = mRetriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
    holder.txtDuration.setDuration(Long.parseLong(s));
    mRetriever.release();
}
The application terminates in mRetriever.setDataSource(getContext(), Uri.parse(audio.getUrl())); because of an IllegalArgumentException. The audio.getUrl() string is http://cs4859.vk.me/u14195999/audios/134dfe90d1ec.mp3.
Why does the exception occur?
Dheeb posted a well-detailed answer. However, ID3 tags are not always present in an MP3 file. Instead of looking for these tags, which would limit the method to MP3 files, you could use the MediaMetadataRetriever class that comes with the Android framework.
This class can give you several pieces of metadata from certain types of audio/video files; one of them is the duration. This approach has the advantage of being standard: it comes with the Android SDK and is not limited to one audio format.
From the related Android developers page:
The MediaMetadataRetriever class provides a unified interface for retrieving frame and metadata from an input media file.
A trivial example of code using this class:
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
retriever.setDataSource(your_data_source);
String time = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
long timeInmillisec = Long.parseLong( time );
long duration = timeInmillisec / 1000;
long hours = duration / 3600;
long minutes = (duration - hours * 3600) / 60;
long seconds = duration - (hours * 3600 + minutes * 60);
There was a bug related to MediaMetadataRetriever.
You could try,
metaRetreiver.setDataSource("<remoteUrl>", new HashMap<String, String>());
I'll assume mp3 since "Audio File" is a blanket phrase.
Method 1: fetch ID3 tag
Variant 1: 3rd party library
You will need to look at the ID3 tags in the mp3 file.
Unless you keep track of the metadata you want somewhere else.
To specifically get the track length of the file you will need to look into the ID3 metadata tag, specifically the 'TLEN' frame (track length in milliseconds).
To download only the ID3 tag part, you must first download the ID3 header of the file.
This website contains very specific information about the ID3 tag format. You will need to look at the version number of the ID3 tag and then, based on that, find out how long the ID3 tag is. Then you must download the WHOLE tag, because the frames are not in any specific order.
Then you should be able to use a third-party library to find the TLEN frame and its data.
Variant 2: HTTP Hack
For ID3v2 tags, grab the start of the file. (It's possible for ID3v2 frames to be elsewhere, but in practice they're always there.) You can't tell how long the tag is going to be in advance. For text-only tags you're likely to find the information you want in the first 512-1024 bytes. Unfortunately more and more MP3s have embedded ‘album art’ pictures, which can be much longer; try to pick an ID3 library that will gracefully ignore truncated ID3 information.
ID3v1 tags are located at the end of the file. Again you can't tell how long they're going to be. And of course you don't know in advance whether the file has ID3v1 tags, ID3v2 tags, both or neither. Generally these days ID3v2 is a better bet though.
To read part of a file through HTTP you need the Range header. This too is not supported everywhere.
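As an illustration of the Range approach, here is a sketch that pulls just the first kilobyte of a remote file; the URL is a placeholder and the server must honour partial requests for this to work:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Sketch: fetch only the first kilobyte of a remote MP3 with a Range request,
// enough to see whether an ID3v2 tag is present and how large it is.
static byte[] fetchHead(String url) throws Exception {
    HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
    conn.setRequestProperty("Range", "bytes=0-1023");     // a 206 response means Range was honoured
    InputStream in = conn.getInputStream();
    byte[] head = new byte[1024];
    int off = 0, n;
    while (off < head.length && (n = in.read(head, off, head.length - off)) != -1) {
        off += n;
    }
    in.close();
    conn.disconnect();
    // head[0..2] is "ID3" if an ID3v2 tag is present; bytes 6..9 hold the tag
    // size as four 7-bit "sync-safe" values.
    return head;
}

e.g. fetchHead("http://example.com/track.mp3");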
Method 2: Estimation
The file size you can get with an HTTP HEAD request. The duration, meaning the playing time in seconds, cannot be determined exactly without fetching the entire file. You can guess by fetching the first few MP3 frames, looking at their bitrate, and assuming that the rest of the file has the same bitrate, but given the popularity of variable-bit-rate encoding the likelihood that this is close to accurate is quite low.
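A sketch of that estimation (the URL and the 128 kbps bitrate in the example call are placeholder assumptions; with VBR files the result is only a rough guess):

import java.net.HttpURLConnection;
import java.net.URL;

// Sketch: estimate duration from the Content-Length of a HEAD request and an
// assumed average bitrate. Returns -1 if the server does not report a length.
static long estimateDurationSeconds(String url, int bitrateKbps) throws Exception {
    HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
    conn.setRequestMethod("HEAD");
    long contentLength = conn.getContentLength();     // bytes
    conn.disconnect();
    if (contentLength <= 0) return -1;
    return (contentLength * 8) / (bitrateKbps * 1000L);
}

e.g. estimateDurationSeconds("http://example.com/track.mp3", 128);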
ID3 tags can in theory contain information that might allow you to guess the length better, in the ASPI and ETCO frames. But in practice these are very rarely present.
Credits
Credits go to various authors on SO and the interwebs, and of course the guy on the first floor in my head.

Determining audioformat of audio files in Android

Is there a way to determine the audio format of an audio file in Android? In plain Java I do it like this:
File file = new File(...);
AudioInputStream stream = AudioSystem.getAudioInputStream(file);
AudioFormat format = stream.getFormat();
android.media.AudioTrack has the following methods to access information about audio data:
getChannelCount to determine the number of channels,
getChannelConfiguration to determine whether you are dealing with mono or stereo content,
getSampleRate to find out the sampling frequency, and
getAudioFormat to determine whether you are dealing with an 8-bit or 16-bit sample width.
The AudioTrack.getXXX methods you list merely return the values supplied to the constructor. This doesn't solve the original poster's issue.
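For completeness, a technique not mentioned in this thread that does inspect the file itself on Android (API 16+) is MediaExtractor. A minimal sketch, with the file path as a placeholder:

import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.util.Log;

// Sketch: MediaExtractor reads the file, so it reports what is actually stored
// in it rather than the values a player was configured with.
static MediaFormat inspect(String path) throws Exception {
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(path);
    MediaFormat format = extractor.getTrackFormat(0);            // assumes track 0 is the audio track
    Log.d("AudioFormat", format.getString(MediaFormat.KEY_MIME)
            + " " + format.getInteger(MediaFormat.KEY_SAMPLE_RATE) + " Hz, "
            + format.getInteger(MediaFormat.KEY_CHANNEL_COUNT) + " ch");
    extractor.release();
    return format;
}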
