Playing audio from AMR on Android

I can receive a byte array from a Socket. I need to play audio from this byte array - the audio is encoded as AMR at 8000 Hz.
I found that I can play AMR audio with MediaPlayer. However, MediaPlayer can't play audio from a byte array, and I don't want to write the data to a file.
Is there a way to play AMR sound from a byte array on Android?

The Android framework allows you to play back audio data directly from memory using the AudioTrack class; the drawback is that the audio must already be decoded into PCM data. If you are lucky enough to target Android 4.1, there are new APIs that allow you to decode the data separately so it can be passed to AudioTrack (see MediaExtractor and MediaCodec). However, prior to that there were no exposed APIs for encoding/decoding beyond MediaRecorder and MediaPlayer.
If targeting a version of Android prior to 4.1 (which I imagine you probably are) you have two options:
Find a third-party decoder for the AMR data so you can pass the decoded PCM on to AudioTrack
Save your data to a file (even temporarily) so it can be handed to MediaPlayer
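If you can target Android 4.1 (API 16), the MediaExtractor/MediaCodec route mentioned above might look roughly like the sketch below. This is an assumption-heavy outline, not a tested player: it assumes a single-track AMR source reachable by path or URL, uses the pre-API-21 buffer-array style that was current in 4.1, and omits error handling and threading.

```java
// Sketch: decode an AMR source with MediaCodec (API 16+) and play the PCM via AudioTrack.
// "amrSource" is an assumed file path or URL; error handling is abbreviated.
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.nio.ByteBuffer;

public class AmrPlayer {
    public void play(String amrSource) throws Exception {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(amrSource);
        MediaFormat format = extractor.getTrackFormat(0);   // single audio track assumed
        extractor.selectTrack(0);

        MediaCodec codec = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
        codec.configure(format, null, null, 0);
        codec.start();

        int sampleRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE);   // 8000 for AMR-NB
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                AudioTrack.getMinBufferSize(sampleRate, AudioFormat.CHANNEL_OUT_MONO,
                        AudioFormat.ENCODING_PCM_16BIT), AudioTrack.MODE_STREAM);
        track.play();

        ByteBuffer[] inputs = codec.getInputBuffers();       // pre-API-21 buffer style
        ByteBuffer[] outputs = codec.getOutputBuffers();
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false, outputDone = false;
        while (!outputDone) {
            if (!inputDone) {
                int inIndex = codec.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    int size = extractor.readSampleData(inputs[inIndex], 0);
                    if (size < 0) {                          // no more compressed data
                        codec.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        codec.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int outIndex = codec.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                byte[] pcm = new byte[info.size];
                outputs[outIndex].get(pcm);
                outputs[outIndex].clear();
                track.write(pcm, 0, pcm.length);             // feed decoded PCM to AudioTrack
                codec.releaseOutputBuffer(outIndex, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) outputDone = true;
            } else if (outIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                outputs = codec.getOutputBuffers();
            }
        }
        track.stop(); track.release();
        codec.stop(); codec.release();
        extractor.release();
    }
}
```

Since the question is about a byte array from a socket, note that MediaExtractor wants a seekable source, so in practice you may still end up buffering to a temporary file first.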
HTH

Related

Why can't I play audio recorded with AudioRecord on Android?

I need to record audio on Android that I later want to encrypt, so I'm working with the AudioRecord class, since it handles the audio at a low level, giving direct access to the bytes.
I found a piece of code that works with shorts and then converts them to bytes, which is what I want. But once I have created the audio, I can't play it with any audio player on the phone.
What do I have to do for the phone to recognize it as a valid audio file?
Please forgive me, because I don't remember all the details, but I had this issue before. The audio recorded by AudioRecord is raw PCM with no container format, so in order to make it playable you first need to wrap it in a format, specifying all of the characteristics you set up when initializing your AudioRecord instance (such as sample rate, number of channels, and bit depth). I found an example of how to record audio using AudioRecord and then write it out in WAV format: https://selvaline.blogspot.com/2016/04/record-audio-wav-format-android-how-to.html I hope it helps.
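The "format" in question is usually just a 44-byte WAV (RIFF) header prepended to the raw PCM. A plain-Java sketch of building that header (the class name `WavHeader` is mine, not from the linked example):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class WavHeader {
    // Builds the canonical 44-byte WAV (RIFF/PCM) header for raw PCM data.
    // Write this header, then the PCM bytes, and players will accept the file.
    public static byte[] build(int sampleRate, int channels, int bitsPerSample, int pcmDataLength) {
        int byteRate = sampleRate * channels * bitsPerSample / 8;   // bytes per second
        int blockAlign = channels * bitsPerSample / 8;              // bytes per sample frame
        ByteBuffer b = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        b.put("RIFF".getBytes());          // RIFF chunk ID
        b.putInt(36 + pcmDataLength);      // chunk size = header remainder + data
        b.put("WAVE".getBytes());          // RIFF type
        b.put("fmt ".getBytes());          // fmt sub-chunk ID
        b.putInt(16);                      // fmt sub-chunk size for PCM
        b.putShort((short) 1);             // audio format 1 = linear PCM
        b.putShort((short) channels);
        b.putInt(sampleRate);
        b.putInt(byteRate);
        b.putShort((short) blockAlign);
        b.putShort((short) bitsPerSample);
        b.put("data".getBytes());          // data sub-chunk ID
        b.putInt(pcmDataLength);           // raw PCM byte count
        return b.array();
    }

    public static void main(String[] args) {
        // 16-bit mono at 44.1 kHz, one second of audio (88200 bytes of PCM)
        byte[] header = WavHeader.build(44100, 1, 16, 88200);
        System.out.println(header.length);             // 44
        System.out.println(new String(header, 0, 4));  // RIFF
    }
}
```

The values passed to `build` must match the ones you gave your AudioRecord constructor, or the file will play at the wrong speed or as noise.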

Playing audio with video using MediaDecoder

I have an MPEG2-transport file, and using the newfangled MediaCodec API in Android, I can play the video frames. I'm using MediaExtractor to load the file, and call getTrackFormat to get a MediaFormat. The reported MIME type for the format is "video/avc".
Is there a straightforward way to play the audio at the same time, and make sure the video and audio are synchronized?
I'm aware of the SimplePlayer sample code for stagefright, but that's in C and uses undocumented interfaces. Can it be done in Java?

Android custom audio player to replace MediaPlayer with libPd or OpenSL ES or AudioTrack

I have already developed a streaming audio application using the MediaPlayer API. All the features work fine, except that it takes too long to start playback (the buffering time is too high).
I want to add recording of the live audio stream (saving the live stream data to disk, not recording from the MIC). As MediaPlayer does not provide any API to access the raw data stream, I am planning to build a custom audio player.
I want to control the buffering time, have access to the raw audio stream, and be able to play all the audio formats that Android supports natively. Which API (libpd, OpenSL ES, or AudioTrack) would be suitable for building a custom audio player on Android?
In my experience OpenSL ES would be the choice; there is a link that explains how to do audio streaming that you may find useful. bufferframes determines how many samples you collect before playing, so smaller bufferframes means faster response time, but you have to balance that against your device's processing capabilities.
You can record with libpd (pd-for-android) too.
All the recording process is managed by libpd.
Check the ScenePlayer project; it uses libpd and lets you record audio into a folder on the SD card:
https://github.com/libpd/pd-for-android/tree/master/ScenePlayer

Android: "Encoded PCM 16/8-bit" what does it mean?

"Encoded PCM 16/8-bit" - what does it mean? Let's say I have an MP3 file and I want to convert it to encoded PCM so I can feed it directly to write() on an AudioTrack object.
Are there any tools with which I can convert?
And after conversion to PCM, will it be playable on Android?
(Consider that I'm not bothered about quality.)
Thank you!
PCM (pulse-code modulation) is a standard encoding scheme used in the WAV file format. It consists of 8- or 16-bit samples; there are a number of these per second of audio - that number is called the sample rate. AudioTrack is used to play back PCM data; this can be done in real-time while you write to its internal buffer (i.e. MODE_STREAM), or you can fill the buffer and then play back (MODE_STATIC). If you go with the streaming mode, it's important to continuously call write() to keep filling the buffer during playback, otherwise the AudioTrack will stop playing until it receives more data.
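The streaming mode described above can be sketched as follows. This is an illustrative outline only: `readPcmChunk` is a hypothetical placeholder for wherever your decoded PCM bytes come from, and the parameters assume 16-bit mono audio.

```java
// Sketch: playing 16-bit mono PCM through AudioTrack in MODE_STREAM.
// readPcmChunk() stands in for your PCM source (file, decoder, network, ...).
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class PcmStreamer {
    public void stream(int sampleRate) {
        int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf, AudioTrack.MODE_STREAM);
        track.play();
        byte[] chunk;
        while ((chunk = readPcmChunk()) != null) {
            // write() blocks while the internal buffer is full; keep calling it
            // continuously, or playback stalls waiting for more data.
            track.write(chunk, 0, chunk.length);
        }
        track.stop();
        track.release();
    }

    private byte[] readPcmChunk() { return null; } // hypothetical PCM source
}
```

For MODE_STATIC you would instead allocate a buffer large enough for the whole clip, write it once, and then call play().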
As for tools, a simple one is iTunes. Go to Preferences->General->Import Settings and choose "WAV encoder" from the drop-down menu. Now right-click a file you want to convert and select "Create WAV version". As you mentioned, you will lose a bit of quality, which is inevitable in conversion.
Alternatively to this method, consider using the MediaPlayer API, which will play MP3s natively.

Is There a Way to Playback Raw Audio/Video Streams in Android API's?

I have a Vorbis stream that I can decode to PCM if necessary, and I have a raw H.264 stream; both are supported by Android when in a container. I'm wondering if there is any way to manually feed video and audio samples into the Android MediaPlayer without any container. I would imagine I would have to override methods within MediaPlayer. Does anyone have experience with this, or an easier way to do it? I can't imagine it's impossible...
You may be able to play the audio PCM samples (for that, I guess you may have to add a WAV header), but you may not be able to play the H.264 elementary stream without a container. In the media framework (Stagefright), there are sniffer functions registered for the various container formats which determine what kind of parser is needed, and then the extractor (parser) is created. I don't think you will be able to play an H.264 elementary stream from an application using the built-in media framework of Android.
