I need to process the data that MediaRecorder provides while I am recording video and then save it to a file as *.mp4. I have a ParcelFileDescriptor, and with it I send the recorder output to a pipe which is read in my thread. The problem is that I have no idea how to save the mp4 file. Which metadata do I need to provide? Can I extract all of it from MediaRecorder? And how many bytes are reserved for metadata? Or perhaps you have a better idea of how I can store that data as *.mp4 using some external library?
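A minimal sketch of the pipe plumbing described above, in Java (the class and method names here are hypothetical; it only shows how the recorder output can be drained into a file and does not by itself address the missing-metadata problem the question asks about):

    import android.media.MediaRecorder;
    import android.os.ParcelFileDescriptor;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;

    public class PipeRecorder {

        // The recorder passed in must already have its sources set
        // (setAudioSource / setVideoSource). This method points it at the
        // write end of a pipe and copies the read end into a local file.
        public static void recordIntoFile(MediaRecorder recorder, final String outputPath)
                throws IOException {
            final ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();

            recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            recorder.setOutputFile(pipe[1].getFileDescriptor()); // recorder writes into the pipe

            // Reader thread: drains the read end of the pipe into a plain file.
            new Thread(new Runnable() {
                @Override
                public void run() {
                    byte[] buffer = new byte[8192];
                    try (FileInputStream in = new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);
                         FileOutputStream out = new FileOutputStream(outputPath)) {
                        int read;
                        while ((read = in.read(buffer)) != -1) {
                            out.write(buffer, 0, read); // raw bytes exactly as the recorder produced them
                        }
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }).start();

            recorder.prepare();
            recorder.start();
        }
    }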
Thanks.
Related
I've been struggling to find information on how to record audio through the mic, save it as an MP3 or WAV file (if possible) and then store it in an array using Kotlin. Does anybody know how to do that?
Is there any way to write metadata to an audio file? The web is full of material on MediaMetadataRetriever, but nothing on editing. I've looked at MetadataEditor, but that doesn't seem to let me write metadata to a specific file. Essentially I want to add a Bitmap to an audio file so that when a third-party app plays the audio file it shows cover art. Thanks :)
I have some audio data (a raw uncompressed audio waveform, no file format) in memory and want to play it back. The Android-internal MediaPlayer seems to support only playback of files stored somewhere on disk; I could not find a setDataSource() method that would accept an array of data instead of a path/URL.
So how can I play back waveforms that are not stored on disk?
Thanks!
Use an AudioTrack. See e.g. this page for an example of how to set up and use an AudioTrack (jump to the "Playing the sample with an AudioTrack" section, and just ignore the JNI stuff).
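A minimal sketch along those lines, assuming 16-bit mono PCM; adjust the sample rate and the channel/encoding constants to match your data:

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    public class RawPlayer {

        // Plays a raw 16-bit mono PCM buffer held in memory, no file involved.
        public static void play(short[] samples, int sampleRate) {
            int sizeInBytes = samples.length * 2; // 16-bit samples -> 2 bytes each

            AudioTrack track = new AudioTrack(
                    AudioManager.STREAM_MUSIC,
                    sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    sizeInBytes,
                    AudioTrack.MODE_STATIC);      // static mode: whole buffer is uploaded up front

            track.write(samples, 0, samples.length); // copy the waveform into the track
            track.play();

            // When playback is finished, release the native resources:
            // track.stop();
            // track.release();
        }
    }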
I am developing an app that will stream video recorded from the camera and stored as mp4 on the SD card...
I know there is something called RTSP which is used for that...
Please tell me where to start, and whether there is any library that will do this for me...
I do not understand why you want to stream a local mp4 file, but maybe I misunderstood your problem. RTSP is used for live streaming, but you want to play a local mp4 file.
If you want to capture video and save it as mp4 onto an SD card, you need to use the Android MediaRecorder. Here is a link to a simple example.
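A minimal sketch of that capture setup (the output path and preview surface are placeholders, and the usual camera, microphone and storage permissions are assumed):

    import android.media.MediaRecorder;
    import android.view.Surface;
    import java.io.IOException;

    public class Mp4Capture {

        // Records camera video with audio into an .mp4 file on external storage.
        public static MediaRecorder start(Surface previewSurface, String outputPath)
                throws IOException {
            MediaRecorder recorder = new MediaRecorder();
            recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            recorder.setOutputFile(outputPath);           // e.g. a path on the SD card
            recorder.setPreviewDisplay(previewSurface);   // surface from your camera preview
            recorder.prepare();
            recorder.start();
            return recorder;
        }

        public static void stop(MediaRecorder recorder) {
            recorder.stop();
            recorder.release();
        }
    }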
If you just want to play back a local mp4 file, have a look at the Android API Demos.
These are part of the Android SDK installation and contain a good example of how to play back local media files.
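For basic playback of a local file, a MediaPlayer sketch might look like this (the path is a placeholder):

    import android.media.MediaPlayer;
    import java.io.IOException;

    public class LocalPlayback {

        // Plays a local .mp4 file with the platform MediaPlayer.
        public static MediaPlayer play(String path) throws IOException {
            MediaPlayer player = new MediaPlayer();
            player.setDataSource(path);
            player.prepare();          // synchronous prepare is fine for a local file
            player.start();
            return player;             // call release() when done
        }
    }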
I'm trying to use the AudioRecord class to record a WAV file. The problem is that it only supplies the raw PCM data, and if I write it to a file, there is no header information, so it will not play in any media player. How can I create a WAV file from this raw data?
Or alternatively, is there any other way to record sound in Android to a WAV file (or, alternatively MP3)?
Oh, and I know that MediaRecorder can't be used because it doesn't support either WAV or MP3 format.
OK, I've got this figured out. This post was crucial in helping me:
http://computermusicblog.com/blog/2008/08/29/reading-and-writing-wav-files-in-java
Basically, I used a ByteArrayOutputStream to write the raw PCM data from AudioRecord, which lets me get the byte array and its size when the process is done. I can then use that data together with the sample rate, bits per sample, and stereo/mono settings to create the WAV header as per the link above. The resulting file works perfectly!
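For reference, a sketch of that header-writing step, assuming 16-bit little-endian PCM; the sample rate and channel count must match the AudioRecord configuration:

    import java.io.ByteArrayOutputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    public class WavWriter {

        // Wraps raw 16-bit PCM data (e.g. collected in a ByteArrayOutputStream
        // while AudioRecord was running) in a canonical 44-byte RIFF/WAVE header.
        public static void writeWav(ByteArrayOutputStream pcm, String path,
                                    int sampleRate, int channels) throws IOException {
            byte[] data = pcm.toByteArray();
            int bitsPerSample = 16;
            int byteRate = sampleRate * channels * bitsPerSample / 8;
            int blockAlign = channels * bitsPerSample / 8;

            ByteBuffer header = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
            header.put("RIFF".getBytes("US-ASCII"));
            header.putInt(36 + data.length);          // RIFF chunk size = rest of header + data
            header.put("WAVE".getBytes("US-ASCII"));
            header.put("fmt ".getBytes("US-ASCII"));
            header.putInt(16);                        // fmt chunk size for PCM
            header.putShort((short) 1);               // audio format 1 = PCM
            header.putShort((short) channels);
            header.putInt(sampleRate);
            header.putInt(byteRate);
            header.putShort((short) blockAlign);
            header.putShort((short) bitsPerSample);
            header.put("data".getBytes("US-ASCII"));
            header.putInt(data.length);

            try (FileOutputStream out = new FileOutputStream(path)) {
                out.write(header.array());
                out.write(data);
            }
        }
    }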
Check MediaRecorder.setOutputFormat(); you can set different container formats for your recording. There are MediaRecorder.OutputFormat.MPEG_4 and MediaRecorder.OutputFormat.THREE_GPP; alongside RAW_AMR, the only allowed audio encoder is setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB).
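For example, a plain 3GP/AMR-NB audio recording setup might look like this (the output path is a placeholder):

    import android.media.MediaRecorder;
    import java.io.IOException;

    public class AmrRecorder {

        // Records microphone audio into a .3gp container with the AMR-NB encoder.
        public static MediaRecorder start(String outputPath) throws IOException {
            MediaRecorder recorder = new MediaRecorder();
            recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
            recorder.setOutputFile(outputPath);
            recorder.prepare();
            recorder.start();
            return recorder;           // call stop() and release() when finished
        }
    }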
Sorry, but MP3 is not available. Do you really need MP3 for recording?
WAV, as opposed to MP3, is a container, not a format; WAV can hold any kind of encoding.
You are always free to prepend a WAV RIFF header to your raw PCM data (as long as you know the exact format). Check here for what it has to look like:
http://www-mmsp.ece.mcgill.ca/Documents/AudioFormats/WAVE/WAVE.html
You may want to use the MediaRecorder class.