I am trying to use a portion of a pre-recorded video file for processing on Android. I am currently working with the MediaCodec API, using MediaExtractor to extract the data from the video file; the idea is to extract only the required data stream for processing. The code looks like:
extractor = new MediaExtractor();
File file = new File(mInputFile);
FileInputStream inputStream = new FileInputStream(file);
long len = file.length();
long start = 200L;
extractor.setDataSource(inputStream.getFD(), start, len);
inputStream.close();
Logcat is:
java.io.IOException: Failed to instantiate extractor.
at android.media.MediaExtractor.setDataSource(Native Method)
It works fine if I set the starting offset to 0. What am I doing wrong? Can we not extract a portion of a video using MediaExtractor?
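For reference, the documented contract of setDataSource(FileDescriptor, long offset, long length) is that offset marks where the media data itself begins inside the file (useful when the media is embedded in a larger container), not a point to skip to inside a standalone file. A minimal sketch of the documented usage, reusing the question's mInputFile:
MediaExtractor extractor = new MediaExtractor();
File file = new File(mInputFile);
FileInputStream inputStream = new FileInputStream(file);
// For a standalone MP4 the media data starts at byte 0; an arbitrary
// offset such as 200 lands mid-header, so the extractor cannot parse a
// valid container, which is likely why it throws IOException.
extractor.setDataSource(inputStream.getFD(), 0, file.length());
inputStream.close();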
Related
I am trying to get the audio input stream from a file in the local file system on an Android device.
This is so that I can use this library to show a waveform for the audio file.
https://github.com/newventuresoftware/WaveformControl/blob/master/app/src/main/java/com/newventuresoftware/waveformdemo/MainActivity.java#L125
The example in the project uses a raw resource, like so:
InputStream is = getResources().openRawResource(R.raw.jinglebells);
This input stream is later converted into a byte array and passed to code that uses it to paint the waveform and play the sound.
However, when I replaced that line with
InputStream is = new FileInputStream(new File(filePath));
it did not work properly: the image generated is wrong, and the sound played is nothing like what the file actually contains.
This is the body of the function in that library that gets the input stream and converts it into an array of samples:
private short[] getAudioSample() throws IOException {
    // If I replace this with new FileInputStream(new File(filePath)),
    // the generated samples do not work properly with the library.
    InputStream is = getResources().openRawResource(R.raw.jinglebells);
    byte[] data;
    try {
        data = IOUtils.toByteArray(is);
    } finally {
        if (is != null) {
            is.close();
        }
    }

    ShortBuffer sb = ByteBuffer.wrap(data).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
    short[] samples = new short[sb.limit()];
    sb.get(samples);
    return samples;
}
The sound file that I would like to process and pass to that library is created by a MediaRecorder with the following configuration:
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
Basically, that library requires PCM samples, so feeding it the 3gp FileInputStream produced by MediaRecorder directly is not going to cut it: the bytes are AMR-encoded, not raw PCM.
What I did was use MediaExtractor and MediaCodec to decode the 3gp audio into PCM, sample by sample, and then feed that into the library. After that, everything worked =)
The logic for decoding the audio data can be taken almost directly from this awesome GitHub repo.
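For completeness, here is a minimal sketch of that decode step, assuming API 21+ (for getInputBuffer/getOutputBuffer), a single audio track at index 0, and a placeholder filePath; error handling and output-format changes are omitted:
MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(filePath); // placeholder path to the 3gp file
MediaFormat format = extractor.getTrackFormat(0); // assume track 0 is the audio track
extractor.selectTrack(0);

MediaCodec codec = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
codec.configure(format, null, null, 0);
codec.start();

ByteArrayOutputStream pcm = new ByteArrayOutputStream();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean inputDone = false;
boolean outputDone = false;
while (!outputDone) {
    if (!inputDone) {
        int inIndex = codec.dequeueInputBuffer(10000);
        if (inIndex >= 0) {
            int size = extractor.readSampleData(codec.getInputBuffer(inIndex), 0);
            if (size < 0) {
                codec.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                inputDone = true;
            } else {
                codec.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
                extractor.advance();
            }
        }
    }
    int outIndex = codec.dequeueOutputBuffer(info, 10000);
    if (outIndex >= 0) {
        ByteBuffer buf = codec.getOutputBuffer(outIndex); // already positioned at the valid data
        byte[] chunk = new byte[info.size];
        buf.get(chunk);
        pcm.write(chunk, 0, chunk.length);
        codec.releaseOutputBuffer(outIndex, false);
        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            outputDone = true;
        }
    }
}
codec.stop();
codec.release();
extractor.release();
// pcm.toByteArray() now holds 16-bit little-endian PCM, ready for the
// ByteBuffer-to-short[] conversion in getAudioSample() above.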
How can I get the byte offset within a video file that corresponds to a given playback time? For example, given a playback offset of 15 seconds into a video, I'd like to know the byte offset at that point.
The reason is that I'd like to be able to "trim" a clip from a video, for example saving the portion from 00:00:20 to 00:00:35.
At the moment, here is what I have, but this saves the entire video from the URL to the device:
URL url = new URL(http_url_path);
URLConnection ucon = url.openConnection();

// Define InputStreams to read from the URLConnection.
// Uses a 5 KB download buffer.
InputStream is = ucon.getInputStream();
BufferedInputStream in = new BufferedInputStream(is, BUFFER_SIZE);
FileOutputStream out = new FileOutputStream(file);

byte[] buff = new byte[BUFFER_SIZE];
int len;
while ((len = in.read(buff)) != -1) {
    out.write(buff, 0, len);
}
out.close();
in.close();
If you don't mind cutting at the nearest key frame (a/k/a sync frame), you can use MediaExtractor to extract the frames, using getSampleTime() to check the PTS, and MediaMuxer to put it back together minus the unwanted frames.
The video must start with a key frame, so you can't cut the stream at an arbitrary point unless you're willing to re-encode that GOP.
MP4 video files are not just a series of frames (I assume you're not operating on raw H.264 data). MediaMuxer will take care of rewriting the header and other supporting data.
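A rough sketch of that approach, assuming a single video track at index 0, API 18+ for MediaMuxer, and placeholder srcPath, dstPath, startUs, and endUs values (times in microseconds):
MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(srcPath);
extractor.selectTrack(0); // assume track 0 is the video track
MediaMuxer muxer = new MediaMuxer(dstPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int dstTrack = muxer.addTrack(extractor.getTrackFormat(0));
muxer.start();

// Seek to the sync frame at or before the requested start time.
extractor.seekTo(startUs, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);

ByteBuffer buffer = ByteBuffer.allocate(1024 * 1024);
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
while (true) {
    info.offset = 0;
    info.size = extractor.readSampleData(buffer, 0);
    if (info.size < 0 || extractor.getSampleTime() > endUs) {
        break; // end of stream or past the requested end point
    }
    info.presentationTimeUs = extractor.getSampleTime();
    info.flags = (extractor.getSampleFlags() & MediaExtractor.SAMPLE_FLAG_SYNC) != 0
            ? MediaCodec.BUFFER_FLAG_SYNC_FRAME : 0;
    muxer.writeSampleData(dstTrack, buffer, info);
    extractor.advance();
}
muxer.stop();
muxer.release();
extractor.release();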
You can try INDE Media for Mobile: https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials
It has transcoding/remuxing functionality in its MediaComposer class and the ability to select segments for the resulting files. Since it uses the MediaCodec API internally, it is very battery friendly and works as fast as possible. Samples are here: https://github.com/INDExOS/media-for-mobile
I have followed this example to convert raw audio data coming from AudioRecord to MP3, and the conversion succeeded; if I store the data in a file, the MP3 is audible when played with a music player.
Now my question is: instead of storing the MP3 data in a file, I need to play it with AudioTrack. The data is coming from a Red5 media server as a live stream, but the problem is that AudioTrack can only play PCM data, so all I can hear from my data is noise.
I am now using JLayer for this task.
My code is as follows.
int readresult = recorder.read(audioData, 0, recorderBufSize);
int encResult = SimpleLame.encode(audioData,audioData, readresult, mp3buffer);
and this mp3buffer data is sent to the other user over a Red5 stream.
The data received by the other user is in the form of a stream, so the code for playing it is:
Bitstream bitstream = new Bitstream(data.read());
Decoder decoder = new Decoder();
Header frameHeader = bitstream.readFrame();
SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
short[] pcm = output.getBuffer();
player.write(pcm, 0, pcm.length);
But my code freezes at bitstream.readFrame() after 2-3 seconds, and no sound is produced before that.
Any guess what the problem might be? Any suggestion is appreciated.
Note: I don't need to store the MP3 data, so I can't use MediaPlayer, as it requires a file or file descriptor.
Just a tip, but try calling
output.close();
bitstream.closeFrame();
after your write code. I'm processing MP3 the same way you do, but I close the buffers after usage and I have no problem.
Second tip: do it in a Thread or some other background process. Regarding those silent 2-3 seconds you mentioned, the player may be waiting until you have processed the whole stream because you are loading it on the same thread.
Try both tips (you should anyway). For the first, the problem could be in the internal buffers; for the second, you have probably filled the AudioTrack's input buffer and deadlocked the app (on the same thread, a full buffer cannot receive your input, and the code that plays it and releases that buffer is never invoked because the write blocks).
Also, if you aren't doing it already, check for frameHeader == null to detect the end of the stream.
Good luck.
You need to loop through the frames, like this:
Header frameHeader;
while ((frameHeader = bitstream.readFrame()) != null) {
    SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
    short[] pcm = output.getBuffer();
    player.write(pcm, 0, pcm.length);
    bitstream.closeFrame(); // release the current frame before reading the next
}
And make sure you are not running this on the main thread. (That is probably the reason for the freezing.)
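A minimal sketch of moving that loop off the main thread with a plain Thread; here in is the incoming MP3 InputStream and player is an AudioTrack already configured in MODE_STREAM and started, both placeholder names:
new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            Bitstream bitstream = new Bitstream(in);
            Decoder decoder = new Decoder();
            Header frameHeader;
            while ((frameHeader = bitstream.readFrame()) != null) {
                SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
                short[] pcm = output.getBuffer();
                player.write(pcm, 0, pcm.length); // may block, but only this thread
                bitstream.closeFrame();
            }
        } catch (Exception e) {
            e.printStackTrace(); // placeholder error handling
        }
    }
}).start();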
I've found lots of tutorials and posts showing how to use AudioTrack to play WAV files in AudioTrack.MODE_STREAM, and I've successfully implemented this example.
However, I'm having performance issues when playing multiple audio tracks at once, and I'm thinking that I should first create the tracks using AudioTrack.MODE_STATIC and then just call play() each time.
I can't find any resources on how to implement this. How can I do this?
Thanks
The two main sticking points for me were realizing that write() comes first and that the instantiated player must be given the size of the entire clip as its bufferSizeInBytes.
Assuming you have recorded a PCM file using AudioRecord, you can play it back with MODE_STATIC like so:
File file = new File(FILENAME);
int audioLength = (int) file.length();
byte[] filedata = new byte[audioLength];

try {
    InputStream inputStream = new BufferedInputStream(new FileInputStream(FILENAME));
    int lengthOfAudioClip = inputStream.read(filedata, 0, audioLength);
    inputStream.close();

    // AUDIO_MODE here is AudioTrack.MODE_STATIC; the buffer size is the whole clip.
    player = new AudioTrack(STREAM_TYPE, SAMPLE_RATE, CHANNEL_OUT_CONFIG, AUDIO_FORMAT,
            audioLength, AUDIO_MODE);
    player.write(filedata, OFFSET, lengthOfAudioClip);
    player.setPlaybackRate(playbackRate);
    player.play();
} catch (IOException e) {
    e.printStackTrace();
}
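One follow-up on reuse, since the goal is to call play() repeatedly: a MODE_STATIC track plays its buffer once, and to replay it you can stop, rewind, and start again without rewriting the data. A sketch, assuming player was created as above:
player.stop();
player.reloadStaticData(); // rewinds a static track to the start of its buffer
player.play();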
I have some H.264 frames already encoded with the Android encoder. Now I want to create an MP4 file and write them to it one by one. Please advise how to do this on Android in Java; I don't want to use OpenCV or native code.
mp4parser can't do this, as I understand it.
The native MPEG4Writer is too complicated to use.
I wonder why such a common and very useful thing as an MP4 writer for frames that do not come from the camera is not implemented in Android.
You may use my mp4parser project. You can mux H264 and AAC in pure Java with it:
// Wrap the raw H.264 elementary stream as a track.
H264TrackImpl h264Track = new H264TrackImpl(new FileInputStream("raw.h264").getChannel());
Movie m = new Movie();
m.addTrack(h264Track);
// Build the MP4 structure and write it to disk.
IsoFile out = new DefaultMp4Builder().build(m);
FileOutputStream fos = new FileOutputStream(new File("h264_output.mp4"));
out.getBox(fos.getChannel());
fos.close();
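Adding an audio track works the same way. A sketch, assuming an AAC elementary stream in ADTS framing; note that the exact AACTrackImpl constructor signature varies between mp4parser versions, so treat this as illustrative:
AACTrackImpl aacTrack = new AACTrackImpl(new FileInputStream("audio.aac").getChannel()); // hypothetical input file
m.addTrack(aacTrack); // add alongside the H264 track before calling build(m)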