Compress videos using the Android MediaCodec API

I want to compress a locally saved video file to a smaller size in order to upload it to a server.
Since I am using MediaCodec, I found some tips on compressing video. Here are the steps I followed:
1. Extract the media file using MediaExtractor and decode it.
2. Create the encoder with the required output format.
3. Create a muxer to save the file to local storage (not complete).
Question: I don't know how to encode the already decoded stream and save it to local storage using MediaMuxer.
public class CompressMedia {

    private static final String SAMPLE = Environment
            .getExternalStorageDirectory() + "/DCIM/Camera/20140506_174959.mp4";
    private static final String OUTPUT_PATH = Environment
            .getExternalStorageDirectory() + "/DCIM/Camera/20140506_174959_REC.mp4";

    private MediaExtractor extractor;
    private MediaCodec decoder;
    private MediaCodec encoder;
    String mime;
    private static final String MIME_TYPE = "video/avc";

    public void extractMediaFile() {
        // work plan:
        // locate the media file
        // extract it using MediaExtractor
        // retrieve decoded frames
        extractor = new MediaExtractor();
        try {
            extractor.setDataSource(SAMPLE);
        } catch (IOException e) {
            // file not found
            e.printStackTrace();
        }

        // select the video track and create a decoder for it
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            mime = format.getString(MediaFormat.KEY_MIME);
            if (mime.startsWith("video/")) {
                extractor.selectTrack(i);
                decoder = MediaCodec.createDecoderByType(mime);
                decoder.configure(format, null, null, 0);
                break;
            }
        }

        if (decoder == null) {
            Log.e("DecodeActivity", "Can't find video info!");
            return;
        }

        // - start decoder -
        decoder.start();
        // - decoded frames can be obtained here -
    }

    private void createEncoder() {
        // creates the media encoder and sets the target format
        encoder = MediaCodec.createEncoderByType(MIME_TYPE);
        // init media format
        MediaFormat mediaFormat = MediaFormat.createVideoFormat(MIME_TYPE, /* 640 */ 320, /* 480 */ 240);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 400000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        encoder.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        // - the encoded data format is available here -
    }

    private void createMuxer() {
        // creates the media muxer - the muxer will be used to write the final
        // stream into the desired file :)
        try {
            MediaMuxer muxer = new MediaMuxer(OUTPUT_PATH, OutputFormat.MUXER_OUTPUT_MPEG_4);
            int videoTrackIndex = muxer.addTrack(encoder.getOutputFormat());
            //muxer.writeSampleData(videoTrackIndex, inputBuffers, bufferInfo);
            muxer.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
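From the examples I have read, I think the missing step should look roughly like the sketch below: drain the encoder's output buffers and pass each encoded buffer to MediaMuxer.writeSampleData, adding the track and starting the muxer only once the encoder reports its output format. I am not sure this is correct, and it is simplified (feeding the decoder's output into the encoder and signalling end-of-stream to the encoder are not shown):

    // Sketch: drain the encoder's output and write it through MediaMuxer.
    // Assumes the "encoder" field above is configured and started;
    // INFO_OUTPUT_BUFFERS_CHANGED handling is omitted for brevity.
    private void drainEncoderToMuxer(MediaMuxer muxer) {
        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        int videoTrackIndex = -1;
        boolean muxerStarted = false;
        while (true) {
            int outputIndex = encoder.dequeueOutputBuffer(bufferInfo, 10000);
            if (outputIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                break; // no encoded output available right now
            } else if (outputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // the track must be added with the encoder's *output* format,
                // and the muxer started, before any sample data is written
                videoTrackIndex = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
                muxerStarted = true;
            } else if (outputIndex >= 0) {
                ByteBuffer encodedData = encoder.getOutputBuffers()[outputIndex];
                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    bufferInfo.size = 0; // codec config data is already carried in the track format
                }
                if (bufferInfo.size != 0 && muxerStarted) {
                    encodedData.position(bufferInfo.offset);
                    encodedData.limit(bufferInfo.offset + bufferInfo.size);
                    muxer.writeSampleData(videoTrackIndex, encodedData, bufferInfo);
                }
                encoder.releaseOutputBuffer(outputIndex, false);
                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    break; // encoder signalled end of stream
                }
            }
        }
    }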
Here are the links that I followed:
Android MediaCodec: Reduce mp4 video size and
Video compression on android using new MediaCodec Library

You can try Intel INDE (https://software.intel.com/en-us/intel-inde) and the Media Pack for Android, which is part of INDE; tutorials are at https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials. It has a sample that shows how to use the Media Pack to transcode (i.e. recompress) video files. You can set a smaller resolution and/or bitrate for the output to get a smaller file.
In ComposerTranscodeCoreActivity.java:
protected void setTranscodeParameters(MediaComposer mediaComposer) throws IOException {
    mediaComposer.addSourceFile(mediaUri1);
    mediaComposer.setTargetFile(dstMediaPath);

    configureVideoEncoder(mediaComposer, videoWidthOut, videoHeightOut);
    configureAudioEncoder(mediaComposer);
}

protected void transcode() throws Exception {
    factory = new AndroidMediaObjectFactory(getApplicationContext());
    mediaComposer = new MediaComposer(factory, progressListener);
    setTranscodeParameters(mediaComposer);
    mediaComposer.start();
}

Related

Add RIFF header to a buffer

For two weeks now I have been trying to record audio in Xamarin that has a RIFF header.
I tried the Audio Recorder Plugin, AudioRecord, and MediaRecorder. I asked several questions but got no answer.
The easiest way was the Audio Recorder Plugin, but its output has no RIFF header.
The output of MediaRecorder was .3gp, which I couldn't convert to .wav,
and the output of AudioRecord was .pcm, which I also couldn't convert to .wav.
Here is the last code I tried:
#region Properties
int SAMPLING_RATE_IN_HZ = 44100;
ChannelIn CHANNEL_CONFIG = ChannelIn.Mono;
Android.Media.Encoding AUDIO_FORMAT = Android.Media.Encoding.Pcm16bit;
int BUFFER_SIZE_FACTOR = 2;
int BUFFER_SIZE;
bool RecordingInProgress = false;
private AudioRecord recorder = null;
#endregion

public void Record()
{
    BUFFER_SIZE = AudioRecord.GetMinBufferSize(SAMPLING_RATE_IN_HZ,
        CHANNEL_CONFIG, AUDIO_FORMAT) * BUFFER_SIZE_FACTOR;
    recorder = new AudioRecord(AudioSource.Mic, SAMPLING_RATE_IN_HZ,
        CHANNEL_CONFIG, AUDIO_FORMAT, BUFFER_SIZE);
    recorder.StartRecording();
    RecordingInProgress = true;
    RecordingTask();
}

public Task RecordingTask()
{
    return Task.Run(() =>
    {
        string path = "appdir/demo.pcm";
        MemoryStream buffer = new MemoryStream(BUFFER_SIZE);
        FileOutputStream outStream = new FileOutputStream(path);
        var demo2 = RecordingInProgress;
        while (RecordingInProgress)
        {
            int result = recorder.Read(buffer.GetBuffer(), 0, BUFFER_SIZE);
            if (result < 0)
            {
                throw new Exception("Reading of audio buffer failed: ");
            }
            outStream.Write(buffer.GetBuffer(), 0, BUFFER_SIZE);
        }
    });
}

public void Stop()
{
    if (null == recorder)
    {
        return;
    }
    RecordingInProgress = false;
    recorder.Stop();
    recorder.Release();
    recorder = null;
}
This code produces a .pcm file that can't be converted to anything, even with cloud converters.
I also tried this:
NWaves.Audio.WaveFile waveFile = new NWaves.Audio.WaveFile(buffer.GetBuffer());
waveFile.SaveTo(new FileStream("appdir/demo.wav", FileMode.Create));
instead of outStream.Write(buffer.GetBuffer(), 0, BUFFER_SIZE); at the end of the while block,
but it says: No RIFF found.
There are four or five ways to record audio, but a package like NWaves can't work with any of them.
The last thing I want to try is to add a RIFF header to the recorded audio buffer (bytes) programmatically, or to convert the .3gp or .pcm file to .wav programmatically.
Summary: please help me record audio in Xamarin in a form that NWaves can work with.
Thanks
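For reference, the canonical PCM WAV (RIFF) header is just 44 bytes placed in front of the raw samples, so it can be written by hand. Below is a rough sketch in Java of prepending it to a finished .pcm file; it assumes 16-bit little-endian PCM (Pcm16bit as above), and the class and method names are made up for illustration:

    import java.io.IOException;
    import java.io.RandomAccessFile;
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    public class WavHeaderWriter {
        // Writes the 44-byte canonical PCM WAV (RIFF) header followed by the raw PCM data.
        // Assumes 16-bit little-endian PCM samples.
        public static void prependWavHeader(String pcmPath, String wavPath,
                                            int sampleRate, int channels) throws IOException {
            RandomAccessFile pcm = new RandomAccessFile(pcmPath, "r");
            RandomAccessFile wav = new RandomAccessFile(wavPath, "rw");
            int bitsPerSample = 16;
            long dataSize = pcm.length();
            int byteRate = sampleRate * channels * bitsPerSample / 8;

            ByteBuffer header = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
            header.put("RIFF".getBytes());               // chunk id
            header.putInt((int) (36 + dataSize));        // chunk size = 36 + data bytes
            header.put("WAVE".getBytes());               // format
            header.put("fmt ".getBytes());               // sub-chunk 1 id
            header.putInt(16);                           // sub-chunk 1 size for PCM
            header.putShort((short) 1);                  // audio format 1 = PCM
            header.putShort((short) channels);           // number of channels
            header.putInt(sampleRate);                   // sample rate
            header.putInt(byteRate);                     // byte rate
            header.putShort((short) (channels * bitsPerSample / 8)); // block align
            header.putShort((short) bitsPerSample);      // bits per sample
            header.put("data".getBytes());               // sub-chunk 2 id
            header.putInt((int) dataSize);               // sub-chunk 2 size

            wav.write(header.array());
            byte[] buf = new byte[8192];
            int read;
            while ((read = pcm.read(buf)) > 0) {         // copy the raw PCM after the header
                wav.write(buf, 0, read);
            }
            pcm.close();
            wav.close();
        }
    }

With a header like this in place, a reader such as NWaves should accept the file, since "No RIFF found" only means those first 44 bytes are missing.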

How to video record with specific sound programmatically in android?

I have created functionality to record video in my app.
When I play a song, that song is recorded along with the video and a video file is created, similar to the Dubsmash application.
The problem I am facing is that other sounds, such as nearby noise, also get recorded. The song is played on the video recording screen; I start it when the video recording activity launches.
How can I have my application record only the song with the video?
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
Is there any way to set the audio source to the speaker, since the song is playing through the speaker? If there is another possible way, please let me know.
You can record the video without audio and merge the audio in later using mp4parser, like this:
/*
 * @param videoFile path to the video file
 * @param audioFile path to the audio file
 */
public String mux(String videoFile, String audioFile) {
    Movie video = null;
    try {
        video = new MovieCreator().build(videoFile);
    } catch (RuntimeException e) {
        e.printStackTrace();
        return null;
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    }

    Movie audio = null;
    try {
        audio = new MovieCreator().build(audioFile);
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    } catch (NullPointerException e) {
        e.printStackTrace();
        return null;
    }

    // take the last audio track from the audio file and add it to the video
    int size = audio.getTracks().size();
    Track audioTrack = audio.getTracks().get(size - 1);
    video.addTrack(audioTrack);

    Container out = new DefaultMp4Builder().build(video);

    File myDirectory = new File(Environment.getExternalStorageDirectory(), "/Folder Name");
    if (!myDirectory.exists()) {
        myDirectory.mkdirs();
    }
    String filePath = myDirectory + "/video" + System.currentTimeMillis() + ".mp4";
    try {
        RandomAccessFile ram = new RandomAccessFile(String.format(filePath), "rw");
        FileChannel fc = ram.getChannel();
        out.writeContainer(fc);
        ram.close();
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    }
    return filePath;
}
In build.gradle, add the following dependency:
compile 'com.googlecode.mp4parser:isoparser:1.0.5.4'
If you want to do more editing work with video, then you should use the FFmpeg library; with it you can work with video directly.
I have already given an answer to "How to use ffmpeg in android studio?" - see this LINK. Go through it step by step and import it into your project.
You can use a MediaRecorder without calling setAudio* on it.
Remove this line:
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
See this link.
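As far as I know, keeping setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH)) after removing the audio source will fail, because the profile still tries to configure the audio encoder, so for a truly video-only recording the video parameters have to be set individually. A rough sketch (sizes, bitrate and output path are placeholder values):

    // Sketch of a video-only MediaRecorder setup (no setAudio* calls at all).
    MediaRecorder mediaRecorder = new MediaRecorder();
    mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    mediaRecorder.setVideoSize(1280, 720);
    mediaRecorder.setVideoFrameRate(30);
    mediaRecorder.setVideoEncodingBitRate(3000000);
    mediaRecorder.setOutputFile("/sdcard/video_only.mp4"); // placeholder path
    mediaRecorder.prepare();
    mediaRecorder.start();

The resulting video-only file can then be merged with the song using the mux() method from the first answer.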
There is currently no way to directly record Android's audio output without "background noise".
Note that access to other apps' audio output is restricted as a security concern, so it is very unlikely that this could be achieved directly.
See this answer.

No sound is captured in video created using FFmpegFrameGrabber, FrameRecorder JavaCv in android

I am trying to apply a grayscale effect to a video file using JavaCV on Android. Everything works fine, but there is no audio in the output file. Below is the code. Please help.
File file = new File(Environment.getExternalStorageDirectory() + File.separator + "test3.mp4");
FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(file.getAbsolutePath());
FrameRecorder recorder = null;
Log.d("bharat", " Audio channels = " + frameGrabber.getAudioChannels()); // this is returning 0 and not 2
recorder = new FFmpegFrameRecorder("/mnt/sdcard/streaml_t.mp4", 270, 480, frameGrabber.getAudioChannels());
recorder.setVideoCodec(AV_CODEC_ID_H264);
recorder.setFormat("mp4");
recorder.setFrameRate(frameGrabber.getFrameRate());
recorder.setSampleFormat(frameGrabber.getSampleFormat());
recorder.setSampleRate(frameGrabber.getSampleRate());
try {
    recorder.start();
    frameGrabber.start();
    int count = 0;
    while (true) {
        try {
            Frame grabFrame = frameGrabber.grabFrame();
            Log.d("bharat:", "frame " + count++);
            if (grabFrame == null) {
                System.out.println("!!! Failed cvQueryFrame");
                break;
            }
            IplImage frame_copy = cvCreateImage(cvSize(grabFrame.image.width(), grabFrame.image.height()),
                    IPL_DEPTH_8U, 1);
            cvCvtColor(grabFrame.image, frame_copy, CV_RGB2GRAY);
            grabFrame.image = frame_copy;
            recorder.setTimestamp(frameGrabber.getTimestamp());
            recorder.record(grabFrame);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    Log.d("bharat:", "frame done");
    recorder.stop();
    recorder.release();
} catch (Exception e) {
    e.printStackTrace();
}
P.S.: I found that frameGrabber.getAudioChannels() returns 0 and not 2 (in case it helps).
I once ran into a problem caused by the sparse JavaCV docs, for example:
/** Grab next videoframe */
public IplImage grab()
/** Grab next video or audio frame */
public Frame grabFrame()
So grabFrame() gives you either an audio or a video frame, and you are always processing it as an image. You should check whether it is an audio or a video frame first, and not call cvCvtColor() on audio frames.
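With the older JavaCV API used in the question, the loop body could therefore look roughly like this sketch: only frames that actually carry an image get the grayscale conversion, while audio frames are recorded untouched (field names depend on the JavaCV version in use):

    // Sketch of the loop body: convert only video frames, pass audio frames through.
    Frame grabFrame = frameGrabber.grabFrame();
    if (grabFrame == null) {
        break; // no more frames
    }
    if (grabFrame.image != null) {
        // video frame: apply the grayscale conversion as before
        IplImage frame_copy = cvCreateImage(cvSize(grabFrame.image.width(), grabFrame.image.height()),
                IPL_DEPTH_8U, 1);
        cvCvtColor(grabFrame.image, frame_copy, CV_RGB2GRAY);
        grabFrame.image = frame_copy;
    }
    // audio frames (image == null) reach this point untouched
    recorder.setTimestamp(frameGrabber.getTimestamp());
    recorder.record(grabFrame);

Note also that the recorder needs a non-zero audio channel count to write audio at all; grabber properties are only filled in by start(), so calling frameGrabber.getAudioChannels() before frameGrabber.start() (as in the P.S. above) reports 0.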

MediaMetadataRetriever getFrameAtTime to retrieve video frame

private void sample() {
    int FRAME_BYTES = 326;
    int FRAMESMAX = 36;
    String subFolder = "media";
    String mediaFileName = "sample.mp4";
    MediaMetadataRetriever mediaMetadata = new MediaMetadataRetriever();
    try {
        AssetFileDescriptor afd = getApplicationContext().getAssets()
                .openFd(subFolder + File.separator + mediaFileName);
        mediaMetadata.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
        Bitmap frame = null;
        for (int currentFrame = 0; currentFrame < FRAMESMAX; currentFrame++) {
            if (currentFrame <= 0) {
                frame = mediaMetadata.getFrameAtTime();
            } else {
                frame = mediaMetadata.getFrameAtTime(FRAME_BYTES * currentFrame * 1000,
                        MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
                //currentFrame++;
            }
            // do something with the frame
        }
    } catch (Exception e) {
        Log.i(TAG, " unable to get file descriptor of the frame" + e.toString());
    }
}
I am able to read frames from mp4 media files on the emulator and on other devices, but the Samsung Galaxy S III throws an error saying that
MediaMetadataRetriever getFrameAtTime failed to retrieve video frames.
Any input on this?
To set the data source, you can also use a plain file path instead of the asset file descriptor:
mediaMetadata.setDataSource(Environment.getExternalStorageDirectory().getPath() + "/your_folder/your_file");
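Also note that getFrameAtTime() expects the position in microseconds, while METADATA_KEY_DURATION is reported in milliseconds. A rough sketch of stepping through a clip at evenly spaced sync frames (videoPath and the frame count are illustrative):

    // Sketch: pull evenly spaced thumbnails from a clip.
    // getFrameAtTime() takes MICROseconds; METADATA_KEY_DURATION is in milliseconds.
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    retriever.setDataSource(videoPath); // e.g. a file path on external storage
    long durationMs = Long.parseLong(
            retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION));
    int frames = 36;                    // how many thumbnails to pull
    long stepUs = (durationMs * 1000L) / frames;
    for (int i = 0; i < frames; i++) {
        Bitmap frame = retriever.getFrameAtTime(i * stepUs,
                MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
        // frame may be null on some devices/codecs, so check before using it
        if (frame != null) {
            // do something with the bitmap
        }
    }
    retriever.release();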
