Android FFmpeg grab frames in parallel from a video

Is there any way to read frames from an mp4 video using JavaCV in parallel?
I know that we could grab frames using FFmpegFrameGrabber, but is there a more efficient method, like using FrameGrabber.Array? I tried the code below, but it's not working.
frames = new Frame[grabber.getLengthInFrames()];
frameGrabbers = new FFmpegFrameGrabber[grabber.getLengthInFrames()];
// for (FFmpegFrameGrabber grabber : frameGrabbers) {
//     grabber = new FFmpegFrameGrabber(path);
// }
for (int i = 0; i < grabber.getLengthInFrames(); i++) {
    frameGrabbers[i] = new FFmpegFrameGrabber(path);
}
grabberArray = grabber.createArray(frameGrabbers);
grabberArray.start();
frames = grabberArray.grab();
grabberArray.release();
The app crashes when I call grabberArray.start().
Thanks.
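A note on why this likely crashes: FrameGrabber.Array appears meant for keeping a small number of grabbers in sync (e.g. several cameras), not one grabber per frame; allocating getLengthInFrames() FFmpegFrameGrabber instances will exhaust memory and file handles long before start() gets anywhere. A more workable pattern is one grabber per worker thread, each decoding its own contiguous segment of the file. A minimal sketch, assuming the standard JavaCV API (the class name and segment split below are illustrative, not from the thread):

import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;
import java.util.concurrent.*;

public class ParallelGrabber {
    // Each worker opens its own grabber, so nothing is shared across threads.
    static void grabSegment(String path, int from, int to) throws Exception {
        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(path);
        grabber.start();
        grabber.setFrameNumber(from);      // seek to the segment start
        for (int i = from; i < to; i++) {
            Frame frame = grabber.grabImage();
            if (frame == null) break;      // end of stream
            // process (or clone()) the frame here; the grabber reuses
            // the Frame object on the next call
        }
        grabber.stop();
        grabber.release();
    }

    public static void main(String[] args) throws Exception {
        final String path = args[0];
        FFmpegFrameGrabber probe = new FFmpegFrameGrabber(path);
        probe.start();
        final int total = probe.getLengthInFrames();
        probe.stop();
        probe.release();

        final int threads = 4;
        final int chunk = (total + threads - 1) / threads;
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int t = 0; t < threads; t++) {
            final int from = t * chunk;
            final int to = Math.min(from + chunk, total);
            pool.submit(() -> { grabSegment(path, from, to); return null; });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.MINUTES);
    }
}

Whether this is actually faster depends on the codec and on seek cost: seeking to a non-key frame makes FFmpeg decode forward from the previous key frame anyway.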

Related

Android: Read audio data from file uri

I want to analyse an audio file (an mp3 in particular) which the user can select, and determine what notes are played, when they're played, and at what frequency.
I already have some working code for my computer, but I want to be able to use this on my phone as well.
In order to do this however, I need access to the bytes of the audio file. On my PC I could just open a stream and use AudioFormat to decode it and then read() the bytes frame by frame.
Looking at the Android Developer Forums I can only find classes and examples for playing a file (without access to the bytes) or recording to a file (I want to read from a file).
I'm pretty confident that I can set up a file chooser, but once I have the Uri from that, I don't know how to get a stream or the bytes.
Any help would be much appreciated :)
Edit: Is a similar solution to this possible? Android - Read a File
I don't know if I could decode the audio file that way or if there would be any problems with the Android API...
So I solved it in the following way:
Get an InputStream with
final InputStream inputStream = getContentResolver().openInputStream(selectedUri);
Then pass it in this function and decode it using classes from JLayer:
private synchronized void decode(InputStream in)
        throws BitstreamException, DecoderException {
    ArrayList<Short> output = new ArrayList<>(1024);
    Bitstream bitstream = new Bitstream(in);
    Decoder decoder = new Decoder();

    float total_ms = 0f;
    float nextNotify = -1f;

    boolean done = false;
    while (!done) {
        Header frameHeader = bitstream.readFrame();
        if (total_ms > nextNotify) {
            mListener.OnDecodeUpdate((int) total_ms);
            nextNotify += 500f;
        }
        if (frameHeader == null) {
            done = true;
        } else {
            total_ms += frameHeader.ms_per_frame();
            SampleBuffer buffer = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream); // CPU intense
            if (buffer.getSampleFrequency() != 44100 || buffer.getChannelCount() != 2) {
                throw new DecoderException("mono or non-44100 MP3 not supported", null);
            }
            short[] pcm = buffer.getBuffer();
            for (int i = 0; i < pcm.length - 1; i += 2) {
                short l = pcm[i];
                short r = pcm[i + 1];
                short mono = (short) ((l + r) / 2f);
                output.add(mono); // RAM intense
            }
        }
        bitstream.closeFrame();
    }
    bitstream.close();
    mListener.OnDecodeComplete(output);
}
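For context, decode() blocks until the whole file is processed and is CPU-bound, so it should run off the main thread. A minimal wiring sketch (selectedUri and the listener are assumed to exist as in the snippets above):

new Thread(() -> {
    try {
        final InputStream inputStream =
                getContentResolver().openInputStream(selectedUri);
        decode(inputStream);
    } catch (Exception e) {
        e.printStackTrace(); // surface this to the UI in real code
    }
}).start();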
The full project (in case you want to look up the particulars) can be found here:
https://github.com/S7uXN37/MusicInterpreterStudio/

FFMPEG x264 encoder Android

I've compiled an FFMPEG library with libx264 for use on Android, using the NDK.
I want to encode an MPEG video file; however, the application fails when opening the encoder codec, in avcodec_open2.
The FFMPEG logs I receive from avcodec_open2 are below, with the function returning -22.
Picture size %ux%u is invalid.
ignoring invalid width/height values
Specified pix_fmt is not supported
On Windows this code works fine; it's only on Android that there is a failure. Any ideas why this would fail on Android?
if (!(codec = avcodec_find_encoder(AV_CODEC_ID_MPEG1VIDEO)))
{
    return -1;
}

// Allocate context based on codec
if (!(context = avcodec_alloc_context3(codec)))
{
    return -2;
}

// Setup context
// put sample parameters
context->bit_rate = 4000000;
// resolution must be a multiple of two
context->width = 1280;
context->height = 720;
// frames per second
context->time_base = (AVRational){1, 25};
context->inter_quant_bias = 96;
context->gop_size = 10;
context->max_b_frames = 1;
// IDs
context->pix_fmt = AV_PIX_FMT_YUV420P;
context->codec_id = AV_CODEC_ID_MPEG1VIDEO;
context->codec_type = AVMEDIA_TYPE_VIDEO;

// Note: always false as written; only relevant when the codec id is H.264
if (AV_CODEC_ID_MPEG1VIDEO == AV_CODEC_ID_H264)
{
    av_opt_set(context->priv_data, "preset", "slow", 0);
}

if ((result = avcodec_open2(context, codec, NULL)) < 0)
{
    // Failed opening codec!
}
This problem was caused by building FFMPEG with outdated source code.
I got the most recent source from https://www.ffmpeg.org/ and compiled it in the same way and the new library works fine.
Note: I hadn't considered the full licensing implications of using libx264. I've since dropped it.

Extract bitmap from video in android

I am trying to extract all frames from a video.
With the following code I want to fetch the first 30 frames of a video, but I get only the first frame 30 times.
private ArrayList<Bitmap> getFrames(String path) {
    try {
        ArrayList<Bitmap> bArray = new ArrayList<Bitmap>();
        bArray.clear();
        MediaMetadataRetriever mRetriever = new MediaMetadataRetriever();
        mRetriever.setDataSource("/sdcard/myvideo.mp4");
        for (int i = 0; i < 30; i++) {
            bArray.add(mRetriever.getFrameAtTime(1000 * i,
                    MediaMetadataRetriever.OPTION_CLOSEST_SYNC));
        }
        return bArray;
    } catch (Exception e) {
        return null;
    }
}
Now, how can I get all frames from a video?
Video support in Android SDK is limited and frame extraction for H264 encoded videos is only possible for key frames. In order to extract an arbitrary frame, you'll need to use a library like FFmpegMediaMetadataRetriever which uses native code to extract data from the video. It is very fast, comes with precompiled binaries (for ARM and x86) so you don't need to delve into C++ and makefiles, is licensed under Apache 2.0 and it comes with a demo Android app.
There is also a pure Java library, JCodec, but it's slower, and when I used it last year the colors of the extracted frames were distorted.
You have to pass the path to this method. Perfectly working code! Hope it's helpful.
Gradle:
implementation 'com.github.wseemann:FFmpegMediaMetadataRetriever-core:1.0.15'
public void VideoToGif(String uri) {
    Uri videoFileUri = Uri.parse(uri);
    FFmpegMediaMetadataRetriever retriever = new FFmpegMediaMetadataRetriever();
    retriever.setDataSource(uri);
    List<Bitmap> rev = new ArrayList<Bitmap>();
    MediaPlayer mp = MediaPlayer.create(GitToImage.this, videoFileUri);
    int millis = mp.getDuration();
    System.out.println("starting point");
    for (int i = 100000; i <= millis * 1000; i += 100000 * 2) {
        Bitmap bitmap = retriever.getFrameAtTime(i, FFmpegMediaMetadataRetriever.OPTION_CLOSEST);
        rev.add(bitmap);
    }
    GiftoImage((ArrayList) rev);
}
getFrameAtTime takes a timestamp in microseconds, but your loop advances by only 1000 microseconds (one millisecond) per iteration, so OPTION_CLOSEST_SYNC keeps returning the same key frame.
for (int i = 1000000; i < millis * 1000; i += 1000000) { // 1,000,000 µs = 1 s per step
    bArray.add(mRetriever.getFrameAtTime(i,
            MediaMetadataRetriever.OPTION_CLOSEST_SYNC));
}
Change it as shown above; that sample does what you want. I also answered it here.
Starting with Android 9.0 (API level 28), MediaMetadataRetriever has a getFrameAtIndex(int frameIndex) method, which accepts the zero-based index of the frame you want and returns a Bitmap.
See https://developer.android.com/reference/android/media/MediaMetadataRetriever.html#getFrameAtIndex(int)
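For completeness, a minimal sketch of that API (not from the answer above): METADATA_KEY_VIDEO_FRAME_COUNT, also added in API 28, gives the frame count. Note that keeping every frame as a Bitmap will run out of memory on anything but a very short video.

import android.graphics.Bitmap;
import android.media.MediaMetadataRetriever;
import java.util.ArrayList;
import java.util.List;

// API 28+ only: decode every frame by index instead of by timestamp.
List<Bitmap> getAllFrames(String path) throws Exception {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    retriever.setDataSource(path);
    int frameCount = Integer.parseInt(retriever.extractMetadata(
            MediaMetadataRetriever.METADATA_KEY_VIDEO_FRAME_COUNT));
    List<Bitmap> frames = new ArrayList<>(frameCount);
    for (int i = 0; i < frameCount; i++) {
        frames.add(retriever.getFrameAtIndex(i)); // decodes exactly frame i
    }
    retriever.release();
    return frames;
}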

how to integrate audio with video in android using javacv/opencv?

For my application I created a video from a set of images using javacv/opencv on Android, but the video plays without sound, so I want to add my recorded audio (an mp3 file) to the generated video. How can I achieve this?
This is the code I use to create the video from the images:
String path = SCREENSHOT_FOLDER2;
File folder = new File(path);
listOfFiles = folder.listFiles();
if (listOfFiles.length > 0) {
    iplimage = new opencv_core.IplImage[listOfFiles.length];
    for (int j = 0; j < listOfFiles.length; j++) {
        String files = "";
        if (listOfFiles[j].isFile()) {
            files = listOfFiles[j].getName();
        }
        String[] tokens = files.split("\\.(?=[^\\.]+$)");
        String name = tokens[0];
        System.out.println(" j " + name);
        iplimage[j] = cvLoadImage("/mnt/sdcard/images/" + name + ".png");
    }
}

File videopath = new File(SCREENSHOT_FOLDER3);
videopath.mkdirs();

FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
        SCREENSHOT_FOLDER3 + "video" + System.currentTimeMillis() + ".mp4", 320, 480);
try {
    recorder.setCodecID(CODEC_ID_MPEG4); // or CODEC_ID_MPEG1VIDEO
    recorder.setBitrate(sampleVideoBitRate);
    recorder.setFrameRate(10);
    recorder.setPixelFormat(PIX_FMT_YUV420P);
    recorder.start();
    int x = 0;
    int y = 0;
    for (int i = 0; i < 300 && x < iplimage.length; i++) {
        recorder.record(iplimage[x]);
        if (i > (y + 10)) {
            y = y + 1;
            x++;
        }
    }
    recorder.stop();
} catch (Exception e) {
    e.printStackTrace();
}
Now, how do I integrate the audio file (.mp3) into this code?
OpenCV, and consequently JavaCV, has no support for audio.
You have to go with a different library for it. Look at the Android support for video/audio, third-party libraries, or any other way you may find useful.
But don't expect OpenCV to help you here, because its support for audio is 0%.
I would not say that JavaCV has no support for audio, as it integrates a lot of libraries that OpenCV does not, ffmpeg for example. Check this long thread for that issue.
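To make that concrete: JavaCV's FFmpegFrameGrabber can decode the mp3 and FFmpegFrameRecorder can write its samples next to the video track. A rough sketch using the newer JavaCV Frame API (which postdates this question; the method name, paths and settings are illustrative):

import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;

// A second grabber decodes the mp3; its sample Frames go to the same
// recorder that receives the image frames.
void recordWithAudio(String mp3Path, String outPath) throws Exception {
    FFmpegFrameGrabber audio = new FFmpegFrameGrabber(mp3Path);
    audio.start();

    FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
            outPath, 320, 480, audio.getAudioChannels());
    recorder.setFrameRate(10);
    recorder.setSampleRate(audio.getSampleRate());
    recorder.start();

    // ... record the image frames here, as in the question ...

    Frame samples;
    while ((samples = audio.grabSamples()) != null) {
        recorder.record(samples); // writes the audio stream
    }

    recorder.stop();
    recorder.release();
    audio.stop();
    audio.release();
}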

How to encode non-camera video in Android

I am working on an android application in which a video is dynamically generated by compositing a sequence of animation frames. I tried to use the Android Media Recorder API for this but have not found a way to get it to accept a non-camera source as input. I have been attempting to use a FFMPEG port (based on the Rockplayer build) but am running into difficulties with missing functions since I am using it as an encoder, not a decoder.
The iPhone version of this app uses AVAssetWriter from the AVFoundation framework.
Is there an easier way to do this or am I stuck slugging it out with FFMPEG?
This may help (see the note on resolution, though):
How to encode using the FFMpeg in Android (using H263)
I'm not sure if they did a custom build of ffmpeg or not; if so, they may be able to offer advice on porting a more feature-complete version.
-Anthony
OpenCV has a ViewBase class which takes the input from the camera as a frame and represents the frame as a bitmap. You can extend the ViewBase class and adapt it for your own use, even though installing OpenCV on Android isn't very easy.
When you extend SampleCvViewBase you get the following function to override. It's pretty much hard work, but it's the best I can think of.
@Override
protected Bitmap processFrame(VideoCapture capture) {
    capture.retrieve(picture, Highgui.CV_CAP_ANDROID_COLOR_FRAME_RGBA);
    if (Utils.matToBitmap(picture, bmp))
        return bmp;
    bmp.recycle();
    return null;
}
You can use a pure Java open source library called JCodec (http://jcodec.org).
It contains a simple yet working H.264 encoder and MP4 muxer. The class below uses the JCodec low-level API and should be what you need (CORRECTED):
public class SequenceEncoder {
    private SeekableByteChannel ch;
    private Picture toEncode;
    private RgbToYuv420 transform;
    private H264Encoder encoder;
    private ArrayList<ByteBuffer> spsList;
    private ArrayList<ByteBuffer> ppsList;
    private CompressedTrack outTrack;
    private ByteBuffer _out;
    private int frameNo;
    private MP4Muxer muxer;

    public SequenceEncoder(File out) throws IOException {
        this.ch = NIOUtils.writableFileChannel(out);

        // Transform to convert between RGB and YUV
        transform = new RgbToYuv420(0, 0);

        // Muxer that will store the encoded frames
        muxer = new MP4Muxer(ch, Brand.MP4);

        // Add video track to muxer
        outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 25);

        // Allocate a buffer big enough to hold output frames
        _out = ByteBuffer.allocate(1920 * 1080 * 6);

        // Create an instance of encoder
        encoder = new H264Encoder();

        // Encoder extra data (SPS, PPS) to be stored in a special place of MP4
        spsList = new ArrayList<ByteBuffer>();
        ppsList = new ArrayList<ByteBuffer>();
    }

    public void encodeImage(BufferedImage bi) throws IOException {
        if (toEncode == null) {
            toEncode = Picture.create(bi.getWidth(), bi.getHeight(), ColorSpace.YUV420);
        }

        // Perform conversion
        for (int i = 0; i < 3; i++)
            Arrays.fill(toEncode.getData()[i], 0);
        transform.transform(AWTUtil.fromBufferedImage(bi), toEncode);

        // Encode image into H.264 frame, the result is stored in '_out' buffer
        _out.clear();
        ByteBuffer result = encoder.encodeFrame(_out, toEncode);

        // Based on the frame above form correct MP4 packet
        spsList.clear();
        ppsList.clear();
        H264Utils.encodeMOVPacket(result, spsList, ppsList);

        // Add packet to video track
        outTrack.addFrame(new MP4Packet(result, frameNo, 25, 1, frameNo, true, null, frameNo, 0));
        frameNo++;
    }

    public void finish() throws IOException {
        // Push saved SPS/PPS to a special storage in MP4
        outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));

        // Write MP4 header and finalize recording
        muxer.writeHeader();
        NIOUtils.closeQuietly(ch);
    }

    public static void main(String[] args) throws IOException {
        SequenceEncoder encoder = new SequenceEncoder(new File("video.mp4"));
        for (int i = 1; i < 100; i++) {
            BufferedImage bi = ImageIO.read(new File(String.format("folder/img%08d.png", i)));
            encoder.encodeImage(bi);
        }
        encoder.finish();
    }
}
You can get the JCodec jar from the project web-site.
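One caveat for this question's Android context: BufferedImage, ImageIO and AWTUtil are not available in the Android SDK. The jcodec-android artifact ships a BitmapUtil for converting android.graphics.Bitmap to a JCodec Picture; assuming that helper (check the exact API of the JCodec version you use), the conversion step would look roughly like:

// Hypothetical Android variant of encodeImage(): convert a Bitmap to a
// JCodec Picture instead of going through AWT.
public void encodeBitmap(android.graphics.Bitmap bmp) throws IOException {
    if (toEncode == null) {
        toEncode = Picture.create(bmp.getWidth(), bmp.getHeight(), ColorSpace.YUV420);
    }
    for (int i = 0; i < 3; i++)
        Arrays.fill(toEncode.getData()[i], 0);
    transform.transform(BitmapUtil.fromBitmap(bmp), toEncode);
    // ... the rest (encode, packetize, addFrame) is identical to encodeImage() ...
}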
