How to integrate audio with video in Android using JavaCV/OpenCV?

For my application I created a video from a set of images using JavaCV/OpenCV on Android, but the video plays without sound. I want to add my recorded audio (an MP3 file) to the generated video. How can I achieve this?
This is the code I use to build the video from the images:
String path = SCREENSHOT_FOLDER2;
File folder = new File(path);
listOfFiles = folder.listFiles();
if (listOfFiles.length > 0) {
    iplimage = new opencv_core.IplImage[listOfFiles.length];
    for (int j = 0; j < listOfFiles.length; j++) {
        String files = "";
        if (listOfFiles[j].isFile()) {
            files = listOfFiles[j].getName();
        }
        String[] tokens = files.split("\\.(?=[^\\.]+$)");
        String name = tokens[0];
        System.out.println(" j " + name);
        iplimage[j] = cvLoadImage("/mnt/sdcard/images/" + name + ".png");
    }
}
File videopath = new File(SCREENSHOT_FOLDER3);
videopath.mkdirs();
FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
        SCREENSHOT_FOLDER3 + "video" + System.currentTimeMillis() + ".mp4", 320, 480);
try {
    recorder.setCodecID(CODEC_ID_MPEG4); // CODEC_ID_MPEG1VIDEO also works
    recorder.setBitrate(sampleVideoBitRate);
    recorder.setFrameRate(10);
    recorder.setPixelFormat(PIX_FMT_YUV420P);
    recorder.start();
    int x = 0;
    int y = 0;
    for (int i = 0; i < 300 && x < iplimage.length; i++) {
        recorder.record(iplimage[x]); // was image[x], which does not exist
        if (i > (y + 10)) {
            y = y + 1;
            x++;
        }
    }
    recorder.stop();
} catch (Exception e) {
    e.printStackTrace();
}
Now, how do I integrate the audio file (.mp3) into this code?

OpenCV, and consequently JavaCV, has no support for audio.
You have to go with a different library for it. Look at Android's built-in video/audio support, third-party libraries, or any other approach you may find useful.
But don't expect OpenCV to help you here, because its audio support is nonexistent.

I would not say that JavaCV has no support for audio, as it integrates a number of libraries that OpenCV does not, FFmpeg for example. Check this long thread on that issue.
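To make the FFmpeg route concrete, here is a minimal sketch of muxing an existing MP3 into a generated video with JavaCV's FFmpegFrameGrabber and FFmpegFrameRecorder. The file paths are placeholders, and it assumes a JavaCV version whose recorder constructor takes an audio channel count; treat it as a starting point, not the definitive API.

```java
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;

public class MuxAudio {
    public static void main(String[] args) throws Exception {
        // Placeholder paths: the recorded MP3 and the output video.
        FFmpegFrameGrabber audio = new FFmpegFrameGrabber("/mnt/sdcard/audio.mp3");
        audio.start();

        // Passing the audio channel count makes the recorder open an audio stream too.
        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
                "/mnt/sdcard/video_with_audio.mp4", 320, 480, audio.getAudioChannels());
        recorder.setFrameRate(10);
        recorder.setSampleRate(audio.getSampleRate());
        recorder.start();

        // ... record the image frames exactly as in the question ...

        // Then copy every audio frame from the MP3 into the recorder.
        Frame frame;
        while ((frame = audio.grabFrame()) != null) {
            recorder.record(frame);
        }

        recorder.stop();
        audio.stop();
    }
}
```

Interleaving video and audio frames in one loop (a video frame, then the audio frames up to its timestamp) gives better A/V sync, but the simple two-pass version above is easier to follow.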

Related

Android Exoplayer play only high quality with HLS

I'd like to know if there is a way to tell ExoPlayer to play only the highest quality of an HLS stream. My problem is that it takes too much time to reach this quality, even though I have a good network.
If I could start playing at this quality, rather than at the lower one, that would be great.
Any ideas?
Regards,
Please modify HlsChunkSource.java as follows to pick the highest variant.
OLD:
protected int computeDefaultVariantIndex(HlsMasterPlaylist playlist, Variant[] variants,
        BandwidthMeter bandwidthMeter) {
    int defaultVariantIndex = 0;
    int minOriginalVariantIndex = Integer.MAX_VALUE;
    for (int i = 0; i < variants.length; i++) {
        int originalVariantIndex = playlist.variants.indexOf(variants[i]);
        if (originalVariantIndex < minOriginalVariantIndex) {
            minOriginalVariantIndex = originalVariantIndex;
            defaultVariantIndex = i;
        }
    }
    return defaultVariantIndex;
}
Change to:
protected int computeDefaultVariantIndex(HlsMasterPlaylist playlist, Variant[] variants,
        BandwidthMeter bandwidthMeter) {
    int defaultVariantIndex = 0;
    int maxOriginalVariantIndex = Integer.MIN_VALUE; // now track the maximum index
    for (int i = 0; i < variants.length; i++) {
        int originalVariantIndex = playlist.variants.indexOf(variants[i]);
        if (originalVariantIndex > maxOriginalVariantIndex) {
            maxOriginalVariantIndex = originalVariantIndex;
            defaultVariantIndex = i;
        }
    }
    return defaultVariantIndex;
}
But if your device uses an Amlogic video codec (mostly set-top boxes), picking the highest variant causes video freezes, an issue Google closed as a device problem.

Android: Read audio data from file uri

I want to analyse an audio file (an MP3 in particular) which the user can select, and determine which notes are played, when they're played, and at what frequency.
I already have some working code for my computer, but I want to be able to use this on my phone as well.
In order to do this, however, I need access to the bytes of the audio file. On my PC I could just open a stream, use AudioFormat to decode it, and then read() the bytes frame by frame.
Looking at the Android developer documentation, I can only find classes and examples for playing a file (without access to the bytes) or recording to a file (I want to read from a file).
I'm pretty confident that I can set up a file chooser, but once I have the Uri from that, I don't know how to get a stream or the bytes.
Any help would be much appreciated :)
Edit: Is a solution similar to this possible? Android - Read a File
I don't know whether I could decode the audio file that way, or whether there would be any problems with the Android API...
So I solved it in the following way:
Get an InputStream with
final InputStream inputStream = getContentResolver().openInputStream(selectedUri);
Then pass it to this function and decode it using classes from JLayer:
private synchronized void decode(InputStream in)
        throws BitstreamException, DecoderException {
    ArrayList<Short> output = new ArrayList<>(1024);
    Bitstream bitstream = new Bitstream(in);
    Decoder decoder = new Decoder();

    float total_ms = 0f;
    float nextNotify = -1f;

    boolean done = false;
    while (!done) {
        Header frameHeader = bitstream.readFrame();
        if (total_ms > nextNotify) {
            mListener.OnDecodeUpdate((int) total_ms);
            nextNotify += 500f;
        }
        if (frameHeader == null) {
            done = true;
        } else {
            total_ms += frameHeader.ms_per_frame();
            SampleBuffer buffer = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream); // CPU intense
            if (buffer.getSampleFrequency() != 44100 || buffer.getChannelCount() != 2) {
                throw new DecoderException("mono or non-44100 MP3 not supported", null);
            }
            short[] pcm = buffer.getBuffer();
            for (int i = 0; i < pcm.length - 1; i += 2) {
                short l = pcm[i];
                short r = pcm[i + 1];
                short mono = (short) ((l + r) / 2f);
                output.add(mono); // RAM intense
            }
        }
        bitstream.closeFrame();
    }
    bitstream.close();
    mListener.OnDecodeComplete(output);
}
The full project (in case you want to look up the particulars) can be found here:
https://github.com/S7uXN37/MusicInterpreterStudio/

Android FFmpeg grab frames in parallel from a video

Is there any way to read frames from an mp4 video using JavaCV in parallel?
I know that we can grab frames using FFmpegFrameGrabber, but is there another, more efficient method, such as using FrameGrabber.Array? I tried the code below, but it's not working.
frames = new Frame[grabber.getLengthInFrames()];
frameGrabbers = new FFmpegFrameGrabber[grabber.getLengthInFrames()];
/*
for (FFmpegFrameGrabber grabber : frameGrabbers) {
    grabber = new FFmpegFrameGrabber(path);
}
*/
for (int i = 0; i < grabber.getLengthInFrames(); i++) {
    frameGrabbers[i] = new FFmpegFrameGrabber(path);
}
grabberArray = grabber.createArray(frameGrabbers);
grabberArray.start();
frames = grabberArray.grab();
grabberArray.release();
The app crashes when I call grabberArray.start().
Thanks.
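As an alternative sketch (my own workaround, not from the question): FFmpegFrameGrabber instances are not thread-safe, so rather than one shared array, give each worker its own grabber and a disjoint frame range, seeking with setFrameNumber. The path and worker count are placeholders.

```java
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelGrab {
    // First frame index of worker i when total frames are split over n workers.
    static int chunkStart(int total, int n, int i) {
        return (int) ((long) total * i / n);
    }

    public static void main(String[] args) throws Exception {
        String path = "/mnt/sdcard/video.mp4"; // placeholder
        int workers = 4;

        // One short-lived grabber just to count the frames.
        FFmpegFrameGrabber probe = new FFmpegFrameGrabber(path);
        probe.start();
        int total = probe.getLengthInFrames();
        probe.stop();

        ExecutorService pool = Executors.newFixedThreadPool(workers);
        for (int w = 0; w < workers; w++) {
            final int start = chunkStart(total, workers, w);
            final int end = chunkStart(total, workers, w + 1);
            pool.submit(() -> {
                // Each worker owns a private grabber for its chunk.
                FFmpegFrameGrabber g = new FFmpegFrameGrabber(path);
                try {
                    g.start();
                    g.setFrameNumber(start); // seek to this worker's chunk
                    for (int f = start; f < end; f++) {
                        Frame frame = g.grab();
                        // ... process frame ...
                    }
                    g.stop();
                } catch (Exception e) {
                    e.printStackTrace();
                }
                return null;
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.MINUTES);
    }
}
```

Note that seeking is only frame-accurate from the nearest key frame, so chunk boundaries may need trimming in practice.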

Play Multiple online videos in FFMPEG

I'm trying to make an Android application that uses FFmpeg to play online streams. I went through the Dolphin Player source, and with its help I achieved playing UDP streams on Android. Now I am trying to play multiple streams in my player. For example, I have a ListActivity with 10 names, and clicking each name should play the corresponding video. How can I achieve this with avformat_open_input? I am very new to FFmpeg. Please help me achieve this. Thanks in advance.
My c code is as follows:
static int decode_module_init(void *arg)
{
    VideoState *is = (VideoState *)arg;
    AVFormatContext *pFormatCtx;
    int err;
    int ret = -1;
    int i;
    int video_index = -1;
    int audio_index = -1;

    is->videoStream = -1;
    is->audioStream = -1;
    char *Filename = "udp://.....";

    global_video_state = is;

    pFormatCtx = avformat_alloc_context();
    pFormatCtx->interrupt_callback.callback = decode_interrupt_cb;
    pFormatCtx->interrupt_callback.opaque = is;

    err = avformat_open_input(&pFormatCtx, Filename, NULL, NULL);
    if (err < 0) {
        __android_log_print(ANDROID_LOG_INFO, "message", "File open failed");
        ret = -1;
        goto decode_module_init_fail;
    }
    __android_log_print(ANDROID_LOG_INFO, "message", "File open successful");
}

Extract bitmap from video in android

I am trying to extract all frames from a video.
With the following code I wanted to fetch the first 30 frames of a video, but I got only the first frame 30 times.
private ArrayList<Bitmap> getFrames(String path) {
    try {
        ArrayList<Bitmap> bArray = new ArrayList<Bitmap>();
        bArray.clear();
        MediaMetadataRetriever mRetriever = new MediaMetadataRetriever();
        mRetriever.setDataSource("/sdcard/myvideo.mp4");
        for (int i = 0; i < 30; i++) {
            bArray.add(mRetriever.getFrameAtTime(1000 * i,
                    MediaMetadataRetriever.OPTION_CLOSEST_SYNC));
        }
        return bArray;
    } catch (Exception e) {
        return null;
    }
}
Now, how can I get all frames from a video?
Video support in the Android SDK is limited, and frame extraction from H264-encoded videos is only possible for key frames. In order to extract an arbitrary frame, you'll need to use a library like FFmpegMediaMetadataRetriever, which uses native code to extract data from the video. It is very fast, comes with precompiled binaries (for ARM and x86) so you don't need to delve into C++ and makefiles, is licensed under Apache 2.0, and comes with a demo Android app.
There is also a pure Java library, JCodec, but it's slower, and when I used it last year the colors of the extracted frames were distorted.
You have to pass the path to this method. Perfectly working code, I hope it's helpful!
Gradle:
implementation 'com.github.wseemann:FFmpegMediaMetadataRetriever-core:1.0.15'
public void VideoToGif(String uri) {
    Uri videoFileUri = Uri.parse(uri);
    FFmpegMediaMetadataRetriever retriever = new FFmpegMediaMetadataRetriever();
    retriever.setDataSource(uri);
    List<Bitmap> rev = new ArrayList<Bitmap>();
    MediaPlayer mp = MediaPlayer.create(GitToImage.this, videoFileUri);
    int millis = mp.getDuration();
    System.out.println("starting point");
    // getFrameAtTime takes microseconds: step 200 ms at a time
    for (int i = 100000; i <= millis * 1000; i += 100000 * 2) {
        Bitmap bitmap = retriever.getFrameAtTime(i, FFmpegMediaMetadataRetriever.OPTION_CLOSEST);
        rev.add(bitmap);
    }
    GiftoImage((ArrayList) rev);
}
getFrameAtTime expects the time in microseconds, but your loop advances it by only 1000 µs (1 ms) per iteration, so OPTION_CLOSEST_SYNC keeps returning the same key frame.
for (int i = 1000000; i < millis * 1000; i += 1000000) { // 1000000 µs = 1 s per step
    bArray.add(mRetriever.getFrameAtTime(i,
            MediaMetadataRetriever.OPTION_CLOSEST_SYNC));
}
Change it as shown above; that is a sample of how to create what you want. I also answered it here.
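The unit mismatch in plain numbers (a standalone check, not from the answer):

```java
public class FrameStep {
    public static void main(String[] args) {
        // Original loop: getFrameAtTime(1000 * i, ...) advances 1000 µs = 1 ms per iteration,
        // so all 30 requests land inside the first key frame's interval.
        long originalStepUs = 1000;
        // Corrected loop: i += 1000000 advances 1000000 µs = 1 s per iteration.
        long correctedStepUs = 1_000_000;
        System.out.println(originalStepUs / 1000.0 + " ms vs " + correctedStepUs / 1000.0 + " ms");
        // prints: 1.0 ms vs 1000.0 ms
    }
}
```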
Starting with Android 9.0 (API level 28), MediaMetadataRetriever has a getFrameAtIndex(int frameIndex) method, which accepts the zero-based index of the frame you want and returns a Bitmap.
See https://developer.android.com/reference/android/media/MediaMetadataRetriever.html#getFrameAtIndex(int)
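A minimal sketch of the index-based approach on API 28+ (the method name firstFrames and the count parameter are my own; the video path is a placeholder):

```java
import android.graphics.Bitmap;
import android.media.MediaMetadataRetriever;
import java.util.ArrayList;
import java.util.List;

public class FrameExtractor {
    // Returns the first `count` frames of the video, by zero-based frame index.
    static List<Bitmap> firstFrames(String videoPath, int count) {
        MediaMetadataRetriever retriever = new MediaMetadataRetriever();
        try {
            retriever.setDataSource(videoPath);
            List<Bitmap> frames = new ArrayList<>(count);
            for (int i = 0; i < count; i++) {
                frames.add(retriever.getFrameAtIndex(i)); // exact frame, not nearest sync frame
            }
            return frames;
        } finally {
            retriever.release(); // free the native resources
        }
    }
}
```

Unlike getFrameAtTime with OPTION_CLOSEST_SYNC, this decodes the exact frame at each index, so 30 calls return 30 distinct frames.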
