I'm using Android's VideoView to play an embedded video in my app. It works fine on my device but I keep getting a "Can't play this video" message and a black screen in the Xamarin Android Player.
The corresponding error log looks like this:
Unable to play video
[MediaPlayer] Error (1,-38)
[VideoView] Error: 1,-38
I found a few posts regarding this error, but none of them helped me solve this issue, and I'm not able to find a proper description of this status code.
My C# code looks like this:
videoView = new VideoView (Context);
base.SetNativeControl (videoView);
videoView.SetOnErrorListener (new ErrorListener ());

// Strip the file extension and look up the video in the "raw" resources
string fileName = e.NewElement.FileSource;
fileName = fileName.ToLower ().Substring (0, fileName.LastIndexOf ("."));
int resourceID = Context.Resources.GetIdentifier (fileName, "raw", Context.PackageName);

// Build an android.resource:// URI for the resource and start playback
var fullPath = String.Format ("android.resource://{0}/{1}", Context.PackageName, resourceID);
videoView.SetVideoPath (fullPath);
videoView.RequestFocus ();
videoView.Start ();
This seems to be an issue with the type of encoding the emulator supports. Install ffmpeg (if you're on a Mac, you can do that by running these commands):
ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
brew install ffmpeg
then process your video file with:
ffmpeg -i big_buck_bunny_720p_1mb.mp4 -c:v libx264 -profile:v baseline -c:a aac -strict -2 -b:a 128k output.mp4
and try to play the output. It won't show that error, but it will be a blank video (just a black screen). So I think the issue is just getting the right encoding; I have tried a few different encodings, but all of them seem to just show a black screen.
I'll do some more digging, but for the time being it seems that the emulator simply does not support your encoding.
EDIT
OK, so I got the video playback working. I processed the video with:
ffmpeg -i SampleVideo_1080x720_1mb.mp4 -codec:v libx264 -profile:v baseline -preset slow -b:v 250k -maxrate 250k -bufsize 500k -vf scale=-1:360 -threads 0 -codec:a aac -strict -2 -b:a 96k output.mp4
Check this site for the ffmpeg parameters.
I setup my VideoView like so:
public class Activity1 : Activity
{
    VideoView videoView;

    protected override void OnCreate (Bundle bundle)
    {
        base.OnCreate (bundle);

        // Set our view from the "main" layout resource
        SetContentView (Resource.Layout.Main);

        videoView = FindViewById<VideoView> (Resource.Id.SampleVideoView);

        // Attach playback controls and point the view at the re-encoded raw resource
        videoView.SetMediaController (new MediaController (this));
        videoView.SetVideoPath ($"android.resource://{PackageName}/{Resource.Raw.output}");
        videoView.RequestFocus ();
        videoView.Start ();
    }
}
This seems to work on the Xamarin Android Player, but only for API levels 16 (Jelly Bean) and 19 (KitKat). API 21 (Lollipop) just doesn't load the video.
Then I downloaded the GenyMotion emulator (you need to create an account, but it's free for personal use) to check whether the problem was the Xamarin Player or not. It works on all levels (16, 17, 18, 19, 20 and 22) apart from 21 (Lollipop). It looks like something is wrong with the emulators for API 21; I did all my testing on Nexus 4 emulators. So if you want to test video playback, I would try to avoid emulators with API 21.
Different Android OS versions support different combinations of audio and video encodings within the video container, so it depends on which version your Android Player is emulating. For the table, see http://developer.android.com/guide/appendix/media-formats.html#core
Related
Using OpenCV 4.5.2 + FFmpeg in an Android app
I'm trying to convert an .avi video file into a .mp4 file using x264, by running
ffmpeg -i input.avi -c:v libx264 output.mp4
The transcoding is processed correctly but, when I play the video, the colors are a bit... saturated?
This transcoding is part of the following flow:
Grab a .mov video file
Use OpenCV VideoCapture and VideoWriter to write text on the video frames (output is .avi)
Then I need to convert the .avi file into an .mp4 so it can be played with ExoPlayer.
In step 2, I'm looping over all the video frames and writing them to a new file, drawing text on each one.
// MJPG-encoded .avi output at 15 fps, 1920x1088, in color
val videoWriter = VideoWriter(
    outputFilePath,
    VideoWriter.fourcc('M', 'J', 'P', 'G'),
    15.0,
    Size(1920.0, 1088.0),
    true
)

val frame = Mat()
videoCapture.read(frame)

// Draw the text on the frame (fontFace = 3, fontScale = 5.0, thickness = 1)
Imgproc.putText(
    frame,
    "This is a text",
    Point(200.0, 200.0),
    3,
    5.0,
    Scalar(255.0, 124.0, 124.0, 255.0),
    1
)

videoWriter.write(frame)
I know that step 2 is probably not corrupting the frames, because in my sample app I'm displaying all the frames in an ImageView and they all match the original .mov video. So my guess is that the issue is occurring in step 3.
I'm using 'com.arthenica:mobile-ffmpeg-min-gpl:4.4' for Android to execute the FFmpeg command as follows:
FFmpeg.executeAsync("-i $outputFilePath -c:v libx264 -y ${mp4File.path}")
where outputFilePath is the path for the .avi file and mp4File is an existing empty .mp4 file.
So I guess what I'm looking for is a way to have a lossless video color transcoding between .avi and .mp4 files.
Here's a screenshot of my sample app. The image on top is the last frame of the .avi video. The image on the bottom is the last frame played on a video player for the .mp4 transcoded video. This frame color difference is noticeable throughout the whole video.
EDIT: After some digging, I found out that the issue is that the VideoWriter is messing with the RGB colors. I still don't know why this is happening.
Figured it out myself with some debugging assistance from #llogan.
So, it looks like VideoCapture exports frames in BGR format, which is why the red and blue channels were being swapped. To fix my issue, all I had to do was convert each frame from BGR to RGB using the OpenCV utility method:
val frame = Mat()
val frame1 = Mat()
videoCapture.read(frame)
// VideoCapture returns BGR frames, so swap the channels before writing
Imgproc.cvtColor(frame, frame1, Imgproc.COLOR_BGR2RGB)
videoWriter.write(frame1)
I am trying to place multiple GIFs on an image and save the result as a GIF using FFmpeg. I have managed to place multiple GIFs, but they do not all play continuously: the second GIF plays only once and then stops, starting again only when the first GIF finishes and restarts.
command_try[0]="-i";
command_try[1]=input;
command_try[2]="-i";
command_try[3]=gifthumbnail;
command_try[4]="-i";
command_try[5]=gifthumbnail;
command_try[6]="-i";
command_try[7]=thumbnail;
command_try[8]="-i";
command_try[9]=thumbnail2;
command_try[10]="-filter_complex";
command_try[11]="[0:v]scale=0:0[base];[1:v]scale=300:-1[img1];[2:v]scale=720:-1290[img2];[3:v]scale=80:-1[img3];[4:v]scale=50:-1[img4];[img1]rotate=45:c=black#0:ow=rotw(45):oh=roth(45)[r1];[img2]rotate=0:c=black#0:ow=rotw(0):oh=roth(0)[r2];" +
"[img3]rotate=0:c=black#0:ow=rotw(0):oh=roth(0)[r3];[img4]rotate=0:c=black#0:ow=rotw(0):oh=roth(0)[r4];[base][r1]overlay=100:70[tmp1];"+
"[tmp1][r2]overlay=55:55[tmp2];[tmp2][r3]overlay=65:65[tmp3];[tmp3][r4]overlay=30:30";
command_try[12]="-preset";
command_try[13]="veryfast";
command_try[14]="/storage/emulated/0/Pictures/imggif.gif";
As I have only recently started working with FFmpeg, I need help making each GIF loop continuously and independently of the others.
Use the -stream_loop -1 input option and add the shortest=1 option to any overlay with an infinitely looping GIF as an input.
Simplified example:
ffmpeg -i video.mp4 -stream_loop -1 -i 1.gif -filter_complex "overlay=shortest=1:format=auto" output.mp4
To avoid the yellow lines add the palettegen and paletteuse filters:
ffmpeg -i video.mp4 -stream_loop -1 -i 1.gif -filter_complex "overlay=shortest=1:format=auto,split[s0][s1];[s0]palettegen[p];[s1][p]paletteuse" output.gif
You are using FFmpeg 3.0.1, which is really old, so you will have to change auto to rgb.
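If you are building the argument array in Java as in the question, the simplified command above might translate to roughly the following (a sketch only; the file paths are placeholders, and on FFmpeg 3.0.1 you would write format=rgb instead of format=auto):
// Sketch: the simplified command above as a Java argument array.
// -stream_loop -1 must appear before the -i of the GIF it is meant to loop.
String[] command_try = {
        "-i", "/storage/emulated/0/Movies/video.mp4",
        "-stream_loop", "-1",
        "-i", "/storage/emulated/0/Pictures/1.gif",
        "-filter_complex", "overlay=shortest=1:format=auto",
        "/storage/emulated/0/Movies/output.mp4"
};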
How to add text with animation on video in ffmpeg?
I am currently trying to add video management features to an Android app. If you have any experience with this, please help me.
I added appearing, marquee, and multi-line text animations to a video with ffmpeg.
It works well.
Here is my code.
ffmpeg -i input.mp4 -vf "[in]drawtext=fontfile=/path/to/font.ttf: text='First Line': fontcolor=red: fontsize=40: x=(w-text_w)/2: y=if(lt(t\,3)\,(-h+((3*h-200)*t/6))\,(h-200)/2):enable='between(t,2.9,50)',drawtext=fontfile=/path/to/font.ttf: text='Second Line': fontcolor=yellow: fontsize=30: x=if(lt(t\,4)\,(-w+((3*w-tw)*t/8))\,(w-tw)/2): y=(h-100)/2:enable='between(t,3.5,50)',drawtext=fontfile=/path/to/font.ttf: text='Third Line': fontcolor=blue: fontsize=50: x=if(lt(t\,5)\,(2*w-((3*w+tw)*t/10))\,(w-tw)/2): y=h/2:enable='between(t,4.5,50)',drawtext=fontfile=/path/to/font.ttf: text='Fourth Line': fontcolor=black: fontsize=20: x=(w-text_w)/2: y=if(lt(t\,6)\,(2*h-((3*h-100)*t/12))\,(h+100)/2):enable='between(t,5.5,50)'[out]" out.mp4
Here "input.mp4" is your input video file and it will output with the name "out.mp4"
I want to cut or trim an audio song programmatically on Android. I have found the FFmpeg solution, but I don't understand the steps behind cutting an audio file; if there is any other way, please help me.
Most people give me this type of answer:
ffmpeg -t 30 -i inputfile.mp3 -acodec copy outputfile.mp3
What is this, and how do I use it in Android code to cut audio?
Please help me.
Thank You
Basically follow the steps described here: http://writingminds.github.io/ffmpeg-android-java/
1.) You have to include the ffmpeg library in your project.
Put this in your build.gradle file:
dependencies {compile 'com.writingminds:FFmpegAndroid:0.3.2'}
2.) Before using the library, copy the binary file from the assets to the device:
FFmpeg ffmpeg = FFmpeg.getInstance(context);
ffmpeg.loadBinary(new LoadBinaryResponseHandler() { ... });
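A slightly fuller sketch of that call (the handler bodies here are only placeholders):
try {
    ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
        @Override
        public void onSuccess() {
            // the binary was copied successfully; commands can now be executed
        }

        @Override
        public void onFailure() {
            // the bundled binary does not support this device's architecture
        }
    });
} catch (FFmpegNotSupportedException e) {
    // ffmpeg is not supported on this device at all
}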
3.) When this has finished, you can start executing commands and listen for the results like this:
ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() { ... });
where cmd is the array of arguments:
String[] command1 = new String[8];
command1[0] = "-t";
command1[1] = "30";
...
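For the trim command from the question, the full argument array and its execution might look roughly like this (a sketch only; the file paths are placeholders, and as noted below the Android build may still misbehave):
// Sketch: keep the first 30 seconds of the input and copy the audio stream without re-encoding.
String[] command1 = {
        "-t", "30",
        "-i", "/storage/emulated/0/Music/inputfile.mp3",
        "-acodec", "copy",
        "/storage/emulated/0/Music/outputfile.mp3"
};
try {
    ffmpeg.execute(command1, new ExecuteBinaryResponseHandler() {
        @Override
        public void onSuccess(String message) {
            // the trimmed file has been written
        }

        @Override
        public void onFailure(String message) {
            // inspect 'message' for the ffmpeg error output
        }
    });
} catch (FFmpegCommandAlreadyRunningException e) {
    // another ffmpeg command is still running
}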
BUT: I actually found that the ffmpeg build for Android does not work as expected for cutting or trimming audio files! (I got errors for commands that do work on the Linux command line...) I hope you will figure it out.
There's a simple hack (see the sketch after these steps):
Figure out the bytes per second or millisecond: file.length() / duration.
Get the start and end positions where you want to cut the audio.
Read the file as bytes and save the portion between startPos and endPos to a separate file.
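A rough Java sketch of that hack (it assumes constant-bitrate audio with no large metadata header, so a byte position maps roughly to a point in time; it uses only java.io classes):
// Sketch only: copies the bytes between two time offsets into a new file.
// Works acceptably only for constant-bitrate audio without big metadata headers.
public static void cutAudio(File src, File dst, long startMs, long endMs, long durationMs)
        throws IOException {
    long bytesPerMs = src.length() / durationMs;   // step 1: bytes per millisecond
    long startPos = startMs * bytesPerMs;          // step 2: byte offsets of the cut
    long endPos = endMs * bytesPerMs;
    try (RandomAccessFile in = new RandomAccessFile(src, "r");
         FileOutputStream out = new FileOutputStream(dst)) {
        in.seek(startPos);                         // step 3: copy the slice
        byte[] buffer = new byte[8192];
        long remaining = endPos - startPos;
        int read;
        while (remaining > 0
                && (read = in.read(buffer, 0, (int) Math.min(buffer.length, remaining))) != -1) {
            out.write(buffer, 0, read);
            remaining -= read;
        }
    }
}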
I recorded a video for a limited time. Now I want to fetch all the frames of that video. I am using the code below, and with it I am able to get frames, but I am not getting all of the video frames: 3 to 4 frames are repeated, and then I get a different frame. But as we all know, a smooth video displays 25 to 30 frames per second. How do I get all the frames?
for (int i = 0; i < 30; i++) {
    // getFrameAtTime expects a timestamp in microseconds, so this grabs one frame per second
    Bitmap bArray = mediaMetadataRetriever.getFrameAtTime(
            1000000 * i,
            MediaMetadataRetriever.OPTION_CLOSEST);
    savebitmap(bArray, 33333 * i);
}
I don't want to use the NDK. I found the link below, but I don't know what the value for "argb8888" should be, and I am getting an error there. Can anyone explain how to do it?
Getting frames from Video Image in Android
I faced the same problem before, and Android's MediaMetadataRetriever does not seem appropriate for this task, since it doesn't have good precision.
I used a library called "FFmpegMediaMetadataRetriever" in Android Studio:
Add this line to your build.gradle under the app module:
compile 'com.github.wseemann:FFmpegMediaMetadataRetriever:1.0.14'
Rebuild your project.
Use the FFmpegMediaMetadataRetriever class to grab frames with higher precision:
FFmpegMediaMetadataRetriever med = new FFmpegMediaMetadataRetriever();
med.setDataSource("your data source");
and in your loop you can grab a frame using:
Bitmap bmp = med.getFrameAtTime(i*1000000, FFmpegMediaMetadataRetriever.OPTION_CLOSEST);
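A minimal loop might then look like this (a sketch; totalFrames and savebitmap stand in for your own frame count and save method):
// getFrameAtTime takes microseconds, so steps of 33,333 µs give roughly 30 frames per second
for (int i = 0; i < totalFrames; i++) {
    Bitmap bmp = med.getFrameAtTime(i * 33333L, FFmpegMediaMetadataRetriever.OPTION_CLOSEST);
    if (bmp != null) {
        savebitmap(bmp, i);
    }
}
med.release();   // free the native decoder resources when done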
To get image frames from a video, we can use ffmpeg. To integrate FFmpeg on Android, we can use precompiled libraries like ffmpeg-android.
To extract image frames from a video, we can use the command below:
String[] complexCommand = {"-y", "-i", inputFileAbsolutePath, "-an",
"-r", "1/2", "-ss", "" + startMs / 1000, "-t", "" + (endMs - startMs)
/ 1000, outputFileAbsolutePath};
Here,
-y: Overwrite output files.
-i: ffmpeg reads from an arbitrary number of input "files" specified by the -i option.
-an: Disable audio recording.
-r: Set the frame rate.
-ss: Seek to the given position.
-t: Limit the duration of data read from the input file.
Here, in place of inputFileAbsolutePath, you have to specify the absolute path of the video file from which you want to extract images.
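As a rough sketch, running this array with the ffmpeg-android-java wrapper from the earlier answer could look like the snippet below. The output path needs a pattern such as %03d so that each extracted frame gets its own file; the paths, context, startMs and endMs are assumptions:
// Sketch only: all paths are placeholders.
String inputFileAbsolutePath = "/storage/emulated/0/Movies/input.mp4";
String outputFileAbsolutePath = "/storage/emulated/0/Pictures/frame%03d.jpg";
String[] complexCommand = {"-y", "-i", inputFileAbsolutePath, "-an",
        "-r", "1/2", "-ss", "" + startMs / 1000, "-t", "" + (endMs - startMs) / 1000,
        outputFileAbsolutePath};
try {
    FFmpeg.getInstance(context).execute(complexCommand, new ExecuteBinaryResponseHandler() {
        @Override
        public void onSuccess(String message) {
            // the numbered frame images have been written
        }

        @Override
        public void onFailure(String message) {
            // inspect 'message' for the ffmpeg error output
        }
    });
} catch (FFmpegCommandAlreadyRunningException e) {
    // another ffmpeg command is still running
}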
For the complete code, check this out in my repository. Inside the extractImagesVideo() method, I am running the command for extracting images from the video.
For a complete tutorial on integrating the ffmpeg library and using ffmpeg commands to edit videos, check out this post, which I have written on my blog.
You need to:
Decode the video.
Present the decoded images at least as fast as 24 images per second (I suppose you can skip this step).
Save the decoded images.
It appears that decoding the video would be the most challenging step. People and companies have spent years developing codecs (encoder / decoder) for various video formats.
You can also use the JMF library for FFmpeg.