I am trying to write a video compression application that will run on the Jelly Bean version of Android. So far I can decode the given input to video/raw format, and it is playable with mplayer. My problem is that when I encode this video/raw item into video/avc format with width = 320, height = 240, bitRate = 480*1024, frameRate = 20, iFrameInterval = 7, and colorFormat = YUV420Planar, the output file is not playable by just double-clicking on it. Can anyone suggest a way to play it in any player? Or can you tell me whether I made a mistake with any of the above parameters (bitrate, framerate, etc.)?
Thanks in advance! :)
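For reference, here is a minimal sketch of the encoder configuration described above (assuming the standard MediaCodec API; the parameter values themselves are legal):

MediaFormat format = MediaFormat.createVideoFormat("video/avc", 320, 240);
format.setInteger(MediaFormat.KEY_BIT_RATE, 480 * 1024);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 20);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 7);
// YUV420Planar input buffers, as in the question
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

Note that MediaCodec emits a raw H.264 elementary stream, not a finished file; if the output is never muxed into a container such as MP4, most players will not open it on a double-click even though the encoding parameters are fine. A raw stream can often still be checked by forcing an elementary-stream demuxer, for example (output filename hypothetical):

mplayer -fps 20 -demuxer h264es output.h264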
Using OpenCV 4.5.2 + FFMPEG in an Android app
I'm trying to convert an .avi video file into a .mp4 file using x264 by running:
ffmpeg -i input.avi -c:v libx264 output.mp4
The transcoding completes correctly, but when I play the video, the colors are a bit... saturated?
This transcoding is part of the following flow:
Grab a .mov video file
Use OpenCV VideoCapture and VideoWriter to write text on the video frames (output is .avi)
Then I need to convert the .avi file into .mp4 so it's playable in ExoPlayer.
In step 2, I loop over all video frames, draw text on them, and write them to a new file:
val videoWriter = VideoWriter(
    outputFilePath,
    VideoWriter.fourcc('M', 'J', 'P', 'G'), // MJPEG codec for the intermediate .avi
    15.0,                                   // frames per second
    Size(1920.0, 1088.0),                   // frame size
    true                                    // isColor
)

val frame = Mat()
videoCapture.read(frame)
Imgproc.putText(
    frame,
    "This is a text",
    Point(200.0, 200.0),                // bottom-left corner of the text
    3,                                  // fontFace
    5.0,                                // fontScale
    Scalar(255.0, 124.0, 124.0, 255.0), // text color
    1                                   // thickness
)
videoWriter.write(frame)
I know step 2 is probably not corrupting the frames, because in my sample app I display all frames in an ImageView and they all match the original .mov video. So my guess is that the issue occurs in step 3.
I'm using 'com.arthenica:mobile-ffmpeg-min-gpl:4.4' for Android to execute the FFMPEG command as follows:
FFmpeg.executeAsync("-i $outputFilePath -c:v libx264 -y ${mp4File.path}")
where outputFilePath is the path for the .avi file and mp4File is an existing empty .mp4 file.
So I guess what I'm looking for is a way to have a lossless video color transcoding between .avi and .mp4 files.
Here's a screenshot of my sample app. The image on top is the last frame of the .avi video. The image on the bottom is the last frame played on a video player for the .mp4 transcoded video. This frame color difference is noticeable throughout the whole video.
EDIT: After some digging, I found out that the issue is that the VideoWriter is messing with the RGB colors. I still don't know why this is happening.
Figured it out myself with some debugging assistance from @llogan.
So, it looks like VideoCapture outputs frames in BGR format, which is why the red and blue channels were swapped. To fix my issue, all I had to do was convert the frame from BGR to RGB using the OpenCV utility method:
val frame = Mat()
val frame1 = Mat()
videoCapture.read(frame)
// OpenCV delivers frames in BGR order; swap to RGB before writing
Imgproc.cvtColor(frame, frame1, Imgproc.COLOR_BGR2RGB)
videoWriter.write(frame1)
I'm streaming H.264 video and AAC audio over RTMP on Android using the native MediaCodec APIs. Video and audio look great; however, while the video is shot in portrait mode, playback on the web or in VLC is always in landscape.
Having read through the H.264 spec, I see that this sort of extra metadata can be specified in Supplemental Enhancement Information (SEI), and I've gone about adding it to the raw H.264 bitstream. My SEI NAL unit follows this rudimentary format (I plan to optimize later):
val displayOrientationSEI = {
    val prefix = byteArrayOf(0, 0, 0, 1)
    val nalHeader = byteArrayOf(6) // forbidden_zero_bit: 0; nal_ref_idc: 0; nal_unit_type: 6 (SEI)
    val display = byteArrayOf(47 /* display_orientation payload type */, 3 /* payload size in bytes */)
    val displayOrientationCancelFlag = "0" // u(1); rotation information follows
    val horFlip = "1" // hor_flip; u(1); flip horizontally
    val verFlip = "1" // ver_flip; u(1); flip vertically
    val anticlockwiseRotation = "0100000000000000" // u(16); 16384 / 2^16 * 360 = 90 degrees
    val displayOrientationRepetitionPeriod = "010" // ue(v); persists until the next video sequence
    val displayOrientationExtensionFlag = "0" // u(1); no other value is permitted by the spec at the moment
    val byteAlignment = "1"
    val bitString = displayOrientationCancelFlag +
            horFlip +
            verFlip +
            anticlockwiseRotation +
            displayOrientationRepetitionPeriod +
            displayOrientationExtensionFlag +
            byteAlignment
    prefix + nalHeader + display + BigInteger(bitString, 2).toByteArray()
}()
Using Jcodec's SEI class, I can see that my SEI message is parsed properly. I write out these packets to the RTMP stream using an Android JNI wrapper for LibRtmp.
Despite this, ffprobe does not show the orientation metadata, and the video when played remains in landscape.
At this point I think I'm missing a very small detail about how FLV headers work when the raw h264 units are written out by LibRtmp. I have tried appending this displayOrientationSEI NAL unit:
To the initial SPS and PPS configuration only.
To each raw h264 NAL units straight from the encoder.
To both.
What am I doing wrong? Going through the source of some RTMP libraries, like rtmp-rtsp-stream-client-java, it seems the SEI message is dropped when creating FLV tags.
Help is much, much appreciated.
Does RTMP support the Display Orientation SEI Message in h264 streams?
RTMP is unaware of the very concept. From RTMP's perspective, the SEI is just a series of bytes that it copies; it never looks at them and never parses them.
The things that need to support it are the H.264 decoder (which RTMP is also unaware of) and the player software. If it is not working for you, check the player or the validity of the encoded SEI, not the transport.
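One way to check the validity of the encoded SEI end to end is to decode a dump of the published stream and look for display-matrix side data on the frames; ffmpeg maps a valid display orientation SEI onto that side data (the filename here is hypothetical, and the exact output varies by ffmpeg version):

ffprobe -select_streams v:0 -show_frames recording.flv

If no displaymatrix side data appears on any frame, the SEI is being dropped or is malformed before it ever reaches a decoder, which would point at the FLV packaging rather than the player.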
I am working on a project that needs to process a video using OpenGL on Android. I decided to use MediaCodec, and I managed to get it working with help from ExtractDecodeEditEncodeMuxTest. The result is quite good: it receives a video, extracts the tracks, decodes the video track, edits it with OpenGL, and encodes it to a video file.
The problem is that the resulting video plays well on Android, but when it comes to iOS, two-thirds of the screen is green.
I tried the suggestions from here, here, and here, and experimented with different formats for the encoder, but the problem is still the same.
Could someone suggest what might cause this problem and how to fix it?
This is the video when it's played on iOS
This is the configuration for the encoder
MediaCodec mediaCodec = MediaCodec.createEncoderByType("video/avc");
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 540, 960);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
// COLOR_FormatSurface: frames are supplied through an input Surface, not byte buffers
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, CodecCapabilities.COLOR_FormatSurface);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Update
I wonder if I made a mistake with the video orientation, because the working part of the output video has the same aspect ratio as the desired output resolution, but in horizontal orientation. The input is recorded vertically, and so is the desired output.
Here is the decoder configuration code:
inputFormat.setInteger(MediaFormat.KEY_WIDTH, 540);
inputFormat.setInteger(MediaFormat.KEY_HEIGHT, 960);
// "rotation-degrees" is the key later exposed as MediaFormat.KEY_ROTATION (API 23)
inputFormat.setInteger("rotation-degrees", 90);
String mime = inputFormat.getString(MediaFormat.KEY_MIME);
MediaCodec decoder = MediaCodec.createDecoderByType(mime);
decoder.configure(inputFormat, surface, null, 0);
Update Dec 25: I've tried different resolutions and orientations when configuring both the encoder and the decoder to check whether the video's orientation is the problem, but the output video just gets rotated; the green problem is still there.
I also tried "video/mp4v-es" for the encoder; the resulting video is viewable on a Mac, but the iPhone cannot even play it.
I've just solved it. The cause turned out to be MediaMuxer: it wraps the raw H.264 stream in some sort of container that iOS can't understand. So instead of using MediaMuxer, I write the raw H.264 stream from the encoder to a file and use mp4parser to mux it into an MP4 file.
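For reference, the mp4parser step can look roughly like this (a minimal sketch, assuming mp4parser 1.x, with hypothetical paths, and that the encoder output was written as an Annex-B .h264 file):

// Parse the raw Annex-B H.264 elementary stream into a track
H264TrackImpl h264Track = new H264TrackImpl(new FileDataSourceImpl("/sdcard/raw.h264"));
Movie movie = new Movie();
movie.addTrack(h264Track);
// Build a plain MP4 container around the track and write it out
Container mp4 = new DefaultMp4Builder().build(movie);
FileChannel fc = new FileOutputStream(new File("/sdcard/output.mp4")).getChannel();
mp4.writeContainer(fc);
fc.close();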
I know the answer now: it has to do with the FPS range. I changed the frame rate in my camera parameters and on the MediaCodec, and suddenly it worked!
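As a sketch of what aligning the rates can look like (assuming the old android.hardware.Camera API this answer implies; the values are hypothetical):

// Lock the camera capture rate to the rate the encoder is told to expect
Camera.Parameters params = camera.getParameters();
params.setPreviewFpsRange(30000, 30000); // values are fps * 1000, i.e. a fixed 30 fps
camera.setParameters(params);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);

Note that setPreviewFpsRange() must be given one of the ranges reported by getSupportedPreviewFpsRange().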
I am developing an Android app that plays a live Speex audio stream, so I used the jspeex library.
The audio stream is 11 kHz, 16-bit.
On the Android side I have done the following:
SpeexDecoder decoder = new SpeexDecoder();
decoder.init(1, 11025, 1, true); // mode 1 = wideband, 11025 Hz, 1 channel, perceptual enhancement on
decoder.processData(subdata, 0, subdata.length);
byte[] decoded_data = new byte[decoder.getProcessedDataByteSize()];
int result = decoder.getProcessedData(decoded_data, 0);
When this decoded data is played by AudioTrack, some parts of the audio are clipped.
Also, when the decoder is set to narrowband mode (first parameter set to 0), the sound quality is worse.
I wonder whether there is a parameter configuration mistake in my code.
Any help or advice is appreciated.
Thanks in advance.
Sampling rate and buffer size should be set in a way that is optimized for the specific device. For example, you can use AudioTrack.getMinBufferSize() to obtain the minimum buffer size for playback:
int sampleRate = 11025; // also try other standard sample rates
int bufferSize = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_MONO,
        AudioFormat.ENCODING_PCM_16BIT);
If your AudioTrack buffer is too small or too large, you will experience audio glitches. I suggest you take a look here and play around with these values (sampleRate and bufferSize).
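Building on that, a minimal playback sketch (assuming the sampleRate, bufferSize, and decoded_data variables from the snippets above; the 2x multiplier is just a common safety margin, not a required value):

AudioTrack audioTrack = new AudioTrack(
        AudioManager.STREAM_MUSIC,
        sampleRate,
        AudioFormat.CHANNEL_OUT_MONO,
        AudioFormat.ENCODING_PCM_16BIT,
        bufferSize * 2, // some headroom above the minimum
        AudioTrack.MODE_STREAM);
audioTrack.play();
audioTrack.write(decoded_data, 0, decoded_data.length);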
I am trying to decode a FLAC file with a 24-bit sample format using OpenSL ES on Android. Originally, I had the SLDataFormat_PCM for my SLDataSink set up like this:
_pcm.formatType = SL_DATAFORMAT_PCM;
_pcm.numChannels = 2;
_pcm.samplesPerSec = SL_SAMPLINGRATE_44_1;
_pcm.bitsPerSample = SL_PCMSAMPLEFORMAT_FIXED_16;
_pcm.containerSize = SL_PCMSAMPLEFORMAT_FIXED_16;
_pcm.channelMask = SL_SPEAKER_FRONT_LEFT | SL_SPEAKER_FRONT_RIGHT;
_pcm.endianness = SL_BYTEORDER_LITTLEENDIAN;
This is working well for basically any data format. Luckily the samplesPerSec is not respected (I don't want resampling).
Now I want to support the full bit depth of a FLAC file with 24-bit samples. With this format, it apparently performs a bit-depth conversion, because once I load the file and check the ANDROID_KEY_PCMFORMAT_BITSPERSAMPLE info, it is 16.
When I set bitsPerSample = SL_PCMSAMPLEFORMAT_FIXED_24 or SL_PCMSAMPLEFORMAT_FIXED_32, OpenSL ES rejects it:
E/libOpenSLES(22706): pAudioSnk: bitsPerSample=32
W/libOpenSLES(22706): Leaving Engine::CreateAudioPlayer (SL_RESULT_CONTENT_UNSUPPORTED)
Any idea how this is meant to work? Is Android currently restricted to 16-bit integer samples only?
I would also accept 32-bit float, but I don't suppose that will work either.
Currently it only supports 8-bit and 16-bit samples.
Sources:
Android source code (line 60)
Article (PCM data format section)
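Given that limitation, if you decode the 24-bit FLAC yourself (for example with libFLAC) and need to feed a 16-bit pipeline, the down-conversion is a simple truncation. A hypothetical helper, shown in Java to match the other snippets on this page (plain truncation of little-endian packed 24-bit PCM, no dithering; the same logic applies in native code):

// Drop the least-significant byte of each little-endian 24-bit sample
static byte[] pcm24ToPcm16(byte[] in) {
    int samples = in.length / 3;
    byte[] out = new byte[samples * 2];
    for (int i = 0; i < samples; i++) {
        out[i * 2] = in[i * 3 + 1];     // middle byte becomes the low byte
        out[i * 2 + 1] = in[i * 3 + 2]; // top byte becomes the high byte
    }
    return out;
}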