How to convert a raw H.264 file to MP4 - Android

I have seen how to convert video to MP4 using MediaCodec; those examples all use a Surface.
That pipeline is h.264 => decode => Surface => encode (file), which is an unnecessary decode/encode round trip.
Is it possible to turn a raw H.264 file into an MP4 without decoding and re-encoding, using Android's MediaCodec / MediaMuxer?
Update
I implemented some code following the link (https://developer.android.com/reference/android/media/MediaMuxer) from the comments.
while(!finished) {
// getInputBuffer() will fill the inputBuffer with one frame of encoded
// sample from either MediaCodec or MediaExtractor, set isAudioSample to
// true when the sample is audio data, set up all the fields of bufferInfo,
// and return true if there are no more samples.
byte[] tempBuffer = new byte[1000000];
try {
inputBuffer.clear();
int bytesRead = videoFis.read(tempBuffer, 0, inputBuffer.limit());
if(bytesRead <= -1) {
break;
}
else {
inputBuffer.put(tempBuffer, 0, bytesRead);
int currentTrackIndex = videoTrackIndex; // isAudioSample ? audioTrackIndex : videoTrackIndex;
muxer.writeSampleData(currentTrackIndex, inputBuffer, bufferInfo);
}
} catch (IOException e) {
e.printStackTrace();
}
}
The above code is not working: it doesn't read one encoded frame at a time.
// getInputBuffer() will fill the inputBuffer with one frame of encoded
// sample from either MediaCodec or MediaExtractor
To get one encoded frame at a time from raw H.264, should I use MediaCodec?
Update2
bb.mark();
// read all
while(true) {
try {
int bb2 = videoFis.read(tempBuffer, 0, tempBuffer.length);
if(bb2 != -1) {
bb.put(tempBuffer, 0, bb2);
}
else {
break;
}
} catch (IOException e) {
e.printStackTrace();
}
}
bb.reset();
int mata[][] = new int[5][256];
mata[0][0] = 1;
mata[1][0] = 2;
mata[2][0] = 3;
mata[3][1] = 4;
int step = 0;
// find first key frame.
while(bb.hasRemaining()) {
step = mata[step][(bb.get() & 0xFF)];
if(step < 4) {
step = mata[step][(bb.get() & 0xFF)];
}
else if(step == 4) {
byte g = bb.get();
if((g & 0x1f) == 5) {
bb.position(bb.position() - 5);
break;
}
}
}
step = 0;
while(!finished) {
int nextPosition = -1;
int offset = bb.position();
bb.get(); bb.get(); bb.get(); bb.get();
byte nalUnit = bb.get();
while(bb.hasRemaining()) {
if(step < 4) {
step = mata[step][(bb.get() & 0xFF)];
}
else if(step == 4) {
byte g = bb.get();
if((g & 0x60) != 0) {
nextPosition = bb.position() - 5;
break;
}
}
}
step = 0;
bufferInfo.flags = (nalUnit & 0x1f) == 5 ? MediaCodec.BUFFER_FLAG_KEY_FRAME : 0;
bufferInfo.offset = offset;
bufferInfo.size = nextPosition - offset;
bufferInfo.presentationTimeUs += 1000 * 1000 / 15;
Log.d(",,", "offset: " + bufferInfo.offset + "..." + "size: " + bufferInfo.size + "..." + "pts: " + bufferInfo.presentationTimeUs);
muxer.writeSampleData(videoTrackIndex, inputBuffer, bufferInfo);
bb.position(nextPosition);
if(bufferInfo.presentationTimeUs >= 666660)
{
bufferInfo.size = 0;
break;
}
}
muxer.stop();
muxer.release();
With the implementation above I find the 0x00 0x00 0x00 0x01 start codes (the NAL unit boundaries) and fill bufferInfo with the offset, size, and pts. But I got these messages:
...
02-23 17:59:25.131 7597-7738/com.example.ksoo.ballbotpkg I/MPEG4Writer: setStartTimestampUs: 66666
Earliest track starting time: 66666
02-23 17:59:25.131 7597-7597/com.example.ksoo.ballbotpkg D/,,: offset: 284939...size: 10531...pts: 133332
02-23 17:59:25.132 7597-7597/com.example.ksoo.ballbotpkg D/,,: offset: 295470...size: 15908...pts: 199998
offset: 311378...size: 20194...pts: 266664
02-23 17:59:25.133 7597-7597/com.example.ksoo.ballbotpkg D/,,: offset: 331572...size: 20628...pts: 333330
offset: 352200...size: 21346...pts: 399996
02-23 17:59:25.134 7597-7597/com.example.ksoo.ballbotpkg D/,,: offset: 373546...size: 21407...pts: 466662
02-23 17:59:25.134 7597-7597/com.example.ksoo.ballbotpkg D/,,: offset: 394953...size: 22494...pts: 533328
02-23 17:59:25.135 7597-7597/com.example.ksoo.ballbotpkg D/,,: offset: 417447...size: 23230...pts: 599994
02-23 17:59:25.135 7597-7597/com.example.ksoo.ballbotpkg D/,,: offset: 440677...size: 23820...pts: 666660
02-23 17:59:25.136 7597-7597/com.example.ksoo.ballbotpkg I/MPEG4Writer: Normal stop process
02-23 17:59:25.136 7597-7597/com.example.ksoo.ballbotpkg D/MPEG4Writer: Video track stopping. Stop source
Video track source stopping
Video track source stopped
02-23 17:59:25.136 7597-7738/com.example.ksoo.ballbotpkg I/MPEG4Writer: Received total/0-length (10/0) buffers and encoded 10 frames. - Video
02-23 17:59:25.136 7597-7597/com.example.ksoo.ballbotpkg D/MPEG4Writer: Video track stopped. Stop source
Stopping writer thread
02-23 17:59:25.137 7597-7737/com.example.ksoo.ballbotpkg D/MPEG4Writer: 0 chunks are written in the last batch
02-23 17:59:25.137 7597-7597/com.example.ksoo.ballbotpkg D/MPEG4Writer: Writer thread stopped
02-23 17:59:25.138 7597-7597/com.example.ksoo.ballbotpkg I/MPEG4Writer: The mp4 file will not be streamable.
02-23 17:59:25.138 7597-7597/com.example.ksoo.ballbotpkg D/MPEG4Writer: Video track stopping. Stop source
Did I make a mistake?
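One thing the snippets above never show is how the video track itself was created. For raw H.264, the MediaFormat passed to addTrack() generally needs the SPS and PPS NAL units attached as codec-specific data (csd-0 / csd-1); otherwise the resulting MP4 may not play everywhere even though MPEG4Writer reports a normal stop. A minimal sketch of that setup, assuming spsNal and ppsNal (start codes included) were located with the same start-code scan used above, and that width, height, and outPath are known:
// Sketch only: spsNal/ppsNal/width/height/outPath are assumptions, not
// values taken from the code above.
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
format.setByteBuffer("csd-0", ByteBuffer.wrap(spsNal)); // SPS, with start code
format.setByteBuffer("csd-1", ByteBuffer.wrap(ppsNal)); // PPS, with start code
MediaMuxer muxer = new MediaMuxer(outPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int videoTrackIndex = muxer.addTrack(format);
muxer.start();
// ...then writeSampleData() per access unit, as in the loop above, and stop()/release().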

Related

AMR decoding from RTP

I'm receiving an RTP stream about which I know only that it is AMR-WB, octet-aligned, 100 ms per packet. A third party can receive the same stream and it is audible, so the stream itself is fine. Now I'm receiving this data and trying to decode it, without luck...
init:
val sampleRate = 16000
val mc = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_AUDIO_AMR_WB)
val mf = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AMR_WB, sampleRate, 1)
mf.setInteger(MediaFormat.KEY_SAMPLE_RATE, sampleRate) // is it needed?
mc.configure(mf, null, null, 0)
mc.start()
decode each packet separately:
private fun decode(decoder: MediaCodec, mediaFormat: MediaFormat, rtpPacket: RtpPacket): ByteArray {
var outputBuffer: ByteBuffer
var outputBufferIndex: Int
val inputBuffers: Array<ByteBuffer> = decoder.inputBuffers
var outputBuffers: Array<ByteBuffer> = decoder.outputBuffers
// input
val inputBufferIndex = decoder.dequeueInputBuffer(-1L)
if (inputBufferIndex >= 0) {
val inputBuffer = inputBuffers[inputBufferIndex]
inputBuffer.clear()
inputBuffer.put(rtpPacket.payload)
// native ACodec/MediaCodec crash in here (log below)
decoder.queueInputBuffer(inputBufferIndex, 0, rtpPacket.payload.size, System.nanoTime()/1000, 0)
}
// output
val bufferInfo: MediaCodec.BufferInfo = MediaCodec.BufferInfo()
outputBufferIndex = decoder.dequeueOutputBuffer(bufferInfo, -1L)
Timber.i("outputBufferIndex: ${outputBufferIndex}")
when (outputBufferIndex) {
MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED -> {
Timber.d("INFO_OUTPUT_BUFFERS_CHANGED")
outputBuffers = decoder.outputBuffers
}
MediaCodec.INFO_OUTPUT_FORMAT_CHANGED -> {
val format: MediaFormat = decoder.outputFormat
Timber.d("INFO_OUTPUT_FORMAT_CHANGED $format")
audioTrack.playbackRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE)
}
MediaCodec.INFO_TRY_AGAIN_LATER -> Timber.d("INFO_TRY_AGAIN_LATER")
else -> {
val outBuffer = outputBuffers[outputBufferIndex]
outBuffer.position(bufferInfo.offset);
outBuffer.limit(bufferInfo.offset + bufferInfo.size);
val chunk = ByteArray(bufferInfo.size)
outBuffer[chunk]
outBuffer.clear()
audioTrack.write(
chunk,
bufferInfo.offset,
bufferInfo.offset + bufferInfo.size
)
decoder.releaseOutputBuffer(outputBufferIndex, false)
Timber.v("chunk size:${chunk.size}")
return chunk
}
}
// All decoded frames have been rendered, we can stop playing now
if (bufferInfo.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM != 0) {
Timber.d("BUFFER_FLAG_END_OF_STREAM")
}
return ByteArray(0)
}
Sadly, this is what I'm getting on a (clean) Android 10:
E/ACodec: [OMX.google.amrwb.decoder] ERROR(0x80001001)
E/ACodec: signalError(omxError 0x80001001, internalError -2147483648)
E/MediaCodec: Codec reported err 0x80001001, actionCode 0, while in state 6
E/RtpReceiver: java.lang.IllegalStateException
at android.media.MediaCodec.native_dequeueInputBuffer(Native Method)
at android.media.MediaCodec.dequeueInputBuffer(MediaCodec.java:2727)
I should probably wrap dequeueOutputBuffer plus the when block in some while(true) loop, but then I get logs similar to the above, only with 0x8000100b.
On another device, a Pixel running Android 12, I'm getting something similar:
D/BufferPoolAccessor2.0: bufferpool2 0xb400007067901978 : 4(32768 size) total buffers - 4(32768 size) used buffers - 0/5 (recycle/alloc) - 0/0 (fetch/transfer)
D/CCodecBufferChannel: [c2.android.amrwb.decoder#471] work failed to complete: 14
E/MediaCodec: Codec reported err 0xe, actionCode 0, while in state 6/STARTED
E/RtpReceiver: java.lang.IllegalStateException
at android.media.MediaCodec.native_dequeueOutputBuffer(Native Method)
at android.media.MediaCodec.dequeueOutputBuffer(MediaCodec.java:3535)
I'm obviously cutting off the RTP header (only the payload is used above), but nothing else is done. Should I also parse the payload/AMR header? Inside it there is, e.g., FT, the frame type index, which determines the bitrate, so the decoder should get this parameter before the start() call, right? Or can I pass the whole payload (CMR, ToC with FT, Q, etc.) straight to the decoder, and I've simply initialized it badly? Or is my decode method implemented incorrectly somehow? In short: how do I properly decode (and play) AMR-WB received from an RTP stream?
Edit: it's worth mentioning that the payload starts with F0 84 84 84 84 04 in every packet.
It turned out that I also have to unpack the AMR payload header and re-pack the data into AMR frames. The first bytes of the payload posted in the question are the ToC list.
F0 is the CMR and may be omitted. Starting at position 1 we can calculate the ToC size: the number of consecutive bytes with a 1 in the msb (i.e. as an int >= 128, or with a first hex digit >= 8), plus 1. So if payload[1] starts with 0 (hex), the ToC size is 1, the payload is a single frame, and we can pass it to the decoder (don't forget to skip the leading CMR byte!). In my sample the ToC size is 5, so I have to divide the rest of the payload and interleave it with the ToC bytes, where one "frame" = one ToC byte + the frame payload.
My whole payload is 91 bytes:
-1 for the CMR
-5 for the ToCs
which leaves 85 bytes for 5 frames (the ToC size),
i.e. 5 frames of 1 (ToC byte) + 17 (85/5 AMR payload) bytes each.
We could just divide the rest of the payload evenly, but it's worth verifying that size by checking the bitrate mode carried in every ToC byte of every frame and comparing it with the fixed frame sizes per bitrate (see index in the code below).
fun decode(rtpPacket: RtpPacket): ByteArray {
var outData = ByteArray(0)
var position = 0
position++ // skip payload header, ignore CMR - rtpPacket.payload[0]
var tocLen = 0
while (getBit(rtpPacket.payload[position].toInt(), 7)) {
//first byte has 1 at msb
position++
tocLen++
}
if (tocLen > 0) { // if there is any toc detected
// first byte which has NOT 1 at msb also belongs to ToC
position++
tocLen++
}
//Timber.i("decoded tocListSize: $tocLen")
if (tocLen > 0) {
// starting from 1 because this is first ToC byte position after ommiting CMR
for (i in 1 until (tocLen + 1)) {
val index = rtpPacket.payload[i].toInt() shr 3 and 0xf
if (index >= 9) {
Timber.w("Bad AMR ToC, index=$index")
break
}
val amr_frame_sizes = intArrayOf(17, 23, 32, 36, 40, 46, 50, 58, 60, 5)
val frameSize = amr_frame_sizes[index]
//Timber.i("decoded i:$i index:$index frameSize:frameSize position:$position")
if (position + frameSize > rtpPacket.payloadLength) {
Timber.w("Truncated AMR frame")
break
}
val frame = ByteArray(1 + frameSize)
frame[0] = rtpPacket.payload[i]
System.arraycopy(rtpPacket.payload, position, frame, 1, frameSize)
outData = outData.plus(decode(frame))
position += frameSize
}
} else { // single frame case, NOT TESTED!!
outData = ByteArray(rtpPacket.payloadLength - 1) // without CMR
System.arraycopy(rtpPacket.payload, 1, outData, 0, outData.size)
outData = decode(outData)
}
return outData
}
The returned data may be used in place of rtpPacket.payload in the decode method posted in the question. (The decoder code itself could be improved a bit, since its last lines are unreachable, but it works even in this form.)
amr_frame_sizes is a constant array for my case, in which 100 ms of AMR is divided into 5 frames. The sizes are adjusted for that case (20 ms frames) and indexed by the mode carried in each ToC byte (the "changeable" bitrate).

Android MediaCodec dequeueInputBuffer always returns -1

I'm trying to take raw data from the AudioRecord object and save it in a file using a MediaMuxer and MediaCodec.
I start the codec, start the muxer, load data into the input buffers and no such luck.
From debugging investigation, I've found that the problem is occurring in the call to dequeueInputBuffer(). It appears that the first few chunks of data succeed, but eventually dequeueInputBuffer() just returns -1 constantly.
Is there something obvious that I'm missing? It seems like I'm filling up the input buffers but they're never being released by the codec.
Snippet of relevant code:
int numChunks = input.length / CHUNKSIZE;
mAudioEncoder.start();
for (int chunk = 0; chunk <= numChunks; chunk++) {
byte[] passMe = new byte[CHUNKSIZE];
int inputBufferIndex = -1;
Log.d("offerAudioEncoder","printing chunk #" + chunk + "of " + numChunks);
//Copy the data into the chunk array
if (chunk < input.length / CHUNKSIZE)
for (int i = 0; i < CHUNKSIZE; i++)
passMe[i] = input[chunk * CHUNKSIZE + i];
else {
eosReceived = true;
for (int i = 0; chunk * CHUNKSIZE + i < input.length; i++)
passMe[i] = input[chunk * CHUNKSIZE + i];
}
//Get the input buffer
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.JELLY_BEAN_MR2) {
while(inputBufferIndex < 0) // just keep trying.
inputBufferIndex = mAudioEncoder.dequeueInputBuffer(100);
inputBuffer = mAudioEncoder.getInputBuffer(inputBufferIndex);
} else {
//backwards compatibility.
ByteBuffer[] inputBuffers = mAudioEncoder.getInputBuffers();
inputBufferIndex = mAudioEncoder.dequeueInputBuffer(-1);
if (inputBufferIndex >= 0)
inputBuffer = inputBuffers[inputBufferIndex];
}
//Plop the data into the input buffer
if (inputBuffer != null) {
inputBuffer.clear();
inputBuffer.put(passMe);
}
long presentationTimeUs = chunk * 10000000; //each encoded chunk represents one second of audio
//this is what the frame should be labeled as
mAudioEncoder.queueInputBuffer(inputBufferIndex, 0, passMe.length, presentationTimeUs, 0);
//Pull the output buffer.
int encoderStatus = -1;
while(encoderStatus < 0) //Like, seriously, WAIT forever.
encoderStatus = mAudioEncoder.dequeueOutputBuffer(mAudioBufferInfo, -1);//wait forever, why not?
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.JELLY_BEAN_MR2)
outputBuffer = mAudioEncoder.getOutputBuffer(encoderStatus);
else {
ByteBuffer[] encoderOutputBuffers = mAudioEncoder.getOutputBuffers();
outputBuffer = encoderOutputBuffers[encoderStatus];
}
if(encoderStatus >= 0) {
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.JELLY_BEAN_MR2)
mMuxer.writeSampleData(audioTrackIndex, outputBuffer, mAudioBufferInfo);
//Done with the output buffer, release it.
mAudioEncoder.releaseOutputBuffer(encoderStatus, false);
}//TODO: Add cases for what to do when the output format changes
}
Okay, I figured it out. Ultimately I dumped the chunking logic and just increased the size of the input buffer by setting
audioFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 14000000);
For the MediaFormat object passed to the MediaCodec's configure method.
Also a good tip: Make sure to use 16-bit audio encoding and to use the AudioRecord.read method that spits out shorts. Bytes seem to produce screwy audio (probably because AudioRecord wants to be operating in 16 bit).
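For anyone else hitting this: the "input buffers fill up and never come back" symptom usually means the encoder's output is not being drained and released often enough. A sketch of the usual drain pattern, reusing mAudioEncoder, mAudioBufferInfo, mMuxer, and audioTrackIndex from the snippet above (getOutputBuffer() needs API 21+; on older releases use the getOutputBuffers() array as in the compatibility branch above):
// Drain all pending output after each input submission so the codec can
// recycle its buffers; a timeout of 0 means do not block.
int status = mAudioEncoder.dequeueOutputBuffer(mAudioBufferInfo, 0);
while (status >= 0) {
    ByteBuffer out = mAudioEncoder.getOutputBuffer(status);
    if ((mAudioBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
        mMuxer.writeSampleData(audioTrackIndex, out, mAudioBufferInfo);
    }
    mAudioEncoder.releaseOutputBuffer(status, false);
    status = mAudioEncoder.dequeueOutputBuffer(mAudioBufferInfo, 0);
}
// status is now INFO_TRY_AGAIN_LATER, or INFO_OUTPUT_FORMAT_CHANGED /
// INFO_OUTPUT_BUFFERS_CHANGED, which still need their own handling.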

MediaMuxer unable to make MP4s that are streamable

I'm editing an MP4 on Android using MediaExtractor to fetch audio and video tracks then creating a new file using MediaMuxer. It works fine. I can play the new MP4 on the phone (and other players) but am unable to stream the file on the web. When I stop the MediaMuxer it generates a log message
"The mp4 file will not be streamable."
I looked at the underlying native code (MPEG4Writer.cpp) and it would appear that the writer is having trouble calculating the needed moov box size. It tries to guess using a heuristic if a bit rate is not supplied as a parameter to the writer. The problem is that MediaMuxer doesn't provide the ability to set MPEG4Writer's parameters. Am I missing something, or am I stuck looking at some other means of generating the file (or header)? Thanks.
In MPEG4Writer.cpp:
// The default MIN_MOOV_BOX_SIZE is set to 0.6% x 1MB / 2,
// where 1MB is the common file size limit for MMS application.
// The default MAX_MOOV_BOX_SIZE value is based on about 3
// minute video recording with a bit rate about 3 Mbps, because
// statistics also show that most of the video captured are going
// to be less than 3 minutes.
This is a bad assumption about how MediaMuxer might be used. We are recording a maximum of 15 seconds of higher-resolution video, and MIN_MOOV_BOX_SIZE is way too small. So to make the file streamable I have to rewrite the file, moving the moov header before mdat and patching up some offsets. Here is my code. It's not great: error paths aren't handled correctly and it makes assumptions about the order of the boxes.
public void fastPlay(String srcFile, String dstFile) {
RandomAccessFile inFile = null;
FileOutputStream outFile = null;
try {
inFile = new RandomAccessFile(new File(srcFile), "r");
outFile = new FileOutputStream(new File(dstFile));
int moovPos = 0;
int mdatPos = 0;
int moovSize = 0;
int mdatSize = 0;
byte[] boxSizeBuf = new byte[4];
byte[] pathBuf = new byte[4];
int boxSize;
int dataSize;
int bytesRead;
int totalBytesRead = 0;
int bytesWritten = 0;
// First find the location and size of the moov and mdat boxes
while (true) {
try {
boxSize = inFile.readInt();
bytesRead = inFile.read(pathBuf);
if (bytesRead != 4) {
Log.e(TAG, "Unexpected bytes read (path) " + bytesRead);
break;
}
String pathRead = new String(pathBuf, "UTF-8");
dataSize = boxSize - 8;
totalBytesRead += 8;
if (pathRead.equals("moov")) {
moovPos = totalBytesRead - 8;
moovSize = boxSize;
} else if (pathRead.equals("mdat")) {
mdatPos = totalBytesRead - 8;
mdatSize = boxSize;
}
totalBytesRead += inFile.skipBytes(dataSize);
} catch (IOException e) {
break;
}
}
// Read the moov box into a buffer. This has to be patched up. Ug.
inFile.seek(moovPos);
byte[] moovBoxBuf = new byte[moovSize]; // This shouldn't be too big.
bytesRead = inFile.read(moovBoxBuf);
if (bytesRead != moovSize) {
Log.e(TAG, "Couldn't read full moov box");
}
// Now locate the stco boxes (chunk offset box) inside the moov box and patch
// them up. This ain't purdy.
int pos = 0;
while (pos < moovBoxBuf.length - 4) {
if (moovBoxBuf[pos] == 0x73 && moovBoxBuf[pos + 1] == 0x74 &&
moovBoxBuf[pos + 2] == 0x63 && moovBoxBuf[pos + 3] == 0x6f) {
int stcoPos = pos - 4;
int stcoSize = byteArrayToInt(moovBoxBuf, stcoPos);
patchStco(moovBoxBuf, stcoSize, stcoPos, moovSize);
}
pos++;
}
inFile.seek(0);
byte[] buf = new byte[(int) mdatPos];
// Write out everything before mdat
inFile.read(buf);
outFile.write(buf);
// Write moov
outFile.write(moovBoxBuf, 0, moovSize);
// Write out mdat
inFile.seek(mdatPos);
bytesWritten = 0;
while (bytesWritten < mdatSize) {
int bytesRemaining = (int) mdatSize - bytesWritten;
int bytesToRead = buf.length;
if (bytesRemaining < bytesToRead) bytesToRead = bytesRemaining;
bytesRead = inFile.read(buf, 0, bytesToRead);
if (bytesRead > 0) {
outFile.write(buf, 0, bytesRead);
bytesWritten += bytesRead;
} else {
break;
}
}
} catch (IOException e) {
Log.e(TAG, e.getMessage());
} finally {
try {
if (outFile != null) outFile.close();
if (inFile != null) inFile.close();
} catch (IOException e) {}
}
}
private void patchStco(byte[] buf, int size, int pos, int moovSize) {
Log.e(TAG, "stco " + pos + " size " + size);
// We are inserting the moov box before the mdat box so all of
// offsets in the stco box need to be increased by the size of the moov box. The stco
// box is variable in length. 4 byte size, 4 byte path, 4 byte version, 4 byte flags
// followed by a variable number of chunk offsets. So subtract off 16 from size then
// divide result by 4 to get the number of chunk offsets to patch up.
int chunkOffsetCount = (size - 16) / 4;
int chunkPos = pos + 16;
for (int i = 0; i < chunkOffsetCount; i++) {
int chunkOffset = byteArrayToInt(buf, chunkPos);
int newChunkOffset = chunkOffset + moovSize;
intToByteArray(newChunkOffset, buf, chunkPos);
chunkPos += 4;
}
}
public static int byteArrayToInt(byte[] b, int offset)
{
return b[offset + 3] & 0xFF |
(b[offset + 2] & 0xFF) << 8 |
(b[offset + 1] & 0xFF) << 16 |
(b[offset] & 0xFF) << 24;
}
public void intToByteArray(int a, byte[] buf, int offset)
{
buf[offset] = (byte) ((a >> 24) & 0xFF);
buf[offset + 1] = (byte) ((a >> 16) & 0xFF);
buf[offset + 2] = (byte) ((a >> 8) & 0xFF);
buf[offset + 3] = (byte) (a & 0xFF);
}
Currently MediaMuxer does not create streamable MP4 files
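If a post-processing step is acceptable, ffmpeg performs the same moov relocation as the fastPlay() code above with a stream copy (no re-encode); the file names here are placeholders:
ffmpeg -i not_streamable.mp4 -c copy -movflags +faststart streamable.mp4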
You can try Intel INDE at https://software.intel.com/en-us/intel-inde and the Media Pack for Android, which is part of INDE; tutorials are at https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials. It has a sample that shows how to use the media pack to create and stream files over the network.
For example, for camera streaming it has the sample CameraStreamerActivity.java:
public void onCreate(Bundle icicle) {
capture = new CameraCapture(new AndroidMediaObjectFactory(getApplicationContext()), progressListener);
parameters = new StreamingParameters();
parameters.Host = getString(R.string.streaming_server_default_ip);
parameters.Port = Integer.parseInt(getString(R.string.streaming_server_default_port));
parameters.ApplicationName = getString(R.string.streaming_server_default_app);
parameters.StreamName = getString(R.string.streaming_server_default_stream);
parameters.isToPublishAudio = false;
parameters.isToPublishVideo = true;
}
public void startStreaming() {
configureMediaStreamFormat();
capture.setTargetVideoFormat(videoFormat);
capture.setTargetAudioFormat(audioFormat);
capture.setTargetConnection(prepareStreamingParams());
capture.start();
}
In addition, there are similar samples for streaming files and for capturing and streaming gameplay.

Images to Video using MediaCodec and MediaMuxer

I have a bunch of local images saved as JPEG files. My images were captured with the camera preview, using the default PreviewFormat: NV21. I want to generate a short video from a fixed number of images.
I am not going to use FFmpeg because it requires the NDK and would introduce compatibility issues.
MediaCodec and MediaMuxer seem to work, but there is no single working solution on the web.
A few references led to my current solution.
1. EncodeAndMuxTest: http://bigflake.com/mediacodec/EncodeAndMuxTest.java.txt
This one is written by fadden. It almost suits my needs, except that he is using createInputSurface rather than queueInputBuffer.
2. Convert bitmap array to YUV (YCbCr NV21)
I do the conversion following this answer: https://stackoverflow.com/a/17116985/3047840
3. Using MediaCodec to save series of images as Video
This question looks very similar to mine, but it does not involve MediaMuxer.
My code is the following:
public class EncodeAndMux extends Activity {
private static final String TAG = "EncodeAndMuxTest";
private static final boolean VERBOSE = false;
private static final File OUTPUT_DIR = Environment
.getExternalStorageDirectory();
private static final String MIME_TYPE = "video/avc";
private static final int FRAME_RATE = 10;
// 10 seconds between I-frames
private static final int IFRAME_INTERVAL = 10;
private static final int NUM_FRAMES = 5;
private static final String DEBUG_FILE_NAME_BASE = "/sdcard/test";
// size of a frame, in pixels
private int mWidth = -1;
private int mHeight = -1;
// bit rate, in bits per second
private int mBitRate = -1;
private byte[] mFrame;
// largest color component delta seen (i.e. actual vs. expected)
private int mLargestColorDelta;
// encoder / muxer state
private MediaCodec mEncoder;
private MediaMuxer mMuxer;
private int mTrackIndex;
private boolean mMuxerStarted;
private Utils mUtils;
private float mPadding;
private int mColumnWidth;
private static final int TEST_Y = 120; // YUV values for colored rect
private static final int TEST_U = 160;
private static final int TEST_V = 200;
private static final int TEST_R0 = 0; // RGB equivalent of {0,0,0}
private static final int TEST_G0 = 136;
private static final int TEST_B0 = 0;
private static final int TEST_R1 = 236; // RGB equivalent of {120,160,200}
private static final int TEST_G1 = 50;
private static final int TEST_B1 = 186;
private static final boolean DEBUG_SAVE_FILE = false; // save copy of
// encoded movie
// allocate one of these up front so we don't need to do it every time
private MediaCodec.BufferInfo mBufferInfo;
private ArrayList<String> mImagePaths = new ArrayList<String>();
byte[] getNV21(int inputWidth, int inputHeight, Bitmap scaled) {
int[] argb = new int[inputWidth * inputHeight];
scaled.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);
byte[] yuv = new byte[inputWidth * inputHeight * 3 / 2];
encodeYUV420SP(yuv, argb, inputWidth, inputHeight);
scaled.recycle();
return yuv;
}
void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
final int frameSize = width * height;
int yIndex = 0;
int uvIndex = frameSize;
int a, R, G, B, Y, U, V;
int index = 0;
for (int j = 0; j < height; j++) {
for (int i = 0; i < width; i++) {
a = (argb[index] & 0xff000000) >> 24; // a is not used obviously
R = (argb[index] & 0xff0000) >> 16;
G = (argb[index] & 0xff00) >> 8;
B = (argb[index] & 0xff) >> 0;
// well known RGB to YUV algorithm
Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128;
// NV21 has a plane of Y and interleaved planes of VU each
// sampled by a factor of 2
// meaning for every 4 Y pixels there are 1 V and 1 U. Note the
// sampling is every other
// pixel AND every other scanline.
yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0
: ((Y > 255) ? 255 : Y));
if (j % 2 == 0 && index % 2 == 0) {
yuv420sp[uvIndex++] = (byte) ((V < 0) ? 0
: ((V > 255) ? 255 : V));
yuv420sp[uvIndex++] = (byte) ((U < 0) ? 0
: ((U > 255) ? 255 : U));
}
index++;
}
}
}
public static Bitmap decodeFile(String filePath, int WIDTH, int HIGHT) {
try {
File f = new File(filePath);
BitmapFactory.Options o = new BitmapFactory.Options();
o.inJustDecodeBounds = true;
o.inPurgeable = true;
o.inInputShareable = true;
BitmapFactory.decodeStream(new FileInputStream(f), null, o);
final int REQUIRED_WIDTH = WIDTH;
final int REQUIRED_HIGHT = HIGHT;
int scale = 1;
while (o.outWidth / scale / 2 >= REQUIRED_WIDTH
&& o.outHeight / scale / 2 >= REQUIRED_HIGHT)
scale *= 2;
BitmapFactory.Options o2 = new BitmapFactory.Options();
o2.inSampleSize = scale;
o2.inPurgeable = true;
o2.inInputShareable = true;
return BitmapFactory.decodeStream(new FileInputStream(f), null, o2);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
return null;
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_encode_and_mux);
mUtils = new Utils(this);
mImagePaths = mUtils.getBackFilePaths();
mPadding = TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP,
AppConstant.GRID_PADDING, getResources().getDisplayMetrics());
mColumnWidth = (int) ((mUtils.getScreenWidth() - ((AppConstant.NUM_OF_COLUMNS + 1) * mPadding)) / AppConstant.NUM_OF_COLUMNS);
try {
testEncodeDecodeVideoFromBufferToSurface720p();
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (Throwable e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
/**
* Returns the first codec capable of encoding the specified MIME type, or null if no
* match was found.
*/
private static MediaCodecInfo selectCodec(String mimeType) {
int numCodecs = MediaCodecList.getCodecCount();
for (int i = 0; i < numCodecs; i++) {
MediaCodecInfo codecInfo = MediaCodecList.getCodecInfoAt(i);
if (!codecInfo.isEncoder()) {
continue;
}
String[] types = codecInfo.getSupportedTypes();
for (int j = 0; j < types.length; j++) {
if (types[j].equalsIgnoreCase(mimeType)) {
return codecInfo;
}
}
}
return null;
}
/**
* Returns a color format that is supported by the codec and by this test code. If no
* match is found, this throws a test failure -- the set of formats known to the test
* should be expanded for new platforms.
*/
private static int selectColorFormat(MediaCodecInfo codecInfo, String mimeType) {
MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);
for (int i = 0; i < capabilities.colorFormats.length; i++) {
int colorFormat = capabilities.colorFormats[i];
if (isRecognizedFormat(colorFormat)) {
return colorFormat;
}
}
Log.e("","couldn't find a good color format for " + codecInfo.getName() + " / " + mimeType);
return 0; // not reached
}
/**
* Returns true if this is a color format that this test code understands (i.e. we know how
* to read and generate frames in this format).
*/
private static boolean isRecognizedFormat(int colorFormat) {
switch (colorFormat) {
// these are the formats we know how to handle for this test
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
return true;
default:
return false;
}
}
/**
* Returns true if the specified color format is semi-planar YUV. Throws an exception
* if the color format is not recognized (e.g. not YUV).
*/
private static boolean isSemiPlanarYUV(int colorFormat) {
switch (colorFormat) {
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
return false;
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
return true;
default:
throw new RuntimeException("unknown format " + colorFormat);
}
}
/**
* Does the actual work for encoding frames from buffers of byte[].
*/
private void doEncodeDecodeVideoFromBuffer(MediaCodec encoder, int encoderColorFormat,
MediaCodec decoder, boolean toSurface) {
final int TIMEOUT_USEC = 10000;
ByteBuffer[] encoderInputBuffers = encoder.getInputBuffers();
ByteBuffer[] encoderOutputBuffers = encoder.getOutputBuffers();
ByteBuffer[] decoderInputBuffers = null;
ByteBuffer[] decoderOutputBuffers = null;
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
MediaFormat decoderOutputFormat = null;
int generateIndex = 0;
int checkIndex = 0;
int badFrames = 0;
boolean decoderConfigured = false;
OutputSurface outputSurface = null;
// The size of a frame of video data, in the formats we handle, is stride*sliceHeight
// for Y, and (stride/2)*(sliceHeight/2) for each of the Cb and Cr channels. Application
// of algebra and assuming that stride==width and sliceHeight==height yields width*height*3/2.
// Just out of curiosity.
long rawSize = 0;
long encodedSize = 0;
// Save a copy to disk. Useful for debugging the test. Note this is a raw elementary
// stream, not a .mp4 file, so not all players will know what to do with it.
if (toSurface) {
outputSurface = new OutputSurface(mWidth, mHeight);
}
// Loop until the output side is done.
boolean inputDone = false;
boolean encoderDone = false;
boolean outputDone = false;
while (!outputDone) {
Log.e(TAG, "loop");
// If we're not done submitting frames, generate a new one and submit it. By
// doing this on every loop we're working to ensure that the encoder always has
// work to do.
//
// We don't really want a timeout here, but sometimes there's a delay opening
// the encoder device, so a short timeout can keep us from spinning hard.
if (!inputDone) {
int inputBufIndex = encoder.dequeueInputBuffer(TIMEOUT_USEC);
Log.e(TAG, "inputBufIndex=" + inputBufIndex);
if (inputBufIndex >= 0) {
long ptsUsec = computePresentationTime(generateIndex);
if (generateIndex == NUM_FRAMES) {
// Send an empty frame with the end-of-stream flag set. If we set EOS
// on a frame with data, that frame data will be ignored, and the
// output will be short one frame.
encoder.queueInputBuffer(inputBufIndex, 0, 0, ptsUsec,
MediaCodec.BUFFER_FLAG_END_OF_STREAM);
inputDone = true;
Log.e(TAG, "sent input EOS (with zero-length frame)");
} else {
generateFrame(generateIndex, encoderColorFormat, mFrame);
//generateFrame(generateIndex);
ByteBuffer inputBuf = encoderInputBuffers[inputBufIndex];
// the buffer should be sized to hold one full frame
inputBuf.clear();
inputBuf.put(mFrame);
encoder.queueInputBuffer(inputBufIndex, 0, mFrame.length, ptsUsec, 0);
Log.e(TAG, "submitted frame " + generateIndex + " to enc");
}
generateIndex++;
} else {
// either all in use, or we timed out during initial setup
Log.e(TAG, "input buffer not available");
}
}
// Check for output from the encoder. If there's no output yet, we either need to
// provide more input, or we need to wait for the encoder to work its magic. We
// can't actually tell which is the case, so if we can't get an output buffer right
// away we loop around and see if it wants more input.
//
// Once we get EOS from the encoder, we don't need to do this anymore.
if (!encoderDone) {
int encoderStatus = encoder.dequeueOutputBuffer(info, TIMEOUT_USEC);
if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
// no output available yet
Log.e(TAG, "no output from encoder available");
} else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
// not expected for an encoder
encoderOutputBuffers = encoder.getOutputBuffers();
Log.e(TAG, "encoder output buffers changed");
} else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
// not expected for an encoder
if (mMuxerStarted) {
throw new RuntimeException("format changed twice");
}
MediaFormat newFormat = encoder.getOutputFormat();
Log.e(TAG, "encoder output format changed: " + newFormat);
// now that we have the Magic Goodies, start the muxer
mTrackIndex = mMuxer.addTrack(newFormat);
Log.e(TAG, "muxer defined muxer format: " + newFormat);
mMuxer.start();
mMuxerStarted = true;
} else if (encoderStatus < 0) {
Log.e("","unexpected result from encoder.dequeueOutputBuffer: " + encoderStatus);
} else { // encoderStatus >= 0
ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
if (encodedData == null) {
Log.e("","encoderOutputBuffer " + encoderStatus + " was null");
}
// It's usually necessary to adjust the ByteBuffer values to match BufferInfo.
encodedData.position(info.offset);
encodedData.limit(info.offset + info.size);
encodedSize += info.size;
if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
// Codec config info. Only expected on first packet. One way to
// handle this is to manually stuff the data into the MediaFormat
// and pass that to configure(). We do that here to exercise the API.
MediaFormat format =
MediaFormat.createVideoFormat(MIME_TYPE, mWidth, mHeight);
format.setByteBuffer("csd-0", encodedData);
decoder.configure(format, toSurface ? outputSurface.getSurface() : null,
null, 0);
decoder.start();
decoderInputBuffers = decoder.getInputBuffers();
decoderOutputBuffers = decoder.getOutputBuffers();
decoderConfigured = true;
Log.e(TAG, "decoder configured (" + info.size + " bytes)"+format);
} else {
// Get a decoder input buffer, blocking until it's available.
int inputBufIndex = decoder.dequeueInputBuffer(-1);
ByteBuffer inputBuf = decoderInputBuffers[inputBufIndex];
inputBuf.clear();
inputBuf.put(encodedData);
decoder.queueInputBuffer(inputBufIndex, 0, info.size,
info.presentationTimeUs, info.flags);
encoderDone = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
Log.e(TAG, "passed " + info.size + " bytes to decoder"
+ (encoderDone ? " (EOS)" : ""));
Log.e("encoderDone",encoderDone+"");
}
encoder.releaseOutputBuffer(encoderStatus, false);
}
}
// Check for output from the decoder. We want to do this on every loop to avoid
// the possibility of stalling the pipeline. We use a short timeout to avoid
// burning CPU if the decoder is hard at work but the next frame isn't quite ready.
//
// If we're decoding to a Surface, we'll get notified here as usual but the
// ByteBuffer references will be null. The data is sent to Surface instead.
if (decoderConfigured) {
int decoderStatus = decoder.dequeueOutputBuffer(info, 3*TIMEOUT_USEC);
if (decoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
// no output available yet
Log.e(TAG, "no output from decoder available");
} else if (decoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
// The storage associated with the direct ByteBuffer may already be unmapped,
// so attempting to access data through the old output buffer array could
// lead to a native crash.
Log.e(TAG, "decoder output buffers changed");
decoderOutputBuffers = decoder.getOutputBuffers();
} else if (decoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
// this happens before the first frame is returned
decoderOutputFormat = decoder.getOutputFormat();
Log.e(TAG, "decoder output format changed: " +
decoderOutputFormat);
} else if (decoderStatus < 0) {
Log.e(TAG, "unexpected result from decoder.dequeueOutputBuffer: " + decoderStatus);
} else { // decoderStatus >= 0
if (!toSurface) {
ByteBuffer outputFrame = decoderOutputBuffers[decoderStatus];
outputFrame.position(info.offset);
outputFrame.limit(info.offset + info.size);
mMuxer.writeSampleData(mTrackIndex, outputFrame,
info);
rawSize += info.size;
if (info.size == 0) {
Log.e(TAG, "got empty frame");
} else {
Log.e(TAG, "decoded, checking frame " + checkIndex);
if (!checkFrame(checkIndex++, decoderOutputFormat, outputFrame)) {
badFrames++;
}
}
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
Log.e(TAG, "output EOS");
outputDone = true;
}
decoder.releaseOutputBuffer(decoderStatus, false /*render*/);
} else {
Log.e(TAG, "surface decoder given buffer " + decoderStatus +
" (size=" + info.size + ")");
rawSize += info.size;
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
Log.e(TAG, "output EOS");
outputDone = true;
}
boolean doRender = (info.size != 0);
// As soon as we call releaseOutputBuffer, the buffer will be forwarded
// to SurfaceTexture to convert to a texture. The API doesn't guarantee
// that the texture will be available before the call returns, so we
// need to wait for the onFrameAvailable callback to fire.
decoder.releaseOutputBuffer(decoderStatus, doRender);
if (doRender) {
Log.e(TAG, "awaiting frame " + checkIndex);
outputSurface.awaitNewImage();
outputSurface.drawImage();
if (!checkSurfaceFrame(checkIndex++)) {
badFrames++;
}
}
}
}
}
}
Log.e(TAG, "decoded " + checkIndex + " frames at "
+ mWidth + "x" + mHeight + ": raw=" + rawSize + ", enc=" + encodedSize);
if (outputSurface != null) {
outputSurface.release();
}
if (checkIndex != NUM_FRAMES) {
Log.e(TAG, "awaiting frame " + checkIndex);
}
if (badFrames != 0) {
Log.e(TAG, "Found " + badFrames + " bad frames");
}
}
private void generateFrame(int frameIndex) {
Bitmap bitmap = decodeFile(mImagePaths.get(frameIndex), mColumnWidth,
mColumnWidth);
mFrame = getNV21(bitmap.getWidth(), bitmap.getHeight(), bitmap);
}
/**
* Generates data for frame N into the supplied buffer. We have an 8-frame animation
* sequence that wraps around. It looks like this:
* <pre>
* 0 1 2 3
* 7 6 5 4
* </pre>
* We draw one of the eight rectangles and leave the rest set to the zero-fill color.
*/
private void generateFrame(int frameIndex, int colorFormat, byte[] mFrame) {
final int HALF_WIDTH = mWidth / 2;
boolean semiPlanar = isSemiPlanarYUV(colorFormat);
// Set to zero. In YUV this is a dull green.
Arrays.fill(mFrame, (byte) 0);
int startX, startY, countX, countY;
frameIndex %= 8;
//frameIndex = (frameIndex / 8) % 8; // use this instead for debug -- easier to see
if (frameIndex < 4) {
startX = frameIndex * (mWidth / 4);
startY = 0;
} else {
startX = (7 - frameIndex) * (mWidth / 4);
startY = mHeight / 2;
}
for (int y = startY + (mHeight/2) - 1; y >= startY; --y) {
for (int x = startX + (mWidth/4) - 1; x >= startX; --x) {
if (semiPlanar) {
// full-size Y, followed by UV pairs at half resolution
// e.g. Nexus 4 OMX.qcom.video.encoder.avc COLOR_FormatYUV420SemiPlanar
// e.g. Galaxy Nexus OMX.TI.DUCATI1.VIDEO.H264E
// OMX_TI_COLOR_FormatYUV420PackedSemiPlanar
mFrame[y * mWidth + x] = (byte) TEST_Y;
if ((x & 0x01) == 0 && (y & 0x01) == 0) {
mFrame[mWidth*mHeight + y * HALF_WIDTH + x] = (byte) TEST_U;
mFrame[mWidth*mHeight + y * HALF_WIDTH + x + 1] = (byte) TEST_V;
}
} else {
// full-size Y, followed by quarter-size U and quarter-size V
// e.g. Nexus 10 OMX.Exynos.AVC.Encoder COLOR_FormatYUV420Planar
// e.g. Nexus 7 OMX.Nvidia.h264.encoder COLOR_FormatYUV420Planar
mFrame[y * mWidth + x] = (byte) TEST_Y;
if ((x & 0x01) == 0 && (y & 0x01) == 0) {
mFrame[mWidth*mHeight + (y/2) * HALF_WIDTH + (x/2)] = (byte) TEST_U;
mFrame[mWidth*mHeight + HALF_WIDTH * (mHeight / 2) +
(y/2) * HALF_WIDTH + (x/2)] = (byte) TEST_V;
}
}
}
}
}
/**
* Sets the desired frame size and bit rate.
*/
private void setParameters(int width, int height, int bitRate) {
if ((width % 16) != 0 || (height % 16) != 0) {
Log.w(TAG, "WARNING: width or height not multiple of 16");
}
mWidth = width;
mHeight = height;
mBitRate = bitRate;
mFrame = new byte[mWidth * mHeight * 3 / 2];
}
public void testEncodeDecodeVideoFromBufferToSurface720p() throws Throwable {
setParameters(1280, 720, 6000000);
encodeDecodeVideoFromBuffer(false);
}
}
Logcat:
12-17 18:25:47.405: E/EncodeAndMuxTest(16415): found codec: OMX.qcom.video.encoder.avc
12-17 18:25:47.405: I/OMXClient(16415): Using client-side OMX mux.
12-17 18:25:47.455: E/EncodeAndMuxTest(16415): found colorFormat: 21
12-17 18:25:47.455: E/EncodeAndMuxTest(16415): format: {frame-rate=10, bitrate=6000000, height=720, mime=video/avc, color-format=21, i-frame-interval=10, width=1280}
12-17 18:25:47.465: I/OMXClient(16415): Using client-side OMX mux.
12-17 18:25:47.495: E/ACodec(16415): [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -2147483648
12-17 18:25:47.495: I/ACodec(16415): setupVideoEncoder succeeded
12-17 18:25:47.535: I/OMXClient(16415): Using client-side OMX mux.
12-17 18:25:47.545: E/EncodeAndMuxTest(16415): loop
12-17 18:25:47.545: E/EncodeAndMuxTest(16415): inputBufIndex=0
12-17 18:25:47.655: E/EncodeAndMuxTest(16415): submitted frame 0 to enc
12-17 18:25:47.655: E/EncodeAndMuxTest(16415): encoder output format changed: {csd-1=java.nio.ByteArrayBuffer[position=0,limit=8,capacity=8], height=720, mime=video/avc, csd-0=java.nio.ByteArrayBuffer[position=0,limit=18,capacity=18], what=1869968451, width=1280}
12-17 18:25:47.655: E/EncodeAndMuxTest(16415): muxer defined muxer format: {csd-1=java.nio.ByteArrayBuffer[position=0,limit=8,capacity=8], height=720, mime=video/avc, csd-0=java.nio.ByteArrayBuffer[position=0,limit=18,capacity=18], what=1869968451, width=1280}
12-17 18:25:47.655: I/MPEG4Writer(16415): limits: 2147483647/0 bytes/us, bit rate: -1 bps and the estimated moov size 3072 bytes
12-17 18:25:47.655: E/EncodeAndMuxTest(16415): inputBufIndex=2
12-17 18:25:47.795: E/EncodeAndMuxTest(16415): submitted frame 1 to enc
12-17 18:25:47.825: E/EncodeAndMuxTest(16415): decoder configured (26 bytes){csd-0=java.nio.DirectByteBuffer[position=0,limit=26,capacity=692224], height=720, width=1280, mime=video/avc}
12-17 18:25:47.855: E/EncodeAndMuxTest(16415): no output from decoder available
12-17 18:25:47.855: E/EncodeAndMuxTest(16415): inputBufIndex=0
12-17 18:25:47.976: E/EncodeAndMuxTest(16415): submitted frame 2 to enc
12-17 18:25:48.136: E/EncodeAndMuxTest(16415): passed 3188 bytes to decoder
12-17 18:25:48.176: E/EncodeAndMuxTest(16415): no output from decoder available
12-17 18:25:48.176: E/EncodeAndMuxTest(16415): inputBufIndex=1
12-17 18:25:48.296: E/EncodeAndMuxTest(16415): submitted frame 3 to enc
12-17 18:25:48.296: E/EncodeAndMuxTest(16415): passed 1249 bytes to decoder
12-17 18:25:48.326: E/EncodeAndMuxTest(16415): no output from decoder available
12-17 18:25:48.326: E/EncodeAndMuxTest(16415): loop
12-17 18:25:48.326: E/EncodeAndMuxTest(16415): inputBufIndex=2
12-17 18:25:48.396: E/EncodeAndMuxTest(16415): submitted frame 4 to enc
12-17 18:25:48.396: E/EncodeAndMuxTest(16415): passed 3085 bytes to decoder
12-17 18:25:48.436: E/EncodeAndMuxTest(16415): no output from decoder available
12-17 18:25:48.436: E/EncodeAndMuxTest(16415): inputBufIndex=0
12-17 18:25:48.436: E/EncodeAndMuxTest(16415): sent input EOS (with zero-length frame)
12-17 18:25:48.436: E/EncodeAndMuxTest(16415): passed 3056 bytes to decoder
12-17 18:25:48.466: E/EncodeAndMuxTest(16415): no output from decoder available
12-17 18:25:48.466: E/EncodeAndMuxTest(16415): passed 1085 bytes to decoder (EOS)
12-17 18:25:48.476: E/EncodeAndMuxTest(16415): decoder output buffers changed
12-17 18:25:48.496: E/EncodeAndMuxTest(16415): decoder output format changed:
Reading the JPEGs, decompressing them, and then recompressing them is going to cause a loss of image quality (and take CPU effort/time); simply appending them together and putting them in a video container will be faster and produce a better video.
The MJPEG video format is quite old, so (almost) any program can play MJPEG videos.
I suggest a solution similar to this one: http://sourceforge.net/projects/jpegtoavi/ i.e. make an MJPEG movie from your JPEGs. There is more than that one program to choose from; use a search engine (or our search bar) to find more source code.
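For illustration, desktop ffmpeg can do exactly this kind of packing: with a numbered JPEG sequence, the images are stream-copied (not recompressed) into an MJPEG AVI. The filename pattern and frame rate below are placeholders:
ffmpeg -framerate 10 -i frame%03d.jpg -c:v copy jpegs_as_video.avi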
I tested my phone to see if it could understand MJPEG by creating a file using this command:
ffmpeg.exe -i test_in.mp4 -vcodec mjpeg -acodec copy test_out.mp4
In: Stream #0:0(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 1280x720 [SAR 1:1 DAR 16:9], 1568 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc (default)
Out: Stream #0:0(und): Video: mjpeg (l[0][0][0] / 0x006C), yuvj420p, 1280x720 [SAR 1:1 DAR 16:9], q=2-31, 200 kb/s, 30k tbn, 29.97 tbc (default)
Unfortunately the Android Gallery player is one of the programs that does not understand that format, but BSPlayer, VLC, and MPlayer for Android can play it if you want the resulting video to play on your phone (without writing more code).

Android AudioTrack setLoopPoints invalid value

I generate a PCM tone and want to loop the sound.
I followed the documentation, but Eclipse keeps telling me:
08-05 15:46:26.675: E/AudioTrack(27686): setLoop invalid value: loopStart 0, loopEnd 44100, loopCount -1, framecount 11025, user 11025
here is my code:
void genTone() {
// fill out the array
for (int i = 1; i < numSamples - 1; i = i + 2) {
sample[i] = Math.sin(2 * Math.PI * i / (sampleRate / -300));
}
// convert to 16 bit pcm sound array
// assumes the sample buffer is normalised.
int idx = 0;
for (double dVal : sample) {
short val = (short) (dVal * 32767);
generatedSnd[idx++] = (byte) (val & 0x00ff);
generatedSnd[idx++] = (byte) ((val & 0xff00) >>> 8);
}
//write it to audio Track.
audioTrack.write(generatedSnd, 0, numSamples);
audioTrack.setLoopPoints(0, numSamples, -1);
//from 0.0 ~ 1.0
audioTrack.setStereoVolume((float)0.5, (float)1); //change amplitude
}
public void buttonPlay(View v) {
audioTrack.reloadStaticData();
audioTrack.play();
}
please help ~~
From the documentation: "endInFrames loop end marker expressed in frames"
The log print indicates that your track contains 11025 frames, which is less than the 44100 that you're trying to specify as the end marker (for 16-bit stereo PCM audio, the frame size would be 4 bytes).
Another thing worth noting is that "the track must be stopped or paused for the position to be changed".
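Putting those two notes together, a sketch of a corrected call, assuming 16-bit PCM and reusing the names from the question (channelCount is an assumption: 1 for mono, 2 for stereo):
// 16-bit PCM: one frame = 2 bytes per channel, so convert bytes to frames.
int bytesPerFrame = 2 * channelCount;
int endInFrames = generatedSnd.length / bytesPerFrame;
audioTrack.stop(); // the position may only be changed while stopped or paused
audioTrack.setLoopPoints(0, endInFrames, -1); // -1 loops indefinitely
audioTrack.reloadStaticData();
audioTrack.play();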
