How to play raw NAL units in Android MediaCodec - android

I initially tried "How to play raw NAL units in Android exoplayer?" but I noticed I'm going to have to work at a lower level.
I've found this simple MediaCodec example. As you can see, it's a thread that plays a file on a surface passed to it.
Notice the lines
mExtractor = new MediaExtractor();
mExtractor.setDataSource(filePath);
It looks like I should create my own MediaExtractor which, instead of extracting the video units from a file, will use the H.264 NAL units from a buffer I provide.
I can then call mExtractor.setDataSource(MediaDataSource dataSource), see MediaDataSource
It has readAt(long position, byte[] buffer, int offset, int size)
This is where it reads the NAL units. However, how should I pass them? I have no information on the structure of the buffer that needs to be read.
Should I pass a byte[] buffer with the NAL units in it, and if so, in which format? What is the offset for? If it's a buffer, shouldn't I just erase the lines that were read and thus have no offset or size?
By the way, the H.264 NAL units are streamed; they come from RTP packets, not files. I'm going to receive them through C++, store them in a buffer, and try to pass that to the MediaExtractor.
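For reference, here is a minimal sketch (my own illustration, not from the question) of the readAt() contract: the framework asks for size bytes starting at position, you copy them into buffer beginning at offset, and you return how many bytes you actually copied, or -1 at end of stream. Note that a bare Annex-B stream with no container is generally not something MediaExtractor can parse, which is worth verifying before going down this path.
import android.media.MediaDataSource;
import java.io.IOException;

// Hypothetical in-memory MediaDataSource; "data" stands in for whatever buffer
// the native/RTP side fills.
public class ByteArrayDataSource extends MediaDataSource {
    private final byte[] data;

    public ByteArrayDataSource(byte[] data) {
        this.data = data;
    }

    @Override
    public int readAt(long position, byte[] buffer, int offset, int size) throws IOException {
        if (position >= data.length) {
            return -1; // end of stream
        }
        int toCopy = (int) Math.min(size, data.length - position);
        System.arraycopy(data, (int) position, buffer, offset, toCopy);
        return toCopy; // number of bytes actually provided
    }

    @Override
    public long getSize() throws IOException {
        return data.length; // a live stream has no meaningful total size
    }

    @Override
    public void close() throws IOException {
        // nothing to release for an in-memory buffer
    }
}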
UPDATE:
I've been reading a lot about MediaCodec and I think I understand it better. According to https://developer.android.com/reference/android/media/MediaCodec, everything relies on something of this type:
MediaCodec codec = MediaCodec.createByCodecName(name);
MediaFormat mOutputFormat; // member variable
codec.setCallback(new MediaCodec.Callback() {
    @Override
    void onInputBufferAvailable(MediaCodec mc, int inputBufferId) {
        ByteBuffer inputBuffer = codec.getInputBuffer(inputBufferId);
        // fill inputBuffer with valid data
        …
        codec.queueInputBuffer(inputBufferId, …);
    }

    @Override
    void onOutputBufferAvailable(MediaCodec mc, int outputBufferId, …) {
        ByteBuffer outputBuffer = codec.getOutputBuffer(outputBufferId);
        MediaFormat bufferFormat = codec.getOutputFormat(outputBufferId); // option A
        // bufferFormat is equivalent to mOutputFormat
        // outputBuffer is ready to be processed or rendered.
        …
        codec.releaseOutputBuffer(outputBufferId, …);
    }

    @Override
    void onOutputFormatChanged(MediaCodec mc, MediaFormat format) {
        // Subsequent data will conform to new format.
        // Can ignore if using getOutputFormat(outputBufferId)
        mOutputFormat = format; // option B
    }

    @Override
    void onError(…) {
        …
    }
});
codec.configure(format, …);
mOutputFormat = codec.getOutputFormat(); // option B
codec.start();
// wait for processing to complete
codec.stop();
codec.release();
As you can see, I can pass input buffers and get decoded output buffers back. The exact byte formats are still a mystery to me, but I think that's how it works. Also, according to the same article, using ByteBuffers is slow and Surfaces are preferred; they consume the output buffers automatically. Although there's no tutorial on how to do it, there's a section in the article that says it's almost identical, so I guess I just need to add these additional lines:
codec.setInputSurface(Surface inputSurface)
codec.setOutputSurface(Surface outputSurface)
where inputSurface and outputSurface are Surfaces that I pass to a MediaPlayer, which I use (how?) to display the video in an activity. The output buffers would then simply not arrive via onOutputBufferAvailable (because the surface consumes them first), nor via onInputBufferAvailable.
So the questions now are: how exactly do I construct a Surface that contains the video buffer, and how do I display it with a MediaPlayer in an activity?
For output I can simply create a Surface and pass it to a MediaPlayer and the MediaCodec, but what about input? Do I need a ByteBuffer for the input anyway, with a Surface only being useful for using other outputs as inputs?
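For reference, a minimal sketch of the usual Surface-output setup, assuming a SurfaceView named surfaceView and known video dimensions (both are assumptions for illustration). Note that for a decoder the Surface is normally passed to configure(); setInputSurface() targets encoders fed from a persistent input surface, and setOutputSurface() swaps the output surface of an already configured codec.
// Minimal sketch: decoder rendering straight to a SurfaceView's Surface.
// (createDecoderByType throws IOException; error handling omitted here.)
MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720);
MediaCodec decoder = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);

// Passing the Surface here makes the codec render decoded frames itself;
// onOutputBufferAvailable still fires, but getOutputBuffer() returns null
// and you just call releaseOutputBuffer(id, true) to send the frame to the display.
decoder.configure(format, surfaceView.getHolder().getSurface(), null, 0);
decoder.start();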

You first need to strip the packet headers and feed the raw H.264 NAL bytes into this method. In your case, however, you're reading from a file, so there's nothing to strip since you're not dealing with packets; just feed the data bytes to this method:
private void initDecoder() {
    try {
        writeHeader = true;
        if (mDecodeMediaCodec != null) {
            try {
                mDecodeMediaCodec.stop();
            } catch (Exception e) {}
            try {
                mDecodeMediaCodec.release();
            } catch (Exception e) {}
        }
        mDecodeMediaCodec = MediaCodec.createDecoderByType(MIME_TYPE);
        // MIME_TYPE = video/avc in your case
        mDecodeMediaCodec.configure(format,
                mSurfaceView.getHolder().getSurface(),
                null,
                0);
        mDecodeMediaCodec.start();
        mDecodeInputBuffers = mDecodeMediaCodec.getInputBuffers();
    } catch (IOException e) {
        e.printStackTrace();
        mLatch.trigger();
    }
}
private void decode(byte[] data) {
    try {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int inputBufferIndex = mDecodeMediaCodec.dequeueInputBuffer(1000);
        if (inputBufferIndex >= 0) {
            ByteBuffer buffer = mDecodeInputBuffers[inputBufferIndex];
            buffer.clear();
            buffer.put(data);
            mDecodeMediaCodec.queueInputBuffer(inputBufferIndex,
                    0,
                    data.length,
                    packet.sequence / 1000, // presentation time, derived from the answerer's packet object
                    0);
            data = null;
            //decodeDataBuffer.clear();
            //decodeDataBuffer = null;
        }
        int outputBufferIndex = mDecodeMediaCodec.dequeueOutputBuffer(info, 1000);
        do {
            if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // no output available yet
            } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                //encodeOutputBuffers = mDecodeMediaCodec.getOutputBuffers();
            } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                format = mDecodeMediaCodec.getOutputFormat();
                // media format changed
            } else if (outputBufferIndex < 0) {
                // unexpected result from decoder.dequeueOutputBuffer
            } else {
                mDecodeMediaCodec.releaseOutputBuffer(outputBufferIndex, true);
                outputBufferIndex = mDecodeMediaCodec.dequeueOutputBuffer(info, 0);
            }
        } while (outputBufferIndex > 0);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Please don't forget that the I-frame (the first frame's bytes) contains data the decoder depends on and MUST be fed to the decoder first.
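To expand on that point, here is a hedged sketch (not from the answer) of two ways to get the SPS/PPS configuration to the decoder before the first IDR frame; sps, pps, width, and height are assumed to be available.
// Sketch: two ways to hand SPS/PPS to the decoder before the first IDR frame.
// "sps" and "pps" are assumed to be raw NAL units with Annex-B start codes.

// Option 1: queue them as a codec-config buffer before any frame data.
int inIndex = mDecodeMediaCodec.dequeueInputBuffer(10000);
if (inIndex >= 0) {
    ByteBuffer buf = mDecodeMediaCodec.getInputBuffer(inIndex);
    buf.clear();
    buf.put(sps);
    buf.put(pps);
    mDecodeMediaCodec.queueInputBuffer(inIndex, 0, sps.length + pps.length,
            0, MediaCodec.BUFFER_FLAG_CODEC_CONFIG);
}

// Option 2: bake them into the MediaFormat before configure().
// width and height are assumed known.
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));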

Related

What's wrong with my h264 MediaCodec implementation?

I'm pulling video from a drone that encodes the video stream in H.264. It sends each NAL unit via 1-5 UDP packets. I have code that creates a NAL unit out of those packets and removes the header. It then passes it to MediaCodec, which then passes it to a Surface object.
The output should be a video feed but for some reason, all I get is a black screen. I know that the surface is working as intended because if I futz with the NAL unit I get this green garbage that I think happens when MediaCodec doesn't know what it's looking at.
Anyways here's the section of code that handles the actual decoding. Is there anything actually wrong with it or am I looking for the issue in the wrong place?
// This part initializes the decoder and generally sets up everything needed for the while loop down below
encodedVideo = new ServerSocket(11111, 1460, false, 0);
MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1920, 1080);
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 100000);
try {
    m_codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
    m_codec.configure(format, new Surface(droneSight.getSurfaceTexture()), null, 0);
} catch (Exception e) {
    e.printStackTrace();
}
m_codec.start();
running = true;
initialFrame = Arrays.copyOf(encodedVideo.getPacketData(true, true), encodedVideo.getPacketData(true, false).length);

// This section handles the actual grabbing and decoding of each NAL unit. Or it would, if it worked.
while (running) {
    byte[] frame = this.getNALUnit();
    int inputIndex = m_codec.dequeueInputBuffer(-1);
    if (inputIndex >= 0) {
        ByteBuffer buf = m_codec.getInputBuffer(inputIndex);
        buf.put(frame);
        m_codec.queueInputBuffer(inputIndex, 0, frame.length, 0, 0);
    }
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int outputIndex = m_codec.dequeueOutputBuffer(info, 0);
    if (outputIndex >= 0) {
        m_codec.releaseOutputBuffer(outputIndex, true);
    }
}

Getting audio from a byte array in Android

I am trying to get audio from an NV21 byte array. When I run the code below, I get a java.nio.BufferOverflowException on the line inputBuffer.put(input);. How can I get audio from the byte array?
I guess the error comes from the ByteBuffer; I think I need to make inputBuffer bigger, but I can't find how. Please help me.
public void init() {
    // initialize Audio Encoder
    File audio_file = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/", "audio_encoded.aac");
    try {
        outputStream = new BufferedOutputStream(new FileOutputStream(audio_file));
        Log.e("AudioEncoder", "outputStream initialized");
    } catch (Exception e) {
        e.printStackTrace();
    }
    try {
        audioCodec = MediaCodec.createEncoderByType(audioType);
    } catch (IOException e) {
        e.printStackTrace();
    }
    final int kSampleRates[] = { 8000, 11025, 22050, 44100, 48000 };
    final int kBitRates[] = { 64000, 128000 };
    MediaFormat audioFormat = MediaFormat.createAudioFormat(audioType, kSampleRates[3], 2);
    audioFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
    audioFormat.setInteger(MediaFormat.KEY_BIT_RATE, kBitRates[1]);
    audioCodec.configure(audioFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    audioCodec.start();
}

// called with the data from AudioRecord's read
public synchronized void audioEncoder(byte[] input) {
    Log.e("AudioEncoder", input.length + " is coming");
    try {
        ByteBuffer[] inputBuffers = audioCodec.getInputBuffers();
        ByteBuffer[] outputBuffers = audioCodec.getOutputBuffers();
        int inputBufferIndex = audioCodec.dequeueInputBuffer(-1);
        if (inputBufferIndex >= 0) {
            ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            inputBuffer.put(input);
            audioCodec.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
        }
        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        int outputBufferIndex = audioCodec.dequeueOutputBuffer(bufferInfo, 0);
        // Without ADTS header
        while (outputBufferIndex >= 0) {
            ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
            byte[] outData = new byte[bufferInfo.size];
            outputBuffer.get(outData);
            outputStream.write(outData, 0, outData.length);
            Log.e("AudioEncoder", outData.length + " bytes written");
            audioCodec.releaseOutputBuffer(outputBufferIndex, false);
            outputBufferIndex = audioCodec.dequeueOutputBuffer(bufferInfo, 0);
        }
    } catch (Throwable t) {
        t.printStackTrace();
    }
}

private CameraProxy.CameraDataCallBack callBack = new CameraProxy.CameraDataCallBack() {
    @Override
    public void onDataBack(byte[] data, long length) {
        // TODO Auto-generated method stub
        Log.i(TAG, "length . " + length);
        // audio play
        int min_buffer_size = AudioRecord.getMinBufferSize(sampleRateInHz, channelConfig, audioFormats);
        audioRecord = new AudioRecord(audioSource, sampleRateInHz, channelConfig, audioFormats, min_buffer_size);
        audioRecord.read(data, 0, data.length);
        audioEncoder(data);
    }
};
You can try:
audioFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, inputSize);
inputSize should be set according to the input size. Then the input buffers' capacity will be large enough.
Audio encoding offers a little more flexibility than video, in that it's easier to change the size of the input data.
In this case, I recommend checking the size of inputBuffer (inputBuffer.remaining()) and supplying exactly that amount of audio data. That means if data is too big, only put into the inputBuffer what will fit, and save the rest for the next input buffer. And if data is too small, buffer it someplace temporarily, until you get more audio data (enough to fill the entire inputBuffer). That's the way this codec is intended to be used.
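A rough sketch of that chunking approach (illustrative only; pending is a made-up holding queue, the presentation timestamp is left at 0 as in the question's code, and java.util.ArrayDeque, java.util.Arrays, and java.nio.ByteBuffer are assumed to be imported):
// Sketch: feed the encoder exactly as many bytes as the input buffer can hold,
// carrying any remainder over to the next input buffer.
private final ArrayDeque<byte[]> pending = new ArrayDeque<>();

public void audioEncoder(byte[] input) {
    pending.add(input);
    int index = audioCodec.dequeueInputBuffer(-1);
    if (index < 0) return;

    ByteBuffer inputBuffer = audioCodec.getInputBuffer(index);
    inputBuffer.clear();
    int written = 0;
    while (!pending.isEmpty() && inputBuffer.remaining() > 0) {
        byte[] head = pending.peek();
        int n = Math.min(head.length, inputBuffer.remaining());
        inputBuffer.put(head, 0, n);
        written += n;
        pending.poll();
        if (n < head.length) {
            // keep the unconsumed tail for the next input buffer
            pending.addFirst(Arrays.copyOfRange(head, n, head.length));
        }
    }
    audioCodec.queueInputBuffer(index, 0, written, /*presentationTimeUs=*/0, 0);
}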
As an aside: Your code appears to have some problems with confusing video data and audio data, the lifecycle of the AudioRecord object, and the proper threading arrangement for capturing video and audio simultaneously. Here are some hints:
NV21 is a picture format, not audio.
onDataBack() is giving you a picture -- then you're overwriting it with a bit of audio
In onDataBack(), data is going to be HUGE -- b/c it contains a picture. And you're trying to read in the whole thing with audio data. Depending on how the AudioRecord is configured, it may only read a few bytes. You should check the return value. From the docs:
Data should be read from the audio hardware in chunks of sizes inferior to the total recording buffer size.
If you are in need of a better piece of sample code, this project looks pretty decent.
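To illustrate the hint about checking read()'s return value, here is a sketch of a dedicated audio-capture loop; sampleRateInHz, channelConfig, and audioFormats are the question's own variables, isRecording is a hypothetical flag, and java.util.Arrays is assumed imported.
// Sketch: dedicated audio-capture loop. AudioRecord.read() may return fewer
// bytes than requested, so only that many are passed on to the encoder.
int minBuf = AudioRecord.getMinBufferSize(sampleRateInHz, channelConfig, audioFormats);
AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
        sampleRateInHz, channelConfig, audioFormats, minBuf);
recorder.startRecording();

byte[] chunk = new byte[minBuf];
while (isRecording) {
    int read = recorder.read(chunk, 0, chunk.length);
    if (read > 0) {
        audioEncoder(Arrays.copyOf(chunk, read)); // pass only the bytes actually read
    }
}
recorder.stop();
recorder.release();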

MediaCodec failing on S7

I have set up a series of classes to decode H264 streaming video from a server and render it on a SurfaceView. This code works perfectly on every device I've tried, including the emulator, but I recently bought myself an S7 and on this device it no longer works properly.
The weird thing is that sometimes it will work perfectly, and other times it will throw this error:
06-15 16:41:40.249 13300-24605/cm.myapp E/ACodec: [OMX.Exynos.avc.dec] ERROR(0x90000012)
06-15 16:41:40.249 13300-24605/cm.myapp E/ACodec: signalError(omxError 0x90000012, internalError -2147483648)
06-15 16:41:40.249 13300-24604/cm.myapp E/MediaCodec: Codec reported err 0x90000012, actionCode 0, while in state 6
06-15 16:41:40.249 13300-24578/cm.myapp W/MediaStreamerThread: Failed to draw media.
Sometimes it will crash on the dequeueOutputBuffer call:
java.lang.IllegalStateException
at android.media.MediaCodec.native_dequeueOutputBuffer(Native Method)
at android.media.MediaCodec.dequeueOutputBuffer(MediaCodec.java:2379)
And then again, some other times it will throw this very different error:
06-15 16:34:57.239 13300-16625/cm.myapp W/System.err: java.lang.IllegalArgumentException: The surface has been released
06-15 16:34:57.239 13300-16625/cm.myapp W/System.err: at android.media.MediaCodec.native_configure(Native Method)
06-15 16:34:57.239 13300-16625/cm.myapp W/System.err: at android.media.MediaCodec.configure(MediaCodec.java:1778)
java.lang.RuntimeException: Could not create h264 decoder
These errors by themselves are not very verbose and thus I cannot figure out where the problem might be.
Again my code works perfectly on most devices but it's failing on this one. How is this possible? Any ideas?
This is my decoding code:
public class H264Decoder
{
    static private final long TIMEOUT_US = 10000L;
    private MediaCodec mDecoder;
    private Surface mSurface;
    private static final List<byte[]> EMPTY_ENCODE_RESULT = new ArrayList<>();

    public void init()
    {
        try
        {
            mDecoder = MediaCodec.createDecoderByType("video/avc");
            MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 640, 480);
            mDecoder.configure(mediaFormat, mSurface, null, 0);
            mDecoder.start();
        }
        catch (NoClassDefFoundError ex) {
            ex.printStackTrace();
            throw new RuntimeException("Could not create h264 decoder", ex);
        }
    }

    public List<byte[]> offer(byte[] data)
    {
        List<byte[]> returnValue = new ArrayList<>();
        returnValue.add(decode(data, true));
        return returnValue;
    }

    public void release()
    {
        assert mSurface != null;
        assert mDecoder != null;
        mDecoder.stop();
        mDecoder.release();
        mDecoder = null;
    }

    public H264Decoder(Surface surface)
    {
        mSurface = surface;
    }

    public byte[] decode(byte[] data, boolean updateRender)
    {
        if (mSurface == null)
        {
            return null;
        }

        // INPUT -----------------------------------------------------------------------------------
        int inputBufferIndex = mDecoder.dequeueInputBuffer(TIMEOUT_US);
        if (inputBufferIndex >= 0)
        {
            // Get an input buffer from the codec, fill it with data, give it back
            ByteBuffer inputBuffer = mDecoder.getInputBuffers()[inputBufferIndex];
            inputBuffer.put(data);
            mDecoder.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0);
        }

        // OUTPUT ----------------------------------------------------------------------------------
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outputBufferIndex = mDecoder.dequeueOutputBuffer(info, TIMEOUT_US);
        if (outputBufferIndex >= 0)
        {
            final ByteBuffer[] outputBuffer = mDecoder.getOutputBuffers();
            mDecoder.releaseOutputBuffer(outputBufferIndex, updateRender);
        }
        else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED)
        {
            //outputBuffers = codec.getOutputBuffers();
        }
        else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED)
        {
            MediaFormat format = mDecoder.getOutputFormat();
        }
        return null;
    }
}
It is hard to say what could be going wrong. One thing stands out, though:
mDecoder.releaseOutputBuffer(outputBufferIndex, updateRender);
if (!updateRender) {
    return outputBuffer[outputBufferIndex].array();
}
I would not recommend returning the array backing an output buffer. The documentation states:
Once an output buffer is released to the codec, it MUST NOT be used
If you truly need to keep the sample data, it would be better to create a copy of the output buffer, call release, and then return the copy.
If the returned byte array is processed somewhere else, I would recommend extracting this decode method into a loop, processing the byte buffer (the output buffer) there, and only then calling releaseOutputBuffer. For more reference, look into "MediaCodec Synchronous Processing using Buffers".
For the Samsung S7, you have to make sure that the first H.264 frame inserted into the decoder is an I-frame.
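If it helps, here is a small sketch (not from the answer) of gating input on the first IDR frame, assuming each byte[] is a single NAL unit prefixed with a 4-byte Annex-B start code (the 0x1F mask extracts nal_unit_type; type 5 is an IDR slice, 7 is SPS, 8 is PPS):
// Sketch: skip NAL units until the first IDR (keyframe) arrives.
private boolean seenKeyframe = false;

boolean shouldFeed(byte[] nal) {
    if (nal.length < 5) return seenKeyframe;   // too short to carry a NAL header
    int nalType = nal[4] & 0x1F;               // byte right after the 4-byte start code
    if (nalType == 5) {                        // 5 = IDR slice
        seenKeyframe = true;
    }
    if (nalType == 7 || nalType == 8) {
        return true;                           // always feed SPS (7) / PPS (8)
    }
    return seenKeyframe;
}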
On Samsung Galaxy S8, I would get ERROR(0x90000012) asynchronously after calling MediaCodec.configure(), before feeding any stream input.
One sure way of getting ERROR(0x90000012) is to use streams in AVCC format (like Apple Quicktime movies).
In fact, it seems that across many Android devices, we have more success using H.264 in Annex-B format.
Annex-B also means not needing to pass out-of-band extra data using MediaCodec.BUFFER_FLAG_CODEC_CONFIG or SPS/PPS configured via MediaFormat (csd-0, csd-1 keys).
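For completeness, a rough sketch of converting an AVCC (length-prefixed) access unit into Annex-B start codes before queueing it; this is my own illustration, not code from the answer, and it assumes 4-byte length fields (java.io.ByteArrayOutputStream assumed imported):
// Sketch: rewrite AVCC length prefixes (4-byte big-endian NAL lengths)
// as Annex-B start codes (00 00 00 01) so the decoder sees a plain byte stream.
static byte[] avccToAnnexB(byte[] avcc) {
    ByteArrayOutputStream out = new ByteArrayOutputStream(avcc.length + 16);
    int pos = 0;
    while (pos + 4 <= avcc.length) {
        int nalLength = ((avcc[pos] & 0xFF) << 24) | ((avcc[pos + 1] & 0xFF) << 16)
                | ((avcc[pos + 2] & 0xFF) << 8) | (avcc[pos + 3] & 0xFF);
        pos += 4;
        if (nalLength <= 0 || pos + nalLength > avcc.length) break; // malformed input
        out.write(0); out.write(0); out.write(0); out.write(1);      // start code
        out.write(avcc, pos, nalLength);                             // NAL payload
        pos += nalLength;
    }
    return out.toByteArray();
}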

Convert an audio track with MediaCodec

I was thinking of using MediaCodec to convert an arbitrary audio track to a fixed format (codec, sample rate, bitrate, etc.).
Reading the official docs, an obvious solution would be to use two loops: one for the decoder and one for the encoder, saving the raw output data obtained from the decoder to a temporary file.
for (;;) {
    ...
    ByteBuffer inputBuffer = decoder.getInputBuffer(inputBufferIndex);
    extractor.readSampleData(inputBuffer, 0);
    ...
    ByteBuffer outputBuffer = decoder.getOutputBuffer(outputBufferIndex);
    ...
    // save contents of output buffer somewhere
}

for (;;) {
    ...
    ByteBuffer inputBuffer = encoder.getInputBuffer(inputBufferIndex);
    // read saved data into input buffer
    ...
    ByteBuffer outputBuffer = encoder.getOutputBuffer(outputBufferIndex);
    ...
    mediaMuxer.writeSampleData(destAudioTrack, outputBuffer, ...);
}
Is there a way to efficiently chain the decoder -> encoder cycles, so only one loop and no temporary storage is needed?
[Edit]
Based on fadden's help and the code from DecodeEditEncodeTest.java I attempted to copy buffers on the fly (currently truncating the contents of the input buffer if the output buffer is smaller). While this works fine on a Nexus 7, it doesn't work on a OPO with the same input file. Except for the first iteration, encoderInputBuf.remaining() is always 0. What am I doing wrong?
if (!decoderDone) {
    int decoderStatus = decoder.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC);
    ...
    if (decoderStatus >= 0) {
        if (bufferInfo.size != 0) {
            int inputBufIndex = encoder.dequeueInputBuffer(TIMEOUT_USEC);
            if (inputBufIndex >= 0) {
                ByteBuffer decoderOutputBuf = decoderOutputBuffers[decoderStatus];
                ByteBuffer encoderInputBuf = encoderInputBuffers[inputBufIndex];
                int maxTransfer = Math.min(encoderInputBuf.remaining(), decoderOutputBuf.remaining());
                if (maxTransfer > 0) {
                    ByteBuffer tmpByteBuf = decoderOutputBuf.duplicate();
                    tmpByteBuf.limit(tmpByteBuf.position() + maxTransfer);
                    encoderInputBuf.put(tmpByteBuf);
                }
                encoder.queueInputBuffer(inputBufIndex, 0, maxTransfer, bufferInfo.presentationTimeUs, flags);
                ...
            }
            ...
        }
        decoder.releaseOutputBuffer(decoderStatus, false);
    }
}

Android, use Mediacodec with libstreaming

I have a problem with this library:
https://github.com/fyhertz/libstreaming
It allows streaming the camera's video over the network, and it offers three methods: two using MediaCodec and one using MediaRecorder.
I would like to modify it, and I need to use only MediaCodec. First of all, I tried the code from example 2 of the library, but I keep hitting the same error:
the log tells me that the device can use MediaCodec; it sets up the encoder, but when it tests the decoder it fails and the buffer is filled with -1.
This is the method in the EncoderDebugger class where the exception occurs. Can some kind soul help me, please?
private long decode(boolean withPrefix) {
    int n = 3, i = 0, j = 0;
    long elapsed = 0, now = timestamp();
    int decInputIndex = 0, decOutputIndex = 0;
    ByteBuffer[] decInputBuffers = mDecoder.getInputBuffers();
    ByteBuffer[] decOutputBuffers = mDecoder.getOutputBuffers();
    BufferInfo info = new BufferInfo();
    while (elapsed < 3000000) {
        // Feeds the decoder with a NAL unit
        if (i < NB_ENCODED) {
            decInputIndex = mDecoder.dequeueInputBuffer(1000000 / FRAMERATE);
            if (decInputIndex >= 0) {
                int l1 = decInputBuffers[decInputIndex].capacity();
                int l2 = mVideo[i].length;
                decInputBuffers[decInputIndex].clear();
                if ((withPrefix && hasPrefix(mVideo[i])) || (!withPrefix && !hasPrefix(mVideo[i]))) {
                    check(l1 >= l2, "The decoder input buffer is not big enough (nal=" + l2 + ", capacity=" + l1 + ").");
                    decInputBuffers[decInputIndex].put(mVideo[i], 0, mVideo[i].length);
                } else if (withPrefix && !hasPrefix(mVideo[i])) {
                    check(l1 >= l2 + 4, "The decoder input buffer is not big enough (nal=" + (l2 + 4) + ", capacity=" + l1 + ").");
                    decInputBuffers[decInputIndex].put(new byte[] {0, 0, 0, 1});
                    decInputBuffers[decInputIndex].put(mVideo[i], 0, mVideo[i].length);
                } else if (!withPrefix && hasPrefix(mVideo[i])) {
                    check(l1 >= l2 - 4, "The decoder input buffer is not big enough (nal=" + (l2 - 4) + ", capacity=" + l1 + ").");
                    decInputBuffers[decInputIndex].put(mVideo[i], 4, mVideo[i].length - 4);
                }
                mDecoder.queueInputBuffer(decInputIndex, 0, l2, timestamp(), 0);
                i++;
            } else {
                if (VERBOSE) Log.d(TAG, "No buffer available !7");
            }
        }
        // Tries to get a decoded image
        decOutputIndex = mDecoder.dequeueOutputBuffer(info, 1000000 / FRAMERATE);
        if (decOutputIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            decOutputBuffers = mDecoder.getOutputBuffers();
        } else if (decOutputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            mDecOutputFormat = mDecoder.getOutputFormat();
        } else if (decOutputIndex >= 0) {
            if (n > 2) {
                // We have successfully encoded and decoded an image !
                int length = info.size;
                mDecodedVideo[j] = new byte[length];
                decOutputBuffers[decOutputIndex].clear();
                decOutputBuffers[decOutputIndex].get(mDecodedVideo[j], 0, length);
                // Converts the decoded frame to NV21
                convertToNV21(j);
                if (j >= NB_DECODED - 1) {
                    flushMediaCodec(mDecoder);
                    if (VERBOSE) Log.v(TAG, "Decoding " + n + " frames took " + elapsed / 1000 + " ms");
                    return elapsed;
                }
                j++;
            }
            mDecoder.releaseOutputBuffer(decOutputIndex, false);
            n++;
        }
        elapsed = timestamp() - now;
    }
    throw new RuntimeException("The decoder did not decode anything.");
}
Here are my suggestions:
(1) Check the settings of the encoder and decoder, and make sure that they match. For example, the resolution and color format must be the same.
(2) Make sure the very first packet generated by the encoder has been sent and pushed into the decoder; this packet defines the basic settings of the video stream (see the sketch below).
(3) The decoder usually buffers 5-10 frames, so data in the buffer is invalid for a few hundred ms.
(4) While initializing the decoder, set the surface as null. Otherwise the output buffer will be read by the surface and probably released automatically.
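To illustrate suggestion (2), here is a sketch of forwarding the encoder's codec-config output (the SPS/PPS) into the decoder before any frame data; mEncoder and mDecoder are hypothetical handles to the two codecs, not names from libstreaming.
// Sketch: the encoder's first output buffer normally carries BUFFER_FLAG_CODEC_CONFIG.
// That buffer must reach the decoder before any frame data.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int outIndex = mEncoder.dequeueOutputBuffer(info, 10000);
if (outIndex >= 0) {
    ByteBuffer encoded = mEncoder.getOutputBuffer(outIndex);
    byte[] chunk = new byte[info.size];
    encoded.get(chunk);
    mEncoder.releaseOutputBuffer(outIndex, false);

    if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
        // This is the stream configuration; queue it into the decoder before any frames.
        int inIndex = mDecoder.dequeueInputBuffer(10000);
        if (inIndex >= 0) {
            ByteBuffer in = mDecoder.getInputBuffer(inIndex);
            in.clear();
            in.put(chunk);
            mDecoder.queueInputBuffer(inIndex, 0, chunk.length, 0,
                    MediaCodec.BUFFER_FLAG_CODEC_CONFIG);
        }
    }
}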
