How to get PCM bit depth from MediaCodec decoder output MediaFormat? - android

I'm decoding an online AAC stream with MediaCodec and trying to play it with AudioTrack. AudioTrack requires an AudioFormat, which requires knowing the details of the PCM encoding. I was able to get the audio to play correctly by configuring it as ENCODING_PCM_16BIT, but is that guaranteed to always be 16-bit PCM encoding? I want to avoid making that assumption.
I would expect MediaCodec.Callback.onOutputFormatChanged to get the encoding information as part of its MediaFormat parameter, which seems to be the case per this answer: https://stackoverflow.com/a/49812393/2399236. But this is all the information I get out of the media format by calling toString() on it:
aac-drc-heavy-compression=1
sample-rate=48000
aac-drc-boost-level=127
aac-drc-output-loudness=-1
mime=audio/raw
channel-count=2
aac-drc-effect-type=3
aac-drc-cut-level=127
aac-encoded-target-level=-1
aac-max-output-channel_count=8
aac-target-ref-level=64
aac-drc-album-mode=0
Should I just assume that it's 16-bit PCM, or is there any way to get that information?
Note: I'm targeting min SDK 28, currently testing in the debugger on an emulated Pixel 2 (API 30).
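For context, this is roughly the lookup I was hoping to do: a sketch assuming API 24+, where MediaFormat.KEY_PCM_ENCODING ("pcm-encoding") is defined, falling back to 16-bit when the key is absent (which seems to be the conventional default):

import android.media.AudioFormat;
import android.media.MediaFormat;

// Sketch: derive the PCM encoding from the decoder's output MediaFormat.
// KEY_PCM_ENCODING exists since API 24; when the key is absent,
// 16-bit appears to be the conventional default.
static AudioFormat buildAudioFormat(MediaFormat outputFormat) {
    int encoding = AudioFormat.ENCODING_PCM_16BIT; // fallback assumption
    if (outputFormat.containsKey(MediaFormat.KEY_PCM_ENCODING)) {
        encoding = outputFormat.getInteger(MediaFormat.KEY_PCM_ENCODING);
    }
    int sampleRate = outputFormat.getInteger(MediaFormat.KEY_SAMPLE_RATE);
    int channelCount = outputFormat.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
    // Simplification: only mono and stereo handled here.
    int channelMask = (channelCount == 1)
            ? AudioFormat.CHANNEL_OUT_MONO
            : AudioFormat.CHANNEL_OUT_STEREO;
    return new AudioFormat.Builder()
            .setEncoding(encoding)
            .setSampleRate(sampleRate)
            .setChannelMask(channelMask)
            .build();
}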

Related

using opus in linphone

I would like to use the Opus codec in my linphone application, but I have a few questions. If someone with Opus codec knowledge could help me out, I would appreciate it.
OPUS. Does this codec compress as well as package the data?
What is the output data structure from OPUS?
Is the output data streaming or packets?
What does the audio sampling scheme look like?
and….
Within the audio sampling scheme, what are the values for silence?
Within the audio sampling scheme, what are the values for speech?
Thanks in advance.
I see you asked this on the mailing list too, but I'll answer here. I'm not sure what you mean in some of the questions, but here's a start. You've tagged your post as relating to Android; I mostly know about the reference C implementation, so if you're asking about the Java interface available to Android applications this won't be much help.
OPUS. Does this codec compress as well as package the data?
The Opus codec compresses PCM audio data into packets. There's internal structure, but the codec requires a transport layer like RTP to keep track of the boundaries between compressed packets.
What is the output data structure from OPUS?
The reference encoder accepts a given duration of PCM audio data and fills a given buffer with compressed data up to a maximum requested size. See opus_encode() and opus_encode_float() in the encoder documentation for details.
Is the output data streaming or packets?
Opus produces a sequence of packets.
What does the audio sampling scheme look like? and….
The reference encoder accepts interleaved mono, stereo, or surround PCM audio data with either 16-bit signed integer or floating-point samples at 8, 12, 16, 24, or 48 kHz.
Within the audio sampling scheme, what are the values for silence?
Zero PCM values are silence. As a perceptual codec, Opus will try to encode low-level noise if there is no other signal. There is also support for special zero-data compressed packets for sending silence or handling discontinuous transmission.
Within the audio sampling scheme, what are the values for speech?
I'm not sure what you're asking here. Speech is treated the same as music, and will sound equally normal down to 64 kbps. The codec can maintain transparency for speech down to much lower bitrates than for music (something like 24 kbps for mono) and is intelligible down to 6 kbps for narrowband speech.

buffer-to-buffer encode/decode from capture in android 4.1 (API 16)

I searched for hours...
I just want a working decode/encode of a recorded movie.
Is this even possible on Android 4.1?
Right now it writes only a few KB to my .mp4 file, with no errors.
Once this works, I will use KEY_FRAME_RATE and KEY_I_FRAME_INTERVAL to put it in slow motion.
I used a MediaExtractor to configure the MediaCodec.
I see 3 steps (see gist for complete code):
1./
encoder.dequeueInputBuffer(5000);
extractor.readSampleData(inputBuf, offset);
ptsUsec2 = extractor.getSampleTime();
encoder.queueInputBuffer(inputBufIndex, ...);
2./
encoder.dequeueOutputBuffer(info, 5000);
ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
//i write encodedData to a FileOutputStream (to save the MP4);
decoder.queueInputBuffer(inputBufIndex, ...);
3./
decoder.dequeueOutputBuffer(info, 5000);
decoder.releaseOutputBuffer(decoderStatus, ...);
Here is the complete function I modified from Google's EncodeDecodeTest file:
gist
Thanks for help,
Felix
Some additional information is available on bigflake. In particular, FAQ item #9.
The format of frames coming out of the MediaCodec decoder is not guaranteed to be useful. Many popular devices decode data into a proprietary YUV format, which is why the checkFrame() function in the buffer-to-buffer test can't always verify the results. You'd expect the MediaCodec encoder to be able to accept the frames output by the decoder, but that's not guaranteed.
Coding against API 18+ is generally much easier because you can work with a Surface rather than a ByteBuffer.
Of course, if all you want is slow-motion video, you don't need to decode and re-encode the H.264 stream. All you need to do is alter the presentation time stamps, which are in the .mp4 wrapper. On API 18+, you can extract with MediaExtractor and immediately encode with MediaMuxer, without involving MediaCodec at all. On API 16, MediaMuxer doesn't exist, so you'd need some other way to wrap H.264 as .mp4.
Unless, of course, you have some aversion to variable-frame-rate video, in which case you'll need to re-encode it with the "slow motion" frames repeated (and timestamps adjusted appropriately). The KEY_FRAME_RATE and KEY_I_FRAME_INTERVAL values will not help you -- they're set when the encoder is configured, and have no effect on frame timing.
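Here's a rough sketch of that extract-and-remux approach on API 18+. The 4x slowdown factor and the buffer size are illustrative values, and it assumes the input actually contains a video track:

import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import java.io.IOException;
import java.nio.ByteBuffer;

// Sketch: copy the video track to a new .mp4, stretching every presentation
// timestamp by SLOW_FACTOR. No MediaCodec decode/encode involved.
static void remuxSlowMotion(String inPath, String outPath) throws IOException {
    final int SLOW_FACTOR = 4; // illustrative
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(inPath);
    MediaMuxer muxer = new MediaMuxer(outPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

    // Find and select the first video track.
    int muxerTrack = -1;
    for (int i = 0; i < extractor.getTrackCount(); i++) {
        MediaFormat format = extractor.getTrackFormat(i);
        if (format.getString(MediaFormat.KEY_MIME).startsWith("video/")) {
            extractor.selectTrack(i);
            muxerTrack = muxer.addTrack(format);
            break;
        }
    }
    muxer.start();

    ByteBuffer buffer = ByteBuffer.allocate(1 << 20);
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    while (true) {
        int size = extractor.readSampleData(buffer, 0);
        if (size < 0) break; // end of stream
        info.offset = 0;
        info.size = size;
        info.presentationTimeUs = extractor.getSampleTime() * SLOW_FACTOR;
        // SAMPLE_FLAG_SYNC and BUFFER_FLAG_SYNC_FRAME share the same value,
        // so copying the extractor's flags marks key frames correctly.
        info.flags = extractor.getSampleFlags();
        muxer.writeSampleData(muxerTrack, buffer, info);
        extractor.advance();
    }
    muxer.stop();
    muxer.release();
    extractor.release();
}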

MediaCodec and 24 bit PCM

I am successfully using MediaCodec to decode audio; however, when I load a file with 24-bit samples, I have no way of knowing this has occurred. Since the application was assuming 16-bit samples, it fails.
When I print the MediaFormat, I see
{mime=audio/raw, durationUs=239000000, bits-format=6, channel-count=2, channel-mask=0, sample-rate=96000}
I assume that "bits-format" is a hint; however, this key is not declared in the API, and it is not actually emitted when the output format changes. I get
{mime=audio/raw, what=1869968451, channel-count=2, channel-mask=0, sample-rate=96000}
(By the way, what is the "what" key? I notice that if I interpret it as a four-character code, it is "outC"... just a flag that it is an output format?)
So what is the best recourse here? If I feed the ByteBuffer straight to the AudioTrack, it plays static, of course (assuming 16-bit PCM).
If I know the value, then I can convert it myself!
I understand from other questions that you cannot dictate the output format either.
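For example, if the output really were packed little-endian 24-bit PCM, the conversion I have in mind would be something like this sketch (it simply drops the low byte of each sample):

// Sketch: packed little-endian 24-bit PCM is [LSB, mid, MSB] per sample.
// Keeping the top two bytes yields little-endian 16-bit: [mid, MSB].
static byte[] pcm24ToPcm16(byte[] in) {
    byte[] out = new byte[(in.length / 3) * 2];
    for (int i = 0, o = 0; i + 2 < in.length; i += 3, o += 2) {
        out[o] = in[i + 1];     // bits 8-15 of the 24-bit sample
        out[o + 1] = in[i + 2]; // bits 16-23 (sign byte)
    }
    return out;
}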

How to use MediaCodec class to decode H.264 streams

I have been asked to display a video stream in Android (the stream is not from HTTP). The stream is raw H.264, recorded and encoded on a PC, and I receive it over Wi-Fi.
When I get the stream, can I use the MediaCodec decoder to decode the stream and display it?
Yes. Configure the MediaCodec as a "video/avc" decoder, and pass an output Surface to the configure() call.
The MediaCodec API is pretty low-level, and there's not a lot of sample code available. It might be easier to use MediaPlayer instead.
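A minimal setup sketch; the width and height here are nominal placeholders, since the decoder reads the real dimensions from the stream's SPS:

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

// Sketch: hardware H.264 decoder rendering straight to a Surface.
static MediaCodec createAvcDecoder(Surface surface, int width, int height) throws IOException {
    MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
    decoder.configure(format, surface, null /* no crypto */, 0 /* 0 = decode */);
    decoder.start();
    return decoder;
}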
Update:
There's now a bunch of sample code here. Most of it makes use of Android 4.3 (API 18) features, but if you don't need MediaMuxer or Surface input to MediaCodec it'll work on API 16.
See the Video Encoding Recommendations here

Decoding Raw H264 stream in android?

I have a project where I have been asked to display a video stream in Android. The stream is raw H.264, and I am connecting to a server and will receive a byte stream from it.
Basically I'm wondering is there a way to send raw bytes to a decoder in android and display it on a surface?
I have been successful in decoding H264 wrapped in an mp4 container using the new MediaCodec and MediaExtractor APIs in Android 4.1; unfortunately, I have not found a way to decode a raw H264 file or stream using these APIs.
I understand that one way is to compile and use FFmpeg, but I'd rather use a built-in method that can use HW acceleration. I also understand RTSP streaming is supported in Android, but this is not an option. Android version is not an issue.
I can't provide any code for this unfortunately, but I'll do my best to explain it based on how I got it to work.
So here is my overview of how I got raw H.264 encoded video to work using the MediaCodec class.
Using the link above, there is an example of getting the decoder set up and how to use it; you will need to set it up for decoding H.264 AVC.
The format of H.264 is that it's made up of NAL units, each starting with a start prefix of three bytes with the values 0x00, 0x00, 0x01, and each unit has a different type depending on the value of the 4th byte right after these 3 starting bytes. One NAL unit IS NOT one frame of video; each frame is made up of a number of NAL units.
Basically, I wrote a method that finds each individual unit and passes it to the decoder (one NAL unit being the starting prefix and any bytes thereafter, up until the next starting prefix).
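I can't paste my actual code, but the splitting idea looks roughly like this simplified sketch (it scans one buffered chunk; a streaming version would have to carry state across reads):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch: scan for the 0x00 0x00 0x01 start prefix and return each NAL
// unit (prefix included) as its own byte array.
static List<byte[]> splitNalUnits(byte[] stream) {
    List<byte[]> units = new ArrayList<>();
    int start = -1;
    for (int i = 0; i + 2 < stream.length; i++) {
        if (stream[i] == 0 && stream[i + 1] == 0 && stream[i + 2] == 1) {
            if (start >= 0) {
                units.add(Arrays.copyOfRange(stream, start, i));
            }
            start = i;
        }
    }
    if (start >= 0) {
        units.add(Arrays.copyOfRange(stream, start, stream.length));
    }
    return units;
}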
Now, if you have the decoder set up for decoding H.264 AVC and have an input buffer from the decoder, then you are ready to go. You need to fill this input buffer with a NAL unit, pass it back to the decoder, and continue doing this for the length of the stream.
But, to make this work, I had to pass the decoder an SPS (Sequence Parameter Set) NAL unit first. This unit has a byte value of 0x67 after the starting prefix (the 4th byte); on some devices, the decoder would crash unless it received this unit first.
Basically, ignore all other NAL units and keep parsing the stream until you find this unit; then you can pass all the other units to the decoder.
Some devices didn't need the SPS first and some did, but you are better off passing it in first.
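Identifying the SPS just means masking the NAL header byte (the byte right after the three-byte prefix); 0x67 & 0x1F gives type 7, the SPS:

// nalUnit[3] is the header byte after the 0x00 0x00 0x01 prefix.
// The low 5 bits are the NAL unit type: 7 = SPS, 8 = PPS.
int nalType = nalUnit[3] & 0x1F;
boolean isSps = (nalType == 7);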
Now, if you passed a Surface to the decoder when you configured it, then once it gets enough NAL units for a frame, it should display it on the surface.
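Putting the input side together, the feed loop looks something like this sketch (getInputBuffer() needs API 21+; a real loop must also drain output buffers with dequeueOutputBuffer()/releaseOutputBuffer() in parallel, or the decoder will stall):

import android.media.MediaCodec;
import java.nio.ByteBuffer;
import java.util.List;

// Sketch: one NAL unit per input buffer. 'decoder' must be a started
// "video/avc" MediaCodec configured with a Surface.
static void feedNalUnits(MediaCodec decoder, List<byte[]> units) {
    long ptsUs = 0;
    for (byte[] nal : units) {
        int index = decoder.dequeueInputBuffer(10_000); // timeout in microseconds
        if (index < 0) continue; // sketch only: real code would retry this unit
        ByteBuffer input = decoder.getInputBuffer(index);
        input.clear();
        input.put(nal);
        decoder.queueInputBuffer(index, 0, nal.length, ptsUs, 0 /* flags */);
        ptsUs += 33_333; // invented ~30 fps timing; raw H.264 carries none
    }
}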
You can download the raw H.264 from the server, then offer it via a local HTTP server running on the phone and then let VLC for Android do playback from that HTTP server. You should use VLC's http/h264:// scheme to force the demuxer to raw H.264 (if you don't force the demuxer VLC may not be able to recognize the stream, even when the MIME type returned by the HTTP server is set correctly). See
https://github.com/rauljim/tgs-android/blob/integrate_record/src/com/tudelft/triblerdroid/first/VideoPlayerActivity.java#L211
for an example on how to create an Intent that will launch VLC.
Note: raw H.264 apparently has no timing info, so VLC will play as fast as possible.
Embedding it in MPEG-TS first would be better. I haven't found an Android lib that will do that yet.
Here are the resources I've found helpful in a similar project:
This video has been super insightful for understanding how MediaCodec handles raw H.264 streams at a high level.
This thread goes into a bit more detail as to handling the SPS/PPS NALUs specifically. As was mentioned above, you need to separate individual NAL Units using the start prefix, and then hand the remaining data to the MediaCodec.
This repo (libstreaming) is a great example of decoding an H264 stream in Android using RTSP/RTP for transmission.
