Stream H.264 to JavaFX, possibly using JavaCV/FFmpeg - Android

I'm really stuck on getting a video stream to play in a JavaFX project.
-- Short version:
I'm streaming H.264/AVCC-flavor video from an Android phone to a desktop computer. However, JavaFX doesn't have an easy solution for displaying the stream, so I'm attempting to use JavaCV/FFmpeg to make this work, but I am getting errors from FFmpeg.
1) Is there a better way to display streaming video in JavaFX?
2) Do you have a sample project or good tutorial for the JavaCV FFmpegFrameGrabber?
3) I think I may be missing some small detail in my code, but I'm not sure what it would be.
-- Longer Version:
1) On the Android end I'm capturing video using MediaRecorder. In order to get the SPS/PPS info, I record and save a small movie to the device and then parse the SPS and PPS data out of it.
2) Next, on the Android side, I split the NAL units up to meet the MTU requirement and send them over a UDP connection to my desktop.
3) On my desktop I reassemble the NAL units (or trash them if they lose data) and feed them into an InputStream that I gave to the FrameGrabber constructor (a sketch of that bridge follows this list).
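One way the reassembled NAL units could reach the grabber as a continuous byte stream is a PipedOutputStream/PipedInputStream pair. This is only a minimal sketch, with class and method names assumed rather than taken from the actual project:

import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

// Sketch: bridge reassembled NAL units (Annex B framing, start codes included)
// into the InputStream handed to the FFmpegFrameGrabber constructor.
public class NaluPipe {

    private final PipedOutputStream out = new PipedOutputStream();
    private final PipedInputStream in;

    public NaluPipe() throws IOException {
        // Generous pipe buffer so the UDP receiver does not block on a slow decoder.
        in = new PipedInputStream(out, 1 << 20);
    }

    // Called by the UDP reassembly code for every complete NAL unit.
    public void writeNalu(byte[] nalu, int length) throws IOException {
        out.write(nalu, 0, length);
        out.flush();
    }

    // Handed to ImageDecoder.streamImageToImageView / FFmpegFrameGrabber.
    public PipedInputStream getInputStream() {
        return in;
    }
}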
-- The Code and Logs:
The errors are long and numerous, depending on the flavor I feed it. Here are two separate examples, which are usually repeated at great length:
[h264 # 0000020225907a40] non-existing PPS 0 referenced
[h264 # 0000020225907a40] decode_slice_header error
[h264 # 0000020225907a40] no frame!
[h264 # 00000163d8637a40] illegal aspect ratio
[h264 # 00000163d8637a40] pps_id 3412 out of range
[AVBSFContext # 00000163e28a0e00] Invalid NAL unit 0, skipping.
!! One big caveat that I am aware of: I have not passed on the timestamps I created on the Android device when feeding FFmpeg. I think it should still show distorted images without this, though.
Because I have spent all day guessing and trying, I have several "flavors" of data I have shoved through. I am only showing the first few bytes of each NAL unit; if the data were correct, I believe it would at least show a garbage image as long as my SPS and PPS are right.
sps: 67 80 80 1E E9 01 68 22 FD C0 36 85 09 A8
pps: 68 06 06 E2
Below is Annex B style.
Each NAL unit was prefixed with either 00 00 01 or 00 00 00 01.
Debug transfer 65 B8 40 0B E5 B8 7B 80 5B 85
Debug transfer 41 E2 20 7A 74 34 3B D6 BE FA
Debug transfer 41 E4 40 2F 01 E0 0C 06 EE 91
Debug transfer 41 E6 60 3E A1 20 5A 02 3C 6D
Debug transfer 41 E8 80 13 B0 B9 82 C3 03 F4
Debug transfer 41 EC C0 1B A3 0C 28 F1 B0 C8
Debug transfer 41 EE E0 1F CE 07 30 EE 05 06
Debug transfer 41 F1 00 08 ED 80 9C 20 09 73
Debug transfer 41 F3 20 09 E9 00 86 60 21 C3
VideoDecoderaddPacket type: 24
Debug transfer 67 80 80 1E E9 01 68 22 FD C0
Debug transfer 68 06 06 E2
Debug transfer 65 B8 20 00 9F 80 78 00 12 8A
Debug transfer 41 E2 20 09 F0 1E 40 7B 0C E0
Debug transfer 41 E4 40 09 F0 29 30 D6 00 AE
Debug transfer 41 E6 60 09 F1 48 31 80 99 40
[h264 # 000001c771617a40] non-existing PPS 0 referenced
Here I tried AVCC style. You can see the first line is the combination of the SPS and PPS, followed by an IDR and then repeated non-IDR NAL units (a conversion sketch follows this dump).
Debug transfer 18 00 0E 67 80 80 1E E9 01 68
Debug transfer 00 02 4A 8F 65 B8 20 00 9F C5
Debug transfer 00 02 2F DA 41 E2 20 09 E8 0F
Debug transfer 00 02 2C 34 41 E4 40 09 F4 20
Debug transfer 00 02 4D 92 41 E6 60 09 FC 2B
Debug transfer 00 02 47 02 41 E8 80 09 F0 72
Debug transfer 00 02 52 50 41 EA A0 09 EC 0F
Debug transfer 00 02 58 8A 41 EC C0 09 FC 6F
Debug transfer 00 02 55 F9 41 EE E0 09 FC 6E
Debug transfer 00 02 4D 79 41 F1 00 09 F0 3E
Debug transfer 00 02 4D B6 41 F3 20 09 E8 64
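When FFmpeg reads a raw H.264 elementary stream it generally expects Annex B framing, so AVCC length prefixes have to be rewritten as start codes before the bytes go into the stream. A minimal conversion sketch, not the asker's code; it assumes standard 4-byte big-endian length prefixes and whole NAL units per buffer:

import java.io.ByteArrayOutputStream;

// Sketch: rewrite AVCC length-prefixed NAL units as Annex B start-code framing.
public static byte[] avccToAnnexB(byte[] avcc) {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    int pos = 0;
    while (pos + 4 <= avcc.length) {
        // 4-byte big-endian NAL unit length
        int nalLength = ((avcc[pos] & 0xFF) << 24) | ((avcc[pos + 1] & 0xFF) << 16)
                      | ((avcc[pos + 2] & 0xFF) << 8) | (avcc[pos + 3] & 0xFF);
        pos += 4;
        if (nalLength <= 0 || pos + nalLength > avcc.length) {
            break; // truncated packet: drop the rest
        }
        out.write(0x00); out.write(0x00); out.write(0x00); out.write(0x01); // start code
        out.write(avcc, pos, nalLength);
        pos += nalLength;
    }
    return out.toByteArray();
}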
The following class is where I try to get JavaCV/FFmpeg to show the video. I don't think it's an ideal solution and am researching CanvasFrame as a replacement for the ImageView.
import java.awt.image.BufferedImage;
import java.io.InputStream;

import javafx.application.Platform;
import javafx.embed.swing.SwingFXUtils;
import javafx.scene.image.ImageView;

import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.FrameGrabber;
import org.bytedeco.javacv.Java2DFrameConverter;

public class ImageDecoder {

    private final static String TAG = "ImageDecoder ";

    private ImageDecoder() {
    }

    public static void streamImageToImageView(
            final ImageView view,
            final InputStream inputStream,
            final String format,
            final int frameRate,
            final int bitrate,
            final String preset,
            final int numBuffers
    ) {
        System.out.println("Image Decoder Starting...");
        try (final FrameGrabber grabber = new FFmpegFrameGrabber(inputStream)) {
            final Java2DFrameConverter converter = new Java2DFrameConverter();
            grabber.setFrameRate(frameRate);
            grabber.setFormat(format);
            grabber.setVideoBitrate(bitrate);
            grabber.setVideoOption("preset", preset);
            grabber.setNumBuffers(numBuffers);
            System.out.println("Image Decoder waiting on grabber.start...");
            grabber.start(); // ---- this call is blocking the loop
            System.out.println("Image Decoder Looping-------------------------------------- hit stop");
            while (!Thread.interrupted()) {
                final Frame frame = grabber.grab();
                if (frame != null) {
                    final BufferedImage bufferedImage = converter.convert(frame);
                    if (bufferedImage != null) {
                        // Hand the converted frame to the JavaFX Application Thread.
                        Platform.runLater(() ->
                                view.setImage(SwingFXUtils.toFXImage(bufferedImage, null)));
                    } else {
                        System.out.println("no buf im");
                    }
                } else {
                    System.out.println("no fr");
                    Thread.currentThread().interrupt();
                }
            }
        } catch (Exception e) {
            System.out.print(TAG + e);
        }
    }
}
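For reference, the method blocks, so it has to be driven off the JavaFX Application Thread. A minimal invocation sketch (the parameter values and the naluPipe bridge are assumptions, not part of the actual project):

// Sketch: run the blocking decode loop on a background thread, e.g. from Application.start().
ImageView view = new ImageView();
InputStream nalStream = naluPipe.getInputStream(); // bridge from the UDP reassembly side (assumed)

Thread decoderThread = new Thread(() ->
        ImageDecoder.streamImageToImageView(view, nalStream, "h264",
                30,          // frame rate (assumed)
                2_000_000,   // bitrate (assumed)
                "ultrafast", // preset (assumed)
                4),          // numBuffers (assumed)
        "image-decoder");
decoderThread.setDaemon(true);
decoderThread.start();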
Any help is greatly appreciated.

So I had two problems.
The first was that my SPS/PPS parsing method had a mistake. Notice the 2nd and 3rd bytes of the SPS above are the same.
The second was that I accidentally oversized a buffer and created large 0x00-padded areas, which emulated start codes.
This was a big project for me and I want to help others. Please visit my website, where I wrote a lengthy multi-part discussion about streaming H.264.
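For anyone hitting the same oversized-buffer problem: the usual fix is to copy only the bytes the datagram actually contained instead of forwarding the whole receive buffer. A minimal sketch of that pattern (socket setup assumed):

import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.util.Arrays;

// Sketch: receive one UDP datagram and return only the bytes that actually arrived.
// Forwarding the full receive buffer carries trailing 0x00 padding that can
// look like Annex B start codes to the decoder.
static byte[] receiveNalFragment(DatagramSocket socket) throws IOException {
    byte[] receiveBuffer = new byte[65507]; // maximum UDP payload
    DatagramPacket packet = new DatagramPacket(receiveBuffer, receiveBuffer.length);
    socket.receive(packet);
    return Arrays.copyOf(packet.getData(), packet.getLength()); // trim to the real size
}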

Related

APDU commands to get data from NFC tag using ISODEP

I am trying to read the information stored on my German Sparkasse Girocard. My app successfully recognizes the (IsoDep) tag. To read the stored information, I need to send a sequence of APDU commands, but I am not sure which ones.
From my understanding, I need to send a SELECT command first:
byte[] SELECT = {
    (byte) 0x00, // CLA Class
    (byte) 0xA4, // INS Instruction
    (byte) 0x04, // P1 Parameter 1
    (byte) 0x00, // P2 Parameter 2
    (byte) 0x09, // Lc
    (byte) 0xD2, 0x76, 0x00, 0x00, 0x25, 0x47, 0x41, 0x01, 0x00, // AID
    (byte) 0x00  // Le
};
Tag tagFromIntent = intent.getParcelableExtra(NfcAdapter.EXTRA_TAG);
IsoDep tag = IsoDep.get(tagFromIntent);
tag.connect();
byte[] result = tag.transceive(SELECT);
text.setText(Integer.toHexString(result[0]) + ", " + Integer.toHexString(result[1]));
The status response should be 9000 if it works. I am getting 6F44, which indicates that there was some sort of error (I think). I am also not quite sure whether I am using the correct AID, but it has not worked with other AIDs that I thought could be correct either.
What is my error and which commands do I have to send to retrieve the data?
Thanks in advance.
Getting data from an EMV card (e.g. Girocard, Mastercard, Visa card) is more of a "question and answer" puzzle - you ask the card something, get a response, analyze the data and ask the next question.
The analyzing part is done here by using the "TLV Utilities" from emvlab.org (https://emvlab.org/tlvutils/).
To get more information about the Application Identifiers ("AIDs"), see the complete list at: https://www.eftlab.com/knowledge-base/211-emv-aid-rid-pix/.
Here is an example of reading a German Girocard (mine is from a "Volksbank"):
Step 1: ask the card which applications are on the card using the "select PPSE" command (a sketch of sending this from Android follows the AID list below). The 2 bytes at the end, "90 00", say that the answer is successful:
selectPpseCommand: 00 A4 04 00 0E 32 50 41 59 2E 53 59 53 2E 44 44 46 30 31 00
selectPpseResponse: 6F 67 84 0E 32 50 41 59 2E 53 59 53 2E 44 44 46 30 31 A5 55 BF 0C 52 61 19 4F 09 A0 00 00 00 59 45 43 01 00 87 01 01 9F 0A 08 00 01 05 01 00 00 00 00 61 1A 4F 0A A0 00 00 03 59 10 10 02 80 01 87 01 01 9F 0A 08 00 01 05 01 00 00 00 00 61 19 4F 09 D2 76 00 00 25 47 41 01 00 87 01 01 9F 0A 08 00 01 05 01 00 00 00 00 90 00
Parsed response:
6F File Control Information (FCI) Template
84 Dedicated File (DF) Name
325041592E5359532E4444463031
A5 File Control Information (FCI) Proprietary Template
BF0C File Control Information (FCI) Issuer Discretionary Data
61 Application Template
4F Application Identifier (AID) – card
A00000005945430100
87 Application Priority Indicator
01
9F0A Unknown tag
0001050100000000
61 Application Template
4F Application Identifier (AID) – card
A0000003591010028001
87 Application Priority Indicator
01
9F0A Unknown tag
0001050100000000
61 Application Template
4F Application Identifier (AID) – card
D27600002547410100
87 Application Priority Indicator
01
9F0A Unknown tag
0001050100000000
There are 3 applications with 3 different AIDs available on the card:
A00000005945430100: Zentraler Kreditausschuss (ZKA) Germany Girocard Electronic Cash
A0000003591010028001: Euro Alliance of Payment Schemes s.c.r.l. – EAPS Belgium Girocard EAPS ZKA (Germany)
D27600002547410100: ZKA Germany Girocard ATM
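A minimal sketch of sending that SELECT PPSE command from Android over an already-connected IsoDep tag (as in the question); the same pattern works for the "Select AID" command in step 2:

import android.nfc.tech.IsoDep;
import java.io.IOException;

// Sketch: SELECT PPSE ("2PAY.SYS.DDF01") on an already-connected IsoDep tag.
static byte[] selectPpse(IsoDep isoDep) throws IOException {
    byte[] command = {
        (byte) 0x00, (byte) 0xA4, (byte) 0x04, (byte) 0x00, // CLA INS P1 P2
        (byte) 0x0E,                                        // Lc = 14
        '2', 'P', 'A', 'Y', '.', 'S', 'Y', 'S', '.', 'D', 'D', 'F', '0', '1',
        (byte) 0x00                                         // Le
    };
    // Everything before the trailing status word 90 00 is the BER-TLV FCI shown above.
    return isoDep.transceive(command);
}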
Step 2: Now I'm reading the first AID using the "Select AID" command - the AID is 18 characters = 9 bytes long:
selectAidCommand: 00 A4 04 00 09 A0 00 00 00 59 45 43 01 00 00
selectAidResponse: 6F 47 84 09 A0 00 00 00 59 45 43 01 00 A5 3A 50 08 67 69 72 6F 63 61 72 64 87 01 01 9F 38 06 9F 02 06 9F 1D 02 5F 2D 04 64 65 65 6E BF 0C 1A 9F 4D 02 19 0A 9F 6E 07 02 80 00 00 30 30 00 9F 0A 08 00 01 05 01 00 00 00 00 90 00
Parsed response:
6F File Control Information (FCI) Template
84 Dedicated File (DF) Name
A00000005945430100
A5 File Control Information (FCI) Proprietary Template
50 Application Label
g i r o c a r d
87 Application Priority Indicator
01
9F38 Processing Options Data Object List (PDOL)
9F02 06
9F1D 02
5F2D Language Preference
d e e n
BF0C File Control Information (FCI) Issuer Discretionary Data
9F4D Log Entry
190A
9F6E Unknown tag
02800000303000
9F0A Unknown tag
0001050100000000
Step 3: get the processing options to read the card details
This is the first "tricky" point, as it is not easy to explain how to get the right data for your card/AID. The above response contains a Processing Options Data Object List (PDOL) section listing the fields the card is requesting and their lengths - here we have 2 fields with lengths of 6 and 2 bytes, 8 bytes in total. The 8 data bytes are just 8 0x00s preceded by the "header" 83 08, so the complete length is 10 bytes = 0x0A (a sketch of assembling this command follows the parsed response below):
9F38 Processing Options Data Object List (PDOL)
9F02 06
9F1D 02
A more detailed explanation can be found here: https://stackoverflow.com/a/20810855/8166854
getProcessingOptionsCommand: 80 A8 00 00 0A 83 08 00 00 00 00 00 01 00 00 00
getProcessingOptionsResponse: 77 1E 82 02 19 80 94 18 18 01 01 00 20 01 01 00 20 04 04 00 08 05 05 01 08 07 07 01 08 03 03 01
Parsed response:
77 Response Message Template Format 2
82 Application Interchange Profile
1980
94 Application File Locator (AFL)
18010100 20010100 20040400 08050501 08070701 08030301
The most important part for the next step is the "Application File Locator (AFL)" - we need to read the files whose locations are coded in these 4-byte blocks.
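A minimal sketch of assembling the GET PROCESSING OPTIONS command above, assuming the concatenated PDOL data (here 6 + 2 = 8 bytes) has already been prepared:

// Sketch: wrap prepared PDOL data in a command template (tag 83) and build the GPO APDU.
static byte[] buildGpoCommand(byte[] pdolData) {          // 8 bytes for this card
    byte[] command = new byte[5 + 2 + pdolData.length + 1];
    command[0] = (byte) 0x80;                              // CLA (proprietary class)
    command[1] = (byte) 0xA8;                              // INS: GET PROCESSING OPTIONS
    command[2] = (byte) 0x00;                              // P1
    command[3] = (byte) 0x00;                              // P2
    command[4] = (byte) (2 + pdolData.length);             // Lc = 0x0A in this example
    command[5] = (byte) 0x83;                              // command template tag
    command[6] = (byte) pdolData.length;                   // 0x08 in this example
    System.arraycopy(pdolData, 0, command, 7, pdolData.length);
    command[command.length - 1] = (byte) 0x00;             // Le
    return command;
}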
Step 4: read the records from the card
This is the part where I got lost when first trying to read the card. You need to get the SFI and RECORD number from the first 3 bytes of an AFL block and run a READ RECORD command (a sketch of this derivation follows the parsed response below). The following command reads the 4th entry of the AFL list - the command may or may not work with your card, and if it works you may get different data from your card.
WARNING: providing the response data to an internet form may reveal data like account number or
credit card number - my response is masked so the account number is not 1111111111:
readRecordCommand: 00 B2 05 0C 00
readRecordResponse: 70 38 5F 24 03 21 12 31 5A 0A 67 26 42 89 11 11 11 11 11 7F 5F 34 01 02 5F 28 02 02 80 9F 07 02 FF C0 9F 0D 05 FC 40 A4 80 00 9F 0E 05 00 10 18 00 00 9F 0F 05 FC 40 A4 98 00 90 00
Parsed response:
70 EMV Proprietary Template
5F24 Application Expiration Date
211231
5A Application Primary Account Number (PAN)
6726428911111111117F
5F34 Application Primary Account Number (PAN) Sequence Number
02
5F28 Issuer Country Code
0280
9F07 Application Usage Control
FFC0
9F0D Issuer Action Code – Default
FC40A48000
9F0E Issuer Action Code – Denial
0010180000
9F0F Issuer Action Code – Online
FC40A49800
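A minimal sketch of the SFI/record derivation described above, expanding one 4-byte AFL entry into READ RECORD commands:

import java.util.ArrayList;
import java.util.List;

// Sketch: expand one 4-byte AFL entry into READ RECORD APDUs.
// AFL entry layout: [SFI << 3] [first record] [last record] [records used for offline auth]
static List<byte[]> readRecordCommandsForAflEntry(byte[] aflEntry) {
    int sfi = (aflEntry[0] & 0xFF) >> 3;
    int firstRecord = aflEntry[1] & 0xFF;
    int lastRecord = aflEntry[2] & 0xFF;
    List<byte[]> commands = new ArrayList<>();
    for (int record = firstRecord; record <= lastRecord; record++) {
        commands.add(new byte[] {
                (byte) 0x00,                // CLA
                (byte) 0xB2,                // INS: READ RECORD
                (byte) record,              // P1: record number
                (byte) ((sfi << 3) | 0x04), // P2: SFI plus "P1 is a record number"
                (byte) 0x00                 // Le
        });
    }
    return commands;
}
// Example: the AFL block 08 05 05 01 above yields the single command 00 B2 05 0C 00.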
I strongly recommend that you use a library for steps 3 and 4; I'm using
https://github.com/devnied/EMV-NFC-Paycard-Enrollment
for this.
This is just a basic explanation of the first steps, but now you can get some useful responses from your card - good luck with your next steps.
A complete Android example app for the above mentioned library is here (disclaimer: I'm the author):
https://github.com/AndroidCrypto/Android-EMV-NFC-Paycard-Example

WhatsApp video as Gif sharing on Android programatically

How can I convert an MP4 video file to a WhatsApp "gif" file (it's simply shown as a GIF inside the app UI, but internally it is a specific MP4 format) to be used in an Android share intent and recognized as this type of media by the WhatsApp chat app?
I searched a lot, but I can't find any information in the WhatsApp docs (they don't have this kind of doc anyway) or from any dev with the same problem as me.
WHAT I HAVE:
I have discovered that at the beginning of WhatsApp "gif" MP4 files there is a loop value if you read them in a hex editor; all such files have this. Removing this value makes WhatsApp receive the file as a regular video (not shared as a GIF).
How can I add this value using FFmpeg encoding? (Editing my MP4 files manually to add this value corrupts them; maybe I have to fix some MP4 header index that I don't know about yet...)
FIRST 80 BYTES in hexadecimal (from the beginning to the start of the "moov" atom of the MP4 structure):
00 00 00 1C 66 74 79 70 6D 70 34 32 00 00 00 01 6D 70 34 31 6D 70 34 32 69 73 6F 6D 00 00 00 18 62 65 61 6D 01 00 00 00 01 00 00 00 00 00 00 00 05 00 00 00 00 00 00 0C 6C 6F 6F 70 00 00 00 00 00 00 00 08 77 69 64 65 00 00 04 9F 6D 6F 6F 76
A short MP4 file generated by WhatsApp that is shown internally (in the app) as a GIF (with a different UI):
https://www.dropbox.com/s/kpynmx1bg3z76lz/VID-20171024-WA0009.mp4?dl=0
"...The problem is that I can't edit another MP4 file to add this atom
without corrupt the file.
Test this small_VC1edit.mp4 in WhatsApp. If it does what you want then read on...
To make a playable MP4:
Using the original small.mp4 as an editing example (download the file and open it with a hex editor).
1) In a blank byte array, add the first 72 bytes of the shown WhatsApp-style MP4 header.
00 00 00 1C 66 74 79 70 6D 70 34 32 00 00 00 01 6D 70 34 31 6D 70 34 32 69 73 6F 6D 00 00 00 18 62 65 61 6D 01 00 00 00 01 00 00 00 00 00 00 00 05 00 00 00 00 00 00 0C 6C 6F 6F 70 00 00 00 00 00 00 00 08 77 69 64 65
You've shown 80 bytes, but the last 8 bytes are not needed yet (also, four of those eight bytes' values must be different for your output file).
2) Calculate the deltas.
Note the (new) WhatsApp header is 72 bytes (before moov atom).
Note the (original) Small.mp4 has 160 bytes of header (before moov atom).
So just use this logic (a or b):
a) If WhatsApp header is bigger than input MP4 :
delta = ( WhatsApp_header - input_MP4_header)
b) If input MP4 header is bigger than WhatsApp :
delta = ( input_MP4_header - WhatsApp_header )
So for the input small.mp4, the 160 header bytes are followed by 4 bytes of moov's SIZE (as 00 00 0D 83) and then another 4 bytes of moov's NAME (as 6D 6F 6F 76, the UTF-8 text "moov").
We can say: 160 MP4 header bytes - 72 WhatsApp header bytes = delta of 88.
If you delete these original 160 bytes and replace them with the shorter 72 WhatsApp bytes, there will be 88 fewer bytes, which must now be accounted for in another section of the MOOV data. That section is the STCO atom.
3) Update the STCO atom with new offsets:
In small.mp4 the STCO atom begins at offset 1579 (as 73 74 63 6F). The previous 4 bytes (offsets 1575 to 1578) are stco's SIZE bytes (as 00 00 00 B8), which is decimal 184. This total SIZE includes those 4 SIZE bytes themselves.
Skip 12 bytes counting from the starting byte 73 of stco's NAME bytes, so skip these:
73 74 63 6F 00 00 00 00 00 00 00 2A
Now you have reached the point where you sequentially update every 32-bit (4-byte) offset integer with the new delta value. But how many offsets are there to update?
atomEditTotal = ((stco_SIZE - 16) / 4); // gives 42 // PS: subtracting 16 trims off the non-offset bytes
So there are 42 entries to edit. Our delta is 88, so for each integer we read the value, subtract 88, write the new value back in the same place, and repeat for the remaining 41 entries (using a while loop with an if condition to break out of the loop).
For testing, given a corrupt file, editing just the first entry should be enough to show frame 1 of the video (for a file without audio).
PS: After editing the STCO offsets of small.mp4, just delete its first 160 bytes and join/concat the remaining MP4 bytes to the back/end of the 72-byte WhatsApp header. Save the array as a new file and test.
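A rough sketch of the offset fix-up described in steps 2 and 3 (it assumes the position of the stco SIZE bytes has already been located, and it handles the 32-bit stco atom only, not co64):

// Sketch: subtract a delta from every 32-bit chunk offset in an stco atom.
// 'mp4' is the whole file in memory; 'stcoSizePos' is the offset of the 4 SIZE
// bytes that precede the ASCII "stco" name (1575 in small.mp4); 'delta' is 88 here.
static void patchStcoOffsets(byte[] mp4, int stcoSizePos, int delta) {
    int stcoSize = readInt32(mp4, stcoSizePos);   // 184 in small.mp4
    int entryCount = (stcoSize - 16) / 4;         // 42 offsets in small.mp4
    int firstEntryPos = stcoSizePos + 16;         // SIZE + "stco" + version/flags + entry count
    for (int i = 0; i < entryCount; i++) {
        int pos = firstEntryPos + i * 4;
        writeInt32(mp4, pos, readInt32(mp4, pos) - delta);
    }
}

static int readInt32(byte[] data, int pos) {
    return ((data[pos] & 0xFF) << 24) | ((data[pos + 1] & 0xFF) << 16)
         | ((data[pos + 2] & 0xFF) << 8) | (data[pos + 3] & 0xFF);
}

static void writeInt32(byte[] data, int pos, int value) {
    data[pos] = (byte) (value >>> 24);
    data[pos + 1] = (byte) (value >>> 16);
    data[pos + 2] = (byte) (value >>> 8);
    data[pos + 3] = (byte) value;
}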

Error in TLS handshake when using mutual authentication

While testing my app (https://play.google.com/apps/testing/com.degoo.android) I've found that on some devices the TLS handshake between the app and our servers fails if the HTTPS request requires a client certificate (i.e. mutual authentication). The same code works on Windows and on OS X, so I know that it's not caused by an incorrect cert or by my having forgotten to include the cert in the SSLContext. I have detected it on both some Android 4.4 devices and some 5.0 devices. Unfortunately I haven't found any common denominator that causes it to fail (other than Android). However, on devices where it fails, it fails 100% of the time.
I've analyzed the network traffic to see more precisely when the error occurs. The following thing works on all devices:
The connection is established and the client successfully validates the server's certs.
The client sends its HTTP request.
The server detects that the request is for a protected area and sends a certificate_request.
The client decides which client certificate to send (we can see that it selects the correct certificate) and sends it.
After the client has responded with its certificate it sends two more TLS records and then the handshake fails.
Here's an example of what these two TLS records look like on a device where the handshake SUCCEEDS:
14 03 01 00 20 9c 07 49 78 9f ba 09 03 41 6b 66 ad 46 e2 75 94 f7 cf 18 bd 11 cf 35 a2 eb 5e b8 a8 4c 2a 1d c5
16 03 01 00 30 20 0e 13 d7 48 b9 6e b2 1b 96 6f 10 56 67 81 63 d9 d8 c7 73 23 95 3b f9 da f9 ce f4 f8 d1 7e 1b a4 12 92 4c 4f 54 a5 f8 49 75 d5 46 f4 2d 29 97
Here's an example of what these two TLS records look like on a device where the handshake FAILS:
14 03 01 00 20 a9 86 c2 fd 03 0a f8 08 fa f8 9e eb b7 97 07 56 6f 27 c0 d6 8f 95 be 77 c1 44 84 e9 e1 56 6f 6a
16 03 01 00 30 ff 36 76 e5 47 87 84 71 1c ce c9 08 41 45 fc 09 c6 ef 08 e6 21 ff 45 3a 10 ae 8d 5a 99 5f ca c5 ac bd bf 7e ca 69 32 4d 1f 01 c6 30 83 8e 06 cb
From the first byte of the records I can see that both clients send a change cipher spec record and then a handshake record. A funny thing about the failing devices is that the byte at index 5 of the second record has the value ff. When I look at the TLS specification I don't see any handshake message type with that value. The device that succeeds has that byte set to 20, which would mean "finished".
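For reference, that interpretation comes from the 5-byte TLS record header; a minimal decoding sketch (note that after the ChangeCipherSpec record the Finished message is encrypted, so the bytes inside the following handshake record are ciphertext):

// Sketch: decode the 5-byte TLS record header (content type, version, length).
// E.g. 14 03 01 00 20 -> ChangeCipherSpec, TLS 1.0, 32-byte body;
//      16 03 01 00 30 -> Handshake, TLS 1.0, 48-byte body (encrypted Finished).
static String describeTlsRecord(byte[] record) {
    int contentType = record[0] & 0xFF; // 0x14 ChangeCipherSpec, 0x15 Alert, 0x16 Handshake
    int major = record[1] & 0xFF;
    int minor = record[2] & 0xFF;
    int length = ((record[3] & 0xFF) << 8) | (record[4] & 0xFF);
    return String.format("type=0x%02X version=%d.%d length=%d", contentType, major, minor, length);
}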
Any idea on what's going on? What could cause this?

Decrypting APDU response from iPhone6

I transmitted the following APDU command from an Android app on my Android phone,
send: 00 A4 04 00 07 A0 00 00 00 03 10 10 00
to an iPhone 6 through NFC and got the following response,
resp: 6F 39 84 07 A0 00 00 00 03 10 10 A5 2E 9F 38 1B 9F 66 04 9F 02 06 9F 03 06 9F 1A 02 95 05 5F 2A 02 9A 03 9C 01 9F 37 04 9F 4E 14 BF 0C 0D 9F 4D 02 14 01 9F 5A 05 11 08 40 08 40 90 00
Now, I've been trying to decrypt this using various sources, but the confusing part is understanding whether this is the PKPaymentToken data (which we receive in the Apple Pay response) or just the encrypted card data from the Passbook of the iPhone 6.
I compared this result with the response I got from the PassKit framework's paymentAuthorizationViewController method (the payment.token string); both are completely different. So I guess it's not the token response for Apple Pay. My concerns are:
Is this the encrypted card data itself? Can I decrypt it directly to get the card details? (After all, will Apple give out the card details that easily?)
My ultimate requirement is to accept payment through NFC in an Android phone from an iPhone6. So is my APDU request the correct one to obtain card data from iPhone6 (Passbook)?
Any thought is appreciated. Thanks.
I tried with the help of this link
https://github.com/devnied/EMV-NFC-Paycard-Enrollment
It really works. It gives the device account number directly from the iPhone. No decryption is needed to get this device account number.
How to use this lib:
Download the library from this link.
Use the code below:
IProvider prov = new YourProvider(); // your implementation of the library's IProvider interface (see the sketch below)
// Create parser (true for contactless, false otherwise)
EMVParser parser = new EMVParser(prov, true);
// Read card
EMVCard card = parser.readEmvCard();
You will get the card details in the card object.
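The YourProvider part is your own implementation of the library's IProvider interface, which just forwards the library's APDUs to the phone. A minimal sketch using IsoDep (exact package and exception names may differ between library versions):

import android.nfc.tech.IsoDep;
import java.io.IOException;

import com.github.devnied.emvnfccard.exception.CommunicationException;
import com.github.devnied.emvnfccard.parser.IProvider;

// Sketch: forward the parser's APDU commands over NFC to the iPhone/card.
public class IsoDepProvider implements IProvider {

    private final IsoDep isoDep;

    public IsoDepProvider(IsoDep isoDep) {
        this.isoDep = isoDep;
    }

    @Override
    public byte[] transceive(byte[] command) throws CommunicationException {
        try {
            return isoDep.transceive(command); // send APDU, return the raw response
        } catch (IOException e) {
            throw new CommunicationException(e.getMessage());
        }
    }
}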
To answer your two major questions
This is just the response to your APDU command; if you use the library I've mentioned, you will directly get the card details required for payment via any gateway.
The first answer is also the answer to your second question: you can read the iPhone 6 from an Android phone using the library. So the steps are: integrate the library -> get the card details -> forward them to the payment gateway. With this method you need not send any private keys to the gateways; just send the card details as you would for normal card payment processing.
Is this the encrypted card data itself? Can I decrypt it directly to get the card details?
No and no. As Anand correctly pointed out in their answer, this is the FCI template returned in response to your SELECT (by AID/DF name) command concatenated with the status word 9000 (indicating success).
The FCI template is a TLV (tag-length-value) encoded data structure following the basic encoding rules (BER); a small parsing sketch follows the decoded listing below. So your FCI
6F 39 84 07 A0 00 00 00 03 10 10 A5 2E 9F 38 1B 9F 66 04 9F 02 06 9F 03 06 9F 1A 02 95 05 5F 2A 02 9A 03 9C 01 9F 37 04 9F 4E 14 BF 0C 0D 9F 4D 02 14 01 9F 5A 05 11 08 40 08 40
decodes to (see http://www.emvlab.org/tlvutils/):
6F [39] File Control Information (FCI) Template
84 [07] Dedicated File (DF) Name
A0000000031010 (full AID of the application that you just selected)
A5 [2E] File Control Information (FCI) Proprietary Template
9F38 [1B] Processing Options Data Object List (PDOL)
9F66 04
9F02 06
9F03 06
9F1A 02
95 05
5F2A 02
9A 03
9C 01
9F37 04
9F4E 14
BF0C [0D] File Control Information (FCI) Issuer Discretionary Data
9F4D [02] Log Entry
1401 (Transaction log file with at most 1 record is available at SFI 0x14)
9F5A [05] Application Program Identifier (Program ID)
1108400840
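The nesting above can be reproduced with a small BER-TLV walker; a minimal sketch (it handles the multi-byte tags and short/long length forms that occur in this FCI, nothing more):

// Sketch: walk a BER-TLV buffer and print tag / length / value, recursing into
// constructed tags (bit 6 of the first tag byte set, e.g. 6F, A5, BF0C).
static void dumpTlv(byte[] data, int offset, int end, String indent) {
    int pos = offset;
    while (pos < end) {
        int tagStart = pos;
        int firstTagByte = data[pos++] & 0xFF;
        if ((firstTagByte & 0x1F) == 0x1F) {              // multi-byte tag, e.g. 9F38
            while ((data[pos++] & 0x80) != 0) { /* more tag bytes follow */ }
        }
        String tag = toHex(data, tagStart, pos - tagStart);

        int length = data[pos++] & 0xFF;
        if ((length & 0x80) != 0) {                       // long form: next N bytes hold the length
            int lengthBytes = length & 0x7F;
            length = 0;
            for (int i = 0; i < lengthBytes; i++) {
                length = (length << 8) | (data[pos++] & 0xFF);
            }
        }

        boolean constructed = (firstTagByte & 0x20) != 0;
        System.out.println(indent + tag + " [" + String.format("%02X", length) + "]"
                + (constructed ? "" : " " + toHex(data, pos, length)));
        if (constructed) {
            dumpTlv(data, pos, pos + length, indent + "  ");
        }
        pos += length;
    }
}

static String toHex(byte[] data, int offset, int count) {
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < count; i++) {
        sb.append(String.format("%02X", data[offset + i]));
    }
    return sb.toString();
}

Calling dumpTlv(response, 0, response.length - 2, "") on the response without the trailing status word reproduces the tree above.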
My ultimate requirement is to accept payment [...]. So is my APDU request the correct one to obtain card data from iPhone6?
Partially, yes. After application selection, you will need to perform an EMV payment transaction (following the EMV specifications for contactless payment systems; you can get them from http://www.emvco.com/). However, be aware that this is not as easy as getting some "card data". You will need to retrieve some static card data from the contactless "card" (i.e. the iPhone). In addition, you will also need to let the iPhone generate a dynamic transaction cryptogram/transaction authorization code. You can then use this data to clear the transaction.
It's not encrypted data, it's just FCI information.
APDU sent by you:
00 A4 04 00 07 A0 00 00 00 03 10 10 00.
P1=0x04 means you are selecting an application (DF) by its DF name, and P2=0x00 means the FCI information is returned.
Your response 6F 39 84 07 A0 00 00 00 03 10 10 A5 2E 9F 38 1B 9F 66 04 9F 02 06 9F 03 06 9F 1A 02 95 05 5F 2A 02 9A 03 9C 01 9F 37 04 9F 4E 14 BF 0C 0D 9F 4D 02 14 01 9F 5A 05 11 08 40 08 40 90 00
Your response description is as follows:
6F -> FCI template (i.e. a set of control parameters and management data).
39 -> the length of the 6F tag; the following 0x39 bytes (84 07 A0 ... 08 40) are the 6F tag's data, and the trailing 90 00 is the status word, not part of the tag data.
84 is the tag for the DF name.
07 is the length.
A0 00 00 00 03 10 10 is the data value, i.e. the DF name (AID).
A5 is the tag for proprietary information encoded in BER-TLV.
2E is the length.
9F 38 1B ... 08 40 are the A5 tag's values (again, the trailing 90 00 is the status word).

Decoding AAC audio stream coming from red5 server with javacv-ffmpeg on Android

I am trying to play a live AAC audio stream coming from a Red5 server, so to decode the audio data I am using JavaCV-FFmpeg. Data is received as byte[] packets.
Here is what I tried:
import org.bytedeco.javacpp.BytePointer;
import org.bytedeco.javacpp.PointerPointer;
import org.bytedeco.javacpp.avcodec;
import org.bytedeco.javacv.Frame;

import android.util.Log;

// Fields referenced below (declared here so the snippet is complete)
avcodec.AVCodecContext audio_c;
avcodec.AVFrame samples_frame;
avcodec.AVPacket pkt2 = new avcodec.AVPacket();
int[] got_frame = new int[1];
int ret;

public Frame decodeAudio(byte[] adata, long timestamp) {
    BytePointer audio_data = new BytePointer(adata);

    avcodec.AVCodec codec1 = avcodec.avcodec_find_decoder(avcodec.AV_CODEC_ID_AAC); // for AAC
    if (codec1 == null) {
        Log.d("showit", "avcodec_find_decoder() error: Unsupported audio format or codec not found.");
    }

    // Decoder context configured by hand instead of from the stream's config data
    audio_c = avcodec.avcodec_alloc_context3(codec1);
    audio_c.sample_rate(44100);
    audio_c.sample_fmt(3);
    audio_c.bits_per_raw_sample(16);
    audio_c.channels(1);
    if ((ret = avcodec.avcodec_open2(audio_c, codec1, (PointerPointer) null)) < 0) {
        Log.d("showit", "avcodec_open2() error " + ret + ": Could not open audio codec.");
    }

    if ((samples_frame = avcodec.avcodec_alloc_frame()) == null) {
        Log.d("showit", "avcodec_alloc_frame() error: Could not allocate audio frame.");
    }

    // Wrap the received bytes in an AVPacket and decode
    avcodec.av_init_packet(pkt2);
    pkt2.data(audio_data);
    pkt2.size(audio_data.capacity());
    pkt2.pts(timestamp);
    pkt2.pos(0);

    int len = avcodec.avcodec_decode_audio4(audio_c, samples_frame, got_frame, pkt2);
    return null; // conversion of samples_frame to a Frame is omitted here
}
But len after decoding is -1 for the first frame and then always -22.
The first packet always looks like this:
AF 00 12 08 56 E5 00
Further packets look like this:
AF 01 01 1E 34 2C F0 A4 71 19 06 00 00 95 12 AE AA 82 5A 38 F3 E6 C2 46 DD CB 2B 09 D1 00 78 1D B7 99 F8 AB 41 6A C4 F4 D2 40 51 17 F5 28 51 9E 4C F6 8F 15 39 49 42 54 78 63 D5 29 74 1B 4C 34 9B 85 20 8E 2C 0E 0C 19 D2 E9 40 6E 9C 85 70 C2 74 44 E4 84 9B 3C B9 8A 83 EC 66 9D 40 1B 42 88 E2 F3 65 CF 6D B3 20 88 31 29 94 29 A4 B4 DE 26 B0 75 93 3A 0C 57 12 8A E3 F4 B9 F9 23 9C 69 C9 D4 BF 9E 26 63 F2 78 D6 FD 36 B9 32 62 01 91 19 71 30 2D 54 24 62 A1 20 1E BA 3D 21 AC F3 80 33 3A 1A 6C 30 3C 44 29 F2 A7 DC 9A FF 0F 99 F2 38 85 AB 41 FD C7 C5 40 5C 3F EE 38 70
I couldn't figure out where the problem is - whether in setting up the AVCodecContext audio_c or in setting up the packet for the decoder.
Any help appreciated. Thanks in advance.
The first packet (config) describes the stream; if I'm not mistaken, the follow-on data is the encoded audio. You can't assume the sample rate etc. as you have done above - you need to pull that out of the config data, which is marked "AF 00".
I have a similar problem. I've intercepted the packets with Wireshark and here is what it told me:
AF is the control byte of the AAC frame, and it decodes to the following bits:
1010 .... = Format: HE-AAC
.... 11.. = Sample rate: 44kHz (although FFmpeg shows me 48 kHz and I would lean towards believing it)
.... ..1. = Sample size: 16 bit
.... ...1 = Channels: stereo
I still can't figure out how to universally decode this data.
edit:
Ha! I've got something :)
I guess the first 2 bytes are RTMP-specific bytes. The second one seems to state whether it is the configuration (0) or the actual payload (1) - I've found no sources confirming that, it is just my assumption.
Then the first, short packet is the AAC configuration description, described here (a parsing sketch follows the bit breakdown below):
http://thompsonng.blogspot.com/2010/03/aac-configuration.html
In my case it is:
11 90
which is binary:
0001 0001 1001 0000
and that decodes to:
0001 0... .... .... = 2 = AAC LC
.... .001 1... .... = 3 = 48 kHz
.... .... .001 0... = 2 = 2 channels (stereo)
.... .... .... .0.. = 0 = 1024 sample length
.... .... .... ..0. = 0 = doesn't depend on core coder (?)
.... .... .... ...0 = 0 = extension flag
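Putting that bit layout into code, a minimal sketch of parsing the 2-byte AudioSpecificConfig (it ignores the escape values for object type 31 and sample-rate index 15, which do not occur here):

// Sketch: parse a 2-byte AAC AudioSpecificConfig such as 11 90.
static void parseAudioSpecificConfig(byte[] config) {
    int bits = ((config[0] & 0xFF) << 8) | (config[1] & 0xFF);

    int audioObjectType = (bits >> 11) & 0x1F; // 2 = AAC LC
    int samplingIndex   = (bits >> 7)  & 0x0F; // 3 = 48000 Hz
    int channelConfig   = (bits >> 3)  & 0x0F; // 2 = stereo
    int frameLengthFlag = (bits >> 2)  & 0x01; // 0 = 1024 samples per frame

    int[] sampleRates = { 96000, 88200, 64000, 48000, 44100, 32000,
                          24000, 22050, 16000, 12000, 11025, 8000, 7350 };

    System.out.println("objectType=" + audioObjectType
            + " sampleRate=" + sampleRates[samplingIndex]
            + " channels=" + channelConfig
            + " samplesPerFrame=" + (frameLengthFlag == 0 ? 1024 : 960));
}

In the RTMP framing above these two bytes follow the AF 00 header; when handing the raw AAC frames to an FFmpeg decoder they are typically supplied as the codec context's extradata rather than hard-coding the sample rate and channel count as in the question.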
