Decoding a binary file - Android

I am using the following code to read a structured binary file:
FileInputStream objIS = new FileInputStream("/sdcard/waverider.usr");
DataInputStream objBR = new DataInputStream(objIS);
objwVer = objBR.readShort();
objwSVer = objBR.readShort();
objNoWaypoints = objBR.readShort();
System.out.println(objwVer);
System.out.println(objwSVer);
System.out.println(objNoWaypoints);
objIS.close();
This produces the result:
512
0
8706
The result I am after is
2
0
546
That is what I get with VB.net using:
objwVer = objBR.ReadInt16
objwSVer = objBR.ReadInt16
objNoWaypoints = objBR.ReadInt16
The binary file is as follow:
02 00 00 00 22 02 00 00 4b a9 c8 ff b2 2d 33 00
00 00 00 00 04 00 00 00 48 6f 6d 65 00 00 00 00
1c 68 53 0d 10 27 00 00 00 00 01 00 7e 8c c8 ff
9d 33 33 00 00 00 00 00 06 00 00 00
Am I missing something here?

Your problem is that your file is written in little-endian byte order, but your code expects big-endian.

Your stream is encoded in little-endian, and DataInputStream only supports big-endian.
You could use ByteBuffer instead, which can parse a little-endian stream: http://developer.android.com/reference/java/nio/ByteOrder.html#LITTLE_ENDIAN
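A minimal sketch of that approach (the class name is mine, and I'm assuming the three shorts sit at the very start of the file, as the hex dump in the question suggests):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class WaypointHeader {
    // Parse the first three little-endian 16-bit values of the header.
    static short[] parseHeader(byte[] data) {
        ByteBuffer buf = ByteBuffer.wrap(data).order(ByteOrder.LITTLE_ENDIAN);
        return new short[] { buf.getShort(), buf.getShort(), buf.getShort() };
    }

    public static void main(String[] args) {
        // First six bytes of the file shown in the question: 02 00 00 00 22 02
        byte[] header = { 0x02, 0x00, 0x00, 0x00, 0x22, 0x02 };
        short[] v = parseHeader(header);
        // wVer = 2, wSVer = 0, noWaypoints = 0x0222 = 546
        System.out.println(v[0] + " " + v[1] + " " + v[2]); // prints "2 0 546"
    }
}
```

In a real app you would read the six header bytes from the FileInputStream into the array first; the parsing itself is the part DataInputStream cannot do for you.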

Try the JBBP framework; it is Android-compatible, and its JBBPBitInputStream allows reading data in different byte and bit orders:
new JBBPBitInputStream(in).readInt(JBBPByteOrder.LITTLE_ENDIAN);

Related

WhatsApp video as GIF sharing on Android programmatically

How can I convert an mp4 video file to a WhatsApp GIF file (it is simply shown as a GIF inside the app UI but internally is a specific mp4 format) to be used in an Android share intent and recognized as this type of media by the WhatsApp chat app?
I searched a lot but couldn't find any information in the WhatsApp docs (they don't have this kind of doc anyway) or from any dev with the same problem.
WHAT I HAVE:
I have discovered that at the beginning of WhatsApp "gif" mp4 files there is a loop value if you read them in a hex editor; all such files have it. Removing this value makes WhatsApp receive the file as a regular video (not shared as a GIF).
How can I add this value using ffmpeg encoding? (Editing my mp4 files manually to add this value corrupts them; maybe I have to fix some mp4 header index that I don't know about yet...)
FIRST 80 BYTES in hexadecimal (from beginning to start of "moov" atom from mp4 structure):
00 00 00 1C 66 74 79 70 6D 70 34 32 00 00 00 01 6D 70 34 31 6D 70 34 32 69 73 6F 6D 00 00 00 18 62 65 61 6D 01 00 00 00 01 00 00 00 00 00 00 00 05 00 00 00 00 00 00 0C 6C 6F 6F 70 00 00 00 00 00 00 00 08 77 69 64 65 00 00 04 9F 6D 6F 6F 76
A short mp4 file generated by WhatsApp that internally (in the app) was shown as a GIF (with a different UI):
https://www.dropbox.com/s/kpynmx1bg3z76lz/VID-20171024-WA0009.mp4?dl=0
"...The problem is that I can't edit another MP4 file to add this atom without corrupting the file."
Test this small_VC1edit.mp4 in WhatsApp. If it does what you want, then read on...
To make a playable MP4 :
Using original small.mp4 as an editing example (download file and open with a hex editor).
1) In a blank byte array, add the first 72 bytes of the WhatsApp-style MP4 header shown in the question:
00 00 00 1C 66 74 79 70 6D 70 34 32 00 00 00 01 6D 70 34 31 6D 70 34 32 69 73 6F 6D 00 00 00 18 62 65 61 6D 01 00 00 00 01 00 00 00 00 00 00 00 05 00 00 00 00 00 00 0C 6C 6F 6F 70 00 00 00 00 00 00 00 08 77 69 64 65
You've shown 80 bytes, but the last 8 bytes are not needed yet (also, four of those eight bytes' values must be different for your output file).
2) Calculate the deltas.
Note the (new) WhatsApp header is 72 bytes (before moov atom).
Note the (original) Small.mp4 has 160 bytes of header (before moov atom).
So just use this logic (a or b):
a) If WhatsApp header is bigger than input MP4 :
delta = ( WhatsApp_header - input_MP4_header)
b) If input MP4 header is bigger than WhatsApp :
delta = ( input_MP4_header - WhatsApp_header )
For the input small.mp4, which has 160 header bytes: these are followed by 4 bytes of moov's SIZE (00 00 0D 83) and then 4 bytes of moov's NAME (6D 6F 6F 76, the UTF-8 text "moov").
We can say: 160 MP4 bytes - 72 WhatsApp bytes = delta of 88.
If you delete these original 160 bytes and replace them with the shorter 72 WhatsApp bytes, the file becomes 88 bytes shorter, and that shift must now be accounted for in another section of the MOOV data. That section is the STCO atom.
3) Update the STCO atom with new offsets:
In small.mp4 the STCO atom begins at offset 1579 (as 73 74 63 6F). The previous 4 bytes (offsets: 1575 to 1578) are stco's SIZE bytes (as 00 00 00 B8) which is decimal value 184. This total SIZE of bytes length includes accounting for those 4 SIZE bytes too.
Skip 12 bytes starting from the first byte (73) of stco's NAME bytes 73 74 63..., i.e. skip these:
73 74 63 6F 00 00 00 00 00 00 00 2A
Now you have reached the point where you sequentially update every 32-bit integer (4 bytes) of offsets with the new delta value. But how many offsets need updating?
atomEditTotal = ( (stco_SIZE - 16) / 4); //gives 42 //PS: Minus by 16 is to trim off non-offset bytes.
So there are 42 entries to edit. Our delta is 88, so for each integer we read its value, subtract 88, write the new value back in the same place, and repeat for the other 41 entries (e.g. a while loop with an if condition to break out).
For testing, given a corrupt file, editing just the first entry should be enough to show frame 1 of the video (for a video-only file).
PS: After editing the STCO offsets of small.mp4, just delete its original 160 header bytes and join/concat the remaining MP4 bytes to the end of the 72-byte WhatsApp header. Save the array as a new file and test.
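The offset-rewriting loop in step 3 could be sketched like this (a hypothetical helper, not the answerer's code; stcoStart is the file offset of stco's 4 SIZE bytes, e.g. 1575 for small.mp4, and delta is the 88 computed above):

```java
import java.nio.ByteBuffer;

public class StcoPatcher {
    // Subtract 'delta' from every 32-bit chunk offset in the stco atom.
    // stcoStart points at stco's 4 SIZE bytes; MP4 integers are big-endian,
    // which is ByteBuffer's default order.
    static void patchStco(byte[] mp4, int stcoStart, int delta) {
        ByteBuffer buf = ByteBuffer.wrap(mp4);
        int stcoSize = buf.getInt(stcoStart);
        int entryCount = (stcoSize - 16) / 4; // minus size+name+version/flags+count
        int firstEntry = stcoStart + 16;      // 4 size + 4 name + 4 version/flags + 4 count
        for (int i = 0; i < entryCount; i++) {
            int pos = firstEntry + i * 4;
            buf.putInt(pos, buf.getInt(pos) - delta);
        }
    }

    public static void main(String[] args) {
        // Tiny fake stco: SIZE=24, name "stco", version/flags, count=2, offsets 1000 and 2000.
        ByteBuffer fake = ByteBuffer.allocate(24);
        fake.putInt(24).put("stco".getBytes()).putInt(0).putInt(2).putInt(1000).putInt(2000);
        byte[] mp4 = fake.array();
        patchStco(mp4, 0, 88);
        ByteBuffer b = ByteBuffer.wrap(mp4);
        System.out.println(b.getInt(16) + " " + b.getInt(20)); // prints "912 1912"
    }
}
```

For small.mp4 this would touch all 42 entries, matching the atomEditTotal arithmetic above.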

Open Mobile API & Extended logical channel

I'm currently working on a project with the Open Mobile API. Basically, I have this problem: when I exchange APDUs with the UICC, all my commands are automatically converted to extended logical channel APDU commands (CLA: 0xC1). I'm using a Samsung Galaxy S6 Edge for this test, Android version 5.0.2.
APDU > Header [CLA INS P1 P2] 00 70 00 00 194,69 etu MANAGE CHANNEL
< Outgoing data 01
< Return code [SW1 SW2] 90 00
APDU > Header [CLA INS P1 P2] 01 A4 04 00 194,69 etu SELECT
Incoming data A0 00 00 05 59 10 10 FF FF FF FF 89 00 00 01 00
< Outgoing data 6F 1A 84 10 A0 00 00 05 59 10 10 FF FF FF FF 89
00 00 01 00 A5 06 73 00 9F 65 01 FF
< Return code [SW1 SW2] 90 00
APDU > Header [CLA INS P1 P2] C1 E2 91 00 187,69 etu
Incoming data BF 2D 00
< Return code [SW1 SW2] 6D 00
APDU > Header [CLA INS P1 P2] 00 70 80 01 192,69 etu MANAGE CHANNEL
< Return code [SW1 SW2] 90 00
What could be the problem? Who is responsible for changing my CLA to 0xC1, and why does the phone change it?
Note: based on my application log, I send this: 81 E2 91 00 02 BF 2D 00
Thanks for your help.

RadiusNetworks sample to make a Raspberry Pi an iBeacon device failed

Following the sample, I started my Pi with these commands:
pi#raspberrypi ~ $ sudo hciconfig hci0 up
pi#raspberrypi ~ $ hciconfig
hci0: Type: BR/EDR Bus: USB
BD Address: 00:1A:7D:DA:71:13 ACL MTU: 310:10 SCO MTU: 64:8
UP RUNNING
RX bytes:1094 acl:0 sco:0 events:54 errors:0
TX bytes:768 acl:0 sco:0 commands:54 errors:0
pi#raspberrypi ~ $ sudo hcitool -i hci0 cmd 0x08 0x0008 1e 02 01 1a 1a ff 4c 00 02 15 e2 c5 6d b5 df fb 48 d2 b0 60 d0 f5 a7 10 96 e0 00 00 00 00 c5 00 00 00 00 00 00 00 00 00 00 00 00 00
< HCI Command: ogf 0x08, ocf 0x0008, plen 44
1E 02 01 1A 1A FF 4C 00 02 15 E2 C5 6D B5 DF FB 48 D2 B0 60
D0 F5 A7 10 96 E0 00 00 00 00 C5 00 00 00 00 00 00 00 00 00
00 00 00 00
> HCI Event: 0x0e plen 4
01 08 20 12
pi#raspberrypi ~ $ sudo hciconfig hci0 leadv 3
Then I turned on my Android 4.4 phone, which works perfectly with the iBeacon devices I ordered from online retailers (no brand).
"Works perfectly" means I can see all the advertising data in Java code (via the onLeScan(...) callback) as well as the RSSI, and can parse out the UUID, major, minor, etc.
The strange thing is, when I put debug output into:
public synchronized void onLeScan(final BluetoothDevice device,
int rssi, byte[] scanRecord)
the byte[] scanRecord I get from the Pi is:
02 01 0A 02 0A 08 0C 09 43 53 52 38 35 31 30 20 41 31 30 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
I can't make sense of these bytes, which seem totally mismatched with what was set via the Pi command line. I also tried the iBeacon Locate app and still couldn't find my Pi. Please help, thanks.
More details:
Only the BLE dongle is a different brand from the sample's, since I'm not in the US. I'm not sure whether its drivers were installed correctly on the Pi; in fact I didn't explicitly install any drivers for it (I'm new to Linux), I just plugged it in and ran the commands.
Some folks with different bluetooth dongles have reported having to alter the order of the commands and disabling advertising before enabling it. Try:
sudo hciconfig hci0 up
sudo hciconfig hci0 noleadv
sudo hciconfig hci0 leadv
sudo hcitool -i hci0 cmd 0x08 0x0008 1e 02 01 1a 1a ff 4c 00 02 15 e2 c5 6d b5 df fb 48 d2 b0 60 d0 f5 a7 10 96 e0 00 00 00 00 c5 00 00 00 00 00 00 00 00 00 00 00 00 00
You might also let us know the model of your Bluetooth dongle and what it reports itself as to Linux; you can see this by typing lsusb.
BTW, nice job capturing the bytes read by Android's onLEScan method. Super helpful!
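As a side note on the parsing the question mentions: one rough way to tell an iBeacon scanRecord apart from a name-only record like the CSR8510 one above is to look for the iBeacon manufacturer-data prefix (FF 4C 00 02 15) and read major/minor behind the 16-byte proximity UUID. The helper below is my own sketch, not part of any library:

```java
public class BeaconParser {
    // Returns {major, minor}, or null if no iBeacon frame is present.
    static int[] parseMajorMinor(byte[] scanRecord) {
        for (int i = 0; i + 25 <= scanRecord.length; i++) {
            if ((scanRecord[i] & 0xFF) == 0xFF         // manufacturer-specific data
                    && (scanRecord[i + 1] & 0xFF) == 0x4C // Apple company ID (LE)
                    && scanRecord[i + 2] == 0x00
                    && scanRecord[i + 3] == 0x02          // iBeacon type
                    && (scanRecord[i + 4] & 0xFF) == 0x15) { // iBeacon length (21)
                int majorPos = i + 5 + 16; // skip prefix + 16-byte proximity UUID
                int major = ((scanRecord[majorPos] & 0xFF) << 8) | (scanRecord[majorPos + 1] & 0xFF);
                int minor = ((scanRecord[majorPos + 2] & 0xFF) << 8) | (scanRecord[majorPos + 3] & 0xFF);
                return new int[] { major, minor };
            }
        }
        return null;
    }

    public static void main(String[] args) {
        // The CSR8510 record in the question carries only flags and a device
        // name, so no iBeacon prefix is found in it.
        byte[] nameOnly = { 0x02, 0x01, 0x0A, 0x02, 0x0A, 0x08 };
        System.out.println(parseMajorMinor(nameOnly)); // prints "null"
    }
}
```

Run against the scanRecord the question captured, this returns null, which is consistent with the dongle advertising its own name rather than the iBeacon payload set via hcitool.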

Android MediaCodec 3gpp encoder output buffer contains incorrect bytes

I'm trying to encode the audio stream from the MIC as 3gpp (AMR-NB). The problem is that the output buffer contains weird data. Code and output follow:
Creating media encoder:
MediaFormat format = MediaFormat.createAudioFormat("audio/3gpp", 8*1024, 1);
format.setInteger(MediaFormat.KEY_BIT_RATE, 8*1024);
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, minBufSize);
MediaCodec encoder = MediaCodec.createEncoderByType("audio/3gpp");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();
PCM data from MIC seems to be correct (stored to file, listened with Audacity)
Reading encoded bytes (buffers, running in thread):
ByteBuffer[] outputBuffers = encoder.getOutputBuffers();
int outputBufferIndex = 0;
while (outputBufferIndex >= 0)
{
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    outputBufferIndex = encoder.dequeueOutputBuffer(bufferInfo, -1);
    if (outputBufferIndex >= 0)
    {
        ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
        byte[] outData = new byte[bufferInfo.size];
        outputBuffer.get(outData);
        outputBuffer.clear();
        encoder.releaseOutputBuffer(outputBufferIndex, false);
        Log.d(LOG_TAG_ENCODING, util.bytesToString(outData));
    }
    else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED)
    {
        outputBuffers = encoder.getOutputBuffers();
    }
}
And the output is:
07-11 13:13:58.622: 34 6c 1e 08 27 80 05 28 56 40 00 00 03 00 00 00 00 00 00 00 00 00 00 00 00 00 00
07-11 13:13:58.632: 34 6c 1e 08 27 80 05 28 56 40 00 00 03 00 00 00 00 00 00 00 00 00 00 00 00 00 00
07-11 13:13:58.667: 34 ff d9 08 27 80 05 28 56 40 00 00 03 00 00 00 00 00 00 00 00 00 00 00 00 00 00
07-11 13:13:58.672: 34 6c 1e 08 27 80 05 28 56 40 00 00 03 00 00 00 00 00 00 00 00 00 00 00 00 00 00
07-11 13:13:58.677: 34 6c 1e 08 27 80 05 28 56 40 00 00 03 00 00 00 00 00 00 00 00 00 00 00 00 00 00
I've googled and found no help. The Android docs on MediaCodec usage are not great either - lots of trial and error around ByteBuffer.clear() usage in the output-buffer context.
best regards,
Ahti.
To all the fellow sufferers out there, answering my own question.
The real issue was how raw PCM data was fed to the encoder input. The Android docs are vague on how exactly to feed data into the input buffer (ok, it actually has more to do with ByteBuffer behaviour, to be honest):
int inputBufferIndex = codec.dequeueInputBuffer(timeoutUs);
if (inputBufferIndex >= 0) {
    // fill inputBuffers[inputBufferIndex] with valid data
    ...
    codec.queueInputBuffer(inputBufferIndex, ...);
}
My interpretation was to add data as follows:
inputBuffers[inputBufferIndex].clear();
inputBuffers[inputBufferIndex].put(audioPCMbuffer);
codec.queueInputBuffer(inputBufferIndex, ...);
The above code is missing one step, between put() and queueInputBuffer(): flipping the position of the ByteBuffer!
inputBuffers[inputBufferIndex].flip();
Keeping this here for future reference, as it was quite hard to find simple code showing the implementation.
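The reason flip() matters can be shown without MediaCodec at all; it is plain ByteBuffer position/limit bookkeeping:

```java
import java.nio.ByteBuffer;

public class FlipDemo {
    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.allocate(16);
        buf.put(new byte[] { 1, 2, 3, 4 }); // position = 4, limit = 16
        // Without flip(), a consumer reading this buffer would start at
        // position 4 - past the PCM data just written - and see only the
        // 12 untouched bytes behind it. That matches the "weird data"
        // symptom: the codec never saw the samples.
        buf.flip();                         // position = 0, limit = 4
        byte[] out = new byte[buf.remaining()];
        buf.get(out);                       // reads exactly the 4 bytes written
        System.out.println(out.length);     // prints "4"
    }
}
```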

Gmail on Android corrupts some files

In one of my applications I open binary files, and I got error reports from users about some files. When they sent me the files: if I download them from Gmail on the desktop, the file displays fine in my app; when I download them with the native Android Gmail app, the file doesn't open.
Here are the first 64 bytes of the original file (identical when downloaded from the desktop), displayed as hex:
03 00 08 00 D8 0C 00 00 01 00 1C 00 BC 02 00 00
2D 00 00 00 00 00 00 00 00 01 00 00 D0 00 00 00
00 00 00 00 00 00 00 00 10 00 00 00 25 00 00 00
33 00 00 00 3D 00 00 00 44 00 00 00 49 00 00 00
And here are the first 64 bytes of the file downloaded with the native Gmail app (hex again):
EF BF BD EF BF BD 2D EF BF BD 25 33 3D 44 49 4D
52 63 72 76 EF BF BD EF BF BD EF BF BD EF BF BD
EF BF BD EF BF BD EF BF BD EF BF BD EF BF BD EF
BF BD EF BF BD 29 2E 3E 43 54 59 69 6E 7F EF BF
Is there some sort of compression applied to this file, or is the Gmail app corrupting it? Especially if you look at the end of the first sample, you have the bytes 10, 25, 33, 3D, 44, 49, which also appear in the first line of the second sample, which leads me to think it's compression of some sort.
I'm not sure of the exact source, but if you look at http://www.cogsci.ed.ac.uk/~richard/utf-8.cgi?input=%F6&mode=char you can see that this pattern comes from something interpreting the file as UTF-8, replacing invalid sequences, and then writing the result back out as UTF-8.
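A sketch of that mechanism (the class name is mine): decoding arbitrary binary as UTF-8 turns every invalid sequence into U+FFFD, which re-encodes as the EF BF BD triplets seen above, while ASCII-range bytes pass through unchanged:

```java
import java.nio.charset.StandardCharsets;

public class Utf8Corruption {
    // Decode binary as UTF-8 (invalid sequences become U+FFFD), re-encode.
    static byte[] corrupt(byte[] original) {
        String asText = new String(original, StandardCharsets.UTF_8);
        return asText.getBytes(StandardCharsets.UTF_8); // U+FFFD -> EF BF BD
    }

    public static void main(String[] args) {
        // 0xD8 is not a valid UTF-8 sequence here, so it becomes EF BF BD;
        // the low bytes survive, which is why fragments of the original
        // (10, 25, 33, 3D, 44, 49...) still show up in the mangled file.
        byte[] original = { 0x41, (byte) 0xD8, 0x10, 0x25, 0x33 };
        for (byte b : corrupt(original)) System.out.printf("%02X ", b);
        System.out.println(); // prints "41 EF BF BD 10 25 33 "
    }
}
```

This only reproduces the pattern, not the Gmail app's exact pipeline, but it explains both the EF BF BD runs and the surviving low-value bytes in the question's dump.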
