I'm new to NFC and I am developing an Android application to read and write data on an NFC tag, but I'm having some problems.
This is the code I'm using (WRITE):
@Override
protected void onNewIntent(Intent intent) {
    super.onNewIntent(intent);
    if (intent.hasExtra(NfcAdapter.EXTRA_TAG)) {
        Toast.makeText(this, R.string.message_tag_detected, Toast.LENGTH_SHORT).show();
    }
    Tag currentTag = intent.getParcelableExtra(NfcAdapter.EXTRA_TAG);
    byte[] id = currentTag.getId();
    String myData = "ABCDEFGHIJKL";
    for (String tech : currentTag.getTechList()) {
        if (tech.equals(NfcV.class.getName())) {
            NfcV tag5 = NfcV.get(currentTag);
            try {
                tag5.connect();
                int offset = 0;
                int blocks = 8;
                byte[] data = myData.getBytes();
                byte[] cmd = new byte[] {
                    (byte)0x20, // flags: addressed mode
                    (byte)0x21, // command: WRITE SINGLE BLOCK
                    (byte)0x00, (byte)0x00, (byte)0x00, (byte)0x00, (byte)0x00, (byte)0x00, (byte)0x00, (byte)0x00, // UID placeholder
                    (byte)0x00, // block number
                    (byte)0x00, (byte)0x00, (byte)0x00, (byte)0x00 // block data
                };
                System.arraycopy(id, 0, cmd, 2, 8);
                for (int i = 0; i < blocks; ++i) {
                    cmd[10] = (byte)((offset + i) & 0x0ff);
                    System.arraycopy(data, i, cmd, 11, 4);
                    byte[] response = tag5.transceive(cmd);
                }
            } catch (IOException e) {
                Toast.makeText(getApplicationContext(), e.getMessage(), Toast.LENGTH_LONG).show();
                return;
            }
        }
    }
}
When I read the tag with the TagInfo app, the output is:
[00] . 41 42 43 44 [ABCD]
[01] . 42 43 44 45 [BCDE]
[02] . 43 44 45 46 [CDEF]
[03] . 44 45 46 47 [DEFG]
[04] . 45 46 47 48 [EFGH]
[05] . 46 47 48 49 [FGHI]
[06] . 47 48 49 4A [GHIJ]
[07] . 48 49 4A 4B [HIJK]
[08] . 00 00 00 00 [. . . .]
. . .
Is this output correct?
If not, where am I going wrong?
To me this looks wrong, but I'm not an expert in NfcV and have only used NDEF NFC cards. Given what you are actually trying to do, the output should be:
[00] . 41 42 43 44 [ABCD]
[01] . 45 46 47 48 [EFGH]
[02] . 49 4A 4B 4C [IJKL]
I think the problem lies with System.arraycopy(data, i, cmd, 11, 4);
You are copying 4 bytes of data from your source data array but only incrementing the start position by 1 byte, hence each subsequent block starts one letter later.
I think System.arraycopy(data, i * 4, cmd, 11, 4); would produce the results you want, as it advances the start of the arraycopy in the source data by the number of bytes you have already stored.
As you have 12 bytes of data and each block stores 4 bytes, you only need to use 3 blocks, so only loop 3 times by setting int blocks = 3; otherwise you will run out of data to copy into cmd and arraycopy will throw an IndexOutOfBoundsException.
If you don't have a multiple of 4 bytes of data, you will have to pad the data with zeros to a multiple of 4 bytes, or handle the IndexOutOfBoundsException from arraycopy to correctly copy the remaining bytes.
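To illustrate, here is a minimal sketch of the corrected write loop with zero-padding; it slots into the question's try block and reuses its myData, cmd, offset and tag5 variables, while the padding logic is my own addition:

// Zero-pad the payload so its length is a whole number of 4-byte blocks.
byte[] raw = myData.getBytes();
int blockSize = 4;
int blocks = (raw.length + blockSize - 1) / blockSize;
byte[] data = new byte[blocks * blockSize];
System.arraycopy(raw, 0, data, 0, raw.length);

for (int i = 0; i < blocks; ++i) {
    cmd[10] = (byte)((offset + i) & 0x0ff);                    // block number
    System.arraycopy(data, i * blockSize, cmd, 11, blockSize); // advance one whole block per iteration
    byte[] response = tag5.transceive(cmd);
}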
I have an OMAP5432 EVM running Android 4.2 with four USB-connected Logitech C270 cameras.
I use the V4L2 driver from NDK C code to open and stream from the cameras in MJPEG mode.
Everything works fine except right after power cycling the cameras.
After power cycling the cameras, two or sometimes three of them come up correctly, but one starts spitting out frame buffers that are missing the MJPEG SOI marker (0xFF, 0xD8) at the beginning of the buffer while the EOI marker (0xFF, 0xD9) is present.
Closing and reopening the offending camera file /dev/videoX with POSIX close() fixes the problem for good until the next power cycle.
This never happens with a single camera connected, only with 3 or 4 of them.
I've tried replacing USB hubs and connecting the cameras directly, to no avail.
struct v4l2_format fm = {0};
fm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
int r = xioctl(c->fd, VIDIOC_G_FMT, &fm);
if (r != 0) {
    printf("VIDIOC_G_FMT %s r=%d error %d, %s", c->name, r, errno, strerror(errno));
    return -1;
}
fm.fmt.pix.width = c->width; // 640
fm.fmt.pix.height = c->height; // 480
fm.fmt.pix.pixelformat = V4L2_PIX_FMT_MJPEG;
fm.fmt.pix.field = V4L2_FIELD_ANY;
fm.fmt.pix.colorspace = V4L2_COLORSPACE_JPEG;
if (c->fm != V4L2_PIX_FMT_MJPEG) {
    unsigned int min = fm.fmt.pix.width * 2;
    if (fm.fmt.pix.bytesperline < min) {
        fm.fmt.pix.bytesperline = min;
    }
    min = fm.fmt.pix.bytesperline * fm.fmt.pix.height;
    if (fm.fmt.pix.sizeimage < min) {
        fm.fmt.pix.sizeimage = min;
    }
} else {
    fm.fmt.pix.bytesperline = 0;
    fm.fmt.pix.sizeimage = 0;
}
r = xioctl(c->fd, VIDIOC_S_FMT, &fm);
followed by
static int init_mmap(camera_t_* c) {
    struct v4l2_requestbuffers rq = {0};
    rq.count = BUFFERS;
    rq.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    rq.memory = V4L2_MEMORY_MMAP;
    if (xioctl(c->fd, VIDIOC_REQBUFS, &rq) != 0) {
        if (EINVAL == errno) {
            printf("%s does not support memory mapping", c->name);
            return -1;
        } else {
            printf("VIDIOC_REQBUFS error %d, %s", errno, strerror(errno));
            return -1;
        }
    }
    if (rq.count < 2) {
        printf("Insufficient buffer memory on %s", c->name);
        return -1;
    }
    c->buffers = malloc(rq.count * sizeof(buffer_t));
    if (c->buffers == null) {
        printf("out of memory");
        return -1;
    }
    c->n_buffers = rq.count;
    for (int i = 0; i < rq.count; i++) {
        struct v4l2_buffer b = {0};
        b.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        b.memory = V4L2_MEMORY_MMAP;
        b.index = i;
        if (xioctl(c->fd, VIDIOC_QUERYBUF, &b) != 0) {
            printf("VIDIOC_QUERYBUF error %d, %s", errno, strerror(errno));
            return -1;
        }
        c->buffers[i].length = b.length;
        c->buffers[i].start = mmap(null, b.length, PROT_READ | PROT_WRITE, MAP_SHARED, c->fd, b.m.offset);
        if (c->buffers[i].start == MAP_FAILED) {
            printf("mmap error %d, %s", errno, strerror(errno));
            return -1;
        }
    }
    return 0;
}
The data I am getting back looks like this:
missing FFD8 tag 0x688fb000 bytes=6946 length=213333
0x688fb000: AB 74 10 0C EE FD 34 FB B3 FA 3E D1 60 9D 33 B0 ?t????4???>?`?3?
0x688fb010: F1 83 7B 8A AF B8 F9 BF EF 33 EE FF 89 91 7F 9F ??{??????3?????
0x688fb020: 47 EF A4 B9 73 1C BC FB AD 46 6A D5 22 A2 2C 32 G???s????Fj?"?,2
0x688fb030: 2B A8 71 7E 56 56 73 23 15 7B 11 B2 F0 FA AA D3 +?q~VVs#?{??????
.............
0x688fcae2: 00 A2 80 0A 28 03 FF D5 C6 A2 80 0A 28 00 A2 80 ????(???????(???
0x688fcaf2: 0A 28 00 A2 80 0A 28 00 A2 80 0A 28 00 A2 80 0A ?(????(????(????
0x688fcb02: 28 03 FF D6 C6 A2 80 0A 28 00 A2 80 0A 28 00 A2 (???????(????(??
0x688fcb12: 80 0A 28 00 A2 80 0A 28 00 A2 80 0A 28 03 FF D9 ??(????(????(???
and it is the same data for each dequeued buffer...
Is something wrong with the C270 or V4L2, or is something wrong with my code?
Has anybody experienced similar issues?
OK, it looks like I've found the root of the problem by looking through the kernel logs with these commands:
adb shell sudo echo 8 > /proc/sys/kernel/printk
adb >/tmp/adb_klog.txt 2>/tmp/adb_klog.txt shell sudo cat /proc/kmsg
It looks like VIDIOC_S_FMT is failing for 1 or 2 out of 4 cameras without reporting the error to the userland ioctl() call:
<7>[ 7274.495178] uvcvideo: uvc_v4l2_open
<7>[ 7274.495239] ehci-omap ehci-omap.0: reused qh ca5bf1c0 schedule
<7>[ 7274.495269] usb 1-2.2.1.2: link qh16-0001/ca5bf1c0 start 1 [1/0 us]
<7>[ 7274.495330] uvcvideo: uvc_v4l2_ioctl(VIDIOC_G_FMT)
<7>[ 7274.495452] uvcvideo: uvc_v4l2_ioctl(VIDIOC_S_FMT)
<7>[ 7274.495483] uvcvideo: Trying format 0x47504a4d (MJPG): 640x480.
<7>[ 7274.495513] uvcvideo: Using default frame interval 33333.3 us (30.0 fps).
<7>[ 7279.495208] usb 1-2.2.1.2: .<process_name> timed out on ep0out len=0/26
<3>[ 7279.495239] uvcvideo: Failed to set UVC probe control : -110 (exp. 26).
<7>[ 7279.502746] uvcvideo: uvc_v4l2_ioctl(VIDIOC_G_FMT)
<7>[ 7279.507080] uvcvideo: uvc_v4l2_ioctl(VIDIOC_LOG_STATUS)
<7>[ 7279.507080] uvcvideo: Unknown ioctl 0x00005646
The simplest workaround I came up with is to start streaming after opening the camera, wait (with a timeout) until frames start flowing, and, if corrupted frames appear, close and reopen the camera.
In my code it looks like this:
static bool has_bad_frames(camera_t* camera) {
    /* Logitech C270 randomly spits corrupted MJPEGs after power cycle. Known workaround is to reopen camera */
    camera_t_* c = (camera_t_*)camera;
    camera_set_callback(c, empty_callback); // otherwise MJPEG frames are not going to be decompressed
    camera_start(camera);
    int retry = 300; // 3 seconds max
    while (retry > 0 && c->good_frames == 0 && c->bad_frames == 0) {
        nsleep(NANOSECONDS_IN_SECOND / 100); /* 1/100 second */
        retry--;
    }
    bool ok = c->bad_frames == 0 && c->good_frames > 0;
    camera_stop(camera);
    return !ok;
}
int camera_open(void* that, camera_t* o, int id, int w, int h, int bpp) {
    int r = try_to_open(that, o, id, w, h, bpp);
    if (r == 0) {
        if (has_bad_frames(*o)) {
            camera_t_* c = (camera_t_*)*o;
            if (c->bad_frames + c->good_frames == 0) {
                trace("%s is not streaming; retrying...", c->name);
            } else {
                trace("%s spits corrupted frames. Probably VIDIOC_S_FMT silently failed; retrying...", c->name);
            }
            camera_close(*o);
            *o = null;
            return try_to_open(that, o, id, w, h, bpp);
        }
    }
    return r;
}
I am trying to send some data to the Nexus 4 through NFC (i.e. in card emulation mode). I tried a number of command APDUs, such as write and update APDUs, but I couldn't get them to work.
What I am trying to say is: I want to send some data (that is not the AID) to the phone after the SELECT APDU command.
Thanks in advance,
Bader
The HCE emulated card will understand exactly those commands that your HCE app's APDU service processes. So, for instance, if your HCE service's processCommandApdu() callback method looks like this:
final static byte[] SW_NO_ERROR = new byte[]{ (byte)0x90, (byte)0x00 };
final static byte[] SW_INCORRECT_P1P2 = new byte[]{ (byte)0x6A, (byte)0x86 };
final static byte[] SW_INS_NOT_SUPPORTED = new byte[]{ (byte)0x6D, (byte)0x00 };
final static byte[] SW_ERR_UNKNOWN = new byte[]{ (byte)0x6F, (byte)0x00 };

@Override
public byte[] processCommandApdu(byte[] apdu, Bundle extras) {
    if (apdu.length >= 4) {
        if ((apdu[1] == (byte)0xA4) && (apdu[2] == (byte)0x04)) {
            // SELECT APPLICATION
            return SW_NO_ERROR;
        } else if ((apdu[1] == (byte)0xCA) && (apdu[2] == (byte)0x02)) {
            // GET DATA (SIMPLE TLV)
            switch (apdu[3] & 0x0FF) {
                case 0x001:
                    return new byte[]{ apdu[3], (byte)0x02, (byte)0x01, (byte)0x00, (byte)0x90, (byte)0x00 };
                case 0x002:
                    return new byte[]{ apdu[3], (byte)0x02, (byte)0x12, (byte)0x34, (byte)0x90, (byte)0x00 };
                case 0x003:
                    return new byte[]{ apdu[3], (byte)0x06, (byte)0xAA, (byte)0xBB, (byte)0xCC, (byte)0xDD, (byte)0xEE, (byte)0xFF, (byte)0x90, (byte)0x00 };
                default:
                    return SW_INCORRECT_P1P2;
            }
        } else {
            return SW_INS_NOT_SUPPORTED;
        }
    }
    return SW_ERR_UNKNOWN;
}
Your HCE app would understand the following command APDUs:
SELECT APPLICATION (by AID)
00 A4 04 xx ...
GET DATA for data object 0201
00 CA 02 01 00
GET DATA for data object 0202
00 CA 02 02 00
GET DATA for data object 0203
00 CA 02 03 00
Other commands will result in various errors.
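As a hedged illustration of the other side of this exchange, here is a minimal reader sketch that would exercise those commands through IsoDep from a second Android device. The AID bytes F0 01 02 03 04 05 06 are a made-up placeholder; they must match whatever AID your HCE service actually registers in its apduservice.xml:

// SELECT by a placeholder AID (F0 01 02 03 04 05 06 - replace with your service's AID).
byte[] selectApdu = new byte[]{
    (byte)0x00, (byte)0xA4, (byte)0x04, (byte)0x00,
    (byte)0x07, // Lc: length of the AID
    (byte)0xF0, (byte)0x01, (byte)0x02, (byte)0x03, (byte)0x04, (byte)0x05, (byte)0x06
};
// GET DATA for data object 0202.
byte[] getDataApdu = new byte[]{
    (byte)0x00, (byte)0xCA, (byte)0x02, (byte)0x02, (byte)0x00
};

IsoDep isoDep = IsoDep.get(tag); // tag obtained from the discovery intent
isoDep.connect();
byte[] selectResponse = isoDep.transceive(selectApdu); // expect 90 00
byte[] dataResponse = isoDep.transceive(getDataApdu);  // expect 02 02 12 34 90 00
isoDep.close();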
Here is my problem:
I have implemented a server-side application using Red5 which sends an H.264-encoded live stream; on the client side the stream is received as byte[].
In order to decode it on the Android client side, I have followed the JavaCV-FFmpeg library. The code for decoding is as follows:
public Frame decodeVideo(byte[] data, long timestamp) {
    frame.image = null;
    frame.samples = null;
    avcodec.av_init_packet(pkt);
    BytePointer video_data = new BytePointer(data);
    avcodec.AVCodec codec = avcodec.avcodec_find_decoder(codec_id);
    video_c = null;
    video_c = avcodec.avcodec_alloc_context3(codec);
    video_c.width(320);
    video_c.height(240);
    video_c.pix_fmt(0);
    video_c.flags2(video_c.flags2() | avcodec.CODEC_FLAG2_CHUNKS);
    avcodec.avcodec_open2(video_c, codec, (PointerPointer) null);
    picture = avcodec.avcodec_alloc_frame();
    pkt.data(video_data);
    pkt.size(data.length);
    int len = avcodec.avcodec_decode_video2(video_c, picture, got_frame, pkt);
    if ((len >= 0) && (got_frame[0] != 0)) {
        // ... process the decoded frame into an IplImage of JavaCV and render it with an ImageView on Android
    }
}
The data received from the server is as follows.
A few frames have the following pattern:
17 01 00 00 00 00 00 00 02 09 10 00 00 00 0F 06 00 01 C0 01 07 09 08 04 9A 00 00 03 00 80 00 00 16 EF 65 88 80 07 00 05 6C 98 90 00...
Many frames have the following pattern:
27 01 00 00 00 00 00 00 02 09 30 00 00 00 0C 06 01 07 09 08 05 9A 00 00 03 00 80 00 00 0D 77 41 9A 02 04 15 B5 06 20 E3 11 E2 3C 46 ....
With the H.264 codec for the decoder, the decoder outputs length > 0 but got_frame[0] is always 0.
With the MPEG1 codec, the decoder outputs length > 0 and got_frame[0] > 0, but the output image is green or distorted.
However, following JavaCV's FFmpegFrameGrabber code, I can decode local (H.264-encoded) files with code similar to the above.
I wonder what details I am missing, such as header-related data manipulation or setting the appropriate codec for the decoder.
Any suggestion or help is appreciated.
Thanks in advance.
At last... I finally got it working after lots of R&D.
What I was missing was analyzing the video frame structure. The video is made up of "I" and "P" frames. The "I" frame is an information frame, which stores information about the subsequent frames. The "P" frame is a picture frame, which holds the actual video frame.
So I need to decode the "P" frames with respect to the information in the "I" frame.
So the final code is something like this:
public IplImage decodeFromVideo(byte[] data, long timeStamp) {
    avcodec.av_init_packet(reveivedVideoPacket); // empty AVPacket
    /*
     * data[1] flags the payload type: 0 = AVC sequence header (the decoder
     * configuration, used as extradata), 1 = AVC NALU (an actual picture).
     */
    byte frameFlag = data[1];
    byte[] subData = Arrays.copyOfRange(data, 5, data.length); // skip the 5-byte video tag header
    BytePointer videoData = new BytePointer(subData);
    if (frameFlag == 0) {
        avcodec.AVCodec codec = avcodec.avcodec_find_decoder(avcodec.AV_CODEC_ID_H264);
        if (codec != null) {
            videoCodecContext = null;
            videoCodecContext = avcodec.avcodec_alloc_context3(codec);
            videoCodecContext.width(320);
            videoCodecContext.height(240);
            videoCodecContext.pix_fmt(avutil.AV_PIX_FMT_YUV420P);
            videoCodecContext.codec_type(avutil.AVMEDIA_TYPE_VIDEO);
            videoCodecContext.extradata(videoData);
            videoCodecContext.extradata_size(videoData.capacity());
            videoCodecContext.flags2(videoCodecContext.flags2() | avcodec.CODEC_FLAG2_CHUNKS);
            avcodec.avcodec_open2(videoCodecContext, codec, (PointerPointer) null);
            if ((videoCodecContext.time_base().num() > 1000)
                    && (videoCodecContext.time_base().den() == 1)) {
                videoCodecContext.time_base().den(1000);
            }
        } else {
            Log.e("test", "Codec could not be opened");
        }
    }
    if ((decodedPicture = avcodec.avcodec_alloc_frame()) != null) {
        if ((processedPicture = avcodec.avcodec_alloc_frame()) != null) {
            int width = getImageWidth() > 0 ? getImageWidth() : videoCodecContext.width();
            int height = getImageHeight() > 0 ? getImageHeight() : videoCodecContext.height();
            switch (imageMode) {
            case COLOR:
            case GRAY:
                int fmt = 3;
                int size = avcodec.avpicture_get_size(fmt, width, height);
                processPictureBuffer = new BytePointer(avutil.av_malloc(size));
                avcodec.avpicture_fill(new AVPicture(processedPicture),
                        processPictureBuffer, fmt, width, height);
                returnImageFrame = opencv_core.IplImage.createHeader(320, 240, 8, 1);
                break;
            case RAW:
                processPictureBuffer = null;
                returnImageFrame = opencv_core.IplImage.createHeader(320, 240, 8, 1);
                break;
            default:
                Log.d("showit", "Unexpected imageMode in switch");
            }
            reveivedVideoPacket.data(videoData);
            reveivedVideoPacket.size(videoData.capacity());
            reveivedVideoPacket.pts(timeStamp);
            videoCodecContext.pix_fmt(avutil.AV_PIX_FMT_YUV420P);
            decodedFrameLength = avcodec.avcodec_decode_video2(videoCodecContext,
                    decodedPicture, isVideoDecoded, reveivedVideoPacket);
            if ((decodedFrameLength >= 0) && (isVideoDecoded[0] != 0)) {
                // ... process the image the same as JavaCV ...
            }
        }
    }
    return returnImageFrame;
}
Hope it will help others.
I'm trying to read a smartcard with my LG P710 Optimus L7 2.
I'm following this tutorial.
I can select the "2PAY.SYS.DDF01" directory.
I can select the application.
But I can't perform a GET PROCESSING OPTIONS.
It always results in a 6700 error (wrong Lc or Le).
Here is my code:
NfcAdapter mNFCAdapter;
Intent intent;
PendingIntent pendingIntent;
private TextView mTextView;
String[][] techList;
IntentFilter[] filters = new IntentFilter[3];

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    mTextView = (TextView) findViewById(R.id.title);
    mNFCAdapter = NfcAdapter.getDefaultAdapter(this);
    intent = new Intent(getApplicationContext(), getClass());
    intent.setFlags(Intent.FLAG_ACTIVITY_SINGLE_TOP);
    pendingIntent = PendingIntent.getActivity(getApplicationContext(), 0, intent, 0);
    techList = new String[][]{
        new String[]{ MifareClassic.class.getName() },
        new String[]{ IsoDep.class.getName() }
    };
    filters[0] = new IntentFilter();
    filters[0].addAction(NfcAdapter.ACTION_NDEF_DISCOVERED);
    filters[0].addCategory(Intent.CATEGORY_DEFAULT);
    // add the type of tag data you want to receive - here NDEF -> plain text
    try {
        filters[0].addDataType(MIME_TEXT_PLAIN);
    } catch (MalformedMimeTypeException e) {
        e.printStackTrace();
    }
    filters[1] = new IntentFilter();
    filters[1].addAction(NfcAdapter.ACTION_TAG_DISCOVERED);
    filters[1].addCategory(Intent.CATEGORY_DEFAULT);
    filters[2] = new IntentFilter();
    filters[2].addAction(NfcAdapter.ACTION_TECH_DISCOVERED);
    filters[2].addCategory(Intent.CATEGORY_DEFAULT);
}
@Override
protected void onNewIntent(Intent intent) {
    super.onNewIntent(intent);
    String action = intent.getAction();
    mTextView.setText(action);
    Toast.makeText(getApplicationContext(), action, Toast.LENGTH_SHORT).show();
    Tag tagFromIntent = intent.getParcelableExtra(NfcAdapter.EXTRA_TAG);
    IsoDep tagIsoDep;
    if ((tagIsoDep = IsoDep.get(tagFromIntent)) != null)
        if (handleIsoDep(tagIsoDep))
            return;
}

private boolean handleIsoDep(IsoDep tag) {
    try {
        tag.connect();
        tag.setTimeout(20);
        byte[] responseAPDU;
        // SELECT 2PAY.SYS.DDF01
        byte[] select_Dir = new byte[]{
            (byte)0x00, (byte)0xa4, (byte)0x04, (byte)0x00, (byte)0x0e,
            (byte)0x32, (byte)0x50, (byte)0x41, (byte)0x59, (byte)0x2e,
            (byte)0x53, (byte)0x59, (byte)0x53, (byte)0x2e, (byte)0x44,
            (byte)0x44, (byte)0x46, (byte)0x30, (byte)0x31
        };
        // Select CC applet
        byte[] select_Applet = new byte[]{
            (byte)0x00, (byte)0xa4, (byte)0x04, (byte)0x00, (byte)7,
            (byte)0xa0, (byte)0x00, (byte)0x00, (byte)0x00, (byte)0x04,
            (byte)0x30, (byte)0x60
        };
        // Send GET PROCESSING OPTIONS command
        byte[] Send_Get = new byte[]{
            (byte)0x80, (byte)0xA8, (byte)0x00, (byte)0x00, (byte)0x02,
            (byte)0x83, (byte)0x00,
            (byte)0x00
        };
        responseAPDU = tag.transceive(select_Dir);
        mTextView.setText(mTextView.getText() + handleResponse(responseAPDU));

This returns APDU status word 9000 -> success.

        responseAPDU = tag.transceive(select_Applet);
        mTextView.setText(mTextView.getText() + handleResponse(responseAPDU));

This also returns APDU status word 9000 -> success.

        responseAPDU = tag.transceive(Send_Get);
        mTextView.setText(mTextView.getText() + handleResponse(responseAPDU));

And this one is making problems: it returns 6700 -> wrong Lc or Le.

        mTextView.setText(mTextView.getText() + "\n\nDone");
        tag.close();
    } catch (IOException e) {
        e.printStackTrace();
        return false;
    }
    return true;
}
The function handleResponse just converts the responseAPDU from binary to hex and highlights the status word.
Can anybody tell me what is going wrong, or just help me out?
PS: sorry for my bad English ;)
As the response to my application SELECT I get:
6f298407a0000000043060a51e50074d41455354524f5f2d046465656e9f38039f5c08bf0c059f4d020b0a9000
6F -> FCI Template, length 29
84 -> DF Name, length 07: A0 00 00 00 04 30 60
A5 -> FCI Proprietary Template, length 1E
50 -> Application Label, length 07: 4D 41 45 53 54 52 4F ("MAESTRO")
5F2D -> Language Preference, length 04: 64 65 65 6E ("deen")
9F38 -> PDOL, length 03: 9F 5C 08
BF0C -> FCI Issuer Discretionary Data, length 05
9F4D -> Log Entry, length 02: 0B 0A
90 00 -> status word (success)
But I don't know what I have to insert into the data field of the GET PROCESSING OPTIONS.
I've read the guidelines in EMV Book 3, section "5.4 Rules for Using a Data Object List (DOL)".
So do I just have to set the data field to 83 03 9F 5C 08
and Lc = 5?
In order to help you, the entire APDU dialog (commands/responses) would be needed.
However, based on your code: hardcoding your select_Dir and select_Applet commands is fine, but you can't hardcode the GET PROCESSING OPTIONS command, whose syntax depends on the response of the card (ICC) to your select_Applet command.
EMV 4.3 Book 1, "Table 45: SELECT Response Message Data Field (FCI) of an ADF", explains that a successful card response to the SELECT command contains a "Processing Options Data Object List" (PDOL, tag 9F38). That's the list of fields required by the card to process the transaction (e.g. amount, ...). These field values are to be returned to the card by the terminal (your phone) through the GET PROCESSING OPTIONS command data field (tag 83), as documented in EMV 4.3 Book 3, section "6.5.8.3 Data Field Sent in the Command Message":
The data field of the command message is a data object coded according to the PDOL provided by the ICC, as defined in section 5.4, and is introduced by the tag '83'. When the data object list is not provided by the ICC, the terminal sets the length field of the template to zero. Otherwise, the length field of the template is the total length of the value fields of the data objects transmitted to the ICC.
Knowing that :
Your selected AID (A0 00 00 00 04 30 60) is a Mastercard Maestro one, which is unlikely to have an empty PDOL
But your GET PROCESSING OPTIONS command does not list any value in its data field
You probably have a mismatch between the length of your GET PROCESSING OPTIONS data field and the total length of the fields asked by the card in the PDOL, hence the 6700 checking error returned by the card (EMV Book 1, "Table 33: GET RESPONSE Error Conditions").
You have identified the PDOL requested by the card as: 9F38 -> 03 9F 5C 08.
The 03 tells you the PDOL is 3 bytes long. 9F5C is the tag of the requested field, and 08 is the length of the field value that is to be returned by the phone.
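As an aside, here is a minimal sketch of how one might walk a PDOL to compute the total data length the card expects. This helper is my own illustration: it handles only the one- and two-byte tags with single-byte lengths seen here, not full BER-TLV encoding:

// Walk a PDOL (a list of tag/length pairs, without values) and sum the
// value lengths the card expects in the GET PROCESSING OPTIONS data field.
static int pdolTotalLength(byte[] pdol) {
    int i = 0;
    int total = 0;
    while (i < pdol.length) {
        // A first tag byte with all five low bits set means a second tag byte follows.
        i += ((pdol[i] & 0x1F) == 0x1F) ? 2 : 1;
        total += pdol[i++] & 0xFF; // the length byte follows the tag
    }
    return total;
}

// For the PDOL above, pdolTotalLength(new byte[]{ (byte)0x9F, (byte)0x5C, (byte)0x08 }) returns 8.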
Tag 9F5C is defined in EMV Contactless 2.3 Book C2 kernel 2 specification, section "A.1.59 DS Requested Operator ID". The DS Requested Operator ID is defined as
Contains the Terminal determined operator identifier for data
storage. It is sent to the Card in the GET PROCESSING
OPTIONS command.
I'm not familiar with this tag, so I can't tell you what a proper value is.
However, here is what the data field of the GET PROCESSING OPTIONS command should look like, assuming a DS Requested Operator ID has value 01 02 03 04 05 06 07 08, and given the Data Object List formatting guidelines in EMV Book 3, section "5.4 Rules for Using a Data Object List (DOL)" :
83 08 01 02 03 04 05 06 07 08
and Lc = 10
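To make that concrete in the question's Java style, here is a minimal sketch of the resulting command; the 8-byte DS Requested Operator ID value is the placeholder from above, not a value I can vouch for:

// GET PROCESSING OPTIONS with a PDOL-derived data field.
byte[] operatorId = new byte[]{ // placeholder DS Requested Operator ID
    (byte)0x01, (byte)0x02, (byte)0x03, (byte)0x04,
    (byte)0x05, (byte)0x06, (byte)0x07, (byte)0x08
};
byte[] gpo = new byte[]{
    (byte)0x80, (byte)0xA8, (byte)0x00, (byte)0x00, // CLA INS P1 P2
    (byte)0x0A,                                     // Lc = 10: tag + length + 8 value bytes
    (byte)0x83, (byte)0x08,                         // command template tag 83, length 8
    operatorId[0], operatorId[1], operatorId[2], operatorId[3],
    operatorId[4], operatorId[5], operatorId[6], operatorId[7],
    (byte)0x00                                      // Le
};
byte[] responseAPDU = tag.transceive(gpo);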
I want to read MIDI data from an asset stream. The file is a MIDI type 0 file, 150 bytes long according to Windows. Using this code, I read 150 bytes as measured by count, but the output string amounts to only 127.5 bytes.
try {
    assetStream = assets.open("MIDI0_7.mid");
    int count = 0;
    do {
        byteValue = assetStream.read();
        count++;
        outputString = outputString + Integer.toHexString(byteValue);
    } while (byteValue > -1);
    Log.d("MUSIC", "Final string " + outputString);
    Log.d("MUSIC", "bytes read " + count);
} catch (IOException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
This read hex data also doesn't match the MIDI spec, as far as I can tell. The opening 8 bytes should read
4d 54 68 64 00 00 00 06
but I get
4d 54 68 64 00 06
I can't be sure about the save format of my MIDI file (a test file with 7 notes exported from Cakewalk SONAR), so I'm not sure why the MIDI doesn't correspond to the standard, but before I can solve that I need to know where my missing data is! What am I doing wrong that makes bytes get dropped from my output stream?
Edit:
Okay, found it. Bytes less than 0x10 are returned as a single character by Integer.toHexString() instead of as a zero-padded two-digit hex value. Easily fixed.
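For anyone hitting the same thing, a minimal sketch of the fix, zero-padding each byte with String.format (it reuses the question's assetStream; the StringBuilder is my own substitution for the string concatenation):

// Zero-pad every byte to two hex digits so values below 0x10 keep their leading zero.
StringBuilder outputString = new StringBuilder();
int count = 0;
int byteValue;
while ((byteValue = assetStream.read()) > -1) {
    outputString.append(String.format("%02x", byteValue));
    count++;
}
Log.d("MUSIC", "Final string " + outputString);
Log.d("MUSIC", "bytes read " + count);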