Microsoft Face API for Android

I am going through the Get Started tutorial for Microsoft's Face API on Android handsets. As of right now, everything except the face detection works: I can browse through photos fine, but the detect method always returns null, so no red rectangle is drawn. If someone has already worked through the tutorial successfully, I would be grateful for your help. Here is the detect method:
public Face[] detect(InputStream image, boolean analyzesFaceLandmarks, boolean analyzesAge,
                     boolean analyzesGender, boolean analyzesHeadPose) throws ClientException, IOException {
    Map<String, Object> params = new HashMap<>();
    params.put("analyzesAge", analyzesAge);
    params.put("analyzesGender", analyzesGender);
    params.put("analyzesFaceLandmarks", analyzesFaceLandmarks);
    params.put("analyzesHeadPose", analyzesHeadPose);
    String path = ServiceHost + "/detections";
    String uri = WebServiceRequest.getUrl(path, params);

    // drain the input stream into a byte array
    ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
    int bytesRead;
    byte[] bytes = new byte[1024];
    while ((bytesRead = image.read(bytes)) > 0) {
        byteArrayOutputStream.write(bytes, 0, bytesRead);
    }
    byte[] data = byteArrayOutputStream.toByteArray();

    // POST the raw image bytes and deserialize the JSON response
    params.clear();
    params.put("data", data);
    String json = this.restCall.request(uri, "POST", params, "application/octet-stream");
    Type listType = new TypeToken<List<Face>>() {}.getType();
    List<Face> faces = this.gson.fromJson(json, listType);
    return faces.toArray(new Face[faces.size()]);
}

The code you have posted looks fine. That makes me wonder whether the issue is with the InputStream you are passing in: if the source has already been read, the stream position is at the end, so nothing gets uploaded to the API.
I would suggest checking the amount of image data you are actually uploading:
byte[] data = byteArrayOutputStream.toByteArray();
int length = data.length; // <- what is this value? if 0, nothing was uploaded
If it is zero, the stream position needs to be reset to the beginning before reading it again. Something like this may be required before reading from image:
if (image.markSupported()) {
    image.reset(); // note: some streams also require mark() to have been called first
}
BTW, my Java is very rusty, so there may be better code for this, but hopefully you get the gist.
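If the stream position turns out to be the culprit, one way to make the upload robust is to buffer the bytes once and hand detect() a fresh stream each time. A minimal sketch (the helper name and log tag are illustrative, not from the tutorial):

import android.util.Log;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Illustrative helper: drain the source stream into memory, log the size,
// and return a fresh stream that always reads from position 0.
static InputStream toRewindableStream(InputStream source) throws IOException {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    byte[] chunk = new byte[4096];
    int n;
    while ((n = source.read(chunk)) > 0) {
        out.write(chunk, 0, n);
    }
    byte[] data = out.toByteArray();
    Log.d("FaceApi", "uploading " + data.length + " bytes"); // 0 means the source was already drained
    return new ByteArrayInputStream(data);
}

Passing the result into detect() guarantees the upload can't silently be empty because of a stale stream position.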


How to cache a video in the background in Android?

I am building an Android application where a user can view some listed videos. The videos are categorized into channels. Once the user selects a channel, I want to cache all of that channel's videos so they can be played even when there is no internet connection.
If anyone has a better understanding of caching video without playing it, please help me understand how I can achieve this.
Right now I am only able to cache a video if it is played, using some library.
I have found the following working solution for caching videos (single or multiple) in the background using the library below; no player or VideoView is needed. Use an AsyncTaskRunner to drive it.
Videocaching Lib
Add the following line to your gradle file:
compile 'com.danikula:videocache:2.7.0'
Since we just need to kick-start the prefetching, there is nothing to do inside the while loop. Alternatively, the loop body could write the data out to a file if you want your own copy on disk.
URL url = null;
InputStream inputStream = null;
try {
    // videoUrl is the original remote URL; routing it through the proxy
    // is what triggers the library's disk caching
    url = new URL(cachingUrl(videoUrl));
    inputStream = url.openStream();
    int bufferSize = 1024;
    byte[] buffer = new byte[bufferSize];
    while (inputStream.read(buffer) != -1) {
        // nothing to do; reading drives the cache fill
    }
} catch (IOException e) {
    e.printStackTrace();
} finally {
    if (inputStream != null) {
        try { inputStream.close(); } catch (IOException ignored) {}
    }
}
The important setup code from the library is as follows.
Create a static instance in your Application class using the following code:
private HttpProxyCacheServer proxy;

public static HttpProxyCacheServer getProxy(Context context) {
    Applications app = (Applications) context.getApplicationContext();
    return app.proxy == null ? (app.proxy = app.newProxy()) : app.proxy;
}

private HttpProxyCacheServer newProxy() {
    return new HttpProxyCacheServer.Builder(this)
            .cacheDirectory(CacheUtils.getVideoCacheDir(this))
            .maxCacheFilesCount(40)
            .maxCacheSize(1024 * 1024 * 1024) // 1 GiB cap
            .build();
}
Write the following code in your activity to build the proxied URL:
public String cachingUrl(String urlPath) {
    return Applications.getProxy(this).getProxyUrl(urlPath, true);
}
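One thing to keep in mind: the prefetch read loop above must not run on the main thread. A minimal sketch wrapping it in a plain background thread (prefetch(...) is a hypothetical method holding the read loop shown earlier; videoUrl is one of the channel's video URLs):

// run the cache-priming read loop off the main thread
new Thread(new Runnable() {
    @Override
    public void run() {
        prefetch(videoUrl); // hypothetical wrapper around the read loop above
    }
}).start();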

Will the message size of the sensor_msgs Image/CompressedImage type change when published/subscribed in ROS-Android?

I need to get the preview image data from an Android phone camera and publish it via ROS. Here is my sample code:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    if (data != null) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
        if (yuvImage != null) {
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            ChannelBufferOutputStream stream = new ChannelBufferOutputStream(MessageBuffers.dynamicBuffer());
            // compress the NV21 preview frame to JPEG (quality 80)
            yuvImage.compressToJpeg(new Rect(0, 0, yuvImage.getWidth(), yuvImage.getHeight()), 80, baos);
            yuvImage = null;
            stream.buffer().writeBytes(baos.toByteArray());
            try {
                baos.flush();
                baos.close();
                baos = null;
            } catch (IOException e) {
                e.printStackTrace();
            }
            // fill in the CompressedImage message
            sensor_msgs.CompressedImage compressedImage = compressedImagePublisher.newMessage();
            compressedImage.getHeader().setFrameId("xxx"); // frame id
            Time curTime = connectedNode.getCurrentTime();
            compressedImage.getHeader().setStamp(curTime); // timestamp
            compressedImage.setFormat("jpeg"); // format
            compressedImage.setData(stream.buffer().copy()); // data
            stream.buffer().clear();
            try {
                stream.flush();
                stream.close();
                stream = null;
            } catch (IOException e) {
                e.printStackTrace();
            }
            // publish
            System.out.println("-----Publish: " + compressedImage.getData().array().length + "-----");
            compressedImagePublisher.publish(compressedImage);
            compressedImage = null;
            System.gc();
        } else {
            Log.v("Log_Tag", "-----Failed to get yuvImage!-----");
        }
    } else {
        Log.v("Log_Tag", "-----Failed to get the preview frame!-----");
    }
}
I then subscribed to the topic just to check whether the messages had been published completely and correctly, like the following code does:
@Override
public void onStart(ConnectedNode node) {
    this.connectedNode = node;
    // publisher
    this.compressedImagePublisher = connectedNode.newPublisher(topic_name, sensor_msgs.CompressedImage._TYPE);
    // subscriber
    this.compressedImageSubscriber = connectedNode.newSubscriber(topic_name, sensor_msgs.CompressedImage._TYPE);
    compressedImageSubscriber.addMessageListener(new MessageListener<CompressedImage>() {
        @Override
        public void onNewMessage(final CompressedImage compressedImage) {
            byte[] receivedImageBytes = compressedImage.getData().array();
            if (receivedImageBytes != null && receivedImageBytes.length != 0) {
                System.out.println("-----Subscribe(+46?): " + receivedImageBytes.length + "-----");
                // decode the bitmap from byte[]; a strange but apparently necessary offset is needed
                Bitmap bmp = BitmapFactory.decodeByteArray(receivedImageBytes, offset, receivedImageBytes.length - offset);
                ...
            }
        }
    });
}
I am confused about this offset. It means the size of the image bytes changed after being packaged and published by ROS, and if I don't set the offset the bitmap fails to decode. Stranger still, the offset sometimes changes between runs.
I don't know why. I have read some articles about the JPEG structure and suspect the extra bytes might be header information of the JPEG byte messages. However, this problem only happens in the ros-android scene.
Does anyone have a good idea about this?
OK! I know the question and problem as described previously were terrible; that's why it got two downvotes. I'm sorry about that, and I'll make up for it by giving you more information and making the problem clearer now.
First, forget all the code I pasted before. The problem happened in my ros-android project. In this project I need to send sensor messages of the CompressedImage type to a ROS server and get the processed image bytes (JPEG format) back via publish/subscribe. In theory the size of the image bytes should stay the same, and in fact it did in my ros-C and ros-C# projects under the same conditions.
However, in ros-android the data gets bigger! Because of that I can't simply decode a bitmap from the subscribed image bytes; I have to skip the extra bytes with an offset into the image byte array. I didn't know why this happened in ros-android/ros-java, or what the added part was.
I couldn't find the reason in my code, which is why I pasted it in such detail. I really need help! Thanks in advance!
Maybe I really should have checked the API before asking here. This is a simple question once you check the properties of ChannelBuffer in the API, because the arrayOffset property already holds the answer!
Still, for the sake of caution, and since some of you asked for a clarification of my answer, here is the explanation in detail.
First, I have to admit that I still don't know how the ChannelBuffer packages the image byte array, which means I still don't know why there is an arrayOffset before the array data. Why can't we just get the data directly? Is there some important reason for the arrayOffset, perhaps safety or efficiency? I couldn't find an answer in the API, and frankly I'm tired of this question by now.
Back to the subject: the problem can be solved this way:
// skip the bytes that precede the payload in the backing array
int offset = compressedImage.getData().arrayOffset();
Bitmap bmp = BitmapFactory.decodeByteArray(receivedImageBytes, offset, receivedImageBytes.length - offset);
I still hope someone with good knowledge of this can tell me why; I would really appreciate it! If you are as tired of this question as I am, let's just vote to close it; I'd appreciate that too. Thanks anyway!
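An alternative that sidesteps the offset arithmetic entirely is to copy only the readable region out of the ChannelBuffer rather than grabbing its whole backing array. A minimal sketch, assuming the Netty 3 ChannelBuffer API that rosjava uses:

// org.jboss.netty.buffer.ChannelBuffer
ChannelBuffer buffer = compressedImage.getData();
// array() exposes the entire backing array, including whatever rosjava
// stored before the payload; copying just the readable region yields
// exactly the JPEG bytes, with no offset bookkeeping.
byte[] imageBytes = new byte[buffer.readableBytes()];
buffer.getBytes(buffer.readerIndex(), imageBytes);
Bitmap bmp = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);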

Cannot View the Data Sent by a TCP Packet Sending Program (Packet Server)

I'm trying to develop a small app which receives some data via sockets and, based on the data it receives, prints a toast message for the user. I'm getting the data, but apparently it cannot be read properly. Here is the relevant portion:
int red = -1;
byte[] buffer = new byte[1024]; // a 1 KiB read buffer
byte[] redData;
while ((red = cs.getInputStream().read(buffer)) > -1) {
    String redDataTextappend;
    redData = new byte[red];
    redDataTextappend = new String(redData);
    Log.w("Data got", redDataTextappend);
    if (redDataTextappend == "hi") {
        // display toast message using runOnUiThread(new Runnable() {...})
    } else {
        // display message using runOnUiThread(new Runnable() {...})
    }
}
This code runs on a separate thread, as Android does not allow networking operations on the main thread.
The four diamonds are what Android Studio displays as the received data, and cs is the socket that accepts connections.
Thanks.
You are printing a String built from an all-zero byte array, since you never copy in the data that was read.
It would make more sense to convert the byte array to a hex string if it contains arbitrary binary data: see the answers to this question for options.
If the array contains a String encoded in some charset, for example UTF-8, then do the following:
byte[] redData = Arrays.copyOf(buffer, red); // copy only the 'red' bytes actually read
String redDataTextappend = new String(redData, Charset.forName("UTF-8"));
Log.w("Data got", redDataTextappend);

How to compress Ti.Utils.base64encode output and decompress it using a .NET method?

Does anyone know how to compress the output of Ti.Utils.base64encode?
For example, I have this code:
uploadFile = Ti.Filesystem.getFile(pathFile, listing[_fileCtr].toString());
uploadFileName = listing[_fileCtr].toString();
encodedFile = Ti.Utils.base64encode(uploadFile.read()).toString();
// send image to .NET web service
And this is the method in my web service for decompressing the image from Titanium (assuming I can compress the image first):
static byte[] Decompress(byte[] input)
{
    using (MemoryStream output = new MemoryStream(input))
    {
        using (GZipStream zip = new GZipStream(output, CompressionMode.Decompress))
        {
            List<byte> bytes = new List<byte>();
            int b = zip.ReadByte();
            while (b != -1)
            {
                bytes.Add((byte)b);
                b = zip.ReadByte();
            }
            return bytes.ToArray();
        }
    }
}
So far I can't find a method for compressing my byte array so that I can decompress it using my .NET method.
If you have any information about this problem, please tell me.
Many thanks! :)
In .NET you can use System.Convert.FromBase64String to convert the specified string, which encodes binary data as base-64 digits, into the equivalent 8-bit unsigned integer array:
System.Convert.FromBase64String(encodedString);
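Ti.Utils itself exposes no gzip API, so the compression step has to happen outside it, e.g. in a native Android module. A minimal Java sketch of the matching encode pipeline (gzip first, then base64) that the Decompress method above can reverse; compressAndEncode is a hypothetical helper:

import android.util.Base64;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPOutputStream;

// Hypothetical helper: gzip the raw file bytes, then base64-encode the result.
static String compressAndEncode(byte[] raw) throws IOException {
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    GZIPOutputStream gzip = new GZIPOutputStream(bos);
    gzip.write(raw);
    gzip.close(); // writes the gzip trailer; must happen before reading bos
    return Base64.encodeToString(bos.toByteArray(), Base64.NO_WRAP);
}

On the .NET side, Convert.FromBase64String(encodedString) followed by the posted Decompress(...) restores the original bytes.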

Which is the better solution for getting server response data?

We often need to read data from a server response in Android development.
/*
 * reader wrapping the server response InputStream
 */
Reader responseReader;
Solution 1: build the response string with multiple reads.
/*
 * get server response string
 */
StringBuffer responseString = new StringBuffer();
responseReader = new InputStreamReader(conn.getInputStream(), "UTF-8");
int bufferSize = 1024;
char[] charBuffer = new char[bufferSize];
int count;
while ((count = responseReader.read(charBuffer)) > -1) {
    responseString.append(charBuffer, 0, count);
}
responseReader.close();
Solution 2: get the response with a single read.
String responseString = null;
// we can get the content length from the response header; 1024 is assumed here for simplicity
int content_length = 1024;
responseReader = new InputStreamReader(conn.getInputStream(), "UTF-8");
char[] charBuffer = new char[content_length];
int count = responseReader.read(charBuffer);
if (count > -1) {
    responseString = new String(charBuffer, 0, count);
}
responseReader.close();
Which solution has better performance, and why?
Note: the server responds with JSON data of less than 1 MB.
Why are you reinventing the wheel? ;)
If you're using HttpClient, then just use EntityUtils.toString(...).
I guess you're using HttpURLConnection; in that case, look at the source code of EntityUtils.toString(...) from Apache HttpClient. Your first approach is similar to it.
BTW, the second snippet is worse because:
new String(charBuffer, 0, count) creates yet another copy of the data, producing extra work for the garbage collector.
And in both snippets, and even in EntityUtils:
int content_length = 1024; — in most cases 8192 is the default socket buffer size, so your code might run the while loop 8 times more often than it needs to.
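For reference, a minimal sketch of the HttpClient route (assuming Apache HttpClient is on the classpath; serviceUrl is a placeholder):

import org.apache.http.HttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.DefaultHttpClient;
import org.apache.http.util.EntityUtils;

HttpResponse response = new DefaultHttpClient().execute(new HttpGet(serviceUrl));
// EntityUtils.toString performs the buffered read loop internally,
// using the entity's content length and charset when available
String responseString = EntityUtils.toString(response.getEntity(), "UTF-8");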
I would recommend the second method IF you do not want to display the amount of data downloaded. Since the response is read as a whole, and your JSON string is close to 1 MB, it will take some time to download; during that time the most you can show the user is a generic "downloading..." label, because you cannot report how much has arrived.
But if you want to display the amount downloaded, use the first method you gave, where you read the data from the server in parts. There you can update the UI with the amount downloaded, e.g. "25% completed":
char[] charBuffer = new char[bufferSize];
int count;
int i = 0;
while ((count = responseReader.read(charBuffer)) > -1) {
    // progress: ((i * bufferSize) / content_length) * 100 % completed
    i++;
}
So, on the whole, I would say the second method is better.
BTW, did you consider this?
if (responseCode == 200) {
    ObjectInputStream in = new ObjectInputStream(conn.getInputStream());
    String from_server = (String) in.readObject();
}
This reads the input String as an object. Any object whose class implements Serializable can be sent using ObjectOutputStream and received using ObjectInputStream (note this only works if the server actually wrote the data with an ObjectOutputStream).
I think the first one is good, because the first reads your response chunk by chunk, whereas the second tries to read the whole response object in one go.
So, in my opinion and to my knowledge, the first is better compared with the second. If anyone wants to edit this, it will be truly appreciated.
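For completeness, a minimal sketch of solution 1 in its usual idiomatic form, with the 8 KiB buffer discussed above (serviceUrl is a placeholder):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

HttpURLConnection conn = (HttpURLConnection) new URL(serviceUrl).openConnection();
BufferedReader reader = new BufferedReader(
        new InputStreamReader(conn.getInputStream(), "UTF-8"), 8192);
StringBuilder responseString = new StringBuilder();
char[] buffer = new char[8192];
int count;
while ((count = reader.read(buffer)) != -1) {
    responseString.append(buffer, 0, count);
}
reader.close();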
