AWS Rekognition takes too long to compare faces between two pictures - Android

I am using the following code for Rekognition:
AWSCredentials credentials = new BasicAWSCredentials(xx, yy);
AmazonRekognition rekognitionClient = new AmazonRekognitionClient(credentials);
rekognitionClient.setRegion(Region.getRegion(Regions.US_EAST_1));

CompareFacesRequest request = new CompareFacesRequest()
        .withSourceImage(new Image().withBytes(byteBufferSrc))
        .withTargetImage(new Image().withBytes(byteBufferDest))
        .withSimilarityThreshold(90f);
CompareFacesResult response = rekognitionClient.compareFaces(request);

boolean matched = false;
for (CompareFacesMatch singleMatch : response.getFaceMatches()) {
    if (singleMatch.getSimilarity() >= 90f) {
        return true;
    }
}
It takes almost a minute to finish the face comparison between the two images.
Is this normal? It seems excessive, so I am wondering whether there is a way to speed it up or whether I am doing something wrong.
Thank you
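If the images come straight off the camera, one likely cause is simply payload size: CompareFaces cannot start until a multi-megabyte JPEG per image has been uploaded over the mobile connection. Below is a minimal sketch of downscaling before building each ByteBuffer; the helper name and the 1024 px cap are illustrative, not from the original code.

// Sketch: shrink a camera photo before handing it to CompareFaces.
// Uses android.graphics.Bitmap, java.io.ByteArrayOutputStream, java.nio.ByteBuffer.
private static ByteBuffer toCompressedBuffer(Bitmap original) {
    // Cap the longest side at 1024 px; face matching rarely needs full resolution.
    float scale = 1024f / Math.max(original.getWidth(), original.getHeight());
    Bitmap resized = (scale < 1f)
            ? Bitmap.createScaledBitmap(original,
                    Math.round(original.getWidth() * scale),
                    Math.round(original.getHeight() * scale), true)
            : original;
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    resized.compress(Bitmap.CompressFormat.JPEG, 80, out); // quality 80 is usually enough
    return ByteBuffer.wrap(out.toByteArray());
}

Timing rekognitionClient.compareFaces(request) by itself, off the main thread, would also show whether the minute is really spent in the API round trip or in image loading.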

Related

UnityWebRequest does nothing on IL2CPP Build

I've been trying to simply call an API on an Android build supporting 64-bit (IL2CPP build), and the UnityWebRequest class didn't seem to work. It's being called via a simple UI button click. It hits webRequest.SendWebRequest(); and nothing happens. I've tried the following samples: one directly from the Unity docs for UnityWebRequest, and others using the standard HttpClient.
UnityWebRequest:
IEnumerator GetRequest(string uri)
{
    using (UnityWebRequest webRequest = UnityWebRequest.Get(uri))
    {
        webRequest.SetRequestHeader("Authorization", "Bearer " + API_KEY);
        yield return webRequest.SendWebRequest();
        if (webRequest.isNetworkError)
        {
            debugText.text = ": Error: " + webRequest.error;
            coroutineAllowed = false;
        }
        else
        {
            debugText.text = ":\nReceived: " + webRequest.downloadHandler.text;
            dynamic jsonObj = JsonConvert.DeserializeObject(webRequest.downloadHandler.text);
            foreach (var obj in jsonObj["businesses"])
            {
                businessResults.Add(new Business()
                {
                    name = (string)obj["name"],
                    image_url = (string)obj["image_url"],
                    review_count = (string)obj["review_count"],
                    rating = (string)obj["rating"],
                    Coordinates = new Coordinates()
                    {
                        Latitude = (float)obj["coordinates"]["latitude"],
                        Longitude = (float)obj["coordinates"]["longitude"]
                    },
                    price = (string)obj["price"]
                });
            }
            debugText.text = businessResults.Count.ToString();
            //coroutineAllowed = true;
        }
        debugText.text = "getRequest 4";
    }
}
This unfortunately did nothing at the yield return webRequest.SendWebRequest();
The next sample I tried was using HttpClient():
IEnumerator HttpClientCall(string uri) //possibly wrap in IEnumerator
{
    debugText.text += "http coroutine started" + Environment.NewLine;
    using (var httpClient = new HttpClient())
    {
        httpClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", API_KEY);
        var response = httpClient.GetAsync(uri);
        if (response.Result.StatusCode != HttpStatusCode.OK)
        {
            debugText.text += "FAILED HTTP GET";
        }
        yield return response.Result.Content.ReadAsStringAsync();
        dynamic jsonObj = JsonConvert.DeserializeObject(response.Result.Content.ReadAsStringAsync().Result);
        foreach (var obj in jsonObj["businesses"])
        {
            businessResults.Add(new Business()
            {
                name = (string)obj["name"],
                image_url = (string)obj["image_url"],
                review_count = (string)obj["review_count"],
                rating = (string)obj["rating"],
                Coordinates = new Coordinates()
                {
                    Latitude = (float)obj["coordinates"]["latitude"],
                    Longitude = (float)obj["coordinates"]["longitude"]
                },
                price = (string)obj["price"]
            });
            debugText.text += Environment.NewLine + ((string)obj["name"]);
        }
    }
}
Once again, nothing when it hits yield return response.Result.Content.ReadAsStringAsync();
Both of these work on PC, and they return the results I'm expecting.
The next thing I heard about was setting the Android manifest application tag with android:usesCleartextTraffic="true".
This, unfortunately, also did nothing for me. I know it has to be the 64-bit support, because this works on a standard build. The moment I build with 64-bit support, it doesn't work.
Any help on why it's not returning appropriately would be very helpful.
Side note: I know the code is pretty ugly, but once I figure out why the build doesn't work on the device, a heavy refactoring is going to be in play. Thanks in advance!
So after a lot of troubleshooting I've found out why this was not working. The main issue seems to stem from my use of the standard Newtonsoft JSON package, when Unity apparently has its own internal JsonUtility class. After changing this:
dynamic jsonObj = JsonConvert.DeserializeObject(response.Result.Content.ReadAsStringAsync().Result);
To This:
var js = JsonUtility.FromJson<T>(response.Result.Content.ReadAsStringAsync().Result);
my results are finally showing correctly in the APK build.
Also, note that for the mapping to work, JsonUtility.FromJson must be typed to a class that exactly mirrors the incoming JSON object; in practice that means a [Serializable] class whose public field names match the JSON keys.
The article that finally helped me with this issue is here.
P.S.
Thank you to @RetiredNinja for trying to help instead of just downvoting and saying nothing of value. You're amazing!

Faces indexed by iOS/Android app are not detected by Android/iOS App - AWS Rekognition

So I have been working for a long time on a product (Android first, then iOS) that indexes faces of people using AWS Rekognition; when they are scanned again later, it identifies them.
It works great when I index a face from an Android device and then search for it with an Android device. But if I later search for it from the iOS app, it doesn't find it. The result is the same the other way round: index with iOS, search with Android, not found.
The collection ID is the same while indexing and searching on both devices. I couldn't figure out how it is possible that a face indexed through one OS, same region, same collection, couldn't be found from the other.
If anyone here could try and help me with the issue, please do. I'll be really thankful.
Update 1: I have called the listCollections function in both the iOS and Android apps, and they show different lists of collections. This is the issue, but I can't figure out why it is happening. The identity pool and region are the same on both.
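For reference, a minimal sketch of that check on the Android side, assuming the same mCredentialsProvider as in the code below. One thing worth noting: a Rekognition client with no explicit setRegion call defaults to us-east-1, so two apps can silently talk to different regional endpoints, each with its own set of collections.

// Sketch: list the collections this client can actually see.
AmazonRekognitionClient client = new AmazonRekognitionClient(mCredentialsProvider);
client.setRegion(Region.getRegion(Regions.US_EAST_2)); // without this, the client defaults to us-east-1
ListCollectionsResult result = client.listCollections(new ListCollectionsRequest());
for (String id : result.getCollectionIds()) {
    Log.i(TAG, "Visible collection: " + id);
}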
Here is my Android code to access Rekognition:
mCredentialsProvider = new CognitoCachingCredentialsProvider(
        mContext,
        "us-east-2:xbxfxexf-x5x5-xax7-x9xf-x5x0xexfx1xb", // Identity pool ID
        Regions.US_EAST_2 // Region
);
mUUID = UUID.randomUUID().toString().replace("-", "");

mAmazonS3Client = new AmazonS3Client(mCredentialsProvider);
mAmazonS3Client.setRegion(Region.getRegion(Regions.US_EAST_2));
mAmazonRekognitionClient = new AmazonRekognitionClient(mCredentialsProvider);

if (!mAmazonS3Client.doesBucketExist(mFacesBucket)) {
    mAmazonS3Client.createBucket(mFacesBucket);
}
Log.i(TAG, "Uploading image to S3 Bucket");
mAmazonS3Client.putObject(mFacesBucket, getS3ObjectName(), new File(data[0].toString()));
Log.i(TAG, "Image Uploaded");

Image image = new Image();
try {
    image.setBytes(ByteBuffer.wrap(Files.toByteArray(new File(data[0].toString()))));
} catch (IOException e) {
    e.printStackTrace();
}
Log.i(TAG, "Indexing image");
IndexFacesRequest indexFacesRequest = new IndexFacesRequest()
        .withCollectionId(mFacesCollection)
        .withImage(image)
        .withExternalImageId(mUUID)
        .withDetectionAttributes("ALL");
mAmazonRekognitionClient.indexFaces(indexFacesRequest);
Here is my iOS code to access Rekognition:
func uploadToCollection(img: UIImage)
{
    let myIdentityPoolId = "us-east-2:xbxfxexf-x5x5-xax7-x9xf-x5x0xexfx1xb"
    let credentialsProvider = AWSCognitoCredentialsProvider(regionType: .USEast2, identityPoolId: myIdentityPoolId)
    //store photo in s3()
    let configuration = AWSServiceConfiguration(region: .USEast2, credentialsProvider: credentialsProvider)
    AWSServiceManager.default().defaultServiceConfiguration = configuration
    rekognitionClient = AWSRekognition.default()
    guard let request = AWSRekognitionIndexFacesRequest() else
    {
        puts("Unable to initialize AWSRekognitionIndexFacesRequest.")
        return
    }
    var go = false
    request.collectionId = "i_faces" + self.firebaseID.lowercased() //here iosCollection will be replaced by firebase Current UserID
    request.detectionAttributes = ["ALL", "DEFAULT"]
    request.externalImageId = self.UUID //this should be mUUID, passed as parameter to this function
    let sourceImage = img
    let image = AWSRekognitionImage()
    image!.bytes = sourceImage.jpegData(compressionQuality: 0.7)
    request.image = image
    self.rekognitionClient.indexFaces(request) { (response: AWSRekognitionIndexFacesResponse?, error: Error?) in
        if error == nil
        {
            print("Upload to Collection Complete")
        }
        go = true
        return
    }
    while (go == false) {}
}
Create a collection, add images to the collection, and create an index. I suspect a few things in your setup and code:
1) The Identity Pool Id, AWS Region used across iOS and Android
2) The name of the collection used (pay attention to the delimiters used in the collection name)
Android:
CognitoCachingCredentialsProvider credentialsProvider = new CognitoCachingCredentialsProvider(appContext, "MyPoolID", Regions.US_EAST_1);

public void searchFacesByImage() {
    Image source = new Image().withS3Object(new S3Object().withBucket("us-east-1-bucket").withName("ms.jpg"));
    Image ms2 = new Image().withS3Object(new S3Object().withBucket("us-east-1-bucket").withName("ms-2.jpg"));
    Image ms3 = new Image().withS3Object(new S3Object().withBucket("us-east-1-bucket").withName("ms-3.jpg"));
    Image ms4 = new Image().withS3Object(new S3Object().withBucket("us-east-1-bucket").withName("ms-4.jpg"));
    String collectionId = "MyCollectionID";
    AmazonRekognitionClient client = new AmazonRekognitionClient(credentialsProvider);

    try {
        System.out.println("Creating collection: " + collectionId);
        CreateCollectionRequest request = new CreateCollectionRequest().withCollectionId(collectionId);
        CreateCollectionResult createCollectionResult = client.createCollection(request);
        System.out.println("CollectionArn : " + createCollectionResult.getCollectionArn());
        System.out.println("Status code : " + createCollectionResult.getStatusCode().toString());
    } catch (Exception ex) {
        ex.printStackTrace();
    }

    IndexFacesRequest indexFacesRequest = new IndexFacesRequest();
    indexFacesRequest.setImage(source);
    indexFacesRequest.setCollectionId(collectionId);
    client.indexFaces(indexFacesRequest);

    indexFacesRequest = new IndexFacesRequest();
    indexFacesRequest.setImage(ms2);
    indexFacesRequest.setCollectionId(collectionId);
    client.indexFaces(indexFacesRequest);

    indexFacesRequest = new IndexFacesRequest();
    indexFacesRequest.setImage(ms4);
    indexFacesRequest.setCollectionId(collectionId);
    client.indexFaces(indexFacesRequest);

    SearchFacesByImageRequest searchFacesByImageRequest = new SearchFacesByImageRequest();
    searchFacesByImageRequest
            .withCollectionId(collectionId)
            .withImage(ms3)
            .withFaceMatchThreshold(80F);
    SearchFacesByImageResult searchFacesByImageResult =
            client.searchFacesByImage(searchFacesByImageRequest);
    List<FaceMatch> faceImageMatches = searchFacesByImageResult.getFaceMatches();
    for (FaceMatch face : faceImageMatches) {
        Log.d(TAG, face.toString());
    }
}
iOS:
Create the Cognito Credentials Provider
AWSCognitoCredentialsProvider *credentialsProvider = [[AWSCognitoCredentialsProvider alloc] initWithRegionType:AWSRegionUSEast1 identityPoolId:@"MyPoolID"];
AWSServiceConfiguration *configuration = [[AWSServiceConfiguration alloc] initWithRegion:AWSRegionUSEast1 credentialsProvider:credentialsProvider];
[AWSServiceManager defaultServiceManager].defaultServiceConfiguration = configuration;
Use the same Identity Pool Id and Region (us-east-1).
func faceIndexNoFacesSearch() {
    let rekognition = AWSRekognition.default()
    let faceRequest = AWSRekognitionSearchFacesByImageRequest()
    do {
        let image = AWSRekognitionImage()
        image?.s3Object = AWSRekognitionS3Object()
        image?.s3Object?.bucket = "us-east-1-bucket"
        image?.s3Object?.name = "ms-2.jpg"
        faceRequest!.image = image
        faceRequest!.collectionId = "MyCollectionID"
        rekognition.searchFaces(byImage: faceRequest!).continueWith { (response) -> Any? in
            XCTAssertNil(response.error)
            XCTAssertNotNil(response.result)
            if let result = response.result {
                XCTAssertNotNil(result.faceMatches)
            }
            return nil
        }.waitUntilFinished()
    } catch {
        print("exception")
    }
}
Please post questions in the comments and we can discuss there.
OK, so the problem turned out to be quite different, and the solution was rather simple. I posted another question about the same problem when I found it was a bit different, and I have posted an answer there as well.
Here it is:
https://stackoverflow.com/a/53128777/4395264

AS3 Android camera roll crash and restart

I'm making an AS3 app for Android that uses the camera roll to load, select, and then use an image.
The camera roll browser works fine, but when an image is selected, the app crashes almost every time; it has worked on a handful of occasions (!?!). We assumed a memory issue and tried closing all other windows, using smaller photos in the camera roll, and different devices, but we cannot recreate success consistently. We cannot find another ANE that works either.
It fails at the point where the photo has been selected, and RESTARTS the app...
Here's the relevant code; any help appreciated.
public function openGallery():void {
    var cameraRoll:CameraRoll = new CameraRoll();
    if (CameraRoll.supportsBrowseForImage) {
        cameraRoll.addEventListener(MediaEvent.SELECT, imageSelected);
        cameraRoll.addEventListener(flash.events.Event.CANCEL, browseCanceled);
        cameraRoll.addEventListener(flash.events.ErrorEvent.ERROR, mediaError);
        cameraRoll.browseForImage();
    }
    else {
        trace("Image browsing is not supported on this device.");
    }
}

private function imageSelected(event:MediaEvent):void {
    trace("Media selected...");
    var imagePromise:MediaPromise = event.data as MediaPromise;
    _dataSource = imagePromise.open();
    if (imagePromise.isAsync) {
        trace("Asynchronous media promise.");
        var eventSource:IEventDispatcher = _dataSource as IEventDispatcher;
        eventSource.addEventListener(flash.events.Event.COMPLETE, onMediaLoaded);
    } else {
        trace("Synchronous media promise.");
        readMediaData();
    }
}

private function onMediaLoaded(event:flash.events.Event):void {
    trace("Media load complete");
    _mediaBytes = new ByteArray();
    _dataSource.readBytes(_mediaBytes);
    _tempDir = File.createTempDirectory();
    var now:Date = new Date();
    var filename:String;
    filename = now.fullYear + now.month + now.day + now.hours + now.minutes + now.seconds + ".JPG";
    _file = _tempDir.resolvePath(filename);
    // writing a temporary file to display the image
    _stream = new FileStream();
    _stream.open(_file, FileMode.WRITE);
    _stream.writeBytes(_mediaBytes);
    _stream.close();
    if (_file.exists) {
        _imageLoader = new Loader();
        _imageLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, onMediaLoadedBitmapData);
        _imageLoader.loadBytes(_mediaBytes);
    }
}

private function onMediaLoadedBitmapData(event:Event):void {
    trace("onMediaLoadedBitmapData");
    var loaderInfo:LoaderInfo = LoaderInfo(event.target);
    _bitmapData = new BitmapData(loaderInfo.width, loaderInfo.height, false, 0xFFFFFF);
    _bitmapData.draw(loaderInfo.loader);
    addPictureToScreen();
}
I had a similar thing happen in AIR for iOS, but it was related to picking large images. I just checked the dimensions of the chosen image and then used a scaling matrix if they were past a certain point. It sounds like you've thought about this, but maybe look further into that?

Android TrafficStats - is the oversized value a bug?

I am investigating traffic measurement on Android. I am developing on a Galaxy S4, and I programmed a service that samples the TrafficStats API once per minute, saves the accumulated traffic (a.k.a. BaseTraffic) in SharedPreferences, and saves in a database the difference between the current traffic and BaseTraffic.
The problem is that over short periods (15 min) TrafficStats returns an oversized value (1.6 GB per minute), and always the same value. Does anyone know whether this is a bug or some other issue?
This is my code to get the traffic:
public class TrafficTracker {
    public static long getCurrentTraffic() {
        // Total bytes received + sent since boot, across all interfaces.
        long traff = TrafficStats.getTotalRxBytes() + TrafficStats.getTotalTxBytes();
        if (traff > 0) {
            return traff;
        } else {
            // TrafficStats returns UNSUPPORTED (-1) on devices that cannot report it.
            throw new UnsupportedOperationException("TrafficStats not supported");
        }
    }

    public static long getTrafficWithOutBase(long baseTraffic) {
        return TrafficStats.getTotalTxBytes() + TrafficStats.getTotalRxBytes() - baseTraffic;
    }
}
And I call it here:
if (preferences.getBaseTraffic() != null) {
    if (TrafficTracker.getCurrentTraffic() > preferences.getBaseTraffic().getByteTraffic()) {
        TrafficObject trafficObject = new TrafficObject(new Date(calendar.getTimeInMillis()),
                TrafficTracker.getTrafficWithOutBase(preferences.getBaseTraffic().getByteTraffic()));
        daoTraffic.create(trafficObject);
        preferences.setBaseTraffic(new TrafficObject(new Date(System.currentTimeMillis()),
                preferences.getBaseTraffic().getByteTraffic() + trafficObject.getByteTraffic()));
    } else { // when stats are reset
        TrafficObject trafficObject = new TrafficObject(new Date(calendar.getTimeInMillis()),
                TrafficTracker.getCurrentTraffic());
        daoTraffic.create(trafficObject);
        preferences.setBaseTraffic(trafficObject);
    }
}
** UPDATE **
I found my error :). I replaced > with >=. Now it works properly when the device is disconnected from data or Wi-Fi.
if (TrafficTracker.getCurrentTraffic() >= preferences.getBaseTraffic().getByteTraffic())
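To spell out why the comparison matters, here is a sketch of the delta logic with illustrative names. When the device is offline, no bytes move between samples, so the current counter equals the saved base; with a strict >, that case fell into the "stats were reset" branch and logged the entire counter value (the repeated 1.6 GB) as one minute of traffic.

// Sketch: reset-safe delta between two samples (illustrative names).
long current = TrafficStats.getTotalRxBytes() + TrafficStats.getTotalTxBytes();
long delta;
if (current >= base) {
    // Normal case, including current == base (no traffic since the last sample).
    delta = current - base;
} else {
    // Counters only go backwards when they have been reset (e.g. after a reboot).
    delta = current;
}
base = current; // persist as the new BaseTraffic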

Android (AIR/ActionScript): Can't return FB access_token from StageWebView

I'm simply trying to get my user's access token from FB using StageWebView. When I trace the output from the location change/changing events, all I get are Google/FB URLs, none of which have the token.
Here is my code:
var webView:StageWebView = new StageWebView();

function onChanging(e:LocationChangeEvent):void {
    trace(e.location);
    e.preventDefault();
    webView.loadURL(e.location);
    webView.stage = null;
}

function onChange(e:LocationChangeEvent):void {
    trace(webView.location);
    if (webView.location.indexOf("http://google.com") == 0 && webView.location.indexOf("access_token") != -1) {
        trace("?" + webView.location.substring(webView.location.indexOf("access_token"), webView.location.indexOf("&expires_in")));
        webView.stage = null;
    }
}

function connectFb() {
    webView.addEventListener(LocationChangeEvent.LOCATION_CHANGING, onChanging);
    webView.addEventListener(LocationChangeEvent.LOCATION_CHANGE, onChange);
    webView.stage = stage;
    webView.viewPort = new Rectangle(0, 0, stage.stageWidth, stage.stageHeight);
    webView.loadURL("https://graph.facebook.com/oauth/authorize?client_id=123456789012345&redirect_uri=http://google.com&type=user_agent&display=popup");
}
My code's output:
https://graph.facebook.com/oauth/authorize?client_id=164534120383085&redirect_uri=http://google.com&type=user_agent&display=popup
https://www.facebook.com/dialog/oauth?client_id=164534120383085&redirect_uri=http%3A%2F%2Fgoogle.com&type=user_agent&display=popup
https://www.facebook.com/dialog/oauth?client_id=164534120383085&redirect_uri=http%3A%2F%2Fgoogle.com&type=user_agent&display=popup
http://google.com/
http://google.com/
http://www.google.com/
http://www.google.com/
I tried every tutorial on the net and even bought a book on AS3 Facebook dev, and I still can't figure this out. Any help would be VERY appreciated, since this is a fairly important project to me.
Use this where you are updating the UI, for example loading the profile photo or name of the user:
String acc = session.getAccessToken();
Log.e("Access-Token", acc);
Best of luck!
