I am in the midst of developing a mobile application using Xamarin.Forms. The app connects to a BLE device which transmits 16 bytes of data every 100 ms. I am plotting the data with Syncfusion in a bar chart format.
I can connect to the device and receive data without issues. But after a short time, the app's performance degrades significantly, and soon after it stalls completely. Obviously I am doing something wrong in handling the incoming data (unless it is a performance issue with the Syncfusion chart).
In a nutshell, this is the process I go through in the app
Pair to the device (outside of the app)
Connect to the device (in the app)
Set up the transmission
Process the incoming data via a Model called SpectrogramModel
Graph the data with Syncfusion in a View called DataPage, which is bound to a ViewModel called DataViewModel
Getting into the nitty-gritty of it all: after pairing and connecting to the device, the following method is called. Could it be the Device.BeginInvokeOnMainThread() call which eventually starts blocking the app? This method is called from a Connection class, which has a reference to the DataViewModel.
private void UpdateSpectrogramChart(object sender, EventArgs e)
{
DebugHelper.Message(Type.Method, "UpdateSpectrogramChart");
_characteristic.ValueUpdated += (o, args) =>
{
var raw = args.Characteristic.Value;
for (int i = 0; i < raw.Length; i++)
{
Debug.WriteLine("Level[{0}] = {1}", i, raw[i]);
}
Xamarin.Forms.Device.BeginInvokeOnMainThread(() =>
{
DataPageViewModel.Levels.Clear();
for (int i = SpectrogramModel.FrequencyOffset; i < raw.Length; i++)
{
if (SettingsViewModel.IsViewRawData)
{
DataPageViewModel.Title = "Raw data";
DataPageViewModel
.Levels
.Add(
new SpectrogramModel(
raw[i],
1 + (i - SpectrogramModel.FrequencyOffset))
);
}
if (SettingsViewModel.IsViewProcessedData)
{
DataPageViewModel.Title = "Processed data";
DataPageViewModel
.Levels
.Add(
new SpectrogramModel(
raw[i],
1 + (i - SpectrogramModel.FrequencyOffset),
i));
}
}
});
};
}
The SpectrogramModel looks like this
public class SpectrogramModel
{
public SpectrogramModel(byte level, int frequency)
{
Level = level;
Frequency = frequency;
}
public SpectrogramModel(byte level, int frequency, int index) : this(level, frequency)
{
Level = ProcessRawLevel(level, index);
}
private double ProcessRawLevel(byte b, int index)
{
double multiplier = 0.75;
double val = b;
val *= multiplier;
return val;
}
public static readonly int FrequencyOffset = 4;
...
The DataPage looks like this
<chart:SfChart>
<chart:SfChart.Title>
<chart:ChartTitle
Text="{Binding Title}">
</chart:ChartTitle>
</chart:SfChart.Title>
<chart:SfChart.PrimaryAxis>
<chart:CategoryAxis>
</chart:CategoryAxis>
</chart:SfChart.PrimaryAxis>
<chart:SfChart.SecondaryAxis>
<chart:NumericalAxis
Minimum="20"
Maximum="100">
</chart:NumericalAxis>
</chart:SfChart.SecondaryAxis>
<chart:SfChart.Series>
<chart:ColumnSeries ItemsSource="{Binding Levels}" XBindingPath="Frequency" YBindingPath="Level"/>
</chart:SfChart.Series>
</chart:SfChart>
Finally, the DataViewModel to which the DataPage is bound:
public class DataViewModel : BaseViewModel
{
public DataViewModel()
{
Init();
}
private void Init()
{
Levels = new ObservableCollection<SpectrogramModel>();
for (int i = 0; i < 16; i++) Levels.Add(new SpectrogramModel(20, i));
}
private ObservableCollection<SpectrogramModel> _levels;
public ObservableCollection<SpectrogramModel> Levels
{
get { return _levels; }
set
{
_levels = value;
OnPropertyChanged();
}
}
private string _title;
public string Title
{
get { return _title; }
set
{
_title = value;
OnPropertyChanged();
}
}
}
It should be noted that the UpdateSpectrogramChart() is wrapped in a timer, which looks like this
public void InitTimers()
{
DebugHelper.Message(Type.Method, "InitTimers");
int SECOND = 1000 * 2; // note: despite the name, this is 2000 ms
SpectrogramChartTimer = new Timer();
SpectrogramChartTimer.Elapsed += new ElapsedEventHandler(UpdateSpectrogramChart);
SpectrogramChartTimer.Interval = SECOND;
}
I wrapped the call to the UpdateSpectrogramChart() method in this timer in a (clearly) failed attempt to reduce the performance degradation.
For completeness' sake, here is the body of the method which sets up receiving from the BLE device:
public async Task ReceiveFromGattCharacteristic(string service, string characteristic, string descriptor = null)
{
DebugHelper.Message(Type.Method, "ReceiveFromGattCharacteristic");
bleAdapter.DeviceConnected += async (s, e) =>
{
try
{
DebugHelper.Message(Type.Info, "bleAdapter.DeviceConected += async (s, e) ...");
string[] deviceInfo = { e.Device.Name, e.Device.Id.ToString() };
// Connect to service
try
{
DebugHelper.Message(Type.Info, "Connecting to service...");
_service = await e.Device.GetServiceAsync(Guid.Parse(service));
DebugHelper.Message(Type.Info, "OK");
}
catch (Exception)
{
DebugHelper.Error(ErrorType.GATT, "Could not connect to service");
}
// Connect to characteristic
try
{
DebugHelper.Message(Type.Info, "Connecting to characteristic...");
_characteristic = await _service.GetCharacteristicAsync(Guid.Parse(characteristic));
DebugHelper.Message(Type.Info, "OK");
}
catch (Exception)
{
DebugHelper.Error(ErrorType.GATT, "Could not connect to characteristic");
}
await ConfigureSpectrogram(UpdateFrequency.High, 0x1);
try
{
await _characteristic.StartUpdatesAsync();
}
catch
{
DebugHelper.Error(ErrorType.GATT, "Error starting UpdatesAsync");
}
_characteristic.ValueUpdated += (o, args) =>
{
var raw = args.Characteristic.Value;
for (int i = 4; i < raw.Length; i++)
{
Debug.WriteLine("Level[{0}] = {1}", i - 4, raw[i]);
}
};
}
catch (Exception)
{
DebugHelper.Error(ErrorType.GATT, "Error in ReceiveFromGattCharacteristic");
}
};
}
Well, I am not sure if this really qualifies as an answer, but I seem to have solved the problem, although I can't say for sure why.
After fiddling with a BackgroundWorker, which introduced even more errors (probably because I am no expert in its usage), I revised the code and moved the update of the Model and the View directly into the ReceiveFromGattCharacteristic() method, instead of updating them in a separate method, as follows:
public void ReceiveFromGattCharacteristic(string service, string characteristic, string descriptor = null)
{
DebugHelper.Message(Type.Method, "ReceiveFromGattCharacteristic");
bleAdapter.DeviceConnected += async (s, e) =>
{
try
{
DebugHelper.Message(Type.Info, "bleAdapter.DeviceConected += async (s, e) ...");
string[] deviceInfo = { e.Device.Name, e.Device.Id.ToString() };
// Connect to service
try
{
DebugHelper.Message(Type.Info, "Connecting to service...");
_service = await e.Device.GetServiceAsync(Guid.Parse(service));
DebugHelper.Message(Type.Info, "OK");
}
catch (Exception)
{
DebugHelper.Error(ErrorType.GATT, "Could not connect to service");
}
// Connect to characteristic
try
{
DebugHelper.Message(Type.Info, "Connecting to characteristic...");
_characteristic = await _service.GetCharacteristicAsync(Guid.Parse(characteristic));
DebugHelper.Message(Type.Info, "OK");
}
catch (Exception)
{
DebugHelper.Error(ErrorType.GATT, "Could not connect to characteristic");
}
await ConfigureSpectrogram(UpdateFrequency.High, 0x1);
try
{
await _characteristic.StartUpdatesAsync();
}
catch
{
DebugHelper.Error(ErrorType.GATT, "Error starting UpdatesAsync");
}
// ADDITION
_characteristic.ValueUpdated += (o, args) =>
{
var raw = args.Characteristic.Value;
Xamarin.Forms.Device.BeginInvokeOnMainThread(() =>
{
DataPageViewModel.Levels.Clear();
for (int i = Models.Spectrogram.FrequencyOffset; i < raw.Length; i++)
{
if (SettingsViewModel.IsViewRawData)
{
DataPageViewModel.Title = "Raw data";
DataPageViewModel
.Levels
.Add(
new Models.Spectrogram(
raw[i],
1 + (i - Models.Spectrogram.FrequencyOffset))
);
}
if (SettingsViewModel.IsViewProcessedData)
{
DataPageViewModel.Title = "Processed data";
DataPageViewModel
.Levels
.Add(
new Models.Spectrogram(
raw[i],
1 + (i - Models.Spectrogram.FrequencyOffset),
i));
}
}
});
};
}
// END OF ADDITION
catch (Exception)
{
DebugHelper.Error(ErrorType.GATT, "Error in ReceiveFromGattCharacteristic");
}
};
}
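Looking back at it, my best guess at the cause: in the original version, UpdateSpectrogramChart() was invoked by the timer every interval, and every invocation attached yet another anonymous ValueUpdated handler to the characteristic. Each BLE notification then ran all of the accumulated handlers, each of which cleared and repopulated Levels on the UI thread, so the work per notification kept growing until the UI stalled. In the revised version the handler is attached exactly once, when the device connects. A minimal sketch of how the timer-driven version could have been made safe instead (untested; the event-args type name assumes the Plugin.BLE library):
// Sketch only: a named handler, so repeated timer ticks cannot stack subscriptions.
private void UpdateSpectrogramChart(object sender, EventArgs e)
{
    // Detaching first is a no-op if the handler is not attached yet,
    // so after this pair of lines the handler is subscribed exactly once.
    _characteristic.ValueUpdated -= OnCharacteristicValueUpdated;
    _characteristic.ValueUpdated += OnCharacteristicValueUpdated;
}

private void OnCharacteristicValueUpdated(object sender, CharacteristicUpdatedEventArgs args)
{
    var raw = args.Characteristic.Value;
    Xamarin.Forms.Device.BeginInvokeOnMainThread(() =>
    {
        // ...same Levels.Clear()/Add() update as above...
    });
}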
We would like to let you know that some SfChart configuration needs to be considered when working with a large amount of data, in order to improve performance.
Make use of SuspendSeriesNotification and ResumeSeriesNotification.
By using the SuspendSeriesNotification and ResumeSeriesNotification methods, we can stop the chart from being updated for each modification of the items source collection.
Xamarin.Forms.Device.BeginInvokeOnMainThread(() =>
{
DataPageViewModel.Levels.Clear();
Chart.SuspendSeriesNotification();
for (int i = SpectrogramModel.FrequencyOffset; i < raw.Length; i++)
{
if (SettingsViewModel.IsViewRawData)
{
DataPageViewModel.Title = "Raw data";
DataPageViewModel
.Levels
.Add(
new SpectrogramModel(
raw[i],
1 + (i - SpectrogramModel.FrequencyOffset))
);
}
if (SettingsViewModel.IsViewProcessedData)
{
DataPageViewModel.Title = "Processed data";
DataPageViewModel
.Levels
.Add(
new SpectrogramModel(
raw[i],
1 + (i - SpectrogramModel.FrequencyOffset),
i));
}
}
Chart.ResumeSeriesNotification();
});
Avoid use of Category Axis.
We noticed that you are using a category axis with a column series. We always suggest using a numeric or date-time axis with a fast line series to get better performance. If you really need the category axis and column series, please let us know how much data your chart will load, or the reason for using the category axis.
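For illustration, here is a rough C# sketch of what that suggestion could look like (untested), keeping the same bindings as the DataPage above; NumericalAxis and FastLineSeries are the SfChart types referred to:
// Sketch: numeric X axis plus a fast line series instead of CategoryAxis + ColumnSeries.
var chart = new SfChart();
chart.PrimaryAxis = new NumericalAxis();
chart.SecondaryAxis = new NumericalAxis { Minimum = 20, Maximum = 100 };
chart.Series.Add(new FastLineSeries
{
    ItemsSource = DataPageViewModel.Levels, // the same bound collection of SpectrogramModel items
    XBindingPath = "Frequency",
    YBindingPath = "Level"
});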
For more tips on getting better performance from SfChart, please read the blog below.
https://blog.syncfusion.com/post/7-tips-to-optimize-xamarin-charts-performance.aspx#comment-10677
Regards,
Bharathi.
Related
I am very new to Android, as I got a project for maintenance. I completed other parts of the project like authentication, token setting, etc. It uses face recognition to identify the person. Previously it was working fine: it took images, trained with them, and recognized the person (obviously not done by me :)). Now it throws this error:
Add Person ActivityCvException [org.opencv.core.CvException:
cv::Exception:
/build/master_pack-android/opencv/modules/core/src/matrix.cpp:1047:
error: (-13) The matrix is not continuous, thus its number of rows can
not be changed in function cv::Mat cv::Mat::reshape(int, int) const
Code sample is as follows
public void training() {
Thread thread;
try{
PreferenceManager.setDefaultValues(getApplicationContext(), R.xml.preferences, false);
}catch (Exception e){
AddPersonActivity.this.runOnUiThread(new Runnable() {
public void run() {
WriteLog("Add Person Activity" +e.fillInStackTrace());
errorAlert("Add Person Activity" +e.fillInStackTrace());
VolleyHelper.progressDialog.dismiss();
}
});
}
WriteLog("training 1 ");
final Handler handler = new Handler(Looper.getMainLooper());
thread = new Thread(new Runnable() {
public void run() {
if (!Thread.currentThread().isInterrupted()) {
try {
WriteLog("training 2 ");
PreProcessorFactory ppF = new PreProcessorFactory(AddPersonActivity.this);
PreferencesHelper preferencesHelper = new PreferencesHelper(AddPersonActivity.this);
String algorithm = preferencesHelper.getClassificationMethod();
FileHelper fileHelper = new FileHelper();
fileHelper.createDataFolderIfNotExsiting();
final File[] persons = fileHelper.getTrainingList();
if (persons.length > 0) {
Recognition rec = RecognitionFactory.getRecognitionAlgorithm(getApplicationContext(), Recognition.TRAINING, algorithm);
for (File person : persons) {
if (person.isDirectory()) {
File[] files = person.listFiles();
int counter = 1;
for (File file : files) {
if (FileHelper.isFileAnImage(file)) {
Mat imgRgb = Imgcodecs.imread(file.getAbsolutePath());
Imgproc.cvtColor(imgRgb, imgRgb, Imgproc.COLOR_BGRA2RGBA);
Mat processedImage = new Mat();
imgRgb.copyTo(processedImage);
List<Mat> images = ppF.getProcessedImage(processedImage, PreProcessorFactory.PreprocessingMode.RECOGNITION);
if (images == null || images.size() > 1) {
continue;
} else {
processedImage = images.get(0);
}
if (processedImage.empty()) {
continue;
}
String[] tokens = file.getParent().split("/");
final String name = tokens[tokens.length - 1];
for (int i = 0; i < files.length; i++) {
File myfile = new File(person +
"\\" + files[i].getName());
String long_file_name = files[i].getName();
System.out.println(long_file_name);
System.out.print(long_file_name);
myfile.renameTo(new File(person +
"\\" + long_file_name + "_101" + ".png"));
}
WriteLog("training 3 ");
MatName m = new MatName("processedImage", processedImage);
fileHelper.saveMatToImage(m, FileHelper.DATA_PATH);
rec.addImage(processedImage, name, false);
counter++;
}
}
}
}
try {
if (rec.train()) {
if (zipFileAtPath("/storage/emulated/0/Pictures/facerecognition/training/" + lcode, "/storage/emulated/0/Pictures/facerecognition/data/SVM/" + lcode + ".zip")) {
WriteLog("training 4 ");
if (zipFileAtPath("/storage/emulated/0/Pictures/facerecognition/data/SVM", "/storage/emulated/0/Pictures/facerecognition/" + "SVM_" + lcode + ".zip")) {
WriteLog("training 5 ");
fileupload(getintent.getStringExtra("lcode"));
} else {
Toast.makeText(getApplicationContext(), "No Face Recognised", Toast.LENGTH_SHORT).show();
}
} else {
Toast.makeText(getApplicationContext(), "No Face Recognised", Toast.LENGTH_SHORT).show();
}
} else {
Toast.makeText(getApplicationContext(), "Try Again", Toast.LENGTH_SHORT).show();
}
} catch (Exception e) {
WriteLog("Add Person Activity" +e.fillInStackTrace());
errorAlert("Add Person Activity" +e.fillInStackTrace());
}
handler.post(new Runnable() {
@Override
public void run() {
}
});
} else {
Thread.currentThread().interrupt();
}
} catch (Exception e) {
AddPersonActivity.this.runOnUiThread(new Runnable() {
public void run() {
VolleyHelper.progressDialog.dismiss();
WriteLog("Add Person Activity" +e.fillInStackTrace());
errorAlert("Add Person Activity" +e.fillInStackTrace());
}
});
}
}
}
});
thread.start();
}
The input image needs to be stored as a continuous block of bytes in memory. I ran into a similar situation, particularly because Android (at least the version I tried) does not seem to store images contiguously. 1) Check whether the image is continuous using inputImage.isContinuous(); it returns a bool, which should be true. 2) If the image is not continuous, you can create a copy that is by cloning the image via inputImage.clone(). Behind the scenes this creates a deep copy of the input using the create method, which should guarantee that the new matrix is continuous.
//check image before continuing:
if ( !inputImage.isContinuous() ){
//if the image is not continuous, create a deep copy:
inputImage = inputImage.clone();
}
I'm developing an Android application, and I have to implement a function that discovers all the hosts on the network I'm connected to (for example, Wi-Fi).
I implemented a function that works; this is my code:
public class ScanNetwork {
private static final int NB_THREADS = 10;
private ArrayList < String > hosts;
public ArrayList < String > ScanNetwork(String ipAddress) {
hosts = new ArrayList <> ();
String subnet = ipAddress.substring(0, ipAddress.lastIndexOf("."));
ExecutorService executor = Executors.newFixedThreadPool(NB_THREADS);
for (int dest = 0; dest < 255; dest++) {
String host = subnet + "." + dest;
executor.execute(pingRunnable(host));
}
executor.shutdown();
try {
executor.awaitTermination(60 * 1000, TimeUnit.MILLISECONDS);
} catch (InterruptedException ignored) {}
return hosts;
}
private Runnable pingRunnable(final String host) {
return new Runnable() {
public void run() {
try {
InetAddress inet = InetAddress.getByName(host);
boolean reachable = inet.isReachable(1000);
if (reachable) {
// guard the shared list: multiple pool threads may add concurrently
synchronized (hosts) {
hosts.add(host);
}
}
} catch (UnknownHostException e) {
System.out.println("Log: " + e);
} catch (IOException e) {
System.out.println("Log: " + e);
}
}
};
}
}
This code works for certain IP addresses, such as 192.168.1.10, but not for others, like 10.1.25.1.
The question is simple: how can I implement a function that discovers all hosts on a given network, considering all kinds of IP addresses?
I have 500 records in a Parse class (table), and now I need to get 10 random records out of those 500.
Please tell me how I can do this.
ParseQuery<ParseObject> query = ParseQuery.getQuery(parseTableName);
query.setLimit(10);
query.findInBackground(new FindCallback<ParseObject>() {
@Override
public void done(List<ParseObject> parseObjects, com.parse.ParseException e) {
}
});
The best way is probably to write a CloudCode module that downloads 500 objects, then randomly selects 10 to send down to your Android app in the response. That's much better than downloading 500 objects to your device and choosing 10.
It's been a while since I've written CloudCode, but you could do something like the following.
in iOS app (you can do a little work to find the Android equivalent):
[PFCloud callFunctionInBackground:@"get500obj" withParameters:@{} block:^(id result, NSError *error) {
// do something with result
}];
in CloudCode (this should be treated as pseudocode as it's untested):
Parse.Cloud.define('get500obj', function(request, response) {
// for getting random element
Array.prototype.randomElement = function () {
return this[Math.floor(Math.random() * this.length)]
}
var query = new Parse.Query("your-object-class-name");
query.find({
success: function(results) {
var final10 = [];
for (var i = 0; i < 10; i++) {
var myRandomElement = results.randomElement()
if (final10.indexOf(myRandomElement) == -1) {
final10.push(myRandomElement);
} else {
i--;
}
}
response.success(final10);
},
error: function(error) {
response.error(error);
}
});
});
Here is an Objective C code sample.
PFQuery *query = [PFQuery queryWithClassName:@"MyTable"];
int count = [query countObjects];
int r = arc4random_uniform(count);
PFQuery *query2 = [PFQuery queryWithClassName:@"MyTable"];
query2.skip = r;
PFObject *result = [query2 getFirstObject];
This is working code for fetching random objects from a given class in Parse.
In cloud code,
Parse.Cloud.define("get10Obj",function(request,response)
{
var query = new Parse.Query(request.params.movie);
// returns a random index in the range [min, max)
Array.prototype.randomElement = function (min, max) {
return Math.floor(Math.random() * (max - min)) + min;
}
query.find ({
success: function(results) {
var final10 = [];
for (var i = 0; i < 10; i++) {
var myRandomElement = results.randomElement(0,results.length);
if (final10.indexOf(myRandomElement) == -1) {
final10.push(myRandomElement);
} else {
i--;
}
}
var datalist =[];
for(var j=0;j<final10.length;j++)
{
var index= final10[j];
datalist.push(results[index]);
}
response.success(datalist);
},
error: function(error) {
response.error(error);
}
});
});
In Android,
HashMap<String, Object> params = new HashMap<String, Object>();
params.put("movie", "The Matrix");
ParseCloud.callFunctionInBackground("get10Obj", params, new
FunctionCallback<List<ParseObject>>() {
public void done(List<ParseObject> objects, ParseException e) {
if (e == null) {
// Do your stuff
}
}
});
Thanks to st.derrick for the biggest hint.
So, after finally getting my head around the Xamarin.Forms DependencyService, I have nearly got it returning the device's current location.
my interface
public interface ICurrentLocation
{
MyLocation SetCurrentLocation();
}
MyLocation
public class MyLocation
{
public double Latitude {get; set;}
public double Longitude{get; set;}
}
the line that calls it
MyLocation location = DependencyService.Get<ICurrentLocation>().SetCurrentLocation();
and in the CurrentLocation class in the Android project, which uses the Geolocator class from Xamarin.Mobile:
[assembly: Dependency(typeof(CurrentLocation))]
namespace MyCockburn.Droid
{
public class CurrentLocation : Activity, ICurrentLocation
{
Geolocator locator;
Position position = new Position();
MyLocation location;
public MyLocation SetCurrentLocation()
{
GetPosition();
location = new MyLocation()
{
Latitude = position.Latitude,
Longitude = position.Longitude
};
return location;
}
async void GetPosition()
{
try
{
locator = new Geolocator(this) { DesiredAccuracy = 50 };
if (locator.IsListening != true)
locator.StartListening(minTime: 1000, minDistance: 0);
position = await locator.GetPositionAsync(timeout: 20000);
}
catch (Exception e)
{
Log.Debug("GeolocatorError", e.ToString());
}
}
}
}
My problem seems to be that it is returning location before position holds the longitude and latitude.
I am hoping my mistake is glaringly obvious
EDIT: the code works if I run it as a normal Android Activity
I would make a slight modification, since best practice is to either go all async or not at all. When you try to return the result of an async method from a non-async method, you can run into deadlock problems. Also, since you aren't awaiting the GetPosition call, you have no way of knowing when the operation has completed. I suggest slightly modifying your code as follows.
public interface ICurrentLocation {
Task<MyLocation> GetCurrentLocation();
}
public async Task<MyLocation> GetCurrentLocation()
{
var position = await GetPosition();
return new MyLocation()
{
Latitude = position.Latitude,
Longitude = position.Longitude
};
}
async Task<Position> GetPosition()
{
try
{
locator = new Geolocator(this) { DesiredAccuracy = 50 };
if (locator.IsListening != true)
locator.StartListening(minTime: 1000, minDistance: 0);
return await locator.GetPositionAsync(timeout: 20000);
}
catch (Exception e)
{
Log.Debug("GeolocatorError", e.ToString());
return null; // every path must return a value now that the method returns Task<Position>
}
}
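With the interface made asynchronous, the call site shown in the question changes accordingly (a sketch, assuming it is called from an async method):
// The DependencyService call is now awaited rather than returning immediately.
MyLocation location = await DependencyService.Get<ICurrentLocation>().GetCurrentLocation();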
You aren't waiting for the position function to finish. There are many different options, and keeping it async is the best one, but if you want it synchronous then try this blocking call:
void GetPosition()
{
try
{
locator = new Geolocator(this) { DesiredAccuracy = 50 };
if (locator.IsListening != true)
locator.StartListening(minTime: 1000, minDistance: 0);
position = locator.GetPositionAsync(timeout: 20000).Result;
}
catch (Exception e)
{
Log.Debug("GeolocatorError", e.ToString());
}
}
I also recommend taking a look at Xamarin.Forms.Labs, as it already has an abstracted GPS service and a working sample that is functional on all 3 platforms:
https://github.com/XForms/Xamarin-Forms-Labs
Try adding the assembly attribute above the namespace and awaiting the GetPosition method.
Take a look at this image:
http://developer.xamarin.com/guides/cross-platform/xamarin-forms/dependency-service/Images/solution.png
I developed an app that works fine at getting the GPS location. I believe the code below will be of great help.
You can then edit the SubmitGPSLocation function to your preference.
public async Task Run(CancellationToken token)
{
await Task.Run(async () =>
{
if (GPSService.Instance.IsListening)
{
GPSService.Instance.StopListening();
}
GPSService.Instance.StartListening(2500, 50, true);
GPSService.Instance.PositionChanged += Instance_PositionChanged;
System.Diagnostics.Debug.WriteLine(getRunningStateLocationService());
while (getRunningStateLocationService())
{
token.ThrowIfCancellationRequested();
await Task.Delay(500).ConfigureAwait(true);
}
GPSService.Instance.StopListening();
//await CrossGeolocator.Current.StopListeningAsync().ConfigureAwait(true);
GPSService.Instance.PositionChanged -= Instance_PositionChanged;
return;
}, token).ConfigureAwait(false);
}
private void Instance_PositionChanged(object sender, PositionEventArgs e)
{
try
{
isEvenCount = !isEvenCount;
if (e.Position != null)
{
var message = new LocationMessage
{
Latitude = e.Position.Latitude,
Longitude = e.Position.Longitude,
Accuracy = e.Position.Accuracy,
Speed = e.Position.Speed,
Heading = e.Position.Heading,
TimeStamp = e.Position.Timestamp.DateTime
};
SubmitGPSLocation(e).ConfigureAwait(false);
}
else
{
CrossToastPopUp.Current.ShowToastMessage("Failed to get GPS location");
}
}
catch (Exception ex)
{
CrossToastPopUp.Current.ShowToastMessage(ex.Message);
}
}
private static async Task SubmitGPSLocation(PositionEventArgs e)
{
if (!NetworkCheck.IsInternet())
{
return;
}
if (!int.TryParse(App.PhoneID, out var number))
{
return;
}
try
{
var thetrackers = Convert.ToString(Application.Current.Properties["AuthorizedTrackers"]);
GeneralUserPhoneLocation MyGeneralUserPhoneLocation = new GeneralUserPhoneLocation();
MyGeneralUserPhoneLocation.PhoneID = int.Parse(App.PhoneID);
MyGeneralUserPhoneLocation.Latitude = e.Position.Latitude.ToString("n6");
MyGeneralUserPhoneLocation.Longitude = e.Position.Longitude.ToString("n6");
MyGeneralUserPhoneLocation.Accuracy = e.Position.Accuracy;
MyGeneralUserPhoneLocation.Heading = e.Position.Heading;
MyGeneralUserPhoneLocation.Speed = e.Position.Speed;
MyGeneralUserPhoneLocation.TimeStamp = e.Position.Timestamp.DateTime;
MyGeneralUserPhoneLocation.RequestType = "N";
MyGeneralUserPhoneLocation.Comment = thetrackers;
string servicestring = JsonConvert.SerializeObject(MyGeneralUserPhoneLocation);
HttpContent theusercontent = new StringContent(servicestring, Encoding.UTF8, "application/json");
using (HttpClient client = new HttpClient())
{
client.BaseAddress = new Uri("https://mygpswebapi.com");
var response = await client.PostAsync("Home/SaveGeneralUserPhoneLocationAPP/", theusercontent).ConfigureAwait(true);
if (response.IsSuccessStatusCode)
{
}
else
{
}
}
}
catch (Exception ex)
{
CrossToastPopUp.Current.ShowToastMessage(ex.Message);
}
}
This is regarding the usage of the Cordova plugin "me.apla.cordova.app-preferences", which is used for saving values to and retrieving them from application preferences (SharedPreferences on Android and NSUserDefaults on iOS). While trying to fetch a previously saved value, sometimes the success callback or the failure callback is not fired, on both iOS and Android. But sometimes it fires successfully. It's strange behavior. Here is the snippet of the JS code that I've used for retrieving a value using the plugin:
var prefs = window.plugins.appPreferences; prefs.fetch(prefReadSucess, prefReadFailed, 'key');
function prefReadSucess(value) {
// handle success call back
}
function prefReadFailed(error) {
// handle failure callback
}
The Java code for the plugin is:
private boolean fetchValueByKey(final String key, final CallbackContext callbackContext) {
cordova.getThreadPool().execute(new Runnable() {public void run() {
SharedPreferences sharedPrefs = PreferenceManager.getDefaultSharedPreferences(cordova.getActivity());
String returnVal = null;
if (sharedPrefs.contains(key)) {
Object obj = sharedPrefs.getAll().get(key);
String objClass = obj.getClass().getName();
if (objClass.equals("java.lang.Integer")) {
returnVal = obj.toString();
} else if (objClass.equals("java.lang.Float") || objClass.equals("java.lang.Double")) {
returnVal = obj.toString();
} else if (objClass.equals("java.lang.Boolean")) {
returnVal = (Boolean)obj ? "true" : "false";
} else if (objClass.equals("java.lang.String")) {
if (sharedPrefs.contains("_" + key + "_type")) {
// here we have json encoded string
returnVal = (String)obj;
} else {
String fakeArray = null;
try {
fakeArray = new JSONStringer().array().value((String)obj).endArray().toString();
} catch (JSONException e) {
// TODO Auto-generated catch block
e.printStackTrace();
callbackContext.error(0);
return;
}
returnVal = fakeArray.substring(1, fakeArray.length()-1);
// returnVal = new JSONStringer().value((String)obj).toString();
}
} else {
Log.d("", "unhandled type: " + objClass);
}
// JSONObject jsonValue = new JSONObject((Map) obj);
callbackContext.success(returnVal);
} else {
// as shown, neither the success nor the error callback is invoked when the key is not present
}
}});
return true;
}
It works properly when I comment out this line in the Java code:
cordova.getThreadPool().execute(new Runnable() {public void run() {
I tried to print the value of returnVal just before the line,
callbackContext.success(returnVal);
I am always getting the values, but sometimes the success callback is not fired.
I've ensured that the plugin is accessed on or after the 'deviceready' event. Any kind of help is appreciated.