I'm researching how to do activity recognition with an Android smartwatch. Currently I'm focusing on detecting whether the user is walking or standing still. My first idea was to use the built-in step counter, but then I came across the Android Activity Recognition API (I'm relatively new to Android), which seems to be used in mobile apps only.
I'm now stuck on the following questions:
Is the current API already making use of a connected wearable device?
(e.g. automatically accessing built-in wearable sensors)
Is there a separate API available for Android Wear?
Is there any other best practice on how to use wearables for activity recognition? (especially walking and standing still)
During my research I've already tried the following things:
Reading through the Android Activity Recognition Guide
Reading through this article about Google's Activity Recognition API
Implementing a simple Android Wear app which uses the current Activity Recognition API. I tested the app on my LG G Watch without success. It seems like I can connect to the ActivityRecognitionClient, but I never receive any activity updates. I tried the same code on my Nexus 5, and everything works fine.
Reading through this post about Google Play Services. Here the author writes: "...We like the Activity Recognition API for Android Wear, as we've always thought the location tracking technology was a great backbone for this type of functionality...". So according to this, there is a separate API, right?
I would be very thankful for any helpful information. In my opinion, a cool feature (see the first question) would be to automatically detect a connected wearable device and use its sensors to improve accuracy when the phone is unsure about the user's current activity.
You ask
Is the current API already making use of a connected wearable device?
(e.g. automatically accessing built-in wearable sensors)
No, and this would not make much sense, would it? The wearable and the handheld are not always carried at the same time; the watch can be moving while the handheld is still (and vice versa). I am not sure what the value of a combined measurement would be.
Is there a separate API available for Android Wear?
Yes. Google provides a different Google Play Services library for wearables; you can see this in the compile dependencies:
compile 'com.google.android.gms:play-services-wearable:6.5.87'
vs
compile 'com.google.android.gms:play-services:6.5.87'
So when you tested the API in your Wear app, you actually imported the Play Services library meant for handhelds instead of the wearable version. The constant "ActivityRecognition.API" is not included in the wearable version of the client API.
Is there any other best practice on how to use wearables for activity
recognition? (especially walking and standing still)
One way would be to use the raw accelerometer data to detect motion. It is fairly easy to detect that the device is not moving at all; detecting anything else is not trivial.
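To make the "not moving at all" case concrete, here is a minimal sketch of the idea: keep a sliding window of accelerometer magnitudes (computed from SensorEvent values on the device) and call the device "still" when the standard deviation over the window drops below a threshold. The window size and threshold are assumptions you would tune against real recordings.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sliding-window stillness detector: feed it accelerometer magnitudes
// (sqrt(x*x + y*y + z*z), in m/s^2) and it reports whether the device
// looks motionless. Window size and threshold are tuning assumptions.
public class StillnessDetector {
    private final Deque<Double> window = new ArrayDeque<>();
    private final int windowSize;
    private final double stdDevThreshold; // m/s^2

    public StillnessDetector(int windowSize, double stdDevThreshold) {
        this.windowSize = windowSize;
        this.stdDevThreshold = stdDevThreshold;
    }

    public void addMagnitude(double magnitude) {
        window.addLast(magnitude);
        if (window.size() > windowSize) {
            window.removeFirst(); // keep only the most recent samples
        }
    }

    public boolean isStill() {
        if (window.size() < windowSize) {
            return false; // not enough data yet
        }
        double mean = 0;
        for (double m : window) mean += m;
        mean /= window.size();
        double var = 0;
        for (double m : window) var += (m - mean) * (m - mean);
        var /= window.size();
        return Math.sqrt(var) < stdDevThreshold;
    }
}
```

On the watch you would feed this from a SensorEventListener registered for TYPE_ACCELEROMETER; a still device shows magnitudes clustered tightly around gravity (~9.81 m/s^2), while walking produces a much larger spread.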
You could push sensor data from the wearable to the handheld for processing there if you like. Ping me if you'd like some code showing just that. I don't want to post it since it is not relevant to the question.
My guess is that Google will bring this API to Wear devices in the future. Spending a lot of time "rolling your own" might be a risk...
Unfortunately, the Activity Recognition API is not yet implemented on Wear devices. When I tested a simple ActivityRecognitionClient example program on my Motorola Moto 360 (running "4.4W 2"), I got a message indicating that in the logcat stream.
Related
I'd like to have a single Android app on our managed devices. We want only that app to be usable on the device, with the necessary restrictions, such that:
Single use - the device will have only one app; the user can't use other apps (browsing, YouTube, or anything else).
The initial settings (notification sound, GPS always on, notification and ring volume at maximum level) can't be modified.
The user cannot power off the device.
These settings can only be changed by our servers.
I think I have 2 options:
1) Using the Samsung Knox SDK on Samsung devices,
Here's the MDM provisioning feature of Samsung Knox Standard!
2) The general Android way: Set up Single-Purpose Devices (COSU solution),
Android Developers site.
I wanted to know your views on this. If you have done either of the two, or any other way, I could use some guidelines or a path.
Thanks for reading, and please comment if I was unable to articulate the subject or it needs editing.
You can use Google's new Android Management API; it seems to suit your needs.
It is a new cloud API that allows you to manage Android devices from a server, without having to build an on-device agent (a device policy controller).
I have broad experience using Samsung's Kiosk Mode from the Knox Standard SDK (which is free) and Pro-Kiosk mode from the Knox Customization SDK (which has more functions but is not free).
So I can tell you for sure that all 4 points you mentioned can be achieved using the Knox Standard SDK.
Single purpose: https://seap.samsung.com/api-references/android-standard/reference/android/app/enterprise/kioskmode/KioskMode.html
LocationPolicy (you can turn on GPS and restrict changing): https://seap.samsung.com/api-references/android-standard/reference/android/app/enterprise/LocationPolicy.html
Yes, it is possible, but I forgot the exact implementation.
Yes, as well.
The only downside of using this SDK is:
You are tied to Samsung (which I am personally okay with, since Samsung has such market penetration that you can get service almost anywhere in the world, and in the enterprise world that is critical).
About Android's native functionality: I've never tried it.
Update March 7, 2019: I am now playing around with Device Owner; we use it for kiosk mode. It works well, including on Android devices with Nougat and earlier.
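For reference, a minimal sketch of the Device Owner kiosk approach I mentioned: whitelist the app for lock task mode, then pin the activity. This assumes the app has already been provisioned as device owner (e.g. via "adb shell dpm set-device-owner"); AdminReceiver here is a hypothetical minimal admin receiver that would also need a manifest declaration.

```java
import android.app.Activity;
import android.app.admin.DeviceAdminReceiver;
import android.app.admin.DevicePolicyManager;
import android.content.ComponentName;
import android.content.Context;

// Sketch: pin the current activity in lock task (kiosk) mode so Home and
// Recents are disabled until stopLockTask() is called.
public class KioskSetup {

    // Minimal admin receiver; must also be declared in the manifest.
    public static class AdminReceiver extends DeviceAdminReceiver {}

    public static void enterKioskMode(Activity activity) {
        DevicePolicyManager dpm = (DevicePolicyManager)
                activity.getSystemService(Context.DEVICE_POLICY_SERVICE);
        ComponentName admin = new ComponentName(activity, AdminReceiver.class);
        if (dpm.isDeviceOwnerApp(activity.getPackageName())) {
            // Only whitelisted packages may run in lock task mode.
            dpm.setLockTaskPackages(admin,
                    new String[] {activity.getPackageName()});
            activity.startLockTask();
        }
    }
}
```

When the app is the device owner and the package is whitelisted, startLockTask() pins silently without the user confirmation dialog, which is what you want for a managed single-purpose device.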
For a project we're working on, we recently bought a Moto 360 Sport (2nd gen). I'm trying to find out whether it's possible to listen to sensor events from the watch (like heart rate) in an Android app without writing a Wear app.
Basically the same as if you would listen to Location updates in your phone app but then for heart rate.
Is this even possible? Or do you always need a phone app plus a companion Wear app to get the watch's sensor values? I haven't found any examples of this.
You can access Fit's data with the Google account from your phone, I think. All data is synchronized with the user's account.
Try checking GoogleApiClient.
As @Leo said, you can use Google Fit to get the heart rate. You can use the Google Fit app on Android Wear devices as well as on other Android devices such as smartphones. You'll get the most out of Google Fit on your Android Wear watch when you pair it with your Android mobile phone or tablet. You can also use Fit only on your Android Wear watch, but you won't be able to see as much information or track as much of your activity.
Here's the documentation on how to use Fit on Android Wear devices.
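As a hedged sketch of what reading heart rate on the phone might look like with the GoogleApiClient-based Fit Sensors API that was current at the time (verify the method names against the current docs): it assumes a connected GoogleApiClient built with Fitness.SENSORS_API and the required body-sensor permissions already granted.

```java
// Sketch: subscribe to live heart-rate data through the Fit Sensors API
// on the handheld. "googleApiClient" is assumed to be a connected client
// built with Fitness.SENSORS_API.
Fitness.SensorsApi.add(
        googleApiClient,
        new SensorRequest.Builder()
                .setDataType(DataType.TYPE_HEART_RATE_BPM)
                .setSamplingRate(5, TimeUnit.SECONDS)
                .build(),
        new OnDataPointListener() {
            @Override
            public void onDataPoint(DataPoint dataPoint) {
                // Each data point carries the BPM value recorded by the
                // watch and synced through the user's Google account.
                float bpm = dataPoint.getValue(Field.FIELD_BPM).asFloat();
            }
        });
```

Note that this reads whatever Fit has collected/synced, so the watch still needs Fit running on it; it just spares you writing and deploying your own Wear companion app.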
I have been doing some heavy research into the field of Visual Search, and I tried the technologies from Google (Goggles), Amazon (Firefly), and other vendors.
I can say that Firefly is actually the best, because of its instant identification (no need to snap a photo and send it to a server for processing); plus, it's able to identify products accurately without having to scan their barcode, which is fascinating.
The thing is, Amazon exposed the Firefly SDK only for their phone's Fire OS, and you can't use it for other Android development.
However, I am pretty sure this is not a hardware limitation, because Amazon has an app called Flow, which runs on Android and iOS and uses the same identification technology, so I am sure any camera can be used, not just the one on the Fire phone.
Does anyone know if it's possible to use the Firefly SDK somehow on Android? I know this might be impossible without some sort of reverse engineering for FireOS, but even so at least it would be technically possible!
Thanks in advance for your response.
I have a quick question about a new app I'm trying to develop on Android. As the title says, I want to use Bluetooth to link up two Bluetooth-capable devices and, when they reach a specific maximum range (5-20 ft), have the phone play some sort of alarm.
I do not, however, own an Android phone; I am instead developing on the emulator on my computer. Earlier, I discovered that the emulator is not Bluetooth-capable, nor does it have any form of sound.
Is there any way of continuing besides getting a real phone? Any helpful workarounds are welcome.
Regarding the emulator's capabilities: you'd better buy a device soon... duplicate?
And regarding the alarm triggering, I'm a bit curious: what are you relying on? SNR? Is there an easy way to access this kind of data on Android? (I've only had a glance at the reference pages.)
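To make the range question concrete: Android exposes the signal strength (RSSI, in dBm) of a discovered device via the BluetoothDevice.EXTRA_RSSI extra in ACTION_FOUND broadcasts, and a rough distance can be derived with the log-distance path-loss model. A minimal, calibration-dependent sketch of that model follows; the txPower (expected RSSI at 1 m) and path-loss exponent are assumptions you would tune per device and environment.

```java
// Rough log-distance path-loss estimate: distance in meters from an RSSI
// reading. txPowerAt1m is the expected RSSI at 1 m; pathLossExponent is
// ~2.0 in free space, higher indoors. Both need per-device calibration.
public class RssiRange {
    public static double estimateDistanceMeters(int rssi, int txPowerAt1m,
                                                double pathLossExponent) {
        return Math.pow(10.0, (txPowerAt1m - rssi) / (10.0 * pathLossExponent));
    }

    // True while the peer still appears within maxMeters; the alarm would
    // fire when this flips to false (or the device stops being discovered).
    public static boolean withinRange(int rssi, int txPowerAt1m,
                                      double pathLossExponent, double maxMeters) {
        return estimateDistanceMeters(rssi, txPowerAt1m, pathLossExponent) <= maxMeters;
    }
}
```

Be warned that RSSI is noisy: at 5-20 ft the readings fluctuate a lot with orientation and obstacles, so you would want to smooth over several samples before triggering anything.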
I am trying to use speech recognition on a Samsung Galaxy S phone (as I know the emulator doesn't have the intent). The Galaxy S has an Android 2.1 ROM. For some reason I get that the package does not exist on the device, which doesn't make sense, because other apps (Google Maps, Voice Dialer, etc.) are clearly using it.
Does anyone have any ideas on how I can get this to work?
The code is more or less the same as google's example (http://developer.android.com/resources/articles/speech-input.html).
As a further note, I found this thread, which seems to indicate that the srec library is randomly missing on some devices:
http://groups.google.com/group/android-discuss/browse_thread/thread/2a53ec01bdff8e67
Is there a way I can do this manually (i.e. contact Google's SOAP API for speech recognition)? Alternatively, can I just copy the srec source code from somewhere and put it directly into my project?
Thanks.
I'm not a total expert on this, but I do know the actual recognition task is performed on remote Google servers (the voice recording is sent out). A speech engine itself requires a significant amount of memory and computing power. On the device itself, it is only possible to perform limited grammar tasks (for example, "call XXX").
Maybe the device manufacturers/operators don't have an agreement with Google?
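One practical mitigation, in line with Google's speech-input article linked in the question: check whether a recognizer is installed before firing the intent, so the app can degrade gracefully on devices where the package is missing. A minimal sketch (the helper class name is mine):

```java
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.speech.RecognizerIntent;
import java.util.List;

// Ask the package manager whether any activity can handle the speech
// recognition intent; on devices missing the recognizer package the query
// returns an empty list and the voice feature should be disabled.
public class SpeechCheck {
    public static boolean isSpeechRecognitionAvailable(PackageManager pm) {
        List<ResolveInfo> activities = pm.queryIntentActivities(
                new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
        return !activities.isEmpty();
    }
}
```

Calling this from an Activity with getPackageManager() before showing the microphone button avoids the "package does not exist" failure at runtime.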