I want to know whether it is possible to use geofencing without any app installed. Is there some cloud service or similar where you can register your geofence area, so that anyone with location enabled can see your notifications?
I don't know about Android, but for iOS you can use Apple Wallet Passes. You can attach up to 10 locations (geo points) to your passes, and when the device that installed the pass is near one of those locations, the pass will appear on the lock screen.
Relevance Information Displays Passes on the Lock Screen
Passes let your users take some action in the real world, so accessing them needs to be easy and fast. Wallet makes relevant passes immediately accessible by integrating them with the lock screen. To take advantage of this feature, add information about where and when your pass is relevant. A pass’s relevancy can be based either on the time or the place where the user can actually utilize it. For example, a gym membership is relevant at the gym, while a boarding pass is relevant at the time the flight begins boarding.
Inside the pass, you provide a point in time and points in space. Wallet then determines whether the pass should appear on the lock screen based on these settings. It calculates when the user is close enough to the specified time and locations based on the pass’s style. Don’t try to manipulate the pass’s presence on the lock screen by providing inaccurate relevance information.
Of course, integrating with Apple Wallet requires your own web server and is not a simple task. However, it doesn't require any app to be installed, which is what you asked about.
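For illustration, the relevant locations live in the pass bundle's pass.json. A minimal sketch might look like this (the identifiers, coordinates, and text below are placeholders, and other required keys such as the pass style and barcode are omitted):
{
  "formatVersion": 1,
  "passTypeIdentifier": "pass.com.example.yourpass",
  "serialNumber": "0000001",
  "teamIdentifier": "YOUR_TEAM_ID",
  "organizationName": "Example Org",
  "description": "Example pass with location relevance",
  "locations": [
    {
      "latitude": 51.5007,
      "longitude": -0.1246,
      "relevantText": "You're near the store - show this pass at the counter."
    }
  ]
}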
I've read the Management API documentation and I don't think there's any reference to this topic; I didn't find any information about it here on SO either.
We're analyzing EMM solution providers for provisioning devices with a set of applications, trying to understand whether they cover all our needs; as a last resort we might come up with a custom implementation.
Can we control policies so that they are enforced on the devices only within a given time window? Specifically, can we control at what times application updates are allowed? I know system updates can be controlled, so I was wondering if I missed something in the docs.
We need control over that because we don't want to disrupt the UX on the devices when a new app update is available, especially in our launcher app. We have one main application running in kiosk mode, with a few other apps accessible from within that kiosk app. The UX is really important, so app updates must be seamless. Right now, updates are managed through a DO (device owner) application that handles a few scenarios, one of which is checking for available app updates, then downloading and applying them (the applications are not publicly available in the Google Play Store at the moment). When the launcher updates, the DO application takes its place on screen during the update, giving the end user custom visual feedback about what's happening, and when the update is done the kiosk app takes control of the screen again.
So imagine a user has the kiosk app open on their device and we distribute a new version of it. When the policy is enforced, is it likely that the app would be abruptly closed for updating? Would it update only when not in use? Will we need to keep custom logic to ensure a smooth update, and if so, how can we know from inside our app that an update is about to take place, since policies are enforced by a third-party application?
The systemUpdate policy can also be used to control when your apps update, within the desired window hours. You can set this up by setting the type to WINDOWED and changing startMinutes to your desired time (the start of the maintenance window, measured as the number of minutes after midnight in the device's local time; this value must be between 0 and 1439, inclusive), together with endMinutes for the end of the window.
"systemUpdate":
{
"type": "WINDOWED",
"startMinutes": 0,
"endMinutes": 1439
}
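Note that the values above span the whole day (0 to 1439 minutes), so updates could still happen at any time. For an actual maintenance window you would narrow the range; for example, a fragment along these lines should restrict updates to roughly 02:00-04:00 local time (the times are just an illustration):
"systemUpdate": {
  "type": "WINDOWED",
  "startMinutes": 120,
  "endMinutes": 240
}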
Can I build an app that accesses the SIM toolkit options (network -> send money, check balance, etc.) in the background, as if the user were actually accessing the SIM toolkit application? I want to do this because the standard SIM toolkit is limited and does not provide better search, history, saving options, etc. I say "in the background" because I have read that direct access to the SIM toolkit is problematic.
The SIM toolkit mostly just sends USSD codes and parses the results, which means you don't strictly need it; e.g. just try dialing *101# to see the current balance. It makes little difference how you access it, though, because the system is designed to keep this area out of reach. Alternatively, get a prepaid USIM from a provider that offers its own API and management app, because otherwise you run into the next problem, SMS permissions, which may be required to change e.g. the data plan via SMS. And even for such basic functionality, every provider has a different number and different keywords.
I made a small Google Home App and my service returns a response with a SimpleMessage + Card.
It works perfectly when running the app in the console.actions.google.com simulator; I get the card just fine.
But when I test by talking to the Google Home, it only sends the text; there's no trace of the card anywhere.
However, if I talk to the Google Home app on my phone, it does send the card correctly.
Is there something to enable in order to receive cards sent by Google Home? Is it possible at all?
There is no way to make cards visible when the user is talking to your Action via Google Home, but there are several techniques that you, as a developer, can use if cards are necessary.
First of all, good design suggests that cards should be used to supplement the conversation, not be the focus of it. Make sure the voice conversation itself is meaningful and use the visual elements only when necessary. If your Action is overly visual, it may be better suited as a mobile or web app rather than an Action.
If your Action requires a screen, you can set this in the Actions console when you configure it. This will, however, prevent it from being used on a Google Home device.
If you don't want to go this route, and want to allow it to be used on a smart speaker, but still take advantage of a screen where it is available, you have a few options.
First is that you can just send the cards. As you've discovered, they won't show up, but they won't cause any problems.
If you want to act slightly differently when a screen is available, you can check the surface capabilities of the user's Assistant device at that moment. If you're using the node.js library, you can use a call such as
let hasScreen = app.hasSurfaceCapability(app.SurfaceCapabilities.SCREEN_OUTPUT)
to determine if a screen is available and branch on the variable hasScreen. If you're using JSON, you need to check the array at surface.capabilities or data.google.surface.capabilities to see if "actions.capability.SCREEN_OUTPUT" is one of the listed capabilities.
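Putting that together, a handler in the v1 actions-on-google node.js client could branch roughly like this; the handler name, card text, and prompts are made up for the example, and I'm assuming the library's buildRichResponse/buildBasicCard helpers here:
// Sketch: attach a card only when the current surface has a screen.
// Assumes the actions-on-google v1 client (app is the DialogflowApp/ApiAiApp instance).
function showResult (app) {
  const hasScreen = app.hasSurfaceCapability(app.SurfaceCapabilities.SCREEN_OUTPUT);
  if (hasScreen) {
    app.ask(app.buildRichResponse()
      .addSimpleResponse('Here are the details.')
      .addBasicCard(app.buildBasicCard('The full details of your result.')
        .setTitle('Result')));
  } else {
    // Voice-only surface (e.g. Google Home): speech only, no card.
    app.ask('Here are the details. Ask me on your phone if you want to see them as a card.');
  }
}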
If not, and you get to a point in the conversation where you feel you need to send a visual result, you can also request to continue the conversation on a device that does support screen output.
First, you'll need to make sure that they have a screen available. You'll do this with the node.js library with something like
const screenAvailable = app.hasAvailableSurfaceCapabilities(app.SurfaceCapabilities.SCREEN_OUTPUT);
or by checking the availableSurfaces.capabilities or data.google.availableSurfaces.capabilities parameters in JSON.
If one is available, you can request to continue the conversation there with something like
app.askForNewSurface(context, notif, [app.SurfaceCapabilities.SCREEN_OUTPUT]);
where context is the message that will be said on the Google Home and notif is the notification that will appear on their mobile device (for example) inviting them to continue the conversation. If you're using JSON, you'll need to use an actions.intent.NEW_SURFACE next intent.
Either way, the user will get a notification on their mobile device. Selecting the notification will start the Assistant on that device, and your Action will be called again with parameters that let you check whether they are now on the new surface. If so, you can send the card.
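For completeness, the handler that runs when the conversation resumes on the phone can simply re-check the capability and deliver the card there. This is a rough sketch with the same v1 library; the handler name and wording are invented:
// Sketch: handler for the intent that fires after the user taps the
// notification and the conversation continues on the new surface.
function continueOnNewSurface (app) {
  if (app.hasSurfaceCapability(app.SurfaceCapabilities.SCREEN_OUTPUT)) {
    // We now have a screen, so the card will actually render.
    app.ask(app.buildRichResponse()
      .addSimpleResponse('Thanks! Here is that card.')
      .addBasicCard(app.buildBasicCard('The visual details you asked about.')
        .setTitle('Details')));
  } else {
    // The transfer was declined or failed; carry on by voice.
    app.ask('No problem, we can keep going here by voice.');
  }
}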
Is there a way to programmatically request current status information (location, TTA, speed) from an active instance of the HERE WeGo app? I know I can use an alternate method for location, but I would like to access TTA and possibly additional current status information from a running GPS program without requiring the user to input data. Example: current time, location, speed, and ETA (TTA), logged with the click of a button or a voice command. I would specifically like to be able to use HERE because of its offline capabilities.
No, that is not possible. It would violate the user's privacy and allow an application to track an unsuspecting user :)
To start, I must say that I'm new to Android.
I'm an MSc student, and for my MSc thesis I have to develop a system that collects all user inputs on the touchscreen, regardless of the application being used, and it must run in the background.
The objective is to use that data to establish a user profile and then apply an algorithm that continuously compares new inputs with the old ones to authenticate the legitimate user.
In other words I've to develop a touchlogger, but not for malicious purposes.
My question is: are the permissions that a user accepts during the installation process enough to allow my app to collect touch inputs from other applications, or will it be blocked because of the sandbox?
Note: The system is meant to be used by a regular person on a regular device, so rooting the device must not be an option.
This is not possible, especially in newer versions of Android.
This is because even though system overlays allow you to display things like chat heads over any other app, you cannot capture touch events and pass them down to the app below. So even if you manage to capture the events, you will end up rendering the device useless, as nothing below your overlay will work.