Up to now I have only been developing in J2ME and would like to know about the differences compared to Android.
The situation for a Java-enabled phone is that it might, for example, have a built-in camera, but the manufacturer didn't implement the Java API for camera capabilities, which means you can't use that API. It is even possible that only parts of an API were implemented.
Now, what about Android? From what I know, when a device has Android Platform 2.2 it supports every API level up to level 8. And I would guess that if the built-in camera doesn't have a flash, then you can't use the Android API call to change the flash mode. Is that right?
Now let's assume that the device has a built-in flash-enabled camera. Can a developer be sure that the function for changing the flash mode can be used, or is it possible that the manufacturer didn't implement that specific function even though it is part of the supported API level the device was advertised for?
And I would guess that if the built-in camera doesn't have a flash, then you can't use the Android API call to change the flash mode. Is that right?
So long as you set it to a valid value, you can always use the API. However, in your case, there is probably only one valid value (i.e., no flash).
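A minimal sketch of that check against the (era-appropriate) android.hardware.Camera API; the helper class and method names are made up for illustration:

import android.hardware.Camera;
import java.util.List;

public class FlashHelper {

    // Only set a flash mode the device actually reports as supported.
    public static void enableTorchIfAvailable(Camera camera) {
        Camera.Parameters params = camera.getParameters();
        List<String> modes = params.getSupportedFlashModes();

        // On a device without a flash this list is typically null or
        // contains only FLASH_MODE_OFF, so we simply do nothing.
        if (modes != null && modes.contains(Camera.Parameters.FLASH_MODE_TORCH)) {
            params.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
            camera.setParameters(params);
        }
    }
}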
Now let's assume that the device has a built-in flash-enabled camera. Can a developer be sure that the function for changing the flash mode can be used, or is it possible that the manufacturer didn't implement that specific function even though it is part of the supported API level the device was advertised for?
Android devices that do not have the Android Market have no guarantees whatsoever regarding their compatibility with third-party apps.
Android devices that do have the Android Market must pass a compatibility test suite. Whether or not that test suite has a specific test for a specific API can only be determined by looking at the test suite code.
So, it depends on how you define "sure". Developers usually don't have to worry about it, but device firmware bugs do happen.
I am currently studying mobile security with a focus on hardware protection, so I wonder: is it possible for an app to access mobile hardware resources and take exclusive control of them, so that no other app can access a hardware resource unless permission is given by the controlling app? (As a second level of hardware security.)
At least for the camera, I'm aware of using Android Device Administration with a policy to disable the camera: https://developer.android.com/guide/topics/admin/device-admin.html Typically a Mobile Device Manager would be used on authorized devices by corporations/governments in sensitive locations. Other restrictions are probably available, as the Device Manager software would be accessing device manufacturer APIs.
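For reference, a minimal sketch of that camera-disable policy, assuming the admin receiver has already been declared in the manifest and activated by the user (class and method names here are made up for illustration):

import android.app.admin.DeviceAdminReceiver;
import android.app.admin.DevicePolicyManager;
import android.content.ComponentName;
import android.content.Context;

public class CameraLockdown {

    // Trivial admin receiver; it must also be declared in the manifest
    // with the device-admin metadata for the policy call to take effect.
    public static class MyAdminReceiver extends DeviceAdminReceiver { }

    /** Disables the camera for all apps on the device. */
    public static void disableCamera(Context context) {
        DevicePolicyManager dpm =
                (DevicePolicyManager) context.getSystemService(Context.DEVICE_POLICY_SERVICE);
        ComponentName admin = new ComponentName(context, MyAdminReceiver.class);

        if (dpm.isAdminActive(admin)) {
            dpm.setCameraDisabled(admin, true);
        }
    }
}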
Another method to achieve your goal of 'wrapping' the hardware is to not let the Android app actually talk to Android framework methods. Basically the client app would be running a virtual container with the container mocking Android framework APIs.
See: How to execute APKs on a customized data directory?
And Commonsware's blog post for additional details: https://commonsware.com/blog/2017/01/17/droidception.html
This would allow you to add additional restrictions, at the cost of having all of the apps go through the virtual container. Forcing the user to always use the container isn't really possible unless you are building a custom ROM or the device is rooted.
And if you are doing a custom ROM or a rooted device, you might as well add any additional hardware restrictions through those methods.
I would like to have a single Android app on our managed devices; we want only that app to be usable on the device, with the necessary restrictions, such that:
Single use: the device will have only one app, and the user can't use other apps such as the browser, YouTube, or anything else.
Initial settings such as notification sound, GPS always on, and maximum notification/ring volume can't be modified.
The user cannot power off the device.
These settings can only be changed from our servers.
I think I have two options:
1) Using the Samsung Knox SDK on Samsung devices.
Here are the MDM features provided by Samsung Knox Standard.
2) The general Android way: set up single-purpose devices (a COSU solution), as described on the Android Developers site.
I wanted to know your views on this; if you have done either of the two, or taken another approach, I could use some guidelines or a path to follow.
Thanks for reading, and please comment if I was unable to articulate the subject or if it needs editing.
You can use Google's new Android Management API; it seems to suit your needs.
It is a cloud API that allows you to manage Android devices from a server, without having to build an on-device agent (a device policy controller).
I have broad experience using Samsung Kiosk Mode from the Knox Standard SDK (which is free) and Pro-Kiosk mode from the Knox Customization SDK (which has more functions but is not free).
So I can tell you for sure that all 4 points that you have mentioned can be achieved by using Knox Standard SDK.
Single purpose: https://seap.samsung.com/api-references/android-standard/reference/android/app/enterprise/kioskmode/KioskMode.html
LocationPolicy (you can turn on GPS and restrict changing): https://seap.samsung.com/api-references/android-standard/reference/android/app/enterprise/LocationPolicy.html
Yes. It is possible but I forgot the exact implementation.
Yes, as well.
The only downside of using this SDK is:
You are tied to Samsung (which I am personally okay with, since Samsung has such market penetration that you can get service almost anywhere in the world, and in the enterprise world that is critical).
About Android native functionality: I have never tried it.
Update March 7, 2019: I am now playing around with Device Owner; we use it for kiosk mode. It works well, and it works on Android devices with Nougat and earlier.
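For what it's worth, a minimal sketch of that Device Owner / lock-task ("kiosk") setup; it assumes the app has already been provisioned as device owner, and the activity and receiver names are made up for illustration:

import android.app.Activity;
import android.app.admin.DeviceAdminReceiver;
import android.app.admin.DevicePolicyManager;
import android.content.ComponentName;
import android.content.Context;
import android.os.Bundle;

public class KioskActivity extends Activity {

    // Admin receiver for this device-owner app (must be in the manifest).
    public static class MyDeviceAdminReceiver extends DeviceAdminReceiver { }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        DevicePolicyManager dpm =
                (DevicePolicyManager) getSystemService(Context.DEVICE_POLICY_SERVICE);
        ComponentName admin = new ComponentName(this, MyDeviceAdminReceiver.class);

        if (dpm.isDeviceOwnerApp(getPackageName())) {
            // Whitelist this package for lock task mode, then pin it so the
            // user cannot leave the app or reach the status bar / recents.
            dpm.setLockTaskPackages(admin, new String[] { getPackageName() });
            startLockTask();
        }
    }
}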
I'm trying to find out which, if any, Android devices implement the manual focus functions included in the Android API since API level 14 (http://developer.android.com/reference/android/hardware/Camera.Parameters.html#setFocusAreas%28java.util.List%3Candroid.hardware.Camera.Area%3E%29).
I would suspect the devices branded "Google" to have the fullest implementation of the API, but I couldn't find any information on this.
The Nexus 6 seems to have manual focus control capability. Please download a manual camera app from the Play Store and check whether your phone has manual focus control. Apparently Lollipop's camera API supports this feature on selected new phones.
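If you want to probe this programmatically yourself, here is a minimal sketch against the same android.hardware.Camera API the question links to (class and method names are made up for illustration):

import android.graphics.Rect;
import android.hardware.Camera;
import java.util.Collections;

public class FocusAreaCheck {

    // getMaxNumFocusAreas() returns 0 on devices whose driver does not
    // implement focus areas, so a runtime check is the reliable way to
    // find out whether setFocusAreas() will actually do anything.
    public static boolean trySetCenterFocusArea(Camera camera) {
        Camera.Parameters params = camera.getParameters();
        if (params.getMaxNumFocusAreas() > 0) {
            // Focus area coordinates range from -1000 to 1000; this is a
            // small rectangle around the center of the preview frame.
            Camera.Area center = new Camera.Area(new Rect(-100, -100, 100, 100), 1000);
            params.setFocusAreas(Collections.singletonList(center));
            camera.setParameters(params);
            return true;
        }
        return false; // focus areas not supported on this device
    }
}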
I am trying to find out about sensors in Android phones.
Do all/most phones have a basic set of sensors, or do I have to look at the individual specifications to find what each supports? The specs I have looked at seem rather unclear as to what each phone actually provides.
I haven't been able to find even an out of date list of phones and their sensors, but if anyone can point me to one, I would be grateful.
I should have made it clear that I am looking for this information as my application may need a specific sensor or combination of sensors. If these are not generally available then the application may not be worth developing. In addition, it may be possible to use more than one combination of sensors to do the job, so information on what is likely to be available will aid development.
Android has no minimum hardware requirements when it comes to sensors. The Android Compatibility Program states:
Android 2.3 includes APIs for accessing a variety of sensor types. Device implementations generally MAY omit these sensors, as provided for in the following subsections.
The best way of going about it is to specify the sensors your app uses in your AndroidManifest.xml file, along with whether or not those sensors are required for the application to work. The Android Market uses the details of required sensors (and other hardware features) to hide the app from unsupported devices on the market.
Details of the different flags and how to use them can be found here.
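As a complement to the manifest declaration, you can also double-check at runtime before relying on a sensor; a minimal sketch (the gyroscope is just an example, and the class name is made up):

import android.content.Context;
import android.content.pm.PackageManager;
import android.hardware.Sensor;
import android.hardware.SensorManager;

public class SensorCheck {

    // Combines the PackageManager feature flag with a direct query to
    // SensorManager, so the app can degrade gracefully on devices that
    // lack the sensor.
    public static boolean hasGyroscope(Context context) {
        boolean declared = context.getPackageManager()
                .hasSystemFeature(PackageManager.FEATURE_SENSOR_GYROSCOPE);

        SensorManager sm =
                (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        boolean present = sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE) != null;

        return declared && present;
    }
}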
When you publish a draft of the app, you can see a list of all the devices which support the features requested.
What are the differences developers should be aware of?
I am aware of these limitations:
Pre-installed software. A real device can have a lot more preinstalled applications than the emulator.
You cannot use "capture" photo/video functions in emulator.
According to emulator documentation, its limitations are:
The functional limitations of the emulator include:
No support for placing or receiving actual phone calls. You can simulate phone calls (placed and received) through the emulator console, however.
No support for USB connections
No support for device-attached headphones
No support for determining network connected state
No support for determining battery charge level and AC charging state
No support for determining SD card insert/eject
No support for Bluetooth
IMO you can use the emulator to simplify UI development: to view the UI on a "device screen", to be sure the app layout is OK and the app runs, and to test some special cases by simulating GPS position, network speed, messaging, etc. But testing on a real device is a must.
With the 1.5 SDK, the following limitations exist (from the SDK website):
No support for placing or receiving actual phone calls. You can simulate phone calls (placed and received) through the emulator console, however.
No support for USB connections
No support for camera/video capture (input).
No support for device-attached headphones
No support for determining connected state
No support for determining battery charge level and AC charging state
No support for determining SD card insertion/removal
No support for Bluetooth
No support for Multitouch
Based on experience, I've noticed the following differences in actual development:
There are bugs you'll be able to ignore in the emulator that will crash the device (not closing Cursors, for example; see the sketch after this list).
You interact with the device differently than the emulator. I use landscape mode a lot more with the real device than I do with the emulator.
There's a different CPU. Things that are fast on your emulator will be slower on the real device.
You can dogfood with the device. It is harder to dogfood with the emulator.
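A minimal sketch of the Cursor point from the list above; the table and column names are made up for illustration:

import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;

public class CursorExample {

    // Leaving a Cursor open is often survivable on the emulator but can
    // exhaust resources and crash on real hardware, so always close it.
    public static int countNotes(SQLiteDatabase db) {
        Cursor cursor = db.rawQuery("SELECT title FROM notes", null);
        try {
            return cursor.getCount();
        } finally {
            cursor.close(); // runs even if an exception is thrown above
        }
    }
}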
There is a Google group here if you need real-device testers.
You cannot properly test touch events with the emulator; they can only be simulated with mouse clicks, which any developer building an application based on touch screens should be aware of.
I'd say the main thing is that there are several "real devices" currently using Android, and there will be more, with different hardware endowments -- some will have GPS and some won't, ditto for touchscreen, real keyboard as opposed to virtual on-screen one, camera resolution, etc, etc.
While the OS will do a lot of the heavy lifting for you, you still want to make sure you design a user experience that makes sense on every Android device you intend to support, despite the variation in their HW features. In this sense, designing applications for Android is more similar to designing them for, say, Linux, Windows, or the Web (cater for a wide variety of hardware-configuration details) than for, e.g., Macs or the iPhone (where you need to consider a much narrower set of possible HW configurations).
The emulator is (or tries to be;-) "one" Android device -- but there will be others ("real" ones;-) with different screen resolutions, input peripheral devices, etc, etc...
One comment regarding Google accounts: with version 8 of the Google APIs for Android 2.2, you can add a Google account on the device. However, it will only allow authentication for tests of the Google APIs (e.g. Google Documents), but not syncing of contacts etc.
This is a bug, since camera and video support was attempted (incorrectly): the camera and video intents do not store their output in the MediaStore database after "capture."
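If you need capture output you can rely on, one workaround sketch (era-appropriate, pre-FileProvider) is to pass an explicit EXTRA_OUTPUT so the camera app writes to a file you chose yourself; the helper class and request code are made up for illustration:

import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import android.provider.MediaStore;
import java.io.File;

public class CaptureHelper {

    public static final int REQUEST_CAPTURE = 1;

    // Passing EXTRA_OUTPUT tells the camera app where to write the image,
    // so the caller knows the file location even if nothing is inserted
    // into the MediaStore database after capture.
    public static void requestPhoto(Activity activity, File target) {
        Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        intent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(target));
        activity.startActivityForResult(intent, REQUEST_CAPTURE);
    }
}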
In simple terms, an emulator is a device that runs on your computer (as software) whereas a real device is something you can hold. There will of course be a few differences between the two such as some device-specific features won't be available on the emulator.
Edit: Removed a link from the answer that had expired.