Now that Android apps can run on Chrome OS and, using {"resize":"reconfigure"}, have become resizable, there is a need for more responsive layouts. To make my layout more responsive, I need an event listener that fires when the app window changes size.
Is there anyone out there doing this and how?
FYI: I want to do something similar to Polymer's core-scaffold concept (https://www.polymer-project.org/0.5/components/core-elements/demo.html#core-scaffold).
ARC does not send a simple application-level event on window resize. Android does define a resize event that is used when the virtual keyboard pops up, but it isn't useful to ARC, since that event is about restricting your app to a smaller portion of a display that is still the same size.
Instead it sends more of a system level event, causing the Android code to think that the display size has changed. This is substantially the same as what happens when you rotate an Android phone causing an orientation change event to propagate. The Android framework then passes this on to your app as the standard Activity.onConfigurationChanged() call.
You should then be able to use the values contained in the passed Configuration newConfig argument (such as screenWidthDp and screenHeightDp) to decide how to display your UI.
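For example, a minimal sketch of reacting to those values (it assumes the activity declares android:configChanges="orientation|screenSize" in its manifest so the callback fires instead of the activity being recreated; the 600dp breakpoint is just illustrative):

```java
import android.app.Activity;
import android.content.res.Configuration;
import android.os.Bundle;

public class MainActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        applyLayoutFor(getResources().getConfiguration());
    }

    // Called when the window/display configuration changes, provided the
    // activity handles "orientation|screenSize" changes itself.
    @Override
    public void onConfigurationChanged(Configuration newConfig) {
        super.onConfigurationChanged(newConfig);
        applyLayoutFor(newConfig);
    }

    private void applyLayoutFor(Configuration config) {
        // 600dp is an arbitrary breakpoint for this sketch.
        if (config.screenWidthDp >= 600) {
            // Wide window: show a two-pane, scaffold-style layout.
        } else {
            // Narrow window: show a single pane with a drawer.
        }
    }
}
```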
Related
I am trying to make a launcher application that launches a predefined set of apps in multi-window freeform mode, and I have made some progress so far (roughly as sketched below).
However, I am now working on a new use case where I need to restrict the resizing of a newly launched window to either the horizontal or the vertical direction, but not both.
Here is what it looks like.
Presently, both activity windows can be freely expanded or shrunk in both the horizontal and vertical directions. However, I want to restrict resizing to only one of those directions. I cannot find any means to do this yet and would really appreciate any help.
Thanks in advance.
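For context, this is roughly how the launcher starts each app in a freeform window with explicit bounds (a minimal sketch, not my exact code; the class and method names are made up, and it assumes API 24+ on a build with freeform multi-window enabled):

```java
import android.app.ActivityOptions;
import android.content.Context;
import android.content.Intent;
import android.graphics.Rect;

public class FreeformLauncher {

    // Launches the given package in a freeform window with explicit bounds.
    public static void launch(Context context, String packageName, Rect bounds) {
        Intent intent = context.getPackageManager().getLaunchIntentForPackage(packageName);
        if (intent == null) return;
        intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);

        ActivityOptions options = ActivityOptions.makeBasic();
        options.setLaunchBounds(bounds); // API 24+: initial window position and size

        context.startActivity(intent, options.toBundle());
    }
}
```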
Is it possible for a mobile app, running like a background process, to blur or fade the screen of the smartphone, no matter which active app is being displayed? Do the Android and iOS APIs expose such features?
Definitely not on Android. There are accessibility services that do modify the screen regardless of which app is visible, but those are mostly first-party applications. It would be a pretty big security risk if apps could block or blur the screen of other apps.
That being said, you can create an overlay that lets touches through. I'm not sure whether you can get the actual pixels of the UI behind the overlay and run a blur yourself, but you can draw on top of elements on the screen (using the accessibility APIs you can get the positions of UI elements), which may suit your needs depending on what exactly you're trying to achieve.
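As a rough illustration of that overlay approach (a minimal sketch; the class name and dim level are made up, and it assumes the SYSTEM_ALERT_WINDOW permission has been granted, i.e. Settings.canDrawOverlays() returns true):

```java
import android.content.Context;
import android.graphics.Color;
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.View;
import android.view.WindowManager;

public class DimOverlay {

    // Adds a full-screen, semi-transparent view on top of everything.
    public static View show(Context context) {
        WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);

        View overlay = new View(context);
        overlay.setBackgroundColor(Color.argb(120, 0, 0, 0)); // dim, not a true blur

        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.MATCH_PARENT,
                WindowManager.LayoutParams.MATCH_PARENT,
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY, // API 26+
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                        | WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE, // touches pass through
                PixelFormat.TRANSLUCENT);
        lp.gravity = Gravity.TOP | Gravity.START;

        wm.addView(overlay, lp);
        return overlay; // keep a reference so it can be removed later
    }
}
```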
I am currently trying to draw a line on top of a TextView's content in an Android default screen, for example the Settings screen. It is possible to get the content of a screen if that screen belongs to our own application; I want to know how I can get the content of the device's default screens from my application. From going through a few discussions, I gather that if we can get the View object of that screen, we would be able to do this. So is it possible to get the View object of an Android default screen? Kindly help on how we can achieve this.
Thanks.
So is it possible to get the View object of an Android default screen?
No. That app runs in another process. Its objects cannot magically transport themselves to your process.
I want to know how I can get the content of the device's default screens from my application
On Android 5.0+, with user permission, you can use the media projection APIs to take screenshots.
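Roughly, the Android 5.0+ flow looks like the sketch below (the class name is illustrative, error handling is omitted, and newer Android versions additionally require the capture to run alongside a foreground service, which isn't shown):

```java
import android.app.Activity;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.media.ImageReader;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.os.Bundle;
import android.util.DisplayMetrics;

public class CaptureActivity extends Activity {
    private static final int REQUEST_CAPTURE = 1;
    private MediaProjectionManager projectionManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        projectionManager =
                (MediaProjectionManager) getSystemService(MEDIA_PROJECTION_SERVICE);
        // Shows the system consent dialog; the user must approve the capture.
        startActivityForResult(projectionManager.createScreenCaptureIntent(), REQUEST_CAPTURE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode != REQUEST_CAPTURE || resultCode != RESULT_OK) return;

        MediaProjection projection = projectionManager.getMediaProjection(resultCode, data);

        DisplayMetrics metrics = getResources().getDisplayMetrics();
        final ImageReader reader = ImageReader.newInstance(
                metrics.widthPixels, metrics.heightPixels, PixelFormat.RGBA_8888, 2);

        // Mirrors the screen into the ImageReader's surface; each acquired
        // Image is effectively a screenshot.
        projection.createVirtualDisplay("capture",
                metrics.widthPixels, metrics.heightPixels, metrics.densityDpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                reader.getSurface(), null, null);

        reader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
            @Override
            public void onImageAvailable(ImageReader r) {
                // acquireLatestImage() yields raw RGBA pixel planes that can be
                // copied into a Bitmap; close the Image promptly when done.
            }
        }, null);
    }
}
```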
I'm trying to build an app that can copy and save all text that the user selects in any other app, such as Chrome, Adobe Reader, SMS, and so on.
In fact, I have no idea how I should do that, or even whether it is possible.
I will be thankful if anyone can help me.
The Accessibility API allows you to inspect the view hierarchy displayed on the screen and read text from different views. On top of that, you may be able to fetch the coordinates of where the user tapped (maybe by using some kind of system overlay view) and translate them to the position of the text the user most likely wanted to copy. Note that even if this works at a usable level, it will be very hacky, and making it work across a considerable number of devices will most likely be hard.
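As a rough sketch of the accessibility-service angle (the service still has to be declared in the manifest and enabled by the user, and TYPE_VIEW_TEXT_SELECTION_CHANGED mainly covers editable text, so coverage in apps like Chrome or PDF readers will vary):

```java
import android.accessibilityservice.AccessibilityService;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

public class TextGrabberService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Fired when the text selection changes in a view.
        if (event.getEventType() != AccessibilityEvent.TYPE_VIEW_TEXT_SELECTION_CHANGED) return;

        AccessibilityNodeInfo source = event.getSource();
        if (source == null || source.getText() == null) return;

        int start = source.getTextSelectionStart();
        int end = source.getTextSelectionEnd();
        if (start >= 0 && end > start && end <= source.getText().length()) {
            CharSequence selected = source.getText().subSequence(start, end);
            // Store or copy 'selected' somewhere useful here.
        }
        source.recycle();
    }

    @Override
    public void onInterrupt() { }
}
```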
Is it possible for an Android app to run in the background, listen for specific triggers, and modify the visible application content on the screen (in both system and third-party apps)?
For example, a two-finger tap should lead to a ripple effect on the screen. If the temperature is high, the screen turns more reddish. If I say "do a barrel roll", the entire UI does a barrel roll like the Google Easter egg. And this should happen whether the user is on the home screen, in Settings, or in Instagram.
The best working example I can give is the built-in magnification gesture provided by Android. Triple-tapping anywhere zooms everything in except the keyboard and the navigation bar. And it doesn't zoom as an image; the touch points are preserved.
Is this possible to do with or without root? Do I need a framework like Xposed?
Thanks.