I am thinking of using Flex 4/AIR for developing an Android app. I want the app to have hover-to-play ability: if a video thumbnail in a list of videos is selected but not clicked, it should play a 5-second clip, just like on bing.com/videos. I am assuming this is the closest we come to "hover" on Android devices - please correct me if this is not the case.
Does Flex 4/AIR have this capability? If not, how else could we implement hover-to-play on Android devices?
Appreciate any help/pointers.
Does Flash Builder 4.5 have this capability? No! I think you meant to ask whether Flex 4.5 has this capability, but the question would be best stated: "Does AIR for Android have this capability? If so, how can I access it in Flex?"
When developing code for mobile devices, I would take great care when implementing functionality based on a "hover" approach. However, you can take a look at these touch events:
TOUCH_OVER
TOUCH_ROLL_OVER
I thought a Long Press / Long Touch event may be what you need, but I couldn't find any documented one supported by AIR's touch API.
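For reference, a long press is just "touch held past a time threshold without moving far", so even where a framework doesn't document one, it is straightforward to detect yourself. Below is a platform-free sketch of that check (the class, method names, and thresholds are mine, though the 500 ms hold time matches Android's usual long-press timeout; on native Android, `GestureDetector`'s `onLongPress` does this for you):

```java
public class LongPressDetector {
    // A press counts as "long" if it is held at least HOLD_MS and the
    // finger moved less than SLOP_PX from the touch-down point.
    static final long HOLD_MS = 500;  // roughly Android's default timeout
    static final float SLOP_PX = 10f; // illustrative movement tolerance

    private long downTime;
    private float downX, downY;

    // Record where and when the touch started.
    public void onDown(long timeMs, float x, float y) {
        downTime = timeMs;
        downX = x;
        downY = y;
    }

    // On touch-up, decide whether this qualified as a long press.
    public boolean isLongPressOnUp(long timeMs, float x, float y) {
        float dx = x - downX, dy = y - downY;
        boolean stayedPut = (dx * dx + dy * dy) <= SLOP_PX * SLOP_PX;
        return stayedPut && (timeMs - downTime) >= HOLD_MS;
    }
}
```

In the hover-to-play scenario above, a detected long press would be the point at which you start the 5-second preview clip.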
Related
I was messing around with libvlcsharp on Xamarin and the (fairly) new MediaPlayerElement UI on Android. For devices such as Android TV, there is no touch interface so you have to use something like a remote control.
For this case, I ended up capturing keypresses in DispatchKeyEvent and sending them to my app via MessagingCenter. I was wondering whether MediaPlayerElement can support non-touch devices automatically, or, if not, what the best approach would be to handle keypresses in the app. I would have to "highlight" various controls of the MediaPlayerElement and then be able to select them when "DpadCenter" is pressed.
My questions are:
Does MediaPlayerElement already support non touch gestures? This site here seems to suggest it might with the comment that you can turn them off.
If it doesn't support them (and you have to roll your own), is there a programmatic way to highlight (e.g. change the background color) of the individual controls/buttons (such as pause or play) and invoke them?
You can override functionality for any control, so you should be able to hook your DpadCenter event and modify the behavior you expect of the player element.
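Rolling your own focus handling boils down to keeping an index over the player's controls, moving it on DPAD_LEFT/DPAD_RIGHT, and invoking the highlighted control on DPAD_CENTER. A platform-free sketch of that state machine (written in Java for brevity even though the question is Xamarin; the key-code values 21/22/23 match Android's `KeyEvent` DPAD constants, and the control names are illustrative, not LibVLCSharp API):

```java
import java.util.Arrays;
import java.util.List;

public class ControlFocus {
    // Illustrative control names; the real set depends on your UI.
    private final List<String> controls =
            Arrays.asList("play", "pause", "stop", "seek");
    private int focused = 0; // index of the highlighted control

    // Feed key codes from dispatchKeyEvent here. Returns the name of
    // the control to invoke, or null if the key only moved the highlight.
    public String onKey(int keyCode) {
        final int DPAD_LEFT = 21, DPAD_RIGHT = 22, DPAD_CENTER = 23;
        int n = controls.size();
        if (keyCode == DPAD_LEFT) {
            focused = (focused + n - 1) % n; // wrap left
        } else if (keyCode == DPAD_RIGHT) {
            focused = (focused + 1) % n;     // wrap right
        } else if (keyCode == DPAD_CENTER) {
            return controls.get(focused);    // invoke this control
        }
        return null;
    }

    public String focusedControl() {
        return controls.get(focused);
    }
}
```

The "highlight" itself would then be whatever visual change you apply to `focusedControl()` after each key, e.g. swapping its background color.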
It is on the roadmap to provide better docs for this https://code.videolan.org/videolan/LibVLCSharp/-/issues/309
For customization of the control, a similar answer was created a while ago: How to create LibVLCSharp custom playback controls in Xamarin Forms?
Do share what you build with it :-) We don't have any Android TV sample for this.
I want to design a VR app for Android which detects/recognises things using the camera and shows details about them.
I know about Unity, but that only covers assets provided while designing the background, like a gaming set.
In augmented reality we can scan an image and get results accordingly; I want the same thing on a VR screen.
Check out the Vuforia library for Android. It works like a charm.
I have implemented what you're talking about. It'll take a little effort and time to get the hang of it. But it'll definitely work.
I would like to use the mobile camera to develop a smart magnifier that can zoom and freeze-frame what we are viewing, so we don't have to keep holding the device steady while we read. It should also be able to change colors, as shown in the image linked below.
https://lh3.ggpht.com/XhSCrMXS7RCJH7AYlpn3xL5Z-6R7bqFL4hG5R3Q5xCLNAO0flY3Fka_xRKb68a2etmhL=h900-rw
Since I'm new to Android I have no idea how to start; do you have any pointers?
Thanks in advance for your help :)
I've done something similar and published it here. I have to warn you though, this is not a task to start Android development with. Not because of the development skills required; the showstopper here is the need for a massive number of devices to test on.
Basically, two reasons:
The Camera API is quite complicated, and different hardware devices behave differently. Forget about using the emulator; you would need a bunch of real hardware devices.
There is a new API, Camera2, for API level 21 and higher, and the old Camera API is deprecated (left in a kind of 'limbo' state).
I have posted some custom Camera code on GitHub here, to show some of the hurdles involved.
So the easiest way out in your situation would be the camera-intent approach: when you get your picture back (it is a JPEG file), just decompress it and zoom in to the center of the resulting bitmap.
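The "zoom in to the center" step reduces to computing a centered crop rectangle and handing it to `Bitmap.createBitmap(source, left, top, width, height)`. A sketch of just the arithmetic (class and method names are mine, not from any Android API):

```java
public class CenterCrop {
    // Returns {left, top, width, height} of a rectangle covering
    // 1/zoom of each dimension, centered in a w x h bitmap. The result
    // is what you would pass to Bitmap.createBitmap to get the
    // magnified region, which you then scale up to fill the screen.
    public static int[] compute(int w, int h, int zoom) {
        int cw = w / zoom;       // cropped width
        int ch = h / zoom;       // cropped height
        int left = (w - cw) / 2; // center horizontally
        int top = (h - ch) / 2;  // center vertically
        return new int[] { left, top, cw, ch };
    }
}
```

For example, a 2x zoom on an 800x600 picture keeps the centered 400x300 region starting at (200, 150).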
Good Luck
I noticed there are a large number of player guides on YouTube that show a "heat map" or visual of typical user interaction with a touchscreen application, like this example:
http://www.youtube.com/watch?v=H5mVS1sEAZI
I have an Android application being used for research purposes, and we are already tracking (in a SQLite database) when and where users touch / interact with a video.
We would love to create a visualization of where and when users are touching the screen.
Are there any tools, APIs, etc. out there that anyone has seen for generating this kind of data visualization?
If not, is there any good way to take screenshots of the video / application at a moment in time when users touch the application?
For iOS you can use https://heatma.ps SDK
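On Android, rolling a basic version yourself is not much code: since the touch coordinates are already in SQLite, a heat map is essentially a 2-D histogram of them. A minimal sketch of the binning step (grid size, class, and method names are mine):

```java
public class TouchHeatMap {
    private final int[][] bins;
    private final int screenW, screenH;

    // cells: number of grid cells along each axis
    public TouchHeatMap(int screenW, int screenH, int cells) {
        this.screenW = screenW;
        this.screenH = screenH;
        this.bins = new int[cells][cells];
    }

    // Feed each (x, y) touch row from the database through this.
    public void add(float x, float y) {
        int cells = bins.length;
        int cx = Math.min((int) (x / screenW * cells), cells - 1);
        int cy = Math.min((int) (y / screenH * cells), cells - 1);
        bins[cy][cx]++;
    }

    public int count(int cx, int cy) {
        return bins[cy][cx];
    }
}
```

Rendering is then a matter of drawing each cell over a screenshot of the app with an alpha or color intensity proportional to its count.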
Does Android have the software capability, if a phone has video-out, to open or push content solely to the video out?
So, for example, if the user is in an app and clicks on a YouTube link, instead of opening the content on the main screen over the app, it would push it to the video out, so the YouTube video would display on the connected display and they could continue to browse.
I know Motorola has the WebTop software, and this idea is similar to what I am trying to accomplish, but on a much more basic level. It's more similar to Apple's AirPlay, but again much less complex (without a network/external player - just video out).
Or, if even that is too complex, a simpler solution would be having the video-out still output even when the phone is locked. Currently the video-out mirroring on both my HTC Incredible and Galaxy Nexus stops when the phone is locked.
EDIT:
I've noticed while using my phone that when playing a video through the Google Videos app, the on-phone controls (play, pause, seek bar, and the soft buttons) overlay on the screen, but the video-out display (television) plays the video continuously/seamlessly without any of the controls overlaid. This is a very primitive example of what I'm ultimately alluding to, but it does show a real-world case of an Android device (with no third-party manufacturer software) doing video out that isn't exactly mirroring.
Well... I hate to furnish this as an answer, but it really looks like there's simply nothing in the API for it at all...
http://groups.google.com/group/android-developers/browse_thread/thread/9e3bcd1eea2c379
which just redirects to:
https://groups.google.com/forum/#!topic/android-developers/Jxp_9ZtzL60
I'm definitely going to favorite this question, and hope someone chimes in with something more useful than "it doesn't look like it's possible", though that does appear to be the correct answer at this time.
Much like Dr.Dredel has mentioned, there is currently nothing for multiple displays in terms of display 1 showing 'A' and display 2 showing 'B'.
There is support for multiple screen sizes per the following:
http://developer.android.com/guide/practices/screens_support.html#support
This will be the case for a little while longer until someone creates the support for it.