How to detect 3D Touch in Android?

I want to implement 3D Touch in Android, just like the 3D Touch on the iPhone 6S and 6S Plus.
I looked around on Google and couldn't find any consistent material.
I could only find an example in the Lua language, and I am not sure yet if it's exactly what I am looking for.
So I thought that if there are no libraries out there, maybe I should implement the algorithm from scratch, or perhaps create a library for it.
But I don't know where to start. Do you have any clue?

I believe you could implement something similar using MotionEvent; it has a getPressure() method that is supposed to return a value between 0 and 1 representing the amount of pressure on the screen. You could then do something different depending on the amount of pressure detected.
Note that some devices do not support this feature, and some (notably the Samsung Galaxy S3) will return inconsistent values.
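A minimal sketch of that approach, assuming an Activity that overrides onTouchEvent(); the 0.5 threshold and the helper method names are made up for illustration and would need tuning per device:

    import android.app.Activity;
    import android.view.MotionEvent;

    public class PressureDemoActivity extends Activity {

        // Hypothetical threshold; reported pressure ranges vary between devices.
        private static final float DEEP_PRESS_THRESHOLD = 0.5f;

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            // getPressure() nominally returns 0..1, but some devices exceed 1
            // and others (like the Galaxy S3 mentioned above) report unreliable values.
            float pressure = event.getPressure();
            if (pressure > DEEP_PRESS_THRESHOLD) {
                onDeepPress(event);   // treat as a "deep press"
            } else {
                onNormalPress(event); // treat as a normal touch
            }
            return true;
        }

        private void onDeepPress(MotionEvent event) { /* e.g. show a preview */ }

        private void onNormalPress(MotionEvent event) { /* default handling */ }
    }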

I don't think it is possible on currently available Android devices. 3D Touch is a hardware technology embedded in the displays of iPhones; I don't think you can implement it just by writing some code in your Android application.

Short answer - no.
You would need to wait for Google to actually copy the technology if it proves useful, and I doubt that will happen in the near future: Android is all about accessibility, and these screens would be quite expensive.
Long answer - Android is open source, so if you are making something internal, go ahead; with some modifications it will let you do that. Build a device, put in your modified code, create your own application that takes advantage of the feature, and be happy to announce it to the world.

Related

AnchorNode/models disappear in Android Sceneform SDK

I detect the plane in Sceneform/ARCore and add a few models on an AnchorNode, but the models disappear in the following cases:
Moving the phone too fast
Low light
Blocking the camera's view
So why are they disappearing?
Does anyone have an idea how to overcome this issue?
It is natural, because the three cases you listed make it hard for ARCore to track feature points.
And there is really no way to overcome this, because tracking feature points is ARCore's job, not yours.
I'd rather let users know that in some specific environments the application might not work properly. Or you could ask the ARCore developers.
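If you still want to react to it, here is a minimal sketch (assuming the Sceneform ArFragment; the TrackingMonitor class and showWarning callback are made-up names) that watches ARCore's camera tracking state every frame so you can warn the user when tracking is lost:

    import com.google.ar.core.Camera;
    import com.google.ar.core.Frame;
    import com.google.ar.core.TrackingState;
    import com.google.ar.sceneform.FrameTime;
    import com.google.ar.sceneform.ux.ArFragment;

    public class TrackingMonitor {

        // Call once after the ArFragment is ready. showWarning is a hypothetical
        // callback for your own UI (toast, banner, etc.).
        public static void attach(ArFragment arFragment, Runnable showWarning) {
            arFragment.getArSceneView().getScene().addOnUpdateListener((FrameTime frameTime) -> {
                Frame frame = arFragment.getArSceneView().getArFrame();
                if (frame == null) {
                    return;
                }
                Camera camera = frame.getCamera();
                // When the phone moves too fast, the light is low, or the lens is
                // blocked, ARCore drops out of TRACKING and anchored models stop rendering.
                if (camera.getTrackingState() != TrackingState.TRACKING) {
                    showWarning.run();
                }
            });
        }
    }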

HERE map slow, how to turn off 3D buildings?

I'm implementing a simple navigation app with the HERE SDK for Android.
It has some great features that would be quite useful compared to my current Google Maps based app.
However, the app is very slow when navigating as well as when I simply scroll around the map. I assume that turning off the 3D buildings would improve performance, but I can't find a way to achieve this...
Is it possible? And how?
Thanks
Check out Map.setExtrudedBuildingsVisible(boolean visible)
See:
https://developer.here.com/mobile-sdks/documentation/android-hybrid-plus/topics_api_nlp_hybrid_plus/com-here-android-mpa-mapping-map.html#topic-apiref__setextrudedbuildingsvisible-boolean
There's another type of 3D buildings (3D landmarks, i.e. 3D models of famous buildings). Those you can activate/deactivate via setLandmarksVisible(false).
Btw: what device are you running, and what CPU/GPU chipset does it have? We know that extruded buildings can cause performance trouble on a few GPUs (see: https://developer.here.com/mobile-sdks/documentation/android-hybrid-plus/topics/development-tips.html)
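A minimal sketch putting the two calls together, assuming you already have the Map instance from the SDK's init callback (the helper class name is made up):

    import com.here.android.mpa.mapping.Map;

    public class MapTuning {

        // Call once the map engine is initialized, e.g. from the MapFragment init callback.
        public static void disable3dBuildings(Map map) {
            // Hide the extruded (block-shaped) buildings.
            map.setExtrudedBuildingsVisible(false);
            // Hide the detailed 3D landmark models as well.
            map.setLandmarksVisible(false);
        }
    }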

Controlling camera hardware in Android phone

I want to control the aperture, shutter speed and ISO on my Android phone. Is there a way in which I can access these hardware features?
I won't say it's impossible to do this, but it IS effectively impossible to do it in a way that's generalizable to all -- or even many -- Android phones. If you stray from the official path defined by the Android API, you're pretty much on your own, and this is basically an embedded hardware development project.
Let's start with the basics: you need a schematic of the camera subsystem and datasheets for everything in the image pipeline. For every phone you intend to support. In some cases, you might find a few phones with more or less identical camera subsystems (particularly when you're talking about slightly-different carrier-specific models sold in the US), and occasionally you might get lucky enough to have a lot of similarity between the phone you care about and a Nexus phone.
This is no small feat. As far as I know, not even Nexus phones have official schematics released. Popular phones (especially Samsung and HTC) usually get teardowns published, so everyone knows the broad details (camera module, video-encoding chipset, etc.), but there's still a lot of guesswork involved in figuring out how it's all wired together.
Make no mistake -- this isn't casual hacking territory. If terms like I2C, SPI, MMC, and IDCT mean nothing to you, you aren't likely to get very far. If you don't literally understand how CMOS image sensors are read out serially, and how Bayer arrays are used to produce RGB images, you're almost certainly in over your head.
That doesn't mean you should throw in the towel and give up... but it DOES mean that trying to hack the camera on a commercial Android phone probably isn't the best place to start. There's a lot of background knowledge you're going to need in order to pull off a project like this, and you really need to acquire that knowledge from a hardware platform that YOU control & have proper documentation for. Make no mistake... on the hierarchy of "hard" Android software projects, this ranks pretty close to the top of the list.
My suggestion (simplified and condensed a bit): buy a Raspberry Pi, and learn how to light up an LED from a GPIO pin. Then learn how to selectively light up 8 LEDs through a 74HC595 shift register. Then buy an SPI-addressed flash chip on a breakout board, and learn how to write to it. At some point, buy a video image sensor with a "serial" (fyi, "serial" != "rs232") interface from somebody like Sparkfun.com and learn how to read it one frame at a time, dumping the raw RGB data to flash. Learn how to use I2C to read and write the camera's control registers. At this point, you MIGHT be ready to tackle the camera in an Android phone for single photos.
If you're determined to start with an Android phone, at least stick to "Nexus" devices for now, and don't buy the phone (if you don't already own it) until you have the schematics, datasheets, and sourcecode in your possession. Don't buy the phone thinking you'll be able to trace the schematic yourself. You won't. At least, not unless you're a grad student and have one hell of a graduate-level electronics lab (with X-Ray capabilities) at your disposal. Most of these chips and modules are micro-BGA. You aren't going to trace them with a multimeter, and every Android camera I'm aware of has most of its low-level driver logic hidden in loadable kernel modules whose source isn't available.
That said, I'd dearly love to see somebody pull a project like this off. :-)
Android has published online training that contains all the information you need:
You can find it here - Media APIs
However, there are limitations: not all hardware supports all kinds of parameters.
And if I recall correctly, you can't control the shutter speed and ISO.
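To see what a given device actually exposes, here is a hedged sketch using the old android.hardware.Camera API; the exposure compensation range is part of the official API, while the "iso-values" key is a vendor-specific extension that many devices simply do not provide:

    import android.hardware.Camera;

    public class CameraCapabilityProbe {

        @SuppressWarnings("deprecation")
        public static void probe() {
            Camera camera = Camera.open();
            if (camera == null) {
                return; // no back-facing camera available
            }
            try {
                Camera.Parameters params = camera.getParameters();

                // Official API: exposure compensation steps supported by the device.
                int minEv = params.getMinExposureCompensation();
                int maxEv = params.getMaxExposureCompensation();
                System.out.println("Exposure compensation range: " + minEv + ".." + maxEv);

                // Vendor-specific key; null on devices that do not expose ISO control.
                String isoValues = params.get("iso-values");
                System.out.println("Vendor ISO values: " + isoValues);
            } finally {
                camera.release();
            }
        }
    }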

How to determine a program's hardware requirements when porting from Windows to Android?

I want to port a 3D program written in OpenGL on the Windows platform to Android, but I wonder whether it will run smoothly on typical Android devices, so I want to estimate how much hardware resource it needs to run smoothly. This is similar to the recommended hardware requirements a company publishes for a piece of software or a 3D game. I don't know how to work out such requirements for my program when porting it to Android.
I used gDEBugger and it gave me some information, but I don't think that is enough. Does anyone have an idea or a solution? Many thanks in advance!
If your program is simple enough, you could write up some estimates about texture fill rate, which is a pretty basic (and old) metric of rendering performance. Nearly every 3D chip comes with a theoretical fill rate, so you can get the theoretical numbers of both your desktop system and some Android phones.
The texture memory footprint is another thing that you can estimate, especially using gDEBugger. Once again, these numbers are known for most chips.
This is a quick way to produce some numbers, obviously without any real life performance guarantees.
The best way would be to test it on an actual device, and get an idea of what hardware works well. You could distribute a beta app and get some feedback too.
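For the fill-rate estimate described above, a rough back-of-the-envelope sketch (every number below is a placeholder you would replace with your own scene and the target GPU's datasheet value):

    public class FillRateEstimate {

        public static void main(String[] args) {
            int width = 1280;          // target screen width in pixels (assumed)
            int height = 720;          // target screen height in pixels (assumed)
            double overdraw = 3.0;     // average times each pixel is written per frame (assumed)
            double targetFps = 30.0;   // desired frame rate (assumed)

            double requiredMpixPerSec = width * height * overdraw * targetFps / 1e6;

            // Theoretical fill rate of the target GPU, taken from its datasheet (assumed).
            double deviceFillRateMpixPerSec = 1000.0;

            System.out.printf("Required: %.0f Mpix/s, device: %.0f Mpix/s%n",
                    requiredMpixPerSec, deviceFillRateMpixPerSec);
            System.out.println(requiredMpixPerSec < deviceFillRateMpixPerSec
                    ? "Probably within budget (ignoring shader cost, bandwidth, CPU)."
                    : "Likely fill-rate bound on this device.");
        }
    }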
It depends on the feature set that you use. For example, if you use FBOs, the device will have to support the framebuffer object extension. If you use MSAA or smooth lines, the device will have to support the corresponding extensions.
After listing your requirements, you can use glGet to check for device support:
http://www.opengl.org/sdk/docs/man/xhtml/glGet.xml
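A minimal sketch of such a check, assuming a GLSurfaceView configured for an ES 2.0 context (setEGLContextClientVersion(2)); the extension name below is just an example of something a port might depend on:

    import android.opengl.GLES20;
    import android.opengl.GLSurfaceView;
    import android.util.Log;

    import javax.microedition.khronos.egl.EGLConfig;
    import javax.microedition.khronos.opengles.GL10;

    public class CapabilityCheckRenderer implements GLSurfaceView.Renderer {

        @Override
        public void onSurfaceCreated(GL10 gl, EGLConfig config) {
            // Query the extension string once a GL context exists.
            String extensions = GLES20.glGetString(GLES20.GL_EXTENSIONS);
            boolean hasDepthTexture = extensions != null
                    && extensions.contains("GL_OES_depth_texture");

            // glGet can also report implementation limits, e.g. maximum texture size.
            int[] maxTextureSize = new int[1];
            GLES20.glGetIntegerv(GLES20.GL_MAX_TEXTURE_SIZE, maxTextureSize, 0);

            Log.i("Caps", "depth texture: " + hasDepthTexture
                    + ", max texture size: " + maxTextureSize[0]);
        }

        @Override
        public void onSurfaceChanged(GL10 gl, int width, int height) { }

        @Override
        public void onDrawFrame(GL10 gl) { }
    }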

Android with E-Ink display

I'm interested in using Android for an E-Ink-based platform. I know it has been demonstrated once by MOTO, but I'm interested in using it for a commercial-grade product and not 'just' a technology demo. I have a question about the ability to change the platform to cope with the specific display effects caused by E-Ink. I'm asking this question in the role of system architect and have no prior experience with Android.
E-Ink has several characteristics which are very different from common LCD displays:
time to update display (50-700ms)
it costs power to change the display (none to maintain)
display lifetime is determined by the number of display updates!
tradeoffs can be made between quality, performance and display lifetime
grayscale versions available
The great thing: it costs no power to retain display information, and the display can be read in bright sunlight with no backlight. It can also be literally as thin as paper...
This means that the platform software needs to have a degree of control over the number and type of display updates to get the best performance. Otherwise, an application which is unaware of the display characteristics could quickly drain the battery, or worse, shorten the display lifetime to months instead of years. Conceptually I'd be interested in replacing the display driver, but I'm not sure if this part is open. I know it is hard to get info on the Qualcomm chipsets....
My question: can this be done? Can the Android platform be modified to support a drastically different display effect? Any pointers to an android roadmap?
The reason I find Android interesting for this application is because there is a significant overlap in functionality (from cell phone to browser).
Thanks!
I cannot agree more, and I have started to lobby app and OS developers to improve readability on e-ink:
Make scrolling and page turns e-ink friendly http://github.com/aarddict/android/issues/28#issuecomment-3512595
Looking around on the web, I find a recurring theme: "we had to rebuild WebView from scratch to adapt it to the e-ink display".
There are already coding solutions which reduce flicker and page refreshes. Most of them are kept by those who market the e-ink readers, who prefer to keep them as front ends to their shops.
I contacted the author(s) of Cool Reader about their implementation of smooth scrolling on e-ink devices and got the following reply:
"Hello. Look at N2EpdController.java. The author is DairyKnight from xda-developers. At least you can use it under the GPL. For use in a closed project I would recommend contacting him."
Ideally, display components for e-ink devices should be part of WebKit's WebView framework. I've submitted a feature request via
http://bugs.webkit.org/show_bug.cgi?id=76429
FYI, E-Ink has an Android-on-E-Ink development kit, the AM350, that is being sold now: http://www.eink.com/sell_sheets/AM350_Kit_Sell_Sheet.pdf
http://www.linuxworld.com/news/2007/112707-kernel.html
In this case the application domain is e-reading, where the advantages of E-Ink are more important than the disadvantages (slow display updates).
I've done some further studies of Android. I believe the trick is to perform display updates asynchronously: provide applications with an environment that mimics immediate display updates, while detecting the relevant changes (e.g. by using the graphics processor and/or MMU) and turning them into intelligent display updates. Not all types of applications would be suitable; games and video playback, for instance, require immediate display updates.
Making such a platform will be less than trivial; however, with the growing number of different hardware platforms, abstractions are becoming better all the time.
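As a toy illustration of that coalescing idea at the application level (not a display driver), here is a sketch of a hypothetical View that batches redraw requests so the panel is refreshed at most once per interval; the class name and the 500 ms figure are assumptions:

    import android.content.Context;
    import android.graphics.Canvas;
    import android.view.View;

    public class CoalescingEinkView extends View {

        // Assumed minimum time between e-ink panel refreshes.
        private static final long MIN_REFRESH_INTERVAL_MS = 500;
        private boolean refreshPending = false;

        public CoalescingEinkView(Context context) {
            super(context);
        }

        // Call this instead of invalidate(); a burst of requests collapses into one redraw.
        public void requestEinkRefresh() {
            if (refreshPending) {
                return;
            }
            refreshPending = true;
            postDelayed(() -> {
                refreshPending = false;
                invalidate(); // one real redraw (and thus one panel update) per burst
            }, MIN_REFRESH_INTERVAL_MS);
        }

        @Override
        protected void onDraw(Canvas canvas) {
            // Draw the current content; on e-ink every onDraw() costs a panel update.
            super.onDraw(canvas);
        }
    }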
I know this is an old question, but I have found it through Google - others might want to know this too.
The PocketBook Pro 902/903 are based on Android and feature an e-ink screen. You might want to check them out. There might be other models too - I am interested in these because of their 10" screen. YMMV.
