Recognize active objects with a capacitive touch screen display - Android

I'm trying to develop an app that can recognize an active object (for example, a memory card) when it touches the smartphone display. Before I start developing, I need to know whether there are any objects that a touch screen display can recognize. What kinds of devices can a smartphone display detect? I'm interested in this for both iPhone and Android phones.
I found this app, and you can see that with a card you can interact with a mobile device. Now I'm asking whether anyone knows how to build this kind of app for an iPhone or an Android phone.
Does anyone know how to do that? Is there a library (iOS or Android) to recognize objects placed on the display?

Volumique is the company that develops the Monopoly card technology you are talking about. However, I will suggest two things.
For Android devices you can use NFC. It's similar to what you are doing now, except you only need to bring your object close to the screen; it doesn't need to actually touch it.
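A minimal sketch of the Android side, assuming a plain foreground-dispatch reader (the activity name is made up), might look like this:

    import android.app.Activity;
    import android.app.PendingIntent;
    import android.content.Intent;
    import android.nfc.NfcAdapter;
    import android.nfc.Tag;
    import android.os.Bundle;

    // Hypothetical activity that listens for any NFC tag brought near the device.
    public class TagReaderActivity extends Activity {

        private NfcAdapter nfcAdapter;
        private PendingIntent pendingIntent;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            nfcAdapter = NfcAdapter.getDefaultAdapter(this);
            // Deliver tag-discovered intents back to this already-running activity.
            // (Recent Android versions also require a PendingIntent mutability flag.)
            pendingIntent = PendingIntent.getActivity(this, 0,
                    new Intent(this, getClass()).addFlags(Intent.FLAG_ACTIVITY_SINGLE_TOP), 0);
        }

        @Override
        protected void onResume() {
            super.onResume();
            if (nfcAdapter != null) {
                // null filters/tech lists = accept every tag while in the foreground.
                nfcAdapter.enableForegroundDispatch(this, pendingIntent, null, null);
            }
        }

        @Override
        protected void onPause() {
            super.onPause();
            if (nfcAdapter != null) {
                nfcAdapter.disableForegroundDispatch(this);
            }
        }

        @Override
        protected void onNewIntent(Intent intent) {
            super.onNewIntent(intent);
            Tag tag = intent.getParcelableExtra(NfcAdapter.EXTRA_TAG);
            if (tag != null) {
                // The tag ID (or its NDEF payload) identifies the physical object.
                byte[] id = tag.getId();
            }
        }
    }

The manifest also needs the android.permission.NFC permission, and the object itself would have to contain an NFC tag.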
For iOS, there is no NFC or RFID technology available. However, you can build hardware that has active capacitors arranged in a pattern on it, so that when you bring it close to the iOS screen the touch controller recognizes the pattern of capacitors and reports it to the main controller, which can then identify the object with the help of your code.
Capacitive touch screens like the iPhone's are basically an array of capacitors arranged in a grid. When you touch the screen with a finger, you change the capacitance of one or two capacitors, and the controller works out the location of the change. If an external object changes the capacitance of, say, five or six sensors at the same time in a particular arrangement, such as a pentagon, then your software can check whether the affected sensor locations form that shape and, if so, report to the viewer that it is a $5 card (just an example). This is one way I can think of doing this.
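To illustrate the software side of that idea (this is a rough Android sketch, not how Volumique actually does it), an app can read the simultaneous touch points from a MotionEvent and test whether they match a stored footprint. The point count, listener name and matching helper below are all placeholders:

    import android.view.MotionEvent;
    import android.view.View;

    // Sketch: detect whether the simultaneous contact points left by a conductive
    // object match a known footprint (here, a crude "five points" check).
    public class PatternTouchListener implements View.OnTouchListener {

        private static final int EXPECTED_POINTS = 5; // e.g. a pentagon of contact pads

        @Override
        public boolean onTouch(View v, MotionEvent event) {
            if (event.getPointerCount() == EXPECTED_POINTS) {
                float[] xs = new float[EXPECTED_POINTS];
                float[] ys = new float[EXPECTED_POINTS];
                for (int i = 0; i < EXPECTED_POINTS; i++) {
                    xs[i] = event.getX(i);
                    ys[i] = event.getY(i);
                }
                if (matchesKnownFootprint(xs, ys)) {
                    // Recognized the object; react accordingly.
                    return true;
                }
            }
            return false;
        }

        // Placeholder: compare pairwise distances/angles against the stored pattern.
        private boolean matchesKnownFootprint(float[] xs, float[] ys) {
            return false; // real geometry matching would go here
        }
    }

Note that capacitive panels report only a limited number of simultaneous touch points, so any object footprint has to fit within that limit.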
Thanks

Related

How to support dual screen on Android with an interactive screen?

I am making a game that runs on an Android phone with an interactive station. The station looks like an interactive screen: when the phone is connected to it, the station shows the same desktop content as the phone and can launch apps and do everything the phone can; it can also launch different apps on the phone and the station respectively. I want to launch my app and see a different view on the phone screen and the station screen, with both screens able to receive user input events.
I have tried using
ExtCamera.SetTargetBuffers(Display.displays[1].colorBuffer, Display.displays[1].depthBuffer);
to output the ExtCamera view to the secondary screen.
I can see different camera views on the station screen and the phone screen, but the station screen cannot receive user input. I wonder whether that is because this method only renders a texture to the screen, so it cannot receive input, or whether there is another way to achieve this goal. I know that native Android offers the Presentation class to support multi-display, but making Android and Unity interact that way seems like a large amount of work.
Any help would be appreciated!

How to position virtual objects in specific places in a scene without a marker?

I have a question about using ARCore.
How could one develop an app that recognizes the environment and places certain virtual elements at specific spots in the scene? For example, when viewing a hallway with a few doors, a ladder and an exit, the app places a virtual board (sign) over the ladder with the word 'ladder' written on it, a board on each door with the name of the room, and a board saying 'exit' at the exit. Is this possible? It is not a geolocation app, because GPS would not be used; I wanted to do this from recognition of the environment.
Even with Vuforia I found it difficult, and so far I have not managed it.
Can someone help me? Is there a manual or tutorial about it? Preferably not on video.
I thank everyone.
You will not go to space today
You want to do something that virtually no software can do yet. The only system I've seen that can do anything even remotely close to what you want is the Microsoft HoloLens, and even it can't do what you're asking. The HoloLens can only identify the physical shape of the environment, providing details such as "floor-like, 3.7 square meters" and "wall-like, 10.2 square meters," and it does so in 8 cm cube increments (any given cube is updated once every few minutes).
On top of that, you want to identify "doors." Doors come in all shapes and sizes and in different colors. Recognizing each door in a hallway and then somehow identifying each one with a room number from a camera image? Yeah, no. We don't have the technology to do that yet, not outside Google Labs and Boston Dynamics.
You will not find what you are looking for.

Translate Android movement to pixels

I would like to create an Android app for viewing images.
The idea is that users keep their tablet flat on the table (for the sake of simplicity, only X and Y for now) and scroll a picture (one that is too big to fit the screen) by moving the tablet (yes, this app has to use tablet movement; sorry, no fingers allowed :) ).
I managed to get some basic framework implemented (implementing sensor listeners is easy), but I'm not sure how to translate "LINEAR_ACCELERATION" to pixels. I'm pretty sure it can be done (for example, check "photo sphere" or "panorama" apps that move content exactly as you move your phone) but I can't find any working prototype online.
Where can I see how that kind of "magic" is done in real world?
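For reference, a minimal version of the listener setup described above might look like the sketch below, assuming TYPE_LINEAR_ACCELERATION. The naive double integration and the pixel scale factor are placeholder assumptions, and accelerometer data drifts quickly when integrated this way:

    import android.app.Activity;
    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;

    // Sketch: integrate linear acceleration twice to estimate how far the tablet
    // has moved, then scale that displacement to pixels.
    public class MotionScrollActivity extends Activity implements SensorEventListener {

        private static final float PIXELS_PER_METER = 5000f; // arbitrary placeholder scale

        private SensorManager sensorManager;
        private long lastTimestampNs = 0;
        private float velocityX = 0f, velocityY = 0f; // m/s
        private float offsetX = 0f, offsetY = 0f;     // pixels

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        }

        @Override
        protected void onResume() {
            super.onResume();
            Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
            sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        protected void onPause() {
            super.onPause();
            sensorManager.unregisterListener(this);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (lastTimestampNs != 0) {
                float dt = (event.timestamp - lastTimestampNs) * 1e-9f; // seconds
                // Naive double integration: acceleration -> velocity -> displacement.
                velocityX += event.values[0] * dt;
                velocityY += event.values[1] * dt;
                offsetX += velocityX * dt * PIXELS_PER_METER;
                offsetY += velocityY * dt * PIXELS_PER_METER;
                // offsetX/offsetY would drive the image scroll position here.
            }
            lastTimestampNs = event.timestamp;
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }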

How to access the raw capacitance image on an Android device

I am stuck on how to get the capacitance image out of an Android phone. Is it necessary to hack the Android system to get the capacitance image?
The released Android API only lets me find out which point a finger is touching on the screen. But since the touchscreen uses capacitive sensing, I am wondering how I can get the raw capacitance sensing data from the system. I guess I may need to hack the system for further information. Has anyone done this before? Or can you give me some hints or directions on where I should start?

Is it possible to show different content on the Android's screen and the screen connected via HDMI?

I'm wondering whether I can develop an application where the main content is displayed on a big screen connected to my Android device via HDMI, while the Android touchscreen acts as a controller displaying different content.
So far the videos I've seen about Android's HDMI feature only mirror the phone's screen to the big screen.
You can use the Android Presentation API (API 17).
Works very well.
Your Presentation is attached to an Activity, which lets you display, for example, a live stream on the TV (through HDMI, for instance) and use the phone's display as a remote. I've done this in an app, and out of laziness I also added a second app for a second phone that is used as a Bluetooth remote control.
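A minimal sketch of that setup (the activity and layout names here are made up): subclass Presentation for the content shown on the external display, then create and show it from the controlling Activity:

    import android.app.Activity;
    import android.app.Presentation;
    import android.content.Context;
    import android.hardware.display.DisplayManager;
    import android.os.Bundle;
    import android.view.Display;

    // Content shown only on the external (HDMI) display.
    class ExternalScreen extends Presentation {
        ExternalScreen(Context outerContext, Display display) {
            super(outerContext, display);
        }

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.external_screen); // hypothetical layout
        }
    }

    // Controlling activity: its own layout stays on the phone and acts as the remote.
    public class RemoteControlActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.remote_control); // hypothetical layout

            DisplayManager dm = (DisplayManager) getSystemService(Context.DISPLAY_SERVICE);
            Display[] external = dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION);
            if (external.length > 0) {
                new ExternalScreen(this, external[0]).show();
            }
        }
    }

Touch events on the phone go to the Activity's own views as usual, so the phone layout can act as the remote while the Presentation layout stays on the TV.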
Hope this answers your question.
SurfaceFlinger only sees two kinds of graphic buffers: frame buffers for the normal UI and overlay buffers for video and camera previews. Those frame buffers (or overlay buffers) should be sent to HDMI by the display controller when the HDMI cable is plugged in. Unfortunately, there is no public API to control this data flow; it depends heavily on how the hardware overlay or HDMI device drivers are implemented by the chipset vendor or device manufacturer.
I don't think you can do this unless you develop for a device whose vendor has published an HDMI API, as with some Motorola devices. Other devices typically have an HDMI OS service (not accessible to apps) that uses ioctls and /dev/ access for HDMI control (again, not accessible to unsigned apps).
You can exploit a quirk in the HDMI overlay handling to achieve this: the video input goes directly to the HDMI output, while a separate layout shown on the phone screen is not visible over HDMI because of that overlay behaviour.
