SDK for stop-motion mannequin tracking? - android

I am looking for an SDK (preferably for Unity) that can track a real-life humanoid mannequin with passive trackers at its joints, using ONLY a built-in mobile device camera. From a technical point of view, I essentially want to achieve the same functionality as presented in this video: https://www.youtube.com/watch?v=KyO5FNhoApw
If I could find an SDK that does this, or at least something similar, I think I could build on it. The important part is to avoid using a Kinect and to rely only on the camera of the mobile device that runs the application.
Do you have any starting points in mind, maybe an SDK that might be helpful to check out?
Thank you so much in advance! T

Related

AnchorNode models disappear in Android Sceneform SDK

I detect a plane in Sceneform/ARCore and add a few models on an AnchorNode, but the models disappear in the following cases:
Moving the phone quickly
Low light
Blocking the camera's view
So why do they disappear?
Does anyone have an idea how to overcome this issue?
It is natural, because all three cases you listed make it hard for ARCore to track feature points.
And frankly, there is no way to overcome this, because tracking feature points is ARCore's job, not yours.
I'd rather let users know that in some specific environments the application might not work properly. Or you could go ask the ARCore developers.
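If you do want to surface this to users, ARCore reports the camera's tracking state on every frame, and you can react when it drops out of TRACKING. A minimal Java sketch, assuming a Sceneform ArSceneView; the log tag and message are illustrative:

    import android.util.Log;

    import com.google.ar.core.Camera;
    import com.google.ar.core.Frame;
    import com.google.ar.core.TrackingState;
    import com.google.ar.sceneform.ArSceneView;

    // Checks the camera's tracking state on every frame so the app can warn
    // the user when ARCore loses its feature points (fast motion, low light,
    // blocked camera) instead of silently dropping the models.
    void watchTrackingState(ArSceneView arSceneView) {
        arSceneView.getScene().addOnUpdateListener(frameTime -> {
            Frame frame = arSceneView.getArFrame();
            if (frame == null) {
                return; // no ARCore frame available yet
            }
            Camera camera = frame.getCamera();
            if (camera.getTrackingState() != TrackingState.TRACKING) {
                // Tracking is PAUSED or STOPPED; show a hint in your UI here.
                Log.w("ARTracking", "Tracking lost - slow down or improve lighting");
            }
        });
    }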

Can I use open source code to run and monitor a Nest security camera?

I have a MacBook and I would like to use it to monitor a Nest wireless security camera, including an approximately 1 TB archive of continuously updated video history (perhaps of motion-detected clips only). This can be done by subscribing to a Nest cloud account, but that can get expensive, especially for several cameras, so I'd rather do it myself.
Can anyone point me to open-source code that will handle this? If not, is there another type of camera that will allow me to do this over wifi?
As promised above, I will update the status of this issue.
After a significant amount of work and also significant progress, I was able to connect to the live Nest camera feed programmatically, but I was never able to actually record the live stream into short videos, although this was easy for my MacBook webcam. My belief is that Nest has engineered this feed so that camera owners cannot directly access it, leaving no option but to use their "Nest Aware" monthly service. I do not want to do this, both because I do not want to pay for it and because I want to create options that Nest Aware does not offer.
Searching the web, it appears that this kind of thing might be done using another software package, "Blue Iris". I did not want to get this either, as I am sure flexibility would be sacrificed, and the camera would also need to be made publicly shared(!)
So I am giving up on Nest, although I like the hardware.
I did find an alternative. I also had an Arlo Q camera and I tried that, using an open source API on GitHub:
https://github.com/jeffreydwalter/arlo
I was able to access the camera and save motion detected videos to my disk within an hour of finding the above link. So, if you want to do this type of thing, I recommend Arlo over Nest.

Sony Alpha 7R Camera - On-Camera Apps

I have a Sony Alpha 7R camera and am looking for information about its "built-in" application support. What are those apps, Android-based? Is there public information about how to create and install your own on-camera app -- NOT talking about the remote API.
The few available apps are kind of primitive and limited; in particular I'd like to create a more versatile "interval timer" app -- the built-in time-lapse app is too simple for my purposes.
To be specific: versatile bracketing, absolute start/stop times, complex shooting programs with pre-programmed ISO, shutter, bracketing, etc. for a series of programmed interval shots, or simply shooting as fast as possible... As an example, I just lost valuable time shooting an eclipse because I had to reconfigure and switch modes.
Ideally, I could upload a shooting script to the app on the camera.
The real answer is that you can build applications against the Camera API using many different methods. When you create an application for the Camera API, you are just making API calls to the camera while your code is connected to the camera's Wi-Fi somehow. In the end, the easiest way to distribute your code is via smartphones, as it will work for iOS, Windows, etc. as well as Android, but you are not limited to these technologies. Please let me know if I can provide more information.
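For reference, the Wi-Fi calls described above are plain JSON-RPC over HTTP in Sony's Camera Remote API. Below is a minimal Java sketch that triggers a single shot; it assumes your code has already joined the camera's own Wi-Fi network and that the camera serves the API at the common default address 192.168.122.1 (check what your model actually advertises):

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.Scanner;

    // Fires one shot via Sony's Camera Remote API (JSON-RPC over HTTP).
    // 192.168.122.1:8080 is the usual default endpoint when connected to
    // the camera's own Wi-Fi; verify this for your specific model.
    public class SonyRemoteShot {
        public static void main(String[] args) throws Exception {
            URL url = new URL("http://192.168.122.1:8080/sony/camera");
            String body = "{\"method\":\"actTakePicture\",\"params\":[],\"id\":1,\"version\":\"1.0\"}";

            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/json");
            conn.setDoOutput(true);
            try (OutputStream out = conn.getOutputStream()) {
                out.write(body.getBytes(StandardCharsets.UTF_8));
            }

            // On success the JSON response contains a URL for the captured image.
            try (Scanner in = new Scanner(conn.getInputStream(), StandardCharsets.UTF_8.name())) {
                while (in.hasNextLine()) {
                    System.out.println(in.nextLine());
                }
            }
        }
    }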

How to detect 3D touches in Android?

I want to implement 3D touches in Android, just like the 3D Touch feature on the iPhone 6S and 6S Plus.
I looked around on Google and couldn't find any consistent material.
I could only find an example in the Lua language, and I am not sure yet if it's exactly what I am looking for.
So I thought that if there are no libraries out there, maybe I should implement the algorithm from scratch, or even create a library for it.
But I don't know where to start. Do you guys have any clue?
I believe you could implement something similar using MotionEvent, which has a getPressure() method that is supposed to return a value between 0 and 1 representing the amount of pressure on the screen. You could then do something different depending on the amount of pressure detected.
Note that some devices do not support this feature, and some (notably the Samsung Galaxy S3) return inconsistent values.
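As a minimal sketch of that approach (the 0.75 threshold is an arbitrary illustrative value that you would have to calibrate per device, given the caveat above):

    import android.util.Log;
    import android.view.MotionEvent;
    import android.view.View;

    // Distinguishes a "hard" press from a normal tap using getPressure().
    // Some devices report a constant value or an inconsistent range, so
    // the 0.75f threshold must be tuned (or calibrated at runtime).
    View.OnTouchListener pressureListener = (view, event) -> {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            float pressure = event.getPressure(); // nominally 0..1
            if (pressure > 0.75f) {
                Log.d("Pressure", "Hard press: " + pressure);
                // trigger the "3D touch"-style action here
            } else {
                Log.d("Pressure", "Normal press: " + pressure);
            }
        }
        return false; // let other listeners handle the event as usual
    };

Attach it to any view with setOnTouchListener(pressureListener).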
I don't think it is possible on currently available Android devices. 3D Touch is a hardware technology embedded in iPhone displays; I don't think you can implement it just by writing code in your Android application.
Short answer - no.
You would need to wait for Google to actually copy the technology, if it proves to be useful. But I doubt it'll happen in the near future, because Android is all about accessibility and such screens would be quite expensive.
Long answer - Android is open source. If you are making something internal, then go ahead; it'll allow you to do that with some modifications. Build a device, put in your modified code, create your own application that takes advantage of the feature, and be happy to announce it to the world.

Human Position Tracking

I need to write a software project for human location tracking in a room, something like this: https://www.youtube.com/watch?v=4R5LlpGUpqY
The requirement is to enable tracking from the phone (an Android application) and to show a 2D tracking result in that application.
So should I use an application that communicates with, e.g., a Raspberry Pi (Arduino?) that performs the tracking and sends the data to the phone? Is this a good idea?
I started thinking about the hardware I should use; what would be the best choice?
Radar, motion sensors, Wi-Fi (https://www.youtube.com/watch?v=YuQsGNq_6as)? How many sensors, radars, etc.?
I do not know where I should start...
Your question appears to be off-topic, but as a starting point I suggest you take a look at this tag on SO, which is about indoor positioning, as well as this question; also see this project, which might help you more.
Other links that might help:
1- 10-things-you-need-to-know-about-indoor-positioning
2- Indoor Wireless Navigation
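
If you go the Wi-Fi route, a common first step is to read the signal strength (RSSI) of nearby access points and fingerprint positions in the room from those readings. A minimal Android sketch of the data-collection half, assuming location and Wi-Fi permissions are already granted (the positioning math itself is up to you):

    import android.content.Context;
    import android.net.wifi.ScanResult;
    import android.net.wifi.WifiManager;
    import android.util.Log;

    import java.util.List;

    // Dumps the RSSI of every visible access point. Indoor positioning
    // systems typically compare these readings against a pre-recorded
    // "fingerprint" map of the room; this covers only the data collection.
    void logAccessPointSignalStrengths(Context context) {
        WifiManager wifi = (WifiManager) context
                .getApplicationContext()
                .getSystemService(Context.WIFI_SERVICE);
        List<ScanResult> results = wifi.getScanResults(); // needs location permission
        for (ScanResult ap : results) {
            // BSSID identifies the access point; level is RSSI in dBm.
            Log.d("IndoorPos", ap.BSSID + " (" + ap.SSID + "): " + ap.level + " dBm");
        }
    }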
