android audio - calculating the distance between two devices

I'm having a really hard time calculating the distance between two Android phones using sound.
- The main idea is to have the two phones synced to the same clock, and have mobile A send a message to mobile B to let it know it will play a sound soon; note that mobile A saves this time.
- Mobile B then sends "OK, you can go ahead" to mobile A while it starts recording for the next second or so.
- Mobile A gets the "OK" and starts playing a 1000 Hz tone.
- Mobile B detects that frequency and sends its current time to mobile A.
Now we have all the information needed to calculate the distance. The problem is that in theory this is all good, but when I implement it there is a lot of random latency added into the equation.
The main problem is that I can't pin down the ABSOLUTE time at which mobile B picked up the target frequency.
I tried recording not the whole 1000 ms but lots of "mini" chunks (12-24 ms each), but the time the phone spends in the recorder_.startRecording()/recorder_.read()/recorder_.stop() calls is too long, and I miss the frequency by many milliseconds (each millisecond equals roughly 30 cm, so I can't afford much error...).
Can anyone tell me what I'm doing wrong, or point me to a better way of doing this?
The main issue is that the recording device can't pin down the actual time at which it recorded the wanted frequency.
Thanks in advance,
Ofer.

Please have a look at new audio features introduced in API 19.
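Independently of the API level, most of the per-chunk overhead goes away if you never stop the recorder: keep one AudioRecord running and derive the detection time from the sample index instead of from wall-clock readings taken around startRecording()/read()/stop(). Below is a minimal, hedged sketch of that idea (not code from the question); indexOf1kHz() is a placeholder where a Goertzel filter or FFT bin check would go.

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.SystemClock;

public class ToneTimestamper {

    private static final int SAMPLE_RATE = 44100;

    // Returns the uptimeMillis at which the 1 kHz tone was first seen, or -1.
    public long detectToneUptimeMillis() {
        int minBuf = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBuf * 4);

        short[] chunk = new short[1024];
        long startUptime = SystemClock.uptimeMillis(); // startRecording() latency still remains
        recorder.startRecording();

        long totalSamples = 0;
        long detectionUptime = -1;
        while (detectionUptime < 0) {
            int read = recorder.read(chunk, 0, chunk.length);
            if (read <= 0) break;
            int hit = indexOf1kHz(chunk, read);
            if (hit >= 0) {
                // Convert the sample offset into milliseconds since recording began.
                detectionUptime = startUptime + (totalSamples + hit) * 1000L / SAMPLE_RATE;
            }
            totalSamples += read;
        }

        recorder.stop();
        recorder.release();
        return detectionUptime;
    }

    // Placeholder: a Goertzel filter or FFT bin check for 1000 Hz would go here.
    private int indexOf1kHz(short[] samples, int length) {
        return -1;
    }
}

The remaining unknown is the latency between the clock snapshot and the first sample actually reaching the buffer, which is one reason the newer audio timing features mentioned in the answer above are worth a look.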

Related

App using Accelerometer in Android

I am a beginner in Android. I am trying to implement a fitness app that can keep track of running speed and running distance on Android. How can I calculate these?
In theory you could analyse windows of accelerometer data and count the number of peaks, and their magnitudes, to detect running. Then, if the user has entered an average step length, that gives you an equation for distance.
It would be a lot easier using GPS, as it provides the speed directly.
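To make the peak-counting idea concrete, here is a rough, untested sketch; the threshold value and the user-supplied step length are assumptions you would need to tune, and speed would simply be this distance divided by the elapsed time.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

public class StepDistanceEstimator implements SensorEventListener {

    private static final float PEAK_THRESHOLD = 12f;   // m/s^2, assumption: tune per device
    private final float stepLengthMeters;               // entered by the user
    private int stepCount = 0;
    private boolean aboveThreshold = false;

    public StepDistanceEstimator(float stepLengthMeters) {
        this.stepLengthMeters = stepLengthMeters;
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
        float x = event.values[0], y = event.values[1], z = event.values[2];
        float magnitude = (float) Math.sqrt(x * x + y * y + z * z);

        // A rising edge through the threshold counts as one step (one peak).
        if (magnitude > PEAK_THRESHOLD && !aboveThreshold) {
            stepCount++;
            aboveThreshold = true;
        } else if (magnitude < PEAK_THRESHOLD) {
            aboveThreshold = false;
        }
    }

    public float distanceMeters() {
        return stepCount * stepLengthMeters;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}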
You might be interested in this library: https://github.com/mcharmas/Android-ReactiveLocation I recently added Activity Recognition, which can tell you when a user starts running. It might take a little while from when someone begins running before the phone 'knows' that is the activity, though.

Building an app that records the last X minutes of your Android device

I am trying to build an app that makes a video of your screen and keeps only the last X minutes of that video. I have found code for running adb shell commands from an app:
Process process = Runtime.getRuntime().exec("your command");
BufferedReader bufferedReader = new BufferedReader(
new InputStreamReader(process.getInputStream()));
But I have not found any lead on how I can keep only the last X minutes. Any ideas on how I can do that? Or maybe it's not possible without rooting?
Thanks
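One hedged sketch of how the "keep only the last X minutes" part could work with that shell approach: record the screen in short screenrecord segments and delete the ones that fall outside the window. Note the assumption that the process is allowed to run screenrecord at all, which normally requires a shell or rooted context; the file names and segment length here are made up for the example.

import java.io.File;
import java.io.IOException;

public class RollingScreenRecorder {

    private static final int SEGMENT_SECONDS = 30;

    // Records 30-second segments forever (until killed), keeping only the last keepMinutes.
    public void recordRolling(File dir, int keepMinutes) throws IOException, InterruptedException {
        int keepSegments = keepMinutes * 60 / SEGMENT_SECONDS;
        for (int i = 0; ; i++) {
            File segment = new File(dir, "segment_" + i + ".mp4");
            Process p = Runtime.getRuntime().exec(new String[] {
                    "screenrecord", "--time-limit", String.valueOf(SEGMENT_SECONDS),
                    segment.getAbsolutePath()
            });
            p.waitFor();   // blocks for roughly one segment length

            // Drop the oldest segment once we have more than the window needs.
            File old = new File(dir, "segment_" + (i - keepSegments) + ".mp4");
            if (old.exists()) old.delete();
        }
    }
}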
I am not trying to build this, just to find an Android app where, when I press "record", the previous few minutes (2, 3, 10, or user-chosen) are already saved, in RAM I imagine.
You may remember that when MiniDisc came out (around the millennium or before), a few MD players had this function, which you could turn on and off: if you had it on, when you pushed the record button the last 6 seconds were already saved.
The logic was really simple: when you record a song from the radio you may miss the start by a few seconds before pressing "record", so the MD already had the last 6 seconds saved!
So, if you had the feature on and wanted to record, you already had a few seconds written, and with a simple edit you could set the start time (deleting the previous seconds) and the end time (deleting the last seconds).
Really simple. The point is that, apart from not being anything new, it's 100% workable and easy to develop, and I am amazed that I have not seen even one app like this so far.
Is it this hard to develop such a thing for a camera and a mobile?
I am asking about driving, of course, so that I don't have to record video every time I drive (and I drive all day, every day) with the phone in its stand in the car.
So, if someone makes this application, I have two things to ask them to add: first, let the user decide how many minutes back they would like already saved, and second, if possible, make it work even while the screen is locked.
Why not give it a try? I know nobody can be sure they will never need it, but maybe the one time I do, I will at least know that when I hit "record" or "capture" or whatever, the previous XX minutes are already recorded.
Very useful indeed for drivers like me who don't want to install cameras in the car; with my phone holder, an app with that feature would be perfect.
If I am missing something and there is already an app like this that I haven't found, I don't know.
But developers, just think about it.
It's very practical and simple, and you could have video proof of, say, a crash without installing 2-3 cameras, only with the mobile.
No idea is a bad idea for a smart developer, I think; whoever wants to can sit down and write it.
Thanks, take care guys.

Android: automatically delete everything before the last 2 minutes of a video stream

I want to make an Android app that records a video stream and, as long as the user does not push a button, deletes everything before the last 120 seconds of the stream. This should run for hours, so only ~50 MB should be in use at any time. Does anyone have an idea how to record video as a never-ending flow of data that allows me to access certain points and delete everything before those points?
I know this question is pretty general, but I find it very hard to access the Android camera close to the hardware.
You'll probably run into file size limitations if nothing else.
A better approach would be to just keep recording 30-second videos, and delete any that are more than two minutes old until the user presses the "record" button, at which time you start keeping them.
Then splice them together into one long video afterwards.
By the way, this will kill your battery. I assume you're equipped to deal with that.
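A hedged sketch of that 30-second-clip approach using MediaRecorder; the camera and preview wiring is omitted (marked in the code), and the ~50 MB target would depend entirely on your bitrate settings.

import android.media.MediaRecorder;
import java.io.File;

public class RollingClipRecorder {

    private static final long SEGMENT_MS = 30_000;   // one 30-second clip at a time
    private static final long KEEP_MS = 120_000;     // keep only the last two minutes

    private final File clipDir;
    private MediaRecorder recorder;

    public RollingClipRecorder(File clipDir) {
        this.clipDir = clipDir;
    }

    public void startNextClip() {
        File clip = new File(clipDir, "clip_" + System.currentTimeMillis() + ".mp4");
        recorder = new MediaRecorder();
        // ... camera, audio/video sources, encoders and output format go here ...
        recorder.setOutputFile(clip.getAbsolutePath());
        recorder.setMaxDuration((int) SEGMENT_MS);
        recorder.setOnInfoListener((mr, what, extra) -> {
            if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
                mr.stop();
                mr.release();
                pruneOldClips();
                startNextClip();   // roll over to the next 30-second clip
            }
        });
        try {
            recorder.prepare();
            recorder.start();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // Delete clips that finished writing before the keep window.
    private void pruneOldClips() {
        long cutoff = System.currentTimeMillis() - KEEP_MS;
        File[] clips = clipDir.listFiles();
        if (clips == null) return;
        for (File f : clips) {
            if (f.lastModified() < cutoff) {
                f.delete();
            }
        }
    }
}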

Android app for Morse code using camera flash

I am trying to develop an app that transmits Morse code using the camera flash on the phone. My transmitting part works fine: I turn the flash on for a DOT or DASH and off for a GAP, LETTER_GAP or WORD_GAP. DOT, DASH, GAP, LETTER_GAP and WORD_GAP each have a different duration for which the flash stays on or off.
I am having a difficult time figuring out how to decode this on the receiver side. I am using an OpenCV binary threshold to see whether there is a bright spot in the image or not. Based on the camera FPS I can calculate how many consecutive frames had the flash on or off, which determines dot/dash/gap. Here is an example.
Say from the transmitter phone I am sending "abc xyz" as the string. On the receiver phone I get this string:
.-#-.*..#-.*-. -.*.-#-.*--#--*.*. where,
"." - DOT
"-" - DASH
"*" - GAP
"#" - LETTER GAP
" " - WORD GAP
This string exactly represents "abc xyz". The problem is that I cannot think of a way for the receiver phone to know where to start looking for a new message and when to stop, as everything is sent using light signals. There is no sync between transmitter and receiver; I mean, there is no way for the receiver to identify a start or end signal, as I just process raw camera frames provided by OpenCV. Is there any way I can impose these? Or is there an alternative solution for detection/decoding?
Please let me know if I am not clear. Thank you!
Well, there can be multiple answers. First, you could ask for manual input on the receiver and analyse all frames for the first few seconds. Perhaps you could always monitor the signal and set a threshold on the light pattern strength. You could also add a re-sync sequence where the sender shines the light for exactly one second and then starts the transmission; that would be the handshake, and the rest the message.
Great work, and hopefully you make an app out of it.
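As a rough illustration of that handshake idea (the frame rate and run lengths are assumptions, not tested against the poster's setup): collect one on/off flag per camera frame, wait for an ON run of about one second before decoding, and treat a second long run as end-of-message.

import java.util.ArrayList;
import java.util.List;

public class MorseFrameDecoder {

    private static final int FPS = 30;
    private static final int PREAMBLE_FRAMES = FPS;   // ~1 s of continuous ON = handshake
    private static final int DOT_MAX_FRAMES = 6;      // assumption: ON runs up to 6 frames are dots

    // Converts per-frame flash states into dots/dashes once a preamble is seen.
    public String decode(boolean[] framesOn) {
        List<int[]> runs = runLengths(framesOn);       // each entry: [state (1 = on), length]
        StringBuilder out = new StringBuilder();
        boolean started = false;
        for (int[] run : runs) {
            boolean on = run[0] == 1;
            int len = run[1];
            if (!started) {
                if (on && len >= PREAMBLE_FRAMES) started = true;   // handshake found
                continue;
            }
            if (on && len >= PREAMBLE_FRAMES) break;                // second long flash = end
            if (on) out.append(len <= DOT_MAX_FRAMES ? '.' : '-');
            // OFF runs would be classified the same way into *, # or word gaps.
        }
        return out.toString();
    }

    // Collapses the frame-by-frame booleans into runs of identical states.
    private List<int[]> runLengths(boolean[] frames) {
        List<int[]> runs = new ArrayList<>();
        int i = 0;
        while (i < frames.length) {
            int j = i;
            while (j < frames.length && frames[j] == frames[i]) j++;
            runs.add(new int[] { frames[i] ? 1 : 0, j - i });
            i = j;
        }
        return runs;
    }
}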
Check out the approach Shivam Kalra used here: http://www.codeproject.com/Articles/46174/Computer-Vision-Decoding-a-Morse-Code-Flashing-LED
tl;dr: allow the user to set a coordinate in the picture frame and monitor the brightness of the pixel(s) underneath that coordinate.

ActionScript 3: How can I keep an accurate BPM counter?

I'm looking to create a drum machine in ActionScript 3 (as an Adobe AIR Android app) which will keep to a user-defined tempo (BPM).
I am struggling to find a way to keep the project in time. At the moment I have made it so that 5 different sounds are represented in rows of 8 squares, and the user can click each square to choose when that sound plays (hope this makes sense).
At the moment I am using a Timer to keep the project in time, which is very laggy and inconsistent.
Using a Timer is a bad idea for this, there I said it...
The issue is that the timer drifts and fires several milliseconds late.
Try a simple test where you have a timer that executes every 500 ms, and compare it against the getTimer() count. What I have found in my experiments is that the timer is continually off, and it looks like it doesn't self-correct. I've tried using a self-correcting timer that changes the firing time based on the getTimer() difference since the last run, but it's still not reliable, and any time your processor's load picks up, the timer will be off anyway.
The correct way of dealing with this is to use byteArray data as the source for the sound. Based on the sampling-rate calculation you can populate the stream with data in advance, and the sound will play on time, pretty much guaranteed. I haven't gone as far as creating something that does this myself, but there are several libraries you can utilize that can help you with this (a rough sketch of the idea follows this answer).
My top two recommended libraries are SiON and tonfall.
You can see a sample of SiON here: http://wonderfl.net/c/qf4b
and a tonfall example at http://tonematrix.audiotool.com/
While I haven't tried them on Android, I think either should work.
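The question is ActionScript/AIR, but since the rest of this page is Android/Java, here is the same sample-clock idea sketched with AudioTrack instead of a byteArray and sample-data events: every hit is written into the PCM buffer at an exact sample offset computed from the BPM, so nothing depends on a timer firing on time. Treat it purely as an illustration of the approach, not a drop-in replacement.

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class SampleClockSequencer {

    private static final int SAMPLE_RATE = 44100;

    // Renders a row of 16th-note steps at the given BPM; true = play a click on that step.
    public short[] render(boolean[] steps, float bpm) {
        int samplesPerStep = (int) (SAMPLE_RATE * 60f / bpm / 4f);   // 16th notes
        short[] out = new short[steps.length * samplesPerStep];
        for (int s = 0; s < steps.length; s++) {
            if (!steps[s]) continue;
            int start = s * samplesPerStep;                          // exact sample offset
            for (int i = 0; i < 2000 && start + i < out.length; i++) {
                // Short decaying click as a stand-in for a real drum sample.
                out[start + i] = (short) (Math.sin(i * 0.3) * (2000 - i) * 8);
            }
        }
        return out;
    }

    public void play(short[] pcm) {
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                pcm.length * 2, AudioTrack.MODE_STATIC);
        track.write(pcm, 0, pcm.length);
        track.play();
    }
}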
