I read a paper on an app named FingerIO that uses active sonar and two microphones to trilaterate (distance-based triangulation) the position of a moving hand. I thought that if the source actively identified itself, you could take it a step further.
What it does
It tracks, in 3D space, a phone emitting a designed series of chirps 13 times per second. These chirps are inaudible to humans.
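A minimal sketch of generating such a chirp train with scipy. The writeup doesn't give the actual frequency band, so the 18-20 kHz near-ultrasonic range here is an assumption:

```python
import numpy as np
from scipy.signal import chirp

FS = 96_000              # sample rate (Hz), matching the mic input rate used later
REP_RATE = 13            # chirps per second
F0, F1 = 18_000, 20_000  # assumed near-ultrasonic band; actual band not given

def make_chirp_train(duration_s=1.0, chirp_len_s=0.01):
    """Return duration_s seconds of audio containing REP_RATE chirps per second."""
    period = int(FS / REP_RATE)       # samples between chirp starts
    n_chirp = int(FS * chirp_len_s)
    t = np.arange(n_chirp) / FS
    pulse = chirp(t, f0=F0, f1=F1, t1=chirp_len_s, method='linear')
    pulse *= np.hanning(n_chirp)      # taper the edges to avoid audible clicks
    out = np.zeros(int(FS * duration_s))
    for start in range(0, len(out) - n_chirp, period):
        out[start:start + n_chirp] = pulse
    return out

train = make_chirp_train()            # one second: 13 windowed chirps
```

A linear (frequency-modulated) chirp is a common sonar choice because its autocorrelation has a single sharp peak, which matters for the cross-correlation step described below.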
How we built it
We used three laptops and an iPhone. We entered the coordinates of the laptops and the phone's starting position, then began playing the chirps at regular intervals. From each chirp's arrival time we calculated how far the phone was from each laptop, trilaterated its position, and plotted it in 3D with matplotlib.
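With only three receivers, the three range spheres intersect in two mirror-image points, which is why the starting position matters: it picks the physical one. A sketch of that trilateration step, with hypothetical laptop coordinates (the real ones were measured by hand):

```python
import numpy as np

def trilaterate(anchors, dists, prev):
    """Position from three anchor points, measured ranges, and a previous estimate.

    Subtracting the first sphere equation from the others linearises the system:
        2(p_i - p_0) . x = d_0^2 - d_i^2 + |p_i|^2 - |p_0|^2
    Three spheres leave a line of candidates; intersecting it with the first
    sphere gives two mirror points, and the previous position (initially the
    hand-entered start) disambiguates.
    """
    p0, d0 = anchors[0], dists[0]
    A = 2.0 * (anchors[1:] - p0)                      # 2x3 linear system
    b = (d0**2 - dists[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(p0**2))
    xp, *_ = np.linalg.lstsq(A, b, rcond=None)        # one point on the solution line
    n = np.cross(A[0], A[1])                          # line direction (null space)
    n /= np.linalg.norm(n)
    diff = xp - p0                                    # intersect line with sphere 0
    roots = np.roots([1.0, 2 * np.dot(diff, n), np.dot(diff, diff) - d0**2])
    candidates = [xp + t.real * n for t in roots]
    return min(candidates, key=lambda c: np.linalg.norm(c - prev))

# Hypothetical laptop positions in metres:
laptops = np.array([[0.0, 0.0, 0.0],
                    [2.0, 0.0, 0.0],
                    [0.0, 2.0, 1.0]])
```

Feeding each fix back in as `prev` for the next chirp keeps the tracker on the correct branch as the phone moves.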
Challenges we ran into
The clock speed of each computer is slightly different. Because sound travels at roughly 340 meters per second, a drift of even a fraction of a millisecond would make tracking impossible. We ended up hard-coding a 0.0000044-second adjustment to the chirp period to compensate.
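The arithmetic shows why such a tiny correction matters, since without it the error grows linearly with every chirp:

```python
SPEED_OF_SOUND = 340.0   # m/s
DRIFT_PER_CHIRP = 4.4e-6 # s, the hard-coded per-chirp period correction

# Range error contributed by one chirp period of uncorrected drift:
err_per_chirp = DRIFT_PER_CHIRP * SPEED_OF_SOUND   # ~1.5 mm

# Left uncorrected, it accumulates: after one minute of tracking
# (13 chirps/s * 60 s) the apparent range has slid by roughly a metre:
err_per_minute = err_per_chirp * 13 * 60           # ~1.2 m
```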
Accomplishments that we're proud of
That it actually worked! Also that we overcame so many obstacles to make something that has never been made before.
What we learned
We learned a lot about how sonar systems are designed and how to cross-correlate noisy input signals with known reference signals. We also learned how to use many parts of SciPy, like Fourier transforms, frequency-modulated chirps, and efficient array operations.
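The cross-correlation step can be sketched in a few lines: correlating a noisy recording against the known chirp template produces a sharp peak at the chirp's arrival sample, and that arrival time is what gets converted to a range. The band and offsets here are illustrative, not the project's actual values:

```python
import numpy as np
from scipy.signal import chirp, correlate

FS = 96_000
t = np.arange(int(0.01 * FS)) / FS                  # 10 ms template
template = chirp(t, f0=18_000, f1=20_000, t1=0.01)  # assumed band

# Simulate half a second of noisy recording with one buried chirp:
rng = np.random.default_rng(0)
recording = 0.1 * rng.standard_normal(FS // 2)
true_offset = 12_345
recording[true_offset:true_offset + len(template)] += template

# Matched filtering: the correlation peak marks the arrival sample.
corr = correlate(recording, template, mode='valid')
est_offset = int(np.argmax(np.abs(corr)))
```

Because the chirp sweeps frequency, its correlation peak stays sharp even when the noise energy is comparable to the signal, which is what makes this workable on laptop microphones.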
What's next for Trimaran
I would like to use the complex portion of the Fourier transform to identify the phase offset and get distance readings more accurate than even the 96,000 Hz sample rate of our microphones allows. It would also be cool to add this to a VR headset like Google Glass, so you could move around in the VR space instead of just moving your head to look around.
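That proposed refinement could look something like the sketch below: the cross-power spectrum of a signal and a delayed copy has phase -2*pi*f*tau, so fitting that phase slope recovers a delay finer than one sample. This is my reading of the idea, not something the project implemented:

```python
import numpy as np

FS = 96_000

def fractional_delay_est(x, template):
    """Estimate the sub-sample delay of x relative to template.

    The cross-power spectrum X * conj(T) of a delayed pair has phase
    -2*pi*f*tau; a least-squares line fit over the occupied band gives
    tau in seconds, below the one-sample resolution of the recorder.
    """
    n = len(x)
    X = np.fft.rfft(x)
    T = np.fft.rfft(template, n)
    cross = X * np.conj(T)
    freqs = np.fft.rfftfreq(n, d=1 / FS)
    band = np.abs(cross) > 0.1 * np.abs(cross).max()  # keep only strong bins
    phase = np.unwrap(np.angle(cross[band]))
    slope = np.polyfit(freqs[band], phase, 1)[0]
    return -slope / (2 * np.pi)                       # delay in seconds
```

At 340 m/s, one sample at 96 kHz is about 3.5 mm of range, so even a modest sub-sample estimate would tighten the position fix noticeably.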