Our team loves playing with hardware, but we also wanted to learn web design and test the power of real-time servers. We are very fond of drones, and given the hardware available to us, we decided to build around a Parrot drone and Arduino sensors. After seeing the recent trial of drone surveillance systems at a UK airport, and with one of our team members working on a drone research project as an intern, we were fascinated by the possibilities and decided to implement our own drone-deployed surveillance package.

How we came up with the name

  1. We're using a drone called Parrot AR.Drone 2.0
  2. Parrots are a type of bird, and so are eagles
  3. Surveillance systems are like an artificial eye: they are always watching
  4. Eagles are famous for having extraordinarily good vision, a.k.a. surveillance

(Funny discovery: there's a thriller film called Eagle Eye in which two people are controlled by technology and put in increasing danger. Surveillance systems typically symbolize highly oppressive governments, and drones have lately become a very popular technology. Obviously, our system was built to provide a safety mechanism for the general population, such as monitoring our homes while we're away)

How It Works

An Arduino board is connected to a motion sensor, an accelerometer, and a gyroscope. Each sensor was wired up and programmed to return readings continuously. For example, the motion sensor returns 0 while it detects no movement, 1 as soon as it detects movement, and 2 while it is rebooting. These readings are written to a text file, data.txt, via a serial port. The web server (which also happens to have a very attractive UI) automatically fetches that generated text file every few seconds and updates the displayed HTML with the new data values.
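The sensor-to-file step can be sketched roughly as below. This is an illustrative Python stand-in (the real project reads the serial port with a Processing sketch); the function names and file format here are assumptions, not our actual code.

```python
# Hypothetical sketch: interpret a motion-sensor code and append it to
# data.txt, the file the web page polls. The serial read itself is
# simulated -- only the interpret/log logic is shown.
import time

MOTION_STATES = {0: "no movement", 1: "movement detected", 2: "rebooting"}

def interpret_motion(code: int) -> str:
    """Map the motion sensor's raw 0/1/2 reading to a readable state."""
    return MOTION_STATES.get(code, "unknown")

def log_reading(path: str, code: int) -> str:
    """Append one timestamped reading to the data file; return the line."""
    line = f"{time.strftime('%H:%M:%S')} motion={code} ({interpret_motion(code)})"
    with open(path, "a") as f:
        f.write(line + "\n")
    return line
```

Each call appends one line, so the web page always sees the full history when it re-fetches the file.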

The web server hosts information about the project and was coded in HTML, CSS, and some JavaScript. Design features include a background that locks in place while scrolling, multiple pages, and creative fonts imported from the Google Fonts API.

The final aspect of the project is an elaborate speech-to-text system that first transforms audio into text, then analyzes the contents of the original audio file to extract key information such as important persons, significant locations, and political affiliations. The bulk of the logic lives in a Windows PowerShell script, which has a Python script embedded within it. Mozilla's Project DeepSpeech, built on Google's TensorFlow, converts the audio files to text. The PowerShell script then exports this transcript to a text file, from which the Python script reads the text and feeds it to the Indico machine-learning service. From this point, several analyses are run on the data.
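The Python half of that hand-off can be sketched as follows. Note the hedge: the real analysis is done by Indico's API, which needs a key and a network call, so this sketch swaps it for a naive capitalized-word heuristic purely so the glue logic runs stand-alone. All names here are illustrative.

```python
# Sketch of the post-transcription step: PowerShell writes the DeepSpeech
# transcript to a text file; Python reads it and runs analyses on it.
# The Indico call is replaced by a crude placeholder heuristic.
import re

def read_transcript(path: str) -> str:
    """Read the transcript file that the PowerShell script wrote out."""
    with open(path) as f:
        return f.read().strip()

def extract_candidates(text: str) -> list:
    """Placeholder for Indico's entity analysis: collect capitalized
    words that don't begin a sentence (candidate names and places)."""
    candidates = set()
    for sentence in re.split(r"[.!?]+\s*", text):
        words = sentence.split()
        for w in words[1:]:          # skip sentence-initial capitals
            if w and w[0].isupper() and w.isalpha():
                candidates.add(w)
    return sorted(candidates)
```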

Challenges we Ran Into

We were initially going to attach a sound sensor to retrieve and upload sound files; however, lacking op-amps capable of amplifying the output voltage with a gain over 1000, this could not be implemented. The microphone hardware for the Arduino was also broken, so it could not be interfaced directly with the Arduino. As a substitute, audio files can be created with Audacity; the PowerShell script is pointed directly at the save location and detects the most recently added file. This serves as a proof of concept for EagleEye's audial-espionage component.
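The "most recently added file" check the PowerShell script performs can be sketched like this (shown in Python for consistency with our other scripts; the folder layout and `*.wav` pattern are assumptions):

```python
# Sketch: find the newest audio file in the Audacity save folder, the way
# the PowerShell script does before handing it to DeepSpeech.
from pathlib import Path

def newest_file(folder: str, pattern: str = "*.wav"):
    """Return the most recently modified file matching pattern, or None."""
    files = list(Path(folder).glob(pattern))
    if not files:
        return None
    return max(files, key=lambda p: p.stat().st_mtime)
```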

Another major challenge was our lack of experience with JavaScript and jQuery. It left us struggling to find a way to upload data to the web server in real time whenever new readings were available from the Arduino. We worked through many tutorials using HTML5 and Ajax without success. We also tried Google Firebase, but there were too many JSON structures and unfamiliar concepts. Thanks to one of the mentors, though, who guided us in using jQuery and Ajax, we were able to upload our retrieved data to the server almost instantaneously!
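The key idea the mentor showed us is poll-and-update: re-fetch data.txt on a timer and only redraw when it has actually changed. The site does this with jQuery's $.ajax; the same idea is sketched here in Python so it can run stand-alone (names are illustrative, not our page's code):

```python
# Sketch of the poll-and-update loop body: re-read the data file only when
# its modification time has advanced past the last one we saw.
import os

def poll(path: str, last_mtime: float):
    """Return (changed, contents, mtime). Skips the read if unchanged."""
    mtime = os.path.getmtime(path)
    if mtime <= last_mtime:
        return False, None, last_mtime
    with open(path) as f:
        return True, f.read(), mtime
```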

We also registered for a domain quite early (see attachment below) but never got a response. :( So our web server only runs locally, but it still has full functionality.

What we Learned

The scope of this project prompted all of us to explore new technologies and search for solutions we had never considered when working on projects of this scale. Learning to use version control and to write front-end code were critical skills we would not have obtained otherwise.

One thing we learned was a method of serial communication between the Arduino and the computer. An ultimate goal of the project was to link the data obtained from the payload back to the web service so that information could be displayed and updated. Since we did not have an SD card to adequately store data on the Arduino, nor an Ethernet shield to help us interface with the web, we found a solution in the open-source Processing language. Using Processing, we were able to communicate with the Arduino and write serial data into a text file. From there, we learned the JSON data format to properly structure this data so it could be passed from the file to the web service.

Plenty of research and development went into our speech-to-text solution. As there are very few open-source projects pertaining to the idea, we researched DeepSpeech, based on TensorFlow, to attempt an adequate implementation. Finally, the core of EagleEye is its radio communication between two remote Arduinos, which transmits data from a remote location back to the Arduino connected to the user's computer.
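The JSON step above is simple but was new to us: each set of readings becomes one structured object before it reaches the page. A rough sketch, in Python for illustration (the real formatting happens in the Processing sketch, and the field names here are assumptions):

```python
# Sketch: serialize one set of sensor readings as a JSON object so the
# web service can parse and display it.
import json

def reading_to_json(motion: int, accel: tuple, gyro: tuple) -> str:
    """Pack motion code plus 3-axis accel/gyro readings into JSON."""
    return json.dumps({
        "motion": motion,
        "accel": {"x": accel[0], "y": accel[1], "z": accel[2]},
        "gyro": {"x": gyro[0], "y": gyro[1], "z": gyro[2]},
    })
```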


EagleEye was purposely designed to be modular and adaptable to any application. With more time and resources, additional sensors and components could easily be added to the payload. A location-tracking scenario could use a GPS shield, a high-capacity battery, and an extended-range antenna. Live video and audio could be achieved with a camera and microphone. A servo-deployed parachute could even be considered for release from high altitudes. The radio transceivers we used support multiple channels, opening the possibility of multiple EagleEyes being deployed from a single drone and forming a radio network, greatly increasing their coverage.
