Project Story: The Journey of AURA

The Inspiration: Solving a Uniquely African Problem

In Africa, it is not difficult to see the vast potential of technology in sectors like agriculture, smart infrastructure, and telecommunications. It is equally easy to see the fundamental constraint holding it back: power. I was inspired by this paradox. The future of a data-driven Africa, with smart farms and highly connected infrastructure, felt hollow and impossible as long as the sensors powering it were crippled by Africa's power deficit, or could not last more than a few months because they run on batteries.

My inspiration emerged from a neuroscientific principle called Hebbian learning, often summarized as "neurons that fire together wire together." However, I applied a twist to the traditional neural network paradigm. Instead of following the conventional model, in which neurons receive inputs, I treated the sensors themselves as the neurons, each with different values (readings) at different times. We normalize the values of these sensors between 0 and 1 for comparison. When they co-move according to our mathematical metric, we say they fire together and thus wire together. Since they are co-moving, the system identifies this as redundancy, and this brain-inspired approach became the foundation for AURA.

The Journey: From a Formula to a Living System

The primary challenge then was to create a metric that would measure this redundancy. The core problem was to design a mathematical formula with several critical properties: for any number of sensors n, or even a single pair, it should return 0 when there was no co-movement (diverse readings), 1 at maximum co-movement, and an intermediate value otherwise; it had to be symmetric in its inputs; and it had to be flexible enough to compare any number of sensors while retaining all of these properties, not just pairwise comparisons.

After much experimentation, I developed a new, generalized mathematical formula that met all these criteria, which I call the AURA Index (A):

$$ A = \frac{\sum_{i=1}^{n} \sin^2 \left( \frac{\pi \cdot s_i}{\sum_{j=1}^{n} s_j} \right)}{n \cdot \sin^2 \left( \frac{\pi}{n} \right)} $$

This formula became the "brain" of the entire project. The next step was to build a body for it.
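As a sanity check, the index can be implemented directly from the formula in a few lines of NumPy. This is a minimal sketch of my own (the function name `aura_index` is illustrative, not taken from the project code), verifying the three design properties described above:

```python
import numpy as np

def aura_index(readings):
    """AURA index A over n normalized sensor readings, per the formula above."""
    s = np.asarray(readings, dtype=float)
    n = s.size
    total = s.sum()
    if total == 0.0:            # degenerate case: no signal at all
        return 0.0
    num = (np.sin(np.pi * s / total) ** 2).sum()
    den = n * np.sin(np.pi / n) ** 2
    return float(num / den)

print(aura_index([1.0, 1.0, 1.0]))                       # identical readings -> 1.0
print(aura_index([1.0, 0.0, 0.0]))                       # one dominant sensor -> ~0
print(aura_index([0.2, 0.8]) == aura_index([0.8, 0.2]))  # symmetric -> True
```

Note how the denominator n·sin²(π/n) is exactly the numerator evaluated at equal readings, which is what pins the maximum-co-movement case to 1.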

How I Built It

The project evolved through a series of logical stages, moving from pure simulation to a complete, physical prototype:

  1. The Simulation Core: I started by creating a high-performance engine in Python. I used Numba, a just-in-time compiler for Python, to compile the core logic to machine code for maximum speed. This let me rapidly test AURA's performance on large datasets.
  2. Adding Intelligence (The Learner): To disable a sensor, the AURA index involving that sensor at that point must exceed a certain threshold, and a deactivation period must be specified for it. The algorithm is supposed to determine these two parameters, the threshold and the deactivation period, intelligently. I integrated differential evolution, running periodically in the background, to autonomously discover the optimal parameters. Its objective is to maximize data fidelity (accuracy) while minimizing power usage.
  3. Bringing it to Life (The GUI): To give life to the system, I built a full-stack application.
    • Backend: The backend was implemented in FastAPI because of its high performance and modern asynchronous design. Moreover, using a Python framework gives seamless integration with the algorithm logic. The backend runs separate threads, one for the core simulation and one for the learner, which keeps the system responsive at all times.
    • Frontend: For the frontend I used Next.js and Three.js to create an immersive, interactive visualization of the sensor environment. It consists of a real-time digital twin of the sensor network and a dashboard, which lets anyone immediately understand what the system was and is currently doing.
  4. Making it Real (The Hardware): The final objective was to simulate a real-world scenario. I wired up LED lights to represent sensor nodes, mirroring the state of the 3D visualization and the spatial orientation of the sensors in real time. I chose an Arduino Mega because of its large number of pins. The Arduino Mega connects to the FastAPI server, which sends it the live sensor statuses and commands.
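Steps 1 and 2 can be sketched together: a Numba-compiled index core and a differential-evolution learner (here via SciPy's `differential_evolution`) that tunes the threshold and deactivation period on synthetic data. The toy data, variable names, and the 0.1 power-vs-fidelity weight are my own illustrative assumptions, not the project's actual code:

```python
import numpy as np
from scipy.optimize import differential_evolution

try:
    from numba import njit          # compiles the hot loop to machine code, as in step 1
except ImportError:                 # graceful fallback if Numba is unavailable
    njit = lambda f: f

@njit
def aura_index(s):
    total = s.sum()
    if total <= 0.0:
        return 0.0
    n = s.size
    return (np.sin(np.pi * s / total) ** 2).sum() / (n * np.sin(np.pi / n) ** 2)

# Toy data: four sensors tracking one underlying signal (i.e. highly redundant).
rng = np.random.default_rng(0)
T, N = 400, 4
base = 0.5 + 0.4 * np.sin(np.linspace(0, 8 * np.pi, T))
readings = np.clip(base[:, None] + rng.normal(0, 0.02, (T, N)), 0, 1)

def cost(params):
    """Loss = reconstruction error + weighted power use for (threshold, sleep_steps)."""
    threshold, sleep_steps = params[0], int(round(params[1]))
    wake = np.zeros(N)              # timestep at which each sensor may sample again
    last = readings[0].copy()       # last known reading per sensor (held while asleep)
    err, samples = 0.0, 0
    for t in range(T):
        active = wake <= t
        last[active] = readings[t, active]
        err += ((last - readings[t]) ** 2).sum()
        samples += active.sum()
        if active.sum() > 1 and aura_index(readings[t, active]) > threshold:
            wake[np.flatnonzero(active)[-1]] = t + sleep_steps  # sleep a redundant node
    fidelity_loss = err / (T * N)
    power = samples / (T * N)
    return fidelity_loss + 0.1 * power   # trade-off weight is an arbitrary choice

result = differential_evolution(cost, bounds=[(0.5, 1.0), (1, 40)], seed=1, maxiter=10)
threshold, sleep_steps = result.x
```

The learner needs no gradients, which is why a population-based optimizer like differential evolution suits this kind of simulation-in-the-loop objective.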
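The backend's two-thread layout from step 3 boils down to the standard shared-state pattern below. This sketch uses only stdlib threading, with placeholder loop bodies of my own; in the real system a FastAPI endpoint would read the same lock-protected state:

```python
import threading
import time

state = {"tick": 0, "params": {"threshold": 0.9, "sleep_steps": 10}}
lock = threading.Lock()
stop = threading.Event()

def simulation_loop():
    """Core simulation thread: advances the sensor world on every tick."""
    while not stop.is_set():
        with lock:
            state["tick"] += 1          # placeholder for one simulation step
        time.sleep(0.01)

def learner_loop():
    """Learner thread: periodically refreshes parameters in the background."""
    while not stop.is_set():
        with lock:
            state["params"]["threshold"] = min(0.99, state["params"]["threshold"])
        time.sleep(0.05)

threads = [threading.Thread(target=simulation_loop, daemon=True),
           threading.Thread(target=learner_loop, daemon=True)]
for th in threads:
    th.start()
time.sleep(0.2)                          # let both threads run briefly
stop.set()
for th in threads:
    th.join()
with lock:
    print(state["tick"] > 0)             # -> True: simulation advanced while learner ran
```

Because the heavy work lives in background threads, request handlers only ever take the lock briefly to read a snapshot, which is what keeps the system responsive.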
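For the hardware link in step 4, one plausible wire format is a newline-terminated CSV of LED states pushed over USB serial. The encoding function, port name, and baud rate below are my own illustrative assumptions (the text does not describe the actual protocol), so the pyserial write is left commented out:

```python
def encode_states(states):
    """Encode per-sensor on/off states as a newline-terminated CSV line."""
    return (",".join("1" if on else "0" for on in states) + "\n").encode("ascii")

print(encode_states([True, False, True]))   # -> b'1,0,1\n'

# On the real rig (assumes pyserial and an Arduino Mega on this hypothetical port):
# import serial
# with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as ser:
#     ser.write(encode_states([True, False, True]))
```

On the Arduino side, a matching sketch would read a line, split on commas, and drive one digital pin per field, which is where the Mega's large pin count pays off.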

The Challenges and What I Learned

This project was a journey of constant learning and problem-solving. The biggest challenges were not in the algorithm itself, but in the integration of these complex, disparate systems:

  • AURA index formula design: The most complex problem was designing a mathematical formula that possessed all of the desired properties. This compelled me to dig into advanced algebra and evaluate multiple mathematical functions.
  • Concept building: Finding a suitable parallel to build the solution on, neuroscientific Hebbian learning, and then applying a twist to the neural network paradigm required a level of scientific and engineering insight that made the problem feel insurmountable at times.
  • The hardware hurdle: Since my background emphasizes algorithm design, multidisciplinary innovation, and software engineering, my initial attempts to work with hardware and microcontrollers were met with cryptic errors and burnt components. This was deeply frustrating, but the failures were invaluable lessons in hardware debugging and in handling hardware cautiously. I also learned how valuable AI is for debugging, for instance when the fix was simply to install specific drivers so the hardware could communicate with the computer.
  • The Frontend Crash: When I tried to integrate the Three.js 3D scene that simulates the sensor environment, I constantly hit white screens and multiple errors. I eventually realized that my framework of choice, Next.js, renders on the server by default, while Three.js is a client-side library and must be handled carefully. With that insight, I made the necessary adjustments and everything worked.
  • The "Honest" Metric: On my first attempt, the results seemed too low (~3% power saving). I was skeptical, since I was confident in my theoretical expectations. I scrutinized everything and found a logical flaw in the implementation. I learned two lessons from this. First, you should understand the fundamental and theoretical foundations of your solution well enough to recognize semantic issues when they arise. Second, a critical lesson in data science: the way you measure success is as important as the algorithm itself, and you should never assume the algorithm is working as expected.

Through these difficulties, I realized that a deep-tech project is not just about a single brilliant idea. It is made complete, and more beautiful, through the iterative process of building, testing, breaking, and fixing an entire system from the ground up, and the final AURA solution is a testament to that journey.
