The project is a spin-off of the class "Machines that Make [almost] Anything" offered at MIT every three years. A current research thread there is the use of data flow to design, control, and integrate data into machines (the best-known example being Rhino's Grasshopper). This project tackles data integration and processing through AR: we developed a full AR interface, a set of image protocols, and a collection of devices (microcontrollers with sensors, mechanical parts, outputs, etc.) to show that data flow in large IoT systems is not only manageable, but also opens up a multitude of data-processing techniques and data paths.
How we built it
We were able to quickly set up a full set of IoT devices using Photons (see Particle). We created five modules with different functions, listed below:
- light-out: an LED that outputs light according to an intensity input
- light-in: a photoresistor that senses ambient brightness
- multi-light-out: LED strip that takes two parameters (intensity and hue) and outputs light accordingly
- poten-in: a potentiometer
- motor-out: a stepper motor that rotates according to an angular-speed input
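To illustrate the kind of data flow these modules enable, here is a minimal sketch of routing one input module's reading to two output modules. The module names match the list above, but the value ranges, parameter names, and the `route_poten_reading` helper are illustrative assumptions, not the project's actual protocol.

```python
# Hypothetical data-flow step: scale a raw poten-in reading into the
# intensity/hue parameters consumed by multi-light-out and the angular
# speed for motor-out. Ranges below are assumptions (12-bit ADC input,
# 0-255 intensity, 0-360 hue, 0-2 rev/s motor speed).

def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max]."""
    ratio = (value - in_min) / (in_max - in_min)
    return out_min + ratio * (out_max - out_min)

def route_poten_reading(raw):
    """Fan a single poten-in reading out to two downstream modules."""
    return {
        "multi-light-out": {
            "intensity": round(scale(raw, 0, 4095, 0, 255)),  # strip brightness
            "hue": round(scale(raw, 0, 4095, 0, 360)),        # strip color
        },
        "motor-out": {
            "angular_speed": scale(raw, 0, 4095, 0.0, 2.0),   # rev/s
        },
    }

if __name__ == "__main__":
    print(route_poten_reading(4095))
```

In the real system this routing is what the AR interface lets you wire up visually: each edge in the data-flow graph is a mapping like the one above.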
Challenges we ran into and accomplishments that we're proud of
Most of the challenges came from learning and facing new APIs and frameworks for augmented reality, data flow and management, and IoT devices such as the Photon. We're impressed that we achieved so much in so little time (we didn't expect our prototypes to work so well!). There's definitely a lot to take away from this, from the basics of integrating the web with our maker skills to fast prototyping and improvising when time is short.