Have you ever left home and missed your family, your pet, and especially your room? Worry not, we feel you. That's where the MOVR Telebot comes in. In fact, we watched Modern Family (link) and found the idea inspiring. When you video-call your loved ones, MOVR lets you wander around the room remotely. Whether they're back home or on a beach in Mexico, MOVR makes long distance feel just a little bit closer.
What it does
Our MOVR Telebot project combines an Android application with an Intel Edison-based robot. During a video call, callers on both ends can drive the robot remotely through the application. Communication between the application and the microcontroller happens over Wi-Fi.
How we built it
The foundation of our project is the Intel Edison board, which we programmed in Node.js.
- Apart from the Intel Edison board, we used limited materials such as DC motors, a breadboard, popsicle sticks, and cardboard to build our vehicle. We used duct tape to solidify the structure, and on top of that, we used EV3 Lego wheels. The circuit simply uses the digital pins on the Edison to drive the motors in response to commands from the Android application, and the IoT capability of the microcontroller let us set up a Wi-Fi connection between the user and the driving system.
- Development started with setting up SSH access to the Intel Edison.
- Then, we set up a web server to control the robot from afar.
- We then ran LED-blinking tests from online tutorials to familiarize ourselves with the syntax.
- We then coded the motor control system in the web app, creating a simple user interface with control buttons (left, forward, right).
- Once we verified that the system worked, we started on the Android app, which controls the robot while simultaneously streaming video. Building on the open source Sinch SDK, we put the application together quickly, adding code so the control buttons send requests over sockets to the Node.js server. The Sinch SDK handled the video streaming itself.
Challenges we ran into
- The first challenge was setting up the Intel Edison with Node.js. We spent more time than we wanted getting the Edison configured. However, once this was completed, everything else fell into place. Implementing the server was straightforward. Our second obstacle was using sockets to send requests from the Android app: the Java client library was outdated compared to its Node.js counterpart, so the app could not connect to the server. After hours of research, we traced the root of the problem, downgraded the library version on the server, and followed older documentation to get the two sides talking.
Accomplishments that we're proud of
- Learning completely new technologies (Edison) and being able to integrate all of the different components into a seamless application.
What we learned
- We learned how to use new technologies, at least one of which was new to each member of the team. We learned to manage our time effectively and prioritize the core components of the system. Most importantly, we learned how different technologies can be unified to create a wonderful user experience.
What's next for MOVR Telebot
- As this prototype was built under strict time constraints (~24 hours), it is but a working model demonstrating one of the many possible applications of a robot with a video-guided control system.
- Features such as a 360° view (mounting the phone stand on a rotating servo to look around) could be added for better visibility.
- If built more rigidly, and with a way to handle electromagnetic interference, our simple interface could control robots navigating rough terrain (natural disaster areas, irradiated zones, war zones) to minimize the risk of human casualties.
- The possibilities are limited only by your creativity.