When partaking in an activity as routine and essential as buying groceries, it is easy to overlook the labor-intensive task of bringing those groceries inside. For some, such as the injured, elderly, and disabled members of our society, this minor burden becomes a monumental, and in some cases impossible, task. Household staples such as a gallon of milk, which clocks in at approximately 8.6 lbs, or bags of rice, which can exceed 50 lbs, turn what should be a simple trip to the grocery store into a process that is expensive and cumbersome at best and impossible at worst. Simply put, our society does not have adequate accommodations for something as essential as grocery shopping. While there are ample services and employees within grocery stores to help address this issue, that help stops at the store and offers nothing to someone suffering from arthritis, for example. With over 25% of senior citizens aged 60 and older living alone in the United States, this problem is only becoming more pressing. With these issues in mind, we created MOVE, an autonomous assistive robot for grocery unloading, designed to assist the injured, elderly, and disabled members of our society.

What it does

MOVE is an autonomous grocery-carrying robot designed specifically to help elderly, disabled, and injured people with the essential task of bringing in groceries. The robot interfaces with a mobile application, which the user opens to call the robot to pick up the groceries. Setup occurs in the mobile app, where parameters describing the path the robot must take to the trunk of the car are sent to the robot. The user can also prompt the robot to unload the groceries, again through the mobile app. On the hardware side, MOVE implements a spinning storage plate that holds bags in transport, increasing efficiency by allowing multiple grocery bags to be delivered in a single trip. The plate's rotation also allows bags to be dropped off without unnecessary movement of the robot: when MOVE is ready to unload, it keeps the claw stationary and rotates the storage plate to give the claw access to each grocery bag. Following the same theme, the plate can also carry other items that need to be transported.

How we built it/How it works

The mobile application is written in Swift, using SwiftUI. The app communicates with an ESP8266 NodeMCU module over WiFi, specifically through a web server: the iOS app makes GET and POST requests to a Flask web server, and the robot does the same. For instance, when the user submits coordinates in the app, the app makes a POST request to the server; the robot, via the ESP module, then makes a GET request to fetch those values, parses the JSON, and processes the data. The advantage of this design is that it works over WiFi, so the solution does not depend on factors such as distance from the robot, which would be a critical limitation of technologies such as Bluetooth. The flow of data concludes with the ESP passing the data to the VEX V5 brain via serial communication. The brain uses RS485 whereas the ESP requires UART, so an integrated circuit, the TI SN65HVD178 transceiver, bridges the two microcontrollers. It provides half-duplex communication in both directions through a control wire that switches between receive and transmit with LOW and HIGH respectively.
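The relay described above can be sketched as a minimal Flask server; the route name `/coords` and the JSON shape here are illustrative assumptions, not the project's actual endpoints:

```python
# A minimal sketch of the relay server, assuming Flask and an illustrative
# "/coords" route (the project's actual endpoints may differ).
from flask import Flask, jsonify, request

app = Flask(__name__)

# Latest coordinates POSTed by the iOS app, awaiting pickup by the robot.
latest_coords = {}

@app.route("/coords", methods=["POST"])
def post_coords():
    # The app POSTs JSON such as {"x": 3.5, "y": 1.2}.
    global latest_coords
    latest_coords = request.get_json(force=True)
    return jsonify(status="ok")

@app.route("/coords", methods=["GET"])
def get_coords():
    # The ESP8266 polls this endpoint and forwards the JSON over serial
    # to the V5 brain.
    return jsonify(latest_coords)

# app.run(host="0.0.0.0", port=5000)  # serve on the local network
```

Because both sides talk only to the server, the phone and the robot never need a direct connection, which is what removes the range limitation of a Bluetooth link.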

The chassis of the robot was designed and constructed with four omni wheels in a standard tank-drive configuration, each wheel directly driven by a 200 RPM brushed DC motor. The skeleton of the chassis is built from aluminum C-channel bars. Moving up the robot, we reach the turntable: a 0.08"-thick Lexan sheet cut to a circle one yard in diameter. It is secured to an 84-tooth gear driven by a 2.1 Nm DC motor through a 1:7 gear reduction, for a combined 14.7 Nm of torque to rotate heavier objects. The skeleton of this system is braced by vertical and triangular supports to better bear the weight. Next is the two-stage chain bar system with a unique arrangement of sprockets and chain. A single-stage chain bar keeps the two opposing ends parallel no matter how the arm moves; this is accomplished by locking the back sprocket's rotation so that only the front sprocket and chain move. Realizing the limits of a single stage, we brainstormed a two-stage system for the chain: the first stage moves (1:7 gear reduction) while keeping the second stage oriented, and the second stage (also a 1:7 gear reduction) can move without changing the orientation of the claw, allowing for a variety of complex V-shaped motions and increased maneuverability.
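The turntable gearing above works out as a simple torque multiplication; this quick check assumes the 2.1 Nm motor figure and ignores frictional losses in the gear train:

```python
# Quick check of the turntable gearing (assumes the 2.1 Nm motor figure
# and ignores frictional losses in the gear train).
motor_torque_nm = 2.1   # turntable DC motor torque
gear_reduction = 7      # 1:7 reduction at the 84-tooth gear
output_torque_nm = motor_torque_nm * gear_reduction  # ~14.7 Nm
print(f"turntable torque: {output_torque_nm:.1f} Nm")
```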

Finally, the end sprockets are attached to a claw. This aluminum-framed claw is built from C-channels with a 1:5 gear reduction powered by a 100 RPM brushed DC motor, giving it enough torque to grab items securely and reliably. Lastly, we added sheets of anti-slip matting to the inner surfaces of the claw to reduce slippage and cushion the items being grabbed.

The robot code uses odometry combined with a pure pursuit controller to travel. A path is calculated as an array of coordinates that the robot moves along. Using encoders and an inertial measurement unit, the robot tracks its own heading and absolute position. The pure pursuit controller finds the next lookahead point on the path and, using the robot's absolute position from the odometry calculations, generates a circular arc connecting the two points. The robot can easily follow a circle because it uses a differential drive, i.e., it turns by powering the right and left sides differently. We express the difference in side speeds as a function of the arc radius: the pure pursuit algorithm takes in R and calculates the difference in motor speeds. This constitutes the feedforward loop responding to feedback from the odometry calculations. The algorithm then advances to the next reference point. A 2D motion profiler runs on top of this motion to produce a clean acceleration curve, minimizing error and keeping jerk and overshoot as low as possible in each motion to the next reference point.
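One step of the feedforward loop above can be sketched as follows. This is a Python sketch of the standard pure pursuit geometry, assuming a planar pose from odometry and a known lookahead point; the function and parameter names are ours, not those of the robot's actual on-brain code:

```python
import math

def pure_pursuit_step(x, y, theta, look_x, look_y, track_width, v):
    """Compute (v_left, v_right) for a differential drive, given the
    robot pose (x, y, theta) from odometry and a lookahead point."""
    # Express the lookahead point in the robot's frame.
    dx, dy = look_x - x, look_y - y
    local_x = math.cos(theta) * dx + math.sin(theta) * dy   # ahead of robot
    local_y = -math.sin(theta) * dx + math.cos(theta) * dy  # left of robot
    dist_sq = local_x ** 2 + local_y ** 2
    # Curvature of the circular arc through the robot and the lookahead
    # point: kappa = 2 * lateral_offset / L^2 (and radius R = 1 / kappa).
    kappa = 0.0 if dist_sq == 0 else 2.0 * local_y / dist_sq
    # Differential drive: the sides differ in speed by v * kappa * track.
    v_left = v * (1.0 - kappa * track_width / 2.0)
    v_right = v * (1.0 + kappa * track_width / 2.0)
    return v_left, v_right
```

A lookahead point straight ahead yields equal wheel speeds, while a point to the robot's left speeds up the right side, curving the robot onto the connecting arc.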

Accomplishments we’re proud of

We are proud of our unique adaptive robot algorithms, which combine feedback and feedforward control so the robot takes inputs from the physical world and reacts and calculates its motions accordingly. We are also proud of the wireless communication we developed in a short amount of time, as this was our team's first time working with WiFi modules, and of the app that communicates with the robot over WiFi. We are especially proud of the user interface, which is designed to be both aesthetically pleasing and intuitive; conceiving and visualizing the design with tools like Figma took a significant amount of effort, so we are extremely happy with how the UI turned out. We also take much pride in our unique mechanics. The two-stage chain bar gives the claw reliable, smooth, precise movement along its path, and this was the first time our team built such a mechanism or anything remotely similar. This robotic manipulator combines multiple basic and advanced base mechanisms to widen its range of motion.

Challenges we ran into

COVID-19 acted as a significant barrier when working on this project. MOVE entails a significant hardware component, and the physical nature of the project made it more difficult for us to collaborate effectively. We got around this hurdle through a strict division of roles, but this solution inadvertently created other issues, such as the difficulty of ensuring that every part of the project could communicate with the others so that a stable workflow could be established. We also faced issues with execution, simply because we were developing our own algorithms for the robot's motions and had to search through numerous articles to learn more advanced algorithms that could accomplish the motions under physical factors such as momentum and friction.

What we learned

Mechanically, we learned and developed a new method of maintaining the orientation of the end of an arm by purely physical means, even as the arm bends into a V shape or straightens out.

On the software and electronics side, we learned how to program and wire a WiFi board, the ESP8266, and how to integrate two very different MCUs. Lastly, we learned how to use image processing to detect the bags.

Team-wise, with the challenge of social distancing, we had to learn to debug more efficiently, especially on a project that so heavily integrates software with hardware and mechanics. We learned to communicate problems and solutions faster by keeping a schedule and setting strict times to call and check where each subsystem stood in its development.

What’s next for MOVE

On the mobile app end, as more phones gain LiDAR sensors that enable more accurate AR, the user could map out the area where the robot will operate and select an end point with their finger rather than manually entering coordinates. A key advantage of how we designed this system, however, is its extensibility. With only a few changes to a few parts of the system, we can effectively address other common heavy-object-lifting problems; applications include box stacking or airline baggage handling to prevent back injuries. These applications would require a more robust motor.

Built with

C++, Swift (SwiftUI), C (Arduino), VEX Robotics hardware, Python (Flask)
