Inspiration

We combined our passion for computer engineering with the purpose of helping society. AI combined with robotics has enormous potential in future health-care applications, and the Internet of Things will start the next 'industrial' revolution: smart sensors and (humanoid) robots will be deployed in a variety of environments, including but not limited to clinics and retirement homes. For this hackathon, we focused on clinical applications and created Duckiebots meet AI - Clinical Human-Robot Interaction.

What it does

We implemented five main pillars that integrate NLU, machine learning, and visual navigation with our robotics platform:

  • Visual understanding of the environment
  • Object recognition
  • Assistance in critical situations (e.g. CPR)
  • Patient prioritization
  • Visitor guidance

For instance, the command "Map and explore your environment" initiates the visual understanding of the environment.
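
As a rough illustration of how such commands can be handled, here is a minimal Python sketch of intent routing. The intent and behaviour names are hypothetical placeholders; the actual speech and language understanding is done through the Nuance Mix API, which is not shown here.

```python
# Minimal sketch of intent routing (names are hypothetical, NLU call stubbed out).
def handle_intent(intent: str) -> str:
    """Map an intent recognized by the NLU service to a robot behaviour."""
    behaviours = {
        "explore_environment": "start_slam_exploration",  # "Map and explore your environment"
        "find_object": "start_object_search",             # object recognition
        "cpr_assist": "start_cpr_assistance",             # assistance in critical situations
        "prioritize_patient": "start_triage",             # patient prioritization
        "guide_visitor": "start_visitor_guidance",        # visitor guidance
    }
    return behaviours.get(intent, "ask_for_clarification")


if __name__ == "__main__":
    print(handle_intent("explore_environment"))  # -> start_slam_exploration
```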

How we built it

Little sleep, teamwork, the Duckiebot robotics platform, open source, and sponsored APIs (thank you, Nuance) were key to creating this project. We divided the project into smaller subtasks and later integrated all modules into the presented solution. In terms of algorithms and libraries, we used ORB-SLAM to visually teach the robot about its environment and to build an ever-improving map. Robot movements are controlled through ROS, and the camera feed is monitored by YOLO to recognize objects in the images. Thanks to the Nuance Mix API, we integrated verbal command handling, which makes deployment in the proposed environments easier.
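
To give an idea of how the camera feed and YOLO fit together, here is a small sketch of a ROS node that runs a YOLO model through OpenCV's DNN module. The topic name and model files are assumptions for illustration, not our exact configuration, and the detection post-processing is omitted.

```python
#!/usr/bin/env python
# Sketch only: a ROS node running YOLO (via OpenCV DNN, OpenCV >= 4) on the
# Duckiebot camera feed. Topic name and model paths are illustrative.
import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()
net = cv2.dnn.readNetFromDarknet("yolov3-tiny.cfg", "yolov3-tiny.weights")  # placeholder files
out_names = net.getUnconnectedOutLayersNames()


def on_image(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    detections = net.forward(out_names)  # raw YOLO outputs; box decoding/NMS omitted
    rospy.loginfo("got %d YOLO output tensors", len(detections))


if __name__ == "__main__":
    rospy.init_node("yolo_detector")
    # Camera topic is hypothetical; Duckiebot topics depend on the vehicle name.
    rospy.Subscriber("/duckiebot/camera_node/image/raw", Image, on_image, queue_size=1)
    rospy.spin()
```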

Challenges we ran into

Besides the lack of sleep, we ran into unexpected issues with ROS, YOLO, and OpenCV. We also hit the computational limit of our Duckiebot platform (a Raspberry Pi 3) and had to change our design from on-board processing to off-board computation.
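
The off-board setup looked roughly like this: the Raspberry Pi keeps running the ROS master, camera, and motor nodes, while a laptop on the same network subscribes to the image topic and does the heavy lifting. The addresses and topic name below are examples only, not our exact configuration.

```python
# Sketch of off-board computation: heavy per-frame work runs on a laptop,
# while the Duckiebot's Raspberry Pi only publishes images and drives the wheels.
import os

# Point this machine (the laptop) at the ROS master on the Duckiebot.
# Example addresses; they must be set before the node starts.
os.environ["ROS_MASTER_URI"] = "http://duckiebot.local:11311"
os.environ["ROS_IP"] = "192.168.1.42"  # the laptop's own IP on the shared network

import rospy
from sensor_msgs.msg import Image


def on_image(msg):
    # Heavy per-frame processing (ORB-SLAM, YOLO, ...) would happen here,
    # off the Pi; only lightweight commands go back to the robot.
    rospy.loginfo("received %dx%d image", msg.width, msg.height)


if __name__ == "__main__":
    rospy.init_node("offboard_processing")
    # Topic name is hypothetical; it depends on the Duckiebot's vehicle name.
    rospy.Subscriber("/duckiebot/camera_node/image/raw", Image, on_image, queue_size=1)
    rospy.spin()
```

The same effect can be achieved by exporting ROS_MASTER_URI and ROS_IP in the shell before launching the node.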

Accomplishments that we're proud of

We are proud of the processing pipeline that we implemented from scratch during this hackathon. Integrating the different modules was challenging, but fun! We set up the arena especially for this event and added functionality to the Duckiebots that did not previously exist. Given the short amount of time we had, we are very proud of our localization tools and of how we integrated software and hardware.

What we learned

We learned about making crucial design changes when required, especially regarding the computational limits of the Raspberry Pi. In addition, every team member got to work on topics that were new to them. Note to team: "Directory Inception" is a bad idea.

What's next for Duckiebots meet AI - Clinical Human-Robot Interaction

The next step for this project is to further develop the AI capabilities, integrate chatbots that answer in a human manner, and design a humanoid robot that uses this technology (e.g. InMoov). As a final remark, applications derived from these technologies are not limited to the healthcare sector; they can also be used in emergency response and disaster handling, among many other areas.

Acknowledgements

The team would like to thank Professor Giovanni Beltrame for providing the MIST Laboratory as test grounds. We would also like to thank Professor Liam Paull for providing the Duckiebots.

Built With

ROS, ORB-SLAM, YOLO, OpenCV, Nuance Mix, Raspberry Pi, Duckiebot
