Inspiration
In the past, one of us did research on and worked closely with special needs children as a volunteer teaching assistant. During that research, I found many studies showing that special needs students actually worked better in environments where they interacted with robots: they recognized that robots were judgment-free, so they were less embarrassed and less afraid of being a disappointment. As a teaching assistant, I often had to work with students one-on-one, and I would have loved to know when another student needed my attention. And thus, the idea for robota was born.
What it does
robota acts as an ideal teaching assistant for special needs classrooms by following a simple loop (a code sketch of it follows the steps):
Step 1) roams the classroom until it recognizes frustration in a student
Step 2) prompts the frustrated student for an explanation
Step 3) runs tone & sentiment analysis on the student's response
Step 4) sends the analysis to the teacher/parent if necessary
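A minimal Python sketch of that loop. Every helper here is a hypothetical stub standing in for the real camera, Clarifai, Watson, PubNub, and Nexmo integrations described under "How we built it":

```python
import time

# Hypothetical stubs; the real versions are described in "How we built it".
def capture_frame(): return None               # photo from the TurtleBot camera
def looks_frustrated(frame): return False      # custom Clarifai image model
def play_prompt(): pass                        # audio prompt asking for an explanation
def record_response(seconds): return b''       # record the student's spoken reply
def transcribe(audio): return ''               # IBM Watson speech to text
def analyze_sentiment(text): return 'neutral'  # PubNub sentiment block
def notify_teacher(text): pass                 # SMS via Nexmo

def run_robota():
    while True:
        # Step 1: roam until a student looks frustrated.
        frame = capture_frame()
        if looks_frustrated(frame):
            # Step 2: prompt the student and record the reply.
            play_prompt()
            audio = record_response(seconds=10)
            # Step 3: transcribe and analyze the response.
            text = transcribe(audio)
            # Step 4: escalate to the teacher/parent only when necessary.
            if analyze_sentiment(text) == 'negative':
                notify_teacher(text)
        time.sleep(1)
```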
How we built it
We used the Autonomous TurtleBot2 to take pictures of its surroundings and detect a frustrated student. A Python script then plays an audio prompt and records the student speaking. We use the IBM Watson speech-to-text API to transcribe the audio, pass the transcript to our PubNub block for sentiment analysis, and notify the teacher/parent through Nexmo. The sketches below show roughly what the transcription and notification steps look like.
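A hedged sketch of the transcription step using IBM's current Python SDK (ibm_watson); the original build may have used an older Watson SDK, and the API key, service URL, and filename here are placeholders:

```python
from ibm_watson import SpeechToTextV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials and endpoint for the Watson speech-to-text service.
authenticator = IAMAuthenticator('YOUR_IBM_API_KEY')
stt = SpeechToTextV1(authenticator=authenticator)
stt.set_service_url('https://api.us-south.speech-to-text.watson.cloud.ibm.com')

# Transcribe the recorded student response.
with open('student_response.wav', 'rb') as audio_file:
    result = stt.recognize(audio=audio_file, content_type='audio/wav').get_result()

# Join the best transcript from each recognized chunk.
transcript = ' '.join(
    chunk['alternatives'][0]['transcript'] for chunk in result['results']
)
```

And a similarly hedged sketch of the notification step, continuing from the transcript above: publish it to a PubNub channel (where the sentiment block can process it) and text the teacher through Nexmo (now Vonage). The keys, channel name, and phone number are placeholders:

```python
from pubnub.pnconfiguration import PNConfiguration
from pubnub.pubnub import PubNub
import nexmo

# Placeholder PubNub configuration.
pnconfig = PNConfiguration()
pnconfig.publish_key = 'YOUR_PUBLISH_KEY'
pnconfig.subscribe_key = 'YOUR_SUBSCRIBE_KEY'
pnconfig.uuid = 'robota'
pubnub = PubNub(pnconfig)

# Send the transcript through the PubNub block for sentiment analysis.
# `transcript` comes from the transcription sketch above.
pubnub.publish().channel('robota-sentiment').message({'text': transcript}).sync()

# Text the teacher once the analysis flags that a student needs attention.
client = nexmo.Client(key='YOUR_NEXMO_KEY', secret='YOUR_NEXMO_SECRET')
client.send_message({
    'from': 'robota',
    'to': 'TEACHER_PHONE_NUMBER',
    'text': 'A student may need your attention: ' + transcript,
})
```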
Challenges we ran into
We had a difficult time figuring out how to use PubNub, especially since their APIs involving Clarifai were very outdated. Utilizing the custom Clarifai image models was also challenging (a sketch of querying one appears below).
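For reference, this is roughly how a custom Clarifai model can be queried with the era-appropriate 2.x REST client; it is not necessarily our exact code, and the API key and model name ('frustration-detector') are hypothetical:

```python
from clarifai.rest import ClarifaiApp

app = ClarifaiApp(api_key='YOUR_CLARIFAI_API_KEY')  # placeholder credential
model = app.models.get('frustration-detector')      # hypothetical custom model

# Run the custom model on a frame captured by the TurtleBot.
response = model.predict_by_filename('frame.jpg')
concepts = response['outputs'][0]['data']['concepts']

# Treat the student as frustrated when the model is confident enough.
frustrated = any(c['name'] == 'frustrated' and c['value'] > 0.85 for c in concepts)
```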
Accomplishments that we're proud of
This was our first time using a robot, computer vision, and machine learning at a hackathon.
What we learned
We learned how to write a Python script, build custom image models with Clarifai, and use Autonomous' TurtleBot.
What's next for robota
Holding fully developed conversations with students, improving facial analysis, and learning how to handle situations directly instead of just reporting to the parent/teacher.