Inspiration

I'm a brain function analyst who likes to teach... and program. I'm also interested in how people learn and how we process natural language. I helped teach a middle school class on the brain, and after learning about this competition, I decided I wanted to use that experience to create a new Skill. Since I have experience with brainwave data analysis and a bunch of data I've collected on myself, I thought it would be cool to take kids through some of that information, giving them perspective on brain function. (In my original class, I actually did an EEG demonstration, showing them 3D data representations in real time - what is known as LORETA.)

What it does

The game is structured as a list of locations which you visit and learn about. There are paths you can follow by saying, "next", as well as choice points. These are based on what we know about the actual connectivity of the brain. There are also 'facts' that you can learn by asking for information on a 'topic'. There are currently 23 topics, including: 'Alpha waves', 'Beta waves', 'Brainwaves', 'Cortex', 'Kim Peek', 'Neuron', 'Brain structure', 'Lobes of the Brain', 'Brain fat', 'The knee jerk', 'Brain function', 'Memory', 'Procedural Memory', 'Henry Molaison', 'Lateralization', 'Seizures', and 'Color blindness'.

At any time, you can ask "how am I doing?" or "score" to find out how many of the locations and facts you have covered. You can also ask for "help", which will give you information about your options, as well as list a location and a fact you haven't heard yet. The easiest way to get new facts or locations is to tell Alexa, "you pick." She likes that.

The program tracks all the places and facts you visit, and will change the message you get at different locations depending on what you have visited. This is designed so that I can use more general and simpler terms early in play, moving to more complex and detailed terms later.
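For illustration, here is a minimal sketch of how that kind of leveling can work. The data shape, the "unlock after five visits" rule, and the function names are hypothetical, not the Skill's actual code:

```javascript
// Hypothetical example of leveled location messages: the player hears
// the most detailed message their progress has unlocked.
const location = {
  id: 'hippocampus',
  messages: [
    { level: 0, text: 'This area helps you remember things.' },
    { level: 1, text: 'The hippocampus consolidates short-term memories into long-term storage.' }
  ]
};

function pickMessage(location, visited) {
  // Illustrative rule: unlock the detailed wording after covering
  // five locations or facts (the real rules are more nuanced).
  const maxLevel = visited.size >= 5 ? 1 : 0;
  const candidates = location.messages.filter(m => m.level <= maxLevel);
  return candidates[candidates.length - 1].text;
}

console.log(pickMessage(location, new Set(['cortex', 'neuron'])));
// -> 'This area helps you remember things.'
```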

How I built it

The code is over 2000 lines of JavaScript, aimed at running on Node on AWS Lambda. There are even more lines of JSON files that contain the locations and facts and their messages. There is also a set of rules that define what level of message to speak depending on which milestones have been covered previously. I designed some utility functions to verify the relationships between all the data files, as well as some command-line functions to manually add elements to the speech interaction model.
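As a rough sketch, a consistency check across the data files might look something like this; the file names and fields are illustrative, not the project's actual schema:

```javascript
// Hypothetical validation utility: confirm every fact a location
// references actually exists in the facts data file.
const facts = require('./facts.json');         // e.g. { "alpha_waves": {...}, ... }
const locations = require('./locations.json'); // e.g. { "cortex": { relatedFacts: [...] }, ... }

for (const [locId, loc] of Object.entries(locations)) {
  for (const factId of loc.relatedFacts || []) {
    if (!(factId in facts)) {
      console.error(`Location "${locId}" references unknown fact "${factId}"`);
      process.exitCode = 1;
    }
  }
}
```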

User data is stored in Amazon DynamoDB. I did unit testing with Mocha and used Amazon's TestFlow program to do integration testing. You can find the changes I made to the Alexa SDK to run a local instance of DynamoDB in PR #242 on GitHub, although that is hopefully fixed and irrelevant by the time you read this.
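A Mocha unit test for this kind of skill might look roughly like the sketch below; the module path and function names are made up for illustration:

```javascript
// Hypothetical Mocha test: check that a fact-lookup helper returns
// speech text for known topics and doesn't blow up on unknown ones.
const assert = require('assert');
const { getFactSpeech } = require('../lib/facts'); // hypothetical module

describe('fact lookup', function () {
  it('returns speech for a known topic', function () {
    const speech = getFactSpeech('Alpha waves');
    assert.ok(speech && speech.length > 0);
  });

  it('handles an unknown topic gracefully', function () {
    assert.doesNotThrow(() => getFactSpeech('Unknown topic'));
  });
});
```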

Challenges I ran into

The main challenge I had was that the project is big. I still haven't included a fourth of the topics I envisioned from the class I taught. Being responsible, I wanted to make sure I had literature citations for any fact I included. There was a lot of detail work going through typos in text and checking the specific ways things are said, like pronouncing 'conduct' as a verb (moving) instead of a noun (behavior). I created the sound effects myself (and they could be better!). The interaction of knowledge levels with location descriptions is a bit complicated and has subtle differences based on wording. If you are looking at a fact and say "next", it gives you more info, whereas when you are looking at a location and say "next", you move along your story line. The "more info" command will deepen the level of information whether you are at a location or a fact. I went through several iterations of deciding how I wanted each type of wording to process each of these types of distinctions. I am still having issues with getting the word recognition model that Amazon provides to initiate Intents the way I want.
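To make the "next" distinction concrete, here is a small sketch of how that branching can be expressed; the session-attribute names are hypothetical, not the Skill's actual code:

```javascript
// Sketch of the context-dependent "next" behavior described above:
// on a fact it deepens detail, on a location it advances the story.
function handleNext(session) {
  if (session.viewing === 'fact') {
    // "next" on a fact behaves like "more info": go one level deeper.
    return { action: 'speakFact', fact: session.currentFact, level: session.detailLevel + 1 };
  }
  // "next" on a location moves along the player's story line.
  return { action: 'moveTo', location: session.path[session.pathIndex + 1] };
}

console.log(handleNext({ viewing: 'fact', currentFact: 'Alpha waves', detailLevel: 0 }));
// -> { action: 'speakFact', fact: 'Alpha waves', level: 1 }
```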

I've made Alexa skills before, but this is the first time I created any unit tests as part of development. I also had to make changes in the Alexa SDK to allow use of a local DynamoDB instance for my local testing - I felt I had to have a test harness I could run without using the Internet. I'm finding there are a lot of small tools that could accelerate development, but you have to build them yourself.
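For anyone wanting a similar offline setup, the general idea is to point the AWS SDK at a DynamoDB Local endpoint. This sketch assumes DynamoDB Local's default port and is not the exact patch from the PR:

```javascript
// Point the AWS SDK at a local DynamoDB instance for offline testing.
const AWS = require('aws-sdk');

const dynamodb = new AWS.DynamoDB({
  region: 'us-east-1',
  endpoint: 'http://localhost:8000', // DynamoDB Local's default port
  // DynamoDB Local accepts any credentials
  accessKeyId: 'fake',
  secretAccessKey: 'fake'
});

dynamodb.listTables({}, (err, data) => {
  if (err) console.error(err);
  else console.log(data.TableNames);
});
```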

Accomplishments that I'm proud of

It works in basic form and passed Amazon certification! It took three passes, and I'm grateful that Devpost extended the certification deadline.

What I learned

A lot! Especially about balancing priorities.

What's next for Milton's Brain

I really want to encourage an understanding of how the brain places constraints on behavior, as I see so many people have challenges recognizing how much work the same tasks can be for different people. I believe I can achieve this by adding more relevant facts and information. I also want to encourage a respect for the layers of history that build our knowledge, as we are currently faced with stories of "fake news" and a loss of confidence in the scientific method. I believe this can also be addressed with the proper education. You will see there are a lot of stories about who built our current understanding of the brain and how, and I'd like to expand on this.

More information will be available at http://emotrics.com/miltonsbrain/

Updates


Still looking for more feedback on good and bad features. One of the middle school teachers I worked with in the 'Brain Explorer' class said she thought that the auditory modality might be too difficult for covering such material. I still like the idea of wandering through a mental map of connections, and want more opinions. Since the Skill didn't make the final cut, I have removed its kid-skill designation, which will allow people in other countries to see it. (Amazon kid skills only work in the US at this point.)
